Prerequisites
Description
When using processes = -1 in the locust.conf file with 12 or more workers running, the CSV results file gets corrupted by the simultaneous writes: data from one worker is inserted in the middle of another worker's writes, and many newlines are missing.
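The interleaving itself is easy to reproduce outside Locust. A minimal sketch, assuming nothing about Locust internals (the file name, row contents, and worker count are arbitrary): several processes appending to one file with no coordination can split long rows across each other, because Python's buffered writes are not guaranteed to reach the file as a single atomic write.

import multiprocessing


def worker(worker_id: int) -> None:
    # Each process appends rows to the same file with no locking.
    for i in range(1000):
        with open("shared.csv", "a") as f:
            # A long row makes mid-record interleaving (and lost newlines)
            # much more likely, mirroring the corruption in the report.
            f.write(f"worker-{worker_id},row-{i}," + "x" * 4096 + "\n")


if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker, args=(n,)) for n in range(12)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()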
Command line
locust
Locustfile contents
self.client.post(
    "/api/v1/entropy",
    name="entropy/" + TEST_CASE.name,
    headers=self.headers,
    json=TEST_CASE.value,
    timeout=90,
)
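For completeness, a self-contained sketch around that fragment; TestCase, its payload, and EntropyUser are assumptions reconstructed for illustration, not the reporter's actual code:

from enum import Enum

from locust import HttpUser, task


class TestCase(Enum):
    # Hypothetical payload; the real TEST_CASE values are not in the report.
    SMALL_32B = {"size": 32}


TEST_CASE = TestCase.SMALL_32B


class EntropyUser(HttpUser):
    def on_start(self):
        self.headers = {"Content-Type": "application/json"}

    @task
    def post_entropy(self):
        self.client.post(
            "/api/v1/entropy",
            name="entropy/" + TEST_CASE.name,
            headers=self.headers,
            json=TEST_CASE.value,
            timeout=90,
        )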
# conf file
host=http://localhost
# uncomment this for local dev env testing
users=300
spawn-rate=25
processes=-1
csv=results/300VUs-32B-Soak-Test
only-summary=true
run-time=36h
headless=true
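For context, processes = -1 asks Locust to spawn one worker per logical CPU core, which is presumably how 12+ workers appear here; the resolution is roughly equivalent to this sketch (the exact logic is Locust's own):

import os

requested = -1  # value of the "processes" option
worker_count = os.cpu_count() if requested == -1 else requested
print(f"spawning {worker_count} workers")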
Python version
3.10
Locust version
2.31.8
Operating system
Ubuntu 22.04
I'm guessing you wouldn't want workers to write to the csv file at all and only have the master do it? I've added a fix for that, feel free to try it out.
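Until that fix ships, a hedged workaround sketch for custom file output: gate writes on the runner type so only the master process touches the file. WorkerRunner and the init event are documented Locust APIs; stats_file is a hypothetical attribute name, and whether this approach helps with the built-in CSV writer is an assumption, not confirmed:

from locust import events
from locust.runners import WorkerRunner


@events.init.add_listener
def on_init(environment, **kwargs):
    # Only the master (or a standalone local runner) opens the file; workers
    # skip it, since their stats are forwarded to the master anyway.
    if isinstance(environment.runner, WorkerRunner):
        environment.stats_file = None  # hypothetical attribute
    else:
        environment.stats_file = open("results/custom-stats.csv", "a")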