
Locust Congestion Workload #9157

Merged · 11 commits · Jun 13, 2023
Conversation

aborg-dev (Contributor) commented Jun 7, 2023

This PR introduces a Locust Workload for the Congestion Test: #8920.

A typical run would be:

```sh
CONTRACT="path/to/nearcore/runtime/near-test-contracts/res/test_contract_rs.wasm"
locust -H 127.0.0.1:3030 \
  CongestionUser \
  --congestion-wasm=$CONTRACT \
  --funding-key=$KEY \
  --tags congestion
```

Then cranking up the number of users to 50-200 should be enough to cause congestion.

Wishlist of things I want to improve:

  • Remove the need to specify the CONTRACT variable every time: we know which contract each workload type needs, we just don't know where it is stored.
    • We could take the same approach as "runtime/runtime-params-estimator/res/": check in the WASM contracts this test depends on alongside the code and update them once in a while.
  • Make it possible to select which exact method of the Congestion workload to run via a parameter.
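The first wishlist item could be sketched along these lines: a small lookup from workload name to a contract checked in under a `res/` directory next to the test code. This is only an illustration, not the PR's implementation; the `WORKLOAD_CONTRACTS` mapping and `default_contract_path` helper are hypothetical names.

```python
from pathlib import Path

# Hypothetical mapping from workload name to a contract file checked in
# alongside the tests, mirroring runtime/runtime-params-estimator/res/.
WORKLOAD_CONTRACTS = {
    "congestion": "test_contract_rs.wasm",
}

def default_contract_path(workload: str, res_dir: Path = Path("res")) -> Path:
    """Return the default contract path for a workload, so the CLI flag
    (e.g. --congestion-wasm) only needs to override it in special cases."""
    try:
        return res_dir / WORKLOAD_CONTRACTS[workload]
    except KeyError:
        raise ValueError(f"no default contract for workload {workload!r}") from None
```

With this in place, `--congestion-wasm` would default to `default_contract_path("congestion")` and stay available as an explicit override.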

@aborg-dev aborg-dev requested a review from a team as a code owner June 7, 2023 14:41
@aborg-dev aborg-dev requested a review from akhi3030 June 7, 2023 14:41
@aborg-dev aborg-dev marked this pull request as draft June 7, 2023 14:41
@aborg-dev aborg-dev requested review from jakmeier and removed request for akhi3030 June 7, 2023 14:41
@aborg-dev aborg-dev marked this pull request as ready for review June 9, 2023 14:11
@aborg-dev aborg-dev changed the title from "WIP: Congestion contract" to "Locust Congestion Workload" Jun 9, 2023
```python
    node,
    base.CreateSubAccount(funding_account, account.key, balance=50000.0),
)
account.refresh_nonce(node)
```
aborg-dev (Contributor, Author) commented:

I sometimes get an exception on this line:

```
[2023-06-12 10:23:49,569] akashin-testnet-1/ERROR/root: Uncaught exception in event handler:
Traceback (most recent call last):
  File "/home/ubuntu/.local/lib/python3.11/site-packages/locust/event.py", line 40, in fire
    handler(**kwargs)
  File "/home/ubuntu/repos/nearcore/pytest/tests/loadtest/locust/common/congestion.py", line 103, in on_locust_init
    account.refresh_nonce(node)
  File "/home/ubuntu/.local/lib/python3.11/site-packages/retrying.py", line 56, in wrapped_f
    return Retrying(*dargs, **dkw).call(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/.local/lib/python3.11/site-packages/retrying.py", line 266, in call
    raise attempt.get()
          ^^^^^^^^^^^^^
  File "/home/ubuntu/.local/lib/python3.11/site-packages/retrying.py", line 301, in get
    six.reraise(self.value[0], self.value[1], self.value[2])
  File "/usr/lib/python3/dist-packages/six.py", line 703, in reraise
    raise value
  File "/home/ubuntu/.local/lib/python3.11/site-packages/retrying.py", line 251, in call
    attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
                      ^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/repos/nearcore/pytest/tests/loadtest/locust/common/base.py", line 46, in refresh_nonce
    self.current_nonce.value = mocknet_helpers.get_nonce_for_key(
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/repos/nearcore/pytest/lib/mocknet_helpers.py", line 25, in get_nonce_for_key
    return get_nonce_for_pk(key.account_id, key.pk, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ubuntu/repos/nearcore/pytest/lib/mocknet_helpers.py", line 53, in get_nonce_for_pk
    raise KeyError(f'Nonce for {account_id} {pk} not found')
KeyError: 'Nonce for congestion.node0 ed25519:u3MU7QizvZ2gEuwdGivh9epU1bVsGJtKtQw7RzdPTHf not found'
```

But if I don't have this line, transactions fail down the line. Looks like I need to do it conditionally only once?
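Doing the refresh "conditionally only once" could look like the sketch below. This is a hypothetical illustration, not the PR's code: `refresh_nonce_once` and the `_refreshed` set are invented names, and the real fix may live elsewhere.

```python
# Hypothetical sketch: refresh the nonce at most once per account, so a
# repeated on_locust_init handler doesn't query the RPC node again (and
# possibly race it) after the first successful refresh.
_refreshed: set[str] = set()

def refresh_nonce_once(account_id: str, refresh) -> None:
    """Call `refresh()` (e.g. account.refresh_nonce(node)) only the first
    time this account_id is seen in this process."""
    if account_id not in _refreshed:
        refresh()
        _refreshed.add(account_id)
```

Each Locust worker process keeps its own `_refreshed` set, so this only deduplicates within a process, which matches the per-process init handler that raised the exception.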

Contributor commented:

Hm, that looks like the same problem that prompted me to add the @retry in the first place. The error occurs on a call that looks up the key info on the RPC node, which contains the newest nonce. The error is raised because the returned key list is empty.

Still not quite sure why it happens, but one possibility is that the RPC node is behind and doesn't know about the new account yet. Or maybe the account creation failed.

Maybe the @retry arguments need to be relaxed more?
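"Relaxing the @retry arguments" means retrying more times and/or waiting longer between attempts. A stdlib stand-in for that behavior (the actual code uses the `retrying` package; `retry_relaxed` and its parameters here are invented for illustration) could look like:

```python
import time

def retry_relaxed(fn, attempts=8, base_delay=0.5, max_delay=8.0,
                  retry_on=(KeyError,)):
    """Retry fn with exponential backoff, up to `attempts` tries.
    A more relaxed policy = higher `attempts` and/or longer delays,
    giving a lagging RPC node time to catch up."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retry_on:
            if attempt == attempts:
                raise  # out of attempts: surface the original error
            time.sleep(delay)
            delay = min(delay * 2, max_delay)
```

The equivalent with the `retrying` decorator would be to raise its attempt count and wait parameters rather than reimplementing the loop.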

aborg-dev (Contributor, Author) commented:

I found out why I was getting this error: I'm reusing the same network across multiple Locust runs. The first run successfully creates the account congestion.node, but all subsequent runs fail because the account already exists:

```
(True, {'jsonrpc': '2.0', 'result': {'receipts_outcome': [{'block_hash': '7UCWiWPWX8y2SEu6xicJYbrHpPwjg5PiSa7SoTSCT8pG', 'id': 'EZup3kdCjYamFKKP6e4iirv6U4iH2CSEhqLJtAm5fHRX', 'outcome': {
'executor_id': 'congestion.node0', 'gas_burnt': 3958059500000, 'logs': [], 'metadata': {'gas_profile': [], 'version': 3}, 'receipt_ids': ['6YE2hkfvzVUrveY3xQcN3QWkjYwassGUzk8rhsb49dvz', '
8JRNoWFo6zaHu59B67SzoeFwFwnfGEfdyWriQKFrd7a6'], 'status': {'Failure': {'ActionError': {'index': 0, 'kind': {'AccountAlreadyExists': {'account_id': 'congestion.node0'}}}}}, 'tokens_burnt':
 '395805950000000000000'}, 'proof': []}, {'block_hash': '9b1fp2LB6c86G46eAW6j6rSQAvkQrVWiBpEc8MsrZv3K', 'id': '6YE2hkfvzVUrveY3xQcN3QWkjYwassGUzk8rhsb49dvz', 'outcome': {'executor_id': 'n
ode0', 'gas_burnt': 223182562500, 'logs': [], 'metadata': {'gas_profile': [], 'version': 3}, 'receipt_ids': [], 'status': {'SuccessValue': ''}, 'tokens_burnt': '0'}, 'proof': [{'direction
': 'Right', 'hash': 'CKjKfxmogh3x9Max91VxxPN51QfV6xr92e8dLC3P52eQ'}]}, {'block_hash': '9b1fp2LB6c86G46eAW6j6rSQAvkQrVWiBpEc8MsrZv3K', 'id': '8JRNoWFo6zaHu59B67SzoeFwFwnfGEfdyWriQKFrd7a6',
 'outcome': {'executor_id': 'node0', 'gas_burnt': 223182562500, 'logs': [], 'metadata': {'gas_profile': [], 'version': 3}, 'receipt_ids': [], 'status': {'SuccessValue': ''}, 'tokens_burnt
': '0'}, 'proof': [{'direction': 'Left', 'hash': 'DUAdiiLDAPqj2DEUzfq7sswhhAePNU1YT46j7Kt4aF5V'}]}], 'status': {'Failure': {'ActionError': {'index': 0, 'kind': {'AccountAlreadyExists': {'
account_id': 'congestion.node0'}}}}}, 'transaction': {'actions': ['CreateAccount', {'AddKey': {'access_key': {'nonce': 0, 'permission': 'FullAccess'}, 'public_key': 'ed25519:CnEDk9HrMnmiH
XEV1WFgbVCRteYnPqsJwrTdcZaNhFVW'}}, {'Transfer': {'deposit': '49999999999999995716575428608'}}], 'hash': '14rYUDZCasDjeWN7NZnfqauPFesd4azFNofHyB5r8WNd', 'nonce': 24, 'public_key': 'ed2551
9:7PGseFbWxvYVgZ89K1uTJKYoKetWs7BJtbyXDzfbAcqX', 'receiver_id': 'congestion.node0', 'signature': 'ed25519:3oaqPZsNtUQoPoDhfcunU5ea6c4PgXfNRqxKUABeTkVUiDTCKbAwN7opr7WH8BtSAWh2tbZVJ8kbpwu54
7pxiFyk', 'signer_id': 'node0'}, 'transaction_outcome': {'block_hash': '2g81sNcE5G48ZRHKmT2uSAKwVUycnvNNsnSUXeEXK8hd', 'id': '14rYUDZCasDjeWN7NZnfqauPFesd4azFNofHyB5r8WNd', 'outcome': {'e
xecutor_id': 'node0', 'gas_burnt': 4174947687500, 'logs': [], 'metadata': {'gas_profile': None, 'version': 1}, 'receipt_ids': ['EZup3kdCjYamFKKP6e4iirv6U4iH2CSEhqLJtAm5fHRX'], 'status': {
'SuccessReceiptId': 'EZup3kdCjYamFKKP6e4iirv6U4iH2CSEhqLJtAm5fHRX'}, 'tokens_burnt': '417494768750000000000'}, 'proof': []}}, 'id': 'dontcare'})
```

I think in this case I just want to use the same account and keys across all runs, so I've fixed it using from_seed_testonly.
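The idea behind the from_seed_testonly fix can be illustrated with a stdlib sketch (this is not nearcore's actual key derivation; `deterministic_key_bytes` is a made-up name and the hash is purely illustrative): a fixed seed yields the same key material on every run, so reruns against the same network reuse the existing account instead of generating a fresh key that the existing account doesn't know about.

```python
import hashlib

def deterministic_key_bytes(account_id: str, seed: str = "test") -> bytes:
    # Illustration only: hashing a fixed (account_id, seed) pair produces
    # the same 32 bytes of key material on every run, unlike a random key
    # that changes between Locust invocations.
    return hashlib.sha256(f"{account_id}:{seed}".encode()).digest()
```

With random keys, run N+1 signs with a key the already-existing account never registered; with seeded keys every run signs with the same key, so reuse works.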

Contributor commented:

Ah, I see. Yeah, if you reuse the network you get different keys each run, and creation fails because the account was already created. But I'm not sure the other types of workload can handle reuse of the same network. It might work, but there could be some hiccups in the setup phase.
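One way for a setup phase to tolerate network reuse is to treat the AccountAlreadyExists failure as benign. The sketch below is an assumption-laden illustration, not the PR's code: `is_account_already_exists` is an invented helper, and the response shape is taken from the RPC result pasted earlier in this thread.

```python
def is_account_already_exists(response: dict) -> bool:
    """Return True if a create-account transaction failed only because the
    account already exists, based on the status layout seen in the RPC
    result above (assumed shape: result.status.Failure.ActionError.kind)."""
    status = response.get("result", {}).get("status", {})
    failure = status.get("Failure", {}).get("ActionError", {})
    return "AccountAlreadyExists" in failure.get("kind", {})
```

Setup code could then skip creation (and go straight to refreshing the nonce) when this returns True, instead of aborting the run.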

Resolved (outdated) review threads on:
  • pytest/tests/loadtest/locust/common/congestion.py
  • pytest/tests/loadtest/locust/README.md
  • pytest/tests/loadtest/locust/common/congestion.py
@aborg-dev aborg-dev added A-congestion Work aimed at ensuring good system performance under congestion S-automerge labels Jun 12, 2023
aborg-dev added a commit to aborg-dev/nearcore that referenced this pull request Jun 13, 2023
It looks like the yapf formatter invoked with `scripts/formatting` uses the system Python 3 version, which yields different results on Python 3.11 and Python 3.8. Buildkite seems to use Python 3.8, so behavior differs locally and remotely.

For now, I'm fixing the current lint warning that prevents submitting near#9157, but ideally we would fix this either by making sure formatting is the same across versions or by pinning the Python version used in the formatting script.
@near-bulldozer near-bulldozer bot merged commit d584d02 into near:master Jun 13, 2023
Labels
A-congestion Work aimed at ensuring good system performance under congestion
2 participants