
Transaction simulation takes too long to return #4293

Closed
lalittanna opened this issue Mar 7, 2023 · 6 comments

@lalittanna

lalittanna commented Mar 7, 2023

```js
const Web3 = require("web3");
const ganache = require("ganache");
const { performance } = require("perf_hooks");

const options = {
  fork: {
    url: process.env.INFURA_PRODUCTION_MAINNET_ENDPOINT,
    blockNumber: 15120604,
    deleteCache: true,
    disableCache: true,
  },
  chain: { chainId: "1", hardfork: "berlin", time: "Jul-11-2022-09:41:38" },
  wallet: { unlockedAccounts: ["0x26fcbd3afebbe28d0a8684f790c48368d21665b5"] },
  miner: { defaultGasPrice: "0x4e20" },
  logging: { quiet: true },
};
const web3 = new Web3(ganache.provider(options));

async function main() {
  const startTimer = performance.now();
  let tx;
  try {
    // Simulate the transaction against the forked mainnet state
    tx = await web3.eth.sendTransaction({
      from: "0x26fcbd3afebbe28d0a8684f790c48368d21665b5",
      to: "0x7937d4799803fbbe595ed57278bc4ca21f3bffcb",
      value: 0,
      data: "0x69328dec000000000000000000000000d3d2e2692501a5c9ca623199d38826e513033a17ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff00000000000000000000000026fcbd3afebbe28d0a8684f790c48368d21665b5",
      gasPrice: "20000",
      gas: "3000000",
    });
  } catch (err) {
    console.error(`Failed to simulate transaction: ${err.message}`);
    return err.message;
  }
  const endTimer = performance.now();
  console.log(`Transaction simulation took ${endTimer - startTimer} milliseconds`);
}

main();
```

I am using Ganache 7.4.3 and this transaction takes 25-30 seconds to return. Does anyone know why it takes so long and if there's a way to reduce this latency?

The Etherscan link of this transaction is https://etherscan.io/tx/0xe614e025da19e1eede290731cf4bba2acafd8550c51512bffe4a5c2d4cba21d3

@davidmurdoch
Member

You're deleting and disabling the caches. The only purpose of a cache is to make things faster.
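For comparison, here's a minimal sketch of the same provider options with the fork cache left on (your snippet with `deleteCache` and `disableCache` removed; everything else unchanged):

```js
// Same options as in the snippet above, but with the fork cache left
// enabled so repeated state reads can be served from disk/memory
// instead of going back to Infura on every run.
const options = {
  fork: {
    url: process.env.INFURA_PRODUCTION_MAINNET_ENDPOINT,
    blockNumber: 15120604,
  },
  chain: { chainId: "1", hardfork: "berlin", time: "Jul-11-2022-09:41:38" },
  wallet: { unlockedAccounts: ["0x26fcbd3afebbe28d0a8684f790c48368d21665b5"] },
  miner: { defaultGasPrice: "0x4e20" },
  logging: { quiet: true },
};
const web3 = new Web3(ganache.provider(options));
```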

@lalittanna
Author

lalittanna commented Mar 7, 2023

@davidmurdoch
Caching would make things faster on subsequent requests, but I don't think it is of any use when a request has never been seen before. I am assessing how long the request takes on the very first run, before caching even occurs, and I am interested in knowing why it is so slow in that case.

@davidmurdoch
Member

davidmurdoch commented Mar 7, 2023

It caches the raw data, not requests. And it's not just a disk cache, but an in-memory cache, too. It's slow because you've turned off the cache.

I am assessing how long the request takes in the very first instance before caching even occurs

You can't do that. The caching occurs during the request, not after.

Anyway, you're measuring network latency for what is probably dozens to hundreds of sequential RPC calls.
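A rough way to see this (a sketch that assumes the web3/ganache setup from your snippet, with the cache enabled) is to time the same read-only simulation twice with `eth_call`: the first run pays the upstream RPC latency, the second is served largely from the warmed cache.

```js
// Rough sketch: time the same read-only simulation twice against one
// provider. The first call has to fetch state from the upstream RPC;
// the second can be served mostly from Ganache's in-memory/disk fork cache.
const callObject = {
  from: "0x26fcbd3afebbe28d0a8684f790c48368d21665b5",
  to: "0x7937d4799803fbbe595ed57278bc4ca21f3bffcb",
  data: "0x69328dec000000000000000000000000d3d2e2692501a5c9ca623199d38826e513033a17ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff00000000000000000000000026fcbd3afebbe28d0a8684f790c48368d21665b5",
};

async function timeCall(label) {
  const start = performance.now();
  await web3.eth.call(callObject); // read-only simulation, no state change
  console.log(`${label}: ${(performance.now() - start).toFixed(0)} ms`);
}

(async () => {
  await timeCall("cold run"); // pays upstream RPC latency
  await timeCall("warm run"); // mostly cache hits
})();
```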

@lalittanna
Author

Ok, I removed the disableCache and deleteCache fields. The first run took ~28.4 seconds and the following runs took ~2.6 seconds, which would be great if my business logic ran the same request multiple times; in my case, though, I will get a different transaction every time, so caching is not really going to help. I understand that a transaction doing more work will take longer, but that doesn't seem to be the case here. If we compare transactions' work by the number of events and state changes, this one emits 6 events and changes 5 states, whereas this tx: https://etherscan.io/tx/0x7efa4cfeea06b544a30dae0132085825047ad8f9f7cfa756463ab325073fee81
emits 21 events and changes the state of 8 addresses, so it's fair to say the latter is doing more work, yet it only takes ~13-15 seconds to simulate on the first run.

@davidmurdoch I am trying to understand why certain transactions take longer than others. Also, if you have any tips on making these simulations significantly faster (before caching) from the client side, such as vertical scaling or anything else, I would really appreciate that.

@davidmurdoch
Member

You're measuring network latency for what is probably dozens to hundreds of sequential RPC calls to Infura. Every time a transaction reads an address of any kind, that's 3 RPCs to Infura (to get the balance, nonce, and code); every time it reads state, that's another RPC to Infura; and sometimes fetching state can result in multiple RPCs to Infura.

Note: changing state is not a good indicator of how many RPC calls Ganache will make, since there are usually far more read operations than write operations (writing requires an internal read operation first due to dynamic gas costs).

If you want Ganache forking to go faster you'll need to bring the data closer to you (like running your own local node).
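For example (a sketch, assuming an archive-capable node listening locally on the default JSON-RPC port):

```js
const Web3 = require("web3");
const ganache = require("ganache");

// Sketch: fork from your own node instead of Infura so every state
// lookup is a localhost round trip. The node must be able to serve
// state for the forked block (i.e. an archive node for older blocks).
const options = {
  fork: {
    url: "http://127.0.0.1:8545", // local node's JSON-RPC endpoint (assumed)
    blockNumber: 15120604,
  },
  logging: { quiet: true },
};
const web3 = new Web3(ganache.provider(options));
```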

There are some tricks that we've been playing with internally to try to preload state for well-known contracts, but ultimately they don't result in massive gains anyway.

@lalittanna
Author

Got it, that's what I needed to know. Thanks @davidmurdoch
