Runtime Support: mobile performance #3
Follow the steps here to speed up the nodejs process.
Also - bundling the nodejs files (using browserify) seems to make a huge difference in performance. I'll push an update in the near future.
I've updated the nodejs-mobile framework, but it doesn't seem to have a significant impact on performance. Right now the main blocker is the "jsipfs identity generated" step on the first run (it doesn't run again afterwards), and the wait before main.js starts running is still long.
Strange - that made a pretty big difference for me - it cut the time from 6-7 mins to 1-2 mins. The other thing to try is bundling via browserify - that made the biggest difference for me. Inside
Then in
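For reference, a rough sketch of what a browserify bundling step could look like using browserify's Node API - the entry point main.js and output bundle.js are assumptions here, and extra options may be needed for a Node (non-browser) target:

```js
const fs = require('fs')
const browserify = require('browserify')

// Bundle the app into a single file so nodejs-mobile loads one bundle.js
// instead of resolving thousands of small files in node_modules at startup.
browserify('main.js')
  .bundle()
  .pipe(fs.createWriteStream('bundle.js'))
```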
You can also lower the bits to 1024 via the ipfs config - that'll speed up key generation but you'll have a weaker key, which shouldn't be an issue for most applications.
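As a minimal sketch - assuming the key size is passed through the js-ipfs init options (check the docs for the exact field in your version) - lowering the RSA key from the 2048-bit default would look roughly like:

```js
const IPFS = require('ipfs')

// Generate a 1024-bit key instead of the 2048-bit default:
// faster identity generation, but a weaker key.
const node = new IPFS({
  init: { bits: 1024 }
})

node.on('ready', () => {
  console.log('identity generated, node ready')
})
```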
Ipfs does start up a lot faster with bundle.js (the current code), and after startup the CPU load fluctuates between 100% and 0%, much better than the constant 99% from before. But memory usage is much higher. So it seems that nodejs-mobile uses more CPU/memory than libp2p.
Is there any further optimization?
hmm - for me I saw an improvement in memory usage as well. What is your ipfs config?
I used ipfsd-ctl.
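For context, spawning a node with ipfsd-ctl of that era looked roughly like the sketch below - the in-process 'proc' type and the disposable flag are assumptions about this particular setup:

```js
const DaemonFactory = require('ipfsd-ctl')

// Run js-ipfs in-process ('proc') rather than controlling an external daemon.
const df = DaemonFactory.create({ type: 'proc', exec: require('ipfs') })

df.spawn({ disposable: true }, (err, ipfsd) => {
  if (err) throw err
  // ipfsd.api exposes the regular ipfs API
  ipfsd.api.id((err, id) => {
    if (err) throw err
    console.log(id)
  })
})
```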
The only thing I do differently is set
I also use the
See above for my explanation of the difference. It will impact performance mainly because it will be interacting with fewer peers - thus less muxing, etc. I wouldn't quite classify this as a performance enhancement though. Your best bet on this front is to wait for the connection manager to get integrated.
Is there a performance difference between adding
I find that the simulator is more performant than my iPhone 6+. There shouldn't be too much difference from adding that swarm address, because there's usually only one peer that's connected to it, as it's my own personal websocket discovery service. For me, the CPU goes up to 100% on load but then drops back down after a minute or so.
Do you mean the connectionManager option in the jsipfs configuration? The latest 0.30 version should already support it. What is the maximum number of connections?
Yes - by default the maximum number of connections is
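As a hedged sketch - assuming js-ipfs forwards a connectionManager option to libp2p's connection manager and that maxPeers is the relevant field (the option names may differ between versions, so check the docs) - capping connections might look like:

```js
const IPFS = require('ipfs')

// Assumed option names: connectionManager.maxPeers is a guess based on the
// libp2p connection manager; verify against your js-ipfs/libp2p version.
const node = new IPFS({
  connectionManager: {
    maxPeers: 5
  }
})

node.on('ready', () => {
  console.log('node ready')
})
```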
I set the maximum number of connections to 5, and it doesn't seem to change anything.
It's because of this: libp2p/js-libp2p#224
That issue has been closed - does that mean the problem has been solved and we can use it directly?
It'll be available in the upcoming libp2p@0.23 version. You can pull it directly from GitHub and override the dep in ipfs for now.
I have used the libp2p@0.23 version, and the situation has not improved. My current situation is that as soon as a cat or add is called on a slightly larger file (a local file), the CPU immediately jumps to 99%, the wait time becomes particularly long, and it feels unusable. Is there any other possible improvement, on the nodejs side or the ipfs side?
Ipfs starts up very quickly after
The use of the latest nodejs package has significantly improved the running speed.
When I first ran the program, judging from the log, identity generation was very slow - it takes a few minutes, and I don't know what caused it. The log is as follows:
2018-06-08T04:14:59.157Z jsipfs EXPERIMENTAL Relay is enabled
2018-06-08T04:14:59.564Z jsipfs booting
2018-06-08T04:14:59.664Z jsipfs boot:done
2018-06-08T04:14:59.664Z jsipfs init
2018-06-08T04:14:59.664Z jsipfs repo exists? false
2018-06-08T04:14:59.664Z jsipfs generating peer id: 2048 bits
2018-06-08T04:19:30.065Z jsipfs identity generated