server stops on large files #876
What version of Node are you running? Do you receive any errors if you attach an error handler?
0.12.7
Where should I put the error handler? The server completely freezes up, so I have to restart it.
Interesting. Let's narrow the scope of testing to just this example:

var remoteFile = bucket.file('large.zip');
remoteFile.createReadStream().pipe(fs.createWriteStream('uploads/test.dmg'));

For an error handler, do:

var remoteFile = bucket.file('large.zip');
remoteFile.createReadStream()
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error);

The server freezing makes me think it's a local issue, but I'm not sure what; out of disk space? We can narrow down the test even more to just "download a huge file", as this is all that our read stream does behind the scenes. This next snippet downloads a 1.1GB file. Can you let me know if this finishes successfully?

var fs = require('fs');
var request = require('request');

request('http://releases.ubuntu.com/15.04/ubuntu-15.04-desktop-amd64.iso')
  .on('error', console.error)
  .pipe(fs.createWriteStream('ubuntu.iso'))
  .on('error', console.error);
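To make it easier to tell whether that plain-HTTP download actually finishes, here is a minimal sketch (assuming the same Ubuntu ISO URL and local filename as above) that also logs elapsed time and final file size on the write stream's 'finish' event:

var fs = require('fs');
var request = require('request');

var start = Date.now();
var out = fs.createWriteStream('ubuntu.iso');

request('http://releases.ubuntu.com/15.04/ubuntu-15.04-desktop-amd64.iso')
  .on('error', console.error)
  .pipe(out)
  .on('error', console.error);

// 'finish' fires once the write stream has flushed everything to disk.
out.on('finish', function () {
  var stats = fs.statSync('ubuntu.iso');
  console.log('done in', (Date.now() - start) / 1000, 's,', stats.size, 'bytes');
});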
I did your last example and it gave no errors, and the file is placed on my server disk. I will try the other one with error handling now.
It's not out of disk space; I have a clean Google Compute Engine instance running. When I downloaded the Ubuntu file, I was able to watch the bytes increase using ls -l in another terminal window. But when I try your other example with the error handler, using a file from the bucket, it says -bash: fork: Cannot allocate memory. That file is only 300MB, it stays that way for a lot longer than it took to download the Ubuntu file, and there are no errors.
After about 10 minutes I terminated the node process; only about 5MB had been downloaded.
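The -bash: fork: Cannot allocate memory error suggests the Node process may be holding the download in memory rather than streaming it to disk. A minimal sketch, assuming the same bucket/file setup used earlier in this thread (large.zip and uploads/test.dmg are placeholders), that logs the process's memory use every few seconds while the pipe runs, to see whether RSS or heap keeps growing:

var fs = require('fs');

var remoteFile = bucket.file('large.zip'); // bucket as configured earlier in the thread

// Log resident set size and heap usage every 5 seconds during the download.
var timer = setInterval(function () {
  var mem = process.memoryUsage();
  console.log('rss:', Math.round(mem.rss / 1048576) + 'MB',
              'heapUsed:', Math.round(mem.heapUsed / 1048576) + 'MB');
}, 5000);

remoteFile.createReadStream()
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error)
  .on('finish', function () {
    clearInterval(timer);
    console.log('download finished');
  });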
Thanks for the details. Are you instantiating gcloud the same as this post?

var projectId = 'idid'; // E.g. 'grape-spaceship-123'
var gcloud = require('gcloud')({
  projectId: projectId
});
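For reference, a minimal sketch of the setup the earlier snippets assume, based on the gcloud ~0.x API used in this thread (the project ID, key file path, and bucket name here are placeholders; keyFilename can be omitted on a GCE instance with the right scopes):

var gcloud = require('gcloud')({
  projectId: 'my-project-id',          // placeholder
  keyFilename: '/path/to/keyfile.json' // optional on GCE with proper scopes
});

var gcs = gcloud.storage();
var bucket = gcs.bucket('my-bucket');  // placeholder bucket name

// remoteFile is then used as in the snippets above.
var remoteFile = bucket.file('large.zip');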
Yes, I do.
Upload of the large files works fine; it's just download that's a problem.
I'll be checking into this and will keep you posted.
Great, thanks!
Can you tell me more about the Compute Engine instance you're using? Did you grant it access to the correct scopes? I just tested locally and on a VM, and it was able to download successfully. Did you have any warnings or errors during installation?
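One way to check which scopes the instance actually has is to query the Compute Engine metadata server from the VM. A minimal sketch, reusing the request module from the earlier snippet (the URL is the standard metadata path for the default service account):

var request = require('request');

// The metadata server lists the OAuth scopes granted to the instance's
// default service account; Cloud Storage downloads need a devstorage.*
// scope (e.g. read_only, read_write, or full_control).
request({
  url: 'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes',
  headers: { 'Metadata-Flavor': 'Google' }
}, function (err, res, body) {
  if (err) {
    return console.error(err);
  }
  console.log(body); // one scope URL per line
});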
I will try to reinstall it and see if there are any.
I noticed something: it seems like the "large file" I have tried may be the problem. I used a random dmg I uploaded to the bucket, and that file stops, but not the other ones, it seems. I haven't tested this very much, but I have tried 150MB files and they seem to download just fine.
Interesting. Is the dmg publicly available for me to test with? Even with a corrupt file, I wouldn't suspect the download to cause a lockup. To skip some of the cycles we go through during a download, try setting validation to false:

remoteFile.createReadStream({ validation: false })
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error);
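With validation turned off, the library's own integrity check is skipped. If you still want that check, here is a minimal sketch of verifying the downloaded file by hand against the remote MD5 (this assumes the object's metadata exposes a base64-encoded md5Hash field, as Cloud Storage objects do; paths and names are the ones used in this thread):

var crypto = require('crypto');
var fs = require('fs');

// Compute the MD5 of the local file and compare it to the object's md5Hash.
function verify(localPath, remoteFile, callback) {
  var hash = crypto.createHash('md5');

  fs.createReadStream(localPath)
    .on('error', callback)
    .on('data', function (chunk) { hash.update(chunk); })
    .on('end', function () {
      var localMd5 = hash.digest('base64');

      remoteFile.getMetadata(function (err, metadata) {
        if (err) {
          return callback(err);
        }
        callback(null, localMd5 === metadata.md5Hash);
      });
    });
}

// Usage:
// verify('uploads/test.dmg', bucket.file('large.zip'), function (err, ok) {
//   console.log(err || (ok ? 'checksums match' : 'checksum mismatch'));
// });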
I'm going to close this one as it seems like things have been resolved. Please re-open if you run into any issues.
Hi, I have tried both with

and

and in both cases the server seems to stop. The file I'm trying with is about 300MB, but I also need to be able to handle GB files.
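Putting the pieces from this thread together, a minimal end-to-end repro sketch (project ID, key file, bucket and object names are placeholders; the validation: false option and error handlers are the ones suggested above):

var fs = require('fs');

var gcloud = require('gcloud')({
  projectId: 'my-project-id',           // placeholder
  keyFilename: '/path/to/keyfile.json'  // omit on GCE if the instance has storage scopes
});

var bucket = gcloud.storage().bucket('my-bucket'); // placeholder
var remoteFile = bucket.file('large.zip');         // placeholder object name

remoteFile.createReadStream({ validation: false })
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error)
  .on('finish', function () {
    console.log('download finished:', fs.statSync('uploads/test.dmg').size, 'bytes');
  });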