
server stops on large files #876

Closed
stellanhaglund opened this issue Sep 21, 2015 · 17 comments

@stellanhaglund

Hi, I have tried both this:

app.get('/file/:imageId', function(req, res) {
  var remoteFile = bucket.file(req.params.imageId);
  var localFilename = 'uploads/' + req.params.imageId;
  var src = remoteFile.createReadStream();

  src
    .on('error', function(err) {
      console.log(err);
    })
    .on('data', function(data) {
      // no-op; the pipe below handles the data
    })
    .on('response', function(response) {
      // Server connected and responded with the specified status and headers.
    })
    .on('end', function() {
      // The file is fully downloaded.
      console.log('fully downloaded');
    });

  // Set the Content-Disposition header before any data is written to the response.
  res.attachment(localFilename);
  src.pipe(res);
  // .pipe(fs.createWriteStream(localFilename));
});

and this:

var remoteFile = bucket.file('large.zip')
remoteFile.createReadStream().pipe(fs.createWriteStream('uploads/test.dmg'))

and in both cases the server seems to stop. The file I'm trying with is about 300 MB, but I also need to be able to handle GB-sized files.

@stephenplusplus
Contributor

What version of Node are you running?

Do you receive any errors if you attach an error handler?

@stellanhaglund
Author

0.12.7

@stellanhaglund
Author

Where should I put the error handler?

The server completely freezes up so I have to restart it.

@stephenplusplus
Contributor

Interesting. Let's narrow the scope of testing to just this example:

var remoteFile = bucket.file('large.zip')
remoteFile.createReadStream().pipe(fs.createWriteStream('uploads/test.dmg'))

For an error handler, do:

var remoteFile = bucket.file('large.zip')
remoteFile.createReadStream()
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error);

The server freezing makes me think it's a local issue, but I'm not sure what; out of disk space?
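As a quick sanity check, a rough sketch (assuming a Linux VM and Node 0.12; not something specific to gcloud) for printing free memory and available disk space before starting the download:

var os = require('os');
var exec = require('child_process').exec;

// Free system memory, in MB.
console.log('Free memory: ' + Math.round(os.freemem() / 1048576) + ' MB');

// 'df -h .' reports available disk space for the current directory (Linux/OS X).
exec('df -h .', function(err, stdout) {
  if (err) {
    return console.error(err);
  }
  console.log(stdout);
});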

We can narrow down the test even more to just "download a huge file", as this is all that our read stream does behind the scenes. This next snippet downloads a 1.1GB file. Can you let me know if this finishes successfully?

var request = require('request');
var fs = require('fs');

request('http://releases.ubuntu.com/15.04/ubuntu-15.04-desktop-amd64.iso')
  .on('error', console.error)
  .pipe(fs.createWriteStream('ubuntu.iso'))
  .on('error', console.error);

@stellanhaglund
Author

I tried your last example and it gave no errors, and the file was placed on my server's disk. I will try the other one with error handling now.

@stellanhaglund
Author

It's not out of disk space; I have a clean Google Compute Engine instance running.

When I downloaded the Ubuntu file I was able to watch the bytes increase using ls -l in another terminal window, but when I try your other example with the error handler, using a file from the bucket, it says -bash: fork: Cannot allocate memory. That file is only 300 MB, and it stays stuck like that for a lot longer than it took to download the Ubuntu file.

and no errors
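Since "Cannot allocate memory" points at memory pressure, a rough sketch for confirming that (assuming the same `bucket` setup as in the earlier snippets; the file names are placeholders) is to log the process's memory usage while the download runs:

var fs = require('fs');

// Assumes `bucket` is already configured as in the snippets above;
// 'large.zip' and 'uploads/test.dmg' are placeholder names.
var remoteFile = bucket.file('large.zip');

// Log this process's memory usage once a second while the download runs,
// to see whether data is piling up in memory instead of streaming to disk.
var monitor = setInterval(function() {
  var usage = process.memoryUsage();
  console.log('rss: ' + Math.round(usage.rss / 1048576) + ' MB, heapUsed: ' +
    Math.round(usage.heapUsed / 1048576) + ' MB');
}, 1000);

remoteFile.createReadStream()
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error)
  .on('finish', function() {
    clearInterval(monitor);
  });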

@stellanhaglund
Author

After about 10 minutes I terminated the Node process, and only about 5 MB had been downloaded.

@stephenplusplus
Contributor

Thanks for the details. Are you instantiating gcloud the same way as in this post?

var projectId = 'idid'; // E.g. 'grape-spaceship-123' 

var gcloud = require('gcloud')({
    projectId: projectId
});

@stellanhaglund
Author

Yes, I do.

@stellanhaglund
Author

Uploading the large files works fine; it's just downloading that's the problem.

@stephenplusplus
Contributor

I'll be checking into this and will keep you posted.

@stellanhaglund
Author

Great thanks!


@stephenplusplus
Contributor

Can you tell me more about the Compute Engine instance you're using? Did you grant it access to the correct scopes? I just tested locally and on a VM, and it was able to download successfully.

Did you have any warnings or errors during npm install gcloud?
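If the scopes turn out to be the problem, one way to rule them out, as a sketch (the key file path and bucket name below are placeholders), is to authenticate with an explicit service-account key instead of relying on the instance's scopes:

var gcloud = require('gcloud')({
  projectId: 'idid',
  keyFilename: '/path/to/keyfile.json' // placeholder: a downloaded service-account key
});

var bucket = gcloud.storage().bucket('my-bucket'); // placeholder bucket name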

@stellanhaglund
Author

I will try to reinstall it and see if there are any.

@stellanhaglund
Author

I noticed something: it seems like the "large file" I have been testing with may be the problem. I used a random .dmg that I uploaded to the bucket.

That file stalls, but not the other ones, it seems.

I haven't tested this very much, but I have tried 150 MB files and they seem to download just fine.

@stephenplusplus
Contributor

Interesting. Is the dmg publicly available for me to test with?

Even with a corrupt file, I wouldn't suspect the download to cause a lockup. To skip some of the cycles we go through during a download, try setting validation to false:

remoteFile.createReadStream({ validation: false })
  .on('error', console.error)
  .pipe(fs.createWriteStream('uploads/test.dmg'))
  .on('error', console.error);

@stephenplusplus
Contributor

I'm going to close this one as it seems like things have been resolved. Please re-open if you run into any issues.
