wget returns a non-zero exit code when it fails, so that makes sense, but curl won't: it gets the 429 Too Many Requests error, saves the HTML error page to disk, and exits 0, so the gem then tries to bunzip2 that page, resulting in an error message like:

```
bunzip2: phantomjs-2.1.1-linux-x86_64.tar.bz2 is not a bzip2 file.
tar: phantomjs-2.1.1-linux-x86_64.tar: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
```
Passing -f to curl "fixes" that problem: instead of failing with a meaningless error message, the gem will raise

```ruby
raise "\n\nFailed to load phantomjs! :(\nYou need to have cURL or wget installed on your system.\nIf you have, the source of phantomjs might be unavailable: #{package_url}\n\n"
```
which still leaves the problem of everybody's CI failing transiently due to Bitbucket rate-limit errors, but at least it fails with a meaningful error message.
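Even with -f, one cheap extra safeguard would be to check the downloaded file's magic bytes before untarring; bzip2 archives always start with the bytes "BZh". A minimal sketch (hypothetical helper, not the gem's actual code):

```ruby
# Hypothetical helper, not the gem's actual code: bzip2 archives begin with
# the magic bytes "BZh", so an HTML error page saved by curl can be rejected
# with a clear message before it ever reaches bunzip2/tar.
def looks_like_bzip2?(path)
  File.binread(path, 3) == "BZh"
end
```

This would turn "is not a bzip2 file" into an error the gem controls, regardless of which HTTP client did the download.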
A better "fix" would be to retry in a loop, say 10 times with a 1-second delay between attempts, or to try multiple sources (bitbucket→s3, github→s3, etc.), but this is a start.
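The retry-with-fallback idea could be sketched like this, assuming a hypothetical `download` helper that raises on any HTTP error (which is what curl -f gives you):

```ruby
# Sketch of the proposed retry loop with mirror fallback: up to 10 attempts,
# 1 second apart, cycling through the candidate sources. `download` is a
# hypothetical helper that raises on any HTTP error (e.g. 429).
def fetch_package(urls, attempts: 10)
  attempts.times do |i|
    url = urls[i % urls.size]
    begin
      return download(url)
    rescue StandardError => e
      warn "attempt #{i + 1} failed for #{url}: #{e.message}"
      sleep 1
    end
  end
  raise "\n\nFailed to load phantomjs! :(\nTried: #{urls.join(', ')}\n\n"
end
```

Cycling through the mirrors on each attempt means a single rate-limited host no longer blocks the whole install.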
Hitting this now. The NPM version of this package allows an option to use a CDN to install phantomjs, maybe that would be useful here? Additionally, it would be nice to pass in an option to avoid re-downloading phantomjs if the tar file already exists. I can make these changes once I get a few spare minutes.