
Better error message on publish when crate is too big #1709

Closed
carols10cents opened this issue Apr 6, 2019 · 9 comments

@carols10cents
Member

Not sure yet if this is a crates.io problem, a cargo problem, or both, but someone was having trouble publishing and cargo only printed that it got a 503 response, so it took a while to figure out that the problem was that their .crate file was over the 10MB size limit.

The code looks like we're returning a nice error, but the output they got from cargo was:

error: failed to get a 200 OK response, got 503
headers:
    HTTP/1.1 100 Continue

    HTTP/1.1 503 Service Unavailable
    Connection: keep-alive
    Server: Cowboy
    Date: Sat, 06 Apr 2019 10:50:38 GMT
    Content-Length: 506
    Content-Type: text/html; charset=utf-8
    Cache-Control: no-cache, no-store

body:
<!DOCTYPE html>
    <html>
      <head>
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <meta charset="utf-8">
        <title>Application Error</title>
        <style media="screen">
          html,body,iframe {
            margin: 0;
            padding: 0;
          }
          html,body {
            height: 100%;
            overflow: hidden;
          }
          iframe {
            width: 100%;
            height: 100%;
            border: 0;
          }
        </style>
      </head>
      <body>
        <iframe src="//www.herokucdn.com/error-pages/application-error.html"></iframe>
      </body>
    </html>

so it looks like cargo is hitting the generic fallback arm when it should be hitting the arm that prints the error message returned from crates.io. So something isn't quite right, and it seems like it's on the crates.io side.
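
For context, crates.io reports errors as a JSON body of the form {"errors": [{"detail": "..."}]}. Below is a minimal sketch (not cargo's actual code) of the branching being described, assuming the serde (with derive) and serde_json crates and hypothetical local names: prefer the registry's structured message, and fall back to dumping the raw status and body only when the body is not that JSON (as with Heroku's HTML error page above).

    use serde::Deserialize;

    // Shape of crates.io's structured error responses (local helper types).
    #[derive(Deserialize)]
    struct ApiErrors {
        errors: Vec<ApiError>,
    }

    #[derive(Deserialize)]
    struct ApiError {
        detail: String,
    }

    /// Prefer the registry's own error message; fall back to the raw status
    /// and body when the body is not the expected JSON.
    fn describe_failure(status: u32, body: &str) -> String {
        match serde_json::from_str::<ApiErrors>(body) {
            Ok(api) if !api.errors.is_empty() => api
                .errors
                .iter()
                .map(|e| e.detail.as_str())
                .collect::<Vec<_>>()
                .join(", "),
            _ => format!("failed to get a 200 OK response, got {}\nbody:\n{}", status, body),
        }
    }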

@sgrif
Contributor

sgrif commented Apr 6, 2019

This is the hard timeout from Heroku -- most likely the request body hadn't even fully arrived by the time this response was sent. There's nothing we can improve on our end other than not having requests take more than 30 seconds (which is the goal of publish v2).

@sgrif
Contributor

sgrif commented Apr 6, 2019

Or to be more specific: to improve this we'd need to add our own hard timeout that is shorter than Heroku's -- it would most likely need to live entirely within nginx, since this typically errors before reaching our router.

@carols10cents
Member Author

Ah. Perhaps we could change the error message Cargo shows in this case -- still show the 503, but then say "This may happen if your crate file is too large. Check the size of the .crate file by [...], and if it is over 10MB, exclude unnecessary files by [...] or send an email to help@crates.io for an exception. If this is not the case, then crates.io might be down, please check ping.crates.io and let someone know by [...]"

@sgrif
Contributor

sgrif commented Apr 7, 2019 via email

@Eh2406
Contributor

Eh2406 commented Apr 7, 2019

Cargo could show the special message only if the reply takes ~30 sec (and is a 503 from crates.io). I would be r+ on a custom error message if we think this case gets hit.

@kennytm
Member

kennytm commented Apr 7, 2019

    This may happen if your crate file is too large. Check the size of the .crate file by [...], and if it is over 10MB, exclude unnecessary files by [...] or send an email to help@crates.io for an exception. If this is not the case, then crates.io might be down, please check ping.crates.io and let someone know by [...]

Nitpick: since cargo knows exactly how large the .crate file is at https://github.com/rust-lang/cargo/blob/ed858daaf7c62f6f2156c47d00b0c669bcf9b183/src/crates-io/lib.rs#L210, I'd prefer more assertive wording here:

If the size of the .crate file is >10 MB (or a lower threshold):

    This may happen if your crate file is too large. The size of the .crate file by [...] is {} MB. Consider excluding unnecessary files by [...] or send an email to help@crates.io for an exception. If this is not the case, then crates.io might be down, please check ping.crates.io and let someone know by [...].

If the size of the .crate file is ≪10 MB:

    crates.io might be down, please check ping.crates.io and let someone know by [...].
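
A minimal sketch of the branching kennytm suggests, keyed off the .crate size cargo already knows before uploading. The constant, function name, and exact wording here are illustrative rather than cargo's implementation; the 10 MB figure is the limit mentioned above.

    // Illustrative limit: the 10 MB cap discussed in this issue.
    const MAX_CRATE_SIZE: u64 = 10 * 1024 * 1024;

    /// Choose an error hint based on the size of the .crate file, which is
    /// known before the upload starts. Hypothetical helper, not cargo's code.
    fn size_based_hint(crate_size: u64) -> String {
        if crate_size > MAX_CRATE_SIZE {
            format!(
                "the .crate file is {:.1} MB, which is over the 10 MB limit; \
                 consider excluding unnecessary files or emailing \
                 help@crates.io for an exception",
                crate_size as f64 / (1024.0 * 1024.0)
            )
        } else {
            "crates.io might be down; please check ping.crates.io".to_string()
        }
    }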

@sgrif
Contributor

sgrif commented Apr 7, 2019 via email

@kzys
Contributor

kzys commented Apr 10, 2019

Does (or can) cargo publish send the size of the crate file as Content-Length at the beginning of the request?

Regarding Heroku's behavior, https://devcenter.heroku.com/articles/request-timeout is helpful.
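
For reference, the registry web API documentation describes the publish body as a little-endian u32 length, the JSON metadata, another little-endian u32 length, and then the .crate tarball, all assembled in memory, so the total Content-Length is knowable before the request is sent. A rough sketch of that framing (a hypothetical helper, not cargo's code):

    /// Assemble a publish request body: length-prefixed JSON metadata followed
    /// by the length-prefixed .crate tarball. Because everything is in memory,
    /// Content-Length is simply the final buffer's length.
    fn publish_body(metadata_json: &[u8], tarball: &[u8]) -> Vec<u8> {
        let mut body = Vec::with_capacity(8 + metadata_json.len() + tarball.len());
        body.extend_from_slice(&(metadata_json.len() as u32).to_le_bytes());
        body.extend_from_slice(metadata_json);
        body.extend_from_slice(&(tarball.len() as u32).to_le_bytes());
        body.extend_from_slice(tarball);
        body
    }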

sgrif added a commit to sgrif/cargo that referenced this issue May 13, 2019
crates.io is hosted on Heroku, which means we have a hard 30 second time
limit on all requests. Typically this is only hit when someone is
attempting to upload a crate so large that it would have been eventually
rejected anyway, but it can also happen if a user is on a very slow
internet connection.

When this happens, the request is terminated by the platform and we have
no control over the response that gets sent. This results in the user
getting a very unhelpful error message from Cargo, and some generic
error page HTML spat out into their terminal. We could work around this
on our end by adding a 29 second timeout *somewhere else* in the stack,
but we have a lot of layers that buffer requests to protect against slow
client attacks, and it'd be a pretty decent amount of work. Since we
eventually want to switch over to having Cargo do the S3 upload instead
of us, it doesn't make sense to spend so much time on an error scenario
that eventually will go away.

I've tried to keep this uncoupled from crates.io as much as possible,
since alternate registries might not be hosted on Heroku or have the
same restrictions. But I figure "a 503 that took more than 30 seconds"
is a safe bet on this being hit. If we're ok with coupling this to
crates.io, I'd like to include "If your crate is less than 10MB you can
email help@crates.io for assistance" in the error message.

Ref rust-lang/crates.io#1709
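
A minimal sketch of the heuristic this commit describes ("a 503 that took more than 30 seconds"); the threshold, names, and message wording below are illustrative rather than cargo's actual implementation.

    use std::time::{Duration, Instant};

    // Illustrative threshold matching Heroku's hard 30-second request timeout.
    const TIMEOUT_HINT_AFTER: Duration = Duration::from_secs(30);

    /// Decide whether to append a timeout hint to the generic non-200 error:
    /// a 503 returned only after ~30 seconds is likely the platform cutting
    /// the request off, not an application-level rejection.
    fn timed_out_hint(status: u32, started: Instant) -> Option<&'static str> {
        if status == 503 && started.elapsed() >= TIMEOUT_HINT_AFTER {
            Some(
                "the remote server appears to have timed out the request; \
                 this usually means the upload took too long (for example the \
                 .crate file is very large or the connection is slow)",
            )
        } else {
            None
        }
    }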
bors added a commit to rust-lang/cargo that referenced this issue May 14, 2019
Give a better error message when crates.io requests time out

@sgrif
Contributor

sgrif commented May 15, 2019

Improved the error message on the cargo side in rust-lang/cargo#6936 (comment)

@sgrif sgrif closed this as completed May 15, 2019