429 Too Many Requests from GitHub Actions, can't build anything #140

Open
clemlesne opened this issue Jun 7, 2023 · 33 comments

@clemlesne

Receiving error 429 Too Many Requests for the last two hours while pulling mcr.microsoft.com/dotnet/aspnet:6.0-jammy.

Date: June 7, 2023, 6:00 PM

Short error:

Error: buildx failed with: ERROR: failed to solve: failed to compute cache key: failed to copy: httpReadSeeker: failed open: unexpected status code https://mcr.microsoft.com/v2/dotnet/aspnet/blobs/sha256:d3b756117bfc66a1c14dcc282f851891ef89aae8f856f4104f5eead552a718f1: 429 Too Many Requests

Long error:

ERROR: failed to solve: failed to compute cache key: failed to copy: httpReadSeeker: failed open: unexpected status code https://mcr.microsoft.com/v2/dotnet/aspnet/blobs/sha256:d3b756117bfc66a1c14dcc282f851891ef89aae8f856f4104f5eead552a718f1: 429 Too Many Requests
Error: buildx failed with: ERROR: failed to solve: failed to compute cache key: failed to copy: httpReadSeeker: failed open: unexpected status code https://mcr.microsoft.com/v2/dotnet/aspnet/blobs/sha256:d3b756117bfc66a1c14dcc282f851891ef89aae8f856f4104f5eead552a718f1: 429 Too Many Requests
@ptr727

ptr727 commented Aug 17, 2023

Similar issue here: running a matrix, builds fail with error 429:

ERROR: failed to solve: mcr.microsoft.com/dotnet/sdk:7.0-alpine: failed to copy: httpReadSeeker: failed open: unexpected status code https://mcr.microsoft.com/v2/dotnet/sdk/manifests/sha256:9efa4cb38fb3b957595b4dd60a028044a1f7750d058405ab428153c3aa30ec01: 429 Too Many Requests
Error: buildx failed with: ERROR: failed to solve: mcr.microsoft.com/dotnet/sdk:7.0-alpine: failed to copy: httpReadSeeker: failed open: unexpected status code https://mcr.microsoft.com/v2/dotnet/sdk/manifests/sha256:9efa4cb38fb3b957595b4dd60a028044a1f7750d058405ab428153c3aa30ec01: 429 Too Many Requests

@AndreHamilton-MSFT
Contributor

@clemlesne we are actively working to reduce some of the throttling you have been noticing and will update once we have more information.

@benmccallum

We're getting this too.

Unfortunately, Docker caching with the gha backend (GitHub Actions cache) is just not feasible given GHA's 10 GB cache limit and Docker images with layers that are several GB each.

Patiently waiting for the S3-backed caching and hoping we can make do for now. 🙏
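
For anyone hitting that size limit, a registry-backed buildx cache is one alternative to the gha backend, since it is not bound by the 10 GB Actions cache quota. A minimal sketch with docker/build-push-action (the ghcr.io refs are placeholders, not from this thread, and this does not eliminate the manifest request to mcr.microsoft.com, only repeat blob downloads):

```yaml
- uses: docker/setup-buildx-action@v3
- uses: docker/build-push-action@v5
  with:
    context: .
    push: true
    tags: ghcr.io/your-org/app:latest
    # Store and reuse the build cache in a registry you control instead
    # of the size-limited GitHub Actions cache backend.
    cache-from: type=registry,ref=ghcr.io/your-org/app:buildcache
    cache-to: type=registry,ref=ghcr.io/your-org/app:buildcache,mode=max
```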

@clemlesne
Author

Builds are still failing regularly from my managed GitHub Actions runners.

@telia-ankita

Hi team,
We are also facing the same issue. Has anyone found a solution?

@AndreHamilton-MSFT
Contributor

We are working on ways to reduce the likelihood of this occurring.

@perSjov

perSjov commented May 7, 2024

Still happening today

@bnneupart

We are also seeing build failures because of this...

@alensindicic

This is still an issue.

@ptr727

ptr727 commented May 7, 2024

Also getting buildx failed with: ERROR: failed to solve: error writing layer blob: maximum timeout reached

I resorted to using max-parallel: 4 in the matrix strategy; builds now take forever, but they seem to be more reliable.
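
For reference, max-parallel sits under strategy next to the matrix definition; a minimal sketch (job names and matrix values are illustrative, not from this thread):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      # Limit concurrent matrix jobs so fewer simultaneous pulls hit MCR.
      max-parallel: 4
      matrix:
        version: ["6.0", "7.0", "8.0"]
    steps:
      - uses: actions/checkout@v4
      - run: docker build --build-arg SDK_VERSION=${{ matrix.version }} .
```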

@boukeversteegh

Hello, our build pipelines are all blocked due to 429s on the container registry:

az webapp deployment slot swap --slot staging --name **** --resource-group ****

Warning: Unable to fetch all az cli versions, please report it as an issue on https://github.com/Azure/CLI/issues. Output: Ref A: 4237CA5349C14533AAC450D8F8CB4763 Ref B: DM2EDGE0507 Ref C: 2024-05-08T08:31:59Z
, Error: SyntaxError: Unexpected token 'R', "Ref A: 423"... is not valid JSON
Starting script execution via docker image mcr.microsoft.com/azure-cli:2.59.0
Error: Error: Unable to find image 'mcr.microsoft.com/azure-cli:2.59.0' locally
docker: Error response from daemon: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 34A2CDDED8114A60AB26872384E74885 Ref B: DM2EDGE0912 Ref C: 2024-05-08T08:31:59Z".

@ameya-karbonhq

Hello,
We are also experiencing the same and blocking our deployments.

Starting script execution via docker image mcr.microsoft.com/azure-cli:2.55.0
Error: Error: Unable to find image 'mcr.microsoft.com/azure-cli:2.55.0' locally
2.55.0: Pulling from azure-cli
96526aa774ef: Pulling fs layer
430548f4d4bf: Pulling fs layer
9ae8a48eae03: Pulling fs layer
2d30bba99930: Pulling fs layer
3d288dfecc47: Pulling fs layer
2a58a5c1116a: Pulling fs layer
4f4fb700ef54: Pulling fs layer
2d30bba99930: Waiting
3d288dfecc47: Waiting
2a58a5c1116a: Waiting
4f4fb700ef54: Waiting
docker: error pulling image configuration: download failed after attempts=1: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 657DA19E112344D5A432995949A2CA40 Ref B: DM2EDGE1016 Ref C: 2024-05-08T08:36:29Z".
See 'docker run --help'.

cleaning up container...
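
One stopgap, assuming the workflow can tolerate a delay, is to pre-pull the image with retries and backoff before the step that needs it, so a single transient 429 does not fail the whole job. A sketch, not an official fix:

```yaml
- name: Pre-pull azure-cli image with retries
  run: |
    # Retry the pull with growing backoff so a transient 429 from
    # mcr.microsoft.com does not immediately fail the deployment step.
    for i in 1 2 3 4 5; do
      if docker pull mcr.microsoft.com/azure-cli:2.55.0; then
        exit 0
      fi
      echo "Pull attempt $i throttled, backing off..."
      sleep $((i * 30))
    done
    echo "All pull attempts were throttled" >&2
    exit 1
```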

@grzesuav

Observed the same issue from an AKS cluster - Azure/AKS#4279

@lfraile

lfraile commented May 10, 2024

Hello, one more here with errors 429 :(

@grzesuav

In the AKS issue I created, they acknowledge there is a throttling issue in the centralus region.

@osbash

osbash commented May 10, 2024

I am experiencing the same thing. This needs to be resolved immediately, as it's completely put a stop to our deployments.

@jay-pawar-mastery

I am seeing ERROR: pulling from host mcr.microsoft.com failed with status code [manifests 8.0]: 429 Too Many Requests as well.

@dejancg

dejancg commented May 10, 2024

Another one here, since today.

@lol768

lol768 commented May 11, 2024

Also experiencing this :(

@vitalykarasik

It seems that caching should be the answer to this issue, but I'm not sure which method I should use in my GitHub workflow.
I'm using mcr.microsoft.com both for "dotnet test" and as the base image for my .NET build, i.e.

docker run --rm -v $(pwd):/app -w /app mcr.microsoft.com/dotnet/sdk:6.0 dotnet test testdir
and
docker build .
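
Not an authoritative answer, but within a single job you can pull the SDK image once and let both steps reuse the daemon's local copy, so MCR is hit once instead of once per command. A sketch using the tag and testdir path from the comment above:

```yaml
steps:
  - uses: actions/checkout@v4
  - name: Pull the .NET SDK base image once
    run: docker pull mcr.microsoft.com/dotnet/sdk:6.0
  - name: Run tests in the already-pulled image
    run: docker run --rm -v "$(pwd):/app" -w /app mcr.microsoft.com/dotnet/sdk:6.0 dotnet test testdir
  - name: Build the image
    # The default docker build driver reuses the locally pulled base
    # image rather than re-resolving it from mcr.microsoft.com (buildx
    # container drivers do not see the local daemon cache, so this
    # assumes the default driver).
    run: docker build .
```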

@da-zu

da-zu commented May 13, 2024

Me too, our CI/CD pipelines are failing randomly due to 429s. Please fix ASAP! :)

@codehunter13

Same problem over here; pipelines are often failing with this error.

@npiskarev

npiskarev commented May 13, 2024

ERROR: failed to solve: mcr.microsoft.com/dotnet/aspnet:8.0: pulling from host mcr.microsoft.com failed with status code [manifests 8.0]: 429 Too Many Requests

The same issue happened just now.

@mentallabyrinth

Using Azure Pipelines, experiencing the same issue:

#3 [internal] load metadata for mcr.microsoft.com/dotnet/aspnet:8.0
#3 ERROR: pulling from host mcr.microsoft.com failed with status code [manifests 8.0]: 429 Too Many Requests
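
On the Azure Pipelines side, task-level retries can paper over intermittent throttling; a minimal sketch using the built-in retryCountOnTaskFailure property (the inputs below are illustrative, not from this thread):

```yaml
- task: Docker@2
  displayName: Build image
  # Re-run the whole task on failure; this helps when a pull is
  # throttled transiently, though it cannot fix sustained 429s.
  retryCountOnTaskFailure: 3
  inputs:
    command: build
    Dockerfile: '**/Dockerfile'
```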

@enf0rc3

enf0rc3 commented May 13, 2024

Using GitHub Actions, we're also having this issue intermittently. We have about five Docker containers being built in one GitHub Action (same base image); what is the best approach to caching?

   2 | >>> FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build-env
   3 |     WORKDIR /app
   4 |     
--------------------
ERROR: failed to solve: mcr.microsoft.com/dotnet/sdk:8.0: pulling from host mcr.microsoft.com failed with status code [manifests 8.0]: 429 Too Many Requests
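
When several images share the same base, one hedged approach is to pull that base once per runner and build the rest locally, so only one manifest request reaches MCR per run (the paths and image names below are placeholders; this assumes the default docker build driver, which reuses the daemon's local images):

```yaml
steps:
  - uses: actions/checkout@v4
  - name: Pull the shared SDK base once
    run: docker pull mcr.microsoft.com/dotnet/sdk:8.0
  - name: Build all service images from the local base
    run: |
      # Each Dockerfile starts FROM mcr.microsoft.com/dotnet/sdk:8.0;
      # after the single pull above, these builds resolve it locally.
      for svc in api worker scheduler; do
        docker build -f "services/$svc/Dockerfile" -t "myorg/$svc:ci" .
      done
```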

@AndreHamilton-MSFT
Contributor

We are in the process of rolling out new hardware that will resolve this issue, and we are already seeing decreased throttling in the last few hours. It will take a while to roll this out globally, but you should already be seeing improvements if you are located in the Central US regions.

@lol768

lol768 commented May 14, 2024

Many thanks for acknowledging this, and for the detail on plans to resolve it, @AndreHamilton-MSFT.

@lfraile

lfraile commented May 14, 2024

Thank you @AndreHamilton-MSFT !!

@stevef51

So we use Azure DevOps for our CI builds. We have seen this issue over the last several weeks, and I am still getting build issues today. At the moment I am having to manually re-run builds and cross my fingers that I don't get the Too Many Requests error; it almost seems like a 50:50 on any given build.


@AndreHamilton-MSFT when you say "should be seeing improvements if you are located in the central us regions" - I am guessing that is referring to the location of our build machines; we are using Azure VMs in Central US.

@AndreHamilton-MSFT
Contributor

@stevef51 correct. We are still in the process of rolling out the new hardware, and I suspect your traffic landed on older hardware more prone to throttling. I'm going to see if I can make some further tweaks to reduce the overall throttling likelihood. Until we roll out globally you may still see some throttling, but we are working to roll this out as quickly and safely as possible.

@bsripuram

@AndreHamilton-MSFT, we are still facing this issue. May I know the ETA for when the rollout will be completed globally?

@AndreHamilton-MSFT
Contributor

@bsripuram we are mostly rolled out. How have things been looking over the last week?

@benmccallum

I believe things have stabilised a lot on our end as I can't recall seeing this for quite a while.
