[cdk-pipelines] Infinite-loop in self-mutating pipeline #32008
Comments
Hi, thank you for your report. That fix was in 2020 and I believe it was a fix for CDK v1. Are we talking about a potential regression from a PR in 2020, and you are still affected even in
Are you able to provide a minimal code snippet that I can paste into my IDE to reproduce it? This would be very helpful in deciding what's best to do next. Thank you.
@pahud I wasn't aware it was for CDK v1. In that case, no, it would not be a regression. A minimal snippet is below.

```python
from aws_cdk import aws_codebuild as codebuild
from aws_cdk import aws_codepipeline as codepipeline
from aws_cdk import pipelines
from constructs import Construct


class PipelineReproExample(Construct):
    def __init__(self, scope: Construct, id: str, **kwargs) -> None:
        super().__init__(scope, id, **kwargs)
        build_spec = codebuild.BuildSpec.from_object(
            {
                "version": 0.2,
                "phases": {
                    "install": {
                        "commands": [
                            "echo Starting pipeline...",
                        ],
                    },
                },
            }
        )
        build_env = codebuild.BuildEnvironment(
            build_image=codebuild.LinuxArmBuildImage.from_code_build_image_id(
                "aws/codebuild/amazonlinux2-aarch64-standard:3.0"
            ),
            compute_type=codebuild.ComputeType.SMALL,
            privileged=True,
        )
        codebuild_defaults = pipelines.CodeBuildOptions(
            cache=codebuild.Cache.local(codebuild.LocalCacheMode.DOCKER_LAYER),
            build_environment=build_env,
            partial_build_spec=build_spec,
        )
        l2_codepipeline_pipeline = codepipeline.Pipeline(
            scope,
            "CodePipelineL2",
            pipeline_type=codepipeline.PipelineType.V2,
            cross_account_keys=True,
            # Note: PARALLEL mode only supports 5 docker assets built in parallel,
            # while SUPERSEDED mode supports 50 docker assets built in parallel.
            execution_mode=codepipeline.ExecutionMode.SUPERSEDED,
            # Restarts the pipeline when it's updated by self-mutation.
            restart_execution_on_update=True,
            reuse_cross_region_support_stacks=True,
        )
        synth_step = pipelines.CodeBuildStep(
            "SynthStep",
            commands=[
                "cdk synth <YOUR_STACK> -vvvvv --debug=true --trace --validation=true --long=true",
            ],
            cache=codebuild.Cache.local(codebuild.LocalCacheMode.DOCKER_LAYER),
        )
        pipeline = pipelines.CodePipeline(
            scope,
            "CodePipelineBasePipeline",
            code_pipeline=l2_codepipeline_pipeline,
            synth=synth_step,
            code_build_defaults=codebuild_defaults,
        )
```
Hi, we need a minimal code snippet that you can confirm reproduces the error in your environment, so we can better address it. Can you make sure the code you provided reproduces the behavior in your environment using CDK 2.164.1? And, as I do not have
@pahud I've edited the snippet slightly to make this as easy as possible. I can confirm this reproduces the error. You will need to add the building of custom docker images. Giving you all of that code would be prohibitive, so I haven't included it. It's straightforward.
As I said in the ticket, it loops infinitely. There is no error thrown. I've edited the snippet to remove that line.
@BwL1289 Thank you. I will validate this today and circle back.
@pahud let me know how else I can help. If this is somehow user error, I want to know. |
Hi @BwL1289 I was not able to reproduce the loop issue and it went pretty well. See the screenshot below. I updated your code, though, as it could not run in my env. Check out my full code below:

```python
class PipelineReproExample(Construct):
    def __init__(self, scope: Construct, id: str, **kwargs) -> None:
        super().__init__(scope, id, **kwargs)
        source_bucket = s3.Bucket(
            self,
            "SourceBucket",
            versioned=True,
            removal_policy=RemovalPolicy.DESTROY,
            auto_delete_objects=True,
        )
        # CfnOutput the bucket name
        CfnOutput(self, "SourceBucketName", value=source_bucket.bucket_name)
        build_spec = codebuild.BuildSpec.from_object(
            {
                "version": 0.2,
                "phases": {
                    "install": {
                        "commands": [
                            "echo Starting pipeline...",
                        ],
                    },
                },
            }
        )
        build_env = codebuild.BuildEnvironment(
            build_image=codebuild.LinuxArmBuildImage.from_code_build_image_id(
                "aws/codebuild/amazonlinux2-aarch64-standard:3.0"
            ),
            compute_type=codebuild.ComputeType.SMALL,
            privileged=True,
        )
        codebuild_defaults = pipelines.CodeBuildOptions(
            cache=codebuild.Cache.local(codebuild.LocalCacheMode.DOCKER_LAYER),
            build_environment=build_env,
            partial_build_spec=build_spec,
        )
        l2_codepipeline_pipeline = codepipeline.Pipeline(
            self,
            "CodePipelineL2",
            pipeline_type=codepipeline.PipelineType.V2,
            cross_account_keys=True,
            # Note: PARALLEL mode only supports 5 docker assets built in parallel,
            # while SUPERSEDED mode supports 50 docker assets built in parallel.
            execution_mode=codepipeline.ExecutionMode.SUPERSEDED,
            # Restarts the pipeline when it's updated by self-mutation.
            restart_execution_on_update=True,
            reuse_cross_region_support_stacks=True,
        )
        synth_step = pipelines.CodeBuildStep(
            "SynthStep",
            commands=[
                "npm install -g aws-cdk",
                "pip install -r requirements.txt",
                "cdk synth",
            ],
            input=pipelines.CodePipelineSource.s3(
                bucket=source_bucket,
                object_key="source.zip",
            ),
            cache=codebuild.Cache.local(codebuild.LocalCacheMode.DOCKER_LAYER),
        )
        pipeline = pipelines.CodePipeline(
            self,
            "CodePipelineBasePipeline",
            code_pipeline=l2_codepipeline_pipeline,
            synth=synth_step,
            code_build_defaults=codebuild_defaults,
        )
```

and in the CDK app file:

```python
#!/usr/bin/env python3
import os

import aws_cdk as cdk

from issue_triage_py.issue_triage_py_stack import PipelineReproExample

app = cdk.App()
stack = cdk.Stack(
    app,
    "cdk-python-stack",
    env=cdk.Environment(
        account=os.getenv("CDK_DEFAULT_ACCOUNT"),
        region=os.getenv("CDK_DEFAULT_REGION"),
    ),
)
PipelineReproExample(stack, "PipelineReproExample")
app.synth()
```

Initial deploy (will output the S3 bucket name):

```shell
$ cdk deploy
```

Zip up the source bundle and upload it to that S3 bucket:

```shell
$ zip -r ../source.zip . -x ".venv/*" -x "cdk.out/*" -x ".git/*"
$ aws s3 cp ../source.zip s3://<BUCKET_NAME>/source.zip
upload: ../source.zip to s3://<BUCKET_NAME>/source.zip
```

Click the release button in the CodePipeline console, or just wait for the polling to trigger the pipeline. The pipeline should go through with no error. No loop happens. Please note:
Let me know if my provided code works for you.
This issue has not received a response in a while. If you want to keep this issue open, please leave a comment below and auto-close will be canceled.
I am investigating, thanks.
Update: I've updated (still using type `V2` in `PARALLEL` mode) to use
- Execution 1 (intentional - triggered by a commit)
- Execution 2 (unintentional - triggered by
- Execution 3 (unintentional - triggered by
- ...

Either this is pure user error, there's a bug in CDK somewhere, or I'm simply missing something. Additionally, @pahud, your example won't (or at least shouldn't) trigger the behavior I am seeing, because you're not building any docker assets in your pipeline and therefore the asset hashes won't change. See my comment above.
Another update: I tore down the pipeline. I redeployed with
Some more context: I am using a custom docker image that I run all steps in. I don't know if or how this could be related to this issue. Having tried everything, I can now only use CDK pipelines in `V2` with
To summarize:
Again, the reason for all of this is that the docker asset hashes continue to be recalculated on every synth and are, for reasons I don't understand, producing different hashes.
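One way to pin down where the hash churn comes from (a diagnostic sketch of my own, not something from this thread) is to run `cdk synth -o out_a` and `cdk synth -o out_b` back to back with no source changes and diff the docker image entries in the generated `*.assets.json` cloud-assembly manifests. The manifest shape assumed here (a top-level `dockerImages` map keyed by source hash) follows the standard CDK asset-manifest layout; the helper names are hypothetical.

```python
import json
from pathlib import Path


def docker_asset_ids(manifest: dict) -> set:
    """Return the docker image asset ids (source hashes) from an asset manifest."""
    return set(manifest.get("dockerImages", {}).keys())


def diff_docker_assets(manifest_a: dict, manifest_b: dict):
    """Return (ids only in a, ids only in b).

    Non-empty sets mean the docker asset hashes changed between the two synths,
    which is exactly what would make the self-mutate step see a diff every run.
    """
    a, b = docker_asset_ids(manifest_a), docker_asset_ids(manifest_b)
    return a - b, b - a


def compare_synth_outputs(dir_a: str, dir_b: str) -> None:
    """Compare every *.assets.json manifest between two `cdk synth -o` output dirs."""
    for path_a in Path(dir_a).glob("*.assets.json"):
        path_b = Path(dir_b) / path_a.name
        if not path_b.exists():
            continue
        only_a, only_b = diff_docker_assets(
            json.loads(path_a.read_text()), json.loads(path_b.read_text())
        )
        if only_a or only_b:
            print(f"{path_a.name}: docker asset hashes differ: {only_a} vs {only_b}")
```

If this reports differing hashes across two clean synths, the fingerprint inputs (not the pipeline settings) are nondeterministic, which would explain the infinite self-mutation.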
@pahud any update on this? At the very least, were you able to test when building with docker assets? |
Describe the bug
The pipeline's `UpdatePipeline` stage succeeds and the pipeline restarts, which it is expected to do once when infrastructure is updated. However, when it restarts, it updates itself again. This loops infinitely, and the pipeline never reaches the `Assets` stage. It appears to happen on `V2`, with `SUPERSEDED` mode, and `restart_execution_on_update=True`. I could not reproduce it on `V2` with `PARALLEL` mode and `restart_execution_on_update=False`.
This appears to be caused by the docker asset hashes changing while the rest of the template stays the same. Synth is not introducing nondeterminism, and the Dockerfiles and directories are exactly the same between runs, but the hashes keep changing.
I initially reported this here and here.
This may be a regression of #9766 and the fix here.
#9080 is likely also related.
Regression Issue
Last Known Working CDK Version
No response
Expected Behavior
The pipeline self-mutates once and then continues to the `Assets` stage on the next cycle.
Current Behavior
The pipeline continuously self mutates in an infinite loop.
Reproduction Steps
On CodePipeline `V2`, use `restart_execution_on_update=True` in `SUPERSEDED` mode.
Possible Solution
No response
Additional Information/Context
I reverted to `PARALLEL` mode from `SUPERSEDED` and `restart_execution_on_update=False`. It's now able to progress to the `Assets` stage. I'm on version `2.164.1`.
For context:
- `SUPERSEDED` mode with `restart_execution_on_update=True` on CodePipeline `V1`.
- `V2`, using `PARALLEL` mode and `restart_execution_on_update=False`.
- `SUPERSEDED` mode with `restart_execution_on_update=False`.
- `SUPERSEDED` mode with `restart_execution_on_update=True`.
- `PARALLEL` mode and `restart_execution_on_update=False`.
I would like to switch back to `SUPERSEDED` mode with `restart_execution_on_update=True`, as `SUPERSEDED` mode supports building 50 docker assets in parallel while `PARALLEL` only supports 5, and I'd like to not worry about the pipeline restarting after infra changes.
CDK CLI Version
2.164.1
Framework Version
No response
Node.js Version
v20.15.1
OS
MacOS
Language
Python
Language Version
No response
Other information
No response