Fix kie-issues #1217 - Replace quay.io with docker.io #3525
Conversation
Signed-off-by: Ricardo Zanini <zanini@redhat.com>
PR job Reproducer:
build-chain build full_downstream -f 'https://raw.githubusercontent.com/${AUTHOR:apache}/incubator-kie-kogito-pipelines/${BRANCH:main}/.ci/buildchain-config-pr-cdb.yaml' -o 'bc' -p apache/incubator-kie-kogito-runtimes -u #3525 --skipParallelCheckout
NOTE: To install the build-chain tool, please refer to https://github.com/kiegroup/github-action-build-chain#local-execution
Please look here: https://ci-builds.apache.org/job/KIE/job/kogito/job/main/job/pullrequest_jobs/job/kogito-runtimes-pr/job/PR-3525/1/display/redirect
Test results:
These are the test failures: org.kie.kogito.tracing.QuarkusTracingAddonDevServicesIT.testEvaluateLoanEligibility
io.quarkus.builder.BuildException: Build failure: Build failed due to errors
[error]: Build step org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor#startTrustyServiceDevService threw an exception: java.lang.RuntimeException: Failed to start Kogito TrustyService DevServices
    at org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor.startTrustyServiceDevService(KogitoDevServicesProcessor.java:145) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at io.quarkus.deployment.ExtensionLoader$3.execute(ExtensionLoader.java:849) at io.quarkus.builder.BuildContext.run(BuildContext.java:256) at org.jboss.threads.ContextHandler$1.runWith(ContextHandler.java:18) at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513) at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1538) at java.base/java.lang.Thread.run(Thread.java:833) at org.jboss.threads.JBossThread.run(JBossThread.java:501)
Caused by: java.lang.RuntimeException: org.testcontainers.containers.ContainerFetchException: Can't get Docker image: RemoteDockerImage(imageName=docker.io/apache/incubator-kie-kogito-trusty-postgresql:latest, imagePullPolicy=AlwaysPullPolicy(), imageNameSubstitutor=Chained substitutor of 'DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor')' and then 'Kogito Image Name Substitutor')
    at org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor.lambda$startTrustyService$8(KogitoDevServicesProcessor.java:271) at java.base/java.util.Optional.orElseGet(Optional.java:364) at org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor.startTrustyService(KogitoDevServicesProcessor.java:276) at org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor.startTrustyServiceDevService(KogitoDevServicesProcessor.java:133) ... 11 more
Caused by: org.testcontainers.containers.ContainerFetchException: Can't get Docker image: RemoteDockerImage(imageName=docker.io/apache/incubator-kie-kogito-trusty-postgresql:latest, imagePullPolicy=AlwaysPullPolicy(), imageNameSubstitutor=Chained substitutor of 'DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor')' and then 'Kogito Image Name Substitutor')
    at org.testcontainers.containers.GenericContainer.getDockerImageName(GenericContainer.java:1364) at org.testcontainers.containers.GenericContainer.doStart(GenericContainer.java:359) at org.testcontainers.containers.GenericContainer.start(GenericContainer.java:330) at org.kie.kogito.tracing.decision.quarkus.deployment.KogitoDevServicesProcessor.lambda$startTrustyService$8(KogitoDevServicesProcessor.java:267) ... 14 more
Caused by: com.github.dockerjava.api.exception.NotFoundException: Status 404: {"message":"pull access denied for apache/incubator-kie-kogito-trusty-postgresql, repository does not exist or may require 'docker login': denied: requested access to the resource is denied"}
    at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.execute(DefaultInvocationBuilder.java:241) at org.testcontainers.shaded.com.github.dockerjava.core.DefaultInvocationBuilder.lambda$executeAndStream$1(DefaultInvocationBuilder.java:269) at java.base/java.lang.Thread.run(Thread.java:833)
Review threads (now outdated and resolved) were opened on the following test resources:
...-kubernetes-service-catalog/runtime/src/test/resources/deployment/deployment-no-service.yaml
...ons/fabric8-kubernetes-service-catalog/runtime/src/test/resources/deployment/deployment.yaml
...ernetes-service-catalog/runtime/src/test/resources/deployment/pod-deployment-no-service.yaml
...service-catalog/runtime/src/test/resources/deployment/replica-set-deployment-no-service.yaml
...netes-service-catalog/runtime/src/test/resources/statefulset/statefulset-pod-no-service.yaml
...s/fabric8-kubernetes-service-catalog/runtime/src/test/resources/statefulset/statefulset.yaml
@@ -17,4 +17,4 @@
 # under the License.
 #

-quarkus.kogito.dev-services-trusty.image-name=quay.io/kiegroup/kogito-trusty-postgresql:${version.org.kie.kogito.trusty-service-postgresql}
+quarkus.kogito.dev-services-trusty.image-name=docker.io/apache/incubator-kie-kogito-trusty-postgresql:${version.org.kie.kogito.trusty-service-postgresql}
Where is this image published?
It's not. We've deprecated it. So we can either keep this image reference here for ubiquity, keep the outdated one, or remove this addon for good. I'd rather leave that decision to the maintainers of this addon. @gitgabrio, do you know?
OK, thank you for the clarifications.
I would like to preserve the tracing-decision addon as much as possible, but it has to be adjusted to remove the image reference that no longer exists.
@gitgabrio would it be possible to adjust the code to avoid using the image?
I think this property is for the dev service. We should remove the dev service or review it. I recommend doing so in another PR. Waiting for @gitgabrio's input. :D
@@ -31,6 +31,6 @@ mp.messaging.outgoing.kogito-tracing-model.topic=kogito-tracing-model
 mp.messaging.outgoing.kogito-tracing-model.value.serializer=org.apache.kafka.common.serialization.StringSerializer
 mp.messaging.outgoing.kogito-tracing-model.auto.offset.reset=earliest

-quarkus.kogito.dev-services-trusty.image-name=quay.io/kiegroup/kogito-trusty-postgresql:latest
+quarkus.kogito.dev-services-trusty.image-name=docker.io/apache/incubator-kie-kogito-trusty-postgresql:latest
I can't find this image's source; mind helping me locate it?
Same.
@porcelli these are just integration tests that use these YAMLs as mock resources to check the integration with Kubernetes. I'll change the outdated image references to bananas 🍌.
Thank you @ricardozanini for clarifying. Great idea to use bananas... so we don't confuse ourselves :)
Signed-off-by: Ricardo Zanini <zanini@redhat.com>
PR job Reproducer:
build-chain build full_downstream -f 'https://raw.githubusercontent.com/${AUTHOR:apache}/incubator-kie-kogito-pipelines/${BRANCH:main}/.ci/buildchain-config-pr-cdb.yaml' -o 'bc' -p apache/incubator-kie-kogito-runtimes -u #3525 --skipParallelCheckout
NOTE: To install the build-chain tool, please refer to https://github.com/kiegroup/github-action-build-chain#local-execution
Please look here: https://ci-builds.apache.org/job/KIE/job/kogito/job/main/job/pullrequest_jobs/job/kogito-runtimes-pr/job/PR-3525/2/display/redirect
Test results:
These are the test failures: org.kie.kogito.tracing.QuarkusTracingAddonDevServicesIT.testEvaluateLoanEligibility, with the same stack trace as the first run (io.quarkus.builder.BuildException caused by org.testcontainers.containers.ContainerFetchException: pull access denied for apache/incubator-kie-kogito-trusty-postgresql).
Rectifying my initial review... cc @jstastny-cz @rodrigonull
 if (version.endsWith('-SNAPSHOT')) {
-    return "${dataIndexEphemeralImageName}-nightly:${util.getMajorMinorVersion(version)}"
+    return "${dataIndexEphemeralImageName}:${util.getMajorMinorVersion(version)}"
 }
 return "${dataIndexEphemeralImageName}:${util.getMajorMinorVersion(version)}"
I think this needs adjustment. For the `main` branch the tag would be `main`, and that's fine. But when the branch is not `main`, we actually agreed to use the branch name to represent the stream, so the tag of this image would be `10.0.x`. Now, when we're doing an ACTUAL release, meaning the version does not end with `-SNAPSHOT`, that's when we're supposed to use `getMajorMinor()`. Let me know if I'm missing something.
In practice:
if version ends in `-SNAPSHOT`:
    tagVersion = branch name
else:
    tagVersion = getMajorMinor(...)
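To make that rule concrete, here is a minimal Groovy sketch (an illustration only, not the current Jenkins shared-library code): `dataIndexEphemeralImageName` and `getMajorMinorVersion` mirror the names used in the diff above, while the `branchName` parameter and the helper's implementation are assumptions made for the example.

```groovy
// Sketch only (not the current implementation) of the agreed tag-resolution rule.
String getMajorMinorVersion(String version) {
    // '10.0.0' or '10.0.0-SNAPSHOT' -> '10.0' (assumed helper behavior)
    return version.tokenize('.').take(2).join('.')
}

String resolveDataIndexEphemeralImage(String dataIndexEphemeralImageName, String version, String branchName) {
    if (version.endsWith('-SNAPSHOT')) {
        // Development stream: the tag follows the branch name (main, 10.0.x, ...).
        return "${dataIndexEphemeralImageName}:${branchName}"
    }
    // Actual release: the tag is the major.minor of the released version.
    return "${dataIndexEphemeralImageName}:${getMajorMinorVersion(version)}"
}

def image = 'docker.io/apache/incubator-kie-kogito-data-index-ephemeral'
assert resolveDataIndexEphemeralImage(image, '10.0.0-SNAPSHOT', '10.0.x') == image + ':10.0.x'
assert resolveDataIndexEphemeralImage(image, '10.0.0', '10.0.x') == image + ':10.0'
```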
With that said, this is not something that should only be done on the deploy job, as the code itself needs to reflect that, so people building locally and developing will have the same behavior. I guess the best solution would be calling:
mvn versions:set-property -Dproperty=data-index-ephemeral.image -DnewVersion=<new-value>
on this module during commits D and R, as per our release procedure described here: https://docs.google.com/document/d/1t-08b970i9PMBy-TSM64x0RUH46pzwnIhEi-LVeL0lg/edit
This way, the deployment CI doesn't need to alter this property when publishing, and the build from source is always accurate.
Better yet would be creating a new property to hold only the `tagVersion`, not the full `tag` of the image. This new property (e.g., `data-index-ephemeral.image.tagVersion`) can then be used to compose the existing `data-index-ephemeral.image` property, like `docker.io/apache/incubator-kie-kogito-data-index-ephemeral:${data-index-ephemeral.image.tagVersion}`.
Sorry for dumping the entire reasoning here :)
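To illustrate that last suggestion, a rough sketch under stated assumptions: it presumes a Jenkins pipeline context (the `sh` step and the `util.getMajorMinorVersion()` helper seen in the diff above), and the property name `data-index-ephemeral.image.tagVersion` is only the proposal from the previous paragraph, not something that exists today. The same `versions:set-property` call could equally be run manually while preparing the release commits D and R.

```groovy
// Sketch of the proposal, not actual pipeline or release code.
// During the release commits, only the tag property would be updated:
sh "mvn versions:set-property -Dproperty=data-index-ephemeral.image.tagVersion -DnewVersion=${util.getMajorMinorVersion(version)}"
// The full image reference would stay composed in the POM, e.g.
// <data-index-ephemeral.image>docker.io/apache/incubator-kie-kogito-data-index-ephemeral:${data-index-ephemeral.image.tagVersion}</data-index-ephemeral.image>
// so neither the deploy job nor local builds need to rewrite the full image name.
```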
@tiagobento I think we need another PR targeting CI changes.
Ok.
What is missing here for us to merge it? @ricardozanini @porcelli
Nothing from my side.
We need to be mindful that the build will be broken for quite a while due to the data-index image references... but I'm OK with merging, as there is no way to solve this in one shot.
@ricardozanini @tiagobento FYI: #3526 is an attempt to move the integration tests to -apps.
* Fix kie-issues apache#1217 - Replace quay.io with docker.io
  Signed-off-by: Ricardo Zanini <zanini@redhat.com>
* Replace outdated image with bananas 🍌
  Signed-off-by: Ricardo Zanini <zanini@redhat.com>
---------
Signed-off-by: Ricardo Zanini <zanini@redhat.com>
# Conflicts:
#	.ci/jenkins/Jenkinsfile.setup-branch
Fixes apache/incubator-kie-issues#1217
Many thanks for submitting your Pull Request ❤️!
Please make sure that your PR meets the following requirements:
Pull Request title is properly formatted: `KOGITO-XYZ Subject`
Pull Request title contains the target branch if not targeting main: `[0.9.x] KOGITO-XYZ Subject`
How to replicate CI configuration locally?
The Build Chain tool does "simple" Maven builds; the builds are just Maven commands. But because the repositories relate to and depend on each other, and any change in an API or class method could affect several of those repositories, the build-chain tool is needed to handle cross-repository builds and make sure we always use the latest version of the code for each repository.
build-chain is a build tool that can be used on the command line locally or in GitHub Actions workflow(s). If you need to change multiple repositories and send multiple dependent pull requests related to one change, you can easily reproduce the same build by executing it on the GitHub-hosted environment or locally in your development environment. See local execution details to get more information about it.
How to retest this PR or trigger a specific build:
for pull request checks: please add comment `Jenkins retest this`
for a specific pull request check: please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] tests`
for quarkus branch checks (run checks against the currently used Quarkus branch): please add comment `Jenkins run quarkus-branch`
for a quarkus branch specific check (run checks against the currently used Quarkus branch): please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] quarkus-branch`
for quarkus main checks (run checks against the Quarkus main branch): please add comment `Jenkins run quarkus-main`
for a specific quarkus main check (run checks against the Quarkus main branch): please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] quarkus-main`
for quarkus lts checks (run checks against the Quarkus LTS branch): please add comment `Jenkins run quarkus-lts`
for a specific quarkus lts check (run checks against the Quarkus LTS branch): please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] quarkus-lts`
for native checks: please add comment `Jenkins run native`
for a specific native check: please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] native`
for native lts checks (run native checks against the Quarkus LTS branch): please add comment `Jenkins run native-lts`
for a specific native lts check (run native checks against the Quarkus LTS branch): please add comment `Jenkins (re)run [kogito-runtimes|kogito-apps|kogito-examples] native-lts`
How to backport a pull request to a different branch?
In order to automatically create a backporting pull request, please add one or more labels having the format `backport-<branch-name>`, where `<branch-name>` is the name of the branch the pull request must be backported to (e.g., `backport-7.67.x` to backport the original PR to the `7.67.x` branch).
Once the original pull request is successfully merged, the automated action will create one backporting pull request per label (with the previous format) that has been added.
If something goes wrong, the author will be notified, and at that point a manual backport is needed.
Quarkus-3 PR check is failing... what to do?
The Quarkus 3 check applies patches from `.ci/environments/quarkus-3/patches`. The first patch, called `0001_before_sh.patch`, is generated from the OpenRewrite `.ci/environments/quarkus-3/quarkus3.yml` recipe. The patch is created to speed up the check, but some changes in the PR may have broken it.
No panic, there is an easy way to regenerate it. You just need to comment on the PR:
and it should, after some minutes (~20/30 min), apply a commit on the PR with the regenerated patch.
Other patches were generated manually. If any of them fails, you will need to manually update it... and push your changes.