Need to run a GitLab pipeline job only if a prior job succeeds
Question
I have a project in GitLab that has pipeline jobs to build the image, run unit tests and push the built image to a repository.
What needs to happen:
- Application is built.
  - If the build fails, fail the pipeline. Otherwise, build artifacts are produced.
- Unit tests are run.
  - If unit tests fail, fail the pipeline.
  - Make test results (pass/fail) available as artifacts in the pipeline.
- Built application is pushed to an image repository.
  - Needs build job artifacts.
  - Does not need unit test artifacts.
Pipeline:
stages:
  - build
  - test
  - release

...

build-image:
  stage: build
  extends:
    - .kaniko-build
  variables:
    DOCKERFILE: Dockerfile
    KUBERNETES_MEMORY_REQUEST: 2Gi
    KUBERNETES_MEMORY_LIMIT: 2Gi

run-unit-tests:
  stage: test
  extends:
    - .kaniko-build
  variables:
    DOCKERFILE: Unit-Tests.Dockerfile
    KUBERNETES_MEMORY_REQUEST: 1Gi
    KUBERNETES_MEMORY_LIMIT: 1Gi
  after_script:
    - mkdir -p test/surefire-reports
    - cp -r /app-build/target/surefire-reports ./test
  artifacts:
    when: always
    paths:
      - test/surefire-reports
    reports:
      junit:
        - test/surefire-reports/*.xml

push-to-harbor:
  stage: release
  extends:
    - .release-local-image
  variables:
    SOURCE_IMAGE_NAME: $IMAGE_NAME
    SOURCE_IMAGE_TAG: $CI_PIPELINE_ID
    TARGET_PROJECT: phactory-images
    TARGET_IMAGE_NAME: $IMAGE_NAME
    TARGET_IMAGE_TAG: $CI_COMMIT_REF_SLUG
  needs:
    - build-image

...
What happens:
- The build runs successfully.
- The unit test and push to harbor jobs execute in parallel. The needs clause of the push to harbor job allows it to run in parallel with the unit test job even though it's in a later stage.
- The built application image is pushed to the Harbor repository.
- Unit tests pass. This job finishes last since it takes longer to run than the push job.
The problem with this is that if the unit test job fails due to a test failure, it does not prevent the built image from being pushed to the repository. This is not what I need.
A suggestion (see prior answer) was made to remove the needs clause from the push to harbor job. This forces the push job to run after the unit test job. If the unit test job fails, the push job does not run. According to the documentation, GitLab, by default, flows artifacts from prior jobs to following jobs.
Problem solved, right? Unfortunately, no. If the unit tests pass, the push job executes, but the image that gets pushed is the one produced by the unit test job's Dockerfile. The build job artifacts are nowhere to be found.
I spent some time looking at other attributes of GitLab pipeline jobs and found the dependencies clause (see link below). I've tried it in various ways, but it doesn't seem to work as advertised. It's still pushing the unit test Dockerfile image to the repository and completely ignoring the artifacts of the build job I specified.
push-to-harbor:
  stage: release
  extends:
    - .release-local-image
  variables:
    SOURCE_IMAGE_NAME: $IMAGE_NAME
    SOURCE_IMAGE_TAG: $CI_PIPELINE_ID
    TARGET_PROJECT: phactory-images
    TARGET_IMAGE_NAME: $IMAGE_NAME
    TARGET_IMAGE_TAG: $CI_COMMIT_REF_SLUG
  dependencies:
    - build-image
Does the dependencies clause not work? Am I using it the wrong way?
GitLab documentation:
https://docs.gitlab.com/ee/ci/yaml/#needs
Use needs to execute jobs out-of-order. Relationships between jobs that use needs can be visualized as a directed acyclic graph.
You can ignore stage ordering and run some jobs without waiting for others to complete. Jobs in multiple stages can run concurrently.
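A rough sketch of what that looks like in practice (hypothetical job names, not part of the pipeline above): a later-stage job that lists an earlier job in needs starts as soon as that job finishes instead of waiting for its whole stage:
stages:
  - test
  - deploy

lint:
  stage: test
  script:
    - echo "linting"

publish-docs:
  stage: deploy
  # starts as soon as lint finishes, without waiting for the
  # rest of the test stage to complete
  needs:
    - lint
  script:
    - echo "publishing docs"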
https://docs.gitlab.com/ee/ci/yaml/#dependencies
Use the dependencies keyword to define a list of jobs to fetch artifacts from. You can also set a job to download no artifacts at all.
If you do not use dependencies, all artifacts from previous stages are passed to each job.
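A rough sketch of the two cases (hypothetical release-stage jobs, reusing the build-image job and stages from the pipeline above):
release-with-build-artifacts:
  stage: release
  # fetch only the artifacts produced by build-image
  dependencies:
    - build-image
  script:
    - echo "using build-image artifacts only"

release-without-artifacts:
  stage: release
  # download no artifacts at all
  dependencies: []
  script:
    - echo "no artifacts downloaded"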
Answer 1
Score: 1
To achieve your goal, just remove needs from the push-to-harbor job. In this case each job will depend on the jobs in previous stages and will run only if all previous jobs finish successfully. This is the default behavior. Also, by default, all artifacts from previous stages are passed to each job.
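For context, a minimal sketch of what that change would look like, reusing the job definition from the question with the needs clause dropped (untested; it assumes the same .release-local-image template):
push-to-harbor:
  stage: release
  extends:
    - .release-local-image
  variables:
    SOURCE_IMAGE_NAME: $IMAGE_NAME
    SOURCE_IMAGE_TAG: $CI_PIPELINE_ID
    TARGET_PROJECT: phactory-images
    TARGET_IMAGE_NAME: $IMAGE_NAME
    TARGET_IMAGE_TAG: $CI_COMMIT_REF_SLUG
  # no needs clause: the job now waits for the test stage and only
  # runs if build-image and run-unit-tests both succeed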