
That’s not my experience at all. You do know you can run arbitrary shell scripts, and that means they don’t need to think of every possible use case?

If your CI process is so complex that YAML + arbitrary code doesn't work, you might want to get that checked; it's not normal.



Random example:

I want to define a setup job to build & push a base Docker image to our container registry. I then want to use this image as IMAGE in all subsequent jobs. This is impossible because the IMAGE field in YAML cannot be dynamic (determined at runtime) but I'd like to version/tag my image using the $CI_COMMIT_SHA.


This is definitely possible. We've been building an image in one phase and then running it in subsequent phases for years with gitlab.

I think you are going about this wrong. Are you generating an image tag dynamically? When you are tagging the image, make sure that you generate the tag deterministically based on information that is available to gitlab when the pipeline is created.

So for example, you could use the tag foo:$PIPELINE_ID instead of foo:$random
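A minimal sketch of that pattern (GitLab's actual predefined variable is $CI_PIPELINE_ID; stage names and the registry path are illustrative):

```yaml
# Build the base image with a tag that is deterministic at
# pipeline-creation time, then reuse it in a later job.
stages:
  - setup
  - test

build-base:
  stage: setup
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/base:$CI_PIPELINE_ID" .
    - docker push "$CI_REGISTRY_IMAGE/base:$CI_PIPELINE_ID"

run-tests:
  stage: test
  # This works because $CI_PIPELINE_ID is already known when the
  # pipeline is created, so GitLab can expand it in the image: keyword.
  image: $CI_REGISTRY_IMAGE/base:$CI_PIPELINE_ID
  script:
    - make test
```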


My apologies, I misspoke: I wanted the image tag not to be $CI_COMMIT_SHA but to be a hash dynamically generated from certain files in the repo. The issue is that IMAGE won't accept a dynamically generated environment variable (passed from job to job via a dotenv artifact).


I think you could just jam anything you need from the dotenv file into outputs/variables in the pipeline YAML, then do something like this: https://stackoverflow.com/a/71575683/2751619


No, this doesn't work. Also note that the StackOverflow link is about Github Actions, not Gitlab CI.


Did you know that with Gitlab you can generate gitlab ci yaml at job runtime and then run that yaml as a child pipeline using trigger:include:artifact?

This was the only way I could create dynamic terraform pipelines which changed depending on a plan output.

I'm sure you could use it to achieve what you've described.
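A sketch of that setup, assuming a generator script of your own (the script path and job names here are illustrative):

```yaml
stages:
  - generate
  - trigger

generate-pipeline:
  stage: generate
  script:
    # Any script can emit the child pipeline's YAML at runtime,
    # e.g. based on a terraform plan output.
    - ./scripts/generate-child-pipeline.sh > child-pipeline.yml
  artifacts:
    paths:
      - child-pipeline.yml

run-child:
  stage: trigger
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-pipeline
    # Make the parent pipeline's status depend on the child's.
    strategy: depend
```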


Thank you, that's indeed a good point. And yes, I did consider that. However, then the Gitlab UI (pipelines overview etc.) ceases to be very useful as everything will be inside one big child pipeline (i.e. individual jobs will no longer be shown in the overview). My coworkers would have hated me.


Can't you use build time variables for this? https://docs.gitlab.com/ee/ci/variables/#pass-an-environment...


I tried this but it didn't work.


Addendum: To see why it doesn't work, please see https://news.ycombinator.com/item?id=38527692 (which includes a slight correction of my original comment).


The image may be determined at runtime, but it’s not required to exist until a runner picks up the job. So use the $CI_COMMIT_SHA in the image name and push the image in a job that runs before the other jobs that use the image.
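A sketch of that ordering, assuming the image lives in the project's own registry (stage and job names are illustrative):

```yaml
build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"

use-image:
  stage: test
  needs:
    - build-image
  # $CI_COMMIT_SHA is known at pipeline creation; the image only has
  # to exist by the time a runner picks up this job, which needs:
  # guarantees here.
  image: $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
  script:
    - ./run-tests.sh
```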

You might also want to look into Downstream Pipelines.


(Please see https://news.ycombinator.com/item?id=38527692 for a correction of my original comment.)

The issue is that the IMAGE field in YAML doesn't pick up environment variables that are being passed on from a previous job via a dotenv artifact.
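A sketch of the failing pattern being described (file names and the hashing command are illustrative):

```yaml
compute-tag:
  stage: setup
  script:
    # Derive a tag from file contents and export it via a dotenv report.
    - echo "IMAGE_TAG=$(sha256sum Dockerfile | cut -d' ' -f1)" >> build.env
  artifacts:
    reports:
      dotenv: build.env

use-image:
  stage: test
  # $IMAGE_TAG from the dotenv report is visible in script: below,
  # but it is NOT expanded in the image: keyword, so this job
  # cannot pull the intended image.
  image: $CI_REGISTRY_IMAGE/base:$IMAGE_TAG
  script:
    - echo "$IMAGE_TAG"   # the variable itself works fine here
```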

As for downstream pipelines, please see https://news.ycombinator.com/item?id=38525954



