How do I pass values to an Argo Workflow of Workflows?

I want to implement a generic CI pattern using Argo Workflows. One Workflow is triggered by the git server via an Argo Events webhook. After validating the signature, it clones the repo and starts a custom workflow from the git repo (with a fixed name, e.g. .argo-ci.yaml). The Workflow of Workflows pattern is recommended to be implemented via the template.resource field. When using the manifest key, I can use input variables as shown in the example. However, variables are not evaluated when using manifestFrom, which I would need in order to start the Workflow from git.
For a CI use case, the child workflow needs to be passed either the git repo artifact or at least the git revision as a variable, so that it can clone and check out the correct sources for which the CI run was triggered. Neither seems to be possible, as the manifest from template.resource.manifestFrom is applied as-is, without any templating.
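For reference, a minimal sketch of the variant that does support substitution: an inline manifest under template.resource, where the parent's parameters are evaluated before the child Workflow is created (the parameter name, image, and repo URL below are assumptions for illustration):

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: ci-parent-
spec:
  entrypoint: spawn-child
  arguments:
    parameters:
      - name: revision
        value: main
  templates:
    - name: spawn-child
      resource:
        action: create
        # Substitution works here because the manifest is inline; a manifest
        # loaded via manifestFrom is applied as-is, as described above.
        manifest: |
          apiVersion: argoproj.io/v1alpha1
          kind: Workflow
          metadata:
            generateName: ci-child-
          spec:
            entrypoint: checkout
            templates:
              - name: checkout
                container:
                  image: alpine/git
                  command: [sh, -c]
                  # The parent replaces {{workflow.parameters.revision}} with
                  # its own value before this child Workflow is submitted.
                  args: ["git clone https://example.com/repo.git /src && git -C /src checkout {{workflow.parameters.revision}}"]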

Related

How to sync Argo workflow-templates between a git repo and an Argo instance?

I have a git repository (hosted on GitHub) where I create my workflow-template YAMLs and upload them to Argo via the UI or REST API. Now whenever I update any workflow-template, I have to update it manually in two places, the git repo and Argo, and there is a chance of one of them being missed.
How can I automate updating the workflow-templates in the Argo service whenever the workflow-templates in the git repository change?
You can do this in several ways.
You can use Argo CD to do CD via GitOps; it will keep the cluster synced with the desired state from GitHub (preferred). A sketch follows below.
You can use an Argo Workflows event binding to trigger a workflow from a GitHub Action, fetch the change from GitHub, and apply it. https://github.com/argoproj/argo-workflows/blob/342abcd6d72b4cda64b01f30fa406b2f7b86ac6d/docs/events.md
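A minimal sketch of the Argo CD option mentioned above: an Application that keeps a git directory of WorkflowTemplates synced into the cluster (the repo URL, path, and namespaces are assumptions):

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: workflow-templates
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/workflow-templates.git
    targetRevision: main
    path: templates        # directory containing the WorkflowTemplate YAMLs
  destination:
    server: https://kubernetes.default.svc
    namespace: argo
  syncPolicy:
    automated:
      prune: true          # remove templates that were deleted from git
      selfHeal: true       # revert manual edits made via the UI or REST API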

How to trigger a Jenkins pipeline once a Bitbucket pipeline is finished?

I have the following requirement: I have a Jenkins pipeline that I want to be triggered once a Bitbucket pipeline has finished successfully. The problem is that I also need to pass some params, and I don't want to use an asynchronous process like Bitbucket webhooks.
Is there another way to trigger the Jenkins pipeline automatically while receiving multiple params?
I want to mention that these params can also be retrieved from the AWS resources created by that Bitbucket pipeline.
I faced the same issue, but I found a solution using Additional Behaviours
from the Git plugin.
Use "Polling ignores commits with certain messages", which allows you to:
ignore any revisions committed with a message matching the regular expression pattern when determining if a build needs to be triggered.
I am using [skip ci] in the commit message to commit changes after the Bitbucket pipeline finishes,
so I used a custom regex ^(?!.*\[skip ci\]).*$ to skip any commit not including the tag,
which results in Jenkins being triggered only once the Bitbucket pipeline has finished.

Build a manual GitLab CI pipeline job using a specific commit ID

I need to run a GitLab CI pipeline manually, not from the latest commit of my master branch but from a specific commit ID.
I have tried running the pipeline manually using the variable below and passing its value, but to no avail.
Input variable key: CI_COMMIT_SHA
At the time of this writing, GitLab only supports branch/tag pipelines, merge request pipelines and scheduled pipelines. You can't run a GitLab pipeline for a specific commit, since the same commit may belong to multiple branches.
To do what you want, you need to create a branch from the commit you want to run the pipeline for. Then you can run the manual pipeline on that branch.
See this answer for step-by-step instructions on how to create a branch from a commit directly in the GitLab UI.
Alternatively, use the existing workspace (created by GitLab CI) to run the .gitlab-ci.yml, and from there check out the code again in a different directory using the commit ID, then perform all the operations there. A sketch follows below.
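To illustrate that workspace approach, a minimal sketch of a .gitlab-ci.yml job, assuming a manually supplied TARGET_SHA variable (the variable, directory, and build.sh names are made up):

build-at-commit:
  when: manual
  script:
    # Clone the repo again inside the existing workspace and pin it to the
    # requested commit; CI_REPOSITORY_URL is a predefined GitLab variable.
    - git clone "$CI_REPOSITORY_URL" commit-checkout
    - git -C commit-checkout checkout "$TARGET_SHA"
    # build.sh stands in for whatever work should run against the pinned sources.
    - cd commit-checkout && ./build.sh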

GitLab Pipeline trigger: rerun latest tagged pipeline

We have an app (let’s call it the main repo) on GitLab CE that has a production build & deploy pipeline, which is only triggered when a tag is pushed. This is achieved in .gitlab-ci.yml via:
only:
- /^v.*$/
except:
- branches
We also have two other (let’s call them side) repositories (e.g. translations and utils). What I’d like to achieve is to rerun the latest (semver) tag’s pipeline of main, when either of those other side repositories’ master branches receives a push. A small detail is that one of the repositories is on GitHub, but I’d be happy to get them working on GitLab first and then work from there.
I presume I’d need to use the GitLab API to trigger the pipeline. What I’ve currently set up for the side repo on GitLab is a webhook integration for push events:
https://gitlab.com/api/v4/projects/{{ID}}/ref/master/trigger/pipeline?token={{TOKEN}}, where ID is the ID of the main project and TOKEN a pipeline trigger token for it.
However, this will only trigger a master pipeline for our main repo. How could I get this to (also) rerun the latest tag’s pipeline (or the latest tagged pipeline)?
Secondly, how would I go about triggering this on GitHub?
Either you can create a new pipeline, specifying a ref that can be a branch or a tag; in this case you need to know the exact tag value: https://docs.gitlab.com/ee/api/pipelines.html#create-a-new-pipeline
Or you can retry an already-executed pipeline by providing its id, which you can get from https://docs.gitlab.com/ee/api/pipelines.html#list-project-pipelines by sorting by id and filtering by ref. But that will give you the last pipeline with a tag matching /^v.*$/, which may not be the specific version you need.
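Putting the two API calls together, a sketch of a job in the side repo that looks up the main project's most recent tag and starts a pipeline for it (MAIN_PROJECT_ID, API_TOKEN, and TRIGGER_TOKEN are assumed CI/CD variables; note the tags API sorts by most recently updated by default, which may not be the highest semver):

retrigger-main:
  image: alpine:latest
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
  script:
    - apk add --no-cache curl jq
    # Fetch the most recently updated tag of the main project.
    - TAG=$(curl --silent --header "PRIVATE-TOKEN:$API_TOKEN" "https://gitlab.com/api/v4/projects/$MAIN_PROJECT_ID/repository/tags" | jq -r '.[0].name')
    # Start a new pipeline on that tag via the pipeline trigger endpoint.
    - curl --request POST --form "token=$TRIGGER_TOKEN" --form "ref=$TAG" "https://gitlab.com/api/v4/projects/$MAIN_PROJECT_ID/trigger/pipeline"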

Generate documentation on commit

What is the best practice for automatically generating Ruby documentation as part of a git commit workflow? We'd like to automatically generate new documentation whenever a commit is merged into master.
Use Git hooks.
If you want it to run on the client side, use a post-commit hook. Essentially, registering a hook (by storing it in .git/hooks) makes Git call an arbitrary script you provide after each successful git commit. Your script can then call RDoc or YARD to generate docs into some output directory (outside your source code repository, of course).
Alternatively, you can have it run on the server hosting your Git repo, using a post-receive hook. This will run after you push to the server's repo. For example, you could use this to automatically upload the new docs to /docs/dev/ on your project's web server.
For detailed instructions, see the chapter on hooks in the Git manual.
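For repositories hosted on a service where you can't install server-side hooks, the same post-receive idea can instead be expressed as a CI job. As one alternative, a minimal GitHub Actions workflow that runs YARD on each push to master (the Ruby version and output directory are assumptions):

name: docs
on:
  push:
    branches: [master]
jobs:
  generate-docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.2'
      - name: Generate docs with YARD
        run: |
          gem install yard
          yard doc --output-dir generated-docs
      # Publishing generated-docs (to a docs server, Pages, etc.) would be a separate step.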
