I am using Bitbucket as the code repository and AWS CodeBuild/CodePipeline for CI/CD.
What is the value of the CODEBUILD_SOURCE_VERSION environment variable in AWS CodeBuild for a Bitbucket source?
To get the branch name in the build step, use the Environment variables feature.
Add an environment variable (Key: BRANCH, Value: #{SourceVariables.BranchName}) in the Build step; the variable can then be referenced inside the build script as $BRANCH.
See this for more details
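As a minimal sketch (assuming the variable was added to the build action as above), the buildspec can read it like any other environment variable; the build.sh call is hypothetical:

```yaml
# buildspec.yml - minimal sketch; the echo is only illustrative
version: 0.2
phases:
  build:
    commands:
      - echo "Building branch $BRANCH"
      - ./build.sh "$BRANCH"   # hypothetical script consuming the branch name
```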
In my case, my pipeline has a source stage that copies from Bitbucket and stores the output in S3. Because the source artifact is an S3 object, CODEBUILD_SOURCE_VERSION returns the S3 ARN:
Source version: arn:aws:s3:::codepipeline-<region>-<accountnumber>/<path>
We are using Terraform Enterprise (cloud) and Azure DevOps YAML pipelines for Azure infrastructure deployments.
Requirement: we want to separate the .tfvars files completely from the main Terraform folder and keep them in a different repo, called the config repository.
Solution 1: we could reference the tfvars file from the config repository when running the plan, for example:
terraform plan -var-file=<path-to-config-repo-tfvars>
We cannot implement this, though.
Note: since we are using global templates, the Terraform commands (fmt, validate, plan, and apply) are managed by the template itself, and we are not allowed to edit the template.
Here is the template's logic: it expects only a .tfvars file in the current directory, and some bash commands then rename it to .auto.tfvars (see the illustration below).
We know that Terraform identifies these .auto.tfvars files automatically.
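For illustration, the rename step amounts to something like this (the file name is a placeholder):

```sh
# performed by the template's bash step, not by us
mv dev.tfvars dev.auto.tfvars   # Terraform auto-loads *.auto.tfvars files
```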
Solution 2: this is what we would like to do, are struggling to implement, and need some help with.
By default the template copies all Terraform folders to the ADO agent container. If we can make sure the .tfvars file from the config repository is also available in the agent container, this solution will work.
Maybe we can achieve this by copying the .tfvars file from the config repository to the agent container with a shell script, but the file has to end up inside the Terraform folder, because only the Terraform folder is copied to the agent container.
Or is there any way to integrate a shell script with the Terraform configuration that downloads the tfvars file from the config repository into the container at runtime?
Any other solution or approach would be appreciated.
To make sure the config repo files are available at runtime, you can add a second artifact to the release pipeline. This will allow you to point your -var-file argument at the appropriate file.
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/artifacts?view=azure-devops
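If you are on YAML pipelines rather than classic releases, a multi-repo checkout achieves the same thing. A minimal sketch, assuming the config repository is an Azure Repos Git repo (the project, repo names, and file paths are placeholders):

```yaml
resources:
  repositories:
    - repository: config            # local alias for the config repo
      type: git                     # Azure Repos Git
      name: MyProject/ConfigRepo    # placeholder <project>/<repo>

steps:
  - checkout: self
  - checkout: config
  # With multiple checkouts, each repo lands under $(Pipeline.Workspace)/s/<repo-name>
  - bash: |
      cp "$(Pipeline.Workspace)/s/ConfigRepo/dev.tfvars" \
         "$(Pipeline.Workspace)/s/MyTerraformRepo/dev.auto.tfvars"
    displayName: Copy tfvars from the config repo into the Terraform folder
```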
One approach is to store your tfvars file as a secure file and add a step in your pipeline to download it. However, since you're using Terraform Enterprise, is there any particular reason not to use Terraform workspace variables instead?
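A minimal sketch of the secure-file approach (the file name and destination path are placeholders):

```yaml
steps:
  - task: DownloadSecureFile@1
    name: tfvars                    # reference name for the output variable
    inputs:
      secureFile: dev.tfvars        # uploaded under Pipelines > Library > Secure files
  - bash: |
      cp "$(tfvars.secureFilePath)" "$(Build.SourcesDirectory)/terraform/dev.auto.tfvars"
    displayName: Place the tfvars file where the template expects it
```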
I'm trying to set up a CI/CD pipeline for a .NET app that uses AWS Lambda, with AWS CDK for the infrastructure. The source is on GitHub, and we have a Jenkins pipeline that runs the tests and publishes the artifacts. I want to take that artifact and deploy it (or, better, use CodeDeploy).
Can I use CodePipeline to run cdk deploy?
How can I use CodeDeploy to run dotnet test and dotnet publish, and then pass the artifact on to CodePipeline?
CodePipeline is a workflow service; by itself it cannot execute any commands. What you need is a build/test service such as CodeBuild and/or Jenkins as part of the CodePipeline. That is where you will run commands like cdk deploy, dotnet test, and dotnet publish.
Once the deployment artifacts are ready in the build environment (produced by the aforementioned commands), the next CodePipeline stage can use them to deploy; this is where a service like CodeDeploy is used.
CodePipeline just orchestrates the workflow between the building-block services, such as CodeCommit (Source), CodeBuild (Build/Test), and CodeDeploy (Deploy). Many more integrations are available.
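For concreteness, a CodeBuild buildspec along these lines could run those commands. This is only a sketch; the runtime version, project paths, and flags are assumptions rather than your actual layout:

```yaml
# buildspec.yml
version: 0.2
phases:
  install:
    runtime-versions:
      dotnet: 6.0                  # assumed; match your CodeBuild image
    commands:
      - npm install -g aws-cdk     # the CDK CLI is a Node.js tool
  build:
    commands:
      - dotnet test ./tests        # placeholder test project path
      - dotnet publish ./src -c Release -o ./publish
      - cdk deploy --require-approval never
artifacts:
  files:
    - publish/**/*
```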
Hope this helps.
There is an example of this from AWS in the AWS CDK guide:
https://docs.aws.amazon.com/cdk/latest/guide/codepipeline_example.html
A working implementation using CodeCommit, with screenshots and a GitHub link, is described here:
https://nikhilbhojcloud.blogspot.com/2019/08/code-pipeline-using-aws-cdk-for-lambda.html
Apart from this, the AWS CDK team is building CI/CD support for CDK applications:
https://github.com/aws/aws-cdk/tree/master/packages/%40aws-cdk/app-delivery
You should use CodeBuild, not CodeDeploy, to run dotnet test.
CodePipeline typically has three stages:
Source
Build (CodeBuild)
Deploy (CodeDeploy)
For your use case, GitHub is the source, CodeBuild can build and test your application, and CodeDeploy can deploy the build artifacts to your environment.
To use CodeBuild you must provide a build specification (buildspec) reference.
Follow the link below for more information, including how to set this up in CodeBuild:
https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html
I want to build a project in Azure Pipelines, but I want to know the idiomatic way to obtain the latest tag, the distance from the latest tag, and the repo remote path/URL, so I can pass those values into the build script that lives inside the repository.
Previously our build script would invoke hg log -r . --template with a clever template, but when we moved to the Continua CI build server we found that the build agent doesn't have access to the actual repository during a build, and we had to find another way.
I'm assuming the same issue would crop up with Azure Pipelines, and I haven't quite found the relevant docs on artifact versioning yet.
Many thanks in advance.
For Git at least, Azure Pipelines does a full clone of the repo by default, unless you explicitly configure a shallow clone (source: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops).
Deriving the version/tag can therefore be done via normal Git commands (e.g. git describe --tags, or whatever you prefer), and the result can be saved as a VSO variable to be accessed in later steps in the same job (see https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-using-expressions for more info on how to do that).
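A minimal YAML sketch of that approach (the build script at the end is hypothetical; the remote URL is also available as the predefined variable Build.Repository.Uri):

```yaml
steps:
  - checkout: self
    fetchDepth: 0                   # full history, so tags are available
  - bash: |
      # latest tag, distance, and short hash, e.g. v1.2.0-14-ga1b2c3d
      VERSION=$(git describe --tags --long)
      echo "##vso[task.setvariable variable=REPO_VERSION]$VERSION"
    displayName: Derive version from tags
  - bash: ./build.sh --version "$(REPO_VERSION)" --repo "$(Build.Repository.Uri)"
    displayName: Run build script (hypothetical)
```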
I have a front-end build and deploy that I want to run on CircleCI with Node. The deploy step needs a config file with API keys and passwords, which I don't want to store in Git. How do I add a config file to my build?
Disclaimer: Developer Evangelist at CircleCI.
The CircleCI config file itself is stored in Git. API keys, passwords, and other secrets should be stored as private environment variables via the CircleCI UI.
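A minimal sketch of a config that writes the secret-bearing file from those variables at build time (the image, file name, variable names, and npm scripts are placeholders):

```yaml
# .circleci/config.yml
version: 2.1
jobs:
  build-and-deploy:
    docker:
      - image: cimg/node:18.17      # placeholder image
    steps:
      - checkout
      - run: npm ci && npm run build
      - run:
          name: Write deploy config from private env vars
          command: |
            cat > deploy-config.json <<EOF
            { "apiKey": "${API_KEY}", "password": "${DEPLOY_PASSWORD}" }
            EOF
      - run: npm run deploy
workflows:
  main:
    jobs:
      - build-and-deploy
```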
I'm trying to auto-deploy a Laravel app from a GitHub branch to AWS EC2 or Elastic Beanstalk (preferred), but I haven't found the right solution; one of the tutorials I have followed is linked below. Does anyone have a solution for this?
Thank you in advance!
https://aws.amazon.com/blogs/devops/building-continuous-deployment-on-aws-with-aws-codepipeline-jenkins-and-aws-elastic-beanstalk/
You can do this with the following steps:
Set up Jenkins with the GitHub plugin.
Install the AWS Elastic Beanstalk CLI.
Create an IAM user with Elastic Beanstalk deployment privileges and add its access keys to the AWS CLI (if Jenkins runs inside an EC2 instance, you can create a role with the required permissions and attach it to the instance instead of creating a user).
In the Jenkins project, clone the branch, go to the project directory, and execute eb deploy in a shell step to deploy to Elastic Beanstalk; a sketch follows this list. (You can automate this with a build trigger that fires when new code is pushed to the branch.)
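A minimal sketch of that shell step (the application/environment names and region are assumptions):

```sh
# Jenkins "Execute shell" build step
pip install --user awsebcli                                 # EB CLI
eb init my-laravel-app --platform php --region us-east-1    # one-time setup
eb deploy my-laravel-env                                    # deploys the current directory
```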
Alternatively, there are other approaches, for example:
Blue/green deployment with Elastic Beanstalk.
Deploying a Git branch to a specific environment.
Using AWS CodeStar to set up the deployment from templates (internally it uses AWS CodePipeline, CodeDeploy, etc.).
An alternative to using eb deploy is the Jenkins AWS Beanstalk Publisher plugin: https://wiki.jenkins.io/display/JENKINS/AWS+Beanstalk+Publisher+Plugin
It can be installed by going to Manage Jenkins > Manage Plugins and searching for AWS Beanstalk Publisher. The Root Object is the zip file of the project that you want to deploy to EB, and the Build Steps can include a step that zips up the files in your repo.
You will still need to fill in the Source Control Management section of the Jenkins job configuration. This must contain the URL of your GitHub repo and the credentials used to access it.
Add an Execute Shell build step that zips up the files from the repo that you want to deploy to EB; for example, zip -r myfiles.zip * will zip up all the files within your GitHub repo.
Then use the AWS Beanstalk Publisher plugin and specify myfiles.zip as the value of the Root Object (File / Directory).