Script to replace text in the post_build step of AWS CodeBuild & CodeDeploy - continuous-integration

I have a CI/CD pipeline working well on AWS.
Let's say I have a simple HTML file with a URL homolog.test.com.
Is there a way, when I merge the repository to master, for the post_build step of CodeBuild & CodeDeploy to run through all files and replace homolog. with www.?
The point is that I don't want to change all the files by hand to test in homolog and then change them back by hand before merging to master... I would like to have homolog working fine, and when I merge to master the URLs would be replaced at deploy time...
I know the correct approach would be to have the correct URLs in the correct repository, but to make testing easier it would be nice to have it work this other, less trivial way...
Would a "sed" shell command work, like this example? Unix Shell Loop through files and replace texts
Would it be something like this in the post_build section? And how do I specify the correct folder on S3?
sed -i 's/homolog\./www./g' *.html
My BuildSpec is very simple:
version: 0.2
phases:
  pre_build:
    commands:
      - echo Installing source NPM dependencies...
      - npm install
  build:
    commands:
      - cp dev-ops/config.production.js src/config.js
      - export PUBLIC_URL=/app/
      - npm run build
  post_build:
    commands:
      - aws s3 sync site-beta s3://www.test.com/ --cache-control max-age=3600
      - aws s3 sync build s3://www.test.com/app/ --cache-control max-age=3600
artifacts:
  files:
    - build/**/*
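To answer the sed question: yes, a plain sed pass over the build output in post_build would work, as long as it runs before the aws s3 sync commands, so the rewritten files are what gets uploaded; there is no need to target any folder on S3 directly. A sketch, assuming the HTML lives under the build/ folder from the buildspec above (the sample file is for illustration only):

```shell
# Stand-in for a file from the real build output (illustration only)
mkdir -p build
echo '<a href="https://homolog.test.com/page">link</a>' > build/index.html

# The replacement step to add to post_build, before the s3 sync commands:
# rewrite every "homolog." prefix to "www." in all HTML files.
# The dot is escaped so sed matches it literally rather than as a wildcard.
find build -name '*.html' -exec sed -i 's/homolog\./www./g' {} +
```

To keep the homolog build untouched, the sed step could be gated on the branch, e.g. by checking CODEBUILD_WEBHOOK_HEAD_REF in the buildspec.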

Related

Using a bash script in AWS CICD Code Pipeline

In my company we have to use AWS native for the CICD process. I'm used to deploying my services via a bash script which makes deployments very flexible. However, I can't seem to find an option in AWS CodePipeline in which I can call on a script. In the build phase I am able to call on a buildspec.yml file, but in the deploy phase I would like to be able to call on a bash script. Does anyone know if that's possible using AWS CodePipeline?
This is what I put in the deployspec.yml to make it call on a bash script:
version: 0.2
env:
  shell: bash
phases:
  install:
    runtime-versions:
      python: 3.8
    commands:
      - source deployspec.sh && create-stack-name && upload-to-s3 && create-cf-stack
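For readers wondering what the sourced script looks like: only the function names above come from the question; the bodies below are illustrative stubs, not the real deployment logic. Note that hyphenated function names require bash, which is why `shell: bash` is set in the env section.

```shell
#!/bin/bash
# deployspec.sh -- helper functions sourced by the deployspec.yml above.
# Function bodies here are placeholders (assumptions), not the asker's code.

create-stack-name() {
  # Derive a stack name (example logic; the real derivation is not in the question)
  STACK_NAME="myservice-$(date +%Y%m%d%H%M%S)"
  export STACK_NAME
}

upload-to-s3() {
  # Placeholder for: aws s3 cp template.yml "s3://my-deploy-bucket/$STACK_NAME/"
  echo "uploading template for $STACK_NAME"
}

create-cf-stack() {
  # Placeholder for: aws cloudformation create-stack --stack-name "$STACK_NAME" ...
  echo "creating stack $STACK_NAME"
}
```

Because the script is sourced rather than executed, variables exported by one function remain visible to the next function in the same && chain.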

AWS Code Pipeline deploy results in a 404

I am trying to create a pipeline for an existing application. It is a React/Java Spring Boot application. It usually gets bundled into a single war file and uploaded to ElasticBeanstalk. I created my codebuild project and when I run it manually it will generate a war file that I can then upload to ElasticBeanstalk and everything works correctly. The buildspec for that is below:
version: 0.2
phases:
  install:
    commands:
      - echo Nothing to do in the install phase...
  pre_build:
    commands:
      - echo Nothing to do in the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn -Pprod package -X
  post_build:
    commands:
      - echo Build completed on `date`
      - mv target/cogcincinnati-0.0.1-SNAPSHOT.war cogcincinnati-0.0.1-SNAPSHOT.war
artifacts:
  files:
    - cogcincinnati-0.0.1-SNAPSHOT.war
When I run this build step in my pipeline it generates a zip file that gets dropped onto S3. My deploy step takes that build artifact and sends it to ElasticBeanstalk. Elasticbeanstalk does not give me any errors, but when I navigate to my url, I get a 404.
I have tried uploading the zip directly to Elasticbeanstalk and I get the same result. I have unzipped the file and it does appear to have all of my project files.
When I look at the server logs, I do not see any errors. I don't understand why CodeBuild appears to generate a war file when I run it manually, but a zip when executed in CodePipeline.
Change the artifact war file name to ROOT.war; this will resolve your problem. Your application actually is deployed successfully, but on a different path. This is built-in Tomcat behavior: naming the war ROOT makes Tomcat serve the application at '/'.
So the updated buildspec.yml will be:
version: 0.2
phases:
  install:
    commands:
      - echo Nothing to do in the install phase...
  pre_build:
    commands:
      - echo Nothing to do in the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn -Pprod package -X
  post_build:
    commands:
      - echo Build completed on `date`
      - mv target/cogcincinnati-0.0.1-SNAPSHOT.war ROOT.war
artifacts:
  files:
    - ROOT.war
It seems your application is failing; you should review the logs from the Beanstalk environment, especially:
"tomcat8/catalina.out"
"tomcat8/catalina.[date].log"
[1] Viewing logs from Amazon EC2 instances in your Elastic Beanstalk environment - https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.logging.html
For more details about using Tomcat platform on EB environment, you can refer to this document:
- Using the Elastic Beanstalk Tomcat platform - https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/java-tomcat-platform.html
About the folder structuring in your project, please refer to this document:
- Structuring your project folder - https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/java-tomcat-platform-directorystructure.html
Try adding discard-paths: yes at the end of the buildspec.yml file. That will help you resolve the path error.
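For clarity on placement: discard-paths is a key of the artifacts section, not a top-level one. With the ROOT.war rename above it is not strictly needed, but this is where it would go:

```yaml
artifacts:
  files:
    - ROOT.war
  discard-paths: yes
```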

Codebuild Workflow with environment variables

I have a monorepo GitHub project with multiple different applications that I'd like to integrate into an AWS CodeBuild CI/CD workflow. My issue is that if I make a change to one project, I don't want to update the other. Essentially, I want to create a logical fork that deploys differently based on the files changed in a particular commit.
Basically my project repository looks like this:
- API
  - node_modules
  - package.json
  - dist
  - src
- REACTAPP
  - node_modules
  - package.json
  - dist
  - src
- scripts
  - 01_install.sh
  - 02_prebuild.sh
  - 03_build.sh
- .ebextensions
In terms of Deployment, my API project gets deployed to elastic beanstalk and my REACTAPP gets deployed as static files to S3. I've tried a few things but decided that the only viable approach is to manually perform this deploy step within my own 03_build.sh script - because there's no way to build this dynamically within Codebuild's Deploy step (I could be wrong).
Anyway, my issue is that I essentially need to create a decision tree to determine which project gets executed, so if I make a change to API and push, it doesn't automatically deploy REACTAPP to S3 unnecessarily (and vice versa).
I managed to get this working on localhost by updating environment variables at certain points in the build process and then reading them in separate steps. However, this fails on CodeBuild because of permission issues, i.e. I don't seem to be able to update env variables from within the CI process itself.
Explicitly, my buildconf.yml looks like this:
version: 0.2
env:
  variables:
    VARIABLES: 'here'
    AWS_ACCESS_KEY_ID: 'XXXX'
    AWS_SECRET_ACCESS_KEY: 'XXXX'
    AWS_REGION: 'eu-west-1'
    AWS_BUCKET: 'mybucket'
phases:
  install:
    commands:
      - sh ./scripts/01_install.sh
  pre_build:
    commands:
      - sh ./scripts/02_prebuild.sh
  build:
    commands:
      - sh ./scripts/03_build.sh
I'm running my own shell scripts to perform some logic and I'm trying to pass variables between scripts: install->prebuild->build
To give one example, here's the 01_install.sh where I diff each project version to determine whether it needs to be updated (excuse any minor errors in bash):
#!/bin/bash
# STAGE 1
# _______________________________________
# API PROJECT INSTALL
# Do if API version was changed in prepush (this is just a sample and I'll likely end up storing the version & previous version within the package.json):
if ! diff ./api/version.json ./api/old_version.json > /dev/null 2>&1
then
  echo "🤖 Installing dependencies in API folder..."
  cd ./api/ && npm install
  ## Set a variable to be used by the 02_prebuild.sh script
  TEST_API="true"
  export TEST_API
else
  echo "No change to API"
fi
# ______________________________________
# REACTAPP PROJECT INSTALL
# Do if REACTAPP version number has changed (similar to above):
...
Then in my next stage I read these variables to determine whether I should run tests on the project 02_prebuild.sh:
#!/bin/bash
# STAGE 2
# _________________________________
# API PROJECT PRE-BUILD
# Do if install was initiated
if [[ $TEST_API == "true" ]]; then
  echo "🤖 Run tests on API project..."
  cd ./api/ && npm run tests
  echo $TEST_API
  BUILD_API="true"
  export BUILD_API
else
  echo "Don't test API"
fi
# ________________________________
# TODO: Complete for REACTAPP, similar to above
...
In my final script I use the BUILD_API variable to build to the dist folder, then I deploy that to either Elastic Beanstalk (for API) or S3 (for REACTAPP).
When I run this locally it works; however, when I run it on CodeBuild I get a permissions failure, presumably because my bash scripts cannot export ENV_VAR. I'm wondering if anyone knows how to update environment variables from within the build process itself, or if anyone has a better approach to achieve my goals (a conditional/variable build process on CodeBuild).
EDIT:
So an approach that I've managed to get working is instead of using Env variables, I'm creating new files with specific names using fs then reading the contents of the file to make logical decisions. I can access these files from each of the bash scripts so it works pretty elegantly with some automatic cleanup.
I won't edit the original question as it's still an issue and I'd like to know how/ if other people solved this. I'm still playing around with how to actually use the eb deploy and s3 cli commands within the build scripts as codebuild does not seem to come with the eb cli installed and my .ebextensions file does not seem to be honoured.
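The marker-file approach from the edit can be sketched like this (the file name .test_api is arbitrary; the version files come from the question's 01_install.sh):

```shell
#!/bin/bash
# 01_install.sh (excerpt): leave a marker file instead of exporting a variable.
# diff exits non-zero when the files differ (or are missing), i.e. the API changed.
if ! diff ./api/version.json ./api/old_version.json > /dev/null 2>&1; then
  touch .test_api
fi

# 02_prebuild.sh (excerpt): any later script in any phase can read the marker,
# since all buildspec phases share the same source directory.
if [ -f .test_api ]; then
  echo "Run tests on API project..."
fi
```

An `rm -f .test_api` at the start of the install script gives the automatic cleanup mentioned in the edit.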
Source control repos like Github can be configured to send a post event to an API endpoint when you push to a branch. You can consume this post request in lambda through API Gateway. This event data includes which files were modified with the commit. The lambda function can then process this event to figure out what to deploy. If you’re struggling with deploying to your servers from the codebuild container, you might want to try posting an artifact to s3 with an installable package and then have your server grab it from there.
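As a variation on the webhook idea: if the CodeBuild clone has enough git history available (the default shallow clone may not, so a full clone depth is an assumption here), the changed paths can be read straight from the last commit instead of from an external event:

```shell
#!/bin/bash
# Succeeds when the most recent commit touched anything under the given
# top-level folder, by listing the paths changed between HEAD~1 and HEAD.
changed() {
  git diff --name-only HEAD~1 HEAD | grep -q "^$1/"
}

# Example use inside 01_install.sh:
#   if changed api; then (cd api && npm install); fi
#   if changed reactapp; then (cd reactapp && npm install); fi
```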

Bitbucket Pipelines - is it possible to download additional file to project via curl?

We have separate builds for the frontend and backend of the application, and we need to pull the frontend's dist build into the backend project during the build. During the build, curl cannot write to the desired location.
In detail, we are using Spring Boot as the backend serving an Angular 2 frontend, so we need to pull the frontend files into the src/main/resources/static folder.
image: maven:3.3.9
pipelines:
  default:
    - step:
        script:
          - curl -s -L -v --user xxx:XXXX https://api.bitbucket.org/2.0/repositories/apprentit/rent-it/downloads/release_latest.tar.gz -o src/main/resources/static/release_latest.tar.gz
          - tar -xf -C src/main/resources/static --directory src/main/resources/static release_latest.tar.gz
          - mvn package -X
As a result of this the build fails with output of CURL.
* Failed writing body (0 != 16360)
Note: I've tried the same with maven-exec-plugin, the result was the same. The solution works on local machine naturally.
I would try running these commands from a local docker run of the image you're specifying (maven:3.3.9). I found that to be the most helpful way to debug things that behave differently in Pipelines vs. in my local environment.
To your specific question yes, you can download external content from the Pipeline run. I have a Pipeline that clones other repos from BitBucket via HTTP into the running container.
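One thing worth checking in the original pipeline (a guess consistent with the error, not something confirmed in the question): curl's "Failed writing body" usually means the output path cannot be written, and src/main/resources/static may simply not exist in a fresh clone, since Git does not track empty directories. A sketch of corrected commands; the locally created archive stands in for the real download:

```shell
#!/bin/sh
# The target directory must exist before curl writes into it.
mkdir -p src/main/resources/static

# Stand-in for the real download step, which in the pipeline would be:
#   curl -s -L --user xxx:XXXX \
#     https://api.bitbucket.org/2.0/repositories/apprentit/rent-it/downloads/release_latest.tar.gz \
#     -o src/main/resources/static/release_latest.tar.gz
echo "frontend asset" > index.html
tar -czf src/main/resources/static/release_latest.tar.gz index.html

# Corrected tar call: -x extracts, -f names the archive, and a single -C sets
# the destination (the original mixed -C with --directory and misplaced -f).
tar -xzf src/main/resources/static/release_latest.tar.gz -C src/main/resources/static
```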

GitLab CI .yml file and connection with Web server

I have set up GitLab on a local server running Ubuntu. I also managed to set up GitLab CI with runners, and now I am struggling a bit with them. We have a PHP project that a couple of us are working on, and we have set up a .gitlab-ci.yml file to deploy files to the web server (which is on the same local server, just under a different folder).
The main problem is that GitLab CI (the runner, basically) deploys all files every time, not just the pushed ones.
We would like an option to deploy only the files that changed.
Current yml file looks like:
pages:
  stage: deploy
  script:
    - mkdir -p /opt/lampp/htdocs/web/wp2/
    - mkdir -p .public
    - yes | cp -rf * .public /opt/lampp/htdocs/web/wp2/
  only:
    - push
So, am I missing something huge here, or is there a possibility to deploy just the files that were pushed to the repository? The runner is set to react to every push.
Thank you in advance for your kind replies.
cheers!
Now, after some days and invested hours, a few notes did the trick. So, with the
pages:
  stage: deploy
  script:
    - mkdir -p /opt/lampp/htdocs/web/wp2/
    - mkdir -p .public
    - cp -rfu * .public /opt/lampp/htdocs/web/wp2/
script I have managed to achieve what I needed. The "-rfu" part was the key: -u replaces a file only if the source is newer than the destination (the web server copy, in my case).
So, this worked for me in the .yml file, and even though CI Lint reports that the syntax is not correct, the runner succeeds. I hope someone will find this useful :)
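The -u behavior described above can be verified locally; cp copies a file only when the source is newer than the existing destination copy:

```shell
#!/bin/sh
# Simulate one file changed in the repo and one file stale in the repo.
mkdir -p repo deployed
echo "new version" > repo/changed.php
echo "repo copy"   > repo/stale.php
touch -t 202001010000 repo/stale.php        # source older than destination

echo "old version"   > deployed/changed.php
echo "deployed copy" > deployed/stale.php
touch -t 202001010000 deployed/changed.php  # destination older than source

# -u: copy only when the source file is newer than the destination
cp -rfu repo/. deployed/
cat deployed/changed.php   # replaced: "new version"
cat deployed/stale.php     # untouched: "deployed copy"
```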
