Passing Public IP from Terraform to Ansible in GitLab CI/CD

I am using a Terraform stage in GitLab CI/CD to deploy an EC2 instance. I know how to get the public IP of that instance once it's available for use throughout Terraform, but I'm not clear how to hop that over into an Ansible stage for configuration. Is there a way to output the public IP to an environment variable that can be made available to other stages?

The easiest way to pass variables from one GitLab CI/CD job to another is the dotenv report artifact.
You put your variable in a file in the form VARIABLE_NAME=VALUE and upload that file as a specific type of artifact:
job1:
  stage: stage1
  script:
    - echo "IP_ADDRESS=127.0.0.1" >> .env
  artifacts:
    reports:
      dotenv: .env

job2:
  stage: stage2
  script:
    - echo $IP_ADDRESS   # prints 127.0.0.1
Instead of a normal artifact that is downloaded into other jobs, the dotenv report type turns the variables in the file into environment variables for later jobs.
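Applied to the question's Terraform-to-Ansible handoff, the Terraform job can write its output into the dotenv file and the Ansible job picks it up as an environment variable. A minimal sketch, assuming a Terraform output named `instance_public_ip` and these stage names:

```yaml
terraform-apply:
  stage: provision
  script:
    - terraform apply -auto-approve
    # "instance_public_ip" is an assumed output name; use your own.
    - echo "INSTANCE_IP=$(terraform output -raw instance_public_ip)" >> deploy.env
  artifacts:
    reports:
      dotenv: deploy.env

ansible-configure:
  stage: configure
  script:
    # INSTANCE_IP is injected automatically from the dotenv report above.
    # The trailing comma makes Ansible treat the value as an inline inventory.
    - ansible-playbook -i "$INSTANCE_IP," playbook.yml
```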

Related

Spring boot (Non Docker Application) with Github Actions - Unable to run jar file

I'm trying to deploy and run a Java Spring Boot application on an AWS EC2 instance using GitHub Actions. The application properties file of the Spring Boot application points to environment variables which are present on the AWS EC2 instance. However, these environment variables are not available when the GitHub Action runs, so the execution of the jar fails with a NullPointerException.
What is the correct way to deploy a Spring Boot (non-Docker) application to a self-hosted EC2 server? Can I do it without needing AWS CodePipeline or AWS Elastic Beanstalk?
How do we read EC2 instance environment variables while using GitHub Actions?
Thanks.
Sample Workflow file:
jobs:
  build:
    runs-on: [self-hosted]
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 11
        uses: actions/setup-java@v3
        with:
          java-version: "11"
          distribution: "temurin"
          cache: maven
      - name: Build with Maven
        run: mvn clean -B package
  deploy:
    runs-on: [self-hosted]
    needs: build
    steps:
      - name: Run Script file
        working-directory: ./
        run: |
          chmod +x ./script.sh
          ./script.sh
        shell: bash

# script.sh - try to print the env variables inside EC2:
#!/bin/bash
whoami
printenv
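One common cause here is that the runner executes script.sh in a non-login, non-interactive shell, which does not read the profile files where the EC2 variables were defined. A workaround, assuming the variables live in /etc/environment, is to load them explicitly at the top of the script:

```shell
#!/bin/bash
# Export everything we source below so child processes (the JVM) see it.
set -a
source /etc/environment
set +a
# Start the application; the jar path is an assumption.
java -jar target/app.jar
```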

Gitlab CI/CD, deploy "Hello World" in Nginx on ec2 instances

I have created my first .gitlab-ci.yml file and installed my first runner. I am very happy, but I have a problem.
Here is my first YAML:
stages:
  - init
  - deploy

hello:
  stage: init
  script:
    - echo "Hello World"

deploy:
  stage: deploy
  script:
    - cd /var/www/html
    - git pull http://$GIT_USER:$GIT_PASS@source.server/ema/ci-cd-test.git
  tags:
    - shell-test-ema
I have to perform an SSH connection to an EC2 instance (target) using a username and password to test the deployment of a "Hello World" page, but I don't know how to do it.
Is there a safer and easier way to implement this?
My idea is to understand the deploy procedure.
Regards,
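A safer alternative to password login is key-based SSH, with the private key stored as a masked CI/CD variable. A minimal sketch of the deploy job, assuming a variable named SSH_PRIVATE_KEY; the host is a placeholder:

```yaml
deploy:
  stage: deploy
  before_script:
    - eval $(ssh-agent -s)
    # SSH_PRIVATE_KEY is a CI/CD variable holding the key; tr strips CR chars.
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
  script:
    - ssh -o StrictHostKeyChecking=no ubuntu@<ec2-public-ip> "cd /var/www/html && git pull"
  tags:
    - shell-test-ema
```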

How to deploy maven project on aws with Gitlab CI/CD

I'm trying to deploy a Java Maven project on AWS with GitLab CI/CD.
This is my .gitlab-ci.yml
image: maven:3-jdk-8

services:
  - docker:dind

stages:
  - test
  - build
  - deploy

maven-test:
  stage: test
  script:
    - echo "Test stage"
    - mvn clean validate compile test -B

maven-build:
  stage: build
  script:
    - echo "Build stage"
    - mvn install -B -DskipTests
  artifacts:
    paths:
      - ./target/*.jar

maven-deploy:
  stage: deploy
  script:
    - echo "Deploy stage"
    - scp -v -o StrictHostKeyChecking=no -i "mykey.pem" ./target/*.jar ubuntu@xxxxxxx.com:*.jar
  when: manual
If I execute the scp command in a terminal on my PC, the jar is uploaded to the AWS EC2 instance, while in GitLab I get errors and the jar is not uploaded.
This is my first approach with GitLab CI and AWS, so can someone explain step by step what I need to do to deploy the project to an AWS EC2 instance with GitLab CI?
Thanks!
Since you have not posted much about your problem, nor the error itself, I will just suggest a few things to look at:
From a GitLab perspective:
Are you sure that "mykey.pem" is available within the repository when running that command (maven-deploy) on the GitLab runner?
Also, are you sure that you are using a Docker GitLab runner? If you are not, then you can't use the image: directive, and the runner therefore might not have mvn/scp available locally.
You might want to look into the dependencies directive and ensure you make that artifact available in the next job. This should happen by default!
From an AWS perspective:
Make sure that the Ubuntu target machine/server has port 22 open to the EC2 machine running the GitLab runner.
Edit:
If the error you are receiving concerns the pem file's permissions, take a look at this resolution for the AWS EC2 pem file issue. Another similar resolution is here.
Adding chmod 400 mykey.pem before the scp should fix your problem.
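Putting that together, the deploy job might look like this. Note that scp's key flag is lowercase -i, and chmod 400 runs first so SSH accepts the key; the host is a placeholder:

```yaml
maven-deploy:
  stage: deploy
  script:
    - chmod 400 mykey.pem   # SSH refuses private keys with loose permissions
    - scp -o StrictHostKeyChecking=no -i mykey.pem ./target/*.jar ubuntu@<your-ec2-host>:~/
  when: manual
```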

Gitlab Secret Variable >>> Spring Boot application.yml

How can I reference a GitLab secret variable in an application.yml? I assume it is only accessible within the gitlab-ci.yml context and has to be passed into the Docker image as a VM parameter somehow?
In case it matters, I am deploying in a Rancher environment.
Just export it or pass it as a command line parameter to your CI script. Like:
gitlab-ci.yml
deploy-app:
  stage: deploy
  image: whatever
  script:
    - export MY_SECRET
    - ...
or
deploy-app:
  stage: deploy
  image: whatever
  script:
    - docker run -it -e PASSWORD=$MY_SECRET whatever ...
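On the Spring Boot side, application.yml can then pick up the injected variable with a property placeholder; the property path here is only an example:

```yaml
# application.yml
spring:
  datasource:
    # Resolved from the container's environment at startup,
    # i.e. the PASSWORD variable passed via `docker run -e`.
    password: ${PASSWORD}
```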

Unable to connect to a target server via SSH from a GitLab pipeline?

I have set up .gitlab-ci.yml, but I am unable to log in to the production server from GitLab. I have set the private and public key variables of my server in GitLab but still get a timeout error in the pipeline.
job1:
  stage: build1
  script:
    - mvn package
  variables:
    SSH_PUBLIC_key: "$SSH_PUBLIC_key"
    SSH_PRIVATE_KEY: "$SSH_PRIVATE_KEY"
  artifacts:
    paths:
      - server
  script:
    - scp "myjar" root@"myIP":/tmp
A timeout error occurs when the instance (in your case, the production instance) is not reachable from GitLab (whether it is hosted on a VM, Kubernetes, etc.). Please check whether you can telnet/SSH manually from the VM hosting GitLab.
Replace myIP with the proper values and see if that helps:
telnet <myIP> 22
ssh <myIP>
