CI/CD involving EC2 (amazon-ec2)

Our code is divided into modules and stored on a local Git server. The various modules are built and uploaded to ECR.
Question: currently, I can execute a deployment on a certain EC2 instance. What would be the preferred way for my local Jenkins server to run the deploy actions on that EC2 instance?
Note: I've worked with SSM in the past and came away with a BAD impression!!
Thx - Albert

Related

How to execute a script on an EC2 instance from bitbucket-pipelines.yml?

I have a Bitbucket repository, a Bitbucket pipeline there, and an EC2 instance. The EC2 instance has access to the repository (it can perform a pull and docker build/run).
So it seems I only need to upload some bash scripts to the EC2 instance and call them from the Bitbucket pipeline. How can I call them? Usually an SSH connection is used to run scripts on EC2; is that applicable from a Bitbucket pipeline? Is it a good solution?
There are two ways to solve this problem; I will leave the choice up to you.
I see you are using AWS, and AWS has a nice service called CodeDeploy. You can use it, create a few deployment scripts, and then integrate it with your pipeline. The drawback is that it relies on an agent that needs to be installed, so it will consume some resources (not much), but if you are looking for an agentless design then this solution won't work. You can check the example in the following answer: https://stackoverflow.com/a/68933031/8248700
You can use something like Python Fabric (a small gun) or Ansible (a big cannon) to achieve this. It is an agentless design that works purely over SSH.
I'm using both approaches for different scenarios. For AWS I use CodeDeploy, and for any other cloud vendor I use Python Fabric. (CodeDeploy can be used outside AWS, but then it falls under on-premises pricing, which charges per deployment.)
I hope this brings some clarity.
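For the plain-SSH route, the pipeline step ultimately just copies a script to the instance and runs it over SSH. A minimal sketch of those commands (the user, host, and deploy.sh below are placeholders; the SSH key is assumed to be configured in Bitbucket so the connection is non-interactive):

    #!/usr/bin/env bash
    # Sketch of the commands a Bitbucket pipeline step could run to deploy over SSH.
    # EC2_USER, EC2_HOST, and deploy.sh are placeholders; the SSH key is assumed to
    # be configured in Bitbucket (Repository settings > SSH keys), so no password
    # prompt appears.
    set -euo pipefail

    EC2_USER="ec2-user"                                   # placeholder user
    EC2_HOST="ec2-xx-xx-xx-xx.compute-1.amazonaws.com"    # placeholder host

    # Copy the deploy script to the instance (it could also live in the repo the
    # instance already pulls from).
    scp -o StrictHostKeyChecking=no deploy.sh "${EC2_USER}@${EC2_HOST}:/tmp/deploy.sh"

    # Run it remotely; a non-zero exit code fails the pipeline step.
    ssh -o StrictHostKeyChecking=no "${EC2_USER}@${EC2_HOST}" \
        "chmod +x /tmp/deploy.sh && /tmp/deploy.sh"

The same commands work inside a script step of bitbucket-pipelines.yml, since each step is just a shell.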

Set up GitHub Actions to build and deploy Java code on EC2

I have been trying to use GitHub Actions to set up CI/CD for an AWS EC2 machine. I tried working with the AWS actions, i.e. https://github.com/aws-actions, but they only cover deploying to ECS or using CodeBuild.
Is it possible to do this with GitHub Actions alone? The steps I am trying to perform are:
1) Build with Maven
2) Deploy to EC2
I am new to this, so any pointers would be helpful.
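One way to do these two steps with GitHub Actions alone is to build in the workflow runner and push the artifact to the instance over SSH. A rough sketch of the shell commands such a job could run (the user, host, artifact name, and service name are placeholders; the private key would come from a repository secret):

    #!/usr/bin/env bash
    # Rough sketch of the shell steps a GitHub Actions job could run:
    # build the jar with Maven, copy it to EC2 over SSH, and restart the service.
    # EC2_USER, EC2_HOST, app.jar, and the service name are placeholders; the
    # private key would come from a repository secret.
    set -euo pipefail

    EC2_USER="ubuntu"                                     # placeholder
    EC2_HOST="ec2-xx-xx-xx-xx.compute-1.amazonaws.com"    # placeholder

    # 1) Build with Maven.
    mvn -B clean package

    # 2) Deploy to EC2: copy the artifact and restart the app.
    scp -o StrictHostKeyChecking=no target/app.jar "${EC2_USER}@${EC2_HOST}:/opt/app/app.jar"
    ssh -o StrictHostKeyChecking=no "${EC2_USER}@${EC2_HOST}" "sudo systemctl restart app"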

Advice for Continuous Integration / Deployment

I've got a Docker-based PHP project. The PHP framework is Laravel.
The project is set up in GitLab and I use Jenkins for CI/CD.
When I merge into the master branch, a new build is triggered in Jenkins. I clone the repo, run unit tests, etc.
Once completed, I build a new Docker image with the latest codebase inside and push this image up to the Docker registry.
My Jenkinsfile then calls a script on the production server that pulls down the latest Docker image and stops/starts the running container.
I set up an Nginx proxy/load balancer so users do not see any downtime during the starting and stopping of containers.
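That production-side script essentially pulls the new image and swaps the container; a simplified sketch (the image and container names are placeholders, not my real ones):

    #!/usr/bin/env bash
    # Simplified sketch of the production-side deploy script: pull the newest
    # image, then replace the running container. The image and container names
    # are placeholders.
    set -euo pipefail

    IMAGE="registry.example.com/myapp:latest"   # placeholder image
    CONTAINER="myapp"                           # placeholder container name

    docker pull "${IMAGE}"

    # Remove the old container if it exists, then start a new one from the
    # freshly pulled image.
    docker rm -f "${CONTAINER}" 2>/dev/null || true
    docker run -d --name "${CONTAINER}" -p 8080:80 "${IMAGE}"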
This workflow works very well but I have one issue:
The storage folder in Laravel gets wiped when I do a new deployment, so any files uploaded by users are lost.
How do I overcome this?
I've recently started working on a new version of the project that sends all file uploads to Digital Ocean Spaces but I've found this to be very very slow.
I'm assuming S3 will be the same.
All suggestions are welcome.
My solution was to map a volume in the container to the host when I started my Docker container.
I also had to set permissions, but now I have persistence during deployments.
No requirement for S3 or Spaces.
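Concretely, this is just a bind mount for Laravel's storage directory when starting the container; a minimal sketch (the host path and image name are placeholders):

    # Minimal sketch: persist Laravel's storage folder across deployments by
    # bind-mounting a host directory into the container. The host path and
    # image name are placeholders.
    docker run -d --name laravel-app \
      -p 80:80 \
      -v /srv/laravel/storage:/var/www/html/storage \
      registry.example.com/laravel-app:latest

    # One-time permission fix on the host so the web server user inside the
    # container (typically www-data, UID 33 in Debian-based images) can write here.
    sudo chown -R 33:33 /srv/laravel/storage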

Amazon EC2 load balanced - how to deploy a web app?

We're looking to move to the Amazon cloud using EC2 and RDS.
I'm looking at load balancing, which I would like to do with two servers, each in a different availability zone, to protect against downtime.
My question is: how do I deploy web applications and updates to them? I assume there is a better way than individually updating the files on each EC2 server?
In past systems, I have used the Puppet vcs module to ensure that the appropriate source code is installed on my system, in addition to using Puppet to build the configuration files for the Apache/Nginx server that I'm using. Another possibility is to push your application in a deployable state (if you're not using a scripting language) to Amazon S3, and have your run-time scripts pull the latest build from your S3 bucket.
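For the S3 route, the run-time script on each instance only needs to fetch and unpack the latest build; a rough sketch using the AWS CLI (the bucket, object key, and document root are placeholders, and the instances are assumed to have credentials that allow reading the bucket):

    #!/usr/bin/env bash
    # Rough sketch of a run-time script each EC2 instance could run at boot to
    # fetch and unpack the latest build from S3. The bucket, object key, and
    # document root are placeholders; the instance is assumed to have an IAM
    # role (or credentials) allowing it to read the bucket.
    set -euo pipefail

    BUCKET="s3://my-deploy-bucket"     # placeholder bucket
    RELEASE="latest.tar.gz"            # placeholder object key
    DOCROOT="/var/www/html"

    aws s3 cp "${BUCKET}/${RELEASE}" "/tmp/${RELEASE}"
    sudo tar -xzf "/tmp/${RELEASE}" -C "${DOCROOT}"
    sudo service apache2 reload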

How do I run my application code (PHP) across my various Amazon EC2 instances?

I've been trying to get to grips with Amazon's AWS services for a client. As is evidenced by the very n00bish question(s) I'm about to ask, I'm having a little trouble wrapping my head around some very basic things:
a) I've played around with a few instances and managed to get LAMP working just fine. The problem I'm having is that the code I place in /var/www doesn't seem to be shared across those machines. What do I have to do to achieve this? I was thinking of a shared EBS volume and changing Apache's document root?
b) Furthermore, what is the best way to upload code and assets to an EBS/S3 volume? Should I set up an instance to handle FTP to the aforementioned shared volume?
c) Finally, I have a basic plan for the setup that I wanted to run by someone who actually knows what they are talking about:
DNS pointing to Load Balancer (AWS Elastic Beanstalk)
Load Balancer managing multiple AWS EC2 instances.
EC2 instances sharing code from a single EBS store.
An RDS instance to handle database queries.
Cloud Front to serve assets directly to the user.
Thanks,
Rich.
Edit: my solution, for anyone who comes across this on Google.
Please note that my setup is not finished yet, and the bash scripts I'm providing in this explanation are probably not very good: even though I'm very comfortable with the command line, I have no experience of scripting in bash. However, it should at least show you how my setup works in theory.
All AMIs are Ubuntu Maverick i386 from Alestic.
I have two AMI snapshots:

Master
  Users:
    git - Very limited access; runs git-shell, so it can't be used for a normal SSH session, but it hosts a Git repository which can be pushed to or pulled from.
    ubuntu - Default SSH account, used to administer the server and deploy code.
  Services:
    Simple Git repository hosting via SSH.
    Apache and PHP; databases are hosted on Amazon RDS.

Slave
  Services:
    Apache and PHP; databases are hosted on Amazon RDS.
Right now (this will change) this is how I deploy code to my servers:
1) Merge changes to the master branch on my local machine.
2) Stop all slave instances.
3) Use Git to push the master branch to the master server.
4) Log in to the ubuntu user via SSH on the master server and run a script which does the following:
   - Exports (git archive) the code from the repository to a folder.
   - Compresses the folder and uploads a backup of the code to S3 with a timestamp attached to the file name.
   - Replaces the code in /var/www/ with the folder and gives it the appropriate permissions.
   - Removes the exported folder from the home directory, but leaves the compressed file containing the latest code intact.
5) Start all slave instances. (Apache does not start automatically; it is only triggered at the end of the startup script.) On startup they run a script that:
   - Uses scp (secure copy) to copy the latest compressed code from the master to /tmp/www.
   - Extracts the code, replaces /var/www/, and gives it the appropriate permissions.
   - Starts Apache.
I would provide code examples, but they are very incomplete and I need more time. I also want all my assets (css/js/img) to be pushed automatically to S3 so they can be distributed to clients via CloudFront.
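In the meantime, here is a very rough sketch of what the two scripts described above do; the repository path, bucket name, hostnames, and paths are placeholders rather than my actual values, and the AWS CLI is assumed to be available for the S3 upload:

    #!/usr/bin/env bash
    # --- Master server: deploy script run by the ubuntu user after a push ---
    # Follows the steps above: export, compress, back up to S3, replace /var/www,
    # then clean up. The repository path and bucket name are placeholders.
    set -euo pipefail

    REPO="/home/git/project.git"                 # placeholder bare repository
    STAMP="$(date +%Y%m%d-%H%M%S)"
    EXPORT_DIR="/home/ubuntu/export"
    ARCHIVE="/home/ubuntu/code-${STAMP}.tar.gz"

    # Export (git archive) the code from the repository into a folder.
    mkdir -p "${EXPORT_DIR}"
    git --git-dir="${REPO}" archive master | tar -x -C "${EXPORT_DIR}"

    # Compress the folder and upload a timestamped backup to S3.
    tar -czf "${ARCHIVE}" -C "${EXPORT_DIR}" .
    aws s3 cp "${ARCHIVE}" "s3://my-code-backups/"   # placeholder bucket

    # Replace the code in /var/www/ and give it the appropriate permissions.
    sudo rm -rf /var/www/*
    sudo cp -r "${EXPORT_DIR}/." /var/www/
    sudo chown -R www-data:www-data /var/www

    # Remove the exported folder, keeping a compressed copy of the latest code.
    rm -rf "${EXPORT_DIR}"
    cp "${ARCHIVE}" /home/ubuntu/latest.tar.gz

    # --- Slave instances: startup script (Apache is not started automatically) ---
    # Copies the latest compressed code from the master over scp, unpacks it,
    # then starts Apache. MASTER_HOST is a placeholder.
    MASTER_HOST="master.internal.example.com"    # placeholder

    mkdir -p /tmp/www
    scp "ubuntu@${MASTER_HOST}:/home/ubuntu/latest.tar.gz" /tmp/www/
    sudo rm -rf /var/www/*
    sudo tar -xzf /tmp/www/latest.tar.gz -C /var/www
    sudo chown -R www-data:www-data /var/www
    sudo service apache2 start
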
EBS is like a hard drive you can attach to one instance, basically a 1:1 mapping. S3 is the only shared storage offering in AWS; otherwise you will need to set up an NFS server or similar.
What you can do is put all your PHP files on S3 and then sync them down to a new instance when you start it.
I would recommend bundling a custom AMI with everything you need installed (Apache, PHP, etc.) and setting up a cron job to sync the PHP files from S3 to your document root. Your workflow would be: upload files to S3, then let the server's cron sync the files.
The rest of your setup seems pretty standard.
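As a rough illustration, the cron entry on such an AMI could be as simple as a periodic aws s3 sync into the document root (the bucket name and schedule are placeholders; the instance is assumed to have credentials that allow reading the bucket):

    # /etc/cron.d/sync-php -- rough illustration; the bucket name is a placeholder.
    # Every minute, mirror the application code from S3 into Apache's document root.
    * * * * * root aws s3 sync s3://my-app-code /var/www/html --delete --quiet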
