Way to let the runner execute a bash script

I set the GIT_STRATEGY variable to none in a .gitlab-ci.yml file, so the repository is not cloned on the deployment server automatically and we avoid fetching the git repository with all its history. A git archive command will be used instead.
This way only the needed files are copied from the GitLab server.
Now I am looking for a way to send a bash script to the runner.
As we know, the .gitlab-ci.yml file itself is sent to the runner. Isn't there a way to send a bash script to the runner too? Then the .gitlab-ci.yml file would just start the bash script and the job would be done.
Of course, I could put the commands into the YAML directly, but then the YAML syntax imposes some limitations. I could also make the runner scp the file, but that would be too complex. Any ideas?
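For illustration, a minimal .gitlab-ci.yml sketch along these lines (the host, project path, and script name are assumptions, and the runner needs SSH access to the GitLab server for git archive to work):

    deploy:
      variables:
        GIT_STRATEGY: none
      script:
        # fetch just the script from the repository tip instead of cloning
        - git archive --remote=git@gitlab.example.com:group/project.git HEAD deploy.sh | tar -x
        - bash deploy.sh

The YAML stays a two-line script, and everything else lives in deploy.sh under version control.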
Update
Let me explain what I am trying to achieve. There are several bash script files, each of which implements a specific deployment scenario. For example, one pulls the project files from GitLab, deletes the files in a folder called public, and moves the project files into it.
Another script file pulls the project files from GitLab, moves the files to the folder, and creates a backup.
Yet another pulls the project files from GitLab, moves the files, creates backups, and performs a database migration. And so on.
There are three hosts:
The local PC of an administrator.
The remote GitLab server.
The remote deployment server that serves the websites (it runs Nginx or Apache).
One of the script files will be executed on the deployment server. There are a number of script files, so we must be able to replace them on the server from time to time.
The question is: how do we get the script file onto the deployment server so that the runner can run it?
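As a sketch of the first scenario (the paths and repository address are assumptions), the script itself could fetch the files with git archive, back up the current public folder, and swap in the new files:

    #!/bin/bash
    set -euo pipefail

    SITE=/var/www/site    # hypothetical deployment root
    TMP=$(mktemp -d)

    # pull the project files from GitLab without the git history
    git archive --remote=git@gitlab.example.com:group/project.git HEAD | tar -x -C "$TMP"

    # back up the current public folder, then replace it with the fresh files
    tar -czf "$SITE/public-$(date +%Y%m%d%H%M%S).tar.gz" -C "$SITE" public
    rm -rf "$SITE/public"
    mv "$TMP" "$SITE/public"

The other scenarios would extend this with, for example, the database migration step.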

Related

Need to take a backup of Jenkins jobs and delete backups older than 15 days using a shell script

I am new to the Jenkins server. I need to take a backup of Jenkins jobs and plugins, and delete backups older than 15 days, with a shell script.
Everything you need is present in your Jenkins home folder. If you want to take a backup of your job configuration, you can copy the config.xml file found inside the $JENKINS_HOME/jobs/<job_name> directory. That directory also holds the builds, including their logs, so you can back those up as well if necessary. As for the plugins, you can find them in the $JENKINS_HOME/plugins folder; the .jpi and .hpi files are the plugin files.
As for deleting files older than 15 days, this article gives a pretty comprehensive method for doing it.
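Putting both parts together, a minimal sketch (JENKINS_HOME and BACKUP_DIR are assumptions; adjust them for your install):

    #!/bin/bash
    set -euo pipefail

    JENKINS_HOME=/var/lib/jenkins
    BACKUP_DIR=/backup/jenkins
    mkdir -p "$BACKUP_DIR"

    # archive every job's config.xml plus the .jpi/.hpi plugin files
    tar -czf "$BACKUP_DIR/jenkins-$(date +%Y%m%d).tar.gz" \
        "$JENKINS_HOME"/jobs/*/config.xml \
        "$JENKINS_HOME"/plugins/*.[jh]pi

    # delete backup archives older than 15 days
    find "$BACKUP_DIR" -name 'jenkins-*.tar.gz' -mtime +15 -delete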

How to download .tfvars files onto the ADO agent machine directly before running terraform plan?

We are using Terraform Enterprise Cloud and Azure DevOps YAML pipelines for Azure infrastructure deployments.
Requirement: we want to separate the .tfvars files completely from the main terraform folder and keep them in a different repo, called the config repository.
Solution 1: we could reference the tfvars from the config repository when running the command below, but we cannot implement this:
terraform plan -var-file=<path-to-tfvars-in-config-repo>
Note: since we are using global templates, the terraform commands such as fmt, validate, plan, and apply are managed by the template itself; we are not allowed to edit the template.
Here is the logic:
The template expects only a .tfvars file in the current directory; some bash commands then rename it to .auto.tfvars.
We know that these .auto.tfvars files are picked up automatically by Terraform.
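That rename amounts to something like this (the file name is hypothetical):

    # Terraform loads *.auto.tfvars files without an explicit -var-file flag
    mv dev.tfvars dev.auto.tfvars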
Solution 2: this is what we would like to implement, but we are struggling with it and need some help.
By default, the template copies all terraform folders to the ADO agent container. If we can make sure the .tfvars file from the config repository is also available in the agent container, this solution will work.
Maybe we can achieve it by copying the .tfvars file from the config repository into the terraform folder with a shell script, since only the terraform folder is copied to the agent container.
Or is there a way to integrate a shell script into the Terraform configuration that downloads the tfvars file from the config repository into the container at run time?
Any other solution or approach would be appreciated.
To make sure the config repo files are available at runtime, you can add a second artifact to the release pipeline. This will let you point the -var-file argument at the appropriate file.
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/artifacts?view=azure-devops
One approach is to store your tfvars file as a secure file and add a step in your pipeline to download it. However, if you're using Terraform Enterprise, is there any particular reason not to use Terraform workspace variables?
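As a sketch of the copy idea from Solution 2 (the organization, project, repository, and file names are all assumptions; it also assumes System.AccessToken is mapped into the step's environment and the build service account can read the config repository):

    # clone the config repo with the pipeline's own token, then place the
    # tfvars inside the terraform folder before the template copies it over
    git -c http.extraHeader="Authorization: Bearer $SYSTEM_ACCESSTOKEN" \
        clone --depth 1 https://dev.azure.com/myorg/myproject/_git/config /tmp/config
    cp /tmp/config/dev.tfvars ./terraform/dev.tfvars

The template's existing rename step would then turn it into dev.auto.tfvars as usual.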

Jenkins Deploy scripts

So, I'm writing the build and deploy scripts. The build is created with Ant, and continuous builds run on Jenkins.
The build generates three different artifacts:
The war file
A zip with layouts
A zip with images
So far, so good, but now I need to write the deploy script, which should:
Deploy the war (artifact 1) to the Tomcat instance running on server 1
Place artifact 2 on server 1 in a specific directory
Place artifact 3 on server 2 in a specific directory
So I was talking with my colleague, and he said that we should also generate an artifact (maybe a deploy.xml) that deploys these artifacts when placed on the correct server.
So there would be another script that would:
Download the Jenkins artifacts
scp to each server and place the deploy.xml there
remotely invoke the deploy.xml
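A sketch of that wrapper (the Jenkins URL, job, and host names are hypothetical; it also assumes deploy.xml is an Ant file, since the build already uses Ant):

    #!/bin/bash
    set -euo pipefail

    # download the deploy descriptor published by the last successful build
    wget "https://jenkins.example.com/job/myapp/lastSuccessfulBuild/artifact/deploy.xml"

    # place it on the target server and invoke it remotely
    scp deploy.xml deploy@server1:/opt/deploy/deploy.xml
    ssh deploy@server1 "ant -f /opt/deploy/deploy.xml"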
What makes me a little uncomfortable is having the deploy.xml as a build artifact. The motivation is to be able to deploy without needing access to the VCS repositories, so that a build is self-contained, i.e., any build could go into production with only what was generated by Jenkins.
Where should the deploy scripts be placed? Should they live only in the VCS, or should they be build artifacts too?
Sample deploy scripts would also be appreciated.
I wrote my own deployment framework, consisting of different shell, batch, Python, and other scripts. It neatly separates environment information from application information and allows me to quickly update deployment information and add new apps or environments. The orchestration of the different parts, however, is done by Jenkins. When just copying files to a Windows server, my Jenkins master (running on Windows) copies the files to a network share that exposes the target directory. Services I can restart remotely using sc.exe. When crossing the border to AIX, I use Jenkins slaves that are started via ssh on the target system. So distribution is managed by Jenkins; the actual work is done by the scripts.

Change Jenkins job workspace

What I have done:
I have Jenkins set up on my Ubuntu machine in:
/var/lib/jenkins/
I have a job that runs every 45 minutes and does a hg pull and hg update --clean default from my Bitbucket repository. This is running fine. I have a folder
/var/lib/jenkins/jobs/Code Deployement
which contains the latest updated code from my repository.
Problem:
However, I want to access my updated code from
/var/www/html/[project-name]
Query:
Is there any way I can make the Jenkins job update this folder instead of the /var/lib/jenkins/jobs/Code Deployement folder? I certainly don't want to make /var/www/html/[project-name] my Jenkins home folder.
How can I achieve what I described above? Will I have to copy the folder from the jobs folder to my desired location every time the job runs? Please help me out with the solution; I'm a beginner with automated deployment using Jenkins.
Thank you.
Under "Advanced" you can explicitly choose a working directory for the projects without changing the Jenkins home directory. Check the "Use custom workspace" box and set the directory that Jenkins will pull the code to and build in.
In our setup we wait for the build process to complete in the working directory and add a build step for Jenkins to copy (most) of the files out to the directory that serves up the website. We had issues with file locking preventing the build process if someone (e.g. the testers) were using the site.
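For instance, such a copy step could be a shell build step along these lines (the target uses the question's placeholder; the source is the job folder from the question):

    # sync the freshly updated code into the web root, dropping stale files
    rsync -a --delete --exclude='.hg' \
        "/var/lib/jenkins/jobs/Code Deployement/" \
        "/var/www/html/[project-name]/"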
One simple way to do it is to create a symlink under /var/www/html/ that points to your code directory, e.g.:
sudo ln -s /var/lib/jenkins/jobs/"Code Deployement" /var/www/html/[project-name]

How can I copy the artifacts from TeamCity to another server?

How can I copy the artifacts from TeamCity to another server?
Thanks
The way I have done this makes things a lot easier. Set up another configuration that pulls in, via artifact dependencies, all the files you need, then run a cmd script to xcopy/copy the files to another drive on the network. You can do this using a cmd script, VBS, Python, shell, etc.
Remember, you only need to refer to directories as if they were local, as your script runs in the same working directory, e.g. a cmd script:
xcopy ".\my build artifacts" "\\path\to\drive\on\my\network\my build artifacts"
It doesn't get easier than that.
Naturally, if your artifacts are huge, you may want to consider a more complicated option. However, TeamCity currently has a pending ticket, which you can vote on, that would allow multiple runners in one configuration; then you could just add your cmd script to the same configuration and save the copy time. Please vote if you can spare a minute:
http://youtrack.jetbrains.net/issue/TW-3660
There is a Deployer plugin that supports deployment via fileshare/SMB, FTP, SSH, and other means. The usage is basically the same as for artifact paths.
We used plain Samba, so you enter:
Target host path: //server/drive/myfolder
Username: mydomain\myusername (in our case we had to include the domain here too)
Password: ****
Domain: mydomain
In the paths field you select the files just as you would for artifacts:
product/* => product.zip
and it will create the file //server/drive/myfolder/product.zip.
You can do it from your build script or externally.
If you are looking to get artifacts copied from a remote build agent to the primary TeamCity server, you may want to look into configuring Build Artifacts under the General Settings.
According to TeamCity's wiki entry on BuildArtifacts (http://confluence.jetbrains.com/display/TCD7/Build+Artifact) "Upon build finish, TeamCity searches for artifacts in the build's checkout directory according to the specified artifact patterns. Matching files are then uploaded ("published") to the TeamCity server, where they become available for download through the web UI or can be used in other builds using artifact dependencies."
