Spring Boot environment-based configuration with Docker in a pipeline - spring-boot

I have one microservice that runs through a pipeline on an AWS EKS cluster, so it passes through each environment. I need to update this application so that whenever it passes through a different environment, it picks up environment-specific variables. My idea was to add an application.properties file for each environment build and set SPRING_PROFILES_ACTIVE=prod|dev|test as required. But I am new to this pipeline stuff and need to understand where to add these profile-specific properties, so that each time the build runs in a different environment, it activates the matching environment-based profile.
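For context, Spring Boot resolves profile-specific files by naming convention, so the per-environment files live next to the shared one; a minimal sketch of the layout (file names are the standard convention, the directory is the usual Maven/Gradle location):

```text
src/main/resources/
    application.properties         # shared defaults for all environments
    application-dev.properties     # loaded when SPRING_PROFILES_ACTIVE=dev
    application-test.properties    # loaded when SPRING_PROFILES_ACTIVE=test
    application-prod.properties    # loaded when SPRING_PROFILES_ACTIVE=prod
```

Values in the active profile's file override the shared defaults, so only the environment-specific keys need to be duplicated.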

If you are using a Jenkins pipeline (or any other pipeline) for the build, define an environment variable for each environment; the Docker build then builds for whichever environment you select, something like this:
docker build --build-arg ENVIRONMENT=$environment_variable .   (the specific value can come from a pipeline parameter)
and inside the Dockerfile:
SPRING_PROFILES_ACTIVE=$ENVIRONMENT
Alternatively, you can have one properties file per environment and have that specific file picked up during the Docker build.
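A minimal sketch of the build-arg approach, assuming a standard Spring Boot fat jar (the base image and jar path are illustrative, not prescribed by the question):

```dockerfile
# Dockerfile
FROM eclipse-temurin:17-jre
# Supplied by the pipeline: docker build --build-arg ENVIRONMENT=dev .
ARG ENVIRONMENT=dev
# Bake the selected profile into the container environment
ENV SPRING_PROFILES_ACTIVE=${ENVIRONMENT}
COPY target/app.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]
```

Note that baking the profile in at build time means one image per environment; passing SPRING_PROFILES_ACTIVE at run time instead (e.g. via the `env:` section of the Kubernetes Deployment) lets a single image serve all environments.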

Related

Using Kubernetes plugin for Jenkins project instead of pipeline

I'm using the Kubernetes plugin for Jenkins (https://github.com/jenkinsci/kubernetes-plugin).
Using their documentation, I was able to create a Jenkins Pipeline to create a pod and run some maven commands inside the maven container within the pod with the use of a Jenkins pipeline script. There is another kubectl container running some kubectl commands. I haven't done anything fancy with it yet other than trying it out.
I would like to create two Jenkins Projects (or Jobs): one for the maven step and the other for the kubectl step. Then I would combine the two jobs into a single pipeline. At the end, there would be two individual jobs, and one pipeline running those two jobs, equivalent to what I described in the previous paragraph. I did not see a way to do this for Kubernetes-related work; specifically, I did not see a way, with a Jenkins project (as opposed to a Jenkins Pipeline), to create a script that creates a pod with a maven container and does something within that container.
Is it possible to do what I'm saying by using the Kubernetes Plugin or not?
Is there a better way to do this?
If not possible, is there another way to do something similar?

Terraform modules cache

I have .terraform/modules folder generated by terraform itself.
It's where terraform keeps modules by default and I'm fine with that.
When running the terraform init command, if the .terraform folder is gone, Terraform will try to pull the modules again. I would like to avoid that step by telling Terraform to use a pre-populated modules folder from a different location: in effect, a shared cache folder for Terraform in our CI/CD pipelines, which pulls only if a new version of a module is specified and otherwise uses the cache.
NOTE:
We don't run anything on Jenkins locally, every `Stage` in Jenkins uses Ephemeral Docker
container agents to run all the `Steps` and to keep Jenkins clean,
otherwise I would use local workspace cache for all that.
Is there a way to do that?
Thank you
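One possible workaround, assuming the ephemeral agent containers can mount a shared volume: point Terraform's TF_DATA_DIR environment variable at a per-project directory on that volume, so the .terraform data (including downloaded modules) survives between builds (the `/cache` mount path and the use of `JOB_NAME` are illustrative):

```shell
# Sketch: reuse a shared .terraform data dir across ephemeral agents.
# /cache is assumed to be a volume mounted into every agent container.
export TF_DATA_DIR="/cache/terraform/${JOB_NAME}/.terraform"
mkdir -p "$TF_DATA_DIR"
# init should then reuse already-downloaded modules and only fetch
# modules that are missing or whose version constraint changed
terraform init -input=false
```

The trade-off is that the shared cache directory needs the same concurrency care as any shared workspace (e.g. one directory per job, as sketched above, to avoid two builds writing to the same state).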

Configure Multiple SonarQube Instances in a Gradle Build

In our CI environment, we currently have one build server (based on Atlassian Bamboo) and two SonarQube instances (versions 6.0 and 6.5). Initially, our CI server was configured to communicate with the 6.0 SonarQube instance. This has been configured in the /home/bamboo/.gradle/gradle.properties file on our CI server like this:
systemProp.sonar.host.url=<http url of SonarQube 6.0 instance>
systemProp.sonar.login=<username here>
systemProp.sonar.password=<password here>
Now we have another Gradle-based project running on our CI server which shall talk to the new SonarQube 6.5 instance. I tried configuring this but failed all the time.
Things I have done so far:
Added command-line arguments to the Gradle wrapper command:
I have tried adding -Dsonar.host.url=, -Dsonar.login=, -Dsonar.password= to the Gradle command. As this didn't seem to work, I have also tried to set the command-line arguments as SonarQube system properties using -DsystemProp.sonar.host.url=, -DsystemProp.sonar.login=, -DsystemProp.sonar.password=. This didn't work either.
Added properties to the build.gradle file like this:
sonarqube {
    properties {
        property "sonar.host.url", "<http url of SonarQube 6.0 instance>"
        property "sonar.login", "<username here>"
        property "sonar.password", "<password here>"
        ...<other SonarQube analysis settings here>...
    }
}
In all cases, the CI server talked to the wrong SonarQube instance (6.0). My question is, whether it is possible to configure a single project to talk to another SonarQube instance. As you have seen, we use Gradle 3.2.1 as a build tool. And we are using the org.sonarqube Gradle plugin too.
Thank you for any help.
André
Your first try did not work because you set the system properties from the command line, but setting them from the project properties later on resets the system properties to the configured values.
Your second try did not work because the systemProp.sonar.login syntax is only supported in gradle.properties files, not via command-line properties.
Your third try did not work because the SonarQube scanner prefers the system property values over the values configured via the DSL, so that one can override what is configured in the build script through local configuration.
You need to set the system properties in your build script manually; this then overwrites what was automatically set from the project properties. Using the project's gradle.properties file does not work, as the user file overrides the project file. So you need something like System.properties.'sonar.login' = '...' in your build script. You can either hard-code it there, or use project properties that you can set in your gradle.properties file or via -P parameters.
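A sketch of that approach in build.gradle, using project properties as the source (the sonar65Url/sonar65User/sonar65Password property names and the fallback URL are illustrative, not part of the original setup):

```groovy
// build.gradle: force the SonarQube connection settings as system properties,
// overriding whatever gradle.properties set automatically.
// sonar65Url / sonar65User / sonar65Password are assumed project properties,
// e.g. supplied via -Psonar65Url=... on the command line.
System.properties.'sonar.host.url' = project.findProperty('sonar65Url') ?: 'http://sonar65.example:9000'
System.properties.'sonar.login'    = project.findProperty('sonar65User')
System.properties.'sonar.password' = project.findProperty('sonar65Password')
```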
Besides that, I'd never depend on having any configuration in the Gradle user dir on a build server. Most build servers use build agents that might run on distributed machines, so you would always have to make sure that all build agents are configured identically, and so on. I'd always put the relevant configuration in the build setup on the build server itself, either via system properties, environment variables, or command-line arguments.
Just my 2ct.

Set TYPO3_ACTIVE_FRAMEWORK_EXTENSIONS ENV variable in deploy process

I'm using Gitlab CI to deploy TYPO3 projects onto a target server and I'm trying to remove the PackageStates.php from the git repository and generate it on the target server with EXT:typo3_console instead.
But I need to set the TYPO3_ACTIVE_FRAMEWORK_EXTENSIONS environment variable in order to have the necessary system extensions loaded.
How do I set this ENV variable?
What's a good way to store this information in my project so that it is available in the deployment process? I could use Gitlab variables, but I feel like this information should be included in the git repository.
You don't need to set and use the env variable if you don't want to. You can just require the TYPO3 core extensions you need in any package of your project, and typo3_console will take care of populating the environment variable for you as needed at composer install time.
If you want to, you can however still override this env var from command line for individual calls.
EDIT: This feature has been integrated in typo3_console 3.3.0
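For illustration, requiring the needed core extensions in the project's composer.json might look like this (the extension list and version constraints are examples, not a recommendation):

```json
{
    "require": {
        "typo3/cms-core": "^8.7",
        "typo3/cms-scheduler": "^8.7",
        "typo3/cms-info": "^8.7",
        "helhum/typo3-console": "^3.3"
    }
}
```

Because these requirements live in composer.json, they are versioned in the git repository, which addresses the concern about keeping this information out of Gitlab variables.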

Jenkins: setting env variable from shell script

I have a bash script that I execute from a Jenkins job, using "execute shell".
The script starts an EC2 instance and sets a host variable containing the host name of the new instance.
I would like to assign the host name of the new instance (script variable: host) to a Jenkins environment variable so that I can pass it down to a downstream job (possibly using the Build Flow plugin).
Any idea how to do so?
Thanks
I ended up using a file to propagate data between builds.
The first build creates a file containing the information I need to propagate (host name of the newly created EC2 instance).
The file looks like:
host.name=ec2.123.3345.amazon.com
I use the EnvInject plugin to read the file and "inject" the properties that are then available in the next build (I'm using the Build Flow Plugin to orchestrate builds).
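A minimal sketch of the propagation-file step in the upstream build (the host value is hard-coded here for illustration; in the real job it is set by the EC2 provisioning script):

```shell
#!/bin/sh
# Upstream build: capture the new instance's host name and write it to a
# properties file that the EnvInject plugin can read in the next build.
host="ec2.123.3345.amazon.com"   # placeholder; normally set by the EC2 start script
echo "host.name=${host}" > build.properties
cat build.properties
```

The downstream job then configures EnvInject to read build.properties, making the value available as a build environment variable.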
There is a plugin that you can install for injecting your variable: the EnvInject Plugin.
If I understood your problem correctly, this simple plugin is what you need.