Run a Spring Boot application with elastic-apm Java agent on OpenShift

I want to deploy a Spring Boot application on an OpenShift cluster and monitor it with Elastic APM, so I need the Elastic Java agent.
In a project I managed to deploy an Elasticsearch instance, a Kibana instance and an apm-server.
Next to that, I also managed to deploy my Spring Boot application. For this I used the web console: I imported my project from GitLab and chose the Java 8 image builder. However, using this method, I didn't find a way to launch my application with the elastic-apm-agent Java agent attached.
Locally, I run this command to start my application:
mvn package && java -javaagent:elastic-apm-agent/elastic-apm-agent-1.26.0.jar \
-Delastic.apm.service_name=ms-salarie \
-Delastic.apm.server_urls=http://localhost:8200 \
-Delastic.apm.secret_token= \
-Delastic.apm.environment=development \
-Delastic.apm.application_packages=com.leanerp.salarie \
-Delastic.apm.config_file=elastic-apm-agent/elasticapm.properties \
-jar target/salarie-1.1.3-SNAPSHOT.jar
Is there a way to override the command launched by my application's container? Or another solution that would allow me to use the elastic-apm-agent?
I am a newbie on OpenShift, so I don't fully understand all the concepts.

OK, so the answer was to add this environment variable:
JAVA_OPTS_APPEND=-javaagent:{{path_to_elastic_apm_agent}}
This variable lets you launch your Java application with additional JVM options.
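For reference, one way to set this is through the oc CLI. The following is only a sketch: it assumes the agent jar was copied into the image under /deployments/elastic-apm-agent/ (a hypothetical path), that the application runs from a DeploymentConfig named ms-salarie, and that the apm-server is exposed as a service called apm-server in the same project:
oc set env dc/ms-salarie \
  JAVA_OPTS_APPEND="-javaagent:/deployments/elastic-apm-agent/elastic-apm-agent-1.26.0.jar -Delastic.apm.server_urls=http://apm-server:8200 -Delastic.apm.service_name=ms-salarie"
The remaining agent settings can also be provided as ELASTIC_APM_* environment variables instead of -D flags.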

The Java agent can be configured in multiple ways, one of which is command-line system properties. Others include packaging an elasticapm.properties resource file or setting environment variables.
Check out the docs. Small excerpt:
Properties file: the elasticapm.properties file is looked up in the same folder as the agent jar, or can be provided through the config_file option; it also supports dynamic configuration.
Environment variables: All configuration keys are in uppercase and prefixed with ELASTIC_APM_.
Different option sources have different priority and precedence.
To attach the agent to a running JVM process (from within your application), you can use the API to self-attach.
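To illustrate the self-attach option, here is a minimal sketch in Java. It assumes the co.elastic.apm:apm-agent-attach dependency is on the classpath, and it reuses the asker's service name and package purely as example values:
import co.elastic.apm.attach.ElasticApmAttacher;

import java.util.HashMap;
import java.util.Map;

public class Application {
    public static void main(String[] args) {
        // Configuration keys mirror the elastic.apm.* system properties
        Map<String, String> config = new HashMap<>();
        config.put("service_name", "ms-salarie");
        config.put("server_urls", "http://localhost:8200");
        config.put("application_packages", "com.leanerp.salarie");

        // Attach the agent to the current JVM before the application starts
        ElasticApmAttacher.attach(config);

        // ... then bootstrap the application, e.g. SpringApplication.run(Application.class, args);
    }
}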

Related

Running Spark from a local IDE

I've been spending some time banging my head over trying to run a complex Spark application locally in order to test more quickly (without having to package and deploy to a cluster).
Some context:
This Spark application interfaces with the DataStax Enterprise version of Cassandra and its distributed file system, so it needs some explicit jars to be provided (not available in Maven)
These jars are available on my local machine, and to "cheese" this, I tried placing them in SPARK_HOME/jars so they would be automatically added to the classpath
I tried to do something similar with the required configuration settings by putting them in spark-defaults.conf under SPARK_HOME/conf
When building this application, we do not build an uber jar, but rather do a spark-submit on the server using --jars
The problem I'm facing is that when I run the Spark application through my IDE, it doesn't seem to pick up any of these additional items from the SPARK_HOME directory (config or jars). I spent a few hours trying to get the config items to work and ended up setting them as System.property values in my test case before starting the Spark session, so the configuration settings can be ignored for this question.
However, I do not know how to reproduce this for the vendor-specific jar files. Is there an easy way I can emulate the --jars behavior of spark-submit and somehow set up my Spark session with these jars? Note: I am using the following command in my code to start a Spark session:
SparkSession.builder().config(conf).getOrCreate()
Additional information, in case it helps:
The Spark version I have locally in SPARK_HOME is the same version that my code is compiling with using Maven.
I asked another question similar to this related to configs: Loading Spark Config for testing Spark Applications
When I print the SPARK_HOME environment variable in my application, I get the correct SPARK_HOME value, so I'm not sure why neither the configs nor the jar files are being picked up from there. Is it possible that when running the application from my IDE, it doesn't pick up the SPARK_HOME environment variable and uses all defaults?
You can make use of .config(key, value) while building the SparkSession, by passing "spark.jars" as the key and a comma-separated list of paths to the jars, like so:
SparkSession.builder().config("spark.jars", "/path/jar1.jar, /path/jar2.jar").config(conf).getOrCreate()

How to change some Java variables before deploying to a remote server?

I am currently building a java web application (with netbeans).
I use Jenkins to create a release version with the following pipeline:
Build -> Test -> Deploy (to a remote test webserver)
Build and Test are OK but I have a question about the deploy job.
The deploy job currently takes my previously generated .war file and simply transfers it to a remote web server (with the "Deploy to container" plugin).
But I would like to change the database parameters of my web application first (in order to use another remote test database).
I would happily modify the Java file with a shell command, but I can't because my .war only contains the compiled .class files.
So how could I change some of my web application's Java code (for the database credentials) in the .war file before deploying it to the remote web server?
If you have multiple environments with different databases, then the best way to handle this is to pass the values as command-line parameters. You can modify your Java application to read the command-line parameters and use them in the application.
For example: --dburl=<database url> --dbusername=<db username>
Another way is to take these parameters from environment variables, and define those variables on the system where you deploy the application.
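A minimal sketch of both approaches in plain Java; the names --dburl, --dbusername, DB_URL and DB_USERNAME are only examples, and in a servlet container the environment-variable variant is usually the more practical one:
public class DbConfig {
    public static void main(String[] args) {
        // Environment variables as defaults (names are illustrative)
        String dbUrl = System.getenv().getOrDefault("DB_URL", "jdbc:postgresql://localhost:5432/app");
        String dbUser = System.getenv().getOrDefault("DB_USERNAME", "app");

        // Command-line parameters override the environment
        for (String arg : args) {
            if (arg.startsWith("--dburl=")) {
                dbUrl = arg.substring("--dburl=".length());
            } else if (arg.startsWith("--dbusername=")) {
                dbUser = arg.substring("--dbusername=".length());
            }
        }

        System.out.println("Connecting to " + dbUrl + " as " + dbUser);
    }
}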

Configure Multiple SonarQube Instances in a Gradle Build

In our CI environment, we currently have one build server (based on Atlassian Bamboo) and two SonarQube instances (versions 6.0 and 6.5). Initially, our CI server was configured to communicate with the 6.0 SonarQube instance. This has been configured in the /home/bamboo/.gradle/gradle.properties file on our CI server like this:
systemProp.sonar.host.url=<http url of SonarQube 6.0 instance>
systemProp.sonar.login=<username here>
systemProp.sonar.password=<password here>
Now we have another Gradle-based project running on our CI server which shall talk to the new SonarQube 6.5 instance. I tried configuring this but failed all the time.
Things I have done so far:
Added commandline arguments to gradle wrapper command:
I have tried adding -Dsonar.host.url=, -Dsonar.login=, -Dsonar.password= to the Gradle command. As this didn't seem to work, I have also tried to set commandline arguments as SonarQube system properties using -DsystemProp.sonar.host.url=, -DsystemProp.sonar.login=, -DsystemProp.sonar.password=. This didn't work either.
Added properties to the build.gradle file like this:
sonarqube {
properties {
property "sonar.host.url", "<http url of SonarQube 6.0 instance>"
property "sonar.login", "<username here>"
property "sonar.password", "<password here>"
...<other SonarQube analysis settings here>...
}
}
In all cases, the CI server talked to the wrong SonarQube instance (6.0). My question is whether it is possible to configure a single project to talk to another SonarQube instance. We use Gradle 3.2.1 as the build tool, and we are using the org.sonarqube Gradle plugin too.
Thank you for any help.
André
Your first try did not work because you set the system properties from the command line, but setting them from the project properties later on resets the system properties to the configured values.
Your second try did not work because the systemProp.sonar.login syntax is only supported in gradle.properties files, not via -P command-line project properties.
Your third try did not work because the SonarQube scanner prefers the system property values over the values configured via the DSL, so that one can override what is configured in the build script with local configuration.
You need to set the system properties in your build script manually; this then overwrites what was automatically set from the project property. Using the project's gradle.properties file does not work, as the user file overwrites the project file. So you need something like System.properties.'sonar.login' = '...' in your build script. You can either hard-code the values there, or use project properties that you set in your gradle.properties file or via -P parameters.
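A sketch of how that could look in the build.gradle of the project that should talk to the 6.5 instance; the sonar65Url, sonar65Login and sonar65Password project property names are made up and would be supplied via -P parameters or a gradle.properties entry:
if (project.hasProperty('sonar65Url')) {
    // Overwrites whatever systemProp.sonar.* was set in ~/.gradle/gradle.properties
    System.properties['sonar.host.url'] = project.property('sonar65Url')
    System.properties['sonar.login'] = project.property('sonar65Login')
    System.properties['sonar.password'] = project.property('sonar65Password')
}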
Besides that, I'd never depend on having any configuration in the Gradle user dir on a build server. Most build servers use build agents that might run on distributed machines, so you would always have to make sure that all build agents are configured the same, and so on. I'd always put the relevant configuration in the build server's build setup, either by setting system properties, environment variables, or command-line arguments.
Just my 2ct.

Jenkins WebSphere deployment doesn't retain application configurations

I am using Jenkins version 1.644 and trying to deploy a web application to a WebSphere 8.5 application server. The Jenkins job completes successfully and the application is visible through the admin console. After the first install, I manually configured three application settings, namely:
1. Virtual Host
2. Context Root
3. Modules
After this setup the application comes up fine.
Now when I run the Jenkins job again (using the Install/Update application option), it overrides all of these configurations.
Please let me know how to keep the configuration after each build from Jenkins.
WebSphere plugin configuration
You can create a build/deploy job that calls the wsadmin tool, and there you can pass parameters as key-value pairs.
Here is an article that talks about how to build a job with parameterized configuration:
http://www.touchdownconsulting.nl/2011/03/building-and-deploying-websphere-applications-with-jenkins-ci/
I have not tried this, but it looks like it suits your requirement.
Hope this helps!
The current Jenkins WebSphere deploy plugin version (1.3.4) does not allow passing:
1. Virtual Host
2. Context Root
3. Modules
I created a Jython script using the WAS AdminApp utility and updated these parameters:
AdminApp.edit("appname", ['-MapWebModToVH', [["appname", "appname.war,WEB-INF/web.xml", "api_host"]]])
AdminApp.edit("appname", ['-CtxRootForWebMod', [["appname", "appname.war,WEB-INF/web.xml", "/appname"]]])
AdminApp.edit("appname",['-MapModulesToServers', [["appname","appname.war,WEB-INF/web.xml","WebSphere:cell=appcell01,node=node12v,server=web2+WebSphere:cell=Cell01,node=node11v,server=web1+WebSphere:cell=Cell01,cluster=api-cluster"]]])
AdminConfig.save()
I used the Jenkins Remote SSH plugin to invoke this script.

Spring Boot: running a fully executable JAR and specifying -D properties

The Spring Boot Maven and Gradle plugins can now generate fully executable archives for Linux/Unix operating systems. Running a fully executable JAR is as easy as typing:
$ ./myapp.jar
My question is, in this case, how do I set -D properties, e.g.
-Dspring.profiles.active=test
In addition, if the server does not have a JDK installed, can this fully executable JAR still run?
There are two ways to configure properties like that:
1:
By specifying them in a separate configuration file. Spring Boot will look for a file named like the JAR (i.e. JARfilename.conf), stored in the same folder as the JAR file. There you can set the JAVA_OPTS environment variable:
JAVA_OPTS="-Dpropertykey=propvalue"
2:
Or you can just specify the value for the environment variable in the shell before you execute the application:
JAVA_OPTS="-Dpropertykey=propvalue" ./myapp.jar
Have a look at the documentation for the complete list of available variables: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#deployment-service
Regarding your second question: to execute a JAR you don't need a JDK, a JRE is sufficient (but you need at least that; if you don't have any Java installed on the server, the application won't run).
By default, SpringApplication converts any command-line option arguments (starting with --, e.g. --server.port=9000) into properties and adds them to the Spring Environment. Command-line properties always take precedence over other property sources.
e.g.
$ java -jar myapp.jar --spring.application.json='{"foo":"bar"}'
Please see http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/
