I'm using Azure DevOps Server to implement a CI/CD pipeline for a Spring Boot Java application, using a Command Line task to perform the job in the release pipeline.
If you are developing a Java application, you will know that you have to use java -jar to start it.
The problem is that when I run the java -jar command from a Command Line task in the Azure DevOps release (CD) pipeline, the application starts and the deployment finishes,
but the status never turns to "Completed". It stays at "Deploying".
Is there any solution in cmd?
Assuming I can only use CMD in the release pipeline, I want the status to reach "Completed" so that I can continue to the next job.
Many thanks!!
We have a build that runs SonarQube from Jenkins using a bash script, and we want to get the test results back into the Jenkins pipeline so we can block merges on failure. We are using Jenkins v2, but it is an old version that doesn't support the SonarQube Jenkins plugin, and upgrading Jenkins isn't something we can accomplish in our sprint.
Is there a way to gate our pipeline on the results with what we have? At the moment, this is how we run SonarQube from Jenkins in OpenShift:
dotnet build
~/.dotnet/tools/coverlet "./bin/Debug/netcoreapp3.1/AppTests.dll" --target "dotnet" --targetargs 'test . --no-build --logger "trx;LogFileName=TestResults.trx" --logger "xunit;LogFileName=TestResults.xml" --results-directory ../BuildReports/UnitTests' -f opencover -o ./BuildReports/Coverage/coverage
dotnet build-server shutdown
~/.dotnet/tools/dotnet-sonarscanner begin /k:${APP_NAME} /n:${APP_NAME} /d:sonar.host.url=${SONAR_URL} /d:sonar.cs.opencover.reportsPaths="./BuildReports/Coverage/coverage.opencover.xml" /d:sonar.exclusions="**/Migrations/*" /d:sonar.coverage.exclusions="**Tests*.cs","**/Migrations/*","**/Program.cs" /d:sonar.cpd.exclusions="**/Migrations/*" /d:sonar.cs.vstest.reportsPaths="./BuildReports/UnitTests/TestResults.trx" /d:sonar.cs.nunit.reportsPaths="./BuildReports/UnitTests/TestResults.xml"
dotnet build -v n
~/.dotnet/tools/dotnet-sonarscanner end
dotnet build-server shutdown
Install the Build Breaker plugin on the SonarQube server and enable it for the project you are scanning (under Project Settings on the server). You may need server-level and project-level administrative rights to do this.
The scanner will then check the quality gate status after the code analysis. If the quality gate fails, the scanner exits with a non-zero status code, which you can use to mark the build as failed.
https://github.com/adnovum/sonar-build-breaker#sonarqube-build-breaker-plugin
If you don't have control over what gets installed on the SonarQube server, you can write a bash script that uses curl against the SonarQube web API: first check whether the server has finished processing the analysis report, then read the quality gate status of the analysis that just completed.
For the documentation of the web API, see http://<sonarqube-server-host>/web_api.
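A minimal sketch of that polling approach. The variable names (SONAR_URL, SONAR_TOKEN) and the report-task.txt path are assumptions to adapt to your environment; the two endpoints used (the ceTaskUrl the scanner records, and api/qualitygates/project_status) are the standard SonarQube web API for this.

```shell
#!/usr/bin/env bash
# Gate the build on the quality gate by polling the SonarQube web API.
# SONAR_URL, SONAR_TOKEN and file paths below are illustrative assumptions.
set -u

# Pull the first occurrence of a string field out of a JSON response
# (avoids a jq dependency in the build image).
json_field() {
  grep -o "\"$1\":\"[^\"]*\"" | head -n1 | cut -d'"' -f4
}

quality_gate_status() {
  # $1 = ceTaskUrl, written by the scanner into report-task.txt
  local task_json analysis_id
  # Wait (up to ~5 minutes) for the server to process the analysis report.
  for _ in $(seq 1 60); do
    task_json=$(curl -sf -u "${SONAR_TOKEN}:" "$1")
    [ "$(printf '%s' "$task_json" | json_field status)" = "SUCCESS" ] && break
    sleep 5
  done
  analysis_id=$(printf '%s' "$task_json" | json_field analysisId)
  curl -sf -u "${SONAR_TOKEN}:" \
    "${SONAR_URL}/api/qualitygates/project_status?analysisId=${analysis_id}" \
    | json_field status
}

# Usage after `dotnet-sonarscanner end` (the scanner's default output path):
#   CE_TASK_URL=$(grep '^ceTaskUrl=' .sonarqube/out/.sonar/report-task.txt | cut -d= -f2-)
#   [ "$(quality_gate_status "$CE_TASK_URL")" = "OK" ] || exit 1
```

Exiting non-zero when the gate status isn't OK is what lets the Jenkins stage fail and block the merge.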
I have a PowerShell script that pushes a package to Azure DevOps.
I want to run it automatically every time I check in code to Azure DevOps from Visual Studio.
Is this possible? How can I do this?
If you are pushing the package to a Git repository in Azure DevOps, you can create a build pipeline that runs your script, and give that pipeline a trigger so it runs after each commit.
If you are pushing packages to Azure DevOps Artifacts, then you can use the Azure DevOps REST API or the PowerShell module for Azure DevOps to trigger a specific build.
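For the first option, a minimal azure-pipelines.yml sketch (branch name, pool image, and script path are assumptions): the trigger block runs the pipeline on every push, and a PowerShell task runs your existing script.

```yaml
# Assumed file: azure-pipelines.yml at the repository root
trigger:
  branches:
    include:
      - main          # run on every push to main

pool:
  vmImage: 'windows-latest'

steps:
  - task: PowerShell@2
    inputs:
      filePath: 'scripts/push-package.ps1'   # your existing push script
```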
I am currently building a Java web application (with NetBeans).
I use Jenkins to create a release version with the following pipeline (for Jenkins):
Build -> Test -> Deploy (to a remote test webserver)
Build and Test are OK but I have a question about the deploy job.
The deploy job currently takes my previously generated .war file and simply transfers it to a remote web server (with the "Deploy to container" plugin).
But I would like to change the database parameters of my web application first (in order to use another remote test database)!
I would happily modify the Java file with a shell command, but I can't, because my .war only contains the compiled .class files.
So how can I change some of my web application's Java code (the database credentials) in the .war file before deploying it to the remote web server?
If you have multiple environments with different databases, the best way to handle this is command-line parameters: modify your Java application to read them at startup and use them to configure the database connection.
For example: --dburl=<database url> --dbusername=<db username>
Another way is to take these parameters from environment variables, and define those variables on the system where you deploy the application.
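The environment-variable pattern might look like this on the deploy side. The variable names (DB_URL, DB_USERNAME) are illustrative assumptions; on the Java side the matching lookup would be something like System.getenv("DB_URL") with a fallback default.

```shell
# Export the test-database settings on the deploy target instead of
# patching the .war; DB_URL/DB_USERNAME are assumed, illustrative names.
export DB_URL="jdbc:mysql://test-db:3306/appdb"
export DB_USERNAME="app_test"

# Default-fallback lookup, mirroring what the application code would do
# with System.getenv(...) when a variable is unset.
db_setting() {
  # $1 = variable name, $2 = default used when the variable is unset
  eval "printf '%s' \"\${$1:-$2}\""
}
```

This keeps credentials out of the compiled artifact entirely, so the same .war can be promoted unchanged from the test server to production.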
(Jenkins newb-newb-newbie here)
Hi there.
I have a Maven project built on Jenkins. The project has an integration test that depends on a .Net server in order to run correctly.
The problem is that when I try to build the project on Jenkins, the integration test fails, because the .Net server isn't launched...
I need to execute a shell script (to launch the .Net server) before building my project.
So my question is: how can I run a script from my project before the build in Jenkins?
There's a build step, Execute Windows batch command, which you can use to start your server.
You might have to use START to launch it in a separate process, so your build continues without waiting for the server process to finish, and you might need to put in some delay in case your server needs time to settle before your tests can run.
You might also need to kill the server after your tests are done; you may be able to use tasklist and taskkill in another Execute Windows batch command build step, plus some batch magic, to do this.
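Put together, those batch steps might look like this (the executable name, path, and delay are assumptions to adapt):

```batch
REM --- build step 1: start the .Net server detached ---
REM START returns immediately, so the Maven build can continue.
START "dotnet-server" /D C:\server DotNetServer.exe

REM Give the server some time to settle before the integration tests run.
TIMEOUT /T 30 /NOBREAK

REM --- (the Maven build and integration tests run as later build steps) ---

REM --- build step 2, after the tests: find and stop the server ---
TASKLIST /FI "IMAGENAME eq DotNetServer.exe"
TASKKILL /F /IM DotNetServer.exe
```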
What are the recommended/available approaches to remotely deploying a Mule application to a remote Mule server? I'm using Maven for the build, btw.
I saw that the appkit can remote deploy to Cloudhub: http://blogs.mulesoft.org/using-continuous-deployment-with-cloudhub/
and there's a REST Maven plugin for the management console, but I'm using the standalone community edition and can't use either one. I also saw a Cargo implementation on GitHub, but it only handles local deployments.
I would write a script that:
scp the application to /tmp
remote mv the application to $MULE/apps
Because moving a file is an atomic operation, this avoids the problem of uploading directly into the apps directory and having Mule pick up a partially uploaded application.
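A sketch of that script, with the move step as a function (the host, user, and Mule paths are assumptions). Note the mv is only atomic when /tmp and the apps directory are on the same filesystem; otherwise stage the file in a scratch directory next to apps instead.

```shell
# Stage the upload outside the apps directory, then mv it in, so Mule's
# app poller never sees a partially written file. Paths are illustrative.
stage_and_deploy() {
  # $1 = staged file (e.g. landed in /tmp via scp), $2 = Mule apps directory
  mv "$1" "$2/$(basename "$1")"
}

# Remote usage (assumed host and paths):
#   scp target/my-app.zip mule@mule-server:/tmp/my-app.zip
#   ssh mule@mule-server 'mv /tmp/my-app.zip /opt/mule/apps/my-app.zip'
```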
"scp the application to /tmp" reminds me this! ...