MuleSoft Maven plugin not reading variables in Azure pipelines

I have a MuleSoft application that I am trying to deploy from a pipeline.
I am using the Mule Maven plugin and a connected app for credentials. The plugin configuration looks like this:
<configuration>
  <armDeployment>
    <muleVersion>${app.runtime}</muleVersion>
    <uri>https://anypoint.mulesoft.com</uri>
    <businessGroupId>${BUSINESSGROUPID}</businessGroupId>
    <target>${TARGET}</target>
    <targetType>${TARGETGROUP}</targetType>
    <connectedAppClientId>${APPCLIENTID}</connectedAppClientId>
    <connectedAppClientSecret>${APPCLIENTSECRET}</connectedAppClientSecret>
    <connectedAppGrantType>client_credentials</connectedAppGrantType>
    <environment>${ENVIRONMENT}</environment>
  </armDeployment>
</configuration>
I define the variables in the Azure pipeline (3 of them are secret credentials), and when I run the pipeline I get a 401 Unauthorized error.
When I hard-code the values in the above configuration it works fine. Only when I try to have the POM file read them from the pipeline variables do I get this error.
Below is my pipeline config:
trigger:
- master
variables:
  APPCLIENTID: $(APPCLIENTID)
  APPCLIENTSECRET: $(APPCLIENTSECRET)
  ENVIRONMENT: $(ENVIRONMENT)
  BUSINESSGROUPID: $(BUSINESSGROUPID)
  TARGET: $(TARGET)
  TARGETGROUP: $(TARGETGROUP)
pool:
  vmImage: ubuntu-latest
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    goals: 'clean package deploy -DmuleDeploy'
I am not sure whether I need to define the variables here again or not.
How do I make the POM file read the variables correctly?

You can't use Azure Pipelines variables directly in the POM; they are not Maven properties. You have to define them explicitly as properties on Maven's command line.
In the goals input you can define Maven properties and assign them the values of the corresponding Azure Pipelines variables.
I'm guessing the syntax for referencing pipeline variables is $(var), so as an example:
goals: 'clean package deploy -DmuleDeploy -DAPPCLIENTID=$(APPCLIENTID)'
Just try adding the other properties next to APPCLIENTID.
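For example, a complete goals line following that pattern (untested, using the same variable names as in the question) would be:
goals: 'clean package deploy -DmuleDeploy -DAPPCLIENTID=$(APPCLIENTID) -DAPPCLIENTSECRET=$(APPCLIENTSECRET) -DENVIRONMENT=$(ENVIRONMENT) -DBUSINESSGROUPID=$(BUSINESSGROUPID) -DTARGET=$(TARGET) -DTARGETGROUP=$(TARGETGROUP)'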

OK, I found out what the issue is.
The Azure Pipelines documentation states that you need to use $() in order to access variables, but in a POM file you need to use ${}. So the POM file should look like this:
<configuration>
  <armDeployment>
    <muleVersion>${app.runtime}</muleVersion>
    <uri>https://anypoint.mulesoft.com</uri>
    <businessGroupId>${BUSINESSGROUPID}</businessGroupId>
    <target>${TARGET}</target>
    <targetType>${TARGETGROUP}</targetType>
    <connectedAppClientId>${APPCLIENTID}</connectedAppClientId>
    <connectedAppClientSecret>${APPCLIENTSECRET}</connectedAppClientSecret>
    <connectedAppGrantType>client_credentials</connectedAppGrantType>
    <environment>${ENVIRONMENT}</environment>
  </armDeployment>
</configuration>
One more thing I noticed: doing this does not let the POM read variables marked as secret. I have yet to find out how to make that work properly, but for now most of it works.
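For reference, Azure Pipelines does not expose secret variables to steps as environment variables automatically; they have to be passed to the step explicitly, for example with env: on the Maven task (whether that alone makes the ${...} references in the POM resolve was not confirmed here):
- task: Maven@3
  env:
    APPCLIENTSECRET: $(APPCLIENTSECRET)
  inputs:
    ...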

Related

How do you run micronaut from gradle with local properties

I want to run the Micronaut server from the Gradle command line with "local" environment variables.
The regular command
.\gradlew.bat run
will use the default values defined in the application.yml file.
I want to override some of them with values for my local environment, and therefore need to specify the system property micronaut.environments=local so that the overriding values from application-local.yml are used.
.\gradlew.bat run -Dmicronaut.environments=local
The command above won't work, as Gradle takes only -Dmicronaut as the system property and treats the rest, ".environments=local", as another task name:
Task '.environments=local' not found in root project 'abc'
What would be the correct way to pass such a system property to the Java process?
The command below works on Unix shells (for Windows, see the note below):
MICRONAUT_ENVIRONMENTS=local gradle run
or use gradle wrapper
MICRONAUT_ENVIRONMENTS=local .\gradlew.bat run
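On Windows cmd the inline VARIABLE=value prefix is not supported; a rough equivalent (untested here) is to set the variable first:
set MICRONAUT_ENVIRONMENTS=local
.\gradlew.bat run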
P.S. also, you can find the same approach for Spring Boot
My approach is to add a Gradle task:
task runLocal(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = "dontdrive.Application"
    jvmArgs '-Dmicronaut.environments=local'
}
then start with:
./gradlew runLocal

Differences in ivy/maven publishing for sbt plugin

I have two SBT plugins: PluginA and PluginB. PluginA depends on tasks in PluginB. Whenever I publish PluginB locally to ~/.ivy2 using publishLocal, PluginA works. However, when I publish PluginB to my local ~/.m2 using publishM2, the dependency still resolves but the compile task for PluginA fails with:
"object xxx is not a member of package yyy".
I have tried setting publishMavenStyle to both true and false and adjusting the resolver, but neither works.
Why does this happen, and is there a way to get this to work when publishing in a Maven style?
This was a mistake on my part. I added the plugin incorrectly by using from:
addSbtPlugin("com.xxyy" %% "PluginA" % "0.0.2" from "http://internal.repo.com")
The POM was found, so the dependency appeared to resolve, but the corresponding JAR was not found, so the build failed.
To fix this I added a resolver before adding the plugin:
resolvers += "xxyy" at "http://internal.repo.com"
addSbtPlugin("com.xxyy" %% "PluginA" % "0.0.2")

Is there a jenkins job-dsl block/code for defining Gradle-Artifactory plugin configuration?

In a Jenkins job, I use the Gradle-Artifactory plugin to publish the artifact to a specific path in Artifactory (the path is specified in the build.gradle of the Git project).
I want to define my Jenkins jobs through Job DSL. What would be the Job DSL code/block for the Gradle-Artifactory plugin configuration?
I tried to use the ArtifactoryGradleConfigurator class but it did not work.
The Artifactory plugin is not yet supported by Job DSL. What you need to do is create the corresponding XML config yourself via the configure block. Here is an example to get you started:
job('artifactory-config') {
    configure {
        it / buildWrappers / 'org.jfrog.hudson.gradle.ArtifactoryGradleConfigurator' {
            deployMaven 'false'
            deployIvy 'false'
            deployBuildInfo 'true'
            includeEnvVars 'false'
            deployerCredentialsConfig {
                credentialsId 'foobar'
                overridingCredentials 'false'
            }
        }
    }
}
The actual configuration you need is a bit more extensive. Have a look at the config.xml of your job; there you will find the XML tag for ArtifactoryGradleConfigurator. It will look like this:
<project>
  <buildWrappers>
    <org.jfrog.hudson.gradle.ArtifactoryGradleConfigurator>
      <deployMaven>false</deployMaven>
      <deployIvy>false</deployIvy>
      <deployBuildInfo>true</deployBuildInfo>
      <includeEnvVars>false</includeEnvVars>
      <deployerCredentialsConfig>
        <credentials>
          <username></username>
          <password></password>
        </credentials>
        <credentialsId></credentialsId>
        <overridingCredentials>false</overridingCredentials>
      </deployerCredentialsConfig>
      <resolverCredentialsConfig>
        <credentials>
          <username></username>
          <password></password>
        </credentials>
        <credentialsId></credentialsId>
        <overridingCredentials>false</overridingCredentials>
      </resolverCredentialsConfig>
    </org.jfrog.hudson.gradle.ArtifactoryGradleConfigurator>
  </buildWrappers>
</project>
One important thing to know: you do not need to configure the whole block, but if you miss an important XML tag the job will still be generated and you simply will not see the configuration in the UI. Try to get the generated XML to match the original 1:1. The Jenkins Job DSL Playground is a nice tool to help with that.
The nice thing about the Jenkins Artifactory integration for Gradle is that all it does is apply the Gradle Artifactory plugin (which is, of course, all code: Gradle DSL). So instead of applying the plugin from the Jenkins UI you can apply it directly in Gradle, in code.
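As a rough sketch of that approach (repository key, URL, credential property names and the plugin version are placeholders, and the exact DSL depends on the plugin version), the build.gradle could apply and configure the plugin directly:
plugins {
    id 'com.jfrog.artifactory' version '4.33.1'   // version is illustrative, check the Gradle plugin portal
    id 'maven-publish'
}

artifactory {
    contextUrl = 'https://artifactory.example.com/artifactory'   // placeholder URL
    publish {
        repository {
            repoKey = 'libs-release-local'                        // placeholder repository
            username = project.findProperty('artifactoryUser')
            password = project.findProperty('artifactoryPassword')
        }
        defaults {
            publications('mavenJava')                             // publish the mavenJava publication
        }
    }
}
The Jenkins job then only needs to run the artifactoryPublish task instead of configuring the build wrapper.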

How to override Gradle `rootProject.name` from the command line?

I'm writing a test that does a build and publish to Artifactory. Since I don't want the test to fail if it's run concurrently (e.g. by separate build jobs or developers), I'd like to override rootProject.name. Can this be done from the command line? I've tried -ProotProject.name=${module} and -Pproject.archivesBaseName=${module}, but they don't work (the latter has some effect, but the artifact is still published under the rootProject.name set in settings.gradle).
You'll have to script settings.gradle. For example:
rootProject.name = System.getProperty("rootProjectName")
Now you can run with gradle build -DrootProjectName=foo.
The following is a slightly simpler variant that falls back to the default name when the property is not set:
rootProject.name = System.getProperty('rootProjectName') ?: rootProject.name
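For example (the overriding name here is just an illustration):
./gradlew build -DrootProjectName=ci-test-42   # publishes under the overridden name
./gradlew build                                # falls back to the name from settings.gradle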

Is there a way to post-process project generated from archetype?

Say I have an archetype and I generate a project from it, but I would like to resolve placeholders in a properties file of the generated project after generation, by passing the placeholder value on the command line.
For example having the following command line:
mvn archetype:create -DarchetypeGroupId=... -DarchetypeArtifactId=... -DarchetypeVersion=1.0 -DgroupId=... -DartifactId=my-project -Dversion=1.0-SNAPSHOT -Dhello=Hello!
say the archetype contains app.properties (as part of the project being generated) with the following content:
greeting=${hello}
Is it possible to replace ${hello} with "Hello!" right after the project has been generated as a result of the mvn archetype:create command?
Yes this is possible. From the advanced usage guide for maven archetypes:
If the user wants to customize the generated project even further, a groovy script named archetype-post-generate.groovy can be added in src/main/resources/META-INF/. This script will end up in the generated archetype's META-INF folder and will be executed upon creating a project from this archetype. This groovy script has access to the ArchetypeGenerationRequest object, as well as all the System.getProperties() and all the archetype generation properties the user has specified.
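A minimal sketch of such a script (assuming app.properties sits in the root of the generated project and is not already filtered during generation; the request binding is the ArchetypeGenerationRequest mentioned above):
// src/main/resources/META-INF/archetype-post-generate.groovy
def projectDir = new File(request.outputDirectory, request.artifactId)   // the freshly generated project
def propsFile = new File(projectDir, "app.properties")
// take the value passed as -Dhello=... (falling back to an empty string)
def hello = request.properties.getProperty("hello") ?: System.getProperty("hello", "")
propsFile.text = propsFile.text.replace('${hello}', hello)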
You could define additional properties in the archetype, following the format:
https://maven.apache.org/archetype/maven-archetype-plugin/specification/archetype-metadata.html
For example:
define the file: src\main\resources\META-INF\maven\archetype-metadata.xml
<archetype-descriptor
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://maven.apache.org/plugins/maven-archetype-plugin/archetype-descriptor/1.0.0"
    xsi:schemaLocation="http://maven.apache.org/plugins/maven-archetype-plugin/archetype-descriptor/1.0.0 http://maven.apache.org/xsd/archetype-descriptor-1.0.0.xsd"
    name="modelant.metamodel.api">
  <requiredProperties>
    <requiredProperty key="package"><defaultValue>${groupId}.${artifactId}</defaultValue></requiredProperty>
    <requiredProperty key="parentGroupId"><defaultValue>${groupId}</defaultValue></requiredProperty>
    <requiredProperty key="parentArtifactId"><defaultValue>${artifactId}</defaultValue></requiredProperty>
    <requiredProperty key="parentVersion"><defaultValue>${version}</defaultValue></requiredProperty>
    <requiredProperty key="metamodelUrl"/>
  </requiredProperties>
</archetype-descriptor>
Here you can see that it defines additional required properties, which must be provided in the generation dialog (or on the command line in batch mode, as shown below), where:
- some properties may have no default value - see metamodelUrl
- some properties may have default values, either as static text or referring to the values of the previously defined standard properties: groupId, artifactId, version
- some properties may override the values of the standard properties - see the "package" property, which is redefined here
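In batch mode the required properties are passed with -D flags, for example (the archetype coordinates are left as placeholders, and the URL is just an illustration):
mvn archetype:generate -B -DarchetypeGroupId=... -DarchetypeArtifactId=... -DarchetypeVersion=1.0 -DgroupId=com.example -DartifactId=my-project -Dversion=1.0-SNAPSHOT -DmetamodelUrl=http://example.com/metamodel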
Please note:
- the Apache Maven page on archetypes at https://maven.apache.org/archetype/maven-archetype-plugin/advanced-usage.html refers to just calling "mvn install" in order to publish the archetype to the local repository. This is not enough - use: mvn clean install archetype:update-local-catalog
- the Apache Maven page at https://maven.apache.org/archetype/archetype-models/archetype-descriptor/archetype-descriptor.html states that the properties are referenced using "property name" expressions. This is not correct - the properties can be used in the filtered resources, which are treated as Velocity templates, so the references are ${property name}, and #if, #foreach, etc. directives can be used there
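For example, a filtered resource in the archetype could combine plain property references with Velocity directives (illustrative only):
greeting=${hello}
#if( $metamodelUrl )
metamodel.url=${metamodelUrl}
#end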
Not sure I understood correctly, but for post-processing after project creation you could use the -Dgoals parameter and invoke your own plugin.
I am not sure about your requirement, but why can't you do the same during the project generation itself?
