How to override Spark jars while running spark-submit in cluster mode? (okhttp3) - maven

There is a conflict between a jar in my project and a jar in the spark-2.4.0 jars folder. My Retrofit dependency brings in okhttp-3.13.1.jar (verified with mvn dependency:tree), but Spark on the server ships okhttp-3.8.1.jar, and I get a NoSuchMethodError. So I'm trying to supply my jar explicitly to override it. When I run spark-submit in client mode, it picks up the jar I provided, but when I run the same in cluster mode, it fails to override the jar on the worker nodes: the executors keep using Spark's old jar, which leads to the NoSuchMethodError. My jar is a fat jar, yet Spark's jar somehow takes precedence. Deleting the jars shipped with Spark would probably work, but I can't, as other services may be using them.
Following is my command:
./spark-submit --class com.myJob \
  --conf spark.yarn.appMasterEnv.ENV=uat \
  --conf spark.driver.memory=12g \
  --conf spark.executor.memory=40g \
  --conf spark.sql.warehouse.dir=/user/myuser/spark-warehouse \
  --conf "spark.driver.extraClassPath=/home/test/okhttp-3.13.1.jar" \
  --conf "spark.executor.extraClassPath=/home/test/okhttp-3.13.1.jar" \
  --jars /home/test/okhttp-3.13.1.jar \
  --conf spark.submit.deployMode=cluster \
  --conf spark.yarn.archive=hdfs://namenode/frameworks/spark/spark-2.4.0-archives/spark-2.4.0-archive.zip \
  --conf spark.master=yarn \
  --conf spark.executor.cores=4 \
  --queue public \
  file:///home/mytest/myjar-SNAPSHOT.jar
final Retrofit retrofit = new Retrofit.Builder()
        .baseUrl(configuration.ApiUrl()) // this call throws the NoSuchMethodError
        .addConverterFactory(JacksonConverterFactory.create(new ObjectMapper()))
        .build();
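One quick way to see which copy of okhttp actually wins on a given JVM is to log where the class was loaded from; a minimal diagnostic sketch (the class choice is arbitrary, any okhttp3 class works):

// Prints the jar that okhttp3.HttpUrl was loaded from; if it points at the
// Spark jars folder rather than your fat jar, the override did not take effect.
System.out.println(okhttp3.HttpUrl.class
        .getProtectionDomain().getCodeSource().getLocation());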
My mvn dependency:tree doesn't indicate any other transitive okhttp jars in my jar, and it runs fine locally in IntelliJ as well as with mvn clean install.
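For reference, the tree can be filtered down to just okhttp (assuming the standard com.squareup.okhttp3 group id):

mvn dependency:tree -Dincludes=com.squareup.okhttp3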
I even tried providing an HDFS path for the jar (hdfs://users/myuser/myjars/okhttp-3.13.1.jar) with no luck. Can someone shed some light?
I get the following exception if I try both --conf "spark.driver.userClassPathFirst=true" and --conf "spark.executor.userClassPathFirst=true":
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.<init>(YarnSparkHadoopUtil.scala:48)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.<clinit>(YarnSparkHadoopUtil.scala)
at org.apache.spark.deploy.yarn.Client$$anonfun$1.apply$mcJ$sp(Client.scala:81)
at org.apache.spark.deploy.yarn.Client$$anonfun$1.apply(Client.scala:81)
at org.apache.spark.deploy.yarn.Client$$anonfun$1.apply(Client.scala:81)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:80)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1526)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassCastException: org.apache.hadoop.yarn.api.records.impl.pb.PriorityPBImpl cannot be cast to org.apache.hadoop.yarn.api.records.Priority
at org.apache.hadoop.yarn.api.records.Priority.newInstance(Priority.java:39)
at org.apache.hadoop.yarn.api.records.Priority.<clinit>(Priority.java:34)
... 15 more
But if I set only --conf "spark.executor.userClassPathFirst=true", then it hangs.

I have solved the issue using the maven-shade-plugin: relocating (shading) the conflicting packages makes the application ignore the Spark cluster's own copies of those jars.
Reference video: https://youtu.be/WyfHUNnMutg?t=23m1s
I followed the answer given there and added the configuration below.
Note that even in the SparkSubmit source code you can see the jar being appended to the overall jar list when you pass --jars, so those options never replace the cluster's jar; they only add yours alongside it:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L644
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <pattern>okio</pattern>
                        <shadedPattern>com.shaded.okio</shadedPattern>
                    </relocation>
                    <relocation>
                        <pattern>okhttp3</pattern>
                        <shadedPattern>com.shaded.okhttp3</shadedPattern>
                    </relocation>
                </relocations>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                            <exclude>log4j.properties</exclude>
                        </excludes>
                    </filter>
                </filters>
            </configuration>
        </execution>
    </executions>
</plugin>
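To confirm the relocation took effect, you can list the shaded jar's entries; a quick check (jar name taken from the spark-submit command above):

jar tf myjar-SNAPSHOT.jar | grep okhttp3
# relocated classes should now appear under com/shaded/okhttp3/ instead of okhttp3/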

Related

Kotlin JSR-223 ScriptEngineFactory within the fat jar - Cannot find kotlin compiler jar

I have a fat jar where I'm trying to get an instance of Kotlin's ScriptEngine.
For debugging purposes, I'm iterating through the available script engine factories and getting their engines.
val scriptEngineManager = ScriptEngineManager()
for (factory in scriptEngineManager.engineFactories) {
    val scriptEngine = factory.scriptEngine
}
When it hits Kotlin's engine, it fails with the following exception:
Exception in thread "main" java.io.FileNotFoundException: Cannot find kotlin compiler jar, set kotlin.compiler.jar property to proper location
at org.jetbrains.kotlin.script.jsr223.KotlinJsr223ScriptEngineFactoryExamplesKt$kotlinCompilerJar$2.invoke(KotlinJsr223ScriptEngineFactoryExamples.kt:100)
at org.jetbrains.kotlin.script.jsr223.KotlinJsr223ScriptEngineFactoryExamplesKt$kotlinCompilerJar$2.invoke(KotlinJsr223ScriptEngineFactoryExamples.kt)
at kotlin.SynchronizedLazyImpl.getValue(Lazy.kt:130)
at org.jetbrains.kotlin.script.jsr223.KotlinJsr223ScriptEngineFactoryExamplesKt.getKotlinCompilerJar(KotlinJsr223ScriptEngineFactoryExamples.kt)
at org.jetbrains.kotlin.script.jsr223.KotlinJsr223ScriptEngineFactoryExamplesKt.access$getKotlinCompilerJar$p(KotlinJsr223ScriptEngineFactoryExamples.kt:1)
at org.jetbrains.kotlin.script.jsr223.KotlinJsr223JvmDaemonLocalEvalScriptEngineFactory.getScriptEngine(KotlinJsr223ScriptEngineFactoryExamples.kt:56)
at davidsiro.invoices.InvoiceGeneratorKt.generateInvoice(invoiceGenerator.kt:16)
at davidsiro.invoices.MainKt.main(main.kt:11)
My fat jar contains all of the dependencies (though unpacked), including the Kotlin compiler. I'm using the Maven Assembly Plugin to build it, configured like this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.6</version>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>${main.class}</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </execution>
    </executions>
</plugin>
Any ideas?
Update
For the record, I tried both KotlinJsr223JvmLocalScriptEngineFactory and KotlinJsr223JvmDaemonLocalEvalScriptEngineFactory, with the same result.
The JSR-223 factories in kotlin-script-util try to find the compiler jar in order to set up compilation. In your case, you'll need to write your own factory that supplies the script compilation classpath explicitly, e.g.
class MyScriptEngineFactory : KotlinJsr223JvmScriptEngineFactoryBase() {
    override fun getScriptEngine(): ScriptEngine =
        KotlinJsr223JvmLocalScriptEngine(
            Disposer.newDisposable(),
            this,
            classpath, // !!! supply the script classpath here
            KotlinStandardJsr223ScriptTemplate::class.qualifiedName!!,
            { ctx, types -> ScriptArgsWithTypes(arrayOf(ctx.getBindings(ScriptContext.ENGINE_SCOPE)), types ?: emptyArray()) },
            arrayOf(Bindings::class)
        )
}
You need to put the following jars on the classpath:
kotlin-script-util.jar - contains the template class used as a superclass for the script
kotlin-script-runtime.jar - base classes used in scripting
any other jars you'll need in your scripts, quite likely kotlin-stdlib.jar
You may put your fat jar there instead, but that would mean everything in it becomes accessible from your scripts, not to mention the overhead for the compiler.
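A minimal usage sketch, assuming the factory above is completed so that its classpath comes from, say, a constructor parameter (hypothetical wiring; the jar paths are placeholders):

import java.io.File

fun main() {
    // Hypothetical jar locations; point these at the real artifacts on disk.
    val scriptClasspath = listOf(
        File("libs/kotlin-script-util.jar"),
        File("libs/kotlin-script-runtime.jar"),
        File("libs/kotlin-stdlib.jar")
    )
    // Instantiate the custom factory directly instead of relying on
    // META-INF/services discovery, then evaluate a trivial script.
    val engine = MyScriptEngineFactory(scriptClasspath).scriptEngine
    println(engine.eval("1 + 2")) // expect 3
}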

ERROR [WsdlTestCase] Failed to create test step for [X]

I get the error ERROR [WsdlTestCase] Failed to create test step for [X] when executing my SoapUI project from Maven.
Tests work fine from the SoapUI client software, and my SoapUI error log is empty.
There doesn't seem to be enough information to let me debug this. I'm guessing there's some dependency I'm missing, but my tests are fairly simple (REST requests with a few assertions on the HTTP response), so I assumed the core Maven plugin would suffice.
My Maven config is below (I've also included the eviware repository, http://www.eviware.com/repository/maven2/):
<plugin>
    <groupId>eviware</groupId>
    <artifactId>maven-soapui-plugin</artifactId>
    <version>2.0.2</version>
    <executions>
        <execution>
            <phase>integration-test</phase>
            <id>soapui-tests</id>
            <configuration>
                <projectFile>${basedir}/src/test/resources/MyTestSuite.xml</projectFile>
                <outputFolder>${basedir}/target/soapui</outputFolder>
                <junitReport>true</junitReport>
                <exportAll>true</exportAll>
                <printReport>true</printReport>
            </configuration>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The preceding messages are:
[DefaultSoapUICore] Missing folder [/var/lib/jenkins/jobs/Build-Mule-Configuration/workspace/ext] for external libraries
2015-07-01 15:10:48,961 INFO [DefaultSoapUICore] Creating new settings at [/var/lib/jenkins/jobs/Build-Mule-Configuration/workspace/soapui-settings.xml]
I've since moved up to version 4.6.1 of the SoapUI plugin as detailed at http://www.soapui.org/test-automation/maven/maven-2-x.html, and the tests are now at least attempting to execute. As I understand it, the version I outlined above should work, but perhaps it doesn't recognise the current SoapUI project syntax.
As an aside, the 4.6.1 version of the plugin pulls in a much larger number of dependencies, which for me is causing memory issues.
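For reference, the bump described above amounts to changing the plugin version, assuming the same eviware coordinates still resolve from that repository:

<plugin>
    <groupId>eviware</groupId>
    <artifactId>maven-soapui-plugin</artifactId>
    <version>4.6.1</version>
    <!-- executions and configuration as in the original plugin block -->
</plugin>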

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool

I get the below error when I package (jar) and run my default Hadoop job.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Tool
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 12 more
Could not find the main class: DefaultHadoopJobDriver. Program will exit.
Commands used to build the jar:
# jar -cvf dhj.jar
# hadoop -jar dhj.jar DefaultHadoopJobDriver
The above command gave me the error "Failed to load Main-Class manifest attribute from dhj.jar", so I rebuilt the jar with a manifest using:
jar -cvfe dhj.jar DefaultHadoopJobDriver .
hadoop -jar dhj.jar DefaultHadoopJobDriver
This returned the original error message reported above.
My Hadoop job has a single class, DefaultHadoopJobDriver, that extends Configured and implements Tool, and its run method contains only the job creation and the input/output path setup.
Also, I'm using the new API.
I'm running Hadoop 1.2.1, and the job works fine from Eclipse.
This might be something to do with the classpath. Please help.
To execute that jar you don't pass hadoop -jar. The command is:
hadoop jar <jar> [mainClass] args...
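For example, with the jar from the question (the input and output paths are hypothetical):

hadoop jar dhj.jar DefaultHadoopJobDriver /user/me/input /user/me/output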
If the jar still throws java.lang.ClassNotFoundException, use the
hadoop classpath
command to check whether hadoop-core-1.2.1.jar is present on your Hadoop installation's classpath.
If it's not in that list, you have to add the jar to the Hadoop lib directory.
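Since the output is a colon-separated list, a quick way to check is (grep pattern assumed):

hadoop classpath | tr ':' '\n' | grep hadoop-core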
Try building your Hadoop Java code with all the Hadoop jars available in Hadoop's lib folder. In this case you are missing the Hadoop util classes, which live in hadoop-core-*.jar.
The classpath can be baked in while building the jar, or supplied externally, for example via the HADOOP_CLASSPATH environment variable:
export HADOOP_CLASSPATH=<path_containing_hadoop_jars>
hadoop jar <jar_name>
In case anyone is using Maven and lands here: dependency issues can be resolved by asking Maven to include any jars it requires within the parent project's jar itself. That way, Hadoop doesn't have to look elsewhere for dependencies; it can find them right there. Here's how to do this:
1. Go to pom.xml.
2. Add a <build> section to your <project> tag.
3. Add the following to your <build></build> section:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>1.7.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <excludes>
                        <exclude>org.slf4j:slf4j-api</exclude>
                        <exclude>junit:junit</exclude>
                        <exclude>jmock:jmock</exclude>
                        <exclude>xml-apis:xml-apis</exclude>
                        <exclude>org.testng:testng</exclude>
                        <exclude>org.mortbay.jetty:jetty</exclude>
                        <exclude>org.mortbay.jetty:jetty-util</exclude>
                        <exclude>org.mortbay.jetty:servlet-api-2.5</exclude>
                        <exclude>tomcat:jasper-runtime</exclude>
                        <exclude>tomcat:jasper-compiler</exclude>
                        <exclude>org.apache.hadoop:hadoop-core</exclude>
                        <exclude>org.apache.mahout:mahout-math</exclude>
                        <exclude>commons-logging:commons-logging</exclude>
                        <exclude>org.mortbay.jetty:jsp-api-2.1</exclude>
                        <exclude>org.mortbay.jetty:jsp-2.1</exclude>
                        <exclude>org.eclipse.jdt:core</exclude>
                        <exclude>ant:ant</exclude>
                        <exclude>org.apache.hadoop:avro</exclude>
                        <exclude>jline:jline</exclude>
                        <exclude>log4j:log4j</exclude>
                        <exclude>org.yaml:snakeyaml</exclude>
                        <exclude>javax.ws.rs:jsr311-api</exclude>
                        <exclude>org.slf4j:jcl-over-slf4j</exclude>
                        <exclude>javax.servlet:servlet-api</exclude>
                    </excludes>
                </artifactSet>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/jruby.home</exclude>
                            <exclude>META-INF/license</exclude>
                            <exclude>META-INF/maven</exclude>
                            <exclude>META-INF/services</exclude>
                        </excludes>
                    </filter>
                </filters>
            </configuration>
        </execution>
    </executions>
</plugin>
Now build your project again and run it with the normal hadoop jar my.jar ... command. It shouldn't complain about dependencies any more. Hope this helps!

Got NoClassDefFoundError when invoking Axis2 webservice deployed with maven in weblogic

I'm trying to deploy Axis2 web services on WebLogic Server using Maven. The project has Maven modules, one of which is a war in which I have defined the Axis servlet. The wsdl was already available, so I used the wsdl2code plugin to generate the xmlbeans and schema and put them in a jar module. The structure is as below.
--lv-ear (ear with dependency on war)
|
--lv-ws
|
--lv-ws-ccid (jar module with skeleton and xmlbeans)
|
--lv-ws-ecs (jar module with skeleton and xmlbeans)
|
--lv-ws-web (war module with dep on jar modules)
|
--WEB-INF
|
--conf/axis2.xml
--services/ccid/services.xml
I built and deployed the ear to the WebLogic domain. The war was deployed successfully as part of the ear, the services were deployed, and I am able to access the wsdl files. But when I tried to call the service, I got the below ClassNotFoundException for a schema file.
Caused by: java.lang.ClassNotFoundException: schemaorg_apache_xmlbeans.system.s2104B1E6E09A2A85656B3E630BA151C1.TypeSystemHolder
I saw that the random string in that path differed for me, so I tried the call again and got the below NoClassDefFoundError, which persists even after I tried different approaches.
java.lang.NoClassDefFoundError: Could not initialize class com.lv.ws.ccid.xmlbean.InputDocument
at com.lv.ws.ccid.xmlbean.InputDocument$Factory.parse(InputDocument.java:463)
at com.lv.ws.ccid.CcidMessageReceiverInOut.fromOM(CcidMessageReceiverInOut.java:332)
at com.lv.ws.ccid.CcidMessageReceiverInOut.invokeBusinessLogic(CcidMessageReceiverInOut.java:46)
at org.apache.axis2.receivers.AbstractInOutMessageReceiver.invokeBusinessLogic(AbstractInOutMessageReceiver.java:40)
at org.apache.axis2.receivers.AbstractMessageReceiver.receive(AbstractMessageReceiver.java:114)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:173)
at org.apache.axis2.transport.http.HTTPTransportUtils.processHTTPPostRequest(HTTPTransportUtils.java:173)
at org.apache.axis2.transport.http.AxisServlet.doPost(AxisServlet.java:144)
I searched for this and found advice to configure the app server for Axis2 based on http://axis.apache.org/axis2/java/core/docs/app_server.html . When I tried it, I got the below error:
weblogic.xml.stax.XMLStreamInputFactory cannot be cast to javax.xml.stream.XMLInputFactory
After discarding that configuration, I tried some other deployments: packaging the webservice skeleton and xmlbean files in an aar placed inside WEB-INF/services, and putting a Class-Path entry for the jar files in the MANIFEST.MF of the ear / war, to no avail; I still got the same NoClassDefFoundError. Can you please give me suggestions on fixing this?
Fixed it now. This was due to my lack of experience with Axis. The issue was that I had moved the generated schema and xmlbean files into the src folder and then just tried to use the normal jar packaging and dependency mechanism to deploy them.
I removed them from the src folder and now use the wsdl2code and axis2-aar plugins to generate the xmlbean and schema files at build time and package them in an aar, which I then deploy inside the webapp; that works fine. I have listed the plugin configuration inside <build> below.
<plugins>
    <plugin>
        <groupId>org.apache.axis2</groupId>
        <artifactId>axis2-wsdl2code-maven-plugin</artifactId>
        <version>1.5.4</version>
        <executions>
            <execution>
                <id>ccid-ws</id>
                <goals>
                    <goal>wsdl2code</goal>
                </goals>
            </execution>
        </executions>
        <configuration>
            <packageName>com.lv.ws.ccid</packageName>
            <wsdlFile>${basedir}/src/main/resources/META-INF/ccid.wsdl</wsdlFile>
            <databindingName>xmlbeans</databindingName>
            <syncMode>sync</syncMode>
            <unpackClasses>true</unpackClasses>
            <namespaceToPackages>https://mdm.com/portal/ws/services/ccid=com.lv.ws.ccid.xmlbean</namespaceToPackages>
            <outputDirectory>${basedir}/target/generated-sources</outputDirectory>
            <generateServerSide>false</generateServerSide>
            <generateServicesXml>true</generateServicesXml>
            <skipWSDL>true</skipWSDL>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.axis2</groupId>
        <artifactId>axis2-aar-maven-plugin</artifactId>
        <version>1.6.2</version>
        <extensions>true</extensions>
        <executions>
            <execution>
                <phase>prepare-package</phase>
                <goals>
                    <goal>aar</goal>
                </goals>
            </execution>
        </executions>
        <configuration>
            <aarName>ccid</aarName>
            <includeDependencies>false</includeDependencies>
            <outputDirectory>${project.build.directory}/aar</outputDirectory>
        </configuration>
    </plugin>
</plugins>

Sonar and JaCoCo: How to? (Compatibility with Cobertura)

I know there are lots of questions on SO about Sonar and JaCoCo not generating coverage reports correctly, but none of them helped me solve my problem. Here is the deal:
I have a project using Cobertura to calculate code coverage, and I can't change that; Sonar must use Cobertura for the server side. But on the other hand I have this client (a web application using SmartGWT in Java) which is tested with Selenium RC (because that's the only way to test SmartGWT).
Cobertura can't measure the Selenium tests' coverage because those tests run against the deployed application. JaCoCo can, because it is launched as a Java agent. Everything works fine on my computer and the jacoco.exec is generated successfully, but it seems that Sonar can't figure out how to read that file, or maybe Cobertura and JaCoCo are conflicting...
Here is the error that Jenkins generates:
[INFO] Surefire report directory: D:\JENKINS-SONAR\jobs\agepro PROTO\workspace\target\surefire-reports
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at sun.instrument.InstrumentationImpl.loadClassAndStartAgent(InstrumentationImpl.java:323)
at sun.instrument.InstrumentationImpl.loadClassAndCallPremain(InstrumentationImpl.java:338)
Caused by: java.io.FileNotFoundException: D:\JENKINS-SONAR\jobs\agepro (Accès refusé)
at java.io.FileOutputStream.openAppend(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
at org.jacoco.agent.rt_9w37ty.controller.LocalController.startup(LocalController.java:46)
at org.jacoco.agent.rt_9w37ty.JacocoAgent.init(JacocoAgent.java:83)
at org.jacoco.agent.rt_9w37ty.JacocoAgent.premain(JacocoAgent.java:165)
... 6 more
FATAL ERROR in native method: processing of -javaagent failed
Exception in thread "main"
And here is my configuration of the Surefire plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <argLine>-javaagent:${sonar.jacoco.jar}=destfile=${sonar.jacoco.reportPath}</argLine>
        <excludes>
            <exclude>**/Selenium*.java</exclude>
        </excludes>
    </configuration>
    <executions>
        <execution>
            <id>surefire-integration-test</id>
            <phase>integration-test</phase>
            <goals>
                <goal>test</goal>
            </goals>
            <configuration>
                <!-- <argLine>-javaagent:${sonar.jacoco.jar}=destfile=${sonar.jacoco.reportPath}</argLine> -->
                <excludes>
                    <exclude>none</exclude>
                </excludes>
                <includes>
                    <include>**/Selenium*.java</include>
                </includes>
                <goal>selenium-test</goal>
            </configuration>
        </execution>
    </executions>
</plugin>
Do you see any mistake I could have made? Or maybe I have to set another path for the JaCoCo jar, or something else? Thanks for your help anyway.
The name reportPath is a bit misleading: the setting sonar.jacoco.reportPath needs to point to a file, not a directory.
You seem to have set it to D:\JENKINS-SONAR\jobs\agepro, but I think you mean D:\JENKINS-SONAR\jobs\agepro\jacoco.exec.
On further inspection I noticed that the problem might be that your working directory, "agepro PROTO", has a space in it.
This breaks the syntax of the Java agent configuration. Can you get rid of the space? (I don't know whether destfile='${sonar.jacoco.reportPath}' would work.)
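A sketch of that idea, quoting the destfile value inside the Surefire argLine (untested, as noted above):

<argLine>-javaagent:${sonar.jacoco.jar}=destfile='${sonar.jacoco.reportPath}'</argLine>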
