IntelliJ IDEA debugger disconnected when debugging Maven test

I have followed this guide to debug a Maven test via IntelliJ IDEA: http://www.grygoriy.com/2012/01/how-to-debug-tests-maven-test-via.html
When I reach the third step and start debugging, the debugger connects but quickly disconnects and never stops at breakpoints. IntelliJ shows:
Connected to the target VM, address: 'localhost:5005', transport: 'socket'
Disconnected from the target VM, address: 'localhost:5005', transport: 'socket'
Any idea?

The only thing that prevents IDEA from debugging Maven goals is forking.
Plugins such as Surefire and spring-boot use forking by default.
So when you start debugging a Maven goal from IDEA, the debugger connects to the Maven process, but the process you actually want to debug is a separate forked JVM, so the debugger never attaches to it.
To prevent this behavior in the Surefire plugin, see this article: http://maven.apache.org/surefire/maven-surefire-plugin/examples/fork-options-and-parallel-execution.html
In short:
If you use the old Surefire:
<configuration>
<forkMode>never</forkMode>
</configuration>
With the new Surefire:
<configuration>
<forkCount>0</forkCount>
</configuration>
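Alternatively, if you prefer to keep forking enabled, Surefire has a documented debug property that makes the forked test JVM suspend and listen for a debugger (by default on port 5005, which matches the log lines in the question); a minimal sketch:

```shell
# maven.surefire.debug makes the forked test JVM suspend on startup and
# listen on port 5005 until a remote debugger (e.g. IDEA) attaches.
mvn test -Dmaven.surefire.debug
```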
But this is not the whole story:
on CI (and I hope you are using CI tools), you do not want to disable forking, since that makes builds much slower
if you ship your project to others, they will not be happy if some modules behave in a non-default way
So if you want to please CI, IDEA, co-developers, and yourself, you should provide a smarter way to make your build debuggable.
My suggestion:
forking stays the default, because builds happen all the time, while debugging them is the exception
the debug behavior is isolated behind a simple "switch"
My variant:
<properties>
<test.forkCount>1</test.forkCount>
</properties>
<profiles>
<profile>
<id>debug</id>
<properties>
<test.forkCount>0</test.forkCount>
</properties>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<!-- surefire -->
<configuration>
<forkCount>${test.forkCount}</forkCount>
</configuration>
</plugin>
</plugins>
</build>
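With this profile in place, command-line debugging can be sketched as follows (mvnDebug is the debug launcher that ships with Maven; it suspends the Maven JVM and listens on port 8000):

```shell
# With forkCount=0 the tests run inside the Maven JVM itself, so debugging
# the Maven process reaches the test code; attach IDEA to port 8000.
mvnDebug test -P debug
```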
So in IDEA you just need to create a named Run configuration with the goal test and add debug to the profile list.
In every other context, Maven still behaves as before.
An additional benefit: you can encapsulate the whole debug behavior in a single profile.
For example, in my real project the debug profile:
switches off forking for spring-boot:run
switches off JaCoCo coverage (it requires forking in Surefire)
keeps building Docker images locally but prevents pushing to the registry
keeps the full packaging process but prevents publication to Nexus
redirects SOAP UI functional tests to a special URL for local debugging
redirects DBCONFIG to a docker-based Postgres that is "always empty"
sets the log4j log level to DEBUG
So when I run mvn <any goal> -P debug, I am sure my environment and process are really set up for debugging.
But when I run mvn deploy on CI, I get the full stack of my build process.

This can also happen if, for example, the @BeforeAll annotation is used incorrectly; IntelliJ silently swallows problems during test initialization.
In my case the @BeforeAll method was not static:
Incorrect:
@BeforeAll
private void beforeAll() {
}
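For comparison, a corrected version (in JUnit 5, a @BeforeAll method must be static and non-private unless the test class uses @TestInstance(Lifecycle.PER_CLASS)); the class and method names here are illustrative:

```java
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

class ExampleTest {

    // Correct: static and non-private, as JUnit 5 requires for @BeforeAll
    @BeforeAll
    static void beforeAll() {
        // one-time setup shared by all tests in this class
    }

    @Test
    void someTest() {
        // test body
    }
}
```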
It can also happen that the @BeforeAll method itself fails, so the tests never start. Running mvn verify should print the error message in these cases.

Related

How to debug a Maven OpenJFX application in IntelliJ

Since JavaFX has become OpenJFX and must be added as a dependency to your Maven project, debugging an OpenJFX application has become complicated. This question already explains how to solve it for NetBeans, but IntelliJ works slightly differently. The pom.xml has been set up according to this example.
How can you run an OpenJFX (JavaFX) application which is configured as a Maven project in debug mode in IntelliJ?
If you copy the additional VM options into the pom.xml for the javafx-maven-plugin as given by José Pereda here, you can run the application with 'mvn javafx:run@debug' and then manually attach the IntelliJ debugger by going to the menu 'Run - Attach to Process...' and selecting your application.
However, if you want the debugger and the application to be started with a single click, IntelliJ is a bit troublesome. You can create a Remote Debug configuration which launches your application and the debugger serially, or a Compound configuration which does both in parallel. The problem is getting them synchronized.
I found the following solution: make your application run as the debug client and the IntelliJ debugger as the server. The VM options for the javafx-maven-plugin in the pom.xml file should have 'server=n':
<plugin>
<groupId>org.openjfx</groupId>
<artifactId>javafx-maven-plugin</artifactId>
<version>0.0.4</version>
<executions>
<execution>
<!-- Default configuration for running -->
<id>default-cli</id>
<configuration>
<mainClass>org.waninge.test.JFXTest</mainClass>
</configuration>
</execution>
<execution>
<!-- Configuration for debugging -->
<id>debug</id>
<configuration>
<options>
<option>-agentlib:jdwp=transport=dt_socket,server=n,address=localhost:8000,suspend=y</option>
</options>
<mainClass>org.waninge.test.JFXTest</mainClass>
</configuration>
</execution>
</executions>
</plugin>
Create a Maven Run Configuration in IntelliJ with 'javafx:run@debug' in the 'Command line' field.
Create a Remote Run Configuration with the following settings:
Debugger mode: 'Listen to remote JVM';
Transport: 'Socket';
Host: as in the pom
Port: as in the pom
Now the Remote Run Configuration will start a debug server waiting for clients. The Maven Run Configuration will connect to the debug server, or fail if the debug server isn't online.
Now, to get everything started with a single click, you can create a Compound Run Configuration and add the Maven and the Remote Run Configuration to it. Starting this configuration will launch the two in parallel, and you can debug your application.

Running Spring Boot "context loads" test in maven causes OutOfMemoryError

I have a multi-module Maven project with two modules being Spring Boot applications. Each of them has a simple test that the Spring application context loads successfully (my tests are very similar to this one). I run these tests with the following command in the project root:
mvn -P IntegrationTests clean test
During context initialization things go out of my control: the application "eats" memory (heap size quickly grows to 4 gigabytes) and then the context fails to start with a java.lang.OutOfMemoryError: PermGen space error (yes, I run it on Java 7).
Monitoring the task manager during testing, I noticed that Maven spawns two new processes that have something to do with the Surefire plugin. I have no idea where they come from, because I don't add the Surefire plugin to my pom.xml.
Previously, when I encountered the same error elsewhere, I specified VM options (-Xmx256m -Xms128m -XX:MaxPermSize=256m -XX:PermSize=128m, for example) and the problem was solved.
This time I tried to
set the MAVEN_OPTS environment variable
set VM options (when running mvn test in IntelliJ IDEA) - this affected the main java process but not its children
add -Drun.jvmArguments="..." on the command line
but the problem persists.
Please help me to fight the OutOfMemoryError in tests.
Add the Surefire plugin explicitly to the module-specific pom.xml and configure VM options there. I like this solution because this way the VM options are
passed to the spawned surefire processes (which should solve your problem)
affect only test application builds
shared between developers in your team
configurable independently for every module
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>-Xmx256m -Xms128m -XX:MaxPermSize=256m -XX:PermSize=128m</argLine>
</configuration>
</plugin>
<!-- your other plugins go here -->
</plugins>
</build>
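Since the configuration above hard-codes the values, it may help that Surefire's argLine parameter is also bound to the `argLine` user property, so a one-off override from the command line can be sketched as:

```shell
# One-off override of the forked test JVM's memory settings
# (Java 7-era flags; MaxPermSize/PermSize are ignored on Java 8+).
mvn -P IntegrationTests clean test \
  -DargLine="-Xmx256m -Xms128m -XX:MaxPermSize=256m -XX:PermSize=128m"
```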

How do I configure my Maven-war-plugin's useCache feature so that consecutive builds are faster?

I’m using Maven 3.3.0 on Mac Yosemite. I wanted to make use of the maven-war-plugin’s useCache feature, but it isn’t doing anything in my multi-module project. When I run
mvn clean install -DskipTests
my project takes about 1:25 to run with the below configuration
<profile>
<id>prepare-deploy-war-to-jboss</id>
<activation>
<file>
<exists>${basedir}/src/main/webapp</exists>
</file>
</activation>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<useCache>true</useCache>
<cacheFile>/tmp/${project.artifactId}/war/work</cacheFile>
</configuration>
</plugin>
</plugins>
</build>
</profile>
Then I run the same command again and the project takes the same amount of time. I can see "work" files being created, so the plugin is definitely running, but consecutive builds don't seem to get any faster.
My question here isn't so much why useCache isn't speeding up my build, but how I can configure the plugin differently so that consecutive runs do speed up the build. If there is another plugin I should be using that speeds up back-to-back builds, that would also suffice here.
Looking at the WAR mojo code (at the time of writing), the cache is mainly used by its web app structure for overlay management, so in most cases it would indeed not improve build time.
Moreover, as its official documentation states, the cache mechanism is an experimental feature, hence disabled by default, which probably doesn't (yet) meet user expectations.
Regardless of the effectiveness of this cache option, some hints to speed up Maven builds:
Consider whether you really need to clean on each and every run
Consider building offline (-o option) if everything you need is already in your local repository
Consider using threads during your build (-T option)
Consider quiet mode (-q option), switching off build logging temporarily and getting only error logs (basically: no news is good news)
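Combined, the hints above look like this on the command line (the thread count is an example value):

```shell
# Offline (-o), 4 build threads (-T 4), quiet output (-q), and no clean
mvn -o -T 4 -q install
```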
In your case, the War Plugin profile is activated by the existence of a directory structure typical of war packaging, which probably means this profile is part of the aggregator/parent pom and is then activated only in the war module. Although the impact is very small, also consider moving the War Plugin configuration to the module concerned and avoiding such a triggered configuration.
Last but not least, during development build time is probably more important than war size, so you could switch off the default mechanism of re-compressing external libraries added to the war file, via the recompressZippedFiles option:
Indicates if zip archives (jar,zip etc) being added to the war should be compressed again. Compressing again can result in smaller archive size, but gives noticeably longer execution time.
Default: true
So a sample configuration would look like:
<properties>
<war.recompress.files>false</war.recompress.files>
</properties>
<build>
<finalName>webapp</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<recompressZippedFiles>${war.recompress.files}</recompressZippedFiles>
</configuration>
</plugin>
</plugins>
</build>
Note: since there is no user property for this configuration entry, I also added a property for it, to switch it on/off on demand via the command line (or via a profile).
You can then compare execution times by running the default build (with the configuration above disabling recompression) against the previous behavior (below, switching recompression back on for the current execution, on demand):
mvn clean install -Dwar.recompress.files=true
You may then consider using a profile to switch it on/off depending on the development phase.

System breaks when starting a single test in IntelliJ

I have a complex web application using Spring, Hibernate, and a joint Maven build (Java+Groovy), with tests in Spock and different Maven profiles. All source files are in Java. I have the following Maven configuration (part of it) for local testing:
<plugin>
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>
<version>2.0</version>
<configuration>
<path>/services</path>
<port>8062</port>
<contextReloadable>true</contextReloadable>
<backgroundProcessorDelay>2</backgroundProcessorDelay>
<contextFile>../context.xml</contextFile>
</configuration>
</plugin>
So when the bytecode changes, Maven's contextReloadable triggers a redeploy of the app. That is the desired state. When I run a test with a Maven run configuration in IntelliJ, I see on my system that the redeploy is triggered.
I have no changes in the files or in the tests, but the redeploy is triggered and everything is OK. The problem appears for the same test when I right-click and run UploadTest directly from IntelliJ 14.0.3 Community Edition:
the redeploy happens, but the profiles are not taken into consideration and the environment variables are not set. If I add them, I get
org.apache.catalina.core.ContainerBase backgroundProcess WARNING: Exception processing loader WebappLoader[/services] background process java.util.ConcurrentModificationException
and my backend system crashes. So my questions are:
Why does IntelliJ trigger a redeploy when running a Spock test without changes in the code/tests?
Why does starting a single Spock test from IntelliJ break my backend system (I need to restart it)? What can be the source of the problem - does IntelliJ run the test with different parameters, are the profiles not applied, or something else?
Does IntelliJ start the compilation of the test with different environment variables set? (I see that IntelliJ starts its own launcher for the test.)

Separate Jenkins-Project for deploying to JBoss

I have a Jenkins build which builds a Maven project with -PmyProfile clean package. This works fine. Now I want the project to be deployable, but in a separate task (JBoss deployment) so it can be triggered explicitly via the Jenkins GUI. For that, I have the following in my pom:
<profiles>
<profile>
<id>myProfile</id>
<properties>...</properties>
<build>
<plugins>
<plugin>
<groupId>org.jboss.as.plugins</groupId>
<artifactId>jboss-as-maven-plugin</artifactId>
<version>7.0.0.Final</version>
<configuration>
<hostname>localhost</hostname>
<port>29999</port>
<username>admin</username>
<password>admin</password>
<filename>${project.build.finalName}.war</filename>
<name>my-webapp</name>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
Now I only want to call that single deployment via mvn jboss-as:deploy separately. But how would I do that? If I create a second Jenkins project, everything needs to be built again, so that's pretty stupid. Building as a separate module does not work, either (some error with "building single modules not supported for maven 3").
Any ideas?
Thanks
It sucks a little, but you can always get artifacts from another Jenkins workspace using a filesystem-relative path like ../../SecondJob/workspace (or use a symlink). I used to do this for the same case (deploying as a separate job) for all my projects and it works; it's just not elegant, but I believe there's no built-in solution in Jenkins for this.
Alternatively, there seems to be a Jenkins plugin for this, but I haven't used it and can't say anything about it.
Possible trick:
Have only one project, but parameterize it with a DEPLOY parameter set to FALSE by default. The build will contain your main build as well as an "Invoke top-level Maven targets" post-build step for deployment. The deployment step is invoked only if DEPLOY is TRUE; to do that, use the Conditional Build Step plugin.
A new deploy-only goal was added in version 7.5.Final. You can grab the war from the first job with the Copy Artifact Plugin.
References:
https://docs.jboss.org/jbossas/7/plugins/maven/latest/deploy-only-mojo.html
https://github.com/jbossas/jboss-as-maven-plugin/pull/56/commits
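Assuming the plugin version in the profile is raised to 7.5.Final or later, the second Jenkins job could then run only the deployment goal against the already-built artifact; a sketch:

```shell
# Runs only the deployment goal; the plugin configuration (host, port,
# credentials, war filename) comes from the myProfile profile above.
mvn -PmyProfile jboss-as:deploy-only
```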
