I'm having issues with a test that fails to initialize log4j when executed through Maven, although a valid log4j.properties is in src/test/resources and should therefore end up on the test classpath. But it doesn't; log4j prints only
log4j:WARN No appenders could be found for logger (org.springframework.test.context.junit4.SpringJUnit4ClassRunner).
log4j:WARN Please initialize the log4j system properly.
In order to debug the problem I printed the classpath from the test itself, using the code here
But instead of a lengthy list of jars and paths I just get
/<projectpath>/target/surefire/surefirebooter6226797341642271676.jar
So my questions are:
WTF is maven doing with the classpath?
Why doesn't my log4j.properties end up on the classpath?
How do I debug this?
Note: In Eclipse I can run the test just fine and everything works as expected.
Another note: the maven project is a multimodule project and I'm only executing a single test from a single submodule, with a commandline like this:
mvn -U -Dtest=de.company.project.SomeTest clean test
Have a good look at the maven-surefire-plugin. By default it creates a small jar whose manifest lists your entire classpath. This is controlled by the useManifestOnlyJar option. It works around Windows' limit on the classpath/command-line length (1024 characters, quoting off the top of my head). Under Linux you wouldn't really feel this pain, as the limit is much higher.
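If you want the tests to run with the plain classpath instead of that manifest-only jar (for instance, to rule it out while debugging), a minimal sketch of the configuration might look like this; note that on Windows this can run into the length limit mentioned above:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- disable the manifest-only jar so the real classpath is used and visible -->
    <useManifestOnlyJar>false</useManifestOnlyJar>
  </configuration>
</plugin>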
If the maven-surefire-plugin forks, it will use a different classpath than the one you run Maven (and the compilation) with.
Debugging this kind of crappy situation can be done as follows:
In one of your tests add a loop that lists all the environment variables along with the java system properties.
Debug the tests:
mvn -Dmaven.surefire.debug="-Xdebug \
-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=9001 \
-Xnoagent" \
test
I found the answer to question 1.
Maven creates the jar with the weird name on the fly and just puts a MANIFEST.MF file in there. That file contains the classpath, and the main class to be started.
This also answers question 3 to some extent.
You can copy that jar file somewhere else while Maven is running, so Maven doesn't delete it once the build is finished. Then you can examine it as long as you want. Turns out my log4j.properties is on the classpath (the target directory for the test classes is there, and the properties file is in that directory ...)
Leaves me with question 2.
It turned out that somewhere in the forest of pom.xmls the system property log4j.configuration was set to a rather useless value. Setting that value back to the proper value as described here solved my immediate problem.
Now I just have to find the broken spot in our poms, but that's a story for a different day.
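For reference, one common place to pin that system property for the tests is the surefire configuration; this is only a sketch with an assumed value, the proper value in your build may differ:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <systemPropertyVariables>
      <!-- assumed value: the default classpath resource name -->
      <log4j.configuration>log4j.properties</log4j.configuration>
    </systemPropertyVariables>
  </configuration>
</plugin>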
I need to load a property file from the src/main/resources/config folder. The loading part is written in a common dependency project over which we don't have any control; we just pass the file name to it. The code snippet in the dependency jar is the standard resource loading shown below.
InputStream inputStream = this.getClass().getClassLoader().getResourceAsStream(propertyFileName);
Spring will always look for resources directly under the resources folder; in this case it is unable to load the file, as the file is in the custom folder and that folder is not on the classpath.
How do we explicitly set the custom folder as an additional classpath folder?
With Maven we could do something like the following, which works fine. Is there any out-of-the-box way to achieve this with an annotation in Spring Boot?
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
<resource>
<directory>src/main/resources/config</directory>
</resource>
</resources>
Updated
// This works if config.properties is directly under the resources folder.
// What if config.properties is under the resources/config folder?
// Don't say to pass the argument as /config/config.properties; there are some other limitations.
// So in that case, with the same approach, config should come under the classpath, so that the
// method below always works when the property name is passed.
// As mentioned earlier, we can use Maven resource settings to achieve this.
// The question here is: is there any way to explicitly advise Spring to load properties from this folder?
// I have seen something like the loader.path config, not sure if that helps!
InputStream stream = SpringBootStarter.class.getResourceAsStream("/config.properties");
Before answering: when you say "Spring will always look for resources directly under the resources folder; in this case it is unable to load the file, as it is in the custom folder and not on the classpath", this is not correct.
Spring can look anywhere on your system. Here is how you can load different properties files with Spring and Spring Boot:
@PropertySource("classpath:config/common.properties") => Will look on the classpath for a file located in the config folder at the root of your classpath.
@PropertySource("file:/config/common.properties") => Will look for the file common.properties at the root of your filesystem, here under /config/common.properties.
Now, there is the question of "what is the classpath?"; it seems worth a bit more explanation.
The classpath is for the JVM what the filesystem is for your OS. When you execute some Java code (a .jar file, for instance), the JVM looks for classes and resources in the locations you specify. You can specify those locations when executing, e.g. java -classpath /a/shared/folder:/a/dependency/app.jar:myApp.jar MainClass (the separator is ':' on Linux/macOS and ';' on Windows; see this for some other ways: https://javarevisited.blogspot.com/2012/10/5-ways-to-add-multiple-jar-to-classpath-java.html).
Quite often, what happens for developers (before using Spring) is this:
We develop our application, and use maven for managing the dependencies
We execute our app with the IDE, everything works just fine, life is wonderful
We are ready to go live (in production). We generate the famous myApp.jar and try executing the application with java -jar myApp.jar and... nothing works. You have issues with Java (I assume you set up the main class in the manifest...) and you get something like Caused by: java.lang.ClassNotFoundException: my.dependency.OtherClass...
Finally, you realize life is hard and you are not ready to go live right now. You need to have something you can execute easily.
One possible solution to this, to avoid classpath issues, is to put everything in your JAR (called the fat JAR in Spring Boot); then you run java -jar myApp.jar and it works fine.
By default, when you generate a Maven project, some folders are automatically included, like:
src/main/java => your java files and packages
src/main/resources => your config files (like .properties)
src/test/java => Your java test files
src/test/resources => the resources handy for your tests
When you generate your jar (more or less, depending on the configuration you added to your Maven project, but let's assume the defaults), the build takes all the folders and files under src/main/java and src/main/resources and puts them at the root of your jar. (Don't hesitate to have a look inside your jar files. A jar is just a ZIP: you can open it, browse it, and see for yourself.)
With that said, when you ask How do we explicitly set the custom folder as an additional classpath folder? and you are talking about a custom folder located under src/main/resources, then when you generate your jar, the custom folder will be in the jar and, therefore, on your classpath.
If you still have trouble, these actions will help you:
Unzip your jar files and check what is inside. If you don't see any config/ folder in it, maybe your Jar generation is wrong
Try using @PropertySource(...) to load properties files, both from your classpath and from your filesystem, to see how it works and what you can achieve
Also have a look at this:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
Don't hesitate to migrate more and more of your old code to Spring Boot; it will be a lot easier for you.
I would like to disable the test-output folder (i.e. the test outputs) in my project, as it causes disk-space issues on the automation machines. I tried all the options below as I got them from our tool:
SetDefaultListener(false);
setVerbose(0);
command line argument -usedefaultListener false
Nothing works for me. I am using the Maven build tool to generate the jar. We need to give the jar to the automation machines to run it.
That's not possible by definition. If you do not want to have those folders, you have to skip the tests during the jar build.
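If skipping the tests is acceptable, a minimal sketch of what that could look like (or simply pass -DskipTests on the command line instead):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- skips test execution entirely, so no test output folders are produced -->
    <skipTests>true</skipTests>
  </configuration>
</plugin>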
I have a largish Maven multiproject build. I'm scanning the codebase with SonarQube (5.6.5). For background, I successfully integrated the various JaCoCo exec files into SonarQube by using the "jacoco:merge" goal, to produce a single exec file. The SonarQube property that alleges to allow specifying a list of JaCoCo exec files doesn't work in our version of SonarQube, but specifying a single one does work.
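For reference, a merge of that sort is typically configured roughly like this; this is only a sketch with assumed paths and phase, not the actual configuration used here:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>merge-coverage</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <fileSets>
          <fileSet>
            <!-- assumed layout: pick up every module's exec file -->
            <directory>${session.executionRootDirectory}</directory>
            <includes>
              <include>**/target/jacoco.exec</include>
            </includes>
          </fileSet>
        </fileSets>
        <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>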
I'm now trying to integrate the numerous "TEST-*" files in "target/surefire-reports" in each of the subprojects. The "sonar.junit.reportsPath" property expects a single directory, so I can't specify a list of them.
That means I'm going to have to do something as part of the build to merge the entire contents of all of the "target/surefire-reports" directories into a single directory, so I can specify that directory.
I already have a pom that does nothing but merge JaCoCo data. This seems like a logical place to store the surefire reports.
How can I do this sort of thing with Maven? Would this be an application of the "maven-resources-plugin", or something else?
Ok, well, I guess I answered my own question. I was able to get this to work with the resources plugin, specifying every one of my modules as resource directories. I now have one ridiculous-looking POM that has three lists of all of the modules in my multiproject build, for three tasks that require me to list all of the modules to process. The Gradle version would be amazingly short.
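For anyone attempting the same thing, a sketch of what such a resources-plugin execution might look like; module names and the output directory are placeholders, not the actual modules of this build:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>aggregate-surefire-reports</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/merged-surefire-reports</outputDirectory>
        <resources>
          <!-- one entry per module whose reports should be merged -->
          <resource>
            <directory>../module-a/target/surefire-reports</directory>
          </resource>
          <resource>
            <directory>../module-b/target/surefire-reports</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
The sonar.junit.reportsPath property can then point at that single merged directory.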
We have a maven multi-module project in which the following weirdness occurred:
module-x has a TEST (src/test/java/...) which depends on code provided by module-y, whose code is not otherwise used in module-x (i.e. nothing in src/main/java/... depends on module-y)
module-x therefore defines a dependency on module-y, but with <scope>test</scope> and uses that code in one of its tests
someone changed a constructor in module-y and committed the code that broke the module-x test (as it was using the old constructor parameters)
TeamCity ran java org.codehaus.plexus.classworlds.launcher.Launcher -f app-parent/pom.xml -B integration-test when it saw the commit
the build succeeded because maven did NOT recompile/rerun module-x TEST code, it just recompiled module-y and everything that depends on it with default compile scope
I'm sure people will point out: why is that test even in module-x, etc. But let's set this weird setup aside for a minute. What I want to understand is this:
If some project (module-x) has a test dependency (module-y) that changes and is recompiled, should not maven then notice this and do a fresh "test-compile" for the project (module-x) that "test-depends" on what changed (module-y)? Is what I saw here normal behavior?
EDIT: from comment replies it seems like the fact that TeamCity is being used may be a factor; I clarified the command used in the sequence list above.
EDIT 2: I should note that the parameter "-f app-parent/pom.xml" is actually the main module that basically depends on "the world" and is where one would run from command-line "mvn test" or "mvn clean package" to rebuild everything.
Here's my scenario:
Maven 2.0.9 is our build system
We install code to multiple environments
All of our environment-specific properties are contained in property files, one for each environment
We currently read these properties into maven using the properties-maven-plugin; this sub-bullet is not a requirement, just our current solution
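For context, such a read with the properties-maven-plugin looks roughly like this; a sketch only, with the file name following the conf/ layout shown further down:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>properties-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>initialize</phase>
      <goals>
        <goal>read-project-properties</goal>
      </goals>
      <configuration>
        <files>
          <!-- which environment file to read is exactly the open question below -->
          <file>conf/dev.properties</file>
        </files>
      </configuration>
    </execution>
  </executions>
</plugin>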
Goal:
Perform certain parts of the build (i.e. plugin executions) only for certain environments
Control which parts are run by setting values in the environment-specific property files
What I've tried so far:
Maven allows plugin executions to be put inside POM profiles, which can be activated by properties; unfortunately these must be system properties, i.e. from settings.xml or the command line, not properties loaded by the properties-maven-plugin
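To illustrate what that property-based activation looks like (the property name and value here are assumptions):
<profiles>
  <profile>
    <id>prod-extras</id>
    <activation>
      <property>
        <name>build.env</name>
        <value>prod</value>
      </property>
    </activation>
    <build>
      <plugins>
        <!-- environment-specific plugin executions would go here -->
      </plugins>
    </build>
  </profile>
</profiles>
This only triggers on something like mvn -Dbuild.env=prod ...; a value that the properties-maven-plugin reads during the build comes too late to activate the profile.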
If possible, we'd like to keep everything encapsulated within the build workspace, which looks something like this:
project
pom.xml
src
...
conf
dev.properties
test.properties
prod.properties
build-scripts
build.groovy <-- the script that wraps maven to do the build
install.groovy <-- ... wraps maven to do the install
Running a build looks like:
cd build-scripts
./build.groovy
./install.groovy -e prod
Is there any possible way to accomplish these goals with the version of maven we are using? If not, is it possible with a newer version of maven?
This isn't possible using just Maven. (See also How to activate profile by means of maven property?) The reason is that profiles are evaluated before anything else in order to determine the effective POM.
My suggestion is to write a preprocessor that parses your environment-specific property files and converts them to the required system properties before launching Maven. This script can be included in your ~/.mavenrc so that it runs automatically whenever Maven is launched. Here is an example script that assumes the properties file is in a fixed location:
properties=`cat /etc/build-env.properties`
while read line; do
MAVEN_OPTS="$MAVEN_OPTS -D$line"
done <<< "$properties"
If the properties file is not fixed, you'll just need to add something to the script to discover the location (assuming it is discoverable).