Can anyone explain how Spring decides where to look for resources when one uses the ResourceLoader.getResource(...) method?
I am having a problem with a multi-module Maven application built using Spring Boot: in my integration tests my code is able to find resources using resourceLoader.getResource("templates/") or even resourceLoader.getResource("classpath:templates/"). So far so good...
However, when the module is eventually packaged into the executable JAR and run with embedded Tomcat, the resources can no longer be resolved. I also tried resourceLoader.getResource("classpath*:templates/") with no success.
What I find concerning is that when I add a logging statement to output the URL being used in the search, I get a path to one of the other modules in the project (not the one that actually contains the resource in question), e.g. jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module1-0.0.1-SNAPSHOT.jar!/templates/ whereas I believe the resource is in jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module2-0.0.1-SNAPSHOT.jar!/templates/
The resource loader was obtained from an @Autowired constructor parameter.
Thanks in advance for any hints.
Edit
Just in case it isn't clear or is of importance: my integration tests for the module in question aren't aware of the other module. I have module1, module2, and a spring-boot module which has dependencies on module1 and module2. Essentially, when I run the integration tests for module2, the classpath isn't aware of module1 - so I suspect this has something to do with why it works in the tests.
When you use the classpath: or classpath*: prefix, the lookup internally goes through the ClassLoader: classpath: resolves a single location, while classpath*: collects every matching location via a ClassLoader.getResources(…) call in Spring.
The wildcard classpath relies on the getResources() method of the underlying classloader. As most application servers nowadays supply their own classloader implementation, the behavior might differ, especially when dealing with JAR files. A simple test to check whether classpath* works is to use the classloader to load a file from within a JAR on the classpath: getClass().getClassLoader().getResources("<someFileInsideTheJar>"). Try this test with files that have the same name but are placed in two different locations. In case an inappropriate result is returned, check the application server documentation for settings that might affect the classloader behavior.
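As a rough, self-contained sketch of that check (the class name and the templates/** pattern are just for illustration), you could run something like:

    import java.io.IOException;
    import java.net.URL;
    import java.util.Enumeration;

    import org.springframework.core.io.Resource;
    import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

    public class TemplateLocationCheck {

        public static void main(String[] args) throws IOException {
            // 1. Raw ClassLoader check: list every "templates/" root visible on the classpath.
            Enumeration<URL> roots = TemplateLocationCheck.class.getClassLoader()
                    .getResources("templates/");
            while (roots.hasMoreElements()) {
                System.out.println("root: " + roots.nextElement());
            }

            // 2. Spring equivalent: classpath*: collects matches from all locations,
            // whereas classpath: stops at the first location that has a templates/ entry.
            Resource[] templates = new PathMatchingResourcePatternResolver()
                    .getResources("classpath*:templates/**");
            for (Resource template : templates) {
                System.out.println("template: " + template.getURL());
            }
        }
    }

If the first call only ever prints the module1 root, the classpath*: form in the second call is the one that will still pick up module2's templates as well.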
Do not use the classpath: form here, as you have templates/ under multiple classpath locations.
Refer to: resources-classpath-wildcards
I'm trying to set a custom log Handler in my Spring Boot (version 2.6.3) application. The result is a ClassNotFoundException, as described in this other question:
Can't override java.util.logging.LogManager in a Spring Boot web application: Getting java.lang.ClassNotFoundException on already loaded class
Based on the answer to that question, it seems I need my Handler and all its dependencies to be placed in the root of the executable JAR.
Is there a direct way to accomplish this during the Maven build, i.e. not by extracting and repackaging the jar myself post-build?
This issue is a result of the BOOT-INF fat JAR structure introduced in Spring Boot 1.4.
There is currently no straightforward solution, and it appears some of the Spring Boot maintainers do not agree there is a problem, so it could be a long time before the situation changes:
Issue #6626: Make it easier to package certain content in the root of a fat jar
Issue #12659: Starting executable war with -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager produces a ClassNotFoundException
WORKAROUND #1
I had to do two things to get my application working again with a custom log handler: 1) use the Maven Shade plugin to package the log handler with all its dependencies, and 2) launch the app using the PropertiesLauncher class on the command line instead of java -jar:
java -cp executable.jar:logger-shaded.jar -Dloader.main=mypackage.myapp org.springframework.boot.loader.PropertiesLauncher
The executable.jar, logger-shaded.jar, and mypackage.myapp are placeholders specific to my project, so adjust accordingly.
WORKAROUND #2
If the handler is loaded from code in a config class or from main() instead of being specified in the file loaded via java.util.logging.config.file, as discussed in the comments to the answer in this other question, then everything works as expected. I actually prefer this over Workaround #1 as it results in a smaller deployment, but it does require writing a few more lines of code.
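For reference, registering the handler programmatically could look roughly like the sketch below; MyApp is a placeholder, and ConsoleHandler merely stands in for the custom Handler subclass:

    import java.util.logging.ConsoleHandler;
    import java.util.logging.Handler;
    import java.util.logging.Logger;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class MyApp {

        public static void main(String[] args) {
            // Register the custom handler in code rather than via
            // -Djava.util.logging.config.file, so that it is loaded by the
            // application class loader, which can see classes in BOOT-INF/lib.
            Handler handler = new ConsoleHandler(); // stand-in for your custom Handler subclass
            Logger.getLogger("").addHandler(handler);

            SpringApplication.run(MyApp.class, args);
        }
    }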
I am creating a library using Spring Boot (v2.1.6.RELEASE) as a starter project: a base extension JAR responsible for configuring and starting up some components based on the client project's properties file.
The issue I am facing is that if the client project's Spring Boot application class is in the same package path as the library, everything works like a charm! But when the client project uses a different package path and includes @ComponentScan, it is not able to load or start components from the library.
Has anyone encountered this issue? How do I make the client application auto-configure some of the components from the library JAR?
Note: I am following the library creation example from here: https://www.baeldung.com/spring-boot-custom-starter
There are many things that can go wrong here; without seeing the relevant parts of the actual code it's hard to say anything concrete. Off the top of my head, here are a couple of points for consideration that can hopefully lead to the solution:
Since we use starters in our applications (and sometimes people use explicit component scanning in their Spring applications) and this obviously works, the issue is probably with the starter module itself. Don't assume that the mere fact that component scanning is used prevents the starter from being loaded ;)
Make sure the starter is a regular library and not packaged as a Spring Boot application (read: you don't use the Spring Boot plugin), and that it has <packaging>jar</packaging> in your pom.xml or whatever you use to build.
Make sure you have a src/main/resources/META-INF/spring.factories file (case-sensitive and everything).
Make sure that this spring.factories file indeed contains a valid reference to your configuration (a Java class annotated with @Configuration). If you use component scanning in the same package, it will find and load this configuration even without spring.factories; in that case it's just another portion of your code that happens to be packaged as a separate JAR. So this point looks especially "suspicious" to me.
Make sure that the @Configuration class doesn't have a @Conditional-something annotation - maybe the condition is not satisfied and the configuration doesn't start. For debugging purposes you could even remove these @Conditional annotations just to ensure that the configuration starts. You can also add some logging inside the @Configuration class, like "loading my cool library" (a minimal sketch follows after this list).
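As a rough illustration of the last two points (package and class names are made up), the starter's auto-configuration might look like this, with its fully qualified class name listed under the org.springframework.boot.autoconfigure.EnableAutoConfiguration key in spring.factories:

    package com.example.mylib;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    // Listed in src/main/resources/META-INF/spring.factories as:
    //   org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
    //   com.example.mylib.MyLibAutoConfiguration
    // so that client applications load it even when their component scan
    // covers a completely different package.
    @Configuration
    public class MyLibAutoConfiguration {

        // A tiny service standing in for the library's real components.
        public static class MyLibService {
            public String greet() {
                return "hello from the library";
            }
        }

        @Bean
        public MyLibService myLibService() {
            // Logging here makes it easy to confirm in the client's startup
            // output that the auto-configuration was actually loaded.
            System.out.println("loading my cool library");
            return new MyLibService();
        }
    }

If the client application's startup output never shows that line, the problem is in the spring.factories wiring rather than in the configuration class itself.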
I have a project with more than a hundred external library dependencies. Here we use Tomcat with these endorsed JAR libraries configured in a directory on the server (currently under $CATALINA_HOME/lib/endorsed), so the webapp can access those resources at startup.
I wanted to try Jetty instead, because Tomcat takes too much memory and crashes frequently. Now I'm wondering whether there is a parameter to pass to the maven-jetty-plugin to specify this JAR folder, so that the webapp class loader finds the libraries on its classpath.
I've tried extraClasspath in the configuration tag, but it seems to load only classes and to ignore all the JARs in the directory I point it at (if I pass the full path of a JAR, it is loaded, but I don't want to list every library I need there).
Thanks in advance for the help
Update: I know this isn't a standard Maven operation; I'm looking for an emergency workaround, since this project is huge and I can't refactor it the way I would like.
That said, I didn't expect this feature to be as tricky as it turned out to be at first glance.
You need to pass them as absolute paths, or, alternatively, have them as dependencies of the plugin itself.
What you want to do goes against Maven's portability principles, so don't expect Maven to support it.
I have been struggling to figure out the use of the dependency scoping provided by Maven,
as mentioned here.
Why should you not always use the compile scope? Real-life examples would be really appreciated.
Compile-scoped dependencies (the default) are needed to compile your code and remain available on every classpath - compile, test, and runtime - so they also end up in a packaged WAR or assembly.
Test-scoped ones are only available during tests. Say your tests use JUnit or EasyMock: you obviously do not want your final artifact to have a dependency on them, but you do want to be able to depend on these libraries while compiling and running your tests.
Dependencies marked provided are expected to be on your classpath already when you run the produced artifact. For example, your webapp has a dependency on the servlet library: obviously you should not package it inside your WAR file, as the webapp container will already have it and a conflict may occur.
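To make the provided case concrete, here is a minimal, hypothetical servlet: it needs the servlet API on the compile classpath, but at runtime the container supplies those classes, which is exactly what the provided scope expresses:

    import java.io.IOException;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class HelloServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Compiles against javax.servlet classes (declared with provided scope),
            // but the WAR does not ship them: the servlet container provides them at runtime.
            resp.getWriter().println("Hello from a provided-scope dependency example");
        }
    }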
One of the reasons to have different scopes for dependencies is that different parts of the build depend on different dependencies. For example, if you are only compiling your code and not executing any tests, there is no point in having Maven download your test dependencies (if they're not already present in your local repository, of course). The other reason is that not all dependencies need to be placed in your final artifact (whether it's an assembly or a WAR file), as some dependencies are only used during the build and testing phases.
compile
These JAR files will be copied into the prepared WAR file.
Example: hibernate-core.jar needs to be in the prepared WAR.
provided
These JARs are considered only at compile time and test time.
Example: servlet.jar will be provided by the deployment server, so it does not need to be shipped in the prepared WAR file.
test
These JARs are required only for running test classes.
Example: junit.jar is required only for running JUnit test classes and does not need to be deployed.
Scopes are explained quite well here:
https://maven.apache.org/pom.html#Dependencies
As a reference, I copied the paragraph:
scope: This element refers to the classpath of the task at hand (compiling and runtime, testing, etc.) as well as how to limit the transitivity of a dependency. There are five scopes available:
compile - this is the default scope, used if none is specified. Compile dependencies are available in all classpaths. Furthermore, those dependencies are propagated to dependent projects.
provided - this is much like compile, but indicates you expect the JDK or a container to provide it at runtime. It is only available on the compilation and test classpath, and is not transitive.
runtime - this scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
test - this scope indicates that the dependency is not required for normal use of the application, and is only available for the test compilation and execution phases.
system - this scope is similar to provided except that you have to provide the JAR which contains it explicitly. The artifact is always available and is not looked up in a repository.
There are a couple of reasons why you might not want all dependencies to have the default compile scope:
Reduce the size of the final artifact (JAR, WAR, ...) by using a different scope.
When you have a multi-module project, you have the ability to let each module have its own version of a dependency.
Avoid class version collisions with the provided scope. For instance, if you are going to deploy a WAR file to a WebLogic server, you need to get rid of some javax JARs, like javax.servlet, javax.xml.parsers, the JPA JARs, etc.; otherwise you might end up with class collision errors.
I have an application with a Database module which contains the persistence.xml file along with the entities and controller classes. It passes all the Maven tests, so I believe it is configured correctly. When running the application, however, it complains that it can't find the persistence unit. I verified that persistence.xml is in the respective NBM's META-INF folder in the application folder being run.
Any ideas? Is anything special needed to make it work?
Edit:
Code can be found here in the Marauroa-Server-Manager folder.
I was led to the answer by Timon Veenstra. This is a placeholder for the answer in case Timon doesn't have an account, so I can credit him with the answer, and for anyone else running across this question.
The key was making sure that the Persistence API, the persistence implementation (EclipseLink in my case), and the database driver were wrapped in NetBeans modules. After that, everything started working.
I guess Maven/NetBeans is misleading in that respect, since the modules compile as long as you have the appropriate dependencies.
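For anyone verifying a similar setup, a minimal smoke test along these lines (the persistence unit name is a placeholder) will only succeed once the JPA API, the provider, and the JDBC driver are all visible to the module's class loader:

    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class PersistenceUnitSmokeTest {

        public static void main(String[] args) {
            // Resolving the unit name requires META-INF/persistence.xml plus the JPA API,
            // the provider (EclipseLink here) and the JDBC driver on the class loader
            // that performs the lookup - hence wrapping them all as NetBeans modules.
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("MyPersistenceUnit");
            EntityManager em = emf.createEntityManager();
            System.out.println("Persistence unit resolved, EntityManager open: " + em.isOpen());
            em.close();
            emf.close();
        }
    }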