In the following scenario:
I'm taking an external jar file (read: a dependency I have no control over) and wrapping it in a service to be exposed over RMI.
I'd like my service interface to be published as a Maven dependency as well. However, since it returns classes defined in that jar, the jar itself ends up as a dependency of my service interface.
Unfortunately, the original jar file contains many classes that are not relevant to my exposed service.
Is it possible to depend on just a few classes in that jar file in maven (possibly by extracting and repackaging the few classes that are relevant)?
uberbig_irrelevant.jar
com.uberbig.beans <-- Need this package or a few classes in it.
com.uberbig.everythingElse
The service project includes all of the uberbig jar, but exposes a service, BeanService, with a call that returns an instance of com.uberbig.beans.IntrestingLightWeightSerialiasbleBean.
The service interface project needs an interface definition that looks like:
interface BeanFetcher {
    public IntrestingLightWeightSerialiasbleBean fetchBeanById(long beanId);
}
So ideally my serviceInterface jar file would include only the BeanFetcher interface, the definition of IntrestingLightWeightSerialiasbleBean, and any direct dependencies of that bean.
The project is for internal use and won't be publicly exposed, so repackaging should cause no problems as long as the repackaged bean definitions remain binary- and serialization-compatible with the external jar file.
Any Suggestions?
Possibly related question: Maven depend on project - no jar but classes
Maybe I could use something from the dependency:copy goal of the maven-dependency-plugin, but I haven't figured out how to do that.
I think you got the plugin right, but not the goal. You should use dependency:unpack instead.
You should be able to use an inclusion filter to extract only the classes you need, and then repack them into your own jar. (The service interface jar if you do it in the service interface project, but you can just as well set up a separate project.)
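As an illustration only, here is a minimal sketch of how that unpack-and-filter step could look in the service interface project's pom.xml. The uberbig coordinates, the version, and the beans package filter are assumptions taken from the example above; the unpacked classes land in target/classes and so end up inside the service interface jar:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-uberbig-beans</id>
      <!-- any phase before package works; process-classes keeps the classes in target/classes -->
      <phase>process-classes</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- assumed coordinates for the external jar -->
            <groupId>com.uberbig</groupId>
            <artifactId>uberbig</artifactId>
            <version>1.0</version>
            <outputDirectory>${project.build.outputDirectory}</outputDirectory>
            <!-- only the bean package is extracted -->
            <includes>com/uberbig/beans/**</includes>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>

One way to keep the full jar out of the published artifact is to declare it with provided (or optional) scope for compilation, so only the unpacked subset ships in the service interface jar.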
Create your own repackaged jar and put it in your local repo. And hope you've actually identified all dependencies, accounting for reflection, etc. IMO not really worth it.
You may be able to do it automatically (with the associated increased risk) by using ProGuard/etc. to pull out unused classes etc. That could be done on your own artifact as well, for example, by making an all-in-one jar via jarjar/onejar/etc.
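If you prefer to stay inside Maven for the automatic route, one possibility (my suggestion, not something from the answer above) is the maven-shade-plugin's minimizeJar option, which drops dependency classes that your own code never references; the same reflection caveat applies:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!-- removes dependency classes not reachable from this module's own classes;
             classes used only via reflection will be stripped too -->
        <minimizeJar>true</minimizeJar>
      </configuration>
    </execution>
  </executions>
</plugin>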
Zeppelin has an object, ZeppelinContext, which can be used to share state between languages and bind variables to Angular, and thus create cool user interfaces inside Zeppelin notebooks.
We have written numerous convenience methods to create things like drop-down menus, buttons, and other UI elements from Scala. These methods call ZeppelinContext. We wish to add these methods to an sbt project so that we can package them in a jar, but the Zeppelin project seems to provide no artifact that contains ZeppelinContext (we have tried many).
Rather, there seem to be only two workarounds:
Build all of Zeppelin and add the resulting jar as an unmanaged jar (not nice).
Use duck typing (also really not nice).
Question: Is there a lesser known resolver / artifact id to get hold of this type?
The ZeppelinContext class is available on GitHub.
From the related pom.xml file the Maven coordinates are:
<groupId>org.apache.zeppelin</groupId>
<artifactId>zeppelin-spark_2.10</artifactId>
Which leads to this Maven dependency on the Maven Central repository.
<dependency>
  <groupId>org.apache.zeppelin</groupId>
  <artifactId>zeppelin-spark_2.10</artifactId>
  <version>0.6.1</version>
</dependency>
Indeed, that jar file contains ZeppelinContext.class.
Can anyone explain how Spring decides where to look for resources when one uses the ResourceLoader.getResource(...) method?
I am having a problem with a multi-module maven application built using Spring Boot whereby in my integration tests my code is able to find resources using resourceLoader.getResource("templates/") or even resourceLoader.getResource("classpath:templates/"). So far so good...
However, when the module is eventually packaged into the executable JAR and run with embedded Tomcat the resources can no longer be resolved. I also tried resourceLoader.getResource("classpath*:templates/") with no success.
What I find concerning is that when I add a logging statement to output the URL being used in the search, I get a path to one of the other modules in the project (not the one that actually contains the resource in question), e.g. jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module1-0.0.1-SNAPSHOT.jar!/templates/ whereas I believe the resource is in jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module2-0.0.1-SNAPSHOT.jar!/templates/
The resource loader was obtained from an @Autowired constructor parameter.
Thanks in advance for any hints.
Edit
Just in case it isn't clear or is of importance, my integration tests for the module in question aren't aware of the other module. I have module1, module2 and a spring-boot module which has dependencies on module1 & module2. Essentially, when I run the integration tests for module 2 the classpath isn't aware of module1 - so I suspect that this has something to do with why it works in the tests.
When you use the classpath: or classpath*: prefix, resolution internally happens via a ClassLoader.getResources(…) call in Spring.
The wildcard classpath relies on the getResources() method of the underlying classloader. As most application servers nowadays supply their own classloader implementation, the behavior might differ especially when dealing with jar files. A simple test to check if classpath* works is to use the classloader to load a file from within a jar on the classpath: getClass().getClassLoader().getResources("<someFileInsideTheJar>"). Try this test with files that have the same name but are placed inside two different locations. In case an inappropriate result is returned, check the application server documentation for settings that might affect the classloader behavior.
Do not use the classpath: form, as you have templates/ in multiple classpath locations.
Refer to: resources-classpath-wildcards
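To make the difference concrete, here is a small sketch (my own illustration; the class name and patterns are placeholders) that runs the classloader test described above and then uses Spring's PathMatchingResourcePatternResolver with the classpath*: prefix to collect templates/ entries from every jar rather than only the first one found:

import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

public class TemplateLocator {

    public static void main(String[] args) throws IOException {
        // Plain classloader test: list every classpath entry that contributes templates/
        Enumeration<URL> urls = TemplateLocator.class.getClassLoader().getResources("templates/");
        while (urls.hasMoreElements()) {
            System.out.println("Found: " + urls.nextElement());
        }

        // classpath*: via a ResourcePatternResolver collects matches from ALL jars/modules,
        // not just the first one returned by getResource()
        PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        for (Resource resource : resolver.getResources("classpath*:templates/**")) {
            System.out.println("Template resource: " + resource.getURL());
        }
    }
}

If several of your modules ship a templates/ directory, only the classpath*: form will surface all of them.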
I have a Maven NetBeans Platform application. One of its modules is a wrapper for a Java project (jar) that exposes some services to the Lookup. In the wrapped project I use the maven-processor-plugin to process the annotations so everything gets registered in the Lookup. I'm unable to see the exposed classes on the wrapped module. I tried running the maven-processor-plugin, but it is skipped since there are no source files in the wrapped module, and even if there were, that wouldn't fix the problem.
You can get the code here, in the Marauroa Server Manager project, Module: jWrestling Wrapper.
The code for the wrapped module can be found here. Annotated classes within the modules work fine.
Is there a way to execute the annotation processors on the dependencies of a project? Am I missing something obvious?
The wrapped jar project cannot contain org.netbeans annotations. These generate a META-INF/generated-layer.xml file that is only read from a module jar, not from the wrapped non-module jar.
So the binary dependency contains some NetBeans-originating annotations, and you want to process it through the Maven plugin? That won't work. Most, if not all, NetBeans annotations are compile-time only, meaning they are processed at compile time and not retained in the bytecode, so only sources compiled in the current project can be processed.
Besides, for NetBeans annotations (which are based on JDK 1.6 annotation processors) you don't need the processor plugin; the compiler plugin should be sufficient.
I have some questions derived from a problem that I have already solved through this other question. However, I am still wondering about the root cause. My questions are as follows:
What is the purpose of spring.handlers and spring.schemas?
As I understand it, it's a way of telling the Spring Framework where to locate the XSDs so that everything is wired and loaded correctly. But...
Under what circumstances should I have those two files under the META-INF folder?
In my other question linked above, does anybody know why I had to add the maven-shade-plugin to create those two files (based on all my dependencies) under META-INF? In other words, what was the ROOT CAUSE that made me have to use the maven shade plugin?
What is the purpose of spring.handlers and spring.schemas?
Well, you more or less found it out by yourself; let's add some more details.
Some Spring libraries contain a spring.schemas and a spring.handlers file inside their META-INF directory:
META-INF/spring.schemas
  re-maps(*) the schemaLocation to an XSD inside the library
  in effect, only the re-mapped versions are supported by this library
META-INF/spring.handlers
  provides namespace handler classes for specific namespaces
  the namespace handler class provides the parser logic to parse spring-batch beans, like job, step, etc.
(*) the actual re-mapping happens during the build of the Spring application context
Under what circumstances should I have those two files under the META-INF folder?
Normally the files are inside the Spring library jars you use, but you can use the same mechanism to implement your own namespace bean parsing; then you would ship your own files.
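For a feel of what the two files contain, here is a sketch for a made-up custom namespace (the com.example names are purely hypothetical); both are plain Java properties files:

# META-INF/spring.handlers - maps a namespace URI to its NamespaceHandler implementation
http\://www.example.com/schema/myns=com.example.MyNamespaceHandler

# META-INF/spring.schemas - maps the schemaLocation URL to an XSD bundled in the jar
http\://www.example.com/schema/myns/myns.xsd=com/example/myns.xsd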
In my other question linked above, does anybody know why I had to add the maven-shade-plugin to create those two files (based on all my dependencies) under META-INF? In other words, what was the ROOT CAUSE that made me have to use the maven shade plugin?
If you use a Spring namespace in your Spring configuration, you need the appropriate files.
The problem arises when you want to run a Java application:
with a main class, either
  the Spring libraries need to be on the classpath,
  or everything is merged into one jar, which has to be on the classpath (*)
as a war/ear server application, the Spring libraries need to be on the classpath, normally inside the war.
I guess you did not start the main class with the complete classpath; I have updated my answer to your first question too.
(*) If you merge everything into one jar, you have to make sure that the contents of all spring.schemas/spring.handlers files are merged into one spring.schemas and one spring.handlers file; see this answer for a Maven configuration that creates such an all-in-one jar.
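For reference, this is roughly what that shade configuration looks like; the AppendingTransformer simply concatenates the same-named files coming from all dependencies (plugin version and the rest of the pom omitted):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- merge the spring.handlers entries of every dependency into one file -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.handlers</resource>
          </transformer>
          <!-- same for spring.schemas -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.schemas</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>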
We have a very comfortable setup using JPA through Spring/Hibernate, where we attach a PersistenceUnitPostProcessor to our entity manager factory. This post processor takes a list of project names, scans the classpath for jars that contain those names, and adds those jar files to the persistence unit for entity scanning. This is much more convenient than specifying them in persistence.xml, since it can take partial names, and we added facilities for detecting the different classpath configurations when we are running in a war, a unit test, an ear, etc.
Now we are trying to replace Spring with Seam, and I can't find a facility to accomplish the same hooking mechanism. One solution is to hook Seam through Spring, but that has other shortcomings in our environment. So my question is: can someone point me to such a facility in Seam if it exists, or at least to where in the code I should be looking if I plan to patch Seam?
Thanks.
If you're running in a Java EE container like JBoss 6 (and I really recommend doing so), all you need is to package your beans into a jar, place a META-INF/persistence.xml inside it, and put the jar into your WAR or EAR package. All @Entity annotated beans inside the jar will be processed.
For unit testing, you could point the <jar-file> element at the generated .class output directory and Hibernate will also pick up the entities, or even configure it at runtime using Ejb3Configuration.addAnnotatedClass;
#see http://docs.jboss.org/hibernate/entitymanager/3.6/reference/en/html/configuration.html
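A minimal sketch of such a META-INF/persistence.xml, assuming a container-provided JTA data source; the unit name, data source JNDI name, and jar path are placeholders:

<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
  <persistence-unit name="myUnit" transaction-type="JTA">
    <jta-data-source>java:/MyDS</jta-data-source>
    <!-- hypothetical relative path: for unit tests this can point at the .class output directory instead -->
    <jar-file>lib/my-entities.jar</jar-file>
  </persistence-unit>
</persistence>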