Is it possible to run the maven-compiler-plugin in the process-sources phase for specific packages only?
I know the correct way to do this is to extract the needed classes into an extra module, but that seems to be huge overhead for just two classes.
In case anyone wants to know: it is needed for the fornax-oaw-m2-plugin.
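For reference, what I have in mind is an extra execution along these lines (just a sketch: the execution id and package path are placeholders, the default compilation at the compile phase would still run afterwards, and I have not verified how this interacts with the fornax plugin):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <executions>
    <execution>
      <id>compile-generator-classes</id>
      <!-- bind an extra compile run to an earlier phase -->
      <phase>process-sources</phase>
      <goals>
        <goal>compile</goal>
      </goals>
      <configuration>
        <!-- only the classes that are needed early -->
        <includes>
          <include>com/example/workflow/**</include>
        </includes>
      </configuration>
    </execution>
  </executions>
</plugin>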
I know you can access different modules (included using include) in a project via org.gradle.api.Project#getSubprojects(), and I know you can get the names and directories of separate builds that have been included (using includeBuild) via org.gradle.api.invocation.Gradle#getIncludedBuilds().
But how can my plugin get information such as the locations of Java source files and class files for projects included using includeBuild?
My goal here is to determine which files have changed in the current git branch (which I can do), and then collect their corresponding class files into a jar that is used by our patching mechanism, which puts the patch jars at the front of the classpath instead of redeploying the whole application.
I don’t think it is a goal of Gradle to provide including builds with detailed information on included builds. Currently, the Gradle docs basically only state two goals for such composite builds:
combine builds that are usually developed independently, […]
decompose a large multi-project build into smaller, more isolated chunks […]
Actually, isolation between the involved builds seems to be an important theme in general:
Included builds do not share any configuration with the composite build, or the other included builds. Each included build is configured and executed in isolation.
For that reason, it also doesn’t seem to be possible or even desired to let an including build consume any build configurations (like task outputs) of an included build. That would only couple the builds and hence thwart the isolation goal.
Included builds interact with other builds only via dependency substitution:
If any build in the composite has a dependency that can be satisfied by the included build, then that dependency will be replaced by a project dependency on the included build.
So, if you’d like to consume specific parts of an included build from the including build, then you have to do multiple things:
Have a configuration in the included build which produces these “specific parts” as an artifact.
Have a configuration in the including build which consumes the artifact as a dependency.
Make sure that both configurations are compatible wrt. their capabilities so that dependency substitution works.
Let some task in the including build use the dependency artifact in whatever way you need.
Those things happen kind of automatically when you have a simple dependency between two Gradle projects, like a Java application depending on a Java library. But you can define your own kinds of dependencies, too.
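To give a rough idea, here is a sketch along the lines of Gradle's general pattern for sharing outputs via custom consumable/resolvable configurations and attributes. Everything here is made up for illustration (the coordinates com.example:included-lib, the configuration names, the attribute value) and would need to be adapted and verified for your builds.

In the included build's build.gradle.kts:

// The coordinates must match what the including build depends on.
group = "com.example"
version = "1.0"

// Consumable configuration that exposes the compiled classes as an outgoing variant.
val patchElements by configurations.creating {
    isCanBeConsumed = true
    isCanBeResolved = false
    attributes {
        attribute(Category.CATEGORY_ATTRIBUTE, objects.named(Category::class.java, "patch-classes"))
    }
}

artifacts {
    // Publish the class directory as the artifact of that configuration.
    add("patchElements", layout.buildDirectory.dir("classes/java/main")) {
        builtBy(tasks.named("compileJava"))
    }
}

In the including build's build.gradle.kts:

// Resolvable configuration that requests the same attribute value.
val patchClasses by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
    attributes {
        attribute(Category.CATEGORY_ATTRIBUTE, objects.named(Category::class.java, "patch-classes"))
    }
}

dependencies {
    // Substituted by the included build thanks to includeBuild(...).
    patchClasses("com.example:included-lib:1.0")
}

A task in the including build can then use files(patchClasses) as its input.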
The question is: would that really be worth the effort? Can’t you maybe achieve your goal more easily, or at least without relying on programmatically retrieved information about included builds? For example: if you know that your included build produces class files under build/classes/java/main, then maybe just take the classes of interest from there via org.gradle.api.initialization.IncludedBuild#getProjectDir().
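As a sketch of that simpler route, in the including build's build.gradle.kts (the included build name, task path and class file pattern are placeholders):

val includedLib = gradle.includedBuild("included-lib")

tasks.register<Jar>("patchJar") {
    // Make sure the included build has actually compiled its classes first.
    dependsOn(includedLib.task(":compileJava"))
    archiveClassifier.set("patch")
    // Default Java output layout; adjust if the included build configures it differently.
    from(includedLib.projectDir.resolve("build/classes/java/main")) {
        include("com/example/Changed*.class")
    }
}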
I know, this may not be the answer you had hoped to get. I still hope it’s useful.
I am working on migrating a multi-module Java project to Maven, and by now I have migrated most of the modules.
I am also aware that my project has a lot of unnecessary jars included, and I want to clean them up.
I know Maven has the plugin goal mvn dependency:analyze, which works very well.
dependency:analyze analyzes the dependencies of the project and determines, based on static code analysis, which are: used and declared; used and undeclared; unused and declared.
Now my question is: how can I remove the reported "unused and declared" dependencies for cleanup purposes? It is possible that those jars are used at runtime, so my code would compile perfectly fine after removing them but blow up at runtime.
An example: my code compiles against the open-source library antisamy.jar, but needs batik.jar at runtime, and mvn dependency:analyze tells me to remove batik.jar.
Is my understanding correct, or do I need expert input here?
Your understanding seems to be correct.
But I'm not sure why you'd think that there is a tool that could cover all the bases.
Yes, if you use stuff via reflection, no tool can reliably detect that you depend on this or that class.
For example, consider this snippet:
String myClassName = "com." + "example." + "SomeClass";
Class.forName(myClassName);
I don't think you can build a tool that can crawl through the code and extract all such references.
I'd use a try-and-fail approach instead, which would consist of:
remove all dependencies that dependency:analyze says are superfluous
whenever you find one that was actually used, you just add it back
This could work well because I expect the number of dependencies that are actually used via reflection to be extremely small.
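For dependencies you already know to be runtime-only (like the batik example above), you can also tell the analyze goal to stop flagging them, so the report stays useful. A sketch, assuming a maven-dependency-plugin version that supports ignoredUnusedDeclaredDependencies; the batik coordinates are a guess and need to match what you actually declare:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <configuration>
    <ignoredUnusedDeclaredDependencies>
      <!-- known runtime-only dependency: don't report it as "unused and declared" -->
      <ignoredUnusedDeclaredDependency>org.apache.xmlgraphics:batik-*</ignoredUnusedDeclaredDependency>
    </ignoredUnusedDeclaredDependencies>
  </configuration>
</plugin>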
I am using Maven to build my project. There are a lot of dependencies in the provided and runtime scopes. The build size is large and I want to remove the unwanted dependencies. So is there any way to check which dependencies are unwanted?
The best way to minimize dependencies to the ones you really need is to not embed those dependencies. If you simply use the maven-bundle-plugin on your code, it will create Import-Package statements for the code you really need. This may even give you a good hint about which dependencies you might be able to exclude from embedding.
In general, the goal in OSGi should be to not have that many dependencies in the first place. If you use libraries with extensive dependencies, then you should question the quality of those libraries.
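If you want to try that, the minimal setup is roughly the following (sketch; the plugin version is omitted and the module's packaging has to be switched to bundle). After the build you can inspect the generated Import-Package entries in META-INF/MANIFEST.MF of the resulting jar:

<packaging>bundle</packaging>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>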
My project depends on many ZIP resources.
With "maven-dependency-plugin" and its "unpack-dependencies" goal, I know how to unpack each dependency.
But (for different reasons I cannot explain here) I have to unpack the dependencies in a specific order (*).
Is it possible to unpack in a specific order, or is it possible to manage the dependency order?
Thanks,
Xavier
(*) There are some files with the same names, and I have to overwrite some files from one dependency with files from another dependency.
[EDIT][SOLUTION]
Thanks for answers.
I found a solution with copy-maven-plugin.
Here is an example of a solution to my problem:
https://gist.github.com/4164769
As in most cases with Maven, I think there are several ways to do this, and you'll have to find the most elegant way yourself. I'll give you an idea of how I'd get started.
First, you can use the dependency plugin's unpack mojo to unpack a specific set of artifacts; you name the artifacts explicitly in the configuration of the execution. It's possible that you can name more than one here and that they will be processed in order. However, if that doesn't work, you can always configure as many executions of this mojo as necessary and then order those executions in your pom itself, which DOES control ordering. Note that you can configure the unpack target on a per-execution basis too, which may help you.
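A sketch of the multiple-executions variant (coordinates, versions and the output directory are placeholders, and you should verify the overWrite* behaviour for your case). Executions bound to the same phase run in the order in which they appear in the pom:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-base</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>base-resources</artifactId>
            <version>1.0</version>
            <type>zip</type>
            <outputDirectory>${project.build.directory}/unpacked</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
    <execution>
      <id>unpack-overrides</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>override-resources</artifactId>
            <version>1.0</version>
            <type>zip</type>
            <outputDirectory>${project.build.directory}/unpacked</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>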
Another useful tool that might apply here is the assembly plugin with a custom assembly descriptor. The assembly, like the unpack mojo discussed above, can be configured to handle specific artifacts rather than just all of them, and the granularity and ordering of the processing is highly flexible.
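For the assembly route, a custom descriptor that names the artifacts explicitly could look something like this (sketch only; the id, coordinates and output layout are placeholders, and you would have to check that the overwrite behaviour between the two dependency sets matches what you need):

<assembly>
  <id>unpacked-resources</id>
  <formats>
    <format>dir</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <includes>
        <include>com.example:base-resources</include>
      </includes>
      <unpack>true</unpack>
      <outputDirectory>/</outputDirectory>
    </dependencySet>
    <dependencySet>
      <includes>
        <include>com.example:override-resources</include>
      </includes>
      <unpack>true</unpack>
      <outputDirectory>/</outputDirectory>
    </dependencySet>
  </dependencySets>
</assembly>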
I have a few dependencies in a Maven pom that I am trimming. To check them, I comment a dependency out and re-run the build. If the build fails, the dependency is needed. If it does not fail, I remove the dependency.
Is there an easier, faster way to check if a dependency is required for a Maven build?
No solution will be perfect due to the dynamic nature of the java classloader.
A dependency analysis tool like JarAnalyzer will certainly help identify static or compile-time relationships between jars in a directory.
Dynamic or run-time relationships are much harder to determine, which is why one of the suggestions was to ensure you have a comprehensive set of unit tests that exercises as much of your code as possible.