Is there a best practice to remove old dependencies from a Maven POM?

My company is trying to get our team off JUnit 3 (and 4) and onto JUnit 5.
Our pom file has dependencies for JUnit 3, 4, and 5 already.
I'd like to somehow mark JUnit 3 and 4 as "deprecated at my company" to give anyone trying to use these older versions a visual clue in the UI that we want to stop using them.
Is there a method to mark a dependency in Maven as "deprecated at my company"?
Short of that, I suppose we could write a Checkstyle rule or some other static analysis, but I really think it would be less obtrusive to have the visual indicator (like Javadocs or @Deprecated annotations).
Thoughts, or add-ons that would do this?
If the IDE matters, we have IntelliJ, Visual Studio Code, and a handful of stalwarts still on Vim.

You can use the maven-enforcer-plugin to enforce not using those libraries: http://maven.apache.org/enforcer/enforcer-rules/bannedDependencies.html
You can have the plugin configuration shared across all projects (e.g. in the parent POM).
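For illustration, a minimal sketch of such a banned-dependencies rule (the plugin version and the message text are placeholders, not from the original answer); note that JUnit 3 and 4 both ship under the junit:junit coordinates, while JUnit 5 lives under org.junit.jupiter:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.4.1</version>
    <executions>
        <execution>
            <id>ban-old-junit</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <excludes>
                            <!-- JUnit 3 and JUnit 4 both use these coordinates -->
                            <exclude>junit:junit</exclude>
                        </excludes>
                        <message>JUnit 3/4 are deprecated at our company; please migrate to JUnit 5 (org.junit.jupiter).</message>
                    </bannedDependencies>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>

The message shows up in the build failure, which gives developers the "deprecated at my company" signal, albeit at build time rather than in the IDE.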

I'd like to find a utility that doesn't fail when any existing instance of a banned dependency is found, only when new instances are introduced.
It looks like ArchUnit may give you the flexibility you are asking for. For example, the simple test below fails if any of the classes in the analyzed packages actually use classes from the org.junit package:
package com.stackoverflow.example;

import com.tngtech.archunit.junit.AnalyzeClasses;
import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.ArchRule;
import com.tngtech.archunit.lang.syntax.ArchRuleDefinition;

@AnalyzeClasses(
    packages = "<root package of your module>"
)
public class JUnit4UsageTest {

    @ArchTest
    public static final ArchRule RULE_NO_CLASSES_SHOULD_ACCESS_JUNIT4_PACKAGES =
        ArchRuleDefinition.noClasses()
            .should()
            .dependOnClassesThat()
            .resideInAPackage("org.junit");
}
The DSL provided by ArchUnit is powerful enough to express such rules, as long as the information is preserved in the class files.
The library also supports freezing arch rules: frozen rules succeed for known, recorded violations but fail for any new ones.
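As a sketch of that freezing mechanism (reusing the test class from above; the field name is made up for illustration), you wrap the rule in FreezingArchRule.freeze(...), which records the existing violations in a text-based store you can check into version control, and then fails only on new violations:

import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.ArchRule;
import com.tngtech.archunit.lang.syntax.ArchRuleDefinition;
import com.tngtech.archunit.library.freeze.FreezingArchRule;

// Existing JUnit 3/4 usages are recorded on the first run; only new ones fail.
@ArchTest
public static final ArchRule FROZEN_NO_NEW_JUNIT4_USAGE =
    FreezingArchRule.freeze(
        ArchRuleDefinition.noClasses()
            .should()
            .dependOnClassesThat()
            .resideInAPackage("org.junit"));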

Related

Is there a way to define a property to be used in both settings.gradle.kts and the projects'/subprojects' build.gradle.kts with Gradle 6.0?

We have a multi-module Android app with build logic written in the Gradle Kotlin DSL. We use buildSrc to extract common logic like dependency versions. We have something like:
buildSrc/src/main/kotlin/Dependencies.kt:
object Versions {
    const val fooVersion = "1.2.3"
    const val barVersion = "4.5.6"
}

object Libraries {
    val foo = "com.example.foo:foo:${Versions.fooVersion}"
    val bar = "com.example.bar:bar:${Versions.barVersion}"
}

object Modules {
    const val app = ":app"
    const val base = ":base"
    const val baz = ":baz"
}
Then we can use these in the modules' dependencies blocks to avoid hardcoded/duplicated values:
app/build.gradle.kts:
dependencies {
    implementation(Libraries.foo)
    implementation(Libraries.bar)
    implementation(project(Modules.base))
    implementation(project(Modules.baz))
}
And we also use it in settings.gradle.kts:
settings.gradle.kts:
include(
    Modules.app,
    Modules.base,
    Modules.baz
)
This works fine with Gradle 5.6. When I upgrade to 6.0, I get "Unresolved reference: Modules" in the settings.gradle.kts file. I found it mentioned in the migration guide:
Previously, the buildSrc project was built before applying the project’s settings script and its classes were visible within the script. Now, buildSrc is built after the settings script and its classes are not visible to it. The buildSrc classes remain visible to project build scripts and script plugins.
Custom logic can be used from a settings script by declaring external dependencies.
So I know what broke the build, and I can fix it by using hardcoded values in settings.gradle.kts:
include(
    ":app",
    ":base",
    ":baz"
)
Is it possible to avoid this duplication with Gradle 6.0?
Please make sure you read the updates down below.
Original answer
See ticket #11090, "Definitions from buildSrc/ not found in settings.gradle.kts using gradle 6.0-rc-1". As you already noticed, this changed recently:
This has changed in 6.0, and was deprecated in 5.6. Please see: https://docs.gradle.org/current/userguide/upgrading_version_5.html#buildsrc_usage_in_gradle_settings
-- https://github.com/gradle/gradle/issues/11090#issuecomment-544473179
One of the maintainers describes the reasons behind the decision:
Unfortunately, there are pros and cons to both arrangements (settings-then-buildSrc and buildSrc-then-settings), and we opted for the former after considering.
(...)
The pros that compelled us to make the change:
Settings plugins can influence buildSrc and main build (i.e. apply a build plugin to both)
Build cache configuration is applied to buildSrc
buildSrc behaves more like a regular included build
-- https://github.com/gradle/gradle/issues/11090#issuecomment-545697268
And finally some bad news:
We won't be changing the behaviour back to the pre Gradle 6 arrangement. Please let us know if you would like more detail on how to use one of the alternative mechanisms for using complex logic in a settings script.
-- https://github.com/gradle/gradle/issues/11090#issuecomment-545697268
Workarounds
In the aforementioned post the author proposes some workarounds:
The con of this is exactly what you have hit. It's now less convenient to use complex logic in your settings script. Now, you have to either:
1. Inline the logic into the settings file
2. Move the logic to a shared script that can be used where it needs to
3. Move the logic to a pre-built binary that you load in the settings file (i.e. a settings plugin)
-- https://github.com/gradle/gradle/issues/11090#issuecomment-545697268
#1 is pretty straightforward, but I can only assume what #2 and #3 mean. I come from the Groovy world and only recently started making friends with the Kotlin DSL. Having said that, let's give it a try.
In #3 the author might be talking about developing an external plugin and applying it in both scripts. I'm not really sure if this is something that would make sense to implement (it gives you strong typing though).
"#2 Move the logic to a shared script that can be used where it needs to"
I think it's about having a common script plugin and including it in both the settings.gradle and build.gradle files. The plugin would put the static information into the ExtraPropertiesExtension of the ExtensionAware in scope (Settings in the case of a settings.gradle script plugin, Project in the case of build.gradle). This is described in this answer to "Include scripts with Gradle Kotlin DSL":
How can I put all common constants (such as dependency versions) to the separate file to include them just by using something like springBootVersion or Constants.springBootVersion with compile-time checks?
There is no good way to do it currently. You can use extra properties, but it won't guarantee compile time checks. Something like that:
// $rootDir/dependencies.gradle.kts

// this will try to take configuration from existing ones
val compile by configurations
val api by configurations

dependencies {
    compile("commons-io:commons-io:1.2.3")
    api("some.dep")
}

// This will put your version into the extra extension
extra["springBootVersion"] = "1.2.3"
And you can use it like this:
// $rootDir/build.gradle.kts
subprojects {
    apply {
        plugin<JavaLibraryPlugin>()
        from("$rootDir/dependencies.gradle.kts")
    }
}
And in your module:
// $rootDir/module/build.gradle.kts

// This will take the existing dependency from extra
val springBootVersion: String by extra

dependencies {
    compile("org.spring:boot:$springBootVersion")
}
-- Include scripts with Gradle Kotlin DSL
UPDATE 1
The issue has become popular and gained attention from the Gradle maintainers:
Since there are so many comments still on this issue, let me clarify a few things about where we are and where we are actively moving to right now.
For the general topic "I want buildSrc to be done before anything else" there will be a solution soon with included builds and setting plugins. Most likely in the next release. Then you will be able to include a build, using a new DSL method, that is available earlier. This build can then contain a settings plugin which you can apply in settings. This makes the Jar containing the plugin available on the settings classpath. (Although ideally you would define a plugin extension and not use classes/static methods directly.)
settings.gradle.kts

pluginManagement {
    includeBuildEarly("my-build-logic") // <- WIP: new API under development
}

plugins {
    apply("my.settings.plugin") // this is defined in the "my-build-logic" build
}
For the topic of sharing dependencies and versions, we are also working on general improvements that might make some "custom solutions" unnecessary in the future.
Generally, you should try to avoid using buildscript {} or using resolutionStrategy {} in settings. Instead you can define plugins in included builds. You can then manage all dependencies (also to plugins) in the build files of these builds.
If you want to share constants between all these builds, you can have one build only for these constants that you include in all others, like the libraries build in this example.
See also the sample about sharing convention plugins with build logic build.
Hope these are some helpful pointers.
-- https://github.com/gradle/gradle/issues/11090#issuecomment-734795353
UPDATE 2
Nowadays (Gradle 7+) you could use Version Catalogs to share dependency versions across buildSrc and the regular build. Please refer to the official documentation and this answer.
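A minimal sketch of that approach (the coordinates mirror the question's placeholders; recent Gradle versions pick up gradle/libs.versions.toml automatically):

# gradle/libs.versions.toml
[versions]
foo = "1.2.3"

[libraries]
foo = { module = "com.example.foo:foo", version.ref = "foo" }

// app/build.gradle.kts
dependencies {
    implementation(libs.foo)
}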
AFAIK, no, and that is what the aforementioned migration guide says: settings.gradle is the first thing to be evaluated, so at that stage the objects defined in buildSrc don't even exist yet.
I can imagine some hacky workarounds, just for the science, but in a real project they would smell so bad that, IMO, they are not worth it.
P.S.: I guess you could reverse it and try to create an instance of some enumeration by going through the submodules in the buildSrc code. But again, the ways to achieve this might be quite exotic and should only be used for the sole purpose of proving that it can work.

The idea behind using Maven to compile source code

I am currently starting my adventure with Maven, and I don't really understand the idea behind using it to automate compilation of my source code. For the time being I am working on small projects with up to 15-20 classes and one main method in the "app" class. Could someone please explain, with examples, when it's necessary (or recommended) to use a build automation tool to compile source code, and how I could benefit from it?
Thank you very much in advance!
I was looking for different answers and I have a lot of work to do, but since I saw this question, as a Maven fanboy, I couldn't resist anymore, and below is my answer.
First of all, I agree with JF Meier which answered before me, but I think the answer can be improved.
IMO you have to consider Maven not just as a build tool, but as a multi-purpose tool which can help you do very different things. The best three, for me, are:
1. Compiler. Obviously. Maven allows you to easily compile giant projects with a lot of submodules, even if some of these modules are interdependent.
2. Dependency and repository manager. Maven allows you to automatically download third-party software and bind this download to the build. This is immediately understandable if you think of framework or API dependencies from the big players (the Apache foundation, Spark, Spring, Hibernate, and so on), but it's really powerful in every enterprise context.
Example: you have a Maven project (let's say project A) which handles requests coming from a webservice and provides responses. This Maven project relies on another Maven project (let's say project B) which actually generates the webservice jar and uploads it to a company repository. Well, when you have to add a field or a method to the webservice, you just have to implement the new code in project B, upload it to the repo, and change the version in the Maven POMs of both projects A and B. Voilà: now EVERY developer at the company just has to "mvn clean install" project A to get the new version.
3. Source and code generator. Since Maven 2.x, a lot of plugins are available (from the Apache foundation and others) which allow you to generate code and sources (typically XML files) starting from little to no implementation.
Example 1: the CXF plugin is commonly used to generate Java classes from XML or XSD files.
Example 2: the JAXWS plugin is commonly used to generate a WSDL from a SOAP webservice implementation, or an implementation starting from a WSDL file.
Do you feel the power now?
-Andrea
The question is not very specific, but I will try to answer.
Usually, you want your source code to end up in a jar or war, so that you can use it as a library or run it somewhere (e.g. on an application server).
Maven not only compiles the classes you have and creates the final artifact (jar, war), but also handles your dependencies, e.g. the libraries your project depends upon.
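As a tiny illustration (the coordinates are placeholders): with the minimal POM below, running mvn package compiles everything under src/main/java, downloads the declared library plus its transitive dependencies from Maven Central, and produces target/app-1.0.0.jar, identically on every developer's machine and on a CI server.

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>app</artifactId>
    <version>1.0.0</version>
    <packaging>jar</packaging>

    <dependencies>
        <!-- Maven fetches this jar (and anything it depends on) automatically -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.14.0</version>
        </dependency>
    </dependencies>
</project>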

Removing extra jar dependencies from a Java project

I am working on migrating a multi-module Java project to Maven. By now I have migrated most of the modules.
I am also aware that my project includes a lot of unnecessary jars, and I want to clean them up.
I know Maven has a plugin goal for this, mvn dependency:analyze, which works very well.
dependency:analyze analyzes the dependencies of this project and determines which are: used and declared; used and undeclared; unused and declared, based on static code analysis.
Now my question is: how can I remove the reported "unused and declared" dependencies for cleanup purposes? It is possible that those jars are used at runtime, so my code would compile perfectly fine after removing them but blow up at runtime.
An example: my code compiles against the open-source library antisamy.jar, which in turn requires batik.jar at runtime, yet mvn dependency:analyze tells me to remove batik.jar.
Is my understanding correct, or do I need expert input here?
Your understanding seems to be correct.
But I'm not sure why you'd think that there is a tool that could cover all the bases.
Yes, if you use stuff by reflection, no tool can reliably detect the fact that you depend on this class or the other.
For example consider this snippet:
String myClassName = "com." + "example." + "SomeClass";
Class.forName(myClassName);
I don't think you can build a tool that can crawl through the code and extract all such references.
I'd use a try-and-fail approach instead, which would consist of:
remove all dependencies that dependency:analyze says are superfluous
whenever you find one that was actually used, you just add it back
This could work well because I expect the number of dependencies that are actually used via reflection to be extremely small.
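If you take that approach, dependencies you have confirmed as runtime-only (like batik in the question) can be whitelisted so that dependency:analyze stops flagging them. A hedged sketch (the batik coordinates here are an assumption for illustration):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <configuration>
        <ignoredUnusedDeclaredDependencies>
            <!-- confirmed runtime-only dependency; coordinates assumed for illustration -->
            <ignoredUnusedDeclaredDependency>org.apache.xmlgraphics:batik-*</ignoredUnusedDeclaredDependency>
        </ignoredUnusedDeclaredDependencies>
    </configuration>
</plugin>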

Running XSpec with Maven and Saxon-PE7

We want to run XSpec as part of our Maven builds to check our XSL transformations. A plugin is available from GitHub. The problem arises when the XSL stylesheets we check invoke functions that are not available in Saxon-HE, which looks like this:
Error at xsl:if on line 194 column 75 of dyxml_table_cals.xsl:
XPST0017 XPath syntax error at char 0 on line 194 near {...table-enumeration-condition...}:
Cannot find a matching 2-argument function named {http://saxon.sf.net/}evaluate().
Saxon extension functions are not available under Saxon-HE
We own licenses for PE. According to the Saxon documentation, the enhanced editions revert to the open-source HE when no license information is available, which seems to be the case here. Is it possible to activate PE by way of Maven, e.g. using the plugin by Codehaus, and what would that look like? We already have a way of activating it through Java, but knowing another, arguably more elegant way would be helpful, if it is possible at all.
I'm not very familiar with the Maven plugins for XSpec, but I'll try to give some hints and workarounds.
The pom.xml of the Maven plugin you mentioned contains a dependency on the version of Saxon used:
<dependency>
    <groupId>net.sf.saxon</groupId>
    <artifactId>Saxon-HE</artifactId>
    <version>9.7.0-1</version>
</dependency>
You should override this Saxon version in order to use Saxon-PE or Saxon-EE. However, these Saxon editions don't seem to be available on public Maven repositories since, unlike Saxon-HE, they are proprietary software. I guess you can put the .jar file for Saxon-PE in a local repository (see the Maven documentation for this). I suggest putting the .lic license file in the same directory as the .jar file.
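For example, installing the PE jar into a local or company repository could look like this (the com.saxonica group id and the version shown are assumptions based on how Saxonica publishes PE in its own repository, so double-check them against your jar):

mvn install:install-file \
    -Dfile=Saxon-PE-9.7.0-1.jar \
    -DgroupId=com.saxonica \
    -DartifactId=Saxon-PE \
    -Dversion=9.7.0-1 \
    -Dpackaging=jar

You would then replace the Saxon-HE dependency above with the corresponding com.saxonica:Saxon-PE coordinates.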
Two other hints that may help you find a workaround:
XSpec allows you to specify the Saxon version in an environment variable inside a shell or batch script. You can then run that script in your Maven project using, for example, the exec-maven-plugin. It's not ideal, but it may be enough for your use case.
There is another Maven plugin for running XSpec, you may want to check that out too.
Hope it helps...
After some trial and error we found the following solution to work:
The creator of the XSpec Maven plugin linked above hardcoded the use of the unlicensed Saxon-HE. In particular, the following line was causing the issue:
private final static Processor processor = new Processor(false);
We forked the code and changed it to:
private final static Processor processor = new Processor(true);
We built a custom class to activate the license and integrated it into the plugin source code. (Can't post the code here.)
This resolved the licensing issue. Now our XSpec tests are up and running. Yay us!

Install 3rd Party Libraries with Transitive Dependencies / Dependency Tree? Automated?

I have encountered a problem similar to the one in the following question: Install 3rd Party Libraries with Transitive Dependencies / Dependency Tree?
I will quote the most important part from it below:
I know mvn install:install-file installs a single JAR. But how do I install something like this locally in the repository:
+ Parent.jar
+ ChildA.jar (Required by Parent)
+ ChildB.jar (Required by ChildA)
To make it more complicated and real-life: Parent.jar and ChildA.jar are legacy/commercial jars not available in the public Maven repository, but ChildB is a jar that is found in the public repository (for example, a logging jar).
UPDATE: I do not only want to install them locally (with a system dependency) but to also "correctly" integrate them with Maven so I can redistribute this dependency tree to other developers or the public (and I assume this is important for Maven), so that Maven knows and understands the dependency tree (to avoid version conflicts, unnecessary downloads, etc.).
I have a similar case: the jars are legacy and commercial, and I will deploy them to an internal company repository.
The solution is simple, and I figured it out before finding that question: write a POM for every jar and import them using mvn install:install-file -Dfile=<path-to-file> -DpomFile=<path-to-pomfile>. But in my case there are over 200 jars with more than 10,000 class files in total. That's why I was wondering whether there is software that could give me easily parseable and/or processable output telling me which jars depend on which. Otherwise it could take an insane amount of time to do manually: open every jar, decompile every class, find which classes it references, and determine whether they come from other jars (and which ones) or not.
I think this should be possible to automate, since classloaders do something similar: to load a certain class, they have to load the classes that it uses, so there is a way to know which classes are referenced. And you can tell which classes belong to which archives.
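For what it's worth, the JDK itself has shipped a tool along these lines since Java 8: jdeps summarizes jar-to-jar dependencies from the bytecode (so reflective references stay invisible to it, as discussed in the previous question). A quick sketch:

# prints one "foo.jar -> bar.jar" line per dependency found
jdeps -s *.jar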
I'm also aware that I might not be able to import them because of circular dependencies between classes from different jars. I'm fine with that, as long as the software says NO when it's not possible. That's actually another reason why I don't want to do it manually, only to acknowledge after two weeks that I have wasted my time.
So any idea if such software exists?
