I am wondering if there is a Maven enforcer rule or something similar to check my project for any 'open' (not fixed) versions in the project's (transitive) dependencies.
I would like to achieve a stable, reproducible build with Maven, but I cannot guarantee this if a dependency of mine e.g. declares an open-ended version range for one of its dependencies.
A new release of that transitive dependency would change the output of my otherwise untouched build.
I haven't found any property or enforcer rule which fits this requirement.
Does anybody know how such a requirement can be met with Maven?
Your best bet would be to take the output of mvn dependency:list and fix all those versions in <dependencyManagement>
The "no transitive dependency" way
To have a reproducible build, you should probably fix the versions of all your direct and transitive dependencies in dependencyManagement, as sketched below.
This will allow you:
to fix the versions of range-based dependencies
to avoid dependency-convergence issues
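For illustration, a minimal sketch of such pinning (the coordinates are illustrative, borrowed from the range example further down this page):
<dependencyManagement>
  <dependencies>
    <!-- pin a transitive dependency that would otherwise be resolved from a range -->
    <dependency>
      <groupId>org.assertj</groupId>
      <artifactId>assertj-core</artifactId>
      <version>2.9.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>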
To make sure you don't forget to flatten all your dependencies into dependencyManagement, you can use the banTransitiveDependencies rule from maven-enforcer-plugin.
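A minimal sketch of that rule's wiring (the plugin version is illustrative; see the rule's documentation for its excludes options):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0-M3</version>
  <executions>
    <execution>
      <id>ban-transitive-dependencies</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- fails the build when a transitive dependency is not covered by an exclusion -->
          <banTransitiveDependencies/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>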
If you have a lot of dependencies this can be painful to manage, but maybe you can create a script to generate the dependencyManagement section from the output of mvn dependency:list.
I created a new feature request for maven-dependency-plugin about this: https://issues.apache.org/jira/browse/MDEP-811
(See also: https://stackoverflow.com/a/35849405/5088764)
Fix range versions only?
The solution above works, but ideally we only want to fix versions for:
convergence issues
range-version dependencies
You can add rules for dependency-convergence, but AFAIK there is no kind of noRangeVersion rule.
I created a new feature request for maven-enforcer-plugin about this: https://issues.apache.org/jira/browse/MENFORCER-427
But while waiting, maybe it is possible to create your own rule: https://maven.apache.org/enforcer/enforcer-api/writing-a-custom-rule.html
(or maybe somebody has already done that?)
dependency-lock-maven-plugin?
I didn't test this, but maybe dependency-lock-maven-plugin could help to solve this issue.
See: https://stackoverflow.com/a/54580971/5088764
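Untested, but based on that plugin's README, the wiring might look roughly like this (groupId, version and goal names are assumptions to double-check against the current docs):
<plugin>
  <groupId>se.vandmo</groupId>
  <artifactId>dependency-lock-maven-plugin</artifactId>
  <version>0.0.40</version>
  <executions>
    <execution>
      <id>check-locked-dependencies</id>
      <goals>
        <!-- assumed goal: fails the build if resolved dependencies differ from the lock file -->
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The lock file itself would be created once (with something like the plugin's create-lock-file goal) and committed to version control.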
Short story
It is (still) "possible" to determine whether there are transitive dependencies with an open version range:
...
<dependency>
  <groupId>org.assertj</groupId>
  <artifactId>assertj-assertions-generator</artifactId>
  <version>2.1.0</version>
</dependency>
...
...
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <!-- version makes sense -->
  <version>2.6</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
...
% mvn -X assembly:single -Dassembly.dryRun=true | grep 'setting version to'
[DEBUG] org.assertj:assertj-core:jar:2.9.1:compile
(setting version to: 2.9.1 from range: [2.1.0,2.99.0])
Long story
Maven 3 adopted the Aether project and, unfortunately, there is no option to intercept or influence the dependency resolution process: the "project object model" provides information about direct dependencies, but exhaustive information about transitive dependencies is hidden behind Aether. That is the reason why you didn't find the desired functionality among Maven plugins.
I succeeded in getting some relevant information out of maven-assembly-plugin only because its old versions are still compatible with modern Maven. So, technically, it is still possible to implement a plugin with the required functionality, or even to take advantage of gmavenplus-plugin and write a Groovy scriptlet:
<plugin>
  <groupId>org.codehaus.gmavenplus</groupId>
  <artifactId>gmavenplus-plugin</artifactId>
  <version>1.13.1</version>
  <executions>
    <execution>
      <phase>initialize</phase>
      <goals>
        <goal>execute</goal>
      </goals>
      <configuration>
        <bindAllProjectProperties>true</bindAllProjectProperties>
        <scripts>
          <script><![CDATA[
            // look up the legacy resolver and resolve the full transitive closure of the project
            def resolver = session.container.lookup(org.apache.maven.artifact.resolver.ArtifactResolver.class)
            def artifacts = resolver.resolveTransitively(
                project.dependencyArtifacts,
                project.artifact,
                project.managedVersionMap,
                session.getLocalRepository(),
                project.remoteArtifactRepositories,
                null
            ).artifacts.findAll {
                // keep only artifacts resolved from a real range: a plain version is stored
                // as a "recommended version", a range as restrictions with actual bounds
                it.versionRange && it.versionRange.restrictions
                    && !it.versionRange.recommendedVersion
                    && (it.versionRange.restrictions.size() > 1
                        || it.versionRange.restrictions[0].lowerBound
                        || it.versionRange.restrictions[0].upperBound
                    )
            }.each {
                log.error("Found bad guy: $it -> $it.versionRange")
            }
          ]]></script>
        </scripts>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-all</artifactId>
      <version>3.0.9</version>
      <type>pom</type>
      <scope>runtime</scope>
    </dependency>
  </dependencies>
</plugin>
% mvn initialize
[INFO] Scanning for projects...
...
[INFO] Using plugin classloader, includes GMavenPlus and project classpath.
[INFO] Using Groovy 3.0.9 to perform execute.
Found bad guy: org.assertj:assertj-core:jar:2.9.1:compile -> [2.1.0,2.99.0]
UPD.
The idea of locking the versions of all transitive dependencies in dependencyManagement seems wrong to me. At first glance it looks attractive to run mvn dependency:list and put all its output into dependencyManagement; no doubt, at the next mvn package we will get the same artifact, but we also need to think about the consequences of such a "solution":
The output of mvn dependency:list is far from ideal: Maven tries to do its best, but it relies on the numbers mentioned in version tags and knows nothing about compatibility, bugs and security issues. We should not blindly trust that output; instead we always need to check everything manually, and the problem is that Maven answers the question "What will we get?" when the actual question is "Why did we get that?".
By locking the versions of transitive dependencies in dependencyManagement we are taking responsibility for things which we do not actually manage. The question is: how are we going to update the versions of dependencies if we have locked the versions of their dependencies? Why do we think we know better what to do than the developer of the dependency?
Related
I was asked to optimize the dependencies of a big Maven project, i.e. find and remove any and all dependencies that are not being used by the project. Therefore I chose the analysis functionality of the maven-dependency-plugin:
<plugin>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>3.1.1</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.maven.shared</groupId>
      <artifactId>maven-dependency-analyzer</artifactId>
      <version>1.11.1</version>
    </dependency>
  </dependencies>
</plugin>
to output a report of Maven dependencies that are not being used, by executing the command mvn dependency:analyze -DignoreNonCompile=true. I found that most dependencies reported under the "Unused declared dependencies found" section could be removed without any sort of problem; however, there were a few dependencies whose removal caused compilation errors. I was wondering why such dependencies are included in the "Unused declared dependencies found" section and if there is something that I'm missing?
Thank you for your attention.
It's important to understand how this mechanism works:
Maven uses ObjectWeb's ASM framework to analyze your raw bytecode. It goes through all your classes and builds a list of all the classes that these reference.
But you can use a class in different ways, typically via reflection (and there are also some things that are not compiled into class files: https://maven.apache.org/plugins/maven-dependency-plugin/faq.html#unused), and this mechanism cannot detect that.
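If a dependency is only used via reflection and you want to keep it without the warning, the analyze goal can be told to ignore it. A minimal sketch (the coordinates in the ignore list are hypothetical placeholders):
<plugin>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>3.1.1</version>
  <configuration>
    <ignoredUnusedDeclaredDependencies>
      <!-- used only via reflection, so invisible to the bytecode analysis -->
      <ignoredUnusedDeclaredDependency>com.example:reflection-only-lib</ignoredUnusedDeclaredDependency>
    </ignoredUnusedDeclaredDependencies>
  </configuration>
</plugin>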
I have an interesting problem in a project where all of the technologies mentioned in the title are used. I've been able to track it down up to the diagnosis (the test classpath prepared by Surefire), but I don't understand whether it can be fixed and how. It's not a showstopper, indeed it's a minor issue for me, but I'd like to solve it anyway.
First a rough description.
The problem is related to executing tests in a specific module of the project, and only in a specific way.
Everything works (tests pass) when I run from the master pom level:
cd ${projHome}
mvn install
Everything works (tests pass) when I run:
cd ${projHome}/modules/CoreImplementation/
mvn test
That means that I can build and test with no problems, the same for my Jenkins, and NetBeans can run tests from the IDE when I need them.
But that module fails testing when I run from the master pom level:
cd ${projHome}
mvn test
with this error:
java.lang.NoSuchMethodError: it.tidalwave.northernwind.profiling.RequestProfilerAspect.aspectOf()Lit/tidalwave/northernwind/profiling/RequestProfilerAspect;
at it.tidalwave.northernwind.frontend.ui.spi.DefaultSiteViewController.processRequest(DefaultSiteViewController.java:82) ~[classes/:na]
at it.tidalwave.northernwind.frontend.ui.spi.DefaultSiteViewControllerTest.must_call_some_RequestProcessors_when_one_breaks(DefaultSiteViewControllerTest.java:161) ~[test-classes/:na]
Running mvn test as a second pass (after a mvn install -DskipTests) happens to be the way Drone.io and Travis do their job. While I could change their configuration, I'd like to stay with the standard configuration and fix the problem if possible.
The diagnosis in short and my question.
Now, the question in short (details are further below). I was able to track down the problem to different ways in which Surefire prepares the classpath to execute the tests.
When I run mvn install the classpath is:
${repo}/org/apache/maven/surefire/surefire-booter/2.16/surefire-booter-2.16.jar
${repo}/org/apache/maven/surefire/surefire-api/2.16/surefire-api-2.16.jar
${projHome}/modules/CoreImplementation/target/test-classes
${projHome}/modules/CoreImplementation/target/classes
    ${projHome}/modules/Core/target/it-tidalwave-northernwind-core-1.1-ALPHA-37-SNAPSHOT.952b0c8bdc77.jar
${repo}/it/tidalwave/thesefoolishthings/it-tidalwave-role/3.0-ALPHA-1/it-tidalwave-role-3.0-ALPHA-1.jar
    ${projHome}/modules/Profiling/target/it-tidalwave-northernwind-core-profiling-1.1-ALPHA-37-SNAPSHOT.952b0c8bdc77.jar
${repo}/org/apache/commons/commons-math3/3.0/commons-math3-3.0.jar
…
When I run mvn test (from the project home) the classpath is:
${repo}/org/apache/maven/surefire/surefire-booter/2.16/surefire-booter-2.16.jar
${repo}/org/apache/maven/surefire/surefire-api/2.16/surefire-api-2.16.jar
${projHome}/modules/CoreImplementation/target/test-classes
${projHome}/modules/CoreImplementation/target/classes
    ${projHome}/modules/Core/target/unwoven-classes
${repo}/it/tidalwave/thesefoolishthings/it-tidalwave-role/3.0-ALPHA-1/it-tidalwave-role-3.0-ALPHA-1.jar
    ${projHome}/modules/Profiling/target/unwoven-classes
${repo}/org/apache/commons/commons-math3/3.0/commons-math3-3.0.jar
…
The different portions are the Core and Profiling entries (the indented ones). In the former case, Surefire uses the classes directory (forget for a moment that in my case they are named unwoven-classes) only for the module under test, and the installed jar files for every dependency. In the latter case, it seems to be using classes for all dependencies in the reactor.
The reason why this difference in the classpaths gives me trouble is explained below in the "Gory details" section. In short, unwoven means that they contain bytecode not augmented by AspectJ, hence the methods that can't be found at runtime.
I'm running with Surefire 2.16, but I've also tried the latest 2.19 with no changes. Being able to force Surefire to always use jar files for dependencies would fix my problem. If you have the answer, you can stop reading my post here.
Gory details (just for curiosity).
The faulty module's artifactId is it-tidalwave-northernwind-core-default, and it depends on aspects available in it-tidalwave-northernwind-core-profiling - that's where the offending RequestProfilerAspect is. The aspect library dependency is declared both in the regular dependencies of the faulty module and in the configuration of the aspectj plugin:
<dependency>
  <groupId>it.tidalwave.northernwind</groupId>
  <artifactId>it-tidalwave-northernwind-core-profiling</artifactId>
</dependency>
...
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>aspectj-maven-plugin</artifactId>
      <configuration>
        <aspectLibraries combine.children="append">
          <dependency>
            <groupId>it.tidalwave.northernwind</groupId>
            <artifactId>it-tidalwave-northernwind-core-profiling</artifactId>
          </dependency>
        </aspectLibraries>
      </configuration>
    </plugin>
  </plugins>
</build>
AspectJ integration is by means of the following profile in a Super POM, which is activated in the build; the relevant part is:
<profile>
  <id>it.tidalwave-aspectj-springaop-v1</id>
  ...
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <executions>
          <execution>
            <id>default-compile</id>
            <phase>compile</phase>
            <configuration>
              <outputDirectory>target/unwoven-classes</outputDirectory>
            </configuration>
          </execution>
          <execution>
            <id>default-testCompile</id>
            <phase>test-compile</phase>
            <configuration>
              <outputDirectory>target/unwoven-test-classes</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
      ...
The aspectj plugin is configured in the profile to statically weave binaries in the unwoven-test-classes directories. The reason for this approach is that it's the only feasible solution, AFAIK, to have both Lombok and AspectJ work together.
Now, back to the two classpaths described above: the fact that Surefire is using unwoven-classes means that it's pointing to bytecode that has not been augmented with AspectJ methods, hence the error.
References
The project is a FLOSS one and can be found at
https://bitbucket.org/tidalwave/northernwind-src
or
https://github.com/tidalwave-it/northernwind-src
A changeset where the problem can be reproduced is f98e9a89ac70138c1b6bd0d4570a22d59ed71be6. JDK 1.8.0 is required to build the project (even though it doesn't use Java 8 code yet).
The SuperPOM can be found here:
https://bitbucket.org/tidalwave/thesefoolishthings-superpom-src
It seems that including direct dependencies with provided scope is well understood. It also appears that including transitive dependencies with runtime scope is also easily accomplished.
But how can I include a dependency two levels of indirection away?
Example:
A --> B --> C
Where A depends on B (compile scope) and B depends on C (provided scope).
I want A to retrieve C (e.g. download the jar locally), be it via an assembly descriptor, maven-dependency-plugin:copy-dependencies, or some other mechanism.
I've tried seemingly every combination of options for both of the aforementioned plugins. Neither approach covers this scenario. They both get B (even if B is changed to a provided dependency), and any compile scope dependencies of B, but not provided dependencies of B.
I suppose that I'm trying to do something similar to a shaded representation of my project but without unpacking dependencies.
Naturally I don't want to have to enumerate all of B's dependencies in A's pom - I'd like to retrieve (and then package) all dependencies implicitly and recursively.
You won't be able to do that. It is not a limitation of the maven-assembly-plugin, but the way Maven considers transitive dependencies. A transitive dependency that is of scope provided will be omitted, always (refer to this table in the documentation).
There is an open bug about this (MNG-2205), but I don't think it will be fixed anytime soon. This really is intended behaviour, because provided dependencies, as per the name, are supposed to be provided at runtime.
Although Tunaki is quite right, I found a workable solution that's better than nothing, using a less well-known plugin (NOTE: it only works with Maven <= 3.0.x).
This specifies two dependency blocks. The first pulls in the normal compile dependencies, including transitive ones; it's the same as using copy-dependencies. The second block specifically mentions B, in addition to the normal POM dependency declaration (unfortunately, but at least both mentions of the dependency are in the same POM), and requests its provided dependencies:
<plugin>
  <groupId>com.github.goldin</groupId>
  <artifactId>copy-maven-plugin</artifactId>
  <version>0.2.5</version>
  <executions>
    <execution>
      <id>get-provided-dependencies</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <resources>
          <resource>
            <targetPath>${project.build.directory}/lib</targetPath>
            <dependencies>
              <!-- normal compile dependencies, including transitive ones -->
              <dependency>
                <includeScope>compile</includeScope>
              </dependency>
              <!-- B mentioned explicitly, so that its provided dependencies are pulled too -->
              <dependency>
                <groupId>some.group</groupId>
                <artifactId>artifact_i_call_B</artifactId>
                <version>1.0</version>
                <includeScope>provided</includeScope>
              </dependency>
            </dependencies>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
Docs for this plugin may only exist in the archive at this point, not sure what happened to it:
http://web.archive.org/web/20130826193436/http://evgeny-goldin.com/wiki/Copy-maven-plugin
The place where I work uses Maven and we have lots of internal libraries. We try to make changes in a backward-compatible manner, but sometimes one of our libraries requires a newer version of another library. That can cause problems if an end product doesn't end up pulling in the newer library version.
For each end product, we have a dependencyManagement section that declares which versions of transitive dependencies should be used for the project. We do this instead of letting Maven figure out which version to use because we want control of which versions of libraries are being used.
Even if we let maven figure out which versions of libraries should be used, it's possible that the older library may be used, which could cause ClassNotFoundExceptions, etc...
Is there a Maven plugin or way to determine if my project is using an older version of a dependency, when one of the project's dependencies requires a newer version?
There is mvn dependency:tree but we have lots of dependencies, and I don't want to have to look through the giant list by hand.
Thanks!
EDIT
Since we have lots of libraries, if the end product uses libraries A, B, and C, and both A and B use a different version of C, the latest version of C is not always used. From Introduction to the Dependency Mechanism:
Dependency mediation - this determines what version of a dependency
will be used when multiple versions of an artifact are encountered.
Currently, Maven 2.0 only supports using the "nearest definition"
which means that it will use the version of the closest dependency to
your project in the tree of dependencies. You can always guarantee a
version by declaring it explicitly in your project's POM. Note that if
two dependency versions are at the same depth in the dependency tree,
until Maven 2.0.8 it was not defined which one would win, but since
Maven 2.0.9 it's the order in the declaration that counts: the first
declaration wins.
Since we almost always make changes in a backward-compatible manner, it looks like the best solution for us is to ensure that the latest versions of dependencies are used. The Maven Enforcer plugin with the "Require Upper Bound Dependencies" feature accomplishes this.
From http://maven.apache.org/enforcer/enforcer-rules/requireUpperBoundDeps.html:
This rule requires that the version for each dependency resolved
during a build, is equal to or higher than all transitive dependency
declarations. The version of each dependency resolved during the build
will normally be the version specified in the POM or the version with
the least transitive steps (the "nearest" definition). For more
information about Maven dependency resolution, see the Maven site.
Here is a concrete example. This will cause a build to fail:
<dependencies>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.4.0</version>
  </dependency>
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>0.9.9</version>
    <!-- Depends on org.slf4j:slf4j-api:1.5.0 -->
  </dependency>
</dependencies>
Because the project will run logback-classic 0.9.9 with slf4j-api
1.4.0 and slf4j-api 1.4.0 is probably not forwards compatible with slf4j-api 1.5.0.
This is the log message:
Failed while enforcing RequireUpperBoundDeps. The error(s) are [
RequireUpperBoundDeps error for org.slf4j:slf4j-api:1.4.0 paths to dependency are:
+-test:TestParent:1.0-SNAPSHOT
  +-org.slf4j:slf4j-api:1.4.0
and
+-test:TestParent:1.0-SNAPSHOT
  +-ch.qos.logback:logback-classic:0.9.9
    +-org.slf4j:slf4j-api:1.5.0
]
We're probably going to set up this plugin in a profile which we can turn on or off.
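For reference, a minimal sketch of such a profile, reusing the rule and plugin from the documentation quoted above (the profile id and version are illustrative):
<profile>
  <id>enforce-upper-bound-deps</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <version>1.4.1</version>
        <executions>
          <execution>
            <id>require-upper-bound-deps</id>
            <goals>
              <goal>enforce</goal>
            </goals>
            <configuration>
              <rules>
                <!-- fails if a resolved version is lower than one required transitively -->
                <requireUpperBoundDeps/>
              </rules>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
The check would then be switched on per build with mvn verify -Penforce-upper-bound-deps.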
I'm not sure there's a plugin that will do as you ask. In general, a newer version will be the one you'll settle on, but that's not always a hard rule - sometimes you'll have a newer version pulled in but an older version is actually what you require.
In my experience, you have to treat each dependency problem in isolation.
The way we handle this is by a combination of the <dependencyManagement/> section, as you suggest, along with the Maven Enforcer plugin and its <dependencyConvergence/> rule.
The dependencyConvergence rule:
...requires that dependency version numbers converge. If a project has
two dependencies, A and B, both depending on the same artifact, C,
this rule will fail the build if A depends on a different version of C
than the version of C depended on by B.
Basically, if you've got 2 different versions of the same lib in your dependency tree, then the validate phase fails.
So:
The way I do it: we usually have multi-module Maven projects, and I try to define common dependency versions in the <dependencyManagement/> section at the parent level (e.g. Spring, Jersey, etc.). Then, at the child module level, I define specialist <dependencyManagement/> sections as needed.
In all our poms we define the Maven Enforcer plugin configured with the <dependencyConvergence/> rule as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.3.1</version>
  <executions>
    <execution>
      <id>enforce-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <rules>
      <DependencyConvergence/>
    </rules>
    <fail>true</fail>
  </configuration>
</plugin>
Then on every Maven build, if someone's added a new dependency (or there's a new transitive dependency) that clashes with another, your build will fail, and you'll get a (slightly repetitive) log of the dependency issues for you to deal with.
As an end note, I don't see the Maven Enforcer plugin used much, but it has a load of really useful rules as well as the ability to create custom rules.
If it really was the newest version of a dependency you always wanted, I'm sure you could write a simple rule to do it.
Hope this helps,
Will
I created a project with Spring, Hibernate & Maven. My question is: what is the logic behind a plugin versus a dependency?
Both plugins and dependencies are Jar files.
But the difference between them is that most of the work in Maven is done using plugins, whereas a dependency is just a Jar file which will be added to the classpath while executing the tasks.
For example, you use the compiler plugin to compile your Java files. You can't use the compiler plugin as a dependency, since that would only add the plugin to the classpath and would not trigger any compilation. The Jar files to be added to the classpath while compiling a file are specified as dependencies.
The same goes for your scenario. You have to use the Spring plugin to execute some Spring executables [I'm not sure what spring-plugins are used for; I'm just taking a guess here]. But you need dependencies to execute those executables. And JUnit is tagged as a dependency since it is used by the surefire plugin for executing unit tests.
So, we can say: a plugin is a Jar file which executes the task, and a dependency is a Jar file which provides the class files to execute the task.
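To make the contrast concrete, here is a minimal sketch of the two declarations side by side in a pom.xml (the versions are illustrative):
<!-- plugin: performs work during the build; it is not packaged with your application -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
    </plugin>
  </plugins>
</build>

<!-- dependency: classes made available on the classpath for your own code -->
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
  </dependency>
</dependencies>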
Hope that answers your question!
Maven itself can be described as a food processor which has many different units that can be used to accomplish different tasks. Those units are called plugins. For example, to compile your project Maven uses maven-compiler-plugin, to run tests maven-surefire-plugin, and so on.
A dependency, in terms of Maven, is a packaged piece of classes that your project depends on. It can be a jar, war, etc. For example, if you want to be able to write a JUnit test, you'll have to use JUnit annotations and classes, thus you have to declare that your project depends on JUnit.
Plugins and dependencies are very different things, and they are complementary.
What are plugins?
Plugins perform tasks for a Maven build. These are not packaged in the application.
These are the heart of Maven.
Any task executed by Maven is performed by plugins.
There are two categories of plugins, the build and the reporting plugins:
Build plugins will be executed during the build and they should be configured in the <build/> element from the POM.
Reporting plugins will be executed during the site generation and they should be configured in the <reporting/> element from the POM.
According to the Maven goal specified on the command line (for example mvn clean, mvn clean package or mvn site), a specific lifecycle will be used and a specific set of plugin goals will be executed.
There are three built-in build lifecycles: default, clean and site. The default lifecycle handles your project deployment, the clean lifecycle handles project cleaning, while the site lifecycle handles the creation of your project's site documentation.
A plugin goal may be bound to a specific phase of a specific lifecycle.
For example the maven-compiler-plugin binds by default the compile goal to the lifecycle phase: compile.
Most Maven plugins (both core plugins and third-party plugins) favor convention over configuration, so they generally bind a plugin goal to a specific phase to make their usage simpler.
This is neater and less error-prone:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.7.0</version>
</plugin>
than:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.7.0</version>
  <executions>
    <execution>
      <phase>compile</phase>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
What are dependencies?
Dependencies are Maven artifacts/components required for the project.
Concretely, most dependencies are jars (that is, libraries), but they may also be other kinds of archives (war, ear, test-jar, ejb-client, ...) or even a POM or BOM.
In a pom.xml, dependencies may be specified in multiple places: the project's <dependencies> element, the <dependencyManagement> section, or even in a plugin declaration! Indeed, some plugins may need to have some dependencies on the classpath during their execution. That is not common, but it may happen.
Here is an example from the documentation that shows how a plugin and a dependency may work together:
For instance, the Maven Antrun Plugin version 1.2 uses Ant version
1.6.5, if you want to use the latest Ant version when running this plugin, you need to add <dependencies> element like the following:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <version>1.2</version>
        ...
        <dependencies>
          <dependency>
            <groupId>org.apache.ant</groupId>
            <artifactId>ant</artifactId>
            <version>1.7.1</version>
          </dependency>
          <dependency>
            <groupId>org.apache.ant</groupId>
            <artifactId>ant-launcher</artifactId>
            <version>1.7.1</version>
          </dependency>
        </dependencies>
      </plugin>
    </plugins>
  </build>
  ...
</project>
In Maven, dependencies are referenced in a specific format:
groupId:artifactId:packaging:classifier:version.
The classifier (which is optional) and the packaging (JAR by default) are not commonly specified, so the common format of a dependency declaration is rather: groupId:artifactId:version.
Here is an example of a dependency declared in the project's <dependencies> element:
<dependencies>
  <dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>5.2.14.Final</version>
  </dependency>
</dependencies>
A dependency doesn't have a phase binding like plugins do, to address the "when" question.
But it has a counterpart: the scope.
Indeed, declared dependencies are usable by the application at specific times, according to the scope we defined for them.
The scope is a central concept governing how a dependency will be visible to the project.
The default scope is compile. That is the most commonly needed scope (convention over configuration again).
The compile scope means that the dependency is available in all classpaths of the project.
The scope defines in which classpaths the dependency should be added.
For example, do we need it at compile time and runtime, or only for test compilation and execution?
For example, we previously defined Hibernate as a compile dependency because we need it everywhere: source compilation, test compilation, runtime and so on.
But we don't want testing libraries to be packaged in the application or referenced in the source code, so we specify the test scope for them:
<dependencies>
  <dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-engine</artifactId>
    <version>5.1.0</version>
    <scope>test</scope>
  </dependency>
</dependencies>
One-line answer - basic understanding
A plugin is a tool you use at the execution of your Maven build.
A dependency is any kind of library which you will use in your code.
If you're coming from a front-end background like me, and are familiar with Grunt and npm, think of it like this:
First you would run, say, npm install grunt-contrib-copy --save-dev. This is like maven's <dependency></dependency>. It downloads the files needed to execute a build task.
Then you would configure the task in Gruntfile.js
copy: {
  main: {
    src: 'src/*',
    dest: 'dest/',
  },
}
This is like Maven's <plugin></plugin>. You are telling the build tool what to do with the code downloaded by npm / <dependency></dependency>.
Of course this is not an exact analogy, but close enough to help wrap your head around it.
Plug-ins are used for adding functionality to Maven itself (like adding Eclipse support or Spring Boot support to Maven, etc.). Dependencies are needed by your source code to pass any Maven phase (compile or test, for example). In the case of JUnit: since the test code is basically part of your code base, and you call JUnit-specific commands inside test suites which are not provided by the Java SDK, JUnit must be present when Maven is in the test phase, and this is handled by mentioning JUnit as a dependency in your pom.xml file.
In simple words:
Plugins are used to add some additional features to the software/tools (like Maven). Maven will use the added plugins at build time, when we run the build command.
Dependencies are used to add some additional code to your source code; a dependency makes extra code (like classes in Java) available to your source code in the form of a library.
Maven at its heart is a plugin execution framework, as per its formal, compact definition. To make it clearer: the commands you run, like mvn install, clean, compile or package, for creating/executing jars, are all carried out by plugins. So the things which you want to use in your code you put in the dependency tags of Maven's POM, and the answer to who will run these dependencies (required for environment setup) is the plugins.
Analogy: javac (the compiler) is the plugin; dependency.java is the dependency.
A plugin is an extension to Maven, something used to produce your artifact (maven-jar-plugin, for example, is used to, you guessed it, make a jar out of your compiled classes and resources).
A dependency is a library that is needed by the application you are building, at compile time and/or test time and/or runtime.