Update site built by Tycho still contains erroneous dependency after re-build - maven

I have built an Eclipse update site with Tycho, but installing a feature from it into a target IDE fails.
The update site builds fine; I can see it from a target Eclipse installation and select the feature for installation. However, the dependency check at the start of the install fails because it can't find a declared dependency (org.eclipselabs.xtext.utils.unittesting). This shouldn't be a dependency at all: it was erroneously included in the MANIFEST.MF of one of my Eclipse plugin projects.
I removed the dependency from the manifest and ran mvn clean install again. The build reported success, but when I try to use the newly built update site, it still complains that the dependency on org.eclipselabs.xtext.utils.unittesting (a) exists and (b) can't be satisfied.
So the question is: What else do I need to do to remove the dependency from the generated update site?
Thanks for any pointers.
PS: I know I could add the site for o.e.x.u.unittesting to the target Eclipse installation so the dependency can be satisfied. However, I don't want to do that: it's not needed for the feature to work, and I don't want other users to have to add an unnecessary dependency.

Here is a list of cache locations that may have been involved in your scenario, and how to clear them:
Target folder: If the target folder contains results from a previous build, this data may be used by a Maven build to speed things up. Tycho doesn't make use of this feature, and AFAIK it shouldn't pick up anything existing from the target folder.
To be sure, always include the clean goal in your mvn calls.
Local Maven repository: In order to support builds of parts of a reactor, Tycho adds artifacts that have been built locally with mvn clean install to the target platform. If you are not aware of this feature, this can have various strange effects.
To avoid this, don't build with install unless you have to. Use mvn clean verify instead. Also: Deleting the file ~/.m2/repository/.meta/p2-local-metadata.properties resets what Tycho considers to be "locally installed".
Since Tycho 0.16.0, you can also disable this behaviour for one build through the command line switch -Dtycho.localArtifacts=ignore, or for all builds by setting the same property in your settings.xml (see the commands after this list).
p2: The p2 update manager in Eclipse caches p2 repositories it has used since the start of Eclipse.
To force p2 to reload a repository, go to Preferences > Install/Update > Available Software Sites, select a repository and hit Reload. The repositories will also be reloaded if you re-start Eclipse.
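To illustrate the local-repository advice above as commands (a minimal sketch; the metadata file path is the one named in this answer):
rm ~/.m2/repository/.meta/p2-local-metadata.properties    # forget what Tycho considers "locally installed"
mvn clean verify -Dtycho.localArtifacts=ignore            # Tycho 0.16.0+: build without consulting local artifacts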

Maybe I'm late to the game, but I still want to share my experience.
I'm using the p2-maven-plugin to convert plain jar files into OSGi bundles. It caches the converted jars in
~/.m2/repository/p2/osgi/bundle
Unless I change the version of my jar, the p2 plugin always loads the old bundle from that location.
Deleting the old bundle in that folder and rebuilding the project solved my problem.
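A sketch of that fix as commands (the path below the bundle cache is a placeholder; adjust it to your artifact):
rm -rf ~/.m2/repository/p2/osgi/bundle/<your-bundle>    # drop the stale converted bundle
mvn clean package                                       # p2-maven-plugin regenerates it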

Related

How to force mvn to redownload a snapshot

I have a Maven project (myApp) that depends on another Maven project in a snapshot version, like:
<dependency>
    <groupId>org.group.dep</groupId>
    <artifactId>arty</artifactId>
    <version>12.1.4-SNAPSHOT</version>
</dependency>
But I ran into a problem with this after "arty" got an update without a version change (I know changing the version would be the cleanest solution).
I built myApp locally and still got the old version of the "arty" dependency.
I verified two options that worked for me (and a colleague):
1) Manually cleaning the local repository: I navigated to ~/.m2/repository/org/group/dep/arty and deleted all folders inside. After rebuilding myApp locally it worked fine; arty was downloaded from artifactory.company.com again with the updated content.
2) Building the arty package locally so it got updated in the local repository. After rebuilding myApp locally it worked fine.
But I had a similar problem on Jenkins:
I have a Jenkins job that builds org.group.myApp without building org.group.dep.arty first. It failed because the changes from "arty" were missing.
What can I do to solve my problem there?
I cannot rely on building org.group.dep.arty first, as I cannot be sure Jenkins runs both jobs on the same host using the same local repository (and I don't want to change that).
The myApp job kept failing even after I manually cleared org.group.dep.arty from the repository on that Jenkins node and ran the myApp job again (somehow it did not re-download the package).
I finally found mvn -U, but when I tried it I was disappointed as well.
I tried different Maven versions on that Jenkins and got the same result.
Is there no way to force the update of snapshot versions?
Is this "another project" is a part of the same multi-module project?
If so you can build your project with --also-make options so that maven will effectively rebuild your module and all of its dependencies
If its an entirely different project, use mvn -U to forcefully download all the snapshot dependencies of your project.
If there is a particular issue with one concrete dependency consider using mvn dependency:get. This get goal of maven-dependency-plugin downloads one specific artifact from the remote repository
Here is a link to the plugin documentation
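For illustration, the options above as concrete commands (module and coordinates taken from the question; the -Dartifact shorthand assumes a reasonably recent maven-dependency-plugin):
mvn -pl myApp --also-make package    # same reactor: rebuild myApp plus the modules it depends on
mvn -U clean package                 # force re-resolution of all snapshot dependencies
mvn dependency:get -Dartifact=org.group.dep:arty:12.1.4-SNAPSHOT    # fetch one artifact directly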
The simplest solution to redownload a -SNAPSHOT is the command line option -U, or its long form --update-snapshots.
Furthermore, your project sounds like a candidate for a multi-module build, which prevents such issues. Alternatively, you could define those jobs as depending on each other (Jenkins has an option to trigger a build when a SNAPSHOT has been updated).

Trouble with maven in Netbeans

I want to create a Maven project in NetBeans, so I do File -> New Project -> Maven -> Java Application. After that I try to build the project and get this error:
The POM for org.apache.maven.plugins:maven-surefire-plugin:jar:2.10 is missing, no dependency information available.
But I can build this project from the command line with mvn compile. Could you tell me what the problem with NetBeans is?
NetBeans uses Maven 3.0.4 by default, unless you change that in the Tools/Options menu. Are you building with 3.0.4 on the command line as well, or with some earlier version (2.x)?
That would explain the behaviour, because 3.0.4 will not blindly rely on what artifact is in the local repository; some additional metadata is also consulted to make sure your project builds with the given set of defined repositories.
A common example of when this problem occurs for me:
I use central directly and everything downloads. When I later add a mirror, all artifacts are checked again through the mirror to make sure they are accessible. If the mirror doesn't actually mirror central, I get an error that way.
Another common example: when building with 2.x, the additional metadata is not written; when later building with 3.0.4, all remote content is checked no matter what is present in the local repo, and the additional metadata files are constructed.
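For illustration, the first case is triggered by a settings.xml mirror entry like this (the id and URL here are hypothetical); if the URL doesn't actually mirror central, artifacts that resolved fine before start failing the check:
<mirror>
    <id>company-mirror</id>
    <mirrorOf>central</mirrorOf>
    <url>https://mirror.example.com/maven2</url>
</mirror>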

Maven fails to find local artifact

Occasionally maven complains that a particular dependency, which is built and packaged locally, cannot be found in the local repository while building another project that has it as a dependency. We get an error like:
Failed to execute goal on project X: Could not resolve dependencies for project X: Failure to find Y in [archiva repository] was cached in the local repository, resolution will not be reattempted until the update interval of internal has elapsed or updates are forced ->
Where X is the project being built, and Y is the supposedly missing artifact. If you look in the local repository, the artifact is there. This artifact is never installed in our Archiva repository, so the problem lies purely in the local repository.
We have tried various profiles in settings.xml, and of course "mvn -U". Neither does any good, nor should it, because this artifact never goes any further than the local repository.
The only two things that seem to work are to wait a very long time until maven smartens up, or to completely delete the local repository. Presumably the waiting option is related to the aforementioned update interval.
We have experienced this problem with maven 3.0.2 and 3.0.3. We are using Archiva 1.0.3 (but again this shouldn't be a factor). Any help would be greatly appreciated.
The local Maven repo tracks where artifacts originally came from using a file named "_maven.repositories" in the artifact directory. After removing it, the build worked. This answer fixed the problem for me.
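A minimal sketch for removing those markers in bulk (same idea as the _remote.repositories command further down; only do this if you understand the consequences for remote resolution):
find ~/.m2/repository -name "_maven.repositories" -type f -delete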
As the options here didn't work for me, I'm sharing how I solved it:
My project has a parent project (with its own pom.xml) with many child modules, one of which (A) depends on another child (B). When I tried mvn package in A, it didn't work because B could not be resolved.
Executing mvn install in the parent directory did the job. After that, I could run mvn package inside A, and only then could it find B.
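As a sketch, with the module names A and B from above:
cd parent-project
mvn install      # installs all children, including B, into the local repository
cd A
mvn package      # B now resolves from the local repository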
Even in offline mode, maven will check remote repositories if there is a _remote.repositories marker for the dependency. If you need to operate in offline mode, you may need to delete these files.
The simple shell command below deletes these marker files. This is safe to do if you only use offline mode for the machine. I would NOT do this on a machine that needs to pull files down from the web.
I have used this strategy on a build server that is disconnected from the web. We have to transfer the repository to it, delete the marker files and then run in offline mode.
On Linux / Unix you can delete the remote repository marker files this way:
cd ~/.m2
find . -name "_remote.repositories" -type f -delete
Maven remembers when it didn't find something. The key is "resolution will not be reattempted until the update interval of internal has elapsed or updates are forced ->"
The quick solution is to delete the subdirectory for the problem artifact from your local repository - assuming you have fixed the problem with it. :)
mvn -U will force an update from the remote repository - again, assuming you have now populated the remote with said artifact.
When this happened to me, it was because I'd blindly copied my settings.xml from a template and it still had the blank <localRepository/> element. This means that there's no local repository used when resolving dependencies (though your installed artifacts do still get put in the default location). When I'd replaced that with <localRepository>${user.home}\.m2\repository</localRepository> it started working.
For *nix, that would be <localRepository>${user.home}/.m2/repository</localRepository>, I suppose.
If you have <repositories/> defined in your pom.xml, apparently your local repository is ignored.
Catch-all: when the solutions mentioned here don't work (as happened in my case), simply delete all contents of the .m2 directory and run mvn clean install.
I also faced this issue and solved it in two ways:
1) In your IDE, select all projects and clean them, then install the Maven dependencies: right-click a project, go to Maven -> Update Project, and select all projects at once. Once this is done, run the particular project.
2) Alternatively, check the pom.xml for the dependencies you are getting the error for, run "mvn clean install" on those dependent projects first, and then install the Maven dependencies of the project you are having trouble with. This builds the local projects' dependencies and creates their jars.
I ran into a similar problem when a new project of mine depended on an Oracle JDBC jar (which I had installed in my local repository, and which worked well for other projects). I tried the -U option and deleting the .lastUpdated files or the whole directory and downloading again, but it did not work. Finally, I deleted the directory and installed the jar locally again, and it worked.
One of the errors I found with Maven was putting my settings.xml file in the wrong directory. It has to be in the .m2 folder under your user home directory. Check to make sure it is in the right place (along with settings-security.xml, if you are using that).
I got a DependencyResolutionException on Ubuntu Linux after I had installed local artifacts via a shell script. The solution was to delete the local artifacts and install them again "manually", calling mvn install:install-file from a terminal.
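For reference, a minimal install:install-file invocation (the file path and coordinates here are placeholders):
mvn install:install-file -Dfile=path/to/artifact.jar \
    -DgroupId=com.example -DartifactId=artifact \
    -Dversion=1.0 -Dpackaging=jar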
In my case this happened because I had http instead of https in this repository declaration (shown here already corrected):
<repository>
    <id>jcenter</id>
    <name>jcenter-bintray</name>
    <url>https://jcenter.bintray.com</url>
</repository>
Check whether your artifact Y has its packaging set to "jar". If you defined it as "war" by mistake or copy-paste, you will get this strange "was cached in the local repository, resolution will not be reattempted until the update interval of internal has elapsed or updates are forced" message. I would expect something like "artifact Y is a war, jar type expected" instead.
In my case I needed project Y to be a WAR so it could be deployed through Tomcat, as well as a JAR so I could add it as a dependency in project X.
So in project Y's pom.xml, I added this plugin to create a JAR along with the WAR:
<plugin>
    <artifactId>maven-war-plugin</artifactId>
    <version>3.2.2</version>
    <configuration>
        <attachClasses>true</attachClasses>
        <classesClassifier>classes</classesClassifier>
    </configuration>
</plugin>
And while adding the dependency of project Y in project X's pom.xml, I had to add a classifier:
<dependency>
    <groupId>groupId.of.project.Y</groupId>
    <artifactId>project.Y</artifactId>
    <version>1.0-SNAPSHOT</version>
    <classifier>classes</classifier>
</dependency>
Note: when you build project Y, you will see two artifacts in the target folder, project-Y.war and project-Y-classes.jar; that's why you specify the classes classifier when importing, to get the JAR and not the WAR.
Here is the long solution to the problem
(not a quick fix, but it will work if nothing else does).
You're going to hate me for saying this, but this is the truth about open-source projects like Eclipse. Because open source is modular, it lets you build and develop a project in many ways with many tools, such as Maven, Spring Boot, XML or Groovy configuration, different Eclipse updates, etc. The problem is that Eclipse lets you run a project with missing Maven builds, because the IDE is smart enough to resolve dependencies from a repository where it stores and caches the jar files that were not properly built in the project.
Because of this feature, you may actually have local build issues, but, much like a DNS server, if the dependency is not found in the local directory, Eclipse will look for it in its cached repository. When you delete that cached repository and let Maven rebuild it, the project may end up producing more errors and not building a second time, or it may rebuild a cache entry that was missing; but that is unlikely.
So, the long answer to fixing your problem:
This is a project architecture issue!
SOLUTION:
What you need to do is look into every dependent project's pom.xml file and the Maven dependencies folder in your local project, and try to resolve all the missing dependency jars in your Maven dependencies folder. If you have a referenced library, I suggest moving those jars into your local project's Maven dependencies folder.
Work your way through every child project, then navigate to your root project and fix every single project using Maven -> Build -> clean install (check off "skip tests" and "resolve workspace artifacts") until every project builds with a clean success.
Most likely, when you force-update your entire solution across all your projects, you will get a list of errors that the IDE can auto-resolve; the auto-resolve offers an easy reference for fixing each issue. But to deploy, you have to fix the projects manually, because although Eclipse, Spring and Maven work well together, there are a few things they don't agree on, so you have to play diplomat in those situations and figure it out.
That's the sad truth.
That said, here is the list of problems in my own project. I have this very issue: the generated WAR file has empty jar folders, and the build is not clean and error-free unless I force it. The generated WAR returns a 404 error on the production Tomcat server, and my Angular application throws a CORS error when calling the API.
All the errors in my front-end project are secondary, because the root of all the issues is the generated WAR file. It was not generated with its dependencies, the main project did not execute in Tomcat, and Tomcat could not run the Spring initializer to apply the CORS policy that would allow my Angular application to communicate. All in all, though, the development environment works fine with no issues.
So that is my long-winded answer for this thread.
I had the same error from a different cause: I'd created a starter POM containing our "good practice" dependencies, and built and installed it locally to test it. I could "see" it in the repo, but a project that used it got the above error. What I'd done was set the starter POM's packaging to pom, so there was no JAR. Maven was quite correct that it wasn't in Nexus, but I wasn't expecting it to be, so the error was, ummm, unhelpful. Changing the starter POM to normal packaging and reinstalling fixed the issue.
In my case I had to add mavenLocal() to the repositories block of my root-level Gradle build:
repositories {
    mavenCentral()
    mavenLocal()
}

Best practice wrt. `mvn install`, multi-module projects, and running one submodule

I tend to avoid using mvn install in my multi-module projects, because I feel I then don't know exactly which version of a submodule is used when building/launching other submodules (particularly when switching between branches very often).
I tend to use mvn package a lot and then mvn verify.
I'm now facing this issue in a FOSS project (a Maven archetype, moreover) where I'd like to follow Maven best practices.
It's a multi-module project with a webapp submodule depending on the other modules, and what worries me is the ease of development along with mvn jetty:run (or jetty:start).
Currently, I defined 2 profiles:
prod, the default one, declares dependencies on the other submodules;
dev on the other hand does not depend on the other modules, and configures the jetty-maven-plugin by adding the other modules' output directories as extraClasspath and resourcesAsCSV.
That way, I can mvn package once and then cd webapp && mvn jetty:start -Pdev and quickly iterate, reloading the webapp without the need to even stop the server.
AFAICT, extraClasspath was added for that exact purpose (JETTY-1206).
I've been pointed at the tomcat7-maven-plugin, which can resolve modules from the reactor build when using Maven 3 (and I raised an issue to bring the same to Jetty: JETTY-1517), but that hardly solves my problem:
if I hadn't removed the dependency on the other submodules in the dev profile, I'd have had to run mvn install first so that validating the POM doesn't fail, even though jetty:start doesn't use those dependencies afterwards.
So here's my question: is mvn install really that common? Or is my approach of putting the intra-reactor dependencies only in the prod profile OK?
(note that I have the exact same problem with the gwt-maven-plugin, so please don't tell me to simply switch to Tomcat; that wouldn't even work actually, details here)
mvn install is common, particularly in relation to multi-module builds, because it gives you the chance to run a single module from your multi-module build.
This can be achieved by using:
mvn -pl submodule <lifecycle-phase>
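For example (assuming the webapp submodule from the question, and that its sibling dependencies have been installed once from the root):
mvn install                  # once, from the root, so sibling modules land in the local repo
mvn -pl webapp jetty:start   # then iterate on the webapp module alone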
I just found a workaround (which seems logical as an afterthought): https://jira.codehaus.org/browse/JETTY-1517?focusedCommentId=306630&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-306630
In brief: skip the plugin by default in the parent module then re-enable it where needed.
This however only works if the plugin can be skipped (i.e. has a skip configuration) and is only used in one specific submodule, and it has to be done selectively for each plugin you need/want to run that way (in my case, jetty:run and gwt:run); see the sketch below.
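A sketch of that workaround, assuming the plugin exposes a skip flag, as the jetty-maven-plugin does (the groupId shown is an assumption for the plugin version in use):
<!-- parent pom.xml: skip the plugin everywhere by default -->
<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-maven-plugin</artifactId>
    <configuration>
        <skip>true</skip>
    </configuration>
</plugin>
<!-- webapp/pom.xml: re-enable it in the one module that uses it -->
<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-maven-plugin</artifactId>
    <configuration>
        <skip>false</skip>
    </configuration>
</plugin>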
I do most of my development on my laptop. For the projects I'm currently working on, my local repository is really more of a temporary holding area. I run mvn install all the time. Putting artifacts in one's local repo is the only way I know of to share built artifacts between projects, especially if you are working on projects which are related but are not (and should not be) part of the same multi-module build.
When I'm done developing I commit changes to the shared SCM and let Jenkins build & deploy the code to the shared remote repo. Then I either blow away the changed projects in my local repository so the next build brings down the freshly built artifacts, or I run Maven with -U to force updates.
This works well for me, YMMV.

Maven without (remote) repository?

I have a Maven 2 multi-module project and want to be sure everything is taken from my locally checked-out sources.
Is it possible to tell Maven to never download anything for the modules it has the source of? Do I have to disable the remote repositories?
Does Maven always have to go the expensive way of installing a module into the local repository, and then extracting it again for each of its dependents?
Does Maven automatically first recompile dependencies for a module if their local source changed, and then compile the dependent?
Is it possible to tell Maven to never download anything for the modules it has the source of?
No. Maven 2 only "sees" the current module while it builds. On the plus side, you can build part of the tree by running Maven in a module.
Do I have to disable the remote repositories?
Yes, use the "offline" option -o or -offline. Or use settings.xml with a proxy that doesn't have any files. This isn't what you want, though.
Does Maven always have to go the expensive way of installing a module into the local repository, and then extracting it again for each of its dependents?
Yes, but it's not expensive. During the build, the file is copied (that was expensive ten years ago). When a dependency is used, Maven just adds the path to the file to the Java process, so the file isn't copied or modified again. Maven assumes that files in the local repository don't change (or change only once, when a download/install happens).
Does Maven automatically first recompile dependencies for a module if their local source changed?
No. There were plans for Maven 3 but I can't find an option to enable something like that.
To solve your issues, you should install a local proxy (like Nexus).
Maven downloads stuff (dependencies) only if it's not already available in your local repository ($USER_HOME/.m2/repository). If you do not want anything to be downloaded, use offline mode. This can be done with the -o switch, e.g.:
mvn -o clean install
There is nothing expensive about it. If you build the complete parent project, it will build all the modules and then copy the artifacts to your local repository. Then, when you build a project that depends on those projects, Maven just copies them from the local repository on your hard disk into the package being created for the current project.
No, I have been burnt: Maven does not compile dependencies automatically. There is a plugin called the Maven Reactor Plugin, which enables you to build a project's dependencies before the project itself is built.
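As a sketch, Maven's built-in reactor options (available since Maven 2.1; the module name here is hypothetical) give the same effect when run from a multi-module root:
mvn -pl webapp --also-make install    # build webapp plus every reactor module it depends on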
