Is it possible to deploy an artifact (.ear) into an application server (AS) without obtaining its dependencies from a repository?
Let me explain: the Maven project I'm trying to configure for deployment into an AS has 3 modules:
Web (.war - front end)
EJB (.ejb - back end)
Entity (.jar - entity classes)
These modules are wrapped into an EAR module, and none of them is available in any repository (like Nexus or JFrog Artifactory). When I try to use the Cargo Maven Plugin or the JBoss Deployment Maven Plugin, both report that they cannot resolve the dependencies for these modules.
UPDATED (03/01/2019)
The issue is similar to that quoted in items 6 and 7 of the following link: http://webdev.jhuep.com/~jcs/ejava-javaee/coursedocs/content/html/ejb-basicex-eardeploy.html#ejb-basicex-eardeploy-testmodule
It's a workaround, but it worked. Instead of making the project depend on an internal repository (like Nexus or JFrog Artifactory), it's possible to define a folder on the local machine as a repository using Maven's -Dmaven.repo.local parameter. The plugin that deploys the artifact can then use the same property to obtain the other artifacts.
That is, to build the application using the current folder as the local repository:
mvn -Dmaven.repo.local=. package
To deploy the application (the .ear, in this case) with the Cargo Maven Plugin, for example, without depending on an internal repository:
mvn -pl app-ear/ -Dmaven.repo.local=. cargo:redeploy
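For reference, the Cargo plugin still needs to know which container to deploy to. A minimal sketch of a remote-container configuration in the app-ear POM might look like the following; the containerId, hostname, and credentials are placeholders and depend on your application server:
<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <!-- remote container: the EAR is pushed to an already running server -->
    <container>
      <containerId>wildfly10x</containerId>
      <type>remote</type>
    </container>
    <configuration>
      <type>runtime</type>
      <properties>
        <!-- placeholder host and credentials for the target application server -->
        <cargo.hostname>as.example.com</cargo.hostname>
        <cargo.remote.username>deployer</cargo.remote.username>
        <cargo.remote.password>secret</cargo.remote.password>
      </properties>
    </configuration>
  </configuration>
</plugin>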
Note: with the maven.repo.local property, the folder given as its value will be filled with all of the project's dependencies. In my case this isn't a problem, because these commands are used in a continuous integration pipeline and all files and folders are discarded at the end.
Related
I am building a Maven project within a Docker container as a TeamCity job configuration.
(This is necessary because the Maven project builds a JNA library -- so it must be built on a specific distro.)
At the end of the Docker container run, I'm left with the target folder of the Maven module, which contains the JAR and associated files (class files etc.).
I'm now stumped on how to get this JAR published to Artifactory. All the TeamCity integration seems to assume the JAR was built with the Maven runner specifically.
Usually, you use mvn deploy to build and deploy an artifact with Maven. It is transferred to the Maven repository that you specified in your distributionManagement.
With Artifactory, you can also use the artifactory-maven-plugin for deployment.
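For example, a minimal distributionManagement section could look like the following; the ids and URLs are placeholders for your own Artifactory repositories, and the matching credentials go into <server> entries with the same ids in settings.xml:
<distributionManagement>
  <!-- release and snapshot repositories hosted by your Artifactory instance -->
  <repository>
    <id>artifactory-releases</id>
    <url>https://artifactory.example.com/artifactory/libs-release-local</url>
  </repository>
  <snapshotRepository>
    <id>artifactory-snapshots</id>
    <url>https://artifactory.example.com/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>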
I am working on a project that uses Maven for build and Artifactory as Maven repository where the build is published. All works fine when there is just one set of configuration - build is created and published to Artifactory.
I need to be able to create a single WAR that contains necessary binaries and one additional artifact for each environment that contains environment specific resources (MongoDB connection URL and log4j2 xml). Deployment job in Jenkins should deploy WAR and environment specific resource JAR to the server.
I am stuck at creating and publishing the artifacts. Approaches I tried and rejected are:
Maven profiles - rejected as it creates a separate WAR file for each environment. I find it illogical to create WAR files of a few tens of MBs just to change a few configuration values.
Spring profiles - rejected as it solves a deployment problem in code, by deploying all configurations to all servers and relying on a profile set in the environment plus a code change to read configuration based on the configured profile name.
Maven resources plugin - can be used to copy environment specific resources in appropriate directory structure; but does not get published to Artifactory during "deploy" phase.
Maven Jar plugin - can create attached JAR artifact, but attached artifacts cannot contain resource (or at least I could not figure out a way to include resources in attached JAR)
Maven build helper plugin - can publish individual files as attached artifacts, but file names are changed when deploying to Artifactory.
Multimodule POM - this approach could work to create resource artifact for each environment but has 2 disadvantages:
Updating version of main artifact needs update to all POMs (easy to miss out)
Not scalable - need to create separate POM if new environment is added.
It seems that Maven + Artifactory is not geared towards multi-environment scenarios, as there simply does not seem to be any straightforward solution. Am I missing something? What approach should I take?
Update
I solved the problem by using https://github.com/khmarbaise/multienv-maven-plugin. This lets me create one WAR and multiple JAR files:
- myapp.0.1.0.war
- myapp.0.1.0-dev.jar
- myapp.0.1.0-qa.jar
- myapp.0.1.0-prod.jar
I am stuck at the next step. The install phase installs the JARs as WARs, and the deploy phase subsequently uploads them to Artifactory as WARs. Is there any way to keep the packaging type as JAR for the JARs?
Situation
I have a multi-module Maven project. It contains several JAR artefacts and is then assembled into a WAR file. Thus, the WAR artefact depends on all kinds of JAR artefacts (it also has a WAR overlay), most of them with scope "compile".
Build and deployment to a repository are fine. But when I try to retrieve the WAR artefact, I have issues. Previously, I used a simple wget to retrieve it from the Nexus API, but I wanted to try the Jenkins Repository Connector - not least because it actually shows a list of available versions.
I configure a repository in
Manage Jenkins -> Configure System -> Artifact Resolver
with the URL for our repo:
http://$NEXUS/nexus/content/repositories/releases/
Then in the job, I add a parameter:
Maven Repository Artifact
and use the repository configured above. Then I add
Artifact Resolver
as a build step and set it up.
Problem
I am not even sure on which side this should be solved: when I run the job to try to get the WAR file from the Nexus, it also starts trying to retrieve all kinds of transitive dependencies (some of which are inaccessible to this user) and fails. What I need is just the WAR file - no transitive dependencies, since they're already packaged inside the WAR.
The Repository Connector plugin doesn't seem to have a switch for this, and on the Maven side it's probably perfectly fine for the published POM to list those dependencies.
Question
What can I do to either stop the repository connector from retrieving transitive dependencies or retrieve the WAR artefact in a different way? Also interesting for me (but a bit broad as a question) would be general ideas about doing this kind of workflow. E.g., does anyone use other ways of deploying the WAR into their Nexus?
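One alternative worth sketching, in case the plugin route stays blocked: the maven-dependency-plugin's get goal can fetch a single artifact with transitive resolution switched off. The coordinates below are placeholders:
mvn dependency:get -Dartifact=com.example:my-webapp:1.2.3:war -Dtransitive=false -DremoteRepositories=http://$NEXUS/nexus/content/repositories/releases/
If the job needs the file outside the local repository, dependency:copy should accept the same -Dartifact coordinates together with an -DoutputDirectory, though I'd verify that against the plugin documentation for the version you use.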
I submitted a patch to the Repository Connector plugin.
My fork:
https://github.com/rmalchow/repository-connector-plugin
Working on getting it merged:
https://github.com/jenkinsci/repository-connector-plugin/pull/10
I am lacking some basic understanding of using a repository manager for our projects. What I don't understand is why, if I use a repository manager and run a local install command, Maven doesn't deploy the package to something like a shared Nexus instance. I seem to have some confusion between local repositories and shared ones when using a repository manager.
Apologies for the naivety and for not testing this myself. We have started versioning our application and using a shared-file-system approach to distributing artifacts, and we are left with a few questions about what, within the scope of what we are currently doing, we would gain by using a repository manager instead. We use TeamCity as a build server, which deploys to that file system. I need answers to a few questions before doing a PoC with a repository manager.
install is specific to the local repository.
From Maven's point of view whether a remote repository is hosted by your repository manager or is completely external has no relevance - when you're adding your artifact to any kind of remote repository, you need to use the deploy plugin (or release for non-trivial deployments).
Repository managers usually generate instructions on configuring your projects for deployment to a hosted repo.
Maven defines a lifecycle with phases (clean, compile, install, deploy, ...). Each phase has default plugin bindings, so when you execute "mvn install" Maven knows which plugins to run for that phase.
The introduction page gives a good overview of what happens in each phase and what the default plugin bindings are: https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
In your case: mvn install will copy the artifacts into the local Maven repository so they can be shared by other projects on your machine.
If you want to share artifacts with other developers in other locations, "mvn deploy" will copy the artifacts to the remote repository. Note that you need to configure the distributionManagement section in the pom.xml to be able to do that.
The normal maven setup should look like this:
project -> local repository -> private remote repository -> public remote repository
Project: In the simplest case your project consists of source files and a configuration file (pom.xml). The project may depend on third-party libraries like JUnit. The JAR files of those libraries are not stored in your project directory, only the information about which JARs are needed.
mvn package
This command creates a JAR out of your project and places it in the target/ folder of your project.
Local Repository: This is a Maven repository stored locally on your machine. It normally resides in ~/.m2/repository/. Every dependency you use in your project will be stored in this repository. When compiling your project, Maven will use the JAR files from this location.
mvn install
This command creates a JAR file and copies it to your local repository: ~/.m2/repository/groupId/artifactId/version/project.jar. Now you can use this JAR as a dependency in different independent projects, but only on your machine.
Private Remote Repository: Most of the time this is a Nexus instance in your company network. This server lets you share built projects across developers. Your TeamCity server builds the JAR and copies it to your Nexus server. Beyond that, the Nexus server also works as a proxy: e.g. a developer needs junit-4.1.1.jar, so the server fetches it from public remote repositories and caches it.
mvn deploy
This command builds the JAR and sends it to your Nexus server (your private remote repository). After that, every developer inside your company network can access the JAR.
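The credentials for that private remote repository normally live in ~/.m2/settings.xml, in a server entry whose id matches the repository id from your distributionManagement section. The id and credentials below are placeholders:
<settings>
  <servers>
    <!-- the id must match the repository id used in distributionManagement -->
    <server>
      <id>company-nexus</id>
      <username>ci-user</username>
      <password>secret</password>
    </server>
  </servers>
</settings>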
Public Remote Repository: These are repositories available on the internet which contain several jar files, e.g. maven.codehaus.org
Summary:
If you call mvn compile, Maven looks for the dependencies in your local repository. If it can't find them there, it asks the (private/public) remote repositories and copies the files into the local repository.
You should not synchronize a local repository over network, since this type of repository is not targeted at such use and may break in some obscure way.
What you need is mvn deploy - it copies the final package to the remote repository for sharing with other developers and projects.
The mvn install you tried will just build and install the project into your local ~/.m2 repository. It will not publish the artifacts to the Nexus repository you have configured.
Both install and deploy are build phases - meaning that running either one executes all the preceding phases. Please refer to the Maven docs below for more details.
From the Maven documentation:
the default Maven lifecycle has the following build phases (for a complete list of the build phases, refer to the Lifecycle Reference):
validate - validate the project is correct and all necessary information is available
compile - compile the source code of the project
test - test the compiled source code using a suitable unit testing framework. These tests should not require the code be packaged or deployed
package - take the compiled code and package it in its distributable format, such as a JAR.
integration-test - process and deploy the package if necessary into an environment where integration tests can be run
verify - run any checks to verify the package is valid and meets quality criteria
**install** - install the package into the local repository, for use as a dependency in other projects locally
**deploy** - done in an integration or release environment, copies the final package to the remote repository for sharing with other developers and projects.
Main goal: deploy a project both as a JAR and as an Eclipse plugin.
Current state: the project builds fine as a JAR package.
Now I want to create a second project which wraps the JAR project as an Eclipse plugin:
use the tycho-maven-plugin to create the Eclipse plugin
add the JAR of the original project (with copy-dependencies)
add an Activator
export the packages from the JAR
create a correct MANIFEST.MF
I tried to copy the JAR with copy-dependencies bound to generate-resources. This works as long as the JAR is found in a repository, but the local project gets ignored.
This results in a build failure, since the JAR is not found.
Is it possible to tell copy-dependencies to take the JAR from the target directory of the project? Or should I use some method other than Tycho?
Edit:
I solved my problem with 4 projects:
a normal project (nothing special here)
the wrapper project, using the tycho-maven-plugin and copy-dependencies.
copy-dependencies is bound to a phase before compile (e.g. generate-resources), excluding all artifactIds that are already declared as dependencies in the MANIFEST.MF (see the sketch after this list).
a prepare project, which builds the normal project and installs it into the repository. This is needed because the tycho-maven-plugin is bound to validate and it is not possible to call the exec plugin beforehand (at least not easily).
a multi-module project which builds the prepare project before the wrapper project.
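As a rough sketch of the copy-dependencies binding used in the wrapper project: the output directory and the excluded artifactIds below are placeholders, and the excludes should list whatever the MANIFEST.MF already declares as dependencies:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-wrapped-jar</id>
      <!-- run before compile, as described above -->
      <phase>generate-resources</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- placeholder output folder inside the wrapper project -->
        <outputDirectory>${project.basedir}/lib</outputDirectory>
        <!-- placeholder: artifactIds already declared in the MANIFEST.MF -->
        <excludeArtifactIds>manifest-declared-artifact</excludeArtifactIds>
      </configuration>
    </execution>
  </executions>
</plugin>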
Build your local project (the one whose artifact was missing) with "mvn install". It will be installed into your local repository ($USER_HOME$/.m2/repository). After this, the dependency should be resolved.
Alternatively, you can "mvn deploy" if you have a company Maven repository like Artifactory or Nexus.