I have a lot of jar files on my local hard drive and I want to use them as a repository so that I no longer depend on an internet connection.
I installed Archiva but I don't know how to deploy the jar files to it. There is a UI form which does this, but deploying a huge number of jar files by hand is tedious and wastes a lot of time.
How can I use my local jar files as a repository so that I can use Maven (or Ivy) to manage dependencies?
You can upload them with mvn deploy:deploy-file, which you could use in a script for a bulk upload (discussed here, and here's an example script).
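A minimal sketch of such a bulk-upload script (the repository URL, repository ID and jar directory are placeholders, and it assumes each jar has a matching .pom file next to it):
#!/bin/bash
# Deploy every jar under a local directory to Archiva via deploy-file.
REPO_URL=http://localhost:8080/repository/internal   # placeholder Archiva URL
REPO_ID=archiva.internal                             # must match a <server> id in settings.xml
find /path/to/local/jars -name "*.jar" | while read jar; do
  mvn deploy:deploy-file \
    -Durl="$REPO_URL" \
    -DrepositoryId="$REPO_ID" \
    -Dfile="$jar" \
    -DpomFile="${jar%.jar}.pom"
done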
As an alternative, if you want to use those local files directly, you can copy them into the storage location of an Archiva repository. From this thread:
One way to put all your local .m2 repository content into Archiva is to copy from
~/.m2/repository to
${archiva-install-dir}/data/repository/${managed-repository-name}/
...
Once the copy is done, you can force an Archiva database scan on the managed repository.
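As a rough one-liner, assuming the default managed repository is named internal:
cp -r ~/.m2/repository/* ${archiva-install-dir}/data/repository/internal/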
Our project is using Nexus Repository Manager to store all the jars.
Along with the jar, I see that under a group there are other files like the pom.xml, .md5, and .sha1 files. I need these files at our server startup.
Is there a way that I can download all the files under a particular group programmatically, using Java, a curl command, or an mvn dependency command, at runtime?
Maven also uses a local repository for caching. The default location is ${user.home}/.m2/repository. You can check this setting in the settings.xml file under [maven_dir]/conf/.
To update dependencies, use the -U option, e.g. mvn clean install -U.
Do not forget to configure the Nexus repo inside your pom.xml file: http://maven.apache.org/pom.html#Repository
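A minimal repository entry for the POM, with a placeholder Nexus URL:
<repositories>
  <repository>
    <id>nexus</id>
    <url>http://your-nexus-host:8081/repository/maven-public/</url>
  </repository>
</repositories>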
I managed to do it with a simple approach.
Combine all the XMLs into a JAR/ZIP file and upload that zip file under my groupId to the Maven repository.
Then programmatically use curl/wget to download the zip and unzip its contents (using any available utility) at runtime.
Put all XMLs under a zip file.
Use mvn deploy to upload this zip file to my Maven Nexus repo.
Then programmatically at runtime, download the zip file with a simple web URL call to that ZIP file (as sketched after this list).
Use ZIP4J or any other library to unzip the contents to my required output folder.
Pick files from this server when needed during the flow.
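A rough sketch of the download-and-unzip steps using curl and the unzip utility instead of ZIP4J (the coordinates, URL and output folder are made up):
# Download the zip artifact straight from the Nexus repo
curl -o config-files.zip "http://your-nexus-host:8081/repository/releases/com/example/config-files/1.0.0/config-files-1.0.0.zip"
# Extract it to the folder the application reads at startup
unzip -o config-files.zip -d /opt/app/config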
Hope it's helpful to someone somewhere sometime. :)
I have a Maven archetype that I created myself from one of my projects and installed in my own local Maven repository. From Eclipse I am able to create new projects using this archetype and I obviously have access to the archetype's "installed" files (the ones that are placed in the repository after the install).
The problem is that now I want to share the archetype to a colleague, but after I created it and installed it to my local repository, I deleted the "installer" files (the ones that are generated in the generated-sources/archetype folder of my source project), so now I only have the archetype installed in my repository and I don't know how to export it in order to share it with someone else.
I could create a new archetype, but I put so much effort and spent several hours doing that in the first place, so I would like to find out whether exporting the current archetype is possible, and if so, how to do it.
EDIT
Asked in a different way: can I take the "installed" files from my local repository, put them in another repository, and have Maven recognize them as an archetype?
What we ended up doing was to zip the repository archetype folder and send it to my colleague, who then performed these steps:
Unzip the archetype folder to the root of his local Maven repository.
Execute the command mvn archetype:crawl.
That was it. He was able to use the archetype for creating new projects.
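In command form it was roughly this (the archetype's groupId path is just an example):
# On the machine where the archetype is installed
cd ~/.m2/repository
zip -r my-archetype.zip com/example/my-archetype
# On the colleague's machine
cd ~/.m2/repository
unzip /path/to/my-archetype.zip
mvn archetype:crawl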
I frequently use Maven artifacts as dependencies from external repositories which go (permanently) offline surprisingly often. I'd like to collect all dependencies a given project has and store them in a local repository, just like using maven deploy -DaltDeploymentRepository=... for a single project. This repository should then be usable like any other Maven repo when put on an HTTP server.
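For reference, by the single-project form I mean something roughly like this (the repository id and path are placeholders):
mvn deploy -DaltDeploymentRepository=local-mirror::default::file:///path/to/static/repo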
I tried using mvn dependency:copy-dependencies -Dmdep.useRepositoryLayout=true, but it does not create files like maven-metadata.xml or copy .pom files.
I do not want to use any repository managers like Artifactory, I just have a static file server.
Thanks in advance.
In my project I have many modules.
One of the modules requires a jar file to be deployed to the repository which it does fine.
The others involve every other kind of file: zip, kar etc.
I can see the zip get uploaded if I look for it via the terminal but if I browse Archiva it is not there.
The kar file, for example, does not need to be built but it's being worked on and is currently manually uploaded to the repository (Archiva). This is not desirable.
Each module has a POM and each POM uploads empty jar files to Archiva when it is built (with Jenkins). How can I avoid that? And can I copy files to Archiva without them having to be built into a jar file?
You can also give the maven-deploy-plugin a try.
Invoke a Maven target in Jenkins with this plugin and provide the suitable parameters.
You would also need to add the repository to your settings.xml if it requires login credentials, and then use the ID you declared in settings.xml in the Maven target.
org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy-file
-Durl=<artifact-repo URL>
-Dfile=<name of the file>
-DgroupId=<Group Id>
-DartifactId=<Artifact Id>
-Dversion=<VERSION>
-Dpackaging=<packaging Type>
-DrepositoryId=<ID as mentioned in settings.xml>
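If the repository needs credentials, the corresponding settings.xml entry would look something like this (the id, username and password are placeholders):
<servers>
  <server>
    <id>my-repo-id</id>
    <username>deployer</username>
    <password>secret</password>
  </server>
</servers>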
Hope this may be of some help.
I wanted to set up Artifactory as an internal repo after I had already used Maven and populated my local repository during various builds. Before I set up Artifactory on my machine, my local repository had already downloaded various libraries onto my machine under .m2. Now I am setting up the internal repo using Artifactory. Is there a quick way to move my local repository under .m2 into Artifactory, so that I don't have to download all the libraries again to populate Artifactory with the required libs?
Currently what I have to do is remove all folders under the local repo (.m2\repository) and then let my Maven build download everything through Artifactory. I am looking for a more efficient way to do this.
You've got a number of options:
Assuming that you would like to push all artifacts into one repository and keep the same folder structure as in your file-system, Artifactory's got a number of good import utilities in its administration UI at:
Admin->Import & Export->Repositories->Import Repository from Path.
For more flexibility, you can write a simple script that iterates over the .m2 folder and sends an HTTP PUT command for every artifact, which lets you customize the paths and target repositories. For example:
curl -X PUT -u username:password -T path/to/file.jar "http://myhost:8080/artifactory/my-target-repo/path/to/file.jar"
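A sketch of such a script (the host, credentials and target repository are placeholders):
#!/bin/bash
# PUT every jar and pom from the local .m2 repository into Artifactory,
# reusing the relative path as the target layout.
M2_REPO="$HOME/.m2/repository"
TARGET="http://myhost:8080/artifactory/my-target-repo"
cd "$M2_REPO"
find . -type f \( -name "*.jar" -o -name "*.pom" \) | while read f; do
  curl -X PUT -u username:password -T "$f" "$TARGET/${f#./}"
done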