storing internal artifacts for how long? [closed] - maven

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I am interested in what other teams are doing to limit internal artifact storage.
How long should an internal artifact be stored in Artifactory?

Sonatype (people behind Maven and Nexus) published a blog article on this issue:
http://www.sonatype.com/people/2012/01/releases-are-forever/
The vast majority of files published into our Maven repository are snapshot releases. Both Nexus and Artifactory have functionality for periodically purging old snapshots (useful for keeping disk usage under control).
It's the management of release builds that becomes the issue. In my opinion this falls into a couple of categories:
Not every release is used
During QA some releases are rejected, this means it makes sense to publish these into a temporary "staging" repository, prior to full release.
I call these "release candidates", and Nexus Professional has functionality to manage these for me. (I assume Artifactory also supports staging.)
Not every release is needed
Sonatype's blog addresses this point. Applications in production rarely need to roll back to a version older than 6 months. Applications in your Maven repository are unlikely to be used as dependencies in a 3rd-party build, so it calls into question the need for continued storage.
Deleting such artifacts remains a judgement call.

We store every released artifact, and I'm pretty sure you should do the same unless you have a really strong reason not to. We limit our snapshots to just the last one per artifact, and only if there is no corresponding release version; however, we ensure that every snapshot lives for at least 3 days. It's easy to set this up in Nexus, and as far as I remember it's more or less its default snapshot policy.
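That retention rule (keep only the newest snapshot per artifact unless the artifact already has a release, and never delete anything younger than 3 days) is easy to express. Here is a rough Python sketch of just the selection logic, with made-up data shapes for illustration; it is not any Nexus or Artifactory API:

```python
from datetime import datetime, timedelta

def snapshots_to_purge(snapshots, released, now, min_age_days=3):
    """Select snapshots to delete.

    snapshots: list of (artifact_id, timestamp) tuples.
    released:  set of artifact_ids that already have a release version.
    Keep the newest snapshot of each artifact that has no release,
    and never delete anything younger than min_age_days.
    """
    newest = {}
    for artifact, ts in snapshots:
        if artifact not in newest or ts > newest[artifact]:
            newest[artifact] = ts
    cutoff = now - timedelta(days=min_age_days)
    return [
        (artifact, ts)
        for artifact, ts in snapshots
        # purge only if old enough, AND either superseded or released
        if ts < cutoff and (artifact in released or ts != newest[artifact])
    ]
```

In a real setup you would let the repository manager's scheduled tasks implement this rather than scripting it yourself.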

Related

How to automatically delete old versions of artifacts from hosted maven repository on OSS 3.0.0?

We are building and deploying multiple releases for various services in a single day. Due to this we are wasting a lot of storage for storing older versions of artifacts which will never be used again.
Is there a way to automatically delete older versions and just keep few versions such as last 10 in OSS 3.0.0?
I searched their documentation but couldn't find anything that works automatically. Currently I have to manually select and delete them, which is very error prone and time consuming.
A few details about my setup:
"File" type "blob" is used for storage.
Repository is self "hosted" with format "maven2"
There are a few options you can use in Nexus Repository 3.x for Snapshots, from https://books.sonatype.com/nexus-book/reference3/admin.html#admin-system-tasks:
Purge unused Maven snapshot versions
Remove snapshots from Maven repository
As for releases, removing them can be an anti-pattern; you generally should keep your releases around if others rely on them.
There is a JIRA ticket for Removing Releases which you can follow at: https://issues.sonatype.org/browse/NEXUS-10821
This is also answered here: Purge old release from Nexus 3
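For the "keep only the last 10" part of the question, the selection logic itself is simple. Here is a hedged Python sketch; the dotted-numeric version parsing is naive and purely illustrative, and an actual cleanup should go through the repository manager's own tasks or REST API rather than a hand-rolled script:

```python
def versions_to_delete(versions, keep=10):
    """Return the versions to delete, keeping the `keep` highest.
    Versions are assumed to be dotted-numeric strings like "1.2.3"."""
    def key(v):
        return tuple(int(part) for part in v.split("."))
    # sort newest-first, keep the head of the list, delete the tail
    ordered = sorted(versions, key=key, reverse=True)
    return ordered[keep:]
```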

Best practice for storing and referencing 3rd party source code [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Currently our main product solution is stored in a single TFS project, and all 3rd-party libraries that are referenced by our projects are pulled in from nuget, so the binary files are not stored in TFS. Now we are having to create a customized fork of one of the libraries (class library) that our product will reference. The library we are going to "fork" lives on codeplex as a TFS/SVN style project that we don't control, so we only have read access to it via the SVN interface, meaning there's no way to do a proper fork like you might be able to do with a git project. As such, our fork has to live elsewhere, disconnected from the codeplex project going forward. Since we use TFS for our main product, we'd like to store the fork and develop it in our TFS environment.
What is the best practice for storing this library in TFS so that our main project can reference it? Should it be in its own TFS Project? How should another project in TFS reference the library?
Whether to use a separate team project is a choice you will have to make: either a new stand-alone, isolated project, or keeping it within your current one. It really depends on how much development work you are going to do on it and whether you want to manage that through the same sprints, etc.
To consume it, I would just build it, create a NuGet package, and consume it the same way as the rest of your referenced packages. Obviously you may need to host this in your own NuGet feed.
If this is the only project that requires this customized library, put it in a folder inside of your current project that is parallel to your sources (so cloaking is not required). Create a manually triggered build definition that builds the library and creates a NuGet package. Use a post-build script (or modify nuget.target) to push that NuGet package to your NuGet repository. Then reference this in your daily build.
You can overlay new releases from codeplex in your folder structure, if needed.

Continuous Delivery- Handling Inbetween svn commits [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
Suppose a deployment pipeline is running: SVN tagging and development version bumping are in progress. At that moment a developer commits his changes, so there is a chance that the CI server releases the newly committed, untested changes to production, or that some other conflict occurs. How can I handle this situation? Do I need to lock the entire trunk until the build pipeline completes, or is there another workaround?
If I understand correctly, you assume the following steps
after a commit, the build server will check out the current trunk (let's say revision A),
perform the build,
execute some tests,
tag the trunk if the tests are successful,
and deploy to production (still only if tests are successful).
The "crazy" developer commits between step 3 and 4 and thus creates revision B. Now you assume that the build server will again check out the latest revision (which would be revision B). This behaviour could indeed cause some trouble.
However, the build server should do all the steps based on a specific revision, which is not a problem in common setups. E.g. Jenkins usually has a check-out step at the beginning of the job. If there is a tagging step at the end, you will usually not want Jenkins to blindly tag the current trunk (causing the problem you describe) but instead to tag the revision that was checked out into Jenkins' workspace.
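The point about pinning every step to one revision can be illustrated with a small Python sketch. This is a toy model of the pipeline, not any Jenkins or SVN API: the revision is captured once at checkout and passed explicitly to every later step, so a commit landing mid-pipeline cannot change what gets tagged or deployed.

```python
class Repo:
    """Toy stand-in for an SVN trunk; head is the latest revision number."""
    def __init__(self):
        self.head = 1

    def commit(self):
        self.head += 1

def run_pipeline(repo, tags):
    rev = repo.head        # check out ONE specific revision (A) ...
    # ... build and test exactly `rev` (steps omitted) ...
    repo.commit()          # a developer commits mid-pipeline (revision B)
    tags.append(rev)       # tag the revision that was built, NOT repo.head
    return rev             # deploy the same pinned revision
```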
Additionally, please consider that there should be at least some manual approval step before anything gets deployed automatically to production. This is usually mentioned in the context of continuous delivery as far as I can see.
The key to continuous delivery is IMHO that you are able to deploy the current version of your source code at any time at the push of a button. IMHO it does not mean that every commit should be deployed automatically.

Maven repository hosting for non-public artifacts? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 5 years ago.
Is there some hosting solution, be it paid or free, that offers explicit maven repository hosting for non-public artifacts, preferably with support?
These are the alternatives I'm aware of:
Hosting on your own public server with credentials
For open source projects, there is free sonatype hosting
Hosting on Amazon
It can be hosted on github, google code or some other VCS hosting
However, all of these either require some maintenance overhead beyond just using the repository manager (beyond just using Nexus), are not really fully supported solutions, or are not meant for closed-source projects.
If I need a solution that is available on the internet but "private", i.e. available only to people in the company, are there other alternatives? I'm assuming here that there is no server that is already public, so setting up a new server just for Maven artifact hosting seems a bit much. I'm a bit surprised that I was unable to find commercial alternatives.
I'm the developer of mymavenrepo.com - it's a very simple Maven hosting service that fits personal use and small companies well.
JFrog offers their Artifactory repository manager as a cloud service.
Personally, I think some of the default configuration choices ("fixing" metadata, etc.) are just plain wrong, but you can configure it to do the right thing.
(Full disclosure: both JFrog and Sonatype are partners of CloudBees, my employer.)
Edit:
They offer a 30-day trial, and their pricing can be seen here.
JitPack is a service that makes it easy to host non-public (private) Maven artifacts.
The way it works is that it builds your private Git repositories from source and publishes resulting artifacts.
The artifacts are only accessible to you and those who have access to Git repo itself, like people in your company.
The way you use it is by adding the repository and pointing your dependencies at the Git repo:
Add the repository:
<repository>
    <id>jitpack.io</id>
    <url>https://jitpack.io</url>
</repository>
Add the dependency:
<dependency>
    <groupId>com.github.User</groupId>
    <artifactId>Repository</artifactId>
    <version>Tag</version>
</dependency>
More information and authentication in the docs. Their pricing can be seen on their pricing page.
I've been searching for this as well and came across this link https://blog.openshift.com/nexus-repository-manager-in-the-cloud-for-free-with-openshift/ which explains how to set up a Nexus application on OpenShift. I followed the steps outlined in this page and got it up and running pretty quickly. You can disable the "anonymous" user to remove public access and set up your own users. It can tie into LDAP if you have that available.
It seems there is a service called deps about to open in 2017. From their description, it sounds like the answer to my question, but we'll have to see how it turns out.
This might be considered a promotion, but we just released support for hosting Maven repositories in the cloud at Deveo. There is no other information available yet beyond the release blog post. The pricing, however, should be more friendly than what JFrog offers.
Disclaimer: I'm affiliated with the company.
There is no commercial offering of Nexus Repository in the cloud as such, but any managed server that includes the features to run a Java application is suitable, and there are LOTS of them around. Other partners like CA automatically include it in stacks they provision for customers.
The only overhead you are going to have to manage is to install and run Nexus Repository. That however is trivial and can be done within a couple of minutes.
Depending on your usage you could even run this on a VM that you turn off when no one needs it. For example, if your dev and CI servers only need it for 12 out of 24 hours a day, shut the VM down the rest of the time. And you can easily automate all of that as well.
DropBox is another possible option see https://code.google.com/p/peter-lavalle/wiki/MavenOnDropBox

How do you manage common software on a large project? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
I work on a project far too big to reside in a single Visual Studio / Eclipse / NetBeans project and we have a "common software" team responsible for developing and maintaining software libraries used by other teams.
I'm struggling with how to manage the development of and changes to the common software. When method signatures and classes change, do I keep the old versions and mark them deprecated? The current plan is to distribute a new build of common libraries every two weeks.
Definitely set up a repository. If you are a Maven-hater, check out Gradle; it uses Ivy. Maven has a reputation for being complex, but it does have better tool support: IDEs support Maven either out of the box or with plugins, and they give you graphs showing what the jars in your project depend on, so you can spot conflicts easily.
Either Ivy or Maven will sort out your dependencies so your projects are using the right versions. Each of your projects should list (in pom.xml for Maven) which version of each of your common libraries it uses.
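For example, a consuming project's pom.xml would pin a common library like this (the groupId/artifactId/version here are made up for illustration):

```xml
<dependency>
    <groupId>com.example.common</groupId>
    <artifactId>common-utils</artifactId>
    <version>1.4.2</version>
</dependency>
```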
A common feature of most version control systems is the use of external branches. Common software is fetched from a shared repository and integrated in each project on update.
A key difficulty lies in documenting changes to the public API of common software, and I see two solutions: good communication of deprecated signatures, and continuous integration, where finding out about deprecated methods can prove painful.
There are a few options you can have.
Option A: use a repository
For Java based systems I would recommend that you use Ant+Ivy or Maven and create an internal repository with the code in those common projects.
Option B: Classpath Project
If setting up a repository is too much, what you can do is create an Eclipse project called classpath with the following three directories in it:
classpath\
docs\
sources\
jars\
The team working on the common project can have a build script which compiles the common code and places it into the classpath project; all the rest of the dev team need to do is check out the classpath project and reference the files in it during development.
Personally I am a fan of option B unless there is a full time person dedicated to doing builds in which case I go for option A.
The way to manage changes in method signatures is to follow a common versioning convention: when you do a major version number increase, dependent code will have to change; when it is a minor version number increase, dependent code does not need to change. Marking code as deprecated is a very practical option because IDEs and build systems will issue warnings and allow coders to switch to newer versions. If the same team is changing the common code and the main project, then you will need to have the actual Eclipse projects all checked out in the same workspace so that refactoring tools can do their job.
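That versioning convention can be made mechanical. A small Python sketch, assuming simple major.minor version strings (this is just the convention stated above, not any particular tool's compatibility check):

```python
def is_breaking_upgrade(old, new):
    """Under the convention above: a major version bump means dependent
    code must change; a minor bump means it does not."""
    return int(new.split(".")[0]) > int(old.split(".")[0])
```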
Unless the code in common will be used across many projects, I would keep it all in one project; you can use multiple source folders to make navigating to various parts of the code easy. If you are having trouble with developers checking in stuff that breaks things, then I would recommend more frequent check-ins, or having developers work on branches where they merge from the trunk to their work branch frequently to eliminate sync problems; when done they can merge from the branch back to the trunk. The latest versions of Subversion have decent support for this, and DVCS source control systems like Mercurial and Git are excellent at it.
