We have several repositories on GitHub and recently started integrating our continuous integration/deployment with CircleCI. The thing is, every time we update something in the .circleci directory of one repository, we need to do the same for all the others. Is there a way to treat that configuration like a dependency, so all our repositories can just fetch it from one place?
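For what it's worth, the usual way to share CircleCI config across repositories is to publish the common jobs once, for example as a CircleCI orb, and have each repo's .circleci/config.yml reference it. A minimal sketch, assuming a hypothetical orb named myorg/common-ci:

    # .circleci/config.yml in each consuming repository.
    # "myorg/common-ci" is a placeholder orb name; you would publish the
    # shared jobs/commands once under your own namespace.
    version: 2.1

    orbs:
      common-ci: myorg/common-ci@1.0.0

    workflows:
      build:
        jobs:
          - common-ci/build   # a job defined once inside the shared orb

Updating the shared logic then means publishing a new orb version instead of editing every repository.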
I'm trying to migrate old-fashioned Jenkins DSL jobs (in Groovy) to the new declarative pipeline form.
Since I'm very new to pipelines and could not find an answer to my problem, I'll first describe my scenario:
Suppose I have 3 DSL jobs: one to build and store the generated artifact in a repository like Artifactory, another to tag the master branch, and the last one to deploy to prod. All jobs use the same Git repository.
The building job is usually run many times during development. It can be triggered manually or as a response to events in the Git repo, e.g. merge requests and pushes.
For simplicity, let's assume the tagging job only needs to tag the master branch in the repo. This will only be run once in a while, manually, when we are pretty sure the master branch will go to prod.
The artifact gets deployed by the third job, also manually.
So here are my questions:
As I understand it, we can only have one Jenkinsfile per branch in the repo, so how can I configure such a setup using a pipeline defined in a single Jenkinsfile?
How can I manually trigger only the tagging job (meaning: compile/test/generate the artifact without uploading it, and then, if everything tests OK, tag the version)?
In this situation, would it be easier to implement only the building job in the pipeline and keep the others as DSL scripts?
Many thanks for any suggestions!
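Not an authoritative answer, but one common way to fold all three jobs into a single declarative Jenkinsfile is to gate the tagging and deployment stages behind build parameters, so that a manual "Build with Parameters" run triggers them. A minimal sketch, assuming a Maven build; all stage and parameter names are hypothetical:

    // Minimal declarative Jenkinsfile sketch; names are illustrative.
    pipeline {
        agent any
        parameters {
            booleanParam(name: 'TAG_RELEASE', defaultValue: false,
                         description: 'Tag master after a green build')
            booleanParam(name: 'DEPLOY_PROD', defaultValue: false,
                         description: 'Deploy the artifact to prod')
        }
        stages {
            stage('Build & Test') {
                steps {
                    sh 'mvn clean verify'          // always runs
                }
            }
            stage('Publish') {
                when { branch 'master' }           // upload only from master
                steps {
                    sh 'mvn deploy'                // push artifact to Artifactory
                }
            }
            stage('Tag') {
                when { expression { params.TAG_RELEASE } }
                steps {
                    sh 'git tag -a "release-${BUILD_NUMBER}" -m "release" && git push origin --tags'
                }
            }
            stage('Deploy') {
                when { expression { params.DEPLOY_PROD } }
                steps {
                    sh './deploy-to-prod.sh'       // placeholder deployment step
                }
            }
        }
    }

With this shape, the everyday build is just a push-triggered run with both flags false; ticking TAG_RELEASE on a manual run gives the compile/test/tag flow without a separate job.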
Where are Maven and the pom.xml file kept in a real-world project if the code is on GitHub? I mean, can I keep my local repository somewhere on another machine and use it in my project? If yes, how?
Local repositories are not meant for sharing. They are also not "thread-safe" in any way, so accessing them simultaneously from two different builds might break things.
They are populated by the artifacts Maven downloads from MavenCentral and other repositories, and also the stuff you build yourself. As they are more or less a form of cache, there is no need to share them.
If you need a repository that is used from different machines or by different users, set up a Nexus/Artifactory server.
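For illustration, this is roughly what pointing all builds at such a server looks like in ~/.m2/settings.xml (a sketch; the host name and repository path are placeholders):

    <!-- Sketch of ~/.m2/settings.xml routing downloads through a shared
         repository manager; nexus.example.com is a placeholder host. -->
    <settings>
      <mirrors>
        <mirror>
          <id>company-nexus</id>
          <name>Company repository manager</name>
          <url>https://nexus.example.com/repository/maven-public/</url>
          <mirrorOf>*</mirrorOf>  <!-- mirror every remote repository -->
        </mirror>
      </mirrors>
    </settings>

Each machine still keeps its own local repository as a cache; only the shared server is common.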
I have a task to delete old SNAPSHOT artefacts that sit under many folders/directories.
We can't go and delete each and every artefact manually, so I would like to do it via the REST API.
For clarity, some example paths:
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/abc.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/dddd/XYZ-SNAPSHOT/xyz.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/cccc/eeee/XYZ-SNAPSHOT/pqr.jar
https://artifactory.com/artifactory/maven-local/com/aa/bbb/dddd/eeee/XYZ-SNAPSHOT/lmn.jar
The 4 examples above are in different directories.
My script needs to go into each and every directory and check for XYZ-SNAPSHOT; wherever it is found, we can build the URL and delete the artifact through curl.
How can we achieve this? Or is there any other way to do it?
You probably want to use Artifactory Query Language (AQL), which is the easiest way to find artifacts and modules according to patterns; its documentation includes a bunch of examples. Moreover, to perform the deletion easily, and even automate the process in the future, I advise using the JFrog CLI. There is also an interesting blog post about a similar use case.
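To make that concrete, here is a hedged sketch of both routes; the repo name and pattern come from the example paths above, and the server URL and credentials are placeholders for your instance (the CLI is assumed to be already configured against your Artifactory server):

    # Find matching artifacts with AQL via the REST API:
    curl -u user:password -X POST \
      "https://artifactory.com/artifactory/api/search/aql" \
      -H "Content-Type: text/plain" \
      -d 'items.find({"repo":"maven-local","path":{"$match":"*XYZ-SNAPSHOT*"}})'

    # Or let the JFrog CLI do the search-and-delete; preview with --dry-run first:
    jfrog rt del "maven-local/*/XYZ-SNAPSHOT/" --dry-run
    jfrog rt del "maven-local/*/XYZ-SNAPSHOT/"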
Also, there is a 'Max Unique Snapshots' field in the settings of your local Maven repository. You can use it to have Artifactory keep only a specified number of unique snapshots per artifact.
The requirements are as follows: we need copies of the binaries our projects use on our own repository server. We can't simply proxy the public repository, because we have had several cases in the past where a binary on the public repository was changed without the release number changing, and we want to avoid the problems that causes. So we want to specify manually when a binary is downloaded from the public repository and when it is updated. No changes should ever be made to a binary stored on our repository server without manual interaction.
Is there a way to achieve this? I.e. a way to say "I want artefacts X, Y, Z copied to my repository server" (preferably including their dependencies). Is this possible with either Nexus or Artifactory?
Yes. In Nexus define your own local repository, manually download the versions you want and add them to your repository. You may have to set up "manual routing" for dependency resolution to ensure that Nexus consults the repos in the correct order.
Then make sure your pom files refer to the specific versions you have downloaded.
One thing that will make this a little easier is that you can place the downloaded artifacts directly into the local storage directory of a Nexus repository (you don't need to upload them into Nexus).
See here for details: https://support.sonatype.com/entries/38605563
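If you'd rather upload through Maven than copy files into the storage directory, something like this works (a sketch; the artifact coordinates, repository id and URL are placeholders):

    # Upload one manually vetted binary into a hosted "thirdparty" repo.
    # The repositoryId must match a <server> entry in settings.xml that
    # holds the credentials.
    mvn deploy:deploy-file \
      -Dfile=commons-foo-1.2.3.jar \
      -DgroupId=org.example \
      -DartifactId=commons-foo \
      -Dversion=1.2.3 \
      -Dpackaging=jar \
      -DrepositoryId=company-thirdparty \
      -Durl=https://nexus.example.com/repository/thirdparty/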
I am looking for an article which describes a set of guidelines to follow when creating repositories in an artifact repository manager.
I know that:
You need to keep snapshots in snapshot repositories.
You need to keep releases in release repositories.
Third-party artifacts should be in a separate repository (the same goes for forked/patched versions of third-party libraries).
It's generally a good idea to prefix the names with int-* and ext-*.
Usually different product lines end up having their own repositories as sometimes their artifacts don't depend on each other.
I've been trying to find an article on this to show a client how other companies and organizations using repository managers handle this kind of artifact separation.
Many thanks in advance!
I am not aware of the existence of such an article, but as @tieTYT mentioned, you can look at Artifactory's default repositories. They reflect years of experience in binary management, continuous integration and delivery.
Those practices still apply even if you use Nexus (and you can observe them without installing Artifactory by looking at the public JFrog Artifactory instance, http://repo.jfrog.org).
For your convenience, here are the defaults (important usage emphasised):
Local Repositories:
libs-snapshot-local: deploy your own snapshots here
libs-release-local: deploy your own releases here
ext-snapshot-local: deploy 3rd-party snapshots that aren't available in the remote repos
ext-release-local: deploy 3rd-party releases that aren't available in the remote repos
plugins-snapshot-local: deploy your plugin (usually Maven) snapshots
plugins-release-local: deploy your plugin (usually Maven) releases
Remote Repositories:
jcenter: a proxy of http://jcenter.bintray.com. Normally, that's the only remote repo you'll need. It includes everything in Maven Central plus all the other major Maven repositories.
Virtual Repositories:
remote-repos: aggregation of all the remote repositories
libs-release: this is the resolution repository for release builds. It includes remote-repos, libs-release-local and ext-release-local
libs-snapshot: this is the resolution repository for snapshot builds. It includes remote-repos, libs-snapshot-local and ext-snapshot-local
repo: this is a special virtual repository that aggregates everything. Generally, do not use it if you ever plan to build a release pipeline on top of your binary repository.
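To illustrate how the two resolution virtuals are consumed, here is a minimal settings.xml sketch (the server URL is a placeholder; Artifactory can generate the real file for you):

    <!-- Sketch: resolve releases via libs-release and snapshots via
         libs-snapshot; artifactory.example.com is a placeholder host. -->
    <settings>
      <profiles>
        <profile>
          <id>artifactory</id>
          <repositories>
            <repository>
              <id>libs-release</id>
              <url>https://artifactory.example.com/artifactory/libs-release</url>
              <snapshots><enabled>false</enabled></snapshots>
            </repository>
            <repository>
              <id>libs-snapshot</id>
              <url>https://artifactory.example.com/artifactory/libs-snapshot</url>
              <releases><enabled>false</enabled></releases>
              <snapshots><enabled>true</enabled></snapshots>
            </repository>
          </repositories>
        </profile>
      </profiles>
      <activeProfiles>
        <activeProfile>artifactory</activeProfile>
      </activeProfiles>
    </settings>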
I'll be glad to advise on specific questions.
As is the case with many questions about best practices, the answer is: It depends.
Technically there are only two distinctions that are required:
Snapshot vs release repo
Hosted vs proxy repository
Snapshot vs release repositories as a distinction is required since the Maven repository format, and therefore Maven and other build tools, differentiate how they work with the metadata and what they do during upload.
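As a concrete illustration, the split shows up in a pom's distributionManagement section, where Maven picks the target repo purely by whether the project version ends in -SNAPSHOT (the URLs below are placeholders):

    <!-- Sketch: Maven deploys to <snapshotRepository> when the project
         version ends in -SNAPSHOT, and to <repository> otherwise. -->
    <distributionManagement>
      <repository>
        <id>releases</id>
        <url>https://repo.example.com/repository/maven-releases/</url>
      </repository>
      <snapshotRepository>
        <id>snapshots</id>
        <url>https://repo.example.com/repository/maven-snapshots/</url>
      </snapshotRepository>
    </distributionManagement>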
For proxy repositories, you will just have to add as many as you need. Which ones you need will depend on the components you require, and snapshot and release repos are proxied separately.
For hosted repositories you also have to have separate snapshot and release repos. Beyond that, it is all up for grabs. Having a separate third-party repo, as preconfigured in Nexus (and Artifactory), and other such setups is certainly useful, but not really necessary. You can have all those distinctions sorted out by internal metadata where required.
Along the same lines you can have one release repo for everyone or one for each team or whatever. You can still apply access rights within those repositories to separate access and so on in Nexus with repository targets. I assume Artifactory and Archiva can do something similar. The question here mostly boils down to ease of administration, backups, security setup and access for users.
Naming conventions like you mentioned can help if you want to have separate repositories, but technically none of this is necessary.
Other things I have seen are, for example, migration repos that are used to move legacy project libraries into a repo but are frozen once the migration is done, separate repos per team, separate repos per project, and so on. Another aspect is separate repos for different levels of approval (e.g. see the problems with that approach at http://blog.sonatype.com/people/2013/10/golden-repository/).
In the end, however, this all hinges on usability and metadata and is not strictly required. Ultimately these repositories will in most cases be grouped together and accessed via one group, which flattens out the whole separation. And access rights still carry through into the group, so everything can still be controlled as you like. So it turns out to be a matter of taste in how you want to slice, dice and manage it.
PS: I am referring to Maven repositories and the Maven repository format. Once you add a whole bunch of other formats into the mix, and wrappers around them that expose them in other formats, everything gets more complicated, but the underlying ideas stay similar.