We are evaluating Bintray and have the following build file
https://github.com/deanhiller/webpieces/blob/evaluateBintray/build.gradle
(I'm not sure the build file matters, since it reports that it uploads each and every artifact correctly). I see output like this:
> Task :webserver-plugins:plugin-backend:bintrayUpload
Uploaded to 'https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19-sources.jar'.
Uploading to https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19-javadoc.jar...
Uploaded to 'https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19-javadoc.jar'.
Uploading to https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19.jar...
Uploaded to 'https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19.jar'.
Uploading to https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19.pom...
Uploaded to 'https://api.bintray.com/content/deanhiller/maven/plugin-backend/2.0.19/org/webpieces/plugin-backend/2.0.19/plugin-backend-2.0.19.pom'.
The really weird part, though, is that it would create the directories
https://dl.bintray.com/deanhiller/maven/org/webpieces/plugin-backend/2.0.19
but nothing is there until I deploy everything; only then do the files show up. It seems odd that the directories are created during the upload phase but contain no files. Then, my final question: are these artifacts only available there, or are they published to JCenter or Maven Central as well? I can't really tell.
We are trying Bintray because we may switch off of Sonatype: every time we upload there, we get exceptions (for the last 4 days we have not been able to release). On Bintray, so far, every single upload has been successful (176 artifacts in one go).
Do I perhaps need additional configuration to get these artifacts out to JCenter or Maven Central?
thanks,
Dean
I do see these files at the moment.
If the artifacts are not published, only authorized users can view and download the content. This means that unauthorized users, such as anonymous ones, will not see content in your download repository.
Your package is not linked to JCenter, so its artifacts are not automatically published there. I would suggest linking your package to JCenter.
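For completeness, the publish step that makes uploaded content visible can also be triggered directly through Bintray's REST API (a sketch; the subject/repo/package/version values are taken from the upload log above, and BINTRAY_API_KEY is a placeholder credential you would supply yourself):

```shell
# Bintray content-publish endpoint:
#   POST /content/:subject/:repo/:package/:version/publish
SUBJECT=deanhiller
REPO=maven
PACKAGE=plugin-backend
VERSION=2.0.19
PUBLISH_URL="https://api.bintray.com/content/$SUBJECT/$REPO/$PACKAGE/$VERSION/publish"

# The call is echoed rather than executed so this sketch is side-effect free;
# drop the echo (and supply a real API key) to actually publish.
echo curl -u "$SUBJECT:\$BINTRAY_API_KEY" -X POST "$PUBLISH_URL"
```

The Gradle bintray plugin can also flip this for you at upload time via its `publish` option, if you'd rather not make the REST call yourself.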
Related
There are many questions about forcing Maven to update the local repository.
But after a download it often happens that broken files end up in the local repo in place of the expected fresh dependencies. In my experience they look like partially downloaded files, or error pages returned by a failing remote server. Repeated attempts to re-download dependencies may break even more of them, including dependencies that were previously downloaded fine or put in place by hand.
There are dedicated repository managers (Nexus, Artifactory, etc.) for keeping a near-guaranteed stock of already-downloaded dependencies.
But is there a simpler way to force Maven to verify downloaded artifacts against their provided checksums before putting them in the local repo, so that the local repo is not polluted?
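Maven itself has a simpler knob worth knowing about: the `-C`/`--strict-checksums` flag fails the build on a checksum mismatch instead of just warning, and the `dependency:purge-local-repository` goal re-resolves the project's dependencies that may already be sitting corrupted in the local repo. A sketch (the commands are only echoed here so the snippet has no side effects):

```shell
# Fail the build when a downloaded artifact doesn't match its published
# checksum, instead of the default warn-and-continue behaviour:
STRICT_BUILD="mvn -C clean install"

# Evict and re-download this project's dependencies that are already
# (possibly corrupted) in ~/.m2/repository:
PURGE="mvn dependency:purge-local-repository"

echo "$STRICT_BUILD"
echo "$PURGE"
```

Note that strict checksums only apply while artifacts are being downloaded; files already cached in the local repo are not re-verified, which is why the purge goal is the companion step.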
So my job has 2 parts. The first part downloads all the required Maven artifacts from a central online repository. The second part uses the download folder as a repository (by changing the URL from the central repo to the local folder) to build our product.
But certain artifacts do not have a maven-metadata.xml after the first part of the job, so the second part fails. I don't want to hardcode the online repo and curl the files, as that's not what we're looking for. Neither do I want to copy some other maven-metadata.xml and edit in my changes. I also can't use the offline option during the build, as I do need the artifacts to be picked up from my local repo (the downloads folder).
Any solution to this? Thanks in advance.
I have an open-source project and periodically upload downloads of new versions (the build is with Maven). Currently I have a download of jaudiotagger-2.2.4-SNAPSHOT uploaded at
https://bitbucket.org/ijabz/jaudiotagger/downloads
and this is a work in progress. I've made a number of modifications and want to upload new versions of 2.2.4. Should I just overwrite the existing 2.2.4 uploads (which means losing the download count), or should I timestamp the files so the names don't clash? And if so, do I do this by manually modifying the filename, or is there a procedure to do it by editing the pom?
You can continue uploading as 2.2.4-SNAPSHOT
When your snapshot artifact is uploaded to the artifact repository, it is timestamped, so artifacts previously uploaded under the same version are not lost. E.g., if you have uploaded 10 snapshot artifacts to 2.2.4-SNAPSHOT, there will be exactly 10 artifacts in the repository; but when 2.2.4-SNAPSHOT is requested, only the latest uploaded artifact is returned to the requestor.
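To illustrate the timestamping: on each deploy of 2.2.4-SNAPSHOT the repository stores the artifact under a unique name built from a UTC timestamp and an incrementing build number. A sketch that just mimics Maven's unique-snapshot naming scheme (the repository manager does this for you on deploy; the build number below is an assumed example):

```shell
VERSION=2.2.4
BUILD_NUMBER=10                      # increments with every snapshot deploy
TIMESTAMP=$(date -u +%Y%m%d.%H%M%S)  # UTC, Maven's yyyyMMdd.HHmmss format

# What the 10th deploy of 2.2.4-SNAPSHOT is stored as in the repository:
DEPLOYED_NAME="jaudiotagger-$VERSION-$TIMESTAMP-$BUILD_NUMBER.jar"
echo "$DEPLOYED_NAME"
```

So all ten deploys coexist in the repository side by side, and maven-metadata.xml points resolvers at the latest one.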
Maven versioning follows major.minor.patch (bug fix), so it is advisable to change the version only in response to the corresponding kind of change.
For example, if the latest artifact that you have released is 2.2.3-GA and a bug is discovered in it, you would start working on the fix as version 2.2.4-SNAPSHOT, and continue uploading to that same version until you are ready to ship it (release it); only then does it become 2.2.4-GA.
By the same logic, even though you are not making your artifacts available through an artifact repository such as Sonatype or Artifactory, but via a download page (Bitbucket), there is little reason to give each snapshot a distinct version unless it is really meaningful to do so.
I am having issues hosting Maven jars on GitHub (site-maven-plugin), so I want to move to Bintray asap.
What are the steps to host an existing Maven jar on Bintray?
Here is my error when running mvn release:prepare:
Caused by: org.apache.maven.shared.release.ReleaseFailureException: You don't have a SNAPSHOT project in the reactor projects list.
What I have now: I can do mvn clean install with no problems at all. Can I just upload the files from my local ~/.m2 repo? I basically uploaded the .jar and .pom I found in the Maven repository there.
How can I access the library I uploaded to Bintray from my pom.xml?
You can get started with publishing from Maven to Bintray by copy-pasting some pom parts from the "Set Me Up" guide:
Full user manual is available as well.
Please note that you can't upload SNAPSHOTs to Bintray. It's a distribution platform, not intended for the development process.
That said, you are welcome to take advantage of a free Artifactory account for hosting your snapshots during development.
Using OJO (oss.jfrog.org), you don't need the troublesome Maven Release Plugin anymore. Once you're satisfied with the snapshots' quality, you can promote them to releases and upload them to Bintray in one REST call (or the click of a button in Jenkins).
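As a concrete sketch of what the "Set Me Up" pom parts boil down to, a one-off release deploy to Bintray's Maven API endpoint can also be passed on the command line via the deploy plugin's altDeploymentRepository property (the subject/repo/package names below are placeholders, and the id 'bintray' must match a <server> entry carrying your Bintray user and API key in ~/.m2/settings.xml):

```shell
SUBJECT=myuser        # your Bintray user or org (placeholder)
REPO=maven
PACKAGE=my-package
DEPLOY_URL="https://api.bintray.com/maven/$SUBJECT/$REPO/$PACKAGE"

# One-off deploy without editing <distributionManagement>; echoed here so the
# sketch has no side effects. The id::layout::url form is the deploy plugin's
# altDeploymentRepository syntax.
echo mvn deploy -DaltDeploymentRepository="bintray::default::$DEPLOY_URL"
```

Remember this only works for release versions; a -SNAPSHOT version will be rejected by Bintray as noted above.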
I am using Artifactory to support an enterprise multi-module project. Often, we change the names of modules and the associated dependencies in POM files are not updated to use the new module name. Because SNAPSHOT dependencies are not automatically cleaned up on a regular interval, these old module references can stay there for months. I discovered a few when I migrated Artifactory to another server and the old module dependencies resulted in build errors. I am building these SNAPSHOT artifacts nightly using Jenkins so I would like some way to automate cleaning up the SNAPSHOT artifacts.
Does Artifactory (or another artifact server such as Nexus) support a concept where if a SNAPSHOT artifact is older than X days, the artifact is deleted? Is there another way to automate artifact server cleanup to accomplish what I want to do? The only thing I can think of is to create a cron job to clear out libs-snapshot-local on a regular interval before the nightly build starts. Has someone already built this capability?
As far as I know, Artifactory doesn't have an automated way to delete modules that are older than a certain value. At my shop we've written a Groovy client that uses Artifactory's REST API to do exactly this.
Note that, if your artifacts are shared libraries, you need to be careful that nothing depends on them before you delete them. Our script takes this into account, too.
If you're interested in following up, post a comment and I'll see if it's OK to share our script with you.
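For reference, the core of such a REST client is small; here is a minimal curl sketch against Artifactory's "artifact not downloaded since" search (hypothetical host, credentials, and artifact path; notUsedSince takes epoch milliseconds, and the calls are only echoed so nothing is actually deleted):

```shell
BASE=https://artifactory.example.com/artifactory   # hypothetical host
REPO=libs-snapshot-local
DAYS=30
# Cutoff in epoch milliseconds: now minus DAYS days
NOT_USED_SINCE=$(( ($(date +%s) - DAYS * 86400) * 1000 ))

# Find artifacts in the repo not downloaded since the cutoff:
SEARCH_URL="$BASE/api/search/usage?notUsedSince=$NOT_USED_SINCE&repos=$REPO"
echo curl -u admin:password "$SEARCH_URL"

# Each result's storage-API uri maps to a deletable path, e.g.
#   $BASE/api/storage/libs-snapshot-local/org/foo/1.0-SNAPSHOT/foo-1.0.jar
# is deleted with a DELETE on the plain repo path:
echo curl -u admin:password -X DELETE "$BASE/$REPO/org/foo/1.0-SNAPSHOT/foo-1.0.jar"
```

The shared-library caveat above still applies: check that nothing still depends on an artifact before wiring the DELETE into a cron job.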
Another solution might be a user plugin. You can write a simple Groovy script that runs inside Artifactory itself (as opposed to the remote REST invocation Gareth proposed) on a scheduled basis, searching for artifacts that have not been downloaded for a long time and deleting them.
I've made a Ruby script to delete artifacts which haven't been downloaded for X days. It works just as JBaruch described in his answer.
It isn't a plugin, so it works with Artifactory Open Source; plugins are only supported by Artifactory Pro.
The source code: https://gist.github.com/aleung/5203736