I'm building a few Scala libraries and using them in business projects. To share the libraries I use 'publishLocal' in sbt, which publishes the artifacts into my local Ivy folder. This is fine, and it's fast, but when I work from another machine I have to republish all of these libraries because changes have been made in the meantime. So my question is: is my workflow correct, or should I publish my artifacts to a remote binary repository (e.g. Nexus) and add it to my business projects as a resolver? Should I use 'publishLocal' at all?
As indicated in the comments, it is strongly suggested to use a repository manager such as Nexus Repository Manager or Artifactory.
You might try Nexus Repository Manager 3.x, as it should help quite a bit with your problem. You can install it on a server, or locally if you'd like, and it also handles other formats such as npm, NuGet, etc.
You can download the Open Source Software Edition of Nexus Repository Manager from this link: https://www.sonatype.com/download-oss-sonatype
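On the sbt side, pointing the libraries at a repository manager is a small change. A minimal sketch, assuming a Nexus instance at a placeholder URL with the default hosted releases/snapshots repositories (the URL, repository names and credentials file are assumptions, not your actual setup):

```scala
// build.sbt of a library project: publish to the repository manager
publishTo := {
  val nexus = "https://nexus.example.com/repository/"
  if (isSnapshot.value) Some("snapshots" at nexus + "maven-snapshots/")
  else Some("releases" at nexus + "maven-releases/")
}
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")

// build.sbt of a business project: resolve the shared libraries
resolvers += "Company Nexus" at "https://nexus.example.com/repository/maven-public/"
```

With something like that in place, 'sbt publish' pushes to the shared repository so every machine sees the same artifacts, while 'publishLocal' remains useful for quick local iteration before a change is ready to be shared.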
Related
We have Artifactory and Xray for our developers, and our Azure DevOps pipelines are integrated with these tools so the builds are scanned on each pipeline execution.
But the local builds that developers run on their workstations also need to be scanned before the code is merged to the repos in ADO.
So we are looking for a way for developers to connect to Xray from their IDE client itself.
They are using IDEs such as Visual Studio and Visual Studio Code, and need to run local builds of NuGet, Maven, Gradle, Android, iOS, Node.js, etc.
Can anyone suggest how this can be achieved from the IDEs or from CLIs (JFrog CLI, Git Bash, etc.)?
You can use the JFrog VS Code Extension which allows you to scan project dependencies using JFrog Xray in VS Code.
It allows developers to view panels displaying vulnerability information about the components and their dependencies directly in their VS Code IDE. The extension also allows developers to track the status of the code while it is being built, tested and scanned on the CI server.
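If a CLI route is also needed in addition to the IDE extension, JFrog CLI can run the same kind of dependency scan against Xray from a developer workstation. A rough sketch, assuming JFrog CLI v2 and a placeholder platform URL; 'jf audit' detects the project type (Maven, Gradle, npm, NuGet, ...) from the project files:

```sh
# one-time setup: register the JFrog Platform instance (ID and URL are placeholders);
# the command prompts for credentials interactively
jf c add my-jfrog --url=https://mycompany.jfrog.io

# from the project root: resolve the project's dependencies and scan them with Xray
jf audit
```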
Is there a way to automate storing code updates to GitHub after a developer publishes a new version of code from Visual Studio (2017 or 2019)?
Or is there a way to automate storing code updates to any code repository?
We also currently use VisualSVN, but are open to other repository software packages if they solve this problem for us.
We publish web projects and console apps to on-prem servers, so my understanding is that GitHub Actions won't work for us (yet).
If you are pushing to an on-premise Git repository hosting server, you can add a post-receive hook to that remote repository.
That hook can analyze what was just pushed and push it on to GitHub.
See git-post-receive-hook-push-to-mirrors as an example of such a hook.
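The core of such a hook is small. A minimal sketch, assuming a placeholder GitHub URL and that the server has credentials (e.g. an SSH deploy key) allowing it to push:

```sh
#!/bin/sh
# hooks/post-receive in the on-premise bare repository;
# the GitHub URL below is a placeholder
git push --mirror git@github.com:example-org/example-repo.git
```

A real post-receive hook also receives the updated refs on stdin, so it could mirror only selected branches, but a blanket --mirror is enough to keep GitHub in sync.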
Can anyone suggest a flow for publishing Azure DevOps Server 2019 Maven artifacts (which expose an API) to Maven Central for more global access?
Update:
Let me rephrase my issue.
My team uses on-premise Azure DevOps to develop and host Android packages that are used as an API by other applications.
When an API version is stable we want to publish the API packages (currently in the on-premise Artifacts feed) to Maven Central. I wish to use release management so that I can use the approval mechanism.
Azure Artifacts, part of Azure DevOps, offers the ability to host and share Maven, npm, NuGet, and Python package feeds within your organization.
If you want to host the packages on Maven Central instead of Azure Artifacts, you should check how to upload packages to Maven Central on the Maven side:
https://maven.apache.org/repository/guide-central-repository-upload.html
If your goal is only to share packages publicly, you may try Azure DevOps Services (https://dev.azure.com), which supports public feeds. This feature is currently only available in Azure DevOps Services.
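For the Maven side of that upload, the usual route has been Sonatype OSSRH: you claim your group ID there and deploy signed artifacts to its staging repository. A hedged pom.xml sketch with the standard OSSRH endpoints (see the guide linked above for the full requirements, such as GPG signatures and sources/javadoc jars):

```xml
<!-- pom.xml fragment: the "ossrh" server ID must match a <server> entry
     with credentials in settings.xml -->
<distributionManagement>
  <snapshotRepository>
    <id>ossrh</id>
    <url>https://oss.sonatype.org/content/repositories/snapshots</url>
  </snapshotRepository>
  <repository>
    <id>ossrh</id>
    <url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>
  </repository>
</distributionManagement>
```

The 'mvn deploy' step itself can be run from an Azure DevOps release stage, so the approval mechanism you mention can still gate what gets promoted to Central.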
We have an on-premise infrastructure with a Git repository, a CI server and an artifact repository. We are using Gradle in our project. This is working perfectly fine for our regular CI/CD processes.
We are planning to move our code to the cloud using private GitHub and Travis CI. The problem is that we have a couple of third-party jars that are not available in any artifact repositories on the internet, e.g. Maven Central and others. The on-premise infrastructure worked fine because we had manually installed those jar files in our internal artifact repository.
Our builds on travis are now failing. What is the best way to provide these third party jars to travis during build time?
P.S.: The third party jars are drivers provided by some of our vendors. These are not open source and cannot be pushed to artifact repositories on the internet.
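One common pattern when the build has to be self-contained is to commit the vendor jars to the (private) repository, license permitting, and reference them directly, so the build no longer needs the internal artifact repository. A rough Gradle sketch with placeholder paths and file names:

```groovy
// build.gradle: reference vendor jars committed to the repository
// (directory and file names are placeholders)
dependencies {
    implementation files('third-party-libs/vendor-driver-1.2.3.jar')
}
```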
I am working on automated builds using Maven and Jenkins. I am looking for the best open source repository manager for Maven, so that I can have an integration between Maven and Jenkins via a repository manager.
You have at least four choices:
Nexus
Artifactory
Apache Archiva
Reposilite
Each has pros and cons. I'd go with Nexus since it is backed by Sonatype who are also involved in Maven development. I liked the Artifactory UI though.
Both Nexus and Artifactory have supported professional editions as well.
These are linked from the Maven site as well.
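Whichever manager you choose, the Maven/Jenkins integration is typically just a mirror entry in settings.xml, so local builds and Jenkins builds both resolve through the manager. A minimal sketch, assuming a Nexus 3 instance at a placeholder host with the default 'maven-public' group repository:

```xml
<!-- ~/.m2/settings.xml (or a managed settings file on the Jenkins node);
     the URL is a placeholder -->
<settings>
  <mirrors>
    <mirror>
      <id>company-nexus</id>
      <mirrorOf>*</mirrorOf>
      <url>http://nexus.example.com:8081/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```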