Is it safe to commit the Gopkg.lock file?

I am new to Go and was wondering whether it's safe to commit the Gopkg.lock file to a VCS?

Yes, you should commit it. This file makes much less sense if everybody has their own version.
It makes the build reproducible from one developer to another, and more importantly it ensures the deployment build environment won't use an unexpected set of dependency versions.
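For illustration, a Gopkg.lock entry looks roughly like this (the project name, revision and version below are made up); it is the pinned revision that makes the build reproducible:

    [[projects]]
      name = "github.com/pkg/errors"
      packages = ["."]
      revision = "0123456789abcdef0123456789abcdef01234567"
      version = "v0.8.0"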

Related

Should I commit vendor directory with go mod?

I am using go modules on go1.12 to handle my Go dependencies. Is it best practice to also commit the vendor/ directory into version control?
This is somewhat related to Is it best-practice to commit the `vendor` directory? which asks this question in the case of using dep. With dep, committing vendor/ is the only way to get truly reproducible builds. What about for go modules?
I'd like to give some arguments in favour of committing vendor, go.mod and go.sum.
I agree with the accepted answer's arguments that it's technically unnecessary and bloats the repo.
But here is a list of counter-arguments:
Building the project doesn't depend on some code being available on GitHub/GitLab/... or on the Go proxy servers. Open source projects may disappear because of censorship, authors' incentives, licensing changes or other reasons I can't currently think of, which did happen on npm, the JavaScript package manager, and broke many projects. Not in your repo, not your code.
We may have used internal or 3rd party Go modules (private) which may also disappear or become inaccessible, but if they are committed in vendor, they are part of our project. Nothing breaks unexpectedly.
Private Go modules may not follow semantic versioning, which means the Go tools will rely on the latest commit hash when fetching them on-the-fly. Repo history may be rewritten (e.g. rebase) and you, a colleague or your CI job may end up with different code for the dependencies they use.
Committing vendor can improve your code review process. Typically we always commit dependency changes in a separate commit, so they can be easily viewed if you're curious.
Here's an interesting observation related to bloating the repo. If I'm doing a code review and a team member has included a new dependency with 300 files (or updated one with 300 files changed), I would be pretty curious to dive into that and start a discussion about code quality, the need for this change, or alternative Go modules. This may actually lead to a decrease in your binary size and overall complexity.
If I just see a single new line in go.mod in a new Merge Request, chances are I won't even think about it.
CI/CD jobs which perform compilation and build steps don't need to waste time and network bandwidth downloading the dependencies on every run. All needed dependencies are local and present (go build -mod=vendor; see the sketch after this list).
These are off the top of my head; if I remember something else, I'll add it here.
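A minimal sketch of the commit-the-vendor workflow under go modules (assuming Go 1.12, as in the question):

    # Populate vendor/ from go.mod, then commit everything together.
    go mod vendor
    git add go.mod go.sum vendor/
    git commit -m "Update dependencies"

    # Build strictly from the committed vendor/ directory.
    go build -mod=vendor ./...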
Unless you need to modify the vendored packages, you shouldn't commit them. Go modules already give you reproducible builds, as the go.mod file records the exact versions (or commit hashes, for untagged dependencies) of your dependencies, which the go tool will respect and follow.
The vendor directory can be recreated by running the go mod vendor command, and it's even ignored by default by go build unless you ask it to use it with the -mod=vendor flag.
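For illustration (the module path and dependencies below are made up), a go.mod file pins exact versions like this, with go.sum recording a checksum for each module:

    module example.com/myproject

    go 1.12

    require (
        github.com/pkg/errors v0.8.1
        golang.org/x/text v0.3.0
    )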
Read more details:
Go wiki: How do I use vendoring with modules? Is vendoring going away?
Command go: Modules and vendoring
Command go: Make vendored copies of dependencies

Is it best-practice to commit the `vendor` directory?

I am using dep to handle my Go dependencies. Is it best practice to also commit the vendor directory into version control? Or is it best practice to always execute dep ensure after checking out a repository?
The dep tool's FAQ answers this:
Should I commit my vendor directory?
It's up to you:
Pros
It's the only way to get truly reproducible builds, as it guards against upstream renames, deletes and commit history overwrites.
You don't need an extra dep ensure step to sync vendor/ with Gopkg.lock after most operations, such as go get, cloning, getting latest, merging, etc.
Cons
Your repo will be bigger, potentially a lot bigger, though prune can help minimize this problem.
PR diffs will include changes for files under vendor/ when Gopkg.lock is modified, however files in vendor/ are hidden by default on GitHub.
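For reference, the dep ensure step mentioned in the pros above is what synchronizes vendor/ with Gopkg.lock; a typical sequence looks like this (a sketch, assuming dep is installed):

    # Sync vendor/ with the revisions pinned in Gopkg.lock.
    dep ensure

    # Update dependencies within the Gopkg.toml constraints,
    # rewriting Gopkg.lock and vendor/ in the process.
    dep ensure -update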
Imagine what would happen to your project if a dependency was taken offline by the author. Until Go has a central server to hold all packages, one that makes them impossible to delete, a lot of people will always see the need to commit the vendor folder.
I don't commit vendor/ for readily available sources, because in that case many commits are bloated with changes in vendored code. When I want to update, I do it and then commit the updated Gopkg.* files.

Is it possible to force a teamcity vcs root to always use the same commit?

My build configuration pulls code from multiple vcs roots. As part of my build process I build an open source project from github, which unfortunately has just introduced a dependency that breaks the build on my server.
Is there any way I can change the specification of the VCS root to limit it to the commit before the dependency was introduced? I don't want to manually run a specific commit as this would force the other repository in the build back to the same point in time, which would mean I'm never building my latest code. For reasons outside of the scope of this question, I need to build all projects from source, so can't take a pre-compiled version.
Is it possible to force TeamCity to always check out the same commit of a VCS root?
Most projects use tags to identify versions, so you can probably use those.
Or, if you always need only one specific version, you can fork the repo and add a branch/tag on your copy.
Unfortunately, a git refspec does not provide a way to specify a commit by hash.
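A sketch of the fork-and-tag workaround (the repository URL, tag name and commit hash are placeholders):

    # In your fork of the dependency, tag the last known-good commit.
    git clone https://github.com/yourfork/project.git
    cd project
    git tag known-good <commit-before-the-breaking-change>
    git push origin known-good

    # Then point the TeamCity VCS root's branch specification at
    # refs/tags/known-good instead of a moving branch.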

What is the most effective way to lock down external dependency "versions" in Golang?

By default, Go pulls imported dependencies by grabbing the latest version in master (github) or default (mercurial) if it cannot find the dependency on your GOPATH. And while this workflow is quite simple to grasp, it has become somewhat difficult to tightly control. Because all software change incurs some risk, I'd like to reduce the risk of this potential change in a manageable and repeatable way and avoid inadvertently picking up changes of a dependency, especially when running clean builds via CI server or preparing to deploy.
What is the most effective way I can pin (i.e. lock down or capture) a package dependency so I don't find myself unable to reproduce an old package, or even worse, unexpectedly broken when I'm about to release?
---- Update ----
Additional info on the Current State of Go Packaging. While I ended up (as of 7.20.13) capturing dependencies in a 3rd party folder and managing updates (à la Camlistore), I'm still looking for a better way...
Here is a great list of options.
Also, be sure to see the go 1.5 vendor/ experiment to learn about how go might deal with the problem in future versions.
You might find the way Camlistore does it interesting.
See the third party directory and in particular the update.pl and rewrite-imports.sh scripts. These scripts update the external repositories, change imports if necessary, and make sure that a static version of the external repositories is checked in with the rest of the Camlistore code.
This means that Camlistore has a completely repeatable build, as it is self-contained, but the third party components can be updated under the control of the Camlistore developers.
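A rough sketch of that idea, not Camlistore's actual scripts (the dependency and the third_party import prefix below are illustrative): check a static copy of the dependency into the tree and rewrite imports to point at it:

    # Check a static copy of the dependency into third_party/.
    git clone https://github.com/pkg/errors third_party/github.com/pkg/errors
    rm -rf third_party/github.com/pkg/errors/.git

    # Rewrite imports in your own code to use the vendored copy.
    grep -rl '"github.com/pkg/errors"' . --include='*.go' | xargs sed -i \
        's|"github.com/pkg/errors"|"myproject.org/third_party/github.com/pkg/errors"|g'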
There is a project to help you manage your dependencies: check out gopack.
godep
I started using godep early last year (2014) and have been very happy with it (it addressed the concerns I mentioned in my original question). I am no longer using custom scripts to manage the vendoring of dependencies, as godep just takes care of it. It has been excellent for ensuring that no drift is introduced regardless of timing or a machine's package state. It works with the existing go get mechanism and introduces the ability to pin (godep save) and restore (godep restore) based on Godeps/Godeps.json.
Check it out:
https://github.com/tools/godep
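A minimal sketch of that godep workflow (assuming godep is on your PATH):

    # Record the current revision of every dependency in Godeps/Godeps.json
    # and copy their source into the project.
    godep save ./...

    # On another machine (or after a fresh clone), check the dependencies
    # out at the pinned revisions.
    godep restore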
There is no built-in tooling for this in Go. However, you can fork the dependencies yourself, either on local disk or in a cloud service, and only merge in upstream changes once you've vetted them.
The 3rd party repositories are completely under your control. 'go get' clones tip, you're right, but you're free to check out any revision of the repository cloned by go get (or by you). As long as you don't do 'go get -u', nothing touches the 3rd party repositories already sitting on your hard disk.
Effectively, your external, locally cloned, dependencies are always locked down by default.
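For example (the package path and revision are placeholders), pinning a dependency by hand looks like this:

    # Packages fetched by 'go get' are ordinary clones under $GOPATH/src.
    cd "$GOPATH/src/github.com/some/dependency"
    git checkout <known-good-revision>
    # Builds now use this revision until you explicitly run 'go get -u'.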

Is it possible to use the maven-release-plugin with a specific revision?

I am thinking about a deployment pipeline using SVN, Jenkins and Maven. At the moment I'm stuck at the point where I usually would call mvn release:perform on a working copy.
When thinking in deployment pipelines, I want to create a pipeline where every commit could be used to release a software to test/production. Let's say I have 5 builds, and I decide to release build 3 (with revision 3) to production. There will already be 2 new commits to trunk (which is now at revision 5).
Is it possible to use the maven-release-plugin to checkout/build/tag/commit a release at revision 3? When the maven-release-plugin finishes the release it usually commits the modified POMs to trunk.
I'm happy about any kind of information or advice here, so feel free to point me to books (like http://www.amazon.com/Continuous-Delivery-Deployment-Automation-Addison-Wesley/dp/0321601912), blog posts, Jenkins documentation... Maybe I'm completely on the wrong track.
By default, the release plugin creates the release based on the contents of your working copy, it just ensures that you don't have any uncommitted content before doing so. AFAIK it doesn't force an update of the sources, as that's usually the job of the Continuous Integration system (Jenkins in your case). So whatever is checked out by Jenkins will be released.
What you're trying to do sounds more like a configuration change on the Jenkins side, pointing it to the right revision.
On the other hand, if the POM files are modified as part of the release but have been changed in SVN in the meantime, you will run into a conflict when Maven wants to check in the modified POM files. That's a situation that might happen, depending on how far back you want to go with the release.
Based on this, it might make more sense to always create a branch before doing a release. So you would create a branch based on revision 3 and then create your release in that branch. This way, you wouldn't run into issues with committing resources that have changed in more recent revisions.
Creating the branch and checking it out could probably be automated through Jenkins and Maven as well.
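A sketch of that branch-based release (the repository URL, revision number and branch name are placeholders):

    # Create a branch from the exact revision you want to release.
    svn copy -r 3 http://svn.example.com/repo/trunk \
        http://svn.example.com/repo/branches/release-from-r3 \
        -m "Cut release branch at revision 3"

    # Release from a working copy of the branch; the release plugin's
    # pom commits land on the branch instead of trunk.
    svn checkout http://svn.example.com/repo/branches/release-from-r3
    cd release-from-r3
    mvn release:prepare release:perform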
As far as I have tested it, it is not possible.
More explicitly, as nwinler said, when you release, Maven tries to commit the modified pom. But if it's an older revision than the current one, SVN will complain that your sources are not up to date. So it won't work... as far as I know.
You may want to read docs about build promotion. I haven't found one clear enough to point to (in the few minutes spent writing this message).
