Is it ok to use the master branch of lib/pq in production?
When you execute go get github.com/lib/pq you get the master branch, but there is a tagged release, v1.0.0.
Would it be better to use releases instead of master branch?
lib/pq
Releases
v1.0.0
Initial tagged release. No major recent changes.
Merge pull request #778 from lib/go-mod
add a go.mod file in preparation for a tagged release
lib/pq v1.0.0 adds support for Go versioned modules.
With Go 1.12 approaching, consider migrating your production code to versioned Go modules.
The first beta release of Go 1.12 is scheduled for this week (Dec. 3, 2018).
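A minimal sketch of a go.mod that pins the tagged release instead of master (the application's own module path here is hypothetical):
module github.com/you/yourapp
require github.com/lib/pq v1.0.0
go build then records the matching checksum in go.sum, and the dependency only changes when you explicitly ask for another version, e.g. go get github.com/lib/pq@v1.0.0 for the tag or go get github.com/lib/pq@master for the tip.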
Go 1.11 Release Notes
Modules, package versioning, and dependency management
Go 1.11 adds preliminary support for a new concept called “modules,”
an alternative to GOPATH with integrated support for versioning and
package distribution. Using modules, developers are no longer confined
to working inside GOPATH, version dependency information is explicit
yet lightweight, and builds are more reliable and reproducible.
Module support is considered experimental. Details are likely to
change in response to feedback from Go 1.11 users, and we have more
tools planned. Although the details of module support may change,
projects that convert to modules using Go 1.11 will continue to work
with Go 1.12 and later. If you encounter bugs using modules, please
file issues so we can fix them. For more information, see the go
command documentation.
Proposal: Versioned Go Modules
Go 1.11 Modules.
Related
Hi, I want to pin a particular version of a dependency in my go.mod, like:
github.com/dependency v1.7.0
but when I run go test or go build, it sometimes gets updated to
github.com/dependency v1.8.0
The tricky part is that sometimes it changes and sometimes it does not. We want to pin the older version because the new one has a bug. Any idea why this is happening?
I believe this is happening because some other dependency of yours requires a higher version of that module. From the Go documentation:
If multiple versions of a particular module are added to the list, then at the end only the latest version (according to semantic version ordering) is kept for use in the build.
You can try the commands listed in the documentation, or run go build with the -mod=readonly flag. That should help you understand what is triggering the upgrade.
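For example (using the hypothetical module path from the question), you can first ask the toolchain which module is pulling in the newer version:
go mod graph | grep github.com/dependency
go mod why -m github.com/dependency
If the buggy release really must be avoided, an exclude directive in the main module's go.mod makes the go command treat it as unavailable and select the next available version instead:
exclude github.com/dependency v1.8.0
exclude only takes effect in the main module's go.mod, not in modules that merely depend on it.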
Go modules do not support multiple minor versions of the same package in a single build; if several are required, only the latest of them is kept for use in the build.
Some other dependency probably requires the higher version, and that requirement replaces the older one.
If a module out there pushed v1.8.0 with a bug, file a bug or fork the repository as needed.
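If you do fork, a replace directive in the main module's go.mod points the build at the fork without touching any import paths (all paths and versions below are hypothetical):
require github.com/dependency v1.8.0
// build against our fixed fork instead of the upstream module
replace github.com/dependency => github.com/yourorg/dependency v1.8.1
Like exclude, replace is only honoured in the main module's go.mod.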
In Golang, we can specify open source libraries on GitHub as dependencies. For example:
import "github.com/RichardKnop/somelibrary"
This will try to look for a branch based on your Go version and default to master if I understand correctly.
So there is no way to import a specific release of a dependency, e.g.:
import "github.com/RichardKnop/somelibrary#v1.4.8"
What is the best practice to manage dependencies in Go then?
I can see two approaches.
I. Version Modules
Should I create new modules for major versions with breaking changes?
For example, my Go library could define modules v1 and v2, so that you could do:
import "github.com/RichardKnop/somelibrary/v1"
Or:
import "github.com/RichardKnop/somelibrary/v2"
Based on what you need. Any changes made within v1 or v2 would then be required not to break existing APIs or functionality.
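(Note: this is essentially the convention Go modules later adopted, with the major version embedded in the import path. A sketch, reusing the paths from this question:)
// go.mod of the library's v2 line
module github.com/RichardKnop/somelibrary/v2
// consumer code imports the major version it was written against
import "github.com/RichardKnop/somelibrary/v2"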
II. Forking
This would give you complete control over the versions of the external dependencies your Go code requires.
For example, you could fork github.com/RichardKnop/somelibrary into your own GitHub account and then in your code do:
import "github.com/ForkingUser/somelibrary"
Then you would have to fork all external dependencies, which seems a bit of an overkill. However, it would give you total control over versions. You could keep your forks at a version you know works with your code and only update them once you have checked that new releases of the dependencies do not break anything.
Thoughts?
Feb. 2018: the vendoring approach presented below (in 2015/2016) might end up disappearing if vgo is integrated into the toolchain.
See my answer below.
August 2015 edition: Go 1.5 comes with built-in (but still experimental) vendoring support. Setting the environment variable GO15VENDOREXPERIMENT will make go build and friends look for packages in the ./vendor directory as well as GOPATH. See VonC's answer and the design document for more details.
III. Vendoring
AFAIK, this is the most widely used way of ensuring that your builds are reproducible and predictable. The Go team itself uses vendoring in its repo and is now discussing a unified dependency manifest file format. From the Go toolchain developers mailing list:
In Google’s internal source tree, we vendor (copy) all our dependencies into our source tree and have at most one copy of any given external library. We have the equivalent of only one GOPATH and rewrite our imports to refer to our vendored copy. For example, Go code inside Google wanting to use “golang.org/x/crypto/openpgp” would instead import it as something like “google/third_party/golang.org/x/crypto/openpgp”.
(...)
Our proposal is that the Go project:
1. officially recommends vendoring into an “internal” directory with import rewriting (not GOPATH modifications) as the canonical way to pin dependencies,
2. defines a common config file format for dependencies & vendoring,
3. makes no code changes to cmd/go in Go 1.5. External tools such as “godep” or “nut” will implement 1) and 2). We can reevaluate including such a tool in Go 1.6+.
Note: June 2015, the first support for vendoring appears in Go 1.5!
See c/10923/:
When GO15VENDOREXPERIMENT=1 is in the environment, this CL changes the resolution of import paths according to the Go 1.5 vendor proposal:
If there is a source directory d/vendor, then, when compiling a source file within the subtree rooted at d, import "p" is interpreted as import "d/vendor/p" if that exists.
When there are multiple possible resolutions, the most specific (longest) path wins.
The short form must always be used: no import path can contain “/vendor/” explicitly.
Import comments are ignored in vendored packages.
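A small example of that resolution rule (the package names are made up):
$GOPATH/src/d/
    main.go          // contains: import "p"
    vendor/
        p/
            p.go     // satisfies import "p" for all code under d/
When compiling main.go, import "p" resolves to d/vendor/p; the same import in code outside of d/ still resolves to $GOPATH/src/p.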
Update January 2016: Go 1.6 will make vendoring the default.
And as detailed in the article "MOST GO TOOLS NOW WORK WITH GO15VENDOREXPERIMENT":
1.6 brings support for /vendor/ to most tools (like the oracle) out of the box; use the Beta to rebuild them.
issue 12278 has been resolved.
there still is an issue with goimports, and there is a CL that can be cherry-picked.
Update August 2018: this (vgo presented below) is now implemented with Go 1.11 and modules.
Update Feb. 2018, 3 years later.
There is a new approach developed by Russ Cox, from the core Golang development team.
The vgo project.
go get -u golang.org/x/vgo
This proposal:
keeps the best parts of go get,
adds reproducible builds,
adopts semantic versioning,
eliminates vendoring,
deprecates GOPATH in favor of a project-based workflow, and
provides a smooth migration path from dep and its predecessors.
It is based on a new MVS ("Minimal Version Selection") algorithm:
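(A tiny worked example, with hypothetical modules A, B and C: if A requires B >= v1.2.0 and C >= v1.4.0, and C v1.4.0 itself requires B >= v1.3.0, then MVS selects B v1.3.0 and C v1.4.0: for each module, the minimum version satisfying every stated requirement, rather than silently the latest published release.)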
You can see:
the first articles in Go & Versioning
the discussion in this thread
the analysis of that thread by Jon Calhoun
the counter-point by Sam Boyer in "Thoughts on vgo and dep":
He was leading the dep initiative
the first debate (YouTube) between Russ, Sam and Jesse Frazelle!
May 2018: Artifactory proposes a vgo proxy
The recent release of Artifactory 5.11 added support for vgo-compatible Go registries (and go proxy) giving the community a variety of capabilities when developing with Go.
Here are just a few of them:
Local repositories in Artifactory let you set up secure, private Go registries with fine-grained access control to packages according to projects or development teams.
A remote repository in Artifactory is a caching proxy for remote Go resources such as a GitHub project. Accessing a go proxy through Artifactory removes your dependency on the network, or on GitHub since all dependencies needed for your Go builds are cached in Artifactory and are therefore locally available. This also removes the risk of someone mutating or removing a dependency from version control, or worse, force-pushing changes to a remote Git tag thus changing what should be an immutable version which can create a lot of confusion and instability for dependent projects.
A virtual repository aggregates both local and remote Go registries giving you access to all Go resources needed for your builds from a single URL, hiding the complexity of using a combination of both local and remote resources.
It is also possible to describe dependencies in Maven by using mvn-golang-wrapper; in that case the Maven configuration looks like the text below:
<plugin>
  <groupId>com.igormaznitsa</groupId>
  <artifactId>mvn-golang-wrapper</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <id>golang-get</id>
      <goals>
        <goal>get</goal>
      </goals>
      <configuration>
        <packages>
          <package>github.com/gizak/termui</package>
        </packages>
        <buildFlags>
          <flag>-u</flag>
        </buildFlags>
        <goVersion>1.6</goVersion>
      </configuration>
    </execution>
  </executions>
</plugin>
I've read a whole bunch of articles and SO questions on importing 3rd-party Go packages, which all seems straightforward, but none of the ones I have read make any reference to versioning. In Dartlang there's the pubspec file that defines your package, including its version, and its dependencies, including their required versions. What if I do a go get github.com/gorilla/sessions and write my app, then 6 months later I have to clear my directories and re-get everything, and in that time the package has been updated and broken backwards compatibility with my code that was using the older version?
The official answer, from the Go FAQ:
If you're using an externally supplied package and worry that it might change in unexpected ways, the simplest solution is to copy it to your local repository. (This is the approach Google takes internally.) Store the copy under a new import path that identifies it as a local copy.
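For example (the organisation prefix is made up): instead of
import "github.com/gorilla/sessions"
your code would import the copied package as
import "yourorg/third_party/github.com/gorilla/sessions"
so that the build never reaches out to the upstream repository.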
There are many alternatives to that approach, mainly based on declaring the exact versions of the projects you are using.
See for instance "Dead Simple Dependencies in Go -- Keep it simple and keep your sanity." (based on emil2k/vend)
The main different options for Go Dependency Management are listed at:
"Go Package Management -- A summary of dependency management in Go"
(And its associate GOPM mailing list)
Update July 2015:
the official vendoring approach from Go team is discussed here.
an alternative go build tool called "gb" is proposed at getgb.io by Dave Cheney.
Update Q4 2017: as mentioned below, go dep is the official tool for pinning versions of dependencies (even though that pinning approach is not without criticism: see "The cargo cult of versioning").
It should be merged into the toolchain when Go 1.10 development begins, according to its roadmap.
Update Q2 2018: go dep has been replaced by go mod (modules) in Go 1.11, following the work on vgo.
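With modules, pinning the package from the question is now a one-liner (the version shown is only an example):
go mod init github.com/you/yourapp        # writes go.mod
go get github.com/gorilla/sessions@v1.2.0 # records that exact version in go.mod (and its hash in go.sum)
Subsequent builds keep using exactly that version until you explicitly upgrade, e.g. with go get -u.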
I use dep as the dependency management tool for my Go projects. See the dep tool link for more info.
dep is a dependency management tool for Go. It requires Go 1.9 or newer to compile.
dep was the "official experiment." The Go toolchain, as of 1.11, has (experimentally) adopted an approach that sharply diverges from dep. As a result, we are continuing development of dep, but gearing work primarily towards the development of an alternative prototype for versioning behavior in the toolchain.
Current status: January 2019
dep is safe for production use.
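Typical usage looks like this (the pinned dependency in Gopkg.toml is hypothetical):
dep init      # creates Gopkg.toml and Gopkg.lock and populates vendor/
dep ensure    # syncs vendor/ with the manifest and lock
# Gopkg.toml
[[constraint]]
  name = "github.com/foo/bar"
  version = "1.2.0"
The Gopkg.lock file records the exact revisions, so teammates and CI reproduce the same vendor/ tree.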
We're using composer, satis and SVN to manage our in-house PHP libraries.
We commit changes to SVN trunk during development, then tag versions (following semantic versioning) when they're ready for testing.
Once a library version is tagged, we can use composer as part of our deployment to the testing environment. Following successful testing, we'd then deploy that same version to production.
The issue here is that once we've tagged a version for testing, we have to be very careful, as the newly tagged version will be picked up by composer when preparing the next production release.
What I'm imagining is that we'd tag a version as a beta or RC (e.g. v1.1RC1) and somehow configure our deployment process so that it refuses to deploy an RC or beta to production. If a version is tested successfully, we'd re-tag it as a released version (v1.1RC1 -> v1.1) and release that.
Can this be achieved?
From what you are saying, I understand that you are actually afraid of tagging a new version of a library because that code could actually be used and break that other application, right?
One approach would be to do good testing. I don't see why it should be a problem to tag a version of a library: if the tests are all green, there is no reason not to tag it. This would work even if the tests are basically only "let's see if it works, manually".
Now the second step is to integrate that new version into the application: Run composer update and see if the application is still running, i.e. start all the tests and wait for green.
I guess it might be a good idea to have a separate area where you check out the application, intentionally run composer update to fetch all the newest libraries, run all the tests and report that a) there are updates and b) they work. A developer should then confirm the update, i.e. do it again manually and commit the resulting composer.lock file, or grab the resulting lock file from that update test.
I don't think there is a benefit in using non-production release versions. You have to deal with the next version anyway; constantly toggling the minimum-stability setting or adding @RC or @beta flags to the library's version requirement doesn't really help.
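In other words, stick with the default minimum-stability of "stable" and plain semver constraints; a minimal composer.json sketch (the package name is made up):
{
    "minimum-stability": "stable",
    "require": {
        "yourvendor/yourlib": "1.1.*"
    }
}
With that constraint, v1.1RC1 is never eligible for a production install; as soon as you re-tag the commit as v1.1 (or v1.1.0), composer picks it up.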