How to use a Go package that is in another module

I have two repos, github.com/x/a and github.com/x/b, and they both have Go modules at the root. How can I access a package that is in the other repo? Normally I would be able to access it by running go get github.com/x/b if I want to use b in a. But neither is in production and we are still working in a development branch, so I need to find a way to do this locally. Any ideas?

The quick and dirty way is to clone the repository into GOPATH/src. Then you can import the packages simply with
import "a" or import "b"

Related

How to structure a golang project with several go modules: a "core" module, and some "adapter" modules referencing the "core" one?

Hard to put in a single sentence, but here is the situation. I am developing a golang package, and my intention is for it to be go-gettable. The core of the package provides some "central" functionality, an HTTP middleware, and I need several "adapter" packages to support some of the most popular golang HTTP frameworks.
Each adapter's responsibility is to get the required information from the HTTP request and then consume a core service where the logic resides. So the main logic resides in a single place while each adapter acts like, well, an adapter.
The first approach I can think of is to have both the core and adapters as part of the same module, but this will add a lot of unnecessary dependencies to the importing project. For instance, if you want to import the package to support framework A the package will indirectly add all the dependencies required by the adapters for other frameworks, even when not used.
The approach I am considering is to have several modules in the same repository: a core module and a separate module for each adapter. Each adapter module will then import the core module:
|- core
| |- go.mod
|
|- adapter1
| |- go.mod
|
|- adapter2
| |- go.mod
This way, the adapter 1 module can be imported and will only carry its dependencies and those of the core module, leaving adapter 2's dependencies out of the picture.
I got this structure to work locally: I can successfully import adapter 1, or adapter 2, from another golang project using a replace directive in go.mod. But when I push changes to the git repo and try to import directly from there, I can't get go mod to download the latest version/tag of each module, not even by explicitly providing the version tag I want to use. It keeps downloading an older version and complaining about some missing code parts (that exist only in the latest version).
I followed this guide on sourcing multiple modules in a single repository, but an important distinction is that in my case I am sourcing a module that references another module in the same repo, while the example in the guide shows how to source modules independently.
So my question is, is it at all possible to source a go module that references another go module on the same repo?
Would it be a better approach to have my "core" module on a separate repository and then an "adapters" repo/module with a package for each adapter?
The purpose of having it all in the same repo is to make development easier, but it is complicating version control a lot.
Any advice will be greatly appreciated. If you need me to clarify something I would gladly do so. Thanks in advance.
Consider that go install of your module would not be possible with a replace directive in its go.mod (issue 44840).
That would result in the error message:
The go.mod file for the module providing named packages contains one or more
replace directives.
It must not contain directives that would cause it to be interpreted differently
than if it were the main module.
So one module per repository is preferable, and you can then group your repositories into one parent Git repository (through submodules, each one tracking a branch for easy updates) for convenience.
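As a rough sketch of that layout (module paths here are placeholders), each adapter lives in its own repository and its go.mod requires the core module like any other dependency:

module github.com/user/adapter1

go 1.16

require github.com/user/core v1.0.0

The parent "workspace" repository can then add each repo with git submodule add -b main https://github.com/user/core core (and likewise for each adapter), and pull the latest commit of every tracked branch later with git submodule update --remote.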

go get is adding folders when attempting to import?

We have 2 repos relevant to this question:
gitlab.com/company/product/team/service
gitlab.com/company/product/team/api
They both have go.mod files. The api repo is dead simple. It contains a dozen or so TypeScript files that go through a process to generate mirror-image Go structs. We use these in integration and unit tests, sort of like a stylesheet, so we can test the structure of responses from service. The api module has 0 imports. No fmt, time, nothing. Here's api's go.mod file:
module gitlab.com/company/product/team/api/v2
go 1.16
The latest tag on its master branch is v2.68.0. Service's go.mod has this require line for api:
require (
...
gitlab.com/company/product/team/api/v2 v2.68.0
...
)
And it Works on My Machine(TM). When I run go get, go mod tidy, go mod download, go mod vendor, etc., the correct version gets downloaded. Not so in Docker. To work with our CI/CD pipeline and eventually get deployed in Kubernetes, we build a Docker image to house the code. The RUN go mod vendor in service's Dockerfile gives us an error when it tries to import the api (adding a verbose flag does not result in more details here):
go: gitlab.com/company/product/team/api/v2#v2.68.0: reading gitlab.com/company/product/team/api/team/api/go.mod at revision team/api/v2.68.0: unknown revision team/api/v2.68.0
Note the repeated folders in the ../team/api/team/api/go.mod url.
Our Dockerfile is loaded with an SSH key that's registered with GitLab, the necessary entries are present in ~/.ssh/known_hosts, there's a ~/.netrc file with a GitLab access token that has the read_api permission, GOPRIVATE=gitlab.com/company/* and GO111MODULE=on are set, and when it runs go mod vendor it has no issues with any of the rest of our gitlab.com/company/product/team/* imports. These settings mirror my local machine, where go get works.
Why, for this one repo, would go mod vendor decide to repeat folders when attempting to import?
Turns out the answer isn't anything that SO users would have had much insight into. The .netrc token in service's Dockerfile is an access token that only works for the service project. It was created in the settings of the service repo instead of coming from a service account that has access to all the necessary repos. When go uses it to access the api repo, the token grants no access, and go has trouble asking GitLab which folder in the URL is the RepoRoot, because GitLab erroneously claims every folder in the path is a repo. If you're using subgroups in GitLab and running go get, you know what I'm talking about.
It worked on my machine because my .netrc has my own token, and I have enough access for this not to be an issue.
We fixed it by updating the .netrc file in the Dockerfile to use a token generated by a service account that has API access to all relevant repos.
I was about to delete the question, but resources discussing this problem are few and far between.
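For reference, a minimal sketch of the relevant Dockerfile pieces, with placeholder names and a token issued to a service account that can read every repo in the group:

# build-time token for the service account (name and injection mechanism are placeholders)
ARG GITLAB_TOKEN

# treat the whole group as private so go skips the public proxy and checksum database
ENV GOPRIVATE=gitlab.com/company/*
ENV GO111MODULE=on

# credentials go uses for https access to gitlab
RUN printf "machine gitlab.com login service-account password %s\n" "$GITLAB_TOKEN" > /root/.netrc

With a token that is valid for every repository along the import path, go can ask GitLab which folder is the repository root instead of guessing its way to .../team/api/team/api.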

How to create golang modules for others?

Example Scenario
I have an AWS S3 bucket with lots of objects which a couple of my applications need to access. The applications use some info available to them to form the S3 object name, get the object, and run a transformation on the object data before using it for further processing specific to the application.
I would like to create a module which will hold the logic for forming the object name, obtaining it from S3, and then running the transformation on the object data, so that I won't be duplicating these functions in multiple places.
In this scenario should I add AWS SDK as a dependency in the module? Keep in mind that the applications might have to use AWS SDK for something else specific to that application or they might not require it at all.
In general, what is the best way to solve problems like this, i.e. where to add the dependency? And how to manage different versions?
If your code has to access packages from the AWS SDK then yes, you have no choice but to add it as a dependency. If it doesn't, and the logic is generic or you can abstract it away from the AWS SDK, then you don't need the dependency (and in fact the Go tooling, such as go mod tidy, will remove the dependency from go.mod if nothing imports it).
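If the shared logic can be written against a small interface instead of SDK types, the module itself stays SDK-free. A minimal sketch with hypothetical names (transform, ObjectFetcher, ObjectName, Process):

package transform

import "io" // io.ReadAll requires Go 1.16+

// ObjectFetcher abstracts the storage backend so this module does not
// depend on the AWS SDK; an S3-backed implementation lives in each
// application that actually needs it.
type ObjectFetcher interface {
	Fetch(key string) (io.ReadCloser, error)
}

// ObjectName builds the object key from application-provided info.
func ObjectName(appID, date string) string {
	return appID + "/" + date + ".json"
}

// Process fetches the named object and runs the shared transformation.
func Process(f ObjectFetcher, key string) ([]byte, error) {
	rc, err := f.Fetch(key)
	if err != nil {
		return nil, err
	}
	defer rc.Close()

	data, err := io.ReadAll(rc)
	if err != nil {
		return nil, err
	}
	// ...the shared transformation on data goes here...
	return data, nil
}

Only the applications that actually talk to S3 implement ObjectFetcher with the AWS SDK and pull the SDK into their own go.mod; the shared module stays dependency-light.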
In this scenario should I add AWS SDK as a dependency in the module? Keep in mind that the applications might have to use AWS SDK for something else specific to that application or they might not require it at all.
Yes, if any package from your module depends on the AWS SDK, the Go modules system is going to add the AWS SDK as a dependency of your module. There is nothing special you are supposed to do with your module.
Try this with Go 1.11 or higher (and make sure to work outside of GOPATH):
Write your module like this:
Tree:
moduledir/packagedir1
moduledir/packagedir2
Initialize the module:
Recipe:
cd moduledir
go mod init moduledir   # or go mod init github.com/user/moduledir
Build module packages:
Recipe:
go install ./packagedir1
go install ./packagedir2
Module things are supposed to automagically work!
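Concretely, once a package in the module imports the AWS SDK and go mod tidy (or a build) runs, the SDK is recorded as a requirement in the module's go.mod, roughly like this (module path and version are placeholders):

module github.com/user/moduledir

go 1.16

require github.com/aws/aws-sdk-go v1.44.0

with the matching checksums recorded in go.sum.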
In general, what is the best way to solve problems like this, i.e. where to add the dependency? And how to manage different versions?
The Modules system is going to automatically manage dependencies for your module and record them in the files go.mod and go.sum. If you need to override some dependency, you should use the 'go get' command for that. For instance, see this question: How to point Go module dependency in go.mod to a latest commit in a repo?
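For instance, a dependency can be pinned to a specific version or commit with commands like these (module paths and versions are placeholders):

go get github.com/aws/aws-sdk-go@v1.44.0
go get github.com/user/somemod@1234abcd   # a specific commit hash

go.mod and go.sum are updated accordingly, and a later go mod tidy keeps them consistent.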
You can also find much information on Modules here: https://github.com/golang/go/wiki/Modules

unrecognized import path on openshift

I have some local packages defined within my application; for example, I have a crud model located at model/crud/crud.go.
Within my application I refer to them using import "model/crud", and likewise for all of my local dependencies.
This resolves perfectly fine within the context of my application on my local machine; however, when I try to push to OpenShift I get the following error:
imports model/crud: unrecognized import path "model/crud"
It looks like when OpenShift runs the build tool, it attempts to run go get on these imports in order to include them in the build path when compiling.
Is there a better way to resolve vendor specific dependencies without having to create a separate repo for them? I don't want to have to manage two separate repositories if I don't have to.
To find the import path you are supposed to use, take "$GOPATH/src/[...]/model/crud" and just remove the "$GOPATH/src/" part.
You should use the full import path, for example "github.com/user/project/model/crud".
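A minimal sketch of the change, assuming the project is hosted at github.com/user/project (substitute the real repository path):

// before: only resolves on the local machine
import "model/crud"

// after: a full import path that go get can fetch during the build
import "github.com/user/project/model/crud"

The code then needs to live at $GOPATH/src/github.com/user/project/model/crud (or inside a module declared with that path).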

Maintaining staging+prod environments from single repo, 2 remotes with revel buildpack on heroku

Revel models are defined under the models package, so in order to import them one must use the full repo path relative to the $GOPATH/src folder, which in this case is project/app/models and thus results in
import PROJECTNAME/app/models
So far, so good if you're using your app name as the folder name on your local dev machine and have dev+prod environments only.
Heroku's docs recommend using multiple apps for different environments (e.g. staging), with the same repository and distinct origins.
This is where the problem starts: since the staging environment resides under an alternative app name (let's say PROJECTNAME_STAGING), its sources are stored under PROJECTNAME_STAGING, but the actual code still imports PROJECTNAME/app/models instead of PROJECTNAME_STAGING/app/models, so compilation fails, etc.
Is there any way to manage multiple environments with a single local repo and multiple origins with Revel's Heroku buildpack, or is a feature needed in the buildpack that is yet to be implemented?
In addition, there is a possible issue with the .godir file, which is required to be versioned and to contain the git path to the app; what about the multi-environment duality regarding this file?
The solution was simple enough:
The buildpack uses the string in .godir both as the argument to revel run and as the directory name under GOPATH/src. My .godir file had the format git.heroku.com/<APPNAME>.git; instead I just used the plain APPNAME format.
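As a concrete sketch (<APPNAME> stands for the real app name), the .godir contents change from

git.heroku.com/<APPNAME>.git

to

<APPNAME>

so the buildpack places the sources under GOPATH/src/<APPNAME> and passes <APPNAME> as the argument to revel run.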
