I have a private repository on Azure DevOps. I had a difficult time understanding the setup needed to use the different packages of my code, but I think I've got it now. What I'm left with, though, is something I don't like: my import paths seem like a lot to type. Is there a way to shorten them? And did I get the configuration right?
here is my setup (Subscripify is my org name)
GOPRIVATE=dev.azure.com
I have a .gitconfig like so:
[url "git@ssh.dev.azure.com:v3/subscripify/"]
    insteadOf = https://dev.azure.com/subscripify/
I am working in a project named subscripify-prod on a repo named tenant-mgmt-ss. I have
a main package and another package within my repo named tenant.
My imports look like this:
"dev.azure.com/Subscripify/subscripify-prod/_git/tenant-mgmt-ss/tenant"
They are really long, IMO. Coming from JavaScript, all I'd have to do is import like this:
import tenant from './tenant'
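For reference, here is roughly how I understand the module setup (the module path below is my guess, taken from the import above, and tenant.New is just a hypothetical function in the tenant package):

// go.mod at the root of the tenant-mgmt-ss repo (assumed module path):
//
//     module dev.azure.com/Subscripify/subscripify-prod/_git/tenant-mgmt-ss
//
// main.go then imports the tenant package through that full module path; Go has
// no JavaScript-style relative imports inside a module.
package main

import (
    "fmt"

    "dev.azure.com/Subscripify/subscripify-prod/_git/tenant-mgmt-ss/tenant"
)

func main() {
    fmt.Println(tenant.New()) // hypothetical constructor in the tenant package
}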
Related
I want to set my Go module path to example.com/myrepo instead of github.com/myusername/myrepo, so that I am able to import it inside another repository.
for example, if my go.mod looks like this
module example.com/myrepo
go 1.13
how will I make go get example.com/myrepo work?
I am getting the following on go get example.com/myrepo
unrecognized import path "example.com/myrepo" (parse https://example.com/myrepo?go-get=1: no go-import meta tags ())
Given I am the owner of example.com how can I do this?
It is called vanity import paths.
In addition to the common hosting sites (GitHub, Bitbucket, etc.) and custom VCS URLs (.git, .hg, etc.) known to the go command, this mechanism can be used to point a custom URL at any of those services.
You must be looking for this: https://sagikazarmark.hu/blog/vanity-import-paths-in-go/
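As a rough illustration of the mechanism, here is a minimal sketch of a handler example.com could run to serve the go-import meta tag the error message says is missing (github.com/myusername/myrepo is assumed to be where the code actually lives):

// Minimal vanity-import sketch: go get requests https://example.com/myrepo?go-get=1
// and looks for a go-import meta tag of the form "import-prefix vcs repo-root".
package main

import (
    "fmt"
    "log"
    "net/http"
)

func main() {
    // Serve the meta tag for every request; the go command asks for /myrepo?go-get=1.
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        fmt.Fprint(w, `<html><head>
<meta name="go-import" content="example.com/myrepo git https://github.com/myusername/myrepo">
</head></html>`)
    })
    // In practice example.com must serve this over HTTPS (e.g. behind a TLS-terminating proxy).
    log.Fatal(http.ListenAndServe(":8080", nil))
}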
When developing a small Google Cloud Function in Go, I noticed it will throw an error if you have everything in your package main, e.g. import "<whatever>" is a program, not an importable package.
So the solution is to move it out into its own package, then deploy. If something goes wrong, throw it back into package main and work on it locally, then switch it back.
Is this the best workflow? The other option I see is possibly making the Cloud Function its own module and importing it into a main.go file.
I was able to create a cli folder at the project's top level and put a main.go file with package main and a main() function inside it. That allowed me to have a separate file, cloud_functions.go, at the root with a different package name, containing one or more Google Cloud Functions. A rough sketch of that layout is below.
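Here is a sketch of what I mean, with placeholder names (functions, HelloHTTP and example.com/myrepo are just illustrative):

// cloud_functions.go at the repo root: the deployable package (any name except main).
package functions

import (
    "fmt"
    "net/http"
)

// HelloHTTP is a placeholder HTTP-triggered function.
func HelloHTTP(w http.ResponseWriter, r *http.Request) {
    fmt.Fprintln(w, "hello from the cloud function")
}

and the local entry point:

// cli/main.go: lets me run the function locally while the deployed code stays importable.
package main

import (
    "log"
    "net/http"

    functions "example.com/myrepo" // hypothetical module path of the repo root
)

func main() {
    http.HandleFunc("/", functions.HelloHTTP)
    log.Fatal(http.ListenAndServe(":8080", nil))
}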
I'm just learning how to use vgo, and it seems like a very simple problem, but I could not find any good example explaining how to solve it.
I have project hosted in a private bitbucket repository. Let's assume the project URL is bitbucket.org/mycompany/myapp
At the root level I have the main.go, which imports from a subpackage. The import looks like this:
import "bitbucket.org/mycompany/myapp/subpackage"
Question 1. Right after adding that subpackage I do "vgo get ." because I want to fetch some other libraries, but that fails because it also tries to fetch my subpackage from Bitbucket rather than using my local version. Obviously, I have not committed my changes, so that fetch fails with a "remote: Not Found" error. Do I have to push my changes before I do "vgo get ."?
Question 2. Assuming I already have my subpackage in the repository, but I made a small change to it. Now I want to verify that it works; do I always have to push every single change before I do vgo build?
In general, is there a way to tell vgo that if an absolute import path refers to my local repository it should take the files from the filesystem, rather than pulling from the bitbucket.org?
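For reference, this is how it is handled in vgo and in today's Go modules, as far as I understand it: with a go.mod at the repo root whose module path matches the import prefix, packages inside the same repository are resolved from the local working tree, and a replace directive can point a separate module at a local directory. A sketch (mylib and ../mylib are hypothetical):

// go.mod at the root of myapp. With the module path declared like this, an import
// of bitbucket.org/mycompany/myapp/subpackage is resolved from the local working
// tree rather than fetched from Bitbucket.
module bitbucket.org/mycompany/myapp

// If a dependency lives in a separate module, replace points it at a local checkout:
replace bitbucket.org/mycompany/mylib => ../mylib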
I'm starting a new project and considering gb as my build tool, but it doesn't appear to integrate very well with VS Code...
I've referenced third-party dependencies with no problem using gb vendor fetch, but creating local packages is proving a little trickier! Am I missing something obvious?
Here's my local src directory:
src
/cmd
/model
calc.go
/server
server.go
The following code compiles and creates a bin\server.exe file successfully, but the import path isn't picked up, nor does gocode recognise it.
Here's the server code:
package main

import (
    "cmd/model" // not a happy reference...
    "fmt"
)

func main() {
    fmt.Println(model.Add(1, 2))
}
Here's the model code:
package model

func Add(a int, b int) int {
    return a + b
}
I've found what appears to be a similar issue on Github (https://github.com/joefitzgerald/go-plus/issues/325) and while nsf's solution sorts out auto-complete (post import), the import statement itself still claims to be searching in the GOROOT and GOPATHs.
Any ideas?
Thanks to an answer from lukehoban here https://github.com/Microsoft/vscode-go/issues/249 I was able to get my environment working.
I simply created a settings.json file under the .vscode directory (which will now have to be checked in) into which I've configured:
{
    "go.gopath": "${workspaceRoot}"
}
This makes me feel unclean and it still doesn't provide a way to reference both 3rd party dependencies and local packages together...
Do not try to work against Go, work with Go.
First of all, give all your packages fully qualified import paths. Go is designed around global import paths; do not try to force Go into using flat hierarchies or even relative paths.
You can point your import paths at repository endpoints either directly or by using Go's remote import path mechanism. BTW, if you happen to run a self-hosted GitLab instance, it supports remote import path meta tags out of the box.
I prefer glide, but maybe the following is possible with gb, too, and certainly something similar will be possible with the upcoming go dep: you can point to ssh+git endpoints and others using glide's repo stanza. Frankly, I have no idea whether gb supports an equivalent mechanism, but if it doesn't, that is a good reason to reconsider.
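For illustration, a repo stanza in glide.yaml looks roughly like this (the package names and the SSH URL are placeholders):

# glide.yaml (sketch): the repo/vcs keys let an import path resolve to a private
# ssh+git endpoint instead of the public host implied by the path.
package: bitbucket.org/mycompany/myapp
import:
- package: example.com/team/lib
  repo: git@git.example.com:team/lib.git
  vcs: git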
From what I understand, Go imports packages like this:
import (
    "bitbucket.org/user/project"
    "github.com/user/project"
)
Is there a way to import packages in all files, without explicitly typing out an absolute remote location, from
1) a single remote location?
2) multiple locations?
So for 1), you could specify somewhere that the host is github.com/user and any import that is not a default library and doesn't have a remote prefix is prefixed by github.com/user. Or have a prefix_variable + relative/path and be able to set the prefix_variable somewhere?
So like
// in some config file
github = "github.com/user/"
bitbucket = "bitbucket.org/user/"
// imported in file
import (
    bitbucket + "project" // "bitbucket.org/user/project"
    github + "project"    // "github.com/user/project"
)
or
// in some config file
default = "github.com/user"
// imported in file
import (
    "bitbucket.org/user/project" // this has a remote prefix, so default prefix is not added
    "project"                    // "github.com/user/project"
)
Unfortunately, to my knowledge there is no way to do this in the way you have stated. There is a somewhat related discussion in the Google Go Group, "Go Packaging: building a great packaging story", which might give you some idea of the thinking behind why this can't be done (assuming you were not already aware of it).
I actually have a related problem, producing builds for two different server environments (one for Google App Engine and one for a local Linux development environment) that share packages (imports). I am still looking for a solution, hence watching this type of discussion.