Go Get for Google's Cloud Source Repository

I'm making two different Go modules:
source.cloud.google.com/me/a
source.cloud.google.com/me/b
with source.cloud.google.com/me/common as a common library dependency (to share a model).
I'm trying to go get source.cloud.google.com/me/common (I even wrote it into the go.mod file manually), but I keep receiving the following error:
package source.cloud.google.com/me/common:
unrecognized import path "source.cloud.google.com/me/common"
(parse https://source.cloud.google.com/me/common?go-get=1: no go-import meta tags ())
I have gcloud set up to be able to use app deploy and create new source repositories. I've tried setting up SSH for Google Cloud and attempted to use manual credentials. Neither works locally nor in the Google Cloud Build service.
I want to do two things:
Be able to go get this dependency: source.cloud.google.com/me/common
Be able to integrate this go get into my App Engine automated build pipeline.
Any help would be appreciated.

Configure the repo on https://source.cloud.google.com/
Authorize manual Git access: https://source.developers.google.com/p/YOUR_PROJECT_ID/r/YOUR_REPO
In this example: https://source.developers.google.com/p/m/r/common
Name your common module source.developers.google.com/p/m/r/common.git (note the .git suffix)
Run go get source.developers.google.com/p/m/r/common.git in the other module
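As a minimal sketch, assuming the project ID m from the example above, the consuming module's go.mod would then look like this (the version is a placeholder; go get records the real tag or pseudo-version for you):

// go.mod of module "a" (the consumer); note the .git suffix on the require path
module source.cloud.google.com/me/a

go 1.14

require source.developers.google.com/p/m/r/common.git v0.0.0 // placeholder version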

I would try the following steps:
Make sure manual Git access is in place - you can try a git clone from folder "a" to check that the right credentials are used. Delete the clone after it succeeds.
Make sure that you are using HTTPS - it looks like you are good in that regard; Go 1.14 made HTTPS the default for go get.
Now, coming to the actual problem - it looks like your private version control system isn't sending the required "go-import" meta tag.
For example, fetch any GitHub-hosted Go module page and you can see the "go-import" meta tag.
In order to fix it, the VCS server needs to respond with this tag when go get tries to download the "common" module:
<meta name="go-import" content="source.cloud.google.com/me/common git https://source.cloud.google.com/me/common">
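To see exactly what go get receives, you can fetch the same URL from the error message yourself and look for the tag (a quick diagnostic, assuming curl is available):

curl -s "https://source.cloud.google.com/me/common?go-get=1" | grep go-import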

This works:
go get source.developers.google.com/p/YOUR_PROJECT_ID/r/YOUR_REPO.git

Related

How to clone multiple private repositories using GitHub Actions?

I have been searching a lot online for how to clone multiple private repositories from a GitHub Actions script. Moreover, since the repositories I wish to clone are listed in a text file within the repo itself, it complicates things a bit more for me.
I mean, GitHub explains how to do this here: https://github.com/actions/checkout#checkout-multiple-repos-private but it assumes you know in advance what you want to clone, so you can list all the repos in the YAML file. It also assumes you have just one or two repos. What if you have 100 repos? I would rather use a script for that. So how do I do that? Any idea?
Summary:
So you need a way to authenticate with GitHub when you do the cloning. Then you do the cloning from a Bash script that you call from your YAML file via GitHub Actions.
Part 1: Authentication
This link (https://dev.to/dtinth/authenticating-as-a-github-app-in-a-github-actions-workflow-27co) describes four ways to authenticate, with the pros and cons of each. Here is a summary of the methods:
Method 1: Using the built-in GITHUB_TOKEN secret
Method 2: Using your personal access token --> This is what I used with a small twist.
Method 3: Creating a bot account and using its personal access token
Method 4: Creating a GitHub App and generating tokens from it
So the solution I used is Method 2 above: I pass my own PAT (personal access token) to a Bash script I wrote that does all the cloning for me. The nice thing about this is that the PAT is stored as a secret, so it is not exposed to anyone.
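For reference, one way to store the PAT as a repository secret is the GitHub CLI (assuming gh is installed; you can equally add it under the repo's Settings > Secrets):

gh secret set PAT_SECRET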
Part 2: Here is the part of the YAML file I used in GitHub Actions to do the cloning:
- name: Run multi repo cloning script
  env:
    PA_TOKEN: ${{ secrets.PAT_SECRET }} # `PAT_SECRET` is a secret that contains your PAT (personal access token)
  run: ".github/clone_repos.sh"
  shell: bash
Moreover, GitHub Actions has a mechanism that scans the run logs for GitHub tokens and masks any it detects with "***". That is why there is very little risk of your token being exposed to someone reviewing the GitHub Actions output.
Part 3: In the Bash script itself, I simply used the following command to clone each repo I needed:
# clone subrepo
git clone "https://${PA_TOKEN}@github.com/<remote_name>/${SUBREPO_NAME}.git"
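Since the repo names live in a text file, a small loop around that command covers the 100-repos case (a sketch, assuming one repository name per line in a hypothetical repos.txt; <remote_name> stays your org or user):

# clone every subrepo listed in repos.txt
while IFS= read -r SUBREPO_NAME; do
  git clone "https://${PA_TOKEN}@github.com/<remote_name>/${SUBREPO_NAME}.git"
done < repos.txt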

Passing code from CodePipeline to PythonFunction

I'm trying to create a CDK app that will deploy a pipeline-stack and a lambda-stack, similar to the tutorial here. I'm trying to implement a basic CI/CD application that is triggered by every push to a GitHub Enterprise repo.
I chose to use PythonFunction from @aws-cdk/aws-lambda-python instead of Function from @aws-cdk/aws-lambda because PythonFunction builds the dependencies from requirements.txt. I have various Lambdas that use different packages (like awswrangler, pandas, requests, etc.).
But PythonFunction does not support CfnParametersCode (where the code is passed through CDK instead of being read from an asset).
What other option do I have to pass my code from GitHub Enterprise to the PythonFunction?
If Function from @aws-cdk/aws-lambda is the only option I have, how can I include the packages from requirements.txt?
This does seem like an option for @aws-cdk/aws-lambda, but how would I pass my code from GitHub? That example relates to building from asset code.
I apologize if I'm missing something obvious, I just started working with AWS CDK last week.
First of all, I would recommend taking a look at pipelines.CdkPipeline, which is able to deal with assets. That means you can directly use lambda.Code.from_asset instead of overriding CfnParametersCode in the pipeline.
Regarding your other question, you can deal with the requirements by installing them into your Lambda folder during the build step with: pip install -r requirements.txt -t .
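For illustration, in a CodeBuild buildspec that step could look like this (a sketch; the lambda/ folder and the synth command are assumptions about your project layout):

phases:
  build:
    commands:
      - pip install -r lambda/requirements.txt -t lambda/  # vendor deps next to the handler
      - npx cdk synth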
CfnParametersCode gives you the ability to upload your code from an S3 file.
You can do the same via lambda.Code.fromBucket.
Taking your link from the third point (https://github.com/aws/aws-cdk/tree/master/packages/%40aws-cdk/aws-lambda#bundling-asset-code): you just need to use lambda.Code.fromBucket instead of lambda.Code.fromAsset. Docs can be found here: https://github.com/aws/aws-cdk/tree/master/packages/%40aws-cdk/aws-lambda

how to fetch file from remote repository on bitbucket using go language

I have the latest version of a file on a remote Bitbucket repo.
I need to fetch that file from a Go program. Is this possible with "go get"?
How do I write the Go code to do this?
You don't even need to write any Go code for this; just run:
go get <bitbucket repo URL>
Make sure that the URL is public so go get can work. Bear in mind that to use this package, you need to import it in your main.go program.
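A minimal sketch of that import, using a hypothetical public repository bitbucket.org/someuser/somelib that is assumed to export a Version constant:

package main

import (
	"fmt"

	somelib "bitbucket.org/someuser/somelib" // hypothetical repository
)

func main() {
	// use the fetched package; Version is an assumed export
	fmt.Println(somelib.Version)
}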

Golang and gqlgen with GitHub

I see this command in the gqlgen getting started page:
go mod init github.com/[username]/gqlgen-todos
Why is Go linked to GitHub?
In this case, is it really necessary to create the repo in my GitHub account? Will gqlgen or Go automatically push the code to GitHub?
Go has no special link with GitHub.
Modules allow you to use any name you want as a package name, for example:
go mod init bestPackageEver
VCS URLs are often used because of the convenience of working with a remote server:
git remote add origin https://github.com/my/repo
go mod init github.com/my/repo
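Either way, go mod init only writes the chosen name into go.mod and pushes nothing anywhere. For the first example the generated file is simply (the go directive reflects your installed toolchain version):

module bestPackageEver

go 1.16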
gqlgen uses GitHub in its docs just because its source code is hosted on GitHub.
You can find more details in Russ Cox's article.

How does one get all the artifacts as a zip using TeamCity Rest API?

The docs show this
/repository/downloadAll/BUILD_TYPE_ID/BUILD_SPECIFICATION
for getting all of your artifacts as a zip file, but that isn't using the REST API. Is there a way to do the same thing in the REST API? The docs seem to indicate that the repository links are only there for backwards compatibility.
You can use this URL, it works for me:
http://<TeamcityUrl>/httpAuth/app/rest/builds/id:<BuildId>/artifacts/archived
I use TeamCity 9.
From the documentation: http://confluence.jetbrains.net/display/TW/REST+API+Plugin#RESTAPIPlugin-buildartifacts
Artifacts:
GET <TeamcityUrl>/httpAuth/app/rest/builds/<buildLocator>/artifacts/files/<artifact relative name>
If you download the artifacts from within a TeamCity build, consider using teamcity.auth.userId/teamcity.auth.password system properties as credentials for the download artifacts request: this way TeamCity will have a way to record that one build used artifacts of another and will display that on build's Dependencies tab.
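For example, inside a TeamCity command-line build step, that advice combines with the URL above like this (a sketch, assuming the usual %system.…% reference syntax; TeamCity substitutes the %-references before the step runs, and teamcity.serverUrl is a predefined configuration parameter):

curl -u "%system.teamcity.auth.userId%:%system.teamcity.auth.password%" "%teamcity.serverUrl%/httpAuth/app/rest/builds/id:<BuildId>/artifacts/archived" -o artifacts.zip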
Have you tried this?
I'm not sure it's documented, but it works.
http://teamcity-url/downloadArtifacts.html?buildId=216886
If you are using it from .NET you may use the following code:
List<string> downloadedFiles = new RemoteTc()
    .Connect(a => a.ToHost("tc").AsGuest())
    .DownloadArtifacts(123, @"C:\DownloadedArtifacts");
The above code uses the FluentTc library.
