AWS CodeBuild Golang Lambda using Serverless Framework - go

I'm building Golang Lambda functions using the Serverless Framework in AWS CodeBuild. The project lives in a private Bitbucket repository, which is also the CodeBuild source.
I'm having trouble using local packages in the code.
I have this project structure:
hello/
    test/
        test.go
    main.go
    serverless.yml
    ...other files
What I'm trying to do is use the package test (hello/test/test.go) from hello/main.go.
I've tried two options:
Importing the package as "myproject/hello/test".
Moving test to another project and importing it by its Bitbucket URL, using dep ensure.
The problems:
In AWS CodeBuild, the package cannot be found, because the real package directory is /....aws path/bitbucket.org/<username>/<repo>/ while $GOPATH is /...aws path/.
dep ensure freezes when fetching the private Bitbucket repo. I assume this is because of authentication issues?
So, what can I do to use my Golang packages in AWS CodeBuild, given that the repo is private rather than public?

Based on Peter's comments, I resolved the issue by checking out my Bitbucket repo at its full import path under $GOPATH/src,
so the path should be $GOPATH/src/bitbucket.org/<username>/<repo>.
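One way to get that layout in CodeBuild is to link the checkout into the expected import path during the install phase. A minimal sketch, assuming the standard CODEBUILD_SRC_DIR variable and a GOPATH from the build image; "username/repo" are placeholders for the actual Bitbucket path:

```shell
# Hedged sketch: recreate the expected GOPATH layout inside CodeBuild.
# CODEBUILD_SRC_DIR is set by CodeBuild; GOPATH comes from the build image.
# "username/repo" are placeholders for your actual Bitbucket path.
GOPATH="${GOPATH:-$HOME/go}"
CODEBUILD_SRC_DIR="${CODEBUILD_SRC_DIR:-$PWD}"
REPO_PATH="$GOPATH/src/bitbucket.org/username/repo"

mkdir -p "$(dirname "$REPO_PATH")"         # create .../src/bitbucket.org/username/
ln -sfn "$CODEBUILD_SRC_DIR" "$REPO_PATH"  # link the checkout into GOPATH
cd "$REPO_PATH"                            # build from the import path
```

With this in place, imports like bitbucket.org/username/repo/hello/test resolve against the checkout itself.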

Related

How can I get branch name in aws codebuild for bitbucket?

I am using Bitbucket as the code repository and AWS CodeBuild/CodePipeline for CI/CD.
What is the value of the CODEBUILD_SOURCE_VERSION environment variable in AWS CodeBuild for Bitbucket?
To get the branch name in the build step, use the environment variables feature.
Add an environment variable (Key: BRANCH, Value: #{SourceVariables.BranchName}) to the Build stage, and the variable can then be referenced inside the build script as $BRANCH.
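Inside the buildspec commands, the injected variable can then drive branch-specific logic. A small sketch (the branch-to-stage mapping is hypothetical, just to show the variable in use):

```shell
# Hedged sketch: BRANCH is injected by CodePipeline via
# Key: BRANCH, Value: #{SourceVariables.BranchName}.
BRANCH="${BRANCH:-unknown}"     # fallback only for running outside the pipeline
echo "Building branch: $BRANCH"

# Hypothetical branch-to-stage mapping
if [ "$BRANCH" = "master" ]; then
  STAGE="prod"
else
  STAGE="dev"
fi
echo "Deploying stage: $STAGE"
```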
In my case,
my pipeline has a source stage that copies the code from Bitbucket and stores it in S3.
My source artifact is an S3 bucket, so CODEBUILD_SOURCE_VERSION returns an S3 ARN:
Source version: arn:aws:s3:::codepipeline-<region>-<accountnumber>/<path>

How to deploy a Go app using local package

I'm trying to deploy my Go app, but the problem is that I get an error because I have modified some of the package files.
The imports from GitHub live in C:\Users\Myuser\go\pkg\mod, but when I deploy my Go app, Heroku pulls the packages from GitHub instead of using the locally modified files. Is there any way to deploy my local packages instead of having them fetched at deploy time?
You could try building the app with the -mod=vendor flag so it uses the vendored files instead of pulling the Git dependencies (run go mod vendor first to populate the vendor directory):
go build -mod=vendor

Why is go-get trying to download local code from a remote location?

I recently added a new package and directory to my Go project, and now when I try to build it, I get errors about a GitLab password prompt.
I am not importing a new remote package; I am simply adding a new directory underneath my already-declared module path. For instance, my go.mod declares gitlab.com/example/api, and the package I added is gitlab.com/example/api/postgres.
I am not actually hosting on GitLab; I just needed something to name the module while I worked on it. Clearly Go won't find it on gitlab.com, but it is available locally. Why is go get trying to download a package/path that is available locally?
Why is it only happening for this new package, and not for all of the existing packages under this path?
Golang 1.14
You have to add a replace directive to your go.mod to work with a local package. For example:
replace gitlab.com/example => /Users/abc/projects/gitlab.com/example
Ref: https://github.com/golang/go/wiki/Modules

Setting up CI/CD for an AWS CDK app using AWS CodeBuild/Deploy/Pipeline

I'm trying to set up a CI/CD pipeline for a .NET app which uses AWS Lambda and the AWS CDK for infrastructure. The source is on GitHub, and we have a Jenkins pipeline which runs the tests and publishes the artifacts. I want to take those artifacts and deploy them (or better, use CodeDeploy).
Can I use CodePipeline to run cdk deploy?
How can I use CodeDeploy to run dotnet test and dotnet publish, and then pass the artifact on to CodePipeline?
CodePipeline is a workflow service; by itself it cannot execute any commands. What you need is a build/test service like CodeBuild and/or Jenkins as part of the pipeline. That is where you will run commands like 'cdk deploy', 'dotnet test' and 'dotnet publish'.
Once the Deployment artifacts are ready in the Build environment (using the aforementioned commands), the next CodePipeline Stage can use them to deploy - this is where a Service like CodeDeploy will be used.
CodePipeline is just orchestrating the workflow between the building block services like CodeCommit (Source), CodeBuild (Build/Test) and CodeDeploy (Deploy). There are many more integrations available.
Hope this helps.
There is an example of this in the AWS CDK guide:
https://docs.aws.amazon.com/cdk/latest/guide/codepipeline_example.html
A working implementation using CodeCommit is linked below; it includes screenshots and a link to the GitHub code.
https://nikhilbhojcloud.blogspot.com/2019/08/code-pipeline-using-aws-cdk-for-lambda.html
Apart from this, the AWS CDK team is building CI/CD support for CDK applications:
https://github.com/aws/aws-cdk/tree/master/packages/%40aws-cdk/app-delivery
You should use CodeBuild, not CodeDeploy, to run dotnet test.
A CodePipeline has three stages:
Source
Build (CodeBuild)
Deploy (CodeDeploy)
For your use case, GitHub is the source, CodeBuild can be used to build and test your application, and CodeDeploy deploys the build artifacts to your environment.
In order to use CodeBuild, you must provide a build specification (buildspec) reference.
Follow the link below for more information, including how to set this up in CodeBuild:
https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html
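A minimal buildspec sketch for this use case (a hedged example, not a verified configuration; it assumes the .NET SDK and Node.js are available in the build image so the CDK CLI can be installed):

```yaml
# buildspec.yml (sketch only; commands follow the question)
version: 0.2
phases:
  install:
    commands:
      - npm install -g aws-cdk      # assumes Node.js in the build image
  build:
    commands:
      - dotnet test
      - dotnet publish -c Release -o ./publish
      - cdk deploy --require-approval never
artifacts:
  files:
    - publish/**/*
```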

AutoDeploy a Laravel app from GitHub branch to AWS EC2 or Elastic Beanstalk

I'm trying to auto-deploy a Laravel app from a GitHub branch to AWS EC2 or Elastic Beanstalk (preferred), but I haven't found the right solution; one of the tutorials I have followed is the one below. Does anyone have a solution for this?
Thank you in advance!
https://aws.amazon.com/blogs/devops/building-continuous-deployment-on-aws-with-aws-codepipeline-jenkins-and-aws-elastic-beanstalk/
You can do this with the following steps
Setup Jenkins with Github plugin
Install AWS Elastic Beanstalk CLI
Create an IAM user with Elastic Beanstalk deployment privileges and add its access keys to the AWS CLI (if Jenkins runs inside an EC2 instance, you can instead create a role with the required permissions and attach it to the instance)
In the Jenkins project, clone the branch, go to the project directory, and execute eb deploy in a shell script to deploy it to Elastic Beanstalk. (You can automate this with a build trigger when new code is pushed to the branch.)
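The shell step in point 4 might look like the following sketch. The application and environment names are placeholders, and a DRY_RUN guard is added only so the script can be exercised without the EB CLI; in Jenkins you would set DRY_RUN=0:

```shell
# Hedged sketch of the Jenkins "Execute shell" build step.
set -e
APP_NAME="my-laravel-app"    # hypothetical application name
ENV_NAME="my-laravel-env"    # hypothetical environment name
DRY_RUN="${DRY_RUN:-1}"      # set DRY_RUN=0 in Jenkins to actually deploy

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi
}

run eb init "$APP_NAME" --platform php --region us-east-1
run eb deploy "$ENV_NAME"
```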
Alternatively, there are other approaches, for example:
Blue/green deployment with Elastic Beanstalk
Deploying a Git branch to a specific environment
Using AWS CodeStar to set up the deployment from templates (internally it uses AWS CodePipeline, CodeDeploy, etc.)
An alternative to using eb deploy is to use the Jenkins AWS Beanstalk Publisher plugin https://wiki.jenkins.io/display/JENKINS/AWS+Beanstalk+Publisher+Plugin
This can be installed by going to Manage Jenkins > Manage Plugins and searching for AWS Beanstalk Publisher. The root object is the zip file of the project that you want to deploy to EB; the build steps can include a step that zips up the files in your repo.
You will still need to fill in the Source Code Management section of the Jenkins job configuration. This must contain the URL of your GitHub repo and the credentials used to access it.
Add an Execute shell build step that zips up the files from the repo that you want to deploy to EB. For example, zip -r myfiles.zip * will zip up all the files within your GitHub repo.
Use the AWS Beanstalk Publisher Plugin and specify myfiles.zip as the value of the Root Object (File / Directory).