How does Google Cloud Build work?

I am new to Google Cloud Build and want to build an application that runs on Windows Server. I am not using any containerized application for web hosting. Is it necessary for the build steps written in cloudbuild.yaml to run in containers? Most of the examples I saw in Google's docs are for dockerized builds. If not, please let me know how I can do that.

All build steps are containers that are executed as part of your pipeline, so your build itself runs in a containerized tool chain. However, you need not build containers -- you can build jars, pars, zip files, and anything else you might want. Export these artifacts yourself in a build step, or declare them in the artifacts section of your cloudbuild.yaml so Cloud Build uploads them for you.
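For example, a minimal cloudbuild.yaml sketch that packages a WAR with Maven and declares it as a build artifact to be uploaded to Cloud Storage (the bucket name and artifact path are placeholders):

    steps:
    # Each step runs in a builder container, but the output is a plain file.
    - name: 'gcr.io/cloud-builders/mvn'
      args: ['package']
    # Declare the WAR as an artifact; Cloud Build uploads it to the given
    # Cloud Storage location when the build succeeds.
    artifacts:
      objects:
        location: 'gs://my-artifact-bucket/'
        paths: ['target/my-app.war']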

Related

how to update active build in google cloud services?

The project I'm working on uses Google Cloud services with Firebase. We have some services that run on Google Cloud Run. I cloned a Golang repo and made a small modification to a struct which is a dependency for a couple of the Cloud Function triggers.
I am attempting to get this new code running in the cloud but seem to be missing something. I ran the following command:
gcloud builds submit --config ./cloudbuild.yaml .
which completed successfully. I now see that build in Google Cloud Build; however, I am unsure how to make it the active build.
Where do I set this build to be the active build?
You don't need to worry about activating it. Once you run gcloud builds submit, the build is created and is already active. So you don't have to set this build as the active build or anything like that; it is already in your platform. It works this way because a project can have many active builds at once.
In case you want to check your build details, you can access your Cloud Build page, select your project and click Open. Once there, click on a particular build to see the Build details page. To view the artifacts of your build, under Build Summary, click Build Artifacts.
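If you prefer the command line, the same details are available there too; for example (BUILD_ID being the ID printed by gcloud builds submit):

    gcloud builds list --limit=5
    gcloud builds describe BUILD_ID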
In addition to that, if you have more general questions about how Cloud Build works and how to use it, the tutorial Serverless CI/CD — Cloud Build covers all the details.

Why Use Spring Boot with Docker?

I'm quite new to Docker, and I'm wondering: when using Spring Boot, we can easily build, ship and deploy the application with the Maven or Gradle plugin, and we can easily add a load balancing feature. So what is the main reason to use Docker in this case? Is containerization really needed everywhere? Thanks for the reply!
Containers help your software run reliably when it is moved from one computing environment to another. A Docker image contains an entire runtime environment: your application and all of its dependencies, libraries, other binaries, and the configuration files needed to execute it.
It also simplifies your deployment process, reducing a whole lot of mess to just one file.
Once you are done with your code, you can simply build the image and push it to Docker Hub. All you need to do on other systems is pull the image and run a container; it takes care of all the dependencies and everything.
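As a rough sketch, containerizing a Spring Boot fat jar takes only a few lines of Dockerfile (the base image, jar path and image name below are placeholders):

    # Run the fat jar produced by the Maven/Gradle build on a plain JRE image.
    FROM eclipse-temurin:17-jre
    COPY target/app.jar /app.jar
    ENTRYPOINT ["java", "-jar", "/app.jar"]

Build and push it once, then any machine with Docker can run it:

    docker build -t myuser/my-spring-app .
    docker push myuser/my-spring-app
    docker run -p 8080:8080 myuser/my-spring-app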

Run newman on the local build instead of deploying to a test environment using TeamCity

I am looking to be able to run my postman scripts using newman during a TeamCity build.
Instead of deploying the build to a test environment, I'd like to run the postman scripts on that particular build, so it isn't deployed to an environment used by other developers which could potentially break it.
My current build chain in TeamCity is:
Build main project (contains the REST Api and all required code)
Run Postman scripts using Newman on that project
I have the collection and environment file, along with the CLI command to call it. When I try to point the environment at a local build, it does not work.
I am thinking of running an IIS Express server on the agent and then running the tests against that active port, but I have been unsuccessful so far.
Any ideas on how to approach this would be appreciated!
I have looked at How do I integrate my Postman Integration Tests with TeamCity and this uses a test environment, which is not what I am after.
I looked at https://ie.com.au/a-how-set-up-automated-api-testing and this was helpful, but I think it is still reliant on setting up a test environment.
TeamCity isn't really equipped to handle what you are trying to do. You are trying to run API tests against a build, and in order to do that, you'll need an environment: something has to run your project so you can query against it.
The only potential path you might look at is containerizing your project, in Docker or something similar, then running your image after it's built and querying against that. However, this isn't a great practice and it bloats the build time.
A good practice would be: build your project, deploy it to a separate 'test' or 'dev' environment that is OK to break, and after the deploy trigger a service to run your tests against that 'dev' environment.
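If you do try the container route sketched above, the idea on the agent would be roughly this (image, port and file names are placeholders; the environment file's base URL would point at localhost):

    # Build and start the API locally, without touching a shared environment.
    docker build -t my-api .
    docker run -d -p 8080:80 --name api-under-test my-api
    # Run the Postman collection against the local instance.
    newman run my-collection.json -e local.environment.json
    docker stop api-under-test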

how to set up an Appium UI test Maven project to work with GitLab CI to test an Android app?

I am an intern, new to automation testing. My goal here is to help my company set up CI for the client side.
Right now I have a Maven project that contains several tests using the Appium java-client lib, under the Eclipse IDE, and I can run the UI tests locally. My next step is to hook my tests into the GitLab repo (which is already there, created by the Android developers), but I am stuck. Could somebody help me out?
Please try to be specific:
1. How should I set up the .gitlab-ci.yml?
2. Can we just have the script in the YAML download Appium and Maven? Or could we just download Appium, but import all the Appium java-client jars into libs in main? If either of the above is true, how? If neither, what should I do, and how?
3. Where should I put my tests in that repo? Or do I not have to put my tests in the existing repo, and could I instead have another one and tell the YAML where to reach it? Again, how?
4. It would be helpful if you could walk me through the workflow. Like: when developers check in code, GitLab reads the YAML, then builds, then finds my test suites where (question 3), then executes them, etc.
Many thanks in advance!
Since someone is finally also interested in this question, let me share my solution.
So, if you are looking at this question, I assume you already have your test suite and can run it locally on your machine, with your app installed in a simulator or on a real device. Now you need to read more about GitLab pipelines and GitLab CI:
pipeline: https://docs.gitlab.com/ee/ci/pipelines.html
gitlab CI: https://docs.gitlab.com/ee/ci/quick_start/
And you should have noticed that one of the advantages of Appium is that you don't need to change a thing about the app you are testing: you are testing exactly the same app that is going into production. To learn more about Appium:
http://appium.io/docs/en/about-appium/intro/
Now, to run the automation test, you need your test suite, the app, and an Appium server. What we need to do is add another stage in .gitlab-ci.yml and tell it to:
take the newly compiled app
install the app in the simulator/on a real device
compile your test suite and run it.
To make things easier to understand, we start with question 4, the workflow:
When code is checked in to GitLab, the GitLab runner runs the jobs of each stage in your .gitlab-ci.yml, and when it reaches your stage, it does the automation test. Note that it is running on your server, which means you need to have Appium installed, up and running, on your server when you try to run your automation test suite. Now the question is: is your server capable of that? If you want to run the automation test on your own server, you need to install Appium on it, probably a simulator (which might require the server to be equipped with a GPU), etc.; these are the concerns of maintaining a server. The alternative is a third-party service, which is what I did. It turned out our server (when I was at that company) wasn't capable of running automation UI tests, so we turned to AWS Device Farm (ADF); there are many other service providers you could choose from, see this link for references:
https://adtmag.com/blogs/dev-watch/2017/05/device-clouds.aspx
So I basically have a Python script in my functional test stage which grabs the newly compiled app and the automation test suite, uploads them to AWS Device Farm, schedules a run, and reports the result when the run is finished.
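As a hypothetical outline of what such a script automates (the actual python_ci.py isn't shown here; ARNs, file names and the test type are placeholders, expressed with the AWS CLI just to show the flow):

    # Register an upload slot for the app; this returns a presigned S3 URL.
    aws devicefarm create-upload --project-arn "$PROJECT_ARN" \
        --name app-debug.apk --type ANDROID_APP
    # Push the file to the returned URL; repeat both steps for the test
    # package (type APPIUM_JAVA_TESTNG_TEST_PACKAGE), then schedule a run.
    curl -T app-debug.apk "$UPLOAD_URL"
    aws devicefarm schedule-run --project-arn "$PROJECT_ARN" \
        --app-arn "$APP_ARN" --device-pool-arn "$DEVICE_POOL_ARN" \
        --test type=APPIUM_JAVA_TESTNG
    # Poll until the run finishes and read off the result.
    aws devicefarm get-run --arn "$RUN_ARN"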
So, to answer question 1:
We need to create one more stage for our functional test in .gitlab-ci.yml. In my case, I have a functionalTest_project stage after the stage which compiles the Android app. Then you script the necessary commands in your stage, or, if it's too lengthy, put your script in another file (in your repo) and execute that. In my case, I put my script in python_ci.py and execute it in my stage with python python_ci.py. (You need a Docker image with these requirements here; see below.)
And to answer question 2: you don't download Appium in the job; you set up Appium on your server, or, if you use a cloud service, that service sets up Appium for you.
What I did was build and package the test suite locally with Maven and then push it to the GitLab repo; I now believe the better way would be to compile and package it in your functionalTest stage in .gitlab-ci.yml. This comes back to the first point of question 1, how to get Maven: my understanding is that it's a dependency of the runner, like Python, so both can be obtained by telling GitLab to execute your script with a Docker image that has the Python and Maven dependencies.
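Putting that together, a minimal sketch of the relevant part of .gitlab-ci.yml (the stage and script names follow the ones above; the image tag is a placeholder for whatever Docker image carries your Python and Maven dependencies):

    stages:
      - build
      - functionalTest_project

    functionalTest_project:
      stage: functionalTest_project
      # An image with Python installed; add Maven if you package the suite here.
      image: python:3.9
      script:
        - python python_ci.py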
And the answer to question 3:
Put it in the same repo, but outside the Android project (i.e. they will be under the same parent directory).
How do you tell the YAML where to reach the test suite? Remember they are on the same server, so you can use a relative path in your YAML script to tell it where to get your test suite.
Hope this helps!

Jenkins - Build, Deploy and Promote

Recently, I started learning how to use Jenkins CI, so I am a little bit of a noob at Jenkins. I am about to try to do the following:
I have set up a Maven multi-module job on Jenkins which builds, tests, and finally creates four separate WAR applications. I archive the WAR artifacts as part of this job. These WAR files will only ever be built once; they contain properties for multiple environments, and the WAR file along with each environment's server will manage the profile it runs in, e.g. dev, test, staging, prod, etc.
I have another job on jenkins which will deal with the deployment to multiple environments.
This second job, uses the copy artifact plugin, and uses a post build action to deploy to a dev environment.
The job in step 2 will hopefully be able to have multiple promotions, allowing deployment to multiple environments: test/staging/performance/production etc.
I have searched Stack Overflow and Google, and all the posts I see always use the parameterized plugin, specifying a parameter for the environment. This means there is a separate build for each environment, which I don't like.
Can anyone tell me if this is the right way to go? Or direct me to some tutorial on how to do this properly.
Looks like what you need is a matrix-project build.
P.S.
A good introduction to Jenkins could be found in Jenkins: The Definitive Guide
After playing around with the Jenkins configuration, I have this working very nicely now.
In the deployment job, I had initially missed the "Add another promotion process" button, which allows me to promote the same build to multiple environments, manually or automatically.
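For reference, the same build-once, promote-many flow can also be sketched as a modern declarative Jenkinsfile, where manual input steps stand in for the promotions (this is an alternative to the promotion setup described above, not what was used; the deploy script and environment names are made up):

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // Build the WARs once and archive them for later stages.
                    sh 'mvn clean package'
                    archiveArtifacts artifacts: '**/target/*.war'
                }
            }
            stage('Deploy to dev') {
                steps {
                    sh './deploy.sh dev' // hypothetical deploy script
                }
            }
            stage('Promote to test') {
                steps {
                    input message: 'Promote this build to test?'
                    sh './deploy.sh test'
                }
            }
            stage('Promote to prod') {
                steps {
                    input message: 'Promote this build to prod?'
                    sh './deploy.sh prod'
                }
            }
        }
    }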
