How can I connect Coveralls and CircleCI in GitHub?

I need a little (maybe a lot more than a little) direction in combining these two services to work together. I have added the COVERALLS_REPO_TOKEN environment variable in the CircleCI settings online. I also have a .coveralls.yml file that contains the following:
```yaml
service_name: circleci
repo_token: <the automatically generated token>
```
The CircleCI documentation does point this out, but it still leaves me confused. The repo is enabled on Coveralls, but what to do from there is not very clear.
Anyone know of a solution?
I have cross-posted this question to the CircleCI Discourse forum.

For a Java Gradle project that generates code coverage reports using JaCoCo, along with adding the COVERALLS_REPO_TOKEN environment variable, I had to use the Gradle Coveralls plugin.
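As a minimal sketch of the relevant build.gradle pieces (assuming the kt3k coveralls-gradle-plugin; the version number here is only illustrative):

```groovy
plugins {
    id 'java'
    id 'jacoco'
    // Uploads the JaCoCo report to Coveralls using COVERALLS_REPO_TOKEN
    id 'com.github.kt3k.coveralls' version '2.12.0'
}

jacocoTestReport {
    reports {
        // The Coveralls plugin reads the XML report, so it must be enabled
        xml.required = true
    }
}
```

The CI job then runs something like ./gradlew test jacocoTestReport coveralls, with COVERALLS_REPO_TOKEN set in the CircleCI project settings.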
In CircleCI, I see the below logs after using the Gradle plugin:
```
> Task :coveralls
service name: circleci
service job id: null
repo token: present (not shown for security)
[message:Job #39.1, url:https://coveralls.io/jobs/XXXjobId]
```
I didn't have to add the .coveralls.yml file.

Related

CircleCI does not identify my config.yml from my GitHub repository

I tried to use CircleCI to run some tests on my Spring project. I added the config.yml as required, but since some of my builds failed, I deleted the config.yml and unfollowed the Git repository in order to use a different template for the config.yml. But now CircleCI won't detect the new .circleci directory even though it is on the GitHub repository. .circleci on GitHub
This concerns an incident with CircleCI, as can be found on their website:
And on their Twitter channel:
So UI requests were temporarily disabled, but the issue was resolved a few hours later.

Using Cloud source repository in a GCP production project

I have a standalone Cloud Source Repository (not cloned from GitHub).
I am using this to automate the deployment of ETL pipelines, so I am following Google's recommended guidelines, i.e. committing the ETL pipeline as a .py file.
The Cloud Build trigger associated with the Cloud Source Repository runs the code as specified in the cloudbuild.yaml file and puts the resulting .py file in the Composer DAG bucket.
Composer will pick up this DAG and run it.
Now my question is: how do I orchestrate the CI/CD in dev and prod? I did not find any proper documentation on this, so as of now I am following a manual approach: if my code passes in dev, I commit the same to the prod repo. Is there a better way to do this?
Cloud Build triggers allow you to conditionally execute a cloudbuild.yaml file in various ways. Have you tried setting up a trigger that fires only on changes to a dev branch?
Further, you can add substitutions to your trigger and use them in the cloudbuild.yaml file to, for example, name the generated artifacts based on some aspect of the input event.
See: https://cloud.google.com/build/docs/configuring-builds/substitute-variable-values and https://cloud.google.com/build/docs/configuring-builds/use-bash-and-bindings-in-substitutions
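As a rough sketch of how this could look (the step, bucket name, and file path below are hypothetical, not taken from the question), a single cloudbuild.yaml can copy the DAG into whichever Composer bucket the trigger's substitution points at, so the dev and prod triggers differ only in their substitution values and branch filters:

```yaml
steps:
  # Copy the pipeline file into this environment's Composer DAG bucket.
  - name: gcr.io/cloud-builders/gsutil
    args: ['cp', 'pipelines/etl_pipeline.py', 'gs://${_COMPOSER_BUCKET}/dags/']
substitutions:
  _COMPOSER_BUCKET: my-dev-composer-bucket  # the prod trigger overrides this value
```

With that in place, a trigger on the dev branch supplies the dev bucket and a trigger on the main branch (or on tags) supplies the prod bucket, so both environments can share one repo and one build config.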

REST API to get SonarQube info from a Bamboo build or vice versa

Is there any SonarQube API that can provide the build number of a Bamboo build, or any Bamboo API that can give SonarQube info? That would really help.
I have tried from both sides so far, but I am surprised that the two systems are quite clueless about each other. Why is it not possible for a build that ran SonarQube as one of its jobs to have any information about that? Nor does SonarQube tell which build actually triggered that analysis.
(Not sure I understand what exactly you are looking for, perhaps this ...? )
I don't believe you can relate a specific Activity (SonarQube analysis) to a specific build (Bamboo), just project to job.
You must have SonarQube Server configured in Bamboo.
When executing your job, you can add these optional sonar.links parameters to the analysis step:
sonar.links.homepage: project home page
sonar.links.ci: continuous integration
sonar.links.scm: project source repository
sonar.links.issue: issue tracker
Maybe also specify sonar.host.url=$SONAR_HOST_URL (where SONAR_HOST_URL is the global setting in Bamboo) in the analysis step parameters.
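For example, a sketch of the extra analysis parameters (the URLs are placeholders, and ${bamboo.buildResultsUrl} is assumed to be available as a Bamboo built-in variable):

```properties
# Additional parameters for the SonarQube analysis step in Bamboo.
sonar.host.url=${bamboo.SONAR_HOST_URL}
# Link SonarQube's Project Overview sidebar back to the CI build and friends.
sonar.links.ci=${bamboo.buildResultsUrl}
sonar.links.homepage=https://example.com/myproject
sonar.links.scm=https://bitbucket.example.com/projects/MY/repos/myproject
sonar.links.issue=https://jira.example.com/browse/MYPROJ
```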
Those populate the Project Overview page sidebar.
That should provide the links from SonarQube back to the other systems of interest.
If you have properly configured Bamboo, you should see a link in Bamboo to the SonarQube project, post execution.

Deploy code from GitLab on EC2 WITHOUT a .gitlab-ci.yml file

I am using GitLab as a repository and want to push my code to EC2 whenever a commit is made on GitLab. The GitLab CI/CD documentation states that I have to add a file .gitlab-ci.yml at the root directory of my repo. This is actually a problem for me because I want the project repo to contain only code, not any configuration-related info like build and deploy steps. Also, anybody who clones the repo would learn the location where my code is pushed/deployed on EC2. Is there any workaround for this problem?
You'll need to use a .gitlab-ci.yml file to deploy your application. The file provides instructions and a pipeline "infrastructure" which, if properly configured, will build, test, and automatically deploy your code.
If you are worried about leaking credentials, you should use the built-in CI/CD variables to mask your important bits, like a $SERVERNAME or $DB_PASSWORD for instance.
Lastly, you can use the power of .gitignore so that credentials or other sensitive bits are never committed to your project in the first place.
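As a minimal sketch of what such a file could look like (EC2_HOST and SSH_PRIVATE_KEY are hypothetical masked CI/CD variables, and the user and target path are placeholders):

```yaml
deploy:
  stage: deploy
  image: alpine:latest
  before_script:
    - apk add --no-cache openssh-client rsync
    - eval $(ssh-agent -s)
    # $SSH_PRIVATE_KEY is a masked CI/CD variable, never committed to the repo.
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -
  script:
    # $EC2_HOST keeps the deploy target out of the repository as well.
    - rsync -az -e "ssh -o StrictHostKeyChecking=no" ./ "ec2-user@$EC2_HOST:/var/www/app/"
  only:
    - main
```

Because the host, key, and paths live in CI/CD variables rather than in the file, someone cloning the repo sees only a generic deploy job, not where the code ends up.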

How can I set up a trigger in Bamboo when a specific tag is pushed to my Bitbucket repository

I am trying to use Bamboo to manage my release procedure and am just wondering if this is feasible:
The developer finishes the integration test in the local environment.
The developer creates a specific tag, e.g. "UAT_1.0.0", and pushes the tag to Bitbucket.
Bamboo senses that a new tag "UAT_1.0.0" has been created on Bitbucket and starts the build process; after that it deploys the war file to the UAT server.
The tester signs off the UAT, creates a tag "REL_1.0.0", and pushes the tag to Bitbucket.
Bamboo senses the new tag "REL_1.0.0" and starts the build process. After the build finishes, it deploys the war file to the PROD server.
It looks like "Repository triggers the build when changes are committed" is the best way to implement the process, but I can't find a way to move any further. Any ideas?
Yes, you can do this (only if you can deploy custom plugins to it).
You need to build custom triggers as plugins for Bamboo:
Get/install the SDK.
Create a plugin. See here.
If you have access to Bamboo's source code, I suggest you look into the classes DependencyTriggerReason, InitialbuildTriggerReason, ScheduledTriggerReason. You need to create a class implementing TriggerReason. You should start with this tutorial if you are new to developing Bamboo plugins.
Deploy it to Bamboo.
A bit late, but... I found a solution. You need to put a "negative" regular expression in the "Exclude changesets" section of the repository configuration that matches everything except the word you want.
The regular expression is like:
```
^(?!.*test).*$
```
like that: repository configuration
With this, Bamboo will only build commits that have this word in the message.
Now, if your commit message is like "commit for test", Bamboo will build it.
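To illustrate the inverted logic with a small, hypothetical Java check (Bamboo itself is Java-based; the class name and sample messages are made up for illustration):

```java
import java.util.regex.Pattern;

public class ExcludeChangesetsDemo {
    public static void main(String[] args) {
        // Same pattern as in the "Exclude changesets" field: it matches (and
        // therefore excludes) any commit message that does NOT contain "test".
        Pattern exclude = Pattern.compile("^(?!.*test).*$");

        String[] messages = {"commit for test", "fix login bug"};
        for (String msg : messages) {
            boolean excluded = exclude.matcher(msg).matches();
            System.out.println(msg + " -> " + (excluded ? "skipped" : "built"));
        }
    }
}
```

Running it prints "commit for test -> built" and "fix login bug -> skipped", confirming that only commits mentioning the word make it past the exclusion filter.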
