We have a maven project on github for which we use Semaphore-CI.
Each time we merge a PR, the person who merges it is responsible for deploying a copy to our public package repo, incrementing the version number, and pushing the change directly to the code repo.
I was wondering if there is a way to automate this. Any ideas/suggestions are highly appreciated.
You could use Semaphore's custom commands to add the steps for pushing to this other repository.
For example:
git remote add public_package http://example.com/public.git
git push --mirror public_package
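A sketch of how the whole post-merge routine might look as Semaphore custom build commands. This assumes the Maven Versions and Build Helper plugins; the remote URL, branch name, and the [ci skip] marker (which Semaphore honours so the version-bump push does not retrigger the build) are placeholders to adapt:

```shell
# Bump the patch version in pom.xml (Versions + Build Helper plugins):
mvn -q build-helper:parse-version versions:set \
  -DnewVersion='${parsedVersion.majorVersion}.${parsedVersion.minorVersion}.${parsedVersion.nextIncrementalVersion}' \
  versions:commit
# Deploy the artifact to the public package repository:
mvn -q deploy
# Push the version bump back to the code repo without retriggering CI:
git commit -am "Bump version [ci skip]"
git push origin master
# Mirror everything to the public repository:
git remote add public_package http://example.com/public.git
git push --mirror public_package
```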
I have a few repositories that use the same submodule. Is there a way to set those repositories to run a pipeline automatically when the origin of their submodule changes?
For instance, if I push something to the parent repo, I want it to automatically notify all the repositories that use it as a submodule that the change has happened, so that their pipelines are run.
GitLab has a feature called multi-project pipelines that can trigger builds across repositories, but it is only available in the Premium and Silver tiers:
https://docs.gitlab.com/ee/ci/multi_project_pipelines.html
You could get something to work with this. Just remember that pushing new changes to your submodule does not update the repositories that use it; they stay on the same commit as before.
You can of course build this yourself if you really want to. If you have a server where you can set up a GitLab Runner, you can do something like this:
Create a new GitLab user, RoboGit, with access to the submodule and repositories that use that submodule
Install and register a GitLab Runner on the server
Create a job for the submodule
This job can then check out each dependent project, find the submodule, update it to the newest commit, and commit that change. The author of these changes would be RoboGit, and the commits would trigger the build jobs in those repositories. That gives you roughly what you described in your question.
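The job's core loop might look like the following sketch. Repo names, the bot identity, and the submodule path are all placeholders; the demo uses throwaway local repos in place of GitLab so the sequence can be exercised end to end:

```shell
set -e
base=$(mktemp -d); cd "$base"
# Commit as the bot user (the name is the placeholder from the steps above):
export GIT_AUTHOR_NAME=RoboGit GIT_AUTHOR_EMAIL=robogit@example.com
export GIT_COMMITTER_NAME=RoboGit GIT_COMMITTER_EMAIL=robogit@example.com

# Local stand-ins for the GitLab repos: the shared submodule and one dependent:
git init -q shared.git
git -C shared.git symbolic-ref HEAD refs/heads/master   # pin branch name
git -C shared.git commit -q --allow-empty -m "v1"
git init -q app
git -C app -c protocol.file.allow=always submodule --quiet add "$base/shared.git" shared
git -C app commit -q -m "add submodule"

# The submodule moves forward; its dependents are now behind:
git -C shared.git commit -q --allow-empty -m "v2"

# What the RoboGit job would run for each dependent project:
for repo in app; do
  cd "$base/$repo"
  git -C shared fetch -q origin
  git -C shared checkout -q origin/master     # move pointer to newest commit
  git add shared
  git commit -q -m "Update shared submodule to latest master"
  # git push origin master   # on GitLab, this push triggers the repo's pipeline
done
```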
Following on from this post of mine:
API Management with GIT
I have an API management instance running. I know API management has its own GIT repository.
I can successfully clone, change and push changes up to my API management GIT repository.
I am also running Octopus deploy and am trying to use this:
Git Push
and this:
Git Pull
To pull my code from my company's GIT repository and push it to the APIM GIT repository.
The thing is, these two plugins fail immediately because they cannot find file paths on the Octopus server. They were also written in 2014.
Is there a recommended, better way to pull from your company's repo and push to the APIM repository? Also, if I am pulling to Octopus, where does the code get stored before it is pushed to APIM?
In the end, I think this plugin is out of date. I ended up writing my own PowerShell Git script, and it works a treat.
I get the APIM JSON code from my company's source control, then push it to the APIM GIT repository and publish it using PowerShell.
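For reference, the underlying sequence is just plain Git commands (the answer used PowerShell; this is a hedged sketch with placeholder URLs, and the final publish step still has to go through the APIM management API or portal):

```shell
# Clone the company repo that holds the APIM JSON configuration:
git clone https://example.com/company/apim-config.git work
cd work
# Push the same content to the APIM GIT repository:
git remote add apim https://myapim.scm.azure-api.net/
git push apim master
```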
For anyone who has this issue in the future:
The cause is most likely that you are trying to run the GitPull step from the Octopus server, while the code behind the step references the parameter $OctopusParameters['Octopus.Tentacle.Agent.ApplicationDirectoryPath'].
This parameter seems to return an empty value. I have not tried running it from a Deployment Target.
The git clone directory could instead be specified as a separate parameter/variable.
I am raising this with the Octopus team.
In Jenkins, we want to implement a hook on push (merge) to the Git master branch (on GitHub). The goal of this hook is to update the pom.xml.
The problem we envision is that this would create a cyclic process: updating the pom.xml is itself a new push (merge), which triggers the hook, which updates the pom.xml again, and so on.
Is there a solution out there that solves this issue?
The solution (commit to a separate tag rather than to master) is described here.
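The tag-based idea can be sketched as follows. The version number, user identity, and repo paths are placeholders, and throwaway local repos stand in for GitHub so the flow can be shown end to end:

```shell
set -e
# Throwaway "remote" and working repo:
bare=$(mktemp -d); git init -q --bare "$bare"
repo=$(mktemp -d); cd "$repo"; git init -q .
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
git config user.name jenkins; git config user.email jenkins@example.com
git remote add origin "$bare"
git commit -q --allow-empty -m "initial"
git push -q origin master

# The hook's version bump: commit it locally, but publish it only as a tag,
# so nothing new lands on master and the push hook is not retriggered:
git commit -q --allow-empty -m "Bump pom.xml to 1.2.3"
git tag v1.2.3
git push -q origin v1.2.3
```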
My workflow encompasses the following steps:
Git push (to BitBucket or GitHub depending on the project).
BitBucket/GitHub is integrated with CodeShip, tests are run.
If tests are ok, CodeShip automatically deploys to Heroku.
Everything works fine: pushing to the remote repo triggers the deployment tasks, and the new version goes live when everything passes.
My question is:
Sometimes, I simply do a git push heroku master which defeats the whole purpose of this workflow.
How can I prevent it from happening? Is there a way to make Heroku only accept the deploy when the source is CodeShip?
After looking around for quite some time, I noticed that there are some ways to accomplish this, all of them amounting to simply not giving the developer access to the Heroku account:
If you're a single developer ("one-man / one-woman show"):
Do not add the Heroku Remote to your Git Repository. If it is already added, remove it. That way you're not going to push to it by mistake.
If you're managing a team:
Do not give the team credentials for the Heroku Toolbelt. That way, the only remote repo they have access to is GitHub/BitBucket/whatever you use.
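For the single-developer case, removing an already-added remote is one command; sketched here against a throwaway repo with a placeholder app URL:

```shell
set -e
repo=$(mktemp -d); cd "$repo"; git init -q .
git remote add heroku https://git.heroku.com/example-app.git   # placeholder URL
git remote remove heroku
git remote    # prints nothing: there is no remote left to push to by mistake
```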
You could just create another branch called dev, push your changes to it, and merge them into master when you are ready to deploy to Heroku.
I just came across your issue, and this is what I did as the quickest resolution.
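The dev-branch flow above, as plain git commands. Branch names and identities are placeholders, and a throwaway local repo stands in for GitHub/BitBucket so the sequence is runnable:

```shell
set -e
# Throwaway "remote" standing in for GitHub/BitBucket:
origin=$(mktemp -d); git init -q --bare "$origin"
work=$(mktemp -d); cd "$work"; git init -q .
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
git config user.name Dev; git config user.email dev@example.com
git remote add origin "$origin"
git commit -q --allow-empty -m "initial"
git push -q origin master

git checkout -q -b dev
git commit -q --allow-empty -m "feature work"
git push -q origin dev        # CI runs the tests here; nothing deploys

git checkout -q master
git merge -q dev              # fast-forward master when ready to release
git push -q origin master     # only this push triggers the Heroku deploy
```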
What is the best practice for automatically generating Ruby documentation as part of a Git commit workflow? We'd like to generate new documentation automatically whenever a commit is merged into master.
Use Git hooks.
If you want it to run on the client side, use a post-commit hook. Essentially, registering a hook (by storing it in .git/hooks) makes Git call an arbitrary script you provide after each successful git commit. Your script can then call RDoc or YARD to generate docs into some output directory (outside your source code repository, of course).
Alternatively, you can have it run on the server hosting your Git repo, using a post-receive hook. This will run after you push to the server's repo. For example, you could use this to automatically upload the new docs to /docs/dev/ on your project's web server.
For detailed instructions, see the chapter on hooks in the Git manual.
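A minimal client-side version of the post-commit approach. The doc command and output path are assumptions (swap in your RDoc or YARD invocation), and the demo installs the hook in a throwaway repo so it can be seen firing:

```shell
set -e
repo=$(mktemp -d); cd "$repo"; git init -q .
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
git config user.name You; git config user.email you@example.com

# Install the post-commit hook. A real project would run the commented-out
# doc command; here the hook just records that it ran:
cat > .git/hooks/post-commit <<'EOF'
#!/bin/sh
branch=$(git rev-parse --abbrev-ref HEAD)
[ "$branch" = "master" ] || exit 0          # only document master
# rdoc --output ../project-docs lib/        # real doc generation goes here
echo "docs regenerated for $(git rev-parse --short HEAD)" > .git/docs-run
EOF
chmod +x .git/hooks/post-commit

git commit -q --allow-empty -m "some change"
cat .git/docs-run
```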