What is the best way to pull a playbook down from bitbucket and execute it using ansible?
I wrote a playbook and have it checked into Bitbucket. I have Docker spinning up an Ansible image, and I need it to pull down the playbook and run a command against it. Does anyone have experience with this? Any help would be appreciated.
You can use ansible-pull (http://docs.ansible.com/ansible/playbooks_intro.html) or just do a regular git checkout and then run the playbook. Both work well with a public repo, but a private repo will require git credentials, so you will want some form of secrets management for storing a token or SSH key to authenticate with.
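For the ansible-pull route, a minimal sketch that could be baked into the Docker image as a small helper script (the repo URL and playbook name below are placeholders, not real values):

```shell
# Write a helper script that pulls the playbook repo and runs it locally.
cat > /tmp/run-playbook.sh <<'EOF'
#!/bin/sh
# ansible-pull clones (or updates) the repo, then runs the named
# playbook against localhost in a single step.
exec ansible-pull \
  -U https://bitbucket.org/your-user/your-playbooks.git \
  -i localhost, \
  main.yml
EOF
chmod +x /tmp/run-playbook.sh
```

For a private repo, the same script would additionally need a token or SSH key injected into the container before the pull.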
Alternatively, at build time you can copy the files you need into the Docker image from your dev machine or CI tool and run the playbook there, instead of pulling at runtime. This also shortens the container's startup time.
You can simply do:
# clone repo
git clone https://bitbucket.org/user/repo.git && cd repo
docker run --rm=true -v `pwd`:/repo:rw ansible/ubuntu14.04-ansible /bin/bash -c "ansible-playbook -i /repo/hosts /repo/main.yml"
PS:
I use it this way with Travis CI: https://github.com/weldpua2008/ansible-pycharm/blob/master/.travis.yml
Related
Running a GitLab CI pipeline, I'm trying to deploy the repository's code to an EC2 instance.
I generated my SSH keys for GitLab on my PC to clone and push my code. Then I also moved the public and private keys to the EC2 instance, just to allow running the git clone "git.repo.git" on the EC2 instance.
I think this is where the problem lies, but I can't find a solution. I get this error:
Cloning into 'repo-name'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
This is my .gitlab-ci.yml file (really simple, just for testing):
stages:
  - deploy

deploy-job:
  stage: deploy
  script:
    - cat $SSH_KEY > cred.pem && chmod 400 cred.pem
    - ssh -o StrictHostKeyChecking=no -i cred.pem ubuntu@id-amazon.com git clone git@gitlab.com:repo/test.git
Is there a way to pass these credentials correctly?
You are using your keys to connect to id-amazon.com, user ubuntu.
But they would not be used by the git clone executed on that server, which would explain why the clone fails.
Although it fails first because the gitlab.com fingerprint is not found in ~ubuntu/.ssh/known_hosts.
See "Using SSH keys with GitLab CI/CD" and its example SSH project.
Double-check the result of ssh-keyscan gitlab.com with GitLab SSH known_hosts entries.
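Concretely, before the clone, the job needs gitlab.com in known_hosts. A hedged sketch of that setup (a placeholder key entry stands in for the real ssh-keyscan output, since the scan itself needs network access):

```shell
# Prepare ~/.ssh so host key verification can succeed in the CI job.
SSH_DIR="${SSH_DIR:-$HOME/.ssh}"
mkdir -p "$SSH_DIR"
chmod 700 "$SSH_DIR"

# In the real job you would run:
#   ssh-keyscan gitlab.com >> "$SSH_DIR/known_hosts"
# and compare the result against GitLab's published fingerprints.
# The line below is a placeholder entry showing the file format only.
echo "gitlab.com ssh-ed25519 AAAAC3...placeholder" >> "$SSH_DIR/known_hosts"
chmod 644 "$SSH_DIR/known_hosts"
```

With the entry in place, the `-o StrictHostKeyChecking=no` workaround is no longer needed.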
As explained by the OP Alex Sander in the comments:
I think it was a mix of permission problems: if the folder holding the .ssh directory for the AWS key has "too many permissions", it causes problems (I read this is because the SSH key must not be world-readable). Inside the ubuntu home folder I created another folder with chmod 777, into which I cloned the repository.
And for the .ssh files themselves I used the permissions from the AWS docs.
To solve these problems I changed the commands run in the GitLab job in a somewhat odd way, but I think it was just this permissions problem.
I have a job that SSHes into a server, and after that I need to git clone a repository onto that same server. How can I do that?
I set the VCS checkout mode to "on agent", and set a custom path.
I think there are two ways to achieve it:
Use SSH Exec runner to execute git clone on remote machine
Use SSH Upload to upload previously cloned repository to remote machine
The first one is faster, but you need to take care of git authentication on the remote machine.
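A sketch of the first option, composing the command the SSH Exec runner would execute (host, user, and repo path are all placeholders; the remote machine needs its own git credentials, e.g. a deploy key, for the clone itself to succeed):

```shell
# Compose the clone command to be executed on the remote machine.
# All values here are placeholders.
REMOTE_USER_HOST="deploy@build-target.example.com"
CLONE_CMD="git clone git@gitlab.com:group/project.git /opt/app"

# The SSH Exec build step effectively performs:
#   ssh -i /path/to/key "$REMOTE_USER_HOST" "$CLONE_CMD"
printf '%s\n' "ssh $REMOTE_USER_HOST $CLONE_CMD"
```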
I am trying to clone an AWS CodeCommit repository via HTTPS within a Python script. Doing this in a terminal, I simply use git clone and then get prompted to enter the git credentials I generated; however, this is not possible within a script. What I am looking for is a one-line command like git clone https://{USERNAME}:{PASSWORD}@{HTTPS_URL}, like it is possible for Azure and GitHub. I have searched for documentation on this and tried many credential combinations, but none works.
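Assuming the AWS CLI is available, AWS's documented route is the codecommit credential helper rather than embedding credentials in the URL; a sketch (region and repo name in the comment are placeholders):

```shell
# Configure git to fetch short-lived CodeCommit credentials from the
# AWS CLI instead of prompting interactively.
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# After this, a plain HTTPS clone needs no prompt:
#   git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo
```

This works inside scripts because the helper is invoked by git itself, using the AWS credentials already configured for the CLI.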
I am using Jenkins for our builds.
So I am already using the GIT Plugin
This plugin lets me specify Jenkins credentials, for which we have already set up and installed SSH keys.
However at the end of the build, I'd like to git tag my repo. I am calling the git.exe command line, and I get this error on a push:
Permission denied (publickey).
fatal: Could not read from remote repository.
Ideally, we don't want to use another plugin (e.g. Git Publisher), as we are trying to do more of this via our own scripts, since there is a good possibility that we may not use Jenkins in the future.
Also, ideally, we don't want to install items on our build server if we don't have to.
So the question is - how can I specify ssh keys/credentials on the command line for the given 'session'?
Thank you.
Put this in a shell script:
ssh -i path-to-your-private-key "$@"
Set the path to the shell script in GIT_SSH for Jenkins. git pull will then use that instead of plain ssh to access the remote repository.
Alternatively, you could configure ssh in $HOME/.ssh for the account under which Jenkins runs, but that can get tricky if Jenkins runs as a Windows service.
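Putting that together, a sketch of the whole tagging step (the key path and tag name are placeholders; git invokes the wrapper in place of plain ssh, passing the host and remote command as arguments, which is why the wrapper must forward "$@"):

```shell
# Create the GIT_SSH wrapper script.
WRAPPER=/tmp/git-ssh-wrapper.sh
cat > "$WRAPPER" <<'EOF'
#!/bin/sh
# Force a specific key; IdentitiesOnly stops ssh trying other keys.
exec ssh -i /var/lib/jenkins/keys/deploy_key -o IdentitiesOnly=yes "$@"
EOF
chmod +x "$WRAPPER"

# Tag and push using that key for just this session:
#   git tag "build-$BUILD_NUMBER"
#   GIT_SSH="$WRAPPER" git push origin "build-$BUILD_NUMBER"
```

Because GIT_SSH is set only for the push command, nothing is installed server-wide and the rest of the build is unaffected.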
I've looked around trying to figure out a way to get a Packer build to download a private repository for an EC2 AMI build. The AMI will be used to spin up new instances under an auto-launch configuration, with each newly created EC2 instance grabbing the private repo.
It seems the most secure way to grab a private repo is to use a deploy key, but I believe I would have to manually add each deploy key to the repo for each instance… which defeats the purpose of automation, unless I'm doing something wrong.
I'm wondering how to clone a private repository through Packer, be it through a shell script or otherwise. I understand I can use Chef, but I don't think I should have to install another dependency when the only thing I'm trying to do is clone a GitHub repository.
Do I have to write a send/expect type of script that uses the HTTPS GitHub clone URL?
Any and all help appreciated.
There's a "workaround" using ssh-agent. I say workaround because it's not particularly elegant; it would be better to have this as part of a Puppet module (maybe one already exists).
The idea is that you need to generate a public/private key pair for each of your private GitHub repositories. Then you add the public key as a Deploy Key in the GitHub project settings (Settings/Deploy Keys). Where you store the key pair is up to you.
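Generating such a key pair is a single ssh-keygen call; a sketch (the key comment is a placeholder, and -N '' means no passphrase, which unattended use requires):

```shell
# Generate a dedicated deploy key pair for one private repository.
# mktemp -u gives a unique path without pre-creating the file,
# so ssh-keygen does not prompt about overwriting.
KEY_FILE="$(mktemp -u)"
ssh-keygen -q -t ed25519 -N '' -C 'deploy-key-myrepo' -f "$KEY_FILE"

# Upload ${KEY_FILE}.pub under Settings/Deploy Keys in the GitHub repo;
# the private half goes into your secrets store and later onto the box.
```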
Now in Packer, you could use a Shell provisioner and execute something along these lines:
#!/usr/bin/env bash
ssh-keyscan github.com >> /home/ec2-user/.ssh/known_hosts
eval `ssh-agent`
ssh-agent bash -c \
'ssh-add /home/ec2-user/.ssh/[privateKey]; git clone git@github.com:[account]/project.git'
The advantage with this approach is that you can clone multiple private repositories easily.
There are a few ways to upload your key pair onto the EC2 box, either by using a file provisioner, Chef, or Puppet.
I've been using this for Windows, but in theory it should be similar elsewhere:
{
  "type": "powershell",
  "inline": [
    "Set-Location C:\\{{ user `LAB_ENVIRONMENT` }}; git clone https://{{ user `GITLAB_USERNAME` }}:{{ user `GITLAB_ACCESS_TOKEN` }}@gitlab.com/{{ user `REPOSITORY` }}"
  ]
},