I'm new to Jenkins and I have been searching around but I couldn't find what I was looking for.
I'd like to know how to run a docker command in Jenkins (Build - Execute Shell):
Example: docker run hello-world
I have set Docker Installation to "Install latest from docker.io" in Jenkins' Configure System and have also installed several Docker plugins. However, it still doesn't work.
Can anyone point out what else I should check or set?
John
One of the following plugins should work fine:
CloudBees Docker Custom Build Environment Plugin
CloudBees Docker Pipeline Plugin
I normally run my builds on slave nodes that have docker pre-installed.
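For reference, once Docker is available on the agent, the Execute Shell step itself stays trivial. A minimal sketch, assuming the Docker CLI and daemon are installed on that node and the jenkins user is in the docker group (both are assumptions about your agents, not something the plugins configure for you):
docker version                # fails fast if the daemon is unreachable
docker run --rm hello-world   # the example from the question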
I came across another, more generic solution. Since I'm no expert at turning this into a Jenkins plugin, here are the manual steps:
Create (or update) your Jenkins container (I use Portainer) with the environment variable DOCKER_HOST=tcp://192.168.1.50 (when using the unix socket protocol you also have to mount your docker socket) and append :/var/jenkins_home/bin to the existing PATH variable
On your docker host, copy the docker binary into the Jenkins container: "docker cp /usr/bin/docker jenkins:/var/jenkins_home/bin/"
Restart the Jenkins container
Now you can use the docker command from any script or the command line. Because the binary lives on the mounted volume, the change persists across image updates.
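Put together, the host-side part looks roughly like this (a sketch assuming the container is named jenkins, /var/jenkins_home sits on a volume, and DOCKER_HOST/PATH were already set in step 1):
# on the Docker host: copy the docker client into the volume-backed bin directory
docker cp /usr/bin/docker jenkins:/var/jenkins_home/bin/
docker restart jenkins
# sanity check: the copied client should reach the daemon named by DOCKER_HOST
docker exec jenkins /var/jenkins_home/bin/docker version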
Related
I want to run Apache Airflow using Docker on Windows to experiment with running tasks.
I'm new to Docker and have so far (1) initiated an Airflow instance that is running on Docker and (2) have created a DAG script locally.
What I want to do is move this DAG script into a "dags" folder located somewhere in the container - I read that the directory is specified in airflow.cfg, but I haven't been able to read that file with either nano or vim.
(1) Do I have to create the 'dags' folder myself? (2) How would one copy over local scripts to the dags folder?
Found my answer. Copying scripts "directly" into a Docker container is considered "hacky": it may not be safe and can cause versioning issues.
Sharing a directory as a "volume" in Docker is the way to go:
docker run -d -p 8080:8080 -v /path/to/dags/on/your/local/machine/:/usr/local/airflow/dags puckel/docker-airflow webserver
Source: Getting Started with Airflow using Docker
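To confirm the mount worked, you can list the dags directory inside the running container (the container name below is a placeholder for whatever docker ps reports):
docker ps                                          # find the container name or ID
docker exec <container> ls /usr/local/airflow/dags # the local DAG script should appear here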
My goal is to run a docker container that will simulate a physical device that sends requests to an external API.
The simulator lives in a GitHub repo and after cloning it is run by bash scripts that:
Set up a virtualenv.
Run maven build.
Execute other commands via Python, which runs a docker container.
The scripts can also remove the docker container.
How can I simplify these dependencies and make sure everything that is otherwise run through shell scripts is packed into a single working container?
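Not a full answer, but a hedged sketch of the usual shape, assuming the repository gains a Dockerfile (not shown here) that bakes the virtualenv and Maven build into an image, and that the Python code talks to Docker through the host's socket; all names are placeholders:
docker build -t device-simulator .
# mounting the host's Docker socket lets the code inside the container
# start and remove other containers (the last two steps above)
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock device-simulator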
I'm trying to implement docker with jenkins and am not sure if I am on the right track.
Given:
Running Jenkins in Docker on Windows
Plan to fetch code from GitHub, build the solution, run functional tests, etc. in a container somehow
What I've currently done:
(1) Installed Docker on Windows
(2) Successfully launched Jenkins on Docker with the command
"docker run –name myJenkins -p 8080:8080 -p 50000:50000 -v ~/Jenkins:/var/jenkins_home/ jenkins/jenkins:lts"
I believe this step binds the docker volume to my host machine's directory. This allows me to view and access the Jenkins content.
(3) In my host machine's Jenkins directory, I've created a plugins.txt (containing a variety of Jenkins plugins I want installed) and a Dockerfile. The Dockerfile installs the plugins specified in plugins.txt.
FROM jenkins/jenkins:lts
COPY plugins.txt /usr/share/jenkins/ref/plugins.txt
RUN /usr/local/bin/install-plugins.sh < /usr/share/jenkins/ref/plugins.txt
(4) In the windows command prompt, I built the Dockerfile with the command "docker build -t new_jenkins_image ."
(5) I stop my current container "myJenkins" and create a new container with the command "docker run --name myJenkins2 -p 8080:8080 -p 50000:50000 -v ~/Jenkins:/var/jenkins_home/ new_jenkins_image". This loads up Jenkins with the newly installed Jenkins plugins.
What I'm stuck/confused on
(1) Do I have to create a new container with a new name every time I want to install new jenkins plugins through the Dockerfile? This seems like a manual process as well... There has to be a better way.
(2) I started a basic Jenkins pipeline job with the "Pipeline script from SCM" option. I entered the correct repository URL and credentials but left the "Script Path" blank for now (I do not have a Jenkinsfile yet). When I execute the build, Jenkins does not fetch the code from GitHub.
java.lang.IllegalArgumentException: Empty path not permitted.
at org.eclipse.jgit.treewalk.filter.PathFilter.create(PathFilter.java:80)
at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:205)
at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:249)
at org.eclipse.jgit.treewalk.TreeWalk.forPath(TreeWalk.java:281)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:171)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:165)
at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:193)
at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:72)
at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:189)
at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:165)
at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:110)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:67)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:293)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Finished: FAILURE
I believe it's because the docker container does not have Git installed? The container cannot access the Git or MSBuild installations on my host machine... Do I have to create a new container here simply to fetch the code?
Can someone explain to me what I'm missing or where I went wrong?
From my understanding, the process goes like this: Create new pipeline job -> select pipeline script from scm -> enter repo URL, credentials, branch to build and Jenkinsfile -> Jenkinsfile will execute instructions to compile, test, and deploy.
Where does the Dockerfile come into play here? Is my thought process on the right track?
You need to create a new container every time you change or update the image, but you are not required to give it a new name each time. Did you stop and remove the previously running container? If not, Docker gives an error saying a container with the same name cannot be started. So stop and remove your previous container, and you will be able to start a new container from the updated image.
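In command form (container and image names taken from your step 5):
docker stop myJenkins2
docker rm myJenkins2
# reuse the same name with the rebuilt image; the bind-mounted volume keeps your Jenkins data
docker run --name myJenkins2 -p 8080:8080 -p 50000:50000 -v ~/Jenkins:/var/jenkins_home/ new_jenkins_image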
Yes, you need Git installed in the same container to pull the code; it cannot use the Git installation on the host machine. But the error you are showing looks like a validation error: Jenkins validates the input even before trying to pull the code. If you enter some fake name in "Script Path", it will move on to the next error, such as Git not being found.
Your thought process is on the right track: create a new pipeline job -> select Pipeline script from SCM -> enter the repo URL, credentials, branch to build and the Jenkinsfile path -> the Jenkinsfile will execute instructions to compile, test, and deploy.
At the end of the question you mention a different Dockerfile; I assume you are talking about the Dockerfile in your repository (Git). You can run your pipeline in a Docker agent. This removes the need to set everything up on the Jenkins host, meaning you don't have to install your pipeline's dependencies there. For example, if you are trying to execute some Node.js code in the pipeline, you would normally have to set up Node.js on the Jenkins host before running the pipeline; to get rid of this you can run the pipeline in a container where everything is pre-installed. But I don't think you can use this feature if you are running Jenkins itself in Docker; in that case you need to set up Jenkins directly on the host.
I'm trying to use the Docker plugin in Jenkins to commit a docker container when the build running on it fails. Currently I have a Jenkins server with ~15 nodes, each with its own docker cloud. The nodes all have the latest version of docker-ce installed. I have a build set up to run on a docker container. What I want to do is commit the container when the build fails. Below are the things I have tried:
Adding a post build task, where I obtain the container ID and the hostname of the node running the container. I then SSH into the node and then commit the container.
The problem: I am not able to SSH from inside a container because it requires a password, and there is no way to add the node to the container's list of known hosts.
Checking the "commit container" box in the build's general configurations
The problem: This is probably working but I don't know where the container is being committed to. Also this happens every time, and not just when the build fails.
Using the build script
Same problem as using the post build task
Execute a docker command (Build step)
This option asks for the container ID, which I have no way of knowing as it is new every time a build is run.
Please let me know if I have misunderstood any of the above ways! I am still new to Jenkins and Docker so I am learning as I go. :)
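One detail that may help with the container-ID problem: by default Docker uses the short container ID as the container's hostname, so a build step running inside the container can record its own ID for a later step that runs where the Docker CLI is available. A sketch, assuming the default hostname is not overridden and using a placeholder image name:
# inside the build (in the container): record this container's ID
hostname > container_id.txt
# on the node (where the Docker CLI talks to the local daemon), e.g. in a post-build step
docker commit "$(cat container_id.txt)" failed-build:${BUILD_NUMBER}
Note that docker commit only creates an image in that node's local image store (visible with docker images), which may also be why it is hard to tell where the "commit container" checkbox puts things.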
We would like to leverage the excellent catalogue of Dockerfiles on Docker Hub, but the team is not in a position to use Docker.
Is there any way to run a Dockerfile as if it were a shell script against a machine?
For example, if I chose to run the Docker container ruby:2.4.1-jessie against a server running only Debian Jessie, I'd expect it to ignore the FROM directive but be able to set the environment from ENV and run the RUN commands from this Dockerfile: GitHub docker-library/ruby:2.4.1-jessie
A Dockerfile assumes it is executed in an empty container or on top of the image it builds on (specified with FROM). Knowledge of that environment (specifically the file system and all the installed software) is important, and running something similar outside of Docker can have side effects because files end up in places where none are expected.
I wouldn't recommend it.