"Move" Python script into Airflow running on Docker (Windows) - windows

I want to run Apache Airflow using Docker on Windows to experiment with running tasks.
I'm new to Docker and have so far (1) started an Airflow instance running on Docker and (2) created a DAG script locally.
What I want to do is move this DAG script into a "dags" folder located somewhere in the container - I read that the directory is specified in airflow.cfg, but I haven't been able to open that file with either nano or vim.
(1) Do I have to create the 'dags' folder myself? (2) How would one copy over local scripts to the dags folder?

Found my answer. Copying scripts "directly" into a Docker container is considered "hacky", may not be safe, and can cause versioning issues.
Sharing a directory as a "volume" in Docker is the way to go:
docker run -d -p 8080:8080 -v /path/to/dags/on/your/local/machine/:/usr/local/airflow/dags puckel/docker-airflow webserver
Source: Getting Started with Airflow using Docker
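Since the question is about Windows: with Docker Desktop, the host side of -v usually needs the drive-letter form of the path (the path below is illustrative, not from the original answer):
docker run -d -p 8080:8080 -v C:/Users/you/airflow/dags:/usr/local/airflow/dags puckel/docker-airflow webserver
With the older Docker Toolbox, the same host path is written as //c/Users/you/airflow/dags.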

Related

Modify local file with Airflow running on Windows (with Docker Compose)

I have Airflow running on Windows using Docker Compose.
I created a DAG which appends data to a SQLite database.
Currently the DAG works with the database file being inside the Docker container but I'd like it to be on my local machine. How can I do this?
You have to declare a volume in your docker-compose.yml that maps the file (or the folder containing it) you want to share between the container and your computer.
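A minimal sketch, assuming the DAG writes its SQLite file somewhere under /opt/airflow/data inside the container; the service name and both paths are illustrative, so adjust them to match your docker-compose.yml:
services:
  airflow-scheduler:
    volumes:
      - ./data:/opt/airflow/data
After docker-compose up -d, whatever the DAG writes under /opt/airflow/data appears in the data folder next to your docker-compose.yml, so the DAG only needs to point its SQLite path at that directory.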

Can't copy airflow local DAG to docker container's dags folder (Windows)

I was trying to set up Airflow with Docker on Windows, following a good guide here: https://stackoverflow.com/a/59368063/2085454, but got stuck at the step that copies a local DAG into the container: docker cp sample_dag.py containerName:/usr/local/airflow/dags
The command I ran was docker cp Desktop\test_dag.py naughty_bolbs:/usr/local/airflow/dags ("naughty_bolbs" is the container's name). The terminal shows nothing when running it, and I can't find any copied DAG in the container.
This .py file worked fine when I ran it locally on a Mac, so the file itself should not be the problem.
Do you know how can I copy local DAG to the container's dags folder successfully on Windows?
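One thing worth trying (the host path below is illustrative, and naughty_bolbs is the container name from the question): use an absolute host path and a trailing slash on the destination, then list the folder to confirm the copy landed:
docker cp C:\Users\YOUR_USER\Desktop\test_dag.py naughty_bolbs:/usr/local/airflow/dags/
docker exec naughty_bolbs ls /usr/local/airflow/dags
Note that docker cp prints nothing on success, so silence alone does not mean it failed.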

Can I use a DockerFile as a script?

We would like to leverage the excellent catalogue of DockerFiles on DockerHub, but the team is not in a position to use Docker.
Is there any way to run a DockerFile as if it were a shell script against a machine?
For example, if I chose to run the Docker container ruby:2.4.1-jessie against a server running only Debian Jessie, I'd expect it to ignore the FROM directive but be able to set the environment from ENV and run the RUN commands from this Dockerfile: Github docker-library/ruby:2.4.1-jessie
A Dockerfile assumes it is executed in an empty container, or on top of the image it builds on (via FROM). That knowledge of the environment (specifically the file system and all the installed software) is important, and running something similar outside of Docker can have side effects, because files end up in places where no files are expected.
I wouldn't recommend it.
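If you still want to experiment, a naive sketch is to pull the single-line RUN directives out of the Dockerfile into a shell script; this is purely illustrative and ignores ENV, ARG, COPY, WORKDIR and multi-line continuations, which is exactly where it tends to break:
grep '^RUN ' Dockerfile | sed 's/^RUN //' > steps.sh
chmod +x steps.sh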

What is a simple workflow to use docker in Windows with a basic file sharing possibility?

For the sake of simplicity, use ubuntu image as an example.
I often find it easier to use docker-compose, particularly if there's a high chance I'll want to both mount-volumes and link the container to another container at some point in the future.
Create a folder for working in, say "ubuntu".
In the "ubuntu" folder, create another folder called "files"
Create a file in that folder called "docker-compose.yml". In this file, enter:
ubuntucontainer:
  image: "ubuntu:latest"
  ports:
    - "80:80"
  volumes:
    - ./files:/files
Whenever you need to start the box, navigate to "ubuntu" and type docker-compose up. To stop again, use docker-compose stop.
The advantage of using docker-compose is that if you ever want to link up a database container, this can be done easily by adding another container to the yaml file and then adding a links section to the ubuntucontainer container (see the sketch below).
Not to mention, docker-compose up is quite minimal on the typing.
(Also, forwarding the ports with 80:80 may not be strictly necessary, it depends on what you want the box to do.)
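For illustration only (the db service name, image and password below are assumptions, not part of the original answer), linking a database could look like:
ubuntucontainer:
  image: "ubuntu:latest"
  ports:
    - "80:80"
  volumes:
    - ./files:/files
  links:
    - db
db:
  image: "postgres:latest"
  environment:
    POSTGRES_PASSWORD: "example"
The ubuntucontainer service can then reach the database at the hostname db.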
TL;DR version:
Open Docker Quickstart Terminal. If it is already open, run $ cd ~
Run this once: $ docker run -it -v /$(pwd)/ubuntu:/windows --name ubu ubuntu
To start every time: $ docker start -i ubu
You will get an empty folder named ubuntu in your Windows user directory. Inside the Ubuntu container, that same folder appears as /windows.
Explanation:
cd ~ is for making sure you are in Windows user directory.
-it stands for interactive, so you can interact with the container in the terminal environment. -v host_folder:container_folder enables sharing a folder between the host and the container. The host folder should be inside the Windows user folder. /$(pwd) translates to //c/Users/YOUR_USER_DIR in Windows 10. --name ubu assigns the name ubu to the container.
-i in docker start likewise stands for interactive.
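A quick way to check that the share works (hello.txt is just an illustrative file name): inside the container, run
touch /windows/hello.txt
and the file should appear in the ubuntu folder of your Windows user directory.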

How to run a docker command in Jenkins Build Execute Shell

I'm new to Jenkins and I have been searching around but I couldn't find what I was looking for.
I'd like to know how to run a docker command in Jenkins (Build - Execute Shell):
Example: docker run hello-world
I have set Docker Installation to "Install latest from docker.io" in Jenkins' Configure System and have also installed several Docker plugins. However, it still doesn't work.
Can anyone help me point out what else should I check or set?
John
One of the following plugins should work fine:
CloudBees Docker Custom Build Environment Plugin
CloudBees Docker Pipeline Plugin
I normally run my builds on slave nodes that have docker pre-installed.
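For example, when the agent already has Docker installed and the jenkins user has permission to use it (typically by being in the docker group), the Execute Shell step can simply call it directly (illustrative):
docker version
docker run hello-world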
I came across another, more generic solution. Since I'm no expert at turning this into a Jenkins plugin, here are the manual steps:
Create or modify your Jenkins container (I use Portainer) so that it has the environment variable DOCKER_HOST=tcp://192.168.1.50 (when using the unix socket protocol you also have to mount your docker socket) and append :/var/jenkins_home/bin to the container's PATH variable
On your docker host, copy the docker binary into the Jenkins container: docker cp /usr/bin/docker jenkins:/var/jenkins_home/bin/
Restart the Jenkins container
Now you can use the docker command from any script or the command line, and the changes persist across image updates (since /var/jenkins_home is typically a persisted volume).
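A sketch of how that could look on the command line (the image tag, volume name and ordering of the PATH change are assumptions, not part of the original answer):
docker run -d --name jenkins -p 8080:8080 -e DOCKER_HOST=tcp://192.168.1.50 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts
docker exec jenkins mkdir -p /var/jenkins_home/bin
docker cp /usr/bin/docker jenkins:/var/jenkins_home/bin/
docker restart jenkins
PATH still needs :/var/jenkins_home/bin appended inside the container (for example through the environment settings in Portainer, as described above), since the exact value depends on what the image already has on its PATH.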
