I have Airflow running on Windows using Docker Compose.
I created a DAG which appends data to a SQLite database.
Currently the DAG works with the database file being inside the Docker container but I'd like it to be on my local machine. How can I do this?
You have to create a volume in your docker-compose file referring to the file (or its directory) that you want to share between the container and your computer.
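As a minimal sketch, assuming the DAG writes its SQLite file under /opt/airflow/data inside the container (the service name and both paths here are assumptions — adjust them to match your docker-compose setup), the volume entry could look like:

```yaml
services:
  airflow-webserver:
    # ... your existing service configuration ...
    volumes:
      # Maps ./data on the host (next to docker-compose.yml)
      # to /opt/airflow/data inside the container.
      - ./data:/opt/airflow/data
```

The SQLite file then lives in the ./data folder on your local machine and survives container restarts and rebuilds.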
I want to run Apache Airflow using Docker on Windows to experiment with running tasks.
I'm new to Docker and have so far (1) initiated an Airflow instance that is running on Docker and (2) have created a DAG script locally.
What I want to do is to move this DAG script into a "dags" folder located somewhere in the container - I read that the directory is specified in airflow.cfg, but haven't been able to read that file with either nano or vim.
(1) Do I have to create the 'dags' folder myself? (2) How would one copy over local scripts to the dags folder?
Found my answer: "directly" copying scripts into a Docker container is considered "hacky", may not be safe, and can cause versioning issues.
Sharing a directory as a "volume" in Docker is the way to go:
docker run -d -p 8080:8080 -v /path/to/dags/on/your/local/machine/:/usr/local/airflow/dags puckel/docker-airflow webserver
Source: Getting Started with Airflow using Docker
I was trying to set up Airflow through Docker on Windows and found a good guide here: https://stackoverflow.com/a/59368063/2085454, but got stuck on the step that copies a local DAG into Docker: docker cp sample_dag.py containerName:/usr/local/airflow/dags
The command I used was docker cp Desktop\test_dag.py naughty_bolbs:/usr/local/airflow/dags. The terminal shows nothing when running this command, and I can't find any copied DAG in the container ("naughty_bolbs" is the container's name).
This .py file works well when I was running it locally on Mac. So the file itself should not be a problem.
Do you know how can I copy local DAG to the container's dags folder successfully on Windows?
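One thing worth knowing is that docker cp prints nothing on success, so silence alone doesn't indicate failure. A sketch of the copy plus an explicit verification step (the container name is taken from the question above; the Windows path is an assumption — substitute your actual path, and quote it in case it contains spaces):

```shell
# Use a full path to the file rather than a relative one, and quote it.
docker cp "C:\Users\yourname\Desktop\test_dag.py" naughty_bolbs:/usr/local/airflow/dags/

# docker cp is silent on success, so verify the file actually arrived:
docker exec naughty_bolbs ls -l /usr/local/airflow/dags/
```

If the ls output lists test_dag.py, the copy worked and Airflow should pick it up on its next DAG scan.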
I am quite new to Docker and still learning and reading through the docs. I have an Oracle base image which I would like to use as a parent image to build my own image, and then push it to a custom Docker registry/repository.
The base image already provides a full Oracle DB setup. As next steps, I would like to:
1. download a dump file (e.g. from a dump URL) directly into the Docker image (without downloading it to my local workspace)
2. run some SQL scripts
3. lastly, import the dump using Data Pump (impdp)
I tried to follow https://github.com/mpern/oracle-docker, but there you always need to store the dump file locally and mount it as a volume.
Is it possible to use a curl command to download the dump and store it directly in the Oracle container's workspace, and then import it from there?
You can run an interactive bash session inside your container to check whether curl is installed; if it is not, you need to install it first. From that interactive session you can then download your dump file.
The ports you require will also need to be published. If the container needs to communicate outside of Docker and the host machine, you can use docker run with the -p parameter.
An example is below,
docker run -p 80:80 -it (Your image) /bin/bash
More information is available in the docs for the docker run command and for Dockerfiles:
https://docs.docker.com/engine/reference/commandline/run/
https://docs.docker.com/engine/reference/builder/
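Putting the steps together, the in-container download and import might look like the sketch below. The container name, dump URL, credentials, paths, and the Oracle DIRECTORY object name are all assumptions — substitute your own, and note that impdp requires a DIRECTORY object that already points at the folder holding the dump:

```shell
# Open a shell in the running Oracle container
docker exec -it oracle-db /bin/bash

# Inside the container: fetch the dump straight into the container's
# filesystem, so nothing is stored on the host
curl -L -o /opt/oracle/dumps/export.dmp "https://example.com/export.dmp"

# Run a SQL script, then import the dump with Data Pump
sqlplus system/yourpassword@ORCLPDB1 @/opt/oracle/scripts/setup.sql
impdp system/yourpassword@ORCLPDB1 directory=DUMP_DIR dumpfile=export.dmp
```

Keep in mind that anything downloaded this way lives only in the container's writable layer; it disappears when the container is removed, which is fine for a one-off import.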
I am running Robot Framework (Selenium-based) tests inside a Docker container, but I need to access files outside the Docker container (on my Mac).
I have tried providing the absolute path on the Mac, but Docker treats the container's own filesystem as the root.
I found below links for Windows, but not for Mac.
Docker - accessing files inside container from host
Access file of windows machine from docker container
One approach is to copy your files into the Docker container at creation time, but if your files are updated by another service on the host and the container needs to see those updates too, just mount them like below:
docker run -d --name your-container -v /path/to/files/:/path/inside/container containername:version
This way the files on the host machine are mounted into the Docker container, and the user inside the container can access them.
First I just want to mention I am very new to Docker.
I am using Windows 10 with Docker for Windows.
I am using the default Linux containers option.
I have downloaded the latest image from here,
https://github.com/camunda/docker-camunda-bpm-platform.
So now, my Docker is online, and the container + image are working. A tomcat server and a Camunda engine are online and working.
My problem is the following,
I need to make some changes and I can't find where Tomcat and Camunda are being stored. I need to edit some XML files both in Camunda and in Tomcat (to set up which database to use, for example).
Can it be that it is not being stored on my local machine?
For example, when I open the container with Kitematic (Docker UI) I can see its environment variables; there is a SERVER_CONFIG variable whose value is /camunda/conf/server.xml (this is one of the files I need to edit, but I can't find it or anything else anywhere on my local machine).
You should access the container using the following commands.
First, list your containers to find the container ID:
sudo docker ps -a
CONTAINER ID   IMAGE                                 COMMAND                  CREATED      STATUS
5e978f353734   camunda/camunda-bpm-platform:latest   "/sbin/tini -- ./cam…"   4 days ago   Up 4 days
Then issue:
sudo docker exec -it 5e978f353734 /bin/bash
You will then have a shell inside the container and can browse its filesystem. Good luck!
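Since the files live inside the container rather than on your Windows filesystem, another option is to pull the file out with docker cp, edit it locally with any editor, and copy it back. A sketch, reusing the container ID from the listing above (the host-side path is an assumption):

```shell
# Copy server.xml out of the container into the current directory
docker cp 5e978f353734:/camunda/conf/server.xml ./server.xml

# ... edit server.xml locally with your editor of choice ...

# Copy the edited file back, then restart the container to apply it
docker cp ./server.xml 5e978f353734:/camunda/conf/server.xml
docker restart 5e978f353734
```

Note that changes made this way live only in that container instance; if you recreate the container from the image, they are lost, which is why baking the change into your own image or mounting the file as a volume is the more durable approach.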
You may want to consider using Camunda BPM RUN, which aims to allow configuration without having to change the WAR deployment or Tomcat. Instead configuration is done as described here:
https://docs.camunda.org/manual/latest/user-guide/camunda-bpm-run/
Config files can be mounted into the docker images, but you may prefer to compose your own docker image based on the Camunda BPM Run base image.
The example here shows another approach, which sets Camunda properties from outside the Docker image by passing the environment variable SPRING_APPLICATION_JSON into the container:
https://medium.com/@robert.emsbach/anyone-can-run-camunda-bpm-on-azure-in-10-minutes-4b4055cc8e9
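As a sketch of that environment-variable approach (the image tag, port, and datasource values below are assumptions — adjust them to your environment), you could pass the database configuration like this:

```shell
# SPRING_APPLICATION_JSON is a standard Spring Boot mechanism for
# supplying properties as a JSON blob; Camunda BPM Run picks it up,
# so the datasource can be changed without rebuilding the image.
docker run -d -p 8080:8080 \
  -e SPRING_APPLICATION_JSON='{
        "spring.datasource.url": "jdbc:postgresql://db-host:5432/camunda",
        "spring.datasource.username": "camunda",
        "spring.datasource.password": "secret"
      }' \
  camunda/camunda-bpm-run:latest
```

This keeps all configuration outside the image, so the same image can point at different databases per environment.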