Robot Framework does not find env variable - shell

I have created a docker container with an environment variable like this:
docker run -d -it --name upload-testservice-1 --env SERVICE_URL=http://localhost:9400 image
If I run the command env in a shell inside the container, I see many environment variables, including the wanted SERVICE_URL=http://localhost:9400.
When I now want to use this env-variable in my robot test like the docs tell me:
${test} = Get Environment Variable SERVICE_URL
or like this:
${test} = %{SERVICE_URL}
I receive the error Variable '${SERVICE_URL}' not found.
Can you tell me what's wrong and how I can access the env variable?
I also tried:
${test} = Evaluate os.environ.get("SERVICE_URL")
which produces the same error.
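For what it's worth, the lookup can be sanity-checked from a plain shell first. The docker exec line below assumes the container name from the question and is shown as a comment; the last line is a docker-free illustration of the underlying rule: a child process (such as the robot command) only sees the variable if it is present in that process's environment.

```shell
# Check that the variable is visible inside the container, where the
# tests would have to run (container name taken from the question):
#   docker exec upload-testservice-1 sh -c 'echo "$SERVICE_URL"'
# Local illustration: the child only sees what its environment holds.
SERVICE_URL=http://localhost:9400 sh -c 'echo "$SERVICE_URL"'
```

If robot is started from a shell where SERVICE_URL is not set (for example, on the host instead of inside the container), Get Environment Variable fails exactly like this.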

Related

Docker run image with password file

I am trying to run image-inspector (I know it is outdated; I am using it for a ClamAV test only) in Docker, executing from bash,
with several parameters, one of which is -password-file (the password is dumped into a file; there is no option to pass the password as a string without a file).
I am trying to run like:
PWD_FILE="./pwdfile"
docker run -p 8080:8080 openshift/image-inspector --scan-type=clamav
--clam-socket=/var/run/clamd.socket -path "${MKDIR}" -username "${SP_ID}" -password-file "${PWD_FILE}" --image "${URI_DIGEST}"
However, I am still getting "error: File not found" for the password file, even though the file is in the current folder when executing docker run. I have tried creating an env variable for the path inside bash, tried readlink on the file, and passing the path from another variable. I am not sure if I am missing something; the documentation for image-inspector does not seem to say much...
image-inspector is executed to test a container (one execution per container).
Does anybody have any experience or ideas?
Thank you!
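A sketch of one likely fix, assuming the password file sits in the directory docker is run from: a path passed to the container is resolved inside the container's filesystem, so the host file has to be mounted in and -password-file pointed at the in-container path (/tmp/pwdfile is an example name; the docker line is a comment because it needs a daemon and the other flags from the question, abbreviated with ...).

```shell
PWD_FILE="$(pwd)/pwdfile"         # absolute host path, not ./pwdfile
printf 'secret\n' > "$PWD_FILE"   # demo file for illustration only
[ -f "$PWD_FILE" ] && echo "host file exists"
# docker run -p 8080:8080 -v "$PWD_FILE":/tmp/pwdfile \
#   openshift/image-inspector --scan-type=clamav ... \
#   -password-file /tmp/pwdfile
```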

Setting environment variable derived in container at runtime within a shell

I have a custom entrypoint where an environment variable is exported. The value of the environment variable is constructed using two other variables provided at runtime.
Snippet from Dockerfile CMD ["bash", "/opt/softwareag/laas-api-server/entrypoint.sh"].
Snippet from entrypoint.sh
export URL="$SCHEME://$HOST:$PORT"
echo "URL:$URL"
Command docker run -e HOST="localhost" -e PORT="443" mycentos prints URL:localhost:443 as expected, but the same variable appears to have lost its value when the following command is executed.
docker exec -ti <that-running-container-from-myimage> bash
container-prompt> echo $URL
<empty-line>
Why would the exported variable appear to have lost the value of URL? What is getting lost here?
The environment variable will not persist across bash sessions. When the container runs, the variable is only available in the entrypoint's session; it will not be available later if it is set using export.
docker ENV vs RUN export
If you want the variables to be available across all sessions, you should set them in the Dockerfile:
ENV SCHEME=http
ENV HOST=example.com
ENV PORT=3000
On the application side, you can use them together; they will also be available in every session.
curl "${SCHEME}://${HOST}:${PORT}"
#
Step 8/9 : RUN echo "${SCHEME}://${HOST}:${PORT}"
---> Running in afab41115019
http://example.com:3000
Now, if we look at the way you are using it, it will not work, because:
export URL="$SCHEME://$HOST:$PORT"
# only in this session
echo "URL:$URL"
# will be available for node process too but for this session only
node app.js
For example look into this Dockerfile
FROM node:alpine
RUN echo $'#!/bin/sh \n\
export URL=example.com \n\
echo "${URL}" \n\
node -e \'console.log("ENV URL value inside nodejs", process.env.URL)\' \n\
exec "$@" \n\
' >> /bin/entrypoint.sh
RUN chmod +x /bin/entrypoint.sh
ENTRYPOINT ["/bin/entrypoint.sh"]
So when you run the docker container for the first time, you will be able to see the expected response.
docker run -it --rm myapp
example.com
ENV URL value inside nodejs example.com
Now we want to check a later session.
docker run -it --rm abc tail -f /dev/null
example.com
ENV URL value inside nodejs example.com
While the container is up, we can verify from another session:
docker exec -it myapp sh -c "node -e 'console.log(\"ENV URL value inside nodejs\", process.env.URL)'"
ENV URL value inside nodejs undefined
As we can see, it is the same script but different behaviour because of docker: the variable is only available in that session. You can write values to a file if you are interested in using them later.
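The session scoping described above is plain shell behaviour and can be reproduced without docker; the two sh -c calls below stand in for the entrypoint session and a later docker exec session:

```shell
sh -c 'export URL=example.com; echo "first session: $URL"'
sh -c 'echo "second session: ${URL:-unset}"'
```

export only propagates a variable to the exporting shell's own children, and a docker exec shell is not a child of the entrypoint shell.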

Access a bash script variable outside the docker container in which the script is running

I have a bash script running inside a docker container. In this script, I set the value of some variable.
Can I somehow access the value of this variable outside the container?
I tried to make the variable "global" but could not figure out how to do it. Is it a good idea to make the required variable an environment variable inside the container?
How to reproduce
Create a bash script called temp.sh with the following contents:
a=$RANDOM
Now, run this file in a docker container as follows:
docker run -it --rm -v $(pwd):/opt alpine sh -c "sh /opt/temp.sh"
Desired behaviour: To be able to access the variable a outside the docker container
Credit: This comment by Mark
I mounted a directory on the docker filesystem using
docker run -v <host-file-system-directory>:<docker-file-system-directory>
In the bash script, I added
echo "$variable" > <docker-file-system-directory>/variable.txt
As I had mounted a host filesystem directory on the docker filesystem, I can still access variable.txt simply using cat <host-file-system-directory>/variable.txt
Note that docker-file-system-directory must be an absolute path, and not a relative path.
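Putting those steps together (paths are examples): the docker round trip is shown as a comment since it needs a daemon, with a docker-free stand-in below it showing that the mounted file is what carries the value out of the script's shell.

```shell
# With docker, using the temp.sh scenario from the question:
#   docker run --rm -v "$(pwd)":/opt alpine sh -c 'a=$RANDOM; echo "$a" > /opt/variable.txt'
#   cat variable.txt
# The same file round trip without docker:
dir="$(mktemp -d)"
sh -c "a=\$\$; echo \"\$a\" > '$dir/variable.txt'"  # the script writes its variable
cat "$dir/variable.txt"                             # the outer shell reads it back
```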
One way of achieving that is using docker exec, if your container is running and has access to bash.
#!/usr/bin/env bash
set -x
yourContainerName="testContainerName"
test=$(docker exec -i "${yourContainerName}" bash <<EOF
# do some work here e.g. execute your script
testVar="thisIsTest" # the value we want to access outside of container
echo \$testVar
EOF
)
echo $test
We pass a multiline script to the docker container, which at the end echoes the value we need. This value is then accessible from the shell that executed docker exec.
Output looks like this:
++ docker exec -i testContainerName bash
+ test=thisIsTest
+ echo thisIsTest
thisIsTest

Dockerfile assign bash command to var

I need to assign bash command to var in Dockerfile. Following is what I guess:
FROM centos:7
RUN data=$(ls /)
ENV DATA $data
After running the container (docker run -it <image> bash), echo $DATA outputs an empty line. I have searched on Google with no luck. I am stuck!
How can I assign the output of a bash command to a variable in a Dockerfile?
You can't do that, since each RUN command spawns its own shell.
Alternatively, you can save the information to some file, and use an ENTRYPOINT to set the env variable using some script once the container is running.
You can't set environment variables this way while building a docker image: the image is built as a layered filesystem, and each RUN instruction executes its command in its own shell and then exits. Instead, you can write the Dockerfile like below:
FROM centos:7
RUN echo 'export DATA=$(ls /)' >> ~/.bashrc
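The "own shell" point can be seen without docker at all; each RUN line behaves like an independent sh -c invocation:

```shell
sh -c 'data=$(ls /)'             # the variable dies with this shell
sh -c 'echo "${data:-<empty>}"'  # a later RUN sees nothing: prints <empty>
```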

Copy Current Env Vars into `docker run`'s Scope

If I'm using a docker container with an entry point set, I can run that container via the following command
docker run -it my-container-tag
If the program in my container requires an environmental variable, I can pass that var via the -e flag
docker run -it -e FOO=bar my-container-tag
If I have a program that uses many environment variables, I get an unwieldy mess that becomes hard to type.
docker run -it -e FOO=bar -e BAZ=zip -e ZAP=zing -e ETC=omg-stop my-container-tag
Is there a way to tell docker run to inherit all the env variables currently set in my shell's scope? If not, are there common practices for working around needing to type in these variables again and again?
You can't inherit the envs directly. I usually use docker-compose to set my envs when there are too many, or build the container with the environment variables inside it if they don't need to change frequently.
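One common workaround worth noting: docker run accepts --env-file to read many variables from a file, and -e NAME with no value forwards NAME from the calling shell. A sketch generating such a file from the current shell (FOO/BAZ/ZAP are the example names from the question; the docker lines are comments since they need a daemon):

```shell
export FOO=bar BAZ=zip ZAP=zing
printf 'FOO=%s\nBAZ=%s\nZAP=%s\n' "$FOO" "$BAZ" "$ZAP" > my.env
cat my.env
# docker run -it --env-file my.env my-container-tag
# docker run -it -e FOO -e BAZ my-container-tag   # forwards host values
```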
