Poetry & Tox: How to pass arguments to commands via Tox when running in Poetry

When I try to pass an argument to a command via Tox v4.2.6 within Poetry it is failing to recognise -- as a separator.
Command:
poetry run tox -- \$username:\$password
Tox error:
tox: error: unrecognized arguments: $username:$password
hint: if you tried to pass arguments to a command use -- to separate them from tox ones
When I run the same Tox command outside of Poetry, it works as expected (tox -- \$username:\$password).
Does anyone know what I am missing?
my tox.ini is:
[tox]
envlist = py39,lint,bandit
isolated_build = True
toxworkdir = {toxinidir}/build/tox

[base]
setenv =
    PIP_INDEX_URL = https://{posargs}#blah.artifactory.com/artifactory/api/pypi/pypi-blah-prod-virtual/simple

[testenv]
setenv =
    {[base]setenv}
deps =
    pytest
    pyyaml
commands = pytest --junitxml=junit-{envname}.xml
Thanks!

The first -- is eaten by Poetry itself, so what actually runs is tox \$username:\$password. To pass an additional -- through to tox, execute:
poetry run tox -- -- \$username:\$password
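The behaviour is easy to reproduce with a toy wrapper (a sketch, not Poetry's actual code) that, like poetry run, consumes the first -- before handing the remaining arguments on:

```shell
# Toy sketch of a "poetry run"-style wrapper: it skips its own options,
# eats the first "--", and forwards the rest to the wrapped command.
wrapper() {
    while [ "$#" -gt 0 ] && [ "$1" != "--" ]; do shift; done  # skip up to "--"
    [ "$#" -gt 0 ] && shift                                   # eat the "--" itself
    echo "$@"                                                 # what the wrapped command sees
}

wrapper run tox -- -- '$username:$password'   # prints: -- $username:$password
```

With only a single --, the wrapped command would receive just $username:$password and tox would treat it as an unrecognized argument — hence the doubled separator.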

Related

Poetry config - command: poetry not found, but path added to bashrc and shrc files. export path command caused directory error running script

I am a new Linux/Python user trying to get Poetry set up. Basically, I can make the poetry command work by using an export PATH statement, but that only causes a directory issue when executing the script. I've downloaded Poetry using the command:
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
I know that it's installed because if I try and run that again, it says it already downloaded. I run into the error:
command: poetry not found
I used
nano ./bashrc
nano ./shrc
in terminal and I've edited both of those with
export PATH="$HOME/.poetry/bin:$PATH"
Then I'm following instructions to run a set of scripts I've been given under the directory "thermalproc"; the path is ~/Desktop/VideoProcessing/thermalproc-20220625/thermalproc. I'm to use the command
poetry run thermalproc
The only thing I have done to make the command work is to run that export line directly in the terminal, and that does allow the command to work. But if I run just the command like so:
joshua@Andrea-G750JZA:~$ poetry run thermalproc
it returns
RuntimeError
Poetry could not find a pyproject.toml file in /home/joshua or its parents
at .poetry/lib/poetry/_vendor/py3.8/poetry/core/factory.py:369 in locate
365│ if poetry_file.exists():
366│ return poetry_file
367│
368│ else:
→ 369│ raise RuntimeError(
370│ "Poetry could not find a pyproject.toml file in {} or its parents".format(
371│ cwd
372│ )
373│ )
and if I run in the directory of the script as I originally thought I should:
joshua@Andrea-G750JZA:~/Desktop/VideoProcessing/thermalproc-20220625/thermalproc$ poetry run thermalproc.
FileNotFoundError
[Errno 2] No such file or directory: b'/snap/bin/thermalproc.'
at /usr/lib/python3.8/os.py:601 in _execvpe
597│ path_list = map(fsencode, path_list)
598│ for dir in path_list:
599│ fullname = path.join(dir, file)
600│ try:
→ 601│ exec_func(fullname, *argrest)
602│ except (FileNotFoundError, NotADirectoryError) as e:
603│ last_exc = e
604│ except OSError as e:
605│ last_exc = e
Thanks for sticking it out this long, any advice would be greatly appreciated!

How do we access the variables set inside the tox environment again in another block of the tox?

I am using tox-docker and it sets POSTGRES_5432_TCP_PORT as an environment variable. How do I access this env variable again? I want to do this because I have to provide this to the pytest command.
[tox]
skipsdist = True
envlist = py37-django22

[testenv]
docker = postgres:9
dockerenv =
    POSTGRES_USER=asd
    POSTGRES_DB=asd
    POSTGRES_PASSWORD=asd
setenv =
    PYTHONDONTWRITEBYTECODE=1
    DJANGO_SETTINGS_MODULE=app.settings.base
deps =
    -rrequirements.txt
    -rrequirements_dev.txt
commands =
    env
    python -c "print('qweqwe', {env:POSTGRES_5432_TCP_PORT:'default_port'})"
    pytest -sv --postgresql-port={env:POSTGRES_5432_TCP_PORT:} --cov-report html --cov-report term --cov=app -l --tb=long {posargs} --junitxml=junit/test-results.xml
Here, POSTGRES_5432_TCP_PORT is set by tox-docker, but when I try to access it inside tox it is not available. Yet when I execute the env command inside tox, it prints the variable:
py37-django22 docker: run 'postgres:9'
py37-django22 run-test-pre: PYTHONHASHSEED='480168593'
py37-django22 run-test: commands[0] | env
PATH=
TOX_WORK_DIR=src/.tox
HTTPS_PROXY=http://0000:8000
LANG=C
HTTP_PROXY=http://0000:8000
PYTHONDONTWRITEBYTECODE=1
DJANGO_SETTINGS_MODULE=app.settings.base
PYTHONHASHSEED=480168593
TOX_ENV_NAME=py37-django22
TOX_ENV_DIR=/.tox/py37-django22
POSTGRES_USER=swordfish
POSTGRES_DB=swordfish
POSTGRES_PASSWORD=swordfish
POSTGRES_HOST=172.17.0.1
POSTGRES_5432_TCP_PORT=32822
POSTGRES_5432_TCP=32822
VIRTUAL_ENV=.tox/py37-django22
py37-django22 run-test: commands[1] | python -c 'print('"'"'qweqwe'"'"', '"'"'default_port'"'"')'
qweqwe default_port
py37-django22 run-test: commands[2] | pytest -sv --postgresql-port= --cov-report html --cov-report term --cov=app -l --tb=long --junitxml=junit/test-results.xml
If a script sets an environment variable, that envvar is visible to that process only. If it exports the variable, the variable will be visible whatever sub-shells that script may spawn. Once the script exits, all envvars set by the shell process and any child processes will be gone since they existed only in that memory space.
Not sure what you're trying to do, and Docker is not my speciality, but 5432 is the default Postgres port. If you're trying to supply it to pytest, you could say:
POSTGRES_5432_TCP_PORT=5432 pytest <test_name>
Or something to that effect.
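The inline VAR=value cmd form suggested above can be tried with a throwaway command in place of pytest (a sketch; 5432 is just the Postgres default):

```shell
# An inline NAME=value assignment is visible to that one command only.
unset POSTGRES_5432_TCP_PORT
out=$(POSTGRES_5432_TCP_PORT=5432 sh -c 'echo "$POSTGRES_5432_TCP_PORT"')
echo "$out"                                       # prints: 5432
sh -c 'echo "${POSTGRES_5432_TCP_PORT:-unset}"'   # prints: unset (scope ended)
```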

How to pass a value from JSON to a command in shell script at Docker? [duplicate]

This question already has answers here:
Command not found error in Bash variable assignment
(5 answers)
Closed 3 years ago.
I have a Dockerfile to create an image with Ubuntu 16.04. Inside the Dockerfile:
# dotnet tool command
RUN apt-get install dotnet-sdk-2.2 -y
# for dot net tool
ENV PATH="${PATH}:/root/.dotnet/tools"
# jq command to read json file
RUN apt-get install jq -y
ADD xxx /xxx
# Copy the deploy-tool.json file
ADD deploy-tool.json /xxx/deploy-tool.json
# Copy the main sh script to xxx
ADD main-migrate.sh /xxx/main-migrate.sh
# Run the main sh script to run script in each xxx/*/bla..bla.sh
RUN chmod +x /xxx/main-migrate.sh
RUN /xxx/main-migrate.sh
My deploy-tool.json is as follows:
{
"name": "xxx.DEPLOY",
"version": "1.2.3-dev.29"
}
Here is main-migrate.sh:
name = $(jq '.name' /xxx/deploy-tool.json)
nugetFileVersion = $(jq '.version' /xxx/deploy-tool.json)
# TODO how to pass value from JSON to the command below
# install dot net nuget
dotnet tool install -g $name --version "$nugetFileVersion" --add-source /xxx/
I have the xxx.DEPLOY.nupkg in xxx folder.
When the Dockerfile runs main-migrate.sh, I get an error message complaining that name and nugetFileVersion cannot be found.
How do I pass name & nugetFileVersion from the jq command to the dotnet tool install above as shown in main-migrate.sh?
Thank you
The idea is right, but the problem is with your main-migrate.sh script in the assignment statements. Shell assignments don't take spaces around the = symbol. It should have been
name=$(jq '.name' /xxx/deploy-tool.json)
# ^^^ no spaces
nugetFileVersion=$(jq '.version' /xxx/deploy-tool.json)
# ^^^ no spaces
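One more note, assuming jq is available: without -r, jq prints strings with their JSON quotes included, which would end up inside $name and likely confuse dotnet tool install. A minimal sketch using inline JSON in place of the file:

```shell
# No spaces around '=', and jq -r so the value comes out as a raw string
# rather than a JSON-quoted one like "xxx.DEPLOY".
json='{"name": "xxx.DEPLOY", "version": "1.2.3-dev.29"}'
name=$(printf '%s' "$json" | jq -r '.name')
version=$(printf '%s' "$json" | jq -r '.version')
echo "$name $version"    # prints: xxx.DEPLOY 1.2.3-dev.29
```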

How to Wrap Up a Shell Script into a Single Command with Argument Passing

I use a lot of virtual environments nowadays, since different projects are going on in parallel at my company.
The following is what I usually do for the initial setup when creating a new conda virtual environment:
conda install --yes --file requirements.txt
source activate myenv
python -m ipykernel install --user --name myenv --display-name "kernel_name"
The sequence of commands above must be run in order, with myenv and kernel_name passed as manually given arguments.
How could I do this in one go with a wrapped-up .sh file? Or is this possible without creating a .sh file?
You can do it using a shell script. I would do:
#!/usr/bin/env bash
myenv="$1"
kernel_name="$2"
source /path/to/base/conda/installation/etc/profile.d/conda.sh
conda install --yes --file /path/to/requirements.txt
conda activate "$myenv"
python -m ipykernel install --user --name "$myenv" --display-name "$kernel_name"
And run it like: /path/to/script.sh <env-name> <kernel-name>
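The argument passing the script relies on can be seen in isolation (a minimal sketch, no conda required; set -- simulates invoking the script with two arguments):

```shell
# "$1" and "$2" hold the positional arguments given on the command line;
# `set --` here simulates running: /path/to/script.sh myenv my_kernel
set -- myenv my_kernel
env_name="$1"
kernel_name="$2"
echo "env=$env_name kernel=$kernel_name"   # prints: env=myenv kernel=my_kernel
```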

How to declare a shell constant for Jenkins execute shell build configuration?

I'm running the following command in the "execute shell" entry in my Jenkins job configuration.
AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX CASS_HOSTNAME=XXX DB_HOSTNAME=XXX DB_PASSWORD=XXX DB_USERNAME=XXX EMAIL_REGION=XXX RED_HOSTNAME=XXXX RED_PORT=XXX npm run
How could I create a single variable for those parameters? I tried declaring a regular shell constant like:
readonly parameters="AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX .."
$parameters npm run init_test_tenant
But I get the following error
/tmp/hudson6912171417213995775.sh: 4: /tmp/hudson6912171417213995775.sh: AWS_ACCESS_KEY_ID=XXX: not found
To avoid placing eval in front of everything and opening a hole in your code, you can use a trick:
export $parameters
npm run init_test_tenant
The difference from the attempt in your question: all variables will be visible to npm and to all other commands you execute after that, while the VAR=value command syntax makes VAR visible only to command.
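The scoping difference can be demonstrated with a hypothetical variable FOO:

```shell
unset FOO
# Inline form: FOO is visible only to this single command.
FOO=bar sh -c 'echo "inline:$FOO"'      # prints: inline:bar
sh -c 'echo "after:${FOO:-unset}"'      # prints: after:unset
# export form: FOO is visible to every subsequent command.
export FOO=baz
sh -c 'echo "exported:$FOO"'            # prints: exported:baz
```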
Prefix the command with env:
readonly parameters="AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX .."
env $parameters npm run init_test_tenant
Also, if you do not need any other environment variables in your nodejs program, you can add the -i option to env so that npm gets a clean environment; but then you would have to address npm via its full path (e.g. /usr/bin/npm), and likewise every other program that might run within your nodejs program:
readonly parameters="AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX .."
env -i $parameters /usr/bin/npm run init_test_tenant
Or add PATH=/usr/bin to the $parameters constant like so:
readonly parameters="PATH=/usr/bin AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX .."
env -i $parameters npm run init_test_tenant
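The env-prefix pattern can be tried with hypothetical variables in place of the real credentials (a sketch; the unquoted $parameters expansion relies on word splitting, as in the answer):

```shell
# env passes each NAME=value word into the command's environment.
readonly parameters="GREETING=hello TARGET=world"
out=$(env $parameters sh -c 'echo "$GREETING $TARGET"')
echo "$out"    # prints: hello world
```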
