This is a funny but very frustrating problem. I am using an env variable that specifies a date. I provide an ISO 8601 compliant value and, in the application, I retrieve and parse it. When I specify it in a GitHub Actions workflow, it gets parsed as a date (rather than a string) and reformatted, so my application's parsing fails.
Example:
.github/workflows/rust.yaml
env:
  MY_DATE: '2020-10-07T12:00:00+01:00'
run: echo $MY_DATE
Result (GitHub Actions UI):
env:
  MY_DATE: 10/07/2020 11:00:00
10/07/2020 11:00:00
It is specific to GitHub Actions and its YAML parsing; it works fine on Heroku, on various local setups, etc.
Things I tried that don't work:
using no quotes, single quotes ('), and double quotes (")
setting another env var, LC_TIME, to en_DK.UTF-8
using the !!str shorthand (see https://yaml.org/spec/1.2/spec.html, section Example 2.23. Various Explicit Tags); this one fails either with The workflow is not valid. .github/workflows/rust.yml: Unexpected tag 'tag:yaml.org,2002:str' or with The workflow is not valid. .github/workflows/rust.yml: The scalar style 'DoubleQuoted | SingleQuoted' on line 29 and column 24 is not valid with the tag 'tag:yaml.org,2002:str'
Is there any way around this? Any secret parameter I can turn on? Any escape sequence? I just want the GitHub Actions YAML parser to treat the value as a string.
Surprisingly, it seems the GitHub Actions workflow YAML parser does not fully implement the YAML standard, and explicit typing (like !!str) does not work. You can, however, work around it by setting the environment variable to the desired value not in the YAML file itself, but dynamically during workflow execution, using a dedicated workflow command (writing to $GITHUB_ENV):
steps:
  - name: Dynamically set MY_DATE environment variable
    run: echo "MY_DATE=2020-10-07T12:00:00+01:00" >> $GITHUB_ENV
  - name: Test MY_DATE variable
    run: echo ${{ env.MY_DATE }}
This should do the trick.
Related
I've got a CI setup split across three files, something like this:
# file Vars
.def-vars:
  STAGING_SSH_DEST: mysite.com
  PROJECT_ROOT: myRoot

# file gitlab-ci
variables:
  extends: .def-vars
  STAGING_SSH_DEST: myrealsite.com
  PROJECT_ROOT: /myRealRoot

deploy-stage:
  extends: .deploy
  variables:
    SSH_DESTINATION: $STAGING_SSH_DEST

# file deploy
.deploy:
  variables:
    SSH_DESTINATION: mysite.com
    RSYNC_DESTINATION: $SSH_DESTINATION:$PROJECT_ROOT
I have my files and variables split up like this to increase the re-usability of the scripts.
The idea is that, since I have multiple site destinations (staging, prod), I want to be able to pass the SSH destination to each and have the job figure out the rsync destination on its own. The problem is that the variable expansion is not working the way I'd expect.
In the deploy script I added a print and got the following:
$ echo $SSH_DESTINATION # This is the variable name local to job
myrealsite.com # Yep! printed the passed in value
$ echo $RSYNC_DESTINATION # $SSH_DESTINATION:$PROJECT_ROOT
$STAGING_SSH_DEST:/myRealRoot # That is the name of the variable passed in
The root and SSH_DESTINATION print just fine on their own. But when the two are combined into RSYNC_DESTINATION, the SSH_DESTINATION part seems to be expanded one time too few.
I've had the idea to just create the rsync variable within the script section, but I'd like to avoid this as I want to be able to override the rsync variable without editing the .deploy job.
How can this be accomplished?
There is a known issue with GitLab CI variables that prevents them from being expanded correctly with your extends setup.
Your options to solve this are:
Use the before_script workaround posted inside the aforementioned issue. It has some limitations, but for simple cases it works pretty well.
before_script:
  - export VAR1="${CI_PIPELINE_ID}"
  - export VAR2="test-${VAR1}"
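Applied to the setup from the question, that could look something like this (a sketch using the variable names from the question; the script line is illustrative):
.deploy:
  variables:
    SSH_DESTINATION: mysite.com
  before_script:
    - export RSYNC_DESTINATION="${SSH_DESTINATION}:${PROJECT_ROOT}"
  script:
    - echo "deploying to ${RSYNC_DESTINATION}"   # the real job would run rsync here
Because the expansion happens in the shell at job runtime, RSYNC_DESTINATION picks up whatever SSH_DESTINATION the job actually received, including the override from deploy-stage.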
Do some kind of preparation via .env artifacts and downstream pipelines. This one is tougher to set up, but it allows you to create dynamic jobs (and pipelines), for example spawning a multi-stage deploy after a successful build.
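For illustration, a minimal sketch of the .env-artifact idea within a single pipeline (the job names, stages and the deploy.env file name are assumptions, not part of the original setup):
prepare:
  stage: build
  script:
    # compute the value once and publish it to later jobs as a dotenv report
    - echo "RSYNC_DESTINATION=${STAGING_SSH_DEST}:${PROJECT_ROOT}" >> deploy.env
  artifacts:
    reports:
      dotenv: deploy.env
deploy-stage:
  stage: deploy
  needs: ["prepare"]
  script:
    - echo "deploying to ${RSYNC_DESTINATION}"   # value comes from the dotenv artifact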
"Setting a variable within another variable" should be easier with GitLab 15.6 (November 2022)
Support for special characters in CI/CD variables
Previously, it was difficult to use the $ character in a CI/CD variable because $ normally signifies the start of another variable.
GitLab would interpret it as a variable and try to expand it.
In this release, we are introducing the variables:expand: keyword, which allows you to mark a variable as "raw".
A raw variable can contain any special characters and is not expanded when passed to the GitLab runner.
See Documentation and Issue.
And:
Example of variables:expand:
variables:
  VAR1: value1
  VAR2: value2 $VAR1
  VAR3:
    value: value3 $VAR1
    expand: false
The result of VAR2 is value2 value1.
The result of VAR3 is value3 $VAR1.
I'm getting the following warning in Ansible:
[WARNING]: Non-string value found for env option. Ambiguous env options should be wrapped in quotes to avoid YAML parsing. This will become an error in Ansible 2.8. Key: PORT; value will be treated as: 12345
So I went and looked up the origin of this value and wrapped all instances of it in quotes. Or so I thought. I'm still getting the warning.
So I went to the place in the code where it appeared and it seems to be this:
docker_container:
  env: '{{ params | combine(extra_params, {"PORT": my_port|int + amount|int * 10 })}}'
This is a setup for dealing with multiple instances of the same container, each getting a unique port, as to not interfere with one another.
And I'm not sure how to fix that without breaking that setup. Can it be cast to string again after the calculation is done? Should I do it beforehand? What's the best option here?
As the Ansible documentation for the docker_container module states under env:
Values which might be parsed as numbers, booleans or other types by the YAML parser must be quoted (e.g. "true") in order to avoid data loss.
so you have to convert your result to a string:
env: '{{ params | combine(extra_params, {"PORT": (my_port|int + amount|int * 10) | string })}}'
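For context, the whole task might look something like this (a sketch: params, extra_params, my_port and amount come from the question, while the task name, container name and image are made up):
- name: start one container instance on its own port
  docker_container:
    name: myapp
    image: myimage:latest
    env: '{{ params | combine(extra_params, {"PORT": (my_port|int + amount|int * 10) | string }) }}'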
I have Terraform that is using a templated bash script to set my user data section for an AWS launch configuration.
data "template_file" "user_data" {
  template = "${file("${path.module}/user-data.tpl")}"
  vars {
    file_system_id = "${aws_efs_mount_target.my_efs_alpha.dns_name}"
  }
}
The file_system_id variable then needs to be used in my template:
sudo mount -t nfs -o nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2 $$$${file_system_id}:/ /mnt/efs
Bash will interpret a single dollar sign as a bash variable. As I understand it, Terraform will interpret a double-dollar-sign as a Terraform variable. For added fun, the dollar signs in the template need to be escaped with another dollar sign -- hence the 4 dollar signs in front of file_system_id.
Looking at the user data in my Launch Config over in the AWS Console, Terraform does not appear to be making any effort to replace my $$$${file_system_id} with the variable value from my template_file definition. Rather, it just shows up in the user data section as literally $${file_system_id}.
So, the question is, how do I get my EFS DNS name (or whatever other value I want) to replace the file_system_id variable in my template? What have I missed?
As BMW mentioned, you don't need to escape the dollar signs. ${file_system_id} works just fine.
Terraform's variable replacement in templates runs first, so you don't need to worry about how Bash will parse the script until after the variables have been substituted.
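With that, the mount line in user-data.tpl only needs the single-dollar Terraform interpolation (a sketch of the corrected template line; everything except the escaping is taken from the question):
# ${file_system_id} is substituted by Terraform when the template is rendered
sudo mount -t nfs -o nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2 ${file_system_id}:/ /mnt/efs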
I'm trying to get going with some more advanced Ansible playbooks and have hit a wall. I'm trying to get Ansible to do what this /bin/bash for loop does:
for i in $(</filename.txt);do '/some/command options=1 user=usera server=$i';done
filename.txt contains 50-100 hostnames.
I can't use Jinja templates, as the command has to be run, not just a config file updated.
Any help would be greatly appreciated.
Thanks in advance,
Jeremy
You can use Jinja templates, but differently.
What your specific code does is not the most advisable approach.
For multi-line code you should use the shell module.
Example of a multi-line shell call:
- name: run multiline stuff
  shell: |
    for x in "${envvar}"; do
      echo "${x}"
    done
  args:
    executable: /bin/bash
Note that I'm explicitly setting executable, which ensures bash-isms will work.
I just used envvar as an example of an arbitrary environment variable being available.
If you need to pass specific env variables, you should use the environment clause of the call to the shell module; refer to http://docs.ansible.com/ansible/playbooks_environment.html.
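A minimal sketch of that (the task name, variable names and values are illustrative):
- name: run shell with an explicit environment variable
  shell: |
    echo "target is ${TARGET_SERVER}"
  args:
    executable: /bin/bash
  environment:
    TARGET_SERVER: "{{ myhost }}"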
For simple variables you can just use their value directly in shell: echo "myvar: {{ myvar }}"
If you wish to use an Ansible list/tuple variable inside Bash code, you can make it a Bash variable first. For example, if you have a list of items in mylist, you can expand it, assign it to a Bash array, and then iterate over it. The shell code of the call to shell would be:
mylist_for_bash=({{ mylist | join(" ") }})
for myitem in "${mylist_for_bash[@]}"; do
  echo "my current item: ${myitem}"
done
Another approach would be to pass it as a string env variable and convert it into an array later in the code.
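A sketch of that second approach, assuming the same mylist variable (the MYLIST_STR name is made up):
- name: pass a list as one string and split it back into a bash array
  shell: |
    read -r -a items <<< "${MYLIST_STR}"
    for item in "${items[@]}"; do
      echo "my current item: ${item}"
    done
  args:
    executable: /bin/bash
  environment:
    MYLIST_STR: "{{ mylist | join(' ') }}"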
NOTE: of course, all of this works correctly only with values that contain no spaces; I've never had to pass an array whose items contain spaces.
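Applied to the original question, a hedged sketch (the command, its options and the file path are taken verbatim from the question):
- name: run the command once per hostname listed in /filename.txt
  shell: |
    while read -r host; do
      /some/command options=1 user=usera server="${host}"
    done < /filename.txt
  args:
    executable: /bin/bash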
I'm using the CI Lint tester to try to figure out how to store an expected JSON result, which I later compare to a curl response. Neither of these works:
Attempt 1
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
Fails with:
did not find expected key while parsing a block mapping at line 4 column 5
Attempt 2
---
image: ruby:2.1
script:
  - EXPECT_SERVER_OUTPUT="{\"message\": \"Hello World\"}"
Fails with:
jobs:script config should be a hash
I've tried using various combinations of echo as well, without a working solution.
You could use literal block scalar[1] style notation and put the variable definition and subsequent script lines on separate lines[2] without worrying about quoting:
myjob:
  script:
    - |
      EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
or you can escape the nested double quotes:
myjob:
  script:
    - "EXPECT_SERVER_OUTPUT='{\"message\": \"Hello World\"}'"
but you may also want to just use variables like:
myjob:
  variables:
    EXPECT_SERVER_OUTPUT: '{"message": "Hello World"}'
  script:
    - dothething.sh
Note: variables are by default expanded inside variable definitions so take care with any $ characters inside the variable value (they must be written as $$ to be literal). This feature can also be turned off.
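For example, a sketch of the $$ escaping (the variable name and value are made up):
myjob:
  variables:
    PRICE_NOTE: 'costs $$5 per run'   # the job sees the literal string "costs $5 per run"
  script:
    - echo "$PRICE_NOTE"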
[1] See this answer for an explanation of this and related notation
[2] See this section of the GitLab docs for more info on multi-line commands
I made it work like this:
script: |
  EXPECT_SERVER_OUTPUT='{"message": "Hello World"}'
  echo $EXPECT_SERVER_OUTPUT