Semaphore CI using environment conditions in "when" clauses

I'm trying to run a Semaphore CI job only when it's being run by the automated scheduler.
For this, the docs provide the environment variable SEMAPHORE_WORKFLOW_TRIGGERED_BY_SCHEDULE.
I'm trying to use it in the when clause, e.g.
dependencies: [System Tests]
run:
  when: "SEMAPHORE_WORKFLOW_TRIGGERED_BY_SCHEDULE = true"
This does not work; I get the error:
Unprocessable YAML file.
Error: "Invalid 'when' condition on path '#/blocks/5/run/when':
Syntax error on line 1. - Invalid expression on the left of '=' operator."
Referring to the env variable with a $ doesn't work either; apparently it's an invalid character in the when clause.
How can I only run the CI job when it's scheduled?

when doesn't support environment variables. It only supports the notation specified here: https://docs.semaphoreci.com/reference/conditions-reference/.
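For reference, a condition built only from that grammar's keywords looks like this (an illustrative example, not a fix for this case):
when: "branch = 'master' OR tag =~ '^v1\.'"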
Let me get your use case straight. Do you want to run specific blocks based on whether the pipeline is executed from the scheduler or not? If that's the case, when will not help you, unfortunately.
However, I think I can provide a workaround using an IF statement. You could put all of the job's commands inside an IF statement that checks SEMAPHORE_WORKFLOW_TRIGGERED_BY_SCHEDULE. That way, if the variable matches, you run the commands; if not, you skip them all and the job ends successfully right there, effectively "skipping" the job with almost no execution time. Something like this (please check that the syntax is correct):
if [ "$SEMAPHORE_WORKFLOW_TRIGGERED_BY_SCHEDULE" = "true" ]; then run commands; else echo "skip commands"; fi
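Spelled out over multiple lines, the same guard could sit at the top of the job's commands so that non-scheduled runs exit early and successfully (a minimal sketch; replace the trailing comment with your real commands):
if [ "$SEMAPHORE_WORKFLOW_TRIGGERED_BY_SCHEDULE" != "true" ]; then
  echo "Not a scheduled run; skipping."
  exit 0  # end the job successfully without doing any work
fi
# ... actual job commands go here ...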

Related

What's "auto" value in "AC_ARG_ENABLE"?

I've been checking which files of some open-source components are used during compilation.
But I don't know much about autotools, autoconf, etc., so I want to know what the "auto" value means in AC_ARG_ENABLE(). Here is an example.
AC_ARG_ENABLE(launchd, AS_HELP_STRING(--description--),enable_launchd=$enableval,enable_launchd=auto)
If "--enable-launchd" option is given, run the command "enable_launchd=$enableval", right??
But if not, run the command "enable_lauchd=auto".
What is a value of the "auto"?
If "--enable-launchd" option is given, run the command
"enable_launchd=$enableval", right??
Right.
But if not, run the command
"enable_lauchd=auto".
Yes.
What is a value of the "auto"?
"auto" has no significance in that command other than as itself. It is the value assigned to variable enable_launchd.
You may well find that configure.ac later checks for that value and performs additional processing if it is seen, or you might find that it just emits it as-is into an external file.
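For illustration, configure.ac files often resolve "auto" later, once the host platform is known; a typical (hypothetical) pattern looks like this:
AC_ARG_ENABLE([launchd],
              [AS_HELP_STRING([--enable-launchd], [enable launchd support])],
              [enable_launchd=$enableval], [enable_launchd=auto])
# Resolve "auto" from the host platform when the user expressed no preference
# (assumes AC_CANONICAL_HOST was invoked earlier so $host_os is set).
if test "$enable_launchd" = auto; then
  case $host_os in
    darwin*) enable_launchd=yes ;;
    *)       enable_launchd=no ;;
  esac
fi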

How to prevent syntax errors when reading BASH associative array values which contain slashes from a child process?

I'm using bash 4.4.19(1)-release.
At the start of my program I read customer configuration values from the command line, configuration file(s), and the environment (in decreasing order of precedence). I validate these configuration values against internal definitions, failing out if required values are missing or if the customer values don't match against accepted regular expressions. This approach is a hard requirement and I'm stuck using BASH for this.
The whole configuration process involves the parsing of several YAML files and takes about a second to complete. I'd like to only have to do this once in order to preserve performance. Upon completion, all of the configured values are placed in a global associative array declared as follows:
declare -gA CONFIG_VALUES
A basic helper function has been written for accessing this array:
# A wrapper for accessing the CONFIG_VALUES array.
function get_config_value {
    local key="${1^^}"
    local output
    output="${CONFIG_VALUES[${key}]}"
    echo "$output"
}
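For illustration, callers elsewhere in the script read values like this (key name hypothetical):
log_dir="$(get_config_value log_dir)"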
This works perfectly fine when all of the commands are run within the same shell. This even works when the get_config_value function is called from a child process. Where this breaks down is when it's called from a child process and the value in the array contains slashes. This leads to errors such as the following (line 156 is "output="${CONFIG_VALUES[${key}]}"):
config.sh: line 156: path/to/some/file: syntax error: operand expected (error token is "/to/some/file")
This is particularly obnoxious because it seems to be reading the value "path/to/some/file" just fine. It simply decides to announce a syntax error after doing so and falls over dead instead of echoing the value.
I've been trying to circumvent this by running the array lookup in a subshell, capturing the syntax failure, and grepping it for the value I need:
# A wrapper for accessing the CONFIG_VALUES array.
function get_config_value {
    local key="${1^^}"
    local output
    if output="$(echo "${CONFIG_VALUES[${key}]}" 2>&1)"; then
        echo "$output"
    else
        grep -oP "(?<=: ).*(?=: syntax error: operand expected)" <<< "$output"
    fi
}
Unfortunately, it seems that BASH won't let me ignore the "syntax error" like that. I'm not sure where to go from here (well... Python, but I don't get to make that decision).
Any ideas?

Passing calculation commands to cluster job

TL;DR
Trying to pass a computation of the form $(($LSB_JOBINDEX-1)) to a cluster call, but getting an error
$((2-1)): syntax error: operand expected (error token is "$((2-1))")
How do I escape this correctly, or what alternative command should I use so that this works?
Detailed:
To automate my workflow, I am currently writing a script that automatically issues bsub commands in a predefined order.
Some of these commands are array jobs that are supposed to work on one file each.
If done without the cluster calls, it would look something like this:
samplearray=(sample0.fasta sample1.fasta) # array of input files
for s in "${samplearray[@]}"; do
    echo "$s" # some command on $s
done
For the cluster call I want to use an array job; the command looks like this:
bsub -J test[1-2] 'samplearray=(sample0.fastq sample1.fastq)' echo '${samplearray[$(($LSB_JOBINDEX-1))]}'
which launches two jobs with LSB_JOBINDEX set to 1 or 2, respectively, which is why I need to subtract 1 for correct indexing of the array.
The problem now is in the $((...)) part, because what is being executed on the node is ${samplearray[$\(\($LSB_JOBINDEX-1\)\)]} which does not trigger the computation but instead throws an error:
$((2-1)): syntax error: operand expected (error token is "$((2-1))")
What am I doing wrong here? I have tried other ways of escaping and quoting, but this was the closest I got to a correct solution.
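One workaround to sketch (assuming LSF's bsub, which accepts a command line after its options): hand the whole command to bash -c inside single quotes, so the arithmetic expansion is performed by the shell on the execution host instead of being mangled at submission time:
bsub -J 'test[1-2]' /bin/bash -c 'samplearray=(sample0.fastq sample1.fastq); echo "${samplearray[$((LSB_JOBINDEX-1))]}"'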

Rundeck sharing variables across job steps

I want to share a variable across rundeck job steps.
Initialized a job option "target_files"
Set the variable on STEP 1.
RD_OPTION_TARGET_FILES="$(some bash command)"
echo $RD_OPTION_TARGET_FILES
The value is printed here.
Read the variable from STEP 2.
echo $RD_OPTION_TARGET_FILES
STEP 2 doesn't recognize the variable set in STEP 1.
What's a good way of doing this on rundeck other than using environment variables?
The detailed procedure for Rundeck 2.9+:
1) Set the values (three methods):
1.a) use a "global variable" workflow step type
e.g. fill in: Group:="export", Name:="varOne", Value:="hello"
1.b) add to the workflow a "global log filter" (the Data Capture Plugin cited by 'Amos' here) which takes a regular expression that is evaluated on job step log outputs. For instance with a job step command like:
echo "CaptureThis:varTwo=world"
and a global log filter pattern like:
"CaptureThis:(.*?)=(.*)"
('Name Data' field not needed unless you supply a single capturing group in the pattern)
1.c) use a workflow Data Step to define multiple variables explicitly. Example contents:
varThree=foo
varFour=bar
2) Get the values back:
you must use the syntax ${ctx.name} in command strings and args, and #ctx.name# within INLINE scripts. In our example, with a job step command or inline script line like:
echo "values : #export.varOne#, #data.varTwo#, #stub.varThree#, #stub.varFour#"
you'll echo the four values.
The context is implicitly 'data' for method 1.b and 'stub' for method 1.c.
Note that a data step is quite limiting! It only lets you use the #stub.name# notation within inline scripts. Value substitution is not performed in remote files, and notations like ${stub.name} are not available in job step command strings or arguments.
After Rundeck 2.9, there is a Data Capture Plugin that allows passing data between job steps.
The plugin is bundled with the Rundeck application by default.
Data capture plugin to match a regular expression in a step’s log output and pass the values to later steps
For details, see Data Capture/Data Passing between steps (published 03 Aug 2017).
There are practically no ways to do this in job inline scripts other than (1) exporting the value to the environment or (2) writing the value to a third file in step 1 and reading it from there in step 2.
If you are using the "Scriptfile or URL" method, you may be able to execute the step 2 script within script 1 as a workaround, like:
Script1
#!/bin/bash
. ./script2
In the above case script2 will execute in the same session as script1, so the variables and values are preserved.
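For instance, if script2 just defines the shared value (contents hypothetical):
target_files="/data/batch1/output.txt"
then after the . ./script2 line above, script1 can use it directly:
echo "$target_files" # prints the value defined in script2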
EDIT
Earlier there were no such options, but plugins have since become available, so check Amos's answer.

Using Shell to Check Whether a File Exists, and only if it does, Execute a Set of Commands

I have a few lines of code in Stata. I'd like the lines to be executed only if the .txt file to which the lines refer exists a priori. I am wondering whether there is a shell command I can embed in an if statement to do this.
For example might something like the following exist and be possible:
insheet using "file.txt" if ('file.txt')
My intent is to say insheet the file file.txt only if it exists. My concern is that the program would otherwise stop, fail, die, or whatever you call it due to a syntax error if I have that insheet statement but the file does not exist.
The immediate answer is no. There is nothing like that syntax, for several reasons.
The if qualifier tests whether some condition is true separately for each observation and whether a file exists is not an appropriate condition for testing observation by observation.
The quite different if command tests once and once only whether something is true and might seem more appropriate. In practice it is not used for this purpose, but to learn more, see help ifcmd.
Stata has no special syntax based on paired identical single quotes ' '.
However, Stata provides a separate construct here:
confirm file file.txt
In practice that is going to stop a do-file or program whenever the file does not exist. A general scheme to catch the error is something like
capture confirm file file.txt
if _rc == 0 insheet using file.txt
else {
    <code if the file does not exist>
}
capture is to be thought of as eating the return code from the confirm command. In general the return code _rc from any command is 0 if the command was valid and executed and some non-zero value otherwise. Sometimes one tests for a specific non-zero code. Experiment shows that file not found is return code 601. The main reason for looking up error codes (in [P] error) is to deliver official-looking error messages, but in practice knowing the zero/non-zero rule is the main detail under this heading.
The example here uses == to test for equality.
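A variant that branches on that specific file-not-found code instead (a minimal sketch):
capture confirm file file.txt
if _rc == 601 {
    display as error "file.txt not found"
}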
Note that insheet using file.txt is not strictly a syntax error if the file does not exist. As far as Stata's language is concerned, that is legal syntax. However, that is a fine distinction: it is an error in every ordinary sense.
(LATER) It would be possible to short-circuit the entire process
capture insheet using file.txt
if _rc != 0 {
    <code if the file does not exist>
}
as in this case the non-existence of the file is the presumed explanation for any failure of the insheet command. If, however, the insheet call were more complicated, with a varlist and/or options, then failure of the command could arise for other reasons. So in general separating out a check for the existence of the file seems a better strategy.
The confirm command has what you're looking for.
capture confirm file "file.txt"
if !_rc { // if the file exists, confirm will return error code 0
    insheet using "file.txt"
}
Alternatively, you could put a capture before the insheet command, which will catch the syntax error. Check the [P] manual for more on capture and confirm.
