How to call a variable created in the script in Nextflow? - bash

I have a Nextflow script that creates a variable from a text file, and I need to pass the value of that variable to a command-line tool (a Bioconda package). Both steps happen inside the "script" block. I have tried to reference the variable using the '$' symbol without any result, I think because in the script block of a Nextflow process that symbol is reserved for variables defined in the input block.
To make myself clearer, here is a code sample of what I'm trying to achieve:
params.gz_file = '/path/to/file.gz'
params.fa_file = '/path/to/file.fa'
params.output_dir = '/path/to/outdir'
input_file = file(params.gz_file)
fasta_file = file(params.fa_file)
process foo {
//publishDir "${params.output_dir}", mode: 'copy',
input:
path file from input_file
path fasta from fasta_file
output:
file ("*.html")
script:
"""
echo 123 > number.txt
parameter=`cat number.txt`
create_report $file $fasta --flanking $parameter
"""
}
By doing this, the error I receive is:
Error executing process > 'foo'
Caused by:
Unknown variable 'parameter' -- Make sure it is not misspelt and defined somewhere in the script before using it
Is there any way to call the variable parameter inside the script without Nextflow interpreting it as an input file? Thanks in advance!

The documentation re the script block is useful here:
Since Nextflow uses the same Bash syntax for variable substitutions in
strings, you need to manage them carefully depending on if you want to
evaluate a variable in the Nextflow context - or - in the Bash
environment execution.
One solution is to escape your shell (Bash) variables by prefixing them with a back-slash (\) character, like in the following example:
process foo {
script:
"""
echo 123 > number.txt
parameter="\$(cat number.txt)"
echo "\${parameter}"
"""
}
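Applied to the process in the question, $file and $fasta stay unescaped so that Nextflow resolves them, while the Bash variable is escaped as \$parameter. The command script that Bash finally executes would then look roughly like this (a sketch; the staged file names are only illustrative):
# What Bash runs after Nextflow has substituted its own variables
# and stripped the backslash escapes (file names are illustrative)
echo 123 > number.txt
parameter="$(cat number.txt)"
create_report file.gz file.fa --flanking "$parameter"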
Another solution is to instead use a shell block, where dollar ($) variables are managed by your shell (Bash interpreter), while exclamation mark (!) variables are handled by Nextflow. For example:
process bar {
echo true
input:
val greeting from 'Hello', 'Hola', 'Bonjour'
shell:
'''
echo 123 > number.txt
parameter="$(cat number.txt)"
echo "!{greeting} parameter ${parameter}"
'''
}

Declare "parameter" in the params section at the top of your script:
params.parameter="1234"
(..)
script:
"""
(...)
create_report $file $fasta --flanking ${params.parameter}
(...)
"""
(...)
and call "nextflow run" with "--parameter 87678"
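For example, assuming the script above is saved as main.nf (the file name is only an example), the run could look like:
nextflow run main.nf --parameter 87678 --gz_file /path/to/file.gz --fa_file /path/to/file.fa --output_dir /path/to/outdir
A value passed on the command line with --parameter overrides the params.parameter default defined in the script.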

Related

How to do a bash `for` loop in terraform templatefile?

I'm trying to include a bash script in an AWS SSM Document, via the Terraform templatefile function. In the aws:runShellScript section of the SSM document, I have a Bash for loop with an @ sign that seems to be creating an error during terraform validate.
Version of terraform: 0.13.5
Inside main.tf file:
resource "aws_ssm_document" "magical_document" {
name = "magical_ssm_doc"
document_type = "Command"
document_format = "YAML"
target_type = "/AWS::EC2::Instance"
content = templatefile(
"${path.module}/ssm-doc.yml",
{
Foo: var.foo
}
)
}
Inside my ssm-doc.yaml file, I loop through an array:
for i in "$\{arr[@]\}"; do
if test -f "$i" ; then
echo "[monitor://$i]" >> $f
echo "disabled=0" >> $f
echo "index=$INDEX" >> $f
fi
done
Error:
Error: Error in function call
Call to function "templatefile" failed:
./ssm-doc.yml:1,18-19: Invalid character;
This character is not used within the language., and 1 other diagnostic(s).
I tried escaping the @ symbol, like \@, but it didn't help. How do I fix this?
Although the error points at the @ symbol as the cause, it's the ${ } that's causing the problem, because this is Terraform interpolation syntax, and it applies to template files too. As the docs say:
The template syntax is the same as for string templates in the main Terraform language, including interpolation sequences delimited with ${ ... }.
And the way to escape interpolation syntax in Terraform is with a double dollar sign.
for i in "$${arr[@]}"; do
if test -f "$i" ; then
echo "[monitor://$i]" >> $f
echo "disabled=0" >> $f
echo "index=$INDEX" >> $f
fi
done
The interpolation syntax is still useful with templatefile if you're trying to pass in an argument, such as Foo in the question. That argument can then be accessed within the YAML file as ${Foo}.
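A template fragment can therefore mix both forms; in this sketch (variable names other than Foo are only illustrative), ${Foo} is filled in by Terraform when templatefile renders the file, while the $$-escaped expression is emitted literally for Bash to expand at runtime:
# ${Foo} is replaced by Terraform; $${arr[@]} is written out as ${arr[@]} for Bash
echo "foo setting: ${Foo}" >> $f
for i in "$${arr[@]}"; do
echo "[monitor://$i]" >> $f
done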
By the way, although this article didn't give the answer to this exact issue, it helped me get a deeper appreciation for all the work Terraform is doing to handle different languages via the templatefile function. It had some cool tricks for doing replacements to escape for different scenarios.

execution of shell command from jenkinsfile

I am trying to execute a set of commands from a Jenkinsfile.
The problem is that when I try to assign the value of stdout to a variable, it does not work.
I tried different combinations of double quotes and single quotes, but so far no luck.
I executed the script with the latest version of the Jenkinsfile syntax as well as an older version. Putting the shell commands inside """ """ does not let me create a new variable, and gives an error saying the client_name command does not exist.
String nodeLabel = env.PrimaryNode ? env.PrimaryNode : "slave1"
echo "Running on node [${nodeLabel}]"
node("${nodeLabel}"){
sh "p4 print -q -o config.yml //c/test/gradle/hk/config.yml"
def config = readYaml file: 'devops-config.yml'
def out = sh (script:"client_name=${config.BasicVars.p4_client}; " +
'echo "client name: $client_name"' +
" cmd_output = p4 clients -e $client_name" +
' echo "out variable: $cmd_output"',returnStdout: true)
}
I want to assign the stdout from the command p4 clients -e $client_name to variable cmd_output.
But when I execute the code the error that is thrown is:
NoSuchPropertyException: client_name is not defined at line cmd_output = p4 clients -e $client_name
What am I missing here?
Your problem here is that all the $ are interpreted by Jenkins when the string is in double quotes. So the first two occurrences are no problem, since the first variable comes from Jenkins and the second one is in a single-quoted string.
The third variable, however, is in a double-quoted string, so Jenkins tries to replace it with its value, but it can't find it, since it is only created when the shell script is executed.
The solution is to escape the $ in $client_name (or define client_name in an environment block).
I rewrote the block:
String nodeLabel = env.PrimaryNode ? env.PrimaryNode : "slave1"
echo "Running on node [${nodeLabel}]"
node("${nodeLabel}"){
sh "p4 print -q -o config.yml //c/test/gradle/hk/config.yml"
def config = readYaml file: 'devops-config.yml'
def out = sh (script: """
client_name=${config.BasicVars.p4_client}
echo "client name: \$client_name"
cmd_output=\$(p4 clients -e \$client_name)
echo "out variable: \$cmd_output"
""", returnStdout: true)
}

In a Jenkinsfile, how to access a variable that is declared within a sh step?

Say I have a Jenkinsfile. Within that Jenkinsfile, is the following sh step:
sh "myScript.sh"
Within myScript.sh, the following variable is declared:
MY_VARIABLE="This is my variable"
How can I access MY_VARIABLE, which is declared in myScript.sh, from my Jenkinsfile?
To import the variable defined in your script into the current shell, you can use the source command (see explanation on SU):
# Either via command
source myScript.sh
# Or via built-in synonym
. myScript.sh
Supposing your script does not output anything else, you can then source it and echo the variable to fetch its value in Jenkins:
def myVar = sh(returnStdout: true, script: '. myScript.sh && echo $MY_VARIABLE')
If your script does produce output, you can fetch the last line either in the shell:
(. myScript.sh && echo $MY_VARIABLE) | tail -n1
or via Groovy:
def out = sh(returnStdout: true, script: '. myScript.sh && echo $MY_VARIABLE')
def myVar = out.tokenize('\n')[-1]
The bash variable declared in the .sh file goes out of scope once the pipeline step sh completes.
But you can make your .sh script generate a properties file, then use the pipeline step readProperties to read that file into an object for access.
// myScript.sh
...
echo MY_VARIABLE=This is my variable > vars.properties
// pipeline
sh 'myScript.sh'
def props = readProperties file: 'vars.properties'
echo props.MY_VARIABLE
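If you need more than one value, the script can simply append one key=value line per variable. A minimal sketch of such a myScript.sh (the variable names are only illustrative):
#!/usr/bin/env bash
# Do the real work, then persist values for the pipeline to read back
MY_VARIABLE="This is my variable"
OTHER_VARIABLE="something else"
{
echo "MY_VARIABLE=${MY_VARIABLE}"
echo "OTHER_VARIABLE=${OTHER_VARIABLE}"
} > vars.properties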

How to pass arguments to a jshell script?

Question
I want to pass arguments to a jshell script. For instance, I would have liked something like this:
jshell myscript.jsh "some text"
and then to have the string "some text" available in some variable inside the script.
However, jshell only expects a list of files, so the response is:
File 'some text' for 'jshell' is not found.
Is there any way to properly pass arguments to a jshell script?
Workaround so far
My only solution so far is to use an environment variable when calling the script:
ARG="some text" jshell myscript.jsh
And then I can access it in the script with:
System.getenv().get("ARG")
And what about the -R option?
> jshell -v -R-Da=b ./file.jsh
for the script
{
String value = System.getProperty("a");
System.out.println("a="+value);
}
/exit
will give you
> jshell -v -R-Da=b ./file.jsh
a=b
Another way would be the following:
{
class A {
public void main(String args[])
{
for(String arg : args) {
System.out.println(arg);
}
}
}
new A().main(System.getProperty("args").split(" "));
}
and execution
> jshell -R-Dargs="aaa bbb ccc" ./file_2.jsh
Update
The previous solution will fail with more complex args, e.g. 'This is my arg'.
But we can benefit from Ant and its Commandline class:
import org.apache.tools.ant.types.Commandline;
{
class A {
public void main(String args[])
{
for(String arg : args) {
System.out.println(arg);
}
}
}
new A().main(Commandline.translateCommandline(System.getProperty("args")));
}
and then, we can call it like this:
jshell --class-path ./ant.jar -R-Dargs="aaa 'Some args with spaces' bbb ccc" ./file_2.jsh
aaa
Some args with spaces
bbb
ccc
Of course, ant.jar must be in the path that is passed via --class-path
Oracle really screwed this up; there is no good way to do this. In addition to @mko's answer, if you use Linux (it will probably work on macOS too) you can use process substitution.
jshell <(echo 'String arg="some text"') myscript.jsh
And then you can just use arg in myscript.jsh for example:
System.out.println(arg) // will print "some text"
You can simplify it with a small bash function, and on Windows you could probably write a batch file that writes to a temp file and does the same.
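A minimal sketch of such a bash helper (the function name jsh and the single arg variable are only illustrative, and it still breaks if the argument itself contains double quotes):
# Usage: jsh myscript.jsh "some text"
# Builds the 'String arg=...;' snippet on the fly and feeds it to jshell
# via process substitution, ahead of the actual script.
jsh() {
local script="$1"; shift
jshell <(printf 'String arg="%s";\n' "$*") "$script"
}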
It's completely beyond me how Oracle could ignore this. 8-() But anyway: if your system uses bash as its shell, you can combine the shebang-replacement trick with the idea of (ab)using system properties to transport the whole command line into a variable:
//usr/bin/env jshell --execution local "-J-Da=$*" "$0"; exit $?
String commandline = System.getProperty("a");
System.out.println(commandline);
/exit
This way, you can call the script on the commandline simply adding the arguments: thisscript.jsh arg1 arg2 would print arg1 arg2.
Please note that this joins all parameters into one String, separated by one space. You can split it again with commandline.split("\\s"), but please be aware that this isn't exact: there is no difference between two parameters a b and one parameter "a b".
If you have a fixed number of arguments, you can also pass each of them into a separate system property with "-J-Darg1=$1" "-J-Darg2=$2" "-J-Darg3=$3" etc. Please observe that you have to use -R-D... if you are not using --execution local.
Another variant is generating the script on the fly with bash's process substitution. You can use such a script also simply as thisscript.jsh arg1 arg2 also on Unix-like systems having a bash.
#!/usr/bin/env bash
jshell <(
cat <<EOF
System.out.println("$1");
System.out.println("$2");
/exit
EOF
)
This allows you to access individual parameters, though it will break when there are double quotes or other special characters in a parameter. Expanding on that idea: here's a way to put all parameters into a Java String array, quoting some of those characters:
#!/usr/bin/env bash
set -- "${@//\\/\\\\}"
set -- "${@//\"/\\\"}"
set -- "${@/#/\"}"
set -- "${@/%/\",}"
jshell <(
cat <<EOF
String[] args = new String[]{$@};
System.out.println(Arrays.asList(args));
/exit
EOF
)
The set -- statements double any backslashes, escape double quotes, and prefix a " and append a ", to each argument, transforming the arguments into a valid Java array initializer.
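For example, here is what those four lines do to two sample arguments, one of which contains a space and double quotes:
set -- 'foo' 'say "hi"'
set -- "${@//\\/\\\\}" # double any backslashes
set -- "${@//\"/\\\"}" # escape double quotes
set -- "${@/#/\"}" # prepend an opening "
set -- "${@/%/\",}" # append a closing ",
echo "$@" # prints: "foo", "say \"hi\"",
The generated heredoc line then becomes String[] args = new String[]{"foo", "say \"hi\"",}; (the trailing comma is legal in a Java array initializer).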
Recently, I was inspired by answers from Oleg and Hans-Peter Störr enough to try to combine them so that a) I could use normal shell arguments b) write regular Java code expecting a String[] args input:
//usr/bin/env jshell <(ARGS=; for var in "$@"; do ARGS+="\"$var\","; done; echo "String[] args = {$ARGS}") "$0"; exit $?
System.out.println("RESULT: " + Arrays.asList(args));
/exit
It uses Hans' header line and then, as demonstrated by Oleg, inlines a loop that builds the "$@" arguments into a String[] args variable.
With that you can chmod +x your script and execute it with regular old arguments:
]$ ./script.jsh foo bar
RESULT: [foo, bar]

Create variable from string/nameonly parameter to extract data in bash?

I want to save the variable name and its contents easily from my script.
Currently :-
LOGFILE=/root/log.txt
TEST=/file/path
echo "TEST : ${TEST}" >> ${LOGFILE}
Desired :-
LOGFILE=/root/log.txt
function save()
{
echo "$1 : $1" >> ${LOGFILE}
}
TEST=/file/path
save TEST
Obviously the above save function just saves TEST : TEST
What I want it to save is TEST : /file/path
Can this be done? How? Many thanks in advance!
You want to use variable indirection. Also, don't use the function keyword; it is not POSIX and is not necessary as long as you have () at the end of your function name.
LOGFILE=/root/log.txt
save()
{
echo "$1 : ${!1}" >> ${LOGFILE}
}
TEST=/file/path
save TEST
Proof of Concept
$ TEST=foo; save(){ echo "$1 : ${!1}"; }; save TEST
TEST : foo
Yes, using indirect expansion:
echo "$1 : ${!1}"
Quoting from Bash reference manual:
The basic form of parameter expansion is ${parameter} [...] If the first character of parameter is an exclamation point (!), a level of variable indirection is introduced. Bash uses the value of the variable formed from the rest of parameter as the name of the variable; this variable is then expanded and that value is used in the rest of the substitution, rather than the value of parameter itself. This is known as indirect expansion
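If you want to log several variables at once, the same indirection works in a loop; a small sketch building on the save function above:
LOGFILE=/root/log.txt
# Log "NAME : value" for every variable name passed in
save() {
local name
for name in "$@"; do
echo "${name} : ${!name}" >> "${LOGFILE}"
done
}
TEST=/file/path
OTHER=/another/path
save TEST OTHER # writes "TEST : /file/path" and "OTHER : /another/path"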
Consider using the printenv command. It does exactly what it says on the tin: it prints your environment. Note that only exported variables appear there, so you may need to export the variable first. It can also take parameters:
$ printenv
SSH_AGENT_PID=2068
TERM=xterm
SHELL=/bin/bash
LANG=en_US.UTF-8
HISTCONTROL=ignoreboth
...etc
You could do printenv and then grep for any vars you know you have defined and be done in two lines, such as:
$ printenv | grep "VARNAME1\|VARNAME2"
VARNAME1=foo
VARNAME2=bar
