Question
I want to pass arguments to a jshell script. For instance, I would like to run something like this:
jshell myscript.jsh "some text"
and then to have the string "some text" available in some variable inside the script.
However, jshell expects only a list of files, so the response is:
File 'some text' for 'jshell' is not found.
Is there any way to properly pass arguments to a jshell script?
Workaround so far
My only solution so far is to use an environment variable when calling the script:
ARG="some test" jshell myscript.jsh
And then I can access it in the script with:
System.getenv().get("ARG")
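A minimal end-to-end sketch of this workaround (the default value here is just an assumption for illustration; System.getenv() returns a Map, so getOrDefault supplies a fallback):
// myscript.jsh
String arg = System.getenv().getOrDefault("ARG", "no argument given");
System.out.println(arg);
/exit
Invoked as above, ARG="some text" jshell myscript.jsh prints some text.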
And what about the -R option?
> jshell -v -R-Da=b ./file.jsh
For the script
{
String value = System.getProperty("a");
System.out.println("a="+value);
}
/exit
this will give you:
> jshell -v -R-Da=b ./file.jsh
a=b
Another way would be the following:
{
class A {
public void main(String args[])
{
for(String arg : args) {
System.out.println(arg);
}
}
}
new A().main(System.getProperty("args").split(" "));
}
and the execution:
> jshell -R-Dargs="aaa bbb ccc" ./file_2.jsh
Update
The previous solution will fail with more complex arguments, e.g. 'This is my arg'.
But we can benefit from Ant and its Commandline class:
import org.apache.tools.ant.types.Commandline;
{
class A {
public void main(String args[])
{
for(String arg : args) {
System.out.println(arg);
}
}
}
new A().main(Commandline.translateCommandline(System.getProperty("args")));
}
and then, we can call it like this:
jshell --class-path ./ant.jar -R-Dargs="aaa 'Some args with spaces' bbb ccc" ./file_2.jsh
aaa
Some args with spaces
bbb
ccc
Of course, ant.jar must be in the path that is passed via --class-path
Oracle really screwed this up; there is no good way to do this. In addition to @mko's answer, if you use Linux (it will probably work on macOS too), you can use process substitution.
jshell <(echo 'String arg="some text"') myscript.jsh
And then you can just use arg in myscript.jsh for example:
System.out.println(arg) // will print "some text"
You can simplify this with a bash function (see the sketch below), and on Windows you could probably write a batch file that writes to a temp file and does the same.
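For example, a minimal sketch of such a bash function (the name jshell_arg is hypothetical, and the naive quoting breaks if the argument itself contains double quotes):
jshell_arg() {
  local arg="$1"; shift
  # inject the argument as a String variable, then load the actual script
  jshell <(printf 'String arg = "%s";\n' "$arg") "$@"
}
jshell_arg "some text" myscript.jsh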
It's completely beyond me how Oracle could ignore this. 8-() But anyway: if your system uses bash as its shell, you can combine the shebang-replacement approach with the idea of (ab)using system properties to transport the whole command line into a variable:
//usr/bin/env jshell --execution local "-J-Da=$*" "$0"; exit $?
String commandline = System.getProperty("a");
System.out.println(commandline);
/exit
This way, you can call the script on the command line simply by adding the arguments: thisscript.jsh arg1 arg2 would print arg1 arg2.
Please note that this joins all parameters into one String, separated by single spaces. You can split it again with commandline.split("\\s"), but please be aware that this isn't exact: there is no difference between two parameters a b and one parameter "a b".
If you have a fixed number of arguments, you can also pass each of them into a separate system property with "-J-Darg1=$1" "-J-Darg2=$2" "-J-Darg3=$3" etc., as sketched below. Please observe that you have to use -R-D... if you are not using --execution local.
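For instance, a minimal sketch for exactly two arguments (the property names arg1 and arg2 are my own choice, not anything jshell mandates):
//usr/bin/env jshell --execution local "-J-Darg1=$1" "-J-Darg2=$2" "$0"; exit $?
String arg1 = System.getProperty("arg1");
String arg2 = System.getProperty("arg2");
System.out.println(arg1 + " / " + arg2);
/exit
Calling thisscript.jsh foo bar would then print foo / bar.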
Another variant is generating the script on the fly with bash's process substitution. Such a script can also be called simply as thisscript.jsh arg1 arg2 on Unix-like systems that have bash.
#!/usr/bin/env bash
jshell <(
cat <<EOF
System.out.println("$1");
System.out.println("$2");
/exit
EOF
)
This allows access to individual parameters, though it will break when a parameter contains double quotes or other special characters. Expanding on that idea, here's a way to put all parameters into a Java String array, quoting some of those characters:
#!/usr/bin/env bash
set -- "${#//\\/\\\\}"
set -- "${#//\"/\\\"}"
set -- "${#/#/\"}"
set -- "${#/%/\",}"
jshell <(
cat <<EOF
String[] args = new String[]{$@};
System.out.println(Arrays.asList(args));
/exit
EOF
)
The set -- statements double backslashes, escape double quotes, and prefix a " and append a ", to each argument, transforming the argument list into a valid Java array initializer.
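For example, a hypothetical run (the trailing comma that the transformation leaves in the generated initializer is valid Java):
$ ./thisscript.jsh 'one two' 'say "hi"'
[one two, say "hi"]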
Recently, I was inspired enough by the answers from Oleg and Hans-Peter Störr to try to combine them so that (a) I could use normal shell arguments and (b) write regular Java code expecting a String[] args input:
//usr/bin/env jshell <(ARGS=; for var in "$@"; do ARGS+="\"$var\","; done; echo "String[] args = {$ARGS}") "$0"; exit $?
System.out.println("RESULT: " + Arrays.asList(args));
/exit
This uses Hans' header line and the inlining demonstrated by Oleg, which builds the "$@" arguments into a String[] args variable.
With that you can chmod +x your script and execute it with regular old arguments:
]$ ./script.jsh foo bar
RESULT: [foo, bar]
Related
I have a Nextflow script that creates a variable from a text file, and I need to pass the value of that variable to a command-line call (a Bioconda package). Both steps happen inside the script block. I have tried to reference the variable using the $ symbol without any results; I think that is because, in the script block of a Nextflow process, that symbol is reserved for variables defined in the input block.
To make myself clearer, here is a code sample of what I'm trying to achieve:
params.gz_file = '/path/to/file.gz'
params.fa_file = '/path/to/file.fa'
params.output_dir = '/path/to/outdir'
input_file = file(params.gz_file)
fasta_file = file(params.fa_file)
process foo {
//publishDir "${params.output_dir}", mode: 'copy',
input:
path file from input_file
path fasta from fasta_file
output:
file ("*.html")
script:
"""
echo 123 > number.txt
parameter=`cat number.txt`
create_report $file $fasta --flanking $parameter
"""
}
By doing this, the error I receive is:
Error executing process > 'foo'
Caused by:
Unknown variable 'parameter' -- Make sure it is not misspelt and defined somewhere in the script before using it
Is there any way to call the variable parameter inside the script without Nextflow interpreting it as an input file? Thanks in advance!
The documentation re the script block is useful here:
Since Nextflow uses the same Bash syntax for variable substitutions in
strings, you need to manage them carefully depending on if you want to
evaluate a variable in the Nextflow context - or - in the Bash
environment execution.
One solution is to escape your shell (Bash) variables by prefixing them with a back-slash (\) character, like in the following example:
process foo {
script:
"""
echo 123 > number.txt
parameter="\$(cat number.txt)"
echo "\${parameter}"
"""
}
Another solution is to instead use a shell block, where dollar ($) variables are managed by your shell (Bash interpreter), while exclamation mark (!) variables are handled by Nextflow. For example:
process bar {
echo true
input:
val greeting from 'Hello', 'Hola', 'Bonjour'
shell:
'''
echo 123 > number.txt
parameter="$(cat number.txt)"
echo "!{greeting} parameter ${parameter}"
'''
}
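For this example, the output should be along these lines (one line per input value; the ordering may vary):
Hello parameter 123
Hola parameter 123
Bonjour parameter 123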
Declare "parameter" in the top-level params section:
params.parameter="1234"
(..)
script:
"""
(...)
create_report $file $fasta --flanking ${params.parameter}
(...)
"""
(...)
and call "nextflow run" with "--parameter 87678"
I'm trying to build a Jenkins Pipeline for which a parameter is
optional:
parameters {
string(
name:'foo',
defaultValue:'',
description:'foo is foo'
)
}
My purpose is calling a shell script and providing foo as argument:
stages {
stage('something') {
sh "some-script.sh '${params.foo}'"
}
}
The shell script will do the Right Thing™ if the provided value is the empty
string.
Unfortunately I can't just get an empty string. If the user does not provide
a value for foo, Jenkins will set it to null, and I will get null
(as string) inside my command.
I found this related question but the only answer is not really helpful.
Any suggestion?
OP here: I realized a wrapper script can be helpful. I ironically called it junkins-cmd, and I call it like this:
stages {
stage('something') {
sh "junkins-cmd some-script.sh '${params.foo}'"
}
}
Code:
#!/bin/bash
helpme() {
cat <<EOF
Usage: $0 <command> [parameters to command]
This command is a wrapper for jenkins pipeline. It tries to overcome jenkins
idiotic behaviour when calling programs without polluting the remaining part
of the toolkit.
The given command is executed with the fixed version of the given
parameters. Current fixes:
- 'null' is replaced with ''
EOF
} >&2
trap helpme EXIT
command="${1:?Missing command}"; shift
trap - EXIT
typeset -a params
for p in "$@"; do
# Jenkins pipeline uses 'null' when the parameter is undefined.
[[ "$p" = 'null' ]] && p=''
params+=("$p")
done
exec $command "${params[@]}"
Beware: params+=("$p") seems not to be portable among shells; hence this ugly script runs under #!/bin/bash.
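You can sanity-check the wrapper outside Jenkins. A hypothetical run (the underscore fills bash's $0 slot for -c, and printf makes the converted empty argument visible):
$ junkins-cmd bash -c 'printf "[%s]\n" "$@"' _ null foo
[]
[foo]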
A recent vulnerability, CVE-2014-6271, in how Bash interprets environment variables was disclosed. The exploit relies on Bash parsing some environment variable declarations as function definitions, but then continuing to execute code following the definition:
$ x='() { echo i do nothing; }; echo vulnerable' bash -c ':'
vulnerable
But I don't get it. There's nothing I've been able to find in the Bash manual about interpreting environment variables as functions at all (except for inheriting functions, which is different). Indeed, a properly named function definition is just treated as a value:
$ x='y() { :; }' bash -c 'echo $x'
y() { :; }
But a corrupt one prints nothing:
$ x='() { :; }' bash -c 'echo $x'
$ # Nothing but newline
The corrupt function is unnamed, and so I can't just call it. Is this vulnerability a pure implementation bug, or is there an intended feature here, that I just can't see?
Update
Per Barmar's comment, I hypothesized that the function's name was the variable name:
$ n='() { echo wat; }' bash -c 'n'
wat
Which I could swear I tried before, but I guess I didn't try hard enough. It's repeatable now. Here's a little more testing:
$ env n='() { echo wat; }; echo vuln' bash -c 'n'
vuln
wat
$ env n='() { echo wat; }; echo $1' bash -c 'n 2' 3 -- 4
wat
…so apparently the args are not set at the time the exploit executes.
Anyway, the basic answer to my question is, yes, this is how Bash implements inherited functions.
This seems like an implementation bug.
Apparently, the way exported functions work in bash is that they use specially-formatted environment variables. If you export a function:
f() { ... }
it defines an environment variable like:
f='() { ... }'
What's probably happening is that when the new shell sees an environment variable whose value begins with (), it prepends the variable name and executes the resulting string. The bug is that this includes executing anything after the function definition as well.
The fix described is apparently to parse the result to see if it's a valid function definition. If not, it prints the warning about the invalid function definition attempt.
This article confirms my explanation of the cause of the bug. It also goes into a little more detail about how the fix resolves it: not only do they parse the values more carefully, but variables that are used to pass exported functions follow a special naming convention. This naming convention is different from that used for the environment variables created for CGI scripts, so an HTTP client should never be able to get its foot into this door.
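On a patched bash you can observe that naming convention directly. A sample session (the exact encoding varies between versions, so treat this as illustrative):
$ foo() { echo 'hello world'; }
$ export -f foo
$ env | grep -A1 BASH_FUNC
BASH_FUNC_foo%%=() {  echo 'hello world'
}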
The following:
x='() { echo I do nothing; }; echo vulnerable' bash -c 'typeset -f'
prints
vulnerable
x ()
{
echo I do nothing
}
declare -fx x
It seems that Bash, after parsing the x=..., recognized it as a function and exported it (hence the declare -fx x in the output), and then allowed execution of the command after the declaration:
echo vulnerable
x='() { x; }; echo vulnerable' bash -c 'typeset -f'
prints:
vulnerable
x ()
{
x
}
declare -fx x
and running x
x='() { x; }; echo Vulnerable' bash -c 'x'
prints
Vulnerable
Segmentation fault: 11
it segfaults because of infinitely recursive calls.
It doesn't override an already defined function:
$ x() { echo Something; }
$ declare -fx x
$ x='() { x; }; echo Vulnerable' bash -c 'typeset -f'
prints:
x ()
{
echo Something
}
declare -fx x
i.e., x remains the previously (correctly) defined function.
For the Bash 4.3.25(1)-release the vulnerability is closed, so
x='() { echo I do nothing; }; echo Vulnerable' bash -c ':'
prints
bash: warning: x: ignoring function definition attempt
bash: error importing function definition for `x'
but, strangely (at least to me),
x='() { x; };' bash -c 'typeset -f'
STILL PRINTS
x ()
{
x
}
declare -fx x
and the
x='() { x; };' bash -c 'x'
segmentation faults too, so it STILL accepts the strange function definition...
I think it's worth looking at the Bash code itself. The patch gives a bit of insight as to the problem. In particular,
*** ../bash-4.3-patched/variables.c 2014-05-15 08:26:50.000000000 -0400
--- variables.c 2014-09-14 14:23:35.000000000 -0400
***************
*** 359,369 ****
strcpy (temp_string + char_index + 1, string);
! if (posixly_correct == 0 || legal_identifier (name))
! parse_and_execute (temp_string, name, SEVAL_NONINT|SEVAL_NOHIST);
!
! /* Ancient backwards compatibility. Old versions of bash exported
! functions like name()=() {...} */
! if (name[char_index - 1] == ')' && name[char_index - 2] == '(')
! name[char_index - 2] = '\0';
if (temp_var = find_function (name))
--- 364,372 ----
strcpy (temp_string + char_index + 1, string);
! /* Don't import function names that are invalid identifiers from the
! environment, though we still allow them to be defined as shell
! variables. */
! if (legal_identifier (name))
! parse_and_execute (temp_string, name, SEVAL_NONINT|SEVAL_NOHIST|SEVAL_FUNCDEF|SEVAL_ONECMD);
if (temp_var = find_function (name))
When Bash exports a function, it shows up as an environment variable, for example:
$ foo() { echo 'hello world'; }
$ export -f foo
$ cat /proc/self/environ | tr '\0' '\n' | grep -A1 foo
foo=() { echo 'hello world'
}
When a new Bash process finds a function defined this way in its environment, it evaluates the code in the variable using parse_and_execute(). For normal, non-malicious code, executing it simply defines the function in Bash and moves on. However, because it's passed to a generic execution function, Bash will also parse and execute any additional code placed in that variable after the function definition.
You can see that in the new code, a flag called SEVAL_ONECMD has been added that tells Bash to only evaluate the first command (that is, the function definition), and SEVAL_FUNCDEF to only allow function definitions.
In regard to your question about documentation: a study of the command-line documentation for the env command shows that env is working as documented.
As the usage line below shows, the syntax has several optional parts:
An optional hyphen as a synonym for -i (for backward compatibility I assume)
Zero or more NAME=VALUE pairs. These are the variable assignment(s) which could include function definitions.
Note that no semicolon (;) is required between or following the assignments.
The last argument(s) can be a single command followed by its argument(s). It will run with whatever permissions have been granted to the login being used. Security is controlled by restricting permissions on the login user and setting permissions on user-accessible executables such that users other than the executable's owner can only read and execute the program, not alter it.
[ spot@LX03:~ ] env --help
Usage: env [OPTION]... [-] [NAME=VALUE]... [COMMAND [ARG]...]
Set each NAME to VALUE in the environment and run COMMAND.
-i, --ignore-environment start with an empty environment
-u, --unset=NAME remove variable from the environment
--help display this help and exit
--version output version information and exit
A mere - implies -i. If no COMMAND, print the resulting environment.
Report env bugs to bug-coreutils@gnu.org
GNU coreutils home page: <http://www.gnu.org/software/coreutils/>
General help using GNU software: <http://www.gnu.org/gethelp/>
Report env translation bugs to <http://translationproject.org/team/>
I need to receive arguments I have no control over into a shell script, and preserve any single or double quotes. For instance, a script that simply outputs the given arguments should act as follows:
> my_script.sh "double" 'single' none
"double" 'single' none
I don't have the privilege of augmenting the arguments such as in:
> my_script.sh \"double\" \'single\' none
or
> my_script.sh '"double"' "'single'" none
And neither "$@" nor "$*" work.
I also thought of reading from STDIN and try something like:
> echo "double" 'single' none | my_script.sh
thinking it may help, but no breakthrough so far.
Any suggestions?
CSH / Perl solutions are welcome.
This is not possible (without escaping), because the shell processes the arguments and removes the quotes before your script is called. As a result, your script does not know about the quotes specified on the command line.
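To illustrate: a minimal sketch of my_script.sh that prints its arguments receives the words with the quotes already stripped:
#!/bin/bash
printf '%s\n' "$@"
$ ./my_script.sh "double" 'single' none
double
single
none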
You cannot recover the single/double quotes exactly as they were, because the shell 'eats' them. If you need to call some other script from your script, you can, for example, single-quote the arguments again. Here is a Perl solution I use:
sub args2shell
{
local (@argv) = @_;
local $" = '\' \'';
local (@margv);
@margv = map { s/'/'\\''/g; $_ } @argv;
return "\'@margv\'" if @margv;
return undef;
}
Example usage:
$args = args2shell @ARGV;
open F, "find -follow $args ! -type d |";
...
Is it possible to do the following:
I want to run the following:
mongodb bin/mongod
In my bash_profile I have
alias = "./path/to/mongodb/$1"
An alias will expand to the string it represents. Anything after the alias will appear after its expansion; it does not need to be, and cannot be, passed as explicit arguments (e.g. $1).
$ alias foo='/path/to/bar'
$ foo some args
will get expanded to
$ /path/to/bar some args
If you want to use explicit arguments, you'll need to use a function
$ foo () { /path/to/bar "$@" fixed args; }
$ foo abc 123
will be executed as if you had done
$ /path/to/bar abc 123 fixed args
To undefine an alias:
unalias foo
To undefine a function:
unset -f foo
To see the type and definition (for each defined alias, keyword, function, builtin or executable file):
type -a foo
Or type only (for the highest precedence occurrence):
type -t foo
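For example, a sample session (the exact output formatting may differ between bash versions):
$ foo() { /path/to/bar "$@"; }
$ alias foo='/path/to/bar'
$ type -a foo
foo is aliased to `/path/to/bar'
foo is a function
foo ()
{
    /path/to/bar "$@"
}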
To use parameters in aliases, I use this method:
alias myalias='function __myalias() { echo "Hello $*"; unset -f __myalias; }; __myalias'
It's a self-destructive function wrapped in an alias, so it's pretty much the best of both worlds, and it doesn't take up extra lines in your definitions (which I hate). Oh, and if you need the return value, you'll have to store it before calling unset, and then return it using the return keyword in that self-destructive function:
alias myalias='function __myalias() { echo "Hello $*"; myresult=$?; unset -f __myalias; return $myresult; }; __myalias'
So, if you need to have that variable in there, you could use:
alias mongodb='function __mongodb() { ./path/to/mongodb/$1; unset -f __mongodb; }; __mongodb'
of course...
alias mongodb='./path/to/mongodb/'
would actually do the same thing without the need for parameters. But as I said, if you wanted or needed them for some reason (for example, you needed $2 instead of $1), you would need to use a wrapper like that. If it is bigger than one line, you might consider just writing a function outright, since it would become more of an eyesore as it grew larger. Functions are great since you get all the perks that functions provide (see completion, traps, bind, etc. in the bash manpage for the goodies).
I hope that helps you out :)
Usually when I want to pass arguments to an alias in Bash, I use a combination of an alias and a function like this, for instance:
function __t2d {
if [ "$1x" != 'x' ]; then
date -d "#$1"
fi
}
alias t2d='__t2d'
This is a solution that avoids using a function: the alias ends with a here-string operator, so the argument given after the alias becomes the here-string's content:
alias addone='{ num=$(cat -); echo "input: $num"; echo "result:$(($num+1))"; }<<<'
Test result:
addone 200
input: 200
result:201
In csh (as opposed to bash) you can do exactly what you want.
alias print 'lpr \!^ -Pps5'
print memo.txt
The notation \!^ causes the argument to be inserted in the command at this point.
The ! character is preceded by a \ to prevent it being interpreted as a history command.
You can also pass multiple arguments:
alias print 'lpr \!* -Pps5'
print part1.ps glossary.ps figure.ps
(Examples taken from http://unixhelp.ed.ac.uk/shell/alias_csh2.1.html .)
To simplify leed25d's answer, use a combination of an alias and a function. For example:
function __GetIt {
cp ./path/to/stuff/$* .
}
alias GetIt='__GetIt'
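Usage would then look like this (hypothetical file name; note that because $* is unquoted, only the first of several arguments would get the path prefix):
$ GetIt notes.txt
# runs: cp ./path/to/stuff/notes.txt .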