Multiple parameters in Perl system function for Windows

I have a scenario where I have to execute a Java program, and for that I first have to set the class path, all invoked from a single Perl program. I am trying the command below, which doesn't work:
$command1="echo \" First command\"";
$command2="echo \" Second command\"";
system("$command1;$command2");
The above command works fine on Linux but not on Windows. Please help me get this to execute.

On most platforms,
system($shell_command);
means
system('sh', '-c', $shell_command);
On Windows, it means something closer to
system('cmd', '/x', '/c', $shell_command);
Option 1
Keep using a Bourne shell command, but explicitly specify that a Bourne shell is needed.
system('sh', '-c', 'echo 1 ; echo 2');
This isn't likely to work, since a Windows machine is unlikely to have a Bourne shell installed.
Option 2
Use the correct syntax for the local shell.
if ($^O eq 'MSWin32') {
system('echo 1 & echo 2');
} else {
system('echo 1 ; echo 2');
}
Option 3
Call system twice.
system('echo 1');
system('echo 2');

If you want to use ; between commands, you have to invoke a shell. This is an alternative to ;:
my @cmds = (
[ "echo", q{" First command"} ],
[ "echo", q{" Second command"} ],
);
system(@$_) for @cmds;

Is this what you are trying to do? Note that the two commands need a separator (& chains commands in cmd.exe):
$a = "echo wasssup people";
$b = "echo hi guys";
system("$a & $b");

Related

Jenkins pipeline undefined variable

I'm trying to build a Jenkins Pipeline for which a parameter is
optional:
parameters {
string(
name:'foo',
defaultValue:'',
description:'foo is foo'
)
}
My purpose is calling a shell script and providing foo as argument:
stages {
stage('something') {
sh "some-script.sh '${params.foo}'"
}
}
The shell script will do the Right Thing™ if the provided value is the empty
string.
Unfortunately I can't just get an empty string. If the user does not provide
a value for foo, Jenkins will set it to null, and I will get null
(as a string) inside my command.
I found this related question but the only answer is not really helpful.
Any suggestion?
OP here: I realized a wrapper script can be helpful. I ironically called it junkins-cmd, and I call it like this:
stages {
stage('something') {
sh "junkins-cmd some-script.sh '${params.foo}'"
}
}
Code:
#!/bin/bash
helpme() {
cat <<EOF
Usage: $0 <command> [parameters to command]
This command is a wrapper for jenkins pipeline. It tries to overcome jenkins
idiotic behaviour when calling programs without polluting the remaining part
of the toolkit.
The given command is executed with the fixed version of the given
parameters. Current fixes:
- 'null' is replaced with ''
EOF
} >&2
trap helpme EXIT
command="${1:?Missing command}"; shift
trap - EXIT
typeset -a params
for p in "$@"; do
# Jenkins pipeline uses 'null' when the parameter is undefined.
[[ "$p" = 'null' ]] && p=''
params+=("$p")
done
exec $command "${params[@]}"
Beware: params+=("$p") is not portable among shells; hence this ugly script runs under #!/bin/bash.
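As a quick sanity check of the wrapper outside Jenkins (some-script.sh here is a hypothetical stand-in that just prints its arguments):
$ cat some-script.sh
#!/bin/bash
printf 'arg: [%s]\n' "$@"
$ ./junkins-cmd ./some-script.sh null hello
arg: []
arg: [hello]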

Is the behavior behind the Shellshock vulnerability in Bash documented or at all intentional?

A recent vulnerability, CVE-2014-6271, in how Bash interprets environment variables was disclosed. The exploit relies on Bash parsing some environment variable declarations as function definitions, but then continuing to execute code following the definition:
$ x='() { echo i do nothing; }; echo vulnerable' bash -c ':'
vulnerable
But I don't get it. There's nothing I've been able to find in the Bash manual about interpreting environment variables as functions at all (except for inheriting functions, which is different). Indeed, a proper named function definition is just treated as a value:
$ x='y() { :; }' bash -c 'echo $x'
y() { :; }
But a corrupt one prints nothing:
$ x='() { :; }' bash -c 'echo $x'
$ # Nothing but newline
The corrupt function is unnamed, and so I can't just call it. Is this vulnerability a pure implementation bug, or is there an intended feature here, that I just can't see?
Update
Per Barmar's comment, I hypothesized the name of the function was the parameter name:
$ n='() { echo wat; }' bash -c 'n'
wat
Which I could swear I tried before, but I guess I didn't try hard enough. It's repeatable now. Here's a little more testing:
$ env n='() { echo wat; }; echo vuln' bash -c 'n'
vuln
wat
$ env n='() { echo wat; }; echo $1' bash -c 'n 2' 3 -- 4
wat
…so apparently the args are not set at the time the exploit executes.
Anyway, the basic answer to my question is, yes, this is how Bash implements inherited functions.
This seems like an implementation bug.
Apparently, the way exported functions work in bash is that they use specially-formatted environment variables. If you export a function:
f() { ... }
it defines an environment variable like:
f='() { ... }'
What's probably happening is that when the new shell sees an environment variable whose value begins with (), it prepends the variable name and executes the resulting string. The bug is that this includes executing anything after the function definition as well.
The fix described is apparently to parse the result to see if it's a valid function definition. If not, it prints the warning about the invalid function definition attempt.
This article confirms my explanation of the cause of the bug. It also goes into a little more detail about how the fix resolves it: not only do they parse the values more carefully, but variables that are used to pass exported functions follow a special naming convention. This naming convention is different from that used for the environment variables created for CGI scripts, so an HTTP client should never be able to get its foot into this door.
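On a patched Bash you can see that special naming convention in action. The exact encoding varies between patches and distributions, but the upstream one looks like this:
$ foo() { echo 'hello world'; }
$ export -f foo
$ env | grep -A1 BASH_FUNC
BASH_FUNC_foo%%=() {  echo 'hello world'
}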
The following:
x='() { echo I do nothing; }; echo vulnerable' bash -c 'typeset -f'
prints
vulnerable
x ()
{
echo I do nothing
}
declare -fx x
It seems that Bash, after having parsed x=..., recognized it as a function, exported it (note the declare -fx x), and allowed execution of the command after the declaration:
echo vulnerable
x='() { x; }; echo vulnerable' bash -c 'typeset -f'
prints:
vulnerable
x ()
{
x
}
and running x
x='() { x; }; echo Vulnerable' bash -c 'x'
prints
Vulnerable
Segmentation fault: 11
It segfaults: infinite recursive calls.
It doesn't override an already defined function:
$ x() { echo Something; }
$ declare -fx x
$ x='() { x; }; echo Vulnerable' bash -c 'typeset -f'
prints:
x ()
{
echo Something
}
declare -fx x
i.e., x remains the previously (correctly) defined function.
For the Bash 4.3.25(1)-release the vulnerability is closed, so
x='() { echo I do nothing; }; echo Vulnerable' bash -c ':'
prints
bash: warning: x: ignoring function definition attempt
bash: error importing function definition for `x'
but, strangely (at least for me),
x='() { x; };' bash -c 'typeset -f'
STILL PRINTS
x ()
{
x
}
declare -fx x
and the
x='() { x; };' bash -c 'x'
segfaults too, so it STILL accepts the strange function definition...
I think it's worth looking at the Bash code itself. The patch gives a bit of insight as to the problem. In particular,
*** ../bash-4.3-patched/variables.c 2014-05-15 08:26:50.000000000 -0400
--- variables.c 2014-09-14 14:23:35.000000000 -0400
***************
*** 359,369 ****
strcpy (temp_string + char_index + 1, string);
! if (posixly_correct == 0 || legal_identifier (name))
! parse_and_execute (temp_string, name, SEVAL_NONINT|SEVAL_NOHIST);
!
! /* Ancient backwards compatibility. Old versions of bash exported
! functions like name()=() {...} */
! if (name[char_index - 1] == ')' && name[char_index - 2] == '(')
! name[char_index - 2] = '\0';
if (temp_var = find_function (name))
--- 364,372 ----
strcpy (temp_string + char_index + 1, string);
! /* Don't import function names that are invalid identifiers from the
! environment, though we still allow them to be defined as shell
! variables. */
! if (legal_identifier (name))
! parse_and_execute (temp_string, name, SEVAL_NONINT|SEVAL_NOHIST|SEVAL_FUNCDEF|SEVAL_ONECMD);
if (temp_var = find_function (name))
When Bash exports a function, it shows up as an environment variable, for example:
$ foo() { echo 'hello world'; }
$ export -f foo
$ cat /proc/self/environ | tr '\0' '\n' | grep -A1 foo
foo=() { echo 'hello world'
}
When a new Bash process finds a function defined this way in its environment, it evaluates the code in the variable using parse_and_execute(). For normal, non-malicious code, executing it simply defines the function in Bash and moves on. However, because it's passed to a generic execution function, Bash will correctly parse and execute additional code defined in that variable after the function definition.
You can see that in the new code, a flag called SEVAL_ONECMD has been added that tells Bash to only evaluate the first command (that is, the function definition), and SEVAL_FUNCDEF to only allow function definitions.
Regarding your question about documentation: the command-line documentation for the env command, quoted below, shows that env is working as documented. The syntax has, optionally, these parts:
- An optional hyphen as a synonym for -i (for backward compatibility, I assume).
- Zero or more NAME=VALUE pairs. These are the variable assignments, which could include function definitions. Note that no semicolon (;) is required between or following the assignments.
- The last argument(s) can be a single command followed by its argument(s). It will run with whatever permissions have been granted to the login being used. Security is controlled by restricting permissions on the login user and setting permissions on user-accessible executables such that users other than an executable's owner can only read and execute the program, not alter it.
[ spot@LX03:~ ] env --help
Usage: env [OPTION]... [-] [NAME=VALUE]... [COMMAND [ARG]...]
Set each NAME to VALUE in the environment and run COMMAND.
-i, --ignore-environment start with an empty environment
-u, --unset=NAME remove variable from the environment
--help display this help and exit
--version output version information and exit
A mere - implies -i. If no COMMAND, print the resulting environment.
Report env bugs to bug-coreutils@gnu.org
GNU coreutils home page: <http://www.gnu.org/software/coreutils/>
General help using GNU software: <http://www.gnu.org/gethelp/>
Report env translation bugs to <http://translationproject.org/team/>

AIX - bash script

I'm trying to implement a small Bash script on AIX, but I'm having some problems. Below you can find an example. I have another question: if I want to add the script to crontab, I think I'll have problems calling IBM's serverStatus.sh. How can I avoid this problem?
#!/usr/bin/sh
WAS_HOME="/usr/IBM/WebSphere/AppServer/profiles/bpmnprd01/"
function StatusCheck()
{
$WAS_HOME/bin/serverStatus.sh BPM.AppTarget.bpmnprd01.0 -username admin -password admin
status=$(cat /usr/IBM/WebSphere/AppServer/profiles/bpmnprd01/logs/BPM.AppTarget.xxxxx/serverStatus.log | awk '{ if (NF > 0) { last = $NF } } END { print last }' "$@")
text="STOPPED"
if [[ $text == $status ]]
then
echo "OK"
else
echo "NOK"
fi
}
function start()
{
StatusCheck
}
start
-----------------------
When I try to execute the script above, I get the following error:
[root@bpmnprd01]/root/health_check# ./servers_check.sh
./servers_check.sh[7]: 0403-057 Syntax error at line 7 : `(' is not expected.
After this I searched on Google and found some examples without "()" on the subroutine, but then I got this:
[root@bpmnprd01]/root/health_check# ./servers_check.sh
./servers_check.sh[30]: 0403-057 Syntax error at line 33 : `StatusCheck' is not expected.
Thanks in Advance
Tiago
AIX has a true Bourne shell living in /bin/sh. I'm not sure about /usr/bin/sh, but I would expect that to be a Bourne shell as well.
Change your script's heading line (the shebang!) to
#!/usr/bin/bash
or to the result of which bash.
IHTH
You are using bash specific syntax but calling the script with sh, which has more limited capabilities. Since you want to use sh, you can use a tool like checkbashisms or shellcheck to help uncover non-portable syntax.
The immediate problem is that function foo() { ..; } is not a POSIX compliant function definition, and you should drop the keyword function and use just foo() { ..; }.
Your shell may also be lacking [[ ]] in which case you should use [ ] instead, with = instead of ==.
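Applied to the script from the question, a minimal POSIX-style sketch (paths and server names kept from the question) would look like this:
#!/usr/bin/sh
WAS_HOME="/usr/IBM/WebSphere/AppServer/profiles/bpmnprd01/"
# POSIX function definition: no "function" keyword
StatusCheck()
{
    "$WAS_HOME"/bin/serverStatus.sh BPM.AppTarget.bpmnprd01.0 -username admin -password admin
    status=$(awk 'NF > 0 { last = $NF } END { print last }' "$WAS_HOME"/logs/BPM.AppTarget.xxxxx/serverStatus.log)
    # [ ] and = instead of the bash-only [[ ]] and ==
    if [ "$status" = "STOPPED" ]; then
        echo "OK"
    else
        echo "NOK"
    fi
}
StatusCheck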

Increment a global variable in Bash

Here's a shell script:
globvar=0
function myfunc {
let globvar=globvar+1
echo "myfunc: $globvar"
}
myfunc
echo "something" | myfunc
echo "Global: $globvar"
When called, it prints out the following:
$ sh zzz.sh
myfunc: 1
myfunc: 2
Global: 1
$ bash zzz.sh
myfunc: 1
myfunc: 2
Global: 1
$ zsh zzz.sh
myfunc: 1
myfunc: 2
Global: 2
The question is: why this happens and what behavior is correct?
P.S. I have a strange feeling that the function behind the pipe is called in a forked shell... So, can there be a simple workaround?
P.P.S. This function is a simple test wrapper. It runs a test application and analyzes its output. Then it increments the $PASSED or $FAILED variables. Finally, you get the number of passed/failed tests in global variables. The usage is like:
test-util << EOF | myfunc
input for test #1
EOF
test-util << EOF | myfunc
input for test #2
EOF
echo "Passed: $PASSED, failed: $FAILED"
Korn shell gives the same results as zsh, by the way.
Please see BashFAQ/024. Pipes create subshells in Bash and variables are lost when subshells exit.
Based on your example, I would restructure it something like this:
globvar=0
function myfunc {
echo $(($1 + 1))
}
myfunc "$globvar"
globvar=$(echo "something" | myfunc "$globvar")
Piping something into myfunc in sh or bash causes a new shell to spawn. You can confirm this by adding a long sleep in myfunc. While it's sleeping call ps and you'll see a subprocess. When the function returns, that sub shell exits without changing the value in the parent process.
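You can also see the subshell without reaching for ps. Here is a minimal sketch (my own illustration) comparing $BASHPID, which, unlike $$, reports the PID of the current subshell:
#!/bin/bash
myfunc() { echo "myfunc runs in PID $BASHPID"; }
echo "parent shell is PID $BASHPID"
myfunc                    # same PID: runs in the current shell
echo something | myfunc   # different PID: runs in a pipeline subshell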
If you really need that value to be changed, you'll need to return a value from the function and check $PIPESTATUS after, I guess, like this:
globvar=0
function myfunc {
let globvar=globvar+1
echo "myfunc: $globvar"
return $globvar
}
myfunc
echo "something" | myfunc
globvar=${PIPESTATUS[1]}
echo "Global: $globvar"
The problem is 'which end of a pipeline using built-ins is executed by the original process?'
In zsh, it looks like the last command in the pipeline is executed by the main shell script when the command is a function or built-in.
In Bash (and sh is likely to be a link to Bash if you're on Linux), then either both commands are run in a sub-shell or the first command is run by the main process and the others are run by sub-shells.
Clearly, when the function is run in a sub-shell, it does not affect the variable in the parent shell (only the global in the sub-shell).
Consider adding an extra test:
echo Something | { myfunc; echo $globvar; }
echo $globvar
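If you can rely on Bash 4.2 or newer, there is also the lastpipe option, which runs the last element of a pipeline in the current shell; it only takes effect when job control is off, i.e. in scripts. A minimal sketch:
#!/bin/bash
shopt -s lastpipe   # Bash >= 4.2, scripts only (job control off)
globvar=0
myfunc() {
    let globvar=globvar+1
    echo "myfunc: $globvar"
}
echo "something" | myfunc
echo "Global: $globvar"   # now prints "Global: 1" instead of "Global: 0"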

Run a string as a command within a Bash script

I have a Bash script that builds a string to run as a command
Script:
#! /bin/bash
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
teamAComm="`pwd`/a.sh"
teamBComm="`pwd`/b.sh"
include="`pwd`/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
illcommando="$serverbin include='$include' server::team_l_start = '${teamAComm}' server::team_r_start = '${teamBComm}' CSVSaver::save='true' CSVSaver::filename = 'out.csv'"
echo "running: $illcommando"
# $illcommando > server-output.log 2> server-error.log
$illcommando
which does not seem to supply the arguments correctly to the $serverbin.
Script output:
running: /usr/local/bin/rcssserver include='/home/joao/robocup/runner_workdir/server_official.conf' server::team_l_start = '/home/joao/robocup/runner_workdir/a.sh' server::team_r_start = '/home/joao/robocup/runner_workdir/b.sh' CSVSaver::save='true' CSVSaver::filename = 'out.csv'
rcssserver-14.0.1
Copyright (C) 1995, 1996, 1997, 1998, 1999 Electrotechnical Laboratory.
2000 - 2009 RoboCup Soccer Simulator Maintenance Group.
Usage: /usr/local/bin/rcssserver [[-[-]]namespace::option=value]
[[-[-]][namespace::]help]
[[-[-]]include=file]
Options:
help
display generic help
include=file
parse the specified configuration file. Configuration files
have the same format as the command line options. The
configuration file specified will be parsed before all
subsequent options.
server::help
display detailed help for the "server" module
player::help
display detailed help for the "player" module
CSVSaver::help
display detailed help for the "CSVSaver" module
CSVSaver Options:
CSVSaver::save=<on|off|true|false|1|0|>
If save is on/true, then the saver will attempt to save the
results to the database. Otherwise it will do nothing.
current value: false
CSVSaver::filename='<STRING>'
The file to save the results to. If this file does not
exist it will be created. If the file does exist, the results
will be appended to the end.
current value: 'out.csv'
If I just paste the command /usr/local/bin/rcssserver include='/home/joao/robocup/runner_workdir/server_official.conf' server::team_l_start = '/home/joao/robocup/runner_workdir/a.sh' server::team_r_start = '/home/joao/robocup/runner_workdir/b.sh' CSVSaver::save='true' CSVSaver::filename = 'out.csv' (shown in the output after "running:") it works fine.
You can use eval to execute a string:
eval $illcommando
your_command_string="..."
output=$(eval "$your_command_string")
echo "$output"
I usually place commands in $(commandStr) command substitution; if that doesn't help, I find Bash debug mode great: run the script as bash -x script.
Don't put your commands in variables; just run them:
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
PWD=$(pwd)
teamAComm="$PWD/a.sh"
teamBComm="$PWD/b.sh"
include="$PWD/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
$serverbin include="$include" server::team_l_start="${teamAComm}" server::team_r_start="${teamBComm}" CSVSaver::save='true' CSVSaver::filename='out.csv'
./me casts raise_dead()
I was looking for something like this, but I also needed to reuse the same string minus two parameters so I ended up with something like:
my_exe ()
{
mysql -sN -e "select $1 from heat.stack where heat.stack.name=\"$2\";"
}
This is something I use to monitor openstack heat stack creation. In this case I expect two conditions, an action 'CREATE' and a status 'COMPLETE' on a stack named "Somestack"
To get those variables I can do something like:
ACTION=$(my_exe action Somestack)
STATUS=$(my_exe status Somestack)
if [[ "$ACTION" == "CREATE" ]] && [[ "$STATUS" == "COMPLETE" ]]
...
Here is my gradle build script that executes strings stored in heredocs:
current_directory=$( realpath "." )
GENERATED=${current_directory}/"GENERATED"
build_gradle=$( realpath build.gradle )
## create the directory because .gitignore ignores this folder:
mkdir -p "$GENERATED"
COPY_BUILD_FILE=$( cat <<COPY_BUILD_FILE_HEREDOC
cp
$build_gradle
$GENERATED/build.gradle
COPY_BUILD_FILE_HEREDOC
)
$COPY_BUILD_FILE
GRADLE_COMMAND=$( cat <<GRADLE_COMMAND_HEREDOC
gradle run
--build-file
$GENERATED/build.gradle
--gradle-user-home
$GENERATED
--no-daemon
GRADLE_COMMAND_HEREDOC
)
$GRADLE_COMMAND
The lone ")" are kind of ugly. But I have no clue how to fix that asthetic aspect.
To see all commands that are being executed by the script, add the -x flag to your shebang line, and execute the command normally:
#! /bin/bash -x
matchdir="/home/joao/robocup/runner_workdir/matches/testmatch/"
teamAComm="`pwd`/a.sh"
teamBComm="`pwd`/b.sh"
include="`pwd`/server_official.conf"
serverbin='/usr/local/bin/rcssserver'
cd $matchdir
$serverbin include="$include" server::team_l_start="${teamAComm}" server::team_r_start="${teamBComm}" CSVSaver::save='true' CSVSaver::filename='out.csv'
Then if you sometimes want to ignore the debug output, redirect stderr somewhere.
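For example (filenames here are hypothetical):
# the -x trace goes to stderr, so stash it in a file instead of the terminal
./script.sh 2> trace.log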
For me, echo XYZ_20200824.zip | grep -Eo '[[:digit:]]{4}[[:digit:]]{2}[[:digit:]]{2}'
was working fine, but I was unable to store the output of the command in a variable.
I had the same issue; I tried eval but didn't get output.
Here is the answer to my problem:
cmd=$(echo XYZ_20200824.zip | grep -Eo '[[:digit:]]{4}[[:digit:]]{2}[[:digit:]]{2}')
echo $cmd
My output is now 20200824.
