Command Substitution and variables - shell

I'm running into an issue with command substitution that I'd like help with. I have a few processes that create a text file with commands that need to be executed.
File1.txt
Command_ID|Command_Name|Command
112121|Export XML Components|/Scripts/Export_XML.sh "Argument1" "Argument2"
112122|Test XML integrity|/Scripts/Test_XML.sh "Argument1" "Argument2" "Argument3"
My script to execute these commands reads File1.txt and tries to execute the command in the third column using the following command substitution. The goal here is to read and execute the commands sequentially and update a table with their return strings and return codes. I also have logic in the script to stop processing if a non-zero return code is encountered and store the current line number. This way the script can be restarted from the failed line after the issue has been addressed.
VAR_File=/files/File1.txt
while IFS=$'|' read -r -a myArray
do
echo "${myArray[2]}"
VAR_Command="${myArray[2]}"
VAR_Return_String=$("${VAR_Command}")
VAR_Return_Code=$?
done < ${VAR_File}
The commands where the arguments have double quotes are not being executed correctly.
What am I doing wrong and how can I fix this?
Thanks

In your script, VAR_Command is set to some string from File1.txt like /Scripts/Export_XML.sh "Argument1" "Argument2".
When running $("${VAR_Command}") with this string, the shell attempts to execute a single file whose name is that entire string, Export_XML.sh "Argument1" "Argument2" (with the quotes and spaces as part of the file name), rather than the script Export_XML.sh with the arguments "Argument1" and "Argument2" passed to it.
If you remove the quotes by replacing $("${VAR_Command}") with $(${VAR_Command}), the string is word-split into a command name and its arguments, and your code will work as expected.
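For reference, here is the loop with that single change applied (a sketch; note that the double quotes embedded in File1.txt remain literal characters in the arguments, so the called scripts receive "Argument1" rather than Argument1):
VAR_File=/files/File1.txt
while IFS=$'|' read -r -a myArray
do
    VAR_Command="${myArray[2]}"
    VAR_Return_String=$(${VAR_Command})   # unquoted: word-split into command and arguments
    VAR_Return_Code=$?                    # exit status of the command substitution
done < "${VAR_File}"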

Related

How do I write a shell script that takes a data file name as an argument to run my perl script?

I am learning Perl and shell scripting, and one of the challenges I was given is to write a shell script that takes a CSV file as an argument and then has that shell script run my Perl script (test.pl). I can't seem to get the proper syntax down, and I have to terminate it every time because it hangs up my terminal.
For example, my shell script is test.sh:
#/bin/bash
./test.pl $i`
On my terminal I type out:
test.sh testfile.csv
Ultimately I want the test file to be read by my perl script so it can run.
I think your error comes from the $i` part.
First, the trailing backquote is probably a typo and should raise a syntax error. Second, the i variable is not defined, so $i resolves to an empty string. As it is not quoted, the shell will drop it and call test.pl without any arguments... Thus your terminal is probably hanging because your Perl script is waiting for input.
As @fra suggested, you should use $1 instead of $i, thereby passing the first argument given to your bash script on to your Perl script.
Depending on your Perl script (shebang line, execute permission), you may or may not need to invoke the perl executable manually.
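A corrected test.sh would then be (a sketch, assuming test.pl is in the current directory, has a valid Perl shebang, and is executable):
#!/bin/bash
./test.pl "$1"
If test.pl is not executable, either run chmod +x test.pl once, or call it as perl ./test.pl "$1" instead.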

How to pass a file which may have a different name using Execute Shell command in Jenkins

I have a Jenkins job in which I want to read a file from a directory using the shell and pass that file in ant test step.
Say the file I want to read is /home/xxx/y.txt. The name of the file always changes but there will be only single file with .txt extension at any given point in that directory.
So, I am trying to pass that file in the "Execute Shell" build action as ant -Dfile=/home/xxx/*.txt but the build is "unable to read the file".
The shell won't expand -Dfile=/home/xxx/*.txt into -Dfile=/home/xxx/y.txt because -Dfile=/home/xxx/y.txt is not a file. However, the shell will expand /home/xxx/*.txt into /home/xxx/y.txt. You can get the result you want using command substitution:
ant -Dfile=`echo /home/xxx/*.txt`
To protect against whitespace in the file path, you can use double quotes around the backticks:
ant -Dfile="`echo /home/xxx/*.txt`"
General tip: If you are having trouble with a shell script running in a Jenkins job, try enabling command tracing and view the console output to help debug. Command tracing can be enabled in one of two ways (take your pick):
Pass -x as an option to the shebang at the beginning of the script. For example, replace #!/bin/sh with #!/bin/sh -x. All commands will be output on standard error before they are executed.
Place set -x somewhere in your script. Commands after this line will be traced.
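For example, with the shebang approach the build step becomes (a minimal sketch):
#!/bin/sh -x
ant -Dfile="`echo /home/xxx/*.txt`"
The Jenkins console output will then show the expanded ant command line before it runs.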
Consider:
set -- /home/xxx/*.txt
{ [ "$#" -eq 1 ] && [ -e "$1" ]; } || {
echo "ERROR: There should be exactly one file matching /home/xxx/*.txt" >&2
exit 1
}
ant -Dfile="$1"
This has several advantages:
You're actually detecting the unexpected cases instead of letting them pass unnoticed when (not if) an impossible thing happens.
Everything is happening in a single shell -- there's no subshell performance impact.
Your filenames aren't being mangled at all -- all the odd corner cases (i.e. names with literal backslashes, which echo is allowed by POSIX to mangle) are fully supported.
It's fully compliant with any POSIX shell.
There's also a caveat:
set -- /home/xxx/*.txt overwrites the positional parameters ("$1", "$2", ..., and with them "$#" and "$@") in the current context. If you need to refer to arguments as "$1", "$2", etc. in the outside script, you might put this code inside a function, as shown below.
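For example (a sketch; the function name run_ant_on_txt is only illustrative):
run_ant_on_txt() {
    set -- /home/xxx/*.txt    # replaces only the function's own positional parameters
    { [ "$#" -eq 1 ] && [ -e "$1" ]; } || {
        echo "ERROR: There should be exactly one file matching /home/xxx/*.txt" >&2
        exit 1
    }
    ant -Dfile="$1"
}
run_ant_on_txt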
In bash, you can also collect the glob match into an array -- note that there must be no backticks here, which would try to execute the matched file as a command:
file_name=(/home/xxx/*.txt)
ant -Dfile="${file_name[0]}"

Difference between typing a shell command, and save it to a file and using `cat myfile` to execute it?

I have an rsync command that works as expected when I type it directly into a terminal. The command includes several --include='blah' and --exclude='foo' type arguments. However, if I save that command to a one-line file called "myfile" and I try `cat myfile` (or, equivalently $(cat myfile)), the rsync command behaves differently.
I'm sure it is the exact same command in both cases.
Is this behavior expected/explainable?
I've found the answer to this question. The point is that the output of `cat myfile` is not re-parsed the way typed input is: the shell performs word splitting and glob expansion on it, but does not interpret quoting or escape characters (' and \ stay literal). So arguments like --include='blah' reach rsync with the quotes as literal characters, which is why the command behaves differently.
As a solution, I've just made "myfile" a shell script that I can execute rather than trying to use cat.
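For example (a sketch; the rsync options and paths are placeholders for whatever the original command contained):
#!/bin/bash
# myfile -- the rsync command exactly as you would type it interactively
rsync -av --include='blah' --exclude='foo' /source/dir/ /dest/dir/
Make it executable once with chmod +x myfile, then run it as ./myfile.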

How to start a CLI program and give input to it using a bash script?

I have a third-party CLI program I downloaded using Node's package manager. This program is started by typing the name of the program in a terminal. Once you start the program, it expects the user to enter strings of characters, which the program interprets as proper commands if they are strings it recognizes. I want to automate the process of reading lines from a file and passing these lines as strings of characters into the program.
Right now when I look up help on Google for how to automate a CLI program, all I get is how to write a bash script. This is not enough, as what I need is a bash script that opens up a program and then passes arguments to that program, NOT to the terminal itself. Basically I need my script (which will take the file to read lines from as its only argument) to do the following:
run my_program
while there are more lines to read from the file:
"Lookup"
$line
close my_program
where "Lookup" is a string of characters recognized as a command by my_program, and $line is meant to convey that I want to pass the line currently being read from the file as an argument to the program.
EDIT: I wrote the following script, but it's interpreting "while read line" as an argument to pass to my_program. How do I make it so that it only interprets the commands inside the while loop as arguments to my_program?
#!/bin/bash
while read line
do
    my_program
    "Lookup"
    "$line"
done < $1
#!/bin/bash
while read -r line
do
    my_program "$line"
done < "$1"
Then run this script with the input file as its parameter.
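If my_program actually reads its commands from standard input after it starts (as the question describes), start it once and pipe the generated commands into it instead (a sketch, assuming my_program accepts one "Lookup <string>" command per line on stdin):
#!/bin/bash
while read -r line
do
    printf 'Lookup %s\n' "$line"
done < "$1" | my_program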

While executing shell scripts, how to know which line number it's executing

While executing shell scripts, how can I know which line number is currently executing? Do I have to write a wrapper, where I execute a shell script from another shell script, to know which line number it's executing?
You can set the PS4 variable to cause set -x output to include the line number:
PS4=':${LINENO}+'
set -x
This will put the line number before each line as it executes:
:4+command here
:5+other command
It's important to have some sigil character (such as a + in my examples) after your variable expansions in PS4, because that last character is repeated to show nesting depth. That is, if you call a function, and that function invokes a command, the output from set -x will report it like so:
:3+++command run within a function called from a function
:8++command run within a function
:19+line after the function was called
If multiple files are involved in running your script, you might want to include the BASH_SOURCE variable as opposed to only LINENO (assuming this really is a bash script, as opposed to /bin/sh -- be sure your script starts with #!/bin/bash!):
PS4=':${BASH_SOURCE}:${LINENO}+'
set -x
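Putting it together (a minimal sketch):
#!/bin/bash
PS4=':${LINENO}+'
set -x
echo "first"
echo "second"
Running this prints a trace line such as :4+echo first before each command executes.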
Bash has a special variable $LINENO which does what you want.
#!/bin/bash
echo "$LINENO"
echo "$LINENO"
echo "$LINENO"
Demo:
$ ./lineno
2
3
4
#!/bin/sh -x
will report the lines as they're executed (via the -x option, to be clear). It won't give you the line number, but it reports the actual line.
An alternative, but more painful, approach is to use a trap handler, as documented here.
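For example, bash can report each line as it runs with a DEBUG trap, which fires before every command (a sketch):
#!/bin/bash
trap 'echo "about to run line $LINENO" >&2' DEBUG
echo "first"
echo "second"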
