In one of my Bash scripts, there's a point where I have a variable SCRIPT which contains /path/to/an/exe, and what the script ultimately needs to do is execute that executable. Therefore the last line of the script is
$($SCRIPT)
so that $SCRIPT is expanded to /path/to/an/exe, and $(/path/to/an/exe) executes the executable.
However, running shellcheck on the script generates this error:
In setscreens.sh line 7:
$($SCRIPT)
^--------^ SC2091: Remove surrounding $() to avoid executing output.
For more information:
https://www.shellcheck.net/wiki/SC2091 -- Remove surrounding $() to avoid e...
Is there a way I can rewrite that $($SCRIPT) in a more appropriate way? eval does not seem to be of much help here.
$($SCRIPT) indeed does not do what you think it does.
The outer $() will execute any command inside the parentheses and then execute its output as a command.
The inner $SCRIPT will expand to the value of the SCRIPT variable and execute that string, splitting it into words on spaces.
If you want to execute the command contained in the SCRIPT variable, you can simply write, for example:
SCRIPT='/bin/ls'
"$SCRIPT" # Will execute /bin/ls
Now if you also need to handle arguments with your SCRIPT variable command call:
SCRIPT='/bin/ls'
"$SCRIPT" -l # Will execute /bin/ls -l
To also store or build arguments dynamically, you'd need an array instead of a string variable.
Example:
SCRIPT=(/bin/ls -l)
"${SCRIPT[#]}" # Will execute /bin/ls -l
SCRIPT+=(/etc) # Add /etc to the array
"${SCRIPT[#]}" # Will execute /bin/ls -l /etc
It worked for me with sh -c:
$ chrome="/opt/google/chrome/chrome"
$ sh -c "$chrome"
Opening in existing browser session.
It also passed ShellCheck without any issues.
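If arguments need to be passed as well, one way (a sketch; the --new-window flag is only an example) is to hand them to sh separately rather than splicing them into the string:
$ sh -c '"$0" "$@"' "$chrome" --new-window
Here $0 inside the -c string receives the command and "$@" receives the remaining arguments, so nothing gets re-split or re-quoted.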
With bash, just use $SCRIPT:
cat <<'EOF' > test.sh
SCRIPT='echo aze rty'
$SCRIPT
EOF
bash test.sh
produces:
aze rty
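Note that this relies on word splitting of the unquoted expansion, so it breaks if the path itself contains spaces (a hypothetical example):
SCRIPT='/path with spaces/exe'
$SCRIPT # word splitting: tries to run "/path" with the arguments "with" and "spaces/exe"
"$SCRIPT" # quoted: runs "/path with spaces/exe" as a single command name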
Related
I am writing a shell script, practice.sh. I want to pass my first command-line argument $1 to the ls command in the script. E.g. if I run my script in the terminal as
bash practice.sh *.mp3
I want the argument *.mp3 to be used by the ls command:
#!/bin/bash
output=$ls $1
It doesn't work. Any help?
The obvious answer for what you say you want is just
#!/bin/bash
ls "$1"
which will run ls, passing it (just) the first argument to the script.
However, you also say you want to run this like: practice.sh *.mp3, which runs the script with many arguments (not just one) -- the *.mp3 will be expanded to all of the .mp3 files in the current directory. For that, you likely want something more like
#!/bin/bash
ls "$#"
which will pass all of the arguments to your script (however many there are) to the ls command.
These scripts will just run ls with its stdout connected to whatever your script has its stdout connected to, so the output will (likely) just appear on your terminal. If you instead want to capture the output of the ls command (so you can do something else with it), you need something like
#!/bin/bash
output=$(ls "$#")
which will run ls with all the arguments, and capture the output in the variable $output. You can then do things with that variable.
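For instance, a small sketch of doing something with the captured value (the emptiness check is only illustrative):
#!/bin/bash
output=$(ls "$@")
if [ -z "$output" ]; then
    echo "nothing listed" >&2
else
    echo "$output"
fi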
Use command substitution (a form of shell expansion) to record the output of the command in the variable output:
output=$(ls "$1")
This will record the output of the command ls "$1" in the variable output.
You can then use echo "$output" to print it.
You can read more about shell expansion in the GNU Bash reference manual.
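Putting it together, a minimal practice.sh along these lines might be (the quoting of $1 keeps names with spaces intact):
#!/bin/bash
output=$(ls "$1")
echo "$output"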
I have a shell script that sets a variable. I can access it inside the script, but I can't outside of it. Is it possible to make the variable global?
Accessing the variable before it's created returns nothing, as expected:
$ echo $mac
$
Creating the script to create the variable:
#!/bin/bash
mac=$(cat \/sys\/class\/net\/eth0\/address)
echo $mac
exit 0
Running the script gives the current mac address, as expected:
$ ./mac.sh
12:34:56:ab:cd:ef
$
Accessing the variable after it's created returns nothing, NOT expected:
$ echo $mac
$
Is there a way I can access this variable at the command line and in other scripts?
A child process can't affect the parent process like that.
You have to use the . (dot) command — or, if you like C shell notations, the source command — to read the script (hence . script or source script):
. ./mac.sh
source ./mac.sh
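For example, with the mac.sh from the question, sourcing it both prints the address and sets mac in the current shell:
$ . ./mac.sh
12:34:56:ab:cd:ef
$ echo $mac
12:34:56:ab:cd:ef
$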
Or you generate the assignment on standard output and use eval $(script) to set the variable:
$ cat mac.sh
#!/bin/bash
echo mac=$(cat /sys/class/net/eth0/address)
$ bash mac.sh
mac=12:34:56:ab:cd:ef
$ eval $(bash mac.sh)
$ echo $mac
12:34:56:ab:cd:ef
$
Note that if you use no slashes in specifying the script for the dot or source command, then the shell searches for the script in the directories listed in $PATH. The script does not have to be executable; readable is sufficient (and being read-only is beneficial in that you can't run the script accidentally).
It's not clear what all the backslashes in the pathname were supposed to do other than confuse; they're unnecessary.
See ssh-agent for precedent in generating a script like that.
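For instance, ssh-agent is normally started exactly this way, as a sketch of the same pattern:
eval "$(ssh-agent -s)" # ssh-agent prints shell assignments; eval runs them in the current shell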
I am trying to dynamically create aliases from the output of another command line tool.
For example:
> MyScript
blender="/opt/apps/blender/blender/2.79/blender"
someOtherAlias="ls -l"
I am trying the following code:
MyScript | {
while IFS= read -r line;
do
`echo alias $line`;
done;
}
But when I run this, I get the following error:
bash: alias: -l": not found
Just trying to run this command by itself gives me the same error:
> `echo 'alias someOtherAlias="ls -l"'`
bash: alias: -l": not found
But obviously the following command does work:
alias someOtherAlias="ls -l"
I've tried to find someone else who may have done this before, but none of my searches have come up with anything.
I would appreciate any and all help. Thanks!
See how bash (and POSIX shells) parse and quote commands, and note the difference between shell syntax and a literal argument: for example, in '.."..' or "..'..", the inner quotes are literal characters that become part of the argument, whereas the outer " or ' are shell syntax and are not part of the argument.
Also, enabling tracing with set -x may help to understand:
set -x
`echo 'alias someOtherAlias="ls -l"'`
++ echo 'alias someOtherAlias="ls -l"'
+ alias 'someOtherAlias="ls' '-l"'
bash: alias: -l": not found
bash sees 3 words: alias, someOtherAlias="ls and -l".
alias loops over its arguments: if an argument contains an =, it creates an alias; otherwise it displays the alias that argument names. Since -l" is not an alias, it shows the error.
Note also that backquotes run the enclosed command in a subshell (visible as the extra + level in the trace), so anything that command does, other than producing output, has no effect on the current shell.
eval can be used to reinterpret a literal string as bash syntax (i.e. to parse the string a second time).
So the following should work, but be careful: using eval on arbitrary arguments (e.g. from user input) can run arbitrary commands.
eval 'alias someOtherAlias="ls -l"'
Finally, commands after a pipe are also run in a subshell, so aliases created in the piped-to while loop would be lost; read from a process substitution instead and let eval do the parsing:
while IFS= read -r line;
do
eval "alias $line";
done < <(MyScript)
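Once that loop has run in the current interactive shell, the result can be checked with alias itself; if the loop lives in a script file instead, that file has to be sourced (. file) rather than executed for the aliases to survive:
alias someOtherAlias # should print: alias someOtherAlias='ls -l'
someOtherAlias # now runs ls -l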
In my program I need to know the maximum number of processes I can run, so I wrote a script. It works when I run it in a shell, but not when my program runs it using system("./limit.sh"). I work in bash.
Here is my code:
#/bin/bash
LIMIT=\`ulimit -u\`
ACTIVE=\`ps -u | wc -l \`
echo $LIMIT > limit.txt
echo $ACTIVE >> limit.txt
Can anyone help?
Why The Original Fails
Command substitution syntax doesn't work if escaped. When you run:
LIMIT=\`ulimit -u\`
...what you're doing is running a command named
-u`
...with the environment variable named LIMIT containing the value
`ulimit
...and unless you actually have a command that starts with -u and contains a backtick in its name, this can be expected to fail.
This is because the backslashes make characters which would otherwise be syntax (here, the backticks) into literals, and running a command with one or more var=value pairs preceding it treats those pairs as variables to export into the environment for the duration of that single command.
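A quick way to see the var=value prefix behaviour in isolation (assuming printenv is available and greeting is not already set):
$ greeting=hello printenv greeting # greeting is exported only for this one command
hello
$ echo "${greeting:-not set}" # ...and is not set in the shell afterwards
not set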
Doing It Better
#!/bin/bash
limit=$(ulimit -u)
active=$(ps -u | wc -l)
printf '%s\n' "$limit" "$active" >limit.txt
Leave off the backslashes.
Use modern $() command substitution syntax.
Avoid multiple redirections.
Avoid all-caps names for your own variables (these names are used for variables with meaning to the OS or system; lowercase names are reserved for application use).
Doing It Right
#!/bin/bash
exec >limit.txt # open limit.txt as output for the rest of the script
ulimit -u # run ulimit -u, inheriting that FD for output
ps -u | wc -l # run your pipeline, likewise with output to the existing FD
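A quick usage sketch, assuming the script is saved as limit.sh and made executable:
./limit.sh
cat limit.txt # first line: the ulimit -u value; second line: the ps -u | wc -l count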
You have a typo on the very first line: #/bin/bash should be #!/bin/bash - this is often known as a "shebang" line, for "hash" (#) + "bang" (!)
Without that syntax written correctly, the script is run through the system's default shell, which will see that line as just a comment.
As pointed out in the comments, that also means only the standardised options are available for the built-in ulimit command, and those don't include -u.
I am trying to write an fgrep statement removing records with a full record match from a file. I can do this on the command line, but not inside a ksh script. The code I am using boils down to these 4 lines of code:
Header='abc def|ghi jkl' #I use the head command to populate this variable
workfile=abc.txt
command="fgrep -Fxv \'$Header\' $workfile" >$outfile
$command
When I echo $command, the output is exactly what I would type on the command line (with the single quotes), and that works on the command line. When I execute it within the ksh script (file), the single quotes seem not to be recognized, because the errors show it is parsing on spaces.
I have tried back ticks, exec, eval, double quotes instead of single quotes, and not using the $command variable. The problem remains.
I can do this on the command line, but not inside a ksh script
Here's a simple, portable, reliable solution using a heredoc.
#!/usr/bin/env ksh
workfile=abc.txt
outfile=out.txt
IFS= read -r Header <<'EOF'
abc def|ghi jkl
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile" > "$outfile"
EOF
eval "$command"
Explanation:
(Comments can't be added to the script above because they would affect the lines in the heredoc)
IFS= read -r Header <<'EOF' # Line separated literal strings
abc def|ghi jkl # Set into the $Header variable
EOF # As if it were a text file
IFS= read -r command <<'EOF' # Command to execute
grep -Fxv "$Header" "$workfile" > "$outfile" # As if it were typed into
EOF # the shell command line
eval "$command" # Execute the command
The above example is the same as having a text file called header.txt, which contains the line abc def|ghi jkl, and typing the following command:
grep -Fxvf header.txt abc.txt
The heredoc addresses the problem of the script operating differently than the command line as a result of quoting/expansions/escaping issues.
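As a variation on the same idea, the pattern line can be fed to grep on standard input instead, assuming your grep accepts - as the argument to -f (GNU and BSD grep do); reusing $workfile and $outfile from the script above, this avoids eval entirely:
grep -Fxv -f - "$workfile" > "$outfile" <<'EOF'
abc def|ghi jkl
EOF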
A Word of caution regarding eval:
The use of eval in this example is specific. Please see Eval command and security issues for information on how eval can be misused and cause potentially very damaging results.
More Detail / Alternate Example:
For the sake of completeness, clarity, and ability to apply this concept to other situations, some notes about the heredoc and an alternative demonstration:
This implementation of the heredoc in this example is specifically designed with the following criteria:
Literal string assignment of contents, to the variables (using 'EOF')
Use of the eval command to evaluate and execute the referenced variables within the heredoc itself.
File or heredoc ?
One strength of using a heredoc combined with grep -F (fgrep) is the ability to treat a section of the script as if it were a file.
Case for file:
You want to frequently paste "pattern" lines into the file, and remove them as necessary, without having to modify the script file.
Case for heredoc:
You apply the script in an environment where specific files already exist, and you want to match specific exact literal patterns against it.
Example:
Scenario: I have 5 VPS Servers, and I want a script to produce a new fstab file but to ensure it doesn't contain the exact line:
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
This scenario fits the type of situation addressed in this question. I could use the boilerplate from the above code in this answer and modify it as follows:
#!/usr/bin/env ksh
workfile=/etc/fstab
IFS= read -r Header <<'EOF'
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile"
EOF
eval "$command"
This would give me a new fstab file, without the line contained in the heredoc.
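If the result should land in a file rather than on stdout, the redirection can sit outside the eval (the target path is only an example):
eval "$command" > /etc/fstab.new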
Bash FAQ #50: I'm trying to put a command in a variable, but the complex cases always fail! provides comprehensive guidance - while it is written for Bash, most of it applies to Ksh as well.[1]
If you want to stick with storing your command in a variable (defining a function is the better choice), use an array, which bypasses the quoting issues:
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
# Store command and arguments as elements of an array
command=( 'fgrep' '-Fxv' "$Header" "$workfile" )
# Invoke the array as a command.
"${command[#]}" > "$outfile"
Note: only a simple command can be stored in an array, and redirections can't be part of it.
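For comparison, here is a minimal sketch of the function approach mentioned above (the function name is arbitrary); it handles the redirection naturally:
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
outfile=out.txt
filter_header() {
    fgrep -Fxv "$Header" "$workfile" > "$outfile"
}
filter_header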
[1] The function examples use local to create local variables, which ksh doesn't support. Omit local to make do with shell-global variables instead, or use function <name> {...} syntax with typeset instead of local to declare local variables in ksh.