The following script works fine on one server, but on the other it gives an error:
#!/bin/bash
processLine(){
line="$#" # get the complete first line which is the complete script path
name_of_file=$(basename "$line" ".php") # separate the file name (without the extension) from the path
ps aux | grep -v grep | grep -q "$line" || ( nohup php -f "$line" > /var/log/iphorex/$name_of_file.log & )
}
FILE=""
if [ "$1" == "" ]; then
FILE="/var/www/iphorex/live/infi_script.txt"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists. Script will terminate now."
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not be read. Script will terminate now."
exit 2
fi
fi
# read $FILE using the file descriptors
# $IFS is a shell variable known as the internal field separator.
# Set loop separator to end of line
BACKUPIFS=$IFS
# use a temporary variable so that $IFS can be restored later.
IFS=$(echo -en "\n")
exec 3<&0
exec 0<"$FILE"
while read -r line
do
# use $line variable to process line in processLine() function
processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
IFS=$BACKUPIFS
exit 0
I am just trying to read a file containing the paths of various scripts, check whether each script is already running, and run it if it is not. The file /var/www/iphorex/live/infi_script.txt is definitely present. I get the following error on my Amazon server:
[: 24: unexpected operator
infinity.sh: 32: cannot open : No such file
Thanks for your help in advance.
You should just initialize file with
FILE=${1:-/var/www/iphorex/live/infi_script.txt}
and then skip the existence check. If the file
does not exist or is not readable, the exec 0< will
fail with a reasonable error message (there's no point
in you trying to guess what the error message will be,
just let the shell report the error.)
I think the problem is that the shell on the failing server
does not like "==" in the equality test. (Many implementations
of test only accept one '=', but I thought even older bash
had a builtin that accepted two '==' so I might be way off base.)
I would simply eliminate your lines from FILE="" down to
the end of the existence check and replace them with the
assignment above, letting the shell's standard default
mechanism work for you.
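That said, if you do prefer to keep an explicit check, the portable spelling in plain sh's test is a single '=' (or a -z test), for example:
# portable under plain /bin/sh (dash, etc.): '=' or -z instead of '=='
if [ -z "$1" ]; then
    FILE="/var/www/iphorex/live/infi_script.txt"
else
    FILE="$1"
fi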
Note that if you do eliminate the existence check, you'll want
to either add
set -e
near the top of the script, or add a check on the exec:
exec 0<"$FILE" || exit 1
so that the script does not continue if the file is not usable.
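Putting that together, the top of the script reduces to something like this (a sketch, keeping your original default path):
#!/bin/bash
FILE=${1:-/var/www/iphorex/live/infi_script.txt}
# if the file is missing or unreadable, the shell reports the error and we stop here
exec 0<"$FILE" || exit 1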
For bash (and ksh and others), you want [[ "$x" == "$y" ]] with double brackets. That uses the built-in expression handling. A single bracket calls out to the test executable which is probably barfing on the ==.
Also, you can use [[ -z "$x" ]] to test for zero-length strings, instead of comparing to the empty string. See "CONDITIONAL EXPRESSIONS" in your bash manual.
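Applied to the script in question, the argument check could look like this (a sketch; [[ ]] is bash/ksh syntax, so this relies on the script actually being run by bash rather than sh):
if [[ -z "$1" ]]; then
    FILE="/var/www/iphorex/live/infi_script.txt"
else
    FILE="$1"
fi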
Example script:
#!/bin/bash
printf '1\n1\n1\n1\n' | ./script2*.sh >/dev/null 2>/dev/null
Shellcheck returns the following:
In script1.sh line 3:
printf '1\n1\n1\n1\n' | ./script2*.sh >/dev/null 2>/dev/null
^-- SC2211: This is a glob used as a command name. Was it supposed to be in ${..}, array, or is it missing quoting?
According to https://github.com/koalaman/shellcheck/wiki/SC2211, there should be no exceptions to this rule.
Specifically, it suggests "If you want to specify a command name via glob, e.g. to not hard code version in ./myprogram-*/foo, expand to array or parameters first to allow handling the cases of 0 or 2+ matches."
The reason I'm using the glob in the first place is that I append or update a date in the name of any script that I have just created or changed. Interestingly enough, when I use "bash script2*.sh" instead of "./script2*.sh" the complaint goes away.
Have I fixed the problem or I am tricking shellcheck into ignoring a problem that should not be ignored? If I am using bad bash syntax, how might I execute another script that needs to be referenced to using a glob the proper way?
The problem with this is that ./script2*.sh may end up running
./script2-20171225.sh ./script2-20180226.sh ./script2-copy.sh
which is a strange and probably unintentional thing to do, especially if the script is confused by such arguments, or if you wanted your most up-to-date file to be used. Your "fix" has the same fundamental problem.
The suggestion you mention would take the form:
array=(./script2*.sh)
[ "${#array[#]}" -ne 1 ] && { echo "Multiple matches" >&2; exit 1; }
"${array[0]}"
and guard against this problem.
Since you appear to assume that you'll only ever have exactly one matching file to be invoked without parameters, you can turn this into a function:
runByGlob() {
if (( $# != 1 ))
then
echo "Expected exactly 1 match but found $#: $*" >&2
exit 1
elif command -v "$1" > /dev/null 2>&1
then
"$1"
else
echo "Glob is not a valid command: $*" >&2
exit 1
fi
}
whatever | runByGlob ./script2*.sh
Now if you ever have zero or multiple matching files, it will abort with an error instead of potentially running the wrong file with strange arguments.
This part of my script is comparing each line of a file to find a preset string. If the string does NOT exist as a line in the file, it should append it to the end of the file.
STRING=foobar
cat "$FILE" | while read LINE
do
if [ "$STRING" == "$LINE" ]; then
export ISLINEINFILE="yes"
fi
done
if [ ! "$ISLINEINFILE" == yes ]; then
echo "$LINE" >> "$FILE"
fi
However, it appears as if $LINE and $ISLINEINFILE are both cleared upon finishing the do loop. How can I avoid this?
Using shell
If we want to make just the minimal change to your code to get it working, all we need to do is switch the input redirection:
string=foobar
while read line
do
if [ "$string" == "$line" ]; then
islineinfile="yes"
fi
done <"$file"
if [ ! "$islineinfile" == yes ]; then
echo "$string" >> "$file"
fi
In the above, we changed cat "$file" | while read line; do ... done to while read line; do ... done < "$file". With this one change, the while loop is no longer in a subshell and, consequently, shell variables created in the loop live on after the loop completes.
Using sed
I believe that the whole of your script can be replaced with:
sed -i.bak '/^foobar$/H; ${x;s/././;x;t; s/$/\nfoobar/}' file*
The above adds line foobar to the end of each file that doesn't already have a line that matches ^foobar$.
The above shows file* as the final argument to sed. This will apply the change to all files matching the glob. You could list specific files individually if you prefer.
The above was tested on GNU sed (linux). Minor modifications may be needed for BSD/OSX sed.
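If it helps to read, here is the same command spelled out over several lines with comments (an annotated restatement of the one-liner above, again assuming GNU sed):
sed -i.bak '
  # append every line that is exactly "foobar" to the hold space
  /^foobar$/H
  ${
    # on the last line: swap in the hold space; s/././ succeeds only if it is non-empty
    x; s/././; x
    # if it succeeded, the line already exists, so stop here
    t
    # otherwise append a new "foobar" line at the end
    s/$/\nfoobar/
  }' file*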
Using GNU awk (gawk)
awk -i inplace -v s="foobar" '$0==s{f=1} {print} ENDFILE{if (f==0) print s; f=0}' file*
Like the sed command, this can tackle multiple files all in one command.
Why does my variable set in a do loop disappear?
It disappears because it is set in a component of a shell pipeline. Most shells run each part of a pipeline in a subshell, and by Unix design, variables set in a subshell cannot affect their parent or any other already running shell.
How can I avoid this?
There are several ways:
The simplest is to use a shell that doesn't run the last component of a pipeline in a subshell. This is the default behavior of ksh, e.g. use this shebang:
#!/bin/ksh
Bash behaves the same way when the lastpipe option is set:
shopt -s lastpipe
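A minimal sketch of that approach (bash 4.2 or later; lastpipe only takes effect when job control is off, i.e. in a script rather than an interactive shell, and $FILE is assumed to be set as in the question):
#!/bin/bash
shopt -s lastpipe
STRING=foobar
cat "$FILE" | while read -r LINE; do
    [ "$STRING" = "$LINE" ] && ISLINEINFILE="yes"
done
# with lastpipe, the loop above ran in the current shell,
# so $ISLINEINFILE is still set here
echo "${ISLINEINFILE:-no}"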
You might use the variable in the same subshell that set it. Note that your original script's indentation is wrong and might lead to the incorrect assumption that the if block is inside the pipeline, which isn't the case. Enclosing the whole block in parentheses would rectify that and would be the minimal change (two extra characters) to make it work:
STRING=foobar
cat "$FILE" | ( while read LINE
do
if [ "$STRING" == "$LINE" ]; then
export ISLINEINFILE="yes"
fi
done
if [ ! "$ISLINEINFILE" == yes ]; then
echo "$LINE" >> "$FILE"
fi
)
The variable would still be lost after that block though.
You might simply avoid the pipeline, which is straightforward in your case, since the cat is unnecessary:
STRING=foobar
while read LINE
do
if [ "$STRING" == "$LINE" ]; then
export ISLINEINFILE="yes"
fi
done < "$FILE"
if [ ! "$ISLINEINFILE" == yes ]; then
echo "$LINE" >> "$FILE"
fi
You might use another algorithmic approach, like sed or gawk as suggested by John1024.
See also https://unix.stackexchange.com/a/144137/2594 for standard compliance details.
I am trying to write a loop, and this doesn't work:
for t in `ls $TESTS_PATH1/cmd*.in` ; do
diff $t.out <($parser_test `cat $t`)
# Print result
if [[ $? -eq 0 ]] ; then
printf "$t ** TEST PASSED **"
else
printf "$t ** TEST FAILED **"
fi
done
This also doesn't help:
$parser_test `cat $t` | $DIFF $t.out -
Diff shows that the output differs (strangely, I see the expected error line as if it were printed to stdout and not caught by diff), but when running with a temporary file everything works fine:
for t in `ls $TESTS_PATH1/cmd*.in` ; do
# Compare output with template
$parser_test `cat $t` 1> $TMP_FILE 2> $TMP_FILE
diff $TMP_FILE $t.out
# Print result
if [[ $? -eq 0 ]] ; then
printf "$t $CGREEN** TEST PASSED **$CBLACK"
else
printf "$t $CRED** TEST FAILED **$CBLACK"
fi
done
I must avoid using a temporary file. Why doesn't the first loop work, and how can I fix it?
Thank you.
P.S. Files *.in contain erroneous command line parameters for program, and *.out contain errors messages that program must print for these parameters.
First, to your error, you need to redirect standard error:
diff $t.out <($parser_test `cat $t` 2>&1)
Second, to all the other problems you may not be aware of:
don't use ls with a for loop (it has numerous problems, such as unexpected behavior in filenames containing spaces); instead, use: for t in $TESTS_PATH1/cmd*.in; do
to support file names with spaces, quote your variable expansion: "$t" instead of $t
don't use backquotes; they are deprecated in favor of $(command)
don't use a subshell to cat one file; instead, just run: $parser_test <$t
use either [[ $? == 0 ]] (new syntax) or [ $? -eq 0 ] (old syntax)
if you use printf instead of echo, don't forget that you need to add \n at the end of the line manually
never use 1> $TMP_FILE 2> $TMP_FILE - this just overwrites stdout with stderr in a non-predictable manner. If you want to combine standard out and standard error, use: 1>$TMP_FILE 2>&1
by convention, ALL_CAPS names are used for/by environment variables. In-script variable names are recommended to be no_caps.
you don't need to use $? right after executing a command, it's redundant. Instead, you can directly run: if command; then ...
After fixing all that, your script would look like this:
for t in $tests_path1/cmd*.in; do
if diff "$t.out" <($parser_test <"$t" 2>&1); then
echo "$t ** TEST PASSED **"
else
echo "$t ** TEST FAILED **"
fi
done
If you don't care for the actual output of diff, you can add >/dev/null right after diff to silence it.
Third, if I understand correctly, your file names are of the form foo.in and foo.out, and not foo.in and foo.in.out (like the script above expects). If this is true, you need to change the diff line to this:
diff "${t/.in}.out" <($parser_test <"$t" 2>&1)
In your second test you are capturing standard error, but in the first one (and the pipe example) stderr remains uncaptured, and perhaps that's the "diff" (pun intended).
You can probably add a '2>&1' in the proper place to combine the stderr and stdout streams.
e.g.:
diff $t.out <($parser_test `cat $t` 2>&1)
Also, you don't say what "doesn't work" means: does it not find a difference, or does it exit with an error message? Please clarify if you need more info.
I'm trying to write a simple script that will tell me if a filename exists in $Temp that starts with the string "Test".
For example, I have these files
Test1989.txt
Test1990.txt
Test1991.txt
Then I just want to echo that a file was found.
For example, something like this:
file="home/edward/bank1/fiche/Test*"
if test -s "$file"
then
echo "found one"
else
echo "found none"
fi
But this doesn't work.
One approach:
(
shopt -s nullglob
files=(/home/edward/bank1/fiche/Test*)
if [[ "${#files[#]}" -gt 0 ]] ; then
echo found one
else
echo found none
fi
)
Explanation:
shopt -s nullglob will cause /home/edward/bank1/fiche/Test* to expand to nothing if no file matches that pattern. (Without it, it will be left intact.)
( ... ) sets up a subshell, preventing shopt -s nullglob from "escaping".
files=(/home/edward/bank1/fiche/Test*) puts the file-list in an array named files. (Note that this is within the subshell only; files will not be accessible after the subshell exits.)
"${#files[#]}" is the number of elements in this array.
Edited to address subsequent question ("What if i also need to check that these files have data in them and are not zero byte files"):
For this version, we need to use -s (as you did in your question), which also tests for the file's existence, so there's no point using shopt -s nullglob anymore: if no file matches the pattern, then -s on the pattern will be false. So, we can write:
(
found_nonempty=''
for file in /home/edward/bank1/fiche/Test* ; do
if [[ -s "$file" ]] ; then
found_nonempty=1
fi
done
if [[ "$found_nonempty" ]] ; then
echo found one
else
echo found none
fi
)
(Here the ( ... ) is to prevent file and found_nonempty from "escaping".)
You have to understand how Unix interprets your input.
The standard Unix shell interpolates environment variables and what are called globs before it passes the parameters to your program. This is a bit different from Windows, which makes the program itself interpret the expansion.
Try this:
$ echo *
This will echo all the files and directories in your current directory. Before the echo command acts, the shell interpolates the * and expands it, then passes that expanded parameter back to your command. You can see it in action by doing this:
$ set -xv
$ echo *
$ set +xv
The set -xv turns on xtrace and verbose. Verbose echoes the command as entered, and xtrace echoes the command that will be executed (that is, after the shell expansion).
Now try this:
$ echo "*"
Note that putting something inside quotes hides the glob expression from the shell, and the shell cannot expand it. Try this:
$ foo="this is the value of foo"
$ echo $foo
$ echo "$foo"
$ echo '$foo'
Note that the shell can still expand environment variables inside double quotes, but not in single quotes.
Now let's look at your statement:
file="home/edward/bank1/fiche/Test*"
The double quotes prevent the shell from expanding the glob expression, so file is equal to the literal home/edward/bank1/fiche/Test*. Therefore, you need to do this:
file=/home/edward/bank1/fiche/Test*
The lack of quotes (and the introductory slash, which is important!) will now make file equal to all files that match that expression. (There might be more than one!) If there are no files, depending upon the shell and its settings, the shell may simply set file to that literal string anyway.
You certainly have the right idea:
file=/home/edward/bank1/fiche/Test*
if test -s $file
then
echo "found one"
else
echo "found none"
fi
However, you still might get found none returned if there is more than one file. Instead, you might get an error in your test command because there are too many parameters.
One way to get around this might be:
if ls /home/edward/bank1/fiche/Test* > /dev/null 2>&1
then
echo "There is at least one match (maybe more)!"
else
echo "No files found"
fi
In this case, I'm taking advantage of the exit code of the ls command. If ls finds one file it can access, it returns a zero exit code. If it can't find one matching file, it returns a non-zero exit code. The if command merely executes a command; if that command returns zero, the condition is treated as true and the if clause is executed. If the command returns a non-zero value, the condition is treated as false, and the else clause (if one is available) is executed.
The test command works in a similar fashion. If the test is true, the test command returns a zero. Otherwise, the test command returns a non-zero value. This works great with the if command. In fact, there's an alias to the test command. Try this:
$ ls -li /bin/test /bin/[
The -i prints out the inode. The inode is the real ID of the file. Files with the same ID are the same file. You can see that /bin/test and /bin/[ are the same command. This makes the following two commands the same:
if test -s $file
then
echo "The file exists"
fi
if [ -s $file ]
then
echo "The file exists"
fi
You can do it in one line:
ls /home/edward/bank1/fiche/Test* >/dev/null 2>&1 && echo "found one" || echo "found none"
To understand what it does you have to decompose the command and have a basic awareness of boolean logic.
Directly from bash man page:
[...]
expression1 && expression2
True if both expression1 and expression2 are true.
expression1 || expression2
True if either expression1 or expression2 is true.
[...]
In the shell (and in general in unix world), the boolean true is a program that exits with status 0.
ls tries to list the pattern; if it succeeds (meaning the pattern exists) it exits with status 0, and with status 2 otherwise (have a look at the ls man page for details).
In our case there are actually 3 expressions; for the sake of clarity I will put parentheses, although they are not needed because && has precedence over ||:
(expression1 && expression2) || expression3
so if expression1 is true (i.e. ls found the pattern) it evaluates expression2 (which is just an echo and will exit with status 0). In this case expression3 is never evaluated, because what's on the left side of || is already true and it would be a waste of resources to evaluate what's on the right.
Otherwise, if expression1 is false, expression2 is not evaluated but in this case expression3 is.
for entry in "/home/loc/etc/"/*
do
if [ -s "$entry" ]
then
echo "$entry File is available"
else
echo "$entry File is not available"
fi
done
Hope it helps
The following script will let you wait until a given file exists before proceeding:
cat > waitfor.csh
#!/bin/csh
while !( -e $1 )
sleep 10m
end
ctrl+D
Here -e tests for the existence of a file,
$1 is the first argument passed to the script,
and sleep 10m sleeps for 10 minutes.
You can execute the script with: ./waitfor.csh ./temp ; echo "the file exists"
A one-liner to check whether a file exists or not:
awk 'BEGIN {print getline < "file.txt" < 0 ? "File does not exist" : "File Exists"}'
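If you would rather use that getline trick inside a shell if than print a message, a variant along these lines should work (assuming an awk whose getline returns -1 when the file cannot be opened, as GNU awk does):
if awk 'BEGIN { exit !((getline line < "file.txt") >= 0) }'; then
    echo "File exists"
else
    echo "File does not exist"
fi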
Wildcards aren't expanded inside quoted strings. And when a wildcard is expanded, it's returned unchanged if there are no matches; it doesn't expand into an empty string. Try:
output="$(ls home/edward/bank1/fiche/Test* 2>/dev/null)"
if [ -n "$output" ]
then echo "Found one"
else echo "Found none"
fi
If the wildcard expanded to filenames, ls will list them on stdout; otherwise it will print an error on stderr, and nothing on stdout. The contents of stdout are assigned to output.
if [ -n "$output" ] tests whether $output contains anything.
Another way to write this would be:
if [ $(ls home/edward/bank1/fiche/Test* 2>/dev/null | wc -l) -gt 0 ]
I have a number of bash scripts, each doing its own thing merrily. Do note that while I program in other languages, I only use Bash to automate things, and am not very good at it.
I'm now trying to combine a number of them to create "meta" scripts, if you will, which use other scripts as steps. The problem is that I need to parse the output of each step to be able to pass a part of it as params to the next step.
An example:
stepA.sh
[...does stuff here...]
echo "Task complete successfuly"
echo "Files available at: $d1/$1"
echo "Logs available at: $d2/$1"
Both of the above are paths, such as /var/www/thisisatest and /var/log/thisisatest (note that files always start with /var/www and logs always start with /var/log). I'm only interested in the files path.
stepB.sh
[...does stuff here...]
echo "Creation of $d1 complete."
echo "Access with username $usr and password $pass"
All variables here are simple strings that may contain special characters (but no spaces).
What I'm trying to build is a script that runs stepA.sh, then stepB.sh and uses the output of each to do its own stuff. What I'm currently doing (both above scripts are symlinked to /usr/local/bin without the .sh part and made executable):
#!/bin/bash
stepA $1 | while read -r line; do
# Create the container, and grab the file location
# then pass it to then next pipe
if [[ "$line" == *:* ]]
then
POS=`expr index "$line" "/"`
PTH="/${line:$POS}"
if [[ "$PTH" == *www* ]]
then
#OK, have what I need here, now what?
echo $PTH;
fi
fi
done
# Somehow get $PTH here
stepB $1 | while read -r line; do
...
done
#somehow have the required strings here
I'm stuck on passing PTH to the next step. I understand this is because piping runs it in a subshell; however, all the examples I've seen refer to files and not commands, and I could not make this work. I tried piping the echo to a "next step", such as
stepA | while ...
echo $PTH
done | while ...
#Got my var here, but cannot run stuff
done
How can I run stepA and have the PTH variable available for later?
Is there a "better way" to extract the path I need from the output than nested ifs ?
Thanks in advance!
Since you're using bash explicitly (in the shebang line), you can use its process substitution feature instead of a pipe:
while read -r line; do
if [[ "$line" == *:* ]]
.....
fi
done < <(stepA $1)
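For instance, a sketch of your first loop rewritten this way (keeping your path extraction; the variable survives because no subshell is involved):
#!/bin/bash
PTH=""
while read -r line; do
    if [[ "$line" == *:* && "$line" == *www* ]]; then
        # strip everything up to the first "/" to recover the absolute path
        PTH="/${line#*/}"
    fi
done < <(stepA "$1")
echo "$PTH"   # still set here, outside the loop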
Alternately, you could capture the command's output to a string variable, and then parse that:
output="$(stepA $1)"
tmp="${output#*$'\nFiles available at: '}" # output with everything before the filepath trimmed
filepath="${tmp%%$'\n'*}" # trim the first newline and everything after it from $tmp
tmp="${output#*$'\nLogs available at: '}"
logpath="${tmp%%$'\n'*}"