Here is my problem statement.
Write a shell script that takes the name of a folder as a command-line argument and produces a file that contains the names of all subfolders with size 0 (that is, empty subfolders).
This is my shell script.
ls $1
while read folder
do
files = 'ls $folder | wc -l'
if[$files -eq 0];
then
echo "$folder">>output.txt
echo "File deleted"
else
echo "File is not empty"
fi
done
When I execute my script (using 'sh filename'), it shows a syntax error:
Syntax error: "then" unexpected (expecting "done")
Is there anything wrong with my script?
Don't forget that in the shell, [ is a binary that takes parameters and returns true or false (0 or 1).
if is a keyword that checks whether the command that follows it returns true (exit status 0).
So, when you write
if[$files -eq 0]
your shell understands nothing: it tries to launch a program literally named if[... (if[ glued to the expanded value of $files), and then finds a then without having detected an if.
To fix your problem, put a space after the if and after the [, because a command must have a space between its name and its arguments.
ls $1
while read folder
do
files = `ls $folder | wc -l`
if [ $files -eq 0 ]
then
echo "$folder">>output.txt
echo "File deleted"
else
echo "File is not empty"
fi
done
Try something like this
ls $1
while read folder
do
files=`ls $folder | wc -l`
if [ $files -eq 0 ]; then
echo "$folder">>output.txt
echo "File deleted"
else
echo "File is not empty"
fi
done
Notice there is no space around the = in files=.., and that it uses ` (backtick), not ' (single quote).
Notice the space between 'if' and '[' ...
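To see the difference that the quoting makes, here is a small illustration (using /tmp purely as an example directory):
files='ls /tmp | wc -l'    # single quotes: $files is just the literal text of the command
echo "$files"              # prints: ls /tmp | wc -l
files=`ls /tmp | wc -l`    # backticks: the command actually runs and its output is captured
echo "$files"              # prints a number
files=$(ls /tmp | wc -l)   # $( ) is the modern, easier-to-read equivalent of backticks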
There may be a line-ending problem:
Just do these 2 steps:
run hexdump -C yourscript.sh (look for 0d, i.e. \r, before each line feed)
run cat yourscript.sh | tr -d '\r' > yournewscript.sh
It will create a new, correct file; then run the new file.
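If the script was edited on Windows, each line ends in \r\n; another way to strip the carriage returns, assuming GNU sed is available (or dos2unix, if installed), is:
sed -i 's/\r$//' yourscript.sh    # rewrite the file in place without the \r characters
# or: dos2unix yourscript.sh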
You've already got answers describing how your existing script needs to be fixed:
no spaces around the = when you set the $files variable,
backquotes instead of single ticks for your command substitution,
a space after if, and spaces around the parts of the conditional expression.
Your script suffers from the parsing-ls issue: filenames may be handled badly if they contain special characters such as newlines. You may think this isn't a big deal when all you want to do is check for the existence or nonexistence of files (i.e. count == 0), but the way you're doing it is still cumbersome and encourages bad habits.
How about, instead consider:
while read folder; do
    files=0
    for f in "$folder"/*; do
        # if the glob matched nothing, it is left as literal text and names no real file
        [ -e "$f" ] || break
        files=1
        break
    done
    if [ "$files" -eq 0 ]; then
        echo "$folder" >> output.txt
    else
        echo "not empty: $folder" >&2
    fi
done
Instead of counting files with a command substitution and a pipe, this uses a for loop to set a flag as soon as any file is found. Because it avoids spawning ls and wc, it will generally be faster as well.
Note that this is POSIX-compliant. If your shell is a more advanced one, like bash or zsh, you have more elegant/efficient options available.
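For example, if your find supports -maxdepth and -empty (GNU and BSD find both do, though these are extensions rather than strict POSIX), the whole task from the original question can be reduced to a single command; a minimal sketch, assuming the target folder is the script's first argument and output.txt is the desired output file:
find "$1" -mindepth 1 -maxdepth 1 -type d -empty > output.txt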
I'm trying to write a shell script to check whether there is an existing file that ends with .txt, using an if statement.
Within single-bracket conditionals, all of the shell expansions occur, in this case particularly filename expansion.
The conditional construct acts upon the number of arguments it is given: -f expects exactly one argument to follow it, a filename. Apparently your *.txt pattern matches more than one file.
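To illustrate the failure mode, suppose the directory contains a.txt and b.txt (hypothetical names); the glob is expanded before [ runs, so the test no longer has the single argument that -f expects:
if [ -f *.txt ]; then echo "found"; fi
# after filename expansion this effectively runs:  [ -f a.txt b.txt ]
# which is not a valid single-file test, so [ complains and the test fails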
If your shell is bash, you can do
files=(*.txt)
if (( ${#files[@]} > 0 )); then ...
or, more portably:
count=0
for file in *.txt; do
count=1
break
done
if [ "$count" -eq 0 ]; then
echo "no *.txt files"
else
echo "at least one *.txt file"
fi
I finally get your perspective now. I've been giving you some incomplete advice. This is what you need:
for f in *.txt; do
if [ -f "$f" ]; then
do_something_with "$f"
fi
done
The reason: if there are no files matching the pattern, then the shell leaves the pattern as a plain string. On the first iteration of the loop we have f="*.txt", and whatever command you run on it fails with a "no such file" style error.
I'm used to working in bash with the nullglob option, which handles this edge case.
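A minimal sketch of that nullglob behaviour (bash-specific; do_something_with stands in for whatever you actually do with the file):
shopt -s nullglob
for f in *.txt; do
    # with nullglob set, the loop simply never runs when nothing matches,
    # so the [ -f "$f" ] guard is no longer needed
    do_something_with "$f"
done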
The script finds the error, but it always goes to the else branch ("Not Found Error"). Am I missing something about how to compare two variables?
ERROR="Error:"
for i in `find /logs -mtime -1`
do
CHECK=`cat $i |grep -i "Error"|cut -f 1 -d " "`
if [ "$CHECK" == $ERROR ]
then
echo "Found Error"
else
echo "Not Found Error"
fi
done
Did you try something like if [[ "$CHECK" == $ERROR ]] ?
To simply detect an error without printing the error message, you can use
CHECK=$(cat $i | grep "Error" | wc -l)
if [[ $CHECK -ne 0 ]]
then
echo "Found error"
else
echo "Not found error"
fi
You are using grep -i for case-insensitive matching, but then testing the result for exact equality with the string Error:. If the case-insensitive matching is important then the exact equality test is not an appropriate complement.
You are also capturing a potentially multi-line output and comparing it to a string that can be the result only of a single-line output.
And you are matching "Error:" anywhere on the line, but assuming that it will appear at the beginning.
Overall, you are going about this in a very convoluted way, as grep tells you via its exit status whether it found any matches. For example:
#!/bin/bash
for log in `find /logs -mtime -1`; do
if grep -i -q '^Error:' "$log"; then
echo "Found Error"
else
echo "Not Found Error"
fi
done
There are two things I would advise that may fix your issue:
Add #!/bin/bash on the first line, to make sure the script is interpreted by bash and not sh. Many times I have had trouble with comparisons because of this.
When comparing two variables, use double brackets ([[ and ]]). Also, if they are strings, always put quotes around them ("$ERROR"); the quotes are missing for the $ERROR variable.
Look at the other answers too; there are many ways to do the same thing much more simply.
Note: when comparing numbers you should use -eq.
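Putting that advice together, a minimal sketch (reusing the $CHECK and $ERROR variables from the question; the numeric value is hypothetical):
#!/bin/bash
# string comparison: double brackets and quoted variables
if [[ "$CHECK" == "$ERROR" ]]; then
    echo "Found Error"
fi
# numeric comparison: use -eq / -ne
count=3    # hypothetical value
if [[ "$count" -eq 0 ]]; then
    echo "no matches"
fi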
I'm creating a bash script to read a file in line by line; the contents are later formatted and organised by name and then date. I cannot see why this code isn't working: no errors show up, even though I have tried the input and output filename variables on their own, with a directory finder and with the export command.
export inputfilename="employees.txt"
export outputfilename="branch.txt"
directoryinput=$(find -name $inputfilename)
directoryoutput=$(find -name $outputfilename)
n=1
if [[ -f "$directoryinput" ]]; then
while read line; do
echo "$line"
n=$((n+1))
done < "$directoryoutput"
else
echo "Input file does not exist. Please create a employees.txt file"
fi
All help is very much appreciated, thank you!
NOTE: As people noticed, I forgot the $ sign on the redirection, but that was just a mistake in copying my code; I do have the $ sign in my actual application and still get no result.
Reading in File line by line w/ Bash
The best and idiomatic way to read file line by line is to do:
while IFS= read -r line; do
    # parse the line here
    printf '%s\n' "$line"
done < "file"
More on this topic can be found on bashfaq
However, try not to read files in bash line by line. You can (ok, almost) always avoid reading a stream line by line in bash; doing so is extremely slow and shouldn't be done. For simple cases, the standard unix tools with the help of xargs or parallel can be used; for more complicated cases, awk and datamash.
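For instance, the loop from the question, which echoes each line and counts them, could be handled by a single awk invocation; a minimal sketch, assuming the employees.txt file from the question:
awk '{ print } END { print NR " lines" }' employees.txt    # NR is the running line count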
done < "directoryoutput"
The code is not working because you are feeding your while read loop, on standard input, the content of a file literally named directoryoutput. As no such file exists, your script fails.
directoryoutput=$(find -name $outputfilename)
One can simply feed the variable's value (a newline is appended automatically) to the while read loop using a here-string construction:
done <<< "$directoryoutput"
directoryinput=$(find -name $inputfilename)
if [[ -f "$directoryinput" ]]
This is ok as long as you have only one file named $inputfilename below the current directory. Also, it makes little sense to find a file and then separately check for its existence. If there are more matching files, find returns a newline-separated list of names. A small check such as if [ "$(printf "%s\n" "$directoryinput" | wc -l)" -eq 1 ], or using find -name "$inputfilename" | head -n1, would be better, I think.
while read line;
do
echo "$line"
n=$((n+1))
done < "directoryoutput"
The intention is pretty clear here. This is just:
n=$(<directoryoutput wc -l)
cat "directoryoutput"
Except that while read line strips leading and trailing whitespace (it is IFS-dependent).
Also always remember to quote your variables unless you have a reason not to.
Have a look at shellcheck which can find most common mistakes in scripts.
I would do it more like this:
inputfilename="employees.txt"
outputfilename="branch.txt"
directoryinput=$(find . -name "$inputfilename")
directoryinput_cnt=$(printf "%s\n" "$directoryinput" | wc -l)
if [ "$directoryinput_cnt" -eq 0 ]; then
echo "Input file does not exist. Please create a '$inputfilename' file" >&2
exit 1
elif [ "$directoryinput_cnt" -gt 1 ]; then
echo "Multiple file named '$inputfilename' exists in the current path" >&2
exit 1
fi
directoryoutput=$(find . -name "$outputfilename")
directoryoutput_cnt=$(printf "%s\n" "$directoryoutput" | wc -l)
if [ "$directoryoutput_cnt" -eq 0 ]; then
echo "Input file does not exist. Please create a '$outputfilename' file" >&2
exit 1
elif [ "$directoryoutput_cnt" -gt 1 ]; then
echo "Multiple file named '$outputfilename' exists in the current path" >&2
exit 1
fi
cat "$directoryoutput"
n=$(<"$directoryoutput" wc -l)
I'm trying to write a simple script that will tell me if a filename exists in $Temp that starts with the string "Test".
For example, I have these files
Test1989.txt
Test1990.txt
Test1991.txt
Then I just want to echo that a file was found.
For example, something like this:
file="home/edward/bank1/fiche/Test*"
if test -s "$file"
then
echo "found one"
else
echo "found none"
fi
But this doesn't work.
One approach:
(
shopt -s nullglob
files=(/home/edward/bank1/fiche/Test*)
if [[ "${#files[#]}" -gt 0 ]] ; then
echo found one
else
echo found none
fi
)
Explanation:
shopt -s nullglob will cause /home/edward/bank1/fiche/Test* to expand to nothing if no file matches that pattern. (Without it, it will be left intact.)
( ... ) sets up a subshell, preventing shopt -s nullglob from "escaping".
files=(/home/edward/bank1/fiche/Test*) puts the file-list in an array named files. (Note that this is within the subshell only; files will not be accessible after the subshell exits.)
"${#files[#]}" is the number of elements in this array.
Edited to address subsequent question ("What if i also need to check that these files have data in them and are not zero byte files"):
For this version, we need to use -s (as you did in your question), which also tests for the file's existence, so there's no point using shopt -s nullglob anymore: if no file matches the pattern, then -s on the pattern will be false. So, we can write:
(
found_nonempty=''
for file in /home/edward/bank1/fiche/Test* ; do
if [[ -s "$file" ]] ; then
found_nonempty=1
fi
done
if [[ "$found_nonempty" ]] ; then
echo found one
else
echo found none
fi
)
(Here the ( ... ) is to prevent file and found_nonempty from "escaping".)
You have to understand how Unix interprets your input.
The standard Unix shell interpolates environment variables and what are called globs before it passes the parameters to your program. This is a bit different from Windows, which makes the program itself interpret the expansion.
Try this:
$ echo *
This will echo all the files and directories in your current directory. Before the echo command acts, the shell interpolates the * and expands it, then passes that expanded parameter back to your command. You can see it in action by doing this:
$ set -xv
$ echo *
$ set +xv
The set -xv turns on xtrace and verbose. Verbose echoes the command as entered, and xtrace echoes the command that will be executed (that is, after the shell expansion).
Now try this:
$ echo "*"
Note that putting something inside quotes hides the glob expression from the shell, and the shell cannot expand it. Try this:
$ foo="this is the value of foo"
$ echo $foo
$ echo "$foo"
$ echo '$foo'
Note that the shell can still expand environment variables inside double quotes, but not in single quotes.
Now let's look at your statement:
file="home/edward/bank1/fiche/Test*"
The double quotes prevent the shell from expanding the glob expression, so file is equal to the literal home/edward/bank1/fiche/Test*. Therefore, you need to do this:
file=/home/edward/bank1/fiche/Test*
The lack of quotes (and the leading slash, which is important!) will now make file equal to all files that match that expression (there might be more than one!). If there are no matching files, depending upon the shell and its settings, the shell may simply set file to that literal string anyway.
You certainly have the right idea:
file=/home/edward/bank1/fiche/Test*
if test -s $file
then
echo "found one"
else
echo "found none"
fi
However, you still might get found none returned if there is more than one matching file, because the test command receives too many parameters and fails with an error.
One way to get around this might be:
if ls /home/edward/bank1/fiche/Test* > /dev/null 2>&1
then
echo "There is at least one match (maybe more)!"
else
echo "No files found"
fi
In this case, I'm taking advantage of the exit code of the ls command. If ls finds at least one file it can access, it returns a zero exit code. If it can't find a matching file, it returns a non-zero exit code. The if command merely executes a command; if that command returns zero, the condition is considered true and the then clause is executed. If the command returns a non-zero value, the condition is considered false, and the else clause (if one is present) is executed.
The test command works in a similar fashion. If the test is true, the test command returns zero; otherwise, it returns a non-zero value. This works great with the if command. In fact, [ is just another name for the test command. Try this:
$ ls -li /bin/test /bin/[
The -i option prints out the inode. The inode is the real ID of the file; files with the same ID are the same file. You can see that /bin/test and /bin/[ are the same command. This makes the following two commands the same:
if test -s $file
then
echo "The file exists"
fi
if [ -s $file ]
then
echo "The file exists"
fi
You can do it in one line:
ls /home/edward/bank1/fiche/Test* >/dev/null 2>&1 && echo "found one" || echo "found none"
To understand what it does you have to decompose the command and have a basic awareness of boolean logic.
Directly from the bash man page:
[...]
expression1 && expression2
True if both expression1 and expression2 are true.
expression1 || expression2
True if either expression1 or expression2 is true.
[...]
In the shell (and in the unix world in general), boolean true is a program that exits with status 0.
ls tries to list the pattern; if it succeeds (meaning the pattern matched something) it exits with status 0, and with a non-zero status otherwise (have a look at the ls man page for details).
In our case there are actually 3 expressions; for the sake of clarity I will put in parentheses, although they are not needed because && has precedence over ||:
(expression1 && expression2) || expression3
so if expression1 is true (i.e. ls found the pattern) it evaluates expression2 (which is just an echo and will exit with status 0). In this case expression3 is never evaluated, because what's on the left side of || is already true and it would be a waste of resources to evaluate what's on the right.
Otherwise, if expression1 is false, expression2 is not evaluated but in this case expression3 is.
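You can see the short-circuit behaviour directly with the true and false commands (a trivial sketch):
true  && echo "left side succeeded" || echo "left side failed"    # prints the first message
false && echo "left side succeeded" || echo "left side failed"    # prints the second message
# Caveat: a && b || c is not a strict if/else; if b itself failed, c would run as well.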
for entry in "/home/loc/etc/"/*
do
if [ -s "$entry" ]
then
echo "$entry File is available"
else
echo "$entry File is not available"
fi
done
Hope it helps
The following script will wait until the file named by its argument exists, so you can then go on to the next process:
cat > waitfor.csh
#!/bin/csh
while !( -e $1 )
sleep 10m
end
ctrl+D
Here -e tests for the existence of a file,
$1 is the first argument passed to the script,
and sleep pauses for 10 minutes on each iteration.
You can execute the script with ./waitfor.csh ./temp ; echo "the file exists"
One-liner to check whether a file exists or not:
awk 'BEGIN {print getline < "file.txt" < 0 ? "File does not exist" : "File Exists"}'
Wildcards aren't expanded inside quoted strings. And when a wildcard is expanded, it is returned unchanged if there are no matches; it doesn't expand into an empty string. Try:
output="$(ls home/edward/bank1/fiche/Test* 2>/dev/null)"
if [ -n "$output" ]
then echo "Found one"
else echo "Found none"
fi
If the wildcard expanded to filenames, ls will list them on stdout; otherwise it will print an error on stderr, and nothing on stdout. The contents of stdout are assigned to output.
if [ -n "$output" ] tests whether $output contains anything.
Another way to write this would be:
if [ $(ls home/edward/bank1/fiche/Test* 2>/dev/null | wc -l) -gt 0 ]
The following script works fine on one server, but on the other it gives an error.
#!/bin/bash
processLine(){
line="$#" # get the complete first line which is the complete script path
name_of_file=$(basename "$line" ".php") # seperate from the path the name of file excluding extension
ps aux | grep -v grep | grep -q "$line" || ( nohup php -f "$line" > /var/log/iphorex/$name_of_file.log & )
}
FILE=""
if [ "$1" == "" ]; then
FILE="/var/www/iphorex/live/infi_script.txt"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists. Script will terminate now."
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not be read. Script will terminate now."
exit 2
fi
fi
# read $FILE using the file descriptors
# $ifs is a shell variable. Varies from version to version. known as internal file seperator.
# Set loop separator to end of line
BACKUPIFS=$IFS
#use a temp. variable such that $ifs can be restored later.
IFS=$(echo -en "\n")
exec 3<&0
exec 0<"$FILE"
while read -r line
do
# use $line variable to process line in processLine() function
processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
IFS=$BAKCUPIFS
exit 0
I am just trying to read a file containing the paths of various scripts, check whether those scripts are already running, and run them if not. The file /var/www/iphorex/live/infi_script.txt is definitely present. I get the following error on my amazon server -
[: 24: unexpected operator
infinity.sh: 32: cannot open : No such file
Thanks for your help in advance.
You should just initialize FILE with
FILE=${1:-/var/www/iphorex/live/infi_script.txt}
and then skip the existence check. If the file does not exist or is not readable, the exec 0< will fail with a reasonable error message (there's no point in trying to guess what the error message will be; just let the shell report the error).
I think the problem is that the shell on the failing server does not like "==" in the equality test. (Many implementations of test only accept one '=', but I thought even older bash had a builtin that accepted '==', so I might be way off base.)
I would simply eliminate your lines from FILE="" down to the end of the existence check and replace them with the assignment above, letting the shell's standard default mechanism work for you.
Note that if you do eliminate the existence check, you'll want to either add
set -e
near the top of the script, or add a check on the exec:
exec 0<"$FILE" || exit 1
so that the script does not continue if the file is not usable.
For bash (and ksh and others), you want [[ "$x" == "$y" ]] with double brackets. That uses the shell's built-in expression handling. A single bracket invokes the test command (or the shell's POSIX test builtin), which is probably barfing on the ==.
Also, you can use [[ -z "$x" ]] to test for zero-length strings, instead of comparing to the empty string. See "CONDITIONAL EXPRESSIONS" in your bash manual.
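If you would rather keep an explicit test and stay portable across both servers, here is a sketch of the same default-value logic using only POSIX test syntax:
#!/bin/sh
if [ -z "$1" ]; then                  # or: [ "$1" = "" ] with a single =
    FILE="/var/www/iphorex/live/infi_script.txt"
else
    FILE="$1"
fi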