I have a placeholder file and would like to overwrite it with the output of a command, but only if the output is not zero length. I guess I could do FOO="$(command)" and then test with [ -z "$FOO" ]. Is there a better way?
There are different ways, but I don't know about "better". You could block on a read and only set up the redirection once some data comes through:
cmd | { IFS= read -r j && { printf '%s\n' "$j"; cat; } > placeholder; }
(Note, if your command generates output but no newlines, this will ignore the data.)
If you don't need the output of the command, you can run: [ -z "$(command)" ] directly. For example, the following prints "empty":
#!/bin/sh
if [ -z "$(echo -n)" ]; then
echo "empty"
fi
Your example with -z will work, but you can also check if a variable is non-empty just with
[ "$var" ]
So, a simple solution could look like this:
#!/bin/sh
output="$( command )"
[ "$output" ] && echo "$output" > your_file.txt
If you are going to do this type of thing a lot, it is better to make a function (note that local and [[ ]] are bash features, so use a bash shebang):
write_if_non_zero() {
    local msg=$1
    local file=$2
    if [[ -n "$msg" ]]; then
        echo "$msg" > "$file"
    fi
}
Then
write_if_non_zero "$FOO" "$FILE"
I am trying to check whether a string matches a particular format in a shell script.
Below is the code I am trying and the output I want to get.
Format: <datatype>(length,length) | <datatype>(length,length)
I have multiple cases with this scenario: if both datatypes have () then it should pass, otherwise it should fail.
E.g. decimal(1,0)|number(11,0) should pass, but int|number(11,0) or decimal(1,0)|int should fail.
Code1:
INPUT='decimal(1,0)|number(11,0)'
sub="[A-Z][a-z]['!##$ %^&*()_+'][0-9][|][A-Z][a-z]['!##$ %^&*()_+'][0-9][|]"
if [ "$INPUT" == "$sub" ]; then
echo "Passed"
else
echo "No"
fi
Code 2:
INPUT='decimal(1,0)|number(11,0)'
sub="decimal"
if [ "$INPUT" == *"("*") |"*"("*") " ]; then
echo "Passed"
else
echo "No"
fi
Any help will be fine. Also note, I am very new to shell scripting.
Read both values into variables, first removing alphabetic characters, then check that both variables are non-empty:
result='FAIL'
input='int|number(6,10)'
IFS="|" read val1 val2 <<<"$(tr -d '[:alpha:]' <<<"$input")"
if [ -n "$val1" ] && [ -n "$val2" ]; then
result='PASS'
fi
echo "$result: val1='$val1' val2='$val2'"
Result:
FAIL: val1='' val2='(6,10)'
For input='decimal(8,9)|number(6,10)'
PASS: val1='(8,9)' val2='(6,10)'
That looks like just a simple regex to write.
INPUT='decimal(1,0)|number(11,0)'
if printf "%s" "$INPUT" | grep -qEx '[a-z]+\([0-9]+,[0-9]+\)(\|[a-z]+\([0-9]+,[0-9]+\))*'; then
echo "Passed"
else
echo "No"
fi
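To sanity-check it against the examples from the question, you can loop the same test over the three sample strings (nothing new here beyond the regex above):
for INPUT in 'decimal(1,0)|number(11,0)' 'int|number(11,0)' 'decimal(1,0)|int'; do
    if printf "%s" "$INPUT" | grep -qEx '[a-z]+\([0-9]+,[0-9]+\)(\|[a-z]+\([0-9]+,[0-9]+\))*'; then
        echo "$INPUT: Passed"
    else
        echo "$INPUT: No"
    fi
done
Only the first string passes; the other two fail because one side is missing its (length,length) part.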
I am trying to write a script that mimics the cat -n command. It is supposed to accept a filename as input and, if no input is given, print a usage statement.
This is what I have so far but I am not sure what I am doing wrong:
#!/bin/bash
echo "OK"
read filename
if [ $filename -eq 0 ]
then
echo "Usage: cat-n.sh file"
else
cat -n $filename
fi
I suggest using -z to check for an empty $filename, and quoting the variable so the test still has an operand when it is empty:
if [ -z "$filename" ]
See: help test
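Putting it together, a minimal sketch of the whole script could look like this (it takes the filename as the first argument instead of reading it interactively, which matches the usage message; keep read if you really want prompted input):
#!/bin/bash
filename=$1
if [ -z "$filename" ]
then
    echo "Usage: cat-n.sh file"
else
    cat -n "$filename"
fi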
I am new to bash scripting and I have to create this script that takes 3 directories as arguments and copies into the third one all the files from the first one that are NOT in the second one.
I did it like this:
#!/bin/bash
if [ -d $1 && -d $2 && -d $3 ]; then
for FILE in [ ls $1 ]; do
if ! [ find $2 -name $FILE ]; then
cp $FILE $3
done
else echo "Error: one or more directories are not present"
fi
The error I get when I try to execute it is: "line 7: syntax error near unexpected token `done' "
I don't really know how to make it work!
Also, even though I'm using #!/bin/bash I still have to explicitly call bash when trying to execute the script; otherwise it says that executing is not permitted. Does anybody know why?
Thanks in advance :)
A couple of suggestions:
There is no harm in double-quoting variables:
cp "$FILE" "$3" # prevents word splitting, handles filenames with spaces
The for statement fails for a fundamental reason, bad syntax; it should have been:
for FILE in $(ls "$1"); do
But then, never parse ls output:
for FILE in $(ls "$1"); do  # drastic: parsing ls breaks on unusual filenames
Instead of the for-loop in step 2, use a find-while-read combination (a filled-in sketch follows these suggestions):
find "$1" -type f -print0 | while IFS= read -r -d '' filename  # -type f selects regular files
do
#something with $filename
done
Use lowercase variable names in your script, since all-uppercase names are conventionally reserved for environment and shell variables.
Use tools like shellcheck to improve script quality.
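Putting suggestion 3 together with the copy-if-missing logic from your script, a rough sketch (not a drop-in solution; it assumes the first directory contains only files):
find "$1" -maxdepth 1 -type f -print0 | while IFS= read -r -d '' filename
do
    [ -e "$2/$(basename "$filename")" ] || cp "$filename" "$3"
done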
Edit
Since you have mentioned the input directories contain only files, my alternative approach would be
[[ -d "$1" && -d "$2" && -d "$3" ]] && for filename in "$1"/*
do
[ ! -e "$2/${filename##*/}" ] && cp "$filename" "$3"
done
If you are baffled by ${filename##*/}, look up shell parameter expansion.
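As a quick illustration of what that expansion does (the path here is just made up):
filename="some/dir/report.txt"
echo "${filename##*/}"    # strips everything up to the last /, printing: report.txt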
Sidenote: on Linux, although discouraged, it is not uncommon to have non-standard filenames such as 'file name' (with a space).
Courtesy: @chepner & @mklement0 for their comments that greatly improved this answer :)
Your script:
if ...; then
for ...; do
if ...; then
...
done
else
...
fi
Fixed structure:
if ...; then
for ...; do
if ...; then
...
fi # <-- missing
done
else
...
fi
If you want the script executable, then make it so:
$ chmod +x script.sh
Notice that you also have other problems in your script. It is better written as
dir1="$1"
dir2="$2"
dir3="$3"
for f in "$dir1"/*; do
if [ ! -f "$dir2/$(basename "$f")" ]; then
cp "$f" "$dir3"
fi
done
This is not quite correct:
for FILE in $(ls $1); do
< whatever you do here >
done
There is a big problem with that loop if in that folder there is a filename like this: 'I am a filename with spaces.txt'.
Instead of that loop try this:
for FILE in "$1"/*; do
echo "$FILE"
done
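One caveat worth mentioning as a sketch rather than a required change: if "$1" is empty, the glob expands to the literal pattern, so you may want to guard against that (or use shopt -s nullglob in bash):
for FILE in "$1"/*; do
    [ -e "$FILE" ] || continue    # skips the literal "$1"/* when the directory is empty
    echo "$FILE"
done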
Also you have to close every if statement with fi.
Another thing, if you are using BASH ( #!/usr/bin/env bash ), it is highly recommended to use double brackets in your test conditions:
if [[ test ]]; then
...
fi
For example:
$ a='foo bar'
$ if [[ $a == 'foo bar' ]]; then
> echo "it's ok"
> fi
it's ok
However, this:
$ if [ $a == 'foo bar' ]; then
> echo "it's ok";
> fi
bash: [: too many arguments
You forgot fi after the innermost if.
Additionally, neither square brackets nor find works that way. This one does what your script (as it is now) is intended to do, at least on my PC:
#!/bin/bash
if [[ -d "$1" && -d "$2" && -d "$3" ]] ; then
ls -1 "$1" | while read FILE ; do
ls "$2/$FILE" >/dev/null 2>&1 || cp "$1/$FILE" "$3"
done
else echo "Error: one or more directories are not present"
fi
Note that after a single run, when $2 and $3 refer to different directories, those files are still not present in $2, so the next time you run the script they will be copied once more even though they are already present in $3.
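If that re-copying matters to you, one possible tweak (a sketch, not part of the original answer) is to also skip files that already exist in "$3":
ls -1 "$1" | while IFS= read -r FILE ; do
    ls "$2/$FILE" >/dev/null 2>&1 && continue
    ls "$3/$FILE" >/dev/null 2>&1 && continue
    cp "$1/$FILE" "$3"
done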
I need a script to keep polling the "receive_dir" directory until "stopfile" gets written to it.
This has to keep running even when the directory is empty.
So far I have this, but it fails with "unary operator expected" if receive_dir is empty. Help!!
#!/usr/bin/ksh
until [ $i = stopfile ]
do
for i in `ls receive_dir`; do
time=$(date +%m-%d-%Y-%H:%M:%S)
echo $time
echo $i;
done
done
This will do what you ask for (loop until the stop file exists). I added a "sleep 1" to lower resource usage. It's also good practice to use "#!/usr/bin/env ksh" as the shebang.
#!/usr/bin/env ksh
until [ -e receive_dir/stopfile ]
do
    time=$(date +%m-%d-%Y-%H:%M:%S)
    echo "$time"
    sleep 1
done
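If you also want to report what is currently sitting in the directory on each pass, a sketch along the same lines (receive_dir is assumed to be relative to the working directory, as in the question):
#!/usr/bin/env ksh
until [ -e receive_dir/stopfile ]
do
    date +%m-%d-%Y-%H:%M:%S
    for f in receive_dir/*
    do
        [ -e "$f" ] && echo "$f"    # the -e guard skips the literal pattern when the directory is empty
    done
    sleep 1
done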
If the directory is empty, then
until [ $i = stopfile ]
is evaluated as
until [ = stopfile ]
which of course fails, with the "unary operator expected" error you are seeing.
One comment: Never parse output from ls.
#!/bin/bash
do_something() {
echo $(date +%m-%d-%Y-%H:%M:%S) "$1"
}
dir="."
until [[ -f "$dir/stopfile" ]]
do
find "$dir" -print0 | while IFS= read -r -d '' filename
do
do_something "$filename"
done
done
or (much slower)
do_something() {
echo $(date +%m-%d-%Y-%H:%M:%S) "$1"
}
export -f do_something
dir="."
until [[ -f "$dir/stopfile" ]]
do
find "$dir" -exec bash -c 'do_something "{}"' \;
done
You're evaluating nothing, and the 'test' isn't able to evaluate it.
~> [ $empty_var = stopfile ]
-bash: [: =: unary operator expected
First, don't parse ls:
http://mywiki.wooledge.org/BashPitfalls#for_i_in_.24.28ls_.2A.mp3.29
EDIT: Part of the issue is that your loop itself is doing the test; try something like this (assuming receive_dir is a relative path):
@user000001 is right; my original find example would suffer from the same issue, so use this:
for i in receive_dir/*
do
    time=$(date +%m-%d-%Y-%H:%M:%S)
    echo "$time"
    echo "$i"
    [ "$i" = receive_dir/stopfile ] && break
done
EDIT: Adding in another example based on your comment:
How about this...
FOUND="N"
while [ "${FOUND}" = "N" ]
do
for i in receive_dir/*
do
time=$(date +%m-%d-%Y-%H:%M:%S)
echo $time
echo $i
[ "$i" = stopfile ] && FOUND="Y"
done
sleep 60
done
Another option is to use inotifywait to monitor the status of the directory. For example, this script will run the loop until the file "stopfile" is touched.
until inotifywait "receive_dir" | grep "stopfile"
do
echo "running"
done
echo "done"
The advantage is that there is no busy loop, and that you don't have to repeatedly call the (potentially expensive) find command.
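If you only care about the stop file appearing (rather than any event in the directory), a possible refinement, assuming a reasonably recent inotify-tools, is to restrict the watched events:
until inotifywait -e create,moved_to "receive_dir" | grep "stopfile"
do
    echo "running"
done
echo "done"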
In a bash script, I have to check for the existence of several files.
I know an awkward way to do it, which is as follows, but that would mean that my main program has to be within that ugly nested structure:
if [ -f $FILE1 ]
then
if [ -f $FILE2 ]
then
echo OK
# MAIN PROGRAM HERE
fi
fi
The following version does not work:
([ -f $FILE1 ] && [ -f $FILE2 ]) || ( echo "NOT FOUND"; exit 1 )
echo OK
It prints
NOT FOUND
OK
Is there an elegant way to do this right?
UPDATE: See the accepted answer. In addition, in terms of elegance I like Jonathan Leffler's answer:
arg0=$(basename $0 .sh)
error()
{
echo "$arg0: $#" 1>&2
exit 1
}
[ -f $FILE2 ] || error "$FILE2 not found"
[ -f $FILE1 ] || error "$FILE1 not found"
How about
if [[ ! ( -f $FILE1 && -f $FILE2 ) ]]; then
echo NOT FOUND
exit 1
fi
# do stuff
echo OK
See help [[ and help test for the options usable with the [[ style tests.
Your version does not work because (...) spawns a new sub-shell, in which the exit is executed. It therefore only affects that subshell, not the executing script.
The following works instead, executing the commands between {...} in the current shell.
I should also note that you have to quote both variables to ensure there is no unwanted expansion or word splitting made (they have to be passed as one argument to [).
[ -f "$FILE1" ] && [ -f "$FILE2" ] || { echo "NOT FOUND"; exit 1; }
I think you're looking for:
if [ -f "$FILE1" -a -f "$FILE2" ]; then
echo OK
fi
See man test for more details on what you can put inside the [ ].
You can list the files and check them in a loop:
file_list='file1 file2 wild*'
for file in $file_list; do
[ -f "$file" ] || exit
done
do_main_stuff
I usually use a variant on:
arg0=$(basename $0 .sh)
error()
{
echo "$arg0: $#" 1>&2
exit 1
}
[ -f $FILE2 ] || error "$FILE2 not found"
[ -f $FILE1 ] || error "$FILE1 not found"
There's no particular virtue in making the shell script have a single exit point; no harm either, but fatal errors may as well terminate the script.
The only point of debate would be whether to diagnose as many problems as possible before exiting, or whether to just diagnose the first. On average, diagnosing just the first is a whole lot easier.
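If you do want to report every missing file before giving up, a sketch along those lines (deferring the exit until all the checks have run):
missing=0
for f in "$FILE1" "$FILE2"
do
    if [ ! -f "$f" ]; then
        echo "$arg0: $f not found" 1>&2
        missing=1
    fi
done
[ "$missing" -eq 0 ] || exit 1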