for i in `cat ${DIR}/${INDICATOR_FLIST}`
do
    X=`ls $i|wc -l`
    echo $X
    echo $X >> txt.txt
done
I have code like this to check whether a file is present in a directory or not, but it is not working and gives an error like this:
not foundtage001/dev/BadFiles/RFM_DW/test_dir/sales_crdt/XYZ.txt
You can see there is no space between "not found" and the file path.
It seems that you're trying to read a list of files from ${DIR}/${INDICATOR_FLIST} and then determine whether those files actually exist. The main problem is:
You're trying to parse the output of ls to figure out whether each file exists.
This is what produces the sort of output you see: a mix of stderr and stdout. Use the test operator instead, as shown below.
The following will tell you whether each file exists or not:
while read line; do
    [ -e "${line}" ] && echo "${line} exists" || echo "${line} does not exist"
done < "${DIR}/${INDICATOR_FLIST}"
Your file ${INDICATOR_FLIST} has CRLF line terminators (DOS-style). You need to strip out the CR characters, as Unix convention is for LF-only line terminators.
You can tell this by the way "not found" is printed at the start of the line. The immediately preceding character (the last char of the filename) is a CR, which sends the cursor back to the start of the line.
Find a dos2unix utility, or run tr -d \\015 over it (this deletes all CRs indiscriminately).
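For example, a minimal sketch, assuming the list file lives at ${DIR}/${INDICATOR_FLIST} as in the question (the temporary file name is my own):
# Strip DOS carriage returns (octal 015) so each line ends in LF only
tr -d '\015' < "${DIR}/${INDICATOR_FLIST}" > "${DIR}/${INDICATOR_FLIST}.unix" &&
    mv "${DIR}/${INDICATOR_FLIST}.unix" "${DIR}/${INDICATOR_FLIST}"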
Maybe the location of your file isn't right. Take this example:
m:~ tr$ echo "1 2 3 4 5" > file.txt
m:~ tr$ cat file.txt
1 2 3 4 5
m:~ tr$ for i in `cat file.txt`;do echo $i ;done
1
2
3
4
5
m:~ tr$
You could print the file location before the for loop and check whether the file exists:
echo "location : ${DIR}/${INDICATOR_FLIST}"
if [ -e "${DIR}/${INDICATOR_FLIST}" ]; then echo "file exists"; else echo "file was not found"; fi
for i in `cat ${DIR}/${INDICATOR_FLIST}`
do
    let "X=`ls $i|wc -l`"
    echo $X
    echo $X >> txt.txt
done
The proper syntax is:
for i in `cat data-file`
do
echo $i
done
which you are following, so the problem must be in the values of $DIR and $INDICATOR_FLIST; double-check the location of your file.
I've seen several answers on SO about how to append to a file if it exists and create a new file if it doesn't (echo "hello" >> file.txt) or overwrite a file if it exists and create one if it doesn't (echo "hello" > file.txt).
But how do I make sure that echo "hello" only works and appends to the file if it already exists and raises an error if it doesn't?
EDIT: Right now, I'm already checking for the file using [ -f file.txt ]. I was wondering if there's a way in which I could simply use echo.
Assuming the file is either nonexistent or both readable and writable, you can try to open it for reading first to determine whether it exists or not, e.g.:
command 3<file 3<&- >>file
3<&- may be omitted in most cases as it's unexpected for a program to start reading from file descriptor 3 without redirecting it first.
Proof of concept:
$ echo hello 3<file 3<&- >>file
bash: file: No such file or directory
$ ls file
ls: cannot access 'file': No such file or directory
$ touch file
$ echo hello 3<file 3<&- >>file
$ cat file
hello
$
This works because redirections are processed from left to right, and a redirection error causes the execution of a command to halt. So if file doesn't exist (or is not readable), 3<file fails, the shell prints an error message and stops processing this command. Otherwise, 3<&- closes the descriptor (3) associated with file in previous step, >>file reopens file for appending and redirects standard output to it.
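If you use this trick often, it can be wrapped in a small helper; a minimal sketch (the name appendOnly is my own, not from the original answer):
# Append stdin to "$1" only if "$1" already exists and is readable.
appendOnly() {
    # 3<"$1" fails first when the file is missing, so cat never runs;
    # 3<&- then closes that descriptor and >>"$1" reopens the file for appending.
    cat 3<"$1" 3<&- >>"$1"
}
echo hello | appendOnly file.txt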
I think a simple if as proposed in the other answers would be best. However, here are some more exotic solutions:
Using dd
dd can do the check and redirection in one step
echo hello | dd conv=nocreat of=file.txt
Note that dd prints statistics to stderr. You can silence them by appending 2> /dev/null, but then the warning that the file does not exist goes missing too.
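As a side note (my addition, GNU dd only), status=none suppresses the statistics while still letting real error messages through:
# GNU dd only: no transfer statistics, but "failed to open" errors are still printed
echo hello | dd conv=nocreat of=file.txt status=none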
Using a custom Function
When you do this kind of redirection very often, a reusable function is appropriate. Some examples:
Run echo and redirect only if the file exists. Otherwise, bash raises the error -bash: $(...): ambiguous redirect.
ifExists() { [ -f "$1" ] && printf %s "$1"; }
echo hello >> "$(ifExists file.txt)"
Always run echo, but print a warning and discard the output if the file does not exist.
ifExists() {
    if [ -f "$1" ]; then
        printf %s "$1"
    else
        echo "File $1 does not exist. Discarding output." >&2
        printf /dev/null
    fi
}
echo hello >> "$(ifExists file.txt)"
Please note that ifExists cannot handle all file names. If you deal with very unusual filenames ending with newlines, then the subshell $( ...) will remove those trailing newlines and the resulting file will be different from the one specified. To solve this problem you have to use a pipe.
Always run echo, but print a warning and discard the output if the file does not exist. This variant uses a pipe instead of command substitution, so it also handles the unusual file names mentioned above:
appendIfExists() {
    if [ -f "$1" ]; then
        cat >> "$1"
    else
        echo "File $1 does not exist. Discarding output." >&2
        return 1
    fi
}
echo hello | appendIfExists file.txt
Just check:
if [ -f file.txt ]; then
    echo "hello" >> file.txt
else
    echo "No file.txt" >&2
    exit 1
fi
There's no way in bash to alter how >> works; it will always (try to) create a file if it doesn't already exist.
For example:
if [ -f "filename" ]; then
    echo "hello" >>filename
fi
I'm creating a bash script to read a file in line by line; the contents are later formatted and organised by name and then date. I cannot see why this code isn't working: no errors show up, even though I have tried the input and output filename variables on their own, with a directory finder, and with the export command.
export inputfilename="employees.txt"
export outputfilename="branch.txt"
directoryinput=$(find -name $inputfilename)
directoryoutput=$(find -name $outputfilename)
n=1
if [[ -f "$directoryinput" ]]; then
    while read line; do
        echo "$line"
        n=$((n+1))
    done < "$directoryoutput"
else
    echo "Input file does not exist. Please create a employees.txt file"
fi
All help is very much appreciated, thank you!
NOTE: As people noticed, I forgot to add the $ sign on the redirection into the loop, but that was just a mistake in copying my code; I do have the $ sign in my actual script and still get no result.
Reading a file in line by line with Bash
The best and most idiomatic way to read a file line by line is:
while IFS= read -r line; do
    # parse the line here
    printf '%s\n' "$line"
done < "file"
More on this topic can be found in the BashFAQ.
However, try not to read files in bash line by line; you can (OK, almost) always avoid reading a stream line by line in bash. Reading a file line by line in bash is extremely slow and shouldn't be done. For simple cases all the Unix tools, with the help of xargs or parallel, can be used; for more complicated ones, awk and GNU datamash.
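As a rough sketch of that idea, the loop from the question boils down to a single awk pass (using the $directoryoutput variable from the question):
# Print every line and report how many were read, without a bash while-read loop
awk '{ print } END { print "lines read: " NR }' "$directoryoutput"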
done < "directoryoutput"
The code is not working because you are passing, as standard input to your while read loop, the content of a file literally named directoryoutput. As no such file exists, your script fails.
directoryoutput=$(find -name $outputfilename)
One can simply feed the variable's value (with a newline appended) to the while read loop using a here-string construction:
done <<< "$directoryoutput"
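Put together, a minimal sketch of the corrected loop (variable names taken from the question):
n=1
while IFS= read -r line; do
    printf '%s\n' "$line"   # echo the line without mangling backslashes
    n=$((n+1))
done <<< "$directoryoutput"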
directoryinput=$(find -name $inputfilename)
if [[ -f "$directoryinput" ]]
This is OK as long as you have only one file named $inputfilename under your directory. Also, it makes little sense to find a file and then check for its existence. In the case of more files, find returns a newline-separated list of names, so a small check such as if [ "$(printf "$directoryinput" | wc -l)" -eq 1 ], or using find -name $inputfilename | head -n1, would I think be better.
while read line;
do
    echo "$line"
    n=$((n+1))
done < "directoryoutput"
The intention is pretty clear here. This is just:
n=$(<directoryoutput wc -l)
cat "directoryoutput"
Except that while read line removes leading and trailing whitespace and is IFS-dependent.
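A quick throwaway demonstration of that difference (the sample file content is my own):
printf '  indented line  \n' > directoryoutput
while read line; do printf '[%s]\n' "$line"; done < directoryoutput          # prints [indented line]
while IFS= read -r line; do printf '[%s]\n' "$line"; done < directoryoutput  # prints [  indented line  ]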
Also always remember to quote your variables unless you have a reason not to.
Have a look at shellcheck, which can find the most common mistakes in scripts.
I would do it more like this:
inputfilename="employees.txt"
outputfilename="branch.txt"

directoryinput=$(find . -name "$inputfilename")
# grep -c . counts non-empty lines, so an empty find result yields 0
directoryinput_cnt=$(printf "%s\n" "$directoryinput" | grep -c .)
if [ "$directoryinput_cnt" -eq 0 ]; then
    echo "Input file does not exist. Please create a '$inputfilename' file" >&2
    exit 1
elif [ "$directoryinput_cnt" -gt 1 ]; then
    echo "Multiple files named '$inputfilename' exist in the current path" >&2
    exit 1
fi

directoryoutput=$(find . -name "$outputfilename")
directoryoutput_cnt=$(printf "%s\n" "$directoryoutput" | grep -c .)
if [ "$directoryoutput_cnt" -eq 0 ]; then
    echo "Output file does not exist. Please create a '$outputfilename' file" >&2
    exit 1
elif [ "$directoryoutput_cnt" -gt 1 ]; then
    echo "Multiple files named '$outputfilename' exist in the current path" >&2
    exit 1
fi

cat "$directoryoutput"
n=$(<"$directoryoutput" wc -l)
I have the following script called test.sh:
echo "file path is : $1"
path=$1
while read -r line
do
    num=$($line | tr -cd [:digit:])
    echo num
done < $path
exit 0
I am attempting to grab the digit at the start of each line of the file stored as $path. The end result will be to loop over each line, grab the digit, and remove it from the file if it is less than 2.
Every time I run this loop I get the error "./test.sh: line 5: : command not found". What part of the while loop am I doing wrong? Or is it something to do with the tr command?
I can spot a few things wrong with your script:
#!/bin/bash
echo "file path is : $1"
path=$1
while read -r line
do
    num=$(tr -cd '[:digit:]' <<<"$line")   # use here string to "echo" variable to tr
    echo "$num"                            # added quotes and $
done < "$path"                             # added quotes, changed $dest to $path
In summary:
cmd <<<"$var" (a here string) is a bash feature designed as a replacement for echo "$var" | cmd; there is a short side-by-side example after this list. I added #!/bin/bash to the top of the script, as I am using this bash-only feature.
I have quoted your variables to prevent problems with word splitting and glob expansion.
I made the assumption that you really meant to use $path on the last line (though I may be wrong).
Finally, there's no need to exit 0 at the end of your script.
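For illustration, here is a side-by-side sketch of the two equivalent forms from the first point (the sample value is my own):
line="abc123def"
echo "$line" | tr -cd '[:digit:]'    # pipe form: prints 123
tr -cd '[:digit:]' <<<"$line"        # here-string form: prints 123, no pipeline needed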
What I have to do is edit a script given to me that will check whether the user has write permission for a file named journal-file in the user's home directory. The script should take appropriate action if journal-file exists and the user does not have write permission to the file.
Here is what I have written so far:
if [ -w $HOME/journal-file ]
then
    file=$HOME/journal-file
    date >> file
    echo -n "Enter name of person or group: "
    read name
    echo "$name" >> $file
    echo >> $file
    cat >> $file
    echo "--------------------------------" >> $file
    echo >> $file
    exit 1
else
    echo "You do not have write permission."
    exit 1
fi
When I run the script it prompts me to input the name of the person/group, but after I press Enter nothing happens. It just sits there, allowing me to continue typing, and doesn't continue past that part. Why is it doing this?
The statement:
cat >>$file
will read from standard input and write to the file. That means it will wait until you indicate end of file with something like CTRL-D. It's really no different from just typing cat at a command line: nothing happens until you enter something, and it keeps waiting until you indicate end of file.
If you're trying to append another file to the output file, you need to specify its name, such as cat $HOME/myfile.txt >>$file.
If you're trying to get a blank line in there, use echo rather than cat, such as echo >>$file.
You also have a couple of other problems, the first being:
date >> file
since that will try to create a file called file (in your working directory). Use $file instead.
The second is the exit code of 1 in the case where what you're trying to do has succeeded. That may not be a problem now, but someone using this at a later date may wonder why it always seems to indicate failure.
To be honest, I'm not really a big fan of the if ... then return else ... construct. I prefer fail-fast with less indentation and better grouping of output redirection, such as:
file=${HOME}/journal-file
if [[ ! -w ${file} ]] ; then
    echo "You do not have write permission."
    exit 1
fi
echo -n "Enter name of person or group: "
read name
(
    date
    echo "$name"
    echo
    echo "--------------------------------"
    echo
) >>${file}
I believe that's far more readable and maintainable.
It's this line:
cat >> $file
cat concatenates input from standard input (i.e., whatever you type) onto $file.
I think the part
cat >> $file
copies everything from stdin to the file. If you hit Ctrl+D (end of file), the script can continue.
1) You'd better first check whether the file exists or not:
[[ -e $HOME/journal-file ]] || \
{ echo "$HOME/journal-file does not exist"; exit 1; }
2) You have to change cat >> $file to whatever you want to do with the file. This is the command that is blocking the execution of the script.
I am trying to list the names of all the files in a directory, separated by a blank line. I was using a for loop, but after trying a few examples none really worked at adding blank lines in between. Any ideas?
Also, is there any command which outputs only the first line of a file in Unix? How could I display only the first line?
for i in ls
do
    echo "\n" && ls -l
done
for i in ls
do
    echo "\n"
    ls
done
Use head or sed 1q to display only the first line of a file. But in this case, if I'm understanding you correctly, you want to capture and modify the output of ls.
ls -l | while read f; do
    printf '%s\n\n' "$f"
    # alternately:
    # echo "$f"; echo
done
IFS="
"
for i in $(ls /dir/name/here/or/not)
do
    echo -e "$i\n"
done
To see the first part of a file use head and for the end of a file use tail (of course). The command head -n 1 filename will display the first line. Use man head to get more options. (I know how that sounds).
Use shell expansion instead of ls to list files.
for file in *
do
    echo "$file"
    echo
    if [ -f "$file" ]; then
        read firstline < "$file"
        echo "$firstline"   # read first line
    fi
done