I am trying to loop through files in a specified directory. But I can't seem to figure out the logic. I am looping through each file and asking if they want to delete that file.
#!/bin/bash
dirpath=$1
y=y
Y=Y
echo "changing directory '$dirpath' `cd $dirpath`"
for f in $1/*
do
    #####################################
    if test -f `ls -1 $1`
    then
        echo -n "remove file '$f' `ls -1` ?"
        read answer
        ##########################
        if test $answer = $y || test $answer = $Y
        then
            echo "Processing $f file..."
            echo `rm $f`
            echo "file '$f' deleted "
        else
            echo "file '$f' not removed"
        fi #2nd if
        ############################
    else
        echo 'not a file'
    fi #1st if
    #######################################
done
Your code seems much more complicated than it should be. Does this fulfill your needs, or are you doing this as shell practice?
rm -iv DIRECTORY/*
There's no need for ls, you already have the filename. Change this:
if test -f `ls -1 $1`
to:
if test -f "$f"
Why are you using echo and backticks here? Change
echo `rm $f`
to:
rm "$f"
Here's another place you're using backticks unnecessarily. Change this:
echo "changing directory '$dirpath' `cd $dirpath`"
to:
echo "changing directory '$dirpath'"
cd "$dirpath"
Always quote variables that contain filenames.
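For example, here is a quick illustration of what goes wrong without quotes (the filename is hypothetical, purely for illustration):
f="my file.txt"      # hypothetical filename containing a space
test -f $f           # expands to: test -f my file.txt  -- two arguments, so the test errors out
test -f "$f"         # tests the single name "my file.txt", as intended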
You can have rm do the "asking" for you via its -i flag, which prompts the user before each removal. I am assuming you want to consider only files, not directories, and not recurse into any sub-directories.
#!/bin/bash
for f in "$1"/*; do
    if [ -f "$f" ]; then
        rm -i "$f"
    fi
done
Without the exact error it's hard to help, but the whole thing could be written much less verbosely:
rm -i *
If $1 is a relative path, then once you've cd'd into $1, the wildcard in your for loop will be meaningless. I'd recommend something more like
cd "$1"
for f in *; do
    ...
done
This way it accepts both relative and absolute paths.
Moreover, the arguments to the first test are wrong. Each time through the loop, $f holds one filename, so your test should look like
if (test -f "$f"); then
You repeat the same mistake in your echo arguments.
The following does basically what you want, with only slight modifications from your script.
#!/bin/bash
dirpath=$1
y=y
Y=Y
echo "changing directory '$dirpath'"
cd "$dirpath" || exit 1
for f in ./*; do
    if (test -f "$f"); then
        echo -n "remove file '$f' ? "
        read answer
        if (test "$answer" = "$y") || (test "$answer" = "$Y"); then
            echo "Processing $f file..."
            rm "$f"
            echo "file '$f' deleted"
        else
            echo "file '$f' not removed"
        fi
    else
        echo 'not a file'
    fi
done
I'm having the hardest time finding out how to do a fairly simple task successfully:
Archive files and folders to a different location (e.g. cp -rp /source/file /destination/file)
Remove them once copied (rm -rf /source/file)
But only if they are 30 days old or older, which I know calls for find with -mtime. But I do not know how to make the following code work with find:
#!/bin/bash
#functions
help()
{
    echo "archiveer <doelmap>
archiveer /home/goedvoorbeeld/testmap
This command archives backed-up users that are more than 30 days old. Only the most recent data is kept.
The doelmap (target directory) parameter is required.";
    exit
}
doemaar()
{
    echo "Running doemaar"
    for f in *; do
        if [[ "${f}" -nt "${doelmap}${f}" ]]
        then
            echo "file $f is newer or not present, backing it up"
            cp -rp "$f" "${doelmap}${f}"
            rm -rf "$f"
        else
            echo "file $f is older"
        fi
    done
}
#check for parameters
if [[ $1 = "help" ]]
then
    help
elif [[ $1 = "" ]]
then
    echo "Missing parameter!"
    help
fi
#set variables
doelmap=$1
#the script starts running
echo "${doelmap}" && sleep 1
if [ ! -d "${doelmap}" ]
then
    mkdir -p "${doelmap}"
    echo "${doelmap} has been created"
else
    echo "doelmap already exists"
fi
doemaar
echo "Script end"
Note that * can cause problems if filenames have spaces or special characters in them. That is why I always redirect any listing to a file and work from that. Also, when the expansion of * is passed to an external command, it might exceed shell argument limits, causing other issues.
Also, cp -rp is recursive, so it copies directory contents as well. But earlier you stated that you were only looking at files, so the assumption here is that there are no sub-directories under the directory where you are looking for old files.
I would process the file archiving first, then check the parent directories and purge any directories left empty after the archiving (see the sketch after the code below). This is a safety measure in case of power loss.
You could consider adapting the following into your script:
cd "${LIVE_DIR}"
TOPIC=`basename "${LIVE_DIR}" `
find . -type f -mtime 30+ -print >joblist.files
mkdir "${ARCHIVE_ROOT}/${TOPIC}"
if [ $? -ne 0 ] ; then echo "unable to create dir" ; exit 1 ; fi
while read file
do
ARC_FILE="${ARCHIVE_ROOT}/${TOPIC}/${file}"
cp -pv "${file}" "${ARC_FILE}"
if [ $? -eq 0 ]
then rm -f "${file}"
if [ $? -ne 0 ]
then echo "PURGE|rm -f \"${file}\"" >&2
fi
else rm -fv "${ARC_FILE}"
echo "ARCHIVE|cp -pv \"${file}\" \"${ARC_FILE}\"" >&2
fi
echo "LAST|${file}" >joblist.last
done <joblist.files 2>joblist.failed
joblist.failed contains any cleanup that needs to be performed, commands that need to be repeated to complete the initial job.
joblist.last contains the last file processed if you need to kill the job and keep track of where you left off.
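For the empty-directory purge mentioned above, a minimal sketch (assuming GNU find and the same LIVE_DIR as in the snippet) could be run after the archive loop:
# Remove any directories under LIVE_DIR that the archiving left empty, deepest first
find "${LIVE_DIR}" -mindepth 1 -depth -type d -empty -exec rmdir {} \;
Because -depth visits children before their parents, this should also catch parent directories that become empty once their children are removed.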
Sorry for asking this question again. I have already received an answer that uses find, but unfortunately I need to write this without using any external commands like find.
I am trying to write a script that will loop recursively through the subdirectories of the current directory. It should check the file count in each directory; if the count is greater than 10, it should write the names of those files to a file named "BigList", otherwise to a file named "ShortList". The output should look like:
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
My script only works if the subdirectories don't themselves contain subdirectories.
I am confused because it doesn't work as I expect.
Here is my script:
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
    path=$1;
else
    path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
    local files_list=""
    local cnt=0
    for i in "$1"/*; do
        if [ -d "$i" ]; then
            echo "dir: $i"
            parent_dir=$i
            echo before recursion
            loop_folder_recurse "$i"
            echo after recursion
            if [ $cnt -ge 10 ]; then
                echo -e "---"$parent_dir >> BigList
                echo -e $file_list >> BigList
            else
                echo -e "---"$parent_dir >> ShortList
                echo -e $file_list >> ShortList
            fi
        elif [ -f "$i" ]; then
            echo file $i
            if [ $cur_fol != $main_pwd ]; then
                file_list+=$i'\n'
                cnt=$((cnt + 1))
            fi
        fi
    done
}
echo "Base path: $path"
loop_folder_recurse $path
How can I modify my script to produce the desired output?
This bash script produces the output that you want:
#!/bin/bash
bigfile="$PWD/BigList"
shortfile="$PWD/ShortList"
shopt -s nullglob
loop_folder_recurse() {
    (
        [[ -n "$1" ]] && cd "$1"
        for i in */; do
            [[ -d "$i" ]] && loop_folder_recurse "$i"
            count=0
            files=''
            for j in "$i"*; do
                if [[ -f "$j" ]]; then
                    files+="${j#"$i"}"$'\n'
                    ((++count))
                fi
            done
            if ((count > 10)); then
                outfile="$bigfile"
            else
                outfile="$shortfile"
            fi
            echo "---${i%/}" >> "$outfile"
            echo "$files" >> "$outfile"
        done
    )
}
loop_folder_recurse
Explanation
shopt -s nullglob is used so that when a directory is empty, the loop will not run. The body of the function is within ( ) so that it runs within a subshell. This is for convenience, as it means that the function returns to the previous directory when the subshell exits.
Hopefully the rest of the script is fairly self-explanatory but if not, please let me know and I will be happy to provide additional explanation.
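If it helps, here is a tiny demo of what nullglob changes (a sketch using a throwaway empty directory):
mkdir -p /tmp/nullglob_demo && cd /tmp/nullglob_demo
shopt -u nullglob
for f in *; do echo "without nullglob: $f"; done   # prints the literal pattern: *
shopt -s nullglob
for f in *; do echo "with nullglob: $f"; done      # no match, so the loop body never runs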
OK, so, I'm trying to create a script that...
a. Will look in a folder where I drop my txt notes, and if any of them starts with "a2c_" it will recognize it as an "aria2c download list". Which brings us to...
b. Will pass the first of the matching files to aria2c, together with a directory name similar to the txt file, so that it will download each URL found in the txt to the same-named directory.
I've ended up with this..:
#!/bin/bash
#PARAMETERS: _______________________________
workingdir="/home/username/downloads/"
#___________________________________________
echo Working dir is: $workingdir
mkdir -p $workingdir
echo "Making dir"
cd $workingdir
if [ -f a2c_* ]
then
    echo "Found files"
    mkdir -p !a2c_downloaded
    echo "Making !a2c_downloaded dir"
    counter=0
    echo "Set counter to 0"
    for f in a2c_*.txt
    do
        counter=$((counter+1))
        echo Download List File is: $f
        echo $counter file processing.
        tempfile=${f%%.*}
        tempfile="`echo "$tempfile" | sed ' s/a2c_//' `"
        downdir=$tempfile
        echo Download Dir is: $downdir
        mkdir -p $downdir
        echo ___________________________________________
        echo $endfilename
        aria2c --auto-file-renaming -i $f -d $downdir --force-sequential
        echo "Will download $f to $downdir"
        sleep 5
        mv $f !a2c_downloaded/
    done
else
    echo "No files found"
fi
...that worked when I tested it. Today, one day later, I threw some a2c_*.txt files into the dir and was met with "unexpected operator" errors. Any ideas? And is there an easier way to accomplish what I'm trying to do?
Thanks.
_UPDATED: _________________________________
#!/bin/bash
#PARAMETERS: _______________________________
workingdir="/home/username/downloads/"
MatchPattern="a2c_*"
#___________________________________________
echo Working dir is: $workingdir
mkdir -p $workingdir
echo "Making dir"
cd $workingdir
#if [ -f a2c_* ];
#if [ find . -maxdepth 1 -type f -name "a2c_*.txt" 2>/dev/null | grep "a2c_*" ]
#if [ -f a2c_* ]
#if [ "$?" = "0" ];
echo "Match Pattern set as $MatchPattern"
echo "Now looking in $workingdir for $MatchPattern"
echo "Manual list:"
echo "_________________________________________"
MatchList=$(ls -1 "$MatchPattern")
echo "$MatchList"
echo "_________________________________________"
if ls -1 $MatchPattern >/dev/null 2>&1
then
    echo "Found files"
    mkdir -p !a2c_downloaded
    echo "Making !a2c_downloaded dir"
    counter=0
    echo "Set counter to 0"
    for f in a2c_*.txt
    do
        counter=$((counter+1))
        echo "Download List File is: $f"
        echo $counter file processing.
        tempfile=${f%%.*}
        tempfile="`echo "$tempfile" | sed ' s/a2c_//' `"
        downdir=$tempfile
        echo "Download Dir is: $downdir"
        mkdir -p "$downdir"
        echo ___________________________________________
        echo "$endfilename"
        aria2c --auto-file-renaming -i "$f" -d "$downdir" --force-sequential
        echo "Will download $f to $downdir"
        sleep 5
        mv "$f" !a2c_downloaded/
    done
else
    echo "No files found"
fi
...The above works, after some fixes that http://www.shellcheck.net/ told me to make. The problem is, it advises me to use double quotes on this line:
if ls -1 $MatchPattern >/dev/null 2>&1
like
if ls -1 "$MatchPattern" >/dev/null 2>&1
..."Double quote to prevent globbing and word splitting.", but when I do, the script stops working for me. Should I leave it as it is? It seems to be working fine - for now.
Instead of
if [ -f a2c_* ]
then
You can try this:
file_exists() {
    for _i do
        [ -f "$_i" ] && break
    done
}
and then
if file_exists a2c_*
then
I have a text file (ListOfAllFiles.txt) that has a list of 500 files, some of which exist and some of which don't.
I'd like to make two text files that indicate which files exist and which don't.
This is my code thus far:
#!/bin/bash
for f in $(cat /path/to/ListOfAllFiles.txt)
do
    if [[ -f $f ]]; then
        echo $f > /path/to/FilesFound.txt
    else
        echo $f > /path/to/FilesNOTFound.txt
    fi
done
What am I doing wrong??
Your biggest problem is that each pass through the loop will overwrite either /path/to/FilesFound.txt or /path/to/FilesNOTFound.txt; instead of using >, you should be using >>. Fixing that, and making other improvements for robustness, we get:
#!/bin/bash
echo -n > /path/to/FilesFound.txt    # reset to empty file
echo -n > /path/to/FilesNOTFound.txt # reset to empty file
while IFS= read -r f ; do
    if [[ -f "$f" ]]; then
        echo "$f" >> /path/to/FilesFound.txt
    else
        echo "$f" >> /path/to/FilesNOTFound.txt
    fi
done < /path/to/ListOfAllFiles.txt
How does one test for the existence of files in a directory using bash?
if ... ; then
    echo 'Found some!'
fi
To be clear, I don't want to test for the existence of a specific file. I would like to test if a specific directory contains any files.
I went with:
(
    shopt -s dotglob nullglob
    existing_files=( ./* )
    if [[ ${#existing_files[@]} -gt 0 ]] ; then
        some_command "${existing_files[@]}"
    fi
)
Using the array avoids race conditions from reading the file list twice.
From the man page:
-f file
True if file exists and is a regular file.
So:
if [ -f someFileName ]; then echo 'Found some!'; fi
Edit: I see you already got the answer, but for completeness, you can use the info in Checking from shell script if a directory contains files - and lose the dotglob option if you want hidden files ignored.
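To make the dotglob part concrete, a small sketch (assuming a directory whose only entry is a hidden file such as .config):
shopt -s nullglob
shopt -u dotglob
files=( ./* ); echo "${#files[@]}"   # 0 -- hidden files are not matched
shopt -s dotglob
files=( ./* ); echo "${#files[@]}"   # 1 -- ./.config is now included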
I typically just use a cheap ls -A to see if there's a response.
Pseudo-maybe-correct-syntax-example-ahoy:
if [[ $(ls -A "$my_directory_path_variable") ]]; then ...
edit, this will work:
shopt -s nullglob; myDir=(./*); if [ ${#myDir[@]} -gt 0 ]; then echo "there's something down here"; fi
You can use ls in an if statement thus:
if [[ "$(ls -a1 | egrep -v '^\.$|^\.\.$')" = "" ]] ; then echo empty ; fi
or, thanks to ikegami,
if [[ "$(ls -A)" = "" ]] ; then echo empty ; fi
or, even shorter:
if [[ -z "$(ls -A)" ]] ; then echo empty ; fi
These basically list all files in the current directory (including hidden ones) that are neither . nor ...
If that list is empty, then the directory is empty.
If you want to discount hidden files, you can simplify it to:
if [[ "$(ls)" = "" ]] ; then echo empty ; fi
A bash-only solution (no invoking external programs like ls or egrep) can be done as follows:
emp=Y; for i in *; do if [[ $i != "*" ]]; then emp=N; break; fi; done; echo $emp
It's not the prettiest code in the world, it simply sets emp to Y and then, for every real file, sets it to N and breaks from the for loop for efficiency. If there were zero files, it stays as Y.
Try this
if [ -f /tmp/foo.txt ]
then
    echo the file exists
fi
ref: http://tldp.org/LDP/abs/html/fto.html
How about this for checking whether the directory is empty or not:
$ find "/tmp" -type f -exec echo Found file {} \;
#!/bin/bash
if [ -e "$1" ]; then
    echo "File exists"
else
    echo "File does not exist"
fi
I don't have a good pure sh/bash solution, but it's easy to do in Perl:
#!/usr/bin/perl
use strict;
use warnings;
die "Usage: $0 dir\n" if scalar @ARGV != 1 or not -d $ARGV[0];
opendir my $DIR, $ARGV[0] or die "$ARGV[0]: $!\n";
my @files = readdir $DIR;
closedir $DIR;
if (scalar @files == 2) { # . and ..
    exit 0;
}
else {
    exit 1;
}
Call it something like emptydir and put it somewhere in your $PATH, then:
if emptydir dir ; then
    echo "dir is empty"
else
    echo "dir is not empty"
fi
It dies with an error message if you give it no arguments, two or more arguments, or an argument that isn't a directory; it's easy enough to change if you prefer different behavior.
# tested on Linux bash
# Note: on most filesystems a directory's link count is 2 plus one per subdirectory,
# so this checks for subdirectories rather than regular files.
directory=$1
if test $(stat -c %h "$directory") -gt 2
then
    echo "not empty"
else
    echo "empty"
fi
For fun:
if ( shopt -s nullglob ; perl -e'exit !@ARGV' ./* ) ; then
    echo 'Found some!'
fi
(Doesn't check for hidden files)