Unix Bash - Copy files from a source folder recursively to destination/<file_extension> (e.g. "txt") folders

This is my code; something in the rec_copy() function isn't working properly, probably this line:
cp $1/$f $HOME/$2/$dest
The extension-named folders are created in the destination folder, but the files are not copied there. Can you help me?
#!/bin/bash
if [ $# -ne 2 ]
then
echo "Usage: $0 <source> <destination>"
exit
fi
if [ ! -d $1 ]
then
echo "Source folder does not exist"
exit
fi
if [ -d $2 ]
then
rm -r $2
mkdir $2
else
mkdir $2
fi
extension=`ls -l $1 | grep -v "^d" | awk '{ print $10; }' | sed 's/^.*\.//g'`
for f in $extension
do
if [ ! -d $1/$f ]
then
mkdir $2/$f
fi
done
rec_copy(){
folder=`ls $1`
for f in $folder
do
dest=`echo "$f" | sed 's/.*\.//g'`
if [ -f $1/$f ]
then
cp $1/$f $HOME/$2/$dest
elif [ -d $1/$f ]
then
rec_copy $1/$f
fi
done
}
rec_copy $1

Here is the answer in case someone ever needs it:
#!/bin/bash
if [ $# -ne 2 ]
then
    echo "Usage: $0 <source> <destination>"
    exit
fi
if [ ! -d "$1" ]
then
    echo "The source folder does not exist"
    exit
fi
if [ -d "$2" ]
then
    # start with a clean destination
    rm -r "$2"
    mkdir "$2"
else
    mkdir "$2"
fi
# collect the distinct extensions of the top-level files
# (the file-name column of `ls -l` may differ between systems)
extension=`ls -l "$1" | grep -v "^d" | awk '{ print $10; }' | sed 's/^.*\.//g' | sort -u`
for f in $extension
do
    if [ ! -d "$1/$f" ]
    then
        mkdir "$2/$f"
    fi
done
rec_copy(){
    folder=`ls "$1"`
    for f in $folder
    do
        if [ -f "$1/$f" ]
        then
            # copy regular files into the destination subfolder
            # named after their extension
            dest=`echo "$f" | sed 's/.*\.//g'`
            cp "$1/$f" "$2/$dest"
        elif [ -d "$1/$f" ]
        then
            # recurse, passing the destination along this time
            rec_copy "$1/$f" "$2"
        fi
    done
}
rec_copy "./$1" "./$2"

Related

Counting directories and files in bash

Simple counting of dirs and files does not work. I am checking each file with the -f and -d flags.
Where is the problem?
LOCATION=$1
for FILE in $(ls $LOCATION | egrep '^.{0,3}$');
do
echo "$FILE"
if [ -f $FILE ]
then
echo "its a file"
fi
if [ -d $FILE ]
then
echo "its a dir"
fi
done
ls prints bare names, so the -f and -d tests look for them in the current directory rather than inside $LOCATION. Glob the directory instead, so the tests see real paths:
shopt -s dotglob # count hidden files too
for file in "$LOCATION/"*; do
    [[ -f $file ]] && ((f++))
    [[ -d $file ]] && ((d++))
done
echo "${d:-0} dirs"
echo "${f:-0} files"
This counts them without involving external utilities.
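If a recursive count is wanted instead, the same glob-based idea extends with bash's globstar option (bash 4 or later); this is a sketch, not part of the original answer:
shopt -s globstar dotglob   # ** recurses; dotglob includes hidden names
f=0; d=0
for entry in "$LOCATION"/**/*; do
    [[ -f $entry ]] && ((f++))
    [[ -d $entry ]] && ((d++))
done
echo "$f files, $d dirs"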

Passing ls command to a for loop breaks when trying to use command line arguments

I'm writing a simple script to list all files in a directory and state whether each entry is a file or a directory. If it is a directory, it outputs how many files are in that directory.
#!/bin/bash
for filename in $(ls)
do
if [ -f "$filename" ]
then
printf "$filename - file\n"
fi
if [ -d "$filename" ]
then
count=$(ls "$filename" | wc -l)
printf "$filename - directory $count files\n"
fi
done
This works perfectly fine, but if I try to pass a command line argument (a directory name) to ls then the script doesn't work. Does anyone know what causes this to break? Example below.
#!/bin/bash
for filename in $(ls $1)
do
if [ -f "$filename" ]
then
printf "$filename - file\n"
fi
if [ -d "$filename" ]
then
count=$(ls "$filename" | wc -l)
printf "$filename - directory $count files\n"
fi
done
$filename exists as directory/$filename, but you are checking for $filename in the current (./) directory.
You should check against directory/$filename:
if [ -z "$1" ]; then
    echo "USAGE: $(basename "$0") directory"
    exit
else
    directory=$1
fi
...
for ...
    if [ -f "$directory/$filename" ]; then
    ...
...
When you pass a directory as an argument, you shouldn't check $filename but $1/$filename.
If you are sure that there will always be an argument, you can use something like:
#!/bin/bash
for filename in $(ls "$1")
do
if [ -f "$1/$filename" ]
then
printf "$1/$filename - file\n"
fi
if [ -d "$1/$filename" ]
then
count=$(ls "$1/$filename" | wc -l)
printf "$1/$filename - directory $count files\n"
fi
done
If the directory argument is optional, you should check whether it was given and proceed accordingly. My suggestion:
#!/bin/bash
if [ -z "${1+x}" ]
then
    echo "No argument"
else
    echo "There is argument"
    cd "$1" || exit    # stop if the directory cannot be entered
fi
for filename in $(ls)
do
    if [ -f "$filename" ]
    then
        printf "$filename - file\n"
    fi
    if [ -d "$filename" ]
    then
        count=$(ls "$filename" | wc -l)
        printf "$filename - directory $count files\n"
    fi
done
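As an aside, the ls parsing can be dropped entirely by globbing the directory, which also survives names containing spaces. A rough sketch combining both cases above, where defaulting to "." is an assumption:
#!/bin/bash
# Sketch: list each entry of the given (or current) directory and, for
# directories, count their immediate children.
dir=${1:-.}
for path in "$dir"/*; do
    name=${path##*/}
    if [ -f "$path" ]; then
        printf '%s - file\n' "$name"
    elif [ -d "$path" ]; then
        count=$(ls "$path" | wc -l)
        printf '%s - directory %s files\n' "$name" "$count"
    fi
done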

Bash script loop through subdirectories and write to file

I have spent a lot of hours dealing with this problem and have no idea. I need to write a script that loops recursively through the subdirectories of the current directory and checks the file count in each directory. If the file count is greater than 10, it should write all the names of these files to a file named "BigList"; otherwise it should write them to "ShortList". The output should look like this:
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
My script only works if the subdirectories don't themselves contain subdirectories.
I am confused, because it doesn't work as I expect; writing this in any other programming language would take me less than 5 minutes.
Please help me solve this problem, because I have no idea how to do it.
Here is my script:
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
path=$1;
else
path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
local files_list=""
local cnt=0
for i in "$1"/*;do
if [ -d "$i" ];then
echo "dir: $i"
parent_dir=$i
echo before recursion
loop_folder_recurse "$i"
echo after recursion
if [ $cnt -ge 10 ]; then
echo -e "---"$parent_dir >> BigList
echo -e $file_list >> BigList
else
echo -e "---"$parent_dir >> ShortList
echo -e $file_list >> ShortList
fi
elif [ -f "$i" ]; then
echo file $i
if [ $cur_fol != $main_pwd ]; then
file_list+=$i'\n'
cnt=$((cnt + 1))
fi
fi
done
}
echo "Base path: $path"
loop_folder_recurse $path
I believe that this does what you want:
find . -type d -exec env d={} bash -c 'out=Shortlist; [ $(ls "$d" | wc -l) -ge 10 ] && out=Biglist; { echo "--$d"; ls "$d"; echo; } >>"$out"' ';'
If we want neither to count subdirectories toward the cut-off nor to list them in the output, use this version:
find . -type d -exec env d={} bash -c 'out=Shortlist; [ $(ls -p "$d" | grep -v "/$" | wc -l) -ge 10 ] && out=Biglist; { echo "--$d"; ls -p "$d"; echo; } | grep -v "/$" >>"$out"' ';'
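If a pure-bash version closer to the asker's function is preferred, a sketch along these lines should work; it assumes BigList and ShortList are written in the directory the script is started from, and it avoids the original's mix-up between files_list and file_list and its undefined cur_fol/main_pwd variables:
#!/bin/bash
# Sketch: recurse through subdirectories; for each directory, list its
# regular files in BigList (10 or more files) or ShortList (fewer).
loop_folder_recurse() {
    local dir=$1 file_list="" cnt=0 entry
    for entry in "$dir"/*; do
        if [ -d "$entry" ]; then
            loop_folder_recurse "$entry"      # counters are local per call
        elif [ -f "$entry" ]; then
            file_list+="${entry##*/}"$'\n'
            cnt=$((cnt + 1))
        fi
    done
    local out=ShortList
    [ "$cnt" -ge 10 ] && out=BigList
    {
        printf -- '---%s\n' "$dir"
        printf '%s' "$file_list"
    } >> "$out"
}
loop_folder_recurse "${1:-.}"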

How to find the filenames changed between two Linux directories by file contents, using md5sum followed by the diff command

I have two Linux directories, dir1 and dir2, with some files in both. Now I want a list of the filenames of files added and files deleted in dir2 as compared to dir1. The files should be compared by their contents. I am new to Linux bash scripting. Please help me.
Currently I am doing it like this:
find dir1 -iname *.c -o -iname *.h -o -iname *.prm | xargs -n1 md5sum > dir1.fingerprint.md5sum
find dir2 -iname *.c -o -iname *.h -o -iname *.prm | xargs -n1 md5sum > dir2.fingerprint.md5sum
cat dir1.fingerprint.md5sum | cut -d" " -f1 | sort -u > dir1.fingerprint
cat dir2.fingerprint.md5sum | cut -d" " -f1 | sort -u > dir2.fingerprint
diff -NrU 2 dir1.fingerprint dir2.fingerprint
I am getting the result as hashes, as shown below:
--- dir1.fingerprint 2013-03-08 11:57:24.421311354 +0530
+++ dir2.fingerprint 2013-03-08 11:57:34.901311856 +0530
@@ -1,3 +1,3 @@
-43551a78e0f5b0be4aec23fdab881e65
-4639647e4f86eb84987cd01df8245d14
4c9cc7c6332b4105197576f66d1efee7
+9f944e70cb20b275b2e9b4f0ee26141a
+d41d8cd98f00b204e9800998ecf8427e
I want the result as the filenames of files modified or newly added in dir2. How do I get this? Please help me.
Try this script with the arguments dir2 and dir1:
#!/bin/sh
# Usage: <script> <path-in-dir2> <path-in-dir1>; recurses when given directories.
if [ "x$1" = "x" ]
then
    exit 0
fi
if [ "x$2" = "x" ]
then
    exit 0
fi
#echo "DIFF $1 $2"
if [ -f "$1" ]
then
    if [ -e "$2" ]
    then
        diff "$1" "$2" >/dev/null
        if [ "$?" != "0" ]
        then
            echo "DIFFERENT $1"
        fi
    fi
    exit 0
fi
if [ "x`ls "$1"`" != "x" ]
then
    for f in `ls "$1"`
    do
        $0 "$1/$f" "$2/$f"
    done
fi
exit 0
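A hypothetical invocation, assuming the script is saved as cmpdirs.sh and made executable (the recursion relies on calling the script by its own path):
./cmpdirs.sh dir2 dir1
It prints DIFFERENT <file> for every file in dir2 whose counterpart exists in dir1 with different content.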
EDIT:
if [ "x`ls $1`" != "x" ]
then
for f in `ls $1`
do
if [ -f $1/$f ]
then
for g in `ls $2`
do
if [ -f $2/$g ]
then
diff $1/$f $2/$g >/dev/null
if [ "$?" == "0" ]
then
echo "SAME CONTENT $1/$f $2/$g"
fi
fi
done
fi
done
fi
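As a side note, the original md5sum listing can be made to report filenames directly by keeping each hash next to its relative path and diffing the two listings; a sketch, assuming GNU coreutils and the same relative layout in both trees:
# Hash by relative path so the two listings line up.
( cd dir1 && find . -type f \( -iname '*.c' -o -iname '*.h' -o -iname '*.prm' \) -exec md5sum {} + | sort -k 2 ) > dir1.md5
( cd dir2 && find . -type f \( -iname '*.c' -o -iname '*.h' -o -iname '*.prm' \) -exec md5sum {} + | sort -k 2 ) > dir2.md5
# Lines only in dir2.md5 are files added or changed in dir2;
# lines only in dir1.md5 are files removed or changed from dir1.
diff dir1.md5 dir2.md5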

I cannot seem to run this properly... It gets stuck and does not display any output

Here's my script:
while [[ $startTime -le $endTime ]]
do
thisfile=$(find * -type f | xargs grep -l $startDate | xargs grep -l $startTime)
fordestination=`cut -d$ -f2 $thisfile | xargs cut -d ~ -f4`
echo $fordestination
startTime=$(( $startTime + 1 ))
done
I think your cut and grep commands could get stuck. You should probably make sure that their file arguments aren't empty, by using [ -n "$string" ] to check that $string isn't empty. If it were empty, no files would be passed to the following command, which would then wait for input from the terminal instead (for example, if $string is empty and you run grep regex $string, grep receives no input files and waits for input on stdin). Here's a "complex" version that tries to show where things could go wrong:
while [[ $startTime -le $endTime ]]
do
    thisfile=$(find * -type f)
    if [ -n "$thisfile" ]; then
        thisfile=$(grep -l $startDate $thisfile)
        if [ -n "$thisfile" ]; then
            thisfile=$(grep -l $startTime $thisfile)
            if [ -n "$thisfile" ]; then
                thisfile=`cut -d$ -f2 $thisfile`
                if [ -n "$thisfile" ]; then
                    fordestination=`cut -d ~ -f4 $thisfile`
                    echo $fordestination
                fi
            fi
        fi
    fi
    startTime=$(( $startTime + 1 ))
done
And here's a simpler version:
while [[ $startTime -le $endTime ]]
do
thisfile=$(grep -Rl $startDate *)
[ -n "$thisfile" ] && thisfile=$(grep -l $startTime $thisfile)
[ -n "$thisfile" ] && thisfile=`cut -d$ -f2 $thisfile`
[ -n "$thisfile" ] && cut -d ~ -f4 $thisfile
startTime=$(( $startTime + 1 ))
done
The "-R" tells grep to search files recursively, and the && tells bash to only execute the command that follows it if the command before it succeeded, and the command before the && is the test command (used in ifs).
Hope this helps =)
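A quick way to reproduce the hang the answer describes (the empty variable is hypothetical):
thisfile=""                       # suppose the earlier grep matched nothing
grep -l "$startTime" $thisfile    # no file operands expand, so grep waits
                                  # for stdin and the loop appears stuck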
