Check if a specific file in a folder exists in another folder - shell script

I'm starting shell scripting and I'm having trouble with a script.
So, I have two folders (it doesn't matter what kind of files are in them) and I need to check whether the files in folder1 exist in folder2. If they do, check whether their modification date is more recent.
This is what I have:
#!/bin/sh
for i in `find $1 -type f`
do
    for j in `find $2 -type f`
    do
        if [ -e $2/$i ]
        then
            if [ $i -ot $j ]
            then
                echo File "`basename $i`" its newer and it will be copied
            else
                echo File is updated
            fi
        else
            echo "`basename $i`" will be copied because it doesn't exist
        fi
    done
done
$1 and $2 are the folder arguments
Thanks in advance
EDIT:
With 3 files in folder1 and one of them (file2) also in folder2, I get this output:
file1 will be copied because it doesn't exist
file2 will be copied because it doesn't exist
file2 will be copied because it doesn't exist
file1 will be copied because it doesn't exist
file3 will be copied because it doesn't exist
file3 will be copied because it doesn't exist

Two things are wrong in that script:
A. You don't need a nested for loop, because you are checking each file from the first directory, not every pair of a file from the first directory and a file from the second directory (that's why you see so many lines of output).
B. When you use find to get the files in the first directory you don't need the full path, just the file name, whose existence you then check in the second directory.
Here's the fixed version. Let me know whether it solves your problem.
#!/bin/bash
for i in `find "$1" -maxdepth 1 -type f -printf "%f\n" | sort`
do
    if [ -e "$2/$i" ]
    then
        if [ "$1/$i" -nt "$2/$i" ]
        then
            echo "File $i is newer and it will be copied"
        else
            echo "File $i is already up to date"
        fi
    else
        echo "$i will be copied because it doesn't exist"
    fi
done
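If the end goal is to actually copy the missing or newer files rather than just report them, a minimal extension of the same idea (only a sketch, assuming GNU find and file names without spaces) would be:
#!/bin/sh
# copy every file from $1 that is missing in $2 or newer than its counterpart
for i in `find "$1" -maxdepth 1 -type f -printf "%f\n" | sort`
do
    if [ ! -e "$2/$i" ] || [ "$1/$i" -nt "$2/$i" ]
    then
        echo "Copying $i"
        cp "$1/$i" "$2/$i"
    fi
done
GNU cp can also do this in one shot with cp -u "$1"/* "$2"/, which only copies when the source file is newer than the destination or the destination is missing.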

Related

move every file from every folder to specific folder

I am trying to write a shell script to do this.
I guess it needs a for loop or find syntax,
but I am stuck on how to scan every folder.
I have tried find -maxdepth 1 -name "*.jpg" | mv ... but it failed.
For every jpg file in every dir (folder1, folder2, folder3...folder5...etc),
move the file to the target dir, which is the parent dir.
If the file name is duplicated, move it to the dup dir.
The DIR tree looks like this
Something like
for f in folder*/*.jpg; do
    if [ -e "$(basename "$f")" ]; then
        mv "$f" dup/
    else
        mv "$f" .
    fi
done
Run it from the parent directory. It just iterates over every jpg in the folder subdirectories, moving each one to one place or the other depending on whether a file with that name already exists.
Slightly more efficient bash version:
for f in folder*/*.jpg; do
    if [[ -e ${f##*/} ]]; then
        mv "$f" dup/
    else
        mv "$f" .
    fi
done

Command line for loop with a nested if statement

I need to loop through all the files in the same directory, and IF a specific line "File needs to be deleted" exists within any of the files within that directory, delete those files only. How would it work from the command line please?
For example, the directory contains file1, file2, file3 and so on for thousands of files. Each file has 10,000 rows of strings. If any file contains the string "File needs to be deleted", delete those files, but don't delete the files that do not contain that string.
I was going along the lines of
for each file in the directory; do
    if [ row text == "File needs to be deleted" ]; then
        delete file
    fi
done
grep -d skip -lF 'File needs to be deleted' file* | xargs echo rm --
If you only have files, no directories, in your current directory then you can just remove -d skip. If your version of grep doesn't have -d but your directory does contain sub-directories then:
find . -maxdepth 1 -type f -exec grep -lF 'File needs to be deleted' {} + | xargs echo rm --
Remove the echo once you've tested and are happy that it's going to remove the files you expect.
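If any of the file names could contain spaces or other odd characters, a null-delimited variant of the same pipeline is safer (just a sketch, assuming GNU grep and xargs):
grep -d skip -lZF 'File needs to be deleted' file* | xargs -0 echo rm --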
Simple bash example:
#!/bin/bash
# Get the name of the running script
running_script=$(realpath "$0" | awk -F '/' '{ print $NF }')
# Loop through all files in the current directory
for file in *; do
    # If the file is the running script itself, skip it.
    [ "$file" == "$running_script" ] && continue
    # If "File needs to be deleted" exists in the file, delete the file.
    grep -q "File needs to be deleted" "$file" && rm "$file"
done

Check if files exists in 3 different directories and move them one to another

I'm quite new to creating shell scripts.
I'm developing a shell script that will back up my files once a day only.
I need to check which *.war files are in three different folders (input folder, production folder, backup folder).
If the same files exist in all three directories, don't perform the backup.
If they don't, it must move the files in folder 2 to folder 3.
This is what I've done so far.
===============================
TODAY=$(date +%d-%m-%Y)
INPUT=/home/bruno.ogasawara/entrada/
BACKUP=/home/bruno.ogasawara/backup/
PROD=/home/bruno.ogasawara/producao/
DIR1=$(ls $INPUT)
DIR2=$(ls $PROD)
DIR3=$(ls $BACKUP$TODAY)
for i in $DIR1; do
    for j in $DIR2; do
        for k in $DIR3; do
            if [ $i == $j ] && [ $j == $k ]; then
                exit 1
            else
                mv -f $PROD$j $BACKUP$TODAY
            fi
        done
    done
done
mv -f $INPUT*.war $PROD
===============================
The verification is not working. The only thing working is the mv -f $INPUT*.war $PROD at the end.
What am I missing, or what am I doing wrong?
Thanks in advance people.
What I understand is that you want to sync those three folders.
In that case you should not modify the file names, since the file names are what we use to compare the folders; otherwise you would have to use md5 or sha checksums. The Linux filesystem already keeps timestamps, so you don't have to attach the date to the file name.
In your code you used ls to list the files, but parsing the output of ls is fragile (it breaks on names with spaces or other special characters) and doesn't combine well with a for loop in bash.
A better command is
find $DIR -maxdepth 1 -type f -exec basename {} \;
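For example, a loop that consumes that output safely might look like this (just a sketch, assuming bash and the $INPUT and $PROD variables from your script):
while IFS= read -r name; do
    if [ -e "$PROD/$name" ]; then
        echo "$name exists in both $INPUT and $PROD"
    else
        echo "$name is only in $INPUT"
    fi
done < <(find "$INPUT" -maxdepth 1 -type f -exec basename {} \;)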
If you just want to sync the *.war files across all the folders, then you can simply use this:
#!/bin/bash
DIR1=/home/bruno.ogasawara/entrada/
DIR2=/home/bruno.ogasawara/backup/
DIR3=/home/bruno.ogasawara/producao/
cp -n $DIR1/*.war $DIR2
cp -n $DIR1/*.war $DIR3
cp -n $DIR2/*.war $DIR1
cp -n $DIR2/*.war $DIR3
cp -n $DIR3/*.war $DIR1
cp -n $DIR3/*.war $DIR2
-n checks whether the file already exists and will not overwrite the existing file.

Shell Script to list files in a given directory and if they are files or directories

Currently learning some bash scripting and having an issue with a question involving listing all files in a given directory and stating whether they are a file or a directory. The issue I am having is that I only get either my current directory, or, if I specify a directory, it just says that it is a directory; e.g. /home/user/shell_scripts will return "shell_scripts is a directory" rather than listing the files contained within it.
This is what I have so far:
dir=$dir
for file in $dir; do
    if [[ -d $file ]]; then
        echo "$file is a directory"
    elif [[ -f $file ]]; then
        echo "$file is a regular file"
    fi
done
Your line:
for file in $dir; do
will expand $dir just to a single directory string. What you need to do is expand that to a list of files in the directory. You could do this using the following:
for file in "${dir}/"* ; do
This will expand "${dir}/"* into a list of the names in that directory. As Biffen points out, this guarantees that the file list won't end up with split partial file names in file if any of them contain whitespace.
If you want to recurse into the directories in dir then using find might be a better approach. Simply use:
for file in $( find ${dir} ); do
Note that while simple, this will not handle files or directories with spaces in them. Because of this, I would be tempted to drop the loop and generate the output in one go. This might be slightly different from what you want, but is likely to be easier to read and a lot more efficient, especially with large numbers of files. For example, to list all the directories:
find ${dir} -maxdepth 1 -type d
and to list the files:
find ${dir} -maxdepth 1 -type f
If you want to recurse into the directories below, remove the -maxdepth 1.
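You can even do both in a single pass (a sketch that assumes GNU find, which provides -printf):
find "${dir}" -mindepth 1 -maxdepth 1 \
    \( -type d -printf '%p is a directory\n' \) -o \
    \( -type f -printf '%p is a regular file\n' \)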
This is a good use for globbing:
for file in "$dir/"*
do
[[ -d "$file" ]] && echo "$file is a directory"
[[ -f "$file" ]] && echo "$file is a regular file"
done
This will work even if files in $dir have special characters in their names, such as spaces, asterisks and even newlines.
Also note that variables should be quoted ("$file"). But * must not be quoted. And I removed dir=$dir since it doesn't do anything (except break when $dir contains special characters).
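To illustrate the quoting point (a throwaway example, not from the original answer):
dir=/tmp
for file in "$dir"/*; do echo "$file"; done   # expands to each entry in /tmp
for file in "$dir/*"; do echo "$file"; done   # a single literal string: /tmp/*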
ls -F ~ | \
sed 's#.*/$#/& is a Directory#;t quit;s#.*#/& is a File#;:quit;s/[*/=>#|] / /'
The -F "classify" switch appends a "/" if a file is a directory. The sed code prints the desired message, then removes the suffix.
for file in $(ls $dir)
do
    [ -f $file ] && echo "$file is File"
    [ -d $file ] && echo "$file is Directory"
done
or replace the
$(ls $dir)
with
`ls $dir`
If you want to list files that also start with . use:
for file in "${dir}/"* "${dir}/".[!.]* "${dir}/"..?* ; do

How to copy and rename files in shell script

I have a folder "test"; in it there are 20 other folders with different names like A, B, ... (actually they are names of people, not A, B...). I want to write a shell script that goes into each folder like test/A, renames all the .c files to A1.c, A2.c, ... and copies them to the "test" folder. I started like this but I have no idea how to complete it!
#!/bin/sh
for file in `find test/* -name '*.c'`; do mv $file $*; done
Can you help me please?
This code should get you close. I tried to document exactly what I was doing.
It does rely on BASH and the GNU version of find to handle spaces in file names. I tested it on a directory full of .DOC files, so you'll want to change the extension as well.
#!/bin/bash
V=1
SRC="."
DEST="/tmp"
# The last path we saw -- make it garbage, but not blank. (Or it will break the '[' test command.)
LPATH="/////"
# Let us find the files we want
find "$SRC" -iname "*.doc" -print0 | while read -d $'\0' i
do
    echo "We found the file name... $i"
    # Now, we strip off just the file name.
    FNAME=$(basename "$i" .doc)
    echo "And the basename is $FNAME"
    # Now we get the last chunk of the directory
    ZPATH=$(dirname "$i" | awk -F'/' '{ print $NF }')
    echo "And the last chunk of the path is... $ZPATH"
    # If we are down a new path, reset our counter; otherwise bump it.
    if [ "$LPATH" != "$ZPATH" ]; then
        V=1
    else
        V=$((V + 1))
    fi
    LPATH=$ZPATH
    # Eat the error message if the directory already exists
    mkdir "$DEST/$ZPATH" 2> /dev/null
    echo cp \"$i\" \"$DEST/${ZPATH}/${FNAME}${V}\"
    cp "$i" "$DEST/${ZPATH}/${FNAME}${V}"
done
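Note that because the while loop sits on the right-hand side of a pipe, it runs in a subshell; that is fine here, since V and LPATH are only needed inside the loop, but any variables set in it won't be visible after done.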
#!/bin/bash
## Find folders under test. This assumes you are already where test exists OR give PATH before "test"
folders="$(find test -mindepth 1 -maxdepth 1 -type d)"
## Look into each folder in $folders, find the folder-named .c files and move them to the test folder, right?
for folder in $folders
do
    ## Find the folder-named .c files.
    leaf_folder="${folder##*/}"
    folder_named_c_files="$(find "$folder" -type f -name "*.c" | grep "${leaf_folder}[0-9]")"
    ## Move these folder_named_c_files to the test folder. basename will hold just the file name.
    ## You didn't mention what to rename the files to, so tweak the mv command accordingly.
    for file in $folder_named_c_files; do basename="${file##*/}"; mv "$file" "test/$basename"; done
done
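If the naming scheme really is A1.c, A2.c, ... as described in the question, a minimal sketch along those lines (assuming bash, and copying rather than moving) could be:
#!/bin/bash
# For each sub-folder of test/ (A, B, ...), copy its .c files into test/
# renamed as A1.c, A2.c, ... The counter restarts in every folder.
for dir in test/*/; do
    name=$(basename "$dir")
    n=1
    for f in "$dir"*.c; do
        [ -e "$f" ] || continue      # skip folders with no .c files
        cp "$f" "test/${name}${n}.c"
        n=$((n + 1))
    done
done
Swap cp for mv if the files should actually be moved out of the sub-folders.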
