grep spacing error - bash

Hi guys, I have a problem with grep. I don't know if there is another search command for shell scripts.
I'm trying to back up a folder AhmetsFiles which is stored on my flash disk, but at the same time I have to group the files by their extensions and save them into an [extensionName] folder.
An example: /media/FlashDisk/AhmetsFiles/lecture.pdf must be stored in /home/$(whoami)/Desktop/backups/pdf
The problem is that I can't copy a file whose name contains spaces (lecture 2.pptx).
After this introduction, here is my code:
filename="/media/FlashDisk/extensions"
count=0
exec 3<&0
exec 0< $filename
mkdir "/home/$(whoami)/Desktop/backups"
while read extension
do
    cd "/home/$(whoami)/Desktop/backups"
    rm -rf "$extension"
    mkdir "$extension"
    cd "/media/FlashDisk/AhmetsFiles"
    files=( `ls | grep -i "$extension"` )
    fCount=( `ls | grep -c -i "$extension"` )
    for (( i=0 ; $i<$fCount ; i++ ))
    do
        cp -f "/media/FlashDisk/AhmetsFiles/${files[$i]}" "/home/$(whoami)/Desktop/backups/$extension"
    done
    let count++
done
exec 0<&3
exit 0

Your looping is way more complicated than it needs to be; there is no need for ls, grep, or the files and fCount variables:
for file in *.$extension
do
    cp -f "/media/FlashDisk/AhmetsFiles/$file" "$HOME/Desktop/backups/$extension"
done
This works correctly with spaces.
I'm assuming that you actually wanted to interpret $extension as a file extension, not some random string in the middle of the filename like your original code does.
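One caveat worth noting (my addition, not part of the original answer): if no file matches the pattern, an unquoted glob expands to itself as a literal string and cp would fail on it. bash's nullglob option makes such a loop simply run zero times:

```shell
#!/bin/bash
# With nullglob set, an unmatched glob expands to nothing,
# so the loop body is skipped instead of seeing a literal "*.pdf".
shopt -s nullglob
count=0
for file in /nonexistent-dir/*.pdf; do
    count=$((count + 1))
done
echo "iterations: $count"    # prints "iterations: 0"
```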

Why don't you use
grep -i "$extension" | while IFS=: read x ; do
    cp ...
done
instead?
Also, I believe you may prefer something like grep -i "\.$extension$" instead (escape the dot and anchor the pattern to the end of the line).
On the other hand, the simplest way is probably
cp -f /media/FlashDisk/AhmetsFiles/*.$extension "$HOME/Desktop/backups/$extension/"
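Putting both suggestions together, here is a minimal sketch of the whole backup loop (the backup function name is mine; the paths are taken from the question):

```shell
#!/bin/bash
# backup <source-dir> <backup-root> <extension-list-file>
# Copies every file matching *.<extension> into <backup-root>/<extension>.
backup() {
    local src=$1 dest=$2 list=$3 extension file
    shopt -s nullglob                       # skip extensions with no matching files
    while IFS= read -r extension; do
        mkdir -p "$dest/$extension"
        for file in "$src"/*."$extension"; do
            cp -f "$file" "$dest/$extension/"   # quoting keeps spaces intact
        done
    done < "$list"
}

# Paths from the question; guarded so the sketch is safe to run anywhere.
if [ -r /media/FlashDisk/extensions ]; then
    backup /media/FlashDisk/AhmetsFiles "$HOME/Desktop/backups" /media/FlashDisk/extensions
fi
```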


Send files to folders using bash script

I want to copy the functionality of a windows program called files2folder, which basically lets you right-click a bunch of files and send them to their own individual folders.
So
1.mkv 2.png 3.doc
gets put into directories called
1 2 3
I have got it to work using this script, but it sometimes throws errors while still accomplishing what I want:
#!/bin/bash
ls > list.txt
sed -i '/list.txt/d' ./list.txt
sed 's/.$//;s/.$//;s/.$//;s/.$//' ./list.txt > list2.txt
for i in $(cat list2.txt); do
    mkdir $i
    mv $i.* ./$i
done
rm *.txt
Is there a better way of doing this? Thanks
EDIT: My script failed with real-world filenames, as they contained more than one ".", so I had to use a different sed command to make it work. This is an example filename I'm working with:
Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE
I guess you are getting errors on . and .., so change your ls call to:
ls -A > list.txt
-A List all entries except for . and ... Always set for the super-user.
You don't have to create a file to achieve the same result; just assign the output of your ls command to a variable, like this:
files=`ls -A`
for file in $files; do
    echo "$file"
done
You can also check whether each entry is a file or a directory, like this:
files=`ls -A`
for res in $files; do
    if [[ -d $res ]]; then
        echo "$res is a folder"
    fi
done
This script will do what you ask for:
files2folder:
#!/usr/bin/env sh
for file; do
    dir="${file%.*}"
    { ! [ -f "$file" ] || [ "$file" = "$dir" ]; } && continue
    echo mkdir -p -- "$dir"
    echo mv -n -- "$file" "$dir/"
done
Example directory/files structure:
ls -1 dir/*.jar
dir/paper-279.jar
dir/paper.jar
Running the script above:
chmod +x ./files2folder
./files2folder dir/*.jar
Output:
mkdir -p -- dir/paper-279
mv -n -- dir/paper-279.jar dir/paper-279/
mkdir -p -- dir/paper
mv -n -- dir/paper.jar dir/paper/
To make it actually create the directories and move the files, remove all the echo commands.

create a directory for every file and generate "n" copies for each file

While I was looking for a solution for my files, I found something that is perfect; I include the answer here: https://unix.stackexchange.com/questions/219991/how-do-i-create-a-directory-for-every-file-in-a-parent-directory/220026#220026?newreg=94b9d49a964a4cd1a14ef2d8f6150bf8
But now my problem is how to generate 50 copies in the directory generated for each file. I was working with the following command line
ls -p | grep -v / | xargs -t -n1 -i bash -c 'for i in {1..50}; do cp {} "{}_folder/copy${i}_{}" ; done'
to get the following:
-file1.csv---->folder_for_file1---->1copy_file1.csv,2copy_file1.csv,3copy_file1.csv........50copy_file1.csv
-file2.csv---->folder_for_file2---->1copy_file2.csv,2copy_file2.csv,3copy_file2.csv........50copy_file2.csv
-file3.csv---->folder_for_file3---->1copy_file3.csv,2copy_file3.csv,3copy_file3.csv........50copy_file3.csv
...
-file256.csv---->folder_forfile256---->1copy_file256.csv,2copy_file256.csv,3copy_file256.csv........50copy_file256.csv
How can I combine this with the previous answer? Here is the functional code from that answer:
cd ParentFolder
for x in ./*.csv; do
    mkdir "${x%.*}" && mv "$x" "${x%.*}"
done
All the credit goes to the person who wrote that great answer, and thanks in advance to everyone.
Replace the move with a copy/remove and add a for loop:
cd ParentFolder
for x in ./*.csv; do
    mkdir "${x%.*}"
    for (( i=1; i<=50; i++ )); do            # loop 50 times
        cp "$x" "${x%.*}/copy${i}_${x##*/}"  # brace ${i} so the shell does not read it as $i_; ${x##*/} drops the leading ./
    done
    rm -f "$x"                               # remove the file only after the 50 copies
done
I have done some tests and can publish the following code, which works partially: it copies each file 50 times into the generated folder, but names every new file "copy..." and also adds the .csv extension. If someone can provide a solution for this, that would be great. I thank @Raman Sailopal for his help and comments.
code
cd pruebas
for x in ./*.csv; do
    mkdir "${x%.*}"
    for ((i=1;i<=50;i++)); do # Create a loop, looping 50 times
        cp "$x" "${x%.*}/copy_$x_$i.csv" # use i in the copy command
        #rm -f "$x" # Remove the file after the 50 copies
    done
done
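For completeness, here is a sketch that fixes both remaining issues: it braces ${i} and uses the file's basename so the copy's name comes out right, and it removes the original only after the inner loop finishes (make_copies is my name; the pruebas folder is from the question):

```shell
#!/bin/bash
# make_copies <dir> <n>: for every *.csv in <dir>, create a folder named
# after the file and put <n> copies (copy1_<name> ... copyN_<name>) in it.
make_copies() {
    local dir=$1 n=$2 x name i
    cd "$dir" || return 1
    for x in ./*.csv; do
        name=${x##*/}                  # "./file1.csv" -> "file1.csv"
        mkdir -p "${x%.*}"             # "./file1"
        for (( i = 1; i <= n; i++ )); do
            cp "$x" "${x%.*}/copy${i}_${name}"   # braces keep $i separate from the underscore
        done
        rm -f "$x"                     # remove the original only after all copies exist
    done
}

if [ -d pruebas ]; then
    make_copies pruebas 50
fi
```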

why is the 'ls' command printing the directory content multiple times

I have the following shell script, in which I want to check a specific directory's content on remote machines and print it to a file.
file=serverList.csv
n=0
while [ $n -le 2 ]
do
    while IFS=: read -r f1 f2
    do
        # echo line is stored in $line
        if echo $f1 | grep -q "xx.xx.xxx";
        then
            ssh user@$f1 ls path/*war_* > path/$f1.txt < /dev/null; ls path/*zip_* >> path/$f1.txt < /dev/null;
            ssh user@$f1 ls -d /apps/jetty*_* >> path/$f1.txt < /dev/null;
        fi
    done < "$file"
    sleep 15
    n=$(( n+1 ))
done
I am using this script in a cron job that runs every 2 minutes:
*/2 * * * * /path/myscript.sh
but somehow I am ending up with the following output file:
/apps/jetty/webapps_wars/test_new.war
path/ReleaseTest.static.zip_2020-08-05
path/ReleaseTest.static.zip_2020-08-05
path/ReleaseTest.static.zip_2020-08-05
path/jetty_xx.xx_2020-08-05
path/jetty_new
path/jetty_xx.xx_2020-08-05
path/jetty_new
I am not sure why I am getting the files in the list twice, sometimes three times. But when I execute the script directly from PuTTY, it works fine. What do I need to change to correct this script?
Example:
~$ cd tmp
~/tmp$ mkdir test
~/tmp$ cd !$
cd test
~/tmp/test$ mkdir -p apps/jetty/webapp_wars/ && touch apps/jetty/webapp_wars/test_new.war
~/tmp/test$ mkdir path
~/tmp/test$ touch path/{ReleaseTest.static.zip_2020-08-05,jetty_xx.xx_2020-08-05,jetty_new}
~/tmp/test$ cd ..
~/tmp$ listpath=$(find test/path \( -name "*2020-08-05" -o -name "*new" \) )
~/tmp$ listapps=$(find test/apps/ -name "*war" )
~/tmp$ echo ${listpath[@]}" "${listapps[@]} | tr " " "\n" | sort > resultfile
~/tmp$
~/tmp$ cat resultfile
test/apps/jetty/webapp_wars/test_new.war
test/path/jetty_new
test/path/jetty_xx.xx_2020-08-05
test/path/ReleaseTest.static.zip_2020-08-05
~/tmp$ rm -rf test/ && unset listapps && unset listpath && rm resultfile
~/tmp$
This way you get only one result for each pattern you are looking for in your if...then...else block of code.
Just adapt the ssh ... find commands and take care of quotes and parentheses; this is the easiest solution, and this way you do not have to rewrite the script from scratch. And be careful about local versus remote variables if you use them.
You really should not use ls, but the fundamental problem is probably that three separate commands with three separate wildcards can match the same file multiple times.
Also, one of your commands is executed locally (you forgot to put ssh etc. in front of the second one), so if the wildcard matches on your local computer, it produces a result which doesn't reflect the situation on the remote server.
Try this refactoring.
file=serverList.csv
n=0
while [ $n -le 2 ]
do
    while IFS=: read -r f1 f2
    do
        # echo line is stored in $line <- XXX this is not true
        if echo "$f1" | grep -q "xx.xx.xxx";
        then
            ssh user@$f1 "printf '%s\n' path/*war_* path/*zip_* /apps/jetty*_*" | sort -u >path/"$f1".txt < /dev/null
        fi
    done < "$file"
    sleep 15
    n=$(( n+1 ))
done
The sort gets rid of any duplicates. This assumes none of your file names contain newlines; if they do, you'd need something which handles them robustly (try printf '%s\0' and sort -z, but these are not portable).
ls would definitely also accept three different wildcards but like the link above explains, you really never want to use ls in scripts.
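To illustrate the NUL-separated variant mentioned above (a sketch; dedupe_z is my name, and it assumes sort supports -z, as GNU sort does):

```shell
#!/bin/bash
# Deduplicate a list of filenames safely even if they contain newlines:
# separate entries with NUL bytes, then sort -zu removes duplicates.
dedupe_z() {
    printf '%s\0' "$@" | sort -zu | tr '\0' '\n'   # tr only for display
}
```

dedupe_z path/b path/a path/b prints each unique name once, sorted; drop the final tr to keep the list NUL-separated end to end.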

Trouble with cp command for directories

I'm trying to use the cp command to copy directories:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
then the copying should result in
tgt/a/4
tgt/b/1
tgt/c/3
tgt/d/2
tgt/e/5
I tried using the basename function as well as 'cp dir1/* dir2'. With basename, do I need a loop over every directory, or is there a recursive builtin? I also tried 'cp -r', the recursive copy option. But nothing has worked so far.
I used a tmp file that holds the SOURCE list of files; you can readjust:
cat tmp
result:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
From here I echo out the command, but you can remove echo and it will execute, if this output looks correct:
#!/bin/bash
cat tmp | while read z
do
    echo cp "$z" "tgt/$(echo "$z"|cut -d/ -f 3)/$(echo "$z"|cut -d/ -f 2)"
done
result:
cp src/1/b tgt/b/1
cp src/2/d tgt/d/2
cp src/3/c tgt/c/3
cp src/4/a tgt/a/4
cp src/5/e tgt/e/5
You can also add parameters to cp as you see fit. But first test with the echo command, then execute. :)
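The two cut calls can also be replaced with pure parameter expansion, avoiding the subshells (a sketch; swap_path is my name, and it assumes paths of exactly the form src/<mid>/<leaf>):

```shell
#!/bin/bash
# swap_path src/1/b -> tgt/b/1 : swap the last two components under tgt/.
swap_path() {
    local rest=${1#*/}        # "src/1/b" -> "1/b"
    local mid=${rest%%/*}     # "1"
    local leaf=${rest#*/}     # "b"
    printf 'tgt/%s/%s\n' "$leaf" "$mid"
}
```

swap_path src/1/b prints tgt/b/1, so the loop body becomes echo cp "$z" "$(swap_path "$z")".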

bash move is failing

I am running the commands below in a script:
move_jobs() {
    cd $JOB_DIR
    for i in `cat $JOBS_FILE`
    do
        if [ `ls | grep -i ^${i}- | wc -l` -gt 0 ]; then
            cd $i
            if [ ! -d jobs ]; then
                mkdir jobs && cd .. && mv "${i}"-* "${i}"/jobs/
            else
                cd .. && mv "${i}"-* "${i}"/jobs/
            fi
            error_handler $?
        fi
    done
}
but it is failing with
mv: cannot stat `folder-*': No such file or directory
I am not sure why the mv command is failing on the wildcard.
Your script is overly complicated and has several issues, one of which will be the problem; I guess it's the ls | grep ... part, but to find that out you should add some debug logging.
for i in $(cat ...) loops over words, not lines.
Do not parse ls.
And if you still do, do not grep for filenames; include the pattern in your ls call instead: ls "${i}"-* | wc -l.
You do not need to check whether a folder exists when the only difference is that you create it; you can use mkdir -p instead.
Jumping around between folders makes your script almost unreadable, as the reader has to keep track of all the cd commands.
You could simply write the following, which I think will do what you want:
xargs -a "$JOBS_FILE" -I{} \
    sh -c "
        mkdir -p '$JOB_DIR/{}/jobs';
        mv '$JOB_DIR/{}-'* '$JOB_DIR/{}/jobs';
    "
or if you need more control:
while IFS= read -r jid; do
    if ls "$JOB_DIR/$jid-"* &>/dev/null; then
        TARGET_DIR="$JOB_DIR/$jid/jobs"
        mkdir -p "$TARGET_DIR"
        mv "$JOB_DIR/$jid-"* "$TARGET_DIR"
        echo "OK"
    else
        echo "No files to move."
    fi
done < "$JOBS_FILE"
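The ls ... &>/dev/null existence test can also be done with a glob and no external command (a sketch; has_match is my name):

```shell
#!/bin/bash
# has_match <prefix>: succeed if at least one file matches <prefix>*.
has_match() {
    local f
    for f in "$1"*; do
        [ -e "$f" ] && return 0    # found a real match
    done
    return 1                       # glob stayed literal: nothing matched
}
```

if has_match "$JOB_DIR/$jid-"; then ... replaces the ls test without spawning a process.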
