create a directory for every file and generate “n” copies for each file - bash

While I was looking for a solution for my files, I found something that is perfect. I include the answer here: https://unix.stackexchange.com/questions/219991/how-do-i-create-a-directory-for-every-file-in-a-parent-directory/220026#220026?newreg=94b9d49a964a4cd1a14ef2d8f6150bf8
But now my problem is how to generate 50 copies inside the directory generated for each file. I was trying the following command line
ls -p | grep -v / | xargs -t -n1 -i bash -c 'for i in {1..50}; do cp {} "{}_folder/copy${i}_{}" ; done'
to get the following
-file1.csv---->folder_for_file1---->1copy_file1.csv,2copy_file1.csv,3copy_file1.csv........50copy_file1.csv
-file2.csv---->folder_for_file2---->1copy_file2.csv,2copy_file2.csv,3copy_file2.csv........50copy_file2.csv
-file3.csv---->folder_for_file3---->1copy_file3.csv,2copy_file3.csv,3copy_file3.csv........50copy_file3.csv
...
-file256.csv---->folder_for_file256---->1copy_file256.csv,2copy_file256.csv,3copy_file256.csv........50copy_file256.csv
How can I combine this with the previous answer? I include the working code from that answer:
cd ParentFolder
for x in ./*.csv; do
    mkdir "${x%.*}" && mv "$x" "${x%.*}"
done
All credit to the person who wrote that great answer, and thanks in advance to everyone.

Replace the move with a copy/remove and add a for loop:
cd ParentFolder
for x in ./*.csv; do
    mkdir "${x%.*}"
    for (( i=1; i<=50; i++ )); do              # loop 50 times
        cp "$x" "${x%.*}/copy${i}_${x##*/}"    # use i in the copy name; ${x##*/} drops the leading ./
    done
    rm -f "$x"                                 # remove the original after the 50 copies
done

I have done some tests and can publish the following code, which works partially: it copies each file 50 times into the generated folder, but every copy is named just "copy" plus the counter (the original file name is lost), and it also appends the .csv extension. If someone can provide a solution for this, that would be great. Thanks to Raman Sailopal for his help and comments.
Code:
cd pruebas
for x in ./*.csv; do
    mkdir "${x%.*}"
    for (( i=1; i<=50; i++ )); do              # Create a loop, looping 50 times
        cp "$x" "${x%.*}/copy_$x_$i.csv"       # use i in the copy command
        #rm -f "$x"                            # Remove the file after the 50 copies
    done
done
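The naming issue in the code above comes from the way bash parses copy_$x_$i.csv: $x_ is read as a variable named x_, which is unset, so the copy ends up called copy_<counter>.csv and the original file name is lost. A minimal corrected sketch, keeping the same pruebas layout and leaving the rm commented out as in the test version:

cd pruebas
for x in ./*.csv; do
    name=${x##*/}                    # file name without the leading ./
    mkdir -p "${x%.*}"
    for (( i=1; i<=50; i++ )); do
        cp "$x" "${x%.*}/copy_${i}_${name}"   # e.g. ./file1/copy_1_file1.csv
    done
    # rm -f "$x"                     # uncomment to delete the original once the 50 copies exist
done

Braces make the variable boundaries explicit, and ${x##*/} strips the leading ./ so each copy keeps the original file name and extension.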

Related

Send files to folders using bash script

I want to copy the functionality of a Windows program called files2folder, which basically lets you right-click a bunch of files and send them to their own individual folders.
So
1.mkv 2.png 3.doc
gets put into directories called
1 2 3
I have got it to work using this script, but it sometimes throws errors while still accomplishing what I want:
#!/bin/bash
ls > list.txt
sed -i '/list.txt/d' ./list.txt
sed 's/.$//;s/.$//;s/.$//;s/.$//' ./list.txt > list2.txt
for i in $(cat list2.txt); do
    mkdir $i
    mv $i.* ./$i
done
rm *.txt
Is there a better way of doing this? Thanks.
EDIT: My script failed with real-world filenames, as they contained more than one ".", so I had to use a different sed command to make it work. This is an example filename I'm working with:
Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE
I guess you are getting the errors on . and .., so change your call to ls to:
ls -A > list.txt
-A List all entries except for . and ... Always set for the super-user.
You don't have to create a file to achieve the same result; just assign the output of your ls command to a variable, like this:
files=`ls -A`
for file in $files; do
    echo $file
done
You can also check if the resource is a file or directory like this:
files=`ls -A`
for res in $files; do
    if [[ -d $res ]]; then
        echo "$res is a folder"
    fi
done
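As a side note, word-splitting the output of ls (files=`ls -A`) breaks as soon as a filename contains spaces or glob characters. Iterating a glob directly avoids the problem; a minimal sketch of the same check without ls:

for res in *; do                    # add "shopt -s dotglob" first if hidden entries should be included
    if [[ -d $res ]]; then
        echo "$res is a folder"
    else
        echo "$res is a file"
    fi
done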
This script will do what you ask for:
files2folder:
#!/usr/bin/env sh
for file; do
    dir="${file%.*}"
    { ! [ -f "$file" ] || [ "$file" = "$dir" ]; } && continue
    echo mkdir -p -- "$dir"
    echo mv -n -- "$file" "$dir/"
done
Example directory/files structure:
ls -1 dir/*.jar
dir/paper-279.jar
dir/paper.jar
Running the script above:
chmod +x ./files2folder
./files2folder dir/*.jar
Output:
mkdir -p -- dir/paper-279
mv -n -- dir/paper-279.jar dir/paper-279/
mkdir -p -- dir/paper
mv -n -- dir/paper.jar dir/paper/
To make it actually create the directories and move the files, remove the echo commands.

Trouble with cp command for directories

I'm trying to use the cp command to copy directories:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
Then the copying should result in:
tgt/a/4
tgt/b/1
tgt/c/3
tgt/d/2
tgt/e/5
I tried to use basename as well as 'cp dir1/* dir2'. With basename, do I make a loop to find every directory, or is there a recursive builtin? I also tried the 'cp -r' recursive copy option, but nothing so far has worked.
I used a tmp file that holds the SOURCE list of files; you can readjust:
cat tmp
result:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
From here I echo out the command; if the output looks correct, you can remove the echo and it will execute:
#!/bin/bash
cat tmp | while read z
do
    echo cp "$z" "tgt/$(echo "$z" | cut -d/ -f 3)/$(echo "$z" | cut -d/ -f 2)"
done
result:
cp src/1/b tgt/b/1
cp src/2/d tgt/d/2
cp src/3/c tgt/c/3
cp src/4/a tgt/a/4
cp src/5/e tgt/e/5
You can also add parameters to cp as you see fit. But first test with the echo command, then execute. :)
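One caveat: cp will fail if the target directories (tgt/a, tgt/b, ...) do not exist yet, so each line also needs a mkdir -p; the field swap can likewise be done with plain parameter expansion instead of two cut calls. A sketch under the same assumption of a tmp file holding src/<number>/<name> paths:

#!/bin/bash
while read -r z; do
    name=${z##*/}                   # last path component, e.g. "b"
    mid=${z%/*}; mid=${mid##*/}     # middle component, e.g. "1"
    mkdir -p "tgt/$name"
    cp "$z" "tgt/$name/$mid"
done < tmp

With z=src/1/b this runs mkdir -p tgt/b and cp src/1/b tgt/b/1, matching the echoed commands above.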

if then else statement will not loop properly

I figured out how to get an if-then-else statement to work, but it now seems to have broken. =( I cannot work out what is going wrong!
There are up to 10 directories in ./, called barcode01 - barcode09, plus one called unclassified. This script is supposed to go into each one and prep the directory for ~/Taxonomy.R (which requires all the fastq files to be gzipped and put into a sub-directory titled "data"). It then runs the ~/Taxonomy.R script to make a metadata file for each.
Edit: the tmp.txt file is created using ls > tmp.txt and then echo "0" >> tmp.txt, to make a sacrificial list of directories for the script to chew through, stopping when it gets to 0.
#!/bin/bash
source deactivate
source activate R-Env
value=$(sed -n 1p tmp.txt)
if [ "$value" = "0" ]
then
    rm tmp.txt
else
    cd "$(sed -n 1p tmp.txt)"
    gzip *fastq
    # this adds the direction identifier "R1" to all the fastq.gz files
    for i in *.gz
    do
        mv "$i" "${i%.*}_R1.fastq.gz"
    done
    mkdir Data
    mv *gz Data
    ~/Taxonomy3.R
    cd Data
    mv * ..
    cd ..
    rm -r Data
    cd ..
    sed '1d' tmp.txt > tmp2.txt
    mv tmp2.txt tmp.txt
fi
Currently, it is only making the metadata file in the first barcode directory.
If you indent your code, things will get a lot clearer.
On the other hand, modifying your tmp.txt file this way is slow and dangerous. Better to traverse its contents by only reading it.
#!/bin/bash
source deactivate
source activate R-Env
for value in $(<tmp.txt)
do
    cd "$value"
    gzip *fastq
    for i in *.gz
    do
        # This adds the direction identifier "R1" to all the fastq.gzips
        mv "$i" "${i%.*}_R1.fastq.gz"
    done
    mkdir Data
    mv *gz Data
    ~/Taxonomy3.R
    mv Data/* .
    rmdir Data
    cd -
done
rm tmp.txt
With this reworked script you only need to create the tmp.txt file WITHOUT adding any marker at the end (in fact, you never needed it; you could have checked for an empty file).
For each folder in tmp.txt, the operations you wanted are executed. I simplified some of the folder changing, minimizing it to what the R script needs to run properly. To go back I used cd -, which returns to the previous folder; that way you can have more than one level in your tmp.txt file.
Hope everything else is clear.
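One extra hardening worth considering, my addition rather than part of the answer above: guard the cd so that a stale or misspelled entry in tmp.txt is skipped instead of letting the rest of the loop body run in the wrong directory. A sketch of the pattern:

for value in $(<tmp.txt); do
    cd "$value" || { echo "skipping $value" >&2; continue; }
    # ... same gzip / rename / Data steps as in the answer above ...
    cd - > /dev/null
done
rm tmp.txt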

grep spacing error

Hi guys, I have a problem with grep. I don't know if there is another search command in shell script.
I'm trying to back up a folder, AhmetsFiles, which is stored on my flash disk, but at the same time I have to group the files by their extensions and save them into an [extensionName] folder.
An example: /media/FlashDisk/AhmetsFiles/lecture.pdf must be stored in /home/$(whoami)/Desktop/backups/pdf.
The problem is I can't copy a file whose name contains spaces (lecture 2.pptx).
After this introduction, here is my code:
filename="/media/FlashDisk/extensions"
count=0
exec 3<&0
exec 0< $filename
mkdir "/home/$(whoami)/Desktop/backups"
while read extension
do
    cd "/home/$(whoami)/Desktop/backups"
    rm -rf "$extension"
    mkdir "$extension"
    cd "/media/FlashDisk/AhmetsFiles"
    files=( `ls | grep -i "$extension"` )
    fCount=( `ls | grep -c -i "$extension"` )
    for (( i=0 ; $i<$fCount ; i++ ))
    do
        cp -f "/media/FlashDisk/AhmetsFiles/${files[$i]}" "/home/$(whoami)/Desktop/backups/$extension"
    done
    let count++
done
exec 0<&3
exit 0
Your looping is way more complicated than it needs to be; there is no need for ls or grep, or for the files and fCount variables:
for file in *.$extension
do
    cp -f "/media/FlashDisk/AhmetsFiles/$file" "$HOME/Desktop/backups/$extension"
done
This works correctly with spaces.
I'm assuming that you actually wanted to interpret $extension as a file extension, not some random string in the middle of the filename like your original code does.
Why don't you
grep -i "$extension" | while IFS=: read x ; do
    cp ..
done
instead?
Also, I believe you may prefer something like grep -i ".$extension$" instead (anchor it to the end of line).
On the other hand, the most optimal way is probably
cp -f /media/FlashDisk/AhmetsFiles/*.$extension "$HOME/Desktop/backups/$extension/"
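Putting those pieces together, the whole backup can shrink to a loop like the sketch below, under the same assumptions as the question (an extension list in /media/FlashDisk/extensions and the files in /media/FlashDisk/AhmetsFiles). Because glob expansion results are not word-split, names with spaces such as "lecture 2.pptx" are copied correctly:

#!/bin/bash
src=/media/FlashDisk/AhmetsFiles
dest="$HOME/Desktop/backups"

while read -r extension; do
    mkdir -p "$dest/$extension"
    cp -f "$src"/*."$extension" "$dest/$extension/"   # glob matches keep embedded spaces intact
done < /media/FlashDisk/extensions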

Deleting files with the same name using shell script

I'm a total newbie in shell scripting.
I need to compare the file names in two directories and delete the files with the same name.
E.g.:
Directory1/
one
two
three
four
Directory2/
two
four
five
After running the script the directories will be:
Directory1/
one
three
Directory2/
five
Thanks
test -f tests if a file exists:
cd dir1
for file in *
do
    test -f "../dir2/$file" && rm "$file" "../dir2/$file"
done
cd ..
Quick and dirty:
while read fname
do
    rm -vf Directory{1,2}/"$fname"
done < <(sort <(cd Directory1/ && ls) <(cd Directory2/ && ls) | uniq -d)
This assumes a number of things about the filenames, but it should get you there with the input shown, and similar cases.
Tested too, now:
mkdir /tmp/stacko && cd /tmp/stacko
mkdir Directory{1,2}
touch Directory1/{one,two,three,four} Directory2/{two,four,five}
Running the command shows:
removed `Directory1/four'
removed `Directory2/four'
removed `Directory1/two'
removed `Directory2/two'
And the resulting tree is:
Directory1/one
Directory1/three
Directory2/five
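An equivalent alternative, not from either answer above but worth noting, is comm -12, which prints only the lines common to two sorted inputs; ls already emits sorted output, so the duplicate names can be fed straight to rm under the same assumptions about simple filenames:

while read -r fname; do
    rm -vf Directory1/"$fname" Directory2/"$fname"
done < <(comm -12 <(ls Directory1) <(ls Directory2))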

Resources