multiple loops based on files in directories - shell

I'm writing a shell script and I need to build a loop. I have a directory called Files. Inside Files there are two folders, folder1 and folder2, each holding 500 files. I need to get the filenames from both folder1 and folder2 because I need to concatenate each filename in folder1 with each filename in folder2. It needs to do this for every pairing, so 250,000 times.
Anyone know how I would write that loop so that I can get all the names from both directories and loop correctly?

Assuming you're in bash, then something like this:
cd Files
for f1 in folder1/*
do
    for f2 in folder2/*
    do
        # strip the leading "folderN/" from each path and join the basenames with a dash
        concat_name="${f1#*/}-${f2#*/}"
    done
done
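If the goal is also to concatenate the contents of each file pair under the combined name (an assumption on my part; the question only mentions combining the names, and the combined/ output directory here is hypothetical), the loop body could be extended like this:
cd Files
mkdir -p combined
for f1 in folder1/*
do
    for f2 in folder2/*
    do
        concat_name="${f1#*/}-${f2#*/}"
        cat "$f1" "$f2" > "combined/$concat_name"   # one output file per pairing
    done
done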

Something like this should do, assuming that the two subdirectories are called dir1 and dir2. This example only echoes the names, naturally.
#!/bin/bash
for d1 in `ls Files/dir1`; do
    for d2 in `ls Files/dir2`; do
        echo ${d1}_${d2}
    done
done
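A glob-based variant (my suggestion, not part of the answer above) avoids parsing ls output and also copes with filenames containing spaces:
#!/bin/bash
for d1 in Files/dir1/*; do
    for d2 in Files/dir2/*; do
        echo "${d1##*/}_${d2##*/}"   # strip the leading directories, keep only the basenames
    done
done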

Related

Exclude two or more directories in bash for loop

I have a parent directory called Stem. Inside it there are around 20 sub-directories whose names end with _a, and each sub-directory contains a file called violations.txt. Two of those sub-directories, Trans_a and shift_a, should be skipped; I do not need the script to act on them.
I need to execute my script on only the other 18 sub-directories.
I tried the code below but I'm not getting the expected output.
#!/bin/bash
echo "These warning reports are based on this run directory:" ; pwd
echo " "
File="violatons.txt"
for d in *_a; do
    if ( "$d"=="Trans_a"||"shift_a" ); then
        True
    else
        Statement
    fi
done
Note: in some situations the set of sub-directories to exclude may grow or shrink; it depends on the situation.
for f in !(Trans|shift)_a/violations.txt; do
    echo "$f"
    # do stuff
done
!(pattern-list) Matches anything except one of the given patterns.
Reference: https://www.gnu.org/savannah-checkouts/gnu/bash/manual/bash.html#Pattern-Matching
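Putting it together as a full script (a sketch; note that extglob is not enabled by default in scripts, so it has to be switched on explicitly before the pattern is used):
#!/bin/bash
shopt -s extglob   # enable extended patterns such as !(...)
echo "These warning reports are based on this run directory:" ; pwd
for f in !(Trans|shift)_a/violations.txt; do
    echo "checking $f"
    # grep for warnings or do other work on "$f" here
done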

remove parent folder but keep content of children

I'm trying to organize a bit my music folder. Right now it has many subfolders inside.
Inside every one of those subfolders, there is a subsubfolder with the artist_name which contains music files.
E.g.:
Music_folder/silly_name001/artist_A;
Music_folder/silly_name002/artist_B_discA;
Music_folder/silly_name003/artist_B_discB;
I can list the content of the Music_folder and get the names of the subfolders with this:
for i in $(ls -d */); do echo ${i%%/}; done
But when I try to move the content of those subsubfolders to the parent folder, I can't achieve my goal. I'm using this code:
for i in $(ls -d */); do mv ${i%%/*} .; done
Actually, it moves the subsubfolders but not their content.
Any ideas about how to achieve it?
Thanks in advance
${i%%/*} removes the longest suffix matching /*, i.e. everything from the first slash onward; see:
$ i=Music_folder/silly_name003/artist_B_discB;
$ echo ${i%%/*}
Music_folder
You're looking for something like this:
set ./*/
echo mv ./*/* .
echo rmdir "$@"
Drop the echos if you're happy with the output.
Note that this can't deal with name collisions.
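A loop-based sketch of the same idea, assuming it is run from inside Music_folder (mv -n refuses to overwrite existing names, and rmdir refuses to delete a non-empty directory, so collisions are left in place rather than clobbered):
for d in ./*/; do
    mv -n "$d"* .   # move each subsubfolder up one level, skipping name collisions
    rmdir "$d"      # remove the wrapper folder if it is now empty
done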

Bash pass variable value from for loop to nested for loop

I have a nested for loop. The outer loop goes through a list of directories (subject folders), enters a subdirectory of each, and looks for a specific file. After finding that file, it backs up one directory so that the inner loop starts in the proper place. The inner loop then looks for specific subdirectories within the subject folder, enters each of them, and looks for 3 specific files. The file names are assigned to variables that will then be passed to a MATLAB function.
#!/usr/bin/env bash
for i in *; do  # subject ID
    cd "/cygdrive/g/Data/2015_0197/$i/ASL_Data/T1_Processed"  # enter into specific directory
    ref_img="*smoothed*"  # look for specific file1
    echo $ref_img
    cd ..  # move out of directory
    for j in *ASL*; do  # look for specific subdirectories
        cd "/cygdrive/g/Data/2015_0197/$i/ASL_Data/$j"  # cd to each subdirectory
        src_img=$i"_"$j".nii"  # look for specific file2
        other_img1=$i"_"$j"_PDmap.nii"  # look for specific file3
        other_img2=$i"_"$j"_ASL_fmap.nii"  # look for specific file4
        echo "2nd instance---- $ref_img"
        echo $src_img
        echo $other_img1
        echo $other_img2
        # eventually call matlab function
        # matlab -nodesktop -nosplash -wait -r "coreg('$ref_img','$src_img','$other_img1','$other_img2'); exit;"
        cd ..
    done
done
The first echo $ref_img prints the correct filename, but the value does not carry into the nested for loop: the echo inside the inner loop simply prints 2nd instance---- *smoothed*. How can I get it to pass that filename? I've looked into piping but, due to my novice Bash knowledge, I don't know how to implement it or whether it's even appropriate.
The problem is that ref_img stores the wildcard pattern itself, and keeps it. When you later use the variable, it is the pattern that gets used, and the matching files are not in the directory the inner loop runs in. Fix it by replacing
ref_img="*smoothed*"
with
ref_img=$( echo *smoothed* )
so that the glob is expanded once, in T1_Processed, and the resulting filename is what gets stored.
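An alternative sketch (not from the answer) that avoids the echo by expanding the glob into an array, assuming there is exactly one *smoothed* file in T1_Processed:
ref_candidates=( *smoothed* )    # glob expands now, in the current directory
ref_img=${ref_candidates[0]}     # keep the first (and presumably only) match
echo "$ref_img"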

How to store grep data in a variable so that it can be used later on?

I wrote the code as follows:
#!/bin/bash
cd /home/ubuntu/MouniKaShell/newfolder/dev/EC2-Var
grep ec2_name: *.txt >tempStore
The data is not being stored in tempStore, and there is no output when the script runs.
I want to find all the files that contain ec2_name and then store that list of files in tempStore.
If I understand your question correctly, you want all the .txt files located in a specific directory whose names contain "ec2_name" to be stored in a bash variable.
Here is a possible solution
#!/bin/bash
cd /home/ubuntu/MouniKaShell/newfolder/dev/EC2-Var
# mylist contains all matching files
mylist="`ls *.txt | grep ec2_name`"
# you can iterate over the files with a for loop
for file in $mylist; do
    echo "file: $file"
done
Correct me if I haven't understood your question
edit: simplified code
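If the intent was instead to match the contents of the files rather than their names (the question's original grep ec2_name: *.txt suggests so), a grep -l sketch would collect the names of files that contain the pattern:
#!/bin/bash
cd /home/ubuntu/MouniKaShell/newfolder/dev/EC2-Var
mylist=$(grep -l 'ec2_name:' *.txt)   # -l prints only the names of matching files
for file in $mylist; do
    echo "file: $file"
done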

bash script to append word to filenames

I'm trying to append the word "dicom" to the front of many filenames in a set of folders. The folders all begin with "s" (referred to by "s*" in the script below), and each contains many files (specified by "*" below). I'd like all of these files to be changed using this bash script. I tried to run this:
for file in /Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s*/*
do
    mv $file dicom${file%%}
done
but got thousands of lines of the following error (once for each file within each folder--this is just an example of one of them):
mv: rename /Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s307_1/29217684 to dicom/Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s307_1/29217684: No such file or directory
Any ideas on how to fix it?
I don't think you have a valid path, since dicom/Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s307_1/ does not exist. Why do you add dicom at the beginning of the whole path?
Maybe you want to append dicom to the end of $file?
mv "$file" "${file}_dicom"
or something like that.
The variable expansion ${file%%} is strange because it does nothing (the pattern is empty):
${parameter%%word} : remove the longest matching suffix pattern.
To move a file into a directory, that directory must exist; to create the path:
mkdir -p "$(dirname "${newfilename}")"
Maybe what you are trying to do is:
for file in /Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s*/*
do
    # ${file%/*} is the directory part, ${file##*/} is the basename:
    # rename each file in place by prefixing its basename with "dicom"
    mv "$file" "${file%/*}/dicom${file##*/}"
done
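A cautious variant (my addition) that previews the renames before doing them; drop the echo once the output looks right:
for file in /Volumes/USB_AIB/DICOMFunCurrentBatch/MOVr1unzip/s*/*
do
    echo mv "$file" "${file%/*}/dicom${file##*/}"   # prints the commands instead of running them
done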
