Create infinite number of nested folders using infinite loop - bash

I am working on a small project. The task is to create a bash script that, when run, creates an infinite number of folders, and inside each folder creates an infinite number of folders, and so on. For example:
Folder1
    SubFolder1
        SubSubFolder1
            SubSubSubFolder1
            SubSubSubFolder2
            SubSubSubFolder3
            ...
            SubSubSubFolder∞
        SubSubFolder2
        SubSubFolder3
        ...
        SubSubFolder∞
    SubFolder2
    SubFolder3
    ...
    SubFolder∞
...
Folder∞
    SubFolder1
    SubFolder2
    SubFolder3
    ...
    SubFolder∞
    (same structure continues inside each)
The sequence is like this:
Folder1 contains infinitely many folders (SubFolder1, SubFolder2, ..., SubFolder∞). The first of these (SubFolder1) contains further subfolders (SubSubFolder1, ..., SubSubFolder∞), SubSubFolder1 contains further subfolders (SubSubSubFolder1, ..., SubSubSubFolder∞), and so on. The same applies to Folder2 and its subfolders.
I tested the script with 10 folders. It created 10 top-level folders with 10 subfolders each, but it stopped there; I want it to run continuously (an infinite loop). It also works sequentially (it creates 10 subfolders in Folder1, then comes back out and creates Folder2 and its subfolders, and so on); I want it to work concurrently (build Folder1 and its subfolders at the same time as Folder2 and its subfolders, and so on). The code:
#!/bin/bash
# Set the number of outer and inner directories to create
num_outer=10
num_inner=10
# Create the outer loop
for ((i=0; i<num_outer; i++))
do
    # Create a new outer directory with a unique name
    mkdir "outer_folder_$i"
    # Navigate into the new outer directory (bail out if it failed)
    cd "outer_folder_$i" || exit 1
    # Create the inner loop
    for ((j=0; j<num_inner; j++))
    do
        # Create a new inner directory with a unique name
        mkdir "inner_folder_$j"
    done
    # Navigate back to the parent directory
    cd ..
done

This rekfolder.sh script does not produce an infinite number of directories, because the constant flow of new hard drives would be too expensive for me, but about x! (x factorial) directories, where x is given as a command-line parameter and counted down in each recursive step.
Start it like this:
timeout -k 5 3s ./rekfolder.sh f 8
which stops the script after 3 s and, if things go wrong, kills it 5 s later. f will be the name part of the starting directory.
#!/bin/bash
name=$1
count=$2
# save my harddrive!
if (( ${#name} > 25 ))
then
    echo err namelength
    exit 0
elif (( count == 0 ))
then
    echo err count
    exit 0
else
    for ((i=0; i<count; i++))
    do
        # save my harddrive again
        sleep 0.3
        echo mkdir "${name}$i"
        mkdir "${name}$i"
        ./rekfolder.sh "${name}$i/" $((count-1)) &
    done
fi
This produced 109600 directories. The sleep is in there to allow interrupting the exponential growth with a killall rekfolder.sh in a second terminal, but that gets hard if you don't interrupt early.
You can delete all those folders, if they were created in a fresh directory, with
find -type d -delete
(took me about a second on an SSD).
Note that there is a lot of trailing output long after all the ./rekfolder.sh scripts have finished, which makes it look as if the timeout did not work. You can watch the processes from a second terminal:
for i in {1..10}
do
    ps -C rekfolder.sh
    sleep 1
done

Related

FSL - Can I set the path within an fsf file (program setup file) to current working directory?

I am performing MRI analysis, and have written a script that loops through all scans for each subject, with the next step being to run a command called feat, which can be seen towards the end of the coding block below.
#! /bin/sh
path=~/rj82/james_folder/data_copy/
cd $path
# Loop through the MND and Control directories
for directory in * ; do
    cd $path
    cd $directory
    # Loop through each subject in each directory
    for subject in ??? ; do
        cd $path/$directory/$subject
        # Loop through each scan for each subject
        for scan in MR?? ; do
            cd $path/$directory/$subject/$scan
            # Run feat on each scan
            feat design.fsf
            cd ..
        done
    done
done
You will also see feat takes a design.fsf file which sets the parameters for feat. To make this file I used one MRI scan as input data.
Below are the regions of the design.fsf code that show the paths of the files used to create the file.
# 4D AVW data or FEAT directory (1)
set feat_files(1) "/projects/rj82/james_folder/data_copy/mnd/002/MR02/fmri/fmri_data"
# Add confound EVs text file
set fmri(confoundevs) 0
# Session's alternate reference image for analysis 1
set alt_ex_func(1) "/projects/rj82/james_folder/data_copy/mnd/002/MR02/fmri_ref/fmri_ref_brain"
# B0 unwarp input image for analysis 1
set unwarp_files(1) "/projects/rj82/james_folder/data_copy/mnd/002/MR02/fmaps/fmap_rads"
# B0 unwarp mag input image for analysis 1
set unwarp_files_mag(1) "/projects/rj82/james_folder/data_copy/mnd/002/MR02/fmaps/mag_e1_brain"
# Subject's structural image for analysis 1
set highres_files(1) "/projects/rj82/james_folder/data_copy/mnd/002/MR02/t1/t1_brain"
If I run the script (first coding block), feat runs correctly; however, as the paths within the design.fsf file refer to only one scan, it will just repeatedly run feat on that single scan.
As the subdirectories and files within each subject have the same names, I want to replace the path /projects/rj82/james_folder/data_copy/mnd/002/MR02 with the current directory from the script (first coding block), whilst keeping the end portion (e.g. fmri/fmri_data), allowing me to loop through and run feat on each subject.
I have tried setting path=pwd and replacing the above path with "$path/fmri/fmri_data", which does not work, as well as removing the /projects/rj82/james_folder/data_copy/mnd/002/MR02 portion entirely, as I hoped it would just use the current directory, but this also doesn't work. The error message is the same for both:
/bin/mkdir: cannot create directory ‘/fmri’: Permission denied
while executing
"fsl:exec "/bin/mkdir -p $FD" -n"
(procedure "firstLevelMaster" line 22)
invoked from within
"firstLevelMaster $session"
invoked from within
"if { $done_something == 0 } {
if { ! $fmri(inmelodic) } {
if { $fmri(level) == 1 } {
for { set session 1 } { $session <= $fmri(mult..."
(file "/usr/local/fsl/6.0.4/fsl/bin/feat" line 390)
I could not get what I was trying to achieve to work, so instead I looped through in the same way as above, copied my design.fsf file into each directory, and edited it with sed to set the right path.
#! /bin/sh
path=~/rj82/james_folder/data_copy/
cd $path
# Loop through the MND and Control directories
for directory in * ; do
    cd $path
    cd $directory
    # Loop through each subject in each directory
    for subject in ??? ; do
        cd $path/$directory/$subject
        # Loop through each scan for each subject
        for scan in MR?? ; do
            cd $path/$directory/$subject/$scan
            # Copy template design.fsf into each scan folder
            cp $path/../design.fsf design.fsf
            # Change design.fsf file and replace the directory used
            # to create the template with the current directory
            current=$directory/$subject/$scan
            sed -i "s|mnd/002/MR02|$current|g" design.fsf
            # Run feat using the custom design.fsf file
            feat design.fsf
            cd ..
        done
    done
done
My script navigates into each individual subject and scan, so I have changed the design file to: set feat_files(1) "/fmri/fmri_data"
That was a good idea, but in order to make the paths relative you also have to remove the leading /; the error message cannot create directory ‘/fmri’ hints at this too.
With truly relative paths in the .fsf file, it should also be possible to use a common design.fsf by calling feat /path/design.fsf.
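For example (the file contents below are a one-line stand-in for the real template, and GNU sed is assumed), the prefix used when the template was created can be stripped from a copied design.fsf in one substitution, which makes every path in it relative:

```shell
# Demo: make paths in a copied design.fsf relative by deleting the
# absolute prefix used when the template was created (GNU sed assumed).
prefix='/projects/rj82/james_folder/data_copy/mnd/002/MR02/'

# A one-line stand-in for the real template, for illustration only:
printf 'set feat_files(1) "%sfmri/fmri_data"\n' "$prefix" > design.fsf

# Strip the prefix everywhere, leaving relative paths with no leading /
sed -i "s|$prefix||g" design.fsf
cat design.fsf
```

After this, the file reads set feat_files(1) "fmri/fmri_data", i.e. relative to whatever directory feat is run from.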

copying files to another folder using bash

I have a script that copies 1000 files to another folder. However, I have files that end with the following:
*_LINKED_1.trees
*_LINKED_2.trees
*_LINKED_3.trees
.
.
.
*_LINKED_10.trees
'*' is not part of the name; there is some string in its place.
What I want is to copy 1000 files for each of the types listed above, in a smart way.
#!/bin/bash
for entry in /home/noor/popGen/sweeps/hard/final/*
do
    for i in {1..1000}
    do
        cp $entry /home/noor/popGen/sweeps/hard/sample/hard
    done
done
Could there be a smarter way to copy 1000 files of each type? One way would be an if statement, but I would have to change that if statement 10 times.
The script below will do the required task.
for i in {1..10}
do
    file_count=0
    for j in source/*_LINKED_$i.trees
    do
        file_count=$((file_count+1))
        echo "cp $j destination/"    # remove the echo to actually copy
        if ((file_count >= 1000))
        then
            break
        fi
    done
done
The outer loop for i in {1..10} selects the file type (*_LINKED_$i.trees).
The inner loop iterates through all files of that type (*_LINKED_1.trees, *_LINKED_2.trees, etc., up to *_LINKED_10.trees) and copies the first 1000 of them (the limit in the file_count check) into the destination for that particular type. As written, the cp commands are only echoed as a dry run; remove the echo to perform the copies.
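A variant sketch of the same idea, with a per-type counter and an existence guard so an unmatched glob is skipped quietly; source and destination are placeholder directory names:

```shell
# Copy at most 1000 files of each *_LINKED_$i.trees type.
# "source" and "destination" are placeholder directory names.
for i in {1..10}; do
    n=0
    for f in source/*_LINKED_"$i".trees; do
        [ -e "$f" ] || continue        # glob matched nothing: skip type
        cp "$f" destination/
        n=$((n+1))
        [ "$n" -ge 1000 ] && break     # 1000 copied for this type
    done
done
```

Resetting the counter inside the outer loop matters: otherwise a type with fewer than 1000 files would leave its count behind and shortchange the next type.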

Nesting a case statement within a for loop

I'm trying to follow the instructions below in order to create one directory containing four subdirectories inside, each of these latter with five new empty files:
Create a directory with the name of the first positional parameter (choose whatever name you want).
Use a for loop to do the following:
2.1. Within the directory, create four subdirectories with these names: rent, utilities, groceries, other.
2.2. Within the for loop, use case statements to determine which subdirectory is currently being handled in the loop. You will need 4 cases.
2.3. Within each case, use a for loop to create 5 empty files with the touch command. Each subdirectory must have its 5 files inside.
So far, I have created a directory and several subdirectories at once, each of them with a specific name, as you can see in my code:
mkdir $1
for folder in $1; do
    mkdir -p $1/{Rent,Utilities,Groceries,Other}
done
However, I'm stuck in the following step (2.2.), and I don't know how to continue from here.
Any ideas?
As I read it, this is what 2.1 and 2.2 are asking for:
for folder in rent utilities groceries other; do
    mkdir "$1/$folder"
    case $folder in
        rent)
            ...
            ;;
        utilities)
            ...
            ;;
        groceries)
            ...
            ;;
        other)
            ...
            ;;
    esac
done
I've left the cases blank for you to fill out.
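If it helps, here is one hedged way the blanks could be filled in, satisfying 2.3 with a touch loop in every case; the wrapper function name make_budget and the file names file1..file5 are placeholders of my choosing:

```shell
# Sketch: a function standing in for the assignment script, with each
# case branch creating 5 empty files. Names are placeholders.
make_budget() {
    mkdir -p "$1"
    for folder in rent utilities groceries other; do
        mkdir "$1/$folder"
        case $folder in
            rent)
                for n in 1 2 3 4 5; do touch "$1/rent/file$n"; done
                ;;
            utilities)
                for n in 1 2 3 4 5; do touch "$1/utilities/file$n"; done
                ;;
            groceries)
                for n in 1 2 3 4 5; do touch "$1/groceries/file$n"; done
                ;;
            other)
                for n in 1 2 3 4 5; do touch "$1/other/file$n"; done
                ;;
        esac
    done
}
```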
For what it's worth, I would never code a script this way. Having a case statement inside a for loop is an anti-pattern. The loop really adds no value. If this weren't an assignment I would code it differently:
mkdir "$1"
# Populate rent/ directory.
mkdir "$1"/rent
touch "$1"/rent/...
# Populate utilities/ directory.
mkdir "$1"/utilities
touch "$1"/utilities/...
# Populate groceries/ directory.
mkdir "$1"/groceries
touch "$1"/groceries/...
# Populate other/ directory.
mkdir "$1"/other
touch "$1"/other/...
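The elided touch lines could use brace expansion to create all 5 files in one command; demo and the file names here are placeholders of mine:

```shell
# Brace expansion creates file1 through file5 in one touch call
# ("demo" and the file names are placeholders).
mkdir -p demo/rent
touch demo/rent/file{1..5}   # creates file1 file2 file3 file4 file5
```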

Files not found in the current working dir

I have a really basic question. I'm trying to run STAR to align some reads from a sequencing experiment. I have around 30 samples. I first split the fastq.gz files into 30 folders according to the sample of origin. In other words, in each folder corresponding to a Sample* there are 4 files named *.fastq.gz (4 because of 4 lanes). I'm trying to loop STAR over each folder in this way:
for i in Sample*/;
do
    cd $i;
    qsub my_align_script.sh;
    cd ..;
done
where my_align_script.sh contains the following:
for i in *fastq.gz; do
    STAR --runMode alignReads --genomeLoad LoadAndKeep --readFilesCommand zcat --outSAMtype BAM Unsorted --genomeDir "path" --readFilesIn $i ${i%.fastq.gz}.fastq.gz --runThreadN 10 --outFileNamePrefix ${i%.fastq.gz}
done
but unfortunately it does not seem to find any *.fastq.gz file.
I tried to force it to look in the current working directory of each Sample* folder by specifying cd $path at the beginning, but nothing changed when I ran the for loop to launch jobs over the folders.
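One likely cause (an assumption, since the scheduler isn't named): batch schedulers such as Grid Engine or PBS typically start jobs in the home directory rather than in the directory qsub was run from, so the *fastq.gz glob inside my_align_script.sh matches nothing. Grid Engine's qsub -cwd flag, or cd "$PBS_O_WORKDIR" at the top of a PBS job script, makes the job run in the submit directory instead. The failure mode can be reproduced without any scheduler:

```shell
# Minimal demonstration of the failure mode: a process started outside
# the sample directory sees no *.fastq.gz files until it cd's there.
# (Directory and file names below are made up for the demo.)
mkdir -p Sample1
: > Sample1/reads_L001.fastq.gz

# Without cd, the glob does not match (bash prints the literal pattern):
( echo *.fastq.gz )

# With a cd into the sample dir, as a scheduler directive or an explicit
# cd at the top of my_align_script.sh would arrange:
( cd Sample1 && echo *.fastq.gz )   # prints reads_L001.fastq.gz
```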

How to set automator (mac) to create folders and fill them with x number of files?

I have seen several variations of this question asked, but none of them fully answers my problem. I have several folders that contain between 2,000 and 150,000 image files. Searching these directories becomes very inefficient as the number of files increases, since speed drops drastically.
What I want to do is use automator to:
1. select folder
2. create subfolder
3. fill newly created subfolder with the first 1000 files in the folder selected in (1.)
4. if more files exist in the outer folder, create another subfolder and fill with the next 1000 files etc.
Is this possible? Any help is much appreciated.
Thank you,
Brenden
This takes a directory and moves the contents to new folders called "newFolder1", "newFolder2" etc.
Have you used Terminal much? Let me know if you need more instruction. I also haven't put in any checks, so let me know if you get any errors.
o Save this file to your desktop (as script.sh for the purpose of tutorial)
#!/bin/bash
cd "$1" # Change directory to the folder to sort
filesPerFolder=$2 # This is how many files will be in each folder
currentDir='newFolder1'
currentFileCount=0
currentDirCount=1
mkdir $currentDir
for file in *
do
    if [ -f "$file" ] # Only count and move regular files, not folders
    then
        mv "$file" "$currentDir"
        currentFileCount=$(($currentFileCount + 1))
        if [ $(($currentFileCount % $filesPerFolder)) -eq "0" ] # Every X files, make a new folder
        then
            currentDirCount=$(($currentDirCount + 1))
            currentDir='newFolder'$currentDirCount
            mkdir "$currentDir"
        fi
    fi
done
o Open Terminal and type cd ~/Desktop/
o Type chmod +x script.sh to make the script executable
o Type ./script.sh "/path/to/folder/you/want/to/sort" 30
o The 30 here is how many files you want in each folder.
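As a sanity check of the same bucketing logic without touching real photos, this sketch groups 7 throwaway files 3 per folder (all names here are made up; note the counter only advances for regular files, so the newly made folders are not counted):

```shell
# Quick demo of the bucketing logic: 7 dummy files, 3 per folder.
mkdir -p demo && cd demo
for n in 1 2 3 4 5 6 7; do : > "img$n.txt"; done

per=3; count=0; dir=1
mkdir newFolder1
for f in *; do
    [ -f "$f" ] || continue          # skip the newFolder* dirs themselves
    mv "$f" "newFolder$dir"
    count=$((count + 1))
    if [ $((count % per)) -eq 0 ]; then
        dir=$((dir + 1))
        mkdir "newFolder$dir"        # next bucket every $per files
    fi
done
cd ..
```

Afterwards newFolder1 and newFolder2 hold 3 files each and newFolder3 holds the remaining one.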
