I'm trying to create a file hierarchy to store data. I want to create a folder for each data acquisition session. That folder has five subfolders, which are named below. My code attempt below gives an error, but I'm not sure how to correct it.
Code
#!/bin/sh
TRACES = "/Traces"
LFPS = '/LFPS'
ANALYSIS = '/Analysis'
NOTES = '/Notes'
SPIKES = '/Spikes'
folders=($TRACES $LFPS $ANALYSIS $NOTES $SPIKES)
for folder in "${folders[@]}"
do
mkdir $folder
done
Error
I get an error when declaring the variables. As written above, bash gives the error Command not found. If, instead, I declare the file names as TRACES = $('\Traces'), bash gives the error No such file or directory.
Remove the spaces between the variable names and the values:
#!/bin/sh
TRACES="/Traces"
LFPS='/LFPS'
ANALYSIS='/Analysis'
NOTES='/Notes'
SPIKES='/Spikes'
folders=($TRACES $LFPS $ANALYSIS $NOTES $SPIKES)
for folder in "${folders[@]}"
do
mkdir $folder
done
With spaces, bash interprets the line like
COMMAND param1 param2
with TRACES as the command, = as param1, and "/Traces" as param2.
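For example, reproducing the assignment with spaces at an interactive prompt (the exact error text varies by shell):
$ TRACES = "/Traces"        # the shell runs a command named TRACES with arguments = and /Traces
sh: TRACES: command not found
$ TRACES="/Traces"          # no spaces: a plain variable assignment, no error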
I'm taking the 'no spaces around the variable assignments' part of the fix as given.
Using array notation seems like overkill. Allowing for possible spaces in names, you can use:
for dir in "$TRACE" "$LFPS" "$NOTES" "$PASS"
do mkdir "$dir"
done
But even that is wasteful:
mkdir "$TRACE" "$LFPS" "$NOTES" "$PASS"
If you're worried that the directories might exist, you can avoid error messages for that with:
mkdir -p "$TRACE" "$LFPS" "$NOTES" "$PASS"
The -p option is also valuable if the paths are longer and some of the intermediate directories might be missing. If you're sure there won't be spaces in the names, the double quotes become optional (but they're safe and cheap, so you might as well use them).
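For instance, with a longer, hypothetical session path (the names here are only for illustration):
mkdir -p /data/session_01/Traces    # also creates /data and /data/session_01 if they are missing
mkdir -p /data/session_01/Traces    # running it again is fine: no error if it already exists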
You would also want to check beforehand whether the folders already exist.
You can always debug the shell script with set -x; and you could just use mkdir -p, which would do the trick.
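For example (the script name here is just a placeholder):
sh -x ./make_folders.sh     # prints each command, with variables expanded, before it runs
# or put `set -x` near the top of the script to trace from that point on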
I made the following changes to get your script to run.
As a review comment, it is unusual to create such folders hanging off the root file system.
#!/bin/sh
TRACES="/Traces"
LFPS='/LFPS'
ANALYSIS='/Analysis'
NOTES='/Notes'
SPIKES='/Spikes'
folders="$TRACES $LFPS $ANALYSIS $NOTES $SPIKES"
for folder in $folders
do
mkdir $folder
done
Spaces were removed from the initial variable assignments, and I also simplified the for loop so that it iterates over the words in the folders string.
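Note that this relies on word splitting of the unquoted $folders, so it only works while none of the names contain spaces. A quick illustration with the names from the question:
folders="/Traces /LFPS /Analysis /Notes /Spikes"
for folder in $folders; do echo "$folder"; done    # five iterations, one word per folder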
I have a script that I call from an application; I can't run it from the command line. I derive the directory the script is called from and, in the next variable, go up one level to where my files are stored. From there I have 3 variables with the full path and file names (with wildcards), which I will refer to as "masks".
I need to find and "do something with" (copy them, write their names to a new file, whatever else) the files matching each of these masks. The "do something" part isn't my obstacle, as I've done this fine when working with a single mask, but I would like to do it cleanly in a single loop instead of duplicating the loop and referencing each mask separately, if possible.
Assume in my $FILESFOLDER directory below that I have 2 existing files, aaa0.csv & bbb0.csv, but no file matching the ccc*.csv mask.
#!/bin/bash
SCRIPTFOLDER=${0%/*}
FILESFOLDER="$(dirname "$SCRIPTFOLDER")"
ARCHIVEFOLDER="$FILESFOLDER"/archive
LOGFILE="$SCRIPTFOLDER"/log.txt
FILES1="$FILESFOLDER"/"aaa*.csv"
FILES2="$FILESFOLDER"/"bbb*.csv"
FILES3="$FILESFOLDER"/"ccc*.csv"
ALLFILES="$FILES1
$FILES2
$FILES3"
#here as an example I would like to do a loop through $ALLFILES and copy anything that matches to $ARCHIVEFOLDER.
for f in $ALLFILES; do
cp -v "$f" "$ARCHIVEFOLDER" > "$LOGFILE"
done
echo "$ALLFILES" >> "$LOGFILE"
The thing that really spins my head is that when I run something like this (I haven't done it with the copy command in place), the log file at the end shows:
filesfolder/aaa0.csv filesfolder/bbb0.csv filesfolder/ccc*.csv
Where I would expect echoing $ALLFILES just to show me the masks
filesfolder/aaa*.csv filesfolder/bbb*.csv filesfolder/ccc*.csv
In my "do something" area, I need to be able to use whatever method to find the files by their full path/name with the wildcard if at all possible. Sometimes my network is down for maintenance and I don't want to risk failing a change directory. I rarely work in linux (primarily SQL background) so feel free to poke holes in everything I've done wrong. Thanks in advance!
Here's a light refactoring with significantly fewer distracting variables.
#!/bin/bash
script=${0%/*}
folder="$(dirname "$script")"
archive="$folder"/archive
log="$folder"/log.txt # you would certainly want this in the folder, not $script/log.txt
shopt -s nullglob
all=()
for prefix in aaa bbb ccc; do
cp -v "$folder/$prefix"*.csv "$archive" >>"$log" # append, don't overwrite
all+=("$folder/$prefix"*.csv)
done
echo "${all[#]}" >> "$log"
The change in the loop to append the output of cp -v instead of overwriting is a bug fix; otherwise the log would only contain the output from the last loop iteration.
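A quick illustration of the difference between the two redirections:
echo first  > log.txt     # > truncates log.txt before writing
echo second > log.txt     # log.txt now contains only "second"
echo third >> log.txt     # >> appends: log.txt now contains "second" and "third"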
I would probably prefer to have the files echoed from inside the loop as well, one per line, instead of collecting them all on one humongous line. Then you can remove the all array and instead simply
printf '%s\n' "$folder/$prefix"*.csv >>"$log"
shopt -s nullglob is a Bash extension (so won't work with sh) which says to discard any wildcard which doesn't match any files (the default behavior is to leave globs unexpanded if they don't match anything). If you want a different solution, perhaps see Test whether a glob has any matches in Bash
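A short demonstration of the behavior (the directory here is made up, assuming it doesn't exist):
shopt -u nullglob
echo /no/such/dir/*.csv     # prints the literal pattern, since nothing matches
shopt -s nullglob
echo /no/such/dir/*.csv     # prints an empty line: the non-matching glob is discarded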
You should use lower case for your private variables so I changed that, too. Notice also how the script variable doesn't actually contain a folder name (or "directory" as we adults prefer to call it); fixing that uncovered a bug in your attempt.
If your wildcards are more complex, you might want to create an array for each pattern.
tmpspaces=(/tmp/*\ *)
homequest=($HOME/*\?*)
for file in "${tmpspaces[#]}" "${homequest[#]}"; do
: stuff with "$file", with proper quoting
done
The only robust way to handle file names which could contain shell metacharacters is to use an array variable; using string variables for file names is notoriously brittle.
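A tiny illustration with a made-up name containing a space:
liststr="aaa 0.csv bbb0.csv"          # as a string: no way to tell which spaces separate names
listarr=("aaa 0.csv" "bbb0.csv")      # as an array: each element is exactly one name
printf '%s\n' "${listarr[@]}"         # prints the two names intact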
Perhaps see also https://mywiki.wooledge.org/BashFAQ/020
I am trying to split the path of a file to get the directory name, so that I can check whether the directory exists in the new location or not, using a shell script.
I tried using
cf=src/classes/CarExperience.cls
echo ${cf%/*}
echo ${cf##/*/}
echo ${cf#/*/*/}
echo ${cf%/*}
echo $(dirname "$cf")
But none of these gives me the desired result.
The desired result is to get the part after src and check whether that inner directory exists or not.
cf=src/classes/CarExperience.cls
directory_name=classes
I'd appreciate any help in this regard.
You could do:
full_dir=$(dirname "$cf")
last_dir=$(basename "$full_dir")
or in one shot
last_dir=$(basename "$(dirname "$cf")")
Yes, you want all those quotes.
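For example, with the value from the question:
cf=src/classes/CarExperience.cls
last_dir=$(basename "$(dirname "$cf")")
echo "$last_dir"    # classes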
With shell parameter expansion:
full_dir=${cf%/*}
last_dir=${full_dir##*/}
That one has to be done in 2 steps.
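Again with the value from the question:
cf=src/classes/CarExperience.cls
full_dir=${cf%/*}           # src/classes
last_dir=${full_dir##*/}    # classes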
Like this, using parameter expansion as you try to do:
cf=src/classes/CarExperience.cls
cf=${cf#src/*} # becomes 'classes/CarExperience.cls'
echo ${cf%/*} # prints 'classes'
Output
classes
I'm trying to run a command for all the folders that start with the word SAM. These folders are inside another folder called date (the date changes), the date folder is inside another folder called subject_01 (the subject changes), and the subject_01 folder is inside the main folder called root.
Structure:
root/subject/date/SAM_folders
This is the command I want to run; it needs to be executed from the date folder:
dtiConvPrep.sh folder_name
Example:
dtiConvPrep.sh SAM_03_14_25
I created a script:
#!/bin/bash
array=(/root/*/*) #this vector contains all the folders (subject/date)
len=${#array[@]}
for (( q=0; q<$len; q++ ));
do
cd ${array[$q]} #To execute the command from the folder date for each subject
sleep 1
dtiConvPrep.sh SAM*
done
But it only runs for one SAM folder in each date folder, for all the subjects.
Any idea how I can solve this problem? Thanks
for dir in /root/*/*/SAM_*; do
(
cd "$(dirname "$dir")"
dtiConvPrep.sh "$(basename "$dir")"
)
done
A for ((i = 0; i < len; ++i)) style loop is a very C-/Java-like thing to do. In Bash it's more idiomatic to iterate over arrays directly. Or in this case, iterate over the glob directly.
I put parentheses around the loop body so the cds run in a subshell and are only temporary. It's not necessary here since you're cding to absolute paths, but it's a good habit to get into. I like to avoid cding in the middle of scripts as it changes global state in an easy to mess up way.
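A quick sketch of the effect:
pwd                    # say, /home/user
( cd /tmp && pwd )     # prints /tmp, but only the subshell changed directory
pwd                    # still /home/user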
You may find all the double quotes a bit of an eyesore but it's a good habit to always quote variable expansions and $(...) expansions in case they contain whitespace or other special characters. In this script we need nested quotes to be 100% safe.
I am trying to run a bash script:
#~/bin/sh
mkdir metaphlan2
echo "Profiling from reads"
echo
samples="CFC280618 MFC280618 SBW280618"
for x in ${samples}
do metaphlan2.py ${x}_S*_R1_001.fastq.gz,${x}_S*_R2_001.fastq.gz --bowtie2out metaphlan2/\${x}.bowtie2.bz2 -o metaphlan2/\${x}.metaphlan2.txt --input_type multifastq --nproc 10
done
Where the * is supposed to represent any character.
However when I ran my script in terminal, I got the following error:
IOError: [Errno 2] No such file or directory: 'SBW280618_S*_R1_001.fastq.gz'
Will anyone be kind enough to help please?
Thank you.
The * is not supposed to represent any character; the shell expands the pattern so that it matches the names of files that actually exist, substituting the * with whatever is needed.
Unfortunately, you are trying to expand
${x}_S*_R1_001.fastq.gz,${x}_S*_R2_001.fastq.gz
and you probably don't have any file called that. I'm pretty sure you have two files, one matching ${x}_S*_R1_001.fastq.gz and another matching ${x}_S*_R2_001.fastq.gz, but the comma in the middle makes the whole thing be treated as a single pattern/file name, which matches nothing.
You need to separate the file names with spaces. If that's not possible, you will have to expand both patterns independently and then build the comma-separated string your program needs.
Try with this second option:
#!/bin/sh
mkdir metaphlan2
echo "Profiling from reads"
echo
samples="CFC280618 MFC280618 SBW280618"
for x in ${samples}
do
fq1=$(echo "${x}"_S*_R1_001.fastq.gz)    # let the shell expand the glob here; assumes exactly one match
fq2=$(echo "${x}"_S*_R2_001.fastq.gz)
metaphlan2.py "$fq1,$fq2" --bowtie2out "metaphlan2/${x}.bowtie2.bz2" -o "metaphlan2/${x}.metaphlan2.txt" --input_type multifastq --nproc 10
done
I have recently just made this script:
if test -s $HOME/koolaid.txt ; then
Billz=$(grep / $HOME/koolaid.txt)
echo $Billz
else
Billz=$HOME/notkoolaid
echo $Billz
fi
if test -d $Billz ; then
echo "Ok"
else touch $Billz
fi
So basically, if the file $HOME/koolaid.txt does NOT exist, then Billz will be set to $HOME/notkoolaid. It then successfully creates the file.
However, if I do create koolaid.txt, then I get this
mkdir: cannot create directory : No such file or directory
Any help would be appreciated
There is a difference between the content of a variable and its evaluated content...
If your variable contains the string $HOME/some, you need to expand it to get /home/login/some.
One dangerous method is eval.
bin=$(grep / ~/.rm.cfg)
eval rbin=${bin:-$HOME/deleted}
echo "==$rbin=="
Don't eval unless you're absolutely sure what you're evaling...
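To make the difference concrete, using the same example value (the expanded path depends on your actual $HOME):
bin='$HOME/deleted'
echo "$bin"          # prints the literal text $HOME/deleted
eval echo "$bin"     # prints something like /home/login/deleted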
Here are a couple things to fix:
Start your script with a "shebang," such as:
#!/bin/sh
This way the shell will know that you want to run this as a Bourne shell script.
Also, your conditional at the top of the script doesn't handle the case well in which .rm.cfg exists but doesn't contain a slash character anywhere in it. In that case the rbin variable never gets set.
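A minimal sketch of a fallback so the variable always ends up with a value (using the same file names as above; expanding any $HOME inside the file is a separate issue, as discussed in the other answer):
rbin=$(grep / ~/.rm.cfg 2>/dev/null)   # empty if the file is missing, empty, or has no slash
rbin=${rbin:-$HOME/deleted}            # so fall back to a default in that case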
Finally, try adding the line
ls ~
at the top so you can see how the shell is interpreting the tilde character; that might be the problem.