I have data for several stations, organized by station and by the day it was recorded. For station 1, for example, I have multiple folders called 2019.001, 2019.002, etc., and inside these folders are the files (all with the same name) ending in HHZ. What I have done so far is copy these files from each station into another folder, renaming them to carry the name of the day folder while keeping the station name; afterwards I created the folders corresponding to those names. My actual question is how to move the files that correspond to the same day, e.g. 2019.001.station1 and 2019.001.station2, into the folder 2019.001.
dir0=$(pwd)
mkdir -p data || exit 1
for pathname in "$dir0"/stam/*/*HHZ; do
    cp "$pathname" "data/$( basename "$( dirname "$pathname" )" )STAMHHZ"
done
for pathname in "$dir0"/macu/*/*HHZ; do
    cp "$pathname" "data/$( basename "$( dirname "$pathname" )" )MACUHHZ"
done
cd "$dir0/data"
mkdir 2019.0{10..31}
mkdir 2019.00{1..9}
It would also be nice to have a more general way of executing the part where I collect the files, so that it works for several stations: I am only working with two stations right now, but in the future I'll work with more.
Here is the tree to where the data is
macu
├── 2019.001
│   └── MACUHHZ
├── 2019.002
│   └── MACUHHZ
├── 2019.003
And
stam
├── 2019.001
│   └── STAMHHZ
├── 2019.002
│   └── STAMHHZ
├── 2019.003
│   └── STAMHHZ
So ideally the final situation would be:
data
├── 2019.001
│   ├── 2019.001MACUHHZ
│   └── 2019.001STAMHHZ
And so on
The script below creates the wanted file structure. The top-level directories from which you want to copy data (in your example macu and stam) should be added to the top_dirs variable. You can also change it to use a wildcard, or read them from a file, etc.
The basic idea is simple: For each top level directory, for each data directory, create the corresponding directory in data, and for each file, copy the file.
pushd and popd are used as a simple hack to make the * wildcards do what we want. $dir0 contains the root folder of the operation, so we always know where data is.
set -e is used to exit immediately if there is an error.
#!/bin/bash
set -e
top_dirs=( macu stam )
dir0="$(pwd)"
mkdir -p data
for dir in "${top_dirs[@]}" ; do
    pushd "$dir" >/dev/null
    for datadir in * ; do
        mkdir -p "$dir0/data/$datadir"
        pushd "$datadir" >/dev/null
        for file in *HHZ ; do
            cp "$file" "$dir0/data/$datadir/$datadir$file"
        done
        popd >/dev/null
    done
    popd >/dev/null
done
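As noted above, top_dirs can also be filled from a wildcard instead of being listed by hand. A minimal sketch of that variant, run here against a made-up sandbox layout (the station names and paths are invented for the demo; in your real setup you would glob in the directory that holds the station folders):

```shell
#!/bin/bash
set -e
# Demo sandbox with the same shape as the question's layout
dir0=$(mktemp -d)
cd "$dir0"
mkdir -p macu/2019.001 stam/2019.001
touch macu/2019.001/MACUHHZ stam/2019.001/STAMHHZ

mkdir -p data
top_dirs=()                       # every directory here except data/ itself
for d in */; do
    d=${d%/}
    [ "$d" = data ] || top_dirs+=( "$d" )
done

for dir in "${top_dirs[@]}"; do
    for datadir in "$dir"/*/; do
        day=$(basename "$datadir")
        mkdir -p "$dir0/data/$day"
        for file in "$datadir"*HHZ; do
            cp "$file" "$dir0/data/$day/$day$(basename "$file")"
        done
    done
done
ls "$dir0/data/2019.001"
```

Because the day folders are created on demand from whatever names exist under each station, the two mkdir brace-expansion lines are no longer needed, and a new station is picked up automatically.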
I'm trying to use a bash script to delete some unwanted files with the same name in different directories, e.g. text1.txt exists in multiple directories and I wish to remove it from every directory it exists in.
I need the script to delete the unwanted files and then also delete the directory in which that file 'text1.txt' exists; so if it exists in a folder named 'TextFiles', I need that folder to be deleted.
This is my current code I'm working on:
for files in "/*"
do
rm file.1txt file2.txt file3.txt
I'm a bit curious about whether the "/*" will look into all directories and whether the 'do' is working to remove the files stated.
Also, after utilising the 'rm' to remove specific files how do I delete the directory it exists in.
Many thanks!
Before I start, I have to note that the rm command can do some nasty things to your system. Automating it can lead to unintended data loss (system or personal files and folders) if used carelessly.
Now that I said that, imagine the following file structure:
bhuiknei@debian:~/try$ tree
.
├── dir1
│   └── this.txt
└── dir2
    ├── dir3
    │   ├── this
    │   └── this.txt
    ├── notthis.txt
    └── this.txt

3 directories, 5 files
To find and filter specific files, find and grep are your friends. The "-w" option makes grep match whole words only (so notthis.txt is not picked up):
bhuiknei@debian:~/try$ find . | grep -w this.txt
./dir1/this.txt
./dir2/dir3/this.txt
./dir2/this.txt
Now that we have all paths for the files lined up, these can be piped into a while loop where we can delete the files one-by-one. Then the empty directories can be deleted in a second step.
I would not suggest deleting the containing folders forcibly as they might contain other files and folders too.
The following script does the trick:
#!/bin/bash
#Exiting if no file name was given
[[ $# -ne 1 ]] && { echo "Specify a filename to delete in all sub folders"; exit 1; }
#Deleting files matching input parameter
echo "Deleting all files named ${1} in current and sub-directories."
find . | grep -w "$1" | \
while IFS= read -r LINE; do
    rm -v "$LINE"
done
#Deleting only-empty folders
rmdir -v *
exit 0
And the result:
bhuiknei@debian:~/try$ tree
.
├── dir1
│   └── this.txt
├── dir2
│   ├── dir3
│   │   ├── this
│   │   └── this.txt
│   ├── notthis.txt
│   └── this.txt
└── script

3 directories, 6 files
bhuiknei@debian:~/try$ ./script this.txt
Deleting all files named this.txt in current and sub-directories.
removed './dir1/this.txt'
removed './dir2/dir3/this.txt'
removed './dir2/this.txt'
rmdir: removing directory, 'dir1'
rmdir: removing directory, 'dir2'
rmdir: failed to remove 'dir2': Directory not empty
rmdir: removing directory, 'script'
rmdir: failed to remove 'script': Not a directory
bhuiknei@debian:~/try$ tree
.
├── dir2
│   ├── dir3
│   │   └── this
│   └── notthis.txt
└── script

2 directories, 3 files
Also a side note: I didn't test what happens if the working directory is different from where the script is located, so make sure to run it from the parent directory, or add some protection. Working with absolute paths can be a solution.
Good luck!
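As a footnote to the script above: find can do both the matching and the deletion itself, which avoids the grep step entirely. A sketch on a throwaway tree (`-delete` and `-empty` are supported by GNU and BSD find, though they are not strictly POSIX):

```shell
# Throwaway demo tree mirroring the example above
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p dir1 dir2/dir3
touch dir1/this.txt dir2/this.txt dir2/notthis.txt

# Exact-name match, files only, then sweep up directories left empty
find . -type f -name "this.txt" -delete
find . -mindepth 1 -type d -empty -delete

ls "$tmp"
```

This matches file names exactly (no accidental substring hits), and the second find only removes directories that are genuinely empty, so folders still holding other files survive.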
Since you know the file name, you can use it in a loop that parses the output of find with parameter expansion, like so:
find /path -name "file1.txt" | while read -r var
do
    echo "rm -Rf ${var%/file1.txt}"   # echo the command
    # rm -Rf "${var%/file1.txt}"      # execute the command once the list looks as expected
done
${var%/file1.txt} -
removes the suffix /file1.txt from each path printed by find, leaving only the containing directory; rm -Rf will then force removal of that directory along with the file.
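A quick self-contained illustration of this suffix removal (the paths are invented):

```shell
var="/path/dir1/file1.txt"
echo "${var%/file1.txt}"    # '%' removes the shortest matching suffix: /path/dir1

# '%%' would remove the longest matching suffix instead:
var="a/file1.txt/b/file1.txt"
echo "${var%%/file1.txt*}"  # prints: a
```

With a fixed suffix like /file1.txt the two forms behave the same on find's output; the difference only matters when the pattern contains wildcards.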
Alternatively you can use printf natively in find to print only the directory without the file:
find /path -name "file1.txt" -printf "%h\n" | while read -r var
do
    echo "rm -Rf $var"   # echo the command
    # rm -Rf "$var"      # execute the command once the list looks as expected
done
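One caveat with reading find's newline-separated output: paths that themselves contain newlines break the loop. A NUL-delimited sketch of the same idea, demonstrated on a throwaway tree (GNU find is assumed for -printf; the dry-run echo is kept, as in the answer above):

```shell
# Demo tree with an awkward directory name
tmp=$(mktemp -d)
mkdir -p "$tmp/dir with space"
touch "$tmp/dir with space/file1.txt"

# %h prints the containing directory; \0 delimits entries safely
find "$tmp" -name "file1.txt" -printf "%h\0" |
while IFS= read -r -d '' var; do
    echo "would remove: $var"   # swap in: rm -Rf "$var"
done
```

The read -d '' form pairs with -printf "%h\0" (or -print0) so that every byte of the path except NUL is passed through intact.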
Maybe one of you guys has something like this at hand already? I tried to use robocopy on Windows, but to no avail. I also tried to write a bash script on Linux with find etc., but gave up on that one too ^^ Unfortunately, a Google search didn't turn up a solution either. I need this for my private photo library.
Solution could be linux or windows based, both are fine. Any ideas?
I would like to get rid of hundreds of 'intermediary folders'.
I define an 'intermediary folder' as a folder that contains nothing other than exactly one sub-folder. Example:
folder 1
file in folder 1
folder 2 <-- 'intermediary folder: contains exactly one sub-folder, nothing else'
folder 3
file in folder 3
What I would like to end up with is:
folder 1
file in folder 1
folder 3
file in folder 3
I do not need the script to be recursive (removing several layers of intermediary folders at once), I'll just run it several times.
Even cooler would be if the script could rename folder 3 in the above example to 'folder 2 - folder 3', but I can live without this feature I guess.
I guess one of you linux experts has a one liner handy for that? ^^
Thank you very much!
Take a look at this code:
#!/usr/bin/env bash
shopt -s nullglob
while IFS= read -rd '' dir; do
    f=("$dir"/*)
    if ((${#f[@]}==1)) && [[ -d $f ]]; then
        mv -t "${dir%/*}" "$f" || continue
        rm -r "$dir"
    fi
done < <(find folder1 -depth -mindepth 1 -type d -print0)
Explanation:
shopt -s nullglob: allows filename patterns which match no files to expand to a null string
find ... -depth: makes find traverse the file system in a depth-first order
find ... -mindepth 1: processes all directories except the starting-point
find ... -type d: finds only directories
find ... -print0: prints the directories separated by a null character \0 (to correctly handle possible newlines in filenames)
while IFS= read ...: loops over all the directories (the output of find)
f=("$dir"/*): creates an array with all files in the currently processed directory
((${#f[@]}==1)) && [[ -d $f ]]: true if there is only one file and it is a directory
mv -t "${dir%/*}" "$f": moves the only subdirectory one directory above
mv ... || continue: mv can fail if the subdirectory already exists in the directory above; || continue skips such a subdirectory
rm -r "$dir": removes the processed directory
Test run:
$ tree folder1
folder1
├── file1
├── folder2
│   └── folder3
│       └── file3
├── folder4
│   ├── file4a
│   ├── file4b
│   └── file4c
└── folder5
    └── folder6
        ├── file6
        └── folder7
            └── folder8
                └── folder9
                    ├── dir9
                    └── file9
$ ./script
$ tree folder1
folder1
├── file1
├── folder3
│   └── file3
├── folder4
│   ├── file4a
│   ├── file4b
│   └── file4c
└── folder6
    ├── file6
    └── folder9
        ├── dir9
        └── file9
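The "even cooler" rename from the question (folder 3 ending up as 'folder 2 - folder 3') is a small variant of the mv line. An untested-on-your-data sketch, demonstrated here on a throwaway tree:

```shell
#!/usr/bin/env bash
shopt -s nullglob
# Throwaway demo: folder2 is an intermediary folder around folder3
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p folder1/folder2/folder3
touch folder1/file1 folder1/folder2/folder3/file3

while IFS= read -rd '' dir; do
    f=("$dir"/*)
    if ((${#f[@]}==1)) && [[ -d ${f[0]} ]]; then
        # keep both names: the moved folder becomes "<intermediary> - <child>"
        mv "${f[0]}" "${dir%/*}/${dir##*/} - ${f[0]##*/}" || continue
        rm -r "$dir"
    fi
done < <(find folder1 -depth -mindepth 1 -type d -print0)

ls folder1
```

The only change from the script above is the mv target: instead of moving the child up under its own name, it is renamed to "parent - child" in one step.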
I am looking for a way to archive all files of certain file types in one zip file per subfolder.
My folder structure is as follows:
/path/to
└── TopLevel
    ├── SubLevel1
    │   ├── SubSubLevel1
    │   ├── SubSubLevel2
    │   └── SubSubLevel3
    ├── SubLevel2
    │   ├── SubSubLevel1
    │   ├── SubSubLevel2
    │   └── SubSubLevel3
    ├── SubLevel3
    │   ├── SubSubLevel1
    │   └── SubSubLevel2
    └── SubLevel4
In each folder, subfolder, or sub-subfolder there are files of the types *.abc, *.xyz, and also *.001 through *.999, and I want to compress all of these files into one zip file per folder, i.e. all files of the specified types in folder "SubSubLevel1" of "SubLevel1" of "TopLevel" should be packaged into one file named "SubSubLevel1_data.zip" inside the "SubSubLevel1" folder. All other files in these folders, which do not match the search criteria described above, should be kept unzipped in the same directory.
I have found some ideas here or here, but both approaches are based on a different way of archiving the files, and I have so far not found a way to adapt them to my needs since I am not very experienced with shell scripting. I have also tried to get a solution with AppleScript, but there I face the problem of how to get all files in the folder with a number as the extension (*.001 through *.999). With RegEx I would do something like ".abc|.xyz|.\d\d\d", which would cover my search for certain file types, but I am also not sure how to use the result of a grep in AppleScript.
I guess someone out there must have an idea how to address my archiving issue. Thanks in advance for your suggestions.
After some playing around I came up with the following solution:
#!/bin/bash
shopt -s nullglob
find -E "$PWD" -type d -maxdepth 1 -regex ".*201[0-5][0-1][0-9].*" -print0 | while IFS="" read -r -d "" thisFolder ; do
    echo "The current folder is: $thisFolder"
    to_archive=( "$thisFolder"/*.[Aa][Bb][Cc] "$thisFolder"/*.[Xx][Yy][Zz] "$thisFolder"/*.[0-9][0-9][0-9] )
    if [ ${#to_archive[@]} != 0 ]
    then
        7z a -mx=9 -uz1 -x!.DS_Store "$thisFolder"/"${thisFolder##*/}"_data.7z "${to_archive[@]}" && rm "${to_archive[@]}"
    fi
    find "$thisFolder" -type d -mindepth 1 -maxdepth 1 -print0 | while IFS="" read -r -d "" thisSubFolder ; do
        echo "The current subfolder is: $thisSubFolder"
        to_archive=( "$thisSubFolder"/*.[Aa][Bb][Cc] "$thisSubFolder"/*.[Xx][Yy][Zz] "$thisSubFolder"/*.[0-9][0-9][0-9] )
        if [ ${#to_archive[@]} != 0 ]
        then
            7z a -mx=9 -uz1 -x!.DS_Store "$thisSubFolder"/"${thisSubFolder##*/}"_data.7z "${to_archive[@]}" && rm "${to_archive[@]}"
        fi
    done
done
My script has two nested loops to iterate through the subfolders and sub-subfolders. With find I look for a regex pattern in order to only back up folders from 2010-2015. All files matching the specified extensions inside the folders are compressed into one target archive per folder.
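For reference, the two nested loops can be collapsed into a single find over every directory at any depth. A sketch on a throwaway tree; tar is used here only as a stand-in archiver so the example runs without 7z installed, and the 2010-2015 date filter is left out:

```shell
#!/bin/bash
shopt -s nullglob
# Throwaway demo tree with matching and non-matching files
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p TopLevel/SubLevel1/SubSubLevel1
touch TopLevel/SubLevel1/SubSubLevel1/a.abc \
      TopLevel/SubLevel1/SubSubLevel1/b.001 \
      TopLevel/SubLevel1/SubSubLevel1/keep.txt

# One pass over every directory, however deep
find "$PWD" -type d -print0 | while IFS= read -r -d '' d; do
    to_archive=( "$d"/*.[Aa][Bb][Cc] "$d"/*.[Xx][Yy][Zz] "$d"/*.[0-9][0-9][0-9] )
    if (( ${#to_archive[@]} > 0 )); then
        names=( "${to_archive[@]##*/}" )              # tar wants basenames with -C
        tar -cf "$d/${d##*/}_data.tar" -C "$d" "${names[@]}" &&
            rm "${to_archive[@]}"
    fi
done

ls TopLevel/SubLevel1/SubSubLevel1
```

Swapping the tar line back to the 7z invocation from the script above (and restoring the -regex filter on the outer find) recovers the original behavior without the duplicated inner loop.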
I have multiple files in a folder. Every file has a different name and a different extension. I need a way to move each of the files into folders called Archive_1, Archive_2, Archive_n, and so on.
The order of the files doesn't matter, but I need one file per folder.
I was looking for something like sorting the files by name and, based on that, moving the first one to Archive_1, the second one to Archive_2, etc., but couldn't find it.
Any help?
Assuming all files are in PWD, you can execute:
i=0
for f in ./*; do
    new_dir=Archive_$((++i))
    mkdir -p "$new_dir"
    mv "$f" "$new_dir"
done
Test ( I created a script called sof with the above command ):
$ touch a b c
$ ./sof
$ tree
.
├── Archive_1
│   └── a
├── Archive_2
│   └── b
└── Archive_3
    └── c

3 directories, 3 files
I have a folder called folder1 that has a bunch of pictures in it, and I would like to save each picture in an individual folder. The folder should have the same name as the file; for example, if the picture is called "st123", I would like to create a folder called "st123" and move the picture into it. I have tried the following, but I get the error "cannot move to a subdirectory of itself". Is there another way to solve this?
Otherwise, would it be possible to save the image "st123" in a folder called "123" (so the last 3 characters)?
#!/bin/bash
Parent="/home/me/myfiles/folder1"
for file in $Parent; do
dir="${file%%.*}"
mkdir -p "$dir"
mv "$file" "$dir"
done
This solution might be useful to you, though if you'd like to iterate over the images recursively I suggest you use find. If that's indeed the case, let me know and I'll edit this answer with the relevant code.
shopt -s extglob
for image in /home/me/myfiles/folder1/*; do
    if [[ -f $image ]]; then
        newf="${image%%?([ab]).*}"
        mkdir -p "$newf"
        mv -- "$image" "$newf"
    fi
done
Test ( extglob is enabled ):
$ [/home/rany/] touch test/st123a.tif test/st123b.tif test/st456.jpg test/st456b.jpg test/st789.tif
$ [/home/rany/] for image in test/*; do newf="${image%%?([ab]).*}"; mkdir -p "$newf"; mv -- "$image" "$newf";done
$ [/home/rany/] tree test
test
├── st123
│   ├── st123a.tif
│   └── st123b.tif
├── st456
│   ├── st456b.jpg
│   └── st456.jpg
└── st789
    └── st789.tif

3 directories, 5 files
EDIT:
According to OP's request I added the following change:
Trim the file name's suffix a or b, so that, for example, the file names st123a.ext and st123b.ext will go to the same directory st123. Important: this feature requires extglob to be enabled ( i.e. shopt -s extglob ).
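The recursive variant mentioned at the top of this answer could look like the following sketch (demonstrated on a throwaway tree; mapfile -d needs bash ≥ 4.4, and the file list is snapshotted up front so the newly created directories are not rescanned mid-loop):

```shell
#!/bin/bash
shopt -s extglob
# Throwaway demo tree with one nested level
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p folder1/sub
touch folder1/st123a.tif folder1/sub/st456.jpg

# Snapshot the file list before moving anything
mapfile -d '' files < <(find folder1 -type f -print0)
for image in "${files[@]}"; do
    newf="${image%%?([ab]).*}"   # same a/b-trimming expansion as above
    mkdir -p "$newf"
    mv -- "$image" "$newf"
done

find folder1 -type f
```

Each file ends up in a sibling directory named after its trimmed stem, however deep it sits in the tree.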
You almost have it working! Change part of line 3 so the for loop iterates over what's inside the directory instead of the directory itself:
#!/bin/bash
Parent="/home/me/myfiles/folder1"
for file in "$Parent"/*; do
    dir="${file%%.*}"
    mkdir -p "$dir"
    mv "$file" "$dir"
done