I have a small interactive Unix script that takes a file type as terminal input, then iterates through a large folder of unsorted files on my desktop and pulls files of the chosen type into a new sorted folder.
i.e. the user types jpg at the prompt and all jpg files are pulled out of the unsorted folder and into the sorted folder.
It works great as it stands, but I would like to develop the script further so that instead of all file types being pushed into a communal sorted folder, jpg files are pushed into a dedicated folderjpg, png files into folderpng, and docx files into folderdocx.
How can I achieve this in the leanest possible manner, assuming that these dedicated folders for the file types mentioned have already been created on my desktop?
#!/bin/bash
echo "Good Morning, Please enter your file type name for sorting [ENTER]:"
read extension
mv -v /Users/christopherdorman/desktop/unsorted/*.${extension} /Users/christopherdorman/desktop/sorted/
if [[ $? -eq 0 ]]; then
echo "Good News, Your files have been successfully processed"
fi
I would write it this way:
read -p "Good Morning, Please enter your file type name for sorting [ENTER]:" extension
if cd /Users/christopherdorman/desktop; then
    destination="folder$extension"
    # ensure the destination folder exists
    mkdir -p "$destination"
    if mv -v unsorted/*."$extension" "$destination"; then
        echo "Good News, Your files have been successfully processed"
    fi
fi
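One edge case this doesn't cover: if no files match, the unexpanded pattern is handed to mv verbatim and you get a "No such file or directory" error. A minimal guard (my addition, not part of the answer above), assuming bash's nullglob option:

shopt -s nullglob
files=(unsorted/*."$extension")
if (( ${#files[@]} > 0 )); then
    mv -v "${files[@]}" "$destination"
    echo "Good News, Your files have been successfully processed"
else
    echo "No .$extension files found in unsorted/"
fi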
I am running numerous simulations on a remote server (via ssh). The outcomes of these simulations are stored as .tar archives in an archive directory on this remote server.
What I would like to do, is write a bash script which connects to the remote server via ssh and extracts the required output files from each .tar archive into separate folders on my local hard drive.
These folders should have the same name as the .tar file from which the files come (To give an example, say the output of simulation 1 is stored in the archive S1.tar on the remote server, I want all '.dat' and '.def' files within this .tar archive to be extracted to a directory S1 on my local drive).
For the extraction itself, I was trying:
for f in *.tar; do
(
    mkdir -p "../${f%.tar}"
    tar -x -f "$f" -C "../${f%.tar}" "*.dat" "*.def"
)
done
wait
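One portability note (my addition): the member patterns "*.dat" "*.def" are matched by default by the bsdtar that ships with macOS, but GNU tar only treats extraction arguments as patterns when given --wildcards, so a portable version of the tar line would be:

tar -x -f "$f" -C "../${f%.tar}" --wildcards "*.dat" "*.def"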
Every .tar file is around 1GB and there is a lot of them. So downloading everything takes too much time, which is why I only want to extract the necessary files (see the extensions in the code above).
Now the code works perfectly when I have the .tar files on my local drive. However, what I can't figure out is how I can do it without first having to download all the .tar archives from the server.
When I first connect to the remote server via ssh username@host, the script stops at that line and the terminal just drops me into a session on the server.
Btw I am doing this in VS Code and running the script through terminal on my MacBook.
I hope I have described it clear enough. Thanks for the help!
Stream the results of tar back with filenames via SSH
To get the data you wish to retrieve from .tar files, you'll need to pass the results of tar to a string of commands with the --to-command option. In the example below, we'll run three commands.
# Send the files name back to your shell
echo $TAR_FILENAME
# Send the contents of the file back
cat /dev/stdin
# Send EOF (Ctrl+d) back (note: since we're already in a $'' we don't use the $ again)
echo '\004'
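Assembled, the remote command looks roughly like this once all the escaping is stripped away (my reconstruction; it assumes GNU tar, since --to-command is a GNU extension, and uses printf rather than relying on echo understanding \004):

tar -xf S1.tar --wildcards '*.dat' '*.def' \
    --to-command='echo "$TAR_FILENAME"; cat /dev/stdin; printf "\004\n"'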
Once the information is captured in your shell, we can start to process the data. This is a three-step process.
1. Get the file's name
- Note that this code doesn't handle directories at all; it simply strips them away (i.e. dir/1.dat -> 1.dat).
- You could create directories for each file by splitting the path on the forward slashes / and iterating over each directory name, but that seems out of scope here.
2. Check for the EOF (end-of-file)
3. Add the content to the file
# Get the files via ssh and tar
files=$(ssh -n <user@server> $'tar -xf <tar-file> --wildcards \'*\' --to-command=$\'echo $TAR_FILENAME; cat /dev/stdin; echo \'\004\'\'')
# Keeps track of what state we're in (filename or content)
state="filename"
filename=""
# Each line is one of these:
# - file's name
# - file's data
# - EOF
while read -r line; do
if [[ $state == "filename" ]]; then
filename=${line/*\//}
touch "$filename"
echo "Copying: $filename"
state="content"
elif [[ $state == "content" ]]; then
# look for EOF (ctrl+d)
if [[ $line == $'\004' ]]; then
filename=""
state="filename"
else
# append data to file
echo "$line" >> <output-folder>/"$filename"
fi
fi
# Double quotes here are very important
done < <(echo -e "$files")
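A caveat worth adding here (my note, not part of the original answer): the while read loop is line-oriented, so this round-trips cleanly only for plain-text files. A stray \004 or NUL byte inside binary data would truncate or corrupt the output, and echo normalizes line endings. For textual .dat and .def simulation output, as the question implies, that is fine.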
Alternative: tar + scp
If the above example seems overly complex for what it's doing, it is. An alternative that touches the disk more and requires two separate ssh connections is to extract the files you need from your .tar file into a folder on the server, and then scp that folder back to your workstation.
ssh -n <username>@<server> 'mkdir output/; tar -C output/ -xf <tar-file> --wildcards "*.dat" "*.def"'
scp -r <username>@<server>:output/ ./
The breakdown
First, we'll make a place to keep the extracted files. You can skip this if you already know the folder they'll end up in.
mkdir output/
Then, we'll extract the matching files into the folder we just created (if you don't want them in a separate folder, remove the -C output/ option).
tar -C output/ -xf <tar-file> --wildcards '*.dat' '*.def'
Lastly, now that we're running commands on our machine again, we can run scp to reconnect to the remote machine and pull the files back.
scp -r <username>@<server>:output/ ./
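If the two connections bother you, a single-connection variant (my sketch, same placeholders as above) streams a fresh tar of the extracted folder back over the same ssh session and unpacks it locally, never touching scp:

ssh -n <username>@<server> 'mkdir -p output/; tar -C output/ -xf <tar-file> --wildcards "*.dat" "*.def"; tar -cf - output/' | tar -xf -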
I have this script that I use for backups. The problem is that it is kind of slow. I want to know if there is a diff command that stops when it finds the first difference.
DocumentsFiles=("Books" "Comics" "Distros" "Emulators" "Facturas" "Facultad" "Laboral" "Mods" "Music" "Paintings" "Projects" "Scripts" "Tesis" "Torrents" "Utilities")
OriginDocumentsFile="E:\Documents\\"
DestinationDocumentsFile="F:\Files\Documents\\"
## loop over each directory and copy it into the backup
for directory in "${DocumentsFiles[@]}"
do
    RealOrigin="${OriginDocumentsFile}${directory}"
    RealDestination="${DestinationDocumentsFile}${directory}"
    echo "$directory"
    if [ -e "$RealDestination" ]; then
        echo ok
        if diff -r "$RealOrigin" "$RealDestination"; then
            echo "${directory} are equal!"
        else
            rm -rfv "$RealDestination"
            cp -ruv "$RealOrigin" "${DestinationDocumentsFile}"
        fi
    else
        cp -ruv "$RealOrigin" "${DestinationDocumentsFile}"
    fi
done
diff -q reports "only when files differ" (per man diff), so I believe it'll stop after the first difference.
But this is a bit of an XY problem. Really you need a better backup program like rsync:
It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
From man rsync
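Applied to the backup script above, one rsync call per directory replaces the whole diff/rm/cp dance (a minimal sketch using standard rsync flags):

for directory in "${DocumentsFiles[@]}"; do
    # -a preserves attributes, -v is verbose, --delete removes files gone from the source
    rsync -av --delete "${OriginDocumentsFile}${directory}/" "${DestinationDocumentsFile}${directory}/"
done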
I have a script that sorts and processes unsorted files into newly created directories.
It's working great, but I am trying to understand the leanest method to get the script to:
1. create an additional file within each newly created directory that contains a list of all files in that directory, sorted by size;
2. output a further file to the desktop that contains a label for each directory and lists its files in reverse alphabetical order.
#!/bin/bash
read -p "Good Morning, Please enter your file type name for sorting [ENTER]:" all_extensions
if cd /Users/christopherdorman/desktop
then while read extension
do destination="folder$extension"
mkdir -p "$destination"
mv -v unsorted/*."$extension" "$destination"
done <<< "${all_extensions// /$'\n'}"
mkdir -p foldermisc
if mv -v unsorted/* "foldermisc"
then echo "Good News, the rest of Your files have been successfully processed"
fi
fi
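The two listing requirements from the question aren't implemented yet; here is a minimal sketch that could follow the sorting loop (the file names files_by_size.txt and directory_report.txt are my own invention), using ls -S for size order and ls -r for reverse alphabetical order:

# run from the desktop, after the sorting above
report=directory_report.txt
: > "$report"
for dir in folder*/; do
    # 1. per-directory listing, largest files first
    ls -S "$dir" > "${dir}files_by_size.txt"
    # 2. labelled reverse-alphabetical listing, collected on the desktop
    echo "== ${dir%/} ==" >> "$report"
    ls -r "$dir" >> "$report"
done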
src_dir="/export/home/destination"
list_file="client_list_file.txt"
file=".csv"
echo "src directory="$src_dir
echo "list_file="$list_file
echo "file="$file
cd /export/home/destination
touch "$list_file"
ls *"$file" | sort > "$list_file"
if [ -s "$list_file" ]
then
echo "List File is available, archiving now"
tar -cvf mystuff.tar "$list_file"
else
echo "List File is not available"
fi
The above script works fine: it creates a list file of all the .csv files and tars it.
However, I would like to run the script from a different directory, so that it goes to the destination directory, builds the list file of all the .csv files in that directory, and makes a .tar from the list file (i.e. archives the list file).
I am not sure what to change.
There are a lot of tricks in filename handling. The one thing you should know is that file naming under POSIX is a mess: commands like ls or find may not return the expected result (though 99% of the time they will). So here is how to get the list of files reliably:
for file in "$src_dir"/*.csv; do
    # strip the directory part so the list holds bare file names
    basename "$file" >> "$src_dir/$list_file"
done
# -C stores the list file in the archive without its directory path
tar -cvf "$src_dir/mystuff.tar" -C "$src_dir" "$list_file"
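Because every path in that version is anchored on $src_dir, the script no longer cares about the directory it is run from, and the cd in the original can simply be dropped. For example (script name hypothetical):

cd /tmp
bash /path/to/archive_csv_list.sh   # still writes client_list_file.txt and mystuff.tar under /export/home/destination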
Maybe you should learn bash in a serious manner, and try Googling first before asking a question on SO next time.
http://www.gnu.org/software/bash/manual/html_node/index.html#SEC_Contents
http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html
I'm trying to create a script on my Raspberry's Xbian that will "watch" a folder and compress any folders with images I saved there from Google Image Search.
What I want: The script will move all folders with " - Google Search" in their name to a temp folder, rename them removing the " - Google Search" part, leaving only the subject of the search query. Then, it will sequentially number the files in each folder using the folder name / the search query as their new name. So, "Random file.jpg" and "anoth3r_rand0m_file.png" will become "search_topic_01.jpg" and "search_topic_02.jpg".
Then, they'll be all moved to another folder, an "Incoming Images" one, where ImageMagick will do its magic on them, at the same time moving them to a "Ready Images" folder.
Still with me?
Here's what I got so far, from bundling stuff I found online together with my limited knowledge of Bash scripting:
echo "making temp"
mkdir /media/dBox/downloads/Xyma/temp
wait
echo "moving files"
mv /media/dBox/downloads/Xyma/*Google\ Search /media/dBox/downloads/Xyma/temp
wait
echo "renaming folders"
rename s/\ -\ Google\ Search// /media/dBox/downloads/Xyma/temp/*
wait
echo "renaming files"
for dir in /media/dBox/downloads/Xyma/temp/*/; do
  if test -d "$dir"; then
    (
      cd "$dir" || exit
      name=$(basename "$dir")
      n=1
      for file in *; do
        # keep the extension, rename to folder_name_01.ext, folder_name_02.ext, ...
        newfile=$(printf '%s_%02d.%s' "$name" "$n" "${file##*.}")
        mv "$file" "$newfile"
        n=$((n+1))
      done
    )
  fi
done
wait
echo "making ready subfolder"
mkdir /media/dBox/downloads/Xyma/temp/00_Unreg_Ready_Image_Folders
wait
echo "moving folders to ready folder"
mv /media/dBox/downloads/Xyma/temp/* /media/dBox/downloads/Xyma/00_Unreg_Ready_$
wait
echo "removing temp folder"
rmdir /media/dBox/downloads/Xyma/temp
...and let's just say "AAARGHRGHRGHHhh".
I'm sure there must be an even simpler way, with, say, a five-word command and maybe two parameters, that will automagically do everything and sprinkle it with stardust, or generally "a simpler and better way to do it", but it's currently slipping my mind.
So, I'm open to ideas and suggestions.
Any help? Anyone?
Help!