Copying directories recursively using shell script - shell

Should be an easy question for the gurus here, though it's hard to explain it in text so hopefully this is clear. I've got two directories on a box with some flavor of unix on it. I've got a script that I want to use to move all the files and directories from one location to another.
First, an example of how the directories look:
Directory A: final/results/2012/2012-02/2012-02-25/name/files
Directory B: test/results/2012/2012-02/2012-02-24/name/files
So you see they're very similar. What I want to do is move everything from the Directory B 2012 directory, recursively, to the same level of Directory A. So you'd end up with:
someproject/results/2012/2012-02/2012-02-25/name/files
someproject/results/2012/2012-02/2012-02-24/name/files
etc.
I want this script to be future-proof though, meaning I don't want the 2012 hardcoded. Also, towards the end of a month you will potentially have data from two different months, and both need to be copied into the 2012 directory. So here is the command I used in the shell script file:
CONS="/someproject";
ROOT="/test";
/bin/cp -r ${ROOT}/results/* ${CONS}/results/*
but this resulted in:
/final/results/2012/2012-02/2012-02-25/name/files
and
/final/results/2012/2012/2012-02/2012-02-24/name/files
So as I hope is clear, it started a level below where I wanted it to. Can anyone fill me in on what I'm doing wrong, if you can understand what I'm even trying to explain? My apologies if it's not clear. I'm sure this is a fairly simple fix, but I'm not sure what to do. Shell scripting is not a strong point of mine.

One poster suggests rsync, which is overkill.
cp -rp will work fine. If you want to move the files, just mv the directory; it and everything under it will move too.
The only real problem here is the terminating *'s in the command line in the original script. You don't need the *: you're just trying to pass directories to the cp command, not the names of all the files already in the source (and, more importantly, the destination).
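In other words, with the same variables as your script, something like this (untested sketch):
CONS="/someproject"
ROOT="/test"
# No trailing *'s: copy the contents of the source results tree
# straight into the existing destination results directory.
/bin/cp -rp ${ROOT}/results/. ${CONS}/results/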

You could also use a tool like rsync to make sure your source and target are synchronized.
rsync -av ${ROOT}/results/ ${CONS}/results/
You specified that you want to "move" the files, though. Which means deleting the originals after they're copied:
rsync -av --remove-source-files ${ROOT}/results/ ${CONS}/results/
If you start playing around with rsync, be sure to read the man page about how it treats trailing slashes.
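The short version of that rule:
rsync -av src  dest/   # copies the directory itself, giving dest/src/...
rsync -av src/ dest/   # copies the contents of src, giving dest/...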

Related

How to change SYMLINK to SYMLINKD in batch script

We're sharing SYMLINKD files on our git project. It almost works, except git modifies our SYMLINKD files to SYMLINK files when pulled on another machine.
To be clear, on the original machine, symlink is created using the command:
mklink /D Annotations ..\..\submodules\Annotations\Assets
On the original machine, the dir cmd displays:
25/04/2018 09:52 <SYMLINKD> Annotations [..\..\submodules\Annotations\Assets]
After cloning, on the receiving machine, we get
27/04/2018 10:52 <SYMLINK> Annotations [..\..\submodules\Annotations\Assets]
As you might guess, a file-type symlink pointing at a directory [..\..\submodules\Annotations\Assets] does not work correctly.
To fix this problem we either need to:
Prevent git from modifying our symlink types.
Fix our symlinks with a batch script triggered by a githook
We're going with 2, since we do not want to require all users to use a modified version of git.
My limited knowledge of batch scripting is impeding me. So far, I have looked into simply modifying the attrib of the file, using the info here:
How to get attributes of a file using batch file and https://superuser.com/questions/653951/how-to-remove-read-only-attribute-recursively-on-windows-7.
Can anyone suggest what attrib commands I need to modify the symlink?
Alternatively, I realise I can delete and recreate the symlink, but how do I get the target directory for the existing symlink short of using the dir command and parsing the path from the output?
I think it's https://github.com/git-for-windows/git/issues/1646.
To be more clear: your question appears to be a manifestation of the XY problem: the Git instance used to clone/fetch the project incorrectly treats symbolic links to directories, creating symbolic links pointing to files instead. That appears to be a bug in GfW, but instead of digging into it you've invented a workaround and are asking how to make it work.
So I'd first try to help the GfW maintainer and whoever reported #1646 to fix the problem. If you need a stop-gap solution, I'd say a proper way would be to go another route and script several calls to git ls-tree to figure out what the directory symlinks are (they'd have a special set of permission bits; you may start here). You would traverse all the tree objects of the HEAD commit, recursively, figure out which symlinks point at directories, and then fix up the matching entries in the work tree by deleting them and recreating them with mklink /D or whatever creates the correct sort of symlink.
Unfortunately, I'm afraid trying to script this with the limited scripting facilities of cmd.exe would be an exercise in futility. I'd take a more "real" programming language (PowerShell, for example; since you're probably a Windows shop, even .NET would be OK).
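That said, Git for Windows does bundle a bash, so here is a rough sketch of that ls-tree route in shell. It assumes the repo stores symlink targets with forward slashes (the usual Git convention), that cmd //c mklink can be called from Git Bash with sufficient privileges, and that paths contain no exotic characters; treat it as pseudocode to adapt, not a tested fix:
#!/bin/bash
# Walk HEAD recursively; mode 120000 marks an entry stored as a symlink.
git ls-tree -r HEAD | while read -r mode type hash path; do
    [ "$mode" = "120000" ] || continue
    target=$(git cat-file blob "$hash")   # link target as recorded in the repo
    # Only fix up links whose target, relative to the link, is a directory.
    if [ -d "$(dirname "$path")/$target" ]; then
        rm "$path"
        # mklink is a cmd.exe builtin and expects backslashes.
        cmd //c "mklink /D \"${path//\//\\}\" \"${target//\//\\}\""
    fi
done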

Move newly created text files to a var created directory

I have several text docs that are created each day from templates. This process I've achieved successfully albeit probably in a Cro-Magnon way. I want these newly created text files to be filed within a newly created dated folder.
The script creates the file docs from the templates successfully and also creates the newly dated directory. I don't really want to create these text files somewhere else and then move them to the newly created directory. Rather that they be created directly within it. All my research tends to involve directories that already exist rather than one created from a var.
I've included just one file creation example below.
Hope you can help. TIA
today=`date '+%y%m%d'`;
today_Folder=~/Desktop/test/"${today}"
if [[ ! -d $today_Folder ]]
then
mkdir "${today_Folder} `(date '+%A')`"
fi
cat ~/Desktop/test/template.txt >> ~/Desktop/test/dest.txt
P.S. I've tried to make the cat command regarding the text files clearer - it simply creates files. I'm NOT trying to create a tree of directories. Simply ONE newly created directory that could be in test along with the text files.
Your question is how to dynamically create a file, also creating all the path to contain that file? That's not possible in any intuitive/portable way, and it's not typically done: programs always have to create the directory before the file. What you can do is pass the -p flag to mkdir. This flag means "create all the directories necessary for this path" (it is specified by POSIX, so it's portable). Zero directories is okay, so you don't need to check whether the directory already exists. So change the whole if block to just this:
mkdir -p "${today_Folder} `(date '+%A')`"
Also, it's kind of smelly that you want a single string (the path) yet use three operations to create it. Could it be simpler? More statements are fine when they add clarity, but here the steps are so simple that the only thing they accomplish is making your colleagues go back and read what you wrote more than once. It might suit to change it to:
dir_path=...
mkdir -p "${dir_path}"
To accomplish this, keep in mind that instead of backticks you can use command substitution with $(). It helps since backticks can't be nested, and it makes the line more readable because you clearly see where the command starts and ends.
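For instance (folder and file names taken from your snippet), the whole thing could collapse to:
# Build the "YYMMDD Weekday" folder name in a single assignment using $().
dir_path=~/Desktop/test/"$(date '+%y%m%d') $(date '+%A')"
mkdir -p "${dir_path}"
# Create the new file directly inside the dated folder.
cat ~/Desktop/test/template.txt >> "${dir_path}/dest.txt"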

How to create a batch file in Mac?

I need to find a solution at work to backup specific folders daily, hopefully to a RAR or ZIP file.
If it was on PC, I would have done it already. But I don't have any idea to how to approach it on a Mac.
What I basically want to achieve is an automated task, that can be run with an executable, that does:
compress a specific directory (/Volumes/Audio/Shoko) to a rar or zip file.
(in the zip file, exclude all *.wav files in all subdirectories and a directory named "Videos").
move it to a network share (/Volumes/Post Shared/Backup From Sound).
(or compress directly to this folder).
automate the file name of the Zip file with dynamic date and time (so no duplicate file names).
Shutdown Mac when finished.
I want to say again, I don't usually use a Mac, so things like what kind of file to use for the script are not trivial for me yet.
I have tried to put Mark's bash lines (from the first answer, below) in a txt file and executed it, but it had errors and didn't work.
I also tried to use Automator, but it's too plain, no advanced options.
How can I accomplish this?
I would love a working example :)
Thank You,
Dave
You can just make a bash script that does the backup and then you can either double-click it or run it on a schedule. I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
FILENAME=`date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz"`
cd /directory/to/backup || exit 1
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go in Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double click on it - obviously after changing the paths to reflect what you want to backup and where to.
Also, you may prefer to use some other tools, such as rsync, but the method is the same.
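For completeness, here is a sketch that folds in the rest of your requirements, using the paths from your question. The --exclude patterns assume the bsdtar that ships with macOS, and the shutdown step needs administrator rights:
#!/bin/bash
# Timestamped archive name so repeated runs never collide.
FILENAME=$(date +"/Volumes/Post Shared/Backup From Sound/%Y-%m-%d_%H-%M-%S.tgz")
cd "/Volumes/Audio/Shoko" || exit 1
# Archive everything except .wav files and the Videos directory.
tar -cvzf "$FILENAME" --exclude='*.wav' --exclude='Videos' .
# Power off when finished.
sudo shutdown -h now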

Making checks before rsyncing external drive on OSX

I have the following issue on OSX though I guess this could equally be filed under bash. I have several encrypted portable drives that I use to sync an offsite data store or as an on-the-go data store etc. I keep these updated using rsync with several options including --del and an includes file.
This is currently done very statically i.e.
rsync <options> --include-from=... /Volumes /Volumes/PortableData
where the includes file would read something like
+ /Abc/
+ /Def/
...
- *
I would like to do the following:
Check the correct drive is mounted and find its mount-point
Check that all the + /...../ entries are mounted under /Volumes
rsync
To achieve 1 I was intending to store the uuid of the drives in variables in my profile so that I could search for them and find the relevant mount point. A bash function in .bashrc that takes a uuid and returns a mount point. I have seen some web entries for achieving this.
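Something like this is what I'm picturing for 1 (a sketch; it assumes diskutil info accepts a volume UUID and prints a "Mount Point:" line):
# Return the mount point for a given volume UUID, or nothing if not mounted.
mount_point_for_uuid() {
    diskutil info "$1" | awk -F': *' '/Mount Point/ { print $2 }'
}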
2 I am a little more stuck on. What is the best way of retrieving only those entries that are both + and top-level folder designations in the include file, then iterating over them to check they are mounted and readable? Again, I'm thinking of putting some of this logic in functions for re-usability.
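Roughly this sort of thing, perhaps (the include file name here is made up):
# Pull the top-level "+ /Name/" entries out of the include file and
# check that each one is mounted and readable under /Volumes.
while read -r vol; do
    if [ ! -d "/Volumes/$vol" ] || [ ! -r "/Volumes/$vol" ]; then
        echo "not mounted or not readable: /Volumes/$vol" >&2
        exit 1
    fi
done < <(sed -nE 's|^\+ /([^/]+)/$|\1|p' includes.txt)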
Is there a better way of achieving this? I have thought of CCC, but like the idea of scripting in bash and using rsync as it is a good way of getting to know the command line.
rsync can read in a file that is a list of exclusions.
I would write a script that dumps to a text file the directories that are NOT "+ top-level folder" designations in the include file.
You are going to want the exclusion file to look like this (you can use wildcards if it helps):
dirtoexclude1
dirtoexclude2
dirtoexclude
Then just direct an rsync to that exclusion file.
Your rsync command will be something like this (with the source and destination from your question):
rsync -aP --exclude-from=rsyncexclusion.txt /Volumes /Volumes/PortableData
a is for archive mode, which is essentially recursive while preserving permissions, times, and so on (with some hand waving), and P shows progress and keeps partially transferred files.
good luck.

Should you change the current directory in a shell script?

I've always mentally regarded the current directory as something for users, not scripts, since it is dependent on the user's location and can be different each time the script is executed.
So when I came across the Java jar utility's -C option I was a little puzzled.
For those who don't know the -C option is used before specifying a file/folder to include in a jar. Since the path to the file/folder is replicated in the jar, the -C option changes directories before including the file:
in other words:
jar -C flower lily.class
will make a jar containing the lily.class file, whereas:
jar flower/lily.class
will make a flower folder in the jar which contains lily.class
For a jar-ing script I'm making I want to use Bourne wild-cards folder/* but that would make using -C impossible since it only applies to the next immediate argument.
So the only way to use wild-cards is run from the current directory; but I still feel uneasy towards changing and using the current directory in a script.
Is there any downside to using the current directory in scripts? Is it frowned upon for some reason perhaps?
I don't think there's anything inherently wrong with changing the current directory from a shell script. Certainly it won't cause anything bad to happen, if taken by itself.
In fact, I have a standard script that I use for starting up a Java-based server, and the very first line is:
cd "$(dirname "$0")"
This ensures that the rest of the commands in the script are executed in the directory that contains the script file itself (useful when a single machine is hosting multiple server instances), regardless of where the shell script was actually invoked from. Without changing the current directory in the script, it would only work correctly if the user remembered to manually cd into the corresponding directory before running the script.
In this case, performing the cd operation from within the script removes a manual step from the server startup/shutdown process, and makes things slightly less error-prone as a result.
So as with most things, there are legitimate uses for this sort of thing, and I'm sure there are some questionable ones as well. It really depends upon what's most appropriate for your specific use case, which is something I can't really comment on... I always just let Maven build my JARs for me.
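As an aside, if the only sticking point is combining wildcards with -C, one workaround (a sketch reusing the question's names) is to confine the cd to a subshell, so the rest of the script keeps its own working directory:
# The cd is scoped to the parentheses; the parent script's directory is untouched.
(cd flower && jar cf ../flower.jar ./*.class)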
