Running program/macro to rename, add files to flash drive - macOS

I have a huge batch of flash drives that I need to move files onto. I'd also love to rename the drives (they're all called NO NAME by default). I'd love to plug two drives in, run a terminal script on the computer to accomplish all of that (most importantly the file moving). Then remove the drives, put the next two in, run it again, etc. until I'm done. All of the drives are identically named.
Is batch executing like this possible, and does anyone know how to go about doing it?

I figured it out. Put each one in and run this command to rename the drive and then move the files into it:
diskutil rename /Volumes/OLDNAME "NEWNAME" && cp -r ~/Desktop/sourceFolder/. /Volumes/NEWNAME
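
If two identically named drives are plugged in at once, the second one typically mounts as /Volumes/NO NAME 1, so a small loop over every mounted "NO NAME" volume can handle both in one go. A rough sketch along those lines, reusing NEWNAME and the source folder from the command above as placeholders (copying first and renaming afterwards avoids having to track the new mount point):

#!/bin/bash
# Copy the files onto, then rename, every currently mounted "NO NAME" volume
for vol in /Volumes/NO\ NAME*; do
    [ -d "$vol" ] || continue
    cp -r ~/Desktop/sourceFolder/. "$vol"/ && diskutil rename "$vol" "NEWNAME"
done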

Related

Why does this batch file fail for some users?

I have a batch file that copies the most recent version of an Access front-end file to the user's C: drive and then opens it from there.
For some users, the copy command causes the batch file to close, and I can't work out what could cause that. The file seems to copy, but the batch file just closes itself without any visible error messages.
I've used Pause to confirm that the failure happens at the Copy step, not the Run or the If.
This is on Windows 7, and I've tried it with both Copy and Xcopy. The users with the issue say it has worked in the past, and they all have access to the location being copied from (and to). Mapping the location doesn't seem to make any difference, and UNC paths work for most users, so it's not that.
Deleting the existing files in C:\databases doesn't help.
if not exist "C:\Databases\" mkdir "C:\Databases"
copy "\\SERVER02\FINOPS\COMMAQR\DIGIHUB\1. Live Version\DIGIHUB v2.5.accdb" "C:\Databases\"
start [the file]
For 95%+ of users, the batch file copies the most recent version down and opens the file. For a handful, the batch file reaches the copy step and closes itself.
Does anyone know why this could happen, or alternatives to both Copy and XCopy that might not fail?
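One way to see what the failing machines are actually reporting is to test the copy's exit code and pause before the window can close. A rough sketch of the same batch file with that check added (the start target is an assumption based on the copied file's name, since the original post elides it):

if not exist "C:\Databases\" mkdir "C:\Databases"
copy "\\SERVER02\FINOPS\COMMAQR\DIGIHUB\1. Live Version\DIGIHUB v2.5.accdb" "C:\Databases\"
if errorlevel 1 (
    rem keep the window open so the error code can be read
    echo Copy failed with errorlevel %errorlevel%
    pause
    exit /b 1
)
start "" "C:\Databases\DIGIHUB v2.5.accdb"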

How to compare two directories, and if they're the EXACT SAME, delete the second

I'm trying to set up an automatic backup on a Raspberry Pi connected to an external hard drive.
Basically, I have shared folders mounted via Samba on the Pi under
/mnt/Comp1
/mnt/Comp2
I will then have the external hard drive plugged in and mounted with two folders under
/media/external/Comp1
/media/external/Comp2
I will then run a recursive copy from /mnt/Comp1/* to /media/external/Comp1/*, and the same with Comp2.
What I need help with is at the end of the copies (because it will be a total of 5 computers), I would like to verify that all the files transferred, and if they did and everything is on the external, then I can delete from the local machine automatically. I understand this is risky, because almost inevitably it will delete things that may not be backed up, but I need help knowing where to start.
I've found a lot of information on checking the contents of a folder, and I know I can use the diff command, but I don't know how to use it for this pseudocode:
use diff on directories /mnt/Comp1/ and /media/external/Comp1
if no differences, proceed to delete /mnt/Comp1/* recursively
if differences, preferably move the files not saved to /media/external/Comp1
repeat checking for differences, and deleting if necessary
Try something like:
diff -r -q d1/ d2/ >/dev/null 2>&1
Check the return value with $?: diff exits with 0 when the two directory trees match, 1 when they differ, and 2 if something went wrong.
Remove the source directory (the local /mnt copy) only if the return value is 0.
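Put together as a script, that might look roughly like this; a sketch assuming the Comp1 mount points from the question (Comp2 would be handled the same way), and worth testing with the rm line commented out first:

#!/bin/bash
SRC=/mnt/Comp1
DST=/media/external/Comp1

if diff -r -q "$SRC" "$DST" >/dev/null 2>&1; then
    # exit status 0: the trees match, so the local copies can go
    rm -rf "${SRC:?}"/*
else
    # exit status 1 (or 2 on trouble): copy anything missing, don't delete yet
    cp -rn "$SRC"/. "$DST"/
fi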

Where does this extra folder with \ 1/ come from

I have an external HDD hooked up to an early-2015 MacBook Pro. Usually, when I want to access it, I just cd /Volumes/MAC and ls, which shows all the files and folders on it.
Because it wasn't ejected properly, I now can't access that folder. Instead I see a new folder with a similar name, /Volumes/MAC\ 1/, and all my files and folders are inside it.
Can anybody shed some light on this?
Thanks
This is probably happening due to a quirk of the way Unix (and hence macOS) mounts volumes. Volumes don't actually mount as folders; instead, they mount on top of existing folders. So when macOS detects that you've plugged in an external drive, it normally does something like this:
Figure out what the volume's "name" is
Create an empty folder named /Volumes/[volumename]
Mount the volume on that folder
When you unmount the volume, it does the reverse: it does the actual unmount, then deletes the folder it was mounted on. But if something goes wrong, and that folder doesn't get deleted, the leftover folder can cause this to happen when you attach the drive:
Figure out what the volume's "name" is
Try to create an empty folder named /Volumes/[volumename]... Oops, that already exists, better try something else.
Try to create an empty folder named /Volumes/[volumename 1]... Whew, that worked; we'll use that
Mount the volume on the /Volumes/[volumename 1] folder
Fortunately, the solution is pretty simple: check the leftover /Volumes/MAC folder (or whatever name it's using), make sure it doesn't contain anything important, delete it (or at least rename it), then cleanly unmount and remount your volume. With the correct name available in /Volumes, it should mount without the " 1" suffix.
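In Terminal, that cleanup might look roughly like this (check the leftover folder's contents first; rmdir refuses to remove anything that isn't empty, which is the safe case):

ls -la "/Volumes/MAC"            # make sure the leftover folder holds nothing important
sudo rmdir "/Volumes/MAC"        # only succeeds if the folder is empty
diskutil unmount "/Volumes/MAC 1"
# re-attach the drive and it should come back as /Volumes/MAC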
The terminal escapes the space, which is why a folder named MAC 1 shows up as MAC\ 1.
Since MAC was mounted, not unmounted cleanly, and then re-mounted, the new mount ended up as MAC 1 instead of MAC.

How to create a batch file on a Mac?

I need to find a solution at work to back up specific folders daily, hopefully to a RAR or ZIP file.
If it was on PC, I would have done it already. But I don't have any idea to how to approach it on a Mac.
What I basically want to achieve is an automated task, that can be run with an executable, that does:
compress a specific directory (/Volumes/Audio/Shoko) to a rar or zip file.
(in the zip file, exclude all *.wav files in all subdirectories and a directory named "Videos").
move it to a network share (/Volumes/Post Shared/Backup From Sound).
(or compress directly to this folder).
automate the file name of the Zip file with dynamic date and time (so no duplicate file names).
Shutdown Mac when finished.
I want to say again that I don't usually use a Mac, so things like what kind of file to use for the script are not trivial for me yet.
I have tried to put Mark's bash lines (from the first answer, below) in a txt file and executed it, but it had errors and didn't work.
I also tried to use Automator, but it's too plain, no advanced options.
How can I accomplish this?
I would love a working example :)
Thank You,
Dave
You can just make a bash script that does the backup, and then you can either double-click it or run it on a schedule. I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
# Build a date-stamped archive path on the network share
FILENAME=`date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz"`
cd /directory/to/backup || exit 1
# -f tells tar to write the archive to $FILENAME
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go in Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double-click it, obviously after changing the paths to reflect what you want to back up and where to.
Also, you may prefer to use other tools, such as rsync, but the method is the same.
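To cover the specifics in the question (skip the .wav files and the Videos folder, add a time stamp, shut down afterwards), a rough sketch along the same lines, using the paths from the question; the archive name prefix is just an example, and the zip -x patterns and the shutdown step are worth verifying before relying on them:

#!/bin/bash
# date-and-time-stamped archive name so runs never collide
FILENAME="/Volumes/Post Shared/Backup From Sound/Shoko-$(date +%Y-%m-%d_%H-%M-%S).zip"

cd /Volumes/Audio/Shoko || exit 1

# zip the current directory, leaving out .wav files and the Videos directory
zip -r "$FILENAME" . -x "*.wav" -x "Videos/*"

# power off once the archive has been written (asks for an admin password)
sudo shutdown -h now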

RSync copies only folder directory structure not files

I am using rsync to copy tarballs to an external hard drive on a Windows XP machine.
My files are tar.gz files (perms 600) in a directory (perms 711).
However, when I do a dry-run, only the folders are returned, the files are ignored.
I use RSync a lot, so I presume there is no issue with my installation.
I have tried changing permissions of the files but this makes no difference
The owner of the files is root, which is also the user which the script logs in as
I am not using Rsync's CVS option
The command I am using is:
rsync^
-azvr^
--stats^
--progress^
-e 'ssh -p 222' root@servername:/home/directory/ ./
Is there something I am missing to get my files copied over?
I can think of only a single possibility: My experience with rsync is that it creates the directory structure before copying files in. Rsync may be terminating prematurely, but after this directory step has been completed.
Update0
You mentioned that you were running a dry run. By default, rsync only shows directory names when the directory and all of its contents are absent on the receiver.
After a lot of experimentation, I'm only able to reproduce the behaviour you describe if the directories on the source have later modification dates than on the receiver. In that case, only the modification times are adjusted on the receiver.
I had this problem too, and it turns out that when backing up to a Windows drive from Linux, rsync doesn't seem to move its temporary files into place after they are transferred.
Try adding the --inplace flag when rsyncing to Windows drives.
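For reference, the same command with --inplace added, keeping the original's quoting and ^ continuations (which assume it is still being run from cmd on Windows):

rsync^
-azvr^
--stats^
--progress^
--inplace^
-e 'ssh -p 222' root@servername:/home/directory/ ./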
