Simple Mac Bash Script (Find/Replace) - macos

First, any support and help on this is greatly appreciated.
I'm trying to write a simple Bash script (I'm completely new to this) to replace a file in a given directory.
Basically, I need to write a script to replace the Safari preference file. Here's what I have, and what's not working, for that matter:
#!/bin/bash
find /Files/ -iname "com.apple.Safari.plist" - print0 | xargs -I{} -0 -1 cp file /Users/{}/Library/Preferences
It errors out with the following:
find: -: unknown option
xargs: illegal option -- 1
Any thoughts or ideas are greatly appreciated.
Thanks,

I couldn't quite understand what you want to accomplish with this. As I understand it, you have this "com.apple.Safari.plist" in /Files/, is that correct?
And you then want to copy this file somewhere so that, I assume, it overwrites Safari's current plist file. Assuming you take ghostdog74's correct advice, remove the space in - print0 (turning it into -print0), and drop the -1 from xargs, since that option doesn't exist, this is what would happen:
find would find your file in /Files/, and xargs would run this:
cp file /Users/com.apple.Safari.plist/Library/Preferences
It would then die, since there is no file called "file" and no directory named "/Users/com.apple.Safari.plist/".
That's most likely not what you want. :)
If you just want to copy the file somewhere, why not simply do cp /Files/com.apple.Safari.plist ~/Library/Preferences/?
Do you really need find and xargs in this case? Could you clarify?

There should be no space in - print0, and since -1 is not an xargs option, remove it and see:
find /Files/ -iname "com.apple.Safari.plist" -print0 | xargs -I{} -0 cp file /Users/{}/Library/Preferences
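Even with the syntax fixed, the cp arguments are still wrong, as the other answer explains. If the actual goal is to push one known plist into every user's Preferences folder, a plain loop may be all that's needed. A minimal sketch, assuming the master copy lives at /Files/com.apple.Safari.plist and you can write into each home directory:
#!/bin/bash
# copy the plist into each user's Preferences directory
for home in /Users/*; do
    # skip anything without a Preferences folder (e.g. /Users/Shared)
    [ -d "$home/Library/Preferences" ] || continue
    cp /Files/com.apple.Safari.plist "$home/Library/Preferences/"
done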

Related

Operating on multiple specific folders at once with cp and rm commands

I'm new to Linux (using bash) and I wanted to ask about something that I do often while I work. I'll give two examples:
Deleting multiple specific folders inside a certain directory.
Copying multiple specific folders into a certain directory.
I successfully did this with files, using find with some regex and then using -exec and -delete. But for folders I found it more problematic, because I had trouble piping the list of folders into the cp/rm command successfully, each time getting a "No such file or directory" error.
Looking online I found the following command (in my case for copying all folders starting with a Z):
cp -r $(ls -A | grep "Z*") destination
But when I execute it, it prints nothing, the prompt won't come back until I hit Ctrl+C, and nothing is copied.
How can I achieve what I'm looking for? For both cp and rm.
Thanks in advance!
First of all, you are grepping for "Z*", but in a regular expression that means "zero or more Z characters", so it matches every line, not just names starting with Z.
Also, try executing ls -A on a terminal: you will get multiple columns. You need at least ls -1A to print one result per line.
So for your command, try something like:
cp -r $(ls -1A | grep "^Z") destination
or
cp -r $(ls -1A | grep "^Z") -t destination
But all of the above just corrects the syntax of your example.
It is much better to use find. Just in case, put the paths in quotes, like:
find <PATH_FROM> -type d -exec cp -r "{}" -t target \;
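Both operations can also be done entirely with find, avoiding ls parsing altogether. A minimal sketch, assuming the Z folders sit in the current directory and a GNU cp that supports -t (/path/to/destination is a placeholder):
# copy every folder starting with Z; the {} + form batches names into one cp call
find . -maxdepth 1 -type d -name 'Z*' -exec cp -r -t /path/to/destination {} +
# or delete them instead
find . -maxdepth 1 -type d -name 'Z*' -exec rm -r {} +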

How to apply unix2dos on last modified file?

OK, so I have a few files in my directory which follow this pattern:
TEST_20150130.txt
TEST_20150202.txt
TEST_20150203.txt
TEST_20150204.txt
TEST_RESULT_20150130.csv
TEST_RESULT_20150202.csv
TEST_RESULT_20150203.csv
TEST_RESULT_20150204.csv
Now I want to apply the command unix2dos, but only to TEST_20150204.txt and TEST_RESULT_20150204.csv.
Would anyone be able to suggest the easiest one-liner that could do this?
The easiest (but not necessarily the most robust):
unix2dos `\ls -rtd DIR_NAME/* | tail -2`
This is easy and will surely work on your example, but you should also be aware that in general you shouldn't parse the output of ls like that.
You can use the command below too. It depends on whether you want to modify only today's files or only the last files:
unix2dos `find . -maxdepth 1 -type f -daystart -mtime -1`
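Another option is to handle each name pattern explicitly and convert only the newest match of each. A sketch, assuming the filenames contain no spaces or newlines (it still parses ls, with the caveat mentioned above):
# for each pattern, pick the most recently modified match;
# $pattern is deliberately unquoted so the glob expands
for pattern in 'TEST_[0-9]*.txt' 'TEST_RESULT_*.csv'; do
    latest=$(ls -t $pattern 2>/dev/null | head -1)
    [ -n "$latest" ] && unix2dos "$latest"
done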

errors when piping a specific find command and creating a zipfile from its output?

I wish to create a program that zips whatever file is created in the directory the find parameters specify, and run it as a background process. I heavily comment it to give a better idea of what I'm trying to achieve. I'm running this from my MacBook Pro terminal, OS X version 10.9
#!/bin/sh
#find file in directory listed below
#type f to omit directories or special files
#mtime/ctime is modified/created -0 days or less
#name is with the name given in double quotes
#asterisk meaning any file name with any file extension
#use xargs to convert find sequence to a command for the line after pipe
find /Users/name/thisdirectory type f -ctime -0 -name "'*'.'*'" | xargs zip -
Maybe you're looking for this:
find /path/to/dir -type f -ctime -0 -name "*.*" | zip -@ file.zip
If you read zip -h, it explains that -@ is to read the filenames from standard input.
You don't need xargs here, the function to work with a list of files received from standard input is built into zip itself, similar to most compression tools like tar.
Btw, I think you want to change -ctime -0, because I don't think it can match anything this way...
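Putting those fixes together, the whole script shrinks to a couple of lines. A sketch, where recent.zip is just an example archive name and -ctime -1 (changed within the last day) replaces -ctime -0 per the note above; append & to run it in the background as originally intended:
#!/bin/sh
# zip every regular file changed in the last 24 hours; -@ makes zip read names from stdin
find /Users/name/thisdirectory -type f -ctime -1 -name "*.*" | zip recent.zip -@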

Find/Match variable filenames and move files to respective directory

I've never come to SO asking "Do my homework" but I really don't know where to start with this one.
I have a load of documents which are dumped in a directory after being auto-signed using JSignPdf (--output-directory option seemingly has no ability to output to same as input):
/some/dir/Signed/PDF1_signed.pdf
/some/dir/Signed/PDF2_signed.pdf
/some/dir/Signed/PDF3_signed.pdf
I'd like to then find their source/unsigned counterparts:
/some/dir/with/docs/PDF1.pdf
/some/dir/where/is/PDF2.pdf
/some/dir/why/this/PDF3.pdf
...and move the signed PDFs into the respective directories.
I use this command to find all the PDFs in the various directories:
find . -name '*.pdf' -exec sh -c 'exec java -jar jsignpdf-1.4.3/JSignPdf.jar ... ' sh {} +
...and I've tried things like capturing find's output in a variable and then using IF THEN to match, with no success. Would I need find's output to be split into multiple variables? I'm so lost :(
I'd like to accomplish this in some shell, but if there are Perl junkies out there or anything else, I am more than happy for another portable solution.
I've tried to break it down, but still don't understand how to accomplish it...
find files matching VarName without _signed
move _signed file with matching name to the directory of found file
Thanks for any help/guidance.
Use a while loop to read each file found by find and move it to the correct place:
find /some/dir -name "*.pdf" ! -name "*_signed.pdf" -print0 | while IFS= read -d '' -r file
do
f="${file##*/}"
mv "/some/dir/Signed/${f%.*}_signed.pdf" "${file%/*}"
done
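For reference, the parameter expansions doing the work here (a quick illustration):
# given file=/some/dir/where/is/PDF2.pdf
# ${file##*/}  -> PDF2.pdf             (strip longest  */ prefix: like basename)
# ${f%.*}      -> PDF2                 (strip shortest .* suffix: drop extension)
# ${file%/*}   -> /some/dir/where/is   (strip shortest /* suffix: like dirname)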
I have a similar problem I've been working on. Since the path manipulation required to convert /some/dir/where/is/PDF2.pdf to /some/dir/Signed/PDF2_signed.pdf is fairly simple but more involved than can be done in a simple one-liner, I've been using find to locate the first set, and using a simple loop to process them one at a time. You did mention homework, so I'll try not to give you too much code.
find /dir/containing/unsigned -name '*.pdf' -print0 | while IFS= read -r -d '' path; do
    fetch_signed_version "$path"
done
where fetch_signed_version is a shell function you write that, given a path such as /some/dir/where/is/PDF2.pdf, extracts the directory (/some/dir/where/is), computes the signed PDF's name (PDF2_signed.pdf), then executes the necessary move (mv /some/dir/Signed/$signed_pdf /some/dir/where/is)
fetch_signed_version is actually pretty simple:
fetch_signed_version () {
    dir=${1%/*}
    fname=${1##*/}
    signed_name=${fname%.pdf}_signed.pdf
    mv "/some/dir/Signed/$signed_name" "$dir"
}
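Putting the loop and the function together into one runnable script (a sketch; the /some/dir paths are placeholders for the real layout):
#!/bin/bash
# move each signed PDF next to its unsigned counterpart
signed_dir=/some/dir/Signed

fetch_signed_version () {
    dir=${1%/*}
    fname=${1##*/}
    signed_name=${fname%.pdf}_signed.pdf
    # only move the signed copy if it actually exists
    [ -f "$signed_dir/$signed_name" ] && mv "$signed_dir/$signed_name" "$dir"
}

find /some/dir -name '*.pdf' ! -name '*_signed.pdf' ! -path "$signed_dir/*" -print0 |
while IFS= read -r -d '' path; do
    fetch_signed_version "$path"
done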

improve find performance

I have a bash script that zips up filenames based on user input. It works fine, albeit slowly, since at times I have to process up to 50K files.
find "$DIR" -name "$USERINPUT" -print | /usr/bin/zip -1 SearchResult -#
The # sign here means that zip will be accepting file names from STDIN. Is there a way to make it go faster?
I am thinking of creating a cron job to update the locate database every night but I am not root so I don't even if it is worth it.
Any suggestions welcome.
I suggest you make use of parallel processing via the xargs command to speed up your entire process. Use a command like this:
find "$DIR" -name "$USERINPUT" -print0 | xargs -0 -P10 zip -1 SearchResult
(With xargs, the file names arrive as arguments, so zip's -@ stdin flag is no longer needed.) The above command will make xargs run up to 10 parallel sub-processes.
Please record the timing of the above command like this:
time find "$DIR" -name "$USERINPUT" -print0 | xargs -0 -P10 zip -1 SearchResult
and see if this makes any performance improvement.
As Mattias Ahnberg pointed out, this use of find will generate the entire list of matching files before zip gets invoked. If you're doing this over 50,000 files, that will take some time. Perhaps a more suitable approach would be to use find's -exec <cmd> {} \; feature:
find "$DIR" -name "$USERINPUT" -exec /usr/bin/zip -1 {} \;
This way, find invokes zip itself on each matching file. You should achieve the same end result as your original version, but if the sheer number of files is your bottleneck (which, if the files are all small, is most likely), this will kick off running zip as soon as it starts finding matches, rather than when all matches have been found.
NB: I recommend reading the man page for find for details of this option. Note that the semi-colon must be escaped to prevent your shell interpreting it rather than passing it to find.
Sounds like you're trawling through the filesystem running a find for each of the 50,000 files.
Why not do one run of find, to log names of all files in the filesystem, and then pluck the locations of them straight from this log file ?
Alternatively, break the work down into separate jobs, particularly if you have multiple filesystems and multiple CPUs. There is no need to be single-threaded in your approach.
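There is also a middle ground between -exec ... \; (one zip process per file) and the xargs pipeline: find's {} + form batches as many file names as fit into each invocation. A sketch reusing the question's names:
# one zip invocation per large batch of matches, no pipe needed
find "$DIR" -name "$USERINPUT" -exec /usr/bin/zip -1 SearchResult {} +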
