Errors when piping a specific find command and creating a zip file from its output (macOS)

I want to create a script that zips whatever files were created in the directory the find parameters specify, and run it as a background process. I've commented it heavily to give a better idea of what I'm trying to achieve. I'm running this from my MacBook Pro terminal, on OS X 10.9:
#!/bin/sh
#find file in directory listed below
#type f to omit directories or special files
#mtime/ctime is modified/created -0 days or less
#name is with the name given in double quotes
#asterisk means any file name with any file extension
#use xargs to convert find sequence to a command for the line after pipe
find /Users/name/thisdirectory type f -ctime -0 -name "'*'.'*'" | xargs zip -

Maybe you're looking for this:
find /path/to/dir -type f -ctime -0 -name "*.*" | zip -@ file.zip
If you read zip -h, it explains that -@ is to read the filenames from standard input.
You don't need xargs here: the ability to work with a list of files received from standard input is built into zip itself, as it is in most archiving tools, tar included.
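For instance, a quick sketch with tar (paths are illustrative; --null plus find -print0 passes the names NUL-separated, which is safer for odd filenames):
find /path/to/dir -type f -print0 | tar -czf files.tar.gz --null -T -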
Btw, I think you want to change -ctime -0: it means "status changed less than zero days ago", so I don't think it can match anything this way...
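Putting it together, a minimal sketch of the corrected script (the directory and archive name are placeholders, and -ctime 0, meaning changed within the last 24 hours, is assumed to be the intent):
#!/bin/sh
# find regular files changed in the last 24 hours
# and let zip read the resulting list from standard input (-@)
find /Users/name/thisdirectory -type f -ctime 0 -name "*.*" | zip -@ archive.zip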

Related

Check if file is in a folder with a certain name before proceeding

So, I have this simple script which converts videos in a folder into a format which the R4DS can play.
#!/bin/bash
scr='/home/user/dpgv4/dpgv4.py'
mkdir -p 'DPG_DS'
find '../Exports' -name "*1080pnornmain.mp4" -exec python3 "$scr" {} \;
The problem is, some of the videos are invalid and won't play, and I've moved those videos to a different directory inside the Exports folder. What I want to do is check to make sure the files are in a folder called new before running the python script on them, preferably within the find command. The path should look something like this:
../Exports/(anything here)/new/*1080pnornmain.mp4
Please note that (anything here) text does not indicate a single directory, it could be something like foo/bar, foo/b/ar, f/o/o/b/a/r, etc.
You cannot use -name, because the match now has to be against the path, not just the file name. My first solution was:
find ./Exports -path '**/new/*1080pnornmain.mp4' -exec python3 "$scr" {} \;
But, as @dan pointed out in the comments, it is wrong because it uses the globstar wildcard (**) unnecessarily; in find's -path patterns, a single * already matches across directory separators:
This checks if /new/ is somewhere in the preceding path, it doesn't have to be a direct parent.
For example, a hypothetical path like ../Exports/foo/new/bar/baz1080pnornmain.mp4 would match, even though the file's direct parent is bar, not new. So, the star is not enough here. Another possibility, using find only, could be this one:
find ./Exports -regex '.*/new/[^\/]*1080pnornmain.mp4' -exec python3 "$scr" {} \;
This regex matches:
any number of nested folders before new with .*/new
any character (except / to leave out further subpaths) + your filename with [^\/]*1080pnornmain.mp4
Performance could degrade, given that it uses regular expressions.
Generally, instead of using the -exec option of find, you can pass find's output to xargs. Note that with -I, xargs still spawns one process per file, much like -exec ... \;, so the real gain here is safer filename handling: use -print0 with xargs -0 so names containing spaces survive the pipe:
find ./Exports -regex '.*/new/[^\/]*1080pnornmain.mp4' -print0 | xargs -0 -I '{}' python3 "$scr" '{}'

Run `tmutil isexcluded` recursively

I'd like to use tmutil recursively to list all of the files currently excluded from Time Machine backups. I know that I can determine this for a single file with
tmutil isexcluded /path/to/file
but I can't seem to get this to run recursively. I have tried grepping for the excluded files and outputting to a file like this:
tmutil isexcluded * | grep -i excluded >> ~/Desktop/TM-excluded.txt
but this only outputs data for the top level of the current directory. Can I use find or a similar command to feed every file/directory on the machine to tmutil isexcluded and pull out a list of the excluded files? What is the best way to structure the command?
I'm aware that most of the exclusions can be found in
/System/Library/CoreServices/backupd.bundle/Contents/Resources/StdExclusions.plist
and that some app-specific exclusions are searchable via
sudo mdfind "com_apple_backup_excludeItem = 'com.apple.backupd'"
but I am looking for a way to compare the actual flags on the files to these lists.
This should do it:
find /starting/place -exec tmutil isexcluded {} + | grep -F "[Excluded]" | sed -E 's/^\[Excluded\][[:space:]]*//'
This takes advantage of the fact that tmutil allows you to pass multiple filenames, so I use + at the end of the find instead of \;. That way a new process isn't executed for every single file on your machine, which could be slow. The grep looks for the fixed string (-F) [Excluded], and the sed removes the leading [Excluded] tag and the whitespace that follows it.
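A sketch that ties this back to the question's output file (the starting point is an example; error messages from unreadable paths are discarded):
find /Users -exec tmutil isexcluded {} + 2>/dev/null | grep -F '[Excluded]' | sed -E 's/^\[Excluded\][[:space:]]*//' > ~/Desktop/TM-excluded.txt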
You can get all the files in any subdirectory of /path/to/master/dir with
find /path/to/master/dir -type f
Now, tmutil does not read file names from standard input, so you cannot pipe the output to it; what you can do instead is
find /path/to/master/dir -type f -exec tmutil isexcluded {} \;
What this does is:
We know what find /path/to/master/dir -type f does.
-exec executes the command that follows it for each result.
{} is replaced by each file from find's output, so tmutil isexcluded runs once per file.
\; ends the -exec clause, so find knows where the command stops (the semicolon is escaped to keep the shell from interpreting it).
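To then pull out only the excluded files, as the question asks, a sketch combining this with the asker's grep (output path as in the question):
find /path/to/master/dir -type f -exec tmutil isexcluded {} \; | grep -F '[Excluded]' >> ~/Desktop/TM-excluded.txt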

Searching for exact file extension using sh or bash on macOS (syntax issue)

Looking for files with a .pst extension. A few apps have files in their bundles with that extension, so we're excluding those apps.
After some testing, I arrived at this script:
#!/bin/sh
find /Users -type f -not -path "*AnApplication.app*" | grep -i "*.pst$" > /path/to/search-result.txt
exit 0
However, it is returning *.dpst files, which I thought would not happen given the grep -i "*.pst$" part of the command.
We are using the $ to ensure the search returns extensions at the end of the name, and not files with ".pst" in the middle of the name or path (example: myFile.pst.doc or /path/my.pst.files/).
Our goal is to find only files ending in ".pst". What am I doing wrong? :)
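The culprit is grep's regex syntax: . matches any character and * is not the shell wildcard, so "*.pst$" also matches names ending in .dpst (the dot consumes the d). A sketch of two working alternatives, using the same paths as above:
# escape the dot so grep matches a literal ".pst" at the end of the line
find /Users -type f -not -path "*AnApplication.app*" | grep -i '\.pst$' > /path/to/search-result.txt
# or skip grep and let find match the name case-insensitively
find /Users -type f -not -path "*AnApplication.app*" -iname '*.pst'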
Thanks for the huge help, my apologies for the belated response. Here is what we ended up going with:
find /Users -type f -not -path '*AnApplication.app*' -iname '*.pst'

execute command on files returned by grep

Say I want to edit every .html file in a directory one after the other using vim, I can do this with:
find . -name "*.html" -exec vim {} \;
But what if I only want to edit every html file containing a certain string, one after the other? I can use grep to find the files containing those strings, but how can I pipe each one to vim, similar to the find command above? Perhaps I should use something other than grep, or somehow pipe the find command to grep and then exec vim. Does anyone know how to edit files containing a certain string one after the other, in the same fashion as the find example I give above?
grep -l 'certain string' *.html | xargs vim
This assumes you don't have eccentric file names with spaces etc in them. If you have to deal with eccentric file names, check whether your grep has a -z option to terminate output lines with null bytes (and xargs has a -0 option to read such inputs), and if so, then:
grep -zl 'certain string' *.html | xargs -0 vim
If you need to search subdirectories, maybe your version of Bash has support for ** (enable it with shopt -s globstar):
grep -zl 'certain string' **/*.html | xargs -0 vim
Note: these commands run vim on batches of files. If you must run it once per file, then you need to use -n 1 as extra options to xargs before you mention vim. If you have GNU xargs, you can use -r to prevent it running vim when there are no file names in its input (none of the files scanned by grep contain the 'certain string').
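For instance, a per-file sketch; the -o flag (where your xargs supports it, as on macOS and newer GNU xargs) reopens stdin on /dev/tty, since vim expects the terminal rather than the pipe:
grep -l 'certain string' *.html | xargs -o -n 1 vim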
The variations can be continued as you invent new ways to confuse things.
With find:
find . -type f -name '*.html' -exec bash -c 'grep -q "yourtext" "${1}" && vim "${1}"' _ {} \;
For each file, this calls bash, which greps the file for yourtext and opens it with vim if the text matches.
Solution with a for loop:
for i in $(find . -type f -name '*.html'); do vim "$i"; done
This should open each file in a separate vim session once you close the previous one. Note that the $(find ...) output undergoes word splitting, so this breaks on filenames containing spaces.
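A sketch of a more robust loop for such names; vim's input is redirected from the terminal because the pipe occupies stdin:
find . -type f -name '*.html' -print0 | while IFS= read -r -d '' f; do vim "$f" < /dev/tty; done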

Find and delete .txt files in bash [duplicate]

Recently I frigged up my external hard drive with my photos on it (most are on DVD anyway, but..) with some partition friggery.
Fortunately I was able to put things back together with PhotoRec, another Unix partition utility, and PDisk.
PhotoRec returned over one thousand folders chock-full of anything from .txt files to important .NEFs.
So I tried to make the sorting easier using Unix, since the OS X Finder would simply crumble under such requests as selecting and deleting a billion .txt files.
But I ran into some BS when I tried to find and delete the txt files, or find and move all the jpegs recursively into a new folder called jpegs. I am a Unix noob, so I need some assistance please.
Here is what I did in bash. (I am in the directory that ls would list all the folders and files I need to act upon).
find . -name *.txt | rm
or
sudo find . -name *.txt | rm -f
So it's giving me some BS about needing to unlink the files. Whatever.
I need to find all .txt files recursively and delete them, preferably verbosely.
You can't pipe filenames to rm; rm takes the files to remove as command-line arguments, not from standard input. You need to use xargs instead. Also, remember to quote the file pattern "*.txt", or the shell will expand it before find runs.
find . -name "*.txt" | xargs rm
Alternatively, let find run rm itself, or use find's built-in -delete:
find . -name "*.txt" -exec rm {} \;
find . -name "*.txt" -type f -delete
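Since verbose output was requested, a sketch that lets rm report each deletion (rm -v works with both BSD and GNU rm; + batches many files per rm invocation):
find . -name "*.txt" -type f -exec rm -v {} +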
