Exclude directories with a certain prefix using find [closed] - macos

Provided I'm not missing something obvious, the OS X version of bash must be different from Linux here, because I thought this question would provide all I needed to know. It didn't.
I'm trying to use find to find all directories that do not begin with "." (Mac OS X uses the "."-prefix for hidden folders, e.g. /volumes/OD/.Trashes). I want to pipe all the non-hidden directories to rsync to periodically mirror two local directories.
I tested to make sure I'm using find correctly with this code here:
find /volumes/OD -type d -iname ".*"
It finds the directories:
/volumes/OD/.Trashes
/volumes/OD/.Spotlight-V100
/volumes/OD/.fseventsd
So far so good. But when I try negating the condition I just tested, I get an error:
find /volumes/OD -type d -iname ! ".*"
yields this error:
find: .*: unknown primary or operator
I've tried escaping the "." with "\", but I only get the same error message. I've tried removing the quotes, but I get the same error message. What am I missing here? Should I be using another operator besides -iname?

The ! must precede the condition:
find /volumes/OD -type d '!' -iname '.*'
That said, you shouldn't need to pipe your file-list from find to rsync; I'm sure the latter offers a way to exclude dot-folders. Maybe
rsync --exclude='.*' ...
?
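To sketch what that rsync-based mirroring might look like end to end (the tree here is built in a temp directory purely for the demo; the OP's real volume paths would go in its place):

```shell
# Demo tree standing in for the OP's /volumes/OD source.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/.Trashes" "$src/docs"
touch "$src/docs/a.txt" "$src/.fseventsd"

# --exclude='.*' skips any path component beginning with a dot;
# -a preserves attributes, --delete makes dst an exact mirror.
rsync -a --delete --exclude='.*' "$src/" "$dst/"

ls -A "$dst"    # only "docs" was copied
```

No find pipeline is needed; rsync applies the exclusion at every depth of the tree on its own.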

The reason: an unquoted, unescaped ! is the history-expansion prefix in interactive shells, used for things like:
!23 (re-run history entry 23)
!rout followed by Enter (re-run the last command starting with "rout")
You should do:
find /volumes/OD -type d \! -iname '.*'

ruakh has got the right answer, but here is an alternate way of doing a search:
find /volumes/OD -type d -not -iname ".*"
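One caveat with both forms: ! -iname '.*' only filters the names out of the output; find still descends into the hidden directories and will print their non-hidden subdirectories. If the goal is to skip the hidden trees entirely, -prune does that. A small sketch on a throwaway tree (directory names invented for the demo):

```shell
d=$(mktemp -d)
mkdir -p "$d/.Trashes/sub" "$d/docs"

# Negation alone: .Trashes itself is hidden, but find still walks
# into it, so .Trashes/sub shows up in the output.
find "$d" -type d ! -iname '.*'

# -prune stops find from entering dot-directories at all:
find "$d" -type d -iname '.*' -prune -o -type d -print
```

The second command prints only the top directory and docs; nothing under .Trashes is visited.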

Related

Ultimate find command in bash to trace all illegal and unknown chars and symbols in filenames

Many similar solutions can be found to detect or change "illegal" characters in filenames.
Most solutions require you to know the illegal characters.
A solution to find those filenames often ends with something like:
find . -name "*[\+\;\"\\\=\?\~\<\>\&\*\|\$\'\,\{\}\%\^\#\:\(\)]*"
This is already quite good, but sometimes there are cryptic characters (e.g. h͔͉̝e̻̦l̝͍͓l̢͚̻o͇̝̘w̙͇͜o͔̼͚r̝͇̞l̘̘d̪͔̝.̟͔̠t͉͖̼x̟̞t̢̝̦ or ʇxʇ.pʅɹoʍoʅʅǝɥ or © or €), symbols, or characters from other character sets in my file names. I can not trace these files this way. Inverse lookarounds or the find command with regex is probably the right approach, but I don't get anywhere.
In other words: finding all filenames which do NOT match [^a-zA-Z0-9 ._-] would be perfect. Mostly [:alnum:], but with a regex the command would be more flexible.
find . ! -name [^a-zA-Z0-9 ._-] does not do the job. Any idea?
I use bash or zsh on OSX.
You can try
find . -name '*[!a-zA-Z0-9 ._-]*'
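A quick check of that pattern on a throwaway directory (the file names are invented for the demo; the negated bracket expression [!...] matches any single character outside the allowed set):

```shell
d=$(mktemp -d)
touch "$d/plain_file-1.txt" "$d/bad£.txt"

# Matches every name containing at least one character that is NOT
# in a-z, A-Z, 0-9, space, dot, underscore, or hyphen.
find "$d" -name '*[!a-zA-Z0-9 ._-]*'
```

Only bad£.txt is printed; plain_file-1.txt uses nothing outside the allowed set, so it is skipped.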

Why does 'ls' recurse into folders? [closed]

Why does ls always recurse into folders when I use a wildcard on it (I'd rather that it didn't do this and instead just showed me all items in the directory starting with m and nothing else)?
$ ls
boot/ etc/ lost+found/ mnt/ proc/ run/ srv/ tmp/ var/ init* lib32# libx32# dev/ home/ media/ opt/ root/ snap/ sys/ usr/ bin# lib# lib64# sbin#
/ $ ls m*
media:
mnt:
c/ d/ e/ wsl/
$ alias ls
alias ls='ls -FAh --color=auto --group-directories-first'
This question is off-topic here and should be migrated to Unix & Linux or Super User; answering community-wiki for the OP's benefit, but expecting this to be closed.
ls isn't recursing. Instead, it's parsing the command line that it's given as an instruction to list the contents of the media directory.
The important thing to understand about UNIX in general is that commands don't parse their own command lines -- whatever starts a program is responsible for coming up with an array of C strings to be used as its command line argument, so a single string like ls m* can't be used.
The shell thus replaces ls m* with an array ["ls", "media"] (when media is the only match for m*).
Because ls can't tell the difference between being given media as the name of a directory to list, and being given media as the result of expanding a glob, it assumes the former, and lists the contents of that directory.
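This expansion is easy to observe directly, since echo simply prints whatever arguments the shell hands it:

```shell
d=$(mktemp -d); cd "$d"
mkdir media mnt

# The shell expands m* before echo ever runs, so echo receives two
# arguments -- exactly the same ones ls would receive:
echo m*            # prints: media mnt

# printf shows one argument per line:
printf '%s\n' m*
```

ls then treats each of those directory-name arguments as a request to list that directory's contents.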
Why does ls always recurse into folders when I use a wildcard on it
This is the specified behavior whenever the wildcard glob matches a directory.
From The Open Group Base Specifications Issue 7, 2018 edition:
For each operand that names a file of type directory, ls shall write the names of files contained within the directory as well as any requested, associated information.
You can however override this default behavior by using the -d option:
Do not follow symbolic links named as operands unless the -H or -L options are specified. Do not treat directories differently than other types of files. The use of -d with -R or -f produces unspecified results.
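A minimal demonstration of the difference, using a throwaway directory:

```shell
d=$(mktemp -d); cd "$d"
mkdir -p media/c mnt

# Default: ls lists the *contents* of each matched directory.
ls m*

# With -d, the matched names themselves are listed, nothing more:
ls -d m*           # prints just: media mnt
```

With -d the glob matches are shown as plain entries, which is the behavior the questioner wanted.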

How to use 'find' to print only matching directories?

I am trying to use find to recursively search the file names in a directory for a particular pattern (wp-config.*). When I do so using:
find `wp-config.*`
it seems to print all directories to the screen. How can I ensure that only the matching directories are printed to the screen?
Thanks.
From the answers in this outside post, I was able to use this command to do what I want:
find . -name 'wp-config.*' -printf "%h\n"
One of my main issues was that I originally did not understand how find reports its results: by default it prints the full path of every match, while the -printf action controls the output format ("%h\n" prints only the directory containing each match).
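Worth noting: -printf is a GNU find extension and is not available in BSD/macOS find; dirname is a portable stand-in, and sort -u collapses duplicates when one directory holds several matches (the paths below are invented for the demo):

```shell
d=$(mktemp -d)
mkdir -p "$d/site1" "$d/site2"
touch "$d/site1/wp-config.php" "$d/site2/wp-config.php" "$d/site2/wp-config.bak"

# GNU find: %h is the directory part of each matched path.
find "$d" -name 'wp-config.*' -printf '%h\n' | sort -u

# Portable alternative (works with BSD/macOS find too):
find "$d" -name 'wp-config.*' -exec dirname {} \; | sort -u
```

Both variants print site1 and site2 once each, even though site2 contains two matches.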
Correct usage of find:
find . -type d -name "wp-config.*"
The '-type d' will give you the directories.
The '-name "wp-config.*"' will give you the names requested.
Always use the man pages to search up commands. In this case:
man find
One last thing. The backticks ` are serving a totally different purpose. What you need here are regular quotes ".

bash Script - find: invalid predicate `-newermt' error [duplicate]

This question already has answers here:
How to use 'find' to search for files created on a specific date? [closed]
(9 answers)
Closed 4 years ago.
I have a scenario where I need to find the files for a specific year (e.g. 2012) and empty them. I am trying the command below:
#!/bin/bash
find . -type f -newermt 2012-01-01 ! -newermt 2013-01-01 -exec truncate -s 0 {} \;
which throws this error:
find: invalid predicate `-newermt'
Strangely, this command works for some people. It appears to be a version-compatibility issue: my system's find does not support the -newermt predicate.
So I just wanted to check:
1. How can I resolve the above error?
2. Is there any other way to perform my task, i.e. find the files for a specific year (e.g. 2012) and empty them?
The issue got resolved for me: the other box/environment where I needed this has a newer find version (4.4.2). Only the test environment had an older version, hence the error. Thanks all for your inputs.
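For older find builds that lack -newermt, one common workaround is to bracket the date range with two reference files created by touch -t and use the portable -newer test (file names below are made up for the sketch):

```shell
# Sample files standing in for the real data: one modified in 2012,
# one in 2014.
d=$(mktemp -d)
touch -t 201206150000 "$d/from2012.log"
touch -t 201406150000 "$d/from2014.log"

# Reference timestamps bracketing the year 2012.
start=$(mktemp); end=$(mktemp)
touch -t 201201010000 "$start"   # 2012-01-01 00:00
touch -t 201301010000 "$end"     # 2013-01-01 00:00

# Files modified during 2012: newer than start, not newer than end.
find "$d" -type f -newer "$start" ! -newer "$end"

# The OP's task then empties each match:
find "$d" -type f -newer "$start" ! -newer "$end" -exec truncate -s 0 {} \;
```

-newer compares against a reference file's modification time and is available in essentially every find implementation, so this works where -newermt does not.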

fswatch to watch only a certain file extension [closed]

I am using fswatch and only want it triggered if a file with extension .xxx is modified/created etc. The documentation and the second reference below indicate that:
All paths are accepted by default, unless an exclusion filter says otherwise.
Inclusion filters may override any exclusion filter.
The order in the definition of filters in the command line has no effect.
Question: What is the regular expression to use to exclude all files that do not match the .xxx extension?
References:
Is there a command like "watch" or "inotifywait" on the Mac?
Watch for a specific filetype
Platform:
MacOS 10.9.5.
I'm the fswatch author. It may not be very intuitive, but fswatch includes everything unless an exclusion filter says otherwise. Coming to your problem: you want to include all files with a given extension. Rephrasing in terms of exclusion and inclusion filters:
You want to exclude everything.
You want to include files with a given extension ext.
That is:
To exclude everything you can add an exclusion filter matching any string: .*.
To include files with a given extension ext, you add an inclusion filter matching any path ending with .ext: \\.ext$. Here you escape the dot to match a literal ".", then match the extension ext, and anchor the end of the path with $.
The final command is:
$ fswatch [options] -e ".*" -i "\\.ext$"
If you want case insensitive filters (e.g. to match eXt, Ext, etc.), just add the -I option.
You may watch for changes to files of a single extension like this:
fswatch -e ".*" -i ".*/[^.]*\\.xxx$" .
This will exclude all files and then include all paths ending with .xxx (and also exclude files starting with a dot).
If you want to run a command on the file change, you may add the following:
fswatch -e ".*" -i ".*/[^.]*\\.xxx$" -0 . | xargs -0 -n 1 -I {} echo "File {} changed"
