Been searching around for a bit and cannot find a solution for this one. I guess I'm looking for a leaf directory by name. In this example I'd like to get a list of directories called 'modules' that do NOT have a subdirectory called 'modules'.
modules/package1/modules/spackage1
modules/package1/modules/spackage2
modules/package1/modules/spackage3/modules
modules/package1/modules/spackage3/modules/spackage1
modules/package2/modules/
The list I desire would contain
modules/package1/modules/spackage3/modules/
modules/package2/modules/
All the directories named 'modules' that do not have a subdirectory called 'modules'.
I started by trying something like this, with no luck:
find . -name modules \! -exec sh -c 'find -name modules' \;
-exec works on the exit code; okay, let's pass the count as the exit code:
find . -name modules -exec sh -c 'exit $(find {} -name modules|grep -n ""|tail -n1|cut -d: -f1)' \;
This should take the count of each subdirectory called modules and exit with it. No such love.
This assumes GNU find. Find all leaf directories that include only one occurrence of "modules":
find -regex '.*/modules\(/.*\|$\)' \! -regex '.*/modules/.*/modules\(/.*\|$\)' -type d -links 2
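The -links 2 part relies on the fact that, on most traditional filesystems, a directory with no subdirectories has exactly two hard links (its own name plus its . entry); note that this does not hold on some filesystems such as btrfs. A quick hedged check on a scratch tree (paths here are just for illustration):
mkdir -p /tmp/t/a /tmp/t/b/c
find /tmp/t -type d -links 2    # should print only the leaf directories /tmp/t/a and /tmp/t/b/c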
find . -name modules |
awk '{ gooddir[$0] = 1 }
END {
    # a directory is "bad" if another found modules directory lies beneath it
    for (i in gooddir)
        for (j in gooddir)
            if (i != j && substr(j, 1, length(i) + 1) == i "/") {
                gooddir[i] = 0
                break
            }
    for (i in gooddir)
        if (gooddir[i] == 1)
            print i
}'
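Applied to the list in the question, this should print only modules/package1/modules/spackage3/modules and modules/package2/modules: every other modules directory that find reports has one of those beneath it.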
try this?
Your start
find . -name modules \! -exec sh -c 'find -name modules' \;
wasn't bad, it just needs some improvements.
find . -name modules \! -exec sh -c 'find {} -mindepth 1 -name modules|read' \; -print
{} to search in the found modules directory, not in the top directory again
-mindepth 1 to search only under the found modules directory, not to find itself again
|read to turn "no output" into a failing exit status (find itself, unfortunately, exits 0 whether it finds anything or not)
-print to print the found directories
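If you prefer something more explicit than |read, a sketch of the same idea that uses grep -q to produce the exit status (same logic, just a different way to signal "the inner find printed something"):
find . -name modules \! -exec sh -c 'find "$1" -mindepth 1 -name modules | grep -q .' _ {} \; -print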
The command below shows how many characters each file in the current directory contains.
find -name '*.*' |xargs wc -c
I want to write the standard output into a file.
find -name '*.*' |xargs wc -c > /tmp/record.txt
It encounters an issue:
wc: .: Is a directory
How to write all the standard output into a file?
Why -name '*.*'? That will not match every file (only names containing a dot) and will also match directories. You need to use -type f, and rather than piping the result to xargs it is better to use -exec:
find . -maxdepth 1 -type f -exec wc -c {} + > /tmp/record.txt
-maxdepth 1 guarantees that the search won't dive in subdirectories.
I think you maybe meant find |xargs wc -c?
find -name '*.*' also matches the starting directory . itself, which is where the wc: .: Is a directory error comes from.
Filter only files, if you want only files.
find -type f
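If you do want to keep the xargs pipeline (for instance, when there are many files), a sketch that also copes with spaces in file names, assuming GNU find and xargs for -print0/-0:
find . -maxdepth 1 -type f -print0 | xargs -0 wc -c > /tmp/record.txt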
I'm learning bash scripting and needed some simple help.
Here is what I have thus far:
find . -type d -empty -not -path "./.git/*" -exec touch {}/.gitkeep \;
So what this does is start from a root path, find all directories inside this root path that are empty and do not have a .git folder, and then, when that operation is successful, run -exec touch {}/.gitkeep to create a file .gitkeep inside that empty directory to ensure proper git commits.
What I want now is to echo out the current file path for the gitkeep file just created.
My first question is:
Should I be piping with |, like so:
find . -type d -empty -not -path "./.git/*" -exec touch {}/.gitkeep | outputFilenameDisplayFunction \;
Or maybe repeat what -exec does, like so:
find . -type d -empty -not -path "./.git/*" -exec touch {}/.gitkeep -exec outputFilenameDisplayFunction \;
Or maybe use >
find . -type d -empty -not -path "./.git/*" -exec touch {}/.gitkeep > outputFilenameDisplayFunction \;
None of these commands have been tested yet. I am really looking for explanations so I can be knowledgeable in the future.
As mentioned here, find accepts multiple -exec clauses in the same command.
In your case, the second one can call a script, as here:
find . -type d -empty -not -path "./.git/*" -exec touch {}/.gitkeep \; -exec myscript {} \;
Note the \;.
The script would be:
#!/bin/sh
echo "$1" > "afile"
Charles Duffy actually proposes, in the comments, for the second -exec:
-exec sh -c 'echo "$1" >>aFile' _ {} \;
avoiding the need for an external file to store your script.
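Putting the two pieces together, a sketch of the whole command (aFile is just a placeholder for whatever log file you want):
find . -type d -empty -not -path "./.git/*" \
    -exec touch {}/.gitkeep \; \
    -exec sh -c 'echo "$1" >> aFile' _ {} \;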
Let's start from your stated requirements:
So what this does is start from a root path, find all directories inside this root path that are empty and do not have a .git folder, and then, when that operation is successful, run -exec touch {}/.gitkeep to create a file .gitkeep inside that empty directory to ensure proper git commits.
If a directory is empty, it "can't have a .git folder" in the sense of having a child named .git by definition -- if it had any subdirectory, it wouldn't be empty. So we can completely ignore that part of your description in prose -- or interpret it to refer to what the code actually appears to be intended to do: pruning any directory which is under .git.
Should that be your intent, -path is the wrong tool for that job altogether, as it still searches the .git tree (and then excludes all the things that it found); instead, use -prune to stop find from recursing down that path at all:
while IFS= read -r -d '' dirname; do
touch -- "${dirname}/.gitkeep"
printf '%q\n' "$dirname" # this goes to the logfile, since we open it for the whole loop
done < <(find . -name .git -prune -o -type d -empty -print0) >logFile
Why prefer this approach?
Instead of starting a shell per directory found (as would happen if you used -exec to start a shell script or a shell), it keeps your initial/primary shell running, and iterates through the loop once per item found.
Because it's running code in that shell, you can use shell functions, modify shell variables (as with (( ++directoriesFound )) to keep a counter, for example), or perform redirections scoped to the loop (i.e. >logFile) to open an output file just once and use it repeatedly within; see the sketch just below.
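As a sketch of what that buys you (directoriesFound is just an illustrative counter name, not something defined elsewhere):
directoriesFound=0
while IFS= read -r -d '' dirname; do
    touch -- "${dirname}/.gitkeep"
    printf '%q\n' "$dirname"      # goes to logFile, opened once for the whole loop
    (( ++directoriesFound ))      # plain shell arithmetic, no subshell involved
done < <(find . -name .git -prune -o -type d -empty -print0) >logFile
echo "Created .gitkeep in $directoriesFound directories"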
On GNU/anything, find has -printf, which makes doing what you want a one-liner:
find -name .git -prune \
-o -type d -empty -printf %p/.gitkeep\\n -execdir touch {}/.gitkeep \;
(Note: the omitted {}/ has been fixed. GNU find's -execdir doesn't change the behavior here, but it is safer than -exec on systems that may find themselves under attack, because the exec'd command is run directly in the directory find reached rather than having to re-walk the path.)
On a unix server, I'm trying to figure out how to remove a file, say "example.xls", from any subdirectories that start with v0 ("v0*").
I have tried something like:
find . -name "v0*" -type d -exec find . -name "example.xls" -type f
-exec rm {} \;
But I get errors. I have a solution, but it works too well, i.e. it will delete the file in any subdirectory, regardless of its name:
find . -type f -name "example.xls" -exec rm -f {} \;
Any ideas?
You will probably have to do it in two steps -- i.e. first find the directories, and then the files -- you can use xargs to do it in a single line, like:
find . -name "v0*" -type d | \
xargs -l -I[] \
find [] -name "example.xls" -type f -exec rm {} \;
What it does is first generate a list of matching directory names, then let xargs call the second find on each of those names to locate the file within that directory.
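If any of the v0* directory names contain spaces, the plain pipeline above will split them; a sketch of a NUL-delimited variant (GNU find and xargs assumed):
find . -name "v0*" -type d -print0 | \
    xargs -0 -I[] \
    find [] -name "example.xls" -type f -exec rm {} \;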
Try:
find -path '*/v0*/example.xls' -delete
This matches only files named example.xls which, somewhere in their path, have a parent directory whose name starts with v0.
Note that since find offers -delete as an action, it is not necessary to invoke the external executable rm.
Example
Consider this directory structure:
$ find .
.
./a
./a/example.xls
./a/v0
./a/v0/b
./a/v0/b/example.xls
./a/v0/example.xls
We can identify files example.xls that have one of their parent directories named v0*:
$ find -path '*/v0*/example.xls'
./a/v0/b/example.xls
./a/v0/example.xls
To delete those files:
find -path '*/v0*/example.xls' -delete
Alternative: find only those files directly under a v0* directory
find -regex '.*/v0[^/]*/example.xls'
Using the above directory structure, this approach returns one file:
$ find -regex '.*/v0[^/]*/example.xls'
./a/v0/example.xls
To delete such files:
find -regex '.*/v0[^/]*/example.xls' -delete
Compatibility
Although my tests were performed with GNU find, -path is required by POSIX, while -regex and -delete are extensions that are nonetheless also supported by the BSD find shipped with OS X.
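If your find lacks -delete, a sketch of a portable equivalent that sticks to POSIX features (-path plus -exec ... +):
find . -path '*/v0*/example.xls' -type f -exec rm -f {} +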
I think this is probably a pretty n00ber question but I just gotsta ask it.
When I run:
$ find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \)
and get:
./01.Adagio - Allegro Vivace.mp3
./03.Allegro Vivace.mp3
./02.Adagio.mp3
./04.Allegro Ma Non Troppo.mp3
why does find prepend a ./ to the file name? I am using this in a script:
fList=()
while read -r -d $'\0'; do
fList+=("$REPLY")
done < <(find . -type f \( -name "*.mp3" -o -name "*.ogg" \) -print0)
fConv "$fList" "$dBaseN"
and I have to use a bit of a hacky-sed-fix at the beginning of a for loop in function 'fConv', accessing the array elements, to remove the leading ./. Is there a find option that would simply omit the leading ./ in the first place?
The ./ at the beginning of the file is the path. The "." means current directory.
You can use "sed" to remove it.
find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) | sed 's|./||'
I do not recommend doing this, though: since find can search through multiple directories, how would you know whether a found file is located in the current directory?
If you ask it to search under /tmp, the results will be of the form /tmp/file:
$ find /tmp
/tmp
/tmp/.X0-lock
/tmp/.com.google.Chrome.cUkZfY
If you ask it to search under . (like you do), the results will be of the form ./file:
$ find .
.
./Documents
./.xmodmap
If you ask it to search through foo.mp3 and bar.ogg, the results will be of the form foo.mp3 and bar.ogg:
$ find *.mp3 *.ogg
click.ogg
slide.ogg
splat.ogg
However, this is just the default. With GNU and other modern finds, you can modify how the result is printed. To always print just the file name (the last path component):
find /foo -printf '%f\0'
If the result is /foo/bar/baz.mp3, this will result in baz.mp3.
To print the path relative to the argument under which it's found, you can use:
find /foo -printf '%P\0'
For /foo/bar/baz.mp3, this will show bar/baz.mp3.
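Applied to the script in the question, a sketch using %P so the array entries never get the leading ./ in the first place (GNU find assumed; fList is the array name from the question):
fList=()
while IFS= read -r -d '' f; do
    fList+=("$f")
done < <(find . -type f \( -name "*.mp3" -o -name "*.ogg" \) -printf '%P\0')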
However, you shouldn't be using find at all. This is a job for plain globs, as suggested by R Sahu.
shopt -s nullglob
files=(*.mp3 *.ogg)
echo "Converting ${files[*]}:"
fConv "${files[@]}"
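With nullglob set, files simply stays empty when nothing matches, so fConv is called with no arguments instead of with the literal strings *.mp3 and *.ogg.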
find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) -exec basename "{}" \;
Having said that, I think you can use a simpler approach:
for file in *.mp3 *.ogg
do
if [[ -f $file ]]; then
# Use the file
fi
done
If your -maxdepth is 1, you can simply use ls:
$ ls *.mp3 *.ogg
Of course, that will pick up any directory with a *.mp3 or *.ogg suffix, but you probably don't have such a directory anyway.
Another option is to munge your results:
$ find . -maxdepth 1 -type f \( -name "*.mp3" -o -name "*.ogg" \) | sed 's#^\./##'
This will remove all ./ prefixes, but not touch other file names. Note the ^ anchor in the substitution command.
I want to run a find command but only find the files in directories, not the directories or subdirectories themselves. Also acceptable would be to find the directories but grep them out or something similar, while still listing the files in those directories. As of right now, this finds all files changed in the last day in the working directory, greps out .DS_Store, and replaces spaces with underscores:
find . -mtime -1 -type f -print | grep -v '\.DS_Store' | awk '{gsub(/ /,"_")}; 1'
Any help would be appreciated!
If you have GNU find:
find . -mtime -1 ! -name '.DS_Store' -type f -printf '%f\n'
will print only the basename of the file.
For other versions of find:
find . -mtime -1 ! -name '.DS_Store' -type f -exec basename {} \;
To also replace spaces with underscores, you could then do:
find . -mtime -1 ! -name '.DS_Store' -type f -exec sh -c 'basename "$1" | tr " " _' _ {} \;
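With GNU find, the same result can be had without spawning a shell per file; a sketch combining -printf with tr:
find . -mtime -1 ! -name '.DS_Store' -type f -printf '%f\n' | tr ' ' _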