There's a similar post: How to add CVS directories recursively
However, trying out some of the answers such as:
find . -type f -print0 | xargs -0 cvs add
Gave:
cvs add: cannot open CVS/Entries for reading: No such file or directory
cvs [add aborted]: no repository
And
find . \! -name 'CVS' -and \! -name 'Entries' -and \! -name 'Repository' -and \! -name 'Root' -print0 | xargs -0 cvs add
Gave:
cvs add: cannot add special file `.'; skipping
Does anyone have a more thorough solution to recursively adding new files to a CVS module? It would be great if I could also alias it in ~/.bashrc or something along those lines.
And yes, I do know that it is a bit dated, but I'm forced to work with it for a certain project; otherwise I'd use git/hg.
This may be a bit more elegant:
find . -type f -print0 | grep -zv '/CVS/' | xargs -0 cvs add
(GNU grep's -z option keeps the stream NUL-delimited, matching the -print0 output and the xargs -0 on the other end; a plain line-oriented egrep would not split the records correctly.)
Please note that print0, while very useful for dealing with file names containing spaces, is NOT universal - it is not, for example, in Solaris's find.
find . -name CVS -prune -o -type f -print0
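Piped into xargs, that becomes something like the following sketch (it still only adds files, so the directory-ordering caveat discussed below applies):
find . -name CVS -prune -o -type f -print0 | xargs -0 cvs add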
See this answer of mine to the quoted question for an explanation of why you get the "cannot open CVS/Entries for reading" error.
Two important things to keep in mind when looking at the other solutions offered here:
- folders have to be added, too
- in order to add a file or folder, the parent folder of that item must already have been added, so the order in which items are processed is very important
So, if you're just starting to populate your repository and you haven't yet got anything to check out that would create a context for the added files then cvs add cannot be used - or at least not directly. You can create the "root context" by calling the following command in the parent folder of the first folder you want to add:
cvs co -l .
This will create the necessary sandbox meta-data (i.e. a hidden "CVS" subfolder containing the "Root", "Repository" and "Entries.*" files) that will allow you to use the add command.
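Once that sandbox exists, one way to satisfy the ordering requirement is to add all the directories first (find visits a parent before its children) and the regular files afterwards. Here's a minimal sketch that could live in ~/.bashrc as a function, assuming GNU find and using a hypothetical name cvsaddall; anything named CVS is skipped so the metadata folders are left alone:
cvsaddall() {
    # directories first: find's natural traversal order guarantees a parent
    # is added before its children, which is what cvs add requires
    find . -mindepth 1 -name CVS -prune -o -type d -exec cvs add {} \;
    # then the regular files, again skipping the CVS metadata folders
    find . -name CVS -prune -o -type f -exec cvs add {} +
}
Re-running it mostly just produces warnings for items that are already under CVS control.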
Related
I'm trying to get a list of files which I can pipe to wc -l to get a word count of all of them (not using wc directly so I can filter the file list before using the command).
My directory structure is something like this:
- folder
- file.php
- file2.html
- file3.php
- folder1
- folder2a
- folder3b
- folder4
- file.php
- file2.php
I'd like to exclude certain directories in my find, largely libraries and other stuff that I didn't make. I can do that manually like so:
find /var/www/html/ -type f -not -path "/var/www/html/folder/folder1" -not -path "/var/www/html/folder/folder2a" etc.
However, it's annoying to have to explicitly specify all the folders, and the list could change at any point, too. I've tried using /* and /** to pattern match, but that doesn't work either. Is there a way, with one of these "not"s in my find command, to exclude all the subdirectories of a particular directory but not the directory itself (i.e. include its files, but not any of its subdirectories)?
Here's an intuitive guess:
find /var/www/html -not -path '/var/www/html/someotherbadfolder' -type f \( ! -path "/var/www/html/folder" -maxdepth 1 \)
But even find complains about this:
find: warning: you have specified the -maxdepth option after a non-option argument -not, but options are not positional (-maxdepth affects tests specified before it as well as those specified after it). Please specify options before other arguments.
So it seems -maxdepth can't be combined with other tests inside an expression like that.
There are lots of Q&As about excluding specific subdirectories, but not about generically excluding all subdirectories of a particular directory.
I was able to get it to work in a single directory with -maxdepth 1, but the problem is that this exclusion is part of a larger command, and it didn't work once I ran the full command. Potentially, I might need to exclude specific subdirectories as well as any subdirectories in several other specific subdirectories.
Assuming you're specifically looking for files (i.e. not directories):
find /var/www/html -type f -not -path "/var/www/html/folder/*/*"
That's because:
- files directly under /var/www/html/folder aren't directories, so they don't match the -path clause.
- directories directly under /var/www/html/folder don't match -type f.
- files under subdirectories of /var/www/html/folder have the extra / in their path, so they do match the -path expression and are excluded.
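If the end goal is the combined line count of those files rather than the list itself, the same filter can feed wc directly (a sketch; cat-ing everything and counting once avoids the per-batch totals you would get from xargs wc -l):
find /var/www/html -type f -not -path '/var/www/html/folder/*/*' -exec cat {} + | wc -l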
Just with find:
find /var/www/html -type f -not -path '/var/www/html/folder/*/*'
Original answer:
One hack could be grep -v on the output of find:
find /var/www/html/ -type f | grep -v "/var/www/html/folder/.*/" | wc -l
I want to write a bash script to list the directories under /usr/jboss/jbosseap that contain directories named app_m1 or app_m01.
I want to list either of the naming conventions.
This is what I'm doing currently, but it doesn't work:
ls -1d *m{[0-9],[0-9][0-9]}
It only works if both (app_m1 and app_m01) are present.
There are lots of ways you can do this; here's a simple version:
find . -type d \( -name "app_m1" -o -name "app_m01" \)
The parentheses are needed so that -type d applies to both names; without them, -type d binds only to the first -name.
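The ls version only works when both forms exist because, without nullglob, a pattern that matches nothing is passed to ls literally and ls then complains about a nonexistent name. A glob-only sketch, assuming bash and that the current directory is /usr/jboss/jbosseap:
shopt -s nullglob                    # unmatched patterns expand to nothing
printf '%s\n' *m[0-9] *m[0-9][0-9]   # covers both naming conventions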
I want to delete files from a directory which contains many subdirectories, but the deletion should not happen in one subdirectory (searc) whose name is predefined but whose path varies, as shown below. To delete the files I am currently using the command below:
find . -type f -name "*.txt" -exec rm -f {} \;
This command deletes all the matching files in the whole tree, so how can I delete the files without searching that subdirectory?
The subdirectory name will be the same but its path will differ, for example:
Main
|
a--> searc
|
b-->x--->searc
|
c-->y-->x-->searc
The subdirectory not to be searched can be present anywhere, as shown above.
I think you want the -prune option. In combination with a successful name match, this prevents descent into the named directories. Example:
% mkdir -p test/{a,b,c}
% touch test/{a,b,c}/foo.txt
% find test -name b -prune -o -name '*.txt' -print
test/a/foo.txt
test/c/foo.txt
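Applied to the question (a sketch, assuming the directory to skip is literally named searc):
find . -type d -name searc -prune -o -type f -name '*.txt' -exec rm -f {} +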
I am not completely sure what you're asking, so I can give only somewhat generic advice.
You already know the -name option. This refers to the filename only. You can, however, also use -wholename (a.k.a. -path), which refers to the full path (beginning with the one given as first option to find).
So if you want to delete all *.txt files except in the foo/bar subdirectory, you can do this:
find . -type f -name "*.txt" ! -wholename "./foo/bar/*" -delete
Note the -delete option; it doesn't require a subshell, and is easier to type.
If you would like to exclude a certain directory name regardless of where in the tree it might be, just don't "root" it. In the above example, foo/bar was "rooted" to ./, so only a top-level foo/bar would match. If you write ! -wholename "*/foo/bar/*" instead (allowing anything before or after via the *), you would exclude any files below any directory foo/bar from the operation.
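For the layout in the question that would be something like this sketch (GNU find, since -delete is a GNU extension):
find . -type f -name '*.txt' ! -path '*/searc/*' -delete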
You can use xargs instead of the -exec:
find .... <without the -exec stuff> | grep -v 'your search' | xargs echo rm -f
Try this first. If it is satisfactory, you can remove the echo.
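Filled in for this question it might look like the sketch below; note that a newline-separated pipe like this breaks on file names containing spaces, and the echo is kept so you can review the list before actually deleting anything:
find . -type f -name '*.txt' | grep -v '/searc/' | xargs echo rm -f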
I have been using CVS and am having trouble when moving files from other projects. Is there an easy way to remove all folders [and subfolders] with the name 'CVS' so that I can add them correctly to my new CVS repo?
First try to find them: find . -name CVS. Then remove them: find . -name CVS -prune -exec rm -r {} + (a plain -delete is not enough here, because -delete only removes empty directories and CVS folders contain files). If you'd like to ensure that only folders are removed, just add the -type d option to the find commands.
Another possible solution with more information during the search and deletion: find . -type d -name CVS -exec rm -rv {} \;
Use find:
find . -type d -iname "cvs" -exec rm -r {} +
I have a file /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home
I am trying to find if I have the *.jdk anywhere else on my hard drive. So I do a search command:
find . -name "*.jdk"
But it doesn't find anything. Not even the one I know that I have. How come?
find . only looks in your current directory and below. If you have permission to look for files in other directories (root access), then you can use the following to find your file:
find / -type f -name "*.jdk"
If you are getting tons of "permission denied" messages, you can suppress them by doing:
find / -type f -name "*.jdk" 2> /dev/null
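Also worth knowing: on macOS a *.jdk entry such as 1.6.0.jdk is normally a directory (a bundle), so a search restricted to regular files will never report it. A directory-oriented sketch:
find / -type d -name "*.jdk" 2> /dev/null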
a/
find . means, "find (starting in the current directory)." If you want to search the whole system, use find /; to search under /System/Library, use find /System/Library, etc.
b/
It's safer to use single quotes around wildcards. If there are no files named *.jdk in the working directory when you run this, then find will get a command-line of:
find . -name *.jdk
If, however, you happen to have files junk.jdk and foo.jdk in the current directory when you run it, find will instead be started with:
find . -name junk.jdk foo.jdk
… which will (since there are two) confuse it, and cause it to error out. If you then delete foo.jdk and do the exact same thing again, you'd have
find . -name junk.jdk
…which would never find a file named (e.g.) 1.6.0.jdk.
What you probably want in this context, is
find /System -name '*.jdk'
…or, you can "escape" the * as:
find /System -name \*.jdk
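A quick way to see exactly what the shell hands to find is to prefix the command with echo (a sketch; run it from the same directory):
echo find . -name *.jdk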
Probably your JDKs are uppercase and/or the version of find available on OS X doesn't default to -print if no action is specified; try:
find . -iname "*.jdk" -print
(-iname is like -name but performs a case-insensitive match; -print tells find to print out the results)
--- EDIT ---
As noted by @Jaypal, find . ... obviously looks only into the current directory (and its subdirectories); if you want to search the whole drive you have to specify / as the search path.
The '.' you are using is the current directory. If you're starting in your home dir, it will probably miss the JDK files.
Worst case search is to start from root
find / -name '*.jdk' -o -name '*.JDK' -print
Otherwise replace '/' with some path you are certain is a parent of your JDK files.
I hope this helps.
If you are in the Mac terminal, and also already in the directory where you want the search to be conducted, then this may also work for you:
find *.jdk
At least it worked for me.
find / -type f -name "*.jdk" works on Mac also
This works for me on macOS.
find . -type f -iname '*.jdk'
ls *.jpg | cut -f 1 -d "."
Sub out the '.jpg' for whatever extension you want to list.