How to remove all files with a specific extension from version control? - bash

I am using Mercurial under Linux. I would like to exclude all files matching the pattern *.pro.user* from the version control system.
I tried to list all the files with:
find . -name "*.pro.user*"
This also turned up some results that are inside the .hg folder:
...
./.hg/store/data/_test_run_multiple/_test_run_multiple.pro.user.i
./.hg/store/data/_test_non_dominated_sorting/_test_sorting.pro.user.i
./Analyzer/AlgorithmAnalyzer.pro.user
./Analyzer/AlgorithmAnalyzer.pro.user.a6874dd
...
I then tried to pipe this result to the hg forget command like:
find . -name "*.pro.user*" | hg forget
but I get:
abort: no files specified
My guess is that the list needs to be processed in some way in order to be passed to hg forget.
I would like to ask:
How can I pass the result of my find query into the hg forget command?
Since the query result contains files inside the "private" folder .hg, is this a good idea? I hope that Mercurial will ignore that request, but should I remove those results somehow?

Try the following:
hg forget "set:**.pro.user*"
This tells Mercurial to forget any files that match the fileset **.pro.user*. Since the fileset is evaluated by Mercurial itself, it won't look inside the .hg directory. You can do even more with filesets; see: hg -v help filesets
The ** at the start means to work in subdirectories, rather than just the current directory.
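If you want to see which tracked files the fileset matches before forgetting them, here is a quick sketch (hg files is available in reasonably recent Mercurial versions):
hg files "set:**.pro.user*"    # list tracked files matching the fileset
hg forget "set:**.pro.user*"   # then stop tracking them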

First of all, you can use find * -name "*.pro.user*" to avoid looking in .hg.
Mercurial's forget command requires its arguments on the command line. So you need to use xargs:
find * -name "*.pro.user*" | xargs hg forget
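If any of the matched paths could contain spaces or other special characters, a null-delimited variant of the same pipeline is safer:
find * -name "*.pro.user*" -print0 | xargs -0 hg forget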
Alternatively you can ask find to do the job:
find * -name "*.pro.user*" -exec hg forget {} \;
Finally, you should add *.pro.user* to your .hgignore file.
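A minimal sketch of what that .hgignore entry might look like, assuming glob syntax (your .hgignore may already contain other entries):
syntax: glob
*.pro.user*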

Related

using find utility to look for all .git/config files under a certain directory

I have a certain github repo that I used to test out netlify and vuepress. I somewhat lost track of where it is on the file system, so I was planning to use mdfind (I am on mac) or find to locate all the .git/config files and then grep for my github url.
But it seems surprisingly hard to convince find to look for config under the hidden .git directories.
I did find How do I search all hidden files that are in hidden folders using Terminal? and looking at it, it looks like the following would work:
find . -name '.*' \( -type d -exec find {} \; -prune -o -print \) | egrep '/.git/config'
but given that config is a highly specific file for git, I was hoping that there is a better suited find command that will do the trick. I have already given up on mdfind, as the linked question's accepted answer is skeptical about getting it to reliably find hidden files.
Note: not looking for answers based on the locate utility, or some GUI tool, this is strictly about getting find to do the work.
Use -path.
find . -path '*/.git/config'
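To then narrow the hits down to the repository in question, you can grep the matching config files for the remote URL; the URL below is just a placeholder for whatever your GitHub URL is:
find . -path '*/.git/config' -exec grep -l 'github.com/yourname/yourrepo' {} +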

Find git repository on macOS

I would like to find the location of a Git repository I made on my Mac. Is there a way to find, for example, albatrocity/git-version-control.markdown on macOS using the Terminal? I installed everything with default parameters. I guess it must be in the User directory, but I don't find anything related to GitHub there.
When I find it, I would like to completely remove it to make a "proper" install.
EDIT: sudo find / -name "parsing.py" -print
I searched for a file that I know the folder contains, because Terminal showed me nothing with sudo find / -wholename "*albatrocity/git-version-control.markdown"
You can use find's -wholename option to find a file based on its name and folder:
find <directory> -wholename "*albatrocity/git-version-control.markdown"
Example, if you want to search in the /Users/ directory:
find /Users/ -wholename "*albatrocity/git-version-control.markdown"
If you have locate on mac, and a regularly running updatedb, locate might be much faster:
locate albatrocity | grep git-version-control.markdown
It uses a prebuilt database for fast filename lookups, but it can be out of date if the database isn't updated regularly or if the file is too new (typically less than one day old).
If that is unsuccessful, I would fall back to a full search with find, but restrict it to a likely, narrower path.
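If the locate database on macOS has never been built or is stale, it can usually be refreshed first; the exact path of the updatedb script may differ between macOS versions:
sudo /usr/libexec/locate.updatedb    # rebuild the locate database (can take a while)
locate albatrocity | grep git-version-control.markdown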

How to clean up directory structure from botched rsync operation?

In a bash script, I am rsync'ing many directories. However, in my script I forgot to put a trailing slash at the end of the source directory. As a result, I have something like
rsync /first/path/dir /second/path/dir
in my script, which I know is wrong, and ends up creating
/second/path/dir/dir
in the destination directory, which is not what I want at all.
Is there a quick way I can use the find command to find all instances of "dir/dir" and perform an 'rm -rf' without losing the other original contents of /second/path/dir ?
You can use the -path argument of the Unix find command to locate (and remove) 'dir/dir':
find . -path \*/dir/dir -exec rm -rf {} \;
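Since rm -rf is unforgiving, it may be worth previewing the matches before deleting anything; a sketch of that two-step approach:
find . -path '*/dir/dir' -print                      # dry run: show what would be removed
find . -path '*/dir/dir' -prune -exec rm -rf {} \;   # -prune keeps find from descending into directories it is about to delete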

Script to find files in subdirectories

I need a script that will find and get me all files in all subdirectories (and leave them in the folder structure as they are now). I know how to find and print those files:
find . -name "something.extension"
The point is, those directories contain lots of files that were used before, and I don't want those; the script should only find files whose path matches a pattern like this:
xxx/trunk/xxx/src/main/resources
xxx is different every time, and after resources there are further folders whose names also differ depending on xxx.
Every top-level xxx folder contains a folder named 'tags' (at the same level as trunk) that stores previous releases of the module (and every release contains files with the name I am looking for, but I don't want those outdated files).
So I want to find all such files under the path pattern I specified and copy them to a new location while preserving the folder structure.
I am using Windows and cygwin.
Update
I combined the commands from the answer 'that other guy' posted below, and it works. Just to be clear, I have something like this:
find */trunk/*/src/main/resources -name "something.extension" -exec mkdir -p /absolute/target/path/{} \; -exec cp {} /absolute/target/path/{} \;
Thanks.
Instead of searching under the entire current directory (.), just search under the directories you care about:
find */trunk/*/src/main/resources -name "something.extension"
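If GNU cp is available (it is in Cygwin), its --parents option can recreate the source directory structure under the target in a single step, avoiding the separate mkdir -p; a sketch assuming the same target path as in the question's update:
find */trunk/*/src/main/resources -name "something.extension" -exec cp --parents {} /absolute/target/path \;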

SVN is it possible to commit only files that match a given extension

I'm using the SlikSVN client and I want to know if there is any way to commit only files that match a given extension. I'm aware of the ignore prop (and hooks). But I'm wondering if there is an easy way, such as svn commit *.cs, for committing only C# files. (I've tried that and it doesn't work ;))
For me, it is easy to say commit these three file extensions and just ignore the rest.
Any help will be appreciated, Thanks!
If they are in the same directory, yes, for example:
svn commit *.cs
svn commit path/to/subdir/*.cs
If they are not in the same directory, and you can only use DOS, then no.
If you have cygwin or Git Bash, then you could do this:
find . -type f -name '*.cs' -exec svn commit {} +
The svn commit command doesn't have a filtering option for selecting a subset of files; you can only use shell globs (like *.cs).
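If you want to commit several extensions at once, the find approach extends naturally; the extensions other than *.cs below are just placeholders for whichever three you have in mind:
find . -type f \( -name '*.cs' -o -name '*.config' -o -name '*.resx' \) -exec svn commit -m "your message" {} +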
