Query on find command - bash

When I run find inside a directory, I get output like:
./MyWork/a.c
./MyWork/b.c
./mtab
My question is: how can I use find so that the output does not include the leading ./?
The output should be:
MyWork/a.c
MyWork/b.c
mtab
Thanks,
LinuxPenseur

Add -printf "%P\n" at the end.
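For instance, recreating the question's layout in a throwaway directory (note that -printf is a GNU find extension, so this assumes GNU findutils):

```shell
# Recreate the question's layout in a scratch directory
dir=$(mktemp -d)
mkdir -p "$dir/MyWork"
touch "$dir/MyWork/a.c" "$dir/MyWork/b.c" "$dir/mtab"
cd "$dir"

# %P prints each path with the starting-point argument stripped off
find . -type f -printf '%P\n' | sort
```

This prints MyWork/a.c, MyWork/b.c and mtab with no ./ prefix.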

Related

Call partial file name in a linux script

I have several files: one_001_three.txt, one_002_three.txt, . . .
After removing the extension, I would like to call it such that:
${fname}_001
would call the file 'one_001'
Any ideas?
As far as I understood your question, you can use the basename command; try this and build from it:
basename one_001_three.txt _three.txt --> this will give the output one_001
Note that the file itself does not get renamed.
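For example, using the file name from the question (basename only manipulates the string; it never touches the file on disk):

```shell
# basename NAME SUFFIX prints NAME with the trailing SUFFIX removed
basename one_001_three.txt _three.txt    # prints one_001
```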

Find a file in Cygwin

I use find -name "nametofind" in cygwin to search for a file, but it does not give me any result, even when the file I want to search exists in the current directory. What am I doing wrong? Thanks.
As the comment mentioned more succinctly, you need to tell find which directory you want to search. If it is the current directory, you should use . as the starting point:
find . -name "nametofind"
It appears that the OP was trying to either match a partial file name or a file name with a different case. As @devnull mentioned in his comment, the correct solution for either case is to use the following.
find . -iname '*nametofind*'
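A quick demonstration of the case-insensitive match (the file name here is invented for illustration):

```shell
# -iname matches regardless of case
dir=$(mktemp -d)
touch "$dir/NameToFind.txt"
cd "$dir"

find . -iname '*nametofind*'    # matches despite the case difference
```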

Bash find files and display path function

I'm writing a function in my .bashrc file which helps me find files according to the string which I pass as an argument to the function:
# ~/.bashrc
function search {
find . -iname "*$1*" -printf "%f\n"
}
This function works well: it prints all files matching the given string under the current directory and under all subdirectories. The only problem is that it prints just each file's name and not its path.
For example, if I have a folder containing some sub-folders and files, this function would be much more helpful if it printed the path to each file located in any sub-folder.
For example, if I have a folder named Folder/ and a few sub-folders named whatever_num, running search thisandthat would spit out a list looking something like this:
$ search thisandthat
some-file-containing-thisandthat-in-its-filename.ext
whatever_1/path/to/some-file-containing-thisandthat-in-its-filename.ext
whatever_2/path/to/some-file-containing-thisandthat-in-its-filename.ext
So my question is: how can I modify my search function so that it prints out the path to the files I might be searching for?
Thank you!
function search {
find `pwd` -iname "*$1*"
}
Change the find line to:
find . -iname "*$1*" -printf "$(pwd)/%P\n"
Thanks to @JKB and @acro444. Their answers helped me modify my search function. I am posting my own answer since I went with a slightly different solution than the ones they posted, but it is thanks to their contributions:
# ~/.bashrc
function search {
find . -iname "*$1*" -printf "%P\n"
}
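As a quick sanity check of the function above (the file and folder names below are made up to mirror the question's example):

```shell
# Scratch tree mirroring the question's example layout
dir=$(mktemp -d)
mkdir -p "$dir/whatever_1/path/to"
touch "$dir/some-thisandthat.ext" "$dir/whatever_1/path/to/some-thisandthat.ext"

search() {
    find . -iname "*$1*" -printf '%P\n'
}

cd "$dir"
search thisandthat | sort
```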
I do it this way because I don't care about seeing /home/myusername/... in front of the rest of each path. I know I am in my /home/username/somefolder directory, so repeating /home/username/... for every file that comes up is just unnecessary noise.
Thank you again for the help!
Cheers!

Bash script to find file older than X days, then subsequently delete it, and any files with the same base name?

I am trying to figure out a way to search a directory for a file older than 365 days. If it finds a match, I'd like it to both delete the file and locate any other files in the directory that have the same basename, and delete those as well.
File name examples: 12345.pdf (Search for) then delete, 12345_a.pdf, 12345_xyz.pdf (delete if exist).
Thanks! I am very new to BASH scripting, so patience is appreciated ;-))
I doubt this can be done cleanly in a single pass.
Your best bet is to use -mtime or a variant to collect names and then use another find command to delete files matching those names.
UPDATE
With respect to your comment, I mean something like:
# find basenames of old files
find .... -printf '%f\n' | sort -u > oldfiles
for file in $(<oldfiles); do find . -name "$file" -delete; done
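Putting the two passes together, a minimal sketch (GNU find assumed; the directory and the 12345.pdf naming scheme are invented here to match the question's example):

```shell
# Set up a demo directory with one "old" file, its siblings, and a newer file
dir=$(mktemp -d)
touch -d '400 days ago' "$dir/12345.pdf"           # old enough to prune
touch "$dir/12345_a.pdf" "$dir/12345_xyz.pdf" "$dir/67890.pdf"

# Pass 1: collect the names of PDFs older than 365 days
find "$dir" -maxdepth 1 -type f -name '*.pdf' -mtime +365 -printf '%f\n' |
while read -r name; do
    base=${name%.pdf}                              # 12345.pdf -> 12345
    # Pass 2: remove the old file and any BASE_*.pdf siblings
    rm -f "$dir/$base.pdf" "$dir/$base"_*.pdf
done
```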

Bash find in current folder and #name# sub-folder recursively

A tricky question for a bash noob like me, but I'm sure this is easier than it seems.
I'm currently using the find command as follows :
run "find #{current_release}/migration/ -name '*.sql'| sort -n | xargs cat >#{current_release}/#{stamp}.sql"
in my capistrano recipe.
The problem is that #{current_release}/migration/ contains subfolders, and I'd like the find command to include only one of these, depending on its name (which I know; it's based on the target environment).
As a recap, folder structure is
Folder
|- sub1
|- sub2
and I'm trying to make find recurse ONLY into sub1, for example. I'm sure this is possible, I just couldn't find how.
Thanks.
Simply specify the directory you want as argument to find, e.g. find #{current_release}/migration/sub1 ....
EDIT: As per your clarification, you should use the -maxdepth argument for find, to limit the recursion depth. So, for example, you can use find firstdir firstdir/sub1 -maxdepth 1.
You just need to append that to your find invocation:
find #{current_release}/migration/sub_you_want -name ...
Depending on how you make the determination of the sub-directory you want, you should be able to script that as well.
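A small sketch of the -maxdepth approach, with an invented Folder/sub1, Folder/sub2 tree standing in for the real migration directory:

```shell
# Hypothetical tree: Folder contains sub1 and sub2; only sub1 is wanted
dir=$(mktemp -d)
mkdir -p "$dir/Folder/sub1" "$dir/Folder/sub2"
touch "$dir/Folder/top.sql" "$dir/Folder/sub1/a.sql" "$dir/Folder/sub2/b.sql"

# Search the top level and one chosen sub-folder, skipping everything else;
# -maxdepth 1 stops find from descending past each starting point
find "$dir/Folder" "$dir/Folder/sub1" -maxdepth 1 -name '*.sql' | sort
```

This lists top.sql and sub1/a.sql but never touches sub2.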
