find exec and strip extension from filenames - bash

Any idea why this command is not working? By the way, I'm trying to strip the extensions from all CSV files in the current directory.
find -type f -iname "*.csv" -exec mv {} $(basename {} ".csv") \;
I've tried many variants, including parameter expansions and xargs, but it was all futile.

This should do it:
find ./ -type f -iname "*.csv" -exec sh -c 'mv {} $(basename {} .csv)' \;
Because the single quotes stop your interactive shell from expanding anything, find is able to substitute {} with its findings first; only then does sh execute the -exec part, including the command substitution.
The reason yours is not working is that $(basename {} ".csv") is a command substitution (-> $()) and is evaluated by your shell before find ever runs. If we look at the command execution step by step, you will see what happens:
find -type f -iname "*.csv" -exec mv {} $(basename {} ".csv") \; - your command
find -type f -iname "*.csv" -exec mv {} {} \; - the command substitution is evaluated first ($(basename {} ".csv") returns {} because it treats {} as a literal string)
find -type f -iname "*.csv" -exec mv {} {} \; - as you can see, mv is now asked to move each file onto itself, which achieves nothing
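As an aside, substituting {} directly inside the sh -c script pastes each filename into shell code, which breaks on names containing spaces or quotes. A safer sketch passes the name as a positional parameter instead; note that ${1%.csv} renames in place, while basename would also strip any directory part:
find ./ -type f -iname '*.csv' -exec sh -c '
  # $1 is each filename found by find; "--" guards against names starting with "-"
  mv -- "$1" "${1%.csv}"
' sh {} \;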

First, take care that you have no subdirectories; find, without extra arguments, will automatically recurse into any directory below.
Simple approach: if you have a small enough number of files, just use the glob (*) operator and take advantage of rename (the Perl-based rename):
$ rename 's/\.csv$//' *.csv
If you have too many files, use find, and perhaps xargs:
$ find . -maxdepth 1 -type f -name "*.csv" | xargs rename 's/\.csv$//'
If you want to be really safe, tell find and xargs to delimit with null bytes, so that weird filenames (e.g., with spaces or newlines) don't mess up the process:
$ find . -maxdepth 1 -type f -name "*.csv" -print0 | xargs -0 rename 's/\.csv$//'
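If rename is not installed, a plain shell loop over the same glob does the job with parameter expansion alone; a sketch for the current directory only:
for f in *.csv; do
  [ -e "$f" ] || continue   # skip the literal pattern when nothing matches
  mv -- "$f" "${f%.csv}"    # strip the trailing .csv
done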

Related

Find file and cd into it

I am attempting to find multiple files, quit after the first match, and then cd into that match. I have attempted:
find `pwd` -iname 'tensorflow' -type d -exec echo {} \; -quit | xargs -I{} cd {}
However, this does nothing and it won't enter into that directory.
cd is a shell builtin, not an external executable (there is no useful /usr/bin/cd). You have to run it in your current shell, not in a subshell spawned as part of a pipeline.
Do not use backticks; prefer $(...).
find `pwd`? Just find .; you are already in pwd.
-exec echo {} \;? Just -print it (printing is the default action anyway).
dir=$(find . -iname 'tensorflow' -type d -print -quit)
cd "$dir"

Not able to replace all filenames in directory containing a substring with another substring

FROM=$1
TO=$2
find . -name '*'$FROM'*' -type f -exec bash -c 'mv "$1" "${1/$FROM/$TO}"' -- {} \;
The find . -name '*'$FROM'*' -type f part finds the files correctly, but the mv part doesn't seem to work.
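No answer is quoted here, but a likely cause is that $FROM and $TO appear inside the single-quoted bash -c script, where the inner bash has no such variables. One sketch of a fix is to pass them in as extra positional parameters:
FROM=$1
TO=$2
find . -name "*$FROM*" -type f -exec bash -c '
  # $1 = filename, $2 = FROM, $3 = TO; quoting the pattern makes it literal
  mv -- "$1" "${1/"$2"/"$3"}"
' -- {} "$FROM" "$TO" \;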

Cannot shell out to find command

In Ruby, I want to shell out to the following find command:
find . -type f -name '*.c' -exec mv {} . \;
I have tried many permutations of this command:
system("find . -type f -name '*.c' -exec mv {} . \;")
`find . -type f -name '*.c' -exec mv {} . \;`
%x(find . -type f -name '*.c' -exec mv {} . \;)
But when I run the command, find generates the error message:
find: -exec: no terminating ";" or "+"
I don't think the issue is characters which need to be escaped. This is probably a really simple fix, but any help would be greatly appreciated!
You need, as #mudasobwa indicated, to actually pass the backslash to the find command. If you try your string in irb, you see immediately what's going wrong:
>> "find . -type f -name '*.c' -exec mv {} . \;"
=> "find . -type f -name '*.c' -exec mv {} . ;"
However, for actually running your find command, you need to decide whether system or %x() is the right tool to use. If you want to process the stdout of the command, you have to use %x, and in this case you have to escape the backslash, because the string is expanded as if it were between double quotes:
find_stdout = %x(find . -type f -name '*.c' -exec mv {} . \\;)
If you are not interested in the output, but only in the overall success (exit code, ...) of the command, you should use system, and in this case you can use a single-quoted string, which lets you avoid escaping the backslash:
result = system('find . -type f -name "*.c" -exec mv {} . \;')
Of course, escaping here is not wrong either, and some people recommend always escaping the backslash for consistency and maintainability.

Why doesn't find let me match multiple patterns?

I'm writing some bash/zsh scripts that process some files. I want to execute a command for each file of a certain type, and some of these commands overlap. When I try to find -name 'pattern1' -or -name 'pattern2', only the last pattern is used (files matching pattern1 aren't returned; only files matching pattern2). What I want is for files matching either pattern1 or pattern2 to be matched.
For example, when I try the following this is what I get (notice only ./foo.xml is found and printed):
$ ls -a
. .. bar.html foo.xml
$ tree .
.
├── bar.html
└── foo.xml
0 directories, 2 files
$ find . -name '*.html' -or -name '*.xml' -exec echo {} \;
./foo.xml
$ type find
find is an alias for noglob find
find is /usr/bin/find
Using -o instead of -or gives the same results. If I switch the order of the -name parameters, then only bar.html is returned and not foo.xml.
Why aren't bar.html and foo.xml found and returned? How can I match multiple patterns?
You need to use parentheses in your find command to group the two -name tests; without them, the implicit -and between the second -name and -exec binds more tightly than -or, so the -exec action only applies to files matching the second pattern.
find . \( -name '*.html' -or -name '*.xml' \) -exec echo {} \;
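To make the precedence visible: without the parentheses, your command is parsed as if you had written the implicit -and explicitly, like this (only the second -name test is tied to -exec):
find . -name '*.html' -or \( -name '*.xml' -and -exec echo {} \; \)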
find utility
-print == default
If you just want to print the file paths and names, you can drop -exec echo, because -print is the default action:
find . -name '*.html' -or -name '*.xml'
Order dependency
Otherwise, a find expression is read from left to right, so argument order is important!
So if you want to combine tests and actions, respect -and and -or precedence (the implicit -and binds more tightly than -or):
find . -name '*.html' -exec echo ">"{} \; -o -name '*.xml' -exec echo "+"{} \;
or
find . -maxdepth 4 \( -name '*.html' -o -name '*.xml' \) -exec echo {} \;
The -print0 expression and the xargs command
But for most cases, you could consider -print0 with the xargs command, like:
find . \( -name '*.html' -o -name '*.xml' \) -print0 |
xargs -0 printf -- "-- %s -\n"
The advantage of doing this is:
Only one fork (or a few) for thousands of entries found. (Using -exec echo {} \; means one subprocess is run for each entry found, while xargs builds a command line with as many arguments as one invocation can hold.)
In order to work with filenames containing special characters or whitespace, -print0 and xargs -0 use the NUL character as the filename delimiter.
find ... -exec ... {} ... +
For some years now, the find command has accepted an additional syntax for the -exec action.
Instead of \;, the -exec action can be terminated with a plus sign (+).
find . \( -name '*.html' -o -name '*.xml' \) -exec printf -- "-- %s -\n" {} +
With this syntax, find works like the xargs command, building long command lines to reduce the number of forks.
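If you want to convince yourself of the difference in forks, here is a rough sketch (assuming a POSIX sh; $$ prints the PID of each sh that find spawns):
# one sh per matching file (the PID changes every time)
find . -name '*.xml' -exec sh -c 'echo "pid $$ handled: $1"' sh {} \;
# one sh per large batch of files (same PID, many arguments)
find . -name '*.xml' -exec sh -c 'echo "pid $$ handled $# files"' sh {} +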

I am getting an error "arg list too long" in unix

I am using the following command and getting an error "arg list too long". Help needed.
find ./* \
-prune \
-name "*.dat" \
-type f \
-cmin +60 \
-exec basename {} \;
Here is the fix, pruning subdirectories so only the current directory is searched:
find . ! -name . -prune -name "*.dat" -type f -cmin +60 | xargs -I {} basename {}
To only find files in the current directory, use -maxdepth 1.
find . -maxdepth 1 -name '*.dat' -type f -cmin +60 -exec basename {} \;
On all *nix systems the shell has a maximum length for the argument list that can be passed to a command. This is measured after the shell has expanded filenames given as arguments on the command line.
The syntax of find is find location_to_find_from arguments..., so when you run this command the shell expands your ./* into a list of every file in the current directory, turning your command line into find file1 file2 file3 and so on. This is probably not what you want, since find is recursive anyway. I expect that you are running this command in a large directory and blowing past your command-length limit.
Try running the command as follows
find . -name "*.dat" -type f -cmin +60 -exec basename {} \;
This will prevent the filename expansion that is probably causing your issue.
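If you want to see the limit you are bumping into, getconf reports it, and since echo is a shell builtin it can safely show how large the ./* expansion would have been; a quick sketch:
# maximum bytes allowed for the argument list plus environment of an exec'd command
getconf ARG_MAX
# rough size of what ./* would have expanded to
echo ./* | wc -c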
Without find, and only checking the current directory
now=$(date +%s)
for file in *.dat; do
if (( $now - $(stat -c %Y "$file") > 3600 )); then
echo "$file"
fi
done
This works on my GNU system. You may need to alter the date and stat formats for different OS's
If you want to show only .dat filenames anywhere in the ./ tree, execute it without the -prune option and just give the path:
find ./ -name "*.dat" -type f -cmin +60 -exec basename {} \;
To find all the .dat files which are older than 60 minutes in the present directory only, do as follows:
find . -iregex "./[^/]+\.dat" -type f -cmin +60 -exec basename {} \;
And if you have a stripped-down (for example AIX) version of the find tool, do as follows:
find . -name "*.dat" -type f -cmin +60 | grep "^\./[^/]*\.dat$" | sed "s/^\.\///"
