To execute 'find' with some variables from a txt file I made this, but it doesn't work.
Is something wrong with the -exec statement?
#!/bin/bash
while read line;
do
echo tmp_name: $line
for ST in 'service.getFile("'$line;
do
find ./compact/ -type f -exec grep -l $ST {} \;
done
done < tmpNameList.txt
Try quoting $ST in your find command.
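For reference, here's a minimal corrected sketch of that loop (same structure as the question's script; the inner for isn't needed, a plain assignment does, and the pattern is quoted when passed to grep):
#!/bin/bash
while read -r line; do
    echo tmp_name: "$line"
    # Build the search string, then quote it so the ( and " survive word splitting.
    ST='service.getFile("'$line
    find ./compact/ -type f -exec grep -l "$ST" {} \;
done < tmpNameList.txt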
What's more:
since you operate from the current directory, the leading ./ is not necessary;
your pattern is meant literally (the . in service.getFile would otherwise match any character, and the ( is a metacharacter in extended regexes), so use fgrep instead (or grep -F), which matches fixed strings. I.e.:
find compact/ -type f -exec fgrep -l "$ST" {} \;
grep can read multiple patterns from a file (-f option):
find ./compact/ -type f -exec grep -f patterns.txt {} +
where patterns.txt is tmpNameList.txt with 'service.getFile(' prepended to each line:
sed 's/^/service.getFile(/' tmpNameList.txt >patterns.txt
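Putting the two steps together, a sketch (with -F added so the ( and . are matched literally, and -l to list the matching files):
sed 's/^/service.getFile(/' tmpNameList.txt > patterns.txt
find ./compact/ -type f -exec grep -F -l -f patterns.txt {} +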
Related
for i in *.txt;
do
xxd -l 3 $i >> log
done
I also want to log the file name $i with each result, e.g.:
file_name
result_of_command
You probably just need to use printf:
for f in *.txt; do
printf "%s: %s\n" "$f" "$(xxd -l 3 "$f")"
done >> log
I'm not totally clear what you are asking, but is this what you want?
for i in *.txt;
do
echo "$i" >> log
xxd -l 3 "$i" >> log
done
It's better to use find with the -exec option to run a command for every file matching certain criteria.
If you want all files in the current directory matching *.txt, use find with -maxdepth 1. The -exec option runs a command for each file: {} is replaced by the name of the file, and \; (an escaped ;) terminates the command. You can use + instead of \; to tell find to replace {} with as many filenames as possible per invocation.
find . -maxdepth 1 -type f -name '*.txt' -exec xxd -l 3 {} \; >> log
Note that the above example includes hidden files; you can exclude them using a regex:
find . -maxdepth 1 -type f \( ! -regex '.*/\..*' \) -name '*.txt' -exec xxd -l 3 {} \; >> log
Also, if you're going to glob files in the current directory and pass them to commands, always use ./*: paths beginning with - are likely to be interpreted by your command as options.
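For instance, given a hypothetical file named -b.txt, the ./ prefix keeps the name from looking like an option:
for f in ./*.txt; do
    xxd -l 3 "$f" >> log    # "./-b.txt" can't be mistaken for an option
done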
Hi, after I find the files and enclose their names in double quotes with the following command:
FILES=$(find . -type f -not -path "./.git/*" -exec echo -n '"{}" ' \; | tr '\n' ' ')
I do a for loop to grep for a certain word inside each file that find matched:
for f in $FILES; do grep -Eq '(GNU)' $f; done
but grep complains, for each entry, that it cannot find the file or directory:
grep: "./test/test.c": No such file or directory
whereas echo $FILES produces:
"./.DS_Store" "./.gitignore" "./add_license.sh" "./ads.add_lcs.log" "./lcs_gplv2" "./lcs_mit" "./LICENSE" "./new test/test.js" "./README.md" "./sxs.add_lcs.log" "./test/test.c" "./test/test.h" "./test/test.js" "./test/test.m" "./test/test.py" "./test/test.pyc"
EDIT
found the answer below. Works perfectly!
The issue is that your array contains filenames surrounded by literal " quotes.
But worse, find's -exec cmd {} \; executes cmd separately for each file, which can be inefficient. As mentioned by @TomFenech in the comments, you can use -exec cmd {} + to search as many files within a single cmd invocation as possible.
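For example, a sketch of that form applied to this search (grep -l lists the files that match):
find . -type f -not -path './.git/*' -exec grep -l 'GNU' {} +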
A better approach for recursive search is usually to let find output the filenames and pipe its results to xargs, so that grep runs on as many filenames together as possible. Use -print0 and -0 respectively to correctly support filenames with spaces and other separators: the results are split on a null character instead, so you don't need any quoting, which removes a whole class of bugs.
Something like this:
find . -type f -not -path './.git/*' -print0 | xargs -0 egrep '(GNU)'
However in your question you had grep -q in a loop, so I suspect you may be looking for an error status (found/not found) for each file? If so, you could use -l instead of -q to make grep list matching filenames, and then pipe/send that output to where you need the results.
find . -print0 | xargs -0 egrep -l pattern > matching_filenames
Also note that grep -E (or egrep) uses extended regular expressions, which means parentheses create a regex group. If you want to search for files containing (GNU) (with the parentheses) use grep -F or fgrep instead, which treats the pattern as a string literal.
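A sketch of the fixed-string variant:
find . -type f -not -path './.git/*' -print0 | xargs -0 grep -Fl '(GNU)'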
I'm in zsh.
I'd like to do something like:
find . -iname *.md | xargs cat && echo "---" > all_slides_with_separators_in_between.md
Of course this cats all the slides, then appends a single "---" at the end instead of after each slide.
Is there an xargs way of doing this? Can I replace cat && echo "---" with some inline function or do block?
Very strangely, when I create a file cat---.sh with the contents
cat $1
echo ---
and run
find . -iname *.md | xargs ./cat---.sh
it only executes for the first result of find.
Replace cat---.sh with cat and it runs on both files.
There's no need to use xargs at all here. The following is a properly paranoid approach (robust against files with spaces, files with newlines, files with literal backslashes in their names, etc.):
while IFS= read -r -d '' filename; do
printf '---\n'
cat -- "$filename"
done < <(find . -iname '*.md' -print0) >all_slides_with_separators.md
However -- you don't even need that either: find can do all the work itself, both printing the separator and calling cat!
find . -iname '*.md' -printf '---\n' -exec cat -- '{}' ';' >all_slides_with_separators.md
A common usage pattern is xargs sh -c 'command; another' _ where the entire shell script in the quotes will have access to the command-line arguments. The underscore is because the first argument to sh -c will be assigned to $0 (where you'd often see e.g. -sh in a ps listing).
find . -iname '*.md' |
xargs sh -c 'for x; do
cat "$x" && echo "---"
done' _ > all_slides_with_separators_in_between.md
As noted in the comments, you should probably investigate find -print0 and the corresponding xargs -0 option in GNU find (and maybe install it if you don't have it).
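For example, a sketch of the same pattern with null-delimited names (GNU find and xargs assumed):
find . -iname '*.md' -print0 |
xargs -0 sh -c 'for x; do
    cat "$x" && echo "---"
done' _ > all_slides_with_separators_in_between.md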
You can do something like this, but it can be insecure in some cases (see comments):
find . -iname '*.md' | xargs -I % sh -c '{ cat %; echo "----"; }' > output.txt
You'll rarely need find in zsh; its globbing facilities cover nearly every use case of find.
for f in (#i)**/*.md; do
cat $f
print -- "---"
done > all_slides.md
This looks in the current directory hierarchy for every file that matches *.md in a case-insensitive manner.
For even more efficiency, replace cat $f with < $f; zsh itself will read the file and write its contents to standard output.
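That is, a sketch of the same loop without the cat processes (assuming the same setup as above):
for f in (#i)**/*.md; do
    < $f           # no command: zsh feeds the file through its READNULLCMD
    print -- "---"
done > all_slides.md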
Using GNU Parallel it looks like this:
parallel cat {}\; print -- --- ::: **/*.md
I want to find all files within the current directory that contain a given string, then print just the 4th line of each file.
grep --null -l "$yourstring" * | # List all the files containing your string
xargs -0 -n 1 sed -n '4p;q' # Print the fourth line of each file; -n 1 runs one sed per file, so q doesn't cut the list short.
Different editions of grep have slightly different incantations of --null, but it's usually there in some form. Read your manpage for details.
Update: I believe one of the null file list incantations of grep is a reasonable solution that will cover the vast majority of real-world use cases. But to be entirely portable: if your version of grep does not support any null-output option, it is not safe to combine it with xargs, so you must resort to find.
find . -maxdepth 1 -type f -exec grep -q "$yourstring" {} \; -exec sed -n '4p;q' {} \;
Because find arguments can almost all be used as predicates, the -exec grep -q… part filters the files that are eventually fed to sed down to only those that contain the required string.
From another user:
grep -Frl string . | xargs -n 1 sed -n 4p
Give the GNU find command below a try:
find . -maxdepth 1 -type f -exec grep -l 'yourstring' {} \; | xargs -I {} awk 'NR==4{print; exit}' {}
It finds all the files in the current directory that contain the specific string, and prints line 4 of each such file.
This while loop should work:
while read -d '' -r file; do
echo -n "$file: "
sed '4q;d' "$file"
done < <(grep --null -l "some-text" *.txt)
I have a lot of directories that end with "_" and 6 digits, e.g.:
diff_gb_and_pf_2voids_158543
I would like to find all those folders in the current folder, and rename them by deleting the "_" and the 6 digits at the end.
So far I'm stuck with this command:
find . -type d -print |grep '.*[0-9]\{6\}$' |xargs -I {} bash -c 'for i in {}; do mv "$i" ????; done;'
I can't figure out how to do the last step. I would try calling sed, but how?
Also, if there is a nicer way, please tell.
Thanks
Here's one way using your shell:
for i in $(find . -mindepth 1 -type d -regextype posix-extended -regex '.*_[0-9]{6}'); do
mv "$i" "${i%_*}";
done
Here you go:
find /path -regex '.*_[0-9]\{6\}' -exec sh -c 'n="{}"; echo mv "{}" "${n%_*}"' \;
Check the output, if it looks good then drop the echo in there.
Explanation: for each matched file, we run a sub-shell, where we assign the filename to variable n, so that we can use pattern substitution ${n%_*}, which cuts off the last _ character and everything after it until the end of the filename.
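For instance, with the example name from the question:
n='diff_gb_and_pf_2voids_158543'
echo "${n%_*}"    # prints: diff_gb_and_pf_2voids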
Or here's a more portable way that should work in older systems too:
find /path -name '*_[0-9][0-9][0-9][0-9][0-9][0-9]' | sed -ne 's/\(.*\)_[0-9]\{6\}$/mv "&" "\1"/p'
Check the output; if it looks good, then pipe it to sh (append this: | sh).
Explanation:
The sed command receives the list of files to rename
In the pattern we capture the first part of the filename within \( ... \)
We replace the pattern with the text mv "&" "\1", where & is substituted with the pattern that was matched, in this case the entire original filename, and \1 is substituted with the part we captured within \( ... \)
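For the directory name from the question, the emitted line would look like:
mv "./diff_gb_and_pf_2voids_158543" "./diff_gb_and_pf_2voids"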
#!/usr/bin/env bash
set -x
# Collect the candidate directory names (assumes none contain newlines).
ls -d diff* > dirlist
# Split each name on "_". This assumes exactly five underscores per name,
# so field6 ends up holding the trailing 6-digit suffix, which is dropped.
while IFS='_' read -r field1 field2 field3 field4 field5 field6
do
mv -v "${field1}_${field2}_${field3}_${field4}_${field5}_${field6}" \
"${field1}_${field2}_${field3}_${field4}_${field5}"
done < dirlist