Consider a directory containing (only) the 3 files obtained by:
echo "foobar" > test1.txt
echo "\$foobar" > test2.txt
echo "\$\$foobar" > test3.txt
(and thus containing respectively foobar, $foobar, $$foobar).
The grep instruction:
grep -l -r --include "*.txt" "\\$\\$" .
lists the files (here, the single matching file) containing double dollars:
$ grep -l -r --include "*.txt" "\\$\\$" .
./test3.txt
So far, so good. Now, this instruction fails within a makefile, e.g.:
doubledollars:
	echo "Here are files containing double dollars:";\
	grep -l -r --include "*.txt" "\\$\\$" . ;\
	printf "\n";\
leads to the errors:
$ make doubledollars
echo "Here are files containing double dollars:";\
grep -l -r --include "*.txt" "\\\ . ;\
printf "\n";\
/bin/sh: -c: line 2: unexpected EOF while looking for matching `"'
/bin/sh: -c: line 3: syntax error: unexpected end of file
makefile:2: recipe for target 'doubledollars' failed
make: *** [doubledollars] Error 1
Hence my question: how to escape double dollars in a makefile?
Edit: note that this question does not involve Perl.
With the following Makefile:
a:
	echo '$$$$'
running make a gives:
$$
... and it's better to use single quotes when you do not need variable expansion:
grep -l -r -F --include '*.txt' '$$$$' .
(with -F the shell must see a literal $$, so the makefile needs four dollars) -- unless you need the script to run on Windows in a MinGW environment, of course.
I'm trying to remove all .js and .js.map files from any sub-directory of src called __tests__.
$ find . -path './src/**' -name __tests__ | # find subdirectories
> sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | # for each subdirectory, concat *.js and *.js.map
> xargs rm # remove files
This fails with the following errors:
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
However, if I change my xargs rm to xargs echo rm, copy and paste the output, and run it, it works.
$ find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' |
> xargs echo rm # echo command to remove files
rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
$ rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
Wrapping the output of my echo in $(...) and prepending rm results in the same error as before.
$ rm $(find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g')
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
What am I doing wrong?
I doubt it matters, but I'm using GitBash on Windows.
First, to explain the issue: In find | sed | xargs rm, the shell only sets up communication between those programs, but it doesn't actually process the results in any way. That's a problem here because *.js needs to be expanded by a shell to replace it with a list of filenames; rm treats every argument it's given as a literal name. (This is unlike Windows, where programs do their own command-line parsing and glob expansion).
Arguably, you don't need find here at all. Consider:
shopt -s globstar # enable ** as a recursion operator
rm ./src/**/__tests__/*.js{,.map} # delete *.js and *.js.map in any __tests__ directory under src
...or, if you do want to use find, let it do the work of coming up with a list of individual files matching *.js, instead of leaving that work to happen later:
find src -regextype posix-egrep -regex '.*/__tests__/[^/]*[.]js([.]map)?' -delete
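A quick sanity check of that find expression against a scratch replica of the tree (file names invented; -regextype is GNU find, which is what GitBash ships):

```shell
# Build a tiny copy of the layout and verify only __tests__ JS artifacts go.
dir=$(mktemp -d) && cd "$dir"
mkdir -p src/game/__tests__ src/helpers/__tests__
touch src/game/__tests__/a.js src/game/__tests__/a.js.map
touch src/helpers/__tests__/b.js src/game/keep.js
find src -regextype posix-egrep -regex '.*/__tests__/[^/]*[.]js([.]map)?' -delete
ls src/game/__tests__   # empty: a.js and a.js.map were deleted
ls src/game/keep.js     # still present: not under __tests__
```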
You need to have your globs (*) expanded. File name expansion is performed by the shell on UNIX, not by rm or other programs. Try:
.... | xargs -d $'\n' sh -c 'IFS=; for f; do rm -- $f; done' sh
...to explain this:
The -d $'\n' ensures that xargs splits only on newlines (not spaces!), and also stops it from treating backslashes and quotes as special.
sh -c '...' sh runs ... as a script, with sh as $0, and subsequent arguments in $1, etc; for f; will thus iterate over those arguments.
Clearing IFS with IFS= prevents string-splitting from happening when $f is used unquoted, so only glob expansion happens.
Using the -- argument to rm ensures that it treats subsequent arguments as filenames, not options, even if they start with dashes.
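To see the mechanism in isolation, here is a small demo (my own scratch paths; note it assumes each pattern arrives on its own line, which is what the newline splitting expects, and GNU xargs for -d):

```shell
dir=$(mktemp -d) && cd "$dir"
mkdir -p src/x/__tests__
touch src/x/__tests__/a.js src/x/__tests__/a.js.map
# The child shell -- not rm -- expands each *.js pattern into real file names.
printf '%s\n' './src/x/__tests__/*.js' './src/x/__tests__/*.js.map' |
  xargs -d '\n' sh -c 'IFS=; for f; do rm -- $f; done' sh
ls src/x/__tests__   # empty: both patterns matched and the files were removed
```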
That said, if you really have a lot of files for each pattern, you might run into "argument list too long" errors even though you are using xargs.
Another caveat is that filenames containing newlines can potentially be split into multiple names (depending on the details of the version of find you're using). A way to solve this that will work with all POSIX-compliant versions of find might be:
find ./src -type d -name __tests__ -exec sh -c '
    for d; do
        rm -- "$d"/*.js "$d"/*.js.map
    done
' sh {} +
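The same idea, exercised against a scratch tree (paths invented; the globs are spelled out rather than brace-expanded, since plain sh has no brace expansion):

```shell
dir=$(mktemp -d) && cd "$dir"
mkdir -p src/game/__tests__
touch src/game/__tests__/a.js src/game/__tests__/a.js.map src/game/app.js
# One sh per batch of directories; each glob expands inside that shell.
find ./src -type d -name __tests__ -exec sh -c '
    for d; do
        rm -- "$d"/*.js "$d"/*.js.map
    done
' sh {} +
ls src/game   # app.js survives; the __tests__ files are gone
```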
Within a bash script, I'm trying to pull all files with an extension '.jstd' into an array, loop over that array and carry out some action.
My script is failing to copy the path of each script into the array.
I have the following script.
#!/bin/bash
IFS=$'\n'
file_list=($(find '/var/www' -type f -name "*.jstd"))
for i in "${file_list[@]}"; do
echo "$i"
done
echo $file_list
unset IFS
The line file_list=($(find '/var/www' -type f -name "*.jstd")) works fine in the terminal, but fails in the script with:
Syntax error: "(" unexpected
I've googled, but failed. All ideas gratefully received.
edit: In case it helps in reproduction or clues, I'm running Ubuntu 12.04, with GNU bash, version 4.2.25(1)-release (i686-pc-linux-gnu)
This is precisely the error you would get if your shell were /bin/sh on Ubuntu, not bash:
$ dash -c 'foo=( bar )'
dash: 1: Syntax error: "(" unexpected
If you're running your script with sh yourscript -- don't. You must invoke bash scripts with bash.
That being given, though -- the better way to read a file list from find would be:
file_list=( )
while IFS= read -r -d '' filename; do
file_list+=( "$filename" )
done < <(find '/var/www' -type f -name "*.jstd" -print0)
...the above approach working correctly with filenames containing spaces, newlines, glob characters, and other corner cases.
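A short check that the loop copes with awkward names (the directory and file names here are invented for the demo; this needs bash for the arrays and process substitution):

```shell
#!/bin/bash
dir=$(mktemp -d)
mkdir -p "$dir/a dir"
touch "$dir/a dir/one.jstd" "$dir/two.jstd"
file_list=( )
while IFS= read -r -d '' filename; do
  file_list+=( "$filename" )
done < <(find "$dir" -type f -name "*.jstd" -print0)
echo "${#file_list[@]}"   # 2 -- the space in "a dir" did not split anything
```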
I have a folder called /input/temp. Inside the folder I have a lot of files. I need to find files matching the pattern Article_????_test_?????????.txt and rename them to the format below:
Article_????_?????????.txt
Below is the code I tried and which doesn't work:
echo "Please provide the file name Corresponding to DC..."
read file
ls $HOME/*.txt | grep $file
if [ $? -eq 0 ]
find . $file '*_Test_*' -exec bash -c 'mv $0 ${0/_Test/ }' {} \;
if [ $? -eq 0 ]
find . $file -name "*.txt" -exec bash -c "mv {} \`echo {} | sed -e 's/[_TEST_]/_/g'\`" \;
then
I got the error below:
find: 0652-083 Cannot execute bash:: A file or directory in the path name does not exist.
find: 0652-083 Cannot execute bash:: A file or directory in the path name does not exist.
bash can't be executed on my platform.
Unless the file name is a regular expression, you can use if [ -e "$file" ] instead of ls + grep + test.
When you do find . $file '*_Test_*' the last three parameters are actually taken as files or directories to search underneath! It will return
all files in the current directory,
all files in the directory $file or the path $file if it's not a directory, and
all files in any directories matching *_Test_* or their paths if they are not directories.
There's no need for bash - you can run mv directly in -exec. This is just extra complexity for no gain.
Use $(command) instead of backticks for much easier quote handling. Each $() has a separate quoting context, so you can do for example echo "$(command "$(c2 "argument with spaces")")".
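Putting those points together, a sketch that does the rename with plain sh straight from -exec (the pattern follows the question; this is untested on AIX, where the original error came from):

```shell
dir=$(mktemp -d) && cd "$dir"
touch Article_1234_test_123456789.txt
# mv runs directly from -exec; sh builds the new name, no bash required.
find . -name 'Article_????_test_?????????.txt' -exec sh -c '
    for f; do
        mv -- "$f" "$(printf %s "$f" | sed "s/_test_/_/")"
    done
' sh {} +
ls   # Article_1234_123456789.txt
```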
According to the link here:
This should work
ls -1 Article_????_test_?????????.txt | awk '{old=$0; gsub(/_test_/,"_",$0); system("mv \"" old "\" \"" $0 "\"")}'
Also try the 'rename' command.
rename 's/_test_/_/' *.txt
You can fine tune the regular expression...
Update from your code:
cd $HOME
find . -name '*_test_*' | while read line
do
    echo mv "${line}" "${line/_test/}"
done
If you need to search for the pattern Article_????_test_?????????.txt, try this:
cd $HOME
find . -name 'Article_????_test_?????????.txt' | while read line
do
    echo mv "${line}" "${line/_test/}"
done
My directory has the following structure (on Mac)
main_folder/
|_folder1/
|_folder2/
|_folder3/
Each subfolder has a file with the identical name "classFile.trans".
I want to traverse the subfolders and run grep on classFile.trans, but I don't know how to save the resulting new file in the corresponding subfolder. Thanks.
#!/bin/bash
for file in ./main_folder/*/classFile.trans; do
    grep -v "keyword" $file > newClassFile.trans # how do I save the new file in the corresponding subfolder?
done
Probably easiest to run the grep in each subdirectory:
#!/bin/bash
for d in ./main_folder/*; do
( cd "$d" || exit
  file=classFile.trans
  test -f "$file" && grep -v "keyword" "$file" > newClassFile.trans
)
done
The parentheses cause the body of the loop to run in a new subshell, so the working directory of the main shell is not changed. However, this makes error messages from grep fairly useless: if one of the classFile.trans files is not readable, for example, the message will not say which directory it was in. So it is probably better to do:
for d in ./main_folder/*; do grep -v keyword "$d"/classFile.trans > "$d"/newClassFile.trans; done
I would use
#!/bin/bash
for file in `find ./main_folder/ -name "classFile.trans"`; do
    newFile=`dirname "$file"`/newClassFile.trans
    grep -v "keyword" "$file" > "$newFile"
done
The find command has -exec* flags that allow it to run commands for each file matched. For your case you would do:
find ./main_folder -name classFile.trans \
-execdir $SHELL -c "grep -v '$keyword' {} >newClassFile.trans" \;
(\ and linebreak added so the whole command can be read without scrolling)
Breaking the various arguments down:
-name classFile.trans searches for all files named classFile.trans
-execdir runs everything up to the ; character in the directory that contains the matched file
$SHELL -c runs your $SHELL (e.g., /bin/bash) with the -c argument which immediately executes its respective value instead of creating an interactive shell that you can type in
"grep -v '$keyword' {} >newClassFile.trans" runs your grep and output redirection in the file's respective directory thanks to -execdir; note that find turns {} in to the matched file's name
This is necessary so the > redirection runs in the sub-command, not the "current" shell the find command is being run in
\; escapes the ; character so it can be sent to find instead of acting as a command terminator
A test:
# Set up the folders and test files
$ mkdir -p main_folder/{f1,f2,f3}
$ for i in main_folder/f?; do printf 'a\nb\nc\n' >$i/classFile.trans; done
# Contents of one of the test files
$ cat main_folder/f1/classFile.trans
a
b
c
# Set the keyword to the letter 'b'
$ keyword=b
$ find ./main_folder -name classFile.trans -execdir $SHELL -c "grep -v '$keyword' {} >newClassFile.trans" \;
# newClassFile.trans was created sans 'b'
$ cat main_folder/f1/newClassFile.trans
a
c
I am writing a bash script that, when run from directory B, mirrors the directory structure of directory A within directory B.
Currently, I am doing so as follows:
#!/bin/bash
dirify () {
echo $1
}
export -f dirify
find "../test" -type d -exec bash -c "dirify '{}'" \;
I am running this script from directory B, and ../test is directory A.
Fortunately, the directory I am using to test contains folders with ' in the name. When I run this script, bash gives the following error when it reaches those directories:
> bash: -c: line 0: unexpected EOF while looking for matching `''
> bash: -c: line 1: syntax error: unexpected end of file
(note that line 0 and line 1 refer to the lines within the dirify() function)
A more simplified way of illustrating this issue is as follows:
find "../test" -exec bash -c "echo '{}'" \;
This example produces the same errors.
Anyway, this is an issue because in production, I can't assume that file paths will not contain the ' character.
Is there any way around this issue?
Pass it as an argument:
find "../test" -type d -exec bash -c 'dirify "$1"' dirify {} \;
The path arrives in $1 instead of being pasted into the script text, so a ' in a directory name can no longer break the quoting of the -c script.
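A quick check with a directory name that actually contains a single quote (the scratch path is invented for the demo):

```shell
#!/bin/bash
dir=$(mktemp -d)
mkdir -p "$dir/it's here"
dirify () { echo "$1"; }
export -f dirify   # the bash spawned by find inherits the function
# The path travels as $1, so the quote never touches bash's parser.
find "$dir" -type d -exec bash -c 'dirify "$1"' dirify {} \;
```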