Why doesn't the command to remove a BOM work? - macOS

I found a command to find files at all levels of my folders and directories that are UTF-8 with BOM, and then remove the BOM. But it doesn't seem to work on my computer (macOS)... Should I install Moodle on my machine first in order to run it on my command line?
Below is the command:
find . -type f -exec sed 's/^\xEF\xBB\xBF//' -i.bak {} \; -exec rm {}.bak \;
The result I got is sed: -i.bak: No such file or directory, followed by all the content of the files printed to the terminal, which seems very weird.
Thank you for your help!

The command you found is for GNU sed, which accepts options even after the script argument. Worse, OS X's (BSD) sed doesn't support \xNN escape sequences for non-ASCII bytes.
Instead, for OS X use the following answer, which uses Perl: https://stackoverflow.com/a/9101056/1554386
Tying it into find is as follows:
find . -type f -exec perl -e 's/\xef\xbb\xbf//;' -pi.bak {} \;
You can add the -exec rm {}.bak \; from your command if you wish, but you can just as easily do that separately

Related

Delete the contents of all files in BASH recursively? [duplicate]

I would like to clear the content of many log files of a given directory recursively, without deleting every file. Is that possible with a simple command?
I know that I can do > logs/logfile.log one by one, but there are lots of logs in that folder, and that is not straightforward.
I am using macOS Sierra by the way.
Thanks to @chepner for showing me the better way to protect against double quotes in the file names:
You can use find to do it
find start_dir -type f -exec sh -c '> "$1"' _ {} \;
And you could add extra restrictions if you don't want all files, like if you want only files ending in .log you could do
find start_dir -type f -name '*.log' -exec sh -c '> "$1"' _ {} \;
As macOS includes Perl anyway:
perl -e 'for(<logs/*log>){truncate $_,0}'
Or, more succinctly, if you use homebrew and you have installed GNU Parallel (which is just a Perl script), you can do:
parallel '>' ::: logs/*log
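As a quick check of the find form above (a sketch using a temporary directory with made-up log names), the files are emptied in place but not deleted:

```shell
#!/bin/sh
# Sketch: populate a directory with .log files, empty them recursively,
# and confirm the files still exist with zero size.
set -e
logs=$(mktemp -d)
mkdir -p "$logs/sub"
echo "old data" > "$logs/a.log"
echo "old data" > "$logs/sub/b.log"

# Truncate every .log file; the `_ {}` idiom passes each path as "$1",
# which is safe even for names containing quotes or spaces.
find "$logs" -type f -name '*.log' -exec sh -c '> "$1"' _ {} \;

wc -c < "$logs/a.log"   # 0
ls "$logs/sub/b.log"    # file still exists
```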


How to use the name of the file with sed in a find expression

Trying to answer Using Bash/Perl to modify files based on each file's name, I reached a point where I don't know how to use find and sed together.
Let's say there is a certain structure of files in which we want to change a line, appending the name of the file.
If it was a normal for loop we would do:
for file in dir/*
do
sed -i "s/text/text plus $file/g" $file
done
But let's say we want to use find to change files from all subdirectories. In this case, I would use...
find . -type f -exec sed -i "s/text/text plus {}/g" {} \;
but the {} inside the sed expression is not accepted, and I get the error
sed: -e expression #1, char 20: unknown option to `s'
I found some similar questions (1) but could not generalize it enough to make it understandable for me in this case.
I am sure you guys will come with a great solution for this. Thanks!
I really think the issue is that your file names contain a /, which is why sed believes the s expression has ended there. Replacing / with # as the delimiter in your sed command does the job. I tried it in bash on Linux and it works perfectly:
find . -type f -exec sed -i -e "s#text#text plus {}#g" {} \;
find would return pathnames (relative or absolute) depending upon the path you specify.
This would conflict with the delimiter you've specified, i.e. /. Change the delimiter for sed and you should be good:
find . -type f -exec sed -i "s|text|text plus {}|g" {} \;
EDIT: For removing the leading ./ from the paths, you can try:
find . -type f -exec sh -c 'f=${1#./}; sed -i "s|text|text plus ${f}|g" "$1"' _ {} \;
I'm certain that better solutions might exist ...
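A sketch tying the pieces together (hypothetical directory and contents; GNU sed's in-place -i is assumed): the sh -c wrapper receives each path as "$1", strips the leading ./, and uses | as the sed delimiter so slashes in the path are harmless:

```shell
#!/bin/sh
# Sketch: append each file's own relative path to a matching line,
# using a non-/ delimiter so paths don't break the sed expression.
set -e
d=$(mktemp -d)
mkdir -p "$d/sub"
echo "text" > "$d/sub/note.txt"

( cd "$d" && find . -type f -exec sh -c \
    'f=${1#./}; sed -i "s|text|text plus ${f}|g" "$1"' _ {} \; )

cat "$d/sub/note.txt"   # text plus sub/note.txt
```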

batch rename file extensions in subdirectories

I'm trying to create a batch script in Linux that will allow me to change the extensions of files in multiple subdirectories. After much searching and experimenting, I've found what seems to be a solution:
find /volume1/uploads -name "*.mkv" -exec rename .mkv .avi {} +
When running the script i get the following error:
find: -exec CMD must end by ';'
I've tried adding ; and \; (with or without +) but to no avail. What's wrong with the command and how can I fix it?
Edit: Running on a Synology NAS with DSM 4.2
You have to escape all characters that would be interpreted by bash. In your case these are the semicolon and the curly braces (you forgot to escape the latter in your code):
find /volume1/uploads -name "*.mkv" -exec rename .mkv .avi \{\} \;
The {} (in our case \{\}) is expanded to the filename, so the actual call would look like rename .mkv .avi /volume1/uploads/foo/bla.mkv (which is not the exact syntax /usr/bin/rename needs, at least on my system).
Instead, it would be something like:
find /volume1/uploads -name "*.mkv" -exec rename 's/\.mkv$/.avi/' \{\} \;
UPDATE
If you don't want to (or cannot) use Perl's rename script, you can use the following simple shell script, saved as /tmp/rename.sh:
#!/bin/sh
INFILE=$1
OUTFILE="${INFILE%.mkv}.avi"
echo "moving ${INFILE} to ${OUTFILE}"
mv "${INFILE}" "${OUTFILE}"
Make it executable (chmod u+x /tmp/rename.sh) and call:
find /volume1/uploads -name "*.mkv" -exec /tmp/rename.sh \{\} \;
UPDATE2
It turned out that this question is really not about bash but about BusyBox.
With a limited shell interpreter like BusyBox, the simplest solution is just to append the new file extension:
find /volume1/uploads -name "*.mkv" -exec mv \{\} \{\}.avi \;
Not sure how different the find and rename commands are on your DSM 4.2 OS, so try something like:
find /volume1/uploads -name "*.mkv" | while read filename; do
    mv -v "$filename" "$(echo "$filename" | sed -e 's/\.mkv$/\.avi/')"
done
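For a shell-only sketch that needs neither Perl's rename nor a helper script (assuming a POSIX find and sh, as BusyBox provides), the suffix can be rewritten with parameter expansion; the directory and file names below are made up for illustration:

```shell
#!/bin/sh
# Sketch: rename *.mkv to *.avi recursively using only POSIX sh.
set -e
top=$(mktemp -d)
mkdir -p "$top/sub"
: > "$top/sub/movie.mkv"

# "${1%.mkv}.avi" strips the old suffix and appends the new one.
find "$top" -type f -name '*.mkv' \
    -exec sh -c 'mv "$1" "${1%.mkv}.avi"' _ {} \;

ls "$top/sub"   # movie.avi
```

Unlike the bare mv {} {}.avi form, this replaces the extension rather than stacking a second one on top.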

Bash: any command to replace strings in text files?

I have a hierarchy of directories containing many text files. I would like to search for a particular text string every time it comes up in one of the files, and replace it with another string. For example, I may want to replace every occurrence of the string "Coke" with "Pepsi". Does anyone know how to do this? I am wondering if there is some sort of Bash command that can do this without having to load all these files in an editor, or come up with a more complex script to do it.
I found this page explaining a trick using sed, but it doesn't seem to work on files in subdirectories.
Use sed in combination with find. For instance:
find . -name "*.txt" | xargs sed -i s/Coke/Pepsi/g
or
find . -name "*.txt" -exec sed -i s/Coke/Pepsi/g {} \;
(See the man page on find for more information)
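A quick sketch (hypothetical files and strings; GNU sed's in-place -i assumed) confirming that the find variant reaches files nested in subdirectories:

```shell
#!/bin/sh
# Sketch: recursive in-place string replacement with find + GNU sed.
set -e
root=$(mktemp -d)
mkdir -p "$root/deep/deeper"
echo "I drink Coke daily" > "$root/deep/deeper/note.txt"

find "$root" -name "*.txt" -exec sed -i 's/Coke/Pepsi/g' {} \;

cat "$root/deep/deeper/note.txt"   # I drink Pepsi daily
```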
IMO, the tool with the easiest usage for this task is rpl:
rpl -R Coke Pepsi .
(-R is for recursive replacement in all subdirectories)
Combine sed with find like this:
find . -name "file.*" -exec sed -i 's/Coke/Pepsi/g' {} \;
find . -type f -exec sed -i 's/old-word/new-word/g' {} \;
I usually do it in Perl. However, watch out - it uses regexps, which are much more powerful than normal string substitution:
% perl -pi -e 's/Coke/Pepsi/g;' $filename
EDIT: I forgot about subdirectories:
% find ./ -type f -exec perl -pi -e 's/Coke/Pepsi/g;' {} \;
You want a combination of find and sed.
You may also search & replace with find & ed:
http://codesnippets.joyent.com/posts/show/2299
(which also features a test mode via a -t flag)
