remove with bash lots of files with spaces in their names - bash

Okay, my question seems to be related to this one:
Trouble in script with spaces in filename
but the answer doesn't seem to tackle the problem when you have to deal with a bunch of those files.
Let us say that I get some filenames of files that I want to erase with find, and that I put them in a text file. Then I want to delete them, with rm, say. rm seems not to be able to interpret the spaces of the filenames, even after I put the names inside quotes manually!
What would you suggest?

It depends on how you're actually processing the file list (which you haven't yet shown). Assuming you have them one per line in the file:
:: cat list
thishasnospaces
but this does
then even something like this won't work if they have spaces:
:: for fspec in $(cat list) ; do echo "rm -f \"${fspec}\"" ; done
rm -f "thishasnospaces"
rm -f "but"
rm -f "this"
rm -f "does"
That's because word splitting treats all whitespace, spaces and newlines alike, as separators. However, you can do it with a while loop:
:: cat list | while read ; do echo "rm -f \"$REPLY\""; done
rm -f "thishasnospaces"
rm -f "but this does"
You'll see that this preserves the one-name-per-line aspect. Just remove the echo once you're happy it will work.
:: cat list | while read ; do rm -f "$REPLY"; done
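For robustness it's worth using `IFS= read -r`, so that leading/trailing spaces and backslashes in names survive, and `--` so names beginning with a dash aren't mistaken for options. A sketch, using hypothetical filenames for illustration:

```shell
# Demo setup: a scratch directory with a file whose name contains a space
tmp=$(mktemp -d)
cd "$tmp"
touch -- "a b.txt" "keep.txt"
printf '%s\n' "a b.txt" > list

# IFS= keeps leading/trailing whitespace; -r stops backslash escapes;
# -- guards against names that begin with a dash.
while IFS= read -r fname; do
    rm -f -- "$fname"
done < list
```

This also avoids the useless `cat`: redirecting the file into the loop does the same job with one fewer process.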
But keep in mind this may all be unnecessary. The find command can already delete the files it finds, by use of the -delete option. If you can use that, there'll be far less messing about with spaces in the filenames.
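For example, a sketch that deletes every regular file whose name contains a space (adjust the `-name` pattern to whatever your original find criteria were):

```shell
# Demo setup: files with and without spaces in a scratch directory
tmp=$(mktemp -d)
cd "$tmp"
touch -- "with space.log" "nospace.log"

# Delete regular files whose names contain a space.
# Swap -delete for -print first to preview what would be removed.
find . -type f -name '* *' -delete
```

Since find passes each pathname to the deletion internally, spaces in names never go through the shell at all.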

Why do you need a text file?
Try something like:
touch "a b"
touch "c d"
find . -name "* *" -exec rm "{}" \;
Good luck!

Related

Passing variables to cp / mv

I can use the cp or mv command to copy/move files to a new folder manually, but it fails inside a for loop.
I've tried various ways of doing this and none seem to work. The most frustrating part is it works when run locally.
A simple version of what I'm trying to do is shown below:
#!bin/bash
#Define path variables
source_dir=/home/me/loop
destination_dir=/home/me/loop/new
#Change working dir
cd "$source_dir"
#Step through source_dir for each .txt. file
for f in *.txt
do
# If the txt file was modified within the last 300 minutes...
if [[ $(find "$f" -mmin -300) ]]
then
# Add breaks for any spaces in filenames
f="${f// /\\ }"
# Copy file to destination
cp "$source_dir/$f $destination_dir/"
fi
done
Error message is:
cp: missing destination file operand after '/home/me/loop/first\ second.txt /home/me/loop/new/'
Try 'cp --help' for more information.
However, I can manually run:
mv /home/me/loop/first\ second.txt /home/me/loop/new/
and it works fine.
I get the same error using cp and similar errors using rsync so I'm not sure what I'm doing wrong...
cp "$source_dir/$f $destination_dir/"
When you surround both arguments with double quotes you turn them into one argument with an embedded space. Quote them separately.
cp "$source_dir/$f" "$destination_dir/"
There's no need to do anything special for spaces beforehand. The quoting already ensures filenames with whitespace are handled correctly, so this line should simply be removed:
# Add breaks for any spaces in filenames
f="${f// /\\ }"
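In fact that substitution makes things worse: inside double quotes the inserted backslash is a literal character in the filename, so cp would go looking for a file that doesn't exist. A quick illustration:

```shell
f="first second.txt"
f="${f// /\\ }"
# The backslash is now a literal character in the string, so
# cp "$source_dir/$f" would look for a file that does not exist:
printf '%s\n' "$f"    # first\ second.txt
```

Backslash escaping is only meaningful when the shell parses an unquoted word; it has no role inside a quoted variable expansion.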
Let's take a step back, though. Looping over all *.txt files and then checking each one with find is overly complicated. find already loops over multiple files and does arbitrary things to those files. You can do everything in this script in a single find command.
#!/bin/bash
source_dir=/home/me/loop
destination_dir=/home/me/loop/new
find "$source_dir" -name '*.txt' -mmin -300 -exec cp -t "$destination_dir" {} +
You need to divide it into two strings, like this:
cp "$source_dir/$f" "$destination_dir/"
By passing it as one string you are telling cp that the entire line is a single parameter, when it is actually two (source and destination).
Edit: As @kamil-cuk and @Aaron state, there are better ways of doing what you are trying to do. Please read their answers.

Bash loop through variable (folder names with pattern)

I have a small bash script which goes through some directories and deletes contents within them like so:
#!/bin/bash
echo "Start job at: $(date)"
rm -rf /my/location/log_folder1/*
exit 0
unfortunately, I have these log_folder from 1 --> 10. So, log_folder1, log_folder2, log_folder3 etc..
How can I efficiently loop through this rather than write a separate line for each rm?
I do not want to use the find command to do this - I would like to learn how to loop through.
You can do it with brace expansion:
rm -rf /my/location/log_folder{1..10}/*
See man bash - Brace Expansion for details.
Brace Expansion Example:
$ echo "asd"{1..10}
asd1 asd2 asd3 asd4 asd5 asd6 asd7 asd8 asd9 asd10
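Because brace expansion happens before the command runs, you can preview exactly which arguments rm would receive by substituting echo, a handy dry run before anything destructive (the paths here are the hypothetical ones from the question):

```shell
# Dry run: echo prints the expanded argument list instead of deleting.
# Each unmatched glob stays literal, so you see one argument per folder.
echo rm -rf /my/location/log_folder{1..10}/*
```

Once the printed command looks right, drop the echo and run it for real.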
Given:
$ find . -type d
.
./log_folder1
./log_folder1/sub_log_folder1
./log_folder1/sub_log_folder2
./log_folder1/sub_log_folder3
./log_folder10
./log_folder2
./log_folder3
./log_folder4
./log_folder5
./log_folder6
./log_folder7
./log_folder8
./log_folder9
Use a filename expansion (a 'glob'), and a for loop:
$ for fn in log_folder*/*; do
> rm -rf "$fn"
> done
Resulting in:
$ find . -type d
.
./log_folder1
./log_folder10
./log_folder2
./log_folder3
./log_folder4
./log_folder5
./log_folder6
./log_folder7
./log_folder8
./log_folder9
Be sure to use quotes around the file name in "$fn" so you actually delete what you think you are deleting, and not the unintended result of word splitting.
Or, you can just use a glob that targets subdirectories:
$ rm -rf log_folder*/*
Or, an expansion that only targets some of the sub directories, not all:
$ rm -rf log_folder{1..5}/*
Which can also be used in the for loop.
(I suggest being careful with rm -rf .... if you are just getting started with scripting and expansions however. Perhaps practice with something more reversible...)
Use find (note that -mindepth 1 is needed to keep the log_folder directories themselves; without it, find would delete them too):
find log_folder* -mindepth 1 -delete
You can use this loop:
# using a common base directory makes it a little safer with 'rm -rf'
fbase=/my/location/log_folder
for n in {1..10}; do
f="$fbase$n"
echo "Cleaning up folder $f"
rm -rf "$f"/* # this empties everything under $f; $f itself remains as an empty folder
# use 'rm -rf "$f"' to remove everything including the top folder
done

Glob renaming in bash

I'm fairly new to bash so sorry if this is kind of a basic question. I was trying to rename a bunch of mp3 files to prepend 1- to their filenames, and mv *.mp3 1-*.mp3 didn't work unfortunately. So I tried to script it, first with echo to test the commands:
for f in *.mp3 ; do echo mv \'$f\' \'1-$f\'; done
Which seems to output the commands that I like, so I removed the echo, changing the command to
for f in *.mp3 ; do mv \'$f\' \'1-$f\'; done
Which failed. Next I tried piping the commands onward like so
for f in *.mp3 ; do echo mv \'$f\' \'1-$f\'; done | /bin/sh
Which worked, but if anyone could enlighten me as to why the middle command doesn't work I would be interested to know. Or if there is a more elegant one-liner that would do what I wanted, I would be interested to see that too.
I think you have to change the command to
for f in *.mp3 ; do mv "$f" "1-$f"; done
Otherwise you would pass something like 'file1.mp3' and '1-file1.mp3' to mv (including the single quotes).
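You can see the mangling directly: `\'` produces a literal quote character, and the unquoted `$f` is then word-split, so a name with a space turns into two broken arguments (hypothetical filename for illustration):

```shell
f="a b.mp3"
# \' yields a literal quote character; unquoted $f is word-split on
# the space, so mv would receive the two mangled arguments 'a and b.mp3'
printf '[%s]\n' \'$f\'
# ['a]
# [b.mp3']
```

It only "worked" via `| /bin/sh` because the echoed quotes were re-parsed by that second shell, which is fragile and best avoided.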
Dry run:
rename -n 's/^/1-/' *.mp3
Remove the -n if it looks correct to run the command. man rename for details.

bash script to batch truncate filenames in one dir

This question is somewhat unique among others asked.
I have a dir with a bunch of folders, and they are named using periods to separate every word,
such as: foo.bar.2011.useless.words
The last two words are always the useless ones, so I would like to truncate each name starting at the second-to-last period.
Not sure of the wording...
Many thanks
for file in *.*.*
do
mv "$file" "${file%.*.*}"
done
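The `${file%.*.*}` expansion strips the shortest suffix matching `.*.*`, i.e. the last two dot-separated components. A quick check with the example name from the question:

```shell
file="foo.bar.2011.useless.words"
# "%" strips the shortest trailing match of the pattern .*.*
# (here: ".useless.words"), leaving the first three components.
echo "${file%.*.*}"    # foo.bar.2011
```

Because `%` takes the shortest match, only the final two fields are removed no matter how many dots the name contains.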
If your folders with dots are only one level deep, go with Ignacio's answer. However, if you have folders that you want to rename that exist in subdirs such as /toplevel/subdir1/foo.bar.baz.blah/ then you'll need to use my find command below. Unless you have Bash 4.x in which case you can use the shopt -s globstar option.
find /top/level/dir -type d -name '*.*.*' -exec sh -c 'for arg; do echo mv "$arg" "${arg%.*.*}"; done' _ {} +
I added an echo in there so you can do a dry-run without making any changes. Remove the echo if you are satisfied with the output and run it again to make the changes permanent.
Edit
Removed my $tmp var by shamelessly stealing Ignacio's PE

How to do something to every file in a directory using bash?

I started with this:
command *
But it doesn't work when the directory is empty; the * wildcard becomes a literal "*" character. So I switched to this:
for i in *; do
...
done
which works, but again, not if the directory is empty. I resorted to using ls:
for i in `ls -A`
but of course, then file names with spaces in them get split. I tried tacking on the -Q switch:
for i in `ls -AQ`
which causes the names to still be split, only with a quote character at the beginning and end of each name. Am I missing something obvious here, or is this harder than it ought to be?
Assuming you only want to do something to files, the simple solution is to test if $i is a file:
for i in *
do
if test -f "$i"
then
echo "Doing something to $i"
fi
done
You should really always make such tests, because you almost certainly don't want to treat files and directories in the same way. Note the quotes around the "$i" which prevent problems with filenames containing spaces.
find could be what you want.
find . | while IFS= read -r file; do
# do something with $file
done
Or maybe like this:
find . -exec <command> {} \;
If you do not want the search to include subdirectories you might need to add a combination of -type f and -maxdepth 1 to the find command. See the find man page for details.
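A sketch of that restriction, limited to regular files in the current directory only:

```shell
# Demo setup: one file at the top level, one in a subdirectory
tmp=$(mktemp -d)
cd "$tmp"
touch top.txt
mkdir sub && touch sub/nested.txt

# -maxdepth 1 stops find descending into subdirectories;
# -type f skips directories (including "." itself).
find . -maxdepth 1 -type f | while IFS= read -r file; do
    echo "processing $file"
done
```

This prints only `processing ./top.txt`; the nested file is never visited.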
It depends whether you're going to type this at a command prompt, and which command you're applying to the files.
If it's typed you could go with your second choice and substitute something harmless for the command. I like to use echo instead of mv or rm, for example.
Put it all on one line:
for i in * ; do command "$i"; done
When that works - or you can see where it fails, and whether it's harmless, you can press up-arrow, edit the command and try again.
Use shopt -s nullglob so an unmatched *.txt expands to nothing instead of remaining as a literal *.txt:
shopt -s nullglob
for myfile in *.txt
do
# your code here
echo "$myfile"
done
this should do the trick:
find -type d -print0 | xargs -0 -I{} echo "your folder: {} !"
find -type f -print0 | xargs -0 -I{} echo "your file: {} !"
The -print0 / -0 options are there to avoid problems with whitespace, and -I{} makes xargs substitute each name into the {} placeholder (one name per command).