In a script, I am trying to use rm to remove the files in a directory without deleting the directory itself. The examples I see only do this from inside the directory, and I would like to do it without navigating there.
I tried
rm "$(dirname $1)/filetokeep/*"
but it is not working. Any help?
Quoting the wildcard inhibits expansion.
rm -- "$(dirname -- "$1")/filetokeep"/*
Using -- ensures that the expanded values can't be interpreted as options rather than operands (so that things still work if the directory named in $1 starts with a -).
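For illustration, here is a minimal sketch with a throwaway directory (the demo paths are made up), using echo so nothing is actually deleted; it only shows what rm would receive in each case:
mkdir -p demo/filetokeep
touch demo/filetokeep/a demo/filetokeep/b
echo rm "demo/filetokeep/*"   # quoted: prints the literal pattern  rm demo/filetokeep/*
echo rm -- demo/filetokeep/*  # unquoted: prints  rm -- demo/filetokeep/a demo/filetokeep/b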
Related
I was trying to remove a file in a repository whose name contained a "$" symbol, so I used:
rm *$*
But this deleted all my files in the directory and attempted to delete all subdirectories as well.
Can someone explain why this command did not just remove files containing a $?
*$* first undergoes parameter expansion, with $* expanding to a string consisting of all the current positional parameters. If there are none, the result is *, which then undergoes pathname expansion.
The correct command would have been something like rm *"$"* or rm *\$*, with the $ escaped to prevent any parameter expansion from taking place before pathname expansion.
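A quick sketch of the difference (echo is used instead of rm so nothing is removed; run it in any directory):
set --        # no positional parameters, so $* expands to nothing
echo *$*      # the word collapses to the bare glob * and matches everything
echo *\$*     # the escaped $ stays in the pattern, so only names containing $ match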
I want to write a script that takes a name of a folder as a command line argument and produces a file that contains the names of all subfolders with size 0 (empty subfolder). This is what I got:
#!/bin/bash
echo "Name of a folder'
read FOLDER
for entry in "$search_dir"/*
do
echo "$entry"
done
Your script doesn't contain the logic you intended. The find command has a built-in feature for this:
$ find path/to/dir -type d -empty
will print empty directories starting from the given path/to/dir
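Since the question wants the folder taken as a command-line argument and the names written out to a file, a minimal sketch could be (the output filename here is just an assumption for illustration):
#!/bin/sh
find "${1:?need a directory argument}" -type d -empty > empty_subfolders.txt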
I would suggest you accept the answer which recommends using find instead. But just to be complete, here is some feedback on your code.
You read the input directory into FOLDER but then never use this variable.
As an aside, don't use uppercase for your private variables; this is reserved for system variables.
You have unpaired quotes in the prompt string. If the opening quote is double, you need to close with a double quote, or vice versa for single quotes.
You loop over directory entries, but do nothing to isolate just the ones which are directories, let alone empty directories.
Finally, nothing in your script uses Bash-only facilities, so it would be safe and somewhat more portable to use #!/bin/sh
Now, looping over directories can be done by using search_dir/*/ instead of just search_dir/*; and finding out which ones are empty can be done by checking whether a wildcard inside the directory matches anything, or is simply left as the literal pattern because nothing matched. (This assumes default globbing behavior -- with nullglob you would make a wildcard with no matches expand to an empty list, but this is problematic in some scenarios, so it's not the default.)
#!/bin/bash
# read -p is not POSIX
read -p "Name of a folder" search_dir
for dir in "$search_dir"/*/
do
# [[ is Bash only; note that globs are not expanded inside [[, so expand the wildcard into an array first
contents=("$dir"/*)
if [[ ${contents[*]} = "$dir/*" ]]; then # Notice tricky quoting
echo "$dir"
fi
done
Note that [[ does not perform wildcard expansion on its operands, which is why the glob is expanded into an array first; the test then checks whether that expansion is still just the literal pattern, which (with default globbing) means the directory contains nothing the wildcard could match. Doing the comparison with plain [ and an unquoted expansion is problematic because [ is not prepared to deal with it -- you get "too many arguments" if the wildcard expands into more than one filename -- so I'm using the somewhat more mild-tempered Bash replacement [[, which performs no word splitting and copes just fine with this. Alternatively, you could use case, as sketched below, which I would actually prefer here; but I've stuck to if in order to make only minimal changes to your script.
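A sketch of that case variant, using the same array trick as above:
#!/bin/bash
read -p "Name of a folder" search_dir
for dir in "$search_dir"/*/
do
    contents=("$dir"/*)
    case ${contents[*]} in
        "$dir/*") echo "$dir" ;;  # the glob stayed literal, so the directory is empty
    esac
done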
I am currently trying to substitute arguments into a filepath
FILES=(~/some/file/path/${1:-*}*/${2:-*}*/*)
I'm trying to optionally substitute variables, so that if there are no arguments the path looks like ~/some/file/path/**/**/* and if there is just one, it looks like ~/some/file/path/arg1*/**/*, etc. However, I need the wildcard expansion to occur after the filepath has been constructed. Currently what seems to be happening is that the path goes into FILES as a single string with the asterisks left unexpanded.
The broader goal is to pass all subdirectories that are two levels down from the current directory into the FILES variable, unless arguments are given, in which case the first argument is used to pick a particular directory at the first level, the second argument for the second level.
edit:
This script generates directories and then grabs random files from them. It previously had ** instead of *; with * it still works and correctly restricts the files to pull from when given arguments. Issue resolved.
#!/bin/bash
mkdir dir1 dir1/a
touch dir1/a/foo.txt dir1/a/bar.txt
cp -r dir1/a dir1/b
cp -r dir1 dir2
files=(./*${1:-}/*/*)
for i in {1..10}
do
# Get random file
nextfile=${files[$RANDOM % ${#files[@]} ]}
# Use file
echo "$nextfile" || break
sleep 0.5
done
rm -r dir1 dir2
I can't reproduce this behavior.
$ files=( ~/tmp/foo/${1:-*}*/${2:-*}*/* )
$ declare -p files
declare -a files='([0]="/Users/chaduffy/tmp/foo/bar/baz/qux")'
To explain why this is expected to work: Parameter expansion happens before glob expansion, so by the time glob expansion takes place, content has already been expanded. See lhunath's simplified diagram of the bash parse/expansion process for details.
A likely explanation is simply that your glob has no matches, and is evaluating to itself for that reason. This behavior can be disabled with the nullglob switch, which will give you an empty array:
shopt -s nullglob
files=(~/some/file/path/${1:-*}*/${2:-*}*/*)
declare -p files
Another note: ** only has special meaning in shells where shopt -s globstar has been run, and where this feature (added in 4.0) is available. On Mac OS X (without installation of a newer version of bash via MacPorts or similar), it doesn't exist; you'll want to use find for recursive operations. If your glob would only match if ** triggered recursion, this would explain the behavior in question.
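If the point of ** was to recurse, a rough equivalent with find (purely a sketch; the base path is the one from the question) is to collect the matches with a null-delimited read loop so unusual filenames survive intact:
files=()
while IFS= read -r -d '' f; do
    files+=("$f")
done < <(find ~/some/file/path -type f -print0)
declare -p files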
I'm trying to create a for loop over folders whose names contain spaces, commas and parentheses. For example:
Italy - Rimini (Feb 09, 2013)
First it scans a parent folder /albums for sub-folders that look like the example above. Then it executes curl actions on files in those sub-folders. It works fine if the sub-folders do not contain spaces, commas or other symbols.
for dir in `ls /albums`;
do
for file in /albums/$dir/*
do
curl http://upload.com/up.php -F uploadfile[]=#"$file" > out.txt
php process.php
done
php match.php
done
But if there are such symbols, it seems that the curl bit gets stuck - it can't find the $file (probably because $dir is incorrect).
I could replace all the symbols in the sub-dirs or remove them or rename the folders to 001, 002 and it works flawlessly. But before resorting to that I'd like to know if it can be solved using bash tricks while keeping the sub-folder name intact?
Familiarize yourself with your shell's concept of word splitting. Then realize that using ls to get a list of files with spaces in their names is asking for trouble. Instead, use shell globbing and then quote the expansions:
cd /albums
for dir in *; do
for file in /albums/"$dir"/*; do
echo x"$dir"x"$file"x
done
php match.php
done
For problems with spaces in filenames, you have to change the IFS to
IFS='
'
which tells the shell that only line breaks separate words. By default, IFS is set to spaces, tabs and line breaks.
So just put this before the loop begins, and it will work with filenames that contain spaces.
And of course put quotes around your variable names.
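A small illustration of the effect (the album names are made up): with the default IFS the expansion of $dirs below splits on every space, but with IFS set to a newline only, each line stays in one piece:
dirs=$(printf '%s\n' 'Italy - Rimini (Feb 09, 2013)' 'Another album')
old_ifs=$IFS
IFS='
'
for d in $dirs; do
    printf '<%s>\n' "$d"   # prints each album name as a single word
done
IFS=$old_ifs               # restore the default splitting behaviour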
I have a simple test bash script which looks like that:
#!/bin/bash
cmd="rsync -rv --exclude '*~' ./dir ./new"
$cmd # execute command
When I run the script, it also copies the files ending with a ~, even though I meant to exclude them. When I run the very same rsync command directly from the command line, it works! Does someone know why, and how to make the bash script work?
Btw, I know that I can also work with --exclude-from but I want to know how this works anyway.
Try eval:
#!/bin/bash
cmd="rsync -rv --exclude '*~' ./dir ./new"
eval $cmd # execute command
The problem isn't that you're running it in a script, it's that you put the command in a variable and then run the expanded variable. And since variable expansion happens after quote removal has already been done, the single quotes around your exclude pattern never get removed... and so rsync winds up excluding files with names starting with ' and ending with ~'. To fix this, just remove the quotes around the pattern (the whole thing is already in double-quotes, so they aren't needed):
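To see what actually happens, here is a small sketch that prints the words the unquoted expansion produces (assuming no files in the current directory happen to match the stray '*~' glob, so it stays literal):
cmd="rsync -rv --exclude '*~' ./dir ./new"
printf '<%s> ' $cmd; echo
# expected output: <rsync> <-rv> <--exclude> <'*~'> <./dir> <./new>  -- the quotes are still part of the argument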
#!/bin/bash
cmd="rsync -rv --exclude *~ ./dir ./new"
$cmd # execute command
...speaking of which, why are you putting the command in a variable before running it? In general, this is a good way to make code more confusing than it needs to be, and to trigger parsing oddities (some even weirder than this). So how about:
#!/bin/bash
rsync -rv --exclude '*~' ./dir ./new
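If the command really does need to be built up ahead of time, a Bash array is a sketch of a safer alternative, since each argument (including the pattern) stays a separate word and is never re-parsed:
#!/bin/bash
cmd=(rsync -rv --exclude '*~' ./dir ./new)
"${cmd[@]}"   # execute command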
You can use a simple --exclude '~', since (according to the man page):
if the pattern starts with a / then it is anchored to a particular spot in the hierarchy of files, otherwise it is matched against the end of the pathname. This is similar to a leading ^ in regular expressions. Thus "/foo" would match a name of "foo" at either the "root of the transfer" (for a global rule) or in the merge-file's directory (for a per-directory rule). An unqualified "foo" would match a name of "foo" anywhere in the tree because the algorithm is applied recursively from the top down; it behaves as if each path component gets a turn at being the end of the filename. Even the unanchored "sub/foo" would match at any point in the hierarchy where a "foo" was found within a directory named "sub". See the section on ANCHORING INCLUDE/EXCLUDE PATTERNS for a full discussion of how to specify a pattern that matches at the root of the transfer.
if the pattern ends with a / then it will only match a directory, not a regular file, symlink, or device.
rsync chooses between doing a simple string match and wildcard matching by checking if the pattern contains one of these three wildcard characters: '*', '?', and '['.
a '*' matches any path component, but it stops at slashes.
use '**' to match anything, including slashes.