I am attempting to run an rsync command that will copy files to a new location. If I run the rsync command directly, without any parameter expansions on the command line, rsync does what I expect:
$ rsync -amnv --include='lib/***' --include='arm-none-eabi/include/***' \
--include='arm-none-eabi/lib/***' --include='*/' --exclude='*' \
/tmp/from/ /tmp/to/
building file list ... done
created directory /tmp/to
./
arm-none-eabi/
arm-none-eabi/include/
arm-none-eabi/include/_ansi.h
...
arm-none-eabi/lib/
arm-none-eabi/lib/aprofile-validation.specs
arm-none-eabi/lib/aprofile-ve.specs
...
lib/
lib/gcc/
lib/gcc/arm-none-eabi/
lib/gcc/arm-none-eabi/4.9.2/
lib/gcc/arm-none-eabi/4.9.2/crtbegin.o
...
sent 49421 bytes received 6363 bytes 10142.55 bytes/sec
total size is 423195472 speedup is 7586.32 (DRY RUN)
However, if I enclose the filter arguments in a variable, and invoke the command using that variable, different results are observed. rsync copies over a number of extra directories I do not expect:
$ FILTER="--include='lib/***' --include='arm-none-eabi/include/***' \
--include='arm-none-eabi/lib/***' --include='*/' --exclude='*'"
$ rsync -amnv ${FILTER} /tmp/from/ /tmp/to/
building file list ... done
created directory /tmp/to
./
arm-none-eabi/
arm-none-eabi/bin/
arm-none-eabi/bin/ar
...
arm-none-eabi/include/
arm-none-eabi/include/_ansi.h
arm-none-eabi/include/_syslist.h
...
arm-none-eabi/lib/
arm-none-eabi/lib/aprofile-validation.specs
arm-none-eabi/lib/aprofile-ve.specs
...
bin/
bin/arm-none-eabi-addr2line
bin/arm-none-eabi-ar
...
lib/
lib/gcc/
lib/gcc/arm-none-eabi/
lib/gcc/arm-none-eabi/4.9.2/
lib/gcc/arm-none-eabi/4.9.2/crtbegin.o
...
sent 52471 bytes received 6843 bytes 16946.86 bytes/sec
total size is 832859156 speedup is 14041.53 (DRY RUN)
If I echo the command that fails, it prints exactly the command that succeeds. Copying that output and running it directly gives me the expected result.
There is obviously something I'm missing about how bash parameter expansion works. Can somebody please explain why the two different invocations produce different results?
The shell parses quotes before expanding variables, so putting quotes in a variable's value doesn't do what you expect -- by the time they're in place, it's too late for them to do anything useful. See BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail! for more details.
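You can watch this happen with a toy example (the variable name and printf here are just for illustration). The embedded quotes survive as literal characters, and the space still splits the value into two words:
$ msg="'hello world'"
$ printf '<%s>\n' $msg
<'hello>
<world'>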
In your case, it looks like the easiest way around this problem is to use an array rather than a plain text variable. This way, the quotes get parsed when the array is created, each "word" gets stored as a separate array element, and if you reference the variable properly (with double-quotes and [@]), the array elements get included in the command's argument list without any unwanted parsing:
filter=(--include='lib/***' --include='arm-none-eabi/include/***' \
--include='arm-none-eabi/lib/***' --include='*/' --exclude='*')
rsync -amnv "${filter[#]}" /tmp/from/ /tmp/to/
Note that arrays are available in bash and zsh, but not all other POSIX-compatible shells. Also, I lowercased the filter variable name -- recommended practice to avoid colliding with the shell's special variables (which are all uppercase).
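If you want to double-check what the array actually holds, bash can print it back to you (output from a recent bash; the formatting may vary slightly between versions). Note the single quotes are gone: they did their grouping job at assignment time and are not part of the stored values:
$ declare -p filter
declare -a filter=([0]="--include=lib/***" [1]="--include=arm-none-eabi/include/***" [2]="--include=arm-none-eabi/lib/***" [3]="--include=*/" [4]="--exclude=*")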
I like to break the arguments onto separate lines, for convenience's sake:
ROPTIONS=(
-aNHXxEh
--delete
--fileflags
--exclude-from="$EXCLUDELIST"
--delete-excluded
--force-change
--stats
--protect-args
)
and then call it thusly:
rsync "${ROPTIONS[#]}" "$SOURCE" "$DESTINATION"
I have a script that I call from an application; I can't run it from the command line. I derive the directory where the script is called, and in the next variable I go up one level to where my files are stored. From there I have 3 variables with the full path and file names (with wildcards), which I will refer to as "masks".
I need to find and "do something with" (copy/write their names to a new file, whatever else) each of these masks. The "do something" part isn't my obstacle, as I've done this fine when working with a single mask, but I would like to do it cleanly in a single loop instead of duplicating the loop and just referencing each mask separately, if possible.
Assume in my $FILESFOLDER directory below that I have 2 existing files, aaa0.csv & bbb0.csv, but no file matching the ccc*.csv mask.
#!/bin/bash
SCRIPTFOLDER=${0%/*}
FILESFOLDER="$(dirname "$SCRIPTFOLDER")"
ARCHIVEFOLDER="$FILESFOLDER"/archive
LOGFILE="$SCRIPTFOLDER"/log.txt
FILES1="$FILESFOLDER"/"aaa*.csv"
FILES2="$FILESFOLDER"/"bbb*.csv"
FILES3="$FILESFOLDER"/"ccc*.csv"
ALLFILES="$FILES1
$FILES2
$FILES3"
#here as an example I would like to do a loop through $ALLFILES and copy anything that matches to $ARCHIVEFOLDER.
for f in $ALLFILES; do
cp -v "$f" "$ARCHIVEFOLDER" > "$LOGFILE"
done
echo "$ALLFILES" >> "$LOGFILE"
The thing that really spins my head is that when I run something like this (I haven't done it with the copy command in place), the log file at the end shows:
filesfolder/aaa0.csv filesfolder/bbb0.csv filesfolder/ccc*.csv
Where I would expect echoing $ALLFILES just to show me the masks:
filesfolder/aaa*.csv filesfolder/bbb*.csv filesfolder/ccc*.csv
In my "do something" area, I need to be able to use whatever method to find the files by their full path/name with the wildcard if at all possible. Sometimes my network is down for maintenance and I don't want to risk failing a change directory. I rarely work in linux (primarily SQL background) so feel free to poke holes in everything I've done wrong. Thanks in advance!
Here's a light refactoring with significantly fewer distracting variables.
#!/bin/bash
script=$0
folder="$(dirname "$script")"
archive="$folder"/archive
log="$folder"/log.txt # you would certainly want this in the folder, not $script/log.txt
shopt -s nullglob
all=()
for prefix in aaa bbb ccc; do
cp -v "$folder/$prefix"*.csv "$archive" >>"$log" # append, don't overwrite
all+=("$folder/$prefix"*.csv)
done
echo "${all[#]}" >> "$log"
The change in the loop to append the output or cp -v instead of overwrite is a bug fix; otherwise the log would only contain the output from the last loop iteration.
I would probably prefer to have the files echoed from inside the loop as well, one per line, instead of collecting them all on one humongous line. Then you can remove the array all and instead simply
printf '%s\n' "$folder/$prefix"*.csv >>"$log"
shopt -s nullglob is a Bash extension (so it won't work with sh) which says to discard any wildcard which doesn't match any files (the default behavior is to leave globs unexpanded if they don't match anything). If you want a different solution, perhaps see Test whether a glob has any matches in Bash.
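A quick interactive demonstration, using a path that is assumed to match nothing:
$ echo /nonexistent/*.csv    # default: the unmatched glob is passed through as-is
/nonexistent/*.csv
$ shopt -s nullglob
$ echo /nonexistent/*.csv    # nullglob: the glob expands to nothing, echo prints a blank line

$ shopt -u nullglob          # restore the default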
You should use lower case for your private variables so I changed that, too. Notice also how the script variable doesn't actually contain a folder name (or "directory" as we adults prefer to call it); fixing that uncovered a bug in your attempt.
If your wildcards are more complex, you might want to create an array for each pattern.
tmpspaces=(/tmp/*\ *)
homequest=($HOME/*\?*)
for file in "${tmpspaces[@]}" "${homequest[@]}"; do
: stuff with "$file", with proper quoting
done
The only robust way to handle file names which could contain shell metacharacters is to use an array variable; using string variables for file names is notoriously brittle.
Perhaps see also https://mywiki.wooledge.org/BashFAQ/020
I have a Bash script in which I call rsync in order to perform a backup to a remote server. To specify that my Downloads folder be backed up, I'm passing "'${HOME}/Downloads'" as an argument to rsync which produces the output:
rsync -avu '/Volumes/Norman Data/Downloads' me@example.com:backup/
Running the command with the variable expanded as above (through the terminal or in the script) works fine. But because of the space in the expanded variable, and the fact that the quotes (single ticks) are treated as literal characters when they arrive via the variable (see here), the only way I can get it not to choke on the space is to do:
stmt="rsync -avu '${HOME}/Downloads' me#examle.com:backup/"
eval ${stmt}
It seems like there would be some vulnerabilities presented by running eval on anything not 100% private to that script. Am I correct in thinking I should be doing it a different way? If so, any hints for a bash-script-beginner would be greatly appreciated.
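To illustrate my worry with a deliberately contrived example: if any value that ends up in the string contained shell syntax, eval would happily execute it:
$ dir='$(touch /tmp/pwned)'    # a hypothetical malicious value
$ stmt="ls ${dir}"
$ eval ${stmt}                 # the command substitution runs, creating /tmp/pwned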
** EDIT ** - I actually have a bit more involved use case than the example above. For the paths passed, I have an array of them, each containing spaces, that I'm then combining into one string, kind of like:
include_paths=(
"'${HOME}/dir_a'"
"'${HOME}/dir_b' --exclude=video"
)
for item in "${include_paths[#]}"
do
inc_args="${inc_args}" ${item}
done
inc_args evaluates to '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video
which I then try to pass as an argument to rsync but the single ticks are read as literals and it breaks after the 1st /Volumes/Norman because of the space.
rsync -avu "${inc_args}" me#example.com:backup/
Using eval seems to read the single ticks as quotes and executes:
rsync -avu '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video me@example.com:backup/
like I need it to. I can't seem to get any other way to work.
** EDIT 2 - SOLUTION **
So the 1st thing I needed to do was modify the include_paths array to:
remove single ticks from within double quoted items
move any path-specific flags (e.g. --exclude) to their own items, directly after the path they should apply to
I then built up an array containing the rsync command and its options, added the expanded include_paths and exclude_paths arrays and the connection string to the remote host.
And finally expanded that array, which ran my entire, properly quoted rsync command. In the end the modified array include_paths is:
include_paths=(
"${HOME}/dir_a"
"${HOME}/dir_b"
"--exclude=video"
"${HOME}/dir_c"
)
and I put everything together with:
cmd=(rsync -auvzP)
for item in "${exclude_paths[#]}"
do
cmd+=("--exclude=${item}")
done
for item in "${include_paths[#]}"
do
cmd+=("${item}")
done
cmd+=("me#example.com:backup/")
set -x
"${cmd[#]}"
Use an array for the command/options instead of a plain variable.
stmt=(rsync -avu "${HOME}/Downloads" me@example.com:backup/)
Execute it using the builtin command
command "${stmt[#]}"
Or, I personally just put the options/arguments in an array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Then execute it using rsync:
rsync "${options[#]}"
If you have a newer version of bash (4.4 or later) that supports the @Q parameter expansion (P.E.), then you could quote the array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Check the output by applying the P.E.
echo "${options[#]#Q}"
Should print
'-avu' '/Volumes/Norman Data/Downloads' 'me@example.com:backup/'
Then you can just
rsync "${options[#]#Q}"
I have a text file called OPTIONS.txt storing all flags of Makefile:
arg1=foo arg2="-foo -bar"
I want to pass all flags in this file to make. However,
make `cat OPTIONS.txt`
fails with make: invalid option -- 'a'. It seems that the shell interprets it as:
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^argv[2]   ^argv[3]
Is there any way to make it interpreted as:
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^--------argv[2]
Since you control the options file, store the options one per line:
arg1=foo
arg2="-foo -bar"
Then in the shell, you'll read the file into an array, one element per line:
readarray -t opts < OPTIONS.txt
Now you can invoke make and keep the options whole:
make "${opts[#]}"
If you want the shell to interpret quotes after backtick expansion you need to use eval, like this:
eval make `cat OPTIONS.txt`
however just be aware that this evaluates everything, so if you have quoted content outside of the backticks you'll get the same issue:
eval make `cat OPTIONS.txt` arg4="one two"
will give an error. You'd have to double-quote the arg4, something like this:
eval make `cat OPTIONS.txt` arg4='"one two"'
In general it's tricky to do stuff like this from the command line, outside of scripts.
ETA
The real problem here is that we don't have a set of requirements. Why do you want to put these into a file, and what kind of things are you adding; are they only makefile variable assignments, or are there other make options here as well such as -k or similar?
IF the OP controls (can change) the format of the file AND the file contains content only used by make AND the OP doesn't care about the variables being command line assignments vs. regular assignments AND there are only variable assignments and not other options, then they can just (a) put each variable assignment on its own line, (b) remove all quotes, and (c) use include OPTIONS.txt from inside the makefile to "import" them.
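Under those assumptions, OPTIONS.txt would shrink to two plain assignments (names illustrative), and a minimal makefile to import them could look like:
# OPTIONS.txt
arg1=foo
arg2=-foo -bar

# Makefile
include OPTIONS.txt
show:
	@echo arg1 is $(arg1), arg2 is $(arg2)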
I've got a bash script accepting several files as input, which are mixed with the script's various options, for example:
bristat -p log1.log -m lo2.log log3.log -u
I created an array where I save all the indices at which files appear in the script's call, so in this case it would be an array of 3 elements, where
arr_pos[0] = 2
arr_pos[1] = 4
arr_pos[2] = 5
Later in the script I must call "head" and "grep" on those files, and I tried it this way:
head -n 1 ${arr_pos[0]}
but I get this error at runtime:
head: cannot open `2' for reading: No such file or directory
I tried various parenthesis combinations, but I can't find which one is correct.
The problem here is that ${arr_pos[0]} stores the index at which you have the file name, not the file name itself -- so you can't simply head it. The array storing your arguments is given by $@.
A possible way to access the data you want is:
#! /bin/bash
declare -a arr_pos=(2 4 5)
echo ${@:${arr_pos[0]}:1}
Output:
log1.log
The expansion ${@:${arr_pos[0]}:1} means you're taking a slice of length 1 from the array of positional parameters $@, starting at index ${arr_pos[0]}.
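Putting that to work, here's a sketch that applies the same expansion to every saved position (assuming the script is invoked as in the question, so the saved indices line up with the positional parameters):
#! /bin/bash
declare -a arr_pos=(2 4 5)    # indices of the file arguments, saved earlier
for i in "${arr_pos[@]}"; do
    head -n 1 "${@:i:1}"      # the offset is evaluated arithmetically, so bare i works
done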
Another way to do so, as pointed out by @flaschenpost, is to eval the index preceded by $, so that you'd be accessing the array of arguments. Although it works very well, it may be risky depending on who is going to run your script -- as they may add commands in the argument line.
Anyway, you should probably loop through the entire array of arguments at the beginning of the script, storing the values you find, so that you won't be in trouble when trying to fetch each value later. You may loop using for + case ... esac, and store the values in associative arrays.
I think eval is what you need.
#!/bin/bash
arr_pos[0]=2;
arr_pos[1]=4;
arr_pos[2]=5;
eval "cat \$${arr_pos[1]}"
For me that works.
I'm trying to create a file hierarchy to store data. I want to create a folder for each data acquisition session. That folder has five subfolders, which are named below. My code attempt below gives an error, but I'm not sure how to correct it.
Code
#!/bin/sh
TRACES = "/Traces"
LFPS = '/LFPS'
ANALYSIS = '/Analysis'
NOTES = '/Notes'
SPIKES = '/Spikes'
folders=($TRACES $LFPS $ANALYSIS $NOTES $SPIKES)
for folder in "${folders[#]}"
do
mkdir $folder
done
Error
I get an error when declaring the variables. As written above, bash throws the error Command not found. If, instead, I declare the file names as TRACES = $('\Traces'), bash throws the error No such file or directory.
Remove the spaces between the variable names and the values:
#!/bin/sh
TRACES="/Traces"
LFPS='/LFPS'
ANALYSIS='/Analysis'
NOTES='/Notes'
SPIKES='/Spikes'
folders=($TRACES $LFPS $ANALYSIS $NOTES $SPIKES)
for folder in "${folders[#]}"
do
mkdir $folder
done
With spaces, bash interprets this like
COMMAND param1 param2
with = as param1
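You can reproduce this at an interactive prompt:
$ TRACES = "/Traces"
bash: TRACES: command not found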
I'm taking the 'no spaces around the variable assignments' part of the fix as given.
Using array notation seems like overkill. Allowing for possible spaces in names, you can use:
for dir in "$TRACES" "$LFPS" "$ANALYSIS" "$NOTES" "$SPIKES"
do mkdir "$dir"
done
But even that is wasteful:
mkdir "$TRACE" "$LFPS" "$NOTES" "$PASS"
If you're worried that the directories might exist, you can avoid error messages for that with:
mkdir -p "$TRACE" "$LFPS" "$NOTES" "$PASS"
The -p option is also valuable if the paths are longer and some of the intermediate directories might be missing. If you're sure there won't be spaces in the names, the double quotes become optional (but they're safe and cheap, so you might as well use them).
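For example, with a hypothetical nested path:
$ mkdir -p /tmp/demo/sub/dir    # creates /tmp/demo and /tmp/demo/sub along the way
$ mkdir -p /tmp/demo/sub/dir    # running it again is a silent no-op, not an error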
Also, you would want to do some checking beforehand to see whether the folders exist or not. You can always debug the shell script with set -x; alternatively, you could just use mkdir -p, which would do the trick.
I made the following changes to get your script to run.
As a review comment, it is unusual to create such folders hanging off the root file system.
#!/bin/sh
TRACES="/Traces"
LFPS='/LFPS'
ANALYSIS='/Analysis'
NOTES='/Notes'
SPIKES='/Spikes'
folders="$TRACES $LFPS $ANALYSIS $NOTES $SPIKES"
for folder in $folders
do
mkdir $folder
done
Spaces were removed from the initial variable assignments, and I also simplified the for loop so that it iterates over the words in the folders string.