bash cp filenames with spaces [duplicate]

This question already has answers here:
Tilde in path doesn't expand to home directory
(5 answers)
Closed 4 years ago.
I'm using variables in bash script to hold folder names (to iterate over multiple folders).
I'd like to copy files from one folder to another; the files exist in the source directory. Folder and file names contain spaces, so I must use double quotes.
For instance:
#!/bin/bash
inpath="~/foo bar/"
outpath="~/temp basket/
cp "$inpath*" "$outpath"
The copy fails with: '~/foo bar/*': No such file or directory
Is there any consistent way to do that?

Only quote the parts you don't want expanded or split:
inpath=~/'foo bar'
outpath=~/'temp basket'
cp -- "$inpath/"* "$outpath"

Related

How to use whitespace in directories with a Bash script? [duplicate]

This question already has answers here:
How do I pass on script arguments that contain quotes/spaces?
(2 answers)
Bash script to cd to directory with spaces in pathname
(14 answers)
Closed 24 days ago.
I have a script that appends the date and time to all files in a folder. I use the script like this...
bash append_date.sh /home/user/Documents/Podcasts/
and that will append the date to all files in the /home/user/Documents/Podcasts/ folder
The problem is that if there is whitespace in the directory tree, it fails to do anything, i.e.
bash append_date.sh /home/user/Documents/My Stuff/
I have tried passing the following, but that does not work either:
bash append_date.sh /home/user/Documents/My\ Stuff/
How do I get this script to play nice with whitespaces?
Many thanks for any help.
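The script itself isn't shown, but as a minimal sketch (assuming append_date.sh takes the target directory as its first argument), the usual fix is to quote the parameter everywhere it is used inside the script:
#!/bin/bash
# Hypothetical sketch of append_date.sh; the real script is not shown in the question.
dir=$1                          # directory passed as the first argument
stamp=$(date +%Y-%m-%d_%H%M)    # date and time to append
for f in "$dir"/*; do           # quoting "$dir" keeps spaces intact
    [ -f "$f" ] || continue     # skip anything that is not a regular file
    mv -- "$f" "${f}_${stamp}"  # append the timestamp to the filename
done
Called as bash append_date.sh "/home/user/Documents/My Stuff/" (or with the backslash-escaped form from the question), the path arrives in $1 as a single word, and the quoting inside the script keeps it that way.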

Escape command parameters for re-use in Bash? [duplicate]

This question already has answers here:
Why do bash parameter expansions cause an rsync command to operate differently?
(2 answers)
How can I store a list of arguments in a variable in bash?
(2 answers)
Closed 1 year ago.
I want to save command parameters for re-use in a variable in Bash. The reason is they are very long and I use them multiple times.
SRC_FOLDER="src folder"
DST_FOLDER="dst folder"
PARAMS="--dry-run \"$SRC_FOLDER\" \"$DST_FOLDER\""
rsync $PARAMS
The problem is the spaces in the src and dst folder names. Rsync thinks there are 4 folders instead of 2. I think I somehow have to escape the variables in line 3.
I cannot escape the folders in lines 1 and 2 because they might have been passed to my bash script as parameters and I don't know their content.
Please also note that the src and dst folders may contain other special characters like *, $, \, ", '.
Is this possible without using separate variables like this:
rsync $PARAMS "$SRC_FOLDER" "$DST_FOLDER"
Is there a solution?
#!/bin/bash
rsync_args=(
    --dry-run
    "src folder"
    "dst folder"
)
rsync "${rsync_args[@]}"
About the problem that I mentioned in my comment:
The remote path in rsync commands is subject to word splitting.
The following example is wrong:
rsync -av ~/ user@server:'home backup 2021-12-04/'
You have to write it like this:
rsync -av ~/ user@server:'home\ backup\ 2021-12-04/'
A way to fix it automatically is:
#!/bin/bash
remote_path='home backup 2021-12-04/'
qq="'\''"
rsync -av ~/ user#server:"'${remote_path//\'/$qq}'"

I want to move all files from one folder to another folder using a shell script [duplicate]

This question already has answers here:
Command not found error in Bash variable assignment
(5 answers)
Closed 4 years ago.
I want to move all files from one folder to another folder using a shell script.
This is my code, but it throws an error:
#!/bin/sh
SRC = '/home/xxx/test1/'
DESTN = '/home/xxx/test/'
mv SRC DESTN
Error: ./move.sh: 2: ./move.sh: SRC: not found
./move.sh: 2: ./move.sh: SRC: not found
mv: cannot stat 'SRC': No such file or directory
When declaring shell variables, you cannot add spaces between the variable name and the = sign, nor between the = and the value.
Also remember to add $ before the variable name when using it after its declaration.
Your script should look like this one:
#!/bin/sh
SRC="/home/xxx/test1/*"
DESTN="/home/xxx/test"
mv "$SRC" "$DESTN"

Renaming Multiple Files on macOS Terminal [duplicate]

This question already has answers here:
Batch renaming files with Bash
(10 answers)
Closed 5 years ago.
Is it possible to rename multiple files that share a similar name but are different types of files all at once?
Example:
apple.png
apple.pdf
apple.jpg
Can I substitute apple for something else, for example "pear"? If this is possible, what would the command be? Many thanks for your time!
You can do this in bash natively by looping over the files beginning with apple and renaming each one in turn using bash parameter expansion:
$ for f in apple*; do mv "$f" "${f/apple/pear}"; done
The for f in apple* finds all files matching the wildcard; each filename is then assigned in turn to the variable f.
For each value of f, bash calls mv to move (rename) the file from its existing name to one where apple is replaced by pear.
You could also install rename using a package manager like Homebrew and call:
rename -e 's/apple/pear/' apple*
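If you want to preview the renames before committing to them (just a convenience, not part of the original answer), put echo in front of mv for a dry run; with the three files from the question present, it prints the commands it would run:
$ for f in apple*; do echo mv "$f" "${f/apple/pear}"; done
mv apple.jpg pear.jpg
mv apple.pdf pear.pdf
mv apple.png pear.png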

Loop over directories [duplicate]

This question already has answers here:
How to loop over directories in Linux?
(11 answers)
Closed 7 years ago.
I'm not sure how to handle a directories-within-directories scenario in the shell.
I have a folder structure as below.
Directory structure:
/DirA/DirA1/DirA11/*.txt
/DirA2/DirA21/*.txt
/DirA3/DIrA31/*.txt
I'm new to shell scripting and not able to figure out how to read these text files.
You can use the find command to process all files with certain properties in a directory tree. For example,
find /DirA* -name '*.txt' 2>/dev/null
would list all files named *.txt inside the trees you are mentioning. Note that if you use wildcards in the name mask, you need to single-quote them in order to protect them from the shell.
for f in /DirA/DirA1/DirA11/*.txt /DirA2/DirA21/*.txt /DirA3/DIrA31/*.txt; do
    # do stuff with "$f"
done
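If you want to actually read each file that find locates (a sketch, assuming GNU or BSD find with -print0), pair it with a null-delimited read so names with spaces survive:
#!/bin/bash
# Process every .txt under the DirA* trees, safely for names containing spaces.
find /DirA* -name '*.txt' -print0 2>/dev/null |
while IFS= read -r -d '' f; do
    printf 'Processing %s\n' "$f"
    # do stuff with "$f", e.g. wc -l -- "$f"
done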

Resources