Curl upload with spaces in filenames - bash

I am trying to use cURL to upload files with spaces in their filenames onto a dedicated server. I am using bash. In a previous project, I just gave up, removing all spaces from filenames. This is not feasible for this project.
Running cURL in verbose mode suggests that it stops when it reads my local file path:
curl -X PUT -u $USER:$PASS --data-binary @"$LOCAL_FILE" "$SERVER/remote.php/dav/files/$USER/$REMOTE_DIR/$REMOTE_FILE"
where $LOCAL_FILE is a path to a file on my local machine (with spaces), and $REMOTE_FILE also has spaces.
This gives:
Warning: Couldn't read data from file "/Users/my_account/somepath/with
Warning: spaces
which implies the command is taking "/Users/my_account/somepath/with" and "spaces" as two separate paths.
How can I solve this?
My full code:
#!/bin/bash -ex
# Local dir - note space in path
IMAGES_DIR="/Users/my_account/somepath/with spaces"
# Remote server, and credentials
SERVER="http://myserver"
REMOTE_DIR="mydir"
USER="myname"
PASS="mypass"
FILE="$1"
LOCAL_FILE="$IMAGES_DIR/$FILE"
REMOTE_FILE=urlencode $FILE
# Move file to the server
curl -v -X PUT -u $USER:$PASS --data-binary @"$LOCAL_FILE" "$SERVER/remote.php/dav/files/$USER/$REMOTE_DIR/$REMOTE_FILE"
# Check that file has made it
echo 'Waiting for file to be on server'
until [ $result > 0 ]
do
    result=$(curl -I "$SERVER/remote.php/dav/files/$USER/$REMOTE_DIR/$REMOTE_FILE" -u $USER:$PASS 2>/dev/null | grep Content-Length | awk -F ': ' '{print $2}')
    echo "."
    sleep 2
done
echo "File $FILE is now on server."
urlencode() {
    # urlencode <string>
    old_lc_collate=$LC_COLLATE
    LC_COLLATE=C
    local length="${#1}"
    for (( i = 0; i < length; i++ )); do
        local c="${1:i:1}"
        case $c in
            [a-zA-Z0-9.~_-]) printf "$c" ;;
            *) printf '%%%02X' "'$c" ;;
        esac
    done
    LC_COLLATE=$old_lc_collate
}

In the Windows version of curl 7.55.0 (and presumably in higher versions too) the command below will work:
local Windows path: enclose the path in double or single quotes
server URL: enclose the path in double or single quotes, with each space replaced by %20
An example to upload te st.txt is given below:
curl -u username:password -T "./Test/te st.txt" "http://server-url/folder1/folder2/te%20st.txt"
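The same two ideas carry over to the bash script in the question: quote the local path after @ for --data-binary, and percent-encode the remote filename. A minimal sketch, assuming the question's variables and its urlencode function (note the command substitution, which the original REMOTE_FILE=urlencode $FILE line is missing):

REMOTE_FILE=$(urlencode "$FILE")
curl -X PUT -u "$USER:$PASS" --data-binary @"$LOCAL_FILE" \
    "$SERVER/remote.php/dav/files/$USER/$REMOTE_DIR/$REMOTE_FILE"

The double quotes keep the spaced local path as one word, and the %20-encoded name keeps the URL valid on the wire.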

Did you try changing the value of IFS in your script? Normally its value is set to $' \t\n', which includes the space character. Set it to nothing, like so: IFS=, and no field splitting is performed at all. You can get the default behavior back by executing unset IFS.
Example:
var="a b c d"
echo $var
a b c d
IFS=
echo $var
a b c d
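If you go this route in the question's script, save and restore the old value around the section that needs it. A sketch, with $url standing in for the full WebDAV URL from the question; note that double-quoting the expansions, as in the other answers, avoids needing this at all:

old_ifs=$IFS
IFS=                     # disable field splitting entirely
curl -X PUT -u $USER:$PASS --data-binary @$LOCAL_FILE "$url"
IFS=$old_ifs             # restore the previous behavior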

TL;DR: the problem still occurs on Windows with filenames containing illegal characters like | > < :
As @Jerin points out, curl on Windows (at least in versions > 7.55.0) will accept spaces in filenames.
However, with curl 7.57.0 (x86_64-w64-mingw32), I'm observing that
curl -X POST http://localhost:8080/ -H "Content-Type: application/json" --data-binary "@my_dir/2019-11-04>.json"
gives me
Warning: Couldn't read data from file "my_dir/2019-11-04>.json",
Warning: this makes an empty POST.
and curl indeed sends an empty POST instead of the file's contents.

Related

Reading .env with bash (vars with spaces)

I'm using nodedock.
It has a start.sh script to start your Docker containers:
#!/usr/bin/env bash
set -e
cd "$( dirname "${BASH_SOURCE[0]}" )"
if [ ! -f .env ]; then
    echo "Having .env is required. Maybe you forgot to copy env-example?"
    exit 1
fi
while read -r line; do
    VARNAME=$(echo ${line} | awk '{sub(/\=.*/,x)}1')
    if [[ -z ${!VARNAME} ]]; then
        declare -x ${line}
    fi
done < <(egrep -v "(^#|^\s|^$)" .env)
docker-compose up -d ${NODEDOCK_SERVICES}
docker-compose logs -t -f ${NODEDOCK_LOG_AFTER_START}
My .env contains:
NODEDOCK_SERVICES = nginx node workspace mongo
I found that if you need a variable with spaces, you have to write your env variable with double quotes: "nginx node workspace mongo".
The problem is that this regex expression VARNAME=$(echo ${line} | awk '{sub(/\=.*/,x)}1') doesn't work with double quotes.
Any solution?
The problem is not with your awk expression but with the call to the declare built-in. Use proper quotes when declaring it:
declare -x "${line}"
because without the quotes, your assignment would look like
declare -x NODEDOCK_SERVICES=nginx node workspace mongo
which splits on whitespace, so only the first word of the resulting string gets assigned to NODEDOCK_SERVICES. With proper quotes, the assignment remains intact, preserving the spaces in the resulting string.
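A quick demo of the difference, using an assumed sample line:

line='NODEDOCK_SERVICES=nginx node workspace mongo'
declare -x ${line}            # assigns only "nginx"; node, workspace, mongo become separate names
echo "$NODEDOCK_SERVICES"     # -> nginx
declare -x "${line}"          # one assignment, spaces preserved
echo "$NODEDOCK_SERVICES"     # -> nginx node workspace mongo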
That said, your whole loop can be simplified by making read parse each line with = as the delimiter, so you can easily extract the key/value pairs. At this point it is not clear whether the assignments in your file are of form 1 or 2 below:
NODEDOCK_SERVICES = nginx node workspace mongo
NODEDOCK_SERVICES=nginx node workspace mongo
The logic below works for both cases:
shopt -s extglob
while IFS== read -r key value; do
    key=${key%%+([[:space:]])}        # trim trailing whitespace from the key
    value=${value##+([[:space:]])}    # trim leading whitespace from the value
    if [[ -z ${!key} ]]; then
        declare -x "$key=$value"
    fi
done < <(egrep -v "(^#|^\s|^$)" .env)
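For example, with a hypothetical .env line in the spaced form, the parsing works out like this:

shopt -s extglob
line='NODEDOCK_SERVICES = nginx node workspace mongo'
IFS== read -r key value <<< "$line"
key=${key%%+([[:space:]])}          # "NODEDOCK_SERVICES " -> "NODEDOCK_SERVICES"
value=${value##+([[:space:]])}      # " nginx node ..." -> "nginx node ..."
declare -x "$key=$value"
echo "$NODEDOCK_SERVICES"           # -> nginx node workspace mongo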
As a good practice, always quote your variables in bash unless you see a good reason not to. Lower-casing user-defined variables also helps you distinguish them from the environment variables maintained by the shell itself.
If you want to read specific variables from a file in .env format (maybe not exactly your question, but it might help others, as your title might be misleading):
read_var() {
    # read_var <name> <file> -- print the value of <name> from <file>
    VAR=$(grep "^$1=" "$2" | xargs)    # grab the line, trimming whitespace via xargs
    IFS="=" read -ra VAR <<< "$VAR"    # split on = into an array (the IFS change is local to read)
    echo "${VAR[1]}"
}
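Hypothetical usage, assuming an .env file containing the line DB_HOST=localhost:

db_host=$(read_var DB_HOST .env)
echo "$db_host"    # -> localhost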

How to fix ambiguous redirect when executing a loop reading a text file?

I am getting the error "ambiguous redirect" when trying to execute a command over a list of files. I have already tested that the command works well when looping file by file:
for i in "file1.vcf" "file2.vcf"
do
grep -e "#" -e "PASS" /home/hpz440/Documents/P/example/input/$i > /home/hpz440/Documents//example/output/$i'_PASS'.vcf
echo $i
done
Now, as I have thousands of input files, I wanted to put the paths to all of them in a list:
for i in 'cat authomatic_test.txt'
do
    grep -e "#" -e "PASS" /home/hpz440/Documents/P/example/input/$i > /home/hpz440/Documents//example/output/$i'_PASS'.vcf
    echo $i
done
But then I get this error:
bash: /home/hpz440/Documents/example/output/$i'_PASS'.vcf: ambiguous redirect
My list is a txt file like this:
hpz440@yasminlima:~/Documents//example/input$ cat authomatic_test.txt
/home/hpz440/Documents/example/input/file1.vcf
/home/hpz440/Documents/example/input/file2.vcf
Could anyone shed some light on this?
Thank you!
for i in 'cat authomatic_test.txt'
# i='cat authomatic_test.txt'
... > /home/hpz440/Documents//example/output/$i'_PASS'.vcf
The variable i has spaces in it. Variable expansions with spaces are allowed in many contexts, but in the destination of a redirection they are ambiguous - should the space be part of the filename, or should it split the token into a filename and something else? Bash prints an ambiguous redirect error because it can't parse the destination. After shell expansion, the redirection becomes:
... > /home/hpz440/Documents//example/output/cat authomatic_test.txt'_PASS'.vcf
What you want is this:
while IFS= read -r line; do
    grep -e "#" -e "PASS" /home/hpz440/Documents/P/example/input/"$line" > /home/hpz440/Documents//example/output/"$line"_PASS.vcf
done < authomatic_test.txt
Remember to quote your variables properly, and to understand why the quotes matter.
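Since authomatic_test.txt holds full paths (as shown in the question), a sketch that strips the directory with basename may be closer to what's needed (output directory assumed from the question):

while IFS= read -r path; do
    name=$(basename "$path")    # /home/.../file1.vcf -> file1.vcf
    grep -e "#" -e "PASS" "$path" > /home/hpz440/Documents/example/output/"$name"_PASS.vcf
done < authomatic_test.txt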

(BASH) Passing a loop variable into command that will be stored in a variable

I am trying to iterate through a line-by-line list of file ID strings (testIDs.txt) and pass them to an AWS command. The command should utilize the loop variable, and the output of the command should be stored in the "folder" variable. Whenever I run the script, I get blank spaces as output in the terminal.
#!bin/bash
while read p; do
    folder=$(aws s3 ls s3://a-bucket/ --recursive | grep "${p}" | cut -c 32-)
    echo "${folder}"
done < testIDs.txt
The output of the AWS command should be two strings, and I have checked that this is true by running the AWS line separately in the terminal and using a string instead of ${p}.
Note: Right now, I simply want to print folder, but later I will pass folder to another loop.
You may want to use two loops: (1) read the contents of the bucket into a temp file (optional - you could search it directly), (2) loop through the IDs, and (3) for each ID, loop through each line of the bucket listing looking for that ID.
Example:
Read the folder structure into a local temp file, then for every ID check each line of that file and print the matches to a results file.
#!/bin/bash
TMP="/tmp/s3-log-${RANDOM}.dat"
RESULTS="/tmp/s3-log-results.txt"
ID_FILE="testIDs.txt"
> "${RESULTS}"                                    # truncate the results file
aws s3 ls s3://a-bucket/ --recursive > "${TMP}"
while read -r p; do
    while IFS= read -r line; do
        if [[ $line = *"${p}"* ]]; then           # keep only lines that contain the ID
            echo "id: ${p} => $(echo ${line} | cut -c 32-)" | tee -a "${RESULTS}"
        fi
    done < "${TMP}"
done < "${ID_FILE}"
This is probably what you're looking for; use the while loop if you don't know how many values you need to loop through:
while read -r -d $'\t' resourceName; do
    echo "$resourceName"
done <<< "$(aws transfer list-servers --query='Servers[*].ServerId' --output text)"
In this case it's the -d $'\t' that tells read to split on tabs, matching the tab-separated list of resource IDs that the AWS CLI prints with --output text.
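If the bucket listing is large, the AWS CLI can also do the filtering itself via a JMESPath query. A sketch, assuming the same bucket and ID file as the question (s3api returns JSON, which --query can filter):

while read -r p; do
    folder=$(aws s3api list-objects-v2 --bucket a-bucket \
        --query "Contents[?contains(Key, '${p}')].Key" --output text)
    echo "${folder}"
done < testIDs.txt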

How can I pass the filename from a variable locally into ssh? [duplicate]

When I stumble across an evil web site that I want blocked from corporate access, I edit my named.conf file on my bind server and then update my proxy server blacklist file. I'd like to automate this somewhat with a bash script. Say my script is called "evil-site-block.sh" and contains the following:
ssh root@192.168.0.1 'echo "#date added $(date +%m/%d/%Y)" >> /var/named/chroot/etc/named.conf; echo "zone \"$1\" { type master; file \"/etc/zone/dummy-block\"; };" >> /var/named/chroot/etc/named.conf'
It is then run as
$ evil-site-block.sh google.com
When I look at the contents of named.conf on the remote machine I see:
#date added 09/16/2014
zone "" { type master; file "/etc/zone/dummy-block"; };
What I can't figure out is how to pass "google.com" as $1.
First off, you don't want this to be two separately redirected echo statements -- doing that is both inefficient and means that the lines could end up not next to each other if something else is appending at the same time.
Second, and much more importantly, you don't want the remote command that's run to be able to escape its quotes and run arbitrary commands on your server (think of what happens if $1 is '$(rm -rf /)'.spammer.com).
Instead, consider:
#!/bin/bash
# ^ above is mandatory, since we use features not found in #!/bin/sh
printf -v new_contents \
    '# date added %s\nzone "%s" { type master; file "/etc/zone/dummy-block"; };\n' \
    "$(date +%m/%d/%Y)" \
    "$1"
printf -v remote_command \
    'echo %q >>/var/named/chroot/etc/named.conf' \
    "$new_contents"
ssh root@192.168.0.1 bash <<<"$remote_command"
printf %q escapes data such that an evaluation pass in another bash shell will evaluate that content back to itself. Thus, the remote shell will be guaranteed (so long as it's bash) to interpret the content correctly, even if the content attempts to escape its surrounding quotes.
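For instance, with an illustrative hostile argument:

$ printf '%q\n' 'google.com; rm -rf /'
google.com\;\ rm\ -rf\ /

The escaped form survives the remote shell's evaluation pass as plain data, not as commands.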
Your problem: Your entire command is put into single quotes – obviously so that bash expressions are expanded on the server and not locally.
But this also applies to your $1.
Simple solution: "interrupt" the quotation by wrapping your local variable in single quotes.
ssh root@192.168.0.1 'echo "#date added $(date +%m/%d/%Y)" >> /var/named/chroot/etc/named.conf; echo "zone \"'$1'\" { type master; file \"/etc/zone/dummy-block\"; };" >> /var/named/chroot/etc/named.conf'
NB: \"$1\" → \"'$1'\".
NOTE: This solution is a simple fix for the one-liner as posted in the question above. If there's the slightest chance that this script is executed by other people, or it could process external output of any kind, please have a look at Charles Duffy's solution.
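With either fix in place, running the original example should append the expected lines on the server (date illustrative):

$ ./evil-site-block.sh google.com
# /var/named/chroot/etc/named.conf now ends with:
#date added 09/16/2014
zone "google.com" { type master; file "/etc/zone/dummy-block"; };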

bash - assign variable to curl get request

#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    curl 'https://shoesworkshop.net/libraries/ajax/ajax.invoice.php?act=viewallinvoice&invoiceid="${line}"&sEcho=1&iColumns=8&iDisplayStart=0&iDisplayLength=20&bRegex=false&bRegex_0=false&bSearchable_0=true&bRegex_1=false&bSearchable_1=true&bRegex_2=false&bSearchable_2=true&bRegex_3=false&bSearchable_3=true&bRegex_4=false&bSearchable_4=true&bRegex_5=false&bSearchable_5=true&bRegex_6=false&bSearchable_6=true&bRegex_7=false&bSearchable_7=true&iSortCol_0=0&sSortDir_0=asc&iSortingCols=1&bSortable_0=true&bSortable_1=true&bSortable_2=true&bSortable_3=true&bSortable_4=true&bSortable_5=true&bSortable_6=true&bSortable_7=true' -H 'Host: shoesworkshop.net'| sed 's/^[^[[]*:/:/'
done < "$1"
Inside $line there are values like this:
AAAAA
SSSSS
DDDDD
and I want to pass $line into the curl command.
Can someone help me with how?
I tried "'${line}'" and '${line}' and it still doesn't work.
I want to make repeated curl GET requests to the URL, using the value from $line each time.
For simple URLs, one way is to just use double quotes for the complete URL, including your variable expansion, ${line}, like this:
curl "https://shoe...&invoiceid=${line}&sEcho=1&iCo...table_7=true"
(Under single quotes, your shell variable line is not expanded.)
If your URL contains shell-special characters like $, it's best to combine both single and double quotes (concatenating several strings, as explained here). For example:
curl 'https://shoe...&invoiceid='"$line"'&sEcho=1&iCo...table_7=true'
# ^------ fixed part ------^ ^var^ ^------- fixed part ------^
However, if your variable contains characters that have to be URL-encoded (like space, &, ?, etc.), it's best to let curl handle that with the --data-urlencode option. When called with this option, curl defaults to the POST method, but you can override this with -G, in which case your parameters are appended to the URL query string. For example:
line="1&2?3 4"
curl "http://httpbin.org/get?x=1&y=2" --data-urlencode z="$line" -G
produces the right URL:
http://httpbin.org/get?x=1&y=2&z=1%262%3F3%204
Your script, fixed:
#!/bin/bash
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
    curl --data-urlencode invoiceid="$line" -G 'https://shoesworkshop.net/libraries/ajax/ajax.invoice.php?act=viewallinvoice&sEcho=1&iColumns=8&iDisplayStart=0&iDisplayLength=20&bRegex=false&bRegex_0=false&bSearchable_0=true&bRegex_1=false&bSearchable_1=true&bRegex_2=false&bSearchable_2=true&bRegex_3=false&bSearchable_3=true&bRegex_4=false&bSearchable_4=true&bRegex_5=false&bSearchable_5=true&bRegex_6=false&bSearchable_6=true&bRegex_7=false&bSearchable_7=true&iSortCol_0=0&sSortDir_0=asc&iSortingCols=1&bSortable_0=true&bSortable_1=true&bSortable_2=true&bSortable_3=true&bSortable_4=true&bSortable_5=true&bSortable_6=true&bSortable_7=true' -H 'Host: shoesworkshop.net' | sed 's/^[^[[]*:/:/'
done < "$1"
