Bash Script CURL Request a File Getting "Failed to open/read local data from file/application"

I have a bash script that loops through all files in a directory and makes a curl request to some URL:
for FILE in ./files/*;
do
echo "Making a request..."
echo $FILE
curl --location --request POST "${URL}" \
--form 'file=@"${FILE}"' \
sleep 100
done
echo "done!"
The curl request was copied from Postman, so I'm confident that it works.
When I run the script, I get the following:
Making a request...
./files/split1.csv
curl: (26) Failed to open/read local data from file/application
The issue seems to be how to handle the string interpolation here.

You are very close; just remove the single quotes to get the string interpolation to work.
curl --location --request POST "${URL}" \
--form file=@"${FILE}"
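For reference, the whole loop with that fix applied would look something like this (a sketch; it assumes URL is set earlier in the script):
#!/bin/bash
for FILE in ./files/*; do
echo "Making a request..."
echo "$FILE"
# unquoted --form value so ${FILE} expands; the @ tells curl
# to read the form part's contents from that file
curl --location --request POST "${URL}" \
--form file=@"${FILE}"
sleep 100
done
echo "done!"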

Related

Command not found while executing curl in shell script

I am executing the below curl command in a shell script, but I am getting a "command not found" error, so I wanted to check whether I am making any mistake in how I execute it.
return = $(curl -s --location --request GET --url 'https://testurl.mywebsite.com/api/accts?Id=trst_id&var=test_var' --header 'type: app/test')
echo "data is: ${return}"
I am getting the error:
return: command not found
Kindly let me know if there is a problem with the way I am trying to execute the curl command.
Thank you.
As dan pointed out in the comments, the extra whitespace around the = (between the variable name and the command substitution) caused the issue. After removing the extra whitespace, it worked.
return=$(curl -s --location --request GET --url 'https://testurl.mywebsite.com/api/accts?Id=trst_id&var=test_var' --header 'type: app/test')
echo "data is: ${return}"

Bash script to loop through remote directory and pipe files 1 at a time to CURL

I am trying to transfer all files residing in a specified directory on Server1 to Server3 via a script running on Server2.
The transfer to Server3 has to happen through an API and thus must use the following CURL call:
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @$f
If it is just 1 file, I can do it successfully, but I'm trying to iterate through the directory on Server1 and send each file directly to the CURL call. So far I've got:
files="( $(ssh me#server1 ls dir/*) )"
while read f
do
name=$(basename ${f})
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @$f
done <<< "$files"
The loop seems to be reading the "(" from the array of files into the 1st file name, which obviously causes a problem. I can't get beyond that to be able to tell if POSTING the current file in the loop via --data-binary will actually do what I think (or am hoping) it will.
Any ideas?
The error in the original message was enclosing the ssh command in "()". I am working on a similar issue. In the past I've used rsync, but I want a solution that doesn't require installing extra software. Here is an example I'm working with to move files off a Node.js dev server for backup, running in Bash on Debian:
files=$(ssh chris@estack ls ~/tmp/gateway)
#echo $files
for FILE in $files
do
if [[ "$FILE" = "node_modules" || "$FILE" = ".git" ]]
then
echo "skip $FILE";
continue
fi
echo Copy ~/tmp/gateway/$FILE
#scp -Cpr chris@estack:~/tmp/gateway/$FILE ~/tmp/tmp
done
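Building on that, one way to sidestep the "(" problem entirely is to stream each remote file straight into curl. This is only a sketch (untested), reusing the question's token and paths, and it assumes key-based ssh access to Server1:
# list remote files one per line, then pipe each through ssh into curl;
# --data-binary @- makes curl read the request body from stdin
ssh me@server1 'ls -1 dir/' | while read -r name; do
ssh me@server1 "cat dir/$name" |
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @-
done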

Escaping multiple layers of mixed quotes for a curl command executed inside a bash script

I have the following bash script that uses its arguments to hit a RESTful web service (via curl) and prints out both the curl request made as well as the response:
#! /bin/bash
# arguments:
# $1 - username
# $2 - password
#
# outputs:
# if the script exits with a non-zero status then something went wrong
# verify that we have all 6 required arguments and fail otherwise
if [ "$#" -ne 2 ]; then
echo "Required arguments not provided"
exit 1
fi
# set the script arguments to meaningful variable names
username=$1
password=$2
# login and fetch a valid auth token
req='curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam'
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam)
# echo the request for troubleshooting
echo "req = $req"
if [ -z "$resp" ]; then
echo "Login failed; unable to parse response"
exit 1
fi
echo "resp = $resp"
When I run this I get:
$ sh myscript.sh myUser 12345@45678
curl: (3) Port number ended with '"'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0curl: (6) Could not resolve host: 12345@45678"
100 1107 100 1093 100 14 2849 36 --:--:-- --:--:-- --:--:-- 2849
req = curl -k -i -H "Content-Type: application/json" -X POST -d {"username":"$username","password":"$password"} https://somerepo.example.com/flimflam
resp = HTTP/1.1 400 Bad Request...(rest omitted for brevity)
Obviously, I'm not escaping the various layers of single- and double-quotes inside the curl statement correctly, as is indicated by outputs like:
curl: (6) Could not resolve host: 12345@45678"
and:
req = curl -k -i -H "Content-Type: application/json" -X POST -d {"username":"$username","password":"$password"} https://somerepo.example.com/flimflam
where the username/password variables are not parsing.
In reality my script takes a lot more than 2 arguments, which is why I'm changing them to have meaningful variable names (such as $username instead of $1) so it's more understandable and readable.
Can anyone spot where I'm going awry? Thanks in advance!
Update
I tried the suggestion, which turns the req into:
curl -k -i -H 'Content-Type: application/json' -X POST -d "{'username':'myUser','password':'12345@45678'}" https://somerepo.example.com/flimflam
However, this is still an illegal curl command; it instead needs to be:
curl -k -i -H 'Content-Type: application/json' -X POST -d '{"username":"myUser","password":"12345@45678"}' https://somerepo.example.com/flimflam
First, as I said in a comment, storing commands in variables just doesn't work right. Variables are for data, not executable code. Second, you have two levels of quoting here: quotes that're part of the shell syntax (which are parsed, applied, and removed by the shell before the arguments are passed to curl), and quotes that're part of the JSON syntax.
But the second problem is actually worse than that, because simply embedding an arbitrary string into some JSON may result in JSON syntax errors if the string contains characters that're part of JSON syntax. Which passwords are likely to do. To get the password (and username for that matter) embedded correctly in your JSON, use a tool that understands JSON syntax, like jq:
userinfo=$(jq -n -c --arg u "$username" --arg p "$password" '{"username":$u,"password":$p}')
Explanation: this uses --arg to set the jq variables u and p to the shell variables $username and $password respectively (and the double-quotes around the shell variables will keep the shell from doing anything silly to the values), and creates a JSON snippet with them embedded. jq will automatically add appropriate quoting/escaping/whatever is needed.
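For example, running that snippet by hand with a hypothetical password containing a double quote shows jq doing the escaping for you:
$ jq -n -c --arg u 'myUser' --arg p 'pa"55@word' '{"username":$u,"password":$p}'
{"username":"myUser","password":"pa\"55@word"}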
Then, to use it with curl, use something like this:
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d "$userinfo" https://somerepo.example.com/flimflam)
Again, the double-quotes around $userinfo keep the shell from doing anything silly. You should almost always put double-quotes around variable references in the shell.
Note that I never used the req variable to store the command. If you need to print the command (or its equivalent), use something like this:
printf '%q ' curl -k -i -H "Content-Type: application/json" -X POST -d "$userinfo" https://somerepo.example.com/flimflam
echo
The %q format specifier tells the shell to add appropriate quoting/escaping so that you could run the result as a shell command, and it'd work properly. (And the echo is there because printf doesn't automatically add a newline at the end of its output.)
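A quick illustration of what %q produces (the exact escaping can vary a bit between bash versions, so treat this output as approximate):
$ printf '%q ' curl -d '{"user":"bob"}' https://example.com; echo
curl -d \{\"user\":\"bob\"\} https://example.com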
try changing this:
req='curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam'
to this
req="curl -k -i -H 'Content-Type: application/json' -X POST -d \"{'username':'$username','password':'$password'}\" https://somerepo.example.com/flimflam"
and similarly for the resp
ah those pesky "curly" thingies...
how 'bout...
req="curl -k -i -H 'Content-Type: application/json' -X POST -d '{\"username\":\"$username\",\"password\":\"$password\"}' https://somerepo.example.com/flimflam"
This needs even more escaping. Replace the resp assignment with:
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d "{\"username\":\"$username\",\"password\":\"$password\"}" https://somerepo.example.com/flimflam)
In bash, the variables are still expanded when they're inside single quotes that are inside double quotes.
And you'll need the \" double quotes in the payload as per the JSON definition.
EDIT: I re-ran the curl through an HTTP proxy and corrected the script line (see above; removed the single quotes). The results (in raw HTTP) are now:
POST /flimflam HTTP/1.1
Host: somerepo.example.com
User-Agent: curl/7.68.0
Accept: */*
Content-Type: application/json
Content-Length: 44
Connection: close
{"username":"user","password":"12345#abcde"}
(which should be fine)
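As an aside, the rule above (variables still expand inside single quotes that sit inside double quotes) is easy to verify at a prompt:
$ name=world
$ echo "hello '$name'"
hello 'world'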

curl PUT using auth token header to mesosphere fails without eval

EDIT:
I have managed to make it work with:
response=$(
curl -k -X PUT -d "$marathon_payload" --write-out %{http_code} --silent --output "$tmp" \
-H "Authorization: token=$dcos_token" -H "$header_content_type" $app_id_url
)
The single quotes were causing the problem. It took a few gyrations but all good.
MORAL: quotes inside the value don't matter if the value is properly quoted, UNLESS you eval the whole thing, and I should have known that. Occam's razor wins again.
end edit
I am initiating Mesosphere microservice deployments with curl, but it won't succeed without using eval. Since I recently inherited this code I've been trying to scrub the eval out of it just as a matter of habit, but it's thwarting me.
The script initiates the deployment with
response=$(
eval curl -k -X PUT -d "'$marathon_payload'" --write-out %{http_code} --silent --output $tmp \
-H "'Authorization: token=$dcos_token'" -H "'$header_content_type'" $app_id_url
)
If it gets a 200 or a 201, it loops a curl to effectively screen-scrape the deployments page till the request disappears.
chkDeploy() { rm -f $tmp;
eval curl -k -X GET --silent --write-out %{http_code} --silent --output $tmp \
-H "'Authorization: token=$dcos_token'" -H "'$header_content_type'" $deployments_url
}
response=$( chkDeploy )
$dcos_token is a base64 encoded string.
It then checks the service with another curl loop to the info page so it can verify the version number. This one is working fine with no eval.
chkCode() {
curl -k -X GET --write-out %{http_code} --silent --output $tmp $info_url;
}
response=$( chkCode )
The first two return 401, authentication failure.
I'm guessing the auth token quoting is off.
There's no reason to use eval here; you just need to quote the arguments to -H properly.
response=$(
curl -k -X PUT -d "$marathon_payload" \
--write-out %{http_code} \
--silent --output "$tmp" \
-H "Authorization: token=$dcos_token" \
-H "$header_content_type" "$app_id_url"
)
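To see why the inner quotes were needed under eval but break the plain version, compare what the shell actually passes as the -H argument (a toy token, used only for illustration):
dcos_token=abc123
printf '[%s]\n' "'Authorization: token=$dcos_token'"
# ['Authorization: token=abc123']  <- literal quotes sent; server returns 401
printf '[%s]\n' "Authorization: token=$dcos_token"
# [Authorization: token=abc123]    <- the header curl should receive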

How to verify a curl request in bash script?

I have a curl request like this :
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url
Which returns a json as response. So I will parse the response using jq to get some specific data. Like this :
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url | jq '<expression>'
Now if the curl request fails, the parsing operation obviously throws an ugly error, which I want to avoid. How do I store the response first and then parse it later, only if the request was successful? I don't want to display the whole JSON response. Also, if I add -w "%{http_code}" to my request, it appends the status code to the JSON response, which messes up the parsing. How do I solve this? I basically want to first check whether the curl request succeeded, then get the JSON response and parse it. I also want the status code so that I can display it if the request fails, but right now the status code gets mixed in with the JSON response.
You can combine the --write and --fail options:
# separating the (verbose) curl options into an array for readability
curl_args=(
--write "%{http_code}\n"
--fail
--silent
--user "$user:$password"
--request GET
--header "Content-Type: application/json"
)
if ! output=$(curl "${curl_args[@]}" "$url"); then
echo "Failure: code=$output"
else
# remove the "http_code" line from the end of the output, and parse it
sed '$d' <<<"$output" | jq '...'
fi
Also note: quote your variables!
I found glenn jackman's answer good, but a bit confusingly written, so I rewrote it, and altered it so I can use it as a safer alternative to curl | jq.
#!/bin/bash
# call this with normal curl arguments, especially url argument, e.g.
# safecurl.sh "http://example.com:8080/something/"
# separating the (verbose) curl options into an array for readability
curl_args=(
-H 'Accept:application/json'
-H 'Content-Type:application/json'
--write-out '\n%{http_code}\n'
--fail
--silent
)
echo "${curl_args[#]}"
# prepend some arguments, but pass on whatever arguments this script was called with
output=$(curl "${curl_args[@]}" "$@")
return_code=$?
if [ 0 -eq $return_code ]; then
# remove the "http_code" line from the end of the output, and parse it
echo "$output" | sed '$d' | jq .
else
# echo to stderr so further piping to jq will process empty output
>&2 echo "Failure: code=$output"
fi
Note: This code does not test for services that ignore the requested content type and respond with HTML. You'd need to test the output for '</html>' (e.g. with grep) for that.
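A sketch of such a guard, added to the success branch (the pattern is hypothetical; adjust it for your service):
if grep -qi '</html>' <<<"$output"; then
>&2 echo "Failure: got an HTML page instead of JSON"
exit 1
fi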
