Bash script to loop through remote directory and pipe files 1 at a time to CURL - bash

I am trying to transfer all files residing in a specified directory on Server1 to Server3 via a script running on Server2.
The transfer to Server3 has to happen through an API and thus must use the following CURL call:
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @$f
If it is just 1 file, I can do it successfully, but i'm trying to iterate through the directory on Server1 and send the file directly to the CURL call. So far I've got:
files="( $(ssh me#server1 ls dir/*) )"
while read f
do
name=$(basename ${f})
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @$f
done <<< "$files"
The loop seems to be reading the "(" from the array of files into the 1st file name, which obviously causes a problem. I can't get beyond that to be able to tell if POSTING the current file in the loop via --data-binary will actually do what I think (or am hoping) it will.
Any ideas?

The error in the original message was enclosing the ssh command with "()". I am working on a similar issue. In the past I've used rsync, but I want a solution that doesn't require installing extra software. Here is an example I'm working with to move files off a Node.js dev server for backup, running in Bash on Debian:
files=$(ssh chris@estack ls ~/tmp/gateway)
#echo $files
for FILE in $files
do
if [[ "$FILE" = "node_modules" || "$FILE" = ".git" ]]
then
echo "skip $FILE";
continue
fi
echo Copy ~/tmp/gateway/$FILE
#scp -Cpr chris@estack:~/tmp/gateway/$FILE ~/tmp/tmp
done
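Coming back to the original Dropbox loop, here is a minimal sketch of streaming each remote file straight into curl without landing it on Server2 (it assumes key-based ssh from Server2 to Server1, the Dropbox token already in $token, and filenames without spaces):
ssh me@server1 'ls dir/' | while read -r name
do
  # ssh -n keeps the inner ssh from swallowing the file list on the loop's stdin;
  # @- makes curl read the request body from its stdin, i.e. the piped file contents
  ssh -n me@server1 "cat dir/$name" | curl -X POST https://content.dropboxapi.com/2/files/upload \
    --header "Authorization: Bearer $token" \
    --header "Dropbox-API-Arg: {\"path\": \"/xfer/$name\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
    --header "Content-Type: application/octet-stream" \
    --data-binary @-
done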

Related

How to use Bash command line to curl an API with token and payload as parameters

I am a novice and a first-timer with Bash. I am trying to run Bash from the command line to invoke an API, passing a token and a payload that are received from two different APIs and set as parameters. Below is my command. I am trying to add this Bash script to a task in an Azure Batch service job.
It has 3 curl requests:
The first one (line 1 in the code snippet below) gets the payload by calling an API. This is working fine; I am able to verify the payload using the echo statement following the first curl command.
The second one (line 3 in the code snippet below) gets the token by calling the token provider. This is working fine as well, verified using the echo statement.
The third one (line 5 in the code snippet below) is the problematic command. I am trying to pass it the token and payload received from the two commands above, but curl is not able to resolve them; neither token nor payload resolves to its value.
My Bash command:
/bin/bash -c
"payload=$(curl --location --request GET 'http://url/OutreachData')
&& echo -e \"The value of payload is: "'$payload'"\"
&& token=$(curl --location --request POST 'https://login.microsoftonline.com/<<tenantId>>/oauth2/v2.0/token' --header 'Content-Type: application/x-www-form-urlencoded' --data-urlencode 'client_id=<<clientId>>' --data-urlencode 'scope=api://<<applicationId>>/.default' --data-urlencode 'client_secret=<<clientSecret>>' --data-urlencode 'grant_type=client_credentials' --data-urlencode 'Audience=api://<<applicationId>>'|jq -j '.access_token')
&& echo -e \"value of token is "'$token'"\n\"
&& result=$(curl --location --request POST 'https://url/api/<<Resource>>' --header 'accept: */*' --header 'Content-Type: application/json' --header 'Authorization: Bearer '"'$token'" --data-raw "'$payload'")
&& echo -e \"Result is "'$result'"\""
This is what the third curl resolves to; payload and token are not being replaced, as we can see in the Authorization header and --data-raw elements:
++ curl --location --request POST https://url/api/ --header 'accept: /' --header 'Content-Type: application/json' --header 'Authorization: Bearer ' --data-raw ''''''''
There should be no need to explicitly run this with bash -c unless you are in a very constrained environment where you simply cannot run Bash by any other means.
The immediate problem is that code like
bash -c "echo "'$token'" && true"
ends up with $token being single-quoted in the shell which you run bash -c from. But the blazingly obvious fix is to not have this complex quoting in the first place.
payload=$(curl --location --request GET 'http://url/OutreachData')
echo "The value of payload is: '$payload'"
token=$(curl --location --request POST \
'https://login.microsoftonline.com/<<tenantId>>/oauth2/v2.0/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'client_id=<<clientId>>' \
--data-urlencode 'scope=api://<<applicationId>>/.default' \
--data-urlencode 'client_secret=<<clientSecret>>' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'Audience=api://<<applicationId>>' |
jq -j '.access_token')
echo "value of token is "'$token'"
result=$(curl --location --request POST \
'https://url/api/<<Resource>>' --header 'accept: */*' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer '"$token" \
--data-raw "$payload")
echo "Result is "'$result'"
If your current shell is not Bash and you need these commands to be run in Bash, a simpler workaround is to put the script in a here document, which drastically simplifies the quoting needs (or if this is in an interactive session, just run bash and run these commands at the interactive Bash prompt, then exit when you no longer want to be in Bash).
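For example, a minimal sketch of the here-document variant (same placeholder URLs as above; the quoted EOF delimiter keeps the outer shell from expanding anything, so every $ is resolved by the Bash that reads the script):
bash <<'EOF'
# nothing inside this here document is expanded by the outer shell;
# payload, token and result are all handled by the inner Bash
payload=$(curl --location --request GET 'http://url/OutreachData')
echo "The value of payload is: '$payload'"
# ... the token and result curl commands from above go here, unchanged ...
EOF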

Using Bash to make a POST request

I have 100 Jetpacks that I have to sign in to and configure. I am trying to do it in a Bash script but I am having no luck. I can connect to the wifi with no problem, but my POST requests are not achieving anything. Any advice? Here is a link to my GitHub; I have copies of what I captured in Burp Suite: https://github.com/Jdelgado89/post_Script
TYIA
#!/bin/bash
nmcli device wifi rescan
nmcli device wifi list
echo "What's they last four?"
read last4
echo "What's the Key?"
read key
nmcli device wifi connect Ellipsis\ \Jetpack\ $last4 password $key
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
echo "{"CurrentPassword":"$key","NewPassword":"G1l4River4dm1n","SecurityQuestion":"NameOfStreet","SecurityAnswer":"Allison"}" > change_admin.json
echo "{"SSID":"GRTI Jetpack","WiFiPassword":"G1l4River3r","WiFiMode":0,"WiFiAuthentication":6,"WiFiEncription":4,"WiFiChannel":0,"MaxConnectedDevice":8,"PrivacySeparator":false,"WMM":true,"Command":"SetWifiSetting"}" > wifi.json
cat sign_on.json
cat change_admin.json
cat wifi.json
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #sign_on.json http://192.168.1.1/cgi-bin/sign_in.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #change_admin.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #wifi.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
This is not correct:
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
The double quotes are not being put literally into the file, they're just terminating the shell string beginning with the previous double quote. So this is writing
{Command:SignIn,Password:keyvalue}
into the file, with no double quotes. You need to escape the nested double quotes.
echo "{\"Command\":\"SignIn\",\"Password\":\"$key\"}" > sign_on.json
However, it would be best if you used the jq utility instead of formatting JSON by hand. See Create JSON file using jq.
jq -nc --arg key "$key" '{"Command":"SignIn","Password":$key}' >sign_on.json
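The same approach scales to the other two files; for example, a sketch of change_admin.json built with jq (field names and values copied from the script above):
jq -nc --arg key "$key" '{
  "CurrentPassword": $key,
  "NewPassword": "G1l4River4dm1n",
  "SecurityQuestion": "NameOfStreet",
  "SecurityAnswer": "Allison"
}' > change_admin.json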

file transfer from 1 remote server to another remote server without downloading file

I am trying to write a Bash script on my server (My Server) that will grab a file from one remote server (Source) and copy it to a Dropbox account (Destination). I need to get the file from Source via SFTP and will be copying it to Destination using the Dropbox API (HTTPS). So far I can get the file with:
curl -k "sftp://Source/file.txt" --user "me:mypasswd" -o "/test/file.txt" --ftp-create-dirs
and then copy it to Dropbox with
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer " \
--header "Dropbox-API-Arg: {\"path\": \"/path/to/file.txt\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @/test/file.txt
I'm guessing the "right" way to do this is to pipe the file from Source directly to Destination, but I'm just not sure how to go about putting them together.
This is definitely not my area of expertise, so I don't even know where to start - nested CURL calls? If anyone could point me in the right direction, I'd be most appreciative.
UPDATE
Here's the whole curl command I'm running:
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer $token" \
--header "Dropbox-API-Arg: {\"path\": \"/xfer/chef.txt\",\"mode\": \"add\",\"autorename\": true,\"mute\": false,\"strict_conflict\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary "$(curl -k "http://marketing.dave0112.com/file.txt" --user "me:mypasswd")"
I was having issues with curl not supporting SFTP, so I changed to HTTP while I get that end sorted out. Not sure if that affects anything.
You can replace this line :
--data-binary @/test/file.txt
with
--data-binary @<(curl -k "sftp://Source/file.txt" --user "me:mypasswd")
If that gives you problems, try:
--data-binary "$(curl -k "sftp://Source/file.txt" --user "me:mypasswd")"

curl PUT using auth token header to mesosphere fails without eval

EDIT:
I have managed to make it work with
response=$(
curl -k -X PUT -d "$marathon_payload" --write-out %{http_code} --silent --output "$tmp"\
-H "Authorization: token=$dcos_token" -H "$header_content_type" $app_id_url
)
The single quotes were causing the problem. It took a few gyrations but all good.
MORAL: quotes inside the value don't matter if the value is properly quoted UNLESS you eval the whole thing, and I should have known that. Occam's wins again.
end edit
I am initiating Mesosphere microservice deployments with curl, but it won't succeed without using eval. Since I recently inherited this code I've been trying to scrub the eval out of it just as a matter of habit, but it's thwarting me.
The script initiates the deployment with
response=$(
eval curl -k -X PUT -d "'$marathon_payload'" --write-out %{http_code} --silent --output $tmp\
-H "'Authorization: token=$dcos_token'" -H "'$header_content_type'" $app_id_url
)
If it gets a 200 or a 201, it loops a curl to effectively screen-scrape the deployments page till the request disappears.
chkDeploy() { rm -f $tmp;
eval curl -k -X GET --silent --write-out %{http_code} --silent --output $tmp\
-H "'Authorization: token=$dcos_token'" -H "'$header_content_type'" $deployments_url
}
response=$( chkDeploy )
$dcos_token is a base64 encoded string.
It then checks the service with another curl loop to the info page so it can verify the version number. This one is working fine with no eval.
chkCode() {
curl -k -X GET --write-out %{http_code} --silent --output $tmp $info_url;
}
response=$( chkCode )
The first two return 401, authentication failure.
I'm guessing the auth token quoting is off.
There's no reason to use eval here; you just need to quote the arguments to -H properly.
response=$(
curl -k -X PUT -d "$marathon_payload" \
--write-out %{http_code} \
--silent --output "$tmp" \
-H "Authorization: token=$dcos_token" \
-H "$header_content_type" "$app_id_url"
)
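If the option list keeps growing, a Bash array keeps the quoting manageable without eval; a sketch using the same variable names as the script above:
# each array element is passed to curl as exactly one argument,
# so spaces inside the values survive without any extra quoting gymnastics
curl_opts=(
  -k -X PUT -d "$marathon_payload"
  --write-out '%{http_code}' --silent --output "$tmp"
  -H "Authorization: token=$dcos_token"
  -H "$header_content_type"
)
response=$(curl "${curl_opts[@]}" "$app_id_url")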

Bash script to backup data to google drive [duplicate]

This question already has answers here:
Upload and update files in google drive via cmd [closed]
(3 answers)
Closed 6 years ago.
I want to create a bash script that will log into my Google Drive account and back up my main folder. I've done a bit of research and I know we need some sort of OAuth to use the Google Drive API, but I'm still fairly new, so I thought I could get some solid guidance from the Stack community.
I guess you don't need to reinvent the wheel; use gdrive and you're good to go.
SETUP:
wget -O drive "https://drive.google.com/uc?id=0B3X9GlR6EmbnMHBMVWtKaEZXdDg"
mv drive /usr/sbin/drive
chmod 755 /usr/sbin/drive
Now, simply run drive to start the authentication process. You'll get a link like this to copy and open in your web browser:
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?client_id=123456789123-7n0vf5akeru7on6o2fjinrecpdoe99eg.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&state=state
Authenticate and grant the application permission to access your Google Drive, and you'll then be given a verification code to copy and paste back into your shell:
Enter verification code: 4/9gKYAFAJ326XIP6JJHAEhs342t35LPiA5QGW0935GHWHy9
USAGE:
drive upload --file "/some/path/somefile.ext"
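To back up a whole folder with the single upload flag shown above, one option is to archive the folder first and upload the archive; a sketch (the archive path and name are just placeholders):
# pack the home folder into one dated archive, then hand it to gdrive
backup="/tmp/home-backup-$(date +%F).tar.gz"
tar czf "$backup" -C "$HOME" .
drive upload --file "$backup"
rm -f "$backup"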
SRC:
Backing up a Directory to Google Drive on CentOS 7
NOTE:
If you really want to build a bash script yourself, take a look at this gist, which does the following:
automatically gleans MIME type from file
uploads multiple files
removes directory prefix from filename
works with filenames with spaces
uses dotfile for configuration and token
configures interactively
uploads to target folder if last argument looks like a folder id
produces quieter output
uses longer command line flags for readability
can be throttled by adding curl_args="--limit-rate 500K" to $HOME/.gdrive.conf
#!/bin/bash
# based on https://gist.github.com/deanet/3427090
#
# useful $HOME/.gdrive.conf options:
# curl_args="--limit-rate 500K --progress-bar"
browser="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36"
destination_folder_id=${@: -1}
if expr "$destination_folder_id" : '^[A-Za-z0-9]\{28\}$' > /dev/null
then
# all but last word
set -- "${#:0:$#}"
else
# upload to root
unset destination_folder_id
fi
if [ -e $HOME/.gdrive.conf ]
then
. $HOME/.gdrive.conf
fi
old_umask=`umask`
umask 0077
if [ -z "$username" ]
then
read -p "username: " username
unset token
echo "username=$username" >> $HOME/.gdrive.conf
fi
if [ -z "$account_type" ]
then
if expr "$username" : '^[^#]*$' > /dev/null || expr "$username" : '.*#gmail.com$' > /dev/null
then
account_type=GOOGLE
else
account_type=HOSTED
fi
fi
if [ -z "$password$token" ]
then
read -s -p "password: " password
unset token
echo
fi
if [ -z "$token" ]
then
token=`curl --silent --data-urlencode Email=$username --data-urlencode Passwd="$password" --data accountType=$account_type --data service=writely --data source=cURL "https://www.google.com/accounts/ClientLogin" | sed -ne s/Auth=//p`
sed -ie '/^token=/d' $HOME/.gdrive.conf
echo "token=$token" >> $HOME/.gdrive.conf
fi
umask $old_umask
for file in "$#"
do
slug=`basename "$file"`
mime_type=`file --brief --mime-type "$file"`
upload_link=`curl --silent --show-error --insecure --request POST --header "Content-Length: 0" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full${destination_folder_id+/folder:$destination_folder_id/contents}?convert=false" --dump-header - | sed -ne s/"Location: "//p`
echo "$file:"
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "$upload_link" $curl_args
done
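Assuming the script above is saved as gdrive-upload.sh (a name picked here for illustration) and marked executable, usage looks like this; the last argument is only treated as a destination folder when it matches the 28-character id pattern the script checks for:
# upload two files to the Drive root
./gdrive-upload.sh report.pdf "holiday photo.jpg"
# upload the same files into a folder (the trailing id is a made-up example)
./gdrive-upload.sh report.pdf "holiday photo.jpg" AbCdEfGhIjKlMnOpQrStUvWxYz12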
