Calculate OAuth signature for NetSuite restlet using bash

I'm currently trying to generate an OAuth signature for my curl request header. These requests point to a NetSuite restlet. Resources online are either inconclusive or too high-level for my understanding, or lack examples. How do I go about calculating the oauth_signature value for my request?
The following is my request with credentials omitted:
curl --request GET \
--url 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=foo&deploy=bar' \
--header 'Authorization: OAuth realm="'"$realm"'",oauth_consumer_key="'"$oauth_consumer_key"'",oauth_token="'"$oauth_token"'",oauth_signature_method="HMAC-SHA1",oauth_timestamp="'"$(OAuth_timestamp)"'",oauth_nonce="'"$(OAuth_nonce)"'",oauth_version="1.0",oauth_signature="'"$(OAuth_signature)"'"' \
--header 'cache-control: no-cache' \
--header 'content-type: application/json' \
| jq
Below is a list of the parameters I'm passing for the sake of readability:
params=(
oauth_consumer_key='foo'
oauth_signature_method='HMAC-SHA1'
oauth_version='1.0'
oauth_nonce=$(OAuth_nonce)
oauth_timestamp=$(OAuth_timestamp)
oauth_token='tokenfoo'
realm='4478811'
)
I am generating the timestamp and nonce like so:
OAuth_nonce () {
md5 <<< "$RANDOM-$(date +%s.%N)" | cut -d' ' -f 1
}
OAuth_timestamp () {
echo "$(date +%s)"
}
I got most of my resources from https://github.com/livibetter-backup/bash-oauth, but no docs exist, the examples are poor, and the library itself doesn't seem to work when I've tested the functions.
All the values I use in the script (confirmed with bash -x) work when run in Postman, but I can't calculate an oauth_signature value outside of it.
How do I create an OAuth_signature function that returns a valid signature? What parameters will I have to pass to that function so it calculates correctly? Would it be possible, or easier, to generate the signature with Perl or Python?
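For reference, the OAuth 1.0a signature (RFC 5849) is an HMAC-SHA1 over a "signature base string" built from the HTTP method, the base URL, and all request parameters sorted and percent-encoded, keyed with percent-encoded consumer secret and token secret joined by `&`. Below is a minimal bash sketch using openssl; the secret values are placeholders, the URL argument is the base URL without its query string, and the script/deploy query parameters must go into the parameter list while realm must not:

```shell
#!/usr/bin/env bash
# Placeholder secrets - substitute your real NetSuite credentials.
oauth_consumer_secret='consumer_secret_here'
oauth_token_secret='token_secret_here'

# RFC 3986 percent-encoding with uppercase hex digits, as OAuth requires.
percent_encode() {
  local s="$1" out='' c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case "$c" in
      [A-Za-z0-9.~_-]) out+="$c" ;;
      *) printf -v c '%%%02X' "'$c"
         out+="$c" ;;
    esac
  done
  printf '%s' "$out"
}

# Usage: OAuth_signature METHOD BASE_URL key=value [key=value ...]
# Pass every oauth_* parameter AND the query-string parameters
# (script, deploy); realm is NOT part of the base string.
OAuth_signature() {
  local method="$1" url="$2"; shift 2
  local sorted base key
  # parameters are sorted bytewise and joined with '&'
  sorted=$(printf '%s\n' "$@" | LC_ALL=C sort | paste -sd '&' -)
  base="${method}&$(percent_encode "$url")&$(percent_encode "$sorted")"
  key="$(percent_encode "$oauth_consumer_secret")&$(percent_encode "$oauth_token_secret")"
  printf '%s' "$base" | openssl dgst -sha1 -hmac "$key" -binary | base64
}
```

This sketch assumes the parameter values are plain alphanumerics; strictly per the spec, each key and value must itself be percent-encoded before the pairs are sorted and joined.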

Related

Pass Associative Array as Data Param in POST request using cURL

I have an associative array that I want to pass to cURL as POST data. However, I have tried multiple things and it still doesn't work.
The array:
declare -A details
details[name]="Honey"
details[class]="10"
details[section]="A"
details[subject]="maths"
The cURL commands I have tried so far (all of them failed):
resp = $(cURL --request POST --data details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data variables=details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data "variables=$details" "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data "variables=${details}" "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data $details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data ${details} "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data variables=details "https://somedomain.net/getMarks")
I want the request to end up (indirectly) like the one shown below; however, I want to pass the array directly instead of writing out its contents.
resp = $(cURL --request POST --data '{"variables":[{"name": "Honey"},{"class": "10"},{"section": "A"},{"subject": "maths"}]}' "https://somedomain.net/getMarks")
Please note that to begin with I will always have the associative array ONLY (not any JSON array or string).
This question arose when I was trying to call the cURL command with an associative array, as on this link (GitLab API); the example there does not contain a variables array example, but they do mention a variables array (an array of hashes).
Since I had to use an older version of bash, which does not support the name referencing stated in the answer, I had to build the string from the associative array without passing it to a function.
Since I always had an associative array to begin with, the usual ways of passing the array as accepted by the GitLab API were:
resp=$(curl --request POST --data '{"variables":[{"name": "Honey"},{"class": "10"},{"section": "A"},{"subject": "maths"}]}' "https://somedomain.net/getMarks")
OR
resp=$(curl --request POST --data "variables[name]=Honey" --data "variables[class]=10" --data "variables[section]=A" --data "variables[subject]=maths" "https://somedomain.net/getMarks")
So tried some tweaks on the second way and what worked for me was:
_sep=""
_string=""
for index in "${!details[@]}"
do
_string="${_string}${_sep}variables[${index}]=${details[$index]}"
_sep="&"
done
resp=$(curl --request POST --data "$_string" "https://somedomain.net/getMarks")
# which indirectly was:
resp=$(curl --request POST --data "variables[name]=Honey&variables[class]=10&variables[section]=A&variables[subject]=maths" "https://somedomain.net/getMarks")
And it was a success. Thanks to @markp-fuso for giving me the intuition of building the string with his logic above.
Assumptions/understandings:
no need to list the array entries in any particular order
neither the array indices nor the values contain newlines
One bash idea:
# use a function to build the --data component
build_data() {
local -n _arr="$1" # use nameref so we can pass in the name of any associative array
local _sep=""
local _string='{"variables":['
local _i
for _i in "${!_arr[@]}"
do
_string="${_string}${_sep}{\"${_i}\": \"${_arr[$_i]}\"}"
_sep=","
done
printf "%s]}" "${_string}"
}
Adding this to the curl call:
resp=$(curl --request POST --data "$(build_data details)" "https://somedomain.net/getMarks")
NOTES:
no spaces are allowed on either side of the =, i.e., resp = $(curl ...) needs to be resp=$(curl ...)
without an actual/valid URL I'm guessing a bit on if/where the escaped quotes belong, so you may need to tweak the escaped quotes to get it working correctly
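One caveat: the hand-built string produces invalid JSON if any index or value contains a double quote or backslash. A variant of the same function (a sketch, assuming jq is installed) that delegates the escaping to jq, under the same no-newlines assumption:

```shell
# build the same {"variables":[...]} document, but let jq do the JSON
# escaping (requires jq; indices/values still must not contain newlines)
build_data_jq() {
    local -n _arr="$1"
    local _i
    for _i in "${!_arr[@]}"
    do
        # emit index and value on alternating lines for jq to pair up
        printf '%s\n%s\n' "$_i" "${_arr[$_i]}"
    done | jq -cRn '{variables: [inputs as $k | {($k): input}]}'
}
```

It drops into the curl call the same way: `resp=$(curl --request POST --data "$(build_data_jq details)" "https://somedomain.net/getMarks")`.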

Not able to replace the value of the variable inside expression in bash script

I am trying to run a bash script, where I would like to make POST calls in a for loop as follows:
for depId in "${depIds[@]}"
do
echo "$depId" <--------------------------------- THIS IS PRINTING PROPER VALUE
curl 'https://student.service.com/api/student' \
-H 'Accept: application/json' \
-H 'Content-Type: application/json' \
-H 'Cookie: UISESSION=abcd' \
--data-raw '{"name":"Student Name","description":"Dummy","depId":$depId}' \ <---- HERE I CANNOT GET THE VALUE OF THE VARIABLE
--compressed
echo "$content"
done
As mentioned above, I cannot get the value of the department id into the request body; with the above form I get a Request Malformed exception. I have even tried ${depId}, but no luck.
Could anyone please help here ?
Try flipping your quotes around the variable.
--data-raw '{"name":"Student Name","description":"Dummy","depId":'"$depId"'}' \
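Alternatively, you can sidestep the quote juggling entirely by letting jq build the payload (a sketch assuming jq is available; the depIds values below are made up):

```shell
# hypothetical ids; jq escapes whatever the variable holds
depIds=(101 102)

for depId in "${depIds[@]}"; do
    # --arg passes the shell variable in as a properly escaped JSON string
    payload=$(jq -cn --arg depId "$depId" \
        '{name: "Student Name", description: "Dummy", depId: $depId}')
    echo "$payload"
    # curl 'https://student.service.com/api/student' ... --data-raw "$payload"
done
```

Use --argjson instead of --arg if the API expects depId as a JSON number rather than a string.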

provide method for check status code with pass callback function in bash script

I have a problem organizing my code. I have an API that returns a JWT token with an expiry lifetime, and several APIs protected by that JWT token. I need to call these APIs over a period of time during which the token can expire; how do I protect against this case? I need some approach like: while http_code != 401, execute the curls; else, re-login. How should I organize the bash script so this works correctly? Maybe the status-code check (for 401) should move into a method; how do I pass a callback function with some curl (for some API) to that method?
This is what I did; looking at it, you can analyze it and draw conclusions about how it should work:
http_code='401'
successful_status_code='200'
auth='false'
while [ "$auth" != "true" ]
do
# in this [if] we check the status code: if it is 401 (the case when the token has expired), we need to fetch a new token by executing api/v2/login again
if [ "$http_code" != "$successful_status_code" ]
then
# this curl executes the api with the username and password from a file;
# the response with the `token` is saved in `result.json`; from this file
# another api will fetch the token to get access to the protected api
http_code=$(curl --request POST -sL \
--url 'http://qbee.local/api/v2/login'\
--output './result.json' \
--header "Content-Type: application/json" \
-d "@login.json" \
--write-out "%{http_code}")
else
auth='true'
# fetched token which was saved after api/v2/login curl
tokenValue=$(jq -r '.token' './result.json')
echo "token was $tokenValue"
# To get access to the protected api we need to add the header
# 'Authorization: Bearer some_token'; then the api returns some data,
# but when the token has expired we need to execute `api/v2/login` again
# this api curl saves the response in a file, that's it
http_code=$(curl --request GET -sL \
--url 'http://qbee.local/api/v2/grouptree?_format=json'\
--output './grouptree_result.json' \
--header 'Authorization: Bearer '"$tokenValue")
echo "$(cat grouptree_result.json)"
fi
done
As I imagine it, the bash script should execute several APIs over some time period, so it needs some structure to catch the case when the token expires, re-login, and save the new token in the file.
# some scenario where several APIs should be called
statuscode = executeApi(fetchGroupTree())
statuscode = executeApi(fetchSomeData())
...
// etc. methods which call APIs
# method which provides the approach: catch the 401, re-login the user,
# refresh the token and then execute the current api method again
public function executeApi(function api())
{
statuscode = execute api()
if (statuscode == 401) {
execute auth()
statuscode = execute api()
}
return statuscode
}
# another api which expects a non-expired token to grant access
public function fetchSomeData()
{
tokenValue=$(jq -r '.token' './result.json')
echo "token was $tokenValue"
// execute some curl api
}
# some api which expects a non-expired token to grant access
public function fetchGroupTree()
{
tokenValue=$(jq -r '.token' './result.json')
echo "token was $tokenValue"
curl --request GET -sL \
--url 'http://qbee.local/api/v2/grouptree?_format=json'\
--output './grouptree_result.json' \
--header 'Authorization: Bearer '"$tokenValue"
echo "$(cat grouptree_result.json)"
}
# method for re-logging in and saving the new token in the file after the old token expired
public function auth()
{
http_code=$(curl --request POST -sL \
--url 'http://qbee.local/api/v2/login'\
--output './result.json' \
--header "Content-Type: application/json" \
-d "@login.json" \
--write-out "%{http_code}")
}
But this is just my vision; I'm no expert on bash scripts. How does one usually develop a bash script for this goal, maybe with some best practices?
You typically write some wrappers:
# the tokenfile location
tokenf=$(mktemp)
trap 'rm "$tokenf"' EXIT
# just some abstractions
get_token() {
# the login response is JSON; extract the token field (as in the question's code)
jq -r '.token' "$tokenf"
}
has_token() {
[[ -s "$tokenf" ]]
}
clear_token() {
: > "$tokenf"
}
# the generic curl wrapper way we take
# remember to add -S so that you see errors
_curl() {
curl -sSL "$@"
}
# execute authentication request and place the result in token file
do_auth() {
local cmd
cmd=(
_curl --request POST
--url 'http://qbee.local/api/v2/login'
--output "$tokenf"
--header "Content-Type: application/json"
-d "@login.json"
# TODO: implement checking error code
# --write-out "%{http_code}"
)
"${cmd[@]}"
}
# small wrapper so that loop looks nice
# first argument - filename for output
# other arguments passed to curl
# outputs HTTP exit code
_execute_api_callback() {
local outf cmd
outf="$1"
shift
cmd=(
_curl --request GET
--header 'Authorization: Bearer '"$(get_token)"
--output "$outf" --write-out "%{http_code}"
"$@"
)
"${cmd[@]}"
}
# run curl request in a loop requesting for authentication
execute_api() {
# execute in a subshell for EXIT trap to auto-cleanup
(
# The temporary location for curl output
tmpf=$(mktemp)
trap 'rm "$tmpf"' EXIT
while
# TODO: check other error codes
statuscode=$(_execute_api_callback "$tmpf" "$@")
((statuscode != 200))
# TODO: implement timeout
do
do_auth # TODO: check failure
done
cat "$tmpf"
)
}
if ! has_token; then
do_auth
fi
# is --url really that needed?
execute_api 'http://qbee.local/api/v2/grouptree?_format=json'
This is just code I wrote in 5 minutes on this forum - I did not test it, I did not run it. Check your code with https://shellcheck.net , read https://mywiki.wooledge.org/BashFAQ/005 and check out https://mywiki.wooledge.org/BashGuide/Parameters .
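One bash detail worth noting in execute_api: the while condition may be a whole list of commands, and only the exit status of the last one (the arithmetic test) decides whether the body runs. A standalone illustration of that idiom:

```shell
# the loop condition is everything between 'while' and 'do'; the body
# runs as long as the LAST command in that list succeeds
n=0
while
    n=$((n + 1))
    (( n < 4 ))
do
    echo "attempt $n"   # prints attempt 1, attempt 2, attempt 3
done
echo "stopped at n=$n"  # prints stopped at n=4
```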

Running Curl POST w/ JSON Payload Sequentially Against File

I am hitting a wall trying to build a script to save myself quite a good bit of time. I am working in a system in which I need to run a curl POST against a list of values. The list is about 400 lines long, so I am hoping to find a way of scripting this in Bash instead of running that call manually for each entry. Below are some details to help understand what I'm trying to accomplish:
If I were to be doing this task manually, each call would be formatted like the below:
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d @json_payload.json
This points to my JSON in the listed file which shows as the below:
{
"groupId": "12345678987654321",
"type": "serial",
"memberInfo": "apple"
}
If I run the above, the call works and the expected operation occurs. The issue is that I need to run this against roughly 400 values for the "memberInfo" field in the JSON payload. I'm trying to find a way to run a single bash script which will run this curl command over and over, updating the JSON payload to use each row of a file like the below:
memberList.txt
apple
banana
peach
pear
orange
.
.
And then maybe insert a pointer in my JSON for the "memberInfo" field over to this file.
Any and all help/suggestions are greatly appreciated!
This will do what you intend. It's a little convoluted, but you might polish it a bit.
#!/bin/bash
function getString(){
echo "$1" | python3 -c '
import json
import sys
payload="""
{
"groupId": "12345678987654321",
"type": "serial",
"memberInfo": ""
}
"""
obj = json.loads(payload)
obj["memberInfo"] = sys.stdin.read().strip()
print(json.dumps(obj, indent = " "))
'
}
while read -r member
do
getString "$member" > json_payload.json
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d @json_payload.json
done < memberList.txt
Hope it helps!
while read -r member; do
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d '{"groupId": "12345678987654321","type": "serial","memberInfo": "'"$member"'"}'
done <memberList.txt
This will work if you only care about the memberInfo field. Another method would be writing your JSON line by line to a payloads.txt file.
payloads.txt
{"groupId": "12345678987455432","type": "stereo","memberInfo": "apple"}
{"groupId": "34532453453453465","type": "serial","memberInfo": "banana"}
...
then use this as the script
while read -r payload; do
curl -X POST --header "Content-Type: application/json" -v 'http://www.website.com:8081/cc/membership' -d "$payload"
done <payloads.txt
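A member value containing a double quote or backslash would break any hand-written JSON above. A sketch that avoids this by letting jq generate one compact payload per input line (assumes jq is installed; the filename and URL come from the question):

```shell
# turn each line of memberList.txt into one complete JSON payload;
# jq -R reads raw lines, -c prints compact one-line JSON
build_payloads() {
    jq -Rc '{groupId: "12345678987654321", type: "serial", memberInfo: .}' "$1"
}

# usage sketch (URL from the question):
# build_payloads memberList.txt |
# while IFS= read -r payload; do
#     curl -X POST --header "Content-Type: application/json" \
#         'http://www.website.com:8081/cc/membership' -d "$payload"
# done
```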
Here is a collection of common bash scripting patterns I've had to use:
https://github.com/felts94/advanced-bash/blob/master/bash_learn.sh

ASP Classic parse data from curl POST -F

I have the following curl request, which points to my service:
curl -X POST \
http://go.example.com/ \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-H 'Postman-Token: cf0c1ab5-08ff-1aa2-428e-24b855e1a61c' \
-H 'content-type: multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW' \
-F fff=vvvvv \
-F rrrr=ddddd \
-F xx=something
I'm trying to catch the xx parameter in Classic ASP code.
I tried Request("xx") and Request.Form("xx").
Do you have any idea?
This is from the curl documentation:
-F, --form
(HTTP SMTP IMAP) For HTTP protocol family, this lets curl emulate a filled-in form in which a user has pressed the submit button. This causes curl to POST data using the Content-Type multipart/form-data according to RFC 2388.
When a form is submitted to Classic ASP using a content-type of multipart/form-data, the only method available is Request.BinaryRead(), as Request.Form is for application/x-www-form-urlencoded data.
Here is a quick example of calling Request.BinaryRead() to get you started:
<%
'Should be less than configured request limit in IIS.
Const maxRequestSizeLimit = ...
Dim dataSize: dataSize = Request.TotalBytes
Dim formData
If dataSize < maxRequestSizeLimit Then
'Read bytes into a SafeArray
formData = Request.BinaryRead(dataSize)
'Once you have a SafeArray its up to you to process it.
...
Else
Response.Status = "413 PAYLOAD TOO LARGE"
Response.End
End If
%>
Parsing a SafeArray isn't easy
If you still want to use Request.Form, you can do so by specifying the form parameters in the curl command using -d instead of -F. From the documentation:
-d, --data
(HTTP) Sends the specified data in a POST request to the HTTP server, in the same way that a browser does when a user has filled in an HTML form and presses the submit button. This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to -F, --form.
So the CURL command would be something like;
curl -X POST \
http://go.mytest-service.com/ \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d fff=vvvvv \
-d rrrr=ddddd \
-d xx=something
You would then retrieve the xx parameter in Classic ASP using;
<%
Dim xx: xx = Request.Form("xx")
%>
Useful Links
application/x-www-form-urlencoded or multipart/form-data?
MSDN - Request.BinaryRead Method
Example class for parsing a SafeArray (specifically the BuildUpload() method which takes a SafeArray and parses the binary)
Example implementation of a File Uploader class using Request.BinaryRead() on Planet Source Code
