How to use bash for an HTTP request? - bash

I get the following error when I try to use bash for an HTTP request. Does anybody know how to fix the problem? Thanks.
$ exec 3<>/dev/tcp/httpbin.org/80
$ echo -e 'GET /get HTTP/1.1\r\nUser-Agent: bash\r\nAccept: */*\r\nAccept-Encoding: gzip\r\nhost: http://httpbin.org\r\nConnection: Keep-Alive\r\n\r\n' >&3
$ cat <&3
HTTP/1.1 400 Bad Request
Server: awselb/2.0
Date: Wed, 31 Mar 2021 00:43:01 GMT
Content-Type: text/html
Content-Length: 524
Connection: close
<html>
<head><title>400 Bad Request</title></head>
<body>
<center><h1>400 Bad Request</h1></center>
</body>
</html>
<!-- a padding to disable MSIE and Chrome friendly error page -->
<!-- a padding to disable MSIE and Chrome friendly error page -->
<!-- a padding to disable MSIE and Chrome friendly error page -->
<!-- a padding to disable MSIE and Chrome friendly error page -->
<!-- a padding to disable MSIE and Chrome friendly error page -->
<!-- a padding to disable MSIE and Chrome friendly error page -->

Your Host header is incorrect. That should be a hostname, not a URL:
$ echo -e 'GET /get HTTP/1.1\r
User-Agent: bash\r
Accept: */*\r
Accept-Encoding: gzip\r
host: httpbin.org\r
Connection: Keep-Alive\r
\r
' >&3
Which results in:
$ cat <&3
HTTP/1.1 200 OK
Date: Wed, 31 Mar 2021 00:55:23 GMT
Content-Type: application/json
Content-Length: 279
Connection: keep-alive
Server: gunicorn/19.9.0
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
{
"args": {},
"headers": {
"Accept": "*/*",
"Accept-Encoding": "gzip",
"Host": "httpbin.org",
"User-Agent": "bash",
"X-Amzn-Trace-Id": "Root=1-6063c87b-09303a470da318290e856d71"
},
"origin": "96.237.56.197",
"url": "http://httpbin.org/get"
}
Regarding your second question about getting your script to terminate
properly, one option is to parse the Content-Length header and only
read that many bytes. Something like:
#!/bin/bash
exec 3<>/dev/tcp/httpbin.org/80
# unix2dos converts the heredoc's LF line endings into the CRLF that HTTP
# expects; the blank line before EOF terminates the request headers
cat <<EOF | unix2dos >&3
GET /get HTTP/1.1
User-Agent: bash
host: httpbin.org

EOF
while :; do
    read -r line <&3
    line=$(echo "$line" | tr -d '\r')
    [[ -z $line ]] && break
    if [[ $line =~ "Content-Length" ]]; then
        set -- $line
        content_length=$2
    fi
done
echo "length: $content_length"
dd bs=1 count=$content_length <&3 2> /dev/null
That works for this one particular test case, but it's awfully fragile
(e.g., what if there is no Content-Length header?).
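If you don't need keep-alive, a less fragile variant (a sketch, not part of the original answer) is to ask the server to close the connection when it's done, so the reader simply runs until EOF:
exec 3<>/dev/tcp/httpbin.org/80
printf 'GET /get HTTP/1.1\r\nHost: httpbin.org\r\nConnection: close\r\n\r\n' >&3
cat <&3    # terminates when the server closes the connection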
I would just use curl instead of trying to use bash as an HTTP client.
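For comparison, the same request as a curl one-liner (assuming plain curl defaults are acceptable):
curl -s http://httpbin.org/get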

Related

Stress testing URI using xargs + curl bash script failing with empty status

I'm trying to do user acceptance testing on an application which becomes unresponsive when a particular URL parameter is included in the GET request.
Steps
I crafted the GET request, copied the curl syntax for Unix, and copied it to an Ubuntu server along with some changes.
'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0Ellc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1'
"https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202**'23548'**0-09-15|%2013:04:00"
"https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202**'`23548`'**0-09-15|%2013:04:00"
The ** asterisks are not part of the actual values; I use them to demarcate my injected value.
Using a small bash script I have generated 1000s of (unique) payload combinations for Curl.
#!/bin/bash
for ((i=0; i<1000; ++i)); do
echo "
'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7
f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciU yMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmZyaWVuZHMlMjIlM0ElMjIlMjIlMkMlMjJhdXRoJTIyJTNBJTIyZWQ0YjVhNDFkMzJlY2U4MzQ3Mzk0ZjlkZT U5YThjMWQlMjIlMkMlMjJyZWZlcmVyJTIyJTNBJTIyaXJpZGl1bS1wcmVwcm9kLmVtcGlyaWMuYWklMjIlMkMlMjJhY2Nlc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwO
GE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disable
lastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_
Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1' \"https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202'$((1 + RANDOM % 10000000))'0-09-15|%2013:04:00\""
>> URL.txt
done
The final command for testing (a one-liner) fails:
cat URL.txt | xargs -I{} -- curl -O {}
Output:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0
Expected output
When I run curl manually, copying the contents from the URL file, I get:
[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00", "dashboard_config": null, "id": 5}]
<html>
<head><title>504 Gateway Time-out</title></head>
<body>
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.18.0</center>
To troubleshoot, I used set -x on the shell command line, but I can't see why or how the request is crafted and handled by the curl processes. The curl output (above) shows zero values in all fields, which suggests a malformed request; but that can't actually be the case, since I manually ran the URL payload given in URL.txt multiple times and it works.
The URL.txt file alternates an empty line, a line of code, a newline, another line of code, and so on.
I want to generate as many parallel requests as possible, without waiting for the first one to finish.
Debug
Running it with -v via the one-liner (showing only the important lines):
> GET /getMultiDashboard/demouser -H Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/ HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
>
{ [5 bytes data]
< HTTP/1.1 400 BAD_REQUEST
< Content-Length: 0
< Connection: Close
When I run curl alone, without xargs, I get the correct output and no 400 Bad Request:
> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1
>
< HTTP/1.1 200 OK
< Content-Type: text/html; charset=utf-8
< Date: Mon, 05 Oct 2020 09:48:51 GMT
< ETag: W/"3b4-gP1vMAXMzUZy+pt7cwyOmQslPT8"
< Server: nginx/1.18.0
< Strict-Transport-Security: max-age=15552000; includeSubDomains
< Vary: Accept-Encoding
< X-Content-Type-Options: nosniff
< X-DNS-Prefetch-Control: off
< X-Download-Options: noopen
< X-Frame-Options: SAMEORIGIN
< X-XSS-Protection: 1; mode=block
< Content-Length: 948
< Connection: keep-alive
<
* Connection #0 to host abc.ai left intact
[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00", "2020-08-12 09:08:00", "dashboard_config": {}, "sort_id": 4, "id": 2}, {"dashboard_id": 5}]* Found bundle for host abc.ai: 0x55836cf75a50 [can pipeline]
* Re-using existing connection! (#0) with host abc.ai
* Connected to abc.ai (52.86.136.249) port 443 (#0)
> GET /getTagTr/E1_CP/2020-9-12%2013:4:0/202'6368'0-09-15|%2013:04:00 HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1
Having multiple curl arguments and options in the same file adds a complication which probably isn't worth working around. Basically,
echo "http://example.com -H 'X-Hello: Hello'" | xargs curl -O
passes the entire argument to echo as a single string to curl, which interprets it as the URL to fetch.
My suggestion would be to put the URL and any other arguments on the command line, and only store the -H option's argument in the file.
for ((i=0; i<1000; ++i)); do
    curl -O http://example.com -H "$(sed "s/%|/%$((1 + RANDOM))|/" xm.cookiefile)"
done
and run 400 (or whatever) of these jobs in parallel, perhaps just as regular background processes, or maybe with xargs if you think it adds value. (Maybe also look at GNU parallel which simplifies some aspects of this.)
I took out the big modulo because it's not doing anything; $RANDOM produces integers in the range 0-32767 so if you need a much bigger number, maybe paste together multiple $RANDOM numbers, or maybe use a different random source.
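One way to get that parallelism, sketched under the same assumptions as above (xm.cookiefile and the %| placeholder are invented names carried over from the snippet; two $RANDOMs are pasted together per the note about bigger numbers):
for ((i=0; i<1000; ++i)); do
    curl -sS -O http://example.com -H "$(sed "s/%|/%$RANDOM$RANDOM|/" xm.cookiefile)" &
    (( i % 400 == 399 )) && wait    # cap the number of concurrent curls at ~400
done
wait    # let the last batch finish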

curl post audio data with rate limit

I am trying to post audio data with curl to an HTTP API which allows transmitting/receiving audio files.
First I tried this:
curl -vv --http1.0 -H "Content-Type: audio/basic" -H "Content-Length: 9999999" -H "Connection: Keep-Alive" -H "Cache-Control: no-cache" --data-binary @- 'http://IP/API-Endpoint.cgi'
This seems to work:
* Trying [IP]...
* TCP_NODELAY set
* Connected to [IP] ([IP]) port 80 (#0)
> POST /API-Endpoint.cgi HTTP/1.0
> Host: [IP]
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type: audio/basic
> Content-Length: 9999999
> Connection: Keep-Alive
> Cache-Control: no-cache
>
* upload completely sent off: 17456 out of 17456 bytes
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Content-Type: text/plain
< Content-Length: 0
* HTTP/1.0 connection set to keep alive!
< Connection: keep-alive
< Date: Wed, 06 Jun 2018 19:38:37 GMT
< Server: lighttpd/1.4.45
But I can only hear the very last part of the audio file. (The file has the correct audio format for the API: G.711 μ-law at 8000 Hz.) My next guess is that the audio gets transmitted too fast and has to be sent to the API endpoint in real time. So I tried curl's --limit-rate parameter, which had no effect. Then I tried piping the data into curl with a rate limit:
cat myfile.wav | pv -L 10k | curl -vv --http1.0 -H "Content-Type: audio/basic" -H "Content-Length: 9999999" -H "Connection: Keep-Alive" -H "Cache-Control: no-cache" --data-binary @- 'http://IP/API-Endpoint.cgi'
but the result is always the same: I can only hear the last part of the audio file. It seems like curl is waiting for the piped input to complete and then sends the request as before.
Is there an option to post audio to an HTTP API from bash in "real time"?
Update:
Without forcing HTTP 1.0 I get the following result:
curl -vv -H "Content-Type: audio/basic" --data-binary '@myfile.wav' 'http://[IP]/API-Endpoint.cgi'
* Trying [IP]...
* TCP_NODELAY set
* Connected to [IP] ([IP]) port 80 (#0)
> POST /API-Endpoint.cgi HTTP/1.1
> Host: [IP]
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type: audio/basic
> Content-Length: 15087
> Expect: 100-continue
>
< HTTP/1.1 417 Expectation Failed
< Content-Type: text/html
< Content-Length: 363
< Connection: close
< Date: Wed, 06 Jun 2018 20:34:22 GMT
< Server: lighttpd/1.4.45
<
<?xml version="1.0" encoding="iso-8859-1"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<title>417 - Expectation Failed</title>
</head>
<body>
<h1>417 - Expectation Failed</h1>
</body>
</html>
* Closing connection 0
with -H "Content-Length: 9999999" you say that your audio file is exactly 9999999 bytes long (roughly 10 megabytes), but curl reports that your file is 17456 bytes:
* upload completely sent off: 17456 out of 17456 bytes
(roughly 0.02 megabytes), so either your Content-Length header is wrong (that's my best guess), or the program feeding your audio file to curl is faulty and closes stdin prematurely.
Either fix your Content-Length header or fix the program feeding curl's stdin; hopefully that will send the entire file intact.
EDIT: oh, it seems that server can't handle Expect: 100-continue; to disable that header, add the argument -H 'Expect:'
(an empty Expect header will make curl omit the header entirely, instead of sending the header empty)
... but to answer the question in the title, yeah that's the --limit-rate argument.
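Putting the pieces together, a sketch of the combined fixes (dropping the hand-written Content-Length so curl computes it, disabling Expect, and rate-limiting; G.711 μ-law at 8000 Hz is 8000 bytes per second, so 8K is roughly real time):
curl -v --limit-rate 8K -H 'Content-Type: audio/basic' -H 'Expect:' --data-binary '@myfile.wav' 'http://IP/API-Endpoint.cgi'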

curl syntax in GET based HTTP logins

For practice purposes I decided to create a simple bruteforcing bash script, which I successfully used to solve DVWA. I then moved on to IoT, namely my old IP camera. This is my code as of now:
#!/bin/bash
if [ "$#" != "2" ]; then
    echo "usage: <command> <host> <path>"
    exit
fi
ip=$1
path=$2
for name in $(cat user.txt); do
    for pass in $(cat passwords.txt); do
        echo ${name}:${pass}
        res="$(curl -si ${name}:${pass}@${ip}${path})"
        check=$(echo "$res" | grep "HTTP/1.1 401 Unauthorised")
        if [ "$check" != '' ]; then
            tput setaf 1
            echo "[FAILURE]"
            tput sgr0
        else
            tput setaf 2
            echo "[SUCCESS]"
            tput sgr0
            exit
        fi
        sleep .1
    done
done
Despite obvious flaws - like reporting success in case of network failure - it's as good as my 20-minute coding jobs get. However, I can't seem to get the curl command syntax quite right. The camera in question is a simple Axis, running cramFS and a small scripting OS. It's similar to a lot of publicly available cameras' login forms, like the ones found here, here, or here. A simple GET, yet I feel like I'm bashing my head against a wall. Any bit of a hint will be madly appreciated at this point.
I've taken the liberty of pasting the contents of the first GET package:
GET /operator/basic.shtml?id=478 HTTP/1.1
Host: <target_host_ip>
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:58.0) Gecko/20100101 Firefox/58.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Referer: http://<target_host_ip>/view/view.shtml?id=282&imagepath=%2Fmjpg%2Fvideo.mjpg&size=1
Connection: keep-alive
Upgrade-Insecure-Requests: 1
Authorization: Digest username="root", realm="AXIS_ACCC8E4A2177", nonce="w3PH7XVmBQA=32dd7cd6ab72e0142e2266eb2a68f59e92995033", uri="/operator/basic.shtml?id=478", algorithm=MD5, response="025664e1ba362ebbf9c108b1acbcae97", qop=auth, nc=00000001, cnonce="a7e04861c3634d3b"
The package sent in return is a simple, dry 401.
PS.: Any powers that be - feel free to remove the IPs if they violate anything. Also feel free to point out grammar/spelling etc. mistakes since C2 exam is coming.
It looks like those cameras don't simply use "Basic" HTTP auth with a base64 encoded username:password combo, but use digest authentication which involves a bit more.
Luckily, with cURL this just means you need to specify --digest on the command line to handle it properly.
Test the sequence of events yourself using:
curl --digest http://user:password@example.com/digest-url/
You should see something similar to:
* Trying example.com...
* Connected to example.com (x.x.x.x) port 80 (#0)
* Server auth using Digest with user 'admin'
> GET /view/viewer_index.shtml?id=1323 HTTP/1.1
> Host: example.com
> User-Agent: curl/7.58.0
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Wed, 08 Nov 1972 17:30:37 GMT
< Accept-Ranges: bytes
< Connection: close
< WWW-Authenticate: Digest realm="AXIS_MACADDR", nonce="00b035e7Y417961b2083fae7e4b2c4053e39ef8ba0b65b", stale=FALSE, qop="auth"
< WWW-Authenticate: Basic realm="AXIS_MACADDR"
< Content-Length: 189
< Content-Type: text/html; charset=ISO-8859-1
<
* Closing connection 0
* Issue another request to this URL: 'http://admin:admin2@example.com/view/viewer_index.shtml?id=1323'
* Server auth using Digest with user 'admin'
> GET /view/viewer_index.shtml?id=1323 HTTP/1.1
> Host: example.com
> Authorization: Digest username="admin", realm="AXIS_MACADDR", nonce="00b035e7Y417961b2083fae7e4b2c4053e39ef8ba0b65b", uri="/view/viewer_index.shtml?id=1323", cnonce="NWIxZmY1YzA3NmY3ODczMDA0MDg4MTUwZDdjZmE0NGI=", nc=00000001, qop=auth, response="3b03254ef43bc4590cb00ba32defeaff"
> User-Agent: curl/7.58.0
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Date: Wed, 08 Nov 1972 17:30:37 GMT
< Accept-Ranges: bytes
< Connection: close
* Authentication problem. Ignoring this.
< WWW-Authenticate: Digest realm="AXIS_MACADDR", nonce="00b035e8Y8232884a74ee247fc1cc42cab0cdf59839b6f", stale=FALSE, qop="auth"
< WWW-Authenticate: Basic realm="AXIS_MACADDR"
< Content-Length: 189
< Content-Type: text/html; charset=ISO-8859-1
<
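Applied to the script above, only the curl line needs to change; a sketch (with quotes added around the expansion to be safe):
res="$(curl -si --digest "http://${name}:${pass}@${ip}${path}")"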

Using PDF Reactor as Web Service

I am discovering PDFreactor and I'd like to use it as a web service. To test a file, I use cURL:
curl -v -X POST --header "Content-Type:application/xml" http://localhost:9423/service/rest/convert/async -d @test.html
Is that correct?
test.html :
<html>
<body>
Coucou, je suis terrien.
</body>
</html>
Thank you for your help,
Cédrik
Edit #1:
The response from the command above:
* About to connect() to localhost port 9423 (#0)
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 9423 (#0)
> POST /service/rest/convert/async HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.14.3.0 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: localhost:9423
> Accept: */*
> Content-Type:application/xml
> Content-Length: 50
>
< HTTP/1.1 400 Bad Request
< Content-Type: text/plain
< Date: Tue, 15 Dec 2015 11:47:29 GMT
< Content-Length: 307
< Server: Jetty(9.3.2.v20150730)
<
* Connection #0 to host localhost left intact
* Closing connection #0
JAXBException occurred : unexpected element (URI: "", local: "html"). Expected elements are <{http://webservice.pdfreactor.realobjects.com/}configuration>. unexpected element (URI: "", local: "html"). Expected elements are <{http://webservice.pdfreactor.realobjects.com/}configuration>.
When using the REST API of PDFreactor via cURL you have to send a configuration XML or JSON to the server which includes configuration for PDFreactor and your document, as described here: http://www.pdfreactor.com/product/doc_html/index.html#d0e688
A sample configuration for XML could look like this:
config.xml:
<tns:configuration xmlns:tns="http://webservice.pdfreactor.realobjects.com/">
<document><html> <body> Coucou, je suis terrien. </body> </html></document>
</tns:configuration>
You can then call the following:
curl -v -X POST --header "Content-Type:application/xml" http://localhost:9423/service/rest/convert/async.xml -d @config.xml
The output will look like the following:
* About to connect() to localhost port 9423
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 9423
> POST /service/rest/convert/async.xml HTTP/1.1
> User-Agent: curl/7.15.5 (x86_64-redhat-linux-gnu) libcurl/7.15.5 OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
> Host: localhost:9423
> Accept: */*
> Content-Type:application/xml
> Content-Length: 195
>
> <tns:configuration xmlns:tns="http://webservice.pdfreactor.realobjects.com/"> <document><html><body>Coucou, je suis terrien.</body></html></document></tns:configuration>HTTP/1.1 202 Accepted
< Access-Control-Allow-Credentials: true
< Access-Control-Allow-Headers: Accept, Content-Length, content-type, Host, User-Agent
< Access-Control-Allow-Methods: GET, PUT, POST, DELETE
< Access-Control-Expose-Headers: Location
< Cache-Control: no-cache
< Date: Wed, 16 Dec 2015 16:34:19 GMT
< Location: http://localhost:9423/service/rest/progress/c2a58dbd-ef9d-4b79-87d9-079c139fe9ed
< Content-Length: 0
< Server: Jetty(9.3.2.v20150730)
* Connection #0 to host localhost left intact
* Closing connection #0
The "Location" response header contains the URL which can be used to retrieve the progress of the conversion, so you can retrieve the progress with (the ID will of course vary):
curl -v http://localhost:9423/service/rest/progress/c2a58dbd-ef9d-4b79-87d9-079c139fe9ed
This will return the conversion progress, and if the conversion has finished, the "Location" response header will contain a new URL to retrieve the document. You can use ".pdf" to retrieve the PDF binary data or ".xml" to retrieve XML data containing the PDF as a base64 encoded string, the number of pages of the document, etc.
curl -v http://localhost:9423/service/rest/document/c2a58dbd-ef9d-4b79-87d9-079c139fe9ed.pdf

Bash CGI POST gives error 500, but works without AJAX

I'm trying to call a CGI page but the response comes back blank. It returns error 500. If I just do the POST without AJAX, it works well.
#!/bin/bash
echo "content-type: text/html"
echo "lalala" > temp.file
cat temp.file
echo "
<br><b>Program:</b> $program <br> \n"
echo "<html> adsdasd </html>"
Here are the headers:
Response Headers
Connection close
Content-Length 535
Content-Type text/html; charset=iso-8859-1
Date Thu, 19 Jan 2012 12:30:04 GMT
Server Apache
Request Headers
Accept */*
Accept-Encoding gzip, deflate
Accept-Language en-us,en;q=0.5
Connection keep-alive
Content-Length 16
Content-Type application/x-www-form-urlencoded; charset=UTF-8
Host cgi:8888
Origin null
User-Agent Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:10.0) Gecko/20100101 Firefox/10.0
I solved it by adding
echo
echo
at the beginning of the file. It seems the server needs those two echos before the header.
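For what it's worth, the CGI convention is a blank line after the header block, separating headers from body; a minimal sketch of the script above in that form:
#!/bin/bash
echo "Content-Type: text/html"
echo    # blank line ends the CGI headers; everything after is the body
echo "<html> adsdasd </html>"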
