The request I run from cmd:
curl -G -k "https://api-ip.fssprus.ru/api/v1.0/result?" -d "@/test/request11.JSON" -o "/test/response11.JSON" -D "/test/hdrout2.HDR" -H "accept: application/json; charset=utf-8"
and I get this response:
{"status":"success","code":0,"exception":"","response":{"status":0,"task_start":"2018-05-16 10:58:42","task_end":"2018-05-16 10:58:45","result":[{"status":0,"query":{"type":1,"params":{"region":"16","firstname":"\u0418\u0432\u0430\u043d","secondname":"\u0418\u0432\u0430\u043d\u043e\u0432\u0438\u0447","lastname":"\u0418\u0432\u0430\u043d\u043e\u0432","birthdate":"11.06.1975"}},"result":[]}]}}
How can I decode the escaped characters so that the response comes back as readable UTF-8?
Pipe the output through jq. It handles a lot of JSON manipulation, including displaying UTF-8 properly.
There are other tools, such as json_pp (part of most Perl distributions), which will also decode the UTF-8 for you.
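For example, a minimal sketch that just appends jq to the request from the question (jq must be installed; -o is dropped so the body goes down the pipe instead of into a file):
curl -G -k "https://api-ip.fssprus.ru/api/v1.0/result?" -d "@/test/request11.JSON" -H "accept: application/json; charset=utf-8" | jq .
jq re-emits the JSON as UTF-8, so escapes like \u0418\u0432\u0430\u043d are printed as the actual Cyrillic text (Иван Иванович Иванов).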
Here's the code I'm looking at:
#!/bin/bash
nc -l 8080 &
curl "http://localhost:8080" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
--data @<(cat <<EOF
{
"me": "$USER",
"something": $(date +%s)
}
EOF
)
What does the @ do? Where is it documented?
It is a curl-specific symbol. man curl shows you:
-d, --data <data>
(HTTP) Sends the specified data in a POST request to the HTTP server, in the
same way that a browser does when a user has filled in an HTML form and
presses the submit button. This will cause curl to pass the data to the
server using the content-type application/x-www-form-urlencoded. Compare to
-F, --form.
--data-raw is almost the same but does not have a special interpretation of
the @ character. To post data purely binary, you should instead use the
--data-binary option. To URL-encode the value of a form field you may use
--data-urlencode.
If any of these options is used more than once on the same command line, the
data pieces specified will be merged together with a separating &-symbol.
Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that
looks like 'name=daniel&skill=lousy'.
If you start the data with the letter @, the rest should be a file name to
read the data from, or - if you want curl to read the data from stdin.
Multiple files can also be specified. Posting data from a file named
'foobar' would thus be done with -d, --data @foobar. When --data is told to
read from a file like that, carriage returns and newlines will be stripped
out. If you don't want the @ character to have a special interpretation use
--data-raw instead.
See also --data-binary and --data-urlencode and --data-raw. This option
overrides -F, --form and -I, --head and -T, --upload-file.
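A quick sketch of the difference (body.json is a throwaway file created only for the example, and localhost:8080 stands in for any test endpoint such as the nc listener above):
printf '{"name":"daniel","skill":"lousy"}' > body.json
curl -d @body.json http://localhost:8080            # curl reads and posts the contents of body.json
curl --data-raw '@body.json' http://localhost:8080  # curl posts the literal string @body.json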
I'm trying to get a config file from our GitHub using the Get Contents API.
This returns a JSON response containing the file content encoded as a Base64 string.
I'd like to get it as plain text.
Steps I've taken
Get the initial API response:
curl -H 'Authorization: token MY_TOKEN' \
https://github.com/api/v3/repos/MY_OWNER/MY_REPO/contents/MY_FILE
This returns a JSON response with a field "content": "encoded content ..."
Get the encoded string:
<prev command> | grep -F "content\":"
This gets the content, but there's still the "content": label, the surrounding " characters, and a trailing comma.
Cut the extras:
<prev command> | cut -d ":" -f 2 | cut -d "\"" -f 2
Decode:
<prev command> | base64 --decode
Final command:
curl -H 'Authorization: token MY_TOKEN' \
https://github.com/api/v3/repos/MY_OWNER/MY_REPO/contents/MY_FILE | \
grep -F "content\":" | cut -d ":" -f 2 | cut -d "\"" -f 2 | base64 --decode
Issues:
The resulting string (before the base64 --decode) decodes in an online decoder (though not cleanly; see the next item), but fails to do so in bash with
"Invalid character in input stream."
When decoding the string in an online decoder, some (but not all) of the file comes out as gibberish rather than the original text. I've tried all the available charsets.
Notes:
I've tried removing the last 2 (newline) chars with sed 's/..$//', but this has no effect.
If I select the output with the mouse and copy-paste it into an echo MY_ENCODED_STRING_PASTED_HERE | base64 --decode command, it has the same effect as the online tool, that is, it decodes as gibberish.
Add header Accept: application/vnd.github.VERSION.raw to the GET.
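A minimal sketch of that, using the same placeholder token, host, and path as above:
curl -H 'Authorization: token MY_TOKEN' \
-H 'Accept: application/vnd.github.VERSION.raw' \
https://github.com/api/v3/repos/MY_OWNER/MY_REPO/contents/MY_FILE
With that Accept header GitHub returns the raw file body directly, so there is no Base64 field to extract or decode at all.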
Following tripleee's advice, I've switched the extraction method to jq:
file=randomFileName74894031264.txt
curl -H 'Authorization: token MY_TOKEN' https://github.com/api/v3/repos/MY_OWNER/MY_REPO/contents/MY_FILE > "$file"
encoded_str=($(jq -r '.content' "$file"))
echo "$encoded_str" | base64 -D
rm -f "$file"
This works when running from the command line, but when running as a script the stdout doesn't flush, and we only get the first few lines of the file.
I will update this answer when I've formalized a generic script.
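In the meantime, here is a hedged sketch of a pipe-only variant (same placeholder token and URL; base64 --decode is the GNU spelling of the -D used above, which is the macOS form):
file=randomFileName74894031264.txt
curl -H 'Authorization: token MY_TOKEN' https://github.com/api/v3/repos/MY_OWNER/MY_REPO/contents/MY_FILE > "$file"
jq -r '.content' "$file" | base64 --decode   # decode straight from the pipe, no intermediate variable
rm -f "$file"
Note that encoded_str=($(jq -r '.content' "$file")) creates a bash array, and echo "$encoded_str" prints only its first element (the first Base64 line), which may be the real reason only the start of the file comes through; a plain encoded_str=$(jq -r '.content' "$file") assignment avoids that.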
I have five cURL statements that work fine by themselves, and I am trying to put them together in a bash script. Each cURL statement relies on a variable generated by the cURL statement executed before it. I'm trying to figure out the smartest way to go about this. Here is the first cURL statement:
curl -i -k -b sessionid -X POST https://base/resource -H "Content-Type: application/json" -H "Authorization: Authorization: PS-Auth key=keyString; runas=userName; pwd=[password]" -d "{\"AssetName\":\"apiTest\",\"DnsName\":\"apiTest\",\"DomainName\":\"domainNameString\",\"IPAddress\":\"ipAddressHere\",\"AssetType\":\"apiTest\"}"
This works fine and produces this output:
{"WorkgroupID":1,"AssetID":57,"AssetName":"apiTest","AssetType":"apiTest","DnsName":"apiTest","DomainName":"domainNameString","IPAddress":"ipAddressHere","MacAddress":null,"OperatingSystem":null,"LastUpdateDate":"2017-10-30T15:18:05.67-07:00"}
However, in the next cURL statement, I need to use the integer from AssetID in order to execute it. In short, how can I take the AssetID value and store it in a variable to be used in the next statement? In total, I'll be using five cURL statements, and each relies on values generated by the preceding statement. Any insight is appreciated.
Download and install jq, which is like sed for JSON data. You can use it to slice, filter, map, and transform structured data with the same ease that sed, awk, and grep give you for unstructured data. Remember to replace '...' with your actual curl arguments:
curl '...' | jq --raw-output '.AssetID'
To store it in a variable, use command-substitution syntax to run the command and capture the result:
asset_ID=$( curl '...' | jq --raw-output '.AssetID' )
In the curl command, drop the -i flag to output only the JSON data without the header information.
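Putting it together, a hedged sketch of how the first two calls could be chained (the follow-up URL https://base/resource/$asset_id is only an illustration of where the value gets reused):
asset_id=$(curl -s -k -b sessionid -X POST https://base/resource \
-H "Content-Type: application/json" \
-H "Authorization: Authorization: PS-Auth key=keyString; runas=userName; pwd=[password]" \
-d '{"AssetName":"apiTest","DnsName":"apiTest","DomainName":"domainNameString","IPAddress":"ipAddressHere","AssetType":"apiTest"}' \
| jq --raw-output '.AssetID')
curl -s -k -b sessionid "https://base/resource/$asset_id" -H "Accept: application/json"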
I was reading the JIRA REST API tutorial and in their example curl request, they show
curl -D- -u username:password <rest-of-request>
What is the -D- syntax with the dash before and after?
To quote man curl:
-D, --dump-header <file>
Write the protocol headers to the specified file. This option is handy to use
when you want to store the headers that a HTTP site sends to you.
After -D you normally give the name of the file where you want the headers dumped. As with many utilities, - is recognized as an alias for stdout (if you're not familiar with that concept: when you launch a command from a terminal without redirection, stdout is the "terminal screen").
The -D- form (without a space) is exactly the same as -D - (or, on Linux at least, -D /dev/stdout).
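A quick sketch to see the equivalence for yourself (example.com is just a placeholder host):
curl -s -D- https://example.com -o /dev/null
curl -s -D - https://example.com -o /dev/null
curl -s -D /dev/stdout https://example.com -o /dev/null
All three print only the response headers to the terminal and send the body to /dev/null (the /dev/stdout form, as noted above, is the Linux spelling).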
Some research turned up a few useful Stack Exchange posts, namely one about expanding a variable in cURL, but the given answer doesn't seem to properly handle bash variables that contain spaces.
I am setting a variable to the output of awk, parsing a string for a substring (actually just truncating it to 150 characters). The string I am attempting to POST via curl has spaces in it.
When I use the following curl arguments, the POST variable Body is set to the part of the string before the first space.
curl -X POST 'https://api.twilio.com/2010-04-01/Accounts/GUID/SMS/Messages.xml' -d 'From=DIDfrom' -d 'To=DIDto' -d 'Body="'$smsbody'" -u SECGUID
smsbody is set as:
smsbody="$(echo $HOSTNAME$ $SERVICEDESC$ in $SERVICESTATE$\: $SERVICEOUTPUT$ | awk '{print substr($0,0,150)}')"
So the only portion of smsbody that is POSTed is $HOSTNAME$ (which happens to be a string without any space characters).
What is the curl syntax I should use to nest the bash variable properly to expand, but be taken as a single data field?
Seems pretty trivial, but I messed with quotes for a while without luck. I figure someone with better CLI-fu can handle it in a second.
Thanks!
It looks like you have an extra single quote before Body. You also need double quotes or the $smsbody won't be evaluated.
Try this:
curl -X POST 'https://api.twilio.com/2010-04-01/Accounts/GUID/SMS/Messages.xml' \
-d 'From=DIDfrom' -d 'To=DIDto' -d "Body=$smsbody" -u SECGUID
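If you want to see what the quoting changes, here is a tiny illustration with no network involved (the sample value is made up):
smsbody="host1 PING OK - rta 0.20 ms"
printf '<%s>\n' 'Body="'$smsbody'"'   # unquoted expansion: the shell splits this into several arguments
printf '<%s>\n' "Body=$smsbody"       # double quotes: one argument with the spaces preserved
The first printf prints one <...> line per word, which is how curl ends up seeing several separate arguments and why only the text before the first space lands in Body.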
If the $s are still an issue (I don't think the spaces are), try this to prepend a \ to them:
smsbody2=`echo $smsbody | sed 's/\\$/\\\\$/g'`
curl -X POST 'https://api.twilio.com/2010-04-01/Accounts/GUID/SMS/Messages.xml' \
-d 'From=DIDfrom' -d 'To=DIDto' -d "Body=$smsbody2" -u SECGUID
If I run nc -l 5000 and change the Twilio address to localhost:5000, I see the smsbody variable coming in properly.
matt@goliath:~$ nc -l 5000
POST / HTTP/1.1
Authorization: Basic U0VDR1VJRDphc2Q=
User-Agent: curl/7.21.6 (x86_64-apple-darwin10.7.0) libcurl/7.21.6 OpenSSL/1.0.0e zlib/1.2.5 libidn/1.20
Host: localhost:5000
Accept: */*
Content-Length: 45
Content-Type: application/x-www-form-urlencoded
From=DIDfrom&To=DIDto&Body=goliath$ $ in $: