I am loading XML as a String from a remote server using curl, as below:
$ curl -i -H "Accept: application/xml" -X GET "URL Here"
but the response is not formatted XML and hence is not easily readable:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><RunConfig><PipeLineXmlVersion>1.0</PipeLineXmlVersion><DateTime>20161128_160859</DateTime><Analysis><Lane>1</Lane><PipeLine>run_multiplexed_auto_start_v4.0.sh</PipeLine><Version>4.0</Version><Mismatch>1</Mismatch><MergeLane>0</MergeLane><Version>4.0</Version></Analysis></RunConfig>
When I try the same API using a REST client, I can see the XML properly formatted.
From what I have found searching, the Accept header should work, but unfortunately it does not in my case.
Please help me with this.
Thanks.
If what you mean by "not proper" is that the response is not pretty-printed (i.e. it lacks spaces and indentation), there are plenty of command-line tools to format XML.
For example:
curl ... | xmllint --format -
Here, you pipe the response of curl into xmllint (part of libxml2-utils), which will format your answer. The - at the end tells the tool to read the document from standard input; the formatted result is printed to the console.
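Applied to the request from the question, this would look roughly as follows; note that curl's -i flag has to be dropped (or the headers stripped), since the response headers would otherwise be fed to xmllint too. The output shown is only a sketch:
$ curl -s -H "Accept: application/xml" -X GET "URL Here" | xmllint --format -
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<RunConfig>
  <PipeLineXmlVersion>1.0</PipeLineXmlVersion>
  <DateTime>20161128_160859</DateTime>
  <Analysis>
    <Lane>1</Lane>
    ...
  </Analysis>
</RunConfig>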
Have a look at this question for more ways to achieve it.
HTTP GraphQL calls always return 200 - even if the response doesn't suit the request.
I'm trying to make a GraphQL call and detect whether it failed, e.g. using $? or --fail, but that doesn't help because the response code is always 200.
Even when GraphQL's output doesn't match the input and contains error arrays, curl only cares about the HTTP code, which is always 200.
Is there a way for curl to recognize a GraphQL error? Some kind of built-in mechanism to compare the requested input to the actual output and detect that there's an error?
Perhaps I'm barking up the wrong tree here and should use some command-line tool more dedicated to GraphQL? Thanks.
curl doesn't know anything in particular about GraphQL. You can pipe the output of curl to grep to check for the presence of errors and draw conclusions based on that as necessary.
For example:
curl --request POST \
--header 'content-type: application/json' \
--url http://localhost:4000/ \
--data 'your query data' | grep "errors"
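If the script should also fail when the response contains GraphQL errors, you can branch on grep's exit status (a minimal sketch; the endpoint and query payload are the placeholders from the example above):
# Sketch: abort when the GraphQL response body contains an "errors" key
response=$(curl -s --request POST \
  --header 'content-type: application/json' \
  --url http://localhost:4000/ \
  --data 'your query data')
if printf '%s' "$response" | grep -q '"errors"'; then
  echo "GraphQL returned errors: $response" >&2
  exit 1
fi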
I'm writing a bash script to interface with my ecobee (query info and change settings). I have the authorization all worked out (access token and refresh token) and am now trying to request info from the ecobee. This json parameter list is dynamically created. None of the curl examples in the Developers API Doc seem to work.
I've tried assigning the JSON to a variable (?json="$selection") and to a file (?json=@"$location"). My latest attempt at the curl call (hard-coding the JSON and escaping the braces) is as follows:
response=$(curl -s "https://api.ecobee.com/1/thermostat?json=\{"selection":\{"selectionType":"registered","selectionMatch":null,"includeRuntime":true,"includeSettings":true/}/}" -H "Content-Type: application/json;charset=UTF-8" -H "Authorization: Bearer $access_token")
and I received a null response: declare -- response=""
If I have curl read from a file:
{"selection":{"selectionType":"registered","selectionMatch":null,"includeRuntime":true,"includeSettings":true}}
then I receive:
response='{
"status": {
"code": 4,
"message": "Serialization error. Malformed json. Check your request and parameters are valid."
}
}'
I'm assuming this is an escaping issue?
Can anyone offer any insight? Again, I thought this would be the easy part. I'm using jq to build my request JSON. Are there alternatives to curl that deal better with JSON? I'm limited to bash (which is what I know).
Thanks
To integrate (arbitrary) data into a URL (or a URI in general), you need to prepare it using percent-encoding first.
As you mentioned that you use jq to compose that data, you could add @uri to your jq filter to perform the encoding, and use the --raw-output (or -r) option to output it properly.
For example, jq -r '@uri' applied to your JSON
{
"selection": {
"selectionType": "registered",
"selectionMatch": null,
"includeRuntime": true,
"includeSettings": true
}
}
would output
%7B%22selection%22%3A%7B%22selectionType%22%3A%22registered%22%2C%22selectionMatch%22%3Anull%2C%22includeRuntime%22%3Atrue%2C%22includeSettings%22%3Atrue%7D%7D
which you can use within the query string ?json=….
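Putting it together with curl, a minimal sketch could look like this (the URL, headers, and $access_token are taken from the question; everything else is a placeholder):
# build the JSON (here hard-coded, normally via jq), percent-encode it, and put it in the query string
json='{"selection":{"selectionType":"registered","selectionMatch":null,"includeRuntime":true,"includeSettings":true}}'
encoded=$(printf '%s' "$json" | jq -r '@uri')
response=$(curl -s "https://api.ecobee.com/1/thermostat?json=$encoded" \
  -H "Content-Type: application/json;charset=UTF-8" \
  -H "Authorization: Bearer $access_token")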
I am working on Windows 10 and tried to load a GeoJSON file into my CouchDB via the curl command and a POST request in cmd, which looks like this:
C:\Program Files\cURL\bin>curl -d @path-to-my-data\data.geojson -H "Content-type: application/json" -X POST http://127.0.0.1:5984/_utils/database.html?-dbName-
and then I get the following error:
{"error":"method_not_allowed","reason":"Only GET,HEAD allowed"}
On http://couchdb-13.readthedocs.org/en/latest/api-basics/ it is said that "If you use an unsupported HTTP request type with a URL that does not support the specified type, a 405 error will be returned, listing the supported HTTP methods."
When I try that with a PUT request, I get the same error.
I validated the json with jsonlint so this should not be the problem.
I tried several tutorials like "Three Steps to CouchDB Heaven …" or "Export & Import a Database with CouchDB" but none of them seems to work.
So I am not sure where the problem is. Do I need to make changes in my geojson file, or is it something else?
Thanks for your help.
The needed curl command looks like this; note that the document has to be POSTed to the database endpoint itself, not to the _utils web UI page (which only allows GET and HEAD, hence the 405 error):
curl -H "Content-Type: application/json" -X POST http://localhost:5984/db -d @C:\Users\Name\Desktop\data.geojson
How do you POST a binary variable in curl bash?
#!/usr/bin/env bash
IMAGE=$(curl "http://www.google.com/images/srpr/logo3w.png")
curl --data-binary "$IMAGE" --request "POST" "http://www.somesite.com"
Curl seems to corrupt the image when uploading.
Curl has the option to write the response to disk and then read it back, but it would be more efficient to do it solely in memory.
Try to eliminate the variable (a bash command substitution cannot hold raw binary data safely; NUL bytes are dropped, for example) and pipe the download straight into the upload, as follows:
curl "http://www.google.com/images/srpr/logo3w.png" | curl --data-binary - --request "POST" "http://www.somesite.com"
From the curl man page:
If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin.
EDIT: From the man page, too:
--raw When used, it disables all internal HTTP decoding of content or transfer encodings and instead makes them passed on unaltered, raw. (Added in 7.16.2)
What happens if it is applied on either or both sides?
I had a related problem, where I wanted to dynamically curl a file from a given folder.
curl --data-binary directory/$file --request "POST" "http://www.somesite.com"
did not work; it uploaded the string "directory/myFile.jar" instead of the actual file.
Adding the @ symbol
curl --data-binary @directory/$file --request "POST" "http://www.somesite.com" fixed it.
I'm sending requests to a third-party API. It says I must send an HTTP PUT to http://example.com/project?id=projectId
I tried doing this with PHP curl, but I'm not getting a response from the server. Maybe something is wrong with my code because I've never used PUT before. Is there a way for me to execute an HTTP PUT from the bash command line? If so, what is the command?
With curl it would be something like
curl --request PUT --header "Content-Length: 0" http://website.com/project?id=1
but like Mattias said, you'd probably want some data in the body as well, so you'd also want the Content-Type header and the data itself (and the Content-Length would then be larger than 0).
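For example, with a small JSON body (the payload and URL here are just placeholders; curl computes the Content-Length automatically):
curl --request PUT \
  --header "Content-Type: application/json" \
  --data '{"name": "example"}' \
  "http://website.com/project?id=1"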
If you really want to use only bash, it actually has some networking support.
echo -e "PUT /project?id=123 HTTP/1.1\r\nHost: website.com\r\n\r\n" > \
/dev/tcp/website.com/80
But I guess you also want to send some data in the body?
Like Mattias suggested, Bash can do the job without further tools. If you want to send data, you have to set at least the Content-Length header. With the variables host, port, resource, and data defined, you can do an HTTP PUT with
echo -e "PUT /$resource HTTP/1.1\r\nHost: $host:$port\r\nContent-Length: ${#data}\r\n\r\n$data\r\n" > /dev/tcp/$host/$port
I tested this with a REST API and it works fine.
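If you also need to read the server's reply, a variation is to open the connection on a file descriptor instead of redirecting a single echo (a sketch reusing the same host, port, resource, and data variables):
# open a bidirectional TCP connection on file descriptor 3
exec 3<>/dev/tcp/$host/$port
# send the PUT request, then print the raw HTTP response and close the connection
printf 'PUT /%s HTTP/1.1\r\nHost: %s:%s\r\nContent-Length: %s\r\nConnection: close\r\n\r\n%s' \
    "$resource" "$host" "$port" "${#data}" "$data" >&3
cat <&3
exec 3>&-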