I want to use Marathon's REST API to get the hostname of a particular application.
curl -XGET http://IP:8080/v2/apps/app_name/tasks gives a list of details, including the hostname. However, I want the output to be only the hostname. Does something exist for this?
I am not intimately familiar with the Marathon API, but I wouldn't be surprised if the answer is no.
Have you considered using a JSON processor to extract the values you need? For instance, to get the list of all hosts the tasks are running on, you can do something like:
curl -XGET http://IP:8080/v2/apps/app_name/tasks | jq '.tasks[].host'
And if you are interested in a particular task, something like:
curl -XGET http://IP:8080/v2/apps/app_name/tasks | jq '.tasks[0].host'
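Since the goal is just the hostname, jq's -r flag is also worth knowing: it prints raw strings without the JSON quotes. A small sketch on sample data shaped like Marathon's response (the host values here are made up):

```shell
# Sample data in the shape of Marathon's /v2/apps/<app>/tasks response:
tasks='{"tasks":[{"host":"slave-1.example.com"},{"host":"slave-2.example.com"}]}'

# -r prints raw output, so you get bare hostnames rather than quoted JSON strings:
printf '%s' "$tasks" | jq -r '.tasks[].host'

# Against a live Marathon, the equivalent would be:
# curl -XGET http://IP:8080/v2/apps/app_name/tasks | jq -r '.tasks[].host'
```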
I have an API (via hug) that sits between a UI and Elasticsearch.
The API uses elasticsearch-py to run searches, for example:
es = Elasticsearch([URL], http_compress=True)
@hug.post('/search')
def search(body):
    return es.search(index='index', body=body)
This works fine; however, I cannot figure out how to obtain a compressed JSON result.
Elasticsearch is capable of this, because a curl test checks out: the following returns a mess of characters to the console instead of JSON, and this is what I want to emulate:
curl -X GET -H 'Accept-Encoding: gzip' <URL>/<INDEX>/_search
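As a side note on the curl check itself (not a fix for the hug side): curl's --compressed flag sends Accept-Encoding: gzip and transparently decompresses the reply, so the same request prints readable JSON. Locally, the round trip it performs is just gzip followed by gunzip, demonstrated here on a stand-in body:

```shell
# With --compressed, curl decompresses a gzipped reply for display:
#   curl --compressed -X GET <URL>/<INDEX>/_search

# The round trip curl performs, shown locally on a stand-in response body:
printf '{"took":3,"hits":{"total":0}}' | gzip | gzip -d
```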
I've tried the approach here to modify HTTP headers, but interestingly enough, the "Accept-Encoding": "gzip" header is already there; it just doesn't appear to be passed on to Elasticsearch, because the result is always uncompressed.
Lastly, I'm passing http_compress=True when creating the Elasticsearch instance; however, this only compresses the request payload, not the result.
Has anyone had a similar struggle and figured it out?
I have a table called Goals with the default CLP (public read and write).
My mobile app has a default ACL set so that only the owner of the data can read and write it.
Let's now assume that someone manages to obtain the client keys maliciously from the app, and adds an entry to the Goals table without an ACL, using a command like this:
curl -X POST \
-H "Content-Type: application/json" \
-H "X-Parse-Application-Id: xyx" \
-H "X-Parse-REST-API-Key: 12345" \
-d "{\"name\":\"whatever\"}" \
https://api.parse.com/1/classes/goals
Now every user will load this new entry, which I would like to prevent.
I assume there are two options:
Prevent REST API users from writing data without an ACL (or with a public ACL), perhaps with some Cloud Code
Filter out, in the app, the data that doesn't belong directly to that user
My question is, are the two above the only available options? Is the first option only doable with Cloud Code?
OK, let's try this again. I think the best solution in this case is to create a beforeSave trigger in Cloud Code that sets the ACL the way you want it. With the Parse JavaScript SDK, you can construct an ACL that only gives access to the user passed to its constructor. Something like this (untested code):
Parse.Cloud.beforeSave("Goal", function(request, response) {
    // In Cloud Code, the user making the request is available as request.user.
    request.object.setACL(new Parse.ACL(request.user));
    response.success();
});
Note: I used this post as a reference, and this one too.
I'm not sure if this is possible, but I am trying to update the status of a text widget in Dashing using curl.
The status I would like to set is 'warning' or 'danger', to reflect whether a server has gone down or become unresponsive. My idea is that the dashboard will be populated with several green text widgets, all saying 'online', when the dashboard initialises. Periodically, services running on other machines will post requests to the dashboard, changing the status of widgets.
I have tried using curl to simulate the POST messages from other machines, and I'm able to update the text and title of the text widgets, but I have had no luck updating the status.
I have been using:
curl -d "{ \"auth_token\": \"YOUR_AUTH_TOKEN\", \"status\": \"danger\" }" -H "Content-Type: application/json" http://localhost:3030/widgets/frontend11
But the widget does not change colour. I have seen examples where the CoffeeScript code was amended to include this possibility, but I thought this functionality was included in all widgets?
We do this (changing the status via curl) and it works great. Here's a snippet of our code:
json='{ "auth_token": "'$dashing_auth_token'", "current": '$widget_value', "value": '$widget_value', "status": "'$widget_status'" }'
curl -H Content-Type:application/json -d "${json}" "${dashing_url}widgets/${widget_id}"
The above is in a function that gets passed all of the variables, but the variable names should hopefully be easy enough to read that you can make sense of it. I can write up more (or send the whole function) if it would help, but I think just those two lines should be enough to get you there without the rest of the clutter. Let me know if more would be helpful.
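For completeness, here is a minimal sketch of such a wrapper; the function name is my own, and the dashing_url, auth token, and widget id are placeholders you would fill in for your setup:

```shell
# Build the JSON body Dashing expects. Arguments: auth token, value, status.
build_payload() {
  printf '{ "auth_token": "%s", "current": %s, "value": %s, "status": "%s" }' \
    "$1" "$2" "$2" "$3"
}

# Pipe the body straight into curl; -d @- tells curl to read the data from stdin.
# build_payload "$dashing_auth_token" 42 danger |
#   curl -H "Content-Type: application/json" -d @- "${dashing_url}widgets/frontend11"
```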
I have been asked to write UNIX shell scripts that scrape certain websites. We use Fiddler to trace the HTTP requests, then we write the cURL commands accordingly. For the most part, scraping most websites seems fairly simple; however, I've run into a situation where I'm having difficulty capturing certain information.
I have to be somewhat generic in saying that I cannot provide the address of the website I am actually looking at; however, I can post some of the requests and responses to provide context.
Here's the situation:
The website starts with a search screen. You enter your search query and the website returns a list of results.
I need to choose the first result from the result page.
I need to capture EVERYTHING on the page from the first result.
Everything up until this point is working fine.
Here's the problem:
The page returned has hyperlinks that are Wicket links. When these links are clicked, a window pops up within the page; it is not an actual window like a JavaScript pop-up, it is more comparable to what you see when you 'compose a message' or 'poke' someone on Facebook (am I the only one who still does that?).
I need to capture the contents of that pop-up window. There are usually multiple Wicket links on a given page. Handling that should be easy enough with a loop, but first I need to figure out the proper way to cURL those Wicket links.
Here is the cURL command I'm currently using to attempt to scrape the Wicket links.
(I'm explicitly defining the Referer URL, Accept header, and Wicket-Ajax boolean, as these were the items sent in the header when I traced the site.) $link is the URL, which looks like this:
http://www.someDomainName.com/searches/?x=as56f1sa65df1&random=0.121345151
(I believe the random parameter is populated by some JavaScript; I'm not sure if it's needed, or even possible to recreate. I'm currently sending one of the random values that I received on one particular occasion.)
/bin/curl -v3 -b COOKIE -c COOKIE -H "Accept: text/xml" -H "Referer: $URL$x" -H "Wicket-Ajax: true" -sLf "$link"
Here is the response I get:
<ajax-response><redirect><![CDATA[home.page;jsessionid=6F45DF769D527B98DD1C7FFF3A0DF089]]></redirect>
</ajax-response>
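For reference, a redirect inside an ajax-response usually means Wicket did not accept the AJAX request for that session (note the jsessionid in the target). The redirect target can be pulled out of the CDATA section with sed and then followed with the same cookie jar; a rough sketch, using the response text above:

```shell
response='<ajax-response><redirect><![CDATA[home.page;jsessionid=6F45DF769D527B98DD1C7FFF3A0DF089]]></redirect></ajax-response>'

# Extract the redirect target from the CDATA section.
target=$(printf '%s' "$response" | sed -n 's/.*<!\[CDATA\[\(.*\)]]>.*/\1/p')
echo "$target"

# Following it with the same cookie jar would look something like:
# /bin/curl -b COOKIE -c COOKIE -sLf "http://www.someDomainName.com/$target"
```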
I am expecting an XML document with actual content to be returned. Any insight into this issue would be greatly appreciated. Please let me know if you need more information.
Thanks,
Paul
I am playing with the open data project at spogo.co.uk (Sport England).
See here for a search example: https://spogo.co.uk/search#all/Football Pitch/near-london/range-5.
I have been using Cygwin and curl to POST JSON data to the MVC controller. An example is below:
curl -i -X POST -k -H "Accept: application/json" -H "Content-Type: application/json; charset=utf-8" https://spogo.co.uk/search/all.json --data '{"searchString":"","address": {"latitude":55,"longitude":-3},"page":0}'
Question:
How can I find out what other variables can be included in the post data?
How can I return all results, rather than just 20 at a time? Cycling through page numbers doesn't deliver all at once.
AJAX is simply a technique for posting data over a connection asynchronously, and JSON is just a string format that can contain data. Neither has a built-in mechanism for querying information such as which fields are accepted or how much data is returned.
You will want to check the web service documentation on spogo.co.uk for these answers. If their web service exposes such functionality, the documentation will be the final authority on what the commands and formats are.
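On the paging question specifically: absent a documented "return all" parameter, looping over the page field in the body is the usual workaround. A hedged sketch (the body matches the one in the question; the page range and function name are my own):

```shell
# Build the POST body from the question; only the page number varies.
build_search_body() {
  printf '{"searchString":"","address":{"latitude":55,"longitude":-3},"page":%d}' "$1"
}

# Fetch the first three pages (uncomment the curl to actually hit the service):
for page in 0 1 2; do
  build_search_body "$page"; echo
  # build_search_body "$page" | curl -s -X POST -k \
  #   -H "Accept: application/json" -H "Content-Type: application/json; charset=utf-8" \
  #   --data @- https://spogo.co.uk/search/all.json
done
```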