Is it possible to see all metrics (all paths) in whisper (graphite)?

I have a lot of metrics in Graphite and I need to search through them.
I tried whisper-fetch.py, but it returns the metric values (numbers); I want the metric names, something like this:
prefix1.prefix2.metricName1
prefix1.prefix2.metricName2
...
Thank you.

Graphite has a dedicated endpoint for retrieving all metrics as part of its HTTP API: /metrics/index.json
For example, running this command against my local Graphite
curl localhost:8080/metrics/index.json | jq "."
produces the following output:
[
"carbon.agents.graphite-0-a.activeConnections",
"carbon.agents.graphite-0-a.avgUpdateTime",
"carbon.agents.graphite-0-a.blacklistMatches",
"carbon.agents.graphite-0-a.cache.bulk_queries",
"carbon.agents.graphite-0-a.cache.overflow",
...
"stats_counts.response.200",
"stats_counts.response.400",
"stats_counts.response.404",
"stats_counts.statsd.bad_lines_seen",
"stats_counts.statsd.metrics_received",
"stats_counts.statsd.packets_received",
"statsd.numStats"
]

You can just use the Unix find command, e.g. find /data/graphite -name 'some_pattern', or use the web API, e.g. curl http://my-graphite/metrics/find?query=somequery (see the Graphite metrics API).
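If you go the filesystem route, here is a minimal sketch that turns the .wsp file paths into dotted metric names (it assumes the whisper files live under /data/graphite, as in the example above; adjust the path to your storage directory):
# list every whisper file and convert its path into a dotted metric name
find /data/graphite -name '*.wsp' \
  | sed -e 's|^/data/graphite/||' -e 's|\.wsp$||' -e 's|/|.|g'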

Related

gcloud dns managed-zones list along with record-sets count format

In the output of gcloud dns managed-zones list, I want to show the dnsName, creationTime, name, networkName, visibility, and the count of record-sets in each hosted zone.
I used the two commands below to get the two outputs separately:
#get hosted-zone and other values
gcloud dns managed-zones list --format='table(dnsName, creationTime:sort=1, name, privateVisibilityConfig.networks.networkUrl.basename(), visibility)'
#get record-sets for a hostedzone
gcloud dns record-sets list --zone=$zoneName |awk 'NR>1{print}'|wc -l
I think I can get this from a shell script by getting the list of hosted zones and then printing the two outputs together.
But is there a better way to do it in a single gcloud command?
IIRC (!?), you'll need to issue both gcloud commands as each provides distinct data.
To your point, you should be able to easily combine the commands using a shell script, iterating over each zone from managed-zones list to issue record-sets list --zone=${i}.
If you'd like help, please include dummy data from the 2 commands and I'll draft something for you.
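In the meantime, a minimal sketch of that loop (the --format fields and the output layout are just assumptions; adjust as needed):
# for each managed zone, print the zone name followed by its record-set count
for zone in $(gcloud dns managed-zones list --format='value(name)'); do
  count=$(gcloud dns record-sets list --zone="$zone" --format='value(name)' | wc -l)
  echo "$zone $count"
done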

Get deployments counts on Ansible Tower

How to get deployment counts on Ansible Tower? Is there any way to get deployment counts using ansible tower REST API?
With respect to your question and the comment
... or last jobs history list ...
you could, according to the Ansible Tower REST API and Jobs - List jobs, use something like
curl --silent -u "${ACCOUNT}:${PASSWORD}" https://${TOWER_URL}/api/v2/jobs/ | jq .
which also provides Sorting and Searching options, for example ?order_by=name.
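Since the list response is paginated, a rough sketch for getting a plain count (the count field is part of the standard Tower/AWX list response; the ?status=successful filter is only an illustration):
curl --silent -u "${ACCOUNT}:${PASSWORD}" "https://${TOWER_URL}/api/v2/jobs/?status=successful" | jq '.count'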

Service that outputs your ip and geo location and other ip related info in the commandline like wtfismyip.com?

I've been using wtfismyip.com to get info about my ip by doing
curl wtfismyip.com/json
It outputs all the info to the terminal in a nice JSON format. Is there another service like this that outputs to the terminal?
curl http://api.db-ip.com/v2/free/self
This outputs your IP info in JSON format; you can also specify a field name to get a plain-text response (e.g. http://api.db-ip.com/v2/free/self/countryCode).
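If you stay with the JSON response, a small sketch for pulling out a single field with jq (the countryCode field name is taken from the text-response URL above):
curl -s http://api.db-ip.com/v2/free/self | jq -r '.countryCode'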

Can you view historic logs for parse.com cloud code?

On the Parse.com cloud-code console, I can see logs, but they only go back maybe 100-200 lines. Is there a way to see or download older logs?
I've searched their website & googled, and don't see anything.
Using the parse command-line tool, you can retrieve an arbitrary number of log lines:
Usage:
parse logs [flags]
Aliases:
logs, log
Flags:
-f, --follow=false: Emulates tail -f and streams new messages from the server
-l, --level="INFO": The log level to restrict to. Can be 'INFO' or 'ERROR'.
-n, --num=10: The number of the messages to display
Not sure if there is a limit, but I've been able to fetch 5000 lines of log with this command:
parse logs prod -n 5000
To add on to Pascal Bourque's answer, you may also wish to filter the logs by a given range of dates. To achieve this, I used the following:
parse logs -n 5000 | sed -n '/2016-01-10/, /2016-01-15/p' > filteredLog.txt
This will get up to 5000 logs, use the sed command to keep all of the logs which are between 2016-01-10 and 2016-01-15, and store the results in filteredLog.txt.

Using CURL to download file and view headers and status code

I'm writing a Bash script to download image files from Snapito's web page snapshot API. The API can return a variety of responses indicated by different HTTP response codes and/or some custom headers. My script is intended to be run as an automated Cron job that pulls URLs from a MySQL database and saves the screenshots to local disk.
I am using curl. I'd like to do these three things using a single curl command:
Extract the HTTP response code
Extract the headers
Save the file locally (if the request was successful)
I could do this using multiple curl requests, but I want to minimize the number of times I hit Snapito's servers. Any curl experts out there?
Or if someone has a Bash script that can respond to the full documented set of Snapito API responses, that'd be awesome. Here's their API documentation.
Thanks!
Use the dump headers option:
curl -D /tmp/headers.txt http://server.com
Use curl -i (include HTTP headers), which will yield the headers, followed by a blank line, followed by the content.
You can then split out the headers and content (or use -D to save the headers directly to a file, as suggested above).
There are three relevant options: -i, -I, and -D.
> curl --help | egrep '^ +\-[iID]'
-D, --dump-header FILE Write the headers to FILE
-I, --head Show document info only
-i, --include Include protocol headers in the output (H/F)
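Putting those together for the original question, here is a sketch of a single request that saves the body, dumps the headers, and captures the status code ($API_URL and the file names are placeholders):
# -o saves the body, -D dumps the response headers, -w prints the status code
status=$(curl -s -o screenshot.png -D headers.txt -w '%{http_code}' "$API_URL")
if [ "$status" = "200" ]; then
  echo "saved screenshot.png"
else
  echo "request failed with HTTP $status" >&2
fi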
