How can I get a metric for a specific version in SonarQube using the web service API?

For my release automation I'm creating a document generator that includes the current measurements from SonarQube. In this document I would like to report the differences between several versions of the code.
I managed to get the list of versions without any problem using
http://nemo.sonarqube.org/api/events?resource=org.codehaus.sonar:sonar&categories=Version
and I also managed to get a measurement of the current code state using
http://nemo.sonarqube.org/api/resources?resource=org.codehaus.sonar:sonar&metrics=ncloc
Can anybody help me with how to get the ncloc of an older version, say version '4.0'?

The web service does not allow you to query this information.

Well, the solution is roundabout, but you can get the data you want for each version.
Proposed solution:
Get the version-specific details from the API:
http://nemo.sonarqube.org/api/events?resource=org.codehaus.sonar:sonar&categories=Version&format=json
The response will look something like:
[{"id":"23761","rk":"helloworld","n":"1.1","c":"Version","dt":"2017-07-19T20:28:54-0500"},
{"id":"23731","rk":"helloworld","n":"1.0","c":"Version","dt":"2017-07-18T14:51:20-0500"},
{"id":"5107","rk":"helloworld","n":"1","c":"Version","dt":"2015-12-07T11:37:44-0600"}]
The "dt" value specifies the point of time where the Version is released.
Parse the JSON and get the dt values. Find the minimum and maximum date values from the obtained dt values.
Use the time machine API to query out the metrics you need using the API
http://nemo.sonarqube.org/api/timemachine?resource=helloworld&metrics=coverage,ncloc&rfomDateTime=(min_dt_value)&toDateTime=(max_dt_value)
You will get all the metrics between the timestamps.
Compare our version-specific dt values against the one obtained from the above response and thus get the version specific metric values.
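The round trip can be scripted end to end. Here is a rough sketch, assuming Python with the requests library, the pre-5.x web services discussed above (api/events and api/timemachine), and the public nemo instance; the exact shape of the timemachine response (cols/cells with "d" and "v") is from memory, so adjust the lookup to what your server actually returns.

```python
# Sketch: map version events to time-machine measures.
import requests

BASE = "http://nemo.sonarqube.org"
RESOURCE = "org.codehaus.sonar:sonar"

# 1. Fetch the version events; each entry carries its release time in "dt".
events = requests.get(
    f"{BASE}/api/events",
    params={"resource": RESOURCE, "categories": "Version", "format": "json"},
).json()
dt_by_version = {e["n"]: e["dt"] for e in events}

# 2. Query the time machine between the earliest and latest release dates.
dts = sorted(dt_by_version.values())
data = requests.get(
    f"{BASE}/api/timemachine",
    params={
        "resource": RESOURCE,
        "metrics": "ncloc",
        "fromDateTime": dts[0],
        "toDateTime": dts[-1],
    },
).json()

# 3. Match each version's "dt" against the time-machine cells.
#    (Response shape assumed from memory; the date strings may need
#    normalization before comparing.)
for version, dt in dt_by_version.items():
    for cell in data[0]["cells"]:
        if cell["d"] == dt:
            print(version, "ncloc =", cell["v"][0])
```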

Related

How to conditionally process FlowFile's by a MongoDB query result?

I need to process a list of files based on the result of a MongoDB query, but I can't find any processor that would let me do that. I basically have to take each file and either process it or discard it entirely, based on the result of a query that involves that file's attributes.
The only MongoDB-related processor that I see in NiFi 1.5.0 is GetMongo, which apparently can't receive incoming connections, but can only emit FlowFiles based on its configured parameters.
Am I looking in the wrong place?
NIFI-4827 is an Improvement Jira that aims to allow GetMongo to accept incoming flow files, where the content would contain the query and the properties would accept Expression Language. The code is still under review, but the intent is to make it available in the upcoming NiFi 1.6.0 release.
As a possible workaround in the meantime, if there is a REST API you could use InvokeHttp to make the call(s) manually and parse the result(s). Also, if you have a JDBC driver for MongoDB (such as Unity), you might be able to use ExecuteSQL.
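For what it's worth, here is a minimal model of the per-file decision logic that flow would implement, written directly against MongoDB with pymongo; the host, database, collection, and attribute names are all hypothetical placeholders.

```python
# Sketch of the per-file "process or discard" decision, modeled in pymongo.
# Host, database, collection, and attribute names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["file_registry"]

def should_process(attrs):
    """True if a document matching this file's attributes exists."""
    query = {"filename": attrs["filename"], "checksum": attrs["checksum"]}
    return collection.count_documents(query, limit=1) > 0

incoming = [
    {"filename": "a.csv", "checksum": "abc123"},
    {"filename": "b.csv", "checksum": "def456"},
]
for f in incoming:
    print("process" if should_process(f) else "discard", f["filename"])
```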

SonarQube API - multiple metrics for all components/projects

I'd like to retrieve multiple metrics for ALL projects in SonarQube in one GET request. Is this possible?
It seems like GET api/measures/component can give me the XML I want, but only if given a specific componentKey (project name). The only other alternative seems to be to go one by one through each component, which wouldn't be ideal given that I have over 500 projects I would like to get metrics for.
What you're asking for is not possible. You'll have to query api/measures/component once per project to get the measures for all of them; a sketch of that loop follows.
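In practice the loop is short. A sketch assuming Python with requests against api/measures/component; the server URL, token, project keys, and metric list are placeholders.

```python
# Sketch: collect the same metrics for many projects, one request per project.
import requests

BASE = "https://sonarqube.example.com"        # placeholder instance
PROJECT_KEYS = ["project-a", "project-b"]     # placeholder: your ~500 keys
METRICS = "ncloc,coverage,sqale_index"        # placeholder metric list

results = {}
for key in PROJECT_KEYS:
    resp = requests.get(
        f"{BASE}/api/measures/component",
        params={"componentKey": key, "metricKeys": METRICS},
        auth=("your-token", ""),  # placeholder: token as user, empty password
    )
    resp.raise_for_status()
    component = resp.json()["component"]
    results[key] = {m["metric"]: m.get("value") for m in component["measures"]}

for key, measures in results.items():
    print(key, measures)
```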

Connecting NiFi to ElasticSearch

I'm trying to solve a task and would appreciate any help: links to documentation, links to forums, other FAQs besides https://cwiki.apache.org/confluence/display/NIFI/FAQs, or any meaningful answer in this post =).
So, I have the following task:
The initial part of my system collects data every 5-15 minutes from different DB sources. Then I remove duplicates, remove junk, combine data from the different sources according to my logic, and redirect it to the second part of the system as several streams.
As far as I know, NiFi can do this task best =).
Currently I can successfully get information from InfluxDB with a GetHTTP processor. However, I can't configure the same kind of processor for getting information from Elasticsearch with all the necessary options. I'd like to receive data every 5-15 minutes (depending on the scheduler period) for the time range from "now minus 5-15 minutes" to "now", with several additional filters. If I understand correctly, this can be achieved either by subscribing to "_index" or by regular requests to the DB at the desired interval.
I know that NiFi has several processors designed specifically for Elasticsearch (FetchElasticsearch5, FetchElasticsearchHttp, QueryElasticsearchHttp, ScrollElasticsearchHttp) as well as the GetHTTP and PostHTTP processors. Unfortunately, I lack information, or better yet, examples of how to configure their Properties for my purposes =(.
What's the difference between FetchElasticsearchHttp and QueryElasticsearchHttp? Which one fits my task better? What's the difference between GetHTTP and QueryElasticsearchHttp besides several specific fields? Will GetHTTP perform the same way if I tune it as I need?
Any advice?
I will be grateful for any help.
The ElasticsearchHttp processors try to make it easier to interact with ES by generating the appropriate REST API call based on the properties you set. If you know the full URL you need, you could use GetHttp or InvokeHttp; the ESHttp processors simply let you put in just the stuff you're looking for, and they will generate the URL and return the results.
FetchElasticsearch (and its variants) is used to get a particular document when you know its identifier. This is sometimes used after a search/query, to return documents one at a time once you know which ones you want.
QueryElasticsearchHttp is for when you want to do a Lucene-style query of the documents, when you don't necessarily know which documents you want. It will only return up to the value of index.max_result_window for that index; to get more records, you can use ScrollElasticsearchHttp afterwards. NOTE: QueryElasticsearchHttp expects a query that will work as the "q" parameter of the URL, and this "mini-language" does not support all fields/operators.
For your use case, you likely need InvokeHttp in order to issue the kind of query you describe: a range query for the last 15 minutes. Once your results are returned, you might need some combination of EvaluateJsonPath and/or SplitJson to work with the individual documents; see the Elasticsearch REST API documentation (and the NiFi processor documentation) for more details.
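For reference, the request InvokeHttp would issue looks roughly like the sketch below (Python here just to show the call; the host, index, and timestamp field are placeholders, while "now-15m" is standard Elasticsearch date math).

```python
# Sketch: query Elasticsearch for documents from the last 15 minutes.
# Host, index name, and timestamp field are hypothetical placeholders.
import requests

query = {
    "query": {
        "range": {
            "@timestamp": {        # assumed timestamp field
                "gte": "now-15m",  # Elasticsearch date math
                "lt": "now",
            }
        }
    }
}

resp = requests.post(
    "http://localhost:9200/my-index/_search",  # placeholder host/index
    json=query,
)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"])
```

In NiFi, InvokeHttp would POST this same JSON body to the _search URL on whatever schedule you configure (e.g., every 5-15 minutes).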

What is the difference between startDate and a filter on "published" in the Okta Events API?

I've written a .NET app using the Okta.Core.Client 0.2.9 SDK to pull events from our organization's syslog for import into another system. We've got it running every 5 minutes, pulling events published since the last event received in the previous run.
We're seeing delays in some events showing up. If I do a manual run at the top of the hour for the previous hour's data, it will include more rows than the 5-minute runs did. While trying to figure out why, I remembered the startDate param, which is mutually exclusive with the filter one I've been using.
The docs don't say much about it, just that it "Specifies the timestamp to list events after". Does it work the same as published gt "some-date"? We're capturing data for chunks of time, so I needed to include a "less than" filter and ignored startDate. But the delayed events have me looking for a workaround.
Are you facing delayed results using startDate or filter?
Yes, published gt "some-date" and startDate work the same way. The following two API calls:
/api/v1/events?limit=100&startDate=2016-07-06T00:00:00.000Z
and
/api/v1/events?limit=100&filter=published gt "2016-07-06T00:00:00.000Z"
return the same result. Since they are mutually exclusive, filter might come in handy for creating more specific queries that combine other conditions, as in the sketch below.
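For the bounded windows described in the question, filter is the way to go, since startDate cannot be combined with an upper bound. A sketch in Python with requests; the Okta domain and token are placeholders, and the compound filter (published gt ... and published lt ...) is my assumption of what the events API accepts, so verify against the Okta filter documentation.

```python
# Sketch: pull events for a bounded time window with the filter parameter.
# Domain and API token are placeholders.
import requests

BASE = "https://your-org.okta.com"
HEADERS = {"Authorization": "SSWS <api-token>"}  # placeholder token

params = {
    "limit": 100,
    "filter": 'published gt "2016-07-06T00:00:00.000Z" '
              'and published lt "2016-07-06T01:00:00.000Z"',
}
resp = requests.get(f"{BASE}/api/v1/events", headers=HEADERS, params=params)
for event in resp.json():
    print(event["published"], event.get("action", {}).get("message"))
```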

SonarQube updating manual metric via REST API sets date to January 17, 1970

We have an install of Sonar 5.1.2, and we're attempting to attach a performance metric to our Sonar scan. However, when we update the metric via the REST API, the metric's "updated_at" is always set to January 17, 1970. This, unsurprisingly, messes up the timeline view, leaving us with only the message "Current timeline is reduced to a shorter period because of limited historical data for one of the metric."
That is, we issue the call to
http://statica:9000/api/manual_measures?resource=<project name>&metric=<metric name>&val=<value>
(We supply the appropriate authorization for the call.)
We get the response:
{"id":3,"metric":"<metric name>","resource":"<project name>","val":17.0,"created_at":"2015-10-09T11:41:04-0400","updated_at":"1970-01-17T12:19:04-0500","login":"<user name>","username":"<user name"}
When we go to the site itself, go into the project, and choose "Settings > Manual Measures", we can see our metric there, and the DATE column shows "Jan 17 1970 12:19", which matches what was returned via the REST API.
Also, if we then go to the dashboard of the project where we have the Timeline widget configured to show the metric (as well as LOC and Coverage), we get the message at the bottom, "Current timeline is reduced to a shorter period because of limited historical data for one of the metric.", and a single flat line in the graph.
Is this expected? Is there any way to capture the date the metric was updated instead of this default date? Is there a parameter we need to supply when updating the metric value?
This web service is pretty old and does not comply with the current web service guidelines of SonarQube. (The January 17, 1970 date is the classic symptom of a Unix timestamp in seconds being read as if it were milliseconds.) It has been removed in 5.2 and replaced by the WS api/custom_measures/create, which does not have this issue AFAIK.
As of today, no 5.1.3 release is planned, so this issue won't be fixed.
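Once on 5.2+, the replacement call would look something like the sketch below; the parameter names (projectKey, metricKey, value) are from memory of the api/custom_measures documentation, so verify them against your instance's web service docs.

```python
# Sketch: create a custom measure with the post-5.2 web service.
# Credentials, project key, and metric key are placeholders.
import requests

resp = requests.post(
    "http://statica:9000/api/custom_measures/create",
    auth=("admin", "admin"),  # placeholder credentials
    data={
        "projectKey": "<project key>",   # placeholder
        "metricKey": "<metric name>",    # placeholder
        "value": 17.0,
    },
)
resp.raise_for_status()
print(resp.json())
```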
