How to get all results for api/issues/search (not just first 500)? - sonarqube

I am trying to use the SonarQube web service API api/issues/search to extract the information for all issues. But I see that the API returns at most 500 results per request, even when I set filters like pageSize.
Is there a different way of using this API so that I can get all the issues in the result list?

The web service results are paginated. Use ps (page size) and p to step through the result set.
That said, there's a hard limit of 10,000 results: you cannot page past the 10,000th issue.
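The paging loop described above can be sketched in a few lines of Python. This is a sketch, not a complete client: fetch_page stands in for the authenticated HTTP GET of api/issues/search and is assumed to return the decoded JSON response; the p/ps parameter names and the 10,000-result cap are from the answer above.

```python
# Sketch: collect all issues by stepping through pages with p (page
# number) and ps (page size). fetch_page(p=..., ps=...) stands in for
# an HTTP GET of api/issues/search and must return the decoded JSON
# response, e.g. {"total": 1200, "issues": [...]}.

MAX_RESULTS = 10_000  # SonarQube's hard limit on pageable results

def fetch_all_issues(fetch_page, page_size=500):
    issues = []
    page = 1
    while True:
        data = fetch_page(p=page, ps=page_size)
        issues.extend(data["issues"])
        # Stop once we've seen everything, or hit the 10k hard limit.
        if len(issues) >= min(data["total"], MAX_RESULTS):
            break
        page += 1
    return issues
```

In a real script, fetch_page would wrap something like `requests.get(base_url + "/api/issues/search", params={"p": p, "ps": ps}, auth=...)`; injecting it as a callable keeps the paging logic testable without a server.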

Related

Jmeter: How to count numbers of rows returned from a search (response or gui)

In JMeter I need to perform a large search and count the number of rows which are returned. Max rows are 50000.
The number of rows which are returned are shown on the website after a search. "Number of returned rows: xx".
Or I can count the rows inside the HTTP response.
I have tried to use a regex post-processor to count the number of rows which are returned; the problem is that JMeter freezes since the HTTP response is so large.
I have also tried to extract the text directly from the website, unsuccessfully. I guess one can't do that, since the information is not in the HTTP response?
So:
Is there some faster and less demanding way to count all the returned rows inside an HTTP response body?
Or is there some way to get the text directly from the website?
Thank you.
It looks like your application is buggy; I don't think that returning 50000 entries in a single shot is something people should be doing, as it creates extra network traffic and consumes a lot of resources on both the server and the client (browser) side. I would rather expect some form of pagination when it comes to operating on large amounts of data.
If you're totally sure that your application works as expected, you can try using the Boundary Extractor, which is available since JMeter 4.0.
Due to the specifics of its internal implementation it consumes fewer resources and acts faster than the Regular Expression Extractor, so the load you will be able to generate from a single machine will be higher.
Check out The Boundary Extractor vs. the Regular Expression Extractor in JMeter article for more information
Yes, you can get that count from matchNr, which comes after the search string. Use a Regular Expression Extractor to match any name or ID and set Match No. to -1 (so every occurrence is captured).
For example, if the regex variable name is totalcount, then you can fetch the count by using ${totalcount_matchNr}
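Outside JMeter, the same idea — count every match of a per-row marker in the response body — is a one-liner. A minimal Python sketch; the `<tr class="row">` marker is hypothetical, so substitute whatever uniquely delimits one row in your actual response:

```python
import re

def count_rows(body, marker=r'<tr class="row">'):
    """Count occurrences of a row marker in a response body.

    The default marker is a hypothetical example; pass the pattern
    that actually delimits one row in your response.
    """
    return len(re.findall(marker, body))
```

For example, `count_rows('<tr class="row">a</tr><tr class="row">b</tr>')` returns 2. Counting matches this way avoids building a list of 50000 captured strings as variables, which is part of what makes the extractor approach heavy.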

SonarQube API Issue search is only returning 100 results

Utilizing SonarQube 5.1, I have been attempting to utilize the API search feature to gather all of the issues pertaining to my current project to display on a radiator. On the Web interface, SonarQube indicates there are 71 major issues and 161 minor issues.
Using this search string
https://sonarqube.url.com/api/issues/search?projectKeys=myproject'skey
I get back a response with exactly 100 results. When I process those results for only OPEN items, I get back a total of 55 issues. 36 major, 19 minor.
This is being achieved through a Powershell script that authenticates to the SonarQube server and passes in the query, then deserializes the response into an array I can process. (Counting major/minor issues)
With the background out of the way, the meat of my question is: does anyone know why the responses I am receiving are capped at 100? In my research I saw others indicating a response to an issue search would be capped at 500 due to an outstanding bug. However, the expected number of issues I am looking for is far below that number. The API's instructions indicate that it would return the first 10,000 issues. Is there a server-side setting that restricts the output it will return to a search query?
Thanks in advance,
The web service docs show that 100 is the default value of the ps parameter. You can set the value higher, but it will still max out at 500 per page.
You might have noticed a "paging" element in the JSON response. You can use it to calculate how many pages of results there are and loop through them using the p parameter to specify page number.
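That calculation can be sketched directly from the paging element. The field names below (pageSize, total) follow the shape of the paging object in the JSON response; treat them as an assumption and check your own server's output:

```python
import math

def page_count(paging):
    """Number of pages implied by a SonarQube 'paging' JSON element,
    e.g. {"pageIndex": 1, "pageSize": 100, "total": 232} -> 3 pages."""
    return math.ceil(paging["total"] / paging["pageSize"])
```

With that count in hand, you loop p from 1 to page_count(...) and concatenate the issues arrays from each response.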

Google Places webservice returns nextpage token as 0x0 after two requests

I have written D code to retrieve the available places (like hospitals in the US) from the Google Places web service. I am able to retrieve place details for only 2 consecutive requests; afterwards the pagination token is set to 0x0 in the JSON response.
But if I do the search manually in Google, it keeps going well beyond that. What am I doing wrong here? Google says that it allows a limit of 1000 requests per day for free, but it doesn't serve more than 2 requests. I have used the D "requests" module for making the HTTP requests.
You only get 60 results across three pages, because that's the maximum provided by the API.
Nearby Search and Text Search requests in the Google Places API web service return a maximum of 60 results. See the Accessing Additional Results section of the documentation, which says:
each search can return as many as 60 results, split across three pages.
(Emphasis mine.)
dlang-requests can display (to stdout) detailed info on request and response if you use verbosity=3.
Also, you can compile the sources with -debug=requests and set globalLogLevel(LogLevel.trace) to produce even more detailed log information. If this doesn't help, then please give me detailed info on the failed API call so that I can reproduce the problem.

How does Parse Query.each count towards execution limits

I am wondering how the each command on a Parse Query counts towards the request execution limits. I am building an app that will need to perform a function on many objects (could be more than 1000) in a parse class.
For example (in JavaScript),
var query = new Parse.Query(Parse.User);
query.equalTo('anObjectIWant', true); // there could be more than 1000 objects I want
query.each(function(object) {
    doSomething(object); // doSomething does NOT involve another Parse request
});
So, will the above code count as 1 request towards my Parse application execution limit (you get 30/second free), or will each object (each recurrence of calling "each") use one request (so 1000 objects would be 1000 requests)?
I have evaluated the resource usage by observing the number of API requests made by query.each() for different result set sizes. The bottom line is that (at the moment of writing) this function is using the default query result count limit of 100. Thus if your query matches up to 100 results it will make 1 API request, 2 API requests for 101-200 and so forth.
This behavior can not be changed by manually increasing the limit to the maximum using query.limit(1000). If you do this you will get an error when you call query.each() afterwards (this is also mentioned in the documentation).
Therefore, consider implementing this functionality manually (e.g., with a recursive query.find()), which allows you to set the query limit to 1000 and thus, in the best case, consume only one-tenth of the API requests that query.each() would.
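The manual approach can be sketched language-neutrally in Python. This is an illustration of the accounting, not Parse SDK code: run_query(limit, skip) stands in for a query.find() call with those limit/skip values, and requests_made models how each page costs one API request:

```python
# Sketch: page manually with limit/skip instead of query.each(), so
# each API request can carry up to 1000 results instead of each()'s
# fixed batch of 100.

def find_all(run_query, limit=1000):
    """run_query(limit, skip) stands in for query.find() and returns
    a list of at most `limit` objects starting at offset `skip`.
    Returns (all_objects, number_of_requests_made)."""
    objects = []
    requests_made = 0
    while True:
        batch = run_query(limit, len(objects))
        requests_made += 1
        objects.extend(batch)
        if len(batch) < limit:  # short page => no more results
            break
    return objects, requests_made
```

For 999 matching objects this makes 1 request at limit=1000 versus 10 at a batch size of 100, which is the one-tenth saving mentioned above.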
This would count as 1 or 2 requests, depending on where it runs:
If it is run from a Cloud Code function, it counts as 2: 1 for the Cloud Code call + 1 for the query. Since a query gets its results all at once, it is a single call.
If it is placed within a "beforeSave" function or similar, then only the query is counted: 1 API call.
So you should be pretty fine as long as you don't trigger another Parse API call for each result.
I would not be surprised if the .each method queried the server on each iteration.
You can actually check this using their control panel; just look at the number of requests being made.
We left Parse after doing some prototyping; one of the reasons was that while using proper and suggested code from the Parse website, I managed to create 6500 requests a day while being the only one using the app.
Using our own API, we are down to not more than 100.

Exhaustive Search on Google Places

I'm trying to use the Google Places API for a business locator app, but am having trouble creating an exhaustive database of businesses.
1. The API call only returns 20 results back.
2. The "type" restriction (e.g. type=restaurant) does not pick up all businesses by type in a given zip. I could use "keyword", but not all restaurants have "restaurant" in their name, and not all spas have "spa" in their name.
3. Each call produces the same set of results from day to day, and with only 20 returns per call, how am I to get a more exhaustive database of businesses?
I can try to get around the above three constraints by looping through a very well degraded search of businesses: say by zip code, some list of keywords, category type. But I still won't get close to picking up the 50 million or so businesses in google places.
In fact, even when I make a call for restaurants and bars in my own neighborhood, I don't pick up popular places down the block from me.
How is the API usable for an app that locates places then?
Any suggestions on how to create a more exhaustive search?
Thanks,
Nad
I'm not able to answer your question regarding Google Places API.
But for your requirements ('business locator app', 'I don't pick up popular places down the block from me') I suggest you try Yelp Search API:
Yelp's API program enables you to access trusted Yelp information in real time, such as business listing info, overall business ratings and review counts, deals and recent review excerpts.
Yelp is a popular review website with a capable API; you can judge the quality of the database and the devoted user base they have at the Yelp homepage.
Note:
They keep some data for themselves and do not return everything in response.
The (free) dev account has a limit of 100 calls per 24 hours.
I know I'm late but maybe it helps someone these days.
By default, each Nearby Search or Text Search returns up to 20 establishment results per query; however, each search can return as many as 60 results, split across three pages.
You need to use the field nextPageToken that you will receive on the first search to get the next page.
https://developers.google.com/places/web-service/search
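In code, the token-following loop looks roughly like this. A Python sketch under stated assumptions: fetch(token) stands in for the HTTP search request (token=None for the first page), and next_page_token is the field name in the web service's JSON response:

```python
# Sketch: page through Nearby/Text Search results by following
# next_page_token -- at most 3 pages / 60 results per search.

def collect_places(fetch):
    """fetch(token) performs the search (token=None for the first page)
    and returns the decoded JSON, e.g.
    {"results": [...], "next_page_token": "..."} (token absent on the
    last page)."""
    results = []
    token = None
    for _ in range(3):  # the API never serves more than three pages
        data = fetch(token)
        results.extend(data["results"])
        token = data.get("next_page_token")
        if not token:
            break
    return results
```

One practical caveat from the documentation: there is a short delay between when a next_page_token is issued and when it becomes valid, so a real fetch implementation should wait (or retry) before requesting the next page.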
A question on Stack Overflow says:
There is no way to get more than 60 results in the Places API. Some people tried to file a feature request in the Google issue tracker, but Google rejected it with the following comment: "Unfortunately Places API is not in a position to return more than 60 results. Besides technical reasons (latency, among others) returning more than 60 results would make the API be more like a database or general-purpose search engine. We'd rather improve search quality so that users don't need to go so far down a long list of results."
google places api more than 60 results
I faced the same difficulties that you did and decided to use the Yelp API instead. It is free, very complete and returns up to 1000 results. You should however check the terms of service before doing anything. It does not provide the website of the business (only the Yelp website link).
https://www.yelp.com/developers/documentation/v3/business_search
Other options I investigated at that time:
Foursquare Venues (it was very expensive, and only returned up to around 100 results)
Here places API
Factual Places (I don't think this one is an API)
Sygic Travel API (Specific for touristical spots)
Planet.osm (OpenStreetMap)
