Link automated Postman tests with Elastic APM report - elasticsearch

I currently have a Spring application in which I have implemented a continuous integration pipeline.
In this pipeline, Postman tests are executed automatically.
I would like to add a performance dimension to these tests, so I set up Elastic APM in my application, with an APM server, Elasticsearch and Kibana.
However, I would like to find a way to link the requests sent by my Postman tests to their associated transaction visualizations in Kibana, in order to generate a test performance report.
I can't find a way to do this. I've seen that there is transaction.id and trace.id metadata, but I can't figure out how to use it.
Do you have any idea, even from a "high-level" point of view, how to achieve this?
Thanks in advance!
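One high-level approach (not from the thread, just a sketch): have each Postman request set its own W3C traceparent header. Elastic APM agents understand incoming W3C trace context and adopt its trace id, so if the test generates and records the id itself, it can later look the transaction up in Kibana by trace.id. A minimal pre-request helper might look like this (all names illustrative):

```javascript
// Build a W3C traceparent header so the trace id of the request is known
// in advance and can be searched for in Kibana / Elasticsearch afterwards.
function randomHex(bytes) {
  let s = "";
  for (let i = 0; i < bytes * 2; i++) {
    s += Math.floor(Math.random() * 16).toString(16);
  }
  return s;
}

function buildTraceparent() {
  const traceId = randomHex(16); // 32 hex chars
  const spanId = randomHex(8);   // 16 hex chars
  // version "00", flags "01" (sampled)
  return { traceId, header: `00-${traceId}-${spanId}-01` };
}

// In a Postman pre-request script you would then do something like:
//   const tp = buildTraceparent();
//   pm.request.headers.add({ key: "traceparent", value: tp.header });
//   console.log("trace.id for this request:", tp.traceId);
```

With the recorded id, a Kibana filter such as trace.id : "<recorded id>" (or a query against the APM indices in Elasticsearch) should surface the matching transaction for the report.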

Related

How to fetch BigQuery data into a Spring Boot application?

I have a use case wherein I need to fetch data from the GCP BigQuery database into my Spring Boot application and subsequently perform some operations on it. I'm unable to understand how to go about doing it; for example, how the application properties need to be configured to use a BigQuery database. Nor was I able to find any good resource for the same.
Request you all to kindly guide me a bit on this. It would be great even if you could just point me to a relevant resource!
Indeed, there are no examples in the Spring Cloud documentation. However, there is a nice sample on the spring-cloud-gcp GitHub.
There is also a small tutorial on how to run it, so I think this will be a good starting point.
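For orientation, the spring-cloud-gcp sample relies on auto-configuration driven by a few properties; a minimal application.properties could look roughly like this (property names as in spring-cloud-gcp, values are placeholders; check them against the version you use):

```properties
# Illustrative Spring Cloud GCP BigQuery settings (requires the
# spring-cloud-gcp-starter-bigquery dependency).
spring.cloud.gcp.project-id=my-gcp-project
spring.cloud.gcp.credentials.location=file:/path/to/service-account.json
spring.cloud.gcp.bigquery.dataset-name=my_dataset
```

With the starter on the classpath, a BigQuery client bean is auto-configured and can be injected into your own components for queries.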

Showing HTTP Request API latency using the Spring Boot Micrometer metrics

We use Prometheus to scrape Spring Boot 2.0.0 metrics and then persist them in InfluxDB.
We then use Grafana to visualize them from InfluxDB.
Our micrometer dependencies are
micrometer-core
micrometer-registry-prometheus
I want to be able to show a latency metric for our REST APIs.
From our Prometheus scraper I can see these metrics are generated for HTTP requests.
http_server_requests_seconds_count
http_server_requests_seconds_sum
http_server_requests_seconds_max
I understand from the micrometer documentation, https://micrometer.io/docs/concepts#_client_side, that latency can be done by combining 2 of the above generated metrics: totalTime / count.
However, our data source is InfluxDB, which does not support combining measurements (https://docs.influxdata.com/influxdb/v1.7/troubleshooting/frequently-asked-questions/#how-do-i-query-data-across-measurements), so I am unable to implement that function in InfluxDB.
Do I need to provide my own implementation of this latency metric in the Spring Boot component, or is there an easier way that I can achieve this?
You can essentially join your measurements in Kapacitor, another component of the Influxdata TICK stack.
It's going to be pretty simple with a JoinNode, possibly followed by an Eval node to calculate what you want right in place. There are tons of examples of this in the documentation.
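Adapted from the JoinNode example in the Kapacitor documentation, the join of the two measurements above might be shaped roughly like this (untested sketch; measurement names match the metrics from the question):

```
var sum = stream
    |from()
        .measurement('http_server_requests_seconds_sum')
var count = stream
    |from()
        .measurement('http_server_requests_seconds_count')
sum
    |join(count)
        .as('sum', 'count')
    |eval(lambda: "sum.value" / "count.value")
        .as('mean_latency_seconds')
```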
The real problem is different, though: you've unnecessarily over-engineered your solution, and moreover, you're trying to combine two products that have the same purpose but take different approaches to it. How smart is that?
You're already scraping things with Prometheus? Fine! Stay with it and do the math there; it's simple. And Grafana works with Prometheus too, right out of the box!
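"The math" in Prometheus is a single PromQL expression over the metrics listed in the question (matching labels are paired up automatically):

```
# Mean HTTP request latency over the last 5 minutes
rate(http_server_requests_seconds_sum[5m])
  / rate(http_server_requests_seconds_count[5m])
```

This expression can be dropped straight into a Grafana panel backed by the Prometheus data source.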
You want to have your data in InfluxDB (I can understand that; it's certainly more advanced)?
Fine! Micrometer can send it straight to InfluxDB out of the box, and in at least two ways!
Personally, I don't see any reason to do what you're proposing to do; can you share one?

Automated API blackbox testing

I have a (slightly complex) Spring web service which communicates with multiple frontends via a RESTful API (JSON), and additionally with other devices via SOAP or REST. I'd like to set up an automated test environment which is capable of the following things:
create preconditions via fixtures (Postgres DB)
send REST or SOAP messages against the API
run certain tasks (requests against the API) at a specific time/date
assert and validate the produced results (return value of the API call, or a check of the DB)
run all tests independent of any frontend/UI
integrate the testing environment into my infrastructure (i.e. create a Docker container, deployed by Jenkins, which runs all tests)
Preferably, I'd like to build reusable components (i.e. for creating a user that is needed in multiple different tests, and so on). I know there are a lot of tools and frameworks (SoapUI, JMeter, ...). But before trying them all and getting lost, I'd like to get an experience report from someone who has a similar setup.
We are using JMeter for API testing. We tried SoapUI, but it had some memory issues, so we are pushing forward with JMeter, and so far so good.
For your questions:
We are using MySQL, but this post seems to show how to set up a Postgres connection in JMeter: https://hiromia.blogspot.com/2015/03/how-to-perform-load-testing-on.html
JMeter can send REST API requests.
I'm not sure if this is possible, but you could probably schedule your Jenkins job to run when you need the API to run the specific task at the specific time.
There are quite a few assertions in JMeter. I use the Response and the BeanShell Assertions a lot.
JMeter is independent of any frontend UI, which helps pinpoint bugs.
I have not run Docker, but I am running via Jenkins. This Jenkins plugin has been helpful: https://wiki.jenkins.io/display/JENKINS/Log+Parser+Plugin
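As an illustration of the kind of check a BeanShell or JSR223 Assertion runs, here is a standalone sketch of response-validation logic (the field names "status" and "userId" are made up; inside JMeter you would read the body via prev.getResponseDataAsString() and flag failures on AssertionResult):

```javascript
// Standalone sketch of a JSON response check, the kind of logic a
// JMeter JSR223 Assertion would execute per sample.
function validateUserResponse(bodyText) {
  let body;
  try {
    body = JSON.parse(bodyText);
  } catch (e) {
    return { ok: false, message: "response is not valid JSON" };
  }
  if (body.status !== "ACTIVE") {
    return { ok: false, message: "unexpected status: " + body.status };
  }
  if (typeof body.userId !== "number") {
    return { ok: false, message: "missing numeric userId" };
  }
  return { ok: true, message: "" };
}

// In JMeter, a failing result would translate to:
//   AssertionResult.setFailure(true);
//   AssertionResult.setFailureMessage(result.message);
```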
A few more Tips:
Use the HTTP Request Defaults element. It will save you from having to update all your HTTP requests.
Use the User Defined Variables to define variables you need.
You can combine user defined variables like ${namePrefix}${myTime}, but it will have to be in a 2nd User Defined Variables element (you can't combine them in the same element).
If you have multiple test environments, set up a user defined variable with a value like this: ${__P(testenv,staging)} This way, you can change it from a CLI like this: -Jtestenv=HOTFIX
We are using ExtentReports for pretty html results reports with a custom JSR223 Listener(find my old post on this site).
If your site uses cookies, use the HTTP Cookie Manager.
If you need things to happen in order, check the "Run Thread Groups consecutively" option on the Test Plan element. If you don't, JMeter runs them in parallel.
Hope this is helpful. Happy Testing!

SonarQube webservice API for delta load?

I see that SonarQube provides a web service API to get all issues, and I will load this data into a database for analysis. Then, I wish to have my reporting database stay in sync with the changes happening in the system. Is there a web service API that captures change data?
Overall, I want the reporting DB to be in sync with the system.
The createdAfter parameter of the issues search web service will get you new issues, but there's no analogous updatedAfter parameter. Note that by only looking at added issues, you'll overlook issues that are closed in a new analysis.
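For completeness, a createdAfter call to the issues search endpoint looks roughly like this (project key, date and page size are placeholders; parameter names may differ slightly across SonarQube versions):

```
GET /api/issues/search?componentKeys=my_project&createdAfter=2023-01-01T00:00:00%2B0000&ps=500
```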

Programmatically generate VersionOne reports for multiple projects

My organization is tracking multiple Scrum projects in VersionOne. Each week, we use the Release Forecasting report for each project to create a management dashboard that indicates the health and expected completion date of each project. I would like to automate this. Do any of the VersionOne APIs allow for the execution of this report and retrieving the image that is generated?
There is not an endpoint specific to Release Forecasting. Nor is there an endpoint to generate the image. However, you can get to the underlying data via the existing API endpoints. For reporting, I recommend query.v1. The closest example is the query for burndown data. You would need to take Scope as the focus of the query, not Timebox.
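A query.v1 request focused on Scope might be shaped like this (a rough guess at the attribute names based on the public query.v1 documentation, not a verified equivalent of the Release Forecasting data):

```json
{
  "from": "Scope",
  "where": { "Name": "My Project" },
  "select": [
    "Name",
    "BeginDate",
    "EndDate",
    "Workitems:PrimaryWorkitem.Estimate.@Sum"
  ]
}
```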
You might also take a look at VersionOne's Reporting and Analytics. While that is not a coding or API way to get the reports, it might still automate the needs you have.
I was able to automate the retrieval of this report, but not through the V1 API. Through careful use of Fiddler and a C# script using WebClient to execute POST requests, it was possible. The resulting code is pretty fragile, though, since it isn't using the API.
