Locust - test statistics metadata - performance

The Locust documentation does not clearly call out the fields reported in the CSV files of test statistics. I came across a question with some description of the results. Is this what the test results look like, or are there other formats?

Right now it pulls straight from the data you can manually download from the endpoint, which looks like:
$ cat foobar_distribution.csv
"Name","# requests","50%","66%","75%","80%","90%","95%","98%","99%","100%"
"_get_token",0,"N/A","N/A","N/A","N/A","N/A","N/A","N/A","N/A","N/A"
"client _ping",2,5,5,5,5,5,5,5,5,5
"client _scores",7,4,4,4,4,5,5,5,5,5
"rpc_get_scores",7,5,5,5,5,7,7,7,7,7
"rpc_get_token",36,0,0,0,0,0,0,1,1,1
"rpc_ping",2,6,6,6,6,6,6,6,6,6
"None Total",54,0,1,4,4,5,5,6,7,7
$ cat foobar_requests.csv
"Method","Name","# requests","# failures","Median response time","Average response time","Min response time","Max response time","Average Content Size","Requests/s"
"Method","_get_token",0,0,0,0,0,0,0,0.00
"Method","_ping",2,0,3,4,3,5,0,0.19
"Method","_scores",7,0,4,4,4,5,0,0.68
"Method","rpc_get_scores",7,0,5,5,4,7,0,0.68
"Method","rpc_get_token",36,0,0,0,0,1,0,3.51
"Method","rpc_ping",2,0,4,5,4,6,0,0.19
I apparently forgot to update the documentation for that when I made the PR... you can see the fields here from the PR though too.
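If you want to consume these CSVs programmatically rather than eyeball them, here is a minimal sketch in Ruby using the stdlib csv library (the sample row is taken from the foobar_requests.csv output above):

```ruby
require 'csv'

# A header plus one sample row, in the format of foobar_requests.csv above.
data = <<~CSV
  "Method","Name","# requests","# failures","Median response time","Average response time","Min response time","Max response time","Average Content Size","Requests/s"
  "Method","rpc_get_token",36,0,0,0,0,1,0,3.51
CSV

rows = CSV.parse(data, headers: true)
rows.each do |row|
  puts "#{row['Name']}: #{row['# requests']} requests, #{row['Requests/s']} req/s"
end
```

This prints `rpc_get_token: 36 requests, 3.51 req/s`; note that all values come back as strings, so convert with to_i/to_f before doing arithmetic on them.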

Displaying JSON output from an API call in Ruby using VSCode

For context, I'm someone with zero experience in Ruby - I just asked my senior dev to copy-paste me some of his Ruby code so I could try to work with some APIs that he had been putting off because he was too busy.
So I'm using an API wrapper gem called zoho_hub, which wraps the Zoho APIs (https://github.com/rikas/zoho_hub/blob/master/README.md).
My IDE is VSCode.
I execute the entire length of the code, and I'm faced with this:
[Done] exited with code=0 in 1.26 seconds
The API is supposed to return a paginated list of records, but I don't see anything outputted in VSCode, despite the fact that no error is being reflected. The last 2 lines of my code are:
ZohoHub.connection.get 'Leads'
p "testing"
I use the dummy string "testing" to make sure that it's being executed up till the very end, and it does get printed.
This has been baffling me for hours now - is my response actually being output somewhere that I just can't see?
Ruby does not print anything unless you tell it to. For debugging there is a pretty-printing method called pp, which is handy for printing structured data.
In this case, if you want to output the records that your get method returns, you would do:
pp ZohoHub.connection.get 'Leads'
To get the next page you can look at the source code, and you will see the get request has an additional Hash parameter.
def get(path, params = {})
Then you have to read the Zoho API documentation for get, and you will see that the page is requested using the page param.
Therefore we can finally piece it together:
pp ZohoHub.connection.get('Leads', page: NNN)
Where NNN is the number of the page you want to request.

How to parse previously fetched WHOIS data with Ruby Whois?

According to README on github, Ruby Whois can be used "as a standalone library to parse WHOIS records fetched previously and/or from different WHOIS clients."
I know how to use the library to directly perform a WHOIS query and parse the returned result. But I cannot find anywhere (Stack Overflow included) how I can use this library to parse WHOIS data fetched previously.
I think it's not important, but this is how I get my data anyway: it is fetched through the Linux whois command and stored in separate files, each file containing one WHOIS query result.
The manual pages on https://whoisrb.org/ are 404s. Even the code on the homepage is outdated and thus wrong, and the doc pages provide little information.
I tried to scan the source code on GitHub (https://github.com/weppos/whois-parser and https://github.com/weppos/whois), and I tried to find the answer on RubyDoc (https://www.rubydoc.info/gems/whois-parser/Whois/Parser, https://www.rubydoc.info/gems/whois/Whois/Record and some related pages). Both failed, partly because this is my first time using Ruby.
So could anyone help me? I'm really desperate and I'll definitely appreciate any help.
Try it like this:
require 'whois-parser'
domain = 'google.com'
data = 'WHOIS DATA THAT YOU ALREADY HAVE'
# Guess the WHOIS server for the domain, then wrap the raw data in a
# Record::Part as if it had just been fetched from that server.
whois_server = Whois::Server.guess domain
whois_data = [Whois::Record::Part.new(body: data, host: whois_server.host)]
record = Whois::Record.new(whois_server, whois_data)
parser = record.parser
parser.available? #=> false
parser.registered? #=> true
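Since your WHOIS results are already stored one per file, the data variable above can simply come from File.read. Here is a sketch of loading a whole directory of them (the domain-named .txt files are an assumption - adjust to your actual naming scheme):

```ruby
# Hypothetical layout: one WHOIS result per file, named after the domain
# (e.g. "google.com.txt"). Returns a { domain => raw_whois_text } hash.
def load_whois_files(dir)
  Dir.glob(File.join(dir, '*.txt')).to_h do |path|
    [File.basename(path, '.txt'), File.read(path)]
  end
end

# Each pair can then be fed through the parsing steps shown above
# (Whois::Server.guess, Whois::Record::Part.new, Whois::Record.new).
```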

Issue with Correlation using Gatling Performance Testing tool

I am new to Gatling.
For the last 5 days I have been trying to capture the request below, but I am unable to correlate the "sesskey" value that is used throughout my whole flow:
.feed(feeder)
.exec(http("request_2")
.post("/login/index.php")
.headers(headers_0)
.formParam("username", "${username}")
.formParam("password", "${password}")
.formParam("anchor", "")
.resources(http("request_3")
.get("/theme/image.php/clean/core/1468244430/t/block_to_dock"),
http("request_4")
.get("/lib/javascript.php/1468244430/blocks/course_overview/module.js"),
http("request_5")
.get("/theme/image.php/clean/core/1468244430/t/collapsed"),
http("request_6")
.get("/theme/image.php/clean/core/1468244430/t/expanded"),
http("request_7")
.get("/lib/requirejs.php/1468244430/core/first.js"),
http("request_8")
.get("/theme/yui_combo.php?3.17.2/anim-base/anim-base.js&3.17.2/anim-color/anim-color.js&3.17.2/anim-xy/anim-xy.js&3.17.2/anim-curve/anim-curve.js&3.17.2/anim-easing/anim-easing.js&3.17.2/anim-node-plugin/anim-node-plugin.js&3.17.2/anim-scroll/anim-scroll.js"),
http("request_9")
.get("/lib/javascript.php/1468244430/lib/requirejs/jquery-private.js"),
http("request_10")
.get("/theme/yui_combo.php?3.17.2/cssbutton/cssbutton-min.css")
.headers(headers_10),
http("request_11")
.get("/lib/javascript.php/1468244430/lib/jquery/jquery-1.12.1.min.js"),
http("request_12")
.get("/theme/yui_combo.php?3.17.2/handlebars-base/handlebars-base.js&3.17.2/handlebars-compiler/handlebars-compiler.js&m/1468244430/core/handlebars/handlebars-debug.js&3.17.2/plugin/plugin.js&m/1468244430/core/lockscroll/lockscroll-debug.js&m/1468244430/core/notification/notification-ajaxexception-debug.js&m/1468244430/core/notification/notification-alert-debug.js&m/1468244430/core/notification/notification-exception-debug.js&m/1468244430/core_message/messenger/messenger-debug.js"),
http("request_13")
.get("/theme/yui_combo.php?m/1468244430/calendar/info/info.css")
.headers(headers_10),
http("request_14")
.get("/theme/yui_combo.php?m/1468244430/calendar/info/info-debug.js"),
http("request_15")
.post("/lib/ajax/service.php?sesskey=LuyCPEUwdm")
.headers(headers_1)
.body(RawFileBody("MoodleViewPageV01_0015_request.txt")),
http("request_16")
.get("/theme/yui_combo.php?m/1468244430/core/formautosubmit/formautosubmit-debug.js"),
http("request_17")
.get("/theme/yui_combo.php?3.17.2/event-mousewheel/event-mousewheel.js&3.17.2/event-resize/event-resize.js&3.17.2/event-hover/event-hover.js&3.17.2/event-touch/event-touch.js&3.17.2/event-move/event-move.js&3.17.2/event-flick/event-flick.js&3.17.2/event-valuechange/event-valuechange.js&3.17.2/event-tap/event-tap.js&3.17.2/event-simulate/event-simulate.js&3.17.2/async-queue/async-queue.js&3.17.2/gesture-simulate/gesture-simulate.js&3.17.2/node-event-simulate/node-event-simulate.js&m/1468244430/core/actionmenu/actionmenu-debug.js"),
http("request_18")
.get("/theme/image.php/clean/core/1468244430/t/switch_minus"),
http("request_19")
.get("/theme/image.php/clean/core/1468244430/t/switch_plus")))
In the request above, look at request_15, with .post("/lib/ajax/service.php?sesskey=LuyCPEUwdm"): I want to capture the sesskey value LuyCPEUwdm, which is used multiple times during my scenario.
How should I capture that key?
Please help - appreciate your time :)
I got the answer.
When you record the script with the recorder, select the "save & check Response bodies?" option before you start recording.
After the recording completes, the response bodies are stored under "..\gatling-charts-highcharts-bundle-2.2.2\user-files\bodies".
You can then compare your script's scenario against those response body files and correlate the value based on where it appears in the response bodies.
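In Gatling itself, the usual way to correlate a value like this is to add a check on the response of the request that first returns it, e.g. .check(regex("sesskey=([^\"&]+)").saveAs("sesskey")), and then reference "${sesskey}" in the later requests. As a language-neutral sketch of the extraction step (shown here in Ruby, with a made-up response fragment - the exact pattern depends on how your page embeds the key):

```ruby
# Hypothetical fragment of the post-login response; the real page may embed
# the key differently (e.g. inside a JavaScript config object).
body = '<a href="/lib/ajax/service.php?sesskey=LuyCPEUwdm">Log out</a>'

# Capture the first sesskey=... value with a regex, as a Gatling check would.
sesskey = body[/sesskey=([A-Za-z0-9]+)/, 1]
# sesskey => "LuyCPEUwdm"
```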

Ruby neo4j-core mass processing data

Has anyone used Ruby neo4j-core to mass process data? Specifically, I am looking at taking in about 500k lines from a relational database and insert them via something like:
Neo4j::Session.current.transaction.query
.merge(m: { Person: { token: person_token} })
.merge(i: { IpAddress: { address: ip, country: country,
city: city, state: state } })
.merge(a: { UserToken: { token: token } })
.merge(r: { Referrer: { url: referrer } })
.merge(c: { Country: { name: country } })
.break # This will make sure the query is not reordered
.create_unique("m-[:ACCESSED_FROM]->i")
.create_unique("m-[:ACCESSED_FROM]->a")
.create_unique("m-[:ACCESSED_FROM]->r")
.create_unique("a-[:ACCESSED_FROM]->i")
.create_unique("a-[:ACCESSED_FROM]->r")
.create_unique("i-[:IN]->c")
.exec
However, doing this locally takes hours on hundreds of thousands of events. So far, I have attempted the following:
Wrapping Neo4j::Connection in a ConnectionPool and multi-threading it - I did not see much speed improvements here.
Doing tx = Neo4j::Transaction.new and tx.close every 1000 events processed - looking at a TCP dump, I am not sure this actually does what I expected. It issues the exact same requests, with the same frequency, but gets a different response.
With Neo4j::Transaction I see a POST every time the .query(...).exec is called:
Request: {"statements":[{"statement":"MERGE (m:Person{token: {m_Person_token}}) ...{"m_Person_token":"AAA"...,"resultDataContents":["row","REST"]}]}
Response: {"commit":"http://localhost:7474/db/data/transaction/868/commit","results":[{"columns":[],"data":[]}],"transaction":{"expires":"Tue, 10 May 2016 23:19:25 +0000"},"errors":[]}
With Non-Neo4j::Transactions I see the same POST frequency, but this data:
Request: {"query":"MERGE (m:Person{token: {m_Person_token}}) ... {"m_Person_token":"AAA"..."c_Country_name":"United States"}}
Response: {"columns" : [ ], "data" : [ ]}
(Not sure if that is intended behavior, but it looks like less data is transmitted via the Non-Neo4j::Transaction technique - highly possibly I am doing something incorrectly)
Some other ideas I had:
* Post process into a CSV, SCP up and then use the neo4j-import command line utility (although, that seems kinda hacky).
* Combine both of the techniques I tried above.
Has anyone else run into this / have other suggestions?
Ok!
So you're absolutely right. With neo4j-core you can only send one query at a time. With transactions all you're really getting is the ability to rollback. Neo4j does have a nice HTTP JSON API for transactions which allows you to send multiple Cypher requests in the same HTTP request, but neo4j-core doesn't currently support that (I'm working on a refactor for the next major version which will allow this). So there are a number of options:
You can submit your requests via raw HTTP JSON to the APIs. If you still want to use the Query API, you can use the to_cypher and merge_params methods to get the Cypher and params for that (merge_params is currently a private method, so you'd need send(:merge_params)).
You can load via CSV as you said. You can either
use the neo4j-import command which allows you to import very fast but requires you to put your CSV in a specific format, requires that you be creating a DB from scratch, and requires that you create indexes/constraints after the fact
use the LOAD CSV command which isn't as fast, but is still pretty fast.
You can use the neo4apis gem to build a DSL to import your data. The gem will create Cypher queries under the covers and will batch them for performance. See examples of the gem in use via neo4apis-twitter and neo4apis-github
If you are a bit more adventurous, you can use the new Cypher API in neo4j-core via the new_cypher_api branch on the GitHub repo. The README in that branch has some documentation on the API, but also feel free to drop by our Gitter chat room if you have questions on this or anything else.
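For the LOAD CSV route, the rows can be written out with Ruby's stdlib CSV; here is a minimal sketch (the column names and the Cypher in the comment are illustrative, loosely matching the properties in the query above):

```ruby
require 'csv'

# Illustrative rows, loosely matching the properties in the MERGE query above.
events = [
  { person_token: 'AAA', ip: '1.2.3.4', country: 'United States', city: 'NYC' }
]

csv_text = CSV.generate do |csv|
  csv << %w[person_token ip country city]
  events.each { |e| csv << e.values_at(:person_token, :ip, :country, :city) }
end

File.write('events.csv', csv_text)

# The file can then be loaded with something like:
#   LOAD CSV WITH HEADERS FROM 'file:///events.csv' AS row
#   MERGE (m:Person { token: row.person_token })
#   MERGE (i:IpAddress { address: row.ip })
#   MERGE (m)-[:ACCESSED_FROM]->(i)
```

Batching the LOAD CSV with a periodic commit keeps memory bounded on large files, which matters at the 500k-row scale mentioned above.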
If you're implementing a solution which is going to make queries like the above, with multiple MERGE clauses, you'll probably want to profile your queries to make sure you are avoiding the Eager operator (that post is a bit old, and newer versions of Neo4j have alleviated some of the need for care, but you can still look for Eager in your PROFILE output).
Also worth a look: Max De Marzi's post on Scaling Cypher Writes

Can I use Serilog.Extra.Web's HttpRequestNumber or HttpRequestId as the SerilogMetrics timed-operation identifier?

I'm using SerilogMetrics's BeginTimedOperation() in a Web API, and it would be really great to be able to use the HttpRequestNumber or HttpRequestId properties (from the respective Serilog.Extra.Web enrichers) as the identifier, making it super easy to correlate timing-related log entries with others across a request.
Something like:
using (logger.BeginTimedOperation("doing some work", HttpRequestNumberEnricher.CurrentRequestNumber))
{ ... }
Short of poking around in HttpContext.Current for the magically-named (i.e. non-public) properties, is this achievable? Thanks!
If you begin a timed operation during a web request, the operation's events will already be tagged with the HttpRequestId.
You'll see it when logging to a structured log server like Seq, but if you're writing it out to a text file or trace then the property won't be included in the output message by default. To show it in there use something like:
.WriteTo.File(...,
outputTemplate: "{Timestamp} [{Level}] ({HttpRequestId}) {Message} ...")
The logging methods use a default template you can draw on for inspiration, and there's some info spread around the wiki though there's no definitive reference.
