I am using a MongoDB database on mLab to store a basic collection of board games which I want to show in my Ruby app. I have completed a tutorial that uses Mongoid to implement this locally, but so far I can't get it working with the mLab instance of the DB.
I added this to my mongoid.yml file:
development:
  clients:
    default:
      uri: 'mongodb://user:password@ds141232.mlab.com:41232/boardgame_banter'
The other options that were automatically generated in the config file I have left blank (at their defaults).
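For context, the model from the tutorial looks roughly like this (an illustrative sketch; the field list is inferred from the log output further down):

```ruby
# Illustrative Mongoid model: the class Boardgame maps to the
# "boardgames" collection that appears in the driver log below.
class Boardgame
  include Mongoid::Document
  field :name,      type: String
  field :rating,    type: Integer
  field :minplayer, type: Integer
  field :maxplayer, type: Integer
  field :duration,  type: Integer
  field :owner,     type: String
end
```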
I want to understand these two lines from the terminal:
MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | STARTED | {"find"=>"boardgames", "filter"=>{}}
MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | SUCCEEDED | 0.037816999999999996s
I get no errors, but also no documents returned, and the generated index.html is blank...
Can anyone explain the first of the two MONGODB | ... lines to me, or at least confirm whether my assumptions below are correct? In particular, is the last part of the chain telling me that the filtered results are empty?
MONGODB | <<hostname>> | <<database.find()>> | <<STATUS>> | {"find"=><<collection>>, "filter"=>{<<no results??>>}}
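For reference, the command document at the end of that line corresponds to the following mongo shell query; the empty "filter" is the query predicate (match every document), not the result set:

```javascript
// Shell equivalent of {"find"=>"boardgames", "filter"=>{}}:
// find all documents in the boardgames collection.
db.boardgames.find({})
```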
UPDATE after suggestion from @tfogo in the comments
In my controller:
# GET /boardgames
# GET /boardgames.json
def index
  @boardgames = Boardgame.all
  @log = Boardgame.all.to_a
  puts "LOG: #{@log}"
end
This produces the following log statement in the console (a document comes back, but with nil fields):
Started GET "/boardgames" for 127.0.0.1 at 2018-03-02 11:25:00 +0100
Processing by BoardgamesController#index as HTML
D, [2018-03-02T11:25:00.186878 #12983] DEBUG -- : MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | STARTED | {"find"=>"boardgames", "filter"=>{}}
D, [2018-03-02T11:25:00.223330 #12983] DEBUG -- : MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | SUCCEEDED | 0.035911000000000005s
LOG: [#<Boardgame _id: 5a984b439de90b3769420f2d, name: nil, rating: nil, minplayer: nil, maxplayer: nil, duration: nil, owner: nil>]
Rendering boardgames/index.html.erb within layouts/application
D, [2018-03-02T11:25:00.235908 #12983] DEBUG -- : MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | STARTED | {"find"=>"boardgames", "filter"=>{}}
D, [2018-03-02T11:25:00.274734 #12983] DEBUG -- : MONGODB | ds141232-a.mlab.com:41232 | boardgame_banter.find | SUCCEEDED | 0.038311s
Rendered boardgames/index.html.erb within layouts/application (42.3ms)
Completed 200 OK in 127ms (Views: 76.9ms)
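For context, the index view renders the collection roughly like this (an illustrative sketch, not the actual generated file); with every field nil, these cells render as empty strings, which would match the blank page:

```erb
<%# Illustrative index.html.erb: nil fields render as empty cells %>
<% @boardgames.each do |boardgame| %>
  <tr>
    <td><%= boardgame.name %></td>
    <td><%= boardgame.rating %></td>
  </tr>
<% end %>
```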
We have a web application hosted on Azure that sends telemetry to App Insights, and the dev team is asking whether it is OK to turn off the SESSION/KEEPALIVE data being posted from the web application. Will this affect any functionality like User Flows in Application Insights?
Any guidance on this?
The following is sample data:
| timestamp | id | source | name | url | success | resultCode | duration | performanceBucket |
| -- | -- | -- | -- | -- | -- | -- | -- | -- |
| 2019-09-25T16:00:31.8191577Z | \|Ac34D.9fIx+.4c3e0b35_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 15.8274 | <250ms |
| 2019-09-25T16:00:42.7423811Z | \|Ac34D.FqSNy.83ee6e0d_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 38.3679 | <250ms |
| 2019-09-25T16:00:48.716939Z | \|Ac34D.h8kwN.34c0b012_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 16.0359 | <250ms |
| 2019-09-25T16:00:54.1607213Z | \|Ac34D.v2qfF.4c3e0b36_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 15.2518 | <250ms |
Views in Application Insights typically target a specific set of telemetry item types.
For instance, the User Flows UI leverages the PageView and CustomEvent telemetry types. Therefore, if keepalive is reported as one of those types, it will be displayed in that UI.
However, if the example above is Dependency telemetry, then that view won't be affected.
In general, if you'd like to drop some of the telemetry before it reaches AI and is processed for storage, you'd use a TelemetryProcessor (in the case of the JavaScript SDK, a TelemetryInitializer) to filter it out:
// Returning false from a telemetry initializer drops the item before it is sent.
var telemetryInitializer = (envelope) => {
    if (envelope.data.someField == 'keepalive') return false;
};
appInsights.addTelemetryInitializer(telemetryInitializer);
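A slightly more concrete sketch for this case, assuming the keepalive requests surface as dependency telemetry in the JavaScript SDK (envelope.baseData.name is an assumption; inspect a captured envelope in your own app to confirm the field names for your SDK version):

```javascript
// Drop any telemetry item whose name mentions session/keepalive.
var keepaliveFilter = (envelope) => {
    var data = envelope.baseData || {}; // assumed location of the item's name
    if (data.name && data.name.indexOf('session/keepalive') !== -1) {
        return false; // filtered out before it is sent
    }
};
appInsights.addTelemetryInitializer(keepaliveFilter);
```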
I am not sure if this is intended to be so, but I am confused by the behavior.
When I have the following Scenario Outline:
Scenario Outline: outline1
    Given url <endpoint>
    And query parameters <query_params>
    When method <method>
    Then status is <status>

    Examples:
    | method | endpoint   | query_params | status |
    | GET    | /endpoint1 | ?a=1&b=1     | 200    |
    | GET    | /endpoint1 | ?a=1&b=1&c=3 | 200    |
I see the following snippet generated.
func FeatureContext(s *godog.Suite) {
    s.Step(`^method GET$`, methodGET)
    s.Step(`^query parameters \?a=(\d+)&b=(\d+)$`, queryParametersAB)
    s.Step(`^query parameters \?a=(\d+)&b=(\d+)&c=(\d+)$`, queryParametersABC)
}
As you can see, the two "query parameters" lines produce two different functions. Why is godog parsing this text? This is a little different from Cucumber's Gherkin parsing.
One side effect of this is that if I have 100 lines in the data table, I am forced to implement all of them.
Is there a way I can ask godog not to do this parsing?
The solution to the problem is to use double quotes around the placeholder, as given below.
Scenario Outline: outline1
    Given url <endpoint>
    And query parameters "<query_params>"
    When method <method>
    Then status is <status>

    Examples:
    | method | endpoint   | query_params | status |
    | GET    | /endpoint1 | ?a=1&b=1     | 200    |
    | GET    | /endpoint1 | ?a=1&b=1&c=3 | 200    |
Then the following will be generated:
s.Step(`^query parameters "([^"]*)"$`, queryParameters)
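A minimal sketch of the single step definition that this pattern now maps to (the function name and body are illustrative):

```go
// queryParameters receives the whole quoted value from the Examples
// column as one string capture, e.g. "?a=1&b=1&c=3".
func queryParameters(params string) error {
    // stash params for use when the request is built
    return nil
}
```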
I am new to Cucumber and am trying to use a data table in a scenario.
Scenario: 1. Sets the configuration and validates it
    When the user sets the POST config
    | key      | value          |
    | enabled  | false          |
    | timezone | "Asia/Kolkata" |
    Then the user gets the config and the result is successful
    | key      | value          |
    | enabled  | false          |
    | timezone | "Asia/Kolkata" |
Here I am using the same data table to construct the REST POST request and then validate it.
Is there a possibility to specify the same data table for multiple steps?
If I specify the data table at the end of the scenario, I get an Arity mismatch error for the first step.
TIA
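For context, a data table is passed as an extra argument to the step it is attached to, so each step with a table must accept it; a minimal cucumber-ruby sketch (the helper names are hypothetical):

```ruby
When('the user sets the POST config') do |table|
  # Drop the header row and build {"enabled"=>"false", "timezone"=>"\"Asia/Kolkata\""}.
  @config = table.raw.drop(1).to_h
  post_config(@config) # hypothetical helper that sends the POST request
end

Then('the user gets the config and the result is successful') do |table|
  expected = table.raw.drop(1).to_h
  expect(fetch_config).to include(expected) # fetch_config is hypothetical
end
```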
You could also provide different parameters to your test by using a Scenario Outline.
Scenario Outline: Sets the configuration and validates it
    When the user sets the POST config with key "<*key>" and enabled status equals to "<*enabled>" for timezone "<*timezone>"
    Then the user gets the config and the correct data is recorded for key "<*key>" and enabled status equals to "<*enabled>" for timezone "<*timezone>"

    Examples:
    | key    | enabled | timezone     |
    | value  | false   | Asia/Kolkata |
    | value2 | true    | timezone2    |
The first time, your test will be run with key = value, enabled = false, timezone = Asia/Kolkata.
The second time, the test will be executed with key = value2, enabled = true, timezone = timezone2.
And so on.
P.S.: You need to delete the * symbols
I hope it helps.
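A rough sketch of a matching step definition in cucumber-ruby (after removing the * symbols; the helper name is hypothetical):

```ruby
When(/^the user sets the POST config with key "([^"]*)" and enabled status equals to "([^"]*)" for timezone "([^"]*)"$/) do |key, enabled, timezone|
  # Runs once per Examples row with that row's values.
  post_config(key, enabled, timezone) # hypothetical helper
end
```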
I'm writing scenarios for QA engineers, and now I face a problem with step encapsulation.
This is my scenario:
When I open connection
And device are ready to receive files
And I send to device file with params:
    | name | ololo |
    | type | txt   |
    | size | 123   |
Each of these steps is important for the people who will use my steps.
And I need to automate these steps and repeat them 100 times, so I decided to create a new step which runs them 100 times.
The first variant was to create a step with the other steps inside, like:
Then I open connection, check device are ready and send file with params 100 times:
    | name | ololo |
    | type | txt   |
    | size | 123   |
But this version is not appropriate, because:
- people who use it won't understand which steps execute inside
- step names like this are sometimes too long
The second variant was to create a step that takes the other steps in a parameter table:
I execute following steps 100 times:
| When I open connection |
| And device are ready to receive files |
| I send to device file |
It will be easy to understand for the people who will use my steps and scenarios.
But I also have some steps with parameters, so I need to create something like a two-tier table:
I execute following steps 100 times:
| When I open connection |
| And device are ready to receive files |
| I send to device file with params: |
| | name | ololo | |
| | type | txt | |
| | size | 123 | |
This is the best variant in my situation.
But of course Cucumber can't parse it without errors (it's not valid Gherkin).
How can I fix the last example of a step?
Does Cucumber have some instrument which could help me?
Can you suggest another type of solution?
Does someone have similar problems?
I decided to change the "|" symbols to "/" in the nested parameter tables.
It's not perfect, but it works.
These are the scenario steps:
I execute following steps 100 times:
| I open connection |
| device are ready to receive files |
| I send to device file with params: |
| / name / ololo / |
| / type / txt / |
| / size / 123 / |
This is step definition:
And /^I execute following steps (.*) times:$/ do |number, table|
  # The single-column table holds step names and '/'-delimited parameter rows.
  data = table.raw.map { |row| row.last }
  number.to_i.times do
    params = []
    step_name = ''
    data.each_with_index do |line, index|
      # True when the next line is not a '/'-delimited parameter row.
      next_is_not_param = data[index + 1].nil? || !data[index + 1].include?('/')
      if !line.include?('/')
        step_name = line
        # Run a parameterless step as soon as we know no table follows it.
        step step_name if next_is_not_param
      else
        # Collect the nested rows, converting '/' back to '|'.
        params += [line.gsub('/', '|')]
        if next_is_not_param
          # Rebuild a real Cucumber table and run the step with it.
          step_table = Cucumber::Ast::Table.parse(params.join("\n"), nil, nil)
          step step_name, step_table
          params = []
        end
      end
    end
  end
end
I'm new to Pig and am trying to perform some basic analysis on a file containing events that look like the below:
1345477765 2012-08-20 08:49:24 servername 12.34.56.78 192.168.1.4 joebloggs ManageSystem Here's your message
I attempt to load the file as below:
logs = LOAD '/path/to/file' USING PigStorage() AS (loggedtime:long, serverdate:chararray, servertime:chararray, servername:chararray, externalip:chararray, internalip:chararray, username:chararray, systemtype:chararray, message:chararray);
When I illustrate logs, everything looks OK:
illustrate logs;
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
| logs | loggedtime:long | serverdate:chararray | servertime:chararray | servername:chararray | externalip:chararray | internalip:chararray | username:chararray | systemtype:chararray | message:chararray |
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
| | 1345477765 | 2012-08-20 | 08:49:24 | servername | 12.34.56.78 | 192.168.1.4 | joebloggs | ManageSystem | Here's your message |
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Also, when I describe them, everything is as I would expect:
logs: {loggedtime: long,serverdate: chararray,servertime: chararray,servername: chararray,externalip: chararray,internalip: chararray,username: chararray,systemtype: chararray,message: chararray}
However, when I dump logs, the loggedtime is not included.
dump logs;
(,2012-08-20,08:49:24,servername,12.34.56.78,192.168.1.4,joebloggs,ManageSystem,Here's your message)
Presumably as a result of this, my filter returns no events:
specificlog = FILTER logs BY loggedtime == 1345477765;
Hopefully I'm missing something easy here.
I eventually figured this out myself. To get the field parsed as a long, I had to put an "L" at the end of the number.
e.g. by changing my source data to the below, I was able to get this working:
1345477765L 2012-08-20 08:49:24 servername 12.34.56.78 192.168.1.4 joebloggs ManageSystem Here's your message
Hopefully this will help someone with the same problem.
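If editing the source data isn't an option, an alternative sketch (an untested assumption, not part of the original answer) is to load the field as chararray and cast it:

```pig
-- Load the timestamp as chararray, then cast it to long in a FOREACH.
raw = LOAD '/path/to/file' USING PigStorage()
      AS (loggedtime:chararray, serverdate:chararray, servertime:chararray,
          servername:chararray, externalip:chararray, internalip:chararray,
          username:chararray, systemtype:chararray, message:chararray);
logs = FOREACH raw GENERATE (long)loggedtime AS loggedtime, serverdate, servertime,
       servername, externalip, internalip, username, systemtype, message;
specificlog = FILTER logs BY loggedtime == 1345477765L;
```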