I am not sure if this is intended to be so, but I am confused by the behavior.
When I have the following Scenario Outline:
Scenario Outline: outline1
Given url <endpoint>
And query parameters <query_params>
When method <method>
Then status is <status>
Examples:
| method | endpoint | query_params | status |
| GET | /endpoint1 | ?a=1&b=1 | 200 |
| GET | /endpoint1 | ?a=1&b=1&c=3 | 200 |
I see the following snippet generated.
func FeatureContext(s *godog.Suite) {
	s.Step(`^method GET$`, methodGET)
	s.Step(`^query parameters \?a=(\d+)&b=(\d+)$`, queryParametersAB)
	s.Step(`^query parameters \?a=(\d+)&b=(\d+)&c=(\d+)$`, queryParametersABC)
}
As you can see, the two "query parameters" lines produce two different functions. Why is godog parsing this text? This is a little different from Cucumber's Gherkin parsing.
One side effect of this is that if I have 100 lines in the data table, I am forced to implement all of them.
Is there a way I can ask godog to not do this parsing?
The solution to the problem is to put double quotes around the placeholder, as given below.
Scenario Outline: outline1
Given url <endpoint>
And query parameters "<query_params>"
When method <method>
Then status is <status>
Examples:
| method | endpoint | query_params | status |
| GET | /endpoint1 | ?a=1&b=1 | 200 |
| GET | /endpoint1 | ?a=1&b=1&c=3 | 200 |
Then the following will be generated:
s.Step(`^query parameters "([^"]*)"$`, queryParameters)
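With the quoted placeholder there is a single step definition for every row, so only one function needs to be implemented. A minimal sketch of what that function could look like, assuming the parsed values are kept in package-level state for later steps (the variable and function names below are illustrative, not part of the generated snippet):

import (
	"net/url"
	"strings"
)

// currentQuery holds the parsed query parameters for use by later steps (assumed state).
var currentQuery url.Values

func queryParameters(queryParams string) error {
	// queryParams receives the raw Examples value, e.g. "?a=1&b=1" or "?a=1&b=1&c=3".
	parsed, err := url.ParseQuery(strings.TrimPrefix(queryParams, "?"))
	if err != nil {
		return err
	}
	currentQuery = parsed
	return nil
}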
I have the following setup:
stages:
| id | name | order |

commands:
| id | name   | body         |
--------------------------------
| 1  | test   | phpunit      |
| 2  | style  | echo "style" |
| 3  | deploy | deploy       |

command_stage:
| command_id | stage_id | order |
---------------------------------
| 1          | 1        | 1     |
| 2          | 2        | 1     |
| 3          | 1        | 2     |
Basically, I would like to create a method on the Stage model which gives me back all of the commands based on the order, but the commands have a specific order and so do the stages.
Each command is stored under a stage so we know which part to run, but each command also has an order within that stage. Now I know I can do something like the following:
$inOrder = collect();

Stage::get()->each(function ($stage) use ($inOrder) {
    $commands = $stage->commands()->orderByPivot('order')->get();
    $inOrder->push($commands);
});

return $inOrder;
But I was wondering if there is a nicer way to do this? Or even a way to do it in a single database hit?
Pre-load and sort the relationship:
$stages = Stage::with(['commands' => function ($subQuery) {
    $subQuery->orderBy('command_stage.order');
}])->get();

$inOrder = collect();
foreach ($stages as $stage) {
    $inOrder->push($stage->commands);
}
This method of eager loading reduces the N+1 query issue of calling $stage->commands() inside the loop.
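For the pivot ordering to be available, the relationship has to go through the pivot table. A minimal sketch of how the Stage model might define it, assuming the table and class names from the question (this is not code from the question itself):

use Illuminate\Database\Eloquent\Model;

class Stage extends Model
{
    public function commands()
    {
        // The command_stage pivot table carries the per-stage "order" column.
        return $this->belongsToMany(Command::class, 'command_stage')
            ->withPivot('order')
            ->orderBy('command_stage.order');
    }
}

With the ordering baked into the relationship, Stage::with('commands')->get() already returns each stage's commands sorted in a single eager-loaded query.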
We have a web application hosted on Azure that sends telemetry to App Insights, and the dev team is asking if it is OK to turn off sending the session/keepalive data that's being posted from the web application. Will this affect any functionality like User Flows in Application Insights?
Any guidance on this?
The following is sample data:
timestamp | id | source | name | url | success | resultCode | duration | performanceBucket
-- | -- | -- | -- | -- | -- | -- | -- | --
2019-09-25T16:00:31.8191577Z | \|Ac34D.9fIx+.4c3e0b35_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 15.8274 | <250ms
2019-09-25T16:00:42.7423811Z | \|Ac34D.FqSNy.83ee6e0d_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 38.3679 | <250ms
2019-09-25T16:00:48.716939Z | \|Ac34D.h8kwN.34c0b012_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 16.0359 | <250ms
2019-09-25T16:00:54.1607213Z | \|Ac34D.v2qfF.4c3e0b36_ | | POST session/keepalive | http://XXXXXXXXXXXXXX.com/session/keepalive | TRUE | 200 | 15.2518 | <250ms
Views in Application Insights typically target a specific set of telemetry item types.
For instance, the User Flows UI works with PageView and CustomEvent telemetry types. Therefore, if keepalive is reported as one of those types, it will be displayed in that UI.
However, if the example above is dependency telemetry, then that view won't be affected.
In general, if you'd like to drop some of the telemetry before it reaches AI and is processed for storage, you'd use a TelemetryProcessor (in the case of the JavaScript SDK, a TelemetryInitializer) to filter it out:
var telemetryInitializer = (envelope) => {
    // Returning false from the initializer drops the item before it is sent.
    if (envelope.data.someField == 'keepalive') return false;
};
appInsights.addTelemetryInitializer(telemetryInitializer);
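On the server side, a telemetry processor along the following lines could drop the keepalive items before they leave the application. This is only a sketch; the class name and the assumption that keepalive shows up as request telemetry named "session/keepalive" are taken from the sample data above, not from a known configuration:

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class KeepAliveFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public KeepAliveFilter(ITelemetryProcessor next)
    {
        _next = next;
    }

    public void Process(ITelemetry item)
    {
        // Drop request telemetry whose name contains "session/keepalive".
        if (item is RequestTelemetry request &&
            request.Name != null &&
            request.Name.Contains("session/keepalive"))
        {
            return; // not passing the item to the next processor filters it out
        }

        _next.Process(item);
    }
}

The processor would then be registered in the telemetry processor chain (for example via ApplicationInsights.config or the TelemetryProcessorChainBuilder).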
So my issue might be of a syntactic nature, maybe not, but I am clueless on how to proceed next. I am writing a test case in Robot Framework, and my end goal is to be able to run multiple tests back to back in a loop.
In the case below, the Log To Console call works fine and outputs the different values passed as parameters. The next call, "Query Database And Analyse Data", works as well.
*** Test Cases ***
| For-Loop-Elements
| | @{Items} = | Create List | ${120} | ${240} | ${240}
| | :FOR | ${ELEMENT} | IN | @{ITEMS}
| | | Log To Console | Running tests at Voltage: ${ELEMENT}
| | | Query Database And Analyse Data
But then, when I try to make test cases with documentation and tags out of "Query Database And Analyse Data", I get the error "Keyword Name cannot be Empty", which leads me to think that when the file gets to the [Documentation] tag, it doesn't understand that it is part of a test case. This is usually how I write test cases.
Please note here that the indentation tries to match the inside of the loop
*** Test Cases ***
| For-Loop-Elements
| | @{Items} = | Create List | ${120} | ${240} | ${240}
| | :FOR | ${ELEMENT} | IN | @{ITEMS}
| | | Log To Console | Running tests at Voltage: ${ELEMENT}
| | | Query Database And Analyse Data
| | | | [Documentation] | Query DB.
| | | | [Tags] | query | voltagevariation
| | | Duplicates Test
| | | | [Documentation] | Packets should be unique.
| | | | [Tags] | packet_duplicates | system
| | | | Duplicates
| | | Chroma Output ON
| | | | [Documentation] | Setting output terminal status to ON
| | | | [Tags] | set_output_on | voltagevariation
| | | | ${chroma-status} = | Chroma Output On | ${HOST} | ${PORT}
Now, is this a syntax problem, an indentation issue, or is it just plain impossible to do what I'm trying to do? If you have written similar cases, but in a different manner, please let me know!
Any help or input would be highly appreciated!
You are trying to use Keywords as Test Cases. This approach is not supported by Robot Framework.
What you could do is make one Test Case with a lot of Keywords:
*** Test Cases ***
| For-Loop-Elements
| | @{Items} = | Create List | ${120} | ${240} | ${240}
| | :FOR | ${ELEMENT} | IN | @{ITEMS}
| | | Log To Console | Running tests at Voltage: ${ELEMENT}
| | | Query Database And Analyse Data
| | | Duplicates
| | | ${chroma-status} = | Chroma Output On | ${HOST} | ${PORT}
*** Keywords ***
| Query Database And Analyse Data
| | Do something
| | Do something else
...
You can't really fit [Tags] anywhere useful. You can, however, produce meaningful failure messages (substituting for the [Documentation]) if, instead of using a keyword directly, you wrap it in Run Keyword And Return Status.
Furthermore, please have a look at data-driven tests to get rid of the :FOR loop completely; a rough sketch follows below.
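A minimal sketch of that data-driven style using a test template; the keyword grouping and names below are assumptions based on the question, not a drop-in replacement:

*** Test Cases ***
| Voltage Variation
| | [Documentation] | Run the DB query and duplicate checks at each voltage.
| | [Tags] | voltagevariation
| | [Template] | Run Checks At Voltage
| | ${120}
| | ${240}

*** Keywords ***
| Run Checks At Voltage
| | [Arguments] | ${voltage}
| | Log To Console | Running tests at Voltage: ${voltage}
| | Query Database And Analyse Data
| | Duplicates

Each data row becomes one invocation of the template keyword, so failures are reported per voltage value without an explicit loop.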
I'm writing scenarios for QA engineers, and now I face a problem with step encapsulation.
This is my scenario:
When I open connection
And device are ready to receive files
I send to device file with params:
| name | ololo |
| type | txt |
| size | 123 |
All of these steps are important for the people who will use my steps.
And I need to automate these steps and repeat them 100 times.
So I decided to create a new step which runs them 100 times.
The first variant was to create a step with the other steps inside, like:
Then I open connection, check device are ready and send file with params 100 times:
| name | ololo |
| type | txt |
| size | 123 |
But this version is not appropriate, because:
people who will use it won't understand which steps execute inside
and sometimes step names like this are too long
The second variant was to create a step with the other steps in a parameter table:
I execute following steps 100 times:
| When I open connection |
| And device are ready to receive files |
| I send to device file |
It will be easy to understand for the people who will use my steps and scenarios.
But I also have some steps with parameters,
and I need to create something like a two-tier table:
I execute following steps 100 times:
| When I open connection |
| And device are ready to receive files |
| I send to device file with params: |
| | name | ololo | |
| | type | txt | |
| | size | 123 | |
This is the best variant in my situation.
But of course Cucumber can't parse it without errors (it's not valid Gherkin).
How can I fix the last example of a step?
Does Cucumber have some facilities which could help me?
Can you suggest your own kind of solution?
Has anyone had similar problems?
I decided to change the "|" symbols to "/" in the nested parameter tables.
It's not perfect, but it works:
These are the scenario steps:
I execute following steps 100 times:
| I open connection |
| device are ready to receive files |
| I send to device file with params: |
| / name / ololo / |
| / type / txt / |
| / size / 123 / |
This is the step definition:
And /^I execute following steps (.*) times:$/ do |number, table|
  # Keep only the last cell of each row: either a nested step name or a "/ ... /" parameter line.
  data = table.raw.map { |row| row.last }
  number.to_i.times do
    params = []
    step_name = ''
    data.each_with_index do |line, index|
      # True when the following line is not a "/ ... /" parameter line.
      next_is_not_param = data[index + 1].nil? || (data[index + 1] && !data[index + 1].include?('/'))
      if !line.include?('/')
        step_name = line
        # A step without a parameter table can be executed immediately.
        step step_name if next_is_not_param
      else
        # Collect parameter lines, converting "/" back to "|" so they parse as a Cucumber table.
        params += [line.gsub('/', '|')]
        if next_is_not_param
          step_table = Cucumber::Ast::Table.parse(params.join("\n"), nil, nil)
          step step_name, step_table
          params = []
        end
      end
    end
  end
end
I have an Oracle Forms 6i form with a data block that consists of several columns.
------------------------------------------------------------------------------
| FIRST_NAME | LAST_NAME | DEPARTMENT | BIRTH_DATE | JOIN_DATE | RETIRE_DATE |
------------------------------------------------------------------------------
| | | | | | |
| | | | | | |
| | | | | | |
| | | | | | |
------------------------------------------------------------------------------
The user can press F7 to enter Query Mode (for example, he/she types JOH% in the LAST_NAME field and H% in the DEPARTMENT field), then F8 to execute the query and see the results. In this example, a list of all employees with a last name starting with JOH and working in any department starting with H will be listed. Here is a sample output of that query:
------------------------------------------------------------------------------
| FIRST_NAME | LAST_NAME | DEPARTMENT | BIRTH_DATE | JOIN_DATE | RETIRE_DATE |
------------------------------------------------------------------------------
| MIKE | JOHN | HUMAN RES. | 05-MAY-82 | 02-FEB-95 | |
| BEN | JOHNATHAN | HOUSING | 23-APR-76 | 16-AUG-98 | |
| SMITH | JOHN | HOUSING | 11-DEC-78 | 30-JUL-91 | |
| | | | | | |
------------------------------------------------------------------------------
I then added a small button on top of each column to allow the user to sort the data by that column, using a WHEN-BUTTON-PRESSED trigger:
set_block_property('dept', order_by, 'first_name desc');
The good news is that the ORDER_BY does change. The bad news is that the user never notices the change, because he/she will need to do another query and execute it to see the output ordered by the selected column. In other words, the user will only notice the change in the next query he/she executes.
I tried to automatically execute the query upon changing the ORDER_BY clause like this:
set_block_property('dept', order_by, 'first_name desc');
go_block('EMPLOYEE');
do_key('EXECUTE_QUERY');
/* EXECUTE_QUERY -- same thing */
but what happens is that all data from the table is selected, ignoring the criteria that the user initially set while in query mode.
I also searched for a solution to this problem, and most of them deal with SYSTEM.LAST_QUERY and DEFAULT_WHERE. The problem is that LAST_QUERY can refer to a different block from a different form, one that is not valid for the currently displayed data block.
How can I do the following in just one button press:
1- Change the ORDER_BY clause of the currently active data block,
and 2- Execute the last query that the user executed, using the same criteria that were set?
Any help will be highly appreciated.
You can get the last query of the block with the GET_BLOCK_PROPERTY built-in function:
GET_BLOCK_PROPERTY('EMPLOYEE', LAST_QUERY);
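A rough sketch of how the two pieces could be combined in the WHEN-BUTTON-PRESSED trigger, assuming the block is named EMPLOYEE; extracting the WHERE clause from LAST_QUERY like this is an assumption about the query text, not a tested solution:

DECLARE
  last_qry  VARCHAR2(32767);
  where_pos NUMBER;
  order_pos NUMBER;
  where_cls VARCHAR2(32767) := NULL;
BEGIN
  -- Full text of the last SELECT issued for the block.
  last_qry  := GET_BLOCK_PROPERTY('EMPLOYEE', LAST_QUERY);
  where_pos := INSTR(UPPER(last_qry), ' WHERE ');
  order_pos := INSTR(UPPER(last_qry), ' ORDER BY ');

  -- Pull out just the WHERE clause, dropping any trailing ORDER BY.
  IF where_pos > 0 THEN
    IF order_pos > where_pos THEN
      where_cls := SUBSTR(last_qry, where_pos + 7, order_pos - where_pos - 7);
    ELSE
      where_cls := SUBSTR(last_qry, where_pos + 7);
    END IF;
  END IF;

  -- Re-apply the user's criteria and the new sort order, then re-query.
  SET_BLOCK_PROPERTY('EMPLOYEE', DEFAULT_WHERE, where_cls);
  SET_BLOCK_PROPERTY('EMPLOYEE', ORDER_BY, 'first_name desc');
  GO_BLOCK('EMPLOYEE');
  EXECUTE_QUERY;
END;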
Another option is to provide separate search field(s) on the form, instead of using the QBE functionality.