Run specific part of Cypress test multiple times (not whole test) - cypress

Is it possible to run a specific part of a test in Cypress over and over again without executing the whole test case? I get an error in the second part of the test case, and the first half takes 100 s. That means I have to wait 100 s every time to reach the point where the error occurs. I would like to rerun the test case from just a few steps before the error occurs. So once again, my question is: is this possible in Cypress? Thanks

Workaround #1
If you are using cucumber in Cypress, you can modify your scenario into a Scenario Outline that will execute N times, combined with a scenario tag:
#runMe
Scenario Outline: Visit Google Page
Given that google page is displayed
Examples:
| nthRun |
| 1 |
| 2 |
| 3 |
| 4 |
| 100 |
After that run the test in the terminal by running through tags:
./node_modules/.bin/cypress-tags run -e TAGS='#runMe'
Reference: https://www.npmjs.com/package/cypress-cucumber-preprocessor?activeTab=versions#running-tagged-tests
Workaround #2
Cypress does have retry capability, but it only retries a scenario on failure. You can force your scenario to fail so that it is retried N times, again using a scenario tag.
In your cypress.json add the following configuration:
{
"retries": {
// Configure retry attempts for `cypress run`
// Default is 0
"runMode": 99,
// Configure retry attempts for `cypress open`
// Default is 0
"openMode": 99
}
}
Reference: https://docs.cypress.io/guides/guides/test-retries#How-It-Works
Next, in your feature file, add an unknown step as the last step of your scenario to make it fail:
#runMe
Scenario: Visit Google Page
Given that google page is displayed
And I am an unknown step
Then run the test through tags:
./node_modules/.bin/cypress-tags run -e TAGS='#runMe'

For a solution that doesn't require changing the config file, you can pass retries as a parameter to specific tests that are known to be flaky for acceptable reasons.
https://docs.cypress.io/guides/guides/test-retries#Custom-Configurations
Meaning you can write (from the docs):
describe('User bank accounts', {
  retries: {
    runMode: 2,
    openMode: 1,
  }
}, () => {
  // The per-suite configuration is applied to each test
  // If a test fails, it will be retried
  it('allows a user to view their transactions', () => {
    // ...
  })

  it('allows a user to edit their transactions', () => {
    // ...
  })
})

Related

Can't get tagging to work with cypress-cucumber-preprocessor

I am currently having issues with tagging using the cypress-cucumber-preprocessor package. I know that cypress-tags has been removed and made redundant, so I'm trying to set up tagging using the new syntax, but to no avail.
Here is my feature:
Feature: duckduckgo.com
Rule: I am on a desktop
Scenario: visiting the frontpage
When I visit <site>
Then I should see a search bar
#google
Examples:
| site |
| google.com |
#duckduckgo
Examples:
| site |
| duckduckgo.com |
And my step definitions:
import { When, Then } from "@badeball/cypress-cucumber-preprocessor";
When(`I visit` + url, () => {
if(url === 'duckduckgo.com') return cy.visit("https://www.duckduckgo.com");
if(url === 'google.com') return cy.visit("https://www.google.com");
});
Then("I should see a search bar", () => {
cy.get("input").should(
"have.attr",
"placeholder",
"Search the web without being tracked"
);
});
When I try to run my tests with npx cypress run --env tags="#google", it gives me an error saying url in my step definitions isn't defined. What am I doing wrong?
Try adding a script with this command to your package.json file, like this:
"scripts": {
"open:test": "npm run clean:report && cypress open --env configFile=test,TAGS=#test" (or any tag you need)
}
And then use it as:
npm run open:test
The main difference, besides wrapping it in a script, is not using quotes. Maybe this will help you.
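For reference, the undefined-url error can also be avoided with a Cucumber-expression parameter instead of string concatenation. Below is a minimal sketch (assuming the `{word}` parameter syntax of @badeball/cypress-cucumber-preprocessor); the site-to-URL mapping is pulled into a plain helper with a hypothetical name so it can be exercised outside Cypress:

```javascript
// Plain helper: map the site captured from the step text to a full URL.
// The helper name is hypothetical, not from the original post.
function siteToUrl(site) {
  return `https://www.${site}`;
}

// In the real step-definition file you would register it roughly like this:
// import { When } from "@badeball/cypress-cucumber-preprocessor";
// When("I visit {word}", (url) => {
//   cy.visit(siteToUrl(url));
// });
```

With a parameterized step like this, the `<site>` value from each Examples row is passed into the step function as an argument, so no free-standing `url` variable is needed.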

How we can run same feature file on multiple browser sequentially? [duplicate]

I am able to execute a WebUI feature file against a single browser (Zalenium) using the parallel runner and a driver defined in karate-config.js. How can we execute a WebUI feature file against multiple browsers (Zalenium) using the parallel runner or distributed testing?
Use a Scenario Outline and the parallel runner. Karate will run each row of an Examples table in parallel, but you will have to move the driver config into the feature.
Just add a parallel runner to this sample project and try: https://github.com/intuit/karate/tree/master/examples/ui-test
Scenario Outline: <type>
* def webUrlBase = karate.properties['web.url.base']
* configure driver = { type: '#(type)', showDriverLog: true }
* driver webUrlBase + '/page-01'
* match text('#placeholder') == 'Before'
* click('{}Click Me')
* match text('#placeholder') == 'After'
Examples:
| type |
| chrome |
| geckodriver |
There are other ways you can experiment with. Here is another pattern: keep a normal Scenario in main.feature, which you can then call from a Scenario Outline in a separate "special" feature, used only when you want this kind of parallelization of UI tests.
Scenario Outline: <config>
* configure driver = config
* call read('main.feature')
Examples:
| config! |
| { type: 'chromedriver' } |
| { type: 'geckodriver' } |
| { type: 'safaridriver' } |
EDIT - also see this answer: https://stackoverflow.com/a/62325328/143475
And for other ideas: https://stackoverflow.com/a/61685169/143475
EDIT - it is possible to re-use the same browser instance for all tests and the Karate CI regression test does this, which is worth studying for ideas: https://stackoverflow.com/a/66762430/143475

Terminate / Skip / Stop all tests from all spec files if any one test fails in cypress

I am trying to skip all other tests from all spec files if one test fails, and found a working solution here: Is there a reliable way to have Cypress exit as soon as a test fails?. However, this seems to work only if the test fails in it() assertions. How can we skip the tests if something fails in beforeEach()?
For eg:
before(() => {
cy.get('[data-name="email-input"]').type(email);
cy.get('[data-name="password-input"]').type(email);
cy.get('[data-name="account-save-btn"]').click();
});
And if something goes wrong (for eg: CypressError: Timed out retrying: Expected to find element: '[data-name="email-input"]', but never found it.) in above code then stop/ skip all tests in all spec files.
Just in case anyone is looking for an answer to the same question, I have found a solution and would like to share it.
To implement the solution I used a cookie that is set to true if something fails; before executing each test, Cypress checks the value of the cookie. If the value of the cookie is true, it skips the test.
Cypress.on('fail', error => {
document.cookie = "shouldSkip=true" ;
throw error;
});
function stopTests() {
cy.getCookie('shouldSkip').then(cookie => {
if (cookie && typeof cookie === 'object' && cookie.value === 'true') {
Cypress.runner.stop();
}
});
}
beforeEach(stopTests);
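The cookie check inside stopTests boils down to a small predicate. As a sketch (the helper name is made up, not from the original answer), the decision logic is:

```javascript
// Returns true when the 'shouldSkip' cookie was set by the Cypress.on('fail') handler.
// cy.getCookie yields either null or a cookie object such as { value: 'true' }.
function shouldSkipTests(cookie) {
  return Boolean(cookie && typeof cookie === 'object' && cookie.value === 'true');
}
```

Extracting the condition this way also makes it easy to unit test without the Cypress runner.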
Also note: tests should be written in it() blocks; avoid using before() to write test steps.
As of Cypress 10, tests don't run if before or beforeEach hook fails.

Always some test cases getting jasmine.DEFAULT_TIMEOUT_INTERVAL

I am creating end-to-end (e2e) tests using Protractor with Jasmine and Angular 6. I have written almost 10 test cases. They all work, but some cases always fail because of the Jasmine timeout. I have configured the timeout value as below, but I am not getting consistent results: sometimes a test case succeeds, but on the next run it may fail. I have searched on Google but have not found any useful solution.
I have defined some common helper methods for waiting:
waitForElement(element: ElementFinder){
browser.waitForAngularEnabled(false);
browser.wait(() => element.isPresent(), 100000, 'timeout: ');
}
waitForUrl(url: string){
browser.wait(() => protractor.ExpectedConditions.urlContains(url), 100000, 'timeout')
}
And protractor.conf.js file I have defined that
jasmineNodeOpts: {
showColors: true,
includeStackTrace: true,
defaultTimeoutInterval: 20000,
print: function () {
}
}
I am getting below error
- Error: Timeout - Async callback was not invoked within timeout specified by jasmine.DEFAULT_TIMEOUT_INTERVAL.
- Failed: stale element reference: element is not attached to the page document
(Session info: chrome=76.0.3809.100)
(Driver info: chromedriver=76.0.3809.12 (220b19a666554bdcac56dff9ffd44c300842c933-refs/branch-heads/3809#{#83}),platform=Windows NT 10.0.17134 x86_64)
I have found the solution:
I had configured a waiting timeout of 100000 ms for individual element lookups, while the whole spec timeout was only 20000 ms. So I followed this process:
Keep the full spec timeout larger than the sum of all element-wait timeouts. I configured defaultTimeoutInterval in jasmineNodeOpts to be greater than the sum of the wait timeouts a test case can hit, and then added a large value allScriptsTimeout: 2000000 inside exports.config. That resolved my problem.
NB: I gave this answer because I think it may help others who face this kind of problem.
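The resulting configuration can be sketched roughly as below. The concrete numbers are assumptions; the point is that defaultTimeoutInterval and allScriptsTimeout both exceed the per-element waits (100000 ms each) that a single spec can perform:

```javascript
// protractor.conf.js (fragment, sketch with assumed values)
const config = {
  // must cover the longest-running spec, including all browser.wait calls
  allScriptsTimeout: 2000000,
  jasmineNodeOpts: {
    showColors: true,
    includeStackTrace: true,
    // larger than the sum of the per-element wait timeouts in one spec
    defaultTimeoutInterval: 400000,
    print: function () {}
  }
};
// in the real protractor.conf.js this would be: exports.config = config;
```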

Ginkgo skipped specs counted as failed

I've been using Ginkgo for a while, and I have found a behavior I don't really understand. I have a set of specs that I want to run if and only if a condition is available. If the condition is not available, I want to skip the test suite.
Something like this:
ginkgo.BeforeSuite(func() {
	if !CheckCondition() {
		ginkgo.Skip("condition not available")
	}
})
When the suite is skipped this counts as a failure.
FAIL! -- 0 Passed | 1 Failed | 0 Pending | 0 Skipped
I assumed there should be one test counted as skipped. Am I missing something? Any comments are welcome.
Thanks
I think you are using the Skip method incorrectly. It should be used inside a spec, like below, not inside BeforeSuite. When used inside a spec, it does show up as "skipped" in the summary.
It("should do something, if it can", func() {
if !someCondition {
Skip("special condition wasn't met")
}
})
https://onsi.github.io/ginkgo/#the-spec-runner
