I have managed to get Cypress to report results into TestRail using cypress-testrail-reporter. However, when I execute a test run (only manually at the minute, using npx cypress run), the results of each test are logged as separate test runs. They are currently showing like this:
Screenshot from my test runs page on TestRail
I want the results from each time I execute my test suite to appear as 1 run rather than 5 different runs as shown in the screenshot.
I've bumped into the very same issue, so I've started to use cypress-testrail-accumulative-reporter, which fixes it.
This reporter isn't perfect, by the way: from time to time it does not publish the last spec executed, so I've created an empty spec file that runs at the end. Not a clean workaround, but it works.
https://www.npmjs.com/package/cypress-testrail-accumulative-reporter
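For reference, hooking it up looks like any other Cypress reporter configuration. The option names below mirror the original cypress-testrail-reporter (domain, username, password, projectId, suiteId) and may differ in the accumulative fork, so treat this as a sketch and check the package README:

```js
// cypress.config.js — sketch only; verify option names against the reporter's README.
// Older Cypress versions take the same reporter/reporterOptions keys in cypress.json.
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  reporter: 'cypress-testrail-accumulative-reporter',
  reporterOptions: {
    domain: 'yourcompany.testrail.io', // TestRail instance (placeholder)
    username: 'user@example.com',      // placeholder credentials
    password: 'your-api-key',
    projectId: 1,                      // TestRail project to publish into
    suiteId: 1,                        // suite holding the mapped cases
  },
  e2e: {},
});
```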
Our test suite takes 5 minutes to run (mostly due to Kafka containers being set up before each test, I presume).
When running mvn quarkus:dev and working on a test, I don't know how to re-run only a single test, the one I'm working on.
If my test is broken, and is the only one broken, then it is fine. But as soon as it turns green, Quarkus will not run it again if I change the test code.
If I am making big changes, Quarkus will run all broken tests and I cannot clearly follow the result of the test I am working on.
I can use mvn verify to run a single test (sketched below), but the compilation time and application startup time make it too tedious and break the mental flow.
How can I tell Quarkus to run only some specific test while it is running?
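To be concrete, the single-test workaround I mean is just Maven's per-class test filter (the class name is a placeholder):

```sh
# Run only one test class; KafkaConsumerTest is a placeholder name.
# Surefire-bound tests use -Dtest; Failsafe-bound integration tests use -Dit.test instead.
# Every invocation still pays the full compilation and Quarkus startup cost.
mvn verify -Dtest=KafkaConsumerTest
```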
I'm using cypress-allure-plugin in order to generate Allure reports for Cypress tests.
When I try to run all the specs in headed mode automatically, I don't get a full report: every test except the last one has a "skipped" status in the report, even though that is not the case when I watch the tests being executed.
This issue seems to be exclusive to using the "Run x integration specs" button, since running the specs one by one manually in headed mode doesn't produce it, and neither does running them in headless mode.
Has anybody encountered a similar issue?
EDIT: I think I found the issue. Sometimes the tests reload, wiping the current test results, and only results that were generated afterwards end up in the report.
I am trying to run multiple feature files through a Maven goal (command line), but after the first 2 feature files run successfully, the remaining feature files (3rd one onwards) fail in some test cases which, when run independently, all pass.
So if I run each feature file individually I get proper results, but running them all together gives wrong results.
We are using the Serenity framework with Cucumber JVM. How can we resolve this issue?
Your failing tests fail to fully set up their context: some state is leaking from the previous ones. Look for what has changed during the first runs (database, mocks, whatever state) that has to be reset before running the third and following feature files.
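One common way to enforce that reset is a hook that runs before every scenario. A minimal sketch, assuming cucumber-java and WireMock as an example of leaking mock state (swap in whatever actually holds shared state in your setup):

```java
import io.cucumber.java.Before;
import com.github.tomakehurst.wiremock.client.WireMock;

public class ResetStateHooks {

    // Runs before every scenario, across all feature files.
    @Before
    public void resetSharedState() {
        // Drop stubs and request history registered by earlier scenarios.
        WireMock.reset();

        // Reset anything else that outlives a feature file here:
        // truncate test database tables, clear static caches, purge queues, etc.
    }
}
```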
I have a big test suite written in TestNG that takes several hours to complete.
I'd like Maven/Surefire to create a new suite that is a copy of the first but contains just the failed tests, a suite that should be much faster to run.
Is it possible to create such a suite?
As a fallback I could create it on my own from a test report that is easy to parse, if there is such a report.
Thank you.
On completion of a run, TestNG generates a testng-failed.xml (in the same output folder as your reports), which is basically your initial suite file with the listeners, but the tests section contains only the failed test cases.
In case you are using Jenkins, you might consider adding a post-build step that triggers another build that works on the same workspace as the current build and uses this failed XML. Or, depending on how you are triggering your tests, you might look at writing a script to run the failed XML.
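Outside Jenkins, the simplest script is just a second Maven invocation that points Surefire at the generated file (adjust the path to wherever testng-failed.xml lands in your build):

```sh
# Second pass: re-run only the tests that failed in the previous run.
mvn test -Dsurefire.suiteXmlFiles=target/surefire-reports/testng-failed.xml
```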
I have a project with many modules, and I'd like to know how long each test takes to execute.
Is there any parameter that can output that information? I've searched online and found nothing.
I could run all the test cases in the IDE, which logs the execution times, and copy the times to a file, but I don't want to do this every time I need the test time log.
Check the target/surefire-reports directory of each module. Surefire writes a report for every test class it runs, and the reports contain the execution times.
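For example, each test class gets an XML report in which every testcase element carries a time attribute in seconds (class and test names below are illustrative):

```xml
<!-- target/surefire-reports/TEST-com.example.FooTest.xml (illustrative excerpt) -->
<testsuite name="com.example.FooTest" tests="2" failures="0" errors="0" skipped="0" time="1.482">
  <testcase classname="com.example.FooTest" name="savesEntity" time="1.250"/>
  <testcase classname="com.example.FooTest" name="rejectsNull" time="0.232"/>
</testsuite>
```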