Is it possible to generate test code from a Cucumber feature file? Some developers prefer to read actual code instead of parsing the natural language and the associated step definitions used by Cucumber.
Instead of (or as a side effect of) running Cucumber, a file with test code could be generated, e.g. the code that lives inside the step definitions.
I'm evaluating Ginkgo at the moment - I very much like the BDD style.
However, I'm currently unable to get the VS Code debugger to work with the framework. The official VS Code extension provides test-by-test debugging for native Go tests using CodeLens. With other languages and frameworks (e.g. TypeScript/Mocha), I've been able to debug individual test files by setting up launch.json appropriately, but I have been unable to find suitable examples for Go.
Does anybody have any examples of launch.json setups for debugging Ginkgo tests (or Go code invoked from any other framework)?
Thanks!
After a bit of playing around I found a way forward which perhaps should have been obvious. In case it isn't I'll leave the question and this answer here:
For a package foo, a foo_suite_test.go file is generated by the ginkgo bootstrap command. This contains a top-level test called TestFoo which runs the rest of the tests within the package.
This does have a CodeLens run test | debug test section above it which you can use to debug the entire suite.
It's not quite as convenient as the individual CodeLens entries which appear above each native Go test, but it's easy enough to isolate specific tests to run using the Ginkgo F prefix (FDescribe, FContext, FIt).
I am currently trying to refactor an existing gnome-shell extension's codebase. Part of that is introducing unit tests as it seems rather neglectful to not use tests in 2016.
After some tinkering I managed to set up a working node-phantomjs-qunit pipeline that actually gets me somewhere.
However, shell extensions use a custom imports mechanism as well as some amendments to built-in classes (e.g. String.format via GJS) that make it impossible to actually test those files in an isolated environment, that is, outside the shell.
So my question is: Is it really true that it is impossible to write unit tests for shell extensions?
I've done some work with unit tests for GNOME Shell extensions; take a look at this extension for a complete example:
https://github.com/emerinohdz/power-alt-tab
I've used webpack with babel (optional) and GJS. It is even built using Travis CI.
I've included a dumb polyfill for the GNOME Shell parts I needed, and provided an alternative way to handle modules, using ES6 imports instead of the default GS imports mechanism. No integration tests are possible right now, only unit tests, but at least you have control over most of your codebase.
According to this article, one can make a parameterized test in the GoogleTest framework with some code like this:
INSTANTIATE_TEST_CASE_P(InstantiationName,
                        MyStringTest,
                        ::testing::Values("meek", "geek", "freek"));

TEST_P(MyStringTest, acceptsEekyWords)
{
    ASSERT_TRUE(acceptName(GetParam()));
}
plus some scaffolding.
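The scaffolding referred to above is mainly a test fixture deriving from ::testing::TestWithParam. A minimal sketch, assuming a free function acceptName declared elsewhere (its real signature may differ), could look like this:

#include <gtest/gtest.h>
#include <string>

// Hypothetical function under test; only its declaration is needed here.
bool acceptName(const std::string& name);

// The fixture must derive from ::testing::TestWithParam<T>, where T matches
// the type of the values passed to ::testing::Values().
class MyStringTest : public ::testing::TestWithParam<const char*>
{
};

With this in place, the TEST_P body receives each value in turn through GetParam().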
After going through the CxxTest User Guide, I couldn't help but notice the lack of any mention of parameterized tests. Are parameterized tests even possible with CxxTest?
This question seems to address something similar, but the answer is by no means trivial.
I'm new to C++ unit testing. Maybe parameterized tests aren't a big deal? Almost all of my tests were parameterized in my last C# NUnit project.
As I wrote in my answer to the other question you cited, there's no facility for generating tests at run time, which is what any parameter-based tester I've ever seen does. In CxxTest, the test list is determined at compile time by the Python-based C++ parser, cxxtestgen. You can generate suites dynamically, but you only have the option of generating zero or one copy of any suite.
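As a workaround (not a CxxTest feature, just a sketch that reuses the hypothetical acceptName function from the GoogleTest example above), you can loop over the inputs inside a single statically defined test:

#include <cxxtest/TestSuite.h>
#include <string>
#include <vector>

// Hypothetical function under test, mirroring the GoogleTest example.
bool acceptName(const std::string& name);

class MyStringTest : public CxxTest::TestSuite
{
public:
    // One compile-time test that iterates over the "parameters" itself.
    void testAcceptsEekyWords()
    {
        const std::vector<std::string> words = { "meek", "geek", "freek" };
        for (const std::string& word : words)
        {
            // TSM_ASSERT attaches a message, so the failing word shows up
            // in the output instead of just a line number.
            TSM_ASSERT(word.c_str(), acceptName(word));
        }
    }
};

The obvious downside is that cxxtestgen sees only one test, so a failing parameter does not appear as a separate test case in the results.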
Is CppUnit the only C/C++ unit test framework currently available for use with Sonar?
What would be involved in adding additional C/C++ unit testing frameworks? (e.g. how many lines of code is the CppUnit plugin, how reusable, etc.)
I think you would be better off sending your queries to Sonar's mailing lists: http://www.sonarsource.org/support/support/
See the unit test page: http://docs.codehaus.org/display/SONAR/Unit+Test+Support
From that page:
The C++ Plugin parses reports in xunit-compliant format using the property sonar.cxx.xunit.reportPath. To use other formats, they first need to be converted using the property sonar.cxx.xunit.xsltURL.
For convenience the following xsl files are provided:
boosttest-1.x-to-junit-1.0.xsl for transforming Boost reports
cpptestunit-1.x-to-junit-1.0.xsl for transforming CppTestUnit reports
cppunit-1.x-to-junit-1.0.xsl for transforming CppUnit reports
So frameworks that produce xUnit-format output, like the Google Test framework, should be supported out of the box. Frameworks that output some other XML format should be supportable by supplying a suitable xslt.
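To illustrate, a hypothetical sonar-project.properties excerpt using the two properties named above might look roughly like this (the report path and the XSL value are placeholders, not taken from the plugin docs):

# Where the C++ plugin should pick up the xunit-style test report(s)
sonar.cxx.xunit.reportPath=build/test-reports/xunit-report.xml
# Only needed when the report is not already xunit-compliant;
# point it at a suitable XSL, e.g. one of the bundled files listed above
sonar.cxx.xunit.xsltURL=cppunit-1.x-to-junit-1.0.xsl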
I am involved in developing unit-level test cases for our project, which contains both managed code and native C++ code. After some study I chose NUnit for the managed code, and I will use either Gallio or FireBenchmarks, an extension that provides HTML output, charts, etc.
Are there extensions like this for CppUnit or Boost.Test? I have not decided which one to use yet. If there are none, which of the two would be easier to extend to enable such a plugin?
Please give your suggestions on this.
You can configure Boost.Test to generate XML output. The doc says:
This log format is designed for automated test results processing. The test log output XML schema depends on the active log level threshold.
This can be enabled by specifying -output_format=XML on the command line, or by setting the environment variable BOOST_TEST_OUTPUT_FORMAT=XML. The related docs are here.
It is also possible to configure Boost.Test at compile time to produce XML output by default (described here).
In order to generate HTML, you either need to implement your own formatter (which is possible, but rather underdocumented, so please ask on the mailing list) or transform the XML in a post-processing step.
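For completeness, here is a minimal Boost.Test module you can feed into such a pipeline (the module and test names are placeholders):

// Minimal Boost.Test module; build it, then run the resulting binary with the
// XML output option or environment variable described above.
#define BOOST_TEST_MODULE ExampleSuite
#include <boost/test/included/unit_test.hpp>  // header-only variant for simplicity

BOOST_AUTO_TEST_CASE(addition_works)
{
    BOOST_CHECK_EQUAL(2 + 2, 4);
}

The XML it emits can then be turned into HTML with an XSLT of your choice in the post-processing step mentioned above.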