I just started using async/await in my Node.js code and noticed that my code coverage tool cannot handle it; I get "Fatal error: Unexpected token" for any line with async on it. I'm using Karma and Jasmine as my unit test framework and grunt-jasmine-node-coverage for code coverage. I checked, and grunt-jasmine-node-coverage hasn't been updated in years. I looked for a more modern code coverage library but couldn't find any that had been updated in the past year. I'm fine with using plain npm scripts instead of Grunt to run my tasks (I know I'm way behind on that), but I couldn't find any code coverage frameworks recent enough for that to make a difference.
Does anyone know of a code coverage framework for JS code that works with ES2018 syntax?
I used nyc (https://github.com/istanbuljs/nyc) with jasmine (https://jasmine.github.io/pages/docs_home.html) and it worked great. My package.json config was:
"scripts": {
"test":"jasmine",
"coverage": "nyc --reporter=lcov npm run test"
},
"nyc": {
"report-dir": "spec/coverage",
"exclude": [
"spec/**/*"
]
},
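With that in place, npm run coverage writes the lcov output under spec/coverage. For reference, a spec like the hypothetical one below (assuming Jasmine 2.7+ and a Node version with native async/await support) gets covered without any extra transpilation step:
// spec/example.spec.js -- hypothetical async spec; describe/it/expect are Jasmine globals
describe('async code', () => {
  it('resolves a value', async () => {
    const value = await Promise.resolve(42);
    expect(value).toBe(42);
  });
});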
Versions
Cypress version: 8.4.0
Preprocessor version: 4.2.0
Node version: 12.18.2
Hi all, apologies if this is a stupid question; I'm quite new to Cypress, let alone Cypress + Cucumber.
I wrote some automation tests in Cucumber and they work fine. I have the feature files in the integration folder, and the step definition folders are in the integration folder too. Now I'm trying to add some structure where, under integration, I have a folder named step_definitions (shown in the screenshot below).
(screenshot: folder structure)
In package.json I put the following:
"cypress-cucumber-preprocessor": { "nonGlobalStepDefinitions": true, "nonGlobalStepBaseDir": "step_definitions", "commonPath": "common", "stepDefinitions": "step_definitions" }
When I try to run the tests, I get the below error:
Error: We've tried to resolve your step definitions at step_definitions, but that doesn't seem to exist. As of version 2.0.0 it's required to set step_definitions in your cypress-cucumber-preprocessor configuration. Look for nonGlobalStepDefinitions and add stepDefinitions right next to it. It should match your cypress configuration has set for integrationFolder. We no longer rely on getting information from that file as it was unreliable and problematic across Linux/MacOS/Windows especially since the config file could have been passed as an argument to cypress.
Any pointers are appreciated :)
It seems to me that the problem in your case is "stepDefinitions": "step_definitions". Have you tried giving the full path, like "stepDefinitions": "cypress/integration/step_definitions"?
You should also set nonGlobalStepDefinitions to false or remove that setting, since you don't have a separate folder for the step definitions; they live inside the integration folder instead.
So, in order to use your structure, please modify that section in the package.json file to:
"cypress-cucumber-preprocessor": {
"commonPath": "cypress/integration/step_definitions/common",
"stepDefinitions": "cypress/integration/step_definitions"
}
That would be enough. It works.
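For completeness: the preprocessor also has to be registered in the plugins file. This is the standard wiring from the cypress-cucumber-preprocessor README, shown here only as a sketch in case your setup already differs:
// cypress/plugins/index.js -- register the cucumber preprocessor with Cypress
const cucumber = require('cypress-cucumber-preprocessor').default;

module.exports = (on, config) => {
  on('file:preprocessor', cucumber());
};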
I am working on a framework using WebdriverIO and Mocha. Recently I installed the Allure reporter to generate HTML reports through Jenkins.
I am facing a problem with skipped tests, though. I have a lot of tests that consist of just a header without any code and still need to be written.
In Mocha I use it.skip to skip these tests.
While the tests are skipped, the Allure report only recognizes 1 skipped test per file.
When running the code below, Allure reports 1 passed test, 1 failed test, and only 1 skipped test (instead of 2).
const chai = require('chai');

describe('Allure test', function () {
  it.skip('1. this is a skipped test without any code', function () {
  });
  it.skip('2. this is another skipped test without any code', function () {
  });
  it('3. this is an enabled test that has a successful assert', function () {
    chai.expect("foo", "foo should equal foo").to.contain("foo");
  });
  it('4. this is an enabled test that has a failed assert', function () {
    chai.expect("foo", "foo should equal foo").to.contain("bar");
  });
});
I would really like my Allure report to show how many tests are skipped, so I can show how much work is left.
The default Mocha logging handles this just fine; it shows:
Number of specs: 1
1 passing (4.00s)
2 skipped
1 failing
I also use the wdio spec reporter, which shows it like this (also fine):
1 passing (2s)
2 pending
1 failing
I have tried implementing a categories.json file to manipulate the Allure categories, but I can't get anything to change.
I tried this as a test, but adding it to my allure results folder changed nothing:
[
  {
    "name": "Ignored tests",
    "matchedStatuses": ["skipped", "Skipped", "pending", "Pending", "failed", "Failed", "broken", "Broken", "skip", "Skip", "failing", "Failing", "passes", "Passes"]
  }
]
The tools and versions I use are:
`-- wdio-mocha-framework#0.6.2
`-- wdio-allure-reporter#0.6.3
`-- webdriverio#4.13.1
Can anyone tell me how I can get Allure to see all skipped tests?
It is a bug. I've fixed it in https://github.com/webdriverio/wdio-allure-reporter/pull/127
Thanks for reporting this. In the future, if you run into such a bug, please file an issue on GitHub.
I would like to display Spectron test results in TeamCity. I have followed the instructions at the Webdriverio TeamCity Reporter page, which are:
npm install wdio-teamcity-reporter --save-dev
and creating a wdio.conf.js file:
exports.config = {
  reporters: ['teamcity'],
}
I have placed this file at the root of the project. It has no other entries; I've never needed a wdio.conf.js before.
I have also tried the additional configuration suggested at wdio-teamcity-reporter npm page.
This is the Jest object in package.json:
"jest": {
"moduleFileExtensions": [
"ts",
"tsx",
"js"
],
"transform": {
"\\.(ts|tsx)$": "<rootDir>/node_modules/ts-jest/preprocessor.js"
},
"roots": [
"<rootDir>/__tests__/",
"<rootDir>/components/"
],
"modulePaths": [
"<rootDir>/__tests__/",
"<rootDir>/components/"
],
"testMatch": [
"**/?(*.)(spec|test).(ts)?(x)"
]
}
And this is the relevant command (that TeamCity calls) in package.json:
"scripts": {
// ...
"test": "jest --maxWorkers=1 --forceExit",
// ...
},
This testing project is built with TypeScript and Jest, and comprises only the e2e Spectron tests for an Electron app. The build artifact for that app is a TeamCity dependency of my test 'build'. In my build, TeamCity installs the app, runs the Spectron tests (which are passing), and then uninstalls the app.
All I can see at the moment is the Jest console output in the build log. While there are some hidden artifacts, I see no normal artifacts. I was expecting the reporting package to produce an HTML artifact. How do I go about displaying a test tab, or some other useful set of results?
It turns out that Jest can collect all the Webdriver results. Try using https://www.npmjs.com/package/jest-teamcity.
In jest.config.js use:
"testResultsProcessor": "jest-teamcity"
Using the latest version of NativeScript, I've created a plugin as per the documentation, and after running tns plugin add ../nativescript-keychain I get the message Successfully installed plugin nativescript-keychain.
I can also see that it's been added to the node_modules directory of my app, but require("nativescript-keychain") doesn't work; I get the error Cannot find module 'nativescript-keychain'.
My plugin package.json looks like this:
{
  "name": "nativescript-keychain",
  "version": "0.0.1",
  "nativescript": {
    "platforms": {
      "ios": "2.2.1"
    }
  }
}
There are several reasons why this might occur; it would be helpful if you provided a repo to see all the code.
Your package.json doesn't point to the plugin's source; typically you need a "main": "somefile" key (see the example after this list).
Did you run tns run ios --emulator after you installed the plugin? You have to rebuild the app before the plugin takes effect; plugins can't be synced via livesync...
Is the code TypeScript or JavaScript? If it is TypeScript, it needs to be transpiled to JS before you can add it to your demo application. TNS will NOT compile any TS code in plugins; plugins have to ship with the final JS code.
You need typings for TS to get auto-complete and to avoid warnings about which methods are available.
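On the first point, a corrected plugin package.json might look like the sketch below; the "main" filename is just a placeholder, so point it at your plugin's actual (compiled JS) entry file:
{
  "name": "nativescript-keychain",
  "version": "0.0.1",
  "main": "keychain.js",
  "nativescript": {
    "platforms": {
      "ios": "2.2.1"
    }
  }
}
With a valid "main" entry and the compiled JS shipped in the plugin, require("nativescript-keychain") should resolve after a full rebuild.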
I have several tests generating coverage reports with Istanbul. One of them is generated by the karma-coverage plugin. I am merging these reports with istanbul report, but the files from the karma-coverage report are not included in the merged report.
There was an issue with file paths that had the same symptoms but it seems to have been fixed: https://github.com/karma-runner/karma-coverage/pull/163
So it is probably something else.
I have tried using grunt-istanbul, which instruments the source files separately, and then browserifying them into the bundle. I also tried using the preprocessor from the karma-coverage plugin to instrument the bundle. In both cases karma-coverage generates reports that look OK, but these reports are not included in the merged report.
What am I doing wrong here? Is there maybe some workaround?
Package versions:
"karma": "^0.13.10",
"karma-coverage": "^0.5.2",
"grunt-istanbul": "^0.6.1"
karma.conf.js
reporters: ['coverage', 'spec'],
coverageReporter: {
  type: 'lcov',
  dir: 'coverage'
}
The coverage reporter should have type: 'lcov'; then you can merge the reports.
If you are able to generate the coverage reports separately, then you can merge them as described here:
link
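I can't tell exactly what the linked approach describes, but the merge step I've used with the old istanbul CLI looks roughly like the sketch below; the directory layout is illustrative and the exact flags should be checked against istanbul report --help for your version:
# collect every coverage*.json file (including the one from the karma run) under ./coverage,
# then build a single merged lcov report from all of them
istanbul report --root coverage --dir coverage/merged lcov
If the karma run only produces lcov.info and the HTML report, adding a json entry to coverageReporter (if I remember the karma-coverage options correctly) should give you a coverage JSON file that istanbul report can pick up.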