Ginkgo: how to combine test reports - go

I'm setting up GitLab CI.
We use Ginkgo tests for BDD.
Ginkgo creates a separate report for each folder that contains tests.
This creates a problem when collecting all the reports and publishing them as a single test report file.
Is it possible to configure Ginkgo so that all tests end up in a single test report file?

What I understand is that your reports lie in each test folder:
Example
testScripts
- test_1_directory (contains test spec and result files)
- test_2_directory (contains test spec and result files)
- test_3_directory (contains test spec and result files)
I'm not sure this will solve your problem exactly, but you can give it a try.
In your job, add the report paths as shown below:
artifacts:
  reports:
    junit:
      - ./packages/e2e/goProject/testScripts/**/**.xml
This assumes the generated reports are .xml files.
In the end, all tests will be displayed in the pipeline's test section.
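If you need the results physically merged into one XML file (rather than letting GitLab collect several files), one option is a small merge step of your own. Newer Ginkgo versions (v2) also provide reporting flags such as --junit-report and --output-dir that may cover this, so check the docs for the version you use. The sketch below is not Ginkgo functionality, just a plain Go program that wraps several per-folder <testsuite> reports in a single <testsuites> document (it assumes each input report has a single <testsuite> root element):

// merge_junit.go -- minimal sketch for combining per-folder JUnit reports.
package main

import (
    "encoding/xml"
    "fmt"
    "os"
)

// testSuite models one <testsuite> element; InnerXML keeps the <testcase>
// children verbatim so we do not have to model every attribute.
type testSuite struct {
    XMLName  xml.Name `xml:"testsuite"`
    Name     string   `xml:"name,attr"`
    Tests    int      `xml:"tests,attr"`
    Failures int      `xml:"failures,attr"`
    Errors   int      `xml:"errors,attr"`
    InnerXML string   `xml:",innerxml"`
}

// testSuites is the <testsuites> root element of the merged report.
type testSuites struct {
    XMLName xml.Name    `xml:"testsuites"`
    Suites  []testSuite `xml:"testsuite"`
}

func main() {
    if len(os.Args) < 3 {
        fmt.Fprintln(os.Stderr, "usage: merge_junit <out.xml> <in.xml> [<in.xml> ...]")
        os.Exit(1)
    }
    var merged testSuites
    for _, path := range os.Args[2:] {
        data, err := os.ReadFile(path)
        if err != nil {
            fmt.Fprintf(os.Stderr, "read %s: %v\n", path, err)
            os.Exit(1)
        }
        var s testSuite
        if err := xml.Unmarshal(data, &s); err != nil {
            fmt.Fprintf(os.Stderr, "parse %s: %v\n", path, err)
            os.Exit(1)
        }
        merged.Suites = append(merged.Suites, s)
    }
    out, err := xml.MarshalIndent(merged, "", "  ")
    if err != nil {
        fmt.Fprintf(os.Stderr, "marshal: %v\n", err)
        os.Exit(1)
    }
    if err := os.WriteFile(os.Args[1], append([]byte(xml.Header), out...), 0o644); err != nil {
        fmt.Fprintf(os.Stderr, "write: %v\n", err)
        os.Exit(1)
    }
}

You could then run something like go run merge_junit.go combined.xml component1/junit.xml component2/junit.xml (file names here are placeholders) and publish combined.xml as the single report artifact.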

Related

Upload coverage information to SonarCloud from coverlet for C# project

I'm trying to collect coverage info and publish it to SonarCloud for my C# project, using GitHub Actions as my CI pipeline. The execution is very simple: basically it runs the tests for all projects and merges all the coverage files:
run: |
  .\.sonar\scanner\dotnet-sonarscanner begin /k:"LanguageDev_Yoakke" /o:"languagedev" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="https://sonarcloud.io" /d:sonar.cs.opencover.reportsPaths="CoverageResults/coverage.opencover.xml" /d:"sonar.verbose=true"
  dotnet build
  dotnet test /p:CollectCoverage=true /p:CoverletOutput="../CoverageResults/" /p:MergeWith="../CoverageResults/coverage.json" /p:CoverletOutputFormat=\"opencover,json\" /maxcpucount:1
  .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"
Note that I do need to output in both opencover and json formats: coverlet only seems to merge properly if json is among the output formats, so it can then convert the merged result to opencover at the end.
The problem is that this generates no coverage information on the CI - downloading the artifacts shows no generated folder or files. However, if I just pass /p:CoverletOutputFormat=opencover, all coverage information is generated on the CI - but it is not merged properly because there is no json output.
Locally, the command
dotnet test /p:CollectCoverage=true /p:CoverletOutput="../CoverageResults/" /p:MergeWith="../CoverageResults/coverage.json" /p:CoverletOutputFormat=\"opencover,json\" /maxcpucount:1
just works and generates the proper coverage XML file.
What could be the problem? Why does it not work on the CI with both coverage formats specified? Initially I thought it was a quote-escaping issue in the YAML, but that does not seem to be the case here.
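Since all the /p: switches are just MSBuild properties, one way to take shell quoting out of the picture entirely (an untested idea, not a confirmed fix) would be to move the coverlet settings into the test project file instead of passing them on the command line:

<PropertyGroup>
  <CollectCoverage>true</CollectCoverage>
  <CoverletOutput>../CoverageResults/</CoverletOutput>
  <MergeWith>../CoverageResults/coverage.json</MergeWith>
  <CoverletOutputFormat>opencover,json</CoverletOutputFormat>
</PropertyGroup>

With the formats set in the .csproj, the CI step can call plain dotnet test /maxcpucount:1 and no comma or quote escaping is involved.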

How to check jest coverage in console?

I have some jest tests and I can determine the coverage with
jest --coverage
Also see Code coverage for Jest
I automatically execute the tests on a build server (GitLab runner) and want my tests to fail if the coverage is below a certain limit.
In python there is a flag --cov-fail-under that can be used with pytest, e.g.
pytest --cov src --cov-fail-under=90 --cov-report=term
Unfortunately, I could not find a corresponding option for jest.
What is the recommended way to check the total coverage?
Should I write some extra script to evaluate the generated json coverage file or is there an easier solution like a specific reporter to use?
It is not listed under the CLI options, but there is coverageThreshold, which can be used in package.json or in a separate Jest configuration file:
https://jestjs.io/docs/en/configuration#coveragethreshold-object
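For example, a minimal package.json entry could look like this (the 90% figures are just placeholders; pick whatever limits you need):

{
  "jest": {
    "coverageThreshold": {
      "global": {
        "branches": 90,
        "functions": 90,
        "lines": 90,
        "statements": 90
      }
    }
  }
}

With this in place, jest --coverage exits with a non-zero code whenever global coverage falls below the thresholds, which makes the CI job fail.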

jasmine-allure-reporter, display test cases separately based on spec files

I'm using Protractor with jasmine-allure-reporter
and trying to run multiple specs that are defined in the config.js file:
specs: ['spec1.js','spec2.js']
spec1.js contains 3 tests and spec2.js contains a single test, but jasmine-allure-reporter displays all four test cases (3+1) together, with no indication of which spec file each came from. How can I display the test cases separately under each spec file name in one HTML report?
Please help me on this.
I am generating the report using the command allure generate allure-results --clean -o allure-report || true, where
allure-results is the location where the XML files are generated and
allure-report is where the HTML report is generated.
Looking at the allure reporter repository, it doesn't look like this is supported. It sets the outDir but does not expose a consolidateAll option the way jasmine-reporters does (see the jasmine-reporters GitHub). If you decide to switch to jasmine-reporters, consolidating the XML files is pretty simple; see the Protractor cookbook for an example.

JavaScript Unit Tests not working on SonarQube

I have the following problems with Sonar Runner:
- SonarQube along with Sonar Runner is unable to pull the JUnit-format XML reports.
- The Unit Tests / Test Coverage widget doesn't show up; it says "No Data".
I am following instructions described here
I manually created the report file in XML format as described, but still no luck.
Below is the XML file - TEST-Firefox_210_Mac_OS.com.company.BarTest.xml,
<testsuite name="Firefox_210_Mac_OS.com.company.BarTest" errors="0" failures="0" tests="3" time="0.0">
  <testcase classname="Firefox_210_Mac_OS.com.company.BarTest" name="testfullName" time="0.0"/>
</testsuite>
To pull the unit test execution report onto the Sonar dashboard, I have used the sample Git project from here,
and below is my sonar-project.properties,
# project metadata (required)
sonar.projectKey=org.codehaus.sonar:javascript-sonar-runner-jstestdriver
sonar.projectName=JavaScript project with Sonar Runner reusing reports generated by JsTestDriver
sonar.projectVersion=1.0
# path to source directories (required)
sonar.sources=C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/sources
# path to tests source directories (required)
sonar.tests=C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/tests
sonar.javascript.jstestdriver.reportsPath=C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/target/TEST-Firefox_210_Mac_OS.com.company.BarTest.xml
sonar.sourceEncoding=UTF-8
Below is my jsTestDriver.conf:
server: http://localhost:9876
load:
- C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/sources/*.js
- C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/sources/com/company/*.js
test:
- C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/tests/*.js
- C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/tests/com/company/*.js
plugin:
  - name: "coverage"
    jar: "coverage-1.3.5.jar"
    module: "com.google.jstestdriver.coverage.CoverageModule"
My SonarQube instance is running on port 9000. As the dashboard shows, Sonar Runner and jsTestDriver are only doing code analysis and are not showing any unit tests.
The SonarQube documentation says jsTestDriver will run the JavaScript unit tests and copy the results into the target folder in XML format.
SonarQube doesn't run your unit tests; it just gathers the reports generated by your manual run or by another tool's automated run (e.g. Jenkins).
From SonarQube doc:
Prior to the SonarQube analysis, execute your unit tests in order to generate XML report. The JUnit like XML format supported is the one generated by js-test-driver
Then I manually created report file in XML format as described, but still no luck.
If you created the report files manually and the reports are still not showing, check whether your paths are correct. In particular, check whether Sonar is really reading data from the C:/Sonar/sonar-runner-dist-2.4/sonar-runner-2.4/projects/ path.
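For completeness, generating that XML with js-test-driver before the Sonar analysis looks roughly like this (the jar version and output folder are placeholders, and it assumes the JsTestDriver server is running with a browser captured):

java -jar JsTestDriver-1.3.5.jar --config jsTestDriver.conf --tests all --testOutput target

The folder passed to --testOutput should line up with the report path configured in sonar-project.properties so that the analysis actually finds the files.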

How do I set up a Ginkgo test suite?

I have inherited a Go project that consists of a lot of common files, a library of sorts, two executables, and theoretically a test suite. The test suite is being written after the fact, but the only way I've found of setting it up is rather unpalatable.
I'm using Ginkgo, and this is my starting directory structure:
component1/component1.go
component2/component2.go
cmd1/cmd1.go
cmd2/cmd2.go
project_suite_test.go
component1_test.go
Each cmd?.go file will be compiled into a separate executable.
What I would like is a multi-file test suite, usually one file per component. Where do I put the files so that go test will find and run all of them, without leaving them here in the root of the project?
ginkgo init and ginkgo bootstrap will set up your tests. ginkgo -r will run all your tests recursively.
Reason:
The ginkgo command will only work if you have actually bootstrapped your project via ginkgo.
Options:
To use it, go to your test directory in a terminal and run:
ginkgo init : to initialise the project
ginkgo bootstrap : this generates a new file with the test suite config (a sample of the generated file is shown below)
ginkgo or ginkgo test : this will now be able to run the tests based on your newly generated file, because that is what it searches for.
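For reference, running ginkgo bootstrap inside a package produces a suite file along these lines (shown for Ginkgo v2; the import path differs slightly for v1, and the names depend on your package):

package component1_test

import (
    "testing"

    . "github.com/onsi/ginkgo/v2"
    . "github.com/onsi/gomega"
)

func TestComponent1(t *testing.T) {
    RegisterFailHandler(Fail)
    RunSpecs(t, "Component1 Suite")
}

This is the file go test (and the ginkgo CLI) looks for: the plain TestComponent1 function is the entry point that hands control to Ginkgo's RunSpecs.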
Alternatively:
If you would like to keep your tests in a sub-folder, say test, then running
go test ./...
will attempt to run tests in every folder, even those that do not contain any tests, which puts a ? in the subsequent report for the non-test folders.
Running
go test ./.../test
instead will target only your test folders, giving a clean report focused on the test folders alone.
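For illustration, with the layout from the question, go test ./... prints something like the following (example.com/project is a placeholder module path):

$ go test ./...
ok      example.com/project/component1  0.015s
ok      example.com/project/component2  0.012s
?       example.com/project/cmd1        [no test files]
?       example.com/project/cmd2        [no test files]

A pattern such as go test ./.../test would instead match only packages that live in directories named test.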
You can alternatively use go run $(ls *.go) to run all the files in a given folder.
Note that $(ls *.go) is shell command substitution expanding a glob, not a regular expression.
If you want to run against a different path, adjust the path inside the substitution to your desired directory.
You can use go test ./... in the root and it will go into the child folders and execute the tests (a sketch of one such test file follows the layout below):
component1/component1.go
component1/component1_test.go
component2/component2.go
component2/component2_test.go
cmd1/cmd1.go
cmd1/cmd1_test.go
cmd2/cmd2.go
cmd2/cmd2_test.go
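As a rough sketch (the names and the Ginkgo v2 import path are assumptions), component1/component1_test.go could look like this; in a larger suite you would normally keep the RunSpecs bootstrap in its own component1_suite_test.go, as produced by ginkgo bootstrap, and put the specs in separate files:

package component1_test

import (
    "testing"

    . "github.com/onsi/ginkgo/v2"
    . "github.com/onsi/gomega"
)

// Suite entry point so go test ./... picks up the Ginkgo specs in this package.
func TestComponent1(t *testing.T) {
    RegisterFailHandler(Fail)
    RunSpecs(t, "Component1 Suite")
}

// An illustrative spec; replace with behaviour from your component1 package.
var _ = Describe("Component1", func() {
    It("adds numbers", func() {
        Expect(1 + 1).To(Equal(2))
    })
})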
