I have been using Sonar for over 4 years on the Java projects we have created.
Currently, we want to fail the build if certain metrics are not met. So I installed the Build Breaker plugin and re-ran the build cycle without any issue. Then I modified the quality gate to contain a 'Comments (%)' condition with an 'is less than' check of '20' (warning threshold) and '10' (error threshold), as attached in:
However, after those definitions the projects failed to build, although all files seem to have enough comments (from 28% to 77%), as in the image below.
Meanwhile, on the main screen it is shown as 0%.
I could not find any log entry, comment, or information on how this can happen (in sonar.log or with mvn -X), and as far as I have searched, no one else has encountered this issue.
Has anyone encountered this problem, or does anyone have an idea why this can occur? My SonarQube version is 5.6.7, Build Breaker is version 2.2 (downloaded from GitHub), Maven is 3.0.5, and the JDK is 8.
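For reference, an equivalent condition can also be created through the web service instead of the UI. A minimal Groovy sketch, assuming a local server, default admin credentials, and a gate id of 1 (all placeholders); 'comment_lines_density' is the metric key behind 'Comments (%)':

    // Hedged sketch: POST /api/qualitygates/create_condition (SonarQube 5.x).
    // Host, credentials and gateId are assumptions -- adjust to your setup.
    def conn = new URL('http://localhost:9000/api/qualitygates/create_condition').openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Authorization',
            'Basic ' + 'admin:admin'.bytes.encodeBase64().toString())
    conn.outputStream << 'gateId=1&metric=comment_lines_density&op=LT&warning=20&error=10'
    assert conn.responseCode == 200 : "condition creation failed: ${conn.responseCode}"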
When defining your Quality Gate, you selected the option "over leak period". So what counts is the variation of the percentage of comments within your current leak period, which is version 1.4.1.
What you see in the screenshot, with the percentage of comments per file, is the absolute measure.
You might want to uncheck the option "over leak period".
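To see both numbers side by side, you can query the measures web service. A small Groovy sketch, assuming SonarQube 5.6's /api/measures/component endpoint; host and project key are placeholders:

    import groovy.json.JsonSlurper

    // Hedged sketch: fetch the absolute 'Comments (%)' value and its
    // leak-period variation for a project.
    def url = 'http://localhost:9000/api/measures/component' +
              '?componentKey=com.example:myproject' +
              '&metricKeys=comment_lines_density&additionalFields=periods'
    def component = new JsonSlurper().parseText(url.toURL().text).component

    component.measures.each { m ->
        println "absolute ${m.metric}: ${m.value}"               // what the file list shows
        m.periods?.each { p ->
            println "leak period ${p.index} variation: ${p.value}"  // what the gate checks
        }
    }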
Note that the way measures are displayed was improved in later versions of SonarQube.
I see 26,253 warnings in the MSBuild log with SonarScanner (marked red). These include 558 compiler warnings; the rest come from the code analyzer.
But I see only 16,396 issues in SonarQube (marked blue).
Why is there a huge difference of around 10,000 issues, and why are those issues not reported in SonarQube?!
There are four (thanks @Julian) possibilities here:
~10k issues in your project have been "resolved" as False Positive or Won't Fix. Okay, this volume of FP/WF issues is unlikely, so on to...
You have set some exclusions which filter out some issues. This is possible, but 10k seems a bit high for that, so...
The number you're looking at is the total number of SonarC# issues, plus compiler warnings, plus (insert some other checker run automatically during the build here); but only issues that correspond to rules active in your Quality Profile will be reflected in SonarQube.
.cs files outside the project base directory may be excluded automatically by the SonarQube Scanner for MSBuild
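If you want to verify the server-side numbers, the issues web service reports totals directly. A hedged Groovy sketch; host and project key are placeholders:

    import groovy.json.JsonSlurper

    // Hedged sketch: ask SonarQube how many issues it holds for the project,
    // split into open and resolved (False Positive / Won't Fix / fixed).
    def base = 'http://localhost:9000/api/issues/search' +
               '?componentKeys=com.example:myproject&ps=1'

    def open     = new JsonSlurper().parseText((base + '&resolved=false').toURL().text).total
    def resolved = new JsonSlurper().parseText((base + '&resolved=true').toURL().text).total

    println "open issues: ${open}, resolved issues: ${resolved}"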
My project is analysed by SonarQube for every VCS check-in and I have observed some strange behavior:
The dependency cycle-count changes to extremes every now and then.
When viewing the details (e.g. by clicking the link), the old (smaller) value is displayed. What could be the cause of this?
This feature was dropped from the SonarQube platform in version 5.2. Thus, even if there might be some flaws in this on the SonarJava analyzer side, there is no point in making an effort to fix them, as this will be dropped when it moves to the LTS version 5.x.
See this ticket for a detailed explanation: https://jira.sonarsource.com/browse/SONAR-6553
I updated SonarQube from 4.4 to 4.5.4 LTS.
After the initial analysis of a project using the same set of rules as in my previous version, the number of issues increased significantly.
Build Breaker broke my build gracefully, as the number of critical issues shot past my quality gate thresholds.
I double-checked the number of rules on the previous instance. It's exactly the same.
I've got a quality profile in Sonar which will alert if the number of Violations goes up since the previous analysis, e.g. "Alert if Critical Issues since previous analysis is greater than 0".
The problem with this is that when you run a subsequent build without any code changes (or perhaps an innocuous code change) the alert is cleared.
Is there a way to get Sonar to compare its results against the last analysis that did not contain any alerts?
EDIT: I should make it clear that the "difference since previous version" option will not work for our setup as we're employing a Continuous Delivery strategy, in which each build is a potential release candidate with its own unique version (we're using a date/time stamp as the version).
EDIT #2: I have also tried setting the value sonar.timemachine.period4 to a hardcoded version that I want to compare against; however this value is not accessible when configuring the Alerts, and is certainly ignored during an actual analysis.
After poking around in Sonar's source, a colleague and I came up with a workaround solution.
Set up your quality profile using the "previous version" comparison wherever you actually want to compare to the last good build.
For each build:
Query the last VCS tag with a build version and assign it to a variable called ${LAST_GOOD_BUILD} or similar for the rest of your build process to use.
Run Sonar with -Dsonar.timemachine.period3=${LAST_GOOD_BUILD} (also making sure the Build Breaker plugin is active).
If you get no alerts, the next build step needs to record your new version in a VCS tag.
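Put together, a rough Groovy sketch of those per-build steps, assuming git and a hypothetical good-<version> tag naming scheme (both are assumptions, not part of Sonar):

    // Hedged sketch: find the last good build, analyse against it, and tag
    // the new version only when Build Breaker lets the analysis pass.
    def lastGood = ['git', 'describe', '--tags', '--match', 'good-*', '--abbrev=0']
            .execute().text.trim() - 'good-'   // e.g. 'good-20140101.1' -> '20140101.1'

    def sonar = ['mvn', 'sonar:sonar', "-Dsonar.timemachine.period3=${lastGood}"].execute()
    sonar.waitForProcessOutput(System.out, System.err)

    if (sonar.exitValue() == 0) {
        // No alerts: record this version as the new last good build.
        ['git', 'tag', "good-${System.getenv('BUILD_VERSION')}"].execute().waitFor()
    }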
This works because sonar.timemachine.period3 is the same setting as "previous version" in your quality profile, but you are now replacing it with a hard-specified version of your choosing. Every time you build, you are tagging only the builds that pass quality checks, and when you run Sonar, you're only comparing against these good versions.
Pretty horrid, but it gets our build pipeline up and running again. If anything's unclear about the above, please let me know and I'll update this "solution".
CAVEATS: Your version numbering cannot be whole integers - Sonar will interpret this as the number of days between your current analysis and the one you want to compare with! Also, it cannot be in a format that could be confused with yyyy-MM-DD (e.g. 1000-01-01) as if this also happens to resolve to a real date, then you are inadvertently specifying the start of a date range. I've not yet seen anyone specifying version numbers that way, but you never know.
No, but you can configure SonarQube to base your differential views on previous_version or on a date. See http://docs.codehaus.org/display/SONAR/Differential+Views#DifferentialViews-DifferentialViewsSettings
Does Sonar offer any way to raise alerts and fail a build when the trend for certain metrics is bad?
Background: In our legacy project using a static threshold for example for code coverage ("red alert when coverage is below 80%") does not make much sense. But we would like to make sure that the coverage does not go down any further.
Please do not give any advice on lowering the bar by using a less restrictive rule set. This is no option in our case.
There is a Build Breaker plug-in that will fail the build if you breach a Warning or Error threshold set up in the quality profile.
Plug-in details are here:
http://docs.sonarqube.org/display/PLUG/Build+Breaker+Plugin
I am not aware of any functionality that enables you to alert on a metric trend.
We use Sonar as the second last step in our release process. The build breaker ensures that releases do not breach predetermined quality criteria.
We tried exactly the same thing, using the Build Breaker plugin. After a while, it proved to be too inflexible (and configuring Sonar is a mess), so we moved from Sonar to Jenkins/Hudson plugins like Cobertura (for code coverage) or PMD (for code style):
https://wiki.jenkins-ci.org/display/JENKINS/PMD+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/Cobertura+Plugin
With these plugins, very fine-grained settings are possible, for example setting the build to yellow at <70% code coverage or to red at <50%; even the weather symbol for each build is settable.
In the meantime we have scripted our own build breaker that gets executed within our build. We use Groovy to query the REST API of Sonar to retrieve a certain set of metrics (including their historical values). The metric retrieval is handled by a build plugin that is provided to our whole division.
Each team can parameterize their build with a set of rules regarding the metrics that have to be verified for their project. Of course, the rules are also provided as Groovy snippets :-)
Typical rules are:
Number of (major|critical|blocker) violations is less than or equal to that of the previous build
No new duplicates
Coverage not lower than in previous build
Bad findings can then be used for breaking the build or just for reporting.
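A condensed Groovy sketch of such a rule check, assuming the old (pre-6.3) /api/timemachine web service; host, project key, and the metric selection are placeholders:

    import groovy.json.JsonSlurper

    // Hedged sketch: compare the two most recent analyses and fail on regressions.
    def url = 'http://sonar.example.com/api/timemachine' +
              '?resource=com.example:myproject' +
              '&metrics=critical_violations,duplicated_lines,coverage'
    def data = new JsonSlurper().parseText(url.toURL().text)[0]

    // Each cell holds the metric values of one analysis, oldest first.
    def (previous, current) = data.cells[-2..-1].collect { it.v }

    // Rules like the ones listed above; a failed assert breaks the build.
    assert current[0] <= previous[0] : 'more critical violations than in previous build'
    assert current[1] <= previous[1] : 'new duplicated lines'
    assert current[2] >= previous[2] : 'coverage dropped'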