I am facing a small challenge while trying to set up a custom quality gate in SonarQube for bugs, i.e., the number of blocker bugs should not be more than 0.
Instead, the quality gate is picking up the category "code smells" and showing results from there.
Is there a way to correct this?
I am using SonarQube version 8.6.
If all you want to enforce is 0 blocker bugs, you should add a condition checking that the Reliability Rating is not worse than D: a rating of E means at least one blocker bug exists. See https://docs.sonarqube.org/latest/user-guide/metric-definitions/#header-6
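To wire that into a gate programmatically, the condition can also be created through the Web API. Below is a minimal sketch, not a definitive recipe: the gate identifier parameter has changed name across SonarQube versions (`gateId` in older releases, `gateName` in newer ones), and whether you want `new_reliability_rating` or `reliability_rating` depends on whether you gate new code or overall code.

```python
# Hedged sketch: build the query for api/qualitygates/create_condition.
import urllib.parse

def build_condition_query(gate_id: str) -> str:
    """Build the query string for a 'Reliability Rating worse than D' condition.

    Ratings map A=1 .. E=5, so 'worse than D' means value > 4, which is
    exactly the case of one or more blocker bugs.
    """
    params = {
        "gateId": gate_id,  # parameter name varies by SonarQube version
        "metric": "new_reliability_rating",  # or reliability_rating for overall code
        "op": "GT",
        "error": "4",  # threshold D; the gate fails when the rating is E
    }
    return urllib.parse.urlencode(params)

# POST this to https://<your-sonarqube>/api/qualitygates/create_condition?<query>
print(build_condition_query("1"))
# → gateId=1&metric=new_reliability_rating&op=GT&error=4
```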
Related
I have integrated SonarQube and Checkmarx SAST and SCA into the Azure DevOps build pipeline. I am able to see both the SonarQube and Checkmarx reports without any issues.
I have the following questions. Could someone please clarify:
What is the difference between SonarQube and Checkmarx CxSAST?
What is the common thing between these two?
In which situations are SonarQube and Checkmarx preferred?
If I were to boil it down to a short phrase: SonarQube is used to ensure code quality, and Checkmarx is used to ensure the security of a system running that code.
SonarQube looks at several areas, including the code coverage percentage of unit tests of the code, duplication percentages, and also code quality issues found through static analysis of the code.
Checkmarx, on the other hand, analyzes the flow of the code and its inputs and outputs. It looks for situations where inputs that could have been provided by an end user are used directly to control behavior, and for other attack vectors.
Can we replace a Static Application Security Testing (SAST) tool like Fortify, Checkmarx, or IBM AppScan with SonarQube?
The SonarQube 8.1 documentation (https://docs.sonarqube.org/latest/) says it covers security rules originating from established standards: CWE, SANS Top 25, and OWASP Top 10.
In this area, no two tools are the same. When you run all of those tools on the same code, you will get some similar findings, some new ones, and some missing ones (possibly false positives), depending on how each tool is implemented. Given that SonarQube is relatively new in this field, I would suggest also using another tool for this specific area. Be aware that achieving a 100% detection rate is extremely difficult, if not impossible.
No, you definitely could not. Raw rule coverage is not the right thing to compare; you must understand how each tool performs its detections, its rate of false positives/negatives, etc.
Fortify and Checkmarx analyze the data flow inside your code: they can verify the checks you perform before an input is used. SonarQube is more rules-based than flow-based.
We would like to review our Sonar rules, and the plan is to check, weekly, the TOP 10 most frequently occurring issues after the analysis of our projects.
During the rule review we plan to decide whether to remove, change, or keep each rule.
Is there any way to create a list of the TOP 10 most frequently occurring issues with Sonar?
We use SonarQube Version 5.6.3.
SonarQube's native UI/API can report on the most-violated rules:
UI: the Rule facet on the Issues page
Web API: the facets parameter of api/issues/search gives you the same information
As a side note: deciding whether to remove/change/keep rules based only on the number of occurrences seems quite shortsighted. A rule that finds many coding-convention issues (code smells) can't be compared to a rule that detects a severe vulnerability. The latter might fire only once a year, but you'll be thankful the vulnerability didn't make it into production.
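To illustrate the Web API route, here is a sketch of extracting the top-10 rules from an api/issues/search response requested with facets=rules (e.g. `GET /api/issues/search?componentKeys=<project>&facets=rules&ps=1`). The request URL and the sample payload below are illustrative assumptions, not output from a real server:

```python
# Sketch: pull the most-violated rules out of the 'rules' facet of an
# api/issues/search JSON response.
import json

def top_rules(response_json: str, limit: int = 10):
    """Return (rule_key, issue_count) pairs from the 'rules' facet."""
    data = json.loads(response_json)
    facet = next(f for f in data["facets"] if f["property"] == "rules")
    counts = [(v["val"], v["count"]) for v in facet["values"]]
    # The server already sorts facet values by count, but sort defensively.
    return sorted(counts, key=lambda kv: kv[1], reverse=True)[:limit]

# Fabricated sample payload for illustration only:
sample = json.dumps({
    "facets": [
        {"property": "rules",
         "values": [{"val": "squid:S1481", "count": 120},
                    {"val": "squid:S1172", "count": 85}]}
    ]
})
print(top_rules(sample))  # → [('squid:S1481', 120), ('squid:S1172', 85)]
```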
Trying SonarQube 6.0.
Hi SonarQube experts,
I've used sonar.exclusions in the past, but it's an all-or-nothing kind of deal.
Given the following sections in the sonarqube dashboard:
Reliability
Security
Maintainability
Duplications
Size
Complexity
Documentation
Issues
Is there a way to cross-filter selected directories (such as those containing third-party JavaScript libraries) so they are excluded from everything except, say, Security?
The use case is, when configuring Quality Gates, to not worry about the maintainability of third-party JavaScript libraries, but to very much worry about their vulnerabilities.
Behind this request, I have the feeling that you're looking for a tool able to detect usages of APIs with known MITRE CVE vulnerabilities. If that is the case, then SonarQube won't be of any help in covering this need.
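That said, for the narrower goal of muting maintainability noise in third-party directories while keeping security rules active, SonarQube's "Ignore Issues on Multiple Criteria" exclusions can approximate it rule by rule (there is no single per-category switch). A hedged sketch in sonar-project.properties; the paths and rule keys below are examples only, not your actual configuration:

```properties
# Suppress selected maintainability rules only inside the vendored directory.
# Each entry pairs a rule key with a path pattern; security rules stay active.
sonar.issue.ignore.multicriteria=e1,e2
sonar.issue.ignore.multicriteria.e1.ruleKey=javascript:S1541
sonar.issue.ignore.multicriteria.e1.resourceKey=src/thirdparty/**/*.js
sonar.issue.ignore.multicriteria.e2.ruleKey=javascript:S3776
sonar.issue.ignore.multicriteria.e2.resourceKey=src/thirdparty/**/*.js
```

The same settings can be entered in the UI under Administration > General Settings > Analysis Scope.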
TL;DR: Basically, what I am looking for is a way to get a list of all Sonar rules that have 0 issues raised. I could then move all of those up to blockers and protect myself from someone introducing such an issue in the future.
My company is using sonar and static analysis to help guide refactoring and development of a sizable legacy codebase (~750K LOC). We have had a lot of success by lowering the severity of most rules and then choosing a smaller set of rules to promote up to blocker or critical as we find real issues in the code. This has kept the number of issues we are trying to address at a time manageable so we can actually feel like we are making progress and not drown in the noise of legacy issues.
In particular, when we have been bitten by a field or QA issue that Sonar could have detected, we turn that rule up to BLOCKER and fix every instance of it. These blockers break the build, so we are now assured that we won't add a new instance of the same issue again. This has worked great and has kept a number of what would have been nasty bugs from slipping through.
The big problem with that methodology is that we need to have encountered an example of each of those classes of mistake at least once in the codebase before we learn that it is important and should be made a blocker. Any issues we haven't already encountered will still be at their default level; I'd like to move all of them up to BLOCKER now so we notice the day they are added.
Edit: Currently we are using 3.7.3 but we are about to upgrade to 5.X.
There are 2 ways to do this:
1- The difficult way is to query the SonarQube database directly. You have to understand the tables and write a SQL query based on which DB backs your SonarQube instance. You can find some references here - OR here
2- I have never tried your exact method, but it should work. You can use the SonarQube Web Service API. There is also a Web Service Java client. References:
link1, link2, link3
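A sketch of option 2: compare the rules active in your quality profile (e.g. from api/rules/search?activation=true on 5.x) against the issue counts in the 'rules' facet of api/issues/search, and report rules that never fired. Endpoints differ between 3.7 and 5.x, the sample data is fabricated, and note that the facet returns only the top values, so a large profile may need a per-rule query instead:

```python
# Hedged sketch: find active rules with zero raised issues.
import json

def rules_with_zero_issues(active_rule_keys, issues_facet_json):
    """Return active rules that raised no issues.

    active_rule_keys:   rule keys pulled from the rules Web Service
    issues_facet_json:  body of api/issues/search?facets=rules
    """
    data = json.loads(issues_facet_json)
    facet = next(f for f in data["facets"] if f["property"] == "rules")
    fired = {v["val"] for v in facet["values"] if v["count"] > 0}
    return sorted(set(active_rule_keys) - fired)

# Fabricated example data:
active = ["squid:S106", "squid:S1481", "squid:S2095"]
facet_body = json.dumps({"facets": [{"property": "rules",
                                     "values": [{"val": "squid:S1481",
                                                 "count": 7}]}]})
print(rules_with_zero_issues(active, facet_body))
# → ['squid:S106', 'squid:S2095']
```

The rules returned by this difference are the candidates to promote to BLOCKER in the quality profile.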