Feature request: Keep historic data for differential metrics (e.g. Coverage on new code)


thepa...@gmail.com

Oct 26, 2015, 4:06:02 AM
to SonarQube
Hi all,

Coverage is a great metric. However, when working with a large code base that carries a lot of technical debt, Total Coverage (%) is unlikely to move any time soon, and it does not reflect the team's current mindset towards testing the way "Coverage on new code (%)" does.
To properly analyze how coverage on new code develops, it is necessary to keep historic data (in our case, we already display such history for complexity and issue count in a Tableau report). This is not configurable for coverage on new code, and manually poking around the SQL database to flip the "delete_historic_data" bit is not a sustainable workaround, as the flag keeps getting overwritten with "true" again.
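
For reference, the manual workaround amounts to something like the following. This is a rough sketch only: the table, column and metric names used here (metrics, delete_historic_data, new_coverage) are assumptions from memory and may differ in your version, and as noted above the server overwrites the flag again anyway.

import psycopg2  # assuming a PostgreSQL-backed SonarQube instance

# Flip the "keep history" flag for the new-code coverage metric directly in the
# schema. Not recommended; shown only to illustrate what "flipping the bit" means.
conn = psycopg2.connect("dbname=sonar user=sonar password=sonar host=localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        "UPDATE metrics SET delete_historic_data = %s WHERE name = %s",
        (False, "new_coverage"),
    )
conn.close()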

IMO, it would be great to be able to configure this setting in the web UI and have it not be reset/overwritten again.

Cheers,
Rudolf Schreier

Freddy Mallet

Oct 28, 2015, 6:22:27 AM
to thepa...@gmail.com, SonarQube
Hello Rudolf,

Making SonarQube keep the history of the "Coverage on new code" metric is indeed not a big deal, but before making this change I would first like to discuss your use case a bit more, because I'm not sure I understand the root motivation. As you said, total coverage is unlikely to move any time soon, so defining a requirement/goal on that metric would be rather depressing. Nevertheless, it remains useful in the long term to see whether the team's daily effort to increase the quality of the code also ends up increasing the total coverage after several months or years.

By contrast, defining a short-term requirement/goal on "Coverage on new code" is highly powerful. It is so powerful that we think all quality gates should have a constraint on this metric: coverage on new code < X% -> NOGO! But if this constraint is part of the Quality Gate, why would it be valuable to also track the evolution of this metric over time?
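
To illustrate, such a condition can be added to a gate through the web services, roughly like this. Treat it as a sketch only: the endpoint and parameter names vary between SonarQube versions, and the gate id, server URL and threshold below are made up.

import requests

# Add a "Coverage on new code < 80% -> NOGO" condition to an existing quality gate.
requests.post(
    "https://sonarqube.example.com/api/qualitygates/create_condition",
    auth=("admin", "admin"),
    data={
        "gateId": 1,               # id of the quality gate to extend (hypothetical)
        "metric": "new_coverage",  # "Coverage on new code"
        "op": "LT",                # fail when the value is less than the error threshold
        "error": 80,               # the X% threshold
    },
).raise_for_status()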

Thanks


Freddy MALLET | SonarSource
Product Director & Co-Founder
http://sonarsource.com


thepa...@gmail.com

Oct 28, 2015, 7:40:17 AM
to SonarQube, thepa...@gmail.com
Hi Freddy,

thanks for your reply. I agree that our business case is a bit of a weird one, so let me describe our codebase:
* A large proportion of our code is old, but stable, and no changes will have to be done to it any time soon. This code is only marginally covered (0-5%).
* The more recent, "working" set of our code is reviewed and tested more thoroughly for code quality, so it generally has higher coverage (heading towards 30% and upwards, fingers crossed).

This means that, as you say, the Quality Gate solution is an OK indicator of whether we meet the target for Coverage on New Code. However, since it takes a while to spread the knowledge of what makes a good code review, a good test, good coverage, etc., it would be naive to assume that we can start fulfilling this Quality Gate limit within days or weeks. During this transitional period we would get a great many "false negatives" in the form of failed Quality Gates (failures we are aware of and expect). We could of course soften this with red/orange gating, but that information is too coarse-grained to be interpreted easily: our only options would be to analyze the percentage of Quality Gate failures, or to analyze every error cause of the Quality Gate, which would simply be overkill for us.

So yes, I am proposing a change that would only be useful until it makes itself obsolete and we can switch to a Gate-based solution, but I hope you can understand the need for a transitional solution.

For the moment, we have switched to an analysis of total Coverage, but as you say, this value has not moved by even 0.1% over three months due to the small ratio of new code to old.

Kind regards,
Rudolf

Anne-Jeanette Peterson

Apr 27, 2016, 4:16:10 PM
to SonarQube, thepa...@gmail.com
What came of this request? We have a similar use case for observing/studying project adoption. We are finding that we're not able to see the historical values of coverage on new code, violations on new code, etc., even through the timeline API. I've been trying to determine whether this is a settings/configuration issue or whether the feature really doesn't exist.
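
For reference, this is roughly what I tried against the timeline web service. It is only a sketch: the endpoint and parameter names are from memory, and the project key and server URL are placeholders. The point is that the new_* metrics come back with no historical values.

import requests

resp = requests.get(
    "https://sonarqube.example.com/api/timemachine",
    params={
        "resource": "my:project",                  # placeholder project key
        "metrics": "new_coverage,new_violations",  # differential metrics
    },
    auth=("admin", "admin"),
)
resp.raise_for_status()
print(resp.json())  # no history is returned for the new_* metrics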

Thank you for your help.

Rohan Shah

Apr 27, 2016, 10:57:48 PM
to SonarQube, thepa...@gmail.com
Hi Rudolf,

We are also trying to get historic data for differential metrics, but it seems SonarSource has still not implemented the feature. Can you confirm how you turned the delete_historic_data bit off? I tried searching for the field in the database but could not find the column anywhere, nor could I find it anywhere in the SonarQube codebase.

Thanks
Rohan Shah

Rudolf Schreier

Apr 28, 2016, 1:27:41 AM
to Rohan Shah, SonarQube
Hi all,

In fact, we gave up on storing this metric. The values for Coverage on New Code turned out to vary so wildly that they provided very little value in the long run. We considered setting up a solution where Coverage on New Code was regularly backed up to a secondary database, but for the same reason it was not worth the effort.
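
For completeness, the kind of job we had in mind was roughly the following: read the current value of Coverage on New Code from the web API after each daily analysis and append it to our own store. This is only a sketch of the idea (the endpoint, metric key, project key and server URL are assumptions and differ between versions); we never put it into production.

import csv
import datetime
import requests

resp = requests.get(
    "https://sonarqube.example.com/api/measures/component",
    params={"componentKey": "my:project", "metricKeys": "new_coverage"},
    auth=("admin", "admin"),
)
resp.raise_for_status()

with open("new_coverage_history.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for measure in resp.json()["component"]["measures"]:
        # differential metrics are reported per leak period rather than as a plain value
        periods = measure.get("periods") or [{}]
        writer.writerow([datetime.date.today().isoformat(),
                         measure["metric"],
                         periods[0].get("value", "")])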

Sorry for the bad news.

Kind regards,
Rudolf

Rohan Shah

Apr 28, 2016, 11:03:37 AM
to SonarQube, roha...@gmail.com, thepa...@gmail.com
Thanks for the quick reply. Which version was this happening on? Do you have more details on how it was 'wildly varying'? 

Thanks
Rohan Shah

Rohan Shah

Apr 28, 2016, 11:06:11 AM
to SonarQube, thepa...@gmail.com
Hi Freddy,

We are interested in the same request. We want historical values for the new-code metrics, just like we have for all other metrics. Is there a reason why SonarQube does not keep track of the new-code metrics as changes are made to a project and analyzed?

Thanks
Rohan 

Anne-Jeanette Peterson

Apr 28, 2016, 11:29:59 AM
to SonarQube, roha...@gmail.com, thepa...@gmail.com
https://jira.sonarsource.com/browse/SONAR-7085

Rohan, this ticket best describes the challenges.

Rudolf Schreier

Apr 28, 2016, 1:38:50 PM
to Anne-Jeanette Peterson, SonarQube, roha...@gmail.com
Rohan,

I should have specified that the wildly varying measurements were not the fault of the analysis, but a logical consequence of our heterogeneous project: the analyzed project spans 1M SLoC and includes a lot of generated code as well as some external code. The project spans all layers, from HTML generation to the construction of SQL queries. As such, when we analyze the changes once per day, we can achieve very high coverage when working mainly on the business layer, or very low coverage when modifying mostly front-end code (for example, a day spent on well-tested business logic might score 80%+ on new code, while the next day, spent on templating code, might score under 5%).

Regards,
Rudolf