Disable coverage without disabling test info
kentcb
#1 Posted : Wednesday, February 10, 2016 3:01:46 AM(UTC)
Rank: Member

Groups: Registered
Joined: 2/10/2016(UTC)
Posts: 20
Location: Australia

Thanks: 5 times
Was thanked: 2 time(s) in 2 post(s)
Is there a way to disable coverage information (circles) without disabling test info (>)?

I don't understand the value of having the coverage information enabled for a project containing only unit tests. Obviously any given test will be covered by itself (unless you're Doing It Wrong). And yet turning off instrumentation seems to be the only way to remove the dots. But this also removes the one useful piece of information NCrunch is giving me in my unit test projects: the little > alongside each test.
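To make the point concrete, here's a minimal sketch (a made-up xUnit test, not from my actual project): every line of the test body executes whenever the test itself runs, so the coverage circles on those lines can never tell me anything I don't already know.

using Xunit;

public class CalculatorTests
{
    [Fact]
    public void Add_ReturnsSum()
    {
        // Every line below runs whenever the test itself runs, so the
        // coverage circles on them are always filled - they add no new
        // information about what the test covers.
        var result = 2 + 3;
        Assert.Equal(5, result);
    }
}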

Am I missing something obvious?

Thanks
Remco
#2 Posted : Wednesday, February 10, 2016 6:17:22 AM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
Hi, thanks for posting!

At the moment, it is not possible to configure NCrunch to behave in the way you are asking for.

However, implementing such behaviour in the product is by no means impossible, and could be suggested as a feature.

Though I would first urge you to take a closer look at the benefits you receive from having coverage information on your tests themselves. Code coverage conveys much more information than just trying to paint targets on uncovered lines. It tells you a great deal about how your code behaves.

For example, let's say that you have a utility method in your test assembly being used by several different tests. Based on a new requirement, you need to make changes to this method. Being able to see which tests are making use of the method (or individual lines within it) could be useful in guiding your changes.
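As a rough sketch (the types and tests below are made up purely for illustration), consider a helper like this. The coverage markers inside it show exactly which tests reach each line, including the conditional branch, which is useful to know before you start changing it.

using Xunit;

public class OrderTests
{
    // Hypothetical shared helper: the coverage markers on its lines show
    // which tests exercise it (or each branch within it).
    private static Order CreateOrder(decimal amount, bool express = false)
    {
        var order = new Order { Amount = amount };
        if (express)
            order.ShippingMethod = "Express";   // which tests reach this line?
        return order;
    }

    [Fact]
    public void StandardOrder_HasNoShippingMethod()
    {
        Assert.Null(CreateOrder(10m).ShippingMethod);
    }

    [Fact]
    public void ExpressOrder_UsesExpressShipping()
    {
        Assert.Equal("Express", CreateOrder(10m, express: true).ShippingMethod);
    }
}

public class Order
{
    public decimal Amount { get; set; }
    public string ShippingMethod { get; set; }
}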

Being able to see the path of code executed is also very useful for troubleshooting unexpected behaviour in tests. It is quite possible for a test to give a false pass, and being able to see its path of execution can often make this obvious.
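For example (entirely made-up code), the test below passes even though its assertion never runs. The gap in coverage markers inside the loop is what gives it away.

using System.Collections.Generic;
using Xunit;

public class DiscountTests
{
    [Fact]
    public void AllDiscounts_AreWithinRange()
    {
        // Hypothetical false pass: LoadDiscounts() returns an empty list by
        // mistake, so the loop body never runs, no assertion executes, and
        // the test still shows green. The missing coverage markers inside
        // the loop make the problem visible at a glance.
        var discounts = LoadDiscounts();

        foreach (var discount in discounts)
        {
            Assert.InRange(discount, 0m, 0.5m);
        }
    }

    private static List<decimal> LoadDiscounts() => new List<decimal>();
}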

Furthermore, the coverage markers also provide performance information that can help with identifying and resolving bottlenecks in tests.

Finally, if an assertion fails inside test code, the markers are necessary for showing this exception information inline. If a test fails without any markers, there will be no X to tell you where it blew up - you'd need to analyse this exclusively using the Tests Window.
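A rough, made-up illustration of what that looks like:

using Xunit;

public class ParserTests
{
    [Fact]
    public void Parse_ReadsMajorVersion()
    {
        // Hypothetical failing test: the exception is thrown on the line
        // below (FormatException from the malformed input), not on the
        // final assertion. The inline 'X' marker points straight at it;
        // without markers you would be reading the stack trace in the
        // Tests Window instead.
        var major = int.Parse("v2");

        Assert.Equal(2, major);
    }
}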

I understand that the above points may not be advantageous to the way in which you work or the way in which you choose to use NCrunch. My intention in designing the product has always been for it to tell you more about how your code is behaving, as opposed to painting targets on coverage gaps. Considering such a goal, the visual pollution caused by the markers seems like a small price to pay in exchange for the many benefits.
1 user thanked Remco for this useful post.
kentcb on 2/12/2016(UTC)
kentcb
#3 Posted : Friday, February 12, 2016 3:22:49 AM(UTC)
Rank: Member

Groups: Registered
Joined: 2/10/2016(UTC)
Posts: 20
Location: Australia

Thanks: 5 times
Was thanked: 2 time(s) in 2 post(s)
Thanks for the helpful reply!

As you can probably tell, I'm new to the product so take everything I say with a large grain of salt.

I see what you mean about the extra information that comes about through having the coverage information. I guess I just feel like there are two separate streams of information being conflated: coverage and results. I would like to see the > and X symbols because I feel they add value whereas everything else currently just feels like noise to me.

> Being able to see which tests are making use of the method

I would use Find All References for this.

> It is quite possible that a test is giving a false pass and being able to see its path of execution can often make this obvious

This is what I meant by "doing it wrong". I would argue that tests with branching in them or that are difficult to understand are poor tests. I am talking specifically about unit tests here.
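To illustrate what I mean (both tests below are invented for the example), the first test needs coverage markers before you can tell what actually ran; the second has one straight-line path per case, so it doesn't.

using Xunit;

public class PriceCalculatorTests
{
    // The kind of test I'd call "doing it wrong": branching inside the
    // test means you need coverage to know which assertions executed.
    [Fact]
    public void CalculatePrice_Branchy()
    {
        var calculator = new PriceCalculator();

        foreach (var quantity in new[] { 0, 5, 100 })
        {
            if (quantity > 0)
            {
                Assert.True(calculator.CalculatePrice(quantity) > 0);
            }
        }
    }

    // The kind of test I aim for: one straight-line path per case, so the
    // execution path is obvious without any coverage markers.
    [Theory]
    [InlineData(5, 50)]
    [InlineData(100, 1000)]
    public void CalculatePrice_MultipliesUnitPriceByQuantity(int quantity, int expected)
    {
        var calculator = new PriceCalculator();
        Assert.Equal(expected, calculator.CalculatePrice(quantity));
    }
}

public class PriceCalculator
{
    public decimal CalculatePrice(int quantity) => 10m * quantity;
}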

> the coverage markers also provide performance information

I'm not sure I find the perf info useful yet. Is it measuring the DEBUG binaries? Does it warm up the JIT?

> the markers are necessary for showing this exception information inline

Right, this is the most useful part to me, and is what I am arguing should be a separate thing to code coverage.

One last thing I wanted to suggest - and please don't take this the wrong way - is to perhaps have a designer provide some direction. I feel like this excellent product would benefit greatly from a visual overhaul. Stuff like the individual dots, the difficulty of differentiating between solid dots and semi-transparent ones (when tests are waiting to run), etc. And the configuration panes look quite different to "standard" VS UI.

Just my 2c! Thanks again for a great product - looking forward to being far more productive with it.

Cheers,
Kent
Remco
#4 Posted : Friday, February 12, 2016 4:06:52 AM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
kentcb;8336 wrote:

I'm not sure I find the perf info useful yet. Is it measuring the DEBUG binaries? Does it warm up the JIT?


The performance tracking cannot work with compiler optimisations enabled and does not override JIT behaviour. Even considering this, you may be surprised how often it flags up bottlenecks. Most performance problems are not algorithmic in nature (at least, not within our own codebases).

kentcb;8336 wrote:

> the markers are necessary for showing this exception information inline

Right, this is the most useful part to me, and is what I am arguing should be a separate thing to code coverage.


You're welcome to formally request this as a feature if you like.

kentcb;8336 wrote:

One last thing I wanted to suggest - and please don't take this the wrong way - is to perhaps have a designer provide some direction. I feel like this excellent product would benefit greatly from a visual overhaul. Stuff like the individual dots, difficulty differentiating between solid dots and semi-transparent ones (when tests are waiting to run) etc. And the configuration panes look quite different to "standard" VS UI.


There is method behind the madness here. The configuration system is soon to be overhauled to make it more sophisticated. The new system is expected to allow settings to inherit through global/solution/project, as well as be split between local and shared settings. Such a setup couldn't be sensibly applied to a standard tabbed popup dialog.
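As a very rough sketch of the idea (this is not NCrunch's actual code, just an illustration of the inheritance concept), resolution would look something like this, with the most specific layer overriding the ones above it:

using System.Collections.Generic;

// Illustration only: a simple "most specific layer wins" lookup across the
// three levels mentioned above - the sort of structure a flat tabbed
// options dialog can't express.
public class LayeredSettings
{
    private readonly IReadOnlyList<IDictionary<string, string>> _layers;

    public LayeredSettings(
        IDictionary<string, string> globalLayer,
        IDictionary<string, string> solutionLayer,
        IDictionary<string, string> projectLayer)
    {
        // Consult the most specific layer first.
        _layers = new[] { projectLayer, solutionLayer, globalLayer };
    }

    public string Resolve(string settingName, string defaultValue)
    {
        foreach (var layer in _layers)
        {
            if (layer.TryGetValue(settingName, out var value))
                return value;   // first layer that defines the setting wins
        }

        return defaultValue;
    }
}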

It's also worth remembering that NCrunch supports 5 different versions of Visual Studio, each with its own UI standards and integration constraints.