NCrunch gets sluggish after a while, with devenv.exe showing high CPU usage.
GreenMoose
#1 Posted : Wednesday, May 22, 2013 7:13:12 AM(UTC)
v1.45.0.12

I have an issue where devenv.exe (Visual Studio Premium 2012, Update 2) starts hogging the CPU and NCrunch becomes really unresponsive.
By unresponsive I mean that the only dev-related process using CPU is devenv.exe; the NCrunch processes are basically idle. Bringing up the NCrunch configuration can take several minutes, and waiting for a test or a build to start is also incredibly slow when this happens.

I guess all this is because NCrunch, for whatever reason, is waiting for devenv.exe to respond in some way while devenv.exe is busy doing CPU work in the background?

If I close the solution and exit Visual Studio, devenv.exe keeps consuming CPU, and after a while I can even see NCrunch start new build processes, even though I have exited Visual Studio.

When I kill that process to stop it hogging the CPU and restart Visual Studio, NCrunch is responsive again, but after a while of working with it (a matter of hours at most) the same issue appears and I have to restart Visual Studio again.

I have disabled all my extensions, including suspending ReSharper, but the issue remains.

Any pointers on what's going on or how I can troubleshoot it?

The computer has 16 GB RAM and 8 cores, of which 4 are dedicated to NCrunch; NCrunch and the OS run on an SSD; Windows 7 Pro; the solution has ~5k tests; NCrunch is set to run impacted tests automatically.

Thanks.
Remco
#2 Posted : Wednesday, May 22, 2013 9:35:19 AM(UTC)
Hi,

Thanks for sharing this issue. When NCrunch becomes unresponsive (and the CPU usage starts to stay high), do you notice any general slowdown in other activities in the IDE? For example, are you able to scroll around your source code, make changes, open files, etc.?

If the IDE is staying responsive but NCrunch seems to be 'sleepy', then most likely there's something in the core engine that's overloading the main thread used by NCrunch for its general activities.

Try turning on the 'Log to output window' global configuration setting after you start up Visual Studio. A verbosity of 'Summary' should be fine. This means that NCrunch will start to report performance issues in a VS output window for you to observe. You might want to do this well in advance of the performance issue occurring, as it can be hard to switch on when the engine is not responding properly. Note that the setting will automatically revert back to 'False' after you restart the IDE (this is to prevent accidental logging, which inhibits long term performance), so you'll need to specifically turn it on again if you restart.

The information shown in this output window should hopefully narrow down what may be causing the issue. Judging from the pattern you're observing (i.e. it taking a few hours to kick in), it does seem as though NCrunch is accumulating resources somewhere without releasing them - in which case I'd be interested to know if the memory utilisation is higher than it should be when this happens.

Something else that would be useful to know is whether or not resetting (i.e. disable then enable) the NCrunch engine causes the problem to disappear.


Cheers,

Remco
GreenMoose
#3 Posted : Wednesday, May 22, 2013 12:56:26 PM(UTC)
I attached another Visual Studio instance to the devenv.exe that was eating CPU. Looking at the thread consuming the CPU (according to Process Explorer, at least), I found the stack trace below.
Is it normal for this to take a considerable amount of CPU time?

Code:
 	[Managed to Native Transition]	
>	mscorlib.dll!System.Globalization.CompareInfo.Compare(string string1, string string2, System.Globalization.CompareOptions options) + 0x6b bytes	
 	mscorlib.dll!string.CompareTo(string strB) + 0x34 bytes	
 	mscorlib.dll!System.Collections.Generic.GenericComparer<string>.Compare(string x, string y) + 0x43 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest,string>.CompareKeys(int index1, int index2) + 0x5f bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<System.__Canon,bool>.CompareKeys(int index1, int index2) + 0x54 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest>.QuickSort(int[] map, int left, int right) + 0x58 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest>.QuickSort(int[] map, int left, int right) + 0xf8 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest>.QuickSort(int[] map, int left, int right) + 0xf8 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<System.__Canon>.Sort(System.__Canon[] elements, int count) + 0x4a bytes	
 	System.Core.dll!System.Linq.OrderedEnumerable<nCrunch.Client.Model.UITest>.GetEnumerator.MoveNext() + 0xb3 bytes	
 	mscorlib.dll!System.Collections.Generic.List<nCrunch.Client.Model.UITest>.List(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UITest> collection) + 0x17e bytes	
 	System.Core.dll!System.Linq.Enumerable.ToList<nCrunch.Client.Model.UITest>(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UITest> source) + 0x3b bytes	
 	nCrunch.Client.dll!nCrunch.Client.Model.CoverageWatcher.createMarker(nCrunch.TestExecution.TestCoverageResult coverage, int lineIndex, bool includeExecutionTimes) + 0x320 bytes	
 	nCrunch.Client.dll!nCrunch.Client.Model.CoverageWatcher.issueUpdatedCoverageMarkers(nCrunch.Client.Model.MergedCodeCoverage uiCodeCoverage, bool includeExecutionTimes) + 0x79 bytes	
 	nCrunch.Client.dll!nCrunch.Client.Model.CoverageWatcher.issueUpdatesForCodeMembers(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UICodeMember> codeMembers, nCrunch.Client.Model.ICoverageUpdateFilter filter) + 0x114 bytes	
 	nCrunch.Client.dll!nCrunch.Client.Model.CoverageWatcher.issueUpdatesForComponents(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UIComponent> components, nCrunch.Client.Model.ICoverageUpdateFilter filter) + 0x66 bytes	
 	nCrunch.Client.dll!nCrunch.Client.Model.CoverageWatcher.testDataUpdated(nCrunch.Client.Model.ModelEvents.TestDataUpdatedEvent e) + 0x63 bytes	
 	[Native to Managed Transition]	
 	mscorlib.dll!System.Delegate.DynamicInvokeImpl(object[] args) + 0x76 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.CoreMessageDispatcher.#=qsOtkTQh12Donbg1BR8Yb0A==.#=qMMVCxU8hwssb5aZeRr36YA==() + 0x54 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.CoreMessageDispatcher.#=qca$H2e64Fz$iY6m8JPn903ByQhMpo8aJtyIFHlxVsfY=() + 0xd6 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.PooledWorkItem.Start() + 0x55 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.ThreadFactory.#=q5kXplrMDnWFnkDklQ5neEA==(object #=qBx6Kgz6ttHTCf2agqHe0Jg==) + 0x7e bytes	
 	mscorlib.dll!System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(object state) + 0x3e bytes	
 	mscorlib.dll!System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state, bool preserveSyncCtx) + 0xa7 bytes	
 	mscorlib.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state, bool preserveSyncCtx) + 0x16 bytes	
 	mscorlib.dll!System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem() + 0x60 bytes	
 	mscorlib.dll!System.Threading.ThreadPoolWorkQueue.Dispatch() + 0x149 bytes	
 	mscorlib.dll!System.Threading._ThreadPoolWaitCallback.PerformWaitCallback() + 0x5 bytes	
 	[Native to Managed Transition]	
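
For context on what makes those top frames expensive: every comparison in that sort goes through the culture-sensitive comparer (CompareInfo.Compare), and the trace shows the sort running inside createMarker, i.e. per covered line. Below is a minimal standalone sketch of that pattern - illustrative only, not NCrunch's actual code - showing how repeated culture-sensitive OrderBy calls over many test names add up, and how an ordinal comparer would avoid the globalization layer:

Code:
using System;
using System.Diagnostics;
using System.Linq;

// Illustrative stand-in for nCrunch.Client.Model.UITest; the real type's shape is unknown.
sealed class UITest
{
    public string Name { get; }
    public UITest(string name) => Name = name;
}

static class SortCostSketch
{
    static void Main()
    {
        // Simulate many tests covering the same code, as in a dense solution.
        var tests = Enumerable.Range(0, 5000)
            .Select(i => new UITest($"Namespace.Fixture.TestMethod_{i:D5}"))
            .ToList();

        var sw = Stopwatch.StartNew();
        for (int line = 0; line < 200; line++)
        {
            // Default string ordering is culture-sensitive: each comparison
            // calls CompareInfo.Compare, the hot frame in the trace above.
            var ordered = tests.OrderBy(t => t.Name).ToList();
        }
        Console.WriteLine($"Culture-sensitive sort x200: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int line = 0; line < 200; line++)
        {
            // Ordinal comparison skips the globalization layer entirely.
            var ordered = tests.OrderBy(t => t.Name, StringComparer.Ordinal).ToList();
        }
        Console.WriteLine($"Ordinal sort x200:           {sw.ElapsedMilliseconds} ms");
    }
}
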
GreenMoose
#4 Posted : Wednesday, May 22, 2013 1:09:21 PM(UTC)
Remco;4161 wrote:

...
Try turning on the 'Log to output window' global configuration setting after you start up Visual Studio. A verbosity of 'Summary' should be fine. This means that NCrunch will start to report performance issues in a VS output window for you to observe. You might want to do this well in advance of the performance issue occurring, as it can be hard to switch on when the engine is not responding properly. Note that the setting will automatically revert back to 'False' after you restart the IDE (this is to prevent accidental logging, which inhibits long term performance), so you'll need to specifically turn it on again if you restart.
..


Code:
[14:56:54.0242-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.3076903 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:00.8544-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.8302063 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:08.2521-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.8427071 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:14.8873-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.4551953 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:21.31-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.4226940 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:28.1302-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.6127000 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:35.113-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.5976989 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:41.9182-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.6152007 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:48.5059-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.5877032 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:57:55.7861-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.7827046 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:58:02.5713-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.5151965 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:58:09.0615-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.4901967 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:58:16.1267-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.5801992 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[14:58:22.6544-Core-119] Event [TestDataUpdatedEvent] took 00:00:06.3076907 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated

...and a bunch more lines...

Remco;4161 wrote:

...in which case I'd be interested to know if the memory utilisation is higher than it should be when this happens....

Not from what I can tell; it's within reasonable limits, staying fairly stable at ~1.5 GB.

Remco;4161 wrote:

Something else that would be useful to know is whether or not resetting (i.e. disable then enable) the NCrunch engine causes the problem to disappear.

The same thread keeps eating CPU when I disable NCrunch (and the NCrunch processes are kept running; some test host processes continue to start new test runs - I see their CPU usage go up now and then - while devenv.exe stays high on CPU usage).
When I enable it again, the NCrunch processes are killed, but within seconds devenv.exe starts to eat CPU again, with the same call stack as above on the CPU-consuming thread.
Remco
#5 Posted : Wednesday, May 22, 2013 11:14:31 PM(UTC)
Thanks - this narrows the issue down significantly!

It looks as though the step in question is part of the mapping routine where NCrunch synchronises the data from the engine and prepares it for display in the editor window. This has already been identified as a performance issue in V1, but because of the complexity involved in fixing it, unfortunately I won't be able to resolve it as part of a minor release. The following things can contribute to surfacing this problem:

1. Lots of open windows in the editor
2. Open windows in the editor with LOTS of source code
3. A large number of tests in the solution, many of which may have a fast response time
4. Heavy parallel execution

Most likely the reason you're seeing this after a few hours is that the longer you use Visual Studio, the more editor windows you open. The issue is tied very strongly to the number of lines of source code you have open at any one time, and to the frequency with which the engine needs to map results coming back from builds and test execution. Once the mapping time becomes too long, you reach a tipping point where the engine spends all its time trying to catch up with the volume of data being returned from your tests (at which point it does almost nothing but map, and it will almost totally stop responding).
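
As a back-of-envelope illustration of that tipping point (plugging in rough figures drawn from the logs posted above - events taking ~6.5s to map and arriving every ~6.6-7s - so these numbers are assumptions from this thread, not measurements of the engine itself):

Code:
using System;

static class BacklogSketch
{
    // Queue growth when events arrive every arrivalSecs and each takes
    // processSecs to map on the single core thread.
    static void Report(double arrivalSecs, double processSecs, double horizonSecs)
    {
        double arrived   = horizonSecs / arrivalSecs;
        double processed = Math.Min(arrived, horizonSecs / processSecs);
        Console.WriteLine(
            $"arrive every {arrivalSecs}s, map in {processSecs}s -> " +
            $"backlog after {horizonSecs}s: {arrived - processed:F0} events");
    }

    static void Main()
    {
        // Healthy: mapping is quick relative to result arrival; no backlog.
        Report(arrivalSecs: 6.6, processSecs: 1.0, horizonSecs: 3600);

        // Roughly the logged state: ~6.5s to map, arriving ~6.6s apart. The
        // thread is ~98% busy, so any burst of results and it never catches up.
        Report(arrivalSecs: 6.6, processSecs: 6.5, horizonSecs: 3600);

        // Past the tipping point: results arrive faster than they can be
        // mapped, the backlog grows without bound, the engine stops responding.
        Report(arrivalSecs: 5.0, processSecs: 6.5, horizonSecs: 3600);
    }
}
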

If you notice the engine is slowing down too much, try closing some of the editor windows or reducing the number of tasks you run in parallel to see if this makes a difference.
GreenMoose
#6 Posted : Friday, May 24, 2013 1:07:08 PM(UTC)
So: I have now closed ALL open documents, yet I still see the output below from NCrunch, going on for about a minute or longer:
Code:
[10:00:54.5203-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.9826027 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[10:00:59.4479-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.9276015 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[10:01:04.3555-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.7601126 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[10:01:09.1231-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.7676109 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[10:01:14.4232-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.9726037 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[10:01:28.1536-Core-59] Event [TestDataUpdatedEvent] took 00:00:04.3901348 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated

etc.
Should that event occur even with no open documents? Only devenv.exe is using CPU - none of NCrunch's BuildHost or TestHost processes are.

However, the tip about closing documents seems to have helped a lot: NCrunch now becomes responsive again after a while at least, even though devenv.exe is still heavy on CPU (but at least it no longer goes on "forever" to the point where I have to restart Visual Studio).
Remco
#7 Posted : Saturday, May 25, 2013 1:16:47 AM(UTC)
Something that may happen is that closing the windows takes a while to kick in, because the engine is already overloaded by that point and takes time to register that the windows have been closed. The lagged processing will continue until the backlog has cleared.

Does leaving the machine alone for a few minutes after closing the documents make any difference? In theory, NCrunch shouldn't be taking time processing this event without any windows open. Do you have any areas of your codebase with extremely high coverage density (i.e. >200 tests covering an individual line)?
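
(For a sense of why coverage density matters here: the stack trace earlier in the thread shows a sort over the covering tests happening per line inside createMarker. With assumed, illustrative numbers - not measurements - the comparison count scales roughly like this:)

Code:
using System;

static class CoverageDensitySketch
{
    static void Main()
    {
        // Assumed illustrative figures, not measured from NCrunch.
        int coveredLines = 2000; // covered lines visible across open editors
        int testsPerLine = 200;  // coverage density: tests covering each line

        // Sorting n items costs roughly n*log2(n) comparisons, repeated per line.
        double comparisons = coveredLines * testsPerLine * Math.Log(testsPerLine, 2);
        Console.WriteLine($"~{comparisons:N0} string comparisons per update");
        // ~3 million culture-sensitive comparisons per TestDataUpdatedEvent,
        // repeated for every completed test task, can easily saturate one thread.
    }
}
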

Something that would be very interesting is to grab a capture of the event processing times before the windows are closed, then after the windows are closed. Does this show any improvement?
GreenMoose
#8 Posted : Wednesday, June 26, 2013 2:06:35 PM(UTC)
FWIW, I have noticed the following:

* I tested it on a fairly small solution on another computer and could reproduce the TestDataUpdatedEvent delays quite easily (no other devs are currently using NCrunch in my office).
* It does not seem to matter if I have only a very small number of documents open in Visual Studio; after a while devenv.exe chokes, the above events show up in the log, and Visual Studio becomes practically unusable.

However...

I disabled parallel test runners via the NCrunch setting "Allow parallel test execution". With that off, I don't experience this problem at all. I occasionally notice the above events, but they take only a second or so and complete fairly quickly. They practically flood the log output after a while if I have parallel test running enabled.
Remco
#9 Posted : Wednesday, June 26, 2013 11:53:03 PM(UTC)
The sluggish events are themselves triggered by the processing of data returned from test runs, so you can basically expect to see a single message per completed test task. How many sets of tests do you normally run in parallel? Is there a particular point (in max execution threads) at which you really start to notice the engine falling over?
GreenMoose
#10 Posted : Thursday, June 27, 2013 5:34:40 AM(UTC)
> How many sets of tests do you normally run in parallel?
What defines "sets of tests"?

> Is there a particular point (in max execution threads) at which you really start to notice the engine falling over?
Yes: 2. As shown below, though, it isn't obvious exactly when the issue starts to appear. To be able to get some work done without waiting for NCrunch to become responsive, I've disabled parallel testing during the work day for now.

A) Engine idle, force execution of 2 tests with "run in existing process" with settings: Fast Lane Threads(1), Max number of processing threads(2), Max number of test runner processes to pool(2), Allow parallel test execution(True), Allow tests to run in parallel with themselves

Code:
[07:16:34.5653-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3815381 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:16:39.9468-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3805380 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:16:45.4144-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.4645464 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:16:50.8129-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3265326 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:16:56.0394-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.2265226 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:17:01.575-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3935393 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:17:06.8895-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3135313 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated



B) Engine idle, force execution of the same 2 tests with "run in existing process" and the same settings, except Allow parallel test execution (False). (Unfortunately I did not do this before step A, so I can't verify whether the issue was present before switching to parallel.)

Code:
[07:20:22.3491-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.4715471 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:20:27.6636-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3135313 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:20:32.9691-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3045304 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:20:38.4547-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.3985398 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated
[07:20:43.6892-Core-94] Event [TestDataUpdatedEvent] took 00:00:05.2325232 to process on CORE thread when calling nCrunch.Client.Model.CoverageWatcher.testDataUpdated


Restarted Visual Studio (same number of documents open) and redid B):
<no output at all> (yes, "log to output window" was set to true; I got a lot of NullReferenceExceptions in the output window when closing the solution, which I submitted)

Restarted Visual Studio (same number of documents open) and redid A):
<no console output at all>

I will try to redo A) some time later today and verify whether B) still outputs the long event durations.
Remco
#11 Posted : Thursday, June 27, 2013 7:13:47 AM(UTC)
Thanks for sharing these details. There is a fairly significant fix expected in the V2 release to solve the CoverageWatcher performance issues. I'd been hoping to avoid introducing this fix into the V1 branch because of the high risk of it destabilising the product, but it looks as though the issue is more serious than originally believed, and there are only a very limited number of ways to work around it.

There is a maintenance release of V1 expected late next month and I'll make an effort to include the fix in this release.
GreenMoose
#12 Posted : Thursday, June 27, 2013 8:12:32 AM(UTC)
FWIW, after I tried re-enabling parallel tests, NCrunch won't play nicely at all any longer, even with parallel execution disabled.
For instance, building one tiny project, which takes 1-2 seconds for a "Rebuild" in Visual Studio, shows a processing time of 40s in NCrunch.

devenv.exe is now choking the CPU, this time without even having started to run tests, because the build tasks are taking so long. The "Summary" NCrunch output is fairly quiet. The expected time of my test suite is now 1h40m; normally it displays something like 3-5 minutes.

Very weird.

*Edit: Some new long-duration events; this time it seems to have something to do with the status indicator?
Code:
[10:14:52.1231-Core-23] Event [TasksUnqueuedEvent] took 00:00:09.8784944 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qL2IMj6qixse9AsCiqZQLJg==
[10:15:02.3018-Core-23] Event [TasksUnqueuedEvent] took 00:00:10.1387178 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qL2IMj6qixse9AsCiqZQLJg==
[10:15:13.1912-Core-23] Event [TasksUnqueuedEvent] took 00:00:10.1337108 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qL2IMj6qixse9AsCiqZQLJg==
[10:15:23.2148-Core-23] Event [TasksUnqueuedEvent] took 00:00:09.9935886 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qL2IMj6qixse9AsCiqZQLJg==
[10:15:33.1909-Core-23] Event [TaskQueuedEvent:nCrunch.Core.Processing.TestExecutionTask:NUnitTests] took 00:00:09.9460480 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qgTlXKVU3ZZ32HR2UuPHTTw==
[10:15:43.5523-Core-23] Event [TaskQueuedEvent:nCrunch.Core.Processing.TestExecutionTask:NUnitTests] took 00:00:10.3413900 to process on CORE thread when calling nCrunch.VSAddIn.UI.StatusIndicator.StatusIndicatorSynchroniser.#=qgTlXKVU3ZZ32HR2UuPHTTw==
[10:16:02.4685-Core-23] Event [TaskQueuedEvent:nCrunch.Core.Processing.TestExecutionTask:NUnitTests] took 00:00:01.0634138 to process on CORE thread when calling nCrunch.VSAddIn.UI.ProcessingQueue.ProcessingQueueSynchroniser.#=q44G8QQ1$U2$KaiMYZeVDCA==
Remco
#13 Posted : Thursday, June 27, 2013 8:29:16 AM(UTC)
Yes - the status indicator has huge performance problems right now with big test pipelines. I recommend working with it closed on larger solutions until it can be properly fixed. There are a number of design problems with the feature. I've wanted to remove it for quite some time, though after suggesting I would do so, many users expressed that they really liked it and wanted to keep it. The long-term solution will likely be to redesign it from scratch with a different approach but a similar goal.
GreenMoose
#14 Posted : Thursday, June 27, 2013 9:05:28 AM(UTC)
Ok, I have now closed the status indicator window and see no more such events, thanks.

Regarding build times: I switched from "run impacted tests automatically" to "run pinned tests automatically" and restarted Visual Studio (it took several minutes before devenv.exe shut down, though without me having to kill it), and build times went back to normal. Switching back to "run impacted tests automatically" and restarting Visual Studio (this time I killed the process after it had been eating CPU for ~10 min and never wanted to exit cleanly), build times were extremely slow again.

"Run all tests automatically" also suffers from the slow build problem.

As soon as I switch to "run pinned tests automatically" and restart (or kill) Visual Studio, the slow-build issue goes away.

Running all tests in the solution under "run pinned tests automatically" also seems a lot faster, and Visual Studio is responsive again. I tried enabling parallel runs with this engine mode and so far that works fine as well.
Remco
#15 Posted : Friday, June 28, 2013 1:10:22 AM(UTC)
The easiest issue to analyse here is devenv.exe hanging on shutdown. If this happens to you again, try spinning up a second instance of Visual Studio and attaching its debugger to the first instance. Break into the session and examine the list of running threads. Can you then copy/paste the list of threads here into the forum?

The engine modes themselves don't really have much of a direct impact on the behaviour of the NCrunch engine - they simply filter which tests are allowed into the processing queue. The engine modes don't exert any control over NCrunch's build behaviour. If you find that the engine mode is having an impact on the performance of Visual Studio or NCrunch in general, this brings to mind two possibilities:
1. The problem is related to system-level load (I/O perhaps?) and by choosing an engine mode with a less aggressive filter (i.e. 'Run all tests automatically'), the system is becoming overloaded with the tests being executed.
2. There are tests in your solution that are very expensive (resource-wise) to process and by changing engine mode you are changing the conditions under which NCrunch would run these tests, thus introducing variable performance.

Compared with typical unit tests, builds tend to be MUCH more expensive in terms of I/O resources. This means that if your disk is struggling to keep up with I/O demands, you'll most likely notice this in elevated build times before you see it anywhere else. Build times will also be significantly elevated for the first few minutes that NCrunch is used on a solution with many files in it, as NCrunch needs to make several shadow copies of these files to satisfy its workspacing requirements for normal operation. The number of workspaces NCrunch needs to build will depend upon parallel execution requirements, pooled test processes and the 'max number of processing threads' configuration setting. Resetting the engine (or disabling/enabling it) will clear all the workspaces, requiring NCrunch to rebuild them from scratch.
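
To put rough numbers on that workspacing cost (these settings and sizes are assumptions for illustration; the exact workspace count NCrunch derives from its settings isn't spelled out here):

Code:
using System;

static class WorkspaceCostSketch
{
    static void Main()
    {
        // Assumed configuration values, named after the NCrunch settings above.
        int maxProcessingThreads = 4;   // 'Max number of processing threads'
        int pooledTestProcesses  = 2;   // 'Max number of test runner processes to pool'
        int solutionSizeMb       = 500; // assumed on-disk size of the solution's files

        // Rough upper bound: one workspace per concurrent build/test task.
        int workspaces   = maxProcessingThreads + pooledTestProcesses;
        int shadowCopyMb = workspaces * solutionSizeMb;

        Console.WriteLine($"~{workspaces} workspaces -> ~{shadowCopyMb} MB of shadow copies");
        // Writing a few GB of shadow copies up front is why a struggling disk
        // (or an undersized RAM disk) tends to show up first as inflated build
        // times, and why resetting the engine brings the cost back.
    }
}
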
GreenMoose
#16 Posted : Wednesday, September 4, 2013 12:16:33 PM(UTC)
Remco;4303 wrote:
The easiest issue to analyse here is devenv.exe hanging on shutdown. If this happens to you again, try spinning up a second instance of Visual Studio and attaching its debugger to the first instance. Break into the session and examine the list of running threads. Can you then copy/paste the list of threads here into the forum?


Freshly installed OS. On the previous OS, this issue felt resolved with the latest version; with the new OS I am experiencing extreme sluggishness from NCrunch. Trivial projects are taking 45s to build (they should take a few seconds), devenv.exe is spiking the CPU all the time, and the environment is practically unusable.
I'm using SoftPerfect's RAM disk this time.
On closing Visual Studio, devenv.exe (along with NCrunch's build host processes) is still alive, and devenv.exe keeps spiking the CPU.

This time it has not even started running the tests.

Code:

Not Flagged		9620	1	Main Thread	Main Thread	nCrunch.Core.Threading.CoreMessageDispatcher.BlockUntilQueueIsEmpty()	Normal
Not Flagged		9408	2	Worker Thread	<No Name>		Highest
Not Flagged		6388	0	Worker Thread	<No Name>		Normal
Not Flagged		2716	4	Worker Thread	Pool Thread #0	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		4280	5	Worker Thread	Pool Thread #1	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		3472	0	Worker Thread	<No Name>		Normal
Not Flagged		9040	0	Worker Thread	<No Name>		Normal
Not Flagged		9476	10	Worker Thread	LAAFix	System.Threading.Thread.Suspend()	Normal
Not Flagged		8260	17	Worker Thread	In-proc Node (Default)	System.Threading.WaitHandle.WaitAny()	Normal
Not Flagged		3320	0	Worker Thread	<No Name>		Normal
Not Flagged		9236	74	Worker Thread	<No Name>		Normal
Not Flagged		2196	27	Worker Thread	ApplicationMutices	[Managed to Native Transition]	Normal
Not Flagged		10008	68	Worker Thread	Pool Thread #2	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		2416	69	Worker Thread	Pool Thread #3	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		8740	51	Worker Thread	<No Name>	[Managed to Native Transition]	Normal
Not Flagged		7412	20	Worker Thread	<No Name>	[Managed to Native Transition]	Normal
Not Flagged		1204	75	Worker Thread	ImageAnimator	System.Drawing.ImageAnimator.AnimateImages50ms()	Normal
Not Flagged		8664	61	Worker Thread	Worker Thread	nCrunch.Common.TestId.GetLastPeriodIndexOutsideParameters()	Lowest
Not Flagged		10544	16	Worker Thread	Worker Thread	[Managed to Native Transition]	Lowest
Not Flagged		3016	26	Worker Thread	<No Name>		Normal
Not Flagged		8364	70	Worker Thread	<No Name>		Normal
Not Flagged		8308	54	Worker Thread	Worker Thread	[Managed to Native Transition]	Lowest
Not Flagged		9740	71	Worker Thread	Worker Thread	[Managed to Native Transition]	Lowest
Not Flagged		2392	15	Worker Thread	<No Name>		Normal
Not Flagged	>	9636	62	Worker Thread	Worker Thread	[Managed to Native Transition]	Lowest
Not Flagged		8264	57	Worker Thread	<No Name>		Normal
Not Flagged		976	47	Worker Thread	<No Name>		Normal
Not Flagged		10092	83	Worker Thread	<No Name>		Normal
Not Flagged		11216	0	Worker Thread	<No Name>		Normal
Not Flagged		9752	0	Worker Thread	[Thread Destroyed]		Normal



*Edit: This seems to happen every time I choose "rebuild and resynchronize" in NCrunch. Below is the call stack from a devenv.exe spiking the CPU; NCrunch reports extremely long "processing time" for building trivial assemblies (e.g. 1 minute).
Code:

Not Flagged		5292	61	Worker Thread	<No Name>		Normal
Not Flagged		10364	1	Main Thread	Main Thread	DevExpress.XtraTreeList.Internal.TreeListFilterHelper.GetVisibleNodeCount()	Normal
Not Flagged		10680	2	Worker Thread	<No Name>		Highest
Not Flagged		10272	0	Worker Thread	<No Name>		Normal
Not Flagged		7844	4	Worker Thread	Pool Thread #0	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		10632	5	Worker Thread	Pool Thread #1	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		9932	6	Worker Thread	<No Name>		Normal
Not Flagged		9292	0	Worker Thread	<No Name>		Normal
Not Flagged		10260	0	Worker Thread	<No Name>		Normal
Not Flagged		8460	10	Worker Thread	LAAFix	System.Threading.Thread.Suspend()	Normal
Not Flagged		10180	13	Worker Thread	StatusBar	[Managed to Native Transition]	Normal
Not Flagged		5644	15	Worker Thread	Worker Thread	[Managed to Native Transition]	Lowest
Not Flagged		6308	17	Worker Thread	In-proc Node (Default)	System.Threading.WaitHandle.WaitAny()	Normal
Not Flagged		2372	0	Worker Thread	<No Name>		Normal
Not Flagged		9752	65	Worker Thread	<No Name>		Normal
Not Flagged		9272	22	Worker Thread	FileSystemTrackerCommit	[Managed to Native Transition]	Below Normal
Not Flagged		10432	24	Worker Thread	SettingsChangeDispatch	[Managed to Native Transition]	Normal
Not Flagged		8792	27	Worker Thread	ApplicationMutices	[Managed to Native Transition]	Normal
Not Flagged		10848	28	Worker Thread	PeriodicallyCommitTransactionExecuter	[Managed to Native Transition]	Normal
Not Flagged		404	29	Worker Thread	CacheUpdate	System.Threading.Monitor.Wait()	Normal
Not Flagged		5388	30	Worker Thread	DaemonFibers #0	System.Threading.Monitor.Wait()	Normal
Not Flagged		10892	31	Worker Thread	DaemonFibers #1	System.Threading.Monitor.Wait()	Normal
Not Flagged		8908	32	Worker Thread	DaemonFibers #2	System.Threading.Monitor.Wait()	Normal
Not Flagged		9132	33	Worker Thread	DaemonFibers #3	System.Threading.Monitor.Wait()	Normal
Not Flagged		10664	34	Worker Thread	DaemonFibers #4	System.Threading.Monitor.Wait()	Normal
Not Flagged		10824	35	Worker Thread	DaemonFibers #5	System.Threading.Monitor.Wait()	Normal
Not Flagged		9736	36	Worker Thread	DaemonFibers #6	System.Threading.Monitor.Wait()	Normal
Not Flagged		11260	37	Worker Thread	DaemonFibers #7	System.Threading.Monitor.Wait()	Normal
Not Flagged		3684	38	Worker Thread	DaemonProcessFibers #0	System.Threading.Monitor.Wait()	Normal
Not Flagged		11200	39	Worker Thread	DaemonProcessFibers #1	System.Threading.Monitor.Wait()	Normal
Not Flagged		5568	40	Worker Thread	DaemonProcessFibers #2	System.Threading.Monitor.Wait()	Normal
Not Flagged		9120	41	Worker Thread	DaemonProcessFibers #3	System.Threading.Monitor.Wait()	Normal
Not Flagged		7528	42	Worker Thread	DaemonProcessFibers #4	System.Threading.Monitor.Wait()	Normal
Not Flagged		3752	43	Worker Thread	DaemonProcessFibers #5	System.Threading.Monitor.Wait()	Normal
Not Flagged		4664	44	Worker Thread	DaemonProcessFibers #6	System.Threading.Monitor.Wait()	Normal
Not Flagged		8784	45	Worker Thread	DaemonProcessFibers #7	System.Threading.Monitor.Wait()	Normal
Not Flagged		10980	46	Worker Thread	Daemon	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		10764	47	Worker Thread	SolutionAnalysis.UpdatingDependencyToFilesMap	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		9620	48	Worker Thread	XML Indexing	System.Threading.WaitHandle.InternalWaitOne()	Below Normal
Not Flagged		10264	49	Worker Thread	Pdb cache thread	System.Threading.WaitHandle.InternalWaitOne()	Below Normal
Not Flagged		10612	50	Worker Thread	Unit Test Project Explorer	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		11020	67	Worker Thread	Pool Thread #2	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		9240	68	Worker Thread	Pool Thread #3	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		9860	69	Worker Thread	Pool Thread #4	System.Threading.WaitHandle.InternalWaitOne()	Normal
Not Flagged		11196	59	Worker Thread	ImageAnimator	System.Drawing.ImageAnimator.AnimateImages50ms()	Normal
Not Flagged		5124	58	Worker Thread	<No Name>	[Managed to Native Transition]	Normal
Not Flagged	>	10724	57	Worker Thread	<No Name>	[Managed to Native Transition]	Normal
Not Flagged		5272	62	Worker Thread	<No Name>		Normal
Not Flagged		11128	19	Worker Thread	<No Name>		Normal
Not Flagged		7328	72	Worker Thread	<No Name>		Normal
Not Flagged		11084	64	Worker Thread	<No Name>		Normal
Not Flagged		7120	21	Worker Thread	<No Name>		Normal
Not Flagged		7712	71	Worker Thread	<No Name>		Normal
Not Flagged		8388	51	Worker Thread	<No Name>		Normal
Not Flagged		2860	70	Worker Thread	<No Name>		Normal
Not Flagged		6584	52	Worker Thread	<No Name>		Normal
Not Flagged		10420	63	Worker Thread	<No Name>		Normal
Not Flagged		6936	16	Worker Thread	<No Name>		Normal
Not Flagged		7580	66	Worker Thread	<No Name>		Normal
Not Flagged		6416	56	Worker Thread	<No Name>		Normal
Not Flagged		8688	55	Worker Thread	<No Name>		Normal
Not Flagged		11204	14	Worker Thread	<No Name>		Normal



*Edit2: The thread spiking the CPU seems to have the start address clr.dll!DllGetClassObjectInternal (according to Process Explorer). Breaking into devenv.exe and examining this thread (which has its priority set to "Lowest"), I find the call stack below - does this help?
Code:

 	[Managed to Native Transition]	
>	mscorlib.dll!System.Globalization.CompareInfo.Compare(string string1, string string2, System.Globalization.CompareOptions options) + 0x6b bytes	
 	mscorlib.dll!string.CompareTo(string strB) + 0x34 bytes	
 	mscorlib.dll!System.Collections.Generic.GenericComparer<string>.Compare(string x, string y) + 0x43 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest,string>.CompareKeys(int index1, int index2) + 0x5f bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<System.__Canon,bool>.CompareKeys(int index1, int index2) + 0x54 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest>.QuickSort(int[] map, int left, int right) + 0x7d bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<nCrunch.Client.Model.UITest>.QuickSort(int[] map, int left, int right) + 0xf8 bytes	
 	System.Core.dll!System.Linq.EnumerableSorter<System.__Canon>.Sort(System.__Canon[] elements, int count) + 0x4a bytes	
 	System.Core.dll!System.Linq.OrderedEnumerable<nCrunch.Client.Model.UITest>.GetEnumerator.MoveNext() + 0xb3 bytes	
 	System.Core.dll!System.Linq.Buffer<nCrunch.Client.Model.UITest>.Buffer(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UITest> source) + 0xd9 bytes	
 	System.Core.dll!System.Linq.Enumerable.ToArray<nCrunch.Client.Model.UITest>(System.Collections.Generic.IEnumerable<nCrunch.Client.Model.UITest> source) + 0x4e bytes	
 	nCrunch.VSAddIn.dll!nCrunch.VSAddIn.UI.TestTree.TestTreeSynchroniser.CreateUpdateEvent() + 0x2cf bytes	
 	nCrunch.VSAddIn.dll!nCrunch.VSAddIn.UI.DataAwareSynchronisation.DataAwareWindowSynchroniser<System.__Canon>.RefreshData() + 0x1b bytes	
 	nCrunch.VSAddIn.dll!nCrunch.VSAddIn.UI.TestTree.TestTreeSynchroniser.#=qossrgXWpPAiqDp2GjNhRKQ==(nCrunch.Client.Model.ModelEvents.TestDataUpdatedEvent #=qj_fDpCtlOqJ229S_LAi3yQ==) + 0x7 bytes	
 	[Native to Managed Transition]	
 	mscorlib.dll!System.Delegate.DynamicInvokeImpl(object[] args) + 0x76 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.CoreMessageDispatcher.#=qWvjWszlULQQDeHlJRRbKmg==.#=q9XXE3z5hPVuwWaq1o7Tq8A==() + 0x54 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.CoreMessageDispatcher.#=qr0NzxLdNklVn1_6dSy8G9HtAhV28O0dtWc3mgDcDah0=() + 0xd6 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.PooledWorkItem.Start() + 0x55 bytes	
 	nCrunch.Core.dll!nCrunch.Core.Threading.ThreadFactory.#=q4tyim2Qi0XIAMo_i$p1QvQ==(object #=qEa5fJAgIYj7WnzjepqzKsA==) + 0x7e bytes	
 	mscorlib.dll!System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(object state) + 0x3e bytes	
 	mscorlib.dll!System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state, bool preserveSyncCtx) + 0xa7 bytes	
 	mscorlib.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state, bool preserveSyncCtx) + 0x16 bytes	
 	mscorlib.dll!System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem() + 0x60 bytes	
 	mscorlib.dll!System.Threading.ThreadPoolWorkQueue.Dispatch() + 0x149 bytes	
 	mscorlib.dll!System.Threading._ThreadPoolWaitCallback.PerformWaitCallback() + 0x5 bytes	
 	[Native to Managed Transition]	

Remco
#17 Posted : Thursday, September 5, 2013 1:33:57 AM(UTC)
The stack trace above suggests there is a large amount of data being shown in the Tests Window.

The Tests Window has a known performance issue where it can slow down the VS UI when trying to render lots of data. This can also have a knock-on effect where NCrunch is slow to handle data returned by the engine. If you've just loaded a new O/S and you're running NCrunch for the first time, then my guess is that you'll have potentially many tests in an unprocessed state - and these are shown in the Tests Window by default.

Try adjusting the filters in your Tests Window to hide the unexecuted tests. Do you notice an improvement in the responsiveness of NCrunch and Visual Studio? Once the tests have been run through and their results exist, you can reset the filters to show any new tests that you create.

Unfortunately the performance issues in the Tests Window are related to the component being used to show the data, which has trouble handling frequent updates of large data sets. The upcoming V2 release will include a complete replacement for this control and resolve the problem entirely.
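
(For the curious: the standard way to make a control tolerate frequent updates of a large data set is to coalesce them - mark the data dirty on each event and repaint on a timer, rather than once per event. A generic sketch of that idea, not NCrunch's implementation:)

Code:
using System;
using System.Threading;

// Generic update-coalescing pattern: however many data events arrive,
// the expensive refresh runs at most once per timer tick.
sealed class CoalescingRefresher : IDisposable
{
    private readonly Timer _timer;
    private readonly Action _refresh;
    private int _dirty; // 0 = clean, 1 = refresh pending

    public CoalescingRefresher(Action refresh, TimeSpan interval)
    {
        _refresh = refresh;
        _timer = new Timer(_ => Flush(), null, interval, interval);
    }

    // Called for every data update; cheap and non-blocking.
    public void NotifyDataChanged() => Interlocked.Exchange(ref _dirty, 1);

    private void Flush()
    {
        // Note: the timer fires on a thread-pool thread; a real tool window
        // would also need to marshal the refresh onto the UI thread.
        if (Interlocked.Exchange(ref _dirty, 0) == 1)
            _refresh(); // one repaint, no matter how many events arrived
    }

    public void Dispose() => _timer.Dispose();
}
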
GreenMoose
#18 Posted : Thursday, September 5, 2013 4:53:12 AM(UTC)
Yeah, closing the NCrunch Tests Window did it this time; there was a large number of pending tests in it. Thanks.