NCrunch 3 Test Impact Detection does not work at all
DeltaEngine
#1 Posted : Saturday, December 17, 2016 11:28:32 AM(UTC)
Rank: Advanced Member

Groups: Registered
Joined: 11/23/2012(UTC)
Posts: 31
Location: Germany

Thanks: 8 times
Was thanked: 3 time(s) in 3 post(s)
Hi Remco,

Great release with NCrunch 3, I am enjoying all the new features. One major annoyance in a very big solution with 5000+ tests is that changing anything in a low-level project will always execute all tests in all projects depending on it, even if the change is just a comment, a space, an empty line, or a method not called by any test. So I was glad to hear that the new Test Impact Detection would improve this, but it does not work at all.
- Tested with VS2015 and VS2017, with NCrunch 2 and 3, and with NUnit 2.6.4 and the newest NUnit 3.5; none of it makes a difference.
- It is very easy to reproduce (I just created a new solution and put the class below in it) with tests set to run on change, using the new ILCompare setting for the Impact Detection Mode (which is the default anyway):

using NUnit.Framework;

/// <summary>
/// Any change will always execute all tests .. even a space in a comment.
/// </summary>
public class CheckIfImpactedDetectionWorks
{
    [Test]
    public void SomeTest()
    {
        Assert.That(1 + 1, Is.EqualTo(2));
    }

    [Test]
    public void AnotherTest()
    {
        Assert.That("hi" + " " + "there", Is.EqualTo("hi there"));
    }
}

I don't even care if the system works perfectly, but a change to a comment should really not trigger a complete build and test cycle.
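To make the cross-project case concrete, this is roughly the shape of what I mean (project and type names here are just placeholders, not our real code):

// Low-level project, e.g. "Core" (placeholder name), referenced by many test projects.
namespace Core
{
    public static class MathUtils
    {
        // Editing even this comment currently triggers a rebuild and a full
        // test run in every project that depends on Core.
        public static int Add(int a, int b) => a + b;
    }
}

// In a separate test project that references Core:
namespace Core.Tests
{
    using NUnit.Framework;

    public class MathUtilsTests
    {
        [Test]
        public void AddsTwoNumbers()
        {
            Assert.That(MathUtils.Add(1, 1), Is.EqualTo(2));
        }
    }
}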

What annoys me most is that changing a test, or simply writing a new one, executes ALL other tests in that assembly, even though obviously none of them call the new code; that makes no sense. I don't know how you could check for that, but some way to see whether the last edited code belongs to a test, and then executing only that test, would be good enough for 99% of my use cases (most of the time I am working on one test, and when I move to the implementation I am fine with many tests being impacted, since a change there can affect lots of things). Overall it is still a fast cycle, but I still have to wait 5-10s for about 1000 tests to complete (it seems to be faster after the initial run too). It would be great if just 1 test executed and things went below 1s, like in the good old days when there was not that much code in the solution :)
Remco
#2 Posted : Saturday, December 17, 2016 11:13:57 PM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
Hi, thanks for sharing this issue. You're the first person to report a problem with the new impact detection, so I'm naturally excited to see what is happening here :)

When you're working in your solution, which engine mode are you using? Under the default engine mode (Run all tests automatically), NCrunch will still queue tests for execution even when they haven't been impacted - impact detection is used only for prioritisation.

If you want to restrict execution of tests to only impacted tests, the 'Run impacted tests automatically, others manually' mode is a better one.

NCrunch always shows impacted tests with a little 'i' next to their name in the Tests Window. There's also a column you can add to this window with a True/False value indicating impact status. I think the first thing we need to establish is whether this is actually an impact detection problem, or simply the engine queuing tests that it shouldn't be.
DeltaEngine
#3 Posted : Sunday, December 18, 2016 12:07:07 AM(UTC)
Rank: Advanced Member

Groups: Registered
Joined: 11/23/2012(UTC)
Posts: 31
Location: Germany

Thanks: 8 times
Was thanked: 3 time(s) in 3 post(s)
Thanks for the response and helpful tips.

I have turned on the Is Impacted column and it does in fact show false for everything, e.g. when adding a space to a comment. However, it never shows true for complex test classes with base classes (or it is so fast I cannot see it), no matter whether I just add a space to a method or make actual code changes. I tested a simpler use case, and there the impacted tests show up correctly.

For testing I pinned a test; it executes quickly and as one of the first, but the build is a bit slow (the solution has 75 projects; with the ReSharper Build system it is almost instant to compile anything, which is quite nice, while in NCrunch it takes 1-3 seconds before any test is even started).

This is what I see: after about 2 seconds the pinned test is executed quickly, then it seems all other tests in the assembly are also executed (the spinning circle goes through them). Most show zero time, but something is still being done, as it takes around 10 seconds to get back to "No tests are queued for execution" (pretty much always, no matter how big the change is). All of our tests execute below 10ms, mostly below 1ms, so they are fast (with a few exceptions). Some tests (like the one visible here) have test sources that are executed before the tests can start, but tests without test sources behave the same way ..


I tried the "Run impacted tests automatically, rest manually" mode (with excluding our slow and nightly categories), seems NCrunch forgot about all the ignored tests from my normal mode and executed them anyway, even restarting did not fix it, so I had to re-ignore them (which is strange anyway because they are explicitly in the Slow category). I could not get this to work in the complex tests from the screenshot above, but in a simpler use-case it worked fine. I could change the test and only it would be executed or change the implementation and a bunch of tests failed until I fixed the implementation again.

Thanks for the tip with the "Run impacted tests automatically, others manually" mode; I will use that for now, as it does mostly what I want and is pretty quick except for the build times. Maybe you can take a look at how ReSharper Build solves this: when I build the lowest-level project, only that one is compiled and all other projects stay the same (it is very rare that a change requires a recompilation): https://blog.jetbrains.c...oducing-resharper-build/
Remco
#4 Posted : Sunday, December 18, 2016 12:30:12 AM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
Thanks for coming back to me again on this issue.

Can you confirm that using the 'Run impacted tests automatically, others manually' engine mode solved the impact detection problem for you?

I'll try to address the other issues you've also raised above:

- NCrunch forgetting ignored tests: How are you ignoring your tests in NCrunch? Is this being done by manually adjusting the engine modes to add extra filters for automatic execution? If so, then switching to an engine mode without your custom filter will definitely cause the behaviour you've described. You might need to make some changes to the 'Run impacted tests automatically, others manually' mode to exclude your slow categories from automatic execution.
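For reference, a filter like that usually keys off NUnit categories; if your slow tests are tagged along these lines (a minimal sketch - the 'Slow' name is taken from your description, the test itself is made up), the engine mode can exclude that category from automatic execution:

using NUnit.Framework;

public class NightlyImportTests
{
    // An engine mode filter can exclude this category from automatic
    // execution; the test then runs only when triggered manually.
    [Test, Category("Slow")]
    public void ImportsLargeDataSet()
    {
        Assert.Pass();
    }
}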

- Slow execution of fast tests: The best way to troubleshoot this is to enable the 'Track engine performance' global configuration setting in NCrunch and restart the IDE. With this setting enabled, NCrunch will provide a detailed breakdown of all tasks the engine needed to perform when executing a test. With the number of tests in your solution, you'll likely find a significant amount of time is taken initialising NUnit. It's possible to export this data and copy/paste it here into the forum; I'll then help you analyse it to figure out whether there's anything we can do to tweak the performance.

- Comparison with ReSharper Build: As I only have limited first-hand experience with this product, it's hard for me to draw any firm conclusions. Based on its description, ReSharper Build will still need to recompile a project when it changes (i.e. the compile step can never be skipped); its advantage is that it won't rebuild projects further up the dependency chain when they haven't been affected by the change. NCrunch has had this feature since its first version, though it has some compatibility limitations: if the 'Copy referenced assemblies to workspace' setting is enabled on any project in the solution, the feature is suppressed. NCrunch will give you a warning if it needs to rebuild dependencies because of this config setting, or for any reason other than a public signature change. It's also worth considering that for NCrunch to execute your tests, it needs to do much more than just build the changed project: it also needs to instrument it, run a test discovery step to identify changes to tests, then build a whole runtime environment containing all the relevant code with an initialised test framework. You should see each of these steps accounted for in the 'Execution Steps'.
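To illustrate the signature rule with a hypothetical example (the Pricing class below is made up): a change confined to a method body leaves the public signature intact, so dependent projects are not rebuilt, while a change to the signature forces every referencing project to rebuild.

public static class Pricing
{
    // A body-only edit here (say, correcting the constant) keeps the public
    // signature identical, so projects referencing this assembly are not rebuilt.
    public static decimal NetPrice(decimal gross) => gross / 1.19m;

    // Adding a parameter changes the public signature, so every project
    // referencing this assembly must be rebuilt against the new version.
    public static decimal NetPrice(decimal gross, decimal taxRate) => gross / (1m + taxRate);
}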
DeltaEngine
#5 Posted : Sunday, December 18, 2016 10:21:02 AM(UTC)
Rank: Advanced Member

Groups: Registered
Joined: 11/23/2012(UTC)
Posts: 31
Location: Germany

Thanks: 8 times
Was thanked: 3 time(s) in 3 post(s)
Yes, the 'Run impacted tests automatically, others manually' engine mode solves the issue for me; it is far faster than the normal 'Run all tests automatically' mode (10s vs 2s). For some reason it also works far better today than yesterday; maybe some outdated files were messing things up.

About the engine modes: yes, I did change the mode settings to match my 'Run all tests automatically' mode, but by then it had already executed the tests, the results messed up my NCrunch Tests window, and it took forever (because of slow and nightly tests not designed to be run with NCrunch). For some reason even restarting NCrunch did not help. Today it seems to work correctly; I even deleted the <IgnoredTests> sections from my ncrunch project files and all tests executed correctly.

Here is the engine performance for me (in 'Run impacted tests automatically, others manually' mode). Btw: I have a 4.2 GHz overclocked i7 with 6 hyper-threads dedicated to NCrunch.
- When just changing a comment or adding an empty line, no execution steps are shown; the build was also quick (<0.1s) when I changed something in the most basic project. Strange.
- Actual code changes also worked pretty well, but I cannot see anything in "Execution Steps"; it stays empty except when manually executing a test.
- Here is what it looks like for most tests (I could not find an overview; it only shows something when clicking on a single test or class):
Quote:

Step Duration Started At Finished At
All steps required for the execution of DeltaEngine.Tests.Datatypes.ColorTests.CreateWithBytes 00:00:31.4467062 2016-12-18 10:58:12 2016-12-18 10:58:44
Complete processing of task 'Run tests from DeltaEngine.Tests on (local)' 00:00:00.0035033 2016-12-18 10:58:44 2016-12-18 10:58:44
Enqueue tests for high priority execution 00:00:00.0335319 2016-12-18 10:58:12 2016-12-18 10:58:12
Merge test code coverage data 00:00:00.0100094 2016-12-18 10:58:44 2016-12-18 10:58:44
Prepare primary task for processing 'Run tests from DeltaEngine.Tests' 00:00:00.0010009 2016-12-18 10:58:43 2016-12-18 10:58:43
Process task 'Run tests from DeltaEngine.Tests on (local)' 00:00:01.2391900 2016-12-18 10:58:43 2016-12-18 10:58:44
Register result data for test 'DeltaEngine.Tests.Datatypes.ColorTests.CreateWithBytes' with engine 00:00:00 2016-12-18 10:58:44 2016-12-18 10:58:44
Waiting in processing queue 00:00:30.1589698 2016-12-18 10:58:12 2016-12-18 10:58:43


Not sure how useful that is; all pretty low times except for 'Waiting in processing queue'. Here is another result (from a class):
Quote:

Step Duration Started At Finished At
All steps required for the execution of DeltaEngine.Tests.Achievements.AchievementProcessorTests.* 00:00:38.8142824 2016-12-18 10:58:12 2016-12-18 10:58:51
Complete processing of task 'Run tests from DeltaEngine.Tests on (local)' 00:00:00.0055053 2016-12-18 10:58:51 2016-12-18 10:58:51
Enqueue tests for high priority execution 00:00:00.0335319 2016-12-18 10:58:12 2016-12-18 10:58:12
Merge test code coverage data 00:00:00.2872757 2016-12-18 10:58:51 2016-12-18 10:58:51
Prepare primary task for processing 'Run tests from DeltaEngine.Tests' 00:00:00.0010009 2016-12-18 10:58:47 2016-12-18 10:58:47
Process task 'Run tests from DeltaEngine.Tests on (local)' 00:00:04.3656934 2016-12-18 10:58:47 2016-12-18 10:58:51
Register result data for test 'DeltaEngine.Tests.Achievements.AchievementProcessorTests' with engine 00:00:00 2016-12-18 10:58:51 2016-12-18 10:58:51
Waiting in processing queue 00:00:34.1162704 2016-12-18 10:58:12 2016-12-18 10:58:47


And switching back to 'Run all tests automatically' mode, the results are much more detailed. Yesterday I was probably talking about the Dependency 'Analyse'/'Build' steps being too slow for me (that is what I called build times). The tests could be faster, but I understand that running 5000 tests is going to take time. This is why the 'Run impacted tests automatically, others manually' engine mode now works great for me: it only executes the tiny set of tests I am working on, and I can manually run everything when I am not sure whether the rest was really not impacted.
Quote:

Step Duration Started At Finished At
All steps required for the execution of DeltaEngineServices.Server.Tests.Build.CodeUnpackerTests.* 00:01:06.7816489 2016-12-18 11:10:25 2016-12-18 11:11:32
Complete processing of task 'Run tests from DeltaEngineServices.Server.Tests on (local)' 00:00:00.0090086 2016-12-18 11:11:32 2016-12-18 11:11:32
Dependency 'Analyse DeltaEngine.Mocks on (local)' 00:00:00.4689505 2016-12-18 11:10:46 2016-12-18 11:10:47
Dependency 'Analyse DeltaEngineServices.Server.Mocks on (local)' 00:00:00.2722612 2016-12-18 11:10:49 2016-12-18 11:10:49
Dependency 'Analyse DeltaEngineServices.Server.Tests on (local)' 00:00:00.3117993 2016-12-18 11:10:49 2016-12-18 11:10:50
Dependency 'Build DeltaEngine on (local)' 00:00:02.1560751 2016-12-18 11:10:26 2016-12-18 11:10:28
Dependency 'Build DeltaEngine.Achievements on (local)' 00:00:00.3168044 2016-12-18 11:10:30 2016-12-18 11:10:30
Dependency 'Build DeltaEngine.Ads on (local)' 00:00:00.2307215 2016-12-18 11:10:30 2016-12-18 11:10:31
Dependency 'Build DeltaEngine.Analytics on (local)' 00:00:00.1431376 2016-12-18 11:10:30 2016-12-18 11:10:31
Dependency 'Build DeltaEngine.Authentication on (local)' 00:00:00.2507408 2016-12-18 11:10:30 2016-12-18 11:10:30
Dependency 'Build DeltaEngine.Content on (local)' 00:00:00.2287204 2016-12-18 11:10:29 2016-12-18 11:10:29
Dependency 'Build DeltaEngine.Editor on (local)' 00:00:00.4474301 2016-12-18 11:10:34 2016-12-18 11:10:35
Dependency 'Build DeltaEngine.Entities on (local)' 00:00:00.8403070 2016-12-18 11:10:28 2016-12-18 11:10:29
Dependency 'Build DeltaEngine.Fonts on (local)' 00:00:00.2667564 2016-12-18 11:10:29 2016-12-18 11:10:30
Dependency 'Build DeltaEngine.Graphics on (local)' 00:00:00.2617515 2016-12-18 11:10:29 2016-12-18 11:10:29
Dependency 'Build DeltaEngine.InAppPurchase on (local)' 00:00:00.5630410 2016-12-18 11:10:31 2016-12-18 11:10:31
Dependency 'Build DeltaEngine.Input on (local)' 00:00:00.4714529 2016-12-18 11:10:29 2016-12-18 11:10:29
Dependency 'Build DeltaEngine.Mocks on (local)' 00:00:00.4519340 2016-12-18 11:10:34 2016-12-18 11:10:34
Dependency 'Build DeltaEngine.Multimedia on (local)' 00:00:00.3062946 2016-12-18 11:10:30 2016-12-18 11:10:30
Dependency 'Build DeltaEngine.Networking on (local)' 00:00:00.4349177 2016-12-18 11:10:30 2016-12-18 11:10:30
Dependency 'Build DeltaEngine.Optional.Json on (local)' 00:00:00.2457360 2016-12-18 11:10:31 2016-12-18 11:10:32
Dependency 'Build DeltaEngine.Resolvers on (local)' 00:00:00.2932819 2016-12-18 11:10:33 2016-12-18 11:10:34
Dependency 'Build DeltaEngine.Spine on (local)' 00:00:00.3428292 2016-12-18 11:10:31 2016-12-18 11:10:31
Dependency 'Build DeltaEngine.Sprites on (local)' 00:00:00.4209044 2016-12-18 11:10:29 2016-12-18 11:10:29
Dependency 'Build DeltaEngine.Xml on (local)' 00:00:02.6280242 2016-12-18 11:10:31 2016-12-18 11:10:33
Dependency 'Build DeltaEngineServices.CodeConverters on (local)' 00:00:00.4048890 2016-12-18 11:10:35 2016-12-18 11:10:35
Dependency 'Build DeltaEngineServices.CodeConverters.Mocks on (local)' 00:00:00.2066985 2016-12-18 11:10:37 2016-12-18 11:10:37
Dependency 'Build DeltaEngineServices.ContentConverters on (local)' 00:00:00.6851582 2016-12-18 11:10:37 2016-12-18 11:10:38
Dependency 'Build DeltaEngineServices.Database on (local)' 00:00:00.2482390 2016-12-18 11:10:38 2016-12-18 11:10:38
Dependency 'Build DeltaEngineServices.Database.Mocks on (local)' 00:00:00.2051970 2016-12-18 11:10:40 2016-12-18 11:10:40
Dependency 'Build DeltaEngineServices.Database.Mongo on (local)' 00:00:00.2497393 2016-12-18 11:10:38 2016-12-18 11:10:38
Dependency 'Build DeltaEngineServices.Database.Sql on (local)' 00:00:00.2192104 2016-12-18 11:10:40 2016-12-18 11:10:40
Dependency 'Build DeltaEngineServices.Platforms on (local)' 00:00:00.2567467 2016-12-18 11:10:37 2016-12-18 11:10:37
Dependency 'Build DeltaEngineServices.Server on (local)' 00:00:00.9469097 2016-12-18 11:10:39 2016-12-18 11:10:40
Dependency 'Build DeltaEngineServices.Server.Mocks on (local)' 00:00:00.2672571 2016-12-18 11:10:44 2016-12-18 11:10:45
Dependency 'Build DeltaEngineServices.Server.Tests on (local)' 00:00:00.2562458 2016-12-18 11:10:45 2016-12-18 11:10:45
Enqueue tests for execution with normal priority 00:00:00.1941867 2016-12-18 11:10:25 2016-12-18 11:10:25
Filter tests to identify execution targets 00:00:00.0035035 2016-12-18 11:10:25 2016-12-18 11:10:25
Merge test code coverage data 00:00:00.0030029 2016-12-18 11:11:32 2016-12-18 11:11:32
Prepare primary task for processing 'Run tests from DeltaEngineServices.Server.Tests' 00:00:00.0010008 2016-12-18 11:11:31 2016-12-18 11:11:31
Process task 'Run tests from DeltaEngineServices.Server.Tests on (local)' 00:00:00.5670446 2016-12-18 11:11:31 2016-12-18 11:11:32
Register result data for test 'DeltaEngineServices.Server.Tests.Build.CodeUnpackerTests' with engine 00:00:00 2016-12-18 11:11:32 2016-12-18 11:11:32
Waiting in processing queue 00:00:00.4234064 2016-12-18 11:10:25 2016-12-18 11:10:26
Waiting in processing queue 00:00:00.0885852 2016-12-18 11:10:34 2016-12-18 11:10:34
Waiting in processing queue 00:00:01.8227509 2016-12-18 11:10:35 2016-12-18 11:10:37
Waiting in processing queue 00:00:00.4179017 2016-12-18 11:10:38 2016-12-18 11:10:39
Waiting in processing queue 00:00:04.2696005 2016-12-18 11:10:40 2016-12-18 11:10:44
Waiting in processing queue 00:00:00.2287198 2016-12-18 11:10:45 2016-12-18 11:10:45
Waiting in processing queue 00:00:01.1846377 2016-12-18 11:10:45 2016-12-18 11:10:46
Waiting in processing queue 00:00:02.3112201 2016-12-18 11:10:47 2016-12-18 11:10:49
Waiting in processing queue 00:00:41.5924513 2016-12-18 11:10:50 2016-12-18 11:11:31


Hope this helps ..
Remco
#6 Posted : Sunday, December 18, 2016 10:42:00 PM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
This is really great, thanks for sharing this information!

I can offer you a few tips here.

- The performance tracking will start tracking from the point that the test enters the processing queue. Tests are usually queued in one of two ways: either by manually selecting them for execution, or by changing code so that they are queued automatically. If the tests are queued manually, any builds needed for them have likely already been executed, which would be why you got more detail in the 'Run all tests automatically' engine mode. There is no performance penalty for leaving the tracking enabled, so I'd suggest keeping it on and inspecting it regularly whenever you feel the engine is taking too long.

- You have very low build times. My guess is that your projects are individually fairly small, which is really a good thing when working with NCrunch. When things are running right, you should get really good performance on this solution.

- Because almost all the time being taken here is waiting on the processing queue, any issues you're encountering are likely to be due to overall capacity and configuration settings rather than real problems in the engine. It's possible to see the prioritisation of tasks in the processing queue by simply opening up the Processing Queue Window and sorting it by priority. I admit that this isn't always easy to correlate with the performance tracing output, but it can often give you some idea why certain tests are run much later in the sequence. NCrunch will mark tests as 'waiting in the processing queue' if higher priority items exist before them in the queue, or if resources are being tied up by other queued actions that cannot be aborted.

Considering that you saw a 30-second wait before your tests started to execute when you manually tried to run them, I would suggest taking a look at your 'Fast lane threads' setting to make sure it is set to 1 or higher. Boosting the maximum number of processing threads would also really help here, as would perhaps setting up a grid node using distributed processing. A 30-second wait for priority execution is very unusual, as it means ALL your processing threads were tied up with other work, which shouldn't happen for so long if you have fast lane threads assigned and your test execution times are consistent.
DeltaEngine
#7 Posted : Monday, December 19, 2016 6:35:58 AM(UTC)
Rank: Advanced Member

Groups: Registered
Joined: 11/23/2012(UTC)
Posts: 31
Location: Germany

Thanks: 8 times
Was thanked: 3 time(s) in 3 post(s)
Thanks for the additional tips.
- Okay, I will leave performance tracking on and inspect it more in the future; I always thought it slowed NCrunch down.
- I will also look at the NCrunch Processing Queue and see whether I can optimize things by assigning higher priority to important projects.
- Build times vary; we had over 200 projects and merged a lot of them to make NDepend happy, which prefers fewer, bigger assemblies over many smaller ones. This also improved startup time quite a bit.
- Some projects have 20-30k lines; I would not call that small :) The big solution I am working on has about 200k lines of code, but the smaller one I usually work on has more like 25k lines and about 20 projects, and works very well with NCrunch. Most projects are more in the 500-2000 line range. Most methods have just one line and use expression bodies, so 1000 lines is still a lot of functionality (see the sketch after this list).
- The 30s was kind of a worst case because I restarted NCrunch and tested different settings and modes, normally it was more around 10s, it was just annoying that it kept repeating the same tests over and over. Now with the impact mode things are much more smooth and I almost forget that NCrunch is even running at all, which is how it should be.
- I have always been using 6 threads for NCrunch; 2 processing threads and 1 fast lane thread worked best for me. Obviously if I had a 16-hyper-thread PC in 2017 I would give more cores to NCrunch (looking at AMD Ryzen, as Intel is currently a bit expensive when going for 8+ cores).
- Maybe I could try NCrunch Distributed Processing again; last time the network overhead was far too high and slowed things down more than it helped. It would maybe be useful for long-running things and the slowest tests; I am not sure whether that can be configured.
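As a trivial sketch of what I mean by expression bodies (made-up example, not our real code):

using System;

public class Circle
{
    public Circle(float radius) { Radius = radius; }
    public float Radius { get; }

    // Expression-bodied members: each is one line, but each line is a
    // complete method or property, so line counts understate functionality.
    public float Diameter => 2 * Radius;
    public float Area() => (float)Math.PI * Radius * Radius;
}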

Maybe also promote the 'Run impacted tests automatically, others manually' engine mode more in the First Time Crunch Wizard, e.g. higher in the list in the Engine Execution Modes dialog, and maybe mention that for very big projects it might be more desirable than running everything. For me this is the best feature of NCrunch 3 (once figured out ^^).
Remco
#8 Posted : Monday, December 19, 2016 6:47:58 AM(UTC)
Rank: NCrunch Developer

Groups: Administrators
Joined: 4/16/2011(UTC)
Posts: 7,123

Thanks: 957 times
Was thanked: 1287 time(s) in 1194 post(s)
DeltaEngine;9582 wrote:

- Okay, I will leave performance tracking on and inspect it more in the future; I always thought it slowed NCrunch down.


The original intention was that it would be possible to leave it off so that the measurement would have no performance impact. However, the design ended up being so lightweight that it didn't actually matter. Interestingly, the current setting only disables the performance tracking in the UI - the engine continues to track performance regardless of whether the setting is enabled.

DeltaEngine;9582 wrote:

- I will also look at the NCrunch Processing Queue and see whether I can optimize things by assigning higher priority to important projects.


Generally, tweaking the build priority for specific projects shouldn't be needed. This is because the engine will implicitly add priority to projects that have high priority tests depending on them. In some situations changing the build priority may make the engine seem dumber. It's impossible for me to be certain without knowing your solution, but proceed at your own risk here :)

DeltaEngine;9582 wrote:

- I have always been using 6 threads for NCrunch; 2 processing threads and 1 fast lane thread worked best for me. Obviously if I had a 16-hyper-thread PC in 2017 I would give more cores to NCrunch (looking at AMD Ryzen, as Intel is currently a bit expensive when going for 8+ cores).


I recommend turning up your processing threads to see how the engine performs. As a general guideline, I usually recommend one processing thread per logical CPU assigned to NCrunch, so that would normally be 6 processing threads in your situation. Note that the ideal setting is extremely variable depending on how heavy your tests are: if you have lots of big, multi-threaded, I/O-heavy integration tests, a lower setting can sometimes help, while with only 2 processing threads and many long-running tests the engine will feel quite unresponsive as resources become scarce.

DeltaEngine;9582 wrote:

- Maybe I could try NCrunch Distributed Processing again; last time the network overhead was far too high and slowed things down more than it helped. It would maybe be useful for long-running things and the slowest tests; I am not sure whether that can be configured.


I'm interested to know whether you can consistently reproduce a situation where network latency is a problem with distributed processing. The protocol was intended to be very lightweight, with minimal data transferred to the server after the first big upload. NCrunch's performance tracking also covers the steps involved in distributed processing, including some measurement of network overhead. If you find the time to try this again, I'd be interested to hear how it goes.