Sporadic IndexOutOfRangeException in TestCaseSource-based tests
nshallred
#1 Posted : Wednesday, January 30, 2019 5:47:31 PM(UTC)
Environment:
Visual Studio Version: 2017 (15.9.6)
NCrunch: 3.23.0.10
Test Framework: NUnit (3.11)

We're in the process of converting a number of our projects to .NET Core/.NET Standard and I've run into an issue with one particular solution. I'm getting errors similar to the following:

System.Reflection.TargetInvocationException : Exception has been thrown by the target of an invocation.
----> System.TypeInitializationException : The type initializer for 'Exclaimer.Applications.ComodoProxy.WebApi.Tests.ExclaimerOrderNumberTests' threw an exception.
----> System.IndexOutOfRangeException : Index was outside the bounds of the array.
at System.RuntimeFieldHandle.GetValue(RtFieldInfo field, Object instance, RuntimeType fieldType, RuntimeType declaringType, Boolean& domainInitialized)
at System.Reflection.RtFieldInfo.UnsafeGetValue(Object obj)
at System.Reflection.RtFieldInfo.GetValue(Object obj)
at NUnit.Framework.TestCaseSourceAttribute.GetTestCaseSource(IMethodInfo method) in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestCaseSourceAttribute.cs:line 263
at NUnit.Framework.TestCaseSourceAttribute.GetTestCasesFor(IMethodInfo method) in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestCaseSourceAttribute.cs:line 170
--TypeInitializationException

--IndexOutOfRangeException
at nCrunch.TestRuntime.DotNetCore.SharedMemoryExecutionDataRecorder.MarkCoverageOfClass(Int32 componentMappingId, Int32 classIndex)
at Exclaimer.Applications.ComodoProxy.WebApi.Models.ExclaimerOrderNumber..ctor(String prefix, CertificateType type, Guid id) in C:\Users\nick.hall\Source\Repos\tcps\src\Exclaimer.Applications.ComodoProxy.WebApi\Models\ExclaimerOrderNumber.cs:line 0
at Exclaimer.Applications.ComodoProxy.WebApi.Tests.ExclaimerOrderNumberTests..cctor() in C:\Users\nick.hall\Source\Repos\tcps\src\Exclaimer.Applications.ComodoProxy.WebApi.Tests\ExclaimerOrderNumberTests.cs:line 23

I've observed the following about the issue:


  1. The affected tests are always based on use of the TestCaseSource attribute.
  2. There seems to be a random factor - usually at least one test will blow up in the manner described; sometimes ALL of them do. The number of failures varies between about one and 60.
  3. I have also sporadically seen an analysis failure reported by NCrunch which suggests it has encountered an AccessViolationException:

    An error occurred while analysing this project after it was built: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.TypeInitializationException: The type initializer for 'Exclaimer.Applications.TenantCertificates.UnitTests.Diagnostics.CertificateRequestOperationUnitTests' threw an exception. ---> System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
    at nCrunch.TestRuntime.SharedExecutionMap.MarkTestLineExecutionTime(Int32 lineMarkerIndex, UInt32 tickCount)
    at nCrunch.TestRuntime.SharedMemoryExecutionDataRecorder.MarkTestLineExecutionTime(Int32 componentMappingId, Int32 lineMarkerIndex, UInt32 tickCount)
    at nCrunch.TestRuntime.TestCoverageEventListener.NCrunchExitMethod(Int32 componentId, Int32& existingCoverageMarkerIndex, UInt32 existingTickCount)
    at Exclaimer.Applications.TenantCertificates.UnitTests.Diagnostics.CertificateRequestOperationUnitTests..cctor() in C:\Users\nick.hall\Source\Repos\tcps\src\Exclaimer.Applications.TenantCerts.UnitTests\Diagnostics\CertificateRequestOperationUnitTests.cs:line 48
    --- End of inner exception stack trace ---
    at Exclaimer.Applications.TenantCertificates.UnitTests.Diagnostics.CertificateRequestOperationUnitTests.get_CompletedTestCaseDatas()
    --- End of inner exception stack trace ---
    at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
    at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
    at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
    at System.Reflection.RuntimePropertyInfo.GetValue(Object obj, BindingFlags invokeAttr, Binder binder, Object[] index, CultureInfo culture)
    at System.Reflection.RuntimePropertyInfo.GetValue(Object obj, Object[] index)
    at NUnit.Framework.TestCaseSourceAttribute.GetTestCaseSource(IMethodInfo method) in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestCaseSourceAttribute.cs:line 268
    at NUnit.Framework.TestCaseSourceAttribute.GetTestCasesFor(IMethodInfo method) in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestCaseSourceAttribute.cs:line 173
    at NUnit.Framework.TestCaseSourceAttribute.<BuildFrom>d__22.MoveNext() in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestCaseSourceAttribute.cs:line 139
    at NUnit.Framework.Internal.Builders.DefaultTestCaseBuilder.BuildFrom(IMethodInfo method, Test parentSuite) in C:\src\nunit\nunit\src\NUnitFramework\framework\Internal\Builders\DefaultTestCaseBuilder.cs:line 144
    at NUnit.Framework.Internal.Builders.NUnitTestFixtureBuilder.AddTestCasesToFixture(TestFixture fixture) in C:\src\nunit\nunit\src\NUnitFramework\framework\Internal\Builders\NUnitTestFixtureBuilder.cs:line 183
    at NUnit.Framework.Internal.Builders.NUnitTestFixtureBuilder.BuildFrom(ITypeInfo typeInfo, ITestFixtureData testFixtureData) in C:\src\nunit\nunit\src\NUnitFramework\framework\Internal\Builders\NUnitTestFixtureBuilder.cs:line 156
    at NUnit.Framework.TestFixtureAttribute.<BuildFrom>d__48.MoveNext() in C:\src\nunit\nunit\src\NUnitFramework\framework\Attributes\TestFixtureAttribute.cs:line 227
    at NUnit.Framework.Internal.Builders.DefaultSuiteBuilder.BuildFrom(ITypeInfo typeInfo) in C:\src\nunit\nunit\src\NUnitFramework\framework\Internal\Builders\DefaultSuiteBuilder.cs:line 80
    at NUnit.Framework.Api.DefaultTestAssemblyBuilder.GetFixtures(Assembly assembly, IList names) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\DefaultTestAssemblyBuilder.cs:line 208
    at NUnit.Framework.Api.DefaultTestAssemblyBuilder.Build(Assembly assembly, String assemblyPath, IDictionary`2 options) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\DefaultTestAssemblyBuilder.cs:line 170
    at NUnit.Framework.Api.DefaultTestAssemblyBuilder.Build(Assembly assembly, IDictionary`2 options) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\DefaultTestAssemblyBuilder.cs:line 85
    at NUnit.Framework.Api.NUnitTestAssemblyRunner.Load(Assembly assembly, IDictionary`2 settings) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\NUnitTestAssemblyRunner.cs:line 174
    at NUnit.Framework.Api.FrameworkController.LoadTests() in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\FrameworkController.cs:line 201
    at NUnit.Framework.Api.FrameworkController.LoadTests(ICallbackEventHandler handler) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\FrameworkController.cs:line 321
    at NUnit.Framework.Api.FrameworkController.LoadTestsAction..ctor(FrameworkController controller, Object handler) in C:\src\nunit\nunit\src\NUnitFramework\framework\Api\FrameworkController.cs:line 502
    --- End of inner exception stack trace ---
    at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
    at System.Reflection.RuntimeConstructorInfo.Invoke(BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
    at System.RuntimeType.CreateInstanceImpl(BindingFlags bindingAttr, Binder binder, Object[] args, CultureInfo culture, Object[] activationAttributes, StackCrawlMark& stackMark)
    at System.Activator.CreateInstance(Type type, BindingFlags bindingAttr, Binder binder, Object[] args, CultureInfo culture, Object[] activationAttributes)
    at System.Activator.CreateInstance(Type type, Object[] args)
    at nCrunch.Module.NUnit3.Integration.FrameworkController.LoadTests(INUnit3CallbackHandler handler)
    at nCrunch.Module.NUnit3.Integration.NUnit3FrameworkInteractor.<>c__DisplayClass8_0.<prepareFramework>b__0()
    at nCrunch.Common.PerformanceTracking.PerfTracker.TrackActivity(String name, Action activity)
    at nCrunch.Common.PerformanceTracking.PerfTracker.TryTrackActivity(String name, Action activity)
    at nCrunch.Module.NUnit3.Integration.NUnit3FrameworkInteractor.prepareFramework(DynamicProxy[] dynamicProxies)
    at nCrunch.Module.NUnit3.Integration.NUnit3FrameworkInteractor..ctor(ReflectedAssembly assembly, IList`1 referencedAssemblyFilePaths, ComponentUniqueName testComponentUniqueName, DynamicProxy[] dynamicProxies)
    at nCrunch.Module.NUnit3.Integration.NUnit3FrameworkRuntimeEnvironment.FindFrameworkTestsInAssembly(ReflectedAssembly assembly, FilePath assemblyFilePath, IList`1 referencedAssemblyFilePaths, ComponentUniqueName testComponentUniqueName, PlatformType platformType, DynamicProxy[] dynamicProxies)
    at nCrunch.TestExecution.TestFinder..()
    at nCrunch.Common.PerformanceTracking.PerfTracker.TrackActivity(String name, Action activity)
    at nCrunch.TestExecution.TestFinder..()
    at nCrunch.Common.PerformanceTracking.PerfTracker.TrackActivity(String name, Action activity)
    at nCrunch.TestExecution.TestFinder.FindTestsForFrameworks(ReflectedAssembly assembly, FilePath assemblyFilePath, IList`1 referencedAssemblyFilePaths, DescribedTestFrameworkDiscoverer[] describedDiscoverers, ComponentUniqueName testComponentUniqueName, PlatformType platformType, DynamicProxy[] dynamicProxies)
    at nCrunch.TestExecution.RemoteTaskRunner.AnalyseAssembly(DescribedTestFrameworkDiscoverer[] applicableFrameworks, ComponentUniqueName testComponentUniqueName, PerfTracker perfTracker)
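
For reference, the affected fixtures all follow roughly this shape: a static source (field or property) initialised by the fixture's static constructor and consumed via TestCaseSource. The code below is an illustrative sketch rather than our actual tests (all names are made up):
Code:
using System.Collections.Generic;
using NUnit.Framework;

[TestFixture]
public class OrderNumberFormattingTests
{
    // Static source populated by the static constructor - the same shape that
    // appears in the stack traces above. Note that the type initializer (..cctor)
    // runs during NUnit's test discovery, not during test execution.
    private static readonly List<TestCaseData> OrderNumberCases;

    static OrderNumberFormattingTests()
    {
        OrderNumberCases = new List<TestCaseData>
        {
            new TestCaseData("EX", 1).Returns("EX-0001"),
            new TestCaseData("EX", 42).Returns("EX-0042"),
        };
    }

    [TestCaseSource(nameof(OrderNumberCases))]
    public string Formats_order_numbers(string prefix, int number)
    {
        return $"{prefix}-{number:0000}";
    }
}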


Please let me know if I can provide any further information or assist in the diagnosis of this issue.

Nick Hall
Remco
#2 Posted : Wednesday, January 30, 2019 7:02:06 PM(UTC)
Rank: NCrunch Developer
Hi Nick,

Thanks for sharing this issue in such detail.

NCrunch uses an unmanaged map to store code coverage details during execution and surface them from the test run. The only way you can disable this map at run time is by turning off the 'Instrument output assembly' setting and disabling code coverage for all of your projects. The code coverage storage system involves many calls from your own code (injected by NCrunch) into the map.

However, the above system is only active when tests are actually being executed. During the discovery run, no unmanaged code is run by NCrunch; the instrumented calls to the map are stubbed out using a managed interface. So the occurrence of this problem during your analysis step suggests that it isn't being caused directly by NCrunch's coverage tracking. It is, however, quite likely that any memory corruption could interfere with these stubbed calls, which would explain the AccessViolationExceptions thrown while they're being executed.

So broadly, there are two likely things that could cause this problem:

1. Unstable, unmanaged code running inside test generation (i.e. TestCaseSource) routines. If you have any unmanaged code that would be executed at discovery time, this is suspect. Consider that this code doesn't need to be inside the TestCaseSource methods; it might be called indirectly (e.g. from a static constructor/destructor). It may also be in third-party code.

2. The .NET garbage collector. I've experienced problems like this sometimes while working with pre-release VS/.NET toolsets (and sadly, often released ones too). When the GC has a problem, crazy things can happen. Two years ago, MS pushed out a release with a GC problem where roughly 1 in 20,000 NCrunch test runs would randomly blow up. We spent weeks tracking this but were never able to reproduce the problem consistently enough to help the MS team fix it. Eventually, we just turned off concurrent garbage collection in all NCrunch processes and the problem disappeared. Make sure you don't have any configuration settings in your test environment that would enable concurrent/workstation garbage collection. NCrunch should disable it by default, but something might have gone wrong in your environment or the setting may have been overridden.
nshallred
#3 Posted : Thursday, January 31, 2019 11:00:28 AM(UTC)
Thanks for responding so quickly. Thanks also for the detailed explanation of what is going on behind the scenes.

1. I've looked in detail at all the tests using TestCaseSource; there's no obvious use of unsafe code directly in the generated tests. A couple use types defined in external packages, but not in any way that makes me think they are calling unmanaged code (unless something is going on as part of static initialization?).
2. Interesting information regarding the collector. As far as I can tell, nothing is attempting to switch this on in my environment (you're referring to the <gcConcurrent enabled="true|false"/> configuration element, yes?). Do you know of any way to tell at runtime whether this is enabled?

I've tried switching off instrumentation for just the unit test assemblies; however, I'm still getting the failure. I think for the time being I will need to disable NCrunch for this solution and run my tests via another mechanism.

Nick Hall


Remco
#4 Posted : Thursday, January 31, 2019 9:11:24 PM(UTC)
Something else to try is switching to a different version of VS. MS have been making regular updates to GC behaviour and it's possible you've found a new problem or stumbled over something they've recently fixed.

Unfortunately I don't know of any certain way to know whether the concurrent GC is enabled or not.

If you're still seeing the problem even with NCrunch's instrumentation turned off, then I think it's just a matter of time before you see the problem appear in other runners too. Most likely it's appearing for you under NCrunch because your tests get run more frequently this way.
nshallred
#5 Posted : Friday, February 1, 2019 9:09:27 AM(UTC)
With VS 2019 not too far away, that's probably the direction to go (presumably NCrunch support will be available around the same time?). Switching to an older version of VS is not a viable option due to our source code integration, not to mention our move to .NET Core.

It's curious that the problem reliably repeats under NCrunch but has not been seen in any other test runner. So far I have used Visual Studio's integrated test runner and ReSharper. The tests are run on the build machine using dotnet test and also pass without issue. I've just tried running the same command on my machine and likewise see no test failures.
Remco
#6 Posted : Friday, February 1, 2019 9:30:50 PM(UTC)
nshallred;13019 wrote:

It's curious that the problem reliably repeats under NCrunch but has not been seen in any other test runner. So far I have used Visual Studio's integrated test runner and ReSharper. The tests are run on the build machine using dotnet test and also pass without issue. I've just tried running the same command on my machine and likewise see no test failures.


NCrunch runs your tests orders of magnitude more frequently than other runners (hundreds, if not thousands, of times more often). This tends to result in intermittent problems being kicked up much more frequently.

If you're able to reproduce this problem in a way that you can share with me, I should be able to investigate it further. Unfortunately, I have no leads of my own or suspect areas of the NCrunch code to review in an effort to track this down.

We're working on the VS2019 integration. MS have changed a huge amount in this release, and they're still changing it, so it's a big piece of work. We'll be publishing builds as soon as we reach an acceptable level of stability.
chredenex
#7 Posted : Monday, September 23, 2019 2:14:32 PM(UTC)
I have finally found the cause of this and fixed it (stopped it happening, anyway).

I ran the following PowerShell command:
Code:
# List every environment variable containing "ncrunch" that is visible to this shell
gci env:* | Where-Object -Property Name -Like '*ncrunch*' | Format-List


This gave me back the following environment variables (edited; '...' inserted where I removed details):
Code:

Name  : NCrunch
Value : 1

Name  : NCrunchVSInstallPath.VS2017
Value : c:\program files (x86)\microsoft visual studio\2017\enterprise\

Name  : NCRUNCHENGINE
Value : 1

Name  : NCrunch.InstallPath
Value : c:\program files (x86)\microsoft visual studio\2017\enterprise\common7\ide\extensions\remco software\ncrunch
        for visual studio 2017

Name  : NCrunch.OriginalProjectPath
Value : A csproj...

Name  : nCrunch.TestRuntime.HostProcessId
Value : 5924

Name  : NCrunch.AllAssemblyLocations
Value : nuget folders...

Name  : nCrunch.TestRuntime.ExecutionMapSpecifications
Value : 1,449,19:8,76,3

Name  : NCrunch.OriginalSolutionPath
Value : A sln...


I think the problem was this one:
Code:

Name : nCrunch.TestRuntime.ExecutionMapSpecifications
Value : 1,449,19:8,76,3


In any case, I removed all of these environment variables, restarted the computer, and TestCaseSource tests started to work again in NCrunch. I have no idea how or why these environment variables were set, or why they weren't cleaned up by subsequent upgrades.
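
In case it helps anyone else, here is a sketch of one way to clear them programmatically. It assumes the stray variables ended up at Machine scope (which I can't say for certain) and needs to run with administrative rights:
Code:
using System;
using System.Collections;

class ClearStrayNCrunchVariables
{
    static void Main()
    {
        var machine = EnvironmentVariableTarget.Machine;

        // Remove any NCrunch-related variables that have leaked into the
        // machine-level environment block.
        foreach (DictionaryEntry entry in Environment.GetEnvironmentVariables(machine))
        {
            var name = (string)entry.Key;
            if (name.IndexOf("ncrunch", StringComparison.OrdinalIgnoreCase) >= 0)
            {
                // Passing null as the value deletes the variable from that scope.
                Environment.SetEnvironmentVariable(name, null, machine);
            }
        }
    }
}
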
Remco
#8 Posted : Tuesday, September 24, 2019 1:32:19 AM(UTC)
This is very alarming, and quite a remarkable find ... how did you think to check this!?

These environment variables are fed into the process startup data for the NCrunch test processes. The nCrunch.TestRuntime.ExecutionMapSpecifications variable in particular is an important parameter for the coverage collection system. If these are present in the main system environment variable block, it will cause absolute chaos.

The API used to define these variables will never declare them outside of the transient process-specific block. Do you have any ideas on how they might have been applied at system level? Could there have been some test code that copied them across?

I'm adding a task to make the engine check for these variables at system level and clear them out if it finds them. At least in this way no one should ever see such a problem again.
chredenex
#9 Posted : Tuesday, September 24, 2019 10:40:37 AM(UTC)
Quote:
how did you think to check this!?


Well, it's been broken for months now, so eventually you end up trying everything! The same problem only occurred for one or two of us, so it had to be something environmental - I just didn't think it would be an actual environment variable!

Quote:
Do you have any ideas on how they might have been applied at system level? Could there have been some test code that copied them across?


I've searched in every way I can think of to find any trace of us doing something strange with these variables and I couldn't find anything.

As this started happening a while ago, could it have been an old version of NCrunch that did it? The first mention I can find of it in our Slack is November 7th 2018, when things started falling apart for me in some projects. Unfortunately I can't tell you exactly which version it was, but it was probably reasonably up to date.


You were right - there was some code that fiddled with environment variables. Specifically, it promoted them all from process level to machine level so that other processes in a Docker container could see them. This was called from a unit test in the project referenced by the now-deleted NCrunch.OriginalProjectPath environment variable. The code has since been deleted, which made it a little harder to find. It also explains why only two people were affected: they happened to work on the project while the code was present.
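
To give a sense of what it was doing, the sketch below is a reconstruction for illustration rather than the actual product code:
Code:
using System;
using System.Collections;

static class EnvironmentPromotion
{
    // Promote every process-level environment variable to Machine scope so that
    // other processes in a Docker container can see them. Run from inside an
    // NCrunch test process, this also promotes NCrunch's own transient variables
    // (e.g. nCrunch.TestRuntime.ExecutionMapSpecifications), which then leak into
    // every subsequent test run on the machine.
    public static void PromoteAllToMachineLevel()
    {
        foreach (DictionaryEntry entry in Environment.GetEnvironmentVariables())
        {
            Environment.SetEnvironmentVariable(
                (string)entry.Key, (string)entry.Value,
                EnvironmentVariableTarget.Machine);
        }
    }
}
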
Remco
#10 Posted : Tuesday, September 24, 2019 11:35:01 PM(UTC)
chredenex;13894 wrote:

You were right - there was some code that fiddled with environment variables. Specifically, it promoted them all from process level to machine level so that other processes in a Docker container could see them. This was called from a unit test in the project referenced by the now-deleted NCrunch.OriginalProjectPath environment variable. The code has since been deleted, which made it a little harder to find. It also explains why only two people were affected: they happened to work on the project while the code was present.


Good find. This would explain the issue and confirms that we aren't doing anything too crazy inside NCrunch itself. I'll still see about getting a fix in to stop the product behaving this way if those environment variables do somehow get promoted. The fact that you aren't the only user to encounter this problem suggests there may be others writing tests that do the same thing.
chredenex
#11 Posted : Wednesday, September 25, 2019 7:11:24 AM(UTC)
Quote:
there may be others writing tests that do the same thing.


The promotion happened inside product code that was being tested; the test itself looked innocent enough (it just called a method).

Thanks for your response and for working to protect us from ourselves!