Error loading some.dll: Unable to load the test container 'e:\some.dll' or one of its dependencies - visual-studio-2010

I have a VS2010 C# project that references a large set of native .dlls (a commercial Java runtime). These files are included as 'Content' items in the project, since they need to be copied along with the project output.
The code in these libraries is called using P/Invoke; there is no assembly reference.
Every time I compile the solution, the Visual Studio testing framework tries to load all the referenced .dll files, expecting to find .NET assemblies that may contain unit tests. Since they are not .NET assemblies, the following exception is thrown:
Error loading some.dll: Unable to load the test container 'e:\some.dll' or one of its dependencies. If you build your test project assembly as a 64 bit assembly, it cannot be loaded. When you build your test project assembly, select "Any CPU" for the platform. To run your tests in 64 bit mode on a 64 bit processor, you must change your test settings in the Hosts tab to run your tests in a 32 bit process. Error details: Could not load file or assembly 'file:///e:\some.dll' or one of its dependencies. The module was expected to contain an assembly manifest.
This takes a whole lot of time, and I would like to tell Visual Studio not to try to load these files.
How can I tell Visual Studio to stop trying to load these files?

Correct me if I got this wrong:
You are including the P/Invoke target binaries in the VS solution because you want the binaries copied over to the target directory when the solution is built. You want this because the project will execute from the target directory as soon as the VS solution is built. Correct?
Oftentimes VS packages (both default and third-party) try to get smart about the solution content: they follow certain triggers (which are difficult for us to contain and control) and load solution and project content in their own ways. Fighting the battle in this area has a poorer ROI than employing a simpler workaround (below).
While I can't give you an authoritative answer on how to tell VS's test package not to load all binaries, I suggest removing these binaries from the project as 'Content' and leaving them in source control where they are today. Add a post-build task that copies the binaries over to the target directory. This still gives you the same result you get today, but it takes the binaries out of reach of the test probes.
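For example, a minimal sketch of such a post-build event (the "jre" folder is a made-up placeholder for wherever the native DLLs live relative to the project file; $(ProjectDir) and $(TargetDir) are standard VS macros):

    xcopy "$(ProjectDir)jre\*.dll" "$(TargetDir)" /Y /D

Because the DLLs are no longer project items, the test framework has nothing to probe, yet they still land next to the built output.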

Check your configuration settings: right-click on your solution name and click on "Configuration Manager".
This opens the Configuration Manager window.
Check which platform your projects are using; it is better to choose "Any CPU", because that is what the exception you quoted is pointing at.
Hope this can help. Give it a try :)
Thanks

I tried to repro this issue and found that the root cause is that you have set your test project to be compiled for a platform other than Any CPU. Is there any particular reason why you would want this for managed test code?
Unless you change this, you will continue to see this message.
If you want to keep using this configuration for your test project, you will need to update your .testsettings file as suggested in the message.

Sorry if this seems remedial. I am including it for the sake of completeness.
General library behavior
A library can be referenced either in the project file (in which case the compiler emits code to load the references) or dynamically at runtime with LoadLibrary() or P/Invoke calls. When a referenced library is loaded, a function at its entry point runs and can in turn load any libraries it depends on. When loading a library, there is a well-known set of paths that Windows searches, including %WINDIR%\Assembly and the current directory. There's a lot of good conceptual information on Wikipedia about this; I recommend reading it.
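As a concrete illustration, here is a minimal P/Invoke sketch; the DLL name is taken from the question above, but the function name and signature are made up. Nothing is loaded at compile time; the native library is resolved at the first call, using the search order described above.

    using System.Runtime.InteropServices;

    static class NativeMethods
    {
        // The native DLL is not needed at compile time; it is located and loaded
        // on the first call, using the standard Windows DLL search order.
        [DllImport("some.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern int SomeExportedFunction(int value);
    }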
Possible Root Causes
I can't tell from your question if you are having trouble building the application, building the tests, or executing either. Generally I would not expect PInvoke to cause compile errors.
1. Error during app build: VS will generally show you that you have a reference to a DLL it can't find. However, you may be missing a DLL that is needed to satisfy all the dependencies. To resolve, just add a reference to the missing DLL. (This is the simplest issue, so I'm guessing this isn't what you're seeing.)
2. Error during test build: Since your test will reference your application/library, it also needs to have the same references. Usually the easiest way to ensure you are getting everything is to remove all references and add a reference to the project you are testing. It's possible that some additional libraries are necessary for some tests but not for your app/lib itself; these need to be added separately.
3. Error during app execution: This can happen when starting the application, or later when a call to the external library is made, if late binding is used.
4. Error during test execution: This can happen in the same way as with app execution. However, tests can also be "partially built" to execute only a small number of tests. In these cases, some files may not be copied. Using the [DeploymentItem()] attribute, you can specify that a test requires the presence of certain files from the test or app/lib project in order to function. MSDN describes how this can be done (a minimal sketch follows this list).
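For case 4, a hedged sketch of the [DeploymentItem] usage (the class, test, and relative file path are purely illustrative):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class NativeInteropTests
    {
        // Copies some.dll into the test deployment directory before this test runs,
        // so that the P/Invoke call can resolve the native library.
        [TestMethod]
        [DeploymentItem("some.dll")]
        public void CallsIntoNativeLibrary()
        {
            // ...exercise the code that P/Invokes into some.dll...
        }
    }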
Resolution
For #1 & #2 the solution lies in adjusting the references in the project.
For #3 & #4, it may get trickier. There is a similar question to yours regarding Windows Mobile here which you may find useful, especially referring to using dumpbin to list out library dependencies. You can also use procmon from SysInternals to monitor file access during compile or load to see which files are not found. Then you can either include the missing file, or remove the library referencing it.
Good luck. Hope this helps.

Related

Avoiding building twice when using a shared project together with build-generated code

I have a Visual Studio solution with multiple projects. One generates code files as part of pre-build (gRPC classes via Grpc.Tools). There is also a shared project that extends the partial classes built as part of that pre-build (a small sketch of this setup is at the end of the question).
However, sometimes, for one reason or another - like compiling the client half of this (the client uses the shared project to extend its own classes) - compilation errors out because the shared project can't find the generated classes yet; presumably they don't exist yet. It's easily fixed by compiling the project twice.
Is there something I can do in this scenario? Is it possible to somehow move validating/compiling the shared project "further down" the compilation pipeline? Or even just set that particular project to try to compile twice if there's an error? Or is this the kind of thing that, realistically, I should just live with given what I'm doing? I haven't found any other references to this problem. It's not that big of an issue and it wouldn't happen very often, but I'd like to handle it reasonably if I can.
Edit
If I wasn't clear, this is a shared project, as in a .shproj, a project that is not compiled separately. The project that references it includes it and builds it all together as one.
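To make the setup concrete, here is a minimal sketch; GreeterClient is a hypothetical type name standing in for whatever Grpc.Tools generates.

    // Generated at pre-build by Grpc.Tools into the referencing project
    // (shape is illustrative only):
    public partial class GreeterClient
    {
        // ...generated members...
    }

    // Hand-written file in the shared (.shproj) project. It is compiled as part
    // of whichever project includes the shared project, so it needs the generated
    // half of the class to exist by the time that project builds:
    public partial class GreeterClient
    {
        public string Describe()
        {
            return "extended via the shared project";
        }
    }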
If project B depends on project A, then project A must be built before project B. Visual Studio is smart enough to figure out the build order this way. Incidentally, this is also one of the reasons (among many) why circular dependencies simply cannot work.
I suspect that your projects are currently not linked via a dependency, as this issue wouldn't occur if there were such a link. Perhaps your second project is accessing the first project's files via the file system? That's just a guess though.
You can use this "A before B which depends on A" behavior of the build process to your advantage. Have project B (i.e. the project you need to go second) add project A (i.e. the project you need to go first) as a dependency. This forces VS to build them in the appropriate order.
Some remarks:
I am unsure whether VS is able to omit dependencies that you add but do not actually depend on (i.e. you never reference their content). I can't find any confirmation on this point (but absence of proof is not proof of absence!). Even if that is the case, it could easily be worked around by having a dummy class in B that actually references and uses something from project A (see the sketch after these remarks).
Keep in mind that during a regular build, VS does not rebuild projects that have not changed since the last build. If this is an issue for you (unsure if it is; you didn't add enough context), always rebuild or clean so that a new build is triggered.
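Here is a hedged sketch of that dummy-reference workaround; the type names are made up, and any reference to a public type from project A would do.

    // Lives in project B; its only purpose is to make B genuinely depend on A,
    // so MSBuild cannot build B before A. ProjectA.SomeTypeFromA is a
    // hypothetical public type defined in project A.
    internal static class BuildOrderAnchor
    {
        internal static readonly System.Type Anchor = typeof(ProjectA.SomeTypeFromA);
    }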
However, sometimes, for one reason or another - like compiling the client half of this (the client uses the shared project to extend its own classes) - compilation errors out because the shared project can't find the generated classes yet; presumably they don't exist yet. It's easily fixed by compiling the project twice.
That it happens only sometimes and can be fixed by "trying again" points at one thing: you have a race condition. But a race condition during compilation is not something I have heard of or encountered before.
I have a few possible culprits. But in the end, race conditions are notoriously hard to debug:
- Maybe the compiler that deals with the shared project returns before it is finished (which should be impossible), or
- Maybe something causes the main projects to compile before the shared project's files are ready.
- Maybe a third-party tool, like a virus scanner or automatic backup tool, interferes?
- Maybe the shared project's compiled files are hosted on a network drive, and there is sometimes just the slightest delay between "compiled" and "visible to all other computers on the network"?
Usually the proper mechanisms for dependent compilation should deal with such issues. That suggests that what you have there is probably not the most stable setup.

What is the $RANDOM_SEED$ file generated by a Visual Studio build of a C# solution?

We noticed that on a certain dev machine a Visual Studio (2015 update 3) debug build of a C# solution was generating a $RANDOM_SEED$ file alongside every built DLL.
The content of the file is just a single number e.g.
1443972318
Deleting the file(s) then rebuilding resulted in the file being regenerated, with a different number.
This behaviour was also observed when rebuilding a single project in the solution (one which has only the standard C# project refs/dependencies + System.Management).
Note that running a command line build e.g.
msbuild <sln-file>
did not regenerate the file (for build of complete solution or single project).
After a restart of VS, the file is no longer regenerated.
As far as we know this file name is not used in any of our source code, post build steps or internal dependencies.
There are quite a few dependencies on .NET framework classes, including Random and RNGCryptoServiceProvider, and also external dependencies. We don't have complete source code for all these so it's not possible to check exhaustively which if any of the dependencies are responsible.
This is a bit of a shot in the dark but the question is has anyone seen anything similar to this?
EDIT
I'm not surprised this has been downvoted - I appreciate it is pretty open ended, but as I'm currently not able to reproduce this and as it could have potentially serious consequences (random number generator attack?) I have posted it anyway. If I am able to repro I will of course update here.
I have the same file.
After a short investigation I found the culprit:
this file is created by the NUnit 3.x test adapter.
(You can check this in AdapterSettings.cs in the NUnit adapter source code.)
The file is used by NUnit to ensure that we use the same random seed value for generating random test cases in both the discovery and execution processes. This is required because the IDE uses two different processes to execute the adapter. It's not actually required (or created) when running the adapter under vstest.console.exe.
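For context, here is a hedged sketch (not taken from the adapter source) of the kind of NUnit feature the shared seed keeps stable, so that the test cases discovered by the IDE are the same ones that later get executed:

    using NUnit.Framework;

    public class RandomizedTests
    {
        // NUnit generates five cases with random values between 0 and 100; the
        // seed recorded in $RANDOM_SEED$ keeps discovery and execution generating
        // the same five cases.
        [Test]
        public void Works_For_Random_Inputs([Random(0, 100, 5)] int value)
        {
            Assert.That(value, Is.InRange(0, 100));
        }
    }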

VS 2010 project reference looking for incorrect version

We have recently taken over a project from an outsourcing company. This project uses Moles and Pex for unit testing, but since we have not had the project for long, I am not very familiar with the frameworks.
That being said, we are busy upgrading this project to run in .Net 4. I have resolved most of the issues that have jumped out, but there is one that I cannot get a handle on. Some of the unit tests cannot compile because of the error:
Could not load file or assembly 'Example.Assembly, Version=0.0.0.0,
Culture=neutral, PublicKeyToken=null' or one of its dependencies. The
system cannot find the file specified.
The part that baffles me is that it is a project reference and the assembly is being copied to the output directory of the unit test. Most of the other project references are found and I cannot spot any difference between the ones that work and the ones that do not. I am not sure if this problem has to do with the pex/moles frameworks, but I thought I would mention it.
I have tried the usual things of removing and adding all the references and regenerating the moles assemblies.
Has anyone else run into this problem? Any help would be greatly appreciated.
EDIT1: OK, after some more investigation into the build output, it appears that it is not Moles, but the .accessor files that are not generated correctly. I get the exact same problem as asked in Unit test project cannot find assembly under test (or dependencies), but unlike his problem, mine does not go away after deleting the accessor.
EDIT2: Turns out it is a program called Publicize.exe which falls over with that error. Still no idea why, though. Looking at the Fusion logs, it looks like it does not search under the working directory for the DLL that it is trying to generate the accessors for. Running it manually on a bunch of assemblies from our solution, I find it works on some but not on others. I have not been able to identify a difference between the ones that work and the ones that don't, though.
Thanks
Ah, yes. I have read this story many times, and have the tee shirt. I run through my usual Moles first-aid kit when encountering any issue, including this one.
Perhaps, this question will provide some help: Am I the only one getting "Assembly Not Available in the Currently Targeted Framework"?
- Ensure the Moles framework is properly installed on the workstation and/or build server
- Ensure the Moles assemblies are being built (see the excluded "Moles Assemblies" directory)
- Check your build profile -- it may need to be set to the full framework profile
- Triple-check your output destinations and post-build commands -- I have seen some solutions that copy the output to another location
- Try using the Visual Studio Pex/Moles extension, if you are not already doing so
An invasive fix-all process is to simply create an all-new solution, projects, and test projects, and then copy the existing code files into them. It's surprising how many issues can be resolved for various project-related errors. Basically, a hard reboot for the entire solution.
Since you are updating to .NET 4, you may as well go to 4.5 and use the productized version of Moles, called "Fakes". You'll find Fakes in the Visual Studio 2012 release candidate. This significant feature hasn't received much attention.

Visual Studio 2008 Unnecessary Project Building

I have a C# project which includes one exe and 11 library files. The exe references all the libraries, and lib1 may reference lib2, lib3, lib4, etc.
If I make a change to a class in lib1 and build the solution, I assumed that only lib1 and the exe would need to be rebuilt. However, all DLLs and the exe are built when I want to run the solution.
Is there a way that I can stop the dependencies from being built if they have not been changed?
Is the key this phrase: "However, all DLLs and the exe are built when I want to run the solution"?
Visual Studio will always try to build everything when you run a single project, even if that project doesn't depend on everything. This choice can be changed, however. Go to Tools|Options|Projects and Solutions|Build and Run and check the box "Only build startup projects and dependencies on Run". Then when you hit F5, VS will only build your startup project and the DLLs it depends on.
I just "fixed" the same problem with my VS project. Visual Studio did always a rebuild, even if didn't change anything. My Solution: One cs-File had a future timestamp (Year 2015, this was my fault). I opened the file, saved it and my problem was solved!!!
I am not sure if there is a way to prevent dependencies from being built. You can find some info here, like setting CopyLocal to false and putting the DLLs in a common directory:
Optimizing Visual Studio solution build - where to put DLL files?
We had a similar problem at work. In post-build events we were manually embedding manifests into the outputs in the bin directory. Visual Studio was copying project references from the obj dir (which weren't modified). The timestamp difference triggered unnecessary rebuilds.
If your post-build events modify project outputs, then either modify the outputs in both the bin and obj dirs, OR copy the modified outputs from the bin dir on top of those in the obj dir.
You can uncheck the build option for specified projects in your Solution configuration:
You can create your own solution configurations to build specific project configurations...
We actually had this problem on my current project; in our scenario, even running unit tests (without any code changes) was causing a recompile. Check your build configuration's "Platform".
If you are using "Any CPU" then for some reason it rebuilds all projects regardless of changes. Try using processor-specific builds, i.e. x86 or x64 (use the platform specific to your machine's architecture). This worked for us with x86 builds.
Now, after I say this, some propeller-head is going to come along and contradict me, but there is no way to do what you want to do from Visual Studio. There is a way of doing it outside of VS, but first, I have a question:
Why on earth would you want to do this? Maybe you're trying to save CPU cycles, or save compile time, but if you do what you're suggesting you will suddenly find yourself in a marvelous position to shoot yourself in the foot. If you have a library 1 that depends upon library 2, and only library 2 changes, you may think you're OK to only build the changed library, but one of these days you are going to make a change to library 2 that will break library 1, and without a build of library 1 you will not catch it in the compilation. So in my humble opinion, DON'T DO IT.
The reason this won't work in VS2005 and 2008 is because VS uses MSBuild. MSBuild runs against project files, and it will examine the project's references and build all referenced projects first, if their source has changed, before building the target project. You can test this yourself by running MSBuild from the command line against one project that has not changed but with a referenced project that has changed. Example:
msbuild ClassLibrary4.csproj
where ClassLibrary4 has not changed, but it references ClassLibrary5, which has changed. MSBuild will build lib 5 first, before it builds 4, even though you didn't mention 5.
The only way to get around all these failsafes is to use the compiler directly instead of going through MSBuild. Ugly, ugly, but that's it. You will basically be reduced to re-implementing MSBuild in some form in order to do what you want to do.
It isn't worth it.
Check out the following site for more detailed information on when a project is built as well as the differences between build and rebuild.
I had this problem too, and noticed these warning messages when building on Windows 7 x64, VS2008 SP1:
cl : Command line warning D9038 : /ZI is not supported on this platform; enabling /Zi instead
cl : Command line warning D9007 : '/Gm' requires '/Zi'; option ignored
I changed my project properties to:
C/C++ -> General -> Debug Information Format = /Zi
C/C++ -> Code Generation -> Enable Minimal Build = No
After rebuilding I switched them both back and dependencies work fine again. But prior to that no amount of cleaning, rebuilding, or completely deleting the output directory would fix it.
I don't think there's a way for you to do it out of the box in VS. You need this add-in:
http://workspacewhiz.com/
It's not free but you can evaluate it before you buy.
Yes, exclude the non-changing bits from the solution. I say this with a caveat, as you can compile in a way where a change in build number for the changed lib can cause the pieces that weren't built to break. This should not be the case as long as you do not break the interface, but it is quite common because most devs do not understand interfaces in the .NET world. It comes from not having to write IDL. :-)
As for X projects in a solution: no, you can't stop them from building, as the system sees a dependency has changed.
BTW, you should look at your project and figure out why your UI project (I assume it is UI) references the same library as everything else. A good dependency model will show the class(es) that should be broken out as data objects or domain objects (I have made an assumption that the common dependency is some sort of data object or domain object, of course, but that is quite common). If the common dependency is not a domain/data object, then I would rethink my architecture in most cases. In general, you should be able to create a path from UI to data without common dependencies other than non-behavioral objects.
Not sure of an awesome way to handle this, but in the past if I had a project or two that kept getting rebuilt, and assuming I wouldn't be working in them, I would turn the build process off for them.
Right click on the sln, select configuration manager and uncheck the check boxes. Not perfect, but works when Visual Studio isn't behaving.
If you continue to experience this problem, it may be due to a missing or out of date calculated dependency (like a header) that is listed in your project, but does not exist.
This happens especially commonly after migrating to a new version (for example, from 2012 to 2013), because VS may have recalculated dependencies in the conversion, or you are migrating to a new location.
A quick check is to double-click every file in the offending project from Solution Explorer. If you discover a file that does not exist, that is your problem.
Failing a simple missing file: You may have a more complicated build date relationship between source and target. You can use a utility to find out what front-end test is triggering the build. To get that information you can enable verbose CPS logging. See: Andrew Arnott - Enable C++ and Javascript project system tracing (http://blogs.msdn.com/b/vsproject/archive/2009/07/21/enable-c-project-system-logging.aspx). I use the DebugView option. Invaluable tool when you need it.
(this is a C# specific question, but a different post was merged as identical)

Problem with VSTS UnitTesting. Can't supply C++ DLLs

I am using the VSTS unit testing platform. I am trying to test a method that references assemblies which in turn contain DllImports to C++ DLLs.
In order for it to work I need to copy the C++ DLLs into the same directory the EXE and managed DLLs run from.
Of course, when I use the same code in a unit test I also need to supply those DLLs.
I found out that the unit test framework is using $(Solution)\TestResults[WorkSpace] [DateTime]\Out as a working directory.
If I manually copy the C++ DLLs to this directory, the unit test works like a charm.
The problem is that every time the unit test runs it creates a new directory.
Has anybody encountered this? Do you have a solution?
Thanks,
Ariel
As Steve D mentions, deployment items are the answer here. You can either put them on the class or test method using the attribute, or use the test run configuration to add them, so that when any tests are run from that solution they will be deployed.
The other option is to make sure they're somewhere in the path, so that the standard Windows lookup rules for DLLs apply and the runtime is able to locate them.
Why is this a problem? Because there is little to no metadata from the project to the native DLL -- we don't know to pick it up. The only option really would be to dive through all the types in the deployed managed DLLs looking for the DllImport attribute. This would, however, fail if you are doing explicit DLL loads in the managed code.
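A hedged sketch of what such a scan might look like (the assembly path is illustrative); it finds declared P/Invokes but, as noted, would miss explicit LoadLibrary calls:

    using System;
    using System.Reflection;
    using System.Runtime.InteropServices;

    static class PInvokeScanner
    {
        static void Main()
        {
            // The path is illustrative; point it at a deployed managed test assembly.
            Assembly assembly = Assembly.LoadFrom(@"MyTests.dll");
            foreach (Type type in assembly.GetTypes())
            {
                MethodInfo[] methods = type.GetMethods(
                    BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Static);
                foreach (MethodInfo method in methods)
                {
                    // DllImport is a pseudo-custom attribute, but reflection still surfaces it.
                    var import = (DllImportAttribute)Attribute.GetCustomAttribute(
                        method, typeof(DllImportAttribute));
                    if (import != null)
                        Console.WriteLine(type.FullName + "." + method.Name + " -> " + import.Value);
                }
            }
        }
    }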
You could try using a [DeploymentItem] attribute. It allows you to specify a relative path from the solution file which will get copied to the test output directory.
