MSTest: A sensible way to deploy items from a common directory?

I have lost a few hairs trying to deal with DeploymentItem recently.
We have a few common directories for native DLLs, and many tests depend on these.
For C++ projects we use property pages where these paths are defined. These can even be imported into a C# project as well, with some manual editing (as they are MSBuild files). Still, I can't figure out how to utilize them in tests.
Unfortunately, the DeploymentItemAttribute can't use the properties in the sheet, but it can use environment variables. I was hoping to avoid forcing everybody to define global environment variables...
I have seen various suggestions around the net, but haven't really found a simple solution.
Does anybody have a good approach to this?

Anders' answer is a good solution, but in my case:
I don't like the idea of keeping binaries within the source tree.
Many DLLs don't have specific versions, and they are updated on a regular basis.
I somehow ended up with this solution:
First, I included the global VC++ property page into the test project. This must be done manually by adding this directive under the <Project> tag on top of the .csproj:
<Import Project="$(UserProfile)\AppData\Local\Microsoft\MSBuild\v4.0\Microsoft.Cpp.Win32.user.props" />
I now have access to the properties/macros that define the DLL paths in my C++ environment.
I then
added a new subfolder in the test project, say "NativeDlls"
added the needed DLLs as links into the NativeDlls folder
the links are absolute, but can be replaced with macros from the property sheet included above:
<Content Include="$(MyLibLocation)\GDAL18BIN\gdal18.dll">
  <Link>NativeDlls\mylib.dll</Link>
  <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
The DLLs are then ready to deploy:
[TestMethod]
[DeploymentItem(@"NativeDlls")]
public void TestSomeStuff()
{
}
And, as Anders mentions: The remaining work is to set debug/release and 32/64 conditions.
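For completeness, a sketch of how those conditions might look, reusing the same Content/Link mechanism. $(MyLibLocation) comes from the imported property sheet; the Debug/Release subfolder names are assumptions on my part:

<!-- Sketch only: pick the native DLL per configuration; a similar pair of
     groups conditioned on $(Platform) can handle Win32 vs. x64. -->
<ItemGroup Condition=" '$(Configuration)' == 'Debug' ">
  <Content Include="$(MyLibLocation)\Debug\gdal18.dll">
    <Link>NativeDlls\gdal18.dll</Link>
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
<ItemGroup Condition=" '$(Configuration)' == 'Release' ">
  <Content Include="$(MyLibLocation)\Release\gdal18.dll">
    <Link>NativeDlls\gdal18.dll</Link>
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>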

If these are external dependencies used only by this project (not shared between source trees) then I suggest moving them into the source control. Dependencies should be versioned together with the source. The rationale being that you should be able to check out a revision of the source tree (any revision in the history), and it should build. If you have binary dependencies that are not under source control, you will have problems knowing which version of the dependency you need when you build a specific version of the source.
If you can move the dependencies into the source tree (e.g. $svnroot/trunk/dependencies) then you can use test deployment with only relative paths. It will work under TeamCity as well as on any developer machine.
If you cannot version your dependencies or you must have them outside the repository for some other reason, then you can use an environment variable that the test deployment can use. See this MSDN post for an example.
EDIT: moved a comment about managing binary dependencies here
For csproj files I just have a DLL reference in the projects to the DLLs in the lib directory under the source tree (i.e. a reference to ..\lib\log4net.dll). If you want to reference separate libs for separate builds, e.g. different ones for x86/x64 or Debug/Release, VS doesn't support it but MSBuild and the csproj file do: you can add conditional references, but you have to edit the csproj by hand to include, for example, the x86 dependency only when the platform is x86, and so on (see the sketch below).
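A hedged sketch of such hand-edited conditional references; the assembly name and paths are placeholders:

<!-- In the csproj: one reference per platform, only one is active per build. -->
<ItemGroup Condition=" '$(Platform)' == 'x86' ">
  <Reference Include="SomeLib">
    <HintPath>..\lib\x86\SomeLib.dll</HintPath>
  </Reference>
</ItemGroup>
<ItemGroup Condition=" '$(Platform)' == 'x64' ">
  <Reference Include="SomeLib">
    <HintPath>..\lib\x64\SomeLib.dll</HintPath>
  </Reference>
</ItemGroup>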

Related

How to add TypeScript Definitions to .NET Core Nuget Packages

We have an internal NuGet Package that consists of some .NET Code and a TypeScript Definition File (*.d.ts). This is the content of the package:
After installing the package into a new .NET Core project, the folder structure in the solution explorer looks like this.
So far, everything went as expected. But note the small arrow symbols on the "i18n" folder and the "Index.d.ts" file. It looks like they're just links to the actual files. When I click on the d.ts file, the content seems correct. But Visual Studio fails to recognize the declarations in it, so I can't use it in my TypeScript code.
One idea was to include the path to the packages in the tsconfig.json, but that can't be the solution... Any other ideas how to do that?
As far as I know, Definitely Typed packages are not compatible with .NET Core projects. That is because the script files would need to be included in the <contentFiles> element. You can refer to Including content files for more detailed info.
Besides, just as Martin commented, npm is the recommended method of installing Definitely Typed packages:
https://github.com/DefinitelyTyped/DefinitelyTyped#how-do-i-get-them
So, after seeing the replies here and not giving up, I have to put in my approach to this.
As I'm working on a large solution with over 100 subprojects - many of them fast-moving NuGets - I couldn't give up. I wanted to have my .NET object models, including their interface/class representations in TS, and to be able to import both by including one NuGet (and thereby reduce dependency hell a little bit). I have to mention that I tested this only with my own object model, which has no external dependencies - and I tested only on VS2022.
But in this restricted scenario it works without any issues.
On the project containing the TS definitions
Set the build action for the TS definitions you need to include in the NuGet to "Content". This will include them in the NuGet package.
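If the library project is SDK-style, a rough equivalent in the csproj could look like this; the "Types" folder and the TFM segment of the package path are assumptions on my part (the segment should match whatever the copy target on the consumer side expects):

<!-- Sketch: pack the .d.ts files into the contentFiles folder of the NuGet. -->
<ItemGroup>
  <Content Include="Types\**\*.d.ts"
           Pack="true"
           PackagePath="contentFiles\any\net6.0-windows10.0.20348\" />
</ItemGroup>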
On the consumer side
Adjust your package reference and add the following property/value:
<GeneratePathProperty>True</GeneratePathProperty>
This will create an MSBuild property/variable referencing the path to the locally restored NuGet package (important if you're building on multiple, different machines - like on CI pipelines, build servers etc.), allowing you to avoid any hardcoded absolute paths.
The generated property has the following format
$(Pkg<PackageNameWithDotsBeingReplacedByUnderlines>)
So a package named "MyPackage.Hello" would result in the variable $(PkgMyPackage_Hello)
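For example, the package reference for the hypothetical MyPackage.Hello might then look like this (the version number is a placeholder):

<!-- Consumer csproj: opt in to the generated $(PkgMyPackage_Hello) property. -->
<ItemGroup>
  <PackageReference Include="MyPackage.Hello" Version="1.2.3">
    <GeneratePathProperty>True</GeneratePathProperty>
  </PackageReference>
</ItemGroup>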
Now we create a new build target to copy the files from the restored package's contentFiles folder (as the package is restored and we have the extracted path, we can finally target them).
<Target Name="CopyImportedTypes" BeforeTargets="Build">
<ItemGroup>
<TsTypesToCopy Include="$(PkgMyPackage_Hello)\contentFiles\any\net6.0-windows10.0.20348\*.ts" />
</ItemGroup>
<Copy SourceFiles="#(TsTypesToCopy)" DestinationFolder="$(MSBuildProjectDirectory)\AnyProjectSubFolderIfDesired" SkipUnchangedFiles="true" OverwriteReadOnlyFiles="true" />
</Target>
Make sure to adjust the "Include" path to your package (TFM, platform etc.). An easy way to get the relative path is to open up the Solution Explorer, expand your consuming project, expand Dependencies and then Packages, expand the package with your TS definitions and open up the properties of the content files.
This target is executed before the actual build (the BeforeTargets attribute), so we can use the imported types in the build that is happening right this moment. The ItemGroup is nothing more than a definition of the items (in our case, source files) we want to use, stored in @(TsTypesToCopy), which is used by the Copy task.
Thankfully, VS does automatically set new files to the right build action (in most cases), so the brand new ts files should be in the right mode automatically - so we don't have to tweak anything manually.

How to make successive version of a project in Microsoft Visual Studio 2010

Maybe I just can't figure out the right keywords to get an answer out of Google, but here goes.
Say I have a project I'm working on named "Project". For its first version, I have it stored in the folder "Project_Version1", and the name of its solution, project, built exe, etc. is "Project_Version1".
Now I want to make the next version of the project and call it "Project_Version2". To do this currently, I copy the original folder, rename it to "Project_Version2", and then I want to rename all of the other internal stuff to match. Currently I have to do that with a combination of changing names in Windows Explorer and in various property pages in my solution.
There has to be a better way to do this. How does somebody make a second version of their project and store its files separately from the first version? Is there a way to also rename the appropriate files that contain version numbers?
Indeed, there is a better way to do this: it is called version control.
If you check your source into TFS (or git or any other SCM software) all versions will be tracked over time.
If you then need to maintain multiple versions simultaneously (e.g. for a customer base where customers slowly migrate to the bleeding edge and patches are required to fix critical issues in older, supported versions) you can keep these versions as branches and selectively merge changes.
If you do decide source control is a good fit for managing your versions (It brings a lot of other benefits as well), consider reading some posts such as this git tutorial or why use version control.
If you REALLY can't use source control, for some reason, I suggest trying to reference as much as you can using language constructs which can be easily changed (e.g. with C# and a reasonable IDE, a quick refactor, or with other languages some rigorous de-duplication of version-specific data).
Ideally you would have just one definition of the version for your code along with a version definition in the project files (which could be set as a build variable to change quickly).
In this case you would copy the project, change the code version and the version variable for the project, and, assuming good use of relative paths etc. (no hardcoded versions!), a quick rename of a couple of variables should get you going.
Ideally, the bulk of the file names would not reference the version. The only things referencing the version should probably be the containing directory, the project/solution files and a few isolated references in your code.
You can do some pretty neat tricks with MSBuild. Let's say, for instance, that this is your project.
Program.cs:
namespace MyCustomBuild
{
    class Program
    {
        static void Main(string[] args)
        {
            System.Console.WriteLine("Hello World");
        }
    }
}
Create your MSBuild file as shown. build.msbuild:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <MajorVersion>1</MajorVersion>
    <MinorVersion>0</MinorVersion>
  </PropertyGroup>
  <Target Name="Default">
    <CombinePath BasePath="$(MSBuildProjectDirectory)"
                 Paths="Program.$(MajorVersion).$(MinorVersion)">
      <Output TaskParameter="CombinedPaths" PropertyName="OutputDir"/>
    </CombinePath>
    <MakeDir Directories="$(OutputDir)"/>
    <Csc Sources="program.cs"
         OutputAssembly="$(OutputDir)\Program.$(MajorVersion).$(MinorVersion).exe"/>
  </Target>
</Project>
After running msbuild build.msbuild, this is the result:
Ways to make what I posted even better:
Use the MSBuild Extension Pack AssemblyInfo task. This will allow you to change the version info if you have it defined inside an AssemblyInfo.cs file. Or you can look into the GenerateApplicationManifest task, which can set assembly versions, but as of right now the file version of the executable will always be 1.0.0.0
Implement some sort of autoincrement system like this one
You can copy all your source files into the versioned directory using the Copy task (see the sketch below)
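For that last point, a possible extra target - a sketch only, reusing the $(OutputDir) computed by the Default target above:

<!-- Sketch: copy the sources into the versioned directory after Default runs. -->
<Target Name="CopySources" DependsOnTargets="Default">
  <ItemGroup>
    <SourceFiles Include="**\*.cs" Exclude="Program.$(MajorVersion).$(MinorVersion)\**" />
  </ItemGroup>
  <Copy SourceFiles="@(SourceFiles)"
        DestinationFolder="$(OutputDir)\src\%(RecursiveDir)"
        SkipUnchangedFiles="true" />
</Target>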
It's a lot to take in, especially if you aren't familiar with MSBuild, but it's a powerful tool that allows customization Visual Studio can't even imagine! Note that MSBuild can also build a solution file (.sln) directly, even though it's not XML.
Hopefully this is what you are looking for. I do also recommend that you use an official version control, as others have mentioned, but you seemed to have a specific task you wanted automated, which is MSBuild's forte.
Ideally, versioning your project should be done within a source control system since Visual Studio in and of itself is just an IDE.
In the situation of not using a source control system, unfortunately I'm not aware of a better solution than what you are doing currently.

Visual Studio 2008, MSBuild: "replacement" projects

My solution has a library project which needs a special environment to be built (lots of external libraries and tools)... but it is not vital to our application. We'd like to avoid installing these tools when not necessary (most of our developers work on other parts of code).
We have created another project which has the same API, but has an empty implementation and is compilable without those external tools. I'd like to be able to easily switch between those projects and still get all the references in other projects correct.
I don't know VS/MSBuild very well, but I'm willing to learn whatever is necessary. Is it possible? I am looking for ideas... We're using Subversion, and solutions involving some hacks inside the VCS are also welcome.
It sounds as if your library project is one that can be separated from your primary solution, taking the tool baggage with it. Doing that, you could build the specialty solution separately and link the compiled assembly from the main solution.
Create another build configuration for your project.
So you will have at least two build configurations, e.g. Debug_SpecialNeeds and Debug.
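A minimal sketch of how the two configurations could switch the reference in each dependent csproj; the project names and paths follow the example used in the next answer, and VS normally also records <Project> and <Name> metadata on the reference:

<!-- Sketch: reference the real library only in the special configuration. -->
<ItemGroup Condition=" '$(Configuration)' == 'Debug_SpecialNeeds' ">
  <ProjectReference Include="..\RealLibrary\RealLibrary.csproj" />
</ItemGroup>
<ItemGroup Condition=" '$(Configuration)' != 'Debug_SpecialNeeds' ">
  <ProjectReference Include="..\MockLibrary\MockLibrary.csproj" />
</ItemGroup>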
For discussion, I'll assume you have a project directory containing your solution file, a "RealLibrary\RealLibrary.csproj" project file (your "real" library, with the dependencies), and a "MockLibrary\MockLibrary.csproj" file (your "mock" library, with the empty implementations).
If I understand correctly, you want to easily "swap" the MockLibrary for the RealLibrary in your solution, and vice-versa.
The easiest/hackiest way to do this, assuming your solution (and dependent projects) are configured to look for the "RealLibrary.csproj" project, is to rename the "RealLibrary" directory (it doesn't matter to what), and rename the "MockLibrary" directory to "RealLibrary" and rename "MockLibrary.csproj" to "RealLibrary.csproj". This will effectively "trick" your solution and dependent projects into loading the "mock library" even though they are referencing the "real library".
A slightly more complex (and perhaps cleaner) solution is to actually modify your "sln" and "csproj" files to reference "MockLibrary.csproj" instead of "RealLibrary.csproj". In the "sln" file, you'll need to change the path to the project in the section near the top:
Microsoft Visual Studio Solution File, Format Version 10.00
# Visual Studio 2008
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "RealLibrary", "RealLibrary\RealLibrary.csproj", "{E1714F9A-E1D9-4132-A561-AE2B4919391C}"
EndProject
You need to change that path "RealLibrary\RealLibrary.csproj" to "MockLibrary\MockLibrary.csproj". If you're going for completeness, you can change the name as well (or perhaps just use a generic name like "Library" for the name).
Likewise, in the dependent csproj files, you'll need to find all instances of the "ProjectReference" node where you reference "RealLibrary.csproj" and modify the path. These sections look like this:
<ProjectReference Include="..\RealLibrary\RealLibrary.csproj">
  <Project>{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</Project>
  <Name>RealLibrary</Name>
</ProjectReference>
You could relatively easily write some scripts to perform this swap. However, I think there's a deeper problem here that can be addressed more directly. I'll post that as a separate answer, but I wanted you to have the actual answer you were looking for first.
The deeper problem I see here is that your library "needs a special environment to be built", specifically because it depends on "lots of external libraries and tools". I would suggest that you NOT go down the path of creating the mock library, but instead focus on getting the library to build correctly without a special environment. You can achieve this by including all of those dependencies in source control along with your project, and reference those dependencies via relative paths inside your working copy. In my build environments, I try to avoid static environmental dependencies as much as possible (ideally limiting it just to the .NET framework itself).
To get the dependencies into source control, you can either check them directly into the project itself, or you can check them into a different location and then "reference" them in your project via svn:external definitions. In my environment, I have a separate "bin" repository used just for these kind of third party library dependencies, and then many dependent projects can pull them in via externals.
If you can eliminate your library's build-time environmental dependencies, your build will be much more robust and it will be much easier for developers to work with the project.

How do you share external dependencies between Visual Studio solutions?

I have a Java background, so I'm used to having Maven handle all problems around downloading and keeping dependencies up to date. But in the .NET environment I have not yet found a good way to manage all these external dependencies.
The main problem here is that I mass-produce solutions and they all tend to depend on the same third-party DLLs. But I don't want to maintain separate copies of each component under each solution. So I need a way of linking all the different solutions to the same set of DLLs.
I realized that one solution might be to include the external libraries in a "library project" that is included in all solutions and let the other projects reference them through it. (Or just make sure to reference the external DLLs from the same place for all projects.)
But are there any better ways to do this?
(Preferably using some sort of plug-in for Visual Studio.)
I've looked at the Visual Studio Dependency Manager and it seems like a perfect match, but has anyone tried it for real? I've also seen the .NET ports of Maven, but unfortunately I was not too impressed by their status. (But please go ahead and recommend them if you think I should give them another try.)
So what would be the smartest way to tackle this problem?
Update:
I realized that I needed to explain what I meant by linking to the same set of DLLs.
One of the things I'm trying to achieve here is to avoid having the different solutions reference different versions of each component. If I update a component to a new version, it should be updated for all solutions upon the next build. This would force me to make sure all solutions are up to date with the latest components.
Update 2:
Note that this is an old question, asked before tools like NuGet or OpenWrap existed. If anyone is willing to provide a more up-to-date answer, please go ahead and I will change the accepted answer.
1) Find some place to store the assemblies. For example, I store the .Net core assemblies like so:
<branch>\NetFX\2.0527\*
<branch>\NetFX\3.0\*
<branch>\NetFX\3.5\*
<branch>\NetFX\Silverlight 2\*
<branch>\NetFX\Silverlight 3\*
2) Use the ReferencePath property in MSBuild (or AdditionalReferencePath in Team Build) to point your projects at the appropriate paths. For simplicity and easy maintenance, I have one *.targets file that knows about every such directory; all of my projects Import that file (a sketch follows after this list).
3) Make sure your version control strategy (branching, merging, local<->server mappings) keeps the relative paths between your projects & your reference paths constant.
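A rough sketch of what such a shared *.targets file could look like; the file name, the BranchRoot property and the directory list are placeholders mirroring the layout above:

<?xml version="1.0" encoding="utf-8"?>
<!-- Dependencies.targets (hypothetical name); each csproj would add:
     <Import Project="..\..\Dependencies.targets" /> -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- BranchRoot is a placeholder for however the branch root is resolved. -->
    <BranchRoot Condition=" '$(BranchRoot)' == '' ">$(MSBuildProjectDirectory)\..\..</BranchRoot>
    <!-- Directories searched when resolving plain assembly references. -->
    <ReferencePath>$(BranchRoot)\NetFX\3.5;$(BranchRoot)\NetFX\Silverlight 3;$(ReferencePath)</ReferencePath>
  </PropertyGroup>
</Project>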
EDIT
In response to the update in the question, let me add one more step:
4) Make sure every assembly reference in every project file uses the full .Net strong name and nothing else.
Bad:
<Reference Include="Microsoft.SqlServer.Smo">
<SpecificVersion`>False</SpecificVersion>
<HintPath>..\..\..\..\..\..\..\Program Files (x86)\Microsoft SQL Server\100\Shared\Microsoft.SqlServer.Smo.dll</HintPath>
</Reference>
Good:
<Reference Include="Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91, processorArchitecture=MSIL" />
Advantages of the latter format:
Using a HintPath in a collaborative development environment will inevitably lead to situations where "it works for me" but not others. Especially your build server. Omitting it forces you to get your reference paths correct or it won't compile.
Using a weak name invites the possibility of "DLL hell." Once you use strong names then it's safe to have multiple versions of the same assembly in your reference paths because the linker will only load ones that match every criterion. In addition, if you decide to update some assemblies in place (instead of adding copies), then you'll be notified of any breaking changes at compile time instead of whenever the bugs start coming in.
Adding to what everybody else is saying, it basically comes down to two things:
Making sure that all developers have the same versions of external libraries
Making sure that all developers have the external libraries located in the same place (at least, relative to the source code)
As Richard Berg points out, you can use ReferencePath and/or AdditionalReferencePath to help solve #2. If you're using msbuild in your build process (in our case, we're using CruiseControl instead of MS Team Build), you can also pass ReferencePath to it on the command line. To solve #1, I've found svn:externals to be useful (if you're using SVN).
My experience with Maven is that it's way overkill for most purposes.
I usually have a separate folder structure in source control for external or internal dependencies, and these folders hold the assemblies according to build or version number, for example
public\External\libraries\Nunit\2.6\
or
Public\Internal\libraries\Logger\5.4.312\
and inside the solutions, all the projects that need to use any of the dependencies just add a reference to those assemblies in the public internal or external folders (see the sketch below).
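For example, a reference to one of those assemblies might look like this in a csproj; the relative depth is an assumption:

<!-- Sketch: file reference into the shared external folder shown above. -->
<Reference Include="nunit.framework">
  <HintPath>..\..\public\External\libraries\Nunit\2.6\nunit.framework.dll</HintPath>
</Reference>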

How to associate external files with an assembly

Let's say you have a class library project that has any number of supplemental files that also need to be included with the compiled assembly (e.g. simple text files or even a legacy unmanaged DLL that's wrapped by the assembly as an interop layer). While embedding the supplemental files into the assembly itself is relatively straightforward, we have situations where this is not possible or just undesirable. We need to have them as "sidecar" files (i.e. files alongside the assembly, potentially in subdirectories relative to the assembly)
Adding those files to the project with an appropriate value for "Copy to Output Directory" specified appears to be sufficient for projects that are completely self-contained within a solution. But if a separate project in another solution adds a reference to the assembly, it does not automatically pick up its sidecar files. Is there a way in the project to somehow mark the resulting assembly such that anything referencing the assembly will also know it needs to include the associated sidecar files? How do you do this?
You can use al.exe, but there also appears to be a C# compiler option. You want to create a multifile assembly using the /linkresource C# compiler option. Instructions are here, but the command is similar to this:
csc /linkresource:N.dll /t:library A.cs
Where N.dll is a native DLL that will go wherever the managed assembly goes (including into the GAC.) There's a very clear description at the link I provided.
Have you tried creating a setup for your solution? There's an option to include sidecar files targeting the application installation directory.
Another option would be to include the sidecar files in the assembly's resources and unwrap them to disk when run for the first time.
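A minimal C# sketch of that idea, assuming a hypothetical embedded resource name and keeping error handling to a minimum:

// Unpack an embedded resource next to the assembly on first use.
// "MyAssembly.Sidecar.legacy.dll" and the file name are placeholders.
using System.IO;
using System.Reflection;

static class SidecarUnpacker
{
    public static string EnsureOnDisk(string resourceName, string fileName)
    {
        var asm = Assembly.GetExecutingAssembly();
        var targetPath = Path.Combine(Path.GetDirectoryName(asm.Location), fileName);

        if (!File.Exists(targetPath))
        {
            using (var input = asm.GetManifestResourceStream(resourceName))
            using (var output = File.Create(targetPath))
            {
                if (input == null)
                    throw new FileNotFoundException("Embedded resource not found: " + resourceName);
                input.CopyTo(output);   // Stream.CopyTo requires .NET 4+; copy in a loop on older frameworks
            }
        }
        return targetPath;
    }
}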
What if you create a merge module containing the library plus its dependencies? Your installer will then need to reference this module, but you will ensure all of the necessary files will be present.
Unfortunately there doesn't appear to be a lot of built-in support in Visual Studio for this, although I can definitely see the use case.
If you use Subversion for your source control, then you could link in an external reference as an externals definition. This would bring in the source code, and you'd be making a reference to the necessary assembly as a project reference instead of a DLL reference, and then the copy to output directory rules would come into play.
If that's not possible, another solution would be to include commands in the pre/post-build events of your in-solution project to copy the most up-to-date sidecar files from the remote assembly on a build. Of course this comes with the caveat that it doesn't set itself up automatically when you include the DLL in your project; you have to take manual steps to set it up.
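As a hedged sketch of such a post-build step; the other solution's location, the Sidecar folder and all paths are placeholders:

<!-- Post-build event in the consuming project: copy the sidecar files from
     the other solution's output next to this project's output. -->
<PropertyGroup>
  <PostBuildEvent>xcopy /Y /D /I "$(SolutionDir)..\OtherSolution\Library\bin\$(ConfigurationName)\Sidecar\*.*" "$(TargetDir)Sidecar\"</PostBuildEvent>
</PropertyGroup>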
I dealt with this some time ago. It's a common problem.
You can create some post-build actions:
http://www.codingday.com/execute-batch-commands-before-or-after-compilation-using-pre-build-or-post-build-events/
Hope this helps... :)
It appears to me that you're using the wrong type of reference. There are two types of references: Reference and ProjectReference. Reference is an explicit reference to a specific assembly. ProjectReference is a reference to another project (say, a .csproj).
What you're looking for is ProjectReference. VS and the default MSBuild targets are set up to do CopyLocal. If you set CopyToOutputDirectory for your "sidecar" files, any ProjectReferences to this project will also pull in the same files.
I'm not sure if you can do ProjectReferences across solutions in the IDE. I deal a lot with MSBuild, where sln files are not relevant, and this is how I deal with it (see the sketch below).
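A rough sketch of that setup with hypothetical names and paths: the library project marks the sidecar file as content, and the consumer references the project rather than the DLL.

<!-- In the library project: the sidecar file is Content and copied to the
     output directory, so it flows to referencing projects. -->
<ItemGroup>
  <Content Include="Sidecar\legacy.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>

<!-- In the consuming project, possibly edited by hand if it lives in another
     solution; the GUID and path are placeholders. -->
<ItemGroup>
  <ProjectReference Include="..\..\LibrarySolution\Library\Library.csproj">
    <Project>{00000000-0000-0000-0000-000000000000}</Project>
    <Name>Library</Name>
  </ProjectReference>
</ItemGroup>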
What we did in our project is that we created a separate build file to do all that stuff.
In your build file you can have tags to build your main solution, then add tags to copy the files you need after the build.
NAnt is also an option, but right now I'm happy using Rake as my build/debug automation.
Since this cannot be integrated within Visual Studio, what I do is create a task (either in MSBuild, NAnt or Rake) that executes vsjitdebugger.exe at the end to attach it to my Visual Studio when debugging.
These are just my styles for now; you can create your own.
