How to get the libraries you need into the bin folder when using IoC/DI - visual-studio

I'm using Castle Windsor to do some dependency injection, specifically I've abstracted the DAL layer to interfaces that are now being loaded by DI.
Once the project is developed and deployed, all the binaries will be in the same location, but while I'm developing in Visual Studio, the only ways I can see of getting the dependency-injected project's assembly into the startup project's bin folder are either to have a post-build event that copies it in, or to add a manual reference to the DAL project to pull the file in.
I'm not totally thrilled with either solution, so I was wondering if there was a 'standard' way of solving this problem?
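For concreteness, the post-build-event workaround mentioned above usually boils down to something like this in the DAL project file (a minimal sketch; the project name and relative path are placeholders, not taken from the question):

```xml
<!-- Hypothetical sketch: a post-build event on the DAL project that copies its
     output assembly into the startup (web) project's bin folder.
     "MyApp.Web" and the relative path are illustrative placeholders. -->
<PropertyGroup>
  <PostBuildEvent>xcopy /Y "$(TargetPath)" "$(SolutionDir)MyApp.Web\bin\"</PostBuildEvent>
</PropertyGroup>
```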

Could you set the build output path of the concrete DAL project to be the bin folder of the dependent project?

Mike: I didn't think of that; that could work. I'd have to remember to turn off copy-local for any libraries/projects that are common between them.
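A minimal sketch of that suggestion, assuming the dependent web project sits in a sibling folder called MyApp.Web and that both projects share a reference to some Common.Library (both names are illustrative):

```xml
<!-- Hypothetical sketch for the concrete DAL project's .csproj:
     build straight into the dependent project's bin folder, and set
     Private (copy-local) to false on a library both projects reference
     so it isn't copied twice. Names and paths are placeholders. -->
<PropertyGroup Condition="'$(Configuration)' == 'Debug'">
  <OutputPath>..\MyApp.Web\bin\</OutputPath>
</PropertyGroup>
<ItemGroup>
  <Reference Include="Common.Library">
    <Private>false</Private>
  </Reference>
</ItemGroup>
```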

Related

Creating dll and lib from one project in the solution

I have a DLL project for which I want to create unit tests.
In order to run the unit tests, I need my test project to compile with all the dependencies, and I don't want to add my .cpp files to the test project.
The solution is to add the lib to the references, but I have no clue how I should compile my project as both a DLL and a lib. Is this even possible? I suspect it would be easy with plain cmake/make.
What I could do is create another project with the same files and build it as a .lib. I think this solution is very crude, and it would require me to add new .cpp files to two projects whenever I add something. I would prefer a solution in which I have only one project in the solution and have it build a .dll (the main component) and a .lib for the UT project to reference.
I know I could also make the UT project load my .dll and use that kind of dynamic linkage, but that would make things harder due to the need to create a DLL wrapper to access those functions.
I also considered using batch build with a custom build configuration, but that would require me to batch build every time I want to run the UTs; I'd rather have it happen automatically. Maybe if I set the UT project to be built under a custom configuration, the dependency will also be built with that custom config?
Does Visual Studio 2015 bring any simple solution to this problem?
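One possible sketch, for illustration only (I haven't verified it against VS2015, and the macro usage assumes default VC++ project settings): keep the single DLL project and add a post-build step that runs lib.exe over the project's object files, so a static .lib appears next to the DLL for the unit-test project to link against.

```xml
<!-- Hypothetical addition to the DLL project's .vcxproj: after the DLL links,
     archive the compiled .obj files into a static library for the unit tests.
     $(IntDir), $(OutDir) and $(TargetName) are standard VC++ macros. -->
<ItemDefinitionGroup>
  <PostBuildEvent>
    <Command>lib.exe /NOLOGO /OUT:"$(OutDir)$(TargetName)_static.lib" "$(IntDir)*.obj"</Command>
    <Message>Packing object files into $(TargetName)_static.lib for the UT project</Message>
  </PostBuildEvent>
</ItemDefinitionGroup>
```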

How to handle project.json when creating multiple versions of a .NET class library

I have a .NET class library project that targets UWP applications. I wish to re-purpose it to also support Xamarin.Forms applications.
Initially, I imagined that I could achieve this by creating a new .csproj file, in the same directory as the original, and configure it to reference the same set of source files, but with a different set of dependencies, as appropriate to the target framework.
However, this doesn't seem possible, since each of the projects expects its dependencies to be defined in a project.json file that resides in the project directory. If it were permissible to rename project.json to a framework-specific name, that would solve the problem. But, as far as I can see, the name and the location of project.json is fixed.
Is there a recommended way of creating multiple projects that reference the same codebase, but with different dependencies?
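For illustration, one hedged option (it swaps in the Shared Project mechanism rather than wrestling with project.json, and all names are placeholders): keep the common sources in a shared-items project and import it from two ordinary class-library projects, each of which lives in its own directory and therefore keeps its own project.json and dependency set.

```xml
<!-- Hypothetical MyLib.Shared\MyLib.Shared.projitems: the shared source list. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <Compile Include="$(MSBuildThisFileDirectory)Calculator.cs" />
    <Compile Include="$(MSBuildThisFileDirectory)Services\DataService.cs" />
  </ItemGroup>
</Project>
```

Each of the two .csproj files (say MyLib.Uwp and MyLib.Forms, each in its own folder with its own project.json) would then pull the sources in with a single import:

```xml
<Import Project="..\MyLib.Shared\MyLib.Shared.projitems" Label="Shared" />
```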

Why don't GWT launch configurations look up sources in dependency projects by default?

Why is there a difference in the Default Source Lookup Path between GWT and Java launch configurations?
In multi-module projects, instead of containing the dependency projects themselves, the default source lookup path contains those projects' class folders!
It causes "Source not found" errors when the debugger steps into a dependency project.
I know I can add projects manually. Just want to know why this difference is needed.
The only project in the list is the one associated with the .launch configuration.
GWT needs '.java' source files of dependencies to be able to compile (translate to js) while the Java launcher needs only '.class' compiled files.
I suppose you know GWT has a different way to manage dependencies (through modules within the same project), which is good for some advanced GWT practices like loading a module's js lazily (this feature doesn't work with "foreign" libs/projects).
So this could also be an answer to why the Eclipse GWT tooling doesn't assume you will have multiple projects (but rather multiple modules).
Finally, if you really do have an independent GWT library that you're maintaining, this is an issue, as you said.

How to do dependency management in Visual Studio/MSBuild

There have been many posts on this topic, but I have yet to find the "real" solution.
How does one manage their dependency tree (both compile time and runtime) using MSBuild project files (i.e. Visual Studio project files via project and file references)?
It is well known that project references from child projects will not be copied to an application bin directory if there is no compile time reference, even if there is a runtime dependency, and even if copy-local=true. Hence, any loosely coupled component will not be copied over.
The hack to solve this problem is to include the dependency in the parent project with copy-local=true. However, this basically destroys your dependency tree, as you no longer know where each dependency comes from, and ultimately, as your app grows and morphs, you end up with a version of DLL hell. Your parent project ends up with tens to hundreds of DLLs, most of which are runtime dependencies of DLLs in child projects.
Another hack is to write a custom targets file and call it from every project file: http://blog.alexyakunin.com/2009/09/making-msbuild-visual-studio-to.html. But surely there is a better option. This is such a bread and butter thing. Java devs never have to deal with such trivial issues.
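Just to make the shape of that hack concrete: the per-project hook is typically a one-line import near the bottom of each project file, something like the sketch below (the file name and location are placeholders, not the ones from the linked post).

```xml
<!-- Hypothetical: every .csproj imports a shared targets file that knows how
     to pull indirect / runtime-only dependencies into the output folder. -->
<Import Project="$(SolutionDir)build\CopyIndirectDependencies.targets"
        Condition="Exists('$(SolutionDir)build\CopyIndirectDependencies.targets')" />
```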
From what I can gather, the Microsoft way to solve this problem is to register every dependency in the GAC on every dev, test and production machine. But this is stupid and annoying. I won't bother giving this option an educated rebuttal.
Avoiding the GAC option, how could one use MSBuild to manage a dependency tree that includes runtime only dependencies? How does Microsoft do it? Surely they don't run custom targets files like the one in the link above.
I hope someone from an enterprise .NET background can step up and offer some real advice on this. Otherwise I'm just going to have to rewrite all my build scripts in NAnt (shudder).
Thanks All.
UPDATE
In response to some comments, the following is a practical example of the issue from my current project.
The app is a Web Application project that exposes a suite of WCF services. It has an external domain DLL containing the external service classes and an internal domain DLL containing internal service POCOs, domain objects and DAOs. There is a separate integration DLL containing interfaces (DTOs) for all the internal domain classes that allows us to completely decouple the external and internal domains. The whole thing is wired up with Spring.net. I hope this is clear, let me know if you need more clarification.
My current build process is to use MSBuild to generate a deployment package for the web application (in TFS Build). So while the whole solution is built initially, only the output from the web application gets packaged. Therefore, the Web Application is treated as the dependency root and I expect that any loosely coupled child references should get copied over on build if they are set to 'copy-always=true'.
So the Web Application contains a reference to the external domain DLL which contains a reference to the internal domain DLL which contains many references to 3rd party libraries and various indirect and loosely coupled dependencies required by the 3rd party libraries.
The problem occurs when there is a 3rd party dependency in the internal domain DLL e.g. oracle.dataaccess which is required by NHibernate at runtime. Even when I set 'copy-always=true' on these DLLs, they do not get copied to the Web App package. The only way I can include them in the package is to add these DLLs to the Web App's references. I don't want to do this because I no longer have a meaningful dependency tree.
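For illustration, one hedged alternative (the source path is a placeholder, and I haven't confirmed how it interacts with Web Deploy packaging) would be to include the runtime-only DLLs as content items rather than assembly references, so they flow into bin without appearing in the reference list:

```xml
<!-- Hypothetical sketch for the Web Application's .csproj: copy runtime-only
     dependencies (e.g. the Oracle provider that NHibernate loads at runtime)
     into bin\ without adding assembly references. The source path is a
     placeholder for wherever the third-party DLLs actually live. -->
<ItemGroup>
  <Content Include="..\lib\Oracle.DataAccess.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```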
I hope this makes the issue clearer. Please let me know if anything is unclear. It's hard to describe this sort of stuff.
If anyone is also having a similar issue, please speak up and share your experience.
I really want to give you a better answer but unfortunately you didn't put enough information about your solution/projects and your dependencies, so I will try to give you several ideas and I hope one of them works.
The easiest thing to do, as you said, is to set up a separate folder with all of your dependencies and create a targets file that copies them to your bin folder. If your dependencies don't change frequently, that might work. If another team in your company is building them and they change frequently, this approach is not good.
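A rough sketch of what such a targets file might look like, assuming the shared binaries sit in a lib\ folder at the solution root (the folder name and hook point are assumptions); each project would then import this file:

```xml
<!-- Hypothetical CopyRuntimeDependencies.targets: after each build, copy
     everything from a solution-level lib\ folder into the building project's
     output folder. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="CopyRuntimeDependencies" AfterTargets="Build">
    <ItemGroup>
      <RuntimeDependency Include="$(SolutionDir)lib\**\*.dll" />
    </ItemGroup>
    <Copy SourceFiles="@(RuntimeDependency)"
          DestinationFolder="$(OutputPath)"
          SkipUnchangedFiles="true" />
  </Target>
</Project>
```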
Another simple approach: if you're referencing your dependencies from your solution only, you can change their build output path so that they build directly into the bin folder of your main project. That way you don't have to reference them directly.
Use NuGet. If you have a separate team producing loosely coupled dependencies, it may make sense to set up a local NuGet repository and use it for that: http://juristr.com/blog/2012/04/using-nuget-to-distribute-our-company/
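For completeness, pointing NuGet at an internal feed is just a small piece of configuration (a sketch; the UNC path is obviously a placeholder):

```xml
<!-- Hypothetical NuGet.config: add a company-internal package source alongside
     the public feed so another team's loosely coupled components can be
     consumed as packages. -->
<configuration>
  <packageSources>
    <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
    <add key="CompanyInternal" value="\\buildserver\nuget-packages" />
  </packageSources>
</configuration>
```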
I hope that helps.

Nuget, IoC and scanning

On my current project we are using Nuget to bring in dependencies on things like NHibernate, we use Ninject as our IoC container (although that's not an important detail of the problem I'm trying to solve) and we use Ninject's scanning functionality for establishing our bindings.
This all works fine, but the problem I'm facing is that I've gone to the trouble of exposing my Data Access Layer to the Application Layer in a Persistence Ignorant fashion, yet Ninject needs access to the NHibernate DLLs at scan time. If I weren't using NuGet to pull in dependencies, this could easily be solved with a post-build step to copy the dependencies from wherever I had decided to put them in my source tree.
However, with NuGet my understanding is that the path to the DLLs in the packages directory can change as dependencies are upgraded. I definitely don't want to solve this problem by using NuGet in my Composite Root projects (services, UX, etc...), so I'm trying to figure out a clean way to get the DLLs that I need at scanning time into my Composite Root projects' execution directories.
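One hedged idea, not a verified solution (the package and folder names are placeholders): have the composite-root project copy the persistence assemblies out of the NuGet packages folder with a wildcard, so the copy step keeps working when a package upgrade changes the versioned folder name.

```xml
<!-- Hypothetical target for a composite-root .csproj: glob the NHibernate
     assemblies out of whatever versioned folder NuGet restored them to and
     drop them into this project's output folder, so the IoC scanner can
     find them at runtime. -->
<Target Name="CopyPersistenceAssemblies" AfterTargets="Build">
  <ItemGroup>
    <PersistenceAssembly Include="$(SolutionDir)packages\NHibernate.*\lib\**\*.dll" />
  </ItemGroup>
  <Copy SourceFiles="@(PersistenceAssembly)"
        DestinationFolder="$(OutDir)"
        SkipUnchangedFiles="true" />
</Target>
```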
