I am a decent but not very advanced .NET developer, so this is more of an expert-to-junior knowledge transfer request.
In Visual Studio you can add multiple projects inside a solution. Of course, these projects will typically carry different namespaces.
My questions are:
Why build more than one project inside a solution?
When is it good or useful to build multiple projects inside a solution?
I suppose you mean more than one project in a solution, right?
We use it mainly from a library perspective. You end up with more than one assembly, and that way you can share or exchange only parts of your application. This is helpful, for example, if you have a bug in your application that touches only one part of the app: you can fix and replace just the affected assembly instead of the whole application.
It allows you to separate parts of an application: your GUI, business logic, and data access can all be separate projects.
In addition, projects within a solution can reference each other with "project references". This ensures they all build with the same configuration: all Debug or all Release. It also means a project is rebuilt when the projects it references change.
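As a rough illustration (all project and type names below are invented), a small layered solution might look like this, with each project holding one layer and referencing only the layer beneath it via a project reference:

```csharp
// MyApp.DataAccess/CustomerRepository.cs  (class library project)
namespace MyApp.DataAccess
{
    public class CustomerRepository
    {
        public string GetCustomerName(int id)
        {
            // A real implementation would query the database here.
            return $"Customer {id}";
        }
    }
}

// MyApp.Business/CustomerService.cs  (class library; project reference to MyApp.DataAccess)
namespace MyApp.Business
{
    using MyApp.DataAccess;

    public class CustomerService
    {
        private readonly CustomerRepository _repository = new CustomerRepository();

        public string GetGreeting(int id) => $"Hello, {_repository.GetCustomerName(id)}!";
    }
}

// MyApp.Gui/Program.cs  (executable; project reference to MyApp.Business)
namespace MyApp.Gui
{
    using System;
    using MyApp.Business;

    internal static class Program
    {
        private static void Main()
        {
            Console.WriteLine(new CustomerService().GetGreeting(42));
        }
    }
}
```

Because these are project references, building MyApp.Gui in Debug builds the other two projects in Debug as well, and a change in MyApp.DataAccess causes the projects above it to be rebuilt.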
I have a VS project/solution (.NET 6.0) that contains a ton of core functionality: WinForms forms, classes, etc.
My intention is to copy/duplicate this project and customize it for each individual application. (If you are curious, this is a project for interacting with collaborative robots. While the core of the project will be similar across multiple robots, each individual robot will need its own tweaking: GUI, functionality, etc.) I would like to keep these as individual projects rather than just adding new robots to the base project; I want to keep it to one project per robot. I have my reasons, from licensing to support.
My question is: what is the best-practice way to copy/duplicate a project and rename it? These are the goals:
Keep the Base/Ref project intact so it can be used as a basis for new projects.
Be able to push each 'new' project to a new location/repo in GitHub.
Any thoughts are greatly appreciated!
I am looking for some guidance on the standard way of implementing plugins in D365 CE or what Microsoft recommends.
I generally follow this practice:
1. Have only one CRM solution and one plug-in assembly (DLL) for all the plugin steps.
2. Have a separate ".cs" file for each plugin in this project (see the sketch after this list).
3. Each plugin corresponds to a specific piece of functionality, so if we want to disable any functionality, it can be done easily without changing the code.
4. Have only one CRM solution for all these plugins.
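For reference, here is a minimal sketch of what one of those per-plugin ".cs" files typically looks like (the class name, entity, and field below are hypothetical; enabling or disabling the corresponding step is done in the Plugin Registration Tool rather than in code):

```csharp
// AccountNumberPlugin.cs - one file, one plugin, one piece of functionality.
using System;
using Microsoft.Xrm.Sdk;

namespace MyCompany.Crm.Plugins
{
    public class AccountNumberPlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

            tracing.Trace("AccountNumberPlugin fired on message {0}.", context.MessageName);

            // Hypothetical logic: stamp a generated account number on the Target entity on create.
            if (context.InputParameters.Contains("Target") &&
                context.InputParameters["Target"] is Entity account)
            {
                account["accountnumber"] = Guid.NewGuid().ToString("N").Substring(0, 8);
            }
        }
    }
}
```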
Looking forward to some expert guidance.
Thanks!
In this video, former MVP Mitch Milam talks through three plugin solution layout options. He recommends the approach you outline above, and that is the one I typically use.
I also generally use a Console app to test and debug plugins. For maximum flexibility, I often put all the business logic into a Visual Studio shared project. Then I reference that shared project from both the plugin project and the Console App.
While the Console app could reference a DLL, having the logic in a shared project also lets me easily use the logic in a workflow project if I want. Ultimately, the shared project gives me the option to run the code as a plugin, a workflow, or a Console app.
Here's an example:
The .Cmd project is the console app. The one with the double diamond icon is the shared project (which cannot be compiled on its own - it must be referenced by one or more compilable projects).
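As a rough sketch of that layout (all names below are invented, and the console app's connection string is just a placeholder), the shared project holds the logic, while the plugin and console projects are thin wrappers around it:

```csharp
// Shared project: the business logic, compiled into whichever project references it.
namespace Contoso.Shared
{
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;

    public static class ContactLogic
    {
        // Hypothetical operation: count active contacts.
        public static int CountActiveContacts(IOrganizationService service)
        {
            var query = new QueryExpression("contact") { ColumnSet = new ColumnSet("contactid") };
            query.Criteria.AddCondition("statecode", ConditionOperator.Equal, 0);
            return service.RetrieveMultiple(query).Entities.Count;
        }
    }
}

// Plugin project: a thin IPlugin wrapper around the shared logic.
namespace Contoso.Plugins
{
    using System;
    using Microsoft.Xrm.Sdk;
    using Contoso.Shared;

    public class ContactCountPlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            var service = factory.CreateOrganizationService(context.UserId);

            var count = ContactLogic.CountActiveContacts(service);
            // ...use the result, e.g. trace it or write it to an entity.
        }
    }
}

// Console (.Cmd) project: the same logic, but easy to run and debug locally.
namespace Contoso.Cmd
{
    using System;
    using Microsoft.Xrm.Tooling.Connector;
    using Contoso.Shared;

    internal static class Program
    {
        private static void Main()
        {
            // Placeholder connection string; use whatever authentication your org requires.
            using (var client = new CrmServiceClient("AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;..."))
            {
                Console.WriteLine(ContactLogic.CountActiveContacts(client));
            }
        }
    }
}
```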
I presently work with a large solution, containing about 100 projects. At least 10 of the projects are executable applications. Some of the library projects are imported as plugins via MEF and reflection rather than with direct references. If a needed plugin's own dependencies are not copied to the output or plugin directory of the executable project using it, we'll get reflection errors at runtime.
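For context, the plugin discovery in our executables looks roughly like the following MEF sketch (the interface and class names here are invented); the composition over a directory catalog is where the missing-dependency errors tend to surface at runtime:

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Contract assembly, referenced by both the executables and the plugin projects.
public interface IAppPlugin
{
    string Name { get; }
    void Run();
}

// In one of the library projects loaded as a plugin (no direct reference from the executable):
[Export(typeof(IAppPlugin))]
public class ReportingPlugin : IAppPlugin
{
    public string Name => "Reporting";

    // This plugin may itself depend on third-party DLLs that must end up
    // next to it in the plugin directory, or loading it fails at runtime.
    public void Run() => Console.WriteLine("Running " + Name);
}

// In an executable project:
public class PluginHost
{
    [ImportMany]
    public IEnumerable<IAppPlugin> Plugins { get; set; }

    public void Compose(string pluginDirectory)
    {
        var catalog = new DirectoryCatalog(pluginDirectory);
        var container = new CompositionContainer(catalog);
        // If a plugin's own dependencies were never copied into pluginDirectory,
        // the failure shows up around here as type-load / composition errors
        // instead of at compile time.
        container.ComposeParts(this);
    }
}
```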
We've already tried or discussed the following solutions, but none of them seem like a good fit:
"Hard" References: Originally, we had the executable projects reference other projects they needed, even if they were going to ultimately be imported as optional plugins. This quickly fell out of favor with team members who needed to make builds that excluded certain plugins and liked to unload those projects to begin with. This also made it difficult to use Resharper or other tools to clean unused references and remove obsolete third party libraries without accidentally blowing away the "unused" references to the needed plugins own dependencies.
Post-build copying (with pre-build "pull"): For a brief period of time, a senior team member set all the plugin projects to xcopy their outputs output themselves to a known "DependencyInjection" folder as post-build events. Projects that needed those plugins would have pre-build events, xcopying each desired plugin to their own output directories. While this meant that the plugin projects "rightly" had no knowledge of where they might be used, this caused two major headaches. First, any time one made a change in a plugin project, they had to separately build (in sequence) the plugin project and then the executable project they would test it in (to get the files to copy over). Rebuild all would be more convenient but far too slow. Second, the continuous integration build would have to have been reconfigured since it compiled everything all in one directory and only cared if everything built successfully.
Post-build copying (push): The present solution started with xcopy and now mostly uses robocopy in post-build events of the plugin projects to copy needed files directly to the plugin folders of the executable projects that use them. This works fairly well in that if one makes a change in a plugin, one can go straight to running with the debugger. Also, the CI build doesn't break, and users disabling certain "optional" plugin projects for various builds don't get build errors from missing references. This still seems hackish, and is cumbersome to maintain in all the separate post-build windows, which are rather small and can't be expanded. When executable projects get moved from a project restructure or renamed, we don't find out about broken references until the next day after hearing results from the overnight automated testing.
"Dummy" projects with references: One idea that was briefly tossed about involved making empty projects for each of the different executable build configurations and going back to the hard references method on those. Each would use its own references to gather up the plugins and their dependencies. They would also have a reference to the actual executable and copy it over. Then, if one wanted to run a particular executable in a particular configuration, you'd run its dummy project. This one seemed particularly bloated and was never attempted.
NuGet: With my limited familiarity with NuGet, this seems like a good fit for packages, except that I wouldn't know how to implement it within a single solution. We've talked about breaking up the solution, but many members of the team are strongly opposed to that. Is it possible to use NuGet with packages coming from within the same solution?
What are the best practices for a situation like this? Is there a better way to manage the dependencies of reflection-loaded plugins than any of the above, or is a refinement of one of the above the best choice?
OK, so in this answer I assume that each developer needs to constantly have all 100 assemblies (in Debug mode) locally to do their job (develop, compile, smoke test, run automated tests).
You mention that Rebuild All takes a long time. Generally this symptom is caused by too many assemblies plus a build process that hasn't been rationalized. So the first thing to do is to try to merge the 100 assemblies into as few assemblies as possible and to avoid things like Copy Local = true. The effect will be a much faster (like 10x) Rebuild All. Keep in mind that assemblies are physical artifacts and are useful only for physical concerns (plug-ins, loading on demand, test/app separation...). I wrote a white book that details my thoughts on the topic: http://www.ndepend.com/WhiteBooks.aspx
Partitioning code base through .NET assemblies and Visual Studio projects (8 pages)
Common valid and invalid reasons to create an assembly
Increase Visual Studio solution compilation performance (up to x10 faster)
Organize the development environment
One of the ideas in the white book's advice is to avoid project references and to reference assemblies instead. This way it becomes your responsibility to fill in Project > right-click > Project Dependencies, which defines the Project > right-click > Project Build Order. If you decide to keep dealing with 100 assemblies, defining this setting takes some effort, but as a bonus a high-level (executable) project can depend on a library used only through reflection, and this will solve your problem.
Did you measure the lines of code in terms of the number of PDB sequence points? I estimate that up to a limit of 200K to 300K, a Rebuild All (with the optimizations described in the white book) should take 5 to 10 seconds on a decent laptop, which remains acceptable. If your code base is very large and goes beyond this limit, you'll need to break my first assumption and find a way for a developer not to need all the assemblies to do their job (in which case we can talk about this further).
Disclaimer: This answer references resources from the site of NDepend, a tool that I created and whose development I now manage.
I have been in a situation like yours. We had almost 100 projects, and we too were using MEF and System.AddIn. In the beginning we had a few solutions. I was working on the core solution, which included the core assemblies and their tests. Each plug-in category was in a separate solution that included contracts, implementations (some plug-ins had more than one implementation) and tests, plus a test host as well as the core assemblies. At some later point we added a solution that included all the projects, and after trying a few of the approaches you mention we decided to do the following:
1. Keep the references that are mandatory.
2. All executable projects were set to output to common locations (one for Debug and one for Release configurations).
3. All projects that should not be referenced were set to output to these common locations.
4. All projects that were referenced by others were left unchanged, and each reference was set with Copy Local = true.
5. Tests were left unchanged.
Although building everything was slow, we didn't have any other problems. Of course, having almost 100 projects is a sign that the design is probably too modular, and, as Patrick advises, we should have tried to compact it.
Anyway, you could try this approach in a couple of hours, and perhaps, instead of setting Copy Local = true, set the output folder of all the projects mentioned in 4 to the common locations. At the time we didn't know that this setting slows down the build process, as Patrick mentions.
PS: We never tried NuGet because we didn't have enough resources or time to experiment with it. It looked promising, though.
We are starting up a new project and I am looking for the "best practices" solution to this same kind of problem. For us, the projects fall into two categories: 1) the platform assemblies, which provide a common set of services across the board, and 2) the verticals, which perform business-specific functions.
In the past we have used a Visual Studio plug-in with a simple UI that allowed developers to specify a common assemblies path to copy the output assemblies to, and then reference all assemblies (wherever they reside, even in a different solution) from that common assemblies folder.
I am looking at NuGet, but the sheer amount of work you have to do to create and maintain NuGet packages is punitive.
It's a very common scenario, and I would be really interested to see how others have addressed it.
I have an app (ASP.NET 3.5/ VS 2010) that works with a database project.
Is there any downside to having the database project as one more project within the app solution?
Is it better to have another solution just for the database project?
No real downside in the scope of this single app. If the same database is used by multiple apps, you might find it easier to reuse by keeping it in its own solution, but even then you could set it up as an external in your source control and reference the latest build from a "lib" directory that gets downloaded when you update source in other solutions.
My rule of thumb is: if one project directly references another project, they should be in the same app solution. If the projects are related but don't reference one another, they should be their own solutions, in separate subfolders of the same main folder. If two multi-project solutions are related, keep them in separate locations, then use Add Existing Project to reference the appropriate project from the other solution.
Most of my projects end up as solutions of their own because I add a testing project. So I split every component into a separate solution, and then reference the appropriate projects from other solutions as needed using Add Existing Project, so I can debug from one solution into another. But they are all kept separate and stand-alone.
We have a lot of different solutions/projects which are managed by different teams. Our solution needs to reference several projects that another team owns. We don't want to add these dependencies as project references because we do not intend to modify that code; we just want to use it. Also, we already have quite a few projects in our solution and don't want to add a bunch more, since that will slow down Visual Studio. So we are building those projects in a separate solution and adding them as file references to our solution.
My question is: how do people manage these types of dependencies? Should I just have some automated process that looks for changes to those projects, builds them, and checks the DLLs into our source control, after which we treat them like other third-party dependencies? Is there a recommended way of doing this?
One solution, although it may not necessarily be what you are looking for, is to have each dependent sub-system perform a release. This release could be in the form of an MSI install, or just a network share of assemblies. When a significant change is made, that team could let you know, and you could run the install or a script to copy the files.
Once you get the release, you could put the assemblies into the GAC; that way you would not have to worry about copying them into your project bin folders.
Another solution, assuming you are using a build server or continuous integration of some kind, is to have a post-build step or process stage the files. Then at any given moment the developers of the other teams could grab the new files, or have a script or batch file pull them down locally.
EDIT - ANOTHER SOLUTION
It might be best to ask why you have these dependencies at all. Do you really need them locally when building your part of the application? Could you mock out the dependencies in your solution, allowing you to code, build, and run unit tests? The actual application would then wire these up in your DEV/Test/Prod environments. Keeping your solution decoupled and dependency-free may be a better approach for the individual team; leave the integration and coupling to when the application runs in a real setting.
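To illustrate that last idea with a hedged sketch (the interface and class names below are invented): your solution compiles against an abstraction plus a fake implementation, and only the composed application in DEV/Test/Prod wires in the other team's real assembly.

```csharp
using System;

// The only thing your solution needs in order to compile.
public interface IPricingService
{
    decimal GetPrice(string sku);
}

// Lives in your solution: good enough for coding, building, and unit tests.
public class FakePricingService : IPricingService
{
    public decimal GetPrice(string sku) => 9.99m;
}

// Your team's code depends on the abstraction, not on the other team's assemblies.
public class OrderCalculator
{
    private readonly IPricingService _pricing;

    public OrderCalculator(IPricingService pricing) => _pricing = pricing;

    public decimal Total(string sku, int quantity) => _pricing.GetPrice(sku) * quantity;
}

// At the application level (DEV/Test/Prod) the composition root would swap in
// the real implementation instead, e.g. new OrderCalculator(new TheirRealPricingService()).
public static class Demo
{
    public static void Main()
    {
        var calculator = new OrderCalculator(new FakePricingService());
        Console.WriteLine(calculator.Total("ABC-123", 3)); // 29.97
    }
}
```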
(Not a complete answer, but still:)
Any delivery is better stored in a file/binary repository, as opposed to a VCS, which is meant to manage source history.
We prefer managing those deliveries in a repository like Nexus, and we use Maven to pull back the right dependencies.
Even though those tools are more Java-oriented, Nexus can store anything, and Maven is only there to read the pom.xml of each artifact and compute the right dependencies.