DB Project/Solution Best Practice - visual-studio-2010

I have an app (ASP.NET 3.5/ VS 2010) that works with a database project.
Is there any downside to having the database project as one more project within the app solution?
Is it better to have another solution just for the database project?

No real downside within the scope of this single app. If the same DB is used by multiple apps, you might find it easier to reuse it by keeping it in its own solution, but even then, you could set it up as an external in your source control and reference the latest build from a "lib" directory that gets downloaded when you update source in other solutions.

My rule of thumb is: if one project directly references another project, they should be in the same solution. If the projects are related but don't reference one another, they should be their own solutions, in separate subfolders of the same main folder. If two multi-project solutions are related, keep them in separate locations and use Add Existing Project to pull the appropriate project from one solution into the other.
Most of my projects end up as multi-project solutions because I add a testing project. So I split every component into a separate solution, and then reference the appropriate projects in other solutions as needed using Add Existing Project, so I can debug from one solution into another. But they are all kept separate and stand alone.

Related

WorkSpace Vs Project Vs Target in Xcode

As I am new to Xcode, I do not clearly understand how Xcode organizes workspaces and projects. From searching on Google, I believe the following:
A workspace contains many projects.
A project contains many targets.
A project should have at least one target.
Are all of the above statements true?
If I have one class (login.h, login.m, login.xib) in my project, can I use the same login files in different targets?
Can I use the same files in different projects?
If there is a tutorial or other article that explains this, please let me know.

Managing Dependencies of Reflected Dependencies

I presently work with a large solution, containing about 100 projects. At least 10 of the projects are executable applications. Some of the library projects are imported as plugins via MEF and reflection rather than with direct references. If a needed plugin's own dependencies are not copied to the output or plugin directory of the executable project using it, we'll get reflection errors at runtime.
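For context, the import side is essentially the standard MEF directory-catalog pattern; the sketch below is illustrative only (the contract name, folder, and host class are not our real ones), but it shows where the missing-dependency failure surfaces:

    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;

    // Illustrative plugin contract; the real interfaces differ.
    public interface IPlugin
    {
        void Run();
    }

    public class PluginHost
    {
        [ImportMany]
        public IPlugin[] Plugins { get; set; }

        public void LoadPlugins(string pluginDirectory)
        {
            // DirectoryCatalog picks up whatever assemblies are sitting in the folder.
            var catalog = new DirectoryCatalog(pluginDirectory);
            var container = new CompositionContainer(catalog);

            // If a plugin assembly references a dependency that was never copied into
            // pluginDirectory (or another probing path), this is where we see
            // ReflectionTypeLoadException / FileNotFoundException at runtime.
            container.ComposeParts(this);
        }
    }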
We've already tried or discussed the following solutions, but none of them seem like a good fit:
"Hard" References: Originally, we had the executable projects reference other projects they needed, even if they were going to ultimately be imported as optional plugins. This quickly fell out of favor with team members who needed to make builds that excluded certain plugins and liked to unload those projects to begin with. This also made it difficult to use Resharper or other tools to clean unused references and remove obsolete third party libraries without accidentally blowing away the "unused" references to the needed plugins own dependencies.
Post-build copying (with pre-build "pull"): For a brief period of time, a senior team member set all the plugin projects to xcopy their outputs to a known "DependencyInjection" folder as post-build events. Projects that needed those plugins had pre-build events that xcopied each desired plugin into their own output directories. While this meant that the plugin projects "rightly" had no knowledge of where they might be used, it caused two major headaches. First, any time one made a change in a plugin project, they had to separately build (in sequence) the plugin project and then the executable project they would test it in (to get the files to copy over). Rebuild All would be more convenient but is far too slow. Second, the continuous integration build would have had to be reconfigured, since it compiled everything into one directory and only cared whether everything built successfully.
Post-build copying (push): The present solution started with xcopy and now mostly uses robocopy in post-build events of the plugin projects to copy needed files directly into the plugin folders of the executable projects that use them. This works fairly well, in that if one makes a change in a plugin, one can go straight to running with the debugger. Also, the CI build doesn't break, and users disabling certain "optional" plugin projects for various builds don't get build errors from missing references. It still seems hackish, though, and is cumbersome to maintain in all the separate post-build event windows, which are rather small and can't be expanded. When executable projects get moved or renamed during a project restructure, we don't find out about the broken references until the next day, after hearing results from the overnight automated testing.
"Dummy" projects with references: One idea that was briefly tossed about involved making empty projects for each of the different executable build configurations and going back to the hard references method on those. Each would use its own references to gather up the plugins and their dependencies. They would also have a reference to the actual executable and copy it over. Then, if one wanted to run a particular executable in a particular configuration, you'd run its dummy project. This one seemed particularly bloated and was never attempted.
NuGet: In my limited familiarity with NuGet, this seems like a good fit for using packages, except I wouldn't know how to implement it internally within a single solution. We've talked about breaking up the solution, but many members of the team are strongly opposed to that. Is it possible to use NuGet with packages coming from within the same solution?
What are best practices for a situation like this? Is there a better solution to managing dependencies of reflected dependencies like this than any of the above, or is a refinement of one of the above the best choice?
OK, so I assume in this answer that each developer needs to constantly have all 100 assemblies (in Debug mode) locally to do their job (develop, compile, smoke test, run automated tests).
You mention that Rebuild All takes a long time. Generally this symptom is caused by having too many assemblies plus a build process that hasn't been rationalized. So the first thing to do is to try to merge the 100 assemblies into as few assemblies as possible and avoid using things like Copy Local = true. The effect will be a much faster (like 10x) Rebuild All process. Keep in mind that assemblies are physical artifacts and that they are useful only for physical concerns (like plug-ins, on-demand loading, test/app separation...). I wrote a white-book that details my thoughts on the topic: http://www.ndepend.com/WhiteBooks.aspx
Partitioning code base through .NET assemblies and Visual Studio projects (8 pages)
Common valid and invalid reasons to create an assembly
Increase Visual Studio solution compilation performance (up to x10 faster)
Organize the development environment
One of the pieces of advice in the white-book is to avoid referencing projects and to reference assemblies instead. This way it becomes your responsibility to fill in Project > right-click > Project Dependencies, which defines Project > right-click > Project Build Order. If you decide to keep dealing with 100 assemblies, defining these settings represents some effort, but as a bonus, a high-level (executable) project can depend on a library that is only used by reflection, and this will solve your problem.
Did you measure the lines of code in terms of the number of PDB sequence points? I estimate that up to a limit of 200K to 300K, doing a Rebuild All (with the optimizations described in the white-book) should take 5 to 10 seconds (on a decent laptop), which remains acceptable. If your code base is very large and goes beyond this limit, you'll need to break my first assumption and find a way for a developer not to need all the assemblies to do their job (in which case we can talk about this further).
Disclaimer: This answer references resources from the site of NDepend, a tool that I created and whose development I now manage.
I have been in a situation like yours. We had almost 100 projects. We too were using MEF and System.AddIn. In the beginning we had a few solutions. I was working on the core solution, which included the core assemblies and their tests. Each plug-in category was in a separate solution that included contracts, implementations (some plug-ins had more than one implementation) and tests, plus a test host as well as the core assemblies. At some later point we added a solution that included all projects, and after trying a few of the approaches you mention we decided to do the following:
1. References that were mandatory were kept.
2. All executable projects were set to output to common locations (one for the Debug and one for the Release configuration).
3. All projects that should not be referenced were set to output to these common locations.
4. All projects that were referenced by others were left unchanged, and each reference was set with Copy Local = true.
5. Tests were left unchanged.
Although building everything was slow, we didn't have any other problems. Of course, having almost 100 projects is a sign that the design is probably too modular, and as Patrick advises, we should have tried to consolidate it.
Anyway, you could try this approach in a couple of hours, and perhaps instead of setting Copy Local = true, try setting the output folder of all projects mentioned in item 4 to the common locations. We didn't know that this setting would slow down the build process, as Patrick mentions.
PS. We never tried using NuGet because we didn't have enough resources and time to experiment with it. It looked promising though.
We are starting a new project and I am looking for the "best practices" solution to this same kind of problem. For us, the projects can be divided into two categories: 1) the Platform assemblies, which provide a common set of services across the board, and 2) the Verticals, which perform business-specific functions.
In the past we have used a Visual Studio plug-in with a simple UI that allowed developers to specify a common assemblies path to copy the output assemblies to, and then reference all assemblies (wherever they reside in a different solution) from that common assemblies folder.
I am looking at NuGet, but the sheer amount of work you have to do to create and maintain NuGet packages is punitive.
It's a very common scenario, and I would be really interested to see how others have addressed it.

When to build/use Multiple Projects within a Solution

I am a good but not very advanced .NET developer. This is more of an expert-to-junior knowledge transfer request.
I was thinking about this: in Visual Studio you can add projects inside a solution. Of course these projects will carry different namespaces.
My questions are:
Why build a project inside a solution?
When is it good/useful to build multiple projects inside a solution?
I suppose you mean more than one project in a solution, right?
We use it mainly from a library perspective. You end up with more than one assembly, and in this way you can share or replace only parts of your application. This is helpful, for example, if you have a bug that touches only one part of your app: you can fix and swap out only the bad assembly instead of the whole app.
It allows you to separate parts of an application. Your GUI, business logic, and data access can all be separate.
In addition, projects within a solution can reference each other with "project references". This ensures they all build with the same configuration: all Debug or all Release. Also, a project is rebuilt when the projects it references change.
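As a rough sketch of that layering (the project, namespace, and type names below are made up for illustration), the business logic lives in a class library project and the UI project consumes it through a project reference:

    // In a class library project, e.g. MyApp.BusinessLogic (hypothetical name):
    namespace MyApp.BusinessLogic
    {
        public interface IOrderService
        {
            decimal CalculateTotal(decimal[] itemPrices);
        }

        public class OrderService : IOrderService
        {
            public decimal CalculateTotal(decimal[] itemPrices)
            {
                decimal total = 0m;
                foreach (var price in itemPrices)
                    total += price;
                return total;
            }
        }
    }

    // In the UI/executable project, which holds a project reference to MyApp.BusinessLogic:
    namespace MyApp.ConsoleUI
    {
        using MyApp.BusinessLogic;

        public static class Program
        {
            public static void Main()
            {
                IOrderService orders = new OrderService();
                System.Console.WriteLine(orders.CalculateTotal(new[] { 9.99m, 20.01m }));
            }
        }
    }

Because it is a project reference, building the UI project also builds the business-logic project in the matching configuration, which is the consistency mentioned above.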

What is the recommended way to set up projects like this?

We are working on a large project. The project has multiple external sites and multiple internal sites all stored in Subversion.
The external sites allow a customer to make requests for various things we provide, pay utility bills, and more. We decided to break many of these functions apart because most of them work completely differently from the others. So this is one Visual Studio solution with the WebUI and the database layer broken into two projects for each function. For instance, utility billing has a Utility.WebUI project and a Utility.Domain project. All DB/business logic is kept in the domain project.
The internal sites bridge the gap between the back-office system (IBM i) and the web database. They will also replace/enhance some of our older RPG programs. In theory they should use the exact same database logic that the external sites use, because they access the same database, right? What is the best way to reference these projects from a different solution? Should I just add a reference to the dll, or should I import the project from the external application solution into the internal application solution?
It comes down to this: we have two developers working on this project. I do most of the back-end coding, and the other developer does most of the GUI coding. So we need to make sure that this project works on multiple workstations.
Does this make sense? Any thoughts?
Use the svn:externals property to pull the shared project into your project(s).
You have to choose between 1) referencing the directory containing the shared project's source code (i.e. where the csproj and cs files are located) or 2) referencing the directory containing the shared project's build output (assembly / dll).
I normally prefer method 1) since it makes modifications to the shared project's source code easier (you can make changes without having to open the shared project's solution in a second instance of Visual Studio). If you don't intend to make changes to the shared project often then method 2) might be better. It reduces compile time and prevents accidental modifications of the shared project's source code. Both methods are fine - matter of taste.
It is recommended for both methods that you version your shared project. i.e. create tags with version numbers and reference the tags, not the trunk. When a new version of the shared project comes out you can update the svn:externals property of your other project(s) with the new version number, run "svn update" to download the new version of the shared project, and recompile. This works especially well if you have a build server for the shared project that does the tagging for you automatically.
I think you can use a sort of "commons" solution that contains the common projects, and then refer to these projects in your main solutions using an SVN external pointing to the project folder in the SVN trunk.
The commons SVN repository must follow the suggested repository structure (trunk, branches, tags) so that you always have stable commons projects.
In this scenario you can also consider using a dependency management tool, such as NPanday or NDepend, where you must declare which version of which assemblies each project depends on; using these tools you can have a local repository (such as Artifactory or Nexus) of binary assemblies to refer to, or choose to use SVN externals to refer directly to source code.

Managing internal 3rd Party Dependencies

We have a lot of different solutions/projects which are managed by different teams. Our solution needs to reference several projects that another team owns. We don't want to add these dependencies as project references because we do not intend to modify that code; we just want to use it. Also, we already have quite a few projects in our solution and don't want to add a bunch more, since that will slow down Visual Studio. So we are building these projects in a separate solution and adding them as file references to our solution.
My question is, how do people manage these types of dependencies? Should I just have some automated process that looks for changes to those projects, builds them, and checks the dlls into our source control, after which we treat them like other 3rd party dependencies? Is there a recommended way of doing this?
One solution, although it may not necessarily be what you are looking for, is to have each dependent sub-system perform a release. This release could be in the form of an MSI install, or just a network share of assemblies. When a significant change is made, that team could let you know, and you could run the install or a script to copy the files.
Once you have the release, you could put the assemblies into the GAC; that way you would not have to worry about copying them to your project bin folders.
Another solution, assuming you are using a build server or continuous integration of some kind, is to have a post-build step or process stage the files. Then at any given moment, the developers on the other teams could grab the new files, or have a script or bat file pull them down locally.
EDIT - ANOTHER SOLUTION
It might be best to ask why you have these dependencies. Do you really need them locally when building your part of the application? Could you mock out the dependencies in your solution, allowing you to code, build, and run unit tests? Then the actual application would wire these up in your DEV/Test/Prod environments. Keeping your solution decoupled and dependency-free may be a better solution for the individual team. Leave the integration and coupling to when the application runs in a real setting.
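As a sketch of what mocking out such a dependency could look like (the interface and class names here are hypothetical, and a hand-rolled fake stands in for a mocking library):

    // Hypothetical abstraction over the other team's component.
    public interface IRatingProvider
    {
        int GetRating(string customerId);
    }

    // Fake used only in this solution's unit tests; no reference to the real assembly is needed.
    public class FakeRatingProvider : IRatingProvider
    {
        public int GetRating(string customerId)
        {
            return 42; // canned value for the test
        }
    }

    // The code under test depends on the abstraction, not on the other team's dll.
    public class CreditCheck
    {
        private readonly IRatingProvider _ratings;

        public CreditCheck(IRatingProvider ratings)
        {
            _ratings = ratings;
        }

        public bool IsApproved(string customerId)
        {
            return _ratings.GetRating(customerId) > 10;
        }
    }

    // In a unit test: new CreditCheck(new FakeRatingProvider()).IsApproved("ACME") returns true.
    // The real provider is wired up only in the DEV/Test/Prod environments where the
    // actual dependency is deployed.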
(Not a complete answer, but still:)
Any delivery is better stored in a file/binary repository, as opposed to a VCS, which is meant to manage source history.
We prefer managing those deliveries in a repository like Nexus, and we use Maven to retrieve the right dependencies.
Even if those tools can seem more Java-oriented, Nexus can store anything, and Maven is only there to read the pom.xml of each artifact and compute the right dependencies.
