Dynamics CRM 2011: 5 developers, 5 databases - how to sync solutions?

We are 5 developers working today with 1 database.
We always have one async service running to allow debugging; when a developer wants to debug async code, he announces to the others that he is hijacking the async service to his machine until he finishes debugging.
We want to switch to a database per developer, but there are a lot of issues with that, for example syncing schema changes / solutions with the other programmers.
What is the best practice for a large team of developers? Is there any tool / methodology that works best for large teams?
Also, in general, what is the best practice for large teams developing Dynamics CRM 2011?
Thanks

Typically, I have worked/advised the following:
All devs work on their own virtual system. Much easier debugging. No trampling on or coordinating with others. I use VirtualBox.
Work is exported (unmanaged solutions) into a common build system.
Work is merged into the relevant managed solution(s) in build.
Managed solution(s) exported from build and applied to test / uat / pre-production etc.
Managed solution(s) applied to production environment.

Highly recommended reference: Microsoft released a very thorough white paper on lifecycle management, ALM for Microsoft Dynamics CRM 2011.
A typical development flow could be
Developers develop against their own personal development organization (Online/On-premise), in a solution with the same publisher / name
They export the developer solution
They unpack the zip file into the XML structure
And check it into source control, merging it with the master version
A typical deployment into the integration organization could be
Get a latest version of the XML structure from source control
Package it into a .zip solution
Import it into the integration organization
This way, you have a full history of all changes, linked to the developers, and you can make controlled merges, using merging tools you're familiar with.
A developer can always get a latest version from source control, package it and deploy it in his own development organization.
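The unpack/pack steps above map to the SolutionPackager tool that ships with the CRM SDK. As a rough sketch (in Python, with a placeholder SDK path and solution names - adapt to your own layout), the two directions look like this:

```python
# Sketch: round-tripping a CRM solution through source control with the
# SolutionPackager tool from the CRM SDK. The SDK path and solution names
# are placeholders.
import subprocess

SOLUTION_PACKAGER = r"C:\CrmSdk\bin\SolutionPackager.exe"  # assumed SDK location

def extract(zip_file: str, src_folder: str) -> None:
    """Unpack an exported solution .zip into its XML structure for check-in."""
    subprocess.run([SOLUTION_PACKAGER, "/action:Extract",
                    f"/zipfile:{zip_file}", f"/folder:{src_folder}"], check=True)

def pack(src_folder: str, zip_file: str) -> None:
    """Repackage the XML structure from source control into an importable .zip."""
    subprocess.run([SOLUTION_PACKAGER, "/action:Pack",
                    f"/zipfile:{zip_file}", f"/folder:{src_folder}"], check=True)

extract("MySolution.zip", "src/MySolution")     # developer: after export
pack("src/MySolution", "MySolution_build.zip")  # build: before import
```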

Related

How to manage stable binaries and avoid risk of CI rebuilds when install packaging?

I am looking for a tool to manage the collection of binary files (input components) that make up a software release. This is a software product and we have released multiple versions each year for the last 20 years. The details and types of files may vary, but this is something many software teams need to manage.
What's a Software Release made of?
A mixture of files go into our software releases, including:
Windows executables/binaries (40 DLLs and 30+ EXE files).
Scripts used by the installer to create a database
API assemblies for various platforms (.NET, ActiveX, and Java)
Documentation files (HTML, PDF, CHM)
Source code for example applications
The full collected files for a single version of the release are about 90MB. Most are built from source code, but some are 3rd party.
Manual Process
Long ago we managed this manually.
When starting each new release the files used to build the last release would be copied to a new folder on a shared drive.
The developers would manually add or update files in this folder (hoping nothing was lost or deleted accidentally).
The software installer script would be compiled using the files in this folder to produce a SETUP.EXE (output).
Iterate steps 2 and 3 during validation & testing until release.
Automatic Process
Some years ago we adopted CI (building our binaries nightly or on-demand).
We resorted to putting 3rd party binaries under version control since they usually don't change as often.
Then we automated the process of collecting & updating files for a release based on the CI build outputs. Finally we were able to automate the construction of our SETUP.EXE.
Remaining Gaps
Great so far, but this leaves us with two problems:
Rebuilding Assemblies: The CI mostly builds projects when something has changed, but when forced it will re-compile a binary that doesn't have any code change. The output is a fresh build of a binary we've previously tested (hint: should we always trust these are equivalent?).
Latest vs Stable: Mostly our CI machine builds the latest version of each project. In some cases this is ok, but often we want to release an older tested or stable version. To do this we have separate CI projects for the latest and stable builds - this works but is clumsy.
Thanks for your patience if you've got this far :-)
I Still Haven't Found What I'm Looking For
After some time searching for solutions it seems it might be easier to build our own solution, but surely someone else has solved these problems before!?
What we want is a way to store and manage binary files (either outputs from CI, or 3rd party files) such that each is tagged with a version (v1.2.3.4) that allows:
The CI to publish new versions of each binary (but reject rebuilt versions that already exist).
The development team to make a recipe for a software release (kinda like NuGet packages.config) that specifies components to include:
package name
version
path/destination in the release folder
The automatic packaging script to use the recipe to collect the required files and compile the install package (e.g. SETUP.EXE) - see the sketch after this list.
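To make the recipe idea concrete, here is a minimal sketch in Python of what such a recipe and collector script could look like. The JSON format, the artifact-store layout, and all names/paths are assumptions for illustration, not an existing tool:

```python
# Sketch: a hypothetical release "recipe" and a collector that resolves each
# component from an artifact store into the release folder. The recipe format,
# the store layout, and all names/paths are assumptions, not an existing tool.
import json
import shutil
from pathlib import Path

RECIPE = json.loads("""
[
  {"package": "CoreEngine",   "version": "1.2.3.4", "dest": "bin"},
  {"package": "ApiDotNet",    "version": "2.0.1.0", "dest": "api/dotnet"},
  {"package": "UserGuidePdf", "version": "1.2.0.0", "dest": "docs"}
]
""")

STORE = Path(r"\\server\artifact-store")  # assumed layout: <package>/<version>/<files>

def collect(release_dir: Path) -> None:
    """Copy every component named in the recipe into the release folder."""
    for item in RECIPE:
        src = STORE / item["package"] / item["version"]
        dst = release_dir / item["dest"]
        dst.mkdir(parents=True, exist_ok=True)
        for f in src.iterdir():
            if f.is_file():
                shutil.copy2(f, dst / f.name)

collect(Path("release/1.2.3.4"))
# The installer compile step (producing SETUP.EXE) then runs against release/1.2.3.4.
```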
I am aware of past debates about storing binaries in a VCS. For now I am looking for a better solution. That approach does not appear ideal for long-term ongoing use (e.g. how to prune old binaries)... amongst other issues.
I have tried some artifact repositories currently available. From my investigation these provide a solution for component/artifact storage and version control. However they do not provide tools for managing a list of components/artifacts to include in a software release.
Does anybody out there know of tools for this?
Have you found a way to get your CI infrastructure to address these remaining issues?
If you're using an artifact repository to solve this problem, how do you manage and automate the process?
This is a very broad topic, but it sounds like you want a release management tool (e.g. BuildMaster, developed by my company Inedo), possibly in conjunction with a package management server like ProGet (which you tagged, and is how I discovered this question).
To address some specific questions you have, I'll associate it with a feature that would solve the problem:
A mixture of files go into our software releases, including...
This is handled in BuildMaster with artifacts. This video gives a basic overview of how they are manually added to releases and deployed to a file system: https://inedo.com/support/tutorials/buildmaster/deployments/deploying-a-simple-web-app-to-iis
Of course, once that works to satisfaction, you can automate the import of artifacts from your existing CI tool, create them from a BuildMaster deployment plan itself, pull them from your package server, whatever. Down the line you can also have your CI tool call the BuildMaster release management API to create a release and automatically have it include all the artifacts and components you want (this is what most of our customers do now, i.e. have a build step in TeamCity create a release from a template).
Rebuilding Assemblies ... The output is a fresh build of a binary we've previously tested (hint: should we always trust these are equivalent?)
You can mostly assume they are equivalent functionally, but it's only the times that they are not that problems arise. This is especially true with package managers that do not lock dependencies to specific version numbers (i.e. NuGet, npm). You should be releasing exactly the same binary that was tested in previous environments.
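One simple guard for this (not a BuildMaster feature, just a sketch of the general idea) is to record each published package/version once and refuse a re-publish, so a rebuilt binary can never silently replace the one you tested. The index file, its format, and the helper names below are hypothetical:

```python
# Sketch: refuse to re-publish a package/version that already exists, so the
# binary that was tested is the binary that ships. The index file, its format,
# and the function names are hypothetical.
import hashlib
import json
from pathlib import Path

INDEX = Path("published.json")  # e.g. {"CoreEngine/1.2.3.4": "<sha256>"}

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def publish(package: str, version: str, binary: Path) -> None:
    index = json.loads(INDEX.read_text()) if INDEX.exists() else {}
    key = f"{package}/{version}"
    if key in index:
        # A forced CI rebuild produced the same version number again:
        # reject it instead of trusting that the bits are equivalent.
        raise RuntimeError(f"{key} already published; bump the version instead")
    index[key] = sha256(binary)  # recorded so consumers can verify integrity
    INDEX.write_text(json.dumps(index, indent=2))
```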
[we want] the development team to make a recipe for a software release (kinda like NuGet packages.config) that specifies components to include:
This is handled with releases. A developer can choose a release's name, dates, etc., and associate it with a pipeline (i.e. a set of testing stages that the artifacts are deployed to), then can "click the deploy button" and have the automation do all the work.
Releases are grouped by "application", similar to a project in TeamCity. As a more advanced use case, you can use deployables. Deployables are essentially individual components of an application you include in a release; in your case the "Documentation" could be a deployable, and maybe contain an artifact of the .pdf and .docx files. Deployables from other applications (maybe a different team is responsible for them, or whatever) can then be referenced and "included" in a release, or you can reference ones from a past release.
Hopefully that provides some overview and fits your needs. Getting into this space is a bit overwhelming because there are so many terms, technologies, and methodologies, but my advice is to start simple and then slowly build upon it, e.g.:
deploy a single, manually uploaded component through BuildMaster to a share drive, then manually deploy it from there
add a deployment plan that imports the component
add a second plan and associate it with the 2nd stage that takes the uploaded artifact and deploys it to the target, bypassing the need for the share drive
add more deployment plans and associate them with pipeline stages and promote through them all to "close out" a release
add an agent and deploy to that instead of the default localhost server
add more components and segregate their deployment with deployables
add event listeners to email team members at points in the process
start adding approvals if you require gated "sign-offs"
and so on.

Best way to manage releases in TFS

I am managing releases for a team of 8 developers. We have three environments:
DEV - where we all make our changes
UAT - an environment for users to test changes
LIVE - live environment
We use Visual Studio 2015 and TFS 2017.
Developers make changes to files and submit them for release to UAT by emailing a list (sometimes with a changeset number). Sometimes different users will make changes to the same files but not all changes should be released.
Once tested in UAT, the changes are released to Live however sometimes a file needs to move from UAT to Live that has earlier changes in it that are not approved for Live release yet.
Could I ask for advice on the best way to manage this process? Unintended changes keep getting released to UAT or Live when they should remain in DEV or UAT.
Any advice would be very welcome. Thanks
Usually this kind of "the best way" question is primarily opinion-based and hard to answer:

Many good questions generate some degree of opinion based on expert experience, but answers to this question will tend to be almost entirely based on opinions, rather than facts, references, or specific expertise.
Developers make changes to files and submit them for release to UAT by emailing a list (sometimes with a changeset number).
For this scenario, instead of using E-Mail to send lists, perhaps you could use this extension
This extension is a build task you can use in build steps. This task generates a markdown release notes file based on a template passed into the tool. Here is an example of release notes output:
Release notes for build SampleSolution.Master
Build Number: 20160229.3
Build started: 29/02/16 15:47:58
Source Branch: refs/heads/master

Associated work items:
Task 60 [Assigned by: Bill] Design WP8 client

Associated change sets/commits:
ID bf9be94e61f71f87cb068353f58e860b982a2b4b Added a template
ID 8c3f8f9817606e48f37f8e6d25b5a212230d7a86 Start of the project
The suggestion in the comments is an approach that fits your needs and circumstances: create three branches standing for your three environments. For each branch you can use branch policies (Git), which protect the branches and prevent unintended changes from being merged to UAT and Live.
It is hard for TFS, or any other tool, to judge whether particular files are approved for release yet; that is a team-management decision. You could use permissions in TFS to limit which users can deploy or perform a release - for example, only the PM and team leads - combined with work items, charts, test management, reports, and other TFS functions.
Note: Team Foundation Server is a product that provides not only source code management, build, and release, but also reporting, requirements management, project management (for both agile and waterfall teams), lab management, and testing capabilities. It covers the entire application lifecycle and enables DevOps capabilities.
I suggest you first go through Release Management in TFS and also take a look at how to configure your release pipelines for multiple-environment deployments.

TFS Team Build - Testing to Production

I have scoured the internet to find out what I can on this, but have come away short. I need to know two things.
Firstly, is there a best practice for how TFS & Team Build should be used in a Development > Test > Production environment? I currently have my local VS get the latest files. Then I work on them & check them in. This creates a build that then pushes the published files into a location on the test server which IIS references. This creates my test environment. I wonder then what is the best practice for deploying this to a Live environment once testing is complete?
Secondly, off the back of the previous - my web application is connected to a database. So, the test version will point to a test database. But when this is then tested and put live, I will need that process to also make sure that any data connections are changed to the live database.
I am pretty much doing all this from scratch and am learning as I go along.
I'd suggest you look at Microsoft Release Management, since it's the tool that can help you do exactly the things you mentioned. It can also be integrated with TFS.
In general, release management is:

the process of managing, planning, scheduling and controlling a software build through different stages and environments; including testing and deploying software releases.
Specifically, the tool that Microsoft offers would enable you to automate the release process, from development to production, keeping track of what and how everything is done when a particular stage is reached.
There's an MSDN article, Automate deployments with Release Management, that gives a good overview:
Basically, for each release path, you can define your own stages, each one made of a workflow (the so-called deployment sequence) containing the activities you want to perform using pre-defined machines from a pool.
It's possible to insert manual interventions/approvals if necessary, and the whole thing can be triggered automatically once your build is done.
Since you are pretty much in control of the actions performed on each machine in each stage (through the use of built-in or custom actions/components) it is also certainly possible to change configuration files, for example to test different scenarios, etc..
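For example, the "change configuration files" part of a stage could be as small as a script that rewrites the connection string before deployment. A minimal sketch (the config path, connection-string name, and server names are placeholders):

```python
# Sketch: a per-stage activity that points web.config at the right database.
# The config path, connection-string name, and server names are placeholders.
import xml.etree.ElementTree as ET

CONNECTIONS = {
    "Test": "Server=TESTSQL;Database=AppDb_Test;Integrated Security=True",
    "Live": "Server=LIVESQL;Database=AppDb;Integrated Security=True",
}

def set_connection(config_path: str, stage: str) -> None:
    """Rewrite the 'Default' connection string for the given stage."""
    tree = ET.parse(config_path)
    node = tree.getroot().find("./connectionStrings/add[@name='Default']")
    if node is None:
        raise ValueError("no connection string named 'Default' found")
    node.set("connectionString", CONNECTIONS[stage])
    tree.write(config_path, xml_declaration=True, encoding="utf-8")

set_connection("web.config", "Live")  # run as part of the Live stage's sequence
```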

should changes to the db always be part of CI?

This question came up on the development team I'm working with and we couldn't really get to a consensus:
Should changes to the database be part of the CI script?
Assuming that the application you are working with has a database involved. I think yes because that's the definition of integration. If you aren't including a portion of your application then you aren't really testing your integration. The counter-argument is that the CI server is the place to make sure your basic project setup works -- essentially building a virgin checkout of the latest version of your code.
Is there a "best practices" document for CI that would answer this question? Is this something that is debated among those who are passionate about CI?
Martin Fowler's opinion on it:
A common mistake is not to include everything in the automated build. The build should include getting the database schema out of the repository and firing it up in the execution environment.
All code, including the DB schema and prepopulated table values, should be subject to both source control and continuous integration. I have seen far too many projects where source control is used - but not on the DB. Instead there is a master database instance where everyone makes their changes at the same time. This makes it impossible to do branching and also impossible to recreate an earlier state of the system.
I'm very fond of using Visual Studio 2010 Premium's functionality for database schema handling. It makes the database schema part of the project structure, with the master schema under source control. A fresh database can be created right out of the project, and upgrade scripts to lift existing databases to the new schema are generated automatically.
Doing change management properly for databases without VS2010 Premium or a similar tool would at best be painful - if possible at all. If you don't have that tool support I can understand your colleague who wants to keep the DB out of CI. If you have trouble arguing for including the DB in CI, then maybe the first step is to get a decent toolset for DB work; once you have the right tools, including the DB in CI is a natural step.
You have no continuous integration if you have no real integration. This means that all components needed to run your software must be part of CI, otherwise you have something just a bit more sophisticated than source control, but no real CI benefits.
Without the database in CI, you can't roll back to a specific version of the application and you can't run your tests in a real, complete environment.
It is of course not an easy subject. In the project I work on we use alter scripts that need to be checked in together with the source code changes. These scripts are run on our test database to ensure not only the correctness of the current build, but also that upgrading/downgrading one version up/down is possible and that the update process itself doesn't mess anything up. I believe this is better than dropping and recreating the whole database: it gives you a consistent path to upgrade the database step by step and lets you use the database in a test environment - with data, users, etc.
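The alter-script approach can be automated with a very small runner that applies pending scripts in order and records what has been applied. A sketch (SQLite stands in for the real database; the schema_version table and the scripts/ naming convention are assumptions, not a specific tool):

```python
# Sketch: apply checked-in alter scripts in order as part of the CI build.
# SQLite stands in for the real database; the schema_version table and the
# scripts/ naming convention are assumptions, not a specific tool.
import sqlite3
from pathlib import Path

def migrate(db_path: str, scripts_dir: str = "scripts") -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (id TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT id FROM schema_version")}
    # Scripts named 001_create_users.sql, 002_add_index.sql, ... sort correctly.
    for script in sorted(Path(scripts_dir).glob("*.sql")):
        if script.name in applied:
            continue  # already run against this database
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version VALUES (?)", (script.name,))
        conn.commit()
    conn.close()

migrate("build/test.db")  # CI: any failing script fails the build
```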

Infrastructure required for TDD?

I am 'relatively new' to unit-testing and TDD. Only more recently have I completed my first production application that has (at least in theory) 100% code coverage. I have done unit-testing in previous projects as well for some time, but not in true TDD fashion and with good code coverage. It had always been an after-thought. I feel I have a pretty good grasp on it now though.
I'm also trying to train the rest of the team on TDD and unit testing so that we can grow together and start moving forward with doing unit testing in all of our applications, and eventually progress to doing full TDD with automated builds & continuous integration. I posted a thread here regarding my plan of attack / training agenda for comments & criticism.
One of the replies (in fact the highest voted) suggested I first set up infrastructure before I go forward with the training. Unfortunately I have no exposure to this, and googling the topics is difficult because the pages for CruiseControl.NET / NAnt / etc. do not really explain 'why' we should set this up and 'how' everything connects together.
We are a small shop (about 10 developers), use almost exclusively Microsoft technologies, and do our development in VB.NET. We are looking to eventually start using C#, but that's for another time. I've been using the MSTest project type that comes with VS2008 for my unit tests, building my apps in Visual Studio, and deploying with MSI setup projects... We also (unfortunately) use VSS for our source control - but that is also on the chopping block and I'd really like to get rid of it and use Subversion.
I know that I need to use CruiseControl.NET for CI, and either NAnt or MSBuild for building the applications. And I probably need a build server to run all these builds. But I just can't find anything that 'connects the dots' and explains how they interact with each other, what should be on your build server, and when you should build with your build server (is it just for deployment builds, or even when you just want to compile the app you're developing after making a small change in your local environment?). I'm also planning on axing MSTest, as I've found it to be buggy, and will use NUnit instead.
Can anyone illuminate this gap I have between 'knowing how to do TDD' and 'setting up the proper infrastructure so the whole team can do it and work together'? I do understand what continuous integration is, but again, I'm not sure how a build server should be set up, how it connects with everything, and why we need one (e.g. the pitch to management).
thanks very much for your time.
What portion of FinalBuilder do I need? It seems there's some overlap between FinalBuilder and TeamCity. FinalBuilder Server seems to be a CI server, so I'm guessing I don't need that. FinalBuilder seems to be a build server - but I thought TeamCity is also a build server... And Automise seems to be a visual Windows automation tool, like some kind of development platform for WinForms apps...
I also don't see support for FinalBuilder in the TeamCity supported apps diagram.
Take a look at a webinar I did a few weeks ago - How To Start Unit Testing Successfully. In that webinar I've talked about tools and unit testing best practices and it was aimed at developers just like you who want to introduce unit testing in their organization.
First order of business, you want to put a CI (Continuous Integration) process in place, and for that you'll need three tools:
Source control
Build server
Build client/script
I hope you already have some form of source control in place so let's talk about the other two.
Build Server - checks source control and, when it changes (or some other condition is met), runs a build script on some client (or the same machine). There are several build servers available; I recommend JetBrains' TeamCity. It's easy to install and use (great web interface) and is free for up to 20 developers (that's you).
Build Script - on your build client you want to run a build script that builds your solution and runs your unit tests; a bare-bones sketch follows below. TeamCity has some basic build & test capabilities, but for more advanced options (building the installer, documentation, etc.) you'll need a script runner. At work we use FinalBuilder - it's not free, but it has a very good editor. If you're looking for a free alternative, have a look at Ant or NAnt - but be prepared to edit a lot of XML.
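To show how little a build script really is at its core, here is a bare-bones sketch. Tool paths and the solution/test-assembly names are assumptions; in practice FinalBuilder, NAnt, or MSBuild targets would do this with more polish:

```python
# Sketch: the bare minimum a build script does -- compile, then test.
# Tool paths and the solution/test-assembly names are assumptions.
import subprocess
import sys

MSBUILD = r"C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe"
NUNIT = r"C:\Program Files\NUnit\bin\nunit-console.exe"

def run(cmd: list) -> None:
    """Run a tool; a non-zero exit fails the build so CI reports it red."""
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)

run([MSBUILD, "MyApp.sln", "/p:Configuration=Release"])
run([NUNIT, r"MyApp.Tests\bin\Release\MyApp.Tests.dll"])
```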
Other tools - Because an important part of successful unit testing is how easy it is to write and run tests on the developer's machines I suggest you check if there are better IDE's or external tools that would help the developers write & run their unit tests.
