Poor MSBuild performance when deploying SSDT DB project programmatically - visual-studio-2010

I have an SSDT database project. I've set up a publish profile that publishes it to an SQL Server 2012 localdb instance. I have found some code that can be used to publish the database programmatically:
First import these:
using Microsoft.Build.Evaluation;
using Microsoft.Build.Logging;
using Microsoft.Build.Framework;
Then publish the DB:
// Create a logger.
FileLogger logger = new FileLogger();
logger.Parameters = @"logfile=Northwind.msbuild.log";
// Set up properties.
var projects = ProjectCollection.GlobalProjectCollection;
projects.SetGlobalProperty("Configuration", "Debug");
projects.SetGlobalProperty("SqlPublishProfilePath", @"Northwind.publish.xml");
// Load and build project.
var dbProject = ProjectCollection.GlobalProjectCollection.LoadProject(@"Northwind.sqlproj");
dbProject.Build(new[]{"Build", "Publish"}, new[]{logger});
When I use Visual Studio 2010 to manually publish the database using the publish profile shown in the code, it takes about 4-10 seconds. However, when I run this code to programmatically do the same thing, it takes about 40 seconds.
I have tried removing the Build target, but this didn't make a noticeable difference.
Why is the performance of the programmatic option poorer? What can I do to achieve the same performance as a manual publish?
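One alternative worth trying (a sketch, not a tested fix for the slowdown): skip MSBuild at publish time entirely and deploy the already-built .dacpac with the DacFx API (Microsoft.SqlServer.Dac). The connection string and dacpac path below are assumptions based on the project name and localdb instance in the question:

```csharp
using Microsoft.SqlServer.Dac;

// Deploy a previously built dacpac directly, bypassing MSBuild entirely.
// Connection string and dacpac path are illustrative assumptions.
var services = new DacServices(
    @"Data Source=(localdb)\v11.0;Integrated Security=True");
using (var package = DacPackage.Load(@"bin\Debug\Northwind.dacpac"))
{
    // upgradeExisting: true performs an in-place schema upgrade
    services.Deploy(package, "Northwind", upgradeExisting: true);
}
```

This still requires the .sqlproj to have been built once (to produce the dacpac), but the per-publish cost no longer includes project evaluation and target execution.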

Related

Win App Driver: How can you run tests on another machine

I am really from a Coded UI background and have started using Win App Driver to test a WPF application. Please forgive me if I am missing something about Win App Driver. The good news is that it works on my development machine!
When I developed Coded UI tests, I could copy my ordered tests and my test application dll to any machine, install VS Test Agent, and run my tests there. That way, our customers etc. can run our automated tests without having Visual Studio, and VS Test Agent is free.
I ran the tests from a windows batch file like the one below.
C:
cd codedui
set mstestPath="C:\Program Files (x86)\Microsoft Visual Studio\2017\TestAgent\Common7\IDE"
%mstestPath%\mstest /testcontainer:WinAppD_OrderedTest-AcceptanceTest_Logon.orderedtest
pause
My question is: can I do this with my Win App Driver tests? I have tried, and it said it couldn't find "appium-dotnet-driver.dll" and "WebDriver.dll". I copied them into the same folder as my ordered test, bat file etc., and then it asked for another 3 dlls ("Newtonsoft.dll", "WebDriver.Support.dll" and "Castle.Core.dll"). I copied those 3 over as well.
Now it simply says it can't find "Castle.Core". What confuses me is that it asked for 5 dlls; I copied them, and that fixed the problem for the first 4, so why doesn't it find Castle.Core.dll? Alternatively, is there a simpler, more Win App Drivery way to do this?
Many thanks for any advice from a Coded UI tester who wants to make the transition to Win App Driver!
Instead of manually copying files, it would be much better to configure your dependencies as nuget packages and then just perform a nuget restore on your remote server.
Microsoft provides the Appium.WinAppDriver nuget package which, when added to your UI test project, will provide all the required functionality to test your project.
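As a sketch of that setup (the package id is taken from this answer; the version and target framework are illustrative assumptions), a packages.config next to the test project lets a plain `nuget restore` pull the dependencies down on the target machine:

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Restoring this package also brings in its dependencies
       (WebDriver, WebDriver.Support, Newtonsoft.Json, Castle.Core),
       so none of them need to be copied by hand. -->
  <package id="Appium.WinAppDriver" version="4.0.0" targetFramework="net45" />
</packages>
```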
If you are using version 4.0 of the package or greater, note that the docs on GitHub are slightly out of date: you should use the AppiumOptions API to create a new session.
// Set up the session (WinAppDriver listens on http://127.0.0.1:4723 by default)
const string WindowsApplicationDriverUrl = "http://127.0.0.1:4723";
var opt = new AppiumOptions();
opt.AddAdditionalCapability("app", @"path/to/your/application.exe");
opt.AddAdditionalCapability("platformName", "Windows");
var session = new WindowsDriver<WindowsElement>(new Uri(WindowsApplicationDriverUrl), opt);
// Click a button
session.FindElementByName("MyButtonName").Click();
// Tear down
session.Close();
session.Dispose();

Visual Studio not running correctly transformed config file

My approach here may be wrong, so apologies if so - I'd appreciate any advice on what I did wrong.
I need to run (locally, for debugging), a specific configuration of a project that contains specific web.config transforms.
In my solution, in Configuration Manager I have the following listed:
Debug
Release
ClientFoo (copied from Release)
ClientBar (copied from Release)
I created a new entry, ClientXYZ (copied from Debug), then right-clicked web.config and chose Add Config Transform. I applied the transform rules, and when previewed, the transforms display correctly.
When I select ClientXYZ in the solution config drop down, and start the debugger...
...I see that the web.config used to initiate the application is the Debug one, and not my new ClientXYZ version.
Is it possible to run the project locally with web.config transforms applied, for debugging?
Web config transforms are only applied during publishing or building deployment packages by default (it does, after all, overwrite web.config). There is a way with some adjustments, however, described in this answer: https://stackoverflow.com/a/35561167/1464084
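The approach from the linked answer can be sketched as an MSBuild target in the .csproj that runs the transform on every build (the task assembly path varies by Visual Studio version, and the target and source file names here are assumptions):

```xml
<UsingTask TaskName="TransformXml"
           AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="ApplyWebConfigTransform" AfterTargets="Build"
        Condition="Exists('Web.$(Configuration).config')">
  <!-- Overwrites Web.config with the transformed output for the
       active configuration, e.g. Web.ClientXYZ.config -->
  <TransformXml Source="Web.Base.config"
                Transform="Web.$(Configuration).config"
                Destination="Web.config" />
</Target>
```

Because the target overwrites Web.config, keep the untransformed version in a separate file (Web.Base.config above) so the transform has a stable input.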
The purpose of the "Debug" and "Release" configurations in Visual Studio is:
Debug - the DEBUG constant is defined in the Debug configuration, for when you are developing the application
Release - code optimization is enabled in the Release configuration, for when you host the application for client-side testing or publishing
Custom (ClientXYZ) - constants defined for a developer's own settings (localhost or different IPs), for both client-site hosting and publishing your site

How to migrate MTM Test Cases from TFS 2013 to VSTS?

We have a legacy of thousands of manual Test Cases created in Microsoft Test Manager in our on-premises TFS 2013.
We are trying to move them to VSTS, and it has proved difficult.
I.
As far as I can see at the moment there is no official migration tool from Microsoft, although they are working on one for full data migration
II.
We've tried a few third party tools:
OpsHub - the free version has a 2,500-item limit, which we exceed, and we can't justify the $5,000 cost of the commercial version
TFS Integration Tools - doesn't seem to migrate Test Cases at all (the documentation at the link confirms this)
MTMCopyTool - doesn't seem to migrate the Steps of Test Cases; leaves them empty
III.
We've also tried exporting/importing a TFS\VSTS query via Excel.
It seems to export Steps too, but all of them are concatenated into one field, without even a newline character between them, which makes it quite messy.
IV.
We've also tried using third-party tools to export/import via Excel:
to export: https://tfstestcaseexporttoexcel.codeplex.com/ - seems to export everything fine, including Steps! Not sure how to import this file into VSTS though
to import: Test Case Migrator Plus just crashes on my machine, though the source code is available, so maybe I'll try to play with it
For a one-shot migration I can suggest a couple of options:
From the test hub in your on-premises web access, create a test plan including all the test cases and then switch to the grid view in the main pane. There you can select and copy all test cases (including steps, expected results and other test case fields) and paste them into the equivalent view in the VSTS project.
Create a powershell script that gets all the test cases from your on-premises TFS and copies them into VSTS.
Below you can find a snippet.
Caveat: I have not tested it extensively, so usual disclaimers apply. Please add additional fields you may want to copy.
$VerbosePreference = "Continue"
$tfsSource="the collection url that you want to copy from (eg. http://yourserver/tfs/yourcollection)";
$tpSource="the team project containing the test cases you want to copy from";
$tfsDest="the collection url that you want to copy to (eg. https://youraccount.visualstudio.com/DefaultCollection)";
$tpDest="the team project that you want to copy the test cases to";
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.TestManagement.Client")
[Reflection.Assembly]::LoadFile("C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PrivateAssemblies\Newtonsoft.Json.dll")
$sourceTpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($tfsSource)
$sourceTcm = $sourceTpc.GetService([Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService])
$sourceProject = $sourceTcm.GetTeamProject($tpSource);
$sourceTestCases = $sourceProject.TestCases.Query("SELECT * FROM WorkItem");
$destTpc= [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($tfsDest)
$destTcm = $destTpc.GetService([Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService])
$destProject = $destTcm.GetTeamProject($tpDest);
foreach ($tc in $sourceTestCases)
{
    Write-Verbose ("Copying Test Case {0} - {1}" -f $tc.Id, $tc.Title)
    $destTestCase = $destProject.TestCases.Create();
    $destTestCase.Title = $tc.Title;
    $destTestCase.Priority = $tc.Priority;
    foreach ($step in $tc.Actions)
    {
        $destStep = $destTestCase.CreateTestStep();
        $destStep.Title = $step.Title
        $destStep.TestStepType = $step.TestStepType
        $destStep.Description = $step.Description
        $destStep.ExpectedResult = $step.ExpectedResult;
        $destTestCase.Actions.Add($destStep);
    }
    $destTestCase.Save();
}

TFS Migration Risks

I would like to create a new installation of TFS 2013 on a new server.
I did my research and learned that the migration process described at the link below carries some risks:
TFS Migration Manual:
https://msdn.microsoft.com/en-us/library/ms404869.aspx
Risks:
http://blogs.msmvps.com/p3net/2014/04/12/tfs-upgrade-nightmares/
I plan to avoid using the TFS Migration manual above; instead, I would check all of my projects out (about 20), re-create them on the new TFS, and check them in again.
However, we have work-items, users, workspaces and other agile information which I have created for my projects, and which I still require to be on the new installation.
I was wondering whether the following works (again without risks and hassle, as time is scarce):
Back up the TFS Databases from the old installation, and restore them into the new installation or simply import the data from old to new using SQL Server's Data Import Tool.
I am particularly referring to these databases, which TFS has:
Tfs_Configuration; Tfs_DefaultCollection; Tfs_Warehouse.
I found these databases on the SQL Server instance which TFS uses.
Also, this approach is easier since it doesn't obstruct the team, as the database restoration can occur after hours.
Now, will this plan work?
No, your plan will not work and will leave your TFS in an unsupported state.
You need to follow a combination of the Upgrade and "changing environment" workflow.
1) Restore all TFS databases (tfs_*) to the new environment
2) Install TFS 2015
3) Configure and select Upgrade Wizard - when running make sure you have all the new server names
4) (optional) ChangeServerID - if this is a practice run you should then immediately:
4.1) unconfigure the application tier with "tfsconfig.exe setup /uninstall:all"
4.2) run the ChangeServerID command
4.3) reconfigure tfs and run the "app tier only" wizard
Simples....
Note: You need to change the server ID if this is a test/practice instance as each server gets a unique ID. When clients first connect to the new server they will "upgrade/migrate" the users data across. You don't want that happening for a trial...so change the ID...
WARNING: If you manipulate the data in the TFS server in any way that is not done by the TFS Product Team tools you will turn your instance to crap. Do not ever edit, or cause to edit, the data in the operational store.

Can you monitor the execution of an SSIS package, in BIDS, as it runs on the server?

Although my title states my current goal, I am open to alternative solutions. In short, I have a series of SSIS packages that are scheduled to run on a nightly basis, on our SQL Server machine.
Due to various updates that happen in the ether of my corporate IT, sometimes these exports break and the process stops working in the middle of the job. To troubleshoot, I fire up BIDS on my workstation and restart whichever process fails. This is useful because, in my experience, any errors generated from within BIDS are much more useful than anything I've found in SQL Server/Package Execution History or the server's event logs.
Historically, my problem has been that not all problems occur in BIDS, but they do, consistently, on the server. These issues have been painful to diagnose and have cost me a lot of time.
As such, what I would like to do is publish my package to the SSIS server, start the server instance from BIDS and monitor the project as it runs. Is this possible?
If not, is there something else that I can do so that I can monitor the internal steps as the process executes?
I fear that none of this may be possible but I have to ask. It will make my debugging and troubleshooting life so much easier.
Possible option:
You need to make use of the Logging feature in SSIS. It allows you to configure the events for which you would like to capture messages. I usually prefer to log OnWarning and OnError to keep track of all the warning and error messages that occur in the package. There are various providers you can save the logging data to. I prefer to use SQL Server so that I can query the logging information.
Logging options shown from SSIS 2012:
To enable logging in a package, open the package in Business Intelligence Development Studio (BIDS) if you are developing packages in SSIS 2005 - 2008 R2, or SQL Server Data Tools (SSDT) if you are developing packages in SSIS 2012.
Click SSIS menu and then click Logging...
You will see the Configure SSIS Logs dialog.
On the left side, you can check the package or individual tasks to log the event data.
On the Providers and Logs tab, you can select an appropriate provider to save the log information to. The screenshot below shows that event information is captured in a SQL Server database using the connection manager OLEDB_PracticeDB.
On the Details tab, you can select which events you would like to capture. The screenshot below shows that I am capturing the following events.
OnError
OnInformation
OnTaskFailed
OnWarning
Thanks to @William Todd Salzman for recommending the OnTaskFailed event
Sample package illustration:
Let's say we have a package named SO_15004109.dtsx with a Data Flow Task and Script Task. Data Flow Task is just a dummy with no components inside.
Script task has the following code in Main method to fire custom information, warning and error messages so we can observe how it is captured in the logging data source. The code is written for SSIS 2012 so you may need to alter it for SSIS 2005. I chose VB.NET instead of C# because you have tagged this question under sql-server-2005 and SSIS 2005 only supports VB.NET.
Script Task code in VB.NET for SSIS 2005 and above.
#Region "Imports"
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
#End Region
<Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute()> _
<System.CLSCompliantAttribute(False)> _
Partial Public Class ScriptMain
    Inherits Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTAScriptObjectModelBase

    Public Sub Main()
        Dim fireAgain As Boolean = False
        Dts.Events.FireInformation(101, "Custom Script Information", "This is a test information message.", String.Empty, 0, fireAgain)
        Dts.Events.FireWarning(201, "Custom Script Warning", "This is a test warning message.", String.Empty, 0)
        Dts.Events.FireError(201, "Custom Script Error", "This is a test error message.", String.Empty, 0)
        Dts.TaskResult = ScriptResults.Success
    End Sub

#Region "ScriptResults declaration"
    Enum ScriptResults
        Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success
        Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
    End Enum
#End Region
End Class
If we execute the package, it will fail because we raised an error within the Script Task.
If you navigate to the data source where you logged the events, you will notice that SSIS creates a table to log the information if you chose the SQL Server logging provider. The table below lists the logging table that SSIS creates in the SQL Server database chosen as the log provider.
SSIS Version Log table name Table type
-------------- ---------------- ----------
SSIS 2005 dbo.sysdtslog90 User
SSIS 2008 dbo.sysdtslog100 User
SSIS 2008 R2 dbo.sysssislog System
SSIS 2012 dbo.sysssislog System
The query below was executed in the database to view the events captured by this sample package. You will notice some messages twice because the events are logged for each container and task. The table is named dbo.sysssislog because the package was created in SSIS 2012.
select id, event, source, message from dbo.sysssislog;
Personal experience with logging:
I have had fairly good success in just viewing the logging error messages to understand what went wrong. Debugging packages in production environment is not advisable, in my opinion. However, capturing log events is preferable.
When I worked in SSIS 2005 and 2008, I created SSRS-based reports that query the log table to generate a daily report of job executions and send a PDF attachment to persons of interest.
Things have improved in SSIS 2012: the tool comes with built-in reporting capabilities via the Integration Services Catalog, which creates a database named SSISDB.
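In SSIS 2012 you can also query the catalog's standard views directly; a sketch (server-agnostic, assuming the default SSISDB catalog):

```sql
-- Recent package executions and their error/warning messages
-- from the SSIS 2012+ catalog views.
SELECT e.execution_id,
       e.package_name,
       e.status,
       m.message_time,
       m.message
FROM SSISDB.catalog.executions AS e
JOIN SSISDB.catalog.event_messages AS m
    ON m.operation_id = e.execution_id
WHERE m.event_name IN ('OnError', 'OnWarning')
ORDER BY m.message_time DESC;
```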
At my current gig, we've started using a product from Pragmatic Works called BIxPress, which includes tools to inject logging code into existing SSIS packages and monitor running packages as they execute. It's not cheap, but we have found it helpful. (Especially since it means we don't have to roll our own monitoring code...)
