I have an SSDT database project in Visual Studio 2013. It is used as the "answer sheet" when publishing database updates to the databases in the other environments. I recently came across Jamie Thompson's blog article on DacPacs, where he gives a great summary of what DacPacs are and how to use them.
Now, say I have the following scenario:
The SSDT project in VS2013, which is version 1.0.33
A database in my Dev environment, which is version 1.0.32
A database in my S-test environment, which is version 1.0.31
According to Jamie, publishing database changes using DacPacs is idempotent, i.e. I can publish the DacPac from the SSDT project in bullet 1 to the database in bullet 3, and it will get all the changes made in both version 1.0.32 and 1.0.33, since the DacPac contains information about the entire DB schema (which then also includes the changes made in version 1.0.32).
Is this a correct understanding of how publishing a DacPac works?
Yes, once you have defined your model declaratively in a DACPAC, you can deploy that model to any target environment, whatever version of your database it holds.
The engine will automatically generate the proper change scripts according to the target.
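For instance, the same 1.0.33 dacpac can be published to both of your environments and the engine computes a different upgrade script for each target. A minimal sketch (the paths, server, and database names are placeholders):
& 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe' /Action:Publish /SourceFile:'MyDb.dacpac' /TargetServerName:'DevServer' /TargetDatabaseName:'MyDb'    # upgrades 1.0.32 to 1.0.33
& 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe' /Action:Publish /SourceFile:'MyDb.dacpac' /TargetServerName:'TestServer' /TargetDatabaseName:'MyDb'   # upgrades 1.0.31 to 1.0.33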
You can deploy (publish) your model from Visual Studio or from the command line using the SqlPackage.exe utility. Here is an example of a PowerShell script that uses SqlPackage.exe and a publish profile file. You can choose to publish directly or to generate the change script (set the $action variable). The DACPAC file and the publish profile file have to be in the same folder as the .ps1 file. A log file will be generated:
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
####################################
$action = 'Publish' #Only generate script: 'Script'; Publish directly: 'Publish'
$databaseName = 'Test'
$serverName = 'localhost'
$dacpacPath = Join-Path $scriptPath '.\Test\bin\Debug\Test.dacpac'
$publishProfilePath = Join-Path $scriptPath '.\Test\Scripts\Publish\Test.publish.xml'
$outputChangeScriptPath = Join-Path $scriptPath 'TestDeploymentScript.sql'
$logPath = Join-Path $scriptPath 'TestDeployment.log'
if (Test-Path $logPath) { Remove-Item $logPath } # start each run with a fresh log; Tee-Object -Append is used below
####################################
$sqlPackageExe = 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe'
if ($action.ToUpper() -eq 'SCRIPT')
{
    '********************************' | Tee-Object -FilePath $logPath -Append | Write-Host
    '* Database Objects Scripting   *' | Tee-Object -FilePath $logPath -Append | Write-Host
    '********************************' | Tee-Object -FilePath $logPath -Append | Write-Host
    # $args is an automatic variable in PowerShell, so use a name of our own
    $sqlPackageArgs = "/Action:Script /TargetDatabaseName:$databaseName /TargetServerName:$serverName " +
                      "/SourceFile:""$dacpacPath"" /Profile:""$publishProfilePath"" /OutputPath:""$outputChangeScriptPath"" "
    $command = "& ""{0}"" {1}" -f $sqlPackageExe, $sqlPackageArgs
    Invoke-Expression $command | Tee-Object -FilePath $logPath -Append
    if ($LASTEXITCODE -ne 0)
    {
        $commandExitCode = $LASTEXITCODE
        # Write the failure into the change script file so it is obvious the scripting did not succeed
        $Error[0] | Tee-Object -FilePath $outputChangeScriptPath
        return $commandExitCode
    }
}
if ($action.ToUpper() -eq 'PUBLISH')
{
    '*********************************' | Tee-Object -FilePath $logPath -Append | Write-Host
    '* Database Objects Deployment  *' | Tee-Object -FilePath $logPath -Append | Write-Host
    '*********************************' | Tee-Object -FilePath $logPath -Append | Write-Host
    $sqlPackageArgs = "/Action:Publish /TargetDatabaseName:$databaseName /TargetServerName:$serverName " +
                      "/SourceFile:""$dacpacPath"" /Profile:""$publishProfilePath"" "
    $command = "& ""{0}"" {1}" -f $sqlPackageExe, $sqlPackageArgs
    Invoke-Expression $command | Tee-Object -FilePath $logPath -Append
    if ($LASTEXITCODE -ne 0)
    {
        $commandExitCode = $LASTEXITCODE
        $Error[0] | Tee-Object -FilePath $logPath -Append
        return $commandExitCode
    }
}
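If you only want to inspect what would change before publishing, SqlPackage also offers a DeployReport action that writes an XML summary of the planned operations. A minimal sketch reusing the variables above (check which options your SqlPackage version supports):
$reportPath = Join-Path $scriptPath 'TestDeployReport.xml'
& $sqlPackageExe /Action:DeployReport /TargetDatabaseName:$databaseName /TargetServerName:$serverName /SourceFile:"$dacpacPath" /Profile:"$publishProfilePath" /OutputPath:"$reportPath"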
I have a Windows Server 2016 machine running windowsservercore. I am working on moving our CI pipeline into containers. During our process, we build a version.html file. The file contains build data (like build date and build number) and TFS 2017 project information about merges/branches that have occurred.
We had this working with TeamCity running a PowerShell script that would connect and run a query against TFS 2017. So I looked on Docker Hub for TFS, but did not have any luck. I also tried looking under Microsoft on Docker Hub and did not find anything.
I tried to create a new Dockerfile:
FROM microsoft/windowsservercore:10.0.14393.1480
# Setup shell
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
RUN mkdir BuildStage
COPY powershell/CopyBuildToStageDir.ps1 /BuildStage/
COPY powershell/BuildVersionFile.ps1 /BuildStage/
RUN dir
But when I ran the PowerShell file inside the Windows container it said...
Unable to find type
[09:25:00][Step 2/2] [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory].
[09:25:00][Step 2/2] At C:\BuildStage\BuildVersionFile.ps1:192 char:12
In the PowerShell, there is this function
#============================================================================
# Setup TFS stuff
#============================================================================
function Setup-Tfs {
# Connect to TFS
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client") | out-null
$tfsServer = "http://ourServer";
$tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($tfsServer)
$Script:versionControlServer = $tfs.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer] )
$Script:recursionType = [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full
}
Here are more details of how we are using powershell to call TFS to get Merge and Branch information to build the version.html file...
# Need to get the last 5 changesets of Merge information for both MAIN and Iteration
Setup-Tfs
$baseLocation = "$/OurBaseLocation/"
$locationForMain = $baseLocation + $OurProjectLocation
# Query history for the TFS path
$vCSChangeSets = $versionControlServer.QueryHistory($locationForMain, $recursionType, 5)
# History of Merge changes to MAIN application (only 5 deep)
"<table border='2'>" | Out-File $VersionFname -append
"<caption>Merge Info For: $AppName </caption>" | Out-File $VersionFname -append
# Build out headers
"<TH>Changeset</TH><TH>Date</TH><TH>Comment</TH>" | Out-File $VersionFname -append
Foreach ($vCSChangeSet in $vCSChangeSets) {
# write row
$changeset = $vCSChangeSet.ChangesetID
$CheckinNotesName = $vCSChangeSet.Comment
$CreationDate = $vCSChangeSet.CreationDate
if ($CheckinNotesName.ToUpper().Contains("MERGE")){
"<TR>" | Out-File $VersionFname -append
"<TD>$changeset</TD><TD>$CreationDate</TD><TD>$CheckinNotesName</TD>" | Out-File $VersionFname -append
"</TR>" | Out-File $VersionFname -append
}
if ($CheckinNotesName.ToUpper().Contains("BRANCH")){
"<TR>" | Out-File $VersionFname -append
"<TD>$changeset</TD><TD>$CreationDate</TD><TD>$CheckinNotesName</TD>" | Out-File $VersionFname -append
"</TR>" | Out-File $VersionFname -append
}
}
# close table add space
"</table><BR/><BR/>" | Out-File $VersionFname -append
My guess is that my Dockerfile needs to add something for "Microsoft.TeamFoundation.VersionControl.Client".
Any help would be appreciated.
What I found worked best was to give up on the PowerShell namespaces for TFS. Instead, use the TFS REST API. Here is an example that gets the properties of a single work item.
#============================================
# Get-TFSFieldsByWiId
#============================================
function Get-TFSFieldsByWiId([string]$Id) {
    $url = 'http://YourTFSUrl:YourPort/YourProject/_apis/wit/workitems?ids=' + $Id + '&$expand=all&api-version=YourVersion'
    # Step 1. Create a username:password pair (the username can stay empty when you authenticate with a personal access token)
    $credPair = "$(''):$($password)"
    # Step 2. Encode the pair to a Base64 string
    $encodedCredentials = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($credPair))
    # Step 3. Form the header and add the Authorization attribute to it
    $headers = @{ Authorization = "Basic $encodedCredentials" }
    # Step 4. Make the GET request (no request body; GET requests must not carry one)
    $responseData = Invoke-WebRequest -Uri $url -Method Get -Headers $headers -UseBasicParsing -ContentType "application/json"
    $data = $responseData.Content | ConvertFrom-Json
    $WIDetails = $data.value
    return $WIDetails
}
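A quick usage sketch ($password is read by the function and must be set first; the URL, port, project, and api-version placeholders inside the function also need real values):
$password = 'YourPasswordOrPersonalAccessToken'
$workItem = Get-TFSFieldsByWiId -Id '1234'   # '1234' is an illustrative work item id
$workItem.fields.'System.Title'              # individual fields hang off the 'fields' property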
Update 2:
Now that I know x32 is the problem, I debugged the script using powershell_ise_x32 and found that $Word.Documents is null.
So the PowerShell API for Word behaves differently in x32 PowerShell than in 64-bit.
Update:
The error occurs when using PowerShell x32 and does NOT occur on PowerShell 64-bit. That was really it. PowerShell x32 was executed because I started it from the 32-bit Total Commander.
The question now is: why do 32-bit and 64-bit PowerShell behave differently?
Initial Question:
I wrote a PowerShell script to convert my Word documents and merge them into one.
I wrote a batch script to start this PowerShell script.
When I execute the script directly in "PowerShell ISE" the script works fine.
When I execute the batch script as Administrator via the context menu, the script reports errors. In this case C:\WINDOWS\SysWOW64\cmd.exe is executed.
When I execute another cmd.exe found on my system as Administrator - everything works fine:
"C:\Windows\WinSxS\amd64_microsoft-windows-commandprompt_31bf3856ad364e35_10.0.15063.0_none_9c209ff6532b42d7\cmd.exe"
Why do I get different behaviour from different cmd.exe files? What are those different cmd.exe files?
Batch Script:
cd /d "%~dp0"
powershell.exe -noprofile -executionpolicy bypass -file "%~dp0%DocxToPdf.ps1"
pause
PowerShell Script:
$FilePath = $PSScriptRoot
$Pdfsam = "D:\Programme\PDFsam\bin\run-console.bat"
$Files = Get-ChildItem "$FilePath\*.docx"
$Word = New-Object -ComObject Word.Application
if(-not $?){
throw "Failed to open Word"
}
# Convert all docx files to pdf
Foreach ($File in $Files) {
Write-Host "Word Object: " $Word
Write-Host "File Object: " $Word $File
Write-Host "FullName prop:" $File.FullName
# open a Word document, filename from the directory
$Doc = $Word.Documents.Open($File.FullName)
# Swap out DOCX with PDF in the Filename
$Name=($Doc.FullName).Replace("docx","pdf")
# Save this File as a PDF in Word 2010/2013
$Doc.SaveAs([ref] $Name, [ref] 17)
$Doc.Close()
}
# check errors
if(-not $?){
    Write-Host "Stopping because an error occurred"
    pause
    exit 1
}
# wait until the conversion is done
Start-Sleep -s 15
# Now concat all pdfs to one single pdf
$Files = Get-ChildItem "$FilePath\*.pdf" | Sort-Object
Write-Host $Files.Count
if ($Files.Count -gt 0) {
$command = ""
Foreach ($File in $Files) {
$command += " -f "
$command += "`"" + $File.FullName + "`""
}
$command += " -o `"$FilePath\Letter of application.pdf`" -overwrite concat"
$command = $Pdfsam + $command
echo $command
$path = Split-Path -Path $Pdfsam -Parent
cd $path
cmd /c $command
} else {
Write-Host "No PDFs found for concatenation"
}
Write-Host -NoNewLine "Press any key to continue...";
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown");
I've found $PSScriptRoot to be unreliable; it is only populated in modules on PowerShell 2.0, and only from PowerShell 3.0 onwards does it work in ordinary scripts.
$FilePath = $PSScriptRoot;
$CurLocation = Get-Location;
$ScriptLocation = Split-Path $MyInvocation.MyCommand.Path
Write-Host "FilePath = [$FilePath]";
Write-Host "CurLocation = [$CurLocation]";
Write-Host "ScriptLocation = [$ScriptLocation]";
Results:
O:\Data>powershell ..\Script\t.ps1
FilePath = []
CurLocation = [O:\Data]
ScriptLocation = [O:\Script]
As to the differences between the various cmd.exe implementations, I can't really answer that. I should have thought they'd be functionally identical, but maybe there are 32/64-bit differences that matter.
The error occurs when using PowerShell x32 and does NOT occur on PowerShell 64-bit.
I debugged the script using powershell_ise_x32 and found that $Word.Documents is null.
This is because 64-bit Word is installed on my system.
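If the script must work even when it gets started from a 32-bit host (like a 32-bit cmd.exe or Total Commander), one option is to have it relaunch itself in 64-bit PowerShell. A minimal sketch, assuming it runs as a saved .ps1 script on 64-bit Windows (the sysnative alias is only visible to 32-bit processes):
if (-not [Environment]::Is64BitProcess -and [Environment]::Is64BitOperatingSystem) {
    # Relaunch this script in the 64-bit PowerShell and forward its exit code
    & "$env:windir\sysnative\WindowsPowerShell\v1.0\powershell.exe" -NoProfile -ExecutionPolicy Bypass -File $MyInvocation.MyCommand.Path
    exit $LASTEXITCODE
}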
I am trying to automate a backup of an Azure database to my local machine using SqlPackage.exe. I am trying to add the date to the filename so that it doesn't get overwritten every night.
The following line will pick up the date but will then stop the backup from running, with the error shown below.
CMD
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" /Action:Export /ssn:SERVER_NAME_HERE /sdn:DATABASE_NAME /su:USERNAME /sp:PASSWORD /tf:C:\Users\William\Desktop\BackupTest\BACKUPFILE'%date%'.bacpac
ERROR
*** Unrecognized command line argument '23/06/2017'.bacpac'.
I have tried using
+%date%+
+%date
and other options, but no luck. Can anyone suggest anything?
More fundamentally, it is not recommended to use a bacpac to back up a database. A bacpac is for loading and moving data in and out of Azure on demand.
SQL Database on Azure has the backup service on by default, so a scheduled backup is already provided by the service.
In addition, to properly make a bacpac, the database needs to be copied first and the bacpac made from the copy. Otherwise transactional consistency is not guaranteed, and in the worst case importing the bacpac can fail.
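For example, a transactionally consistent copy can be created first and the bacpac exported from that copy. A sketch, with server, credentials, and database names as placeholders:
# Create a consistent copy to export from (run against the logical server's master database)
Invoke-Sqlcmd -ServerInstance 'yourserver.database.windows.net' -Database 'master' `
    -Username 'USERNAME' -Password 'PASSWORD' `
    -Query 'CREATE DATABASE [MyDb_copy] AS COPY OF [MyDb];'
# Wait for the copy to complete, then export MyDb_copy with sqlpackage /Action:Export and drop the copy afterwards.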
You can add it using PowerShell, as in the example below.
Param(
    [Parameter(Position=0,Mandatory=$true)]
    [string]$ServerName
)
cls
try {
    if ((Get-PSSnapin -Name SqlServerCmdletSnapin100 -ErrorAction SilentlyContinue) -eq $null) {
        Add-PSSnapin SqlServerCmdletSnapin100
    }
}
catch {
    Write-Error "This script requires the SqlServerCmdletSnapin100 snapin"
    exit
}
$script_path = Split-Path -Parent $MyInvocation.MyCommand.Definition
$sql = "
SELECT name
FROM sys.databases
WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb','distribution')
"
$data = Invoke-sqlcmd -Query $sql -ServerInstance $ServerName -Database master
$data | ForEach-Object {
    $DatabaseName = $_.name
    # Format the date; the default Get-Date output contains ':' and '/', which are invalid in file names
    $now = Get-Date -Format 'yyyyMMdd_HHmmss'
    #
    # Run sqlpackage
    #
    & "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" `
        /Action:extract `
        /SourceServerName:$ServerName `
        /SourceDatabaseName:$DatabaseName `
        /TargetFile:"$script_path\DACPACs\$DatabaseName$now.dacpac" `
        /p:ExtractReferencedServerScopedElements=False `
        /p:IgnorePermissions=False
}
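Assuming you save the script as BackupDatabases.ps1 (the name is illustrative), you would call it like this, and one timestamped file per database ends up in the DACPACs subfolder:
.\BackupDatabases.ps1 -ServerName 'yourserver.database.windows.net'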
Hope this helps.
Regards,
Alberto Morillo
SQLCoffee.com
I would like to get a list of the packages in my Visual Studio solution after I run the nuget restore command.
How can I do it from the command line or PowerShell (outside Visual Studio)?
You could run the following PowerShell script to list all installed packages in your solution. Please modify $SOLUTIONROOT to your solution path.
#This will be the root folder of all your solutions - we will search all children of this folder
$SOLUTIONROOT = "D:\Visual Studio 2015 Project\SO Case Sample\PackageSource"
Function ListAllPackages ($BaseDirectory)
{
    Write-Host "Starting Package List - This may take a few minutes ..."
    $PACKAGECONFIGS = Get-ChildItem -Recurse -Force $BaseDirectory -ErrorAction SilentlyContinue |
        Where-Object { ($_.PSIsContainer -eq $false) -and ($_.Name -eq "packages.config") }
    ForEach ($PACKAGECONFIG in $PACKAGECONFIGS)
    {
        $path = $PACKAGECONFIG.FullName
        $packages = [xml](Get-Content $path)   # parse packages.config as XML
        foreach ($package in $packages.packages.package)
        {
            Write-Host $package.id
        }
    }
}
ListAllPackages $SOLUTIONROOT
Write-Host "Press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
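If you only need a quick list at the prompt, the same packages.config files can be summarized in a one-liner (a sketch under the same assumptions, run from the solution folder):
Get-ChildItem -Recurse -Filter packages.config | ForEach-Object { ([xml](Get-Content $_.FullName)).packages.package } | Select-Object id, version -Unique | Sort-Object id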
I am using a batch script to get the latest version of specific projects. This script only runs tf.exe and gets the latest version of some binaries. Everything works fine, but I would like to change the attributes of the downloaded files to be writeable (by default these files are read-only). For that I want to determine the local path of the files and use the attrib command from batch.
tf.exe workfold [Workspace] shows me the local path in some kind of listing, but it would be easier if it only showed me what I want so I can use it at the prompt. Right now the output looks like this:
tf.exe workfold [Workspace]
=======================================
Arbeitsbereich: XYZ-xxxxxx (Username)
Auflistung: TFS-URL
[Workspace]: C:\xxx\TFS\xxx
Is it possible to determine only the local path mapping of a TFS workspace, so that I can use it in the attrib command without parsing?
What about the following (crude!!!) concept?
function Get-TfsWorkfold([string]$TfsCollection, [string]$TfsWorkspace)
{
    $TfExePath = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 10.0\Common7\IDE\TF.exe"
    Write-Output "Getting workfold for '$TfsCollection'->'$TfsWorkspace'..."
    & "$TfExePath" workfold /collection:$TfsCollection /workspace:$TfsWorkspace
}
function Handle-Path()
{
    param([Parameter(ValueFromPipeline=$true,Position=0)] [string] $line)
    $startIndex = $line.IndexOf(': ') + 2
    $correctedLine = $line.Substring($startIndex, $line.Length - $startIndex - 1)
    Write-Output $correctedLine
    Get-ChildItem $correctedLine
}
Get-TfsWorkfold "{serverAndcollection}" "{workspace}" > c:\temp\test.txt
Select-String c:\temp\test.txt -pattern:': ' | Select-Object Line | Handle-Path
The last line in Handle-Path is the example, which you can rewrite with whatever you want. It is PowerShell, but it should work as you want.
Replace {serverAndcollection} and {workspace}.
Real men do it in one line
powershell -command "& {tf workfold | Select-String -pattern:' $' -SimpleMatch | Select-Object Line | ForEach-Object {$startIndex = $_.Line.IndexOf(': ') + 2; $_.Line.subString($startIndex, $_.Line.length - $startIndex - 1)}}"
The current answer will only return the last path if there are many.
You can also do it without any string manipulation, with calls to TF.exe. I have wrapped that in PowerShell scripts, so you get the following:
function Add-TfsTypes
{
# NOTE: Not all of the below are needed, but these are all the assemblies we load at the moment. Please note that especially NewtonSoft dll MUST be loaded first!
$PathToAssemblies = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer"
Add-Type -Path "$PathToAssemblies\NewtonSoft.Json.dll"
Add-Type -Path "$PathToAssemblies\System.Net.http.formatting.dll"
Add-Type -Path "$PathToAssemblies\Microsoft.TeamFoundation.Client.dll"
Add-Type -Path "$PathToAssemblies\Microsoft.TeamFoundation.Common.dll"
Add-Type -Path "$PathToAssemblies\Microsoft.TeamFoundation.VersionControl.Client.dll"
Add-Type -Path "$PathToAssemblies\Microsoft.TeamFoundation.WorkItemTracking.Client.dll"
}
function Get-TfsServerPathFromLocalPath {
param(
[parameter(Mandatory=$true)][string]$LocalPath,
[switch]$LoadTfsTypes
)
if ($LoadTfsTypes) {
Add-TfsTypes # Loads dlls
}
$workspaceInfo = [Microsoft.TeamFoundation.VersionControl.Client.Workstation]::Current.GetLocalWorkspaceInfo($LocalPath)
$server = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection $workspaceInfo.ServerUri
$workspace = $workspaceInfo.GetWorkspace($server)
return $workspace.GetServerItemForLocalItem($LocalPath)
}
The above method can then be called like this:
$serverFolderPath = Get-TfsServerPathFromLocalPath $folderPath -LoadTfsTypes
$anotherServerPath = Get-TfsServerPathFromLocalPath $anotherItemToTestPathOn
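If the local path is mapped in a workspace, $serverFolderPath then holds the corresponding server path, e.g. something of the form $/TeamProject/SomeFolder (illustrative); otherwise GetLocalWorkspaceInfo returns null and the call fails.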