We have an SSIS project in which one of the packages connects to a REST API. We use the HTTP connection manager (with username/password) and a script component to open the connection manager and parse the response. The protection level for all packages is EncryptSensitiveWithUserKey. Everything works in Visual Studio, and the project can be deployed with the Deployment Wizard to the SSIS catalog (SSISDB). From SSISDB we can run the package and also change the connection manager username/password via environments.
But we are not able to achieve this via our normal automated deployment: check in to TFS and deploy with a VSTS build server running PowerShell scripts. When running the package from SSISDB we get:
Failed to decrypt protected XML node "DTS:Property" with error 0x80070002 "The system cannot find the file specified.".
You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available.
We (believe we) know how SSIS protection levels and encryption work, and the cause seems obvious: the package is encrypted with the developer's user key, and the Deployment Wizard (run by the developer!) decrypts it and re-encrypts it with the SSIS catalog key. The build server does not have that user key, so its decryption step fails.
However, we would expect this not to matter, since the password is replaced by the SSIS environment at runtime, yet we still get the above error.
We have tried all protection levels:
DontSaveSensitive: the package can't run in either Visual Studio or SSISDB.
EncryptSensitiveWithPassword: passwords are not supported by the PowerShell $folder.DeployProject call (same method as here); see the sketch below for the kind of call we use.
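For context, our automated deployment uses the catalog object model roughly like this (a minimal sketch; the server, folder, project, and path names are placeholders):
# Minimal sketch of the automated deployment call; all names and paths are placeholders.
Add-Type -AssemblyName "Microsoft.SQLServer.Management.IntegrationServices, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"
$connection = New-Object System.Data.SqlClient.SqlConnection "Data Source=TargetServer;Initial Catalog=master;Integrated Security=SSPI;"
$integrationServices = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices $connection
$folder = $integrationServices.Catalogs["SSISDB"].Folders["MyFolder"]
# DeployProject takes the project name and the raw bytes of the .ispac;
# it has no parameter for an EncryptSensitiveWithPassword password.
$projectBytes = [System.IO.File]::ReadAllBytes("C:\drop\MyProject.ispac")
$folder.DeployProject("MyProject", $projectBytes) | Out-Null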
With EncryptSensitiveWithUserKey, you can try setting up a build/release agent on your own machine, changing its service account to your account, and then deploying through that agent.
I am encountering the same problem now with Azure DevOps and the SSIS DevOps tasks targeting SQL Server 2016.
I suspect that the Microsoft.SqlServer.Management.IntegrationServices assembly behaves differently from the ISDeploymentWizard executable.
I have found that this issue occurs only for sensitive package parameters, not for project parameters, so one solution is to replace your sensitive package parameters with project parameters.
The issue occurred when running a package with a sensitive package parameter directly from the catalog, but in some cases the same package ran without issue when executed as a child package.
I also found that some packages reported successful execution, yet the event messages still contained the Failed to decrypt protected XML node "DTS:Property" with error 0x80070002 error.
An alternative solution is to execute ISDeploymentWizard from the command line. This requires that the target catalog folder already exists, as the wizard will not create it, so a preceding step is needed to create the catalog folder if it is missing.
The PowerShell script below should work for SQL Server 2016 as-is:
### Variables
$targetServer = "localhost"
$targetCatalogFolder = "IsDeploymentWizard"
$sourceFolder = "C:\Users\mhept\source\repos\SsisDeploy\AzureDevOpsSensitiveInChildPackage"
### Ensure Target Catalog Folder Exists
Add-Type -AssemblyName "Microsoft.SQLServer.Management.IntegrationServices, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91, processorArchitecture=MSIL"
$ssisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
# Create a connection to the server
$sqlConnectionString = "Data Source=" + $targetServer + ";Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
# Create the Integration Services object
$integrationServices = New-Object $ssisNamespace".IntegrationServices" $sqlConnection
# Get the Integration Services catalog
$catalog = $integrationServices.Catalogs["SSISDB"]
$catalogFolder = $catalog.Folders[$targetCatalogFolder]
if($null -eq $catalogFolder){
    # Create the target folder
    Write-Host "Creating Catalog Folder $targetCatalogFolder"
    $catalogFolder = New-Object $ssisNamespace".CatalogFolder" ($catalog, $targetCatalogFolder, "")
    $catalogFolder.Create()
}
$targetCatalogPath = "/SSISDB/$targetCatalogFolder"
$ispacs = Get-ChildItem -Path $sourceFolder -Filter "*.ispac" -Recurse
$isDeploymentWizard = Get-ItemPropertyValue -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\130\SSIS\Setup\DeploymentWizardPath" -Name "(default)"
foreach($ispac in $ispacs) {
    $projectName = $ispac.BaseName
    $sourcePath = $ispac.FullName
    Write-Host "Deploying $projectName ..."
    Start-Process -Wait -FilePath $isDeploymentWizard -ArgumentList "/Silent", "/SourceType:File", "/ModelType:Project", "/SourcePath:$sourcePath", "/DestinationServer:$targetServer", "/DestinationPath:$targetCatalogPath/$projectName"
    Write-Host "Successfully deployed $projectName"
}
Related
I'm trying to use the Oracle.ManagedDataAccess package in a PowerShell script in VS Code. For some reason, when I put this at the top of my script I get an error message.
using namespace system.collections.generic
Add-Type -AssemblyName System.Data.OracleClient
Add-Type -Path "C:\Users\me\OneDrive - company\Documents\2021\temp endToEnd\oracle.ManagedDataAccess.Core\oracle.manageddataaccess.core.3.21.50\lib\netstandard2.1\Oracle.ManagedDataAccess.dll"
Error:
Add-Type : Missing an argument for parameter 'AssemblyName'. Specify a parameter of type 'System.String[]' and try again.
I also tried Add-type -Path "C:\Users\me\OneDrive - company\Documents\2021\temp endToEnd\oracle.ManagedDataAccess.Core\oracle.manageddataaccess.core.3.21.50\lib\netstandard2.1\Oracle.ManagedDataAccess.dll"
And it had this error:
Add-Type : Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.
My question is, how do I load this, so I can do the following:
$connectionString = "Data Source=$dataSource;User Id=$username;Password=$password;"
$con = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connectionString)
I am trying to install this package because I get this error when I try to execute the last code line above:
New-Object : Cannot find type [Oracle.ManagedDataAccess.Client.OracleConnection]: verify that the assembly containing this type is loaded.
I tried doing it through the NuGet manager as well (Ctrl+Shift+P, NuGet manager), but the package doesn't come up in the list as far as I can see (I tried odp.net, oracle managed..., etc.).
I had downloaded this from the Oracle website:
oracle.manageddataaccess.core.3.21.50.nupkg
Then I used 7-Zip to unzip it to the location I'm running Add-Type from.
I've been looking at these links:
New-object Oracle.ManagedDataAccess.Client.OracleConnection
oracle-developer-tools-vs-code
install nuget package in vs code
I can't seem to get this installed so the command works in the script. Any help would be appreciated.
You're basically treating PowerShell like a client application, so you'll want the Oracle Data Access Components (ODAC) driver. PowerShell runs on managed .NET, so you'll want the managed provider, and most likely the 64-bit one unless for some reason you're running 32-bit PowerShell. Beyond that, it will likely depend on which version works best with your Oracle database.
For example 12cR1:
Download ODP.NET_Managed_ODAC122cR1.zip
extract odp.net\managed\common\Oracle.ManagedDataAccess.dll
PS C:\working> add-type -path (ls .\Oracle.ManagedDataAccess.dll).FullName
PS C:\working> $OraEntry = '(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(Host=database.example.com)(Port=1234)))(CONNECT_DATA=(service_name=BigData)))'
PS C:\working> $con = [Oracle.ManagedDataAccess.Client.OracleConnection]::new()
PS C:\working> $con.ConnectionString = "Data Source=$OraEntry;User Id=$username;Password=$password"
PS C:\working> $con.Open()
If you don't know what the OraEntry should be, you can likely copy it from your tnsnames.ora and/or check with your database admin.
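Once the connection is open, querying works like any other ADO.NET provider; a minimal sketch continuing from the $con opened above (the query is just an example):
# Sketch: run a trivial query against the opened connection from the example above.
$cmd = $con.CreateCommand()
$cmd.CommandText = "SELECT sysdate FROM dual"
$reader = $cmd.ExecuteReader()
while ($reader.Read()) { $reader.GetValue(0) }
$reader.Dispose()
$cmd.Dispose()
$con.Close()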
Have a look at How the Runtime Locates Assemblies
Most .NET assemblies are loaded either from the current directory or from the Global Assembly Cache (GAC). The GAC takes precedence over locally stored files.
Check the files in your downloaded package; there should be an OraProvCfg.exe. Use it to add the DLL to the GAC and do the configuration:
OraProvCfg.exe /action:config /product:odpm /frameworkversion:v4.0.30319 /providerpath:"C:\Users\me\OneDrive - company\Documents\2021\temp endToEnd\oracle.ManagedDataAccess.Core\oracle.manageddataaccess.core.3.21.50\lib\netstandard2.1\Oracle.ManagedDataAccess.dll"
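After OraProvCfg has registered the assembly, a quick sanity check (just a sketch; LoadWithPartialName is deprecated but fine for a one-off test) is to confirm PowerShell now resolves the type from the GAC:
# Sketch: confirm the assembly resolves from the GAC before relying on it.
[System.Reflection.Assembly]::LoadWithPartialName("Oracle.ManagedDataAccess") | Out-Null
$con = New-Object Oracle.ManagedDataAccess.Client.OracleConnection
$con.GetType().Assembly.Location  # should now point at the GAC copy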
I'm trying to automate the configuration of a WSUS replica server. I'm mostly there but can't configure the option to download files from Microsoft Update.
I want my replica to pull updates from Microsoft and not from the upstream server. This is the option in the UI:
WSUS Config
I had a look through GetConfiguration() but have been unable to find the option. I also looked through the registry but can't see a key associated with it.
Current code is:
Set-WsusServerSynchronization -UssServerName "SomeServer" -PortNumber 8530 -Replica
$WSUS = Get-Wsusserver
$WSUSConfig = $WSUS.GetConfiguration()
$WSUSConfig.ProxyName = $SomeProxy
$WSUSConfig.ProxyServerPort = $SomeProxyPort
$WSUSConfig.UseProxy = $True
$WSUSConfig.Save()
$Subscription = $WSUS.GetSubscription()
$Subscription.SynchronizeAutomatically = $true
$Subscription.SynchronizeAutomaticallyTimeOfDay = (New-TimeSpan -Hours 1)
$Subscription.NumberOfSynchronizationsPerDay = 1
$Subscription.Save()
Write-Host "Configured WSUS"
I could have used the PowerShell cmdlet again and just done the below to configure it to sync from Microsoft:
Set-WsusServerSynchronization -SyncFromMU
Answered my own question. For anyone else: $WSUSConfig.GetContentFromMU = $True
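Slotted into the configuration code from the question, that looks roughly like this (a sketch following the same pattern as above):
# Sketch: configure the replica to download update content from Microsoft Update
# instead of the upstream server.
$WSUS = Get-WsusServer
$WSUSConfig = $WSUS.GetConfiguration()
$WSUSConfig.GetContentFromMU = $True
$WSUSConfig.Save()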
I am doing a VSTS build. It runs perfectly on my machine, but when we try to run it on the build server it gets stuck at one step, "deleting .manifest.pregam", and the build is terminated after the buffer time. We suspect the problem may be the certificate installation on the build machine, as the log files show the build is waiting for "SignFile". Can anyone suggest how to overcome this hurdle? Your suggestions will be appreciated.
You need to import the certificate file (e.g. the .pfx) before the Visual Studio Build step.
Add a PowerShell task to the build definition. Working folder: [certificate file path]
Script:
$pfxpath = 'pathtoees.pfx'
$password = 'password'
Add-Type -AssemblyName System.Security
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxpath, $password, [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]"PersistKeySet")
$store = new-object system.security.cryptography.X509Certificates.X509Store -argumentlist "MY", CurrentUser
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]"ReadWrite")
$store.Add($cert)
$store.Close()
Then add the Visual Studio Build task.
A related thread: Visual Studio Team Services deployment/build certificate error
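If the build still hangs on signing after the import, one thing worth checking (a sketch; the subject filter is a placeholder) is whether the certificate actually landed in the store for the account the agent runs as:
# Sketch: verify the imported certificate is present in CurrentUser\My
# for the account the build agent runs under; the subject filter is a placeholder.
Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Subject -like "*YourCertSubject*" } |
    Format-List Subject, Thumbprint, NotAfter, HasPrivateKey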
For a manual build on VSTS online deploying to Azure, things work fine.
But when the same build is triggered from my repo, I get this:
"No agent could be found with the following capabilities: msbuild, visualstudio, sqlpackage"
If I go to my Build def -> General tab:
The Hosted VS2017 agent does not have the SqlPackage capability. I submitted a user voice for it here: VSTS SQL Package deploy with Hosted VS2017 agent.
The workaround is to deploy the SQL package to Azure through PowerShell:
param(
    [string]$publish_profile,
    [string]$path_to_snapshots,
    [string]$database
)
$psPath=$PSScriptRoot
write-output $psPath
Add-Type -Path "$psPath\lib\Microsoft.SqlServer.Dac.dll"
$dacProfile = [Microsoft.SqlServer.Dac.DacProfile]::Load($publish_profile)
$dacService = new-object Microsoft.SqlServer.Dac.DacServices($dacProfile.TargetConnectionString)
$files = Get-ChildItem "$path_to_snapshots\*.dacpac"
Write-Output $path_to_snapshots
Write-Output $files.Length
foreach ($file in $files)
{
    $fileName = $file.Name
    Try
    {
        Write-Output $fileName
        $dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($file.FullName)
        $dacService.Deploy($dp, $database, $true)
        Start-Sleep -s 300
    }
    Catch
    {
        Write-Host "$fileName deployment has failed" -ForegroundColor "Red"
        $Error | Format-List -Force
        Write-Host $Error[0].Exception.ParentContainsErrorRecordException
        Break
    }
}
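The script can then be invoked from a PowerShell task with its three parameters, for example (the script file name and paths are placeholders):
# Sketch: invoking the deployment script above; the file name and paths are placeholders.
.\Deploy-Dacpacs.ps1 -publish_profile "C:\drop\MyDb.publish.xml" `
    -path_to_snapshots "C:\drop\snapshots" `
    -database "MyDatabase"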
Related thread: Deploy Dacpac packages via power shell script to Azure SQL Server
I added a copy of this Microsoft script to my pipeline, which returns the path to SqlPackage.exe on the agent:
https://github.com/Microsoft/azure-pipelines-tasks/blob/master/Tasks/SqlAzureDacpacDeploymentV1/FindSqlPackagePath.ps1
At the time of writing, the returned path is:
C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe
If at any point the location changes, in theory it should just be a case of updating the path to the new location reported by the script.
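With that path, the deployment itself is a single SqlPackage.exe call; a minimal sketch (the server, database, credentials, and dacpac file names below are placeholders):
# Sketch: publish a dacpac with the SqlPackage.exe located by the script above;
# server, database, credentials, and file names are placeholders.
$sqlPackage = "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"
& $sqlPackage /Action:Publish `
    /SourceFile:"C:\drop\MyDb.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"deployuser" `
    /TargetPassword:"PlaceholderPassword"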
Is it possible for a currently running TeamCity build to detect that it is a "history build" through a build parameter or API call? I have build configurations that end by publishing a NuGet package and would like to hold off on publishing if a "history build" is detected. The TeamCity server seems to already detect this state, as the running build is shown in gray.
You can use the REST API. Something like http://MyTeamCity/app/rest/builds/BUILD_ID.
To get the BUILD_ID of the current build, use %teamcity.build.id% (the TeamCity parameter).
The call returns XML with details about the given build. If historical, there's a history="true" attribute.
My team was having similar issues with history builds and NuGet packages. Using the tips from sferencik, I was able to put together the following PowerShell script, which runs as the first step of all of our build projects (be sure to set the PowerShell runner option "Format stderr output as" to error). This causes the build to stop when it detects a history build.
Write-Host "Checking for history build."
$buildUrl = "%teamcity.serverUrl%/app/rest/builds/%teamcity.build.id%"
Write-Host "URL for build data is $buildUrl"
$wc = New-Object System.Net.WebClient
$cred = New-Object System.Net.NetworkCredential("%system.teamcity.auth.userId%", "%system.teamcity.auth.password%")
$wc.Credentials = $cred
[xml]$buildXml = $wc.DownloadString($buildUrl)
$isHistory = $buildXml.build.GetAttribute("history") -eq "true"
if ($isHistory) {
    Write-Host "##teamcity[buildProblem description='This is a history build and is being abandoned.']"
    Write-Error "Build is history"
} else {
    Write-Host "Build is not history"
}