I'm trying to automate the configuration of a WSUS replica server. I'm mostly there, but I can't configure the option to download files from Microsoft Update.
I want my replica to pull updates from Microsoft rather than from the upstream server. This is the option in the UI:
WSUS Config
I had a look through GetConfiguration() but have been unable to find the option. I also looked through the registry but can't see a key associated with the option.
Current code is:
Set-WsusServerSynchronization -UssServerName "SomeServer" -PortNumber 8530 -Replica
$WSUS = Get-Wsusserver
$WSUSConfig = $WSUS.GetConfiguration()
$WSUSConfig.ProxyName = $SomeProxy
$WSUSConfig.ProxyServerPort = $SomeProxyPort
$WSUSConfig.UseProxy = $True
$WSUSConfig.Save()
$Subscription = $WSUS.GetSubscription()
$Subscription.SynchronizeAutomatically = $true
$Subscription.SynchronizeAutomaticallyTimeOfDay = (New-TimeSpan -Hours 1)
$Subscription.NumberOfSynchronizationsPerDay = 1
$Subscription.Save()
Write-Host "Configured WSUS"
I could have used the PowerShell cmdlet again and just done the following to configure it to sync from Microsoft:
Set-WsusServerSynchronization -SyncFromMU
Answered my own question. For anyone else: $WSUSConfig.GetContentFromMU = $True
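For anyone assembling the whole thing, the property slots in alongside the existing configuration block; a minimal sketch:

```powershell
$WSUS = Get-WsusServer
$WSUSConfig = $WSUS.GetConfiguration()

# Download update files from Microsoft Update instead of the upstream server
$WSUSConfig.GetContentFromMU = $True
$WSUSConfig.Save()
```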
If I install IIS and the .NET Hosting Bundle in quick succession like this (on a fresh Windows Server machine):
Install-WindowsFeature -Name Web-Server -IncludeAllSubFeature -IncludeManagementTools
# Ignore the next line for now; it's my current workaround
Start-Sleep -Seconds 120
Write-Host "-- Installing Dotnet Hosting Bundle"
$ErrorActionPreference = "Stop";
$tempDir = [System.IO.Path]::GetTempPath()
$downloadPath = "$tempDir\netCoreHostingBundle.exe";
$DefaultProxy = [System.Net.WebRequest]::DefaultWebProxy;
$securityProtocol = @();
$securityProtocol += [Net.ServicePointManager]::SecurityProtocol;
$securityProtocol += [Net.SecurityProtocolType]::Tls12;
[Net.ServicePointManager]::SecurityProtocol = $securityProtocol;
$WebClient = New-Object Net.WebClient;
$Uri = 'https://download.visualstudio.microsoft.com/download/pr/0d000d1b-89a4-4593-9708-eb5177777c64/cfb3d74447ac78defb1b66fd9b3f38e0/dotnet-hosting-6.0.6-win.exe';
if ($DefaultProxy -and (-not $DefaultProxy.IsBypassed($Uri))) { $WebClient.Proxy = New-Object Net.WebProxy($DefaultProxy.GetProxy($Uri).OriginalString, $True); };
$WebClient.DownloadFile($Uri, $downloadPath);
$arguments = New-Object -TypeName System.Collections.Generic.List[System.String]
$arguments.Add("/quiet")
$arguments.Add("/norestart")
Start-Process -FilePath $downloadPath -ArgumentList $arguments -NoNewWindow -Wait -PassThru -WorkingDirectory $tempDir
Write-Host "-- Restarting IIS"
Stop-Service W3SVC
Start-Service W3SVC
Get-Service W3SVC
Everything works out fine from the installation point of view. But if I run a .NET Core application in IIS, the following error occurs:
HTTP Error 500.19 - HRESULT code 0x8007000d
Googling around, this happens when the "Hosting Bundle is installed before IIS". The simple solution is given in the next sentence: "the bundle installation must be repaired", and indeed this works.
The question now is:
How do I avert the situation altogether?
or: How do I wait until IIS is really installed, so that it is safe to install the Hosting Bundle?
If the Hosting Bundle is installed before IIS, the bundle installation
must be repaired. Run the Hosting Bundle installer again after
installing IIS.
This is the method I found in another site:
Use a custom action that enables IIS before the installation of ".NET
Core IIS Hosting" prerequisite.
For example, you can add a non-sequential custom action (so it can be triggered from a UI control) and schedule it on the "Next" button of the "WelcomePrereqDlg" dialog ("Dialogs" page --> "Pre-install UI"). This will enable IIS before the prerequisites are installed.
The method comes from this link: https://www.advancedinstaller.com/forums/viewtopic.php?t=44696
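If you are scripting the install in PowerShell as in the question, another option is to poll the feature state instead of relying on a fixed Start-Sleep. A sketch, assuming a five-minute ceiling is acceptable; whether this fully removes the need for the repair step is untested:

```powershell
# Poll until the Web-Server feature reports Installed (max ~5 minutes)
$deadline = (Get-Date).AddMinutes(5)
while ((Get-WindowsFeature -Name Web-Server).InstallState -ne 'Installed') {
    if ((Get-Date) -gt $deadline) { throw "IIS did not finish installing in time" }
    Start-Sleep -Seconds 5
}
Write-Host "-- IIS is installed; proceeding with the Hosting Bundle"
```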
I am getting the error "Argument types do not match" when trying to send email using MailKit/MimeKit version 2.5.1. It occurs on the production server, which runs PowerShell 4.0, but the same code works on the developer machine with PowerShell 5.1.
The system information for the production and developer machines follows:
I get the error on production in the try/catch after it tries to run $MkSmtp.Send($Message):
The code I have looks as follows:
#Load the .NET Core class dll for Mailkit based on .NET 4.5
Add-Type -Path "c:\Windows\System32\WindowsPowerShell\v1.0\Modules\EmailUtilities\MailKit.dll"
Add-Type -Path "c:\Windows\System32\WindowsPowerShell\v1.0\Modules\EmailUtilities\MimeKit.dll"
#Server and Mailbox properties for less secure access when connecting to google
$MkSmtp = New-Object MailKit.Net.Smtp.SmtpClient
$CanToken = New-Object System.Threading.CancellationToken ($false)
$SSL = [MailKit.Security.SecureSocketOptions]::SslOnConnect
$MkSmtp.Connect($MailServer, $Port, $SSL, $CanToken)
$MkSmtp.Authenticate(([System.Text.Encoding]::UTF8), $Username, $Password, $CanToken)
#have the building of the message in a separate function which also worked
#on development PowerShell 5.1 but not on production PowerShell 4 but rebuilt message
#here until I figure it out
#--------------
$Message = New-Object MimeKit.MimeMessage
$Message.From.Add($global:EmailFrom)
$Message.To.Add($global:EmailTo)
$Message.Subject = "Test";
$TextPart = New-Object MimeKit.TextPart ("plain")
$TextPart.Text = "Testing message..."
$Message.Body = $TextPart
#$Options = New-Object MimeKit.Net.Smtp.FormatOptions.Default.Clone
#$Options.International = $true
#$Message = New-MimeMessage $global:EmailFrom $global:EmailTo $Subject $LogMsg
$MkSmtp.Send($Message)
$MkSmtp.Disconnect($true)
$MkSmtp.Dispose()
#--------------
I updated our live server to Windows Management Framework 5.1 (PowerShell) and it now works as expected, just like my development machine.
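If you can't upgrade immediately, a guard at the top of the script at least makes the failure mode obvious. A minimal sketch:

```powershell
# The MailKit calls above misbehaved under PowerShell 4.0; fail fast on older engines
if ($PSVersionTable.PSVersion.Major -lt 5) {
    throw "PowerShell 5.1 or later is required; found $($PSVersionTable.PSVersion)"
}
```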
We have an SSIS project where one of the packages connects to a REST API. We use the HTTP connection manager (with username/password) and a script component to open the connection manager and parse the response. The protection level for all packages is EncryptSensitiveWithUserKey. Everything works in Visual Studio, and the project can be deployed with the Deployment Wizard to the SSISDB. In the SSISDB we can run the package, and also change the connection manager password/username via environments.
But we are not able to achieve this via our normal automated deployment: check-in to TFS and a VSTS build server with PowerShell scripts. When running the package from the SSISDB we get:
Failed to decrypt protected XML node "DTS:Property" with error 0x80070002 "The system cannot find the file specified.".
You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available.
We (believe we) know how SSIS protection levels and encryption work, and the cause is obvious: the SSIS file is encrypted with the user key, and the Deployment Wizard (run by the developer!) decrypts and re-encrypts with the SSIS catalog key. But the build server does not have the user key, hence the decryption step fails.
However, we would expect this not to be an issue, since the password is replaced by the SSIS environment, but it gives the above error.
We have tried all protection levels:
DontSaveSensitive: the package can't run in either VS or the SSISDB.
EncryptSensitiveWithPassword: passwords are unsupported by the PowerShell $folder.DeployProject command. Same method as here.
With EncryptSensitiveWithUserKey mode, you can try to set up the build/release agent on your own machine and change the service account to your account, then deploy through this agent.
I am encountering the same problem now with Azure DevOps and the SSIS DevOps tasks targeting SQL Server 2016.
I suspect that using the Microsoft.SQLServer.Management.IntegrationServices assembly behaves differently to the ISDeploymentWizard executable.
I have found that this issue occurs for sensitive package parameters only and not project parameters so one solution is to replace your sensitive package parameters with project parameters.
The issue would occur when running a package with a sensitive package parameter from the catalog, but in some cases the package would run without issue when executed as a child package.
I also found that some packages would report successful execution, but looking at the event messages, the Failed to decrypt protected XML node "DTS:Property" with error 0x80070002 error would still be present.
An alternative solution is to execute the ISDeploymentWizard from the command line. This does require that the target catalog folder already exists, as the wizard will not create it; therefore a preceding step is needed to create the catalog folder if it does not already exist.
The PowerShell script below should work for SQL Server 2016 as-is:
### Variables
$targetServer = "localhost"
$targetCatalogFolder = "IsDeploymentWizard"
$sourceFolder = "C:\Users\mhept\source\repos\SsisDeploy\AzureDevOpsSensitiveInChildPackage"
### Ensure Target Catalog Folder Exists
Add-Type -AssemblyName "Microsoft.SQLServer.Management.IntegrationServices, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91, processorArchitecture=MSIL"
$ssisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
# Create a connection to the server
$sqlConnectionString = "Data Source=" + $targetServer + ";Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
# Create the Integration Services object
$integrationServices = New-Object $ssisNamespace".IntegrationServices" $sqlConnection
# Get the Integration Services catalog
$catalog = $integrationServices.Catalogs["SSISDB"]
$catalogFolder = $catalog.Folders[$targetCatalogFolder]
if ($null -eq $catalogFolder) {
    # Create the target folder
    Write-Host "Creating Catalog Folder $targetCatalogFolder"
    $catalogFolder = New-Object $ssisNamespace".CatalogFolder" ($catalog, $targetCatalogFolder, "")
    $catalogFolder.Create()
}
$targetCatalogPath = "/SSISDB/$targetCatalogFolder"
$ispacs = Get-ChildItem -Path $sourceFolder -Filter "*.ispac" -Recurse
$isDeploymentWizard = Get-ItemPropertyValue -Path "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\130\SSIS\Setup\DeploymentWizardPath" -Name "(default)"
foreach ($ispac in $ispacs) {
    $projectName = $ispac.BaseName
    $sourcePath = $ispac.FullName
    Write-Host "Deploying $projectName ..."
    Start-Process -Wait -FilePath $isDeploymentWizard -ArgumentList "/Silent", "/SourceType:File", "/ModelType:Project", "/SourcePath:$sourcePath", "/DestinationServer:$targetServer", "/DestinationPath:$targetCatalogPath/$projectName"
    Write-Host "Successfully deployed $projectName"
}
A manual build on VSTS online deploying to Azure works fine.
But when I trigger the same build from my repo, I get this:
"No agent could be found with the following capabilities: msbuild, visualstudio, sqlpackage"
If I go to my build definition -> General tab
There isn't a SqlPackage capability in the Hosted VS2017 agent. I submitted a user voice here: VSTS SQL Package deploy with Hosted VS2017 agent.
The workaround is to deploy the SQL package to Azure through PowerShell:
param(
    [string]$publish_profile,
    [string]$path_to_snapshots,
    [string]$database
)
$psPath=$PSScriptRoot
write-output $psPath
Add-Type -Path "$psPath\lib\Microsoft.SqlServer.Dac.dll"
$dacProfile = [Microsoft.SqlServer.Dac.DacProfile]::Load($publish_profile)
$dacService = new-object Microsoft.SqlServer.Dac.DacServices($dacProfile.TargetConnectionString)
$files = Get-ChildItem "$path_to_snapshots\*.dacpac"
Write-Output $path_to_snapshots
Write-Output $files.Length
foreach ($file in $files)
{
    $fileName = $file.Name
    Try
    {
        Write-Output $fileName
        $dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($file.FullName)
        $dacService.Deploy($dp, $database, $true)
        Start-Sleep -s 300
    }
    Catch
    {
        Write-Host "$fileName deployment failed" -ForegroundColor "red"
        $Error | Format-List -Force
        Write-Host $Error[0].Exception.ParentContainsErrorRecordException;
        Break
    }
}
Related thread: Deploy Dacpac packages via power shell script to Azure SQL Server
I added a copy of this Microsoft script to my pipeline, which returns the path to SqlPackage.exe on the agent.
https://github.com/Microsoft/azure-pipelines-tasks/blob/master/Tasks/SqlAzureDacpacDeploymentV1/FindSqlPackagePath.ps1
At the time of writing, the returned path is:
C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe
If the location ever changes, in theory it should just be a case of updating it to the new location reported by the script.
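With the path in hand, the deploy step itself is a single SqlPackage.exe invocation. A sketch; the server, database, credentials, and dacpac path are placeholders:

```powershell
$sqlPackage = "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"

# Publish a dacpac to an Azure SQL database (all target values are placeholders)
& $sqlPackage /Action:Publish `
    /SourceFile:"C:\drop\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"deploy" `
    /TargetPassword:"$env:SQL_PASSWORD"
```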
Is it possible for a currently running TeamCity build to detect that it is a "history build" through a build parameter or api call? I have build configurations that end by publishing a NuGet package and would like to hold off on publishing if a "History build" is detected. TeamCity server seems to already detect this state as the running build is shown as gray.
You can use the REST API. Something like http://MyTeamCity/app/rest/builds/BUILD_ID.
To get the BUILD_ID of the current build, use %teamcity.build.id% (the TeamCity parameter).
The call returns XML with details about the given build. If historical, there's a history="true" attribute.
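A minimal sketch of that check (the server URL and credentials are placeholders; Invoke-RestMethod parses the XML response for you):

```powershell
# Ask the TeamCity REST API whether the current build is a history build
$resp = Invoke-RestMethod -Uri "http://MyTeamCity/app/rest/builds/%teamcity.build.id%" `
    -Credential (Get-Credential)
$isHistory = $resp.build.history -eq 'true'
```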
My team was having similar issues with history builds and NuGet packages. Using the tips from sferencik, I put together the following PowerShell script, which runs as the first step of all of our build projects (be sure to set the PowerShell runner option "Format stderr output as" to error). This will cause the build to stop when it detects a history build.
Write-Host "Checking for history build."
$buildUrl = "%teamcity.serverUrl%/app/rest/builds/%teamcity.build.id%"
Write-Host "URL for build data is $buildUrl"
$wc = New-Object System.Net.WebClient
$cred = New-Object System.Net.NetworkCredential("%system.teamcity.auth.userId%", "%system.teamcity.auth.password%")
$wc.Credentials = $cred
[xml]$buildXml = $wc.DownloadString($buildUrl)
$isHistory = $buildXml.build.GetAttribute("history") -eq "true"
if ($isHistory) {
Write-Host "##teamcity[buildProblem description='This is a history build and is being abandoned.']"
Write-Error "Build is history"
} else {
Write-Host "Build is not history"
}