I have a Windows Server 2016 machine running windowsservercore. I am working on moving our CI pipeline into containers. During our process, we build a version.html file. The file contains build data (like build date and build number) and TFS 2017 project information about merges/branches that have occurred.
We had this working with TeamCity running a PowerShell script that would connect and run a query against TFS 2017. So I looked on Docker Hub for a TFS image, but did not have any luck. I also tried looking under Microsoft on Docker Hub and did not find anything.
I tried to create a new Dockerfile:
FROM microsoft/windowsservercore:10.0.14393.1480
# Setup shell
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
RUN Mkdir BuildStage
COPY powershell/CopyBuildToStageDir.ps1 \BuildStage
COPY powershell/BuildVersionFile.ps1 \BuildStage
RUN dir
But when I ran the PowerShell script inside the Windows container it said...
Unable to find type
[09:25:00][Step 2/2] [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory].
[09:25:00][Step 2/2] At C:\BuildStage\BuildVersionFile.ps1:192 char:12
In the PowerShell script, there is this function:
#============================================================================
# Setup TFS stuff
#============================================================================
function Setup-Tfs {
# Connect to TFS
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client") | out-null
$tfsServer = "http://ourServer";
$tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($tfsServer)
$Script:versionControlServer = $tfs.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer] )
$Script:recursionType = [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full
}
Here are more details of how we are using PowerShell to call TFS to get merge and branch information to build the version.html file...
# Need to get the last 5 changesets of Merge information for both MAIN and Iteration
Setup-Tfs
$baseLocation = "$/OurBaseLocation/"
$locationForMain = $baseLocation + $OurProjectLocation
# Query history for the TFS path
$vCSChangeSets = $versionControlServer.QueryHistory($locationForMain, $recursionType, 5)
# History of Merge changes to MAIN application (only 5 deep)
"<table border='2'>" | Out-File $VersionFname -append
"<caption>Merge Info For: $AppName </caption>" | Out-File $VersionFname -append
# Build out headers
"<TH>Changeset</TH><TH>Date</TH><TH>Comment</TH>" | Out-File $VersionFname -append
Foreach ($vCSChangeSet in $vCSChangeSets) {
# write row
$changeset = $vCSChangeSet.ChangesetID
$CheckinNotesName = $vCSChangeSet.Comment
$CreationDate = $vCSChangeSet.CreationDate
if ($CheckinNotesName.ToUpper().Contains("MERGE")){
"<TR>" | Out-File $VersionFname -append
"<TD>$changeset</TD><TD>$CreationDate</TD><TD>$CheckinNotesName</TD>" | Out-File $VersionFname -append
"</TR>" | Out-File $VersionFname -append
}
if ($CheckinNotesName.ToUpper().Contains("BRANCH")){
"<TR>" | Out-File $VersionFname -append
"<TD>$changeset</TD><TD>$CreationDate</TD><TD>$CheckinNotesName</TD>" | Out-File $VersionFname -append
"</TR>" | Out-File $VersionFname -append
}
}
# close table add space
"</table><BR/><BR/>" | Out-File $VersionFname -append
My guess is that my Dockerfile needs to add something for "Microsoft.TeamFoundation.VersionControl.Client".
Any help would be appreciated.
What I found worked best was to give up on the PowerShell TFS client assemblies and use the TFS REST API instead. Here is an example that gets the properties of a single work item (WI).
#============================================
# Get-TFSFieldsByWiId
#============================================
function Get-TFSFieldsByWiId([string]$Id) {
$url = 'http://YourTFSUrl:YourPort/YourProject/_apis/wit/workitems?ids=' + $Id+'&$expand=all&api-version=YourVersion'
# Step 1. Create a username:password pair (username intentionally empty; $password holds a PAT or password)
$credPair = "$(''):$($password)"
# Step 2. Encode the pair to a Base64 string
$encodedCredentials = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($credPair))
# Step 3. Form the header and add the Authorization attribute to it
$headers = @{ Authorization = "Basic $encodedCredentials" }
# Step 4. Make the GET request (no body is needed for a GET)
$responseData = Invoke-WebRequest -Uri $url -Method Get -Headers $headers -UseBasicParsing -ContentType "application/json"
$data = $responseData.Content
$data = $data | ConvertFrom-Json
$WIDetails = $data.value
return $WIDetails
}
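Usage, assuming $password has been set beforehand (with Basic auth against TFS, the username can stay empty and a personal access token can serve as the password; the values below are placeholders):
$password = 'yourPATorPassword'       # assumption: set before calling the function
$wi = Get-TFSFieldsByWiId -Id '1234'  # hypothetical work item id
$wi.fields                            # the work item's fields, e.g. $wi.fields.'System.Title'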
I have been trying to schedule an ACR build on a machine using PowerShell. The approach I am using is with a service principal (as shown here: https://learn.microsoft.com/en-us/azure/container-registry/container-registry-authentication?tabs=azure-cli)
I have created a build script which works fine if I call it from within the PowerShell console. However, when I schedule the script to run from the Windows Task Scheduler, it seems to skip past the ACR build portion and not execute as expected.
Script below:
$myreg = "myreg"
$myregfull = "myreg.azurecr.io"
$Date = Get-Date -format "yyyyMMdd"
$logfile = "c:\Log-$Date.txt"
$user ="xxx"
$pass="xxx"
$tenant="xxx"
$subscription="xxx"
$myimage="myimage:"
Try {
# 1. Logging in as service principal
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
"--- Logging in as service principal ---" | Out-File -FilePath $logfile -Append
az login --service-principal -u $user -p $pass --tenant $tenant | Out-File -FilePath $logfile -Append
}
Catch{
"Logging in as service principal failed at $(Get-Date). Error: $($_.Exception.Message)" |
Out-File -FilePath $logfile -Append
}
Try {
# 2. Switching to subscription
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
"--- Switching to subscription ---" | Out-File -FilePath $logfile -Append
az account set --subscription $subscription | Out-File -FilePath $logfile -Append
}
Catch{
"Switching to subscription failed at $(Get-Date). Error: $($_.Exception.Message)" |
Out-File -FilePath $logfile -Append
}
Try {
# 3. Logging in to registry
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
"--- Logging in to registry $myreg.azurecr.io ---" | Out-File -FilePath $logfile -Append
$TOKEN=$(az acr login --name $myreg --expose-token --output tsv --query accessToken)
docker login $myregfull -u 00000000-0000-0000-0000-000000000000 -p $TOKEN | Out-File -FilePath $logfile -Append
}
Catch{
"Logging in to registry failed at $(Get-Date). Error: $($_.Exception.Message)" |
Out-File -FilePath $logfile -Append
}
Try {
# 4. Confirm connected
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
"--- Confirming connected ---" | Out-File -FilePath $logfile -Append
az acr show -n $myreg | Out-File -FilePath $logfile -Append
az acr repository list -n $myreg | Out-File -FilePath $logfile -Append
}
Catch{
"Confirm connected failed at $(Get-Date). Error: $($_.Exception.Message)" |
Out-File -FilePath $logfile -Append
}
Try {
# 5. Triggering Build
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
"--- Triggering build of myreg.azurecr.io/myimage:initial ---" | Out-File -FilePath $logfile -Append
az acr build -t $myimage$Date -r $myreg . --platform windows | Out-File -FilePath $logfile -Append
}
Catch{
"Triggerng Build failed at $(Get-Date). Error: $($_.Exception.Message)" |
Out-File -FilePath $logfile -Append
}
$DateForLog = Get-Date | Out-File -FilePath $logfile -Append
When called from the console, the logs show the command being called, then some 15 minutes later (after the context upload) it shows:
2022/06/14 10:26:12 Downloading source code...
It then takes approximately 30 minutes to build before moving to the next step.
Whereas when called from the scheduler, it shows the step finishing in 8 seconds.
The login process is definitely successful though, because the list of repositories is shown no matter where the script is called from.
Any suggestions on what might be causing this issue would be greatly appreciated.
EDIT
Updating the question to show logs.
From scheduler:
14 June 2022 14:46:05
--- Logging in as service principal ---
[
{
"cloudName": "AzureCloud",
"homeTenantId": "xxx",
--- OMITTED ---
"user": {
"name": "xxx",
"type": "servicePrincipal"
}
}
]
14 June 2022 14:46:31
--- Switching to subscription ---
14 June 2022 14:46:36
--- Logging in to registry myreg.azurecr.io ---
Logging in to registry failed at 06/14/2022 14:46:47. Error: The term 'docker' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
14 June 2022 14:46:47
--- Confirming connected ---
{
"adminUserEnabled": true,
"anonymousPullEnabled": false,
"creationDate": "2021-04-06T10:23:22.985285+00:00",
--- OMITTED ---
"type": "Microsoft.ContainerRegistry/registries",
"zoneRedundancy": "Disabled"
}
[
"myrepo1",
--- OMITTED ---
"myrepo2"
]
14 June 2022 14:47:03
--- Triggering build of myreg.azurecr.io/myimage:initial ---
14 June 2022 14:47:12
From console:
14 June 2022 14:50:14
--- Logging in as service principal ---
[
{
"cloudName": "AzureCloud",
"homeTenantId": "xxx",
--- OMITTED ---
"user": {
"name": "xxx",
"type": "servicePrincipal"
}
}
]
14 June 2022 14:50:41
--- Switching to subscription ---
14 June 2022 14:50:47
--- Logging in to registry myreg.azurecr.io ---
Logging in to registry failed at 06/14/2022 14:50:57. Error: The term 'docker' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
14 June 2022 14:50:57
--- Confirming connected ---
{
"adminUserEnabled": true,
"anonymousPullEnabled": false,
"creationDate": "2021-04-06T10:23:22.985285+00:00",
--- OMITTED ---
"type": "Microsoft.ContainerRegistry/registries",
"zoneRedundancy": "Disabled"
}
[
"myrepo1",
--- OMITTED ---
"myrepo2"
]
14 June 2022 14:51:11
--- Triggering build of myreg.azurecr.io/myimage:initial ---
2022/06/14 14:51:23 Downloading source code...
2022/06/14 14:51:29 Finished downloading source code
2022/06/14 14:51:30 Using acb_vol_77064302-024f-4c7c-8933-8f1fc9a4ce4f as the home volume
2022/06/14 14:51:31 Setting up Docker configuration...
2022/06/14 14:51:38 Successfully set up Docker configuration
2022/06/14 14:51:38 Logging in to registry: myreg.azurecr.io
2022/06/14 14:51:42 Successfully logged into myreg.azurecr.io
2022/06/14 14:51:42 Executing step ID: build. Timeout(sec): 28800, Working directory: '', Network: ''
2022/06/14 14:51:42 Scanning for dependencies...
2022/06/14 14:51:46 Successfully scanned dependencies
2022/06/14 14:51:46 Launching container with name: build
Sending build context to Docker daemon 804.4kB
Step 1/7 : FROM myreg.azurecr.io/myimage:empty
empty: Pulling from myimage
4612f6d0b889: Pulling fs layer
5ff1512f88ec: Pulling fs layer
--- OMITTED ---
The problem was that the az acr build command needed an absolute file path for the Dockerfile and also for <SOURCE_LOCATION>.
When called from the console, the current location was used; but when the script was called from the scheduler, the paths needed to be absolute.
So instead of:
az acr build -t $myimage$Date -r $myreg . --platform windows
It needed to be:
az acr build -t $myimage$Date -r $myreg -f c:/path-to-docker-file c:/path-to-source-folder/ --platform windows
The reason this was not evident to begin with was the way I was capturing the logs. No errors or warnings were given when piping the output from az acr build to Out-File -FilePath $logfile.
It was only when I switched to creating a transcript of the session (and removing the piped output) that an error was shown about not being able to find the Dockerfile.
Start-Transcript -Path "E:\transcript.txt" -NoClobber
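For reference, the transcript-based logging that surfaced the error looks roughly like this (a sketch combining the transcript with the un-piped build call from above):
Start-Transcript -Path "E:\transcript.txt" -NoClobber
az acr build -t $myimage$Date -r $myreg -f c:/path-to-docker-file c:/path-to-source-folder/ --platform windows
Stop-Transcript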
I'm learning PowerShell. Currently I have a tough requirement: I need to call a PowerShell script (.ps1) in parallel from a PowerShell module (.psm1). The .ps1 task is like the following:
param(
[Parameter(Mandatory=$true)]
[String] $LogMsg,
[Parameter(Mandatory=$true)]
[String] $FilePath
)
Write-Output $LogMsg
$LogMsg | Out-File -FilePath $FilePath -Append
The FilePath is like "C:\Users\user\Documents\log\log1.log"
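For reference, invoking the script directly works as expected, e.g.:
.\foo.ps1 -LogMsg 'test entry' -FilePath 'C:\Users\user\Documents\log\log1.log'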
And in the .psm1 file, I use a runspace pool to do the async tasks, like the following demo:
$MaxRunspaces = 5
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, $MaxRunspaces)
$RunspacePool.Open()
$Jobs = New-Object System.Collections.ArrayList
Write-Host $currentPath
Write-Host $lcmCommonPath
$Filenames = @("log1.log", "log2.log", "log3.log")
foreach ($File in $Filenames) {
Write-Host "Creating runspace for $File"
$PowerShell = [powershell]::Create()
$PowerShell.RunspacePool = $RunspacePool
$FilePath = -Join("C:\Users\user\Documents\log\",$File)
$PowerShell.AddScript("C:\Users\user\Documents\foo.ps1").AddArgument($FilePath) | Out-Null
$JobObj = New-Object -TypeName PSObject -Property @{
Runspace = $PowerShell.BeginInvoke()
PowerShell = $PowerShell
}
$Jobs.Add($JobObj) | Out-Null
}
But there are two serious problems.
Can't pass parameters to the ps1 file.
I tried creating the file path on the ps1 side, and that works; the files are created. But when I pass the argument from the psm1 file, the files are not created. I also tried using a script block, which can pass the parameters, but since my ps1 code is far larger than the excerpt above, using a script block is not realistic. I need a way to pass parameters to a ps1 file.
Can't get Write-Host output from the ps1 file while the psm1 is still running.
If the runspace pool has this limitation on passing parameters to a ps1 file, is there any other way to run PowerShell scripts as async tasks? Thanks.
Can't pass parameters to the ps1 file.
Use AddParameter() instead of AddArgument() - this will allow you to bind the argument to a specific parameter by name:
$PowerShell.AddScript("C:\Users\user\Documents\foo.ps1").
AddParameter('FilePath', $FilePath).
AddParameter('LogMsg', 'Log Message goes here') | Out-Null
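As a side note (an alternative, not part of the fix above): if AddScript gives you trouble, e.g. because the path contains spaces and gets parsed as script text, the script file can be added as a command instead:
# Alternative (assumption): add the script file as a command rather than as script text
$PowerShell.AddCommand("C:\Users\user\Documents\foo.ps1").
AddParameter('FilePath', $FilePath).
AddParameter('LogMsg', 'Log Message goes here') | Out-Null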
Can't get Write-Host output from the ps1 file while the psm1 is still running.
Correct - you cannot get host output from a script not attached to the host application's default runspace. But if you're using PowerShell 5 or newer, you can collect the resulting information stream from the $PowerShell instance and relay it if you want to:
# Register this event handler after creating `$PowerShell` but _before_ calling BeginInvoke()
Register-ObjectEvent -InputObject $PowerShell.Streams.Information -EventName DataAdded -SourceIdentifier 'WriteHostRecorded' -Action {
$recordIndex = $EventArgs.Index
$data = $PowerShell.Streams.Information[$recordIndex]
Write-Host "async task wrote '$data'"
}
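To actually surface the scripts' output and finish cleanly, the started jobs still need to be drained; a minimal sketch, reusing the $Jobs collection and $RunspacePool from the question:
# Wait for each job to finish, collect its pipeline output, then release it
foreach ($Job in $Jobs) {
    $Job.PowerShell.EndInvoke($Job.Runspace)   # blocks until done; returns the script's output
    $Job.PowerShell.Dispose()
}
$RunspacePool.Close()
$RunspacePool.Dispose()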
Is there a way to convert HTML to plain text?
I have a script that exports all NuGet licenses used in a Visual Studio project to a text file.
Unfortunately, the exports are mostly HTML, and I have found no way to solve this.
# Run in Package Manager Console with `./download-packages-license.ps1`.
# If access denied, execute `Set-ExecutionPolicy -Scope Process -ExecutionPolicy RemoteSigned`.
# Save licenses to One text file and one csv file instead of individual files
$LicensesFile = (Join-Path (pwd) 'licenses\Licenses.txt')
$LicensesFile_csv = (Join-Path (pwd) 'licenses\Licenses.csv')
$results = @()
# Comment out the two lines below if you uncomment the Split-Path line
$solutionFile = "d:\Solutions\SolFile.sln"
cd "d:\Solutions"
# Uncomment the line below if you do not want to use the two lines above
# Split-Path -parent $dte.Solution.FileName | cd;
New-Item -ItemType Directory -Force -Path ".\licenses";
@( Get-Project -All | ? { $_.ProjectName } | % {
Get-Package -ProjectName $_.ProjectName | ? { $_.LicenseUrl }
} ) | Sort-Object Id -Unique | % {
$pkg = $_;
Try
{
if ($pkg.Id -notlike 'microsoft*' -and $pkg.LicenseUrl.StartsWith('http'))
{
Write-Host ("Download license for package " + $pkg.Id + " from " + $pkg.LicenseUrl);
#Write-Host (ConvertTo-Json ($pkg));
$licenseUrl = $pkg.LicenseUrl
if ($licenseUrl.contains('github.com')) {
$licenseUrl = $licenseUrl.replace("/blob/", "/raw/")
}
$extension = ".txt"
if ($licenseUrl.EndsWith(".md"))
{
$extension = ".md"
}
(New-Object System.Net.WebClient).DownloadFile($licenseUrl, (Join-Path (pwd) 'licenses\') + $pkg.Id + $extension);
$licenseText = get-content "$((Join-Path (pwd) 'licenses\') + $pkg.Id + $extension)"
Remove-Item $((Join-Path (pwd) 'licenses\') + $pkg.Id + $extension) -ErrorAction SilentlyContinue -Force
$data = '' | select PkgId, LicenseText
$data.PkgId = $pkg.Id
$data.LicenseText = $licenseText | Out-String
$results += $data
# save in txt file
"Designation: NugetPackage $($pkg.Id)" | Add-Content $LicensesFile
$licenseText | Add-Content $LicensesFile
"" | Add-Content $LicensesFile
"" | Add-Content $LicensesFile
"" | Add-Content $LicensesFile
"" | Add-Content $LicensesFile
Write-Host "Package $($pkg.Id): License Text saved to $LicensesFile" -ForegroundColor Green
}
}
Catch [system.exception]
{
Write-Host ("Could not download license for " + $pkg.Id)
}
}
# save in .csv file
$results | Export-Csv $LicensesFile_csv -nti
Source of the Script here
A user also said, "Unfortunately, most license URLs now point to HTML-only versions (early 2020). For example, licenses.nuget.org ignores any 'Accept: text/plain' (or json) headers and returns html regardless."
So is there even a way to get the license information in plain text?
Thanks and stay healthy!
So is there even a way to get the license information in plain text?
Actually, we do not recommend converting the HTML into plain text format. When you get the license data from nuget.org, it is returned by the site in full HTML format, which is by design.
The returned data also uses various formats for the license field, so we should not casually change the accepted data format (such as to plain text). The only way to do that would be to strip the HTML formatting from the source data, which so far is not feasible from PowerShell.
Therefore, in order to strictly follow the format of the returned data, it is best to use an HTML file to receive the license info. That keeps it consistent with the website's HTML form.
Suggestion
1) Change these lines in the PowerShell script:
$LicensesFile = (Join-Path (pwd) 'licenses\Licenses.html')
$LicensesFile_csv = (Join-Path (pwd) 'licenses\Licenses_csv.html')
And then you can get what you want.
Hope it helps.
I have an SSDT database project in Visual Studio 2013. This is used as the "answer sheet" when publishing database updates to databases in the other environments. I recently came across Jamie Thompson's blog article on DacPacs, where he writes a great summary of what DacPacs are and how to use them.
Now, say I have the following scenario:
The SSDT project in VS2013, which is version 1.0.33
A database in my Dev environment, which is version 1.0.32
A database in my S-test environment, which is version 1.0.31
According to Jamie, publishing database changes using DacPacs is idempotent, i.e. I can publish the DacPac from the SSDT project in bullet 1 to the database in bullet 3, and it will get all the changes done to the database in both version 1.0.32 and 1.0.33, since the DacPac contains information about the entire DB schema (which then also should include changes done in version 1.0.32).
Is this a correct understanding of how publishing a DacPac works?
Yes, once you have defined your model in a DACPAC in a declarative way, you can deploy that model to any target environment with whatever version of your database.
The engine will automatically generate the proper change scripts according to the target.
You can deploy (publish) your model from Visual Studio or from the command line using the SqlPackage.exe utility. Here is an example of a PowerShell script that uses SqlPackage.exe and a publish profile file. You can choose to publish directly or to generate the change script (set the $action variable). The DACPAC file and the publish profile file have to be in the same folder as the ps1 file. A log file will be generated:
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
####################################
$action = 'Publish' #Only generate script: 'Script'; Publish directly: 'Publish'
$databaseName = 'Test'
$serverName = 'localhost'
$dacpacPath = Join-Path $scriptPath '.\Test\bin\Debug\Test.dacpac'
$publishProfilePath = Join-Path $scriptPath '.\Test\Scripts\Publish\Test.publish.xml'
$outputChangeScriptPath = Join-Path $scriptPath 'TestDeploymentScript.sql'
$logPath = Join-Path $scriptPath 'TestDeployment.log'
####################################
$sqlPackageExe = 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe'
if ($action.ToUpper() -eq 'SCRIPT')
{
Write-Host '********************************' | Tee-Object -File "$logPath"
Write-Host '* Database Objects Scripting *' | Tee-Object -File "$logPath"
Write-Host '********************************' | Tee-Object -File "$logPath"
$args = "/Action:Script /TargetDatabaseName:$databaseName /TargetServerName:$serverName " +
"/SourceFile:""$dacpacPath"" /Profile:""$publishProfilePath"" /OutputPath:""$outputChangeScriptPath"" "
$command = "& ""{0}"" {1}" -F $sqlPackageExe, $args
Invoke-Expression $command | Tee-Object -File "$logPath"
if($LASTEXITCODE -ne 0)
{
$commandExitCode = $LASTEXITCODE
$Error[0] | Tee-Object -File $outputChangeScriptPath
return $commandExitCode
}
}
if ($action.ToUpper() -eq 'PUBLISH')
{
# DWH
Write-Host '*********************************' | Tee-Object -File "$logPath"
Write-Host '* Database Objects Deployment *' | Tee-Object -File "$logPath"
Write-Host '*********************************' | Tee-Object -File "$logPath"
$args = "/Action:Publish /TargetDatabaseName:$databaseName /TargetServerName:$serverName " +
"/SourceFile:""$dacpacPath"" /Profile:""$publishProfilePath"" "
$command = "& ""{0}"" {1}" -F $sqlPackageExe, $args
Invoke-Expression $command | Tee-Object -File "$logPath"
if($LASTEXITCODE -ne 0)
{
$commandExitCode = $LASTEXITCODE
$Error[0] | Tee-Object -File $outputChangeScriptPath
return $commandExitCode
}
}
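As a side note, the same call can be made without Invoke-Expression by using the call operator, which avoids the nested quoting; a sketch using the variables defined above (not part of the original script):
# Call SqlPackage.exe directly; PowerShell quotes arguments containing spaces automatically
& $sqlPackageExe /Action:Publish /TargetDatabaseName:$databaseName /TargetServerName:$serverName /SourceFile:$dacpacPath /Profile:$publishProfilePath 2>&1 | Tee-Object -File "$logPath"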
I have a task to update all client printer settings during a migration from an old 2003 R2 print server to a new 2008 R2 print server. All clients are Win7 with PowerShell 2.0, and I created a script that adds new printers and deletes old printers on the client.
However, it messes up the default printer setting on the client; it seems to be random whether it changes the default printer to a random printer or no default printer is set at all.
I was thinking to use Get-WmiObject -Class Win32_Printer -Filter "Default = $true", and that works; I can see the correct (and old) default printer.
But if I try to set the new default printer to the same name, it fails (or more precisely, what happens is just random).
Maybe I am calling $printer.SetDefaultPrinter() in the wrong place?
Code:
Param (
$newPrintServer = "Server2",
$PrinterLog = "\\LogSVR\PrintMigration$\PrintMigration.csv"
)
<#
#Header for CSV log file:
"COMPUTERNAME,USERNAME,PRINTERNAME,RETURNCODE-ERRORMESSAGE,DATETIME,STATUS" |
Out-File -FilePath $PrinterLog -Encoding ASCII
#>
Try {
Write-Verbose ("{0}: Checking for printers mapped to old print server" -f $Env:USERNAME)
$printers = @(Get-WmiObject -Class Win32_Printer -Filter "SystemName='\\\\Server1'" -ErrorAction Stop)
$DefPrinter = Get-WmiObject -Class Win32_Printer -Filter "Default = $true"
If ($printers.count -gt 0) {
ForEach ($printer in $printers) {
Write-Verbose ("{0}: Replacing with new print server name: {1}" -f $Printer.Name,$newPrintServer)
$newPrinter = $printer.Name -replace "Server1",$newPrintServer
$returnValue = ([wmiclass]"Win32_Printer").AddPrinterConnection($newPrinter).ReturnValue
If ($returnValue -eq 0) {
"{0},{1},{2},{3},{4},{5}" -f $Env:COMPUTERNAME,
$env:USERNAME,
$newPrinter,
$returnValue,
(Get-Date),
"Added Printer" | Out-File -FilePath $PrinterLog -Append -Encoding ASCII
Write-Verbose ("{0}: Removing" -f $printer.name)
$printer.Delete()
"{0},{1},{2},{3},{4},{5}" -f $Env:COMPUTERNAME,
$env:USERNAME,
$printer.Name,
$returnValue,
(Get-Date),
"Removed Printer" | Out-File -FilePath $PrinterLog -Append -Encoding ASCII
$DefPrinter.SetDefaultPrinter()
} Else {
Write-Verbose ("{0} returned error code: {1}" -f $newPrinter,$returnValue) -Verbose
"{0},{1},{2},{3},{4},{5}" -f $Env:COMPUTERNAME,
$env:USERNAME,
$newPrinter,
$returnValue,
(Get-Date),
"Error Adding Printer" | Out-File -FilePath $PrinterLog -Append -Encoding ASCII
}
}
}
} Catch {
"{0},{1},{2},{3},{4},{5}" -f $Env:COMPUTERNAME,
$env:USERNAME,
"WMIERROR",
$_.Exception.Message,
(Get-Date),
"Error Querying Printers" | Out-File -FilePath $PrinterLog -Append -Encoding ASCII
}
I may be misunderstanding, but the default printer (defprinter) is also located on Server1, right? So you keep a reference from defprinter to printer X, then you delete all printers (including printer X), and you try to make defprinter (which no longer exists) the default printer again. That won't work, and a random printer will get the default attribute.
First, you should store the unique printer name ($printer.Name) of the default printer before the loop starts. Then, when the loop is done, search for the newly created printer WMI object that represents the previous default printer (using the printer name you saved pre-loop) and make that the default.
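A minimal sketch of that suggestion, reusing the variables from the question (the rename pattern is an assumption based on the loop above):
# Before the loop: compute what the old default printer will be called on the new server
$defPrinterNewName = $DefPrinter.Name -replace "Server1",$newPrintServer
# After the loop: find the newly mapped printer and make it the default again
$newDefPrinter = Get-WmiObject -Class Win32_Printer | Where-Object { $_.Name -eq $defPrinterNewName }
if ($newDefPrinter) { [void]$newDefPrinter.SetDefaultPrinter() }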