Large file transfer: Write-Progress not updating in real time - windows-7

I am attempting to write progress on a large file copy in PowerShell. I have the progress bar working quite well with small file copies, but when I transfer over a gigabyte of data, the progress bar does not update in real time. I can check the folder I am copying to and see that everything is transferring as it should; however, my progress bar just sits on the same file, and the bar itself doesn't seem to move at all. I am using CredSSP to invoke the command on a server at our data center, so I'm not sure whether that makes a difference to the Write-Progress cmdlet. Here is an excerpt of my script:
$credential = Get-Credential -Credential domain\user
$session = New-PSSession -ComputerName server.domain.edu -Credential $credential -Authentication CredSSP
Invoke-Command -Session $session -ScriptBlock {
    $path = "\\domain\shares\dfsdatacenter" + "-DC" + "\HomeDir\ttest1"
    $dest = "\\server\Archive$\Terminated"
    $counter = 0
    $files = gci $path -Recurse | ? { -not $_.PSIsContainer }
    foreach ($file in $files)
    {
        Copy-Item $path $dest -Recurse -Force
        Write-Progress -Activity "Backing Up Terminated User HomeDir:" -Status $file.FullName -PercentComplete ($counter / $files.Count * 100)
        $counter++
    }
}
Thanks in advance for any help anyone can offer!
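Two things are worth noting here: Write-Progress records raised in a remote session only render as they stream back to the local host, and the loop above copies the entire $path tree on every iteration (Copy-Item is given $path rather than the current $file), so the status text never advances per file. A minimal sketch of a per-file loop, assuming the intent is to copy each file individually (note this flattens the tree into $dest; preserving relative paths would require computing a per-file destination):

```powershell
# Sketch: copy one file per iteration so each pass does a bounded amount
# of work and the progress bar can advance between copies.
$counter = 0
foreach ($file in $files)
{
    Copy-Item -LiteralPath $file.FullName -Destination $dest -Force
    $counter++
    Write-Progress -Activity "Backing Up Terminated User HomeDir:" `
        -Status $file.FullName `
        -PercentComplete ($counter / $files.Count * 100)
}
```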

Related

Issue with Powershell 5.1 not copying over files with specified extensions, but does copy over directory structure

I am currently working on some personal PowerShell 5.1 scripts. I have two machines set up: one local (I'll call it source) and one remote (I'll call it destination). The source system has a certain directory structure with various files, and I want to copy that directory to the destination server, maintaining the structure. I am trying to enforce two conditions: copy a source file if the destination does not have it, or copy it only when the source file's last-modified time is later than the destination's (copying only files with specific extensions and ignoring the rest).
The issue I am having is that while the directory structure is being copied from source to destination, the actual data files (.xml, .pem, etc.) are nowhere to be found. They are not being copied, and I have spent the better part of 10 hours trying to figure out why. I will post the code below.
foreach ($fileS1 in Get-ChildItem $RootFolder -Recurse) {
    $FolderNameS1 = $fileS1.FullName
    $FolderNameNewS1 = $fileS1.FullName
    # check if the path on the remote session exists
    $ValidPathS1 = Invoke-Command -Session $Session1 -ScriptBlock {
        Test-Path -Path $using:FolderNameNewS1
    }
    If ($ValidPathS1 -eq $False) {
        # create the folder structure
        if ($fileS1.PSIsContainer -eq $True) {
            Invoke-Command -Session $Session1 -ScriptBlock {
                New-Item -Path $using:FolderNameNewS1 -ItemType Directory -Verbose
            }
        }
    }
    # Get the files in the source folder
    $sourceFiles = Get-ChildItem $FolderNameS1 -Recurse
    # Get the files in the destination folder
    $destinationFiles = Invoke-Command -Session $Session1 -ScriptBlock {
        Get-ChildItem $using:FolderNameNewS1 -Recurse
    }
    # Compare the last write time of each source file with the last write time of its corresponding destination file
    $extensions = "p12", "properties", "pem"
    foreach ($sourceFile in $sourceFiles) {
        if ($extensions -contains $sourceFile.Extension) {
            $fileCopied = $False
            # check if the file on the remote session exists
            $destinationFilePath = Invoke-Command -Session $Session1 -ScriptBlock {
                $destFilePath = Join-Path $using:FolderNameNewS1 $using:sourceFile.Name
                if (!(Test-Path $destFilePath)) {
                    New-Item -Path $destFilePath -ItemType File -Force
                }
                return $destFilePath
            }
            if ($destinationFilePath -ne "") {
                $destinationFile = Invoke-Command -Session $Session1 -ScriptBlock {
                    Get-Item $using:destinationFilePath
                }
                $remoteFileExist = Invoke-Command -Session $Session1 -ScriptBlock {
                    Test-Path $using:destinationFilePath
                }
                if (!$remoteFileExist) {
                    Copy-Item $sourceFile.FullName -Destination $FolderNameNewS1 -ToSession $Session1 -Force -Verbose
                }
                else {
                    $remoteLastWriteTime = Invoke-Command -Session $Session1 -ScriptBlock {
                        (Get-Item $using:destinationFilePath).LastWriteTime
                    }
                    $remoteLastWriteTime = $remoteLastWriteTime.ToLocalTime()
                    # Compare the last write time of source file and destination file
                    if ($sourceFile.LastWriteTime -gt $remoteLastWriteTime) {
                        # copy only the files with the specified extensions
                        if ($extensions -contains $sourceFile.Extension) {
                            Copy-Item $sourceFile.FullName -Destination $FolderNameNewS1 -ToSession $Session1 -Force -Verbose
                            $fileCopied = $True
                        }
                    }
                }
                if ($fileCopied -eq $True) {
                    # remove the source file
                    # Remove-Item $sourceFile.FullName -Force -Verbose -ErrorAction SilentlyContinue
                    $fileCopied = $False
                }
            }
        }
    }
}
I am using PSSession to remote into the destination server, and we can assume that $RootFolder contains the entire directory.
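One detail that would explain directories copying while files do not: FileInfo.Extension includes the leading dot (".pem", not "pem"), so `$extensions -contains $sourceFile.Extension` never matches the dotless list above and the copy branch is skipped for every file. A small sketch of the mismatch (the file path is hypothetical, for illustration only):

```powershell
$extensions = "p12", "properties", "pem"
$file = Get-Item "C:\certs\server.pem"   # hypothetical file

$extensions -contains $file.Extension                 # ".pem" is not in the list
$extensions -contains $file.Extension.TrimStart(".")  # "pem" matches

# Or simply list the extensions with their dots:
$extensions = ".p12", ".properties", ".pem"
```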

PowerShell - Remote Upgrade Script

I am trying to upgrade PowerShell on a bunch of Windows 7 boxes so I can do other remote installs and such. I am using Invoke-Expression, but I swear this worked once before without it. There doesn't appear to be a -Wait option for any of this. It does work when I run the Invoke-Expression locally. I also tried Start-Process. Is there a better way to get feedback on why it didn't run? Debugging is painfully slow because it has been a lot of guessing, both due to the lack of feedback and because it's hard to tell on the remote machine when it's actually installing in the background. The script is getting copied. I've tried without the Remove-Item in case I was deleting it too fast. The $cred is an admin. I'm not sure the execution-policy change is necessary.
foreach ($comp in $computers) {
    $comp.Name
    if (Test-Connection -ComputerName $comp.Name -Quiet) {
        $Destination = "\\$($comp.Name)\c$\Temp\"
        Copy-Item -Path "\\10.1.32.161\New Client Setups\WMF_5.1_PowerShell\*" -Destination $Destination -Recurse -Force
        "`t Copied"
        $session = Enter-PSSession $comp.Name -Credential $cred
        $results = Invoke-Command -ComputerName $comp.Name -ScriptBlock {
            Set-ExecutionPolicy RemoteSigned
            $ver = $PSVersionTable.PSVersion.Major
            "`t Powershell Version : $ver"
            if ($ver -lt "5") {
                "`tNeeds upgrade"
                $argumentList = @()
                $argumentList += , "-AcceptEULA"
                $argumentList += , "-AllowRestart"
                #Invoke-Expression "& 'C:\Temp\Windows7_Server2008r2\Install-WMF5.1.ps1' + $argumentList"
                Invoke-Expression 'C:\Temp\Windows7_Server2008r2\Install-WMF5.1.ps1 -AllowRestart -AcceptEULA'
            }
        }
        $results
        Remove-Item -Path "$Destination*" -Recurse
        Exit-PSSession
        Remove-PSSession -Session $session
    }
}
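For feedback on whether the installer actually ran, one option (a sketch, using the same script path and switches as above) is to launch it in a child PowerShell process with Start-Process -Wait -PassThru inside Invoke-Command, so the remote call blocks until the installer finishes and returns its exit code:

```powershell
$results = Invoke-Command -ComputerName $comp.Name -ScriptBlock {
    # Run the installer script in a child process and wait for it,
    # capturing the process exit code as feedback.
    $proc = Start-Process -FilePath "powershell.exe" `
        -ArgumentList '-ExecutionPolicy Bypass -File C:\Temp\Windows7_Server2008r2\Install-WMF5.1.ps1 -AcceptEULA -AllowRestart' `
        -Wait -PassThru
    "Installer exit code: $($proc.ExitCode)"
}
$results
```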

Running a powershell from rundeck(linux) display different result

I'm trying to run a PowerShell script from Rundeck (Linux). If I run the script locally (it deletes some files from multiple terminal servers, Windows Server), it works as expected; however, if I call it from the Rundeck server (WinRM configured), the script can't access the remote folders I'm trying to reach.
I tried running the script as the same user, but it still shows a different result.
Script below:
$userAD = "someuser"
$servers = Get-Content C:\TSList.csv
$Folder = "c$\Users\$userAD\"
$TSFolderShare = "\\sharepath"
Write-Output "#####Start of script#####"
Write-Output `n
Write-Output "Checking if $userAD user profile exist in Terminal servers..."
sleep -Seconds 1
foreach ($server in $servers) {
    Test-Path "\\$server\$Folder" -PathType Any
    Get-ChildItem "\\$server\$Folder"
    if (Test-Path "\\$server\$Folder" -PathType Any) {
        Write-Output "Resetting user profile in $server.."
        Get-ChildItem "\\$server\$Folder" -Recurse -Force -ErrorAction SilentlyContinue | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
        sleep -Seconds 1
        Write-Output "Done."
        if ((Get-ChildItem "\\$server\$Folder" | Measure-Object).Count -eq 0)
        {
            Write-Output "Done."
        }
    }
    else
    {
        Write-Output "Resetting user profile in $server.."
        sleep -Seconds 1
        Write-Output "User profile does not exist in $server."
        #Write-Output "\\$server\$Folder does not exist in $server!" -ForegroundColor Red
    }
}
EDIT: It seems my problem occurs when running my script from another script with RunAs.
Below I'm trying to access a folder on another server using a PS script. Since I want to integrate this into Rundeck, I need to call my PS script from my Linux server using Python. As a test, I ran the PS script directly, and I also called the Test-Path script from another script with RunAs, using the same user I used to run the script manually.
Scenario 1
Running PS script via separate PS script with RunAS(my_account)
$username = "my_account"
$password = "my_password"
$secstr = New-Object -TypeName System.Security.SecureString
$password.ToCharArray() | ForEach-Object {$secstr.AppendChar($_)}
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $username, $secstr
Invoke-Command -FilePath "C:\testpath.ps1" -Credential $cred -Computer localhost
(C:\testpath.ps1) Content below:
Test-Path "\\server\c$\Users\myaccount\"
result:
Access is denied
+ CategoryInfo : PermissionDenied: (\server\c$\Users\myaccount:String) [Test-Path], UnauthorizedAccessException
+ FullyQualifiedErrorId : ItemExistsUnauthorizedAccessError,Microsoft.PowerShell.Commands.TestPathCommand
+ PSComputerName : localhost
False
Scenario 2
Running C:\testpath.ps1 directly as my_account
Test-Path "\\server\c$\Users\myaccount\"
result:
True
I used session configuration in PowerShell to solve the issue. This approach lets you tie a credential to a PowerShell session configuration and reuse that configuration for all future connections.
https://4sysops.com/archives/solve-the-powershell-multi-hop-problem-without-using-credssp/
Thanks a lot!
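For reference, the approach from that article boils down to registering a session configuration with a stored RunAs credential on the machine that makes the second hop (the configuration name below is hypothetical):

```powershell
# One-time setup on the target machine (elevated): bind a credential
# to a session configuration so connections through it run as that user.
Register-PSSessionConfiguration -Name "RundeckJobs" `
    -RunAsCredential (Get-Credential my_account) -Force

# Later, connect through that configuration; the second hop now runs
# under the stored credential, sidestepping credential delegation.
Invoke-Command -ComputerName localhost -ConfigurationName "RundeckJobs" `
    -FilePath "C:\testpath.ps1"
```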
You're facing a double-hop issue with Rundeck and PowerShell; here is the explanation. This has been asked before; take a look at this, and here is a good workaround. Also see this to solve it.

Continue powershell script execution after system restart from last execution point

What am I trying to do?
Create four files on the local disk in the following order.
Note: on my local machine, not on any remote server.
Three files to be created
Restart the system
On system startup, create another file
The script I have used:
Get-Job | Remove-Job -Force

function create-file {
    Param ([string] $a)
    $p = "D:\" + $a
    Write-Host $p
    if (!(Test-Path $p))
    {
        New-Item -Path D:\$a -Type "file" -Value "my new text"
        Write-Host "Created new file and text content added"
    }
    else
    {
        Add-Content -Path D:\$a -Value "new text content"
        Write-Host "File already exists and new text content added"
    }
}

Workflow New-ServerSetup
{
    create-file "one.txt"
    create-file "two.txt"
    create-file "three.txt"
    Restart-Computer -ComputerName $env:COMPUTERNAME -Wait
    Start-Sleep -Seconds 7
    create-file "four.txt"
    Unregister-ScheduledJob -Name NewServerSetupResume
}

$adm = "####"
$pwd = ConvertTo-SecureString -String "####" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($adm, $pwd)
$AtStartup = New-JobTrigger -AtStartup
Register-ScheduledJob -Name NewServerSetupResume -Credential $cred -Trigger $AtStartup `
    -ScriptBlock {Import-Module PSWorkflow; Get-Job -Name NewSrvSetup -State Suspended | Resume-Job}
New-ServerSetup -JobName NewSrvSetup
Issue I'm facing
The execution returns "Cannot wait for local computer to restart".
I'm new to PowerShell, so please bear with any mistakes.
Thanks in advance.
Schedule a job first, then reboot without waiting.
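To make that concrete: Restart-Computer's -Wait parameter is not supported for restarting the local computer. A commonly cited pattern (a sketch, not tested here) is to checkpoint the workflow and reboot without waiting; the registered AtStartup job then resumes the suspended workflow:

```powershell
Workflow New-ServerSetup
{
    create-file "one.txt"
    create-file "two.txt"
    create-file "three.txt"
    # Persist workflow state, then reboot without -Wait. The
    # NewServerSetupResume scheduled job resumes the suspended
    # workflow at startup, and execution continues below.
    Checkpoint-Workflow
    Restart-Computer -Force
    create-file "four.txt"
    Unregister-ScheduledJob -Name NewServerSetupResume
}
```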

Is it possible to invoke-command with in a workflow?

Do anyone know if I can use Invoke-Command in a PowerShell workflow?
Currently I have a script that loops through a text file with the list of servers, but I would like it to push to all of the servers at once versus going through them one by one. Is this possible?
This is the current script block I am working with:
{
    ForEach ($Server in $Servers) {
        Write-Host "Copying code to $Server..."
        If (!(Test-Path -Path \\$Server\c$\Websites\Versions\v$version)) {
            New-Item \\$Server\c$\Websites\Versions\v$version -Type Directory | Out-Null
        }
        Copy-Item .\Packages\v$version\* \\$Server\c$\Websites\Versions\v$version -Force -Recurse
        Write-Host "Converting to application on $Server..."
        Invoke-Command -ComputerName $Server -ScriptBlock $Script -ArgumentList $Version | Out-Null
    }
}
The PowerShell Workflow engine is not capable of directly invoking PowerShell cmdlets. Instead, if a script writer calls a PowerShell cmdlet inside a Workflow definition, the PowerShell Workflow engine will automatically wrap invocations of PowerShell cmdlets inside the InlineScript Workflow Activity.
workflow test
{
    ForEach ($Server in $Servers) {
        Write-Host "Copying code to $Server..."
        If (!(Test-Path -Path \\$Server\c$\Websites\Versions\v$version)) {
            New-Item \\$Server\c$\Websites\Versions\v$version -Type Directory | Out-Null
        }
        Copy-Item .\Packages\v$version\* \\$Server\c$\Websites\Versions\v$version -Force -Recurse
        Write-Host "Converting to application on $Server..."
        InlineScript {
            Invoke-Command -ComputerName $Server -ScriptBlock $Script -ArgumentList $Version | Out-Null
        }
    }
}
As for whether or not it will work, you'll have to try it out, as suggested by Mathias.
@Trevor's response is good as an overall skeleton, but it won't work as it is.
There are several things missing or incorrect:
Passing arguments to the workflow
Passing arguments to InlineScript
Passing a ScriptBlock as an argument
Using Out-Null in a workflow
The working example:
$serversProd = @"
server1
server2
server3
server4
"@ -split '\r\n'

$reportScript = "report.script.ps1"

$generateReport = {
    param($reportScript)
    cd D:\Automations\ConnectivityCheck
    powershell -file $reportScript
}

workflow check-connectivity {
    Param ($servers, $actionBlock, $reportScript)
    # Prepare the results folder
    $resultsFolder = "D:\Automations\ConnectivityCheckResults"
    $unused1 = mkdir -Force $resultsFolder
    # Run on all servers in parallel
    foreach -parallel ($server in $servers) {
        # Upload script to the server
        $unused2 = mkdir -Force \\$server\D$\Automations\ConnectivityCheck
        cp -Force $reportScript \\$server\D$\Automations\ConnectivityCheck\
        "Starting on $server..."
        # Execute script on the server. It should contain Start-Transcript and Stop-Transcript commands
        # For example:
        # $hostname = $(Get-WmiObject -Class Win32_ComputerSystem).Name
        # $date = (Get-Date).ToString("yyyyMMdd")
        # Start-Transcript -Path ".\$date.$hostname.connectivity.report.txt"
        # ...Code...
        # Stop-Transcript
        $results = InlineScript {
            $scriptBlock = [scriptblock]::Create($Using:actionBlock)
            Invoke-Command -ComputerName $Using:server -ScriptBlock $scriptBlock -ArgumentList $Using:reportScript
        }
        # Download transcript file from the server
        $transcript = [regex]::Match($results, "Transcript started.+?file is \.\\([^\s]+)").Groups[1].Value
        "Completed on $server. Transcript file: $transcript"
        cp -Force \\$server\D$\Automations\ConnectivityCheck\$transcript $resultsFolder\
    }
}

cls
# Execute workflow
check-connectivity $serversProd $generateReport $reportScript
