I'm currently starting a Spot Instance of the g2.xlarge instance type, and passing a startup script with the user-data parameter Amazon has made available.
I want to move some files upon startup, from the root volume to the EBS volume. This works perfectly, and I have written a PowerShell script to do it which runs just fine when I run the .ps1 file in a Remote Desktop session from my Mac. It takes just about 30 seconds to complete moving these files.
However, when I pass that exact same PowerShell script into the user data upon creating the Spot Instance, it looks like it freezes (it doesn't, it's just slow) and it takes a whopping 10-20 minutes to finish moving the files.
Which means that the whole boot process is taking forever.
I have tried moving the files with robocopy, standard move commands, and even the PowerShell-native Move-Item in a foreach loop. The same applies though: all the commands work just fine when run from an RDP session, but when passed through the user data upon launching the instance, to be run at boot, they are extremely slow to finish (or start?).
Does anyone know what the issue could be?
I really need the server to move the files automatically, and without having to RDP into the server to do it.
Thanks! :)
By the way, I'm running Windows Server 2012 R2.
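For reference, on a Windows AMI the user data is only executed as PowerShell when it is wrapped in <powershell> tags (EC2Config handles this on 2012 R2), and it runs non-interactively under the EC2Config service context rather than my RDP session. A minimal sketch of the wrapper, with placeholder paths of my own:
<powershell>
# Hypothetical user-data wrapper; the actual move logic is the script shown in the EDIT below.
Start-Transcript -Path "C:\userdata-log.txt"   # placeholder log path, useful for timing the boot-time work
& "C:\Scripts\MoveFiles.ps1"                   # placeholder path to the move script
Stop-Transcript
</powershell>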
EDIT:
In case it has something to do with the script I use to move the files, here it is. :)
Start-Job -Name MoveFolder -ScriptBlock {
    $path = "C:\Program Files (x86)\OldDestination"
    $archpath = "Z:\NewDestination"
    $counter = 0
    $oldpercentage = 0
    $files = Get-ChildItem -Path $path -Recurse -Force
    $totalcount = $files.Count
    $lasttime = [int][double]::Parse((Get-Date -UFormat "%s"))
    foreach ($file in $files) {
        $filename = $file.FullName
        Move-Item $file.FullName -Destination $archpath -Force -ErrorAction:SilentlyContinue
        $counter++
        # work out how far along we are, capped at 99% until the loop finishes
        $percentage = ($counter / $totalcount) * 100
        $percentage = [math]::Round($percentage)
        if ($percentage -gt 99) { $percentage = 99 }
        $runtime = [int][double]::Parse((Get-Date -UFormat "%s"))
        $runtime = $runtime - $lasttime
        if ($runtime -gt 5) { # report progress to the server at most every 5 seconds
            if ($percentage -gt $oldpercentage) {
                Invoke-WebRequest -Uri http://exampleurl.com/$percentage -Method GET -UseBasicParsing
                $lasttime = [int][double]::Parse((Get-Date -UFormat "%s"))
                $oldpercentage = $percentage
            }
        }
    }
    $percentage = 100
    Invoke-WebRequest -Uri http://exampleurl.com/$percentage -Method GET -UseBasicParsing
}
Wait-Job -Name MoveFolder
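For reference, a robocopy equivalent of the move above would look roughly like this (same source and destination as in the script; the log path is a placeholder I added):
robocopy "C:\Program Files (x86)\OldDestination" "Z:\NewDestination" /E /MOVE /R:1 /W:1 /NP /LOG:"C:\robocopy.log"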
Related
I'm trying to run a PowerShell script from Rundeck (Linux). If I run the script locally on the Windows server (it deletes some files from multiple terminal servers), it works as expected; however, if I call it from the Rundeck server (WinRM configured), it seems the script can't access the remote folders I'm trying to reach.
I tried running the script as the same user, but it still shows a different result.
Script below:
$userAD = "someuser"
$servers = Get-Content C:\TSList.csv
$Folder = "c$\Users\$userAD\"
$TSFolderShare = "\\sharepath"
Write-Output "#####Start of script#####"
Write-output `n
Write-output "Checking if $userAD user profile exist in Terminal servers..."
sleep -seconds 1
foreach ($server in $servers) {
Test-Path "\\$server\$Folder" -PathType Any
Get-ChildItem "\\$server\$Folder"
if (Test-Path "\\$server\$Folder" -PathType Any) {
Write-output "Resetting user profile in $server.."
Get-ChildItem "\\$server\$Folder" -Recurse -Force -ErrorAction SilentlyContinue | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
sleep -seconds 1
Write-output "Done."
if( (Get-ChildItem "\\$server\$Folder" | Measure-Object).Count -eq 0)
{
Write-output "Done."
}
}
else
{
Write-output "Resetting user profile in $server.."
sleep -seconds 1
Write-output "User profile does not exist in $server."
#Write-output "\\$server\$Folder does not exist in $server!" -ForegroundColor Red
}
}
EDIT: It seems my problem occurs when running my script from another script with RunAs.
Below I'm trying to access a folder on another server using a PS script. Since I want to integrate this into Rundeck, I need to call my PS script from my Linux server using Python. As a test, I ran the PS script directly, and I also called the Test-Path script from another script with RunAs, using the same user I use when running the script manually.
Scenario 1
Running the PS script via a separate PS script with RunAs (my_account)
$username = "my_account"
$password = "my_password"
$secstr = New-Object -TypeName System.Security.SecureString
$password.ToCharArray() | ForEach-Object {$secstr.AppendChar($_)}
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $username, $secstr
Invoke-Command -FilePath "C:\testpath.ps1" -Credential $cred -Computer localhost
(C:\testpath.ps1) Content below:
Test-Path "\\server\c$\Users\myaccount\"
result:
Access is denied
+ CategoryInfo : PermissionDenied: (\server\c$\Users\myaccount:String) [Test-Path], UnauthorizedAccessException
+ FullyQualifiedErrorId : ItemExistsUnauthorizedAccessError,Microsoft.PowerShell.Commands.TestPathCommand
+ PSComputerName : localhost
False
Scenario 2
Running C:\testpath.ps1 directly as my_account
Test-Path "\\server\c$\Users\myaccount\"
result:
True
I used a session configuration in PowerShell to solve the issue. This approach allows you to tie a credential to a PowerShell session configuration and reuse that configuration for all future connections.
https://4sysops.com/archives/solve-the-powershell-multi-hop-problem-without-using-credssp/
Thanks a lot!
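For anyone hitting the same thing, a minimal sketch of that session-configuration approach (the configuration name and account below are placeholders of mine):
# One-time setup on the Windows host: register a session configuration that always runs as a fixed account
$cred = Get-Credential "DOMAIN\my_account"   # placeholder account
Register-PSSessionConfiguration -Name "RundeckRunAs" -RunAsCredential $cred -Force
# From then on, target that configuration so the second hop runs under the stored credential
Invoke-Command -ComputerName localhost -ConfigurationName "RundeckRunAs" -FilePath "C:\testpath.ps1"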
You're facing a double-hop issue with Rundeck and PowerShell; here is the explanation. This has been asked before, so take a look at this, and here is a good workaround. Also see this to solve it.
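Since the links above didn't survive the copy, a common CredSSP-based workaround of the kind referenced looks roughly like this (the server name is a placeholder, and keep in mind CredSSP has security trade-offs):
# On the calling machine
Enable-WSManCredSSP -Role Client -DelegateComputer "targetserver" -Force   # placeholder server name
# On the target server
Enable-WSManCredSSP -Role Server -Force
# Then delegate the credential explicitly so the second hop works
Invoke-Command -ComputerName "targetserver" -Authentication Credssp -Credential $cred -FilePath "C:\testpath.ps1"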
This question already has answers here:
Batch file to delete folders older than 10 days in Windows 7
I'm building a setup with Inno Setup and I'd like to add a scheduled task that cleans log folders older than X days with a single command.
I've been searching for examples using PowerShell or a prompt command, but none of them work.
Can you help me find the best way?
Thanks
I don't have much time to research this, but if you would like to continuously check a folder location for files older than a specific time-frame, you can use the following script:
while ($true) {
    # You may want to adjust these
    $fullPath = "C:\temp\_Patches\Java\Files\x86\Source"
    $numdays = 5
    $numhours = 10
    $nummins = 5
    function ShowOldFiles($path, $days, $hours, $mins)
    {
        $files = @(Get-ChildItem $path -Include *.* -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.PsIsContainer -eq $false)})
        if ($files -ne $null)
        {
            for ($idx = 0; $idx -lt $files.Length; $idx++)
            {
                $file = $files[$idx]
                Write-Host ("Old: " + $file.Name) -ForegroundColor Red
                Start-Sleep -s 10
            }
        }
    }
    ShowOldFiles $fullPath $numdays $numhours $nummins
}
You would just need to add this script to your start-up folder and change the values (e.g. file path, file age, sleep). You can also append the data to a text file.
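If the goal is to actually clean the folder rather than just report on it, a hedged variation of the same idea (the path and the 10-day retention below are placeholders) deletes the old files and appends what it removed to a text file:
# Sketch only: remove files older than 10 days and log what was deleted (adjust the values)
$logPath = "C:\temp\cleanup-log.txt"
Get-ChildItem "C:\temp\_Patches\Java\Files\x86\Source" -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-10) } |
    ForEach-Object {
        Add-Content $logPath ("Deleted: " + $_.FullName)
        Remove-Item $_.FullName -Force
    }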
I started with the following post:
How can I check if a file is older than a certain time with PowerShell?
Thanks,
Calvin
Update2:
Now that I know x32 is the problem, I debugged the script using powershell_ise_x32 and found out that $Word.Documents is null.
So the PowerShell API for Word behaves differently in x32 PowerShell than in 64-bit.
Update:
The error occurs when using PowerShell x32 and does NOT occur on PowerShell 64-bit. That was really it. PowerShell x32 was executed because I started it from the 32-bit Total Commander.
The question now is: why do 32-bit and 64-bit PowerShell behave differently?
Initial Question:
I wrote a PowerShell script to convert my Word documents and merge them into one.
I wrote a batch script to start this PowerShell script.
When I execute the script directly in "PowerShell ISE", the script works fine.
When I execute the batch script as Administrator via the context menu, the script reports errors. In this case, C:\WINDOWS\SysWOW64\cmd.exe is executed.
When I execute another cmd.exe found on my system as Administrator, everything works fine:
"C:\Windows\WinSxS\amd64_microsoft-windows-commandprompt_31bf3856ad364e35_10.0.15063.0_none_9c209ff6532b42d7\cmd.exe"
Why do I get different behaviour with different cmd.exe files? What are those different cmd.exe files?
Batch Script:
cd /d "%~dp0"
powershell.exe -noprofile -executionpolicy bypass -file "%~dp0%DocxToPdf.ps1"
pause
PowerShell Script:
$FilePath = $PSScriptRoot
$Pdfsam = "D:\Programme\PDFsam\bin\run-console.bat"
$Files = Get-ChildItem "$FilePath\*.docx"
$Word = New-Object -ComObject Word.Application
if (-not $?) {
    throw "Failed to open Word"
}
# Convert all docx files to pdf
foreach ($File in $Files) {
    Write-Host "Word Object: " $Word
    Write-Host "File Object: " $Word $File
    Write-Host "FullName prop:" $File.FullName
    # open a Word document, filename from the directory
    $Doc = $Word.Documents.Open($File.FullName)
    # Swap out DOCX with PDF in the Filename
    $Name = ($Doc.FullName).Replace("docx", "pdf")
    # Save this File as a PDF in Word 2010/2013
    $Doc.SaveAs([ref] $Name, [ref] 17)
    $Doc.Close()
}
# check errors
if (-not $?) {
    Write-Host "Stop because an error occurred"
    pause
    exit 0
}
# wait until the conversion is done
Start-Sleep -s 15
# Now concat all pdfs to one single pdf
$Files = Get-ChildItem "$FilePath\*.pdf" | Sort-Object
Write-Host $Files.Count
if ($Files.Count -gt 0) {
    $command = ""
    foreach ($File in $Files) {
        $command += " -f "
        $command += "`"" + $File.FullName + "`""
    }
    $command += " -o `"$FilePath\Letter of application.pdf`" -overwrite concat"
    $command = $Pdfsam + $command
    echo $command
    $path = Split-Path -Path $Pdfsam -Parent
    cd $path
    cmd /c $command
}
else {
    Write-Host "No PDFs found for concatenation"
}
Write-Host -NoNewLine "Press any key to continue...";
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown");
I've found $PSScriptRoot to be unreliable.
$FilePath = $PSScriptRoot;
$CurLocation = Get-Location;
$ScriptLocation = Split-Path $MyInvocation.MyCommand.Path
Write-Host "FilePath = [$FilePath]";
Write-Host "CurLocation = [$CurLocation]";
Write-Host "ScriptLocation = [$ScriptLocation]";
Results:
O:\Data>powershell ..\Script\t.ps1
FilePath = []
CurLocation = [O:\Data]
ScriptLocation = [O:\Script]
As to the differences between the various cmd.exe implementations, I can't really answer that. I would have thought they'd be functionally identical, but maybe there are 32/64-bit differences that matter.
The error occurs when using PowerShell x32 and does NOT occur on PowerShell 64-bit.
I debugged the script using powershell_ise_x32 and found out that $Word.Documents is null.
This is because 64-bit Word is installed on my system.
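A hedged sketch of how the script could guard against this (the Sysnative alias is only visible to 32-bit processes on 64-bit Windows):
# At the top of DocxToPdf.ps1: if we were started in 32-bit PowerShell on a 64-bit OS,
# re-launch the script with the 64-bit powershell.exe via the Sysnative redirector and exit.
if (-not [Environment]::Is64BitProcess -and [Environment]::Is64BitOperatingSystem) {
    & "$env:windir\Sysnative\WindowsPowerShell\v1.0\powershell.exe" -NoProfile -ExecutionPolicy Bypass -File $MyInvocation.MyCommand.Path
    exit $LASTEXITCODE
}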
I want to deploy our network printers that are shared from a print server to Windows 10 PCs, on a per-machine basis.
Currently we do this with a KiX script and an INI file, but I want to move this to PowerShell and deploy it as a startup/login script with Group Policy. The deployment must be done with PowerShell, not purely GPO; with a script we are more flexible in deploying to individual machines.
I've written a PS script that uses a CSV file containing the PCs and printers to map, but it seems completely wrong. Is there a better way to deploy the printers?
Here is my CSV; 'True' means set that printer as the default:
#TYPE Selected.System.Management.ManagementObject.Data.DataRow
Name
PC0001
\\SV0002\PR0001, True
\\SV0002\PR00002
Name
PC0002
\\SV0002\PR0001, True
\\SV0002\PR00002
and the PS-Script:
Get-WMIObject Win32_Printer | where {$_.Network -eq 'true'} | foreach {$_.delete()}
$Printers = Import-Csv \\server\$env:username\printers.csv
foreach ($Printer in $Printers) {
    Invoke-Expression 'rundll32 printui.dll PrintUIEntry /in /q /n $($Printer.Name)'
}
I edited the CSV file, and it looks like this now:
Client;1;2;3;4;5;6;7;8;9;10;11;12;13;14;15;Default
PC0001;\\SV0001\PR0001;\\SV0001\PR0002;;;;;;;;;;;;;;pr_01
PC0002;\\SV0001\PR0001;\\SV0001\PR0002;\\SV0001\PR0003;;;;;;;;;;;;;pr_03
We did that with Excel, so it's easier to edit, and saved it as CSV.
We also changed where it is located, to \\Server\Netlogon\Subfolder\Printers.csv, so the variable changes to:
$Printers=IMPORT-CSV \\server\Netlogon\Subfolder\printers.csv
But now I think the whole script is wrong?
Using a CSV like this:
name,printers,defaultprinter
PC0001,\\SV0001\PR0001;\\SV0001\PR0002,PR0002
PC0002,\\SV0001\PR0001;\\SV0001\PR0003,PR0003
PC0003,\\SV0001\PR0001;\\SV0001\PR0004,PR0004
The code would be:
$csv = "\\server\Netlogon\Subfolder\printers.csv"
$Computers = Import-Csv $csv
foreach ($Computer in $Computers){
If ($Computer.name -eq $env:computername) {
$Printers = ($Computer.printers).split(";")
foreach ($Printer in $Printers) {Add-Printer $Printer -ErrorAction SilentlyContinue}
(New-Object -ComObject WScript.Network).SetDefaultPrinter("$($Computer.defaultprinter)")
}
}
The way we do (did) it here at work is by invoking some VBScript-style code from within the PowerShell script.
The print server and printer are obtained via AD cmdlets.
$net = New-Object -Com WScript.Network
$net.AddWindowsPrinterConnection("\\" + $PRINT_SERVER + "\" + $PRINTER)
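The AD lookup itself isn't shown above; assuming the printers are published in Active Directory, it might look roughly like this (the filter and the uNCName attribute may need adjusting for your environment):
# Sketch: enumerate printers published in AD and connect each one by its UNC path
Import-Module ActiveDirectory
$queues = Get-ADObject -Filter 'objectClass -eq "printQueue"' -Properties uNCName
foreach ($q in $queues) {
    $net = New-Object -Com WScript.Network
    $net.AddWindowsPrinterConnection($q.uNCName)
}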
Starting from Windows 8:
# Add the printer
Add-Printer -ConnectionName ("\\" + $printServer + "\" + $printerName) -Name $printerName
# Get the printer
$printer = Get-WmiObject -Query "Select * From Win32_Printer Where ShareName = '$printerName'"
# Set printer as default
$printer.SetDefaultPrinter()
I solved the problem with the script from James C., many thanks to him, it was a big help!
The only thing wrong was that between Add-Printer and $Printer there had to be -ConnectionName. After that little edit to the script, everything was fine.
So we made a GP_Printers GPO, where we put this script as printermapping.ps1 under Computer Configuration/Windows Settings/Scripts/Startup.
We also put a PowerShell script into Shutdown that deletes all printer connections.
Here are all the scripts.
CSV:
name,printers,defaultprinter
PC0001,\\SV0001\PR0001;\\SV0001\PR0002,PR0002
PC0002,\\SV0001\PR0001;\\SV0001\PR0003,PR0003
PC0003,\\SV0001\PR0001;\\SV0001\PR0004,PR0004
Printer mapping with PowerShell based on the CSV:
$csv = "\\server\Netlogon\Subfolder\printers.csv"
$Computers = Import-Csv $csv
foreach ($Computer in $Computers){
If ($Computer.name -eq $env:computername) {
$Printers = ($Computer.printers).split(";")
foreach ($Printer in $Printers) {Add-Printer-ConnectionName $Printer -ErrorAction SilentlyContinue}
(New-Object -ComObject WScript.Network).SetDefaultPrinter("$($Computer.defaultprinter)")
}
}
And the Printer Disconnection:
Get-WmiObject -Class Win32_Printer | where {$_.Network -eq 'true'} | foreach {$_.delete()}
I hope this can be helpful for others.
Again many thanks to James C.
WBZ-ITS
I've made some corrections and improvements to the script, and also found some problems that come up if you use it in a GPO. The changes are as follows:
CSV:
name,printers,defaultprinter
PC0001,\\SV0001\PR0001;\\SV0001\PR0002,PR0002
PC0002,\\SV0001\PR0001;\\SV0001\PR0003,PR0003
PC0003,\\SV0001\PR0001;\\SV0001\PR0004,PR0004
The Connection Script:
$csv = "\\server\Netlogon\Subfolder\printers.csv"
$Computers = Import-Csv $csv
foreach ($Computer in $Computers){
If ($Computer.name -eq $env:computername) {
$Printers = ($Computer.printers).split(";")
foreach ($Printer in $Printers) {Add-Printer-ConnectionName $Printer -ErrorAction SilentlyContinue}
(New-Object -ComObject WScript.Network).SetDefaultPrinter("$($Computer.defaultprinter)")
}
}
And also a disconnect script for logging off:
#$a = Get-WMIObject -Query "Select * From Win32_Printer Where Name = 'Microsoft Print to PDF'"
#$a.SetDefaultPrinter()
$TargetPrinter = "Microsoft Print to PDF"
$ErrorActionPreference = "SilentlyContinue"
$LocalPrinter = GWMI -Class Win32_Printer | Where {$_.Name -eq $TargetPrinter}
$LocalPrinter.SetDefaultPrinter()
$ErrorActionPreference = "Stop"
Get-WmiObject -Class Win32_Printer | where {$_.Network -eq 'true'} | foreach {$_.delete()}
Before disconnecting, the default printer must be changed, otherwise it won't be disconnected.
After all the scripts were made, we put them in a GPO under User Configuration\Policies\Windows Settings\Scripts, under Logon and Logoff.
You may have some trouble with the GPOs not running, so here are some useful troubleshooting notes that I found:
The scripts don't work as machine policies under Startup and Shutdown; they have to be in the User Configuration as mentioned above.
You also have to configure the policy that delays scripts by 5 minutes. It is under Computer Configuration\Administrative Templates\System\Group Policy\Configure Logon Script Delay; activate it and set the delay to 0 minutes, otherwise any script will be delayed until 5 minutes after logon.
Another problem can occur if you apply the GPO to Windows 8/10 systems but created it on a Windows 7 PC. Always create GPOs for this kind of system on Server 2008/R2 or 2012 R2.
It can also be helpful to configure the Logon/Logoff GPO as follows: as the script name, powershell.exe (without quotes), and as the script parameters, -F "\\SERVER\FREIGABE\meinskript.ps1" (with quotes).
I hope this helps someone else.
Thanks to those who helped me.
WBZ-ITS
I am migrating multiple users from XP to 7, and they all have different mapped drives/locations on their current PCs. After copying all their data from the old PC to the new PC, I currently map their drives manually, which consumes a lot of time. Is there any way of automating this process?
Is there any way of running a script on the existing XP machine and running the same script on the new Win 7 machine to map all the drives?
I am looking for a script or any other way of automating this process.
Thanks.
You could do this for all your users; it would at least tell you what they had.
You'd probably want one central folder, let's say Mappings, so try:
net use > \\servername\Mappings\%username%_map.txt
Or try something like this
http://www.visualbasicscript.com/List-mapped-drives-on-remote-machine-m28529.aspx
Out of boredom, I quickly wrote a PowerShell script to help you out.
Run this on your workstation:
($newpcs and $oldpcs must be in the correct order, so that oldpc1 is the old PC of the user of newpc1.)
$oldpcs = @("oldpc1", "oldpc2", "oldpc3")
$newpcs = @("newpc1", "newpc2", "newpc3")
$mapping = @{}
for ($i = 0; $i -lt $oldpcs.Count; $i++) {
    $mapping.Add($oldpcs[$i], $newpcs[$i])
}
foreach ($comp in $oldpcs) {
    $m = Get-WmiObject Win32_SystemNetworkConnections -ComputerName $comp
    $m | % {
        # I know this is not very elegant, but whatever
        $temp = $_.PartComponent -split "="
        $temp = $temp -replace "`"", ""
        $temp2 = $temp[1] -split " "
        $driveletter = $temp2[1] -replace "\(", ""
        $driveletter = $driveletter -replace "\)", ""
        $path = $temp2[0] -replace "\\\\", "\"
        $f = "C:\path\to\folder\" + $mapping.$comp + ".txt"
        Add-Content $f "$driveletter;$path"
    }
}
Then copy the file with the corresponding computer name to the new computer and run the following:
$txt = Get-Content "C:\path\to\file\$env:computername.txt"
$txt | % {
    $temp = $_ -split ";"
    net use $temp[0] $temp[1]
}
Remember that you have to run the mapping script in the context of the user you want to map the drives for.
Regards
P.S. Remotely mapping network drives is not possible AFAIK (I would love to be proven wrong).
You could create a logon script and assign it to a user, though.
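A minimal per-user logon script along those lines (server and share names are placeholders) could simply reuse net use:
# Sketch of a logon script that recreates the mappings for the current user
net use P: \\servername\projects /persistent:yes
net use S: \\servername\scans /persistent:yes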