Why won't this script run? - image

# Set the path to the AOMEI Backupper Technician executable
$abtExecutable = "C:\Program Files (x86)\AOMEI Backupper Technician\Backupper.exe"
# Set the destination for the system image backup
$backupDestination = "C:\Backups\System Image"
# Set the password for the backup
$password = "12121212121212121212"
# Set the compression level to 'high'
$compression = "high"
# Check if there is already a system image backup from 30 days ago
$thirtyDaysAgo = (Get-Date).AddDays(-30)
$oldBackup = Get-ChildItem $backupDestination | Where-Object { $_.LastWriteTime -lt $thirtyDaysAgo }
# If there is an old backup, delete it
if ($oldBackup) {
Remove-Item -Path $oldBackup.FullName -Force
}
# Create the full system image backup
& $abtExecutable backup system -d $backupDestination -p $password -c $compression
I expected it to work. Whenever I try to run this in PowerShell, the most it does is launch the AOMEI Backupper exe; it doesn't do any of the other steps.

As per my comment,
powershell /?
EXAMPLES
...
PowerShell -Command {Get-EventLog -LogName security}
PowerShell -Command "& {Get-EventLog -LogName security}"
See PowerShell: Running Executables (TechNet Wiki) for doing your console work in a script via Start-Process:
$ConsoleCommand = 'Some Console Command'
$startProcessSplat = @{
    FilePath     = 'powershell'
    ArgumentList = '-NoExit', '-NoProfile', "-Command & { $ConsoleCommand }"
    Wait         = $true
}
Start-Process @startProcessSplat
Or just call the executable directly in your script, without invoking the PowerShell console at all: your $abtExecutable is a stand-alone executable and is really run via cmd.exe, which Start-Process will do by default.
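For example, calling the backup directly from the script could look like the sketch below. It reuses the variables defined in the script above; the backup system -d/-p/-c arguments are simply taken from the question and are assumed (not verified) to be valid Backupper.exe switches.
# Sketch only: the argument names are assumed from the question; check AOMEI's CLI documentation.
$arguments = "backup system -d `"$backupDestination`" -p $password -c $compression"
$process = Start-Process -FilePath $abtExecutable -ArgumentList $arguments -Wait -PassThru
if ($process.ExitCode -ne 0) {
    throw "AOMEI Backupper returned exit code $($process.ExitCode)"
}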

Related

PowerShell command does not run when I run the entire script

I wrote a .ps1 script to automate some package installation. The strange part is that when I run just the command snippet that executes the .exe file for SEP (Symantec Endpoint Protection), it executes fine, but when I execute the entire script, it does not run that command.
I am only running a simple .exe file, and even when I run it manually it does not show any installer; rather, it installs silently in the background.
So in the script I am only running the .exe file, that's it.
Should I be giving it any wait time or any other inputs?
Start-Process -Wait -FilePath "C:\Temp\Symantec-Windows\SEP 14.3.3384.1000 x64.exe" -PassThru
$SymVersion = Get-WmiObject -Class Win32_Product -ComputerName $hostname |
    Where-Object -FilterScript { $_.Name -eq "symantec endpoint protection" } |
    Format-List -Property version, InstallState, name
echo $SymVersion
if ($SymVersion)
{
    Write-Host 'Symantec is successfully installed' -ForegroundColor Green
}
else
{
    Write-Host 'Symantec is not successfully installed' -ForegroundColor Red
}
The Symantec antivirus .exe files are made for silent installation. If you want to proceed in GUI mode, it is better to unzip the package and use the MSI file with arguments. With your current script, it is better to check that the process exited with code 0. The following code is not tested.
$process = Start-Process -FilePath "C:\Temp\Symantec-Windows\SEP 14.3.3384.1000 x64.exe" -PassThru -Wait
if ($process.ExitCode -ne 0)
{
    throw "Installation process returned error code: $($process.ExitCode)"
}
else { Write-Host "Installation Successful" }
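If you do extract the package and go the MSI route, the same exit-code check applies. The sketch below is illustrative only: the .msi file name and path are placeholders, not the actual contents of the Symantec package, while /i, /qn and /norestart are standard msiexec switches.
# Sketch only: the MSI path below is a placeholder.
$msiPath = 'C:\Temp\Symantec-Windows\Sep64.msi'
$process = Start-Process -FilePath 'msiexec.exe' -ArgumentList "/i `"$msiPath`" /qn /norestart" -Wait -PassThru
if ($process.ExitCode -ne 0)
{
    throw "msiexec returned error code: $($process.ExitCode)"
}
else { Write-Host "Installation Successful" }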

PowerShell Invoke-Command with Start-Process outputs different result

I have an update script that runs the Dell Command Update tool, dcu-cli.exe for short. When I run the same script code locally on the computer, everything runs OK, but when I run the exact same code in a script with Invoke-Command (and yes, I have full admin rights), the exit code is 2, meaning "An unknown application error has occurred", instead of 0 (everything OK).
It is a very large script, so I created a new one to debug this. This is the shortened code:
Invoke-Command -ComputerName "MyComputer" -ScriptBlock {
    $ExitCode = 0
    #Declare path and arguments
    $DcuCliPath = 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe'
    $DellCommand = "/applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log"
    #Verify Dell Command | Update exists
    If (Test-Path -Path $DcuCliPath) {
        $objWMI = Get-WmiObject Win32_ComputerSystem
        Write-Host ("Dell Model [{0}]" -f $objWMI.Model.Trim())
        $serviceName = "DellClientManagementService"
        Write-Host ("Service [{0}] is currently [{1}]" -f $serviceName, (Get-Service $serviceName).Status)
        If ((Get-Service $serviceName).Status -eq 'Stopped') {
            Start-Service $serviceName
            Write-Host "Service [$serviceName] started"
        }
        #Update the system with the latest drivers
        Write-Host "Starting Dell Command | Update tool with arguments [$DellCommand] dcu-cli found at [$DcuCliPath]"
        $ExitCode = (Start-Process -FilePath ($DcuCliPath) -ArgumentList ($DellCommand) -PassThru -Wait).ExitCode
        Write-Host ("Dell Command | Update tool finished with ExitCode: [$ExitCode] current Win32 ExitCode: [$LastExitCode] Check log for more information: C:\Dell_Update.log")
    }
}
When I remove the Invoke-Command -ComputerName "MyComputer" -ScriptBlock { wrapper and then copy and run the script locally on the PC, the exit code is 0.
What I also noticed is that when I run the command via Invoke-Command, no log file is created either, even though I pass the log path in the arguments... So my best guess is that something is going wrong with local and remote paths?
So what am I missing? I'm guessing it is something simple, but I have spent several hours trying to get this running without any luck...
Try running it this way. You should be able to see any output or error messages. I typically add to the path first rather than using & or start-process.
Invoke-Command mycomputer {
    $env:path += ';C:\Program Files (x86)\Dell\CommandUpdate'
    dcu-cli /applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log
}
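When dcu-cli is called directly like this (rather than through Start-Process), its native exit code lands in $LASTEXITCODE inside the remote session, so it can be surfaced explicitly. A minimal sketch, reusing the computer name and paths from the question:
Invoke-Command mycomputer {
    $env:path += ';C:\Program Files (x86)\Dell\CommandUpdate'
    dcu-cli /applyUpdates -autoSuspendBitLocker=enable -outputLog=C:\Dell_Update.log
    # $LASTEXITCODE holds the exit code of the last external command
    Write-Host "dcu-cli finished with exit code $LASTEXITCODE"
    $LASTEXITCODE   # returned to the calling session
}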
Using Start-Process inside Invoke-Command seems pretty challenging. I can't even see the output of findstr unless I save it to a file, and if I didn't wait, the output would be truncated. By default, Start-Process runs in the background and in another window. There's a -NoNewWindow option too, but it doesn't help with Invoke-Command.
invoke-command localhost { # elevated
start-process 'findstr' '/i word c:\users\joe\file1' -wait -RedirectStandardOutput c:\users\joe\out }
@js2010, thanks for your additional help. Unfortunately, this didn't help either.
So I did some more debugging, and it turns out it was a bug in the dcu-cli version running on my test machine, DOH...!!
On the test machine version 3.1.1 was running, and on another machine version 4.0 was running, and that worked fine via remote PowerShell. So I looked for the release notes, which I found here: https://www.dell.com/support/kbdoc/000177325/dell-command-update
And as you can see, in version 3.1.3 there was this fix:
A problem was solved where dcu-cli.exe was not executed in an external interactive session of PowerShell.
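Given that, it may be worth guarding the remote call with a version check so that machines still running a pre-3.1.3 dcu-cli are flagged before you try to apply updates. A rough sketch, reusing the path from the question (and assuming the ProductVersion of dcu-cli.exe parses as a plain version number):
# Sketch: warn if the installed dcu-cli predates the 3.1.3 fix for remote PowerShell sessions
$DcuCliPath = 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe'
$dcuVersion = [version] (Get-Item $DcuCliPath).VersionInfo.ProductVersion
if ($dcuVersion -lt [version] '3.1.3') {
    Write-Warning "dcu-cli $dcuVersion may fail in a remote PowerShell session; update Dell Command | Update first."
}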

Running powershell script to perform automated webpage login from task schedular behaves differently from running the script manually

My university requires all computers to perform a web-based login in order to get access to the internet, and claims that all users are logged off automatically at midnight (sounds strange, but it is true), so I am trying to write a PowerShell script (in Windows 10) to perform the login automatically at midnight.
My script is listed here. It opens an IE process in the background (in a non-visible way), fills in the username and password, logs in, and kills the IE process.
# If there are existing Internet Explorer processes, close it
$IE_Process = Get-Process iexplore -ErrorAction Ignore
if ($IE_Process) {
    $IE_Close = Foreach-Object { $IE_Process.CloseMainWindow() }
}
Stop-Process -Name "iexplore" -ErrorAction Ignore
# Login Information
$url = "http://xxx.xxx.xxx.xxx/"
$username = "xxxxxxxx"
$password = "xxxxxxxx"
# Open an IE process
$ie = New-Object -com internetexplorer.application;
$ie.silent = $true
$ie.navigate($url);
while ($ie.Busy -eq $true)
{
    Start-Sleep -s 1;
}
# The stupid webpage needs to submit twice
$ie.Document.getElementById("loginname").value = $username
$ie.Document.getElementByID("password").value = $password
$ie.Document.getElementById("button").Click()
Start-Sleep -s 1;
$ie.Document.getElementById("loginname").value = $username
$ie.Document.getElementByID("password").value = $password
$ie.Document.getElementById("button").Click()
# Close the IE process
$IE_Process = Get-Process iexplore -ErrorAction Ignore
if ($IE_Process) {
    $IE_Close = Foreach-Object { $IE_Process.CloseMainWindow() }
}
Stop-Process -Name "iexplore" -ErrorAction Ignore
Remove-Variable -Name ie,username,password,url,IE_Process -ErrorAction Ignore
The script is saved as "login_IE.ps1". It may be poorly written, as I am new to PowerShell, but it works. If I open a cmd window and execute the following command, I am logged in.
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy RemoteSigned -File C:\Users\MyName\Documents\Powershell\login_IE.ps1
However, if I create a scheduled task in the Windows Task Scheduler executing this script, it doesn't work. I fill in "Program/script:" as:
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe
and fill the "Add arguments (optional):" as:
-ExecutionPolicy RemoteSigned -File C:\Users\MyName\Documents\Powershell\login_IE.ps1
The scheduled task is run under my account (I am the only user of this computer).
If I run the scheduled task manually, in Task Manager I can see two IE processes open among the background processes, communicate with the internet, and then get killed, so I am pretty sure that the script has actually been executed. But I found I am not logged in, since I don't have internet access. Where could the problem be?
Any advice is really appreciated. Thanks in advance.
I had a similar type of issue: when trying to run the script directly from a PowerShell window it works as expected, but from Task Scheduler or the command line I was not getting the desired results.
Adding lines like the ones below to my script helped me run it from the command line and Task Scheduler as well:
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Hope this helps anyone else.

Create SymLink with GPO

I have a BATCH script that makes symlinks with mklink. When I run it as an administrator or as the system account (with psexec -s -e), it works as it should. But when I try to use it in a GPO as a startup script, it gives me the error "you do not have sufficient privilege to perform this operation" on the target computer. Windows 7 Pro SP1 x64, UAC is disabled.
Batch example:
mklink C:\log\cmd.link.exe C:\Windows\System32\cmd.exe >> C:\log\symlink.log 2>&1
I also tried to wrap it into a powershell script:
Start-Process -FilePath "$env:windir\system32\cmd.exe" -ArgumentList "/c mklink C:\log\cmd.link.exe C:\Windows\System32\cmd.exe >> C:\log\symlink.txt 2>&1" -Verb RunAs
but got the same error. What am I doing wrong?
Maybe there's another way to create a SymLink with GPO or PowerShell?
It turned out that the Group Policy Client (gpsvc) service (GPO scripts run with its privileges) does not hold the privilege to create symbolic links (SeCreateSymbolicLinkPrivilege):
C:\>sc qprivs gpsvc
[SC] QueryServiceConfig2 SUCCESS
SERVICE_NAME: gpsvc
PRIVILEGES : SeTakeOwnershipPrivilege
: SeIncreaseQuotaPrivilege
: SeAssignPrimaryTokenPrivilege
: SeSecurityPrivilege
: SeChangeNotifyPrivilege
: SeCreatePermanentPrivilege
: SeShutdownPrivilege
: SeLoadDriverPrivilege
: SeRestorePrivilege
: SeBackupPrivilege
If I want to use this privilege, I should first grant it to the service. That can be done with this command:
sc privs gpsvc SeTakeOwnershipPrivilege/SeIncreaseQuotaPrivilege/SeAssignPrimaryTokenPrivilege/SeSecurityPrivilege/SeChangeNotifyPrivilege/SeCreatePermanentPrivilege/SeShutdownPrivilege/SeLoadDriverPrivilege/SeRestorePrivilege/SeBackupPrivilege/SeCreateSymbolicLinkPrivilege
After that you will be able to use mklink inside GPO scripts.
There are several caveats:
You should list all privileges (current + new); otherwise you risk replacing them all with the single one you pass.
The command needs System account permissions to set privileges, so you'll need to use psexec or a GPO script (not sure).
If you intend to use psexec, it will throw an error about the argument being too long, so you should save the command as a .bat file and then run that with psexec, as in the sketch below.
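A rough sketch of that workaround, written in PowerShell for convenience (the C:\Temp paths are just examples, and the privilege list must match the current output of sc qprivs gpsvc on your machine plus the new privilege):
# Sketch: save the long sc.exe command to a .bat file, then run it as SYSTEM via PsExec
$privs = 'SeTakeOwnershipPrivilege/SeIncreaseQuotaPrivilege/SeAssignPrimaryTokenPrivilege/SeSecurityPrivilege/SeChangeNotifyPrivilege/SeCreatePermanentPrivilege/SeShutdownPrivilege/SeLoadDriverPrivilege/SeRestorePrivilege/SeBackupPrivilege/SeCreateSymbolicLinkPrivilege'
Set-Content -Path 'C:\Temp\set-gpsvc-privs.bat' -Value "sc privs gpsvc $privs"
& psexec.exe -accepteula -s cmd /c 'C:\Temp\set-gpsvc-privs.bat'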
Many thanks to @PetSerAl, who helped me figure this out.
I still had an issue running the sc.exe commands with PowerShell as a startup script via Group Policy: access was being denied, according to the Start-Transcript log file. I used PowerShell logic like the example below, and it did not work in my case.
I tried several variations of multiple things and syntaxes using PowerShell.exe, -Verb RunAs, Start-Process, and slews of other things, short of running it as a local script with Task Scheduler as SYSTEM, which I was trying to avoid.
Note: this is just a general example of one of the variations that failed, with the same result and transcript output as all the other variations I tried.
$privs = (sc.exe qprivs gpsvc).Split(":")[5..99] | % { Process { If( $_.Trim().Length -gt 0 ){ $_.Trim() } } };
$privs = $privs + "SeCreateSymbolicLinkPrivilege";
$privs = $privs -Join "/";
Invoke-Expression "sc.exe privs gpsvc $privs"
A solution that works (in my case)
I used the PowerShell logic below as a startup script via Group Policy, and creating symbolic links now works. To keep the example simple, I used Google Chrome for generalization.
Basically, I had to manipulate the multi-string registry value that holds the service's required privileges, appending the needed "SeCreateSymbolicLinkPrivilege" value that way rather than using sc.exe.
#Start-Transcript -Path C:\Log\Transcript.txt
$v = (Get-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Services\gpsvc").RequiredPrivileges;
If ( $v -notcontains "SeCreateSymbolicLinkPrivilege" ) {
    $v = $v + "SeCreateSymbolicLinkPrivilege";
    Set-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Services\gpsvc" RequiredPrivileges $v -Type MultiString;
};
$Chrome86 = "C:\Program Files (x86)\Google\Chrome";
$Chrome = "C:\Program Files\Google\Chrome";
If (!(Test-Path $Chrome86)) {
    If (Test-Path $Chrome) { New-Item -Path $Chrome86 -ItemType SymbolicLink -Value $Chrome -Force }
}
If (!(Test-Path $Chrome)) {
    If (Test-Path $Chrome86) { New-Item -Path $Chrome -ItemType SymbolicLink -Value $Chrome86 -Force }
}

How can I launch .cmd files on a remote machine?

I need to be able to launch a .cmd file that is on a remote machine, from within the directory on that machine where the file resides.
I've tried invoke-command -ComputerName test123 -ScriptBlock { cmd /c c:/myfile.cmd } in PowerShell, which launches the .cmd, but it then fails because it can't find the corresponding .cmd files that this one launches (which all reside in the same directory).
Is there a way to launch this .cmd file and have its execution persist? I.e., even after the PowerShell window is closed, the .cmd will continue to run on the remote machine.
You need to change the working directory in the scriptblock. Add a Set-Location before calling the batch script:
Invoke-Command -ComputerName test123 -ScriptBlock {
Set-Location 'C:\'
& cmd /c ".\myfile.cmd"
}
If you need to create a detached process, you can do that for instance via WMI:
$hostname = 'test123'
$command = 'C:\path\to\script.cmd'
$workdir = 'C:\working\directory'
$p = [wmiclass]"\\$hostname\root\cimv2:Win32_Process"
$p.Create($command, $workdir)
Note that you need admin privileges on the remote host for this.
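The Create method returns an object whose ReturnValue is 0 on success and whose ProcessId identifies the new, detached process, so you can capture that result rather than discarding it. A small sketch building on the snippet above:
# Sketch: check the outcome of Win32_Process.Create
$result = $p.Create($command, $workdir)
if ($result.ReturnValue -eq 0) {
    Write-Host "Detached process started with PID $($result.ProcessId)"
} else {
    throw "Win32_Process.Create failed with return value $($result.ReturnValue)"
}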
