In this post:
https://gist.github.com/Neo23x0/e4c8b03ff8cdf1fa63b7d15db6e3860b
the following PowerShell command is claimed to check for the Log4j vulnerability:
gci 'C:\' -rec -force -include *.jar -ea 0 | foreach {select-string "JndiLookup.class" $_} | select -exp Path
That results in access denied errors.
Further checking reveals that one has to run PowerShell as Administrator:
https://learn.microsoft.com/en-us/answers/questions/32390/access-is-denied-while-i-am-running-a-command-in-p.html
Doing that still gives access denied errors.
Further on in the above article it says to first issue:
Set-ExecutionPolicy AllSigned
That still gives Access Denied.
What is further required to get this to execute?
tl;dr
Use the following, streamlined version of the command, which should also perform much better.
# Run WITH ELEVATION (as admin):
gci C:\ -rec -file -force -filter *.jar -ev errs 2>$null | # Use -filter, not -include
select-string "JndiLookup.class" | # Pipe directly to select-string
select -exp Path
Note: -ea 0 - short for: -ErrorAction SilentlyContinue - should normally silence any error messages, but if that doesn't work for you for some reason, 2>$null should be effective.
-ev errs - short for: -ErrorVariable errs - collects all errors that occur in variable $errs, which you can examine after the fact to determine whether the errors are an indication of an actual permission problem.
Errors are expected in Windows PowerShell, even when running with elevation, namely relating to hidden system junctions, discussed below. However, you can ignore these errors.
In PowerShell (Core) 7+, where these errors no longer occur, you could omit -ev errs 2>$null above. Any errors that surface then would be indicative of a true permission problem.
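For example, to review the collected errors afterwards - a minimal sketch; it assumes the scan above was run with -ev errs:
# List the unique paths (TargetObject) behind the access-denied errors;
# with elevation, these should only be the hidden system junctions discussed below.
$errs |
  Where-Object { $_.Exception -is [System.UnauthorizedAccessException] } |
  ForEach-Object { $_.TargetObject } |
  Sort-Object -Unique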
Background information:
In general, even running with elevation (as admin) doesn't guarantee that all directories can be accessed. File-system ACLs at the directory level can prevent even elevated processes from enumerating the directory's files and subdirectories.
Notably, there are several hidden system junctions (links to other directories), defined for pre-Vista backward-compatibility only - such as C:\Documents and Settings and C:\Users\<username>\My Documents - that even elevated processes aren't permitted to enumerate the children of.
During file-system enumeration, this fact only becomes apparent in Windows PowerShell, which reports access-denied errors for these junctions. PowerShell (Core) 7+, by contrast, quietly skips them.
Even in Windows PowerShell the problem is only a cosmetic one, because these junctions merely point to directories that can be enumerated with elevation and are therefore covered by a -Recurse enumeration of the entire drive.
To find all these hidden system junctions:
# Run WITH ELEVATION (as admin):
cmd /c dir c:\ /s /b /ashdl
Additional information is in this answer.
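A PowerShell-native alternative for listing them - a sketch, assuming PowerShell 3+ where the -Attributes parameter is available:
# Run WITH ELEVATION (as admin):
Get-ChildItem C:\ -Recurse -Force -ea 0 -Attributes Directory+Hidden+System+ReparsePoint |
  Select-Object -ExpandProperty FullName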
Recently I've started transitioning to using PowerShell 7 for my scripting needs, since most of my required modules are now compatible.
However the "MSOnline" module has not yet been updated, and they recommend using the -UseWindowsPowerShell parameter when importing the module.
This is absolutely great, and the module can be used to (almost) its full potential.
However the -ErrorVariable parameter does not seem to work, and I'm therefore unable to do any error handling.
See below code for an example of what doesn't work:
Import-Module MSOnline -UseWindowsPowerShell
Connect-MsolService -ErrorAction SilentlyContinue -ErrorVariable msolError
Write-Host $msolError
It seems that the -ErrorAction parameter is respected, but I can't seem to get anything inside the error variable.
Does anyone know if it is possible to somehow get the error values of a module loaded like this?
-ErrorAction SilentlyContinue will suppress any error messages. Remove it from the command to see the error output.
The command should be:
Connect-MsolService -ErrorVariable "msolError"
Then just call $msolError to view the error.
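If the implicit-remoting proxy still leaves the variable empty, a fallback worth trying - a sketch, not verified against every MSOnline version - is to promote the error to a terminating one and catch it:
Import-Module MSOnline -UseWindowsPowerShell
try {
    Connect-MsolService -ErrorAction Stop
}
catch {
    # $_ is the ErrorRecord surfaced through the proxy
    $msolError = $_
    Write-Host $msolError
}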
I know the error is pretty much self-explanatory, but I am still not able to find a solution. I am writing a PowerShell script to automate the set-up of dev machines. There is a set of programs that must be installed, so what I want to do is first download each file and then install it. I am having problems with downloading the file from the web to the local machine.
My logic is as follows. I have an .xml file where I configure all the stuff. For the downloads it looks like this:
<download source="https://github.com/msysgit/msysgit/releases/download/Git-1.9.5-preview20150319/Git-1.9.5-preview20150319.exe"
destination="C:\Temp" filename="Git-1.9.5-preview20150319.exe"/>
Then in my PowerShell script file I have this function:
function install-tools() {
    Set-ExecutionPolicy RemoteSigned
    $xmlFileInformation = $config.SelectNodes("/setup/downloads/download");
    foreach ($file in $xmlFileInformation)
    {
        $("Filename: " + $file.filename)
        $("File source: " + $file.source)
        $("File destination: " + $file.destination)
        $("****************************************************************");
        $("*** Downloading " + $file.filename);
        $("****************************************************************");
        Invoke-WebRequest $file.source -OutFile $file.destination
    }
    $("Task finished");
}
After executing I get the error from the title: UnauthorizedAccessException from PowerShell when using Invoke-WebRequest. Two things I can mention are that I have included Set-ExecutionPolicy RemoteSigned and that I execute the script running PowerShell as administrator. I've tried different paths, but it's the same every time; I don't get permission to write anywhere. The only thing that I can't try is using another drive, as I have only one - C:\.
And one strange thing: my destination directory is C:\Temp, but during one of my attempts I didn't have such a directory in C:\, so I ended up with a file named Temp in my C:\ - this was the closest I got to getting a file.
I don't need to save those files in a particular place, since it's entirely possible to delete the whole directory after a successful set-up; what I need is a way to let PowerShell save files somewhere on my C:\ drive. I'm not sure if this is related to administering my system and setting the correct rights (I tried to lower the protection as much as I can), or if I'm missing something in my PowerShell script.
You didn't specify the file name to download to. Replace
-OutFile $file.destination
with
-OutFile (Join-Path $file.destination $file.filename)
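Putting it together, the corrected loop might look like this - a sketch; it also creates the destination directory first, which explains the stray file named Temp you got when C:\Temp didn't exist:
foreach ($file in $config.SelectNodes("/setup/downloads/download")) {
    # Ensure the destination directory exists; otherwise Invoke-WebRequest
    # writes a *file* named after the directory instead.
    New-Item -ItemType Directory -Path $file.destination -Force | Out-Null
    Invoke-WebRequest $file.source -OutFile (Join-Path $file.destination $file.filename)
}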
I have a small script on my Domain Controller that is setup to email me via SMTP about the latest Security Event 4740.
The script, when executed manually, will run as intended; however, when setup to run via Scheduled Tasks, and although it shows to have been executed, nothing happens (no email).
The script is as follows:
If (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    # Re-launch the script elevated if it isn't already running as admin
    $arguments = "& '" + $myinvocation.mycommand.definition + "'"
    Start-Process powershell -Verb runAs -ArgumentList $arguments
    Break
}

$Event = Get-EventLog -LogName Security -InstanceId 4740 -Newest 5
$MailBody = $Event.Message + "`r`n`t" + $Event.TimeGenerated
$MailSubject = "Security Event 4740 - Detected"

$SmtpClient = New-Object system.net.mail.smtpClient
$SmtpClient.host = "smtp.domain.com"
$MailMessage = New-Object system.net.mail.mailmessage
$MailMessage.from = "fromemail@domain.com"
$MailMessage.To.add("toemail@domain.com")
$MailMessage.IsBodyHtml = 1
$MailMessage.Subject = $MailSubject
$MailMessage.Body = $MailBody
$SmtpClient.Send($MailMessage)
Scheduled Task is setup as follows:
RunsAs:LOCAL SYSTEM
Trigger: On event - Log: Security, Event ID: 4740
Action: Start Program - C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Argument: -executionpolicy bypass c:\path\event4740.ps1
I have also tried the following:
Trigger: On event - Log: Security, Event ID: 4740
Action: Start Program - C:\path\event4740.ps1
According to the task's history: Task Started, Action Started, Created Task Process, Action Completed, Task Completed. I have looked through various links on the site with the same 'issue', but they all seem to have some sort of variable that I do not have. I have also tried some of the mentioned solutions, thinking they might be somewhat related, but alas nothing is working. I have even tried removing my Scheduled Task and recreating it as mentioned here: http://blogs.technet.com/b/heyscriptingguy/archive/2012/08/11/weekend-scripter-use-the-windows-task-scheduler-to-run-a-windows-powershell-script.aspx
Has anyone run into this type of error before or know how to bypass this issue?
Troubleshooting:
I decided to try and call a .bat file via a scheduled task. I created a simple file that would echo the current date/time to a monitored folder. Running the file manually, and via a task triggered by the 4740 event, achieved the desired results. Changing the .bat file to instead call the .ps1 file worked manually, but when triggered by the 4740 event, the .bat will no longer run.
Change your Action to:
powershell -noprofile -executionpolicy bypass -file C:\path\event4740.ps1
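For reference, the same action can be registered from PowerShell - a sketch using the ScheduledTasks module (Server 2012+); the task name is made up, and the startup trigger is a stand-in, since New-ScheduledTaskTrigger cannot create event triggers directly:
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\path\event4740.ps1'
$trigger = New-ScheduledTaskTrigger -AtStartup   # stand-in for the event trigger
Register-ScheduledTask -TaskName 'Event4740Mail' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest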
On a Windows 2008 server R2: In Task Scheduler under the General Tab -
Make sure the 'Run As' user is set to an account with the permissions required to execute the script.
Also, I believe you have the "Run only when user is logged on" option checked. Change that to "Run whether user is logged on or not". Leave the "Do not store password" option unchecked, and you'll probably need the "Run with highest privileges" option checked.
NOTE: Please ensure that you select the Create Basic Task action and NOT the Create Task action.
I found the following solution:
1) Make powershell.exe run as administrator:
right-click on the powershell.exe icon
click on Properties in the shortcut menu
click on the Advanced button and check that "Run as administrator" is checked.
2) In the Task Scheduler window, under the Actions pane, add the following as a new command:
%SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe -NoLogo -NonInteractive -ExecutionPolicy Bypass -noexit -File "C:\ps1\BackUp.ps1"
Although you may have already found a resolution to your issue, I'm still going to post this note to benefit someone else. I ran into a similar issue.
I basically used a different domain account to test and compare. The task ran just fine with "Run whether user is logged on or not" checked.
A couple of things to keep in mind and make sure of:
The account being used to execute the task must have "Log on as batch job" rights under the local security policy of the server (or be a member of the local Admins group). You must specify the account that needs to run the scripts/bat files.
Make sure you are entering the correct password characters.
Tasks on 2008 R2 don't run interactively, especially if you run them as "Run whether user is logged on or not". The task will likely fail if the script looks for objects/resources specific to the profile of the user who created the task, since the PowerShell session needs that info to start; otherwise it will start and immediately end.
As an example, consider defining $Path with a mapped drive while the task runs as "Run whether user is logged on or not": the script looks for that drive when the task kicks off, but since the account validated to run the task is not logged on, the resource the script refers back to is not present, and the task just terminates. Prefer the actual UNC path (\\server\share) over a mapped drive (X:\); see the sketch below.
Review your steps, script, and arguments. Sometimes the smallest piece makes a big difference, even if you have done this process many times. I have several times missed a character when entering the password, or a semicolon when building the script or task.
Check this link and hopefully you or someone else can benefit from this info: https://technet.microsoft.com/en-us/library/cc722152.aspx
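To illustrate the mapped-drive point above - a sketch; the server and share names are placeholders:
# Prefer a UNC path in scripts run by the Task Scheduler:
$Path = '\\server\share\folder'   # resolvable without a logged-on session
# $Path = 'X:\folder'             # mapped drive; may not exist in the task's session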
If you don't have any error messages and don't know what the problem is - why the PowerShell script won't start from a Scheduled Task - do the following steps to get the answer:
Run CMD as the user who is set to execute the PowerShell script in the Scheduled Task
Browse to the folder where the PowerShell script is located
Execute the PowerShell script (remove any statements inside the script that block error notifications, such as $ErrorActionPreference = 'SilentlyContinue')
You should be able to see all error notifications.
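For step 1, one way to start a shell as the task's account is runas - a sketch; DOMAIN\taskuser and the script path are placeholders:
runas /user:DOMAIN\taskuser "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\path\script.ps1"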
In the case of one of my scripts it was:
"Unable to find type [System.ServiceProcess.ServiceController]. Make sure that the assembly that contains this type is loaded."
And in that case I had to add an additional line at the beginning of the script to load the missing assembly:
Add-Type -AssemblyName "System.ServiceProcess"
And the next errors were:
Exception calling "GetServices" with "1" argument(s): "Cannot open Service Control Manager on computer ''. This operation might require other privileges."
select : The property cannot be processed because the property "Database Name" already exists
Good morning,
I know this is an old thread, but I just ran across it while looking for a similar problem: the script was running successfully but not doing its work. I can't find the post that helped me, but my issue was that I was running the script as the domain admin. When I followed the post's suggestion and added the domain admin to the local Administrators group, it worked. I hope this helps others with the same issue.
Joe
Implemented the ExecutionPolicy Bypass argument to get the scheduled task working.
Program: Powershell.exe
Add Arguments: -ExecutionPolicy Bypass -File C:\pscommandFile.ps1
Found successful workaround that is applicable for my scenario:
Don't log off, just lock the session!
Since this script runs on a Domain Controller, I log in to the server via the Remote Desktop console and then log off of the server to terminate my session. When setting up the task in the Task Scheduler, I was using user accounts and local services that did not have access to run in an offline mode, or to log on strictly to run a script.
Thanks to some troubleshooting assistance from Cole, I got to thinking about the RunAs function and decided to try and work around the non-functioning logons.
Starting in the Task Scheduler, I deleted my manually created tasks. Using the new function in Server 2008 R2, I navigated to a 4740 Security Event in the Event Viewer, used right-click > Attach Task to this Event..., and followed the prompts, pointing to my script on the Action page. After the task was created, I locked my session and terminated my Remote Desktop console connection. With the profile locked and not logged off, everything works like it should.
In addition to the advice above, I was getting an error and found the solution at the following link: http://blog.vanmeeuwen-online.nl/2012/12/error-value-2147942523-on-scheduled.html
Also this can help:
In task scheduler, click on the scheduled job properties, then settings.
In the last listed option:
"if the task is already running, the following rule applies:"
Select "stop the existing instance" from the drop down list.
I think the answer to this is relevant too:
Why is my Scheduled Task updating its 'Last Run Time' correctly, and giving a 'Last Run Result' of '(0x0)', but still not actually working?
Summary: Windows 2012 Scheduled Tasks do not see the correct environment variables, including PATH, for the account the task is set to run as. But you can test for this, and once you understand what is happening, you can work around it.
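To test for this - a sketch: schedule the one-liner below under the same account and compare its output with what you see in an interactive session (the output path is a placeholder):
# Capture the environment the task actually sees:
Get-ChildItem Env: | Out-File C:\Temp\task-env.txt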
One more idea that worked. It's really silly, but apparently the default target OS setting (bottom right corner of the screen) is Vista / Windows Server 2008. As we're past the 10-year mark, it is likely that your PowerShell script is not compatible with these.
Changing the target to Windows Server 2016 did the trick for me.
I was having almost the same problem, but slightly different, on Server 2012 R2. I have a PowerShell script in Task Scheduler that copies 3 files from one location to another. If I run the script manually from PowerShell, it works like a charm, but when run from Task Scheduler, it only copies the first 2 small files, then hangs on the 3rd (a large file). I was also getting a result of "The operator or administrator has refused the request", and I had tried almost everything in this forum.
Here is the scenario and how I fixed it. It may not work for others, but just in case:
Scenario:
1. Powershell script in Task Scheduler
2. Ran using a domain account which is a local admin on the server
3. Selected "Run whether user is logged on or not"
4. Run with highest privileges
Fix:
1. I had to log in to the server using the domain account so that it created a local profile in C:\Users.
2. Checked and made sure that the user has access to all the drives I referred to in my script.
I believe #1 is the main fix for me. I hope this works for others out there.
In my case (the same problem), it helped to add -NoProfile to the task action's command arguments and to check the "Run with highest privileges" checkbox, because UAC is on (active) on my server.
I have another solution for this problem that might apply to some of you.
After I created my PowerShell script (xyz.ps1), I opened it in Notepad for subsequent editing. Hence Windows made an association between my xyz.ps1 file and notepad.exe, and the Scheduler was trying to run my PowerShell script (xyz.ps1) with notepad.exe in the background instead of executing it in PowerShell. I found this problem by paying close attention to the "Display all running tasks" section in the Scheduler, which showed that notepad.exe was being used to run the xyz.ps1 script. To verify this, I right-clicked on my xyz.ps1 file in Windows Explorer, went to "Properties", and it showed Notepad against the "Opens with" section. Then I changed "Opens with" to %SystemRoot%\SysWOW64\WindowsPowerShell\v1.0\powershell.exe. This did the trick: now the Scheduler executes my xyz.ps1 using powershell.exe and gives me the desired results.
To locate your powershell.exe, refer to this article:
https://www.powershelladmin.com/wiki/PowerShell_Executables_File_System_Locations
I had a very similar issue: I was keeping the VS Code window with the PowerShell script open the whole time while running the scheduled task manually. I just closed it and the task started working as expected.
I had the same issue while running a couple of scripts. When I executed them manually from Task Scheduler, the scripts ran flawlessly,
but they were not executing at the scheduled time automatically.
The following resolution worked for me
Find the location of powershell.exe, right-click it and go to the security options, add "Authenticated Users" to the group or user names, and give it Full Control.
Once this is done, wait for the script to execute.
If you are having this problem under Windows 10, this might solve it, as it did for me. An update messed up the Task Scheduler.
http://answers.microsoft.com/en-us/windows/forum/windows_10-performance/anniversary-update-version-1607-build14393-breaks/d034ab52-5d49-4b92-976a-a1355b5a6e6d?page=2
This comment solved my problem.
Your tip about "one-time" tasks works great - it will definitely be sufficient as a workaround until MS fixes the issue. The only advantage to "daily", as far as I can see, is the lack of the arbitrary date associated with the run time. It might be confusing to others as to why the job is set to start on X date.
Trigger settings: "Einmal" means "one-time", "Sofort" means "at once".
In my case it was related to a reference to another .ps1 file inside the script, which was not signed (you need to unblock it in the file's properties). I also added this as the first line:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Force
Then it worked.
My fix for this problem was to ensure I used full paths for all file names in the .ps1 file.
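A related pattern - a sketch using the built-in $PSScriptRoot variable (PowerShell 3+); 'input.csv' is a placeholder:
# Build paths relative to the script file itself, so the task's working
# directory doesn't matter:
$dataFile = Join-Path $PSScriptRoot 'input.csv'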
I had a similar problem where only half the script would run using Task Scheduler, but it would run fine under the same account when running the script manually. The problem was that I was referencing my own module. When I added the functions directly to my script file, the Task Scheduler worked, but when I used the module, the Task Scheduler failed. The same code (module) running under the same account worked fine without Task Scheduler.
I think this was some type of issue with how Windows handles environment variables when doing a run-as. When I referenced the module via its full path (instead of the module name), it worked from Task Scheduler.
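For example - a sketch; the module path is a placeholder:
# Import the module by full path instead of by name, so resolution doesn't
# depend on the task account's PSModulePath:
Import-Module 'C:\Scripts\Modules\MyModule\MyModule.psd1'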
After trying a lot of things...
Task Scheduler action: powershell.exe -noexit & .\your_script.ps1
Be sure to put your script in this folder: windows\system32
Good luck!