I am working with a log analytics tool called Splunk, which executes a PowerShell script and stores the output.
The script takes a pretty long time to execute, and the output is around 100 MB+ by the time the script completes.
When I checked the script's log, I found that execution had paused and then continued after around an hour!
After looking at the question about the QuickEdit mode of the PowerShell console mentioned here, there are two possible reasons why this can happen:
If the QuickEdit mode of the PowerShell console is on and some text or area in the terminal is selected. But this is not the case here, as PowerShell is invoked as a background process.
If the stdout throughput is more than the host console can handle. This is plausible, since the output of the script is very large, as mentioned.
Note: there is no sleep or stdin expected in the PowerShell script, as it is meant for collecting performance metrics.
What is the exact reason for the hang of the process in this case (one of the two mentioned above, or something else), and how can I prevent it?
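For what it's worth, if the second cause applies, one common mitigation is to keep the bulky output off the console entirely and write it to a file from inside the script. A minimal sketch, assuming a Get-Counter-based collector; the counter and the log path are placeholders, not from the actual script:

# Write metric samples to a file instead of stdout, so the console
# buffer cannot fill up and suspend the process. Paths are placeholders.
$log = 'C:\logs\metrics.log'
Get-Counter -Counter '\Processor(_Total)\% Processor Time' -MaxSamples 5 |
    Out-File -FilePath $log -Append -Encoding utf8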
Related
I recently "inherited" a complex PowerShell script that performs tasks in the background via the Task Scheduler. Now we're seeing that the script hangs on some occasions, but so far I've been unable to identify the root cause.
Is there a way to poll, or attach a debugger to, an already running script so I can get the current line number without rewriting large portions? In its current state, at 20k lines of code, the maintainability of the script is sub-par.
I tried checking WMI for properties, but found nothing useful. I did find a Chronometer script that may be useful: https://powershellexplained.com/2017-02-05-Powershell-Chronometer-line-by-line-script-execution-times/ .
I also wrote a debugging wrapper, but the hang only happens on some occasions; I am unable to reproduce it on demand.
Thanks
Have you looked at Enter-PSHostProcess?
The Enter-PSHostProcess cmdlet connects to and enters into an interactive session with a local process. Instead of creating a new process to host PowerShell and run a remote session, the remote, interactive session is run in an existing process that is already running PowerShell. When you are interacting with a remote session on a specified process, you can enumerate running runspaces, and then select a runspace to debug by running either Debug-Runspace or Enable-RunspaceDebug
Enter-PSHostProcess
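A minimal sketch of that flow, assuming PowerShell 5.0 or later and that the stuck script runs in a local powershell.exe process; the PID 1234 is a placeholder (find the real one with Get-Process powershell):

# Attach to the process hosting the stuck script.
Enter-PSHostProcess -Id 1234
# List the runspaces hosted in that process; note the Id of the busy one.
Get-Runspace
# Break into the running runspace; the debugger prints the line
# currently executing, which answers the line-number question.
Debug-Runspace -Id 1
# Inside the debugger: s (step into), v (step over), c (continue),
# d (detach without stopping the script). Then leave the process:
Exit-PSHostProcess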
I have a batch script in which I call a program several times; that program is failing. This is a known issue and not caused by me. The crash is not intended, but in my case it does not matter, as my desired output is generated before the crash. Every time it fails, the following well-known window appears:
[screenshot of the Windows "program has stopped working" dialog]
As said, I don't care about that misbehaviour of the application and I am happy with the result produced before the failure. But in my script I call that program several times, and I always have to cancel that dialog. I only want the script to terminate without showing that window, similar to Linux systems, where execution aborts (for example with a segmentation fault message) and the command line can be used again without any interaction.
How can I call the program so that this window does not pop up and the script runs with no action by the user?
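One common way to achieve this (assuming the window is the standard Windows Error Reporting dialog, as it appears to be) is to suppress the WER UI via its documented DontShowUI registry value; a hedged sketch in PowerShell, run once before the batch script:

# Suppress the interactive crash dialog for the current user.
# DontShowUI = 1 is a documented Windows Error Reporting setting;
# crashed programs then terminate silently, much like on Linux.
Set-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\Windows Error Reporting' -Name 'DontShowUI' -Value 1 -Type DWord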
I'm attempting to run an automated PowerShell script all day, every day, but I'm having problems getting it to run consistently and reliably. The script itself runs fine; it's getting the Windows scheduler to run it consistently that's the problem.
The script is invoked by the Windows scheduler every morning at 1am as powershell.exe with the command arguments:
-windowstyle Normal -NoExit -file "d:\work\PwrShellScripts\FlmToDb_010.ps1"
Once invoked, the script will run continuously until 11pm at night when it will exit.
The script itself works reliably, but the scheduling only works nine times out of ten; once in a while it fails with the error:
Task Scheduler did not launch task "\DailyFlm" because instance "{aa18e048-d8b2-4e16-8737-fc7babbb609e}" of the same task is already running.
The question is, how to get the script to run reliably every day?
Other info that may be relevant...
The arguments -windowstyle Normal -NoExit mean that the PowerShell script runs in a command window (rather than as a background process) and that the window remains open after the session ends.
This is done for two reasons: firstly, it provides a visual indication that the process is actually running; secondly, if the process fails, it allows the error message to be inspected. The PowerShell script doesn't include any file logging, so running it in a command prompt also lets me confirm that the previous day's session made a clean exit when it stopped.
One of the issues is that, because the process works 90% of the time, if I make any tweaks I have to wait 10 days or more to confirm whether they've really worked!
I suspect the issue may be related to the fact that the console remains open (-NoExit) after the script exits. Most of the time Windows seems to recognise that, although the console is still open, the associated script has exited.
My guess is that occasionally it decides that, since the console is open, the process is still running. I'm unable to spot any difference between the occasions when the scheduling works fine and those when it doesn't.
Any suggestions?
Updates...
The scheduler fails to start the job on average once every 10 days. I would prefer to keep the script running in the foreground: it makes monitoring its progress so much easier, and makes it obvious if it does fail.
the same task is already running.
The script may do what it's supposed to while it runs, but there is one flaw: it's not closing properly. It's exactly the -NoExit issue you talk about. When you run a PowerShell task, the process name is powershell.exe, with its associated process ID, and that's how the Task Scheduler knows whether it has finished or not.
To fix this, I suggest writing a script that kills all powershell.exe processes and scheduling it to run right at 12:55am every day.
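A minimal sketch of that cleanup script, assuming Windows PowerShell (so the process name is powershell.exe); note that it stops every PowerShell instance on the machine, including interactive sessions:

# Stop all powershell.exe instances; schedule this task for 12:55am daily.
# Caution: this also terminates any interactive PowerShell windows.
Get-Process -Name powershell -ErrorAction SilentlyContinue | Stop-Process -Force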
I have triggered a Unix script which basically pulls details from a DB and creates an XML for many IDs (approximately 1000 items). After generating all the XML it will post it to a queue. Right now I have a doubt: my script is running in a shell session, and I am just monitoring it using PuTTY. Will my script continue executing if I shut down my system?
As Barmar's comment suggests, you should first start a screen session by typing screen in the shell.
Next, you should start your script.
Finally, you should detach from your screen session and log out with Control-a, d.
Whenever you log in again, if you want to see the progress, you can run screen -r to reattach to the session.
If you executed your script in the foreground (or forked it to the background with &), it will terminate once your session closes.
To avoid this, you have a few alternatives:
Use nohup with & to force the program to ignore the SIGHUP signal, e.g. nohup ./yourscript.sh & (the script name here is illustrative).
Use a scheduler like cron, at, or your init system to run the process at a scheduled time, independent of your login session.
Use a virtual terminal multiplexer like GNU screen or tmux to run the program in the "background" on a resumable terminal.
Using a scheduler is probably the most versatile of the three, but a multiplexer can be useful if you'd like to see the script's output in real time.
I have a Windows batch file that is invoked by the Windows scheduler. When I try to have multiple Windows scheduler tasks run the batch file simultaneously, the batch file is locked by the first process and all the other instances fail.
Is there a way in Windows to run multiple instances of a batch file simultaneously?
My script is a simple one all it does is:
set java_classpath
java javaClass
There is nothing inherent to batch file mechanics that limits the number of processes that can simultaneously run the same script. The actual batch script is not locked when it is run. In fact, it is possible to modify a batch script while it is running, though that is usually a very bad idea.
But a batch script could take any number of actions that would prevent simultaneous runs. The most obvious is if the script redirects output to a specific file (constant path and name). The output redirection establishes an exclusive lock that prevents any other process from obtaining the same lock; giving each instance a unique output file (for example, one incorporating %RANDOM%) avoids the collision.
Another possibility is your script could be calling an external command or program that establishes an exclusive lock in some way.
Either way, there should be nothing to prevent multiple processes from launching the same script simultaneously. But if the script establishes an exclusive lock, then one or more of the instances may crash, exit prematurely, or seem to hang, depending on how the failed lock acquisition is handled.
There really isn't any way to be more specific unless you post your actual script. But if it is a long script, then you should attempt to isolate where the problem is occurring before posting.
Windows 8 task scheduler has the following option (on the last, "Settings" tab):
If the task is already running, then the following rule applies:
Do not start a new instance (default)
Run a new instance in parallel
...
You should probably change this setting. I would also suggest you look into http://serverfault.com and post there.
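If you prefer to script the change instead of using the UI, here is a hedged sketch using the ScheduledTasks PowerShell module (Windows 8/Server 2012 and later); 'MyBatchTask' is a placeholder name, and note that New-ScheduledTaskSettingsSet resets the task's other settings to their defaults:

# Allow multiple instances of the scheduled task to run in parallel.
# 'MyBatchTask' is a placeholder; substitute your task's name.
$settings = New-ScheduledTaskSettingsSet -MultipleInstances Parallel
Set-ScheduledTask -TaskName 'MyBatchTask' -Settings $settings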
Did you try calling your batch file using %systemroot%\System32\cmd.exe /K C:\path\batchfile.bat? With /K a new instance of cmd is opened each time; I guess it is the shell, not the file, that is causing the strange behaviour.
To people coming here from Google simply looking for a way to run multiple instances of a .bat file simultaneously: a simple way would be this script:
@echo off
rem Launch N independent instances of the script, each in its own window.
set N=3
for /L %%i in (1,1,%N%) do (
    start "" yourscript.bat
)