Is it possible for the Windows Task Scheduler to read the return code of a batch script and fire off a task based on the value of that return code? I have a batch script that copies files from one drive on server A to another drive on server B. If server B is down, the script completes with a return code of 1, and I want the Task Scheduler to see that return code of 1 and send an email saying there's something wrong with server B.
It's clear the Task Scheduler sees the return codes, but I can't seem to find an option to act on them.
Another idea I've considered is having the Task Scheduler send an email if a task fails after starting up. But it appears that even when the batch file returns a 1 for a failed copy, the Task Scheduler still marks the task as having run successfully.
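One way to get this behaviour is to put the email step in a wrapper batch file and schedule the wrapper instead of the copy script directly. A minimal sketch, assuming the existing script is called copy_job.bat and that an SMTP relay is reachable from the server; every name and address below is a placeholder:

    @echo off
    rem Minimal sketch -- copy_job.bat, the addresses and the SMTP host are placeholders.
    call copy_job.bat
    if not errorlevel 1 exit /b 0
    rem Non-zero return code: assume server B is unreachable and send a notification.
    powershell -NoProfile -Command "Send-MailMessage -To 'admin@example.com' -From 'scheduler@example.com' -Subject 'Server B copy failed' -Body 'copy_job.bat returned a non-zero exit code.' -SmtpServer 'smtp.example.com'"
    exit /b 1

Scheduling the wrapper keeps the return-code decision inside the script itself, and the final exit /b 1 still surfaces the failure in the task's Last Run Result column.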
I'm using Ubuntu and have two bash script files that run in parallel. From one script, I want to continuously monitor whether the other script is still running.
So is there any way to find out whether a script file is currently executing or not?
Numerous possibilities; it is a question of creativity...
Some suggestions:
periodically poll the process list and filter it by name or process id (see the bash sketch after this list).
start the script with control sockets; as long as the sockets are open, the script is running.
use the usual locking strategy in the file system (also shown in the sketch below).
have the script emit a heartbeat on a regular basis, then watch that heartbeat.
start the script as part of a series of commands; the moment the script exits, the next command will be executed by the calling shell. That one could be a notification script or something.
have the script do some wiggling on your desktop and watch it yourself.
start it using nohup and watch the log file.
implement a daemon inside the script and connect to it periodically.
open a file from within the script and watch the file system using the fuser command.
have the monitoring script periodically write a token into a file and have the monitored script remove that token, like passing a baton.
call the script using a blocking call. The script is executing as long as that blocking call does not return.
create a singleton strategy on the process level and simply try starting the script periodically.
make the monitoring script act as a monitor daemon that the executing script connects to. If the connection is terminated, the script has obviously stopped executing.
...
Sorry, this starts getting boring...
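To make the polling and locking suggestions concrete, here is a rough bash sketch; worker.sh and the lock path are placeholders rather than names from the question:

    #!/bin/bash
    # Rough sketch -- worker.sh and /tmp/worker.lock are placeholders.

    # 1) Poll the process list: pgrep exits 0 while a matching process exists.
    while pgrep -f worker.sh > /dev/null; do
        echo "worker.sh is still running"
        sleep 5
    done
    echo "worker.sh has stopped"

    # 2) Lock-file variant: launch the worker under flock...
    #      flock -n /tmp/worker.lock ./worker.sh
    #    ...then probe the same lock from the monitor. If the lock is free,
    #    the worker is not running.
    if flock -n /tmp/worker.lock true; then
        echo "worker.sh is NOT running (lock was free)"
    else
        echo "worker.sh is running (lock is held)"
    fi

The polling variant is the simplest; the lock-file variant avoids accidentally matching an unrelated process that happens to have a similar command line.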
I am trying to run unmodified reports using batch processing in Microsoft Dynamics AX 2009. I have set up my configuration, and set up an AOS printer to run the report on. When I send a report to the batch queue, it immediately has an error when it begins execution.
The error is as follows:
Error executing code: SysGlobalCache object not initialized.
(S)\Classes\SysGlobalCache\get
(S)\Classes\ClassFactory\reportRunClass - line 14
(S)\Classes\RunBaseReport\makeReportRun - line 19
(S)\Classes\RunBaseReport\unpack - line 31
(S)\Classes\RunbaseReportStd\unpack - line 26
(S)\Classes\BatchRun\runJobStatic - line 27
I have tried running three different reports: Customer, Vendor, and Purchase Lines. I get the same error every time.
Any suggestions?
We faced a similar problem at my work, but didn't want to rely on having to set up the legacy batch processing method, suggested previously. Luckily in our case, it wasn't a requirement that the report actually be printed to hard-copy. So rather than try to send the report to a printer, you can run it to a file (ASCII, PDF, etc).
The batch server can process these, but since you'll need to specify a place to save the file, watch out for the following:
Be sure to use a UNC file path for the location you wish to save to, otherwise you may get the following error: "Target file must be in UNC format."
Also be sure the necessary permissions have been applied to allow writing to that location, otherwise you may get an error such as: "Unable to open file "
I believe the issue is that the batches are trying to process server-side code, while the reports are meant to run client-side. Try the workaround at this URL:
http://blogs.msdn.com/b/emeadaxsupport/archive/2009/06/16/how-to-run-client-batches-on-ax-2009.aspx
The gist is this: you create a batch group called "Client" (or whatever), assign it to a batch server, and then run the legacy batch processor on that group. This might work for you.
Another option is to change the report to run on the server.
You'll need to check the menu item and make sure it's set to run on the server; it's a property on the menu item.
When you add a report to the batch, take a look at the batch inquiry screen.
Select the batch job, then click 'Tasks'. If the task shows 'Run Location' = Client, it won't run in the server-based batch framework.
Rob.
I was getting a similar error. I restarted the AOS and the SQL Reporting Service and it all worked fine. Hope this helps.
I have created a Windows task that runs under the Admin account with highest privileges and runs a batch file every minute.
This batch file executes a PHP script to retrieve a web page, after which it checks whether no page or the wrong content was returned.
If the result is negative, the batch routine kills the httpd process and its children using taskkill (I am currently dealing with a PHP hang that causes the Apache httpd process to hang as well).
This entire process works perfectly when executed while logged onto the machine as admin. However, when executed as a task (and despite admin privileges), the process does NOT get killed. There is no event or debug entry.
So my question is: why is taskkill unable to kill the process, how can I get more info, and what alternatives exist?
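For what it's worth, a hedged sketch of the kill step with its output captured to a log file, so the scheduled run leaves a trace even when nothing shows up in the event log (the log path is a placeholder):

    @echo off
    rem Sketch only -- C:\logs\watchdog.log is a placeholder path.
    echo %date% %time% - content check failed, killing httpd >> C:\logs\watchdog.log
    rem /IM matches by image name, /T also kills child processes, /F forces termination.
    taskkill /F /T /IM httpd.exe >> C:\logs\watchdog.log 2>&1
    echo %date% %time% - taskkill exit code: %errorlevel% >> C:\logs\watchdog.log

If the scheduled runs log an access-denied message or a non-zero exit code while interactive runs do not, that points at the session and permission difference rather than at taskkill itself.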
In Windows Task Scheduler, I have scheduled a task to run a C# console application executable on a daily basis. This application sends some data to the database and then sends an email.
When I run it manually it works; however, when it is run through the Task Scheduler, it sends data to the database but is unable to send the email.
Any ideas on how to fix this?
EDIT:
Yes, I can send the email correctly through the console application. It uses default network credentials. However, when I look at the event logs, I have the following .NET Runtime exception logged:
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.IO.IOException
Stack:
at iTextSharp.text.pdf.PdfPages.WritePageTree()
at iTextSharp.text.pdf.PdfWriter.Close()
at iTextSharp.text.pdf.PdfDocument.Close()
at iTextSharp.text.Document.Close()
at ReadReutersRates.Program.SavePDF(System.DateTime)
at ReadReutersRates.Program.Main(System.String[])
I think it has something to do with the PDF file which I am attaching to the email. But it's quite strange that this works when I run it manually.
This is a permissions error: the user you run the application as from the console will have more permissions than the user the scheduled task runs as.
It appears the PDF library you are using is trying to write to a temp file or similar and doesn't have permission to do so. (I'm assuming the email has a PDF attachment or similar that is being generated on the fly.)
If you make your task run as an administrator, that should work; you could then run it as a more restricted user and work out which permissions to apply where in order to lock it down.
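One way to narrow this down without changing the code is to wrap the executable in a small batch file and point the scheduled task at the wrapper; the install path below is an assumption based on the namespace in the stack trace:

    @echo off
    rem Sketch only -- the folder and exe name are assumptions.
    rem Give the process a writable working directory (a scheduled task without a
    rem "Start in" value defaults to C:\Windows\System32), then capture stdout and
    rem stderr so the unhandled exception is visible after a scheduled run.
    cd /d D:\apps\ReadReutersRates
    ReadReutersRates.exe >> D:\apps\ReadReutersRates\task.log 2>&1
    exit /b %errorlevel%

If the PDF is written with a relative path, giving the task a sensible working directory (either this way or via the task's "Start in" field) often resolves exactly this kind of works-manually-but-not-scheduled failure.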