Detect a missing file based on its date - Windows

I need to manually monitor a random Windows-based backup program for failures, and the only way to do this is by checking for the daily backup file it makes.
A batch file would be simplest for me to implement. It would need to scan a directory for a certain file type and then do X if no file of that type exists that was created or modified yesterday.
C:\backups\random-named-file.tib 27/03/2017
C:\backups\random-named-file.tib 28/03/2017
C:\backups\random-named-file.tib 29/03/2017
So if a random-named-file.tib wasn't created or modified yesterday, it means the backup failed, and I would write output to the console saying "Backup Failed".
Does anyone have any ideas?
EDIT:
To clarify, the backup files are coming from a custom application that has no way to check for missed backups. The batch file solution I'm looking for will be integrated into our RMM platform, which will take the results of the batch file and raise a support desk ticket for our service desk to action.
The backup application applies its own retention policies and deletes any backups older than 4 days.
I'm just trying to detect if a backup was missed the previous day. The filenames are variable; the only constant is the file extension (*.tib).
This backup app doesn't write event logs, nor does it create its own log files that I can parse for failure messages.
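A minimal sketch of the kind of check being asked for, assuming the backups land in C:\backups and shelling out to PowerShell for the date comparison (the folder path and output messages are placeholders, not part of the original question):
@echo off
rem Hypothetical check: the path and messages below are placeholders.
rem Exits 0 if any *.tib was written since the start of yesterday,
rem otherwise prints "Backup Failed" and exits 1 for the RMM to act on.
powershell -NoProfile -Command "if (Get-ChildItem 'C:\backups\*.tib' -ErrorAction SilentlyContinue | Where-Object { $_.LastWriteTime -ge (Get-Date).Date.AddDays(-1) }) { exit 0 } else { exit 1 }"
if errorlevel 1 (
    echo Backup Failed
    exit /b 1
)
echo Backup OK
exit /b 0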

1) I would suggest you go through this article:
four-ways-to-manage-windows-server-2008-backup-on-multiple-servers
2) The Windows Task Scheduler can act according to your situation:
Windows Task Scheduler Wiki
3) You can write scripts which alert you whenever it fails (see the sketch after this list)
4) I have found a solution to a similar kind of problem:
make-windows-task-scheduler-alert-on-fail
TL;DR: see point 4.
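For point 3, one way a script can raise an alert that monitoring tools commonly pick up is to write an entry to the Application event log. A minimal sketch (the source name, event ID and message are placeholders):
@echo off
rem Hypothetical alert step: source name, event ID and description are placeholders.
rem Writing to the event log may require administrative rights.
eventcreate /T ERROR /ID 100 /L APPLICATION /SO BackupCheck /D "Backup Failed: no recent .tib file found" >nul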

I managed to do what you asked. On a quick note, since it's a Windows backup program, there should be a way to have it alert automatically when a backup fails.
Note about this example: I used a specific folder structure. If you want to change the location of the batch file, the dir command inside the batch file needs to be updated with the appropriate path. In this example the path is relative, but it's easy to change, so I leave that up to you.
Here is my folder structure:
|   filename1.tib
|   filename2.tib
|   filename3.tib
|   filename5.tib
|
\---batch                 # Folder with the batch file
        backups.txt       # check_backups.bat creates this file
        check_backups.bat # Batch file that does the job
Here is the code of the batch file (check_backups.bat):
@echo off
setlocal EnableDelayedExpansion
rem List all files (not directories) in the parent folder into backups.txt next to this script
dir ..\* /a-d /b > "%~dp0backups.txt"
set count=1
FOR /F "usebackq delims=*" %%A IN ("%~dp0backups.txt") DO (
    SET fl=%%~nA
    rem The digits after "filename" (position 8 onward) should match the expected sequence number
    IF !fl:~8! NEQ !count! (
        echo Backup Failed: Missing file is 'filename!count!.tib'
        goto failed
    )
    SET /A count=!count!+1
)
echo All is good
:failed
pause
If filename9.tib is missing you will get the message: "Backup Failed: Missing file is 'filename9.tib'".
If everything is good you will get the message: "All is good".

Related

File Move Scheduled Tasks not running bat file

I know this question might have been asked in other ways, but everything I have read and tried has not yet fixed my problem, so I am hoping to get some help here with the context of my issue.
The problem:
I need to move files from a local drive to a network drive (the network drive is a SharePoint mapped library) on my server in Windows Azure (I don't think the Azure part matters, but it provides context).
My thought was to schedule a task that runs a bat file to move the files I need moved, and to do so frequently (every 5 to 10 minutes). The batch file I have created does what I need when I run it manually, but not when the task runs it.
Here is the batch file:
echo Write log file > LogStart.txt
C:\Windows\System32\robocopy.exe "\\PCICSWKS001\D$\ToBeMoved" "V:" /s /e /MOV /r:0 /W:0
echo Write log file > LogEnd.txt
You can see that it writes a log file before and after running robocopy.exe.
When the task runs it does write both of these log files so I know that the batch file is at least running.
You can also see that I have tried using the UNC path for the drive in the source; that is because I read that the Task Scheduler might not be able to resolve drive letters properly. When I put the machine name in for the destination and run the batch file myself or via the scheduler, it errors.
(Screenshots of the task's Action settings and General tab omitted.)
Any assistance would be much appreciated.
I had that same error with Robocopy after a few times of running it:
ERROR 3 (0x00000003) Getting File System Type of Destination
I think it has to do with how robocopy scans the destination when it has a fair amount of files on it. It worked fine when I started the copy job and the Azure destination was empty.
Anyway, I think you should try the AzCopy command, as it should be less error-prone and faster, since it was designed for this kind of thing. Its command-line switches are similar to robocopy's, so it should feel pretty familiar.
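For reference, a rough sketch of what the equivalent call might look like with the classic AzCopy syntax, assuming an Azure blob container as the destination (the account, container and storage key are placeholders, not values from the question):
REM Hypothetical example: account, container and storage key are placeholders.
REM /S recurses into subfolders, similar in spirit to robocopy's /s /e.
AzCopy /Source:"\\PCICSWKS001\D$\ToBeMoved" /Dest:https://youraccount.blob.core.windows.net/yourcontainer /DestKey:yourStorageKey /S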

Windows Batch File and an FTP Datestamp

I am struggling to grasp the concept of writing batch files. I am hoping to schedule a .bat file to download a file with a timestamp on a daily basis. Setting up the scheduled task etc. is straightforward; my problem arises when I try to execute the code that takes the fixed part of the filename and appends a datestamp to identify the appropriate file.
From the research I have done, I still cannot get my code to pick up yesterday's date in YYYYMMDD format. E.g. if today is the 15th of August, I would like my batch to identify the file with 20130814. My existing (static) FTP batch code is:
option confirm off
option batch abort
open Ftp_name
lcd "U:\XXXX\YYYY\ZZZZ"
get "LoanSummary__Daily.xlsx"
Whereas I would like the batch to do something like:
get "LoanSummary__Daily_" & YYYYMMDD & ".xlsx"
Thanks.
I don't think you can dynamically build file names within an FTP script. But you can dynamically build the ftp script with a batch file prior to invoking it.
The simplest way to do this is to create an ftp script template that has the variable portion(s) represented as an environment variable with delayed expansion. For example !yesterday! could refer to an environment variable that gets expanded into yesterday's date. A simple FOR /F loop reads the template, and writes each line to a new ftp script file. The variables are automatically expanded in the process as long as delayed expansion is enabled.
Getting yesterday's date in a Windows batch file is a non-trivial exercise. There have been many methods posted on SO as well as other sites. Most methods have various limitations, the most common of which is susceptibility to problems due to differences in locale date formatting. I use a hybrid batch/JScript utility called getTimestamp.bat that will work on any Windows platform from XP onward, regardless of the locale. It is pure script, so no .exe download is needed. getTimestamp.bat is available here.
Assuming getTimestamp.bat is in your current directory, or better yet, somewhere within your PATH, then the following should work.
getYesterday.ftp.template
option confirm off
option batch abort
open Ftp_name
lcd "U:\XXXX\YYYY\ZZZZ"
get "LoanSummary__Daily_!yesterday!.xlsx"
getYesterday.bat
@echo off
setlocal enableDelayedExpansion
:: Get yesterday's date in YYYYMMDD format
call getTimestamp -od -1 -f {yyyy}{mm}{dd} -r yesterday
:: Create a temporary ftp script that uses the "yesterday" date
>temp.ftp (for /f "delims=" %%L in (getYesterday.ftp.template) do echo %%L)
:: Now simply invoke your ftp client using the temp script, something like
ftp -s:temp.ftp ...
:: Delete the temporary ftp script
del temp.ftp

Close Adobe Acrobat 4 (executed via batch) when file name not found

I have developed a solution where a PL/SQL Oracle API generates the file name of a PDF (inc. full file path) that requires printing (parameter 1) and then using the DBMS_SCHEDULER passes both that file name and a printer name (parameter 2) to the following batch file:
"C:\Program Files (x86)\Adobe\Acrobat 4.0\Reader\AcroRd32.exe" /t %1 %2
However, there are occurrences where the file name passed to the batch file does not exist. Because the file does not exist, Adobe continues to run (in the background). This stops the API from executing again until someone ends the Windows process manually, as the DBMS job is connected to the Adobe instance.
Unfortunately (unless there is a way in Oracle to check if the file exists in the directory) I cannot work around this issue on the Oracle side, therefore I need to tackle it on the Windows side.
Is there, therefore, any additional logic I can add to the batch file, or any other script, that will validate the existence of the file and, if the file does not exist, end the process? The solution must be efficient, as the printing of the PDF files is time-sensitive.
If anyone does have an Oracle side solution for this issue then I will happily supply the relevant code.
Thanks in advance.
Not launching Acrobat is easier than trying to close it. You can just check for the file existence in the batch file using the IF EXIST command:
@IF EXIST %1 (
"C:\Program Files (x86)\Adobe\Acrobat 4.0\Reader\AcroRd32.exe" /t %1 %2
) ELSE (
REM optionally report error?
)
There are ways to check for the file from Oracle, but this is probably simpler since you already have a batch file, unless you want to test and report the error earlier in the process.
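If the paths passed in can contain spaces, a small variation of the above that quotes the arguments may be safer (this is a sketch, not part of the original answer; %~1 and %~2 strip any quotes the caller already supplied before re-quoting):
@IF EXIST "%~1" (
    "C:\Program Files (x86)\Adobe\Acrobat 4.0\Reader\AcroRd32.exe" /t "%~1" "%~2"
) ELSE (
    REM The file was not found - log or report the error here instead of launching Acrobat
    echo File "%~1" not found
)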

Pause command if not succeeded instead of automatically closing

Let's say I want to copy a file or run a program, but the file or the location is not found. How can I make the command window pause when these kinds of problems occur? I made a batch file to copy files to another location, but also to run programs.
Any help will be appreciated.
In your batch script, use xcopy to do the actual copying, rather than the copy command -
xcopy, unlike copy, returns error codes depending on the result of the copy, which are documented here in the remarks section.
As Aleksandr mentioned, you can use the error codes as part of your script to pause on error.
Assuming you would like to pause on all errors, you could do script this as below:
xcopy /H /E /C /Y <source> <destination>
if %ERRORLEVEL% NEQ 0 pause
Obviously you'll need to replace <source> and <destination> in the above with the locations you are copying from and to respectively.
The /H /E /C /Y switches are just an example; in this case they instruct xcopy to copy hidden files, recurse into subdirectories, continue copying even if errors occur, and automatically overwrite files in the destination if they exist. You can tweak these to your specific needs.
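If you want to react to specific outcomes rather than pausing on every error, xcopy's documented exit codes can be tested individually. A sketch (the source and destination paths are placeholders):
@echo off
rem Sketch only: the paths are placeholders.
xcopy /H /E /C /Y "C:\source" "D:\destination"
rem "if errorlevel N" matches any exit code greater than or equal to N, so test the highest codes first.
if errorlevel 5 (echo Disk write error & pause & goto :eof)
if errorlevel 4 (echo Initialization error - not enough memory or disk space, or invalid syntax & pause & goto :eof)
if errorlevel 2 (echo Copy interrupted by CTRL+C & pause & goto :eof)
if errorlevel 1 (echo No files were found to copy & goto :eof)
echo Copy completed successfully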
Check the error code and pause if it is non-zero.

Batch process all files in directory

Right now I have a batch job I wrote that calls another file and passes in the variables that executable needs to run (password and filename).
Ex:
> cd f:\test\utils
> admin import-xml -Dimport.file=f:\DB\file1.xml -Dadmin.db.password=test123
I wrote a job that does this, but found out that there would be multiple files.
The username and password never change, but the filename differs across 15 different XML files, with maybe more coming soon.
The files will always be located in the same folder. Instead of ending up with 15-20 jobs (one for each file), can I write something that will process each file located in this directory, either waiting until one is completed before starting the next, or adding a 3-minute sleep before it starts the next file?
pushd C:\test\utils
for %%F in (F:\DB\*.xml) do (
    admin import-xml "-Dimport.file=%%~dpnxF" -Dadmin.db.password=test123
)
popd
The %%~dpnxF expands to the drive, path, base name, and extension of the current file.
If you intend to set and use environment variables (%foo%) in that loop, read help set first before you get into trouble.
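If you also want the fixed 3-minute gap mentioned in the question between files, a small variation of the loop above (timeout is available on Vista and later):
pushd C:\test\utils
for %%F in (F:\DB\*.xml) do (
    admin import-xml "-Dimport.file=%%~dpnxF" -Dadmin.db.password=test123
    rem Wait 180 seconds before the next file; /nobreak ignores key presses.
    timeout /t 180 /nobreak >nul
)
popd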
You can use the for command. Something like this in a batch file:
for %%f in (*.xml) do call myotherbatch.bat %%f
Assuming that the admin command you are running doesn't return until the job is finished, the above loop would process them sequentially.
If you run the command at the prompt (as opposed to in a batch file), only use a single %.
for file in f:\DB\*
do
admin import-xml -Dimport.file="$file" -Dadmin.db.password=test123
done
