I have about 290 files that I need to optimize in a short period of time.
When I run optipng *.png it takes about 10 minutes to complete. However, when I run optipng a*.png and optipng m*.png in two separate command-line windows, the work gets done in about 5 minutes.
Now, is there a way I can launch about 20 processes at the same time, so the work gets done faster without covering my entire desktop in windows?
A while ago I wrote a batch file for this that runs at most a fixed number of commands in parallel: Parallel execution of shell processes:
@echo off
for /l %%i in (1,1,20) do call :loop %%i
goto :eof
:loop
call :checkinstances
if %INSTANCES% LSS 5 (
rem just a dummy program that waits instead of doing useful stuff
rem but suffices for now
echo Starting processing instance for %1
start /min wait.exe 5 sec
goto :eof
)
rem wait a second, can be adjusted with -w (-n 2 because the first ping returns immediately;
rem otherwise just use an address that's unused and -n 1)
echo Waiting for instances to close ...
ping -n 2 ::1 >nul 2>&1
rem jump back to see whether we can spawn a new process now
goto loop
goto :eof
:checkinstances
rem this could probably be done better. But INSTANCES should contain the number of running instances afterwards.
for /f "usebackq" %%t in (`tasklist /fo csv /fi "imagename eq wait.exe"^|wc -l`) do set INSTANCES=%%t
goto :eof
It spawns a maximum of four new processes that execute in parallel and minimized. The wait time probably needs to be adjusted, depending on how much each process does and how long it runs. You probably also need to adjust the process name tasklist looks for if you're running something else.
There is no way to properly count the processes that are spawned by this batch, though. One way would be to create a random number at the start of the batch (%RANDOM%) and create a helper batch that does the processing (or spawns the processing program) but which can set its window title to a parameter:
@echo off
title %1
"%2" "%3"
This would be a simple batch that sets its title to the first parameter and then runs the second parameter with the third as argument. You can then filter in tasklist by selecting only processes with the specified window title (tasklist /fi "windowtitle eq ..."). This should work fairly reliably and prevent too many false positives. Searching for cmd.exe would be a bad idea if you still have some instances running, as that limits your pool of worker processes.
You can use %NUMBER_OF_PROCESSORS% to create a sensible default of how many instances to spawn.
You can also easily adapt this to use psexec to spawn the processes remotely (though that wouldn't be very viable, as you need admin privileges on the other machine and have to provide the password in the batch). You would then have to filter by process name instead.
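For readers who find the cmd syntax opaque, the throttling pattern the batch file implements (count running instances, spawn only while under the limit, otherwise wait and re-check) can be sketched in Python. This is only a sketch under the same assumptions, with a sleeping dummy job standing in for wait.exe:

```python
import subprocess
import sys
import time

def run_pool(commands, max_workers=4):
    """Run commands in parallel, never more than max_workers at once.

    Mirrors the batch file: count live instances, spawn a new one only
    when fewer than the limit are alive, otherwise wait and re-check.
    """
    procs, running = [], []
    for cmd in commands:
        while len(running) >= max_workers:
            time.sleep(0.05)  # the batch file's "ping -n 2" delay
            running = [p for p in running if p.poll() is None]
        p = subprocess.Popen(cmd)
        procs.append(p)
        running.append(p)
    return [p.wait() for p in procs]  # collect all exit codes

if __name__ == "__main__":
    # Dummy workers that just sleep, like wait.exe in the batch example.
    job = [sys.executable, "-c", "import time; time.sleep(0.2)"]
    run_pool([job] * 8, max_workers=4)
```

As in the batch version, the pool size could default to the machine's processor count (os.cpu_count() plays the role of %NUMBER_OF_PROCESSORS%).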
Looks like you can write a batch file and run your commands asynchronously from that file.
Running Windows batch file commands asynchronously
Related
I should be able to sort this, but I'm going round in circles. I know this has to do with setlocal
EnableDelayedExpansion, but I'm missing something.
Goal:
Execute a Windows cleanup script (cleanmgr.exe) on a remote machine, wait until cleanmgr.exe closes, then have
the initiating script "type" the resultant log file (generated by the cleanup script) from the remote
system in the CMD window.
What's working:
The script running on the remote machine works fine: it echoes the C: free drive space into a log file,
then cleans up the PC, then re-runs the disk-space report and echoes the result into the same log file,
so the user can see (/have transparency of) the reclaimed space via the before & after results.
What's Broken:
The WMIC command that checks for cleanmgr.exe on the target PC only works once; by the time it waits to
retry, the variable containing the hostname has been wiped out. I can see this behavior by echoing the
variable back.
Fix Attempts:
I have a hunch this has to do with the variable being lost once the if statement is run within the
parentheses. I have tried lots of options, but they all behave the same. I have tried jumping the
process out to a loop outside the original code using %1 instead of %%i, but just can't quite get there.
Thanks for any improvements.
@echo off
pushd %~dp0
color 1e
setlocal EnableDelayedExpansion
title HDD Space Checker...
for /f %%i in (hostnames.txt) do (
xcopy /y cleanupwindows-sfd.bat \\%%i\C$\IT
WMIC /node:"%%i" process call create "C:\IT\cleanupwindows-sfd.bat"
echo Waiting For Processes...
timeout -t 10 /nobreak >nul
:loop
WMIC /node:"%%i" process where name="cleanmgr.exe" get name |find "cleanmgr.exe">nul
IF "!errorlevel!"=="0" set running=var
IF "!running!"=="var" timeout -t 4 >nul & echo Still Running & goto :loop
IF "!running!"=="" timeout -t 4 >nul & type \\%%i\C$\IT\%%i_HHD_Space.log
)
pause
exit
There are at least two problems here.
Your running variable, once set, is never reset, which triggers an infinite loop.
Your goto statement inside the enclosing parentheses makes the command interpreter (cmd.exe) stop evaluating the block, so your script loses %%i; once the :loop cycle terminates, the script leaves the for loop without cycling through the remaining values in hostnames.txt.
To address that, put your processing code in a subroutine invoked with CALL, and reset the running variable at each :loop cycle:
@echo off
pushd %~dp0
color 1e
setlocal EnableDelayedExpansion
title HDD Space Checker...
for /f %%i in (hostnames.txt) do (
CALL:Process "%%i"
)
pause
exit
:Process
xcopy /y cleanupwindows-sfd.bat \\%~1\C$\IT
WMIC /node:"%~1" process call create "C:\IT\cleanupwindows-sfd.bat"
echo Waiting For Processes...
timeout -t 10 /nobreak >nul
:loop
set "running="
WMIC /node:"%~1" process where name="cleanmgr.exe" get name |find "cleanmgr.exe">nul
IF "!errorlevel!"=="0" set "running=var"
IF "!running!"=="var" timeout -t 4 >nul & echo Still Running & goto :loop
IF "!running!"=="" timeout -t 4 >nul & type \\%~1\C$\IT\%~1_HHD_Space.log
GOTO:EOF
Explanation: the CALL statement makes the command interpreter store the currently executed line of your script and its state before executing the associated subroutine. When the subroutine finishes, the command interpreter resumes execution at the next line with the restored context. This avoids losing the for-loop context.
I have a batch file that downloads 21 separate files (all very large). I have timeouts to try to allow some files to finish before trying to get more (not a nice solution), and if they all run together it uses all my system resources and everything grinds to a halt.
I can put /WAIT in my FOR loop, then it will only launch one download task at a time. I would like to launch 5 download windows, and once they are finished, continue with the next 5. I've tried to use GOTO to wait until finished but it does not work.
rem *****Step 1*****
set tstart=0
set tend=5
set incr=1
FOR /L %%a IN (%tstart%,%incr%,%tend%) DO (start /ABOVENORMAL GET_PIECE_ENSEMBLE.bat %newformatdate% 0%%a %HH%)
timeout /t 30
:WAIT
if exist flagfile.* goto WAIT
Then I have 4 more steps to download files 5-10, 11-16, etc. But if a download takes extra long, the script moves along and starts downloading more files, or moves on to the next processes, which require all the data to be downloaded first. I am trying two things here: a short timeout to ensure the data has started downloading (so a flagfile exists), and a wait while a flagfile exists. But the wait isn't working; the script still moves along and starts downloading the next bunch. My workaround has been to set a timeout of 600, but sometimes the data takes more than 10 minutes to download, sometimes it hangs and takes up to two hours, sometimes only a few minutes. If all the pieces launch at once, it freezes up the workstation and we have to restart the machine.
And here is the code of GET_PIECE_ENSEMBLE, the script that runs in the FOR loop:
set "newformatdate=%1"
set "ename=%2"
set "HH=%3"
echo %time% > flagfile.%ename%
wget -O C:/Data/DataFile_%HH%.%ename%.grb2 http://url/%newformatdate%.grib2
del flagfile.%ename%
I'm looking for advice on either how to make the "wait if flagfile exists" work (any idea why this isn't working?), or whether there is some simple argument I can put in my FOR loop to limit the number of instances, so some downloads finish before others start. Surely there is a better solution than guessing at a timeout interval when network conditions vary so much!
In GET_PIECE_ENSEMBLE
try
...
set "HH=%3"
:only5
set /a running=0
for /f %%c in ('dir /b/a-d flagfile.* 2^>nul') do set /a running+=1
if %running% geq 5 (
timeout /t 1 >nul
goto only5
)
echo %time% > flagfile.%ename%
...
which should count the flag files and, while there are 5 or more, delay 1 second.
As soon as one flag file is deleted, the next queued file to be activated bumps the number of in-process files back to 5, blocking the other queued processes.
UNTESTED!!!
:: Start with list of files to download as parameters
:: down.cmd "file1" "file2" "file3" etc..
@echo off
set "url=http://host.com"
:loop
for /f %%a in ('tasklist /FI "imagename eq wget.exe" /FI "windowtitle eq _download*" /FO csv /NH ^| find /C "wget"') do (
if %%a LSS 5 (
start "_download" wget.exe -O "%~1" "%url%/%~1"
shift
)
)
timeout /T 5
if not "%~1"=="" goto loop
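The flag-file throttle can also be sketched outside batch. Here is the same slot-counting idea in Python; wait_for_slot and download_piece are hypothetical names, fetch stands in for the wget call, and the flagfile.* naming is taken from the question's script:

```python
import glob
import os
import time

def wait_for_slot(flag_dir, limit=5, poll=0.1):
    """Block until fewer than `limit` flagfile.* markers exist."""
    while len(glob.glob(os.path.join(flag_dir, "flagfile.*"))) >= limit:
        time.sleep(poll)

def download_piece(flag_dir, ename, fetch):
    """Create a flag, run the download, remove the flag.

    Same lifecycle as GET_PIECE_ENSEMBLE: the flag exists exactly
    while the transfer is in progress, so counting flags counts
    in-flight downloads.
    """
    wait_for_slot(flag_dir)
    flag = os.path.join(flag_dir, f"flagfile.{ename}")
    open(flag, "w").close()
    try:
        fetch()          # the wget call in the original script
    finally:
        os.remove(flag)  # always release the slot, even on failure
```

The try/finally is the part the batch version cannot guarantee: if wget dies, the batch script may leave a stale flag file behind and block the queue.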
I'm writing a batch script where I want the user to be able to control how long the script runs. When running it from the command line, the user will pass in a switch like this:
./myscript --stop-after 30
which means that the script will keep doing its job, checking every iteration how much time has passed. If more than half a minute has passed, it'll quit. How would I implement this in a batch script?
For reference, here is the code I have so far:
:parseArgs
if "%~1" == "" goto doneParsing
if /i "%~1" == "--stop-after" (
shift
set "duration=%~1"
)
:: Parse some other options...
shift
goto parseArgs
:doneParsing
:: Now do the actual work (pseudocode)
set "start=getCurrentTime"
set "end=%start% + %duration%"
while getCurrentTime < %end% (
:: Do some lengthy task...
)
How would I go about implementing the latter part of the script, after parsing the options?
Thanks for helping.
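For comparison, the loop this pseudocode describes is straightforward in a language with a monotonic clock; a minimal Python sketch (run_until is a hypothetical helper, not part of the script above):

```python
import time

def run_until(duration_seconds, task):
    """Call task() repeatedly until duration_seconds have elapsed.

    The clock is checked once per iteration, as in the pseudocode,
    so a running task is never cut off in the middle.
    """
    deadline = time.monotonic() + duration_seconds
    iterations = 0
    while time.monotonic() < deadline:
        task()
        iterations += 1
    return iterations

if __name__ == "__main__":
    # Equivalent of --stop-after 30 with a lengthy task:
    run_until(30, lambda: time.sleep(1))
```

In batch the hard part is the "getCurrentTime" step, since %TIME% must be parsed and the arithmetic must survive minute, hour, and day rollover.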
This is not that trivial. You'll have to do a fair amount of calculation within your script to handle rollover into a new minute, a new hour, or even a new day. I can think of two different approaches. Both are based on two batch files:
1. Termination via taskkill
starter.bat:
@echo off
if "%1"=="" (
set duration=5
) else (
set duration=%1
)
start "myscript" script.bat
ping 127.0.0.1 -n %duration% -w 1000 > nul
echo %duration% seconds are over. Terminating!
taskkill /FI "WINDOWTITLE eq myscript*"
pause
script.bat:
@echo off
:STARTLOOP
echo doing work
ping 127.0.0.1 -n 2 -w 1000 > nul
goto STARTLOOP
For this solution, it's important that you give the window executing your script a unique name inside the line start "myscript" script.bat. In this example, the name is myscript. taskkill /FI "WINDOWTITLE eq myscript*" uses myscript to identify which process to terminate.
However, this might be a bit dangerous: your script will be killed after x seconds, whether or not the current iteration is done. If, for example, the script is in the middle of writing a file when it is killed, the file may end up corrupted.
2. Termination via flag file
starter.bat:
@echo off
if "%1"=="" (
set duration=5
) else (
set duration=%1
)
if exist terminationflag.tmp del terminationflag.tmp
start script.bat
ping 127.0.0.1 -n %duration% -w 1000 > nul
echo %duration% seconds are over. Setting termination flag!
type NUL>terminationflag.tmp
script.bat:
@echo off
:STARTLOOP
echo doing work
ping 127.0.0.1 -n 2 -w 1000 > nul
if not exist terminationflag.tmp goto STARTLOOP
del terminationflag.tmp
echo terminated!
Here, it's important to ensure that your script is allowed to create/delete a file at the current location. This solution is safer. The starter script waits the given amount of time and then creates the flag file. Your script checks after each full iteration whether the flag is there. If it's not, it goes on; if it is, it deletes the flag file and terminates safely.
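The handshake between the two batch files can be sketched in Python, with threads standing in for starter.bat and script.bat (the flag-file name is taken from the batch code; this is an illustration of the pattern, not a literal translation):

```python
import os
import tempfile
import threading
import time

# Shared flag file, like terminationflag.tmp in the batch version.
FLAG = os.path.join(tempfile.gettempdir(), "terminationflag.tmp")

def worker(iterations):
    # script.bat: finish each full iteration, then check for the flag.
    while not os.path.exists(FLAG):
        iterations.append(1)          # "doing work"
        time.sleep(0.02)
    os.remove(FLAG)                   # clean up and terminate safely

def starter(duration):
    # starter.bat: wait the requested time, then set the termination flag.
    time.sleep(duration)
    open(FLAG, "w").close()

if __name__ == "__main__":
    if os.path.exists(FLAG):
        os.remove(FLAG)
    done = []
    t = threading.Thread(target=worker, args=(done,))
    t.start()
    starter(0.1)
    t.join()
```

Because the worker only checks the flag between iterations, it is never interrupted mid-write, which is exactly what makes this variant safer than taskkill.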
In both solutions, ping is used as the timeout function. You could also use timeout /t <TimeoutInSeconds> if you are on Windows 2000 or later. However, timeout doesn't always work: it fails in some scheduled tasks, on build servers, and in various other cases. You'd be well advised to stick to ping.
Batch file scripting is brand new to me, so please be patient with me...
In order to run many (e.g. 5076) calculations on a windows (10, 64-bit) environment, I use the .bat file:
gulp.exe < input-1.dat > output-1.out
gulp.exe < input-2.dat > output-2.out
gulp.exe < input-3.dat > output-3.out
.
.
.
gulp.exe < input-5076.dat > output-5076.out
Unfortunately, some of the calculations will hang (for unknown reasons). Currently, when this occurs, I manually kill the whole batch command, which means I have to keep an eye on how the calculations are progressing and cannot simply leave them running (e.g. overnight).
Thus, I am looking for a way of automatically killing the gulp.exe executable if it has been running for 30mins, but in a way that will mean the next calculation in my list still runs.
i.e. if calculation 2 hangs, kill it and run calculation 3
A search [1, 2]
indicates that taskkill might be the command I am looking for, but am a little confused as to whether it kills the .bat script or the executable and how to apply it to more than 1 calculation in the list.
Thus, I would really appreciate some pointers...
Thanks
The calculations consume CPU time, so you can sample it in 1-second intervals with the built-in typeperf, assuming there can only be one GULP process running at a time.
@echo off
set INTERVAL=1
set CPUTHRESHOLD=0.1
call :runGulp "input-1.dat" "output-1.out"
call :runGulp "input-2.dat" "output-2.out"
call :runGulp "input-3.dat" "output-3.out"
pause
exit /b
:runGulp
start /b gulp <"%~1" >"%~2"
:wait
for /f "delims=, tokens=2 skip=2" %%a in ('
typeperf "\Process(GULP)\%% Processor Time" -si %INTERVAL% -sc 1
') do (
if %%~a GEQ %CPUTHRESHOLD% goto wait
if %%~a==-1 exit /b
)
taskkill /f /im gulp.exe
exit /b
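If you can drive the calculations from something other than cmd, a hard per-job wall-clock limit is simpler than CPU sampling: kill any job that exceeds the limit and move on to the next one. A sketch of that idea in Python (run_with_limit is a hypothetical wrapper; the 1800-second default is the question's 30 minutes):

```python
import subprocess

def run_with_limit(cmd, stdin_path, stdout_path, timeout_s=1800):
    """Run one calculation; kill it if it exceeds timeout_s, then return.

    Only the hung calculation is killed, so the caller's loop simply
    continues with the next input file.
    """
    with open(stdin_path) as fin, open(stdout_path, "w") as fout:
        proc = subprocess.Popen(cmd, stdin=fin, stdout=fout)
        try:
            proc.wait(timeout=timeout_s)
        except subprocess.TimeoutExpired:
            proc.kill()   # terminate just this calculation ...
            proc.wait()   # ... and reap it before moving on
    return proc.returncode

# Usage, mirroring the original .bat list:
# for i in range(1, 5077):
#     run_with_limit(["gulp.exe"], f"input-{i}.dat", f"output-{i}.out")
```

Unlike the CPU-sampling approach, this also drops the assumption that only one GULP process is ever running, since each wrapper watches its own child.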
I have 4 .cmd files. I want to run them in parallel, two at a time.
Say my files are: 1.cmd, 2.cmd, 3.cmd, 4.cmd.
I want to run 1.cmd and 2.cmd in parallel. When either of these ends, I want to run 3.cmd, and then 4.cmd. In short, at any given time I want 2 of the .cmd files to be running.
I am using the start command for parallel execution, but I am new to scripting and am getting confused about how to formulate the above way of running the cmd files.
Any help would be appreciated.
Thanks
Debjani
I have given an answer to "Parallel execution of shell processes" once; the batch file there (quoted in full earlier on this page) builds a process pool by counting running instances with tasklist and only spawning a new worker while fewer than the maximum are running. Set the maximum to 2 for your case.
I have not tried this, but I assume you can do this with PowerShell. Use this type of structure:
http://www.dougfinke.com/blog/index.php/2008/01/30/run-your-powershell-loops-in-parallel/
Within this example you should be able to execute cmd/bat files.
Check out the following thread for some ideas (possible duplicate?)
Parallel execution of shell processes