I need to run multiple similar MP3-converting processes at the same time from the Windows command prompt.
As far as I know, if I start one converting process using a batch file, only one console window appears and only one process starts. After this process has finished (and if there is a command line for it), the next process starts, and so on.
As I just tested, if I start multiple batch files at the same time, I save more than 75% of the time! That is very important for me, because the overall processing time (in my case) is measured in months!
So my problem is: how do I start all the other batch files at the same time, make the main batch file wait until all the other batches have finished executing, and only then continue with the actions in the main batch file?
THANK YOU!
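For what it's worth, here is a rough sketch of one common approach (the converter batch file names and the flag files are placeholders): launch every worker with START so they run in parallel, have each worker drop a marker file when it is done, and let the main batch file poll for those markers before continuing.

@echo off
del /q done1.flag done2.flag 2>nul

rem Launch the converters in parallel; each drops a flag file when it finishes.
start "convert 1" cmd /c "convert1.bat & echo.> done1.flag"
start "convert 2" cmd /c "convert2.bat & echo.> done2.flag"

:wait_for_all
timeout /t 5 >nul
if not exist done1.flag goto wait_for_all
if not exist done2.flag goto wait_for_all

echo All conversions finished - continuing in the main batch file.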
We have a QProcess that runs a bash script. The script finishes properly and produces the expected output, but the finished signal takes a very long time (minutes) afterward to be emitted. Basically, our script generates an encrypted tarball from a list of files fed as an argument. The final bundle is sitting there on disk, intact, but the QProcess takes a very long time to return. This is preventing our UI from moving on to the next task, because we need to ensure programmatically that the script has run to completion, instead of through inspection. We're not doing anything other than
connect(myProcess, SIGNAL(finished(int,QProcess::ExitStatus)), mySlot, SLOT(tidyUp()));
myProcess->start();
We can monitor the size of the file with Qt, and we have an estimate of its final size based on the file list we feed the script, but the script hangs around for a very long time after the file has reached its estimated size. We've inserted sync statements, but that doesn't seem to have any effect. When the script is run on the command line, the file grows, and the script stops as soon as it reaches its final size.
Why is QProcess not sending its finished signal immediately after the script completes?
We would very much like to attach a progress bar indicating the percentage of the file size produced, or give some other indication of progress, but we're stumped by this behavior. We've tried both a worker object moved to a QThread and running the QProcess directly in a busy loop that calls processEvents(), to no avail.
Turns out this was a problem with the commands I had in my pipe. GPG plunks out the file fairly quickly (with quite variable timing, though) but then often spends quite a lot of time idling/working after the file itself has reached its final size. I'm not sure what it's doing, or why it only does this on some runs of the same content, but eventually it finishes, the script completes, and I get my finished() signal delivered. I may have to put a more elaborate progress bar in place that switches to a busy indicator if the file size hasn't changed for a while, but it appears that Qt is working as expected here after all.
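As a rough illustration of that idea (this is not the poster's code; the class name, member names, polling interval and stall threshold are all invented), one could poll the output file's size with a QTimer, drive a QProgressBar from it, and flip the bar into indeterminate "busy" mode once the file stops growing while waiting for finished():

#include <QFileInfo>
#include <QObject>
#include <QProgressBar>
#include <QTimer>

class TarballProgress : public QObject
{
    Q_OBJECT
public:
    TarballProgress(const QString &path, qint64 estimatedSize, QProgressBar *bar,
                    QObject *parent = nullptr)
        : QObject(parent), m_path(path), m_estimated(estimatedSize), m_bar(bar)
    {
        QTimer *timer = new QTimer(this);
        connect(timer, &QTimer::timeout, this, &TarballProgress::poll);
        timer->start(500);                      // poll twice per second
    }

private slots:
    void poll()
    {
        const qint64 size = QFileInfo(m_path).size();
        m_stalled = (size > 0 && size == m_lastSize) ? m_stalled + 1 : 0;
        m_lastSize = size;

        if (m_stalled >= 10) {
            m_bar->setRange(0, 0);              // no growth for ~5 s: show a busy bar
        } else if (m_estimated > 0) {
            m_bar->setRange(0, 100);
            m_bar->setValue(int(qMin<qint64>(100, size * 100 / m_estimated)));
        }
    }

private:
    QString m_path;
    qint64 m_estimated = 0;
    qint64 m_lastSize = 0;
    int m_stalled = 0;
    QProgressBar *m_bar = nullptr;
};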
I need to write a shell (bash) script that will execute several Hive queries.
Each of the queries will produce a directory with a lot of files.
After all queries are finished I need to process all these files in a specific order.
I want to run the Hive queries in parallel as background processes, as each one might take a couple of hours.
I would also like to parallelize the processing of the resulting files, but there are some complications that I don't know how to handle. For example, I can start processing the results of the first and second queries as soon as they are finished, but for the third, I need to hold off until the first two processors are done. Similarly for the fourth and fifth.
I wouldn't have any problem writing such a program in Java, but how to do it in shell beats me.
If someone can give me a hint on how I can monitor the execution of these components in the shell script, I would appreciate it greatly.
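Not a full solution, just a sketch of the usual &/wait pattern (the hive -f file names, output directories, and the process_results function are placeholders for your real commands): each "query, then process its output" chain runs as a background subshell, and the dependent step waits on exactly the PIDs it needs.

#!/bin/bash
# Chains 1 and 2: run the query, then process its results as soon as it finishes.
( hive -f query1.hql && process_results /out/dir1 ) &  chain1=$!
( hive -f query2.hql && process_results /out/dir2 ) &  chain2=$!

# Query 3 runs in parallel with everything above.
hive -f query3.hql &  query3=$!

# Processing of the third result needs query 3 AND processors 1 and 2 to be done.
wait "$chain1" "$chain2" "$query3"
process_results /out/dir3

# The same pattern extends to the fourth and fifth queries.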
I am creating a batch file which calls another batch file. Sometimes the second batch file gives an error (because the license for the software I am running is not found). When the error hits, a window pops up and I need to close it manually (undesirable, because it needs to run in a loop).
I would like to call the second batch file and, if it hasn't finished running after 90 seconds, kill it and go to the next line of my first batch file.
Is that possible?
Pause, sleep, timeout, and a few others can help you.
I would suggest timeout.
timeout /t 90
Here's some more info: http://www.wikihow.com/Delay-a-Batch-File
To do exactly what you want, you'll have to work in some logic to decide what to do after the timeout, and you will likely use a loop for that. It depends on the structure of your code. Maybe toggle a boolean variable to determine whether you want to go back to the top of the script.
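One way to wire that together (only a sketch; the window title, second.bat and the 90-second figure are placeholders) is to start the second batch file in its own console window with a known title, wait with timeout, then kill that window by title if it is still around:

rem Launch the second batch file in its own console window with a known title.
start "licensed-job" cmd /c second.bat

rem Give it 90 seconds to finish on its own.
timeout /t 90 /nobreak >nul

rem If it is still running (e.g. stuck on the license error), kill the whole
rem process tree and fall through to the next line of this batch file.
taskkill /fi "windowtitle eq licensed-job*" /t /f >nul 2>&1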
I have a reasonably large number of .bat files that are launched by the Windows Task Scheduler or, subsequently, by an app that is called in the process. In the latter case, the app launches a .bat file to log that it has started and another .bat file to log that it has completed. They all trigger a single shared logging .bat file that writes to a log file. There are multiple situations that cause them to overlap:
all of the Task Scheduler tasks are manually run at once
one of the app tasks is still running when another related Task Scheduler task runs on its schedule.
So, we sometimes see:
the process cannot access the file because it is being used by another process.
And the result of this is that log entries are missed.
Just to be clear:
Task Scheduler tasks:
go1 >>> launches bat_name1.bat
go2 >>> launches bat_name2.bat
etc.
bat_name1.bat, bat_name2.bat,....
CALL log.bat %bat_nameN%
app.exe %bat_nameN%
EXIT
app.exe task:nameN
launches STARTnameN.bat
(runs the core of the app)
launches ENDnameN.bat
STARTnameN.bat and ENDnameN.bat
log.bat %nameN%
log.bat
@ECHO OFF
SET fileloc=C:\Users\Public\BackupLogs
echo %time% %date% %2 %3 %~1>%fileloc%\temp.txt
type %fileloc%\temp.txt>>%fileloc%\backuplog.txt
So the objective would be to allow all these programs to run autonomously, but to serialize the results so the log file can be completely written without interference.
One thought would be to separate temp.txt into tempN.txt, etc., and to append the result to the single backuplog.txt as part of the ending process. That would likely make things better, but it doesn't appear to be a 100% solution, as there could still be overlaps.
You could test if the append fails and retry via something like:
:try_append
copy /b %fileloc%\backuplog.txt+%fileloc%\temp.txt %fileloc%\backuplog.txt
if errorlevel 1 goto try_append
(copy must be used as internal commands such as echo and type won't set the error level.)
That would improve things; however, you'd still have the problem of collisions on the %fileloc%\temp.txt file. Perhaps you have a way to easily resolve this by using unique temp names in your various .bat files.
If not, more unique temp filenames can be created using %time::=% (the current time down to hundredths of a second), but even that could conceivably collide.
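Putting the two ideas together, here is a hedged sketch of what log.bat could look like (same %fileloc% as in the question; the %RANDOM% suffix and the guard that creates an empty backuplog.txt on first use are additions of mine):

@echo off
set "fileloc=C:\Users\Public\BackupLogs"
if not exist "%fileloc%\backuplog.txt" type nul>"%fileloc%\backuplog.txt"

rem Build a (nearly) unique temp name from the current time plus a random number.
set "stamp=%time::=%"
set "stamp=%stamp:.=%"
set "stamp=%stamp: =0%"
set "tmpfile=%fileloc%\temp_%stamp%_%RANDOM%.txt"

echo %time% %date% %2 %3 %~1>"%tmpfile%"

:try_append
copy /b "%fileloc%\backuplog.txt"+"%tmpfile%" "%fileloc%\backuplog.txt" >nul
if errorlevel 1 goto try_append

del "%tmpfile%"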
When I want a truly unique filename, I involve the value of the RDTSC opcode, which changes every processor clock cycle, making collisions practically impossible. There are open-source tools available to help with this, e.g. tools that capture the RDTSC opcode. But perhaps that is a topic for another question.
I've got an executable that does some structural analysis. It's compiled from old Fortran code, somewhat of a black box. It reads an input file and writes output to the command window.
I've integrated that executable into an Excel VBA macro to do design optimization. My optimization routine does the following:
Write 10 input files in different directories
Call 10 concurrent instances of the executable (each of the 10 instances is from a copied and renamed version of the exe file) and pipe the output to a file
Wait for them all to finish
Read in output files, use the results to generate a new set of designs, and start again.
The executable runs very quickly, less than a second for all the concurrent instances.
This scheme is pretty reliable when I run it on its own. However, I'd like to run multiple optimization jobs concurrently: imagine 8 or 10 instances of Excel, each running these optimizations at the same time. On my computer, that generally runs fine. On other machines with nominally identical specs, we're running into problems where the output file isn't getting created, either because the executable isn't getting called, or it is failing to run, or the output is failing to be piped to the results file; I'd welcome suggestions on how to check which of those it is. This doesn't happen every time, maybe once per 1000 iterations, but it does happen simultaneously across most of the Excel instances and most of the 10 executable calls.
Any idea what is going wrong? It seems like it has something to do with calling so many executables or writing so many files so quickly.