Bash - does the 'more' command block writing to a file? - file-locking

If I run the 'more' command on a file while a program is writing to that file, does the program stop, halt, or become unable to write?

more reads the file and some other program writes the file. This can happen simultaneously as long as no mandatory file locking is used. So opening a file with more has no impact on other programs writing the file.
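To see this in action, here is a minimal Python sketch (the file name demo.log is hypothetical): one handle keeps the file open for reading, the way more would, while another handle appends to it; neither side is blocked.
# minimal sketch: a reader holds demo.log open (like 'more' would) while
# another handle appends to it; with only advisory locking, neither blocks
with open("demo.log", "w") as f:
    f.write("first line\n")

reader = open("demo.log", "r")   # stands in for 'more' keeping the file open
writer = open("demo.log", "a")   # stands in for the program that keeps writing

writer.write("second line\n")    # the write succeeds even though the reader is open
writer.flush()

print(reader.read())             # the reader sees both lines
reader.close()
writer.close()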

Related

Does redirecting output to a file in a batch file require closing it? (batch script)

In a batch script, when redirecting output to a file like so:
set "output=C:\output.txt"
echo Blah blah blah. >> %output%
Is it required that the file be closed after the redirected write is completed (similar to the approach in other programming languages)?
I have tried searching for related information online but could not find anything; I assume the reason is that most scripts simply exit once they finish their tasks (commands).
But if, say, a script runs in an endless loop where a different output file is written on each pass (e.g. by appending the time to the output file name), or new output is constantly being redirected to the same output file, could not closing the file potentially lead to problems, with memory or otherwise?
No, you don't have to close any file handles in batch scripts. You don't know the file handle value so you could not even close it if you wanted to.
On Windows, all open kernel handles are closed when a process ends or crashes, but since a batch file is in most cases interpreted by an already-running cmd.exe rather than a new cmd.exe process, it cannot rely on that automatic cleanup; instead, cmd.exe itself closes the file handle after each redirected operation completes.

Keep a shell or batch file from proceeding until the previous run finishes

Is it possible for a batch program or shell script to know whether a previous instance of itself is still running, and to wait until that instance ends before triggering the next step on the command line?
We have a .bat file and a corresponding shell script that are triggered by an application which writes logs to a file. When multiple users use the same application and trigger the program at the same time, the logs get jumbled and cannot be read by the program that turns the log into a PDF.
Thanks.
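One common approach is a lock file: each run creates the lock atomically before it touches the log, and any other run waits until the lock disappears. Below is a minimal sketch of the idea in Python with a hypothetical lock file name; a batch or shell script would follow the same pattern.
# lock-file sketch (hypothetical names): only one instance proceeds at a time;
# later instances wait until the lock file disappears, then take the lock
import os
import time

LOCK = "convert_to_pdf.lock"   # hypothetical lock file name

def acquire_lock():
    while True:
        try:
            # O_CREAT | O_EXCL makes creation atomic: it fails if the file exists
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            time.sleep(1)      # a previous run still holds the lock; wait

def release_lock():
    os.remove(LOCK)

acquire_lock()
try:
    pass  # write the log / run the PDF conversion here
finally:
    release_lock()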

File flush and sync with another program

I am running a Python script that is meant to launch a .bat file on Windows 7. The batch file runs a sort of Monte Carlo program for 2 minutes. At the end of those 2 minutes the program must update a data file that I will read later in the same Python script, do some data analysis on the output and relaunch the same .bat file if necessary.
I repeat this process an unlimited number of times until I get a satisfactory output.
The Python code looks like this
import os
import numpy as np

os.system('myBatch.bat > messages.txt')
while True:
    # force flushing
    fd = open("output.txt", 'a')
    fd.flush()
    os.fsync(fd.fileno())
    fd.close()
    # read data
    data = np.loadtxt("output.txt")
    # some data analysis ...
    # testing
    if dataSatisfactory:
        break
    else:
        os.system('myBatch.bat > messages.txt')
As you can see, I am trying to force a flush of the output.txt file that is written by the program launched from the batch file. Since I have no control over this compiled executable, the output file does not get updated after the os.system('myBatch.bat > messages.txt') call, but only after I kill the whole Python process.
I tried adding the four lines after # force flushing, but apparently it still does not work.
To clarify, output.txt is both the input and the output of the executable, so the executable must overwrite the file output.txt.
The file messages.txt is there to capture all of the executable's messages redirected from stdout.
Any advice or hints?
NB: The batch file looks like this
.\TheExecutable %output.txt
If your batch file does nothing but call your Monte Carlo program, get rid of it and call the program directly from Python: os.system("TheExecutable.exe output.txt")
If you can make your program write to standard output instead of a file, you could use subprocess.check_output("TheExecutable.exe"), which simply returns that standard output. See the documentation of numpy.loadtxt() for ways to load from a string instead of a filename.
This way, you completely avoid writing & reading to/from disk.
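A minimal sketch of that suggestion, assuming the executable can be made to print its results to standard output (the executable name is taken from the question; the rest is illustrative):
import io
import subprocess

import numpy as np

# run the Monte Carlo program directly and capture its standard output
raw = subprocess.check_output(["TheExecutable.exe"], text=True)

# parse the captured text straight into an array, with no intermediate file on disk
data = np.loadtxt(io.StringIO(raw))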

Logging the exit of a batch file

So I wrote a tool with some batch commands, nothing special. At the beginning, the user can choose which task to perform via a loop.
In that loop I included a "Q" option to quit the batch file. When that happens, it gets written to a log file so I can check when the user started the script(s) and when it ended.
The issue is that this only happens if the user actually quits/exits with Q. If they quit by just closing the batch file's window, this won't be logged.
In short: how can I record when the user has quit the batch file without using the built-in function?
A batch file can't receive the "exiting" event. What you can do is:
Make a launcher.bat file that starts the original (yourfilename).bat file with:
start /wait (yourfilename).bat
launcher.bat will now wait until you close the second (yourfilename).bat. Place your logging commands on the next line of launcher.bat.
Convert launcher.bat to launcher.exe using a bat-to-exe converter (and make it invisible).
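The same wrapper idea can be sketched in Python for illustration (the log file name usage.log is hypothetical): run the batch file, block until its window is closed, then append the end time to the log.
# launcher sketch: start the batch file, wait for it to finish (or for the
# user to close its window), then append the end time to a hypothetical log
import datetime
import subprocess

subprocess.call('start /wait yourfilename.bat', shell=True)

with open("usage.log", "a") as log:
    log.write(f"script ended at {datetime.datetime.now():%Y-%m-%d %H:%M:%S}\n")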

Is there a limit on file size when redirecting input from a file?

http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/redirection.mspx?mfr=true says:
Reads the command input from a file, instead of reading input from the keyboard.
How large can the file be?
I know that putting myprogram.exe <arguments> into a .bat file is subject to a limit on the size of the batch file, and I wonder whether I can avoid it by running myprogram.exe < arguments.txt instead?
As large as your program can accept from the standard input stream.
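As an illustration, a program that reads its redirected input as a stream imposes no size limit of its own. This hypothetical count_lines.py can be fed a file of any size with python count_lines.py < arguments.txt:
# count_lines.py (hypothetical): consumes redirected standard input line by
# line, so the input file can be arbitrarily large
import sys

count = 0
for line in sys.stdin:
    count += 1
print(f"read {count} lines")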
