Short story: need a method to get the write-status of a file on a server (using BASH) from a client (using CMD batch).
Long-time lurker, first-time poster. I did many searches on variations of what I'm looking for and have not yet found enough data.
I am writing a batch file in CMD (because the clients could be any WinOS [XP - up] with unknown packages installed). The batch uses PuTTY's "plink" to connect via SSH to the server. Once connected to the server, plink executes a command to write data to a new file.
Once that file is written, I use PSCP to copy the file to the client.
So far, so good; I have successfully accomplished all of this.
The creation of that file is instantaneous but the time it takes to write all of the data is unknown / variable. Therefore I need an automated method to determine when the file is complete, to then copy it. Simply using timeout/sleep for XX seconds is not feasible in my circumstances.
The approach I have taken so far (as of yet unsuccessfully) is to repeatedly grab the filesize using "stat -c '%s' filename" and run that in a loop until grab1 EQU grab2, indicating a complete file. I am finding this difficult because I can't get the output of stat into the CMD batch to process it.
Q1: Is this (stat result going into CMD for loop) the best approach? Maybe there's something existing in BASH?
Q2: If the answer to Q1 is yes, any ideas on how to get the stat result into the CMD batch as a variable so I can parse/analyze the data? (A sketch of one possibility follows below.)
Thanks in advance for suggestions and your time.
DCT
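For Q2, a minimal sketch of capturing the output of a remote command in a CMD variable via FOR /F over plink, assuming plink.exe is on PATH, key-based authentication is already set up, and user@server plus /path/to/output.dat stand in for the real host and file; wc -c is used in place of stat purely to avoid having to escape % signs inside a batch file:
rem Grab the remote file size twice, a few seconds apart, into CMD variables.
set "size1="
for /f "delims=" %%S in ('plink -batch user@server "wc -c < /path/to/output.dat"') do set "size1=%%S"
rem ping is used as a roughly 5-second delay because XP has no timeout.exe
ping -n 6 127.0.0.1 >nul
set "size2="
for /f "delims=" %%S in ('plink -batch user@server "wc -c < /path/to/output.dat"') do set "size2=%%S"
if "%size1%"=="%size2%" echo Size is stable - file is probably complete.
In practice you would wrap the two reads in a loop and keep polling until they match; a stable size is still only a heuristic, though, which is why the rename approach in the reply below is more robust.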
Have the command writing the file write it with a temp filename. So if it will be called xyz.txt, have it written with filename tmpxyz.txt.tmp, then the final step will be a rename.
That way you can just check for the presence of the named file.
It's usually a good idea to give the file a unique name, probably incorporating the date and time, I find.
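Building on that rename idea, a minimal polling sketch in CMD might look like the following (again assuming plink.exe and pscp.exe are on PATH, key-based authentication is configured, and user@server plus /path/to/xyz.txt are placeholders):
rem Poll until the final (renamed) file exists on the server, then copy it down.
:wait_for_file
set "ready="
for /f "delims=" %%R in ('plink -batch user@server "test -e /path/to/xyz.txt && echo READY"') do set "ready=%%R"
if not "%ready%"=="READY" (
    ping -n 6 127.0.0.1 >nul
    goto wait_for_file
)
pscp user@server:/path/to/xyz.txt C:\local\xyz.txt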
Related
I am looking for a strategy suggestion.
I am very new to Linux shell scripting; I have been learning tcsh for less than a month.
I need a script that automatically detects when the result files have finished copying back from a remote server to a folder on a remote machine, and then starts scp-ing the files back to my workstation.
I do not know in advance when the job will finish running, so the folder could have no result files for a long while. I also do not know when the last result file will have finished copying back from the remote server to the folder (and thus when the scp can start).
I tried crontab. It works fine when I guess the timing correctly; most of the time it is just disappointing.
So I tried to write a script myself, and I have it now. I intend to produce a script that serves my colleagues as well as me.
To use the script, the user first needs to log in to the remote machine manually and only then execute the script on the remote machine. The script starts by asking the user to input their local machine name and the directory where they wish to save the result files.
Then the script loops, testing for a change in the total number of files. When it detects one (meaning the first result file has started to arrive from the remote server), it loops again to detect when the total size of the files in the folder stops changing, which means the last result file has finished copying into the folder. After that it executes scp to send all the result files to the user's workstation, into the initially specified directory.
The script works fine, but I wish to make it able to run in the background and keep running by itself even if the user logs out of the remote machine and closes the terminal. I would also like the user to be able to start the script by typing a simple command in the terminal, something like
./script.tcsh
I tried to run the script with the command
./script.tcsh &
but it fails, because a background process cannot accept user input.
I googled and found something called disown, but the command is not found; apparently neither the remote machine nor my own machine supports it.
So I tried to modify the script to accept the user input first, and then attempt to use
cat > temp_script.tcsh << EOF
{rest of my script}
EOF
and then a line of
./temp_script.tcsh &
to create a second script file and have the first script launch it in the background. This also fails, because cat does not treat $variable as literal text; it replaces it with its value. I have a foreach i (1 2) loop, and the cat command just keeps reporting an error (missing value of variable i, which is just the counter in the foreach loop syntax).
I am out of ideas at the moment.
Can anyone enlighten me with some strategy that I can try myself?
The goal is to use only one script file, prompt the user for two inputs (machine name and directory to save to), require no further interaction or waiting from the user, and keep running even after the terminal is closed.
Note: I do not need a password to log in to the remote machine or back.
In a batch script, when redirecting output to a file like so:
set "output=C:\output.txt"
echo Blah blah blah. >> %output%
Is it required to close the file after the redirected writing is complete (similar to what you would do in other programming languages)?
I have tried searching for related information online but could not find anything on it; I assume the fact that most scripts simply end after they finish their tasks (commands) is perhaps the reason why.
But if, say, a script runs in an endless loop in which a different output file is written each time (e.g. by appending the time to the output file name), or in which new output is constantly being redirected to the same output file, could the "not closing" of the file potentially lead to problems, memory-related or otherwise?
No, you don't have to close any file handles in batch scripts. You don't know the file handle value so you could not even close it if you wanted to.
On Windows, all open kernel handles are closed when a process ends or crashes, but since a batch file is interpreted by cmd.exe (in most cases without starting a new cmd.exe process), it cannot rely on that automatic cleanup; instead, cmd.exe itself closes the file handle as soon as each redirected command has completed.
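As a concrete illustration of the endless-loop case from the question, a sketch like the following (path and delay purely illustrative) never issues an explicit close; each redirected echo opens the file, appends, and the handle is released again as soon as that command finishes, so nothing accumulates:
set "output=C:\output.txt"
:loop
echo Blah blah blah. >> "%output%"
rem the handle opened for the redirection above is already closed again here
timeout /t 5 /nobreak >nul
goto loop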
I'm looking for a simple way to batch SAS programs so that they run directly from cmd, e.g.:
"C:\Program Files\SASHome2\SASFoundation\9.4\sas.exe" -CONFIG "C:\Program Files\SASHome2\SASFoundation\9.4\nls\en\sasv9.cfg" -sysin "C:\Users\Documents\sas\Run_Program.sas" -LOG "C:\Users\Documents\sas\f2.log"
However, the problem is that I have two SAS files: one is the config and the other is the program (the config file loads particular data sets, global variables, etc.).
Is there a simple way to automate running the program from cmd using the current file structure?
Other considered ideas:
I have considered creating a new (dynamic) SAS file, batch.sas, whose content is %inc "config.sas"; %inc "program.sas";, where "program.sas" is a dynamic argument provided for every run (a sketch of this idea follows below).
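A hedged sketch of that idea as a small wrapper batch file, here called run_sas.bat (the name is hypothetical, the sas.exe and sasv9.cfg paths are copied from the example above, and the C:\Users\Documents\sas folder plus the config.sas location are assumptions):
rem run_sas.bat - generate a throwaway batch.sas that %inc's the config and
rem the program passed as the first argument, then run it in batch mode.
set "PROG=%~1"
> "C:\Users\Documents\sas\batch.sas" echo %%inc "C:\Users\Documents\sas\config.sas";
>> "C:\Users\Documents\sas\batch.sas" echo %%inc "%PROG%";
"C:\Program Files\SASHome2\SASFoundation\9.4\sas.exe" ^
 -CONFIG "C:\Program Files\SASHome2\SASFoundation\9.4\nls\en\sasv9.cfg" ^
 -sysin "C:\Users\Documents\sas\batch.sas" ^
 -LOG "C:\Users\Documents\sas\%~n1.log"
It would then be invoked as, for example, run_sas.bat "C:\Users\Documents\sas\program.sas".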
If I understand correctly, you are looking to run the config.sas file before the program.sas file? Your suggestion of doing so via %inc is a good one; another approach would be to call your config.sas file on startup by using it as an autoexec, e.g. as follows:
"C:\Program Files\SASHome2\SASFoundation\9.4\sas.exe"
-CONFIG "C:\Program Files\SASHome2\SASFoundation\9.4\nls\en\sasv9.cfg"
-sysin "C:\Users\Documents\sas\Run_Program.sas"
-LOG "C:\Users\Documents\sas\f2.log"
-autoexec "C:\Users\Documents\sas\config.sas"
An autoexec file runs once, when the SAS session is initialised. Your config.sas file could start by calling any relevant / existing autoexec(s) if needed.
The autoexec solution is a reasonable one, but unless you are always (in every single SAS program/session you ever run) loading exactly the same config, I think the right way to do this is ultimately the same way C programs have done it for decades with header files.
If you have a config.sas that loads datasets for a particular program, that program should have %include "config.sas"; at the top. Then you just batch the program.
On my Windows server, I am taking a SQL DB backup to the C drive. I want to copy this .bak file to a particular drive on another client Windows machine, with the current date in the name, using a batch script, so that I can schedule the batch script as a scheduled task. Please help me out.
Can anyone give me a script to do this? Thanks.
Batch files are always tricky to get just right. First I'd open a command line and see if you can copy between two machines using the following syntax:
copy C:\localfile.bak \\remotemachine\c$\Path\remotefile.bak
(where "remotemachine" is the name of the remote machine and "c$" is the drive you wish to copy to). You can then copy this into a batch file and set up a scheduled task.
As for renaming the file to include the current date and time, I suggest you start with this question, as it may take some effort to get the date into the format you want.
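A hedged sketch of what the scheduled copy could look like (the local path, share name, and database name below are placeholders; the wmic call produces a locale-independent YYYYMMDD stamp, though wmic is deprecated on the newest Windows releases):
rem Build a YYYYMMDD date stamp that does not depend on regional settings.
for /f "tokens=2 delims==" %%I in ('wmic os get LocalDateTime /value') do set "dt=%%I"
set "today=%dt:~0,8%"
rem Copy the backup to the remote machine's admin share with the date in the name.
copy "C:\Backups\MyDatabase.bak" "\\remotemachine\c$\SQLBackups\MyDatabase_%today%.bak"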
I've got a PHP script which I'm running from a command line (Windows) that performs a variety of tasks, and the only output it gives is via 'print' statements which output directly to the screen.
What I want to do is capture this to a log file as well.
I know I can do:
php-cli script.php > log.txt
But the problem with this approach is that all the output is written to the log file, and I can't see how things are running in the meantime (so I can stop the process if anything dodgy is happening).
Just to pre-empt other possible questions: I can't change all the prints to log statements, as there are far too many of them and I'd rather not change anything in the code lest I be blamed for something going fubar. Plus there's the lack of time as well. I also have to run this on a Windows machine.
Thanks in advance :)
Edit: Thanks for the answers guys, in the end I went with the browser method because that was the easiest and quickest to set up, although I am convinced there is an actual answer to this problem somewhere.
You can create a PowerShell script that runs the command, reads the data from the command's STDOUT, and then writes the output both to the log file and to the terminal for you to watch. You can use the commands Write-Output and Write-Host.
Microsoft's site: http://www.microsoft.com/technet/scriptcenter/topics/msh/cmdlets/tee-object.mspx
Another option would be to find a tee program that will read the input and divert it to two different outputs. I believe I have seen these for Windows, but I don't think they are standard.
Wikipedia: http://en.wikipedia.org/wiki/Tee_(command)
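For example, the whole thing can be handed to PowerShell's Tee-Object from a plain cmd prompt in one line (php-cli and script.php are taken from the question; note that Windows PowerShell's Tee-Object writes the file as Unicode by default):
powershell -NoProfile -Command "php-cli script.php | Tee-Object -FilePath log.txt"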
I have always opened the log file up in my web browser. This allows me to refresh it easily and does not interrupt any writing to the file that windows does. It isn't particularly elegant but it does work!
You want the "tee" command for Windows. See http://en.wikipedia.org/wiki/Tee_(command)
Powershell includes a tee command, and there are also numerous versions of tee for Windows available, for instance:
http://unxutils.sourceforge.net/
http://www.chipstips.com/?p=129
Also can be implemented in VBScript if you prefer.
EDIT: It just occurred to me that I should also mention the tail command: http://en.wikipedia.org/wiki/Tail_(Unix). Tail allows you to read the last N lines of a file, and it also includes a "file monitor" mode that just continually displays the end of the file in real time. This is perfect for log file monitoring, since it allows you to watch the log in real time without interfering with the process that's writing to it. There are several implementations of tail for Windows, both command line and GUI based. Microsoft's Services for UNIX packages (or whatever they're calling it now) also include a version of tail. Some examples:
mTail
Tail for Win32
WinTail
MakeLogic Tail
Some of these go far beyond just displaying the file in real-time as it updates and can send email alerts and colorize string matches, monitor multiple files at once, etc.
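Putting that together from a plain cmd session might look like this (assuming one of the tail ports above is installed as tail.exe on PATH, and reusing the php-cli redirection from the question):
rem Start the PHP job in its own window, sending all output to the log...
start "php job" cmd /c "php-cli script.php > log.txt"
rem ...then follow the log in real time from the current window.
tail -f log.txt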
Slow:
for /f "delims=" %a in ('php-cli script.php') do #echo %a&echo %a>>log.txt
or in a batch file:
for /f "delims=" %%a in ('php-cli script.php') do #echo %%a&echo %%a>>log.txt