How to share the screen output of a running process? - bash

I have a program that will run for a long time. The program is edited and run on a remote server: I use the computer in my office to connect to the server and run it, and the progress shows on the screen of the office computer.
I want to see the output at home. How can I capture the output that appears on the office computer's screen and see the result at home?
I thought about writing the output to a file, but then I would need to close the file to read it. So should I open the file, write the output, close it... and open it again?
Thanks. I don't know the proper tag to use, but the program is written in Perl.

You can tee it:
your_program.pl | tee logfile.txt
and see the latest result in logfile.txt at home with
tail -f logfile.txt
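If the program also writes progress messages to stderr, a variant of the same idea (a sketch, assuming a bash-like shell) captures both streams and appends rather than truncates:
your_program.pl 2>&1 | tee -a logfile.txt    # 2>&1 folds stderr into the pipe; -a appends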

Why not just redirect to a log file and tail it (or load it in an editor, etc.)?
$ myprog.pl >/tmp/logfile 2>&1
The above redirects your program's output to a log file (/tmp/logfile; you may wish to choose a better location, since /tmp is temporary and can be wiped during a reboot) and sends stderr to the same place as stdout. Note that this captures the output of your program without your having to modify the script.
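Since the point is to check on the job from home, it may also help to wrap the same redirection in nohup (a sketch, standard tools only) so the program keeps running after you log out of the office session:
nohup myprog.pl >/tmp/logfile 2>&1 &    # survives logout; inspect later with tail -f /tmp/logfile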
An alternative is to run your program within screen. From its description:
Perhaps one of the most useful features is that you can re-connect to
your sessions after you log out, move to another computer, or get
disconnected, and continue working as if nothing happened. All your
edit sessions, shell windows, etc. will be there just as you left
them.

Personally, I use screen for this sort of thing.
Connect to the server
Enter the command screen. It displays a nice message stating the version of screen and that it's under the GPL.
Run the actual command.
At any time, hit Ctrl-A, then D to detach from the screen session. You'll see a message along the lines of [detached from pid.tty.server]. Log out from the server normally.
Connect to the server again and enter the command screen -x to reconnect to your session.
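Put together, the whole round trip looks something like this sketch (host and script names are invented):
ssh office-server            # from the office
screen                       # start a screen session
perl long_running.pl         # kick off the job inside screen
# press Ctrl-A, then D to detach; log out as usual
ssh office-server            # later, from home
screen -x                    # reattach and watch the live output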

Related

Mac OSX Bash Script stop processing while specific window is open

I wrote a script that compiles my LaTeX files and opens the generated PDF in a viewer. That works fine.
cd Documents/my-bachelor-thesis/
latexmk 000_Root_Bachelor_Thesis.tex -pdf
open 000_Root_Bachelor_Thesis.pdf
ps -A | grep -m1 vorschau | awk '{print $1}'
With the last line I get the PID of the viewer process in which my PDF is opened ("Vorschau" is the German-localized name of the Preview app).
Here is the problem: I want my script to pause at the point where the PDF is opened. After I click the close button of the viewer, the script should continue automatically. Is that somehow possible?
Current solution: I interrupt the script by waiting for user input. After I type something in, the script goes on.
echo "Can I proceed?"
read input
... more script
Thanks for helping me out.
I'm not familiar with open on OS X, but a quick search of the man page suggests that you can force open to start a new instance of the application and wait until it finishes:
open -Wn 000_Root_Bachelor_Thesis.pdf
Options:
-W Wait until the applications exit (even if they were already open). Use with the -n flag to allow open to function as an appropriate app for the $EDITOR environment variable.
-n Open a new instance of the application(s) even if one is already running.
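So the script could look roughly like this (a sketch reusing the asker's own commands):
cd Documents/my-bachelor-thesis/
latexmk -pdf 000_Root_Bachelor_Thesis.tex
open -Wn 000_Root_Bachelor_Thesis.pdf    # blocks here until the viewer instance exits
# ... more script, runs automatically once the PDF is closed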

How to make a tcsh script read user input, then keep running in the background even after the terminal is closed?

I am looking for a strategy suggestion.
I am very new to Linux shell scripting; I have been learning tcsh for less than a month.
I need a script that automatically detects when the result files have finished copying back from a remote server to a folder on a remote machine, and then starts scp-ing the files back to my workstation.
I do not know in advance when the job will finish running, so the folder may contain no result files for a long while. I also do not know when the last result file will have finished copying back from the remote server to the folder (and thus when the scp can start).
I tried crontab. It works fine when I guess the timing correctly; most of the time it is just disappointing.
So I tried to write a script myself, and I have it now. I intend to produce a script that serves my colleagues as well as me.
To use the script, the user first needs to log in to the remote machine manually and execute the script there. The script begins by asking the user to input their local machine name and the directory where they wish to save the result files.
Then the script loops, testing for a change in the total number of files. When it detects one, meaning the first result file has started to arrive from the remote server, it loops again to detect when the total file size in the folder stops changing, meaning the last result file has finished copying. After that it executes scp to send all the result files to the user's workstation, into the initially specified directory.
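(For concreteness, the size-polling step described above might look something like this tcsh sketch; the directory and variable names are invented:)
set prev = -1
while (1)
    set cur = `du -s result_dir | awk '{print $1}'`    # total size of the result folder
    if ($cur > 0 && $cur == $prev) break               # size stopped changing: copying done
    set prev = $cur
    sleep 60
end
scp result_dir/* $machine":"$dir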
The script works fine, but I wish to make it run in the background and keep running by itself even if the user logs out of the remote machine and closes the terminal. I also wish to let the user start the script by typing something as simple as
./script.tcsh
I tried to run the script with the command
./script.tcsh &
but it fails, because a background process cannot accept user input.
I googled and found something called disown, but the command is not found; apparently neither the remote machine nor my machine supports it.
I tried modifying the script to first accept the user input, then use
cat > temp_script.tcsh << EOF
{rest of my script}
EOF
followed by a line of
./temp_script.tcsh &
to create a second script file and have the first script launch it in the background. That also fails, because cat does not treat $variable as literal text; it replaces each variable with its value. I have a foreach i (1 2) loop, and the cat command just keeps reporting an error (missing value of variable i, which is just the counter in the foreach loop syntax).
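For reference, a sketch of how the here-document can be made to cooperate (names invented): leaving the EOF delimiter unquoted lets the user-supplied $machine and $dir expand while the file is written, a backslash keeps \$i literal so it survives into the generated script, and nohup (available where disown was not) keeps the generated script alive after logout:
cat > temp_script.tcsh << EOF
#!/bin/tcsh
foreach i (1 2)
    scp result_\$i.dat $machine":"$dir
end
EOF
chmod +x temp_script.tcsh
nohup ./temp_script.tcsh >& temp_script.log &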
I am out of ideas at the moment.
Can anyone suggest a strategy that I can try myself?
The goal is to use only one script file, prompt the user for two inputs (machine name and directory to save to), then require no further interaction or waiting, and keep running even after the terminal is closed.
Note: I do not need a password to log in to the remote machine and back.

Export/Stream Mac Terminal Output To File

I am running a PHP script in my Mac terminal which takes hours and hours to run. It consumes memory very quickly, and after a while the scrollback gets truncated. (Terminal setting: Scrollback set to "Limit to available memory".)
Is there a way to automatically stream (or just save) the output to a file, whether on the local disk or an external hard drive? I also realise the memory doesn't get cleared until I restart my computer (Finder indicates 0 space left on my hard drive after a while, but when I restart it becomes 20 GB). Is there a way to clear this once my output is saved?
It would be nice to include timestamps in the file as well.
Run your PHP script in the background (or even with nohup as well, if you want to be able to log out and leave it running), and save your output to a log file on disk like this:
someScript.php > log.txt &
Now, if you want to watch the log file growing at a later point, just use the -f option to tail to follow the log:
tail -f log.txt
If you see that all is well and the job is still running, press CTRL+C; you will stop following the file, but the job will continue. If you want another look later, just run tail again.
If you want to see whether your script has passed, say, "PHASE 2", just grep for it in the log file:
grep "PHASE 2" log.txt
If you want to timestamp the lines, I would suggest the ts utility, which is part of moreutils. Assuming you use Homebrew to manage packages (as any sensible Mac user does), you would install it with:
brew install moreutils
Then you could do:
someScript.php | ts > log.txt &
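Putting the pieces together (a sketch; adjust the interpreter invocation to taste): nohup keeps the job running after logout, 2>&1 folds errors into the same stream, and ts takes a strftime-style format:
nohup php someScript.php 2>&1 | ts '%Y-%m-%d %H:%M:%S' > log.txt &
tail -f log.txt    # each line now carries a timestamp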

command line tool text output

I have a small command line tool, and after running it I'd like to display the text output in a way that makes it easy for someone to copy/paste and save it or email it to someone else.
Copy/pasting from a command prompt is not done in the standard way, so I don't want people to have to copy/paste from there. Saving the output to disk is possible, but the folder where the tool is located may not grant write access, so the user would have to configure the output file location (this may be too tricky for some users).
I was thinking of launching Notepad with some text in it, generated by the command line tool. Is this possible? Any other suggestions?
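For the Notepad idea specifically, a minimal .BAT sketch (mytool.exe is a placeholder name); writing to %TEMP% sidesteps the access-rights concern, since the user's temp folder is always writable:
rem Write the tool's output to the user's temp folder, then show it in Notepad.
mytool.exe > "%TEMP%\mytool-output.txt"
notepad "%TEMP%\mytool-output.txt"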
You can use clip.
After you have clip, which can be downloaded from the link above, you use the pipe (|) to send the previously executed command's output to the clipboard.
The article gives you the full explanation, but here are the basics with examples:
dir /? | clip – Copy the help text for the DIR command to the clipboard.
tracert www.labnol.org | clip – Trace the path from your computer to another website; the output is automatically copied to the clipboard instead of being displayed on the screen.
netstat | clip – Check whether your computer is connecting to websites without your knowledge.
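Applied to a tool like the asker's, the whole flow collapses to one line (mytool.exe is again a placeholder):
rem Pipe the output straight to the clipboard, ready to paste into an email.
mytool.exe | clip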
I think your command should receive the destination e-mail address as a parameter; then, after your command has executed, a simple script/.BAT file could e-mail the text output to the user using the standard SMTP commands over Telnet, as explained for example on the following page:
http://www.yuki-onna.co.uk/email/smtp.html
You could add an option to your program that tells it to copy its own output to the clipboard using the clipboard API. Then the user could just paste it.
I like the clip suggestion, though.

How to capture and display output from a task via Windows CMD

I've got a PHP script which I'm running from the command line (Windows) that performs a variety of tasks, and the only output it gives is via 'print' statements which write directly to the screen.
What I want to do is capture this to a log file as well.
I know I can do:
php-cli script.php > log.txt
But the problem with this approach is that all the output is written to the log file, and I can't see how things are running in the meantime (so that I can stop the process if anything dodgy is happening).
Just to pre-empt other possible questions: I can't change all the prints to log statements, as there are far too many of them, and I'd rather not change anything in the code lest I be blamed for something going fubar. There is also the lack of time. And I have to run this on a Windows machine.
Thanks in advance :)
Edit: Thanks for the answers, guys. In the end I went with the browser method because it was the easiest and quickest to set up, although I am convinced there is an actual answer to this problem somewhere.
You can create a PowerShell script that runs the command, reads the data from the command's STDOUT, and then writes the output both to the log file and to the terminal for you to watch. You can use the cmdlets Write-Output and Write-Host.
Microsoft's site: http://www.microsoft.com/technet/scriptcenter/topics/msh/cmdlets/tee-object.mspx
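With Tee-Object that boils down to a one-liner (a sketch, using the script name from the question):
# Show the output in the console and write it to log.txt at the same time.
php-cli script.php | Tee-Object -FilePath log.txt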
Another option would be to find a tee program that reads input and diverts it to two different outputs. I believe I have seen these for Windows, but I don't think they are standard.
Wikipedia: http://en.wikipedia.org/wiki/Tee_(command)
I have always opened the log file up in my web browser. This allows me to refresh it easily and does not interrupt any writing to the file that windows does. It isn't particularly elegant but it does work!
You want the "tee" command for Windows. See http://en.wikipedia.org/wiki/Tee_(command)
Powershell includes a tee command, and there are also numerous versions of tee for Windows available, for instance:
http://unxutils.sourceforge.net/
http://www.chipstips.com/?p=129
Also can be implemented in VBScript if you prefer.
EDIT: It just occurred to me that I should also mention the tail command: http://en.wikipedia.org/wiki/Tail_(Unix). Tail lets you read the last N lines of a file and also includes a "file monitor" mode that continually displays the end of the file in real time. This is perfect for log file monitoring, since it lets you watch the log in real time without interfering with the process that's writing to it. There are several implementations of tail for Windows, both command-line and GUI based; Microsoft's Services for UNIX packages (or whatever they're calling them now) also include a version of tail. Some examples:
mTail
Tail for Win32
WinTail
MakeLogic Tail
Some of these go far beyond just displaying the file in real time as it updates: they can send email alerts, colorize string matches, monitor multiple files at once, etc.
Slow (for /f reads the command's entire output before the loop starts), but it works from a plain command prompt:
for /f "delims=" %a in ('php-cli script.php') do @echo %a&echo %a>>log.txt
or in a batch file:
for /f "delims=" %%a in ('php-cli script.php') do @echo %%a&echo %%a>>log.txt
