I am running a jar file in the terminal and I want to do something with the last line of output I see, which happens to come from a System.out.print statement. I have tried:
tail -1 myjar.jar
$(!!)
but neither of them seems to work for me.
Try this: java -jar myjar.jar | tail -1
You want to pipe the output of your jar into tail. The pipe connects the output stream of java -jar myjar.jar to the input stream of tail -1.
tail -1 myjar.jar outputs the last line of the file myjar.jar itself, which is probably not what you want.
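If you want to do something with that last line rather than just print it, you can capture it in a shell variable. A minimal sketch (last_line is just an illustrative name):
last_line=$(java -jar myjar.jar | tail -1)
echo "do something with: $last_line"
Note that command substitution waits for the jar to finish before the variable is assigned.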
Is it possible to write to a file in one bash process and read it with tail in another (the same way you can read system-generated logs with tail -f)?
I would like to open a file and continuously write something to it:
vi /tmp/myfile
And in another terminal, print what is written to that file:
tail -f /tmp/myfile
I've tried this, but tail doesn't print anything after I save changes in vi (only the initial lines from before the save).
Motivation:
In my toy project I would like to build a shared clipboard using the pipeto.me service: I would write to my file continuously and all changes captured by tail would be piped to curl, something like the watch-log example from pipeto.me:
tail -f logfile | curl -T- -s https://pipeto.me/2xrGcZtQ
But instead of logfile it would watch my file, which I would be editing in vi.
Apart from solving my specific problem, though, I'm looking for a general answer: is something like this possible with vi and tail?
You can use the cat command, redirecting its output to /tmp/myfile, so that whatever you type is added to the file:
cat > /tmp/myfile
# input -> text typed on standard input (by default, the keyboard)
# typing...
And print the file with the tail command, passing -F:
tail -F /tmp/myfile
# -F -> output appended data as the file grows, and keep retrying if the file is replaced
# output -> the input given to the file
# typing...
Writing text to the file with vim:
vi /tmp/myfile
# typing...
# :w -> write the text to the file
tail -F /tmp/myfile
# typing...
When you write to the file with vim, it isn't saved instantly as you type; only when you leave insert mode and save explicitly (:w) does the output of tail update.
Hence you can use an auto-save plugin, which saves the file automatically, so the tail output stays closer to synchronous.
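Another option, assuming your vi is actually vim: tell vim to overwrite the file in place on :w instead of replacing it, so even a plain tail -f keeps working. A sketch using vim's backupcopy option:
vi -c 'set backupcopy=yes' /tmp/myfile
# :w now rewrites the same file (same inode), so tail -f sees the appended data
Without that setting, vim may save by writing a new file and renaming it over the old one, which is exactly the case the retry behavior of tail -F is designed to handle.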
I'm looking for a way to pipe multiple log files on multiple remote servers, and then pipe the result to another program.
Right now I'm using multitail, but it doesn't do exactly what I need, or maybe I'm doing something wrong!
I would like to send the merge of all the log files to another program, for example jq. Right now if I do:
multitail --mergeall -l 'ssh server1 "tail -f /path/to/log"' -l 'ssh server2 "tail -f /path/to/log"' -l 'ssh server3 "tail -f /path/to/log"' | jq .
for instance, I get this:
parse error: Invalid numeric literal at line 1, column 2
But more generally, I would like to give the output of this to another program I use to parse and display logs :-)
Thanks everybody!
One way to accomplish this feat would be to pipe all your outputs together into a named pipe and then deal with the output from that named pipe.
First, create your named pipe: $ mknod MYFIFO p
For each location you want to consolidate lines from, run $ tail -f logfile > MYFIFO (note that the tail -f can be run through an ssh session).
Then have another process take the data out of the named pipe and handle it appropriately. An ugly solution could be:
$ tail -f MYFIFO | jq .
Season to taste.
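Put together, a minimal sketch for the three servers from the question (server names and log paths are the question's placeholders):
$ mkfifo MYFIFO
$ ssh server1 'tail -f /path/to/log' > MYFIFO &
$ ssh server2 'tail -f /path/to/log' > MYFIFO &
$ ssh server3 'tail -f /path/to/log' > MYFIFO &
$ cat MYFIFO | jq .
Lines from the different servers arrive interleaved in whatever order they are written, and if all the writers exit the reader sees end-of-file.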
I'm working on a project, and it's being run by an autoscript. The script has the following line:
./executable ./dev | grep -i "GET.*index.*200" > ./dev/logs/log1
I have my code writing to stdout, but it never gets written to log1. If I remove the grep command, though, it writes just fine. Any help would be appreciated, as I seemingly don't understand grep as well as I should.
You might try redirecting standard output inside your "executable" script using these commands:
exec > ./dev/logs/log1
exec 2> ./dev/logs/errlog1
So there is now no need to use ">" in the line:
./executable ./dev | grep -i "GET.*index.*200"
Also, I recommend using only absolute paths in scripts.
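A separate thing worth checking if you keep the original pipeline: grep block-buffers its output when writing to a file rather than a terminal, so matches may not show up in log1 until the buffer fills or grep exits. With GNU grep you can force line buffering (the flag is GNU-specific):
./executable ./dev | grep -i --line-buffered "GET.*index.*200" > ./dev/logs/log1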
P.S. (off-topic) I can't write comments yet (not enough reputation).
I was wondering if there's any way to save the output of a shell command to a file while it's running in the terminal. For example, say I'm compiling a Java program with the command javac foo.java. How would I save all the output of that command (errors, etc.) to a file for future reference, without having to hit Cmd-S and save the terminal text after every run?
Use javac foo.java > output.txt 2>&1 to capture the output of your command in the file output.txt; javac writes its error messages to stderr, which is why the 2>&1 is needed.
This will, however, hide all output from you while your module is compiling.
If you would like to see the output of your build in the terminal and at the same time capture it in output.txt, you can use tee:
javac foo.java 2>&1 | tee output.txt
The tee program reads from stdin and writes everything both to the specified file and back to stdout.
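A small variation, in case you want to keep the output of successive compilations instead of overwriting the file each time: tee's -a flag appends to the file:
javac foo.java 2>&1 | tee -a output.txt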
Is this what you want?
javac foo.java > output.txt 2>&1
All errors will go into output.txt and nothing will print on the shell (javac reports errors on stderr, hence the 2>&1).
I use tee:
javac foo.java 2>&1 | tee output.log
similar to this thread: Dump terminal session to file?
I have a program running with this command:
command 2> sample.txt
Now that file is growing continuously. The command will exit in 5-6 days, and I believe the file size won't reach gigabytes.
I tried
echo "" > sample.txt
but that isn't making any difference; the file size keeps growing. I was thinking of setting up a cron job to empty its contents every hour.
How can I empty the contents of the file?
Try the following command, which writes the console output to a file while your console still gets the messages. Since your program prints to stderr, redirect that into the pipe as well:
command 2>&1 | tee -a file.log
and you can empty the contents at any time with
> file.log
This works because tee -a opens file.log in append mode, so after truncation new writes land at the new end of the file. With your original command 2> sample.txt, the process keeps writing at its old file offset after truncation, which is why emptying the file made no visible difference.
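Alternatively, next time you start the program, you can keep plain redirection but open the log in append mode; truncation then behaves the same way. A sketch, where command stands for your actual program:
command 2>> sample.txt
# later, e.g. from an hourly cron job:
: > sample.txt
# the file is emptied, and new writes continue from the new end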