Execute each line of output in bash

I have a feeling there's already an answer to this, but I wasn't able to find it.
How can I execute each line of output in bash as it comes out?
For example, as my script runs, it generates:
command-1
command-2
command-3
etc.
I need some way to pipe them into something that will run them neatly. I've been experimenting with xargs but haven't found anything good to put on the receiving end.
I'd like to avoid doing something like dumping them into a separate script on the side if possible. (I also tried for loops, but they ended up breaking on words instead of lines.)

Pipe the script's output straight into another instance of bash:

$ bash echosomecommands.sh | bash
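If you'd rather not start a second shell, a while read loop runs each line as it arrives. A minimal sketch using the same script name as above:

bash echosomecommands.sh | while IFS= read -r line; do
    eval "$line"    # run each complete line as a command
done

Using IFS= read -r keeps each line intact instead of splitting it into words, which sidesteps the for-loop problem mentioned above.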


Is it possible to make a .bat Bash hybrid?

In cmd, it is possible to use Linux commands via the ubuntu or bash commands, but they are very fickle. In batch, it is also possible to make a VBScript-batch hybrid, which got me thinking: is it possible to make a Bash-batch hybrid? Besides being a tongue-twister, I feel that Bash-batch scripts could be really useful.
What I have tried so far
So far I have tried using the bare bash and ubuntu commands alone, since they switch the normal command prompt over to the Ubuntu/Bash shell, but even if you put commands after ubuntu/bash, they wouldn't show or do anything.
After that, I tried the ubuntu -run command, but as I said earlier, it's really fickle and inconsistent about what works and what doesn't. It is less inconsistent when you pipe things into it, but it still usually doesn't work.
I looked here since it seemed like it would answer my question, and I tried it, but it didn't work since it required another program (I think).
I also looked at this one, and I guess it failed miserably, but it's an interesting concept.
What I've gathered from all of my research is that when this comes up, most people think of a file that can be run either as a .bat batch file or as a .sh shell file, rather than my goal: a file that runs both batch and Bash commands in the same instance.
What I want this for relates to my other question, where I am trying to hash a string instead of a file in cmd. You could do it with a Bash command, but I would still like to keep the file as a batch file.
Sure you can use Bash in batch, assuming it is available. Just use the command bash -c 'cmd', where cmd is the command that you want to run in Bash.
The following batch line pipes Hello to the cat -A command, which prints it including the invisible characters:
echo Hello | bash -c "cat -A"
Compare the output with the result of the version completely written in Bash:
bash -c "echo Hello | cat -A"
They will differ slightly! cmd's echo includes the space before the pipe and ends the line with CRLF, while Bash's echo emits a bare LF.
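For illustration, this is the kind of difference to expect (assuming a stock cmd.exe, whose echo behaves as just described):

C:\>echo Hello | bash -c "cat -A"
Hello ^M$

C:\>bash -c "echo Hello | cat -A"
Hello$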

Bash: How to create a test mode that displays commands instead of executing them

I have a bash script that executes a series of commands, some involving redirection. See cyrus-mark-ham-spam.
I want the script to have a test mode, where all the commands are printed instead of executed. As you can see, I have tried to do that by just putting echo on the front of each command in test mode.
Unfortunately this doesn't deal with redirection - any redirections are still done, so the program leaves lots of temp files littered about the place when run in test mode.
I have tried various ways to get round this, like quoting the whole command and passing it to a function that either prints it or runs it, but either the redirections work in test mode, or they don't work in run mode.
I thought this must have come up before, and wonder if there is a known solution that does not involve wrapping every command in an if TEST block.
Please note, this is NOT a duplicate of show commands without executing them because neither that question, nor its answers, covers redirection (which is the essence of this question).
I see that it is not a duplicate, but there is no general solution to this. You need to look at each command separately.
As long as the command doesn't use arguments containing spaces, like

cmd -a -b -c > filename

you can quote the whole thing:
echo 'cmd -a -b -c > filename'
But real-life code is more complex, of course.
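One pattern that gets close to a general solution, building on the quoting idea: pass each command as a single string to a wrapper that either prints it or evals it. A minimal sketch (the run function and TEST variable are illustrative, not from the original script):

#!/bin/bash
TEST=${TEST:-0}    # set TEST=1 for test mode

run() {
    if [ "$TEST" = "1" ]; then
        printf '%s\n' "$1"    # test mode: print the command, redirections included
    else
        eval "$1"             # run mode: redirections take effect
    fi
}

run 'cmd -a -b -c > filename'

The cost is the caveat noted above: the whole command lives inside quotes, so embedded quoting gets awkward for complex commands.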

First line in file is not always printed in bash script

I have a bash script that prints a line of text into a file and then calls a second script that prints some more data into the same file. Let's call them script1.sh and script2.sh. The reason it's split into two scripts is that I have different versions of script2.sh.
script1.sh:
rm -f output.txt
echo "some text here" > output.txt
source script2.sh
script2.sh:
./read_time >> output.txt
./run_program
./read_time >> output.txt
Variations on the three lines in script2.sh are repeated.
This seems to work most of the time, but every once in a while the file output.txt does not contain the line "some text here". At first I thought it was because I was calling script2.sh like this: ./script2.sh. But even using source the problem still occurs.
The problem is not reproducible, so even when I try to change something I don't know if it's actually fixed.
What could be causing this?
Edit:
The scripts are very simple. script1 is exactly as you see here, but with different file names. script2 is what I posted, but with the same 3 lines repeated, and ./run_program can have different arguments. I did a grep for the output file, and for >, but neither shows up anywhere unexpected.
The way these scripts are used is that script1 is created by a program (the only difference between the versions is the source script2.sh line). This script1.sh is then run on a different computer (Linux on an FPGA, actually) using ssh. Before that is done, the output file is also deleted using ssh. I don't know why; I didn't write all of this. Also, I've checked the code running on the host. The only mention of the output file is when it is deleted using ssh and when it is copied back to the host after script1 is done.
Edit 2:
I finally managed to make the problem reproducible at a reasonable rate by stripping script2.sh of everything but a single line printing into the file. This also let me do the testing a bit faster. Once I had this, I got the problem between 1 and 4 times for every 10 runs. Removing the command that deleted the file over ssh before the script was run seems to have solved the problem. I will test it some more to be sure, but I think it's solved, although I'm still not sure why it was a problem. I thought that the ssh command would not exit before all the remove commands were executed.
It is hard to tell without seeing the real code. The most likely explanation is that you have a typo, > instead of >>, somewhere in one of the script2.sh files.
To verify this, set the noclobber option with set -o noclobber. The shell will then refuse to overwrite an existing file with >, failing with an error instead of silently truncating it.
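To illustrate the noclobber behavior (standard bash):

set -o noclobber
echo first > out.txt     # succeeds: the file did not exist yet
echo second > out.txt    # fails: bash: out.txt: cannot overwrite existing file
echo third >| out.txt    # >| explicitly overrides noclobber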
Another possibility is that the file is removed under certain rare conditions. Or it is damaged by some command that has random access to it - look for commands using this file without >>. Or it is used by some command both as input and output, and the two step on each other - look for the file used with <.
Lastly, you could have a race condition with a command writing to the file in the background, started before that echo.
Can you grep all your scripts for 'output.txt'? What about scripts called inside read_time and run_program?
It looks like something in one of the script2.sh scripts must be either overwriting, truncating or doing a substitution on output.txt.
For example, there could be a '> output.txt' buried inside a conditional for a condition that rarely obtains. Just a guess, but it would explain why you don't always see it.
This is an interesting problem. Please post the solution when you find it!

Converting a history command into a shell script

This is one of those things that I figured a lot of people would want, but I can't seem to find anyone who has written about it.
I find that I often iterate on a command-line one-liner, and when I end up using it a lot, anticipate wanting to use it in the future, or find it cumbersome to work with on one line, it is generally a good idea to turn the one-liner into a shell script and stick it somewhere reasonable and easily accessible, like ~/bin.
It's obviously too cumbersome to use any sort of roundabout method involving a text editor to get this done, and it's possible to do it directly in the shell, for instance in zsh by typing
echo "#!/usr/bin/env sh" > ~/bin/command_from_history_number_523.sh && echo !523 >> ~/bin/command_from_history_number_523.sh
followed by pressing Tab to expand the !523 history reference and somehow shoehorning it into an acceptable string to be saved.
This is particularly cumbersome and has at minimum three problems:
It does not work in bash, as bash does not tab-complete the !523.
It requires some manual inspection and string escaping.
It requires too much typing; for example, the script name must be entered twice.
So it looks like I need to do some meta shell scripting here.
I think a good solution would function under both bash and zsh, and it should probably take two arguments: an integer for the history command number, and a name for the shell script to poop out, into a hardcoded directory, containing that one command. Furthermore, under bash it appears that multi-line commands are treated as separate commands, but I'm willing to assume that we only care about one-liners here, and I only use zsh anyway at this point.
The stumbling block here is that I think I'll still be running shell scripts through bash even when using zsh, so the script likely won't be able to parse zsh's history files. I may need to make this into two separate programs, then.
Update: I agree with @Floris's comment that direct use of commands like !! would be helpful, though I am not sure how to make this work. Suppose the usage is
mkscript command_number_24 !24
this is inadequate, because mkscript will receive the expanded-out contents of the 24th command. If the 24th command contains any file globs or some such, they will have been expanded already. This is bad, and I basically want the contents of the history file, i.e. the raw command string. I guess this could be worked around by manually implementing those shortcuts here. Or just screw it and take an integer argument.
function mkscript() {
    # $1 = history event number, $2 = name for the new script
    echo '#!/bin/bash' > ~/bin/"$2"
    # history -p expands the !N reference without executing it
    history -p '!'"$1" >> ~/bin/"$2"
}
Only tested in Bash.
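Usage would then look like this (the history number and file name are just the example from the question):

mkscript 523 command_from_history_number_523.sh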
Update from OP: In zsh I can accomplish this with fc -l $2 $2
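Building on that, a zsh variant of the function as a sketch (fc -ln prints the raw command without its history number; the chmod +x is an added convenience):

mkscript() {
    echo '#!/bin/zsh' > ~/bin/"$2"
    # fc -ln N N prints history event N without the leading number
    fc -ln "$1" "$1" >> ~/bin/"$2"
    chmod +x ~/bin/"$2"
}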

Switch from file contents to STDIN in piped command? (Linux Shell)

I have a program (that I did not write) which is not designed to read in commands from a file. Entering commands on STDIN is pretty tedious, so I'd like to be able to automate it by writing the commands in a file for re-use. Trouble is, if the program hits EOF, it loops infinitely trying to read in the next command dropping an endless torrent of menu options on the screen.
What I'd like to be able to do is cat a file containing the commands into the program via a pipe, then use some sort of shell magic to have it switch from the file to STDIN when it hits the file's EOF.
Note: I've already considered using cat with the '-' for STDIN. Unfortunately (I didn't know this before), piped commands wait for the first program's output to terminate before starting the second program -- they do not run in parallel. If there's some way to get the programs to run in parallel with that kind of piping action, that would work!
Any thoughts? Thanks for any assistance!
EDIT:
I should note that my goal is not only to prevent the system from hitting the end of the commands file. I would like to be able to continue typing in commands from the keyboard when the file hits EOF.
I would do something like
(cat your_file_with_commands; cat) | sh your_script
That way, when the file with commands is done, the second cat will feed your script with whatever you type on stdin afterwards.
Same as Idelic's answer, with simpler syntax ;)
cat your_file_with_commands - | sh your_script
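Either way, the trick is that the bare cat (or the -) starts reading the terminal once the file is exhausted, so the program never sees EOF until you close stdin yourself with Ctrl-D. A quick way to try the idea, using cat -A as a stand-in for the interactive program:

(echo hello; cat) | cat -A

The hello$ line prints immediately, and anything you type afterwards keeps flowing through the pipe.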
I would think expect would work for this.
Have you tried something like tail -f commandfile | command? I think that should pipe the lines of the file to command without closing the file descriptor afterwards. Use -n to specify the number of lines to be piped if tail -f doesn't catch all of them.
