Unexpected output from cat `bash` command - bash

Can someone please explain this? I ran the commands as shown below
$ cat `bash`
$ ls
$ ctrl+D
and it's giving me some unexpected output on terminal.
NOTE: bash is in backquotes.

Good question! The "unexpected output" is cat printing the contents of all the files that ls found in the cwd. A detailed explanation follows:
On your first line:
$ cat `bash`
The bash part spawns a new shell from your original shell, because bash is enclosed in backquotes (backquotes perform command substitution: the enclosed command is run, and its standard output is substituted back into the command line).
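As a quick aside, here is a minimal sketch of what backquote substitution does (file1 is just a hypothetical file name):
$ cat `echo file1`
Here echo file1 runs first, and its output becomes cat's argument list, so the shell actually executes cat file1.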
Then when you do:
$ ls
This is actually executed in the newly spawned bash shell. It lists the directory the new shell is in (which should be the same as the original). In effect, this turns the cat command from the first step into
$ cat file_1 file_2 ... file_x
(basically all of the files in that directory, as returned by ls. You won't see any results yet, though: the original shell is still collecting the output of the backquoted bash, so cat hasn't received its arguments and is effectively waiting on the stdout of your new bash shell.)
Lastly, when you do:
$ ctrl+D
It sends end-of-file, exiting the new bash shell that you spawned from your original shell. Everything that shell printed to stdout (the file names listed by ls) then becomes the argument list of cat, and cat prints the contents of those files in your old shell.
You can verify what I just said by:
$ cd ~/
$ mkdir temp_test_dir
$ cd temp_test_dir
$ echo "some text for file1" > file1
$ echo "other text for file2" > file2
Now run what you had in your question:
$ cat `bash`
$ ls
$ ctrl+D
And this is what you should see:
some text for file1
other text for file2
in some order, which is just cat printing the contents of the files that ls found.

Related

bash: run multiple commands each separately with a command prompt

I want to run multiple commands as if they were executed one at a time at the command prompt.
E.g. I have the following list of commands:
ls
pwd
du -sh
Now I try to copy-paste them and run:
$ ls
pwd
du -sh
file1.txt file2.txt
/home/user/test
1M .
but instead I want them executed separately, so that I can see their outputs like below:
$ ls
file1.txt file2.txt
$ pwd
/home/user/test
$ du -sh
1M .
So, if I have a list of commands, is it possible to paste them in such a way that they execute as if entered one per command prompt? Otherwise the only option is to paste one command at a time.
Generally I receive a list of commands to be executed.
While pasting essentially works the way you describe, it may end up looking cosmetically wrong when the input (and its local echo) shows up while the shell is still busy executing the previous command.
You could instead feed the commands to bash -i, which will read and execute them in turn, showing the prompt:
$ mypaste() { x="$(cat)"; bash -i <<< "$x"; }
$ mypaste # Now paste some commands and hit ctrl-d
ls
pwd
whoami
^D
This results in:
you@yourdir $ ls
some files
you@yourdir $ pwd
/home/you/yourdir
you@yourdir $ whoami
you
you@yourdir $ exit
$
Open myscript.sh in nano (or your favorite editor) and paste the following:
#!/bin/bash
ls
pwd
du -sh
Make it executable with chmod +x myscript.sh and run the script with
./myscript.sh
You can put any bash commands in the script and see their outputs.
Try the commands separated by semicolons:
ls; pwd; du -sh;
This makes them a batch of commands. The shell will execute them one by one, and you don't have to paste each command separately.
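A small related sketch (my addition, not part of the original suggestion): if you want the batch to stop as soon as one command fails, separate the commands with && instead of semicolons:
ls && pwd && du -sh
Each command then runs only if the previous one succeeded.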
Hope this helps.
The answer from that other guy worked, and I used a slightly modified version using a heredoc.
I wanted to script a sequence of commands that show the prompt so I could copy/paste on different systems and show how to replicate a bug.
simple version
bash -i << 'EOF'
echo "command one"
echo "command two"
EOF
more commands and pretty output
bash -i << 'EOF' && echo -e '\e[1A\e[K==========================================='
unset PROMPT_COMMAND; PS1='command-sequence:$ ' ; clear ; echo "==========================================="
mkdir /tmp/demo-commands
echo "file contents" > /tmp/demo-commands/file
cd /tmp/demo-commands
pwd
ls
cat file
rm file
rm -r /tmp/demo-commands
EOF
I customize the prompt and use echo -e '\e[1A\e[K' to replace the last line with a separator.
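To illustrate what that escape sequence does (a small sketch, not from the original post): \e[1A moves the cursor up one line and \e[K clears it, so the next thing printed overwrites that line:
echo "line to be replaced"; echo -e '\e[1A\e[K==========================================='
The first echo's output disappears and the separator is printed in its place.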

I don't understand bash exec [duplicate]

This question already has answers here:
How to redirect output of an entire shell script within the script itself?
(6 answers)
Closed 7 years ago.
#!/bin/bash
#...
exec >> logfile
#cmd
I guess it saves output.
Could you expand on it with an example?
Thank you very much!
I often use this command to list txt files:
find / -type f -exec grep -l "bash" {} \;
but I can't quite understand it either.
How can there be such good people in the world. Very, very moved!
Yes, it sends any further output to the file named logfile. In other words, it redirects standard output (also known as stdout) to the file logfile.
Example
Let's start with this script:
$ cat >script.sh
#!/bin/bash
echo First
exec >>logfile
echo Second
If we run the script, we see output from the first echo statement but not the second:
$ bash script.sh
First
The output from the second echo statement went to the file logfile:
$ cat logfile
Second
$
If we had used exec >logfile, then the logfile would be overwritten each time the script was run. Because we used >> instead of >, however, the output will be appended to logfile. For example, if we run it once again:
$ bash script.sh
First
$ cat logfile
Second
Second
Documentation
This is documented in man bash:
exec [-cl] [-a name] [command [arguments]]
If command is specified, it replaces the shell. No new process is created. The arguments become the arguments to command. If the -l option is supplied, the shell places a dash at the beginning of the zeroth argument passed to command. This is what login(1) does. The -c option causes command to be executed with an empty environment. If -a is supplied, the shell passes name as the zeroth argument to the executed command. If command cannot be executed for some reason, a non-interactive shell exits, unless the execfail shell option is enabled. In that case, it returns failure. An interactive shell returns failure if the file cannot be executed. If command is not specified, any redirections take effect in the current shell, and the return status is 0. If there is a redirection error, the return status is 1. [Emphasis added.]
In your case, no command argument is specified. So, the exec command performs redirections which, in this case, means any further stdout is sent to file logfile.
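As a side note (my addition, not something the question asked about): if you also want error messages in the logfile, you can redirect stderr as well:
exec >> logfile 2>&1
After this line, both stdout and stderr of the script, and of the commands it runs, are appended to logfile.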
find command and -exec
The find command has a -exec option. For example:
find / -type f -exec grep -l "bash" {} \;
Other than the similarity in name, the -exec here has absolutely nothing to do with the shell command exec.
The construct -exec grep -l "bash" {} \; tells find to execute the command grep -l "bash" on any files that it finds. This is unrelated to the shell command exec >>logfile which executes nothing but has the effect of redirecting output.
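A minor aside (not part of the original answer): terminating the -exec clause with + instead of \; makes find pass many file names to a single grep invocation, which is usually much faster:
find / -type f -exec grep -l "bash" {} +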
Everything you output to stdout will no longer go to the terminal but to the logfile.
See example below:
#!/bin/bash
# reassign-stdout.sh
LOGFILE=logfile.txt
exec 6>&1 # Link file descriptor #6 with stdout.
# Saves stdout.
exec > $LOGFILE # stdout replaced with file "logfile.txt".
# ----------------------------------------------------------- #
# All output from commands in this block sent to file $LOGFILE.
echo -n "Logfile: "
date
echo "-------------------------------------"
echo
echo "Output of \"ls -al\" command"
echo
ls -al
echo; echo
echo "Output of \"df\" command"
echo
df
# ----------------------------------------------------------- #
exec 1>&6 6>&- # Restore stdout and close file descriptor #6.
echo
echo "== stdout now restored to default == "
echo
ls -al
echo
exit 0
This redirects standard output produced by your script, and by any other programs it subsequently calls, into the logfile. So if your script runs in a terminal, then after it executes
exec >> logfile
you'll see no further output in your terminal window, but you'll find it inside the logfile. The logfile will be created if it doesn't exist, and appended to every time you run your script.
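A minimal sketch of that behaviour (logfile as above, the echo text is hypothetical):
#!/bin/bash
echo "this still goes to the terminal"
exec >> logfile
echo "this is appended to logfile"
date
Everything after the exec line, including the output of external commands such as date, ends up in logfile rather than on the screen.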

How do I automatically save the output of the last command I've run (every time)?

If I wanted to have the output of the last command stored in a file such as ~/.last_command.txt (overwriting output of previous command), how would I go about doing so in bash so that the output goes to both stdout and that file? I imagine it would involve piping to tee ~/.last_command.txt but I don't know what to pipe to that, and I definitely don't want to add that to every command I run manually.
Also, how could I extend this to save the output of the last n commands?
Under bash this seems to have the desired effect.
bind 'RETURN: "|tee ~/.last_command.txt\n"'
You can add it to your bashrc file to make it permanent.
I should point out it's not perfect. Just hit the enter key on an empty line and you get:
matt@devpc:$ |tee ~/.last_command.txt
bash: syntax error near unexpected token `|'
So I think it needs a little more work.
This will break programs/features expecting a TTY, but...
exec 4>&1
PROMPT_COMMAND="exec 1>&4; exec > >(mv ~/.last_command{_tmp,}; tee ~/.last_command_tmp)"
If it is acceptable to record all output, this can be simplified:
exec > >(tee ~/.commands)
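For context (my reading of that one-liner, not spelled out above): it uses process substitution, so from that point on everything written to stdout flows through tee, which both displays it and writes it to ~/.commands. A tiny interactive sketch:
$ exec > >(tee ~/.commands)
$ date
$ ls
Both commands' output appears on screen as usual and is also recorded in ~/.commands.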
Overwrite for 1 command:
script -c ls ~/.last_command.txt
If you want more than 1 command:
$ script ~/.last_command.txt
$ command1
$ command2
$ command3
$ exit
If you want to record everything during a login session, append "script" to your .bashrc.
When starting a new session (after login, or after opening the terminal), you can start another "nested" shell, and redirect its output:
<...login...>
% bash | tee -a ~/.bash_output
% ls # this is the nested shell
% exit
% cat ~/.bash_output
% exit
Actually, you don't even have to enter a nested shell every time. You can simply replace your shell-command in /etc/passwd from bash to bash | tee -a ~USERNAME/.bash_output.

Bash script - Run commands that correspond to the lines of a file

I have a file like this (text.txt):
ls -al
ps -au
export COP=5
clear
Each line corresponds to a command. In my script, I need to read each line and launch each command.
PS: I tried all these options, and with all of them I have the same problem with the export command. The file contains "export COP=5", but after running the script, if I do echo $COP in the same terminal, no value is displayed.
while IFS= read line; do eval $line; done < text.txt
Be careful with this: using eval is generally not advised, as it is quite powerful and easy to abuse.
However, if unprivileged users cannot influence text.txt, it should be OK.
cat text.txt | xargs -L1 bash -c '"$@"' echo
In order to avoid confusion I would simply rename the file from text.txt to text and add a shebang (e.g. #!/bin/bash) as the first line of the file. Make sure it is executable by calling chmod +x text. Afterwards you can execute it as expected.
$ cat text
#!/bin/bash
ls -al
ps -au
clear
$ chmod +x text
$ ./text
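Regarding the PS in the question: all of the approaches above run the commands in a child process, so export COP=5 only affects that child and is gone when it exits. If you want the variable to survive in your interactive shell, source the file instead of executing it (a sketch using the text.txt from the question):
$ source text.txt
$ echo $COP
5
(. text.txt is the POSIX equivalent of source text.txt.)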

how to log all the command output to one single file in bash scripting [duplicate]

This question already has an answer here:
Output bash script into file (without >> )
(1 answer)
Closed 9 years ago.
In GNU/Linux, I want to log all command output to one particular file.
Say in the terminal I am typing
echo "Hi this is a dude"
It should be written to the file specified earlier, without using redirection on every command.
$ script x1
Script started, file is x1
$ echo "Hi this is a dude"
Hi this is a dude
$ echo "done"
done
$ exit
exit
Script done, file is x1
Then, the contents of file x1 are:
Script started on Thu Jun 13 14:51:29 2013
$ echo "Hi this is a dude"
Hi this is a dude
$ echo "done"
done
$ exit
exit
Script done on Thu Jun 13 14:51:52 2013
You can easily edit out your own commands and the start/end lines using basic shell scripting (grep -v, especially if your Unix prompt has a distinctive substring pattern).
Commands launched from the shell inherit the file descriptor to use for standard output from the shell. In your typical interactive shell, standard output is the terminal. You can change that by using the exec command:
exec > output.txt
Following that command, the shell itself will write its standard output to a file called output.txt, and any command it spawns will do likewise, unless otherwise redirected. You can always "restore" output to the terminal using
exec > /dev/tty
Note that your shell prompt and text you type at the prompt continue to be displayed on the screen (since the shell writes both of those to standard error, not standard output).
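A short interactive sketch of that (reusing output.txt and the echo line from the question):
$ exec > output.txt
$ echo "Hi this is a dude"
$ exec > /dev/tty
$ cat output.txt
Hi this is a dude
The echo produces nothing on screen because its output went to output.txt; restoring stdout to /dev/tty lets cat display the file normally.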
{ command1 ; command2 ; command3 ; } > outfile.txt
Output redirection can be achieved in bash with >: See this link for more info on bash redirection.
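For instance (a hypothetical example of the grouping form above):
$ { date; ls; echo "done"; } > outfile.txt
$ cat outfile.txt
The output of all three commands lands in outfile.txt, in order.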
You can run any program with redirected output and all its output will go to a file, for example:
$ ls > out
$ cat out
Desktop
Documents
Downloads
eclipse
Firefox_wallpaper.png
...
So, if you want to open a new shell session with redirected output, just do so:
$ bash > outfile
will start a new bash session, redirecting all of its stdout to that file.
$ bash &> outfile
will redirect all of stdout AND stderr to that file (meaning you will no longer see prompts show up in your terminal).
For example:
$ bash > outfile
$ echo "hello"
$ echo "this is an outfile"
$ cd asdsd
bash: cd: asdsd: No such file or directory
$ exit
exit
$ cat outfile
hello
this is an outfile
$
$ bash &> outfile
echo "hi"
echo "this saves everythingggg"
cd asdfasdfasdf
exit
$ cat outfile
hi
this saves everythingggg
bash: line 3: cd: asdfasdfasdf: No such file or directory
$
If you want to see the output and have it written to a file (say for later analysis) then you can use the tee command.
$ echo "hi this is a dude" | tee hello
hi this is a dude
$ ls
hello
$ cat hello
hi this is a dude
tee is a useful command because it allows you to store everything that goes into it as well as display it on the screen. It is particularly useful for logging the output of scripts.
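For example, to log a script's output while still watching it run (a sketch; myscript.sh is a hypothetical script name, and 2>&1 also captures its error messages):
$ ./myscript.sh 2>&1 | tee -a script.log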
