Execute a shell script as if it were an interactive session - bash

For documentation purposes I am trying to execute a shell script in a way that makes it look as if you had typed it by hand in an interactive shell.
Script:
x=123
echo $x
Then execute:
PS4="$PS1"
set -x -v
. ./demo
Output:
. ./demo
user@host:~/tmp$ . ./demo
x=123
user@host:~/tmp$ x=123
echo $x
user@host:~/tmp$ echo 123
123
Desired output:
user@host:~/tmp$ x=123
user@host:~/tmp$ echo $x
123
It does not have to be bash. Any solution that simulates an interactive session is welcome.
How can I achieve the desired result?

Use script. It will record your session, including timing information, and play it back for you, e.g.:
$ script -r output /bin/sh
Script started, output file is output
sh-3.2$ x=123
sh-3.2$ echo "$x"
123
sh-3.2$ exit
exit
Script done, output file is output
$ script -p output
Script started on Sat Oct 29 20:00:16 2022
sh-3.2$ x=123
sh-3.2$ echo "$x"
123
sh-3.2$ exit
exit
Script done on Sat Oct 29 20:00:22 2022
In the above, the first session is entered interactively and then played back with script -p.
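The -r and -p options above are from the BSD/macOS version of script. On Linux, the util-linux version records timing data separately and replays it with scriptreplay; a rough equivalent sketch, assuming a reasonably recent util-linux:
$ script -t 2> timing.txt output.txt    # record; timing data goes to timing.txt via stderr
$ exit                                  # typed inside the recorded shell to end the session
$ scriptreplay timing.txt output.txt    # play the session back at its original speed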

A pretty simple solution is to start an interactive shell and redirect your script file to the input stream:
bash -i </path/to/script-file


shell script - run list of commands

for i in `cat foo.txt`
do
$i
done
And I have an input file "foo.txt" with a list of commands.
ls -ltr | tail
ps -ef | tail
mysql -e STATUS | grep "^Uptime"
When I run the shell script, it executes, but it splits the commands on each line at spaces, i.e. for the first line it executes only "ls" and then "-ltr", for which I get a "command not found" error.
How can I run each list as one command?
Why am I doing this?
I execute a lot of arbitrary shell commands, including DB commands. I need error handling as I execute each command (each line from foo.txt). I can't anticipate everything that can go wrong, so the idea is to put all the commands in order, call them in a loop, check for an error ($?) after each line, and stop on error.
Why not just do this?
set -e
. ./foo.txt
set -e causes the shell script to abort if a command exits with a non-zero exit code, and . ./foo.txt executes commands from foo.txt in the current shell.
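A minimal sketch of the effect, assuming the sample foo.txt shown further down in this thread (where ABC is not a real command):
#!/bin/sh
set -e                          # abort as soon as any command exits non-zero
. ./foo.txt                     # ABC fails with "command not found", so pwd is never reached
echo "all commands succeeded"   # only printed if every line in foo.txt exited 0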
But I guess I can't send a notification (email).
Sure you can. Just run the script in a subshell, and then respond to the result code:
#!/bin/sh
(
set -e
. ./foo.txt
)
if [ "$?" -ne 0 ]; then
echo "The world is on fire!" | mail -s 'Doom is upon us' you#youremail.com
fi
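The parentheses matter here: set -e only aborts the (...) subshell, so the outer script keeps running and can inspect $? and send the mail. A minimal sketch of that behaviour, with false standing in for a failing command:
#!/bin/sh
(
set -e
false                  # the subshell exits here with a non-zero status
echo "not reached"
)
echo "outer script still runs; subshell exit code was $?"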
The code mentioned:
for i in `cat foo.txt`
do
$i
done
Please use https://www.shellcheck.net/
This will give the following result:
$ shellcheck myscript
Line 1:
for i in `cat foo.txt`
^-- SC2148: Tips depend on target shell and yours is unknown. Add a shebang.
^-- SC2013: To read lines rather than words, pipe/redirect to a 'while read' loop.
^-- SC2006: Use $(...) notation instead of legacy backticked `...`.
Did you mean: (apply this, apply all SC2006)
for i in $(cat foo.txt)
$
Will try a while loop; for test purposes the content of foo.txt is shown below.
cat foo.txt
ls -l /tmp/test
ABC
pwd
while read -r line; do $line; if [ "$?" -ne 0 ]; then echo "Send email Notification stating $line Command reported error "; fi; done < foo.txt
total 0
-rw-r--r--. 1 root root 0 Dec 24 11:41 test.txt
bash: ABC: command not found...
Send email Notification stating ABC Command reported error
/tmp
In case an error is reported you can break the loop.
http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_09_05.html
while read -r line; do $line; if [ "$?" -ne 0 ]; then echo "Send email Notification stating $line Command reported error "; break; fi; done < foo.txt
total 0
-rw-r--r--. 1 root root 0 Dec 24 11:41 test.txt
bash: ABC: command not found...
Send email Notification stating ABC Command reported error
If a line contains pipes or quoted arguments, use eval so the whole line is parsed by the shell:
while read -r line; do eval "$line"; if [ "$?" -ne 0 ]; then echo "Send email Notification stating $line Command reported error "; break; fi; done < foo.txt
total 0
-rw-r--r--. 1 root root 0 Dec 24 11:41 test.txt
bash: ABC: command not found...
Send email Notification stating ABC Command reported error
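For readability, the same idea can be written as a small script; the mail command, subject, and address are placeholders borrowed from the earlier answer:
#!/bin/bash
# Run each line of foo.txt; mail a notification and stop at the first failure.
while read -r line; do
    eval "$line"
    if [ "$?" -ne 0 ]; then
        echo "Command reported error: $line" | mail -s 'Command reported error' you@youremail.com
        break
    fi
done < foo.txt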

How to write a shell script that upon invocation shows the time and date and lists all the logged-in users. This information is then saved in a file

I need to list all the logged in users along with the date and time of log in. How can I do that using shell script?
#!/bin/bash
cat > log.log << EOF1
how to replace the bash command 'w' in here
EOF1
This is written for Mac, not tried on a Linux machine.
Write this to login.sh:
#!/bin/bash
touch login.log
last | grep "logged in" > ./login.log
Run ./login.sh.
Then cat login.log and you will see:
userA ttys001 Sun Jun 9 12:07 still logged in
userB ttys000 Sun Jun 9 11:53 still logged in
userC console Sun Jun 9 11:49 still logged in
You can simply use w > log.log as oguz ismail writes, or you can capture the output of w in a variable and expand that variable in the here-document.
#!/bin/bash
w_content=$(w)
cat > log <<EOF
Headline
$w_content
Footer
EOF
Kindly check the script below; it should do what you want.
#!/bin/bash
a=$(echo -e "current date and time :- \n $(date)\n"
echo -e "All logged in Users with details :- \n $(who)\n "
echo -e "Server uptime :-\n $(uptime |awk -F ',' '{print $1}') \n"
echo -e "Script running entries are logged in log file /var/log/sh.log")
echo "$a" && echo "$a" >> /var/log/sh.log 2>&1

What does run_command() { echo "+" "$@"; "$@"; } mean in a bash script?

So I am new to bash scripting and I came across the following piece of code.
run_command() {
echo "+" "$#"
"$#"
}
I am confused about what "$@" means and why it appears twice.
Thanks a lot for your time and have a great day.
Aagam Jain's got the answer. I will add some explanation that wouldn't fit in a comment section. I apologize for the verbosity.
Consider this example.
Showing parameters given to a script
test.sh:
echo "$1"
echo "$2"
Let's run this script and give it 2 parameters.
$> bash test.sh ls -l
Result:
ls
-l
The first parameter, ls, represented by $1, is echoed on the first line. The second parameter, -l, represented by $2, is echoed on the second line.
Bash manual - let's see what it says
($@) Expands to the positional parameters, starting from one
See this: https://www.gnu.org/software/bash/manual/bash.html#Special-Parameters
How does that impact our example? Let's change test.sh a bit.
Expanding parameters starting from one
test.sh:
echo "$#"
Let's run it.
$> bash test.sh ls -l
Result:
ls -l
$@ listed both parameters on the same line, one after the other. If you had 5 parameters, they'd all be printed one after the other.
Let's change test.sh a bit more.
Adding a + to the echo
test.sh:
echo "+" "$#"
Let's run it.
$> bash test.sh ls -l
Result:
+ ls -l
That is, a + appeared before the two parameters were printed.
Change test.sh a bit more.
Executing all provided parameters
test.sh:
echo "+" "#"
"$#"
Let's run this.
bash test.sh ls -l
Result:
+ ls -l
total 4
-rw-r--r-- 1 eapo users 0 Sep 23 19:38 file1
-rw-r--r-- 1 eapo users 19 Sep 23 19:38 test.sh
Great. As the commenters and Aagam mentioned, the script printed out what it was going to execute (using echo "+" "$@") and then executed the command. The "$@" on its own line simply runs ls -l; the shell executes it as is.
Let's put a function in the script now.
Adding a function in the script
test.sh:
run_command() {
echo "+" "$#"
"$#"
}
run_command ls -l
Note that we are calling the function inside the script itself instead of passing the arguments on the command line.
Let's run it.
bash test.sh
Result:
+ ls -l
total 4
-rw-r--r-- 1 eapo users 0 Sep 23 19:38 file1
-rw-r--r-- 1 eapo users 58 Sep 23 19:41 test.sh
Hope the examples walk you through how the script functions.
This prints the command and its output.
e.g.
run_command() {
echo "+" "$#"
"$#"
}
run_command ls
#output
#+ ls
#files_list_in_current_directory
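One extra detail (a small sketch, not part of the original question): because "$@" is quoted, an argument that contains spaces is passed through as a single argument when the command is re-run:
run_command() {
echo "+" "$@"
"$@"
}
run_command printf '%s\n' "hello world"
#output
#+ printf %s\n hello world
#hello world
#with an unquoted $@ instead, printf would receive hello and world separately and print them on two lines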

tee -a: No such file or directory

Please have a look at the following code.
#!/bin/sh
LOG_FILE=/home/admin/scriptLogs.log
rm -f ${LOG_FILE}
echo "`date`:: Script Execution Started" | tee -a ${LOG_FILE}
DATABASE ACCESS CODE 2>&1
echo "`date`:: Script Execution Successful " | tee -a {$LOG_FILE}
exit 0
It produces following output:
> Tue Feb 7 12:14:49 IST 2017:: Script Execution Started
> tee:{/home/admin/scriptLogs.log}: No such file or directory
> Tue Feb 7 12:14:49 IST 2017:: Script Execution Successfull
However, the file is present at the specified location. It also gets appended with data, except for the last echo statement. Why this behavior?
Make sure to always quote your variables when expanding them, to avoid reinterpretation (read about it here: http://www.tldp.org/LDP/abs/html/quotingvar.html).
Also, you should define LOG_FILE as a string (note the quotes below):
LOG_FILE="/home/admin/scriptLogs.log"
With that said, you have a typo in your script as mentioned by @codeforester.
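The braces are the problem: {$LOG_FILE} is not brace expansion (that needs a comma or a range inside), so the braces stay literal and tee is asked to append to a file literally named {/home/admin/scriptLogs.log}. A quick sketch at the prompt shows the difference:
$ LOG_FILE=/home/admin/scriptLogs.log
$ echo {$LOG_FILE}
{/home/admin/scriptLogs.log}
$ echo ${LOG_FILE}
/home/admin/scriptLogs.log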
Also, you print that the execution was successful, without checking that it really was.
So your code should look like this:
#!/bin/sh
LOG_FILE="/home/admin/scriptLogs.log"
rm -f "$LOG_FILE"
echo "`date`:: Script Execution Started" | tee -a "$LOG_FILE"
DATABASE ACCESS CODE 2>&1
if [ $? -eq 0 ]; then
echo "`date`:: Script Execution Successful " | tee -a "$LOG_FILE"
else
...
fi
exit 0
Note: I have removed the curly brackets, as they are not needed here (although it is not a mistake to use them).
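The braces only matter when the expansion is immediately followed by characters that could be part of a variable name, for example (a hypothetical backup file name, not from the question):
backup="${LOG_FILE}_old"   # appends _old to the value of LOG_FILE
backup="$LOG_FILE_old"     # wrong here: this looks up a variable named LOG_FILE_old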
As pointed out by @codeforester, use:
#!/bin/sh
LOG_FILE=/home/admin/scriptLogs.log
rm -f ${LOG_FILE}
echo "`date`:: Script Execution Started" | tee -a ${LOG_FILE}
DATABASE ACCESS CODE 2>&1
echo "`date`:: Script Execution Successful " | tee -a ${LOG_FILE}
exit 0

Command prompt appearing during execution of a bash script

This is a simplified version of a script I use:
In its simplified version it should read the file input line by line, print each line to standard output, and also write it to a file log.
input file:
asas
haha
asha
hxa
The script (named simple):
#!/bin/bash
FILE=input
logfile="log"
exec > >(tee "$logfile") # redirect the output to a file but keep it on stdout
exec 2>&1
DONE=false
until $DONE; do
read || DONE=true
[[ ! $REPLY ]] && continue #checks if a line is empty
echo "----------------------"
echo $REPLY
done < "$FILE"
echo "----------------------"
echo ">>> Finished"
The output (on console):
-bash-3.2$ ./simple
-bash-3.2$ ----------------------
asas
----------------------
haha
----------------------
asha
----------------------
hxa
----------------------
>>> Finished
At this time I need to press enter to terminate the script. Notice that a command prompt -bash-3.2$ showed up during execution.
I checked that those lines are to blame:
exec > >(tee "$logfile") # redirect the output to a file but keep it on stdout
exec 2>&1
Without them the output is as expected:
-bash-3.2$ ./simple
----------------------
asas
----------------------
haha
----------------------
asha
----------------------
hxa
----------------------
>>> Finished
-bash-3.2$
What is more, I don't need to press Enter to terminate the script.
Unfortunately, I need to save the output both to the console (stdout) and to a log file.
How can this be fixed?
You can use tee on the echo lines directly.
For example:
[ben@lappy ~]$ echo "echo and save to log" | tee -a example.log
echo and save to log
[ben@lappy ~]$ cat example.log
echo and save to log
The -a argument to tee will append to the log.
If you just need it to pause and wait for user input, use read:
read -r -p 'Press Enter to continue'
What's happening is that tee "$logfile" is being run asynchronously. When you use process substitution like that, the main script doesn't wait for the process to exit.
So the until loop runs, the main script exits, the shell prints the prompt, and then tee prints its output.
You can see this more easily with:
echo Something > >(sleep 5; cat)
You'll get a command prompt, and then 5 seconds later Something will appear.
There was a thread about this behavior in comp.unix.shell a couple of years ago. See it here
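If the goal is simply to avoid the stray prompt, one alternative (a sketch, not taken from the linked thread) is to drop the exec redirections and pipe the whole block to tee; because it is an ordinary pipeline, the script waits for tee to finish before exiting:
#!/bin/bash
FILE=input
logfile="log"
{
    DONE=false
    until $DONE; do
        read || DONE=true
        [[ ! $REPLY ]] && continue    # skip empty lines
        echo "----------------------"
        echo "$REPLY"
    done < "$FILE"
    echo "----------------------"
    echo ">>> Finished"
} 2>&1 | tee "$logfile"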
