Read commands from a .txt file, execute them, and send their output to another file - bash

When I run my script, the .txt file is read and each command is assigned to $eggs. Then, to execute the commands and redirect their output to a file, I use echo $eggs >> eggsfile.txt, but when I cat the file I just see the commands themselves, not the output of executing them.
echo "Hi, $USER"
cd ~/mydata
echo -e "Please enter the name of commands file:/s"
read fname
if [ -z "$fname" ]
then
exit
fi
terminal=`tty`
exec < $fname #exec destroys current shell, opens up new shell process where FD0 (STDIN) input comes from file fname
count=1
while read line
do
echo $count.$line #count line numbers
count=`expr $count + 1`; eggs="${line#[[:digit:]]*}";
touch ~/mydata/eggsfile.txt; echo $eggs>>eggsfile.txt; echo "Reading eggsfile contents: $(cat eggsfile.txt)"
done
exec < $terminal

If you just want to execute the commands and log each command before its output, you can use 'sh -x'. You will get '+ command' printed before each command.
sh -x commands
+ pwd
/home/user
+ date
Sat Apr 4 21:15:03 IDT 2020
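To also capture this into the file from the question, note that the -x trace is written to stderr while the commands' own output goes to stdout, so redirect both (a minimal sketch reusing the file names above):
sh -x commands >> eggsfile.txt 2>&1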
If you want to build your own (custom formatting, etc.), you will have to force execution of each command. Something like:
cd ~/mydata
count=0
while read line ; do
    count=$((count+1))
    echo "$count.$line"
    eggs="${line#[[:digit:]]*}"
    echo "$eggs" >> eggsfile.txt
    # Execute the line (in a subshell).
    ($line) >> eggsfile.txt
done < "$fname"
Note that this approach uses local redirection for the while loop, avoiding having to revert the input back to the terminal.
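For contrast, a sketch of the two shapes, reusing $fname and $terminal from the question's script:
# question's approach: redirect the whole shell's stdin, then restore it
exec < "$fname"
while read line; do echo "$line"; done
exec < "$terminal"
# answer's approach: redirect stdin only for the loop; the shell's own stdin is untouched
while read line; do echo "$line"; done < "$fname"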

How do I programmatically execute a carriage return in bash?

My main file is main.sh:
cd a_folder
echo "executing another script"
source anotherscript.sh
cd ..
#some other operations.
anotherscript.sh:
pause(){
read -p "$*"
}
echo "enter a number: "
read number
#some operation
pause "Press enter to continue..."
I wanted to skip the pause command. But when I do:
echo "/n" | source anotherscript.sh
It doesn't allow to enter the number. I want the "/n" to occur so that I allow the user to enter a number but skip the pause statement.
PS: can't do any changes in anotherscript.sh. All changes to be done in main.sh.
Try
echo | source anotherscript.sh
Your approach does not work because the sourced script expects two lines from stdin: first a line containing a number, then an empty line (which satisfies the pause). Hence you would have to feed two lines, the number and the empty line, to the script. If you still want to get the number from your own stdin, you have to use a read command first:
echo "executing another script"
echo "enter a number: "
read number
printf "$number\n\n" | source anotherscript.sh
But this still has some danger lurking: because of the pipe, the source command is executed in a subshell; hence, any changes in the environment performed by anotherscript.sh won't be visible in your shell.
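A tiny illustration of that pitfall (file name is hypothetical):
echo 'x=42' > /tmp/setx.sh
echo | source /tmp/setx.sh      # the pipe makes source run in a subshell
echo "${x:-x is unset here}"    # prints "x is unset here"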
A workaround would be to put the number-reading logic outside of main.sh:
# This is script supermain.sh
echo "executing another script"
echo "enter a number: "
read number
printf "$number\n\n"|bash main.sh
where in main.sh, you simply keep your source anotherscript.sh without any piping.
As user1934428 comments, a bash pipeline causes the commands in it to be executed in subshells, and variable modifications made there are not reflected in the current process.
To change this behavior, you can set lastpipe with the shopt builtin. Bash then changes its job control so that the last command in the pipeline is executed in the current shell (as tcsh does).
Then would you please try:
main.sh:
#!/bin/bash
shopt -s lastpipe # this changes the job control
read -p "enter a number: " x # ask for the number in main_sh instead
cd a_folder
echo "executing another script"
echo "$x" | source anotherscript.sh > /dev/null
# anotherscript.sh is executed in the current process
# unnecessary messages are redirected to /dev/null
cd ..
echo "you entered $number" # check the result
#some other operations.
which will properly print the value of number.
Alternatively, since a here-string does not create a pipeline, source runs in the current shell and lastpipe is not needed:
#!/bin/bash
read -p "enter a number: " x
cd a_folder
echo "executing another script"
source anotherscript.sh <<< "$x" > /dev/null
cd ..
echo "you entered $number"
#some other operations.

How to capture shell script output

I have a Unix shell script. I have put -x in the shebang to see every execution step. Now I want to capture these steps in a log file on a daily basis.
Please see the script below.
#!/bin/ksh -x
logfile=path.log.date
print "copying file" | tee $logfile
scp -i key source destination | tee -a $logfile
exit 0
The first line of a shell script is known as the shebang; it indicates which interpreter will execute the script. Since that line starts with #, the shell itself treats it as a comment.
To capture the output, redirect it when you run the script:
ksh -x scriptname >> output_file 2>&1
Note: this logs what your script is doing, line by line (the 2>&1 is needed because the -x trace is written to stderr).
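Since the question asks for a daily log, one small sketch (the log path is made up for the example) builds the output file name from the current date at invocation time, so each day's run appends to its own file:
ksh -x scriptname >> "/var/log/myscript.$(date +%Y-%m-%d).log" 2>&1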
There are two cases: either ksh is your interactive shell and you do the I/O redirection in ksh directly, or you execute the .ksh script from some other shell and that shell performs the redirection. The following method should work for most shells.
$ cat somescript.ksh
#!/bin/ksh -x
printf "Copy file \n";
printf "Do something else \n";
Run it:
$ ./somescript.ksh 1>some.log 2>&1
some.log will contain,
+ printf 'Copy file \n'
Copy file
+ printf 'Do something else \n'
Do something else
In your case, there is no need to specify a logfile or use tee inside the script. It would look something like this:
#!/bin/ksh -x
printf "copying file\n"
scp -i key user#server /path/to/file
exit 0
Run it:
$ ./myscript 1>/path/to/logfile 2>&1
1>logfile sends stdout to the logfile, and 2>&1 then sends stderr to the same place, so both the normal output and the -x trace end up in the logfile.
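If you instead want the trace separate from the normal output, a small variant (file names are just examples) sends the two streams to different files:
./myscript 1>output.log 2>trace.log   # normal output in one file, the -x trace (stderr) in the other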
I would prefer to explicitly redirect the output inside the script (including stderr, because set -x sends its trace to stderr).
This keeps the shebang short, and you don't have to cram the redirection and filename-building into the shebang.
#!/bin/ksh
logfile=path.log.date
exec >> $logfile 2>&1 # redirecting all output to logfile (appending)
set -x # switch on debugging
# now start working
echo "print something"

Replacing 'source file' with its content, and expanding variables, in bash

In a script.sh,
source a.sh
source b.sh
CMD1
CMD2
CMD3
how can I replace the source *.sh with their content (without executing the commands)?
I would like to see what the bash interpreter executes after sourcing the files and expanding all variables.
I know I can use set -n -v or run bash -n -v script.sh 2>output.sh, but that would not replace the source commands (and even less if a.sh or b.sh contain variables).
I thought of using a subshell, but that still doesn't expand the source lines. I tried a combination of set +n +v and set -n -v before and after the source lines, but that still does not work.
I'm going to send that output to a remote machine using ssh.
I could use < output.sh to feed that content into the ssh command, but I can't log in as root on the remote machine; I am, however, a sudoer.
Therefore, I thought I could create the script and send it as a base64-encoded string (using that clever trick):
base64 script | ssh remotehost 'base64 -d | sudo bash'
Is there a solution?
Or do you have a better idea?
You can do something like this:
inline.sh:
#!/usr/bin/env bash
# read the script line by line, preserving whitespace and backslashes
while IFS= read -r line; do
    # lines of the form "source file" or ". file": inline the file's content
    if [[ "$line" =~ ^(\.|source)[[:space:]]+.+ ]]; then
        file="$(echo "$line" | cut -d' ' -f2)"
        cat "$file"
    else
        echo "$line"
    fi
done < "$1"
Note this assumes the sourced files exist, and doesn't handle errors. You should also handle possible hashbangs. If the sourced files themselves contain source commands, you need to apply the script recursively, e.g. something like (not tested):
while egrep -q '^(source|\.)[[:space:]]' main.sh; do
    bash inline.sh main.sh > main.sh.tmp && mv main.sh.tmp main.sh
done
The temporary file matters: redirecting straight back into main.sh would truncate it before inline.sh gets to read it.
Let's test it
main.sh:
source a.sh
. b.sh
echo cc
echo "$var_a $var_b"
a.sh:
echo aa
var_a="stack"
b.sh:
echo bb
var_b="overflow"
Result:
bash inline.sh main.sh
echo aa
var_a="stack"
echo bb
var_b="overflow"
echo cc
echo "$var_a $var_b"
bash inline.sh main.sh | bash
aa
bb
cc
stack overflow
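Putting it together with the base64 trick from the question (the host name and sudo setup are whatever you have on your side), the remote step could look like:
bash inline.sh main.sh | base64 | ssh remotehost 'base64 -d | sudo bash'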
BTW, if you just want to see what bash executes, you can run
bash -x [script]
or remotely
ssh user@host -t "bash -x [script]"

Use read builtin command to read from parent stdin while in a subshell

I have a script that launches a subshell/background command to read input and then does more work:
#!/bin/bash
(
    while true; do
        read -u 0 -r -e -p "test_rl> " line || break
        echo "line: ${line}"
    done
) &
sleep 3600 # more work
With the above I don't even get a prompt. If I exec 3>&0 prior to launching the subshell and then read from descriptor 3 (-u 3) then I at least get the prompt, but the read command still doesn't get any input that I type.
How do I get the read builtin to read correctly from the terminal (parent's stdin file descriptor)?
How do I get the read builtin to read correctly from the terminal
(parent's stdin file descriptor)?
You might want to try this (using the parent's file descriptors):
#!/bin/bash
(
    while true; do
        read -u 0 -r -e -p "test_rl> " line || break
        echo "line: ${line}"
    done
)<&0 >&1 &
sleep 3600 # more work
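If that still does not pick up what you type in your environment, another commonly used approach (a sketch, not taken from the answer above) is to have read open the controlling terminal explicitly via /dev/tty:
#!/bin/bash
(
    while true; do
        # read straight from the controlling terminal; assumes the script
        # is actually attached to a terminal
        read -r -e -p "test_rl> " line < /dev/tty || break
        echo "line: ${line}"
    done
) &
sleep 3600 # more work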

Redirect to a file which is got from command line

I'm a beginner. :)
I'm trying to ask for the name of a file at a prompt in one shell script
and write to that file from another script, like this:
test.sh
echo "enter file name"
read word
sh test2.sh
test2.sh
read number
echo "$number" >> $word
I get an error:
test2.sh: line 1: $word: ambiguous redirect
Any suggestion?
If you want a variable from test.sh to be visible to its child processes, you need to export it. In your case, you would seem to want to export word. Perhaps a better approach would be for test2.sh to accept the destination file as a parameter instead, though.
test.sh
echo "enter file name"
read word
test2.sh "$word"
test2.sh
#!/bin/sh
: ${1?must have destination file name} # fail if missing
read number
echo "$number" >> "$1"
