Bash script stderr and stdout - bash

First, I'm not sure if I'm formulating the question correctly.
So I have a bash script that deletes a user from the system. The problem is that it prints the error message before the echo output it belongs with.
Here.
Code:
echo -e "\nCommencing user $UserName removal"
echo -e "Deactivating $UserName shell account ..." "$(chsh -s /usr/bin/false $UserName)"
echo -e "Listing group(s) user: " "$(groups $UserName)"
echo -e "Removing Crontab ... " "$(crontab -r -u $UserName)"
Here is the output:
Commencing user Test2 removal
chsh: unknown user: Test2
Deactivating Test2 shell account ...
groups: Test2: no such user
Listing group(s) user:
scripts/sudo.sh: line 332: /delete_Test2/crontab.bak: No such file or directory
Saving Crontab ...
The user to delete is Test2, which in this case "supposedly" does not exist (a different question for a different time). Now, shouldn't the stderr message display next to the command's output, or below it, instead of above it?
Thanks in advance

When the shell executes the line echo -e "Deactivating $UserName shell account ..." "$(chsh -s /usr/bin/false $UserName)", here's the sequence of events:
1. The shell runs chsh -s /usr/bin/false Test2, with its stdout going to a capture buffer so the shell can use it later.
2. chsh discovers that "Test2" doesn't exist, and prints "chsh: unknown user: Test2" to its stderr. Since the shell didn't do anything special with its stderr, this goes directly to the shell's stderr, which is your terminal.
3. chsh exits.
4. The shell takes the captured output (note: stdout, not stderr) from chsh (there wasn't any), and substitutes it into the echo command line, giving echo -e "Deactivating Test2 shell account ..." ""
Note that the error message gets printed at step 2, but the message about what's supposedly about to happen doesn't get printed until step 4.
There are several ways to solve this; generally the best is to avoid the whole mess of capturing the command's output and then echoing it. It's pointless, and just leads to confusion like this (and other problems you haven't run into yet). Just run the command directly, and let its output (both stdout and stderr) go to its normal place, in the normal order.
BTW, I also recommend avoiding echo -e, or indeed echo -anything. The POSIX standard for echo says "Implementations shall not support any options." In fact, some implementations do support options; others just treat them as part of the string to be printed. Also, some interpret escape sequences (like \n) in the strings to print, some don't, and some only do if -e is specified (violating the POSIX standard). Given how unpredictable these features are, it's best to just avoid such iffy situations, and either use printf instead (which is more complicated to use, but much more predictable), or (as in your case) just use a separate echo command for each line.
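For instance, the first two messages could be written with printf, where the format string makes the newline handling explicit:
printf '\nCommencing user %s removal\n' "$UserName"
printf 'Deactivating %s shell account ...\n' "$UserName"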
Also, you should almost always double-quote variable references (e.g. groups "$UserName" instead of groups $UserName), just in case they contain spaces, wildcards, etc.
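To see why, consider a username containing a space (contrived, since most systems won't allow one, but the same applies to wildcards):
UserName="Test 2"
groups $UserName     # runs: groups Test 2   (two arguments)
groups "$UserName"   # runs: groups "Test 2" (one argument)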
Based on the above, here's how I'd write the script:
echo # print a blank line
echo "Commencing user $UserName removal"
echo "Deactivating $UserName shell account ..."
chsh -s /usr/bin/false "$UserName"
echo "Listing group(s) user: "
groups "$UserName"
echo "Removing Crontab ... "
crontab -r -u "$UserName"

Now, shouldn't the stderr msg display next to the command or below it instead of above it?
No, because the command substitution is executed first, and only then is its output echoed. Perform the echo and the command on two separate lines.
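For example:
echo "Deactivating $UserName shell account ..."
chsh -s /usr/bin/false "$UserName"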

That's because chsh writes to stderr before bash has a chance to write to stdout. Redirect stderr to stdout to get the order right:
echo -e "Deactivating $UserName shell account ..." "$(chsh -s /usr/bin/false $UserName 2>&1)"
In general, it is better to check if the user exists before trying to deactivate it.
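For example, a minimal existence check could use id, which exits nonzero for an unknown user:
if id -u "$UserName" >/dev/null 2>&1; then
    chsh -s /usr/bin/false "$UserName"
else
    echo "User $UserName does not exist" >&2
fi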

Related

printing output of command history 1 from shell script

Here's my problem: from the console, if I type the below,
var=`history 1`
echo $var
I get the desired output. But when I do the same inside a shell script, it is not showing any output. Also, for other commands like pwd, ls etc, the script shows the desired output without any issue.
As the value of the variable contains spaces, add quotes around the expansion (and prefer $( ) over backticks):
E.g.:
var="$(history 1)"
echo "$var"
I believe all you need is the following:
1- Ask the user for the number of history lines to print.
2- Run the script, take the input from the user, and get the output as follows:
cat get_history.ksh
echo "Enter the line number of history which you want to get.."
read number
if [[ -z $number ]]
then
    echo "Usage: run get_history.ksh and enter the number of lines when prompted"
    exit
else
    history "$number"
fi
Added logic that validates the input: if no number is entered, the script exits.
By default, history is turned off in a script, so you need to turn it on:
set -o history
var=$(history 1)
echo "$var"
Note the preferred use of $( ) rather than the deprecated backticks.
However, this will only look at the history of the current process, that is this shell script, so it is fairly useless.

How can I easily log some specific command line commands into a file?

I often perform configuration changes using single-line commands on Mac OS, Linux, or even Windows, and I want to easily log them in a file so I can replay them if I have to reconfigure the machine again.
Please note that I want to do this only for some commands, so the shell history is of no use.
Ideally I would like to be able to use some kind of shell extension that logs some of the commands.
As you know, if you start your bash command with a space, the command is not logged in the history.
What if I could have another prefix that does the opposite? Is there something that can be used for this? A solution for bash would be more than enough, and if there is an already existing solution, that would be much better than me writing a new one.
You could do your logging in PROMPT_COMMAND, extracting the specific commands from shell history and writing them to a file.
Something like:
log () {
    last_command="$(history -p \!\!)"
    if [[ $last_command == "  "* ]]  # save commands starting with *two* spaces
    then
        printf "%s\n" "$last_command" >> ~/special.log
    fi
}
PROMPT_COMMAND="log; $PROMPT_COMMAND"
This has problems:
PROMPT_COMMAND is run each time the prompt is printed. Just pressing Enter multiple times could cause a command to be logged multiple times (one way to guard against this is sketched after this list).
Marking with two spaces would, of course, need you to remove ignorespace or ignoreboth from HISTCONTROL so that commands starting spaces are logged at all.
AFAICT, history is updated when the next command is read, so the command is logged after the next command returns to the prompt, since that's when the correct history is available in PROMPT_COMMAND.
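As a rough mitigation for the duplicate-logging problem, the function could remember the last history number it saw and only log when a new entry appears. A minimal sketch, where __log_last is just an illustrative variable name:
log () {
    local num
    num=$(HISTTIMEFORMAT= history 1 | awk '{print $1}')
    if [[ $num != "${__log_last:-}" ]]; then  # a new history entry has appeared
        __log_last=$num
        local last_command
        last_command="$(history -p \!\!)"
        if [[ $last_command == "  "* ]]; then
            printf "%s\n" "$last_command" >> ~/special.log
        fi
    fi
}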
All this would be easier in zsh, with a preexec hook:
preexec () {
    if [[ $1 == "  "* ]]
    then
        printf "%s\n" "$1" >> ~/special.log
    fi
}
The preexec function automatically gets the command as the first argument if history is enabled, saving us a great deal of trouble. It is run when the command has been read, but before it begins execution, so the timing is perfect. From the documentation:
preexec
Executed just after a command has been read and is about to be
executed. If the history mechanism is active (regardless of whether
the line was discarded from the history buffer), the string that the
user typed is passed as the first argument, otherwise it is an empty
string. The actual command that will be executed (including expanded
aliases) is passed in two different forms: the second argument is a
single-line, size-limited version of the command (with things like
function bodies elided); the third argument contains the full text
that is being executed.
$ ls
$ echo foo | echo bar
bar
$ cat ~/special.log
ls
echo foo | echo bar
A function in .bashrc can be used like a prefix:
log_this_command () {
    echo "$@" >> ~/a_log_file  # log the command to file
    "$@"                       # and run the command itself
}
Caveat: this only logs expanded arguments, rather than the raw input.
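For example, variables and globs are logged after expansion, not as typed (assuming your home directory is /home/you):
$ log_this_command echo "$HOME"
/home/you
$ cat ~/a_log_file
echo /home/you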
Source a function with the same name as the command: function screencapture { echo "used parms: $*"; command screencapture "$@"; }
Appending to a log file: function screencapture { echo "$(date) screencapture $*" >> ~/log.txt; command screencapture "$@"; }
As you run the screencapture command, a log entry is created and the command executes as if uninterfered with.
You could automate creating these functions if the list of commands to cover is long; see the sketch below.
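A sketch of that generation, assuming the commands to wrap are listed inline (the names here are illustrative):
for cmd in screencapture say open; do
    eval "function $cmd { echo \"\$(date) $cmd \$*\" >> ~/log.txt; command $cmd \"\$@\"; }"
done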

In bash, how to process all user input on command line

How can I make all user input on the command line act as stdin for a program?
In my case, I want to replace certain words typed by the user. For example, every time the user uses the word animal1, I want it to be received as goldfish. So it would look like this:
$ animal1
goldfish: command not found
I tried the following bash command
while read input
do
    sed "s/animal2/zebra/g;s/animal1/goldfish/g" <<< "$input"
done
But it prompts for user input and does not return to bash. I want it to run while I'm using the bash command line.
Also, this allowed me to capture output only.
bash | sed 's/animal2/zebra/g;s/animal1/goldfish/g'
But not user input.
If I understand you correctly, sounds like you just need to set up some aliases:
$ alias animal1=goldfish
$ animal1
bash: goldfish: command not found
This allows the shell to be used interactively as usual but will make the substitutions you want.
You can add this alias definition to one of your startup files, commonly ~/.bashrc or ~/.profile, to have them take effect on any new shell that you open.
The solution provided by Tom Fenech is good; however, if you plan to add more features to the command, you can use a function like the following:
animal1() {
    echo "Welcome to the new user interface!"
    goldfish
    # other commands
}
and put it in the user's ~/.bashrc or ~/.bash_profile.
The output will be:
$>animal1
Welcome to the new user interface!
-bash: goldfish: command not found
By using this approach you can, for example, create a custom output message. In the following snippet I take the return value from the command and process it word by word. Then I remove the -bash: part of the output, reconstruct the message, and output it.
animal1() {
    echo "Welcome to the new user interface!"
    retval=$(goldfish 2>&1)
    # Now retval stores the output of the command goldfish (both stdout and stderr)
    # we can give it directly to the user
    echo "Default return value"
    echo "$retval"
    echo
    # or test the return value to do something
    # here I build a custom message by removing the bash part
    message=""
    read -ra flds <<< "$retval"
    for word in "${flds[@]}"  # extract each word from the line
    do
        # remove the bash part
        msg="$(echo "$word" | grep -v bash)"
        # append each word to the message
        [[ $msg ]] && message="$message $msg"
    done
    echo "Custom message"
    echo "$message"
    echo
}
Now the output would be:
Welcome to the new user interface!
Default return value
-bash: goldfish: command not found
Custom message
goldfish: command not found
If you comment the lines that echoes the default return value then you get exactly the output you asked for.

Write output to $file (allowing it to be stdout)

Suppose I have this script:
logfile=$1
echo "This is just a debug message indicating the script is starting to run..."
# Do some work...
echo "Results: x, y and z." >> $logfile
Is it possible to invoke the script from the command-line such that $logfile is actually stdout?
Why? I would like to have a script that prints part of its output to stdout or, optionally, to a file.
"But why not remove the >> $logfile part and just invoke it with ./script >> filename when you want to write to a file?", you may ask.
Well, because I just want to do this "optional redirect" thing for some output messages. In the example above, just the second message should be affected.
Use /dev/stdout, if your operating system is Linux or something similarly compliant with convention. Or:
#!/bin/bash
# works on bash even if OS doesn't provide a /dev/stdout
# for non-bash shells, consider using exec 3>&1 explicitly if $1 is empty
exec 3>"${1:-/dev/stdout}"
echo "This is just a debug message indicating the script is starting to run..." >&2
echo "Results: x, y and z." >&3
This is also vastly more efficient than putting >>"$filename" on every line that should log to the file, which reopens the file for output on each command.
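Usage might then look like this, assuming the script is saved as script.sh:
./script.sh results.log   # "Results: x, y and z." goes to results.log
./script.sh               # "Results: x, y and z." goes to stdout
In both cases the debug message goes to stderr.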

Open a shell in the second process of a pipe

I'm having problems understanding what's going on in the following situation. I'm not familiar with UNIX pipes and UNIX at all but have read documentation and still can't understand this behaviour.
./shellcode is an executable that successfully opens a shell:
seclab$ ./shellcode
$ exit
seclab$
Now imagine that I need to pass data to ./shellcode via stdin, because this reads some string from the console and then prints "hello " plus that string. I do it in the following way (using a pipe) and the read and write works:
seclab$ printf "world" | ./shellcode
seclab$ hello world
seclab$
However, a new shell is not opened (or at least I can't see it and interact with it), and if I run exit I'm logged out of the system, so I'm not in a new shell.
Can someone give some advice on how to solve this? I need to use printf because I need to input binary data to the second process and I can do it like this: printf "\x01\x02..."
When you use a pipe, you are telling Unix that the output of the command before the pipe should be used as the input to the command after the pipe. This replaces the default output (screen) and default input (keyboard). Your shellcode command doesn't really know or care where its input is coming from. It just reads the input until it reaches the EOF (end of file).
Try running shellcode and pressing Control-D. That will also exit the shell, because Control-D sends an EOF (your shell might be configured to say "type exit to quit", but it's still responding to the EOF).
There are two solutions you can use:
Solution 1:
Have shellcode accept command-line arguments:
#!/bin/sh
echo "Arguments: $*"
exec sh
Running:
outer$ ./shellcode foo
Arguments: foo
$ echo "inner shell"
inner shell
$ exit
outer$
To feed the argument in from another program, instead of using a pipe, you could:
$ ./shellcode `echo "something"`
This is probably the best approach, unless you need to pass in multi-line data. In that case, you may want to pass in a filename on the command line and read it that way.
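A minimal sketch of that filename-based variant, with the first argument assumed to name the data file:
#!/bin/sh
# read multi-line data from the file named as the first argument
while read input; do
    echo "Input: $input"
done < "$1"
exec sh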
Solution 2:
Have shellcode explicitly redirect its input from the terminal after it's processed your piped input:
#!/bin/sh
while read input; do
    echo "Input: $input"
done
exec sh </dev/tty
Running:
outer$ echo "something" | ./shellcode
Input: something
$ echo "inner shell"
inner shell
$ exit
outer$
If you see an error like this after exiting the inner shell:
sh: 1: Cannot set tty process group (No such process)
Then try changing the last line to:
exec bash -i </dev/tty
