I use Ctrl+R on the command line frequently to search for previous commands, but I cannot get this to work for commands that have just been run in a bash script.
I've tried running the script directly and using 'source', but history shows no record.
Is there any way to get history updated via a script?
You can try using the history -s command to store the command in the history list.
Example:
$ history -s echo foo
[Ctrl+R]
(reverse-i-search)`foo': echo foo
Alternatively, write your commands to a file and then use history -n file to read commands from the file into the current history list.
Example:
$ echo "echo bar" > /tmp/file
$ history -n /tmp/file
[Ctrl+R]
(reverse-i-search)`bar': echo bar
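Here is a minimal runnable sketch of the history -s approach (the injected command is just an example; in real use you would source the script from your interactive shell so the entry lands in that shell's history and Ctrl+R can find it):

```shell
# history -s stores a line in the history list without executing it.
set -o history               # needed here only because scripts start with history off
history -s 'echo injected'   # store the line, do not run it
history | tail -n 2          # the injected entry now appears in the list
```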
Suppose we have an env.sh file that contains:
echo $(history | tail -n2 | head -n1) | sed 's/[0-9]* //' #looking for the last typed command
When executing this script with bash env.sh, the output is empty, but when we execute the script with ./env.sh, we get the last typed command.
I just want to know the difference between them.
Notice that if we add #!/bin/bash at the beginning of the script, ./env.sh will no longer output anything.
History is disabled by Bash in non-interactive shells by default. If you want to enable it, however, you can do so like this:
#!/bin/bash
echo $HISTFILE # will be empty in a non-interactive shell
HISTFILE=~/.bash_history # set it again
set -o history
# the command will work now
history
This is done to avoid cluttering the history with commands run by shell scripts.
As for the difference: when ./env.sh has no hashbang, your interactive shell runs the file itself in a forked copy, which inherits the in-memory history list, so the last typed command is still visible. Adding a hashbang (meaning the file is to be interpreted by the program specified in it) makes ./env.sh start a fresh non-interactive /bin/bash, i.e. the same situation as bash env.sh, thus again printing no history.
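A quick way to see this default is to pipe commands into a fresh non-interactive bash:

```shell
# With history off (the default for scripts), the history builtin prints nothing:
printf 'history\n' | bash
# After set -o history, subsequent commands are recorded and listed:
printf 'set -o history\necho recorded\nhistory\n' | bash
```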
I want to run multiple commands as if they were executed one at a time at the command prompt.
E.g., I have the following list of commands:
ls
pwd
du -sh
Now I try to copy-paste them and run:
$ ls
pwd
du -sh
file1.txt file2.txt
/home/user/test
1M .
but instead I want them executed separately, so that I can see their outputs like below
$ ls
file1.txt file2.txt
$ pwd
/home/user/test
$ du -sh
1M .
So, if I have a list of commands, is it possible to paste them in such a way that they execute as if entered one per prompt? Otherwise the only option is to paste one command at a time.
Generally I get a list of commands to be executed.
While pasting essentially works the way you describe, it may end up looking cosmetically wrong when the input (and its local echo) shows up while the shell is still busy executing the previous command.
You could instead feed the commands to bash -i, which will read and execute them in turn, showing the prompt:
$ mypaste() { x="$(cat)"; bash -i <<< "$x"; }
$ mypaste # Now paste some commands and hit ctrl-d
ls
pwd
whoami
^D
This results in:
you@yourdir $ ls
some files
you@yourdir $ pwd
/home/you/yourdir
you@yourdir $ whoami
you
you@yourdir $ exit
$
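If you don't want an interactive child shell, a rough alternative is to print a fake prompt before each line yourself (runpaste is a made-up name, and eval runs the pasted text verbatim, so only feed it trusted commands):

```shell
# Read commands line by line, echo a fake "$ " prompt before each one,
# then execute it, so the transcript looks like interactive use.
runpaste() {
  while IFS= read -r cmd; do
    printf '$ %s\n' "$cmd"
    eval "$cmd"
  done
}
printf 'echo hello\npwd\n' | runpaste
```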
Open myscript.sh in nano (or your favorite editor) and paste the following.
#!/bin/bash
ls
pwd
du -sh
Make it executable with chmod +x myscript.sh and run the script with
./myscript.sh
You can add any bash commands and see their outputs.
Try the commands separated by semicolons:
ls; pwd; du -sh;
This makes them a batch of commands: the shell will execute them one by one, and you don't have to paste each command separately.
Hope this helps.
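One caveat worth knowing: a semicolon-separated batch keeps going even when a command fails. If the commands depend on each other, chaining with && instead stops at the first failure:

```shell
false; echo "semicolon: this still runs"   # ; ignores the failure
false && echo "never printed"              # && stops after false
echo "done"
```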
The answer from that other guy worked, and I used a slightly modified version using a heredoc.
I wanted to script a sequence of commands that show the prompt so I could copy/paste on different systems and show how to replicate a bug.
simple version
bash -i << 'EOF'
echo "command one"
echo "command two"
EOF
more commands and pretty output
bash -i << 'EOF' && echo -e '\e[1A\e[K==========================================='
unset PROMPT_COMMAND; PS1='command-sequence:$ ' ; clear ; echo "==========================================="
mkdir /tmp/demo-commands
echo "file contents" > /tmp/demo-commands/file
cd /tmp/demo-commands
pwd
ls
cat file
rm file
rm -r /tmp/demo-commands
EOF
I customize the prompt and use echo -e '\e[1A\e[K' to replace the last line with a separator.
English is not my native language, please accept my apologies for any language issues.
I want to execute a script (bash/sh) through cron, which will perform various maintenance actions, including backups. This script will execute other scripts, one for each function. And I want the entirety of what is printed to be saved in a separate file for each script executed.
The problem is that each of these other scripts executes commands like "duplicity", "certbot" and "maldet", among others. The "echo" commands in each script are printed to the file, but the output of the "duplicity", "certbot" and "maldet" commands is not!
I want to avoid having to put "| tee --append" or another command on each line. But even doing this on each line, the sub-scripts do not save to the log file. Ideally, the parent script could specify to which file each script prints.
Does not work:
sudo bash /duplicityscript > /path/log
or
sudo bash /duplicityscript >> /path/log
sudo bash /duplicityscript | sudo tee --append /path/log > /dev/null
or
sudo bash /duplicityscript | sudo tee --append /path/log
Using exec (like this):
exec > >(tee -i /path/log)
sudo bash /duplicityscript
exec > >(tee -i /dev/null)
Example:
./maincron:
sudo ./duplicityscript > /myduplicity.log
sudo ./maldetscript > /mymaldet.log
sudo ./certbotscript > /mycertbot.log
./duplicityscript:
echo "Exporting Mysql/MariaDB..."
{dump command}
echo "Exporting postgres..."
{dump command}
echo "Start duplicity data backup to server 1..."
{duplicity command}
echo "Start duplicity data backup to server 2..."
{duplicity command}
In the log file, this will print:
Exporting Mysql/MariaDB...
Exporting postgres...
Start duplicity data backup to server 1...
Start duplicity data backup to server 2...
In the example above, the "echo" lines in each script are saved to the log file, but the output of the duplicity and dump commands is printed to the screen and not to the log file.
I did some googling, and I even saw this topic, but I could not adapt it to my needs.
It is no problem that the output is also printed on the screen, as long as it is printed in its entirety to the file.
Try adding 2>&1 at the end of the line; it redirects stderr, where many of these tools write, to the same place as stdout, so it should help. Or run the script in sh -x mode to see what is causing the issue.
Hope this helps
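To see why this works: tools like duplicity, certbot and maldet typically write progress to stderr, while > /path/log captures only stdout. A self-contained sketch, where subscript stands in for one of the real sub-scripts:

```shell
log=$(mktemp)                                   # stand-in for /path/log
subscript() { echo "to stdout"; echo "to stderr" >&2; }

subscript > "$log"          # stderr escapes to the terminal, the log misses it
subscript >> "$log" 2>&1    # 2>&1 sends stderr to the log as well
cat "$log"
```

With tee, the redirection goes before the pipe: sudo bash /duplicityscript 2>&1 | sudo tee --append /path/log.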
I made a script like the one below.
history
read input
!$input
But that makes the error:
./history.sh: line 8: !2185: command not found
How can I run a '!' command in a shell script?
Maybe you need something like:
history
echo $!
Inside your script you can use fc shell builtin for this.
e.g.
#!/bin/bash
# enable history
set -o history
# example command
ls -lrt
echo "now run ls from history..."
fc -s ls
to re-execute the most recent ls from history.
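A runnable sketch of the same idea applied to the original problem: enable history with set -o history, run a command, then replay it with fc -s instead of ! (the echoed command text is just an example):

```shell
#!/bin/bash
set -o history                      # scripts start with history off
echo "first run"
replay=$(fc -s echo 2>/dev/null)    # re-executes the most recent "echo ..." command
printf 'replayed: %s\n' "$replay"
```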
I am missing something really simple, I think:
$ cat hs.sh
#!/bin/bash
echo $1
history | grep -i $1
echo $#
exit
$
Here is the output:
$ ./history_search sed
sed
1
$
I am trying to create a script which I can use in the form './hs.sh sed' to search for all sed commands in history. I can create an alias using this which works fine, but not this script.
Here is the alias:
alias hg='history | grep -i $1'
Interactive shells have history; scripted shells do not have history. You can only ask for history from an interactive shell, which is why the alias works but the script does not.
When you run this as a shell script, it spawns a new shell that has no history.
Try running it in the same shell like this:
source ./history_search see
and it should work.
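If you don't want to source the script, an alternative is to search the saved history file instead of the live history list (by default ~/.bash_history; note that bash only writes it out when an interactive shell exits, so very recent commands may be missing). A sketch using a temporary stand-in file:

```shell
histfile=$(mktemp)                        # stand-in for ~/.bash_history
printf 'ls -l\nsed -e s/a/b/\npwd\n' > "$histfile"
grep -i -- 'sed' "$histfile"              # what hs.sh would run against $HISTFILE
```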