Bash script to operate another script

I am trying to create a bash script to operate another bash script through CRON without the need for human intervention.
The script needs to be able to interact with the other script so that it accomplishes:
Press Enter..
Press a number..
Then it takes you to another section of the script where you need to enter another number..
Then enter another number..
Press Enter again..
I can't get the script to press Enter correctly. What I have so far, "echo | ./module1.sh", just flickers; I even tried "echo "\n"", which doesn't work either.
#!/bin/bash
cd /home/usernamehere/scripts
echo | ./module1.sh
echo "1"
As it stands, this script requires a person to sit at the terminal while it finishes what it needs to, or it has to be run in a tmux session with the user safely detaching from the session.

If everything is read from stdin (as opposed to from the terminal device, which is what passwd and screen editors do), and the script requires you to enter ENTER, 1, 2 and 3, you can run it with
printf '\n1\n2\n3\n' | ./module1.sh
An alternative is with a here-document (read your shell's manual page):
./module1.sh << EOF
1
2
3
EOF
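For the cron part of the original question, the same pipe can go directly into the crontab entry. A minimal sketch, assuming the schedule and the log file name are placeholders you would adjust:
# Illustrative crontab line: run every day at 02:30, feeding ENTER, 1, 2, 3 to module1.sh
30 2 * * * cd /home/usernamehere/scripts && printf '\n1\n2\n3\n' | ./module1.sh >> module1.log 2>&1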

Related

Echo a command back to the bash shell prompt?

I'm trying to implement a simple command line util that will let users select from a set of commands and then echo those command strings back to the shell. I don't want the shell to execute the commands; I want them to simply appear on the prompt line, so the user can verify or change them before pressing the return key.
I have no idea where to start. Echoing the command to stdout is easy enough with a log or println kind of call, but that goes to the stdout of the current process. Ideally, I would like the stdout of that process to become the stdin of the shell, but only on the prompt line, not piped into a new shell or executed as a command. Is this possible?
e.g.
$ help # user asks for help
1. you can do this
2. you can do that
? 1 # user chooses 1, help echoes back a string to the parent shell $$
$ this-command --flags # simply ends up on prompt line, but doesn't exec
Is this possible without a hook in the terminal ui or tty?
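No answer is recorded here, but one workaround that stays inside bash is a sketch under assumptions rather than a way to type into the parent shell's real prompt: define a wrapper function in the interactive shell and use read -e -i to pre-fill an editable line with the suggested command. The helper name help-menu and the prompt text are hypothetical.
# Hypothetical wrapper the user defines in their own interactive shell (bash only).
# help-menu is assumed to print the chosen command string on stdout.
pick() {
    local suggestion cmd
    suggestion=$(help-menu)
    read -e -p '$ ' -i "$suggestion" cmd   # pre-filled, editable line
    eval "$cmd"                            # runs only after the user presses return
}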

Why isn't this command returning to shell after &?

In Ubuntu 14.04, I created the following bash script:
flock -nx "$1" xdg-open "$1" &
The idea is to lock the file specified in $1 (flock), then open it in my usual editor (xdg-open), and finally return to the prompt, so I can open other files in sequence (&).
However, the & isn't working as expected. I need to press Enter to make the shell prompt appear again. In simpler constructs, such as
gedit test.txt &
it works as it should, returning the prompt immediately. I think it has to do with the existence of two commands in the first line. What am I doing wrong, please?
EDIT
The prompt is actually there, but it is somehow "hidden". If I issue the command
sudo ./edit error.php
it replies with
Warning: unknown mime-type for "error.php" -- using "application/octet-stream"
Error: no "view" mailcap rules found for type "application/octet-stream"
Opening "error.php" with Geany (application/x-php)
__
The errors above are not related to the question. But where the __ is, I see nothing. I know the prompt is there, because I can issue other commands, like ls, and they work. But the question remains: WHY is the prompt hidden? And how can I make it show normally?
Why isn't this command returning to shell after &?
It is.
You're running a command in the background. The shell prints a new prompt as soon as the command is launched, without waiting for it to finish.
According to your latest comment, the background command is printing some message to your screen. A simple example of the same thing:
$ echo hello &
$ hello
The cursor is left at the beginning of the line after the $ hello.
As far as the shell is concerned, it's printed a prompt and is waiting for a new command. It doesn't know or care that a background process has messed up your display.
One solution is to redirect the command's output somewhere other than your screen, either to a file or to /dev/null. If it's an error message, you'll probably have to redirect both stdout and stderr.
flock -nx "$1" xdg-open "$1" >/dev/null 2>&1 &
(This assumes you don't care about the content of the message.)
Another option, pointed out in a comment by alvits, is to sleep for a second or so after executing the command, so the message appears followed by the next shell prompt. The sleep command is executed in the foreground, delaying the printing of the next prompt. A simple example:
$ echo hello & sleep 1
hello
[1] + Done echo hello
$
or for your example:
flock -nx "$1" xdg-open "$1" & sleep 1
This assumes that the error message is printed within the first second. That's probably a valid assumption for your example, but it might not hold in general.
I don't think the command is doing what you think it does.
Have you tried running it twice, to see whether the lock cannot be obtained the second time?
If you do, you will see that it doesn't fail, because xdg-open forks to exec the editor. Also, if it did fail, you would expect some indication.
You should use something like this:
flock -nx "$1" -c "gedit '$1' &" || { echo "ERROR"; exit 1; }

Run a bash script via another bash script to delete a file is not working properly

I have a bash script start.sh which calls another run.sh, which takes me to another prompt where I have to delete a file file.txt and then exit out of that prompt.
When I call run.sh from inside start.sh, I see the prompt and I believe that it deletes the file.txt but the inner/new prompt waits for me to exit out of it while the script is running - meaning it needs intervention to proceed. How do I avoid it in bash?
In Python I can use Popen and get it going but not sure about bash.
EDIT: I would rather like to know what command to provide to exit out of the shell (generated from running "run.sh") so I can go back to the prompt where "start.sh" was started.
Etan: To answer your question
VirtualBox:~/Desktop/ > ./start
company@4d6z74d:~$    -> this is the new shell
company@4d6z74d:~$ logout    ---> I pressed Control-D here so the script could continue.
Relevant part of start.sh:
/../../../../run.sh (this is the one that takes us to the new $ prompt)
echo "Delete file.txt "
rm -f abc/def/file.txt
You can run run.sh in the background using &. In start.sh, you would invoke the script via /path/run.sh &. Now, start.sh will exit without waiting for run.sh to finish (which is running in the background).
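A minimal sketch of start.sh along those lines; the run.sh path and the file path are the illustrative ones from the question and answer, not real locations:
#!/bin/bash
# Launch the interactive script in the background so start.sh is not held up by its prompt
/path/run.sh &
echo "Delete file.txt"
rm -f abc/def/file.txt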

Writing shell script using another shell script

I'm trying to automate running of a shell script that would take some user inputs at various points of its execution.
The basic logic I have in mind is copied below, but this only handles one input. I want to run it recursively until the shell prompt is returned, after the original script completes its execution. I said recursively because the question that prompts for an input, and the input itself, will be the same every time.
#!/usr/bin/expect -f
spawn ./new.sh [lindex $argv 0]   ;# expect scripts read arguments from $argv, not $1
expect "Please enter input:"
send "my_input\r"                 ;# \r simulates pressing Enter
expect eof                        ;# wait for new.sh to finish
Sharing any short-cut/simple method to achieve this will be highly appreciated.
You don't need expect to do this - read can read from a pipe as well as from user input, so you can pass the input through a pipe to your script. Example script:
#!/bin/bash
read -p "Please enter input: " input
echo "Input: $input"
Running the script prompts for input as normal, but if you pipe to it:
$ echo "Hello" | bash my_script.sh
Input: Hello
You said that your input is always the same - if so, then you can use yes (which just prints a given string over and over) to pass your script the input repeatedly:
yes "My input" | sh my_script.sh
This would run my_script.sh, any read commands within the script will read "My input".
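If the prompts ever need different answers rather than the same one, the same pipe idea still works with printf, one line per read in the script; the answer strings here are placeholders:
# Each line is consumed by one read in the script, in order
printf 'first answer\nsecond answer\n' | bash my_script.sh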

bash show output only during runtime

I am trying to write a script that displays its output to the terminal only while it's running, much like the 'less' or 'ssh' commands.
When I launch the script, it should take over the whole terminal, print what it needs to print, and then, when I exit it, return me to my terminal, where the only record that my script has run is the line showing the command itself. When I scroll up, I don't want to see what my script printed.
[snoopdougg@machine /home/snoopdougg/logs]$ ls
alog.log blog.log clog.log myScript.sh
[snoopdougg@machine /home/snoopdougg/logs]$ whoami
snoopdougg
[snoopdougg@machine /home/snoopdougg/logs]$ ./myScript.sh
[snoopdougg@machine /home/snoopdougg/logs]$
(Like nothing ever happened... but myScript.sh would have printed things to the terminal while it was running.)
How can I do this?
You're talking about the alternate screen, which you can access with a pair of terminfo capabilities, smcup and rmcup. Put this in a script and run it for a small demo:
tput smcup    # switch to the alternate screen
echo hello
sleep 5
tput rmcup    # return to the normal screen; the demo output is gone from view
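Applied to the question, a hedged wrapper around the existing script could look like this; it assumes myScript.sh writes ordinary output to the terminal and does not manage the alternate screen itself:
#!/bin/bash
tput smcup                 # switch to the alternate screen
trap 'tput rmcup' EXIT     # switch back on exit, even if the script fails
./myScript.sh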
Use screen:
screen ./myScript.sh
