Difference between piping a file to sh and calling a shell file - shell

This is what I was trying to do:
$ wget -qO- www.example.com/script.sh | sh
which quietly downloads the script and prints it to stdout, which is then piped to sh. Unfortunately this doesn't quite work: it fails to wait for user input at various points, and there are a few syntax errors.
This is what actually works:
$ wget -qOscript www.example.com/script.sh && chmod +x ./script && ./script
But what's the difference?
I'm thinking maybe piping the file doesn't execute it as a whole, but rather feeds each line to the shell individually; I'm new to this kind of thing, though, so I don't know.

When you pipe to sh, stdin of that shell/script is the pipe. The script therefore cannot take user input from the console. When you run the script normally, stdin is the console, where you can enter input.
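A minimal demo of the difference (the file name demo.sh is just for illustration):
$ printf 'read -r name\necho "hello $name"\n' > demo.sh
$ sh demo.sh
(read waits here for your input; stdin is the terminal)
$ cat demo.sh | sh
(prints nothing; stdin is the pipe, so read consumes the script's own next line, the echo, and the shell then hits EOF)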

You might try telling the shell to be interactive:
$ wget -qO- www.example.com/script.sh | sh -i

I had the same issue, and after tinkering and googling, this is what worked for me.
wget -O - www.example.com/script.sh | sh

Related

Allow user input in second command in bash pipe

I'm looking for how I might allow user input in a second command in a bash statement and I'm not sure how to go about it. I'd like to be able to provide a one-liner for someone to be able to install my application, but part of that application process requires asking some questions.
The current script setup looks like:
curl <url/to/bootstrap.sh> | bash
and then bootstrap.sh does:
if [ "$UID" -ne 0 ]; then
    echo "This script requires root to run. Restarting the script under root."
    exec sudo "$0" "$@"
    exit $?
fi
git clone <url_to_repo> /usr/local/repo/
bash /usr/local/repo/.setup/install_system.sh
which in turn calls a python3 script that asks for input.
I know that the curl in the first line is consuming stdin, so what I'm asking for may be impossible, and it may have to be two lines to ever work:
wget <url/to/bootstrap.sh>
bash bootstrap.sh
You can restructure your script to run this way:
bash -c "$(curl -s http://0.0.0.0//test.bash 2>/dev/null)"
foo
wololo:a
a
My test.bash is really just
#!/bin/bash
echo foo
python -c 'x = raw_input("wololo:");print(x)'
to demonstrate that stdin can still be read this way. Sure, it creates a subshell to take care of curl, but because the command substitution downloads the whole script before bash -c runs it, the script is passed as an argument rather than occupying stdin, so you can keep reading from stdin as well.

Print all script output to file from within another script

English is not my native language, please accept my apologies for any language issues.
I want to execute a script (bash/sh) via cron that performs various maintenance actions, including backups. This script will execute other scripts, one per function, and I want everything that is printed to be saved to a separate file for each script executed.
The problem is that these other scripts run commands like duplicity, certbot and maldet. The echo lines in each script are written to the file, but the output of the duplicity, certbot and maldet commands is not!
I want to avoid having to put "| tee --append" or another command on every line. But even doing this on each line, the child scripts do not write to the log file. Ideally, the parent script could specify which file each script prints to.
Does not work:
sudo bash /duplicityscript > /path/log
or
sudo bash /duplicityscript >> /path/log
sudo bash /duplicityscript | sudo tee --append /path/log > /dev/null
or
sudo bash /duplicityscript | sudo tee --append /path/log
Using exec (like this):
exec > >(tee -i /path/log)
sudo bash /duplicityscript
exec > >(tee -i /dev/null)
Example:
./maincron:
sudo ./duplicityscript > /myduplicity.log
sudo ./maldetscript > /mymaldet.log
sudo ./certbotscript > /mycertbot.log
./duplicityscript:
echo "Exporting Mysql/MariaDB..."
{dump command}
echo "Exporting postgres..."
{dump command}
echo "Start duplicity data backup to server 1..."
{duplicity command}
echo "Start duplicity data backup to server 2..."
{duplicity command}
In the log file, this will print:
Exporting Mysql/MariaDB...
Exporting postgres...
Start duplicity data backup to server 1...
Start duplicity data backup to server 2...
In the example above, the echo lines in each script are saved in the log file, but the output of the duplicity and dump commands is printed on the screen and not in the log file.
I did some googling and even found this topic, but I could not adapt it to my needs.
It is not a problem if the output is also printed on the screen, as long as all of it ends up in the file.
Try adding 2>&1 at the end of the line; tools like duplicity and certbot typically write their progress to stderr, which a plain > redirection does not capture. You can also run the script with sh -x to see what is causing the issue.
Hope this helps
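A sketch of how the parent script from the example could look with stderr captured as well (the 2>&1 is the important addition; the paths are the question's own):
./maincron:
sudo ./duplicityscript > /myduplicity.log 2>&1
sudo ./maldetscript > /mymaldet.log 2>&1
sudo ./certbotscript > /mycertbot.log 2>&1
Or, to keep the output on screen at the same time:
sudo ./duplicityscript 2>&1 | sudo tee --append /myduplicity.log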

Bash script - Run commands that correspond to the lines of a file

I have a file like this (text.txt):
ls -al
ps -au
export COP=5
clear
Each line corresponds to a command. In my script, I need to read each line and launch each command.
PS: I tried all these options, and with all of them I have the same problem with the export command. The file contains "export COP=5", but after running the script, if I do echo $COP in the same terminal, no value is displayed.
while IFS= read -r line; do eval "$line"; done < text.txt
Be careful with this: using eval is generally not advised, as it is quite powerful and easy to abuse.
However, if there is no risk of unprivileged users influencing text.txt, it should be OK.
cat text.txt | xargs -L1 bash -c '"$@"' echo
(xargs passes each line as the arguments of one bash -c call; inside it, "$@" expands to those words and runs them as a command)
In order to avoid confusion I would simply rename the file from text.txt to text and add a shebang (e.g. #!/bin/bash) as the first line of the file. Make sure it is executable by calling chmod +x text. Afterwards you can execute it as expected.
$ cat text
#!/bin/bash
ls -al
ps -au
clear
$ chmod +x text
$ ./text
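A note on the export problem from the question: every approach above runs the commands in a child process, and a child process can never change the environment of the terminal that started it. For export COP=5 to be visible afterwards, the file has to be sourced into the current shell rather than executed:
$ source ./text.txt    # POSIX equivalent: . ./text.txt
$ echo $COP
5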

bash read is being skipped when run from curl pipe

I'm building a bootstrap for a github project and would like it to be a simple one-liner. The script requires a password input.
This works, and pauses the script to wait for input:
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh -o setup.sh
bash setup.sh
This does not, and just skips over the input request:
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh | bash
setup.sh contains code something like:
# code before
read -p "Password:" -s password
# code after
Is it possible to have a clean one-liner? If so, how might one do it?
Workaround:
Use three commands instead of piping output.
curl -s https://raw.github.com/willfarrell/.vhosts/master/setup.sh -o vhosts.sh && bash vhosts.sh && rm vhosts.sh
I had the exact same problem as the OP and was looking for an answer. This question was one of the first hits on Google for me, and since it didn't have a real answer yet, here's the command I eventually stumbled upon, which solved my need to use read in a remote script.
bash <(curl -s https://example.com/my-bash-script.sh)
With the pipe, read reads from standard input (the pipe), but the shell has already read all of the standard input, so there isn't anything left for read to read.
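If you control the script itself, there is another way to keep the one-liner: have read take its input explicitly from the controlling terminal rather than from stdin. A minimal sketch, changing only the read line of the snippet above:
# code before
read -p "Password:" -s password < /dev/tty
# code after
Because /dev/tty always refers to the terminal of the current session, this works even when the script body arrives through a pipe (though it will still fail where no terminal exists at all, e.g. under cron).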

Bash script "read" not pausing for user input when executed from SSH shell

I'm new to Bash scripting, so please be gentle.
I'm connected to an Ubuntu server via SSH (PuTTY). When I run this command, I expect the bash script that it downloads and executes to wait for user input and then echo that input. Instead, it just writes out the prompt label for the input request and terminates.
wget -O - https://raw.github.com/aaronhancock/pub/master/bash/readtest.sh | bash
Any clue what I might be doing wrong?
UPDATE: This bash command does exactly what I wanted
bash <(wget -q -O - https://raw.github.com/aaronhancock/pub/master/bash/readtest.sh)
Jonathan already mentioned it: bash takes its stdin from the pipe, and therefore you cannot pipe the script into bash when you want to input something interactively. But you can use the process substitution feature of bash (assuming your login shell is bash):
bash <(wget -O - https://raw.github.com/aaronhancock/pub/master/bash/readtest.sh)
Bash is taking stdin from the pipe, not from the terminal. So you can't pipe a script to bash and still use the "read" command for user input.
Notice that you have the same problem if you save the script to a local file and pipe it to bash:
less readtest.sh | bash
I found this also works and helps keep the data in the current scope, since eval runs the downloaded text in the shell you are already in (note the command substitution around wget, which fetches the whole script before eval executes it):
eval "$(wget -q -O - https://raw.github.com/aaronhancock/pub/master/bash/readtest.sh)"
