I am running into an issue trying to echo some strings into files inside a shell script.
I am calling the shell script through pyspark/spark's rdd.pipe() command, and I checked to make sure the input is coming through by echoing each line in the shell script.
Here is the shell script code:
#!/bin/sh
while read -r one; do
    read -r two
    read -r three
    read -r four
    read -r five
    read -r six
    read -r seven
    read -r eight
    echo -e "$one\n$two\n$three\n$four\n" >> 1.txt
    echo -e "$five\n$six\n$seven\n$eight\n" >> 2.txt
done
I ran the echo command WITHOUT redirecting to a file, and its output showed up back in my Spark program. The input to the shell script is just strings. Does anyone have ideas why 1.txt and 2.txt aren't being written?
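One way to narrow this down (a sketch only; the absolute /tmp paths, the hostname/pwd logging, and the switch to printf are my additions, not part of the original setup) is to write to absolute paths and log to stderr where the script actually runs, since stderr is typically not part of the data piped back to Spark:
#!/bin/sh
while read -r one; do
    read -r two; read -r three; read -r four
    read -r five; read -r six; read -r seven; read -r eight
    # stderr should end up in the executor log, not in the piped RDD output
    echo "writing from $(hostname):$(pwd)" >&2
    # printf '%s\n' prints each argument on its own line and, unlike echo -e,
    # behaves the same under any /bin/sh
    printf '%s\n' "$one" "$two" "$three" "$four" >> /tmp/1.txt
    printf '%s\n' "$five" "$six" "$seven" "$eight" >> /tmp/2.txt
done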
I'm trying to run this command on multiple machines:
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@x.x.x.x "mkdir test"
The IPs are stored in the following .txt file
$ cat ips.txt
10.0.2.15
10.0.2.5
I created a bash script that reads this file line by line. If I run it with an echo:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
echo "$line"
#sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It prints every line:
$ ./variables.sh
10.0.2.15
10.0.2.5
This makes me understand that the script is working as intended. However, when I replace the echo line with the command I want to run for each line:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
#echo "$line"
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It only performs the action for the first IP in the file, then stops. Why?
Managed to solve this by using a for loop instead of a while loop. The script ended up looking like this:
for file in $(cat ips.txt)
do
    sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$file "mkdir test"
done
While your example works as a solution, it doesn't explain the behaviour.
You can find the explanation here: ssh breaks out of while-loop in bash
In short:
The while loop keeps reading from the file descriptor set up in the loop header ($input in your case).
ssh (or sshpass) reads data from stdin, which in your case has been redirected to that same $input. That is what hides the problem: we simply don't expect ssh to read the data.
To see the same surprise elsewhere, try commands like ffmpeg or mplayer inside a while loop. Both read the keyboard (stdin) while running, so they too will drain the loop's file descriptor.
Another good and fun example:
#!/bin/bash
{
    echo first
    for ((i=0; i < 16384; i++)); do echo "testing"; done
    echo "second"
} > test_file

while IFS= read -r line
do
    echo "Read $line"
    cat | uptime > /dev/null
done < test_file
In the first part we write the first line: first
then 16384 lines: testing
then the last line: second
Those 16384 "testing" lines add up to a 128 KB buffer.
In the second part, the command cat | uptime consumes exactly that 128 KB buffer, so the script prints only
Read first
Read second
As a solution you can, as you did, use a for loop.
Or use ssh -n so that ssh does not read from stdin.
Or play with a separate file descriptor; you can find an example in the link above.
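A minimal sketch of those last two options, reusing ips.txt from the question:
#!/bin/bash
input="ips.txt"

# Option 1: -n stops ssh from reading stdin, so it cannot eat the loop's input.
while IFS= read -r line; do
    sshpass -p 'nico' ssh -n -o 'StrictHostKeyChecking=no' nico@"$line" "mkdir test"
done < "$input"

# Option 2: feed the loop through file descriptor 3 instead of stdin.
while IFS= read -r -u 3 line; do
    sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@"$line" "mkdir test"
done 3< "$input"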
I want to make a simple bash script that loops over a file of commands, executes each one, and stops when an error happens. I have something like this:
#!/bin/bash
while IFS= read -r line; do
    echo "$line"
    output=$(eval "$line")
    if [ $? -eq 0 ]
    then
        echo ok
    else
        echo "$output"
        break
    fi
    echo
done < summary.txt
The problem is that the first command I need to run is a sudo command, so I have to supply the password. I tried putting it into the command like
sudo -S <<< password <<< Y command
with no luck. I've checked that it works if I put the command directly in the script instead of reading it from the file (i.e. not as a string). The thing is that without the loop the script would be long and wouldn't make much sense.
Thanks
echo <password> | sudo -S <your command>
From man sudo
-S, --stdin
Write the prompt to the standard error and read the password from the standard input instead of using the terminal
device. The password must be followed by a newline character.
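Wired into the loop from the question, that could look something like this (a sketch; it assumes the password is read once into a variable and that the lines in summary.txt call sudo -S rather than plain sudo):
#!/bin/bash
# Ask for the password once, without echoing it, instead of storing it in the file
read -r -s -p "sudo password: " sudo_pw < /dev/tty
echo

while IFS= read -r line; do
    echo "$line"
    # The password goes to the command's stdin, so a sudo -S inside "$line" can pick it up;
    # lines that don't read stdin simply discard it.
    if output=$(echo "$sudo_pw" | eval "$line" 2>&1); then
        echo ok
    else
        echo "$output"
        break
    fi
    echo
done < summary.txt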
In a script.sh,
source a.sh
source b.sh
CMD1
CMD2
CMD3
how can I replace the source *.sh with their content (without executing the commands)?
I would like to see what the bash interpreter executes after sourcing the files and expanding all variables.
I know I can use set -n -v or run bash -n -v script.sh 2>output.sh, but that does not replace the source commands (let alone expand any variables that a.sh or b.sh define).
I thought of using a subshell, but that still doesn't expand the source lines. I tried combinations of set +n +v and set -n -v before and after the source lines, but that still does not work.
I'm going to send that output to a remote machine using ssh.
I could use < output.sh to feed the content into the ssh command, but I can't log in as root on the remote machine; I am, however, a sudoer.
Therefore, I thought I could create the script and send it as a base64-encoded string (using that clever trick):
base64 script | ssh remotehost 'base64 -d | sudo bash'
Is there a solution?
Or do you have a better idea?
You can do something like this:
inline.sh:
#!/usr/bin/env bash
while IFS= read -r line; do
    if [[ "$line" =~ ^(\.|source)[[:space:]]+ ]]; then
        file="$(echo "$line" | cut -d' ' -f2)"
        cat "$file"
    else
        echo "$line"
    fi
done < "$1"
Note this assumes the sourced files exist and doesn't handle errors. You should also handle possible shebangs in the sourced files. If the sourced files themselves contain source lines, you need to apply the script recursively, e.g. something like (untested):
while grep -Eq '^(source|\.)' main.sh; do
    bash inline.sh main.sh > main.sh.tmp && mv main.sh.tmp main.sh
done
Let's test it
main.sh:
source a.sh
. b.sh
echo cc
echo "$var_a $var_b"
a.sh:
echo aa
var_a="stack"
b.sh:
echo bb
var_b="overflow"
Result:
bash inline.sh main.sh
echo aa
var_a="stack"
echo bb
var_b="overflow"
echo cc
echo "$var_a $var_b"
bash inline.sh main.sh | bash
aa
bb
cc
stack overflow
BTW, if you just want to see what bash executes, you can run
bash -x [script]
or remotely
ssh user@host -t "bash -x [script]"
I'm trying to read the full stdin into a variable:
script.sh
#!/bin/bash
input=""
while read -r line
do
    echo "$line"
    input="$input""\n""$line"
done < /dev/stdin
echo "$input" > /tmp/test
When I run ls | ./script.sh or most other commands, it works fine.
However, it doesn't work when I run cat | ./script.sh, type my message, and then hit Ctrl-C to exit cat.
Any ideas ?
I would stick to the one-liner
input=$(cat)
Of course, use Ctrl-D (not Ctrl-C) to signal end-of-file: Ctrl-C sends SIGINT to the whole foreground pipeline, so your script is killed along with cat before it gets to write /tmp/test.
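Dropped into the original script, it would look something like this (a sketch, keeping /tmp/test from the question):
#!/bin/bash
# Read all of stdin into the variable; embedded newlines are preserved,
# only trailing newlines are stripped by the command substitution.
input=$(cat)

echo "$input"
echo "$input" > /tmp/test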
I have a file like this (text.txt):
ls -al
ps -au
export COP=5
clear
Each line corresponds to a command. In my script, I need to read each line and launch each command.
PS: I tried all these options, and with all of them I have the same problem with the export command: the file contains export COP=5, but after running the script, if I do echo $COP in the same terminal, no value is displayed.
while IFS= read -r line; do eval "$line"; done < text.txt
Be careful with this: using eval is generally not advised, since it is quite powerful and just as easy to abuse.
However, if unprivileged users have no influence over text.txt, it should be OK.
cat test.txt | xargs -l1 bash -c '"$@"' echo
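(Here xargs starts a fresh bash -c for each line; the trailing echo only fills the $0 slot, so "$@" expands to that line's words.)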
In order to avoid confusion I would simply rename the file from text.txt to text and add a shebang (e.g. #!/bin/bash) as the first line of the file. Make sure it is executable by calling chmod +x text. Afterwards you can execute it as expected.
$ cat text
#!/bin/bash
ls -al
ps -au
clear
$ chmod +x text
$ ./text
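One caveat that applies to every approach above, and that probably explains the PS about export COP=5: the commands run in a child process, so exported variables vanish when it exits. If the variable should survive in the calling shell, the file has to be read by the current shell instead, e.g.:
# Run the commands in the current shell so "export COP=5" persists there
source ./text    # or:  . ./text
echo "$COP"      # now prints 5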