I'm trying to add a one-line perl command into an expect script.
For the purposes of this thread, I've boiled my script down to two lines:
#!/usr/bin/expect -f
/usr/bin/perl -i -pe 's/\015/\012/g' 'test.txt'
If I execute just the perl line in the Terminal (Mac), it runs fine. However, my expect script yields the error
error "invalid command name \"/usr/bin/perl\"
I am a novice at shell scripts. Hopefully this is an easy solution, but expect documentation seems sparse on the web.
My other option is to just keep the 2 scripts separate.
Expect is an extension of Tcl, so every line of an Expect script must be a valid Tcl command; a bare shell command line like yours is not, hence "invalid command name". What you need is the exec command. Note also that Tcl does not treat single quotes as quoting characters, so put braces around the Perl code instead; braces also keep Tcl from interpreting \015 and \012 as backslash escapes before Perl ever sees them. To execute your command in a subprocess and store the result, you would use a construct like this:
set result [exec /usr/bin/perl -i -pe {s/\015/\012/g} test.txt]
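Putting that together with some error handling, a minimal sketch of the whole script might look like this (assuming test.txt is in the working directory):
#!/usr/bin/expect -f
# exec raises a Tcl error if perl exits non-zero; catch traps it
if {[catch {exec /usr/bin/perl -i -pe {s/\015/\012/g} test.txt} result]} {
    puts "perl failed: $result"
    exit 1
}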
I have a simple Bash script:
#!/usr/bin/env bash
read X
echo "X=$X"
When I execute it with ./myscript.sh it works. But when I execute it with cat myscript.sh | bash it actually puts echo "X=$X" into $X.
So this script prints hello world when executed with cat myscript.sh | bash:
#!/usr/bin/env bash
read X
hello world
echo "$X"
What's the benefit of executing a script with cat myscript.sh | bash? Why doesn't it do the same thing as when I execute it with ./myscript.sh?
How can I keep Bash from executing line by line, and instead have it execute all the lines only after stdin has reached the end?
Instead of just running
read X
...replace it with...
read X </dev/tty || {
X="some default because we can't read from the TTY here"
}
...if you want to read from the console. Of course, this only works if you have a /dev/tty, but if you wanted to do something robust, you wouldn't be piping from curl into a shell. :)
Another alternative, of course, is to pass in your value of X on the command line.
curl https://some.place/with-untrusted-code-only-idiots-will-run-without-reading \
| bash -s "value of X here"
...and refer to "$1" in your script when you want X.
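On the script side, that might look like this (a sketch, with a hypothetical default value):
# take X from the first command-line argument, with a fallback
X="${1:-some default}"
echo "X=$X"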
(By the way, I sure hope you're at least using SSL for this, rather than advising people to run code they download over plain HTTP with no out-of-band validation step. Lots of people do it, sure, but that's making sites they download from -- like rvm.io -- big targets. Big, easy-to-man-in-the-middle-or-DNS-hijack targets).
When you cat a script to bash, the code to execute comes from standard input.
Where does read read from? That's right: also standard input. This is why you can cat input to programs that take standard input (like sed, awk, etc.).
So you are not running "a script" per se when you do this. You are running a series of input lines.
Where would you like read to get its data from in this setup?
If you can define such a place, you can point read at it manually. Alternatively, you can stop running your script like this.
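As for executing all lines only after stdin has been read to the end: one common pattern (a sketch, not taken from the answers above) is to wrap the whole script body in a function, so bash has to parse everything before the last line runs it:
#!/usr/bin/env bash
main() {
    # read still needs /dev/tty: stdin is busy delivering the script itself
    read X </dev/tty
    echo "X=$X"
}
main "$@"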
Is there a way to make a perl one-liner into a bash function?
#!/bin/bash
# ~/.bashrc:
stopwatch() {
perl -wE 'for (reverse 1..(shift)-1) {system q!clear!;open FIGLET,q!|figlet -f banner -c!;printf FIGLET "%2d:%02d",$_/60,$_%60;sleep 1}' "$1"
}
source-ing ~/.bashrc complains as follows:
unexpected EOF while looking for matching `''
syntax error near unexpected token `reverse'
and so on.
The usual shell wrapping works, of course, but here I'm trying to have a bash alias/function invoke perl.
There must be a way to do it without creating a brand-new *.pl file. Much appreciated!
You may also try running bash and perl scripts together in a single file, in the way shown below.
Moreover, in your case -E is what I suspect: perl -e always works, but -E was only added in Perl 5.10, so it doesn't work in Perl 5.8.8.
See this code for combining perl with bash:
#!/bin/bash
# bash_test
echo "bash commands that you wish to write"
exit 0
# End of Bash part of the script.
# =======================================================
#!/usr/bin/perl
# This part of the script must be invoked with perl's -x option,
# which skips everything up to the #!/usr/bin/perl line.
print "here you can use your simple system() or exec() that you might wish to use right?";
# End of Perl part of the script.
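Invocation would then look like this, assuming the file above is saved as bash_test:
bash bash_test     # runs only the bash part; exit 0 stops before the perl part
perl -x bash_test  # -x makes perl skip everything up to the #!/usr/bin/perl line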
I'm trying to execute a perl script as a passed argument from the command line. I compiled a C file and named it "Test", so to pass an argument I try this:
>Test perl -e "print qq{A\n}x500"
which I intend to mean: run the Test file and pass it 500 A's. But it doesn't seem to be working.
Why do you think it should work? It runs Test and passes 3 arguments to it: perl, -e, and "print qq{A\n}x500". To pass perl's output instead, in bash it would be:
Test `perl -e "print qq{A\n}x500"`
For Windows, there is no simple way to get a program's output into a variable or pass it to another command directly.
See this post; it describes how to set a command's output to a variable.
Try using a pipe; you were literally passing the words perl -e "print qq{A\n}x500" to Test as its arguments.
Example of using a pipe :
perl -e "print qq{A\n}x500" | Test
I'm trying to read commands from a text file and execute each line from a bash script.
#!/bin/bash
while read line; do
$line
done < "commands.txt"
In some cases, if $line contains a command that is meant to run in the background, e.g. command 2>&1 &, it will not start in the background; it runs in the foreground of the current script instead.
Any idea why?
If all your commands are inside "commands.txt", it is essentially a shell script. So you can either source it, run it with sh commands.txt, or chmod u+x commands.txt and execute it directly.
I don't have anything to add to ghostdog74's answer about the right way to do this, but I can cover why it's failing: the shell parses I/O redirections, backgrounding, and a bunch of other things before it does variable expansion, so by the time $line is replaced by command 2>&1 &, it's too late to recognize 2>&1 and & as anything other than plain arguments to command.
You could improve this by using eval "$line", but even then you'll run into problems with multiline commands (e.g. while loops, if blocks, etc.). The source and sh approaches don't have this problem.
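A short sketch of both approaches mentioned above, assuming the file is named commands.txt:
#!/bin/bash
# eval re-parses each line, so 2>&1 and & are honored again;
# multiline constructs in commands.txt will still break.
while IFS= read -r line; do
    eval "$line"
done < commands.txt
# Simpler and more robust: treat the file as a script.
# source ./commands.txt   # runs in the current shell
# sh commands.txt         # runs in a child shell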