Bash scripted curl commands producing different results than manual runs

I have a text file of roughly 900 cURL commands to run. They are pretty hairy, full of quotes, apostrophes, and other special characters.
To run them I have been trying to create a bash script to loop through the list:
#!/bin/sh
OLDIFS=$IFS
IFS="&&&"
echo "getting started"
cat staging_curl_script | while read line
do
    $line
done
echo "done"
Unfortunately I have hit an unusual issue: commands that run fine at the command prompt return a "file name too long" error when run from the script. I echoed these commands from the script and compared them to the manually run versions, and they are identical.
Any idea why I am seeing different results?

Silly mistake here: I needed bash -c "$line" instead of the bare $line.
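A minimal sketch of what the corrected loop looks like (same staging_curl_script file as above; read -r is an addition so backslashes inside the stored commands survive):

#!/bin/sh
echo "getting started"
while read -r line; do
    bash -c "$line"    # hand each stored command to a fresh shell for a full parse
done < staging_curl_script
echo "done"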

UNIX shell script do loop execute commands

In general, I don't understand how to make commands in a UNIX shell script do loop work the same way they work directly from the command line (using bash).
As a simple test, a script called looping.sh to execute an SQL script (what's in filelist.txt doesn't matter in this case):
for i in $(cat filelist.txt)
do $(sqlplus DB_USER/password@abc @test.sql)
done
results in
looping.sh: line 2: SQL*Plus:: command not found
for each line in filelist.txt. Other variations on the 2nd line don't work, like putting it in quotes etc.
Or, if filelist.txt has names of other sh scripts (let's say a single line in this case, called_file1.sh) and I want to execute it:
for i in $(cat filelist.txt)
do exec $i
done
results in
: not found line 2: exec: called_file1.sh
The files are all in the same folder. I tried variations for the second line like /bin/sh $i, putting it in quotes and so on. What's the magic way to execute a command in the do loop?
$(...) takes its contents, runs them as a command, and then returns the output of that command.
So when you write:
for i in $(cat filelist.txt)
do $(sqlplus DB_USER/password@abc @test.sql)
done
what the shell does when it hits the body of the loop is run sqlplus DB_USER/password@abc @test.sql, take the output from that command (whatever it may be), and replace the $(...) bit with it. So you end up with a loop that looks like this (not exactly, since the substitution happens again on every iteration, but for the sake of illustration):
for i in $(cat filelist.txt)
do <output of 'sqlplus DB_USER/password@abc @test.sql' command>
done
and if that output isn't a valid shell command you are going to get an error.
The solution there is to not do that. You don't want the wrapping $() there at all.
for i in $(cat filelist.txt)
do sqlplus DB_USER/password@abc @test.sql
done
In your second example:
for i in $(cat filelist.txt)
do exec $i
done
you are telling the shell that the filename in $i is something that it should try to execute like a binary or executable shell script.
In your case two things are happening. The filename in $i can't be found, and (this is harder to notice) the filename in $i has a carriage return at the end (the file probably has DOS line endings). That's why the error message looks more garbled than normal. (I actually wonder about that, since I wouldn't have expected it from an unquoted $i, only from a quoted "$i", but I might just be wrong about that.)
So, for this case, you need to strip the carriage returns from the file (see point 1 of the "Before asking about problematic code" section of the bash tag info wiki for more about this), and then you need to make sure the filename is an executable script and that you have the correct path to it.
Oh, also, exec never returns, so that loop will only ever execute one file.
If you want multiple executions then drop exec.
That all being said, Don't Read Lines With For. See Bash FAQ 001 for how to correctly (and safely) read lines from a file, as sketched below.
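As a sketch, the FAQ pattern applied to the second example (assuming filelist.txt holds one script name per line, with Unix line endings):

while IFS= read -r file; do
    sh "$file"    # no exec, so the loop goes on to the next file
done < filelist.txt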

Reading input while also piping a script via stdin

I have a simple Bash script:
#!/usr/bin/env bash
read X
echo "X=$X"
When I execute it with ./myscript.sh it works. But when I execute it with cat myscript.sh | bash it actually puts echo "X=$X" into $X.
So this script prints hello world when executed with cat myscript.sh | bash:
#!/usr/bin/env bash
read X
hello world
echo "$X"
What's the benefit of executing a script with cat myscript.sh | bash? Why doesn't it do the same thing as when I execute it with ./myscript.sh?
How can I stop Bash from executing line by line, and instead have it execute all lines once stdin reaches the end?
Instead of just running
read X
...replace it with...
read X </dev/tty || {
    X="some default because we can't read from the TTY here"
}
...if you want to read from the console. Of course, this only works if you have a /dev/tty, but if you wanted to do something robust, you wouldn't be piping from curl into a shell. :)
Another alternative, of course, is to pass in your value of X on the command line.
curl https://some.place/with-untrusted-code-only-idiots-will-run-without-reading \
| bash -s "value of X here"
...and refer to "$1" in your script when you want X.
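For instance, a minimal sketch of a script written for that bash -s form (the default value here is made up):

#!/usr/bin/env bash
X="${1:-some default}"    # first positional argument stands in for the read
echo "X=$X"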
(By the way, I sure hope you're at least using SSL for this, rather than advising people to run code they download over plain HTTP with no out-of-band validation step. Lots of people do it, sure, but that's making sites they download from -- like rvm.io -- big targets. Big, easy-to-man-in-the-middle-or-DNS-hijack targets).
When you cat a script to bash the code to execute is coming from standard input.
Where does read read from? That's right: also standard input. This is why you can cat input to programs that take standard input (like sed, awk, etc.).
So you are not running "a script" per se when you do this. You are running a series of input lines.
Where would you like read to read data from in this setup?
You can point read somewhere else manually (if you can define such a place). Alternatively, you can stop running your script like this.
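One such place, as a sketch: run the script through process substitution instead of a pipe, so standard input stays attached to the terminal and read behaves normally (the URL is a placeholder):

bash <(curl -fsSL https://example.com/installer.sh)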

Shell script - check the syntax

How can I check the correctness of the syntax of a ksh shell script without executing it? To make my point clear: in Perl we can execute the command:
perl -c test_script.pl
to check the syntax. Is something similar to this available in ksh?
ksh -n
Most of the Bourne shell family accepts -n; tcsh as well.
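For example, to parse a script without running anything in it:

ksh -n test_script.sh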
I did a small test with the following code:
#!/bin/bash
if [ -f "buggyScript.sh" ; then
    echo "found this buggy script"
fi
Note the missing ] in the if. Now I entered
bash -n buggyScript.sh
and the missing ] was not detected.
The second test script looked like this:
#!/bin/bash
if [ -f "buggyScript.sh" ]; then
    echo "found this buggy script"
Note the missing fi at the end of the if. Testing this with
bash -n buggyScript.sh
returned
buggyScript.sh: line 5: syntax error: unexpected end of file
Conclusion:
Testing the script with the -n option detects some errors, but by no means all of them. So I guess you can only really find all errors by executing the script.
The tests that you say failed to detect syntax errors were not in fact syntax errors...
echo is a command (OK, a builtin, but still a command), so ksh/bash are not going to check the spelling/syntax of your command.
Similarly, "[" is effectively an alias for the test command, and that command expects the closing bracket "]" as part of its own syntax, not ksh/bash's.
So -n does what it says on the tin; you just haven't read the tin correctly! :-)
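A quick way to see the distinction, using the first buggy script above (the exact wording of the runtime error may vary by shell):

bash -n buggyScript.sh    # prints nothing: the file parses fine
bash buggyScript.sh       # fails at run time with something like: [: missing ']'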

exec line from file in bash

I'm trying to read commands from a text file and execute each line from a bash script.
#!/bin/bash
while read line; do
    $line
done < "commands.txt"
In some cases, if $line contains a command that is meant to run in the background, e.g. command 2>&1 &, it will not start in the background and will run in the current script context.
Any idea why?
If all your commands are inside commands.txt, you can essentially call it a shell script. That means you can either source it, run it directly after a chmod u+x commands.txt, or simply execute it with sh commands.txt.
I don't have anything to add to ghostdog74's answer about the right way to do this, but I can cover why it's failing: the shell parses I/O redirections, backgrounding, and a bunch of other things before it does variable expansion, so by the time $line is replaced by command 2>&1 &, it's too late to recognize 2>&1 and & as anything other than arguments to command.
You could improve this by using eval "$line", but even then you'll run into problems with multiline commands (e.g. while loops, if blocks, etc.). The source and sh approaches don't have this problem.
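A minimal sketch of the eval variant (same commands.txt as above; constructs spread across several lines of the file still won't work):

#!/bin/bash
while IFS= read -r line; do
    eval "$line"    # re-parses the line, so 2>&1 and & are honored again
done < "commands.txt"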
