Bash for loop works on the command line, but fails in a script - bash

When I run a for statement on the Debian bash command line, it works fine.
But when I run it in an sh script, or run it with the bash command, it keeps reporting "error near unexpected token `do'".
What is the difference?
[leon#www] ~/tmp $ for i in {1..10}; do echo $i; done
1
2
3
4
5
6
7
8
9
10
[leon#www] ~/tmp $ bash for i in {1..10}; do echo $i; done
-bash: syntax error near unexpected token `do'
BTW, everything works fine in a CentOS environment.

Use the -c option so that bash reads the commands from the string you pass in. Also, use single quotes around the command.
bash -c 'for i in {1..10}; do echo $i; done'

Your bash command line ends at the first ;
so it gets executed separately as:
bash for i in {1..10};
do echo $i;
done
and man bash says the command argument should be a file to load: bash [options] [file]

You can either wrap your whole script in quotes or put it in a file. As written, you're running bash for i in {1..10}, then do echo $i, and so on. Use the -c option if you don't put it in a file.
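For example, a minimal sketch of the file route (the name loop.sh is just an illustration):
# loop.sh -- the same loop, saved to a file
for i in {1..10}; do echo "$i"; done
Then run it with:
bash loop.sh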

Related

Bash script working locally but returning syntax error in CI

On my GitLab CI I am running the following simple script (.gitlab-ci.yml):
STR=$(cat $FILE)
if grep -q "substring" <<< "$STR"; then echo "ok"; fi
Unfortunately this gives me the error
/bin/sh: eval: line 100: syntax error: unexpected redirection
Running the same command locally as a script works as expected:
#!/bin/sh
FILE="./file.txt"
STR=$(cat $FILE)
if grep -q "substring" <<< "$STR"; then
echo "ok"
fi
The file has the content:
This has a substring somewhere
/bin/sh is not bash, and <<< (a here-string) is a bash extension that is not available in every shell. Either install bash, change the shebang to /bin/bash, and make sure the script is run under bash, or use POSIX-compatible syntax: printf "%s\n" "$str" | grep...
Note: UPPER CASE VARIABLES are by convention reserved for exported variables, like IFS, COLUMNS, PWD, UID, EUID, LINES, etc. Use lower case variables in your scripts.
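For reference, a POSIX sh version of the script might look like this (a sketch, reusing file.txt from the question and lower-case variable names as suggested above):
#!/bin/sh
file="./file.txt"
str=$(cat "$file")
if printf '%s\n' "$str" | grep -q "substring"; then
    echo "ok"
fi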

Inline unix command as filename [duplicate]

This question already has answers here:
Why does my Bash code fail when I run it with 'sh'?
I have the following bash script test.sh (with execution permissions):
#!/bin/sh
CAT_BIN="cat"
"$CAT_BIN" <(tail -n +2 test.sh)
It gives me this error when I run it:
$ ./test.sh
./test.sh: line 4: syntax error near unexpected token `('
./test.sh: line 4: `"$CAT_BIN" <(tail -n +2 test.sh)'
However, when I enter the following commands at the prompt, they execute fine.
$ CAT_BIN="cat"
$ "$CAT_BIN" <(tail -n +2 test.sh)
How can I make this work in a script? (That is, use <(tail -n +2 test.sh) inline as a filename argument.)
The <(tail -n +2 test.sh) construct is a bash feature, so you need to run your script in the bash shell.
Replace your top line
#!/bin/sh
with
#!/bin/bash
(Or the proper path to the bash executable if it is not /bin/bash on your system.)
Note that even if /bin/sh is e.g. a symlink to bash, running it as /bin/sh starts bash in POSIX compatibility mode, and many bash-specific features will not be available.
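If you have to stay on /bin/sh, a plain pipe covers this particular case (a sketch; it is not a general replacement for process substitution, for example when a command needs a real filename argument):
#!/bin/sh
CAT_BIN="cat"
# cat reads from stdin when given no file arguments
tail -n +2 test.sh | "$CAT_BIN"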

How to pass an argument in a bash pipe from the terminal

I have a bash script, shown below, in a file called test.sh:
#!/usr/bin/env bash
echo $1
echo "execution done"
When I execute this script using
Case-1
./test.sh "started"
started
execution done
it shows the output properly.
Case-2
If I execute it with
bash test.sh "started"
I'm getting the output as
started
execution done
But I would like to execute this using a cat or wget command, with arguments.
For example:
Q1
cat test.sh |bash
Or using a command
Q2
wget -qO - "url contain bash" |bash
So in Q1 and Q2, how do I pass arguments?
Something similar to what is shown in this GitHub repo:
https://github.com/creationix/nvm
Please refer to the installation script.
$ bash <(curl -Ls url_contains_bash_script) arg1 arg2
Explanation:
$ echo -e 'echo "$1"\necho "done"' >test.sh
$ cat test.sh
echo "$1"
echo "done"
$ bash <(cat test.sh) "hello"
hello
done
$ bash <(echo -e 'echo "$1"\necho "done"') "hello"
hello
done
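If you want to keep the pipe form from Q1 and Q2, bash can also read the script from standard input with -s, and anything after -s (or after --) becomes the positional parameters. A sketch, reusing test.sh from above:
cat test.sh | bash -s -- "hello"
wget -qO - "url contain bash" | bash -s -- arg1 arg2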
You don't need to pipe to bash; bash runs as standard in your terminal.
If I have a script and I have to use cat, this is what I'll do:
cat script.sh > file.sh; chmod 755 file.sh; ./file.sh arg1 arg2 arg3
script.sh is the source script. You can replace that call with anything you want.
This has security implications though: you are running arbitrary code in your shell, especially with wget, where the code comes from a remote location.

Bash script how to execute a command from a variable

I am trying to alter the Bash function below to execute each command argument. When I run this script, the first echo works as intended, but the second echo, which attempts to append to the scratch.txt file, does not actually execute; it just gets echoed to the terminal.
#!/bin/sh
clear
function each(){
    while read line; do
        for cmd in "$@"; do
            cmd=${cmd//%/$line}
            printf "%s\n" "$cmd"
            $cmd
        done
    done
}
# pipe in the text file and run both commands
# on each line of the file
cat scratch.txt | each 'echo %' 'echo -e "%" >> "scratch.txt"'
exit 0
How do I get the $cmd variable to execute as a command?
I found the original code from answer 2 here:
Running multiple commands with xargs
You want eval. It's evil. Or at least, dangerous. Read all about it at BashFAQ #48.
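For reference, a minimal sketch of the function with eval in place of the bare $cmd expansion; eval re-parses each substituted string as a shell command, which is exactly the risky pattern BashFAQ #48 describes if the lines of scratch.txt can contain shell metacharacters:
each() {
    while read line; do
        for cmd in "$@"; do
            cmd=${cmd//%/$line}
            printf "%s\n" "$cmd"
            eval "$cmd"   # re-parse and execute the string as a command
        done
    done
}
cat scratch.txt | each 'echo %' 'echo -e "%" >> "scratch.txt"'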

Executing commands containing space in Bash

I have a file named cmd that contains a list of Unix commands as follows:
hostname
pwd
ls /tmp
cat /etc/hostname
ls -la
ps -ef | grep java
cat cmd
I have another script that executes the commands in cmd as:
IFS=$'\n'
clear
for cmds in `cat cmd`
do
    if [ $cmds ] ; then
        $cmds;
        echo "****************************";
    fi
done
The problem is that commands in cmd without spaces run fine, but those with spaces are not correctly interpreted by the script. Following is the output:
patrick-laptop
****************************
/home/patrick/bashFiles
****************************
./prog.sh: line 6: ls /tmp: No such file or directory
****************************
./prog.sh: line 6: cat /etc/hostname: No such file or directory
****************************
./prog.sh: line 6: ls -la: command not found
****************************
./prog.sh: line 6: ps -ef | grep java: command not found
****************************
./prog.sh: line 6: cat cmd: command not found
****************************
What am I missing here?
Try changing the one line to eval $cmds rather than just $cmds
You can replace your script with the command
sh cmd
The shell’s job is to read commands and run them! If you want output/progress indicators, run the shell in verbose mode
sh -v cmd
I personally like this approach better: I don't want to munge IFS if I don't have to. You do need eval if you are going to use pipes in your commands; the pipe needs to be processed by the shell, not the command. I believe the shell parses out pipes before expanding strings.
Note that if your cmd file contains commands that read from standard input, there will be an issue (but you can always have read use a separate file descriptor, as sketched after the code below).
clear
while read cmds
do
    if [ -n "$cmds" ] ; then
        eval $cmds
        echo "****************************";
    fi
done < cmd
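A sketch of the extra file descriptor mentioned above, so that commands which themselves read stdin do not consume the rest of the cmd file:
clear
while IFS= read -r cmds <&3
do
    if [ -n "$cmds" ] ; then
        eval "$cmds"
        echo "****************************"
    fi
done 3< cmd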
Edit: Turns out this fails on pipes and redirection. Thanks, Andomar.
You need to change IFS back inside the loop so that bash knows where to split the arguments:
IFS=$'\n'
clear
for cmds in `cat cmd`
do
    if [ $cmds ] ; then
        IFS=$' \t\n' # the default
        $cmds;
        echo "****************************";
        IFS=$'\n'
    fi
done
EDIT: The comment by Ben Blank pointed out that my old answer was wrong, thanks.
Looks like you're executing each command as a single string, so bash sees the whole string as the script/executable name.
One way to avoid that would be to invoke bash on the command. Change:
if [ $cmds ] ; then
    $cmds;
    echo "****************************";
fi
to
if [ $cmds ] ; then
    bash -c "$cmds"
    echo "****************************";
fi
sed 'aecho "-----------"' cmd > cmd.sh; bash cmd.sh
sed 'aXX' appends a line containing XX after every line. This will not work for multi-line commands like:
for f in *
do
    something $f
done
but for single-line commands in most cases, it should do.
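For illustration, with the cmd file from the question, the generated cmd.sh begins roughly like this:
hostname
echo "-----------"
pwd
echo "-----------"
ls /tmp
echo "-----------"
and so on for each remaining line.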
