For a bunch of files in a directory I want to get the number of lines for each one, store it
in a variable and do additional stuff with it. At an interactive shell prompt this works without problems:
read NLINES <<< $( cat file | wc -l )
but if I do it in a script
#!/bin/bash
for i in `ls *.dat `
do
read NLINES <<< $( cat $i | wc -l )
done
I get
Syntax error: redirection unexpected
Why the difference? How could I fix it?
I bet your default shell isn't bash but something else (dash, for example), and that shell, not bash, is what is actually interpreting the script. Either make sure bash really runs it (keep the #!/bin/bash and invoke it with bash scriptname, or make it executable and run it directly so the shebang takes effect), or replace the shebang with #!/bin/sh and avoid bash-only syntax like <<< so the script works with the default shell.
I once made this mistake the other way around, when I tried to use some Debian scripts on Ubuntu, where #!/bin/sh behaved differently from the #!/bin/bash I had assumed.
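For the portable route, here is a minimal sketch of the loop without the here-string (globbing directly instead of parsing ls; the echo is only there to show the value):
#!/bin/sh
for i in *.dat
do
    NLINES=$(wc -l < "$i")   # command substitution works in plain sh, unlike <<<
    echo "$i: $NLINES"
done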
Related
I am currently trying to read from files with a shell script. However, I ran into a syntax issue. My code is below:
while read -r line;do
echo $line
done < <(tail -n +2 /pathToTheFile | cut -f5,6,7,8 | sort | uniq )
However, it returns the error: syntax error near unexpected token `('
I tried to follow How to use while read line with tail -n but still cannot see the problem.
The tail command works properly.
Any help will be appreciated.
Process substitution isn't supported by the POSIX shell /bin/sh. It is a feature specific to bash (and some other non-POSIX shells). Are you running this with /bin/bash?
Anyhow, the process substitution isn't needed here; you could simply use a pipe, like this:
tail -n +2 /pathToTheFile | cut -f5,6,7,8 | sort -u | while read -r line ; do
echo "${line}"
done
Your interpreter must be #!/bin/bash not #!/bin/sh and/or you must run the script with bash scriptname instead of sh scriptname.
Why?
Process substitution (e.g. < <(...)) is a bashism and is not available in POSIX shell. So the error:
syntax error near unexpected token `('
is telling you that once the shell reaches your done statement and tries to find the file being redirected into the loop, it finds ( instead and chokes. (That also tells us you are invoking your script with a POSIX shell instead of bash, and now you know why.)
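If you're not sure which interpreter is actually running your script, you can print it from inside before the loop; a minimal sketch (the ps call is just one way to check, and the pipeline is the one from the question):
#!/bin/bash
ps -p $$ -o comm=   # should print "bash"; "sh" or "dash" means the shebang was bypassed
while read -r line; do
    echo "$line"
done < <(tail -n +2 /pathToTheFile | cut -f5,6,7,8 | sort | uniq)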
In my program I need to know the maximum number of processes I can run, so I wrote a script. It works when I run it in a shell, but not when my program calls it with system("./limit.sh"). I work in bash.
Here is my code:
#/bin/bash
LIMIT=\`ulimit -u\`
ACTIVE=\`ps -u | wc -l \`
echo $LIMIT > limit.txt
echo $ACTIVE >> limit.txt
Can anyone help?
Why The Original Fails
Command substitution syntax doesn't work if escaped. When you run:
LIMIT=\`ulimit -u\`
...what you're doing is running a command named
-u`
...with the environment variable named LIMIT containing the value
`ulimit
...and unless you actually have a command that starts with -u and contains a backtick in its name, this can be expected to fail.
This is because escaping the backticks turns characters that would otherwise be syntax into literals, and because running a command with one or more var=value pairs in front of it treats those pairs as variables to export into the environment for that single command only.
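You can reproduce this at an interactive bash prompt; a minimal sketch, assuming LIMIT isn't already set (the exact error text varies by shell):
$ LIMIT=\`ulimit -u\`
bash: -u`: command not found
$ echo "${LIMIT:-unset}"   # the prefix assignment applied only to the failed command
unset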
Doing It Better
#!/bin/bash
limit=$(ulimit -u)
active=$(ps -u | wc -l)
printf '%s\n' "$limit" "$active" >limit.txt
Leave off the backticks.
Use modern $() command substitution syntax.
Avoid multiple redirections.
Avoid all-caps names for your own variables (these names are used for variables with meaning to the OS or system; lowercase names are reserved for application use).
Doing It Right
#!/bin/bash
exec >limit.txt # open limit.txt as output for the rest of the script
ulimit -u # run ulimit -u, inheriting that FD for output
ps -u | wc -l # run your pipeline, likewise with output to the existing FD
You have a typo on the very first line: #/bin/bash should be #!/bin/bash - this is often known as a "shebang" line, for "hash" (#) + "bang" (!)
Without that syntax written correctly, the script is run through the system's default shell, which will see that line as just a comment.
As pointed out in the comments, that also means only the standardised options of the built-in ulimit command are available, and those don't include -u.
I am writing a shell script that works with files. I need to find files and print them along with some information that is important to me. That's no problem... But then I wanted to add some "features" and make it work with arguments as well. One of the features is ignoring files that match a pattern (like *.c, to ignore all C files). So I set a variable and put the command string into it.
#!/bin/sh
command="grep -Ev \"$2\"" # in 2nd argument is pattern, that will be ignored
echo "find $PWD -type f | $command | wc -l" # printing command
file_num=$(find $path -type f | $command | wc -l) # saving number of files
echo "Number of files: $file_num"
But the command somehow ignores my variable and counts all files. Yet when I paste the same command into bash or sh interactively, I get a different (the correct) number of files. I thought it could just be a bash thing, but on another machine with ksh I have the same problem, and changing #!/bin/sh to #!/bin/bash did not help either.
The command line, including the arguments, is processed by the shell before it is executed. So when you run the script, the command that actually runs is grep -Ev "c" (with the quotes passed to grep as literal characters), whereas when you type grep -Ev "c" at the prompt, the shell interprets the quotes and runs grep -Ev c.
You can use this command to check it: echo grep -Ev "c".
So, just remove the quotes in $command and everything will be OK.
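Putting that together, a minimal sketch of the corrected script, assuming the pattern is still passed as the second argument and contains no spaces:
#!/bin/sh
command="grep -Ev $2"   # no embedded quotes: grep sees the bare pattern
file_num=$(find "$PWD" -type f | $command | wc -l)
echo "Number of files: $file_num"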
You only need to modify the command value:
command="grep -Ev "$1
Consider an ASCII text file (let's say it contains code of a non-shell scripting language):
Text_File.msh:
spool on to '$LOG_FILE_PATH/logfile.log';
login 'username' 'password';
....
Now if this were a shell script I could run it as $ sh Text_File.msh and the shell would automatically expand the variables.
What I want to do is have the shell expand these variables and then create a new file, Text_File_expanded.msh, as follows:
Text_File_expanded.msh:
spool on to '/expanded/path/of/the/log/file/../logfile.log';
login 'username' 'password';
....
Consider:
$ a=123
$ echo "$a"
123
So technically this should do the trick:
$ echo "`cat Text_File.msh`" > Text_File_expanded.msh
...but it doesn't work as expected: the output file is identical to the source.
So I am unsure how to achieve this. My goal is to make it easier to maintain the directory paths embedded within my non-shell scripts. These scripts cannot contain any shell code, as they are not interpreted by the UNIX shell.
This question has been asked in another thread, and this is the best answer IMO:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
cat Text_File.msh | envsubst > Text_File_expanded.msh
if on Mac, install gettext first: brew install gettext
see:
Forcing bash to expand variables in a string loaded from a file
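If the file contains other dollar signs that must not be touched, envsubst can be restricted to specific variables by passing them as an argument; a minimal sketch:
export LOG_FILE_PATH=/expanded/path/of/the/log/file/../logfile.log
envsubst '$LOG_FILE_PATH' < Text_File.msh > Text_File_expanded.msh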
This solution is not elegant, but it works. Create a script called shell_expansion.sh:
echo 'cat <<END_OF_TEXT' > temp.sh
cat "$1" >> temp.sh
echo 'END_OF_TEXT' >> temp.sh
bash temp.sh >> "$2"
rm temp.sh
You can then invoke this script as follows:
bash shell_expansion.sh Text_File.msh Text_File_expanded.msh
If you want it in one line (I'm not a bash expert so there may be caveats to this but it works everywhere I've tried it):
when test.txt contains
line1 says ${line1}
line2 says ${line2}
then:
>line1=fark
>line2=fork
>value=$(eval "echo \"$(cat test.txt)\"")
>echo "$value"
line1 says fark
line2 says fork
Obviously if you just want to print it you can take out the extra value=$() and echo "$value".
If a Perl solution is ok for you:
Sample file:
$ cat file.sh
spool on to '$HOME/logfile.log';
login 'username' 'password';
Solution:
$ perl -pe 's/\$(\w+)/$ENV{$1}/g' file.sh
spool on to '/home/user/logfile.log';
login 'username' 'password';
One limitation of the above answers is that they both require the variables to be exported to the environment. Here's what I came up with that would allow the variables to be local to the current shell script:
#!/bin/sh
FOO=bar;
FILE=`mktemp`; # Let the shell create a temporary file
trap 'rm -f $FILE' 0 1 2 3 15; # Clean up the temporary file
(
echo 'cat <<END_OF_TEXT'
cat "$#"
echo 'END_OF_TEXT'
) > $FILE
. $FILE
The above example allows the variable $FOO to be substituted in the files named on the command line. I'm sure it can be improved, but this works for me so far.
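A minimal usage sketch, assuming the script above is saved as expand.sh (a hypothetical name) and the input file is the one created on the first line:
$ echo 'the log file lives in ${FOO}/logfile.log' > sample.msh   # hypothetical input file
$ sh expand.sh sample.msh
the log file lives in bar/logfile.log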
Thanks to both previous answers for their ideas!
If the variables you want to translate are known and limited in number, you can always do the translation yourself:
sed "s/\$LOG_FILE_PATH/$LOG_FILE_PATH/g" input > output
This also assumes the variable itself is already set in the shell doing the translation.
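If there are a few such known variables, the same idea extends with one -e expression per variable; a minimal sketch (OTHER_VAR is just a hypothetical second variable, and | is used as the delimiter in case the values contain slashes):
sed -e "s|\$LOG_FILE_PATH|$LOG_FILE_PATH|g" -e "s|\$OTHER_VAR|$OTHER_VAR|g" input > output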
This solution allows you to keep the same formatting in the output file.
Copy and paste the following lines into your script:
cat $1 | while read line
do
eval $line
echo $line
eval echo $line
done | uniq | grep -v '\$'
This will read the file passed as an argument line by line, and try to print each line twice:
- once without substitution
- once with the variables substituted.
It then removes the duplicate lines,
and finally removes the lines that still contain visible variables ($).
Yes, eval should be used carefully, but it provided me with this simple one-liner for my problem. Below is an example using your filename:
eval "echo \"$(<Text_File.msh)\""
I use printf instead of echo for my own purposes, but that should do the trick. Thank you abyss.7 for providing the link that solved my problem. Hope it helps.
Create an ASCII file test.txt with the following content:
Try to replace this ${myTestVariable1}
bla bla
....
Now create a file "sub.sed" containing the variable names, e.g.
's,${myTestVariable1},'"${myTestVariable1}"',g;
s,${myTestVariable2},'"${myTestVariable2}"',g;
s,${myTestVariable3},'"${myTestVariable3}"',g;
s,${myTestVariable4},'"${myTestVariable4}"',g'
Open a terminal and move to the folder containing test.txt and sub.sed.
Define the value of the variable to be replaced:
myTestVariable1=SomeNewText
Now call sed to replace that variable
sed "$(eval echo $(cat sub.sed))" test.txt > test2.txt
The output will be
$ cat test2.txt
Try to replace this SomeNewText
bla bla
....
#logfiles.list:
$EAMSROOT/var/log/LinuxOSAgent.log
$EAMSROOT/var/log/PanacesServer.log
$EAMSROOT/var/log/PanacesStrutsGUI.log
#My Program:
cat logfiles.list | while read line
do
eval Eline=$line
echo $Eline
done
Is there any bash trick that allows giving some input on the command line to a program that gets its input via the input stream? Something like this:
program < 'a=1;b=a*2;'
but < expects a file to redirect from, not a literal string.
For very short here-documents, there are also here-strings:
program <<< "a=1;b=a*2"
I think
echo 'a=1;b=a*2;' | program
is what you need. This process is called "piping".
As a side note: doing the opposite (i.e. piping other programs output as arguments) could be done with xargs
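For example, a minimal sketch of that reverse direction (the file names are just placeholders):
printf '%s\n' file1.txt file2.txt | xargs wc -l   # the names arriving on stdin become arguments to wc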
echo works great. The other option is here-documents [1]:
program <<EOF
a=1;b=a*2;
EOF
I use echo when I have one very short thing on one line, and heredocs when I have something that requires newlines.
[1] http://tldp.org/LDP/abs/html/here-docs.html
shopt -s expand_aliases
alias 'xscript:'='<<:ends'
xscript: bc | anotherprog | yetanotherprog ...
a=1;b=a*2;
:ends
Took me a year to hack this one out. Premium bash script here fellas. Give respect where due please :)
I call this little 'diddy' xscript because you can expand bash variables and substitutions inside of the here document.
alias 'script:'='<<":ends"'
The above version does not expand substitutions.
xscript: cat
The files in our path are: `ls -A`
:ends
script: cat
The files in our path are: `ls -A`
:ends
I'm not finished!
source <(xscript: cat
echo \$BASH "hello world, I'mma script genius!"
echo You can thank me now $USER
:ends
)