Using grep, ls to get a file in bash

I'm trying to write a bash script which would locate a single file in the current directory. The file will be used later, but I don't need help there. I tried using ls and grep, but it doesn't work; I'm a newbie with bash.
#!/bin/sh
#Here I need smt like
#trFile = ls | grep myString (but I get file not found error)
echo $trFile

Use shell wildcards, as in
ls *${pattern}*
And, to store the result in a variable, put it inside a $( ) structure (you could also use backticks, but they are deprecated and don't nest well):
var=$( ls *${pattern}* )
Or, put your ls | grep in there (but that's bad practice, IMHO):
var=$( ls | grep -- "$pattern" )

#!/bin/sh
#
trfile=$( ls | grep myString )
echo $trfile
The $( xxx ) causes the commands within to be executed and the output returned.

I believe you are looking for something like this:
#!/bin/sh
trFile=`ls | grep "$myString"`
In order to run a command and redirect/store its output, you need to put the command between backticks. The variable that will store the output, the equals sign, and the opening backtick need to be together with no spaces, as in my example.
Hope this helps.

If I've guessed right, what you're trying to do is capture a filename from grepping the output of ls into a shell variable. Try this:
#!/bin/sh
trFile=`ls | grep "name_of_file"`
echo $trFile
Notice the back-tick operator surrounding the command: whatever output it produces, in this case the filename, gets captured.

Parsing the output of ls will bite you when you least expect it. Better to use globbing:
http://tldp.org/LDP/abs/html/globbingref.html
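For instance, a minimal sketch of the glob approach (assuming, as in the question, that exactly one file name in the current directory contains myString):
#!/bin/sh
# Capture the matching file name with a glob instead of parsing ls.
for f in *myString*; do
    [ -e "$f" ] || continue   # nothing matched: the glob stays literal, skip it
    trFile=$f
    break                     # the question expects a single matching file
done
echo "$trFile"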

Related

variable as shell command

I am writing a shell script that works with files. I need to find files and print them along with some information that is important to me. That's no problem... But then I wanted to add some "features" and make it work with arguments as well. One of the features is ignoring files that match a pattern (like *.c, to ignore all C files). So I set a variable and added the string to it.
#!/bin/sh
command="grep -Ev \"$2\"" # in 2nd argument is pattern, that will be ignored
echo "find $PWD -type f | $command | wc -l" # printing command
file_num=$(find $path -type f | $command | wc -l) # saving number of files
echo "Number of files: $file_num"
But the command somehow ignores my variable and counts all files. Yet when I type the same command directly into bash or sh, I get a different (correct) number of files. I thought it could just be bash, but on another machine running ksh I have the same problem, and changing #!/bin/sh to #!/bin/bash did not help either.
The command line, including the arguments, is processed by the shell before it is executed. So when the script runs, the pattern handed to grep is "c" with the literal quote characters included, whereas when you type grep -Ev "c" at the prompt, the shell strips the quotes and grep sees just c.
You can check it with this command: echo grep -Ev "c".
So just remove the quotes in $command and everything will be OK.
You only need to modify the value of command (the pattern is in $2, as in the question):
command="grep -Ev $2"

How to pass a shell script argument as a variable to be used when executing grep command

I have a file called fruit.txt which contains a list of fruit names (apple, banana, orange, kiwi, etc.). I want to create a script that allows me to pass an argument when calling the script, i.e. script.sh orange, which will then search the file fruit.txt for the variable (orange) using grep. I have the following script...
script name and argument as follows:
script.sh orange
script snippet as follows:
#!/bin/bash
nameFind=$1
echo `cat` "fruit.txt"|`grep` | $nameFind
But I get grep's usage info, and it seems that the script is waiting for some additional command. Advice greatly appreciated.
The piping syntax is incorrect there. You are piping the output of grep into the variable name $nameFind as if it were another command, while grep itself receives no search pattern, so it prints its usage message. Do this instead:
#!/bin/bash
nameFind=$1
grep "$nameFind" fruit.txt
Something like this should work:
#!/bin/bash
name="$1"
grep "$name" fruit.txt
There's no need to use cat and grep together; you can simply pass the file name to grep directly, after the pattern to be matched. If you want to match fixed strings (i.e. no regular expressions), you can also use the -F option:
grep -F "$name" fruit.txt
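For example, a pattern containing a dot behaves differently with and without -F (banana is one of the entries listed in the question):
grep "b.nana" fruit.txt      # regex: the dot matches any character, so banana matches
grep -F "b.nana" fruit.txt   # fixed string: only a literal b.nana would match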

Executing a variable with a pipe in shell

I have a command that I need to use repeatedly within a shell script. This command contains a pipe, and the output of the whole command will be piped to other commands.
e.g. Let's say, for simplicity's sake, the command is ls | tee. Then I might pipe it to other commands, say ls | tee | someprogram or ls | tee | anotherprogram.
So naturally I'll want to keep ls | tee in a variable. The problem is that I can't seem to execute a variable with a pipe in it.
#!/bin/sh
TEST="ls | tee"
$TEST
Gives the following output
ls: cannot access |: No such file or directory
ls: cannot access tee: No such file or directory
How do I execute a variable like $TEST above, whilst being able to pipe the output to other commands?
The short answer is eval.
eval $TEST somefile
eval $TEST otherfile | more
However, you need to be aware that eval means problems with quoting special characters and blanks and the like. If everything is simple (TEST="ls -l | tee"), then it is easy. If you have spaces in arguments or shell metacharacters, then it is hard — very hard — to do it right. At that point, you'd be better off creating a function or separate shell script.
You might well be better off with a function or shell script even so.
If the string you eval comes from a user, you have to worry even more!
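A minimal sketch of that function alternative (the function and file names here are only illustrative):
#!/bin/sh
# Wrapping the pipeline in a function avoids eval and its quoting pitfalls.
list_and_log() {
    ls | tee "$@"
}
list_and_log somefile            # same effect as: ls | tee somefile
list_and_log otherfile | more    # the output can still be piped onward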

Ruby Backticks - break command into multiple lines?

In Ruby, I know I can execute a shell command with backticks like so:
`ls -l | grep drw-`
However, I'm working on a script which calls for a few fairly long shell commands, and for readability's sake I'd like to be able to break them onto multiple lines. I'm assuming I can't just throw in a plus sign as with Strings, but I'm curious if there is either a command concatenation technique or some other way to cleanly break a long command string into multiple lines of source code.
You can escape the line breaks with a \:
`ls -l \
| grep drw-`
You can use interpolation:
`#{"ls -l" +
"| grep drw-"}`
or put the command into a variable and interpolate the variable:
cmd = "ls -l" +
"| grep drw-"
`#{cmd}`
Depending on your needs, you may also be able to use a different method of running the shell command, such as system, but note its behavior is not exactly the same as backticks.
Use %x:
%x( ls -l |
grep drw- )
Another:
%x(
echo a
echo b
echo c
)
# => "a\nb\nc\n"
You can also do this with an explicit \n, as long as the pipe comes before the line break (a trailing | lets the shell keep reading on the next line):
cmd_str = "ls -l |\n" +
"grep drw-"
...and then put the combined string inside backticks.
`#{cmd_str}`

Counting file lines in shell and in a script gives different results

For a bunch of files in a directory I want to get the number of lines for each one, store it
in a variable and do additional stuff. Via shell I can do it without problems if I do
read NLINES <<< $( cat file | wc -l )
but if I do it in a script
#!/bin/bash
for i in `ls *.dat `
do
read NLINES <<< $( cat $i | wc -l )
done
I get
Syntax error: redirection unexpected
Why the difference? How could I fix it?
I bet the script isn't actually being run by bash. The <<< here-string is a bash feature, and "Syntax error: redirection unexpected" is what a plain POSIX shell (dash, for instance) prints when it meets it. That happens if you launch the script as sh script.sh; run it with bash script.sh, or make it executable and run ./script.sh so the #!/bin/bash shebang takes effect.
I made this mistake the other way around when I tried to use some Debian scripts on Ubuntu, where #!/bin/sh behaved differently from the #!/bin/bash I had assumed.
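If the script really has to run under a plain POSIX /bin/sh, a minimal sketch that avoids both the bash-only here-string and the ls parsing:
#!/bin/sh
# POSIX-compatible: no <<< here-string, no parsing of ls output.
for i in *.dat; do
    [ -e "$i" ] || continue        # skip the literal glob when no .dat files exist
    NLINES=$(wc -l < "$i")         # reading from stdin makes wc print only the count
    echo "$i: $NLINES lines"
done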
