Unix Variable Declaration - shell

Hello I'm a total beginner in shell scripting.
I have a log file named logA.log and a B.sh file.
In the log file there are some lines, and I want to find the number of occurrences of a specific word in that log (in the last 10 lines) by executing B.sh.
In B.sh I wrote
#!/bin/bash
variableString = tail -10f /home/appuser/logA.log
grep ERROR $variableString | wc -l
but the output is:
variableString: command not found
I know the "grep" line works, but I cannot reach logA.log from B.sh.
How can I define a variable called variableString as the last 10 lines of logA.log?

Your commands are mostly right, but you have to be aware of how to store a command's output: var=$(command). The output may also span several lines, so quote the expansion to preserve the line breaks. And since the variable holds the text itself rather than a file name, feed it to grep on standard input. Hence, you should use:
variableString="$(tail -n 10 /home/appuser/logA.log)"
echo "$variableString" | grep ERROR | wc -l
When you get the error
variableString: command not found
it is because, with that syntax, bash interprets the line as a request to execute a command named variableString with = tail -10f /home/appuser/logA.log as its arguments. See Basic bash script variable declaration - command not found for further information regarding this.

tail -f ("follow") will not finish, so it never gets to the next line. You probably meant tail -n 10 (the -n makes it POSIX compatible).
You cannot use spaces around equals signs when assigning a variable.
Variables are assigned the string that the right-hand side evaluates to. Without special constructs such as command substitution, the result is simply the literal string after the equals sign.
You should quote variable expansions to avoid word splitting and globbing.
In summary, you should use:
variableString=$(tail -n 10 /home/appuser/logA.log)
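Putting it all together, a minimal sketch of the whole B.sh might look like this (the log path comes from the question; grep -c is used here instead of grep | wc -l to count the matching lines):
#!/bin/bash
# Capture the last 10 lines of the log (no -f, so tail exits immediately)
variableString=$(tail -n 10 /home/appuser/logA.log)
# Count lines containing ERROR in the captured text; <<< feeds the variable to grep's stdin
grep -c ERROR <<< "$variableString"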

Can't manage to give two arguments from a file to a bash script: command not found [duplicate]

I'm new to bash script, it is interesting, but somehow I'm struggling with everything.
I have a file, tab-separated ("\t"), with two pieces of information on each line: a string and a number.
I'd like to use both values from each line in a bash script, in order to look them up in another file.
I'm not even there yet: I'm struggling to pass the two columns as two separate arguments in bash.
#/!bin/bash
FILE="${1}"
while read -r line
do
READ_ID_WH= "echo ${line} | cut -f 1"
POS_HOTSPOT= echo '${line} | cut -f 2'
echo "read id is : ${READ_ID_WH} with position ${POS_HOTSPOT}"
done < ${FILE}
and my file is :
ABCD\t1120
ABC\t1121
I'm launching my command with
./script.sh file_of_data.tsv
What I finally get is :
script.sh: line 8: echo ABCD 1120 | cut -f 1: command not found
I tried a lot of possibilities by browsing SO, but I can't manage to split each line into two arguments to be used separately in my script :(
Hope you can help me :)
Best,
The quotes cause the shell to look for a command whose name is the string between the quotes.
Apparently you are looking for
while IFS=$'\t' read -r id hotspot; do
echo "read id is: $id with position $hotspot"
done <"$1"
You generally want to avoid capturing things into variables you only use once, but the syntax for that is
id=$(echo "$line" | cut -f1)
See also Correct Bash and shell script variable capitalization and When to wrap quotes around a shell variable?. You can never have whitespace on either side of the = assignment operator (or rather, incorrect whitespace changes the semantics to something you probably don't want).
You have a space after the equals sign on lines 5 and 6, so it thinks you are looking for an executable file named echo ABCD 1120 | cut -f 1 and asking to execute it while assigning the variable READ_ID_WH to the empty string, rather than assigning the string you want to the variable.
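For reference, a complete corrected script.sh along those lines might look like this (the tab-separated input format is taken from the question; the commented-out grep line is only a hypothetical illustration of using the fields):
#!/bin/bash
FILE="${1}"
# Let read split each line on the tab into the two fields directly
while IFS=$'\t' read -r READ_ID_WH POS_HOTSPOT
do
    echo "read id is : ${READ_ID_WH} with position ${POS_HOTSPOT}"
    # grep "${READ_ID_WH}" some_other_file   # hypothetical lookup using the first field
done < "${FILE}"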

Set a variable equal to the output of a command containing non-command words

I'm writing a small script in which I want to set the value of a variable equal to the output of a command. However, the command in question is a call to another script with command-line arguments. I'm using backticks as you normally should in this scenario, but the problem is that the computer gives an error in which it tries to interpret the command-line arguments as commands.
#!/bin/bash
filename="$1"
while read p; do
echo "This is the gene we are looking at: ""$p"
lookIn= `./findGeneIn "$p" burgdorferi afzelii garinii hermsii miyamotoi parkeri`
echo "$lookIn"
#grep "$p" "$lookIn""/""prokka_""$lookIn""/*.tsv" | awk '{print $1}'
done < $filename
I'm trying to set the variable lookIn equal to the output of ./findGeneIn "$p" burgdorferi afzelii garinii hermsii miyamotoi parkeri, where ./findGeneIn is a script and the words burgdorferi, ..., parkeri are command-line arguments for ./findGeneIn.
The issue is that I get an error saying "burgdorferi: command not found". So it's trying to interpret those arguments as commands. How do I get it to not do that?
lookIn= `./findGeneIn "$p" burgdorferi afzelii garinii hermsii miyamotoi parkeri`
^
Delete the extra space. Assignments must not have spaces around the equal sign.
With the space there, Bash parses the line as var=value command, which runs a command with $var temporarily set to "value". Or, in this case, it interprets the result of the backticks as a command name and lookIn= as an empty variable assignment.
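Removing that space (and preferring $( ) over backticks) gives an assignment that works; the script name and arguments here are the ones from the question:
lookIn=$(./findGeneIn "$p" burgdorferi afzelii garinii hermsii miyamotoi parkeri)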

Why doesn't LIMIT=\`ulimit -u\` work in bash?

In my program I need to know the maximum number of processes I can run. So I wrote a script. It works when I run it in a shell, but not when my program calls it using system("./limit.sh"). I work in bash.
Here is my code:
#/bin/bash
LIMIT=\`ulimit -u\`
ACTIVE=\`ps -u | wc -l \`
echo $LIMIT > limit.txt
echo $ACTIVE >> limit.txt
Anyone can help?
Why The Original Fails
Command substitution syntax doesn't work if escaped. When you run:
LIMIT=\`ulimit -u\`
...what you're doing is running a command named
-u`
...with the environment variable named LIMIT containing the value
`ulimit
...and unless you actually have a command that starts with -u and contains a backtick in its name, this can be expected to fail.
This is because using backticks makes characters which would otherwise be syntax into literals, and running a command with one or more var=value pairs preceding it treats those pairs as variables to export in the environment for the duration of that single command.
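A small sketch of that prefix-assignment behaviour (MYVAR is a hypothetical variable name):
MYVAR=hello env | grep '^MYVAR='   # prints MYVAR=hello; the assignment applies only to the env command
echo "${MYVAR:-unset}"             # prints "unset": the current shell never kept the variable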
Doing It Better
#!/bin/bash
limit=$(ulimit -u)
active=$(ps -u | wc -l)
printf '%s\n' "$limit" "$active" >limit.txt
Leave off the backticks.
Use modern $() command substitution syntax.
Avoid multiple redirections.
Avoid all-caps names for your own variables (all-caps names are conventionally reserved for variables that have meaning to the OS or the shell; lowercase names are safe for application use).
Doing It Right
#!/bin/bash
exec >limit.txt # open limit.txt as output for the rest of the script
ulimit -u # run ulimit -u, inheriting that FD for output
ps -u | wc -l # run your pipeline, likewise with output to the existing FD
You have a typo on the very first line: #/bin/bash should be #!/bin/bash - this is often known as a "shebang" line, for "hash" (#) + "bang" (!)
Without that syntax written correctly, the script is run through the system's default shell, which will see that line as just a comment.
As pointed out in the comments, that also means only the standardised options of the ulimit builtin are available, which do not include -u.
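A quick way to see what that means on your own system (the output varies by platform and by which shell /bin/sh actually is):
/bin/sh -c 'ulimit -u'   # may print a number, or an "illegal option" error, depending on the sh implementation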

How to pass a shell script argument as a variable to be used when executing grep command

I have a file called fruit.txt which contains a list of fruit names (apple, banana, orange, kiwi, etc.). I want to create a script that allows me to pass an argument when calling the script, i.e. script.sh orange, which will then search fruit.txt for the given variable (orange) using grep. I have the following script...
script name and argument as follows:
script.sh orange
script snippet as follows:
#!/bin/bash
nameFind=$1
echo `cat` "fruit.txt"|`grep` | $nameFind
But I get the grep usage info, and it seems that the script is awaiting some additional command. Advice greatly appreciated.
The piping syntax is incorrect there. You are piping the output of grep as input to the variable named nameFind. So when the grep command tries to execute, all it is given is the contents of fruit.txt. Do this instead:
#!/bin/bash
nameFind=$1
grep "$nameFind" fruit.txt
Something like this should work:
#!/bin/bash
name="$1"
grep "$name" fruit.txt
There's no need to use cat and grep together; you can simply pass the name of the file as the second argument, after the pattern to be matched. If you want to match fixed strings (i.e. no regular expressions), you can also use the -F option:
grep -F "$name" fruit.txt
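To see the difference, suppose the pattern contains a dot (kiwi is one of the entries from the question's fruit.txt):
grep 'k.wi' fruit.txt     # the . is a regex wildcard, so this matches kiwi
grep -F 'k.wi' fruit.txt  # -F matches the literal string "k.wi", so kiwi is not matched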

How to execute lines of text on the clipboard as bash commands

I'm working with Mac OS X's pbpaste command, which returns the clipboard's contents. I'd like to create a shell script that executes each line returned by pbpaste as a separate bash command. For example, let's say that the clipboard's contents consists of the following lines of text:
echo 1234 >~/a.txt
echo 5678 >~/b.txt
I would like a shell script that executes each of those lines, creating the two files a.txt and b.txt in my home folder. After a fair amount of searching and trial and error, I've gotten to the point where I'm able to assign individual lines of text to a variable in a while loop with the following construct:
pbpaste | egrep -o [^$]+ | while read l; do echo $l; done
which sends the following to standard out, as expected:
echo 1234 >~/a.txt
echo 5678 >~/b.txt
Instead of simply echoing each line of text, I then try to execute them with the following construct:
pbpaste | egrep -o [^$]+ | while read l; do $l; done
I thought that this would execute each line (thus creating two text files a.txt and b.txt in my home folder). Instead, the first term (echo) seems to be interpreted as the command, and the remaining terms (nnnn >~/...) seem to get lumped together as if they were a single parameter, resulting in the following being sent to standard out without any files being created:
1234 >~/a.txt
5678 >~/b.txt
I would be grateful for any help in understanding why my construct isn't working and what changes might get it to work.
[…] the remaining terms (nnnn >~/...) seem to get lumped together as if they were a single parameter, […]
Not exactly. The line actually gets split on whitespace (or whatever $IFS specifies), but the problem is that the redirection operator > cannot be taken from a shell variable. For example, this snippet:
gt='>'
echo $gt foo.txt
will print > foo.txt, rather than printing a newline to foo.txt.
And you'll have similar problems with various other shell metacharacters, such as quotation marks.
What you need is the eval builtin, which takes a string, parses it as a shell command, and runs it:
pbpaste | egrep -o [^$]+ | while IFS= read -r LINE; do eval "$LINE"; done
(The IFS= and -r and the double-quotes around $LINE are all to prevent any other processing besides the processing performed by eval, so that e.g. whitespace inside quotation marks will be preserved.)
Another possibility, depending on the details of what you need, is simply to pipe the commands into a new instance of Bash:
pbpaste | egrep -o [^$]+ | bash
Edited to add: For that matter, it occurs to me that you can pass everything to eval in a single batch; just as you can (per your comment) write pbpaste | bash, you can also write eval "$(pbpaste)". That will support multiline while-loops and so on, while still running in the current shell (useful if you want it to be able to reference shell parameters, to set environment variables, etc., etc.).
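As a concrete illustration with one of the clipboard lines from the question (a.txt is only created in the eval case):
cmd='echo 1234 >~/a.txt'
$cmd           # prints: 1234 >~/a.txt   (the redirection is treated as just another word)
eval "$cmd"    # actually creates ~/a.txt containing 1234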
