Bash scripting - escape characters in unix - bash

I am trying to execute a word search in my directory where I look into all text files, and try to find words that have a length of 14.
$ ls *.txt | & grep -o -w '\w\{14,14\}'
This works as intended when I run it in a command line.
Now I want to run this same exact command but in a .sh file
In my file, I have this:
eval $("ls *.txt | & grep -o -w '\w\{14,14\}'")
I then get this error:
test.sh: line 1: ls *.txt | & grep -o -w '\w{14,14}': command not found
Is the problem that I have escape characters in my text? How can I get it so that when I run my .sh file, I get the same output as if I ran it on the command line?

You don't need to eval something to put it in a shell script. eval is evil, and should be avoided like the plague (as in, only use if you have invented the antidote yourself). Your statement could be changed to simply grep -o -w '\w\{14,14\}' *.txt and chucked verbatim into a script file.
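For example, a minimal sketch of such a script (the sample words piped in below are made up purely for illustration, so the example runs without any .txt files present):

```shell
#!/usr/bin/env bash
# Print every word that is exactly 14 characters long.
# In the real script the input would simply be *.txt, i.e.:
#   grep -o -w '\w\{14,14\}' *.txt
# Here an inline sample is piped in instead so the example is reproducible.
printf 'short characteristic muchlongerthanfourteen\n' | grep -o -w '\w\{14,14\}'
```

Running it prints `characteristic`, the only 14-letter word in the sample.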
An excellent web site to learn both the fundamentals and the subtleties of Bash scripting is Greg's Wiki.

Related

Bash get the command that is piping into a script

Take the following example:
ls -l | grep -i readme | ./myscript.sh
What I am trying to do is get ls -l | grep -i readme as a string variable in myscript.sh. So essentially I am trying to get the whole command before the last pipe to use inside myscript.sh.
Is this possible?
No, it's not possible.
At the OS level, pipelines are implemented with the mkfifo(), dup2(), fork() and execve() syscalls. This doesn't provide a way to tell a program what the commands connected to its stdin are. Indeed, there's not guaranteed to be a string representing a pipeline of programs being used to generate stdin at all, even if your stdin really is a FIFO connected to another program's stdout; it could be that that pipeline was generated by programs calling execve() and friends directly.
The best available workaround is to invert your process flow.
It's not what you asked for, but it's what you can get.
#!/usr/bin/env bash
printf -v cmd_str '%q ' "$@"  # generate a shell command string representing our arguments
while IFS= read -r line; do
    printf 'Output from %s: %s\n' "$cmd_str" "$line"
done < <("$@")  # actually run those arguments as a command, and read from it
...and then have your script start the things it reads input from, rather than receiving them on stdin.
...thereafter, ./yourscript ls -l, or ./yourscript sh -c 'ls -l | grep -i readme'. (Of course, never use this except as an example; see ParsingLs).
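The same idea can also be packaged as a function instead of a script; this is a self-contained sketch (the function name run_and_label is invented here):

```shell
#!/usr/bin/env bash
# Sketch of the inverted flow as a function: the caller passes the
# command as arguments; we run it and label each line of its output.
run_and_label() {
    local cmd_str line
    printf -v cmd_str '%q ' "$@"      # shell-quoted string form of the command
    while IFS= read -r line; do
        printf 'Output from %s: %s\n' "$cmd_str" "$line"
    done < <("$@")                    # run the command, read its stdout
}

run_and_label echo hello
# -> Output from echo hello : hello
```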
It can't be done generally, but using the history command in bash it can maybe sort of be done, provided certain conditions are met:
history has to be turned on.
Only one shell has been running, or accepting new commands, (or failing that, running myscript.sh), since the start of myscript.sh.
Since command lines with leading spaces are, by default, not saved to the history, the invoking command for myscript.sh must have no leading spaces; or that default must be changed -- see Get bash history to remember only the commands run with space prefixed.
The invoking command needs to end with a &, because without it the new command line wouldn't be added to the history until after myscript.sh was completed.
The script needs to be a bash script, (it won't work with /bin/dash), and the calling shell needs a little prep work. Sometime before the script is run first do:
shopt -s histappend
PROMPT_COMMAND="history -a; history -n"
...this makes the bash history heritable. (Code swiped from unutbu's answer to a related question.)
Then myscript.sh might go:
#!/bin/bash
history -w
printf 'calling command was: %s\n' \
    "$(grep "$0" ~/.bash_history | tail -1)"
Test run:
echo googa | ./myscript.sh &
Output, (minus the "&" associated cruft):
calling command was: echo googa | ./myscript.sh &
The cruft can be halved by changing "&" to "& fg", but the resulting output won't include the "fg" suffix.
I think you should pass it as one string parameter, like this:
./myscript.sh "$(ls -l | grep -i readme)"
I think that it is possible, have a look at this example:
#!/bin/bash
result=""
while read line; do
    result=$result"${line}"
done
echo $result
Now run this script using a pipe, for example:
ls -l /etc | ./script.sh
I hope that will be helpful for you :)

Unix shell programming to count number of active user using Korn shell

What is a shell script to count the number of logged-in users who are currently using the Korn shell, using grep and any other Unix commands? Thanks in advance.
who is a command that lists users who are online. In order to count the number of online users, you can pipe the output of who to grep, which can count the number of lines with the -c argument:
who | grep -c .
EDIT: I missed the detail about users using Korn shell.
You can try this instead:
ps -e -o command | grep -c "[k]sh"
ps is a command that lists information about currently running processes. The -e argument makes it show information about all system processes, and the -o command argument makes it show only the commands.
ps -e -o command will show you a list of currently running processes. Now, you can pipe the output to grep and count the number of lines that match [k]sh using the -c argument. Brackets are used around the "k" because otherwise grep will match itself, as the grep command contains "ksh" as an argument. (You can see this by checking the output of ps -e -o command.)
(I am assuming that the name of the Korn shell process is "ksh". If it is something else, you should use that as the argument for grep.)
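A variation on the same idea, counting distinct users rather than processes; this is only a sketch, and the awk/sort pipeline and sample data are invented here. Real usage would pipe in `ps -e -o user= -o comm=`; a canned listing is substituted so the result is reproducible:

```shell
# Count distinct users that have at least one ksh process.
# The printf fakes a two-column "USER COMMAND" ps listing.
printf '%s\n' 'alice ksh' 'bob bash' 'alice ksh' 'carol ksh' |
    awk '$2 == "ksh" {print $1}' | sort -u | wc -l
```

For this sample the pipeline prints 2 (alice and carol).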

variable as shell command

I am writing a shell script that works with files. I need to find files and print them along with some information that is important to me. That's no problem... But then I wanted to add some "features" and make it work with arguments as well. One of the features is ignoring files that match a pattern (like *.c, to ignore all C files). So I set a variable and added the string to it.
#!/bin/sh
command="grep -Ev \"$2\"" # in 2nd argument is pattern, that will be ignored
echo "find $PWD -type f | $command | wc -l" # printing command
file_num=$(find $path -type f | $command | wc -l) # saving number of files
echo "Number of files: $file_num"
But the command somehow ignores my variable and counts all files. Yet when I type the same command into bash or the shell directly, I get a different (the correct) number of files. I thought it could just be bash, but on another machine with ksh the same problem occurs, and changing #!/bin/sh to #!/bin/bash did not help either.
The command line, including the arguments, is processed by the shell before it is executed. So when you run the script, the command will be grep -Ev "c" (with literal quotes), but when you run the single command grep -Ev "c", the shell will interpret it as grep -Ev c.
You can use this command to check it: echo grep -Ev "c".
So, just remove the quotes in $command and everything will be OK )
You only need to modify the command value:
command="grep -Ev "$1
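A more robust way to keep a command in a variable, shown here as a sketch (the sample file names and pattern are invented): store it in a bash array, so the pattern is passed to grep as a single word without literal quote characters.

```shell
#!/usr/bin/env bash
# Store the command as an array instead of a string with embedded quotes.
pattern='\.c$'                 # hypothetical "ignore" pattern
cmd=(grep -Ev "$pattern")      # each array element stays exactly one word

# Sample file list piped through the stored command:
printf '%s\n' main.c notes.txt util.c | "${cmd[@]}" | wc -l
```

With this sample list the pipeline prints 1, because only notes.txt survives the filter.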

Bash script from Codesourcery arm-2011.03 can't find grep or sed

I'm trying to run the CodeSourcery arm-2011.03.42 BASH script in Ubuntu 12.04. At the top of the script is the following:
#! /usr/bin/env bash
But, when I execute it, I get the following errors:
line 140: grep: command not found
line 140: sed: command not found
I can run both grep and sed from the command line, but not in the script.
Here's what line 140 looks like:
env_var_list=$(export | \
    grep '^declare -x ' | \
    sed -e 's/^declare -x //' -e 's/=.*//')
If I change the first line to
#!/bin/sh
I get the following error:
Line 51: Syntax error: "(" unexpected (expecting "}")
Here's what line 51 looks like:
check_pipe() {
    local -a status=("${PIPESTATUS[@]}") #<-- Line 51
    local limit=$1
    local ix
The #<-- Line 51 actually doesn't appear in the shell script. I just added it to this post for clarity.
I've tried dos2unix and a number of other things, but I just can't win. I would very much appreciate your help.
I changed this line in the script
pushenvvar PATH /usr/local/tools/gcc-4.3.3/bin
to
pushenvvar PATH /usr/local/tools/gcc-4.3.3/bin:/bin
and it seems to work now.
The script must be run with bash, as arrays don't exist in sh.
Check your PATH environment variable, and the locations of grep and sed (normally /bin).
There might be several possible reasons.
As @AntonioD pointed out, there must not be any space between '#!' and '/usr/bin/env' at the beginning of the file.
The grep and sed commands may not exist in your $PATH; check your /bin and /usr/bin to see if they are there, or run which grep and which sed in your shell.
If grep and sed do exist, make sure they have the right permissions: they should be readable and executable. In general, this should not happen.
You must not use #!/bin/sh instead of #!/usr/bin/env bash or #!/bin/bash, because that would make the shell run in POSIX-compatible mode, in which most of bash's advanced features, such as arrays, do not work.
If all of above are not the case, then it is really weird.
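The second and third points above can be checked with a quick sketch like this (the loop and its message strings are invented here for illustration):

```shell
#!/usr/bin/env bash
# Report any required tool that is missing from PATH or not executable.
for tool in grep sed; do
    path=$(command -v "$tool") || { echo "missing: $tool"; continue; }
    [ -x "$path" ] || echo "not executable: $path"
done
```

On a healthy system the loop prints nothing; any output names the tool that needs fixing.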

Pretty-print for shell script

I'm looking for something similar to indent, but for (bash) scripts. Console only, no colorizing, etc.
Do you know of one?
Vim can indent bash scripts, but it will not reformat them before indenting.
Back up your bash script, open it with vim, type gg=GZZ and the indentation will be corrected. (Note for the impatient: this overwrites the file, so be sure to do that backup!)
It has some bugs with heredocs (<<) though, e.g. expecting the closing EOF as the first character on a line.
EDIT: ZZ not ZQ
In bash I do this:
reindent() {
    source <(echo "Zibri () {"; cat "$1"; echo "}")
    declare -f Zibri | head --lines=-1 | tail --lines=+3 | sed -e "s/^\s\s\s\s//"
}
This eliminates comments and reindents the script the "bash way".
If you have heredocs in your script, they will get ruined by the sed in the previous function.
So use:
reindent() {
    source <(echo "Zibri () {"; cat "$1"; echo "}")
    declare -f Zibri | head --lines=-1 | tail --lines=+3
}
But then your whole script will have a 4-space indentation.
Or you can do:
reindent ()
{
    rstr=$(mktemp -u "XXXXXXXXXX");
    source <(echo "Zibri () {"; cat "$1" | sed -e "s/^\s\s\s\s/$rstr/"; echo "}");
    echo '#!/bin/bash';
    declare -f Zibri | head --lines=-1 | tail --lines=+3 | sed -e "s/^\s\s\s\s//;s/$rstr/    /"
}
which also takes care of heredocs.
bash 5+ has a --pretty-print option; it will remove comments, though, including a first-line '#!/bin...'
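For example (assuming bash 5.1 or newer; the file path is arbitrary and the exact output indentation may vary by bash version):

```shell
# Write a deliberately unindented throwaway script...
cat > /tmp/pp_demo.sh <<'EOF'
if true; then
echo hi
fi
EOF

# ...and let bash reformat it without executing it:
bash --pretty-print /tmp/pp_demo.sh
```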
shfmt works very well.
You can format bash scripts and also check the formatting from pre-commit hooks.
# reformat
shfmt -l -w script.sh
# check if the formatting is OK
shfmt -d script.sh
# works on the whole directory as well
shfmt -l -w .
The only missing feature is that it does not (yet) reformat according to line length.
Since it is written in Go, you can just download the binary for most platforms, e.g. for Travis (.travis.yml):
install:
- curl -LsS -o ~/shfmt https://github.com/mvdan/sh/releases/download/v3.1.2/shfmt_v3.1.2_linux_amd64
- chmod +x ~/shfmt
script:
- ~/shfmt -d .
There is also a cross-compiled JS version on npm, and a lot of editor plugins (see related projects).
