How to use a single bash command like $1 or $2 or $3 as a string of text instead of one word?

How can I use a .bashrc function like the one below
# .bashrc file
# google_speech
say () {
google_speech -l en "$1"
}
with a whole string of text? The above code only reads out the first word of the sentence or paragraph I paste.
For example, if I go into the terminal and type:
$ say hello friends how are you
then the script acts as if I had typed:
$ say hello

Try to use "$#" (with double quotes arround) to get all the arguments of your function:
$ declare -f mysearch # Showing the definition of mysearch (provided by your .bashrc)
mysearch ()
{
echo "Searching with keywords : $#"
}
$ mysearch foo bar # execution result
Searching with keywords : foo bar
Function or script arguments behave like arrays, so you can use:
1) $# / ${#array[@]} to get the number of arguments / array elements.
2) $1 / ${array[0]} to get the first argument / array element (arrays are indexed from 0, while positional parameters start at 1).
3) $@ / ${array[@]} to get all arguments / array elements.
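For example, the same parallel shown with a throwaway array:
$ array=(foo bar baz)
$ echo "${#array[@]}"   # number of elements, like $#
3
$ echo "${array[0]}"    # first element; positional parameters start at $1
foo
$ echo "${array[@]}"    # all elements, like "$@"
foo bar baz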
EDIT: according to chepner's comment:
Using $@ inside a larger string can sometimes produce unintended results. Here, the intent is to produce a single word, so it would be better to use $*
And here's a good topic with great answers explaining the differences.
EDIT 2: Do not forget to put double quotes around $@ or $*, since your google_speech may only take one argument. Here's a demo to give you a better understanding:
$ mysearch () { echo "Searching with keywords : $1"; }
$ mysearch2 () { mysearch "$*"; }
$ mysearch2 Hello my dear
Searching with keywords : Hello my dear
$ mysearch3 () { mysearch $*; } # missing double quotes
$ mysearch3 Hello my dear
Searching with keywords : Hello # the other arguments are simply ignored (or sometimes the program will fail when it validates its arguments).
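Applied to the original .bashrc function, that would be (a minimal sketch, assuming google_speech is happy to receive the whole sentence as one argument):
# .bashrc file
# google_speech
say () {
google_speech -l en "$*"
}
$ say hello friends how are you   # google_speech now receives "hello friends how are you"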

Related

Is it possible to combine bash variable search and replace with substring?

I have this function:
#! /usr/bin/env bash
function underline() {
    U="${1//?/${2:--}}"
    echo -e "\n$1\n${U:0:${#1}}\n"
}
underline "$1" "^-v-"
which works as expected:
$ ./u.sh "This is all you're going to see"
This is all you're going to see
^-v-^-v-^-v-^-v-^-v-^-v-^-v-^-v
It does what you expect. Originally it assumed that the underline character was just that, 1 character.
For "fun", I extended it to move from a single character underline to a repeating string underline (why? ... well ... because I can I suppose ... almost zero practical value in this exercise!).
And so, being the "let's make this so difficult that you need to write 2 pages of documentation to explain what's going on" sort of guy, I was wondering if the function could be written as a single line. And I don't mean:
function underline() { U="${1//?/${2:--}}"; echo -e "\n$1\n${U:0:${#1}}\n"; }
I don't think you can combine bash's variable search/replace with substring extraction, which is what is required.
I'm aware that this will work happily in zsh:
#! /usr/bin/env zsh
function underline() {
echo -e "\n$1\n${${1//?/${2:--}}:0:${#1}}\n"
}
underline $1 "^-v-"
e.g.
$ ./u.sh "This is all you're going to see"
This is all you're going to see
^-v-^-v-^-v-^-v-^-v-^-v-^-v-^-v
but not capable of this in bash (it seems).
No, you cannot combine those in a single parameter expansion in bash. But this single printf (a builtin in bash) should do the trick:
underline() { printf '%s\n%.*s\n' "$1" ${#1} "${1//?/${2:--}}"; }
underline "This is all you're going to see" "^-v-"
outputs
This is all you're going to see
^-v-^-v-^-v-^-v-^-v-^-v-^-v-^-v
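The trick is the %.*s conversion: the * precision is read from the argument list (here ${#1}), and for %s it caps the number of characters printed, which is what truncates the repeated underline string to the length of the text. For instance:
$ printf '%.*s\n' 5 "abcdefgh"
abcde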

Parameter expansion with replacement, avoid additional variable

I'm trying to work with the input $*, which is one parameter consisting of all the parameters joined together.
This works.
#!/bin/bash
foo() {
params="${*}"
echo "${params//[[:space:]]/-}"
}
foo 1 2 3 4
1-2-3-4
However, is it possible to skip the variable assignment?
"${"${*}"//[[:space:]]/-}"
I'm getting a bad substitution error.
I can also do
: "${*}"
echo "${_//[[:space:]]/-}"
But it feels hacky.
One option could be to set bash's internal field separator, IFS, to - locally and just echo "$*":
foo() {
local IFS=$'-'
echo "$*"
}
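This works because "$*" joins the positional parameters with the first character of IFS, so the call from the question produces the same result:
foo 1 2 3 4
1-2-3-4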
To answer your question, you can do global pattern substitutions on the positional parameters like this:
${*//pat/sub}
${@//pat/sub}
And also arrays like this:
${arr[*]//pat/sub}
${arr[@]//pat/sub}
This won’t join the parameters, but substitutes inside each of them.
Setting IFS to dash adds a dash in between each parameter for echo "$*", or p=$*, but won’t replace anything inside a parameter.
Eg:
$ set -- aa bb 'cc cc'
$ IFS=-
$ echo "$*"
aa-bb-cc cc
To remove all whitespace, including inside a parameter, you can combine them:
IFS=-
echo "${*//[[:space:]]/-}"
Or just assign to a name first, like you were doing:
no_spaces=$*
echo "${no_spaces//[[:space:]]/-}"

Bash: SPACE-triggered completion

In bash, I would like to achieve the following workflow (prompt shown in brackets, _ is the cursor):
Type "foo" ($ foo_)
Press space. At this point:
- if foo is a function, say function foo() { printf "hello from foo"; }, then:
  - a space is printed ($ foo _)
  - the function is called and hello from foo is printed after the space - no newline ($ foo hello from foo_)
- if foo is not a function, a space is printed and that's it ($ foo _)
I have tried:
- Making space send 0x0a (return) to the terminal emulator (iTerm2 on Mac). This works, except it obviously prints the newline character as well.
- Using the built-in complete function: complete -F foo -o nospace foo. This works, but I have to type foo, then SPACE, then TAB for hello from foo to be printed inline.
- I have heard you could somehow embed a \n-eating script into PS1. I really don't know how to get started on that one.
I could also trap a shortcut, such as Ctrl+T, to execute foo - but I'd really like to only press SPACE.
Ideally space would behave like this only for the first word being typed into the terminal. Any help would be appreciated, please save my sanity.
Why in the world I need this: (I'm a geek AND) I've got an emacs-like ido script that I'd like to be invoked when I type cd followed by SPACE.
A way to do this is using -n 1 with read. For example:
foo(){
    echo hello from foo
}
string=''
while read -n 1 a ; do
    if [ "$a" = "" ] ; then
        # a space (or newline) ends the current word
        tpe=$(type -t "$string")
        if [ "$tpe" = "function" ] ; then
            $string
        fi
        string=''
    else
        string="$string$a"
    fi
done
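That loop reads from its own stdin rather than hooking into the interactive prompt. As a rough sketch of a different idea (not the answer above, just something to explore): bash's bind -x exposes READLINE_LINE and READLINE_POINT to a handler, so SPACE itself could be bound. This assumes bash 4 or newer, and _space_handler is only an illustrative name:
_space_handler() {
    # Act only while the first word is being typed, as the question asks.
    if [[ $READLINE_LINE != *' '* && $(type -t "$READLINE_LINE") == function ]]; then
        READLINE_LINE+=" $("$READLINE_LINE")"   # a space, then the function's output, inline
    else
        READLINE_LINE+=' '                      # otherwise SPACE just inserts a space
    fi
    READLINE_POINT=${#READLINE_LINE}
}
bind -x '" ": _space_handler'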

Print a string with its special characters printed as literal escape sequences

I have a string in a shell/bash script. I want to print the string with all its "special characters" (eg. newlines, tabs, etc.) printed as literal escape sequences (eg. a newline is printed as \n, a tab is printed as \t, and so on).
(Not sure if I'm using the correct terminology; the example should hopefully clarify things.)
Example
The desired output of...
a="foo\t\tbar"
b="foo bar"
print_escape_seq "$a"
print_escape_seq "$b"
...is:
foo\t\tbar
foo\t\tbar
$a and $b are strings that were read in from a text file.
There are two tab characters between foo and bar in the $b variable.
An attempt
This is what I've tried:
#!/bin/sh
print_escape_seq() {
str=$(printf "%q\n" $1)
str=${str/\/\//\/}
echo $str
}
a="foo\t\tbar"
b="foo bar"
print_escape_seq "$a"
print_escape_seq "$b"
The output is:
foo\t\tbar
foo bar
So, it doesn't work for $b.
Is there an entirely straightforward way to accomplish this that I've missed completely?
Bash has a string quoting operation, ${var@Q}
Here is some example code
bash_encode () {
    esc=${1@Q}          # quote the value in a reusable form, e.g. $'hello\t\tworld'
    echo "${esc:2:-1}"  # strip the leading $' and the trailing '
}
testval=$(printf "hello\t\tworld")
set | grep "^testval="
echo "The encoded value of testval is" $(bash_encode "$testval")
Here is the output
testval=$'hello\t\tworld'
The encoded value of testval is hello\t\tworld
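For instance, with a string like the second one from the question (${var@Q} needs bash 4.4 or newer):
b=$'foo\t\tbar'
bash_encode "$b"
which prints
foo\t\tbar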
You will need to create a search and replace pattern for each binary value you wish to replace. Something like this:
#!/bin/bash
esc() {
    v=${1// /\\s}       # replace each space character with \s
    v=${v//$'\t'/\\t}   # replace each tab character with \t
    echo "$v"
}
esc "hello world"
esc $'hello\tworld'
This outputs
hello\sworld
hello\tworld
I needed something similar for file paths, and I realized that ls -1b does the job, but while researching I found this Stack Overflow solution, which is closer to what you require:
Command to escape a string in bash
Just compile it with gcc -o escapify escapify.c

read stdin in function in bash script

I have a set of bash functions which output some information:
find-modelname-in-epson-ppds
find-modelname-in-samsung-ppds
find-modelname-in-hp-ppds
etc ...
I've been writing functions which read output and filter it:
function filter-epson {
find-modelname-in-epson-ppds | sed <bla-blah-blah>
}
function filter-hp {
find-modelname-in-hp-ppds | sed <the same bla-blah-blah>
}
etc ...
But then I thought that it would be better to do something like this:
function filter-general {
(somehow get input) | sed <bla-blah-blah>
}
and then call in another high-level functions:
function high-level-func {
# outputs filtered information
find-modelname-in-hp/epson/...-ppds | filter-general
}
How can I achieve that with the best bash practices?
If the question is How do I pass stdin to a bash function?, then the answer is:
Shellscript functions take stdin the ordinary way, as if they were commands or programs. :)
input.txt:
HELLO WORLD
HELLO BOB
NO MATCH
test.sh:
#!/bin/sh
myfunction() {
grep HELLO
}
cat input.txt | myfunction
Output:
hobbes@metalbaby:~/scratch$ ./test.sh
HELLO WORLD
HELLO BOB
Note that command line arguments are ALSO handled in the ordinary way, like this:
test2.sh:
#!/bin/sh
myfunction() {
grep "$1"
}
cat input.txt | myfunction BOB
Output:
hobbes@metalbaby:~/scratch$ ./test2.sh
HELLO BOB
To be painfully explicit that I'm piping from stdin, I sometimes write
cat - | ...
A very simple way to get stdin into a variable is to use read. By default, it reads from file descriptor 0, i.e. stdin (/dev/stdin).
Example Function:
input(){ local in; read in; echo you said $in; }
Example implementation:
echo "Hello World" | input
Result:
you said Hello World
Additional info
You don't need to declare a variable as being local, of course. I just included that for the sake of good form. Plain old read in does what you need.
So you understand how read works: by default it reads data from the given file descriptor (or implicit stdin) and blocks until it encounters a newline. Much of the time you'll find that a newline is implicitly attached to your input, even if you weren't aware of it. If a function seems to hang with this mechanism, keep this detail in mind (there are other ways of using read to deal with that...).
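For completeness, if a function should consume all of stdin rather than just the first line, the usual pattern is a while read loop (a minimal sketch; input_all is just an illustrative name):
input_all() {
    local line
    while IFS= read -r line; do
        echo "you said $line"
    done
}
printf 'Hello World\nGoodbye World\n' | input_all
you said Hello World
you said Goodbye World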
More robust solutions
Adding on to the basic example, here's a variation that lets you pass the input via stdin OR an argument:
input()
{
local in=$1; if [ -z "$in" ]; then read in; fi
echo you said $in
}
With that tweak, you could ALSO call the function like:
input "Hello World"
How about handling an stdin option plus other arguments? Many standard *nix utilities, especially those which typically work with stdin/stdout, adhere to the common practice of treating a dash - as meaning "default", which contextually means either stdin or stdout, so you can follow the convention and treat an argument specified as - to mean "stdin":
input()
{
local a=$1; if [ "$a" == "-" ]; then read a; fi
local b=$2
echo you said $a $b
}
Call this like:
input "Hello" "World"
or
echo "Hello" | input - "World"
Going even further, there is actually no reason to limit stdin to being an option for only the first argument! You might create a super flexible function that could use it for any of them...
input()
{
local a=$1; if [ "$a" == "-" ]; then read a; fi
local b=$2; if [ "$b" == "-" ]; then read b; fi
echo you said $a $b
}
Why would you want that? Because you could formulate, and pipe in, whatever argument you might need...
myFunc | input "Hello" -
In this case, I pipe in the 2nd argument using the results of myFunc, rather than only having that option for the first.
Call sed directly. That's it.
function filter-general {
sed <bla-blah-blah>
}
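Putting it together with the names from the question (a sketch; the sed expression is only a stand-in for the real <bla-blah-blah>):
function filter-general {
    sed 's/^Model: *//'   # stand-in for the real filtering expression
}
function high-level-func {
    find-modelname-in-epson-ppds | filter-general
}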
