Change function arguments in Bash

I'd like to change a function's arguments before passing them on to the next function.
firstfunction() {
# change "-f" to "--format" in arguments
secondfunction "$#"
}
I tried converting to an array, changing the array, and converting back to arguments, but it looks so complicated. Is there a simpler way?
UPDATE: to be more specific...
firstfunction data.txt -f "\d+"
should call
secondfunction data.txt --format "\d+"

This is a surprisingly tough problem. Bash is simply not very good at working with (slightly) complex data structures like arrays.
I think the only conceivable robust solution will require a loop. This is the easiest way I can think of:
function firstfunction {
local -A map=(['-f']='--format');
local -a args=();
local arg;
for arg in "$#"; do
if [[ -v map[$arg] ]]; then
args+=("${map[$arg]}");
else
args+=("$arg");
fi;
done;
echo ${args[@]+"${args[@]}"}; ## replace echo with secondfunction to run
};
firstfunction;
##
firstfunction a b;
## a b
firstfunction x -f -fff -f-f -fxy x-f \ -f -f\ -f;
## x --format -fff -f-f -fxy x-f -f -f --format
Using ${args[@]+"${args[@]}"} instead of just "${args[@]}" for the final expansion works around the ill-advised design decision the bash developers made to reject an empty array as an "unbound variable" if you have the nounset shell option (set -u) enabled. See Bash empty array expansion with `set -u`.
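A minimal demonstration of why the guard matters (the plain expansion was fixed in bash 4.4, so this mainly affects older releases):
set -u
args=()
echo "${args[@]}"             ## "unbound variable" error in bash < 4.4
echo ${args[@]+"${args[@]}"}  ## prints an empty line in any bash version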
Alternative:
function firstfunction {
local -A map=(['-f']='--format');
local -a args=("$@");
local -i i;
for ((i = 0; i < ${#args[@]}; ++i)); do
if [[ -v map[${args[i]}] ]]; then
args[i]="${map[${args[i]}]}";
fi;
done;
echo ${args[@]+"${args[@]}"}; ## replace echo with secondfunction to run
};

You can use getopts to reliably parse and process optional arguments like this:
firstfunction() {
OPTIND=1
local arr=()
while getopts "f:x:" opt; do
case $opt in
f) arr+=(--format "$OPTARG");;
x) arr+=(--execute "$OPTARG");;
esac
done
echo "${arr[#]}"; # or call second function here
}
firstfunction -fabc -x foobar
--format abc --execute foobar
firstfunction -fabc -xfoobar
--format abc --execute foobar
firstfunction -f abc -xfoobar
--format abc --execute foobar
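To actually forward the translated options to secondfunction along with any remaining operands, a hedged sketch might look like this (secondfunction stands in for whatever you really call; note that getopts stops at the first non-option argument, so the options must come before the operands):
firstfunction() {
OPTIND=1
local arr=()
while getopts "f:x:" opt; do
case $opt in
f) arr+=(--format "$OPTARG");;
x) arr+=(--execute "$OPTARG");;
esac
done
shift "$((OPTIND - 1))"          # drop the options getopts consumed
secondfunction "${arr[@]}" "$@"  # translated options first, then operands
}
firstfunction -f '\d+' data.txt  # calls: secondfunction --format '\d+' data.txt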

Related

Printing bash command line args with $@ inline

I want to add verbosity to my bash function by printing the command that it will run. What is the best way to print all arguments $@ inline?
ggtest ()
{
echo 'git grep -n $@ -- "src/tests/*"'
git grep -n "$@" -- "src/tests/*";
}
So that I can see an output such as:
$ ggtest "models and views"
git grep -n "models and views" -- "src/tests/*"
...
An overcomplicated version you can cut down to support only the specific shell releases you need support for:
ggtest ()
{
# note the following explicitly exits if run in a shell w/o array support
local -a cmd || return # declare a function-local array
cmd=( git grep -n "$@" -- "src/tests/*" ) # store intended command in array
# below here, we take a different approach based on running bash version
case $BASH_VERSION in
'') # no BASH_VERSION == we're running with a shell that's not bash at all
set -x # enable trace logging
: "${cmd[#]}" # run : with our array as arguments
{ set +x; } 2>/dev/null # silently disable tracing
;;
[1-4].*) # pre-5.0 bash does not support ${var@Q}; these logs are uglier
{ printf '%q ' "${cmd[@]}"; printf '\n'; } >&2 ;;
*) # modern bash; shortest syntax, prettiest output
printf '%s\n' "${cmd[*]#Q}" >&2;;
esac
"${cmd[#]}" # execute our array
}
Note that in current shell releases printf %q will use backslashes rather than quotes for escaping, so it would change ggtest "some string" to have some\ string in the logs; not the worst thing in the world, but it's less pretty than ${array[*]@Q}'s representation.
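Roughly how the two representations differ, assuming a bash 5.x shell for the @Q form:
cmd=(git grep -n "models and views" -- "src/tests/*")
printf '%q ' "${cmd[@]}"; printf '\n'  # git grep -n models\ and\ views -- src/tests/\*
printf '%s\n' "${cmd[*]@Q}"            # 'git' 'grep' '-n' 'models and views' '--' 'src/tests/*'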

Giving relative address as an input to read in bash scripts [duplicate]

I have a variable in my bash script whose value is something like this:
~/a/b/c
Note that it is unexpanded tilde. When I do ls -lt on this variable (call it $VAR), I get no such directory. I want to let bash interpret/expand this variable without executing it. In other words, I want bash to run eval but not run the evaluated command. Is this possible in bash?
How did I manage to pass this into my script without expansion? I passed the argument in surrounding it with double quotes.
Try this command to see what I mean:
ls -lt "~"
This is exactly the situation I am in. I want the tilde to be expanded. In other words, what should I replace magic with to make these two commands identical:
ls -lt ~/abc/def/ghi
and
ls -lt $(magic "~/abc/def/ghi")
Note that ~/abc/def/ghi may or may not exist.
If the variable var is input by the user, eval should not be used to expand the tilde using
eval var=$var # Do not use this!
The reason is that the user could, by accident (or on purpose), type for example var="$(rm -rf $HOME/)", with possibly disastrous consequences.
A better (and safer) way is to use Bash parameter expansion:
var="${var/#\~/$HOME}"
Due to the nature of StackOverflow, I can't just make this answer unaccepted, but in the intervening 5 years since I posted this there have been far better answers than my admittedly rudimentary and pretty bad answer (I was young, don't kill me).
The other solutions in this thread are safer and better solutions. Preferably, I'd go with either of these two:
Charles Duffy's solution
Håkon Hægland's solution
Original answer for historic purposes (but please don't use this)
If I'm not mistaken, "~" will not be expanded by a bash script in that manner because it is treated as a literal string "~". You can force expansion via eval like this.
#!/bin/bash
homedir=~
eval homedir=$homedir
echo $homedir # prints home path
Alternatively, just use ${HOME} if you want the user's home directory.
Plagiarizing myself from a prior answer: to do this robustly, without the security risks associated with eval:
expandPath() {
local path
local -a pathElements resultPathElements
IFS=':' read -r -a pathElements <<<"$1"
: "${pathElements[#]}"
for path in "${pathElements[#]}"; do
: "$path"
case $path in
"~+"/*)
path=$PWD/${path#"~+/"}
;;
"~-"/*)
path=$OLDPWD/${path#"~-/"}
;;
"~"/*)
path=$HOME/${path#"~/"}
;;
"~"*)
username=${path%%/*}
username=${username#"~"}
IFS=: read -r _ _ _ _ _ homedir _ < <(getent passwd "$username")
if [[ $path = */* ]]; then
path=${homedir}/${path#*/}
else
path=$homedir
fi
;;
esac
resultPathElements+=( "$path" )
done
local result
printf -v result '%s:' "${resultPathElements[@]}"
printf '%s\n' "${result%:}"
}
...used as...
path=$(expandPath '~/hello')
Alternately, a simpler approach that uses eval carefully:
expandPath() {
case $1 in
~[+-]*)
local content content_q
printf -v content_q '%q' "${1:2}"
eval "content=${1:0:2}${content_q}"
printf '%s\n' "$content"
;;
~*)
local content content_q
printf -v content_q '%q' "${1:1}"
eval "content=~${content_q}"
printf '%s\n' "$content"
;;
*)
printf '%s\n' "$1"
;;
esac
}
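Illustrative usage of this second version (the expanded results obviously depend on your environment):
expandPath '~/hello'           # /home/user/hello when HOME=/home/user
expandPath '~+/build'          # $PWD/build
expandPath '/no/tilde/here'    # printed back unchanged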
How about this:
path=`realpath "$1"`
Or:
path=`readlink -f "$1"`
A safe way to use eval is "$(printf "~/%q" "$dangerous_path")". Note that this is bash-specific.
#!/bin/bash
relativepath=a/b/c
eval homedir="$(printf "~/%q" "$relativepath")"
echo $homedir # prints $HOME/a/b/c
See this question for details
Also, note that under zsh this would be as simple as echo ${~dangerous_path}
Here is a ridiculous solution:
$ echo "echo $var" | bash
An explanation of what this command does:
create a new instance of bash, by... calling bash;
take the string "echo $var" and substitute $var with the value of the variable (thus after the substitution the string will contain the tilde);
take the string produced by step 2 and send it to the instance of bash created in step one, which we do here by calling echo and piping its output with the | character.
Basically the current bash instance we're running takes our place as the user of another bash instance and types in the command "echo ~..." for us.
Expanding (no pun intended) on birryree's and halloleo's answers: The general approach is to use eval, but it comes with some important caveats, namely spaces and output redirection (>) in the variable. The following seems to work for me:
mypath="$1"
if [ -e "`eval echo ${mypath//>}`" ]; then
echo "FOUND $mypath"
else
echo "$mypath NOT FOUND"
fi
Try it with each of the following arguments:
'~'
'~/existing_file'
'~/existing file with spaces'
'~/nonexistant_file'
'~/nonexistant file with spaces'
'~/string containing > redirection'
'~/string containing > redirection > again and >> again'
Explanation
The ${mypath//>} strips out > characters which could clobber a file during the eval.
The eval echo ... is what does the actual tilde expansion
The double-quotes around the -e argument are for support of filenames with spaces.
Perhaps there's a more elegant solution, but this is what I was able to come up with.
Why not delve straight into getting the user's home directory with getent?
$ getent passwd mike | cut -d: -f6
/users/mike
I believe this is what you're looking for
magic() { # returns unexpanded tilde expression on invalid user
local _safe_path; printf -v _safe_path "%q" "$1"
eval "ln -sf ${_safe_path#\\} /tmp/realpath.$$"
readlink /tmp/realpath.$$
rm -f /tmp/realpath.$$
}
Example usage:
$ magic ~nobody/would/look/here
/var/empty/would/look/here
$ magic ~invalid/this/will/not/expand
~invalid/this/will/not/expand
Here is the POSIX function equivalent of Håkon Hægland's Bash answer:
expand_tilde() {
tilde_less="${1#\~/}"
[ "$1" != "$tilde_less" ] && tilde_less="$HOME/$tilde_less"
printf '%s' "$tilde_less"
}
2017-12-10 edit: add '%s' per @CharlesDuffy in the comments.
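Illustrative usage (the home directory shown is an assumption):
expand_tilde '~/abc'   # prints /home/user/abc when HOME=/home/user
expand_tilde '/etc'    # prints /etc unchanged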
Here's my solution:
#!/bin/bash
expandTilde()
{
local tilde_re='^(~[A-Za-z0-9_.-]*)(.*)'
local path="$*"
local pathSuffix=
if [[ $path =~ $tilde_re ]]
then
# only use eval on the ~username portion !
path=$(eval echo ${BASH_REMATCH[1]})
pathSuffix=${BASH_REMATCH[2]}
fi
echo "${path}${pathSuffix}"
}
result=$(expandTilde "$1")
echo "Result = $result"
Simplest: replace 'magic' with 'eval echo'.
$ eval echo "~"
/whatever/the/f/the/home/directory/is
Problem: You're going to run into issues with other variables because eval is evil. For instance:
$ # home is /Users/Hacker$(s)
$ s="echo SCARY COMMAND"
$ eval echo $(eval echo "~")
/Users/HackerSCARY COMMAND
Note that the issue of the injection doesn't happen on the first expansion. So if you were to simply replace magic with eval echo, you should be okay. But if you do echo $(eval echo ~), that would be susceptible to injection.
Similarly, if you do eval echo ~ instead of eval echo "~", that would count as twice expanded and therefore injection would be possible right away.
For anyone's reference, a function to mimic python's os.path.expanduser() behavior (no eval usage):
# _expand_homedir_tilde ~/.vim
/root/.vim
# _expand_homedir_tilde ~myuser/.vim
/home/myuser/.vim
# _expand_homedir_tilde ~nonexistent/.vim
~nonexistent/.vim
# _expand_homedir_tilde /full/path
/full/path
And the function:
function _expand_homedir_tilde {
(
set -e
set -u
p="$1"
if [[ "$p" =~ ^~ ]]; then
u=`echo "$p" | sed 's|^~\([a-z0-9_-]*\)/.*|\1|'`
if [ -z "$u" ]; then
u=`whoami`
fi
h=$(set -o pipefail; getent passwd "$u" | cut -d: -f6) || exit 1
p=`echo "$p" | sed "s|^~[a-z0-9_-]*/|${h}/|"`
fi
echo $p
) || echo $1
}
Just to extend birryree's answer for paths with spaces: You cannot use the eval command as-is because it separates evaluation by spaces. One solution is to replace spaces temporarily for the eval command:
mypath="~/a/b/c/Something With Spaces"
expandedpath=${mypath// /_spc_} # replace spaces
eval expandedpath=${expandedpath} # expand the tilde
expandedpath=${expandedpath//_spc_/ } # put spaces back
echo "$expandedpath" # prints e.g. /Users/fred/a/b/c/Something With Spaces
ls -lt "$expandedpath" # outputs dir content
This example relies, of course, on the assumption that mypath never contains the character sequence "_spc_".
You might find this easier to do in python.
(1) From the unix command line:
python -c 'import os; import sys; print(os.path.expanduser(sys.argv[1]))' ~/fred
Results in:
/Users/someone/fred
(2) Within a bash script as a one-off - save this as test.sh:
#!/usr/bin/env bash
thepath=$(python -c 'import os; import sys; print(os.path.expanduser(sys.argv[1]))' "$1")
echo $thepath
Running bash ./test.sh results in:
/Users/someone/fred
(3) As a utility - save this as expanduser somewhere on your path, with execute permissions:
#!/usr/bin/env python
import sys
import os
print(os.path.expanduser(sys.argv[1]))
This could then be used on the command line:
expanduser ~/fred
Or in a script:
#!/usr/bin/env bash
thepath=$(expanduser "$1")
echo $thepath
Just use eval correctly: with validation.
case $1${1%%/*} in
([!~]*|"$1"?*[!-+_.[:alnum:]]*|"") ! :;;
(*/*) set "${1%%/*}" "${1#*/}" ;;
(*) set "$1"
esac&& eval "printf '%s\n' $1${2+/\"\$2\"}"
I have done this with variable parameter substitution after reading in the path using read -e (among others). So the user can tab-complete the path, and if the user enters a ~ path it gets sorted.
read -rep "Enter a path: " -i "${testpath}" testpath
testpath="${testpath/#~/${HOME}}"
ls -al "${testpath}"
The added benefit is that if there is no tilde nothing happens to the variable, and if there is a tilde but not in the first position it is also ignored.
(I include the -i for read since I use this in a loop so the user can fix the path if there is a problem.)
For some reason, when the string is already quoted, only Perl saves the day:
#val="${val/#\~/$HOME}" # for some reason does not work !!
val=$(echo $val|perl -ne 's|~|'$HOME'|g;print')
I think that
thepath=( ~/abc/def/ghi )
is easier than all the other solutions... or am I missing something? It works even if the path does not really exist.
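A quick check of what that assignment does (the home directory shown is illustrative). Note that it relies on the tilde being a literal in the source; a tilde stored in a quoted variable will not be expanded this way:
thepath=( ~/abc/def/ghi )
echo "${thepath[0]}"   # /home/user/abc/def/ghi when HOME=/home/user, even if the path doesn't exist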

How to set an arbitrary positional argument, while still preserving the rest?

I would like to do something like this, but preserve every argument after $i:
for i in "$#"; do
if [[ $i == "--" ]]; then
set $i "-S --"
break
fi
done
ls "$#"
In this example, I want to make a simple wrapper over ls where -S is always the final option that is applied.
This is simple if the arguments do not have "--":
ls "$#" -S
However, this breaks whenever there is a "--" as an argument.
To work around this, I would like to find the first occurrence of -- and place an -S before it.
EDIT:
The reason why I do not use:
ls -S "$#"
is because I want the output to be sorted by size LAST. So if -t is passed into the arguments, the output should be sorted by modification time THEN by size. That use case fails here:
ls -S -t
Create a second array by iterating over the first one and inserting -S where needed.
#! /bin/bash
arr=()
for arg in "$#" ; do
if [[ $arg == -- ]] ; then
arr+=(-S --)
else
arr+=("$arg")
fi
done
ls "${arr[#]}"
You might need to insert it just once to be utterly correct:
#! /bin/bash
arr=()
inserted=
for arg in "$#" ; do
if [[ $arg == -- && ! $inserted ]] ; then
arr+=(-S --)
inserted=1
else
arr+=("$arg")
fi
done
If you really need to set the positional arguments, use
set "${arr[#]}"
to set positional arguments to the members of ${arr[#]}.
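Putting the pieces together as the ls wrapper the question describes might look like this (the function name lss is made up):
lss() {
local arr=() arg inserted=
for arg in "$@" ; do
if [[ $arg == -- && ! $inserted ]] ; then
arr+=(-S --)
inserted=1
else
arr+=("$arg")
fi
done
[[ $inserted ]] || arr+=(-S)   # no "--" given: just append -S at the end
ls "${arr[@]}"
}
lss -t           # runs: ls -t -S
lss -t -- dir    # runs: ls -t -S -- dir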

Why are "declare -f" and "declare -a" needed in bash scripts?

Sorry for the innocent question - I'm just trying to understand...
For example - I have:
$ cat test.sh
#!/bin/bash
declare -f testfunct
testfunct () {
echo "I'm function"
}
testfunct
declare -a testarr
testarr=([1]=arr1 [2]=arr2 [3]=arr3)
echo ${testarr[@]}
And when I run it I get:
$ ./test.sh
I'm function
arr1 arr2 arr3
So here is a question - why do I have to (if I have to ...) insert declare here?
With it - or without it it works the same...
I can understand for example declare -i var or declare -r var. But for what is -f (declare function) and -a (declare array)?
declare -f functionname is used to output the definition of the function functionname, if it exists, and absolutely not to declare that functionname is/will be a function. Look:
$ unset -f a # unsetting the function a, if it existed
$ declare -f a
$ # nothing output and look at the exit code:
$ echo $?
1
$ # that was an "error" because the function didn't exist
$ a() { echo 'Hello, world!'; }
$ declare -f a
a ()
{
echo 'Hello, world!'
}
$ # ok? and look at the exit code:
$ echo $?
0
$ # cool :)
So in your case, declare -f testfunct will do nothing, except possibly if testfunct exists, it will output its definition on stdout.
As far as I know, the -a option alone does not have any practical relevance, but I think it's a plus for readability when declaring arrays. It becomes more interesting when it is combined with other options to generate arrays of a special type.
For example:
# Declare an array of integers
declare -ai int_array
int_array=(1 2 3)
# Setting a string as array value fails
int_array[0]="I am a string"
# Convert array values to lower case (or upper case with -u)
declare -al lowercase_array
lowercase_array[0]="I AM A STRING"
lowercase_array[1]="ANOTHER STRING"
echo "${lowercase_array[0]}"
echo "${lowercase_array[1]}"
# Make a read only array
declare -ar readonly_array=(42 "A String")
# Setting a new value fails
readonly_array[0]=23
declare -f allows you to list all defined (or sourced) functions and their contents.
Example of use:
[ ~]$ cat test.sh
#!/bin/bash
f(){
echo "Hello world"
}
# print 0 if is defined (success)
# print 1 if isn't defined (failure)
isDefined(){
declare -f "$1" >/dev/null && echo 0 || echo 1
}
isDefined f
isDefined g
[ ~]$ ./test.sh
0
1
[ ~]$ declare -f
existFunction ()
{
declare -f "$1" > /dev/null && echo 0 || echo 1
}
f ()
{
echo "Hello world"
}
However, as gniourf_gniourf smartly points out below, it's better to use declare -F to test for the existence of a function.
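For completeness, a sketch of the same check done with declare -F, which only prints the function name rather than dumping its whole body:
isDefined(){
declare -F "$1" >/dev/null && echo 0 || echo 1
}
isDefined f   # prints 0 if f is defined, 1 otherwise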

Listing defined functions in Bash

I'm trying to write some code in bash which uses introspection to select the appropriate function to call.
Determining the candidates requires knowing which functions are defined. It's easy to list defined variables in bash using only parameter expansion:
$ prefix_foo="one"
$ prefix_bar="two"
$ echo "${!prefix_*}"
prefix_bar prefix_foo
However, doing this for functions appears to require filtering the output of set -- a much more haphazard approach.
Is there a Right Way?
How about compgen:
compgen -A function # compgen is a shell builtin
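compgen also accepts an optional word to filter against, which suits the introspection use case in the question (the prefix here is illustrative):
compgen -A function prefix_   # list only functions whose names start with prefix_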
$ declare -F
declare -f ::
declare -f _get_longopts
declare -f _longopts_func
declare -f _onexit
...
So, Jed Daniel's alias,
declare -F | cut -d" " -f3
cuts on a space and echos the 3rd field:
$ declare -F | cut -d" " -f3
::
_get_longopts
_longopts_func
_onexit
I have an entry in my .bashrc that says:
alias list='declare -F |cut -d" " -f3'
This allows me to type list and get a list of functions. When I added it, I probably understood what was happening, but I can't remember it to save my life at the moment.
Good luck,
--jed
zsh only (not what was asked for, but all the more generic questions have been closed as a duplicate of this):
typeset -f +
From man zshbuiltins:
-f The names refer to functions rather than parameters.
+ If `+' appears by itself in a separate word as the last
option, then the names of all parameters (functions with -f)
are printed, but the values (function bodies) are not.
Example:
martin@martin ~ % cat test.zsh
#!/bin/zsh
foobar()
{
echo foobar
}
barfoo()
{
echo barfoo
}
typeset -f +
Output:
martin@martin ~ % ./test.zsh
barfoo
foobar
Use the declare builtin to list currently defined functions:
declare -F
This has no issues with IFS or globbing:
readarray -t funcs < <(declare -F)
printf '%s\n' "${funcs[#]##* }"
Of course, that needs bash 4.0.
For bash since 2.04 use (a little trickier but equivalent):
IFS=$'\n' read -d '' -a funcs < <(declare -F)
If you need the exit status of this to be zero, use this:
IFS=$'\n' read -d '' -a funcs < <( declare -F && printf '\0' )
It will exit unsuccessfully (non-zero) if either declare or read fails. (Thanks to @CharlesDuffy)
One (ugly) approach is to grep through the output of set:
set \
| egrep '^[^[:space:]]+ [(][)][[:space:]]*$' \
| sed -r -e 's/ [(][)][[:space:]]*$//'
Better approaches would be welcome.
Pure Bash:
saveIFS="$IFS"
IFS=$'\n'
funcs=($(declare -F)) # create an array
IFS="$saveIFS"
funcs=(${funcs[@]##* }) # keep only what's after the last space
Then, run at the Bash prompt as an example displaying bash-completion functions:
$ for i in ${funcs[@]}; do echo "$i"; done
__ack_filedir
__gvfs_multiple_uris
_a2dismod
. . .
$ echo ${funcs[42]}
_command
This collects a list of function names matching any of a list of patterns:
functions=$(for c in $patterns; do compgen -A function | grep "^$c\$"; done)
The grep limits the output to only exact matches for the patterns.
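Hypothetical usage, with two anchored regular-expression patterns:
patterns="_git.* _docker.*"
functions=$(for c in $patterns; do compgen -A function | grep "^$c\$"; done)
echo "$functions"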
Check out the bash command type as a better alternative to the following. Thanks to Charles Duffy for the clue.
The following uses that to answer the title question for humans rather than shell scripts: it adds the list of function names matching the given patterns to the regular which output, answering "What code runs when I type a command?"
which() {
for c in "$#"; do
compgen -A function |grep "^$c\$" | while read line; do
echo "shell function $line" 1>&2
done
/usr/bin/which "$c"
done
}
So,
(xkcd)Sandy$ which deactivate
shell function deactivate
(xkcd)Sandy$ which ls
/bin/ls
(xkcd)Sandy$ which .\*run_hook
shell function virtualenvwrapper_run_hook
This is arguably a violation of the Unix "do one thing" philosophy, but I've more than once been desperate because which wasn't finding a command that some package was supposed to contain, me forgetting about shell functions, so I've put this in my .profile.
#!/bin/bash
# list-defined-functions.sh
# Lists functions defined in this script.
#
# Using `compgen -A function`,
# We can save the list of functions defined before running our script,
# then compare that to a new list at the end,
# resulting in the list of newly added functions.
#
# Usage:
# bash list-defined-functions.sh # Run in new shell with no predefined functions
# list-defined-functions.sh # Run in current shell with plenty of predefined functions
#
# Example predefined function
foo() { echo 'y'; }
# Retain original function list
# If this script is run a second time, keep the list from last time
[[ $original_function_list ]] || original_function_list=$(compgen -A function)
# Create some new functions...
myfunc() { echo "myfunc is the best func"; }
function another_func() { echo "another_func is better"; }
function superfunction { echo "hey another way to define functions"; }
# ...
# function goo() { echo ok; }
[[ $new_function_list ]] || new_function_list=$(comm -13 \
<(echo "$original_function_list") \
<(compgen -A function))
echo "Original functions were:"
echo "$original_function_list"
echo
echo "New Functions defined in this script:"
echo "$new_function_list"
