Is there a way to implement/use lambda functions in bash? I'm thinking of something like:
$ someCommand | xargs -L1 (lambda function)
I don't know of a way to do this, however you may be able to accomplish what you're trying to do using:
somecommand | while read -r; do echo "Something with $REPLY"; done
This will also be faster, as you won't be creating a new process for each line of text.
[EDIT 2009-07-09]
I've made two changes:
Incorporated litb's suggestion of using -r to disable backslash processing -- this means that backslashes in the input will be passed through unchanged.
Instead of supplying a variable name (such as X) as a parameter to read, we let read assign to its default variable, REPLY. This has the pleasant side-effect of preserving leading and trailing spaces, which are stripped otherwise (even though internal spaces are preserved).
From my observations, together these changes preserve everything except literal NUL (ASCII 0) characters on each input line.
[EDIT 2016-07-26]
According to commenter Evi1M4chine, setting $IFS to the empty string before running read X (e.g., with the command IFS='' read X) should also preserve spaces at the beginning and end when storing the result into $X, meaning you aren't forced to use $REPLY.
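A quick way to observe the whitespace behaviour (a sketch; the brackets just make the preserved spaces visible):
printf '  padded  \n' | { read X; echo "[$X]"; }            # prints [padded]
printf '  padded  \n' | { IFS='' read -r X; echo "[$X]"; }  # prints [  padded  ]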
If you want true functions, and not just pipes or while loops (e.g. if you want to pass them around as if they were data), I'd just not do lambdas, and instead define dummy functions with a recurring dummy name, to use right away and throw away afterwards. Like so:
# An example map function, to use in the example below.
map() { local f="$1"; shift; for i in "$@"; do "$f" "$i"; done; }
# Lambda function [λ], passed to the map function.
λ(){ echo "Lambda sees $1"; }; map λ *
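In a directory containing files a and b, that would print something like (output depends on your directory contents):
Lambda sees a
Lambda sees b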
Like in proper functional languages, there’s no need to pass parameters, as you can wrap them in a closure:
# Let’s say you have a function with three parameters
# that you want to use as a lambda:
# (As in: Partial function application.)
trio(){ echo "$1 Lambda sees $3 $2"; }
# And there are two values that you want to use to parametrize a
# function that shall be your lambda.
pre="<<<"
post=">>>"
# Then you’d just wrap them in a closure, and be done with it:
λ(){ trio "$pre" "$post" "$@"; }; map λ *
I’d argue that it’s even shorter than all other solutions presented here.
What about this?
somecommand | xargs -d"\n" -I{} echo "the argument is: {}"
(assumes each argument is a line, otherwise change delimiter)
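If the input may contain newlines or other awkward characters, the NUL-delimited variant is safer (a sketch using find's -print0):
find . -type f -print0 | xargs -0 -I{} echo "the argument is: {}"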
#!/bin/bash
function customFunction() {
eval "$1"
}
command='echo Hello World; echo Welcome;'
customFunction "$command"
GL
If you want only xargs (for its parallel -P N option, for example) and only bash as the function code, then bash -c can be used as the parameter for xargs.
seq 1 10 | tr '\n' '\0' | xargs -0 -n 1 bash -c 'echo any bash code $0'
tr and the -0 option are used here so that xargs performs no quote or whitespace processing of its own on the input.
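Combined with the -P option mentioned above, the per-item bodies run in parallel (a sketch; -P 4 is an arbitrary degree of parallelism):
seq 1 10 | tr '\n' '\0' | xargs -0 -n 1 -P 4 bash -c 'echo "processing $0"'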
Yes. One can pass around a string variable representing a command call, and then execute the command with eval.
Example:
command='echo howdy'
eval "$command"
The eval trick has already been mentioned, but here's my extended example of bash closures:
#!/usr/bin/env bash
set -e
function multiplyBy() {
X="$1"
cat <<-EOF
Y="\$1"
echo "$X * \$Y = \$(( $X * \$Y ))"
EOF
}
function callFunc() {
CODE="$1"
shift
eval "$CODE"
}
MULT_BY_2=`multiplyBy 2`
MULT_BY_4=`multiplyBy 4`
callFunc "$MULT_BY_2" 10
callFunc "$MULT_BY_4" 10
PS: I just came up with this for a completely different purpose and was searching Google to see if somebody else already uses the trick. I actually needed to evaluate a reusable function in the context (shell) of the main script.
Related
I'm setting up my shell environments and I want to be able to use some of the same functions/aliases in zsh as in bash. One of these functions opens either .bashrc or .zshrc in an editor (whichever file is relevant), waits for the editor to close, then reloads the rc file.
# a very simplified version of this function
editrc() {
local rcfile=".$(basename $SHELL)rc"
code -w ~/$rcfile
. ~/$rcfile
}
I use the value of rcfile in a few other functions, so I've pulled it out of the function declaration.
_rc=".$(basename $SHELL)rc"
editrc() {
code -w ~/$_rc
. ~/$_rc
}
# ... other functions that use it ...
unset _rc
However, because I'm a neat freak, I want to unset _rc at the end of my script, but I still want my functions to run correctly. Is there a clever way to evaluate $_rc at the time the function is declared?
I know I could use eval and place everything except $_rc instances within single quotes, but that seems like a pain, since the full version of my function uses both single-quotes and double-quotes.
_rc=".$(basename $SHELL)rc"
eval 'editrc() {
echo Here'"'"'s a thing that uses single quotes. As you can see it'"'"'s a pain.
code -w ~/'$_rc'
. ~/'$_rc'
}'
# ... other functions using `_rc`
unset _rc
I'm guessing I could declare my functions, then do some magic with eval "$(declare -f editrc | awk ...)". It may very well be more pain than it's worth, but I'm always interested in learning new things.
Note: I'd love to generalize this into a utility function that does this.
_myvar=foo
anothervar=bar
myfunc() {
echo $_myvar $anothervar
}
# redeclares myfunc with `$_myvar` expanded, but leaves `$anothervar` as-is
expandfunctionvars myfunc '$_myvar'
Is there a clever way to evaluate $_rc at the time the function is declared?
_rc=".$(basename "$SHELL")rc"
# while you could eval here, source lets you work with a stream
source <(
cat <<EOF
editrc() {
local _rc
# first safely transfer context
$(declare -p _rc)
EOF
# use a quoted here document to do anything inside without caring about quoting.
cat <<'EOF'
# do anything else
echo "Here's a thing that uses single quotes. As you can see it's not a pain, just choose proper quoting."
code -w "~/$_rc"
. "~/$_rc"
}
EOF
)
unset _rc
Generally, first use declare -p to transfer the variables as properly quoted strings to be evaluated. Then, after you "import" the variables, use a quoted here document to write the rest as in a normal script.
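For reference, declare -p emits a string that re-creates the variable when sourced (a quick sketch; x is an arbitrary name):
x='a  b "c"'
declare -p x   # prints: declare -- x="a  b \"c\""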
References to read:
<<EOF is a here document. Note the difference in parsing when the here delimiter is quoted vs unquoted.
<(..) is a process substitution
The source command reads a pipe created by process substitution. Inside the process substitution I output the function to be sourced. With the first here document I output the function name definition, with a local for the variable so that it doesn't pollute the global namespace. Then with declare -p I output the variable definition as a properly quoted string, later to be sourced by source. Then with a quoted here document I output the rest of the function, so that I do not need to care about quoting.
The code is bash specific, I know nothing about zsh and don't use it.
You could do it with eval too:
eval '
editrc() {
local _rc
# first safely transfer context
'"$(declare -p _rc)"'
# do anything else
echo "Here'\''s a thing that uses single quotes. As you can see it'\''s not a pain, just choose proper quoting."
code -w "~/$_rc"
. "~/$_rc"
}'
But for me using a quoted here document delimiter allows for easier writing.
While KamilCuck was working on their answer, I devised a function that will take in any function name and a set of variable names, expand just those variables, and redeclare the function.
expandFnVars() {
if [[ $# -lt 2 ]]; then
>&2 echo 'expandFnVars requires at least two arguments: the function name and the variable(s) to be expanded'
return 1
fi
local fn="$1"
shift
local vars=("$@")
if [[ -z "$(declare -F $fn 2> /dev/null)" ]]; then
>&2 echo $fn is not a function.
return 1
fi
foundAllVars=true
for v in "${vars[@]}"; do
if [[ -z "$(declare -p $v 2> /dev/null)" ]]; then
>&2 echo "$v is not a declared variable."
foundAllVars=false
fi
done
[[ $foundAllVars != true ]] && return 1
fn="$(declare -f $fn)"
for v in "${vars[@]}"; do
local val="${!v}" # get the value of the variable named by $v (indirect expansion)
val="${val//\"/\\\"}" # escape any double-quotes
val="${val//\\/\\\\\\}" # escape any backslashes
fn="$(echo "$fn" | sed -r 's/"?\$'$v'"?/"'"$val"'"/g')" # replace instances of "$$v" and $$v with $val
done
eval "$fn"
}
Usage:
foo="foo bar"
bar='$foo'
baz=baz
fn() {
echo $bar $baz
}
expandFnVars fn bar
declare -f fn
# prints:
# fn ()
# {
# echo "$foo" $baz
# }
expandFnVars fn foo
declare -f fn
# prints:
# fn ()
# {
# echo "foo bar" $baz
# }
Looking at it now, I see one flaw. Suppose $bar in the original function was in single quotes. We probably would not want its value to be replaced. This could be fixed by some clever regex lookbehinds to count the number of unescaped single quotes, but I'm happy with it as-is.
I have this (test) script:
#!/bin/bash
my_cmd_bad_ ( ) {
cmd="$#"
$cmd
}
my_cmd_good_ ( ) {
"$#"
}
my_cmd_bad_ ls -l "file with space"
my_cmd_good_ ls -l "file with space"
The output is (the file does not exist, which is not the point of this question):
» ~/test.sh
ls: cannot access file: No such file or directory
ls: cannot access with: No such file or directory
ls: cannot access space: No such file or directory
ls: cannot access file with space: No such file or directory
I am surprised that the first version does not work as expected: the parameter is not quoted, and instead of processing one file, it processes three. Why?
How can I save the command that I want to execute, properly quoted? I need to execute it later, where I do not have "$@" anymore.
A simple rework of this test script would be appreciated.
See similar question: How to pass command line parameters with quotes stored in single variable?
Use these utility functions to save a command to a string for later execution:
bash_escape() {
# backtick indirection strictly necessary here: we use it to strip the
# trailing newline from sed's output, which Solaris/BSD sed *always* output
# (unlike GNU sed, which outputs "test": printf %s test | sed -e s/dummy//)
out=`echo "$1" | sed -e s/\\'/\\''\\\\'\\'\\'/g`
printf \'%s\' "$out"
}
append_bash_escape() {
printf "%s " "$1"
bash_escape "$2"
}
your_cmd_fixed_ ( ) {
cmd="$#"
while [ $# -gt 0 ] ; do
cmd=`append_bash_escape "$cmd" "$1"` ; shift
done
eval "$cmd"
}
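A quick check of the fixed wrapper (a sketch; the file is assumed missing, as in the test script above):
your_cmd_fixed_ ls -l "file with space"
# ls: cannot access file with space: No such file or directory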
You can quote any single parameter and evaluate it later:
my_cmd_bad_ ( ) {
j=0
for i in "$#"; do
cmd["$j"]=\"$"$i"\"
j=$(( $j + 1 ))
done;
eval ${cmd[*]}
}
You are combining three space-delimited strings "ls", "-l", and "file with space" into a single space-delimited string cmd. There's no way to know which spaces were originally quoted (in "file with space") and which spaces were introduced during the assignment to cmd.
Typically, it is not a good idea to try to build up command lines into a single string. Use functions, or isolate the actual command and leave the arguments in "$@".
Rewrite the command like this:
my_cmd_bad_ () {
cmd=$1; shift
$cmd "$#"
}
See http://mywiki.wooledge.org/BashFAQ/050
Note that your second version is greatly preferred most of the time. The only exceptions are if you need to do something special. For example, you can't bundle an assignment or redirect or compound command into a parameter list.
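To make that limitation concrete (a sketch; the redirect becomes a literal argument rather than a redirection):
args=(ls -l ">out")
"${args[@]}"   # runs: ls -l '>out' (no file is created)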
The correct way to handle the quoting issue requires non-standard features. Semi-realistic example involving a template:
function myWrapper {
typeset x IFS=$' \t\n'
{ eval "$(</dev/fd/0)"; } <<-EOF
for x in $(printf '%q ' "$@"); do
echo "\$x"
done
EOF
}
myWrapper 'foo bar' $'baz\nbork'
Make sure you understand exactly what's going on here and that you really have a good reason for doing this. It requires ensuring side-effects can't affect the arguments. This specific example doesn't demonstrate a very good use case because everything is hard-coded so you're able to correctly escape things in advance and expand the arguments quoted if you wanted.
I'm trying to create a Bourne shell script that will take 0 or more arguments and print out the last argument. I'm used to writing Java, so I'm confused by this; I'm slowly starting to learn C.
An alternative to @Michael's solution:
#!/usr/bin/env bash
echo "${#: -1}"
$# is the array with all the parameters.
: is used to split strings normally (you can try this with echo "${USER: -1}" to verify that it prints the last character of your user name), but can also be used for arrays to get the last element.
The curly brackets are needed for the array indexing to work
The quotes are simply good practice, to make the code more flexible in case you want to mix in other variables, or the value needs to be used in a statement which needs an empty string rather than nothing in case of no parameters (for example if [ "${@: -1}" = "--help" ])
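For example, saved as last.sh (a hypothetical file name):
bash last.sh a b "c d"   # prints: c d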
lastArg=`echo $@ | awk '{print $NF}'`
Here is a short Bash script that will do it:
#!/usr/bin/env bash
echo "${!#}"
This is not a Bourne shell script, though. Args are not read from the keyboard with the read command. Instead they are supplied on the command line when running your script. For example, if you put this text in script.sh and run ./script.sh a b c d e it will print:
e
In bash:
last_arg=`echo $* | rev | cut -d " " -f1 | rev`;
echo $last_arg
Your question mentions C. In C it's easier:
int main (int argc, char *argv[]) {
char *last_arg = argv[argc - 1];
[...]
}
This should work:
eval lastarg='$'$#
echo last arg is $lastarg
This works with /bin/sh being bash or dash.
I am unable to test it with a "real" /bin/sh, whatever that would be.
How can shell arguments be stored in a file for later use while preserving quoting?
To be clear: I don't want to pass on the arguments in place, which could be easily done using "$#". But actually need to store them in a file for later use.
#!/bin/sh
storeargs() {
: #-)
}
if "$1"
then
# useargs is actually 'git filter-branch'
useargs "$#"
storeargs "$#"
else
# without args use those from previous invocation
eval useargs $(cat store)
fi
$ foo 'a "b"' "c 'd'" '\'' 'd
e'
$ foo # behave as if called with same arguments again
The question likely comes down to how to quote a string using common tools in general (awk, perl, ...). I would prefer a solution that does not make the quoted string unreadable. The content of store should look more or less like what I would specify on the commandline.
The question is complicated by the fact that the arguments/strings to be quoted might already contain any kind of valid (shell) quoting and/or any kind of (significant) whitespace, so unconditionally putting single or double quotes around every argument or storing one argument per line won't work.
Why do the heavy lifting?
storeargs() {
while [ $# -gt 0 ]
do
printf "%q " "$1"
shift
done
}
You can now
storeargs "some" "weird $1 \`bunch\` of" params > myparams.txt
storeargs "some" 'weird $1 \`bunch\` of' params >> myparams.txt
cat myparams.txt
Output
some weird\ \ \`bunch\`\ of params
some weird\ \$1\ \\\`bunch\\\`\ of params
This version stores the arguments one per line, so it may be a bit ugly in terms of storage. I doubt that it is completely robust, but it satisfies your example (for useargs() { for i in "$@"; do echo "$i"; done; }):
storeargs() { printf "%q\n" "$#"; } > store
if test -n "$1"; then
useargs "$#"
storeargs "$#"
else
eval useargs $(cat store)
fi
--EDIT--
Use %q in printf to quote the strings (shamelessly copied from sehe's answer). Note that %q is available in the bash built-in printf, but not in standard printf.
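Since %q output is shell-parseable, the saved arguments can be restored later with eval and set (a sketch, reusing the store file from above):
eval "set -- $(cat store)"   # positional parameters restored, quoting intact
useargs "$@"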
I'm trying to write some code in bash which uses introspection to select the appropriate function to call.
Determining the candidates requires knowing which functions are defined. It's easy to list defined variables in bash using only parameter expansion:
$ prefix_foo="one"
$ prefix_bar="two"
$ echo "${!prefix_*}"
prefix_bar prefix_foo
However, doing this for functions appears to require filtering the output of set -- a much more haphazard approach.
Is there a Right Way?
How about compgen:
compgen -A function # compgen is a shell builtin
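compgen also accepts an optional word to filter by prefix, mirroring the ${!prefix_*} expansion used for variables above (a sketch):
compgen -A function prefix_   # lists only functions whose names begin with prefix_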
$ declare -F
declare -f ::
declare -f _get_longopts
declare -f _longopts_func
declare -f _onexit
...
So, Jed Daniel's alias,
declare -F | cut -d" " -f3
cuts on a space and echoes the third field:
$ declare -F | cut -d" " -f3
::
_get_longopts
_longopts_func
_onexit
I have an entry in my .bashrc that says:
alias list='declare -F |cut -d" " -f3'
Which allows me to type list and get a list of functions. When I added it, I probably understood what was happening, but I can't remember to save my life at the moment.
Good luck,
--jed
zsh only (not what was asked for, but all the more generic questions have been closed as a duplicate of this):
typeset -f +
From man zshbuiltins:
-f The names refer to functions rather than parameters.
+ If `+' appears by itself in a separate word as the last
option, then the names of all parameters (functions with -f)
are printed, but the values (function bodies) are not.
Example:
martin@martin ~ % cat test.zsh
#!/bin/zsh
foobar()
{
echo foobar
}
barfoo()
{
echo barfoo
}
typeset -f +
Output:
martin@martin ~ % ./test.zsh
barfoo
foobar
Use the declare builtin to list currently defined functions:
declare -F
This has no issues with IFS or globbing:
readarray -t funcs < <(declare -F)
printf '%s\n' "${funcs[#]##* }"
Of course, that needs bash 4.0.
For bash since 2.04 use (a little trickier but equivalent):
IFS=$'\n' read -d '' -a funcs < <(declare -F)
If you need that the exit code of this option is zero, use this:
IFS=$'\n' read -d '' -a funcs < <( declare -F && printf '\0' )
It will exit unsuccessfully (non-zero) if either declare or read fails. (Thanks to @CharlesDuffy.)
One (ugly) approach is to grep through the output of set:
set \
| egrep '^[^[:space:]]+ [(][)][[:space:]]*$' \
| sed -r -e 's/ [(][)][[:space:]]*$//'
Better approaches would be welcome.
Pure Bash:
saveIFS="$IFS"
IFS=$'\n'
funcs=($(declare -F)) # create an array
IFS="$saveIFS"
funcs=(${funcs[@]##* }) # keep only what's after the last space
Then, run at the Bash prompt as an example displaying bash-completion functions:
$ for i in ${funcs[@]}; do echo "$i"; done
__ack_filedir
__gvfs_multiple_uris
_a2dismod
. . .
$ echo ${funcs[42]}
_command
This collects a list of function names matching any of a list of patterns:
functions=$(for c in $patterns; do compgen -A function | grep "^$c\$"; done)
The grep limits the output to only exact matches for the patterns.
Check out the bash command type as a better alternative to the following. Thanks to Charles Duffy for the clue.
The following uses that to answer the title question for humans rather than shell scripts: it adds a list of function names matching the given patterns, to the regular which list of shell scripts, to answer, "What code runs when I type a command?"
which() {
for c in "$#"; do
compgen -A function |grep "^$c\$" | while read line; do
echo "shell function $line" 1>&2
done
/usr/bin/which "$c"
done
}
So,
(xkcd)Sandy$ which deactivate
shell function deactivate
(xkcd)Sandy$ which ls
/bin/ls
(xkcd)Sandy$ which .\*run_hook
shell function virtualenvwrapper_run_hook
This is arguably a violation of the Unix "do one thing" philosophy, but I've more than once been desperate because which wasn't finding a command that some package was supposed to contain, me forgetting about shell functions, so I've put this in my .profile.
#!/bin/bash
# list-defined-functions.sh
# Lists functions defined in this script.
#
# Using `compgen -A function`,
# We can save the list of functions defined before running our script,
# then compare that to a new list at the end,
# resulting in the list of newly added functions.
#
# Usage:
# bash list-defined-functions.sh # Run in new shell with no predefined functions
# list-defined-functions.sh # Run in current shell with plenty of predefined functions
#
# Example predefined function
foo() { echo 'y'; }
# Retain original function list
# If this script is run a second time, keep the list from last time
[[ $original_function_list ]] || original_function_list=$(compgen -A function)
# Create some new functions...
myfunc() { echo "myfunc is the best func"; }
function another_func() { echo "another_func is better"; }
function superfunction { echo "hey another way to define functions"; }
# ...
# function goo() { echo ok; }
[[ $new_function_list ]] || new_function_list=$(comm -13 \
<(echo "$original_function_list") \
<(compgen -A function))
echo "Original functions were:"
echo "$original_function_list"
echo
echo "New Functions defined in this script:"
echo "$new_function_list"