How to access variables used in "git submodule foreach" from outside? - bash

How do I recurse into all submodules and save the info to an array? The array should be accessible from outside of git submodule foreach. In the example below, I am trying to save all the paths that have foo in them.
$ declare -a paths
$ git submodule foreach --recursive '[[ "$name" = *"foo"* ]] && \
( echo $path; paths+=($path) ) || true'
Entering 'bar-1'
Entering 'foo-1'
foo-1
Entering 'foo-2'
foo-2
Entering 'foo-8'
foo-8
Entering 'foo'
foo
Entering 'baz'
$
$ echo ${paths[@]}
$

git submodule foreach runs in a sub-shell. This means there is no direct way to affect the parent shell, and that, in turn, means you need to affect the parent shell indirectly.
There are any number of ways to do this, but a simple one is to write to a file, then use source or . to read the file. Given your syntax above, you are presumably using bash, so:
git submodule foreach --recursive '[[ "$name" = *"foo"* ]] && \
( echo $path; echo "paths+=($path)" >> /tmp/paths ) || true'
source /tmp/paths
rm /tmp/paths
echo ${paths[@]}
Another way to do this is to eval the output of the foreach, but this is trickier since you then have to be careful with all output. There's a handy trick with exec for this, to redirect various file descriptors:
exec 3>&1
eval $(command)
where command expands (via alias or shell function, or script, or whatever) to:
command() {
exec 4>&1 1>&3 3>&-
echo now we can print normally
echo var=value 1>&4 # this is a directive for the "eval"
}
The outer 3>&1 makes a copy of stdout for the inner command, which then moves its fd 1 to fd 4, moves 3 to 1, and closes 3. Now the inner command's stdout is the same as the outer stdout, while fd 4 is where the items to be eval-ed go.
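For the specific case in the original question, a third option is to keep stdout clean and read the paths over a process substitution, so nothing ever has to be sourced or eval-ed. A minimal sketch, not from the original answers, assuming a bash parent shell, that --quiet suppresses the "Entering '...'" lines, and that the shell git uses to run the snippet understands [[ ]]:
declare -a paths
while IFS= read -r p; do
paths+=("$p")
done < <(git submodule --quiet foreach --recursive '[[ "$name" = *"foo"* ]] && echo "$path" || true')
echo "${paths[@]}"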

Write the values as assignment statements to a temp file. Source the temp file.

Related

Perform command from command line inside directories from glob in bash shell script

In a bash shell script do-for.sh I want to perform a command inside all directories named in a glob. This has been answered oodles of times, but I want to provide the command itself on the command line. In other words, assuming I have the directories:
foo
bar
I want to enter
do-for * pwd
and have bash print the working directory inside foo and then inside bar.
From reading the umpteen answers on the web, I thought I could do this:
for dir in $1; do
pushd ${dir}
$2 $3 $4 $5 $6 $7 $8 $9
popd
done
Apparently, though, the glob * gets expanded into the other command-line argument variables! So the first time through the loop, for $2 $3 $4 $5 $6 $7 $8 $9 I expected foo pwd, but instead it appears I get foo bar!
How can I keep the glob on the command line from being expanded into the other parameters? Or is there a better way to approach this?
To make this clearer, here is how I want to use the script. (This works fine in the Windows batch file version, by the way.)
./do-for.sh repo-* git commit -a -m "Added new files."
I will assume you are open to your users having to provide some kind of separator, like so
./do-for.sh repo-* -- git commit -a -m "Added new files."
Your script could do something like this (just to explain the concept; I have not tested the actual code):
CURRENT_DIR="$PWD"
declare -a FILES=()
for ARG in "$#"
do
[[ "$ARG" != "--" ]] || break
FILES+=("$ARG")
shift
done
if
[[ "${1-}" = "--" ]]
then
shift
else
echo "You must terminate the file list with -- to separate it from the command"
exit 1 # or "return 1" if this is used as a shell function
fi
At this point, you have all the target files in an array, and "$@" contains only the command to execute. All that is left to do is:
for FILE in "${FILES[#]-}"
do
cd "$FILE"
"$#"
cd "$CURRENT_DIR"
done
Please note that this solution has the advantage that if your user forgets the "--" separator, she will be notified (as opposed to a failure due to quoting).
In this case the problem is not the expansion of the metacharacter; it is just that your script has an undefined number of arguments, of which the last one is the command to execute in all the preceding directories.
#!/bin/bash
CMND=$(eval echo "\${$#}") # get the last argument: the command (quote it if it includes arguments)
while [[ $# -gt 1 ]]; do # execute loop for each argument except last one
( cd "$1" && eval "$CMND" ) # switch to each directory received and execute the command
shift # throw away 1st arg and move to the next one in line
done
Usage: ./script.sh * pwd or ./script.sh * "ls -l"
To have the command followed by arguments (ex. ./script.sh * ls -l) the script has to be longer, because each argument has to be tested to see whether it is a directory until the command is identified (or backwards, until a directory is identified).
Here is an alternative script that would accept the syntax: ./script.sh <dirs...> <command> <arguments...>
For example: ./script.sh * ls -la
# Move all dirs from args to DIRS array
typeset -i COUNT=0
while [[ $# -gt 1 ]]; do
[[ -d "$1" ]] && DIRS[COUNT++]="$1" && shift || break
done
# Validate that the command received is valid
which "$1" >/dev/null 2>&1 || { echo "invalid command: $1"; exit 1; }
# Execute the command + its arguments in each dir from the array
for D in "${DIRS[@]}"; do
( cd "$D" && eval "$@" )
done
Here is how I would do it:
#!/bin/bash
# Read directory arguments into dirs array
for arg in "$#"; do
if [[ -d $arg ]]; then
dirs+=("$arg")
else
break
fi
done
# Remove directories from arguments
shift ${#dirs[@]}
cur_dir=$PWD
# Loop through directories and execute command
for dir in "${dirs[@]}"; do
cd "$dir"
"$@"
cd "$cur_dir"
done
This loops over the arguments as seen after expansion, and as long as they are directories, they are added to the dirs array. As soon as the first non-directory argument is encountered, we assume that now the command starts.
The directories are then removed from the arguments with shift, and we store the current directory in cur_dir.
The last loop visits each directory and executes the command consisting of the rest of the arguments.
This works for your
./do-for.sh repo-* git commit -a -m "Added new files."
example – but if repo-* expands to anything other than directories, the script breaks because it will try to execute the filename as part of the command.
It could be made more stable if, for example, the glob and the command were separated by an indicator such as --, but if you know that the glob will always be just directories, this should work.
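A minimal sketch of that -- variant (illustrative only, not from the original answer):
#!/bin/bash
dirs=()
while (( $# )) && [[ $1 != -- ]]; do
dirs+=("$1")
shift
done
[[ ${1-} == -- ]] && shift # drop the separator itself
cur_dir=$PWD
for dir in "${dirs[@]}"; do
cd "$dir"
"$@"
cd "$cur_dir"
done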
I will begin with the Windows batch file that you mentioned twice as working. The big difference is that on Windows the shell doesn't do any globbing, leaving it to the various commands (and each of them does it differently), while on Linux/Unix the globbing is usually done by the shell, and can be prevented by quoting or escaping. Both the Windows approach and the Linux approach have their merits, and they compare differently in different use cases.
For regular bash users, quoting
./do-for.sh repo-'*' git commit -a -m "Added new files."
or escaping
./do-for.sh repo-\* git commit -a -m "Added new files."
are the simplest solution, because quoting and escaping are what they already use on a daily basis. If your users need a different syntax, you have all the solutions proposed so far, which I will classify into four categories before proposing my own (note that in each example below do-for.sh stands for a different script adopting the respective solution, which can be found in one of the other answers).
Disable shell globbing. This is clumsy, because, even if you remember which shell option does it, you have to remember to reset it to default to have the shell working normally afterwards.
Use a separator:
./do-for.sh repo-* -- git commit -a -m "Added new files."
This works, is similar to the solution adopted in similar situations with other shell commands, and fails only if your expansion of directory names includes a directory name exactly equal to the separator (an unlikely event, which wouldn’t happen in the above example, but in general might happen.)
Have the command as the last argument, all the rest are directories:
./do-for.sh repo-* 'git commit -a -m "Added new files."'
This works, but again, it involves quoting, possibly even nested, and there is no point in preferring it to the more usual quoting of globbing characters.
Try to be smart:
./do-for.sh repo-* git commit -a -m "Added new files."
and assume you are dealing with directories until you hit a name which is not a directory. This would work in many cases, but might fail in obscure ways (e.g. when you have a directory named like the command).
My solution doesn’t belong to any of the mentioned categories. In fact, what I propose is not to use * as a globbing character in the first argument of your script. (This is similar to the syntax used by the split command where you provide a non-globbed prefix argument for the files to be generated.) I have two versions (code below). With the first version, you would do the following:
# repo- is a prefix: the command will be executed in all
# subdirectories whose name starts with it
./do-for.sh repo- git commit -a -m "Added new files."
# The command will be executed in all subdirectories
# of the current one
./do-for.sh . git commit -a -m "Added new files."
# If you want the command to be executed in exactly
# one subdirectory with no globbing at all,
# '/' can be used as a 'stop character'. But why
# use do-for.sh in this case?
./do-for.sh repo/ git commit -a -m "Added new files."
# Use '.' to disable the stop character.
# The command will be executed in all subdirectories of the
# given one (paths have to be always relative, though)
./do-for.sh repos/. git commit -a -m "Added new files."
The second version involves using a globbing character the shell knows nothing about, such as SQL’s % character
# the command will be executed in all subdirectories
# matching the SQL glob
./do-for.sh repo-% git commit -a -m "Added new files."
./do-for.sh user-%-repo git commit -a -m "Added new files."
./do-for.sh % git commit -a -m "Added new files."
The second version is more flexible, as it allows non-final globs, but is less standard for the bash world.
Here is the code:
#!/bin/bash
if [ "$#" -lt 2 ]; then
echo "Usage: ${0##*/} PREFIX command..." >&2
exit 1
fi
pathPrefix="$1"
shift
### For second version, comment out the following five lines
case "$pathPrefix" in
(*/) pathPrefix="${pathPrefix%/}" ;; # Stop character, remove it
(*.) pathPrefix="${pathPrefix%.}*" ;; # Replace final dot with glob
(*) pathPrefix+=\* ;; # Add a final glob
esac
### For second version, uncomment the following line
# pathPrefix="${pathPrefix//%/*}" # Add a final glob
tmp=${pathPrefix//[^\/]} # Count how many levels down we have to go
maxDepth=$((1+${#tmp}))
# Please note that this won’t work if matched directory names
# contain newline characters (comment added for those bash freaks who
# care about extreme cases)
declare -a directories=()
while IFS= read -r d; do
directories+=("$d")
done < <(find . -maxdepth "$maxDepth" -path ./"$pathPrefix" -type d -print)
curDir="$(pwd)"
for d in "${directories[@]}"; do
cd "$d";
"$@"
cd "$curDir"
done
As in Windows, you would still need to use quotes if the prefix contains spaces
./do-for.sh 'repository for project' git commit -a -m "Added new files."
(but if the prefix does not contain spaces, you can avoid quoting it and it will correctly deal with any space-containing directory names beginning with that prefix; with obvious changes, the same is true for %-patterns in the second version.)
Please note the other relevant differences between a Windows and a Linux environment, such as case sensitivity in pathnames, differences in which characters are considered special, and so on.
In bash you may execute "set -o noglob", which stops the shell from expanding pathnames (globs). But this has to be set in the running shell before you execute the script; otherwise you have to quote any metacharacter that you provide in the arguments.
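For illustration, this is how the noglob approach would look at the prompt (not part of the original answer):
set -o noglob # a literal * now reaches the script unexpanded
./do-for.sh repo-* git commit -a -m "Added new files."
set +o noglob # restore normal globbing afterwards
The script itself still expands the pattern, because globbing is enabled in the separate shell that runs it.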
The find-while-read combination is one of the safest ways to parse file names. Do something like the below:
#!/bin/bash
myfunc(){
cd "$2"
eval "$1" # Execute the command parsed as an argument
}
cur_dir=$(pwd) # storing the current directory
find . -type d -print0 | while read -rd '' dname
do
myfunc "pwd" "$dname"
cd "$cur_dir" #Remember myfunc changes the current working dir, so you need this
done
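The snippet above hard-codes pwd; a variant that forwards the script's own arguments instead could look like this (a sketch, not part of the original answer):
#!/bin/bash
find . -type d -print0 | while IFS= read -rd '' dname
do
( cd "$dname" && "$@" ) # subshell per directory, so no need to cd back
done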
Why not keep it simple and create a shell function that uses find but eases the burden for your users of typing out its commands, for example:
do_for() { find . -type d \( ! -name . \) -not -path '*/\.*' -name "$1" -exec bash -c 'cd "$0" && "$@"' {} "${@:2}" \; ; }
So they can type something like do_for repo-* git commit -a -m "Added new files."
Note, if you want to use the * by itself, you'll have to escape it:
do_for \* pwd
Wildcards are evaluated by the shell before being passed to any program or script. There is nothing you can do about that.
But if you accept quoting the globbing expression then this script should do the trick:
#!/usr/bin/env bash
for dir in $1; do (
cd "$dir"
"${@:2}"
); done
I tried it out with two test directories and it seems to be working. Use it like this:
mkdir test_dir1 test_dir2
./do-for.sh "test_dir*" git init
./do-for.sh "test_dir*" touch test_file
./do-for.sh "test_dir*" git add .
./do-for.sh "test_dir*" git status
./do-for.sh "test_dir*" git commit -m "Added new files."
Nobody is proposing a solution using find? Why not try something like this:
find . -type d \( -wholename 'YOURPATTERN' \) -print0 | xargs -0 YOURCOMMAND
Look at man find for more options.
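Note that xargs runs YOURCOMMAND with the matched directories as arguments rather than inside them; if the command has to run inside each matched directory (as in the question), something along these lines is closer (a sketch, not part of the original answer):
find . -maxdepth 1 -type d -name 'repo-*' -exec sh -c 'cd "$0" && git status' {} \;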

Create shell sub commands by hierarchy

I'm trying to create a system for my scripts -
Each script will be located in a folder, which is the command itself.
The script itself will act as a sub-command.
For example, a script called "who" inside a directory called "git"
will allow me to run the script with git who on the command line.
Also, I would like to create a sub-command for a pseudo-command, meaning a command not currently available, e.g. some-arbitrary-command sub-command.
Is that somehow possible?
I thought of somehow extending https://github.com/basecamp/sub to accomplish the task.
EDIT 1
#!/usr/bin/env bash
command=`basename $0`
subcommand="$1"
case "$subcommand" in
"" | "-h" | "--help" )
echo "$command: Some description here" >&2
;;
* )
subcommand_path="$(command -v "$command-$subcommand" || true)"
if [[ -x "$subcommand_path" ]]; then
shift
exec "$subcommand_path" "${#}"
return $?
else
echo "$command: no such command \`$subcommand'" >&2
exit 1
fi
;;
esac
This is currently the script I run for new custom-made commands.
Since it's so generic, I just copy-paste it.
I still wonder, though:
can it be generic enough to just recognize the folder name and derive the command name from it?
One issue, though, is that it doesn't seem to override the default command name when it is supposed to replace it (e.g. git).
EDIT 2
After tinkering around a bit, this is what I eventually came up with:
#!/usr/bin/env bash
COMMAND=`basename $0`
SUBCOMMAND="$1"
COMMAND_DIR="$HOME/.zsh/scripts/$COMMAND"
case "$SUBCOMMAND" in
"" | "-h" | "--help" )
cat "$COMMAND_DIR/help.txt" 2>/dev/null ||
command $COMMAND "${@}"
;;
* )
SUBCOMMAND_path="$(command -v "$COMMAND-$SUBCOMMAND" || true)"
if [[ -x "$SUBCOMMAND_path" ]]; then
shift
exec "$SUBCOMMAND_path" "${#}"
else
command $COMMAND "${#}"
fi
;;
esac
This is a generic script called "helper-sub" that I symlink into all the script directories I have (e.g. ln -s $HOME/bin/helper-sub $HOME/bin/ssh).
In my zshrc I created this to call all the scripts:
#!/usr/bin/env bash
PATH=${PATH}:$(find $HOME/.zsh/scripts -type d | tr '\n' ':' | sed 's/:$//')
export PATH
typeset -U path
for aliasPath in `find $HOME/.zsh/scripts -type d`; do
aliasName=`echo $aliasPath | awk -F/ '{print $NF}'`
alias ${aliasName}=${aliasPath}/${aliasName}
done
unset aliasPath
Examples can be seen here: https://github.com/iwfmp/zsh/tree/master/scripts
You can't make a directory executable as a script, but you can create a wrapper that calls the scripts in the directory.
You can do this either with a function (in your profile script or a file in your FPATH) or with a wrapper script.
A simple function might look like:
git() {
local subPath='/path/to/your/git'
local sub="${1}" ; shift
if [[ -x "${subPath}/${1}" ]]; then
"${subPath}/${sub}" "${#}"
return $?
else
printf '%s\n' "git: Unknown sub-command '${sub}'." >&2
return 1
fi
}
(This is the same way that the sub project you linked works, just simplified.)
Of course, if you actually want to create a sub-command for git specifically (and that wasn't just an example), you'll need to make sure that the built-in git commands still work. In that case you could do it like this:
git() {
local subPath='/path/to/your/git'
local sub="${1}"
if [[ -x "${subPath}/${sub}" ]]; then
shift
"${subPath}/${sub}" "${#}"
return $?
else
command git "${#}"
return 1
fi
}
But it might be worth pointing out in that case that git supports adding arbitrary aliases via git config:
git config --global alias.who '!/path/to/your/git/who'

Use forward slash in variable

I wrote a script to ease the syncing and building of Android source. I tried adding a function to cherry-pick patches, but I can't get it to work properly. I know it's because of the forward slashes, but I don't know how to protect/escape them.
Part of the code is:
echo "Copy/paste the project folder, i.e. 'frameworks/base'"
read folder
echo ""
echo "Now paste the cherry-pick git link, i.e. 'git fetch <someproject> refs/changes/... && git cherry-pick FETCH_HEAD'"
read cherry
echo ""
Begin
clear
echo ""
export IFS="&&"
for x in $cherry
do
cd ${CM}/${folder}
CHERRY=$(trim "$x")
$CHERRY
done
Let's say that the 'cherry' variable is:
git fetch http://r.cyanogenmod.com/CyanogenMod/android_frameworks_base refs/changes/68/22968/2 && git cherry-pick FETCH_HEAD
I would get this error:
/home/tristan202/bin/build_cm.sh: line 159: git fetch http://r.cyanogenmod.com/CyanogenMod/android_frameworks_base refs/changes/91/23491/2: No such file or directory
/home/tristan202/bin/build_cm.sh: line 159: git cherry-pick FETCH_HEAD: command not found
I cannot figure out why it fails.
The 'trim' function it calls is a function that trims leading and trailing spaces. If I do echo "$CHERRY" within the for loop, the commands are printed correctly, but it still fails.
I will give you another example:
cmd='echo hello && echo world'
$cmd
The result is:
hello && echo world
bash parses the command $cmd as a Simple Command, not a List of Commands.
After Parameter Expansion, && is passed as an argument to echo (the 1st word after Word Splitting).
The solution is pulling && out:
cmd1='echo hello'
cmd2='echo world'
$cmd1 && $cmd2
Once you put && in a variable it ceases to be interpreted as separating two commands:
$ A="echo a && echo b"
$ echo $A
echo a && echo b
$ echo c && ${A}
c
a && echo b
So you need to avoid putting && into a variable.
git was even telling you that the && was the problem in its error message.
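If the whole command really has to stay in one variable, eval is the usual (if risky) way to have && re-parsed as a list operator; a minimal sketch, only safe with input you trust:
cmd='echo hello && echo world'
eval "$cmd" # prints "hello", then "world"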

exiting script while running source scriptname over SSH

I have a script with a number of options in it. One of the option sets is supposed to change the directory and then exit the script. However, when running over SSH and using source to get the change to happen in the parent shell, it exits the SSH session. Is there another way to do this so that it does not exit? My script is in the /usr/sbin directory.
You might try having the script run a subshell instead of whatever method it is using to “change [the directory] in the parent” (presumably you have the child print out a cd command and have the parent do something like eval "$(script --print-cd)"). So instead of (e.g.) a --print-cd option, add a --subshell option that starts a new instance of $SHELL.
d=/path/to/some/dir
#...
cd "$d"
#...
if test -n "$opt_print_cd"; then
sq_d="$(printf %s "$d" | sed -e "s/'/'\\\\''/g")"
printf "cd '%s'\n" "$sq_d"
elif test -n "$opt_subshell"; then
exec "$SHELL"
fi
If you can not edit the script itself, you can make a wrapper (assuming you have permission to create new, persistent files on the ‘server’):
#!/bin/sh
script='/path/to/script'
print_cd=
for a; do test "$a" = --print-cd && print_cd=yes && break; done
if test -n "$print_cd"; then
eval "$("$script" ${1+"$#"})" # use cd instead of eval if the script prints a bare dir path
exec "$SHELL"
else
exec $script" ${1+"$#"}
fi
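For completeness, this is how the two modes would be used from the SSH session (the paths and option names are illustrative, matching the sketches above rather than any real script):
eval "$(/path/to/script --print-cd)" # change directory in the current shell
/path/to/script --subshell # or: start a new shell already inside the target directory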

Can I specify redirects and pipes in variables?

I have a bash script that creates a Subversion patch file for the current directory. I want to modify it to zip the produced file, if -z is given as an argument to the script.
Here's the relevant part:
zipped=''
zipcommand='>'
if [ "$1" = "-z" ]
then
zipped='zipped '
filename="${filename}.zip"
zipcommand='| zip >'
fi
echo "Creating ${zipped}patch file $filename..."
svn diff $zipcommand $filename
This doesn't work because it passes the | or > contained in $zipcommand as an argument to svn.
I can easily work around this, but the question is whether it's ever possible to use these kinds of operators when they're contained in variables.
Thanks!
I would do something like this (use bash -c or eval):
zipped=''
zipcommand='>'
if [ "$1" = "-z" ]
then
zipped='zipped '
filename="${filename}.zip"
zipcommand='| zip -@'
fi
echo "Creating ${zipped}patch file $filename..."
eval "svn diff $zipcommand $filename"
# this also works:
# bash -c "svn diff $zipcommand $filename"
This appears to work, but my version of zip (Mac OS X) required that I change the line:
zipcommand='| zip -@'
to
zipcommand='| zip - - >'
Edit: incorporated @DanielBungert's suggestion to use eval
eval is what you are looking for.
# eval 'printf "foo\nbar" | grep bar'
bar
Be careful with quote characters on that.
Or you could try the zsh shell, which allows you to define global aliases, e.g.:
alias -g L='| less'
alias -g S='| sort'
alias -g U='| uniq -c'
Then use this command (which is somewhat cryptic to anyone looking over your shoulder ;-) )
./somecommand.sh S U L
HTH
Open a new file handle on either a process substitution (to handle the compression) or on the named file. Then redirect the output of svn diff to that file handle.
if [ "$1" = "-z" ]; then
zipped='zipped '
filename=$filename.zip
exec 3> >(zip > "$filename")
else
exec 3> "$filename"
fi
echo "Creating ${zipped}patch file $filename"
svn diff >&3
exec 3>&- # close fd 3 so zip (in the -z case) sees end-of-input and can finish the archive
