bash expand cd with shortcuts like zsh - bash

Is it possible in bash to expand something like
cd /u/lo/b<hit tab>
to
cd /usr/local/bin
?

Sorry I couldn't post earlier, I was held up at work, and the bind function was more issue-prone than I first thought.
Here is what I came up with:
Bind the following script:
#!/bin/bash
# $HOME/.bashrc.d/autocomplete.sh

autocomplete_wrapper() {
    BASE="${READLINE_LINE% *} "                    # we save the line except for the last argument
    [[ "$BASE" == "$READLINE_LINE " ]] && BASE=""  # if the line has only 1 argument, we set the BASE to blank
    EXPANSION=($(autocomplete "${READLINE_LINE##* }"))
    [[ ${#EXPANSION[@]} -gt 1 ]] && echo "${EXPANSION[@]:1}"  # if there is more than 1 match, we echo them
    READLINE_LINE="$BASE${EXPANSION[0]}"           # the current line is now the base + the 1st element
    READLINE_POINT=${#READLINE_LINE}               # we move our cursor to the end of the current line
}

autocomplete() {
    LAST_CMD="$1"
    # Special starting character expansion for '~', './' and '/'
    [[ "${LAST_CMD:0:1}" == "~" ]] && LAST_CMD="$HOME${LAST_CMD:1}"
    S=1; [[ "${LAST_CMD:0:1}" == "/" || "${LAST_CMD:0:2}" == "./" ]] && S=2  # we don't expand those
    # we do the path expansion of the last argument here by adding a * before each /
    EXPANSION=($(echo "$LAST_CMD*" | sed s:/:*/:"$S"g))
    if [[ ! -e "${EXPANSION[0]}" ]]; then       # if the path cannot be expanded, we don't change the output
        echo "$LAST_CMD"
    elif [[ "${#EXPANSION[@]}" -eq 1 ]]; then   # else if there is only one match, we output it
        echo "${EXPANSION[0]}"
    else
        # else we expand the path as much as possible and return all the possible results
        l=0; CTRL_LOOP=0                        # start comparing the matches from the first character
        while [[ $l -le "${#EXPANSION[0]}" ]]; do
            for i in "${EXPANSION[@]}"; do
                if [[ "${EXPANSION[0]:$l:1}" != "${i:$l:1}" ]]; then
                    CTRL_LOOP=1
                    break
                fi
            done
            [[ $CTRL_LOOP -eq 1 ]] && break
            ((l++))
        done
        # we add the partial solution at the beginning of the array of solutions
        echo "${EXPANSION[0]:0:$l} ${EXPANSION[@]}"
    fi
}
with the following commands:
source "$HOME/.bashrc.d/autocomplete.sh"
bind -x '"\t" : autocomplete_wrapper'
Output :
>$ cd /u/lo/b<TAB>
>$ cd /usr/local/bin
>$ cd /u/l<TAB>
/usr/local /usr/lib
>$ cd /usr/l
The bind line could be added to your ~/.bashrc file, doing something like this:
if [[ -s "$HOME/.bashrc.d/autocomplete.sh" ]]; then
    source "$HOME/.bashrc.d/autocomplete.sh"
    bind -x '"\t" : autocomplete_wrapper'
fi
(taken from this answer)
Furthermore, I would strongly advise against binding this command to your Tab key as it would override the default autocomplete.
Note: In some cases this will misbehave, for instance if you try to autocomplete "/path/with spaces/something", because the last argument to complete is determined by ${READLINE_LINE##* }. If this is an issue in your case, you should code a function that returns the last argument of a line while taking quotes into account (a rough sketch follows).
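For reference, such a quote-aware "last argument" helper might look something like this. It is only a rough sketch (the name last_arg and its behaviour are my own choices, and backslash escapes are not handled):
last_arg() {
    # Print the last whitespace-separated argument of the given line,
    # treating single- or double-quoted spans as part of one argument.
    local line="$1" quote="" arg="" c i
    for (( i = 0; i < ${#line}; i++ )); do
        c="${line:$i:1}"
        if [[ -n "$quote" ]]; then                     # inside a quoted span
            [[ "$c" == "$quote" ]] && quote="" || arg+="$c"
        elif [[ "$c" == "'" || "$c" == '"' ]]; then    # opening quote
            quote="$c"
        elif [[ "$c" == " " || "$c" == $'\t' ]]; then  # argument boundary: start over
            arg=""
        else
            arg+="$c"
        fi
    done
    printf '%s\n' "$arg"
}
# e.g. last_arg 'cd "/path/with spaces/some'   prints:   /path/with spaces/some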
Feel free to ask for further clarification, and I welcome any suggestions to improve this script.

I have come up with an alternative solution that does not break existing bash
completion rules in other places.
The idea is to append a wildcard (asterisk) to every element of the path and
invoke normal bash completion process from there. So when user types
/u/lo/b<Tab> my function substitutes that with /u*/lo*/b* and invokes bash
completion as usual.
To enable the described behavior, source this file from your
~/.bashrc (a minimal snippet follows the feature list). Supported features are:
Special characters in completed path are automatically escaped if present
Tilde expressions are properly expanded (as per bash documentation)
If user had started writing the path in quotes, no character escaping is
applied. Instead the quote is closed with a matching character after expanding
the path.
If bash-completion package is already in use, this code will safely override
its _filedir function. No extra configuration is required.
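For example, assuming the listing below is saved as ~/.bash-complete-partial-path.sh (the filename is my own choice, not mandated by the project), the hook in ~/.bashrc could be:
# In ~/.bashrc -- the filename is an assumption, adjust it to wherever you saved the script
if [[ -f "$HOME/.bash-complete-partial-path.sh" ]]; then
    source "$HOME/.bash-complete-partial-path.sh"
fi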
There is also a demo screencast showing this feature in action.
Full code listing below (you should check the GitHub repo for the latest updates, though):
#!/usr/bin/env bash
#
# Zsh-like expansion of incomplete file paths for Bash.
# Source this file from your ~/.bashrc to enable the described behavior.
#
# Example: `/u/s/a<Tab>` will be expanded into `/usr/share/applications`
#
# Copyright 2018 Vitaly Potyarkin
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Take a single incomplete path and fill it with wildcards
# e.g. /u/s/app/ -> /u*/s*/app*
#
_put_wildcards_into_path() {
    local PROCESSED TILDE_EXPANSION
    PROCESSED=$( \
        echo "$@" | \
        sed \
            -e 's:\([^\*\~]\)/:\1*/:g' \
            -e 's:\([^\/\*]\)$:\1*:g' \
            -e 's:\/$::g' \
            -e 's:^\(\~[^\/]*\)\*\/:\1/:' \
            -Ee 's:(\.+)\*/:\1/:g' \
    )
    eval "TILDE_EXPANSION=$(printf '%q' "$PROCESSED"|sed -e 's:^\\\~:~:g')"
    echo "$TILDE_EXPANSION"
}
#
# Bash completion function for expanding partial paths
#
# This is a generic worker. It accepts 'file' or 'directory' as the first
# argument to specify desired completion behavior
#
_complete_partial() {
    local WILDCARDS ACTION LINE OPTION INPUT UNQUOTED_INPUT QUOTE
    ACTION="$1"
    if [[ "_$1" == "_-d" ]]
    then # _filedir compatibility
        ACTION="directory"
    fi
    INPUT="${COMP_WORDS[$COMP_CWORD]}"

    # Detect and strip opened quotes
    if [[ "${INPUT:0:1}" == "'" || "${INPUT:0:1}" == '"' ]]
    then
        QUOTE="${INPUT:0:1}"
        INPUT="${INPUT:1}"
    else
        QUOTE=""
    fi

    # Add wildcards to each path element
    WILDCARDS=$(_put_wildcards_into_path "$INPUT")

    # Collect completion options
    COMPREPLY=()
    while read -r -d $'\n' LINE
    do
        if [[ "_$ACTION" == "_directory" && ! -d "$LINE" ]]
        then # skip non-directory paths when looking for directory
            continue
        fi
        if [[ -z "$LINE" ]]
        then # skip empty suggestions
            continue
        fi
        if [[ -z "$QUOTE" ]]
        then # escape special characters unless user has opened a quote
            LINE=$(printf "%q" "$LINE")
        fi
        COMPREPLY+=("$LINE")
    done <<< $(compgen -G "$WILDCARDS" "$WILDCARDS" 2>/dev/null)
    return 0 # do not clutter $? value (last exit code)
}
# Wrappers
_complete_partial_dir() { _complete_partial directory; }
_complete_partial_file() { _complete_partial file; }
# Enable enhanced completion
complete -o bashdefault -o default -o nospace -D -F _complete_partial_file
# Optional. Make sure `cd` is autocompleted only with directories
complete -o bashdefault -o default -o nospace -F _complete_partial_dir cd
# Override bash-completion's _filedir (if it's in use)
# https://salsa.debian.org/debian/bash-completion
_filedir_original_code=$(declare -f _filedir|tail -n+2)
if [[ ! -z "$_filedir_original_code" ]]
then
    eval "_filedir_original() $_filedir_original_code"
    _filedir() {
        _filedir_original "$@"
        _complete_partial "$@"
    }
fi
# Readline configuration for better user experience
bind 'TAB:menu-complete'
bind 'set colored-completion-prefix on'
bind 'set colored-stats on'
bind 'set completion-ignore-case on'
bind 'set menu-complete-display-prefix on'
bind 'set show-all-if-ambiguous on'
bind 'set show-all-if-unmodified on'
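As a quick sanity check after sourcing the file, the wildcard-insertion helper can be called directly; the paths below are illustrative and the expected output follows from the sed rules above:
$ _put_wildcards_into_path '/u/lo/b'
/u*/lo*/b*
$ _put_wildcards_into_path '/e/sy/jou'
/e*/sy*/jou*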

Related

List the files in Directory and Copy-Replace them into another Directory in Linux

I am trying to automate the below; any help is appreciated.
We have two directories, as shown below. Whenever we get new files in Directory-1, only those files should be copied into Directory-2, replacing the older versions there. How can I achieve this with Linux scripting? The filename remains the same but the version will be different.
Directory-1:
FileOne_2.0.0.txt
FileTwo_3.0.0.txt
Directory-2:
FileOne_1.0.0.txt
FileTwo_2.0.0.txt
FileThree_3.0.0.txt
FileFive_5.0.0.txt
Try this code (on a test setup before you trust your real directories and files with it):
#! /bin/bash -p

shopt -s extglob    # Enable extended globbing ( +([0-9]) ... )
shopt -s nullglob   # Globs that match nothing expand to nothing
shopt -s dotglob    # Globs match files with names starting with '.'

srcdir='Directory-1'
destdir='Directory-2'

# A(n extended) glob pattern to match a version string (e.g. '543.21.0')
readonly kVERGLOB='+([0-9]).+([0-9]).+([0-9])'

# shellcheck disable=SC2231 # (Bad warning re. unquoted ${kVERGLOB})
for srcpath in "$srcdir"/*_${kVERGLOB}.txt; do
    srcfile=${srcpath##*/}  # E.g. 'FileOne_2.0.0.txt'
    srcbase=${srcfile%_*}   # E.g. 'FileOne'

    # Set and check the path that the file will be moved to
    destpath=$destdir/$srcfile
    if [[ -e $destpath ]]; then
        printf "Warning: '%s' already exists. Skipping '%s'.\\n" \
            "$destpath" "$srcpath" >&2
        continue
    fi

    # Make a list of the old versions of the file
    # shellcheck disable=SC2206 # (Bad warning re. unquoted ${kVERGLOB})
    old_destpaths=( "$destdir/$srcbase"_${kVERGLOB}.txt )

    # TODO: Add checks that the number of old files (${#old_destpaths[*]})
    # is what is expected (exactly one?)

    # Move the file
    if mv -i -- "$srcpath" "$destpath"; then
        printf "Moved '%s' to '%s'\\n" "$srcpath" "$destpath" >&2
    else
        printf "Warning: Failed to move '%s' to '%s'. Skipping '%s'.\\n" \
            "$srcpath" "$destpath" "$srcpath" >&2
        continue
    fi

    # Remove the old version(s) of the file (if any)
    for oldpath in "${old_destpaths[@]}"; do
        if rm -- "$oldpath"; then
            printf "Removed '%s'\\n" "$oldpath" >&2
        else
            printf "Warning: Failed to remove '%s'.\\n" "$oldpath" >&2
        fi
    done
done
The code is Shellcheck-clean. Two Shellcheck suppression comments are used because the unquoted expansions are necessary here.
srcdir and destdir are set to constant values. You might want to take them from command line parameters, or set them to different constant values.
The code could be made shorter by removing checks. However, moves and removes are destructive operations that can do a lot of damage if they are done incorrectly. I'd add even more checks if it was my own data.
See glob - Greg's Wiki for an explanation of the "extended globbing" used in the code.
See Parameter expansion [Bash Hackers Wiki] for an explanation of ${srcpath##*/} and ${srcfile%_*}.
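For instance, with the example filename used above, those two expansions behave like this (values shown are purely illustrative):
srcpath='Directory-1/FileOne_2.0.0.txt'
srcfile=${srcpath##*/}   # strip the longest prefix matching '*/'  -> FileOne_2.0.0.txt
srcbase=${srcfile%_*}    # strip the shortest suffix matching '_*' -> FileOne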
mv -i is used as a double protection against overwriting an existing file.
All external commands are invoked with -- to explicitly end options, in case they are ever used with paths that begin with -.
Make sure that you understand the code and test it VERY carefully before using it for real.
source_dir=./files/0
dest_dir=./files/1/

for file in "$source_dir"/*
do
    echo "$file"
    echo "processing"
    if [[ "1" == "1" ]]; then
        mv "$file" "$dest_dir"
    fi
done
Here the processing and the 1 == 1 stand in for whatever your 'prechecks' are (which you haven't told us); one possible precheck is sketched below.
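For instance, if the precheck is "Directory-2 already holds a version of this file", one hedged way to express it, reusing the directories above (the base variable is my own addition), would be:
for file in "$source_dir"/*; do
    base=${file##*/}      # e.g. FileOne_2.0.0.txt
    base=${base%_*}       # e.g. FileOne
    # Precheck: only act if the destination already has a version of this file
    if compgen -G "$dest_dir/${base}_*.txt" > /dev/null; then
        rm -- "$dest_dir/${base}_"*.txt    # drop the old version(s)
        mv -- "$file" "$dest_dir"          # move the new one in
    fi
done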
If your coreutils sort is v7.0 (2008-10-05) or newer, which supports the -V
(version-sort) option, would you please try:
declare -A base2ver base2file

# compare versions
# prints  0 if $1 equals $2
#         1 if $1 is newer than $2
#        -1 if $1 is older than $2
vercomp() {
    if [[ $1 = $2 ]]; then
        echo 0
    else
        newer=$(echo -e "$1\n$2" | sort -Vr | head -n 1)
        if [[ $newer = $1 ]]; then
            echo 1
        else
            echo -1
        fi
    fi
}

for f in Directory-1/*.txt; do
    basename=${f##*/}
    version=${basename##*_}
    version=${version%.txt}          # version number such as "2.0.0"
    basename=${basename%_*}          # basename such as "FileOne"
    base2ver[$basename]=$version     # associates basename with version number
    base2file[$basename]=$f          # associates basename with full filename
done

for f in Directory-2/*.txt; do
    basename=${f##*/}
    version=${basename##*_}
    version=${version%.txt}
    basename=${basename%_*}
    if [[ -n ${base2ver[$basename]} ]] && (( $(vercomp "${base2ver[$basename]}" "$version") > 0 )); then
        # echo "${base2file[$basename]} is newer than $f"
        rm -- "$f"
        cp -p -- "${base2file[$basename]}" Directory-2
    fi
done
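With those functions defined, a quick sanity check of vercomp could look like this (expected output shown in the comments):
vercomp 3.0.0 3.0.0    # prints  0 (versions are equal)
vercomp 3.1.0 3.0.9    # prints  1 (first argument is newer)
vercomp 2.0.0 10.0.0   # prints -1 (sort -V compares numerically, not lexically)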

Looping over shell script arguments and passing quoted arguments to function

I have a script below that sources a directory of bash scripts and then parses the flags of the command to run a specific function from the sourced files.
Given this function within the scripts dir:
function reggiEcho () {
    echo $1
}
Here are some examples of current output
$ reggi --echo hello
hello
$ reggi --echo hello world
hello
$ reggi --echo "hello world"
hello
$ reggi --echo "hello" --echo "world"
hello
world
As you can see, quoted parameters are not honored as they should be: "hello world" should echo properly.
This is the script; the issue is within the while loop.
How do I parse these flags and still pass quoted parameters into the function intact?
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
STR="$(find $DIR/scripts -type f -name '*.sh' -print)"
ARR=( $STR )

TUSAGE="\n"
for f in "${ARR[@]}"; do
    if [ -f $f ]
    then
        . $f --source-only
        if [ -z "$USAGE" ]
        then
            :
        else
            TUSAGE="$TUSAGE \t$USAGE\n"
        fi
        USAGE=""
    else
        echo "$f not found"
    fi
done
TUSAGE="$TUSAGE \t--help (shows this help output)\n"

function usage() {
    echo "Usage: --function <args> [--function <args>]"
    echo $TUSAGE
    exit 1
}

HELP=false
cmd=()
while [ $# -gt 0 ]; do # loop until no args left
    if [[ $1 = '--help' ]] || [[ $1 = '-h' ]] || [[ $1 = '--h' ]] || [[ $1 = '-help' ]]; then
        HELP=true
    fi
    if [[ $1 = --* ]] || [[ $1 = -* ]]; then # arg starts with --
        if [[ ${#cmd[@]} -gt 0 ]]; then
            "${cmd[@]}"
        fi
        top=`echo $1 | tr -d -` # remove all flags
        top=`echo ${top:0:1} | tr '[a-z]' '[A-Z]'`${top:1} # make sure first letter is uppercase
        top=reggi$top # prepend reggi
        cmd=( "$top" ) # start new array
    else
        echo $1
        cmd+=( "$1" )
    fi
    shift
done

if [[ "$HELP" = true ]]; then
    usage
elif [[ ${#cmd[@]} -gt 0 ]]; then
    ${cmd[@]}
else
    usage
fi
There are many places in this script where you have variable references without double-quotes around them. This means the variables' values will be subject to word splitting and wildcard expansion, which can have various weird effects.
The specific problem you're seeing is due to an unquoted variable reference on the fourth-from-last line, ${cmd[@]}. With cmd=( echo "hello world" ), word splitting makes this equivalent to echo hello world rather than echo "hello world".
Fixing that one line will fix your current problem, but there are a number of other unquoted variable references that may cause other problems later. I recommend fixing all of them. Cyrus' recommendation of shellcheck.net is good at pointing them out, and will also note some other issues I won't cover here. One thing it won't mention is that you should avoid all-caps variable names (DIR, TUSAGE, etc) -- there are a bunch of all-caps variables with special meanings, and it's easy to accidentally reuse one of them and wind up with weird effects. Lowercase and mixed-case variables are safer.
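To see the word-splitting effect from that line in isolation (the array contents here are just an illustration):
args=( printf '[%s]\n' "hello world" )
${args[@]}       # unquoted: "hello world" is split -> prints [hello] and [world]
"${args[@]}"     # quoted: stays one argument      -> prints [hello world]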
I also recommend against using \t and \n in strings, and counting on echo to translate them into tabs and newlines, respectively. Some versions of echo do this automatically, some require the -e option to tell them to do it, some will print "-e" as part of their output... it's a mess. In bash, you can use $'...' to translate those escape sequences directly, e.g:
tusage="$tusage"$' \t--help (shows this help output)\n' # Note mixed quoting modes
echo "$tusage" # Note that double-quoting is *required* for this to work right
You should also fix the file listing so it doesn't depend on being unquoted (see chepner's comment). If you don't need to scan subdirectories of $DIR/scripts, you can do this with a simple wildcard (note lowercase vars and that the var is double-quoted, but the wildcard isn't):
arr=( "$dir/scripts"/*.sh )
If you need to look in subdirectories, it's more complicated. If you have bash v4 you can use a globstar wildcard, like this:
shopt -s globstar
arr=( "$dir/scripts"/**/*.sh )
If your script might have to run under bash v3, see BashFAQ #20: "How can I find and safely handle file names containing newlines, spaces or both?", or just use this:
while IFS= read -r -d '' f <&3; do
    if [ -f "$f" ]
    # ... etc
done 3< <(find "$dir/scripts" -type f -name '*.sh' -print0)
(That's my favorite it-just-works idiom for iterating over find's matches. Although it does require bash, not some generic POSIX shell.)

Recursively list hidden files without ls, find or extendedglob

As an exercise I have set myself the task of recursively listing files using bash builtins. I particularly don't want to use ls or find and I would prefer not to use setopt extendedglob. The following appears to work but I cannot see how to extend it with /.* to list hidden files. Is there a simple workaround?
g() {
    for k in "$1"/*; do                                                  # loop through directory
        [[ -f "$k" ]] && { echo "$k"; continue; }                        # echo file path
        [[ -d "$k" ]] && { [[ -L "$k" ]] && { echo "$k"; continue; }     # echo symlinks but don't follow
                           g "$k"; }                                     # start over with new directory
    done
}
g "/Users/neville/Desktop"                                               # original directory
Added later: sorry - I should have said: 'bash-3.2 on OS X'
Change
for k in "$1"/*; do
to
for k in "$1"/* "$1"/.[^.]* "$1"/..?*; do
The second glob matches all files whose names start with a dot followed by anything other than a dot, while the third matches all files whose names start with two dots followed by something. Between the two of them, they will match all hidden files other than the entries . and ...
Unfortunately, unless the shell option nullglob is set, those (like the first glob) could remain as-is if there are no files whose names match (extremely likely in the case of the third one), so it is necessary to verify that the name is actually a file (a nullglob-based variant is sketched below).
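For instance, a variant of the function that leans on nullglob instead of an existence test might look like this; it is just a sketch, and it saves and restores the option so the rest of the shell is unaffected:
g() {
    local k restore
    # Remember the current nullglob state, then enable it so that
    # unmatched globs simply vanish instead of staying literal.
    shopt -q nullglob && restore='-s' || restore='-u'
    shopt -s nullglob
    for k in "$1"/* "$1"/.[^.]* "$1"/..?*; do
        if [[ -f "$k" ]] || [[ -d "$k" && -L "$k" ]]; then
            echo "$k"            # file or symlinked directory: print, don't follow
        elif [[ -d "$k" ]]; then
            g "$k"               # recurse into real directories
        fi
    done
    shopt "$restore" nullglob
}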
An alternative would be to use the much simpler glob "$1"/.*, which will always match the . and .. directory entries, and will consequently always be substituted. In that case, it's necessary to remove the two entries from the list:
for k in "$1"/* "$1"/.*; do
    if ! [[ $k =~ /\.\.?$ ]]; then
        # ...
    fi
done
(It is still possible for "$1"/* to remain in the list, though. So that doesn't help as much as it might.)
Set the GLOBIGNORE variable to exclude . and .., which implicitly turns on the dotglob shell option (shopt -s dotglob). Then your original code works with no other changes.
user@host [/home/user/dir]
$ touch file
user@host [/home/user/dir]
$ touch .dotfile
user@host [/home/user/dir]
$ echo *
file
user@host [/home/user/dir]
$ GLOBIGNORE=".:.."
user@host [/home/user/dir]
$ echo *
.dotfile file
Note that this is bash-specific. In particular, it does not work in ksh.
You can specify multiple arguments to for:
for k in "$1"/* "$1"/.*; do
But if you do search for .* in directories, you should be aware that it also gives you the . and .. entries. You may also be given a nonexistent name if the "$1"/* glob matches nothing, so I would check for that too.
With that in mind, this is how I would correct the loop:
g() {
    local k subdir
    for k in "$1"/* "$1"/.*; do                # loop through directory
        [[ -e "$k" ]] || continue              # Skip missing files (unmatched globs)
        subdir=${k##*/}
        [[ "$subdir" = . ]] || [[ "$subdir" = .. ]] && continue  # Skip the pseudo-directories "." and ".."
        if [[ -f "$k" ]] || [[ -L "$k" ]]; then
            printf %s\\n "$k"                  # Echo the paths of files and symlinks
        elif [[ -d "$k" ]]; then
            g "$k"                             # start over with new directory
        fi
    done
}
g ~neville/Desktop
g ~neville/Desktop
Here the funky-looking ${k##*/} is just a fast way to take the basename of the file, while local was put in so that the variables don't modify any existing variables in the shell.
One more thing I've changed is echo "$k" to printf %s\\n "$k", because echo is irredeemably flawed in its argument handling and should be avoided for the purpose of echoing an unknown variable. (See Rich's sh tricks for an explanation of how; it boils down to -n and -e throwing a spanner in the works.)
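A quick illustration of the problem (the filename is hypothetical but perfectly legal):
k='-n'              # a file really can be named "-n"
echo "$k"           # bash's echo swallows it as an option and prints nothing
printf '%s\n' "$k"  # prints: -n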
By the way, this will NOT print sockets or fifos - is that intentional?

Creating a which command in bash script

For an assignment, I'm supposed to create a script called my_which.sh that will "do the same thing as the Unix command, but do it using a for loop over an if." I am also not allowed to call which in my script.
I'm brand new to this, and have been reading tutorials, but I'm pretty confused on how to start. Doesn't which just list the path name of a command?
If so, how would I go about displaying the correct path name without calling which, and while using a for loop and an if statement?
For example, if I run my script, it will echo % and wait for input. But then how do I translate that to finding the directory? So it would look like this?
#!/bin/bash
path=(`echo $PATH`)
echo -n "% "
read ans
for i in $path
do
    if [ -d $i ]; then
        echo $i
    fi
done
I would appreciate any help, or even any starting tutorials that can help me get started on this. I'm honestly very confused on how I should implement this.
Split your PATH variable safely. This is a general method to split a string at delimiters, that is 100% safe regarding any possible characters (including newlines):
IFS=: read -r -d '' -a paths < <(printf '%s:\0' "$PATH")
We artificially added : because if PATH ends with a trailing :, then it is understood that the current directory should be in PATH. While this is dangerous and not recommended, we must also take it into account if we want to mimic which. Without this trailing colon, a PATH like /bin:/usr/bin: would be split into
declare -a paths='( [0]="/bin" [1]="/usr/bin" )'
whereas with this trailing colon the resulting array is:
declare -a paths='( [0]="/bin" [1]="/usr/bin" [2]="" )'
This is one detail that other answers miss. Of course, we'll do this only if PATH is set and non-empty.
With this split PATH, we'll use a for-loop to check whether the argument can be found in the given directory. Note that this should be done only if the argument doesn't contain a / character! This is also something the other answers missed.
My version of which handles a unique option -a that prints all matching pathnames of each argument. Otherwise, only the first match is printed. We'll have to take this into account too.
My version of which handles the following exit status:
0 if all specified commands are found and executable
1 if one or more specified commands is nonexistent or not executable
2 if an invalid option is specified
We'll handle that too.
I guess the following mimics rather faithfully the behavior of my which (and it's pure Bash):
#!/bin/bash

show_usage() {
    printf 'Usage: %s [-a] args\n' "$0"
}

illegal_option() {
    printf >&2 'Illegal option -%s\n' "$1"
    show_usage
    exit 2
}

check_arg() {
    if [[ -f $1 && -x $1 ]]; then
        printf '%s\n' "$1"
        return 0
    else
        return 1
    fi
}

# manage options
show_only_one=true
while (($#)); do
    [[ $1 = -- ]] && { shift; break; }
    [[ $1 = -?* ]] || break
    opt=${1#-}
    while [[ $opt ]]; do
        case $opt in
            (a*) show_only_one=false; opt=${opt#?} ;;
            (*) illegal_option "${opt:0:1}" ;;
        esac
    done
    shift
done

# If no arguments left or empty PATH, exit with return code 1
(($#)) || exit 1
[[ $PATH ]] || exit 1

# split path
IFS=: read -r -d '' -a paths < <(printf '%s:\0' "$PATH")

ret=0
# loop on arguments
for arg; do
    # Check whether arg contains a slash
    if [[ $arg = */* ]]; then
        check_arg "$arg" || ret=1
    else
        this_ret=1
        for p in "${paths[@]}"; do
            if check_arg "${p:-.}/$arg"; then
                this_ret=0
                "$show_only_one" && break
            fi
        done
        ((this_ret==1)) && ret=1
    fi
done

exit "$ret"
To test whether an argument is executable or not, I'm checking whether it's a regular file[1] which is executable with:
[[ -f $arg && -x $arg ]]
I guess that's close to my which's behavior.
[1] As @mklement0 points out (thanks!) the -f test, when applied against a symbolic link, tests the type of the symlink's target.
#!/bin/bash

#Get the user's first argument to this script
exe_name=$1

#Set the field separator to ":" (this is what the PATH variable
# uses as its delimiter), then read the contents of the PATH
# into the array variable "paths" -- at the same time splitting
# the PATH by ":"
IFS=':' read -r -a paths <<< "$PATH"

#Iterate over each of the paths in the "paths" array
for e in "${paths[@]}"
do
    #Check for the $exe_name in this path
    find "$e" -name "$exe_name" -maxdepth 1
done
This is similar to the accepted answer with the difference that it does not set the IFS and checks if the execute bits are set.
#!/bin/bash
for i in $(echo "$PATH" | tr ":" "\n")
do
    find "$i" -name "$1" -perm +111 -maxdepth 1
done
Save this as my_which.sh (or some other name) and run it as ./my_which java etc.
However if there is an "if" required:
#!/bin/bash
for i in $(echo "$PATH" | tr ":" "\n")
do
    # this is a one-liner that works. However the user requires an if statement
    # find "$i" -name "$1" -perm +111 -maxdepth 1
    cmd=$i/$1
    if [[ ( -f "$cmd" || -L "$cmd" ) && -x "$cmd" ]]
    then
        echo "$cmd"
        break
    fi
done
You might want to take a look at this link to figure out the tests in the "if".
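For reference, the three tests used in that if are standard Bash conditional expressions; a small, purely illustrative demo (the path is just an example) would be:
cmd=/usr/bin/env              # example path, purely illustrative
[[ -f "$cmd" ]] && echo "regular file (symlinks are followed)"
[[ -L "$cmd" ]] && echo "symbolic link"
[[ -x "$cmd" ]] && echo "executable by the current user"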
For a complete, rock-solid implementation, see gniourf_gniourf's answer.
Here's a more concise alternative that makes do with a single invocation of find [per name to investigate].
The OP later clarified that an if statement should be used in a loop, but the question is general enough to warrant considering other approaches.
A naïve implementation would even work as a one-liner, IF you're willing to make a few assumptions (the example uses 'ls' as the executable to locate):
find -L ${PATH//:/ } -maxdepth 1 -type f -perm -u=x -name 'ls' 2>/dev/null
The assumptions - which will hold in many, but not all situations - are:
$PATH must not contain entries that when used unquoted result in shell expansions (e.g., no embedded spaces that would result in word splitting, no characters such as * that would result in pathname expansion)
$PATH must not contain an empty entry (which must be interpreted as the current dir).
Explanation:
-L tells find to investigate the targets of symlinks rather than the symlinks themselves - this ensures that symlinks to executable files are also recognized by -type f
${PATH//:/ } replaces all : chars. in $PATH with a space each, causing the result - due to being unquoted - to be passed as individual arguments split by spaces.
-maxdepth 1 instructs find to only look directly in each specified directory, not also in subdirectories
-type f matches only files, not directories.
-perm -u=x matches only files and directories that the current user (u) can execute (x).
2>/dev/null suppresses error messages that may stem from non-existent directories in the $PATH or failed attempts to access files due to lack of permission.
Here's a more robust script version:
Note:
For brevity, only handles a single argument (and no options).
Does NOT handle the case where entries or result paths may contain embedded \n chars - however, this is extremely rare in practice and likely leads to bigger problems overall.
#!/bin/bash

# Assign argument to variable; error out, if none given.
name=${1:?Please specify an executable filename.}

# Robustly read individual $PATH entries into a bash array, splitting by ':'
# - The additional trailing ':' ensures that a trailing ':' in $PATH is
#   properly recognized as an empty entry - see gniourf_gniourf's answer.
IFS=: read -r -a paths <<<"${PATH}:"

# Replace empty entries with '.' for use with `find`.
# (Empty entries imply '.' - this is legacy behavior mandated by POSIX).
for (( i = 0; i < "${#paths[@]}"; i++ )); do
    [[ "${paths[i]}" == '' ]] && paths[i]='.'
done

# Invoke `find` with *all* directories and capture the 1st match, if any, in a variable.
# Simply remove `| head -n 1` to print *all* matches.
match=$(find -L "${paths[@]}" -maxdepth 1 -type f -perm -u=x -name "$name" 2>/dev/null |
    head -n 1)

# Print result, if found, and exit with appropriate exit code.
if [[ -n $match ]]; then
    printf '%s\n' "$match"
    exit 0
else
    exit 1
fi

rename files by pulling new names from a file

I have a bunch of files that need to be renamed and the new name is in a text file.
Example file name:
ASBC_Fishbone_Ia.pdf
Example entry in text file:
Ia. Propagation—Design Considerations
Expected new file name:
Ia. Propagation—Design Considerations.pdf
or
Ia._Propagation—Design_Considerations
What would be a good way of going about this using typical linux cli tools? I'm thinking some combination of ls, grep and rename?
You can try:
#!/bin/bash

# Do not allow the script to run if it's not Bash or Bash version is < 4.0 .
[ -n "$BASH_VERSION" ] && [[ BASH_VERSINFO -ge 4 ]] || exit 1

# Do not allow presenting glob pattern if no match is found.
shopt -s nullglob

# Use an associative array.
declare -A MAP=() || exit 1

while IFS=$'\t' read -r CODE NAME; do
    # Maps name with code e.g. MAP['Ia']='Propagation—Design Considerations'
    MAP[${CODE%.}]=$NAME
done < /path/to/text_file

# Change directory. Not needed if files are in current directory.
cd "/path/to/dir/containing/files" || exit 1

for FILE in *_*.pdf; do
    # Get code from filename.
    CODE=${FILE##*_} CODE=${CODE%.pdf}
    # Skip if no code was extracted from file.
    [[ -n $CODE ]] || continue
    # Get name from map based on code.
    NAME=${MAP[$CODE]}
    # Skip if no new name was registered based on code.
    [[ -n $NAME ]] || continue
    # Generate new name.
    NEW_NAME="${CODE}. $NAME.pdf"
    # Replace spaces with _ at your preference. Uncomment if wanted.
    # NEW_NAME=${NEW_NAME// /_}
    # Rename file. Remove echo if you find it correct already.
    echo mv -- "$FILE" "$NEW_NAME"
done
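Note that the read loop above assumes the text file holds one tab-separated entry per line, code first and then the name, for example (<TAB> stands for a literal tab character; the layout itself is an assumption based on the IFS=$'\t' read):
Ia.<TAB>Propagation—Design Considerations
If your text file uses a different separator, adjust the IFS setting accordingly.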
