Bash problem with function that hides files

I have this function
function hide {
for f in "$#"; do
if [[ ! ${f::1} == '.' ]]; then
mv $f .$f
fi
done
}
that should hide a file passed as input, if it is not already hidden.
When I use it on files whose names contain spaces, like:
touch "ciao ciao"
hide ciao\ ciao
it doesn't work and I get this error instead:
usage: mv [-f | -i | -n] [-v] source target
mv [-f | -i | -n] [-v] source ... directory
I tried changing .$f to ."$f" in the mv command but I still get the error.

Improving @rtx13's answer:
Handle full paths safely
Prioritize use of POSIX syntax over Bash specifics.
#!/usr/bin/env sh
hide ()
{
# Without 'in', the for loop iterates over the function's arguments
for path; do
# Strips out the leading directory path to keep the file name only
file="${path##*/}"
# If file name starts with a dot, continue to next argument
[ -n "${file##.*}" ] || continue
# Strips out the trailing file name to keep the leading directory path only
base="${path%/*}"
# If base is same as file, then there is no leading dir, so prepend current
if [ "$base" = "$file" ]; then
base='.'
path="./$path"
fi
# Performs the file rename
mv --no-clobber -- "$path" "$base/.$file" # Always double quote variables expansion
done
}
EDIT:
Following gniourf_gniourf's good comments.
Simplified, but kept the harmless -- end-of-options terminator as a pointer toward good practice.
Added --no-clobber so that existing files are not overwritten.
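For illustration, a hedged usage sketch of the function above, assuming a mv that supports --no-clobber (GNU coreutils); the file and directory names are invented for the example:
# Hypothetical usage sketch: names below are made up for the example
mkdir -p /tmp/hide_demo
touch 'ciao ciao' '/tmp/hide_demo/another file' .already_hidden
hide 'ciao ciao'                    # renamed to '.ciao ciao' in the current directory
hide '/tmp/hide_demo/another file'  # renamed to '/tmp/hide_demo/.another file'
hide .already_hidden                # skipped: the name already starts with a dot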

Per @Jetchisel, you need to quote the variables when passing args to mv to ensure spaces are preserved:
function hide {
for f in "$#"; do
if [[ ! ${f::1} == '.' ]]; then
mv "$f" ".$f" # <= note variable references are quoted
fi
done
}
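With the quoting in place, the failing example from the question behaves as intended; a quick sketch:
touch "ciao ciao"
hide ciao\ ciao
ls -A   # now lists '.ciao ciao'; no mv usage error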

Related

How can I check if exists file with name according to "template" in the directory?

Given a variable named template, for example: template=*.txt.
How can I check whether files matching this template exist in the current directory?
For example, according to the value of the template above, I want to know whether there are files with the suffix .txt in the current directory.
I would do it like this with just built-ins:
templcheck () {
for f in * .*; do
[[ -f $f ]] && [[ $f = $1 ]] && return 0
done
return 1
}
This takes the template as an argument (must be quoted to prevent premature expansion) and returns success if there was a match in the current directory. This should work for any filenames, including those with spaces and newlines.
Usage would look like this:
$ ls
file1.txt 'has space1.txt' script.bash
$ templcheck '*.txt' && echo yes
yes
$ templcheck '*.md' && echo yes || echo no
no
To use with the template contained in a variable, that expansion has to be quoted as well:
templcheck "$template"
Use find:
: > found.txt # Ensure the file is empty
find . -prune -exec find -name "$template" \; > found.txt
if [ -s found.txt ]; then
echo "Matching files found"
else
echo "No matching files"
fi
Strictly speaking, you can't assume that found.txt contains exactly one file name per line; a filename with an embedded newline will look the same as two separate files. But this does guarantee that an empty file means no matching files.
If you want an accurate list of matching file names, you need to disable field splitting while keeping pathname expansion.
[[ -v IFS ]] && OLD_IFS=$IFS
IFS=
shopt -s nullglob
files=( $template )
[[ -v OLD_IFS ]] && IFS=$OLD_IFS
printf "Found: %s\n" "${files[#]}"
This requires several bash extensions (the nullglob option, arrays, and the -v operator for convenience of restoring IFS). Each element of the array is exactly one match.
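To turn that into the yes/no answer the question asks for, a minimal sketch, assuming the files array was built as above, is to test whether the array is empty:
# Assuming the files array was populated as shown above
if (( ${#files[@]} )); then
echo "Matching files found: ${#files[@]}"
else
echo "No matching files"
fi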

How to pass files to a script that processes folders

So I have this bash script which will rename all the files from the current directory. I need help modifying it so I can instead specify only certain files which will be renamed, but also still have the ability to pass it a directory instead. I'm not super familiar with bash so it's fairly confusing to me.
#!/bin/bash
#
# Filename: rename.sh
# Description: Renames files and folders to lowercase recursively
# from the current directory
# Variables: Source = x
# Destination = y
#
# Rename all directories. This will need to be done first.
#
# Process each directory’s contents before the directory itself
for x in `find * -depth -type d`;
do
# Translate Caps to Small letters
y=$(echo $x | tr '[A-Z]' '[a-z]');
# check if directory exits
if [ ! -d $y ]; then
mkdir -p $y;
fi
# check if the source and destination is the same
if [ "$x" != "$y" ]; then
# check if there are files in the directory
# before moving it
if [ $(ls "$x") ]; then
mv $x/* $y;
fi
rmdir $x;
fi
done
#
# Rename all files
#
for x in `find * -type f`;
do
# Translate Caps to Small letters
y=$(echo $x | tr '[A-Z]' '[a-z]');
if [ "$x" != "$y" ]; then
mv $x $y;
fi
done
exit 0
Your script has a large number of beginner errors, but the actual question in the title has some merit.
For a task like this, I would go for a recursive solution.
tolower () {
local f g
for f; do
# If this is a directory, process its contents first
if [ -d "$f" ]; then
# Recurse -- run the same function over directory entries
tolower "$f"/*
fi
# Convert file name to lower case (Bash 4+)
g=${f,,}
# If lowercased version differs from original, move it
if [ "${f##*/}" != "${g##*/}" ]; then
mv "$f" "$g"
fi
done
}
Notice how variables that contain file names always need to be quoted (otherwise your script will fail on file names containing shell metacharacters), and how recent versions of Bash have built-in functionality for lowercasing a variable's value.
Also, tangentially, don't use ls in scripts and try http://shellcheck.net/ before asking for human debugging help.
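A hedged usage sketch (the names are invented for the example):
# Lowercase everything under the current directory
tolower *
# Or only the named files and directories
tolower "Some Dir" "README.TXT"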

ls $FOLDER_PATH with space in $FOLDER_PATH: No such file or directory

I am trying to get the filename in a folder with only one file in it.
FYI: The $FOLDER_TMP contains a space in it, which is why I use printf
function nameofkeyfile(){
FOLDER_TMP="${PWD%/*/*}/folder/"
FOLDER=$(printf %q "${FOLDER_TMP}")
FILENAME=ls "$FOLDER" # Error: No such file or directory
# or this: FILENAME=$(ls "$FOLDER") # Error: No such file or directory
FNAME=`basename $FILENAME`
}
The problem is the line:
FILENAME=ls "$FOLDER" # Error: No such file or directory
Do you know why - and yes the folder is there?
And if I echo the $FOLDER it gives me the right folder.
I am trying to get the filename in a folder with only one file in it.
You definitely have the wrong approach.
Instead, consider using globbing like so:
The assignment
fname=( "${PWD%/*/*}"/folder/* )
will populate the array fname with the expansion of the given glob: that is, all files in the directory "${PWD%/*/*}"/folder/, if any. If there are no files at all, your array will contain the glob, verbatim.
Hence, a more robust approach is the following:
nameofkeyfile() {
fname=( "${PWD%/*/*}"/folder/* )
# Now check that there's at most one element in the array
if (( ${#fname[@]} > 1 )); then
echo "Oh no, there are too many files in your folder"
return 1
fi
# Now check that there is a file
if [[ ! -f ${fname[0]} ]]; then
echo "Oh no, there are no files in your folder"
return 1
fi
# Here, all is good!
echo "Your file is: $fname"
}
This uses Bash (named) arrays. If you want the function to be POSIX-compliant, it's rather straightforward since POSIX shells have an unnamed array (the positional parameters):
# POSIX-compliant version
nameofkeyfile() {
set -- "${PWD%/*/*}"/folder/*
# Now check that there's at most one element in the array
if [ "$#" -gt 1 ]; then
echo "Oh no, there are too many files in your folder"
return 1
fi
# Now check that there is a file
if [ ! -f "$1" ]; then
echo "Oh no, there are no files in your folder"
return 1
fi
# Here, all is good!
echo "Your file is: $1, I'll store it in variable fname for you"
fname=$1
}
I didn't strip the full path from the filename, but that's really easy (don't use basename for that! see note 1 below):
fname=${fname##*/}
More precisely: in the Bash version, you'd use:
fname=${fname[0]##*/}
and in the POSIX version you'd use:
fname=${1##*/}
1 There's a catch when using parameter expansions to get the basename: the case of /. But it seems you won't be in this case, so it's all safe!
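To make that catch concrete, a small hedged sketch comparing the two on the edge case:
path=/
echo "${path##*/}"   # prints an empty line: the whole string matched '*/' and was stripped
basename /           # prints '/'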
To store the output of ls "$FOLDER" in a variable, use command substitution:
FILENAME=$(ls "$FOLDER")
Another problem is the printf.
It adds escaping backslashes to the string,
and when you try to list the directory in the next step,
those backslashes are passed to ls as literal characters in the path.
So drop the printf:
function nameofkeyfile() {
FOLDER="${PWD%/*/*}/folder/"
FILENAME=$(ls "$FOLDER")
FNAME=$(basename "$FILENAME")
}
Lastly, it's better to use $(...) than `...`, not least because it nests cleanly:
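A small illustration (the path is invented for the example):
# $(...) nests without extra escaping
outer=$(basename "$(dirname /some/path/file.txt)")
# The backtick form needs escaped inner backticks and is harder to read
outer=`basename \`dirname /some/path/file.txt\``
echo "$outer"   # prints: path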

Creating a which command in bash script

For an assignment, I'm supposed to create a script called my_which.sh that will "do the same thing as the Unix command, but do it using a for loop over an if." I am also not allowed to call which in my script.
I'm brand new to this, and have been reading tutorials, but I'm pretty confused on how to start. Doesn't which just list the path name of a command?
If so, how would I go about displaying the correct path name without calling which, and while using a for loop and an if statement?
For example, if I run my script, it will echo % and wait for input. But then how do I translate that to finding the directory? So it would look like this?
#!/bin/bash
path=(`echo $PATH`)
echo -n "% "
read ans
for i in $path
do
if [ -d $i ]; then
echo $i
fi
done
I would appreciate any help, or even any starting tutorials that can help me get started on this. I'm honestly very confused on how I should implement this.
Split your PATH variable safely. This is a general method to split a string at delimiters, that is 100% safe regarding any possible characters (including newlines):
IFS=: read -r -d '' -a paths < <(printf '%s:\0' "$PATH")
We artificially added : because if PATH ends with a trailing :, then it is understood that the current directory should be in PATH. While this is dangerous and not recommended, we must also take it into account if we want to mimic which. Without this trailing colon, a PATH like /bin:/usr/bin: would be split into
declare -a paths='( [0]="/bin" [1]="/usr/bin" )'
whereas with this trailing colon the resulting array is:
declare -a paths='( [0]="/bin" [1]="/usr/bin" [2]="" )'
This is one detail that other answers miss. Of course, we'll do this only if PATH is set and non-empty.
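A small hedged sketch of that split, using an artificial PATH value:
# Hypothetical PATH value chosen to show the empty trailing entry
demo_path='/bin:/usr/bin:'
IFS=: read -r -d '' -a demo_paths < <(printf '%s:\0' "$demo_path")
printf 'entry: [%s]\n' "${demo_paths[@]}"
# entry: [/bin]
# entry: [/usr/bin]
# entry: []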
With this split PATH, we'll use a for-loop to check whether the argument can be found in the given directory. Note that this should be done only if the argument doesn't contain a / character! This is also something other answers missed.
My version of which handles a single option, -a, that prints all matching pathnames of each argument. Otherwise, only the first match is printed. We'll have to take this into account too.
My version of which handles the following exit status:
0 if all specified commands are found and executable
1 if one or more specified commands is nonexistent or not executable
2 if an invalid option is specified
We'll handle that too.
I guess the following mimics rather faithfully the behavior of my which (and it's pure Bash):
#!/bin/bash
show_usage() {
printf 'Usage: %s [-a] args\n' "$0"
}
illegal_option() {
printf >&2 'Illegal option -%s\n' "$1"
show_usage
exit 2
}
check_arg() {
if [[ -f $1 && -x $1 ]]; then
printf '%s\n' "$1"
return 0
else
return 1
fi
}
# manage options
show_only_one=true
while (($#)); do
[[ $1 = -- ]] && { shift; break; }
[[ $1 = -?* ]] || break
opt=${1#-}
while [[ $opt ]]; do
case $opt in
(a*) show_only_one=false; opt=${opt#?} ;;
(*) illegal_option "${opt:0:1}" ;;
esac
done
shift
done
# If no arguments left or empty PATH, exit with return code 1
(($#)) || exit 1
[[ $PATH ]] || exit 1
# split path
IFS=: read -r -d '' -a paths < <(printf '%s:\0' "$PATH")
ret=0
# loop on arguments
for arg; do
# Check whether arg contains a slash
if [[ $arg = */* ]]; then
check_arg "$arg" || ret=1
else
this_ret=1
for p in "${paths[#]}"; do
if check_arg "${p:-.}/$arg"; then
this_ret=0
"$show_only_one" && break
fi
done
((this_ret==1)) && ret=1
fi
done
exit "$ret"
To test whether an argument is executable or not, I'm checking whether it's a regular file (see note 1) that is executable, with:
[[ -f $arg && -x $arg ]]
I guess that's close to my which's behavior.
1 As @mklement0 points out (thanks!) the -f test, when applied against a symbolic link, tests the type of the symlink's target.
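A brief hedged illustration of that note (the paths are invented for the example):
ln -sf /bin/ls /tmp/ls_link
[[ -f /tmp/ls_link && -x /tmp/ls_link ]] && echo "symlink to an executable regular file passes"
ln -sf /nonexistent /tmp/broken_link
[[ -f /tmp/broken_link ]] || echo "a broken symlink fails -f"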
#!/bin/bash
#Get the user's first argument to this script
exe_name=$1
#Set the field separator to ":" (this is what the PATH variable
# uses as its delimiter), then read the contents of the PATH
# into the array variable "paths" -- at the same time splitting
# the PATH by ":"
IFS=':' read -a paths <<< $PATH
#Iterate over each of the paths in the "paths" array
for e in ${paths[*]}
do
#Check for the $exe_name in this path
find $e -name $exe_name -maxdepth 1
done
This is similar to the accepted answer with the difference that it does not set the IFS and checks if the execute bits are set.
#!/bin/bash
for i in $(echo "$PATH" | tr ":" "\n")
do
find "$i" -name "$1" -perm +111 -maxdepth 1
done
Save this as my_which.sh (or some other name) and run it as ./my_which java etc.
However, if an if statement is required:
#!/bin/bash
for i in $(echo "$PATH" | tr ":" "\n")
do
# This is a one-liner that works. However, the user requires an if statement
# find "$i" -name "$1" -perm +111 -maxdepth 1
cmd=$i/$1
if [[ ( -f "$cmd" || -L "$cmd" ) && -x "$cmd" ]]
then
echo "$cmd"
break
fi
done
You might want to take a look at this link to figure out the tests in the "if".
For a complete, rock-solid implementation, see gniourf_gniourf's answer.
Here's a more concise alternative that makes do with a single invocation of find [per name to investigate].
The OP later clarified that an if statement should be used in a loop, but the question is general enough to warrant considering other approaches.
A naïve implementation would even work as a one-liner, IF you're willing to make a few assumptions (the example uses 'ls' as the executable to locate):
find -L ${PATH//:/ } -maxdepth 1 -type f -perm -u=x -name 'ls' 2>/dev/null
The assumptions - which will hold in many, but not all situations - are:
$PATH must not contain entries that when used unquoted result in shell expansions (e.g., no embedded spaces that would result in word splitting, no characters such as * that would result in pathname expansion)
$PATH must not contain an empty entry (which must be interpreted as the current dir).
Explanation:
-L tells find to investigate the targets of symlinks rather than the symlinks themselves - this ensures that symlinks to executable files are also recognized by -type f
${PATH//:/ } replaces all : chars. in $PATH with a space each, causing the result - due to being unquoted - to be passed as individual arguments split by spaces.
-maxdepth 1 instructs find to only look directly in each specified directory, not also in subdirectories
-type f matches only files, not directories.
-perm -u=x matches only files and directories that the current user (u) can execute (x).
2>/dev/null suppresses error messages that may stem from non-existent directories in the $PATH or failed attempts to access files due to lack of permission.
Here's a more robust script version:
Note:
For brevity, only handles a single argument (and no options).
Does NOT handle the case where entries or result paths may contain embedded \n chars - however, this is extremely rare in practice and likely leads to bigger problems overall.
#!/bin/bash
# Assign argument to variable; error out, if none given.
name=${1:?Please specify an executable filename.}
# Robustly read individual $PATH entries into a bash array, splitting by ':'
# - The additional trailing ':' ensures that a trailing ':' in $PATH is
# properly recognized as an empty entry - see gniourf_gniourf's answer.
IFS=: read -r -a paths <<<"${PATH}:"
# Replace empty entries with '.' for use with `find`.
# (Empty entries imply '.' - this is legacy behavior mandated by POSIX).
for (( i = 0; i < "${#paths[@]}"; i++ )); do
[[ "${paths[i]}" == '' ]] && paths[i]='.'
done
# Invoke `find` with *all* directories and capture the 1st match, if any, in a variable.
# Simply remove `| head -n 1` to print *all* matches.
match=$(find -L "${paths[#]}" -maxdepth 1 -type f -perm -u=x -name "$name" 2>/dev/null |
head -n 1)
# Print result, if found, and exit with appropriate exit code.
if [[ -n $match ]]; then
printf '%s\n' "$match"
exit 0
else
exit 1
fi
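A hedged usage sketch, assuming the script is saved as my_which.sh and made executable:
chmod +x my_which.sh
./my_which.sh ls                      # prints something like /bin/ls, exit status 0
./my_which.sh nosuchcommand || echo "not found (exit status $?)"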

Bash: Pass alias or function as argument to program

Quite often I need to work on the newest file in a directory.
Normally I do:
ls -rt
and then open the last file in vim or less.
Now I wanted to produce an alias or function, like
lastline() {ls -rt | tail -n1}
# or
alias lastline=$(ls -rt | tail -n1)
Calling lastline outputs the newest file in the directory, which is nice.
But calling
less lastline
wants to open the file "lastline" which doesn't exist.
How do I make bash execute the function or alias, if possible without a lot of typing $() or ``?
Or is there any other way to achieve the same result?
Thanks for your help.
You're parsing ls, and this is very bad. Moreover, if the last modified “file” is a directory, you'll be lessing/viming a directory.
So you need a robust way to determine the last modified file in the current directory. You may use a helper function like the following (that you'll put in your .bashrc):
last_modified_regfile() {
# Finds the last modified regular file in current directory
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file
last_modified_regfile_ret=
for file in *; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
[[ $last_modified_regfile_ret ]]
}
Then you may define another function that will vim the last found file:
vimlastline() {
last_modified_regfile && vim -- "$last_modified_regfile_ret"
}
You may even have last_modified_regfile take optional arguments: the directories where it will find the last modified regular file:
last_modified_regfile() {
# Finds the last modified regular file in current directory
# or in directories given as arguments
# Found file is in variable last_modified_regfile_ret
# Returns a failure return code if no reg files are found
local file dir
local save_shopt_nullglob=$(shopt -p nullglob)
shopt -s nullglob
(( $# )) || set .
last_modified_regfile_ret=
for dir; do
dir=${dir%/}
[[ -d $dir/ ]] || continue
for file in "$dir"/*; do
[[ -f $file ]] || continue
if [[ -z $last_modified_regfile_ret ]] || [[ $file -nt $last_modified_regfile_ret ]]; then
last_modified_regfile_ret=$file
fi
done
done
$save_shopt_nullglob
[[ $last_modified_regfile_ret ]]
}
Then you can even alter vimlastline accordingly:
vimlastline() {
last_modified_regfile "$#" && vim -- "$last_modified_regfile_ret"
}
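A hedged usage sketch (the directory names are invented for the example):
vimlastline                # edit the most recently modified regular file in the current directory
vimlastline ~/Downloads    # or in ~/Downloads
last_modified_regfile ~/Downloads /tmp && less -- "$last_modified_regfile_ret"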
Use command substitution like this:
lastline() { ls -rt | tail -n1; }
less "$(lastline)"
Or pipe it to xargs:
lastline | xargs -I {} less '{}'
