(This is Debian Squeeze amd64.)
I need to test whether a file is a member of a list of files.
So far my (test) script is:
set -x
array=$( ls )
echo $array
FILE=log.out
# This line gives error!
if $FILE in $array
then echo "success!"
else echo "bad!"
fi
exit 0
Any ideas?
Thanks for all the responses. To clarify: the script given is only an example; the actual problem is more complex. In the final solution this will be done within a loop, so the filename being tested for has to be in a variable.
Thanks again. Now my test script works, and reads:
in_list() {
    local search="$1"
    shift
    local list=("$@")
    for file in "${list[@]}"; do
        [[ "$file" == "$search" ]] && return 0
    done
    return 1
}
#
# set -x
array=( * ) # Array of files in current dir
# echo $array
FILE="log.out"
if in_list "$FILE" "${array[@]}"
then echo "success!"
else echo "bad!"
fi
exit 0
if ls | grep -q -x t1 ; then
echo Success
else
echo Failure
fi
grep -x matches full lines only, so ls | grep -x t1 only succeeds if a file named exactly t1 exists; -q suppresses the output and just sets the exit status.
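Applied to the original question, with the filename in a variable, that might look like this (a sketch; -F treats the name literally rather than as a regex, and -- guards against names starting with a dash):
FILE=log.out
if ls | grep -qxF -- "$FILE"; then
    echo "success!"
else
    echo "bad!"
fi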
If you just want to check if a file exists, then
[[ -f "$file" ]] && echo yes || echo no
If your array contains a list of files generated by some means other than ls, then you have to iterate over it as demonstrated by Sorpigal.
How about
in_list() {
    local search="$1"
    shift
    local list=("$@")
    for file in "${list[@]}"; do
        [[ $file == $search ]] && return 0
    done
    return 1
}
if in_list log.out * ; then
echo 'success!'
else
echo 'bad!'
fi
EDIT: made it a bit less idiotic.
EDIT #2:
Of course if all you're doing is looking in the current directory to see if a particular file is there, which is effectively what the above is doing, then you can just say
[ -e log.out ] && echo 'success!' || echo 'bad!'
If you're actually doing something more complicated involving lists of files then this might not be sufficient.
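If the list is long and you test membership many times, another option (not from the answers above; a sketch assuming bash 4+ for associative arrays) is to index the list once and look names up directly:
declare -A seen
for f in *; do              # or whatever generates your list of names
    seen["$f"]=1
done

FILE=log.out
if [[ -n ${seen[$FILE]} ]]; then
    echo 'success!'
else
    echo 'bad!'
fi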
I can't tell if something I'm trying here is simply impossible or if I'm really lacking knowledge in bash's syntax. This is the first script I've written.
I've got a Nextcloud instance that I am backing up daily using a script. I want to log the output of the script as it runs to a log file. This is working fine, but I wanted to see if I could also pipe the Nextcloud occ command's output to the log file too.
I've got an if statement here checking if the file scan fails:
if ! sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all; then
Print "Error: Failed to scan files. Are you in maintenance mode?"
fi
This works fine and I am able to handle the error if the system cannot execute the command. The error string above is sent to this function:
Print()
{
if [[ "$logging" -eq 1 ]] && [ "$quiet_mode" = "No" ]; then
echo "$1" | tee -a "$log_file"
elif [[ "$logging" -eq 1 ]] && [ "$quiet_mode" = "Yes" ]; then
echo "$1" >> "$log_file"
elif [[ "$logging" -eq 0 ]] && [ "$quiet_mode" = "No" ]; then
echo "$1"
fi
}
How can I make it so the output of the occ command is also piped to the Print() function so it can be logged to the console and log file?
I've tried piping the command after ! using | Print without success.
Any help would be appreciated, cheers!
The Print function doesn't read standard input so there's no point piping data to it. One possible way to do what you want with the current implementation of Print is:
if ! occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1); then
Print "Error: Failed to scan files. Are you in maintenance mode?"
fi
Print "'occ' output: $occ_output"
Since there is only one line in the body of the if statement you could use || instead:
occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1) \
|| Print "Error: Failed to scan files. Are you in maintenance mode?"
Print "'occ' output: $occ_output"
The 2>&1 causes both standard output and error output of occ to be captured to occ_output.
Note that the body of the Print function could be simplified to:
[[ $quiet_mode == No ]] && printf '%s\n' "$1"
(( logging )) && printf '%s\n' "$1" >> "$log_file"
See the accepted, and excellent, answer to Why is printf better than echo? for an explanation of why I replaced echo "$1" with printf '%s\n' "$1".
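Putting those two lines together, the whole function would then read (just the simplification described above, keeping the original variable names):
Print()
{
    [[ $quiet_mode == No ]] && printf '%s\n' "$1"
    (( logging )) && printf '%s\n' "$1" >> "$log_file"
}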
How's this? A bit unorthodox perhaps.
Print()
{
    case $# in
        0) cat;;
        *) echo "$@";;
    esac |
    if [[ "$logging" -eq 1 ]] && [ "$quiet_mode" = "No" ]; then
        tee -a "$log_file"
    elif [[ "$logging" -eq 1 ]] && [ "$quiet_mode" = "Yes" ]; then
        cat >> "$log_file"
    elif [[ "$logging" -eq 0 ]] && [ "$quiet_mode" = "No" ]; then
        cat
    fi
}
With this, you can either
echo "hello mom" | Print
or
Print "hello mom"
and so your invocation could be refactored to
if ! sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all; then
echo "Error: Failed to scan files. Are you in maintenance mode?"
fi |
Print
The obvious drawback is that piping into a function loses the exit code of any failure earlier in the pipeline.
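If you do need occ's exit status with this approach, bash's PIPESTATUS array can recover it. A sketch (assuming the stdin-reading Print above):
sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1 | Print
status=${PIPESTATUS[0]}     # exit code of occ, not of Print
if (( status != 0 )); then
    Print "Error: Failed to scan files. Are you in maintenance mode?"
fi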
For a more traditional approach, keep your original Print definition and refactor the calling code to
if output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1); then
: nothing
else
Print "error $?: $output"
Print "Error: Failed to scan files. Are you in maintenance mode?"
fi
I would imagine that the error message will be printed to standard error, not standard output; hence the addition of 2>&1
I included the error code $? in the error message in case that would be useful.
The sending and receiving ends of a pipe must be processes, typically represented by executable commands. An if statement is not a process by itself. You can, of course, put such a statement into a process. For example,
echo a | (
if true
then
cat
fi )
causes cat to write a to stdout, because the parentheses put it into a child process.
UPDATE: As was pointed out in a comment, the explicit subprocess is not needed. One can also write
echo a | if true
then
cat
fi
I seem to have this problem: my script breaks at line 119, which uses a bash associative array. I am sorry for all the comments, but I am kind of new to bash scripting. This is the code:
#!/bin/bash
# Aliases file
# Command usage: cpRecent/mvRecent -d {dirFrom},{dirTo} -n {numberofFiles} -e {editTheNames}
# Error codes
NO_ARGS="You need to pass in an argument"
INVALID_OPTION="Invaild option:"
NO_DIRECTORY="No directory found"
# Return values
fullpath=
directories=
numfiles=
interactive=
typeset -a files
typeset -A filelist
# Advise that you use relative paths
__returnFullPath(){
local npath
if [[ -d $1 ]]; then
cd "$(dirname $1)"
npath="$PWD/$(basename $1)"
npath="$npath/" #Add a slash
npath="${npath%.*}" #Delete .
fi
fullpath=${npath:=""}
}
__usage(){
wall <<End-Of-Message
________________________________________________
<cpRecent/mvRecent> -d "<d1>,<d2>" -n <num> [-i]
-d First flag: Takes two arguments
-n Second flag: Takes one argument
-i Takes no arguments. Interactive mode
d1 Directory we are reading from
d2 Directory we are writing to
num Number of files
________________________________________________
End-Of-Message
}
__processOptions(){
while getopts ":d:n:i" opt; do
case $opt in
d ) IFS=',' read -r -a directories <<< "$OPTARG";;
n ) numfiles=$OPTARG;;
i ) interactive=1;;
\? ) echo "$INVALID_OPTION -$OPTARG" >&2 ; return 1;;
: ) echo "$NO_ARGS"; __usage; return 1;;
* ) __usage; return 1;;
esac
done
}
__getRecentFiles(){
# Check some conditions
(( ${#directories[@]} != 2 )) && echo "$INVALID_OPTION Number of directories must be 2" && return 2
#echo ${directories[0]} ${directories[1]}
# Get the full paths of the directories to be read from/written to
__returnFullPath "${directories[0]}"
directories[0]="$fullpath"
__returnFullPath "${directories[1]}"
directories[1]="$fullpath"
if [[ -z ${directories[0]} || -z ${directories[1]} ]]; then
echo $NO_DIRECTORY
return 3
fi
[[ numfiles != *[!0-9]* ]] && echo "$INVALID_OPTION Number of files cannot be a string" && return 4
#numfiles=$(($numfiles + 0))
(( $numfiles == 0 )) && echo "$INVALID_OPTION Number of files cannot be zero" && return 4
local num="-"$numfiles""
# Get the requested files in directory(skips directories)
if [[ -n "$(ls -t ${directories[0]} | head $num)" ]]; then
# For some reason using local -a or declare -a does not seem to split the string into two
local tempfiles=($(ls -t ${directories[0]} | head $num))
#IFS=' ' read -r -a tempfiles <<< "$string"
#echo ${tempfiles[@]}
for index in "${!tempfiles[@]}"; do
echo $index ${tempfiles[index]}
[[ -f "${directories[0]}${tempfiles[index]}" ]] && files+=("${tempfiles[index]}")
done
fi
}
####################################
# The problem is this piece of code
__processLines(){
local name
local answer
local dirFrom
local dirTo
if [[ -n $interactive ]]; then
for (( i=0; i < ${#files[@]}; i++ )); do
name=${files[i]}
read -n 1 -p "Old name: $name. Do you wish to change the name(y/n)?" answer
[[ answer="y" ]] && read -p "Enter new name:" name
dirFrom="${directories[0]}${files[i]}"
dirTo="${directories[1]}$name"
fileslist["$dirFrom"]="$dirTo"
done
else
for line in $files; do
dirFrom="${directories[0]}$line"
echo $dirFrom # => /home/reclusiarch/Documents/test
dirTo="${directories[1]}$line"
echo $dirTo # => /home/reclusiarch/test
fileslist["$dirFrom"]="$dirTo" # This is the offending line
done
fi
}
###########################################################
cpRecent(){
__processOptions $*
__getRecentFiles
__processLines
for line in "${!filelist[@]}"; do
cp $line ${filelist[$line]}
done
echo "You have copied ${#fileList[#]} files"
unset files
unset filelist
return
}
mvRecent(){
__processOptions $*
__getRecentFiles
__processLines
for line in "${!filelist[@]}"; do
mv $line ${filelist[$line]}
done
echo "You have copied ${#fileList[#]} files"
unset files
unset filelist
return
}
cpRecent "$*"
I have tried a lot of things. To run the script,
$ bash -x ./testing.sh -d "Documents,." -n 2
But nothing seems to work:
The error is this (when using bash -x):
./testing.sh: line 119: /home/reclusiarch/Documents/test: syntax error: operand expected (error token is "/home/reclusiarch/Documents/test")
If I run that section on the command line, it works:
$ typeset -A filelist
$ filelist["/home/reclusiarch/Documents/test"]=/home/reclusiarch/test
$ echo ${filelist["/home/reclusiarch/Documents/test"]}
/home/reclusiarch/test
Thanks for your help!!
Edit: I initially pared the script down to the piece of offending code, but that might make it not run. Again, if you want to test it, you can run the bash command given. (The script ideally would reside in the user's $HOME directory.)
Edit: Solved (Charles Duffy solved it). It was a simple mistake of forgetting which name was which.
Your declaration is:
typeset -A filelist
However, your usage is:
fileslist["$dirFrom"]="$dirTo"
fileslist is not filelist.
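That typo also explains the specific error message: since fileslist was never declared with typeset -A, bash treats it as an indexed array and evaluates the string subscript as an arithmetic expression, which chokes on the slashes. A minimal reproduction (hypothetical names, just to illustrate):
unset demo                        # undeclared, so bash assumes an indexed array
demo["/home/user/some/path"]=x    # subscript is evaluated as arithmetic
# -> syntax error: operand expected (error token is "/home/user/some/path")

typeset -A demo2                  # declared associative: string subscripts are fine
demo2["/home/user/some/path"]=x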
I have a script that must be able to accept its input either as a file given as the first argument or on stdin. If more than one argument is given, it should be rejected.
The goal I'm trying to accomplish is to accept both of these invocations:
./myscript myfile
AND
./myscript < myfile
What I have so far is
if [ "$#" -eq 1 ]; then #check argument
if [ -t 0 ]; then #check whether input from keyboard (read from github)
VAR=${1:-/dev/stdin} #get value to VAR
#then do stuff here!!
else #if not input from keyboard
VAR=$1
if [ ! -f "$VAR" ]; then #check whether file readable
echo "ERROR!"
else
#do stuff heree!!!
fi
fi
fi
The PROBLEM is when I tried to say
./myscript < myfile
it prints
ERROR!
I don't know whether this is the correct way to do this; I would really appreciate any suggestion or the correct code for my problem. Thank you.
#!/bin/bash
# if nothing passed in command line pass "/dev/stdin" to myself
# so all below code can be made branch-free
[[ ${#} -gt 0 ]] || set -- /dev/stdin
# loop through the command line arguments, treating them as file names
for f in "$@"; do
    echo "$f"
    [[ -r $f ]] && while IFS= read -r line; do echo 'echo:' "$line"; done < "$f"
done
Examples:
$ args.sh < input.txt
$ args.sh input.txt
$ cat input.txt | args.sh
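Alternatively, the ${1:-/dev/stdin} idea from the question can be used directly, without testing -t 0 at all. A sketch (the argument check and error message mirror the question):
#!/bin/bash
if [ "$#" -gt 1 ]; then
    echo "ERROR!" >&2
    exit 1
fi
input=${1:-/dev/stdin}            # fall back to stdin when no file is named
if [ ! -r "$input" ]; then
    echo "ERROR!" >&2
    exit 1
fi
while IFS= read -r line; do
    echo "$line"                  # do stuff here
done < "$input"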
Sorry for asking this question again. I have already received an answer that uses find, but unfortunately I need to write this without using such predefined commands.
I am trying to write a script that loops recursively through the subdirectories of the current directory. It should check the file count in each directory. If the file count is greater than 10, it should write the names of all those files to a file named "BigList"; otherwise it should write them to a file named "ShortList". The output should look like this:
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
---<directory name>
<filename>
<filename>
<filename>
<filename>
....
My script only works if subdirectories don't include subdirectories in turn.
I am confused about this because it doesn't work as I expect.
Here is my script
#!/bin/bash
parent_dir=""
if [ -d "$1" ]; then
path=$1;
else
path=$(pwd)
fi
parent_dir=$path
loop_folder_recurse() {
local files_list=""
local cnt=0
for i in "$1"/*;do
if [ -d "$i" ];then
echo "dir: $i"
parent_dir=$i
echo before recursion
loop_folder_recurse "$i"
echo after recursion
if [ $cnt -ge 10 ]; then
echo -e "---"$parent_dir >> BigList
echo -e $file_list >> BigList
else
echo -e "---"$parent_dir >> ShortList
echo -e $file_list >> ShortList
fi
elif [ -f "$i" ]; then
echo file $i
if [ $cur_fol != $main_pwd ]; then
file_list+=$i'\n'
cnt=$((cnt + 1))
fi
fi
done
}
echo "Base path: $path"
loop_folder_recurse $path
How can I modify my script to produce the desired output?
This bash script produces the output that you want:
#!/bin/bash
bigfile="$PWD/BigList"
shortfile="$PWD/ShortList"
shopt -s nullglob
loop_folder_recurse() {
(
[[ -n "$1" ]] && cd "$1"
for i in */; do
[[ -d "$i" ]] && loop_folder_recurse "$i"
count=0
files=''
for j in *; do
if [[ -f "$j" ]]; then
files+="$j"$'\n'
((++count))
fi
done
if ((count > 10)); then
outfile="$bigfile"
else
outfile="$shortfile"
fi
echo "$i" >> "$outfile"
echo "$files" >> "$outfile"
done
)
}
loop_folder_recurse
Explanation
shopt -s nullglob is used so that when a directory is empty, the loop will not run. The body of the function is within ( ) so that it runs within a subshell. This is for convenience, as it means that the function returns to the previous directory when the subshell exits.
Hopefully the rest of the script is fairly self-explanatory but if not, please let me know and I will be happy to provide additional explanation.
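For example, the subshell behaviour on its own (the cd only affects the commands inside the parentheses):
pwd                  # e.g. /home/user
( cd /tmp && pwd )   # prints /tmp
pwd                  # still /home/user; the cd was confined to the subshell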
How does one test for the existence of files in a directory using bash?
if ... ; then
echo 'Found some!'
fi
To be clear, I don't want to test for the existence of a specific file. I would like to test if a specific directory contains any files.
I went with:
(
    shopt -s dotglob nullglob
    existing_files=( ./* )
    if [[ ${#existing_files[@]} -gt 0 ]]; then
        some_command "${existing_files[@]}"
    fi
)
Using the array avoids race conditions from reading the file list twice.
From the man page:
-f file
True if file exists and is a regular file.
So:
if [ -f someFileName ]; then echo 'Found some!'; fi
Edit: I see you already got the answer, but for completeness, you can use the info in Checking from shell script if a directory contains files - and lose the dotglob option if you want hidden files ignored.
I typically just use a cheap ls -A to see if there's a response.
Pseudo-maybe-correct-syntax-example-ahoy:
if [[ $(ls -A my_directory_path_variable) ]]; then ...
edit, this will work:
myDir=(./*); if [ ${#myDir[@]} -gt 1 ]; then echo "there's something down here"; fi
You can use ls in an if statement thus:
if [[ "$(ls -a1 | egrep -v '^\.$|^\.\.$')" = "" ]] ; then echo empty ; fi
or, thanks to ikegami,
if [[ "$(ls -A)" = "" ]] ; then echo empty ; fi
or, even shorter:
if [[ -z "$(ls -A)" ]] ; then echo empty ; fi
These basically list all files in the current directory (including hidden ones) that are neither . nor ...
If that list is empty, then the directory is empty.
If you want to discount hidden files, you can simplify it to:
if [[ "$(ls)" = "" ]] ; then echo empty ; fi
A bash-only solution (no invoking external programs like ls or egrep) can be done as follows:
emp=Y; for i in *; do if [[ $i != "*" ]]; then emp=N; break; fi; done; echo $emp
It's not the prettiest code in the world, it simply sets emp to Y and then, for every real file, sets it to N and breaks from the for loop for efficiency. If there were zero files, it stays as Y.
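Spread over several lines (same logic; with the default non-nullglob behaviour an empty directory leaves the literal * in $i):
emp=Y
for i in *; do
    if [[ $i != "*" ]]; then   # a real entry (file or directory) exists
        emp=N
        break
    fi
done
echo $emp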
Try this
if [ -f /tmp/foo.txt ]
then
echo the file exists
fi
ref: http://tldp.org/LDP/abs/html/fto.html
How about this to check whether a directory is empty or not:
$ find "/tmp" -type f -exec echo Found file {} \;
#!/bin/bash
if [ -e "$1" ]; then
    echo "File exists"
else
    echo "File does not exist"
fi
I don't have a good pure sh/bash solution, but it's easy to do in Perl:
#!/usr/bin/perl
use strict;
use warnings;
die "Usage: $0 dir\n" if scalar @ARGV != 1 or not -d $ARGV[0];
opendir my $DIR, $ARGV[0] or die "$ARGV[0]: $!\n";
my @files = readdir $DIR;
closedir $DIR;
if (scalar @files == 2) { # . and ..
    exit 0;
}
else {
    exit 1;
}
Call it something like emptydir and put it somewhere in your $PATH, then:
if emptydir dir ; then
echo "dir is empty"
else
echo "dir is not empty"
fi
It dies with an error message if you give it no arguments, two or more arguments, or an argument that isn't a directory; it's easy enough to change if you prefer different behavior.
# tested on Linux bash
# %h is the directory's hard-link count, which on most Linux filesystems is
# 2 plus the number of subdirectories, so this detects subdirectories
# rather than plain files
directory=$1
if test "$(stat -c %h "$directory")" -gt 2; then
    echo "not empty"
else
    echo "empty"
fi
For fun:
if ( shopt -s nullglob ; perl -e'exit !@ARGV' ./* ) ; then
echo 'Found some!'
fi
(Doesn't check for hidden files)