"Command not found" (simple bash script) - bash

I need to write a basic program that finds the files with an odd (uneven) size in bytes in a user-specified directory and then renames them. I wrote the code but can't figure out what's wrong with it, since I have only just begun to program bash scripts... I have 3 files in my directory and here are the errors I'm getting for them:
./Untitled: line 18: AppIcon.icns: command not found
mv: cannot stat ‘AppIcon.icns’: No such file or directory
./Untitled: line 18: AssociatedVm.txt: command not found
mv: cannot stat ‘AssociatedVm.txt’: No such file or directory
./Untitled: line 18: Info.plist: command not found
mv: cannot stat ‘Info.plist’: No such file or directory
My script code:
#!/bin/bash
n=0
echo "Specify directory"
read directory
if [ -d $directory ]; then
    echo "Directory found"
else
    echo "Directory not found"
    exit 0
fi
for file in $( ls $directory );
do
    fsize=$(stat "$directory/$file" -c %s)
    if [ $((fsize%2))=1 ]; then
        mv "$directory/$file" "$directory/$file.odd"
        n=$((n + 1))
    fi
done
echo "Number of renamed files: $n "

I think you meant
fsize=$(stat "$file" -c %s)
but you wrote
fsize=stat "$file" -c %s
Also, you need to use the full path ($directory/$file) instead of $file alone if you are running the script from a directory other than $directory.
Bash uses -eq for integer comparison, so you should also change
if [ $((fsize%2))=1 ]; then
to
if [ $((fsize%2)) -eq 1 ]; then
What is the -c %s for? I don't see a -c option in the stat man page. Did you mean -f? (EDIT: OK, I was looking at the Mac stat command, which is BSD. The GNU version of stat uses -c for the format specification.)
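Putting those fixes together, a minimal corrected sketch of the loop might look like this (it uses a glob instead of ls, as the related answers below suggest, and assumes GNU stat; on macOS/BSD the size format would be stat -f %z instead):
#!/bin/bash
n=0
echo "Specify directory"
read directory
if [ ! -d "$directory" ]; then
    echo "Directory not found"
    exit 1
fi
for file in "$directory"/*; do
    [ -f "$file" ] || continue            # skip anything that is not a regular file
    fsize=$(stat -c %s "$file")           # file size in bytes (GNU stat)
    if [ $((fsize % 2)) -eq 1 ]; then     # -eq for integer comparison
        mv "$file" "$file.odd"
        n=$((n + 1))
    fi
done
echo "Number of renamed files: $n"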

Related

For files in directory Bash [duplicate]

I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:
#!/bin/bash
for filename in "$1"/*; do
echo "File:"
echo $filename
done
And I am running the above using:
sh test.sh path/to/loop/over
However, the above doesn't output the files in the directory path/to/loop/over; it instead outputs:
File:
path/to/loop/over/*
I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:
File:
foo.txt
File:
bar.txt
Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding a /* after the $1; however, this doesn't seem to help (neither do these suggestions).
Iterate over content of directory
Compatible answer (not only bash)
As this question is tagged shell, there is a POSIX compatible way:
#!/bin/sh
for file in "$1"/* ;do
    [ -f "$file" ] && echo "Process '$file'."
done
This will be enough (and it works with filenames containing spaces):
$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.
This works well with any POSIX shell. Tested with bash, ksh, dash, zsh and busybox sh.
#!/bin/sh
cd "$1" || exit 1
for file in * ;do
    [ -f "$file" ] && echo "Process '$file'."
done
This version won't print the path:
$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
Some bash ways
Introduction
I don't like to use shopt when it's not needed... (it changes standard bash behaviour and makes scripts less readable).
There is an elegant way of doing this in standard bash, without requiring shopt.
Of course, the previous answer works fine under bash, but there are some interesting ways of making your script more powerful, flexible, pretty, and detailed...
Sample
#!/bin/bash
die() { echo >&2 "$0 ERROR: $@";exit 1;} # Emergency exit function
[ "$1" ] || die "Argument missing." # Exit unless argument submitted
[ -d "$1" ] || die "Arg '$1' is not a directory." # Exit if argument is not a dir
cd "$1" || die "Can't access '$1'." # Exit unless we can access the dir
files=(*) # All file names in array $files
[ -f "$files" ] || die "No files found." # Exit if no files found
for file in "${files[@]}";do # foreach file:
    echo Process "$file" # Process file
done
Explanation: considering globbing vs real files
When doing:
files=(/path/to/dir/*)
the variable $files becomes an array containing all the file names under /path/to/dir/:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
But if nothing matches the glob pattern, the star won't be replaced and the array becomes:
declare -p files
declare -a files=([0]="/path/to/dir/*")
From there, looking at $files is like looking at ${files[0]}, i.e. the first field of the array. So
[ -f "$files" ] || die "No files found."
will execute the die function unless the first field of the files array is a regular file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).
But you could replace this filesystem test with a string-based test, like:
[ "$files" = "/path/to/dir/*" ] && die "No files found."
or, using array length:
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."
Dropping paths by using Parameter expansion:
To strip the path from the filenames, instead of cd $path you could do:
targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."
Then:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
You could then run:
printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo
This would happen if the directory is empty, or misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)
You can verify this easily for yourself. In a directory with four files,
sh$ echo *
a file or two
sh$ echo [ot]*
or two
sh$ echo n*
n*
And in Bash,
bash$ echo n*
n*
bash$ shopt -s nullglob
bash$ echo n*
I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/
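Applied to the script from the question, a minimal sketch would be (run with bash rather than sh, since shopt is a bash builtin):
#!/bin/bash
shopt -s nullglob          # a glob that matches nothing expands to nothing
for filename in "$1"/*; do
    echo "File:"
    echo "$filename"
done
With nullglob set, an empty or misspelled directory simply produces no iterations instead of the literal path/to/loop/over/*.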

ShellScript not working on Pre-session-command (PowerCenter)

The goal is to check for the existence of a file and create a blank file if it doesn't exist, using a shell script in the pre-session command (Informatica PowerCenter), like the code below:
ParamDirTrabalho=/dir/powercenter/project1
ParamArq=file.csv
ParamQtdArq=`cat ${ParamDirTrabalho}/${ParamArq} | wc -l`
if [ $ParamQtdArq == 0 ];then touch ${ParamDirTrabalho}/${ParamArq};fi
This is the error:
Message: [Pre/Post Session Command] Process id 10683. Standard output and error:
sh: line 2:
: command not found
cat: /dir/powercenter/project1
/file.csv
: No such file or directory
sh: line 4:
: command not found
I can execute it successfully when pointing to an sh file with the code above, but I need to write the code inside the pre-session command.
Please enclose the parameters in double quotes.
ParamDirTrabalho="/dir/powercenter/project1"
ParamArq="file.csv"
Also, please make sure the folders have read/write/execute (RWX) permissions.
You cannot get the wc of a file if it doesn't exist at all; that's what the "No such file or directory" error means, if I understand it right. What you need to do is check whether the file exists, rather than counting its lines, and then touch it if it doesn't exist.
if [ ! -f filename ];then touch filename; fi
or
if [ -f filename ];then exit 0; else touch filename; fi
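Combining both suggestions, a hedged sketch of the pre-session command would be:
ParamDirTrabalho="/dir/powercenter/project1"
ParamArq="file.csv"
if [ ! -f "${ParamDirTrabalho}/${ParamArq}" ]; then touch "${ParamDirTrabalho}/${ParamArq}"; fi
This avoids running cat and wc on a file that may not exist yet.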

Treating space as newline character in bash

I have written a bash script just to display the names of all the files in a given directory, but when I run it, it breaks file names that contain spaces.
if [ $# -eq 0 ]
then
    echo "give a source directory in the command line argument in order to rename the jpg file"
    exit 1
fi
if [ ! -d "$1" ]; then
    exit 2
fi
if [ -d "$1" ]
then
    for i in $(ls "$1")
    do
        echo "$i"
    done
fi
I am getting the following output when I run the bash script:
21151991jatinkhurana_image
(co
py).jpg
24041991jatinkhurana_im
age.jpg
35041991jatinkhurana_image
.jpg
The thing that I have tried so far is resetting the IFS variable, like IFS=$(echo -en "\t\n\0"), but I found no change.
If anyone knows, please help me.
Do not loop through the result of ls. Parsing ls only makes things worse (good read: Why you shouldn't parse the output of ls).
Instead, you can make use of the *, which expands to the existing contents of a given directory:
for file in /your/dir/*
do
    echo "this is my file: $file"
done
Using variables:
for file in $dir/*
do
    echo "this is my file: $file"
done
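Applied to the script from the question, a sketch that keeps the argument checks and prints only the file name might look like this (the ${i##*/} expansion that strips the directory part is my assumption, not part of the original answer):
#!/bin/bash
if [ $# -eq 0 ]; then
    echo "give a source directory in the command line argument in order to rename the jpg file"
    exit 1
fi
if [ ! -d "$1" ]; then
    exit 2
fi
for i in "$1"/*; do
    echo "${i##*/}"      # strip the leading path, leaving only the file name
done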

Passing a path as an argument to a shell script

I've written a bash script to open a file passed as an argument and write it into another file. But my script works properly only if the file is in the current directory. Now I need to open and write the file when it is not in the current directory as well.
If compile is the name of my script, then ./compile next/123/file.txt should open the file.txt in the passed path. How can I do it?
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
#echo $1
name=$1 # Copy the filename to name
find . -iname $name -maxdepth 1 -exec cp {} $name \;
new_file="tempwithfile.adb"
cp $name $new_file #copy the file to new_file
echo "compiling"
dir >filelist.txt
gcc writefile.c
run_file="run_file.txt"
echo $name > $run_file
./a.out
echo ""
echo "cleaning"
echo ""
make clean
make -f makefile
./semantizer -da <withfile.adb
Your code and your question are a bit messy and unclear.
It seems that you intended to find your file, given as a parameter to your script, but failed because of the -maxdepth option.
If you are given next/123/file.txt as an argument, your find gives you a warning:
find: warning: you have specified the -maxdepth option after a
non-option argument -iname, but options are not positional (-maxdepth
affects tests specified before it as well as those specified after
it). Please specify options before other arguments.
Also, -maxdepth limits how deep find will descend looking for your file before it quits; next/123/file.txt is 2 directories deep.
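If you do need the find, the fix for that warning is just to put -maxdepth before the tests, e.g. a sketch:
find . -maxdepth 1 -iname "$name" -exec cp {} "$name" \;
(Though, as noted below, it is not clear that the find is needed at all.)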
Also, you are copying the given file within find, and then copying it again with cp afterwards.
As said, your code is really messy and I don't know what you are trying to do. I will gladly help, if you could elaborate :).
There are some questions that are open:
Why do you have to find the file if you already know its path? Do you always have the whole path given as an argument, or only part of the path? Only the basename?
Do you simply want to copy a file to another location?
What does your writefile.c do? Does it write the content of your file to another? cp does that already.
I also recommend using variables with CAPITALIZED letters and checking the exit status of used commands like cp and find, to check if these failed.
Anyway, here is my script that might help you:
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
echo "FILE: $1"
[ $# -ne 1 ] && echo "Usage: $0 <file>" 1>&2 && exit 1
FILE="$1" # Copy the filename to name
FILE_NEW="tempwithfile.adb"
cp "$FILE" "$FILE_NEW" # Copy the file to new_file
[ $? -ne 0 ] && exit 2
echo
echo "----[ COMPILING ]----"
echo
dir &> filelist.txt # list directory contents and write to filelist.txt
gcc writefile.c # ???
FILE_RUN="run_file.txt"
echo "$FILE" > "$FILE_RUN"
./a.out
echo
echo "----[ CLEANING ]----"
echo
make clean
make -f makefile
./semantizer -da < withfile.adb

Issue in Bash script

I have a bash script which creates a directory if not already present and moves all the files to the newly created directory.
The bash script I have is not working, and the error I receive is:
./move.sh: line 5: =/data/student/stud_done_11-11-2013: No such file or directory
already present
mv: missing destination file operand after `a.xml'
Try `mv --help' for more information.
The bash script is:
# Back up
if [ $# = 1 ]
then
    $dir="/data/student/stud_done_$1"
    echo $dir
    if [ ! -d $dir ]; then
        mkdir $dir
    else
        echo "already present"
    fi
    cd /data/student/stud_ready
    mv * $dir
else
    echo "No files to move"
fi
I invoke the script as follows:
./move.sh "11-11-2013"
What is the error in my script?
Here (on line 5)...
$dir="/data/student/stud_done_$1"
You meant...
dir="/data/student/stud_done_$1"
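With that one-character fix (and, as an extra precaution that is not part of the original answer, quotes around $dir), a sketch of the working script would be:
# Back up
if [ $# = 1 ]
then
    dir="/data/student/stud_done_$1"
    echo "$dir"
    if [ ! -d "$dir" ]; then
        mkdir "$dir"
    else
        echo "already present"
    fi
    cd /data/student/stud_ready
    mv * "$dir"
else
    echo "No files to move"
fi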
