How to access a target directory using bash scripting - bash

I am relatively new to shell scripting. I am writing a script to compress all the files in the current directory and in a target directory. I have found success compressing the files of the current directory, but I'm unable to write a script for compressing files in a target directory. Can anyone guide me?
I want to do something like this:
% myCompress -t /home/users/bigFoot/ pdf ppt jpg

Next time, try to format your code when posting (it will make it easier to answer):
#!/bin/bash
if [[ $# == 0 ]]; then
    echo "This shell script compress files with a specific extensions"
    echo "Call Syntax: compress <extension_list>"
    exit
fi
for ext in $*; do
    for file in ls *.$ext; do
        gzip -k $file
    done
done
Mistakes made
1) $* expands to all of the arguments given to the command, so -t and the path would also be picked up as $ext values (extensions).
2) ls *.$ext is read in the loop as two plain strings, "ls" and "*.$ext"; it should be written as $(ls *.$ext) to actually execute the ls command.
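Before the full rewrite, a minimal sketch of the original script with just those two issues fixed (iterating over "$@" and letting the shell expand the glob itself instead of parsing ls) could look like this; it still only handles the current directory:
#!/bin/bash
if [[ $# == 0 ]]; then
    echo "This shell script compresses files with specific extensions"
    echo "Call Syntax: compress <extension_list>"
    exit
fi
for ext in "$@"; do                       # every argument is an extension
    for file in *."$ext"; do              # the shell expands the glob
        [ -f "$file" ] && gzip -k "$file" # skip the literal pattern when nothing matches
    done
done
Handling a target directory on top of that is what the script below adds.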
My script for your request
#!/bin/bash
script_name=`basename "$0"`
if [[ $# == 0 ]]; then
    echo "This shell script compresses files with specific extensions"
    echo "Call Syntax: $script_name <directories_list> <extension_list>"
    exit
fi
# check if $1 is a directory
path=". "
file_type=""
for check_type in $* ; do
    if [[ -d $check_type ]]; then
        path=$path$check_type" "
    else
        file_type=$file_type"*."$check_type" "
    fi
done
echo paths to gzip $path
echo file types to check "$file_type"
for x in $path; do
    cd $x
    for file in $(ls $file_type); do
        gzip $file
    done
    cd -
done
Explanation
1) basename "$0" - gets the script's name - it is more generic for usage, in case you rename the script
2) path=". " - variable holding a string of all directories to be compressed; your request is to run it also on the current directory ". "
file_type="" - variable holding a string of all extensions to be compressed in the $path directories
3) run a loop over all input ARGS and concatenate directory names onto the $path string and the other arguments as file types onto $file_type
4) for each of the directories passed to the script:
i. cd $x - enter the directory
ii. gzip - compress all files with the given extensions
iii. cd - - go back to the base directory
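For example, assuming the script is saved as myCompress and made executable, the call from the question would look like this (no -t flag is needed, since directories are detected with the -d test); the two echo lines show what was collected:
$ chmod +x myCompress
$ ./myCompress /home/users/bigFoot/ pdf ppt jpg
paths to gzip . /home/users/bigFoot/
file types to check *.pdf *.ppt *.jpg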
Check gzip
I'm not familiar with your version of the gzip command, so check that it supports the -k flag.
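If your gzip turns out not to have -k (keep input files), a portable alternative that leaves the original in place is to write the compressed copy explicitly, for example:
gzip -c "$file" > "$file.gz"    # compress to stdout and redirect, keeping "$file"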

Related

For files in directory Bash [duplicate]

I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:
#!/bin/bash
for filename in "$1"/*; do
    echo "File:"
    echo $filename
done
And I am running the above using:
sh test.sh path/to/loop/over
However, the above doesn't output the files at the directory path/to/loop/over, it instead outputs:
File:
path/to/loop/over/*
I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:
File:
foo.txt
File:
bar.txt
Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding a /* after the $1; however, this doesn't seem to help (neither do these suggestions).
Iterate over content of directory
Compatible answer (not only bash)
As this question is tagged shell, there is a POSIX-compatible way:
#!/bin/sh
for file in "$1"/* ;do
[ -f "$file" ] && echo "Process '$file'."
done
This will be enough (it works with filenames containing spaces):
$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.
This works well in any POSIX shell. Tested with bash, ksh, dash, zsh and busybox sh. A variant that changes into the directory first:
#!/bin/sh
cd "$1" || exit 1
for file in * ;do
[ -f "$file" ] && echo "Process '$file'."
done
This version won't print the path:
$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
Some bash ways
Introduction
I don't like to use shopt when it's not needed... (it changes standard
bash behaviour and makes scripts less readable).
There is an elegant way of doing this in standard bash, without requiring shopt.
Of course, the previous answer works fine under bash, but there are some
interesting ways of making your script more powerful, flexible, pretty, detailed...
Sample
#!/bin/bash
die() { echo >&2 "$0 ERROR: $@"; exit 1;}          # Emergency exit function
[ "$1" ] || die "Argument missing."                 # Exit unless argument submitted
[ -d "$1" ] || die "Arg '$1' is not a directory."   # Exit if argument is not a dir
cd "$1" || die "Can't access '$1'."                 # Exit unless dir is accessible
files=(*)                                           # All file names in array $files
[ -f "$files" ] || die "No files found."            # Exit if no files found
for file in "${files[@]}"; do                       # For each file:
    echo Process "$file"                            # Process the file
done
Explanation: considering globbing vs real files
When doing:
files=(/path/to/dir/*)
variable $files becomes an array containing all files contained under /path/to/dir/:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
But if nothing matches the glob pattern, the star won't be replaced and the array becomes:
declare -p files
declare -a files=([0]="/path/to/dir/*")
From there, looking at $files is like looking at ${files[0]}, i.e. the first field of the array. So
[ -f "$files" ] || die "No files found."
will execute the die function unless the first field of the files array is a regular file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).
But you could replace this filesystem test with a string-based test, like:
[ "$files" = "/path/to/dir/*" ] && die "No files found."
or, using array length:
((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."
Dropping paths by using Parameter expansion:
To strip the path from the filenames, instead of cd $path you could do:
targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."
Then:
declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")
You could then print them with:
printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo
This would happen if the directory is empty, or misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)
You can verify this easily for yourself. In a directory with four files,
sh$ echo *
a file or two
sh$ echo [ot]*
or two
sh$ echo n*
n*
And in Bash,
bash$ echo n*
n*
bash$ shopt -s nullglob
bash$ echo n*
I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/
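Putting these points together, a minimal sketch of test.sh that prints bare filenames and stays silent for an empty or misspelled directory (it relies on a bash option, so run it with bash test.sh path/to/loop/over rather than sh test.sh ...):
#!/bin/bash
shopt -s nullglob            # unmatched globs expand to nothing instead of themselves
for filename in "$1"/*; do
    echo "File:"
    basename "$filename"     # print the name without the leading path
done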

How to pass a file with the locations to switch to as an argument to a shell script

I have a shell script which takes a location input from a text file. I have a loop in the script to read each line (location), change directory to that location and list the files.
In the below, the location directory "switch" changes because this location may not exist on all the servers. I have to pass this directory as an argument to the file and pass this location to the script.
locations.txt has below content.
/usr/test/home/process_file/switch/process.txt
shell script:
for i in `cat locations`;
do
    echo $line
    cd $i
    if [ -d "$i" ]
    then
        cd $i
        pwd
        ls -ltr
    else
        exit
    fi
done
Storing a filename in a file seems a little inefficient.
Don't read lines with for
cd will return non-zero if the directory does not exist.
while IFS= read -r dir; do
    if cd "$dir" 2>/dev/null; then
        pwd
        ls -ltr
    else
        echo "no such directory: $dir"
    fi
done < /usr/test/home/process_file/switch/process.txt

How to identify files which are not in list using bash?

Unfortunately my knowledge of bash is not so good and I have a very non-standard task.
I have a file containing a list of files.
Example: /tmp/my/file1.txt /tmp/my/file2.txt
How can I write a script which checks that the files from the folder /tmp/my exist, and prints two types of messages after the script is done?
1 - The files exist; show the files:
/tmp/my/file1.txt
/tmp/my/file2.txt
2 - The folder /tmp/my includes files and folders which are not in your list. The files and folders:
/tmp/my/test
/tmp/my/1.txt
You speak of files and folders, which seems unclear.
Anyway, I wanted to try it with arrays, so here we go:
unset valid_paths; declare -a valid_paths
unset invalid_paths; declare -a invalid_paths
while read -r line
do
    if [ -e "$line" ]
    then
        valid_paths=("${valid_paths[@]}" "$line")
    else
        invalid_paths=("${invalid_paths[@]}" "$line")
    fi
done < files.txt
echo "VALID PATHS:"; echo "${valid_paths[@]}"
echo "INVALID PATHS:"; echo "${invalid_paths[@]}"
You can check for the files' existence (assuming a list of files, one filename per line) and print the existing ones with a prefix using this:
# Part 1 - check list contents for files
while read thefile; do
    if [[ -n "$thefile" ]] && [[ -f "/tmp/my/$thefile" ]]; then
        echo "Y: $thefile"
    else
        echo "N: $thefile"
    fi
done < filelist.txt | sort

# Part 2 - check existing files against list
for filepath in /tmp/my/* ; do
    filename="$(basename "$filepath")"
    grep "$filename" filelist.txt -q || echo "U: $filename"
done
The files that exist are prefixed here with Y:, all others with N:.
In the second section, files in the /tmp/my directory that are not in the file list are labelled with U: (unaccounted for/unexpected).
You can swap the -f test, which checks that a path exists and is a regular file, for -d (exists and is a directory) or -e (exists).
See
man test
for more options.
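With the names from the question (file1.txt and file2.txt in the list; test and 1.txt on disk but not listed), and assuming filelist.txt holds bare filenames one per line, the combined output would be roughly:
Y: file1.txt
Y: file2.txt
U: test
Note that 1.txt is not reported: grep treats the name as a substring pattern, and 1.txt matches inside the file1.txt line. Using grep -qxF "$filename" filelist.txt (whole-line, fixed-string match) would also flag it as U: 1.txt.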

Bash script with a loop not executing utility that has paramters passed in?

Anyone able to help me out? I have a shell script I am working on, but in the loop below the command after echo "First file is $firstbd" is not being executed: the $PROBIN/proutil call. Not sure why this is...
Basically I have a list of files in a directory (*.list); I grab them, read the first line and pass it as a parameter to the utility, then move the .list and the contents of the .list to another directory (the .list holds a list of files with full paths).
for i in $(ls $STAGEDIR/*.list); do
    echo "Working with $i"
    # grab first .bd file
    firstbd=`head -1 $i`
    echo "First file is $firstbd"
    $PROBIN/proutil $DBENV/$DBNAME -C load $firstbd tenant $TENANT -dumplist $STAGEDIR/$i.list >> $WRKDIR/$i.load.log
    #move the list and its content to finished folder
    binlist=`cat $i`
    for movethis in $binlist; do
        echo "Moving file $movethis to $STAGEDIR/finished"
        mv $movethis $STAGEDIR/finished/
    done
    echo "Finished working with list $i"
    echo "Moving it to $STAGEDIR/finished"
    mv $i $STAGEDIR/finished/
done
The error I was getting is:
./tableload.sh: line 107: /usr4/dlc/bin/proutil /usr4/testdbs/xxxx2 -C load /usr4/dumpdir/xxxxx.bd tenant xxxxx -dumplist /usr4/dumpdir/PUB.xxxxx.list >> /usr4/dumpdir/PUB.xxxx.list.load.log: A file or directory in the path name does not exist... however if I run "/usr4/dlc/bin/proutil"
The fix was to remove ">> $WRKDIR/$i.load.log"; the binary utility wouldn't run when trying to output results to a file. Strange...
A couple of really bad practices here:
parse the output of ls
not quoting variables
iterating the lines of a file with cat and for
As shelter comments, you don't check that you've created all the directories in the path for your log file.
A rewrite:
for i in "$STAGEDIR"/*.list; do
echo "Working with $i"
# grab first .bd file
firstbd=$(head -1 "$i")
echo "First file is $firstbd"
# ensure the output directory exists
logfile="$WRKDIR/$i.load.log"
mkdir -p "$(dirname "$logfile")"
"$PROBIN"/proutil "$DBENV/$DBNAME" -C load "$firstbd" tenant "$TENANT" -dumplist "$STAGEDIR/$i.list" >> "$logfile"
# move the list and its content to finished folder
while IFS= read -r movethis; do
echo "Moving file $movethis to $STAGEDIR/finished"
mv "$movethis" "$STAGEDIR"/finished/
done < "$i"
echo "Finished working with list $i"
echo "Moving it to $STAGEDIR/finished"
mv "$i" "$STAGEDIR"/finished/
done

Passing a path as an argument to a shell script

I've written a bash script to open a file passed as an argument and write it into another file. But my script will work properly only if the file is in the current directory. Now I need to open and write a file that is not in the current directory as well.
If compile is the name of my script, then ./compile next/123/file.txt should open the file.txt in the passed path. How can I do it?
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
#echo $1
name=$1 # Copy the filename to name
find . -iname $name -maxdepth 1 -exec cp {} $name \;
new_file="tempwithfile.adb"
cp $name $new_file #copy the file to new_file
echo "compiling"
dir >filelist.txt
gcc writefile.c
run_file="run_file.txt"
echo $name > $run_file
./a.out
echo ""
echo "cleaning"
echo ""
make clean
make -f makefile
./semantizer -da <withfile.adb
Your code and your question are a bit messy and unclear.
It seems that you intended to find your file, given as a parameter to your script, but failed due to the maxdepth.
If you are given next/123/file.txt as an argument, your find gives you a warning:
find: warning: you have specified the -maxdepth option after a non-option argument -iname, but options are not positional (-maxdepth affects tests specified before it as well as those specified after it). Please specify options before other arguments.
Also, -maxdepth limits how deep find will descend looking for your file before it gives up; next/123/file.txt is nested 2 directories deep.
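For reference, a sketch with the options placed before the tests, the depth raised so a file two directories down is reachable, and the match done on the basename (since -iname compares against the file's name, not its path):
find . -maxdepth 3 -iname "$(basename "$name")"
Treat this only as an illustration of the ordering; it locates the file but does not replace the rest of the script.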
Also, you are trying to copy the given file within find, but you then copy it again with cp afterwards.
As said, your code is really messy and I don't know what you are trying to do. I will gladly help if you elaborate :).
There are some questions that are open:
Why do you have to find the file if you already know its path? Do you always have the whole path given as an argument? Or only part of the path? Only the basename?
Do you simply want to copy a file to another location?
What does your writefile.c do? Does it write the content of your file to another? cp does that already.
I also recommend using variables with CAPITALIZED letters and checking the exit status of commands like cp and find to see whether they failed.
Anyway, here is my script that might help you:
#!/bin/sh
#FIRST SCRIPT
clear
echo "-----STARTING COMPILATION-----"
echo "FILE: $1"
[ $# -ne 1 ] && echo "Usage: $0 <file>" 1>&2 && exit 1
FILE="$1" # Copy the filename to name
FILE_NEW="tempwithfile.adb"
cp "$FILE" "$FILE_NEW" # Copy the file to new_file
[ $? -ne 0 ] && exit 2
echo
echo "----[ COMPILING ]----"
echo
dir &> filelist.txt # list directory contents and write to filelist.txt
gcc writefile.c # ???
FILE_RUN="run_file.txt"
echo "$FILE" > "$FILE_RUN"
./a.out
echo
echo "----[ CLEANING ]----"
echo
make clean
make -f makefile
./semantizer -da < withfile.adb
