Bash Shell Script if statement issue and cp/date command issues - bash

I am new to UNIX and am having some issues with my bash script, and I would like some help. My script is designed to copy files from one directory to a target directory while outputting which files are being copied, the date on which each file is copied, and an error message if the user does not supply enough parameters.
So far, I have three parameters/arguments: 1) sourcePath, the directory that has the files; 2) backupPath, the directory the user wants to copy files into from sourcePath (if backupPath is not a directory, it will be created automatically); and 3) filePrefix, a letter or pattern entered by the user so that any file in sourcePath whose name starts with that letter or matches that pattern will be copied. A couple of issues so far: the files being copied are not displayed, so the user doesn't know which files were copied until they cd into the backupPath directory and run ls; the current date is not showing; and the if statement that should echo a message when not enough parameters have been passed echoes that message no matter what, even when all parameters have been given. Code is below:
read -r -p "sourcePath: " sourcePath
read -r -p "backupPath: " backupPath
read -r -p "filePrefix: " filePrefix
if [ $# -lt 3 ]; then
echo "Create backup files in a target directory given
the target directory name and a file name prefix.
Only files found in the specified source directory
whose name begins with the supplied prefix will
be copied. In addition, all copied echo files will
have a datestamp suffix added.
syntax: ./dobackup.bash sourcePath backupPath filePrefix"
fi
mkdir -p /home/public/"$backupPath"
cp -v /export/home/public/"$sourcePath/$filePrefix"* `(date +%y%m%d)`
So, as said above, "cp -v" does not seem to be working properly: it lists every file in the backupPath and sourcePath directories when it should only display the files being copied.
My if statement prints its message whenever the script is executed, when it should only do so if not all of the parameters/inputs have been typed in by the user. I understand this is a lot of help to ask for, but I am still new to UNIX and scripting. I know how to do all of this if I were to just run the separate commands myself, but I am having difficulties here. All help is appreciated.
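A rough sketch of one way the script could be restructured (this assumes the three values are meant to arrive as positional arguments rather than via read prompts, and that the datestamp is meant to be appended to each copied file's name; the paths are illustrative only):
#!/bin/bash
# Sketch only: bail out early when fewer than three arguments are supplied.
if [ "$#" -lt 3 ]; then
    echo "syntax: ./dobackup.bash sourcePath backupPath filePrefix"
    exit 1
fi
sourcePath=$1
backupPath=$2
filePrefix=$3
datestamp=$(date +%y%m%d)
mkdir -p "$backupPath"
# Copy each matching file, echoing its name, and add the datestamp suffix.
for f in "$sourcePath/$filePrefix"*; do
    [ -e "$f" ] || continue      # nothing matched the prefix
    cp -v "$f" "$backupPath/$(basename "$f").$datestamp"
done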

Related

problem with copying directory files to another directory

So I want to copy some files from one directory to another.
Essentially, I want to capture the directory path in a variable, say, "pathname", and use
"cp -r $pathname ." to copy file1 and file2 to a new folder which I made using mkdir and have cd'ed into (hence the "." as the second command-line argument).
source_to_copy_from:
home/folder1/folder2/file1
home/folder1/folder2/file2
destination_to_copy_to:
home/newfolder1/newfolder2/file1
home/newfolder1/newfolder2/file2
I did:
pathname=$"$(pwd)"
//code to make the newfolder1 here
cp -r $"$pathname" .
But nothing that was supposed to be copied appears in the new folder.
Also, I am using bash on a Mac and am quite a beginner with it.
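A guess, since the mkdir/cd step isn't shown: if pwd is captured after cd'ing into the new folder, $pathname no longer points at the source; and even when it does, cp -r "$pathname" . copies the directory itself into the current directory rather than its contents, so the files end up one level deeper than expected. A minimal sketch of the intended order, reusing the variable name from the question (the destination path is illustrative):
pathname="$(pwd)"                    # run this while inside home/folder1/folder2
mkdir -p ~/newfolder1/newfolder2     # illustrative destination
cd ~/newfolder1/newfolder2 || exit 1
cp -r "$pathname"/* .                # copy the contents (hidden files are not matched by *)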

Copying files from one folder to another using a shell script

I have a script to copy the files from one location to another, where I am passing the first location as a parameter to the script:
#!/bin/bash
locatn=$1
echo $locatn
cp -r /locatn/ /ws/priyapan-rcd/workspace/automation/
but when I run this code it throws the following error:
cp: cannot stat `locatn': No such file or directory
What could be the issue?
Formatting looks a bit weird, but as @Patrick Trentin said, you simply forgot a $, so your script always copies from the literal path /locatn/ and ignores the given parameter.
#!/bin/bash
locatn=$1
echo "$locatn"
cp -r "/${locatn}/" /ws/priyapan-rcd/workspace/automation/
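A hypothetical invocation (the script name copyfiles.sh and the source path are placeholders): because the script prepends a /, the argument should be the source path without its leading slash:
chmod +x copyfiles.sh
./copyfiles.sh ws/priyapan-rcd/input_data    # expands to: cp -r /ws/priyapan-rcd/input_data/ /ws/priyapan-rcd/workspace/automation/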

Create a bash script that runs and updates a log whenever a file is deleted

I am new to bash scripting and I have to create a script that will run on all computers within my group at work (so it's not just checking one computer). We have a spreadsheet that keeps certain file information, and I am working to automate the updating of that spreadsheet. I already have an existing python script that gathers the information needed and writes to the spreadsheet.
What I need is a bash script (cron job, maybe?) that is activated anytime a user deletes a file that matches a certain extension within the specified file path. The script should hold on to the file name before it is completely deleted. I don't need any other information besides the name.
Does anyone have any suggestions for where I should begin with this? I've searched a bit but not found anything useful yet.
It would be something like:
for folders and files in path:
    if file ends in .txt and is being deleted:
        save file name
To save the name of every .txt file deleted in some directory path or any of its subdirectories, run:
inotifywait -m -e delete --format "%w%f" -r "path" 2>stderr.log | grep '\.txt$' >>logfile
Explanation:
-m tells inotifywait to keep running. The default is to exit after the first event
-e delete tells inotifywait to only report on file delete events.
--format "%w%f" tells inotifywait to print only the name of the deleted file
path is the target directory to watch.
-r tells inotifywait to monitor subdirectories of path recursively.
2>stderr.log tells the shell to save stderr output to a file named stderr.log. As long as things are working properly, you may ignore this file.
>>logfile tells the shell to append all remaining output to the file logfile. If you leave this part off, output will be directed to stdout and you can watch in real time as files are deleted.
grep '\.txt$' limits the output to files with .txt extensions.
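If the deleted names need to be handed to the existing Python script rather than just logged, one possible pattern is to read the inotifywait output line by line (the watch directory and the updater script name here are placeholders):
#!/bin/bash
# Placeholder path and script name; adjust to the real ones.
watch_dir="/path/to/watched/dir"
inotifywait -m -e delete --format "%w%f" -r "$watch_dir" 2>stderr.log |
while IFS= read -r deleted_file; do
    case "$deleted_file" in
        *.txt) python3 update_spreadsheet.py "$deleted_file" ;;   # hypothetical spreadsheet updater
    esac
done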
Mac OSX
Similar programs are available for OSX. See "Is there a command like “watch” or “inotifywait” on the Mac?".

unknown error in shell script

I have cobbled together a shell script to submit multiple jobs on a cluster, which it appears to do without giving me an error message, but the output files are missing and the error log files are also empty. What the script is supposed to do is 1) make a bunch of new directories, 2) copy four files into each (mainparams, extraparams, infile, and structurejobsubmissionfile), and 3) submit each one to the cluster so it runs structure, changing one parameter in the mainparams file every tenth directory (that's the 's/changethis/'$k'/g' line).
Test running it on the front end gives no errors, the structure program is up to date on the cluster, and the cluster administrators don't see anything wrong. Thanks!
#!/bin/bash
reps=10
numK=10
for k in $(seq $numK);
do
for i in $(seq $reps);
do
#make folder name (ex. k4rep7)
tmpstr="k${k}rep${i}"
#echo "Making folder and filename $tmpstr"
#make the new folder
mkdir $tmpstr
#go to that folder
cd ./$tmpstr
#copy in the input files
cp ../str_in/* ./
#modify the recently copied input file here so source file remains the same
cp ./mainparams ./temp.txt
#change maxpops to current value of k and the directory for the files to the current directory
sed -e 's/changethis/'$k'/g' -e "s:pathforrunningstructurehere:$PWD:g" ./temp.txt > ./mainparams
#get rid of temporary file
rm ./temp.txt
#inside $i so run STRUCTURE here
qsub -q fnrgenetics -l nodes=1:ppn=1,walltime=20:00:00 structurejobsubmissionfile
#go back to parent directory
cd ../
done
done
I can't see anything obviously wrong, but I think the place that you'll find the answer lies in better logging and better error checking. Some of the things that you're not checking that you should:
Is $tmpstr created correctly? (will fail on disk full or if permissions are not set correctly)
does str_in/ exist, and is it a directory?
does it contain files?
does it contain mainparams?
is qsub in $PATH?
does the call to qsub return an error?
You can roll an error-logging function of your own, or use a package like log4bash.
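A minimal sketch of what that checking could look like inside the inner loop (the die helper is only an example, not part of the original script):
#!/bin/bash
die() { echo "ERROR: $*" >&2; exit 1; }

command -v qsub >/dev/null || die "qsub not found in PATH"
[ -d str_in ]            || die "str_in/ does not exist or is not a directory"
[ -f str_in/mainparams ] || die "str_in/mainparams is missing"

mkdir "$tmpstr"   || die "could not create $tmpstr"
cd "./$tmpstr"    || die "could not cd into $tmpstr"
cp ../str_in/* ./ || die "copy of input files failed"
qsub -q fnrgenetics -l nodes=1:ppn=1,walltime=20:00:00 structurejobsubmissionfile \
    || die "qsub failed in $tmpstr"
cd ..             || die "could not return to parent directory"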

shell "if" statement

I am new to unix and am practicing a simple script to unzip a load of files within a specified directory. I need the program to move the zipped file into another folder when it is done unzipping it (I called this the oldzipped folder). For simplicity, I have removed the part of the code that unzips the file, and I currently have the program working on a specific file rather than the *tar.7z file extension. For some reason, the mv statement is not working; Unix says the following when I try to run the script. Could someone give me a hand with this? Again, I know this is the long way of doing things, but I want practice writing a script. Please be nice, as I am very new to Unix :(
unzip5: line 14: [ASDE0002.tar.7z]: command not found
#!~/bin/bash
# My program to try to unzip several files with ending of tar.7z
# I have inserted the ability to enter the directory where you want this to be done
echo "What file location is required for unzipping?"
read dirloc
cd $dirloc
mkdir oldzippedfiles
for directory in $dirloc
do
if
[ASDE0002.tar.7z]
then
mv -f ASDE0002.tar.7z $dirloc/oldzippedfiles
fi
done
echo "unzipping of file is complete"
exit 0
[ is the name of a (sometimes built-in) command which accepts arguments. As such you need to put a space after it as you would when invoking any other program. Also, you need a test. For example, to determine if the file exists and is a file, you need to use -f:
if [ -f ASDE0002.tar.7z ]
then
mv -f ASDE0002.tar.7z $dirloc/oldzippedfiles
fi
Here are some other possible tests.
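A few common ones, as an illustration (not an exhaustive list):
[ -d "$dirloc" ]     # true if $dirloc exists and is a directory
[ -e file.txt ]      # true if file.txt exists (any type)
[ -s file.txt ]      # true if file.txt exists and is not empty
[ -z "$var" ]        # true if $var is empty
[ "$a" = "$b" ]      # true if the two strings are equal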
