Copying files from one folder to another using a shell script - shell

I have a script to copy files from one location to another, where I am passing the first location as a parameter to the script:
#!/bin/bash
locatn=$1
echo $locatn
cp -r /locatn/ /ws/priyapan-rcd/workspace/automation/
but when I run this code it throws this error:
cp: cannot stat `locatn': No such file or directory
What could be the issue?

Formatting looks a bit weird, but as @Patrick Trentin said, you simply forgot a $, so your script uses the literal path /locatn/ instead of the location you passed as a parameter.
#!/bin/bash
locatn=$1
echo "$locatn"
cp -r "${locatn}/" /ws/priyapan-rcd/workspace/automation/
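For robustness, a minimal sketch (keeping the same destination path as the question) that also checks the argument before copying:
#!/bin/bash
# Hypothetical hardened variant: validate the source directory first.
src="$1"
if [ ! -d "$src" ]; then
    echo "Source directory '$src' not found" >&2
    exit 1
fi
cp -r "$src"/ /ws/priyapan-rcd/workspace/automation/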

Related

Failed to copy list of files to another folder

I have a text file called "list.txt" that contains the paths of the files that need to be copied to a new folder (dir_newfolder). I wrote the code like below:
for file in $(cat list.txt); do cp ${file} dir_newfolder; done
I got a list of errors: cp: "file_name": No such file or directory. The file names are the lines pulled out of "list.txt", but when I copy each file name from the error messages and use cp to copy it to the new folder, there is no error.
I am using the macOS Terminal.
Thanks in advance.
Copy a file or folder locally
In the Terminal app on your Mac, use the cp command to make a copy of a file.
For example, to copy a folder named Expenses in your Documents folder to another volume named Data:
% cp -R ~/Documents/Expenses /Volumes/Data/Expenses
The -R flag causes cp to copy the folder and its contents. Note that the folder name does not end with a slash, which would change how cp copies the folder.
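As a hypothetical illustration of that trailing-slash behavior (BSD/macOS cp):
cp -R ~/Documents/Expenses /Volumes/Data     # copies the Expenses folder itself into /Volumes/Data
cp -R ~/Documents/Expenses/ /Volumes/Data    # copies only the contents of Expenses into /Volumes/Data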
In your case, make sure you are providing the correct path to list.txt and the correct path for the destination folder. Note also that the file variable is accessed in double quotes. Try this code; it works for me:
for file in $(cat ~/Documents/list.txt); do cp "$file" ~/dir_newfolder; done
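If any paths in list.txt contain spaces, the for loop over $(cat ...) will split them into separate words. A more robust sketch (assuming one path per line in list.txt) reads the file line by line instead:
while IFS= read -r file; do
    cp "$file" ~/dir_newfolder
done < ~/Documents/list.txt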

Unix Shell scripting for copying file from one folder to another

Hi, this is my shell script to copy files from one directory to another directory with a timestamp, but my script shows "too many arguments". I want to copy files from one directory to another. What is the error in my code?
Date_Val="$(date +%Y%m%d%H%M%S)";
cd /etl_mbl/SrcFiles/
if [ -f /etl_mbl/SrcFiles/SrcFiles_TEMP*.csv ]
then
cp /etl_mbl/SrcFiles/SrcFiles_TEMP/*.csv /etl_mbl/SrcFiles/Archive/*_$Date_Val.csv
fi
The reason you got the "too many arguments" error is that the wildcard in the "if" statement expands to multiple files. Please also note that you cannot use wildcards in the destination of a "cp". You probably want something like this:
#!/bin/bash
Date_Val="$(date +%Y%m%d%H%M%S)";
for file in ./src/*.csv; do
    filename=${file##*/}      # strip the directory part
    basename=${filename%.*}   # strip the .csv extension
    cp "$file" "./archive/${basename}_${Date_Val}.csv"
done
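As a worked example of those parameter expansions (hypothetical file name and timestamp):
file=./src/SrcFiles_TEMP1.csv
filename=${file##*/}      # SrcFiles_TEMP1.csv  -- removes everything up to the last /
basename=${filename%.*}   # SrcFiles_TEMP1      -- removes the shortest trailing .* suffix
# the copy then lands in ./archive/SrcFiles_TEMP1_20240101120000.csv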

Bash script to execute only on new files in directory

I'm currently running a Bash script called log2csv against a specified .log file. Sitting in the desired directory, I can type in the terminal:
log2csv Red1_1.log
This will create Red1_1.csv
This is my current bash script:
#!/bin/bash
for path
do
base=$(basename "$path")
noext="${base/.log}"
/Users/joshuacarter/bin/read_scalepack.pl "$path" > "${noext}.csv"
done
This script is actually running a perl script on the specified log and putting the results in a CSV output.
I can alternatively run in terminal:
log2csv *.log
This will run the script against all .log files in the current directory and create .csv files for every one.
What I would like the script to do is only run on .log files that haven't had .csv files created for them. After doing some research I think I could possibly use inotifywait to achieve this, but I'm unsure how to make that work in my script. I have also read that this may be an issue if you overwrite a file. Any help or ideas would be most appreciated!
What I would like the script to do is only run on .log files that haven't had .csv files created for them.
Simply skip those .log files whose corresponding .csv files already exist:
for path
do
    base=$(basename "$path")
    noext="${base/.log}"
    [ -e "${noext}.csv" ] && continue # <--------------- skip if the .csv already exists
    /Users/joshuacarter/bin/read_scalepack.pl "$path" > "${noext}.csv"
done
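If a regenerated .log should also be reprocessed (the overwrite concern mentioned in the question), a variant sketch can compare timestamps with the -nt ("newer than") test:
for path
do
    base=$(basename "$path")
    noext="${base/.log}"
    # skip only when the .csv exists and the .log is not newer than it
    if [ -e "${noext}.csv" ] && ! [ "$path" -nt "${noext}.csv" ]; then
        continue
    fi
    /Users/joshuacarter/bin/read_scalepack.pl "$path" > "${noext}.csv"
done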

unknown error in shell script

I have cobbled together a shell script to submit multiple jobs on a cluster. It appears to run without giving me an error message, but the output files are missing and the error log files are also empty. What the script is supposed to do is 1.) make a bunch of new directories, 2.) copy four files to each (mainparams, extraparams, infile, and structurejobsubmissionfile), and 3.) submit each one to the cluster to run structure, while changing one parameter in the mainparams file every tenth directory (that's the 's/changethis/'$k'/g' line).
Test running it on the front end gives no errors, the structure program is up to date on the cluster, and the cluster administrators don't see anything wrong. Thanks!
#!/bin/bash
reps=10
numK=10
for k in $(seq $numK);
do
    for i in $(seq $reps);
    do
        #make folder name (ex. k4rep7)
        tmpstr="k${k}rep${i}"
        #echo "Making folder and filename $tmpstr"
        #make the new folder
        mkdir $tmpstr
        #go to that folder
        cd ./$tmpstr
        #copy in the input files
        cp ../str_in/* ./
        #modify the recently copied input file here so source file remains the same
        cp ./mainparams ./temp.txt
        #change maxpops to current value of k and the directory for the files to the current directory
        sed -e 's/changethis/'$k'/g' -e "s:pathforrunningstructurehere:$PWD:g" ./temp.txt > ./mainparams
        #get rid of temporary file
        rm ./temp.txt
        #inside $i so run STRUCTURE here
        qsub -q fnrgenetics -l nodes=1:ppn=1,walltime=20:00:00 structurejobsubmissionfile
        #go back to parent directory
        cd ../
    done
done
I can't see anything obviously wrong, but I think the place that you'll find the answer lies in better logging and better error checking. Some of the things that you're not checking that you should:
Is $tmpstr created correctly? (will fail on disk full or if permissions are not set correctly)
does str_in/ exist, and is it a directory?
does it contain files?
does it contain mainparams?
is qsub in $PATH?
does the call to qsub return an error?
You can roll an error-logging function of your own, or use a package like log4bash.
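A minimal sketch of those checks applied to one iteration of the inner loop (the log function here is a hand-rolled stand-in, not log4bash; k and i are example values that would normally come from the two loops):
#!/bin/bash
set -u   # treat use of unset variables as an error

log() { echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >&2; }

k=1; i=1                      # example values only
tmpstr="k${k}rep${i}"

mkdir "$tmpstr"               || { log "mkdir $tmpstr failed"; exit 1; }
cd "./$tmpstr"                || { log "cd $tmpstr failed"; exit 1; }
cp ../str_in/* ./             || { log "copying str_in/ into $tmpstr failed"; exit 1; }
[ -f ./mainparams ]           || { log "mainparams missing in $tmpstr"; exit 1; }
command -v qsub >/dev/null    || { log "qsub not found in PATH"; exit 1; }
qsub -q fnrgenetics -l nodes=1:ppn=1,walltime=20:00:00 structurejobsubmissionfile \
                              || log "qsub exited with a non-zero status in $tmpstr"
cd ..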

shell "if" statement

I am new to Unix and am practicing a simple script to unzip a load of files within a specified directory. I need the program to move the zipped file into another folder when it is done unzipping it (I called this folder oldzippedfiles). For simplicity, I have removed the part of the code that unzips the file, and I currently have the program working on a specific file rather than on the *tar.7z file extension. For some reason, the mv statement is not working; Unix says the following when I try to run the script. Could someone give me a hand with this? Again, I know this is the long way of doing things, but I want practice writing a script. Please be nice, as I am very new to Unix :(
unzip5: line 14: [ASDE0002.tar.7z]: command not found
#!~/bin/bash
# My program to try to unzip several files with ending of tar.7z
# I have inserted the ability to enter the directory where you want this to be done
echo "What file location is required for unzipping?"
read dirloc
cd $dirloc
mkdir oldzippedfiles
for directory in $dirloc
do
if
[ASDE0002.tar.7z]
then
mv -f ASDE0002.tar.7z $dirloc/oldzippedfiles
fi
done
echo "unzipping of file is complete"
exit 0
[ is the name of a (sometimes built-in) command which accepts arguments. As such you need to put a space after it as you would when invoking any other program. Also, you need a test. For example, to determine if the file exists and is a file, you need to use -f:
if [ -f ASDE0002.tar.7z ]
then
    mv -f ASDE0002.tar.7z "$dirloc/oldzippedfiles"
fi
There are other possible tests as well; see man test or the "Conditional Expressions" section of the bash manual.
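Generalizing to the stated goal of handling every *.tar.7z file in the chosen directory, a sketch (reusing $dirloc and the oldzippedfiles folder from the question) might look like:
for zipfile in "$dirloc"/*.tar.7z
do
    if [ -f "$zipfile" ]
    then
        mv -f "$zipfile" "$dirloc/oldzippedfiles"
    fi
done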
