How to export multiple files to database using shell script? - shell

Hi everyone, I have a number of .csv files that need to be copied over for the database, but each time I have to type "cp <filename>.csv /tmp" for every single file. Could you suggest a shell script command so that I can copy all the files at once? Thank you.

#!/bin/sh
# Text file containing the list of files to copy, one path per line
file=/pathtoyour/textfile.txt
while IFS= read -r line
do
    cp "$line" /tmp/
done < "$file"
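If the goal is simply to copy every .csv file from one directory in a single command, no list file is needed. A minimal sketch, assuming the files all sit in one source directory (the /path/to/csvdir name below is only a placeholder):

#!/bin/sh
# Copy every .csv file from the source directory to /tmp in one command.
# /path/to/csvdir is a placeholder; replace it with your actual directory.
cp /path/to/csvdir/*.csv /tmp/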

Related

How do you write a Rsync multi folder shell script like this?

Source path /var/www/html/20170101/*.jpg
Destination path /Backup/html/20170101/
There are many source paths like this, such as 20170101 and 20171231 (each one a directory named after a date), and each directory contains a lot of images. How do I write an rsync shell script for this? What I am using now is, for example:
rsync -avzh --progress /var/www/html/20170101/ /Backup/html/20170101/
I want to write a script that turns the date part of the path into a variable, so that when one day's directory is finished it moves on to the next day's directory and continues the rsync.
Is this what you want:
Put all the directory names in a text file in the below format
cat dirnamefile
20170101
20171231
Now write a shell script to read the file and substitute the directory each time:
#!/bin/bash
i=0
# Read each directory name from the list file into an array
while read -r line
do
    arr[$i]=$line
    i=$((i + 1))
done < dirnamefile
# Run rsync once for every directory in the list
for var in "${arr[@]}"
do
    rsync -avzh --progress /var/www/html/"$var"/ /Backup/html/"$var"/
done
The above script reads in the list of directory names and then loops through them, running rsync for each one.
Is this what you are looking for?
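The array is not strictly necessary, though. A minimal sketch of a shorter variant, assuming the same dirnamefile list, runs rsync directly inside the read loop:

#!/bin/bash
# Run rsync for each date directory listed in dirnamefile (one name per line)
while read -r dir
do
    rsync -avzh --progress /var/www/html/"$dir"/ /Backup/html/"$dir"/
done < dirnamefile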

How to write a bash script using Mac to take files with the same date and put them in a folder with that date

I have hundreds of datalogger files in a directory and I want to write a bash script that takes files with the same date in the filename (an example file name is "2016-06-15T170000_SMARTFLUX.data", where 2016-06-15 is the date) and stores them in a folder named after that date. I am using a Mac Terminal window, which I believe works like Linux (I apologize for my ignorance of computer terminology).
So far I have:
#Type of file (extension) to process:
FILES=*.data
#get date string from file name to use for newly created file
DATE=`ls $FILES|head -n 1|cut -c 1-10`
Any help would be greatly appreciated. So far I have only modified a bash script that combines these types of files into a single text file, and I have never created folders or moved files from a script.
Assuming your script is in the same dir as the data files:
#!/bin/bash
for filename in *.data; do
    # The first 10 characters of the file name are the date (YYYY-MM-DD)
    target_dir=${filename:0:10}
    if [[ ! -d $target_dir ]]; then
        mkdir "$target_dir"
    fi
    mv "$filename" "$target_dir"
done
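As a side note, a sketch of a slightly shorter variant uses mkdir -p, which silently succeeds if the directory already exists, so the explicit existence check can be dropped:

#!/bin/bash
for filename in *.data; do
    target_dir=${filename:0:10}   # date prefix, e.g. 2016-06-15
    mkdir -p "$target_dir"        # no-op if the directory already exists
    mv "$filename" "$target_dir"
done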

Bash - Moving files from subdirectories

I am relatively new to bash scripting.
I need to create a script that will loop through a series of directories, go into subdirectories with a certain name, and then move their file contents into a common folder for all of the files.
My code so far is this:
#!/bin/bash
#used to gather usable pdb files
mkdir -p usable_pdbFiles
#loop through directories in "pdb" folder
for pdbDirectory in */
do
#go into usable_* directory
for innerDirectory in usable_*/
do
if [ -d "$innerDirectory" ] ; then
for file in *.ent
do
mv $file ../../usable_pdbFiles
done < $file
fi
done < $innerDirectory
done
exit 0
Currently I get
usable_Gather.sh: line 7: $innerDirectory: ambiguous redirect
when I try and run the script.
Any help would be appreciated!
The redirections < $innerDirectory and < $file are invalid, and that is what causes the error. You don't need a loop for this at all; you can instead rely on the shell's filename expansion and use mv directly:
mkdir -p usable_pdbFiles
mv */usable_*/*.ent usable_pdbFiles
Bear in mind that this solution, and the loop-based one that you are working on, will overwrite files with the same name in the destination directory.
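If overwriting is a concern, a minimal sketch of one way to guard against it is mv's no-clobber option (available in both GNU and BSD/macOS mv), which leaves files that already exist in the destination untouched:

mkdir -p usable_pdbFiles
# -n: do not overwrite a file that already exists in usable_pdbFiles
mv -n */usable_*/*.ent usable_pdbFiles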

Simple Bash Script File Copy

I am having trouble with a simple grading script I am writing. I have a directory called HW5 containing a folder for each student in the class. From my current directory, which contains the HW5 folder, I would like to copy all files starting with the word mondial, to each of the students' folders. My script runs but does not copy any of the files over. Any suggestions?
#!/bin/bash
for file in ./HW5; do
if [ -d $file ]; then
cp ./mondial.* ./$file;
fi
done
Thanks,
The first loop was executing only once, with file set to ./HW5. Add the star so the pattern actually expands to the files and directories inside it.
#!/bin/bash
for file in ./HW5/*; do
if [ -d "$file" ]; then
cp ./mondial.* ./"$file"
fi
done
As suggested by Mark Reed, this can be simplified:
for file in ./HW5/*/; do
cp ./mondial.* ./"$file"
done

read the contents of a directory using shell script

I'm trying to get the contents of a directory using shell script.
My script is:
for entry in `ls`; do
echo $entry
done
However, my current directory contains many files with whitespaces in their names. In that case, this script fails.
What is the correct way to loop over the contents of a directory in shell scripting?
PS: I use bash.
for entry in *
do
echo "$entry"
done
Don't parse directory contents using ls in a for loop; you will run into whitespace problems. Use shell expansion (globbing) instead:
for file in *
do
if [ -f "$file" ];then
echo "$file"
fi
done
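If you also need to descend into subdirectories, or want to be safe against newlines in file names, a sketch of a more robust bash variant (an assumption beyond the original answers) pipes find -print0 into a null-delimited read:

#!/bin/bash
# List regular files recursively; -print0 and read -d '' cope with any characters in names
find . -type f -print0 | while IFS= read -r -d '' file
do
    echo "$file"
done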
