I am trying to recreate the folder structure from a source in a target location and perform a command on each file found in the process, using Bash. Based on some feedback and some searches I am trying to get this solution to work properly. Right now it is breaking because the Windows folders have directories with spaces that it refuses to find.
I was able to get this to work after installing some additional features for my cygwin.
source='/cygdrive/z/austin1/QA/Platform QA/8.0.0/Test Cases'
target='/cygdrive/c/FullBashScripts'
# let ** be recursive
shopt -s globstar
for file in "$source"/**/*.restomatic; do
cd "${file%/test.restomatic}"
locationNew="$target${file#$source}"
mkdir -p "$(dirname "$target${file#$source}")"
sed -e 's/\\/\//g' test.restomatic | awk '{if ($1 ~ /^(LOAD|IMPORT)/) system("cat " $2); else print;}' | sed -e 's/\\/\//g' |awk '{if ($1 ~ /^(LOAD|IMPORT)/) system("cat " $2); else print;}' > $locationNew
done
If your bash version is 4 or above, this should work:
source="testing/web testing/"
target="c:/convertedFiles/"
# let ** be recursive
shopt -s globstar
for file in "$source"/**/*.test; do
newfile= "$target/${file#$source}"
mkdir -p "$(dirname "$newfile")"
conversion.command "$file" > "$newfile"
done
${file#$source} lops $source off the beginning of $file.
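For example (a quick illustration using the same variable names; the unit/login.test path is made up for the demo):
source="testing/web testing"
file="testing/web testing/unit/login.test"
echo "${file#$source}"   # prints /unit/login.test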
If you can guarantee that no files have newlines in their name, you can use find to get the names:
source="testing/web testing/"
target="c:/convertedFiles/"
find "$source" -name \*.test | while read file; do
newfile= "$target/${file#$source}"
mkdir -p "$(dirname "$newfile")"
conversion.command "$file" > "$newfile"
done
Your best bet would be to use find to get the list of files. You can do it as follows:
IFS=$'\n' # set field separator to newlines only
cd testing # change to the source directory
find . -type d > /tmp/test.dirs # make a list of local directories
for i in `cat /tmp/test.dirs`; do # for each directory
mkdir -p "c:/convertedFiles/$i" # create it in the new location
done
find . -iname '*.test' > /tmp/test.files # record local file paths as needed
for i in `cat /tmp/test.files`; do # for each test file
process "$i" > "c:/convertedFiles/$i" # process it and store in new dir
done
Note that this is not the most optimal way, but it is the easiest to understand and follow. This should work with spaces in filenames. You may have to tweak it further to get it to work under Windows.
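If you also need to be safe against unusual characters (including newlines) in the paths, a sketch along the same lines using find -print0 and a null-delimited read (bash) might look like this; process stands in for whatever command you run on each file, as above:
cd testing                                       # change to the source directory
find . -type f -iname '*.test' -print0 |
while IFS= read -r -d '' i; do
mkdir -p "c:/convertedFiles/$(dirname "$i")"     # create the directory in the new location
process "$i" > "c:/convertedFiles/$i"            # process the file and store it in the new dir
done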
I would look into a tool called sshfs, or Secure Shell File System. It lets you mount a portion of a remote file system to somewhere local to you.
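For example (the host name and paths here are placeholders, not taken from the question), mounting could look something like this:
mkdir -p ~/mnt/remote-target
sshfs user@remotehost:/path/to/target ~/mnt/remote-target
# ... work on the files as if they were local ...
fusermount -u ~/mnt/remote-target   # unmount when done (umount on macOS/BSD)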
Once you have the remote fs mounted locally, you can run the following shell script:
for f in *.*;
do
echo "do something to $f file..";
done
EDIT: I initially did not realize that target was always local anyway.
I want to copy the functionality of a Windows program called files2folder, which basically lets you right-click a bunch of files and send them to their own individual folders.
So
1.mkv 2.png 3.doc
gets put into directories called
1 2 3
I have got it to work using this script, but it sometimes throws errors while still accomplishing what I want:
#!/bin/bash
ls > list.txt
sed -i '/list.txt/d' ./list.txt
sed 's/.$//;s/.$//;s/.$//;s/.$//' ./list.txt > list2.txt
for i in $(cat list2.txt); do
mkdir $i
mv $i.* ./$i
done
rm *.txt
Is there a better way of doing this? Thanks
EDIT: My script failed with real-world filenames, as they contained more than one ".", so I had to use a different sed command to make it work. This is an example filename I'm working with:
Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE
I guess you are getting errors on . and .. so change your call to ls to:
ls -A > list.txt
-A List all entries except for . and ... Always set for the super-user.
You don't have to create a file to achieve the same result; just assign the output of your ls command to a variable, like this:
files=`ls -A`
for file in $files; do
echo $file
done
You can also check if the resource is a file or directory like this:
files=`ls -A`
for res in $files; do
if [[ -d $res ]];
then
echo "$res is a folder"
fi
done
This script will do what you ask for:
files2folder:
#!/usr/bin/env sh
for file; do
dir="${file%.*}"
{ ! [ -f "$file" ] || [ "$file" = "$dir" ]; } && continue
echo mkdir -p -- "$dir"
echo mv -n -- "$file" "$dir/"
done
Example directory/file structure:
ls -1 dir/*.jar
dir/paper-279.jar
dir/paper.jar
Running the script above:
chmod +x ./files2folder
./files2folder dir/*.jar
Output:
mkdir -p -- dir/paper-279
mv -n -- dir/paper-279.jar dir/paper-279/
mkdir -p -- dir/paper
mv -n -- dir/paper.jar dir/paper/
To make it actually create the directories and move the files, remove all of the echos.
I am trying to search for files and separate the path and version into variables, because each will be needed later for creating a directory and for unzipping a .jar into the desired path.
file=$(find /home/user/Documents/test/ -path '*.jar')
version=$(echo "$file" | grep -P -o '[0-9].[0-9].[0-9].[0-9]')
path=$(echo "$file" | sed 's/\(.*\)[/].*/\1/')
newpath=$(echo "${path}/${version}")
echo "$newpath"
Result:
> /home/user/Documents/test/gb0500
> /home/user/Documents/test/gb0500 /home/user/Documents/test/gb0500
> /home/user/Documents/test /home/user/Documents/test/1.3.2.0
> 1.3.2.1
> 1.3.2.2
> 1.2.0.0
> 1.3.0.0
It's hilarious that it only works on one line.
What else I tried:
file=$(find /home/v990549/Dokumente/test/ -path *.jar)
version=$(grep -P -o '[0-9].[0-9].[0-9].[0-9]')
path=$(sed 's/\(.*\)[/].*/\1/')
while read $file
do
echo "$path$version"
done
I have no experience in scripting. That's what I figured out some days ago. I am just practicing and trying to make life easier.
find output:
/home/user/Documents/test/gb0500/gb0500-koetlin-log4j2-web-1.3.2.0-javadoc.jar
/home/user/Documents/test/gb0500/gb0500-koetlin-log4j2-web-1.3.2.1-javadoc.jar
/home/user/Documents/test/gb0500/gb0500-koetlin-log4j2-web-1.3.2.2-javadoc.jar
/home/user/Documents/test/gb0500-co-log4j2-web-1.2.0.0-javadoc.jar
/home/user/Documents/test/gb0500-commons-log4j2-web-1.3.0.0-javadoc.jar
As both variables version and path are newline-separated, how about:
file=$(find /home/user/Documents/test/ -path '*.jar')
version=$(echo "$file" | grep -P -o '[0-9].[0-9].[0-9].[0-9]')
path=$(echo "$file" | sed 's/\(.*\)[/].*/\1/')
paste -d "/" <(echo "$path") <(echo "$version")
Result:
/home/user/Documents/test/gb0500/1.3.2.0
/home/user/Documents/test/gb0500/1.3.2.1
/home/user/Documents/test/gb0500/1.3.2.2
/home/user/Documents/test/1.2.0.0
/home/user/Documents/test/1.3.0.0
BTW I do not recommend storing multiple filenames in a single newline-separated variable, for several reasons:
Filenames may contain a newline character.
It is not easy to manipulate the values of each line.
For instance, you could simply write the third line as path=${file%/*} if file contained just one name.
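If you do want to process the find output directly, here is a minimal sketch using the same example directory (it assumes GNU grep for -P and bash for read -d ''):
find /home/user/Documents/test/ -name '*.jar' -print0 |
while IFS= read -r -d '' file; do
path=${file%/*}                                                       # directory part of the path
version=$(grep -P -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' <<< "$file")    # version number
echo "$path/$version"
done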
Hope this helps.
I have a directory that has thousands of files in it with various extensions. I also have a drop location where users drop files to be migrated to this directory. I'm looking for a script that will scan the target directory for a duplicate file name; if one is found, rename the file in the drop folder, then move it to the target directory.
Example:
/target/file.doc
/drop/file.doc
Script will rename file.doc to file1.doc then move it to /target/.
It needs to maintain the file extension too.
for fil in /drop/*
do
fname=$(basename "$fil")
test -f "/target/$fname"
if [ "$?" = 0 ]
then
suff=$(awk -F\. '{ print "."$NF }' <<< "$fname")
bdot=$(basename -s "$suff" "$fname")
mv "/drop/$fname" "/drop/${bdot}1$suff"
cp "/drop/${bdot}1$suff" "/target/${bdot}1$suff"
fi
done
Take each file in the drop directory and check whether it already exists in /target using test -f. If it does, then move (rename) it and then copy it.
You have to take a bit more care than simply checking if a file exists before moving in order to provide a flexible solution that can handle files with or without extensions. You also may want to provide a way of forming duplicate filenames that preserves sort order, e.g. if file.txt already exists, you may want to use file_001.txt as the duplicate in target rather than file1.txt, since once you reach 10 you would no longer have a canonical sort by filename.
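For instance, zero-padding keeps the listing in order:
$ printf 'file_%03d.txt\n' 1 2 10
file_001.txt
file_002.txt
file_010.txt
These sort correctly with a plain ls, whereas file1.txt, file2.txt and file10.txt would not.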
Also, you never want to iterate with for i in $(ls dir); that is fraught with pitfalls. See Bash Pitfalls No. 1.
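As a quick illustration of the pitfall (my file.txt is a hypothetical name containing a space):
$ touch "drop/my file.txt"
$ for i in $(ls drop); do echo "$i"; done    # splits the name into "my" and "file.txt"
$ for i in drop/*; do echo "$i"; done        # keeps "drop/my file.txt" intact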
Putting those pieces together, and including detail in the comments below, you could do something similar to the following and have a reasonably flexible solution allowing you to specify either just filename.ext to move or /path/to/drop/filename.ext. You must specify the drop and target directories in the script to match your circumstances, e.g.
#!/bin/bash
tgt=target ## set target and drop directories as required
drp=drop
declare -i cnt=1 ## counter for filename_$cnt
test -z "$1" && { ## validate one argument given
printf "error: insufficient input\nusage: %s filename\n" "${0##*/}"
exit 1
}
test -w "$1" || test -w "$drp/$1" || { ## validate valid filename is writeable
printf "error: file not found or lack permission to move '%s'.\n" "$1"
exit 1
}
fn="${1##*/}" ## strip any path info from filename
if test "$1" != "${1%.*}" ; then
ext="${fn##*.}" ## get file extension
fnwoe="${fn%."$ext"}" ## get filename without extension
test "$fnwoe" = '' && ext= ## was a dotfile, reset ext
fi
vfn="$fn" ## set valid filename = filename
## form valid filename e.g. "${fn}_001.$ext" if duplicate found
while test -e "$tgt/$vfn"; do
if test -n "$ext" ## did we have have an extension?
then
printf -v vfn "%s_%03d.%s" "$fnwoe" "$((cnt++))" "$ext"
else
printf -v vfn "%s_%03d" "$fn" "$((cnt++))"
fi
done
mv "$drp/$fn" "$tgt/$vfn" ## move file under non-conflicting name
Example drop and target
$ ls -1 drop
file
file.txt
$ ls -1 target
file.txt
file_001.txt
file_002.txt
Example Use
$ bash mvdrop.sh file
$ bash mvdrop.sh drop/file.txt
Resulting drop and target
$ ls -1 drop
$ ls -1 target
file
file.txt
file_001.txt
file_002.txt
file_003.txt
This will test to see if it exists, preserve the extension (along with any structure before the extension such as in the case of FILE.tar.gz), and move it to the target directory.
#!/bin/bash
TARGET="\target\"
DROP="\drop\"
for F in `ls "$DROP"`; do
if [[ -f "$TARGET$F" ]]; then
EXT=`echo "$F" | awk -F "." '{print $NF}'`
PRE=`echo "$F" | awk -F "." '{$NF="";print $0}' | sed -e 's/ $//g;s/ /./g'`
mv "$DROP$F" "$DROP${PRE}1.$EXT"
F="${PRE}1.$EXT"
fi
mv "$DROP$F" "$TARGET"
done
Additionally you may want to do some restricting in the ls command, so that you aren't copying entire directories.
Display only regular files (no directories or symbolic links)
ls -p $DROP | grep -v /
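A hedged alternative that avoids parsing ls output altogether (note that -maxdepth is a GNU find extension):
find "$DROP" -maxdepth 1 -type f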
Hi guys, I have a problem with grep. I don't know if there is another search command for this in shell script.
I'm trying to back up a folder, AhmetsFiles, which is stored on my flash disk, but at the same time I have to group the files by their extensions and save them into an [extensionName] folder.
An example: /media/FlashDisk/AhmetsFiles/lecture.pdf must be stored in /home/$(whoami)/Desktop/backups/pdf
The problem is I can't copy a file whose name contains spaces (lecture 2.pptx).
After this introduction, here is my code.
filename="/media/FlashDisk/extensions"
count=0
exec 3<&0
exec 0< $filename
mkdir "/home/$(whoami)/Desktop/backups"
while read extension
do
cd "/home/$(whoami)/Desktop/backups"
rm -rf "$extension"
mkdir "$extension"
cd "/media/FlashDisk/AhmetsFiles"
files=( `ls | grep -i "$extension"` )
fCount=( `ls | grep -c -i "$extension"` )
for (( i=0 ; $i<$fCount ; i++ ))
do
cp -f "/media/FlashDisk/AhmetsFiles/${files[$i]}" "/home/$(whoami)/Desktop/backups/$extension"
done
let count++
done
exec 0<&3
exit 0
Your looping is way more complicated than it needs to be; there is no need for either ls or grep, or for the files and fCount variables:
for file in *.$extension
do
cp -f "/media/FlashDisk/AhmetsFiles/$file" "$HOME/Desktop/backups/$extension"
done
This works correctly with spaces.
I'm assuming that you actually wanted to interpret $extension as a file extension, not some random string in the middle of the filename like your original code does.
Why don't you
grep -i "$extension" | while IFS=: read x ; do
cp ..
done
instead?
Also, I believe you may prefer something like grep -i ".$extension$" instead (anchor it to the end of line).
On the other hand, the most optimal way is probably
cp -f /media/FlashDisk/AhmetsFiles/*.$extension "$HOME/Desktop/backups/$extension/"
Is it possible to find out the full path to the script that is currently executing in KornShell (ksh)?
I.e. if my script is in /opt/scripts/myscript.ksh, can I programmatically, inside that script, discover /opt/scripts/myscript.ksh?
Thanks,
You could use:
## __SCRIPTNAME - name of the script without the path
##
typeset -r __SCRIPTNAME="${0##*/}"
## __SCRIPTDIR - path of the script (as entered by the user!)
##
__SCRIPTDIR="${0%/*}"
## __REAL_SCRIPTDIR - path of the script (real path, maybe a link)
##
__REAL_SCRIPTDIR=$( cd -P -- "$(dirname -- "$(command -v -- "$0")")" && pwd -P )
In Korn shell, all of these $0 solutions fail if you are sourcing the script in question. The correct way to get what you want is to use $_.
$ cat bar
echo dollar under is $_
echo dollar zero is $0
$ ./bar
dollar under is ./bar
dollar zero is ./bar
$ . ./bar
dollar under is bar
dollar zero is -ksh
Notice the last line there? Use $_. At least in Korn. YMMV in bash, csh, et al.
Well it took me a while but this one is so simple it screams.
_SCRIPTDIR=$(cd "$(dirname "$0")"; echo "$PWD")
Since the cd operates in the subshell spawned by $(), it doesn't affect the current script.
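A quick demonstration of that point:
$ pwd
/home/user
$ d=$(cd /tmp; pwd)
$ echo "$d"
/tmp
$ pwd
/home/user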
How the script was called is stored in the variable $0. You can use readlink to get the absolute file name:
readlink -f "$0"
The variable $RPATH contains the relative path to the real file or the real path for a real file.
CURPATH=$( cd -P -- "$(dirname -- "$(command -v -- "$0")")" && pwd -P )
CURLOC=$CURPATH/`basename $0`
if [ `ls -dl $CURLOC |grep -c "^l" 2>/dev/null` -ne 0 ];then
ROFFSET=`ls -ld $CURLOC|cut -d ">" -f2 2>/dev/null`
RPATH=`ls -ld $CURLOC/$ROFFSET 2>/dev/null`
else
RPATH=$CURLOC
fi
echo $RPATH
This is what I did:
if [[ $0 != "/"* ]]; then
DIR=`pwd`/`dirname $0`
else
DIR=`dirname $0`
fi
readlink -f would be the best if it were portable, because it resolves every link found, for both directories and files.
On Mac OS X there is no readlink -f (except maybe via MacPorts), so you can only use readlink to get the destination of a specific symbolic link file.
The $(cd -P ... pwd -P) technique is nice but only works to resolve links for directories leading to the script; it doesn't work if the script itself is a symlink.
Also, one case that wasn't mentioned: when you launch a script by passing it as an argument to a shell (/bin/sh /path/to/myscript.sh), $0 is not usable.
I took a look at the mysql "binaries"; many of them are actually shell scripts, and now I understand why they ask for a --basedir option or need to be launched from a specific working directory: there is no good solution for locating the targeted script.
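For what it's worth, a rough sketch of a partial workaround could look like this (it only resolves symlinks when a working readlink -f is available, and otherwise just absolutizes $0):
if command -v readlink >/dev/null 2>&1 && readlink -f / >/dev/null 2>&1; then
SCRIPT_PATH=$(readlink -f "$0")
else
case $0 in
/*) SCRIPT_PATH=$0 ;;
*) SCRIPT_PATH=$PWD/$0 ;;
esac
fi
echo "$SCRIPT_PATH"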
This also works, although it won't give the "true" path if it's a link. It's simpler, but less exact.
SCRIPT_PATH="$(whence ${0})"
Try the which command.
which scriptname
will give you the fully qualified name of the script along with its absolute path.
I upgraded Edward Staudt's answer, to be able to deal with absolute-path symbolic links, and with chains of links too.
DZERO=$0
while true; do
echo "Trying to find real dir for script $DZERO"
CPATH=$( cd -P -- "$(dirname -- "$(command -v -- "$DZERO")")" && pwd -P )
CFILE=$CPATH/`basename $DZERO`
if [ `ls -dl $CFILE | grep -c "^l" 2>/dev/null` -eq 0 ];then
break
fi
LNKTO=`ls -ld $CFILE | cut -d ">" -f2 | tr -d " " 2>/dev/null`
DZERO=`cd $CPATH ; command -v $LNKTO`
done
Ugly, but works...
After running this, the path is in $CPATH and the file is in $CFILE.
Try using this:
dir=$(dirname "$0")
Using $_ provides the last command:
>source my_script
This works if I issue the command twice:
>source my_script
>source my_script
If I use a different sequence of commands:
>who
>source my_script
The $_ variable returns "who".