Trouble with cp command for directories - shell

I'm trying to use the cp command to copy directories:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
then the copying should result in
tgt/a/4
tgt/b/1
tgt/c/3
tgt/d/2
tgt/e/5
I tried using 'basename' as well as 'cp dir1/* dir2'. With basename, do I make a loop to find every directory, or is there a recursive builtin? I also tried 'cp -r' for a recursive copy, but nothing so far has worked.

I used a file named tmp that holds the SOURCE list of files; you can readjust:
cat tmp
result:
src/1/b
src/2/d
src/3/c
src/4/a
src/5/e
From here, I echo out the command; if the output looks correct, you can remove the echo and it will execute:
#!/bin/bash
# For each line src/<num>/<name>, build the target path tgt/<name>/<num>
while IFS= read -r z
do
    echo cp "$z" "tgt/$(echo "$z" | cut -d/ -f 3)/$(echo "$z" | cut -d/ -f 2)"
done < tmp
result:
cp src/1/b tgt/b/1
cp src/2/d tgt/d/2
cp src/3/c tgt/c/3
cp src/4/a tgt/a/4
cp src/5/e tgt/e/5
You can also add parameters to cp as you see fit. But first test with the echo in place, then execute :)
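If you would rather avoid the cut calls, here is a sketch of the same idea using only parameter expansion (plus a mkdir -p, since the tgt/<name> directories may not exist yet); it keeps the echo so it stays a dry run:
#!/bin/bash
# src/<num>/<name>  ->  tgt/<name>/<num>
while IFS= read -r z
do
    rest="${z#*/}"       # drop the leading "src/"     -> "1/b"
    num="${rest%%/*}"    # first remaining component   -> "1"
    name="${rest#*/}"    # second remaining component  -> "b"
    echo mkdir -p "tgt/$name"
    echo cp "$z" "tgt/$name/$num"
done < tmp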


Send files to folders using bash script

I want to copy the functionality of a Windows program called files2folder, which basically lets you right-click a bunch of files and send them to their own individual folders.
So
1.mkv 2.png 3.doc
gets put into directories called
1 2 3
I have got it to work using this script, but it sometimes throws errors while still accomplishing what I want:
#!/bin/bash
ls > list.txt
sed -i '/list.txt/d' ./list.txt
sed 's/.$//;s/.$//;s/.$//;s/.$//' ./list.txt > list2.txt
for i in $(cat list2.txt); do
mkdir $i
mv $i.* ./$i
done
rm *.txt
Is there a better way of doing this? Thanks.
EDIT: My script failed with real-world filenames, as they contained more than one ".", so I had to use a different sed command to make it work. This is an example filename I'm working with:
Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE
I guess you are getting errors on . and .. so change your call to ls to:
ls -A > list.txt
-A List all entries except for . and ... Always set for the super-user.
You don't have to create a file to achieve the same result, just assign the output of your ls command to a variable. Doing something like this:
files=`ls -A`
for file in $files; do
echo $file
done
You can also check if the resource is a file or directory like this:
files=`ls -A`
for res in $files; do
if [[ -d $res ]];
then
echo "$res is a folder"
fi
done
This script will do what you ask for:
files2folder:
#!/usr/bin/env sh
for file; do
dir="${file%.*}"
{ ! [ -f "$file" ] || [ "$file" = "$dir" ]; } && continue
echo mkdir -p -- "$dir"
echo mv -n -- "$file" "$dir/"
done
Example directory/files structure:
ls -1 dir/*.jar
dir/paper-279.jar
dir/paper.jar
Running the script above:
chmod +x ./files2folder
./files2folder dir/*.jar
Output:
mkdir -p -- dir/paper-279
mv -n -- dir/paper-279.jar dir/paper-279/
mkdir -p -- dir/paper
mv -n -- dir/paper.jar dir/paper/
To make it actually create the directories and move the files, remove all the echo commands.
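As for the multi-dot names from the edit: ${file%.*} removes only the text after the last dot, so such names should come out intact, assuming the actual file carries a normal extension on the end (the .mkv below is a hypothetical addition):
$ f="Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE.mkv"   # hypothetical .mkv extension added for illustration
$ echo "${f%.*}"
Captain.America.The.First.Avenger.2011.INTERNAL.2160p.UHD.BluRay.X265-IAMABLE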

How to escape characters like colon : and comma , in for loop?

I have a file named "blocked.txt" which contains the names of 4000 files, like below:
1502146676.VdeI4b5c5cbM804631.vps47619.domain.local:2,
1502146676.VdeI4b5c5cdM808282.vps47619.domain.local:2,
1502146677.VdeI4b5c5d3M192892.vps47619.domain.local:2,
1502146677.VdeI4b5c5d7M213070.vps47619.domain.local:2,
1502146677.VdeI4b5c5e5M796312.vps47619.domain.local:2,
1502146678.VdeI4b5c5efM412992.vps47619.domain.local:2,
1502146678.VdeI4b5c5f1M613275.vps47619.domain.local:2,
1502146679.VdeI4b5c5f8M11301.vps47619.domain.local:2,
1502146682.VdeI4b5c66dM115848.vps47619.domain.local:2,S
1502146682.VdeI4b5c676M608733.vps47619.domain.local:2,
1502146685.VdeI4b5c69aM1652.vps47619.domain.local:2,
....
....
I ran the command below in the shell to copy the files to the /tmp/backup directory:
for i in `cat blocked.txt`; do cp -f "${i}" /tmp/backup/ ; done
but this gives me the prompt "do you want to overwrite? y/n" even though I have used -f with cp.
Any idea what's wrong with the command?
You likely have an alias,
alias cp="cp -i"
or function
cp () {
command cp -i "$@"
}
that interferes.
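You can check which one it is with type; for an alias, Bash prints something like the output below (a function would be shown with its body instead):
$ type cp
cp is aliased to `cp -i'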
To solve this, simply use command cp instead of just cp:
while read -r name; do
command cp "$name" /tmp/backup
done <blocked.txt
Or specify the full path to cp:
while read -r name; do
/bin/cp "$name" /tmp/backup
done <blocked.txt

Shell script that watches HTML templates and compiles with Handlebars

I'm trying to create a script that watches my HTML template files in a directory and, when it notices changes, compiles the template. I can't get it working; this is what I've got:
#!/bin/sh
while FILENAME=$(inotifywait --format %w -r templates/*.html)
do
COMPILED_FILE=$(echo "$FILENAME" | sed s/templates/templates\\/compiled/g | sed s/.html/.js/g)
handlebars $FILENAME -f $COMPILED_FILE -a
done
I use inotifywait to watch the current dir, although I want it to also check subdirectories. The compiled files then need to be saved in a subdirectory called templates/compiled, optionally keeping the subdirectory structure.
So templates/foo.html needs to be compiled and stored as templates/compiled/foo.js
So templates/other/foo.html needs to be compiled and stored as templates/compiled/other/foo.js
As you can see, I tried to watch the directory and replace the templates/ prefix with templates/compiled.
Any help is welcome!
A few observations, then a solution:
Passing the argument -r templates/*.html only matches .html files in templates/ — not in templates/other/. Instead we're going to do -r templates which notifies us of changes to any file anywhere under templates.
If you don't use inotifywait in --monitor mode, you will miss any files that are changed in the brief period that handlebars is running (which could happen if you save all your open files at once). Better to do something like this:
#!/bin/bash
watched_dir="templates"
while read -r dirname events filename; do
printf 'File modified: %s\n' "$dirname$filename"
done < <(inotifywait --monitor --event CLOSE_WRITE --recursive "$watched_dir")
Then, as for transforming the paths, you could do something like:
$ dirname=templates/other/
$ echo "${dirname#*/}"
other/
$ echo "$watched_dir/compiled/${dirname#*/}"
templates/compiled/other/
$ filename=foo.html
$ echo "${filename%.html}"
foo
$ echo "${filename%.html}.js"
foo.js
$ echo "$watched_dir/compiled/${dirname#*/}${filename%.html}.js"
templates/compiled/other/foo.js
Notice that we can leverage Bash's builtin parameter expansion — no need for sed.
Putting it all together, we get:
#!/bin/bash
watched_dir="templates"
while read -r dirname events filename; do
[[ "${filename##*.}" != 'html' ]] && continue
output_path="$watched_dir/compiled/${dirname#*/}${filename%.html}.js"
handlebars "$dirname$filename" -f "$output_path" -a
done < <(inotifywait --monitor --event CLOSE_WRITE --recursive "$watched_dir")
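One caveat: if the destination directory (e.g. templates/compiled/other/) does not exist yet, writing the compiled file may fail. Assuming handlebars does not create missing directories on its own, you could create it just before the handlebars call:
output_path="$watched_dir/compiled/${dirname#*/}${filename%.html}.js"
mkdir -p "${output_path%/*}"    # make sure templates/compiled/<subdir>/ exists
handlebars "$dirname$filename" -f "$output_path" -a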

grep spacing error

Hi guys, I have a problem with grep. I don't know if there is another way to search in a shell script.
I'm trying to back up a folder AhmetsFiles which is stored on my flash disk, but at the same time I have to group the files by their extensions and save them into an [extensionName] folder.
An example: /media/FlashDisk/AhmetsFiles/lecture.pdf must be stored in /home/$(whoami)/Desktop/backups/pdf.
The problem is I can't copy a file whose name contains spaces (lecture 2.pptx).
After this introduction, here is my code:
filename="/media/FlashDisk/extensions"
count=0
exec 3<&0
exec 0< $filename
mkdir "/home/$(whoami)/Desktop/backups"
while read extension
do
cd "/home/$(whoami)/Desktop/backups"
rm -rf "$extension"
mkdir "$extension"
cd "/media/FlashDisk/AhmetsFiles"
files=( `ls | grep -i "$extension"` )
fCount=( `ls | grep -c -i "$extension"` )
for (( i=0 ; $i<$fCount ; i++ ))
do
cp -f "/media/FlashDisk/AhmetsFiles/${files[$i]}" "/home/$(whoami)/Desktop/backups/$extension"
done
let count++
done
exec 0<&3
exit 0
Your looping is way more complicated than it needs to be, no need for either ls or grep or the files and fCount variables:
for file in *.$extension
do
cp -f "/media/FlashDisk/AhmetsFiles/$file" "$HOME/Desktop/backups/$extension"
done
This works correctly with spaces.
I'm assuming that you actually wanted to interpret $extension as a file extension, not some random string in the middle of the filename like your original code does.
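If it helps, here is a sketch of the whole script rewritten around that idea, using the same extensions file and target layout as yours. I left out the rm -rf, so existing backups are kept rather than wiped; nullglob makes a non-matching pattern expand to nothing, and nocaseglob keeps the match case-insensitive like your grep -i:
#!/bin/bash
backups="$HOME/Desktop/backups"
mkdir -p "$backups"
shopt -s nullglob nocaseglob   # unmatched patterns vanish; extension matching is case-insensitive
while read -r extension
do
    mkdir -p "$backups/$extension"
    for file in /media/FlashDisk/AhmetsFiles/*."$extension"
    do
        cp -f "$file" "$backups/$extension/"
    done
done < /media/FlashDisk/extensions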
Why don't you
grep -i "$extension" | while IFS=: read x ; do
cp ..
done
instead?
Also, I believe you may prefer something like grep -i "\.$extension$" instead (escape the dot and anchor the match to the end of the line).
On the other hand, the simplest way is probably
cp -f /media/FlashDisk/AhmetsFiles/*.$extension "$HOME/Desktop/backups/$extension/"

Recursively copying a file into multiple directories, if a directory does not exist in Bash

So I need to copy the file /home/servers/template/craftbukkit.jar into every folder inside of /home/servers, e.g. /home/servers/server1, /home/servers/server2, etc.
But I only want to do it if /home/servers/whateverserveritiscurrentlyon/mods does not exist. This is what I came up with and was wondering if it will work:
echo " Script to copy a file to all server directories, only if mods does not exist in that directory"
for i in /home/servers/*/; do
if [ ! -d "$i/mods" ]; then
cp -f /home/servers/template/craftbukkit.jar "$i"
fi
done
echo " completed script ..."
Looks like it should work. To non-destructively test, change the cp -f ... line to say echo cp -f ... and review the output.
It could also be somewhat shortened, but it wouldn't affect efficiency much:
for i in /home/servers/*/
do
[[ -d "${i}/mods" ]] || cp -f /home/servers/template/craftbukkit.jar "${i}/."
done
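Combining that with the echo suggestion above, a dry run of the shortened loop would be:
for i in /home/servers/*/
do
    [[ -d "${i}/mods" ]] || echo cp -f /home/servers/template/craftbukkit.jar "${i}/."
done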
