Unable to use dirname within subshell in find command - bash

I am trying to make a small script that can move all files from one directory to another, and I figured using find would be the best solution. However, I have run into a problem using a subshell for the 'dirname' value when creating the target directory paths. It does not work because the subshell is expanded before find runs, so dirname sees the literal {} and returns '.' (a single dot). As seen in my script below, the -exec mkdir -p $toDir/$(dirname {}) \; portion of the find command is what fails. I want to create all of the target directories needed to move the files, but I cannot use dirname in a subshell to get only the directory path.
Here is the script:
#!/bin/bash
# directory containing files to deploy relative to this script
fromDir=../deploy
# directory where the files should be moved to relative to this script
toDir=../main
if [ -d "$fromDir" ]; then
if [ -d "$toDir" ]; then
toDir=$(cd $toDir; pwd)
cd $fromDir
find * -type f -exec echo "Moving file [$(pwd)/{}] to [$toDir/{}]" \; -exec mkdir -p $toDir/$(dirname {}) \; -exec mv {} $toDir/{} \;
else
echo "A directory named '$toDir' does not exist relative to this script"
fi
else
echo "A directory named '$fromDir' does not exist relative to this script"
fi
I know that you can use -exec sh -c 'echo $(dirname {})' \;, but with this I would then not be able to use the $toDir variable.
Can anyone help me figure out a solution to this problem?

Since you appear to be re-creating all the files and directories, try the tar trick:
mkdir $toDir
cd $fromDir
tar -cf - . | ( cd $toDir ; tar -xvf - )
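For completeness, the original find approach can also be made to work by handing $toDir and each file path to an inner shell as positional parameters, so that dirname runs after find has substituted the file name. A minimal sketch of that idea (untested against the exact layout above):
cd "$fromDir"
find . -type f -exec sh -c '
    dest="$2/$(dirname "$1")"   # $1 = file path from find, $2 = target root
    mkdir -p "$dest" && mv "$1" "$dest/"
' sh {} "$toDir" \;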

Related

Copy all files with a certain extension from all subdirectories and preserving structure of subdirectories

How can I copy specific files from all directories and subdirectories to a new directory while preserving the original subdirectory structure?
This answer:
find . -name \*.xls -exec cp {} newDir \;
copies all xls files from all folders and subfolders into the same directory newDir, but the original file structure is lost; everything is copied on top of each other into one new folder. That is not what I want.
If an xls file is in /s1/s2/, then it should be copied to newDir/s1/s2.
You can try:
find . -type f -name '*.xls' -exec sh -c \
'd="newDir/${1%/*}"; mkdir -p "$d" && cp "$1" "$d"' sh {} \;
This runs the d="newDir/${1%/*}"; mkdir -p "$d" && cp "$1" "$d" shell script for each xls file, that is, it first creates the target directory and then copies the file into it.
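For reference, the ${1%/*} expansion strips everything from the last / onward, leaving just the directory part of the path. A quick illustration with a hypothetical file name:
$ f=./s1/s2/report.xls
$ echo "${f%/*}"
./s1/s2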
If you have a lot of files and run into performance issues, you can try to optimize a bit with:
find . -type f -name '*.xls' -exec sh -c \
'for f in "$@"; do d="newDir/${f%/*}"; mkdir -p "$d" && cp "$f" "$d"; done' sh {} +
This second version processes the files in batches and thus spawns fewer shells.
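If GNU cp is available, its --parents option re-creates the source path under the target directory, which can replace the inner mkdir entirely; a sketch of that alternative (assuming GNU coreutils):
mkdir -p newDir
find . -type f -name '*.xls' -exec cp --parents {} newDir \;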
This should do:
# Ensure that newDir exists and is empty. Omit this step if you
# don't want it.
[[ -d newDir ]] && rm -r newDir
mkdir newDir
# Copy the xls files.
rsync -a --include='**/*.xls' --include='*/' --exclude='*' . newDir
The trick here is the combination of include and exclude rules. By default, rsync copies everything below its source directory (. in your case). We change this by excluding everything while explicitly including directories and the xls files; the include rules must come before the catch-all exclude because rsync applies the first matching rule.
In your example, newDir is itself a subdirectory of your working directory and hence part of the directory tree searched for copying. I would rethink this decision.
NOTE: This would not only copy directories whose names end in .xls, but would also recreate the whole directory structure of your source tree (even where it contains no xls files), and populate it only with xls files.
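If the empty directories mentioned in the note are unwanted, rsync's -m (--prune-empty-dirs) option should drop directories that end up containing no transferred files; a sketch of the same command with pruning, assuming a reasonably recent rsync:
rsync -am --include='**/*.xls' --include='*/' --exclude='*' . newDir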
Thanks for the solutions.
Meanwhile I also found:
find . -name '*.xls' | cpio -pdm newDir

How to solve no such file or directory while using xargs

I am trying to copy file(s) to the same directory but with a prefix, and I am using xargs to do that. NOTE: I can't use cd as it will break the build in my pipeline.
This is what I have
root#gotoadmins-OU:~# ls
snap test
root#gotoadmins-OU:~# ls test/
me.war
root#gotoadmins-OU:~# ls test | xargs -I {} cp {} latest-{} test/
cp: cannot stat 'me.war': No such file or directory
cp: cannot stat 'latest-me.war': No such file or directory
If I understand the question correctly, you simply want to copy all of the files in a subdirectory to the same subdirectory, but with the prefix "latest-".
find test -type f -execdir bash -c 'cp "$1" latest-"$(basename "$1")"' "$(which bash)" "{}" \;
$(which bash) can be replaced with the word bash or anything, really. It's just a placeholder.
As @KamilCuk commented, a subshell might also be an option. If you put commands inside parentheses (e.g. (cd test; for FILE in *; do cp "$FILE" latest-"$FILE"; done)), those commands are run in a subshell, so the environment, including your working directory, is unaffected.
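The asker's original pipeline fails because cp runs from the parent directory while ls prints bare file names. If xargs is a requirement, one way to keep it without cd is to prefix the directory inside the replacement; a sketch (assuming plain file names without spaces or newlines, as with any ls | xargs pipeline):
ls test | xargs -I {} cp test/{} test/latest-{}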
can you just use the cp command to do this?
cp test/me.war test/latest-me.war

changing to current directory where file.sh is located using variable

I want a script that changes directory to the directory where file.sh is located, let's say var.
Then I want to copy files from another location, let's say var2, to the current dir, which would be var.
Then I want to do some unzipping and delete rows in the files, which would be in var.
I have tried the below, but my syntax is not correct. Can someone please advise?
#!/bin/bash
# Configure bash so the script will exit if a command fails.
set -e
#var is where the script is stored
var="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd";
#another dir I want to copy from
var2 = another/directory
#cd to the directory I want to copy my files to
cd "$var" + /PointB
#copy from var2 to the current location
#include the subdirectories
cp -r var2 .
# This will unzip all .zip files in this dir and all subdirectories under this one.
# -o is required to overwrite everything that is in there
find -iname '*.zip' -execdir unzip -o {} \;
#delete specific rows 1-6 and the last one from the csv file
find ./ -iname '*.csv' -exec sed -i '1,6d;$ d' '{}' ';'
a few mistakes here:
# no: var="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd";
var=$(cd "$(dirname "$0") && pwd)
The stuff in $() executes in a subshell, so the "pwd" must be performed in the same shell that you have "cd"-ed in.
# no: var2 = another/directory
var2=another/directory
The = cannot have whitespace around it.
# no: cd "$var" + /PointB
cd "$var"/PointB
Shell is not JavaScript; string concatenation does not have a separate operator.
# no: cp -r var2 .
cp -r "$var2" .
Need the $ to get the variable's value.
# no: find -iname '*.zip' -execdir unzip -o {} \;
find . -iname '*.zip' -execdir unzip -o {} \;
Specify the starting directory as the first argument to find.
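Putting those corrections together, the script might look like this (a sketch that keeps the asker's placeholder paths such as another/directory and PointB):
#!/bin/bash
# Exit if any command fails.
set -e

# var is the directory where this script is stored
var=$(cd "$(dirname "$0")" && pwd)

# another directory to copy from (placeholder path)
var2=another/directory

# cd to the directory the files should be copied to
cd "$var/PointB"

# copy from var2 to the current location, including subdirectories
cp -r "$var2" .

# unzip all .zip files in this dir and all subdirectories (-o overwrites)
find . -iname '*.zip' -execdir unzip -o {} \;

# delete rows 1-6 and the last row from each csv file
find . -iname '*.csv' -exec sed -i '1,6d;$ d' {} \;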

Recursively change directories and execute a command in each

I'm trying to write a bash script to recursively go through a directory and execute a command at each landing. Each folder from the base has the prefix "lab" and I only want to recurse through those folders. An example without recursively going through the folders would be:
#!/bin/bash
cd $HOME/gpgn302/lab00
scons -c
cd $HOME/gpgn302/lab00/lena
scons -c
cd $HOME/gpgn302/lab01
scons -c
cd $HOME/gpgn302/lab01/cloudpeak
scons -c
cd $HOME/gpgn302/lab01/bear
scons -c
And while this works, if I want to add more directories in, say, lab01, I would have to edit the script. Thank you in advance.
There are a few close suggestions here, but here's one that actually works:
find "$HOME"/gpgn302/lab* -type d -exec bash -c 'cd "$1"; scons -c' -- {} \;
Use find for this kind of task:
find "$HOME/gpgn302" -name 'lab*' -type d -execdir scons -c . \;
It's easy to use find to locate and run commands.
Here's an example which changes into the correct directory before running your command:
find -name 'lab*' -type d -execdir scons -c \;
Update:
As per thatotherguy's comment, this doesn't work. find -type d will only return directory names; however, -execdir runs its command from the directory containing the matched file, so in this example the scons -c command would be executed in the parent directory of the found lab* directory.
Use thatotherguy's method or this which is very similar:
find -name 'lab*' -type d -print -exec bash -c 'cd "{}"; scons -c' \;
If you want to do it with bash:
#!/bin/bash
# set default pattern to `lab` if no arguments
if [ $# -eq 0 ]; then
    pattern=lab
fi
# get the absolute path to this script
if [[ "$0" = /* ]]; then
    script_path=$0
else
    script_path=$(pwd)/$0
fi
for dir in $pattern*; do
    if [ -d "$dir" ]; then
        echo "Entering $dir"
        cd "$dir" > /dev/null
        # run the command here, then recurse into subdirectories
        scons -c
        bash "$script_path" dummy
        cd - > /dev/null
    fi
done

Change directories

I am having some trouble running some commands in the shell.
My problem is that I want to change directory, more specifically to a directory which I don't know but which contains a file named xxx.
How can I change directly to the directory that contains that file?
If I knew the name of the directory containing that file it would be easier, because I would only have to use cd ~/NameOfDirectory.
Can anyone help me?
thanks
If you have GNU find:
cd "$(find /startdir -name 'filename' -printf %h -quit)"
You can replace "/startdir" with any valid directory, for example /, . or ~.
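For example, to jump to whatever directory under ~/projects holds a file called config.yaml (hypothetical path and file name):
cd "$(find ~/projects -name 'config.yaml' -printf '%h' -quit)"
Note that if nothing matches, the command substitution is empty, so a guard may be worth adding in a script.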
If you want to cd to a directory which is in the $PATH that contains an executable file:
cd "$(dirname "$(type -P "filename")")" # Bash
or
cd "$(f=$(type -P "ksh"); echo "${f%/*}")" # Bash
or
cd "$(dirname "$(which "filename")")"
If you don't know where a file is, go to the root of the system and find it:
cd /
find . -iname filename
On several Linux systems you could do:
$ cd `find . -name "filename" | xargs dirname`
But change "filename" to the file you want to find.
BASH
cd "$(dirname "$(find . -name "*filename*" | head -1)")"
This is kind of a variation of Qiau's answer: it finds the first file whose name contains the string filename and then changes the current directory to its location.
* is a wildcard; there may be something before and/or after filename.
