How to copy unique files to each directory using shell scripting - bash

I have a lot of files in a source directory with different names, each ending with a number as a suffix. I need to copy all the files into 3 different directories, so that the files sharing a given suffix end up together in their own directory.
Example :
Source directory
1) Test01.csv
2) Test02.csv
3) Test03.csv
4) Nontesting01.csv
5) Nontesting02.csv
6) Nontesting03.csv
Destination directory
Directory 1 : Test01.csv
Nontesting01.csv
Directory 2 : Test02.csv
Nontesting02.csv
Directory 3 : Test03.csv
Nontesting03.csv
I have tried the code below, but it only copies 1 file per directory.
#!/bin/bash
dest=/Users/myprofile/Testing_
count=0
for f in *.csv ; do (cp $f/*.csv* "${dest}"${count}/$f ) ;
((count++))
done
Could someone help me achieve this with shell scripting?

Piggy-backing on what Jetchisel said, the code should be adapted, as a loop on patterns, to look like this:
#!/bin/bash
destPref="/Users/myprofile/Testing_"
for pattern in 01 02 03
do
    files=(*${pattern}.csv)
    cp -v -- "${files[@]}" "${destPref}${pattern}/"
done
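If the destination directories may not already exist, a minimal extension of the same sketch (assuming the Testing_01, Testing_02, ... naming implied by destPref above) would create them first:
#!/bin/bash
destPref="/Users/myprofile/Testing_"
for pattern in 01 02 03
do
    mkdir -p "${destPref}${pattern}"      # create Testing_01 etc. if missing
    files=(*${pattern}.csv)               # e.g. Test01.csv and Nontesting01.csv
    cp -v -- "${files[@]}" "${destPref}${pattern}/"
done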
As general guidance: if you start by writing the code logic in pseudo-code, fine-grained for each task that you are trying to accomplish, in the correct sequence and in the correct context, then wording it so that it does exactly what you want it to do will, almost explicitly, tell you WHAT you need to code for each of those steps, though not the HOW. The HOW is the nitty-gritty of coding.
If you give that a try, the solution will almost pop out of the page at you.
Good luck with your apprenticeship!

Related

Add prefix of root folder to all files in directory/subdirectories

So I have a bunch of directories that are kind of like this (but with more files):
1-0
file1
file2
folder
file 3
1-1
file1
file2
folder
file 3
etc.
And as the question asks, I want to prefix 1-0, 1-1, etc. to all the files in the directories and subdirectories of each root folder (the root folders being 1-0, 1-1, and so on), like this:
1-0
1-0file1
1-0file2
folder
1-0file 3
1-1
1-1file1
1-1file2
folder
1-1file 3
etc.
I've tried a few cmd/batch solutions from other questions, and ReNamer, but I couldn't find anything that quite did what I wanted. I'm pretty inexperienced with this kind of stuff, so I could have missed something for sure.
With ReNamer I can do it, but I have to do each directory individually, and I have a fair few that I am trying to rename, so it's a bit impractical.
In general I would find this a quite terrible idea to do... I cannot imagine a situation where, in the long term, this would really have been a good idea. One of the reasons is that it just duplicates structural information about the location in the filename itself. Why?...
On a Linux shell there are certainly various ways to accomplish this. Here is a one-liner:
find . -type f -name "*" | xargs -n 1 -I {} sh -c 'file_full={}; file_bare=${file_full#./}; file=${file_bare##*/}; prefix=${file_bare%%/*}; dir=${file_bare%/*}; echo mv ${file_full} ./${dir}/${prefix}${file}'
Note, there is an echo in the final command, so it only prints the mv commands it would run. Remove the echo only if you are really happy with the result and want it to be executed.
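For readability, here is the same logic unrolled into a loop; this is a sketch equivalent to the one-liner above (it makes the same assumptions about the layout and keeps the echo as a dry run):
# Same steps as the one-liner, spelled out; remove the echo to actually rename.
find . -type f | while IFS= read -r file_full; do
    file_bare=${file_full#./}      # path without the leading ./
    file=${file_bare##*/}          # bare filename, e.g. "file1"
    prefix=${file_bare%%/*}        # top-level folder, e.g. "1-0"
    dir=${file_bare%/*}            # directory part of the path
    echo mv "$file_full" "./$dir/$prefix$file"
done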

How to recursively rename all files and folders including a specific part of the filename with Windows Bash?

This has to be a duplicate, but I have read and tried at least a dozen Q&As here on SO, and I cannot get any of them working for my case.
Really hope this won't result in downvotes because of it.
So I'm on Windows (10) and have a Bash terminal that I want to use for my task. The MINGW64 one I downloaded when I started working with Git.
I would prefer the solution with this program, but will be perfectly happy with one in Command Prompt Terminal or even PowerShell.
I created a TemplateApp, which lives in the C:\Apps\TemplateApp folder; that folder has multiple folders and subfolders named TemplateApp or TemplateApp.something, as well as a lot of files that have TemplateApp as part of their name.
Could be:
TemplateApp.ext
TemplateApp.something.ext
something.TemplateApp.something.ext
Then I copied the uppermost folder to C:\Apps\TemplateApp - Copy and in turn renamed it to C:\Apps\ProductionApplication.
Now for the love of whomever, I cannot make any of the scripts I found on SO work for my case, i.e. rename all the above-mentioned files and folders by replacing TemplateApp with ProductionApplication.
Here is a bash function I wrote that I think does something very much like what you want to do.
function func_CreateSourceAndDestination() {
    #
    for (( i = 0 ; i < ${#files_syncSource[@]} ; i++ )) ; do
        files_syncDestination[${i}]="${files_syncSource[${i}]#${directory_MusicLibraryRoot_source}}"
        file_destinationPath="$( dirname -- "${directory_PMPRoot_destination}${files_syncDestination[${i}]}" )"
        if [ ! -d "${file_destinationPath}" ] ; then
            mkdir -p "${file_destinationPath}"
        fi
        rsync -rltDvPmz "${files_syncSource[${i}]}" "${directory_PMPRoot_destination}${files_syncDestination[${i}]}"
    done
}
In my case I'm feeding into rsync for a source and a destination. I'm pulling all the file paths from an array that has been split into path segments. I have to make certain character substitutions for FAT and NTFS file systems. I do this recursively.
files_syncDestination[${i}]="${files_syncDestination[${i}]//\:/__}"
That's the magic. I load a new array with the character substituted. You could do the same with variables holding the phrases you want to change.
files_syncDestination[${i}]="${files_syncDestination[${i}]//${targetPhrase}/${subPhrase}}"
After that change in the function, you could use rsync or cp or mv as you prefer to go from your source array to your destination array.
(The double-slash in the substitution makes the substitution global.)
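Applied to the question itself, a minimal sketch of the same substitution idea could look like the following (the TemplateApp and ProductionApplication names come from the question; the use of find and mv here is my assumption, not the exact code above):
# Rename every file and folder containing "TemplateApp"; -depth renames the
# deepest entries first so parent folders are not renamed out from under
# their children. Run from the copied application folder in the MINGW64 shell.
find . -depth -name '*TemplateApp*' | while IFS= read -r path; do
    dir=$(dirname -- "$path")
    base=$(basename -- "$path")
    mv -v -- "$path" "$dir/${base//TemplateApp/ProductionApplication}"
done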

Shell, copy files with similar names

I would like to copy a series of similar files from the current directory to the target directory, the files under the current directory are:
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0001_ux.hst
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0001_uz.hst
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0002_ux.hst
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0002_uz.hst
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0003_ux.hst
prod07_sim0500-W31-0.2_velocity-models-2D_t80_f0003_uz.hst
Where sim runs from sim0001 to sim0500 and f from f0001 to f0009. I only need f0002, f0005 and f0008. I wrote the following code:
target_dir="projects/data"
for i in {0001..0500}; do
for s in f000{2,5,8}; do
files="[*]$i[*]$s[*]"
cp $files target_dir
done
done
I am very new to shell, and I am wondering how to write the files="[*]$i[*]$s[*]" line so that it matches only the f0002, f0005 and f0008 files. The reason why I also use for i in {0001..0500}; do is that the files are quite large, and I would like to make sure I can access some completed ones (for example, everything for sim0001) early on.
Edit: changed for s in f0002 f0005 f0008; do to f000{2,5,8}.
What you need is globbing and slightly different quoting:
cp *"$i"*"$s"* "$target_dir"
Not storing this in a variable is intentional - it's faster and it's safe. If you end up with such a large list of files that you start running into system limits you'll have to look into xargs.
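Put back into the question's loop, a minimal sketch (keeping the original variable names) would be:
target_dir="projects/data"
for i in {0001..0500}; do
    for s in f000{2,5,8}; do
        # the unquoted * parts glob; the quoted "$i" and "$s" are matched literally
        cp *"$i"*"$s"* "$target_dir"
    done
done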

Bash/shell/OS interpretation of . and .. — can I define ...?

How do . and .., as paths (vs. ranges, e.g., {1..10}, which I'm not concerned with), really work? I know what they do, and use them all the time, but don't fully grasp how/where they're interpreted. Does the shell handle them? The interpreting process? The OS?
The reason why I'm asking is that I'd like to be able to use ... to refer to ../.., .... to refer to ../../.., etc. (up to some small finite number; I don't need bash to process an arbitrarily large number of dots). I.e., if my current directory is /tmp/let/me/out, and I call cd ..., my resulting current directory should be /tmp/let. I don't particularly care if ... etc. show up in ls -a output like . and .. do, but I would like to be able to call cat /tmp/let/me/out/..../phew.txt to print the contents of /tmp/phew.txt.
Pointers to relevant documentation appreciated as well as direct answers. This kind of syntax question is very hard to Google.
I'm using bash 4.3.42, by the way, with the autocd and globstar shell options.
. and .. are genuine directory names. They are not "short-cuts", aliases, or anything fake.
They happen to point to the same inode as the other name you use. A file or directory can have several names pointing to the same inode, these are usually known as hard links, to distinguish them from symbolic (or soft) links.
If you are on Linux or OS X you can use stat to look at most of the inode metadata - it is what ls looks at. You will see there is an inode number. If you stat . and stat current-directory-name you will see that number is the same.
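For example (stat -c is the GNU coreutils form; on OS X/BSD the equivalent is stat -f '%i'):
cd /tmp/let/me/out               # directory name borrowed from the question
stat -c '%i' .                   # prints the inode number of .
stat -c '%i' /tmp/let/me/out     # prints the same inode number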
The one thing that is not held in the inode is the filename - that is held in the directory.
So . and .. reside in the directory on the file system; they are not a figment of the shell's imagination. So, for example, I can use . and .. quite happily from C.
I doubt you can change them - personally I have never tried and I never will. You would have to change what these filenames linked to by editing the directory. If you managed it you would probably do irreparable damage to your file system.
I write this to clarify what has already been written before.
In many file systems a DIRECTORY is a file; a special type of file that the file system identifies as being distinctly a directory.
A directory file contains a list of names that map to files on the disk
A file, including a directory, does not have an intrinsic name associated with it (not true in all file systems). The name of a file exists only in a directory.
The same file can have an entry in multiple directories (hard link). The same file can then have multiple names and multiple paths.
The file system maintains in every directory entries for "." and ".."
In such file systems there are always directory ENTRIES for the NAMES "." and "..". These entries are maintained by the file system.
The name "." links to its own directory.
The name ".." links to the parent directory EXCEPT for the top level directory where it links to itself (. and .. thus link to the same directory file).
So when you use "." and ".." as in /dir1/dir2/../dir3/./dir4/whatever,
"." and ".." are processed in the exact same way as "dir1" and "dir2".
This translation is done by the file system; not the shell.
cd ...
Does not work because there is no entry for "..." (at least not normally).
You can create a directory called "..." if you want.
You can actually achieve something like this, though this is an ugly hack:
You can run a command before every command entered in bash, and after every command: to do that you trap the DEBUG pseudo-signal and assign a command to PROMPT_COMMAND, respectively.
trap 'ln -s ../.. ... &>/dev/null || true' DEBUG
PROMPT_COMMAND='rm ...'
With this, it seems like there's an additional entry in the current directory:
pwd
# /tmp/crazy-stuff
ls -a
# . .. ... foo
ls -a .../tmp/crazy-stuff
# . .. ... foo
Though this only works in the current directory, because the symbolic link is deleted after each command invocation. Thus ls foo/bar/... won't work this way.
Another ugly hack would be to "override" mkdir such that it populates every new directory with these symbolic links.
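A sketch of that last idea (purely illustrative, and just as much of a hack): a shell function that shadows mkdir and drops a ... link into each directory it creates:
mkdir() {
    command mkdir "$@" || return      # run the real mkdir with the same arguments
    local d
    for d in "$@"; do                 # option arguments simply fail the -d test below
        [ -d "$d" ] && ln -s ../.. "$d/..." 2>/dev/null
    done
    return 0
}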
See also the comments on the second answer here, particularly Eliah's: https://askubuntu.com/questions/327126/what-is-a-dot-only-named-folder
Much in the same way that when you cd into some directory subdir, you're actually following a pointer that points to that directory, .. is a pointer added by the OS that points to the parent directory, and I'd imagine . works the same way.

Why is my bash randomly selecting files and renaming?

On my Mac I am writing a terminal script to do an incremental rename of some files in a folder. The files have a number at the end, and viewed from the top down they are in order:
foobar101.png
foobar107.png
foobar115.png
foobar121.png
foobar127.png
foobar133.png
foobar141.png
foobar145.png
foobar151.png
foobar155.png
When I create and run my loop it works:
DIR="/customlocation/on/mac"
add=1;
for thefile in $(find $DIR -name "*.png" ); do
cd $DIR
mv -v "${thefile}" foobar"${add}".png
((add++))
done
However, when it runs, the increment is not as expected:
foobar101.png -> need foobar1.png but is foobar10.png
foobar107.png -> need foobar2.png but is foobar3.png
foobar115.png -> need foobar3.png but is foobar4.png
foobar121.png -> need foobar4.png but is foobar2.png
foobar127.png -> need foobar5.png but is foobar9.png
foobar133.png -> need foobar6.png but is foobar6.png
foobar141.png -> need foobar7.png but is foobar1.png
foobar145.png -> need foobar8.png but is foobar5.png
foobar151.png -> need foobar9.png but is foobar8.png
foobar155.png -> need foobar10.png but is foobar7.png
I've tried searching on SO, Linux/Unix, Ask Ubuntu, and SuperUser, but I don't see any questions that solve the issue of controlling the increment, and I don't know if it's something in particular I should be looking at. So how can I control the increment from the lowest number/filename, instead of the Mac possibly randomly renaming with an increment, so that I get the desired output?
EDIT:
After a comment from Etan I looked into the numerical values at the end, and some of the files are named foobarXXXX; that is the issue. The answer below, while awesome and a new approach I will look into, still produces the same outcome because of those other files. If I remove all files named foobarXXXX and leave only files named foobarXXX, both my code and the code in fedorqui's answer work. Is there a way I can target this during the loop, or do I have to target all names, test the length of the numbers, and adjust accordingly?
You cannot rely on the order of a find command, which returns files in whatever order the VFS gives them to it.
You may, instead, want to sort it:
DIR="/customlocation/on/mac"
add=1;
while IFS= read -r thefile; do
    cd $DIR
    mv -v "${thefile}" foobar"${add}".png
    ((add++))
done < <(find $DIR -name "*.png" | sort)
#-------^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Note this uses process substitution, which feeds the while loop:
Process substitution is a form of redirection where the input or
output of a process (some sequence of commands) appear as a temporary
file.
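A note on the edit about foobarXXX vs foobarXXXX names: a plain sort is lexical, so foobar1000.png would sort before foobar101.png. If GNU sort is available (for example via coreutils on a Mac; this is an assumption about your setup, not part of the answer above), its version sort keeps mixed-width numbers in numeric order, so the last line of the loop would become:
done < <(find $DIR -name "*.png" | sort -V)    # -V orders foobar101 before foobar1000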
