command rename does not work on my bash script - bash

Yesterday I asked a question here: How can I run a bash script in every subfolder of a base folder. My main problem was solved, but I have another one: I don't know why, but the rename command does NOTHING if I try to use it recursively. I've tried all the different options they suggested, and others I found, and if I run rename on a single directory it works fine (so the line itself is OK), but I can't make it work recursively.
The question of optimizing the images doesn't matter now because I changed the script to do that first. Now I have all the images like this: image.png (which is the original) and image-nq8.png (which is the optimized one).
What I want now is for the optimized file to take the name of the original, and for the original to be deleted. But all my attempts at it fail and I don't know why.
I made a script: scriptloop
for i in $(find /path/to/start/ -name "*.png"); do
    rename -nq8.png .png *-nq8*
done
and call it this way: ./scriptloop
and tried too using: find . -name '*-nq8.png' -print0 | xargs -0 -P6 -n1 scriptOneLine
with this inside scriptOneLine: rename -nq8.png .png *-nq8*
Note: as I said, if I run rename -nq8.png .png *-nq8* in a single directory it works, but I can't make it work recursively. Any idea why, or what am I doing wrong? (I'm on Fedora.)
Thank you so much
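For what it's worth, a minimal sketch of one way to do this recursively, swapping rename for mv plus shell suffix stripping (the /path/to/start/ path is the placeholder from above, and it assumes every optimized file follows the image-nq8.png naming described in the question):
find /path/to/start/ -name '*-nq8.png' -print0 |
while IFS= read -r -d '' f; do
    # move image-nq8.png over image.png: the optimized file takes the
    # original name and the original is overwritten (i.e. deleted)
    mv -- "$f" "${f%-nq8.png}.png"
done
Reading find's -print0 output with read -d '' keeps file names containing spaces intact.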

Related

macOS find command behaving strangely

Example of command as used in bash script:
find '/Files' -type d -name temp* -depth -delete -print
This command should delete all folders whose names start with "temp" in the '/Files' folder and its subfolders ("temp0", "temp1", "temp2", etc.).
The script works as expected: folders are found and properly deleted.
But sometimes, for some users, on some computers, etc., the script does not work as expected, despite the fact that the folders and files are exactly the same.
The find command fails like this:
find /Files -type d -name tempta temptal -depth -delete -print
find: temptal: unknown primary or operator
I can't figure out where "tempta" and "temptal" are coming from; I don't have files with those names anywhere in the folder. The temp* folders are present, but they are not deleted because of this error.
The only thing that might be connected is two files named "AbcInstall.sh" and "AbcInstall.log" in an "AbcTemp" subfolder. So we have "ta" and "tal" plus "Temp". These are the elements that remind me of "tempta" and "temptal", but they make no real sense; it could be a coincidence.
How can "find" result resolve into something like this !?!
Sorry for the lack of better explanation - this problem is really weird. The problem is that i can't replicate this issue on my computer so all i can do is experimenting (so far without success).
Any hints or ideas are greatly appreciated.
Thx!
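One likely explanation, offered as a guess: the pattern temp* in the script is not quoted, so the shell expands it against the current working directory before find ever runs. If that directory happens to contain entries named tempta and temptal, the command becomes exactly the failing one shown above. Quoting the pattern keeps the shell from touching it:
# unquoted: the shell may expand temp* from the current directory's contents
find '/Files' -type d -name temp* -depth -delete -print
# quoted: the pattern is passed to find untouched
find '/Files' -type d -name 'temp*' -depth -delete -print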

Copying a list of files with wildcards into a new folder

I don't have much experience with the command line, but essentially I have a list of files in a single folder as follows:
file1_a_1
file1_a_2
file2_b_1
file2_b_2
file3_c_1
file3_c_2
And I also have a text file listing the files I want. However, this list does not contain the full file names; instead, it looks like this:
file1_a file3_c
because I want to move all the files that start with one of 30 or so specific codes (i.e. everything that starts with file1_a, file3_c, and so on).
I have tried:
cp file1_a* file3_c* 'dir/dest'
but this does not work. I have also tried the find command. I think I have to use a loop to do this, but I cannot find any help on looping through files with a wildcard at the end.
Thanks in advance! I am working on a linux machine in bash.
You can use the xargs command together with find and a pipe. Note that cp expects the destination last, so use -I{} to put each found file in front of it (xxxxx stands for the name pattern, dir/dest for the destination):
find / -name 'xxxxx' | xargs -I{} cp {} dir/dest
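Since the prefixes live in a text file, here is a hedged sketch of one way to use that list directly; the file name codes.txt is an assumption, dir/dest is the destination from the question, and it assumes the prefixes and file names contain no spaces:
for prefix in $(cat codes.txt); do
    # the shell expands the wildcard, so everything starting with
    # this prefix is copied in one cp call
    cp -- "$prefix"* dir/dest/
done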

How to Copy Files Without Overwriting Any Files in Bash Script

I am trying to write a script that finds all my files that are .jpg and copies them to a new directory. It currently looks like this:
find ~/Pictures -iname \*.jpg -exec cp {} ...newDirectory \;
The problem is that some of my older files have the same name as newer files, when the IMG_#### reset back to 0001 and started counting again.
Is there a way to find the .jpgs and copy without overwriting the files? Ideally giving them a new name in the process.
EDIT
I ended up learning about rsync, which in its own way does exactly what I was looking for. Thanks for the help!
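For the record, a hedged sketch of the rsync approach mentioned in the edit; the destination path is an assumption, and unlike the find/cp command above this keeps the directory structure under the destination, while --ignore-existing skips (rather than renames) clashing files:
rsync -av --ignore-existing \
    --include='*/' --include='*.jpg' --include='*.JPG' --exclude='*' \
    ~/Pictures/ /path/to/newDirectory/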
Use the -n option for cp; it means: do not overwrite an existing file.
To prevent identical names, you could just rename all of them uniquely.
Example:
$ touch screenshot.jpg
$ cp screenshot.jpg screenshot-$(date "+%s").jpg
So basically, mass-rename the new files you want to copy to the same name plus a date.
That will make them different from what's already there, since the older ones were not renamed this way or (if you repeat this later) will carry different dates.
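A hedged sketch combining these suggestions with the original find command (the destination newDirectory is the placeholder from the question): cp -n silently skips clashes, while cp's --backup=numbered option (not mentioned above) keeps both files by renaming the one already in the destination.
# skip any file whose name already exists in the destination
find ~/Pictures -iname '*.jpg' -exec cp -n {} /path/to/newDirectory/ \;
# or keep both: the existing copy is first renamed to e.g. IMG_0001.jpg.~1~
find ~/Pictures -iname '*.jpg' -exec cp --backup=numbered {} /path/to/newDirectory/ \;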

Bash Script: find command getting stuck

I'm currently writing a bash script wherein a portion of it needs to be able to look at a bunch of directory hierarchies and spit out two text files each containing a list of the directories and all the files, respectively, in the given directory.
As I understand the following should do the trick:
find $directory -type d >> alldirs.txt
where directory is assigned different directory path names, since I'm supposed to check a number of them.
I have a for loop that iterates through my list of directories and uses the above command to complete the task. The command gets to a certain point and then it gets stuck. When I investigated the issue, it seemed like it would reach a directory that's empty and then get stuck, or it would start looking for directories that don't exist in the first place and then get stuck. Any ideas?
Is there something I'm missing? Or did I misunderstand how this works? Is there a better alternative?
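For reference, a minimal sketch of the loop described above; the dirs.txt input file and the allfiles.txt output name are assumptions, while alldirs.txt is from the question:
while IFS= read -r directory; do
    [ -d "$directory" ] || continue            # skip paths that don't exist
    find "$directory" -type d >> alldirs.txt   # list of directories
    find "$directory" -type f >> allfiles.txt  # list of files
done < dirs.txt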
You haven't said what $directory is set to. Without a valid value, you will get a complaint like "find: $directory: No such file or directory".
For example:
find . -iname $directory -type d >> alldirs.txt
Note: The above will start searching in the current directory, specified by the "."
Change it to whatever directory you wish e.g. /home/mys.celeste
I had a similar issue: find / -name blahblah got stuck somewhere.
When debugging, I tried searching each of the root directories, like /tmp, /var, /sbin, /usr and so on, and found that it was stuck on /media.
In /media I had a RHEL repo mounted. After unmounting it, find continued to work normally.
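If the hang turns out to be a mount point such as /media, one option (a suggestion of mine, not from the answers above) is to keep find on the starting filesystem with -xdev:
find "$directory" -xdev -type d >> alldirs.txt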

Copying multiple files with same name in the same folder terminal script

I have a lot of files named the same, with a directory structure (simplified) like this:
../foo1/bar1/dir/file_1.ps
../foo1/bar2/dir/file_1.ps
../foo2/bar1/dir/file_1.ps
.... and many more
As it is extremely inefficient to view all of those ps files by going to the respective directories, I'd like to copy all of them into another directory, but include the names of the first two directories (which are the ones relevant to my purpose) in the file name.
I have previously tried it like this, but I cannot tell which file came from where, as they are all named consecutively:
#!/bin/bash -xv
cp -v --backup=numbered {} */*/dir/file* ../plots/;
Where ../plots is the folder I copy them to. However, they now have the form file.ps.~x~ (x is a number), so I get rid of the ".ps.~*~" part and leave only the ps extension with:
rename 's/\.ps.~*~//g' *;
rename 's/\~/.ps/g' *;
Then, as the ps files sometimes have hundreds of points and take a long time to open, I just convert them into jpg:
for file in * ; do convert -density 150 -quality 70 "$file" "${file/.ps/}".jpg; done;
This is not really a working bash script, as I have to change the directory manually. I guess the best way to do it is to copy the files from the beginning with the names of the first two directories incorporated into the copied filename.
How can I do this last thing?
If you just have two levels of directories, you can use
for file in */*/*.ps
do
    ln "$file" "${file//\//_}"
done
This goes over each ps file and hard-links it into the current directory with the /s replaced by _. Use cp instead of ln if you intend to edit the files but don't want to change the originals.
For arbitrary directory levels, you can use the bash specific
shopt -s globstar
for file in **/*.ps
do
    ln "$file" "${file//\//_}"
done
But are you sure you need to copy them all to one directory? You might be able to open them all with yourreader */*/*.ps, which, depending on your reader, may let you browse through them one by one while still seeing the full path.
You should run a find command and print the names first, like:
find . -name "file_1.ps" -print
Then iterate over each of them and replace every / with '-' or any other character, like
${filename//\//-}
The general syntax is ${string/substring/replacement}; doubling the first slash, as above, replaces every occurrence rather than just the first, which matters here because the paths contain several slashes. Then you can copy the file to the required directory. The complete script can be written as follows. I haven't tested it (I'm not on Linux at the moment), so you might need to tweak the code if you get any syntax error ;)
for filename in $(find . -name "file_1.ps" -print)
do
    newFileName=${filename//\//-}
    cp "$filename" YourNewDirectory/"$newFileName"
done
You will need to place the script in the same root directory, or change the find command to look at the particular directory, if you are placing the above script somewhere else.
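A variant of the same idea (my own hedged sketch, keeping YourNewDirectory as the placeholder) that also copes with spaces in path names, by using find -exec instead of a command-substitution loop:
find . -name 'file_1.ps' -exec bash -c '
    for f in "$@"; do
        new=${f#./}                      # drop the leading "./" that find prints
        cp -- "$f" YourNewDirectory/"${new//\//-}"
    done
' bash {} +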
References
string manipulation in bash
find man page
