How to extract files with a given extension from directories? - bash

I am currently extracting pictures from a large multi-level directory tree using the following bash command:
find . -name \*.jpg -exec cp {} /newdir_path_.. \;
However, all pictures are stored in 3 versions:
xxx-LD.jpg
xxx-SD.jpg
xxx.jpg
I just want to extract the xxx.jpg pictures, not the LD and SD versions.
How should my command be modified to perform such an extraction?

You can add more tests:
find . -name '*.jpg' -not -name '*-[LS]D.jpg' -exec cp {} /newdir_path_.. \;
-not is a GNU extension; you can use ! -name instead. In some shells, ! has to be escaped: \! -name.
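For example, the same command written with the portable ! syntax (the destination path is kept as the placeholder from the question):
find . -name '*.jpg' \! -name '*-[LS]D.jpg' -exec cp {} /newdir_path_.. \;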

This may also work for you, assuming your filenames contain only digits before .jpg:
find . -iregex '.*/[0-9]*\.jpg$' -exec cp {} /newdir_path_.. \;

Related

Find YML and YAML files with bash find

I am trying to find all .yaml and .yml files.
I tried
find . -name '*.{yml,yaml}' -exec echo "{}" \;
but got no results. The following way doesn't work either:
find . -name '*.yml' -name '*.yaml' -exec echo "{}" \;
It returns nothing.
Is it possible to use the find command to search for both extensions?
With GNU find, ? (or the equivalent {0,1}) matches zero or one occurrence of the a:
find . -regextype egrep -regex '.*\.ya?ml$'
or
find . -regextype egrep -regex '.*\.ya{0,1}ml$'
See: man find
Something like this.
find . \( -name '*.yaml' -o -name '*.yml' \)
See UsingFind
See Understanding-the-exec-option-of-find
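Putting the grouped tests together with the -exec from the question, a minimal sketch would be:
find . \( -name '*.yaml' -o -name '*.yml' \) -exec echo "{}" \;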

Mac terminal: how can I search all folders and subfolders for images?

How can I search, for example, for all .png files on an external disk and copy them to another directory?
I have tried the cp command, but it doesn't work for me (Monterey 2.2.1):
cp /Volumes/Data *.png /Volumes/Data/pictures_png
The cp command won't work if you need to copy recursively from subdirectories. You need to use find.
Syntax:
find $SOURCE -type f -name '*.type' -exec cp '{}' $DESTINATION ';'
In your case,
find /Volumes/Data -type f -name '*.png' -exec cp '{}' /Volumes/Data/pictures_png ';'
Here is how it works:
-type f means match only regular files, not directories.
-name gives the filename pattern to match, here *.png.
-exec runs the following command for each result find returns.
{} is replaced with each result from find.
; terminates the -exec command.
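Before copying, it can be worth previewing the matches; this dry run uses the same pattern but only prints (note that because every match lands in one flat destination directory, files sharing a name will overwrite each other):
find /Volumes/Data -type f -name '*.png' -print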

Compress multiple files individually with 7zip

While searching, I noticed that it is possible to compress several files individually with gzip. Unfortunately that format is not useful to me.
I tried to put together the following command:
find . -type f -name "*.txt" -exec 7z {} \;
Suggestions?
To add a file to an archive, you use the "a" command: 7z a archive.7z file. Note, though, that $(basename {}) cannot be used directly inside -exec: the shell expands the command substitution before find runs, so basename never sees the real file name. A small shell wrapper works instead:
find . -type f -name "*.txt" -exec sh -c '7z a "$(basename "$1" .txt).7z" "$1"' sh {} \;
or, if you are happy with archive names like *.txt.7z,
find . -type f -name "*.txt" -exec 7z a {}.7z {} \;
basename "$1" .txt strips the directory part and the .txt suffix, so each archive is named after the bare file name.
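If you would rather create each archive next to its source file instead of in the current directory, a hedged variant of the same wrapper (using only the standard dirname and basename utilities) would be:
find . -type f -name "*.txt" -exec sh -c '7z a "$(dirname "$1")/$(basename "$1" .txt).7z" "$1"' sh {} \;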

Delete all hidden files in folder and subfolders

I need to delete all hidden files in the current folder and its subfolders. Is there any way to do it with a single-line command, without creating a script?
Use
find "$some_directory" -type f -name '.*' -delete
If you want to remove hidden directories as well, you'll need to take a little more care to avoid . and .., as mentioned by Ronald.
find "$some_directory" -name '.*' ! -name '.' ! -name '..' -delete
With either command, you should run without the -delete primary first, to verify that the list of files/directories that find returns includes only files you really want to delete.
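For example, a dry run of the first command (it only lists, nothing is removed):
find "$some_directory" -type f -name '.*'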
For completeness, I should point out that -delete is a GNU extension to find; the POSIX-compliant command would be
find "$some_directory" -type f -name '.*' -exec rm '{}' \;
i.e., replace -delete with -exec ... \;, with ... replaced with the command line you would use to remove a file, but with the actual file name replaced by '{}'.
For my Netgear Stora, I wanted to remove all the hidden .webview, .thumbnails, .AppleDouble, etc. files and folders.
This works from the /home/yourusername/ folder:
find -type f -name '.*' ! -name '.' ! -name '..' -exec rm -fv '{}' \;
and then
find -type d -name '.*' ! -name '.' ! -name '..' -exec rm -frdv '{}' \;

I am getting an "arg list too long" error in Unix

I am using the following command and getting an "arg list too long" error. Help needed.
find ./* \
-prune \
-name "*.dat" \
-type f \
-cmin +60 \
-exec basename {} \;
Here is the fix: pipe the results to xargs instead of letting the shell expand ./*, and don't prune the starting directory itself (the trailing \; is not needed with xargs):
find . ! -name . -prune -name "*.dat" -type f -cmin +60 | xargs -I {} basename {}
To only find files in the current directory, use -maxdepth 1.
find . -maxdepth 1 -name '*.dat' -type f -cmin +60 -exec basename {} \;
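With GNU find you can also skip basename entirely, since -printf '%f\n' prints just the filename part (a GNU extension, so this is a sketch for GNU systems only):
find . -maxdepth 1 -name '*.dat' -type f -cmin +60 -printf '%f\n'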
On all *nix systems there is a maximum combined length for the arguments that can be passed to a command (ARG_MAX). This limit is checked after the shell has expanded any filename patterns given on the command line.
The syntax of find is find location_to_find_from arguments..., so when you run this command the shell expands your ./* to a list of every file in the current directory, turning the command line into find file1 file2 file3 and so on. This is probably not what you want, since find is recursive anyway. I expect that you are running this command in a large directory and blowing past your command-length limit.
Try running the command as follows
find . -name "*.dat" -type f -cmin +60 -exec basename {} \;
This will prevent the filename expansion that is probably causing your issue.
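If you want to see the limit on your system, getconf reports it (the value varies between operating systems):
getconf ARG_MAX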
Without find, and only checking the current directory:
now=$(date +%s)
for file in *.dat; do
    if (( $now - $(stat -c %Y "$file") > 3600 )); then
        echo "$file"
    fi
done
This works on my GNU system. You may need to alter the date and stat formats for other operating systems.
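For example, on BSD/macOS stat the modification time is read with -f %m instead of -c %Y, so a sketch for those systems would be:
now=$(date +%s)
for file in *.dat; do
    if (( now - $(stat -f %m "$file") > 3600 )); then
        echo "$file"
    fi
done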
If you want to list the .dat filenames from the whole ./ tree, run it without the -prune option and just give the path:
find ./ -name "*.dat" -type f -cmin +60 -exec basename {} \;
To find all the .dat files which are older than 60 minutes in the present directory only, do as follows:
find . -iregex "./[^/]+\.dat" -type f -cmin +60 -exec basename {} \;
And if you have a stripped-down version of the find tool (for example on AIX), do as follows:
find . -name "*.dat" -type f -cmin +60 | grep "^./[^/]\+dat" | sed "s/^.\///"
