How to remove all subdirectories? (unix shell scripting) - bash

I have a directory called "cdrs-roaming". Every day I receive one or more .zip files and unzip them with this:
#!/bin/bash
for i in *.zip
do
  j="${i%.zip}"    # directory name = zip file name without the extension
  mkdir "$j"
  cd "$j" || continue
  unzip "../$i"
  cd -
done
Then I have for example:
"example1.zip" and "example1"; "example2.zip" and "example2"
I'm removing all zip files (in this case: "example1.zip" and "example2.zip") with this:
#! /bin/bash
find /dados/cdrs-roaming/*.zip -mtime +1 -exec rm {} \;
So I want to remove the directories (or folders - I really don't know the difference) "example1" and "example2". I've tried this:
#! /bin/bash
find /dados/cdrs-roaming/ -type d -mtime +1 -exec rm -rf {} \;
But it also removes "cdrs-roaming". I've also tried to use:
find /cdrs-roaming/ -type d -mtime +1 -exec rm -rf {} \;
But it returns: find: ‘/cdrs-roaming/’: No such file or directory
Any ideas how to do this? I need to delete only the directories within "cdrs-roaming", but I must not remove anything else inside it (my .sh files are inside it).

Since you are using bash, how about
rm -rf /dados/cdrs-roaming/*/
The final slash ensures that bash only expands the pattern to directories.
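For example (a sketch using the directory names from the question; the script name is made up), the trailing slash keeps plain files such as the .sh scripts out of the expansion:
$ ls -p /dados/cdrs-roaming
example1/  example1.zip  example2/  unzip-all.sh
$ echo /dados/cdrs-roaming/*/
/dados/cdrs-roaming/example1/ /dados/cdrs-roaming/example2/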

Use the -mindepth 1 option, which excludes the starting directory itself from the matches:
find /dados/cdrs-roaming/ -mindepth 1 -type d -mtime +1 -exec rm -rf {} \;
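If you also want to keep find from descending into the directories it is deleting, restrict the match to the immediate subdirectories with -maxdepth 1 (a sketch based on the paths in the question):
find /dados/cdrs-roaming/ -mindepth 1 -maxdepth 1 -type d -mtime +1 -exec rm -rf {} +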

Related

"dir/*: No such file or directory" with find -exec ... "{}/*"

The current directory contains files and directories. The directories have no sub-directories, but may contain zero or more files, for example:
./file1
./file2
./directory1/file3
./directory2/file4
./directory2/file5
./directory3/
When I execute find . -type d -maxdepth 1 I get a listing of the directories:
./directory1
./directory2
If I execute mv ./directory1/* ., all the files in directory1 are moved to the current level, so I thought I could use find -exec to do everything in one go:
find . -type d -maxdepth 1 -exec mv "{}/*" . \;
But I get this response:
mv: rename ./directory1/* to ./*: No such file or directory
How can I move all the files in subdirectories to the current level?
Globbing (replacing foo/* with foo/dirA, foo/dirB, etc) is performed by the shell, not by mv. find -exec doesn't start a shell unless you do so manually; for example:
find . -type d -mindepth 1 -maxdepth 1 \
-exec sh -c 'for dir; do mv -- "$dir"/* .; done' _ {} +
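A quick way to see the difference (the directory name is taken from the question):
mv "directory1/*" .            # the literal * reaches mv unexpanded, hence "No such file or directory"
sh -c 'mv -- directory1/* .'   # a shell expands the glob before mv runs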
There's no real need to use find. You can do it with a single mv to move the files and rmdir to remove the now-empty directories.
mv */* .
rmdir */
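One caveat for the glob approach (a bash-specific sketch): * does not match hidden files by default, so if the subdirectories may contain dot-files, enable dotglob first:
shopt -s dotglob   # make * match hidden files as well
mv */* .
rmdir */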

copy files with the base directory

I am searching a specific directory and its subdirectories for new files, and I would like to copy those files. I am using this:
find /home/foo/hint/ -type f -mtime -2 -exec cp '{}' ~/new/ \;
It is copying the files successfully, but some files have the same name in different subdirectories of /home/foo/hint/.
I would like to copy the files together with their base (parent) directory to the ~/new/ directory.
test@serv> find /home/foo/hint/ -type f -mtime -2 -exec ls '{}' \;
/home/foo/hint/do/pass/file.txt
/home/foo/hint/fit/file.txt
test@serv>
~/new/ should look like this after copy:
test@serv> ls -R ~/new/
/home/test/new/pass/:
file.txt
/home/test/new/fit/:
file.txt
test@serv>
platform: Solaris 10.
Since you can't use rsync or fancy GNU options, you need to roll your own using the shell.
The find command lets you run a full shell in your -exec, so you should be good to go with a one-liner to handle the names.
If I understand correctly, you only want the parent directory, not the full tree, copied to the target. The following might do:
#!/usr/bin/env bash
findopts=(
-type f
-mtime -2
-exec bash -c 'd="${0%/*}"; d="${d##*/}"; mkdir -p "$1/$d"; cp -v "$0" "$1/$d/"' {} ./new \;
)
find /home/foo/hint/ "${findopts[#]}"
Results:
$ find ./hint -type f -print
./hint/foo/slurm/file.txt
./hint/foo/file.txt
./hint/bar/file.txt
$ ./doit
./hint/foo/slurm/file.txt -> ./new/slurm/file.txt
./hint/foo/file.txt -> ./new/foo/file.txt
./hint/bar/file.txt -> ./new/bar/file.txt
I've put the options to find into a bash array for easier reading and management. The script for the -exec option is still a little unwieldy, so here's a breakdown of what it does for each file. Bear in mind that in this format the arguments are numbered from zero, so the {} becomes $0 and the target directory becomes $1...
d="${0%/*}" # Store the source directory in a variable, then
d="${d##*/}" # strip everything up to the last slash, leaving the parent.
mkdir -p "$1/$d" # create the target directory if it doesn't already exist,
cp "$0" "$1/$d/" # then copy the file to it.
I used cp -v for verbose output, as shown in "Results" above, but IIRC Solaris' cp doesn't support it either, so it can safely be dropped.
The --parents flag should do the trick:
find /home/foo/hint/ -type f -mtime -2 -exec cp --parents '{}' ~/new/ \;
Try testing with rsync -R, for example:
find /your/path -type f -mtime -2 -exec rsync -R '{}' ~/new/ \;
From the rsync man:
-R, --relative
Use relative paths. This means that the full path names specified on the
command line are sent to the server rather than just the last parts of the
filenames.
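In other words (a sketch using one of the paths from the question), passing an absolute path to rsync -R recreates that full path under the target:
rsync -R /home/foo/hint/fit/file.txt ~/new/
# results in ~/new/home/foo/hint/fit/file.txt
That is exactly the caveat the next answer works around.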
The problem with the answers by @Mureinik and @nbari might be that the absolute source path of the new files is recreated inside the target directory. In this case you might want to switch to the base directory before the command and go back to your current directory afterwards:
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; ; cd "$path_current"
or
path_current=$PWD; cd /home/foo/hint/; find . -type f -mtime -2 -exec rsync -R '{}' ~/new/ \; ; cd "$path_current"
Both ways work for me on a Linux platform. Let's hope that Solaris 10 knows about rsync's -R! ;)
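A subshell does the same job without having to save and restore the current directory (a sketch, using the same paths):
( cd /home/foo/hint/ && find . -type f -mtime -2 -exec cp --parents '{}' ~/new/ \; )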
I found a way around it:
cd ~/new/
find /home/foo/hint/ -type f -mtime -2 -exec nawk -v f={} '{n=split(FILENAME, a, "/");j= a[n-1];system("mkdir -p "j"");system("cp "f" "j""); exit}' {} \;

Delete node_modules folder recursively from a specified path using command line

I have multiple npm projects saved in a local directory. Now I want to take a backup of my projects without the node_modules folders, as they take a lot of space and can be recreated at any time using npm install.
So, what would be a solution to delete all node_modules folders recursively from a specified path using the command line interface?
Print out a list of directories to be deleted:
find . -name 'node_modules' -type d -prune
Delete directories from the current working directory:
find . -name 'node_modules' -type d -prune -exec rm -rf '{}' +
Alternatively you can use trash (brew install trash) for staged deletion:
find . -name node_modules -type d -prune -exec trash {} +
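If you also want to see how much space the matches take up before deleting anything, du accepts the same expression (a sketch; the -h output differs slightly between GNU and BSD du):
find . -name 'node_modules' -type d -prune -exec du -sh {} +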
Try npkill:
https://github.com/voidcosmos/npkill
npx npkill
It will find all node_modules folders and let you remove them selectively.
Improving on the accepted answer:
find . -name 'node_modules' -type d -prune -exec rm -rf '{}' +
I found that the command above can run for a very long time gathering folders before anything is removed, because + batches many directories into each rm invocation. To make the command easier to interrupt and resume, I'd suggest using \; instead, and to see the progress of the command, add -print so each directory is shown as it is deleted.
Note: You must first cd into the root directory and then run the command. Or instead of find ., use find {project_directory}.
To delete folders one by one:
find . -name 'node_modules' -type d -prune -exec rm -rf '{}' \;
To delete folders one by one and print the folder being deleted:
find . -name 'node_modules' -type d -prune -print -exec rm -rf '{}' \;
For the people who like an interactive way of doing this, refer to jeckep's answer. Run this in the directory you wish to prune:
npx npkill
I have come across this solution:
first, find the folders by name using find;
then execute the delete command recursively with -exec rm -rf '{}' +.
Run the following command to delete the folders recursively:
find /path -type d -name "node_modules" -exec rm -rf '{}' +
When on Windows, I use the following .BAT file to delete node_modules recursively from the current folder (note the doubled %%d, which batch files require):
@for /d /r . %%d in (node_modules) do @if exist %%d (echo %%d && rd %%d /s /q)
Or, via CMD.EXE, where a single %d is used:
>cmd.exe /c "@for /d /r . %d in (node_modules) do @if exist %d (echo "%d" && rd "%d" /s /q)"
A bash function to remove node_modules. It will remove all node_modules directories recursively from the current working directory,
while printing each path it finds.
You just need to define it somewhere your shell sources, e.g. in your ~/.bashrc:
rmnodemodules(){
find . -name 'node_modules' -type d -prune -exec echo '{}' \; -exec rm -rf {} \;
}
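A usage sketch, assuming the function has been sourced (the project path is made up):
cd ~/projects && rmnodemodules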
In Bash, you can simply use
rm -rf node_modules **/node_modules
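Note that ** only recurses when the globstar shell option is enabled (it is off by default in bash), so a more complete sketch is:
shopt -s globstar   # let ** match nested directories
rm -rf node_modules **/node_modules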
If you want to move instead of delete it:
find . -name 'node_modules' -type d -prune -exec mkdir -p ./another/dir/{} \; -exec mv -i {} ./another/dir/{} \;
This will keep the directory structure.
A simple trick to remove all node_modules folders in your servers (which can reduce a lot of space) is to run:
# For Ubuntu
sudo find / -not -path "/usr/lib/*" -name 'node_modules' -type d -prune -exec rm -rf '{}' +
# For macOS
sudo find / -not -path "/usr/local/*" -name 'node_modules' -type d -prune -exec rm -rf '{}' +
Here we need to exclude /usr/lib/* because if you don't, the command will delete your npm installation itself and you will need to reinstall it :)
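To preview what would be removed before running the destructive version, swap the -exec part for -print (a sketch of the Ubuntu variant; nothing is deleted):
sudo find / -not -path "/usr/lib/*" -name 'node_modules' -type d -prune -print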
Note: This is just improving on the accepted answer, please use the accepted answer first.
If you are so bored then keep reading.
Basically, this command should work fine for 99% of cases
find . -name 'node_modules' -type d -prune -exec rm -rf '{}' +
I noticed that deleting files via the command line takes longer than deleting a folder via Finder (Finder simply moves the folder to the ~/.Trash directory).
So if you want to move node_modules to ~/.Trash folder then you can try
find . -name 'node_modules' -type d -prune -exec sh -c 'mv -f "$1" "$(dirname "$1")/$(basename $(dirname "$1"))_$(basename "$1")" && mv "$(dirname "$1")/$(basename $(dirname "$1"))_$(basename "$1")" ~/.Trash/' sh {} \;
As you can see, it consists of two parts.
find . -name 'node_modules' -type d -prune finds all node_modules directories.
-exec sh -c 'mv -f "$1" "$(dirname "$1")/$(basename $(dirname "$1"))_$(basename "$1")" && mv "$(dirname "$1")/$(basename $(dirname "$1"))_$(basename "$1")" ~/.Trash/' sh {} \; renames each node_modules directory by prefixing it with its parent folder name and moves it to the Trash.
Before I had
~/Development/angular-projects
┣ project1
┣ project2
┗ project3
After running the command:
~/.Trash
┣ ~project1_node_modules
┣ ~project2_node_modules
┗ ~project3_node_modules
Then make sure to empty the Trash, or turn on the automatic empty-Trash feature.
A Python script to delete the node_modules folders from multiple projects. Just place it in the folder containing your projects and run it.
import os
import shutil

dirname = '/root/Desktop/proj'  # your full path to the folder containing the projects
dirfiles = os.listdir(dirname)
fullpaths = map(lambda name: os.path.join(dirname, name), dirfiles)

# collect the project directories
dirs = []
for file in fullpaths:
    if os.path.isdir(file):
        dirs.append(file)

# inside each project, remove any directory whose path ends in 'node_modules'
for i in dirs:
    dirfiles1 = os.listdir(i)
    fullpaths1 = map(lambda name: os.path.join(i, name), dirfiles1)
    for file in fullpaths1:
        if os.path.isdir(file) and file[-12:] == 'node_modules':
            shutil.rmtree(file)
            print(file)

How to print the deleted file names along with path in shell script

I am deleting the files in all the directories and subdirectories using the command below:
find . -type f -name "*.txt" -exec rm -f {} \;
But I want to know which files were deleted, along with their paths. How can I do this?
Simply add a -print argument to your find.
$ find . -type f -name "*.txt" -print -exec rm -f {} \;
As noted by @JonathanRoss below, you can achieve an equivalent result with the -v option to rm.
It's outside the scope of your question, but more generally it gets more interesting if you want to delete directories recursively. Then:
a simple -exec rm -r argument keeps it silent
a -print -exec rm -r argument reports the toplevel directories you're operating on
a -exec rm -rv argument reports all you're removing
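A minimal sketch of the last variant (the directory name is hypothetical):
find . -type d -name "old_reports" -prune -exec rm -rv {} \;
# -prune keeps find from descending into the directories rm is about to delete;
# -v makes rm report every file and directory as it removes it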

Shell script to delete directories older than n days

I have directories named as:
2012-12-12
2012-10-12
2012-08-08
How would I delete the directories that are older than 10 days with a bash shell script?
This will do it recursively for you:
find /path/to/base/dir/* -type d -ctime +10 -exec rm -rf {} \;
Explanation:
find: the unix command for finding files / directories / links etc.
/path/to/base/dir: the directory to start your search in.
-type d: only find directories
-ctime +10: only consider the ones whose status (inode) change time is more than 10 days old; use -mtime instead if you really mean modification time
-exec ... \;: for each such result found, do the following command in ...
rm -rf {}: recursively force remove the directory; the {} part is where the find result gets substituted into from the previous part.
Alternatively, use:
find /path/to/base/dir/* -type d -ctime +10 | xargs rm -rf
Which is a bit more efficient, because it amounts to:
rm -rf dir1 dir2 dir3 ...
as opposed to:
rm -rf dir1; rm -rf dir2; rm -rf dir3; ...
as in the -exec method.
With modern versions of find, you can replace the ; with + and it will do the equivalent of the xargs call for you, passing as many files as will fit on each exec system call:
find . -type d -ctime +10 -exec rm -rf {} +
If you want to delete all subdirectories under /path/to/base, for example
/path/to/base/dir1
/path/to/base/dir2
/path/to/base/dir3
but you don't want to delete the root /path/to/base, you have to add -mindepth 1 and -maxdepth 1 options, which will access only the subdirectories under /path/to/base
-mindepth 1 excludes the root /path/to/base from the matches.
-maxdepth 1 will ONLY match subdirectories immediately under /path/to/base such as /path/to/base/dir1, /path/to/base/dir2 and /path/to/base/dir3 but it will not list subdirectories of these in a recursive manner. So these example subdirectories will not be listed:
/path/to/base/dir1/dir1
/path/to/base/dir2/dir1
/path/to/base/dir3/dir1
and so forth.
So, to delete all the subdirectories under /path/to/base which are older than 10 days:
find /path/to/base -mindepth 1 -maxdepth 1 -type d -ctime +10 | xargs rm -rf
find supports -delete operation, so:
find /base/dir/* -ctime +10 -delete
I think there's a catch: the files inside need to be 10+ days old too. I haven't tried it; someone may confirm in the comments.
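To illustrate that catch: -delete implies -depth, so contents are considered before their parent directory, and a directory is only removed once it is empty. A sketch with made-up paths:
find /path/to/base/dir -mindepth 1 -ctime +10 -delete
# old-dir/stale-file   matches and is deleted first
# old-dir/             is then removed only if nothing newer was left inside it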
The most voted solution here is missing -maxdepth 0, so after deleting each matched directory find will still descend into it and call rm -rf on subdirectories that no longer exist. That doesn't make sense, so I suggest:
find /base/dir/* -maxdepth 0 -type d -ctime +10 -exec rm -rf {} \;
The -delete solution above doesn't use -maxdepth 0 because find would complain the dir is not empty. Instead, it implies -depth and deletes from the bottom up.
I was struggling to get this right using the scripts provided above and some other scripts, especially when file and folder names contained newlines or spaces.
I finally stumbled on tmpreaper, and it has worked pretty well for us so far.
tmpreaper -t 5d ~/Downloads
tmpreaper --protect '*.c' -t 5h ~/my_prg
It has features like a test mode (--test), which recursively checks the directories and lists what would be removed.
It can delete symlinks, files, or directories, and its protection mode (--protect) shields anything matching a given pattern from deletion.
OR
rm -rf `find /path/to/base/dir/* -type d -mtime +10`
Updated, faster version of it:
find /path/to/base/dir/* -type d -mtime +10 -print0 | xargs -0 rm -rf

Resources