bash script rm cannot delete folder created by php mkdir

I cannot delete a folder created by PHP's mkdir. Here is my script:
for I in "$@"
do
find "$I" -type f -name "sess_*" -exec rm -f {} \;
find "$I" -type f -name "*.bak" -exec rm -f {} \;
find "$I" -type f -name "Thumbs.db" -exec rm -f {} \;
find "$I" -type f -name "error.log" -exec sh -c ': > "$1"' _ {} \;
find "$I" -type f -path "*/cache/*" -name "*.*" -exec rm -f {} \;
find "$I" -path "*/uploads/*" -exec rm -rf {} \;
done
I want to delete all files and folders under /uploads/. Please help me, thanks.

You should consider joining your find conditions together with the -o operator, since the final -exec is essentially the same in each command. This avoids recursing through the file system repeatedly.
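For example, a sketch of the combined deletions (the error.log truncation is left out, since its action differs):
find "$I" -type f \( -name "sess_*" -o -name "*.bak" -o -name "Thumbs.db" -o \( -path "*/cache/*" -name "*.*" \) \) -exec rm -f {} +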
The other answers address your concern about php mkdir. I'll just add that the problem has nothing to do with the folder being created by PHP's mkdir rather than any other code or command; it is due to the ownership and permissions.

I think this is most likely because PHP is running in Apache (or another HTTP server) under a different user than the one invoking the bash script. Or perhaps the files uploaded to uploads/ are owned by the HTTP server's user and not by the user invoking the script.
Make sure that you run the bash script as the same user as your HTTP server.
To find out which user owns which file, run:
ls -l
If you run your bash script as root, you will be able to delete it regardless, but that is not recommended.
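For example, a sketch assuming the server runs as www-data and your script is saved as cleanup.sh (both names are assumptions; check your server's configuration):
sudo -u www-data ./cleanup.sh /var/www/site  # user and script names are assumptions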
Update
To run it as root from a Nautilus script, use the following as your Nautilus script:
gksudo runmydeletescript
Then put all the other code into another file at whatever path you used for runmydeletescript and run chmod +x on it. This is extremely dangerous!

You should probably add -depth to the command so that sub-directories of uploads are deleted before the directory itself.
I'm wary of the -path option, as I'm not familiar with it.
Also consider using + instead of \; to reduce the number of rm processes spawned.
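Putting both suggestions together, a sketch of the uploads cleanup (assuming GNU find):
find "$I" -depth -path "*/uploads/*" -exec rm -rf {} +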

Related

Find files that have write permission for the current user

I need to find files in a directory which have "write" permission for the current user.
I tried using find like below:
find $DIR -type f -user $(whoami) -perm -u+w
But the above find looks for files owned by $(whoami).
The files I am looking for may not be owned by $(whoami) but may still be writable by the current user (e.g., mode 666).
I also tried the following (sorry if it looks stupid):
find $DIR -type f -exec test -w {} \;
Your second approach seems to be on the right track; you just forgot to print the filename.
You could make a script
#!/bin/sh
test -w "$1" && echo "$1"
and then use -exec with this script.
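For instance, if the script above is saved as checkwrite.sh (a hypothetical name) and made executable:
find "$DIR" -type f -exec ./checkwrite.sh {} \;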
UPDATE: If you don't like the idea of having a separate script, you can also put it into one line:
find $DIR -type f -exec bash -c 'test -w "$1" && echo "$1"' x {} \;
The lone x just serves as a placeholder, because this is what bash sees as $0, and you want to assign the current file to $1. Another possibility would be
find $DIR -type f -exec bash -c 'test -w "$0" && echo "$0"' {} \;
but for me, this looks less clear and like a misuse of $0.
Your second command is correct, you just have to print the paths. Usually -print doesn't have to be mentioned, but with a few options like -exec you have to explicitly specify that you want to print the found paths.
find "$DIR" -type f -exec test -w {} \; -print
You may wonder: »Why does this print only writeable files?«
find uses short-circuit evaluation – the next test is only evaluated if the preceding one succeeded.
Example: In the command find -type f -user USER the check -user USER will only be performed on files, not on directories as -type f fails for directories.
The -exec cmd option also acts as a check – the exit status of cmd will be used to determine whether the check passed or not.
Example: find -exec false \; -user USER won't ever perform the check -user USER since the program false never succeeds.
In your case that means that -print will only be executed if test -w succeeded.
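As an aside, if your find is GNU find, the -writable test performs the same check without forking a test process for every file:
find "$DIR" -type f -writable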

`rm -f` asks for confirmation when rm is aliased to `rm -i`

I'm using this command to try to delete all Thumbs.db files in a very large folder. I thought -f should force deletion without asking for confirmation, but I'm still being prompted with "y" or "n" for every file.
find "megapacks" -name Thumbs.db -ok rm -f {} \;
I ran type rm to see if there was an alias, and it responded with
rm is aliased to `rm -i'
I tried using /bin/rm instead, but I'm still being prompted:
find "megapacks" -name Thumbs.db -ok /bin/rm -f {} \;
Does anyone have another idea for how to avoid the confirmation?
The problem is with the -ok option, which as per man find:
Like -exec but ask the user first. If the user agrees, run the command. Otherwise just return false.
This should work for you with -exec:
find "megapacks" -name Thumbs.db -exec /bin/rm -f {} \;
or faster:
find "megapacks" -name Thumbs.db -exec /bin/rm -f {} +
But I think the problem is that you pass -ok to find, which is
Like -exec but ask the user first.
If the alias were the problem, simply unset the alias:
unalias rm
Note that this only affects the current shell session.
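For one-off interactive use you can also bypass the alias without removing it:
\rm Thumbs.db          # a leading backslash suppresses alias expansion
command rm Thumbs.db   # the command builtin does the same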
You can also use the -delete option for find:
find "megapacks" -name Thumbs.db -delete

Collect jars into a tar without directory

I have a large project that creates a large number of jars at paths like project/subproject/target/subproject.jar. I want a command to collect all the jars into one compressed tar, but without the directories. What I have come up with so far is:
find project -name \*.jar -exec tar -rvf Collectors.tar.gz -C $(dirname {}) $(basename {}) \;
but this isn't quite working as intended; the directories are still there.
Does anyone have any ideas for how to resolve this issue?
Your command is quite close, but the problem is that Bash executes $(dirname {}) and $(basename {}) before it ever runs find; so your command expands to this:
find project -name \*.jar -exec tar -rvf Collectors.tar.gz -C . {} \;
where the -C . is a no-op and the {} just expands to the full relative directory+filename.
One general-purpose way to fix this sort of thing is to wrap up the argument to -exec in a Bash one-liner, so you invoke Bash for each individual file, and let it execute the dirname and basename at the right time:
find project -name \*.jar -exec bash -c 'tar -rvf Collectors.tar.gz -C "$(dirname "$1")" "$(basename "$1")"' '' '{}' \;
(The empty string argument fills $0, so the path find substitutes for {} lands in $1.)
In your specific case, however, I'd point you to find's -execdir action, which is the same as -exec except that it cd's into the file's directory first. So you can simply write:
find project -name '*.jar' -execdir tar -rvf "$PWD/Collectors.tar.gz" '{}' \;
(Note that $PWD part, which is to make sure that you write to the Collectors.tar.gz in the current directory, rather than in the directory that find -execdir will cd into.)

How to use the pwd as a filename

I need to go through a whole set of subdirectories and, each time a subdirectory named 0 is found, go inside it. Once there, I need to execute a tar command to pack some files.
I tried the following
find . -type d -name "0" -exec sh -c '(cd {} && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/)' ';'
which seems to work. However, because this runs in many directories named 0, it always overwrites the previous filename.tar (and I lose the information about where it was created).
One way to solve this would be to use $PWD as the filename (plus .tar at the end).
I tried double quotes, backticks, etc., but I never manage to get the correct filename.
"$PWD"".tar", `$PWD.tar`, etc.
Any idea? Any other way is fine, as long as I can link the name of the file to the directory it was created in.
I need this to transfer directoryForTransfer easily from the cluster to my home computer.
You can try "${PWD//\//_}.tar", which replaces every / in the working directory with _ (so, for example, /data/run1/0 becomes _data_run1_0.tar). However you have to use bash -c instead of sh -c, because ${var//pattern/replacement} is a bashism.
Edit:
So now your code should look like this:
find . -type d -name "0" -exec bash -c 'cd {} && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/"${PWD//\//_}.tar"' ';'
I personally don't really like using the -exec flag for find, as it makes the code less readable and also forks a new process for each file. I would do it like this instead, which works unless a filename somewhere contains a newline (which is very unlikely).
while read -r dir; do
(cd "$dir" && tar -cvf filename.tar ../PARFILE RS* && cp filename.tar ~/home/directoryForTransfer/"${PWD//\//_}.tar")
done < <(find . -type d -name "0")
But this is just my personal preference. The -exec variant should work too.
You can use find's -execdir option to descend into each found directory and then run the tar command, which greatly simplifies it:
find . -type d -name "0" -execdir tar -cvf filename.tar RS* \;
If you want the tar file to be created in ~/home/directoryForTransfer/, then use:
find . -type d -name "0" -execdir tar -cvf ~/home/directoryForTransfer/filename.tar RS* \;

shell script to traverse files recursively

I need some assistance in creating a shell script that runs a specific command (any) on each file in a folder, and recursively dives into sub-directories as well.
I'm not sure how to start.
A point in the right direction would suffice. Thank you.
To apply a command (say, echo) to all files below the current path, use
find . -type f -exec echo "{}" \;
for directories, use -type d
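For example:
find . -type d -exec echo "{}" \;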
You should be looking at the find command.
For example, to change permissions on all JPEG files under your /tmp directory:
find /tmp -name '*.jpg' -exec chmod 777 {} ';'
Although, if there are a lot of files, you can combine it with xargs to batch them up, something like:
find /tmp -name '*.jpg' | xargs chmod 777
And, on implementations of find and xargs that support null-separation (which handles filenames containing spaces or newlines safely):
find /tmp -name '*.jpg' -print0 | xargs -0 chmod 777
Bash 4.0
#!/bin/bash
shopt -s globstar
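# with globstar set, ** matches any number of directory levels (bash >= 4.0)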
for file in **/*.txt
do
echo "do something with $file"
done
To recursively list all files
find . -name '*'
And let's say, for example, you want to grep each matching file:
find . -type f -name 'pattern' -print0 | xargs -0 grep 'searchtext'
Within a bash script, you can iterate over the results of the find command this way:
for F in $(find . -type f)
do
# command that uses "$F"
done
Note that this word-splits, so it breaks on filenames containing whitespace.
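If filenames may contain spaces or newlines, a more robust sketch (assuming bash and a find that supports -print0) is:
while IFS= read -r -d '' F
do
# command that uses "$F"
printf '%s\n' "$F"
done < <(find . -type f -print0)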
