AppleScript to delete all user home folders except specific ones?

I manage a local school's IT infrastructure. I recently thought of a situation we may run into now that we have incorporated Apple computers into the system. It deals with network users' home folders stored locally on the Mac. Each year we have 500+ new students, so I can only assume that at some point new user names are going to be the same as somebody's from a previous year. Which is fine, except I do not want the previous user's files/data loaded up when they log into the Mac with their network account.
What I would like some help with is a script, possibly a Unix script for ARD, that I can use to delete all users' home folders (or any other information I should remove) except a list of specified ones.
Real examples of users:
dadmin
dbaker
jdunn
Again, I do not know if this is the best way to do such a thing, but that's why I'm asking.
Also, we do not use mobile accounts. Network users log in, and local profiles/home folders are created. I'm going to guess it will need to erase keychain data as well as any other user-specific information.

The OP found the solution:
--------------------------- SCRIPT INFORMATION ---------------------------
PURPOSE OF SCRIPT: Remove home folders of user accounts on Macs that are
NOT mobile accounts, deleting only the ones whose names do not match the
names given to keep.
PLATFORM: Apple iMac
OS VERSION: 10.6.8
TESTED and WORKS: True
#!/bin/sh
echo "Deleting home directory for the following users…"
userList=`find /Users -type d -maxdepth 1 -mindepth 1 -not -name "*.*" -not -name "username" -not -name "Shared" -not -name "Guest" -mtime +7 | awk -F'/' '{print $NF}'`
for a in $userList ; do
    rm -r /Users/"$a"   # delete the home directory
done
This will find users in the /Users directory while disregarding hidden files, the "Shared" folder, the "Guest" folder, and any name that you change the "username" placeholder into. To add more user names to not delete, simply add more
-not -name "usernamegoeshere"
switches to the command, anywhere after the first -not -name "*.*" switch.
I'm sure there is a way to have it read from a file list of user names, but as I am a beginner I won't go any further (a rough sketch of that idea appears after the test script below). I got this to work by running it through Terminal; I have yet to test it on ARD.
I also ran a test first, to make sure it pulled the correct names and only the ones I wanted to delete, using the following script, which does nothing but display the names that would be deleted based on the criteria defined by the -not -name "username" switches in the script.
#!/bin/sh
echo "Deleting home directory for the following users…"
userList=`find /Users -type d -maxdepth 1 -mindepth 1 -not -name "*.*" -not -name "username" -not -name "Shared" -not -name "Guest" -mtime +7 | awk -F'/' '{print $NF}'`
for a in $userList ; do
    echo "$a"
done
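As mentioned above, the keep-list could also live in a text file instead of being hard-coded as -not -name switches. Here is a rough, untested sketch of that idea; it assumes a file (the path /usr/local/etc/keep-users.txt is just an example) containing one username per line with no spaces in the names, and it only echoes the candidates, so swap in the rm once the output looks right:
#!/bin/sh
# Sketch only: build the extra "-not -name" switches from a file of
# usernames to keep (one per line). Example path, adjust as needed.
keepFile="/usr/local/etc/keep-users.txt"
keepArgs=""
while read -r name; do
    [ -n "$name" ] && keepArgs="$keepArgs -not -name $name"
done < "$keepFile"
userList=`find /Users -type d -maxdepth 1 -mindepth 1 -not -name "*.*" -not -name "Shared" -not -name "Guest" $keepArgs -mtime +7 | awk -F'/' '{print $NF}'`
for a in $userList ; do
    echo "$a"    # replace with: rm -r /Users/"$a"
done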

Related

Retrieve Folder Older than Date Based On folder Name

I have a set of snapshots. Each snapshot resides in an accountname folder, and each snapshot is named with a date format as: YYYY-MM-DD-accountname
How can I retrieve the names of the snapshot folders that are older than 2 days? (The 2017-05-* directories)
Folder structure such as:
/home/snapshots
/home/snapshots/account1
/home/snapshots/account1/2017-05-01-account1
/home/snapshots/account1/2017-05-02-account1
/home/snapshots/account1/2017-05-03-account1
/home/snapshots/account1/2017-05-04-account1
/home/snapshots/account1/2017-05-05-account1
/home/snapshots/account1/2017-05-06-account1
/home/snapshots/account2
/home/snapshots/account2/2017-05-01-account2
/home/snapshots/account2/2017-05-02-account2
/home/snapshots/account2/2017-05-03-account2
/home/snapshots/account2/2017-05-04-account2
/home/snapshots/account2/2017-05-05-account2
/home/snapshots/account2/2017-05-06-account2
For instance, given that today is 05/06/2017 (US), I want to list /home/snapshots/account1/2017-05-01-account1 through /home/snapshots/account1/2017-05-04-account1, and likewise for account2.
I thought find /home/snapshots/ -type d -mtime +2 -exec ls -la {} \; might do the trick, but that returned all directories older than 2 days... and adding -maxdepth 1 returned nothing...
Continuing from my comment above, the reason you are having problems is that you want to search within /home and then select and delete the snapshots directories found if they are more than two days old. With -execdir, it would be
find /home -type d -name "snapshots" -mtime +2 -execdir rm -r '{}' +
Let me know if you have problems. (Also, there is no need to use ls -la within find; the -printf option gives you complete control over the output format without spawning a separate process for each ls call, see man find.)
note: you should quote '{}' to protect against filenames with whitespace, etc.
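For illustration, a sketch of the same search using -printf instead of ls (GNU find only; BSD find does not have -printf):
find /home -type d -name "snapshots" -mtime +2 -printf '%p  last modified %TY-%Tm-%Td %TH:%TM\n'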
Edit
Sorry, I misread your question. Obviously, if you only want to delete the old snapshots inside the account* subdirectories of each snapshots directory, then the search path of /home/snapshots is fine and you then include the account*/*account* designator, as @BroSlow correctly caught below.
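A sketch of what that corrected command might look like, assuming the dated snapshot directories sit one level below each account* folder and that their modification times line up with the dates in their names (verify with -print before switching to the delete):
find /home/snapshots -mindepth 2 -maxdepth 2 -type d -path '*/account*/*-account*' -mtime +2 -print
# once the listing looks right, the same match can feed the delete:
find /home/snapshots -mindepth 2 -maxdepth 2 -type d -path '*/account*/*-account*' -mtime +2 -execdir rm -r '{}' +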

Searching for exact file extension using sh or bash on macOS (syntax issue)

Looking for files with a .pst extension. A few apps have files in their bundles with this extension, so we are excluding them.
After some testing, found this script works:
#!/bin/sh
find /Users -type f -not -path "*AnApplication.app*" | grep -i "*.pst$" > /path/to/search-result.txt
exit 0
However, it is returning *.dpst files, which I thought would not happen given the grep -i "*.pst$" part of the command.
We are using the $ to ensure the search returns extensions, and not files with ".pst" in the path or the middle of the name (example: myFile.pst.doc or /path/my.pst.files/).
Our goal is to find only files ending in ".pst". What am I doing wrong? :)
Thanks for the huge help, my apologies for the belated response. Here is what we ended up going with:
find /Users -type f -not -path '*AnApplication.app*' -iname '*.pst'
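For the record, the reason the original pattern also matched .dpst files is that grep takes a regular expression, not a shell glob: the unescaped . means "any character", so .pst$ matches the dpst at the end of those names as well. If you would rather keep the grep pipeline than switch to -iname, escaping the dot should behave as intended (a sketch reusing the placeholder application name and output path from the question):
find /Users -type f -not -path '*AnApplication.app*' | grep -i '\.pst$' > /path/to/search-result.txt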

Search only directories with permission -find (Bash 4.2)

I have a command on AIX that finds files whose names contain a certain phrase and that are older than a certain age. However, my report is full of permission denied errors. I would like find to search only the files and directories it has permission for.
I have a command for Linux that works:
find /home/ ! -readable -prune -o -name 'core.20*' -mtime +7 -print
However, in this case I am unable to use -readable.
find /home/ -name 'core.20*' -mtime +7 -print 2>/dev/null
works rather well, but it still tries to search those directories, costing time.
Just use your
find /home/ -name 'core.20*' -mtime +7 -print 2>/dev/null
When you want to skip directories without permission, your script must somehow ask Unix for permission. That is exactly what find is doing: when the top-level directory is closed, no time is spent on the tree beneath it. The only cost is the stderr output, and that is what you redirect.
If you want to optimise this for daily use, you could have a once-every-6-days crontab job write a file listing the files not changed for over a day, and use that log as input for the daily run. But this will not help much and is very dirty. Just stick with your 2>/dev/null.
A really simple fix, if it works for you, would be to filter out the errors using grep, something like:
find /home/ -name 'core.20*' -mtime +7 -print 2>&1 | grep -v 'Permission denied'
This redirects stderr into the pipe and hides any lines containing the phrase 'Permission denied' (case sensitive).
HTH!

Finding all files with certain extension in Unix?

I have a file /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home
I am trying to find out whether I have *.jdk anywhere else on my hard drive. So I run a search command:
find . -name "*.jdk"
But it doesn't find anything. Not even the one I know that I have. How come?
find . only looks under your current directory. If you have permission to look for files in other directories (root access), then you can use the following to find your file:
find / -type f -name "*.jdk"
If you are getting tons of permission denied messages then you can suppress that by doing
find / -type f -name "*.jdk" 2> /dev/null
a/
find . means, "find (starting in the current directory)." If you want to search the whole system, use find /; to search under /System/Library, use find /System/Library, etc.
b/
It's safer to use single quotes around wildcards. If there are no files named *.jdk in the working directory when you run this, then find will get a command-line of:
find . -name *.jdk
If, however, you happen to have files junk.jdk and foo.jdk in the current directory when you run it, find will instead be started with:
find . -name junk.jdk foo.jdk
… which will (since there are two) confuse it, and cause it to error out. If you then delete foo.jdk and do the exact same thing again, you'd have
find . -name junk.jdk
…which would never find a file named (e.g.) 1.6.0.jdk.
What you probably want in this context, is
find /System -name '*.jdk'
…or, you can "escape" the * as:
find /System -name \*.jdk
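A quick way to see exactly what the shell hands to find is to prefix the command with echo, for example in a scratch directory holding the two files mentioned above:
touch junk.jdk foo.jdk
echo find . -name *.jdk       # prints: find . -name foo.jdk junk.jdk
echo find . -name '*.jdk'     # prints: find . -name *.jdk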
Probably your JDKs are uppercase and/or the version of find available on OS X doesn't default to -print if no action is specified; try:
find . -iname "*.jdk" -print
(-iname is like -name but performs a case-insensitive match; -print tells find to print out the results)
--- EDIT ---
As noted by @Jaypal, obviously find . ... looks only into the current directory (and its subdirectories); if you want to search the whole drive you have to specify / as the search path.
The '.' you are using is the current directory. If you're starting in your home dir, it will probably miss the JDK files.
Worst case search is to start from root
find / -name '*.jdk' -o -name '*.JDK' -print
Otherwise, replace '/' with some path you are certain is a parent of your JDK files.
I hope this helps.
If you are in the Mac Terminal, and already in the directory where you want the search to be conducted, then this may also work for you:
find *.pdf
At least it worked for me.
find / -type f -name "*.jdk" works on Mac also
This works for me on macOS.
find . -type f -iname '*.jdk'
ls *.jpg | cut -f 1 -d "."
Sub out '.jpg' for whatever extension you want to list.
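A variant of the same idea that is not confused by extra dots elsewhere in the name, using only shell parameter expansion (a sketch; again substitute the extension you care about):
for f in *.jpg; do
    echo "${f%.jpg}"    # strip only the trailing .jpg
done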

Recursively CVS add files/directories and ignore existing CVS files

There's a similar post: How to add CVS directories recursively.
However, trying out some of the answers such as:
find . -type f -print0| xargs -0 cvs add
Gave:
cvs add: cannot open CVS/Entries for reading: No such file or directory
cvs [add aborted]: no repository
And
find . \! -name 'CVS' -and \! -name 'Entries' -and \! -name 'Repository' -and \! -name 'Root' -print0| xargs -0 cvs add
Gave:
cvs add: cannot add special file `.'; skipping
Does anyone have a more thorough solution to recursively adding new files to a CVS module? It would be great if I could alias it too in ~/.bashrc or something along those lines.
And yes, I do know that it is a bit dated but I'm forced to work with it for a certain project otherwise I'd use git/hg.
This may be a bit more elegant:
find . -type f -not -path '*/CVS/*' -print0 | xargs -0 cvs add
Please note that print0, while very useful for dealing with file names containing spaces, is NOT universal - it is not, for example, in Solaris's find.
find . -name CVS -prune -o -type f -print0
See this answer of mine to the quoted question for an explanation of why you get the "cannot open CVS/Entries for reading" error.
Two important things to keep in mind when looking at the other solutions offered here:
folders have to be added, too
in order to add a file or folder, the parent folder of that item must already have been added, so the order in which items are processed is very important
So, if you're just starting to populate your repository and you haven't yet got anything to check out that would create a context for the added files then cvs add cannot be used - or at least not directly. You can create the "root context" by calling the following command in the parent folder of the first folder you want to add:
cvs co -l .
This will create the necessary sandbox meta-data (i.e. a hidden "CVS" subfolder containing the "Root", "Repository" and "Entries.*" files) that will allow you to use the add command.
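Putting the two points above together, a rough sketch of a recursive add that respects that ordering might look like the following. It assumes the top-level sandbox metadata already exists (for example via the cvs co -l . step just described), and it relies on find visiting each directory before its contents:
#!/bin/sh
# Sketch only: add directories first (parents before children), then files,
# skipping the CVS metadata directories in both passes.
find . -mindepth 1 -name CVS -prune -o -type d -print | while read -r d; do
    cvs add "$d"
done
find . -name CVS -prune -o -type f -print | while read -r f; do
    cvs add "$f"
done
cvs add generally just warns about anything that is already under version control, so re-running this on a partially added tree should be harmless.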
