According to the Mac App Store Review Guidelines:
2.4.5 Apps distributed via the Mac App Store have some additional requirements to keep in mind:
(i) They must be appropriately sandboxed, and follow macOS File System Documentation. They should also only use the appropriate macOS APIs for modifying user data stored by other Apps (e.g. bookmarks, Address Book, or Calendar entries).
...
(iv) They may not download or install standalone apps, kexts, additional code, or resources to add functionality or significantly change the app from what we see during the review process.
(v) They may not request escalation to root privileges or use setuid attributes.
Sandboxing already precludes the use of APIs such as AuthorizationCreate(), and anyway, item (v) is pretty clear.
Certainly an app like, say, Parallels (MAS link) can't be coded without ever resorting to privilege escalation. Indeed, the regular (non-MAS) Parallels app installs at least 3 kexts, one of them being the hypervisor, without which I believe Parallels would be absolutely useless. So they are clearly violating these rules.
If a developer wished to write an app that, like Parallels, needs privilege escalation and is completely useless without it, how would the developer go about bypassing these restrictions? Or is it just a question of being big enough that Apple will turn a blind eye to this during the review process? Can you request an exception to Apple?
No comment on the App Store policy issue (unfortunately), but I can answer your question about Parallels. The version of Parallels on the Mac App Store does not use a kext, nor does it need to. The Hypervisor framework makes it possible to write a Parallels-like application without needing root privileges or writing and distributing a custom kext (which requires separate approval by Apple). The Hypervisor framework is also usable from sandboxed apps. I believe this framework was created specifically to work around this problem. Hope this helps!
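As an addendum, here is a minimal, hedged sketch (in Swift) of what using the framework looks like: a sandboxed process with the com.apple.security.hypervisor entitlement can create a VM context directly, with no kext and no root. The function name and error handling are illustrative only.

    import Hypervisor

    // Returns true if this process can create a Hypervisor.framework VM context.
    // Requires the com.apple.security.hypervisor entitlement; works inside the
    // App Sandbox and needs no elevated privileges.
    func canUseHypervisor() -> Bool {
        #if arch(arm64)
        let result = hv_vm_create(nil)            // Apple silicon signature
        #else
        let result = hv_vm_create(HV_VM_DEFAULT)  // Intel signature
        #endif
        guard result == HV_SUCCESS else { return false }
        hv_vm_destroy()                           // tear the context back down
        return true
    }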
Related
The short version: is it possible to delete helper tools which were set up by the app (SMJobBless() etc.) when the app is deleted? If so, how?
The long version:
The Mac app we are developing unfortunately requires admin privileges to perform an occasional operation, and it also requires a background task to be live for other apps' plugins to connect to even when the app itself isn't running (this one can be unprivileged). The app will be signed with a Developer ID certificate, and distributed only outside the App Store.
We'd like the app to be a "good citizen" as far as possible, also on uninstall.
For the background task, we're using a login item, created using SMLoginItemSetEnabled(). This isn't amazing, because XPC messaging doesn't seem to work (we're using CFMessagePort instead - alternative suggestions welcome), but if the user deletes the app, the login item at least doesn't get loaded anymore on next login. I suspect there's still a trace of it somewhere in the system, but the executable inside the .app bundle is used, and when that disappears, the login item no longer runs.
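For reference, the registration call itself is just one line; the identifier below is a placeholder for the helper app we embed at Contents/Library/LoginItems inside the main bundle:

    import ServiceManagement

    // Placeholder bundle identifier for the login item embedded at
    // Contents/Library/LoginItems/LoginHelper.app in the main app bundle.
    let loginHelperID = "com.example.MyApp.LoginHelper" as CFString

    // Pass true to enable the login item, false to disable it.
    if !SMLoginItemSetEnabled(loginHelperID, true) {
        NSLog("Failed to register the login item")
    }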
For the occasional operation requiring admin rights, we've got a privileged helper tool which our app installs using SMJobBless(), and which implements a named XPC service, so the task spins up on demand when it receives a message from the main app. This is what Apple recommends and describes in its Even Better Authorization Sample.
The helper executable is copied to /Library/PrivilegedHelperTools/ by SMJobBless(), and the embedded launchd.plist ends up in /Library/LaunchDaemons/. Even though the OS has the information on which app "owns" the helper, it doesn't seem to uninstall it when the user deletes the app. Apple's sample is silent on uninstalling, other than the uninstall.sh script, which is apparently intended to be used during development only. We don't need this helper while the app isn't running, so installing it as a full-blown launch daemon is slight overkill, but we'd also like to avoid repeatedly annoying the user with the password prompt. Besides, these days Apple advises against ways of running code with admin privileges other than SMJobBless() - for example, SMJobSubmit() is marked deprecated.
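For context, our installation call looks roughly like the sketch below; the label is a placeholder (ours matches the helper tool in Contents/Library/LaunchServices and its embedded launchd.plist), and Apple's sample additionally pre-acquires the kSMRightBlessPrivilegedHelper right before blessing:

    import Foundation
    import Security
    import ServiceManagement

    // "com.example.MyApp.Helper" is a placeholder label.
    func installPrivilegedHelper() -> Bool {
        var authRef: AuthorizationRef?
        guard AuthorizationCreate(nil, nil, [], &authRef) == errAuthorizationSuccess,
              let auth = authRef else { return false }
        defer { AuthorizationFree(auth, []) }

        // SMJobBless copies the helper from Contents/Library/LaunchServices to
        // /Library/PrivilegedHelperTools, registers its embedded launchd.plist,
        // and prompts the user for administrator credentials as needed.
        var error: Unmanaged<CFError>?
        let blessed = SMJobBless(kSMDomainSystemLaunchd,
                                 "com.example.MyApp.Helper" as CFString,
                                 auth,
                                 &error)
        if !blessed, let error = error?.takeRetainedValue() {
            NSLog("SMJobBless failed: \(error)")
        }
        return blessed
    }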
So how do we clean up after ourselves?
I've found SMJobRemove(), but (a) when would we call that in our case - you can't run code on .app bundle deletion, or can you? and (b) it doesn't actually seem to clean up.
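For reference, this is roughly how we're calling it (the label is a placeholder); it asks launchd to drop the job, but the files in /Library/PrivilegedHelperTools and /Library/LaunchDaemons stay behind:

    import Security
    import ServiceManagement

    // Placeholder label; must match the installed helper's launchd job.
    let helperLabel = "com.example.MyApp.Helper" as CFString

    var authRef: AuthorizationRef?
    if AuthorizationCreate(nil, nil, [], &authRef) == errAuthorizationSuccess {
        var error: Unmanaged<CFError>?
        // Unloads the launchd job, but leaves the helper binary and its
        // launchd.plist on disk.
        if !SMJobRemove(kSMDomainSystemLaunchd, helperLabel, authRef, true, &error),
           let error = error?.takeRetainedValue() {
            NSLog("SMJobRemove failed: \(error)")
        }
    }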
The only 2 things I can think of are not terribly satisfying:
Some kind of uninstaller app or script. But that seems pretty ugly too.
Don't worry about it and just leave a mess behind when the user deletes our app.
Update:
There have been some changes in this area with macOS 13.0 Ventura; there's an introduction to the new mechanism in the WWDC22 session 'What’s new in privacy'. The new SMAppService APIs support automatic cleanup for daemons, agents and login items. Unfortunately, you'll still have to find a workaround for any older macOS versions you support.
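On Ventura and later, registration looks roughly like the sketch below (the plist name is a placeholder; the file is embedded in the app bundle, and daemons still need one-time user approval in System Settings):

    import ServiceManagement

    if #available(macOS 13.0, *) {
        // Placeholder plist name; the file lives at
        // Contents/Library/LaunchDaemons/com.example.MyApp.Helper.plist
        // inside the app bundle.
        let daemon = SMAppService.daemon(plistName: "com.example.MyApp.Helper.plist")
        do {
            try daemon.register()
        } catch {
            NSLog("Daemon registration failed: \(error)")
        }
        // When the app is deleted, the system unregisters the daemon for you.
    }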
Original answer:
There has been a similar question on the Apple Developer Forums at https://forums.developer.apple.com/thread/66821 - the recommendation by Apple is a manual uninstall mechanism, and consuming as few resources as possible if the user does not do this.
Apple DTS staff further recommended implementing a self-uninstall mechanism in the privileged launch daemon, to be triggered from the app via XPC. This is what we're going with.
I think the only solution you have right now is to use the uninstall shell code that you mentioned in order to physically remove the privileged helper from disk, or to build an uninstaller for it. Either way you will have to ask the user to enter their password. This is what all installers/uninstallers that require privileged access to the system do, and for a very good reason. That's why I avoid using privileged helpers like the plague, but I understand that sometimes you really have to. I don't think it's good to leave such a helper on the user's system, because it will be loaded again the next time the user starts up the computer.
I just checked the ServiceManagement.h header, and it states that SMJobRemove will be replaced by an API made available through libxpc in the future. (Sometimes you really need to go to the headers to get extra information that the documentation doesn't give you.) Hopefully this promised replacement will uninstall the helper for us. In any case, I'd file a bug report and ask for that enhancement.
One solution you could consider is to include an uninstaller script or program in your .app bundle.
You can then pass the path of this small tool to your helper tool (via IPC) and have it execute the uninstaller, thereby deleting itself. You will have to be careful that components are removed in the right order, but it can be made to work.
You're correct that Apple does not provide an API to uninstall a helper tool installed with SMJobBless, nor do they do so automatically. As for why macOS doesn't do an automatic uninstall, my educated guess is that it's because macOS fundamentally doesn't have a unified concept of "install". While it's conventional for apps to be located in /Applications (and a few other locations), it's perfectly valid for apps to be located and run from anywhere on the system, including external drives and network drives. For example, should macOS uninstall helper tools when apps disappear because the drive they're on is disconnected?
In terms of how to uninstall, doing so requires root permission, so realistically having the helper tool itself do the uninstall is the easiest option. You can have your app tell the helper via XPC to uninstall itself. Here's an example in Swift of how to do this; it's part of SwiftAuthorizationSample. The basic idea is:
Use the launchctl command line tool to unload the helper tool
Delete the helper tool executable
Delete the helper tool launchd plist
But there's a bit of additional complexity involved because launchctl won't let you unload a running process.
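To make the ordering concrete, here is a hedged sketch of a self-uninstall routine inside the helper; the label and paths are placeholders following the standard SMJobBless layout, not the sample's exact code. The file deletions come first because asking launchd to drop the job terminates the helper itself:

    import Foundation

    // Runs inside the privileged helper (as root), so it may delete its own files.
    func uninstallHelper(label: String = "com.example.MyApp.Helper") throws {
        let fm = FileManager.default

        // 1. Remove the on-disk artifacts while this process is still running.
        try fm.removeItem(atPath: "/Library/PrivilegedHelperTools/\(label)")
        try fm.removeItem(atPath: "/Library/LaunchDaemons/\(label).plist")

        // 2. Ask launchd to remove the job last: this terminates the helper,
        //    so nothing after this call is guaranteed to execute.
        let launchctl = Process()
        launchctl.executableURL = URL(fileURLWithPath: "/bin/launchctl")
        launchctl.arguments = ["remove", label]
        try launchctl.run()
    }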
I became curious about this while getting to know more about programming WinRT apps. Normally, for regular Windows programs, developers can use system headers like WinINet.h or WinHttp.h, etc. However, it is not allowed to use them in a WinRT Store app.
I was wondering what mechanism they use to prevent developers from using those DLLs, how they check, and why they do this to developers?
Thank you
All Windows Store apps run inside an AppContainer. All of the DLLs and libraries that they are allowed to import/use are limited by the rights of the AppContainer. Generally, I believe the apps run with the rights of "ALL_APPLICATION_PACKAGES", which means they have the read/write capabilities granted to that group. They are still only able to access the system via the AppContainer, so any limitations it imposes still apply.
Now, there is a way around this. The developer can use the FileOpenPicker to allow the user to choose a file or folder (via the FolderPicker) that the program can use.
They do this so that the AppContainer is a very closed-off sandbox, running in essentially a virtualized environment. That way, simply by virtue of a program running inside the AppContainer, it can be said to protect the user's data and prevent the installation of malware.
This does not necessarily prevent an app from social engineering or phishing. That is policed via the certification and compliance system.
Microsoft provides a method to get an ASHWID (App Specific Hardware ID) which has many components, one of which is some kind of BIOS ID.
http://msdn.microsoft.com/en-us/library/windows/apps/jj553431.aspx
Does that BIOS ID change if the user upgrades the BIOS on their computer?
RANT:
I don't understand why Microsoft makes getting a unique ID for an OS installation so complicated for Windows Store Apps. Android is so simple, when the Android OS boots the very first time they generate a GUID (see http://developer.android.com/reference/android/provider/Settings.Secure.html#ANDROID_ID) that never changes. Why doesn't Microsoft do this?
First of all, that value changes with factory reset in Android.
Second of all, try checking out EasClientDeviceInformation.Id. It uses a combination of the MachineId (local user group SID), the UserId, and the Package Id, but doesn't give you the whole thing.
Thirdly, this is good because giving devs direct access to hardware IDs to generate things like crypto keys can lead to other apps getting your app's keys, as has happened on many iOS devices. By limiting which device IDs you can use, they're making it so that you must use a system resource that is only accessible from inside a given application, with a given package Id, from inside that user's account. The Windows Store is very strongly sandboxed for exactly this reason. While it can be a pain sometimes, it makes the platform much more secure, which is a huge boon.
I have a requirement to create a cross-platform application that launches a web link to a feedback form when it's uninstalled.
This is obviously a normal sort of behaviour on Windows, but on a Mac it is proving to be more complicated, as applications are not technically installed and uninstalled in the Windows sense: you just copy the .app file into Applications and delete it when you're finished.
How can I achieve this website launching requirement? (Should I even be trying, is this process too alien to Mac users?)
I tried packaging the application with an uninstall shell script that deletes the app and launches the site, but obviously the script can't delete itself.
I don't think this is the best idea, since the process would be a bit unusual to OS X users. As you noted, most applications are installed simply by dragging a .app file to /Applications (or some other location). Some apps do have an installer, but even apps with an installer only occasionally have an uninstaller; and furthermore, as a Mac user, I'd be immediately suspicious of an app that installed itself and some sort of unknown shell script.
Mac OS applications should not need to be uninstalled in any way other than the user dragging them into the trash.
Also, I would rethink very carefully your plan to make a cross-platform application. Cross-platform applications that treat Mac OS as an afterthought and try to push foreign paradigms onto Mac OS are really irritating. If you want a Mac client, keep your backend code, but rewrite the front-end from scratch. Don't use something like Qt, no matter how tempting the portability is.
So, long story short, you're right. The process is alien to Mac users (except for things like plugins). So my suggestion is just to go with the normal Mac OS behavior (drag to trash). Best of luck!
I would recommend against it. You could create an uninstaller but nothing is going to stop a user from just deleting it from the application folder or using something like AppZapper. Most people don't even look for an uninstaller application, they just trash the app, so even if you wrote one there would be no guarantee it'll be used.
I'd certainly avoid an uninstaller shell script, no way in the world I'd personally run it.
An uninstaller on a Mac makes no sense and would be awkward to implement, if you could even implement it at all in a way where people would use it.
Consider trying to get user feedback using alternate methods, such as:
Add a menu item that opens the feedback form
Require registration when the software is downloaded, then send an email to the user at some point in the future to ask for feedback
Ask for feedback occasionally on application quit (could be annoying, though)
I don't think that it's a good idea to ask for feedback when the app is uninstalled. However, here is a good way to provide an uninstaller for a macOS app, in case it needs to do some cleanup.
Update: Since my development machine has moved to Vista, I now automatically test as a standard user. And with XP being phased out, this question isn't so relevant anymore.
Since the Windows 2000 logo requirements, Microsoft has been requiring that applications run as standard user. Like everyone else, I always ran my desktop as an administrative user. And like every developer: I log in, develop, run, and test as an administrative user.
Now with a new push to finally support standard users, I've been testing my applications by running them as a normal user - either through RunAs, or having my application relaunch itself with normal rights using [SaferCreateLevel][1]/[SaferComputeTokenFromLevel][2] if it detects it is running as an administrator. I quickly see how spectacularly some of my apps fail under Windows XP as a standard user (due to my own stupidity). I also see how the same applications work fine under Vista (thanks to its numerous shims that fix my bugs for me).
Aside: It's ironic that applications are more likely to run on Vista as a standard user than on XP.
The question is: do you test your applications for standard user compatibility? Do you develop as a standard user on XP? Do you ignore standard user access and hope for the best?
As a bonus, I tried to have my app relaunch itself as a limited user (rather than a normal user). It doesn't even come up - Windows says it failed to initialize. So that's an area of future research on my part: making the app support even the limited user level.
I specifically referred to standard users on XP rather than Vista to emphasize that Vista is no different from XP as far as compatibility is concerned. Anyone whose app fails on Vista must realize that it also fails on XP.
I'm going to point you to Crispin Cowan's "Best Practices for Developing for Windows Standard User" talk. It's well worth watching.
If you want to sell your application to businesses, then yes, you must test your application running as a standard user. If your application can't run without administrative privileges, that's going to doom any sale into a business.
Even in the home market, plenty of people can and do use limited users to go about their daily activities; I know I do.
Even administrative applications that legitimately need administrative privileges should behave sensibly when running as a limited user. They should pop up a dialog informing the user that administrative rights are required to complete whatever task they were attempting.
The best way to build software that respects these limitations is to develop your software under a user that has limited privileges. That way, every time you develop a feature you're implicitly testing whether it will work in a limited environment.
None of this is hard, it just takes a degree of discipline - just like all quality assurance procedures do. People have been developing as non-root users on *nix for decades. Windows development is behind the curve in this respect.
Crispin, in his PDC talk, made a very good point, one that I had never considered before.
Google Chrome installs as a standard user: it installs into a per-user folder, without needing a UAC or OTS prompt, and everything is user-friendly because the install is so easy. Unfortunately, it is installed in a per-user folder, where the user can modify it.
Put it another way: malware can modify the Chrome exe.
Chrome would then become the biggest target for any malware. And if some malware does modify it, Chrome is now sending your usernames, passwords, and credit card info back to home base, because that's what the modified Chrome exe does.
That is why you sometimes want applications installed to protected locations.
Edit: The entire Microsoft "ClickOnce" deployment initiative suffers from the same danger.
I run on XP as a limited user almost all of the time and as the default. (On Vista, I use an administrative account and rely on UAC.)
I develop as a limited user. There's very little in Java and Visual Studio development that requires any more privilege than that.
If I need to run something under the limited account but with administrative privileges, I use a MakeMeAdmin (renamed and tuned as ConsoleMeAdmin) .bat script that creates an administrative console session.
If I really need to be an administrator in order to do installs and do first-time-runs so my security software can condition itself to allow network access to the new code (or not), etc., I will elevate my Limited User Account to Administrator long enough to get all of that done, then restart the account as Limited User again. Other than for Windows Updates, I do all of my downloads as a limited user and then install off-line after elevation to Administrator.
Because I only have a small workgroup LAN with no Active Directory, the only useful account types are Administrator and Limited User on XP. (I tried power user when I first began using XP but found that I could do without it and I prefer what that teaches me about not depending on special privileges in code I build.)
[PS: I also have Data Execution Protection (supported in hardware) active by default on my XP system, and you'd be surprised what that turns up.]
In the business environment, most users are standard Windows domain users.
To ignore standard user compliance tests is a really bad move.
You will make every domain administrator who has to install your application very angry, and they will go to your competition.
IMHO developing in an administrator account is not only unnecessary, but also highly dangerous! Suppose you check something on the internet while developing (stackoverflow comes to mind) and you catch some malware - history shows that this is far easier than you might have thought, e.g. through banners. As an administrator this malware will infect your computer and you might never get rid of it. It can even be a danger to all your development work (think of industrial espionage)!
If you have to run/test anything as an administrator, use either runas or, even better, virtual machines - that way you can use separate systems with defined behaviour (lots of problems with Windows software come from libraries that are available on the developer's PC but hardly anywhere else!). With Microsoft Virtual PC and VMware Server both free, the price of virtualization software is no longer an excuse.
I developed some Windows apps a few years ago, and besides their installers NOTHING ever required administrative rights. The run-time settings always belong to the user, not to the machine.
And yes, I run Windows XP as a normal user at home too, as do my family members (parents etc.). Sometimes a crappy piece of software needs write access to its installation folder, but 95% of all installed apps run fine out of the box these days.
Yes, we test that.
Probably the simplest, but most abused, rule is that you shouldn't do anything that requires write access to your program's install folder. Instead, there's a special folder called Application Data for that kind of thing.
Yes, and I took the general advice that it's much easier to get your application to run on Vista if it runs OK on XP as a limited user. To achieve that, and to know whether there were any problems running as a limited user, I used LUABuglight.
I generally don't develop as a limited user but only log on as a limited user for testing.
The number of programs that require Admin rights and write to their own Program Files folder is amazing. To be honest, I've found very few programs that run correctly as limited user, from any software company, big or small.
Anyone else find it funny that Windows developers think it's normal to run as Admin (apparently), but Linux developers pretty much never run as root?
As an old-time BOFH I will rain fire and ugly words over anyone asking for elevated rights for their client-side applications to run properly. It's just out of the question, always was ever since around 2001-2002 when we switched from Win9x to XP (sic).
As a newly minted developer in a place where everyone on XP is a local admin by a forced group policy - and changing that seems to take time, and no one is especially inclined to start either - I've installed the RunAsAdmin shim that lowers me down to a normal user for most tasks, including developing, much like in Vista. Recommended if you're stuck as a local admin on XP ^^