UPDATE:
OK, for anyone else who is seeking advice on this issue...
So far, the best thing I have found is to download yourself a copy of pGina: for 2k/XP you modify the GINA, and for Vista/Win7 you will need to create a custom credential provider (pGina has the tools/samples to interface with the Vista/Win7 architecture).
To confirm: this appears to be what Novell is doing with Vista/Win7, rather than the traditional method of replacing the GINA (as in 2k/XP).
If anyone else comes up with a way to run an application on the logon screen in Win7, please post it.
OK, I'm writing some VB6 software that requires input before the user logs on to the system.
Basically, I want to run an application on the Windows logon screen where the user can interact with the program. At present I have the application running as a service with "Allow service to interact with desktop" enabled, but it is still not showing.
I know that "Allow service to interact with desktop" works in Windows 2000 / XP, but I'm running Windows 7. I'm also aware that services cannot directly interact with a user as of Windows Vista. That said, are there any other methods to get my application running on the logon screen? Novell does it.
Does anyone have any other ideas to get this working?
You can only do this if you are authenticating the credentials yourself. Prior to Vista, this was done via GINA, but since Vista, you need to write your own Credential Provider.
The reasons behind this are buried deep in the security principles -- Ctrl-Alt-Del will only ever bring up the window station associated with login (etc), and no other application can get to that window station (so you can't create a fake password box over the top to scrape passwords, for instance).
Without knowing why you think your service needs to interact with that desktop, it's difficult to advise further, but it might mean that your design is broken somehow.
Service isolation will probably prevent you doing this from a service.
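For anyone heading down that road: a Credential Provider is a COM object implementing ICredentialProvider that LogonUI loads at the logon screen. The skeleton below is only a sketch; the class name is made up, the COM server plumbing (DllGetClassObject, registration code) is omitted, and a real provider also needs an ICredentialProviderCredential implementation:

    // Bare Credential Provider skeleton (Vista/Win7); not a working provider.
    #include <windows.h>
    #include <credentialprovider.h>

    class CSampleProvider : public ICredentialProvider
    {
        LONG m_cRef;
    public:
        CSampleProvider() : m_cRef(1) {}

        // IUnknown
        IFACEMETHODIMP QueryInterface(REFIID riid, void **ppv)
        {
            if (riid == __uuidof(IUnknown) || riid == __uuidof(ICredentialProvider))
            {
                *ppv = static_cast<ICredentialProvider*>(this);
                AddRef();
                return S_OK;
            }
            *ppv = NULL;
            return E_NOINTERFACE;
        }
        IFACEMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&m_cRef); }
        IFACEMETHODIMP_(ULONG) Release()
        {
            LONG c = InterlockedDecrement(&m_cRef);
            if (!c) delete this;
            return c;
        }

        // ICredentialProvider - the stubs below are where the real work goes.
        IFACEMETHODIMP SetUsageScenario(CREDENTIAL_PROVIDER_USAGE_SCENARIO cpus, DWORD)
        { return (cpus == CPUS_LOGON) ? S_OK : E_NOTIMPL; }
        IFACEMETHODIMP SetSerialization(const CREDENTIAL_PROVIDER_CREDENTIAL_SERIALIZATION*)
        { return E_NOTIMPL; }
        IFACEMETHODIMP Advise(ICredentialProviderEvents*, UINT_PTR) { return S_OK; }
        IFACEMETHODIMP UnAdvise() { return S_OK; }
        IFACEMETHODIMP GetFieldDescriptorCount(DWORD *pdwCount)
        { *pdwCount = 0; return S_OK; } // number of logon UI fields
        IFACEMETHODIMP GetFieldDescriptorAt(DWORD, CREDENTIAL_PROVIDER_FIELD_DESCRIPTOR**)
        { return E_INVALIDARG; }
        IFACEMETHODIMP GetCredentialCount(DWORD *pdwCount, DWORD *pdwDefault, BOOL *pbAutoLogon)
        { *pdwCount = 0; *pdwDefault = CREDENTIAL_PROVIDER_NO_DEFAULT;
          *pbAutoLogon = FALSE; return S_OK; }
        IFACEMETHODIMP GetCredentialAt(DWORD, ICredentialProviderCredential**)
        { return E_INVALIDARG; }
    };

    // Besides normal COM registration, LogonUI only picks the DLL up if its
    // CLSID is listed under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\
    // Authentication\Credential Providers.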
Pre-Vista, Novell and the like would probably have used GINA, which was replaced in Vista: http://msdn.microsoft.com/en-gb/magazine/cc163489.aspx
The only way I know of would be to write your own msgina.dll.
It can get dicey during testing, though. Any mistakes can mess up your OS so badly that a complete reinstall becomes necessary.
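On 2k/XP the usual starting point was a "pass-through" GINA: a DLL that exports the Wlx* functions, forwards everything to the real msgina.dll, and hooks only what it cares about. A rough sketch of the first two exports, assuming the standard winwlx.h prototypes (every other Wlx* export has to be forwarded the same way, and the functions need to be exported undecorated, e.g. via a .def file):

    // Pass-through GINA sketch (Windows 2000/XP only; GINA is gone in Vista).
    #include <windows.h>
    #include <winwlx.h>

    typedef BOOL (WINAPI *PWLXNEGOTIATE)(DWORD, DWORD*);
    typedef BOOL (WINAPI *PWLXINITIALIZE)(LPWSTR, HANDLE, PVOID, PVOID, PVOID*);

    static HMODULE        g_hMsGina;
    static PWLXNEGOTIATE  g_pfnNegotiate;
    static PWLXINITIALIZE g_pfnInitialize;

    extern "C" BOOL WINAPI WlxNegotiate(DWORD dwWinlogonVersion, DWORD *pdwDllVersion)
    {
        // Load the real GINA and forward the call.
        g_hMsGina = LoadLibraryW(L"msgina.dll");
        if (!g_hMsGina) return FALSE;
        g_pfnNegotiate  = (PWLXNEGOTIATE)GetProcAddress(g_hMsGina, "WlxNegotiate");
        g_pfnInitialize = (PWLXINITIALIZE)GetProcAddress(g_hMsGina, "WlxInitialize");
        if (!g_pfnNegotiate || !g_pfnInitialize) return FALSE;
        return g_pfnNegotiate(dwWinlogonVersion, pdwDllVersion);
    }

    extern "C" BOOL WINAPI WlxInitialize(LPWSTR lpWinsta, HANDLE hWlx, PVOID pvReserved,
                                         PVOID pWinlogonFunctions, PVOID *pWlxContext)
    {
        // This is the hook point for running your own UI before delegating.
        return g_pfnInitialize(lpWinsta, hWlx, pvReserved,
                               pWinlogonFunctions, pWlxContext);
    }

    // Installed by pointing HKLM\SOFTWARE\Microsoft\Windows NT\
    // CurrentVersion\Winlogon\GinaDLL at this DLL - hence the reinstall
    // risk mentioned above if it crashes.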
I made an updater which runs silently on XP and works just fine. But when it comes to Vista, the idea of a silent installation is ruined when UAC prompts the user to allow or cancel the program.
Is there anything at all we can do about this?
Thanks...
I know this post is old... 4 months old, to be exact. But actually, yes, it is very much possible. I wish to correct the people above.
Just add this line to your NSIS script.
RequestExecutionLevel user
This line tells Windows Vista and Windows 7 that the program does not require administrative access, which Vista/7 would otherwise assume it does.
Unfortunately, there's no way around this. UAC is intended specifically to prevent this type of thing, where programs install software or make changes to the machine without the user's awareness.
This is effectively a side effect of UAC and user permissions. From a security perspective, it does make sense.
If this is something you need to do, you should look to implement a system that is designed to run patching and deployments with elevated permissions. Microsoft's own Systems Management Server would do the trick, but is obviously quite a large-scale solution!
You can read about it here.
UAC for non-MSI installs is a bit of a grey area, with signed MSI packages things get much easier and less confusing for the user.
You might want to take a look at ClickOnce deployment, which may solve some of your problems.
Actually, it is possible, under very specific circumstances. Specifically, a service can launch an installer in a user session with full privileges, bypassing the UAC prompt (the service already has those privileges).
Of course, this requires the user to have already installed your service, which DOES require admin approval.
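The pattern behind that usually looks something like this sketch, assuming a service running as LocalSystem (which holds the SE_TCB privilege that WTSQueryUserToken requires). Error handling is trimmed, and note that on Vista the token returned is the filtered one; the elevated linked token would have to be fetched via GetTokenInformation(TokenLinkedToken) if "full privileges" really means full:

    // Sketch: a LocalSystem service starting a process on the logged-on
    // user's desktop in the console session.
    #include <windows.h>
    #include <wtsapi32.h>
    #pragma comment(lib, "wtsapi32.lib")

    BOOL LaunchInConsoleSession(LPWSTR szCmdLine)
    {
        DWORD sessionId = WTSGetActiveConsoleSessionId();
        HANDLE hToken = NULL, hDup = NULL;

        // Grab the logged-on user's primary token (needs SE_TCB).
        if (!WTSQueryUserToken(sessionId, &hToken))
            return FALSE;

        // Duplicate it so it can be handed to CreateProcessAsUser.
        if (!DuplicateTokenEx(hToken, MAXIMUM_ALLOWED, NULL,
                              SecurityIdentification, TokenPrimary, &hDup))
        {
            CloseHandle(hToken);
            return FALSE;
        }

        STARTUPINFOW si = { sizeof(si) };
        si.lpDesktop = (LPWSTR)L"winsta0\\default"; // the user's desktop
        PROCESS_INFORMATION pi = { 0 };

        BOOL ok = CreateProcessAsUserW(hDup, NULL, szCmdLine, NULL, NULL,
                                       FALSE, CREATE_NEW_CONSOLE, NULL, NULL,
                                       &si, &pi);
        if (ok) { CloseHandle(pi.hThread); CloseHandle(pi.hProcess); }
        CloseHandle(hDup);
        CloseHandle(hToken);
        return ok;
    }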
Update: Since my development machine has moved to Vista, I now automatically test as a standard user. And with XP being phased out, this question isn't so relevant anymore.
Since the Windows 2000 logo requirements, Microsoft has been requiring that applications run as standard user. Like everyone else, I always ran my desktop as an administrative user. And like every developer: I log in, develop, run, and test as an administrative user.
Now, with a new push to finally support standard users, I've been testing my applications by running them as a normal user - either through RunAs, or by having my application relaunch itself with normal rights using SaferCreateLevel/SaferComputeTokenFromLevel if it detects it is running as an administrator. I quickly saw how spectacularly some of my apps fail under Windows XP as a standard user (due to my own stupidity). I also saw how the same applications work fine under Vista (thanks to its numerous shims that fix my bugs for me).
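For anyone curious, that relaunch-with-reduced-rights step looks roughly like the sketch below, using the Safer API from winsafer.h; error handling is trimmed, and the command line is whatever you relaunch yourself with:

    // Sketch: relaunch with a plain-user token computed via the Safer API,
    // for use when the process detects it is running as an administrator.
    #include <windows.h>
    #include <winsafer.h>

    BOOL RelaunchAsNormalUser(LPWSTR szCmdLine)
    {
        SAFER_LEVEL_HANDLE hLevel = NULL;
        HANDLE hToken = NULL;

        // Build a "normal user" authorization level...
        if (!SaferCreateLevel(SAFER_SCOPEID_USER, SAFER_LEVELID_NORMALUSER,
                              SAFER_LEVEL_OPEN, &hLevel, NULL))
            return FALSE;

        // ...and compute a restricted token from the current (admin) token.
        BOOL ok = SaferComputeTokenFromLevel(hLevel, NULL, &hToken, 0, NULL);
        SaferCloseLevel(hLevel);
        if (!ok)
            return FALSE;

        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = { 0 };
        ok = CreateProcessAsUserW(hToken, NULL, szCmdLine, NULL, NULL,
                                  FALSE, 0, NULL, NULL, &si, &pi);
        if (ok) { CloseHandle(pi.hThread); CloseHandle(pi.hProcess); }
        CloseHandle(hToken);
        return ok; // on success the elevated instance can simply exit
    }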
Aside: It's ironic that applications are more likely to run on Vista as a standard user than on XP.
The question is: do you test your applications for standard-user compatibility? Do you develop as a standard user on XP? Do you ignore standard-user access and hope for the best?
As a bonus, I tried to have my app relaunch itself as a limited user (rather than a normal user). It doesn't even come up - Windows says it failed to initialize. So there's an area of future research on my part: making the app support the limited-user level as well.
I specifically referred to standard users on XP rather than Vista to reinforce the point that Vista is no different from XP as far as compatibility is concerned. Anyone who says their app fails on Vista must realize it also fails on XP.
I'm going to point you to Crispin Cowan's "Best Practices for Developing for Windows Standard User" talk. It's well worth watching.
If you want to sell your application to businesses, then yes, you must test your application running as a standard user. If your application can't run without administrative privileges, that's going to doom any sale into a business.
Even in the home market, plenty of people can and do use limited users to go about their daily activities; I know I do.
Even administrative applications that legitimately need administrative privileges should behave sensibly when running as a limited user. They should pop up a dialog informing the user that administrative rights are required to complete whatever task it was they were attempting.
The best way to build software that respects these limitations is to develop your software under a user that has limited privileges. That way, every time you develop a feature you're implicitly testing whether it will work in a limited environment.
None of this is hard, it just takes a degree of discipline - just like all quality assurance procedures do. People have been developing as non-root users on *nix for decades. Windows development is behind the curve in this respect.
Crispin, in his PDC talk, made a very good point, one that I had never considered before.
Google Chrome installs as a standard user: it goes into a per-user folder, without needing a UAC or over-the-shoulder (OTS) prompt, and everything is user-friendly because the install is so easy. Unfortunately, because it is installed in a per-user folder, the user can modify it.
Put another way: malware can modify the Chrome exe.
Chrome would then become the biggest target for any malware. And if some malware does modify it, Chrome is now sending your usernames, passwords, and credit-card info back to home base, because that's what the new Chrome exe does.
That is why you sometimes want applications installed to protected locations.
Edit: The entire Microsoft "ClickOnce" deployment initiative suffers the same danger.
I run on XP as a limited user almost all of the time and as the default. (On Vista, I use an administrative account and rely on UAC.)
I develop as a limited user. There's very little in Java and Visual Studio development that requires any more privilege than that.
If I need to run something under the limited account but with administrative privileges, I use a MakeMeAdmin (renamed and tuned as ConsoleMeAdmin) .bat script that creates an administrative console session.
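MakeMeAdmin itself is a .bat built around runas and net localgroup, but the core trick (add the account to Administrators, start a fresh logon so the new token picks up the group, then revoke the membership) can be sketched in C++ too. This is a rough illustration only: the machine/account names and password are placeholders, and the add/remove steps must themselves run with administrative rights:

    // Rough C++ illustration of the MakeMeAdmin idea (the real tool is a .bat).
    #include <windows.h>
    #include <lm.h>
    #pragma comment(lib, "netapi32.lib")

    int wmain()
    {
        LOCALGROUP_MEMBERS_INFO_3 m = { (LPWSTR)L"MYPC\\limiteduser" }; // placeholder
        // Temporarily add the limited account to local Administrators.
        if (NetLocalGroupAddMembers(NULL, L"Administrators", 3, (LPBYTE)&m, 1)
                != NERR_Success)
            return 1;

        // A *fresh* logon is needed for the token to carry the new group.
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi = { 0 };
        WCHAR cmd[] = L"cmd.exe";
        if (CreateProcessWithLogonW(L"limiteduser", L"MYPC", L"password", // placeholders
                                    LOGON_WITH_PROFILE, NULL, cmd, 0,
                                    NULL, NULL, &si, &pi))
        {
            WaitForSingleObject(pi.hProcess, INFINITE); // administrative console
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        }

        // Revoke the membership once the console session ends.
        NetLocalGroupDelMembers(NULL, L"Administrators", 3, (LPBYTE)&m, 1);
        return 0;
    }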
If I really need to be an administrator in order to do installs and do first-time-runs so my security software can condition itself to allow network access to the new code (or not), etc., I will elevate my Limited User Account to Administrator long enough to get all of that done, then restart the account as Limited User again. Other than for Windows Updates, I do all of my downloads as a limited user and then install off-line after elevation to Administrator.
Because I only have a small workgroup LAN with no Active Directory, the only useful account types are Administrator and Limited User on XP. (I tried Power User when I first began using XP but found that I could do without it, and I prefer what that teaches me about not depending on special privileges in code I build.)
[PS: I also have Data Execution Protection (supported in hardware) active by default on my XP system, and you'd be surprised what that turns up.]
In the business environment most users are standard windows domain users.
To ignore standard user compliance tests is a really bad move.
And you will make every domain administrator who has to install your application very angry, and they will go to your competition.
IMHO developing in an administrator account is not only unnecessary, but also highly dangerous! Suppose you check something on the internet while developing (stackoverflow comes to mind) and you catch some malware - history shows that this is far easier than you might have thought, e.g. through banners. As an administrator this malware will infect your computer and you might never get rid of it. It can even be a danger to all your development work (think of industrial espionage)!
If you have to run/test anything as an administrator, use either runas or even better virtual machines - that way you can use separate systems with defined behaviour (lots of problems with Windows software come from libraries that are of course available on the developer's PC, but hardly anywhere else!). In times of Microsoft Virtual PC and VMWare Server (both free) there isn't even an excuse due to high prices for virtualization software.
I developed some Windows apps some years ago, and besides their installers NOTHING ever required administrative rights. The run-time settings always belong to the user, not to the machine.
And yes, I run Windows XP as a normal user at home too, as do my family members (parents etc.). Sometimes a crappy piece of software needs write access to its installation folder, but 95% of all installed apps run fine out of the box today.
Yes, we test that.
Probably the simplest, but most abused, rule is that you shouldn't do anything that requires write access to your program's install folder. Instead, there's a special folder called Application Data for that kind of thing.
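In Win32 terms that means resolving the per-user folder at run time instead of writing next to your exe; a small sketch (the MyApp folder and file names are just placeholders):

    // Sketch: write settings under the per-user Application Data folder
    // rather than the program's install directory. Names are placeholders.
    #include <windows.h>
    #include <shlobj.h>
    #include <stdio.h>
    #pragma comment(lib, "shell32.lib")

    int wmain()
    {
        WCHAR path[MAX_PATH];
        // Resolves to e.g. C:\Documents and Settings\<name>\Application Data
        // on XP, or C:\Users\<name>\AppData\Roaming on Vista and later.
        if (FAILED(SHGetFolderPathW(NULL, CSIDL_APPDATA, NULL,
                                    SHGFP_TYPE_CURRENT, path)))
            return 1;

        wcscat_s(path, MAX_PATH, L"\\MyApp");       // placeholder app folder
        CreateDirectoryW(path, NULL);               // fine if it already exists
        wcscat_s(path, MAX_PATH, L"\\settings.ini");

        // A standard user can always write here; Program Files is read-only.
        FILE *f = _wfopen(path, L"w");
        if (f) { fputws(L"[settings]\n", f); fclose(f); }
        return f ? 0 : 1;
    }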
Yes, and I took the general advice that it's much easier to get your application to run on Vista if it runs OK on XP as a limited user. To achieve that, and to find out whether there were any problems running as a limited user, I used LUABuglight.
I generally don't develop as limited user but only log on as limited user for testing.
The number of programs that require Admin rights and write to their own Program Files folder is amazing. To be honest, I've found very few programs that run correctly as limited user, from any software company, big or small.
Anyone else find it funny that Windows developers think it's normal to run as admin (apparently), but Linux developers pretty much never run as root?
As an old-time BOFH I will rain fire and ugly words over anyone asking for elevated rights for their client-side applications to run properly. It's just out of the question, always was ever since around 2001-2002 when we switched from Win9x to XP (sic).
As a newly born developer in a place where everyone on XP is a local admin by a forced group policy - and changing it seems to take time, and no one is especially inclined to start either - I've installed the RunAsAdmin shim, which lowers me down to a normal user for most tasks, including developing - much like in Vista. Recommended if you're stuck as a local admin on XP ^^
Question says it all really...
I have tried changing the "Allow non-admin users to run this program" setting on the property pages, and have also given the non-admin user in question what looks like the correct privileges in Component Services -> DCOM Config.
Is there anything else I can do ?
This is on Server2003 BTW.
Thanks
Matt
http://support.microsoft.com/kb/183607
I saw your question a few days ago but didn't answer, because all I have is something for you to try. I expected someone else with a knowledgeable answer to respond, but since you still have no answers, I'll tell you the little bit I know. When our tech support department installs our app onto a computer running XP or Vista, they log in with an administrator account the first time they run the app. Apparently that allows whatever needs to happen with the ActiveX DLLs to work. After that, the users can log in with their regular account and the app is still happy.
I didn't upgrade to Vista until May or so and one of the things I've always heard developers I know in real life say is "first thing you should do is turn off that UAC crap"
Well, I've left it on this whole time, for a few reasons. First, just as a failsafe in case I do something idiotic like have a momentary lapse of reason and run an attachment from an email, or in case I view a site which hits some unpatched exploit. Second, as a bit of an experiment to see how good or bad it really is.
Finally, I figure that it enforces some better practices. I used to develop every website in Windows directly in inetpub\wwwroot (Visual Studio .NET 2003 more or less required this) but now I develop them elsewhere because the UAC clickfest is a nightmare. I figure this is Microsoft's way of saying "you should really be doing it this way".
By way of another analogy - if you wrote a web app which runs on XP and 2000 just fine but requires 50 different security features of Server 2003 to be turned off, the real solution might be instead to just fix the application such that it doesn't require the security features to be turned off.
But now I'm having to work with an app which is really really NOT designed to be developed outside of inetpub/wwwroot and so UAC is really a nuisance. It's beyond the scope of the project to rectify this. I want to stick to my guns and leave UAC on but I'm also worried about being so autopilot about clicking "Yes" or "Allow" three times every time I need to modify a file.
Am I just being hard headed? Do most developers on Vista leave the UAC on or off? And for the instance described above, is there a better/easier way?
I think it is necessary to leave UAC on on a test machine, so you can see what a real user would see using your app. However, I turn it off on my development machine since I find it distracting, and I trust myself enough to not need it.
(Hopefully your test machine != your dev machine right?)
All this being said, I support UAC, and I am not recommending anyone else turn it off, especially 'common users'.
I code in a standard user account, with UAC turned on.
No, I do not turn UAC off.
I program C# WinForms and web apps with IIS; the database is PostgreSQL. There's no need to bother with UAC. Some programs only require one authorization; it's not a big deal.
I keep UAC on. I find it useful to develop in an environment similar to my end user. That way if I write any code which is trying to read / write from restricted areas I will know about it quicker.
UAC is incredibly annoying at first when you get a new system. The problem is that when you first start out with a new install you have all kinds of programs to set up and settings to tweak. It seems like you see the UAC prompt every 5 minutes.
After a while, two things happen:
You're not setting up as much new stuff.
You've become a little more used to the prompt.
At this point UAC isn't so bad anymore. I have UAC on and I've only seen one or two prompts in the last couple weeks. That's right about perfect: if I see a prompt I wasn't expecting I know to make sure I really want to proceed.
I will argue that the 2nd effect kind of defeats the purpose. What they should do is have UAC disabled by default, but for the first month only. After the first month prompt you to turn UAC on, where the default option for someone who doesn't really read things is to turn it on. Then people aren't annoyed during their setup period, and it's easier to make an informed choice about what you want to do with UAC.
I leave it on
I leave it on, but have it set to automatically elevate privileges when necessary. It's a fine distinction, but a distinction nonetheless.
Services like Microsoft SQL Server run with administrator privileges. Visual Studio, on the other hand, does not. Nor do most developer tools.
I make heavy use of virtual machines to 1) make sure my development environment is safe at all times, 2) test out software with the potential of leaving my machine FUBAR, and 3) limit the downtime of restoring my development environment, "in case I do something idiotic like have a momentary lapse of reason and run an attachment from an email" :)
I have been using Windows 2008 on my workstation, following the advice at http://www.win2008workstation.com/wordpress/, and it has worked great for me. I don't remember turning off UAC, but I certainly haven't suffered from it, so I guess it's turned off.
As others have said, you do need to have test [virtual] machines that are configured as close as possible to the ones your users will have so you won't have any surprises deploying your app.
I think whether you do this or not should depend on the target audience for your application, although I can completely understand people disabling it.
If all your users run Vista with UAC disabled then I think you can get away with turning it off, but this probably isn't realistic--or advisable. At the other end of the spectrum, our applications are used by a vast number of people with every conceivable version and configuration of Windows from Win2k onwards, and obviously including Vista and Server 2008. Since we're an ISV with no control over our users' environments, or over policies governing their privileges and administration, I always leave UAC enabled--even though it annoys me beyond all reason at times--because then I know about any possible problems it might cause for people using our applications sooner rather than later.
Disclaimer: most of my actual coding time is spent on Windows XP, although I have a Vista 64-bit test machine under my desk which I use on a daily basis for testing. Generally I'll use this box around 20 - 30% of the time.
Developing or not, turning it off was the first thing I did after installing Vista. It just seemed an annoying nuisance at best.
Instead of running antivirus to suck away my CPU cycles (I need as many as I can get, with RDP sessions and VMs running all the time), I just leave UAC on as a safeguard, to double-check and make sure only certain things run. It does more than that, though: it also restricts programs' access to sensitive areas, so a program basically can't trash your system without you allowing it through UAC. I have not had a problem yet, and my system runs only what I need it to run, quickly and smoothly.
It's too annoying for me, it gets turned off as soon as I install Vista.
I turn it off as soon as I install the OS. Security by endless modal dialogs is no security at all. Normal users just get used to clicking even more 'OK' buttons after a couple of weeks or so.
EDIT: Wow, down-voted huh? Must be some Microsoft employees around here...Of course it should remain on on a test machine, probably should have mentioned that.
I turn it off on computers that I am using.
When testing, I test in the target environment, which means I may have UAC on or off.
I see no benefit to developing with it on.
I find it extremely annoying and turn it off at all times; I trust myself enough not to need fail-safes in place. If I screw up and run some dodgy application, that's my bad and I'll live with the consequences. Meanwhile, I'm not spending 5 minutes of my day clicking through some damn annoying popups.
I have it off, but that's because I trust myself entirely too much. It's funny, though: it seems to make the average user (I live in Jourdanton, TX; we have a lot of "average users" here in the middle of nowhere) afraid of the control panel, because it causes all these weird prompts to come up and asks for their password every 5 minutes if they start to poke around.
That said, I think it depends on your level of expertise with the system. On your dev machine, yes, definitely turn the darn thing off. I haven't gone a day this week without needing to install or update some piece of software, and I don't like having to elevate myself to admin status to have to do that.
What I would really like is the ability to have it elevate for a period of time, or say automatically turn itself back on when I log off, so that I could do an entire session's worth of installing stuff without being bothered, and then be secure again when I was done and (inevitably) had to restart the machine as seems to be common practice with windows installers now.
And all that ranting aside, I think for your test machine, it should definitely be on. Not because I necessarily agree with the feature (any more than I agree that the Administrator account should be disabled permanently; I love that account way too much) but because the user is very likely to have it turned on, and you need to see your program through their eyes. This is especially true if your program is going to require elevation, say to change a setting or modify a certain directory, so that you can prompt your users to accept the UAC warning in your program, which adds an extra layer of comfort for the user, I think.
Oh, and as for the one program, let me harp on you just slightly. Shouldn't the program have a define somewhere in the main header files that tells it where its "working directory" is? If that's already the case, then why is it so hard to change that working directory to somewhere else? If it's not the case, shame on you, and you should go fix that. ^_^ That would have saved you a lot of trouble.
-Nicholas
I'm running into issues where our build scripts do things like manipulate registry entries or add things to the GAC. We're trying to get away from this stuff but until we do it's there and requires privilege escalation. So the build scripts get run from an Administrator command window. The problem comes in when I open Visual Studio 2008 and try to build part of the application - I can't as a normal user because the output files can't be overwritten because the build in the Admin console produced the same files at a higher privilege level. It's causing me a lot of frustration and I'm thinking the best way is to turn UAC off for now but I'm very reluctant to do so.
Because I've got post-build scripts to copy executables into the Program Files directory for testing I run Visual Studio with elevated privileges.
One tip I've found that makes life easier, is that to quickly start a command prompt with elevated privileges you can:
press Window Key
type "cmd"
Press Ctrl+Shift+Enter
Left cursor key (with right pinky) to move to "Continue" button on UAC dialog
Enter
I always keep one open for launching my IDE and running build scripts.
The only downside I've found is that elevated windows don't interact with some of my window tweaking software like KatMouse and Switcher.
No, but I do change some settings (see the sketch after this list):
Do not prompt for elevation if not in the administrators group.
Elevate automatically if you are the [machine]\administrator.
I do not put myself in the administrators group.
Just a plain old user, with no elevation prompts.
Use Run As if developing/debugging web apps with the development server.
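For what it's worth, the first two settings above map to registry values under the UAC policy key; a sketch of setting them programmatically, assuming the standard documented ConsentPromptBehavior* values (normally you'd just use secpol.msc, and writing HKLM needs an elevated process):

    // Sketch: the registry equivalents of the two UAC prompt settings above.
    // Values follow the documented UAC policy meanings; run elevated.
    #include <windows.h>

    int wmain()
    {
        const WCHAR *key =
            L"SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Policies\\System";
        DWORD denyUser  = 0; // 0 = automatically deny elevation for standard users
        DWORD autoAdmin = 0; // 0 = elevate administrators without prompting
        RegSetKeyValueW(HKEY_LOCAL_MACHINE, key, L"ConsentPromptBehaviorUser",
                        REG_DWORD, &denyUser, sizeof(denyUser));
        RegSetKeyValueW(HKEY_LOCAL_MACHINE, key, L"ConsentPromptBehaviorAdmin",
                        REG_DWORD, &autoAdmin, sizeof(autoAdmin));
        return 0;
    }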
I code with UAC off. I found it annoying to see all those popups when I open Visual Studio or StarUML, or when I just want to change a setting on my machine. I have always installed a good internet security suite that has kept me "virus free" on my machine for many years, and I don't see the point of having an "are you sure?" prompt for every task I do. I agree with Ed, because everyone clicks OK.
Example: install a firewall for some members of your family. When they are prompted about whether app XYZ can connect to the internet, they will click Yes. They will not make the distinction between a good app and spyware/a virus. It's the same thing with UAC.
I leave UAC on, but have VS set to always run as admin. The only real reason why I do that though is that I mostly work on software that requires admin permissions to run anyway. (And yes, I know that should be the minority, but my app happens to be one of those -- it's a soft-realtime hardware controller.)
For general-purpose apps, you must at least test with UAC enabled; while you could do that on a separate machine, it's easier to test on your dev machine. And the prompt isn't that much of an imposition, especially if you disable the "secure desktop" option (which reacts very slowly with most graphics cards when enabled).
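The "secure desktop" option is backed by a registry value under the UAC policy key, so it can be flipped programmatically as well; a sketch, assuming the standard PromptOnSecureDesktop value (an elevated process is required to write HKLM):

    // Sketch: turn off the dimmed "secure desktop" while leaving UAC on.
    #include <windows.h>

    int wmain()
    {
        DWORD off = 0; // 0 = prompt on the normal desktop, 1 = secure desktop
        LONG rc = RegSetKeyValueW(HKEY_LOCAL_MACHINE,
            L"SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Policies\\System",
            L"PromptOnSecureDesktop", REG_DWORD, &off, sizeof(off));
        return (rc == ERROR_SUCCESS) ? 0 : 1;
    }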
If you stay on Vista, turn off UAC and rely on Microsoft Security Essentials' real-time monitor to intercept anything that wants to alter your system. Or, upgrade to Win7, where you can leave UAC on and control the levels at which you want UAC to notify and interrupt the execution.
EDIT: It's very easy to exploit a Windows computer anyway, so what's the sense in having UAC turned on if it really doesn't guarantee protection?