Automatically authenticating Windows users on an Apache/Linux server

If I want to authenticate Windows accounts against AD when a user browses to an Apache-hosted site on a Linux server, here are the usual suspects:
 
mod_ntlm (which I used in the distant past) - last updated in 2003
mod_auth_ntlm_winbind - last updated April 2007
mod_auth_kerb - last updated December 2008
I've had no luck getting any of those to work with a recent, fully patched Windows 2000 AD server.
Do you have any clues as to a recipe that does work? 
-Peter
-- UPDATE
My current build environment is this:
OS: Ubuntu Lucid
Apache 2.2.14 (from the repos)
The auth modules I recompiled from source.

Did you just try to drop binary modules onto an existing Apache binary, or did you rebuild Apache and the modules from source on your system?
The last time I did this (admittedly 3+ years ago), I found a combination of Apache+mod_ntlm that worked, but I ended up using a less-than-current version of Apache, in order to match the version of mod_ntlm that I found. My conclusion at the time was that if I wanted current, I was going to have to rebuild Apache and mod_ntlm from source, and I didn't have the time to do that.
Unfortunately, that was two jobs ago, and I don't have access to the configuration details.

LDAP. Active Directory should speak the LDAP protocol well enough (although I believe Novell's eDirectory sticks to the spec better) that you can use LDAP authentication setups to communicate with it. It'll be a lot easier than fussing around with the Windows-centric NTLM garbage.
See this site for an example:
http://www.jejik.com/articles/2007/06/apache_and_subversion_authentication_with_microsoft_active_directory/
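If you want to see the moving parts outside of Apache first, here is a minimal sketch of the check the LDAP auth modules perform: a simple bind against AD with the user's own credentials. This assumes the Python ldap3 library; the DC hostname and UPN suffix are placeholders.

    # Sketch: validate credentials with an LDAP simple bind against AD,
    # which is essentially what Apache's LDAP auth modules do internally.
    # Assumes the ldap3 library (pip install ldap3); hostnames are hypothetical.
    from ldap3 import Server, Connection, ALL

    def ad_authenticate(username, password):
        if not password:
            return False  # AD treats an empty password as an anonymous bind
        server = Server("dc01.example.local", get_info=ALL)  # hypothetical DC
        # AD accepts a simple bind with the userPrincipalName as the bind DN.
        conn = Connection(server, user=f"{username}@example.local",
                          password=password)
        ok = conn.bind()   # True if AD accepted the credentials
        conn.unbind()
        return ok

    if __name__ == "__main__":
        print(ad_authenticate("peter", "secret"))  # hypothetical credentials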
The other, likely costly, option is to invest in an identity manager product. Novell, Sun (now Oracle), and IBM all make one. I suspect that, unless you're designing something for a mid-size corporate project, you won't need these. But they are an option to consider.

Related

Can Informatica be used on a Windows system?

I would like to know if Informatica can be used on a Windows system. If so, what are the prerequisites?
Both the previous answers are wrong.
Windows 10 is supported for installing the client tools only.
For exact details on which Windows Server versions are supported, log on (after initial sign-up, if you haven't already done so) to the Informatica Network at https://network.informatica.com. There is a section named Product Availability Matrices; there you will find, for each PowerCenter version, the so-called Product Availability Matrix (PAM) indicating which Windows versions are supported for server installation and which for client installation. You need both, and you can install both on the same Windows Server system.
I won't go into this ugly, ancient flame war here. Suffice it to say that some people have managed to install the server part on Windows 10, but very few have ever made it work reliably (in most cases the installation seems to work but doesn't, at the latest after the next system restart). I wouldn't waste a single second trying to do so; it's not worth the time.

Installation process of DotNetNuke (Dnn.Platform-8.0.2)

I downloaded the source package of DotNetNuke, and I am new to it. Can anyone help clarify the process of installing DotNetNuke?
I am following this: Install DNN
I've got a tutorial on installing DNN 8, found here.
You can also follow this text tutorial:
Setting up your development environment can vary based on what your
end goal is. If you are doing module development for your own use, and
within your own DNN environments, you can ignore a few of the settings
below. If you are doing module development with the idea that you
might turn around and give the modules away, or sell them, then you
will likely want to follow the guidelines set forth below to support
the widest array of DNN installation environments.
I recommend that each developer have their own local development
environment, with a local IIS website running DotNetNuke, and a SQL
Server 2008/2012 (not Express, though you can use it) database for the
website. Having an individual development environment makes group
module development far easier than if you share
environments/databases.
Choosing a DotNetNuke Version
Choosing a version of DotNetNuke is important when you start your development, for a couple of reasons. For modules that you are developing for yourself, you need to ask: what is the minimum version of DotNetNuke that you have in production? Are you running DNN 5.6.1? Are you running 6.2.6, 7.0.0, 7.0.6? Based on the answer you can determine what version of DNN you should set up as your development environment. You shouldn't be developing on a newer version of DNN than what you have running in production. As with everything there are ways around this, but I am not going to go into the details in this tutorial.
As a developer working to create modules and release them, you might have production sites that are running on the latest and greatest version of DNN, but what about your customers? Or your potential customers? You have to ask yourself: do you want to provide support for really old versions of DotNetNuke? From a development perspective you will probably say no, but from a business perspective you might say yes, and here's why. Not everyone upgrades DotNetNuke websites as they should, and oftentimes you will find that some people never upgrade. While I don't advise taking that approach to managing a DotNetNuke website, it is a fact of life that people don't always upgrade, and there are thousands of people, if not tens of thousands, who have sites that aren't running on the latest version of DNN. You should take that into account when you are doing your module development: if you compile your module against an older version of DNN, then your module should run on newer versions as well. For example, if you compile your module against DotNetNuke 6.2.6, it will likely run on every version of DNN released since then. There are edge cases where this won't hold; DNN strives to maintain backwards compatibility, but it isn't always possible.
You might also want to use features that are only available starting with a specific version of DotNetNuke, such as the workflow functionality introduced in DNN 5.1; in that case you may choose not to support older versions of the platform out of necessity. This will shrink the market in which you can sell your modules, but it can also mean less support work and an easier development cycle, thanks to the features that DNN provides.
Choosing a Package
Now here's one that may baffle you a bit. I'm going to recommend that you use the INSTALL package for whatever version of DotNetNuke you download. What? The INSTALL package? What about the SOURCE package? Well, you can use the source, but you don't need it. The module development that I'm setting you up for doesn't require the DNN source, and using the INSTALL package makes your development environment cleaner. We aren't going to be opening the DotNetNuke project when we do our module development, so why have the files sitting around for nothing? Also, if you've ever tried to use the SOURCE package for anything, you'll know it isn't easy.
The steps for setting up your development environment will apply to
both the Community and Professional editions of DotNetNuke.
Installation Configuration
Once you have the version selection out of the way, you can go through the installation process. While I'm not going to walk you through the minutest details of each step of installing DotNetNuke in this post, I will at least try to point you in the right direction for each step.
Download the INSTALL package of the version of DotNetNuke you want to use in your development environment.
Extract the files in the INSTALL package to a location of your choosing; this location is where you will point IIS (the web server) when we configure the website. In my environment I typically use c:\websites\dnndev.me\ (One item of note: you may need to right-click on the ZIP file and choose Properties before extracting; on the Properties window, if you have an UNBLOCK option, click that. Some versions of Windows have started blocking files within the DotNetNuke ZIP files, which will cause you problems later during the actual install.)
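If you would rather script the unblock step, here is a minimal sketch (my own addition, not part of the tutorial) that deletes the Zone.Identifier alternate data stream Windows uses to mark downloaded files; the extraction path is the hypothetical one from the example.

    # Sketch: "unblock" extracted files by deleting the Zone.Identifier
    # alternate data stream Windows attaches to downloaded files.
    # Path is hypothetical; run on Windows after extracting the INSTALL zip.
    import os

    ROOT = r"c:\websites\dnndev.me"  # hypothetical extraction folder

    for dirpath, _dirnames, filenames in os.walk(ROOT):
        for name in filenames:
            ads = os.path.join(dirpath, name) + ":Zone.Identifier"
            try:
                os.remove(ads)   # removes the "blocked" marker if present
            except FileNotFoundError:
                pass             # file was never blocked
            except OSError:
                pass             # stream not accessible; skip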
Setup IIS
IIS is the web server that comes with Windows. DNN 7 requires IIS 7 or later (7, 7.5, or 8.0), so you will need at least Windows Vista, Windows 7, Windows 8, Windows Server 2008 R2, or Windows Server 2012.
In IIS you should create a new website (note: if you use an existing website in IIS, be sure to add the HOST binding for DNNDEV.ME), and point it to the folder where you extracted the INSTALL package.
Note: With DotNetNuke 7.0+, .NET Framework 4.0 is required, so be sure that your application pool is configured to run under 4.0, and not 2.0.
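If you prefer to script the IIS setup, here is a rough sketch driving appcmd.exe from Python; the site name, host binding, and paths follow the hypothetical dnndev.me example above.

    # Sketch: create the app pool (forced to .NET 4.0), the site, and wire
    # them together with appcmd.exe. Names and paths are the hypothetical
    # dnndev.me example values.
    import subprocess

    APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"

    subprocess.run([APPCMD, "add", "apppool", "/name:dnndev.me",
                    "/managedRuntimeVersion:v4.0"], check=True)

    subprocess.run([APPCMD, "add", "site", "/name:dnndev.me",
                    "/bindings:http/*:80:dnndev.me",
                    r"/physicalPath:C:\websites\dnndev.me"], check=True)

    # Assign the new site's root application to the .NET 4.0 app pool.
    subprocess.run([APPCMD, "set", "app", "dnndev.me/",
                    "/applicationPool:dnndev.me"], check=True)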
Set File Permissions
Setting up the file permissions for your DNN install is often the step that causes the most trouble. Right-click on the FOLDER into which you extracted DNN (c:\websites\dnndev.me), choose Properties, and then choose the Security tab. You need to add permissions for the account under which your website's application pool is running. You will want to give that account Full or Modify permissions on the DNNDEV.ME folder. Which account to use varies with your version of IIS; here is a simple list of the default accounts by IIS version.
IIS Version   Operating System                     Account
IIS 7         Windows Vista, Windows Server 2008   localmachine\Network Service
IIS 7.5       Windows 7, Windows Server 2008 R2    IIS AppPool\APPPOOLNAME
IIS 8         Windows 8, Windows Server 2012       IIS AppPool\APPPOOLNAME
Note: If you are using IIS 7.5/8.0 you'll notice in the above table that we have APPPOOLNAME in the identity; this is because when you set up a new website in IIS, a new application pool is created. In place of APPPOOLNAME you should type the name of the application pool that was created. You can also bypass this and configure your application pool to use the Network Service account instead of a dynamic account if you would like.
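To script the same grant instead of clicking through the Security tab, an icacls call like the following should work (a sketch; the folder and app pool name follow the example above).

    # Sketch: grant the IIS app pool identity Modify rights on the DNN folder
    # using icacls. Folder and app pool name are the hypothetical dnndev.me ones.
    import subprocess

    folder = r"c:\websites\dnndev.me"
    account = r"IIS AppPool\dnndev.me"   # the app pool identity (IIS 7.5/8.0)

    # (OI)(CI)M = inherit to files and subfolders, Modify access.
    subprocess.run(["icacls", folder, "/grant", f"{account}:(OI)(CI)M"],
                   check=True)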
Database Configuration
In SQL Server you should go through and create a new database. I always create a database with the same name as the website, so in this case DNNDEV.ME. Once you have created the database, create a user that can access it. I always use SQL authentication, turn off the password policy enforcement, and give the user db_owner and public access to the DNNDEV.ME database. Remember the username and password you create here, as you will need them when you walk through the installation screen for DotNetNuke.
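For reference, here is a sketch of the equivalent T-SQL run from Python via pyodbc; the login name, password, and ODBC driver name are my own placeholder assumptions.

    # Sketch: create the DNN database and a SQL-authenticated db_owner user.
    # Assumes pyodbc and a local SQL Server; names/password are hypothetical.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "Trusted_Connection=yes", autocommit=True)
    cur = conn.cursor()

    cur.execute("CREATE DATABASE [dnndev.me]")
    cur.execute("CREATE LOGIN dnnuser WITH PASSWORD = 'ChangeMe!123', "
                "CHECK_POLICY = OFF")   # 'turn off password requirements'
    cur.execute("USE [dnndev.me]")
    cur.execute("CREATE USER dnnuser FOR LOGIN dnnuser")
    # sp_addrolemember works on both SQL Server 2008 and 2012.
    cur.execute("EXEC sp_addrolemember 'db_owner', 'dnnuser'")
    conn.close()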
DotNetNuke Installation Screen
Populate the installation screen with the standard DNN information: host username, password, etc. For the Database option, choose Custom and configure your database connection, providing the server IP/name and the database name (dnndev.me). For the database authentication you'll want to choose the option that allows you to enter the username/password for the database user that you created previously.
Now there are two additional options you can configure. Normally I would tell you not to modify these, but from a development environment perspective I do recommend that you change the objectQualifier setting. It is blank by default; type in "dnn" (without the quotes). This will prepend "dnn_" to all of the objects that DNN creates, such as tables and stored procedures. This is not something I recommend from a production standpoint, but if you are developing modules for sale, then supporting objectQualifier in your development is recommended. It will save you time down the road if you have a customer who has an objectQualifier defined on their production database.
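To make that concrete, here is a small sketch (my own illustration, not from the tutorial) of the kind of token replacement DNN performs on module SQL scripts; the script text and table name are hypothetical.

    # Sketch of what the objectQualifier does: DNN replaces the
    # {objectQualifier} token in module SQL scripts with the configured
    # prefix, so objects come out as e.g. dnn_MyModuleItems.
    script = "CREATE TABLE {objectQualifier}MyModuleItems (ItemID int NOT NULL)"

    object_qualifier = "dnn_"   # what you typed ("dnn") plus the underscore
    print(script.replace("{objectQualifier}", object_qualifier))
    # -> CREATE TABLE dnn_MyModuleItems (ItemID int NOT NULL)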
Follow the video tutorial below; it has two parts, and the links to both parts are given below.
Part one
Part two

Issues updating an MSI through GPO (failures to overwrite/uninstall)

Thank you in advance for considering this question. If a similar question existed, I was unable to find it.
The Issue: Our company packages an application into an MSI. When installed outside of any GPO, this MSI properly updates, blocks attempts to downgrade (move from a higher revision to a lower one), and never has trouble uninstalling previous versions of the application, regardless of how long ago those versions were created/installed. For example, we can install version 1.2.3 and then install version 2.3.4, and the application will install without issue. However, we work with a customer who uses GPO to deploy our application to hundreds of PCs. Each time we have provided an updated version of the application, the following has been observed:
On any machine where a previous version of our application was installed via GPO, no matter what the previous version is, the update successfully installs without issue.
On a machine where the application was manually installed (outside the GPO) and an attempt is made to update it via GPO, either the application is installed alongside the old version, OR registry keys for the previous version remain and the application cannot open/run correctly. In this case the registry keys must be removed by hand, and the install is then attempted again from a clean machine.
What we know is that on any machine where the application was originally installed via GPO - updating the application is no problem. On every machine where the application was not installed with the GPO in the first place, updating via GPO fails with one of the problems presented above.
My question is: Is there a technical issue with how the installation is being handled partially through the GPO and partially outside of it? Does the GPO need to be responsible for the entire life-cycle of the application? Or is it a reasonable expectation that the application can be updated via GPO both on machines where the original version was installed manually (outside the GPO) and on machines where it was installed through the GPO from the start?
One solution we are aware of is simply having all computers manage the application life-cycle through the GPO (since we know updates work in that environment already); however, this would mean that many computers would need to have the manually installed versions removed by hand, and the installation then handled properly through GPO, which is an extensive bit of work.
We would greatly welcome any solutions, references to technical documentation that formally sheds light on the proper management or expectations here, or links to further information. Our research suggests that it is "best" to manage the entire application life-cycle inside the GPO, but I have so far been unable to determine that it is 100% necessary to do so.
Looking forward to any assistance. If any further technical details are required to help the viability of the question, please don't hesitate to request such details.
If you end up with two versions installed in Control Panel, then, all other things being correct, the most likely explanation is that you upgraded a per-user install with a per-machine install (or vice versa). In the GPO world that corresponds to assigning the package to a user or to the computer, something like that. That's easy to verify by getting a verbose log and checking the FindRelatedProducts actions for an indication that another product was found, but in a different context.
When you're in GPO mode all the time, I assume each install (whether per user or per machine) is consistent, therefore upgrades always work; they just don't work cross-context.
I believe GPO suppresses the UI in most cases, and the UI (or the UI sequence) is sometimes where per-user/per-machine is set. That might be something else that would cause it, depending on whether the GPO publishes to the computer or the user.
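To gather that evidence, generate a verbose log and scan it; here is a quick sketch (the log file name is hypothetical, and verbose logs on some systems are UTF-16 encoded).

    # Sketch: scan a verbose MSI log for FindRelatedProducts entries, which
    # show whether a related product was found in a different install context.
    # Generate the log with: msiexec /i yourpackage.msi /l*v install.log
    from pathlib import Path

    raw = Path("install.log").read_bytes()   # hypothetical log name
    if raw[:2] in (b"\xff\xfe", b"\xfe\xff"):
        text = raw.decode("utf-16")           # "Unicode" log
    else:
        text = raw.decode("ascii", "ignore")  # ANSI log

    for line in text.splitlines():
        if "FindRelatedProducts" in line:
            print(line)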

Recommendations for keeping a build server updated

As a guy who frequently switches between QA, build and operations, I keep running into the issue of what to do about operating system updates on the build server. The dichotomy is the same on Windows, Linux, MacOS or any other o/s that can update itself via the internet:
The QA team wants to keep the build server exactly as it is from the beginning of the product release cycle to the end, since installing updates could destabilize the server and means that successive builds aren't made against the same baseline.
The ops team wants the software to be deployed on a system with all the latest security patches; this can mean that the software isn't deployed on exactly the same version of the o/s that it was built on.
I usually mitigate this by taking release candidate builds and installing them on a test server that has a completely up-to-date o/s, repeating the automated tests that are run on the build server, and doing some additional system-level testing to make sure everything looks good before deployment. However, this seems inefficient to me; does anyone have a better way?
Personally I don't think you have much of an issue here: just apply the latest updates to the build server. The main reasons I say this are:
it is highly unlikely that your code or any of the dependencies on the build server are so tightly coupled to the OS version that installing regular updates is going to affect anything, let alone break it. There can be minor differences in window messages etc. between Windows versions, but those are few and far between, and are usually quite well documented out there on teh interweb. If you are using managed technology stacks like WPF/Silverlight or ASP.NET, and even mostly WinForms, then you will be isolated from these changes; they should only affect you if you are doing hardcore stuff using the WinAPI directly to create your windows or draw your buttons.
it is good practice to always engineer your product against the latest version of the OS, because you need to encourage your customers to implement those updates too. IOW, you should not be in a position where you have to tell your client not to install update xyz because your application will not run against it, especially if that update is a critical security update.
testing for differences between OS versions should be done by the QA team and should be independent of what is on the build server
you do not want your build server to get into such a state that it has been so isolated from the company update process that when you finally do apply all the updates, it barfs and spits molten silicon everywhere. IOW, the longer you wait to update, the higher the risk of something going wrong, and going wrong catastrophically. Small, frequent, incremental updates are lower risk than mass updates once per decade :)
The build server updates that you do have to be cautious about are third-party control or library updates; they can frequently contain breaking changes or considerably altered behavior. They really should be scheduled, and followed up by a round of testing looking for any changes.
Virtualize!
Using stuff like VMware Server you can script the launch and suspend of virtual machines. So you can script: resume the VM, SSH in to launch the build, copy the results off, suspend the VM, repeat. (I say this, but I abandoned my work on this. Still, I was making progress at the time.)
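For illustration, here is a rough sketch of that loop driving VMware's vmrun CLI and ssh from Python; the VM path, host, and build commands are placeholders, and the exact vmrun flags vary by VMware product.

    # Sketch: resume a build VM, run the build over ssh, copy artifacts out,
    # then suspend the VM again. vmrun is VMware's CLI; paths/hosts are
    # hypothetical.
    import subprocess

    VMX = r"C:\VMs\buildvm\buildvm.vmx"   # hypothetical VM config path
    HOST = "builduser@buildvm"            # hypothetical ssh target

    subprocess.run(["vmrun", "start", VMX, "nogui"], check=True)  # resume
    subprocess.run(["ssh", HOST, "make -C /src/project all"], check=True)
    subprocess.run(["scp", f"{HOST}:/src/project/dist/*", "./out/"],
                   check=True)
    subprocess.run(["vmrun", "suspend", VMX], check=True)         # suspend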
Also, you can trust your OS vendors. Can't you?
They have an interest in compatibility. If you build on Windows XP, it is almost certain to work on XP SP3, Vista, and Windows 7.
If you build on RedHat Enterprise 5, it had better work on 5.1, 5.2, 5.3, 5.4, etc.
In my experience this has worked out OK so far, and I recommend building on your lowest-patch OS version. With the Linux stuff in particular, I have found newer releases linking against more recent libraries that are not available on older versions.
Of course it doesn't hurt to test your code on a copy of the deployment server. It all depends on how certain you want to be.
Take the build server off the network; that way you do not need to worry about installing security updates. Only load the source from CD, thumb drive, or whatever other means.
Plug it back in at the end of your release cycle and then let all the updates take place.
Well, for the most stable process, I would have two build servers ("build with initial config" and "build with updated config") and two autotest servers split the same way. Use virtualization to do this effectively and scriptably.

In what OS should I host subversion?

I have decided to go with Subversion for a source control repository for my personal and side projects, and I'm now trying to decide what OS to use. Currently my file server for my home network runs the Windows 7 beta. I'm wondering if I should wipe it and install Windows Server 2008 instead? Basically, I'd like to know if there are things I could take advantage of with a server OS that I can't with Windows 7. The first thing that comes to mind is accessing Subversion remotely over a VPN connection.
I'm a .NET developer, but I have dabbled in Linux a bit, so I'm not completely turned off by the idea of an Ubuntu or Debian server...
I imagine the installation and configuration process might go off with fewer hitches if installed on Linux, just because of the package management, but that's assuming some experience with the package system of $whatever_distro. If you're comfortable with Windows, Subversion works perfectly well on there. I've set it up on both, but prefer the Linux installation process (easier Apache integration, in my view), but I had pre-existing Linux experience.
If you're familiar with Windows, I bet you'll find the installation and configuration process easier there. As others have said, many of the tools are cross-platform.
You can run a Subversion server on Windows or Linux (or whatever) so it really doesn't matter. Pick whichever one you already have and feel most comfortable with. Since you are a Windows developer I see no real reason to toss Linux into the mix though.
If your goal is to minimize the amount of work you put into the maintenance of subversion, go with the OS you are most comfortable with. Many maintenance scripts, and subversion hooks are written and available in perl and python which are available for both windows and linux.
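As a small illustration of such a hook, here is a sketch of a cross-platform Subversion pre-commit hook in Python that rejects commits with empty log messages; the policy itself is just an example.

    #!/usr/bin/env python
    # Sketch: a cross-platform Subversion pre-commit hook that rejects
    # commits with empty log messages. Subversion invokes hooks/pre-commit
    # with the repository path and a transaction id; svnlook ships with
    # Subversion.
    import subprocess
    import sys

    repo, txn = sys.argv[1], sys.argv[2]

    log = subprocess.run(["svnlook", "log", "-t", txn, repo],
                         capture_output=True, text=True, check=True).stdout

    if not log.strip():
        sys.stderr.write("Commit rejected: please provide a log message.\n")
        sys.exit(1)   # non-zero exit aborts the commit

    sys.exit(0)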
One advantage to the Windows server OSes over their client counterparts is that the client OSes are limited as to the number of inbound connections. If you are going to be the only person working on the repo, this may not make a difference. However, if there are multiple people, then this would be an issue. XP Pro/Vista Ultimate are limited by Microsoft to 10 inbound connections. I cannot speak for Windows 7.
To make life easy, try VisualSVN Server. For personal projects there's no reason to set up a separate server just for SVN.
Windows 7 will be able to host Subversion with no problems whatsoever.
If your file server is already set up and working under Windows 7, I'd say stick with that. Adding SVN is no reason to install a new OS.
You don't need a server at all to use subversion.
If you've already got a file server on your home network, and you're doing this only for yourself and your personal projects, just use a Subversion client such as TortoiseSVN and create your repository (or repositories) on your file server via a network share (or mapped network drive, etc.).
I wouldn't recommend this for multi-user setups (unless each has their own repository), but for a single user this is the simplest option. And using this approach, to answer your question, you wouldn't gain anything by switching to a server OS such as Windows Server 2008.
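For example, creating and checking out such a repository looks roughly like this (a sketch; the drive letter and paths are hypothetical, and it assumes the command-line Subversion tools are installed).

    # Sketch: create a repository on the file server and check out a working
    # copy over a mapped network drive. Paths are hypothetical.
    import subprocess

    # Create the repository once (ideally run on the machine hosting the
    # share rather than over the network).
    subprocess.run(["svnadmin", "create", r"Z:\repos\myproject"], check=True)

    # Check out a working copy using a file:// URL to the mapped drive.
    subprocess.run(["svn", "checkout", "file:///Z:/repos/myproject",
                    r"C:\work\myproject"], check=True)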
I'd actually recommend going with a hosted Subversion provider instead of setting up Subversion on Windows or getting a second server for that purpose. I work for ProjectLocker, but if you Google "subversion hosting", you'll see there are a number of providers that offer free or reasonably priced solutions. The advantages:
It's a hosting provider's primary job to keep your code safe, secure, and accessible, so they focus on uptime, backups, and security monitoring so you don't have to
You don't have to learn how to be a system administrator or Subversion administrator; several providers have user interfaces that make it easy to manage users and permissions.
Hosting instead of DIY lets you focus on what you actually care about: writing great software
I suggest you take a look at ProjectLocker and some of the other providers and decide which one is right for you. You may decide that doing it yourself is the best option for you, but for many people in your situation, a hosted solution has met their needs.
