Shibboleth Identity Server using External Shibboleth Identity Server for Authentication

I am designing a service to handle authentication across a number of hosted platforms. This service will need to be able to manage a number of different protocols for its users: LDAP, Shibboleth, and no doubt others.
I was hoping to use the Shibboleth protocol internally and run the service as a Shibboleth IdP where, depending on the user category, the nature of the protected resource, etc., the hard work of authentication is passed on to the native LDAP, Shibboleth, or other server where the user already has an account.
It seems as though this should be possible, but I have not gotten sufficient clarity from the Shibboleth documentation to work out whether it is, let alone how to do it.
Is this possible? How do I do it? Useful documentation very much appreciated.

Shibboleth is not a protocol; it is the name of the software stack that implements the SAML protocol.
Shibboleth can authenticate users against LDAP, AD, and databases (via a JAAS plugin).
Documentation on hooking up Shibboleth to use LDAP for authentication and attribute retrieval is at: https://wiki.shibboleth.net/confluence/display/SHIB2/ResolverLDAPDataConnector
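To make the JAAS route concrete, here is a minimal Java sketch of the kind of username/password check a JAAS-based login handler performs. This is a hedged illustration: the "ShibUserPassAuth" application name and the vt-ldap login module shown in the comment follow the IdP 2.x documentation, but your login.config may differ.

```java
import javax.security.auth.callback.Callback;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.callback.NameCallback;
import javax.security.auth.callback.PasswordCallback;
import javax.security.auth.login.LoginContext;
import javax.security.auth.login.LoginException;

public class JaasLdapCheck {
    public static boolean authenticate(final String user, final char[] password) {
        // Supplies the username and password to whatever login module
        // the JAAS configuration names (here, an LDAP module)
        CallbackHandler handler = callbacks -> {
            for (Callback cb : callbacks) {
                if (cb instanceof NameCallback) {
                    ((NameCallback) cb).setName(user);
                } else if (cb instanceof PasswordCallback) {
                    ((PasswordCallback) cb).setPassword(password);
                }
            }
        };
        try {
            // Resolved against login.config, e.g.:
            // ShibUserPassAuth {
            //     edu.vt.middleware.ldap.jaas.LdapLoginModule required
            //         ldapUrl="ldap://ldap.example.org" ...;
            // };
            LoginContext lc = new LoginContext("ShibUserPassAuth", handler);
            lc.login(); // throws on bad credentials
            return true;
        } catch (LoginException e) {
            return false;
        }
    }
}
```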

Related

In OAuth2 flow, can we delegate authentication to Windows SSO

We have an in-house OAuth2 server used by our applications. Now we want to use Windows SSO for our applications, but without changing anything in them: they will still contact our OAuth2 server for an access token, and the authentication part will be delegated to Kerberos (which Windows uses, if I understood properly).
Is there a way to do that?
That is a standard setup and should require only configuration changes in the Authorization Server (AS), with zero code changes in applications.
Most commonly:
The AS might be hosted in the cloud
It will redirect browsers to an on-premises Identity Provider (IDP)
The IDP can connect to Active Directory
You may also need a fallback option for when users are not joined to the work domain. See this Curity guide for an example and some infrastructure factors to think about.
If the AS is in-house it may even be able to make a direct Kerberos connection via an LDAP data source, though the preferred architecture is a separate IDP.
Of course you need an AS that supports this type of connection, so you would need to check the vendor docs.
REQUEST FLOW
Kerberos has always been conceptually the simplest protocol but the deepest to understand - here is a brief summary:
Your apps will make a standard OpenID Connect authorization redirect to the AS
The AS may then present an authentication selection screen to the user, unless there is only a single option
Alternatively an app can send the acr_values query parameter to say which authentication method to use
The AS will then redirect the browser to the next stage of processing, which uses a 'Windows SSO authenticator'
The redirect to the Windows SSO authenticator does not have to use OpenID Connect - it could be any vendor specific HTTP request
The browser will automatically send an encrypted Kerberos ticket by connecting to AD - a prerequisite for this to work might be that the domain in the URL is in the Local Intranet zone on end-user computers
The Windows SSO authenticator will need to be able to decrypt this credential, which typically requires a Service Principal Name to be configured
Once the Kerberos ticket is decrypted, the authenticator will make an LDAP connection to an Active Directory data source via its standard LDAP endpoints to verify the received ticket - see the sketch after this list
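A minimal sketch of that acceptance step, using the JDK's GSS-API: it assumes the authenticator has already performed a JAAS/keytab login for the configured SPN so that a default acceptor credential is available (the class and the header handling around it are hypothetical).

```java
import java.util.Base64;
import org.ietf.jgss.GSSContext;
import org.ietf.jgss.GSSCredential;
import org.ietf.jgss.GSSException;
import org.ietf.jgss.GSSManager;

public class SpnegoAcceptor {

    /** Validates the token from the "Authorization: Negotiate <token>" header
     *  and returns the authenticated user principal. */
    public static String acceptTicket(String negotiateToken) throws GSSException {
        byte[] inToken = Base64.getDecoder().decode(negotiateToken);

        GSSManager manager = GSSManager.getInstance();
        // null credential: use the default acceptor credential obtained
        // from the JAAS/keytab login for the service's SPN
        GSSContext context = manager.createContext((GSSCredential) null);

        // A single round trip usually completes SPNEGO, but a real
        // implementation must loop until isEstablished() returns true
        context.acceptSecContext(inToken, 0, inToken.length);
        if (!context.isEstablished()) {
            throw new GSSException(GSSException.DEFECTIVE_CREDENTIAL);
        }
        return context.getSrcName().toString(); // e.g. "alice@CORP.EXAMPLE.COM"
    }
}
```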

How to properly configure IoT sensors in order to gain authentication and authorization using IdM Keyrock and Wilma PEP Proxy

I have been working for some weeks with the IdM Keyrock, Wilma PEP Proxy, and AuthZForce in the context of the FIWARE platform, in order to develop an IoT application.
I had success in protecting the Orion Context Broker APIs using Wilma PEP Proxy, and now it is time to protect the IoT-UL APIs in order to secure the "southbound" APIs.
I thought about using a strategy similar to the one I followed with the Orion Context Broker: each sensor has an OAuth2 token, and by putting a PEP Proxy in front of the IoT-UL APIs I would be able to authenticate and authorize every request to them.
Then I noticed that in the Keyrock interface there is a section inside my Application tab where I can register IoT sensors, so I registered a few. Then I realized that I could not assign roles to these users (because internally they are users), nor could I log in with them through the Keyrock interface. So I could neither assign roles nor generate OAuth2 tokens.
What am I missing? Perhaps authentication and authorization are not yet available for IoT sensors. In that case I thought about using regular users to represent IoT sensors, but I think that is overkill. Any help with this would be very useful.
You can create tokens for devices using the Resource Owner Password Credentials grant; a sketch of such a request follows below. Role assignment is not ready yet, so for now you can only check authentication; role assignment will be available in the next release.
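For illustration, a hedged Java sketch of such a token request: the Keyrock host and all credentials below are placeholders, and /oauth2/token is the standard OAuth2 token endpoint path.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DeviceTokenClient {
    public static void main(String[] args) throws IOException {
        String clientId = "YOUR_APP_CLIENT_ID";         // from the Keyrock application
        String clientSecret = "YOUR_APP_CLIENT_SECRET";
        String body = "grant_type=password"
                + "&username=" + URLEncoder.encode("sensor-001", "UTF-8")
                + "&password=" + URLEncoder.encode("sensor-password", "UTF-8");

        URL url = new URL("https://idm.example.org/oauth2/token"); // placeholder host
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        // Client credentials go in a Basic Authorization header
        String basic = Base64.getEncoder().encodeToString(
                (clientId + ":" + clientSecret).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + basic);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // JSON response containing "access_token"
            }
        }
    }
}
```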

Forward SPNEGO Credentials to Secure Cluster

I have a cluster secured by Kerberos, and have a REST API that needs to interact with the cluster on behalf of the user. I have used Spring Security with SPNEGO to authenticate the user, but when I try to use the Hadoop SDK, it fails for various reasons based on what I try.
When I try to use the SDK directly after the user logs in, it fails with "SIMPLE authentication is not enabled".
I have noticed the Authentication in the session is a UsernamePasswordAuthenticationToken, which does not make sense, since I'm authenticating against the Kerberos realm with the user's credentials.
I am trying to use this project out of the box with my own service account and keytab: https://github.com/spring-projects/spring-security-kerberos/tree/master/spring-security-kerberos-samples/sec-server-spnego-form-auth
For what it's worth, you can leverage Apache Knox (http://knox.apache.org) to consume the Hadoop REST APIs in a secured cluster. Knox will take care of the SPNEGO negotiation with the various components for you. You could use the HTTP header based pre-auth SSO provider to propagate the identity of your end user to Knox.
Details: http://knox.apache.org/books/knox-0-8-0/user-guide.html#Preauthenticated+SSO+Provider
If you use that provider, however, you will need to ensure that only trusted clients can call your service.
Alternatively, you can authenticate to Knox against LDAP with username/password with the default Shiro provider.
One of the great benefits of using Knox this way is that your service never needs to know anything about whether the cluster is kerberized. Knox abstracts that from you.
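As an illustration, a hedged Java sketch of a WebHDFS call through a Knox gateway using the pre-auth provider: the gateway URL and topology name are placeholders, and SM_USER is the provider's documented default principal header (configurable per topology).

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class KnoxPreAuthClient {
    public static void main(String[] args) throws Exception {
        // WebHDFS LISTSTATUS through the Knox gateway (placeholder topology)
        URL url = new URL("https://knox.example.org:8443/gateway/sandbox"
                + "/webhdfs/v1/tmp?op=LISTSTATUS");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // The trusted REST API tier asserts the end user's identity in a header;
        // Knox's HeaderPreAuth provider reads it (SM_USER is the default name)
        conn.setRequestProperty("SM_USER", "alice");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```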
First of all, the Spring Security Kerberos extension is a terrible piece of code. I evaluated it once and abstained from using it. You need the credential of the client authenticating to your cluster. You basically have two options here:
If you are on Tomcat, you can try the JEE pre-auth wrapper from Spring Security along with my Tomcat SPNEGO AD Authenticator from trunk. It will receive the delegated credential from the client, which will enable you to perform your task, assuming that your server account is trusted for delegation.
If the above is not an option, resort to S4U2Proxy/S4U2Self with Java 8, obtain a Kerberos ticket on behalf of the user principal, and then perform your REST API call.
As soon as you have the GSSCredential the flow is the same.
Disclaimer: I have no idea about Hadoop but the GSS-API process is always the same.
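A minimal sketch of option 2, using the JDK-specific ExtendedGSSCredential API available since Java 8; the principal names and target SPN are placeholders, and the service's own credential is assumed to come from a prior keytab login.

```java
import com.sun.security.jgss.ExtendedGSSCredential;
import org.ietf.jgss.GSSContext;
import org.ietf.jgss.GSSCredential;
import org.ietf.jgss.GSSException;
import org.ietf.jgss.GSSManager;
import org.ietf.jgss.GSSName;
import org.ietf.jgss.Oid;

public class S4U2ProxyExample {
    public static GSSContext contextForUser(String userPrincipal, String targetSpn)
            throws GSSException {
        GSSManager manager = GSSManager.getInstance();
        Oid krb5 = new Oid("1.2.840.113554.1.2.2"); // Kerberos v5 mechanism OID

        // The service's own credential, typically from a JAAS/keytab login
        GSSCredential serviceCred = manager.createCredential(
                null, GSSCredential.DEFAULT_LIFETIME, krb5, GSSCredential.INITIATE_ONLY);

        // S4U2Self: obtain a credential on behalf of the user, no password needed
        GSSName user = manager.createName(userPrincipal, GSSName.NT_USER_NAME);
        GSSCredential userCred = ((ExtendedGSSCredential) serviceCred).impersonate(user);

        // S4U2Proxy: use the impersonated credential against the target service
        GSSName target = manager.createName(targetSpn, GSSName.NT_HOSTBASED_SERVICE);
        GSSContext context = manager.createContext(
                target, krb5, userCred, GSSContext.DEFAULT_LIFETIME);
        return context; // initSecContext() then yields the ticket for the REST call
    }
}
```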

The difference between Spring Security LDAP VS CAS VS OpenID

In Spring Security I understand that there are different modules catering to different usages, and among the modules I saw LDAP, CAS and OpenID.
From my understanding:
CAS - it is only used for authentication purposes, based on SSO
LDAP - it is based on an LDAP server to authenticate users and manage them. Am I correct?
OpenID - it is also based on an OpenID server to authenticate users
If that is the case, why would some people use CAS instead of LDAP? Maybe it is because of different usability? Can anyone shed light on the differences between the three, and why one would be preferred over the others?
CAS, as a protocol, is a mechanism to provide web single sign-on. There is also CAS the software platform, which implements that protocol amongst many others, including OpenID.
OpenID is also an authentication protocol, similar to CAS, able to achieve web single sign-on but in a more federated fashion.
LDAP is a protocol that defines how one should talk to a directory server. Most systems use LDAP to talk to a directory to retrieve user accounts, verify them, and retrieve attributes associated with them. It is not itself a single sign-on protocol. CAS, the software, can be configured to find user accounts in LDAP, retrieve attributes from LDAP, or do other things with LDAP.
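For a concrete picture of "talking to a directory", here is a minimal JNDI sketch that verifies a user's password with an LDAP bind and then reads a few attributes; the host, base DN, and attribute names are all hypothetical.

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.Attributes;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

public class LdapLookup {
    public static Attributes authenticateAndFetch(String uid, String password)
            throws NamingException {
        String userDn = "uid=" + uid + ",ou=people,dc=example,dc=org";

        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ldap.example.org:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, userDn);   // binding verifies the password
        env.put(Context.SECURITY_CREDENTIALS, password);

        DirContext ctx = new InitialDirContext(env);   // throws on bad credentials
        try {
            // The attributes an SSO server would map onto the user's profile
            return ctx.getAttributes(userDn, new String[] {"cn", "mail", "memberOf"});
        } finally {
            ctx.close();
        }
    }
}
```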

What is Shibboleth Service Provider, can & should I install it for a Windows Azure MVC3 web role?

Forgive me, I am a Shibboleth / SAML 2 noob. Hopefully these are straightforward questions.
I recently posted asking whether we could do Shib / SAML 2 integration with Azure ACS. The answers led me to believe that we could not use ACS, but should instead implement something using the lower-level WIF + SAML2 Extensions CTP libraries.
On a related matter I called one of our affiliates to ask if they could add our app as a Service Provider using their InCommon Federation membership. They asked me if we were going to install the Shibboleth Service Provider on the Azure machine(s) hosting our MVC3 web role.
Until they mentioned this, I had no idea there was a Shibboleth Service Provider installer. I was under the impression, from everything I've read so far about SAML2, that our MVC3 web role is the service provider.
So, what is the Shibboleth Service Provider? What does it do? What value would be added by installing it on our Azure instances? Do I have to have it in order to SSO against Shibboleth, or can we just do pure SAML2?
My preference is to not install it, since it would have to be installed on each role instance, making deployment take longer.
There is some information on using Shibboleth 2 for SSO in front of your web application in this question: In order to implement SAML do I need Shibboleth SP installed on my host?; the answer is Linux/Java-centric.
The Shibboleth SP is a product that you can use in front of your existing web application, or even just in front of a particular SSO-login URL that you add to your existing web application. If your application already has a notion of users, then you simply need to figure out how you will map the Identity Provider's user attributes to your application users. You and your affiliated company need to agree on how to map identities from the Identity Provider to identities in your application. You might have some shared data, or you might be required to set up that data when the user first uses SSO.
The value that the Shibboleth SP provides is that it is a product that implements all of the SAML 2.0 interactions you are likely to need. It's easy to configure SAML 2.0 Web SSO with Shibboleth and have the Shibboleth module add variables to the HTTP requests that contain all of the attributes from the SAML 2 assertions the Identity Provider will be sending you.
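As an illustration of what those variables look like to an application sitting behind the SP, here is a hedged Java servlet sketch (the answer linked above is Java-centric; in an MVC3 app the same data would arrive as server variables or headers). The attribute names eppn and affiliation are examples whose presence depends on the SP's attribute-map configuration.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class WhoAmIServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // REMOTE_USER is populated by the SP module after a successful SSO login
        String remoteUser = req.getRemoteUser();
        // SAML attributes from the IdP's assertion, exposed as request variables
        Object eppn = req.getAttribute("eppn");
        Object affiliation = req.getAttribute("affiliation");

        resp.setContentType("text/plain");
        resp.getWriter().printf("user=%s eppn=%s affiliation=%s%n",
                remoteUser, eppn, affiliation);
    }
}
```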
If you can do all of that with Azure ACS, then there's no need to install Shibboleth. My limited understanding is that Azure ACS may already support SAML 2.0 Web SSO: http://saml.xml.org/news/windows-azure-gains-single-sign-on-support
