Hadoop web UI security - hadoop

Mine is a Kerberized HDP 2.3 cluster with AD user support. We know that through Hadoop's web UI at http://namenode-ip:50070 anyone can access the HDFS details. Can we restrict it to certain authorized users rather than leaving it open to everyone in production?

You can turn on Kerberos/SPNEGO authentication for UI access.
This will require SPNEGO to be enabled in your users' browsers as well.
See https://hadoop.apache.org/docs/r1.2.1/HttpAuthentication.html for general instructions on securing the UIs.
At a high level, you can set the HTTP authentication type to simple, kerberos, or the class name of a custom authentication handler.
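For reference, the settings behind that choice live in core-site.xml under the hadoop.http.authentication.* keys. Here is a minimal sketch that emits such a fragment using Hadoop's Configuration API; the principal, keytab path, secret file and cookie domain are placeholder values for illustration, not recommendations for your cluster:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    // Emits a core-site.xml fragment with the HTTP SPNEGO settings discussed above.
    // All concrete values below are placeholders for your environment.
    public class HttpAuthConfig {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration(false);           // start empty, no defaults
            conf.set("hadoop.http.authentication.type", "kerberos"); // simple | kerberos | <handler class>
            conf.set("hadoop.http.authentication.kerberos.principal", "HTTP/_HOST@EXAMPLE.COM");
            conf.set("hadoop.http.authentication.kerberos.keytab", "/etc/security/keytabs/spnego.service.keytab");
            conf.set("hadoop.http.authentication.signature.secret.file", "/etc/security/http_secret");
            conf.set("hadoop.http.authentication.cookie.domain", "example.com");
            conf.set("hadoop.http.authentication.simple.anonymous.allowed", "false");
            conf.writeXml(System.out);                                // paste the output into core-site.xml
        }
    }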
You may also be interested in using Apache Knox to proxy the UIs in question. This allows you to use HTTP Basic Auth against LDAP, or a number of other authentication options, to control access to those ports. See http://knox.apache.org/books/knox-0-7-0/user-guide.html#UI+Service+Details for UI proxying details.
This would of course require you to firewall off direct access to the UIs. It is also worth noting that leaving the HTTP authentication type set to simple in secure clusters leaves the REST APIs open to anyone who can reach them.
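As a rough illustration of what Knox-fronted access looks like to a client, the request is plain HTTPS with Basic credentials that Knox validates against LDAP; the gateway host, port, topology name and proxied path below are assumptions for illustration only:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    // Fetch a Knox-proxied resource using HTTP Basic Auth (validated by Knox against LDAP).
    // Gateway host, port, topology ("default") and path are placeholders.
    public class KnoxBasicAuthExample {
        public static void main(String[] args) throws IOException {
            URL url = new URL("https://knox.example.com:8443/gateway/default/hdfs/");
            String creds = Base64.getEncoder().encodeToString("alice:secret".getBytes("UTF-8"));
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("Authorization", "Basic " + creds);
            System.out.println("HTTP status: " + conn.getResponseCode());
        }
    }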

Related

Delegation Error for Kerberos for Specific Windows Workstation

I am having a workstation-specific Kerberos issue and hope someone here might have additional recommendations.
Our application has an application server and a web server, and we have Kerberos configured on both the application layer and the web layer.
For certain users, when we provide the Kerberos link they are not able to log in, and we found out the issue is workstation-specific. On the same problematic workstation, the user can still access the application server via Kerberos authentication.
In the web layer we see the following error:
[SpnegoFilter.doFilter] Although user authentication to xxx was successful, Integrated Authentication could not extract the user's credentials because it appears delegation was either not configured or disallowed
Is there any Windows setting that could potentially lead to this issue? We checked that our domain is trusted at the browser level on both working and non-working machines, and the GPO settings are the same.
What you're describing is unconstrained delegation, which is the act of a user handing the remote server their TGT so the server can impersonate the user without restriction.
Windows deems this incredibly dangerous (it is) and is moving towards disabling it outright when certain security services are enabled on the client, specifically Credential Guard. It will also block it for users who are members of the Protected Users security group, though the fact that it affects specific workstations points towards Credential Guard.
If it is one of the above issues, the correct solution is to switch to constrained delegation.

Spring Security OAuth and SSO

Can anyone tell me if it is possible to combine SSO via SPNEGO and Spring Security with OAuth?
This is my problem:
The client I now represent has chosen SPNEGO as their SSO solution.
This requires us to use a full-blown application server (Liberty) in all scenarios.
At the same time, knowledge of and skills with SPNEGO in the development team are very limited.
Due to issues with creating the keytab files, SPNEGO is only available in the formal test environment and not in our local test environment.
This makes it very difficult and time-consuming to test and develop, due to the long deployment time to the formal test environment.
Now over to my question:
If possible, I would like to be able to "log in" to a service in the formal test environment (an OAuth2 authentication server?) using SPNEGO SSO and get a token back that I can use in further requests towards my services located locally and/or in any other test environment.
Is this even possible? I have not seen any examples where the authentication server uses another SSO provider to actually authenticate the user.
A different possibility might be to do some sort of redirect from the login service in the test environment, but I fear the SPNEGO token created will only be valid on a server in the same domain.
I'm sorry if this question is confusing or unclear.
My knowledge of this domain (security) is limited and I struggle to get a grasp of how I can test my code locally with security enabled.
Links to any resources on the net that address some of these issues would be greatly appreciated.

Forward SPNEGO Credentials to Secure Cluster

I have a cluster secured by Kerberos, and have a REST API that needs to interact with the cluster on behalf of the user. I have used Spring Security with SPNEGO to authenticate the user, but when I try to use the Hadoop SDK, it fails for various reasons based on what I try.
When I try to use the SDK directly after the user logs in, it fails with "SIMPLE authentication is not enabled".
I have noticed the session's Authentication is a UsernamePasswordAuthenticationToken, which does not make sense, since I'm authenticating against the Kerberos realm with the user's credentials.
I am trying to use this project out of the box with my own service account and keytab: https://github.com/spring-projects/spring-security-kerberos/tree/master/spring-security-kerberos-samples/sec-server-spnego-form-auth
For what it's worth, you can leverage Apache Knox (http://knox.apache.org) to consume the Hadoop REST APIs in a secured cluster. Knox will take care of the SPNEGO negotiation with the various components for you. You could use the HTTP-header-based pre-auth SSO provider to propagate the identity of your end user to Knox.
Details: http://knox.apache.org/books/knox-0-8-0/user-guide.html#Preauthenticated+SSO+Provider
You will need to ensure that only trusted clients can call your service if you are using that provider, however.
Alternatively, you can authenticate to Knox against LDAP with username/password with the default Shiro provider.
One of the great benefits of using Knox this way is that your service never needs to know anything about whether the cluster is kerberized. Knox abstracts that from you.
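A rough sketch of what the pre-auth call from your service to Knox could look like; the SM_USER header name (a common default for the header-based pre-auth provider), gateway host, port and topology are assumptions you would adjust to your deployment:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Call WebHDFS through Knox, asserting the identity your SPNEGO filter already established.
    // Knox honors the header only because the pre-auth provider is configured to trust this client.
    public class KnoxPreAuthCall {
        public static void main(String[] args) throws IOException {
            URL url = new URL("https://knox.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("SM_USER", "alice");  // end-user principal authenticated upstream
            System.out.println("HTTP status: " + conn.getResponseCode());
        }
    }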
First of all, the Spring Security Kerberos extension is a terrible piece of code. I have evaluated it once and abstained from using it. You need the credential of the client authenticating to your cluster. You have basically two options here:
If you are on Tomcat, you can try the JEE pre-auth wrapper from Spring Security along with my Tomcat SPNEGO AD Authenticator from trunk. It will receive the delegated credential from the client, which will enable you to perform your task, assuming that your server account is trusted for delegation.
If the above is not an option, resort to S4U2Proxy/S4U2Self with Java 8, obtain a Kerberos ticket on behalf of the user principal, and then perform your REST API call.
As soon as you have the GSSCredential, the flow is the same.
Disclaimer: I have no idea about Hadoop but the GSS-API process is always the same.
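For the S4U2Self route, a minimal Java 8 GSS-API sketch looks roughly like the following; the JAAS login entry name ("service") and the user principal are placeholders, and the service account must be allowed to perform protocol transition in AD:

    import java.security.PrivilegedExceptionAction;
    import javax.security.auth.Subject;
    import javax.security.auth.login.LoginContext;
    import org.ietf.jgss.GSSCredential;
    import org.ietf.jgss.GSSManager;
    import org.ietf.jgss.GSSName;
    import com.sun.security.jgss.ExtendedGSSCredential;

    // Log in as the trusted service account (JAAS entry "service" backed by a keytab),
    // then ask the KDC for a credential on behalf of the end user (S4U2Self).
    // The resulting credential can then be used towards backends the service may delegate to (S4U2Proxy).
    public class S4U2SelfExample {
        public static GSSCredential impersonate(String userPrincipal) throws Exception {
            LoginContext lc = new LoginContext("service");
            lc.login();
            Subject serviceSubject = lc.getSubject();
            return Subject.doAs(serviceSubject,
                (PrivilegedExceptionAction<GSSCredential>) () -> {
                    GSSManager manager = GSSManager.getInstance();
                    GSSCredential self = manager.createCredential(GSSCredential.INITIATE_ONLY);
                    GSSName user = manager.createName(userPrincipal, GSSName.NT_USER_NAME);
                    return ((ExtendedGSSCredential) self).impersonate(user);
                });
        }
    }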

Centralized Authentication Server OpenAM vs FreeRadius

The basic requirement is to centralize the authentication and authorization of multiple SaaS applications to ease development (each SaaS application using minimal code to authenticate against a single source) and when necessary provide SSO. The authentication mechanism must handle the following options available to the user:
Use Third Party Authentication -- Google
Use our centralized authentication
Use the corporate provided authentication (ADFS)
In my research, I have found many, many ways this can be done, and have found OpenAM to be the most complete solution, but then I came across FreeRADIUS, which could also be used.
My Questions are:
There seems to be a plug-in for each tool that lets one use the other (OpenAM can authenticate against a RADIUS server), but is there any use case where FreeRADIUS would be preferred as the SOLE authentication server over OpenAM?
Does OpenAM require that a web agent be installed for the server? If all I am doing is serving a RESTful interface (developed in Node.js), is it possible to authenticate users without installing a web agent (there is no web agent for Node.js)?
Can I pass user credentials from Browser -> Server (Node.js) -> OpenAM, thereby never showing the user the OpenAM login screen? The OpenAM token would then be passed from OpenAM -> Server -> Browser (setting the cookie's origin as the SaaS application's).
That is, each SaaS application server will serve as a "proxy" for user management (authenticate, authorize, and manage [create|update|delete] users).
Thank you
I'm early to the Open Identity Stack game but I am deploying an OpenAM (and OpenIDM + OpenDJ) based solution to handle exactly the solutions you mention.
direct answers:
As far as handing sole authentication over to FreeRADIUS, I don't see why you would want to, but anything is possible. Given your mention of multiple identity sources (Google, ADFS, and your centralized authentication), I would think hooking up OpenAM to provide the RADIUS authentication (i.e. OpenAM's RADIUS hook, not FreeRADIUS) makes more sense.
No, a web agent doesn't have to be installed, though it may make sense. There are some Node.js pieces to help (https://github.com/alesium/node-openam). You just need to talk from your server to the OpenAM side over REST and that should be good (a minimal sketch of such a call follows this answer).
You can do that, or you can just skin the OpenAM login screen to look like your own. I'd suggest the latter, as you're then relying on OpenAM for the login screen's security. If you're doing a pure proxy then you take that burden on yourself. Your call as a design decision, obviously.
good luck!
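For the server-to-OpenAM REST call mentioned above, the sketch below shows roughly what it can look like; the host, deployment URI (/openam), realm and the header-based /json/authenticate endpoint are assumptions based on stock OpenAM, so check the REST docs for your version:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Server-side login against OpenAM's REST authenticate endpoint, so the browser
    // never sees the OpenAM login screen. The returned tokenId is the SSO token your
    // server can hand back to the browser as a cookie for the SaaS application's domain.
    public class OpenAmRestLogin {
        public static void main(String[] args) throws Exception {
            URL url = new URL("https://sso.example.com/openam/json/authenticate?realm=/");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setRequestProperty("X-OpenAM-Username", "demo");      // placeholder credentials
            conn.setRequestProperty("X-OpenAM-Password", "changeit");
            conn.setDoOutput(true);
            conn.getOutputStream().write("{}".getBytes("UTF-8"));
            try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                System.out.println(in.readLine());  // e.g. {"tokenId":"...","successUrl":"..."}
            }
        }
    }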
You're comparing a RADIUS server with a web SSO solution... I'm not sure this makes sense.
It seems FreeRADIUS does not have that many 'auth backends' (like OAuth to leverage Google authentication).
I am looking into the solution for a similar requirement myself, but I am looking to integrate 2FA as well. I have seen so many different solutions, but haven't pinned down the best one yet. Here is what I have come up with so far:
RCDev OpenID seems to be pretty comprehensive, and it is free for cases with less than 40 users.
Green Rocket's GreenRADIUS is expensive, but they have plugins for every scenario and it can work.
Red Hat's KeyCloak could be used in combination with TACACS+ or FreeRADIUS to accomplish this

Shibboleth Identity Server using External Shibboleth Identity Server for Authentication

I am designing a service to handle authentication across a number of hosted platforms. This service will need to be able to manage a number of different protocols for the users: LDAP, Shibboleth, and no doubt others.
I was hoping to use the Shibboleth protocol internally and run the service as a Shibboleth IdP where depending on user category, nature of the protected resource, etc. the hard work of the authentication is passed on to the native LDAP, Shibboleth, or other server where the user already has an account.
It seems as though this should be possible, but I have not gotten sufficient clarity from the Shibboleth documentation to work out whether it is, let alone how to do it.
Is this possible? How do I do it? Useful documentation very much appreciated.
Shibboleth is not a protocol; it is the name of the software stack that uses the SAML protocol.
Shibboleth can authenticate users from LDAP, AD (and database via a JAAS plugin).
Documentation on hooking up Shibboleth to use LDAP for authentication and attribute retrieval is at: https://wiki.shibboleth.net/confluence/display/SHIB2/ResolverLDAPDataConnector
