How to configure "Proxy user request" for NiFi CLI - apache-nifi

According to the documentation, one prerequisite for using NiFi CLI against a secured NiFi instance is to configure proxy user request for the node's identity (e.g. CN=localhost, OU=NIFI).
https://nifi.apache.org/docs/nifi-docs/html/toolkit-guide.html#prerequisites-for-running-in-a-secure-environment
I understand how to configure it through the NiFi web user interface. However, is it possible to do the same through scripting?
The reason is that I am working on a NiFi installation script, and I would like to install NiFi and configure users/policies in one go if it is possible.
Thank you!

If you are trying to use NiFi CLI to set up NiFi itself, then your only real option is for NiFi CLI to perform operations as the Initial Admin identity.
It then depends on how NiFi is configured to perform authentication, i.e. where your initial admin identity comes from. Is it a DN from a client cert, a user in LDAP, a Kerberos principal, etc.?
If it is a client cert, then you can just configure NiFi CLI to use that cert and it should work.
If it is an LDAP user, then you need to have NiFi CLI use one of NiFi's server certs to proxy the LDAP user.
Both of these scenarios are shown in the docs:
https://nifi.apache.org/docs/nifi-docs/html/toolkit-guide.html#security-configuration
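As a rough sketch (property names as documented in the toolkit guide; the base URL, paths, passwords, and identities below are placeholders for your environment), a CLI properties file for the two scenarios might look like this:

# nifi-cli.properties -- client cert scenario: the cert itself is the initial admin
baseUrl=https://nifi-host:8443
keystore=/path/to/admin-cert.jks
keystoreType=JKS
keystorePasswd=changeme
keyPasswd=changeme
truststore=/path/to/truststore.jks
truststoreType=JKS
truststorePasswd=changeme

# LDAP scenario: point keystore at one of NiFi's server/node certs instead and add
# proxiedEntity=your_ldap_admin_user

You would then pass the file to each command, e.g. ./bin/cli.sh nifi current-user -p /path/to/nifi-cli.properties, and script your user/policy commands from there.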

Related

HDP: Confused between knox proxy (API), proxy (UI) and SSO

In our HDP environment, we have already set up the first and second layers of security, Kerberos and Ranger. Now we want to add the third layer, Knox.
After reading the Knox reference documentation for HDP 3.1.4 and some other pages, we found three options for implementing Knox in an HDP cluster: Knox proxy (API), proxy (UI), and SSO.
For our needs, we want to use the proxy (API), and we have found guidance on implementing it (combined with Kerberos).
But we are confused about how to implement the proxy (UI) and SSO features:
What is the difference between the two?
When should we use proxy (UI) and when should we use SSO?
Can we use all three Knox options together?
Based on this link, there is a step that requires configuring Ambari authentication, "Configure Ambari Authentication for LDAP/AD". Does that mean we drop Ambari authentication with Kerberos?
But what about the Knox support matrix, which states that Knox SSO can be configured in a Kerberized cluster?
We also can't find how to use and configure Knox proxy (UI). Does it mean that if we want to launch the Atlas UI, the Knox authentication form appears first, or does it work differently?
Regards

Authenticate Nifi using OpenID Connect using API

I am new to OpenID Connect and the security domain. I have configured NiFi to use OpenID Connect for authentication, following the online documentation. To automate a few NiFi-related tasks we are using nipyapi.
I have already written Python code that does automated flow deployment for a basic NiFi installation (unsecured and without user authentication).
Now I have to move the code to a secured NiFi installation. How do I authenticate with OpenID Connect using nipyapi or the REST API?
As per the discussion with Bryan, I am planning to use a client certificate for authentication, but it started giving an authorization error, and I have created another question with the details:
Nifi - Client Certificate Authorization Error
OpenID Connect generally requires that you follow a flow of re-directs, typically in the browser. NiFi re-directs you to the login page of the OIDC provider, upon completion, the OIDC provider redirects you back to NiFi. I'm not exactly sure how, or if you even can, perform this login process from scripts. An easy alternative would be to just generate a client certificate to represent an automation user for any NiPyApi scripts. Client certificate authentication is always enabled by default for NiFi.
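If you go the client-certificate route with NiPyApi, a minimal sketch might look like the following (the host, file paths, and password are placeholders, and the exact helper names may vary between nipyapi versions):

import nipyapi

# Point nipyapi at the secured NiFi REST API (placeholder host)
nipyapi.config.nifi_config.host = 'https://nifi-host:8443/nifi-api'

# Present a client certificate instead of going through the OIDC browser flow;
# paths and password below are placeholders for your automation user's cert
nipyapi.security.set_service_ssl_context(
    service='nifi',
    ca_file='/path/to/ca.pem',
    client_cert_file='/path/to/automation-user-cert.pem',
    client_key_file='/path/to/automation-user-key.pem',
    client_key_password='changeme',
)

# Simple smoke test: fetch the root process group id
print(nipyapi.canvas.get_root_pg_id())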

Accessing cassandra without hardcoded username password

I have an existing Datastax Cassandra setup that is working. We just added authentication to the system and now we can log in with our AD accounts. This is very nice and certainly works. However applications need to use a hard-coded username/password in order to connect.
In SQL Server we were able to set up a user to run the service as, and then it would connect and authenticate through AD. However, in Cassandra it is not the same.
If I don't want to include usernames and especially passwords in my app.config files what are my options?
You can use authentication via LDAP with DSE (DataStax Enterprise), so the authentication stage is done against LDAP instead of the internal authentication in DSE that you're using at the moment. Note that my comments here apply to DSE 5.0 onwards, but you can use LDAP auth with earlier versions of DSE from 4.6 onwards.
The documentation (link below) covers this. The basic steps are as follows:
Configure your authenticator in the cassandra.yaml to use the DSE authenticator
authenticator: com.datastax.bdp.cassandra.auth.DseAuthenticator
Create an internal role in cassandra to map to the LDAP group(s) in your LDAP server using the CREATE ROLE command
Ensure all the users you need to use map to the relevant LDAP group (part of your LDAP config)
Configure your dse.yaml to have the correct settings for your LDAP server
Restart the DSE process for the settings to take effect
The following documentation gives some good examples and background information:
https://docs.datastax.com/en/latest-dse/datastax_enterprise/unifiedAuth/unifiedAuthConfig.html
https://docs.datastax.com/en/latest-dse/datastax_enterprise/sec/authLdapConfig.html
Note: when configuring the dse.yaml note the comment in the docs regarding user_search_filter:
When using Active Directory set the filter to (sAMAccountName={0})
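Putting those steps together, a rough sketch of the relevant settings might look like this (option names as in the DSE 5.x docs, so check them against your version; hosts, DNs, passwords, and the role name are placeholders):

# cassandra.yaml
authenticator: com.datastax.bdp.cassandra.auth.DseAuthenticator

# dse.yaml -- enable the LDAP scheme and point it at your AD/LDAP server
authentication_options:
    enabled: true
    default_scheme: ldap
ldap_options:
    server_host: ad.example.com
    server_port: 389
    search_dn: cn=lookup,ou=users,dc=example,dc=com
    search_password: changeme
    user_search_base: ou=users,dc=example,dc=com
    user_search_filter: (sAMAccountName={0})

-- CQL: create a role for the LDAP users/groups to map onto, then grant it what it needs
CREATE ROLE 'app_users' WITH LOGIN = true;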

Forward SPNEGO Credentials to Secure Cluster

I have a cluster secured by Kerberos, and have a REST API that needs to interact with the cluster on behalf of the user. I have used Spring Security with SPNEGO to authenticate the user, but when I try to use the Hadoop SDK, it fails for various reasons based on what I try.
When I try to use the SDK directly after the user logs in, it gives me "SIMPLE authentication is not enabled".
I have noticed the session's Authenticator is UsernamePasswordAuthenticationToken, which does not make sense, since I'm authenticating against the Kerberos realm with the credentials from the user.
I am trying to use this project out of the box with my own service account and keytab: https://github.com/spring-projects/spring-security-kerberos/tree/master/spring-security-kerberos-samples/sec-server-spnego-form-auth
For what it's worth, you can leverage Apache Knox (http://knox.apache.org) to consume the Hadoop REST APIs in a secured cluster. Knox will take care of the SPNEGO negotiation with the various components for you. You could use the HTTP header based pre-auth SSO provider to propagate the identity of your enduser to Knox.
Details: http://knox.apache.org/books/knox-0-8-0/user-guide.html#Preauthenticated+SSO+Provider
However, if you are using that provider, you will need to ensure that only trusted clients can call your service.
Alternatively, you can authenticate to Knox against LDAP with username/password with the default Shiro provider.
One of the great benefits of using Knox this way is that your service never needs to know anything about whether the cluster is kerberized. Knox abstracts that from you.
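As an illustration (not an authoritative config), a call to a Hadoop REST API through Knox with the header-based pre-auth provider could look roughly like this from your service; the host, topology name, and CA path are placeholders, and SM_USER is the provider's default identity header unless you configure a custom one:

import requests

# Sketch: list a WebHDFS directory through Knox, asserting the end user's identity
# via the pre-auth header. Only your trusted service should be able to reach Knox
# with this provider enabled.
resp = requests.get(
    'https://knox-host:8443/gateway/sandbox/webhdfs/v1/tmp',  # placeholder topology/path
    params={'op': 'LISTSTATUS'},
    headers={'SM_USER': 'enduser'},  # identity of the already-authenticated user
    verify='/path/to/knox-ca.pem',
)
print(resp.status_code, resp.json())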
First of all, Spring Sec Kerberos Extension is a terrible piece of code. I have evaluated it once and abstained from using it. You need the credential of the client authenticating to your cluster. You have basically two options here:
If you are on Tomcat, you can try the JEE pre-auth wrapper from Spring Security along with my Tomcat SPNEGO AD Authenticator from trunk. It will receive the delegated credential from the client, which will enable you to perform your task, assuming that your server account is trusted for delegation.
If the above is not an option, resort to S4U2Proxy/S4U2Self with Java 8, obtain a Kerberos ticket on behalf of the user principal, and then perform your REST API call.
As soon as you have the GSSCredential the flow is the same.
Disclaimer: I have no idea about Hadoop but the GSS-API process is always the same.

API to add principals to Kerberos

I am trying to kerberize my RESTful backend, and I don't see anywhere in the GSS-API documentation how I could add a user/service; i.e., I understand the authentication process with GSS-API, but not the signup process. To make my question simpler: kinit is the command-line tool used to add principals; is there an equivalent for GSS-API? If the answer is no, should I go and look at the kinit source code and port it to my project (using system("kinit ...") raises security questions/problems, so I am not considering it)?
Users should be added to the Kerberos database using the API provided by the Key Distribution Center (KDC). In Microsoft Active Directory, the KDC uses LDAP as its database, so users can be added/removed using JNDI, as described here: http://cyberlizard.livejournal.com/120080.html.
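For a scripted equivalent of that JNDI approach, here is a sketch using Python's ldap3 library instead (a swapped-in technique, not from the linked article; the server, bind account, and DNs below are placeholders for your AD domain):

from ldap3 import Server, Connection, NTLM

# Bind to AD over LDAPS with an admin/service account (placeholders)
server = Server('ad.example.com', port=636, use_ssl=True)
conn = Connection(server, user='EXAMPLE\\admin', password='changeme',
                  authentication=NTLM, auto_bind=True)

# Create the account object; AD will additionally need a password set and the
# account enabled (userAccountControl) before it can authenticate.
conn.add(
    'CN=svc-rest,OU=ServiceAccounts,DC=example,DC=com',
    object_class=['top', 'person', 'organizationalPerson', 'user'],
    attributes={'sAMAccountName': 'svc-rest',
                'userPrincipalName': 'svc-rest@example.com'},
)
print(conn.result)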
kinit is not a tool for adding users but for (simply speaking) "logging in", or (technically speaking) a tool that "obtains and caches an initial ticket-granting ticket for principal" (see: http://web.mit.edu/kerberos/krb5-devel/doc/user/user_commands/kinit.html), i.e. it takes credentials (for example a principal and password), connects to the KDC, and tries to receive an initial TGT from it.