Apache Knox LDAP integration failed - hadoop

I am using Apache Knox version 1.0.0 and am trying to authenticate the Knox UI against an LDAP user. I made the following changes in Knox,
in Ambari --> Knox --> Config --> Advanced topology:
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>cn=admin</value>
</param>
<param>
<name>main.ldapRealm.contextFactory.url</name>
<value>ldap://x.x.x.x:10390</value>
</param>
First, I am trying with a single user only. I have tried different user searches, but no luck; I always get the same error, shown below:
2021-01-29 10:22:07,266 ERROR knox.gateway (KnoxLdapRealm.java:doGetAuthenticationInfo(206)) - Shiro unable to login: javax.naming.AuthenticationException: [LDAP: error
code 49 - INVALID_CREDENTIALS: Bind failed: Invalid authentication]

LDAP error code 49 (INVALID_CREDENTIALS) means one of two things: the username/password is incorrect, or the account is locked. You are getting this error for the bind user.
You need to verify the systemUsername and systemPassword in the configured topology.
The ldapsearch tool is useful for verifying the bind user's credentials.
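For example, a bind check from the command line might look like the following sketch; the bind DN, password, and base DN here are placeholders you must replace with your directory's actual values (only the host and port are taken from the topology above):
# -D is the bind DN and -w its password; a result set (or at least no error 49)
# confirms the credentials, while error 49 reproduces the Knox failure outside of Knox.
ldapsearch -x -H ldap://x.x.x.x:10390 \
  -D "cn=admin,dc=example,dc=com" -w 'bind-password' \
  -b "dc=example,dc=com" "(cn=someuser)"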
main.ldapRealm.userDnTemplate should look like the following; the login username is substituted for {0} to form the full bind DN:
<param>
<name>main.ldapRealm.userDnTemplate</name>
<value>cn={0},ou=hadoop,ou=personal,ou=accounts,dc=example,dc=com</value>
</param>

Related

User authentication issue in Kerberos with keytab

I am trying to integrate Kerberized Hadoop with Pinot, using the configurations below.
Executables:
export HADOOP_HOME=/usr/hdp/2.6.3.0-235/hadoop
export HADOOP_VERSION=2.7.3.2.6.3.0-235
export HADOOP_GUAVA_VERSION=11.0.2
export HADOOP_GSON_VERSION=2.2.4
export GC_LOG_LOCATION=/home/hdfs/Pinot/pinotGcLog
export PINOT_VERSION=0.7.1
export PINOT_DISTRIBUTION_DIR=/home/hdfs/Pinot_IMP_FOLDER/apache-pinot-incubating-0.7.1-bin
export HADOOP_CLIENT_OPTS="-Dplugins.dir=${PINOT_DISTRIBUTION_DIR}/plugins -Dlog4j2.configurationFile=${PINOT_DISTRIBUTION_DIR}/conf/pinot-ingestion-job-log4j2.xml"
export SERVER_CONF_DIR=/home/hdfs/Pinot_IMP_FOLDER/apache-pinot-incubating-0.7.1-bin/bin
export ZOOKEEPER_ADDRESS=<ZOOKEEPER_ADDRESS>
export CLASSPATH_PREFIX="${HADOOP_HOME}/hadoop-hdfs/hadoop-hdfs-${HADOOP_VERSION}.jar:${HADOOP_HOME}/hadoop-annotations-${HADOOP_VERSION}.jar:${HADOOP_HOME}/hadoop-auth-${HADOOP_VERSION}.jar:${HADOOP_HOME}/hadoop-common-${HADOOP_VERSION}.jar:${HADOOP_HOME}/lib/guava-${HADOOP_GUAVA_VERSION}.jar:${HADOOP_HOME}/lib/gson-${HADOOP_GSON_VERSION}.jar"
export JAVA_OPTS="-Xms4G -Xmx16G -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintGCApplicationStoppedTime -XX:+PrintGCApplicationConcurrentTime -Xloggc:${GC_LOG_LOCATION}/gc-pinot-server.log"
controller.conf
controller.data.dir=<fs.defaultFS>/user/hdfs/controller_segment
controller.local.temp.dir=/home/hdfs/Pinot/pinot_tmp/
controller.zk.str=<ZOOKEEPER_ADDRESS>
controller.enable.split.commit=true
controller.access.protocols.http.port=9000
controller.helix.cluster.name=PinotCluster
pinot.controller.storage.factory.class.hdfs=org.apache.pinot.plugin.filesystem.HadoopPinotFS
pinot.controller.storage.factory.hdfs.hadoop.conf.path=/usr/hdp/2.6.3.0-235/hadoop/conf
pinot.controller.segment.fetcher.protocols=file,http,hdfs
pinot.controller.segment.fetcher.hdfs.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.principle='hdfs@HDFSSITHDP.COM'
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.keytab='/home/hdfs/hdfs.keytab'
pinot.controller.storage.factory.hdfs.hadoop.kerberos.principle='hdfs@HDFSSITHDP.COM'
pinot.controller.storage.factory.hdfs.hadoop.kerberos.keytab='/home/hdfs/hdfs.keytab'
controller.vip.port=9000
controller.port=9000
pinot.set.instance.id.to.hostname=true
pinot.server.grpc.enable=true
Kerberos information:
kinit -V -k -t /home/hdfs/hdfs.keytab hdfs@HDFSSITHDP.COM
Using default cache: /tmp/krb5cc_57372
Using principal: hdfs@HDFSSITHDP.COM
Using keytab: /home/hdfs/hdfs.keytab
Authenticated to Kerberos v5
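Since kinit succeeds, the keytab itself is usable from the shell; as a quick sanity check (assuming MIT Kerberos client tools), the ticket it obtained can be listed from the cache path kinit reported:
# Lists the ticket obtained by the kinit above.
klist -c /tmp/krb5cc_57372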
ERROR MESSAGE:
END: Invoking TASK controller pipeline for event ResourceConfigChange::15fc3764_TASK for cluster PinotCluster, took 278 ms
START AsyncProcess: TASK::TaskGarbageCollectionStage
END AsyncProcess: TASK::TaskGarbageCollectionStage, took 0 ms
Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Trying to authenticate user 'hdfs@HDFSSITHDP.COM' with keytab '/home/hdfs/hdfs.keytab'..
Could not instantiate file system for class org.apache.pinot.plugin.filesystem.HadoopPinotFS with scheme hdfs
java.lang.RuntimeException: Failed to authenticate user principal ['hdfs@HDFSSITHDP.COM'] with keytab ['/home/hdfs/hdfs.keytab']
at org.apache.pinot.plugin.filesystem.HadoopPinotFS.authenticate(HadoopPinotFS.java:258) ~[pinot-hdfs-0.7.1-shaded.jar:0.7.1-e22be7c3a39e840321d3658e7505f21768b228d6]
Caused by: java.io.IOException: Login failure for 'hdfs@HDFSSITHDP.COM' from keytab '/home/hdfs/hdfs.keytab': javax.security.auth.login.LoginException: Unable to obtain password from user
at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:962) ~[pinot-orc-0.7.1-shaded.jar:0.7.1-e22be7c3a39e840321d3658e7505f21768b228d6]
at org.apache.pinot.plugin.filesystem.HadoopPinotFS.authenticate(HadoopPinotFS.java:254) ~[pinot-hdfs-0.7.1-shaded.jar:0.7.1-e22be7c3a39e840321d3658e7505f21768b228d6]
... 15 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:901) ~[?:1.8.0_241]
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:764) ~[?:1.8.0_241]
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617) ~[?:1.8.0_241]
at org.apache.pinot.plugin.filesystem.HadoopPinotFS.authenticate(HadoopPinotFS.java:254) ~[pinot-hdfs-0.7.1-shaded.jar:0.7.1-e22be7c3a39e840321d3658e7505f21768b228d6]
... 15 more
Failed to start a Pinot [CONTROLLER] at 21.954 since launch
java.lang.RuntimeException: java.lang.RuntimeException: Failed to authenticate user principal ['hdfs@HDFSSITHDP.COM'] with keytab ['/home/hdfs/hdfs.keytab']
at org.apache.pinot.spi.filesystem.PinotFSFactory.register(PinotFSFactory.java:58) ~[pinot-all-0.7.1-jar-with-dependencies.jar:0.7.1-e22be7c3a39e840321d3658e7505f21768b228d6]
P.S. I am executing this as the hdfs user, and the keytab file's owner is also hdfs. I have also given 777 permissions to the hdfs.keytab file.
Could someone kindly suggest what the issue is here? I have read multiple blogs, and everywhere I found that this is caused by a wrong principal/keytab combination, the user not having access, needing to give 777 permissions to the file, or needing to try a different user. I tried all of those options, but nothing has worked so far.
It works now. I just removed the single quotes from the keytab path and principal name; it was unable to read the keytab with the quotes included.
Old Configuration:
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.principle='hdfs@HDFSSITHDP.COM'
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.keytab='/home/hdfs/hdfs.keytab'
pinot.controller.storage.factory.hdfs.hadoop.kerberos.principle='hdfs@HDFSSITHDP.COM'
pinot.controller.storage.factory.hdfs.hadoop.kerberos.keytab='/home/hdfs/hdfs.keytab'
New Configuration:
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.principle=hdfs@HDFSSITHDP.COM
pinot.controller.segment.fetcher.hdfs.hadoop.kerberos.keytab=/home/hdfs/hdfs.keytab
pinot.controller.storage.factory.hdfs.hadoop.kerberos.principle=hdfs@HDFSSITHDP.COM
pinot.controller.storage.factory.hdfs.hadoop.kerberos.keytab=/home/hdfs/hdfs.keytab
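A useful check when chasing this kind of mismatch is to list the principals actually stored in the keytab and compare them character for character with the value in the config (no quotes around it); assuming MIT Kerberos client tools:
# -k reads a keytab instead of a ticket cache; -t also prints entry timestamps.
klist -kt /home/hdfs/hdfs.keytab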

Error in OAuth mediator with WSO2 EI 7.0.2

I am trying to use the OAuth mediator for validating an API, with WSO2 IS as my IAM server. The OAuth mediator is configured to connect to the WSO2 IS server using the URL 'https://localhost:9443/services'. While invoking, I get the following exceptions; please see the exception stack below:
[2020-05-27 13:47:27,939] WARN {org.apache.synapse.commons.util.MiscellaneousUtil} - Error loading properties from a file at from the System defined location: nhttp.properties
[2020-05-27 13:48:01,105] INFO {org.apache.axis2.transport.http.HTTPSender} - Unable to sendViaPost to url[https://localhost:9443/services/OAuth2TokenValidationService]
javax.net.ssl.SSLPeerUnverifiedException: peer not authenticated
at sun.security.ssl.SSLSessionImpl.getPeerCertificates(SSLSessionImpl.java:450)
at org.apache.commons.httpclient.protocol.SSLProtocolSocketFactory.verifyHostName(SSLProtocolSocketFactory.java:276)
at org.apache.commons.httpclient.protocol.SSLProtocolSocketFactory.createSocket(SSLProtocolSocketFactory.java:186)
at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:707)
at org.apache.commons.httpclient.MultiThreadedHttpConnectionManager$HttpConnectionAdapter.open(MultiThreadedHttpConnectionManager.java:1361)
...
I updated the keystore and it is working now.
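For reference, the usual way to fix this SSLPeerUnverifiedException is to import the WSO2 IS server certificate into the EI client truststore. This is a sketch assuming default WSO2 keystore locations, aliases, and the default 'wso2carbon' store password; adjust the paths to your installation:
# Export the IS server certificate from the IS keystore.
keytool -export -alias wso2carbon -file wso2is.crt \
  -keystore <IS_HOME>/repository/resources/security/wso2carbon.jks -storepass wso2carbon
# Import it into the EI client truststore so the TLS peer can be verified.
keytool -import -alias wso2is -file wso2is.crt \
  -keystore <EI_HOME>/repository/resources/security/client-truststore.jks -storepass wso2carbon
After updating the truststore, restart EI so the change is picked up.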

LDAP invalid login credentials

I installed slapd (OpenLDAP) on my Ubuntu 12.04 system, following https://help.ubuntu.com/community/OpenLDAPServer.
I can add and search records in LDAP from the terminal; e.g. I can add an LDIF file:
ldapadd -x -D cn=admin,dc=test,dc=com -W -f ldap-add.ldif
It asks me for a password; I entered pass000 and it added the new entry. It works fine: I can add and search records from the terminal. Then I tried it from my Spring application. I added the ldap-core dependency to my pom, set up the beans, etc. Everything is fine, except it gives me an invalid-credentials error, even though I entered the same password as when adding the LDIF file from the terminal. My bean configuration is:
<!-- ldap template -->
<ldap:context-source id="contextSource" url="ldap://localhost:389"
base="dc=test,dc=com" username="cn=admin" password="pass000" />
<ldap:ldap-template id="ldapTemplate"
context-source-ref="contextSource" />
<bean id="personDao" class="com.cheasyy.cofinding.dao.PersonDaoImpl">
<property name="ldapTemplate" ref="ldapTemplate" />
</bean>
It gives this error:
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.springframework.ldap.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]; nested exception is javax.naming.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:894)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778)
javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51)
com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:205)
com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:266)
root cause
org.springframework.ldap.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]; nested exception is javax.naming.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]
org.springframework.ldap.support.LdapUtils.convertLdapException(LdapUtils.java:191)
org.springframework.ldap.core.support.AbstractContextSource.createContext(AbstractContextSource.java:356)
org.springframework.ldap.core.support.AbstractContextSource.doGetContext(AbstractContextSource.java:140)
If I can add an LDIF file from the terminal with the same credentials, then why not from my Spring application? Is there anything missing in the configuration?
The 'username' DN needs to be the full DN of the admin user, including the base; in your case:
<ldap:context-source id="contextSource" url="ldap://localhost:389"
base="dc=test,dc=com" username="cn=admin,dc=test,dc=com" password="pass000" />
It's not uncommon for the admin user to be located in a different part of the LDAP tree than the base DN from which you want your operations to originate.
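You can confirm the full DN binds correctly outside of Spring with ldapwhoami (part of the OpenLDAP client utilities):
# A successful simple bind prints the authenticated DN;
# error code 49 here means the DN/password pair itself is wrong.
ldapwhoami -x -H ldap://localhost:389 -D "cn=admin,dc=test,dc=com" -w pass000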

org.hibernate.exception.GenericJDBCException: Cannot open connection, with root cause java.sql.SQLException

Hello, I am still new to Spring and Hibernate; this is my first app trying to connect to a database, but I am getting this exception: HTTP Status 500 - Request processing failed; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection.
And in the console: org.hibernate.exception.GenericJDBCException: Cannot open connection, with root cause java.sql.SQLException:
Access denied for user 'root'@'localhost' (using password: YES)
Can anyone help me, please? I checked the connection with another Java app and it worked perfectly!
database.properties
database.driver=com.mysql.jdbc.Driver
database.url=jdbc:mysql://localhost:3306/DAVDB
database.user=root
database.password=''
hibernate.dialect=org.hibernate.dialect.MySQLDialect
hibernate.show_sql=true
hibernate.hbm2ddl.auto=update
Your database.properties is not correct:
database.password=''
Should be:
database.password=
Assuming you want an empty password. Quotes have no special meaning in property files, so they are taken literally: the configured password becomes the two-character string '', which is why MySQL reports "using password: YES".
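You can verify the empty password outside the app with the MySQL command-line client, assuming it is installed on the same host:
# --password= with nothing after the '=' sends an empty password,
# matching database.password= in the properties file above.
mysql -u root --password= -h localhost DAVDB
If this login is refused too, the problem is the MySQL account itself rather than the properties file.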

JasperReports Server "The job failed to execute"

I am new to JasperReports Server and have searched for a resolution to this issue, but found nothing. I inherited this server, and the reports are not running as scheduled.
Job: 3rd (ID: 61)
Report unit: /reports/Scheduled/00_Schedule_Primer
Quartz Job: ReportJobs.job_61
Quartz Trigger: ReportJobs.trigger_62_1
Exceptions:
An error occurred while executing the report.
com.jaspersoft.jasperserver.api.JSException: jsexception.error.creating.connection
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.JdbcDataSourceService.createConnection(JdbcDataSourceService.java:62)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.BaseJdbcDataSource.setReportParameterValues(BaseJdbcDataSource.java:52)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.JdbcDataSourceService.setReportParameterValues(JdbcDataSourceService.java:67)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.EngineServiceImpl.fillReport(EngineServiceImpl.java:743)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.EngineServiceImpl.fillReport(EngineServiceImpl.java:367)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.EngineServiceImpl.executeReport(EngineServiceImpl.java:876)
at com.jaspersoft.jasperserver.api.engine.jasperreports.domain.impl.ReportUnitRequest.execute(ReportUnitRequest.java:60)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.EngineServiceImpl.execute(EngineServiceImpl.java:301)
at com.jaspersoft.jasperserver.api.engine.scheduling.quartz.ReportExecutionJob.executeReport(ReportExecutionJob.java:444)
at com.jaspersoft.jasperserver.api.engine.scheduling.quartz.ReportExecutionJob.executeAndSendReport(ReportExecutionJob.java:372)
at com.jaspersoft.jasperserver.api.engine.scheduling.quartz.ReportExecutionJob.execute(ReportExecutionJob.java:188)
at org.quartz.core.JobRunShell.run(JobRunShell.java:195)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:520)
Caused by: java.sql.SQLException: Middleware connect fail:No RPC Connection active.
at com.ibm.u2.jdbc.UniJDBCMsgFactory.createException(UniJDBCMsgFactory.java:113)
at com.ibm.u2.jdbc.UniJDBCExceptionSupport.addAndThrowException(UniJDBCExceptionSupport.java:62)
at com.ibm.u2.jdbc.UniJDBCProtocolU2Impl.connect(UniJDBCProtocolU2Impl.java:746)
at com.ibm.u2.jdbc.UniJDBCProtocolU2Impl.executeOpenDatabase(UniJDBCProtocolU2Impl.java:243)
at com.ibm.u2.jdbc.UniJDBCConnectionImpl.<init>(UniJDBCConnectionImpl.java:116)
at com.ibm.u2.jdbc.UniJDBCDriver.connect(UniJDBCDriver.java:111)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at org.apache.commons.dbcp.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:48)
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:290)
at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:771)
at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:95)
at com.jaspersoft.jasperserver.api.engine.jasperreports.service.impl.JdbcDataSourceService.createConnection(JdbcDataSourceService.java:58)
... 12 more
I don't even have a clue where to start troubleshooting this. ANY help would be awesome!
It seems your connection URL, username, or password values are not correct; if all three are correct, the connection will be created. Check the data source configuration in your JasperReports Server. Note also that the root cause, "Middleware connect fail: No RPC Connection active", comes from the IBM U2 JDBC driver and typically means the UniRPC service (unirpcd) on the database host is down or unreachable, so it is worth checking connectivity as well.
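As a first connectivity check, hedged on the defaults: unirpcd normally listens on port 31438, so something like the following (with dbhost replaced by the host from your JDBC URL) tells you whether the service is reachable at all:
# A refused or timed-out connection points at the UniRPC service, not the credentials.
nc -zv dbhost 31438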
