Configuring/running Oozie 3.3.2 and Hadoop 1.2.0

I have been working with different versions of both Hadoop and Oozie to get them working together, but haven't succeeded. Recently I tried the two versions (Hadoop 1.2.0 and Oozie 3.3.2) that are bundled together in the Hortonworks 1.3 Data Platform, so I figured they would work together. I configured and ran Hadoop on a single node in pseudo-distributed mode.
Although I was able to build Oozie successfully, including the JAR files from Hadoop, when I try to run it I get the following error at http://hostname:11000/oozie. Any help is greatly appreciated; it's frustrating not to have figured this out for weeks.
HTTP Status 500 - Filter execution threw an exception
type Exception report
message Filter execution threw an exception
description The server encountered an internal error that prevented it from fulfilling this request.
exception
javax.servlet.ServletException: Filter execution threw an exception
org.apache.oozie.servlet.HostnameFilter.doFilter(HostnameFilter.java:84)
root cause
java.lang.NoSuchMethodError: org.apache.http.client.utils.URLEncodedUtils.parse(Ljava/lang/String;Ljava/nio/charset/Charset;)Ljava/util/List;
org.apache.hadoop.security.authentication.server.PseudoAuthenticationHandler.getUserName(PseudoAuthenticationHandler.java:124)
org.apache.hadoop.security.authentication.server.PseudoAuthenticationHandler.authenticate(PseudoAuthenticationHandler.java:160)
org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:349)
org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:131)
org.apache.oozie.servlet.HostnameFilter.doFilter(HostnameFilter.java:84)
note The full stack trace of the root cause is available in the Apache Tomcat/6.0.36 logs.

Related

ClassNotFoundException in Nifi flow

Hey StackOverflow community,
I have some problems with my NiFi flow. I made one to take my data from my Azure blob storage and put it into my HDFS cluster (still in Azure).
My configuration of the PutHDFS processor in NiFi is shown in the attached screenshot (PutHDFSConfiguration).
But when I fill in the "Hadoop resources" field, I get the following error:
PutHDFS[id=89381b69-015d-1000-deb7-50b6cf485d28] org.apache.hadoop.fs.adl.HdiAdlFileSystem: java.lang.ClassNotFoundException: org.apache.hadoop.fs.adl.HdiAdlFileSystem
PutHDFS[id=89381b69-015d-1000-deb7-50b6cf485d28] PutHDFS[id=89381b69-015d-1000-deb7-50b6cf485d28] failed to invoke #OnScheduled method due to java.lang.RuntimeException: Failed while executing one of processor's OnScheduled task.; processor will not be scheduled to run for 30 seconds: java.lang.RuntimeException: Failed while executing one of processor's OnScheduled task.
How can I resolve this and get my data into my cluster?
Thanks for your answers.
Apache NiFi does not bundle any of the Azure-related libraries; it only bundles the standard Apache Hadoop client, currently 2.7.3 if using a recent NiFi release.
You can specify the location of the additional Azure JARs through the PutHDFS processor property called "Additional Classpath Resources".
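For example, assuming the required Azure Data Lake client JARs have been copied to a directory on the NiFi host such as /opt/nifi/azure-libs (a hypothetical path), the PutHDFS property could be set to:
Additional Classpath Resources: /opt/nifi/azure-libs
NiFi then adds the JARs found there to the processor's classpath, so classes such as org.apache.hadoop.fs.adl.HdiAdlFileSystem referenced from core-site.xml can be resolved.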

HBase client does not work under JBoss AS 7.1

I have a JBoss application which needs to talk remotely to an HBase server. When used from a simple console project the HBase client works perfectly, but when deployed in the JBoss server it looks like the server is not loading the class org.apache.hadoop.hdfs.web.resources.UserProvider.
Can anyone help with a workaround or a fix?
Your replies are much appreciated.
Error message
ERROR [org.apache.catalina.core.ContainerBase.[jboss.web].[default-host].[/HFPlatformWeb]] (http--0.0.0.0-8080-6) StandardWrapper.Throwable: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.web.resources.UserProvider from ...
List of jars
commons-configuration-1.6.jar
commons-lang-2.5.jar
commons-logging-1.1.1.jar
guava-11.0.2.jar
hadoop-auth-2.0.0-cdh4.4.0.jar
hadoop-common-2.0.0-cdh4.4.0.jar
hadoop-core-2.0.0-mr1-cdh4.4.0.jar
hadoop-hdfs-2.0.0-cdh4.4.0.jar
hbase.jar
log4j-1.2.17.jar
protobuf-java-2.4.0a.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
zookeeper-3.4.5-cdh4.4.0.jar
At least one clue should be in the exception trace. It is strange that you need hdfs.web.resources at all. Please look at your exception stack on one side and at the Cloudera JARs on the other to see where this class 'lives'.
Have you actually loaded hadoop-hdfs? As far as I remember it is not a 'fixed' dependency but rather the implementation of the mechanics for handling the HDFS scheme.
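One quick way to check where (or whether) the class is resolvable at runtime is a small diagnostic along these lines; a minimal sketch, with the class name taken from the exception above (under JBoss AS 7's modular classloading it is most telling when run from inside the deployment itself):
import java.security.CodeSource;

public class ClassLocator {
    public static void main(String[] args) {
        String name = "org.apache.hadoop.hdfs.web.resources.UserProvider";
        try {
            Class<?> cls = Class.forName(name);
            // getCodeSource() can be null for classes from the bootstrap class loader
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            System.out.println(name + " loaded from: "
                    + (src != null ? src.getLocation() : "bootstrap/unknown"));
        } catch (ClassNotFoundException e) {
            System.out.println(name + " is not visible on this classpath");
        }
    }
}
If the class turns out to live in a JAR that JBoss is not exposing to the deployment, that points at the module/classloading configuration rather than a missing dependency.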
I'd also recommend upgrading the Cloudera cluster to a Cloudera 5 environment. It's a rather big step, starting from HBase 0.96.x and Hadoop 2.3.x, which is a serious advantage; for me another difference was the YARN infrastructure as the default MapReduce handler. This won't fix your issue by itself, but if you don't do it now you will face the upgrade complexity soon anyway. It starts with HBase being split into sub-components rather than the single hbase.jar of CDH4, so the dependencies look quite different.
WARNING: this last point is just a recommendation based on my own experience, if your cluster is still in an experimental phase.

Error calling Solr Cloud Index within Hadoop Job

My goal is to run an Elastic MapReduce job that queries a Solr index in the map phase and writes the result to S3. Solr and Hadoop worked fine together when building a Solr index within a Hadoop job (i.e. writing to a Solr index). When I run a job that queries a Solr index, I get an error while trying to initialize the Solr client. I suspect there's a dependency conflict between Hadoop and Solr; I recall they use different versions of the HTTP client libraries, and the error is a method-not-found issue. Here's the stack trace:
2013-07-24 03:17:47,082 FATAL org.apache.hadoop.mapred.Child (main): Error running child : java.lang.NoSuchMethodError: org.apache.http.impl.conn.SchemeRegistryFactory.createSystemDefault()Lorg/apache/http/conn/scheme/SchemeRegistry;
at org.apache.http.impl.client.SystemDefaultHttpClient.createClientConnectionManager(SystemDefaultHttpClient.java:118)
at org.apache.http.impl.client.AbstractHttpClient.getConnectionManager(AbstractHttpClient.java:445)
at org.apache.solr.client.solrj.impl.HttpClientUtil.setMaxConnections(HttpClientUtil.java:179)
at org.apache.solr.client.solrj.impl.HttpClientConfigurer.configure(HttpClientConfigurer.java:33)
at org.apache.solr.client.solrj.impl.HttpClientUtil.configureClient(HttpClientUtil.java:115)
at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:105)
at org.apache.solr.client.solrj.impl.HttpSolrServer.<init>(HttpSolrServer.java:154)
at org.apache.solr.client.solrj.impl.HttpSolrServer.<init>(HttpSolrServer.java:127)
Adding this option did the trick:
--args -s,mapreduce.user.classpath.first=true
Putting the user-defined classpath first resolved the dependency conflict between the Hadoop and Solr JARs.
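The same setting can also be applied programmatically in the job driver; this is only a sketch, assuming a Hadoop 1.x-era MapReduce job (the class and job names are illustrative, and newer Hadoop releases use mapreduce.job.user.classpath.first instead):
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class SolrQueryJobDriver {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Put user-supplied JARs ahead of the ones bundled with Hadoop so that
        // SolrJ resolves its own httpclient version (same effect as the EMR option above).
        conf.setBoolean("mapreduce.user.classpath.first", true);
        Job job = new Job(conf, "solr-query-job");
        // ... configure mapper, input/output formats and paths as in the existing job ...
    }
}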

BIRT Integration - ODA Exception

My team is using BIRT to handle our clients' business logic. Every so often we get an exception in our logs:
18-Dec-2012 11:25:39.163 INFO
org.eclipse.birt.report.data.oda.jdbc.JndiDataSource.getDriverJndiPropertyFile
getDriverJndiPropertyFile() java.io.IOException: Unable to locate the installation path of
the ODA extension (org.eclipse.birt.runtime). The ODA consumer application must specify a
ResourceIdentifiers in the appContext to resolve the path.
We're not sure what causes this error, and everything seems to be working fine despite it. We have a JDBC connection, so we're not sure what would trigger this. Any tips or info we could use to troubleshoot this issue would be greatly appreciated.
Copy
mysql-connector-java-5.0.8-bin.jar
to
WEB-INF\platform\plugins\org.eclipse.birt.report.data.oda.jdbc_INSTALLED_VERSION\drivers
folder.
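The exception text also hints at the other way to silence it: supplying a ResourceIdentifiers object in the app context of the engine task. Below is a rough sketch of what that usually looks like; treat the class (org.eclipse.datatools.connectivity.oda.util.ResourceIdentifiers), the context key constant, and the URIs as assumptions to verify against your BIRT/DTP version:
import java.net.URI;
import org.eclipse.birt.report.engine.api.IRunAndRenderTask;
import org.eclipse.datatools.connectivity.oda.util.ResourceIdentifiers;

public class OdaContextHelper {
    // Sketch only: verify class names and the app-context key against your BIRT release.
    public static void setResourceIdentifiers(IRunAndRenderTask task, URI resourceFolder) {
        ResourceIdentifiers ids = new ResourceIdentifiers();
        ids.setDesignResourceBaseURI(resourceFolder); // folder containing the report design resources
        ids.setApplResourceBaseURI(resourceFolder);   // application resource folder
        task.getAppContext().put(
                ResourceIdentifiers.ODA_APP_CONTEXT_KEY_CONSUMER_RESOURCE_IDS, ids);
    }
}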

SpringIDE 2.6.0 failing to initialize with error under Helios

I'm running into an issue with the Spring IDE release 2.6.0 under Eclipse 3.6 SR1:
Error occured processing XML 'Could not instantiate bean class
[org.springframework.ide.eclipse.beans.core.internal.model.BeansConfig$ToolingFriendlyBeanDefinitionDocumentReader]:
Constructor threw exception; nested exception is
org.apache.commons.logging.LogConfigurationException: User-specified log class
'org.apache.commons.logging.impl.Log4JLogger' cannot be found or is not useable.'.
See Error Log for more details
I wiped my entire Eclipse environment and re-installed to see if that would fix it, but it didn't. I'm not sure if this is a classpath problem or something specifically related to the Spring IDE configuration.
Any help would be appreciated.
You're hitting a bug in Spring IDE. See the following JIRA for more details:
https://issuetracker.springsource.com/browse/STS-1691
We are going to publish a patch for this issue soon; check the JIRA for when it is available. Alternatively, install an upcoming nightly build.
Regards, Christian