I have just begun studying Apache Spark. The first thing I did was try to install Spark on my machine. I downloaded the pre-built Spark 1.5.2 with Hadoop 2.6. When I ran spark-shell, I got the following errors:
java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
I searched for this error and learned that I had to download winutils.exe, which I did. I set HADOOP_HOME = "C:\Hadoop" and then ran the command
C:\Hadoop\bin\winutils.exe chmod 777 /tmp/hive
but I got the following error:
This version of C:\Hadoop\bin\winutils.exe is not compatible with the version of
Windows you're running. Check your computer's system information to see whether
you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact
the software publisher.
I tried to find a 32-bit version of winutils.exe but couldn't get one. Please help me with this installation.
Thank you in advance.
The following links may be helpful.
https://issues.apache.org/jira/browse/HADOOP-9922
https://issues.apache.org/jira/browse/HADOOP-11784
Not able to find winutils.exe for hadoop 2.6.0 for 32 bit windows
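Once you do get hold of a winutils.exe built for your Windows architecture, the usual setup is along these lines (a sketch, assuming the binary ends up in C:\Hadoop\bin; the /tmp/hive path resolves against the drive you launch from):
setx HADOOP_HOME "C:\Hadoop"
C:\Hadoop\bin\winutils.exe chmod 777 \tmp\hive
Note that setx only takes effect in newly opened consoles, so reopen the command prompt before launching spark-shell again.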
has been compiled by a more recent version of the Java Runtime (class file version 56.0), this version of the Java Runtime only recognizes class file versions up to 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:93)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:46)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Does New Relic have a different version of the jar for OpenJDK 12?
I am running New Relic version 5.10.0.
The Java agent for New Relic supports OpenJDK and AdoptOpenJDK JVM versions 7 to 15 for Linux, Windows, and OS X:
Requirements to install the agent
What happens when you restart the application? Does the error prevent startup, or does it recover, continue, and begin reporting?
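For context, class file version 56.0 corresponds to Java 12 and version 52.0 to Java 8, so some class on the classpath was compiled for a newer release than the JVM that is actually running. If it is unclear which jar is the culprit, a small sketch like the following can read a class file's version directly (the file name is hypothetical):
import struct

# Read the class-file header: 4-byte magic, then minor and major versions.
# Major 52 = Java 8, 53 = Java 9, ..., 56 = Java 12.
def class_file_version(path):
    with open(path, "rb") as f:
        magic, minor, major = struct.unpack(">IHH", f.read(8))
    if magic != 0xCAFEBABE:
        raise ValueError("not a class file")
    return major, minor

print(class_file_version("Example.class"))  # e.g. (56, 0) for a JDK 12 build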
I've noticed that HDFS delete commands will randomly fail. For example, in a MapReduce job I delete a directory at startup. Occasionally the delete fails with the error below, but it succeeds on the second try.
org.apache.hadoop.ipc.RemoteException(java.lang.ArrayIndexOutOfBoundsException): java.lang.ArrayIndexOutOfBoundsException
at org.apache.hadoop.ipc.Client.call(Client.java:1364)
at org.apache.hadoop.ipc.Client.call(Client.java:1411)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy14.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:513)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy15.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1862)
at org.apache.hadoop.hdfs.DistributedFileSystem$11.doCall(DistributedFileSystem.java:599)
at org.apache.hadoop.hdfs.DistributedFileSystem$11.doCall(DistributedFileSystem.java:595)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Digging into the datanode logs I see this exception:
RemoteException in offerService
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException): java.lang.NullPointerException
at org.apache.hadoop.ipc.Client.call(Client.java:1411)
at org.apache.hadoop.ipc.Client.call(Client.java:1364)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy17.blockReport(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.blockReport(DatanodeProtocolClientSideTranslatorPB.java:175)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.blockReport(BPServiceActor.java:493)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:716)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:851)
at java.lang.Thread.run(Thread.java:745)
I can't find any help on that, and I'm not sure where else to look. Does anyone know what that issue is?
I'm running Ubuntu 14.04.1 LTS and hadoop version prints out:
Hadoop 2.5.0-cdh5.3.1
Subversion http://github.com/cloudera/hadoop -r 4cda8416c73034b59cc8baafbe3666b074472846
Compiled by jenkins on 2015-01-28T00:41Z
Compiled with protoc 2.5.0
From source with checksum 6a018149a764de4b8992755df9a2a1b
This command was run using /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/jars/hadoop-common-2.5.0-cdh5.3.1.jar
Thanks for your help!
I haven't tried it yet, but Cloudera suggested upgrading to 5.3.3:
http://community.cloudera.com/t5/Storage-Random-Access-HDFS/HDFS-delete-command-results-in-ArrayIndexOutOfBoundsException/m-p/31817#U31817
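Until then, a plain retry around the delete has papered over the failures for me, since the second attempt reliably succeeds. A rough sketch (the path and the shell invocation are illustrative, not taken from the actual job):
import subprocess
import time

# Retry an HDFS recursive delete a few times before giving up.
def delete_with_retry(path, attempts=3, delay_seconds=5):
    for _ in range(attempts):
        result = subprocess.run(["hadoop", "fs", "-rm", "-r", "-skipTrash", path])
        if result.returncode == 0:
            return True
        time.sleep(delay_seconds)  # brief pause before retrying
    return False

delete_with_retry("/user/example/output")  # hypothetical directory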
I am trying to use Spark along with Hadoop on my Windows 8 machine. However, no matter what my code is, I receive this error:
15/08/25 19:29:58 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:271)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:248)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:763)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:748)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:621)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2162)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:301)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Unknown Source)
As you can see:
null\bin\winutils.exe
The Hadoop home path is null. I tried to set HADOOP_HOME as an environment variable, but that did not resolve the issue.
I managed to resolve this problem by using the following bit of code at the beginning:
import os
import sys

# Point Hadoop at the local installation that contains bin\winutils.exe,
# and also append the Hadoop bin directory to the Python path.
os.environ['HADOOP_HOME'] = "C:/Mine/Spark/hadoop-2.6.0"
sys.path.append("C:/Mine/Spark/hadoop-2.6.0/bin")
Hope this helps someone; if anyone has a better idea, I would definitely appreciate it.
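One thing worth checking is that HADOOP_HOME is visible before the SparkContext (and its JVM) is created, since Hadoop's Shell class resolves winutils.exe from that variable at startup. A quick sanity check along these lines (paths as above):
import os

# Confirm the variable is set and actually points at a bin\winutils.exe.
hadoop_home = os.environ.get("HADOOP_HOME")
print(hadoop_home)
if hadoop_home:
    print(os.path.exists(os.path.join(hadoop_home, "bin", "winutils.exe")))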
I am trying to run Oracle's Object Type Translator (OTT) utility and am getting the error below:
Exception in thread "main" java.lang.UnsatisfiedLinkError: C:\oraclexe\app\oracle\product\11.2.0\server\bin\ocijdbc11.dll: Can't load AMD 64-bit .dll on a IA 32-bit platform
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(Unknown Source)
at java.lang.ClassLoader.loadLibrary(Unknown Source)
at java.lang.Runtime.loadLibrary0(Unknown Source)
at java.lang.System.loadLibrary(Unknown Source)
at oracle.jdbc.driver.T2CConnection$1.run(T2CConnection.java:3516)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.jdbc.driver.T2CConnection.loadNativeLibrary(T2CConnection.java:3512)
at oracle.jdbc.driver.T2CConnection.logon(T2CConnection.java:266)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:536)
at oracle.jdbc.driver.T2CConnection.<init>(T2CConnection.java:162)
at oracle.jdbc.driver.T2CDriverExtension.getConnection(T2CDriverExtension.java:53)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:521)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at oracle.ott.Konnection.getTheConnection(Konnection.java:102)
at oracle.ott.Konnection.<init>(Konnection.java:39)
at oracle.ott.Doit.main(Doit.java:107)
at oracle.ott.c.CMain.main(CMain.java:9)
It was working fine a few months ago, and I have no idea why it is giving this error now.
I finally figured out that there was some issue with the old setup of instant client 11_2.0, so I installed the new instant client 11_2.2, set all the required environment variables, and it now works perfectly fine.
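If the error ever comes back, one quick check is the bitness of the java on the PATH, since the message means a 64-bit ocijdbc11.dll was handed to a 32-bit JVM. A small sketch (assumes java is on the PATH; java -version writes its banner to stderr):
import subprocess

# Print the JVM banner; a 64-bit HotSpot JVM includes "64-Bit" in it.
banner = subprocess.run(["java", "-version"], capture_output=True, text=True).stderr
print(banner)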
I've been struggling to debug an install4j installer where I'm trying to introduce a complicated condition expression that is failing for some reason.
However, when I try to use the debug_installer.sh script, I get the following error:
java.io.FileNotFoundException: /Applications/install4j/resource/MessagesDefault (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:120)
at com.install4j.runtime.util.FileResourceBundle.<init>(Unknown Source)
at com.install4j.runtime.installer.frontend.Messages.createMessagesInternal(Unknown Source)
at com.install4j.runtime.installer.frontend.Messages.createMessages(Unknown Source)
at com.install4j.runtime.installer.frontend.Messages.getMessages(Unknown Source)
at com.install4j.runtime.installer.frontend.GUIHelper.showMessageInternal(Unknown Source)
at com.install4j.runtime.installer.frontend.GUIHelper.access$100(Unknown Source)
at com.install4j.runtime.installer.frontend.GUIHelper$2.run(Unknown Source)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:199)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:682)
at java.awt.EventQueue.access$000(EventQueue.java:85)
at java.awt.EventQueue$1.run(EventQueue.java:643)
at java.awt.EventQueue$1.run(EventQueue.java:641)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:87)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:652)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:296)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:211)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:196)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:188)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
The file actually doesn't exist, but I have no idea what it should contain. My install4j version is 4.2.8.
In the debug installer start script, replace
-cp i4jruntime.jar:user.jar:user/*.jar
with
-cp 'i4jruntime.jar:user.jar:user/*'
Then it should work. Quoting the entry keeps the shell from expanding the glob and lets the JVM expand the classpath wildcard itself (the JVM understands a bare user/*, but not user/*.jar). This bug was fixed in 5.0.1.