Unable to find partitioner class - Cassandra - Hadoop

Can someone help me fix the issue below, which I am facing with Cassandra when I run my application on Hadoop?
When I run the application, I get the following error related to the partitioner class specified in the application.
Caused by: java.lang.RuntimeException: org.apache.cassandra.exceptions.ConfigurationException: Unable to find partitioner class 'org.apache.cassandra.dht.RandomPartitioner'
at org.apache.cassandra.hadoop.ConfigHelper.getInputPartitioner(ConfigHelper.java:426)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.validateConfiguration(AbstractColumnFamilyInputFormat.java:85)
at org.apache.cassandra.hadoop.ColumnFamilyInputFormat.validateConfiguration(ColumnFamilyInputFormat.java:74)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSplits(AbstractColumnFamilyInputFormat.java:122)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
at com.test.cassandratest.WcJob.run(WcJob.java:96)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at com.test.cassandratest.WcJob.main(WcJob.java:104)
... 10 more
Caused by: org.apache.cassandra.exceptions.ConfigurationException: Unable to find partitioner class 'org.apache.cassandra.dht.RandomPartitioner'
at org.apache.cassandra.utils.FBUtilities.classForName(FBUtilities.java:458)
at org.apache.cassandra.utils.FBUtilities.construct(FBUtilities.java:470)
at org.apache.cassandra.utils.FBUtilities.newPartitioner(FBUtilities.java:416)
at org.apache.cassandra.hadoop.ConfigHelper.getInputPartitioner(ConfigHelper.java:422)
... 26 more
Caused by: java.lang.NoClassDefFoundError: org/github/jamm/MemoryMeter$Guess
at org.apache.cassandra.utils.ObjectSizes.<clinit>(ObjectSizes.java:34)
at org.apache.cassandra.dht.RandomPartitioner.<clinit>(RandomPartitioner.java:45)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at org.apache.cassandra.utils.FBUtilities.classForName(FBUtilities.java:450)
... 29 more
Caused by: java.lang.ClassNotFoundException: org.github.jamm.MemoryMeter$Guess
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 34 more

I met the same problem when we upgraded Cassandra to 2.1 in our system, and the root cause is as follows.
Cassandra 2.1 uses jamm 0.3.0, while older Cassandra releases used 0.2.5. So please update the jamm version you use, and your problem should be fixed.
http://mvnrepository.com/artifact/com.github.jbellis/jamm/0.3.0
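If you assemble the Hadoop job classpath by hand rather than through a build tool, the fix amounts to putting the newer jamm jar ahead of the old one. A minimal sketch, assuming the jar is fetched from the artifact linked above; the local paths and the job jar name are placeholders:
# Sketch only: get jamm 0.3.0 and make sure it wins over any 0.2.5 jar on the job classpath.
wget https://repo1.maven.org/maven2/com/github/jbellis/jamm/0.3.0/jamm-0.3.0.jar
export HADOOP_CLASSPATH=/path/to/jamm-0.3.0.jar:$HADOOP_CLASSPATH
# Ship it to the task JVMs as well; the job uses ToolRunner, so -libjars should be honored.
hadoop jar cassandratest-job.jar com.test.cassandratest.WcJob -libjars /path/to/jamm-0.3.0.jar <other args>
If you build with Maven, depending explicitly on the com.github.jbellis:jamm:0.3.0 artifact linked above should have the same effect.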

Related

Exception with Graphs Generator

JMeter version - 2.13
I have the following plugins in my JMeter installation -
I wanted to generate graphs for a previously run test -
But I end up with the following exception -
2015/12/29 13:45:15 ERROR - jmeter.JMeter: Uncaught exception: java.lang.NoClassDefFoundError: kg/apc/cmd/UniversalRunner
at kg.apc.jmeter.PluginsCMDWorker.<init>(PluginsCMDWorker.java:52)
at kg.apc.jmeter.listener.GraphsGeneratorListener.testEnded(GraphsGeneratorListener.java:146)
at kg.apc.jmeter.listener.GraphsGeneratorListener.testEnded(GraphsGeneratorListener.java:137)
at org.apache.jmeter.engine.StandardJMeterEngine.notifyTestListenersOfEnd(StandardJMeterEngine.java:226)
at org.apache.jmeter.engine.StandardJMeterEngine.run(StandardJMeterEngine.java:448)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: kg.apc.cmd.UniversalRunner
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
Did I misconfigure the plugin?
You need to add CMDRunner-1.3.1.jar to the lib/ext folder:
http://jmeter-plugins.org/wiki/JMeterPluginsCMD/
It is included in the Standard bundle.
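For example, assuming JMeter lives under $JMETER_HOME and the Standard bundle has been downloaded and unpacked somewhere (both paths below are placeholders), it is just a copy plus a restart:
# Copy the CMDRunner jar from wherever the Standard bundle was unpacked into JMeter's lib/ext.
cp /path/to/CMDRunner-1.3.1.jar $JMETER_HOME/lib/ext/
# Restart JMeter so the jar is picked up by the plugin.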

Error when using ELKI's data generator

Maybe it is because of my limited skills with the terminal, but I don't understand why I get an exception here.
I have extracted the folders, and I think the paths are right.
db#computer:~/Desktop/elki-0.7.0~20150828/elki$ java -cp elki-0.7.0~20150828.jar de.lmu.ifi.dbs.elki.application.GeneratorXMLSpec -app.out 0.txt -bymodel.spec 1.xml
Exception in thread "main" java.lang.NoClassDefFoundError: gnu/trove/list/TIntList
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2615)
at java.lang.Class.getMethod0(Class.java:2856)
at java.lang.Class.getMethod(Class.java:1668)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: gnu.trove.list.TIntList
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 6 more
db#computer:~/Desktop/elki-0.7.0~20150828/elki$
Your Java classpath is incomplete.
Specifically, it does not include the required Trove library.
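For example, if the bundled dependency jars (including Trove) sit in a lib/ directory next to the ELKI jar (the lib/ path below is a guess; point it at wherever the dependencies were actually extracted), adding them to -cp is enough:
# The quoted lib/* wildcard lets Java pull in every bundled dependency jar, including Trove.
java -cp "elki-0.7.0~20150828.jar:lib/*" de.lmu.ifi.dbs.elki.application.GeneratorXMLSpec -app.out 0.txt -bymodel.spec 1.xml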

ClassNotFoundException in the Hazelcast plugin in Openfire in a cluster environment

I have installed Openfire on 3 EC2 instances and added the Hazelcast plugin on all three. All three nodes work fine, but sometimes I get this exception in the stderr log. Please help me identify why this exception occurs and how I can avoid these error logs.
Apr 15, 2015 12:16:18 PM com.hazelcast.spi.OperationService
SEVERE: [172.31.22.121]:5701 [openfire] java.lang.ClassNotFoundException: com.jivesoftware.util.cache.ClusteredCacheFactory$CallableTask
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.jivesoftware.util.cache.ClusteredCacheFactory$CallableTask
at com.hazelcast.nio.serialization.DefaultSerializers$ObjectSerializer.read(DefaultSerializers.java:190)
at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:40)
at com.hazelcast.nio.serialization.SerializationServiceImpl.readObject(SerializationServiceImpl.java:276)
at com.hazelcast.nio.serialization.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:431)
at com.hazelcast.executor.BaseCallableTaskOperation.readInternal(BaseCallableTaskOperation.java:91)
at com.hazelcast.spi.Operation.readData(Operation.java:295)
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:105)
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:36)
at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:59)
at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:218)
at com.hazelcast.spi.impl.NodeEngineImpl.toObject(NodeEngineImpl.java:156)
at com.hazelcast.spi.impl.OperationServiceImpl$RemoteOperationProcessor.run(OperationServiceImpl.java:724)
at com.hazelcast.util.executor.ManagedExecutorService$Worker.run(ManagedExecutorService.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
at com.hazelcast.util.executor.PoolExecutorThreadFactory$ManagedThread.run(PoolExecutorThreadFactory.java:59)
Caused by: java.lang.ClassNotFoundException: com.jivesoftware.util.cache.ClusteredCacheFactory$CallableTask
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at com.hazelcast.nio.ClassLoaderUtil.loadClass(ClassLoaderUtil.java:83)
at com.hazelcast.nio.IOUtil$1.resolveClass(IOUtil.java:77)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at com.hazelcast.nio.serialization.DefaultSerializers$ObjectSerializer.read(DefaultSerializers.java:185)
... 16 more
It seems you have to deploy the relevant Openfire jar inside your Hazelcast cluster. Hazelcast does not offer remote classloading, for security reasons (otherwise anybody could inject anything).
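A rough sketch of what that deployment could look like, assuming standalone Hazelcast members and that the class in the trace ships in the Openfire Hazelcast plugin jar; every path and jar name below is a placeholder:
# On every Hazelcast member: make the jar that contains
# com.jivesoftware.util.cache.ClusteredCacheFactory available on the member's classpath,
# then restart the member so the task can be deserialized.
cp /path/to/openfire-hazelcast-plugin.jar /opt/hazelcast/lib/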

Unable to load Hive-JDBC driver when accessed through MapReduce program on Amazon's Elastic MapReduce

I have written a MapReduce program in which I store part of the output data in a Hive table.
I used the Hive-JDBC driver to access the Hive table from the MapReduce code.
The program compiled successfully on my local machine.
After that, I created a JAR file and uploaded it to S3. Then I created an Elastic MapReduce cluster and started it.
However, it results in the errors below:
java.lang.Throwable: Child Error
at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
attempt_201407161054_0001_m_000001_0: java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
attempt_201407161054_0001_m_000001_0: at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
attempt_201407161054_0001_m_000001_0: at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
attempt_201407161054_0001_m_000001_0: at java.security.AccessController.doPrivileged(Native Method)
attempt_201407161054_0001_m_000001_0: at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
attempt_201407161054_0001_m_000001_0: at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
attempt_201407161054_0001_m_000001_0: at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
attempt_201407161054_0001_m_000001_0: at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
attempt_201407161054_0001_m_000001_0: at java.lang.Class.forName0(Native Method)
attempt_201407161054_0001_m_000001_0: at java.lang.Class.forName(Class.java:190)
attempt_201407161054_0001_m_000001_0: at HubAndAuthority.InputHubMapper.configure(InputHubMapper.java:38)
attempt_201407161054_0001_m_000001_0: at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
attempt_201407161054_0001_m_000001_0: at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
attempt_201407161054_0001_m_000001_0: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
attempt_201407161054_0001_m_000001_0: at java.lang.reflect.Method.invoke(Method.java:606)
It appears to be an issue of a missing Hive-JDBC driver, and it should be resolved by adding the Hive-JDBC driver to the classpath. However, I am not aware of the exact steps to do this on Amazon's EMR.
Could you please let me know what is missing on my end and how to resolve it?
Thanks and Regards,
Prafulla
I'm not entirely sure, but you should try this:
"Note
If you want your custom classpath to override the original class path, you should set the environment variable, HADOOP_USER_CLASSPATH_FIRST to true so that the HADOOP_CLASSPATH value specified in hadoop-user-env.sh is first."
http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-hadoop-config.html
http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-hadoop-config_hadoop-user-env.sh.html
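A minimal sketch of hadoop-user-env.sh along the lines of that note, assuming the Hive JDBC driver jar (and its dependencies) have already been copied onto the nodes, e.g. via a bootstrap action; the jar location is a placeholder:
# hadoop-user-env.sh (sketch): put the Hive JDBC driver on the classpath and let it take precedence.
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH=/home/hadoop/lib/hive-jdbc.jar:$HADOOP_CLASSPATH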
Regards,
revet

Error creating index on Elasticsearch

I am using elasticsearch: stable 1.2.1, HEAD, installed with 'brew'.
I am able to start it without any problems.
However, when I create an index, I get this exception:
[2014-07-11 13:40:33,300][DEBUG][action.admin.indices.create] [N'astirh] [x_application_item_development] failed to create
org.elasticsearch.indices.IndexCreationException: [x_application_item_development] failed to create index
at org.elasticsearch.indices.InternalIndicesService.createIndex(InternalIndicesService.java:302)
at org.elasticsearch.cluster.metadata.MetaDataCreateIndexService$2.execute(MetaDataCreateIndexService.java:343)
at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:309)
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:134)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: org/elasticsearch/ElasticSearchIllegalArgumentException
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2532)
at java.lang.Class.getDeclaredConstructors(Class.java:1901)
at org.elasticsearch.common.inject.assistedinject.FactoryProvider.createMethodMapping(FactoryProvider.java:214)
at org.elasticsearch.common.inject.assistedinject.FactoryProvider.newFactory(FactoryProvider.java:151)
at org.elasticsearch.common.inject.assistedinject.FactoryProvider.newFactory(FactoryProvider.java:146)
at org.elasticsearch.index.analysis.AnalysisModule.configure(AnalysisModule.java:274)
at org.elasticsearch.common.inject.AbstractModule.configure(AbstractModule.java:60)
at org.elasticsearch.common.inject.spi.Elements$RecordingBinder.install(Elements.java:204)
at org.elasticsearch.common.inject.spi.Elements.getElements(Elements.java:85)
at org.elasticsearch.common.inject.InjectorShell$Builder.build(InjectorShell.java:130)
at org.elasticsearch.common.inject.InjectorBuilder.build(InjectorBuilder.java:99)
at org.elasticsearch.common.inject.InjectorImpl.createChildInjector(InjectorImpl.java:131)
at org.elasticsearch.common.inject.ModulesBuilder.createChildInjector(ModulesBuilder.java:69)
at org.elasticsearch.indices.InternalIndicesService.createIndex(InternalIndicesService.java:298)
... 6 more
Caused by: java.lang.ClassNotFoundException: org.elasticsearch.ElasticSearchIllegalArgumentException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 21 more
This is the class path:
:/usr/local/Cellar/elasticsearch/1.2.1/libexec/elasticsearch-1.2.1.jar:/usr/local/Cellar/elasticsearch/1.2.1/libexec/*:/usr/local/Cellar/elasticsearch/1.2.1/libexec/sigar/*
I downloaded the latest stable release (1.2.2, so there is a version difference) from the Elasticsearch site and started it manually. The classpath contains the same number of items (only the path prefix is different):
:/Users/boti/Downloads/elasticsearch-1.2.2/lib/elasticsearch-1.2.2.jar:/Users/boti/Downloads/elasticsearch-1.2.2/lib/*:/Users/boti/Downloads/elasticsearch-1.2.2/lib/sigar/*
In the manually installed version everything works.
Is this a brew recipe problem?
Sounds like a brew recipe problem.
The error you're getting about a missing class means there's something fundamentally wrong with the way the application is started, or files are actually missing.
Either way, it's a problem with the startup script brew is using or with the files brew downloaded for you.
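One way to tell those two cases apart, using the brew paths from the classpath above, is to check that the jar brew installed is actually present and contains the class from the stack trace:
# If the jar is missing or the class is not in it, the brew-installed files are the problem;
# if it is present and complete, look at the startup script/classpath brew generated instead.
ls -l /usr/local/Cellar/elasticsearch/1.2.1/libexec/elasticsearch-1.2.1.jar
unzip -l /usr/local/Cellar/elasticsearch/1.2.1/libexec/elasticsearch-1.2.1.jar | grep IllegalArgumentException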
