ElasticSearch + Twitter river class not found - elasticsearch

I am trying to install the Twitter river plugin on Elasticsearch. I have done it before on my Mac without any issues, but this time I am trying to install it on a Linux server and I keep getting this class-not-found exception. The exception comes when I use the curl -XPUT command as shown in https://github.com/elasticsearch/elasticsearch-river-twitter/blob/master/README.md
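For reference, the river-creation request looks roughly like this (field names per the plugin README; keys elided):
curl -XPUT 'localhost:9200/_river/my_twitter_river2/_meta' -d '{
    "type" : "twitter",
    "twitter" : {
        "oauth" : {
            "consumer_key" : "...",
            "consumer_secret" : "...",
            "access_token" : "...",
            "access_token_secret" : "..."
        }
    }
}'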
Here's the exception I get:
[2013-11-25 15:12:00,116][WARN ][river ] [Fasaud] failed to create river [twitter][my_twitter_river2]
org.elasticsearch.common.settings.NoClassSettingsException: Failed to load class with value [twitter]
at org.elasticsearch.river.RiverModule.loadTypeModule(RiverModule.java:87)
at org.elasticsearch.river.RiverModule.spawnModules(RiverModule.java:58)
at org.elasticsearch.common.inject.ModulesBuilder.add(ModulesBuilder.java:44)
at org.elasticsearch.river.RiversService.createRiver(RiversService.java:135)
at org.elasticsearch.river.RiversService$ApplyRivers$2.onResponse(RiversService.java:271)
at org.elasticsearch.river.RiversService$ApplyRivers$2.onResponse(RiversService.java:265)
at org.elasticsearch.action.support.TransportAction$ThreadedActionListener$1.run(TransportAction.java:93)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.lang.ClassNotFoundException: twitter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.elasticsearch.river.RiverModule.loadTypeModule(RiverModule.java:73)
... 9 more
I have tried restarting and reinstalling many times and can't get it to work. I can also see that the jar files present in the plugin directory contain the twitter river classes, but for some reason they're not available to the running Elasticsearch instance.

This is a configuration and installation issue.
Steps:
1. Check where Elasticsearch is installed by running ps aux | grep elasticsearch (on Linux); that tells you which executable is running on the machine. Look for something like -Des.default.path.home=/usr/share/elasticsearch.
2. If several Elasticsearch instances are running, kill them all and start the one you care about.
3. Ensure that only one instance of Elasticsearch is running (if that is the intent) and go to the home path directory found in step 1.
4. Ensure that the plugin is installed in the plugins dir under that home path.
5. Restart Elasticsearch. (A hedged sketch of these checks appears below.)
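A hedged sketch of those checks (paths and the plugin version are examples; adjust to your box):
ps aux | grep elasticsearch          # note the -Des.default.path.home=... value
ES_HOME=/usr/share/elasticsearch     # whatever the line above reports
ls $ES_HOME/plugins                  # the river plugin must appear here
# if it is missing, install it into THIS instance, then restart:
$ES_HOME/bin/plugin -install elasticsearch/elasticsearch-river-twitter/1.5.0
sudo service elasticsearch restart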

Confirm that your classpath contains a text file es-plugin.properties with the content:
plugin=your.package.YourRiverPlugin
A missing property file produces almost the same error.
For more info, please refer to http://blog.trifork.com/2013/01/10/how-to-write-an-elasticsearch-river-plugin/
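A hedged way to verify this on disk (the jar path is an example): the river jar should carry es-plugin.properties at its root, naming the plugin class.
unzip -p /usr/share/elasticsearch/plugins/river-twitter/elasticsearch-river-twitter-*.jar es-plugin.properties
# for the twitter river, this should print something like:
# plugin=org.elasticsearch.plugin.river.twitter.TwitterRiverPlugin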

Related

Flink in ECS fails to find shaded ContainerCredentialsProvider

I'm trying to run Flink 1.7.2 on ECS with Fargate. I've set up the state backend for my job to be RocksDB with a path=s3://...
In my Dockerfile my base image is 1.7.2-hadoop27-scala_2.11, and I run the following 2 commands:
RUN echo "fs.s3a.aws.credentials.provider: org.apache.flink.fs.s3hadoop.shaded.com.amazonaws.auth.ContainerCredentialsProvider" >> "$FLINK_CONF_DIR/flink-conf.yaml"
RUN cp /opt/flink/opt/flink-s3-fs-hadoop-1.7.2.jar /opt/flink/lib/flink-s3-fs-hadoop-1.7.2.jar
Just like it says in
https://issues.apache.org/jira/browse/FLINK-8439
However I get the following exception:
Caused by: java.io.IOException: From option fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.flink.fs.s3hadoop.shaded.com.amazonaws.auth.ContainerCredentialsProvider not found
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.loadAWSProviderClasses(S3AUtils.java:592)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:556)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.DefaultS3ClientFactory.createS3Client(DefaultS3ClientFactory.java:52)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:256)
at org.apache.flink.fs.s3.common.AbstractS3FileSystemFactory.create(AbstractS3FileSystemFactory.java:125)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:395)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(FsCheckpointStorage.java:58)
at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(FsStateBackend.java:444)
at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createCheckpointStorage(RocksDBStateBackend.java:407)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:249)
... 17 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.flink.fs.s3hadoop.shaded.com.amazonaws.auth.ContainerCredentialsProvider not found
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2375)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.conf.Configuration.getClasses(Configuration.java:2446)
at org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.S3AUtils.loadAWSProviderClasses(S3AUtils.java:589)
... 28 more
Looking in flink-s3-fs-hadoop-1.7.2.jar, I see that the package of the class ContainerCredentialsProvider is actually org.apache.flink.fs.s3base.shaded.com.amazonaws.auth
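(A hedged way to confirm the shaded package name for yourself:)
unzip -l flink-s3-fs-hadoop-1.7.2.jar | grep ContainerCredentialsProvider
# lists .../s3base/shaded/com/amazonaws/auth/ContainerCredentialsProvider.class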
I already tried:
Adding the aws-sdk-core jar to the lib folder and setting the credentials provider to just com.amazonaws.auth.ContainerCredentialsProvider (without the shading), but I got the problem mentioned in the issue link above
Setting the credentials provider to org.apache.flink.fs.s3base.shaded.com.amazonaws.auth.ContainerCredentialsProvider, but then the code in S3FileSystemFactory.java prefixes it with org.apache.flink.fs.s3hadoop.shaded.
Any ideas here for finding the class?
The issue was resolved in one of the later versions.
I ran it on a Flink 1.9.0 cluster with the following line:
RUN echo "fs.s3a.aws.credentials.provider: com.amazonaws.auth.ContainerCredentialsProvider" >> "$FLINK_CONF_DIR/flink-conf.yaml"
and the class is found and works.
You can see in:
https://github.com/apache/flink/blob/master/flink-filesystems/flink-s3-fs-hadoop/src/main/java/org/apache/flink/fs/s3hadoop/S3FileSystemFactory.java
that the FLINK_SHADING_PREFIX is now correct

Unable to run basic example for hadoop

Hadoop version 2.9.0, Java - 1.8.0_162
When trying to run the example given here: https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html, under standalone operation, I get the following error:
$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.9.0.jar grep input output 'dfs[a-z.]+'
java.lang.NoSuchMethodError: org.apache.hadoop.util.ProgramDriver.run([Ljava/lang/String;)I
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
I am new to Hadoop and not sure how to fix this. I have set JAVA_HOME in hadoop-env.sh, and I am pretty sure that I am using compatible versions of Java and Hadoop.
Any help will be useful.
Nish,
The error message means that while the runtime was able to find the class ProgramDriver, the method run() is not present on it.
The most likely reason for this is that an older version of Hadoop, which exposed a different interface in ProgramDriver, is on your classpath. About a year ago this method was renamed from driver() to run().
The fix is to make sure you're running a recent version of Hadoop, with no older Hadoop jars leaking onto the classpath.
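One hedged way to verify which ProgramDriver the runtime picks up (the jar path is an example for a 2.9.0 layout):
bin/hadoop classpath | tr ':' '\n' | grep hadoop-common
javap -cp share/hadoop/common/hadoop-common-2.9.0.jar org.apache.hadoop.util.ProgramDriver | grep run
# a current ProgramDriver should show: public int run(java.lang.String[]) ...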
For reference, the same question was asked in the following links:
Error while executing hadoop-mapreduce-examples-2.2.0.jar
Can the hadoop programm which write under the hadoop-2.2.0 run in hadoop-1.2.1?

Spark 1.3.1: cannot read file from S3 bucket, org/jets3t/service/ServiceException

I'm on an AWS EC2 VM (Ubuntu 14.04), trying to do some basic work with Spark on RDDs built from my S3 files. This dirty command runs successfully (not using sparkContext.hadoopConfiguration for the moment):
scala> val distFile = sc.textFile("s3n://<AWS_ACCESS_KEY_ID>:<AWS_SECRET_ACCESS_KEY>@bucketname/folder1/folder2/file.csv")
but I then get the following error when running distFile.count():
java.lang.NoClassDefFoundError: org/jets3t/service/ServiceException
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:334)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:324)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
...
...
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.ServiceException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
I have previously:
defined an AWS IAM user with corresponding AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
added exports of both keys as env variables in .bashrc
built Spark 1.3.1 with SPARK_HADOOP_VERSION=2.6.0-cdh5.4.1 sbt/sbt assembly
installed and run Hadoop 2.6-cdh5.4.1 (pseudo-distributed)
Does it have to do with the syntax of textFile("s3n:// ...")? I've tried others, including s3://, without success ...
Thank you
Include the JetS3t jar in your classpath, with a version compatible with your current setup; the ServiceException class has to be resolvable at runtime.
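A hedged example of doing that (paths and versions are illustrative; match the JetS3t version to your Hadoop build):
spark-shell --jars /path/to/jets3t-0.9.0.jar
# or permanently, in conf/spark-defaults.conf:
# spark.driver.extraClassPath    /path/to/jets3t-0.9.0.jar
# spark.executor.extraClassPath  /path/to/jets3t-0.9.0.jar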
I had the same problem. Although mine occurred in a Spark v2.1.0 with Hadoop v2.7.2 environment, I'm leaving it here because the cause is likely the same. Here's what I got:
A needed class was not found. This could be due to an error in your runpath. Missing class: org/jets3t/service/ServiceException
java.lang.NoClassDefFoundError: org/jets3t/service/ServiceException
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:342)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:332)
at
...
...
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.ServiceException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
It was because the classpath had a lower version of the net.java.dev.jets3t:jets3t dependency than what org.apache.hadoop:hadoop-aws required.
I solved the problem after adding net.java.dev.jets3t:jets3t:0.9.0 to my build.sbt.
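For reference, that dependency line in build.sbt looks like this (version as above; match it to what your hadoop-aws build expects):
libraryDependencies += "net.java.dev.jets3t" % "jets3t" % "0.9.0"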
You need to include the hadoop-mapreduce-client jars in your CLASSPATH. In my case, I made my own distribution with these dependencies.
I put the following files in the lib folder (a hedged copy command follows the list):
hadoop-mapreduce-client-jobclient-2.6.0.jar
hadoop-mapreduce-client-hs-plugins-2.6.0.jar
hadoop-mapreduce-client-shuffle-2.6.0.jar
hadoop-mapreduce-client-jobclient-2.6.0-tests.jar
hadoop-mapreduce-client-common-2.6.0.jar
hadoop-mapreduce-client-app-2.6.0.jar
hadoop-mapreduce-client-hs-2.6.0.jar
hadoop-mapreduce-client-core-2.6.0.jar
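A hedged way to pull these in, assuming the jars come from a stock Hadoop 2.6.0 distribution and the lib folder is Spark's (both paths are examples):
cp $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-*.jar $SPARK_HOME/lib/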

Oracle Data Integrator (ODI - v11.1.1.3) "unable to load language: beanshell" Error

Following an install of Eclipse 3.7.2 on my Ubuntu 12.04 development machine, I have been unable to execute any ODI packages/interfaces/procedures. On execution (for both simulated and actual runs), an error is thrown (Java trace below). I am not sure whether it has anything to do with the Eclipse install, but it seems likely. Does anyone have an idea how to fix this?
Also, when launching ODI from the terminal using 'bash odi', the following error is displayed in the terminal:
2013-08-15 14:43:46.162 ERROR Error during RuntimeClassLoader initialization. ODI will start without RuntimeClassLoader
Error output:
oracle.odi.core.exception.OdiRuntimeException: Error during Code Interpretor creation
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.getInstance(SnpCodeInterpretor.java:209)
at com.sunopsis.dwg.codeinterpretor.SnpGeneratorSQLCIT.<init>(SnpGeneratorSQLCIT.java:300)
at com.sunopsis.graphical.dialog.SnpsDialogExecution.doPackageExecuter(SnpsDialogExecution.java:907)
at oracle.odi.ui.action.SnpsPopupActionExecuteHandler.actionPerformed(SnpsPopupActionExecuteHandler.java:68)
at oracle.odi.ui.SnpsActionControler.handleEvent(SnpsActionControler.java:75)
at oracle.ide.controller.IdeAction.performAction(IdeAction.java:529)
at oracle.ide.controller.IdeAction.actionPerformedImpl(IdeAction.java:884)
at oracle.ide.controller.IdeAction.actionPerformed(IdeAction.java:501)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
at javax.swing.AbstractButton.doClick(AbstractButton.java:357)
at javax.swing.plaf.basic.BasicMenuItemUI.doClick(BasicMenuItemUI.java:809)
at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(BasicMenuItemUI.java:850)
at java.awt.Component.processMouseEvent(Component.java:6297)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3275)
at java.awt.Component.processEvent(Component.java:6062)
at java.awt.Container.processEvent(Container.java:2039)
at java.awt.Component.dispatchEventImpl(Component.java:4660)
at java.awt.Container.dispatchEventImpl(Container.java:2097)
at java.awt.Component.dispatchEvent(Component.java:4488)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4575)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4236)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4166)
at java.awt.Container.dispatchEventImpl(Container.java:2083)
at java.awt.Window.dispatchEventImpl(Window.java:2489)
at java.awt.Component.dispatchEvent(Component.java:4488)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:674)
at java.awt.EventQueue.access$400(EventQueue.java:81)
at java.awt.EventQueue$2.run(EventQueue.java:633)
at java.awt.EventQueue$2.run(EventQueue.java:631)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:87)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:98)
at java.awt.EventQueue$3.run(EventQueue.java:647)
at java.awt.EventQueue$3.run(EventQueue.java:645)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.AccessControlContext$1.doIntersectionPrivilege(AccessControlContext.java:87)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:644)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
Caused by: org.apache.bsf.BSFException: unable to load language: beanshell
at org.apache.bsf.BSFManager.loadScriptingEngine(BSFManager.java:718)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.loadEngine(SnpCodeInterpretor.java:85)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.<init>(SnpCodeInterpretor.java:75)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.getInstance(SnpCodeInterpretor.java:184)
... 45 more
After digging around for about a day on this issue, I brazenly tried running ODI as the root user on the off chance that this was a permissions issue. I started ODI from the command line (using 'bash odi') for greater verbosity, and it loaded without the error mentioned above. Since root gets its own fresh settings, this gave me the impression that it wasn't a permissions issue, but one related to the user settings.
To rectify the issue, I removed my user's odi settings folder (renaming it, for safety):
mv ~/.odi ~/.backup_odi
Then I started ODI from the terminal under my own user (i.e. not root), and there were no errors! None of my connections were available in the new settings folder, though. I fixed this by closing ODI and entering the following:
cp ~/.backup_odi/oracledi/snps_login_work.xml ~/.odi/oracledi/
If anybody else encounters this issue, I hope you find this post quicker than it took me to fix it!
org.apache.bsf.BSFException: unable to load language: beanshell
This exception is thrown when bsh-2.0b4.jar, a dependency of bsf.jar, is not on the classpath.
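A hedged check (ODI's install layout varies, so the root path is an example):
find /opt/odi -name 'bsh-*.jar'   # should turn up bsh-2.0b4.jar
find /opt/odi -name 'bsf*.jar'    # bsh must sit on the same classpath as bsf.jar
# if the BeanShell jar is missing, drop bsh-2.0b4.jar into the same lib directory as bsf.jar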

Shell+java error

I want to run either editor.sh or snippetCollector.sh under modules/editor/scripts in Sentrick, which I got via git clone git://sentrick.git.sourceforge.net/gitroot/sentrick/sentrick. That part is not a problem. I read the documentation under the doc folder; it says to run either editor.sh or snippetCollector.sh. I run ./editor.sh and it doesn't work. I saw online that you can also use sh editor.sh, which doesn't work either. It says:
Exception in thread "main" java.lang.NoClassDefFoundError: de/denkselbst/sentrick/sbeditor/SbEditor
Caused by: java.lang.ClassNotFoundException: de.denkselbst.sentrick.sbeditor.SbEditor
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: de.denkselbst.sentrick.sbeditor.SbEditor. Program will exit.
I have not edited anything, so I think it should work, but it doesn't!
Inside editor.sh (the one I want to run but can't) it says only this:
java -cp #CP.UNIX# de.denkselbst.sentrick.sbeditor.SbEditor
The problem with snippetCollector.sh is the same.
-cp configures the CLASSPATH, which should contain a reference to the directory (or jars) containing your classes. Given the literal #CP.UNIX# placeholder still in the script, I suspect it has not been set correctly.
It seems that some class files are missing when you run your code. Please check whether you followed all the steps mentioned in your doc, make sure to set the CLASSPATH, and try to rerun the program.
It is clear that you have a classpath issue. Normally, when I run an application in a Linux environment, my shell script looks like the one below.
#!/bin/sh
# Point THE_CLASSPATH at wherever the Sentrick jars/classes actually live
# (the path below is an example), then launch the editor's main class.
THE_CLASSPATH=/home/pathto/lib
java -cp ${THE_CLASSPATH}/required.jar de.denkselbst.sentrick.sbeditor.SbEditor
