I'm trying to create a custom filter on HBase 0.98.1 in standalone mode on Ubuntu 14.04.
I created a class extending FilterBase and put the jar in HBASE_HOME/lib. Looking in the logs, I can see that my jar is on the classpath.
I have a Java client that first does a get with a ColumnPrefixFilter, then a get with my custom filter. The ColumnPrefixFilter works perfectly fine. With my custom filter, nothing happens: the client freezes for 10 minutes and then closes the connection.
I don't see anything in the logs.
Could you please give me some hints on what and where to check?
Regards,
EDIT:
It turned out to be a protoc version conflict. I had generated the Java classes from the proto file with protoc 2.4.0, while my filter was using protobuf-java 2.5.0.
I aligned both on 2.5.0 and it's now working fine.
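For anyone hitting the same symptom, here is a minimal sketch of the serialization hooks a custom filter needs in HBase 0.98. The class and its prefix logic are hypothetical; the point is that toByteArray()/parseFrom() cross the wire to the region server, so if you serialize through protoc-generated classes, the protoc version used to generate them must match the protobuf-java jar HBase loads (2.5.0 for 0.98).

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.exceptions.DeserializationException;
import org.apache.hadoop.hbase.filter.FilterBase;
import org.apache.hadoop.hbase.util.Bytes;

// Hypothetical filter: keeps only cells whose value starts with a given prefix.
public class ValuePrefixFilter extends FilterBase {

    private final byte[] prefix;

    public ValuePrefixFilter(byte[] prefix) {
        this.prefix = prefix;
    }

    @Override
    public ReturnCode filterKeyValue(Cell cell) {
        // Runs server-side for every cell of the scanned rows.
        byte[] value = CellUtil.cloneValue(cell);
        return Bytes.startsWith(value, prefix) ? ReturnCode.INCLUDE : ReturnCode.SKIP;
    }

    // Serializes the filter on the client before it is shipped to the server.
    @Override
    public byte[] toByteArray() {
        return prefix;
    }

    // Rebuilds the filter on the region server; HBase looks this method up
    // by reflection, so it must be static and take a byte[].
    public static ValuePrefixFilter parseFrom(byte[] bytes) throws DeserializationException {
        return new ValuePrefixFilter(bytes);
    }
}

A silent hang like the one above is what a server-side deserialization failure can look like from the client, so the region server log (not just the master log) is also worth checking.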
I created a custom ListenTCP processor by creating new socket handlers and referencing them in the custom ListenTCP. I was able to deploy it on my Mac (NiFi 1.11.4) and test it with a sample file that has a different incoming delimiter; it works great there.
However, my org is using Cloudera Flow Management (CFM) 2.0.4.0, which ships NiFi 1.11.4.2.0.4.0-80 (tagged nifi-1.11.4-RC1).
So I changed the version accordingly on my Mac and deployed the NAR file into our Cloudera cluster, but it fails with a ClientAuth class-not-found error in SSLContextService (version 1.11.4.2.0.4.0-80).
Built against 1.11.4, it works fine on my Mac.
Modified to 1.11.4.2.0.4.0-80, it fails with $ClientAuth not found.
I looked at the source code: the nested SSLContextService.ClientAuth enum was deprecated and is apparently missing from the CFM jar. Maybe declaring the enum in your custom code solves your problem:
// Stand-in for the nested SSLContextService.ClientAuth enum
// that the CFM build no longer exposes.
public enum ClientAuth {
    WANT,
    REQUIRED,
    NONE
}
I'm using the Apache Camel plugin for Grails, consuming an FTP endpoint, and I wish to process files by modified date. This is not working as expected using the "...&sortBy=file:modified" URL parameter: it ignores the date and sorts by filename. I've tried several variants such as "reverse:file:modified" and "date:file:yyyyMMddmmssSSS". The platform is Grails 2.3.5 running on Linux.
TIA,
Eric
"sortBy=file:modified;file:name" works fine if you do not use "maxMessagesPerPoll=1". ;)
Thanks.
If you want the oldest modified file first, use sortBy=file:modified.
If you want the most recently modified file first, use sortBy=reverse:file:modified.
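For reference, a minimal route sketch in the Java DSL (host, credentials, and directory are placeholders) that consumes oldest-first by modification time, with the filename as a tie-breaker:

import org.apache.camel.builder.RouteBuilder;

public class FtpByModifiedRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Oldest file first; files with identical timestamps are ordered by name.
        from("ftp://user@ftp.example.com/inbox"
                + "?password=secret"
                + "&sortBy=file:modified;file:name")
            .to("log:ftp-consumer?showHeaders=true");
    }
}

If you do need to throttle the batch size, the file/FTP component also has an eagerMaxMessagesPerPoll option; setting it to false should make Camel sort the whole listing before applying the maxMessagesPerPoll limit, but check the FTP component docs for your Camel version.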
I'm trying to run a simple Hadoop job, but Hadoop is throwing a NoClassDefFoundError for "org/w3c/dom/Document".
I'm trying to run the basic examples from the "Mahout In Action" book (https://github.com/tdunning/MiA).
I do this using nearly the same Maven setup, but tooled for Cassandra use rather than a file data model.
But when I try to run the *-job.jar, it throws a NoClassDefFoundError from the DataStax/Hadoop end.
I'm using version 1.0.5-dse of the driver, as that's the only one that supports the current DSE version of Cassandra (1.2.1), if that helps at all, though the issue seems to be deeper.
Attached is a gist with more info, including the Maven file, this brief overview, and the console output.
https://gist.github.com/zmarcantel/8d56ae4378247bc39be4
Thanks
Try dropping a jar that contains org.w3c.dom.Document into the $DSE/resources/hadoop/lib/ folder as a workaround.
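Before shuffling jars around, it can help to check where (or whether) the class is actually being loaded from. A quick probe, run with the same classpath as the job (the class name comes straight from the stack trace):

// Prints which jar provides org.w3c.dom.Document on the current classpath.
// A null code source means it came from the JRE's bootstrap classpath, in
// which case the NoClassDefFoundError is more likely caused by a conflicting
// jar (e.g. an old xml-apis) shadowing it than by the class being absent.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        Class<?> clazz = Class.forName("org.w3c.dom.Document");
        java.security.CodeSource source =
                clazz.getProtectionDomain().getCodeSource();
        System.out.println(source == null
                ? "bootstrap classpath (JRE)"
                : source.getLocation().toString());
    }
}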
I'm writing my first Avro job, which is meant to take an Avro file and output text. I tried to reverse engineer it from this example:
https://gist.github.com/chriswhite199/6755242
I am getting the error below though.
Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
I looked around and found it was likely an issue with which jar files are being used. I'm running CDH4 with MR1 and am using the jar files below:
avro-tools-1.7.5.jar
hadoop-core-2.0.0-mr1-cdh4.4.0.jar
hadoop-mapreduce-client-core-2.0.2-alpha.jar
I can't post code for security reasons, but it shouldn't need anything not used in the example code. I don't have Maven set up yet either, so I can't go that route. Is there something else I can try to get around this?
Try using Avro 1.7.3; this is the AVRO-1170 bug.
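The underlying incompatibility is easy to confirm: TaskAttemptContext is a class in Hadoop 1 but an interface in Hadoop 2, so a jar compiled against one flavor fails at runtime on the other with exactly this error. A small probe, run against the same classpath as the job, shows which flavor wins:

// Reports whether the classpath carries the Hadoop 1 flavor
// (TaskAttemptContext is a class) or the Hadoop 2 flavor (an interface).
// "Found interface ... but class was expected" means code compiled against
// Hadoop 1 is running against Hadoop 2, or vice versa.
public class HadoopFlavor {
    public static void main(String[] args) throws Exception {
        Class<?> ctx =
                Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext");
        System.out.println(ctx.isInterface()
                ? "Hadoop 2 style (interface)"
                : "Hadoop 1 style (class)");
    }
}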
I'm new to Solr, and am having trouble getting my setup to work. I'm using Solr 3.5.0 running on Tomcat 7.0.23, on Windows 7 Professional. If I copy the single-core example into my Solr home, it doesn't work: I get 404 errors from Tomcat for both http://foo/solr/admin/ and http://foo/solr/collection1/admin/. I've tried to convert the multicore example (which works, probably because its solrconfig.xml is a lot simpler) to use a single core by deleting the additional core folder and changing solr.xml to this:
<solr persistent="false">
  <cores adminPath="/admin/cores" defaultCoreName="core0">
    <core name="core0" instanceDir="core0" />
  </cores>
</solr>
As I understand it, this should mean I can access core0 using either http://localhost/solr/admin/ or http://localhost/solr/core0/admin/, but only the second URL works; the first just returns a 404 stating "missing core name in path". I thought defaultCoreName meant I didn't need to specify the core name in the path. Should the defaultCoreName attribute work the way I expected, and if so, could you suggest which areas of the configuration I ought to look at in order to fix this?
The behaviour you expect is correct: http://localhost/solr/admin/ should give the same result as http://localhost/solr/core0/admin/.
Not sure what the problem is. Are you running Tomcat on port 80 instead of 8080?