Upgrade elasticsearch.jar from 1.7.5 to 2.3.5 - elasticsearch

I have upgraded the client (elasticsearch.jar) from 1.7.5 to 2.3.5. One of the major changes was to org.elasticsearch.client.transport.TransportClient: in 1.7.5 it had a public no-arg constructor. In my applicationContext-search.xml we previously had <bean id="transportClient" class="org.elasticsearch.client.transport.TransportClient"/>, which was replaced with:
<bean id="transportClient" class="org.elasticsearch.client.transport.TransportClient" factory-method="builder">
</bean>
<bean id="transportClient1" factory-bean="transportClient" factory-method="build">
</bean>
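For reference, this two-bean wiring corresponds to the following programmatic construction against the 2.3.5 API (a minimal sketch in Scala; the cluster name and address are assumptions):
import java.net.InetAddress
import org.elasticsearch.client.transport.TransportClient
import org.elasticsearch.common.settings.Settings
import org.elasticsearch.common.transport.InetSocketTransportAddress

// Same two-step construction as the factory-method beans above:
// builder() creates the Builder, build() creates the client.
val settings = Settings.settingsBuilder()
  .put("cluster.name", "elasticsearch") // assumption: default cluster name
  .build()
val client = TransportClient.builder()
  .settings(settings)
  .build()
  .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))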
Now when I start my service that invokes Elasticsearch, I end up with the following error:
[01 Sep 2016 15:14:27,390] WARN [] [elasticsearch[Carolyn Trainer][transport_client_worker][T#1]{New I/O worker #1}] [org.elasticsearch.transport.netty.NettyTransport] [Carolyn Trainer] exception caught on transport layer [[id: 0xfaea1184, /127.0.0.1:63786 => localhost/127.0.0.1:9300]], closing connection
java.lang.NullPointerException
at org.elasticsearch.transport.netty.MessageChannelHandler.handleException(MessageChannelHandler.java:207)
at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:202)
at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:136)
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
The Elasticsearch setup is a single node in the development environment.
Can someone help me understand what's happening here?
Regards,
Maney

This was my mistake: I was running the client (elasticsearch.jar) on 2.3.5 and the server on 1.3.2.
I upgraded the server to match my client, from 1.3.2 to 2.3.5, and things were up and working fine.
So, to conclude: make sure the client and server are running the same version of Elasticsearch.
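As a quick sanity check that the versions line up, the transport client exposes the nodes it actually connected to; with a mismatched server the list typically stays empty (a minimal sketch in Scala, assuming a client built as in the question):
import scala.collection.JavaConverters._

val nodes = client.connectedNodes().asScala
if (nodes.isEmpty)
  println("No nodes connected - check that client and server run the same Elasticsearch version")
else
  nodes.foreach(n => println(s"Connected to ${n.getName} (ES ${n.getVersion})"))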
Regards,
Maney

Related

I can't write a Spark DataFrame to database with jdbc

I am attempting to write a simple DataFrame into an Oracle database, but I get an error message. I use a case class and a list to construct my DataFrame. I found that we can use the jdbc method after write to insert data into my Oracle database.
I tried this code:
case class MyClass(A: String, B: Int)
val MyClass_List = List(MyClass("att1", 1), MyClass("att2", 2))
val MyClass_df = MyClass_List.toDF()
MyClass_df.write
.mode("append")
.jdbc(url, tableTest, prop)
but I get the following error:
17/07/12 14:57:04 ERROR JobScheduler: Error running job streaming job 1499864218000 ms.0
java.lang.NullPointerException
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:93)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:446)
at Test$$anonfun$1.apply(Test.scala:177)
at Test$$anonfun$1.apply(Test.scala:117)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:254)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:254)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:254)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:253)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Exception in thread "main" java.lang.NullPointerException
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:93)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:446)
at Test$$anonfun$1.apply(Test.scala:177)
at Test$$anonfun$1.apply(Test.scala:117)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:627)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:415)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:254)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:254)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:254)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:253)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
I use Spark version 2.1.0, and my database has two columns, A and B, typed varchar and number respectively.
Do you have any idea?
In fact, I was using the MySQL driver instead of the Oracle driver.
I should use
prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver")
and not
prop.setProperty("driver", "com.mysql.jdbc.Driver")
It should be "oracle.jdbc.OracleDriver" as the one in the driver package is deprecated.
prop.setProperty("driver", "oracle.jdbc.OracleDriver")

spark-shell cannot connect to remote master

I have two copies of Spark at the same version:
192.168.2.230 as master and 192.168.2.5 as slave.
I ran ./spark-submit --version on both 2.5 and 2.230:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8, OpenJDK 64-Bit Server VM, 1.8.0_121
Branch
Compiled by user jenkins on 2016-12-16T02:04:48Z
Revision
Url
Type --help for more information.
Their Spark versions are the same.
Then I changed the master's config (2.230) in conf/spark-env.sh, setting:
SPARK_LOCAL_IP=192.168.2.230
SPARK_MASTER_HOST=192.168.2.230
and started the master (2.230):
sbin/start-all.sh
I can see the master management UI at http://192.168.2.230:8080.
Then I tried to use spark-shell on 2.5:
./spark-shell --master=spark://192.168.2.230:7077
Some errors occurred and it could not connect to 2.230. It looks like a version incompatibility, but both Spark installs were copied from the same tar.gz. Here are the errors:
[root@localhost bin]# ./spark-shell --master=spark://192.168.2.230:7077
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/19 03:04:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/19 03:04:20 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.2.5 instead (on interface enp3s0)
17/04/19 03:04:20 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/04/19 03:04:20 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.2.230:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:189)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
... 1 more
17/04/19 03:04:40 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
17/04/19 03:04:40 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
17/04/19 03:04:40 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
17/04/19 03:04:40 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
at $line3.$read$$iw$$iw.<init>(<console>:15)
at $line3.$read$$iw.<init>(<console>:42)
at $line3.$read.<init>(<console>:44)
at $line3.$read$.<init>(<console>:48)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.$print$lzycompute(<console>:7)
at $line3.$eval$.$print(<console>:6)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
... 47 elided
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_121)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
The reason 2.5 cannot connect to 2.230 looks like a class incompatibility:
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
But when I tried it locally on 2.230 (./spark-shell --master spark://192.168.2.230:7077), it succeeded...
Why can't I use the remote spark-shell (2.5) to connect to the master (2.230)?
Sorry, it was my fault.
I was running a Dockerized Spark on 192.168.2.230 whose version is 1.6.
When 192.168.2.5 connected to 192.168.2.230, Docker forwarded port 7077 to the 1.6 Spark, hence the incompatibility error.
I changed the 2.1 Spark's port to 7777, and ./spark-shell --master spark://192.168.2.230:7777 works.
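For reference, the standalone master's port can be set in conf/spark-env.sh (the same file edited above; SPARK_MASTER_PORT is the standard variable):
SPARK_MASTER_PORT=7777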
Thanks, zyl.
I had the same problem; it was solved by finding the appropriate container IP address (172.18.0.1 instead of 127.0.0.1), because I was accessing it from another container.
That IP address can be found using the ifconfig command.
Just adding that in some cases the cause of this issue might be that the IP has no service running at all, or is inaccessible. Weird.

Implementing OLAP on titan/cassandra graph

I am using Titan 1.0.0 on top of Cassandra. I want to use OLAP services via SparkGraphComputer on the Titan/Cassandra graph. I have two questions:
1) How to do it?
config:
https://github.com/thinkaurelius/titan/blob/titan10/titan-dist/src/assembly/static/conf/hadoop-graph/read-cassandra.properties
gremlin-code:
graph = GraphFactory.open('conf/hadoop-graph/read-cassandra.properties')
g = graph.traversal(computer(SparkGraphComputer))
g.V().count() //Here is the error
Error:
11:20:33 ERROR org.apache.spark.executor.Executor - Exception in task 3.0 in stage 0.0 (TID 3)
java.lang.RuntimeException: error communicating via Thrift
at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:267)
at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:215)
at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
at org.apache.cassandra.hadoop.ColumnFamilyRecordReader.initialize(ColumnFamilyRecordReader.java:171)
at com.thinkaurelius.titan.hadoop.formats.cassandra.CassandraBinaryRecordReader.initialize(CassandraBinaryRecordReader.java:39)
at com.thinkaurelius.titan.hadoop.formats.util.GiraphRecordReader.initialize(GiraphRecordReader.java:38)
at org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:135)
at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:107)
at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:69)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:56)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Here is the full trace:
http://pastebin.com/CiuXjFB2
2) Why convert to a HadoopGraph when the data is already stored in Titan/Cassandra?
References:
https://groups.google.com/forum/#!topic/gremlin-users/fVijONCxvSI

Spring Batch - JSR 352 retry fails when skip exception is thrown in writer

I am trying to investigate what happens when an exception is thrown in the writer. In theory I expected a rollback to occur and the chunk to be retried with a commit size of 1, but this is not happening and I am receiving the following error.
I am using a Spring Batch JSR 352 job design.
In the writer I have 3 JDBC calls. If I throw a skippable exception after 2 of the JDBC calls have executed successfully, I find that Spring Batch is unable to roll back the transaction for the chunk and retry it with a commit size of one. I receive the following exception, after which the entire chunk is skipped and the chunk is committed. As a result, the data already written to the 2 JDBC tables is persisted while the 3rd table's data is missing.
20 Apr 2015 18:38:28,811 ERROR [gov.state.nextgen.in.batch.sdx.chunk.writer.SdxRcvDlyWriter.writeItems(SdxRcvDlyWriter.java:74)] - Error while writing sdxRcvRecord :Exception occured during processing
at gov.state.nextgen.in.batch.sdx.bo.impl.SdxRcvDlyBoImpl.write(SdxRcvDlyBoImpl.java:125)
at gov.state.nextgen.in.batch.sdx.chunk.writer.SdxRcvDlyWriter.writeItems(SdxRcvDlyWriter.java:65)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:94)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:619)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:133)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:121)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy18.writeItems(Unknown Source)
at org.springframework.batch.jsr.item.ItemWriterAdapter.write(ItemWriterAdapter.java:55)
at org.springframework.batch.core.jsr.step.item.JsrChunkProcessor.doPersist(JsrChunkProcessor.java:243)
at org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor$5.doWithRetry(JsrFaultTolerantChunkProcessor.java:298)
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:263)
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:168)
at org.springframework.batch.core.step.item.BatchRetryTemplate.execute(BatchRetryTemplate.java:222)
at org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor.persist(JsrFaultTolerantChunkProcessor.java:348)
at org.springframework.batch.core.jsr.step.item.JsrChunkProcessor.process(JsrChunkProcessor.java:114)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:75)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:77)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:67)
at org.springframework.batch.core.jsr.job.flow.support.state.JsrStepState.handle(JsrStepState.java:53)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:165)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144)
at org.springframework.batch.core.jsr.job.flow.JsrFlowJob.doExecute(JsrFlowJob.java:82)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:304)
at org.springframework.batch.core.jsr.launch.JsrJobOperator$2.run(JsrJobOperator.java:675)
at java.lang.Thread.run(Thread.java:795)
20 Apr 2015 18:38:50,157 DEBUG [org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:293)] - Checking for rethrow: count=1
20 Apr 2015 18:38:52,584 DEBUG [org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:313)] - Retry failed last attempt: count=1
20 Apr 2015 18:38:53,403 DEBUG [org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor$6.recover(JsrFaultTolerantChunkProcessor.java:333)] - Skipping after failed write
org.springframework.batch.core.step.item.ForceRollbackForWriteSkipException: Force rollback on skippable exception so that skipped item can be located.
at org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor$5.doWithRetry(JsrFaultTolerantChunkProcessor.java:317)
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:263)
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:168)
at org.springframework.batch.core.step.item.BatchRetryTemplate.execute(BatchRetryTemplate.java:222)
at org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor.persist(JsrFaultTolerantChunkProcessor.java:348)
at org.springframework.batch.core.jsr.step.item.JsrChunkProcessor.process(JsrChunkProcessor.java:114)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:75)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:77)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:67)
at org.springframework.batch.core.jsr.job.flow.support.state.JsrStepState.handle(JsrStepState.java:53)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:165)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144)
at org.springframework.batch.core.jsr.job.flow.JsrFlowJob.doExecute(JsrFlowJob.java:82)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:304)
at org.springframework.batch.core.jsr.launch.JsrJobOperator$2.run(JsrJobOperator.java:675)
at java.lang.Thread.run(Thread.java:795)
Caused by: gov.state.nextgen.framework.batch.exception.NGBatchSkipRecordException: gov.state.nextgen.in.batch.exception.INBatchException: Exception occured during processing
at gov.state.nextgen.in.batch.sdx.chunk.writer.SdxRcvDlyWriter.writeItems(SdxRcvDlyWriter.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:94)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:619)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:133)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:121)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy18.writeItems(Unknown Source)
at org.springframework.batch.jsr.item.ItemWriterAdapter.write(ItemWriterAdapter.java:55)
at org.springframework.batch.core.jsr.step.item.JsrChunkProcessor.doPersist(JsrChunkProcessor.java:243)
at org.springframework.batch.core.jsr.step.item.JsrFaultTolerantChunkProcessor$5.doWithRetry(JsrFaultTolerantChunkProcessor.java:298)
... 26 more
Caused by: gov.state.nextgen.in.batch.exception.INBatchException: Exception occured during processing
at gov.state.nextgen.in.batch.sdx.bo.impl.SdxRcvDlyBoImpl.write(SdxRcvDlyBoImpl.java:125)
at gov.state.nextgen.in.batch.sdx.chunk.writer.SdxRcvDlyWriter.writeItems(SdxRcvDlyWriter.java:65)
... 41 more
20 Apr 2015 18:39:14,808 DEBUG [org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:88)] - Inputs not busy, ended: false
20 Apr 2015 18:39:17,985 DEBUG [org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:437)] - Applying contribution: [StepContribution: read=2, written=0, filtered=0, readSkips=0, writeSkips=1, processSkips=0, exitStatus=EXECUTING]
20 Apr 2015 18:39:18,984 INFO [gov.state.nextgen.in.batch.sdx.chunk.writer.SdxRcvDlyWriter.checkpointInfo(SdxRcvDlyWriter.java:87)] - ParisRcvFedWriter - checkpointInfo(): step name: process, step execution id: 14939
20 Apr 2015 18:39:19,314 TRACE [org.springframework.transaction.support.TransactionSynchronizationManager.getResource(TransactionSynchronizationManager.java:140)] - Retrieved value [org.springframework.jdbc.datasource.ConnectionHolder#5c4a884f] for key [org.apache.commons.dbcp.BasicDataSource#d9bdc230] bound to thread [SimpleAsyncTaskExecutor-1]
20 Apr 2015 18:39:19,314 DEBUG [org.springframework.transaction.support.AbstractPlatformTransactionManager.handleExistingTransaction(AbstractPlatformTransactionManager.java:472)] - Participating in existing transaction
20 Apr 2015 18:39:19,314 TRACE [org.springframework.transaction.interceptor.TransactionAspectSupport.prepareTransactionInfo(TransactionAspectSupport.java:447)] - Getting transaction for [org.springframework.batch.core.repository.support.SimpleJobRepository.updateExecutionContext]
20 Apr 2015 18:39:19,315 DEBUG [org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:908)] - Executing prepared SQL update
20 Apr 2015 18:39:19,315 DEBUG [org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:627)] - Executing prepared SQL statement [UPDATE BATCH_STEP_EXECUTION_CONTEXT SET SHORT_CONTEXT = ?, SERIALIZED_CONTEXT = ? WHERE STEP_EXECUTION_ID = ?]
I am attaching my configuration files for reference. By debugging I found that my insert statements execute under a transaction managed by Spring, and I am sure there is no issue with the transaction manager configuration.

leiningen-win-installer fails to run REPL

I've just installed leiningen-win-installer v1.0.0 and tried to open a REPL with JDK 7 (and JDK 8), but it fails on the first attempt. There is no bug tracker in the original project, and I wonder if the community can help.
Log:
Windows PowerShell
Copyright (C) 2012 Microsoft Corporation. All rights reserved.
PS C:\Users\User> lein repl
nREPL server started on port 53308 on host 127.0.0.1 - nrepl://127.0.0.1:53308
REPL-y 0.3.2, nREPL 0.2.0-beta5
Exception in thread "nREPL-worker-0" NoSuchMethodError clojure.tools.nrepl.StdOutBuffer.length()I clojure.tools.nrepl.middleware.session/session-out/fn--7630 (session.clj:43)
NoSuchMethodError clojure.tools.nrepl.StdOutBuffer.length()I clojure.tools.nrepl.middleware.session/session-out/fn--7630 (session.clj:43)
java.lang.NoSuchMethodError: clojure.tools.nrepl.StdOutBuffer.length()I
at clojure.tools.nrepl.middleware.session$session_out$fn__7630.doInvoke(session.clj:43)
at clojure.lang.RestFn.invoke(RestFn.java:460)
at clojure.tools.nrepl.middleware.session.proxy$java.io.Writer$ff19274a.write(Unknown Source)
at java.io.PrintWriter.write(PrintWriter.java:456)
at java.io.PrintWriter.write(PrintWriter.java:473)
at clojure.core$fn__5471.invoke(core_print.clj:191)
at clojure.lang.MultiFn.invoke(MultiFn.java:231)
at clojure.core$pr_on.invoke(core.clj:3392)
at clojure.core$pr.invoke(core.clj:3404)
at clojure.lang.AFn.applyToHelper(AFn.java:154)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$prn.doInvoke(core.clj:3437)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$println.doInvoke(core.clj:3457)
at clojure.lang.RestFn.invoke(RestFn.java:408)
at clojure.main$repl_caught.invoke(main.clj:158)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__7569$fn__7582.invoke(interruptible_eval.clj:76)
at clojure.main$repl$fn__6634.invoke(main.clj:259)
at clojure.main$repl.doInvoke(main.clj:257)
at clojure.lang.RestFn.invoke(RestFn.java:1096)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__7569.invoke(interruptible_eval.clj:56)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1862)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke(interruptible_eval.clj:41)
at clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__7610$fn__7613.invoke(interruptible_eval.clj:171)
at clojure.core$comp$fn__4192.invoke(core.clj:2402)
at clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__7603.invoke(interruptible_eval.clj:138)
at clojure.lang.AFn.run(AFn.java:22)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
#<Namespace user>
Error loading namespace; falling back to user
Exception in thread "nREPL-worker-1" java.lang.NoSuchMethodError: clojure.tools.nrepl.StdOutBuffer.length()I
NoSuchMethodError clojure.tools.nrepl.StdOutBuffer.length()I clojure.tools.nrepl.middleware.session/session-out/fn--7630 (session.clj:43)
NoSuchMethodError clojure.tools.nrepl.StdOutBuffer.length()I clojure.tools.nrepl.middleware.session/session-out/fn--7630 (session.clj:43)
at clojure.tools.nrepl.middleware.session$session_out$fn__7630.doInvoke(session.clj:43)
at clojure.lang.RestFn.invoke(RestFn.java:460)
at clojure.tools.nrepl.middleware.session.proxy$java.io.Writer$ff19274a.write(Unknown Source)
at java.io.PrintWriter.write(PrintWriter.java:456)
at java.io.PrintWriter.write(PrintWriter.java:473)
at clojure.core$fn__5471.invoke(core_print.clj:191)
at clojure.lang.MultiFn.invoke(MultiFn.java:231)
at clojure.core$pr_on.invoke(core.clj:3392)
at clojure.core$pr.invoke(core.clj:3404)
at clojure.lang.AFn.applyToHelper(AFn.java:154)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$prn.doInvoke(core.clj:3437)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$println.doInvoke(core.clj:3457)
at clojure.lang.RestFn.invoke(RestFn.java:408)
at clojure.main$repl_caught.invoke(main.clj:158)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__7569$fn__7582.invoke(interruptible_eval.clj:76)
at clojure.main$repl$fn__6634.invoke(main.clj:259)
at clojure.main$repl.doInvoke(main.clj:257)
at clojure.lang.RestFn.invoke(RestFn.java:1096)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__7569.invoke(interruptible_eval.clj:56)
at clojure.lang.AFn.applyToHelper(AFn.java:152)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.core$apply.invoke(core.clj:624)
at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1862)
at clojure.lang.RestFn.invoke(RestFn.java:425)
at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke(interruptible_eval.clj:41)
at clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__7610$fn__7613.invoke(interruptible_eval.clj:171)
at clojure.core$comp$fn__4192.invoke(core.clj:2402)
at clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__7603.invoke(interruptible_eval.clj:138)
at clojure.lang.AFn.run(AFn.java:22)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
user=>
This is a Leiningen issue; it's fixed in master (https://github.com/technomancy/leiningen/issues/1625).
