TransactionTooLargeException when onSaveInstanceState called - android-9.0-pie

It seems to happen when onSaveInstanceState is called.
Both Android 8.1 and 9.0 devices have this problem.
How do I solve this?
Thanks a lot!
04-10 02:49:44.606 16580 16580 E JavaBinder: !!! FAILED BINDER TRANSACTION !!! (parcel size = 527472)
04-10 02:49:44.606 16580 16580 W ActivityStopInfo: Bundle stats:
04-10 02:49:44.606 16580 16580 W ActivityStopInfo: android:viewHierarchyState [size=2104]
04-10 02:49:44.607 16580 16580 W ActivityStopInfo: android:views [size=2000]
04-10 02:49:44.620 16580 16580 W ActivityStopInfo: android:support:fragments [size=524484]
04-10 02:49:44.620 16580 16580 W ActivityStopInfo: PersistableBundle stats:
04-10 02:49:44.620 16580 16580 W ActivityStopInfo: [null]
04-10 02:49:44.620 16580 16580 D AndroidRuntime: Shutting down VM
--------- beginning of crash
04-10 02:49:44.621 16580 16580 E AndroidRuntime: FATAL EXCEPTION: main
04-10 02:49:44.621 16580 16580 E AndroidRuntime: Process: com.cwj.hsing, PID: 16580
04-10 02:49:44.621 16580 16580 E AndroidRuntime: java.lang.RuntimeException: android.os.TransactionTooLargeException: data parcel size 527472 bytes
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.app.servertransaction.PendingTransactionActions$StopInfo.run(PendingTransactionActions.java:160)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.os.Handler.handleCallback(Handler.java:873)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.os.Handler.dispatchMessage(Handler.java:99)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.os.Looper.loop(Looper.java:280)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.app.ActivityThread.main(ActivityThread.java:6706)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at java.lang.reflect.Method.invoke(Native Method)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: Caused by: android.os.TransactionTooLargeException: data parcel size 527472 bytes
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.os.BinderProxy.transactNative(Native Method)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.os.BinderProxy.transact(Binder.java:1127)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.app.IActivityManager$Stub$Proxy.activityStopped(IActivityManager.java:4011)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: at android.app.servertransaction.PendingTransactionActions$StopInfo.run(PendingTransactionActions.java:144)
04-10 02:49:44.621 16580 16580 E AndroidRuntime: ... 7 more

Passing an empty Bundle keeps the transaction small. If you do not need to save any instance state, you can do it like this:

@Override
protected void onSaveInstanceState(Bundle outState) {
    // Save an empty Bundle instead of the default (oversized) state.
    // Note: this discards ALL saved instance state, including fragment state.
    super.onSaveInstanceState(new Bundle());
}
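If only the fragment state is oversized (as the android:support:fragments [size=524484] line in the log suggests), a narrower workaround is to drop just that key. This is only a sketch: "android:support:fragments" is an internal key of the support library, not a public API, so it may change between library versions.

```java
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {
    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        // Keep the small view-hierarchy state but drop the oversized fragment
        // state. The key name is an internal implementation detail of the
        // support library, so this is fragile across versions.
        outState.remove("android:support:fragments");
    }
}
```

A more robust fix is to avoid putting large data (bitmaps, long lists) into fragment arguments or saved instance state in the first place, and to persist it to disk or hold it in a ViewModel instead.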

Related

Issue running spark-shell with yarn client, ERROR client.TransportClient: Failed to send RPC

I am trying to set up Hadoop 3.1.2 with Spark on Windows. I have started the HDFS cluster and I am able to create and copy files in HDFS. When I try to start spark-shell with YARN, I get:
ERROR cluster.YarnClientSchedulerBackend: Diagnostics message: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:227)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:544)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:264)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:875)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:874)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:874)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:906)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 7406367420263248997 to DESKTOP-TVBSANL.bbrouter/192.168.1.38:49691: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.access$1500(AbstractChannelHandlerContext.java:35)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
... 21 more
2020-03-28 11:32:11,608 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
When checked with yarn logs:
2020-03-28 11:32:11,487 ERROR client.TransportClient: Failed to send RPC RPC 7406367420263248997 to DESKTOP-TVBSANL.bbrouter/192.168.1.38:49691: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.access$1500(AbstractChannelHandlerContext.java:35)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
... 21 more
2020-03-28 11:32:11,494 ERROR yarn.ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:227)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:544)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:264)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:875)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:874)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:874)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:906)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 7406367420263248997 to DESKTOP-TVBSANL.bbrouter/192.168.1.38:49691: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.access$1500(AbstractChannelHandlerContext.java:35)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
... 21 more
2020-03-28 11:32:11,497 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:227)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:544)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:264)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:875)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:874)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:874)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:906)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 7406367420263248997 to DESKTOP-TVBSANL.bbrouter/192.168.1.38:49691: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.access$1500(AbstractChannelHandlerContext.java:35)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
... 21 more
)
2020-03-28 11:32:11,505 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:227)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:544)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:264)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:875)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:874)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:874)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:906)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Failed to send RPC RPC 7406367420263248997 to DESKTOP-TVBSANL.bbrouter/192.168.1.38:49691: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:362)
at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:339)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64)
at io.netty.channel.ChannelOutboundBuffer.safeFail(ChannelOutboundBuffer.java:680)
at io.netty.channel.ChannelOutboundBuffer.remove0(ChannelOutboundBuffer.java:294)
at io.netty.channel.ChannelOutboundBuffer.failFlushed(ChannelOutboundBuffer.java:617)
at io.netty.channel.AbstractChannel$AbstractUnsafe.closeOutboundBufferForShutdown(AbstractChannel.java:627)
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:620)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:893)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:313)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:847)
at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1264)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:115)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:743)
at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:117)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:770)
at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:762)
at io.netty.channel.AbstractChannelHandlerContext.access$1500(AbstractChannelHandlerContext.java:35)
at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1116)
at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1050)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:464)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.socket.ChannelOutputShutdownException: Channel output shutdown
at io.netty.channel.AbstractChannel$AbstractUnsafe.shutdownOutput(AbstractChannel.java:587)
... 22 more
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.AbstractFileRegion.transferred()J
at org.apache.spark.network.util.AbstractFileRegion.transfered(AbstractFileRegion.java:28)
at io.netty.channel.nio.AbstractNioByteChannel.doWrite(AbstractNioByteChannel.java:228)
at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:282)
at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:879)
... 21 more
)
2020-03-28 11:32:11,526 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
2020-03-28 11:32:11,729 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://localhost:9000/user/Andrew/.sparkStaging/application_1585375241853_0002
2020-03-28 11:32:12,225 INFO util.ShutdownHookManager: Shutdown hook called
I have even added the following properties to my Spark conf:
spark.driver.extraJavaOptions -Dhdp.version=3.1.2
spark.yarn.am.extraJavaOptions -Dhdp.version=3.1.2
and these in yarn-site.xml:
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>5</value>
</property>
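Note that the innermost "Caused by" in the traces above is a NoSuchMethodError on AbstractFileRegion.transferred(), which typically indicates conflicting Netty / spark-network-common jars between the Hadoop and Spark classpaths rather than a memory problem, so the memory-check settings above would not address it. A small sketch to compare the Netty jars in both installs (the SPARK_HOME/HADOOP_HOME directory layout below is the conventional one; your paths may differ):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class NettyJarCheck {
    // Return the names of Netty-related jars found in a directory.
    public static List<String> nettyJarsIn(File dir) {
        List<String> names = new ArrayList<>();
        File[] jars = dir.listFiles((d, name) -> name.toLowerCase().contains("netty"));
        if (jars != null) {
            for (File jar : jars) {
                names.add(jar.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        // SPARK_HOME and HADOOP_HOME are the usual environment variables;
        // adjust the subdirectories if your layout differs.
        String spark = System.getenv("SPARK_HOME");
        if (spark != null) {
            System.out.println("Spark:  " + nettyJarsIn(new File(spark, "jars")));
        }
        String hadoop = System.getenv("HADOOP_HOME");
        if (hadoop != null) {
            System.out.println("Hadoop: " + nettyJarsIn(new File(hadoop, "share/hadoop/common/lib")));
        }
    }
}
```

If the two listings show different major Netty versions, aligning them (or using a Spark build matching the Hadoop version) is the usual next step.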
My cluster is a single-node cluster: Windows, 16 GB RAM, and a 500 GB HDD. The following is my hdfs report:
Configured Capacity: 1000203087872 (931.51 GB)
Present Capacity: 252250412093 (234.93 GB)
DFS Remaining: 252011880448 (234.70 GB)
DFS Used: 238531645 (227.48 MB)
DFS Used%: 0.09%
Replicated Blocks:
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 0
Low redundancy blocks with highest priority to recover: 0
Pending deletion blocks: 0
Erasure Coded Block Groups:
Low redundancy block groups: 0
Block groups with corrupt internal blocks: 0
Missing block groups: 0
Low redundancy blocks with highest priority to recover: 0
Pending deletion blocks: 0
-------------------------------------------------
Live datanodes (1):
Name: 127.0.0.1:9866 (127.0.0.1)
Hostname: ##################
Decommission Status : Normal
Configured Capacity: 1000203087872 (931.51 GB)
DFS Used: 238531645 (227.48 MB)
Non DFS Used: 747952675779 (696.59 GB)
DFS Remaining: 252011880448 (234.70 GB)
DFS Used%: 0.02%
DFS Remaining%: 25.20%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Sat Mar 28 11:48:14 IST 2020
Last Block Report: Sat Mar 28 11:30:44 IST 2020
Num of Blocks: 248
I have been at this for two days now and would appreciate any help.
Thanks in advance.

Use the same realm from UI and background service

If I try to use the same Realm from the UI and a background service, I get this error:
05-20 14:39:35.016 11464 11464 F DEBUG : Build fingerprint: 'google/sdk_google_phone_x86/generic_x86:7.1.1/NYC/4729347:userdebug/test-keys'
05-20 14:39:35.017 11464 11464 F DEBUG : Revision: '0'
05-20 14:39:35.017 11464 11464 F DEBUG : ABI: 'x86'
05-20 14:39:35.017 11464 11464 F DEBUG : pid: 11427, tid: 11452, name: est.aps.tracker >>> fr.test.aps.tracker <<<
05-20 14:39:35.019 11464 11464 F DEBUG : signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
05-20 14:39:35.019 11464 11464 F DEBUG : eax 00000000 ebx 00002ca3 ecx 00002cbc edx 00000006
05-20 14:39:35.019 11464 11464 F DEBUG : esi 8977f978 edi 8977f920
05-20 14:39:35.019 11464 11464 F DEBUG : xcs 00000073 xds 0000007b xes 0000007b xfs 0000003b xss 0000007b
05-20 14:39:35.019 11464 11464 F DEBUG : eip ae2f1424 ebp 8977f778 esp 8977f71c flags 00200296
05-20 14:39:35.028 11464 11464 F DEBUG :
05-20 14:39:35.028 11464 11464 F DEBUG : backtrace:
05-20 14:39:35.028 11464 11464 F DEBUG : #00 pc ffffe424 [vdso:ae2f1000] (__kernel_vsyscall+16)
05-20 14:39:35.028 11464 11464 F DEBUG : #01 pc 0007a03c /system/lib/libc.so (tgkill+28)
05-20 14:39:35.028 11464 11464 F DEBUG : #02 pc 00075885 /system/lib/libc.so (pthread_kill+85)
05-20 14:39:35.028 11464 11464 F DEBUG : #03 pc 0002785a /system/lib/libc.so (raise+42)
05-20 14:39:35.028 11464 11464 F DEBUG : #04 pc 0001ee36 /system/lib/libc.so (abort+86)
05-20 14:39:35.030 11464 11464 F DEBUG : #05 pc 0046bab4 /data/app/fr.test.aps-1/lib/x86/librealm-wrappers.so (_ZN9__gnu_cxx27__verbose_terminate_handlerEv+452)
05-20 14:39:35.034 11464 11464 F DEBUG : #06 pc 00433b17 /data/app/fr.test.aps-1/lib/x86/librealm-wrappers.so (_ZN10__cxxabiv111__terminateEPFvvE+23)
05-20 14:39:35.034 11464 11464 F DEBUG : #07 pc 00433baf /data/app/fr.test.aps-1/lib/x86/librealm-wrappers.so (_ZSt9terminatev+31)
05-20 14:39:35.034 11464 11464 F DEBUG : #08 pc 0046a73d /data/app/fr.test.aps-1/lib/x86/librealm-wrappers.so (execute_native_thread_routine+141)
05-20 14:39:35.034 11464 11464 F DEBUG : #09 pc 00074fe2 /system/lib/libc.so (_ZL15__pthread_startPv+210)
05-20 14:39:35.034 11464 11464 F DEBUG : #10 pc 0002029e /system/lib/libc.so (__start_thread+30)
05-20 14:39:35.034 11464 11464 F DEBUG : #11 pc 0001e076 /system/lib/libc.so (__bionic_clone+70)
To isolate the problem I created a small app consisting of a single page and a hybrid service.
The page has two buttons to start/stop a counter, and it also displays a list of user objects (in a ListView).
The service just prints a counter value to the logcat window every 5 seconds
(it takes a few seconds for it to start).
I would like the service to update the ListView with the counter value in real time,
but I cannot use the same Realm at the same time from my UI and the service.
In other words, I cannot uncomment both of the lines mentioned below at the same time:
To enable/disable usage of the Realm by the background service:
uncomment/comment the await in the StartTrack method of Tracker.cs.
To enable/disable usage of the Realm by the UI:
uncomment/comment the await in the OnAppearing method of ServicePage.cs.
PS:
Adapt Constants.cs with your valid ROS address, username and password.
EDIT: I simplified the code
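The question is a Xamarin one, but the constraint behind the crash is the same rule Realm documents on every platform: a Realm instance is confined to the thread that opened it, so the UI and the service must each open (and close) their own instance instead of sharing one. A minimal, runnable sketch of that thread-confinement pattern in Python; `Handle` and `get_instance` are hypothetical stand-ins for a Realm instance and `Realm.getDefaultInstance()`:

```python
import threading

class Handle:
    """Hypothetical stand-in for a Realm instance: usable only on its opener's thread."""
    def __init__(self):
        self.owner = threading.get_ident()

    def check_thread(self):
        # Realm raises the equivalent error when an instance crosses threads.
        if threading.get_ident() != self.owner:
            raise RuntimeError("Realm accessed from incorrect thread")

_local = threading.local()

def get_instance():
    """Each thread lazily opens its own handle, like Realm.getDefaultInstance()."""
    if not hasattr(_local, "handle"):
        _local.handle = Handle()
    return _local.handle

results = []
ui = get_instance()
ui.check_thread()                  # fine: same thread that opened it

def service():
    get_instance().check_thread()  # fine: the service thread's own handle
    try:
        ui.check_thread()          # sharing the UI's handle across threads fails
    except RuntimeError as e:
        results.append(str(e))

t = threading.Thread(target=service)
t.start(); t.join()
print(results[0])  # the error the service thread hits when reusing the UI handle
```

The practical consequence is that the service should open its own Realm, write the counter there, and let the UI observe changes through Realm's notification mechanism rather than passing an open instance between threads.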

Couldn't run JavaFX Ports application on Genymotion with flashed ARM Translation

I am having a problem running my JavaFX Ports application on Genymotion. When I try to install it on the emulator, it says "Failure [INSTALL_FAILED_CPU_ABI_INCOMPATIBLE]", so I flashed the emulator with the ARM Translation package and then successfully installed my application on Genymotion.
The problem is that when I try to run the app, it says "Unfortunately Gluon has stopped".
The error log is below.
07-24 13:36:57.283 1508 1508 I MultiDex: install
07-24 13:36:57.283 1508 1508 I MultiDex: MultiDexExtractor.load(/data/app/com.gluonapplication2-1.apk, false)
07-24 13:36:57.299 197 246 W genymotion_audio: out_write() limiting sleep time 46485 to 39909
07-24 13:36:57.315 1508 1508 I MultiDex: loading existing secondary dex files
07-24 13:36:57.315 1508 1508 I MultiDex: load found 1 secondary dex files
07-24 13:36:57.319 1508 1508 I MultiDex: install done
07-24 13:36:57.323 1508 1508 V FXActivity: Initializing JavaFX Platform, using 8.60.7-SNAPSHOT
07-24 13:36:57.331 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.335 1508 1508 D dalvikvm: Trying to load lib /data/app-lib/com.gluonapplication2-1/libactivity.so 0xa4fe3960
07-24 13:36:57.371 1508 1508 F libc : Fatal signal 11 (SIGSEGV) at 0x000000b4 (code=1), thread 1508 (uonapplication2)
07-24 13:36:57.403 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.455 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.483 145 145 I DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
07-24 13:36:57.483 145 145 I DEBUG : Build fingerprint: 'generic/vbox86p/vbox86p:4.4.4/KTU84P/eng.genymotion.20160609.162149:userdebug/test-keys'
07-24 13:36:57.483 145 145 I DEBUG : Revision: '0'
07-24 13:36:57.483 145 145 I DEBUG : pid: 1508, tid: 1508, name: uonapplication2 >>> com.gluonapplication2 <<<
07-24 13:36:57.483 145 145 I DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 000000b4
07-24 13:36:57.607 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.663 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.723 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.783 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:57.919 145 145 I DEBUG : eax 0f861226 ebx 000019f0 ecx 94723d01 edx 00000000
07-24 13:36:57.919 145 145 I DEBUG : esi 00000000 edi b4e2e6bc
07-24 13:36:57.923 145 145 I DEBUG : xcs 00000073 xds 0000007b xes 0000007b xfs 00000000 xss 0000007b
07-24 13:36:57.923 145 145 I DEBUG : eip 945df751 ebp bf9353e8 esp bf9353c0 flags 00210246
07-24 13:36:57.923 145 145 I DEBUG :
07-24 13:36:57.923 145 145 I DEBUG : backtrace:
07-24 13:36:57.927 145 145 I DEBUG : #00 pc 000e8751 /system/lib/libhoudini.so
07-24 13:36:57.927 145 145 I DEBUG : #01 pc 000e66ea /system/lib/libhoudini.so
07-24 13:36:57.927 145 145 I DEBUG : #02 pc 000e8aaa /system/lib/libhoudini.so
07-24 13:36:57.931 145 145 I DEBUG : #03 pc 000bfcfc /system/lib/libhoudini.so
07-24 13:36:57.931 145 145 I DEBUG : #04 pc 000e6cc2 /system/lib/libhoudini.so (dvm2hdInit+18)
07-24 13:36:57.935 145 145 I DEBUG : #05 pc 00176947 /system/lib/libdvm.so (houdini::hookDlopen(char const*, int, bool*)+343)
07-24 13:36:57.935 145 145 I DEBUG : #06 pc 0008bcff /system/lib/libdvm.so (dvmLoadNativeCode(char const*, Object*, char**)+719)
07-24 13:36:57.935 145 145 I DEBUG : #07 pc 000ceaab /system/lib/libdvm.so (Dalvik_java_lang_Runtime_nativeLoad(unsigned int const*, JValue*)+139)
07-24 13:36:57.939 145 145 I DEBUG : #08 pc 001775b8 /system/lib/libdvm.so
07-24 13:36:57.939 145 145 I DEBUG : #09 pc 00005bff <unknown>
07-24 13:36:57.939 145 145 I DEBUG : #10 pc 0003b962 /system/lib/libdvm.so (dvmMterpStd(Thread*)+66)
07-24 13:36:57.939 145 145 I DEBUG : #11 pc 00037029 /system/lib/libdvm.so (dvmInterpret(Thread*, Method const*, JValue*)+217)
07-24 13:36:57.939 145 145 I DEBUG : #12 pc 000bd027 /system/lib/libdvm.so (dvmCallMethodV(Thread*, Method const*, Object*, bool, JValue*, char*)+759)
07-24 13:36:57.943 145 145 I DEBUG : #13 pc 000bd437 /system/lib/libdvm.so (dvmCallMethod(Thread*, Method const*, Object*, JValue*, ...)+55)
07-24 13:36:57.943 145 145 I DEBUG : #14 pc 000dd8b2 /system/lib/libdvm.so (dvmInitClass+1458)
07-24 13:36:57.947 145 145 I DEBUG : #15 pc 000cd8d5 /system/lib/libdvm.so (Dalvik_java_lang_Class_newInstance(unsigned int const*, JValue*)+229)
07-24 13:36:57.947 145 145 I DEBUG : #16 pc 001775b8 /system/lib/libdvm.so
07-24 13:36:57.951 145 145 I DEBUG : #17 pc 00005ceb <unknown>
07-24 13:36:57.951 145 145 I DEBUG : #18 pc 0003b962 /system/lib/libdvm.so (dvmMterpStd(Thread*)+66)
07-24 13:36:57.955 145 145 I DEBUG : #19 pc 00037029 /system/lib/libdvm.so (dvmInterpret(Thread*, Method const*, JValue*)+217)
07-24 13:36:57.955 145 145 I DEBUG : #20 pc 000bc1c6 /system/lib/libdvm.so (dvmInvokeMethod(Object*, Method const*, ArrayObject*, ArrayObject*, ClassObject*, bool)+1750)
07-24 13:36:57.955 145 145 I DEBUG : #21 pc 000d1b20 /system/lib/libdvm.so (Dalvik_java_lang_reflect_Method_invokeNative(unsigned int const*, JValue*)+288)
07-24 13:36:57.959 145 145 I DEBUG : #22 pc 001775b8 /system/lib/libdvm.so
07-24 13:36:57.967 145 145 I DEBUG : #23 pc 00005eff <unknown>
07-24 13:36:57.967 145 145 I DEBUG : #24 pc 0003b962 /system/lib/libdvm.so (dvmMterpStd(Thread*)+66)
07-24 13:36:57.971 145 145 I DEBUG : #25 pc 00037029 /system/lib/libdvm.so (dvmInterpret(Thread*, Method const*, JValue*)+217)
07-24 13:36:57.971 145 145 I DEBUG : #26 pc 000bd027 /system/lib/libdvm.so (dvmCallMethodV(Thread*, Method const*, Object*, bool, JValue*, char*)+759)
07-24 13:36:57.975 145 145 I DEBUG : #27 pc 0007879d /system/lib/libdvm.so (CallStaticVoidMethodV(_JNIEnv*, _jclass*, _jmethodID*, char*)+109)
07-24 13:36:57.975 145 145 I DEBUG : #28 pc 0005f50a /system/lib/libandroid_runtime.so (_JNIEnv::CallStaticVoidMethod(_jclass*, _jmethodID*, ...)+42)
07-24 13:36:57.979 145 145 I DEBUG : #29 pc 00060ca4 /system/lib/libandroid_runtime.so (android::AndroidRuntime::start(char const*, char const*)+884)
07-24 13:36:57.979 145 145 I DEBUG : #30 pc 00001017 /system/bin/app_process (main+567)
07-24 13:36:57.979 145 145 I DEBUG : #31 pc 0000d59c /system/lib/libc.so (__libc_init+108)
07-24 13:36:57.983 145 145 I DEBUG :
07-24 13:36:57.983 145 145 I DEBUG : stack:
07-24 13:36:57.983 145 145 I DEBUG : bf935470 b7737000 /system/bin/linker
07-24 13:36:57.987 145 145 I DEBUG : bf935474 b4e35179 /system/lib/libdvm.so
07-24 13:36:57.991 145 145 I DEBUG : bf935478 b76a9689 /system/lib/libc.so (__system_property_get+9)
07-24 13:36:57.991 145 145 I DEBUG : bf93547c b7711e58 /system/lib/libcutils.so
07-24 13:36:57.991 145 145 I DEBUG : bf935480 98dc7c38
07-24 13:36:57.995 145 145 I DEBUG : bf935484 948709c0
07-24 13:36:57.995 145 145 I DEBUG : bf935488 bf9354a8 [stack]
07-24 13:36:57.999 145 145 I DEBUG : bf93548c 945ddcc3 /system/lib/libhoudini.so (dvm2hdInit+19)
07-24 13:36:58.003 145 145 I DEBUG : bf935490 b4e3514d /system/lib/libdvm.so
07-24 13:36:58.003 145 145 I DEBUG : bf935494 bf9354e0 [stack]
07-24 13:36:58.003 145 145 I DEBUG : bf935498 9dd686d8 /dev/ashmem/dalvik-LinearAlloc (deleted)
07-24 13:36:58.007 145 145 I DEBUG : bf93549c b4e7bcb4 /system/lib/libdvm.so
07-24 13:36:58.007 145 145 I DEBUG : bf9354a0 b4e2e6bc /system/lib/libdvm.so
07-24 13:36:58.007 145 145 I DEBUG : bf9354a4 b4e2e6bc /system/lib/libdvm.so
07-24 13:36:58.011 145 145 I DEBUG : bf9354a8 00000000
07-24 13:36:58.011 145 145 I DEBUG : bf9354ac b4e2b948 /system/lib/libdvm.so (houdini::hookDlopen(char const*, int, bool*)+344)
07-24 13:36:58.011 145 145 I DEBUG : #05 bf9354b0 bf9354d8 [stack]
07-24 13:36:58.011 145 145 I DEBUG : bf9354b4 b4e35179 /system/lib/libdvm.so
07-24 13:36:58.011 145 145 I DEBUG : bf9354b8 b4e2cb75 /system/lib/libdvm.so
07-24 13:36:58.011 145 145 I DEBUG : bf9354bc b4e7bcb4 /system/lib/libdvm.so
07-24 13:36:58.011 145 145 I DEBUG : bf9354c0 bf93550c [stack]
07-24 13:36:58.015 145 145 I DEBUG : bf9354c4 9dd686d8 /dev/ashmem/dalvik-LinearAlloc (deleted)
07-24 13:36:58.019 145 145 I DEBUG : bf9354c8 b814af90 [heap]
07-24 13:36:58.023 145 145 I DEBUG : bf9354cc bf9355ab [stack]
07-24 13:36:58.023 145 145 I DEBUG : bf9354d0 bf93550c [stack]
07-24 13:36:58.027 145 145 I DEBUG : bf9354d4 00000000
07-24 13:36:58.031 145 145 I DEBUG : bf9354d8 b770a1d0 /system/lib/libcutils.so (__android_log_print)
07-24 13:36:58.031 145 145 I DEBUG : bf9354dc b4e2b7e0 /system/lib/libdvm.so (houdini::dvmGetMethodShorty(houdini::fake_Method const*))
07-24 13:36:58.035 145 145 I DEBUG : bf9354e0 00006e6f
07-24 13:36:58.035 145 145 I DEBUG : bf9354e4 00000000
07-24 13:36:58.035 145 145 I DEBUG : bf9354e8 ffffffff
07-24 13:36:58.039 145 145 I DEBUG : bf9354ec 00000000
07-24 13:36:58.039 145 145 I DEBUG : ........ ........
07-24 13:36:58.039 145 145 I DEBUG : #06 bf935560 b814af90 [heap]
07-24 13:36:58.047 145 145 I DEBUG : bf935564 00000001
07-24 13:36:58.047 145 145 I DEBUG : bf935568 bf9355ab [stack]
07-24 13:36:58.047 145 145 I DEBUG : bf93556c b4d40080 /system/lib/libdvm.so (hashcmpNameStr(void const*, void const*))
07-24 13:36:58.047 145 145 I DEBUG : bf935570 00000000
07-24 13:36:58.047 145 145 I DEBUG : bf935574 b4e44714 /system/lib/libdvm.so
07-24 13:36:58.047 145 145 I DEBUG : bf935578 00000018
07-24 13:36:58.047 145 145 I DEBUG : bf93557c b772df2a /system/bin/linker (__dl_pthread_mutex_unlock+154)
07-24 13:36:58.047 145 145 I DEBUG : bf935580 00000002
07-24 13:36:58.047 145 145 I DEBUG : bf935584 bf9355bc [stack]
07-24 13:36:58.051 145 145 I DEBUG : bf935588 a4fe3960 /dev/ashmem/dalvik-heap (deleted)
07-24 13:36:58.051 145 145 I DEBUG : bf93558c b76e4f01 /system/lib/libc.so
07-24 13:36:58.051 145 145 I DEBUG : bf935590 bf93561c [stack]
07-24 13:36:58.051 145 145 I DEBUG : bf935594 b800deb0 [heap]
07-24 13:36:58.051 145 145 I DEBUG : bf935598 000006c9
07-24 13:36:58.051 145 145 I DEBUG : bf93559c b765c455 /system/lib/libc.so (dlfree+885)
07-24 13:36:58.051 145 145 I DEBUG : ........ ........
07-24 13:36:58.051 145 145 I DEBUG : #07 bf935600 b814af90 [heap]
07-24 13:36:58.055 145 145 I DEBUG : bf935604 a4fe3960 /dev/ashmem/dalvik-heap (deleted)
07-24 13:36:58.055 145 145 I DEBUG : bf935608 bf93561c [stack]
07-24 13:36:58.055 145 145 I DEBUG : bf93560c b4e7bcb4 /system/lib/libdvm.so
07-24 13:36:58.055 145 145 I DEBUG : bf935610 9dad7bf0 /data/dalvik-cache/system#framework#core.jar#classes.dex
07-24 13:36:58.055 145 145 I DEBUG : bf935614 a4cb60e0 /dev/ashmem/dalvik-zygote (deleted)
07-24 13:36:58.055 145 145 I DEBUG : bf935618 b4d536c5 /system/lib/libdvm.so (dvmMarkCard(void const*)+5)
07-24 13:36:58.055 145 145 I DEBUG : bf93561c 00000000
07-24 13:36:58.055 145 145 I DEBUG : bf935620 a4ffdaf0 /dev/ashmem/dalvik-heap (deleted)
07-24 13:36:58.055 145 145 I DEBUG : bf935624 00000001
07-24 13:36:58.055 145 145 I DEBUG : bf935628 b4d48949 /system/lib/libdvm.so (dvmLockObject+9)
07-24 13:36:58.055 145 145 I DEBUG : bf93562c 9ed5dc00
07-24 13:36:58.055 145 145 I DEBUG : bf935630 9dae4aae /data/dalvik-cache/system#framework#core.jar#classes.dex
07-24 13:36:58.055 145 145 I DEBUG : bf935634 9ed5dc20
07-24 13:36:58.055 145 145 I DEBUG : bf935638 bf9356b8 [stack]
07-24 13:36:58.059 145 145 I DEBUG : bf93563c b4e2c5b9 /system/lib/libdvm.so
07-24 13:36:58.239 522 544 I BootReceiver: Copying /data/tombstones/tombstone_04 to DropBox (SYSTEM_TOMBSTONE)
07-24 13:36:58.255 522 1529 W ActivityManager: Force finishing activity com.gluonapplication2/javafxports.android.FXActivity
07-24 13:36:58.263 196 196 D Zygote : Process 1508 terminated by signal (11)
07-24 13:36:58.559 522 1529 I WindowManager: Screenshot max retries 4 of Token{52a6e170 ActivityRecord{52a6e0c0 u0 com.gluonapplication2/javafxports.android.FXActivity t4 f}} appWin=Window{5298eeac u0 Starting com.gluonapplication2} drawState=4
07-24 13:36:58.559 522 1529 W WindowManager: Screenshot failure taking screenshot for (1080x1920) to layer 21010
07-24 13:36:58.559 522 1529 W ActivityManager: Exception thrown during pause
07-24 13:36:58.559 522 1529 W ActivityManager: android.os.DeadObjectException
07-24 13:36:58.559 522 1529 W ActivityManager: at android.os.BinderProxy.transact(Native Method)
07-24 13:36:58.559 522 1529 W ActivityManager: at android.app.ApplicationThreadProxy.schedulePauseActivity(ApplicationThreadNative.java:660)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:761)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:2443)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityStack.finishTopRunningActivityLocked(ActivityStack.java:2320)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityStackSupervisor.finishTopRunningActivityLocked(ActivityStackSupervisor.java:2050)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:9548)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:9441)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:10086)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:9637)
07-24 13:36:58.559 522 1529 W ActivityManager: at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)
07-24 13:36:58.571 522 533 I ActivityManager: Process com.gluonapplication2 (pid 1508) has died.
07-24 13:36:58.739 719 719 W EGL_genymotion: eglSurfaceAttrib not implemented
07-24 13:36:59.335 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:59.719 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:36:59.959 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:37:00.071 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:37:00.131 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:37:00.191 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:37:02.507 522 567 D MobileDataStateTracker: default: setPolicyDataEnable(enabled=true)
07-24 13:37:02.599 522 534 W InputMethodManagerService: Window already focused, ignoring focus gain of: com.android.internal.view.IInputMethodClient$Stub$Proxy#52c2b9f8 attribute=null, token = android.os.BinderProxy#528d3c9c
07-24 13:37:02.615 197 246 W genymotion_audio: out_write() limiting sleep time 115736 to 39909
07-24 13:37:02.655 197 246 W genymotion_audio: out_write() limiting sleep time 101655 to 39909
07-24 13:37:02.695 197 246 W genymotion_audio: out_write() limiting sleep time 87596 to 39909
07-24 13:37:02.735 197 246 W genymotion_audio: out_write() limiting sleep time 77324 to 39909
07-24 13:37:02.779 197 246 W genymotion_audio: out_write() limiting sleep time 62335 to 39909
07-24 13:37:02.819 197 246 W genymotion_audio: out_write() limiting sleep time 46485 to 39909
07-24 13:37:02.891 197 246 W genymotion_audio: out_write() limiting sleep time 51201 to 39909
Can you help me with this? I am a total newbie with JavaFX Ports.
I've installed Genymotion and the ARM translation, and I can reproduce the error.
If you set Genymotion -> Settings -> ADB to 'Use custom Android SDK tools' and provide the path to your Android SDK, you can open a console window, go to Android/sdk/platform-tools and, with the emulator running, type: adb logcat -v threadtime.
You will see the exception when launching the Gluon application:
07-23 07:36:17.191 1713 1713 E AndroidRuntime: java.lang.UnsatisfiedLinkError: dlopen failed: "/data/app/com.gluonhq.fiftystates-1/lib/arm/libactivity.so" has unexpected e_machine: 40
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at java.lang.Runtime.loadLibrary(Runtime.java:372)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at java.lang.System.loadLibrary(System.java:1076)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at javafxports.android.FXActivity.<clinit>(FXActivity.java:116)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at java.lang.Class.newInstance(Native Method)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.Instrumentation.newActivity(Instrumentation.java:1067)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2317)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2476)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.ActivityThread.-wrap11(ActivityThread.java)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1344)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.os.Handler.dispatchMessage(Handler.java:102)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.os.Looper.loop(Looper.java:148)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at android.app.ActivityThread.main(ActivityThread.java:5417)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at java.lang.reflect.Method.invoke(Native Method)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
07-23 07:36:17.191 1713 1713 E AndroidRuntime: at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
This error (UnsatisfiedLinkError: dlopen failed: "/data/app/com.gluonhq.fiftystates-1/lib/arm/libactivity.so" has unexpected e_machine: 40) has been reported several times as a bug in the ARM translation.
Basically it means there are no x86 libraries, and the translation of the ARM libraries has failed.
If you want to run the app in an emulator, you may try BlueStacks. Once you have installed it, double-click your APK and it will be available in BlueStacks; then you can run it.
If the app crashes for any reason, you can run adb logcat there as well to find out about any exception. I've noticed that apps that work fine on a phone fail easily on emulators,
so the best solution is running and testing your apps on a real device.
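As background to the e_machine: 40 value in the error above: that number is the machine field of the shared library's ELF header, and 40 is EM_ARM, while an x86 image needs EM_386 (3). `readelf -h libactivity.so` reports the same field as "Machine"; a small sketch that reads it directly, assuming a little-endian ELF (which Android .so files are):

```python
import struct

# e_machine is the 16-bit field at offset 0x12 of an ELF header.
NAMES = {3: "x86", 40: "ARM", 62: "x86-64", 183: "ARM64"}

def elf_machine(path):
    """Return the CPU architecture a shared library was built for."""
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    (machine,) = struct.unpack_from("<H", header, 0x12)  # little-endian uint16
    return NAMES.get(machine, "unknown (%d)" % machine)

# e.g. elf_machine(".../lib/arm/libactivity.so") reports "ARM", which an x86
# emulator cannot load unless the ARM translation actually works.
```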

PIG creates file on Hadoop but cannot write to it

I am learning Hadoop and created a simple Pig script.
Reading a file works, but writing to another file does not.
My script runs fine; the DUMP f command shows me 10 records, as expected. But when I store the same relation to a file (store f into 'result.csv';), there are some odd messages on the console, and in the end I have a result file with only the first 3 records.
My questions are:
What's the matter with the IOException, when reading worked and writing worked at least partly?
Why does the console tell me Total records written : 0, when actually 3 records have been written?
Why didn't it store the 10 records, as expected?
My script (it's just some sandbox playing):
cd /user/samples
c = load 'crimes.csv' using PigStorage(',')
as (ID:int,Case_Number:int,Date:chararray,Block:chararray,IUCR:chararray,Primary_Type,Description,LocationDescription,Arrest:boolean,Domestic,Beat,District,Ward,CommunityArea,FBICode,XCoordinate,YCoordinate,Year,UpdatedOn,Latitude,Longitude,Location);
c = LIMIT c 1000;
t = foreach c generate ID, Date, Arrest, Year;
f = FILTER t by Arrest==true;
f = LIMIT f 10;
dump f;
store f into 'result.csv';
Part of the console output:
2016-07-21 15:55:07,435 [main] INFO org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:10020. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2016-07-21 15:55:07,537 [main] WARN org.apache.pig.tools.pigstats.mapreduce.MRJobStats - Unable to get job counters
java.io.IOException: java.io.IOException: java.net.ConnectException: Call From m1.hdp2/192.168.178.201 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.getCounters(HadoopShims.java:132)
at org.apache.pig.tools.pigstats.mapreduce.MRJobStats.addCounters(MRJobStats.java:284)
at org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil.addSuccessJobStats(MRPigStatsUtil.java:235)
at org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil.accumulateStats(MRPigStatsUtil.java:165)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:360)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:308)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1474)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1459)
at org.apache.pig.PigServer.execute(PigServer.java:1448)
at org.apache.pig.PigServer.access$500(PigServer.java:118)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1773)
at org.apache.pig.PigServer.registerQuery(PigServer.java:707)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1075)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:176)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: java.net.ConnectException: Call From m1.hdp2/192.168.178.201 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:343)
at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:428)
at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:572)
at org.apache.hadoop.mapreduce.Cluster.getJob(Cluster.java:184)
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.getCounters(HadoopShims.java:126)
... 24 more
Caused by: java.net.ConnectException: Call From m1.hdp2/192.168.178.201 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.GeneratedConstructorAccessor18.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy14.getJobReport(Unknown Source)
at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getJobReport(MRClientProtocolPBClientImpl.java:133)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:324)
... 28 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 36 more
2016-07-21 15:55:07,540 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2016-07-21 15:55:07,571 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.7.2 0.16.0 hadoop 2016-07-21 15:50:17 2016-07-21 15:55:07 FILTER,LIMIT
Success!
Job Stats (time in seconds):
JobId Maps Reduces MaxMapTime MinMapTime AvgMapTime MedianMapTime MaxReduceTime MinReduceTime AvgReduceTime MedianReducetime Alias Feature Outputs
job_1469130571595_0001 3 1 n/a n/a n/a n/a n/a n/a n/a n/a c
job_1469130571595_0002 1 1 n/a n/a n/a n/a n/a n/a n/a n/a c,f,t hdfs://localhost:9000/user/samples/result.csv,
Input(s):
Successfully read 0 records from: "hdfs://localhost:9000/user/samples/crimes.csv"
Output(s):
Successfully stored 0 records in: "hdfs://localhost:9000/user/samples/result.csv"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_1469130571595_0001 -> job_1469130571595_0002,
job_1469130571595_0002
2016-07-21 15:55:07,573 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at /0.0.0.0:8032
2016-07-21 15:55:07,585 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate - Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
2016-07-21 15:55:08,592 [main] INFO org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:10020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
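The repeated retries to 0.0.0.0:10020 in the output above are Pig trying to reach the MapReduce job history server after the job has finished. When that server is not running, Pig cannot fetch the job counters, which is why the statistics report 0 records read and written even though the job itself succeeded and produced output. A minimal sketch of the client-side setting, assuming a single-node setup where the history server runs locally (the host/port values are illustrative):

```xml
<!-- mapred-site.xml: where clients look for the job history server.
     The default, 0.0.0.0:10020, is exactly what Pig was retrying above. -->
<property>
  <name>mapreduce.jobhistory.address</name>
  <value>localhost:10020</value>
</property>
```

The history server itself is started with mr-jobhistory-daemon.sh start historyserver from Hadoop's sbin directory. Note also that store f into 'result.csv'; creates a directory named result.csv on HDFS; the records land in the part-* files inside it, so check all of them before concluding that rows are missing.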

Unable to access Sonar through web

I have installed Sonar on a RHEL 6.3 64-bit machine, but when trying to access the application through the web using "http://10.217.14.40:13385/sonar" I get HTTP Error 503 Service Unavailable.
Following are the last 100 lines from the log:
[root@RHEL-6 logs]# tail -100 sonar.log
INFO | jvm 1 | 2012/09/25 07:58:21 | 07:58:21,136 |-INFO in ch.qos.logback.classic.joran.action.LevelAction - ROOT level set to INFO
INFO | jvm 1 | 2012/09/25 07:58:21 | 07:58:21,136 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [SONAR_FILE] to Logger[ROOT]
INFO | jvm 1 | 2012/09/25 07:58:21 | 07:58:21,136 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator#77435978 - Registering current configuration as safe fallback point
INFO | jvm 1 | 2012/09/25 07:58:21 |
2012.09.25 07:58:21 INFO o.s.s.p.ServerImpl Sonar Server / 3.2 / d9303b2d9d4c1e75f8536e4144028f1999f727f4
2012.09.25 07:58:21 INFO o.s.s.d.EmbeddedDatabase Starting embedded database on port 9092 with url jdbc:h2:tcp://10.217.14.40:13384/sonar
2012.09.25 07:58:21 ERROR o.s.s.p.Platform Unable to start database
org.sonar.api.utils.SonarException: Unable to start database
at org.sonar.server.database.EmbeddedDatabase.start(EmbeddedDatabase.java:75) ~[classes/:na]
at org.sonar.server.database.EmbeddedDatabaseFactory.start(EmbeddedDatabaseFactory.java:41) ~[classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_07]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_07]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_07]
at java.lang.reflect.Method.invoke(Method.java:601) ~[na:1.7.0_07]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.behaviors.Stored.start(Stored.java:110) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1009) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1002) ~[picocontainer-2.14.1.jar:na]
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:760) ~[picocontainer-2.14.1.jar:na]
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:70) ~[sonar-plugin-api-3.2.jar:na]
at org.sonar.server.platform.Platform.startDatabaseConnectors(Platform.java:166) ~[classes/:na]
at org.sonar.server.platform.Platform.init(Platform.java:114) ~[classes/:na]
at org.sonar.server.platform.PlatformLifecycleListener.contextInitialized(PlatformLifecycleListener.java:33) [classes/:na]
at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:548) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.jetty.servlet.Context.startContext(Context.java:136) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1272) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:517) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:489) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.25.jar:6.1.25]
at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.jetty.Server.doStart(Server.java:224) [jetty-6.1.25.jar:6.1.25]
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50) [jetty-util-6.1.25.jar:6.1.25]
at org.sonar.application.JettyEmbedder.start(JettyEmbedder.java:72) [sonar-application-3.2.jar:na]
at org.sonar.application.StartServer.main(StartServer.java:48) [sonar-application-3.2.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_07]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_07]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_07]
at java.lang.reflect.Method.invoke(Method.java:601) ~[na:1.7.0_07]
at org.tanukisoftware.wrapper.WrapperSimpleApp.run(WrapperSimpleApp.java:240) [wrapper-3.2.3.jar:3.2.3]
at java.lang.Thread.run(Thread.java:722) [na:1.7.0_07]
Caused by: org.h2.jdbc.JdbcSQLException: Exception opening port "9092" (port may be in use), cause: "java.net.BindException: Address already in use" [90061-167]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:329) ~[h2-1.3.167.jar:1.3.167]
at org.h2.message.DbException.get(DbException.java:158) ~[h2-1.3.167.jar:1.3.167]
at org.h2.util.NetUtils.createServerSocketTry(NetUtils.java:190) ~[h2-1.3.167.jar:1.3.167]
at org.h2.util.NetUtils.createServerSocket(NetUtils.java:156) ~[h2-1.3.167.jar:1.3.167]
at org.h2.server.TcpServer.start(TcpServer.java:222) ~[h2-1.3.167.jar:1.3.167]
at org.h2.tools.Server.start(Server.java:455) ~[h2-1.3.167.jar:1.3.167]
at org.sonar.server.database.EmbeddedDatabase.start(EmbeddedDatabase.java:71) ~[classes/:na]
... 35 common frames omitted
Caused by: java.net.BindException: Address already in use
at java.net.PlainSocketImpl.socketBind(Native Method) ~[na:1.7.0_07]
at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:376) ~[na:1.7.0_07]
at java.net.ServerSocket.bind(ServerSocket.java:376) ~[na:1.7.0_07]
at java.net.ServerSocket.<init>(ServerSocket.java:237) ~[na:1.7.0_07]
at java.net.ServerSocket.<init>(ServerSocket.java:128) ~[na:1.7.0_07]
at org.h2.util.NetUtils.createServerSocketTry(NetUtils.java:186) ~[h2-1.3.167.jar:1.3.167]
... 39 common frames omitted
INFO | jvm 1 | 2012/09/25 07:58:21 | 2012-09-25 07:58:21.716:WARN::Failed startup of context org.mortbay.jetty.webapp.WebAppContext#1778db3{/,file:/home/BuildTools/sonar-3.2/war/sonar-server}
INFO | jvm 1 | 2012/09/25 07:58:21 | org.sonar.api.utils.SonarException: Unable to start database
INFO | jvm 1 | 2012/09/25 07:58:21 | at org.sonar.server.database.EmbeddedDatabase.start(EmbeddedDatabase.java:75)
INFO | jvm 1 | 2012/09/25 07:58:21 | at org.sonar.server.database.EmbeddedDatabaseFactory.start(EmbeddedDatabaseFactory.java:41)
INFO | jvm 1 | 2012/09/25 07:58:21 | ... (remaining frames identical to the stack trace above)
INFO | jvm 1 | 2012/09/25 07:58:21 | 2012-09-25 07:58:21.737:INFO::Started SelectChannelConnector#10.217.14.40:13385
As you can read in your log, Sonar fails to start because the embedded H2 database cannot be started. It cannot start because port 9092 is already in use:
Exception opening port "9092" (port may be in use), cause: "java.net.BindException: Address already in use"
This means that another process is already listening on that port, most likely another Sonar instance that is still running. Stop that process (or configure Sonar to use a different port) and then start Sonar again.
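Before restarting, you can verify the conflict yourself. The sketch below (a hypothetical helper, not part of Sonar) simply tries to bind the port: if the bind fails with the same `BindException`, something else is still holding it.

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Returns true if we can bind the given TCP port,
    // i.e. no other process is currently listening on it.
    static boolean isPortFree(int port) {
        try (ServerSocket socket = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            // Same failure mode as in the log: "Address already in use"
            return false;
        }
    }

    public static void main(String[] args) {
        int port = 9092; // H2's TCP port from the log above
        System.out.println("Port " + port
                + (isPortFree(port) ? " is free" : " is already in use"));
    }
}
```

On Linux you can get the same answer (plus the owning PID) with `lsof -i :9092` or `netstat -tlnp | grep 9092`.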
