How to solve org.infinispan.commons.CacheException?

I have seen the solution specified in the link below, but it does not resolve my error.
Infinispan File Cache Store
This is the message in my stack trace:
org.infinispan.commons.CacheException:
Unable to invoke method public void org.infinispan.persistence.manager.PersistenceManagerImpl.start() on object of type PersistenceManagerImpl
Caused by:
org.infinispan.commons.CacheException:
Unable to start cache loaders
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
at java.base/java.lang.reflect.Method.invoke(Method.java:564) ~[na:na]
at org.infinispan.commons.util.SecurityActions.lambda$invokeAccessibly$0(SecurityActions.java:79) ~[infinispan-commons-9.3.5.Final.jar:9.3.5.Final]
... 190 common frames omitted
Please suggest a better solution. Thanks in advance.

You skipped the interesting part of the exception. However, I suspect this is related to Java 11. Please upgrade to Infinispan 9.4.8.Final and try again.
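If the upgrade alone does not settle it, a minimal smoke test of the file store helps isolate the failure. The sketch below uses Infinispan's embedded Java API and is an assumption on my part, not your configuration; the cache name and store location are placeholders:

import org.infinispan.Cache;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.manager.DefaultCacheManager;

public class FileStoreSmokeTest {
    public static void main(String[] args) {
        // Single-file cache store; the location is a placeholder path.
        ConfigurationBuilder builder = new ConfigurationBuilder();
        builder.persistence()
               .addSingleFileStore()
               .location("/tmp/infinispan-store");

        try (DefaultCacheManager manager = new DefaultCacheManager()) {
            manager.defineConfiguration("fileCache", builder.build());
            Cache<String, String> cache = manager.getCache("fileCache");
            cache.put("k", "v");
            // If PersistenceManagerImpl.start() succeeds, this prints "v".
            System.out.println(cache.get("k"));
        }
    }
}

If this fails the same way on 9.3.5.Final under Java 11 but passes on 9.4.8.Final, the version bump is your fix.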

Related

How to prevent Apache FOP 2.8 from sporadically throwing java.lang.NoSuchFieldError: RAW_PDF when rendering a PDF?

While printing a PDF with Apache FOP 2.8, it sporadically throws an exception:
Caused by: java.lang.NoSuchFieldError: RAW_PDF
at org.apache.fop.render.afp.AFPImageHandlerRawStream.<clinit>(AFPImageHandlerRawStream.java:43) ~[fop-2.8.jar:2.8]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:na]
Does anyone know why this happens or how to resolve this issue?

Error in semantic analysis when trying to compile Groovy class

I have a Maven project in which I want to write and run Groovy classes.
I added the respective dependencies, but when I try to run my class, this error pops up.
The class in question is an abstract class.
Groovyc: While compiling [<Module name>]: BUG! exception in phase 'semantic analysis' in source unit '<File path>' Unsupported class file major version 61
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:969)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:642)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:591)
at org.jetbrains.groovy.compiler.rt.GroovyCompilerWrapper.compile(GroovyCompilerWrapper.java:48)
at org.jetbrains.groovy.compiler.rt.DependentGroovycRunner.runGroovyc(DependentGroovycRunner.java:118)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.jetbrains.groovy.compiler.rt.GroovycRunner.intMain2(GroovycRunner.java:80)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.jetbrains.jps.incremental.groovy.InProcessGroovyc.runGroovycInThisProcess(InProcessGroovyc.java:167)
at org.jetbrains.jps.incremental.groovy.InProcessGroovyc.lambda$runGroovyc$0(InProcessGroovyc.java:77)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 61
at groovyjarjarasm.asm.ClassReader.<init>(ClassReader.java:196)
at groovyjarjarasm.asm.ClassReader.<init>(ClassReader.java:177)
at groovyjarjarasm.asm.ClassReader.<init>(ClassReader.java:163)
at groovyjarjarasm.asm.ClassReader.<init>(ClassReader.java:284)
at org.codehaus.groovy.ast.decompiled.AsmDecompiler.parseClass(AsmDecompiler.java:81)
at org.codehaus.groovy.control.ClassNodeResolver.findDecompiled(ClassNodeResolver.java:251)
at org.codehaus.groovy.control.ClassNodeResolver.tryAsLoaderClassOrScript(ClassNodeResolver.java:189)
at org.codehaus.groovy.control.ClassNodeResolver.findClassNode(ClassNodeResolver.java:169)
at org.codehaus.groovy.control.ClassNodeResolver.resolveName(ClassNodeResolver.java:125)
at org.codehaus.groovy.control.ResolveVisitor.resolveToOuter(ResolveVisitor.java:853)
at org.codehaus.groovy.control.ResolveVisitor.resolve(ResolveVisitor.java:467)
at org.codehaus.groovy.control.ResolveVisitor.visitClass(ResolveVisitor.java:1422)
at org.codehaus.groovy.control.ResolveVisitor.startResolving(ResolveVisitor.java:230)
at org.codehaus.groovy.control.CompilationUnit$13.call(CompilationUnit.java:700)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:965)
... 19 more

hadoop "Can not create a Path from an empty string"

I am a beginner with Hadoop.
I have been getting the following error.
Exception in thread "main" java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:172)
at org.apache.hadoop.fs.Path.<init>(Path.java:184)
at org.apache.hadoop.util.StringUtils.stringToPath(StringUtils.java:254)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:499)
at org.apache.mahout.text.SequenceFilesFromDirectory.runMapReduce(SequenceFilesFromDirectory.java:172)
at org.apache.mahout.text.SequenceFilesFromDirectory.run(SequenceFilesFromDirectory.java:90)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.mahout.text.SequenceFilesFromDirectory.main(SequenceFilesFromDirectory.java:64)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
This is the command I ran:
"username"#lena:~/Coursework$ mahout seqdirectory -i docs -o docs-seqfiles -c UTF-8 -chunk 5
I found some similar questions, but nothing worked in my case.
I hope somebody can find a solution. Thank you!
I suspect that your files are on the local file system. Two things should help:
Put the folder/files on HDFS and set -i [HDFS path] -o [HDFS path]
Set MAHOUT_LOCAL
See: mahout seqdirectory fails to read input file
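For what it is worth, the error itself comes from Hadoop's Path constructor, which rejects empty strings, so one of the paths seqdirectory resolves is ending up empty. A standalone reproduction (a hypothetical sketch, not Mahout's code) that throws the same exception:

import org.apache.hadoop.fs.Path;

public class EmptyPathRepro {
    public static void main(String[] args) {
        // A non-empty path is fine:
        System.out.println(new Path("docs"));
        // An empty one throws the exact exception from the trace:
        // java.lang.IllegalArgumentException: Can not create a Path from an empty string
        new Path("");
    }
}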

CDAP source plugin to read data from an SFTP server

I want to read a CSV file that is available on an SFTP server using a CDAP source plugin.
I came across the FTP Batch Source plugin, which does this, but when running it I get the exception below.
Caused by: java.io.IOException: No FileSystem for scheme: sftp
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2798) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2809) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389) ~[org.apache.hadoop.hadoop-common-2.8.0.jar:na]
at co.cask.hydrator.format.plugin.AbstractFileSource.prepareRun(AbstractFileSource.java:129) ~[na:na]
at co.cask.hydrator.format.plugin.AbstractFileSource.prepareRun(AbstractFileSource.java:63) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource$1.call(WrappedBatchSource.java:53) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource$1.call(WrappedBatchSource.java:50) ~[na:na]
at co.cask.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[na:na]
at co.cask.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:50) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:36) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource$1.call(WrappedBatchSource.java:53) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource$1.call(WrappedBatchSource.java:50) ~[na:na]
at co.cask.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[na:na]
at co.cask.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:50) ~[na:na]
at co.cask.cdap.etl.common.plugin.WrappedBatchSource.prepareRun(WrappedBatchSource.java:36) ~[na:na]
at co.cask.cdap.etl.common.submit.SubmitterPlugin$3.run(SubmitterPlugin.java:83) ~[na:na]
at co.cask.cdap.internal.app.runtime.AbstractContext$2.run(AbstractContext.java:534) ~[na:na]
at co.cask.cdap.data2.transaction.Transactions$CacheBasedTransactional.finishExecute(Transactions.java:224) ~[na:na]
... 18 common frames omitted
I am using the library versions below, which is also a restriction:
Hadoop - 2.7.3
Spark - 2.3.0
I also came across this question, which suggests that using this and setting the property fs.sftp.impl to org.apache.hadoop.fs.sftp.SFTPFileSystem will solve the issue, but I am not sure how to use the above code and set this property.
You need to set a file system property under the Advanced section when using SFTP as the protocol:
{
  "fs.sftp.impl": "org.apache.hadoop.fs.sftp.SFTPFileSystem"
}
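For context, this is roughly what the property means at the Hadoop layer. The sketch below uses the plain FileSystem API outside CDAP and is an assumption on my part; the host, credentials, and path are placeholders. Note that org.apache.hadoop.fs.sftp.SFTPFileSystem ships with Hadoop 2.8+ as far as I know, so it must be on the classpath if your cluster is pinned to 2.7.3 (the trace above shows the plugin bundling hadoop-common-2.8.0):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SftpFsDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The same mapping the plugin's "file system properties" field applies:
        conf.set("fs.sftp.impl", "org.apache.hadoop.fs.sftp.SFTPFileSystem");

        // Placeholder URI; real pipelines pass host and credentials via plugin config.
        FileSystem fs = FileSystem.get(
                URI.create("sftp://user:password@sftp.example.com"), conf);
        System.out.println(fs.exists(new Path("/data/input.csv")));
    }
}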
To my understanding, the FTP plugin is deprecated. Use the SFTP Action plugin instead and build a pipeline like so:
The idea is to first copy the file to the local file system of the runtime environment and then sink the file where you wish (GCS in my case).

Error while connecting to Phoenix ERROR 103 (08004): Unable to establish connection

I am trying to establish a connection to Phoenix using DBVisualizer, but I am getting the error below.
I followed the steps given here:
https://community.hortonworks.com/articles/19016/connect-to-phoenix-hbase-using-dbvisualizer.html
After that, I get the error below:
ERROR 103 (08004): Unable to establish connection
When I checked the DBVisualizer error log, I saw the following:
2019-07-22 12:53:18.890 INFO 903 [ExecutorRunner-pool-3-thread-1 - H.Ĵ] Exception while connecting Hbase
java.sql.SQLException: ERROR 103 (08004): Unable to establish connection.
at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:422)
at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:392)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$300(ConnectionQueryServicesImpl.java:211)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2269)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.onseven.dbvis.h.B.D.ā(Z:1548)
at com.onseven.dbvis.h.B.F$A.call(Z:1369)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:421)
at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:330)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:144)
at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:390)
... 18 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
... 23 more
Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.RpcClientImpl
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:54)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:34)
at org.apache.hadoop.hbase.ipc.RpcClientFactory.createClient(RpcClientFactory.java:64)
at org.apache.hadoop.hbase.ipc.RpcClientFactory.createClient(RpcClientFactory.java:48)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:638)
... 28 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:46)
... 32 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.ClassSize
at org.apache.hadoop.hbase.ipc.IPCUtil.<init>(IPCUtil.java:74)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.<init>(AbstractRpcClient.java:95)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.<init>(RpcClientImpl.java:1092)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.<init>(RpcClientImpl.java:1118)
I tried using IntelliJ and it works fine.
Add all JARs that are required by the Phoenix driver and restart the tool.
These are the JARs I added for IntelliJ:
hbase-annotations
hbase-common
hbase-client
phoenix-hbase-client
I added them to the IntelliJ driver configuration and restarted it.
Make sure you are using the proper JDK version when connecting to Phoenix through the tool.
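Once the JARs are in place, a bare JDBC check outside any GUI tool confirms the driver and classpath are healthy. A minimal sketch; the ZooKeeper quorum host, port, and znode are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class PhoenixSmokeTest {
    public static void main(String[] args) throws Exception {
        // Driver class taken from the stack trace above.
        Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        // URL format: jdbc:phoenix:<zookeeper quorum>:<port>:<root znode>
        try (Connection conn = DriverManager.getConnection(
                "jdbc:phoenix:zk-host.example.com:2181:/hbase")) {
            System.out.println("Connected to " + conn.getMetaData().getDatabaseProductName());
        }
    }
}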
