This is the output of logstash -f logstash.conf. I don't know what to do.
Using JAVA_HOME defined java: C:\Program Files\Java\jdk-17.0.2
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
2022-02-09T18:48:30.825+03:30 [main] WARN FilenoUtil : Native subprocess control requires open access to the JDK IO subsystem
Pass '--add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.io=ALL-UNNAMED' to enable.
Sending Logstash logs to C:/logstash-7.12.0/logs which is now configured via log4j2.properties
[2022-02-09T18:48:44,793][INFO ][logstash.runner ] Log4j
configuration path used is:
C:\logstash-7.12.0\config\log4j2.properties
[2022-02-09T18:48:44,800][INFO ][logstash.runner ] Starting
Logstash {"logstash.version"=>"7.12.0", "jruby.version"=>"jruby
9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 17.0.2+8-LTS-86 on 17.0.2+8-LTS-86 +indy +jit [mswin32-x86_64]"}
[2022-02-09T18:48:44,879][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-02-09T18:48:45,593][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2022-02-09T18:48:46,107][INFO ][org.reflections.Reflections]
Reflections took 22 ms to scan 1 urls, producing 23 keys and 47 values
[2022-02-09T18:48:46,484][ERROR][logstash.plugins.registry] Tried to
load a plugin's code, but failed. {:exception=>#<LoadError: Could not
load FFI Provider: (NotImplementedError) FFI not available:
java.lang.UnsatisfiedLinkError: could not locate stub library in jar
file. Tried [jni/x86_64-Windows/jffi-1.2.dll,
/jni/x86_64-Windows/jffi-1.2.dll]
at com.kenai.jffi.internal.StubLoader.getStubLibraryStream(StubLoader.java:450)
at com.kenai.jffi.internal.StubLoader.loadFromJar(StubLoader.java:375)
at com.kenai.jffi.internal.StubLoader.load(StubLoader.java:278)
at com.kenai.jffi.internal.StubLoader.(StubLoader.java:487)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:467)
at com.kenai.jffi.Init.load(Init.java:68)
at com.kenai.jffi.Foreign$InstanceHolder.getInstanceHolder(Foreign.java:49)
at com.kenai.jffi.Foreign$InstanceHolder.(Foreign.java:45)
at com.kenai.jffi.Foreign.getInstance(Foreign.java:103)
at com.kenai.jffi.Type$Builtin.lookupTypeInfo(Type.java:242)
at com.kenai.jffi.Type$Builtin.getTypeInfo(Type.java:237)
at com.kenai.jffi.Type.resolveSize(Type.java:155)
at com.kenai.jffi.Type.size(Type.java:138)
at jnr.ffi.provider.jffi.NativeRuntime$TypeDelegate.size(NativeRuntime.java:178)
at jnr.ffi.provider.AbstractRuntime.(AbstractRuntime.java:48)
at jnr.ffi.provider.jffi.NativeRuntime.(NativeRuntime.java:57)
at jnr.ffi.provider.jffi.NativeRuntime.(NativeRuntime.java:41)
at jnr.ffi.provider.jffi.NativeRuntime$SingletonHolder.(NativeRuntime.java:53)
at jnr.ffi.provider.jffi.NativeRuntime.getInstance(NativeRuntime.java:49)
at jnr.ffi.provider.jffi.Provider.(Provider.java:29)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
at java.base/java.lang.Class.newInstance(Class.java:645)
at jnr.ffi.provider.FFIProvider$SystemProviderSingletonHolder.getInstance(FFIProvider.java:68)
at jnr.ffi.provider.FFIProvider$SystemProviderSingletonHolder.(FFIProvider.java:57)
at jnr.ffi.provider.FFIProvider.getSystemProvider(FFIProvider.java:35)
at jnr.ffi.Library.loadLibrary(Library.java:114)
at jnr.posix.POSIXFactory$DefaultLibCProvider$SingletonHolder.(POSIXFactory.java:289)
at jnr.posix.POSIXFactory$DefaultLibCProvider.getLibC(POSIXFactory.java:318)
at jnr.posix.BaseNativePOSIX.(BaseNativePOSIX.java:38)
at jnr.posix.WindowsPOSIX.(WindowsPOSIX.java:134)
at jnr.posix.POSIXFactory.loadWindowsPOSIX(POSIXFactory.java:173)
at jnr.posix.POSIXFactory.loadNativePOSIX(POSIXFactory.java:142)
at jnr.posix.POSIXFactory.loadPOSIX(POSIXFactory.java:93)
at jnr.posix.LazyPOSIX.loadPOSIX(LazyPOSIX.java:38)
at jnr.posix.LazyPOSIX.posix(LazyPOSIX.java:32)
at jnr.posix.LazyPOSIX.isNative(LazyPOSIX.java:402)
at org.jruby.util.io.FilenoUtil.(FilenoUtil.java:42)
at org.jruby.Ruby.(Ruby.java:294)
at org.jruby.Ruby.newInstance(Ruby.java:706)
at org.logstash.Logstash.(Logstash.java:162)
at org.logstash.Logstash.main(Logstash.java:69)
See http://jira.codehaus.org/browse/JRUBY-4583>,
:path=>"logstash/inputs/file", :type=>"input", :name=>"file"}
[2022-02-09T18:48:46,494][ERROR][logstash.agent ] Failed to
execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable
to configure plugins: (PluginLoadingError) Couldn't find any input
plugin named 'file'. Are you sure this is correct? Trying to load the
file input plugin resulted in this error: Could not load FFI Provider:
(NotImplementedError) FFI not available:
java.lang.UnsatisfiedLinkError: could not locate stub library in jar
file. Tried [jni/x86_64-Windows/jffi-1.2.dll,
/jni/x86_64-Windows/jffi-1.2.dll]\r\n\tat
com.kenai.jffi.internal.StubLoader.getStubLibraryStream(StubLoader.java:450)\r\n\tat
com.kenai.jffi.internal.StubLoader.loadFromJar(StubLoader.java:375)\r\n\tat
com.kenai.jffi.internal.StubLoader.load(StubLoader.java:278)\r\n\tat
com.kenai.jffi.internal.StubLoader.(StubLoader.java:487)\r\n\tat
java.base/java.lang.Class.forName0(Native Method)\r\n\tat
java.base/java.lang.Class.forName(Class.java:467)\r\n\tat
com.kenai.jffi.Init.load(Init.java:68)\r\n\tat
com.kenai.jffi.Foreign$InstanceHolder.getInstanceHolder(Foreign.java:49)\r\n\tat
com.kenai.jffi.Foreign$InstanceHolder.(Foreign.java:45)\r\n\tat
com.kenai.jffi.Foreign.getInstance(Foreign.java:103)\r\n\tat
com.kenai.jffi.Type$Builtin.lookupTypeInfo(Type.java:242)\r\n\tat
com.kenai.jffi.Type$Builtin.getTypeInfo(Type.java:237)\r\n\tat
com.kenai.jffi.Type.resolveSize(Type.java:155)\r\n\tat
com.kenai.jffi.Type.size(Type.java:138)\r\n\tat
jnr.ffi.provider.jffi.NativeRuntime$TypeDelegate.size(NativeRuntime.java:178)\r\n\tat
jnr.ffi.provider.AbstractRuntime.(AbstractRuntime.java:48)\r\n\tat
jnr.ffi.provider.jffi.NativeRuntime.(NativeRuntime.java:57)\r\n\tat
jnr.ffi.provider.jffi.NativeRuntime.(NativeRuntime.java:41)\r\n\tat
jnr.ffi.provider.jffi.NativeRuntime$SingletonHolder.(NativeRuntime.java:53)\r\n\tat
jnr.ffi.provider.jffi.NativeRuntime.getInstance(NativeRuntime.java:49)\r\n\tat
jnr.ffi.provider.jffi.Provider.(Provider.java:29)\r\n\tat
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)\r\n\tat
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)\r\n\tat
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)\r\n\tat
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)\r\n\tat
java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)\r\n\tat
java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)\r\n\tat
java.base/java.lang.Class.newInstance(Class.java:645)\r\n\tat
jnr.ffi.provider.FFIProvider$SystemProviderSingletonHolder.getInstance(FFIProvider.java:68)\r\n\tat
jnr.ffi.provider.FFIProvider$SystemProviderSingletonHolder.(FFIProvider.java:57)\r\n\tat
jnr.ffi.provider.FFIProvider.getSystemProvider(FFIProvider.java:35)\r\n\tat
jnr.ffi.Library.loadLibrary(Library.java:114)\r\n\tat
jnr.posix.POSIXFactory$DefaultLibCProvider$SingletonHolder.(POSIXFactory.java:289)\r\n\tat
jnr.posix.POSIXFactory$DefaultLibCProvider.getLibC(POSIXFactory.java:318)\r\n\tat
jnr.posix.BaseNativePOSIX.(BaseNativePOSIX.java:38)\r\n\tat
jnr.posix.WindowsPOSIX.(WindowsPOSIX.java:134)\r\n\tat
jnr.posix.POSIXFactory.loadWindowsPOSIX(POSIXFactory.java:173)\r\n\tat
jnr.posix.POSIXFactory.loadNativePOSIX(POSIXFactory.java:142)\r\n\tat
jnr.posix.POSIXFactory.loadPOSIX(POSIXFactory.java:93)\r\n\tat
jnr.posix.LazyPOSIX.loadPOSIX(LazyPOSIX.java:38)\r\n\tat
jnr.posix.LazyPOSIX.posix(LazyPOSIX.java:32)\r\n\tat
jnr.posix.LazyPOSIX.isNative(LazyPOSIX.java:402)\r\n\tat
org.jruby.util.io.FilenoUtil.(FilenoUtil.java:42)\r\n\tat
org.jruby.Ruby.(Ruby.java:294)\r\n\tat
org.jruby.Ruby.newInstance(Ruby.java:706)\r\n\tat
org.logstash.Logstash.(Logstash.java:162)\r\n\tat
org.logstash.Logstash.main(Logstash.java:69)\r\n\n See
http://jira.codehaus.org/browse/JRUBY-4583",
:backtrace=>["org.logstash.config.ir.CompiledPipeline.(CompiledPipeline.java:119)",
"org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:83)",
"org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)",
"org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)",
"org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)",
"org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1156)",
"org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)",
"C_3a_.logstash_minus_7_dot_12_dot_0.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(C:/logstash-7.12.0/logstash-core/lib/logstash/java_pipeline.rb:47)",
"org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)",
"org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)",
"org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)",
"org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)",
"org.jruby.RubyClass.newInstance(RubyClass.java:939)",
"org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)",
"org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)",
"C_3a_.logstash_minus_7_dot_12_dot_0.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(C:/logstash-7.12.0/logstash-core/lib/logstash/pipeline_action/create.rb:52)",
"C_3a_.logstash_minus_7_dot_12_dot_0.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$VARARGS(C:/logstash-7.12.0/logstash-core/lib/logstash/pipeline_action/create.rb)",
"org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)",
"org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)",
"org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)",
"C_3a_.logstash_minus_7_dot_12_dot_0.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(C:/logstash-7.12.0/logstash-core/lib/logstash/agent.rb:389)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)",
"org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)",
"org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)",
"org.jruby.runtime.Block.call(Block.java:139)",
"org.jruby.RubyProc.call(RubyProc.java:318)",
"org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)",
"java.base/java.lang.Thread.run(Thread.java:833)"]} warning: thread
"Converge PipelineAction::Create" terminated with exception
(report_on_exception is true): LogStash::Error: Don't know how to
handle Java::JavaLang::IllegalStateException for
PipelineAction::Create<main>
create at org/logstash/execution/ConvergeResultExt.java:129
add at org/logstash/execution/ConvergeResultExt.java:57 converge_state at
C:/logstash-7.12.0/logstash-core/lib/logstash/agent.rb:402
[2022-02-09T18:48:46,509][ERROR][logstash.agent ] An
exception happened when converging configuration
{:exception=>LogStash::Error, :message=>"Don't know how to handle
Java::JavaLang::IllegalStateException for
PipelineAction::Create<main>"}
[2022-02-09T18:48:46,519][FATAL][logstash.runner ] An
unexpected error occurred! {:error=>#<LogStash::Error: Don't know how
to handle Java::JavaLang::IllegalStateException for
PipelineAction::Create<main>>,
:backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in
create'", "org/logstash/execution/ConvergeResultExt.java:57:in add'", "C:/logstash-7.12.0/logstash-core/lib/logstash/agent.rb:402:in
`block in converge_state'"]}
[2022-02-09T18:48:46,527][FATAL][org.logstash.Logstash ] Logstash
stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
at C_3a_.logstash_minus_7_dot_12_dot_0.lib.bootstrap.environment.(C:\logstash-7.12.0\lib\bootstrap\environment.rb:89)
~[?:?]
I found out that I had to use Java 8 or 11 instead of 17, since Logstash 7.12 does not support Java 17.
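The startup log also warns that JAVA_HOME is overriding Logstash's bundled JDK. As a rough sketch (Windows cmd, assuming the C:\logstash-7.12.0 install path shown in the log), clearing JAVA_HOME for the current session should make Logstash fall back to its bundled JDK:
set JAVA_HOME=
C:\logstash-7.12.0\bin\logstash.bat -f logstash.conf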
I am going to install the NetFlow module.
Here is a document for it.
My logstash.yml settings are as follows:
modules:
  - name: netflow
    var.input.udp.port: 9996
I've run this command:
/usr/share/logstash/bin/logstash --modules netflow -M netflow.var.input.udp.port=9996
I've got the following error:
JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2022-02-15 23:44:29.148 [main] runner - Starting Logstash {"logstash.version"=>"7.17.0", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.13+8 on 11.0.13+8 +indy +jit [linux-x86_64]"}
[INFO ] 2022-02-15 23:44:29.163 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
Your settings are invalid. Reason: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.
[FATAL] 2022-02-15 23:44:29.208 [main] Logstash - Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.20.1.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.20.1.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:94) ~[?:?]
Is there a solution for it?
The error message states the following:
Your settings are invalid. Reason: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.
So you simply need to make the /usr/share/logstash/data folder writable by the logstash user.
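For example (a sketch, assuming Logstash runs as the logstash user; adjust the user and group if yours differ):
sudo chown -R logstash:logstash /usr/share/logstash/data
sudo chmod -R u+w /usr/share/logstash/data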
I have installed NiFi using Homebrew following the instructions on this page.
When I start NiFi using
nifi start
I get the following:
Java home: /usr/local/opt/openjdk@11/libexec/openjdk.jdk/Contents/Home
NiFi home: /usr/local/Cellar/nifi/1.15.0/libexec
Bootstrap Config File: /usr/local/Cellar/nifi/1.15.0/libexec/conf/bootstrap.conf
Error: Could not find or load main class org.apache.nifi.bootstrap.RunNiFi
Caused by: java.lang.ClassNotFoundException: org.apache.nifi.bootstrap.RunNiFi
I also see this error in the nifi-app.log
2021-12-08 13:06:37,463 ERROR [Write-Ahead Local State Provider Maintenance] o.a.n.c.s.p.l.WriteAheadLocalStateProvider Failed to checkpoint Write-Ahead Log used to stor$
java.io.FileNotFoundException: ./state/local/partition-0/1.journal (No such file or directory)
at java.base/java.io.FileOutputStream.open0(Native Method)
at java.base/java.io.FileOutputStream.open(FileOutputStream.java:298)
at java.base/java.io.FileOutputStream.<init>(FileOutputStream.java:237)
at java.base/java.io.FileOutputStream.<init>(FileOutputStream.java:187)
at org.wali.MinimalLockingWriteAheadLog$Partition.rollover(MinimalLockingWriteAheadLog.java:788)
at org.wali.MinimalLockingWriteAheadLog.checkpoint(MinimalLockingWriteAheadLog.java:534)
at org.apache.nifi.controller.state.providers.local.WriteAheadLocalStateProvider$CheckpointTask.run(WriteAheadLocalStateProvider.java:286)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Any ideas?
I got the same error. Mine was due to port 8443 being occupied. You can either find what is using port 8443, or change the port (nifi.web.https.port=8443 in ./conf/nifi.properties) to something else. I hope this helps if you are still facing the issue.
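For example, on macOS you could first check what is already listening on 8443 and where the property is set (the path below assumes the Homebrew layout shown in the question):
lsof -nP -iTCP:8443 -sTCP:LISTEN
grep nifi.web.https.port /usr/local/Cellar/nifi/1.15.0/libexec/conf/nifi.properties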
On macOS, I was trying to use Payara Server with NetBeans 12 and I got:
Launching Payara Server on Felix platform
INFO: Create bundle provisioner class = class com.sun.enterprise.glassfish.bootstrap.osgi.BundleProvisioner.
Registered com.sun.enterprise.glassfish.bootstrap.osgi.EmbeddedOSGiGlassFishRuntime@462c1ddf in service registry.
#!## LogManagerService.postConstruct : rootFolder=/Users/joseluisbz/Documentos/Java/payara5-2020-4/glassfish
#!## LogManagerService.postConstruct : templateDir=/Users/joseluisbz/Documentos/Java/payara5-2020-4/glassfish/lib/templates
#!## LogManagerService.postConstruct : src=/Users/joseluisbz/Documentos/Java/payara5-2020-4/glassfish/lib/templates/logging.properties
#!## LogManagerService.postConstruct : dest=/Users/joseluisbz/Documentos/Java/payara5-2020-4/glassfish/domains/domain1/config/logging.properties
Running Payara Version: Payara Server 5.2020.4 #badassfish (build 817)|#]
Server log file is using Formatter class: com.sun.enterprise.server.logging.ODLLogFormatter|#]
HV000001: Hibernate Validator 6.1.2.Final|#]
[192.168.0.11]:4900 [development] [3.12.6] Connection[id=1, /192.168.0.11:49587->/192.168.0.11:5900, qualifier=null, endpoint=[192.168.0.11]:5900, alive=false, type=NONE] closed. Reason: Exception in Connection[id=1, /192.168.0.11:49587->/192.168.0.11:5900, qualifier=null, endpoint=[192.168.0.11]:5900, alive=true, type=NONE], thread=hz._hzInstance_1_development.IO.thread-in-0
java.lang.IllegalStateException: Unknown protocol: RFB
at com.hazelcast.nio.tcp.UnifiedProtocolDecoder.onRead(UnifiedProtocolDecoder.java:107)
at com.hazelcast.internal.networking.nio.NioInboundPipeline.process(NioInboundPipeline.java:135)
at com.hazelcast.internal.networking.nio.NioThread.processSelectionKey(NioThread.java:369)
at com.hazelcast.internal.networking.nio.NioThread.processSelectionKeys(NioThread.java:354)
at com.hazelcast.internal.networking.nio.NioThread.selectLoop(NioThread.java:280)
at com.hazelcast.internal.networking.nio.NioThread.run(NioThread.java:235)
|#]
Then, from the console, here is my working directory (I renamed the payara5 directory to payara5-2020-4).
% pwd
.../payara5-2020-4/glassfish/bin
%
In order to fix the first problem:
% ./asadmin set-hazelcast-configuration --enabled=false
Remote server does not listen for requests on [localhost:4848]. Is the server up?
No such local command: set-hazelcast-configuration. Unable to access the server to execute the command remotely. Verify the server is available.
Command set-hazelcast-configuration failed.
%
After that, I tried to bring the domain up...
% ./asadmin start-domain domain1
Waiting for domain1 to start ..................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
No response from the Domain Administration Server (domain1) after 600 seconds.
The command is either taking too long to complete or the server has failed.
Please see the server log files for command status.
Please start with the --verbose option in order to see early messages.
Command start-domain failed.
%
Then I tried the verbose option (as recommended)...
% ./asadmin start-domain domain1 --verbose
Command start-domain only accepts one operand
...
% ./asadmin --verbose start-domain domain1
Invalid option: --verbose
...
% ./asadmin -v start-domain domain1
Invalid option: -v
...
% ./asadmin start-domain domain1 -v
Command start-domain only accepts one operand
...
The common message was:
Usage: asadmin [asadmin-utility-options] start-domain
[-v|--verbose[=<verbose(default:false)>]]
[--upgrade[=<upgrade(default:false)>]]
[-w|--watchdog[=<watchdog(default:false)>]]
[-d|--debug[=<debug(default:false)>]]
[-n|--dry-run[=<dry-run(default:false)>]]
[--drop-interrupted-commands[=<drop-interrupted-commands(default:false)>]]
[--prebootcommandfile <prebootcommandfile>]
[--postbootcommandfile <postbootcommandfile>] [--domaindir <domaindir>]
[-?|--help[=<help(default:false)>]] [domain_name]
Sadly, I strongly believe that Payara is an immature product.
But how can I solve all these errors/mistakes?
EDIT:
I was testing on Windows 10 Pro with NetBeans 12
Launching Payara Server on Felix platform
INFO: Create bundle provisioner class = class com.sun.enterprise.glassfish.bootstrap.osgi.BundleProvisioner.
Registered com.sun.enterprise.glassfish.bootstrap.osgi.EmbeddedOSGiGlassFishRuntime@586f5c68 in service registry.
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain.main(GlassFishMain.java:109)
at com.sun.enterprise.glassfish.bootstrap.ASMain.main(ASMain.java:54)
Caused by: A MultiException has 2 exceptions. They are:
1. com.sun.enterprise.module.ResolveError: Failed to start OSGiModuleImpl:: Bundle = [fish.payara.server.internal.batch.glassfish-batch-connector [102]], State = [NEW]
2. java.lang.IllegalStateException: Could not load descriptor SystemDescriptor(
implementation=org.glassfish.batch.spi.impl.BatchRuntimeConfigurationInjector
name=batch-runtime-configuration
contracts={org.glassfish.batch.spi.impl.BatchRuntimeConfigurationInjector,org.jvnet.hk2.config.ConfigInjector}
scope=javax.inject.Singleton
qualifiers={org.jvnet.hk2.config.InjectionTarget}
descriptorType=CLASS
descriptorVisibility=NORMAL
metadata=#table-suffix={optional,default\:,datatype\:java.lang.String,leaf},#data-source-lookup-name={optional,datatype\:java.lang.String,leaf},#table-prefix={optional,default\:,datatype\:java.lang.String,leaf},#schema-name={optional,default\:APP,datatype\:java.lang.String,leaf},#executor-service-lookup-name={optional,default\:concurrent/__defaultManagedExecutorService,datatype\:java.lang.String,leaf},target={org.glassfish.batch.spi.impl.BatchRuntimeConfiguration},Bundle-SymbolicName={fish.payara.server.internal.batch.glassfish-batch-connector},Bundle-Version={5.2020.4}
rank=0
loader=OsgiPopulatorPostProcessor.HK2Loader(OSGiModuleImpl:: Bundle = [fish.payara.server.internal.batch.glassfish-batch-connector [102]], State = [NEW],1228963996)
proxiable=null
proxyForSameScope=null
analysisName=null
id=170
locatorId=0
identityHashCode=373437697
reified=false)
at org.jvnet.hk2.internal.ServiceLocatorImpl.loadClass(ServiceLocatorImpl.java:2247)
at org.jvnet.hk2.internal.ServiceLocatorImpl.reifyDescriptor(ServiceLocatorImpl.java:438)
at org.jvnet.hk2.internal.ServiceLocatorImpl.reifyDescriptor(ServiceLocatorImpl.java:457)
at org.jvnet.hk2.config.DomDocument$InjectionTargetFilter.matches(DomDocument.java:184)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getDescriptors(ServiceLocatorImpl.java:347)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getDescriptors(ServiceLocatorImpl.java:389)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getBestDescriptor(ServiceLocatorImpl.java:397)
at org.jvnet.hk2.config.DomDocument.buildModel(DomDocument.java:135)
at org.jvnet.hk2.config.ConfigModel.parseValue(ConfigModel.java:959)
at org.jvnet.hk2.config.ConfigModel.<init>(ConfigModel.java:875)
at org.jvnet.hk2.config.DomDocument.buildModel(DomDocument.java:114)
at org.jvnet.hk2.config.DomDocument.getModelByElementName(DomDocument.java:162)
at org.jvnet.hk2.config.ConfigParser.handleElement(ConfigParser.java:165)
at org.jvnet.hk2.config.ConfigParser.parse(ConfigParser.java:101)
at org.jvnet.hk2.config.ConfigParser.parse(ConfigParser.java:95)
at org.glassfish.config.support.DomainXml.parseDomainXml(DomainXml.java:271)
at org.glassfish.config.support.DomainXml.run(DomainXml.java:121)
at org.jvnet.hk2.config.ConfigurationPopulator.populateConfig(ConfigurationPopulator.java:58)
at org.glassfish.hk2.bootstrap.HK2Populator.populateConfig(HK2Populator.java:83)
at com.sun.enterprise.module.common_impl.AbstractModulesRegistryImpl.populateConfig(AbstractModulesRegistryImpl.java:190)
at com.sun.enterprise.module.bootstrap.Main.createServiceLocator(Main.java:249)
at org.jvnet.hk2.osgiadapter.HK2Main.createServiceLocator(HK2Main.java:95)
at com.sun.enterprise.glassfish.bootstrap.osgi.EmbeddedOSGiGlassFishRuntime.newGlassFish(EmbeddedOSGiGlassFishRuntime.java:95)
at com.sun.enterprise.glassfish.bootstrap.GlassFishRuntimeDecorator.newGlassFish(GlassFishRuntimeDecorator.java:68)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntime.newGlassFish(OSGiGlassFishRuntime.java:91)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher.launch(GlassFishMain.java:125)
... 6 more
Caused by: com.sun.enterprise.module.ResolveError: Failed to start OSGiModuleImpl:: Bundle = [fish.payara.server.internal.batch.glassfish-batch-connector [102]], State = [NEW]
at org.jvnet.hk2.osgiadapter.OSGiModuleImpl.start(OSGiModuleImpl.java:193)
at org.jvnet.hk2.osgiadapter.OsgiPopulatorPostProcessor$1.loadClass(OsgiPopulatorPostProcessor.java:54)
at org.jvnet.hk2.internal.ServiceLocatorImpl.loadClass(ServiceLocatorImpl.java:2239)
... 31 more
Caused by: org.osgi.framework.BundleException: Unable to resolve fish.payara.server.internal.batch.glassfish-batch-connector [102](R 102.0): missing requirement [fish.payara.server.internal.batch.glassfish-batch-connector [102](R 102.0)] osgi.wiring.package; (osgi.wiring.package=com.ibm.jbatch.spi) [caused by: Unable to resolve fish.payara.server.internal.batch.payara-jbatch [311](R 311.0): missing requirement [fish.payara.server.internal.batch.payara-jbatch [311](R 311.0)] osgi.wiring.package; (osgi.wiring.package=org.glassfish.weld) [caused by: Unable to resolve fish.payara.server.internal.web.weld-integration [372](R 372.0): missing requirement [fish.payara.server.internal.web.weld-integration [372](R 372.0)] osgi.wiring.package; (&(osgi.wiring.package=org.glassfish.web.deployment.descriptor)(version>=5.2020.0)(!(version>=6.0.0))) [caused by: Unable to resolve fish.payara.server.internal.web.glue [360](R 360.0): missing requirement [fish.payara.server.internal.web.glue [360](R 360.0)] osgi.wiring.package; (&(osgi.wiring.package=org.apache.catalina)(version>=5.2020.0)(!(version>=6.0.0))) [caused by: Unable to resolve fish.payara.server.internal.web.core [358](R 358.0): missing requirement [fish.payara.server.internal.web.core [358](R 358.0)] osgi.wiring.package; (&(osgi.wiring.package=org.glassfish.web.loader)(version>=5.2020.0)(!(version>=6.0.0)))]]]] Unresolved requirements: [[fish.payara.server.internal.batch.glassfish-batch-connector [102](R 102.0)] osgi.wiring.package; (osgi.wiring.package=com.ibm.jbatch.spi)]
at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:4368)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2281)
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:998)
at org.jvnet.hk2.osgiadapter.OSGiModuleImpl.startBundle(OSGiModuleImpl.java:227)
at org.jvnet.hk2.osgiadapter.OSGiModuleImpl.start(OSGiModuleImpl.java:185)
... 33 more
Completed shutdown of GlassFish runtime
We are in non-embedded mode, so fish.payara.server.internal.core.glassfish [113] has nothing to do.
The error "Unknown protocol: RFB" is coming from the Hazelcast component, which is trying to discover other cluster instances that could be running on port 5900. In some operating systems, very often on Mac, this port is occupied by VNC (remote desktop), which responds to Payara Server in an unexpected way.
There's a solution covered for Payara Enterprise users in the Payara Knowledge Base. I have access to it and will copy the relevant parts from it here.
There are various solutions possible:
Stop the process that occupies the port 5900 (e.g. VNC, which uses that port by default). Alternatively, you can change its port. Payara Server should then start OK.
Configure Payara Server to use a different port for Hazelcast. If you run Payara Server according to the above solution, you can then run the command: asadmin set-hazelcast-configuration --startport=5901.
Directly edit the domain.xml in the directory glassfish/domains/domain1/config and change the port 5900 to something else. Then run Payara Server as usual.
Or change the Hazelcast port of Payara Server at startup. First, create a text file config.txt with the single line set-hazelcast-configuration --startport=5901. Then start Payara Server with asadmin start-domain --postbootcommandfile config.txt, as sketched below. More on this in the documentation: https://docs.payara.fish/community/docs/5.2020.4/documentation/payara-micro/asadmin/pre-and-post-boot-scripts.html
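For example, a rough sketch of that last option, run from the glassfish/bin directory used earlier in the question (the option names come from the usage message quoted above):
% echo "set-hazelcast-configuration --startport=5901" > config.txt
% ./asadmin start-domain --postbootcommandfile config.txt domain1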
I am testing my configuration by using:
./logstash -f /etc/logstash/conf.d/your_config_file.conf --config.test_and_exit
And getting the below error:
[INFO ] 2020-10-15 10:25:21.481 [main] runner - Starting Logstash {"logstash.version"=>"7.9.2",
"jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM
11.0.8+10-post-Ubuntu-0ubuntu118.04.1 on 11.0.8+10-post-Ubuntu-0ubuntu118.04.1 +indy +jit
[linux-x86_64]"}
[FATAL] 2020-10-15 10:25:21.586 [main] runner - An unexpected error occurred! {:error=>#
<ArgumentError: Path "/usr/share/logstash/data/queue" must be a writable directory.
It is not writable.>, :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/settings.rb:528:in
`validate'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:288:in `validate_value'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:199:in `block in
validate_all'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/logstash-
core/lib/logstash/settings.rb:198:in `validate_all'", "/usr/share/logstash/logstash-
core/lib/logstash/runner.rb:312:in `execute'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:268:in `run'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'",
"/usr/share/logstash/lib/bootstrap/environment.rb:88:in `<main>'"]}
[ERROR] 2020-10-15 10:25:21.589 [main] Logstash - java.lang.IllegalStateException: Logstash
stopped processing because of an error: (SystemExit) exit
I have tried giving the logstash user permission on the full directory with:
/usr/share$ sudo chown -R logstash.logstash logstash
But I still get the same error. Please help.
chown will change ownership of the directories but will not change permissions. Logstash complains that the user used for starting logstash does not have write access to /usr/share/logstash.
chmod will help change the permissions. Check this to understand more about the command.
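For example (a sketch; the path comes from the error message and the owner from the chown command above):
ls -ld /usr/share/logstash/data
sudo chmod -R u+w /usr/share/logstash/data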
I downloaded spark-1.5.2 and set up a cluster on EC2 using the spark-ec2 doc here.
After that I went to examples/, ran mvn package, and packaged the examples in a jar.
In the end I ran spark-submit with:
bin/spark-submit --class org.apache.spark.examples.JavaTC --master spark://url_here.eu-west-1.compute.amazonaws.com:7077 --deploy-mode cluster /home/aki/Projects/spark-1.5.2/examples/target/spark-examples_2.10-1.5.2.jar
Instead of it running, I get the error:
WARN RestSubmissionClient: Unable to connect to server spark://url_here.eu-west-1.compute.amazonaws.com:7077.
Warning: Master endpoint spark://url_here.eu-west-1.compute.amazonaws.com:7077 was not a REST server. Falling back to legacy submission gateway instead.
15/12/22 17:36:07 WARN Utils: Your hostname, aki-linux resolves to a loopback address: 127.0.1.1; using 192.168.10.63 instead (on interface wlp4s0)
15/12/22 17:36:07 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/12/22 17:36:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.lookupTimeout
at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcEnv.scala:242)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:98)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:116)
at org.apache.spark.deploy.Client$$anonfun$7.apply(Client.scala:233)
at org.apache.spark.deploy.Client$$anonfun$7.apply(Client.scala:233)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
at org.apache.spark.deploy.Client$.main(Client.scala:233)
at org.apache.spark.deploy.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcEnv.scala:241)
... 21 more
Are you sure the URL to the master contains "url_here"?
spark://url_here.eu-west-1.compute.amazonaws.com:7077
Or maybe you are trying to obfuscate it for this post.
If you can connect to the Spark UI at http://url_here.eu-west-1.compute.amazonaws.com:4040 or, depending on your Spark version, http://url_here.eu-west-1.compute.amazonaws.com:8080, make sure you are using the master URL shown on the Spark UI for your spark://...:7077 command line argument.
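As a quick sanity check (a sketch; assumes nc is available and keeps the placeholder hostname from the question), you can verify that the master port is reachable from the machine running spark-submit:
nc -vz url_here.eu-west-1.compute.amazonaws.com 7077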