Apache NiFi cannot connect to "localhost:8080/nifi" - apache-nifi

I am trying to run Apache NiFi on my CentOS system. I downloaded it and ran it with the command:
/opt/nifi-1.1.1.0-12/bin/nifi.sh start
The output was:
Java home: /usr/java/jdk1.7.0_45
NiFi home: /opt/nifi-1.1.1.0-12
Bootstrap Config File: /opt/nifi-1.1.1.0-12/conf/bootstrap.conf
2016-03-06 22:02:08,477 INFO [main] org.apache.nifi.bootstrap.Command Starting Apache NiFi...
2016-03-06 22:02:08,478 INFO [main] org.apache.nifi.bootstrap.Command Working Directory: /opt/nifi-1.1.1.0-12
2016-03-06 22:02:08,478 INFO [main] org.apache.nifi.bootstrap.Command Command: /usr/java/jdk1.7.0_45/bin/java -classpath /opt/nifi-1.1.1.0-12/./conf:/opt/nifi-1.1.1.0-12/./lib/jcl-over-slf4j-1.7.12.jar:/opt/nifi-1.1.1.0-12/./lib/log4j-over-slf4j-1.7.12.jar:/opt/nifi-1.1.1.0-12/./lib/nifi-runtime-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/./lib/logback-classic-1.1.3.jar:/opt/nifi-1.1.1.0-12/./lib/nifi-nar-utils-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/./lib/nifi-properties-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/./lib/slf4j-api-1.7.12.jar:/opt/nifi-1.1.1.0-12/./lib/logback-core-1.1.3.jar:/opt/nifi-1.1.1.0-12/./lib/nifi-documentation-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/./lib/nifi-api-1.1.1.0-12.jar:/opt/nifi-1.1.1.0-12/./lib/jul-to-slf4j-1.7.12.jar -Djava.net.preferIPv4Stack=true -Dsun.net.http.allowRestrictedHeaders=true -Djava.protocol.handler.pkgs=sun.net.www.protocol -Dorg.apache.jasper.compiler.disablejsr199=true -Xmx512m -Djava.awt.headless=true -Xms512m -Dnifi.properties.file.path=/opt/nifi-1.1.1.0-12/./conf/nifi.properties -Dnifi.bootstrap.listen.port=32864 -Dapp=NiFi org.apache.nifi.NiFi
Then I checked the status of Apache NiFi with the command:
/opt/nifi-1.1.1.0-12/bin/nifi.sh status
and the result was:
Java home: /usr/java/jdk1.7.0_45
NiFi home: /opt/nifi-1.1.1.0-12
Bootstrap Config File: /opt/nifi-1.1.1.0-12/conf/bootstrap.conf
2016-03-06 22:03:21,227 INFO [main] org.apache.nifi.bootstrap.Command Apache NiFi is currently running, listening to Bootstrap on port 45542, PID=30817
But when I try to access http://localhost:8090/nifi or http://localhost:8080/nifi in my browser, it says "unable to connect". (I changed the HTTP port to 8090 to avoid conflicts, but I still have the same problem.) Please help me: what is the problem?
Here is $NIFI_HOME/logs/nifi-app.log:
2016-03-07 00:31:54,204 ERROR [Cleanup Archive for default] o.a.n.c.repository.FileSystemRepository Failed to cleanup archive for container default due to java.io.IOException: Mount point not found
2016-03-07 00:31:54,216 INFO [main] o.apache.nifi.controller.FlowController Controller has been terminated successfully.
2016-03-07 00:31:54,225 WARN [main] org.eclipse.jetty.webapp.WebAppContext Failed startup of context o.e.j.w.WebAppContext#d2b452{/nifi-api,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-api-1.1.1.0-12.war/webapp/,STARTING}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-api-1.1.1.0-12.war}
org.apache.nifi.web.NiFiCoreException: Unable to start Flow Controller.
at org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:99) ~[na:na]
at org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:800) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:444) ~[jetty-servlet-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:791) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:294) ~[jetty-servlet-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1342) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:741) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:505) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61) [jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.Server.start(Server.java:387) [jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61) [jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.Server.doStart(Server.java:354) [jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) [jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:663) [nifi-jetty-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.NiFi.<init>(NiFi.java:137) [nifi-runtime-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.NiFi.main(NiFi.java:227) [nifi-runtime-1.1.1.0-12.jar:1.1.1.0-12]
Caused by: java.nio.file.FileSystemException: ./flowfile_repository/partition-119/2.journal: Too many open files
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[na:1.7.0_45]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[na:1.7.0_45]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107) ~[na:1.7.0_45]
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214) ~[na:1.7.0_45]
at java.nio.file.Files.newByteChannel(Files.java:315) ~[na:1.7.0_45]
at java.nio.file.Files.newByteChannel(Files.java:361) ~[na:1.7.0_45]
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:380) ~[na:1.7.0_45]
at java.nio.file.Files.newInputStream(Files.java:106) ~[na:1.7.0_45]
at org.wali.MinimalLockingWriteAheadLog$Partition.createDataInputStream(MinimalLockingWriteAheadLog.java:932) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog$Partition.getRecoveryStream(MinimalLockingWriteAheadLog.java:947) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog$Partition.getNextRecoverableTransactionId(MinimalLockingWriteAheadLog.java:973) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog.recoverFromEdits(MinimalLockingWriteAheadLog.java:419) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog.recoverRecords(MinimalLockingWriteAheadLog.java:293) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.repository.WriteAheadFlowFileRepository.loadFlowFiles(WriteAheadFlowFileRepository.java:328) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.FlowController.initializeFlow(FlowController.java:573) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.StandardFlowService.loadFromBytes(StandardFlowService.java:622) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.StandardFlowService.load(StandardFlowService.java:458) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:79) ~[na:na]
... 22 common frames omitted
2016-03-07 00:31:54,427 INFO [main] o.e.jetty.server.handler.ContextHandler Started o.e.j.w.WebAppContext#ec15ce{/nifi-content-viewer,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-content-viewer-1.1.1.0-12.war/webapp/,AVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-content-viewer-1.1.1.0-12.war}
2016-03-07 00:31:54,429 INFO [main] o.e.jetty.server.handler.ContextHandler Started o.e.j.s.h.ContextHandler#917ef5{/nifi-docs,null,AVAILABLE}
2016-03-07 00:31:54,457 INFO [main] o.e.jetty.server.handler.ContextHandler Started o.e.j.w.WebAppContext#260a31{/nifi-docs,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-docs-1.1.1.0-12.war/webapp/,AVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-docs-1.1.1.0-12.war}
2016-03-07 00:31:54,483 INFO [main] o.e.jetty.server.handler.ContextHandler Started o.e.j.w.WebAppContext#15dcce1{/,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-error-1.1.1.0-12.war/webapp/,AVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-error-1.1.1.0-12.war}
2016-03-07 00:31:54,497 INFO [main] org.eclipse.jetty.server.ServerConnector Started ServerConnector#1e4a851{HTTP/1.1}{0.0.0.0:8089}
2016-03-07 00:31:54,497 INFO [main] org.eclipse.jetty.server.Server Started #27531ms
2016-03-07 00:31:54,499 WARN [main] org.apache.nifi.web.server.JettyServer Failed to start web server... shutting down.
org.apache.nifi.web.NiFiCoreException: Unable to start Flow Controller.
at org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:99) ~[na:na]
at org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:800) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:444) ~[jetty-servlet-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:791) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:294) ~[jetty-servlet-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1342) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:741) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:505) ~[jetty-webapp-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.Server.start(Server.java:387) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.server.Server.doStart(Server.java:354) ~[jetty-server-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68) ~[jetty-util-9.2.11.v20150529.jar:9.2.11.v20150529]
at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:663) ~[nifi-jetty-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.NiFi.<init>(NiFi.java:137) [nifi-runtime-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.NiFi.main(NiFi.java:227) [nifi-runtime-1.1.1.0-12.jar:1.1.1.0-12]
Caused by: java.nio.file.FileSystemException: ./flowfile_repository/partition-119/2.journal: Too many open files
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:91) ~[na:1.7.0_45]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[na:1.7.0_45]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107) ~[na:1.7.0_45]
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214) ~[na:1.7.0_45]
at java.nio.file.Files.newByteChannel(Files.java:315) ~[na:1.7.0_45]
at java.nio.file.Files.newByteChannel(Files.java:361) ~[na:1.7.0_45]
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:380) ~[na:1.7.0_45]
at java.nio.file.Files.newInputStream(Files.java:106) ~[na:1.7.0_45]
at org.wali.MinimalLockingWriteAheadLog$Partition.createDataInputStream(MinimalLockingWriteAheadLog.java:932) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog$Partition.getRecoveryStream(MinimalLockingWriteAheadLog.java:947) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog$Partition.getNextRecoverableTransactionId(MinimalLockingWriteAheadLog.java:973) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog.recoverFromEdits(MinimalLockingWriteAheadLog.java:419) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.wali.MinimalLockingWriteAheadLog.recoverRecords(MinimalLockingWriteAheadLog.java:293) ~[nifi-write-ahead-log-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.repository.WriteAheadFlowFileRepository.loadFlowFiles(WriteAheadFlowFileRepository.java:328) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.FlowController.initializeFlow(FlowController.java:573) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.StandardFlowService.loadFromBytes(StandardFlowService.java:622) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.controller.StandardFlowService.load(StandardFlowService.java:458) ~[nifi-framework-core-1.1.1.0-12.jar:1.1.1.0-12]
at org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:79) ~[na:na]
... 22 common frames omitted
2016-03-07 00:31:54,500 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...
2016-03-07 00:31:54,503 INFO [Thread-1] org.eclipse.jetty.server.ServerConnector Stopped ServerConnector#1e4a851{HTTP/1.1}{0.0.0.0:8089}
2016-03-07 00:31:54,511 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#15dcce1{/,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-error-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-error-1.1.1.0-12.war}
2016-03-07 00:31:54,513 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#260a31{/nifi-docs,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-docs-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-docs-1.1.1.0-12.war}
2016-03-07 00:31:54,515 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.s.h.ContextHandler#917ef5{/nifi-docs,null,UNAVAILABLE}
2016-03-07 00:31:54,516 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#ec15ce{/nifi-content-viewer,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-content-viewer-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-content-viewer-1.1.1.0-12.war}
2016-03-07 00:31:54,518 INFO [Thread-1] o.a.n.w.c.ApplicationStartupContextListener Initiating shutdown of flow service...
2016-03-07 00:31:54,518 INFO [Thread-1] o.a.n.w.c.ApplicationStartupContextListener Flow service termination completed.
2016-03-07 00:31:54,518 INFO [Thread-1] /nifi-api Closing Spring root WebApplicationContext
2016-03-07 00:31:54,643 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#d2b452{/nifi-api,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-api-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-api-1.1.1.0-12.war}
2016-03-07 00:31:54,656 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#13f7ede{/nifi,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-web-ui-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/framework/nifi-framework-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-web-ui-1.1.1.0-12.war}
2016-03-07 00:31:54,682 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#2b0fea{/nifi-update-attribute-ui-1.1.1.0-12,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-update-attribute-ui-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/extensions/nifi-update-attribute-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-update-attribute-ui-1.1.1.0-12.war}
2016-03-07 00:31:54,699 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#1680713{/nifi-image-viewer-1.1.1.0-12,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-image-viewer-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/extensions/nifi-image-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-image-viewer-1.1.1.0-12.war}
2016-03-07 00:31:54,704 INFO [Thread-1] o.e.jetty.server.handler.ContextHandler Stopped o.e.j.w.WebAppContext#155d5e4{/nifi-standard-content-viewer-1.1.1.0-12,file:/opt/nifi-1.1.1.0-12/work/jetty/nifi-standard-content-viewer-1.1.1.0-12.war/webapp/,UNAVAILABLE}{./work/nar/extensions/nifi-standard-nar-1.1.1.0-12.nar-unpacked/META-INF/bundled-dependencies/nifi-standard-content-viewer-1.1.1.0-12.war}
2016-03-07 00:31:54,714 INFO [Thread-1] org.apache.nifi.NiFi Jetty web server shutdown completed (nicely or otherwise).

The log shows the error "Too many open files".
I would suggest increasing this limit as recommended in the NiFi configuration best practices: https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#configuration-best-practices
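For example, on CentOS you can raise the open-file limit for the account that runs NiFi in /etc/security/limits.conf (a minimal sketch, assuming a dedicated "nifi" user; the 50000 values follow the best-practices page linked above, adjust them for your environment):
# check the current limit in the shell that launches NiFi
ulimit -n
# /etc/security/limits.conf (assumed "nifi" user; use "*" to apply to all users)
nifi  soft  nofile  50000
nifi  hard  nofile  50000
Log the user out and back in (or reboot) so the new limits take effect, then start NiFi again with bin/nifi.sh start.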

Related

ERROR delegation.AbstractDelegationTokenSecretManager: ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted(hadoop window10)

I use Windows 10, and the NodeManager is also not starting correctly. I see the following errors:
The ResourceManager is not connecting and is failing due to:
2021-07-07 11:01:52,473 ERROR delegation.AbstractDelegationTokenSecretManager: ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
2021-07-07 11:01:52,493 INFO handler.ContextHandler: Stopped o.e.j.w.WebAppContext#756b58a7{/,null,UNAVAILABLE}{/cluster}
2021-07-07 11:01:52,504 INFO server.AbstractConnector: Stopped ServerConnector#633a2e99{HTTP/1.1,[http/1.1]}{0.0.0.0:8088}
2021-07-07 11:01:52,504 INFO handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler#7b420819{/static,jar:file:/F:/hadoop_new/share/hadoop/yarn/hadoop-yarn-common-3.2.1.jar!/webapps/static,UNAVAILABLE}
2021-07-07 11:01:52,507 INFO handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler#c9d0d6{/logs,file:///F:/hadoop_new/logs/,UNAVAILABLE}
2021-07-07 11:01:52,541 INFO ipc.Server: Stopping server on 8033
2021-07-07 11:01:52,543 INFO ipc.Server: Stopping IPC Server listener on 8033
2021-07-07 11:01:52,544 INFO resourcemanager.ResourceManager: Transitioning to standby state
2021-07-07 11:01:52,544 INFO ipc.Server: Stopping IPC Server Responder
2021-07-07 11:01:52,550 INFO resourcemanager.ResourceManager: Transitioned to standby state
2021-07-07 11:01:52,554 FATAL resourcemanager.ResourceManager: Error starting ResourceManager
org.apache.hadoop.service.ServiceStateException: 5: Access is denied.
and
2021-07-07 11:01:51,625 INFO recovery.RMStateStore: Storing RMDTMasterKey.
2021-07-07 11:01:52,158 INFO store.AbstractFSNodeStore: Created store directory :file:/tmp/hadoop-yarn-Abby/node-attribute
2021-07-07 11:01:52,186 INFO service.AbstractService: Service NodeAttributesManagerImpl failed in state STARTED
5: Access is denied.
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileOutputStreamWithMode(NativeIO.java:595)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:246)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:331)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:320)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1098)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:987)
at org.apache.hadoop.yarn.nodelabels.store.AbstractFSNodeStore.recoverFromStore(AbstractFSNodeStore.java:160)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.FileSystemNodeAttributeStore.recover(FileSystemNodeAttributeStore.java:95)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.NodeAttributesManagerImpl.initNodeAttributeStore(NodeAttributesManagerImpl.java:140)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.NodeAttributesManagerImpl.serviceStart(NodeAttributesManagerImpl.java:123)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:121)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:895)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1262)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1303)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1299)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1299)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1350)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1535)
2021-07-07 11:01:52,212 INFO service.AbstractService: Service RMActiveServices failed in state STARTED
org.apache.hadoop.service.ServiceStateException: 5: Access is denied.
at org.apache.hadoop.service.ServiceStateException.convert(ServiceStateException.java:105)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:203)
at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:121)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:895)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1262)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1303)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1299)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1299)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1350)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1535)
Caused by: 5: Access is denied.
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileOutputStreamWithMode(NativeIO.java:595)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:246)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:331)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:320)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:305)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1098)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:987)
at org.apache.hadoop.yarn.nodelabels.store.AbstractFSNodeStore.recoverFromStore(AbstractFSNodeStore.java:160)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.FileSystemNodeAttributeStore.recover(FileSystemNodeAttributeStore.java:95)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.NodeAttributesManagerImpl.initNodeAttributeStore(NodeAttributesManagerImpl.java:140)
at org.apache.hadoop.yarn.server.resourcemanager.nodelabels.NodeAttributesManagerImpl.serviceStart(NodeAttributesManagerImpl.java:123)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
... 13 more
You are getting an access-denied error, so you may need to run as another user. Try starting the services with a user that has more privileges, such as Administrator on Windows.
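For example, you can launch the Hadoop daemons from an elevated prompt, or grant your own account full control of the YARN state directory the log points at (a rough sketch; the directory and the "Abby" account are taken from the log above and may differ on your machine):
:: run in a Command Prompt opened with "Run as administrator"
icacls "C:\tmp\hadoop-yarn-Abby" /grant Abby:(OI)(CI)F /T
Then run start-yarn.cmd again from the same elevated prompt.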

sonar starts and then stops running after 15 seconds without displaying any error

I have installed Sonar on my Ubuntu 14.04 LTS machine (which I am running inside Windows using Vagrant) by following a tutorial here.
My sonar.properties file is below:
/opt/sonar/conf/sonar.properties
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance
sonar.web.host=127.0.0.1
sonar.web.context=/sonar
sonar.web.port=9000
sonar.jdbc.driverClassName=com.mysql.jdbc.Driver
sonar.jdbc.validationQuery=select 1
sonar.jdbc.dialect=mysql
sonar.jdbc.maxActive=20 sonar.jdbc.maxIdle=5
sonar.jdbc.minIdle=2
sonar.jdbc.maxWait=5000
sonar.jdbc.minEvictableIdleTimeMillis=600000
sonar.jdbc.timeBetweenEvictionRunsMillis=30000
After installing, I restarted and then started Sonar as below:
/opt/sonar/bin/linux-x86-64/sonar.sh restart
/opt/sonar/bin/linux-x86-64/sonar.sh start
Then I checked the status of Sonar as below:
/opt/sonar/bin/linux-x86-64/sonar.sh status
The output was "SonarQube is running."
After about 15 seconds I checked the status again, and now the output was "SonarQube is not running."
I don't know what is causing Sonar to shut down almost immediately, because it is not displaying any error.
sonar.log (/opt/sonar/logs/sonar.log) file output:
--> Wrapper Started as Daemon
Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2017.07.28 06:47:48 INFO app[o.s.a.AppFileSystem] Cleaning or creating temp directory /opt/sonar/temp
2017.07.28 06:47:48 INFO app[o.s.p.m.JavaProcessLauncher] Launch process[es]: /usr/lib/jvm/java-8-oracle/jre/bin/java -Djava.awt.headless=true -Xmx1G -Xms256m -Xss256k -Djava.net.preferIPv4Stack=true -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Djava.io.tmpdir=/opt/sonar/temp -javaagent:/usr/lib/jvm/java-8-oracle/jre/lib/management-agent.jar -cp ./lib/common/*:./lib/search/* org.sonar.search.SearchServer /opt/sonar/temp/sq-process8071013234782021313properties
2017.07.28 06:47:48 INFO es[o.s.p.ProcessEntryPoint] Starting es
2017.07.28 06:47:48 INFO es[o.s.s.EsSettings] Elasticsearch listening on 127.0.0.1:9001
2017.07.28 06:47:49 INFO es[o.elasticsearch.node] [sonar-1501224467799] version[1.7.5], pid[5493], build[00f95f4/2016-02-02T09:55:30Z]
2017.07.28 06:47:49 INFO es[o.elasticsearch.node] [sonar-1501224467799] initializing ...
2017.07.28 06:47:49 INFO es[o.e.plugins] [sonar-1501224467799] loaded [], sites []
2017.07.28 06:47:49 INFO es[o.elasticsearch.env] [sonar-1501224467799] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [34.3gb], net total_space [39.3gb], types [ext4]
2017.07.28 06:47:50 WARN es[o.e.bootstrap] JNA not found. native methods will be disabled.
2017.07.28 06:47:52 INFO es[o.elasticsearch.node] [sonar-1501224467799] initialized
2017.07.28 06:47:52 INFO es[o.elasticsearch.node] [sonar-1501224467799] starting ...
2017.07.28 06:47:52 INFO es[o.e.transport] [sonar-1501224467799] bound_address {inet[/127.0.0.1:9001]}, publish_address {inet[/127.0.0.1:9001]}
2017.07.28 06:47:52 INFO es[o.e.discovery] [sonar-1501224467799] sonarqube/a-DrjveQTd6cOLiDMQGFPA
2017.07.28 06:47:55 INFO es[o.e.cluster.service] [sonar-1501224467799] new_master [sonar-1501224467799][a-DrjveQTd6cOLiDMQGFPA][vagrant-ubuntu-trusty-64][inet[/127.0.0.1:9001]]{rack_id=sonar-1501224467799}, reason: zen-disco-join (elected_as_master)
2017.07.28 06:47:55 INFO es[o.elasticsearch.node] [sonar-1501224467799] started
2017.07.28 06:47:55 INFO es[o.e.gateway] [sonar-1501224467799] recovered [0] indices into cluster_state
2017.07.28 06:47:55 INFO app[o.s.p.m.Monitor] Process[es] is up
2017.07.28 06:47:55 INFO app[o.s.p.m.JavaProcessLauncher] Launch process[web]: /usr/lib/jvm/java-8-oracle/jre/bin/java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.management.enabled=false -Djruby.compile.invokedynamic=false -Xmx512m -Xms128m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true -Djava.io.tmpdir=/opt/sonar/temp -javaagent:/usr/lib/jvm/java-8-oracle/jre/lib/management-agent.jar -cp ./lib/common/*:./lib/server/*:/opt/sonar/lib/jdbc/mysql/mysql-connector-java-5.1.35.jar org.sonar.server.app.WebServer /opt/sonar/temp/sq-process7630170265596703695properties
2017.07.28 06:47:56 INFO web[o.s.p.ProcessEntryPoint] Starting web
2017.07.28 06:47:56 INFO web[o.s.s.a.TomcatContexts] Webapp directory: /opt/sonar/web
2017.07.28 06:47:57 INFO web[o.a.c.h.Http11NioProtocol] Initializing ProtocolHandler ["http-nio-127.0.0.1-9000"]
2017.07.28 06:47:57 INFO web[o.a.t.u.n.NioSelectorPool] Using a shared selector for servlet write/read
2017.07.28 06:47:58 INFO web[o.s.s.p.ServerImpl] SonarQube Server / 5.6.4 / 52298794f1a34a4fd713ff8d441a0c13432e40a9
2017.07.28 06:47:58 INFO web[o.sonar.db.Database] Create JDBC data source for jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance
2017.07.28 06:47:58 ERROR web[o.a.c.c.C.[.[.[/]] Exception sending context initialized event to listener instance of class org.sonar.server.platform.PlatformServletContextListener
java.lang.NumberFormatException: For input string: "20 sonar.jdbc.maxIdle=5"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) ~[na:1.8.0_131]
at java.lang.Integer.parseInt(Integer.java:580) ~[na:1.8.0_131]
at java.lang.Integer.parseInt(Integer.java:615) ~[na:1.8.0_131]
at org.apache.commons.dbcp.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:223) ~[commons-dbcp-1.4.jar:1.4]
at org.sonar.db.DefaultDatabase.initDataSource(DefaultDatabase.java:92) ~[sonar-db-5.6.4.jar:na]
at org.sonar.db.DefaultDatabase.start(DefaultDatabase.java:70) ~[sonar-db-5.6.4.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110) ~[picocontainer-2.15.jar:na]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89) ~[picocontainer-2.15.jar:na]
at org.sonar.core.platform.ComponentContainer$1.start(ComponentContainer.java:320) ~[sonar-core-5.6.4.jar:na]
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84) ~[picocontainer-2.15.jar:na]
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169) ~[picocontainer-2.15.jar:na]
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132) ~[picocontainer-2.15.jar:na]
at org.picocontainer.behaviors.Stored.start(Stored.java:110) ~[picocontainer-2.15.jar:na]
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1016) ~[picocontainer-2.15.jar:na]
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1009) ~[picocontainer-2.15.jar:na]
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:767) ~[picocontainer-2.15.jar:na]
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:141) ~[sonar-core-5.6.4.jar:na]
at org.sonar.server.platform.platformlevel.PlatformLevel.start(PlatformLevel.java:84) ~[sonar-server-5.6.4.jar:na]
at org.sonar.server.platform.Platform.start(Platform.java:216) ~[sonar-server-5.6.4.jar:na]
at org.sonar.server.platform.Platform.startLevel1Container(Platform.java:175) ~[sonar-server-5.6.4.jar:na]
at org.sonar.server.platform.Platform.init(Platform.java:90) ~[sonar-server-5.6.4.jar:na]
at org.sonar.server.platform.PlatformServletContextListener.contextInitialized(PlatformServletContextListener.java:43) ~[sonar-server-5.6.4.jar:na]
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4812) [tomcat-embed-core-8.0.32.jar:8.0.32]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5255) [tomcat-embed-core-8.0.32.jar:8.0.32]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147) [tomcat-embed-core-8.0.32.jar:8.0.32]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1408) [tomcat-embed-core-8.0.32.jar:8.0.32]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1398) [tomcat-embed-core-8.0.32.jar:8.0.32]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_131]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_131]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
2017.07.28 06:47:58 ERROR web[o.a.c.c.StandardContext] One or more listeners failed to start. Full details will be found in the appropriate container log file
2017.07.28 06:47:58 ERROR web[o.a.c.c.StandardContext] Context [] startup failed due to previous errors
2017.07.28 06:47:58 WARN web[o.a.c.l.WebappClassLoaderBase] The web application [ROOT] appears to have started a thread named [Abandoned connection cleanup thread] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
java.lang.Object.wait(Native Method)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
com.mysql.jdbc.AbandonedConnectionCleanupThread.run(AbandonedConnectionCleanupThread.java:43)
2017.07.28 06:47:58 INFO web[o.a.c.h.Http11NioProtocol] Starting ProtocolHandler ["http-nio-127.0.0.1-9000"]
2017.07.28 06:47:58 INFO web[o.s.s.a.TomcatAccessLog] Web server is started
2017.07.28 06:47:58 INFO web[o.s.s.a.EmbeddedTomcat] HTTP connector enabled on port 9000
2017.07.28 06:47:58 WARN web[o.s.p.ProcessEntryPoint] Fail to start web
java.lang.IllegalStateException: Webapp did not start
at org.sonar.server.app.EmbeddedTomcat.isUp(EmbeddedTomcat.java:84) ~[sonar-server-5.6.4.jar:na]
at org.sonar.server.app.WebServer.isUp(WebServer.java:47) [sonar-server-5.6.4.jar:na]
at org.sonar.process.ProcessEntryPoint.launch(ProcessEntryPoint.java:105) ~[sonar-process-5.6.4.jar:na]
at org.sonar.server.app.WebServer.main(WebServer.java:68) [sonar-server-5.6.4.jar:na]
2017.07.28 06:47:58 INFO web[o.a.c.h.Http11NioProtocol] Pausing ProtocolHandler ["http-nio-127.0.0.1-9000"]
2017.07.28 06:47:59 INFO web[o.a.c.h.Http11NioProtocol] Stopping ProtocolHandler ["http-nio-127.0.0.1-9000"]
2017.07.28 06:47:59 INFO web[o.a.c.h.Http11NioProtocol] Destroying ProtocolHandler ["http-nio-127.0.0.1-9000"]
2017.07.28 06:47:59 INFO web[o.s.s.a.TomcatAccessLog] Web server is stopped
2017.07.28 06:48:00 INFO app[o.s.p.m.Monitor] Process[es] is stopping
2017.07.28 06:48:00 INFO es[o.s.p.StopWatcher] Stopping process
2017.07.28 06:48:00 INFO es[o.elasticsearch.node] [sonar-1501224467799] stopping ...
2017.07.28 06:48:00 INFO es[o.elasticsearch.node] [sonar-1501224467799] stopped
2017.07.28 06:48:00 INFO es[o.elasticsearch.node] [sonar-1501224467799] closing ...
2017.07.28 06:48:00 INFO es[o.elasticsearch.node] [sonar-1501224467799] closed
2017.07.28 06:48:00 INFO app[o.s.p.m.Monitor] Process[es] is stopped
<-- Wrapper Stopped
I checked whether MySQL was running as below, and it was up and running:
mysqladmin -u root -p status
Uptime: 2781 Threads: 1 Questions: 122 Slow queries: 0 Opens: 48 Flush tables: 1 Open tables: 41 Queries per second avg: 0.043
So I don't understand what makes Sonar shut down some time after it starts, what changes I should make so it keeps running, or where to debug.
You have a missing line break here:
sonar.jdbc.maxActive=20 sonar.jdbc.maxIdle=5
Sonar can't parse the maxActive option, because everything after the first = is parsed as an integer, which obviously fails:
java.lang.NumberFormatException: For input string: "20 sonar.jdbc.maxIdle=5"
Simple solution:
sonar.jdbc.maxActive=20
sonar.jdbc.maxIdle=5
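To catch any other properties that ended up joined onto one line, a quick scan of the file can help (just a sketch; the pattern looks for a second sonar.* key appearing after the = sign):
grep -nE '=[^=]*sonar\.' /opt/sonar/conf/sonar.properties
After fixing the file, restart with /opt/sonar/bin/linux-x86-64/sonar.sh restart.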

Spark Thirft Service not started in Hadoop with Azure Storage Blob configuration

We have created a high-availability Hadoop cluster with Azure Blob Storage as the default file system instead of HDFS, following https://hadoop.apache.org/docs/stable/hadoop-azure/index.html
The Hive Thrift service started successfully, but the Spark Thrift service did not.
I am able to use spark-shell and connect to the blob storage by referencing the hadoop-azure.jar file, but I cannot start the Thrift service.
Command used to start spark thrift server:
spark-submit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --master yarn
Following are the error details.
17/04/26 10:19:32 INFO metastore: Connected to metastore.
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:47)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 22 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 27 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 35 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
... 40 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.fs.azure.AzureException: java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
... 48 more
Caused by: org.apache.hadoop.fs.azure.AzureException: java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.
at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.retrieveMetadata(AzureNativeFileSystemStore.java:1930)
at org.apache.hadoop.fs.azure.NativeAzureFileSystem.getFileStatus(NativeAzureFileSystem.java:1592)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:596)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
... 49 more
Caused by: java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.
at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:113)
at org.apache.hadoop.fs.azure.StorageInterfaceImpl$WrappingIterator.hasNext(StorageInterfaceImpl.java:128)
at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.retrieveMetadata(AzureNativeFileSystemStore.java:1909)
... 54 more
Caused by: com.microsoft.azure.storage.StorageException: The server encountered an unknown failure: OK
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:178)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:273)
at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:109)
... 56 more
Caused by: java.lang.ClassCastException: org.apache.xerces.parsers.XIncludeAwareParserConfiguration cannot be cast to org.apache.xerces.xni.parser.XMLParserConfiguration
at org.apache.xerces.parsers.SAXParser.<init>(Unknown Source)
at org.apache.xerces.parsers.SAXParser.<init>(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.<init>(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl.<init>(Unknown Source)
at org.apache.xerces.jaxp.SAXParserFactoryImpl.newSAXParser(Unknown Source)
at com.microsoft.azure.storage.core.Utility.getSAXParser(Utility.java:546)
at com.microsoft.azure.storage.blob.BlobListHandler.getBlobList(BlobListHandler.java:72)
at com.microsoft.azure.storage.blob.CloudBlobContainer$6.postProcessResponse(CloudBlobContainer.java:1253)
at com.microsoft.azure.storage.blob.CloudBlobContainer$6.postProcessResponse(CloudBlobContainer.java:1217)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:148)
... 57 more
17/04/26 10:19:33 INFO SparkContext: Invoking stop() from shutdown hook
17/04/26 10:19:33 INFO SparkUI: Stopped Spark web UI at http://10.0.0.4:4040
17/04/26 10:19:33 INFO YarnClientSchedulerBackend: Interrupting monitor thread
17/04/26 10:19:33 INFO YarnClientSchedulerBackend: Shutting down all executors
17/04/26 10:19:33 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
17/04/26 10:19:33 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
17/04/26 10:19:33 INFO YarnClientSchedulerBackend: Stopped
17/04/26 10:19:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/04/26 10:19:33 INFO MemoryStore: MemoryStore cleared
17/04/26 10:19:33 INFO BlockManager: BlockManager stopped
17/04/26 10:19:33 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/26 10:19:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/04/26 10:19:33 INFO SparkContext: Successfully stopped SparkContext
17/04/26 10:19:33 INFO ShutdownHookManager: Shutdown hook called
17/04/26 10:19:33 INFO ShutdownHookManager: Deleting directory C:\Users\labuser\AppData\Local\Temp\2\spark-11c406ec-2c53-4042-b336-9d1164c3c6f9
17/04/26 10:19:33 INFO MetricsSystemImpl: Stopping azure-file-system metrics system...
17/04/26 10:19:33 INFO MetricsSystemImpl: azure-file-system metrics system stopped.
17/04/26 10:19:33 INFO MetricsSystemImpl: azure-file-system metrics system shutdown complete.
Please help me resolve this issue. Any help would be greatly appreciated.

SonarQube - Elasticsearch could not bind

Last week SonarQube loaded in, worked nice and dandy. Now it throws this warning in sonar.log:
2017.03.16 11:58:47 WARN es[o.e.bootstrap] JNA not found. native methods will be disabled.
2017.03.16 11:58:47 INFO es[o.elasticsearch.node] [sonar-1489661925446] initialized
2017.03.16 11:58:47 INFO es[o.elasticsearch.node] [sonar-1489661925446] starting ...
It did not start, of course. What could have changed while I was away?
Edit:
I did not originally provide the logs after trying to stop and restart Sonar; here they are:
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2017.03.16 16:54:43 INFO app[o.s.a.AppFileSystem] Cleaning or creating temp directory /proj/tn/tools/sonar/temp
2017.03.16 16:54:43 INFO app[o.s.p.m.JavaProcessLauncher] Launch process[es]: /afs/sunrise.ericsson.se/se/app/vbuild/SLED11-x86_64/jdk/1.8.0_102/jre/bin/java -Djava.awt.headless=true -Xmx1G -Xms256m -Xss256k -Djava.net.preferIPv4Stack=true -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Djava.io.tmpdir=/proj/tn/tools/sonar/temp -javaagent:/afs/sunrise.ericsson.se/se/app/vbuild/SLED11-x86_64/jdk/1.8.0_102/jre/lib/management-agent.jar -cp ./lib/common/*:./lib/search/* org.sonar.search.SearchServer /proj/tn/tools/sonar/temp/sq-process8365491608077217541properties
2017.03.16 16:54:43 INFO es[o.s.p.ProcessEntryPoint] Starting es
2017.03.16 16:54:43 INFO es[o.s.s.EsSettings] Elasticsearch listening on 127.0.0.1:9001
2017.03.16 16:54:43 INFO es[o.elasticsearch.node] [sonar-1489679683034] version[1.7.5], pid[438787], build[00f95f4/2016-02-02T09:55:30Z]
2017.03.16 16:54:43 INFO es[o.elasticsearch.node] [sonar-1489679683034] initializing ...
2017.03.16 16:54:43 INFO es[o.e.plugins] [sonar-1489679683034] loaded [], sites []
2017.03.16 16:54:43 INFO es[o.elasticsearch.env] [sonar-1489679683034] using [1] data paths, mounts [[/proj/tn (seroisproj02002.mo.sw.ericsson.se:/uproj020036/tn)]], net usable_space [330.8gb], net total_space [4.4tb], types [nfs]
2017.03.16 16:54:44 WARN es[o.e.bootstrap] JNA not found. native methods will be disabled.
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] initialized
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] starting ...
2017.03.16 16:54:45 WARN es[o.s.p.ProcessEntryPoint] Fail to start es
org.elasticsearch.transport.BindTransportException: Failed to bind to [9001]
at org.elasticsearch.transport.netty.NettyTransport.bindServerBootstrap(NettyTransport.java:422) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.transport.netty.NettyTransport.doStart(NettyTransport.java:283) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.component.AbstractLifecycleComponent.start(AbstractLifecycleComponent.java:85) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.transport.TransportService.doStart(TransportService.java:153) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.component.AbstractLifecycleComponent.start(AbstractLifecycleComponent.java:85) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.node.internal.InternalNode.start(InternalNode.java:257) ~[elasticsearch-1.7.5.jar:na]
at org.sonar.search.SearchServer.start(SearchServer.java:46) [sonar-search-5.6.2.jar:na]
at org.sonar.process.ProcessEntryPoint.launch(ProcessEntryPoint.java:102) ~[sonar-process-5.6.2.jar:na]
at org.sonar.search.SearchServer.main(SearchServer.java:81) [sonar-search-5.6.2.jar:na]
Caused by: org.elasticsearch.common.netty.channel.ChannelException: Failed to bind to: /127.0.0.1:9001
at org.elasticsearch.common.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.transport.netty.NettyTransport$1.onPortNumber(NettyTransport.java:413) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.transport.PortsRange.iterate(PortsRange.java:58) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.transport.netty.NettyTransport.bindServerBootstrap(NettyTransport.java:409) ~[elasticsearch-1.7.5.jar:na]
... 8 common frames omitted
Caused by: java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method) ~[na:1.8.0_102]
at sun.nio.ch.Net.bind(Net.java:433) ~[na:1.8.0_102]
at sun.nio.ch.Net.bind(Net.java:425) ~[na:1.8.0_102]
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) ~[na:1.8.0_102]
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) ~[na:1.8.0_102]
at org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:391) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:315) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) ~[elasticsearch-1.7.5.jar:na]
at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) ~[elasticsearch-1.7.5.jar:na]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[na:1.8.0_102]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[na:1.8.0_102]
at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_102]
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] stopping ...
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] stopped
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] closing ...
2017.03.16 16:54:45 INFO es[o.elasticsearch.node] [sonar-1489679683034] closed
Server OS: Linux
Problem solved, and I'm a bit ashamed: as the log said, port 9001 was already in use on the server (probably by a previous process of my own), so Elasticsearch could not bind to it.
I simply changed the port to 9002.
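If you run into the same thing, here is a minimal sketch for confirming the conflict and moving the port, assuming a Linux server and the default conf/sonar.properties location (9002 is just an example of a free port):
# see which process is already bound to 9001
sudo ss -ltnp | grep ':9001'
# or, on older systems:
sudo netstat -tlnp | grep ':9001'
# then point the embedded Elasticsearch at a free port in conf/sonar.properties:
# sonar.search.port=9002
After changing the port, restart SonarQube so the search process picks up the new value.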

Sonar qube is not getting started

While running StartSonar.bat from the command line I keep getting the following error continuously; the SonarQube log file shows the messages below.
Any help is appreciated!
--> Wrapper Started as Console
Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2015.06.13 15:04:41 INFO app[o.s.p.m.JavaProcessLauncher] Launch process[search]: C:\Program Files\Java\jdk1.8.0_45\jre\bin\java - Djava.awt.headless=true -Xmx1G -Xms256m -Xss256k -Djava.net.preferIPv4Stack=true -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Djava.io.tmpdir=D:\sonarqube-5.1\sonarqube-5.1\temp -cp ./lib/common/*;./lib/search/* org.sonar.search.SearchServer C:\Users\rkutchar\AppData\Local\Temp\sq-process3411693551115002418properties
2015.06.13 15:04:42 INFO es[o.s.p.ProcessEntryPoint] Starting search
2015.06.13 15:04:42 INFO es[o.s.s.SearchServer] Starting Elasticsearch[sonarqube] on port 9001
2015.06.13 15:04:42 INFO es[o.elasticsearch.node] [sonar-1434188081817] version[1.4.4], pid[7068], build[c88f77f/2015-02-19T13:05:36Z]
2015.06.13 15:04:42 INFO es[o.elasticsearch.node] [sonar-1434188081817] initializing ...
2015.06.13 15:04:42 INFO es[o.e.plugins] [sonar-1434188081817] loaded [], sites []
2015.06.13 15:04:43 INFO es[o.elasticsearch.node] [sonar-1434188081817] initialized
2015.06.13 15:04:43 INFO es[o.elasticsearch.node] [sonar-1434188081817] starting ...
2015.06.13 15:04:44 INFO es[o.e.transport] [sonar-1434188081817] bound_address {inet[/0.0.0.0:9001]}, publish_address {inet[/10.76.17.174:9001]}
2015.06.13 15:04:44 INFO es[o.e.discovery] [sonar-1434188081817] sonarqube/Y7PopLbZQ8Gqvlg6o70C3g
2015.06.13 15:04:47 INFO es[o.e.cluster.service] [sonar-1434188081817] new_master [sonar-1434188081817][Y7PopLbZQ8Gqvlg6o70C3g][DIN35003079][inet[/10.76.17.174:9001]]{rack_id=sonar-1434188081817}, reason: zen-disco-join (elected_as_master)
2015.06.13 15:04:47 INFO es[o.elasticsearch.node] [sonar-1434188081817] started
2015.06.13 15:04:47 INFO es[o.e.gateway] [sonar-1434188081817] recovered [6] indices into cluster_state
2015.06.13 15:04:48 INFO app[o.s.p.m.Monitor] Process[search] is up
2015.06.13 15:04:48 INFO app[o.s.p.m.JavaProcessLauncher] Launch process[web]: C:\Program Files\Java\jdk1.8.0_45\jre\bin\java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.management.enabled=false -Djruby.compile.invokedynamic=false -Xmx768m -XX:MaxPermSize=160m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true -Djava.io.tmpdir=D:\sonarqube-5.1\sonarqube-5.1\temp -cp ./lib/common/*;./lib/server/*;D:\sonarqube-5.1\sonarqube-5.1\lib\jdbc\mysql\mysql-connector-java-5.1.34.jar org.sonar.server.app.WebServer C:\Users\rkutchar\AppData\Local\Temp\sq-process4890757865030388998properties
2015.06.13 15:04:49 INFO web[o.s.p.ProcessEntryPoint] Starting web
2015.06.13 15:04:49 INFO web[o.s.s.app.Webapp] Webapp directory: D:\sonarqube-5.1\sonarqube-5.1\web
2015.06.13 15:04:49 INFO web[o.a.c.h.Http11NioProtocol] Initializing ProtocolHandler ["http-nio-0.0.0.0-9000"]
2015.06.13 15:04:49 INFO web[o.a.t.u.n.NioSelectorPool] Using a shared selector for servlet write/read
2015.06.13 15:04:50 INFO web[o.e.plugins] [sonar-1434188081817] loaded [], sites []
2015.06.13 15:04:50 INFO web[o.s.s.p.ServerImpl] SonarQube Server / 5.1 / 4aa9af3a6a4362b61db365fba32eb0a55d411e7a
2015.06.13 15:04:50 INFO web[o.s.c.p.Database] Create JDBC datasource for jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance
2015.06.13 15:04:50 ERROR web[o.a.c.c.C.[.[.[/]] Exception sending context initialized event to listener instance of class org.sonar.server.platform.PlatformServletContextListener
java.lang.IllegalStateException: Can not connect to database. Please check connectivity and settings (see the properties prefixed by 'sonar.jdbc.').
at org.sonar.core.persistence.DefaultDatabase.checkConnection(DefaultDatabase.java:117) ~[sonar-core-5.1.jar:na]
at org.sonar.core.persistence.DefaultDatabase.start(DefaultDatabase.java:73) ~[sonar-core-5.1.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_45]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_45]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_45]
at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_45]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.behaviors.Stored.start(Stored.java:110) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1015) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1008) ~[picocontainer-2.14.3.jar:na]
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:766) ~[picocontainer-2.14.3.jar:na]
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:91) ~[sonar-plugin-api-5.1.jar:na]
at org.sonar.server.platform.Platform.startLevel1Container(Platform.java:96) ~[sonar-server-5.1.jar:na]
at org.sonar.server.platform.Platform.init(Platform.java:72) ~[sonar-server-5.1.jar:na]
at org.sonar.server.platform.PlatformServletContextListener.contextInitialized(PlatformServletContextListener.java:43) ~[sonar-server-5.1.jar:na]
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4720) [tomcat-embed-core-8.0.18.jar:8.0.18]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5154) [tomcat-embed-core-8.0.18.jar:8.0.18]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) [tomcat-embed-core-8.0.18.jar:8.0.18]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1409) [tomcat-embed-core-8.0.18.jar:8.0.18]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1399) [tomcat-embed-core-8.0.18.jar:8.0.18]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_45]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Access denied for user 'sonar'@'localhost' (using password: YES))
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044) ~[commons-dbcp-1.4.jar:1.4]
at org.sonar.core.persistence.DefaultDatabase.checkConnection(DefaultDatabase.java:115) ~[sonar-core-5.1.jar:na]
... 27 common frames omitted
Caused by: java.sql.SQLException: Access denied for user 'sonar'@'localhost' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:996) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1206) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_45]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_45]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_45]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_45]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545) ~[commons-dbcp-1.4.jar:1.4]
... 30 common frames omitted
2015.06.13 15:04:51 INFO web[jruby.rack] jruby 1.7.9 (ruby-1.8.7p370) 2013-12-06 87b108a on Java HotSpot(TM) 64-Bit Server VM 1.8.0_45-b15 [Windows 7-amd64]
2015.06.13 15:04:51 INFO web[jruby.rack] using a shared (threadsafe!) runtime
EDIT:
apply plugin: "sonar-runner"

sonarRunner {
    sonarProperties {
        // can also be set on the command line, e.g. -Dsonar.analysis.mode=incremental
        property "sonar.host.url", "http://localhost:9000"
        property "sonar.jdbc.url", "jdbc:mysql://sonar.someserver.int:3306/sonar"
        property "sonar.jdbc.driverClassName", "com.mysql.jdbc.Driver"
        property "sonar.jdbc.username", "****"
        property "sonar.jdbc.password", "****"
        // I added these properties to my build.gradle
        property "sonar.projectKey", "com.example.rkutchar.myapplication"
        property "sonar.projectName", "MyApplication"
        property "sonar.projectVersion", "V1.0"
        property "sonar.language", "java"
        property "sonar.sources", "src/main/java"
        property "sonar.binaries", "build"
    }
}

subprojects {
    sonarRunner {
        sonarProperties {
            properties["sonar.sources"] += "src/main/java"
        }
    }
}

sonarRunner {
    toolVersion = '2.4'
}
PROJECT.PROPERTIES FILE
# Required metadata
sonar.projectKey=MyApplication
sonar.projectName=My Application
sonar.projectVersion=1.0
# Comma-separated paths to directories with sources (required)
sonar.sources=src/main/java
# Language
sonar.language=java
# Encoding of the source files
sonar.sourceEncoding=UTF-8
conf/sonar.properties
#Configure here general information about the environment, such as SonarQube DB details for example
#No information about specific project should appear here
#----- Default SonarQube server
#sonar.host.url=http://localhost:9000
#----- PostgreSQL
#sonar.jdbc.url=jdbc:postgresql://localhost/sonar
#----- MySQl
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8
#----- Oracle
#sonar.jdbc.url=jdbc:oracle:thin:@localhost/XE
#----- Microsoft SQLServer
#sonar.jdbc.url=jdbc:jtds:sqlserver://localhost/sonar;SelectMethod=Cursor
#----- Global database settings
#sonar.jdbc.username=sonar
#sonar.jdbc.password=sonar
#----- Default source code encoding
#sonar.sourceEncoding=UTF-8
#----- Security (when 'sonar.forceAuthentication' is set to 'true')
#sonar.login=admin
#sonar.password=admin
It seems that the permissions are not correct on your database. You have to execute the following statement on your MySQL database:
GRANT ALL PRIVILEGES ON `sonar`.* TO 'sonar'@'localhost';
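If the sonar user or database does not exist yet, a rough end-to-end sketch from a shell with the mysql client on the PATH looks like this (the database name, user name and password are assumptions; use the values from your sonar.jdbc.* settings):
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS sonar CHARACTER SET utf8;"
mysql -u root -p -e "CREATE USER 'sonar'@'localhost' IDENTIFIED BY 'sonar'; GRANT ALL PRIVILEGES ON sonar.* TO 'sonar'@'localhost'; FLUSH PRIVILEGES;"
# verify with the same credentials SonarQube will use:
mysql -u sonar -p -h localhost sonar -e "SELECT 1;"
If the last command is rejected with the same "Access denied" error, the problem is purely on the MySQL side and has nothing to do with SonarQube itself.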
As I explained, you should proceed step by step: first get the server started correctly, then analyze your Android project. There is no point spending time on the analyzer while the server is not starting.
Most likely you placed the JDBC driver in the right folder, so the problem probably isn't a missing driver, but double-check that anyway.
I also think you may have got the sonar.jdbc.url property wrong. Which database are you actually using?
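On SonarQube 5.1 the analyzer still connects to the database directly, so one way to rule out a wrong URL is to pass the JDBC settings explicitly when launching the analysis, the same way the comment in your build file suggests for other properties. This is only a sketch; the URL and credentials are assumptions and must match what the server uses (note that your build currently points at sonar.someserver.int while the server configuration points at localhost):
gradlew sonarRunner -Dsonar.host.url=http://localhost:9000 -Dsonar.jdbc.url="jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8" -Dsonar.jdbc.username=sonar -Dsonar.jdbc.password=sonar
If the analysis then reaches the database, the remaining problem is in the properties the build normally passes.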
Regards,
Karthik Prabhu

Resources