Unable to access JMX Console on ActiveMQ 5.12.0 VM - vagrant

I am attempting to get the JMX console working on a Vagrant VM with ActiveMQ 5.12.0 installed manually. I've followed the guide at
http://activemq.apache.org/jmx.html
and numerous Stack Overflow troubleshooting questions:
Apache ActiveMQ browser can't connect to JMX console
How do I turn on JMX in ActiveMQ 5.2
configure JMX for ActiveMQ for remoting access
but I am still not able to access the console from the host machine.
I've taken the following steps:
added useJmx="true" to the <broker> element in activemq.xml
set the following managementContext in activemq.xml:
<managementContext createConnector="true" rmiServerPort="1098" connectorPort="1099"/>
set the ACTIVEMQ_SUNJMX_START environment variable to:
"\
-Dcom.sun.management.jmxremote.ssl=false\
-Dcom.sun.management.jmxremote.password.file=/usr/share/activemq/conf/jmx.password\
-Dcom.sun.management.jmxremote.access.file=/usr/share/activemq/conf/jmx.access\
"
attempted to connect to
service:jmx:rmi://192.168.150.117:1098/jndi/rmi://192.168.150.117:1099/jmxrmi
using admin:activemq as username:password (as specified in jmx.access and jmx.password)
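Before digging further, it is worth confirming from the host that both JMX ports are actually reachable; a minimal check, assuming the VM's private IP is 192.168.150.117 as above and that nc and jconsole are available on the host:
nc -zv 192.168.150.117 1099   # RMI registry port (connectorPort)
nc -zv 192.168.150.117 1098   # RMI server port (rmiServerPort)
jconsole service:jmx:rmi://192.168.150.117:1098/jndi/rmi://192.168.150.117:1099/jmxrmi
If the ports are not reachable, the problem is Vagrant networking or firewalling rather than the JMX configuration itself.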
This is the activemq.log file:
2015-11-20 12:24:39,710 | INFO | Refreshing org.apache.activemq.xbean.XBeanBrokerFactory$1@190c4838: startup date [Fri Nov 20 12:24:39 GMT 2015]; root of context hierarchy | org.apache.activemq.xbean.XBeanBrokerFactory$1 | main
2015-11-20 12:24:42,094 | INFO | PListStore:[/usr/share/activemq/data/localhost/tmp_storage] started | org.apache.activemq.store.kahadb.plist.PListStoreImpl | main
2015-11-20 12:24:42,191 | INFO | Using Persistence Adapter: KahaDBPersistenceAdapter[/usr/share/activemq/data/kahadb] | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:42,232 | INFO | JMX consoles can connect to service:jmx:rmi://localhost:1098/jndi/rmi://localhost:1099/jmxrmi | org.apache.activemq.broker.jmx.ManagementContext | JMX connector
2015-11-20 12:24:42,464 | INFO | Apache ActiveMQ 5.12.0 (localhost, ID:activemq.cdl.vm-56262-1448022282329-0:1) is starting | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:42,492 | INFO | Listening for connections at: tcp://activemq.cdl.vm:61616?maximumConnections=1000&wireFormat.maxFrameSize=104857600 | org.apache.activemq.transport.TransportServerThreadSupport | main
2015-11-20 12:24:42,498 | INFO | Connector openwire started | org.apache.activemq.broker.TransportConnector | main
2015-11-20 12:24:42,505 | INFO | Listening for connections at: amqp://activemq.cdl.vm:5672?maximumConnections=1000&wireFormat.maxFrameSize=104857600 | org.apache.activemq.transport.TransportServerThreadSupport | main
2015-11-20 12:24:42,510 | INFO | Connector amqp started | org.apache.activemq.broker.TransportConnector | main
2015-11-20 12:24:42,525 | INFO | Listening for connections at: stomp://activemq.cdl.vm:61613?maximumConnections=1000&wireFormat.maxFrameSize=104857600 | org.apache.activemq.transport.TransportServerThreadSupport | main
2015-11-20 12:24:42,532 | INFO | Connector stomp started | org.apache.activemq.broker.TransportConnector | main
2015-11-20 12:24:42,542 | INFO | Listening for connections at: mqtt://activemq.cdl.vm:1883?maximumConnections=1000&wireFormat.maxFrameSize=104857600 | org.apache.activemq.transport.TransportServerThreadSupport | main
2015-11-20 12:24:42,550 | INFO | Connector mqtt started | org.apache.activemq.broker.TransportConnector | main
2015-11-20 12:24:42,693 | INFO | Listening for connections at ws://activemq.cdl.vm:61614?maximumConnections=1000&wireFormat.maxFrameSize=104857600 | org.apache.activemq.transport.ws.WSTransportServer | main
2015-11-20 12:24:42,698 | INFO | Connector ws started | org.apache.activemq.broker.TransportConnector | main
2015-11-20 12:24:42,706 | INFO | Apache ActiveMQ 5.12.0 (localhost, ID:activemq.cdl.vm-56262-1448022282329-0:1) started | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:42,720 | INFO | For help or more information please see: http://activemq.apache.org | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:42,730 | WARN | Store limit is 102400 mb (current store usage is 0 mb). The data directory: /usr/share/activemq/data/kahadb only has 10221 mb of usable space - resetting to maximum available disk space: 10221 mb | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:42,737 | WARN | Temporary Store limit is 51200 mb, whilst the temporary data directory: /usr/share/activemq/data/localhost/tmp_storage only has 10221 mb of usable space - resetting to maximum available 10221 mb. | org.apache.activemq.broker.BrokerService | main
2015-11-20 12:24:43,444 | INFO | ActiveMQ WebConsole available at http://0.0.0.0:8161/ | org.apache.activemq.web.WebConsoleStarter | main
2015-11-20 12:24:43,444 | INFO | ActiveMQ Jolokia REST API available at http://0.0.0.0:8161/api/jolokia/ | org.apache.activemq.web.WebConsoleStarter | main
2015-11-20 12:24:43,538 | INFO | Initializing Spring FrameworkServlet 'dispatcher' | /admin | main
2015-11-20 12:24:43,950 | INFO | jolokia-agent: No access restrictor found at classpath:/jolokia-access.xml, access to all MBeans is allowed | /api | main
When I try to connect, it just says "Secure Connection Failed. Retry insecurely?". It then retries and fails again with "Connection Failed: Retry?".

I set the following options on the ACTIVEMQ_SUNJMX_START environment variable:
export ACTIVEMQ_SUNJMX_START="\
-Dcom.sun.management.jmxremote\
-Dcom.sun.management.jmxremote.ssl=false\
-Dcom.sun.management.jmxremote.authenticate=false\
-Dcom.sun.management.jmxremote.local.only=false\
"
With those options set, the JMX connection from the host now works.
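For reference, with SSL and authentication disabled this way the connection can be tested from the host with something like the following (assuming the same VM IP and ports as in the question; no credentials are required any more):
jconsole service:jmx:rmi://192.168.150.117:1098/jndi/rmi://192.168.150.117:1099/jmxrmi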

Related

Geth exit from docker when Unclean shutdown detected

My Geth server running in Docker started failing today: it exits without any obvious error message. Before this happened, my ethers.js code hung at the wait() call on a transaction after running sendTransaction(). I then restarted the container by removing it and rerunning the Docker image. After that, Geth was stuck on the "Head state missing, repairing" message for a while, and then this error started appearing.
Can anyone help identify the potential issue and a solution? Thank you.
ubuntu@ip-172-31-37-178:~/bc$ sudo docker-compose up ethereum
Creating ethereum ... done
Attaching to ethereum
ethereum | INFO [11-13|08:34:22.777] Maximum peer count ETH=50 LES=0 total=50
ethereum | INFO [11-13|08:34:22.782] Smartcard socket not found, disabling err="stat /run/pcscd/pcscd.comm: no such file or directory"
ethereum | WARN [11-13|08:34:22.787] Sanitizing cache to Go's GC limits provided=1024 updated=322
ethereum | INFO [11-13|08:34:22.788] Set global gas cap cap=50,000,000
ethereum | INFO [11-13|08:34:22.790] Allocated trie memory caches clean=48.00MiB dirty=80.00MiB
ethereum | INFO [11-13|08:34:22.792] Allocated cache and file handles database=/dapp/geth/chaindata cache=160.00MiB handles=524,288
ethereum | INFO [11-13|08:34:22.874] Opened ancient database database=/dapp/geth/chaindata/ancient/chain readonly=false
ethereum | INFO [11-13|08:34:22.921]
ethereum | INFO [11-13|08:34:22.921] ---------------------------------------------------------------------------------------------------------------------------------------------------------
ethereum | INFO [11-13|08:34:22.922] Chain ID: 202208071 (unknown)
ethereum | INFO [11-13|08:34:22.922] Consensus: unknown
ethereum | INFO [11-13|08:34:22.923]
ethereum | INFO [11-13|08:34:22.923] Pre-Merge hard forks:
ethereum | INFO [11-13|08:34:22.924] - Homestead: 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/homestead.md)
ethereum | INFO [11-13|08:34:22.924] - Tangerine Whistle (EIP 150): 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/tangerine-whistle.md)
ethereum | INFO [11-13|08:34:22.924] - Spurious Dragon/1 (EIP 155): 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/spurious-dragon.md)
ethereum | INFO [11-13|08:34:22.925] - Spurious Dragon/2 (EIP 158): 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/spurious-dragon.md)
ethereum | INFO [11-13|08:34:22.925] - Byzantium: 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/byzantium.md)
ethereum | INFO [11-13|08:34:22.926] - Constantinople: 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/constantinople.md)
ethereum | INFO [11-13|08:34:22.926] - Petersburg: 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/petersburg.md)
ethereum | INFO [11-13|08:34:22.927] - Istanbul: 0 (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/istanbul.md)
ethereum | INFO [11-13|08:34:22.927] - Berlin: <nil> (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/berlin.md)
ethereum | INFO [11-13|08:34:22.928] - London: <nil> (https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/london.md)
ethereum | INFO [11-13|08:34:22.928]
ethereum | INFO [11-13|08:34:22.928] The Merge is not yet available for this network!
ethereum | INFO [11-13|08:34:22.929] - Hard-fork specification: https://github.com/ethereum/execution-specs/blob/master/network-upgrades/mainnet-upgrades/paris.md)
ethereum | INFO [11-13|08:34:22.929] ---------------------------------------------------------------------------------------------------------------------------------------------------------
ethereum | INFO [11-13|08:34:22.930]
ethereum | INFO [11-13|08:34:22.967] Disk storage enabled for ethash caches dir=/dapp/geth/ethash count=3
ethereum | INFO [11-13|08:34:22.968] Disk storage enabled for ethash DAGs dir=/root/.ethash count=2
ethereum | INFO [11-13|08:34:22.981] Initialising Ethereum protocol network=202,208,071 dbversion=8
ethereum | INFO [11-13|08:34:23.030] Loaded most recent local header number=459,372 hash=ec2d3b..753a58 td=467,564 age=0
ethereum | INFO [11-13|08:34:23.031] Loaded most recent local full block number=0 hash=b32e7d..0b3f25 td=8192 age=53y7mo2w
ethereum | INFO [11-13|08:34:23.031] Loaded most recent local fast block number=459,372 hash=ec2d3b..753a58 td=467,564 age=0
ethereum | WARN [11-13|08:34:23.054] Loaded snapshot journal diskroot=7ddd2b..767148 diffs=unmatched
ethereum | INFO [11-13|08:34:23.058] Setting new local account address=0xaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
ethereum | INFO [11-13|08:34:23.058] Loaded local transaction journal transactions=8 dropped=0
ethereum | INFO [11-13|08:34:23.060] Regenerated local transaction journal transactions=8 accounts=1
ethereum | INFO [11-13|08:34:23.060] Gasprice oracle is ignoring threshold set threshold=2
ethereum | WARN [11-13|08:34:23.061] Old unclean shutdowns found count=5
ethereum | WARN [11-13|08:34:23.061] Unclean shutdown detected booted=2022-11-11T06:14:21+0000 age=2d2h20m
ethereum | WARN [11-13|08:34:23.062] Unclean shutdown detected booted=2022-11-11T06:16:04+0000 age=2d2h18m
ethereum | WARN [11-13|08:34:23.076] Unclean shutdown detected booted=2022-11-11T12:59:51+0000 age=1d19h34m
ethereum | WARN [11-13|08:34:23.076] Unclean shutdown detected booted=2022-11-11T13:09:13+0000 age=1d19h25m
ethereum | WARN [11-13|08:34:23.077] Unclean shutdown detected booted=2022-11-11T13:11:27+0000 age=1d19h22m
ethereum | WARN [11-13|08:34:23.077] Unclean shutdown detected booted=2022-11-13T04:14:39+0000 age=4h19m44s
ethereum | WARN [11-13|08:34:23.077] Unclean shutdown detected booted=2022-11-13T08:21:45+0000 age=12m38s
ethereum | WARN [11-13|08:34:23.078] Unclean shutdown detected booted=2022-11-13T08:23:56+0000 age=10m27s
ethereum | WARN [11-13|08:34:23.078] Unclean shutdown detected booted=2022-11-13T08:26:07+0000 age=8m16s
ethereum | WARN [11-13|08:34:23.079] Unclean shutdown detected booted=2022-11-13T08:27:21+0000 age=7m2s
ethereum | WARN [11-13|08:34:23.079] Unclean shutdown detected booted=2022-11-13T08:29:00+0000 age=5m23s
ethereum | WARN [11-13|08:34:23.080] Engine API enabled protocol=eth
ethereum | WARN [11-13|08:34:23.083] Engine API started but chain not configured for merge yet
ethereum | INFO [11-13|08:34:23.099] Starting peer-to-peer node instance=Geth/miner_node/v1.10.22-unstable/linux-amd64/go1.18.5
ethereum | INFO [11-13|08:34:23.164] IPC endpoint opened url=/dapp/geth.ipc
ethereum | INFO [11-13|08:34:23.167] Loaded JWT secret file path=/dapp/geth/jwtsecret crc32=0xe55aa567
ethereum | INFO [11-13|08:34:23.167] HTTP server started endpoint=[::]:8545 auth=false prefix= cors=https://remix.ethereum.org vhosts=*
ethereum | INFO [11-13|08:34:23.168] WebSocket enabled url=ws://127.0.0.1:8551
ethereum | INFO [11-13|08:34:23.168] HTTP server started endpoint=127.0.0.1:8551 auth=true prefix= cors=localhost vhosts=localhost
ethereum | INFO [11-13|08:34:23.188] New local node record seq=1,660,218,854,068 id=65fe1ee283dfe396 ip=127.0.0.1 udp=0 tcp=30304
ethereum | INFO [11-13|08:34:23.189] Started P2P networking self="enode://xxxxxxxxxxxxxxxxxxxxxxxx@127.0.0.1:30304?discport=0"
ethereum | Killed
ethereum exited with code 137
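Exit code 137 corresponds to SIGKILL (128 + 9), which together with the bare "Killed" line above usually means the kernel OOM killer or a container memory limit terminated the process rather than Geth crashing on its own. A minimal diagnostic sketch, assuming the container is named ethereum and that docker and dmesg can be run on the host:
docker inspect ethereum --format '{{.State.OOMKilled}} {{.State.ExitCode}}'   # "true 137" points at an OOM kill
dmesg | grep -iE 'killed process|out of memory'                               # kernel-side evidence
If memory pressure is confirmed, reducing Geth's cache (for example --cache 256) and/or raising the container's memory limit in docker-compose.yml are the usual remedies.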

Apache Karaf enable Gemini blueprint

I've downloaded Karaf 3.0.8 and tried to enable Spring 3.2 and Gemini Blueprint, without success:
karaf@root()> feature:install spring/3.2.17.RELEASE_1
karaf@root()> feature:install gemini-blueprint
Exception in thread "EclipseGeminiBlueprintExtenderThread-25" org.springframework.beans.factory.xml.XmlBeanDefinitionStoreException: Line 22 in XML document from URL [bundle://55.0:0/OSGI-INF/blueprint/kar-deployer.xml] is invalid;
nested exception is org.xml.sax.SAXParseException: cvc-complex-type.2.4.c:
The matching wildcard is strict, but no declaration can be found for element 'ext:property-placeholder'.
....
karaf@root()> feature:list -i
Name | Version | Installed | Repository | Description
---------------------------------------------------------------------------------------------------------------------
spring | 3.2.17.RELEASE_1 | x | spring-3.0.8 | Spring 3.2.x support
gemini-blueprint | 1.0.0.RELEASE | x | spring-3.0.8 | Gemini Blueprint Extender
standard | 3.0.8 | x | standard-3.0.8 | Karaf standard feature
config | 3.0.8 | x | standard-3.0.8 | Provide OSGi ConfigAdmin support
region | 3.0.8 | x | standard-3.0.8 | Provide Region Support
package | 3.0.8 | x | standard-3.0.8 | Package commands and mbeans
kar | 3.0.8 | x | standard-3.0.8 | Provide KAR (KARaf archive) support
ssh | 3.0.8 | x | standard-3.0.8 | Provide a SSHd server on Karaf
management | 3.0.8 | x | standard-3.0.8 | Provide a JMX MBeanServer and a set of MBeans in K
karaf@root()> list -t 0
START LEVEL 100 , List Threshold: 0
ID | State | Lvl | Version | Name
------------------------------------------------------------------------------------------------------------
0 | Active | 0 | 4.2.1 | System Bundle
1 | Active | 5 | 2.4.7 | OPS4J Pax Url - aether:
2 | Active | 5 | 2.4.7 | OPS4J Pax Url - wrap:
3 | Active | 8 | 1.8.4 | OPS4J Pax Logging - API
4 | Active | 8 | 1.8.4 | OPS4J Pax Logging - Service
5 | Active | 10 | 3.0.8 | Apache Karaf :: Service :: Guard
6 | Active | 10 | 1.8.4 | Apache Felix Configuration Admin Service
7 | Active | 11 | 3.5.2 | Apache Felix File Install
8 | Active | 12 | 5.0.3 | ASM all classes with debug info
9 | Active | 20 | 1.1.1 | Apache Aries Util
10 | Active | 20 | 1.0.1 | Apache Aries Proxy API
11 | Active | 20 | 1.0.8 | Apache Aries Blueprint CM
12 | Active | 20 | 1.0.4 | Apache Aries Proxy Service
13 | Active | 20 | 1.0.1 | Apache Aries Blueprint API
14 | Resolved | 20 | 1.0.0 | Apache Aries Blueprint Core Compatiblity Fragment Bundle, Hosts: 15
15 | Active | 20 | 1.6.1 | Apache Aries Blueprint Core, Fragments: 14
16 | Active | 24 | 3.0.8 | Apache Karaf :: Deployer :: Spring
17 | Active | 24 | 3.0.8 | Apache Karaf :: Deployer :: Blueprint
18 | Active | 24 | 3.0.8 | Apache Karaf :: Deployer :: Wrap Non OSGi Jar
19 | Active | 25 | 3.0.8 | Apache Karaf :: Region :: Core
20 | Active | 25 | 3.0.8 | Apache Karaf :: Features :: Core
21 | Active | 26 | 3.0.8 | Apache Karaf :: Deployer :: Features
22 | Active | 30 | 2.13.0 | JLine
23 | Active | 30 | 0.2.1 | JLEdit :: Core
24 | Active | 30 | 3.0.8 | Apache Karaf :: Features :: Command
25 | Active | 30 | 3.0.8 | Apache Karaf :: Bundle :: Core
26 | Active | 30 | 3.0.8 | Apache Karaf :: Bundle :: Commands
27 | Active | 30 | 3.0.8 | Apache Karaf :: Shell :: Console
28 | Active | 30 | 3.0.8 | Apache Karaf :: JAAS :: Modules
29 | Active | 30 | 3.0.8 | Apache Karaf :: JAAS :: Config
30 | Active | 30 | 0.14.0 | Apache Mina SSHD :: Core
31 | Active | 30 | 3.0.8 | Apache Karaf :: Shell :: Help System
32 | Active | 30 | 3.0.8 | Apache Karaf :: Shell :: Table
33 | Active | 30 | 3.0.8 | Apache Karaf :: System :: Core
34 | Active | 30 | 3.0.8 | Apache Karaf :: System :: Shell Commands
35 | Active | 30 | 3.0.8 | Apache Karaf :: Shell :: Various Commands
36 | Active | 30 | 1.0.0 | Apache Aries Quiesce API
37 | Active | 30 | 3.0.8 | Apache Karaf :: ConfigAdmin :: Core
38 | Active | 30 | 3.0.8 | Apache Karaf :: ConfigAdmin :: Commands
39 | Active | 30 | 3.0.8 | Apache Karaf :: Instance :: Core
40 | Active | 30 | 3.0.8 | Apache Karaf :: Instance :: Command
41 | Active | 30 | 3.0.8 | Apache Karaf :: JAAS :: Command
42 | Active | 30 | 3.0.8 | Apache Karaf :: Diagnostic :: Core
43 | Active | 30 | 3.0.8 | Apache Karaf :: Diagnostic :: Command
44 | Active | 30 | 3.0.8 | Apache Karaf :: Log :: Core
45 | Active | 30 | 3.0.8 | Apache Karaf :: Log :: Command
46 | Active | 30 | 3.0.8 | Apache Karaf :: Service :: Core
47 | Active | 30 | 3.0.8 | Apache Karaf :: Service :: Command
48 | Active | 30 | 1.0.0.v20110524 | Region Digraph
49 | Active | 30 | 3.0.8 | Apache Karaf :: Region :: Persistence
50 | Active | 30 | 3.0.8 | Apache Karaf :: Region :: Shell Commands
51 | Active | 30 | 3.0.8 | Apache Karaf :: Package :: Core
52 | Active | 30 | 3.0.8 | Apache Karaf :: Package :: Commands
53 | Active | 30 | 3.0.8 | Apache Karaf :: KAR :: Core
54 | Active | 30 | 3.0.8 | Apache Karaf :: KAR :: Command
55 | Active | 30 | 3.0.8 | Apache Karaf :: Deployer :: Karaf Archive (.kar)
56 | Active | 30 | 2.0.7 | Apache MINA Core
57 | Active | 30 | 3.0.8 | Apache Karaf :: Shell :: SSH
58 | Active | 30 | 3.0.8 | Apache Karaf :: Management
59 | Active | 30 | 1.1.1 | Apache Aries JMX API
60 | Active | 30 | 1.1.6 | Apache Aries JMX Core
61 | Active | 30 | 1.1.0 | Apache Aries JMX Blueprint API
62 | Active | 30 | 1.1.0 | Apache Aries JMX Blueprint Core
63 | Active | 30 | 1.0.0 | Apache Aries JMX Whiteboard
64 | Active | 30 | 1.0.0.6 | Apache ServiceMix :: Bundles :: aopalliance
65 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-core
66 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-expression
67 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-beans
68 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-aop
69 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-context
70 | Active | 30 | 3.2.17.RELEASE_1 | Apache ServiceMix :: Bundles :: spring-context-support
71 | Active | 30 | 1.0.0.RELEASE | gemini-blueprint-io
72 | Active | 30 | 1.0.0.RELEASE | gemini-blueprint-core
73 | Active | 30 | 1.0.0.RELEASE | gemini-blueprint-extender
After a restart I got the following exception printed for almost every bundle:
73 - org.eclipse.gemini.blueprint.extender - 1.0.0.RELEASE | Application context refresh failed (OsgiBundleXmlApplicationContext(bundle=org.apache.karaf.config.command, config=OSGI-INF/blueprint/shell-config.xml))
org.springframework.beans.factory.xml.XmlBeanDefinitionStoreException: Line 25 in XML document from URL [bundle://38.0:0/OSGI-INF/blueprint/shell-config.xml] is invalid; nested exception is org.xml.sax.SAXParseException: cvc-complex-type.2.4.c: The matching wildcard is strict, but no declaration can be found for element 'command-bundle'.
For some bundles the problem is with the command-bundle element, and for others with ext:property-placeholder.
So it looks like the Karaf core bundles have invalid blueprint XML? Is this a bug?
The Apache Karaf 2.x and 3.x versions use Blueprint internally.
The Blueprint specification is somewhat limited and does not define any extension mechanism through namespaces. Such extensions are supported by both Aries Blueprint and Gemini Blueprint, but in different ways.
Those Karaf versions rely on Aries Blueprint, so it's hard to get Gemini Blueprint working there.
Karaf 4.x no longer uses Blueprint internally, so it should be much easier to install Gemini Blueprint on it.
That said, Gemini Blueprint isn't very active any more, so you should have a look at the Aries Blueprint implementation instead. For what it's worth, Aries Blueprint also provides a full Spring compatibility layer if you need it.
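On Karaf 4.x the Aries route is essentially just a feature install; a sketch, assuming a stock Karaf 4 distribution where aries-blueprint ships as a standard feature (exact feature names can vary between Karaf versions):
karaf@root()> feature:install aries-blueprint
karaf@root()> bundle:list | grep -i blueprint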

Sonarqube 5.1.2 native package on RHEL6 won't start without root

I have recently installed the native RPM package of SonarQube 5.1.2 on my RHEL 6.6 server.
I'm using 64-bit OpenJDK 1.8.0_51 and MySQL as the database.
SonarQube starts up perfectly as root using the sonar.sh script in /bin within the installation folder (which is /opt/sonar in my case), but it doesn't work with the startup script created by the package in /etc/init.d.
Here is the log export:
Running SonarQube...
wrapper | --> Wrapper Started as Console
wrapper | Using tick timer.
wrapperp | server listening on port 32000.
wrapper | Command[0] : /etc/alternatives/java_sdk/bin/java
wrapper | Command[1] : -Djava.awt.headless=true
wrapper | Command[2] : -Xms3m
wrapper | Command[3] : -Xmx32m
wrapper | Command[4] : -Djava.library.path=./lib
wrapper | Command[5] : -classpath
wrapper | Command[6] : ../../lib/jsw/wrapper-3.2.3.jar:../../lib/sonar-application-5.1.2.jar
wrapper | Command[7] : -Dwrapper.key=sdELP0aWwf4S5hdM
wrapper | Command[8] : -Dwrapper.port=32000
wrapper | Command[9] : -Dwrapper.jvm.port.min=31000
wrapper | Command[10] : -Dwrapper.jvm.port.max=31999
wrapper | Command[11] : -Dwrapper.debug=TRUE
wrapper | Command[12] : -Dwrapper.pid=23176
wrapper | Command[13] : -Dwrapper.version=3.2.3
wrapper | Command[14] : -Dwrapper.native_library=wrapper
wrapper | Command[15] : -Dwrapper.cpu.timeout=10
wrapper | Command[16] : -Dwrapper.jvmid=1
wrapper | Command[17] : org.tanukisoftware.wrapper.WrapperSimpleApp
wrapper | Command[18] : org.sonar.application.App
wrapper | Launching a JVM...
jvm 1 | WrapperManager class initialized by thread: main Using classloader: sun.misc.Launcher$AppClassLoader@4e25154f
jvm 1 | Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
jvm 1 | Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
jvm 1 |
jvm 1 | Wrapper Manager: JVM #1
jvm 1 | Running a 64-bit JVM.
jvm 1 | Wrapper Manager: Registering shutdown hook
jvm 1 | Wrapper Manager: Using wrapper
jvm 1 | Load native library. One or more attempts may fail if platform specific libraries do not exist.
jvm 1 | Loading native library failed: libwrapper-linux-x86-64.so Cause: java.lang.UnsatisfiedLinkError: no wrapper-linux-x86-64 in java.library.path
jvm 1 | Loaded native library: libwrapper.so
jvm 1 | Calling native initialization method.
jvm 1 | Inside native WrapperManager initialization method
jvm 1 | Java Version : 1.8.0_51-b16 OpenJDK 64-Bit Server VM
jvm 1 | Java VM Vendor : Oracle Corporation
jvm 1 |
jvm 1 | Startup runner thread started.
jvm 1 | Control event monitor thread started.
jvm 1 | WrapperManager.start(org.tanukisoftware.wrapper.WrapperSimpleApp@4a574795, args[]) called by thread: main
jvm 1 | Communications runner thread started.
jvm 1 | Open socket to wrapper...Wrapper-Connection
jvm 1 | Failed attempt to bind using local port 31000
jvm 1 | Opened Socket from 31001 to 32000
jvm 1 | Send a packet KEY : sdELP0aWwf4S5hdM
jvm 1 | handleSocket(Socket[addr=/127.0.0.1,port=32000,localport=31001])
wrapperp | accepted a socket from 127.0.0.1 on port 31001
wrapperp | read a packet KEY : sdELP0aWwf4S5hdM
wrapper | Got key from JVM: sdELP0aWwf4S5hdM
wrapperp | send a packet LOW_LOG_LEVEL : 1
wrapperp | send a packet PING_TIMEOUT : 0
wrapperp | send a packet PROPERTIES : (Property Values)
wrapper | Start Application.
wrapperp | send a packet START : start
jvm 1 | Received a packet LOW_LOG_LEVEL : 1
jvm 1 | Wrapper Manager: LowLogLevel from Wrapper is 1
jvm 1 | Received a packet PING_TIMEOUT : 0
jvm 1 | PingTimeout from Wrapper is 0
jvm 1 | Received a packet PROPERTIES : (Property Values)
jvm 1 | Received a packet START : start
jvm 1 | calling WrapperListener.start()
jvm 1 | Waiting for WrapperListener.start runner thread to complete.
jvm 1 | WrapperListener.start runner thread started.
jvm 1 | WrapperSimpleApp: start(args) Will wait up to 2 seconds for the main method to complete.
jvm 1 | WrapperSimpleApp: invoking main method
jvm 1 | Wrapper Manager: ShutdownHook started
jvm 1 | WrapperManager.stop(0) called by thread: Wrapper-Shutdown-Hook
jvm 1 | Send a packet STOP : 0
jvm 1 | Startup runner thread stopped.
wrapperp | read a packet STOP : 0
wrapper | JVM requested a shutdown. (0)
wrapper | wrapperStopProcess(0) called.
wrapper | Sending stop signal to JVM
wrapperp | send a packet STOP : NULL
jvm 1 | Send a packet START_PENDING : 5000
wrapperp | read a packet START_PENDING : 5000
wrapper | JVM signalled a start pending with waitHint of 5000 millis.
jvm 1 | Thread, Wrapper-Shutdown-Hook, handling the shutdown process.
jvm 1 | shutdownJVM(0) Thread:Wrapper-Shutdown-Hook
jvm 1 | Send a packet STOPPED : 0
wrapperp | read a packet STOPPED : 0
wrapper | JVM signalled that it was stopped.
jvm 1 | Closing socket.
wrapperp | socket read no code (closed?).
wrapperp | server listening on port 32001.
jvm 1 | Wrapper Manager: ShutdownHook complete
wrapper | JVM exited normally.
wrapper | Signal trapped. Details:
wrapper | signal number=17 (SIGCHLD), source="unknown"
wrapper | Received SIGCHLD, checking JVM process status.
wrapper | JVM process exited with a code of 0, leaving the wrapper exit code set to 0.
wrapper | <-- Wrapper Stopped
I know it must be something to do with the permissions of the "sonar" user, which was also created by the install package. I already made sure that the /opt/sonar folder is owned by the sonar user, but these logs still don't tell me what is missing.
I also compared the above log with the one produced when I start the script as root, and I can point out at least some differences.
With root, after invoking the main method, the actual Java process starts properly:
.
.
jvm 1 | WrapperSimpleApp: invoking main method
jvm 1 | INFO app[o.s.p.m.JavaProcessLauncher] Launch process[search]: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.51-1.b16.el6_7.x86_64/jre/bin/java -Djava.awt.headless=true -Xmx1G -Xms256m -Xss256k -Djava.net.preferIPv4Stack=true -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -Djava.io.tmpdir=/opt/sonar/temp -cp ./lib/common/*:./lib/search/* org.sonar.search.SearchServer /tmp/sq-process3395120832375267732properties
With the sonar user, it is not started and the JVM (or wrapper) simply initiates a shutdown:
.
.
jvm 1 | WrapperSimpleApp: invoking main method
jvm 1 | Wrapper Manager: ShutdownHook started
jvm 1 | WrapperManager.stop(0) called by thread: Wrapper-Shutdown-Hook
Unfortunately I have no idea what might be missing for the sonar user, because according to the RPM package description the startup script should work without root privileges.
Can anyone help me or point me in the right direction? I really don't want to run SonarQube as root if it's not necessary.
Thank you for the help in advance!
Did you initially start Sonar using the root account (via sonar.sh)?
If so, did you chown -R the /opt/sonar directory to the sonar user?
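For completeness, a sketch of what that ownership fix and check usually look like, assuming the package created a sonar user, the installation lives in /opt/sonar, and the platform-specific wrapper script is under bin/linux-x86-64:
chown -R sonar:sonar /opt/sonar
ls -ld /opt/sonar/temp /opt/sonar/data /opt/sonar/logs   # all of these must be writable by the sonar user
sudo -u sonar /opt/sonar/bin/linux-x86-64/sonar.sh console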

How to configure ActiveMQ with LevelDB JNI driver?

I am having trouble forcing ActiveMQ 5.10.1 on CentOS 6 (64-bit) with Oracle JDK 8 to use the JNI driver for LevelDB. When I set indexFactory="org.fusesource.leveldbjni.JniDBFactory", my broker fails to start. When I omit it, it does start, but it uses the pure Java driver.
<persistenceAdapter>
<levelDB directory="${activemq.data}/leveldb" indexFactory="org.fusesource.leveldbjni.JniDBFactory"/>
</persistenceAdapter>
I did install the LevelDB RPM on my OS, but after AMQ starts, its log file indicates that it uses the pure Java driver:
2015-01-27 05:44:48,172 | INFO | Using Persistence Adapter: LevelDB[/home/roman/amq/DISK1/broker1/data/leveldb] | org.apache.activemq.broker.BrokerService | main
2015-01-27 05:44:48,233 | INFO | Using the pure java LevelDB implementation. | org.apache.activemq.leveldb.LevelDBClient | main
I tried adding the leveldbjni jar file to the classpath, but AMQ fails to find the classes. Here is the running broker command line (note the classpath includes leveldbjni-1.8.jar; the AMQ docs do not really explain whether those optional libs are loaded by default, so I added it by hand):
/home/roman/jdk1.8.0_31/bin/java -Xms1G -Xmx1G -Djava.util.logging.config.file=logging.properties -Djava.security.auth.login.config=/home/roman/amq/DISK1/broker1/conf/login.config -Dcom.sun.management.jmxremote -Djava.awt.headless=true -Djava.io.tmpdir=/home/roman/amq/DISK1/broker1/tmp -Dactivemq.classpath=;/home/roman/amq/DISK1/broker1/conf;/home/roman/apache-activemq-5.10.1/lib/optional/leveldbjni-1.8.jar -Dactivemq.home=/home/roman/apache-activemq-5.10.1 -Dactivemq.base=/home/roman/amq/DISK1/broker1 -Dactivemq.conf=/home/roman/amq/DISK1/broker1/conf -Dactivemq.data=/home/roman/amq/DISK1/broker1/data -jar /home/roman/apache-activemq-5.10.1/bin/activemq.jar start
Here is the error I get in the log file when forcing JNI to be used for LevelDB:
2015-01-28 05:07:34,904 | INFO | Refreshing org.apache.activemq.xbean.XBeanBrokerFactory$1@6276ae34: startup date [Wed Jan 28 05:07:34 PST 2015]; root of context hierarchy | org.apache.activemq.xbean.XBeanBrokerFactory$1 | main
2015-01-28 05:07:36,139 | INFO | Using Persistence Adapter: LevelDB[/home/roman/amq/DISK1/broker1/data/leveldb] | org.apache.activemq.broker.BrokerService | main
2015-01-28 05:07:36,193 | ERROR | Failed to start Apache ActiveMQ ([broker1, null], java.lang.Exception: Could not load any of the index factory classes: org.fusesource.leveldbjni.JniDBFactory) | org.apache.activemq.broker.BrokerService | main
2015-01-28 05:07:36,198 | INFO | Apache ActiveMQ 5.10.1 (broker1, null) is shutting down | org.apache.activemq.broker.BrokerService | main
2015-01-28 05:07:36,201 | INFO | Connector openwire stopped | org.apache.activemq.broker.TransportConnector | main
2015-01-28 05:07:36,205 | INFO | Connector amqp stopped | org.apache.activemq.broker.TransportConnector | main
2015-01-28 05:07:36,208 | INFO | Connector stomp stopped | org.apache.activemq.broker.TransportConnector | main
2015-01-28 05:07:36,212 | INFO | Connector mqtt stopped | org.apache.activemq.broker.TransportConnector | main
2015-01-28 05:07:36,215 | INFO | Connector ws stopped | org.apache.activemq.broker.TransportConnector | main
2015-01-28 05:07:36,229 | INFO | Stopped LevelDB[/home/roman/amq/DISK1/broker1/data/leveldb] | org.apache.activemq.leveldb.LevelDBStore | main
2015-01-28 05:07:36,240 | INFO | Apache ActiveMQ 5.10.1 (broker1, null) uptime 0.135 seconds | org.apache.activemq.broker.BrokerService | main
2015-01-28 05:07:36,243 | INFO | Apache ActiveMQ 5.10.1 (broker1, null) is shutdown | org.apache.activemq.broker.BrokerService | main
2015-01-28 05:07:36,246 | INFO | Closing org.apache.activemq.xbean.XBeanBrokerFactory$1@6276ae34: startup date [Wed Jan 28 05:07:34 PST 2015]; root of context hierarchy | org.apache.activemq.xbean.XBeanBrokerFactory$1 | main
2015-01-28 05:07:36,252 | WARN | Exception thrown from LifecycleProcessor on context close | org.apache.activemq.xbean.XBeanBrokerFactory$1 | main
java.lang.IllegalStateException: LifecycleProcessor not initialized - call 'refresh' before invoking lifecycle methods via the context: org.apache.activemq.xbean.XBeanBrokerFactory$1@6276ae34: startup date [Wed Jan 28 05:07:34 PST 2015]; root of context hierarchy
at org.springframework.context.support.AbstractApplicationContext.getLifecycleProcessor(AbstractApplicationContext.java:360)[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1057)[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:1010)[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.apache.activemq.hooks.SpringContextHook.run(SpringContextHook.java:30)[activemq-spring-5.10.1.jar:5.10.1]
at org.apache.activemq.broker.BrokerService.stop(BrokerService.java:809)[activemq-broker-5.10.1.jar:5.10.1]
at org.apache.activemq.xbean.XBeanBrokerService.stop(XBeanBrokerService.java:122)[activemq-spring-5.10.1.jar:5.10.1]
at org.apache.activemq.broker.BrokerService.start(BrokerService.java:601)[activemq-broker-5.10.1.jar:5.10.1]
at org.apache.activemq.xbean.XBeanBrokerService.afterPropertiesSet(XBeanBrokerService.java:73)[activemq-spring-5.10.1.jar:5.10.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_31]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_31]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_31]
at java.lang.reflect.Method.invoke(Method.java:483)[:1.8.0_31]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1638)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1579)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1509)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:296)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:293)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:628)[spring-beans-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.apache.xbean.spring.context.ResourceXmlApplicationContext.<init>(ResourceXmlApplicationContext.java:64)[xbean-spring-3.16.jar:3.16]
at org.apache.xbean.spring.context.ResourceXmlApplicationContext.<init>(ResourceXmlApplicationContext.java:52)[xbean-spring-3.16.jar:3.16]
at org.apache.activemq.xbean.XBeanBrokerFactory$1.<init>(XBeanBrokerFactory.java:104)[activemq-spring-5.10.1.jar:5.10.1]
at org.apache.activemq.xbean.XBeanBrokerFactory.createApplicationContext(XBeanBrokerFactory.java:104)[activemq-spring-5.10.1.jar:5.10.1]
at org.apache.activemq.xbean.XBeanBrokerFactory.createBroker(XBeanBrokerFactory.java:67)[activemq-spring-5.10.1.jar:5.10.1]
at org.apache.activemq.broker.BrokerFactory.createBroker(BrokerFactory.java:71)[activemq-broker-5.10.1.jar:5.10.1]
at org.apache.activemq.broker.BrokerFactory.createBroker(BrokerFactory.java:54)[activemq-broker-5.10.1.jar:5.10.1]
at org.apache.activemq.console.command.StartCommand.runTask(StartCommand.java:87)[activemq-console-5.10.1.jar:5.10.1]
at org.apache.activemq.console.command.AbstractCommand.execute(AbstractCommand.java:57)[activemq-console-5.10.1.jar:5.10.1]
at org.apache.activemq.console.command.ShellCommand.runTask(ShellCommand.java:150)[activemq-console-5.10.1.jar:5.10.1]
at org.apache.activemq.console.command.AbstractCommand.execute(AbstractCommand.java:57)[activemq-console-5.10.1.jar:5.10.1]
at org.apache.activemq.console.command.ShellCommand.main(ShellCommand.java:104)[activemq-console-5.10.1.jar:5.10.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_31]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_31]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_31]
at java.lang.reflect.Method.invoke(Method.java:483)[:1.8.0_31]
at org.apache.activemq.console.Main.runTaskClass(Main.java:262)[activemq.jar:5.10.1]
at org.apache.activemq.console.Main.main(Main.java:115)[activemq.jar:5.10.1]
The AMQ docs do not explain how to configure LevelDB for JNI to work. Any suggestions?
The reason I want the JNI driver is that with the pure Java driver my AMQ broker performs about 20% slower than KahaDB in an 80 concurrent user / 20 queue test. I am hoping JNI with LevelDB can make AMQ faster than the KahaDB configuration.
I know this is old, but I couldn't find the answer documented anywhere else. All I needed to do was download the appropriate JNI library into the ActiveMQ lib directory. For example:
wget -O /etc/activemq/apache-activemq-5.13.0/lib/optional/leveldbjni-linux64-1.8.jar http://central.maven.org/maven2/org/fusesource/leveldbjni/leveldbjni-linux64/1.8/leveldbjni-linux64-1.8.jar
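After dropping the jar into lib/optional and restarting the broker, the startup log can be used to confirm which implementation was picked up; a quick check, using the installation path from the wget example above (adjust to your own layout):
grep -i "LevelDB implementation" /etc/activemq/apache-activemq-5.13.0/data/activemq.log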

Error starting remote Bamboo agent: HTTP status code 500 received in response to fingerprint request

I have the following error when starting remote Bamboo agent:
INFO | jvm 1 | 2012/11/20 01:15:58 | 2012-11-20 01:15:58,235 INFO [WrapperSimpleAppMain] [RemoteAgentHomeLocatorForBootstrap] Agent home located at '/Users/user9066/bamboo-agent-home'
INFO | jvm 1 | 2012/11/20 01:15:58 | 2012-11-20 01:15:58,248 INFO [WrapperSimpleAppMain] [AgentUuidInitializer] Found agent UUID <snip> in temporary UUID file '/Users/user9066/bamboo-agent-home/uuid-temp.properties'
INFO | jvm 1
| 2012/11/20 01:15:58 | 2012-11-20 01:15:58,378 INFO [WrapperSimpleAppMain] [AgentContext] Requesting fingerprint, url: http://<bamboo-server-ip>:8090/bamboo/AgentServer/GetFingerprint.action?hostName=<remote-agent-ip>&version=3&agentUuid=<snip>
ERROR | wrapper | 2012/11/20 01:15:58 | JVM exited while starting the application.
INFO | jvm 1 | 2012/11/20 01:15:58 | Exiting due to fatal exception.
INFO | jvm 1 | 2012/11/20 01:15:58 | com.atlassian.bamboo.agent.bootstrap.RemoteAgentHttpException: HTTP status code 500 received in response to fingerprint request.
INFO | jvm 1 | 2012/11/20 01:15:58 | at com.atlassian.bamboo.agent.bootstrap.AgentContext.initFingerprint(AgentContext.java:131)
The ports 8085 and 54663 are open. Enabling log4j does not provide any additional information either.
Has anyone seen this error? Any pointers to resolve this please?
I had a similar error. I seemed to fix it by downloading the alternative remote agent installer called bamboo-agent-4.2.0.jar. You can find it as a small link underneath the main remote agent download button.
Once I had run this jar and successfully authenticated, the original jar worked.
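When debugging this kind of failure it can also help to request the fingerprint URL from the agent machine directly, since the bootstrap only reports the status code and not the 500 response body (placeholders kept as in the log above):
curl -v "http://<bamboo-server-ip>:8090/bamboo/AgentServer/GetFingerprint.action?hostName=<remote-agent-ip>&version=3&agentUuid=<snip>"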
