Debezium Kafka connector mongodb : Error connecting kafka connector to mongodb - apache-kafka-connect

Below is my MongoDB connector config in /etc/kafka/connect-mongodb-source.properties:
name=mongodb-source-connector
connector.class=io.debezium.connector.mongodb.MongoDbConnector
mongodb.hosts=/remoteserveraddress:27017
mongodb.name=mongo_conn
initial.sync.max.threads=1
tasks.max=1
but I am getting the following error:
ERROR Plugin class loader for connector: 'io.debezium.connector.mongodb.MongoDbConnector' was not found. Returning: org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader@5a058be5 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:165)
I am running the connector in standalone mode.
The MongoDbConnector class is in debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar.
The classpath is also set as follows:
#for CLASSPATH
CLASSPATH=/Users/111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/*
export CLASSPATH
PATH=$PATH:/usr/local/sbin
export PATH
Using plugin.path, I can see it is able to register and load all the required plugins:
[2020-01-10 08:14:07,916] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:78)
[2020-01-10 08:14:07,942] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,082] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,085] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,120] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.MongoDbConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.transforms.UnwrapFromMongoDbEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,122] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-core-1.0.0.Final.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,198] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-core-1.0.0.Final.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,198] INFO Added plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,200] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongo-kafka-0.2-all.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,340] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongo-kafka-0.2-all.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,340] INFO Added plugin 'com.mongodb.kafka.connect.MongoSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,340] INFO Added plugin 'com.mongodb.kafka.connect.MongoSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,341] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/bson-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,373] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/bson-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,373] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-core-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,465] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-core-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
but at the end I hit the same issue again:
[2020-01-10 08:40:43,613] ERROR Plugin class loader for connector: 'io.debezium.connector.mongodb.MongoDbConnector' was not found. Returning: org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader@33f2df51 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:165)
[2020-01-10 08:40:43,809] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:121)
java.lang.IllegalArgumentException: username can not be null
at com.mongodb.MongoCredential.<init>(MongoCredential.java:350)
at com.mongodb.MongoCredential.<init>(MongoCredential.java:344)
at com.mongodb.MongoCredential.createCredential(MongoCredential.java:169)
at io.debezium.connector.mongodb.ConnectionContext.<init>(ConnectionContext.java:69)
at io.debezium.connector.mongodb.MongoDbConnector.validate(MongoDbConnector.java:222)
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:313)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:192)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:115)
[2020-01-10 08:40:43,810] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:66)
Output of curl localhost:8083/connector-plugins (output truncated):
[{"class":"com.mongodb.kafka.connect.MongoSinkConnector","type":"sink","version":"0.2"},{"class":"com.mongodb.kafka.connect.MongoSourceConnector","type":"source","version":"0.2"},{"class":"io.confluent.connect.activemq.ActiveMQSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.elasticsearch.ElasticsearchSinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.ibm.mq.IbmMQSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.jdbc.JdbcSinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.jdbc.JdbcSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.jms.JmsSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.replicator.ReplicatorSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.s3.S3SinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.storage.tools.SchemaSourceConnector","type":"source","version":"5.3.2-ce"},{"class":"io.debezium.connector.mongodb.MongoDbConnector","type":"source","version":"1.0.0.Final"},{"class":"org.apache.kafka.connect.file.FileStr
Thanks in advance.

First of all, please check the installation of your plugin using the Kafka Connect REST Interface (see details here).
Try to install Kafka Connect plugins using the plugin.path mechanism instead of CLASSPATH (more info in the docs).
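A plugin.path setup might look like this in connect-standalone.properties (the plugins parent directory and its layout below are an illustration, not taken from the question):

```properties
# connect-standalone.properties
# plugin.path should point at a parent directory; all jars belonging to
# one connector live together in a single subdirectory beneath it, e.g.
#   .../plugins/debezium-connector-mongodb/debezium-connector-mongodb-1.0.0.Final.jar
#   .../plugins/debezium-connector-mongodb/debezium-core-1.0.0.Final.jar
plugin.path=/Users/111111/workspace/KafkaConnect/confluent-5.3.2/plugins
```

Note that in the startup log above each jar is registered as its own plugin location, which suggests plugin.path was pointed at the connector's jar directory itself rather than at a parent directory.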

There were two issues:
The Debezium connector has a hardcoded source='admin' and there is no way to override it, and the DB server I was trying to connect to did not have an admin authentication database.
The test server I was using was a standalone server with no replica set, so I had to create a replica set and add the server to it. The Debezium connector does not support standalone servers.
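For reference, turning a standalone mongod into a single-node replica set can be sketched roughly as follows (the replica-set name rs0 and the dbpath are assumptions, not from the question):

```shell
# Start mongod with a replica-set name (rs0 is an arbitrary example)
mongod --replSet rs0 --dbpath /data/db --port 27017

# In a second terminal, initiate the replica set once
mongo --eval 'rs.initiate()'
```

After that, the connector's mongodb.hosts can name the replica set explicitly, e.g. mongodb.hosts=rs0/remoteserveraddress:27017.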

Related

what would cause this debezium kafka connector error?

I'm trying to connect MySQL --> Debezium --> Kafka (Confluent Cloud). Can someone help me with this error message:
Failed to find any class that implements Connector and which name matches io.debezium.connector.mysql.MySqlConnector.
I have the jar files for Debezium in the directory configured in my Kafka worker's properties file:
plugin.path=/home/ec2-user/kafka/plugins
The content of the plugin folder has the following:
-antlr4-runtime-4.7.2.jar
-debezium-api-1.5.3.Final.jar
-debezium-connector-mysql-1.5.3.Final.jar
-debezium-core-1.5.3.Final.jar
-debezium-ddl-parser-1.5.3.Final.jar
-failureaccess-1.0.1.jar
-guava-30.0-jre.jar
-mysql-binlog-connector-java-0.25.1.jar
-mysql-connector-java-8.0.21.jar
When I start up the distributed worker, I see that the plugins are added:
[2021-06-24 23:01:54,680] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-api-1.5.3.Final.jar}
[2021-06-24 23:01:54,684] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-connector-mysql-1.5.3.Final.jar
[2021-06-24 23:01:54,744] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-connector-mysql-
[2021-06-24 23:01:54,745] INFO Added plugin 'io.debezium.connector.mysql.transforms.ReadToInsertEvent'
[2021-06-24 23:01:54,745] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-core-1.5.3.Final.jar
[2021-06-24 23:01:54,866] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-core-1.5.3.Final.jar}
[2021-06-24 23:01:54,867] INFO Added plugin 'io.debezium.converters.CloudEventsConverter'
[2021-06-24 23:01:54,871] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter'
[2021-06-24 23:01:54,872] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState'
[2021-06-24 23:01:54,872] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter'
[2021-06-24 23:01:54,873] INFO Added plugin 'io.debezium.transforms.tracing.ActivateTracingSpan'
[2021-06-24 23:01:54,873] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-ddl-parser-1.5.3.Final.jar
[2021-06-24 23:01:55,060] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-ddl-parser-1.5.3.Final.jar}
[2021-06-24 23:01:55,061] INFO Loading plugin from: /home/ec2-user/kafka/plugins/failureaccess-1.0.1.jar
[2021-06-24 23:01:55,069] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/failureaccess-1.0.1.jar}
[2021-06-24 23:01:55,070] INFO Loading plugin from: /home/ec2-user/kafka/plugins/guava-30.0-jre.jar
[2021-06-24 23:01:55,307] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/guava-30.0-jre.jar}
However, when I run my curl command:
curl -s -X POST -H 'Content-Type: application/json' --data @debe-connector.json http://localhost:8083/connectors -v
I get the error:
Failed to find any class that implements Connector and which name matches io.debezium.connector.mysql.MySqlConnector.
Any idea what I might be missing?
Thanks,
You need to put all the jar files of the Debezium MySQL connector inside a single directory
/home/ec2-user/kafka/plugins/debezium-connector-mysql
instead of just under
/home/ec2-user/kafka/plugins
You can use the curl command
curl -X GET http://localhost:8083/connector-plugins
to see which plugins are installed correctly. If Debezium is not listed there, something is wrong with the installation; you may also want to check file permissions.
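The target layout would look like this (file list taken from the question; each subdirectory of plugin.path is loaded as one isolated plugin):

```
/home/ec2-user/kafka/plugins/
└── debezium-connector-mysql/
    ├── antlr4-runtime-4.7.2.jar
    ├── debezium-api-1.5.3.Final.jar
    ├── debezium-connector-mysql-1.5.3.Final.jar
    ├── debezium-core-1.5.3.Final.jar
    ├── debezium-ddl-parser-1.5.3.Final.jar
    ├── failureaccess-1.0.1.jar
    ├── guava-30.0-jre.jar
    ├── mysql-binlog-connector-java-0.25.1.jar
    └── mysql-connector-java-8.0.21.jar
```

After moving the jars, restart the worker so the plugin scan picks up the new directory.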

Sonarqube can't be started in Ubuntu

When I try to start SonarQube, it gives me the following errors:
--> Wrapper Started as Daemon
Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2019.08.19 17:30:23 INFO app[][o.s.a.AppFileSystem] Cleaning or creating temp directory /opt/sonarqube/temp
2019.08.19 17:30:23 INFO app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2019.08.19 17:30:23 INFO app[][o.s.a.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [/opt/sonarqube/elasticsearch]: /opt/sonarqube/elasticsearch/bin/elasticsearch
2019.08.19 17:30:23 INFO app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
2019.08.19 17:30:24 INFO app[][o.e.p.PluginsService] no modules loaded
2019.08.19 17:30:24 INFO app[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2019.08.19 17:30:41 WARN app[][o.s.a.p.AbstractManagedProcess] Process exited with exit value [es]: 1
2019.08.19 17:30:41 INFO app[][o.s.a.SchedulerImpl] Process[es] is stopped
2019.08.19 17:30:41 INFO app[][o.s.a.SchedulerImpl] SonarQube is stopped
<-- Wrapper Stopped
I can't find any solution for this problem, and I really need some help to get SonarQube running. I might have done something wrong, but I followed the SonarQube installation documentation.
And this is the es.log file:
Server VM/11.0.4/11.0.4+11-post-Ubuntu-1ubuntu218.04.3]
2019.08.20 20:38:29 INFO es[][o.e.n.Node] JVM arguments [-XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Des.networkaddress.cache.ttl=60, -Des.networkaddress.cache.negative.ttl=10, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -XX:-OmitStackTraceInFastThrow, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Djava.io.tmpdir=/opt/sonarqube/temp, -XX:ErrorFile=../logs/es_hs_err_pid%p.log, -Des.enforce.bootstrap.checks=true, -Xms512m, -Xmx512m, -Des.path.home=/opt/sonarqube/elasticsearch, -Des.path.conf=/opt/sonarqube/temp/conf/es, -Des.distribution.flavor=default, -Des.distribution.type=tar]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [analysis-common]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [lang-painless]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [mapper-extras]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [parent-join]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [percolator]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [reindex]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [repository-url]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] loaded module [transport-netty4]
2019.08.20 20:38:30 INFO es[][o.e.p.PluginsService] no plugins loaded
2019.08.20 20:38:34 WARN es[][o.e.d.c.s.Settings] [http.enabled] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.
2019.08.20 20:38:36 INFO es[][o.e.d.DiscoveryModule] using discovery type [zen] and host providers [settings]
2019.08.20 20:38:36 INFO es[][o.e.n.Node] initialized
2019.08.20 20:38:36 INFO es[][o.e.n.Node] starting ...
2019.08.20 20:38:37 INFO es[][o.e.t.TransportService] publish_address {127.0.0.1:9001}, bound_addresses {127.0.0.1:9001}
2019.08.20 20:38:37 INFO es[][o.e.b.BootstrapChecks] explicitly enforcing bootstrap checks
2019.08.20 20:38:37 ERROR es[][o.e.b.Bootstrap] node validation exception
[1] bootstrap checks failed
[1]: max file descriptors [4096] for elasticsearch process is too low, increase to at least [65535]
2019.08.20 20:38:37 INFO es[][o.e.n.Node] stopping ...
2019.08.20 20:38:37 INFO es[][o.e.n.Node] stopped
2019.08.20 20:38:37 INFO es[][o.e.n.Node] closing ...
2019.08.20 20:38:37 INFO es[][o.e.n.Node] closed
Elasticsearch is not starting; see the logs:
2019.08.20 20:38:37 ERROR es[][o.e.b.Bootstrap] node validation exception
[1] bootstrap checks failed
[1]: max file descriptors [4096] for elasticsearch process is too low, increase to at least [65535]
And here is what they say in their doc:
Elasticsearch uses a lot of file descriptors or file handles. Running
out of file descriptors can be disastrous and will most probably lead
to data loss. Make sure to increase the limit on the number of open
files descriptors for the user running Elasticsearch to 65,536 or
higher.
Therefore you need to increase the limit:
For the .zip and .tar.gz packages, set ulimit -n 65535 as root before
starting Elasticsearch, or set nofile to 65535 in
/etc/security/limits.conf.
The solution is within the sonar.service file, which should be placed at /etc/systemd/system/sonar.service:
[Unit]
Description=SonarQube service
After=syslog.target network.target
[Service]
Type=simple
User=sonarqube
Group=sonarqube
PermissionsStartOnly=true
ExecStart=/bin/nohup /opt/java/bin/java -Xms32m -Xmx32m -Djava.net.preferIPv4Stack=true -jar /opt/sonarqube/lib/sonar-application-7.4.jar
StandardOutput=syslog
LimitNOFILE=65536
LimitNPROC=8192
TimeoutStartSec=5
Restart=always
[Install]
WantedBy=multi-user.target
You have to define the file limit on the service.
You need to set fs.file-max to increase the maximum number of file descriptors:
sysctl -w fs.file-max=65535
You are likely to also need to set vm.max_map_count:
sysctl -w vm.max_map_count=262144
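To make these limits survive a reboot, they can also be set in the config files; a sketch, assuming the service runs as a user named sonarqube:

```
# /etc/security/limits.conf — per-user open-file and process limits
sonarqube   -   nofile   65536
sonarqube   -   nproc    8192

# /etc/sysctl.conf — kernel-wide settings applied at boot
fs.file-max=65535
vm.max_map_count=262144
```

Note that for a systemd service, the LimitNOFILE/LimitNPROC directives in the unit file (as in the sonar.service above) take precedence over limits.conf.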

Cannot run SonarQube as Service

If I just run the StartSonar.bat everything works and starts fine.
If I setup a service to run using NSSM using:
Path: ..\blah\wrapper.exe
Startup directory: ..\blah
Arguments: -c ..\blah\conf\wrapper.conf
I get an error in the logs that says:
2019.06.24 16:03:49 INFO web[][o.s.p.ProcessEntryPoint] Starting web
2019.06.24 16:03:50 INFO web[][o.a.t.u.n.NioSelectorPool] Using a shared selector for servlet write/read
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] no modules loaded
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2019.06.24 16:03:52 INFO web[][o.s.s.e.EsClientProvider] Connected to local Elasticsearch: [127.0.0.1:9001]
2019.06.24 16:03:52 INFO web[][o.s.s.p.LogServerVersion] SonarQube Server / 7.7.0.23042 / 1dcac8b8de36b377a1810cc8f1c4c31744e12729
2019.06.24 16:03:52 INFO web[][o.sonar.db.Database] Create JDBC data source for jdbc:sqlserver://BRKPRCSQUBE1;databaseName=sonar;integratedSecurity=true
2019.06.24 16:03:53 ERROR web[][o.s.s.p.Platform] Web server startup failed
java.lang.IllegalStateException: Can not connect to database. Please check
connectivity and settings (see the properties prefixed by 'sonar.jdbc.').
That leads me to believe there is a problem with the database setup, but once again, if I run StartSonar.bat everything works fine.
The only thing I can think of is that the service runs as the "Local System account," whereas when I run the .bat it runs under my own account.
That said, I cannot use my own account, since I would have to log in and update the service password every time my AD password changes.
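One note: with integratedSecurity=true, SQL Server authenticates whatever Windows account the service runs as, so Local System will not match a database login created for a user account. A common workaround is a dedicated service account; with NSSM that could be set roughly like this (the service name and account name are placeholders):

```
nssm set sonarqube ObjectName MYDOMAIN\svc-sonar <password>
```

A service account's password does not expire with your personal AD password, so the service login does not need updating on every password change.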

SpringBoot app starts multiple times and disconnects from logstash

I have a Spring Boot application (v1.3.5.RELEASE) that runs on Docker via ./mvnw;
it "restarts" after 1-2 minutes, and after the second start it doesn't send any logs to Logstash.
During the first start we see the log line below:
2016-07-27 08:54:29,616 DEBUG [background-preinit] logging: Logging Provider: org.jboss.logging.Slf4jLoggerProvider found via system property
but after the second one the logging-provider log line is missing:
**2016-07-27 08:54:25,386 INFO [restartedMain] DemoApp: Starting DemoApp on 7adf92b8bc96 with PID 85 (/home/infoowl/project/target/classes started by infoowl in /home/infoowl/project)**
2016-07-27 08:54:25,471 DEBUG [restartedMain] DemoApp: Running with Spring Boot v1.3.5.RELEASE, Spring v4.2.6.RELEASE
2016-07-27 08:54:25,487 INFO [restartedMain] DemoApp: The following profiles are active: dev
2016-07-27 08:54:29,616 DEBUG [background-preinit] logging: Logging Provider: org.jboss.logging.Slf4jLoggerProvider found via system property
2016-07-27 08:54:32,436 INFO [restartedMain] DemoApp: Started DemoApp in 9.865 seconds (JVM running for 14.572)
2016-07-27 08:54:45,937 DEBUG [restartedMain] HttpURLConnection: sun.net.www.MessageHeader@3959b7066 pairs: {GET /config/transformation/dev/master HTTP/1.1: null}{Accept: application/json, application/*+json}{Authorization: Basic YWRtaW46YWRtaW4=}{User-Agent: Java/1.8.0_91}{Host: registry:8761}{Connection: keep-alive}
2016-07-27 08:54:46,512 DEBUG [restartedMain] HttpURLConnection: sun.net.www.MessageHeader@12bba63311 pairs: {null: HTTP/1.1 200 OK}{Server: Apache-Coyote/1.1}{X-Content-Type-Options: nosniff}{X-XSS-Protection: 1; mode=block}{Cache-Control: no-cache, no-store, max-age=0, must-revalidate}{Pragma: no-cache}{Expires: 0}{X-Application-Context: jhipster-registry:dev,native:8761}{Content-Type: application/json;charset=UTF-8}{Transfer-Encoding: chunked}{Date: Wed, 27 Jul 2016 08:54:46 GMT}
2016-07-27 08:54:46,777 INFO [restartedMain] DemoApp: The following profiles are active: dev
2016-07-27 08:55:01,347 WARN [restartedMain] ConfigurationClassPostProcessor: Cannot enhance @Configuration bean definition 'refreshScope' since its singleton instance has been created too early. The typical cause is a non-static @Bean method with a BeanDefinitionRegistryPostProcessor return type: Consider declaring such methods as 'static'.
2016-07-27 08:55:03,767 DEBUG [restartedMain] AsyncConfiguration: Creating Async Task Executor
2016-07-27 08:55:07,072 DEBUG [restartedMain] MetricsConfiguration: Registering JVM gauges
2016-07-27 08:55:07,169 DEBUG [restartedMain] MetricsConfiguration: Initializing Metrics JMX reporting
2016-07-27 08:55:07,281 INFO [restartedMain] MetricsConfiguration: Initializing Metrics Log reporting
2016-07-27 08:55:16,352 INFO [localhost-startStop-1] WebConfigurer: Web application configuration, using profiles: [dev]
2016-07-27 08:55:16,359 DEBUG [localhost-startStop-1] WebConfigurer: Initializing Metrics registries
2016-07-27 08:55:16,383 DEBUG [localhost-startStop-1] WebConfigurer: Registering Metrics Filter
2016-07-27 08:55:16,400 DEBUG [localhost-startStop-1] WebConfigurer: Registering Metrics Servlet
2016-07-27 08:55:16,402 INFO [localhost-startStop-1] WebConfigurer: Web application fully configured
2016-07-27 08:55:19,343 INFO [localhost-startStop-1] CoreApp: Running with Spring profile(s) : [dev]
2016-07-27 08:55:24,670 INFO [restartedMain] LoggingConfiguration: Initializing Logstash logging
2016-07-27 08:55:24,781 INFO [restartedMain] LoggingConfiguration: Logstash customFields: '{"app_name":"transformation","app_port":"9093","instance_id":"transformation:4afb19b3a2763ed887b8d69d246082e6"}', config: 'net.infoowl.hepsiburada.core.config.JHipsterProperties$Logging$Logstash@96d9ebe[enabled=true,host=elk-logstash,port=5000,queueSize=512]'
2016-07-27 08:55:35,367 DEBUG [restartedMain] CacheConfiguration: No cache
2016-07-27 08:55:40,071 DEBUG [restartedMain] DatabaseConfiguration: Configuring Mongeez
2016-07-27 08:55:40,245 INFO [restartedMain] FilesetXMLReader: Parsing XML Fileset file master.xml
2016-07-27 08:55:40,275 INFO [restartedMain] FilesetXMLReader: Num of changefiles found 0
2016-07-27 08:55:41,089 DEBUG [restartedMain] SwaggerConfiguration: Starting Swagger
2016-07-27 08:55:41,218 DEBUG [restartedMain] SwaggerConfiguration: Started Swagger in 118 ms
2016-07-27 08:55:55,610 WARN [restartedMain] URLConfigurationSource: No URLs will be polled as dynamic configuration sources.
2016-07-27 08:55:56,975 WARN [restartedMain] URLConfigurationSource: No URLs will be polled as dynamic configuration sources.
2016-07-27 08:56:00,036 DEBUG [cron4j::scheduler[20b6349f63f32eea2d00877b000001562b91269c7714a479]::launcher[20b6349f63f32eea238ca33a000001562b9155042b804084]] CronPlugin: Found crontab config url org.crsh.vfs.Resource@7a65b25c
**2016-07-27 08:56:00,788 INFO [restartedMain] DemoApp: Starting DemoApp on 7adf92b8bc96 with PID 85 (/home/infoowl/project/target/classes started by infoowl in /home/infoowl/project)**
2016-07-27 08:56:00,788 DEBUG [restartedMain] DemoApp: Running with Spring Boot v1.3.5.RELEASE, Spring v4.2.6.RELEASE
2016-07-27 08:56:00,788 INFO [restartedMain] DemoApp: The following profiles are active: dev
2016-07-27 08:56:01,290 INFO [restartedMain] DemoApp: Started DemoApp in 0.899 seconds (JVM running for 103.426)
Another observation: the second start takes just "0.899 seconds", which is not plausible for a real start. It seems the second one is not a real start, but afterwards the Logstash connection is gone.
What may be the reason for this second start? Where should I check and investigate?
From documentation:
Applications that use spring-boot-devtools will automatically restart
whenever files on the classpath change.
You can exclude resources or disable restarts.
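If the restart does come from spring-boot-devtools, it can be switched off entirely; a minimal sketch using the standard devtools property in application.properties:

```properties
# Disable spring-boot-devtools automatic restart on classpath changes
spring.devtools.restart.enabled=false
```

Alternatively, remove the devtools dependency from the production build entirely; it is only intended for local development.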
I found the problem; the details are here:
https://github.com/spring-cloud/spring-cloud-stream/issues/605

Sonar runner 2.3 throws exception with Sonar 3.7 --> IllegalDataException

I am trying to analyse a Java project using Sonar 3.7 and sonar-runner 2.3.
I was able to analyse the same codebase successfully using Sonar 3.6.1 and sonar-runner 2.2.1 with the same rule set/profile.
However, since upgrading I am getting the following error:
org.jdom.IllegalDataException: The data "null" is not legal for a JDOM attribute: A null is not a legal XML value
Console output:
SonarQube Runner 2.3
Java 1.6.0_30 Sun Microsystems Inc. (64-bit)
Windows 7 6.1 amd64
INFO: Error stacktraces are turned on.
INFO: Runner configuration file: C:\Dhruba\InstallationFolder\Sonar\sonar-runner-2.3\conf\sonar-runner.properties
INFO: Project configuration file: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\sonar-project.properties
INFO: Default locale: "en_US", source code encoding: "windows-1252" (analysis is platform dependent)
INFO: Work directory: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\.sonar
INFO: SonarQube Server 3.7
13:38:29.376 INFO - Load batch settings
13:38:29.482 INFO - User cache: C:\Users\Dhruba\.sonar\cache
13:38:29.487 INFO - Install plugins
13:38:31.219 INFO - Install JDBC driver
13:38:31.227 INFO - Create JDBC datasource for jdbc:oracle:thin:@localhost:1521/orcl
13:38:32.382 INFO - Initializing Hibernate
13:38:34.579 INFO - Load project settings
13:38:34.665 INFO - Apply project exclusions
13:38:34.788 INFO - ------------- Scan OPSBankIIUtilityService
13:38:34.790 INFO - Load module settings
13:38:35.382 INFO - Quality profile : [name=OBBase1,language=java]
13:38:35.394 INFO - Excluded sources:
13:38:35.394 INFO - **/*_*.java
13:38:35.394 INFO - **/bo/**
13:38:35.395 INFO - */stub/**
13:38:35.395 INFO - **/*FacadeSoap*.java
13:38:35.395 INFO - com.ibm.ejs.container._EJSWrapper_**/src
13:38:35.395 INFO - *_Deser/src
13:38:35.395 INFO - *Proxy.java
13:38:35.395 INFO - */*FacadeHome.java
13:38:35.395 INFO - */*FacadeLocalHome.java
13:38:35.395 INFO - com/elsevier/obii/xml/*
13:38:35.395 INFO - Excluded tests:
13:38:35.395 INFO - */package-info.java
13:38:35.430 INFO - Configure Maven plugins
13:38:35.502 INFO - Compare to previous analysis
13:38:35.532 INFO - Compare over 7 days (2013-09-14)
13:38:35.558 INFO - Compare to previous version
13:38:35.734 INFO - Base dir: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\OPSBankIIUtilityService
13:38:35.734 INFO - Working dir: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\.sonar\dhrsrvc_OPSBankIIUtilityService
13:38:35.735 INFO - Source dirs: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\OPSBankIIUtilityService\src
13:38:35.735 INFO - Source encoding: windows-1252, default locale: en_US
13:38:36.090 INFO - Sensor JavaSourceImporter...
13:38:37.021 INFO - Sensor JavaSourceImporter done: 931 ms
13:38:37.021 INFO - Sensor JavaSquidSensor...
13:38:37.160 INFO - Java AST scan...
13:38:40.518 INFO - Java AST scan done: 3358 ms
13:38:40.763 INFO - Sensor JavaSquidSensor done: 3742 ms
13:38:40.764 INFO - Sensor SurefireSensor...
13:38:40.765 INFO - parsing C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\.sonar\dhrsrvc_OPSBankIIUtilityService\build\surefire-reports
13:38:40.767 INFO - Sensor SurefireSensor done: 3 ms
13:38:40.769 INFO - Sensor CpdSensor...
13:38:40.770 INFO - SonarEngine is used
13:38:40.846 INFO - Cross-project analysis disabled
13:38:41.511 INFO - Sensor CpdSensor done: 742 ms
13:38:41.511 INFO - Sensor CheckstyleSensor...
13:38:41.514 INFO - Execute Checkstyle 5.6...
13:38:41.534 INFO - Checkstyle configuration: C:\Dhruba\Projects\Elsevier-OPSBANK_II_AIS\R11_Service_Workspace\.sonar\dhrsrvc_OPSBankIIUtilityService\checkstyle.xml
13:38:43.845 INFO - Execute Checkstyle 5.6 done: 2331 ms
13:38:43.849 INFO - Sensor CheckstyleSensor done: 2338 ms
13:38:43.850 INFO - Sensor PmdSensor...
13:38:43.853 INFO - Execute PMD 4.3...
13:38:43.864 INFO - Java version: 1.5
13:38:44.008 INFO - Execute PMD 4.3 done: 155 ms
INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
Total time: 17.761s
Final Memory: 15M/406M
INFO: ------------------------------------------------------------------------
ERROR: Error during Sonar runner execution
org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
at org.sonar.runner.api.Runner.execute(Runner.java:90)
at org.sonar.runner.Main.executeTask(Main.java:70)
at org.sonar.runner.Main.execute(Main.java:59)
at org.sonar.runner.Main.main(Main.java:41)
Caused by: org.sonar.api.utils.XmlParserException: org.jdom.IllegalDataException: The data "null" is not legal for a JDOM attribute: A null is not a legal XML value.
at org.sonar.plugins.pmd.PmdSensor.analyse(PmdSensor.java:55)
at org.sonar.batch.phases.SensorsExecutor.execute(SensorsExecutor.java:72)
at org.sonar.batch.phases.PhaseExecutor.execute(PhaseExecutor.java:114)
at org.sonar.batch.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:142)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:88)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:73)
at org.sonar.batch.scan.ProjectScanContainer.scan(ProjectScanContainer.java:186)
at org.sonar.batch.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:181)
at org.sonar.batch.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:179)
at org.sonar.batch.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:174)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:88)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:73)
at org.sonar.batch.scan.ScanTask.scan(ScanTask.java:57)
at org.sonar.batch.scan.ScanTask.execute(ScanTask.java:45)
at org.sonar.batch.bootstrap.TaskContainer.doAfterStart(TaskContainer.java:82)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:88)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:73)
at org.sonar.batch.bootstrap.BootstrapContainer.executeTask(BootstrapContainer.java:156)
at org.sonar.batch.bootstrap.BootstrapContainer.doAfterStart(BootstrapContainer.java:144)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:88)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:73)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:92)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:74)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
... 9 more
**Caused by: org.jdom.IllegalDataException: The data "null" is not legal for a JDOM attribute: A null is not a legal XML value.**
at org.jdom.Attribute.setValue(Attribute.java:486)
at org.jdom.Attribute.<init>(Attribute.java:229)
at org.jdom.Attribute.<init>(Attribute.java:252)
at org.jdom.Element.setAttribute(Element.java:1109)
at org.sonar.plugins.pmd.PmdProfileExporter.exportPmdRulesetToXml(PmdProfileExporter.java:126)
at org.sonar.plugins.pmd.PmdProfileExporter.exportProfile(PmdProfileExporter.java:63)
at org.sonar.plugins.pmd.PmdExecutor.createRulesets(PmdExecutor.java:107)
at org.sonar.plugins.pmd.PmdExecutor.executeRules(PmdExecutor.java:89)
at org.sonar.plugins.pmd.PmdExecutor.executePmd(PmdExecutor.java:75)
at org.sonar.plugins.pmd.PmdExecutor.execute(PmdExecutor.java:61)
at org.sonar.plugins.pmd.PmdSensor.analyse(PmdSensor.java:52)
... 37 more
ERROR: Re-run SonarQube Runner using the -X switch to enable full debug logging.
Can you please help to resolve this?
I analyzed the issue and found that the profile imported from Sonar 3.6.1 into 3.7 was the problem: Sonar 3.7 was not able to parse a few of the XPath rules.
I removed those XPath rules and re-added them manually, and the analysis started working. This is not a bug in sonar-runner; rather, Sonar 3.7 could not restore the profiles correctly.
