I installed spring-xd-1.2.1.RELEASE and started Spring XD in xd-singlenode mode. When I type the following command:
xd:>stream create --definition "time | log" --name ticktock --deploy
I get the following result:
Command failed org.springframework.xd.rest.client.impl.SpringXDException: Could not find module with name 'log' and type 'sink'
When I type the following command:
xd:> module list
I get the following result:
Source           Processor    Sink                   Job
gemfire                       gemfire-json-server    filejdbc
gemfire-cq                    gemfire-server         hdfsjdbc
jdbc                          jdbc                   jdbchdfs
kafka                         rabbit                 sqoop
rabbit                        redis
twittersearch
twitterstream
Are some of the default modules missing? What happened? Is there any other configuration to set before starting Spring XD?
Check XD_HOME/modules/sink/log - does this folder exist?
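A quick way to check, assuming XD_HOME points at your Spring XD installation directory:
ls -l $XD_HOME/modules/sink/        # lists all sink modules the server can see
ls -l $XD_HOME/modules/sink/log     # the specific module the error complains about
If the log directory is missing, the distribution is likely incomplete, and re-extracting the release archive should restore the default modules.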
I'm using Confluent, so I've installed the Debezium connectors according to the Confluent docs using confluent-hub. In connect.properties I have the entry:
plugin.path=/usr/share/java,/opt/confluent-6.0.0/share/confluent-hub-components
I need to use io.debezium.transforms.ContentBasedRouter (https://debezium.io/documentation/reference/1.3/configuration/content-based-routing.html), so according to the Debezium docs I downloaded debezium-scripting-1.3.1.Final.jar and put it into
/opt/confluent-6.0.0/share/confluent-hub-components/ and also copied it into the
/opt/confluent-6.0.0/share/confluent-hub-components/debezium-debezium-connector-sqlserver/lib directory.
Here are the entries in my mysql_src.json connector:
"transforms": "unwrap,route",
"transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
"transforms.unwrap.add.fields": "source.snapshot",
"transforms.route.type": "io.debezium.transforms.ContentBasedRouter",
"transforms.route.language": "jsr223.groovy",
"transforms.route.topic.expression": "value.__source_snapshot == 'false' ? 'test'"
When I try to configure/load this connector, I get the following error message:
[2020-12-15 22:18:45,351] ERROR [Worker clientId=connect-1, groupId=connect-cluster] Failed to reconfigure connector's tasks, retrying after backoff: (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1369)
java.lang.NoClassDefFoundError: io/debezium/DebeziumException
Any suggestions on how to fix this problem?
According to the docs, you need to additionally obtain a JSR 223 script engine implementation and add its contents to the Debezium plug-in directories of your Kafka Connect environment, since:
Debezium does not come with any implementations of the JSR 223 API. To use an expression language with Debezium, you must download the JSR 223 script engine implementation for the language, and add to your Debezium connector plug-in directories, along with any other JAR files used by the language implementation.
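For Groovy, for example, that means placing the language JAR plus its JSON and JSR 223 bridge JARs next to the connector. A sketch of that step, with illustrative version numbers and the connector directory from the question:
cp groovy-3.0.7.jar groovy-json-3.0.7.jar groovy-jsr223-3.0.7.jar \
   /opt/confluent-6.0.0/share/confluent-hub-components/debezium-debezium-connector-sqlserver/lib/
Restart the Connect worker afterwards so the plugin class loader picks the JARs up.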
I am not sure the configuration is correct, but I got past the first configuration problem (I hope). I'm now facing another problem, which I will describe in a different question.
I am not sure what was wrong; I did the following:
Clean up zookeeper directories
Clean up kafka directories
Run kafka in distributed mode using command line start/stop scripts (not using confluent cli)
This solved the java.lang.NoClassDefFoundError: io/debezium/DebeziumException error (a rough sketch of the equivalent commands follows).
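Roughly, assuming the stock data directories and scripts (paths are illustrative and depend on your installation):
# stop the stack, then clear the default ZooKeeper and Kafka data directories
rm -rf /tmp/zookeeper /tmp/kafka-logs
# restart with the plain scripts instead of the confluent CLI
bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
bin/kafka-server-start.sh -daemon config/server.properties
bin/connect-distributed.sh config/connect-distributed.properties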
connect-standalone.properties
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
bootstrap.servers=10.33.62.20:9092,10.33.62.110:9092,10.33.62.200:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000
plugin.path=/grid/1/mukul/confluent-5.0.0/share/java
source-sqlite.properties
name=test-source-sqlite-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=5
connection.url=jdbc:mysql://10.32.177.178:3306/test&user=xxxx&password=xxxxx
table.whitelist=banner_hourly_statistics_v2
group.id=test-mysql-kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
config.storage.topic=demo-1-distributed-config
offset.storage.topic=demo-1-distributed-offset
status.storage.topic=demo-1-distributed-status
bootstrap.servers=10.33.62.20:9092,10.33.62.110:9092,10.33.62.200:9092
mode=bulk
#incrementing.column.name=id
topic.prefix=test-sqlite-jdbc-
CMD: connect-standalone /grid/1/mukul/confluent-5.0.0/etc/kafka/connect-standalone.properties /grid/1/mukul/confluent-5.0.0/etc/kafka-connect-jdbc/source-quickstart-sqlite.properties
In the startup logs, it clearly shows loading JDBC Connectors:
[2018-08-09 06:59:30,072] INFO Loading plugin from: /grid/1/mukul/confluent-5.0.0/share/java/kafka-connect-jdbc (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:218)
[2018-08-09 06:59:30,133] INFO Registered loader: PluginClassLoader{pluginLocation=file:/grid/1/mukul/confluent-5.0.0/share/java/kafka-connect-jdbc/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2018-08-09 06:59:30,133] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:170)
[2018-08-09 06:59:30,133] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:170)
But it fails with the following exception:
Invalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://10.32.177.178:3306/test&user=xxxx&password=xxxx for configuration Couldn't open connection to jdbc:mysql://10.32.177.178:3306/test&user=xxxx&password=xxx
Invalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://10.32.177.178:3306/test&user=xxxx&password=xxxx for configuration Couldn't open connection to jdbc:mysql://10.32.177.178:3306/test&user=xxxx&password=xxxx
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
I tried changing the plugin directories too, but it didn't work. I also tried moving the Confluent share/* to /usr/share/java, but that didn't work either.
Download the MySQL JDBC driver JAR from https://dev.mysql.com/downloads/connector/j/5.1.html
Place it inside the plugin dir
Run Connect
It will start pulling data from MySQL.
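A minimal sketch of those steps using the paths from the question (the driver version is illustrative):
# place the MySQL JDBC driver next to the JDBC connector JARs
cp mysql-connector-java-5.1.49.jar /grid/1/mukul/confluent-5.0.0/share/java/kafka-connect-jdbc/
# rerun Connect with the same properties files
connect-standalone /grid/1/mukul/confluent-5.0.0/etc/kafka/connect-standalone.properties /grid/1/mukul/confluent-5.0.0/etc/kafka-connect-jdbc/source-quickstart-sqlite.properties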
Maybe a little late. I had the same "No suitable driver found" issue when connecting to DB2 using the Kafka JDBC connector.
1st Possible Solution:
I resolved it by placing the DB2 driver at the exact location where the JDBC connector is.
Within Kafka Connect:
find / -name kafka-connect-jdbc\*.jar
Once you have found the location with the above command, copy the DB2 jar to that location:
cp {your DB2 jar location}/db2.jar {copy the location from 'find' command}
Example:
cp /Download/db2.jar /Users/share/java/kafka-connect-java/
Restart Kafka Connect and it will pick up the DB2 driver.
2nd Possible Solution:
Download the jt400 jar (jdk-8) and put it next to the other JDBC drivers (DB2, SQL Server, etc.)
Happy coding :)
I am trying to run the kafka-connect-elasticsearch plugin from Confluent in order to stream topics from Kafka (V0.11.0.1) directly into Elasticsearch (without putting Logstash in between).
I built the connector using Maven:
$ cd kafka-connect-elasticsearch
$ mvn clean package
I then created the required configuration file:
name=es-cluster-lab
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=filebeats-test
topic.index.map=filebeats-test:kafka_test_index
key.ignore=true
schema.ignore=true
connection.url=http://elastic:9200
type.name=log
As per the new Kafka Classpath Isolation spec, I also added the following line to my connect-standalone.properties file:
plugin.path=/home/kafka/kafka-connect-elasticsearch-3.3.0/target/kafka-connect-elasticsearch-3.3.0-development/share/java/kafka-connect-elasticsearch/
I go to run the script ...
bin/connect-standalone.sh config/connect-standalone.properties config/elasticsearch-connect.properties
... and receive the below error.
[2017-09-14 16:08:26,599] INFO Loading plugin from: /home/kafka/kafka-connect-elasticsearch-3.3.0/target/kafka-connect-elasticsearch-3.3.0-development/share/java/kafka-connect-elasticsearch/slf4j-api-1.7.25.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:176)
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
at org.reflections.Reflections.expandSuperTypes(Reflections.java:380)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:221)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:198)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:190)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:150)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:47)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:68)
I also tried to move the JAR files into the /app/kafka/libs directory (default CLASSPATH) and even tried to create a subdirectory /app/kafka/libs/connect_libs and add that manually to my CLASSPATH environment variable.
Not sure what my next step is besides putting Logstash between Kafka and Elastic.
Try changing the Guava version to 20 before you build it.
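One way to do that, assuming the connector's pom.xml pulls Guava in transitively, is to declare an explicit dependency so Maven's nearest-wins resolution picks 20.0 (a sketch; the surrounding <dependencies> section already exists in the POM):
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>20.0</version>
</dependency>
Then rebuild with mvn clean package and point plugin.path at the fresh build output.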
I think you are missing the star '*' at the end of the plugin path.
plugin.path=/home/kafka/kafka-connect-elasticsearch-3.3.0/target/kafka-connect-elasticsearch-3.3.0-development/share/java/kafka-connect-elasticsearch/*
I have created a composite module:
module compose common-module --definition "kafka --topic=topic1 --outputType=text/plain | shell --command='script1.sh' "
I then created a stream using this module:
stream create stream1 --definition "common-module > queue:job:job1"
And I got the following error:
Command failed org.springframework.xd.rest.client.impl.SpringXDException:
Error with option(s) for module common-module of type source:
command: may not be null
command: may not be empty
Does anyone know what's going on? Thanks!
It's a bug; I opened a JIRA issue.
The only work-around I can think of (short of creating a custom shell module - see the JIRA) is to pass in the script again...
stream create stream1 --definition "common-module --shell.script=script1.sh > queue:job:job1"
I have installed Spring XD version 1.1.0 on a CentOS machine. Using xd-singlenode, I want to connect it to a SQL Server database via the jdbc source and put the data into a file.
I created some streams as follows:
1) xd:>stream create connectiontest --definition "jdbc --url=jdbc:sqlserver://sqlserverhost:1433/SampleDatabase --username=sample --password=***** --query= 'SELECT * FROM schema.tablename' |file" --deploy
2) xd:>stream create connectiontest --definition "jdbc --connectionProperties=jdbc:sqlserver://sqlserverhost:1433/SampleDatabase --username=sample --password=***** --initSQL= 'SELECT * FROM schema.tablename' |file" --deploy
Every time I deploy the stream, it gives the following error:
Command failed org.springframework.xd.rest.client.impl.SpringXDException: Multiple top level module resources found :file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/jms-hornetq.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/hadoop.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/xd-admin-logger.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/xd-singlenode-logger.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/xd-container-logger.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/jms-activemq.properties],file [/opt/pivotal/spring-xd-1.1.0.RELEASE/xd/config/httpSSL.properties]
Earlier I had set springxd_home pointing to my Spring XD directory. After removing that variable, it is working fine now.
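For anyone hitting the same error, the fix amounts to something like this before launching (the variable name is the one mentioned above):
# remove the stale Spring XD path variable, then start the single-node server
unset springxd_home
xd-singlenode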
Thanks for the support.