I am using the Avro message format with Spring Kafka.
I am getting the exception below: the consumer is not able to map the message to the AvroPlanCompleteTrigger Java object after deserialization. Can someone please help?
Bean [com.wom.repl.odr.receiver.Receiver#606b1c65]; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [com.wom.repl.odr.dto.AvroPlanCompleteTrigger] to [com.wom.repl.odr.dto.AvroPlanCompleteTrigger] for GenericMessage [payload={"groupId": 609001, "runUUID": "runuuid", "wmtItemNumber": "123456", "sourceLocation": 987, "retryCount": 3, "applicationName": "Jenkins"}, headers={kafka_offset=0, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test3, kafka_acknowledgment=Acknowledgment for ConsumerRecord(topic = test3, partition = 0, offset = 0, CreateTime = 1574768943315, checksum = 2886880703, serialized key size = -1, serialized value size = 29, key = null, value = {"groupId": 1234, "runUUID": "runuuid", "wmtItemNumber": "123456", "sourceLocation": 987, "retryCount": 3, "applicationName": "Jenkins"})}],
failedMessage=GenericMessage [payload={"groupId": 1234, "runUUID": "runuuid", "wmtItemNumber": "123456", "sourceLocation": 987, "retryCount": 3, "applicationName": "Jenkins"}, headers={kafka_offset=0, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=test3, kafka_acknowledgment=Acknowledgment for ConsumerRecord(topic = test3, partition = 0, offset = 0, CreateTime = 1574768943315, checksum = 2886880703, serialized key size = -1, serialized value size = 29, key = null, value = {"groupId": 609001, "runUUID": "runuuid", "wmtItemNumber": "123456", "sourceLocation": 987, "retryCount": 3, "applicationName": "Jenkins"})}]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:178)
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:72)
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:47)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:794)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:738)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.access$2200(KafkaMessageListenerContainer.java:245)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ListenerInvoker.run(KafkaMessageListenerContainer.java:1031)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)
POM dependency versions:
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
<version>1.2.2.RELEASE</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.9.1</version>
<exclusions>
<exclusion>
<artifactId>slf4j-api</artifactId>
<groupId>org.slf4j</groupId>
</exclusion>
</exclusions>
</dependency>
Below is the AvroDeserializer method I am using to produce the AvroPlanCompleteTrigger Java object:
public T deserialize(String topic, byte[] data) {
    try {
        T result = null;
        if (data != null) {
            LOGGER.debug("data='{}'", DatatypeConverter.printHexBinary(data));
            DatumReader<GenericRecord> datumReader = new SpecificDatumReader<>(
                    targetType.newInstance().getSchema());
            Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);
            result = (T) datumReader.read(null, decoder);
            LOGGER.debug("deserialized data='{}'", result);
        }
        return result;
    } catch (Exception ex) {
        throw new SerializationException(
                "Can't deserialize data '" + Arrays.toString(data) + "' from topic '" + topic + "'", ex);
    }
}
It looks like you have deserialized a com.wom.replenishment.odr.dto.AvroPlanCompleteTrigger, but the listener method parameter is declared as com.wom.repl.odr.dto.AvroPlanCompleteTrigger. Those are two different classes that happen to share the same simple name, and Spring doesn't know how to convert from one to the other.
Make sure the class produced by the deserializer and the class used in the listener signature are the exact same generated type.
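A minimal sketch of what aligned wiring could look like, assuming the AvroDeserializer above has a constructor that takes the target class (the configuration class and bean names here are hypothetical, and the container factory wiring is omitted):
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import com.wom.repl.odr.dto.AvroPlanCompleteTrigger;

@Configuration
public class AvroConsumerConfig {

    @Bean
    public ConsumerFactory<String, AvroPlanCompleteTrigger> consumerFactory() {
        Map<String, Object> config = new HashMap<>(); // bootstrap servers, group id, etc. omitted
        // The value deserializer is built for the exact class the listener expects.
        return new DefaultKafkaConsumerFactory<>(config,
                new StringDeserializer(),
                new AvroDeserializer<>(AvroPlanCompleteTrigger.class));
    }
}

class Receiver {

    @KafkaListener(topics = "test3")
    public void receive(AvroPlanCompleteTrigger trigger) {
        // Import the same com.wom.repl.odr.dto.AvroPlanCompleteTrigger here,
        // not a copy of the generated class living in another package.
    }
}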
I'm sending a message to the "config" topic from my Spring Boot backend application.
Here's my MQTT setup:
final String mqttServerAddress =
String.format("ssl://%s:%s", options.mqttBridgeHostname, options.mqttBridgePort);
// Create our MQTT client. The mqttClientId is a unique string that identifies this device. For
// Google Cloud IoT Core, it must be in the format below.
final String mqttClientId =
String.format(
"projects/%s/locations/%s/registries/%s/devices/%s",
options.projectId, options.cloudRegion, options.registryId, options.deviceId);
MqttConnectOptions connectOptions = new MqttConnectOptions();
// Note that the Google Cloud IoT Core only supports MQTT 3.1.1, and Paho requires that we
// explicitly set this. If you don't set the MQTT version, the server will immediately close its
// connection to your device.
connectOptions.setMqttVersion(MqttConnectOptions.MQTT_VERSION_3_1_1);
Properties sslProps = new Properties();
sslProps.setProperty("com.ibm.ssl.protocol", "TLSv1.2");
connectOptions.setSSLProperties(sslProps);
// With Google Cloud IoT Core, the username field is ignored, however it must be set for the
// Paho client library to send the password field. The password field is used to transmit a JWT
// to authorize the device.
connectOptions.setUserName(options.userName);
DateTime iat = new DateTime();
if ("RS256".equals(options.algorithm)) {
connectOptions.setPassword(
createJwtRsa(options.projectId, options.privateKeyFile).toCharArray());
} else if ("ES256".equals(options.algorithm)) {
connectOptions.setPassword(
createJwtEs(options.projectId, options.privateKeyFileEC).toCharArray());
} else {
throw new IllegalArgumentException(
"Invalid algorithm " + options.algorithm + ". Should be one of 'RS256' or 'ES256'.");
}
// [START iot_mqtt_publish]
// Create a client, and connect to the Google MQTT bridge.
MqttClient client = new MqttClient(mqttServerAddress, mqttClientId, new MemoryPersistence());
// Both connect and publish operations may fail. If they do, allow retries but with an
// exponential backoff time period.
long initialConnectIntervalMillis = 500L;
long maxConnectIntervalMillis = 6000L;
long maxConnectRetryTimeElapsedMillis = 900000L;
float intervalMultiplier = 1.5f;
long retryIntervalMs = initialConnectIntervalMillis;
long totalRetryTimeMs = 0;
while ((totalRetryTimeMs < maxConnectRetryTimeElapsedMillis) && !client.isConnected()) {
try {
client.connect(connectOptions);
} catch (MqttException e) {
int reason = e.getReasonCode();
// If the connection is lost or if the server cannot be connected, allow retries, but with
// exponential backoff.
System.out.println("An error occurred: " + e.getMessage());
if (reason == MqttException.REASON_CODE_CONNECTION_LOST
|| reason == MqttException.REASON_CODE_SERVER_CONNECT_ERROR) {
System.out.println("Retrying in " + retryIntervalMs / 1000.0 + " seconds.");
Thread.sleep(retryIntervalMs);
totalRetryTimeMs += retryIntervalMs;
retryIntervalMs *= intervalMultiplier;
if (retryIntervalMs > maxConnectIntervalMillis) {
retryIntervalMs = maxConnectIntervalMillis;
}
} else {
throw e;
}
}
}
attachCallback(client, options.deviceId);
// The MQTT topic that this device will publish telemetry data to. The MQTT topic name is
// required to be in the format below. Note that this is not the same as the device registry's
// Cloud Pub/Sub topic.
String mqttTopic = String.format("/devices/%s/%s", options.deviceId, options.messageType);
long secsSinceRefresh = ((new DateTime()).getMillis() - iat.getMillis()) / 1000;
if (secsSinceRefresh > (options.tokenExpMins * MINUTES_PER_HOUR)) {
System.out.format("\tRefreshing token after: %d seconds%n", secsSinceRefresh);
iat = new DateTime();
if ("RS256".equals(options.algorithm)) {
connectOptions.setPassword(
createJwtRsa(options.projectId, options.privateKeyFile).toCharArray());
} else if ("ES256".equals(options.algorithm)) {
connectOptions.setPassword(
createJwtEs(options.projectId, options.privateKeyFileEC).toCharArray());
} else {
throw new IllegalArgumentException(
"Invalid algorithm " + options.algorithm + ". Should be one of 'RS256' or 'ES256'.");
}
client.disconnect();
client.connect(connectOptions);
attachCallback(client, options.deviceId);
}
MqttMessage message = new MqttMessage(data.getBytes(StandardCharsets.UTF_8.name()));
message.setQos(1);
client.publish(mqttTopic, message);
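The code above calls createJwtRsa / createJwtEs, which are not shown here. For reference, a minimal sketch in the style of the Google Cloud IoT Core Java samples, assuming the io.jsonwebtoken (jjwt) library and a PKCS#8 RSA key file (imports omitted, as in the snippets above):
// Sketch only: builds the JWT that is sent as the MQTT password (RS256 variant).
private static String createJwtRsa(String projectId, String privateKeyFile) throws Exception {
    DateTime now = new DateTime();
    JwtBuilder jwtBuilder = Jwts.builder()
            .setIssuedAt(now.toDate())
            .setExpiration(now.plusMinutes(20).toDate()) // should line up with tokenExpMins above
            .setAudience(projectId);                     // Cloud IoT Core expects the project id as the audience
    byte[] keyBytes = Files.readAllBytes(Paths.get(privateKeyFile));
    PKCS8EncodedKeySpec spec = new PKCS8EncodedKeySpec(keyBytes);
    KeyFactory kf = KeyFactory.getInstance("RSA");
    return jwtBuilder.signWith(SignatureAlgorithm.RS256, kf.generatePrivate(spec)).compact();
}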
Here's the options class:
public class MqttExampleOptions {
String mqttBridgeHostname = "mqtt.googleapis.com";
short mqttBridgePort = 8883;
String projectId =
String cloudRegion = "europe-west1";
String userName = "unused";
String registryId = <I don't want to show>
String gatewayId = <I don't want to show>
String algorithm = "RS256";
String command = "demo";
String deviceId = <I don't want to show>
String privateKeyFile = "rsa_private_pkcs8";
String privateKeyFileEC = "ec_private_pkcs8";
int numMessages = 100;
int tokenExpMins = 20;
String telemetryData = "Specify with -telemetry_data";
String messageType = "config";
int waitTime = 120;
}
When I try to publish a message to the "config" topic I get this error:
ERROR 12556 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[.[dispatcherServlet] : Servlet.service()
for servlet [dispatcherServlet] in context with path [/iot-admin] threw exception [Request processing failed; nested exception is Connection Lost (32109) - java.io.EOFException] with root cause
java.io.EOFException: null
at java.base/java.io.DataInputStream.readByte(DataInputStream.java:273) ~[na:na]
at org.eclipse.paho.client.mqttv3.internal.wire.MqttInputStream.readMqttWireMessage(MqttInputStream.java:92) ~[org.eclipse.paho.client.mqttv3-1.2.5.jar:na]
at org.eclipse.paho.client.mqttv3.internal.CommsReceiver.run(CommsReceiver.java:137) ~[org.eclipse.paho.client.mqttv3-1.2.5.jar:na]
This is the message I am sending:
{
"Led": {
"id": "e36b5877-2579-4db1-b595-0e06410bde11",
"rgbColors": [{
"id": "1488acfe-baa7-4de8-b4a2-4e01b9f89fc5",
"configName": "Terminal",
"rgbColor": [150, 150, 150]
}, {
"id": "b8ce6a35-4219-4dba-a8de-a9070f17f1d2",
"configName": "PayZone",
"rgbColor": [150, 150, 150]
}, {
"id": "bf62cef4-8e22-4804-a7d8-a0996bef392e",
"configName": "PayfreeLogo",
"rgbColor": [150, 150, 150]
}, {
"id": "c62d25a4-678b-4833-9123-fe3836863400",
"configName": "BagDetection",
"rgbColor": [200, 200, 200]
}, {
"id": "e19e1ff3-327e-4132-9661-073f853cf913",
"configName": "PersonDetection",
"rgbColor": [150, 150, 150]
}]
}
}
How can I properly send a message to the config topic without getting this error? I am able to send messages to the "state" topic, but not to the "config" topic.
Below is my method definition for the Kafka listener. I believe I'm getting the error below when I receive a null or empty payload. Can you please help?
@KafkaListener(topics = "${kafka.consumer-topic-name.reservation}", groupId = "${kafka.consumer-group-id.test}",
        containerFactory = "kafkaListenerContainerFactory", autoStartup = "${kafka.auto-start.consumer.tets}")
public void consumeReservation(String payload, @Header(KafkaHeaders.RECEIVED_TOPIC) String topic,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String kafkaKey) {}
[org.springframework.kafka.KafkaListenerEndpointContainer #0-0-C-1] ERROR o.s.k.l.SeekToCurrentErrorHandler - Backoff none exhausted for ConsumerRecord(topic = test_topic, partition = 0, leaderEpoch = 2, offset = 453473, CreateTime = 1601962346576, serialized key size = 41, serialized value size = -1, headers = RecordHeaders(headers = [RecordHeader(key = OPERATION, value = [68, 69, 76, 69, 84, 69]), RecordHeader(key = __Key_TypeId__, value = [99, 108, 75, 101, 121])], isReadOnly = false), key = {
"orgId": "1",
"orderId": "U4000024004"}, value = null)
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
Endpoint handler details:
Method[public void com.demo.test.analytics.testanalytics.consumer.FLReservationKafkaConsumer.consumeReservation(java.lang.String, java.lang.String, java.lang.String)]
Bean[com.demo.test.analytics.testanalytics.consumer.FLReservationKafkaConsumer #731702d1];
nested exception is org.springframework.messaging.handler.annotation.support.MethodArgumentNotValidException: Could not resolve method parameter at index 0 in public void com.demo.test.analytics.testanalytics.consumer.FLReservationKafkaConsumer.consumeReservation(java.lang.String, java.lang.String, java.lang.String): 1 error(s): [Error in object 'payload': codes[];arguments[];
default message[Payload value must not be empty]
], failedMessage = GenericMessage[payload = org.springframework.kafka.support.KafkaNull #2d99561c, headers = {
Key_TypeId = [B #19a2dc5f, kafka_offset = 453473, OPERATION = [B #7d75c01a, kafka_consumer = org.apache.kafka.clients.consumer.KafkaConsumer #363f44ef, kafka_timestampType = CREATE_TIME, kafka_receivedPartitionId = 0, kafka_receivedMessageKey = {
"orgId": "1",
"orderId": "U4000024004"
}, kafka_receivedTopic = test_1order, kafka_receivedTimestamp = 1601962346576, kafka_groupId = reservation_group_id
}];nested exception is org.springframework.messaging.handler.annotation.support.MethodArgumentNotValidException: Could not resolve method parameter at index 0 in public void com.demo.test.analytics.testanalytics.consumer.FLReservationKafkaConsumer.consumeReservation(java.lang.String, java.lang.String, java.lang.String): 1 error(s): [Error in object 'payload': codes[];arguments[];
default message[Payload value must not be empty]
],
failedMessage = GenericMessage[payload = org.springframework.kafka.support.KafkaNull #2d99561c, headers = {
Key_TypeId = [B #19a2dc5f, kafka_offset = 453473, OPERATION = [B #7d75c01a, kafka_consumer = org.apache.kafka.clients.consumer.KafkaConsumer #363f44ef, kafka_timestampType = CREATE_TIME, kafka_receivedPartitionId = 0, kafka_receivedMessageKey = {
"orgId": "1",
"orderId": "U4000024004"
}, kafka_receivedTopic = test_1order, kafka_receivedTimestamp = 1601962346576, kafka_groupId = reservation_group_id
}]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java: 1925)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeErrorHandler(KafkaMessageListenerContainer.java: 1913)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java: 1812)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java: 1739)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java: 1636)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java: 1366)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java: 1082)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java: 990)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java: 511)
at java.util.concurrent.FutureTask.run(FutureTask.java: 266)
at java.lang.Thread.run(Thread.java: 748)
Caused by: org.springframework.messaging.handler.annotation.support.MethodArgumentNotValidException: Could not resolve method parameter at index 0 in public void com.demo.test.analytics.testanalytics.consumer.FLReservationKafkaConsumer.consumeReservation(java.lang.String, java.lang.String, java.lang.String): 1 error(s): [Error in object 'payload': codes[];arguments[];
default message[Payload value must not be empty]
]
at org.springframework.messaging.handler.annotation.support.PayloadMethodArgumentResolver.resolveArgument(PayloadMethodArgumentResolver.java: 122)
at org.springframework.kafka.annotation.KafkaListenerAnnotationBeanPostProcessor$KafkaNullAwarePayloadArgumentResolver.resolveArgument(KafkaListenerAnnotationBeanPostProcessor.java: 901)
at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java: 117)
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java: 148)
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java: 116)
at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java: 48)
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java: 329)
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java: 86)
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java: 51)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java: 1880)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java: 1862)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java: 1799)
... 8 common frames omitted
You need to specify that the payload is not required.
@Payload(required = false) String payload, ...
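Applied to the listener above, the signature would look roughly like this (the null payload shows up when the record value is null, i.e. a KafkaNull; the OPERATION header in the log above decodes to DELETE):
@KafkaListener(topics = "${kafka.consumer-topic-name.reservation}", groupId = "${kafka.consumer-group-id.test}",
        containerFactory = "kafkaListenerContainerFactory", autoStartup = "${kafka.auto-start.consumer.tets}")
public void consumeReservation(@Payload(required = false) String payload,
        @Header(KafkaHeaders.RECEIVED_TOPIC) String topic,
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String kafkaKey) {
    if (payload == null) {
        // Null value (e.g. a delete/tombstone record): handle it and return.
        return;
    }
    // normal processing
}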
I'm using Spring Integration JMS 5.1.3 with ActiveMQ, and I found an error with mapping priority:
java.lang.IllegalArgumentException: The 'priority' header value must be a Number.
at org.springframework.util.Assert.isTrue(Assert.java:118) ~[spring-core-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.integration.IntegrationMessageHeaderAccessor.verifyType(IntegrationMessageHeaderAccessor.java:177) ~[spring-integration-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
at org.springframework.messaging.support.MessageHeaderAccessor.setHeader(MessageHeaderAccessor.java:305) ~[spring-messaging-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.messaging.support.MessageHeaderAccessor.lambda$copyHeaders$0(MessageHeaderAccessor.java:396) ~[spring-messaging-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at java.util.HashMap.forEach(HashMap.java:1289) ~[na:1.8.0_181]
at org.springframework.messaging.support.MessageHeaderAccessor.copyHeaders(MessageHeaderAccessor.java:394) ~[spring-messaging-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.integration.support.MessageBuilder.copyHeaders(MessageBuilder.java:179) ~[spring-integration-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
at org.springframework.integration.support.MessageBuilder.copyHeaders(MessageBuilder.java:48) ~[spring-integration-core-5.1.3.RELEASE.jar:5.1.3.RELEASE]
at org.springframework.integration.jms.ChannelPublishingJmsMessageListener.onMessage(ChannelPublishingJmsMessageListener.java:327) ~[spring-integration-jms-5.1.3.RELEASE.jar:5.1.3.RELEASE]
at org.springframework.jms.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:736) ~[spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:696) ~[spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.AbstractMessageListenerContainer.doExecuteListener(AbstractMessageListenerContainer.java:674) ~[spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.doReceiveAndExecute(AbstractPollingMessageListenerContainer.java:318) [spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.receiveAndExecute(AbstractPollingMessageListenerContainer.java:257) [spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.invokeListener(DefaultMessageListenerContainer.java:1189) [spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.executeOngoingLoop(DefaultMessageListenerContainer.java:1179) [spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.run(DefaultMessageListenerContainer.java:1076) [spring-jms-5.1.5.RELEASE.jar:5.1.5.RELEASE]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181]
My message headers are shown in the DEBUG log at the end of this post.
I have disabled the inbound Priority header mapping:
@Bean
public DefaultJmsHeaderMapper jmsHeaderMapper() {
    final DefaultJmsHeaderMapper mapper = new DefaultJmsHeaderMapper();
    mapper.setMapInboundDeliveryMode(true);
    mapper.setMapInboundExpiration(true);
    mapper.setMapInboundPriority(false);
    return mapper;
}
Is there any resolution for this issue?
The inbound message in DEBUG log:
2019-03-01 09:51:51.278 DEBUG 4224 --- [sage-listener-1] .i.j.ChannelPublishingJmsMessageListener : converted JMS Message [ActiveMQTextMessage {commandId = 19, responseRequired = true, messageId = ID:hot-srv-wso2-01-44620-1551368625113-1:4:3:1:1, originalDestination = null, originalTransactionId = null, producerId = ID:hot-srv-wso2-01-44620-1551368625113-1:4:3:1, destination = queue://extraction-request, transactionId = null, expiration = 0, timestamp = 1551408495720, arrival = 0, brokerInTime = 1551408705168, brokerOutTime = 1551408705172, correlationId = ID:hot-srv-wso2-01-44620-1551368625113-1:3:3:1:1, replyTo = queue://extraction-response, persistent = true, type = null, priority = 4, groupID = null, groupSequence = 0, targetConsumerId = null, compressed = false, userID = null, content = null, marshalledProperties = org.apache.activemq.util.ByteSequence#67153d1f, dataStructure = null, redeliveryCounter = 6, size = 0, properties = {Connection=Keep-Alive, User-Agent=Apache-HttpClient/4.1.1 (java 1.5), Host=10.10.15.235:8280, Accept-Encoding=gzip,deflate, jms_type=vn.sps.ias.domain.Response, priority=4, JMS_DESTINATION=ReqOutput, JMS_REPLY_TO=ReqROutput, Content-Length=38, JMS_REDELIVERED=false, Content-Type=application/json, timestamp=1551408705049}, readOnlyProperties = true, readOnlyBody = true, droppable = false, jmsXGroupFirstForConsumer = false, text = {"text":"1 was processed"}}] to integration Message payload []
For me, this problem occurred when I used the header "priority" in a custom message.
MessageBuilder.withPayload(val).setHeader("priority", true).build();
It seems that "priority" is a reserved header name you must not use for arbitrary values.
Changing to
MessageBuilder.withPayload(val).setHeader("prio", true).build();
solved the problem for me.
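If a message priority really is needed, the value stored under the Spring Integration "priority" header (the IntegrationMessageHeaderAccessor.PRIORITY constant) has to be a Number; a numeric value like the one below would not trip the check shown in the stack trace:
MessageBuilder.withPayload(val).setHeader(IntegrationMessageHeaderAccessor.PRIORITY, 4).build();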
We are using a Log4j2 JdbcAppender to log all WARN and ERROR logs into a database.
But when we run our SpringBootTest classes, the tests use an in-memory database, so the JdbcAppender fails to log and throws exceptions.
Caused by: org.h2.jdbc.JdbcSQLException: Table "APPLICATION_LOG" not found; SQL statement:
INSERT INTO application_log (application_log_id,service_id,logger,log_level,message,throwable,log_date) VALUES (?,?,?,?,?,?,?)
2018-06-15 11:53:38,369 main ERROR An exception occurred processing Appender DB org.apache.logging.log4j.core.appender.AppenderLoggingException: Cannot write logging event or flush buffer; JDBC manager cannot connect to the database.
Method:
@PostConstruct
public void onStartUp() {
// Create a new connectionSource build from the Spring properties
LoggerConnectionSource connectionSource = new LoggerConnectionSource(url, userName, password, validationQuery);
// This is the mapping between the columns in the table and what to
// insert in it.
ColumnConfig[] columnConfigs = new ColumnConfig[7];
columnConfigs[0] = ColumnConfig.createColumnConfig(null, "application_log_id", "0", null, null, null, null);
columnConfigs[1] = ColumnConfig.createColumnConfig(null, "service_id", "" + serviceId + "", null, null, "false",
null);
columnConfigs[2] = ColumnConfig.createColumnConfig(null, "logger", "%logger", null, null, "false", null);
columnConfigs[3] = ColumnConfig.createColumnConfig(null, "log_level", "%level", null, null, "false", null);
columnConfigs[4] = ColumnConfig.createColumnConfig(null, "message", "%message", null, null, "false", null);
columnConfigs[5] = ColumnConfig.createColumnConfig(null, "throwable", "%ex{full}", null, null, "false", null);
columnConfigs[6] = ColumnConfig.createColumnConfig(null, "log_date", null, null, "true", null, null);
// filter for the appender to keep only errors
ThresholdFilter filter = ThresholdFilter.createFilter(appenderLevel, null, null);
JdbcAppender appender = JdbcAppender.createAppender(appenderName, "true", filter, connectionSource, "1",
appenderTableName, columnConfigs);
// start the appender, and this is it...
appender.start();
((Logger) LogManager.getRootLogger()).addAppender(appender);
}
Is there any way to skip the JdbcAppender during @SpringBootTest?
I have created a simple JMS sender and receiver program using JDK 1.7 and ActiveMQ 5.10.0. My sender code executes and sends a particular message:
MessageProducer mp = session.createProducer(destination);
Message message = session.createTextMessage("Hi Welcome to ActiveMQ Example");
mp.send(message);
System.out.println("Message Has Sent");
And here is my receiver code. It does not print anything, and after some time it gives me a connection timeout error. Could you find out where I am making a mistake?
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("tcp://Name-PC:61616");
// connection creation
Connection con = connectionFactory.createConnection();
Session session = con.createSession(false,Session.AUTO_ACKNOWLEDGE);
Destination dest= new ActiveMQQueue("Test1.Queue");
MessageConsumer Consumer = session.createConsumer(dest);
Message message = Consumer.receive();
System.out.println("End of Message1");
TextMessage text = (TextMessage) message;
System.out.println("Message" +text.getText());
System.out.println("End of Message");
In http://localhost:8161/admin/queues.jsp it shows the queue as follows:
Name = Test1.Queue, Number Of Pending Messages = 1, Number Of Consumers = 1, Messages Enqueued = 1, but Messages Dequeued shows nothing.
You need to start the connection.
This Groovy code works for me:
factory = new ActiveMQConnectionFactory("tcp://tpmint:61616");
dest = new ActiveMQQueue("foo.bar");
conn = null;
session = null;
consumer = null;
try {
conn = factory.createConnection();
println "Connected: $conn";
session = conn.createSession(false,Session.AUTO_ACKNOWLEDGE);
println "Session: $session";
consumer = session.createConsumer(dest);
println "Consumer: $consumer";
conn.start();
msg = consumer.receive(1000);
if(msg==null) println "Timeout";
else println "Msg:$msg";
} finally {
if(consumer!=null) try { consumer.close(); println "Consumer Closed";} catch (e) {}
if(session!=null) try { session.close(); println "Session Closed";} catch (e) {}
if(conn!=null) try { conn.close(); println "Connection Closed"; } catch (e) {e.printStackTrace(System.err);}
}
Output:
Connected: ActiveMQConnection {id=ID:tpmint-51137-1445798087365-0:8,clientId=null,started=false}
Session: ActiveMQSession {id=ID:tpmint-51137-1445798087365-0:8:1,started=false}
Consumer: ActiveMQMessageConsumer { value=ID:tpmint-51137-1445798087365-0:8:1:1, started=false }
Msg:ActiveMQTextMessage {commandId = 7, responseRequired = false, messageId = ID:tpmint-58446-1445793097761-4:2:1:1:3, originalDestination = null, originalTransactionId = null, producerId = ID:tpmint-58446-1445793097761-4:2:1:1, destination = queue://foo.bar, transactionId = null, expiration = 0, timestamp = 1445798905534, arrival = 0, brokerInTime = 1445798905534, brokerOutTime = 1445802982704, correlationId = , replyTo = null, persistent = false, type = , priority = 0, groupID = null, groupSequence = 0, targetConsumerId = null, compressed = false, userID = null, content = null, marshalledProperties = org.apache.activemq.util.ByteSequence#5346e84e, dataStructure = null, redeliveryCounter = 2, size = 0, properties = {JMSXMessageCounter=1}, readOnlyProperties = true, readOnlyBody = true, droppable = false, text = hey hey}
Consumer Closed
Session Closed
Connection Closed
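Applied to the Java receiver in the question above, the minimal fix is the same: call start() on the connection before receiving (a receive timeout also avoids blocking forever). A rough sketch:
Connection con = connectionFactory.createConnection();
con.start(); // without this, no messages are delivered to the consumer
Session session = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
MessageConsumer consumer = session.createConsumer(new ActiveMQQueue("Test1.Queue"));
Message message = consumer.receive(5000); // wait up to 5 seconds instead of blocking forever
if (message instanceof TextMessage) {
    System.out.println("Message: " + ((TextMessage) message).getText());
}
con.close();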