In my wicket application I have this service class:
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
@Component
@Transactional
public class DatabaseService {

    @Autowired
    SessionFactory sessionFactory;

    public void save(Message m) {}
}
This service class is "injected" into a wicket panel:
public class MyPanel extends Panel {

    @SpringBean
    private DatabaseService service;
}
It works fine. But if I open the application hours later (the server is still running), I receive the following error:
java.net.SocketException: Datenübergabe unterbrochen (broken pipe)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(Unknown Source)
at java.net.SocketOutputStream.write(Unknown Source)
[...]
at java.io.BufferedOutputStream.flushBuffer(Unknown Source)
at java.io.BufferedOutputStream.flush(Unknown Source)
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3634)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2460)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2625)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2547)
at com.mysql.jdbc.ConnectionImpl.setAutoCommit(ConnectionImpl.java:4874)
at org.apache.commons.dbcp.DelegatingConnection.setAutoCommit(DelegatingConnection.java:371)
at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.setAutoCommit(PoolingDataSource.java:328)
[...]
(JdbcResourceLocalTransactionCoordinatorImpl.java:214)
at org.hibernate.engine.transaction.internal.TransactionImpl.begin(TransactionImpl.java:52)
at org.hibernate.internal.SessionImpl.beginTransaction(SessionImpl.java:1525)
at org.springframework.orm.hibernate5.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:500)
[...]
at de.project.database.DatabaseService$$EnhancerBySpringCGLIB$$8fa0ab80.getMessages(<generated>)
at WICKET_de.project.database.DatabaseService$$FastClassByCGLIB$$68e55e7c.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.apache.wicket.proxy.LazyInitProxyFactory$AbstractCGLibInterceptor.intercept(LazyInitProxyFactory.java:350)
at WICKET_de.project.database.DatabaseService$$EnhancerByCGLIB$$a9cbdf2b.getMessages(<generated>)
at de.project.pms.MyPanel.<init>(MyPanel.java:26)
at de.project.home.projectHome.<init>(projectHome.java:17)
Is it connected with the (un)detach mechanism of Wicket?
MySQL closes connections that have been idle longer than its wait_timeout (8 hours by default). This causes exceptions if you are using a datasource/connection pool and do not use connection validation. From the stack trace you've pasted I see that you are using Apache DBCP as a datasource, so I think you should set the following parameters on it:
validationQuery, testOnCreate, testOnBorrow, testOnReturn, testWhileIdle
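For example, here is a minimal programmatic sketch of such a pool (assuming DBCP 1.x's BasicDataSource, with a placeholder MySQL URL and credentials; note that testOnCreate only exists in DBCP 2):

import org.apache.commons.dbcp.BasicDataSource;

public class PoolConfig {

    // Builds a DBCP pool that validates connections, so connections killed by
    // MySQL's wait_timeout are evicted instead of surfacing as broken pipes.
    public static BasicDataSource mySqlDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.mysql.jdbc.Driver");
        ds.setUrl("jdbc:mysql://localhost:3306/mydb"); // placeholder URL
        ds.setUsername("user");                        // placeholder credentials
        ds.setPassword("secret");

        ds.setValidationQuery("SELECT 1");          // cheap query used to test connections
        ds.setTestOnBorrow(true);                   // validate before handing a connection out
        ds.setTestWhileIdle(true);                  // also validate connections sitting idle
        ds.setTimeBetweenEvictionRunsMillis(60000); // run the idle checker every minute
        return ds;
    }
}

If you configure the pool in Spring XML instead, set the same names as <property> entries on the BasicDataSource bean.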
I'm running WildFly 20.0.1.Final in a standalone two-node cluster. I'm trying to implement HTTP session sharing between the nodes.
In my Spring web application I have <distributable/> in my web.xml.
My session object is this:
package my.package;
@Component
@Scope(value = WebApplicationContext.SCOPE_SESSION, proxyMode = ScopedProxyMode.TARGET_CLASS)
public class MySessionBean implements Serializable {
    // omitted for brevity
}
As you can see, I have ScopedProxyMode.TARGET_CLASS.
When I perform a failover in WildFly, however, my HTTP session can't be restored, as I hit this warning:
2021-02-22 13:24:18,651 WARN [org.wildfly.clustering.web.infinispan] (default task-1) WFLYCLWEBINF0007:
Failed to activate attributes of session Pd9oI0OBiZSC9we0uXsZdBwkLnadO1l4TUfvoJZf:
org.wildfly.clustering.marshalling.spi.InvalidSerializedFormException:
java.lang.ClassNotFoundException: my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df
from [Module "deployment.myDeployment.war" from Service Module Loader]
...
Caused by: java.lang.ClassNotFoundException: my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df from [Module "deployment.myDeployment.war" from Service Module Loader]
at org.jboss.modules.ModuleClassLoader.findClass(ModuleClassLoader.java:255)
at org.jboss.modules.ConcurrentClassLoader.performLoadClassUnchecked(ConcurrentClassLoader.java:410)
at org.jboss.modules.ConcurrentClassLoader.performLoadClass(ConcurrentClassLoader.java:398)
at org.jboss.modules.ConcurrentClassLoader.loadClass(ConcurrentClassLoader.java:116)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at org.jboss.marshalling@2.0.9.Final//org.jboss.marshalling.ModularClassResolver.resolveClass(ModularClassResolver.java:133)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadClassDescriptor(RiverUnmarshaller.java:1033)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadNewObject(RiverUnmarshaller.java:1366)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadObject(RiverUnmarshaller.java:283)
at org.jboss.marshalling.river@2.0.9.Final//org.jboss.marshalling.river.RiverUnmarshaller.doReadObject(RiverUnmarshaller.java:216)
at org.jboss.marshalling@2.0.9.Final//org.jboss.marshalling.AbstractObjectInput.readObject(AbstractObjectInput.java:41)
at org.wildfly.clustering.marshalling.spi@20.0.1.Final//org.wildfly.clustering.marshalling.spi.util.MapExternalizer.readObject(MapExternalizer.java:65)
...
Note that the ClassNotFoundException complains about the missing my.package.MySessionBean$$EnhancerBySpringCGLIB$$9c0fa1df, which is the Spring-enhanced proxy class of my MySessionBean bean.
Changing to ScopedProxyMode.INTERFACES is not an option.
Can you please point me in the right direction with this?
I managed to fix this by creating a simple POJO, called MySessionDTO, and using that in my session.
So initially I had this (which threw the exception in the question):
request.getSession().setAttribute("mySession", mySessionBean);
...and after I created MySessionDTO (see below), I refactored it into this:
request.getSession().setAttribute("mySession", mySessionBean.getMySessionDTO());
MySessionDTO is a simple POJO:
package my.package;
import java.io.Serializable;
public class MySessionDTO extends MySessionBean implements Serializable {

    public MySessionDTO(MySessionBean mySessionBean) {
        this.setAttributeX(mySessionBean.getAttributeX());
        this.setAttributeY(mySessionBean.getAttributeY());
    }
}
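With that in place, reading the attribute back yields a plain serializable object instead of a CGLIB proxy (a minimal sketch, using the attribute name from above):

// The restored session attribute is a plain POJO, so Infinispan can
// deserialize it on the other node after a failover.
MySessionDTO dto = (MySessionDTO) request.getSession().getAttribute("mySession");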
I am using dropwizard 1.2.4 with log4j 1.2.17 and have followed the instructions at https://github.com/arteam/dropwizard-nologback/.
It throws an exception like the one below during unit testing.
java.lang.NoClassDefFoundError: ch/qos/logback/core/filter/Filter
at io.dropwizard.testing.junit.ResourceTestRule.<clinit>(ResourceTestRule.java:34)
at com.vnera.restapilayer.api.resources.ApiInfoControllerTest.<clinit>(ApiInfoControllerTest.java:25)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
at java.lang.reflect.Field.get(Field.java:393)
at org.junit.runners.model.FrameworkField.get(FrameworkField.java:73)
at org.junit.runners.model.TestClass.getAnnotatedFieldValues(TestClass.java:230)
at org.junit.runners.ParentRunner.classRules(ParentRunner.java:255)
at org.junit.runners.ParentRunner.withClassRules(ParentRunner.java:244)
at org.junit.runners.ParentRunner.classBlock(ParentRunner.java:194)
at org.junit.runners.ParentRunner.run(ParentRunner.java:362)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: java.lang.ClassNotFoundException: ch.qos.logback.core.filter.Filter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 19 more
My test code looks like this:
import io.dropwizard.testing.junit.ResourceTestRule;
import org.junit.Assert;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import javax.ws.rs.core.Response;
import static org.mockito.Mockito.mock;
@Category(value = UnitTest.class)
public class ApiInfoControllerTest {

    private static ApiNonFunctionalHandler nonFunctionalHandler = mock(ApiNonFunctionalHandler.class);
    private static ApiFilter apiFilter = new ApiFilter(nonFunctionalHandler);

    private static final String authToken = "NetworkInsight xTyAGJmZ8nU8yJDP7LnA8Q==";

    @ClassRule
    public static final ResourceTestRule resources = ResourceTestRule.builder()
            .addResource(new ApiInfoController())
            .addProvider(apiFilter).build();

    @Test
    public void testApiVersion() throws Exception {
        Response response = resources.client()
                .target(ApiConstants.INFO_BASE_URL + "/version")
                .request()
                .header("Authorization", authToken)
                .buildGet().invoke();

        Assert.assertNotNull(response);
        Assert.assertEquals(response.toString(), Response.Status.OK.getStatusCode(), response.getStatus());

        final VersionResponse actualError = response.readEntity(VersionResponse.class);
        Assert.assertEquals(actualError.getApiVersion(), ApiConstants.API_VERSION);
    }
}
My main application works fine. Its configuration.yaml looks like this:
# Change default server ports
server:
  applicationConnectors:
    - type: http
      port: 8123
  adminConnectors:
    - type: http
      port: 8124
  requestLog:
    type: external
logging:
  type: external
Can someone let me know what could be going wrong and how I can get around this?
EDIT
The output of mvn dependency:tree is placed here, as I am hitting the character limit.
This is a bug in dropwizard 1.2.4, as discussed in https://github.com/dropwizard/dropwizard/pull/2338.
When I run my task without CommandLineRunner implemented and add the @Scheduled annotation, it appears the context is being closed. How can I keep the context open so that the @Scheduled method can run properly?
DataTransferTask.java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class DataTransferTask {

    public static void main(String[] args) {
        SpringApplication.run(DataTransferTask.class, args);
    }
}
DataTransferRunner.java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
@Component
@EnableTask
@EnableScheduling
public class DataTransferRunner {

    @Autowired
    public DataTransferRunner() {
    }

    @Scheduled(fixedRateString = "${job_concurrency.fixed-rate}")
    public void run() throws Exception {
        System.out.println("I started running");
    }
}
Here is the exception I keep getting:
Caused by: java.lang.IllegalStateException: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#21526f6c has been closed already
at org.springframework.context.support.AbstractApplicationContext.assertBeanFactoryActive(AbstractApplicationContext.java:1065) ~[spring-context-4.3.8.RELEASE.jar:4.3.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1176) ~[spring-context-4.3.8.RELEASE.jar:4.3.8.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchExecutionListenerBeanPostProcessor.postProcessAfterInitialization(TaskBatchExecutionListenerBeanPostProcessor.java:59) ~[spring-cloud-task-batch-1.2.0.RELEASE.jar:1.2.0.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsAfterInitialization(AbstractAutowireCapableBeanFactory.java:423) ~[spring-beans-4.3.8.RELEASE.jar:4.3.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1633) ~[spring-beans-4.3.8.RELEASE.jar:4.3.8.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555) ~[spring-beans-4.3.8.RELEASE.jar:4.3.8.RELEASE]
... 73 common frames omitted
Since you have @EnableTask, Spring will close the context once everything is done running. From the code it looks like you are not running anything explicitly, so Spring closes the context before your @Scheduled method kicks in.
The fix is to tell Spring not to close the context at all with spring.cloud.task.closecontext_enable=false. This will keep the context open for your scheduled task.
Some documentation here: https://docs.spring.io/spring-cloud-task/docs/1.2.2.RELEASE/reference/htmlsingle/#features-lifecycle
One more note about the property: the documentation says closecontext_enable, but after inspecting the logs and the jar, I found that the property has been deprecated and replaced by close_context_enabled.
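So on a current 1.2.x release, the working entry in application.properties should be (using the replacement name observed above):

spring.cloud.task.close_context_enabled=false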
I am trying to connect to an Aster server with JDBC drivers from Java. I have already added the JAR files to the classpath.
import java.sql.Connection;
import java.sql.DriverManager;

public class TeradataJDBCConnection {

    public static void main(String[] args) throws Exception {
        Class.forName("com.asterdata.ncluster.Driver");

        String url = "jdbc:ncluster://<ip_address>:2406/test";
        Connection conn = DriverManager.getConnection(url, "user123", "test");
    }
}
But I am getting the error below.
Exception in thread "main" java.sql.SQLException: [AsterData][ASTERJDBCDSII](34) : Failed to connect to 10.99.186.92. Please check the host address. ()
at com.asterdata.ncluster.jdbc.core.NClusterConnection.connect(Unknown Source)
at com.simba.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.simba.jdbc.common.AbstractDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at Tera.TeradataJDBCConnection.main(TeradataJDBCConnection.java:17)
Caused by: com.asterdata.ncluster.jdbc.core.MuleException: [AsterData][ASTERJDBCDSII](34) : Failed to connect to 10.99.186.92. Please check the host address. ()
... 6 more
There is no bug in the code; everything runs fine once the correct driver is on the classpath.
Download the JDBC driver from https://aster-community.teradata.com/docs/DOC-2254.
You could also download the driver from http://downloads.teradata.com/download/aster/aster-client-tools-for-windows and use the noarch-aster-jdbc-driver jar from the AsterJDBC__indep_indep.06.10.00.02.zip file. That jar works for my connection.
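Once the driver jar is on the classpath, a try-with-resources variant of the question's code makes a cleaner connection test (a sketch with the same placeholder host and credentials, assuming the server accepts a trivial SELECT 1):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AsterConnectionTest {

    public static void main(String[] args) throws Exception {
        Class.forName("com.asterdata.ncluster.Driver");
        String url = "jdbc:ncluster://<ip_address>:2406/test";

        // try-with-resources closes the statement and connection even on failure
        try (Connection conn = DriverManager.getConnection(url, "user123", "test");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}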
I'm in the process of moving a simple Kafka consumer application out of an existing framework, and spring-cloud-stream feels like an easy way to do that. I used Initializr to bootstrap the app, which now uses Spring-Boot v1.3.3 and Spring-Cloud-Stream v1.0.0-RC1. The application is extremely simple: all it has to do is pick a message from Kafka, deserialize the JSON-encoded object, and pass it on to our existing library. To get started I just used the LogSink example, since eventually I won't do much else (just deserialize and pass the object to a different method).
It all works great: it connects to Kafka, receives the message and passes it (as byte[]) to my sink. However, EmbeddedHeadersMessageConverter logs a StringIndexOutOfBoundsException:
2016-04-11 10:06:50.287 ERROR 11464 --- [pool-1-thread-1] fkaMessageChannelBinder$ReceivingHandler : Could not convert message: 7B2267656E65726174696F6E223A3 [...]
java.lang.StringIndexOutOfBoundsException: String index out of range: 2009
at java.lang.String.checkBounds(String.java:373) ~[na:1.8.0_25]
at java.lang.String.<init>(String.java:413) ~[na:1.8.0_25]
at org.springframework.cloud.stream.binder.EmbeddedHeadersMessageConverter.oldExtractHeaders(EmbeddedHeadersMessageConverter.java:131) ~[spring-cloud-stream-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.cloud.stream.binder.EmbeddedHeadersMessageConverter.extractHeaders(EmbeddedHeadersMessageConverter.java:104) ~[spring-cloud-stream-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder$ReceivingHandler.handleRequestMessage(KafkaMessageChannelBinder.java:583) ~[spring-cloud-stream-binder-kafka-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:99) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:127) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:69) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:63) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:115) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:45) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:105) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:105) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$300(KafkaMessageDrivenChannelAdapter.java:43) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$AutoAcknowledgingChannelForwardingMessageListener.doOnMessage(KafkaMessageDrivenChannelAdapter.java:171) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.AbstractDecodingMessageListener.onMessage(AbstractDecodingMessageListener.java:50) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.QueueingMessageListenerInvoker$KafkaMessageDispatchingSubscriber.onNext(QueueingMessageListenerInvoker.java:221) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.QueueingMessageListenerInvoker$KafkaMessageDispatchingSubscriber.onNext(QueueingMessageListenerInvoker.java:209) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at reactor.core.processor.util.RingBufferSubscriberUtils.route(RingBufferSubscriberUtils.java:67) [reactor-core-2.0.7.RELEASE.jar:na]
at reactor.core.processor.RingBufferProcessor$BatchSignalProcessor.run(RingBufferProcessor.java:789) [reactor-core-2.0.7.RELEASE.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_25]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_25]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_25]
https://github.com/spring-cloud/spring-cloud-stream/issues/209 seems to indicate the problem is missing Kafka headers, which is true; there aren't any. But the solution mentioned there is to add
spring.cloud.stream.binder.kafka.mode=raw
to my application configuration. Unfortunately that did not work for me. Also, STS actually has auto-completion for the respective properties and it suggested
spring.cloud.stream.kafka.binder.mode=raw
Neither of the two (separately or combined) made any difference; the exception is still being logged.
I have used Spring for years, but this would be my first Spring-Boot/Spring-Cloud application.
Here's the application code:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.integration.annotation.ServiceActivator;
@SpringBootApplication
public class UpdateApplication {

    private static Logger logger = LoggerFactory.getLogger(UpdateApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(UpdateApplication.class, args);
    }

    @EnableBinding(Sink.class)
    public static class UpdateHandler {

        @StreamListener(Sink.INPUT)
        // @ServiceActivator(inputChannel = Sink.INPUT)
        public void loggerSink(Object payload) {
            logger.info("Received: " + payload);
        }
    }
}
I tried both the @ServiceActivator and the @StreamListener annotation, which in this case does not seem to make a difference.
My application.properties looks like this:
spring.cloud.stream.bindings.input.binder=kafka
spring.cloud.stream.bindings.input.destination=updates
spring.cloud.stream.bindings.input.group=update-client
spring.cloud.stream.kafka.binder.brokers=brokerName
spring.cloud.stream.kafka.binder.zkNodes=zookeeperName
spring.cloud.stream.kafka.binder.mode=raw
Any help to get rid of this error would be appreciated.
As a side note: Since I just started experimenting with spring-cloud-stream I added
spring.cloud.stream.bindings.updates.consumer.resetOffsets=true
spring.cloud.stream.bindings.updates.consumer.startOffset=earliest
to the configuration to avoid having to send new messages every time I restart, but that didn't work.
Since the RC, that option has been moved to the .consumer. configuration section.
So right now you have to do it like this:
spring.cloud.stream.bindings.input.consumer.mode=raw
See more info in the Reference Manual.
spring.cloud.stream.bindings.input.consumer.headerMode=raw
is working for version 1.1.0.RELEASE.
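Putting it together with the configuration from the question, the relevant application.properties entries become (a sketch using the 1.1.0.RELEASE property name from the answer above; broker and ZooKeeper names unchanged):

spring.cloud.stream.bindings.input.binder=kafka
spring.cloud.stream.bindings.input.destination=updates
spring.cloud.stream.bindings.input.group=update-client
spring.cloud.stream.bindings.input.consumer.headerMode=raw
spring.cloud.stream.kafka.binder.brokers=brokerName
spring.cloud.stream.kafka.binder.zkNodes=zookeeperName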