Using custom feature generators with parameters in OpenNLP - opennlp

I am trying to setup the OpenNLP NameFinder in a project with an XML feature generator descriptor and some non-standard features. The XML descriptor has support for custom feature generators:
<generators>
  <cache>
    <generators>
      ...
      <custom class="com.example.MyFeatureGenerator"/>
    </generators>
  </cache>
</generators>
However, the documentation doesn't mention passing parameters to the feature generator. Creating a new class for every slightly different configuration of the feature generator is not desirable. On the other hand, creating the feature generators programmatically likely means duplicating much of the OpenNLP code that handles the feature generator setup. What is the recommended way to use custom feature generators in OpenNLP?

If you don't mind, open a JIRA issue over at Apache OpenNLP and request a fix for this. It should be possible for the custom element to pass in parameters and external resources.

No proper solution yet, but I worked around the issue by registering a new feature factory in OpenNLP. Unfortunately, this needs access to private parts of the OpenNLP class GeneratorFactory via reflection. Here's a working solution.
First, define a new class, named XmlDescriptorUtil:
import java.lang.reflect.Field;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.Map;

import opennlp.tools.util.InvalidFormatException;
import opennlp.tools.util.featuregen.AdaptiveFeatureGenerator;
import opennlp.tools.util.featuregen.FeatureGeneratorResourceProvider;
import opennlp.tools.util.featuregen.GeneratorFactory;
import org.w3c.dom.Element;

public final class XmlDescriptorUtil {

    private XmlDescriptorUtil() {}

    public static abstract class XmlDescriptorFactory implements InvocationHandler {

        @Override
        public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
            return create((Element) args[0], (FeatureGeneratorResourceProvider) args[1]);
        }

        public abstract AdaptiveFeatureGenerator create(Element generatorElement,
                FeatureGeneratorResourceProvider resourceManager) throws InvalidFormatException;
    }

    public static void register(String name, XmlDescriptorFactory factory) throws Exception {
        // The factory interface is not public, so it has to be looked up by name.
        Class<?> factoryInterface = Class.forName(GeneratorFactory.class.getName() + "$XmlFeatureGeneratorFactory");
        Object proxy = Proxy.newProxyInstance(GeneratorFactory.class.getClassLoader(),
                new Class[]{factoryInterface}, factory);
        registerByProxy(name, proxy);
    }

    private static void registerByProxy(String name, Object proxy) throws Exception {
        // Add the proxy to GeneratorFactory's private static map of factories.
        Field f = GeneratorFactory.class.getDeclaredField("factories");
        f.setAccessible(true);
        @SuppressWarnings("unchecked")
        Map<String, Object> factories = (Map<String, Object>) f.get(null);
        factories.put(name, proxy);
    }
}
Then, create a feature generator factory which extends the public abstract class XmlDescriptorUtil.XmlDescriptorFactory and register it, for example from main:
public static void main(String[] args) throws Exception {
    XmlDescriptorUtil.register("myCustom", new XmlDescriptorUtil.XmlDescriptorFactory() {
        @Override
        public AdaptiveFeatureGenerator create(Element generatorElement,
                FeatureGeneratorResourceProvider resourceManager) throws InvalidFormatException {
            return new MyFeatureGenerator();
        }
    });
}
Now the feature generator is registered and can be used in the XML descriptor:
<generators>
  <cache>
    <generators>
      ...
      <myCustom/>
    </generators>
  </cache>
</generators>
If the feature generator needs parameters, they can be extracted from generatorElement in the factory class.
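For example, here is a minimal sketch of a factory that reads a hypothetical window attribute from the descriptor element, so that something like <myCustom window="3"/> becomes possible (the int constructor of MyFeatureGenerator is assumed for illustration):
XmlDescriptorUtil.register("myCustom", new XmlDescriptorUtil.XmlDescriptorFactory() {
    @Override
    public AdaptiveFeatureGenerator create(Element generatorElement,
            FeatureGeneratorResourceProvider resourceManager) throws InvalidFormatException {
        // getAttribute returns an empty string when the attribute is absent
        String window = generatorElement.getAttribute("window");
        int windowSize = window.isEmpty() ? 2 : Integer.parseInt(window);
        return new MyFeatureGenerator(windowSize);
    }
});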

Related

Aspect is not triggered

I am trying to implement a read-only data source in my application.
According to the following repo implementation, this aspect method should be called when a transaction happens, but it is never triggered (the line System.out.println("Aspect executed"); is never printed to the console):
@Aspect
@Component
@Order(0)
public class TransactionReadonlyAspect {

    @Around("@annotation(transactional)")
    public Object proceed(ProceedingJoinPoint proceedingJoinPoint,
            org.springframework.transaction.annotation.Transactional transactional) throws Throwable {
        System.out.println("Aspect executed");
        try {
            if (transactional.readOnly()) {
                DatabaseContextHolder.set(DatabaseEnvironment.READONLY);
            }
            return proceedingJoinPoint.proceed();
        } finally {
            DatabaseContextHolder.reset();
        }
    }
}
Also, in the following class the default datasource is initialized no matter what.
How can I make this work, or what other configurations do I need to add?
Thanks.
package com.programmingsharing.demoreadwriterouting.conf;

import com.programmingsharing.demoreadwriterouting.context.DatabaseEnvironment;
import com.programmingsharing.demoreadwriterouting.datasource.MasterSlaveRoutingDataSource;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.sql.DataSource;
import java.util.HashMap;
import java.util.Map;

@Configuration
public class DataSourceConfiguration {

    @Value("${jdbc.master.url}")
    private String mstUrl;

    @Value("${jdbc.master.username}")
    private String mstUsername;

    @Value("${jdbc.master.password}")
    private String mstPassword;

    @Value("${jdbc.slave.url}")
    private String slaveUrl;

    @Value("${jdbc.slave.username}")
    private String slaveUsername;

    @Value("${jdbc.slave.password}")
    private String slavePassword;

    @Bean
    public DataSource dataSource() {
        MasterSlaveRoutingDataSource masterSlaveRoutingDataSource = new MasterSlaveRoutingDataSource();
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DatabaseEnvironment.UPDATABLE, masterDataSource());
        targetDataSources.put(DatabaseEnvironment.READONLY, slaveDataSource());
        masterSlaveRoutingDataSource.setTargetDataSources(targetDataSources);
        // Set as all transaction point to master
        masterSlaveRoutingDataSource.setDefaultTargetDataSource(masterDataSource());
        return masterSlaveRoutingDataSource;
    }

    public DataSource slaveDataSource() {
        HikariDataSource hikariDataSource = new HikariDataSource();
        hikariDataSource.setJdbcUrl(slaveUrl);
        hikariDataSource.setUsername(slaveUsername);
        hikariDataSource.setPassword(slavePassword);
        return hikariDataSource;
    }

    public DataSource masterDataSource() {
        HikariDataSource hikariDataSource = new HikariDataSource();
        hikariDataSource.setJdbcUrl(mstUrl);
        hikariDataSource.setUsername(mstUsername);
        hikariDataSource.setPassword(mstPassword);
        return hikariDataSource;
    }
}
https://programmingsharing.com/routing-read-write-datasource-in-spring-99bcc4468f94
Also, the context is always printed as null:
CONTEXT.get() : null
public class DatabaseContextHolder {

    private static final ThreadLocal<DatabaseEnvironment> CONTEXT = new ThreadLocal<>();

    public static void set(DatabaseEnvironment databaseEnvironment) {
        CONTEXT.set(databaseEnvironment);
    }

    public static DatabaseEnvironment getEnvironment() {
        System.out.println("CONTEXT.get() : " + CONTEXT.get());
        return CONTEXT.get();
    }

    public static void reset() {
        CONTEXT.set(DatabaseEnvironment.UPDATABLE);
    }
}
Also, this is always null; the database environment never gets set:
DatabaseContextHolder.getEnvironment() : null
public class MasterSlaveRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        System.out.println("DatabaseContextHolder.getEnvironment() : " + DatabaseContextHolder.getEnvironment());
        return DatabaseContextHolder.getEnvironment();
    }
}
That's obviously not the answer to your question, but I would discourage you from using the datasource routing "solution" you are referring to.
The problem is that, from the spring-tx perspective, a transaction is read-only if and only if the outermost transaction definition is read-only. Please check some examples of execution stacks below:
@Transactional(readOnly = true)
    ...
    @Transactional(readOnly = false)
    // current tx is read-only regardless of the readOnly = false definition

@Transactional(readOnly = false)
    ...
    @Transactional(readOnly = true)
    // current tx is not read-only regardless of the readOnly = true definition

The "AspectJ" solution does not take that spring-tx convention into account and is thus basically wrong.
Technically, we may determine whether a transaction is read-only or not by calling the TransactionSynchronizationManager#isCurrentTransactionReadOnly method. Unfortunately, that won't help us much, because spring-tx may acquire resources (a JDBC connection) before marking the transaction as read-only. This problem was mentioned by Vlad Mihalcea in Read-write and read-only transaction routing with Spring:
Not only that the hibernate.connection.provider_disables_autocommit allows you to make better use of database connections, but it’s the only way we can make this example work since, without this configuration, the connection is acquired prior to calling the determineCurrentLookupKey method TransactionRoutingDataSource.
There are two options:
if you are using Hibernate, just follow Vlad's recommendations
if you are not using Hibernate, you need to take into account that you only need to control the outermost transaction definitions: place your own annotations/aspects there and do not depend on spring-tx stuff (a minimal sketch of that approach follows below).
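For the second option, here is a minimal sketch of such a dedicated annotation plus aspect. The annotation name ReadOnlyRoute is made up for illustration, and DatabaseContextHolder/DatabaseEnvironment are the classes from the question; the idea is that the annotation is only placed on outermost entry points:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;

// Hypothetical marker annotation, applied only to the outermost entry points
// (e.g. controller or facade methods) that should hit the read-only replica.
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@interface ReadOnlyRoute {
}

@Aspect
@Order(0) // run before the transaction interceptor so routing happens first
@Component
class ReadOnlyRouteAspect {

    @Around("@annotation(readOnlyRoute)")
    public Object routeToReplica(ProceedingJoinPoint pjp, ReadOnlyRoute readOnlyRoute) throws Throwable {
        // Route to the replica before any transaction/connection is opened further down the stack.
        DatabaseContextHolder.set(DatabaseEnvironment.READONLY);
        try {
            return pjp.proceed();
        } finally {
            DatabaseContextHolder.reset();
        }
    }
}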

How to handle Access Denied properly in Vaadin 14 LTS

I started implementing authentication and authorization for our applications written in Spring Boot (2.2.6.RELEASE) and Vaadin 14 LTS (14.6.1).
I have followed those resources:
Securing your app with Spring Security
Router Exception Handling
I have code that checks whether the logged-in user has access rights to the specified resources, implemented in the beforeEnter method. The problem is with the invocation of event.rerouteToError(AccessDeniedException.class);. It tries to create an instance of the specified exception with reflection, but fails because it does not contain a public no-arg constructor.
private void beforeEnter(final BeforeEnterEvent event) {
    if (!AuthView.class.equals(event.getNavigationTarget()) && !AuthUtils.isUserLoggedIn()) {
        event.rerouteTo(AuthView.class);
    }
    if (!AuthUtils.isAccessGranted(event.getNavigationTarget())) {
        event.rerouteToError(AccessDeniedException.class);
    }
}
java.lang.IllegalArgumentException: Unable to create an instance of 'org.springframework.security.access.AccessDeniedException'. Make sure the class has a public no-arg constructor.
at com.vaadin.flow.internal.ReflectTools.createProxyInstance(ReflectTools.java:519)
at com.vaadin.flow.internal.ReflectTools.createInstance(ReflectTools.java:451)
at com.vaadin.flow.router.BeforeEvent.rerouteToError(BeforeEvent.java:720)
at com.vaadin.flow.router.BeforeEvent.rerouteToError(BeforeEvent.java:704)
What would be the best solution for this case? I am thinking about two possible solutions:
First instantiate AccessDeniedException and then pass it to the overloaded method in BeforeEvent, public void rerouteToError(Exception exception, String customMessage), which skips creating the exception object by reflection
Create a dedicated ErrorView and use the method public void rerouteTo(Class<? extends Component> routeTargetType, RouteParameters parameters) of BeforeEvent
I decided to follow Leif Åstrand's answer. I created a custom AccessDeniedException and an appropriate error handler. Here is my implementation. Maybe it will be helpful for someone.
public class AccessDeniedException extends RuntimeException {

    private final int code;

    public AccessDeniedException() {
        super("common.error.403.details");
        this.code = HttpServletResponse.SC_FORBIDDEN;
    }

    public int getCode() {
        return code;
    }
}
@Tag(Tag.DIV)
@CssImport(value = "./styles/access-denied-view.css")
@CssImport(value = "./styles/access-denied-box.css", themeFor = "vaadin-details")
public class AccessDeniedExceptionHandler extends VerticalLayout implements HasErrorParameter<AccessDeniedException> {

    private final Details details;

    public AccessDeniedExceptionHandler() {
        setWidthFull();
        setHeight("100vh");
        setPadding(false);
        setDefaultHorizontalComponentAlignment(Alignment.CENTER);
        setJustifyContentMode(JustifyContentMode.CENTER);
        setClassName(ComponentConstants.ACCESS_DENIED_VIEW);
        this.details = new Details();
        this.details.setClassName(ComponentConstants.ACCESS_DENIED_BOX);
        this.details.addThemeVariants(DetailsVariant.REVERSE, DetailsVariant.FILLED);
        this.details.setOpened(true);
        add(this.details);
    }

    @Override
    public final int setErrorParameter(final BeforeEnterEvent event, final ErrorParameter<AccessDeniedException> parameter) {
        final int code = parameter.getException().getCode();
        this.details.setSummaryText(getTranslation("common.error.403.header", code));
        this.details.setContent(new Text(getTranslation(parameter.getException().getMessage())));
        return code;
    }
}
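With this custom exception in place, the beforeEnter method from the question can keep rerouting by class, since the new exception has a public no-arg constructor and Vaadin can instantiate it reflectively; a minimal sketch:
private void beforeEnter(final BeforeEnterEvent event) {
    if (!AuthView.class.equals(event.getNavigationTarget()) && !AuthUtils.isUserLoggedIn()) {
        event.rerouteTo(AuthView.class);
    }
    if (!AuthUtils.isAccessGranted(event.getNavigationTarget())) {
        // This is the custom AccessDeniedException above, not Spring's,
        // so it is rendered by AccessDeniedExceptionHandler.
        event.rerouteToError(AccessDeniedException.class);
    }
}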
I would recommend creating a custom exception type instead of reusing AccessDeniedException from Spring. In that way, you don't have to deal with the required error message at all.
As you mentioned in your first solution, you could do:
event.rerouteToError(new AccessDeniedException("Navigation target not permitted"), "");
or maybe also specify the customMessage if you want. If you look at the implementation of the rerouteToError(Class) method, it just passes an empty customMessage and creates the exception - which you can do manually, and that's completely acceptable. I recommend this solution.
Another solution could be to subclass AccessDeniedException and use that with reflection:
public class RouteAccessDeniedException extends AccessDeniedException {
    public RouteAccessDeniedException() {
        super("Navigation target not permitted");
    }
}
I don't recommend this solution.

How to override an application property programmatically in Quarkus

I've recently started using Testcontainers for unit/integration testing of database operations in my Quarkus webapp. It works fine, except I cannot figure out a way to dynamically set the MySQL port in the quarkus.datasource.url application property. Currently I'm using the deprecated withPortBindings method to force the container to bind the exposed MySQL port to port 11111, but the right way is to let Testcontainers pick a random free port and override the quarkus.datasource.url property.
My unit test class:
@Testcontainers
@QuarkusTest
public class UserServiceTest {

    @Container
    private static final MySQLContainer MY_SQL_CONTAINER = (MySQLContainer) new MySQLContainer()
            .withDatabaseName("userServiceDb")
            .withUsername("foo")
            .withPassword("bar")
            .withUrlParam("serverTimezone", "UTC")
            .withExposedPorts(3306)
            .withCreateContainerCmdModifier(cmd ->
                    ((CreateContainerCmd) cmd).withHostName("localhost")
                            .withPortBindings(new PortBinding(Ports.Binding.bindPort(11111), new ExposedPort(3306))) // deprecated, let testcontainers pick random free port
            );

    @BeforeAll
    public static void setup() {
        // TODO: use the return value from MY_SQL_CONTAINER.getJdbcUrl()
        // to set %test.quarkus.datasource.url
        LOGGER.info(" ********************** jdbc url = {}", MY_SQL_CONTAINER.getJdbcUrl());
    }

    // snip...
}
my application.properties:
%test.quarkus.datasource.url=jdbc:mysql://localhost:11111/userServiceDb?serverTimezone=UTC
%test.quarkus.datasource.driver=com.mysql.cj.jdbc.Driver
%test.quarkus.datasource.username=foo
%test.quarkus.datasource.password=bar
%test.quarkus.hibernate-orm.dialect=org.hibernate.dialect.MySQL8Dialect
The Quarkus guide to configuring an app describes how to programmatically read an application property:
String databaseName = ConfigProvider.getConfig().getValue("database.name", String.class);
but not how to set it. This tutorial on using Testcontainers with Quarkus implies it should be possible:
// Below should not be used - Function is deprecated and for simplicity of test , You should override your properties at runtime
SOLUTION:
As suggested in the accepted answer, I don't have to specify host and port in the datasource property. So the solution is to simply replace the two lines in application.properties:
%test.quarkus.datasource.url=jdbc:mysql://localhost:11111/userServiceDb
%test.quarkus.datasource.driver=com.mysql.cj.jdbc.Driver
with
%test.quarkus.datasource.url=jdbc:tc:mysql:///userServiceDb
%test.quarkus.datasource.driver=org.testcontainers.jdbc.ContainerDatabaseDriver
(and remove the unnecessary withExposedPorts and withCreateContainerCmdModifier method calls)
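With that change the test class itself shrinks to roughly the following sketch (assuming the Testcontainers JDBC driver is on the test classpath):
@QuarkusTest
public class UserServiceTest {

    // No @Testcontainers/@Container plumbing needed here: with the jdbc:tc:mysql:/// URL,
    // org.testcontainers.jdbc.ContainerDatabaseDriver starts a MySQL container the first
    // time the datasource is used and disposes of it when the JVM exits.

    // snip... (test methods unchanged)
}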
Please read the documentation carefully. The port can be omitted.
https://www.testcontainers.org/modules/databases/jdbc/
Now (Quarkus version 19.03.12) it can be a bit simpler.
Define a test component that starts the container and overrides the JDBC properties:
import io.quarkus.test.common.QuarkusTestResourceLifecycleManager;
import org.testcontainers.containers.PostgreSQLContainer;

import java.util.Map;

public class PostgresDatabaseResource implements QuarkusTestResourceLifecycleManager {

    public static final PostgreSQLContainer<?> DATABASE = new PostgreSQLContainer<>("postgres:10.5")
            .withDatabaseName("test_db")
            .withUsername("test_user")
            .withPassword("test_password")
            .withExposedPorts(5432);

    @Override
    public Map<String, String> start() {
        DATABASE.start();
        return Map.of(
                "quarkus.datasource.jdbc.url", DATABASE.getJdbcUrl(),
                "quarkus.datasource.db-kind", "postgresql",
                "quarkus.datasource.username", DATABASE.getUsername(),
                "quarkus.datasource.password", DATABASE.getPassword());
    }

    @Override
    public void stop() {
        DATABASE.stop();
    }
}
Use it in the test:
import io.quarkus.test.common.QuarkusTestResource;
import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.Test;

import javax.ws.rs.core.MediaType;
import java.util.UUID;
import java.util.stream.Collectors;

import static io.restassured.RestAssured.given;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;

@QuarkusTest
@QuarkusTestResource(PostgresDatabaseResource.class)
public class MyControllerTest {

    @Test
    public void myAwesomeControllerTestWithDb() {
        // whatever you want to test here. Quarkus will use Container DB
        given().contentType(MediaType.APPLICATION_JSON).body(blaBla)
                .when().post("/create-some-stuff").then()
                .statusCode(200).and()
                .extract()
                .body()
                .as(YourBean.class);
    }
}

How to view the bytecode of Spring Framework generated proxy classes?

Is there an API in the Spring AOP/ASM libraries that lets us read the bytecode representation of a Spring-generated proxy class?
In my test code, I have access to the class file:
private static void printClassBytes(String classFilePath) throws Exception {
    TraceClassVisitor visitor = new TraceClassVisitor(new PrintWriter(System.out));
    ClassReader reader = new ClassReader(new FileInputStream(new File("/xxx/asm_source/asmtest-0.0.1-SNAPSHOT/com/test/asm/Application.class")));
    reader.accept(visitor, 0);
}
But in my application, the proxy class is generated by Spring Integration Gateway at runtime, and I only have an object reference to the proxy object. Is there some API in Spring or ASM that lets me find the bytecode of the corresponding proxy class using that object reference?
Something like:
private static void printClassBytes(Object obj) throws Exception {
I do not know if Spring has on-board means to do that, but AFAIK the ASM classes embedded in Spring do not include the TraceClassVisitor. So if you like its output, you have to use ASM (artifact asm-util) directly, which I assume you did for your sample code. If you want to stick with on-board means, you can just write a transformer which dumps the full byte code as a class file into a file and then look at the file with the JDK command line tool javap, e.g. via javap -c -p -v MyDumpedBytes.class.
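A rough sketch of such a dumping transformer could look like this (the target directory /tmp/dumped is an arbitrary choice for illustration):
import java.lang.instrument.ClassFileTransformer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.security.ProtectionDomain;

class DumpingClassFileTransformer implements ClassFileTransformer {

    @Override
    public byte[] transform(ClassLoader loader, String className, Class<?> classBeingRedefined,
            ProtectionDomain protectionDomain, byte[] classfileBuffer) {
        try {
            // className uses '/' as separator, e.g. "com/sun/proxy/$Proxy0"
            Path target = Paths.get("/tmp/dumped", className + ".class");
            Files.createDirectories(target.getParent());
            Files.write(target, classfileBuffer);
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Returning null means the class is not modified
        return null;
    }
}
The dumped file can then be inspected with javap as mentioned above.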
Anyway, an easy thing to do is to implement a Java agent and attach it to the Java command line starting the Spring project via -javaagent:/path/to/my-agent.jar. I found this article for you which explains how to implement a simple Java agent with manifest file etc. and also how to attach it to a running process via Java attach API. The article uses Javassist as an example for writing a transformer, but you can just use ASM instead.
Your Java agent + transformer would look something like this:
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.util.TraceClassVisitor;

import java.io.PrintWriter;
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.IllegalClassFormatException;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;

class TraceClassTransformer implements ClassFileTransformer {

    /**
     * Attach agent dynamically after JVM start-up
     */
    public static void agentmain(String commandLineOptions, Instrumentation instr) {
        premain(commandLineOptions, instr);
    }

    /**
     * Start agent via <code>-javaagent:/path/to/my-agent.jar=<i>options</i></code> JVM parameter
     */
    public static void premain(String commandLineOptions, Instrumentation instrumentation) {
        TraceClassTransformer transformer = new TraceClassTransformer();
        instrumentation.addTransformer(transformer, true);
    }

    @Override
    public byte[] transform(ClassLoader loader, String className, Class<?> classBeingRedefined, ProtectionDomain protectionDomain, byte[] classfileBuffer) throws IllegalClassFormatException {
        dumpClass(classfileBuffer);
        // Do not apply any transformation
        return null;
    }

    private void dumpClass(byte[] classfileBuffer) {
        TraceClassVisitor visitor = new TraceClassVisitor(new PrintWriter(System.out));
        ClassReader reader = new ClassReader(classfileBuffer);
        reader.accept(visitor, 0);
    }
}
Just make sure that the agent's manifest enables retransformation via Can-Retransform-Classes: true.
After the agent has started, you can just call instrumentation.retransformClasses(proxyInstance.getClass()); and then enjoy the log output.
In order to make it a bit simpler for this example, let us use byte-buddy-agent which contains a neat little tool set to attach transformers during runtime without the need to wrap them into Java agents. The artifact is small and does not contain the rest of ByteBuddy, just the agent tool.
That would simplify your class to (you can keep or drop the premain and agentmain methods, depending on whether you are planning to use the class as a Java agent or not):
import net.bytebuddy.agent.ByteBuddyAgent;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.util.TraceClassVisitor;

import java.io.Closeable;
import java.io.PrintWriter;
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.IllegalClassFormatException;
import java.lang.instrument.Instrumentation;
import java.lang.instrument.UnmodifiableClassException;
import java.lang.reflect.Proxy;
import java.security.ProtectionDomain;

class TraceClassTransformer implements ClassFileTransformer {

    public static void main(String[] args) throws UnmodifiableClassException {
        // Easy way to get an Instrumentation instance, so we can directly register our transformer on it
        Instrumentation instrumentation = ByteBuddyAgent.install();
        // I am just creating a Java dynamic proxy for a JRE interface. In your own application,
        // you would just get a reference to a dynamic proxy created by Spring.
        Object proxyInstance = Proxy.newProxyInstance(
                Closeable.class.getClassLoader(),
                new Class<?>[] { Closeable.class },
                (proxy, method, args1) -> null
        );
        // Register + use dummy ClassFileTransformer, then unregister again (optional)
        TraceClassTransformer transformer = new TraceClassTransformer();
        try {
            instrumentation.addTransformer(transformer, true);
            instrumentation.retransformClasses(proxyInstance.getClass());
        } finally {
            instrumentation.removeTransformer(transformer);
        }
    }

    @Override
    public byte[] transform(ClassLoader loader, String className, Class<?> classBeingRedefined, ProtectionDomain protectionDomain, byte[] classfileBuffer) throws IllegalClassFormatException {
        dumpClass(classfileBuffer);
        // Do not apply any transformation
        return null;
    }

    private void dumpClass(byte[] classfileBuffer) {
        TraceClassVisitor visitor = new TraceClassVisitor(new PrintWriter(System.out));
        ClassReader reader = new ClassReader(classfileBuffer);
        reader.accept(visitor, 0);
    }
}
When running the sample main class, you get a console log like:
// class version 58.0 (58)
// access flags 0x11
public final class com/sun/proxy/$Proxy0 extends java/lang/reflect/Proxy implements java/io/Closeable {
// access flags 0xA
private static Ljava/lang/reflect/Method; m0
// access flags 0xA
private static Ljava/lang/reflect/Method; m1
// access flags 0xA
private static Ljava/lang/reflect/Method; m2
// access flags 0xA
private static Ljava/lang/reflect/Method; m3
// access flags 0x1
public <init>(Ljava/lang/reflect/InvocationHandler;)V
ALOAD 0
ALOAD 1
INVOKESPECIAL java/lang/reflect/Proxy.<init> (Ljava/lang/reflect/InvocationHandler;)V
RETURN
MAXSTACK = 2
MAXLOCALS = 2
(...)

ApacheConnector does not process request headers that were set in a WriterInterceptor

I am experiencing problems when configuring my Jersey Client with the ApacheConnector. It seems to ignore all request headers that I define in a WriterInterceptor. I can tell that the WriterInterceptor is called when I set a breakpoint within WriterInterceptor#aroundWriteTo(WriterInterceptorContext). Contrary to that, I can observe that the modification of an InputStream is preserved.
Here is a runnable example demonstrating my problem:
public class ApacheConnectorProblemDemonstration extends JerseyTest {

    private static final Logger LOGGER = Logger.getLogger(JerseyTest.class.getName());
    private static final String QUESTION = "baz", ANSWER = "qux";
    private static final String REQUEST_HEADER_NAME_CLIENT = "foo-cl", REQUEST_HEADER_VALUE_CLIENT = "bar-cl";
    private static final String REQUEST_HEADER_NAME_INTERCEPTOR = "foo-ic", REQUEST_HEADER_VALUE_INTERCEPTOR = "bar-ic";
    private static final int MAX_CONNECTIONS = 100;
    private static final String PATH = "/";

    @Path(PATH)
    public static class TestResource {

        @POST
        public String handle(InputStream questionStream,
                @HeaderParam(REQUEST_HEADER_NAME_CLIENT) String client,
                @HeaderParam(REQUEST_HEADER_NAME_INTERCEPTOR) String interceptor)
                throws IOException {
            assertEquals(REQUEST_HEADER_VALUE_CLIENT, client);
            // Here, the header that was set in the client's writer interceptor is lost.
            assertEquals(REQUEST_HEADER_VALUE_INTERCEPTOR, interceptor);
            // However, the input stream got gzipped so the WriterInterceptor has been partly applied.
            assertEquals(QUESTION, new Scanner(new GZIPInputStream(questionStream)).nextLine());
            return ANSWER;
        }
    }

    @Provider
    @Priority(Priorities.ENTITY_CODER)
    public static class ClientInterceptor implements WriterInterceptor {

        @Override
        public void aroundWriteTo(WriterInterceptorContext context)
                throws IOException, WebApplicationException {
            context.getHeaders().add(REQUEST_HEADER_NAME_INTERCEPTOR, REQUEST_HEADER_VALUE_INTERCEPTOR);
            context.setOutputStream(new GZIPOutputStream(context.getOutputStream()));
            context.proceed();
        }
    }

    @Override
    protected Application configure() {
        enable(TestProperties.LOG_TRAFFIC);
        enable(TestProperties.DUMP_ENTITY);
        return new ResourceConfig(TestResource.class);
    }

    @Override
    protected Client getClient(TestContainer tc, ApplicationHandler applicationHandler) {
        ClientConfig clientConfig = tc.getClientConfig() == null ? new ClientConfig() : tc.getClientConfig();
        clientConfig.property(ApacheClientProperties.CONNECTION_MANAGER, makeConnectionManager(MAX_CONNECTIONS));
        clientConfig.register(ClientInterceptor.class);
        // If I do not use the Apache connector, I avoid this problem.
        clientConfig.connector(new ApacheConnector(clientConfig));
        if (isEnabled(TestProperties.LOG_TRAFFIC)) {
            clientConfig.register(new LoggingFilter(LOGGER, isEnabled(TestProperties.DUMP_ENTITY)));
        }
        configureClient(clientConfig);
        return ClientBuilder.newClient(clientConfig);
    }

    private static ClientConnectionManager makeConnectionManager(int maxConnections) {
        PoolingClientConnectionManager connectionManager = new PoolingClientConnectionManager();
        connectionManager.setMaxTotal(maxConnections);
        connectionManager.setDefaultMaxPerRoute(maxConnections);
        return connectionManager;
    }

    @Test
    public void testInterceptors() throws Exception {
        Response response = target(PATH)
                .request()
                .header(REQUEST_HEADER_NAME_CLIENT, REQUEST_HEADER_VALUE_CLIENT)
                .post(Entity.text(QUESTION));
        assertEquals(200, response.getStatus());
        assertEquals(ANSWER, response.readEntity(String.class));
    }
}
I want to use the ApacheConnector in order to optimize for concurrent requests via the PoolingClientConnectionManager. Did I mess up the configuration?
PS: The exact same problem occurs when using the GrizzlyConnector.
After further research, I assume that this is rather a misbehavior in the default Connector that uses a HttpURLConnection. As I explained in this other self-answered question of mine, the documentation states:
Whereas filters are primarily intended to manipulate request and response parameters like HTTP headers, URIs and/or HTTP methods, interceptors are intended to manipulate entities, via manipulating entity input/output streams
A WriterInterceptor is not supposed to manipulate the header values, while a {Client,Server}RequestFilter is not supposed to manipulate the entity stream. If you need to do both, both components should be bundled within a javax.ws.rs.core.Feature or within the same class that implements the two interfaces. (This can be problematic if you need to set two different @Priority values, though.)
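For illustration, here is a minimal sketch of bundling both concerns in a single provider (the class name and header constants are placeholders mirroring the test above): the header is added in the filter, while the entity stream is wrapped in the interceptor.
import java.io.IOException;
import java.util.zip.GZIPOutputStream;
import javax.annotation.Priority;
import javax.ws.rs.Priorities;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.client.ClientRequestContext;
import javax.ws.rs.client.ClientRequestFilter;
import javax.ws.rs.ext.Provider;
import javax.ws.rs.ext.WriterInterceptor;
import javax.ws.rs.ext.WriterInterceptorContext;

@Provider
@Priority(Priorities.ENTITY_CODER)
public class ClientHeaderAndGzipSupport implements ClientRequestFilter, WriterInterceptor {

    static final String HEADER_NAME = "foo-ic";   // REQUEST_HEADER_NAME_INTERCEPTOR in the test
    static final String HEADER_VALUE = "bar-ic";  // REQUEST_HEADER_VALUE_INTERCEPTOR in the test

    @Override
    public void filter(ClientRequestContext requestContext) {
        // Request headers belong in a filter, which every connector honours.
        requestContext.getHeaders().add(HEADER_NAME, HEADER_VALUE);
    }

    @Override
    public void aroundWriteTo(WriterInterceptorContext context) throws IOException, WebApplicationException {
        // Entity stream manipulation stays in the interceptor.
        context.setOutputStream(new GZIPOutputStream(context.getOutputStream()));
        context.proceed();
    }
}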
All this is very unfortunate though, since JerseyTest uses the connector based on HttpURLConnection, such that all my unit tests succeeded while the real-life application misbehaved because it was configured with an ApacheConnector. Also, rather than silently suppressing changes, I wish Jersey would throw some exceptions. (This is a general issue I have with Jersey. When I, for example, used a too new version of the ClientConnectionManager, where the interface was renamed to HttpClientConnectionManager, I was simply informed in a one-line log statement that all my configuration efforts were ignored. I did not discover this log statement until very late in development.)
