Use JSON deserializer for Batch job execution context - spring

I'm trying to get a list of job executions that have been stored in the Spring Batch tables in the database, using:
List<JobExecution> jobExecutions = jobExplorer.getJobExecutions(jobInstance);
The above call ends up invoking the ExecutionContextRowMapper.mapRow method of the JdbcExecutionContextDao class.
The ExecutionContextRowMapper uses the com.thoughtworks.xstream.XStream.fromXML method to deserialize the JSON string of the JobExecutionContext stored in the database.
It looks like an incorrect (or default) XML deserializer is being used to unmarshal the JSON-ified JobExecutionContext.
Is there any configuration to use a JSON deserializer in this scenario?

The serializer/deserializer for the ExecutionContext is configurable as of 2.2.x via the ExecutionContextSerializer interface (Spring Batch provides two implementations: one using Java serialization and one using the XStream-based serializer you mention). To configure your own serializer, implement org.springframework.batch.core.repository.ExecutionContextSerializer and inject it into both the JobRepositoryFactoryBean (so that contexts are serialized/deserialized correctly) and the JobExplorerFactoryBean (so that previously saved contexts can be read back).
It is important to note that changing the serialization method will prevent Spring Batch from deserializing ExecutionContexts that were saved in the old format.
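A minimal configuration sketch, assuming a hypothetical MyJsonSerializer class that implements ExecutionContextSerializer; both factory beans expose a setSerializer property for exactly this purpose:

import javax.sql.DataSource;
import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
import org.springframework.batch.core.repository.ExecutionContextSerializer;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchSerializerConfig {

    @Bean
    public ExecutionContextSerializer serializer() {
        return new MyJsonSerializer(); // hypothetical JSON-based implementation
    }

    @Bean
    public JobRepositoryFactoryBean jobRepository(DataSource dataSource,
            PlatformTransactionManager transactionManager) {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setSerializer(serializer()); // same serializer for writing contexts...
        return factory;
    }

    @Bean
    public JobExplorerFactoryBean jobExplorer(DataSource dataSource) {
        JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
        factory.setDataSource(dataSource);
        factory.setSerializer(serializer()); // ...and for reading them back
        return factory;
    }
}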

Related

How to read specific data from a message from Kafka with Spring-kafka @KafkaListener?

My task is to read events from multiple different topics (the class of the data in all topics is Event). This class contains a field data (a Map) which carries topic-specific data that can be deserialized into a specific class (e.g. DeviceCreateEvent). I can create consumers for each topic with @KafkaListener on methods with parameter type Event, but in that case I first have to call event.getData() and deserialize it into the specific class, so I end up with code duplication in all consumer methods. Is there any way for the annotated consumer method to receive the object already deserialized into the specific class?
It's not clear what you are asking.
If you have a different @KafkaListener for each topic/event type, and use JSON, the framework will automatically tell the message converter the type the data should be converted to; see the documentation:
Although the Serializer and Deserializer API is quite simple and flexible from the low-level Kafka Consumer and Producer perspective, you might need more flexibility at the Spring Messaging level, when using either @KafkaListener or Spring Integration. To let you easily convert to and from org.springframework.messaging.Message, Spring for Apache Kafka provides a MessageConverter abstraction with the MessagingMessageConverter implementation and its JsonMessageConverter (and subclasses) customization. You can inject the MessageConverter into a KafkaTemplate instance directly, or use the AbstractKafkaListenerContainerFactory bean definition for the @KafkaListener.containerFactory() property. The following example shows how to do so: ...
On the consumer side, you can configure a JsonMessageConverter; it can handle ConsumerRecord values of type byte[], Bytes and String, so it should be used in conjunction with a ByteArrayDeserializer, BytesDeserializer or StringDeserializer (byte[] and Bytes are more efficient because they avoid an unnecessary byte[]-to-String conversion). You can also configure the specific subclass of JsonMessageConverter corresponding to the deserializer, if you so wish.
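A minimal consumer-side sketch, assuming values arrive as JSON strings (i.e. a StringDeserializer) and a StringJsonMessageConverter is set on the container factory; the topic name device-events is made up for illustration, and DeviceCreateEvent is the specific class from the question:

import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.support.converter.StringJsonMessageConverter;

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
        ConsumerFactory<String, String> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory); // values deserialized as plain String
    factory.setMessageConverter(new StringJsonMessageConverter()); // parses the JSON payload
    return factory;
}

// In a listener bean: the target type is inferred from the method signature,
// so the payload arrives already deserialized to the specific class.
@KafkaListener(topics = "device-events", containerFactory = "kafkaListenerContainerFactory")
public void onDeviceCreate(DeviceCreateEvent event) {
    // no manual event.getData() mapping needed here
}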

Can MongoTemplate provide automatic translation?

I have a simple persistent POJO like:
public class Persistent {
    private String unsafe;
}
I use the Spring Data MongoTemplate to persist and fetch the above object. I also need to encrypt the Persistent.unsafe field and store a complex representation of it in the backend every time I save a Persistent object.
Can I annotate Persistent, or use some sort of hook, so that this translation happens automatically during mongoTemplate.insert, without having to do it manually in the POJO code?
Spring Data currently only supports type-based conversions. There is an issue for supporting property-based conversion, which you might want to track.
Therefore annotating won't work. What you could do is create a separate class for the property, which just wraps the String, and register a custom converter for that type. See http://docs.spring.io/spring-data/data-mongo/docs/1.10.4.RELEASE/reference/html/#mongo.custom-converters for details on how to do that.
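A sketch of that wrapper-type approach, assuming hypothetical EncryptedString and Crypto helper classes; the converters are registered through CustomConversions as described in the linked documentation:

import java.util.Arrays;
import org.springframework.context.annotation.Bean;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.CustomConversions;

// Wrapper for the sensitive value, used as the field type instead of String
class EncryptedString {
    private final String value;
    EncryptedString(String value) { this.value = value; }
    String getValue() { return value; }
}

@WritingConverter
class EncryptedStringToStringConverter implements Converter<EncryptedString, String> {
    public String convert(EncryptedString source) {
        return Crypto.encrypt(source.getValue()); // Crypto is a hypothetical helper
    }
}

@ReadingConverter
class StringToEncryptedStringConverter implements Converter<String, EncryptedString> {
    public EncryptedString convert(String source) {
        return new EncryptedString(Crypto.decrypt(source)); // hypothetical helper
    }
}

// Registered in a @Configuration class so MongoTemplate picks the converters up:
@Bean
public CustomConversions customConversions() {
    return new CustomConversions(Arrays.asList(
            new EncryptedStringToStringConverter(),
            new StringToEncryptedStringConverter()));
}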

Returning an object from a Spring Batch job / processor

I have a Spring Batch job that reads in a very large fixed-length file and maps it just fine to an object; all of the data is validated in the associated processing task.
Being rather new to Spring and Spring Batch, I am wondering whether it is possible to get a fully populated object out of the job, for a particular case where I am running the job as part of another process that needs access to the data.
I realize that I could do the above without Batch, which seems to be designed with scope limitations appropriate to its purpose.
I could serialize the objects in the processor and go that route, but for my immediate satisfaction I am hoping there is a way around this.
Thanks
In my @Configuration class for the batch processing, I created a class variable (a list of the objects I want to get back) and instantiated it with the no-arg constructor.
My Step, ItemReader, and LineMapper are set up to use a list for input. The custom FieldSetMapper takes that list, instantiated in the constructor, as a parameter and adds to it as the file is read and mapped. Similarly, my custom ItemProcessor takes the list as input and returns it.
Finally, I created a ReturnObjectList bean that returns the populated list.
In my main method I cast the result of AnnotationConfigApplicationContext.getBean to a list of that object type. I am now able to use the list of objects generated from the fixed-length file in the scope of my main application.
Not sure if this is a healthy workaround in terms of how Spring Java config is supposed to work, but it does give me what I need.
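A rough sketch of that workaround, with a hypothetical Record type and the reader/step wiring omitted:

import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FixedFileJobConfig {

    // Shared list, filled as the file is read and mapped
    private final List<Record> records = new ArrayList<>();

    @Bean
    public FieldSetMapper<Record> fieldSetMapper() {
        return fieldSet -> {
            Record record = new Record();
            record.setName(fieldSet.readString("name")); // hypothetical field
            records.add(record); // side effect: collect every mapped record
            return record;
        };
    }

    @Bean
    public List<Record> returnObjectList() {
        return records; // the populated list, retrievable from the context
    }
}

// In the calling application, after the job has run:
// List<Record> results = (List<Record>) context.getBean("returnObjectList");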

Missing Converter when using Spring LdapTemplate with Grails Validateable annotation

I'm using the Spring LDAP (docs) library in a Grails application. I have a class annotated with the @Entry annotation, so it is mapped to an LDAP server. This all works quite beautifully.
However, when I add the Grails @Validateable annotation (to enable validating the LDAP class similarly to Grails domain classes) and attempt to retrieve data from LDAP (i.e. a findAll operation on the LdapUserRepo, or similar), I get the following exception:
Message: Missing converter from class java.lang.String to interface org.springframework.validation.Errors, this is needed for field errors on Entry class com.ldap.portal.LdapUser
Basically, it seems that the AST transformation performed by the @Validateable annotation is producing extra fields (namely the errors field) on the LdapUser object. It appears that Spring LDAP, in processing the @Entry logic, assumes a default mapping for the added field (probably interpreting it as a string attribute on the LDAP entry). When it gets nothing back from the LDAP server, it attempts to set the field of type ValidationErrors to a value of type String: an empty string.
I did some looking on GitHub and found this code, which seems relevant and may support my theory.
My question is: is this behavior expected for annotations, and how can one prevent fields added by one annotation from being inappropriately processed by another annotation?
At present, the best workaround I've come up with for my specific issue is to add an errors field to my LdapUser object and mark it as transient (so that LDAP ignores it):
@Transient
ValidationErrors errors

jersey - How to use a Resource Method with multiple @FormParam of custom type

I use Jersey and have a method in my resource class which takes multiple parameters. These parameters are filled using @FormParam, but the problem is that the parameter types are custom Java types, not primitives or String. I want to convert the parameter values from JSON to the custom Java types. If I use @Consumes(MediaType.APPLICATION_JSON), then I cannot use multiple parameters, and if I remove it, the parameters cannot be converted from JSON to their Java instances.
@POST
@Path("/add")
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Produces(MediaType.APPLICATION_JSON)
public String add(@FormParam("source") BookEntity source, @FormParam("author") AuthorEntity a)
        throws JsonGenerationException, JsonMappingException, IOException, TransformationException
{
    ...
}
If I change the parameter types to String and then use Jackson deserialization, I can deserialize the JSON parameters to Java instances, but I want to do this for other methods too and have it happen automatically.
I tried the approach used in Custom Java type for consuming request parameters, but I cannot make it work.
You can use a custom type mapper.
See this answer.
Anyway, by default Jersey tries to map a received JSON object representation using JAXB. Obviously you must annotate your objects.
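One way to wire such a mapper, sketched under the assumption of Jersey 2.x's ParamConverter SPI with Jackson doing the JSON parsing (the package filter com.example.entity is made up; adjust it to where your entity types live):

import java.io.IOException;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import javax.ws.rs.ProcessingException;
import javax.ws.rs.ext.ParamConverter;
import javax.ws.rs.ext.ParamConverterProvider;
import javax.ws.rs.ext.Provider;
import com.fasterxml.jackson.databind.ObjectMapper;

@Provider
public class JsonParamConverterProvider implements ParamConverterProvider {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public <T> ParamConverter<T> getConverter(Class<T> rawType, Type genericType,
            Annotation[] annotations) {
        // Only handle our own entity types; return null so Jersey handles the rest
        if (!rawType.getName().startsWith("com.example.entity")) {
            return null;
        }
        return new ParamConverter<T>() {
            @Override
            public T fromString(String value) {
                try {
                    return mapper.readValue(value, rawType); // JSON form field -> entity
                } catch (IOException e) {
                    throw new ProcessingException(e);
                }
            }

            @Override
            public String toString(T value) {
                try {
                    return mapper.writeValueAsString(value);
                } catch (IOException e) {
                    throw new ProcessingException(e);
                }
            }
        };
    }
}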
