How to format date correctly using Spring Data Elasticsearch - elasticsearch

I'm using Spring Boot 2.2.5 with Elasticsearch 6.8.6. I'm in the process of migrating from Spring Data Jest to the Spring Data Elasticsearch REST transport mechanism with the ElasticsearchEntityMapper.
I have a Date field with the following definition:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
private Date date;
I would like the date stored in Elasticsearch like this:
"date": "2020-04-02T14:49:05.672+0000"
When I start the application, the index is created but when I try to save the entity I get the following exception:
Caused by: org.elasticsearch.client.ResponseException: method [POST], host [http://localhost:9200], URI [/trends/estrend?timeout=1m], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse field [date] of type [date] in document with id 'rS5UP3EB9eKtCTMXW_Ky'"}],"type":"mapper_parsing_exception","reason":"failed to parse field [date] of type [date] in document with id 'rS5UP3EB9eKtCTMXW_Ky'","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \"1585905425266\" is malformed at \"5266\""}},"status":400}
Any pointers on what I'm doing wrong and what I should do to fix it?
Configuration and entity definitions below:
@Configuration
public class ElasticsearchConfig extends AbstractElasticsearchConfiguration {

    @Value("${spring.data.elasticsearch.host}")
    private String elasticSearchHost;

    @Value("${spring.data.elasticsearch.port}")
    private String elasticSearchPort;

    @Bean
    public RestHighLevelClient elasticsearchClient() {
        final ClientConfiguration clientConfiguration = ClientConfiguration.builder()
                .connectedTo(elasticSearchHost + ":" + elasticSearchPort)
                .usingSsl()
                .build();
        return RestClients.create(clientConfiguration).rest();
    }

    @Bean
    public EntityMapper entityMapper() {
        ElasticsearchEntityMapper entityMapper = new ElasticsearchEntityMapper(
                elasticsearchMappingContext(), new DefaultConversionService());
        entityMapper.setConversions(elasticsearchCustomConversions());
        return entityMapper;
    }
}
package com.es.test;
import java.util.Date;
import java.util.UUID;
import com.fasterxml.jackson.annotation.JsonFormat;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
@Document(indexName = "trends")
public class EsTrend {

    @Id
    private UUID id;

    @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
    @Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSZ")
    private Date date;

    private String entityOrRelationshipId;

    // getters and setters
}
Update:
If I disable the ElasticsearchEntityMapper bean, I don't get the exception and the date is written in the correct format to Elasticsearch. Is there anything else I need to configure for the ElasticsearchEntityMapper?

First, please don't use the Jackson-based default mapper. It is removed in the next major version of Spring Data Elasticsearch (4.0); from then on there is no choice, and the ElasticsearchEntityMapper is used internally.
As to your problem: the ElasticsearchEntityMapper in version 3.2, which Spring Boot currently pulls in, does not use the date-relevant information from the @Field annotation when converting the entity; that information is only used to create the index mappings. This is why your Date is serialized as epoch milliseconds ("1585905425266"), which the custom date format in the mapping then rejects. This was a missing feature, or bug, and is fixed in the next major version, where the whole mapping process was overhauled.
What you can do in your current situation: You need to add custom converters. You can do this in your configuration class like this:
@Configuration
public class ElasticsearchConfig extends AbstractElasticsearchConfiguration {

    // Note: SimpleDateFormat is not thread-safe; see the follow-up answer
    // below for a thread-safe variant using ThreadLocal.
    private static SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ");

    @Value("${spring.data.elasticsearch.host}")
    private String elasticSearchHost;

    @Value("${spring.data.elasticsearch.port}")
    private String elasticSearchPort;

    @Bean
    public RestHighLevelClient elasticsearchClient() {
        final ClientConfiguration clientConfiguration = ClientConfiguration.builder()
                .connectedTo(elasticSearchHost + ":" + elasticSearchPort)
                .usingSsl()
                .build();
        return RestClients.create(clientConfiguration).rest();
    }

    @Bean
    public EntityMapper entityMapper() {
        ElasticsearchEntityMapper entityMapper = new ElasticsearchEntityMapper(
                elasticsearchMappingContext(), new DefaultConversionService());
        entityMapper.setConversions(elasticsearchCustomConversions());
        return entityMapper;
    }

    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(
                Arrays.asList(DateToStringConverter.INSTANCE, StringToDateConverter.INSTANCE));
    }

    @WritingConverter
    enum DateToStringConverter implements Converter<Date, String> {
        INSTANCE;

        @Override
        public String convert(Date date) {
            return formatter.format(date);
        }
    }

    @ReadingConverter
    enum StringToDateConverter implements Converter<String, Date> {
        INSTANCE;

        @Override
        public Date convert(String s) {
            try {
                return formatter.parse(s);
            } catch (ParseException e) {
                return null;
            }
        }
    }
}
You still need the date format in the @Field annotation though, because it is needed to create the correct index mappings.
And you should change your code to use the Java 8 time classes like LocalDate or LocalDateTime; Spring Data Elasticsearch supports these out of the box, whereas java.util.Date needs custom converters, as shown in the sketch below.
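A minimal sketch of the entity using a java.time type (an illustration only; it assumes the out-of-the-box support described above and uses the built-in Elasticsearch format date_hour_minute_second_millis instead of a custom pattern):
import java.time.LocalDateTime;
import java.util.UUID;

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;

@Document(indexName = "trends")
public class EsTrend {

    @Id
    private UUID id;

    // a built-in format, so no custom pattern or converter is required
    @Field(type = FieldType.Date, format = DateFormat.date_hour_minute_second_millis)
    private LocalDateTime date;

    // getters and setters
}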
Edit 09.04.2020: added the necessary @WritingConverter and @ReadingConverter annotations.
Edit 19.04.2020: Spring Data Elasticsearch 4.0 will support the java.util.Date class out of the box with the @Field annotation as well.

As a new user, I can't comment under @P.J.Meisch's answer because of the Stack Overflow rules.
I also faced this problem, and solved it with @P.J.Meisch's answer.
But I made one small change to the @ReadingConverter.
In fact, the raw type read from ES is Long, and the result type we need in Java is LocalDateTime. Thus, the reading converter should be Long to LocalDateTime.
The code follows below:
@Configuration
public class ElasticsearchClientConfig extends AbstractElasticsearchConfiguration {

    public final static int TIME_OUT_MILLIS = 50000;

    @Autowired
    private ElasticsearchProperties elasticsearchProperties;

    @Override
    @Bean
    public RestHighLevelClient elasticsearchClient() {
        final ClientConfiguration clientConfiguration = ClientConfiguration.builder()
                .connectedTo(elasticsearchProperties.getHost() + ":" + elasticsearchProperties.getPort())
                .withBasicAuth(elasticsearchProperties.getName(), elasticsearchProperties.getPassword())
                .withSocketTimeout(TIME_OUT_MILLIS)
                .withConnectTimeout(TIME_OUT_MILLIS)
                .build();
        return RestClients.create(clientConfiguration).rest();
    }

    /**
     * Java LocalDateTime to ElasticSearch Date mapping
     *
     * @return EntityMapper
     */
    @Override
    @Bean
    public EntityMapper entityMapper() {
        ElasticsearchEntityMapper entityMapper = new ElasticsearchEntityMapper(
                elasticsearchMappingContext(), new DefaultConversionService());
        entityMapper.setConversions(elasticsearchCustomConversions());
        return entityMapper;
    }

    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(
                Arrays.asList(DateToStringConverter.INSTANCE, LongToLocalDateTimeConverter.INSTANCE));
    }

    @WritingConverter
    enum DateToStringConverter implements Converter<Date, String> {
        /**
         * instance
         */
        INSTANCE;

        @Override
        public String convert(@NonNull Date date) {
            return DateUtil.format(date, DateConstant.TIME_PATTERN);
        }
    }

    @ReadingConverter
    enum LongToLocalDateTimeConverter implements Converter<Long, LocalDateTime> {
        /**
         * instance
         */
        INSTANCE;

        @Override
        public LocalDateTime convert(@NonNull Long s) {
            return LocalDateTime.ofInstant(Instant.ofEpochMilli(s), ZoneId.systemDefault());
        }
    }
}
and the DateUtil file:
public class DateUtil {

    /**
     * lock obj
     */
    private static final Object LOCK_OBJ = new Object();

    /**
     * sdf map for different patterns
     */
    private static final Map<String, ThreadLocal<SimpleDateFormat>> LOCAL_MAP = new HashMap<>();

    /**
     * thread safe
     *
     * @param pattern pattern
     * @return SimpleDateFormat
     */
    private static SimpleDateFormat getSdf(final String pattern) {
        ThreadLocal<SimpleDateFormat> tl = LOCAL_MAP.get(pattern);
        if (tl == null) {
            synchronized (LOCK_OBJ) {
                tl = LOCAL_MAP.get(pattern);
                if (tl == null) {
                    System.out.println("put new sdf of pattern " + pattern + " to map");
                    tl = ThreadLocal.withInitial(() -> {
                        System.out.println("thread: " + Thread.currentThread() + " init pattern: " + pattern);
                        return new SimpleDateFormat(pattern);
                    });
                    LOCAL_MAP.put(pattern, tl);
                }
            }
        }
        return tl.get();
    }

    /**
     * format
     *
     * @param date date
     * @param pattern pattern
     * @return String
     */
    public static String format(Date date, String pattern) {
        return getSdf(pattern).format(date);
    }
}
Lastly, please vote for @P.J.Meisch's answer, not this one.

Related

How do I parse snake case fields in a FeignClient response json?

I have configured a FeignClient in my spring boot webapp where I'm calling an external api that returns the following object.
public class Issue {

    private Assignee assignee;
    private Date createdAt;
    private Date updatedAt;
    private Date closedAt;
    private String description;
    private Date dueDate;

    public Assignee getAssignee() {
        return assignee;
    }

    public void setAssignee(Assignee assignee) {
        this.assignee = assignee;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(String description) {
        this.description = description;
    }

    public Date getDueDate() {
        return dueDate;
    }

    public void setDueDate(Date dueDate) {
        this.dueDate = dueDate;
    }

    public Date getUpdatedAt() {
        return updatedAt;
    }

    public void setUpdatedAt(Date updatedAt) {
        this.updatedAt = updatedAt;
    }

    public Date getClosedAt() {
        return closedAt;
    }

    public void setClosedAt(Date closedAt) {
        this.closedAt = closedAt;
    }

    @Override
    public String toString() {
        return (JacksonJson.toJsonString(this));
    }
}
The fields updatedAt, createdAt and closedAt are all snake case in the JSON (updated_at, created_at, closed_at), and all of these multi-word fields show up as null. Is there any way of configuring the FeignClient's Jackson parser so that it can process snake-case names? Note that I cannot change the default Jackson parser for my Spring Boot webapp, because I myself render JSON in camel case. I just need to configure this parser on the FeignClient that I'm using to connect to an external REST api.
I have verified that the json response returned from the api call contains valid values in each of these json fields.
Here's how I solved it. I created a custom JacksonParser as a Spring Bean.
@Configuration(proxyBeanMethods = false)
public class FeignClientDateFormatConfig {

    @Bean
    public Decoder feignDecoder() {
        HttpMessageConverter jacksonConverter = new MappingJackson2HttpMessageConverter(customObjectMapper());
        ObjectFactory<HttpMessageConverters> objectFactory = () -> new HttpMessageConverters(jacksonConverter);
        return new ResponseEntityDecoder(new SpringDecoder(objectFactory));
    }

    public ObjectMapper customObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        return objectMapper;
    }
}
This successfully parses all snake case properties.
Please note that this has a severe limitation. If you have multiple FeignClients and only one of them returns snake-case json, then you're out of luck. This overrides the default FeignClient config. The only workaround possible with this solution is to move your FeignClient calls into a separate microservice so other FeignClient calls are not affected.
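That said, depending on your Spring Cloud version, you may be able to scope the decoder to a single client instead: @FeignClient accepts a configuration class, and if that class is not annotated with @Configuration (and thus not picked up by component scanning), it applies only to that one client. A sketch under that assumption, with a hypothetical client name and URL property:
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategy;
import feign.codec.Decoder;
import org.springframework.beans.factory.ObjectFactory;
import org.springframework.boot.autoconfigure.http.HttpMessageConverters;
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.cloud.openfeign.support.ResponseEntityDecoder;
import org.springframework.cloud.openfeign.support.SpringDecoder;
import org.springframework.context.annotation.Bean;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;
import org.springframework.web.bind.annotation.GetMapping;

import java.util.List;

// Deliberately NOT annotated with @Configuration: registering it globally
// would override the decoder for every FeignClient in the application.
class SnakeCaseDecoderConfig {

    @Bean
    Decoder snakeCaseFeignDecoder() {
        ObjectMapper mapper = new ObjectMapper()
                .setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE)
                .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        HttpMessageConverter<?> jackson = new MappingJackson2HttpMessageConverter(mapper);
        ObjectFactory<HttpMessageConverters> converters = () -> new HttpMessageConverters(jackson);
        return new ResponseEntityDecoder(new SpringDecoder(converters));
    }
}

// Hypothetical client name and URL property; only this client gets the
// snake-case decoder, all other FeignClients keep the default one.
@FeignClient(name = "issue-api", url = "${issue.api.url}", configuration = SnakeCaseDecoderConfig.class)
interface IssueClient {

    @GetMapping("/issues")
    List<Issue> findIssues();
}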

CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE

I was able to reproduce my problem with a minimal modification of the official Spring Boot guide for Accessing Data with MongoDB, see https://github.com/thokrae/spring-data-mongo-zoneddatetime.
After adding a java.time.ZonedDateTime field to the Customer class, running the example code from the guide fails with a CodecConfigurationException:
Customer.java:
public String lastName;
public ZonedDateTime created;
public Customer() {
output:
...
Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63) ~[bson-3.6.4.jar:na]
at org.bson.codecs.configuration.ChildCodecRegistry.get(ChildCodecRegistry.java:51) ~[bson-3.6.4.jar:na]
This can be solved by changing the Spring Boot version from 2.0.5.RELEASE to 2.0.1.RELEASE in the pom.xml:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.1.RELEASE</version>
</parent>
Now the exception is gone and the Customer objects including the ZonedDateTime fields are written to MongoDB.
I filed a bug (DATAMONGO-2106) with the spring-data-mongodb project but would understand if changing this behaviour is not wanted nor has a high priority.
What is the best workaround? When duckduckgoing for the exception message I find several approaches like registering a custom codec, a custom converter or using Jackson JSR 310. I would prefer to not add custom code to my project to handle a class from the java.time package.
Persisting date time types with time zones was never supported by Spring Data MongoDB, as stated by Oliver Drotbohm himself in DATAMONGO-2106.
These are the known workarounds:
Use a date time type without a time zone, e.g. java.time.Instant. (It is generally advisable to only use UTC in the backend, but I had to extend an existing code base which was following a different approach.)
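For that first option, a minimal sketch (hypothetical entity; the zone is re-attached only when the value is presented):
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class Customer {

    public String lastName;

    // Instant has no time zone, so Spring Data MongoDB can persist it
    // natively as a BSON date, without custom converters or codecs.
    public Instant created;

    // re-attach a zone only when presenting the value
    public ZonedDateTime createdIn(ZoneId zone) {
        return created.atZone(zone);
    }
}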
Write a custom converter and register it by extending AbstractMongoConfiguration. See the branch converter in my test repository for a running example.
@Component
@WritingConverter
public class ZonedDateTimeToDocumentConverter implements Converter<ZonedDateTime, Document> {

    static final String DATE_TIME = "dateTime";
    static final String ZONE = "zone";

    @Override
    public Document convert(@Nullable ZonedDateTime zonedDateTime) {
        if (zonedDateTime == null) return null;
        Document document = new Document();
        document.put(DATE_TIME, Date.from(zonedDateTime.toInstant()));
        document.put(ZONE, zonedDateTime.getZone().getId());
        document.put("offset", zonedDateTime.getOffset().toString());
        return document;
    }
}

@Component
@ReadingConverter
public class DocumentToZonedDateTimeConverter implements Converter<Document, ZonedDateTime> {

    @Override
    public ZonedDateTime convert(@Nullable Document document) {
        if (document == null) return null;
        // the field-name constants live in the writing converter, so qualify them here
        Date dateTime = document.getDate(ZonedDateTimeToDocumentConverter.DATE_TIME);
        String zoneId = document.getString(ZonedDateTimeToDocumentConverter.ZONE);
        ZoneId zone = ZoneId.of(zoneId);
        return ZonedDateTime.ofInstant(dateTime.toInstant(), zone);
    }
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host, port);
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }

    @Bean
    public CustomConversions customConversions() {
        return new MongoCustomConversions(asList(
                new ZonedDateTimeToDocumentConverter(),
                new DocumentToZonedDateTimeConverter()
        ));
    }
}
Write a custom codec. At least in theory. My codec test branch is unable to unmarshal the data when using Spring Boot 2.0.5 while working fine with Spring Boot 2.0.1.
public class ZonedDateTimeCodec implements Codec<ZonedDateTime> {

    public static final String DATE_TIME = "dateTime";
    public static final String ZONE = "zone";

    @Override
    public void encode(final BsonWriter writer, final ZonedDateTime value, final EncoderContext encoderContext) {
        writer.writeStartDocument();
        // write full millisecond precision (getEpochSecond() * 1_000 would truncate sub-second precision)
        writer.writeDateTime(DATE_TIME, value.toInstant().toEpochMilli());
        writer.writeString(ZONE, value.getZone().getId());
        writer.writeEndDocument();
    }

    @Override
    public ZonedDateTime decode(final BsonReader reader, final DecoderContext decoderContext) {
        reader.readStartDocument();
        // readDateTime returns milliseconds since the epoch
        long epochMillis = reader.readDateTime(DATE_TIME);
        String zoneId = reader.readString(ZONE);
        reader.readEndDocument();
        return ZonedDateTime.ofInstant(Instant.ofEpochMilli(epochMillis), ZoneId.of(zoneId));
    }

    @Override
    public Class<ZonedDateTime> getEncoderClass() {
        return ZonedDateTime.class;
    }
}
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private int port;

    @Override
    public MongoClient mongoClient() {
        return new MongoClient(host + ":" + port, createOptions());
    }

    private MongoClientOptions createOptions() {
        CodecProvider pojoCodecProvider = PojoCodecProvider.builder()
                .automatic(true)
                .build();
        CodecRegistry registry = CodecRegistries.fromRegistries(
                createCustomCodecRegistry(),
                MongoClient.getDefaultCodecRegistry(),
                CodecRegistries.fromProviders(pojoCodecProvider)
        );
        return MongoClientOptions.builder()
                .codecRegistry(registry)
                .build();
    }

    private CodecRegistry createCustomCodecRegistry() {
        return CodecRegistries.fromCodecs(
                new ZonedDateTimeCodec()
        );
    }

    @Override
    protected String getDatabaseName() {
        return database;
    }
}

Couchbase 5 bucket password setting

I am trying to write a sample in order to learn Couchbase, using it with Spring Boot and its CRUD repositories.
I have downloaded the latest Docker image, but the problem is: I could not find the password of the bucket. The Couchbase console only allows user creation, but in Spring there is no equivalent of this usage with a username/password; it only allows bucketName and password, which does not seem compatible with Couchbase 5.
Am I missing anything here, or is Spring not compatible with Couchbase 5? If Spring is not compatible, which version of Couchbase is OK?
Thanks
Spring Data Couchbase is compatible with Couchbase Server 5.0. You can achieve the same auth as 4.x by creating a user with the same name as the bucket, then just use that bucket name and password from Spring Data if it's prior to 3.0/Kay.
The docs should cover this and if there's anything confusing there, please click the "feedback" button and offer what could be improved!
https://developer.couchbase.com/documentation/server/5.0/security/security-authorization.html
https://developer.couchbase.com/documentation/server/5.0/security/concepts-rba-for-apps.html
https://developer.couchbase.com/documentation/server/5.0/security/security-resources-under-access-control.html
I faced the same issue. I started debugging by stepping into AbstractCouchbaseConfiguration, and there I found:
public abstract class AbstractCouchbaseConfiguration
        extends AbstractCouchbaseDataConfiguration implements CouchbaseConfigurer {

    ....//some other configuration

    @Override
    @Bean(name = BeanNames.COUCHBASE_CLUSTER_INFO)
    public ClusterInfo couchbaseClusterInfo() throws Exception {
        return couchbaseCluster().clusterManager(getBucketName(), getBucketPassword()).info();
    }
What I did was create a bucket with the same name as my Couchbase user:
couchbase username: userdetail
couchbase password: ******
bucket name: userdetail
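In Spring Boot property terms this maps to something like the following (a sketch; check the property names against your Spring Boot version):
spring.couchbase.bootstrap-hosts=localhost
spring.couchbase.bucket.name=userdetail
spring.couchbase.bucket.password=******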
The Couchbase driver supports connecting to Couchbase 5 buckets using username/password. The problem is that spring-data-couchbase is not developed fast enough to cover all the new features Couchbase introduces, so we need to help Spring use the new bucket connection. We do that by overriding the cluster instantiation method of the spring-data-couchbase configuration base class, org.springframework.data.couchbase.config.AbstractCouchbaseConfiguration. This is the method we are looking at:
@Override
@Bean(name = BeanNames.COUCHBASE_CLUSTER_INFO)
public ClusterInfo couchbaseClusterInfo() throws Exception {
    return couchbaseCluster().clusterManager(getBucketName(), getBucketPassword()).info();
}
As we can see, it does not use the username, just the bucket name and password, so in our configuration we override it as follows:
@Override
@Bean(name = BeanNames.COUCHBASE_CLUSTER_INFO)
public ClusterInfo couchbaseClusterInfo() throws Exception {
    return couchbaseCluster().authenticate(couchbaseUsername, couchbasePassword).clusterManager().info();
}
That's it. Here is the full code of my spring-data-couchbase configuration:
import com.couchbase.client.java.Bucket;
import com.couchbase.client.java.cluster.ClusterInfo;
import com.couchbase.client.java.env.CouchbaseEnvironment;
import com.couchbase.client.java.env.DefaultCouchbaseEnvironment;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.data.couchbase.config.AbstractCouchbaseConfiguration;
import org.springframework.data.couchbase.config.BeanNames;
import org.springframework.data.couchbase.repository.config.EnableCouchbaseRepositories;

import javax.inject.Inject;
import java.security.KeyStore;
import java.util.List;

/**
 * @author by avoinovan
 */
@Configuration
@EnableCouchbaseRepositories
public class ModelConfig extends AbstractCouchbaseConfiguration {

    private final static int DEFAULT_HTTP_PORT = 8091;
    private final static int DEFAULT_HTTP_SSL_PORT = 18091;
    private final static int DEFAULT_CARRIER_PORT = 11210;
    private final static int DEFAULT_CARRIER_SSL_PORT = 11207;
    private final static long DEFAULT_KEEP_ALIVE_INTERVAL = 30000;
    private final static int DEFAULT_SOCKET_CONNECT_TIMEOUT_MS = 5000;
    private final static long DEFAULT_CONNECT_TIMEOUT_MS = 5000;
    private final static long DEFAULT_MANAGEMENT_TIMEOUT_MS = 75000;
    private final static long DEFAULT_DISCONNECT_TIMEOUT_MS = 25000;

    private final static String PROPERTY_KEEP_ALIVE_INTERVAL_MS = "couchbase.keep_alive_interval_ms";
    private final static String PROPERTY_SOCKET_CONNECT_TIMEOUT_MS = "couchbase.socket_connect_timeout_ms";
    private final static String PROPERTY_CONNECT_TIMEOUT_MS = "couchbase.connect_timeout_ms";
    private final static String PROPERTY_MANAGEMENT_TIMEOUT_MS = "couchbase.management_timeout_ms";
    private final static String PROPERTY_DISCONNECT_TIMEOUT_MS = "couchbase.disconnect_timeout_ms";
    private final static String PROPERTY_SSL_ENABLED = "couchbase.ssl.enabled";
    private final static String PROPERTY_SSL_KEYSTORE_FILE = "couchbase.ssl.keystore.file";
    private final static String PROPERTY_SSL_KEYSTORE_PASSWORD = "couchbase.ssl.keystore.password";
    private final static String PROPERTY_SSL_TRUSTSTORE_FILE = "couchbase.ssl.truststore.file";
    private final static String PROPERTY_SSL_TRUSTSTORE_PASSWORD = "couchbase.ssl.truststore.password";
    private final static String PROPERTY_BOOTSTRAP_HTTP_ENABLED = "couchbase.bootstrap.http.enabled";
    private final static String PROPERTY_BOOTSTRAP_HTTP_PORT = "couchbase.bootstrap.http.port";
    private final static String PROPERTY_BOOTSTRAP_HTTP_SSL_PORT = "couchbase.bootstrap.http.ssl.port";
    private final static String PROPERTY_BOOTSTRAP_CARRIER_ENABLED = "couchbase.bootstrap.carrier.enabled";
    private final static String PROPERTY_BOOTSTRAP_CARRIER_PORT = "couchbase.bootstrap.carrier.port";
    private final static String PROPERTY_BOOTSTRAP_CARRIER_SSL_PORT = "couchbase.bootstrap.carrier.ssl.port";

    @Value("#{'${spring.couchbase.bootstrap-hosts}'.split(',')}")
    private List<String> couchbaseBootstrapHosts;

    @Value("${spring.couchbase.bucket.name}")
    private String bucketName;

    @Value("${spring.couchbase.password}")
    private String couchbasePassword;

    @Value("${spring.couchbase.username}")
    private String couchbaseUsername;

    private final Environment environment;
    private final ResourceLoader resourceLoader;

    @Inject
    public ModelConfig(final Environment environment,
                       final ResourceLoader resourceLoader) {
        this.environment = environment;
        this.resourceLoader = resourceLoader;
    }

    protected List<String> getBootstrapHosts() {
        return couchbaseBootstrapHosts;
    }

    protected String getBucketName() {
        return bucketName;
    }

    protected String getBucketPassword() {
        return couchbasePassword;
    }

    protected CouchbaseEnvironment getEnvironment() {
        return DefaultCouchbaseEnvironment.builder()
                .keepAliveInterval(environment.getProperty(PROPERTY_KEEP_ALIVE_INTERVAL_MS,
                        Long.class, DEFAULT_KEEP_ALIVE_INTERVAL))
                // timeout settings
                .socketConnectTimeout(environment.getProperty(PROPERTY_SOCKET_CONNECT_TIMEOUT_MS,
                        Integer.class, DEFAULT_SOCKET_CONNECT_TIMEOUT_MS))
                .connectTimeout(environment.getProperty(PROPERTY_CONNECT_TIMEOUT_MS,
                        Long.class, DEFAULT_CONNECT_TIMEOUT_MS))
                .managementTimeout(environment.getProperty(PROPERTY_MANAGEMENT_TIMEOUT_MS,
                        Long.class, DEFAULT_MANAGEMENT_TIMEOUT_MS))
                .disconnectTimeout(environment.getProperty(PROPERTY_DISCONNECT_TIMEOUT_MS,
                        Long.class, DEFAULT_DISCONNECT_TIMEOUT_MS))
                // port and ssl
                .sslEnabled(environment.getProperty(PROPERTY_SSL_ENABLED, Boolean.class, false))
                .bootstrapHttpEnabled(environment.getProperty(PROPERTY_BOOTSTRAP_HTTP_ENABLED,
                        Boolean.class, Boolean.TRUE))
                .bootstrapHttpDirectPort(environment.getProperty(PROPERTY_BOOTSTRAP_HTTP_PORT,
                        Integer.class, DEFAULT_HTTP_PORT))
                .bootstrapHttpSslPort(environment.getProperty(PROPERTY_BOOTSTRAP_HTTP_SSL_PORT,
                        Integer.class, DEFAULT_HTTP_SSL_PORT))
                .bootstrapCarrierEnabled(environment.getProperty(PROPERTY_BOOTSTRAP_CARRIER_ENABLED,
                        Boolean.class, Boolean.TRUE))
                .bootstrapCarrierDirectPort(environment.getProperty(PROPERTY_BOOTSTRAP_CARRIER_PORT,
                        Integer.class, DEFAULT_CARRIER_PORT))
                .bootstrapCarrierSslPort(environment.getProperty(PROPERTY_BOOTSTRAP_CARRIER_SSL_PORT,
                        Integer.class, DEFAULT_CARRIER_SSL_PORT))
                // keystore and trust store
                .sslKeystore(createKeyStore(environment, resourceLoader))
                .sslTruststore(createTrustStore(environment, resourceLoader))
                .build();
    }

    @Override
    @Bean(name = BeanNames.COUCHBASE_CLUSTER_INFO)
    public ClusterInfo couchbaseClusterInfo() throws Exception {
        return couchbaseCluster().authenticate(couchbaseUsername, couchbasePassword).clusterManager().info();
    }

    /**
     * Return the {@link Bucket} instance to connect to.
     *
     * @throws Exception on Bean construction failure.
     */
    @Override
    @Bean(destroyMethod = "close", name = BeanNames.COUCHBASE_BUCKET)
    public Bucket couchbaseClient() throws Exception {
        // a @Bean method can use another @Bean method in the same @Configuration by directly invoking it
        return couchbaseCluster().openBucket(getBucketName());
    }

    private KeyStore createKeyStore(final Environment environment, final ResourceLoader resourceLoader) {
        return loadKeyStore(environment, resourceLoader, PROPERTY_SSL_KEYSTORE_FILE, PROPERTY_SSL_KEYSTORE_PASSWORD);
    }

    private KeyStore createTrustStore(final Environment environment, final ResourceLoader resourceLoader) {
        return loadKeyStore(environment, resourceLoader, PROPERTY_SSL_TRUSTSTORE_FILE, PROPERTY_SSL_TRUSTSTORE_PASSWORD);
    }

    private KeyStore loadKeyStore(final Environment environment,
                                  final ResourceLoader resourceLoader,
                                  final String fileProperty,
                                  final String passwordProperty) {
        String file = environment.getProperty(fileProperty);
        String password = environment.getProperty(passwordProperty);
        if (file != null) {
            Resource resource = resourceLoader.getResource(file);
            if (resource != null) {
                try {
                    KeyStore keyStore = KeyStore.getInstance(KeyStore.getDefaultType());
                    keyStore.load(resource.getInputStream(), password == null ? null : password.toCharArray());
                    return keyStore;
                } catch (final Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }
        return null;
    }
}

Process string templates with thymeleaf 3

Can we use a StringTemplateResolver to populate a string template with an IContext? If so, how can we do it?
TemplateProcessingParameters and IResourceResolver are removed in Thymeleaf 3. Any working example would greatly help.
I have followed this example, and it works great in Thymeleaf 2:
Is there a way to make Spring Thymeleaf process a string template?
I didn't see any reference in the migration guide either.
I think I found a solution. If anybody has a better answer, please let me know.
I had made a small mistake earlier. Hope this helps.
private TemplateEngine templateEngine;

private TemplateEngine getTemplateEngine() {
    if (null == templateEngine) {
        templateEngine = new TemplateEngine();
        StringTemplateResolver templateResolver = new StringTemplateResolver();
        templateResolver.setTemplateMode(TemplateMode.HTML);
        templateEngine.setTemplateResolver(templateResolver);
    }
    return templateEngine;
}

public String getTemplateFromMap(String htmlContent, Map<String, String> dynamicAttributesMap) {
    templateEngine = getTemplateEngine();
    String template = null;
    final Context ctx = new Context(new Locale(TEMPLATE_LOCAL));
    // fixed: the original checked a non-existent variable (emailAttibutesMap)
    if (!CollectionUtils.isEmpty(dynamicAttributesMap)) {
        dynamicAttributesMap.forEach((k, v) -> ctx.setVariable(k, v));
    }
    if (null != templateEngine) {
        template = templateEngine.process(htmlContent, ctx);
    }
    return template;
}
This is how we did it, as a Spring @Service Bean:
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.springframework.stereotype.Service;
import org.thymeleaf.TemplateProcessingParameters;
import org.thymeleaf.context.IContext;
import org.thymeleaf.messageresolver.IMessageResolver;
import org.thymeleaf.messageresolver.StandardMessageResolver;
import org.thymeleaf.resourceresolver.IResourceResolver;
import org.thymeleaf.spring4.SpringTemplateEngine;
import org.thymeleaf.templatemode.StandardTemplateModeHandlers;
import org.thymeleaf.templateresolver.ITemplateResolutionValidity;
import org.thymeleaf.templateresolver.ITemplateResolver;
import org.thymeleaf.templateresolver.NonCacheableTemplateResolutionValidity;
import org.thymeleaf.templateresolver.TemplateResolution;
import org.thymeleaf.util.Validate;

/**
 * Ref: https://github.com/thymeleaf/thymeleaf-itutorial/blob/2.1-master/src/test/java/org/thymeleaf/tools/memoryexecutor/StaticTemplateExecutorTest.java
 * @author anandchakru
 */
@Service
public class StaticTemplateService {

    public String processTemplateCode(final String code, final IContext context) {
        Validate.notNull(code, "Code must be non-null");
        Validate.notNull(context, "Context must be non-null");
        String templateMode = StandardTemplateModeHandlers.HTML5.getTemplateModeName();
        IMessageResolver messageResolver = new StandardMessageResolver();
        ITemplateResolver templateResolver = new MemoryTemplateResolver(code, templateMode);
        SpringTemplateEngine templateEngine = new SpringTemplateEngine();
        templateEngine.setMessageResolver(messageResolver);
        templateEngine.setTemplateResolver(templateResolver);
        templateEngine.initialize();
        return templateEngine.process("dummy", context);
    }
}
class FixedMemoryResourceResolver implements IResourceResolver {

    private static final String NAME = "FixedMemoryResourceResolver";

    private final String templateContent;

    public FixedMemoryResourceResolver(final String templateContent) {
        Validate.notNull(templateContent, "Template content must be non-null");
        this.templateContent = templateContent;
    }

    @Override
    public String getName() {
        return NAME;
    }

    @Override
    public InputStream getResourceAsStream(final TemplateProcessingParameters tpp, final String templateName) {
        return new ByteArrayInputStream(templateContent.getBytes());
    }
}

class MemoryTemplateResolver implements ITemplateResolver {

    private static final String NAME = "MemoryTemplateResolver";
    private static final Integer ORDER = 1;

    private final String templateContent;
    private final String templateMode;

    public MemoryTemplateResolver(final String templateContent, final String templateMode) {
        Validate.notNull(templateContent, "Template content must be non-null");
        Validate.notNull(templateMode, "Template mode must be non-null");
        this.templateContent = templateContent;
        this.templateMode = templateMode;
    }

    @Override
    public void initialize() {
    }

    @Override
    public String getName() {
        return NAME;
    }

    @Override
    public Integer getOrder() {
        return ORDER;
    }

    @Override
    public TemplateResolution resolveTemplate(final TemplateProcessingParameters tpp) {
        String templateName = "CustomTemplate";
        String resourceName = "CustomResource";
        IResourceResolver resourceResolver = new FixedMemoryResourceResolver(templateContent);
        ITemplateResolutionValidity validity = new NonCacheableTemplateResolutionValidity();
        return new TemplateResolution(templateName, resourceName, resourceResolver, StandardCharsets.UTF_8.toString(),
                templateMode, validity);
    }
}
and call it like this:
@Autowired
protected StaticTemplateService staticTemplateService;
...
private String getProcessedHtml() {
    Context context2 = new Context();
    context2.setVariable("greet", "Hello");
    // fixed: the inner quotes need escaping for this to compile
    return staticTemplateService.processTemplateCode("<div th:text=\"${greet}\">Hi</div> World", context2);
}
With the latest versions of Spring 5 and Thymeleaf, it is easy to process a template from a string.
If you are using Gradle, use the dependencies below:
compile "org.thymeleaf:thymeleaf:3.0.11.RELEASE"
compile "org.thymeleaf:thymeleaf-spring5:3.0.11.RELEASE"
//Code sample starts here
private TemplateEngine templateEngine;

private final static String TEMPLATE_LOCAL = "US";

public TemplateEngine getTemplateEngine() {
    templateEngine = new TemplateEngine();
    StringTemplateResolver stringTemplateResolver = new StringTemplateResolver();
    templateEngine.setTemplateResolver(stringTemplateResolver);
    return templateEngine;
}

public String getTemplateFromAttributes(String htmlContent, Map<String, Object> attr) {
    templateEngine = getTemplateEngine();
    Context context = new Context(new Locale(TEMPLATE_LOCAL));
    if (!CollectionUtils.isEmpty(attr)) {
        attr.forEach((k, v) -> context.setVariable(k, v));
    }
    return templateEngine.process(htmlContent, context);
}
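A hypothetical call, to show the input and output shapes:
Map<String, Object> attr = new HashMap<>();
attr.put("greet", "Hello");
// processes the inline template and returns: <span>Hello</span>
String html = getTemplateFromAttributes("<span th:text=\"${greet}\"></span>", attr);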
Hope this is a useful snippet

How to customize the @FeignClient Expander to convert params?

Feign's default expander for converting params:
final class ToStringExpander implements Expander {

    @Override
    public String expand(Object value) {
        return value.toString();
    }
}
I want to customize it so that a user object can be converted into GET params, like this:
@FeignClient("xx")
interface UserService {

    @RequestMapping(value = "/users", method = GET)
    public List<User> findBy(@ModelAttribute User user);
}

userService.findBy(user);
What can I do?
First, you must write an expander like ToJsonExpander:
public class ToJsonExpander implements Param.Expander {

    private static ObjectMapper objectMapper = new ObjectMapper();

    public String expand(Object value) {
        try {
            return objectMapper.writeValueAsString(value);
        } catch (JsonProcessingException e) {
            throw new ExpanderException(e);
        }
    }
}
Second, write an AnnotatedParameterProcessor like JsonArgumentParameterProcessor to register the expander for your parameters:
public class JsonArgumentParameterProcessor implements AnnotatedParameterProcessor {

    private static final Class<JsonArgument> ANNOTATION = JsonArgument.class;

    public Class<? extends Annotation> getAnnotationType() {
        return ANNOTATION;
    }

    public boolean processArgument(AnnotatedParameterContext context, Annotation annotation) {
        MethodMetadata data = context.getMethodMetadata();
        String name = ANNOTATION.cast(annotation).value();
        String method = data.template().method();
        Util.checkState(Util.emptyToNull(name) != null,
                "JsonArgument.value() was empty on parameter %s", context.getParameterIndex());
        context.setParameterName(name);
        if (method != null && (HttpMethod.POST.matches(method) || HttpMethod.PUT.matches(method) || HttpMethod.DELETE.matches(method))) {
            data.formParams().add(name);
        } else {
            data.indexToExpanderClass().put(context.getParameterIndex(), ToJsonExpander.class);
            Collection<String> query = context.setTemplateParameter(name, data.template().queries().get(name));
            data.template().query(name, query);
        }
        return true;
    }
}
Third, add it to the Feign configuration:
@Bean
public Contract feignContract() {
    List<AnnotatedParameterProcessor> processors = new ArrayList<>();
    processors.add(new JsonArgumentParameterProcessor());
    processors.add(new PathVariableParameterProcessor());
    processors.add(new RequestHeaderParameterProcessor());
    processors.add(new RequestParamParameterProcessor());
    return new SpringMvcContract(processors);
}
Now you can use @JsonArgument to send a model argument, like:
public void saveV10(@JsonArgument("session") Session session);
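The @JsonArgument annotation itself is not shown above; a minimal declaration consistent with how the processor uses it (it reads value() as the parameter name) would look like this:
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical declaration: marks a Feign method parameter whose value
// should be serialized to JSON by ToJsonExpander; value() is the param name.
@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
public @interface JsonArgument {
    String value();
}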
I don't know what @ModelAttribute does, but I was looking for a way to convert @RequestParam values, so I did this:
import com.google.i18n.phonenumbers.PhoneNumberUtil;
import com.google.i18n.phonenumbers.Phonenumber;
import org.springframework.cloud.netflix.feign.FeignFormatterRegistrar;
import org.springframework.format.FormatterRegistry;
import org.springframework.stereotype.Component;

import static com.google.i18n.phonenumbers.PhoneNumberUtil.PhoneNumberFormat.E164;

@Component
public class PhoneNumberFeignFormatterRegistrar implements FeignFormatterRegistrar {

    private final PhoneNumberUtil phoneNumberUtil;

    public PhoneNumberFeignFormatterRegistrar(PhoneNumberUtil phoneNumberUtil) {
        this.phoneNumberUtil = phoneNumberUtil;
    }

    @Override
    public void registerFormatters(FormatterRegistry registry) {
        registry.addConverter(Phonenumber.PhoneNumber.class, String.class, source -> phoneNumberUtil.format(source, E164));
    }
}
Now stuff like the following works
import com.google.i18n.phonenumbers.Phonenumber;
import org.springframework.cloud.netflix.feign.FeignClient;
import org.springframework.hateoas.Resource;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;

@FeignClient("data-service")
public interface DataClient {

    @RequestMapping(method = RequestMethod.GET, value = "/phoneNumbers/search/findByPhoneNumber")
    Resource<PhoneNumberRecord> getPhoneNumber(@RequestParam("phoneNumber") Phonenumber.PhoneNumber phoneNumber);
}
As the OpenFeign issue and the Spring Cloud doc say:
The OpenFeign @QueryMap annotation provides support for POJOs to be used as GET parameter maps.
Spring Cloud OpenFeign provides an equivalent @SpringQueryMap annotation, which has been usable to annotate a POJO or Map parameter as a query parameter map since 2.1.0.
You can use it like this:
@GetMapping("user")
String getUser(@SpringQueryMap User user);

public class User {
    private String name;
    private int age;
    ...
}
