How to convert Properties to PropertySources? - spring

In Spring Boot 1.4.2.RELEASE, PropertiesConfigurationFactory has a method setProperties(Properties properties).
So my code could be written as follows:
public Map<String, CacheProperties> resolve() {
    Map<String, Object> cacheSettings = resolveCacheSettings();
    Set<String> beanNames = resolveCacheManagerBeanNames(cacheSettings);
    Map<String, CacheProperties> cachePropertiesMap = new HashMap<>(beanNames.size());
    MutablePropertySources propertySources = new MutablePropertySources();
    propertySources.addFirst(new MapPropertySource("cache", cacheSettings));
    beanNames.forEach(beanName -> {
        CacheProperties cacheProperties = new CacheProperties();
        PropertiesConfigurationFactory<CacheProperties> factory = new PropertiesConfigurationFactory<>(cacheProperties);
        Properties properties = new Properties();
        properties.putAll(PropertySourceUtils.getSubProperties(propertySources, beanName));
        factory.setProperties(properties);
        factory.setConversionService(environment.getConversionService());
        try {
            factory.bindPropertiesToTarget();
            cachePropertiesMap.put(beanName, cacheProperties);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    });
    return cachePropertiesMap;
}
But when I upgraded to Spring Boot 1.5.8.RELEASE, PropertiesConfigurationFactory had changed the method setProperties(Properties properties) to setPropertySources(PropertySources propertySources).
So I changed my code like this:
public Map<String, CacheProperties> resolve() {
    Map<String, Object> cacheSettings = resolveCacheSettings();
    Set<String> beanNames = resolveCacheManagerBeanNames(cacheSettings);
    Map<String, CacheProperties> cachePropertiesMap = new HashMap<>(beanNames.size());
    MutablePropertySources propertySources = new MutablePropertySources();
    propertySources.addFirst(new MapPropertySource("cache", cacheSettings));
    beanNames.forEach(beanName -> {
        CacheProperties cacheProperties = new CacheProperties();
        PropertiesConfigurationFactory<CacheProperties> factory = new PropertiesConfigurationFactory<>(cacheProperties);
        //Properties properties = new Properties();
        //properties.putAll(PropertySourceUtils.getSubProperties(propertySources, beanName));
        //factory.setProperties(properties);
        factory.setPropertySources(propertySources);
        factory.setConversionService(environment.getConversionService());
        try {
            factory.bindPropertiesToTarget();
            cachePropertiesMap.put(beanName, cacheProperties);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    });
    return cachePropertiesMap;
}
But this does not work...
Can anyone help me? How do I convert Properties to PropertySources?

Resolved.
I added factory.setTargetName(beanName); before factory.setPropertySources(propertySources);, and it works well.
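setTargetName(beanName) tells the factory which prefix to bind from, so it can pick the beanName.* keys out of the full property sources. And if you still literally need to convert a Properties object into the PropertySources that setPropertySources(...) expects, here is a minimal sketch using Spring's PropertiesPropertySource (the helper method and source name are my own, not part of the original code):
import java.util.Properties;

import org.springframework.core.env.MutablePropertySources;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.PropertySources;

public static PropertySources toPropertySources(Properties properties) {
    // MutablePropertySources implements PropertySources, so wrapping the
    // Properties in a PropertiesPropertySource and registering it is enough.
    MutablePropertySources sources = new MutablePropertySources();
    sources.addFirst(new PropertiesPropertySource("converted", properties));
    return sources;
}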

Related

Loading values from JSON upon application startup

I want to load values from a JSON file when the Spring Boot application starts.
My configuration class looks like this:
@Configuration
@Getter
public class FedexAPIConfig {

    private final static String JSON_FILE = "/static/config/fedex-api-credentials.json";
    private final boolean IS_PRODUCTION = false;
    private FedexAPICred apiCredentials;

    public FedexAPIConfig() {
        try (InputStream in = getClass().getResourceAsStream(JSON_FILE);
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            JSONObject json = new JSONObject();
            // this.apiCredentials = new JSONObject(new JSONTokener(reader));
            if (IS_PRODUCTION) {
                json = new JSONObject(new JSONTokener(reader)).getJSONObject("production");
            } else {
                json = new JSONObject(new JSONTokener(reader)).getJSONObject("test");
            }
            System.out.println(json.toString());
            this.apiCredentials = FedexAPICred.builder()
                    .url(json.optString("url"))
                    .apiKey(json.optString("api_key"))
                    .secretKey(json.optString("secret_key"))
                    .build();
        } catch (FileNotFoundException fnfe) {
            fnfe.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
With this, the values are successfully printed to the console while the application is starting up.
But when I try to use this value from another, ordinary class, like the one below, it brings nothing and just throws a NullPointerException... What am I doing wrong, and what should I do?
public class FedexOAuthTokenManager extends OAuthToken {

    private static final String VALIDATE_TOKEN_URL = "/oauth/token";
    private static final String GRANT_TYPE_CLIENT = "client_credentials";
    private static final String GRANT_TYPE_CSP = "csp_credentials";

    @Autowired
    private FedexAPIConfig fedexApiConfig;

    @Autowired
    private Token token;

    @Override
    public void validateToken() {
        // This is the part where "fedexApiConfig" is null.
        FedexAPICred fedexApiCred = fedexApiConfig.getApiCredentials();
        Response response = null;
        try {
            RequestBody body = new FormBody.Builder()
                    .add("grant_type", GRANT_TYPE_CLIENT)
                    .add("client_id", fedexApiCred.getApiKey())
                    .add("client_secret", fedexApiCred.getSecretKey())
                    .build();
            response = new HttpClient().post(fedexApiCred.getUrl() + VALIDATE_TOKEN_URL, body);
            if (response.code() == 200) {
                JSONObject json = new JSONObject(response.body().string());
                token.setAccessToken(json.optString("access_token"));
                token.setTokenType(json.optString("token_type"));
                token.setExpiredIn(json.optInt("expires_in"));
                token.setExpiredDateTime(LocalDateTime.now().plusSeconds(json.optInt("expires_in")));
                token.setScope(json.optString("scope"));
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
fedexApiConfig is null even though I autowired it prior to the call.
And this FedexOAuthTokenManager is called from another @Component class via new FedexOAuthTokenManager().
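For what it's worth, an object created with new is never managed by the Spring container, so its @Autowired fields stay null. A minimal sketch of letting Spring wire the manager instead (the caller class here is hypothetical, and FedexOAuthTokenManager itself would need to be declared as a @Component):
@Component
public class ShipmentService { // hypothetical caller

    // Injected by the container instead of created with new, so the
    // manager's own @Autowired fields are populated.
    @Autowired
    private FedexOAuthTokenManager tokenManager;

    public void refreshToken() {
        tokenManager.validateToken();
    }
}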
Did you try it like below?
Step 1: Create one Configuration class like below
public class DemoConfig implements ApplicationListener<ApplicationPreparedEvent> {

    @Override
    public void onApplicationEvent(ApplicationPreparedEvent event) {
        // Load the values from the JSON file and populate the application
        // properties dynamically
        ConfigurableEnvironment environment = event.getApplicationContext().getEnvironment();
        Properties props = new Properties();
        props.put("spring.datasource.url", "<my value>");
        // Add more properties
        environment.getPropertySources().addFirst(new PropertiesPropertySource("myProps", props));
    }
}
To listen to a context event, a bean should implement the ApplicationListener interface, which has just one method, onApplicationEvent(). The ApplicationPreparedEvent is invoked very early in the lifecycle of the application.
Step 2: Register the listener in src/main/resources/META-INF/spring.factories
org.springframework.context.ApplicationListener=com.example.demo.DemoConfig
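Alternatively (a sketch of my own, not part of the original answer), the listener can be registered programmatically instead of via spring.factories; adding it before run() ensures it still sees ApplicationPreparedEvent:
@SpringBootApplication
public class DemoApplication { // hypothetical main class

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(DemoApplication.class);
        // Register the listener early, before the context is prepared.
        app.addListeners(new DemoConfig());
        app.run(args);
    }
}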
Step 3: @Value in Spring Boot is commonly used to inject configuration values into the application. Access the properties as you wish.
#Value("${spring.datasource.url}")
private String valueFromJSon;
Try this sample on your local machine first, and then adapt it to your needs.
Refer - https://www.baeldung.com/spring-value-annotation
Refer - https://www.knowledgefactory.net/2021/02/aws-secret-manager-service-as.html

Send emails to multiple email addresses, feeding the addresses from a CSV file

I want to create an application that sends emails to several recipients. It feeds email addresses from a CSV file and sends an email to each recipient, and I'm having some trouble doing this.
Could you please help me?
Here is my CSVHelper.java
@Component
public class CSVHelper
{
    public static String TYPE = "text/csv";
    static String[] HEADERs = { "id", "email", "dateEcheance" };

    // This method is used to filter the csv file and get only the emails
    public List<ContactsFile> csvToEmails() throws NumberFormatException, ParseException
    {
        InputStream is = null;
        try (BufferedReader fileReader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
                CSVParser csvParser = new CSVParser(fileReader, CSVFormat.DEFAULT.withFirstRecordAsHeader().withIgnoreHeaderCase().withTrim());)
        {
            List<ContactsFile> emailsList = new ArrayList<>();
            Iterable<CSVRecord> csvRecords = csvParser.getRecords();
            for (CSVRecord csvRecord : csvRecords)
            {
                ContactsFile contact = new ContactsFile(csvRecord.get("email"));
                emailsList.add(contact);
            }
            System.out.println(emailsList);
            return emailsList;
        }
        catch (IOException e) { throw new RuntimeException("fail to get emails: " + e.getMessage()); }
    }
}
We call the csvToEmails() method in the controller to send the emails:
@Autowired
private CSVHelper csvHelper;

@PostMapping("/getdetails")
public @ResponseBody EmailNotification sendMail(@RequestBody EmailNotification details) throws Exception {
    MimeMessage message = sender.createMimeMessage();
    MimeMessageHelper helper = new MimeMessageHelper(message,
            MimeMessageHelper.MULTIPART_MODE_MIXED_RELATED,
            StandardCharsets.UTF_8.name());
    try {
        helper.setTo((InternetAddress) csvHelper.csvToEmails());
        helper.setText(details.getMessage(), true);
        helper.setSubject("Test Mail");
    } catch (javax.mail.MessagingException e) {
        e.printStackTrace();
    }
    sender.send(message);
    return details;
}
This is an example of the CSV file:
id,email,dateEcheance
1,address@email.com,10/05/2021
2,address2@email.com,10/02/2021
I'm new to Spring Boot, and I'm having trouble completing this project.
Assuming that you have the mail server configuration in place: first, you need to provide a JavaMailSender implementation, and Spring does provide one. You have to create a bean and configure the mail server as follows.
@Configuration
public class MailConfig {

    @Autowired
    private Environment env;

    @Bean
    public JavaMailSender mailSender() {
        try {
            JavaMailSenderImpl mailSenderImpl = new JavaMailSenderImpl();
            mailSenderImpl.setHost(env.getRequiredProperty("mail.smtp.host"));
            mailSenderImpl.setUsername(env.getRequiredProperty("mail.smtp.username"));
            mailSenderImpl.setPassword(env.getRequiredProperty("mail.smtp.password"));
            mailSenderImpl.setDefaultEncoding(env.getRequiredProperty("mail.smtp.encoding"));
            mailSenderImpl.setPort(Integer.parseInt(env.getRequiredProperty("mail.smtp.port")));
            mailSenderImpl.setProtocol(env.getRequiredProperty("mail.transport.protocol"));
            Properties p = new Properties();
            p.setProperty("mail.smtp.starttls.enable", "true");
            p.setProperty("mail.smtp.auth", "true");
            mailSenderImpl.setJavaMailProperties(p);
            return mailSenderImpl;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
Configuration settings in application.properties. Add settings according to the mail server.
# SMTP mail settings
mail.smtp.host=**********
mail.smtp.username=**********
mail.smtp.password=**********
mail.smtp.port=**********
mail.smtp.socketFactory.port=**********
mail.smtp.encoding=utf-8
mail.transport.protocol=smtp
mail.smtp.auth=true
mail.smtp.timeout=10000
mail.smtp.starttls.enable=true
mail.smtp.socketFactory.class=javax.net.ssl.SSLSocketFactory
CSVHelper.java
@Component
public class CSVHelper {

    // This method is used to filter the csv file and get only the emails
    public List<String> csvToEmails() {
        InputStream is = ClassLoader.getSystemClassLoader().getResourceAsStream("csvFile.csv");
        try (BufferedReader fileReader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
                CSVParser csvParser = new CSVParser(fileReader,
                        CSVFormat.DEFAULT.withFirstRecordAsHeader().withIgnoreHeaderCase().withTrim());) {
            List<String> emails = new ArrayList<>();
            Iterable<CSVRecord> csvRecords = csvParser.getRecords();
            for (CSVRecord csvRecord : csvRecords) {
                String email = csvRecord.get("email");
                emails.add(email);
            }
            return emails;
        } catch (IOException e) {
            throw new RuntimeException("fail to get emails: " + e.getMessage());
        }
    }
}
Send mail service
@Service
public class MailSenderService {

    @Autowired
    private CSVHelper csvHelper;

    @Autowired
    private JavaMailSender sender;

    public void sendMail(EmailNotification details) {
        try {
            List<String> recipients = csvHelper.csvToEmails();
            String[] to = recipients.stream().toArray(String[]::new);
            JavaMailSenderImpl jms = (JavaMailSenderImpl) sender;
            MimeMessage message = jms.createMimeMessage();
            MimeMessageHelper helper = new MimeMessageHelper(message, true);
            // add the sender email address below
            helper.setFrom("sender@mail.com");
            helper.setTo(to);
            helper.setText(details.getMessage(), true);
            helper.setSubject("Test Mail");
            sender.send(message);
        } catch (javax.mail.MessagingException | NumberFormatException e) {
            // action for exception case
        }
    }
}
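A minimal usage sketch for the service (the controller name is hypothetical; EmailNotification is assumed to be a simple DTO with a getMessage() accessor, as in the question):
@RestController
public class EmailController {

    @Autowired
    private MailSenderService mailSenderService;

    @PostMapping("/getdetails")
    public EmailNotification sendMail(@RequestBody EmailNotification details) {
        // Delegates to the service, which reads the recipient list from the CSV.
        mailSenderService.sendMail(details);
        return details;
    }
}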

Export entities to database schema through java code

A long time ago, I did that with code like this:
Configuration config = new Configuration();
Properties props = new Properties();
FileInputStream fos = new FileInputStream(file_name);
props.load(fos);
fos.close();
config.setProperties(props);
config.addAnnotatedClass(...);
Connection conn = DriverManager.getConnection(url, usuario, senha);
SchemaExport schema = new SchemaExport();
schema.create(true, true);
But now, if I try to use this code, I get a compilation error. Looking at the javadoc for SchemaExport, I noticed a lot of changes in the methods used in this example.
How can I do that now?
Update
Based on the suggested link, I implemented the method this way:
public void criarTabelas(String server, String user, String pass) throws Exception {
    StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
            .applySetting("hibernate.hbm2ddl.auto", "create")
            .applySetting("hibernate.dialect", dialect)
            .applySetting("hibernate.id.new_generator_mappings", "true")
            .build();
    MetadataSources sources = new MetadataSources(standardRegistry);
    for (Class<?> entity : lista_entidades())
        sources.addAnnotatedClass(entity);
    MetadataImplementor metadata = (MetadataImplementor) sources.getMetadataBuilder().build();
    SchemaExport export = new SchemaExport();
    export.create(EnumSet.of(TargetType.DATABASE), metadata);
}

private List<Class<?>> lista_entidades() throws Exception {
    List<Class<?>> lista = new ArrayList<Class<?>>();
    ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
    scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
    for (BeanDefinition bd : scanner.findCandidateComponents("org.loja.model"))
        lista.add(Class.forName(bd.getBeanClassName()));
    return lista;
}
Now I need a way to establish a JDBC connection and associate it with the SchemaExport.
I solved this issue with this code:
public void criarTabelas(String server, String user, String pass) throws Exception {
    Connection conn = DriverManager.getConnection(url_prefix + server + "/" + url_suffix, user, pass);
    StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
            .applySetting("hibernate.hbm2ddl.auto", "create")
            .applySetting("hibernate.dialect", dialect)
            .applySetting("hibernate.id.new_generator_mappings", "true")
            .applySetting("javax.persistence.schema-generation-connection", conn)
            .build();
    MetadataSources sources = new MetadataSources(standardRegistry);
    for (Class<?> entity : lista_entidades())
        sources.addAnnotatedClass(entity);
    MetadataImplementor metadata = (MetadataImplementor) sources.getMetadataBuilder().build();
    SchemaExport export = new SchemaExport();
    export.create(EnumSet.of(TargetType.DATABASE), metadata);
    conn.close();
}

private List<Class<?>> lista_entidades() throws Exception {
    List<Class<?>> lista = new ArrayList<Class<?>>();
    ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
    scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
    for (BeanDefinition bd : scanner.findCandidateComponents("org.loja.model"))
        lista.add(Class.forName(bd.getBeanClassName()));
    System.out.println("lista: " + lista);
    return lista;
}
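Passing the live JDBC Connection through javax.persistence.schema-generation-connection is what lets SchemaExport write to that connection instead of opening its own. A minimal usage sketch (the owning class name and credentials are hypothetical; url_prefix, url_suffix, and dialect are fields assumed to be set elsewhere, as the snippets imply):
// Hypothetical invocation: creates tables for every @Entity under org.loja.model.
SchemaGenerator generator = new SchemaGenerator();
generator.criarTabelas("localhost:5432", "postgres", "secret");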

AggregatingReplyingKafkaTemplate releaseStrategy Question

There seems to be an issue when I use AggregatingReplyingKafkaTemplate with template.setReturnPartialOnTimeout(true), in that it returns a timeout exception even if partial results are available from the consumers.
In the example below, I have 3 consumers replying to the request topic, and I've set the reply timeout to 10 seconds. I've explicitly delayed the response of consumer 3 to 11 seconds, so I expect responses back from consumers 1 and 2, from which I can return partial results. However, I am getting KafkaReplyTimeoutException. Appreciate your inputs. Thanks.
I based my code on the unit test below.
[ReplyingKafkaTemplateTests][1]
I've provided the actual code below:
@RestController
public class SumController {

    @Value("${kafka.bootstrap-servers}")
    private String bootstrapServers;

    public static final String D_REPLY = "dReply";
    public static final String D_REQUEST = "dRequest";

    @ResponseBody
    @PostMapping(value = "/sum")
    public String sum(@RequestParam("message") String message) throws InterruptedException, ExecutionException {
        AggregatingReplyingKafkaTemplate<Integer, String, String> template = aggregatingTemplate(
                new TopicPartitionOffset(D_REPLY, 0), 3, new AtomicInteger());
        String resultValue = "";
        String currentValue = "";
        try {
            template.setDefaultReplyTimeout(Duration.ofSeconds(10));
            template.setReturnPartialOnTimeout(true);
            ProducerRecord<Integer, String> record = new ProducerRecord<>(D_REQUEST, null, null, null, message);
            RequestReplyFuture<Integer, String, Collection<ConsumerRecord<Integer, String>>> future =
                    template.sendAndReceive(record);
            future.getSendFuture().get(5, TimeUnit.SECONDS); // send ok
            System.out.println("Send Completed Successfully");
            ConsumerRecord<Integer, Collection<ConsumerRecord<Integer, String>>> consumerRecord = future.get(10, TimeUnit.SECONDS);
            System.out.println("Consumer record size " + consumerRecord.value().size());
            Iterator<ConsumerRecord<Integer, String>> iterator = consumerRecord.value().iterator();
            while (iterator.hasNext()) {
                currentValue = iterator.next().value();
                System.out.println("response " + currentValue);
                System.out.println("Record header " + consumerRecord.headers().toString());
                resultValue = resultValue + currentValue + "\r\n";
            }
        } catch (Exception e) {
            System.out.println("Error Message is " + e.getMessage());
        }
        return resultValue;
    }

    public AggregatingReplyingKafkaTemplate<Integer, String, String> aggregatingTemplate(
            TopicPartitionOffset topic, int releaseSize, AtomicInteger releaseCount) {
        // Create Container Properties
        ContainerProperties containerProperties = new ContainerProperties(topic);
        containerProperties.setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
        // Set the consumer Config
        // Create Consumer Factory with Consumer Config
        DefaultKafkaConsumerFactory<Integer, Collection<ConsumerRecord<Integer, String>>> cf =
                new DefaultKafkaConsumerFactory<>(consumerConfigs());
        // Create Listener Container with Consumer Factory and Container Property
        KafkaMessageListenerContainer<Integer, Collection<ConsumerRecord<Integer, String>>> container =
                new KafkaMessageListenerContainer<>(cf, containerProperties);
        // container.setBeanName(this.testName);
        AggregatingReplyingKafkaTemplate<Integer, String, String> template =
                new AggregatingReplyingKafkaTemplate<>(new DefaultKafkaProducerFactory<>(producerConfigs()), container,
                        (list, timeout) -> {
                            releaseCount.incrementAndGet();
                            return list.size() == releaseSize;
                        });
        template.setSharedReplyTopic(true);
        template.start();
        return template;
    }

    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test_id");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
        return props;
    }

    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        // list of host:port pairs used for establishing the initial connections to the Kafka cluster
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
        return props;
    }

    public ProducerFactory<Integer, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @KafkaListener(id = "def1", topics = { D_REQUEST }, groupId = "D_REQUEST1")
    @SendTo // default REPLY_TOPIC header
    public String dListener1(String in) throws InterruptedException {
        return "First Consumer : " + in.toUpperCase();
    }

    @KafkaListener(id = "def2", topics = { D_REQUEST }, groupId = "D_REQUEST2")
    @SendTo // default REPLY_TOPIC header
    public String dListener2(String in) throws InterruptedException {
        return "Second Consumer : " + in.toLowerCase();
    }

    @KafkaListener(id = "def3", topics = { D_REQUEST }, groupId = "D_REQUEST3")
    @SendTo // default REPLY_TOPIC header
    public String dListener3(String in) throws InterruptedException {
        Thread.sleep(11000);
        return "Third Consumer : " + in;
    }
}
[1]: https://github.com/spring-projects/spring-kafka/blob/master/spring-kafka/src/test/java/org/springframework/kafka/requestreply/ReplyingKafkaTemplateTests.java
template.setReturnPartialOnTimeout(true) simply means the template will consult the release strategy on timeout (with the timeout argument = true, to tell the strategy it's a timeout rather than a delivery call).
It must return true to release the partial result.
This is to allow you to look at (and possibly modify) the list to decide whether you want to release or discard.
Your strategy ignores the timeout parameter:
(list, timeout) -> {
    releaseCount.incrementAndGet();
    return list.size() == releaseSize;
});
You need return timeout ? true : { ... }.
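A minimal sketch of a strategy that honors the timeout flag (same releaseSize and releaseCount as above):
(list, timeout) -> {
    releaseCount.incrementAndGet();
    // On timeout, release whatever partial replies have arrived;
    // otherwise keep waiting until all releaseSize replies are in.
    return timeout || list.size() == releaseSize;
});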

Spring Kafka - how to fetch timestamp (event time) when message was produced

I have a requirement to fetch the timestamp (event time) at which a message was produced, in the Kafka consumer application. I am aware of the timestampExtractor, which can be used with Kafka Streams, but my requirement is different, as I am not using streams to consume messages.
My Kafka producer is as follows:
@Override
public void run(ApplicationArguments args) throws Exception {
    List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
    List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
    Runnable runnable = () -> {
        String rPage = pages.get(new Random().nextInt(pages.size()));
        String rName = names.get(new Random().nextInt(names.size()));
        PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);
        Message<PageViewEvent> message = MessageBuilder
                .withPayload(pageViewEvent)
                .setHeader(KafkaHeaders.MESSAGE_KEY, pageViewEvent.getUserId().getBytes())
                .build();
        try {
            this.pageViewsOut.send(message);
            log.info("sent " + message);
        } catch (Exception e) {
            log.error(e);
        }
    };
The Kafka consumer is implemented using Spring Kafka's @KafkaListener.
@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(@Payload PageViewEvent data, @Headers MessageHeaders headers) {
    LOG.info("Message received");
    LOG.info("received data='{}'", data);
}
Container factory configuration
@Bean
public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> priceEventsKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(priceEventConsumerFactory());
    return factory;
}
When I print the message that the producer is sending, it gives me the data below:
[payload=PageViewEvent(userId=blog, page=about, duration=10),
headers={id=8ebdad85-e2f7-958f-500e-4560ac0970e5,
kafka_messageKey=[B@71975e1a, contentType=application/json,
timestamp=1553041963803}]
This does have a produced timestamp. How can I fetch the message's produced timestamp with Spring Kafka?
RECEIVED_TIMESTAMP means it is the timestamp from the record that was received, not the time it was received. We avoid putting it in TIMESTAMP to avoid inadvertent propagation to an outbound message.
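So, on the consumer side, a minimal sketch of reading that header in a listener (reusing the listener from the question; @Header comes from org.springframework.messaging.handler.annotation):
@KafkaListener(topics = "test1", groupId = "json", containerFactory = "kafkaListenerContainerFactory")
public void receive(@Payload PageViewEvent data,
        @Header(KafkaHeaders.RECEIVED_TIMESTAMP) long timestamp) {
    // RECEIVED_TIMESTAMP carries the record's own timestamp, i.e. the time
    // the producer created the record (for topics using CreateTime).
    LOG.info("received data='{}' produced at {}", data, timestamp);
}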
On the producer side, you can use something like the below to see the timestamp assigned to each record:
// Note: this snippet assumes 'properties' (the producer config) and 'TOPIC'
// are defined elsewhere.
final Producer<String, String> producer = new KafkaProducer<String, String>(properties);
long time = System.currentTimeMillis();
final CountDownLatch countDownLatch = new CountDownLatch(5);
int count = 0;
try {
    for (long index = time; index < time + 10; index++) {
        String key = null;
        count++;
        if (count <= 5)
            key = "id_" + Integer.toString(1);
        else
            key = "id_" + Integer.toString(2);
        final ProducerRecord<String, String> record =
                new ProducerRecord<>(TOPIC, key, "B2B Sample Message: " + count);
        producer.send(record, (metadata, exception) -> {
            long elapsedTime = System.currentTimeMillis() - time;
            if (metadata != null) {
                System.out.printf("sent record(key=%s value=%s) " +
                        "meta(partition=%d, offset=%d) time=%d timestamp=%d\n",
                        record.key(), record.value(), metadata.partition(),
                        metadata.offset(), elapsedTime, metadata.timestamp());
                System.out.println("Timestamp:: " + metadata.timestamp());
            } else {
                exception.printStackTrace();
            }
            countDownLatch.countDown();
        });
    }
    try {
        countDownLatch.await(25, TimeUnit.SECONDS);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
} finally {
    producer.flush();
    producer.close();
}
