How to log fields having @JsonIgnore annotation? - spring-boot

I am using Spring Boot 2.7.3 with Java 17 and Gradle. I need to log every DB operation in a log file. Everything works fine except for one condition: when I use @JsonIgnore in my UI class, those particular fields are not logged. The other fields are logged properly. How do I deal with this?
My Aspect Class:
@Aspect
@Component
public class LoggingAspect {

    private static final Logger logger = LoggerFactory.getLogger(LoggingAspect.class);

    @Pointcut("execution(public * org.mycomp.erp.service..*(..))")
    public void requestPointcut() {}

    @Before("requestPointcut()")
    public void requestLogger(JoinPoint joinPoint) throws JsonProcessingException {
        final ObjectMapper objectMapper = new ObjectMapper();
        final String methodName = joinPoint.getSignature().getName();
        final Object[] inputArguments = joinPoint.getArgs();
        final String logMessage = "Operation : " + methodName +
                " With Input Parameters : " + objectMapper.writeValueAsString(inputArguments);
        logger.info(logMessage);
    }
}
My UI Record Class:
public record LeaveApplUx(
        @JsonIgnore String employeeCode,
        @JsonIgnore int applicationNumber,
        LocalDate startDate,
        LocalDate endDate,
        String leaveType) {

    public LeaveApplUx(
            @Size(min = 4, max = 5, message = "Employee Code Must Be Within 4 To 5 Character Long Or Blank")
            @JsonProperty("employeeCode") String employeeCode,
            @Range(min = 0, max = 9999, message = "Application Number Must Be Within 9999 Or Zero")
            @JsonProperty("applicationNumber") int applicationNumber,
            @NotNull(message = "Start Date Empty")
            @JsonSerialize(using = LocalDateSerializer.class)
            @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "dd-MM-yyyy")
            @JsonProperty("startDate") LocalDate startDate,
            @NotNull(message = "End Date Empty")
            @JsonSerialize(using = LocalDateSerializer.class)
            @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "dd-MM-yyyy")
            @JsonProperty("endDate") LocalDate endDate,
            @Size(min = 2, max = 2, message = "Leave Type Mis-Match")
            @JsonProperty("leaveType") String leaveType) {
        this.employeeCode = employeeCode;
        this.applicationNumber = applicationNumber;
        this.startDate = startDate;
        this.endDate = endDate;
        this.leaveType = leaveType;
    }
}
My Request through Postman:
DELETE
http://localhost:8080/hr/leave/application/delete
{
"employeeCode": "E001",
"applicationNumber": "1",
"startDate": "01-04-2022",
"endDate": "05-04-2022",
"leaveType": "CL"
}
My Log File Output:
2022-08-30 14:32:53,688 INFO : Operation : deleteLeaveApplication With Input Parameters : ["N","N",{"startDate":"01-04-2022","endDate":"05-04-2022","leaveType":"CL"}]
The Employee Code & Application Number are not logged.

The @JsonIgnore annotation causes Jackson's ObjectMapper to ignore such fields on serialization. This is the documented behavior:
Marker annotation that indicates that the logical property that the accessor (field, getter/setter method or Creator parameter [of JsonCreator-annotated constructor or factory method]) is to be ignored by introspection-based serialization and deserialization functionality.
You have two options: Log as an array or configure ObjectMapper for the logging.
Solution #1: Log as an array
You have all the parameters available in the Object[] inputArguments array, which can be logged directly. I also recommend using the logger's parameterized messages:
logger.info("Operation: {} with input parameters: {}",
methodName, inputArguments);
Solution #2: Configure a specific ObjectMapper
For this particular case, you can configure the ObjectMapper to ignore the annotations by setting MapperFeature.USE_ANNOTATIONS to false.
Feature that determines whether annotation introspection is used for configuration; if enabled, configured AnnotationIntrospector will be used: if disabled, no annotations are considered.
Based on the Jackson version:
Jackson before 2.13:
ObjectMapper ignoringObjectMapper = new ObjectMapper()
.configure(MapperFeature.USE_ANNOTATIONS, false);
Jackson 2.13 and higher:
ObjectMapper ignoringObjectMapper = JsonMapper.builder()
.configure(MapperFeature.USE_ANNOTATIONS, false)
.build();
... and log it:
logger.info("Operation: {} with input parameters: {}",
methodName, ignoringObjectMapper.writeValueAsString(inputArguments));
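A minimal, self-contained sketch of solution #2 (assuming Jackson 2.13+ and Java 16+ records on the classpath; the LeaveApplUx record here is a trimmed stand-in for the one in the question):

```java
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.databind.MapperFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.json.JsonMapper;

public class Main {
    // trimmed stand-in for the record in the question
    record LeaveApplUx(@JsonIgnore String employeeCode, String leaveType) {}

    // default mapper: honors @JsonIgnore and drops the field
    static String withAnnotations(Object o) throws Exception {
        return new ObjectMapper().writeValueAsString(o);
    }

    // logging mapper: annotation introspection disabled, so every field is written
    static String forLogging(Object o) throws Exception {
        ObjectMapper ignoringObjectMapper = JsonMapper.builder()
                .configure(MapperFeature.USE_ANNOTATIONS, false)
                .build();
        return ignoringObjectMapper.writeValueAsString(o);
    }

    public static void main(String[] args) throws Exception {
        LeaveApplUx ux = new LeaveApplUx("E001", "CL");
        System.out.println(withAnnotations(ux)); // employeeCode is dropped
        System.out.println(forLogging(ux));      // employeeCode is kept
    }
}
```

Building the ignoring mapper once (e.g. as a field of the aspect) rather than per call would be cheaper, since ObjectMapper construction is relatively expensive.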

The problem can also be solved by simply replacing
objectMapper.writeValueAsString(inputArguments)
with
Arrays.toString(inputArguments)
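This works because Arrays.toString falls back to each argument's toString(), which Jackson annotations do not affect. A small stdlib-only sketch (the record here is a hypothetical stand-in for the one in the question):

```java
import java.util.Arrays;

public class Main {
    // hypothetical stand-in; @JsonIgnore on a component would not change toString()
    record LeaveApplUx(String employeeCode, int applicationNumber) {}

    // builds the same log line as the aspect, but via toString() instead of Jackson
    static String logLine(String methodName, Object[] inputArguments) {
        return "Operation : " + methodName
                + " With Input Parameters : " + Arrays.toString(inputArguments);
    }

    public static void main(String[] args) {
        Object[] inputArguments = { new LeaveApplUx("E001", 1) };
        System.out.println(logLine("deleteLeaveApplication", inputArguments));
        // → Operation : deleteLeaveApplication With Input Parameters : [LeaveApplUx[employeeCode=E001, applicationNumber=1]]
    }
}
```

The caveat is that the output is only as useful as each argument's toString(); records get a complete one for free, but plain classes without an override will print as Class@hash.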

Related

@Value can not generate value properly, getting null

When I run the code, the external configuration in the application.properties file does not get populated into the variables within the DataBucketUtil. I'm sure I'm doing something stupid, but I cannot find out where the problem is.
public class DataBucketUtil {

    private static final Logger logger = LoggerFactory.getLogger(DataBucketUtil.class);

    @Value("${gcp.config.file}")
    private String gcpConfigFile;
    @Value("${gcp.project.id}")
    private String gcpProjectId;
    @Value("${gcp.bucket.id}")
    private String gcpBucketId;
    @Value("${gcp.directory.name}")
    private String gcpDirectoryName;

    /**
     * Upload file to GCS
     *
     * @param multipartFile -
     * @param fileName -
     * @param contentType -
     * @return -
     */
    public FileDto uploadFile(MultipartFile multipartFile, String fileName, String contentType) {
        try {
            logger.debug("Start file uploading process on GCS");
            byte[] fileData = FileUtils.readFileToByteArray(convertFile(multipartFile));
            InputStream inputStream = new ClassPathResource(gcpConfigFile).getInputStream();
            StorageOptions options = StorageOptions.newBuilder().setProjectId(gcpProjectId)
                    .setCredentials(GoogleCredentials.fromStream(inputStream)).build();
            Storage storage = options.getService();
            Bucket bucket = storage.get(gcpBucketId, Storage.BucketGetOption.fields());
            RandomString id = new RandomString(6, ThreadLocalRandom.current());
            Blob blob = bucket.create(gcpDirectoryName + "/"
                            + fileName + "-" + id.nextString() + checkFileExtension(fileName),
                    fileData, contentType);
            if (blob != null) {
                logger.debug("File successfully uploaded to GCS");
                return new FileDto(blob.getName(), blob.getMediaLink());
            }
        } catch (IOException e) {
            logger.error("An error occurred while uploading data. Exception: ", e);
            throw new RuntimeException("An error occurred while uploading data to GCS");
        }
        throw new RuntimeException("An error occurred while uploading data to GCS");
    }
}
My application.properties is given below:
gcp.config.file=gcp-config/gcs-prod-ho-finance.json
gcp.project.id=brac-main
gcp.bucket.id=prod-ho-finance
gcp.dir.name=gs://prod-ho-finance
It is not entirely clear from your code snippet, but my guess would be that your DataBucketUtil is not instantiated as a bean, and therefore the @Value-annotated fields are not populated. See here for more details about the @Value annotation.
You could transform your class to a service or component with the #Component or #Service annotation and then autowire it to where you need it. See here for more information about beans.
Please add these annotations. I hope it works.
@EnableConfigurationProperties
@Component
public class DataBucketUtil {

    private static final Logger logger = LoggerFactory.getLogger(DataBucketUtil.class);

    @Value("${gcp.config.file}")
    private String gcpConfigFile;
    @Value("${gcp.project.id}")
    private String gcpProjectId;
    @Value("${gcp.bucket.id}")
    private String gcpBucketId;
    @Value("${gcp.directory.name}")
    private String gcpDirectoryName;

    /* ............ */
}

Schedule a function in spring boot with @Scheduled annotation

I would like to execute the following method at the date and time specified on my Angular form.
Here is the input:
<input required [(ngModel)]="emailNotification.sendingDate" class="form-control" type="datetime-local" name="sendingDate" id="time">
The sending-emails method (from the controller):
@PostMapping(value = "/getdetails")
public @ResponseBody void sendMail(@RequestBody EmailNotification details) throws Exception {
    try {
        JavaMailSenderImpl jms = (JavaMailSenderImpl) sender;
        MimeMessage message = jms.createMimeMessage();
        MimeMessageHelper helper = new MimeMessageHelper(message, MimeMessageHelper.MULTIPART_MODE_MIXED_RELATED, StandardCharsets.UTF_8.name());
        helper.setFrom("smsender4@gmail.com");
        List<String> recipients = fileRepo.findWantedEmails(details.getDaysNum());
        String[] to = recipients.stream().toArray(String[]::new);
        helper.setTo(to);
        helper.setText(details.getMessage(), true);
        helper.setSubject("Test Mail");
        details.setRecipients(to);
        sender.send(message);
        enr.save(new EmailNotification(details.getId(), "Test mail", details.getMessage(), details.getDaysNum(), details.getRecipients(), details.getSendingDate()));
    } catch (MessagingException e) {
        throw new RuntimeException("fail to send emails: " + e.getMessage());
    }
}
EmailNotification.class:
public class EmailNotification {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id;
    private String subject;
    private String message;
    private int daysNum;
    private String[] recipients;
    /*
     * @Temporal(TemporalType.TIMESTAMP)
     */
    // "yyyy-MM-dd'T'HH:mm:ss.SSSX"
    @Column(name = "sending_date")
    @Basic(fetch = FetchType.EAGER)
    private LocalDateTime sendingDate;

    public EmailNotification(long id, String subject, String message, int daysNum, String[] recipients, LocalDateTime sendingDate) {
        super();
        this.sendingDate = sendingDate;
        this.daysNum = daysNum;
        this.id = id;
        this.message = message;
        this.subject = subject;
    }
}
I don't know how to proceed. I'll be so grateful if someone helps.
Now I think I understand what you are trying to achieve, so check if this works for you.
Let's say you want to send an email when, in a given table (I will name it PAYMENT), there are 5 days left until the deadline. Let's say this table has an equivalent entity class named Payment; you could do something like this:
Assuming the Payment entity has its own CrudRepository extension, which you are using to access the database, you would make a query to find the payments close to the deadline. This will of course depend on how you are accessing your database. The next example query probably won't work (because, as far as I know, JPA doesn't support DATEDIFF), but it will serve as a base for what you are trying to achieve:
@Repository
public interface PaymentRepository extends CrudRepository<Payment, Integer> {

    /* You need to find every payment that's 5 days from the deadline */
    @Query("SELECT p FROM PAYMENT p WHERE DATEDIFF(day, CURRENT_TIMESTAMP, p.deadline) <= 5")
    public List<Payment> findPaymentsCloseToDeadline();
}
Make a scheduled task class that sends the email for every payment close to the deadline. Instead of passing parameters to the method, use the database access you already have to fill in your email. You should not depend on information received from the front end (like Angular), because it is easily manipulated; use the information you know, the information stored in the database. In this example the task starts at 10:00:
@Configuration
@EnableScheduling
public class EmailScheduler {

    @Autowired
    private PaymentRepository paymentRepository;

    @Scheduled(cron = "0 0 10 * * ?")
    public void sendEmails() {
        JavaMailSenderImpl jms = (JavaMailSenderImpl) sender;
        MimeMessage message = jms.createMimeMessage();
        MimeMessageHelper helper = new MimeMessageHelper(message, MimeMessageHelper.MULTIPART_MODE_MIXED_RELATED, StandardCharsets.UTF_8.name());
        // Here you are accessing the payments close to the deadline
        List<Payment> payments = paymentRepository.findPaymentsCloseToDeadline();
        /* Send an email for each payment. Use the database info you already have to fill in the information */
        for (Payment payment : payments) {
            try {
                helper.setFrom("smsender4@gmail.com");
                // Fill the recipients with the info you need
                String[] recipients = {payment.getUserEmail(), "other@email.com"};
                helper.setTo(recipients);
                String msg = "Here would go the message";
                helper.setText(msg, true);
                helper.setSubject("Test Mail");
                sender.send(message);
                // You would still need to calculate the ID and the days if you require them
                enr.save(new EmailNotification(id, "Test mail", msg, days,
                        recipients, "today's date in LocalDate"));
            } catch (MessagingException e) {
                throw new RuntimeException("fail to send emails: " + e.getMessage());
            }
        }
    }
}
This is only a guideline because there are still some things I don't know about your code, but I hope it's clear enough to help you. Good luck.

spring boot validation for Long values

This is the class against which we are going to map the incoming request:
@Getter
@Setter
public class FooRequest {

    @Size(max = 255, message = "{error.foo.name.size}")
    private String name;

    @Digits(integer = 15, fraction = 0, message = "{error.foo.fooId.size}")
    private Long fooId;

    @Digits(integer = 15, fraction = 0, message = "{error.foo.barId.size}")
    private Long barId;
}
I have used javax.validation.constraints.* as shown above. If we send a request like:
{
"name": "Test",
"fooId": "0001234567",
"barId": "0003456789"
}
Then it works fine and we are able to save the results in the database, but if we send it like:
{
"name": "Test",
"fooId": 0001234567,
"barId": 0003456789
}
Then we are getting a 400 Bad Request. I don't see what I am doing wrong; I just want to ensure that the user sends digits with a length between 1 and 15, and to map them onto the Long variables. Is it because of fraction, or because all these values start with 0?
The second JSON is not valid JSON because of the leading zeroes.
Background
Spring uses the Jackson library for JSON interactions.
Jackson's ObjectMapper by default throws if you try to parse the second JSON:
public class Main {
    public static void main(String[] args) throws IOException {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.readValue("{\"name\": \"Test\", \"fooId\": 0001234567, \"barId\": 0003456789}", FooRequest.class);
    }
}
The exception is:
Exception in thread "main" com.fasterxml.jackson.core.JsonParseException: Invalid numeric value: Leading zeroes not allowed
at [Source: (String)"{"name": "Test", "fooId": 0001234567, "barId": 0003456789}"; line: 1, column: 28]
One can allow leading zeroes via the JsonParser.Feature.ALLOW_NUMERIC_LEADING_ZEROS:
public class Main {
    public static void main(String[] args) throws IOException {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.configure(JsonParser.Feature.ALLOW_NUMERIC_LEADING_ZEROS, true);
        FooRequest fooRequest = objectMapper.readValue("{\"name\": \"Test\", \"fooId\": 0001234567, \"barId\": 0003456789}", FooRequest.class);
        System.out.println(fooRequest.getBarId());
    }
}
Or, in Spring, via Spring Boot's application.properties:
spring.jackson.parser.allow-numeric-leading-zeros=true
then, the second JSON will be parsed successfully.
Why does it work with the first JSON?
Because by default Jackson's MapperFeature.ALLOW_COERCION_OF_SCALARS is turned on.
From its javadoc:
When feature is enabled, conversions from JSON String are allowed, as long as textual value matches (for example, String "true" is allowed as equivalent of JSON boolean token true; or String "1.0" for double).
because all these values are starting with 0?
So it turns out that the answer is yes, but for a slightly different reason.
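The two behaviors can be seen side by side in a small sketch (assuming Jackson on the classpath; FooRequest is reduced to the one relevant field):

```java
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class Main {
    public static class FooRequest {
        public Long fooId;
    }

    // parses the given JSON with a default ObjectMapper and returns fooId
    static Long parseCoerced(String json) throws Exception {
        return new ObjectMapper().readValue(json, FooRequest.class).fooId;
    }

    public static void main(String[] args) throws Exception {
        // JSON String: scalar coercion parses "0001234567" into the Long 1234567
        System.out.println(parseCoerced("{\"fooId\": \"0001234567\"}")); // 1234567

        // bare numeric literal: rejected by the parser before coercion even applies
        try {
            parseCoerced("{\"fooId\": 0001234567}");
        } catch (JsonParseException e) {
            System.out.println("rejected: " + e.getOriginalMessage());
        }
    }
}
```

Note that the string path also silently drops the leading zeroes, so if those zeroes are significant (e.g. zero-padded codes), a String field with a pattern constraint would preserve them better than a Long.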

Testing a REST endpoint with Spring, MongoDB using ObjectIds

I'm new to MongoDB and I'm writing a series of unit tests for a Mongo-backed REST web service. Here's a simple test for a /clients/{id} endpoint:
@RunWith(MockitoJUnitRunner.class)
public class ClientsControllerMockMvcStandaloneTest {

    private MockMvc mvc;

    @Mock
    private ClientsRepository clientsRepository;

    @Mock
    private ModelMapper modelMapper;

    @InjectMocks
    private ClientsController clientsController;

    private ExceptionHandlerExceptionResolver createExceptionResolver() {
        ExceptionHandlerExceptionResolver exceptionResolver = new ExceptionHandlerExceptionResolver() {
            @SuppressWarnings("ConstantConditions")
            @Override
            protected ServletInvocableHandlerMethod getExceptionHandlerMethod(final HandlerMethod handlerMethod,
                                                                              final Exception exception) {
                final Method method = new ExceptionHandlerMethodResolver(RestResponseEntityExceptionHandler.class)
                        .resolveMethod(exception);
                final RestResponseEntityExceptionHandler handler = new RestResponseEntityExceptionHandler();
                return new ServletInvocableHandlerMethod(handler, method);
            }
        };
        exceptionResolver.getMessageConverters().add(new MappingJackson2HttpMessageConverter());
        exceptionResolver.afterPropertiesSet();
        return exceptionResolver;
    }

    @Before
    public void setup() {
        JacksonTester.initFields(this, new ObjectMapper());
        mvc = MockMvcBuilders.standaloneSetup(clientsController)
                .setHandlerExceptionResolvers(createExceptionResolver())
                .build();
    }

    // GET /api/clients/{id} 200
    @Test
    public void findById_ClientEntryFound_ShouldReturnFoundClientEntry() throws Exception {
        final ObjectId id = new ObjectId();
        final Client client = Client.builder()
                .id(id)
                .name("Microsoft")
                .build();
        final ClientDTO clientDTO = ClientDTO.builder()
                .id(id)
                .name("Microsoft")
                .build();
        when(clientsRepository.findById(id))
                .thenReturn(Optional.of(client));
        when(modelMapper.map(client, ClientDTO.class))
                .thenReturn(clientDTO);
        mvc.perform(get("/clients/" + id.toString())
                .accept(TestUtils.APPLICATION_JSON_UTF8))
                .andExpect(content().contentType(TestUtils.APPLICATION_JSON_UTF8))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.id", is(id)))
                .andExpect(jsonPath("$.name", is("Microsoft")))
                .andDo(MockMvcResultHandlers.print());
        verify(modelMapper, times(1)).map(client, ClientDTO.class);
        verify(clientsRepository, times(1)).findById(id);
        verifyNoMoreInteractions(clientsRepository);
    }
}
I expect this to work, but I'm getting the following:
java.lang.AssertionError: JSON path "$.id"
Expected: is <5c9b9a0289d2b311b150b92c>
but: was <{timestamp=1553701378, machineIdentifier=9032371, processIdentifier=4529, counter=5290284, timeSecond=1553701378, time=1553701378000, date=1553701378000}>
Any help would be appreciated (including any pointers if you think my general approach could be improved!).
Cheers!
Jackson doesn't know that your ObjectId instance should be serialized as 5c9b9a0289d2b311b150b92c and not as:
{
"timestamp": 1553701378,
"machineIdentifier": 9032371,
"processIdentifier": 4529,
"counter": 5290284,
"time": 1553701378000,
"date": 1553701378000,
"timeSecond": 1553701378
}
Luckily, it's easy to fix. The ObjectId#toString() method (which internally invokes ObjectId#toHexString()) converts the ObjectId instance into its 24-character hexadecimal string representation.
So you could use @JsonSerialize along with ToStringSerializer to have the ObjectId instance represented as a string:
@JsonSerialize(using = ToStringSerializer.class)
private ObjectId id;
Then, in your test, use the ObjectId#toString() method (or ObjectId#toHexString()) for the assertion:
.andExpect(jsonPath("$.id", is(id.toString())))
Alternatively, assuming that you are using Spring Data for MongoDB, instead of ObjectId you could use:
@Id
private String id;
You also could handle the conversion of ObjectId to String in your mapper layer.
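That mapper-layer approach might look roughly like this (stdlib-only sketch; the ObjectId class below is a hypothetical stand-in for org.bson.types.ObjectId, and Client/ClientDTO stand in for the classes used in the test):

```java
// hypothetical stand-in for org.bson.types.ObjectId
final class ObjectId {
    private final String hex;
    ObjectId(String hex) { this.hex = hex; }
    String toHexString() { return hex; }
    @Override public String toString() { return toHexString(); }
}

// domain object keeps the native id type
record Client(ObjectId id, String name) {}

// the DTO exposes the id as a plain String, so Jackson has nothing to guess about
record ClientDTO(String id, String name) {}

public class Main {
    // the conversion happens once, at the mapping boundary
    static ClientDTO toDto(Client client) {
        return new ClientDTO(client.id().toHexString(), client.name());
    }

    public static void main(String[] args) {
        Client client = new Client(new ObjectId("5c9b9a0289d2b311b150b92c"), "Microsoft");
        System.out.println(toDto(client)); // ClientDTO[id=5c9b9a0289d2b311b150b92c, name=Microsoft]
    }
}
```

The advantage is that serialization stays trivial everywhere the DTO is used, at the cost of converting back to ObjectId when an id comes in from a request.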

Gson: How do I deserialize an inner JSON object to a map if the property name is not fixed?

My client retrieves JSON content as below:
{
"table": "tablename",
"update": 1495104575669,
"rows": [
{"column5": 11, "column6": "yyy"},
{"column3": 22, "column4": "zzz"}
]
}
In the rows array content, the keys are not fixed. I want to retrieve each key and value and save them into a Map using Gson 2.8.x.
How can I configure Gson to deserialize this simply?
Here is my idea:
public class Dataset {
    private String table;
    private long update;
    private List<Rows> lists; // <-- little confused here,
    // or private List<HashMap<String, Object>> lists
    // setter/getter
}
public class Rows {
    private HashMap<String, Object> map;
    ....
}
Dataset k = gson.fromJson(jsonStr, Dataset.class);
log.info(k.getRows().size()); // <-- I got two null objects
Thanks.
Gson does not support such a thing out of the box. It would be nice if you could make the property name fixed. If not, you have a few options that would probably help you:
Option #1: Just rename the Dataset.lists field to Dataset.rows, if the property name is fixed as rows.
Option #2: If the possible name set is known in advance, suggest alternative names to Gson using @SerializedName.
Option #3: If the possible name set is really unknown and may change in the future, you might want to make it fully dynamic using a custom TypeAdapter (streaming mode; requires less memory but is harder to use) or a custom JsonDeserializer (object mode; requires more memory to store intermediate tree views, but is easier to use) registered with GsonBuilder.
For option #2, you can simply add the alternative names:
@SerializedName(value = "lists", alternate = "rows")
final List<Map<String, Object>> lists;
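A runnable sketch of option #2 (assuming Gson 2.4+, where alternate names were introduced; the Dataset here is trimmed to the relevant field):

```java
import com.google.gson.Gson;
import com.google.gson.annotations.SerializedName;
import java.util.List;
import java.util.Map;

public class Main {
    static class Dataset {
        String table;
        long update;
        // Gson accepts "lists" as the primary name and "rows" as an alternative
        @SerializedName(value = "lists", alternate = "rows")
        List<Map<String, Object>> lists;
    }

    static Dataset parse(String json) {
        return new Gson().fromJson(json, Dataset.class);
    }

    public static void main(String[] args) {
        String json = "{\"table\":\"tablename\",\"update\":1495104575669,"
                + "\"rows\":[{\"column5\":11,\"column6\":\"yyy\"},{\"column3\":22,\"column4\":\"zzz\"}]}";
        System.out.println(parse(json).lists);
        // → [{column5=11.0, column6=yyy}, {column3=22.0, column4=zzz}]
    }
}
```

Note that when deserializing into Object, Gson turns every JSON number into a Double by default, hence 11.0 rather than 11.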
For option #3, bind a downstream List<Map<String, Object>> type adapter and try to detect the name dynamically. Note that I omit a deserialization strategy for the Rows class for simplicity; I believe you might want to remove the Rows class in favor of a simple Map<String, Object>. Another note: declare the field as Map, and try not to specify collection implementations. Hash maps are unordered, but telling Gson you are going to deal with a Map lets it pick an ordered map such as LinkedTreeMap (Gson internals) or LinkedHashMap, which might be important for datasets.
// Type tokens are immutable and can be declared constants
private static final TypeToken<String> stringTypeToken = new TypeToken<String>() {
};
private static final TypeToken<Long> longTypeToken = new TypeToken<Long>() {
};
private static final TypeToken<List<Map<String, Object>>> stringToObjectMapListTypeToken = new TypeToken<List<Map<String, Object>>>() {
};

private static final Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new TypeAdapterFactory() {
            @Override
            public <T> TypeAdapter<T> create(final Gson gson, final TypeToken<T> typeToken) {
                if ( typeToken.getRawType() != Dataset.class ) {
                    return null;
                }
                // If the actual type token represents the Dataset class, then pick the bunch of downstream type adapters
                final TypeAdapter<String> stringTypeAdapter = gson.getDelegateAdapter(this, stringTypeToken);
                final TypeAdapter<Long> primitiveLongTypeAdapter = gson.getDelegateAdapter(this, longTypeToken);
                final TypeAdapter<List<Map<String, Object>>> stringToObjectMapListTypeAdapter = gson.getDelegateAdapter(this, stringToObjectMapListTypeToken);
                // And compose the bunch into a single dataset type adapter
                final TypeAdapter<Dataset> datasetTypeAdapter = new TypeAdapter<Dataset>() {
                    @Override
                    public void write(final JsonWriter out, final Dataset dataset) {
                        // Omitted for brevity
                        throw new UnsupportedOperationException();
                    }

                    @Override
                    public Dataset read(final JsonReader in)
                            throws IOException {
                        in.beginObject();
                        String table = null;
                        long update = 0;
                        List<Map<String, Object>> lists = null;
                        while ( in.hasNext() ) {
                            final String name = in.nextName();
                            switch ( name ) {
                            case "table":
                                table = stringTypeAdapter.read(in);
                                break;
                            case "update":
                                update = primitiveLongTypeAdapter.read(in);
                                break;
                            default:
                                lists = stringToObjectMapListTypeAdapter.read(in);
                                break;
                            }
                        }
                        in.endObject();
                        return new Dataset(table, update, lists);
                    }
                }.nullSafe(); // Making the type adapter null-safe
                @SuppressWarnings("unchecked")
                final TypeAdapter<T> typeAdapter = (TypeAdapter<T>) datasetTypeAdapter;
                return typeAdapter;
            }
        })
        .create();
final Dataset dataset = gson.fromJson(jsonReader, Dataset.class);
System.out.println(dataset.lists);
The code above would print then:
[{column5=11.0, column6=yyy}, {column3=22.0, column4=zzz}]
