Flowable audit log - audit-logging

I want to audit an entire application built in Flowable. I am a Spring Boot developer, and I am finding it difficult to implement an audit log in Flowable. I am looking for something like Envers in Spring Boot, which audits the specific entities it is annotated on.

In Flowable BPMN there is a facility for event logging. It is disabled by default; when enabled, it logs events to the database table ACT_EVT_LOG.
Flowable also provides the facility to add a custom event listener. Using this you can achieve the auditing functionality.
Reference: Flowable Event Logging
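To make this concrete, here is a minimal sketch of wiring both pieces in a Spring Boot app: enabling the built-in database event logging and registering a custom listener. The class names `FlowableAuditConfig` and `AuditEventListener` are my own; the exact methods on `FlowableEventListener` vary slightly between Flowable 6 versions, so treat this as a version-dependent sketch.

```java
import java.util.List;

import org.flowable.common.engine.api.delegate.event.FlowableEvent;
import org.flowable.common.engine.api.delegate.event.FlowableEventListener;
import org.flowable.spring.SpringProcessEngineConfiguration;
import org.flowable.spring.boot.EngineConfigurationConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlowableAuditConfig {

    // Turns on the built-in event logger (writes to ACT_EVT_LOG) and
    // registers a custom listener that sees every engine event.
    @Bean
    public EngineConfigurationConfigurer<SpringProcessEngineConfiguration> auditConfigurer() {
        return config -> {
            config.setEnableDatabaseEventLogging(true);
            config.setEventListeners(List.of(new AuditEventListener()));
        };
    }

    static class AuditEventListener implements FlowableEventListener {

        @Override
        public void onEvent(FlowableEvent event) {
            // Persist the event wherever your audit trail lives
            // (a custom table, a message queue, etc.).
            System.out.println("Audit: " + event.getType());
        }

        @Override
        public boolean isFailOnException() {
            // Don't roll back engine operations if auditing fails.
            return false;
        }

        @Override
        public boolean isFireOnTransactionLifecycleEvent() {
            return false;
        }

        @Override
        public String getOnTransaction() {
            return null;
        }
    }
}
```

Returning `false` from `isFailOnException()` keeps a broken audit sink from failing your business transactions, which is usually what you want for logging concerns.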

Related

Spring Boot Micrometer Influxdb custom data insert

I have a Spring Boot application which consumes data from a Kafka topic. I am using Micrometer and InfluxDB for monitoring. I read in the documentation that by adding micrometer-registry-influx we automatically enable exporting data to InfluxDB. I have some questions on this:
What kind of data does Micrometer automatically add to InfluxDB?
Can we add custom data to InfluxDB specific to my application?
How can I publish custom or application-specific data to InfluxDB?
How can I disable adding the default data to InfluxDB?
As I understand from the documentation, the standard output set is described here:
Adding your own metrics
Meter filters (here you can exclude the standard metrics accordingly)
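A short sketch of both answers together: registering a custom application metric via the `MeterRegistry` and excluding a group of standard metrics with a `MeterFilter`. The metric name `app.kafka.records.consumed` and the tag value `my-topic` are placeholders of mine; in a real Spring Boot app you would inject the auto-configured `MeterRegistry` rather than creating a `SimpleMeterRegistry`.

```java
import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.config.MeterFilter;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class KafkaConsumerMetrics {

    private final Counter recordsConsumed;

    public KafkaConsumerMetrics(MeterRegistry registry) {
        // Custom application-specific metric; once registered, the Influx
        // registry exports it like any built-in metric.
        this.recordsConsumed = Counter.builder("app.kafka.records.consumed")
                .description("Records read from the topic")
                .tag("topic", "my-topic") // placeholder tag value
                .register(registry);
    }

    public void onRecord() {
        recordsConsumed.increment();
    }

    public static void main(String[] args) {
        SimpleMeterRegistry registry = new SimpleMeterRegistry();
        // MeterFilter: deny the standard JVM metrics, keep everything else.
        registry.config().meterFilter(MeterFilter.denyNameStartsWith("jvm"));

        KafkaConsumerMetrics metrics = new KafkaConsumerMetrics(registry);
        metrics.onRecord();
        System.out.println(registry.get("app.kafka.records.consumed").counter().count());
    }
}
```

Default metric groups can also be switched off declaratively, e.g. with a property such as `management.metrics.enable.jvm=false` (the exact property keys depend on your Spring Boot version, so check the actuator metrics documentation for yours).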

Put trace id for each message in Kafka listener

How can I use KafkaListenerAnnotationBeanPostProcessor to add a trace id to each message coming to the consumer?
I want to avoid using ThreadContext.put() in each of the listeners. What is the best practice followed for this purpose? Is there a better way of doing it without using KafkaListenerAnnotationBeanPostProcessor?
Also, I cannot use Sleuth because it is creating some issues with my application. Any help would be appreciated.
Most tracing libraries (including Spring Cloud Sleuth, which internally uses Brave as its tracing library) generally use a ThreadLocal to store the tracing context. So it would be simpler to use a ThreadLocal yourself if all you want is a trace id; of course, in that case you would need to propagate the trace id yourself. You can use Spring AOP to apply the ThreadLocal trace-id generation/cleanup logic in a non-invasive way. Another approach is to integrate the Brave library yourself, as detailed in its supported instrumentation options for Kafka clients.
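Another non-invasive option worth mentioning: Spring Kafka's `RecordInterceptor`, set once on the listener container factory, runs before every listener invocation, so the trace id ends up in the MDC without touching individual listeners. The `"traceId"` header and MDC key names below are my own convention; note also that newer spring-kafka versions use a two-argument `intercept(record, consumer)` variant, so adapt to your version.

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;
import org.slf4j.MDC;
import org.springframework.kafka.listener.RecordInterceptor;

// One interceptor on the container factory instead of ThreadContext.put()
// in every listener method.
public class TraceIdRecordInterceptor<K, V> implements RecordInterceptor<K, V> {

    @Override
    public ConsumerRecord<K, V> intercept(ConsumerRecord<K, V> record) {
        // Reuse the producer's trace id if it travelled in a header,
        // otherwise mint a fresh one for this record.
        Header header = record.headers().lastHeader("traceId");
        String traceId = header != null
                ? new String(header.value(), StandardCharsets.UTF_8)
                : UUID.randomUUID().toString();
        MDC.put("traceId", traceId); // picked up by %X{traceId} in the log pattern
        return record;
    }
}
```

Registration is one line on your factory bean, e.g. `factory.setRecordInterceptor(new TraceIdRecordInterceptor<>());`. Remember to clear the MDC after processing (newer spring-kafka versions offer an `afterRecord` callback for exactly that) so the id does not leak onto the next record handled by the same thread.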

Can I use together: Zipkin, Sleuth, MDC, ELK, Jaeger?

I have read many articles, and I figured out that I just need to include the starters in Spring Boot.
Can anyone sort this out: does Sleuth create the MDC (Mapped Diagnostic Context)? Does Sleuth create the record ID that Zipkin uses? Can I see this ID in Kibana, or do I need to use the Zipkin API? Are there best practices for using all of these together? Does Jaeger substitute for both Zipkin and Sleuth, or how does it fit in?
Yes, you can, and I have shown that numerous times during my presentations (https://toomuchcoding.com/talks), and we describe it extensively in the documentation (https://docs.spring.io/spring-cloud-sleuth/docs/current/reference/html/). Sleuth sets up your logging pattern, which you can then parse and visualize using the ELK stack. Sleuth takes care of tracing-context propagation and can send the spans to a span storage (e.g. Zipkin or Jaeger). Sleuth does take care of updating the MDC for you. Please always read the documentation and the project page before filing a question.
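For the Kibana part, all that is needed is a log pattern that prints Sleuth's MDC fields so Logstash can index them. A minimal sketch of a `logback-spring.xml`, assuming Sleuth 3.x MDC key names (`traceId`/`spanId`; older versions used `X-B3-TraceId`-style keys):

```xml
<!-- logback-spring.xml: print the trace/span ids from the MDC so the
     ELK stack can parse them out of each log line. -->
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%X{traceId:-},%X{spanId:-}] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
```

With that in place you can search Kibana by trace id directly, without going through the Zipkin API.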

How to read events for newly modified records from Keycloak in a Spring Boot app

We have the following flow and scenario for polling data into my Spring Boot app:
Active Directory --> Keycloak --> Spring Boot App
We are able to poll data in, and Keycloak has a provision to poll data from Active Directory into its DB periodically whenever a record changes there. But to get the same changed (newly added / deleted / updated) records from Keycloak to the Spring Boot application, there is some eventing option, and I do not see how and where to implement it.
I suppose there should be a listener in the Spring Boot application which gets triggered on any record change (newly added / deleted / updated) in Keycloak.
The event listener is implemented on the Keycloak side. Here is an example of how to implement an event listener that logs events to the console: Event Listener Example
Instead of logging, you will need to send notifications to your Spring Boot application in any suitable way:
you can implement an endpoint in your Spring Boot application that is invoked by the event listener's code
or the event listener can send, say, a JMS message, and your Spring Boot application subscribes to the JMS topic
etc.
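A sketch of the first option: a Keycloak `EventListenerProvider` that forwards events over HTTP to the Spring Boot app. The class name, payload format, and the `/keycloak-events` endpoint URL are my assumptions; a real deployment also needs a matching `EventListenerProviderFactory` plus a `META-INF/services` registration, and must be packaged into Keycloak as a provider JAR.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.keycloak.events.Event;
import org.keycloak.events.EventListenerProvider;
import org.keycloak.events.admin.AdminEvent;

// Runs inside Keycloak; forwards each event to the Spring Boot app.
public class SpringBootNotifyingListener implements EventListenerProvider {

    private static final HttpClient CLIENT = HttpClient.newHttpClient();
    // Assumed endpoint exposed by the Spring Boot application.
    private static final String TARGET = "http://localhost:8080/keycloak-events";

    @Override
    public void onEvent(Event event) {
        send("user-event:" + event.getType());
    }

    @Override
    public void onEvent(AdminEvent event, boolean includeRepresentation) {
        // Admin events cover the created/deleted/updated user records
        // that arrive via the Active Directory sync.
        send("admin-event:" + event.getOperationType() + ":" + event.getResourceType());
    }

    private void send(String payload) {
        HttpRequest request = HttpRequest.newBuilder(URI.create(TARGET))
                .header("Content-Type", "text/plain")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        // Fire and forget: notifying must not block Keycloak's request thread.
        CLIENT.sendAsync(request, HttpResponse.BodyHandlers.discarding());
    }

    @Override
    public void close() {
        // nothing to release
    }
}
```

The JMS variant has the same shape; only `send` changes to publish a message instead of posting HTTP, which decouples the two systems better if the Spring Boot app can be down while Keycloak is up.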

Spring Realtime Data Sending With Database

I have a small e-commerce application built with Spring Boot, and I want to make the REST endpoint where the products are listed real-time, so that the endpoint is updated whenever a product is created or updated.
I'm confused about how to listen to the database in real time. I found SSE (Server-Sent Events) for real-time data sending, but I couldn't find how to do this with the database.
Can you please suggest the best methodology?
If you are using Hibernate for managing data, then use Hibernate Envers and process the model after a create/update in the DB.
My suggestion is: whenever a create/update action happens in Hibernate, Hibernate Envers is triggered; then use an Akka actor to process the event.
Hibernate -> Hibernate Envers -> Akka (business layer) -> REST endpoint
Thanks,
Vimalesh
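A lighter alternative to the Envers -> Akka pipeline, if you don't need the audit history: a plain JPA entity listener that pushes change notifications straight to SSE subscribers. This is my own sketch, not the answerer's design; `Product` is assumed to be your entity, annotated with `@EntityListeners(ProductStreamController.ProductChangeListener.class)`.

```java
import java.io.IOException;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import jakarta.persistence.PostPersist;
import jakarta.persistence.PostUpdate;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class ProductStreamController {

    private static final List<SseEmitter> EMITTERS = new CopyOnWriteArrayList<>();

    // Clients subscribe here and receive an event per product change.
    @GetMapping(path = "/products/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public SseEmitter stream() {
        SseEmitter emitter = new SseEmitter(Long.MAX_VALUE); // long-lived connection
        emitter.onCompletion(() -> EMITTERS.remove(emitter));
        EMITTERS.add(emitter);
        return emitter;
    }

    // JPA callback: fires after any insert or update of the annotated entity.
    public static class ProductChangeListener {
        @PostPersist
        @PostUpdate
        public void onChange(Object product) {
            for (SseEmitter emitter : EMITTERS) {
                try {
                    emitter.send(product); // serialized by the configured converter
                } catch (IOException e) {
                    EMITTERS.remove(emitter); // client went away
                }
            }
        }
    }
}
```

Note the JPA callback only sees changes made through this application; if other processes write to the same database, you would need database-level change capture (e.g. triggers or a CDC tool) instead.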
