Sharing events between two Laravel applications

Is it possible to have one Laravel application listen for events triggered in another?
I've built a REST API to complement an existing web app. It uses the same database, but it is a separate application, and certain events clear some cached results. At the moment the events are not shared between the two applications, so I'm getting stale cached results in spite of having updated the database. Is there a way for one app to pick up on events fired by the other? I haven't found anything about this in the docs.

Redis is completely agnostic about which application is listening to it. As long as both applications use the same Redis instance, you can set your broadcast driver to redis, fire your events in one application, and listen for them in the other. Note, however, that Laravel binds listeners to a specific event class, so the class would still have to exist in the listening application for you to define a listener for it.
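To illustrate the mechanics: Laravel's redis broadcast driver publishes a JSON payload (typically with event and data keys) on the broadcast channel, so any process on the same Redis instance can subscribe to those channels. Below is a minimal sketch in TypeScript using ioredis; the channel pattern and event name are assumptions for illustration, and the channel prefix depends on your Redis configuration.

```typescript
// Second application subscribing to events broadcast by a Laravel app
// through the shared Redis instance (BROADCAST_DRIVER=redis).
import Redis from "ioredis";

const subscriber = new Redis({ host: "127.0.0.1", port: 6379 });

// "cache-events" is a hypothetical channel name; Laravel may prefix the
// channel (e.g. "laravel_database_cache-events") depending on config.
subscriber.psubscribe("*cache-events", (err, count) => {
  if (err) throw err;
  console.log(`listening on ${count} channel pattern(s)`);
});

subscriber.on("pmessage", (_pattern, channel, message) => {
  // Laravel publishes JSON of the form { event, data, ... }, where "event"
  // identifies the broadcast event (by default the class name).
  const payload = JSON.parse(message);
  if (payload.event === "CacheInvalidated") { // hypothetical event name
    console.log("clearing cached results for:", payload.data);
  }
});
```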

Related

Using Spring or Lambda for bulk event trigger

Looking for some help on an application design. I am using the Spring Framework and hosting the application in AWS.
I am working on an enterprise Java web application that is supposed to handle events when their trigger time is reached. For example, consumers can set an event to begin on 12/20/22 at 07:35 AM, and the system is supposed to send a notification when that time is reached.
I can store these events in a database along with their trigger time and set up a Spring scheduler (@Scheduled) to run every minute and process events whose trigger time has been reached. My only concern with this approach is that there could be hundreds or thousands of events to trigger in any given minute, and they may not all be processed within one minute.
Is there an alternative way to design this? I don't know if Spring offers a feature where I could create these events and the framework triggers them when their trigger time is reached. That way, I could stay away from managing the scheduling and triggering part.
I am using AWS to host this application, so another option I'm considering is creating an AWS Lambda for every such event and letting AWS manage the triggering. That way, I stay away from managing the triggers.
Let me know your views, or if you have come across similar problems, how you resolved them.
You can consider using spring-cloud-dataflow to manage this as tasks and streams.
You create a custom batch application that uses @Scheduled to check your database for events that are due and then sends them to a stream. You can use Spring Integration APIs to interact with RabbitMQ or Kafka topics.
The event should contain enough information needed to process it.
You then have a stream application that produces the content and sends it via email, or passes it on to a separate stream app that sends the email.
https://dataflow.spring.io/docs/stream-developer-guides/programming-models/
The flow will look something like:
:mail_events | message-processor | message-sender
You will configure the mail_events property to match the topic created and configured for your mail-event-batch application.
You can use Spring Cloud Data Flow to manage the mail-event-batch application as well.
You can scale each application: https://dataflow.spring.io/docs/recipes/scaling/
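Framework specifics aside, the scheduler's only job in this design is to find due events and hand them off to a topic; scaling the downstream stream apps is what keeps a burst of thousands of events from blocking the poll loop. Below is a sketch of that polling-then-publish shape, written in TypeScript for illustration, with the database query and broker publish stubbed out as hypothetical functions.

```typescript
interface MailEvent {
  id: string;
  triggerTime: Date;
  recipient: string; // enough information to process the event downstream
}

// Hypothetical stand-in for the database query, e.g.
// SELECT * FROM events WHERE trigger_time <= :now AND processed = false
async function findDueEvents(now: Date): Promise<MailEvent[]> {
  return [];
}

// Hypothetical stand-in for a RabbitMQ/Kafka producer call.
async function publishToTopic(topic: string, event: MailEvent): Promise<void> {
  console.log(`publish to ${topic}: ${event.id}`);
}

// Equivalent of a @Scheduled method running every minute: publishing is
// cheap and fire-and-forget, so the loop finishes well within the minute
// even when thousands of events come due at once.
async function pollOnce(): Promise<void> {
  const due = await findDueEvents(new Date());
  await Promise.all(due.map((e) => publishToTopic("mail_events", e)));
}

setInterval(() => void pollOnce().catch(console.error), 60_000);
```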

Spring Boot - Push message to Angular UI

I want to develop an application where I push messages (or data) to the UI from a backend Spring Boot application.
I have the following requirement -
Consider a REST service that accepts data from other applications using the POST method.
This data will be pushed to the UI.
OR
Consider a background process which generates events, and we want to push these events to the UI.
For this, I came across the WebSocket support that we can use in a Spring Boot application.
However, are there any other settings required to make it possible to push the incoming data to the UI?
Any help is appreciated.
The backend cannot magically push updates to a client UI. The backend has no way of knowing where the UI exists (i.e. what the UI's IP address is), and even if it did, it may not be able to establish a connection (due to firewalls or NAT).
For this reason a client UI has to request updates. One way this could be done is to have a timer in the UI application that polls for updates via REST. But this is essentially what websockets do - with much less overhead.
This is how common applications that you use every day work all the time, so I'm not sure why you do not want to go down the websockets route.
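For illustration, the polling fallback mentioned above can be as simple as the sketch below; the /api/updates endpoint and the interval are assumptions.

```typescript
// Client-side polling: ask the backend for updates on a fixed timer.
// Every request pays full HTTP overhead, which is the cost that a
// websocket connection avoids.
async function poll(): Promise<void> {
  const res = await fetch("/api/updates"); // hypothetical endpoint
  if (res.ok) {
    console.log("updates:", await res.json());
  }
}

setInterval(() => void poll().catch(console.error), 5_000);
```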
...
Starting with Spring 5.0.5.RELEASE, it isn't necessary to do any customization because of the improvement of the @SendToUser annotation, which allows us to send a message to a user destination via “/user/{sessionId}/…” rather than “/user/{user}/…“.
That means the annotation works based on the session id of the input message, effectively sending a reply to a destination private to the session:
...
There is a good example over here:
https://www.baeldung.com/spring-websockets-sendtouser
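On the client side (the Angular UI in this question), the subscription to that private user destination could look like the sketch below, using the @stomp/stompjs library; the broker URL and destination names are assumptions.

```typescript
import { Client } from "@stomp/stompjs";

// Connect to the Spring STOMP endpoint (hypothetical URL).
const client = new Client({ brokerURL: "ws://localhost:8080/ws" });

client.onConnect = () => {
  // With @SendToUser on the backend, "/user/queue/replies" resolves to a
  // destination private to this session (hypothetical queue name).
  client.subscribe("/user/queue/replies", (message) => {
    console.log("pushed from server:", JSON.parse(message.body));
  });

  // Optional: trigger a server-side handler that replies with @SendToUser.
  client.publish({ destination: "/app/request", body: "hello" });
};

client.activate();
```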

Is event definition in the model file (.cto) necessary when using composer-rest-server?

If I want to develop a Node.js application for a Hyperledger Fabric Composer business network, it is necessary to define (in the model file) events that are emitted whenever a transaction takes place. Otherwise, the Node.js application is not "informed" about those transactions (see https://hyperledger.github.io/composer/latest/business-network/publishing-events.html).
Defining the events in the model file (and emitting them in the respective transaction processor functions) makes it possible for the Node.js application to subscribe to those events (and therefore to be informed about transactions happening).
So far I understand it.
My question is the following:
When I use the composer rest server (i.e. the automatically generated node.js application) instead of developing my "own" node.js application, do I still have to define the events for the transactions defined in the model file (.cto)?
Or is this not necessary because the composer rest server does not use those events anyway?
You would still need to define events in your model, then publish them in your transaction code, and subsequently consume (subscribe to) them from a client - whether via composer-client, websockets, etc. This holds regardless of whether you're using the REST APIs, the Composer client APIs, or even the CLI.
So if you POST a transaction from your REST client (e.g. a browser), which sends it to the REST server, you must have defined an event in your model AND your transaction logic must emit that event for any listener to process it.
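As a concrete sketch of the consuming side, a client subscription with the composer-client API might look like the following; the business network card name is an assumption, and composer-client ships without TypeScript typings, so treat this as illustrative.

```typescript
import { BusinessNetworkConnection } from "composer-client";

async function listen(): Promise<void> {
  const connection = new BusinessNetworkConnection();

  // "admin@my-network" is a hypothetical business network card.
  await connection.connect("admin@my-network");

  // Fires for every event emitted by a transaction processor function.
  // Without an event definition in the .cto model, there is nothing
  // for the transaction code to emit and nothing to subscribe to.
  connection.on("event", (event: unknown) => {
    console.log("event received:", event);
  });
}

listen().catch(console.error);
```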

EasyNetQ / RabbitMQ consuming events in Web API

I have created a Web API which allows messages to be sent to the queue. My Web API is designed with CQRS and DDD in mind. I want my message consumer to always be waiting for any messages on the queue. Currently, messages are only read when a request to the API hits the relevant method.
Is there a way, using a console application or something else that is always running, to consume messages at any given time without having to make a request to the Web API - so more of an automated task?
If so, how do I go about it? If it's a console app, how would I keep it always running (IIS?), and is there a way to use Dependency Injection, as I need to consume the message and then send it to my repository, which lives in a separate solution?
Or is there a way to make EasyNetQ run at startup?
The best way to handle this situation in your case is to subscribe to bus events over AMQP through the EasyNetQ library. The recommended way of hosting it is to write a Windows service using the Topshelf library and subscribe to the bus events inside that service on start.
IIS processes and threads are not reliable for such tasks, as they are designed to be recycled on a regular basis, which may cause instabilities and inconsistencies in your application.
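The essential shape is a long-lived process that holds a queue subscription open, independent of any web request. EasyNetQ is a .NET library; to keep the sketches here in one language, the sketch below shows that always-on consumer shape in TypeScript using amqplib, with the queue name assumed.

```typescript
import * as amqp from "amqplib";

async function main(): Promise<void> {
  // Connect once at process start; the process then stays alive
  // waiting on the queue, with no web request needed.
  const connection = await amqp.connect("amqp://localhost");
  const channel = await connection.createChannel();
  await channel.assertQueue("commands", { durable: true }); // hypothetical queue

  await channel.consume("commands", (msg) => {
    if (msg === null) return;
    console.log("received:", msg.content.toString());
    // Hand off to your domain handlers / repository here, then ack so
    // the broker knows the message has been processed.
    channel.ack(msg);
  });
}

main().catch(console.error);
```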
and is there a way to use Dependency Injection, as I need to consume the message and then send it to my repository, which lives in a separate solution?
It is better to create a separate question for this, as it is off-topic here. It also requires further elaboration, as it is not clear what specifically you are struggling with.

Persistent subscriber in Firebase

Is there built-in support or any way of implementing a persistent subscription in Firebase?
I need to set up a backend which reacts to certain events in my Firebase database. If the backend has crashed or is being restarted I need it to catch up with anything that has happened while it was down.
For example, I want to re-index certain objects in ElasticSearch when they change. If the backend is down I need to re-index any changed objects when the backend comes back up again.
Nothing is built in for that, although you can definitely build it on top of Firebase by adding an isIndexed or isDirty property to the items.
But the more common approach is to push the items that need to be re-indexed onto a queue and use a worker process that removes them from the queue once they've been handled. I highly recommend using firebase-queue for that.
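A minimal sketch of that queue-and-worker shape with the firebase-admin SDK is shown below; the queue path and the re-index step are assumptions, and firebase-queue layers claiming and retry semantics on top of this basic pattern. Because the queue entries persist in the database, anything pushed while the worker was down is still there when it restarts.

```typescript
import * as admin from "firebase-admin";

// Assumes credentials/databaseURL are provided via the environment
// (e.g. GOOGLE_APPLICATION_CREDENTIALS and FIREBASE_CONFIG).
admin.initializeApp();

const queue = admin.database().ref("reindex-queue"); // hypothetical path

queue.on("child_added", async (snapshot) => {
  // Hypothetical re-index step, e.g. writing the object to ElasticSearch.
  console.log(`re-indexing ${snapshot.key}`);

  // Remove the task only after it has been handled, so a crash
  // mid-processing leaves it queued for the next run.
  await snapshot.ref.remove();
});
```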
