Emit NestJS event locally (in-process EventPattern trigger)

I am using @nestjs/microservices with the latest version of NestJS 6.x. I have inter-process communication working correctly using @EventPattern and ClientProxy.emit.
This works fine locally, but when I deploy I use AWS EventBridge for communication directly to a Lambda. This could of course be any transport mechanism, so my question is not related to EventBridge per se.
What I need to do is receive a payload into my Lambda and then push it into the NestJS microservices lifecycle so that my event can be processed by its @EventPattern(...) handler. So I am looking for something like a publishLocal or emitLocal, so that the communication stays in-process.
I would be bootstrapping the NestJS application context with NestFactory.createApplicationContext(...); I am just not sure how I can process the subscription payload in-process in a decoupled fashion.
Any ideas would be greatly appreciated, thanks in advance.
Cheers,
Mark
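Nest does not ship a publishLocal/emitLocal. One common workaround (a sketch, not an official Nest API) is to bootstrap the application context inside the Lambda handler with NestFactory.createApplicationContext(AppModule), resolve the provider whose method carries @EventPattern, and invoke it directly with the deserialized payload. The dispatch step itself can be illustrated without any Nest dependencies: a small registry mapping event patterns to handler functions, which is essentially what the microservices server does when a message arrives. All names here ('user_created', the payload shape) are illustrative.

```typescript
// Minimal in-process event dispatcher sketch (no Nest dependencies).
// In a real app you would resolve the handler's provider from the
// Nest application context and call its method; this shows the idea.

type EventHandler = (payload: unknown) => Promise<void> | void;

class LocalEventBus {
  private handlers = new Map<string, EventHandler[]>();

  // Register a handler for a pattern (what @EventPattern records via metadata).
  on(pattern: string, handler: EventHandler): void {
    const list = this.handlers.get(pattern) ?? [];
    list.push(handler);
    this.handlers.set(pattern, list);
  }

  // Emit locally: invoke every matching handler in-process.
  async emitLocal(pattern: string, payload: unknown): Promise<number> {
    const list = this.handlers.get(pattern) ?? [];
    for (const h of list) await h(payload);
    return list.length; // number of handlers invoked
  }
}

// Usage: a Lambda entry point would deserialize the EventBridge payload
// and hand it to the bus.
const bus = new LocalEventBus();
const seen: unknown[] = [];
bus.on('user_created', (p) => { seen.push(p); });

bus.emitLocal('user_created', { id: 42 }).then((n) => {
  console.log(n, JSON.stringify(seen)); // 1 [{"id":42}]
});
```

The decoupling comes from registering handlers against pattern strings rather than calling services directly, so the Lambda entry point never needs to know which provider handles which event.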

Related

How can I run a WebSocket in Apache Flink (serverless Java)?

I have a Java program to run in Apache Flink on AWS, and I want real-time communication through a WebSocket. How can I integrate a serverless WebSocket with Apache Flink in Java?
Thank you.
Flink is designed to help you process and move data continuously between storage or streaming solutions. It is not intended to serve websockets directly, and would not work well doing so, for these reasons:
When submitting a job, the runtime serializes your logic and moves it to other TaskManager instances so that it can parallelize the work. These can be on another machine entirely. If you were intending to service a websocket with that code, it has just moved elsewhere!
TaskManagers can be stopped and restarted (a scaling event, recovery from a checkpoint/savepoint, etc.). When that happens, your websocket connection will be cut.
Also, the Flink planner can decide that your source functions need to be read twice if it helps the processing. This means that your websockets would need to maintain a history of messages received, and make sure each one is sent once to each operator instance.
That being said, you can have a webserver manage the websocket, piping messages back and forth to a Kafka topic, which Flink can then operate on.
Since you're talking about AWS, I suggest you look at their WebSocket API Gateway service. I believe it can be connected to Kinesis, which Flink can read from and write to easily.
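The suggested architecture (a websocket gateway in front, a topic in between, Flink consuming independently) can be sketched as a small bridge function. The producer below is a deliberate stub standing in for a real Kafka or Kinesis client; the topic name is illustrative.

```typescript
// Sketch of a websocket -> topic bridge. `Producer` is a stub interface;
// in a real deployment it would wrap a Kafka client or a Kinesis put call.
type Producer = { send: (topic: string, value: string) => Promise<void> };

function makeBridge(producer: Producer, topic: string) {
  // The returned function would be wired as the websocket server's
  // per-message callback; Flink then reads the topic on its own schedule,
  // so restarts or re-reads on the Flink side never touch the socket.
  return async (message: string): Promise<void> => {
    await producer.send(topic, message);
  };
}

// Usage with an in-memory stub producer:
const forwarded: string[] = [];
const stub: Producer = { send: async (_t, v) => { forwarded.push(v); } };
const onMessage = makeBridge(stub, 'ws-events');

Promise.all([onMessage('hello'), onMessage('world')]).then(() => {
  console.log(forwarded.join(',')); // hello,world
});
```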

Capture Windchill events

I am using Windchill 11.1 M020. What's the best way to capture events from Windchill?
For context: I have a third-party Java application that runs on a different host than Windchill, and I'd like it to be triggered when check-in events, version changes, or any other events happen.
I did some research, and here's what I found:
We can capture events through a custom service listener, but this method is not clean enough, since we need to develop custom service code and run it (with an assigned port) inside the Windchill container.
We can capture Windchill events through the Windchill ESI service and Info*Engine, but I'm not sure how to configure ESI to listen to events and publish them to a broker, for example an MQ broker. I don't want to use EMS, to avoid any licence costs.
Any recommendations for capturing events and publishing them to a messaging broker?
Thank you.
The only way I know to capture events from Windchill is to implement a listener. In Windchill you can implement a service that is notified by Windchill when objects change their state, when they have been checked in, and so on.
As a Windchill service, your code runs in the process of the Windchill method server, so you have to devise some way to communicate what happened in Windchill to the outside. You could use a web service, a REST call, write to a shared log file, or something like that.
You can look at this PTC Community (Windchill Discussions) thread to start digging into Windchill listeners:
https://community.ptc.com/t5/Windchill/How-to-implement-listeners/td-p/674877

Spring Boot - Push message to Angular UI

I want to develop an application where I push messages (or data) to the UI from a backend Spring Boot application.
I have the following requirement:
Consider a REST service that accepts data from other applications using the POST method.
This data will be pushed to the UI.
OR
Consider a background process that generates events, and we want to push these events to the UI.
For this, I came across the WebSocket support that we can use in a Spring Boot application.
However, are there any other settings required to make it possible to push the incoming data to the UI?
Any help is appreciated.
Thanks,
Avinash Deshmukh
The backend cannot magically push updates to a client UI. The backend has no way of knowing where the UI exists (i.e. what the UI's IP address is), and even if it did, it may not have access to establish a connection (due to firewalls or a NAT).
For this reason, a client UI has to request updates. One way this could be done would be to have a timer in the UI application that polls for updates via REST. But this is essentially what websockets do, with much less overhead.
This is how common applications that you use every day work all the time, so I'm not sure why you wouldn't want to go down the websockets route.
...
Starting with Spring 5.0.5.RELEASE, it isn't necessary to do any customization, because of the improvement of the @SendToUser annotation, which allows us to send a message to a user destination via "/user/{sessionId}/…" rather than "/user/{user}/…".
That means the annotation works based on the session id of the input message, effectively sending a reply to a destination private to the session:
...
There is a good example over here:
https://www.baeldung.com/spring-websockets-sendtouser
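The "/user/{sessionId}/…" resolution quoted above can be illustrated with a small routing table. This is a dependency-free sketch of the idea only, not Spring's actual implementation; the session id and destination names are made up for the example.

```typescript
// Sketch: per-session destinations, mimicking how a user destination
// like "/user/queue/replies" is resolved to "/user/{sessionId}/queue/replies"
// so a reply reaches only the session that sent the request.
type Send = (destination: string, body: string) => void;

class UserDestinationRouter {
  private sessions = new Map<string, Send>();

  // Called when a websocket session connects.
  register(sessionId: string, send: Send): void {
    this.sessions.set(sessionId, send);
  }

  // Deliver to the private destination of one session.
  sendToUser(sessionId: string, destination: string, body: string): boolean {
    const send = this.sessions.get(sessionId);
    if (!send) return false; // session gone, nothing to push to
    send(`/user/${sessionId}${destination}`, body);
    return true;
  }
}

// Usage:
const router = new UserDestinationRouter();
const delivered: string[] = [];
router.register('abc123', (dest, body) => delivered.push(`${dest}:${body}`));
router.sendToUser('abc123', '/queue/replies', 'hi');
console.log(delivered[0]); // /user/abc123/queue/replies:hi
```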

Laravel Microservices & RabbitMQ

Just wondering: what is the best way to capture "fanout" messages from RabbitMQ in Laravel subscriber services?
Service 1 sends out the message, say UserUpdated with their UUID, and this goes into RabbitMQ now.
Service 2/3/4/n capture UserUpdated and perform their appropriate actions.
I just don't know the best way to have a long-running process in the Laravel subscribers to catch these messages and perform their own actions. I've tried multiple packages on GitHub so far, but none go into detail about where to place a class that receives the messages.
All help is much appreciated.
You can achieve that with the enqueue/laravel-queue package. It comes with Enqueue Simple Client support. The client supports pub/sub and message-bus patterns, and is friendly for use in microservice-oriented systems.

Microservice and RabbitMQ

I am new to Microservices and have a question with RabbitMQ / EasyNetQ.
I am sending messages from one microservice to another microservice.
Each microservice is a Web API. I am using CQRS, where my command handler consumes messages off the queue and does some business logic. In order to call the handler, a request currently needs to be made to the API method.
I would like to consume messages without having to explicitly call the API endpoint. Is there an automated way of doing it?
One suggestion could be creating a separate solution, a console app, that runs the RabbitMQ consumer: a while loop reads messages, then calls the web API endpoint to handle the business logic every time a new message arrives on the queue.
My aim is to create a listener or a startup task where, once messages are in the queue, it automatically picks them up and continues with the command handler, but I am not sure how to do the "automatic" part as I describe it. I was thinking of utilising an Azure WebJob that runs continuously and acts as the consumer.
Looking for a good architectural way of doing it.
Programming language being used is C#
Much Appreciated
The recommended way of hosting a RabbitMQ subscriber is by writing a Windows service using something like the Topshelf library, and subscribing to bus events inside that service on its start. We did that in multiple projects with no issues.
If you are using Azure, the best place to host RabbitMQ subscriber is in a "Worker Role".
I am using CQRS where my Command Handler would consume message off the Queue and do some business logic. In order to call the handler, it will need to make a request to the API method.
Are you sure this is real CQRS? CQRS is when you handle queries and commands differently in your domain logic. Receiving a message via a class called CommandHandler and just reacting to it is not yet CQRS.
My aim is to create a listener or a startup task where once messages are in the queue it will automatically pick it up from the Queue and continue with command handler but not sure how to do the "Automatic" way as i describe it. I was thinking to utilise Azure Webjob that will continuously be running and it will act as the Consumer. Looking for a good architectural way of doing it.
The simpler you do that, the better. Don't go searching for complex solutions until you have tried out all the simple ones. When I was implementing something similar, I just ran a pool of message-handler scripts using Linux cron. A handler popped a message off the queue, processed it, and terminated. Simple.
Using the CQRS pattern, you will have events as well, with corresponding event handlers. Since you are using RabbitMQ for asynchronous communication between the command and query sides, any message put on a specific channel on RabbitMQ can be listened to by a callback method.
Receiving messages from the queue is more complex. It works by subscribing a callback function to a queue. Whenever we receive a message, this callback function is called by the Pika library.
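The subscribe-a-callback model described above is the same idea behind EasyNetQ's Subscribe and Pika's basic_consume: register a function once, and the client library invokes it for every delivery, so no API endpoint ever needs to be hit. A dependency-free sketch of that pattern (the in-memory queue is a stand-in for RabbitMQ, and the message shape is made up):

```typescript
// Sketch of the callback-subscription pattern. The in-memory queue stands
// in for RabbitMQ; a real consumer would use a client library instead.
type Callback<T> = (msg: T) => void;

class InMemoryQueue<T> {
  private buffer: T[] = [];
  private consumer: Callback<T> | null = null;

  publish(msg: T): void {
    if (this.consumer) this.consumer(msg); // push directly to the consumer
    else this.buffer.push(msg);            // hold until someone subscribes
  }

  // Like basic_consume / Subscribe: register once, then every message
  // (including any backlog) is delivered to the callback automatically.
  subscribe(cb: Callback<T>): void {
    this.consumer = cb;
    for (const m of this.buffer.splice(0)) cb(m);
  }
}

// Usage: the command handler is just the callback; no API call needed.
const queue = new InMemoryQueue<{ type: string; uuid: string }>();
queue.publish({ type: 'UserUpdated', uuid: 'u-1' }); // sent before subscribe

const handled: string[] = [];
queue.subscribe((msg) => handled.push(`${msg.type}:${msg.uuid}`));
queue.publish({ type: 'UserUpdated', uuid: 'u-2' });

console.log(handled.join(',')); // UserUpdated:u-1,UserUpdated:u-2
```

The long-running host process (Windows service, Worker Role, WebJob, or cron-spawned script, as the answers above suggest) exists only to keep this subscription alive.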
