process "events" async with elixir and phoenix [closed] - phoenix-framework

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
I'm using Phoenix controllers to receive data via REST calls. An iOS app sends "events" for each user, and based on the event I need to calculate the score/points and send them back to the user. The calculation and sending back to the user can happen asynchronously. I'm using Firebase to communicate back to the user.
What is a good pattern for doing the calculation? It could involve a bunch of database queries to determine the score of that event. Where should this calculation happen: background workers, GenEvent, or streams within a user-specific GenServer (I have a supervised GenServer per user)?

I would look at Phoenix channels, tasks and GenServer.
Additionally, if you would like to manage a pool of GenServer workers to do the calculations and maybe send back the results for you, check out Conqueuer. I wrote this library and it is in use in production systems at my company. It uses poolboy, which is probably the most pervasive pool management library in Erlang/Elixir.
Admittedly, I do not fully understand the requirements of your system, but it does not seem to me that GenEvent has a place in them. GenEvent is about distributing events to one or more consumers. So unless you have a graph of processes that need to subscribe to events emitted from other parts of your system, I do not see a role for it.
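The pool-of-workers idea in this answer is language-agnostic. As a minimal sketch of the pattern (here in Go rather than Elixir, with the Event type and scoring rules invented purely for illustration), each worker pulls events off a shared job channel, runs the per-event calculation, and records the result:

```go
package main

import (
	"fmt"
	"sync"
)

// Event is a hypothetical user event; the fields are assumptions,
// not the asker's actual schema.
type Event struct {
	UserID string
	Kind   string
}

// score stands in for the "bunch of database queries" that
// determine the points for an event.
func score(e Event) int {
	switch e.Kind {
	case "goal":
		return 10
	default:
		return 1
	}
}

// processAsync fans events out to a fixed pool of workers, mirroring
// the supervised worker-pool idea (Conqueuer/poolboy) from the answer.
func processAsync(events []Event, workers int) map[string]int {
	var mu sync.Mutex
	totals := make(map[string]int)
	jobs := make(chan Event)
	var wg sync.WaitGroup

	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for e := range jobs {
				pts := score(e) // the per-event calculation
				mu.Lock()
				totals[e.UserID] += pts // in real life: push to Firebase
				mu.Unlock()
			}
		}()
	}
	for _, e := range events {
		jobs <- e
	}
	close(jobs)
	wg.Wait()
	return totals
}

func main() {
	totals := processAsync([]Event{
		{"alice", "goal"}, {"alice", "move"}, {"bob", "goal"},
	}, 4)
	fmt.Println(totals["alice"], totals["bob"]) // 11 10
}
```

The REST controller's only job is to drop the event onto the job channel and return immediately; the heavy database work happens in the pool.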

Related

When is it better to use websockets versus a message broker such as Kafka? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed last month. The community reviewed whether to reopen this question last month and left it closed: the original close reason(s) were not resolved.
As a product scales, APIs and a two-tier architecture incur bottlenecks, data contention, and downtime. Messages can be lost when there are thousands or millions of requests.
What makes websocket connections beneficial vs Kafka? What are the best use cases for each?
Is there an example, such as a large-scale chat application, where a hybrid of both technologies is necessary?
Websockets should be used when you need real-time interactions, such as propagating the same message to multiple users (group messaging) in a chat app.
Kafka should be used as a backbone communication layer between components of a system. It fits really well in event-driven architectures (microservices).
I see them as two different technologies, developed for two different purposes.
Kafka, for example, allows you to replay messages easily, because they are stored on disk for the configured topic retention time. Websockets are based on TCP connections (two-way communication), so they have a different use-case spectrum.

OneM2M: IPE that periodically fetches device data from FTP server [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 2 years ago.
I have a device that sends data to an FTP server. In oneM2M, I want to implement an IPE to access this data and send it to an MN-CSE.
I am thinking of periodically (say, every 5 minutes) polling the FTP server and fetching the data into the IPE, which further processes it and sends it to the MN-CSE. I just wanted to confirm whether this approach is fine or there are better ways to achieve it.
Thanks in advance.
This is a question more related to the infrastructure architecture you are planning to deploy. It is feasible, of course, but polling has the disadvantage that if the delay is too small your AE polls the FTP server unnecessarily often, and if it is too long you might miss data delivered by your device (to the FTP server). Alternatives could be:
The ftp server notifies your AE
Your AE implements the FTP server itself and can then react directly to new-data events.
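If you do go with polling, the loop itself is simple. A minimal sketch in Go, where `fetchFromFTP` and `forwardToCSE` are invented placeholders for the real FTP download and the IPE-to-MN-CSE push:

```go
package main

import (
	"fmt"
	"time"
)

// fetchFromFTP stands in for the real FTP download; the name and
// return type are assumptions made for this sketch.
func fetchFromFTP() []string {
	return []string{"temperature=21.5"}
}

// forwardToCSE stands in for the IPE pushing content to the MN-CSE;
// here it just reports how many records it forwarded.
func forwardToCSE(records []string) int {
	return len(records)
}

// pollOnce runs a single fetch-and-forward cycle.
func pollOnce() int {
	return forwardToCSE(fetchFromFTP())
}

func main() {
	// The question's 5-minute period, shortened here so the
	// sketch terminates quickly.
	ticker := time.NewTicker(10 * time.Millisecond)
	defer ticker.Stop()
	for i := 0; i < 3; i++ {
		<-ticker.C
		fmt.Println("cycle", i, "forwarded", pollOnce(), "record(s)")
	}
}
```

A real fetch step should also remember which files it has already seen (e.g. by name or timestamp), so the same data is not forwarded twice.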

How to show a message to user when another login occurs? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
I want to show a message to the user when more than one login is performed. For example, a user called John is logged into Computer A. Now John logs into Computer B. In this case the system should show a message to John on Computer A that another login has happened.
In my opinion, this could be done using a push notification service such as Pusher or Firebase Cloud Messaging. Is that correct, or is there another way of doing this?
I'm storing the user's sessions in the database as soon as a login is performed, so I can fire an event when more logins than desired occur.
Yes, Laravel pretty much has the basics built in to allow for this. If you look into the default bootstrap.js file Laravel ships with, you can see the code is already there: see https://github.com/laravel/laravel/blob/master/resources/js/bootstrap.js#L32-L41
If you are using Laravel's Echo server, you can broadcast events as described here, and within your application you'd listen for those events as explained here.
You could also make this even simpler (although I'd recommend websockets): store "active" logins in your database and display a message based on that.
Note that you can also avoid multiple logins entirely by invalidating other sessions; this feature is built into Laravel, see here.
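The detection logic behind the "store active logins" suggestion is framework-independent. A minimal sketch (in Go, with an in-memory map standing in for the sessions table; all names are invented for illustration): on each login, any sessions already on record are the ones to notify.

```go
package main

import "fmt"

// SessionStore tracks active sessions per user, mirroring the
// asker's "store sessions in the database" idea.
type SessionStore struct {
	active map[string][]string // user -> device names
}

func NewSessionStore() *SessionStore {
	return &SessionStore{active: make(map[string][]string)}
}

// Login records the new session and returns the devices that should
// be told about it (in a real app this is where you would publish a
// Pusher/FCM/websocket event).
func (s *SessionStore) Login(user, device string) []string {
	notify := append([]string(nil), s.active[user]...)
	s.active[user] = append(s.active[user], device)
	return notify
}

func main() {
	store := NewSessionStore()
	store.Login("john", "Computer A")
	toNotify := store.Login("john", "Computer B")
	for _, d := range toNotify {
		fmt.Printf("notify %s: another login occurred\n", d)
	}
}
```

The push channel (Pusher, FCM, or Echo over websockets) only carries the notification; the decision of who to notify comes from the session store.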

Is it okay to use one RabbitMQ channel for all goroutines? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 4 years ago.
I'm building a message publisher and a receiver to/from a RabbitMQ queue.
I'm planning to use one RabbitMQ channel for publishing messages and one for fetching, but I also want my code to be asynchronous. Is it right to use one RabbitMQ channel across multiple goroutines?
I'm not a Go guy, but I use RabbitMQ every day with the .NET driver; .NET driver channels (an abstraction that encapsulates interactions with queues/exchanges and message publishing/subscribing) are very similar to Go RabbitMQ channels, so I think my answer can help you.
While connections are thread-safe by design and are supposed to be shared between threads, channels are not: so, if different asynchronous goroutines can run in different threads (this is up to you: I don't know how the Go runtime works), you should not share the same channel instance between them.
I hope this can help you.
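The shape this answer recommends — share the connection, give each goroutine its own channel — can be sketched in Go. The `Connection` and `Channel` types below are stand-ins written for this sketch (the real ones come from an AMQP client library), so the structure, not the API, is the point:

```go
package main

import (
	"fmt"
	"sync"
)

// Channel mimics the thread-unsafe AMQP channel: it should have
// exactly one owner goroutine. This is a stand-in type, not the
// real library API.
type Channel struct{ published int }

func (c *Channel) Publish(body string) { c.published++ }

// Connection mimics the thread-safe AMQP connection, which is
// meant to be shared and to hand out fresh channels.
type Connection struct {
	mu     sync.Mutex
	opened int
}

func (conn *Connection) OpenChannel() *Channel {
	conn.mu.Lock()
	defer conn.mu.Unlock()
	conn.opened++
	return &Channel{}
}

func main() {
	conn := &Connection{} // one shared connection
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			ch := conn.OpenChannel() // one channel per goroutine
			for j := 0; j < 10; j++ {
				ch.Publish(fmt.Sprintf("msg %d-%d", id, j))
			}
		}(i)
	}
	wg.Wait()
	fmt.Println("channels opened:", conn.opened) // 4, not 1 shared
}
```

Opening a channel is cheap relative to opening a connection, which is why the channel-per-goroutine pattern is usually preferred over serializing all publishes through one shared channel.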

JMS Queue and JMS Topic [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
How can a queue and a topic be applied in airport management when a plane arrives?
At the airport there are many systems that interact with the plane at ramp time. These include fueling and servicing the plane, gate management, passenger announcements, FAA filings, and 3rd party vendors such as those who SMS you with updates. All of these are different families of applications both within and external to the airport's network fabric.
Publishing a single event notification on a topic is a good way to update all interested systems at once. Rather than establishing dozens of point-to-point interfaces for all these systems, they are all allowed to subscribe to the topic of interest. The publications can be converted to queued delivery on a per-receiver basis for legacy apps or external apps that cannot issue a subscribe command.
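The delivery semantics being contrasted here — a topic copies one publication to every subscriber, a queue hands each message to exactly one consumer — can be modeled in a few lines. A sketch in Go (the dispatch policy and names are invented for illustration, not a JMS API):

```go
package main

import "fmt"

// deliverTopic copies one publication to every subscriber, as a JMS
// topic does: fueling, gate management, FAA filings, SMS vendors,
// etc. each receive the same "plane arrived" event.
func deliverTopic(msg string, subscribers []string) map[string]string {
	inbox := make(map[string]string)
	for _, s := range subscribers {
		inbox[s] = msg
	}
	return inbox
}

// deliverQueue hands each message to exactly one consumer, as a JMS
// queue does; round-robin is used here as a simple dispatch policy.
func deliverQueue(msgs []string, consumers []string) map[string][]string {
	inbox := make(map[string][]string)
	for i, m := range msgs {
		c := consumers[i%len(consumers)]
		inbox[c] = append(inbox[c], m)
	}
	return inbox
}

func main() {
	subs := []string{"fueling", "gate", "faa", "sms-vendor"}
	arrivals := deliverTopic("plane AA100 arrived", subs)
	fmt.Println("subscribers notified:", len(arrivals)) // all 4

	// A queue suits competing workers: each task is handled once.
	work := deliverQueue([]string{"bag1", "bag2", "bag3"},
		[]string{"handler1", "handler2"})
	fmt.Println(work)
}
```

The bridging idea in the answer maps onto this directly: a legacy app that cannot subscribe gets a per-receiver queue that is fed from the topic.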
