MongoDB add event - Ruby

Is it possible to somehow subscribe to 'add to db' event using Node.js? Database is currently populated via Ruby on Rails. Thanks.

MongoDB wise
There's an ongoing discussion about triggers on the MongoDB Jira.
For now, though, you're stuck with storing auto-increment values alongside your data and using indexed queries to check whether there's anything new.
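A minimal sketch of that "check if there's anything new" query, written in Python against a pymongo-style collection for illustration (the seq field and the helper name are assumptions, not MongoDB built-ins):

```python
# Sketch: detect new documents by storing a monotonically increasing
# "seq" value on each insert and querying past the last value seen.
# Assumes an index on "seq" so the range query stays cheap.

def fetch_new(collection, last_seq):
    """Return documents inserted after last_seq, plus the new high-water mark."""
    docs = list(collection.find({"seq": {"$gt": last_seq}}).sort("seq", 1))
    return docs, (docs[-1]["seq"] if docs else last_seq)
```

The consumer calls this in a loop or on a timer, carrying the returned high-water mark into the next call.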
Rails wise
I'm assuming you're using Mongoid. Use callbacks or observers to send messages to a fast capped collection, a Unix socket, or whatever else fits. Other ODMs shouldn't be too different.

You need to notify the Node.js process from the Rails app when you insert something in the DB.
Listen on a socket/port from the Node.js process
From Rails, write to that socket whenever a record is added
In Node, process each message that arrives on the socket

Related

Retrieving data from database using spring integration JDBC without poll

I'm currently learning Spring Integration, and I want to retrieve information from a MySQL database to use inside an int:service-activator or an int:splitter.
Unfortunately, most examples and documentation are based around the idea of using an int-jdbc:inbound-channel-adapter, which itself requires a poller. I don't want to poll the database, but rather retrieve specific data based on the payload of an existing message originating from an int:gateway. This data would then be used to further modify the payload, or to assist in how the message is split.
I tried using int-jdbc:outbound-gateway, as the description states:
... jdbc.JdbcOutboundGateway' for updating a database in response to a message on the request channel, and/or for retrieving data from the database ...
This implies that it can be used purely for retrieval, not just for updates, but when I implement it, I get a complaint that at least one update statement is required:
And so I'm currently sitting with a faulty prototype that initially looks like so:
The circled piece is the non-functioning int-jdbc:outbound-gateway.
My end goal is to retrieve some information from a MySQL database based on the payload coming from the incomingGateway (in the picture above), and to use that data to split the message in the analyzerSplitter, or perhaps to modify the payload using an int:service-activator. This should then all be linked up to an int-jdbc:message-store, which I believe could help with performance. I do not wish to poll the database on a regular basis, and I do not wish to update anything in the database.
By testing using the polling int-jdbc:inbound-channel-adapter, I am confident that my datasource bean is set up correctly and the query can execute.
How would I go about correctly setting up such behaviour in spring integration?
If you want to proceed with the flow after updating the database, you can simply use a JdbcTemplate in a method invoked by a service activator, or, if it's the end of the flow, use an outbound channel adapter.
The outbound channel adapter is the inverse of the inbound: its role is to handle a message and use it to execute a SQL query. By default, the message payload and headers are available as input parameters to the query, as the following example shows:
...

Heroku Connect & PostGreSQL Events

Is there a practical way for my app to get notified when Heroku Connect adds records to a table?
I currently have a Flask app connected to a Salesforce org via Heroku Connect. I have event listeners for before_insert, after_insert, before_update, and after_update, and SQLALCHEMY_ECHO is set to True. When I create a record in Salesforce, none of the event listeners fire and no SQL statements are printed. However, if I query the model that matches the mapped sObject, I can see the new record, so Heroku Connect must be updating the table in a way that doesn't trigger the event listeners. I did read up a bit on pg_notify (LISTEN/NOTIFY), but all the solutions seem to involve a select loop, which is much less elegant than the db.event.listens_for decorators.

Sharing events between two Laravel applications

Is it possible to have one Laravel application listen for events triggered in another?
I've built a REST API to complement an existing web app. It uses the same database but I've built it as a separate application and there are certain events which clear some cached results. At the moment the events are not being shared between the two applications so I'm getting the cached results in spite of having updated the database. Is there a way for one app to pick up on events fired by the other? I haven't found anything about this in the docs.
Redis is completely agnostic about which application is listening to it. You can set your broadcast driver to redis and fire your events in one application while listening in the other, as long as they both use the same Redis instance. Note, however, that Laravel binds listeners to a specific event class, so you would still have to make sure that class exists in the listening application in order to define a listener for it.
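Under the hood, the redis broadcast driver is just Redis pub/sub: one app PUBLISHes a serialized event and the other SUBSCRIBEs and routes it by event name. A language-neutral sketch of that contract (in Python, with hypothetical names):

```python
import json

def publish_event(redis_client, channel, event_name, payload):
    """Broadcasting app: serialize the event and PUBLISH it on a shared channel."""
    body = json.dumps({"event": event_name, "data": payload})
    redis_client.publish(channel, body)

def dispatch(raw_message, listeners):
    """Listening app: decode a SUBSCRIBEd message and call the matching handler."""
    message = json.loads(raw_message)
    handler = listeners.get(message["event"])
    if handler is not None:
        handler(message["data"])
```

This also shows why the event class has to exist on both sides: the event name carried in the message is the only thing the listening app can use to decide which handler to invoke.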

Persistent subscriber in Firebase

Is there built-in support or any way of implementing a persistent subscription in Firebase?
I need to set up a backend which reacts to certain events in my Firebase database. If the backend has crashed or is being restarted I need it to catch up with anything that has happened while it was down.
For example, I want to re-index certain objects in ElasticSearch when they change. If the backend is down I need to re-index any changed objects when the backend comes back up again.
Nothing is built in for that, although you can definitely build it on top of Firebase by adding an isIndexed or isDirty property to the items.
But the more common approach is to push the items that need to be re-indexed onto a queue and use a worker process that removes them from the queue once they've been handled. I highly recommend using firebase-queue for that.
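A minimal sketch of the isDirty approach, shown in Python over a plain dict standing in for the Firebase data (the field and function names are assumptions):

```python
def mark_dirty(items, key):
    """On every write: flag the item so a restarted worker can find it."""
    items[key]["isDirty"] = True

def catch_up(items, reindex):
    """On worker startup: re-index everything still flagged, then clear the flag."""
    for key, item in items.items():
        if item.get("isDirty"):
            reindex(key, item)
            item["isDirty"] = False
```

The queue variant replaces the flag with one child node per pending job, which is what firebase-queue manages for you: workers claim jobs and remove them once handled, so anything unprocessed at crash time is still sitting in the queue on restart.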

Queuing in tandem with a Ruby Web socket server

I am writing an application using JRuby on Rails. Part of the application initiates a long-running process from a web page. The process could last for 20 minutes in some cases and will, in most cases, outlive the web page's response. I also want the job to continue if the user closes the browser. The long-running process will add records to a database as it runs.
I want to give visual indications of the database inserts on the web page, and I would prefer to use web sockets rather than polling the database for the inserts.
I am thinking of sending a message to a Resque queue, with a queue handler that will ensure the job completes even if the user closes the browser. The queue handler will perform the inserts into the database.
I was thinking of using EM-WebSocket as my websocket server.
The problem I have is:
How can I communicate between the Resque process and the EM-WebSocket process? I want to somehow pass the details of the new database inserts from the Resque process to an EM-WebSocket instance that will communicate with the browser.
Has anybody solved a problem like this, or does anyone have ideas on how I can do it?
I'm actually working on a gem that makes this pretty simple. Right now it's pretty bare, but it does work. https://github.com/KellyMahan/RealTimeRails
It runs an EventMachine server that listens for updates and uses em-websockets to send those updates to the browser.
It's meant to watch for Active Record updates through an after_save call that tells the EventMachine server there is an update for a given model and id. The server then matches the model and id to specific channels and sends a message to the connections on the web socket server. When the browser receives a notice to update, it makes an Ajax call to retrieve the latest results.
Still a work in progress but it could help you.
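The model/id-to-channel matching described above fits in a few lines. A sketch in Python for illustration (the gem itself is Ruby; all names here are hypothetical):

```python
def channel_for(model, record_id):
    """Channel key used to group websocket connections by record."""
    return "%s:%s" % (model, record_id)

def route_update(update, subscriptions):
    """Given an after_save notification, return the connections to notify."""
    return subscriptions.get(channel_for(update["model"], update["id"]), [])
```

Each returned connection would then be sent the update notice, prompting its browser to fetch the latest results via Ajax.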
