Angular 2: Updating objects in "real time" - socket.io

Hi, I'm trying to wrap my head around how to update a table in Angular 2.
Here is what I have:
Backend: Express / MongoDB. Updates are fed into the DB via an external app.
Data: 90% of the data will be static; 10% of the data updates every second.
I've looked at Observables, promises, HTTP requests, and socket.io, but can't wrap my mind around the concepts.
Main Question: can I use observables with socket.io to update records?
Other Questions about data updates
Angular 2's Observables – are observables used only when the client is pulling data, or can you use them with a socket when data is being pushed to the client? (All the examples online use observables with an HTTP request.)
Can you use socket.io to update an object, or is it just for new objects? Every example I see is a chat application.
When using HTTP requests, how do you set how often the data is requested? (Some examples online use loops, but that seems wrong.)

Observables are event-based, so they can be used to receive events from the server over WebSockets. Have a look at this article (section "Event-based support"):
https://jaxenter.com/reactive-programming-http-and-angular-2-124560.html
In fact each event carries new objects, but you can leverage the scan operator to aggregate the content of several events:
var obs = (...); // the source observable of server events

obs.startWith([])
  .scan((acc, value) => acc.concat(value)) // aggregate the content of several events
  .subscribe((data) => {
    console.log(data);
  });
See this question for more details:
Convert a plain string[] into a Observable<string[]> and concat it to another Observable<string[]> using RxJS 5
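To answer the main question directly: yes, you can use observables with socket.io. The source observable above can be built by wrapping the socket connection, so that every record the server pushes becomes an event in the chain. A minimal sketch, assuming RxJS 5 and socket.io-client (the server URL and the 'record-update' event name are made up for illustration):

import * as io from 'socket.io-client';
import { Observable } from 'rxjs/Observable';

const socket = io('http://localhost:3000'); // assumed server URL

// Emits a value every time the server pushes a 'record-update' event
const obs = new Observable(observer => {
  socket.on('record-update', record => observer.next(record));
  socket.on('error', err => observer.error(err));
  return () => socket.off('record-update'); // teardown runs on unsubscribe
});

Each pushed record then flows through startWith/scan like any other event, whether it represents a new object or an update to an existing one.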
If you want to poll with a time interval instead, you can leverage the interval method:
Observable.interval(3000)
  .flatMap(() => {
    return this.http.get('/some-request').map(res => res.json());
  })
  .subscribe((data) => {
    console.log(data);
  });

Related

Update charts values every 30 seconds from springboot backend to angular4 frontend

I'm working on a project where I have a dashboard with lots of charts and variables that need to be updated every 30 seconds. I'm going to use Spring Boot to create an API that returns JSON values, and I'm going to fetch the values with Angular 4 and render my graphs. As I'm no expert in either Angular or Spring Boot, I need your advice on how I can approach this issue and what the easiest way is to dynamically update my charts. Do I need to use AJAX?
Are there any other easy ways of doing this?
Thank you all in anticipation.
You should use HttpClientModule for your requests to the Spring Boot API. And the timer functionality is pretty simple:
import { HttpClient } from '@angular/common/http';
import { interval } from 'rxjs';
import { switchMap } from 'rxjs/operators';

export class DataService {
  constructor(private http: HttpClient) {}

  getData() {
    interval(30 * 1000) // just instead of writing 30000
      .pipe(switchMap(() => this.http.get('your url')))
      .subscribe(data => console.log(data)); // do things with data
  }
}
This is one way to go; you can also use timer, for example. Replace interval with timer(0, 30*1000), where 0 is how long to wait before the first request and the second value is how often to repeat it. Here you can read about switchMap and interval.
Bear in mind that I've written this using RxJS 5+ with pipe and Angular 5+ with HttpClient; the approach might differ in older versions, and I highly recommend using the latest versions.
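For illustration, the timer variant of the same getData() body might look like this (same assumptions as above: RxJS 6-style imports, HttpClient, and a placeholder URL):

import { timer } from 'rxjs';
import { switchMap } from 'rxjs/operators';

// timer(0, 30 * 1000): emit immediately, then every 30 seconds
timer(0, 30 * 1000)
  .pipe(switchMap(() => this.http.get('your url')))
  .subscribe(data => console.log(data));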

Http Performance - Many small requests or one big one

Scenario:
In my site I display books.
The user can add every book to a "Read Later" list.
Behavior:
When the user enters the site, they are presented with a list of books.
Some of which are already in their "Read Later" list, some aren't.
The user has an indication next to each book telling them whether the book has been added to the list or not.
My issue
I am debating which option is the ideal for my situation.
Option 1:
For every book, query the server whether it already exists in the user's list.
Update the indicator for each book.
Pro:
Very small request to the server, and very easy response (true or false).
Con: In a page with 30 books, I will send 30 separate http requests, which can block sockets, and is rather slow considering the browser and the server have to perform the entire handshake for each transaction.
Option 2:
I query the server once, and get a response with the full list of books in the "Read Later" list as an array.
In the browser, I go over the array, and update the indication for each book based on whether it exists in the array or not.
Pro: I only make one request, and update the indicator for all the books at once.
Con: The "Read Later" list might have hundreds of books, and passing a big array might prove slow and excessive. Especially in scenarios when not 30 books appear on the screen, but only 2-3. (That is, I want to check if a certain book is in the list, and for this I have the server send the client the entire list of books from the list).
So,
Which way would you go to maximize performance: 1 or 2?
Is there any alternative I am missing?
I think in 2017, and beyond, the solution is much less about overall performance and more about user experience and user expectations.
Nowadays users do not tolerate delays. In that sense sophisticated user interfaces try to be responsive as quickly as possible. Thus: if you can use those small requests to enable the user to do something quickly (instead of waiting 2 seconds for that one big request to return) you should prefer that solution.
To my knowledge, there are many "high fidelity" sites out there where a single page might send 50 or 100 requests. Therefore I consider that to be common practice!
And maybe it is helpful here: se-radio.net podcast episode 277 discusses this topic intensively, in the context of tail latency.
Option 1 sounds good but has a big problem in terms of scalability.
Option 2 mitigates this scalability problem and we can improve its design:
Client side, via JavaScript, collect only the ids of the displayed books and query once, via AJAX, for an array of read-later info for just those 30 books (see the sketch below). This way you still serve the page fast and request a small set of additional info once, with a single HTTP request.
Server side, you can further improve this by caching an in-memory array of read-later ids for each user.
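As a sketch of that improved Option 2 (the /read-later/status endpoint, the data-book-id attribute, and the updateIndicator helper are all hypothetical):

// Collect the ids of only the books currently rendered on the page
const displayedIds = Array.from(document.querySelectorAll('[data-book-id]'))
  .map(el => el.getAttribute('data-book-id'));

// One request returns the read-later status for just those books
fetch('/read-later/status?ids=' + displayedIds.join(','))
  .then(response => response.json())
  .then(statuses => {
    // statuses is assumed to map book id -> true/false;
    // updateIndicator is a hypothetical UI helper
    displayedIds.forEach(id => updateIndicator(id, statuses[id]));
  });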
Live Testing, Solution & Real-World Data
This answer is written in JavaScript and includes easy-to-understand code examples.
Introduction
The OP asked what the most efficient way is to make requests to a "Read Later" API, where each request has to wait some time while the backend saves the book.
For this answer I have created a demo "Read Later" API endpoint; every request waits a random 70-130 milliseconds before saving each book.
In all scenarios I am testing with 30 books each time.
Finally, we will see the best results for each method by measuring the real runtime of every action we take.
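For context, a minimal sketch of how such a mock endpoint could look, assuming Express (the port and route match the URLs used in the tests below; the response shape is made up):

const express = require('express');
const app = express();

app.get('/books/read-later', (req, res) => {
  // simulate a save that takes a random 70-130 ms
  const delay = 70 + Math.floor(Math.random() * 61);
  setTimeout(() => res.json({ bookId: req.query.bookId, saved: true }), delay);
});

app.listen(7777);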
Synchronous Requests (OP's Option 1)
Here, we will run every call via JS, one after the other (each request is awaited before the next one starts).
The code:
async function saveBooksSync() {
  console.time('save-books-sync');

  // creates 30 book IDs
  const booksIds = Array.from({ length: 30 }, (_, i) => i + 1);
  // creates 30 API links for each request
  const urls = booksIds.map(bookId => `http://localhost:7777/books/read-later?bookId=${bookId}`);

  for (let url of urls) {
    const response = await fetch(url);
    const json = await response.json();
    console.log(json);
  }

  console.timeEnd('save-books-sync');
}
Runtime: 3712.40087890625 ms
One Big Request
Although we are not creating many request connections to the server, the runtime speaks for itself.
The code:
async function saveAllBooksAtOnce() {
  console.time('save-all-books');

  const booksIds = Array.from({ length: 30 }, (_, i) => i + 1);
  const url = `http://localhost:7777/books/read-later?all=1`;

  const response = await fetch(url);
  const json = await response.json();

  console.timeEnd('save-all-books');
}
Runtime: 3486.71484375 ms
Parallel Asynchronous Requests (solution)
Here the magic happens; this is the solution to the question of which request method is most efficient.
Here we are making 30 small parallel requests, with amazing results.
The code:
async function saveBooksParallel() {
  console.time('save-books');

  const booksIds = Array.from({ length: 30 }, (_, i) => i + 1);
  const urls = booksIds.map(bookId => `http://localhost:7777/books/read-later?bookId=${bookId}`);

  // fire all 30 requests at once, then wait for all of them to settle
  const promises = urls.map((url) =>
    fetch(url).then((response) => response.json())
  );

  const data = await Promise.all(promises);
  console.log(data);

  console.timeEnd('save-books');
}
Here in this asynchronous parallel example I used the Promise.all method:
"The Promise.all() method takes an iterable of promises as an input, and returns a single Promise that resolves to an array of the results of the input promises."
Runtime: 668.47705078125 ms
Conclusion
The results are clear: the most efficient way to make these multiple requests is to make them in parallel, asynchronously.
Update: I followed @Iglesias Leonardo's request to remove the console.log() of the data output, because (presumably) it takes up a lot of resources.
These are the runtime results:
Synchronous Requests: 3371.695 ms
One Big Request: 3358.269 ms
Parallel Asynchronous Requests: 613.506 ms
Update Conclusion:
The runtimes stayed almost the same, which reflects the reality that parallel asynchronous requests are unmatched in speed.
In my view it depends on how the data is stored. If a relational database is being used, you could easily get the boolean flag into the list of books by simply doing a join on the corresponding tables.
This will most likely give you the best results and you wouldn't have to write any algorithms in the front end.

Exchange data between node.js script and client's Javascript

I have the following situation, where the "headers already sent" problem happens when sending multiple responses from the server to the client for a single AJAX request:
It is something I expected, since I opted to go with AJAX instead of sockets. Is there another way to exchange data between the server and the client, like using browserify to translate an emitter script for the client? I suppose I can't escape sockets, so I will take advice about a simpler library, as socket.io seems too complex for such a small operation.
//-------------------------
Update:
Here is the node.js code as requested.
var maxRunning = 1;
var test_de_rf = ['rennen', 'ausgehen'];

function callHandler(word, cb) {
  console.log("word is - " + word);
  gender.gender_function_rf(word, function (result_rf) {
    console.log(result_rf);
    res.send(result_rf); // Here I send data back to the ajax call
    setTimeout(function () {
      cb(null);
    }, 3000);
  });
}

async.eachLimit(test_de_rf, maxRunning, function (item, done) {
  callHandler(item, function (err) {
    if (err) throw new Error(err);
    done();
  });
}, function (err) {
  if (err) throw new Error(err);
  console.log('done');
});
res.send() sends and finishes an HTTP response. You can only call it once per request, because the request is finished and done after calling it. It is a fairly high-level way of sending a response (it does it all at once in one call).
If you wanted to have several different functions contributing to a response, you could use the lower level functions on the http object such as res.setHeader(), res.writeHead(), res.write() (which you can call multiple times) and res.end() (which indicates the end of the response).
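For illustration, a minimal sketch of a handler built from those lower-level calls (plain Node.js http module; the port and payload are made up):

const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.write('first chunk\n');  // res.write() can be called multiple times
  res.write('second chunk\n');
  res.end('done\n');           // res.end() finishes the response
}).listen(3000);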
You can use the standard WebSocket API in the browser and a webSocket module (such as ws) for server-side support, or you can use socket.io, which offers both client and server support and a number of higher-level features (such as automatic reconnect, automatic fallback to HTTP polling if webSockets are not supported, etc.).
All that said, if what you really want is the ability to just send some data from server to client whenever you want, then a webSocket really is the better way to go. It is a persistent connection, is supported by all modern browsers, and allows the server to send data to the client unsolicited at any time. I'd hardly say socket.io is complex. The docs aren't particularly great at explaining things (not uncommon in the open source world, as the node.js docs aren't particularly great either), but I've always been able to figure advanced things out by just looking at a few runtime data structures in the debugger and/or looking at the source code.
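As a sketch of that unsolicited server-to-client push, using the ws module (the port, interval, and payload are made up for illustration):

// Server: push data to every connected client once per second
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

setInterval(() => {
  wss.clients.forEach(client => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify({ word: 'rennen', processed: true }));
    }
  });
}, 1000);

// Browser: receive the pushed data
const socket = new WebSocket('ws://localhost:8080');
socket.onmessage = event => console.log(JSON.parse(event.data));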

RPCs on Websockets with Scala and JS (like SignalR)

I want to implement an application based on Scala and Play! 2.1 where all data transport is handled through WebSockets in real time. Since the application supports collaboration of several users, I want to be able to call methods on a) the server, b) one client, c) all clients.
For example, let's say there are Users Bob, Jane and Carl.
Carl creates a "note" which is sent through the socket and then, if successfully stored, added to the DOM through basic Javascript (let's say addNote(note)) on all clients.
A sample call could look like this:
// sends message type createCard to the server, takes <form id="card"> as data,
// and receives a JSON object as response
mysocket.send("createCard", $('#card').serialize(), { success: function(data) {
  var card = data.card;
  mysocket.allClients().addCard(card); // appends <div id="card"> to the DOM
}});
Is this possible or am I going about this the wrong way entirely?
See SignalJ - a port of SignalR ideas to PlayFramework and Akka.

Sending events from server to client(s) in Meteor

Is there a way to send events from the server to all or some clients without using collections?
I want to send events with some custom data to the clients. While Meteor is very good at doing this with collections, in this case the added complexity and storage are not needed.
On the server there is no need for Mongo storage or local collections.
The client only needs to be alerted that it received an event from the server and act accordingly to the data.
I know this is fairly easy with sockjs, but it's very difficult to access sockjs from the server.
Meteor.Error does something similar to this.
(Note: the following package is now deprecated and does not work for versions >0.9.)
You can use the following package, which was originally aimed at broadcasting messages from clients to the server and on to other clients:
http://arunoda.github.io/meteor-streams/
No collections, no MongoDB behind it; usage is as follows (not tested):
stream = new Meteor.Stream('streamName'); // defined on both client and server side

if (Meteor.isClient) {
  stream.on("channelName", function(message) {
    console.log("message:" + message);
  });
}

if (Meteor.isServer) {
  setInterval(function() {
    stream.emit("channelName", 'This is my message!');
  }, 1000);
}
You should use Collections.
The "added complexity and storage" isn't a factor if all you do is create a collection, add a single property to it and update that.
Collections are just a shape for data communication between server and client, and they happen to build on mongo, which is really nice if you want to use them like a database. But at their most basic, they're just a way of saying "I want to store some information known as X", which hooks into the publish/subscribe architecture that you should want to take advantage of.
In the future, other databases will be exposed in addition to Mongo. I could see there being a smart package at some stage that strips Collections down to their most basic functionality like you're proposing. Maybe you could write it!
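A minimal sketch of that stripped-down usage (the collection name and payload are made up; this assumes the collection is published to the client, e.g. via autopublish):

// A tiny collection used purely as a message channel
Messages = new Meteor.Collection('messages');

if (Meteor.isServer) {
  // Update a single document whenever an event occurs
  Meteor.setInterval(function() {
    Messages.upsert({ _id: 'latest' }, { $set: { payload: 'something happened' } });
  }, 1000);
}

if (Meteor.isClient) {
  // React whenever that document appears or changes
  Messages.find('latest').observe({
    added: function(doc) { console.log(doc.payload); },
    changed: function(doc) { console.log(doc.payload); }
  });
}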
I feel for @Rui, and the fact of using a Collection just to send a message feels cumbersome.
At the same time, once you have several such messages to send around, it is convenient to have a Collection named something like settings or similar where you keep these.
The best package I have found is Streamy. It allows you to send to everybody, or to just one specific user:
https://github.com/YuukanOO/streamy
meteor add yuukan:streamy
Send message to everybody:
Streamy.broadcast('ddpEvent', { data: 'something happened for all' });
Listen for message on client:
// Attach a handler for a specific message
Streamy.on('ddpEvent', function(d, s) {
  console.log(d.data);
});
Send message to one user (by id):
var socket = Streamy.socketsForUsers(["nJyQvECmkBSXDZEN2"])._sockets[0]
Streamy.emit('ddpEvent', { data: 'something happened for you' }, socket);
