Show status of long process running on server side - ajax

I'm working on CSV import in my Node.js-based web app.
Most of the CSV files have tens of thousands of records, so an import takes several minutes.
Until the import finishes, I want to show users a "Currently importing..." message.
What I want to create is similar to GitHub's forking screen: after you press the fork button at the top right of a repo, it shows the message "Forking / It should only take a few seconds." until the fork finishes.
In addition, I would ideally like to add a progress bar indicating the percentage of records processed.
My current implementation is:
Client sends a request with the CSV data.
Server processes the received CSV and inserts records into the DB.
Server responds with 200 if the CSV is valid.
But with this implementation, users cannot see the current status, and sometimes the socket even hangs up.
I'm considering the following reimplementation:
Client sends a request with the CSV data.
Server responds with 200 to tell the client that the CSV was received.
Server starts to process the received CSV and inserts records into the DB.
However, I have no idea:
how the client knows that the import is done
how the client knows when an error occurs during CSV processing or DB insertion
How can I implement the server side?
Thanks in advance ;)

You need to use socket.io here to keep track of the progress. As soon as you receive the CSV, your client can connect to the socket.
Server:
io.on('connection', function (socket) {
  console.log('CONNECTED');
  socket.join('progressSession');
});
You can periodically emit a progress event to let the client know how many records you've processed so far. (I hope you're processing the records asynchronously, or can at least run some other code in between.)
io.sockets.in('progressSession').emit('progress', noOfRecords);
The client can then listen for the progress event and show the status to the user:
var socket = io.connect('http://localhost:9000');
socket.on('progress', function (status) {
  console.log(status);
  // show status to the user
});
Comment if you need any more clarity.

Send the request as you do, return a status immediately to confirm or reject the CSV's validity, and finish the response. Then use something like http://socket.io/ to send updates to the client.
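To make that concrete, here is a minimal sketch of that accept-then-process flow, assuming Express and socket.io; parseCsv and insertRecord are hypothetical placeholders for your own parsing and DB code, and 'progressSession' is the room from the answer above:

// Sketch only: acknowledge the upload, then process in the background,
// emitting 'progress', 'done', and 'importError' events over socket.io.
app.post('/import', function (req, res) {
  var records;
  try {
    records = parseCsv(req.body.csv); // hypothetical parser
  } catch (e) {
    return res.status(400).send('Invalid CSV');
  }
  res.sendStatus(200); // acknowledge receipt immediately

  var done = 0;
  (function next() {
    if (done === records.length) {
      io.sockets.in('progressSession').emit('done', done);
      return;
    }
    insertRecord(records[done], function (err) { // hypothetical DB insert
      if (err) {
        io.sockets.in('progressSession').emit('importError', err.message);
        return;
      }
      done++;
      if (done % 100 === 0) { // report a percentage every 100 records
        io.sockets.in('progressSession').emit('progress',
          Math.round(done / records.length * 100));
      }
      setImmediate(next); // yield to the event loop between records
    });
  })();
});

The client would listen for done and importError the same way it listens for progress, which answers both "how does the client know the import is done" and "how does it learn about errors".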

Related

Pubnub chat application with storage

I'm looking to develop a chat application with PubNub where I want to make sure all the chat messages that are sent are stored in the database, and I also want to be able to send messages in the chat.
I found out that I can use Parse with PubNub to provide storage options, but I'm not sure how to set the two up so that the messages and images sent in the chat are stored in the database.
Has anyone done this before with PubNub and Parse? Are there any other easy options available to use with PubNub instead of Parse?
Sutha,
What you are seeking is not a trivial solution unless you are talking about a limited number of end users. So I wouldn't say there are "easy" solutions, but there are solutions.
The reason is that your server would need to listen (subscribe) to every chat channel that is active and store the messages being sent into your database. Imagine your app scaling to 1 million users (it doesn't even need to get that big, but that number should help you realize how tricky this gets to scale, with several server instances listening to channels in a non-overlapping manner, or with overlap but using a server queue implementation and de-duping messages).
That said, yes, there are PubNub customers that have implemented such a solution - Parse not being the key to making this happen, by the way.
You have three basic options for implementing this:
Implement a solution that will allow many instances of your server to subscribe to all of the channels as they become active and store the messages as they come in. There are a lot of details to making this happen, so if you are not up to it, this is not likely where you want to go.
There is a way to monitor all channels that become active or inactive with PubNub Presence webhooks (enable Presence on your keys). You would use this to keep a list of active channels from which your server pulls history (enable Storage & Playback on your keys) in an on-demand (not completely realtime) fashion.
For every channel that goes active or inactive, your server will receive these events via a REST call (an endpoint that you implement on your server - your Parse server in this case):
channel active: record "start chat" timetoken in your Parse db
channel inactive: record "end chat" timetoken in your Parse db
the inactive event is the kickoff for a process that uses the start/end timetokens you recorded for that channel to fetch that channel's history from PubNub: pubnub.history({channel: channelName, start: startTT, end: endTT})
you will need to iterate on this history call until you receive fewer than 100 messages (100 is the max number of messages you can retrieve at a time)
as you retrieve these messages you will save them to your Parse db
New Presence Webhooks have been added:
We now have webhooks for all presence events: join, leave, timeout, state-change.
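As a rough illustration of that iteration, here is a hedged sketch of the pagination loop with the v3 JavaScript SDK; saveMessageToParse is a hypothetical helper, and the exact timetoken bookkeeping varies across SDK versions, so verify it against your SDK's docs.

// Sketch: page through history between the recorded timetokens.
function fetchChannelHistory(channelName, startTT, endTT) {
  pubnub.history({
    channel : channelName,
    start   : startTT,
    end     : endTT,
    count   : 100,   // max page size
    reverse : true,  // oldest first, so paging moves forward in time
    callback: function (response) {
      // response is [messages, sliceStartTT, sliceEndTT]
      var messages = response[0];
      messages.forEach(saveMessageToParse); // hypothetical Parse save helper
      if (messages.length === 100) {
        // a full page means there may be more; resume from this slice's end
        fetchChannelHistory(channelName, response[2], endTT);
      }
    }
  });
}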
Finally, you could just save each message to the Parse db on success of every pubnub.publish call. I am not a Parse expert and barely know all of its capabilities, but I believe they have some sort of store-locally-then-sync-to-cloud-db option (like StackMob when that was a product); but even if not, you can save the msg to the Parse cloud db directly.
The code would look something like this (not complete, likely errors, figure it out or ask PubNub support for details) in your JavaScript client (on the browser).
var pubnub = PUBNUB({
  publish_key   : your_pub_key,
  subscribe_key : your_sub_key
});
var msg = ... // get the message from your UI text box or whatever
pubnub.publish({
  // this is some variable you set up when you enter a chat room
  channel: chat_channel,
  message: msg,
  callback: function(event){
    // DISCLAIMER: code pulled from [Parse example][4]
    // but there are some object creation details
    // left out here and msg object is not
    // fully fleshed out in this sample code
    var ChatMessage = Parse.Object.extend("ChatMessage");
    var chatMsg = new ChatMessage();
    chatMsg.set("message", msg);
    chatMsg.set("user", uuid);
    chatMsg.set("channel", chat_channel);
    chatMsg.set("timetoken", event[2]); // publish success returns the timetoken
    // this ChatMessage object can be
    // whatever you want it to be
    chatMsg.save();
  },
  error: function (error) {
    // Handle error here, like retry until success, for example
    console.log(JSON.stringify(error));
  }
});
You might even just store the entire set of publishes (on both ends of the conversation) based on a time interval, number of publishes, or size of total data, but be careful: either user could exit the chat and the browser without notice, and you would fail to save. So the per-publish save is probably best practice, if a bit noisy.
I hope you find one of these techniques as a means to get started in the right direction. There are details left out so I expect you will have follow up questions.
Just some other links that might be helpful:
http://blog.parse.com/learn/building-a-killer-webrtc-video-chat-app-using-pubnub-parse/
http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
https://www.pubnub.com/knowledge-base/discussion/293/how-do-i-publish-a-message-from-parse
And we have a PubNub Parse SDK, too. :)

Nodejs - How to kill a running SQLite query

Possibly similar questions:
SQL: Interrupting a query
Is there a way to abort an SQLite call?
Hi everyone,
I am using the socket.io and sqlite3 modules to perform SELECT queries on a SQLite database. When a user clicks on an OpenLayers map, it sends a signal to the server (through socket.io) to gather information by performing a spatial request (like intersection or union, using the Spatialite extension) and then finally sends data back to the client to show a popup on the map where the user clicked. These are long-running queries, depending on the number of geometries.
The problem is: if a user clicks many times on the map, sending many requests to the server, only the last one matters. Imagine a query takes 5 seconds to execute and a user clicks 3 times in one second on the map (he just wants the last location he clicked to be used): the server will run 3 queries and send back 3 signals through socket.io (opening 3 popups, when we only need the last one). Is there any solution to kill/abort a running SQLite query with Node.js?
Example code:
socket.on('askForInfo', function (data) {
  sendInfo(socket, data.latitude, data.longitude);
});
sendInfo definition:
function sendInfo(socket, lat, lng) {
  // Database connection
  var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
  // Load Spatialite extension
  db.loadExtension('mod_spatialite', function(err) {
    // Query doing spatial request
    db.get("VERY LONG SQL QUERY", function(err, row) {
      // Send the data gathered from database
      socket.emit('sendData', row['some sql column']);
    });
  });
}
I want to do something like:
if ("sendInfo hasn't emitted any signal through the socket yet"
    AND "the user made another request")
then
    "kill any running sendInfo execution and its SQL query"
I know that if there are many users connected this won't work as-is (I may need to use sessions to know for which user the function is actually gathering data). But I can't find any solution even for a single user.
I tried using AJAX (jQuery) instead of socket.io. I can abort the XHR request, but the SQL query keeps running even after the request is aborted, uselessly consuming resources until it finishes.
Thanks in advance for your answer.
Thanks for your answer Qualcuno.
I found a solution, using child_process to query the database from another process and killing that process if the user makes another request.
var cp = require('child_process');
var child;
// more stuff ...
socket.on('askForInfo', function (data) {
  // if a child is still connected, kill it (its query isn't finished)
  if (child && child.connected) {
    child.kill();
  }
  // create a new process executing 'query_database.js'
  child = cp.fork('./query_database.js', [
    data.latitude, data.longitude
  ]);
  // when the child invokes process.send(some_data_here),
  // this event fires and forwards the data to the user
  child.on('message', function (d) {
    socket.emit('sendData', d);
  });
});
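For completeness, here is a minimal sketch of what the child script might look like, reusing the placeholder file name, query, and column from the question; since the query runs in its own process, killing the process abandons the in-flight query along with it:

// query_database.js (sketch) - runs in its own process
var sqlite = require('sqlite3');

var lat = process.argv[2]; // arguments passed by cp.fork()
var lng = process.argv[3];

var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
db.loadExtension('mod_spatialite', function (err) {
  if (err) process.exit(1);
  db.get("VERY LONG SQL QUERY", function (err, row) {
    if (err) process.exit(1);
    process.send(row['some sql column']); // received by child.on('message')
    process.exit(0);
  });
});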
Node.js's sqlite3 module has an interrupt method, albeit currently undocumented: https://github.com/mapbox/node-sqlite3/issues/1205#issuecomment-559983976
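If that method is available in your sqlite3 version, a minimal sketch (assuming a single shared Database handle, per the linked issue) would be:

// Sketch: abort the previous query before starting a new one.
socket.on('askForInfo', function (data) {
  db.interrupt(); // any in-flight query fails with SQLITE_INTERRUPT
  db.get("VERY LONG SQL QUERY", function (err, row) {
    if (err) return; // interrupted or failed; a newer request won
    socket.emit('sendData', row['some sql column']);
  });
});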

Implementing bulk-messaging from Salesforce to/from Twilio, hitting Salesforce API limits

I am building an integration between Salesforce and Twilio that sends/receives SMS using the TwilioForce REST API. The main issue is getting around the 10-call API limit in Salesforce, as well as the prohibition on HTTP callouts from a trigger.
I am basing the design on Dan Appleman's Asynchronous Request processes, but in either Batch mode or RequestAsync(), ASync(), Sync(), repeat... I'm still hitting the limits.
I'd like to know how other developers have done this successfully; the integrations have been there for a while, but the examples are few and far between.
Are you sending unique messages for each record that has been updated? If not, then why not send one message to multiple recipients to save on your API limits?
Unfortunately, if you do actually need to send more than 10 unique messages, there is no way to send messages in bulk with the Twilio API. You could instead write a simple application that runs on Heroku or some other application platform, which you can call out to and which will handle the SMS functionality for you.
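As a hedged sketch of what such a relay might look like in Node.js with the twilio package (the /send-batch endpoint shape and field names are invented for illustration; Salesforce would POST the batch to it in a single callout):

// Sketch: a tiny relay that accepts a batch of messages and
// fans them out through Twilio.
var express = require('express');
var twilio = require('twilio');

var app = express();
app.use(express.json());

var client = twilio(process.env.TWILIO_ACCOUNT_SID,
                    process.env.TWILIO_AUTH_TOKEN);

app.post('/send-batch', function (req, res) {
  // req.body.messages: [{to, from, body}, ...] (illustrative shape)
  Promise.all(req.body.messages.map(function (m) {
    return client.messages.create({ to: m.to, from: m.from, body: m.body });
  })).then(function (results) {
    res.json(results.map(function (r) { return r.sid; })); // return the smsIds
  }).catch(function (err) {
    res.status(500).json({ error: err.message });
  });
});

app.listen(process.env.PORT || 3000);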
I have it working now using the following structure (I apologize for the formatting - it's mostly pseudocode):
ASyncRequest object:
AsyncType (picklist: 'SMS to Twilio' is it for now),
Params (long text area: comma-separated list of Ids)
Message object:
To (phone), From (phone), Message (text), Sent (boolean), smsId (string), Error (text)
Message trigger: passes trigger details to CreateAsyncRequests() method.
CreateAsyncRequests: evaluate each new/updated Message__c; if Sent == false for any messages, we create an AsyncRequest, type=SMS to Twilio, Params += ',' + message.Id.
// Create a list to be inserted after all the Messages have been processed
List<AsyncRequest__c> requests = new List<AsyncRequest__c>();
Once we reach 5 message Ids in a single AsyncRequest.Params list, add it to requests.
If all the messages have been processed and there's a request with fewer than 5 Ids in Params, add it to requests as well.
if (requests.size() > 0) {
  insert requests;
  AsyncProcessor.StartBatch();
}
AsyncProcessor implements .Batchable and .AllowsCallouts, and queries ASyncRequest__c for any requests that need to be processed, which in this case will be our Messages list.
The execute() method takes the list of ASyncRequests, splits each Params value into its component Message Ids, and then queries the Message object for those particular Messages.
StartBatch() calls execute() with 1 record at a time, so that each execute() process will still contain fewer than the maximum 10 callouts.
Each Message is processed in a try/catch block that calls SendMessage(), sets Message.smsId = Twilio.smsId and sets Message.Sent = true.
If no smsId is returned, then the message was not sent, and I set a boolean bSidIsNull = true indicating that (at least) one message was not sent.
** If any message failed, no smsIds are returned EVEN FOR MESSAGES THAT WERE SUCCESSFUL **
After each batch of messages is processed, I check bSidIsNull; if true, then I go back over the list of messages and put any that do not have an smsId into a map indexed by the Twilio number I'm trying to send them From.
Since I limited each ASyncRequest to 5 messages, I still have the use of a callout to retrieve all of the messages sent from that Twilio.From number for the current date, using
client.getAccount().getMessages('From' => fromNumber, 'DateSent' => currentDate)
Then I can update the Message.smsIds for all of the messages that were successful, and add an error message to Message.Error_on_Send__c for any that failed.

ExpressJS backend hanging with too much requests

I have an express app running with Sequelize.js as an ORM. My express app receives requests from my main Rails app, and because of the cross-domain policy, these requests are performed with getJSON.
On the client, the request is fired when the user hits a key.
Everything goes fine: express logs the queries being performed (and the JSON being served) each time the user hits the key. Even hitting the key quickly, it performs OK. But whenever I leave the key pressed (or several clients hit the key very quickly), as it starts firing lots of requests, at some point the server just hangs: all requests from that point on are left pending (I see that in the Network tab of Chrome Dev Tools) and slowly start to time out. I have to reboot the server to make it respond again.
The server code for my request is:
models.Comment.findAllPublic(req.params.pId, req.params.sId, function(comments){
  var json = comments.map(function(comment){
    var com = {};
    ['user_id','user_avatar', 'user_slug', 'user_name', 'created_at', 'text', 'private', 'is_speaker_note'].forEach(function(key){
      com[key] = comment[key];
    });
    return com;
  });
  res.json({comments: json});
});
And the findAllPublic method from the Comment model (this is a Sequelize model) is:
findAllPublicAndMyNotes: function(current_user, presentationId, slideId, cb){
  db.query(
    "SELECT * FROM `comments` " +
    "WHERE commentable_type='Slide' " +
    "AND commentable_id=(SELECT id FROM `slides` " +
    "  WHERE `order_in_presentation`=" + slideId +
    "  AND `presentation_id`=" + presentationId + ") " +
    "AND (`private` IS FALSE " +
    "  OR (`private` IS TRUE AND `user_id`=" + current_user +
    "  AND `is_speaker_note` IS FALSE))",
    self.Comment
  ).on('success', cb).on('failure', function(err){ console.log(err); });
}
How can I keep the server from getting stuck? Am I leaving some blocking code in the request that could slowly hang the server as new requests come in?
At first I thought the problem might be the forEach used when composing the JSON object from the Sequelize model, but I also tried leaving the callback for the MySQL query empty, just responding with empty JSON, and it still froze.
Maybe it is a problem with the MySQL connector? When the server gets stuck I can still open the MySQL console and run queries on my database, and it responds, so I don't know if that's the problem.
I know I could just control the key event to prevent it from firing too many requests when the key is pressed for a long time, but the problem also appears when several clients hit the key repeatedly and concurrently.
Any thoughts? Thanks in advance for the help :D
Two things:
It seems like you have some path where res.render is not being called. It could be that the database you're connecting to is dropping the connection to your Express server after the absurd number of requests, so the callback is never fired (and there's no database.on('close', function() { /* handle disconnect from DB, perhaps auto-restarting */ }) handler to catch it).
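As a hedged illustration of that kind of handler with the plain mysql driver (the question's stack uses Sequelize, where this would instead be configured at the connection/pool level):

// Sketch: catch dropped MySQL connections instead of letting
// callbacks silently never fire.
var mysql = require('mysql');
var connection;

function connect() {
  connection = mysql.createConnection({ /* host, user, password, ... */ });
  connection.connect();
  connection.on('error', function (err) {
    if (err.code === 'PROTOCOL_CONNECTION_LOST') {
      connect(); // reconnect instead of hanging every later request
    } else {
      throw err;
    }
  });
}
connect();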
Your client-side code should detect when an AJAX request fired on keypress is still pending while a new one is being started, and cancel the old one. I'm guessing getJSON is a jQuery method? Assuming it's jQuery's, you need something like the following:
var currKeyRequest = null;
function callOnKeyUp() {
  var searchText = $('#myInputBox').val();
  if (currKeyRequest) {
    currKeyRequest.abort();
    currKeyRequest = null;
  }
  currKeyRequest = $.getJSON('path/to/server', function(json) {
    currKeyRequest = null;
    // Use JSON code
  });
}
This way, you reduce the load on the client and the latency of the autocomplete functionality (but why not use the jQuery UI autocomplete if that's what you're after?), and you can save the server some of the load as well if the keypresses come faster than the round-trips to the server (possible with a good touch-typist a few hours' flight away).

Long Running Wicket Ajax Request

I occasionally have long-running AJAX requests in my Wicket application. When this occurs, the application is largely unusable, as subsequent AJAX requests are queued up to process synchronously after the current request. I would like the request to terminate after a period of time regardless of whether or not a response has been returned (I have a user requirement that if this occurs, we should present the user an error message and continue). This presents two questions:
Is there any way to specify a timeout that's specific to one AJAX request, or to all AJAX requests?
If not, is there any way to kill the current request?
I've looked through the wicket-ajax.js file and I don't see any mention of a request timeout whatsoever.
I've even gone so far as to try reloading the page after some timeout on the client side, but unfortunately the server is still busy processing the original AJAX request and does not respond until that request has finished processing.
Thanks!
I think it won't help you to let the client "cancel" the request. (Though that could work.)
The point is that the server is busy processing a request that is no longer needed. If you want to time out such operations, you have to implement the timeout on the server side. If an operation takes too long, the server aborts it and returns some error value as the result of the Ajax request.
Regarding your queuing problem: consider using asynchronous requests instead of synchronous ones. The client first sends a request to start the long-running process; this request returns immediately. Then the client periodically polls the server and asks whether the process has finished. Each poll request also returns immediately, saying either that the process is still running or that it has finished with a certain result.
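A framework-agnostic sketch of that start-then-poll pattern in plain JavaScript with jQuery; the /start and /status endpoints and the UI helpers are invented for illustration (in Wicket these would be behaviors with callback URLs):

// Sketch: kick off the long job, then poll until it finishes or a
// client-side deadline passes.
function startLongJob() {
  var deadline = Date.now() + 30000; // give up after 30s, per the requirement
  $.post('/start', function (job) {
    (function poll() {
      if (Date.now() > deadline) {
        showError('The operation timed out.'); // hypothetical UI helper
        return;
      }
      $.getJSON('/status?id=' + job.id, function (s) {
        if (s.finished) {
          showResult(s.result); // hypothetical UI helper
        } else {
          setTimeout(poll, 1000); // ask again in a second
        }
      });
    })();
  });
}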
Failed solution: after a given setTimeout, I kill the active transports and restart the channel, which handles everything on the client side. I avoided request conflicts by tying each request to an ID and checking it against a global reference that increments each time a request is made and each time a request completes.
function longRunningCallCheck(refId) {
  // make sure the reference id matches the global id.
  // this indicates that we are still processing the
  // long running ajax call.
  if (refId == id) {
    // perform client processing here

    // kill all active transport layers
    var t = Wicket.Ajax.transports;
    for (var i = 0; i < t.length; ++i) {
      if (t[i].readyState != 0) {
        t[i].onreadystatechange = Wicket.emptyFunction;
        t[i].abort();
      }
    }
    // process the default channel
    Wicket.channelManager.done('0|s');
  }
}
Unfortunately, this still left the PageMap blocked and any subsequent calls wait for the request to complete on the server side.
My solution at this point is instead to give the user an option to log out using a BookmarkablePageLink (which instantiates a new page, thus avoiding contention on the PageMap). Definitely not optimal.
Any better solutions are more than welcome, but this is the best one I could come up with.
