NHibernate & session-per-conversation in AJAX scenario: manual flush needed?

We are facing a very strange problem with our application, and so far I see no way to solve it:
The application uses NHibernate (3.1.x) in a session-per-conversation architecture, meaning an NHibernate session with a new transaction is opened at the start of each ASP/IIS request and flushed, committed, and closed at the end of the request.
Now, we have a mailbox in the application including a simple mailgrid:
The mailgrid is populated the first time a user enters the mailbox (the start of the request creates a new NHibernate session/transaction, the data is loaded from the DB, and at the end of the request the session is closed).
The problem occurs when deleting a mail:
(1)
Deleting a mail does not happen via a full page reload/refresh; instead, a webservice is called which returns the newly rendered HTML grid after the message has been deleted.
This webservice deletes a mail by doing a simple
T_Mails m = T_Mails.GetMailByID( 12345678 );
m.Status = MailStatusDeleted;
NHibernateSession.Update( m );
(2)
After this call to update the mail, the same function also re-renders the updated mailgrid: it queries for the last 5 rows in table T_Mails (one mailbox page contains only 5 rows), which is the current content of the mailbox for the given user.
The problem now is:
Since both of these calls occur in the same function of the webservice (and thus in the same session/transaction scope), the mail we want to delete in step 1 is not deleted, because the updated status from NHibernateSession.Update( m ) never reaches the database.
If I open the table in SQL Server Management Studio during a debugging session, I can see that the status of the mail is not persisted to the database.
My simple solution for this problem would be to do a manual NHibernateSession.Flush() call between updating the mail and reloading the content from the table.
But this would break the session-per-conversation pattern, since NHibernateSession.Flush() should occur at the end of each request; it is not supposed to be called manually somewhere in the middle of a running request.
Otherwise I'm really stuck.

You should make 2 AJAX calls: the first to delete the item and a second one to refresh the list once the first has finished. Make sure you are making the calls in that order. Below is sample code using jQuery:
$.ajax({
    url: 'Mail/Delete',
    data: { mailId: someId },
    success: function (data) {
        $.ajax({
            url: 'Mail/Refresh',
            success: function (data) {
                // here you should put new list into list container
            }
        });
    }
});

Related

Updating website as soon as intermediate plots are ready

Depending on a selection made on a website, plots are generated on a server running Flask.
The issue I have is that generating, e.g., 10 different plots can take up to 30 s. What I would like to achieve is to start updating the website as soon as the first plot is ready and then automatically load the others as they become ready.
Currently, the following AJAX function is executed as soon as the user hits the "process" button on the website:
$.ajax({
    type: "POST",
    url: "/single",
    data: { athleteName1: $('#athleteName1').val(), style: $('#style').val() },
    success: function (results) {
        $('#results').empty().append(results);
        $('#results').show();
        $('#submitbutton').prop('disabled', false);
    },
    error: function (error) {
        console.log(error);
    }
});
On the server side, plots are created and embedded in div containers. They are subsequently concatenated and returned to the website all at once as "diagStr":
@app.route('/single', methods=['POST', 'GET'])
def single():
    diagStr = ""
    for _ in range(10):
        diagStr += generate_plot()
    return Markup(diagStr)
Doing it with "Streaming Contents" can only be part of the solution, as AJAX waits until the entire response is received.
Any idea how this is solved with today's technology?
There are multiple ways you could achieve this, but here are some simple examples:
do your looping on the client side and generate 10 separate AJAX requests, updating the web page as each response is received (see the sketch after this list).
if you don't know in advance on the client side how many loops you will have, use a single request and have the server send the response as soon as the first loop is complete, along with a flag indicating whether there are more loops or not; the client can look at this flag and create a new AJAX request if there are more loops.
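A minimal sketch of the first option, assuming a hypothetical Flask endpoint /single_plot that renders exactly one plot for a given index (it is not part of the original app; the existing /single route would have to be split up accordingly):
var totalPlots = 10;
for (var i = 0; i < totalPlots; i++) {
    $.ajax({
        type: "POST",
        url: "/single_plot",                       // hypothetical per-plot endpoint
        data: {
            plotIndex: i,                          // hypothetical parameter
            athleteName1: $('#athleteName1').val(),
            style: $('#style').val()
        },
        success: function (plotHtml) {
            // append each plot as soon as its response arrives;
            // responses may come back in a different order than requested
            $('#results').append(plotHtml).show();
        },
        error: function (error) {
            console.log(error);
        }
    });
}
Note that the plots are appended in arrival order; if the display order matters, each response would need to be placed into a slot reserved for its index.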

Ajax ABORT leads to Django errno 10053 - An established connection was aborted by the software in your host machine

I have a web page (Django 1.9 on the back end, Apache server) with an endlessly-paginated table with a large data set and column filters. When a user activates one filter (let's denote it CHOICE 1) and then instantly changes his mind, resetting the filter (let's refer to it as CHOICE 2), I would like to tell Ajax to give up waiting for the back-end response to CHOICE 1 and go on to posting and waiting for the CHOICE 2 request. For this purpose, I had the following JS code:
// AJAX_GET_REQUEST is a global variable
AJAX_GET_REQUEST = $.ajax({
    type: 'GET',
    url: "/my_url/",
    beforeSend: function () {
        if (AJAX_GET_REQUEST != null)
            AJAX_GET_REQUEST.abort();
    },
    data: data,
    success: function (response) {
        // Do something
    }
});
Fine. I used to think that I had achieved the goal of successfully cancelling irrelevant requests, but I found out that AJAX_GET_REQUEST.abort(); leads to the Django error [Errno 10053] An established connection was aborted by the software in your host machine. The interesting thing is that this is not a 'fatal error' in that the app does not terminate, but rather it takes ages for my paginated table to load. Django seems to re-establish the connection itself and go on to handle the last request. Finally, after waiting a long time, I see the correct result on the front end. If I remove the AJAX_GET_REQUEST.abort(); line, everything is fine, but I have to wait until Django is through with the irrelevant requests before it handles the last relevant one.
Is there any way out? Is it possible to abort previous requests while avoiding this annoying 10053 error?

Nodejs - How to kill a running SQLite query

Possible similar questions:
SQL: Interrupting a query
Is there a way to abort an SQLite call?
Hi everyone,
I am using the socket.io and sqlite3 modules to perform SELECT queries on a SQLite database. When a user clicks on an OpenLayers map, it sends a signal to the server (through socket.io) to gather information by performing a spatial request (intersection, union, etc., using the Spatialite extension) and then finally sends the data back to the client to show a popup on the map where the user clicked. These are long-running queries (depending on the number of geometries).
The problem is: if a user clicks many times on the map, sending many requests to the server, only the last one matters. Imagine that a query takes 5 seconds to execute and a user clicks 3 times within one second on the map (he just wants the last location where he clicked to be used); the server will then run 3 queries and send back 3 signals through socket.io (opening 3 popups, when we only need the last one)! Is there any solution to kill/abort a running SQLite query with Node.js?
Example code :
socket.on('askForInfo', function (data) {
    sendInfo(socket, data.latitude, data.longitude);
});
sendInfo definition:
var sqlite = require('sqlite3');   // sqlite3 module used below

function sendInfo(socket, lat, lng) {
    // Database connection
    var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
    // Load Spatialite extension
    db.loadExtension('mod_spatialite', function (err) {
        // Query doing the spatial request
        db.get("VERY LONG SQL QUERY", function (err, row) {
            // Send the data gathered from the database
            socket.emit('sendData', row['some sql column']);
        });
    });
}
I want to do something like:
if ("sendInfo hasn't finished emitting any signal through the socket"
    AND "the user made another request")
then
    "kill all running sendInfo function executions and their SQL queries"
I know that if there are many users connected this won't work like that (I may need to use sessions to know which user the function is actually gathering data for). But I can't find any solution even when there is only one user.
I tried using AJAX (jQuery) instead of socket.io. I can abort the XHR request, but the SQL query keeps running until it finishes even if the request is aborted (uselessly consuming a lot of resources).
Thanks in advance for your answer.
Thanks for your answer, Qualcuno.
I found a solution: using child_process to query the database from another process, and killing that process if the user makes another request.
var cp = require('child_process');
var child;
// more stuff ...
socket.on('askForInfo', function (data) {
    // if a child process is still connected, kill it (its query isn't finished)
    if (child && child.connected) {
        child.kill();
    }
    // create a new process executing 'query_database.js'
    child = cp.fork('./query_database.js', [
        data.latitude, data.longitude
    ]);
    // when the child process calls process.send(some_data_here),
    // this event fires and forwards the data to the user
    child.on('message', function (d) {
        socket.emit('sendData', d);
    });
});
Node.js' sqlite3 module has an interrupt method, albeit currently undocumented: https://github.com/mapbox/node-sqlite3/issues/1205#issuecomment-559983976
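A minimal sketch of how that could look, assuming the undocumented interrupt() method from the linked issue is available on the Database object (an interrupted statement fails with an error in its callback, which is simply ignored here; Spatialite extension loading is omitted for brevity):
var sqlite = require('sqlite3');
var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);

socket.on('askForInfo', function (data) {
    // cancel whatever statement is currently running on this connection
    db.interrupt();
    // then run the query only for the latest click
    db.get("VERY LONG SQL QUERY", function (err, row) {
        if (err) return;                     // the interrupted query ends up here
        socket.emit('sendData', row['some sql column']);
    });
});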

ExpressJS backend hanging with too much requests

I have an express app running with Sequelize.js as an ORM. My express app receives requests from my main Rails app, and because of the cross-domain policy, these requests are performed with getJSON.
On the client, the request is fired when the user hits a key.
Everything goes fine and Express logs the queries being performed (and the JSON being served) each time the user hits the key. Even hitting it quickly works fine. But whenever I leave the key pressed (or maybe when several clients hit the key very quickly), as it starts firing lots of requests, at some point the server just hangs: all requests from that point on are left pending (I see that in the Network tab of the Chrome Dev Tools) and slowly start to time out. I have to reboot the server to make it respond again.
The server code for my request is:
models.Comment.findAllPublic(req.params.pId, req.params.sId, function(comments){
    var json = comments.map(function(comment){
        var com = {};
        ['user_id', 'user_avatar', 'user_slug', 'user_name', 'created_at', 'text', 'private', 'is_speaker_note'].forEach(function(key){
            com[key] = comment[key];
        });
        return com;
    });
    res.json({comments: json});
});
And the findAllPublic method from the Comment model (this is a Sequelize model) is:
findAllPublicAndMyNotes: function(current_user, presentationId, slideId, cb){
    db.query("SELECT * FROM `comments` WHERE commentable_type='Slide' AND commentable_id=(SELECT id from `slides` where `order_in_presentation`="+slideId+" AND `presentation_id`="+presentationId+") AND (`private` IS FALSE OR (`private` IS TRUE AND `user_id`="+current_user+" AND `is_speaker_note` IS FALSE))", self.Comment).on('success', cb).on('failure', function(err){ console.log(err); });
}
How can I avoid the server getting stuck? Am I leaving some blocking code in the request that may slowly hang the server as new requests are made?
At first I thought it could be a problem with the forEach when composing the JSON object from the Sequelize model, but I also tried leaving the callback for the MySQL query empty, just responding with empty JSON, and it also froze.
Maybe it is a problem with the MySQL connector? When the server gets stuck I can still open the MySQL console and run queries against my database, and it responds, so I don't know if that's the problem.
I know I could just control the key event to prevent it from firing too many requests when the key is pressed for a long time, but the problem also seems to appear when several clients hit the key repeatedly and concurrently.
Any thoughts? Thanks in advance for the help :D
Two things:
It seems like you have some path where res.render is not being called. It could be that the database you're connecting to is dropping the connection to your Express server after the absurd number of requests, and the callback is never fired (and there's no database.on('close', function() { /* handle disconnect from DB, perhaps auto-restarting */ }) code to catch it).
Your client-side code should detect when an AJAX request fired on keypress is still pending while a new one is being started, and cancel the old one. I'm guessing getJSON is a jQuery method? Assuming it's jQuery's, then you need something like the following:
var currKeyRequest = null;

function callOnKeyUp() {
    var searchText = $('#myInputBox').val();   // (pass this along with the request as needed)
    if (currKeyRequest) {
        currKeyRequest.abort();                // cancel the still-pending request
        currKeyRequest = null;
    }
    currKeyRequest = $.getJSON('path/to/server', function(json) {
        currKeyRequest = null;
        // Use JSON code
    });
}
This way, you reduce the load on the client, the latency of the autocomplete functionality (but why not use the jQuery UI autocomplete if that's what you're after?), and you can save the server from some of the load as well if the keypresses are faster than handshaking with the server (possible with a good touch-typist a few hours flight away).

Maintain order of requests when making several ajax callbacks

I'm looping through several items and making an ajax request for each of them (using jQuery). I want them to execute independently, but populate into the DOM in the order they were called, not the order they are returned (for some reason some requests are taking longer than others). Any tips on the best practice for this type of thing?
Well, the results can come back in any order; they are asynchronous and subject to the vagaries of the internet and the servers.
What you can do is deal with the problem in the same way TCP does over IP: use sequence identifiers.
Keep a sequence identifier going and increment it every time you send out a request. As responses come back, check them off in order and only process them in that order. Keep an ordered list of what has returned along with its data, and have a routine check that list after each update to it. When the first expected response is in, it should process the whole list down to the first gap.
Bear in mind that you could lose a request, so a suitable timeout before you give up on a given sequence identifier would be in order.
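A minimal sketch of that idea using jQuery; '#container' and 'a.myselector' are placeholder selectors, and the timeout handling mentioned above is omitted:
var nextSeq = 0;        // next identifier to hand out
var nextToRender = 0;   // next identifier we are allowed to render
var buffered = {};      // seq -> response, for out-of-order arrivals

function fetchInOrder(url) {
    var seq = nextSeq++;
    $.get(url, function (response) {
        buffered[seq] = response;
        // flush everything that is now contiguous, down to the first gap
        while (buffered.hasOwnProperty(nextToRender)) {
            $('#container').append(buffered[nextToRender]);
            delete buffered[nextToRender];
            nextToRender++;
        }
    });
}

// usage: fire off the requests in the order the items appear
$('a.myselector').each(function () {
    fetchInOrder($(this).attr('href'));
});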
The answer to this ended up being a jQuery plugin called ajaxManager. This did exactly what I needed:
https://github.com/aFarkas/Ajaxmanager
You could send all the success result objects to a queue. Include an index with each original request, and continually check that queue for the next index.
But browsers generally only allow a limited number of simultaneous AJAX requests to the same host, so it might be worth just sending the next AJAX request on success of the previous one, as sketched below.
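A minimal sketch of that chained alternative: each request is fired only after the previous one succeeds, so results land in the DOM in call order ('a.myselector' and '#container' are placeholder selectors):
function loadSequentially(urls, i) {
    if (i >= urls.length) return;            // all done
    $.ajax({
        url: urls[i],
        success: function (result) {
            $('#container').append(result);  // DOM order matches call order
            loadSequentially(urls, i + 1);   // only now fire the next request
        }
    });
}

var urls = $('a.myselector').map(function () {
    return $(this).attr('href');
}).get();
loadSequentially(urls, 0);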
Here's a start at the code for the queue approach:
var results = {}, lastProcessedIndex = 0;

// note: jQuery's .each() callback receives (index, element)
var totalLength = $('a.myselector').each(function(index, el){
    $.ajax({
        url: $(this).attr('href'),
        success: function(result){
            results[index] = result; // buffer the result under its original index
        }
    });
}).length;

var intervalId = setInterval(function(){
    if(results[lastProcessedIndex] !== undefined){
        // use results[lastProcessedIndex] here, in order
        lastProcessedIndex++;
    }
    else if(totalLength == lastProcessedIndex){
        clearInterval(intervalId);
    }
}, 1000); // every 1 second
I'll be taking a stab in the dark with this one, but it might help. Maybe you could create a global buffer array and, whenever an AJAX call returns, add the result to the buffer. You could then set up a timer that, when triggered, checks the contents of the buffer; if they are in order, it outputs them accordingly.
