Node.js - How to kill a running SQLite query - ajax

Possible similar questions:
SQL: Interrupting a query
Is there a way to abort an SQLite call?
Hi everyone,
I am using the socket.io and sqlite3 modules to perform SELECT queries on a SQLite database. When a user clicks on an OpenLayers map, it sends a signal to the server (through socket.io) to gather information by performing spatial requests (like intersection, union... using the Spatialite extension) and then finally sends the data back to the client to show a popup on the map where the user clicked. These are long-running queries, depending on the number of geometries.
The problem is: if a user clicks many times on the map, sending many requests to the server, only the last one matters. Imagine that a query takes 5 seconds to execute and a user clicks 3 times in a second on the map (he just wants the last location where he clicked to be used): the server will run 3 queries and send back 3 signals through socket.io (opening 3 popups, when we only need the last one)! Is there any solution to kill/abort a running SQLite query with Node.js?
Example code :
socket.on('askForInfo', function (data) {
    sendInfo(socket, data.latitude, data.longitude);
});
sendInfo definition:
var sqlite = require('sqlite3');

function sendInfo(socket, lat, lng) {
    // Database connection
    var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
    // Load the Spatialite extension
    db.loadExtension('mod_spatialite', function(err) {
        // Query doing the spatial request
        db.get("VERY LONG SQL QUERY", function(err, row) {
            // Send the data gathered from the database
            socket.emit('sendData', row['some sql column']);
        });
    });
}
I want to do something like:
if ("sendInfo hasn't finished emitting its signal through the socket"
    AND "the user made another request")
then
    "kill the running sendInfo execution and its SQL query"
I know that if there are many users connected this won't work like that (I may need to use sessions to know which user the function is currently gathering data for). But I can't find any solution, even for the case where there is only one user.
I tried using AJAX (jQuery) instead of socket.io. I can abort the xhr request, but the SQL query keeps running until it finishes, even though the request was aborted (uselessly consuming a lot of resources).
Thanks in advance for your answer.

Thanks for your answer, Qualcuno.
I found a solution: use child_process to query the database from another process, and kill that process if the user makes another request.
var cp = require('child_process');
var child; // declared outside the handler so it survives between requests
// more stuff ...
socket.on('askForInfo', function (data) {
    // if a previous child process is still connected, kill it (its query isn't finished)
    if (child && child.connected) {
        child.kill();
    }
    // create a new process executing 'query_database.js'
    child = cp.fork('./query_database.js', [
        data.latitude, data.longitude
    ]);
    // when the child invokes process.send(some_data_here), this event fires
    // and the data is sent to the user
    child.on('message', function (d) {
        socket.emit('sendData', d);
    });
});
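For completeness, here is a minimal sketch of what query_database.js could contain; the file name, the SQL and the column name come from the snippets above, while the error handling is an assumption:
var sqlite = require('sqlite3');

// query_database.js runs in the forked child process;
// latitude/longitude arrive as command-line arguments
var lat = process.argv[2];
var lng = process.argv[3];

var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
db.loadExtension('mod_spatialite', function (err) {
    if (err) process.exit(1);
    db.get("VERY LONG SQL QUERY", function (err, row) {
        if (err) process.exit(1);
        // hand the result back to the parent, which forwards it via socket.io
        process.send(row['some sql column']);
        db.close(function () { process.exit(0); });
    });
});
Because the query runs in its own process, killing that process really does stop the work, unlike aborting an xhr request.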

Node.js' sqlite3 module has an interrupt method, albeit currently undocumented: https://github.com/mapbox/node-sqlite3/issues/1205#issuecomment-559983976
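Based on that issue, here is a minimal sketch of how it could be applied to the original problem; it assumes the sqlite3 version you install actually exposes Database#interrupt, since it is undocumented:
var sqlite = require('sqlite3');
var db = new sqlite.Database('some file.sqlite', sqlite.OPEN_READONLY);
// (the loadExtension('mod_spatialite', ...) step is omitted for brevity)

var queryRunning = false;
socket.on('askForInfo', function (data) {
    // abort the previous query so only the latest click is answered
    if (queryRunning) {
        db.interrupt(); // undocumented, see the issue above
    }
    queryRunning = true;
    db.get("VERY LONG SQL QUERY", function (err, row) {
        queryRunning = false;
        // an interrupted query lands here with err set (SQLITE_INTERRUPT)
        if (err) return;
        socket.emit('sendData', row['some sql column']);
    });
});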

Related

Updating website as soon as intermediate plots are ready

Depending on a selection made on a website, plots are generated on a server running Flask.
The issue I have is that generating e.g. 10 different plots can take up to 30 s. What I would like to achieve is to start updating the website as soon as the first plot is ready, and then automatically load the others as soon as they are ready.
Currently, the following AJAX function is executed as soon as the user hits the "process" button on the website:
$.ajax({
    type: "POST",
    url: "/single",
    data: { athleteName1: $('#athleteName1').val(), style: $('#style').val() },
    success: function (results) {
        $('#results').empty().append(results);
        $('#results').show();
        $('#submitbutton').prop('disabled', false);
    },
    error: function (error) {
        console.log(error);
    }
});
On the server side, plots are created and embedded in div containers. They are subsequently concatenated and returned to the website all at once as "diagStr":
@app.route('/single', methods=['POST', 'GET'])
def single():
    diagStr = ''
    for _ in range(10):
        diagStr += generate_plot()
    return Markup(diagStr)
Doing it with "Streaming Contents" can only be part of the solution, as AJAX waits until the entire response is received.
Any idea how this is solved with today's technology?
There are multiple ways you could achieve this, but here are some simple options:
do your looping on the client side, and generate 10 separate Ajax requests, updating the web page as each response is received (see the sketch after this list).
if you don't know in advance, on the client side, how many loops you will have, then use a single request and have the server send a response as soon as the first loop is complete, along with a flag indicating whether there are more loops or not - the client can look at this flag and create a new Ajax request if there are more loops.
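A minimal jQuery sketch of the first option, assuming a hypothetical /single/<n> route on the Flask side that renders and returns just plot number n:
var totalPlots = 10;
for (var n = 0; n < totalPlots; n++) {
    $.ajax({
        type: "POST",
        url: "/single/" + n, // hypothetical per-plot route
        data: { athleteName1: $('#athleteName1').val(), style: $('#style').val() },
        success: function (results) {
            // append each plot as soon as its response arrives
            $('#results').append(results).show();
        },
        error: function (error) {
            console.log(error);
        }
    });
}
Note that the responses can arrive out of order; if the order of the plots matters, create a placeholder container per index first and fill each one when its response comes in.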

Show status of long process running on server side

I'm working on CSV import in my Node.js based web app.
Most of the given CSV files have tens of thousands of records, and importing takes several minutes.
So until the import finishes, I want to show users a "Currently importing..." message.
What I want to create is similar to GitHub's forking screen. After you press the fork button at the top right of a repo, it shows the message "Forking / It should only take a few seconds." until the fork finishes.
In addition, I hopefully want to add a progress bar to indicate the percentage of processed records.
My current implementation is:
The client sends a request with the CSV data
The server processes the received CSV and inserts the records into the DB.
The server responds 200 if the CSV is valid.
But with this implementation, users cannot see the current status. Sometimes the socket even hangs up.
I'm considering the following reimplementation:
The client sends a request with the CSV data
The server responds 200 to tell the client that the CSV was received
The server starts to process the received CSV and inserts the records into the DB.
However, I have no idea:
how the client knows that the import is done
how the client knows when an error occurs during CSV processing or DB insertion
How can I implement the server side?
Thanks in advance ;)
You need to use socket.io here to keep track of the progress. As soon as you receive the CSV, your client can connect to the socket.
Server:
io.on('connection', function (socket) {
    console.log('CONNECTED');
    socket.join('progressSession');
});
You can periodically emit a progress event to let the client know how many records you've processed. (I hope you're processing the records asynchronously, or can at least run some other code in between.)
io.sockets.in('progressSession').emit('progress', noOfRecords);
And the client can listen on the progress event and show it to the user:
var socket = io.connect('http://localhost:9000');
socket.on('progress', function (status) {
    console.log(status);
    // show status to the user
});
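To cover the two remaining points (knowing when the import is done, and surfacing CSV/DB errors), the server can emit dedicated events when the processing loop finishes or fails. A minimal sketch; importRecords, insertBatch and the event names are assumptions for illustration:
// hypothetical batch import that reports progress, completion and errors
function importRecords(records) {
    var processed = 0;
    (function nextBatch() {
        var batch = records.slice(processed, processed + 100);
        insertBatch(batch, function (err) { // insertBatch is your DB insertion logic
            if (err) {
                // the client listens on 'importError' and shows the message
                io.sockets.in('progressSession').emit('importError', err.message);
                return;
            }
            processed += batch.length;
            io.sockets.in('progressSession').emit('progress', processed);
            if (processed < records.length) {
                setImmediate(nextBatch); // keeps the event loop free between batches
            } else {
                // the client listens on 'importDone' and hides the spinner
                io.sockets.in('progressSession').emit('importDone', processed);
            }
        });
    })();
}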
Comment if you need any more clarity.
Send the request as you do, return the status immediately to confirm or reject CSV validity, and finish the response. Then use something like http://socket.io/ to send updates to the client.

Programmatically change database for heroku dataclips

We just upgraded our Heroku Postgres database using the follower changeover method. We have over 50 dataclips attached to the old database, and now we need to move them over to the new database. However, doing it one by one would take a lot of time.
Is there a programmatic way to update the database a dataclip is attached to, perhaps with the CLI tools?
At least once the old database has been deprovisioned, you can now (as of March 2016) reattach your dataclips to another database:
Go to https://dataclips.heroku.com/clips/recoverable. It will display your old database and a set of 'orphaned' dataclips and you can choose to transfer them to another database (in my case the promoted follower from the changeover).
Note that this only affects the dataclips that you created; it does not affect dataclips that one of your team members created and that you only had access to. They will have to go through this process as well.
Official devcenter article: https://devcenter.heroku.com/articles/dataclips#dataclip-recovery
Thanks to Heroku CSRF measures, programmatically updating data clips is much more difficult than you might expect. You'll need to suck it up and start clicking buttons by hand, or beg their support team to do it for you, which is just as difficult.
There is no official support for programmatically moving the dataclips. That being said, you can script it out against their HTTP API.
The base URL is https://dataclips.heroku.com/api/v1/. There are three relevant endpoints:
clips: /clips
resources (databases): /heroku_resources
move clip: /clips/:slug/move
Find the slug of the clip you want to move, find the resource id of the new database, and make a POST to the move clip endpoint:
POST /api/v1/clips/fjhwieufysdufnjqqueyuiewsr/move
Content-Type: application/json
{"heroku_resource_id":"resource123456789@heroku.com"}
I had over 300 dataclips to move. I used the following technique to update them all (essentially reverse engineering the dataclips API).
Open Chrome with Web Developer tools, Network tab.
Log into Heroku Dataclips
Observe the network call which returns all the dataclips, in JSON (https://dataclips.heroku.com/api/v1/clips). Take this response and extract out all dataclip slugs.
Update the database for one dataclip. Observe the network call which does this (https://dataclips.heroku.com/api/v1/clips/:slug/move). Right click, Copy as cURL. This is the easiest way to get all the correct parameters, since the API uses cookies for authentication.
Write a script that loops through each dataclip slug, and shells out to curl. In Ruby, this looks like:
slugs = <paste ids here>.split("\n")
slugs.each do |slug|
  command = %Q(curl -v 'https://dataclips.heroku.com/api/v1/clips/#{slug}/move' -H 'Cookie: ...' --data '{"heroku_resource_id":"resource1234567@heroku.com"}')
  puts command
  system(command)
end
You can contact Heroku support, and they will bulk transfer the dataclips to your new database for you.
Batch working on dataclips
I've finally found a solution to work on my dataclips as a batch, using the JavaScript console and some scraping techniques. I needed it to retrieve every dataclip, but I guess it can be adapted as follows:
// Go to the dataclip listing (https://data.heroku.com/dataclips).
// Then execute this script in your console.
// Be careful, this will focus a new window every 4 seconds, preventing
// you from working for 4 seconds times the number of dataclips you have.

// Retrieve urls and titles
let dataclips = Array
  .from(document.querySelectorAll('.rt-td:first-child a'))
  .map(el => ({ url: el.href, title: el.innerText }))

/**
 * Allows waiting for a given timeout before execution.
 * @param {number} ms
 */
const timeout = function (ms) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve()
    }, ms)
  })
}

/**
 * Here are all the changes you want to apply to every single
 * dataclip.
 * @param {object} window
 */
const applyChanges = function (window) {
}

// With a fast connection, 4 seconds is OK. Dial it up if you
// have errors.
const expectedLoadTime = 4000 // ms

// This is the main loop, windows are opened one by one to ensure focus and a
// correct loading time.
for (const dataclip of dataclips) {
  // This opens another window from the script, having access to its DOM.
  // See https://github.com/buonomo/kazoo for a funnier example usage!
  // And don't be shy to star and share :D
  const externWindow = window.open(dataclip.url)
  // A hack to wait for loading, this could be improved for sure.
  await timeout(expectedLoadTime)
  applyChanges(externWindow)
  externWindow.close()
}
You'd still have to implement applyChanges yourself, which I concede is a bit tedious, and I don't have time to do it now (if someone does, please share!). But at least it can be done on all of your dataclips in a single function.
For an example usage of this script, you can take a look at the gist I made to scrape every dataclip and its related errors.

ExpressJS backend hanging with too many requests

I have an express app running with Sequelize.js as an ORM. My express app receives requests from my main Rails app, and because of the cross-domain policy, these requests are performed with getJSON.
On the client, the request is fired when the user hits a key.
Everything goes fine and express logs the queries being performed (and the JSON being served) each time the user hits the key. Even hitting it quickly, it performs OK. But whenever I leave the key pressed (or maybe several clients hit the key very quickly), as it starts firing lots of requests, at some point the server just hangs: all requests from that point on are left pending (I see that in the Network tab of the Chrome Dev Tools) and slowly start to time out. I have to reboot the server to make it respond again.
The server code for my request is:
models.Comment.findAllPublic(req.params.pId, req.params.sId, function (comments) {
    var json = comments.map(function (comment) {
        var com = {};
        ['user_id', 'user_avatar', 'user_slug', 'user_name', 'created_at', 'text', 'private', 'is_speaker_note'].forEach(function (key) {
            com[key] = comment[key];
        });
        return com;
    });
    res.json({comments: json});
});
And the findAllPublic method from the Comment model (this is a Sequelize model) is:
findAllPublicAndMyNotes: function (current_user, presentationId, slideId, cb) {
    db.query(
        "SELECT * FROM `comments` WHERE commentable_type='Slide' " +
        "AND commentable_id=(SELECT id FROM `slides` WHERE `order_in_presentation`=" + slideId + " AND `presentation_id`=" + presentationId + ") " +
        "AND (`private` IS FALSE OR (`private` IS TRUE AND `user_id`=" + current_user + " AND `is_speaker_note` IS FALSE))",
        self.Comment
    ).on('success', cb).on('failure', function (err) { console.log(err); });
}
How to avoid the server from getting stuck? Am I leaving some blocking code in the request that may slowly hang the server as new requests are made?
At first I thought it could be a problem with the "forEach" when composing the JSON object from the Sequelize model, but I also tried leaving the callback for the MySQL query empty, just responding with empty JSON, and it also froze.
Maybe it is a problem with the MySQL connector? When the server gets stuck I can still open the MySQL console and perform queries on my database, and it responds, so I don't know if that's the problem.
I know I could just throttle the key event to prevent it from firing too many requests when the key is pressed for a long time, but the problem also seems to appear when several clients hit the key repeatedly and concurrently.
Any thoughts? Thanks in advance for the help :D
Two things:
It seems like you have some path where res.render is not being called. It could be that the database you're connecting to is dropping the connection to your Express server after the absurd number of requests, and the callback is never fired (and there's no database.on('close', function() { // Handle disconnect from DB, perhaps auto-restarting }) code to catch it).
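If the driver is indeed dropping the connection, a common reconnect pattern with the plain mysql driver looks like the sketch below; Sequelize manages its own connection, so this only illustrates the idea of catching the disconnect instead of letting every callback silently die:
var mysql = require('mysql');
var connection;

function handleDisconnect() {
    connection = mysql.createConnection({ /* host, user, password, database */ });
    connection.connect(function (err) {
        if (err) {
            // retry instead of leaving all subsequent requests hanging
            setTimeout(handleDisconnect, 2000);
        }
    });
    connection.on('error', function (err) {
        if (err.code === 'PROTOCOL_CONNECTION_LOST') {
            handleDisconnect(); // reconnect on dropped connections
        } else {
            throw err;
        }
    });
}

handleDisconnect();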
Your client-side code should detect when an AJAX request fired on keypress is still pending while a new one is being started, and cancel the old one. I'm guessing getJSON is a jQuery method? Assuming it's jQuery's, then you need something like the following:
var currKeyRequest = null;
function callOnKeyUp() {
    var searchText = $('#myInputBox').val();
    if (currKeyRequest) {
        currKeyRequest.abort(); // jqXHR objects are cancelled with abort()
        currKeyRequest = null;
    }
    // 'q' is a placeholder parameter name for the search text
    currKeyRequest = $.getJSON('path/to/server', { q: searchText }, function (json) {
        currKeyRequest = null;
        // Use JSON code
    });
}
This way, you reduce the load on the client and the latency of the autocomplete functionality (but why not use the jQuery UI autocomplete if that's what you're after?), and you can save the server from some of the load as well if the keypresses are faster than handshaking with the server (possible with a good touch-typist a few hours' flight away).

NHibernate & session-per-conversation in AJAX scenario: manual flush needed?

We are facing a very strange problem with our application, and so far I see no way to solve it:
The application uses NHibernate (3.1.x) in a session-per-conversation architecture, meaning that at the start/end of each ASP/IIS request an NHibernate session with a new transaction is opened/closed + flushed/committed.
Now, we have a mailbox in the application including a simple mailgrid:
The mailgrid is populated the first time a user enters the mailbox (the start of the request creates a new NHibernate session/transaction, stuff is loaded from the DB, and at the end of the request the session is closed)
The problem occurs at the deletion of a mail:
(1)
the deletion of mails does not happen via a full page reload/refresh; instead it calls a webservice which returns the newly rendered HTML grid after a message is deleted.
This webservice deletes a mail by doing a simple
T_Mails m = T_Mails.GetMailByID( 12345678 );
m.Status = MailStatusDeleted;
NHibernateSession.Update( m );
(2)
After this call to update the mail, the same function also re-renders the updated mailgrid. It does this by querying for the last 5 rows in the table T_Mails (one mailbox page contains only 5 rows!); this is then the current content of the mailbox for the given user.
The problem now is:
Since both of these calls occur in the same webservice function (and thus in the same session/transaction scope), the mail we want to delete (in step 1) is not deleted, since the updated status from NHibernateSession.Update( m ) has not reached the database yet.
If I open the table in MSSQL Management Studio during a debugging session, I can see that the status of the mail is not persisted to the database.
My simple solution for this problem would be:
Doing a manual NHibernateSession.Flush() call between updating the mail and reloading the content from the table?
But: this would break the session-per-conversation pattern, since NHibernateSession.Flush() is supposed to occur at the end of each request; it's not meant to be called manually somewhere in the middle of a running request.
Otherwise I'm really stuck.
You should make 2 ajax calls: the first one to delete the item, and the second one to refresh the list once the first one has finished. Please make sure that you are making the calls in the same order. Below is sample code using jQuery:
$.ajax({
    url: 'Mail/Delete',
    data: { mailId: someId },
    success: function (data) {
        $.ajax({
            url: 'Mail/Refresh',
            success: function (data) {
                // here you should put the new list into the list container
            }
        });
    }
});
