Is there a way to unsubscribe from or change an existing changefeed observer in RethinkDB? Setting the return value of the changes() function to null doesn't seem to do anything; is there an unsubscribe() function?
What I'd ideally like to do is change one of the index filter parameters (favorites) after the changefeed is created, since changefeeds on joins don't work and I have to change the feed whenever the underlying favorites collection changes.
Here is the sample code in JavaScript:
var observer = r.table("users")
  .getAll(r.args(favorites), {index: "name"})
  .changes().then(function(results) {
    results.each(function(err, row) {
      if (err) console.error(err);
      var prefix = row.new_val ? 'added' : 'deleted';
      var msg = row.new_val ? row.new_val : row.old_val;
      console.log(prefix + ': ' + msg.name);
    });
  });
observer = null; // what do I do here to have it stop observing or unsubscribe... or change the subscription to something else, say adding or changing a filter?
I don't know which JS library you are using. With rethinkdb + rethinkdb-pool you can use this syntax:
r.table("users").getAll(r.args(favorites), {index: "name"})
  .changes().run(connection, function(err, cursor) {
    cursor.each(function(err, row) {
      if (err) console.error(err);
      var prefix = row.new_val ? 'added' : 'deleted';
      var msg = row.new_val ? row.new_val : row.old_val;
      console.log(prefix + ': ' + msg.name);
    });
  });
After that you can just close the cursor to stop receiving changes:
cursor.close();
Or you can close connection, and it will automatically close all cursors associated with a connection:
connection.close();
I'm not familiar with the JS driver in particular, but generally you should receive a cursor at some point, and calling close on that cursor will close the changefeed. (I'm not sure in the API above whether observer or results is the cursor, but if you print their types you should be able to see.)
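To change the filter rather than just stop, the usual pattern is to close the old cursor and open a new changefeed with the updated favorites list. A minimal sketch of that bookkeeping (makeFeedSwitcher and openFeed are hypothetical names; openFeed stands in for the .changes().run(...) call and must return something with a close() method):

```javascript
// Hypothetical helper: keeps at most one live changefeed cursor and
// swaps it out whenever the favorites filter changes.
function makeFeedSwitcher(openFeed) {
  var cursor = null;
  return {
    update: function (favorites) {
      if (cursor) cursor.close();   // unsubscribe the old feed
      cursor = openFeed(favorites); // re-subscribe with the new filter
    },
    stop: function () {
      if (cursor) cursor.close();
      cursor = null;
    }
  };
}
```

With the real driver, openFeed would run r.table("users").getAll(r.args(favorites), {index: "name"}).changes() and hand each row to your logging callback.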
Related
I am querying a DynamoDB table and I am getting the results required; however, I can't seem to figure out how to pass the results on for use.
I haven't included the params array, but it's standard. This code lives inside a Lambda.
What I'm trying to achieve is to update the "value" variable with the contents of item[0]['the_data'].
var value = "not changed after the dbase query, why?";
dynamodb.query(queryparams, function (err, data) {
  if (err) {
    console.log("Query Error", err);
  } else {
    if (data.Count > 0) {
      var item = data.Items;
      value = item[0]['the_data'];
      //console.log("The data: " + JSON.stringify(item));
    }
  }
});
Yes,
it shows in CloudWatch, and in the log when I use the test functionality in the Lambda itself.
The only way around it so far has been to create an S3 object, then another gateway and function to return that object's contents.
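The underlying issue is that dynamodb.query is asynchronous: the callback runs on a later tick, after the surrounding code has finished, so value is still unchanged when you read it right away. A sketch with a stubbed query (queryStub is a made-up stand-in for dynamodb.query) showing the timing, plus the usual fix of consuming the result behind a promise:

```javascript
// Stub that behaves like dynamodb.query: it invokes the callback on a
// later tick, exactly like a network call would.
function queryStub(params, callback) {
  setImmediate(function () {
    callback(null, { Count: 1, Items: [{ the_data: 'hello' }] });
  });
}

var value = "not changed after the dbase query, why?";
queryStub({}, function (err, data) {
  if (!err && data.Count > 0) {
    value = data.Items[0]['the_data']; // runs later, on the next tick
  }
});
// Right here the callback has not fired yet, so value is unchanged.

// The fix: wrap the call in a promise and consume the result when it
// actually arrives.
function queryValue(params) {
  return new Promise(function (resolve, reject) {
    queryStub(params, function (err, data) {
      if (err) reject(err);
      else resolve(data.Items[0]['the_data']);
    });
  });
}
```

With the real AWS SDK (v2) you can also call dynamodb.query(params).promise() and await the result inside an async Lambda handler.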
I have a hot Observable fed by a socket. I can use pausable to pause the socket feed. But once I unpause the observable, I need to display the last values the socket may have sent while the subscription was paused. I don't want to track the last values the socket sends manually. How could this be possible?
From the example in the documentation, see comments below:
var pauser = new Rx.Subject();
var source = Rx.Observable.fromEvent(document, 'mousemove').pausable(pauser);

var subscription = source.subscribe(
  function (x) {
    // somehow after pauser.onNext(true)... push the last socket value sent while this was paused...
    console.log('Next: ' + x.toString());
  },
  function (err) {
    console.log('Error: ' + err);
  },
  function () {
    console.log('Completed');
  });

// To begin the flow
pauser.onNext(true);

// To pause the flow at any point
pauser.onNext(false);
You don't even need pausable to do this. (Note as well that you tagged RxJS 5, but pausable only exists in RxJS 4.) You simply need to convert your pauser into a higher-order Observable:
var source = Rx.Observable.fromEvent(document, 'mousemove')
  // Always preserve the last value sent from the source so that
  // new subscribers can receive it.
  .publishReplay(1);

pauser
  // Close old streams (also called flatMapLatest)
  .switchMap(active =>
    // If the stream is active return the source,
    // otherwise return an empty Observable.
    Rx.Observable.if(() => active, source, Rx.Observable.empty())
  )
  .subscribe(/**/);

// Make the stream go live
source.connect();
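The idea can be illustrated without Rx at all. A plain-JS sketch of the same behavior (pausableLast is a made-up name): remember the latest value while paused and replay it on resume, which is essentially what publishReplay(1) plus switching back to the source gives you.

```javascript
function pausableLast(listener) {
  var paused = true, last, hasLast = false;
  return {
    next: function (v) {
      last = v;                    // always remember the newest value
      hasLast = true;
      if (!paused) listener(v);    // deliver immediately when live
    },
    pause: function () { paused = true; },
    resume: function () {
      paused = false;
      if (hasLast) listener(last); // replay what was missed while paused
    }
  };
}
```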
I have an application that uses swig for templating, and sessions to store some client specific data.
I create my sessions inside a route like this:
exports.RenderIndex = function (req, res) {
  if (req.body.filters) {
    var filters = req.body.filters;
    if (filters.date && filters.date.length > 0) {
      var d = moment(filters.date, "YYYY-MM-DD");
      var weekNumber = d.week();
      req.session.selectedDate = filters.date;
      req.session.selectedWeek = weekNumber;
    }
  }
};
Later on in my app I have a swig filter that needs to read and use my session, but I'm not quite sure how to access the session when I have no request object to take it from.
My swig-filter looks something like this:
swig.setFilter('getOpeninghourTable', function (input, idx) {
  var weeknumber = HERE_ID_LIKE_MY_SESSION_VALUE;
  var data = calendar.json(input, weeknumber);
  return swig.renderFile('template.html', data);
});
Is it possible?
No, it's not possible. Why would it be? How would your code know what session to use?
Any request-specific code must be in a request handler (or called from it) so it has a reference to the request data.
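One workaround is to make the filter pure: take the week number as an explicit filter argument and pass it in from the route, where req.session is available. A sketch (the filter body here is a trivial stand-in for the real calendar rendering):

```javascript
// Pure filter: everything it needs arrives as arguments, so it never
// has to reach for request state.
function getOpeninghourTable(input, weeknumber) {
  return 'week ' + weeknumber + ': ' + input.join(', ');
}

// In the real app it would be wired up roughly like this:
//   swig.setFilter('getOpeninghourTable', getOpeninghourTable);
//   // in the route handler, where req exists:
//   res.render('index.html', { hours: rows, week: req.session.selectedWeek });
//   // in the template:
//   {{ hours|getOpeninghourTable(week) }}
```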
So I'm building a multipart form uploader over ajax on node.js, and sending progress events back to the client over socket.io to show the status of their upload. Everything works just fine until I have multiple clients trying to upload at the same time. Originally what would happen is while one upload is going, when a second one starts up it begins receiving progress events from both of the forms being parsed. The original form does not get affected and it only receives progress updates for itself. I tried creating a new formidable form object and storing it in an array along with the socket's session id to try to fix this, but now the first form stops receiving events while the second form gets processed. Here is my server code:
var http = require('http'),
    formidable = require('formidable'),
    fs = require('fs'),
    io = require('socket.io'),
    mime = require('mime'),
    forms = {};

var server = http.createServer(function (req, res) {
  if (req.url.split("?")[0] == "/upload") {
    console.log("hit upload");
    if (req.method.toLowerCase() === 'post') {
      socket_id = req.url.split("sid=")[1];
      forms[socket_id] = new formidable.IncomingForm();
      form = forms[socket_id];
      form.addListener('progress', function (bytesReceived, bytesExpected) {
        progress = (bytesReceived / bytesExpected * 100).toFixed(0);
        socket.sockets.socket(socket_id).send(progress);
      });
      form.parse(req, function (err, fields, files) {
        file_name = escape(files.upload.name);
        fs.writeFile(file_name, files.upload, 'utf8', function (err) {
          if (err) throw err;
          console.log(file_name);
        });
      });
    }
  }
});

var socket = io.listen(server);
server.listen(8000);
If anyone could be of any help on this I would greatly appreciate it. I've been banging my head against my desk for a few days trying to figure this one out, and would really just like to get this solved so that I can move on. Thank you so much in advance!
Can you try putting console.log(socket_id);
after form = forms[socket_id]; and
after progress = (bytesReceived / bytesExpected * 100).toFixed(0);, please?
I get the feeling that you might have to wrap that socket_id in a closure, like this:
form.addListener(
'progress',
(function(socket_id) {
return function (bytesReceived, bytesExpected) {
progress = (bytesReceived / bytesExpected * 100).toFixed(0);
socket.sockets.socket(socket_id).send(progress);
};
})(socket_id)
);
The problem is that you aren't declaring socket_id and form with var, so they're actually global.socket_id and global.form rather than local variables of your request handler. Consequently, separate requests step over each other since the callbacks are referring to the globals rather than being proper closures.
rdrey's solution works because it bypasses that problem (though only for socket_id; if you were to change the code in such a way that one of the callbacks referenced form you'd get in trouble). Normally you only need to use his technique if the variable in question is something that changes in the course of executing the outer function (e.g. if you're creating closures within a loop).
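The effect is easy to reproduce in isolation. A sketch (handleRequest is a made-up stand-in for the request handler) using a module-level variable to play the role of the accidental global:

```javascript
var socket_id;      // shared binding, like the accidental global
var callbacks = [];

function handleRequest(id) {
  socket_id = id;                                    // every request overwrites it
  callbacks.push(function () { return socket_id; }); // callback reads it later
}

handleRequest('a');
handleRequest('b');
// Both callbacks now see the last request's id: ['b', 'b']

// With a proper local, each callback keeps its own id:
var fixed = [];
function handleRequestFixed(id) {
  var local_id = id;                                 // one binding per call
  fixed.push(function () { return local_id; });
}
handleRequestFixed('a');
handleRequestFixed('b');
// → ['a', 'b']
```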
I have an aspx page on which I am using the XDomainRequest object to populate two divs with HTML returned from an AJAX response.
I have used jQuery to get the divs and call each() on the retrieved list:
var divs = $("div");
divs.each(function (index) {
  if (window.XDomainRequest) {
    xdr = new XDomainRequest();
    if (xdr) {
      xdr.onload = function () {
        alert("XDR Response - " + xdr.responseText);
        var currentDivID = divs[index].attributes["id"].value;
        var selectedDiv = $("div[id='" + currentDivID + "']");
        if (xdr.responseText == '') selectedDiv.attr("style", "display:none;");
        else selectedDiv.append(xdr.responseText);
      };
      xdr.open("GET", xdrUrl);
      try {
        xdr.send();
      } catch (e) {
        alert(e);
      }
    } else {
      alert('Create XDR failed.');
    }
  } else {
    alert('XDR not found on window object.');
  }
});
Now, what's happening is this: I have two divs on the page with different IDs, and when this code runs in $.ready(function(){}), both requests are sent to the server asynchronously and processed.
The result is:
1. Sometimes the onload handler gets the response for the second div in both divs' results.
2. IE sends only one request to the server (I am using Fiddler to see what requests are sent).
Can anybody tell me what's wrong with the code? As far as I know XDR does not support synchronous calls, and asynchronous calls are giving me wrong results. Any workaround/tip for this problem?
I solved the issue myself when I spotted a mistake in my code:
xdr = new XDomainRequest();
should be
var xdr = new XDomainRequest();
For point 2, I added a "Cache-Control: no-cache" header to my response and that solved the matter.
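The reason the var fixes point 1: without it, xdr is a single global, so by the time either onload fires it refers to whichever request was created last. With var, each pass through each() gets its own binding. A reduced sketch of the fixed behavior (plain forEach standing in for jQuery's each, and a plain object standing in for XDomainRequest):

```javascript
var handlers = [];
['first', 'second'].forEach(function (id) {
  var xdr = { responseText: id };    // declared locally: one per iteration
  handlers.push(function () {        // like onload, runs after the loop ends
    return xdr.responseText;
  });
});
// Each handler sees its own request: ['first', 'second'].
// Without the var, both would report 'second'.
```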