I've been trying to find out how to subscribe to Redis keyspace notifications for key removal using the ServiceStack.Redis library.
Looking through the available tests on GitHub and other websites, I found that IRedisSubscription can be used to subscribe to specific Redis key events. For the set operation it works absolutely fine, but for the delete operation the action is never invoked.
Is it possible to take advantage of this Redis feature using ServiceStack.Redis and get an event on the delete operation too?
In the configuration file I have added this line:
notify-keyspace-events KEAg
I am using the following code.
var channels = new[] { "__keyevent@0__:set", "__keyevent@0__:del" };

using (var redisConsumer = new RedisClient("localhost:6379"))
using (var subscription = redisConsumer.CreateSubscription())
{
    subscription.OnMessage = onKeyChange;
    subscription.SubscribeToChannelsMatching(channels);
}
On the surface, it looks like what you have should work.
Try setting notify-keyspace-events to AKE; the g is redundant, as noted in the notifications config:
A Alias for g$lshztxe, so that the "AKE" string means all the events.
Try using SubscribeToChannels instead of SubscribeToChannelsMatching; the latter is for pattern subscriptions (see the sketch after these suggestions).
You can test how many subscribers you have with the PUBSUB NUMSUB __keyevent@0__:del command from redis-cli.
Try testing that your events are being triggered with SUBSCRIBE __keyevent@0__:del from redis-cli. This will help you determine whether the problem is on the redis-server side or in the app code.
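For example (untested), the subscription from the question switched to a channel subscription:

using (var redisConsumer = new RedisClient("localhost:6379"))
using (var subscription = redisConsumer.CreateSubscription())
{
    subscription.OnMessage = onKeyChange;
    // the channel names are literal, so no pattern matching is needed
    subscription.SubscribeToChannels("__keyevent@0__:set", "__keyevent@0__:del");
}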
Please update the question with results if you can't get it to work after trying the above.
I'm struggling with a situation where I want to emit a query update via the QueryUpdateEmitter, but from a different module (microservice). I have an application built upon microservices, and both services are connected to the same Axon Server. The first service creates a subscriptionQuery and sends some commands. After a while (through a few commands and events) the second service handles some event and emits an update for the query subscribed earlier. Unfortunately, it seems this emit never reaches the subscriber. The queries are exactly the same and sit in the same packages.
Subscription:
@GetMapping("/refresh")
public Mono<MovieDTO> refreshMovies() {
    commandGateway.send(
        new CreateRefreshMoviesCommand(UUID.randomUUID().toString()));
    SubscriptionQueryResult<MovieDTO, MovieDTO> refreshedMoviesSubscription =
        queryGateway.subscriptionQuery(
            new GetRefreshedMoviesQuery(),
            ResponseTypes.instanceOf(MovieDTO.class),
            ResponseTypes.instanceOf(MovieDTO.class)
        );
    return refreshedMoviesSubscription.updates().next();
}
Emitter:
@EventHandler
public void handle(DataRefreshedEvent event) {
    log.info("[event-handler] Handling {}, movieId={}",
        event.getClass().getSimpleName(),
        event.getMovieId());
    queryUpdateEmitter.emit(GetRefreshedMoviesQuery.class, query -> true,
        Arrays.asList(
            MovieDTO.builder().aggregateId("as").build(),
            MovieDTO.builder().aggregateId("be").build()));
}
Is this even possible in the newest version of Axon? A similar configuration within one service works as expected.
Edit:
I have found a workaround for this situation:
The second service, instead of emitting the update via the QueryUpdateEmitter, publishes an event with the list of movies.
The first service handles this event and then emits the update via its own QueryUpdateEmitter.
But I'd still like to know if there is a way to do this using queries only, because it seems natural to me (commandGateway/eventGateway work as expected; the QueryUpdateEmitter is the exception).
This follows from the implementation of the QueryUpdateEmitter (regardless of whether Axon Server is used).
The QueryUpdateEmitter stores a set of update handlers referencing the issued subscription queries. However, it only maintains the subscription queries issued within its own JVM (the QueryUpdateEmitter implementation is not distributed).
Its intent is to be paired with the component (typically a query model "projector") that answers queries about a given model, updates the model, and emits those updates.
Hence, placing the QueryUpdateEmitter operations in a different (micro)service from the one where the query is handled will not work.
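For illustration, a minimal sketch of the arrangement that does work, following the workaround from the question's edit (MoviesProjection, MoviesRefreshedEvent, and getMovies() are hypothetical names): the first service both answers the query and, on receiving the event published by the second service, emits updates from the same JVM:

import org.axonframework.eventhandling.EventHandler;
import org.axonframework.queryhandling.QueryHandler;
import org.axonframework.queryhandling.QueryUpdateEmitter;
import org.springframework.stereotype.Component;

// First service: the @QueryHandler and the QueryUpdateEmitter live in the
// same JVM, so emitted updates reach the locally registered subscriptions.
@Component
public class MoviesProjection {

    private final QueryUpdateEmitter queryUpdateEmitter;

    public MoviesProjection(QueryUpdateEmitter queryUpdateEmitter) {
        this.queryUpdateEmitter = queryUpdateEmitter;
    }

    @QueryHandler
    public MovieDTO handle(GetRefreshedMoviesQuery query) {
        // initial response for the subscription query
        return MovieDTO.builder().aggregateId("initial").build();
    }

    // MoviesRefreshedEvent is the event the second service publishes
    // (step 1 of the workaround); its getMovies() accessor is assumed.
    @EventHandler
    public void on(MoviesRefreshedEvent event) {
        event.getMovies().forEach(movie ->
            queryUpdateEmitter.emit(GetRefreshedMoviesQuery.class,
                query -> true,
                movie));
    }
}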
Does the Azure Service Bus SubscriptionClient support using an OnMessage action when the subscription requires a session?
I have a subscription called "TestSubscription". It requires a session and contains multipart data that is tied together by a SessionId.
if (!namespaceManager.SubscriptionExists("TestTopic", "Export"))
{
    var testRule = new RuleDescription
    {
        Filter = new SqlFilter(@"(Action='Export')"),
        Name = "Export"
    };
    var subDesc = new SubscriptionDescription("DataCollectionTopic", "Export")
    {
        RequiresSession = true
    };
    namespaceManager.CreateSubscription(subDesc, testRule);
}
In a separate project I have a Service Bus monitor and a worker role, and in the worker role I have a SubscriptionClient called "testSubscriptionClient":
testSubscriptionClient = SubscriptionClient.CreateFromConnectionString(connectionString, _topicName, CloudConfigurationManager.GetSetting("testSubscription"), ReceiveMode.PeekLock);
I would then like to have OnMessage triggered when new items are placed in the service bus queue:
testSubscriptionClient.OnMessage(PersistData);
However, I get the following message when I run the code:
InvalidOperationException: It is not possible for an entity that requires sessions to create a non-sessionful message receiver
I am using Azure SDK v2.8.
Is what I am looking to do possible? Are there specific settings I need to make in my Service Bus monitor, subscription client, or elsewhere that would let me retrieve messages from the subscription in this manner? As a side note, this approach works perfectly in other cases where I am not using session-based data.
Can you try this code:
var messageSession=testSubscriptionClient.AcceptMessageSession();
messageSession.OnMessage(PersistData);
instead of this:
testSubscriptionClient.OnMessage(PersistData);
Edit:
Also, you can register a handler to process sessions (RegisterSessionHandler). It will fire your handler for each new session.
I think this is more suitable for your problem.
He shows both ways in this article. It's written for queues, but I think you can apply it to topics as well.
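As a rough sketch of that route (untested; the types are from the classic Microsoft.ServiceBus.Messaging SDK, so treat the exact signatures as an assumption):

using System;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// A session handler: the client creates an instance per accepted session.
class PersistDataSessionHandler : IMessageSessionAsyncHandler
{
    public async Task OnMessageAsync(MessageSession session, BrokeredMessage message)
    {
        PersistData(message);          // call into your existing handler
        await message.CompleteAsync();
    }

    public Task OnCloseSessionAsync(MessageSession session)
    {
        return Task.FromResult(0);
    }

    public Task OnSessionLostAsync(Exception exception)
    {
        return Task.FromResult(0);
    }
}

// Registered on the sessionful subscription client:
testSubscriptionClient.RegisterSessionHandler(
    typeof(PersistDataSessionHandler),
    new SessionHandlerOptions { AutoComplete = false });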
I have 1000 to 10,000 keys stored in Redis, their value type is list.
When a new item is added to any one of the existing lists, I need my golang program to be notified.
Once notification is received I need to spawn a new goroutine and perform a small operation.
I am using redigo for the Redis connection pool.
What is the best approach to solve this problem, without overloading the Redis instance?
You can enable Redis' keyspace notifications and subscribe to the relevant events on the keys/patterns that interest you.
More details can be found in the documentation: http://redis.io/topics/notifications
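For example, a minimal redigo sketch (assuming db 0, and that notify-keyspace-events is set to include keyevent list notifications, e.g. "El"):

package main

import (
    "fmt"

    "github.com/gomodule/redigo/redis"
)

func main() {
    c, err := redis.Dial("tcp", "localhost:6379")
    if err != nil {
        panic(err)
    }
    defer c.Close()

    psc := redis.PubSubConn{Conn: c}
    // __keyevent@0__:rpush and :lpush fire whenever an item is pushed onto
    // any list in db 0; the message payload is the name of the key.
    if err := psc.PSubscribe("__keyevent@0__:rpush", "__keyevent@0__:lpush"); err != nil {
        panic(err)
    }

    for {
        switch v := psc.Receive().(type) {
        case redis.Message:
            // spawn a goroutine per notification for the small operation
            go handleListUpdate(string(v.Data))
        case error:
            panic(v)
        }
    }
}

func handleListUpdate(key string) {
    fmt.Println("list updated:", key)
}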
I haven't tried this, but as speculation: I would use Redis's Lua scripting to implement two new commands, MSR_PUSH and MSR_POP, which would do the following, respectively:
-- MSR_PUSH
redis.call("RPUSH", KEYS[1], ARGV[1])
redis.call("PUBLISH", "notify", KEYS[1])
and:
-- MSR_POP
local v = redis.call("LPOP", KEYS[1])
if v then
    redis.call("PUBLISH", "notify", KEYS[1])
end
return v
So these Lua scripts update the lists as you normally would, but then also publish the name of the updated key to the notify pub/sub channel, which a watching script (your golang program) can then act on. You could also just push to another queue and long-poll that.
This link has more information on Lua with Redis: https://www.redisgreen.net/blog/intro-to-lua-for-redis-programmers/
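As a hedged follow-on to the redigo sketch above, the push script could also be wrapped with redigo's Script helper (key and value names are placeholders):

// msrPush wraps the MSR_PUSH script; 1 is the number of KEYS it expects.
var msrPush = redis.NewScript(1, `
    redis.call("RPUSH", KEYS[1], ARGV[1])
    redis.call("PUBLISH", "notify", KEYS[1])
`)

// usage, with c being a redis.Conn taken from your redigo pool:
//   _, err := msrPush.Do(c, "mylist", "some value")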
I'm looking to develop a chat application with PubNub where I want to make sure all the chat messages that are sent are stored in the database, while still being able to send messages in the chat.
I found out that I can use Parse with PubNub to provide storage options, but I'm not sure how to set up the two so that the messages and images sent in the chat are stored in the database.
Has anyone done this before with PubNub and Parse? Are there any other easy options available to use with PubNub instead of Parse?
Sutha,
What you are seeking is not a trivial solution unless you are talking about a limited number of end users. So I wouldn't say there are "easy" solutions, but there are solutions.
The reason is that your server would need to listen (subscribe) to every active chat channel and store the messages being sent into your database. Imagine your app scaling to 1 million users (it doesn't even need to get that big, but that number should help you realize how tricky this is to scale: several server instances listening to channels in a non-overlapping manner, or with overlap but using a server queue and de-duping messages).
That said, yes, there are PubNub customers that have implemented such a solution; Parse is not the key to making this happen, by the way.
You have three basic options for implementing this:
Option 1: Implement a solution that allows many instances of your server to subscribe to all of the channels as they become active and store the messages as they come in. There are a lot of details to making this happen, so if you are not up for that, this is not likely where you want to go.
Option 2: Monitor all channels that become active or inactive with PubNub Presence webhooks (enable Presence on your keys). You would use this to keep a list of all channels from which your server pulls history (enable Storage & Playback on your keys) in an on-demand (not completely realtime) fashion.
For every channel that goes active or inactive, your server receives these events via a REST call (an endpoint that you implement on your server, your Parse server in this case):
channel active: record a "start chat" timetoken in your Parse db
channel inactive: record an "end chat" timetoken in your Parse db
the inactive event kicks off a process that uses the start/end timetokens you recorded for that channel to fetch its history from PubNub: pubnub.history({channel: channelName, start: startTT, end: endTT})
you will need to iterate on this history call until you receive fewer than 100 messages (100 is the max you can retrieve at a time); a sketch of this loop follows the list
as you retrieve these messages, you save them to your Parse db
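For illustration, a hedged sketch of that loop (drainHistory and saveToParse are made-up names, and exact pagination parameters vary by SDK version):

// drain history between the recorded timetokens, 100 messages per page
function drainHistory(channelName, startTT, endTT) {
    pubnub.history({
        channel: channelName,
        start: startTT,
        end: endTT,
        callback: function(response) {
            var messages = response[0];
            messages.forEach(saveToParse); // write each message to your Parse db
            if (messages.length === 100) {
                // a full page means there may be more;
                // continue from the last returned timetoken
                drainHistory(channelName, response[2], endTT);
            }
        }
    });
}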
New Presence Webhooks have been added:
We now have webhooks for all presence events: join, leave, timeout, state-change.
Option 3: Finally, you could just save each message to the Parse db on success of every pubnub.publish call. I am not a Parse expert and barely know all of its capabilities, but I believe they have some sort of store-locally-then-sync-to-cloud-db option (like StackMob when that was a product); even if not, you can save the msg to the Parse cloud db directly.
The code would look something like this (not complete, likely with errors; figure it out or ask PubNub support for details) in your JavaScript client (in the browser).
var pubnub = PUBNUB({
    publish_key   : your_pub_key,
    subscribe_key : your_sub_key
});

var msg = ... // get the message from your UI text box or whatever

pubnub.publish({
    // this is some variable you set up when you enter a chat room
    channel: chat_channel,
    message: msg,
    callback: function(event) {
        // DISCLAIMER: code pulled from a Parse example,
        // but there are some object creation details
        // left out here and the msg object is not
        // fully fleshed out in this sample code
        var ChatMessage = Parse.Object.extend("ChatMessage");
        var chatMsg = new ChatMessage();
        chatMsg.set("message", msg);
        chatMsg.set("user", uuid);
        chatMsg.set("channel", chat_channel);
        chatMsg.set("timetoken", event[2]);
        // this ChatMessage object can be
        // whatever you want it to be
        chatMsg.save();
    },
    error: function(error) {
        // Handle error here, like retry until success, for example
        console.log(JSON.stringify(error));
    }
});
You might even store the entire set of publishes (on both ends of the conversation) based on a time interval, number of publishes, or total data size, but be careful: either user could exit the chat and the browser without notice and you would fail to save. So the per-publish save is probably best practice, if a bit noisy.
I hope you find one of these techniques as a means to get started in the right direction. There are details left out so I expect you will have follow up questions.
Just some other links that might be helpful:
http://blog.parse.com/learn/building-a-killer-webrtc-video-chat-app-using-pubnub-parse/
http://www.pubnub.com/blog/realtime-collaboration-sync-parse-api-pubnub/
https://www.pubnub.com/knowledge-base/discussion/293/how-do-i-publish-a-message-from-parse
And we have a PubNub Parse SDK, too. :)
Is there a way to send events from the server to all or some clients without using collections?
I want to send events with some custom data to clients. While Meteor is very good at doing this with collections, in this case the added complexity and storage isn't needed.
On the server there is no need for Mongo storage or local collections.
The client only needs to be alerted that it received an event from the server and act accordingly to the data.
I know this is fairly easy with SockJS, but it's very difficult to access SockJS from the server.
Meteor.Error does something similar to this.
Note: the package below is now deprecated and does not work for Meteor versions > 0.9.
You can use the following package, which was originally aimed at broadcasting messages client-to-server-to-client:
http://arunoda.github.io/meteor-streams/
No collections and no MongoDB behind it; usage is as follows (not tested):
stream = new Meteor.Stream('streamName'); // defined on client and server side
if(Meteor.isClient) {
stream.on("channelName", function(message) {
console.log("message:"+message);
});
}
if(Meteor.isServer) {
setInterval(function() {
stream.emit("channelName", 'This is my message!');
}, 1000);
}
You should use Collections.
The "added complexity and storage" isn't a factor if all you do is create a collection, add a single property to it and update that.
Collections are just a shape for data communication between server and client, and they happen to build on mongo, which is really nice if you want to use them like a database. But at their most basic, they're just a way of saying "I want to store some information known as X", which hooks into the publish/subscribe architecture that you should want to take advantage of.
In the future, other databases will be exposed in addition to Mongo. I could see there being a smart package at some stage that strips Collections down to their most basic functionality like you're proposing. Maybe you could write it!
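For illustration, a bare-bones sketch of that idea (all names made up), a collection used purely as an event pipe:

var ServerEvents = new Meteor.Collection('serverEvents');

if (Meteor.isServer) {
  Meteor.publish('serverEvents', function () {
    return ServerEvents.find();
  });
  // "emit" an event by inserting a document:
  // ServerEvents.insert({ type: 'ping', data: 'hello', at: new Date() });
}

if (Meteor.isClient) {
  Meteor.subscribe('serverEvents');
  ServerEvents.find().observe({
    added: function (doc) {
      // react to the event's custom data
      console.log('event from server:', doc.type, doc.data);
    }
  });
}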
I feel for @Rui: using a Collection just to send a message feels cumbersome.
At the same time, once you have several such messages to send around, it is convenient to have a Collection named something like settings where you keep them.
The best package I have found is Streamy. It allows you to send to everybody, or to just one specific user.
https://github.com/YuukanOO/streamy
meteor add yuukan:streamy
Send message to everybody:
Streamy.broadcast('ddpEvent', { data: 'something happened for all' });
Listen for message on client:
// Attach a handler for a specific message
Streamy.on('ddpEvent', function(d, s) {
console.log(d.data);
});
Send a message to one user (by id):
var socket = Streamy.socketsForUsers(["nJyQvECmkBSXDZEN2"])._sockets[0]
Streamy.emit('ddpEvent', { data: 'something happened for you' }, socket);