OpenDaylight: How to get notified when a switch connects

I am working with OpenDaylight Carbon. I want to write a DataTreeChangeListener that gets notified when a switch connects. I tried modeling this on the LearningSwitch example in the OpenFlow plugin, but I do not see any notification when Mininet connects over OpenFlow.
Any pointers on how to do this would be greatly appreciated.
Thanks,

Have a look at this; we use this class to add default flows when a DPN (datapath node) gets connected.
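The class referenced above isn't reproduced here, but the general shape in the Carbon-era MD-SAL binding API is to register a DataTreeChangeListener on the operational inventory (Nodes/Node) and treat a newly written Node as a switch connect. The following is an uncompiled sketch against that API (imports omitted; package names and the exact DataTreeIdentifier constructor can differ between releases), not a drop-in class:

```java
// Sketch only: assumes the Carbon MD-SAL binding API and an injected DataBroker.
public class SwitchConnectListener implements DataTreeChangeListener<Node> {

    public SwitchConnectListener(DataBroker dataBroker) {
        InstanceIdentifier<Node> nodeIid = InstanceIdentifier
                .create(Nodes.class)
                .child(Node.class);
        // Listen on the OPERATIONAL datastore: connected switches appear
        // there; CONFIGURATION only holds intended state.
        dataBroker.registerDataTreeChangeListener(
                new DataTreeIdentifier<>(LogicalDatastoreType.OPERATIONAL, nodeIid),
                this);
    }

    @Override
    public void onDataTreeChanged(Collection<DataTreeModification<Node>> changes) {
        for (DataTreeModification<Node> change : changes) {
            DataObjectModification<Node> root = change.getRootNode();
            // A WRITE with no previous data means the node just appeared,
            // i.e. a switch connected.
            if (root.getModificationType() == DataObjectModification.ModificationType.WRITE
                    && root.getDataBefore() == null) {
                // push default flows here
            }
        }
    }
}
```

A common reason for seeing no notifications is registering against the CONFIGURATION datastore instead of OPERATIONAL.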

Related

Capture Windchill events

I am using Windchill 11.1 M020. What is the best way to capture events from Windchill?
For context: I have a third-party Java application that runs on a different host than Windchill, and I would like it to be triggered when check-in events, version changes, or any other events happen.
I did some research, and here is what I found:
We can capture events through a custom service listener, but this method is not clean enough, since we need to develop custom service code and run it (on an assigned port) inside the Windchill container.
We can capture Windchill events through the Windchill ESI service and Info*Engine, but I am not sure how to configure ESI to listen for events and publish them to a broker, for example an MQ broker. I do not want to use EMS, to avoid any licence cost.
Any recommendations for capturing events and publishing them to a messaging broker?
Thank you.
The only way I know to capture events from Windchill is to implement a listener. In Windchill you can implement a service that is notified when objects change state, are checked in, and so on.
As a Windchill service, your code runs in-process in the Windchill method server, so you have to devise some way to communicate what happened in Windchill to the outside. You could call a web service or a REST endpoint, write to a shared log file, or something like that.
You can look at this PTC Community thread (Windchill Discussions) to start digging into Windchill listeners:
https://community.ptc.com/t5/Windchill/How-to-implement-listeners/td-p/674877
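To illustrate the "REST call" option, here is a minimal, self-contained sketch of the forwarding side: the listener code (running inside the method server) serializes the event and POSTs it to an external endpoint, which can then relay it to your broker. The event fields, the `EventForwarder` name, and the endpoint URL are all hypothetical, not Windchill API:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EventForwarder {
    /** Builds the JSON body the listener would post for a Windchill event. */
    static String toJson(String eventType, String objectId) {
        return String.format("{\"event\":\"%s\",\"object\":\"%s\"}", eventType, objectId);
    }

    /** Posts the event to an external, broker-facing endpoint (hypothetical URL). */
    static void forward(String endpoint, String json) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
        HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.discarding());
    }

    public static void main(String[] args) {
        // Only the payload is built here; forward() would need a live endpoint.
        System.out.println(toJson("CHECKIN", "wt.part.WTPart:12345"));
    }
}
```

The actual event types and object identifiers come from the Windchill listener callbacks; this only shows how the out-of-process hand-off could look.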

Kafka Streams Add New Source to Running Application

Is it possible to add another source topic to the existing topology of a running Kafka Streams Java application? Based on the Javadoc (https://kafka.apache.org/23/javadoc/org/apache/kafka/streams/KafkaStreams.html) I am guessing the answer is no.
My use case:
A REST API call signals that a new source topic should be processed by an existing processor. Source topics are stored in a DB and used to generate the topology.
I believe the only option is to shut down the app and restart it, allowing the new topic to be picked up.
Is there any option to add the source topic without shutting down the app?
You cannot modify the program while it is running. As you point out, to change anything you need to stop the program and create a new Topology. Depending on your program and the change, you might also need to reset the application before restarting it. Cf. https://docs.confluent.io/current/streams/developer-guide/app-reset-tool.html
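One partial workaround, if the new topics can be made to follow a naming convention: `StreamsBuilder.stream(Pattern)` subscribes the source to a topic regex rather than a fixed list, and Kafka Streams then picks up newly created matching topics without a restart (once the consumer refreshes its metadata). The topic names below are hypothetical; this snippet only demonstrates the pattern matching itself, not a running Streams app:

```java
import java.util.regex.Pattern;

public class PatternSource {
    public static void main(String[] args) {
        // In the Streams app the source would be declared as:
        //   builder.stream(Pattern.compile("orders-.*"))
        // Every topic matching the pattern is consumed, including
        // topics created after the application started.
        Pattern sourcePattern = Pattern.compile("orders-.*");
        System.out.println(sourcePattern.matcher("orders-eu").matches());   // existing topic
        System.out.println(sourcePattern.matcher("orders-apac").matches()); // topic added later
        System.out.println(sourcePattern.matcher("returns-eu").matches());  // not a source
    }
}
```

This only helps when the set of future topics is expressible as a pattern; arbitrary topic names pulled from a DB still require the stop/rebuild/restart cycle described above.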

Does RocketMQ support master-slave auto switch?

Does RocketMQ support master-slave auto switch?
I have tried it in v3.5.8, but it does not work. So I just want to ask someone who can give an accurate answer.
No, RocketMQ does not support this for sending messages.
However, a slave can take over for pulling messages when the master is out of service, and automatic switching to the slave for reads has been implemented.
So the problem only exists in the scenario of sending messages. If you need high availability for writes, deploy more masters with different broker group names in the cluster.
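To illustrate the multi-master layout described above, here is a sketch of the two brokers' broker.conf files. The broker names are examples; the keys (`brokerClusterName`, `brokerName`, `brokerId`, `brokerRole`) are standard broker.conf settings:

```properties
# broker-a/broker.conf
brokerClusterName=DefaultCluster
brokerName=broker-a
brokerId=0
brokerRole=ASYNC_MASTER

# broker-b/broker.conf
brokerClusterName=DefaultCluster
brokerName=broker-b
brokerId=0
brokerRole=ASYNC_MASTER
```

With two masters in the same cluster, producers keep writing to the surviving master if one fails; messages already stored on the failed master only become readable again once it (or its slave) comes back.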
Newer releases support this: RocketMQ 5.0.0 (and the DLedger mode introduced in 4.5.0) provide automatic master-slave switchover.

Push and pull with Parse

So what I need is a kind of push-and-pull web service mechanism: certain devices will be sending data to my Parse backend, and some others should be able to receive the newly added data as it is being added. Think of it as a restaurant environment where customers send their orders via their phones and the restaurant manager receives the orders on his PC in real time.
I know I can use push notifications, but I want to target specific users (in this case, the manager alone). I guess I could have a dedicated push notification channel to which only the manager is subscribed, but I am not sure whether I can send proper JSON data in bulk or just simple strings. Maybe there is a smarter way of going about it.
Any suggestions?
Many thanks,
Polis
You can use Parse Cloud Code for this purpose. Certain devices (you can differentiate them in the cloud or on the client side) call a cloud method, and that cloud method makes an HTTP request to your server (the manager's real-time PC). From there, your server side can deliver the incoming message to your manager in real time. In this solution I assume you have your own server for web users (like the manager) and a mobile application for client users (the customers).
Hope this gives you an idea. Regards.
You can use push notifications for this purpose; in my opinion that would be your best option.
When registering for push notifications on the client side, you can set an owner column to a user pointer. Then, when sending a push notification from one user to another, you can query the Installation class for the other user's pointer. You can send the push notification either from the client side or from Cloud Code in an afterSave trigger. Cloud Code is the better option.
The downside of this approach is that if the other user did not allow push notifications, the push would fail. The second user would still be able to get the data when they open the app, but would not get push notifications.
(I built a chat app using this approach on Parse.com.)
You don't need a complicated channel setup. Before you save your installation, add a line like this:
[installation setObject:[PFUser currentUser] forKey:@"owner"];
[installation saveInBackground]; // ... completion or whatever
Then, just query:
PFQuery *installationQuery = [PFInstallation query];
[installationQuery whereKey:@"owner" equalTo:userImLookingFor];
Then send with PFPush against that query, e.g. [PFPush sendPushMessageToQueryInBackground:installationQuery withMessage:message].
(I'm typing from memory, so some of these might need to be slightly tweaked.)

Hadoop Implement a status callback

I am looking for a clean way to implement a Java event system that hooks into Hadoop v2. I know there is a notification URL, and I have used that in the past. What I want to do is hook into the JobStatus and have events posted to a queue service for propagating events to clients. I tried extending Job and assigning my custom JobStatus class to the status field using reflection, but this is not working. I have also looked cursorily into YARN's event system, to add a hook that would allow me to listen for YARN events and propagate those. I really need an expert opinion on how to accomplish this kind of task. I want to get log messages and status change events in real time over to a web client.
Thanks for any assistance in advance.
