Where does OpenDaylight store flow rules, and in what format?

I wish to know where the OpenDaylight controller stores flow rules (my research so far suggests it is in the MD-SAL data store); is that right? And in what format is this data store kept: a database? An XML file?

I am using the OpenDaylight Beryllium version, and the data store content that I can see is JSON.
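If it helps to see that format, flows pushed into the MD-SAL config datastore can be read back over RESTCONF. This is a minimal sketch, assuming Beryllium's default RESTCONF port 8181, the default admin:admin credentials, an OpenFlow node named openflow:1 and a flow with id 1 in table 0 (adjust all of these to your own setup):

# Ask for JSON; use "Accept: application/xml" to get the same data as XML
curl -u admin:admin \
     -H "Accept: application/json" \
     http://localhost:8181/restconf/config/opendaylight-inventory:nodes/node/openflow:1/table/0/flow/1

The datastore itself is an in-memory data tree described by YANG models; JSON and XML are just the representations RESTCONF uses when you read from it or write to it.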

Related

How to store data on iCloud in Xamarin.Forms iOS

[screenshot of the data]
The data shown in the screenshot is what I want to store in iCloud on iOS. How can I do that? Is it possible to store it in iCloud? Please give me some code for that.
To save the "MyFun item" data, it is recommended to serialize the data and save it to a json file or an xml file.
For JSON serialization, you can install the Nuget package Newtonsoft.Json. For more info, you can refer to this document.
Or XML serialization, you can create an instance of XmlSerializer Class to serializes and deserializes objects into and from XML document.
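For instance, a minimal JSON sketch with Newtonsoft.Json could look like the following (MyFunItem and its properties are placeholders for your own data class):

using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

public class MyFunItem
{
    public string Title { get; set; }
    public string Notes { get; set; }
}

public static class MyFunStorage
{
    // Serialize the items to indented JSON text.
    public static string ToJson(List<MyFunItem> items) =>
        JsonConvert.SerializeObject(items, Formatting.Indented);

    // Write the JSON to a local file; the same string can also be handed
    // to the iCloud document sample mentioned below.
    public static void SaveTo(string path, List<MyFunItem> items) =>
        File.WriteAllText(path, ToJson(items));

    // Read the items back from JSON text.
    public static List<MyFunItem> FromJson(string json) =>
        JsonConvert.DeserializeObject<List<MyFunItem>>(json);
}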
Here are some related SO threads you can refer to.
How to write a JSON file in C#?
Serialize an object to XML
Once you have the serialized data, as mentioned in the comment, you can download the official demo Xamarin.iOS - Introduction to iCloud and then pass the data to MonkeyDocument.DocumentString.

Parse Server - Where does it store uploaded files

I am new to Parse Server (implementing it on Heroku and locally).
I have a basic question: when I upload a file using the ParseFile class, it provides me with a URL and a file object. Where is this file being stored?
Is it being stored physically on a file system? Or in MongoDB?
Thank you!
I found a collection in MongoDB named fs.files. The files I uploaded were located there. I assume the Parse URL is generated as a redirect.
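For reference, you can confirm this from the mongo shell; the collection names below assume Parse Server's GridFS adapter is using the default fs prefix:

// one metadata document per uploaded file (name, length, upload date, ...)
db.fs.files.find().pretty()

// the binary content, split into chunks that point back to fs.files via files_id
db.fs.chunks.find({}, { data: 0 }).limit(5)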

How to register a 'DataSourceFactory' with GeoServer?

If I have written my own DataSourceFactory, how can GeoServer recognize it? As far as I know, if we register it with org.geotools.data.DataStoreFactorySpi, GeoServer recognizes our data source automatically, but I don't know how to do that registration. I am planning to create a Java (Maven) project.
If you are implementing a store you can follow the tutorial at:
http://docs.geotools.org/stable/userguide/tutorial/datastore/index.html
(GeoServer expects to find GeoTools data stores).
In particular, the answer to your specific question is at the bottom of this page: you have to register the factory in a META-INF/services/org.geotools.data.DataStoreFactorySpi file:
http://docs.geotools.org/stable/userguide/tutorial/datastore/source.html
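Concretely, the registration file is a plain text file bundled inside your jar whose name is the SPI interface and whose content is the fully qualified name of your implementation; the class name and Maven path here are just placeholders for your own project:

File:    src/main/resources/META-INF/services/org.geotools.data.DataStoreFactorySpi
Content: com.example.geo.MyDataStoreFactory

Once the jar (plus any extra dependencies) is copied into GeoServer's WEB-INF/lib directory and GeoServer is restarted, the factory should be picked up automatically and your store should appear in the list of available data sources.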

How to delete documents from the index in a custom Java connector

I have built a custom connector that gets data from a web service and then indexes it. The web service response returns only the data to be indexed.
I want to delete documents from the index that are not part of the web service response in the current crawl but were added to the index during the last crawl.
Is there any way to achieve this, or can I flush the full index programmatically in the connector code and then add the recent content to the index?
Marged is correct. A feed (which is what the connector can send to the GSA) of type full will purge the existing feed and replace it. Otherwise, your connector is going to have to manage state and prune out documents itself, as you described.
Thanks Marged and Michael for the help. I guess I have to write custom logic in the connector to delete the data from the index.
What you're trying to achieve is exactly what happens when you send a "full" content feed. This is from the documentation:
When the feedtype element is set to full for a content feed, the system deletes all the prior URLs that were associated with the data source. The new feed contents completely replace the prior feed contents. If the feed contains metadata, you must also provide content for each record; a full feed cannot push metadata alone. You can delete all documents in a data source by pushing an empty full feed.
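As a rough sketch of that last point, an empty full feed that purges everything previously pushed for a data source looks roughly like this (my_connector_source is a placeholder, and the XML is posted to the GSA's feedergate, normally on port 19900):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE gsafeed PUBLIC "-//Google//DTD GSA Feeds//EN" "gsafeed.dtd">
<gsafeed>
  <header>
    <datasource>my_connector_source</datasource>
    <feedtype>full</feedtype>
  </header>
  <!-- no records: an empty full feed deletes all documents in this data source -->
  <group/>
</gsafeed>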
Marged is correct that v4.x is the way to go in the future, but if you've already started this with the 3.x connector framework and you're happy with it, there's no need to rush to upgrade. All the related code is open source, and 3.x won't disappear any time soon; there are too many third-party connectors based on it.

How to upload file and save it directly to MongoDB using GridFS

I have a Sinatra application hosted on Heroku and I'm trying to enable file uploading. I know Heroku doesn't allow saving to the file system, so I'm trying to save the image directly to MongoDB using GridFS, but I don't know how.
Using the code below, I'm able to save to the file system:
base_dir = Dir.pwd + "/static/images/channels/"
File.open("#{base_dir}" + params['logo'][:filename], "w") do |f|
  f.write(params['logo'][:tempfile].read)
end
How do I save the file directly to MongoDB without first saving it to the file system?
You can use the GridFS API to basically do what you're doing above, but write to MongoDB: http://api.mongodb.org/ruby/current/Mongo/GridFileSystem.html#open-instance_method.
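A minimal sketch of that, reusing your upload code but targeting GridFS; it assumes the legacy mongo 1.x driver that the linked GridFileSystem page documents, plus placeholder connection details (the MONGODB_URI variable and the 'myapp' database name) that you would replace with whatever your Heroku MongoDB add-on provides:

require 'mongo'

# Placeholder connection details - use your own URI and database name
client = Mongo::MongoClient.from_uri(ENV['MONGODB_URI'])
db     = client.db('myapp')

grid = Mongo::GridFileSystem.new(db)

# Stream the uploaded tempfile straight into GridFS instead of the local disk
grid.open(params['logo'][:filename], 'w') do |f|
  f.write(params['logo'][:tempfile].read)
end

With the current 2.x driver the same idea is exposed through client.database.fs, but the flow is identical: write the tempfile's bytes into GridFS instead of a local directory.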
I think you need to upload the file as binary data to a database.
You can use Paperclip to handle the upload and then store the files as binary data in MongoDB.
This link might help you out:
If your files are less than 16 MB, you can try this converter, which changes a JPEG/PNG image into a format that can be saved to MongoDB; you can see it as an easy alternative to GridFS.
Please see this GitHub repo for more details:
https://github.com/saran-surya/Mongo-Image-Converter
