Can I use OpenDaylight for the same functionality as JNC?

I want to write an application to configure network elements using NETCONF, and I'm looking for an open-source NETCONF client I can use to achieve this.
I already tried and succeeded in doing so using JNC. The problem is that JNC doesn't support NETCONF 1.1, so I'm looking for another solution.
Is it even possible to get the same functionality via OpenDaylight?
In JNC I converted YANG files to Java classes, filled them in, and then configured the device. What steps should I follow in OpenDaylight for the same functionality?

YANG Tools in OpenDaylight covers what JNC provides, and in addition generates a RESTCONF API automatically.
In general there are a number of steps you need to follow, as I have explained here:
use pojos generated from yang to configure device using odl netconf client
Once you have created a Maven project, you can import the YANG models you want to use into the project.
For instance, let's say you have YANG models from a vendor like Nokia or Cisco. You need to place them in a folder within the Maven project (please use the boilerplate provided by the Maven archetype to generate one), and then declare this folder in the features.xml file of the project.
When you build your project, you will end up with Java classes generated from the YANG models.
Now it's your turn to write some logic and use those generated Java classes in your provider code.
To use NETCONF, or any other protocol for that matter, you need to import it into the project as well; it will then be accessible via the MD-SAL.
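Concretely, once the NETCONF southbound is in place, provider logic usually looks up the device's mount point and writes the generated objects through a transaction. A rough pseudocode sketch of that flow (class, method, and field names are illustrative, based on MD-SAL concepts, not a drop-in implementation):

```
// Look up the NETCONF mount point for the target device node.
mountPoint   = mountPointService.getMountPoint(pathToDeviceNode)
// Obtain the device-scoped DataBroker from the mount point.
deviceBroker = mountPoint.getService(DataBroker)
// Fill a Java object generated from the vendor YANG model.
iface = new InterfaceBuilder().setName("ge-0/0/1").setEnabled(true).build()
// Write it to the device's configuration datastore and commit.
tx = deviceBroker.newWriteOnlyTransaction()
tx.put(CONFIGURATION, interfaceIdentifier, iface)
tx.commit()
```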
Please note, from my personal experience with ODL, it's not easy to understand without getting hands-on. I would suggest starting with the simple projects from the links in my other post, and then adding features one by one to get to know the tool.
Hope this helps.

Related

Hoverfly API Simulations with Golang repositories: how to get started

I have just started to experiment with Hoverfly, and I have a Golang backend calling a number of 3rd-party APIs for which I would need to create simulations. I am aware that Hoverfly has Java and Python bindings, and I have come across a number of tutorials using Hoverfly with both. I think I am possibly missing a very trivial point here: once I have created the simulations (via capture mode), what is the next step? Do I simply create integration tests making use of them? Do I import the Go package into my repository? I was looking for sample usages in the examples folder, and I have seen mostly .py-driven ones. Is there an available example that I totally missed?
Thank you
For Golang testing, you can have a look at the functional tests in the Hoverfly project: https://github.com/SpectoLabs/hoverfly/tree/master/functional-tests; it's using Hoverfly to test Hoverfly!

How do you deploy a golang google cloud function with a custom go build constraint/tag?

I am using go build constraints to conditionally compile constants into my test/staging/production cloud functions. How can I pass -tags ENV to the builder used by gcloud beta functions deploy?
As @Guilherme mentioned in the comments, it indeed seems that it's not possible to pass Go constraints/tags to the builder used by Cloud Functions.
I searched around and, while this option doesn't exist, I agree that being able to send constraints to the builder used by Cloud Functions would be useful. Considering that, I would recommend you raise a feature request so this can be checked by Google.
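Since the tags can't reach the Cloud Functions builder, it helps to be clear about what they do when you control the build yourself (as you would on Cloud Run or Cloud Build). A minimal, self-contained sketch of constraint-guarded constants: the program below writes a tiny module with two constraint-guarded files to a temp dir and builds it twice with different `-tags`. It requires the Go toolchain on PATH, and the `staging` tag and file names are illustrative.

```go
package main

// Demonstrates selecting per-environment constants with go build
// constraints by generating a sample module and running it with and
// without -tags staging.

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
)

// buildWith runs `go run [-tags tags] .` in dir and returns its output.
func buildWith(dir, tags string) (string, error) {
	args := []string{"run"}
	if tags != "" {
		args = append(args, "-tags", tags)
	}
	args = append(args, ".")
	cmd := exec.Command("go", args...)
	cmd.Dir = dir
	out, err := cmd.CombinedOutput()
	return string(out), err
}

// demo builds the sample module without tags and with -tags staging.
func demo() (defaultEnv, stagingEnv string, err error) {
	dir, err := os.MkdirTemp("", "tagsdemo")
	if err != nil {
		return "", "", err
	}
	defer os.RemoveAll(dir)

	files := map[string]string{
		"go.mod":  "module tagsdemo\n\ngo 1.17\n",
		"main.go": "package main\n\nimport \"fmt\"\n\nfunc main() { fmt.Print(env) }\n",
		// Compiled only when the staging tag is absent.
		"env_default.go": "//go:build !staging\n\npackage main\n\nconst env = \"production\"\n",
		// Compiled only when built with -tags staging.
		"env_staging.go": "//go:build staging\n\npackage main\n\nconst env = \"staging\"\n",
	}
	for name, body := range files {
		if err := os.WriteFile(filepath.Join(dir, name), []byte(body), 0o644); err != nil {
			return "", "", err
		}
	}

	if defaultEnv, err = buildWith(dir, ""); err != nil {
		return "", "", fmt.Errorf("%v: %s", err, defaultEnv)
	}
	if stagingEnv, err = buildWith(dir, "staging"); err != nil {
		return "", "", fmt.Errorf("%v: %s", err, stagingEnv)
	}
	return defaultEnv, stagingEnv, nil
}

func main() {
	d, s, err := demo()
	if err != nil {
		panic(err)
	}
	fmt.Printf("default build: %s\nstaging build: %s\n", d, s)
}
```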
One option that you might want to have a look at is deploying your application using Cloud Run. As stated in its official documentation:
Use the programming language of your choice, any language or operating system libraries, or even bring your own binaries.
Cloud Run pairs great with the container ecosystem: Cloud Build, Container Registry, Docker.
So this might work for you as a workaround. The tutorial below covers the steps to build and deploy a quick Go application on Cloud Run.
Quickstart: Build and Deploy
Let me know if the information helped you!

Convert reusable error-handling flow into connector/component in Mule 4

I'm using the Mule 4.2.2 runtime. We use the error handling generated by APIkit, and we customized it according to the customer's requirements; it is quite standard across all the upcoming APIs.
I'm thinking of converting this into a connector so that it appears as a component/connector in the palette, to be reused across all the APIs instead of copy-pasting it every time.
This is like REST Connect for an API specification, which is automatically converted into a connector as soon as it is published in Exchange ( https://help.mulesoft.com/s/article/How-to-generate-a-connector-for-a-REST-API-for-Mule-3-x-and-4-x ).
Do we have any option like the above for publishing a common Mule flow that will be converted into a component/connector?
If not, which of the following best suits my scenario?
1) using SDK
https://dzone.com/articles/mulesoft-custom-connector-using-mule-sdk-for-mule (or)
2) creating jar as mentioned in this page
https://www.linkedin.com/pulse/flow-reusability-mule-4-nagaraju-kshathriya
Please suggest which one is the best and easiest way in this case. Thanks in advance.
Using the Mule SDK (1) is useful to create a connector or module in Java. Your question wasn't fully clear about what you want to encapsulate in a connector. I understand that what you want is to share parts of a flow as a connector in the palette, which is different. The XML SDK seems to be more in line with that. You will need to make some changes to encapsulate the flow elements, as described in the documentation. That's actually very similar to how REST Connect works.
The method described in (2) is for importing XML flows from a JAR file, but the method described at that link is actually incorrect for Mule 4. The right way to share flows through a library is the one described at https://help.mulesoft.com/s/article/How-to-add-a-call-to-an-external-flow-in-Mule-4. Note that this method doesn't create a connector that can be used from the Anypoint Studio palette.
From personal experience: use a common flow, put it in a repository, and include it as a dependency in the pom file. An even better solution: include it as a flow in the domain app and use it along with your shared HTTPS connector.
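If you go the pom-dependency route, the common-flow project is typically packaged and deployed as a Mule plugin and pulled in like this (the coordinates below are hypothetical; the `mule-plugin` classifier is the important part):

```xml
<!-- Hypothetical coordinates for the shared error-handling flow; -->
<!-- the shared project is packaged and published as a mule-plugin. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>common-error-handler</artifactId>
  <version>1.0.0</version>
  <classifier>mule-plugin</classifier>
</dependency>
```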
I wrote a lot of Java-based custom components. I liked them a lot and was proud of them. But the transition from Mule 3 to Mule 4 killed most of them. Even in Mule 4, MuleSoft periodically makes changes that render components incompatible with the runtime.

Managing and storing an artifact/library (e.g. a jar) used by a Heroku/Cloud Foundry application

I have read the twelve-factor app manifesto, and I have the following question: where and how can I store an in-house library, such as a jar, that is going to be consumed by several of my Heroku or Cloud Foundry apps?
All of this bearing in mind that the codebase for that library/jar is versioned in Github.
I would use a file store such as OpenStack Swift (it has an S3-compatible API as well). You could also use CouchDB and store attachments in a DB record.
There really isn't anything like that I have seen yet in Java; you can't have dynamic dependency injection in Java.

Protobuf in Windows 8 app: serialization or generated code not working

I need to use protobuf in my Windows Store app, and I use this protobuf port, but the classes I generate from the proto file seem incomplete, because I don't have access to .newBuilder(). And if I use p:lightFramework, I still cannot work with .newBuilder(). Can anyone help?
Part of generated code without light framework options
[global::System.Serializable, global::ProtoBuf.ProtoContract(Name=@"Person")]
Part of generated code with light framework options
[global::ProtoBuf.ProtoContract(Name=@"Person")]
The problem is here (.newBuilder() is not recognized):
CP.ConnectionResponse respp = CP.ConnectionResponse.newBuilder()...
You seem to be using two different libraries at once; in particular, you seem to be following the instructions for the protobuf port, but actually using protobuf-net. These are two different libraries, connected only in so much as:
they both target .NET
they both serialize/deserialize protobuf data
To add context - this is a bit like using JSON.NET but following the instructions for ServiceStack.Text: both can serialize/deserialize JSON, but the API is different.
You need to decide which library you want to use, and follow the instructions for that implementation.
As an aside: for the best performance in a Store app / Windows Phone app with protobuf-net, you may also want to consider using the precompiler, but you should be able to get it working (for a proof of concept etc.) without it.
