Protobuf in Windows 8 app: serialization or code generation not working

I need to use protobuf in my Windows Store app, and I use this Protobuf port, but when I generate classes from the .proto file the output seems incomplete, because I don't have access to .newBuilder()... and if I use p:lightFramework I still cannot work with .newBuilder()... Can anyone help?
Part of the generated code without the light framework option:
[global::System.Serializable, global::ProtoBuf.ProtoContract(Name=@"Person")]
Part of the generated code with the light framework option:
[global::ProtoBuf.ProtoContract(Name=@"Person")]
The problem is here: .newBuilder() is not recognized:
CP.ConnectionResponse respp = CP.ConnectionResponse.newBuilder()...

You seem to be using two different libraries at once; in particular, you seem to be following the instructions for the protobuf port, but actually using protobuf-net. These are two different libraries, related only in so far as:
they both target .NET
they both serialize/deserialize protobuf data
To add context - this is a bit like using JSON.NET but following the instructions for ServiceStack.Text: both can serialize/deserialize JSON, but the API is different.
You need to decide which library you want to use, and follow the instructions for that implementation.
As an aside: for the best performance in a Store app / Windows Phone app with protobuf-net, you may also want to consider using the precompiler - but you should be able to get it working (for a proof of concept etc.) without this.

Related

One-click OpenAPI/Swagger build architecture for backend server code

I use Swagger to generate both client and server API code. On my frontend (React/TS/axios), integrating the generated code is very easy: I make a change to my spec, consume the new version through NPM, and the new API methods are immediately available.
On the server side, however, the process is a lot more janky. I am using Java, and I have to copy and paste specific directories over (such as the routes, data types, etc.), and a lot of the generated code doesn't play nicely with my existing application (in terms of imports, etc.). I am having the same experience with a Flask instance that I have.
I know that comparing client to server is apples to oranges, but is there a better way to construct a build architecture so that I don't have to go through this error-prone, tedious process every time? Any pointers here?

Square Wire Service Implementation

I'm trying to learn about gRPC and Protocol Buffers while creating a test app using Square's Wire library. I've got to the point of generating both client and server files, but I'm stuck on creating the server-side app. The documentation is too simple, and the examples in the library have too much going on.
Can anyone point out what are the next steps after generating the files?
Thanks.
I'm not sure what language you are using, but each gRPC language has a similar set of examples. For example, for a simple server-side app, you can find the Python example at https://github.com/grpc/grpc/blob/master/examples/python/helloworld/greeter_server.py

Convert reusable error-handling flow into a connector/component in Mule 4

I'm using Mule 4.2.2 Runtime. We use the error handling generated by APIkit, and we customized it according to the customer's requirements; it is quite standard across all the upcoming APIs.
We are thinking of converting this into a connector so that it appears as a component/connector in the palette and can be reused across all the APIs instead of copy-pasting it every time.
Something like REST Connect for API specifications, which automatically converts a specification into a connector as soon as it is published in Exchange (https://help.mulesoft.com/s/article/How-to-generate-a-connector-for-a-REST-API-for-Mule-3-x-and-4-x).
Is there a similar option for publishing a common Mule flow so that it is converted into a component/connector?
If not, which of these best suits my scenario?
1) using SDK
https://dzone.com/articles/mulesoft-custom-connector-using-mule-sdk-for-mule (or)
2) creating a JAR as mentioned in this page
https://www.linkedin.com/pulse/flow-reusability-mule-4-nagaraju-kshathriya
Please suggest which is the best and easiest approach in this case. Thanks in advance.
Using the Mule SDK (1) is useful for creating a connector or module in Java. Your question wasn't fully clear about what you want to encapsulate in a connector. My understanding is that you want to share parts of a flow as a connector in the palette, which is different. The XML SDK seems to be more in line with that. You will need to make some changes to encapsulate the flow elements, as described in the documentation. That's actually very similar to how REST Connect works.
The method described in (2) is for importing XML flows from a JAR file, but the method described by that link is actually incorrect for Mule 4. The right way to implement sharing flows through a library is the one described at https://help.mulesoft.com/s/article/How-to-add-a-call-to-an-external-flow-in-Mule-4. Note that this method doesn't create a connector that can be used from Anypoint Studio palette.
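The KB article's approach can be sketched roughly like this; note that the file and flow names below are hypothetical, assuming the shared library JAR packages a Mule config file named common-error-handler.xml containing a flow named common-error-handler-flow:

```xml
<!-- In the consuming Mule 4 application's config, after adding the
     shared library as a Maven dependency. Names are illustrative. -->

<!-- Import the XML config packaged inside the shared JAR. -->
<import file="common-error-handler.xml"/>

<flow name="order-api-main-flow">
    <!-- ... normal processing ... -->

    <!-- Call the shared flow by name. This reuses the flow, but it
         does not appear as a connector in the Studio palette. -->
    <flow-ref name="common-error-handler-flow"/>
</flow>
```

The trade-off is exactly the one noted above: the flow is reusable across applications, but only via flow-ref, not as a palette component.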
From personal experience: use a common flow, put it in a repository, and include it as a dependency in the POM file. An even better solution: include it as a flow in the domain app and use it along with your shared HTTPS connector.
I wrote a lot of Java-based custom components. I liked them a lot and was proud of them, but the transition from Mule 3 to Mule 4 killed most of them. Even in Mule 4, MuleSoft periodically makes changes that render components incompatible with the runtime.

Protobufs in Production - Isn't it heavy to include the compiler in production?

I haven't worked with Protobufs extensively, but have been asked to recently.
Installing the Protobuf compiler is pretty heavy for a local installation. I'm supposed to use Protobuf for serializing and deserializing messages on a server (Flask, for simplicity).
Doesn't that mean I will also have to install the Protobuf compiler on the server?
That seems quite heavy, no? When all I want to do is serialize and deserialize messages?
In passing, I heard that I don't need to include the compiler in production... but how would that work?
Usually, protobuf works by having a build-time step that parses a .proto schema file (usually with "protoc") and generates code in your chosen language. You then compile this with your app, and it is that generated code that does the actual work. The DSL parsing and code generation don't happen in your app, and you don't need to include them. The actual runtime code is ... just regular code.
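To make the "just regular code" point concrete, here is a minimal pure-Python sketch of what generated serialization code boils down to. This is not protoc output, just an illustration of the wire format it targets, assuming a hypothetical message with field 1 `name` (string) and field 2 `id` (int32):

```python
def encode_varint(value):
    """Encode a non-negative int as a protobuf varint (7 bits per byte)."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_person(name, id_):
    """Hand-written equivalent of generated serialization for this message."""
    # Field 1 (name), wire type 2 (length-delimited): tag = (1 << 3) | 2 = 0x0A
    name_bytes = name.encode("utf-8")
    msg = b"\x0a" + encode_varint(len(name_bytes)) + name_bytes
    # Field 2 (id), wire type 0 (varint): tag = (2 << 3) | 0 = 0x10
    msg += b"\x10" + encode_varint(id_)
    return msg

encoded = encode_person("Ann", 150)  # 150 encodes as the varint bytes 96 01
```

No compiler or schema parser is involved at runtime; the generated code is simply a faster, complete version of functions like these.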
Another option used by some protobuf tools is meta-programming, whereby the model (generated or hand-coded) contains enough information to infer how the serialization should happen, but no actual serialization code is generated. In this case, typically a strategy is emitted at runtime based on the model - usually the first time it is needed, then aggressively reused. This approach does need some runtime elements (the strategy emitter).
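The meta-programming approach can be sketched in Python as follows; the names (`make_serializer`, `PERSON_FIELDS`) are illustrative, not any real library's API. The model description drives serialization, and the strategy is built once and then reused per call:

```python
# Hypothetical model description: (field number, attribute name, kind).
PERSON_FIELDS = [(1, "name", "string"), (2, "id", "varint")]

def make_serializer(fields):
    """Build a reusable serialization strategy from the model, once."""
    def encode_varint(value):
        out = bytearray()
        while value > 0x7F:
            out.append((value & 0x7F) | 0x80)
            value >>= 7
        out.append(value)
        return bytes(out)

    def serialize(obj):
        buf = bytearray()
        for number, name, kind in fields:
            value = obj[name]
            if kind == "varint":
                buf += encode_varint((number << 3) | 0)  # tag, wire type 0
                buf += encode_varint(value)
            elif kind == "string":
                data = value.encode("utf-8")
                buf += encode_varint((number << 3) | 2)  # tag, wire type 2
                buf += encode_varint(len(data))
                buf += data
        return bytes(buf)

    return serialize

# Built once (e.g. on first use), then reused aggressively.
serialize_person = make_serializer(PERSON_FIELDS)
```

Real implementations emit specialized code (e.g. via reflection or IL emission) rather than interpreting the model on every call, but the shape is the same: the model carries the information, and a runtime component turns it into a strategy.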
It is also possible to invoke the full DSL parser at runtime in various ways. In real terms, the parser isn't a huge piece of software, and as long as you aren't using it per-call it should usually be fine.

Can I use OpenDaylight functionality the same way as JNC?

I want to write an application to configure a network element using NETCONF, and I'm looking for an open-source NETCONF client I can use to achieve this.
I already tried and succeeded in doing so using JNC. The problem is that JNC doesn't support NETCONF 1.1, so I'm looking for another solution.
Is it even possible to get the same functionality via OpenDaylight?
In JNC I converted YANG files to Java classes, filled them in, and then configured the device. What steps should I take in OpenDaylight to get the same functionality?
YANG Tools in OpenDaylight covers what JNC provides, in addition to generating a RESTCONF API automatically.
In general, there are a number of steps you need to follow, as I have explained here:
use pojos generated from yang to configure device using odl netconf client
Once you have created a Maven project, you can import the YANG models you want to use into the project.
For instance, let's say you have YANG models from a vendor like Nokia or Cisco: you need to place them in a folder within the Maven project (please use the boilerplate provided by the Maven archetype to generate one), and then you need to declare this folder in the project's features.xml file.
When you build your project, you will end up with Java code generated from the YANG models.
Now it's your turn to write some logic and use those generated Java classes in your provider code.
And to use NETCONF, or any protocol for that matter, you need to import those modules into your project as well; they will then be accessible via the MD-SAL.
Please note, from my personal experience with ODL, it's not easy to understand without getting hands-on. I would suggest starting with simple projects from the links I provided in my other post, and then adding features one by one to get to know the tool.
Hope this helps.
