Convert reusable error handling flow into a connector/component in Mule 4

I'm using Mule 4.2.2 runtime. We use the error handling generated by APIkit and customized it according to customer requirements; it is quite standard across all our upcoming APIs.
We are thinking of converting this into a connector so that it appears as a component/connector in the palette and can be reused across all the APIs instead of being copy-pasted every time.
Something like REST Connect for API specifications, which automatically converts a specification into a connector as soon as it is published to Exchange (https://help.mulesoft.com/s/article/How-to-generate-a-connector-for-a-REST-API-for-Mule-3-x-and-4-x).
Is there a similar option for publishing a common Mule flow so that it is converted into a component/connector?
If not, which of the following best suits my scenario?
1) Using the Mule SDK:
https://dzone.com/articles/mulesoft-custom-connector-using-mule-sdk-for-mule (or)
2) Creating a JAR as described on this page:
https://www.linkedin.com/pulse/flow-reusability-mule-4-nagaraju-kshathriya
Please suggest which is the best and easiest way in this case. Thanks in advance.

Using the Mule SDK (1) is useful for creating a connector or module in Java. Your question wasn't fully clear about what you want to encapsulate in a connector. My understanding is that you want to share parts of a flow as a connector in the palette, which is different. The XML SDK seems more in line with that. You will need to make some changes to encapsulate the flow elements, as described in the documentation. That's actually very similar to how REST Connect works.
The method described in (2) is for importing XML flows from a JAR file, but the approach in that link is actually incorrect for Mule 4. The right way to share flows through a library is the one described at https://help.mulesoft.com/s/article/How-to-add-a-call-to-an-external-flow-in-Mule-4. Note that this method doesn't create a connector that can be used from the Anypoint Studio palette.
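For reference, if you do go with the Java Mule SDK (option 1), a module is basically an annotated Java class whose public methods become operations in the palette. Below is a minimal sketch; the class and operation names are hypothetical, and only the annotations come from the Mule SDK API:

import org.mule.runtime.extension.api.annotation.Extension;
import org.mule.runtime.extension.api.annotation.Operations;
import org.mule.runtime.extension.api.annotation.param.MediaType;

// Hypothetical module that wraps shared error-handling logic.
@Extension(name = "Common Error Handling")
@Operations(CommonErrorOperations.class)
public class CommonErrorExtension {
}

// In a real module this class would normally be public, in its own file.
class CommonErrorOperations {

    // Builds a standardized error payload; the name and fields are illustrative only.
    @MediaType(value = "application/json", strict = false)
    public String buildErrorResponse(String errorType, String description) {
        return "{\"errorType\": \"" + errorType + "\", \"description\": \"" + description + "\"}";
    }
}

Keep in mind this is a Java-backed module rather than a reuse of your existing XML error-handling flow; the XML SDK mentioned above is the option that lets you keep the logic as Mule XML.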

From personal experience: use a common flow, put it in a repository, and include it as a dependency in the POM file. An even better solution is to include it as a flow in a domain app and use it along with your shared HTTPS connector.
I wrote a lot of Java-based custom components. I liked them a lot and was proud of them, but the transition from Mule 3 to Mule 4 killed most of them. Even in Mule 4, MuleSoft periodically makes changes that leave components incompatible with the runtime.


Why DataWeave over template engines like Velocity/FreeMarker/Thymeleaf?

I see broad adoption of DataWeave, which I feel is more of a transformation library, just like FreeMarker or Velocity.
With DataWeave, a change in transformation logic requires a change in code. Template engines became popular in the first place precisely to separate logic from code, so that transformation logic can be changed without rebuilding/repackaging the code (and the deployment hassle that entails).
Can anyone point out a few reasons why one would prefer DataWeave?
TL;DR: If you're looking for a template engine for things like static websites, DataWeave definitely isn't the right choice. Use the right tool for the job. Also, while you can use DataWeave outside of Mule, I don't think I've seen anyone adopt DataWeave who hasn't adopted MuleSoft.
A few things to consider (and most of these I'm stating in the context of developing Mule applications):
These template engines are, typically, for outputting static text. If you're using one to output structured data rather than something like an HTML page, you're probably doing it wrong. They aren't going to return structured data; they are going to return text. If you're at the very end of your flow and you're going to output that back out of the API or to a file, that's fine, I suppose, but if you want to actually work with that output, you're going to have to convert the plain text into an actual object, introducing a lot of extra steps when you could have just used DataWeave in the first place.
DataWeave is especially beneficial when you want to do things like streaming because you're processing large payloads. DataWeave can understand JSON, XML, and CSV (the three most common data types I see) in a streamed format without any additional work, making it very easy to create efficient applications. The big difference between a template engine and a data transformation language is that one is for outputting text using structured data as input, and the other is for working with structured data on the input and outputting structured data that you can continue to work with. There is a reason that almost all of the template engine docs talk about building websites and not things like integrations.
The DataWeave engine is, as Aled indicated, built into the Mule runtime. Deeply so. You can use DataWeave in any field in any connector by default, even fields that don't have the f(x) button, because it's built into the runtime. This makes DataWeave what you could consider a first-class citizen within Mule, unlike something you would only be able to utilize via connectors or by invoking Java bridges/libraries, which you would end up doing via DataWeave or a long series of connector operations anyway.
The benefits you listed are also not things you can't do with DataWeave. You can VERY easily templatize and externalize DataWeave - for example, I have several DataWeave libraries in my Maven repo that I can include as dependencies. I've built several transformation services that use databases with DataWeave in order to do transformations, allowing me to change those transformations without modifying the app. You can also use dynamic DataWeave, where you use a template system to load specific parts of the script before running it. I've even taken it a step further and written a generic DataWeave script that I can use to do basic mappings without writing DataWeave - this allowed me to wrap a web UI around things pretty easily.
I wouldn't use DataWeave outside of MuleSoft unless you're a MuleSoft shop. If you are a MuleSoft shop, using the CLI to run your scripts, the same way you do with most interpreted languages, works fairly nicely - especially since you likely already have in-house expertise in DataWeave. The language is still niche enough that unless you've already adopted it for use in Mule applications I don't see any advantage in using it.
Docs / basic examples:
https://github.com/mulesoft-labs/data-weave-native
https://docs.mulesoft.com/mule-runtime/4.3/parse-template-reference
https://docs.mulesoft.com/mule-runtime/4.3/dataweave-create-module
https://github.com/mikeacjones/transform-system-api
Because it is the expression and transformation language embedded in the Mule runtime. If you are using Mule, it is also integrated with the Anypoint Studio IDE.
Outside Mule applications I don't think you can use DataWeave easily. You might want to go with the alternatives.

Test endpoint compliance against an OpenAPI contract in Spring Boot REST

I am looking for a nice way to write tests that make sure the endpoints in a Spring Boot REST (ver. 2.1.9) application follow the OpenAPI contract.
In the project I recently moved to, the workflow is as follows: architects write the openapi.yml contract, and developers have to implement endpoints that comply with it. Unfortunately, a lot of differences creep in, these tests have to catch such situations, and it is not possible to change this workflow :(
I was thinking about generating openapi.yml from the current endpoints and comparing it somehow, but I wonder if there is an out-of-the-box solution.
In the general case, even the generated spec may not match the actual app behavior, because some things can't be expressed with OpenAPI. However, it could still be helpful as a starting point.
OpenAPI provides a way to specify examples that could be used to verify the contract, but the actual schemas might be a better source of expectations.
I want to note two tools that can generate and execute test cases based only on the input OpenAPI spec:
Schemathesis uses both examples and schemas and doesn't require configuration by default. It utilizes property-based testing and verifies properties defined in the tested schema - response codes, schemas, and headers. It supports OpenAPI 2 & 3.
Dredd focuses more on examples and provides several automatic expectations. It supports only OpenAPI 2; support for the third version is experimental.
Both provide a CLI and could be extended with various hooks to fit the desired workflow.
I'd suggest passing the contract (the spec you mentioned) to Schemathesis, and it will verify whether all schemas and examples are handled correctly by your app.
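If you also want a check that lives inside the Spring Boot test suite, a minimal sketch of the generate-and-compare idea you mentioned could look like the following. It assumes springdoc-openapi (or something equivalent) is on the classpath and serves the generated spec at /v3/api-docs, that jackson-dataformat-yaml is available, and that the contract sits at src/main/resources/openapi.yml; all of those are assumptions, and the comparison shown is deliberately naive:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.servlet.MockMvc;

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import static org.junit.Assert.assertEquals;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class OpenApiContractTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    public void generatedSpecCoversSamePathsAsContract() throws Exception {
        // Spec generated from the running controllers; /v3/api-docs is the
        // springdoc-openapi default endpoint and is an assumption here.
        String generated = mockMvc.perform(get("/v3/api-docs"))
                .andExpect(status().isOk())
                .andReturn().getResponse().getContentAsString();

        // Hand-written contract checked into the repository (path is an assumption).
        String contract = new String(Files.readAllBytes(Paths.get("src/main/resources/openapi.yml")));

        JsonNode generatedTree = new ObjectMapper().readTree(generated);
        JsonNode contractTree = new ObjectMapper(new YAMLFactory()).readTree(contract);

        // Naive comparison: only the declared paths are checked. A real contract test
        // would also compare operations, parameters, and response schemas.
        assertEquals(fieldNames(contractTree.get("paths")), fieldNames(generatedTree.get("paths")));
    }

    private static List<String> fieldNames(JsonNode node) {
        List<String> names = new ArrayList<>();
        node.fieldNames().forEachRemaining(names::add);
        Collections.sort(names);
        return names;
    }
}

Tools like Schemathesis exercise the running application against the spec instead of diffing two documents, which generally catches more of the real differences.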

Any way to call MLCP from Java apps?

I'm new to MarkLogic and MLCP. I'm working on MarkLogic 9.0-8. I want to use MLCP to load content, but since some parameters may need to be built dynamically based on the content, does anyone know if it is possible to call MLCP from a Java application?
Thanks a lot,
Helen
MarkLogic provides two Java-based ways to load content: MLCP and DMSDK. MLCP is intended to be used as a command-line tool (and I believe that's the only supported use).
The Data Movement SDK, on the other hand, is specifically intended to offer very similar functionality in the form of a JAR, making it easy to access from a Java application. I encourage you to look into using that instead.
tutorial
JavaDoc
Asynchronous Multi-Document Operations
12-minute video intro to DMSDK
common tasks made easier through ml-gradle
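To give a feel for what DMSDK looks like from Java, here is a minimal sketch that loads a couple of documents with a WriteBatcher; the host, port, credentials, and URIs are placeholders to adapt to your environment:

import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.datamovement.DataMovementManager;
import com.marklogic.client.datamovement.WriteBatcher;
import com.marklogic.client.io.Format;
import com.marklogic.client.io.StringHandle;

public class DmsdkLoadExample {

    public static void main(String[] args) {
        // Connection details are placeholders; adjust for your environment.
        DatabaseClient client = DatabaseClientFactory.newClient(
                "localhost", 8000,
                new DatabaseClientFactory.DigestAuthContext("admin", "admin"));

        DataMovementManager dmm = client.newDataMovementManager();

        // Batch size and thread count can be tuned in code, based on the content
        // being loaded, which is awkward to do with the MLCP command line.
        WriteBatcher batcher = dmm.newWriteBatcher()
                .withBatchSize(100)
                .withThreadCount(4);

        dmm.startJob(batcher);

        batcher.add("/example/doc1.json",
                new StringHandle("{\"hello\":\"world\"}").withFormat(Format.JSON));
        batcher.add("/example/doc2.json",
                new StringHandle("{\"hello\":\"again\"}").withFormat(Format.JSON));

        // Flush remaining documents and shut the job down.
        batcher.flushAndWait();
        dmm.stopJob(batcher);
        client.release();
    }
}

Because the batch size, thread count, and document URIs are plain Java values, this is also where you can build parameters dynamically based on your content, which was the original motivation for calling MLCP from Java.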

Can I use OpenDaylight functionality the same way as JNC?

I want to write an application to configure a network element using NETCONF, and I'm looking for an open-source NETCONF client I can use to achieve this.
I already tried and succeeded in doing so using JNC. The problem is that JNC doesn't support NETCONF 1.1, so I'm looking for another solution.
Is it even possible to get the same functionality via OpenDaylight?
In JNC I converted YANG files to Java classes, populated them, and then configured the device. What steps should I follow in OpenDaylight for the same functionality?
YANG Tools in OpenDaylight covers what JNC provides, in addition to generating a RESTCONF API automatically.
In general there are a number of steps you need to follow as I have explained here:
use pojos generated from yang to configure device using odl netconf client
Once you have created a Maven project, you can import the YANG models you want to use into the project.
For instance, let's say you have YANG models from a vendor like Nokia or Cisco: you need to place them in a folder within the Maven project (please use the boilerplate provided by the Maven archetype to generate one), and then declare this folder in the project's features.xml file.
When you build your project, you will end up with Java code generated from the YANG models.
Now it's your turn to write some logic and use those generated Java classes in your provider code.
To use NETCONF, or any protocol for that matter, you need to additionally import it into your project; it will then be accessible via the MD-SAL.
Please note that, from my personal experience with ODL, it's not easy to understand without getting hands-on. I would suggest starting with simple projects from the links I provided in my other post, and then adding features one by one to get to know the tool.
Hope this helps.

Protobuf in Windows 8 app: serialization or generated code not working

I need to use protobuf in my Windows Store app and I'm using this Protobuf port, but when I generate classes from the proto file they seem incomplete, because I don't have access to .newBuilder()... and if I use p:lightFramework I still cannot work with .newBuilder()... Can anyone help?
Part of the generated code without the light framework option:
[global::System.Serializable, global::ProtoBuf.ProtoContract(Name=@"Person")]
Part of the generated code with the light framework option:
[global::ProtoBuf.ProtoContract(Name=@"Person")]
The problem is here: .newBuilder() is not recognized:
CP.ConnectionResponse respp = CP.ConnectionResponse.newBuilder()...
You seem to be using two different libraries at once; in particular, you seem to be following the instructions for the protobuf port but actually using protobuf-net. These are two different libraries, connected only insofar as:
they both target .NET
they both serialize/deserialize protobuf data
To add context - this is a bit like using JSON.NET but following the instructions for ServiceStack.Text: both can serialize/deserialize JSON, but the API is different.
You need to decide which library you want to use, and follow the instructions for that implementation.
As an aside: for the best performance in a Store app / Windows Phone app with protobuf-net, you may also want to consider using the precompiler - but you should be able to get it working (for a proof of concept etc.) without this.
