Using custom connector's components in same connector's flow - maven

Problem Description
I have developed a custom MuleSoft connector using Anypoint Studio, following all the guidelines on how to do it. However, I am struggling to write MUnit functional tests for that connector, or to build some example flows with it. The issue is that the connector project cannot "import itself": the components I developed for people importing my connector (via Maven, for example) are not available to me in the Mule Palette for src/main/mule (Flows) in the connector project itself.
Question
Is there a way to import components from my connector inside the connector itself, so that it can use them in example flows? If not, is the right approach to make a new, separate project which imports my connector, and then keep all my tests there?

Test cases for Mule 4 connectors can be written as described in the documentation, using JUnit and Java test cases: https://docs.mulesoft.com/mule-sdk/1.1/testing-writing-your-first-test-case
Maven knows how to handle the dependencies for tests, so that should not be a problem.
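For example, a minimal functional test case might look like this (a sketch following the linked docs; the config file name, flow name, and expected payload are placeholders):

import org.junit.Test;
import org.mule.functional.junit4.MuleArtifactFunctionalTestCase;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.is;

public class MyConnectorOperationsTestCase extends MuleArtifactFunctionalTestCase {

    // Mule config in src/test/resources that declares flows using the connector under test
    @Override
    protected String getConfigFile() {
        return "test-mule-config.xml";
    }

    @Test
    public void executesMyOperation() throws Exception {
        // Run a flow from the test config and inspect the resulting payload
        String payload = (String) flowRunner("myOperationFlow")
                .run()
                .getMessage()
                .getPayload()
                .getValue();
        assertThat(payload, is("expected value"));
    }
}

Note that the test config XML lives inside the connector project itself, so the connector's own components can be used there even though they do not show up in the Mule Palette.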
If you want to also integrate MUnit tests, you can take a peek at how other connectors do it by inspecting the open source connectors.
Examples:
File Connector: https://github.com/mulesoft/mule-file-connector/
HTTP Connector: https://github.com/mulesoft/mule-http-connector/

Related

How to use grpc with spring boot

I am new to gRPC and don't know how to use it with Spring Boot, but I am using the projects at the links below:
https://github.com/saturnism/grpc-java-by-example/tree/master/simple-grpc-server
https://github.com/saturnism/grpc-java-by-example/tree/master/simple-grpc-client
Note: the first is the server project and the second is the client project.
I have created a gRPC project with Spring Boot, but there is one thing I cannot understand: in the gRPC client project, how can I use the classes which are generated by protobuf in the gRPC server project? No proto file is created in the client project, so how can I use the server project's classes there? Or can we create one project for both the gRPC server and client instead of creating a different project for each?
I have two queries related to this question:
1. How to use the classes generated by the protobuf compiler in another project, e.g. if client and server are two different projects and only the server has the generated proto files, but the client wants to use the same classes.
2. How can I create all of this in a single project (client and server together), and how can I run that project, with a step-by-step demo.
There are two ways you can do this:
Copy the .proto files between the two projects, and have each one generate its own copy of the generated code. This is probably the easiest, and allows you to avoid checking the generated code into source control. The downside to this approach is that the .proto files can get out of date if you modify one and not the other.
Keep the .proto files in the same repository as both the client and server, and make both depend on the generated code. This allows the proto to be modified for the client and server at the same time, but requires the code to live in the same repository (this is sometimes called the "monorepo" approach). The downside to this is that the client and server repos may get too big and need to be split up.
Google (the author of Protobuf) typically uses option #2, but many users of Protobuf prefer option #1. I would highly recommend regenerating the classes on each build rather than checking the generated code in. The ABI of Protobuf classes can change occasionally, and you would lose the backwards compatibility of Protobuf.
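For option #1, a common Maven setup lets each project regenerate its own classes at build time from the copied .proto files. A sketch using the protobuf-maven-plugin (versions are examples; the os-maven-plugin extension supplies ${os.detected.classifier}):

<build>
  <extensions>
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.6.2</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.6.1</version>
      <configuration>
        <protocArtifact>com.google.protobuf:protoc:3.12.0:exe:${os.detected.classifier}</protocArtifact>
        <pluginId>grpc-java</pluginId>
        <pluginArtifact>io.grpc:protoc-gen-grpc-java:1.30.0:exe:${os.detected.classifier}</pluginArtifact>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>compile-custom</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

With this in both the client and server pom, neither project needs to check generated code into source control: protoc regenerates the message and stub classes on every build.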
I have created a sample Spring Boot gRPC application and posted it here:
https://javabelazy.blogspot.com/
Use the dependency net.devh:grpc-server-spring-boot-starter in your pom (see the snippet after these steps).
Create a proto file (sample service code):

service PingPongService {
  rpc ping(PingRequest) returns (PongResponse) {
    option (google.api.http) = { get: "/v1/grpc/{ping}" };
  }
}

Generate the stubs for the proto file using io.grpc:protoc-gen-grpc-java:1.30.0:exe.
Use a Netty server.
Set the port to 9090 (the default) with grpc.server.port=9090 in the application properties.
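For reference, the starter dependency from the first step might look like this in the pom (a sketch; the version is an example, check Maven Central for the latest release):

<dependency>
    <groupId>net.devh</groupId>
    <artifactId>grpc-server-spring-boot-starter</artifactId>
    <version>2.14.0.RELEASE</version>
</dependency>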
I have used https://github.com/yidongnan/grpc-spring-boot-starter recently. You will get most of the Spring features along with gRPC using this library.
There is yidongnan/grpc-spring-boot-starter (DOC), which implements a Spring Boot autoconfiguration starter for both client and server.
It provides @GrpcService and @GrpcClient.
@GrpcService adds the service to the gRPC server and starts the server automatically. From its documentation: an annotation that marks gRPC services that should be registered with a gRPC server. If Spring Boot's auto configuration is used, the server will be created automatically. This annotation should only be added to implementations of BindableService (the generated *ImplBase classes).
@GrpcClient creates the channel and stub for the client automatically.
Example: @GrpcClient("myClient") <-> grpc.client.myClient.address=static://localhost:9090
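For illustration, a service implementation and a client injection might look like this (a sketch assuming classes generated from the PingPongService proto shown earlier; package names follow recent versions of the starter):

// Server side (one source file): registered with the embedded gRPC server automatically
import io.grpc.stub.StreamObserver;
import net.devh.boot.grpc.server.service.GrpcService;

@GrpcService
public class PingPongServiceImpl extends PingPongServiceGrpc.PingPongServiceImplBase {

    @Override
    public void ping(PingRequest request, StreamObserver<PongResponse> responseObserver) {
        // Send a single response and complete the call
        responseObserver.onNext(PongResponse.newBuilder().build());
        responseObserver.onCompleted();
    }
}

// Client side (another source file): channel and stub are created
// from the grpc.client.myClient.* properties
import net.devh.boot.grpc.client.inject.GrpcClient;
import org.springframework.stereotype.Service;

@Service
public class PingPongClient {

    @GrpcClient("myClient")
    private PingPongServiceGrpc.PingPongServiceBlockingStub stub;

    public PongResponse ping() {
        return stub.ping(PingRequest.newBuilder().build());
    }
}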
nils server sample
nils client sample
Based on these samples, I also implemented my own simple server and client samples:
ppdouble/springboot-grpc-server-sample
ppdouble/springboot-grpc-client-sample
You can base your project on those samples, or implement a new Spring Boot autoconfiguration starter.

Difference between Application and Integration service in IIB

Could anyone please explain the difference between an application and an integration service in IIB? I have gone through the documentation, but it was not clear. For example, if I have to create a service based on a WSDL which has some 3 operations, should I create it as an integration service or as an application? Please suggest.
With an Application it's roll-your-own, in that you have to build everything yourself.
With an Integration Service you can import a WSDL, and the framework of your flow will be generated for you.
So if you are being given WSDLs for the services you want to build, then using an Integration Service may be the way to go.
Personally I don't like some aspects of the generated code, but that's me. I'm currently working on a project that uses REST APIs, and am using the REST API project option for my projects; it generates code as well.

OpenAPI multimodule EAR deployment

I would like to package two openapi.yaml definition files, each with its corresponding implementation in its own war file, into one ear and deploy it to Open Liberty. So far this works: when Open Liberty starts up, it shows me the URL for ~/openapi/ui and the corresponding REST services ~/converter1 and ~/converter2. However, when I use openapi/ui I can only see one service definition; the second one is not visible. Am I doing something wrong? Should my scenario work with Open Liberty?
My general use case is to have several REST services defined by OpenAPI definitions grouped together, as long as they are in a common domain. Until now I can run each openapi.yaml on its own Open Liberty, but I would like to group my REST services together into one Open Liberty server.
Does somebody know a solution to my problem?
As you have noted, Open Liberty's MicroProfile OpenAPI support (via the mpOpenAPI-1.0 feature) only supports a single application per server.
If you want to aggregate multiple OpenAPI documents in a single server you have to use WebSphere Liberty's openapi-3.1 feature. See these docs for more info.
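For reference, enabling the aggregating feature in server.xml looks roughly like this (a sketch; the openapi-3.1 feature is available in WebSphere Liberty, not Open Liberty):

<server>
    <featureManager>
        <feature>openapi-3.1</feature>
    </featureManager>
</server>

With mpOpenAPI-1.0 on Open Liberty, only one web module's definition is documented, which matches the behaviour you observed.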

Reusing Mule connectors and validation flows

How do you reuse Mule code (flows, exception strategies, database connectors, validators) across several projects? These are application-specific reusable artifacts, not enterprise-wide reuse.
For example: I have some master code (validators, flows, and exception strategies) which should be reused in 15 different flows (i.e. 15 different Mule projects). We are not using Maven at the moment. One way I explored is that we could jar it and publish it to a local Nexus repo, and reuse it via the pom. Is there any other way?
If possible, I would also like to make it dynamic, such that if I change the master code and deploy it, the change takes effect without having to redeploy the projects that use it.
You can reuse flows etc. (everything which is in Mule xml files) and Java classes by placing them in a plain Java project, building a jar from it and placing the jar on the classpaths of the importing Mule projects.
To use the stuff in the XML files, import them into your project's XML, e.g. with <spring:import resource="classpath:master-flows.xml"/>.
Your question sounds like you already know this part.
I recommend building all the Mule projects and the so-called master project with Maven: the Mule projects with packaging mule, the master project with packaging jar.
Maven will pack the master part inside the using projects, so there is no dynamic update.
When you want this dynamic update, either don't build with Maven or set the dependency scope to "provided" (see the snippet below). In that case the master is not packaged inside the other Mule projects. You have to make sure it is on the server classpath, e.g. in lib/user. Then you can change it there, restart the Mule server, and all projects get the update.
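For example, with hypothetical coordinates for the master project, each importing Mule project would declare (a sketch):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>master-flows</artifactId>
    <version>1.0.0</version>
    <scope>provided</scope>
</dependency>

The provided scope keeps the jar out of the packaged application, so the copy in lib/user is the one that gets loaded.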
The same, with another level of indirection and the possibility of grouping, can be done with Mule domains.
All the dynamic options described so far only work for on-premise Mule servers, not for CloudHub.

PACT: java-maven

I need a few answers to clear up my doubts:
Are pact-mock-service and pact-jvm-server the same thing? Please describe.
I am implementing Pact in Java with Maven.
I am able to run these:
https://github.com/anha1/microservices-pact-maven
https://github.com/warmuuh/pactbroker-maven-plugin
Help me understand how these relate to pact-mock-service and pact-jvm-server.
pact-mock-service is a general mock server built into the Pact libraries to support mocking out the other dependency in an integration during a consumer test. If you use any of the consumer test support libraries, you do not need to use it directly.
pact-jvm-server is a controllable server that bundles pact-mock-service and allows you to set up and tear down mock servers via HTTP requests. It exists for people who cannot, or do not wish to, use the consumer test support libraries.
For people using Maven, there is a plugin provided as part of the pact-jvm project that can do provider verification tests and publish to a pact broker. For the consumer tests, they just run as JUnit tests so you don't need any Maven specific plugin.
Of the two links you posted, the first is an example project using a Spring Boot application, and the second is a Maven plugin that only provides publishing to a Pact Broker.
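For reference, a consumer test using the JUnit support might look like this (a sketch against the pact-jvm-consumer-junit 3.5.x API; rule and package names differ between pact-jvm versions, and the provider, consumer, path, and body are placeholders):

import au.com.dius.pact.consumer.Pact;
import au.com.dius.pact.consumer.PactProviderRuleMk2;
import au.com.dius.pact.consumer.PactVerification;
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.model.RequestResponsePact;
import org.junit.Rule;
import org.junit.Test;

import java.net.HttpURLConnection;
import java.net.URL;

import static org.junit.Assert.assertEquals;

public class GreetingConsumerPactTest {

    // Starts the bundled mock server for the duration of each test
    @Rule
    public PactProviderRuleMk2 provider = new PactProviderRuleMk2("greeting-provider", this);

    // Declares the interaction the mock server should expect and replay
    @Pact(provider = "greeting-provider", consumer = "greeting-consumer")
    public RequestResponsePact createPact(PactDslWithProvider builder) {
        return builder
                .uponReceiving("a request for a greeting")
                .path("/greeting")
                .method("GET")
                .willRespondWith()
                .status(200)
                .body("{\"message\": \"hello\"}")
                .toPact();
    }

    @Test
    @PactVerification("greeting-provider")
    public void consumerReceivesGreeting() throws Exception {
        // Exercise the declared interaction against the mock server
        HttpURLConnection conn = (HttpURLConnection)
                new URL(provider.getUrl() + "/greeting").openConnection();
        assertEquals(200, conn.getResponseCode());
    }
}

Running the test writes the pact file to target/pacts, which the Maven plugin can then publish to a broker.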
