Is there a way to specify the order in which the CAR files inside the carbonapps folder are deployed?
I have a scenario where one CAR file depends on another CAR.
In this case, I have a CAR that uses the file connector, so the CAR that contains the file connector must be deployed first.
As I cannot guarantee that order, we are having deployment issues.
Is it possible to specify an order for the deployment?
Thanks,
At CApp deployment time, the order of artifact types is defined in https://github.com/wso2/maven-tools/blob/master/org.wso2.maven.capp/src/main/java/org/wso2/maven/capp/utils/CAppArtifactPriorityMapping.java#L30.
Within each type, artifacts are deployed in alphabetical order. By default, the dependencies in the artifacts.xml file are listed in alphabetical order when the CApp is created.
You can change the artifact deployment order by manually reordering the dependencies in artifacts.xml as preferred.
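As a rough sketch of what that reordering looks like, here is a hypothetical top-level artifacts.xml; the artifact names, versions and serverRole are placeholders, so copy the real values from your own CApp:

    <artifacts>
        <artifact name="MyCompositeApplication" version="1.0.0" type="carbon/application">
            <!-- listed first so the file connector artifact is deployed before the proxy that uses it -->
            <dependency artifact="org.wso2.carbon.connector.file" version="1.0.0" include="true" serverRole="EnterpriseServiceBus"/>
            <dependency artifact="FileTransferProxy" version="1.0.0" include="true" serverRole="EnterpriseServiceBus"/>
        </artifact>
    </artifacts>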
The artifacts will be deployed in the order below. The only other option is to separate the artifacts into different Composite Applications (move all dependency artifacts into one CApp) and deploy the CApps in the order you prefer. But you have to deploy the dependency CApp (the one which contains the dependency artifacts) first.
Local entries
Endpoints
Sequences
Message stores
Templates
Proxy services
Tasks
Events
Message processors
APIs
Inbound endpoints
Other types
Thank you
Related
I'm working on a multi-module build that needs to create an artifact from all WSDLs available in an internal repository. There are a lot of them, and I don't want to maintain a list, because another WSDL project might be created later and need to be included; if the list isn't updated, the final artifact will be incomplete.
So I need to know whether there is any way to tell Gradle to look at a certain path, like domain.com/path/to/repo/wsdls/, and fetch all available artifacts from that path.
I was thinking of creating a configuration that downloads from this specific repository, but it seems a configuration does not carry its own repository and will use the ones defined in build.gradle.
Is there any way to define a download-everything pattern in the dependencies block?
EDIT: Note: a WSDL project means SOAP services in a zip archive.
I would like to package two openapi.yaml definition files, each with its corresponding implementation in its own WAR file, into one EAR and deploy it to Open Liberty. So far this works: when Open Liberty starts up, it shows me the URL for ~/openapi/ui and the corresponding REST services ~/converter1 and ~/converter2. But in openapi/ui I can only see one service definition; the second one is not shown. Am I doing something wrong? Should my scenario work with Open Liberty?
My general use case is to have several REST services, each defined by an OpenAPI document, grouped together as long as they belong to a common domain. Until now I can run each openapi.yaml on its own Open Liberty instance, but I would like to group my REST services together into one Open Liberty server.
Does somebody know a solution to my problem?
As you have noted, Open Liberty's MicroProfile OpenAPI support (via the mpOpenAPI-1.0 feature) only supports a single application per server.
If you want to aggregate multiple OpenAPI documents in a single server you have to use WebSphere Liberty's openapi-3.1 feature. See these docs for more info.
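For reference, a minimal server.xml sketch of enabling that feature could look like the following; the jaxrs-2.1 entry is only an example of whatever application features the server already uses:

    <server description="server aggregating OpenAPI documents from multiple applications">
        <featureManager>
            <!-- WebSphere Liberty feature that aggregates OpenAPI documents across deployed applications -->
            <feature>openapi-3.1</feature>
            <!-- example of an application feature you may already have enabled -->
            <feature>jaxrs-2.1</feature>
        </featureManager>
    </server>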
I have created a service builder project (Gradle type) in Liferay 7 called register-user. There is another service builder project called register-organization. I have a situation where one of the service builders depends upon the other. However, I am not able to figure out where to put the dependency of one on the other. Is there any way to do that?
With each service builder project you create from the template, you get two projects, e.g. register-user-api and register-user-service. The -service project depends on the -api project and has the dependency noted in its build.gradle. Look it up and use exactly the same notation to make any other project depend on register-user-api.
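As a rough sketch only (the project path below is a guess at a typical Liferay workspace layout; copy the exact notation from register-user-service's build.gradle), the dependency in register-organization-service's build.gradle could look like:

    dependencies {
        // hypothetical workspace path - use the same notation you find in register-user-service's build.gradle
        compileOnly project(":modules:register-user:register-user-api")
    }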
The situation changes if both projects do not live in the same workspace: In that case you'll need your own repository (e.g. proxy for Maven Central) where you publish your own modules. Then you can just declare a standard dependency for your modules.
How can I reuse Mule code (flows, exception strategies, database connectors, validators) across several projects? These are application-specific reusable artifacts, not enterprise-wide reuse.
For example: I have some master code (validators, flows, and exception strategies) which should be reused in 15 different flows (i.e. 15 different Mule projects). We are not using Maven at the moment. One way I explored is to JAR the master code, publish it to a local Nexus repo, and reuse it via the POM. Is there any other way?
If possible, I would also like to make it dynamic, so that if I change the master code and deploy it, the change takes effect without having to redeploy the projects that use it.
You can reuse flows etc. (everything that is in Mule XML files) and Java classes by placing them in a plain Java project, building a jar from it, and placing the jar on the classpaths of the importing Mule projects.
To use the contents of the XML files, import them into the Mule configuration of each using project with an import element.
Your question sounds like you already know this part.
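In Mule 3.x that import is typically a Spring import of the XML file that lives inside the master jar; a minimal sketch (master-flows.xml is a made-up file name) looks like this fragment of the importing project's Mule configuration:

    <spring:beans>
        <!-- master-flows.xml is a hypothetical name for a config file packaged in the master jar -->
        <spring:import resource="classpath:master-flows.xml"/>
    </spring:beans>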
I recommend building all Mule projects and the so-called master project with Maven: the Mule projects with packaging mule, the master project with packaging jar.
Maven will package the master jar inside the using projects, so there is no dynamic update.
If you want the dynamic update, either don't build with Maven or set the dependency scope to "provided". In that case the master is not packaged into the other Mule projects. You have to make sure it is on the server classpath, e.g. in lib/user. Then you can change it there, restart the Mule server, and all projects pick up the update.
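As a sketch, the dependency on the master project in each Mule project's pom.xml could look like this (the coordinates are placeholders):

    <dependency>
        <!-- hypothetical coordinates of the shared master project -->
        <groupId>com.example</groupId>
        <artifactId>mule-master-commons</artifactId>
        <version>1.0.0</version>
        <!-- "provided" keeps the jar out of the packaged app; place it in lib/user on the server instead -->
        <scope>provided</scope>
    </dependency>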
The same, with another level of indirection and a further possibility for grouping, can be done with Mule domains.
All the dynamic options described so far only work for on-premise Mule servers, not for CloudHub.
I'd like to keep Mule projects small for the sake of simplicity. This means I'd like a single Mule project to hold only flows that are related.
However, when deploying, I'd like to bundle all the projects and deploy them together as one. Is there a good way to do this using Maven that results in a single Mule deployable file, which to Mule looks like a single project?
As far as I understand, the mule-deploy.properties file is the one responsible for holding the list of flows (configuration files) that should be deployed and executed.
It is generated automatically when you first create a project, and it is populated automatically as flows are added.
It is located at src/main/app.
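For illustration, a mule-deploy.properties listing several configuration files might look like the sketch below; the file names are made up, while config.resources and redeployment.enabled are the usual Mule 3 keys:

    # Mule configuration XML files that make up the application
    config.resources=orders-flows.xml,customers-flows.xml,common-error-handling.xml
    redeployment.enabled=true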
Try deploying the project on Mule standalone and it will deploy all the flows as one.