I am working on a GWT project and we are using an EventBus to communicate events between widgets. I have 2 modules, and when I raise an event in one module, the other module does not receive it. How can I solve this? Any help?
Are you sure you've passed the same EventBus to both modules, and that both modules have subscribed to the event that you're publishing?
What EventBus class are you using, anyway? One you wrote yourself, or one that's included in a GWT library?
You are most likely using a different instance of EventBus in each of the modules.
Two possible reasons:
You have created two different instances (check the code for occurrences of something like new HandlerManager(null) if you are using the supplied one, or similar); if so, share a single instance instead, as sketched after this list.
You have a problem with passing the eventBus reference between modules; how are you passing data across these two modules?
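If both modules are part of the same compiled application, a minimal sketch of sharing one instance looks like the following, assuming GWT's com.google.web.bindery.event.shared.SimpleEventBus; the holder class and event names are made up for illustration:

import com.google.web.bindery.event.shared.EventBus;
import com.google.web.bindery.event.shared.SimpleEventBus;

// Hypothetical holder: both modules ask this class for the bus, so events
// fired in one module reach handlers registered in the other.
public final class SharedEventBus {
    private static final EventBus BUS = new SimpleEventBus();

    private SharedEventBus() { }

    public static EventBus get() {
        return BUS;
    }
}

Module A would then fire SharedEventBus.get().fireEvent(new SomethingHappenedEvent()) and module B would register with SharedEventBus.get().addHandler(SomethingHappenedEvent.TYPE, handler); the key point is that neither module ever creates its own bus.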
I have an application that includes the file-supplier Spring Cloud module. My workflow is: track file creation/modification in a particular directory, then push events to a Kafka topic when such events occur. FileSupplierConfiguration is used to configure this file supplier. But now I have to track one more directory and push events to another, relevant Kafka topic. This is an issue, because there is no way to include multiple FileSupplierConfiguration instances in the project to configure another file supplier. I remember that one of the main principles of microservices, which spring-cloud-stream was designed for, is that you do one thing and do it well without affecting others, but this is still the same microservice with the same tracking/pushing functionality, just for another directory and topic. Is there any possibility to add one more file supplier with its own configuration using the file-supplier module? Or is the best solution for this issue to run one more application instance with another configuration?
Yes, the best way is to have another instance of this application, but with its own specific configuration properties. This is really how these functions have been designed: a microservice based on convention over configuration. What you are asking really contradicts Spring Boot expectations. Imagine you needed to connect to several databases: you can only have a single JdbcTemplate auto-configured, and the rest is only possible manually. It is better to keep the same code base which relies on the auto-configuration and simply apply different props per instance.
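For example, a second instance could be started with its own externalized properties along these lines; the property names assume the standard file-supplier function and Spring Cloud Stream binding conventions, and the directory and topic values are placeholders:

file.supplier.directory=/data/other-directory
spring.cloud.stream.bindings.fileSupplier-out-0.destination=other-topic

Both instances run the same jar; only the externalized configuration differs.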
For a project, we will develop some microservices. They will have many common parts, like configuration classes, filters...
I am looking for the best practice and method to develop this.
For the moment, I have created a generic microservice that all the others fork to build their module on top of. But this is not convenient, and a modification in one web service could cause difficulties at the next merge of the generic MS.
Do you have any ideas on how to develop this, or sources to inspire me?
Thanks for your responses.
Create one or more libraries containing Abstract classes that implement the shared functionality.
Import one or more of the libraries into each individual project.
Extend the Abstract classes in the individual projects.
Use Spring annotations (for instance @Component or @RestController) in the individual projects, not in the common libraries.
Use some Spring annotations (for instance @Scheduled) in the common libraries.
If you do this, allow the individual projects to override the values in these annotations.
For example,
@Scheduled(initialDelayString = "${common.thing.initialDelayString:10000}",
           fixedDelayString = "${common.thing.fixedDelayString:60000}")
The example annotation includes default values and allows the individual projects to override them as desired.
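As a rough sketch of the whole pattern (class, package and property names are made up for illustration): the shared library holds the abstract class with the scheduling logic, and each individual project supplies the stereotype annotation and the missing piece.

// Shared library:
import org.springframework.scheduling.annotation.Scheduled;

public abstract class AbstractCleanupTask {

    @Scheduled(initialDelayString = "${common.cleanup.initialDelayString:10000}",
               fixedDelayString = "${common.cleanup.fixedDelayString:60000}")
    public void run() {
        cleanUp();
    }

    // Each service decides what cleanup means for it.
    protected abstract void cleanUp();
}

// Individual project:
import org.springframework.stereotype.Component;

@Component
public class OrderCleanupTask extends AbstractCleanupTask {
    @Override
    protected void cleanUp() {
        // service-specific work
    }
}

Each service only provides the concrete subclass and can override the common.cleanup.* properties in its own configuration if the defaults do not fit.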
I'm working on a GWT project but am stuck with a situation where I have to create 2 different modules with separate eventBus instances in the same project. One module should be a child of the other, i.e. there should be some way to invoke or load the child module from the main module's eventBus.
I tried it by following this link https://code.google.com/archive/p/mvp4g/wikis/MultiModules.wiki#Before,_After but it gives some deferred binding exceptions.
Can someone please elaborate on the detailed procedure to get to the solution?
We are writing a new set of services and have decided to make them share a common interface... calling it a BaseService. The idea is that whenever anyone wants to develop a new service in our organization, they should be just able to extend and use this BaseService.
We have written a few other classes which also form a part of this base jar, it does things like handle transactions and connect to database using hibernate etc.
Right now all the services that extend the BaseService are part of the same project (Eclipse + Maven), and some of the services are dependent on each other, but because they are in the same project we don't have a problem with dependencies. However, we expect 40-50 services to be written which would extend the base service and would also be interdependent.
I am worried that the size of the base project will become huge, and that anyone who needs just one service will have to depend on my base jar which contains all 50 services.
Is there a way that we can make some projects dynamically dependent on others?
Let's say I have a service A which depends on service B; when I build/compile service A, it should be able to realize that it has a dependency on service B and automatically use the service B jar.
I have heard of OSGi; will it solve my problem, is there a way I can do it with Maven, or is there a simpler solution?
Sorry about the long post !
Thanks in advance for your suggestions
It doesn't make any sense to "dynamically" manage project dependencies, since build dependencies are by definition not dynamic.
Your question, at least for the moment, seems to be entirely about how to build your code rather than about how to run it. If you are interested in creating a runtime system in which new service implementations can be dynamically added then OSGi may be the solution to look at. An extra advantage here would be that you could enforce the separation of API from implementation, and prevent the implementing services from invalidly depending on parts of your core module that you do not want them to have visibility of.
You could also use OSGi to manage the evolution of your core service API through versioning; for example, how do you communicate that a non-breaking change has been made to the API versus a breaking change?
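For instance, OSGi bundles communicate this with semantic version ranges in their manifests; the package name below is a placeholder:

Export-Package: com.example.service.api;version="1.1.0"
Import-Package: com.example.service.api;version="[1.0,2.0)"

A consumer importing [1.0,2.0) keeps resolving against non-breaking 1.x releases of the API but refuses to resolve against a breaking 2.0.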
I would say there are two options, depending on whether I understand your question correctly. First one: you have already defined an interface (in the Java sense) and now you have different implementations of it. The simple solution with Maven would be to have a module called, for example, service-api; this gets released and can be used by the others as a dependency. On their side they simply implement the interface. No problem with the dependencies. If you are rather talking about OSGi, then you should take a look at Maven Tycho.
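A minimal sketch of that split, with made-up module and class names: the interface lives in its own released service-api artifact, and each implementing project only declares a normal Maven dependency on it.

// Module service-api (released as its own artifact):
public interface GreetingService {
    String greet(String name);
}

// Module service-a (depends on the service-api artifact):
public class GreetingServiceImpl implements GreetingService {
    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}

Service A compiles only against the small service-api jar and never needs the other implementations on its classpath.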
Is there a way to decouple ManagedBeans from each other in a way that makes it possible to send and receive custom events - probably over the (cool) FacesContext?! I do not want to inject beans as ManagedProperty, to reduce direct dependencies. Unfortunately @ListenerFor and all that new stuff only works for components and renderers and seems like completely the wrong approach.
Those of you who are familiar with Adobe Flex' event mechanism know what I mean and what I expect from a standardized web UI framework.
Please let me know an elegant way that is included in the JSF specification without the need to implement another framework around.
Is there a way to decouple ManagedBeans from each other in a way that it is possible to send and receive custom events - probably over the (cool) FacesContext?!
Not without adding the event to a component, and you would have to add it before the Event phase of the JSF lifecycle.
I do not want to inject Beans as ManagedProperty, to reduce direct dependencies
Just because you are not injecting needed dependencies into your bean, doesn't mean that those dependencies wouldn't exist anyway if you are trying to go with an event driven model. At least by injecting the dependencies you explicitly declare what the managed bean depends on. This seems like a much more maintainable solution than what you are proposing.
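For reference, a minimal sketch of that explicit injection using the standard JSF 2.x managed-bean annotations; the bean and property names are made up:

import javax.faces.bean.ManagedBean;
import javax.faces.bean.ManagedProperty;

@ManagedBean
public class OrderBean {

    // The dependency is declared openly instead of being hidden behind events.
    @ManagedProperty(value = "#{customerBean}")
    private CustomerBean customerBean;

    // JSF calls this setter when it resolves the managed property.
    public void setCustomerBean(CustomerBean customerBean) {
        this.customerBean = customerBean;
    }
}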
Those of you who are familiar with Adobe Flex' event mechanism know what I mean and what I expect from a standardized web UI framework.
You expect a desktop-based event-driven model in a web application framework? This is apples to oranges. Adobe Flex is a Rich Internet Application framework that behaves like a desktop application while communicating with outside web services. JSF is a web application framework standard for web-based components powered by JavaScript and Ajax, with reusable server components and a server lifecycle which includes an event phase for components.
Please let me know an elegant way that is included in the JSF specification without the need to implement another framework around.
Your language implies that you do not find JSF elegant and that you are actively trying to make it something that it is not. Please do not do this, you will create a nightmare for yourself and anybody who has to maintain your application.
JSF requires a different way of thinking about web application development than what you are used to. If you find it that unpalatable, then I suggest abandoning it for a web application framework that fits your comfort level. You mentioned Flex; there is also Silverlight with .NET.