I have a spring-integration channel hooked up to a service-activator using the XML configuration. I've attached an sftp inbound-channel-adapter to the same channel. This is working quite well.
I would like to allow my clients to add/remove SFTP inbound-channel-adapters to the channel through my web interface, but instantiating spring-integration components appears to be fairly tightly coupled to the XML Spring context (see org.springframework.integration.sftp.config.SftpInboundChannelAdapterParser).
Is there a way to add/remove SFTP inbound-channel-adapters after the application has started?
This is not a trivial task.
At the very least, all Spring Integration components are Spring beans.
So, if you want to do something with Spring at runtime, you have to work with the ApplicationContext, which provides the Dependency Injection features.
I suggest taking a look at this sample: https://github.com/spring-projects/spring-integration-samples/tree/master/advanced/dynamic-ftp
It demonstrates a dynamic registration technique.
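The core idea of that sample is to create a child application context from an XML template at runtime, parameterize it with placeholders, and close the child context later to remove the adapter again. A rough sketch of that idea, assuming a hypothetical sftp-inbound-context.xml template and placeholder names chosen just for illustration:

    import java.util.Properties;

    import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.context.support.ClassPathXmlApplicationContext;

    public class DynamicSftpAdapterRegistrar {

        private final ConfigurableApplicationContext parent;

        public DynamicSftpAdapterRegistrar(ConfigurableApplicationContext parent) {
            this.parent = parent;
        }

        public ConfigurableApplicationContext addAdapter(String host, String remoteDir) {
            // Create a child context from the XML template, but do not refresh it yet
            ClassPathXmlApplicationContext child = new ClassPathXmlApplicationContext(
                    new String[] { "sftp-inbound-context.xml" }, false, parent);

            // Fill in the placeholders used inside the template
            Properties props = new Properties();
            props.setProperty("sftp.host", host);
            props.setProperty("sftp.remote.dir", remoteDir);
            PropertyPlaceholderConfigurer configurer = new PropertyPlaceholderConfigurer();
            configurer.setProperties(props);
            child.addBeanFactoryPostProcessor(configurer);

            // Refreshing the child context instantiates and starts the adapter beans
            child.refresh();

            // Keep this reference; calling close() on it later removes the adapter
            return child;
        }
    }

Because the child context can resolve beans from the parent, the dynamically added adapter can still send to the channel that feeds your existing service-activator.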
Related
My requirement, for starters, is to send a string from one spring-boot application to another using AMQP.
I am new to it, and I have gone through this spring-boot guide, so I know the basic fundamentals of Queue, Exchange, Binding, Container, and Listener.
The guide above shows the steps when the AMQP message is sent and received in the same application.
I am a little confused about where to start if I want to achieve this type of communication between two different spring-boot applications.
What are the properties needed for that, etc.?
Let me know if any details are required.
Just divide the application into two:
One without Receiver and ...
Another without Sender
Make sure your application and configuration etc. stay the same. With Spring Boot's built-in RabbitMQ support, you will be able to run both just fine.
The next step is to call the sender as and when needed from your business logic.
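As a rough sketch (assuming both applications use spring-boot-starter-amqp, point their spring.rabbitmq.* properties at the same broker, and share a queue name of demo.queue chosen only for illustration), the sending application could look like this:

    // SenderApplication.java (first Spring Boot application)
    import org.springframework.amqp.core.Queue;
    import org.springframework.amqp.rabbit.core.RabbitTemplate;
    import org.springframework.boot.CommandLineRunner;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class SenderApplication {

        // Declare the queue so it exists even if the receiver has not started yet
        @Bean
        public Queue demoQueue() {
            return new Queue("demo.queue");
        }

        // Send a test message on startup; in real code you would call the
        // RabbitTemplate from your business logic instead
        @Bean
        public CommandLineRunner send(RabbitTemplate template) {
            return args -> template.convertAndSend("demo.queue", "Hello from the sender app");
        }

        public static void main(String[] args) {
            SpringApplication.run(SenderApplication.class, args);
        }
    }

And the second application only needs a listener on the same queue:

    // ReceiverApplication.java (second Spring Boot application, same broker)
    import org.springframework.amqp.core.Queue;
    import org.springframework.amqp.rabbit.annotation.RabbitListener;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class ReceiverApplication {

        @Bean
        public Queue demoQueue() {
            return new Queue("demo.queue");
        }

        // Invoked for every message arriving on "demo.queue"
        @RabbitListener(queues = "demo.queue")
        public void receive(String message) {
            System.out.println("Received: " + message);
        }

        public static void main(String[] args) {
            SpringApplication.run(ReceiverApplication.class, args);
        }
    }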
Currently I'm using SCS (Spring Cloud Stream) with an almost default configuration for sending and receiving messages between microservices.
I've read this post:
https://www.confluent.io/blog/enabling-exactly-kafka-streams
and wonder whether it will work if we just set the property "processing.guarantee" to "exactly-once" through the properties in the Spring Boot application.
In the context of your question, you should look at Spring Cloud Stream as just a delegate between the target system (e.g., Kafka) and your code. The binders that enable such delegation are usually implemented in such a way that they propagate whatever functionality is supported by the target system.
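For example, with the Kafka Streams binder, arbitrary Kafka Streams properties can be passed through the binder's configuration map, so the setting from the blog post would look roughly like this (note that Kafka Streams itself expects the underscore form of the value; check the exact property prefix against the binder version you are using):

    spring.cloud.stream.kafka.streams.binder.configuration.processing.guarantee=exactly_once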
ApplicationContext supports event propagation. What does that mean, and where is it used in our applications? Can you please provide a use case for this?
Is it sending an event from one context to another context, or is there some other use?
Regards,
Srikanth
Is it sending an event from one context to another context, or is there some other use?
No, the events stay only within the context in which they are fired.
Spring’s eventing mechanism is designed for simple communication between Spring beans within the same application context. However, for more sophisticated enterprise integration needs, the separately-maintained Spring Integration project provides complete support for building lightweight, pattern-oriented, event-driven architectures that build upon the well-known Spring programming model.
(Spring Reference)
Maybe this helps: How to bridge Spring Application Context events to an other context
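As a small illustration of in-context eventing (the class and event names are made up for the example): one bean publishes an event through the ApplicationEventPublisher, and another bean in the same ApplicationContext receives it.

    import org.springframework.context.ApplicationEventPublisher;
    import org.springframework.context.event.EventListener;
    import org.springframework.stereotype.Component;

    // A plain POJO event; extending ApplicationEvent is not required since Spring 4.2
    class OrderCreatedEvent {
        private final String orderId;
        OrderCreatedEvent(String orderId) { this.orderId = orderId; }
        String getOrderId() { return orderId; }
    }

    @Component
    class OrderService {
        private final ApplicationEventPublisher publisher;

        OrderService(ApplicationEventPublisher publisher) {
            this.publisher = publisher;
        }

        void createOrder(String orderId) {
            // ... business logic ...
            publisher.publishEvent(new OrderCreatedEvent(orderId)); // stays inside this context
        }
    }

    @Component
    class AuditListener {
        @EventListener
        void onOrderCreated(OrderCreatedEvent event) {
            System.out.println("Audit: order created " + event.getOrderId());
        }
    }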
I would like to create an example (code) using Spring in which the business logic is distributed across different servers, such as JBoss or GlassFish, yet still runs under one transaction. First of all, is this possible in Spring? I know EJB has this option; is there a similar technique in Spring as well? I am looking for a synchronous communication approach, not asynchronous message-oriented middleware. Any help/pointers appreciated.
Thanks
Prakash
Spring has support for RMI and provides its own remoting mechanism, HttpInvoker, but according to the docs neither provides remote transaction propagation.
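To illustrate what that remoting looks like (the service interface, URL, and bean names are placeholders invented for this sketch; HttpInvoker is deprecated in recent Spring versions), the call is a plain synchronous HTTP invocation, and the transaction active on the client does not travel with it to the server:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.remoting.httpinvoker.HttpInvokerProxyFactoryBean;
    import org.springframework.remoting.httpinvoker.HttpInvokerServiceExporter;

    interface AccountService {
        void debit(String accountId, long amountCents);
    }

    // Server side: export the service bean over HTTP
    @Configuration
    class ServerRemotingConfig {
        @Bean(name = "/AccountService")
        HttpInvokerServiceExporter accountServiceExporter(AccountService accountService) {
            HttpInvokerServiceExporter exporter = new HttpInvokerServiceExporter();
            exporter.setService(accountService);
            exporter.setServiceInterface(AccountService.class);
            return exporter;
        }
    }

    // Client side: a proxy that makes a synchronous HTTP call to the other server;
    // any transaction started on the client ends at this boundary
    @Configuration
    class ClientRemotingConfig {
        @Bean
        HttpInvokerProxyFactoryBean accountService() {
            HttpInvokerProxyFactoryBean proxy = new HttpInvokerProxyFactoryBean();
            proxy.setServiceUrl("http://server-host:8080/AccountService");
            proxy.setServiceInterface(AccountService.class);
            return proxy;
        }
    }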
Similar questions:
Spring Distributed Transaction Involving RMI calls possible?
Transaction propagation in multiple servlet context with multiple data source
Is there an easy/lightweight way to add persistence to Spring's JavaMailSender and have it operate asynchronously? Does Spring provide any "built-in" support for this? I'm currently looking at queues with JMS, but they seem like overkill for the task at hand (looking at ActiveMQ and RabbitMQ). Is there a lightweight JMS option?
Your approach with JMS is fine. Unfortunately, persistence plus asynchronous processing is not such a simple task, and you will have to code a bit.
However, have a look at Spring Integration: it provides built-in support for JMS inbound adapters and e-mail outbound adapters; all you have to do is connect the pieces via the XML DSL.
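As an illustration of wiring those pieces together (shown here with Spring Integration's Java DSL rather than the XML DSL the answer mentions; the queue name, mail host, addresses, and credentials are placeholders): a JMS message-driven inbound adapter feeds a mail outbound adapter, so mail requests sit durably on the queue until they are sent.

    import javax.jms.ConnectionFactory;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.config.EnableIntegration;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.jms.dsl.Jms;
    import org.springframework.integration.mail.dsl.Mail;

    @Configuration
    @EnableIntegration
    class MailRelayConfig {

        @Bean
        public IntegrationFlow mailRelayFlow(ConnectionFactory connectionFactory) {
            return IntegrationFlows
                    // messages parked on the "outbound.mail" queue survive a restart (persistence)
                    .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                            .destination("outbound.mail"))
                    // turn the payload into an e-mail; addresses here are placeholders
                    .enrichHeaders(Mail.headers()
                            .subject("Notification")
                            .from("noreply@example.com")
                            .to("ops@example.com"))
                    .handle(Mail.outboundAdapter("smtp.example.com")
                            .port(25)
                            .credentials("user", "secret")
                            .protocol("smtp"))
                    .get();
        }
    }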
If you want to make any method in Spring asynchronous, all you need to do is configure the task namespace in the XML config via <task:annotation-driven/>. Then you just annotate the method with @Async and it will run in its own thread. Note that an async call will not participate in the caller's transaction, since Spring grabs a new thread from its internal pool to service the call. If you do this, then you don't need JMS for asynchronous processing.
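A minimal sketch of the Java-config equivalent (using @EnableAsync in place of the <task:annotation-driven/> XML element; the service and method names are illustrative):

    import org.springframework.context.annotation.Configuration;
    import org.springframework.mail.SimpleMailMessage;
    import org.springframework.mail.javamail.JavaMailSender;
    import org.springframework.scheduling.annotation.Async;
    import org.springframework.scheduling.annotation.EnableAsync;
    import org.springframework.stereotype.Service;

    @Configuration
    @EnableAsync   // Java-config equivalent of <task:annotation-driven/>
    class AsyncConfig {
    }

    @Service
    class AsyncMailService {

        private final JavaMailSender mailSender;

        AsyncMailService(JavaMailSender mailSender) {
            this.mailSender = mailSender;
        }

        // Runs on a thread from Spring's task executor, outside the caller's transaction
        @Async
        public void sendWelcomeMail(String to) {
            SimpleMailMessage message = new SimpleMailMessage();
            message.setTo(to);
            message.setSubject("Welcome");
            message.setText("Thanks for signing up!");
            mailSender.send(message);
        }
    }

Keep in mind that with plain @Async the mail request lives only in memory, so anything queued but not yet sent is lost on a crash; that is where the JMS-backed approach above earns its extra complexity.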