Spring Cloud Task - launch task from Maven repository in Docker container

I am learning Spring Cloud Task and have written a simple application that is divided into three services. The first is a TaskApplication that only has a main() method and implements CommandLineRunner; the second is a TaskIntakeApplication that receives requests and sends them to RabbitMQ; the third is a TaskLauncherApplication that receives messages from RabbitMQ and runs the task with the received parameters.
@Component
@EnableBinding(Source.class)
public class TaskProcessor {

    @Autowired
    private Source source;

    public void publishRequest(String arguments) {
        final String url = "maven://groupId:artifactId:jar:version";
        final List<String> args = Arrays.asList(arguments.split(","));
        final TaskLaunchRequest request = new TaskLaunchRequest(url, args, null, null, "TaskApplication");
        final GenericMessage<TaskLaunchRequest> message = new GenericMessage<>(request);
        source.output().send(message);
    }
}
As you can see, I point at my built artifact with a Maven URL, but I wonder how I can reference an artifact from another Docker container instead.

If you intend to launch a task application from an upstream event (e.g., a new file event, a new DB record event, a new message in Rabbit, etc.), you'd simply use the respective out-of-the-box applications and then launch the task via the Task Launcher.
Follow this example to see how the three steps are orchestrated via SCDF's DSL.
Perhaps you could consider reusing the existing apps instead of reinventing them, unless you have a completely different requirement that these apps cannot meet. I'd suggest getting the example mentioned above working locally before you consider extending the behavior.
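On the original question of pointing at a Docker image rather than a Maven artifact: here is a minimal sketch of the same TaskProcessor idea, assuming the task launcher that consumes the request can resolve Docker resource URIs (the docker: scheme used when registering apps with SCDF) and using a placeholder image name:
public void publishDockerRequest(String arguments) {
    // Placeholder image coordinates; this only works if the downstream
    // launcher/deployer supports Docker resources in addition to maven:// artifacts.
    final String url = "docker:your-org/task-application:1.0.0";
    final List<String> args = Arrays.asList(arguments.split(","));
    final TaskLaunchRequest request = new TaskLaunchRequest(url, args, null, null, "TaskApplication");
    source.output().send(new GenericMessage<>(request));
}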

Related

Download files to a local folder by using Spring Integration in spring boot application

I am new to the Spring Integration framework. Currently I am working on a project that has a requirement to download files to a local directory.
My goal is to complete the tasks below:
1. Download the files to a local directory using Spring Integration.
2. Trigger a batch job, i.e., read the file and extract a specific column's information.
I am able to connect to the SFTP server, but I am having difficulty working out how to use the Spring Integration Java DSL to download the files and trigger a batch job.
Below is the code to connect to the SFTP session factory:
@Bean
public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(sftpHost);
    factory.setPort(sftpPort);
    factory.setUser(sftpUser);
    if (sftpPrivateKey != null) {
        factory.setPrivateKey(sftpPrivateKey);
        factory.setPrivateKeyPassphrase(privateKeyPassPhrase);
    } else {
        factory.setPassword(sftpPassword);
    }
    factory.setAllowUnknownKeys(true);
    logger.info("Connecting to SFTP Server: " + factory.getSession());
    return new CachingSessionFactory<ChannelSftp.LsEntry>(factory);
}
Below is the code to download the files from remote to local:
@Bean
public IntegrationFlowBuilder integrationFlow() {
    return IntegrationFlows.from(Sftp.inboundAdapter(sftpSessionFactory()));
}
I am using the Spring Integration DSL, but I am not able to work out what to code here.
I have tried many possible approaches, but I cannot figure out how to proceed with this requirement.
Can anyone help me with how to approach this and, if possible, share some sample code for reference?
The Sftp.inboundAdapter() produces messages with a File as a payload. So, with that IntegrationFlows.from(Sftp.inboundAdapter(sftpSessionFactory())) you can treat the first task as done.
Your problem from here is that you don't build an IntegrationFlow, but rather return the IntegrationFlowBuilder and register it as a @Bean. That's why it doesn't work for you.
You need to continue the flow definition and call its get() at the end to return an IntegrationFlow instance, which then has to be registered as a bean. If this code flow is a bit confusing, consider implementing an IntegrationFlowAdapter as a @Component.
To trigger a batch job you should consider using a FileMessageToJobRequest in a .transform() EIP-method and then a JobLaunchingGateway in a .handle() EIP-method.
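Putting that together, a rough sketch of what the continued flow could look like (not a drop-in implementation: the directories, poller interval, and parameter name are assumptions, and FileMessageToJobRequest is the small transformer class shown in the Spring Batch integration docs that you copy into your own project):
@Bean
public IntegrationFlow sftpToBatchFlow(JobLauncher jobLauncher, Job job) {
    return IntegrationFlows
            .from(Sftp.inboundAdapter(sftpSessionFactory())
                            .remoteDirectory("/remote/in")               // assumed remote path
                            .localDirectory(new File("/tmp/sftp-in"))    // assumed local path
                            .autoCreateLocalDirectory(true),
                    e -> e.poller(Pollers.fixedDelay(5000)))
            .transform(fileMessageToJobRequest(job))       // File -> JobLaunchRequest
            .handle(new JobLaunchingGateway(jobLauncher))  // JobLaunchRequest -> launched job
            .get();                                        // returns the IntegrationFlow to register as a bean
}

public FileMessageToJobRequest fileMessageToJobRequest(Job job) {
    FileMessageToJobRequest transformer = new FileMessageToJobRequest();
    transformer.setJob(job);
    transformer.setFileParameterName("input.file.name"); // the downloaded file's path becomes a job parameter
    return transformer;
}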
See more info in docs:
https://docs.spring.io/spring-integration/reference/html/dsl.html#java-dsl
https://docs.spring.io/spring-integration/reference/html/sftp.html#sftp-inbound
https://docs.spring.io/spring-batch/docs/4.3.x/reference/html/spring-batch-integration.html#spring-batch-integration-configuration
BTW, the last one has a flow sample exactly for your use-case.

Spring with Apache Beam

I want to use Spring with Apache Beam running on the Google Cloud Dataflow runner. The Dataflow job should be able to use the Spring runtime application context while executing the pipeline steps. I want to use Spring features in my Apache Beam pipeline for DI and other things. After hours of searching, I couldn't find any post or documentation that shows Spring integration in Apache Beam. So, if anyone has tried Spring with Apache Beam, please let me know.
In the main class I have initialised the Spring application context, but it is not available while the pipeline steps are executing. I get a NullPointerException for autowired beans. I guess the problem is that at runtime the context is not available to the worker threads.
public static void main(String[] args) {
    initSpringApplicationContext();
    GcmOptions options = PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(GcmOptions.class);
    Pipeline pipeline = Pipeline.create(options);
    // pipeline definition
}
I want to inject the Spring application context into each of the ParDo functions.
The problem here is that the ApplicationContext is not available on any worker, as the main method is only called when constructing the job and not on any worker machine. Therefore, initSpringApplicationContext is never called on any worker.
I've never tried to use Spring within Apache Beam, but I guess moving initSpringApplicationContext into a static initializer block will lead to your expected result.
public class ApplicationContextHolder {

    private static final ApplicationContext CTX;

    static {
        CTX = initApplicationContext();
    }

    public static ApplicationContext getContext() {
        return CTX;
    }
}
Please be aware that this alone shouldn't be considered a best practice for using Spring within Apache Beam, since it doesn't integrate well with Apache Beam's lifecycle. For example, when an error happens during the initialization of the application context, it will surface in the first place where the ApplicationContextHolder happens to be used. Therefore, I'd recommend extracting initApplicationContext out of the static initializer block and calling it explicitly with regard to Apache Beam's lifecycle. The setup phase would be a good place for this.
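For illustration, a minimal sketch of that idea, assuming a hypothetical MyService bean and the ApplicationContextHolder from above: the context is obtained in the DoFn's @Setup hook, so it happens on each worker rather than only in main():
public class EnrichFn extends DoFn<String, String> {

    private transient ApplicationContext ctx;

    @Setup
    public void setup() {
        // Runs on the worker before elements are processed, so the Spring
        // context exists where the element processing actually happens.
        ctx = ApplicationContextHolder.getContext();
    }

    @ProcessElement
    public void processElement(ProcessContext c) {
        MyService service = ctx.getBean(MyService.class); // hypothetical bean
        c.output(service.process(c.element()));
    }
}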

Springboot REST endpoints with Alfresco Process Services (Activiti)

I have installed Alfresco Process Services on my local machine. I have also created a Spring Boot project to write custom listeners such as task listeners or execution listeners. These listeners are working fine; I build a jar file and put it into the webapp\activiti-app\WEB-INF\lib folder.
Now I want to add REST endpoints to my application so that external users can directly take an action on a task.
I added the following class beside my main application class (the one with the main method).
@RestController
@RequestMapping("/api2")
public class WorkflowController {

    @RequestMapping("/greeting")
    public String greeting(@RequestParam(value = "name", defaultValue = "World") String name) {
        return "Hello from " + name;
    }
}
The issue is that when I try to access the endpoint via either of the URLs below, it gives me a 404 error.
http://localhost:8081/activiti-app/api2/greeting
OR
http://localhost:8081/api2/greeting
Please help
To access the URL publicly you need to start your mapping with /enterprise. APS requires this to distinguish between public and private REST APIs, so your @RequestMapping("/api2") should be @RequestMapping("/enterprise/api2"), which should then be accessible at http://localhost:8081/activiti-app/api/enterprise/api2/greeting. Refer to the developer series for a detailed example.

Initialize spring cloud config client, for SDK development

This question is somewhat similar to this existing question
I am still trying to navigate, or find, the right Spring Boot code that I can customize. I need to develop a Java SDK which connects to an existing config server and provides values for keys. This SDK will be used in Java applications, which might or might not be Spring applications. The same SDK will be used by QA for regression testing of the config server.
So the question is: given
Config server URL
application name
active profile (no need for a label, it will default to master),
can I initialize some config client class which will give me simple methods like public String getKeyValue(final String key)?
I am looking at the source of classes like ConfigServicePropertySourceLocator, CompositePropertySource, ConfigClientAutoConfiguration, ConfigServiceBootstrapConfiguration, etc.
Do I need to build the Environment object manually? If yes, how?
I have had some success. Posting a possible answer for others to fine-tune further.
@SpringBootApplication
public class ConfigSDKApp {

    @Autowired
    public SomeSpringBean someBean = null;

    private static ConfigSDKApp INSTANCE = null;

    public static synchronized ConfigSDKApp getInstance(String[] args) {
        if (null != INSTANCE) {
            return INSTANCE;
        }
        SpringApplication sprApp = new SpringApplication(ConfigSDKApp.class);
        sprApp.setWebEnvironment(false);
        ConfigurableApplicationContext appContext = sprApp.run(args);
        ConfigSDKApp app = appContext.getBean(ConfigSDKApp.class); // new ConfigSDKApp();
        INSTANCE = app;
        return INSTANCE;
    }
}
It's kind of a singleton class (but with a public constructor), hence the code smell.
Also, what if this SDK is running within a Spring Boot client? The ApplicationContext and Environment are already initialized there.
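To expose the getKeyValue(String) method the question asks for, one hedged sketch (assuming you also keep a reference to the ConfigurableApplicationContext created in getInstance(), e.g. in an appContext field) is to delegate to the Environment that the config client has populated from the server:
// Sketch only: store the context created in getInstance() in a field,
// then look keys up via its Environment.
public String getKeyValue(final String key) {
    return this.appContext.getEnvironment().getProperty(key);
}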

using Spring integration with spring batch

I have a Spring Batch application which reads from a file, does some processing, and finally writes customized output. This all happens in one step. In the next step I have a tasklet which archives the input files (moves them to another folder). This application works fine. But now I have a requirement to SFTP the output files to a remote server, where they will be processed further. I found a way to SFTP using Spring Integration, where I created an input channel which feeds an outbound channel adapter. I put my files as payloads in messages and send the messages to the channel. The only problem I see here is that every time I need the context, I have to load the Spring config file, which seems a rather hackish way to do the task. Does anyone know a way to integrate SI with SB?
Let me know if you want to see my config.
Thanks in advance!
Code to access the same application context without loading the Spring config again:
public class AppContextProvider implements ApplicationContextAware {

    private static ApplicationContext ctx;

    public ApplicationContext getApplicationContext() {
        return ctx;
    }

    public void setApplicationContext(ApplicationContext appContext) throws BeansException {
        ctx = appContext;
    }
}
Code to push the output file to the SFTP server:
log.info("Starting transfer of outputFile: " + absoluteOutputFileName);
final File file = new File(absoluteOutputFileName);
final Message<File> message = MessageBuilder.withPayload(file).build();
AppContextProvider context = new AppContextProvider();
final MessageChannel inputChannel = context.getApplicationContext().getBean("toChannel", MessageChannel.class);
inputChannel.send(message);
log.info("Transfer complete for: " + absoluteOutputFileName);
Take a look at the spring-batch-integration module within the Spring Batch project. In there, we have components for launching jobs via messages. In your situation, you'd FTP the file down then have the JobLaunchingMessageHandler launch the job.
You can also watch this video of a talk I co-presented at SpringOne a couple years ago on this topic: https://www.youtube.com/watch?v=8tiqeV07XlI
As Michael said, you'll definitely want to look at and leverage spring-batch-integration. We actually use Spring Integration as a wrapper of sorts to launch 100% of our Spring Batch jobs.
One use case we've found particularly useful is leveraging the spring-integration-file Inbound Channel Adapters to poll staging directories to indicate when a new batch file has landed. As the poller finds a new file, we then launch a new batch job using the input filename as a parameter.
This has been a real help when it comes to restartability, because we now have one job instance per file as opposed to having a job kick off at arbitrary intervals and then partition across however many files happen to be in the staging folder. Now if an exception occurs during processing, you can target a specific job for restart immediately rather than waiting for 99 of the 100 "good" files to finish first.
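As an illustration of that pattern, a rough sketch (the directory, filter pattern, and bean names are assumptions, and FileMessageToJobRequest is the helper class from the Spring Batch integration docs mentioned earlier): a file inbound adapter polls the staging directory, and each new file becomes the identifying parameter of its own job instance.
@Bean
public IntegrationFlow stagingDirectoryFlow(JobLauncher jobLauncher, Job fileImportJob) {
    FileMessageToJobRequest toRequest = new FileMessageToJobRequest();
    toRequest.setJob(fileImportJob);
    toRequest.setFileParameterName("input.file.name"); // identifying parameter => one job instance per file
    return IntegrationFlows
            .from(Files.inboundAdapter(new File("/data/staging"))           // assumed staging directory
                            .filter(new SimplePatternFileListFilter("*.csv")),
                    e -> e.poller(Pollers.fixedDelay(10000)))
            .transform(toRequest)
            .handle(new JobLaunchingGateway(jobLauncher))
            .get();
}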
