Reading different CSV files using FlatFileItemReader in Spring Batch

I tried to find a solution to this problem but didn't find one, or I may have missed it. Please redirect me to the relevant page if this has been answered before.
Input to the batch: I have TRADES.csv and PORTFOLIO.csv files, which are located at different paths.
How I have implemented it currently:
I wrote a class CSVReader that has two FlatFileItemReaders, defined as below.
<beans:bean id="porfolioReader"
class="org.springframework.batch.item.file.FlatFileItemReader">
<beans:property name="lineMapper" ref="lineMapperForPortfolio"></beans:property>
<beans:property name="strict" value="false"></beans:property>
<beans:property name="recordSeparatorPolicy"
ref="csvRecordSeparatorPolicy"></beans:property>
<beans:property name="linesToSkip" value="1"></beans:property>
<beans:property name="encoding" value="ISO-8859-1"></beans:property>
</beans:bean>
<beans:bean id="tradeReader"
class="org.springframework.batch.item.file.FlatFileItemReader">
<beans:property name="lineMapper" ref="lineMapperForTrades"></beans:property>
<beans:property name="strict" value="false"></beans:property>
<beans:property name="recordSeparatorPolicy"
ref="csvRecordSeparatorPolicy"></beans:property>
<beans:property name="linesToSkip" value="1"></beans:property>
<beans:property name="encoding" value="ISO-8859-1"></beans:property>
</beans:bean>
So, based on the input file path, I open the corresponding reader and build the FieldSet.
Case I am looking for:
I am looking to read these CSV files in a single step and build a FieldSet, so that in the processor I can split the data back into lists of TRADES and PORTFOLIO objects. Is there a way to make the FlatFileItemReader detect which CSV was picked up and choose the corresponding LineMapper?
Job Definition:
<batch:step id="tradeStep1" allow-start-if-complete="true">
<batch:tasklet>
<batch:chunk reader="csvReader"
processor="csvProcessor" writer="csvWriter"
commit-interval="1" />
</batch:tasklet>
<batch:next on="*" to="tradeStep2" />
<batch:fail on="FAILED" />
</batch:step>
tradeStep2 will archive processed CSV files.
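A possible direction (a hedged sketch, not from the original post): subclass FlatFileItemReader so that setting the resource also selects the LineMapper from the file name. The class name, setter names, and the file-naming convention below are all assumptions.

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.core.io.Resource;

// Hypothetical reader that swaps its LineMapper based on the incoming file name.
public class MultiFormatCsvReader extends FlatFileItemReader<FieldSet> {

    private LineMapper<FieldSet> tradeLineMapper;      // wire lineMapperForTrades here
    private LineMapper<FieldSet> portfolioLineMapper;  // wire lineMapperForPortfolio here

    @Override
    public void setResource(Resource resource) {
        super.setResource(resource);
        String name = resource.getFilename();
        // Assumed naming convention: TRADES*.csv vs. PORTFOLIO*.csv.
        if (name != null && name.toUpperCase().startsWith("TRADES")) {
            setLineMapper(tradeLineMapper);
        } else {
            setLineMapper(portfolioLineMapper);
        }
    }

    public void setTradeLineMapper(LineMapper<FieldSet> tradeLineMapper) {
        this.tradeLineMapper = tradeLineMapper;
    }

    public void setPortfolioLineMapper(LineMapper<FieldSet> portfolioLineMapper) {
        this.portfolioLineMapper = portfolioLineMapper;
    }
}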

Related

How to write retryable-exception-classes using Spring Batch Annotation?

We're currently migrating a Spring Batch XML-based application to the latest Spring Boot 2.2.6.RELEASE.
I have the below XML snippet which I want to convert into an annotation-based job. I am really struggling to find these options, even after going through https://docs.spring.io/spring-batch/docs/current/reference/html/step.html#taskletStep.
<batch:job id="myJob">
<batch:step id="step1">
<batch:tasklet>
<batch:chunk reader="reader" writer="writer" commit-interval="100" retry-limit="3" skip-limit="3">
<batch:retryable-exception-classes>
<batch:exclude class="org.springframework.dao.PessimisticLockingFailureException"/>
</batch:retryable-exception-classes>
<batch:skippable-exception-classes>
<batch:include class="org.springframework.dao.DeadlockLoserDataAccessException"/>
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
Another snippet:
<bean id="retryPolicy" class="org.springframework.retry.policy.ExceptionClassifierRetryPolicy">
<property name="policyMap">
<map>
<entry key="org.springframework.dao.ConcurrencyFailureException">
<bean class="org.springframework.batch.retry.policy.SimplreRetryPolicy">
<property name="maxAttempts" value="3" />
</bean>
</entry>
<entry key="org.springframework.dao.DeadlockLoserDataAccessException">
<bean class="org.springframework.batch.retry.policy.SimplreRetryPolicy">
<property name="maxAttempts" value="5" />
</bean>
</entry>
</map>
</property>
</bean>
There is a toggle at the top of each page of the reference documentation that lets you choose which configuration style (XML or Java) you want to see in the code examples.
Now for the retryable exceptions section, here is the code with XML config:
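(The original answer's snippet is reconstructed below along the lines of the reference documentation's retry example; itemReader and itemWriter are placeholder bean names.)

<step id="step1">
    <tasklet>
        <chunk reader="itemReader" writer="itemWriter"
               commit-interval="2" retry-limit="3">
            <retryable-exception-classes>
                <include class="org.springframework.dao.DeadlockLoserDataAccessException"/>
            </retryable-exception-classes>
        </chunk>
    </tasklet>
</step>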
And the equivalent in Java config:
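(Again a reconstruction; the builder is assumed to come from an injected StepBuilderFactory, and itemReader()/itemWriter() are placeholder bean methods.)

@Bean
public Step step1() {
    return this.stepBuilderFactory.get("step1")
            .<String, String>chunk(2)
            .reader(itemReader())
            .writer(itemWriter())
            .faultTolerant()
            .retryLimit(3)
            .retry(DeadlockLoserDataAccessException.class)
            .build();
}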
If you want to provide a custom retry policy, you can use FaultTolerantStepBuilder#retryPolicy.
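For the ExceptionClassifierRetryPolicy snippet above, a rough Java equivalent might look like this (a sketch; the item types and injected bean names are assumptions):

import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.dao.ConcurrencyFailureException;
import org.springframework.dao.DeadlockLoserDataAccessException;
import org.springframework.retry.RetryPolicy;
import org.springframework.retry.policy.ExceptionClassifierRetryPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;

@Bean
public Step step1(StepBuilderFactory stepBuilderFactory,
                  ItemReader<String> reader, ItemWriter<String> writer) {
    // Map exception types to their own retry policies, mirroring the XML policyMap.
    Map<Class<? extends Throwable>, RetryPolicy> policyMap = new HashMap<>();
    policyMap.put(ConcurrencyFailureException.class, new SimpleRetryPolicy(3));
    policyMap.put(DeadlockLoserDataAccessException.class, new SimpleRetryPolicy(5));

    ExceptionClassifierRetryPolicy retryPolicy = new ExceptionClassifierRetryPolicy();
    retryPolicy.setPolicyMap(policyMap);

    return stepBuilderFactory.get("step1")
            .<String, String>chunk(100)
            .reader(reader)
            .writer(writer)
            .faultTolerant()
            .retryPolicy(retryPolicy)
            .build();
}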

Transfer files on different servers (sftp) using Spring Integration

I have a requirement to build an application in Spring Integration which reads an input directory that may contain thousands of files and copies them to remote servers, say 10 servers, where a processor instance will pick them up for processing. The movement of files should be round-robin so that no server carries an extra burden while processing them. To elaborate a little more: say we have 10 files in the input directory, then the application should copy
file 1 to server 1,
file 2 to server 2,
...
file 10 to server 10.
The sequence doesn't matter; what matters is that every server gets an equal load. I am fairly new to Spring Integration, but I found a sample that does SFTP of a file using SI:
https://github.com/spring-projects/spring-integration-samples/tree/master/basic/sftp
but I am not sure how I can configure it for multiple servers and have an algorithm that moves files in round-robin fashion.
I will appreciate any tips or suggestions.
I am able to do SFTP using the config below.
<context:property-placeholder location="classpath:app.properties" />

<int-file:inbound-channel-adapter id="ReaderChannel"
    directory="file:${input.file.dir}" filename-pattern="*.*"
    prevent-duplicates="true" ignore-hidden="true" auto-startup="true">
    <int:poller id="poller" fixed-rate="1" task-executor="myTaskExecutor" />
</int-file:inbound-channel-adapter>

<int-task:executor id="myTaskExecutor" pool-size="${file.concurrentFilesNum}"
    queue-capacity="0" rejection-policy="CALLER_RUNS" />

<int-sftp:outbound-channel-adapter id="sftpOutboundAdapter"
    session-factory="sftpSessionFactory" channel="ReaderChannel"
    charset="UTF-8" remote-directory="${output.file.dir}" auto-startup="true">
    <int-sftp:request-handler-advice-chain>
        <int:retry-advice />
    </int-sftp:request-handler-advice-chain>
</int-sftp:outbound-channel-adapter>

<beans:bean id="sftpSessionFactory"
    class="org.springframework.integration.file.remote.session.CachingSessionFactory">
    <beans:constructor-arg ref="defaultSftpSessionFactory" />
</beans:bean>

<beans:bean id="defaultSftpSessionFactory"
    class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
    <beans:property name="host" value="${sftp.host}" />
    <beans:property name="privateKey" value="${sftp.private.keyfile}" />
    <beans:property name="privateKeyPassphrase" value="${sftp.private.passphrase}" />
    <beans:property name="port" value="${sftp.serverPort}" />
    <beans:property name="user" value="${sftp.username}" />
    <beans:property name="allowUnknownKeys" value="true" />
</beans:bean>
The round-robin behavior is hidden in the DirectChannel, whose UnicastingDispatcher uses a RoundRobinLoadBalancingStrategy by default.
So, when you have several subscribers on the same DirectChannel, messages are dispatched to them in round-robin fashion.
What you need for your use case is just to configure ten <int-sftp:outbound-channel-adapter>s, one for each remote server, and use the same simple <channel> definition for their channel attribute, as in the sketch below.
The <int-file:inbound-channel-adapter> should always send its messages to that shared channel, and the default round-robin strategy does the rest.
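A minimal sketch of that wiring (only two servers shown for brevity; sftpSessionFactory1 and sftpSessionFactory2 are assumed per-server session factory beans):

<!-- shared DirectChannel; its dispatcher load-balances subscribers round-robin -->
<int:channel id="toSftp" />

<int-sftp:outbound-channel-adapter id="sftpOut1"
    session-factory="sftpSessionFactory1" channel="toSftp"
    remote-directory="${output.file.dir}" />

<int-sftp:outbound-channel-adapter id="sftpOut2"
    session-factory="sftpSessionFactory2" channel="toSftp"
    remote-directory="${output.file.dir}" />

<!-- ...one adapter per remote server, all subscribed to toSftp; set
     channel="toSftp" on the <int-file:inbound-channel-adapter> as well -->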

spring batch flatfileitemreader input resource must exist error

I am new to Spring and my Java is rusty, so please excuse me if this is a simple question.
I have a two-step job. The first step queries a MongoDB and writes a file. The second step reads the file and processes it.
I am using FlatFileItemReader for the second step. In the first step, I write the file without specifying any path.
When I run the job from within STS, the first step creates the file in the project directory in my STS workspace folder.
The second step throws the "input resource must exist" exception. I tried a variety of options such as file://, file:/, and so on, without any success.
I am confused as to why the read doesn't look for the file in the same place that the write wrote to.
In any case, any help in resolving the issue will be appreciated.
The first step does not use a FlatFileItemWriter; it is just a tasklet that writes the file using plain Java.
Here's the config:
<batch:step id="step2">
<batch:tasklet>
<batch:chunk reader="customerItemReader" processor="customerProcessor" writer="mongodbItemWriter"
commit-interval="1">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="loadFromMongo" class="org.springframework.batch.core.step.tasklet.MethodInvokingTaskletAdapter">
<property name="targetObject">
<bean class="testMongo.MongoLoader"/>
</property>
<property name="targetMethod" value="loadFromMongo" />
</bean>
<bean id="customerItemReader"
class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="classpath:Customers.txt"/>
<property name="lineMapper" ref="customerLineMapper"/>
<bean id="customerLineMapper"
class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer" ref="customerLineTokenizer"/>
<property name="fieldSetMapper" ref="customerFieldSetMapper"/>
<bean id="customerLineTokenizer" class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
</bean>
<bean id="customerFieldSetMapper" class="testMongo.CustomerFieldSetMapper">
</bean>
<bean id="customerProcessor" class="testMongo.CustomerItemProcessor">
</bean>
<bean id="mongodbItemWriter" class="org.springframework.batch.item.data.MongoItemWriter">
<property name="template" ref="mongoTemplate" />
<property name="collection" value="creditReport" />
</bean>
<!-- commenting out copied stuff from mkyong SpringBatch Example for now
<bean id="xmlItemReader" class="org.springframework.batch.item.xml.StaxEventItemReader">
    <property name="fragmentRootElementName" value="record" />
    <property name="resource" value="classpath:xml/report.xml" />
    <property name="unmarshaller" ref="reportUnmarshaller" />
</bean>
<bean id="reportUnmarshaller" class="org.springframework.oxm.xstream.XStreamMarshaller">
    <property name="aliases">
        <util:map id="aliases">
            <entry key="record" value="com.mkyong.model.Report" />
        </util:map>
    </property>
</bean>
-->
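A likely cause (a hedged sketch, not from the original thread): classpath:Customers.txt is resolved against the classpath, while the tasklet writes to the JVM working directory, so the two steps look in different places. Pointing the reader at the file system instead could look roughly like this (the relative path assumes the working directory is where the tasklet wrote the file):

<bean id="customerItemReader"
    class="org.springframework.batch.item.file.FlatFileItemReader">
    <!-- file: resolves relative to the working directory, unlike classpath: -->
    <property name="resource" value="file:Customers.txt"/>
    <property name="lineMapper" ref="customerLineMapper"/>
</bean>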

how to read data from multiple tables in db using spring batch

I tried reading data from one table and writing to another table using Spring Batch, but now my requirement is to read data from multiple tables and write to a file. We could achieve this by defining multiple jobs, but I want to do it in a single job, meaning a single reader, a single writer, and a single processor.
Please provide some references for this scenario.
This is not possible with the classes provided by Spring Batch out of the box, but you can build your way around it.
Just before the chunk-processing step, add one step with a custom tasklet that assigns a different SQL statement and a different output file on each pass, and let the steps run in a loop as long as there are SQL statements left to execute.
It might sound difficult, but I have worked on the same situation. Here is some idea of how you can do it:
<flow id="databaseReadWriteJob">
<step id="step1_setReaderWriter">
<tasklet ref="setReaderWriter" />
<next on="FAILED" to="" />
<next on="*" to="dataExtractionStep" />
</step>
<step id="dataExtractionStep">
<tasklet>
<chunk reader="dbReader" writer="flatFileWriter" commit-interval="${commit-interval}" />
</tasklet>
<next on="FAILED" to="" />
<next on="*" to="step3_removeProcessedSql" />
</step>
<step id="step3_removeProcessedSql">
<tasklet ref="removeProcessedSql" />
<next on="NOOP" to="step1_setReaderWriter" />
<next on="*" to="step4_validateNumOfSuccessfulSteps" />
</step>
</flow>
and here is the bean for setReaderWriter
<beans:bean id="setReaderWriter" class="SetReaderWriter">
<beans:property name="reader" ref="dbReader" />
<beans:property name="flatFileWriter" ref="flatFileWriter" />
<beans:property name="fileSqlMap" ref="jobSqlFileMap" />
<beans:property name="fileNameBuilder" ref="localFileNameBuilder" />
<beans:property name="sourceFolder" value="${dataDir}" />
<beans:property name="dateDiff" value="${dateDiff}" />
Add to it anything you need to set dynamically on the reader or writer. The fileSqlMap above is a map with the SQL as the key and the output file as its value.
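A minimal sketch of what such a tasklet could look like (the names and item types are assumptions based on the bean definition above):

import java.util.Map;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.core.io.FileSystemResource;

public class SetReaderWriter implements Tasklet {

    private JdbcCursorItemReader<Object> reader;    // assumed reader type
    private FlatFileItemWriter<Object> flatFileWriter;
    private Map<String, String> fileSqlMap;         // SQL -> output file name
    private String sourceFolder;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Take the next unprocessed SQL/file pair and point the shared
        // reader and writer at it before the extraction step runs.
        Map.Entry<String, String> next = fileSqlMap.entrySet().iterator().next();
        reader.setSql(next.getKey());
        flatFileWriter.setResource(new FileSystemResource(sourceFolder + "/" + next.getValue()));
        return RepeatStatus.FINISHED;
    }

    // Setters matching the bean properties above.
    public void setReader(JdbcCursorItemReader<Object> reader) { this.reader = reader; }
    public void setFlatFileWriter(FlatFileItemWriter<Object> writer) { this.flatFileWriter = writer; }
    public void setFileSqlMap(Map<String, String> fileSqlMap) { this.fileSqlMap = fileSqlMap; }
    public void setSourceFolder(String sourceFolder) { this.sourceFolder = sourceFolder; }
}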
I hope this helps.

Spring Security Multiple Authentication Filters

In my project I have declared two authentication filters, as mentioned below, in the spring-security-context.xml file:
<beans:bean id="springSecurityFilterChain" class="org.springframework.security.web.FilterChainProxy">
<sec:filter-chain-map path-type="ant">
<sec:filter-chain pattern="/**" filters="authenticationProcessingFilterWithDB, authenticationProcessingFilterWithXML" />
</sec:filter-chain-map>
</beans:bean>
I have declared an authentication manager for each filter in the following way:
<beans:bean id="authenticationProcessingFilterWithDB" class="org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter">
<beans:property name="authenticationManager" ref="authenticationManagerWithDB" />
<beans:property name="filterProcessesUrl" value="/j_spring_security_check_with_db" />
</beans:bean>
<beans:bean id="authenticationProcessingFilterWithXML" class="org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter">
<beans:property name="authenticationManager" ref="authenticationManagerWithXML" />
<beans:property name="filterProcessesUrl" value="/j_spring_security_check_with_xml" />
</beans:bean>
But when I try to access /j_spring_security_check_with_xml from a JSP, it gives a "resource not found" error. The same works if I use the default /j_spring_security_check with a single authentication manager. Please suggest how to add multiple authentication managers in Spring Security, as I need to verify login details both against an XML file (where the username and password are kept) and against a database.