Schedule a task with cron that allows dynamic updates - Spring

I use Spring Boot 1.3 and Spring 4.2.
In this class:
@Service
public class PaymentServiceImpl implements PaymentService {
    ....
    @Transactional
    @Override
    public void processPayment() {
        List<Payment> payments = paymentRepository.findDuePayment();
        processCreditCardPayment(payments);
    }
}
I would like to call processPayment every X amount of time. This interval is stored in a database, and the user can modify it, so I don't think I can use the @Scheduled annotation.
I started with this:
@EntityScan(basePackageClasses = {MyApp.class, Jsr310JpaConverters.class})
@SpringBootApplication
@EnableCaching
@EnableScheduling
public class MyApp {

    @Autowired
    private DefaultConfigService defaultConfigService;

    public static void main(String[] args) {
        SpringApplication.run(MyApp.class, args);
    }

    @Bean
    public TaskScheduler poolScheduler() {
        SimpleAsyncTaskExecutor taskScheduler = new SimpleAsyncTaskExecutor();
        DefaultConfigDto defaultConfigDto = defaultConfigService.getByFieldName("payment-cron-task");
        String cronTabExpression = "0 0 4 * * ?";
        if (defaultConfigDto != null && !defaultConfigDto.getFieldValue().isEmpty()) {
            cronTabExpression = defaultConfigDto.getFieldValue();
        }
        appContext.getBean("scheduler");
        taskScheduler.schedule(task, new CronTrigger(cronTabExpression));
        return scheduler;
    }
}
Maybe this isn't the right way. Any suggestions?
I don't know whether, to get my context, I need to create a field like
@Autowired
ConfigurableApplicationContext context;
and then, in main:
public static void main(String[] args) {
context = SpringApplication.run(MyApp.class, args);
}

Looking at the question, it seems you want to update the schedule without restarting the application.
The code you have shared only ensures the config is picked up from the DB once; it will not refresh without an application restart.
The following code uses the default scheduler available in the Spring context and dynamically computes the next execution time based on the cron setting currently in the DB.
Here is the sample code:
import java.util.Date;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.Trigger;
import org.springframework.scheduling.TriggerContext;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.SchedulingConfigurer;
import org.springframework.scheduling.config.ScheduledTaskRegistrar;
import org.springframework.scheduling.support.CronTrigger;
@SpringBootApplication
@EnableScheduling
public class Perses implements SchedulingConfigurer {

    private static final Logger log = LoggerFactory.getLogger(Perses.class);

    @Autowired
    private DefaultConfigService defaultConfigService;

    @Autowired
    private PaymentService paymentService;

    public static void main(String[] args) {
        SpringApplication.run(Perses.class, args);
    }

    private String cronConfig() {
        String cronTabExpression = "*/5 * * * * *";
        // fetch the current value from the DB on every call
        DefaultConfigDto defaultConfigDto = defaultConfigService.getByFieldName("payment-cron-task");
        if (defaultConfigDto != null && !defaultConfigDto.getFieldValue().isEmpty()) {
            cronTabExpression = defaultConfigDto.getFieldValue();
        }
        return cronTabExpression;
    }
    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.addTriggerTask(new Runnable() {
            @Override
            public void run() {
                paymentService.processPayment();
            }
        }, new Trigger() {
            @Override
            public Date nextExecutionTime(TriggerContext triggerContext) {
                String cron = cronConfig();
                log.info(cron);
                CronTrigger trigger = new CronTrigger(cron);
                Date nextExec = trigger.nextExecutionTime(triggerContext);
                return nextExec;
            }
        });
    }
}
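With this in place, the trigger's nextExecutionTime is called again after each run, so the cron expression is re-read from the database every time the next execution is computed; a changed value takes effect without a restart (the run that was already scheduled still fires at the old time).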

In case someone still has this issue: a simple way to pick up the value from the database whenever you want, without many changes, is to run the cron every minute and take the current minute modulo a configured delta read from the database. If the modulo equals 0, the current minute is a multiple of the delta, so the task has to run. For example, if you want it to run every 5 minutes, the delta should be 5.
A sample:
private volatile boolean isRunning = false; // prevents overlapping executions

@Scheduled(cron = "0 */1 * * * *") // fire every minute
public void perform() {
    Integer delta = 5; // get this value from the database
    Integer minutes = getField(Calendar.MINUTE); // Calendar for Java 7
    Boolean toRun = true; // you can also get this one from the database to make the task active or disabled
    toRun = toRun && (minutes % delta == 0);
    if (toRun && !isRunning) {
        isRunning = true;
        try {
            // do your logic here
        } catch (Exception e) { }
        isRunning = false;
    }
}

public Integer getField(int field) {
    Calendar now = Calendar.getInstance();
    if (field == Calendar.MONTH) {
        return now.get(field) + 1; // Note: zero based!
    } else {
        return now.get(field);
    }
}
Hope this helps :D

Related

Spring Boot: How to run multiple methods with @Scheduled

I have a Spring Boot app in which I want multiple methods to run at different times of the day. The first one runs, but no subsequent method runs. What do I need to do to fix this? Here is my code:
@EnableScheduling
@Configuration
//@ConditionalOnProperty(name = "spring.enable.scheduling")
@SpringBootApplication
@PropertySources({
    @PropertySource(value = "prop.properties", ignoreResourceNotFound = true)
})
public class Application {

    private static final Logger LOGGER = LoggerFactory.getLogger(Application.class);
    public static MyClass myClass = new MyClass();

    public static void main(String[] args) {
        ClassLoader classLoader = ClassLoader.getSystemClassLoader();
        InputStream resourceAsStream = classLoader.getResourceAsStream("log4j2.properties");
        PropertyConfigurator.configure(resourceAsStream);
        SpringApplication.run(Application.class, args);
    }

    @Scheduled(cron = "${4am.cron.expression}", zone = "America/New_York") //0 0 6 * * ?
    public void method1() {
        // something
    }

    @Scheduled(cron = "${10am.cron.expression}", zone = "America/New_York") //0 0 6 * * ?
    public void method2() {
        // something
    }

    @Scheduled(cron = "${10am.cron.expression}", zone = "America/New_York") //0 0 6 * * ?
    public void method3() {
        // something
    }

    @Scheduled(cron = "${330pm.cron.expression}", zone = "America/New_York") //0 0 6 * * ?
    public void method4() {
        // something
    }

    @Scheduled(cron = "${430pm.cron.expression}", zone = "America/New_York") //0 0 6 * * ?
    public void stopExecutor() {
        MyClass myClass = new MyClass();
        Executor executor = new Executor(myClass);
        executor.stop();
    }
}
You can try annotating each method you want to run at a given scheduled day/time with @Scheduled(cron = "your cron expression") on the method.
E.g.
@Scheduled(cron = "<your cron expression here>")
public void runJob() {
    // Code here
}
Hope this helps !
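One thing worth adding: by default Spring runs all @Scheduled methods on a single-threaded scheduler, so one long-running or blocking method can keep the later ones from ever firing, which matches the symptom described. A minimal sketch of a pool-backed scheduler bean that avoids this (the bean name, pool size, and thread-name prefix are arbitrary choices, not values from the question):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.TaskScheduler;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;

@Configuration
public class SchedulerConfig {

    // With more than one thread, a blocked @Scheduled method no longer
    // prevents the other scheduled methods from firing on time.
    @Bean
    public TaskScheduler taskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(5); // arbitrary; size to the number of concurrent jobs
        scheduler.setThreadNamePrefix("scheduled-task-");
        return scheduler;
    }
}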

NiFi "unable to find flowfile content"

I am using NiFi 1.6 and get the following errors when trying to modify a clone of an incoming flow file:
[1] "unable to find content for FlowFile: ... MissingFlowFileException
...
Caused by ContentNotFoundException: Could not find content for StandardClaim
...
Caused by java.io.EOFException: null"
[2] "FlowFileHandlingException: StandardFlowFileRecord... is not known in this session"
The first error occurs when trying to access the contents of the flow file, the second when removing the flow file from the session (within a catch block for the first). This process is known to have worked under NiFi 0.7.
The basic process is:
Clone the incoming flow file
Write to the clone
Write to the clone again (some additional formatting)
Repeat 1-3
The error occurs on the second iteration, at step 3.
An interesting point is that if a session.read of the clone is done immediately after the clone is performed, everything works fine. The read seems to reset some pointer.
I have created unit tests for this processor, but they do not fail in either case.
Below is code simplified from the actual version in use that demonstrates the issue. (The development system is not connected, so I had to copy the code by hand. Please forgive any typos - it should be close. This is also why a full stack trace is not provided.) The processor doing the work has a property that determines whether an immediate read is done or not, so both scenarios can be performed easily. To set it up, all that is needed is a GetFile processor to supply the input and terminators for the output from the SampleCloningProcessor. A sample input file is included as well. The meat of the code is in the onTrigger and manipulate methods. The manipulation in this simplified version doesn't really do anything but copy the input to the output.
Any insights into why this is happening and suggestions for corrections will be appreciated - thanks.
SampleCloningProcessor.java
package sample.processors.cloning;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.Reader;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Scanner;
import java.util.Set;
import org.apache.commons.compress.utils.IOUtils;
import org.apache.nifi.annotation.documentation.CapabilityDescription;
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.ProcessorInitializationContext;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;
import org.apache.nifi.processor.io.InputStreamCallback;
import org.apache.nifi.processor.io.OutputStreamCallback;
import org.apache.nifi.processor.io.StreamCallback;
import org.apache.nifi.processor.util.StandardValidators;
import com.google.gson.Gson;
@Tags({"example", "clone"})
@CapabilityDescription("Demonstrates cloning of flowfile failure.")
public class SampleCloningProcessor extends AbstractProcessor {

    /* Determines if an immediate read is performed after cloning of the incoming flowfile. */
    public static final PropertyDescriptor IMMEDIATE_READ = new PropertyDescriptor.Builder()
        .name("immediateRead")
        .description("Determines if the processor runs successfully. If a read is done immediately "
            + "after the clone of the incoming flowFile, then the processor should run successfully.")
        .required(true)
        .allowableValues("true", "false")
        .defaultValue("true")
        .addValidator(StandardValidators.BOOLEAN_VALIDATOR)
        .build();

    public static final Relationship SUCCESS = new Relationship.Builder().name("success")
        .description("No unexpected errors.").build();

    public static final Relationship FAILURE = new Relationship.Builder().name("failure")
        .description("Errors were thrown.").build();

    private Set<Relationship> relationships;
    private List<PropertyDescriptor> properties;

    @Override
    public void init(final ProcessorInitializationContext context) {
        relationships = new HashSet<>(Arrays.asList(SUCCESS, FAILURE));
        properties = Arrays.asList(IMMEDIATE_READ);
    }
    @Override
    public Set<Relationship> getRelationships() {
        return this.relationships;
    }

    @Override
    public List<PropertyDescriptor> getSupportedPropertyDescriptors() {
        return this.properties;
    }

    @Override
    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
        FlowFile incomingFlowFile = session.get();
        if (incomingFlowFile == null) {
            return;
        }
        try {
            final InfileReader inFileReader = new InfileReader();
            session.read(incomingFlowFile, inFileReader);
            Product product = inFileReader.getProduct();
            boolean transfer = false;
            getLogger().info("\tSession :\n" + session);
            getLogger().info("\toriginal :\n" + incomingFlowFile);
            for (int i = 0; i < 2; i++) {
                transfer = manipulate(context, session, incomingFlowFile, product);
            }
        } catch (Exception e) {
            getLogger().error(e.getMessage(), e);
            session.rollback(true);
        }
    }
    private boolean manipulate(final ProcessContext context, final ProcessSession session,
            final FlowFile incomingFlowFile, final Product product) {
        boolean transfer = false;
        FlowFile outgoingFlowFile = null;
        boolean immediateRead = context.getProperty(IMMEDIATE_READ).asBoolean();
        try {
            // Clone incoming flowFile
            outgoingFlowFile = session.clone(incomingFlowFile);
            getLogger().info("\tclone outgoing :\n" + outgoingFlowFile);
            if (immediateRead) {
                readFlowFile(session, outgoingFlowFile);
            }
            // First write into clone
            StageOneWriter stage1Writer = new StageOneWriter(product);
            outgoingFlowFile = session.write(outgoingFlowFile, stage1Writer);
            getLogger().info("\twrite outgoing :\n" + outgoingFlowFile);
            // Format the cloned file with another write
            outgoingFlowFile = formatFlowFile(session, outgoingFlowFile);
            getLogger().info("\tformat outgoing :\n" + outgoingFlowFile);
            session.transfer(outgoingFlowFile, SUCCESS);
            transfer = true;
        } catch (Exception e) {
            getLogger().error(e.getMessage(), e);
            if (outgoingFlowFile != null) {
                session.remove(outgoingFlowFile);
            }
        }
        return transfer;
    }
    private void readFlowFile(final ProcessSession session, final FlowFile flowFile) {
        session.read(flowFile, new InputStreamCallback() {
            @Override
            public void process(final InputStream in) throws IOException {
                try (Scanner scanner = new Scanner(in)) {
                    scanner.useDelimiter("\\A").next();
                }
            }
        });
    }

    private FlowFile formatFlowFile(final ProcessSession session, FlowFile flowFile) {
        OutputFormatWriter formatWriter = new OutputFormatWriter();
        flowFile = session.write(flowFile, formatWriter);
        return flowFile;
    }
    private static class OutputFormatWriter implements StreamCallback {
        @Override
        public void process(final InputStream in, final OutputStream out) throws IOException {
            try {
                IOUtils.copy(in, out);
                out.flush();
            } finally {
                IOUtils.closeQuietly(in);
                IOUtils.closeQuietly(out);
            }
        }
    }

    private static class StageOneWriter implements OutputStreamCallback {
        private Product product = null;

        public StageOneWriter(Product product) {
            this.product = product;
        }

        @Override
        public void process(final OutputStream out) throws IOException {
            final Gson gson = new Gson();
            final String json = gson.toJson(product);
            out.write(json.getBytes());
        }
    }

    private static class InfileReader implements InputStreamCallback {
        private Product product = null;

        @Override
        public void process(final InputStream in) throws IOException {
            product = null;
            final Gson gson = new Gson();
            Reader inReader = new InputStreamReader(in, "UTF-8");
            product = gson.fromJson(inReader, Product.class);
        }

        public Product getProduct() {
            return product;
        }
    }
}
SampleCloningProcessorTest.java
package sample.processors.cloning;
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.Before;
import org.junit.Test;
public class SampleCloningProcessorTest {
    static final String flowFileContent = "{"
        + "\"cost\": \"cost 1\","
        + "\"description\": \"description 1\","
        + "\"markup\": 1.2,"
        + "\"name\": \"name 1\","
        + "\"supplier\": \"supplier 1\""
        + "}";
private TestRunner testRunner;
    @Before
    public void init() {
        testRunner = TestRunners.newTestRunner(SampleCloningProcessor.class);
        testRunner.enqueue(flowFileContent);
    }

    @Test
    public void testProcessorImmediateRead() {
        testRunner.setProperty(SampleCloningProcessor.IMMEDIATE_READ, "true");
        testRunner.run();
        testRunner.assertTransferCount("success", 2);
    }

    @Test
    public void testProcessorImmediateRead_false() {
        testRunner.setProperty(SampleCloningProcessor.IMMEDIATE_READ, "false");
        testRunner.run();
        testRunner.assertTransferCount("success", 2);
}
}
Product.java
package sample.processors.cloning;
public class Product {
private String name;
private String description;
private String supplier;
private String cost;
private float markup;
public String getName() {
return name;
}
public void setName(final String name) {
this.name = name;
}
public String getDescription() {
return description;
}
public void setDescription(final String description) {
this.description = description;
}
public String getSupplier() {
return supplier;
}
public void setSupplier(final String supplier) {
this.supplier = supplier;
}
public String getCost() {
return cost;
}
public void setCost(final String cost) {
this.cost = cost;
}
public float getMarkup() {
return markup;
}
public void setMarkup(final float markup) {
this.markup = markup;
}
}
product.json - a sample input file:
{
"const" : "cost 1",
"description" : "description 1",
"markup" : 1.2,
"name" : "name 1",
"supplier" : "supplier 1"
}
Reported as a bug in NiFi. Being addressed by https://issues.apache.org/jira/browse/NIFI-5879

Logback filter - limiting duplicate error log messages

Still a newbie with Spring Boot, Gradle, and Logback, I need help! I am trying to create my own Logback filter.
The main goal is to have my logger send only a single log message when the application emits several logs with the same error message.
To test this, I created a basic Gradle project with two classes.
I - My main class which logs some errors
package com.example.CDOP221logback;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;
@SpringBootApplication
public class Cdop221LogbackApplication {
private final static Logger log = LoggerFactory.getLogger("com.example.CDOP221log4j");
public static void main(String[] args) {
SpringApplication.run(Cdop221LogbackApplication.class, args);
LoggerContext context = (LoggerContext)LoggerFactory.getILoggerFactory();
context.reset();
JoranConfigurator config = new JoranConfigurator();
config.setContext(context);
try {
config.doConfigure("/home/mehdi/eclipse-workspace/CDOP-221-logback/logback.xml");
} catch (JoranException e) {
e.printStackTrace();
}
test();
}
private static void test() {
log.debug("Application Cdop221 with LOGBACK logger launched successfully");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
log.error("ERROR_1");
int i = 0;
while(i < 100) {
log.error("ERROR_2");
i++;
}
}
}
II - My own filter, which has to limit the number of logs when some entries are the same
package com.logback;
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.classic.spi.IThrowableProxy;
import ch.qos.logback.classic.spi.LoggingEvent;
import ch.qos.logback.classic.spi.StackTraceElementProxy;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;
import org.apache.commons.lang3.ArrayUtils;
import org.slf4j.LoggerFactory;
import java.util.HashMap;
import java.util.Map;
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;
/**
 * Improved {@link ch.qos.logback.classic.turbo.DuplicateMessageFilter} with a timeout feature added and time-window error stacking #buzzwords
 * Indeed, if some error logs are the same (same hashcode), they are stacked and sent after {@link DuplicateErrorLogFilter#cacheTimeoutInSec}
 */
public class DuplicateErrorLogFilter extends Filter<ILoggingEvent> {
/**
* Repetition number MDC property
*/
private static final String REP_NB = "repNb";
/**
* The default cache size.
*/
private static final int DEFAULT_CACHE_SIZE = 100;
/**
* The default cache timeout in seconds
*/
private static final int DEFAULT_CACHE_TIMEOUT_IN_SEC = 300;
private String smtpAppenderName;
private int cacheSize = DEFAULT_CACHE_SIZE;
private int cacheTimeoutInSec = DEFAULT_CACHE_TIMEOUT_IN_SEC;
private Map<Integer, FoldingTask> tasks = new ConcurrentHashMap<>(cacheSize);
/**
* Timer that will expire folding tasks
*/
private Timer foldingTimer = new Timer("folding-timer", false);
private final class FoldingTask extends TimerTask {
private Integer key;
private ILoggingEvent lastEvent;
private int foldingCount;
@Override
public void run() {
// Remove current task
tasks.remove(key);
// And send the event to SMTP appender
sendEvent(lastEvent, foldingCount);
}
}
/**
 * Append an event that has been folded
 *
 * @param event the last seen event of this kind
 * @param foldingCount how many events were folded
 */
protected void sendEvent(ILoggingEvent event, int foldingCount) {
if (event != null) {
if (foldingCount > 1) {
// Do that to prevent UnsupportedOp from EmptyMap
if (event.getMDCPropertyMap().isEmpty() && event instanceof LoggingEvent) {
((LoggingEvent) event).setMDCPropertyMap(new HashMap<>());
}
event.getMDCPropertyMap().put(REP_NB, "[" + foldingCount + "x]");
}
((Logger) (LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME))).getAppender(smtpAppenderName).doAppend(event);
}
}
public void setSmtpAppenderName(String smtpAppenderName) {
this.smtpAppenderName = smtpAppenderName;
}
public void setCacheSize(int cacheSize) {
this.cacheSize = cacheSize;
}
public void setCacheTimeoutInSec(int cacheTimeoutInSec) {
this.cacheTimeoutInSec = cacheTimeoutInSec;
}
@Override
public void start() {
super.start();
}
@Override
public void stop() {
tasks.clear();
tasks = null;
super.stop();
}
@Override
public FilterReply decide(ILoggingEvent event) {
if (!event.getLevel().isGreaterOrEqual(Level.ERROR)) {
return FilterReply.NEUTRAL;
}
Integer key = eventHashCode(event);
FoldingTask task = tasks.get(key);
if (task == null) {
// First time we encounter this event
task = new FoldingTask();
task.key = key;
// lastEvent will be set at the first folded event
tasks.put(key, task);
// Arm timer for this task
foldingTimer.schedule(task, TimeUnit.SECONDS.toMillis(cacheTimeoutInSec));
// And log this event
return FilterReply.NEUTRAL;
} else {
// Fold this event
task.lastEvent = event;
task.foldingCount++;
return FilterReply.DENY;
}
}
/**
* Compute a signature for an event
*/
private int eventHashCode(ILoggingEvent event) {
IThrowableProxy thrInfo = event.getThrowableProxy();
if (thrInfo == null || ArrayUtils.isEmpty(thrInfo.getStackTraceElementProxyArray())) {
// No stacktrace
String message = event.getFormattedMessage();
return message.hashCode();
}
StackTraceElementProxy[] stack = thrInfo.getStackTraceElementProxyArray();
int hashCode = 0;
for (StackTraceElementProxy str : stack) {
hashCode = 31 * hashCode + str.hashCode();
}
return hashCode;
}
}
So when I run my code, it doesn't actually work... but I am not able to tell whether it's because of a bad configuration (I am a beginner with the Logback library) or because my code is wrong...
Thank you in advance for your help.
A part is missing from the Logback config file (logback.xml): the element that connects your filter (DuplicateErrorLogFilter) to Logback:
<filter class="com.logback.DuplicateErrorLogFilter"/>
For additional info on how to use filters, see https://logback.qos.ch/manual/filters.html
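For illustration, a minimal logback.xml that attaches the filter to an appender; the appender name, pattern, and property values here are placeholders rather than values from the question (Logback's Joran configurator injects nested elements such as smtpAppenderName through the filter's setters):

<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- the custom filter from above; nested tags map to its setters -->
    <filter class="com.logback.DuplicateErrorLogFilter">
      <smtpAppenderName>SMTP</smtpAppenderName>
      <cacheTimeoutInSec>300</cacheTimeoutInSec>
    </filter>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>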

Spring Batch to read multiple files with same extension

I have a custom reader to read data from a CSV file.
package org.kp.oppr.remediation.batch.csv;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.lang.StringUtils;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.remediation.batch.csv.FlatFileItemReaderNewLine;
import org.remediation.batch.model.RawItem;
import org.remediation.batch.model.RawItemLineMapper;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.file.LineCallbackHandler;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.batch.item.file.transform.LineTokenizer;
import org.springframework.core.io.Resource;
import org.springframework.util.Assert;
import org.springframework.validation.BindException;
public class RawItemCsvReader extends MultiResourceItemReader<RawItem>
implements StepExecutionListener, LineCallbackHandler,
FieldSetMapper<RawItem> {
static final Logger LOGGER = LogManager.getLogger(RawItemCsvReader.class);
final private String COLUMN_NAMES_KEY = "COLUMNS_NAMES_KEY";
private StepExecution stepExecution;
private DefaultLineMapper<RawItem> lineMapper;
private String[] columnNames;
private Resource[] resources;
// = DelimitedLineTokenizer.DELIMITER_COMMA;
private char quoteCharacter = DelimitedLineTokenizer.DEFAULT_QUOTE_CHARACTER;
private String delimiter;
public RawItemCsvReader() {
setLinesToSkip(0);
setSkippedLinesCallback(this);
}
    @Override
    public void afterPropertiesSet() {
        // not in constructor to ensure we invoke the override
        final DefaultLineMapper<RawItem> lineMapper = new RawItemLineMapper();
        setLineMapper(lineMapper);
    }

    /**
     * Satisfies the {@link LineCallbackHandler} contract and acts as the
     * {@code skippedLinesCallback}.
     *
     * @param line
     */
    @Override
    public void handleLine(String line) {
        getLineMapper().setLineTokenizer(getTokenizer());
        getLineMapper().setFieldSetMapper(this);
    }
private LineTokenizer getTokenizer() {
// this.columnNames = line.split(delimiter);
DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
lineTokenizer.setQuoteCharacter(quoteCharacter);
lineTokenizer.setDelimiter(delimiter);
lineTokenizer.setStrict(true);
lineTokenizer.setNames(columnNames);
addColumnNames();
return lineTokenizer;
}
private void addColumnNames() {
stepExecution.getExecutionContext().put(COLUMN_NAMES_KEY, columnNames);
}
    @Override
    public void setResources(Resource[] resources) {
        this.resources = resources;
        super.setResources(resources);
    }
    /**
     * Provides access to an otherwise hidden field in the parent class. We need this
     * because we have to reconfigure the {@link LineMapper} based on file
     * contents.
     *
     * @param lineMapper
     */
    @Override
    public void setLineMapper(LineMapper<RawItem> lineMapper) {
        if (!(lineMapper instanceof DefaultLineMapper)) {
            throw new IllegalArgumentException(
                "Must specify a DefaultLineMapper");
        }
        this.lineMapper = (DefaultLineMapper) lineMapper;
        super.setLineMapper(lineMapper);
    }
private DefaultLineMapper getLineMapper() {
return this.lineMapper;
}
    /**
     * Satisfies the {@link FieldSetMapper} contract.
     *
     * @param fs
     * @return
     * @throws BindException
     */
    @Override
    public RawItem mapFieldSet(FieldSet fs) throws BindException {
if (fs == null) {
return null;
}
Map<String, String> record = new LinkedHashMap<String, String>();
for (String columnName : this.columnNames) {
record.put(columnName,
StringUtils.trimToNull(fs.readString(columnName)));
}
RawItem item = new RawItem();
item.setResource(resources);
item.setRecord(record);
return item;
}
    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        //LOGGER.info("Start Raw Read Step for " + itemResource.getFilename());
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
LOGGER.info("End Raw Read Step for lines read: " + stepExecution.getReadCount()
+ " lines skipped: " + stepExecution.getReadSkipCount());
/*
LOGGER.info("End Raw Read Step for " + itemResource.getFilename()
+ " lines read: " + stepExecution.getReadCount()
+ " lines skipped: " + stepExecution.getReadSkipCount());
*/
return ExitStatus.COMPLETED;
}
public void setDelimiter(String delimiter) {
this.delimiter = delimiter;
}
public void setQuoteCharacter(char quoteCharacter) {
this.quoteCharacter = quoteCharacter;
}
public String[] getColumnNames() {
return columnNames;
}
public void setColumnNames(String[] columnNames) {
this.columnNames = columnNames;
}
public String getDelimiter() {
return delimiter;
}
}
I want to use a MultiResourceItemReader along with this class to read multiple files with the same extension, and I am using the Spring MultiResourceItemReader to do the job. I need to know how to configure the private ResourceAwareItemReaderItemStream delegate; instance for this class.
package org.kp.oppr.remediation.batch.csv;
import java.util.Arrays;
import java.util.Comparator;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ParseException;
import org.springframework.batch.item.UnexpectedInputException;
import org.springframework.batch.item.file.LineCallbackHandler;
import org.springframework.batch.item.file.LineMapper;
import org.springframework.batch.item.file.ResourceAwareItemReaderItemStream;
import org.springframework.batch.item.util.ExecutionContextUserSupport;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.io.Resource;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
public class MultiResourceItemReader<T> implements ItemReader<T>, ItemStream, InitializingBean, ResourceAwareItemReaderItemStream<T> {

    static final Logger LOGGER = LogManager.getLogger(MultiResourceItemReader.class);
private final ExecutionContextUserSupport executionContextUserSupport = new ExecutionContextUserSupport();
private ResourceAwareItemReaderItemStream<? extends T> delegate;
private Resource[] resources;
private MultiResourceIndex index = new MultiResourceIndex();
private boolean saveState = true;
// signals there are no resources to read -> just return null on first read
private boolean noInput;
private LineMapper<T> lineMapper;
private int linesToSkip = 0;
private LineCallbackHandler skippedLinesCallback;
private Comparator<Resource> comparator = new Comparator<Resource>() {
/**
* Compares resource filenames.
*/
public int compare(Resource r1, Resource r2) {
return r1.getFilename().compareTo(r2.getFilename());
}
};
public MultiResourceItemReader() {
executionContextUserSupport.setName(ClassUtils.getShortName(MultiResourceItemReader.class));
}
    /**
     * @param skippedLinesCallback
     *            will be called for each of the initial skipped lines
     *            before any items are read.
     */
public void setSkippedLinesCallback(LineCallbackHandler skippedLinesCallback) {
this.skippedLinesCallback = skippedLinesCallback;
}
    /**
     * Public setter for the number of lines to skip at the start of a file. Can
     * be used if the file contains a header without useful (column name)
     * information, and without a comment delimiter at the beginning of the
     * lines.
     *
     * @param linesToSkip
     *            the number of lines to skip
     */
public void setLinesToSkip(int linesToSkip) {
this.linesToSkip = linesToSkip;
}
    /**
     * Setter for the line mapper. This property is required to be set.
     *
     * @param lineMapper
     *            maps line to item
     */
public void setLineMapper(LineMapper<T> lineMapper) {
this.lineMapper = lineMapper;
}
/**
* Reads the next item, jumping to next resource if necessary.
*/
public T read() throws Exception, UnexpectedInputException, ParseException {
if (noInput) {
return null;
}
T item;
item = readNextItem();
index.incrementItemCount();
return item;
}
    /**
     * Use the delegate to read the next item, jump to the next resource if the
     * current one is exhausted. Items are appended to the buffer.
     * @return next item from input
     */
private T readNextItem() throws Exception {
T item = delegate.read();
while (item == null) {
index.incrementResourceCount();
if (index.currentResource >= resources.length) {
return null;
}
delegate.close();
delegate.setResource(resources[index.currentResource]);
delegate.open(new ExecutionContext());
item = delegate.read();
}
return item;
}
    /**
     * Close the {@link #setDelegate(ResourceAwareItemReaderItemStream)} reader
     * and reset instance variable values.
     */
public void close() throws ItemStreamException {
index = new MultiResourceIndex();
delegate.close();
noInput = false;
}
/**
* Figure out which resource to start with in case of restart, open the
* delegate and restore delegate's position in the resource.
*/
public void open(ExecutionContext executionContext) throws ItemStreamException {
Assert.notNull(resources, "Resources must be set");
noInput = false;
if (resources.length == 0) {
LOGGER.warn("No resources to read");
noInput = true;
return;
}
Arrays.sort(resources, comparator);
for(int i =0; i < resources.length; i++)
{
LOGGER.info("Resources after Sorting" + resources[i]);
}
index.open(executionContext);
delegate.setResource(resources[index.currentResource]);
delegate.open(new ExecutionContext());
try {
for (int i = 0; i < index.currentItem; i++) {
delegate.read();
}
}
catch (Exception e) {
throw new ItemStreamException("Could not restore position on restart", e);
}
}
/**
* Store the current resource index and position in the resource.
*/
public void update(ExecutionContext executionContext) throws ItemStreamException {
if (saveState) {
index.update(executionContext);
}
}
    /**
     * @param delegate reads items from a single {@link Resource}.
     */
public void setDelegate(ResourceAwareItemReaderItemStream<? extends T> delegate) {
this.delegate = delegate;
}
    /**
     * Set the boolean indicating whether or not state should be saved in the
     * provided {@link ExecutionContext} during the {@link ItemStream} call to
     * update.
     *
     * @param saveState
     */
public void setSaveState(boolean saveState) {
this.saveState = saveState;
}
    /**
     * @param comparator used to order the injected resources; by default
     *            compares {@link Resource#getFilename()} values.
     */
public void setComparator(Comparator<Resource> comparator) {
this.comparator = comparator;
}
    /**
     * @param resources input resources
     */
public void setResources(Resource[] resources) {
this.resources = resources;
}
/**
* Facilitates keeping track of the position within multi-resource input.
*/
private class MultiResourceIndex {
private static final String RESOURCE_KEY = "resourceIndex";
private static final String ITEM_KEY = "itemIndex";
private int currentResource = 0;
private int markedResource = 0;
private int currentItem = 0;
private int markedItem = 0;
public void incrementItemCount() {
currentItem++;
}
public void incrementResourceCount() {
currentResource++;
currentItem = 0;
}
public void mark() {
markedResource = currentResource;
markedItem = currentItem;
}
public void reset() {
currentResource = markedResource;
currentItem = markedItem;
}
public void open(ExecutionContext ctx) {
if (ctx.containsKey(executionContextUserSupport.getKey(RESOURCE_KEY))) {
currentResource = ctx.getInt(executionContextUserSupport.getKey(RESOURCE_KEY));
}
if (ctx.containsKey(executionContextUserSupport.getKey(ITEM_KEY))) {
currentItem = ctx.getInt(executionContextUserSupport.getKey(ITEM_KEY));
}
}
public void update(ExecutionContext ctx) {
ctx.putInt(executionContextUserSupport.getKey(RESOURCE_KEY), index.currentResource);
ctx.putInt(executionContextUserSupport.getKey(ITEM_KEY), index.currentItem);
}
}
    @Override
    public void afterPropertiesSet() throws Exception {
        // TODO Auto-generated method stub
    }

    @Override
    public void setResource(Resource resource) {
        // TODO Auto-generated method stub
    }
}
The Spring configuration file is:
<batch:step id="readFromCSVFileAndUploadToDB" next="stepMovePdwFile">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="multiResourceReader" writer="rawItemDatabaseWriter"
commit-interval="500" skip-policy="pdwUploadSkipPolicy" />
</batch:tasklet>
</batch:step>
<bean id="multiResourceReader"
class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
<property name="resource" value="file:#{jobParameters[filePath]}/*.dat" />
<property name="delegate" ref="rawItemCsvReader"></property>
</bean>
<bean id="rawItemCsvReader" class="org.kp.oppr.remediation.batch.csv.RawItemCsvReader"
scope="step">
<property name="resources" value="file:#{jobParameters[filePath]}/*.dat" />
<property name="columnNames" value="${columnNames}" />
<property name="delimiter" value="${delimiter}" />
</bean>
Use a standard FlatFileItemReader (properly configured via XML) instead of your RawItemCsvReader as the delegate.
This will answer your question, because FlatFileItemReader extends AbstractItemStreamItemReader and implements ResourceAwareItemReaderItemStream, which is exactly the type the delegate property requires.
Remember: Spring Batch is heavily based on delegation; writing a class like your reader is rarely necessary.
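A sketch of what that looks like, reusing the bean wiring from the question; the line-mapper details are placeholders to adapt to RawItem (which holds a Map rather than simple properties, so you may need your own FieldSetMapper instead of the BeanWrapperFieldSetMapper shown here):

<bean id="multiResourceReader"
    class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
    <property name="resources" value="file:#{jobParameters[filePath]}/*.dat" />
    <property name="delegate" ref="flatFileItemReader" />
</bean>
<bean id="flatFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="delimiter" value="${delimiter}" />
                    <property name="names" value="${columnNames}" />
                </bean>
            </property>
            <property name="fieldSetMapper">
                <bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                    <property name="targetType" value="org.remediation.batch.model.RawItem" />
                </bean>
            </property>
        </bean>
    </property>
</bean>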

Spring boot server port range setting

Is it possible to set an acceptable range for server.port in the application.yml file for a Spring Boot application?
I have taken to setting server.port=0 to get an automatically assigned port rather than a hard-coded one.
Our network ops people want to restrict the available range for this port assignment.
Any ideas?
Following both user1289300 and Dave Syer, I combined the answers into one solution. It is supplied as a configuration that reads the server section from the application.yml file, with a min and max port range to choose from.
Thanks again
@Configuration
@ConfigurationProperties("server")
public class EmbeddedServletConfiguration {

    /*
     * Added EmbeddedServletContainer as Tomcat currently. Needs to change in future
     * if the EmbeddedServletContainer gets changed.
     */
    private final int MIN_PORT = 1100;
    private final int MAX_PORT = 65535;

    /**
     * this is the port read from the application.yml file
     */
    private int port;

    /**
     * this is the min port number that can be selected; filled in from the application yml file if it exists
     */
    private int minPort = MIN_PORT;

    /**
     * this is the max port number that can be selected; filled in from the application yml file if it exists
     */
    private int maxPort = MAX_PORT;

    /**
     * Added EmbeddedServletContainer as Tomcat currently. Needs to change in future
     * if the EmbeddedServletContainer gets changed.
     *
     * @return the container factory
     */
    @Bean
    public EmbeddedServletContainerFactory servletContainer() {
        return new TomcatEmbeddedServletContainerFactory();
    }

    @Bean
    public EmbeddedServletContainerCustomizer containerCustomizer() {
        return new EmbeddedServletContainerCustomizer() {
            @Override
            public void customize(ConfigurableEmbeddedServletContainer container) {
                // this only applies if someone has requested automatic port assignment
                if (port == 0) {
                    // make sure the ports are sensible and max > min
                    validatePorts();
                    int port = SocketUtils.findAvailableTcpPort(minPort, maxPort);
                    container.setPort(port);
                }
                container.addErrorPages(new ErrorPage(HttpStatus.NOT_FOUND, "/404"));
                container.addErrorPages(new ErrorPage(HttpStatus.FORBIDDEN, "/403"));
            }
        };
    }

    /**
     * validate the port choices
     * - the ports must be sensible numbers within the allowable range, and we fix them if not
     * - the max port must be greater than the min port, and we set it if not
     */
    private void validatePorts() {
        if (minPort < MIN_PORT || minPort > MAX_PORT - 1) {
            minPort = MIN_PORT;
        }
        if (maxPort < MIN_PORT + 1 || maxPort > MAX_PORT) {
            maxPort = MAX_PORT;
        }
        if (minPort > maxPort) {
            maxPort = minPort + 1;
        }
    }
}
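For reference, the application.yml section this binds to would look something like the following; the property names are inferred from the fields above via @ConfigurationProperties("server"), and the binding also assumes setters for port, minPort, and maxPort that are presumably elided from the class:

server:
  port: 0        # 0 requests automatic assignment
  minPort: 8100  # lower bound for the chosen port
  maxPort: 8200  # upper bound for the chosen port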
Just implement EmbeddedServletContainerCustomizer:
http://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-developing-web-applications.html#boot-features-programmatic-embedded-container-customization
Of course you can improve public static boolean available(int port) below, which checks the availability of a port: some ports, though available, are sometimes denied (like port 1024; this is OS dependent). The range could also be read from a properties file, though not with Spring, because the range is set before the context is loaded; that should not be a problem. I put everything in one file to show the approach, not to make it look pretty.
@Configuration
@ComponentScan
@EnableAutoConfiguration
public class DemoApplication {

    private static final int MIN_PORT = 1100; // to be set according to your
    private static final int MAX_PORT = 9000; // needs or loaded from
    public static int myPort;                 // properties, whatever suits you

    public static void main(String[] args) {
        int availablePort;
        for (availablePort = MIN_PORT; availablePort < MAX_PORT; availablePort++) {
            if (available(availablePort)) {
                break;
            }
        }
        if (availablePort == MAX_PORT) {
            // the loop ran out without finding a free port
            throw new IllegalArgumentException("Can't start container; no available port in range "
                    + MIN_PORT + "-" + MAX_PORT);
        }
        DemoApplication.myPort = availablePort;
        SpringApplication.run(DemoApplication.class, args);
    }
public static boolean available(int port) {
System.out.println("TRY PORT " + port);
// if you have some range for denied ports you can also check it
// here just add proper checking and return
// false if port checked within that range
ServerSocket ss = null;
DatagramSocket ds = null;
try {
ss = new ServerSocket(port);
ss.setReuseAddress(true);
ds = new DatagramSocket(port);
ds.setReuseAddress(true);
return true;
} catch (IOException e) {
} finally {
if (ds != null) {
ds.close();
}
if (ss != null) {
try {
ss.close();
} catch (IOException e) {
/* should not be thrown */
}
}
}
return false;
}
}
And this is the most important part:
@Component
class CustomizationBean implements EmbeddedServletContainerCustomizer {

    @Override
    public void customize(ConfigurableEmbeddedServletContainer container) {
        container.setPort(DemoApplication.myPort);
    }
}
The easiest way is to configure the following in application.properties. Here I used 8084 as the minimum of the range and 8100 as the maximum:
server.port=${random.int[8084,8100]}
Note that this picks a random port in the range without checking whether it is actually free.
There are challenges in the Spring Boot project; we cannot add this feature to Spring Boot at the moment. If you have a solution, please contribute:
Spring Boot server port range support pull request
With this solution, the application chooses its own port. I don't understand why this got a "-1", because it runs perfectly.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.embedded.ConfigurableEmbeddedServletContainer;
import org.springframework.boot.context.embedded.EmbeddedServletContainerCustomizer;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.SocketUtils;
@Configuration
class PortRangeCustomizerBean implements EmbeddedServletContainerCustomizer
{
    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    @Value("${spring.port.range.min}")
    private int MIN_PORT;

    @Value("${spring.port.range.max}")
    private int MAX_PORT;

    @Override
    public void customize(ConfigurableEmbeddedServletContainer container) {
        int port = SocketUtils.findAvailableTcpPort(MIN_PORT, MAX_PORT);
        logger.info("Started with PORT:\t " + port);
        container.setPort(port);
    }
}
We have done this in Spring Boot 1.5.9 using EmbeddedServletContainerCustomizer and something as follows:
@Bean
public EmbeddedServletContainerCustomizer containerCustomizer() {
    return (container -> {
        try {
            // use defaults if we can't talk to the config server
            Integer minPort = env.getProperty("minPort") != null ? Integer.parseInt(env.getProperty("minPort")) : 7500;
            Integer maxPort = env.getProperty("maxPort") != null ? Integer.parseInt(env.getProperty("maxPort")) : 9500;
            int port = SocketUtils.findAvailableTcpPort(minPort, maxPort);
            System.getProperties().put("server.port", port);
            container.setPort(port);
        } catch (Exception e) {
            log.error("Error occurred while reading the min & max port from properties: " + e);
            throw new ProductServiceException(e);
        }
    });
}
However this does not seem to be possible in Spring Boot 2.0.0.M7 and we are looking for an alternative way.
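In Spring Boot 2 the EmbeddedServletContainerCustomizer API was replaced by WebServerFactoryCustomizer, so a rough equivalent of the bean above would be the following sketch (the port range reuses the 7500-9500 defaults from the previous answer):

import org.springframework.boot.web.server.ConfigurableWebServerFactory;
import org.springframework.boot.web.server.WebServerFactoryCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.SocketUtils;

@Configuration
public class PortRangeConfig {

    // Spring Boot 2 replacement for EmbeddedServletContainerCustomizer:
    // pick a free port in the range and apply it to the web server factory.
    @Bean
    public WebServerFactoryCustomizer<ConfigurableWebServerFactory> portCustomizer() {
        return factory -> {
            int port = SocketUtils.findAvailableTcpPort(7500, 9500);
            factory.setPort(port);
        };
    }
}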
