How should I use .tasklet() / .chunk() to finish a job successfully? - spring

I use Spring Batch to clone a table from a source to a target database. The job is started manually from the service layer using a JobLauncher, with parameters passed in.
Everything works, but with the current configuration (below) and .chunk(10) in the step definition, only 10 rows are cloned before the job fails with Caused by: java.sql.SQLException: Result set already closed.
How should I define the step so that it reads and writes the whole table from the source to the target DB?
@Configuration
@EnableBatchProcessing
public class DatasetProcessingContext {

    private static final String OVERRIDEN_BY_JOB_PARAMETER = null;
    private static final String DATASET_PROCESSING_STEP = "datasetProcessingStep";
    private static final String DATASET_PROCESSING_JOB = "datasetProcessingJob";

    public static final String SUBSYSTEM = "subsystem";
    public static final String SQL = "sql";
    public static final String SOURCE_DATASOURCE = "sourceDatasource";
    public static final String INSERT_QUERY = "insertQuery";
    public static final String TARGET_DATASOURCE = "targetDatasource";

    @Autowired
    @Qualifier(DEV_DATA_SOURCE)
    private DataSource devDataSource;
    //set of datasources

    @Autowired
    private PlatformTransactionManager transactionManager;

    @SuppressWarnings("MismatchedQueryAndUpdateOfCollection")
    @Autowired
    private Map<String, TableMessageDataRowMapper> tableMessageDataRowMappers;

    @SuppressWarnings("MismatchedQueryAndUpdateOfCollection")
    @Autowired
    private Map<String, TableMessageDataPreparedStatementSetter> messageDataPreparedStatementSetters;

    @Autowired
    private JobBuilderFactory jobsFactory;

    @Autowired
    private StepBuilderFactory stepsFactory;

    @Bean
    public JobRepository jobRepository() throws Exception {
        return new MapJobRepositoryFactoryBean(transactionManager).getObject();
    }

    @Bean
    public JobRegistry jobRegistry() {
        return new MapJobRegistry();
    }

    @Bean
    public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor() {
        JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
        postProcessor.setJobRegistry(jobRegistry());
        return postProcessor;
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository());
        return jobLauncher;
    }

    @Bean
    public static StepScope stepScope() {
        return new StepScope();
    }

    @Bean
    @SuppressWarnings("unchecked")
    @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
    public ItemStreamReader jdbcReader(@Value("#{jobParameters['" + SUBSYSTEM + "']}") String subsystem,
                                       @Value("#{jobParameters['" + SQL + "']}") String sql,
                                       @Value("#{jobParameters['" + SOURCE_DATASOURCE + "']}") String sourceDatasource) {
        JdbcCursorItemReader jdbcCursorItemReader = new JdbcCursorItemReader();
        jdbcCursorItemReader.setDataSource(getDataSourceFromEnum(TargetDataSource.valueOf(sourceDatasource)));
        jdbcCursorItemReader.setSql(sql);
        jdbcCursorItemReader.setRowMapper((RowMapper) tableMessageDataRowMappers
                .get(subsystem + TABLE_MESSAGE_DATA_ROW_MAPPER));
        return jdbcCursorItemReader;
    }

    @Bean
    @SuppressWarnings("unchecked")
    @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
    public ItemWriter jdbcWriter(@Value("#{jobParameters['" + SUBSYSTEM + "']}") String subsystem,
                                 @Value("#{jobParameters['" + INSERT_QUERY + "']}") String insertQuery,
                                 @Value("#{jobParameters['" + TARGET_DATASOURCE + "']}") String targetDatasource) {
        JdbcBatchItemWriter jdbcWriter = new JdbcBatchItemWriter();
        jdbcWriter.setDataSource(getDataSourceFromEnum(TargetDataSource.valueOf(targetDatasource)));
        jdbcWriter.setSql(insertQuery);
        jdbcWriter.setItemPreparedStatementSetter(messageDataPreparedStatementSetters
                .get(subsystem + TABLE_MESSAGE_DATA_PREPARED_STATEMENT_SETTER));
        return jdbcWriter;
    }

    @Bean
    @SuppressWarnings("unchecked")
    public Step datasetProcessingStep() {
        return stepsFactory.get(DATASET_PROCESSING_STEP)
                // should I create Tasklet or chunk with some CompletionPolicy?
                .chunk(10)
                .reader(jdbcReader(OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER))
                .writer(jdbcWriter(OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER))
                .allowStartIfComplete(true)
                .build();
    }

    @Bean
    public Job datasetProcessingJob() {
        return jobsFactory.get(DATASET_PROCESSING_JOB).start(datasetProcessingStep()).build();
    }
}

Using .chunk(new DefaultResultCompletionPolicy()) in the step definition suits my case. This policy returns true from isComplete(RepeatContext context, RepeatStatus result) on a null result, i.e. once the ResultSet is exhausted.
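A minimal sketch of the resulting step definition, identical to the one above except for the completion policy:

@Bean
@SuppressWarnings("unchecked")
public Step datasetProcessingStep() {
    return stepsFactory.get(DATASET_PROCESSING_STEP)
            // the chunk ends only when the reader returns null, i.e. the whole table has been read
            .chunk(new DefaultResultCompletionPolicy())
            .reader(jdbcReader(OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER))
            .writer(jdbcWriter(OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER, OVERRIDEN_BY_JOB_PARAMETER))
            .allowStartIfComplete(true)
            .build();
}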

Related

How to run a job multiple times in parallel with different Excel files as input in Spring Batch

I have a use case where users upload different Excel files; each file is processed in parallel, and every row of the Excel needs to be saved to an H2 database along with its job execution id.
The issue I am facing: when a user uploads the first file, processing runs in the background and each row of that Excel is saved with job execution id 1. But if the user uploads another Excel file with different data before the first one completes, the data belonging to the first Excel also starts being saved with the latest job execution id, 2. How can I resolve this, so that each job's data is saved with that job's own id while still running different jobs in parallel?
This is the service class
@Service
public class BatchTestService {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher batchJobLauncher;

    @Autowired
    private WriterImpl writer;

    public Job job(byte[] data) {
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .flow(step(data))
                .end()
                .build();
    }

    @SneakyThrows
    public PoiItemReader<TestEntity> reader(byte[] data) {
        ReaderImpl reader = new ReaderImpl();
        reader.setLinesToSkip(1);
        reader.setResource(toResource(data, "TEST"));
        reader.setRowMapper(new MapperClass());
        return reader;
    }

    public Step step(byte[] data) {
        return stepBuilderFactory.get("step").<TestEntity, TestEntity>chunk(2)
                .reader(reader(data))
                .writer(writer)
                .build();
    }

    public ThreadPoolTaskExecutor getThreadPoolTaskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(2);
        taskExecutor.setMaxPoolSize(4);
        taskExecutor.setThreadNamePrefix("test");
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }

    public void uploadExcel(MultipartFile file) throws Exception {
        String jobId = String.valueOf(System.currentTimeMillis());
        JobParameters parameters = new JobParametersBuilder().addString("jobId", jobId)
                .toJobParameters();
        ((SimpleJobLauncher) batchJobLauncher).setTaskExecutor(getThreadPoolTaskExecutor());
        batchJobLauncher.run(job(file.getBytes()), parameters);
    }

    public static Resource toResource(byte[] bytesFile, String sheetName) throws IOException {
        ByteArrayInputStream bin = new ByteArrayInputStream(bytesFile);
        XSSFWorkbook workbook = new XSSFWorkbook(bin);
        var outputStream = new ByteArrayOutputStream();
        workbook.write(outputStream);
        return new ByteArrayResource(outputStream.toByteArray());
    }
}
This is the config class.
@Configuration
public class BatchDataSourceConfig {

    @Value("${spring.datasource.driver-class-name}")
    private String driverName;

    @Value("${spring.datasource.url}")
    private String url;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverName);
        dataSource.setUrl(url);
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }

    @Bean
    public JobLauncher batchJobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        return jobLauncher;
    }
}
This is the reader class.
public class ReaderImpl extends PoiItemReader<TestEntity> {}
This is the writer class.
@Component
public class WriterImpl implements ItemWriter<TestEntity> {

    private static Logger logger = LoggerFactory.getLogger(WriterImpl.class);

    @Autowired
    private TestEntityRepository testEntityRepository;

    private StepExecution stepExecution;

    @BeforeStep
    public void beforeStep(final StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    @Override
    @SneakyThrows
    public void write(List<? extends TestEntity> modelObjectList) {
        logger.info("Writer is reached...");
        Thread.sleep(3000);
        for (TestEntity testEntity : modelObjectList) {
            testEntity.setJobExecutionId(stepExecution.getJobExecutionId());
            testEntityRepository.save(testEntity);
        }
    }
}
The respective RowMapper class is also defined.
public class MapperClass implements RowMapper<TestEntity> {

    @Override
    public TestEntity mapRow(RowSet rowSet) {
        TestEntity testEntity = new TestEntity();
        testEntity.setStudentName(rowSet.getColumnValue(0));
        testEntity.setRollNo(rowSet.getColumnValue(1));
        testEntity.setSection(rowSet.getColumnValue(2));
        return testEntity;
    }
}
This is the model class.
@AllArgsConstructor
@Data
@Entity
@NoArgsConstructor
@Table(name = "TEST_ENTITY")
public class TestEntity {

    @GeneratedValue(strategy = GenerationType.AUTO)
    @Id
    private Integer id;

    private String studentName;
    private String rollNo;
    private String section;
    private Long jobExecutionId;
}
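A likely cause, given the code above (this diagnosis is mine, not part of the original post): WriterImpl is a singleton @Component, so when the second job starts, its StepExecution overwrites the stepExecution field the still-running first job is using. A minimal sketch of one fix is to make the writer step-scoped and late-bind the job execution id, so each job execution gets its own writer state:

@Component
@StepScope
public class WriterImpl implements ItemWriter<TestEntity> {

    @Autowired
    private TestEntityRepository testEntityRepository;

    // Late-bound per step execution, so concurrent jobs no longer share it
    @Value("#{stepExecution.jobExecutionId}")
    private Long jobExecutionId;

    @Override
    public void write(List<? extends TestEntity> modelObjectList) {
        for (TestEntity testEntity : modelObjectList) {
            testEntity.setJobExecutionId(jobExecutionId);
            testEntityRepository.save(testEntity);
        }
    }
}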

Spring Boot Batch: get CSV from a REST API

I have a scenario for a Spring Boot batch application where I need to read CSV-1 and write it to another file, CSV-2. The problem is that I need to get CSV-1 from a REST endpoint: when I initiate the job, it should fetch CSV-1 from the endpoint and then continue the batch processing to write CSV-2.
But it seems that to process CSV-1 we need to supply the Resource in advance, at application startup [I am fairly new to Batch, so I'm not sure that's 100% correct].
Can anyone please guide me to the right approach to resolve this?
EDIT: Adding code. I was able to resolve it, but I'd like advice on whether it's the right way to do things (please ignore the hard-coded data).
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobbuilder;

    @Autowired
    private StepBuilderFactory stepbuilder;

    @Autowired
    private CSVProcessor processor;

    @Autowired
    private CustomSkipPolicy customSkipPolicy;

    @Autowired
    private CSVResponse response;

    @Bean(name = "csv")
    public Job job_csv() throws Exception {
        Step step = stepbuilder
                .get("csv-step")
                .<Person, Person>chunk(5)
                .reader(new CSVReader(response.getResource()))
                .processor(processor)
                .writer(new CSVWriter().write(response.getFilename()))
                .faultTolerant()
                .skipPolicy(customSkipPolicy)
                .listener(new StepListener())
                .build();
        return jobbuilder
                .get("csv-job")
                .incrementer(new RunIdIncrementer())
                .listener(new JobListener())
                .flow(step)
                .end()
                .build();
    }
}
@Slf4j
public class CSVReader extends FlatFileItemReader<Person> {

    public CSVReader(Resource resource) throws Exception {
        super();
        setResource(resource);
        setStrict(false);
        setLinesToSkip(1);
        doOpen();
        DelimitedLineTokenizer dlt = new DelimitedLineTokenizer();
        dlt.setNames(new String[] {"id", "first_name", "last_name", "email", "gender", "ip_address", "dob"});
        dlt.setDelimiter(",");
        dlt.setStrict(false);
        BeanWrapperFieldSetMapper<Person> fsp = new BeanWrapperFieldSetMapper<>();
        fsp.setTargetType(Person.class);
        DefaultLineMapper<Person> dlp = new DefaultLineMapper<>();
        dlp.setLineTokenizer(dlt);
        dlp.setFieldSetMapper(fsp);
        setLineMapper(dlp);
    }
}
public class CSVWriter {

    private static final String DATA_PROCESSED = "C:/data/processed";

    public FlatFileItemWriter<Person> write(String filename) throws Exception {
        FlatFileItemWriter<Person> writer = new FlatFileItemWriter<>();
        writer.setResource(resource(filename));
        writer.setHeaderCallback(new Header());
        BeanWrapperFieldExtractor<Person> fe = new BeanWrapperFieldExtractor<>();
        fe.setNames(new String[] { "id", "first_name", "last_name", "age" });
        DelimitedLineAggregator<Person> dla = new DelimitedLineAggregator<>();
        dla.setDelimiter(",");
        dla.setFieldExtractor(fe);
        writer.setLineAggregator(dla);
        return writer;
    }

    private Resource resource(String filename) throws IOException {
        String processed_file = filename.replace(".csv", "").concat("_PROCESSED").concat(".csv");
        if (Files.notExists(Paths.get(DATA_PROCESSED), new LinkOption[] { LinkOption.NOFOLLOW_LINKS })) {
            Files.createDirectory(Paths.get(DATA_PROCESSED));
        }
        if (Files.notExists(Paths.get(DATA_PROCESSED + "/" + processed_file),
                new LinkOption[] { LinkOption.NOFOLLOW_LINKS })) {
            Files.createFile(Paths.get(DATA_PROCESSED + "/" + processed_file));
        }
        return new FileSystemResource(DATA_PROCESSED + "/" + processed_file);
    }
}
@Component
@Slf4j
public class CSVResponse {

    private static final String DATA_RECEIVE = "C:/data/receive";

    private String filename;

    public Resource getResource() throws IOException {
        ResponseEntity<Resource> resp = new RestTemplate().getForEntity("http://localhost:8081/csv", Resource.class);
        log.info("File Received : " + resp.getBody().getFilename());
        setFilename(resp.getBody().getFilename());
        Path path = Paths.get(DATA_RECEIVE);
        if (!Files.exists(path, new LinkOption[] { LinkOption.NOFOLLOW_LINKS })) {
            Files.createDirectory(path);
        }
        Files.copy(resp.getBody().getInputStream(), Paths.get(DATA_RECEIVE + "/" + resp.getBody().getFilename()), StandardCopyOption.REPLACE_EXISTING);
        return resp.getBody();
    }

    public String getFilename() {
        return filename;
    }

    public void setFilename(String filename) {
        this.filename = filename;
    }
}
You can implement an item reader that also implements the StepExecutionListener interface, and fetch the CSV-1 file via RestTemplate in the item reader's beforeStep() method.
You can see how to use RestTemplate at https://www.baeldung.com/rest-template
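A minimal sketch of that suggestion (the class name and endpoint are assumptions, and the tokenizer/line-mapper setup from CSVReader above is omitted for brevity):

public class RestCsvItemReader extends FlatFileItemReader<Person> implements StepExecutionListener {

    private final RestTemplate restTemplate = new RestTemplate();

    @Override
    public void beforeStep(StepExecution stepExecution) {
        try {
            // Hypothetical endpoint; fetch CSV-1 and stage it as a local file
            byte[] csv = restTemplate.getForObject("http://localhost:8081/csv", byte[].class);
            Path tempFile = Files.createTempFile("csv1-", ".csv");
            Files.write(tempFile, csv);
            // The resource is set just before the step opens the reader
            setResource(new FileSystemResource(tempFile.toFile()));
        } catch (IOException e) {
            throw new IllegalStateException("Could not fetch CSV-1", e);
        }
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }
}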

Spring Batch Multiple JobExecutionListener - not working

I have a Spring Boot batch application.
I have two jobs and two different JobExecutionListeners, but only one JobExecutionListener is ever called, for both jobs.
Job One
@Configuration
public class TestDriveJob {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    TestDriveTasklet testDriveTasklet;

    @Autowired
    BatchJobMgmt batchJobMgmt;

    @Autowired
    TestDriveJobExecutionListener testDriveJobExecutionListener;

    private static final String JOB_NAME = "test-drive-job";

    @Scheduled(fixedDelay = 100 * 1000, initialDelay = 5000)
    public void schedule() throws Exception {
        if (batchJobMgmt.isJobAllowed(JOB_NAME)) {
            JobParameters param = new JobParametersBuilder().addString("JobID", "TEST" + String.valueOf(System.currentTimeMillis())).toJobParameters();
            jobLauncher.run(job(), param);
        }
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get(JOB_NAME).listener(testDriveJobExecutionListener).incrementer(new RunIdIncrementer()).start(testDriveStep()).build();
    }

    @Bean
    public Step testDriveStep() {
        return stepBuilderFactory.get("test-drive-step").tasklet(testDriveTasklet).listener(testDriveJobExecutionListener).build();
    }
}
Job Two
@Configuration
public class ExtendedWarrantyJob {

    private static final Logger LOGGER = LoggerFactory.getLogger(ExtendedWarrantyJob.class);

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    ExtendedWarrantyTasklet extendedWarrantyTasklet;

    @Autowired
    BatchJobMgmt batchJobMgmt;

    @Autowired
    ExtendedWarrantyJobExecutionListener extendedWarrantyJobExecutionListener;

    private static final String JOB_NAME = "extended-warranty-job";

    @Scheduled(fixedDelay = 900 * 1000, initialDelay = 20000)
    public void schedule() throws Exception {
        if (batchJobMgmt.isJobAllowed(JOB_NAME)) {
            LOGGER.debug("Extended Warranty Job Started at :" + new Date());
            JobParameters param = new JobParametersBuilder().addString("JobID", String.valueOf(System.currentTimeMillis())).toJobParameters();
            JobExecution execution = jobLauncher.run(job(), param);
            LOGGER.debug("Extended Warranty Job finished with status :" + execution.getStatus());
        }
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get(JOB_NAME).listener(extendedWarrantyJobExecutionListener).incrementer(new RunIdIncrementer()).start(extendedWarrantyStep())
                .build();
    }

    @Bean
    public Step extendedWarrantyStep() {
        return stepBuilderFactory.get("extended-warranty-step").tasklet(extendedWarrantyTasklet).build();
    }
}
Job one Listener
@Service
public class TestDriveJobExecutionListener implements JobExecutionListener {

    private static final Logger LOGGER = LoggerFactory.getLogger(TestDriveJobExecutionListener.class);

    @Override
    public void beforeJob(JobExecution jobExecution) {
        LOGGER.debug("{} Started at : {}", jobExecution.getJobInstance().getJobName(), new Date());
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        LOGGER.debug("{} Completed at : {}", jobExecution.getJobInstance().getJobName(), new Date());
    }
}
Job Two Listener
@Service
public class ExtendedWarrantyJobExecutionListener implements JobExecutionListener {

    private static final Logger LOGGER = LoggerFactory.getLogger(ExtendedWarrantyJobExecutionListener.class);

    @Override
    public void beforeJob(JobExecution jobExecution) {
        LOGGER.debug("{} Started=== at : {}", jobExecution.getJobInstance().getJobName(), new Date());
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        LOGGER.debug("{} Completed at : {}", jobExecution.getJobInstance().getJobName(), new Date());
    }
}
Both jobs execute without any issue, but the job-one listener is always the one called; the job-two listener never runs.
I tried adding @Qualifier next to the @Autowired annotation, but it did not make any difference.
Any help is appreciated.
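A likely cause, given the code above (this diagnosis is mine, not part of the original post): both @Configuration classes declare a @Bean method named job(), so the two definitions compete for the single bean name "job" and one silently overrides the other; both schedulers then launch the same Job instance, with the same listener. A sketch of the fix is to give each @Bean method a distinct name:

// In TestDriveJob
@Bean
public Job testDriveJob() {
    return jobBuilderFactory.get(JOB_NAME).listener(testDriveJobExecutionListener)
            .incrementer(new RunIdIncrementer()).start(testDriveStep()).build();
}

// In ExtendedWarrantyJob
@Bean
public Job extendedWarrantyJob() {
    return jobBuilderFactory.get(JOB_NAME).listener(extendedWarrantyJobExecutionListener)
            .incrementer(new RunIdIncrementer()).start(extendedWarrantyStep()).build();
}

// ...and each schedule() method then calls jobLauncher.run(testDriveJob(), param)
// or jobLauncher.run(extendedWarrantyJob(), param) accordingly.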

How to copy a file from an FTP server to a local directory using SFTP and Spring Boot, Java

I have code for inbound and outbound channel adapters over SFTP. I want to invoke those methods from the Spring Boot scheduler rather than by polling. I'm looking for an example of how to call the resultFileHandler() method.
public class SftpConfig {

    @Value("${nodephone.directory.sftp.host}")
    private String sftpHost;

    @Value("${nodephone.directory.sftp.port}")
    private int sftpPort;

    @Value("${nodephone.directory.sftp.user}")
    private String sftpUser;

    @Value("${nodephone.directory.sftp.password}")
    private String sftpPasword;

    @Value("${nodephone.directory.sftp.remote.directory.download}")
    private String sftpRemoteDirectoryDownload;

    @Value("${nodephone.directory.sftp.remote.directory.upload}")
    private String sftpRemoteDirectoryUpload;

    @Value("${nodephone.directory.sftp.remote.directory.filter}")
    private String sftpRemoteDirectoryFilter;

    @Value("${nodephone.directory.sftp.remote.directory.localDirectory}")
    private String sftpLocalDirectory;

    // private FtpOrderRequestHandler handler;

    @Bean
    public SessionFactory<LsEntry> sftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost(sftpHost);
        factory.setPort(sftpPort);
        factory.setUser(sftpUser);
        factory.setPassword(sftpPasword);
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<LsEntry>(factory);
    }

    @Bean
    public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
        SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
        fileSynchronizer.setDeleteRemoteFiles(true);
        fileSynchronizer.setRemoteDirectory(sftpRemoteDirectoryDownload);
        fileSynchronizer.setFilter(new SftpSimplePatternFileListFilter(sftpRemoteDirectoryFilter));
        return fileSynchronizer;
    }

    @Bean
    @InboundChannelAdapter(channel = "fromSftpChannel", poller = @Poller(cron = "0/5 * * * * *"))
    public MessageSource<File> sftpMessageSource() {
        SftpInboundFileSynchronizingMessageSource source = new SftpInboundFileSynchronizingMessageSource(
                sftpInboundFileSynchronizer());
        source.setAutoCreateLocalDirectory(true);
        source.setLocalFilter(new AcceptOnceFileListFilter<File>());
        source.setLocalDirectory(new File("/local"));
        return source;
    }

    @Bean
    @ServiceActivator(inputChannel = "fromSftpChannel")
    public MessageHandler resultFileHandler() {
        return new MessageHandler() {
            @Override
            public void handleMessage(Message<?> message) throws MessagingException {
                System.out.println("********************** " + message.getPayload());
            }
        };
    }
}
I have tested this with the @Configuration annotation and it reads the file from the server, but I want to run it from a cron schedule instead of polling. How do I call the resultFileHandler() method?
I've never done this using Spring Integration in any production code, although I did something like the below to download files from remote servers via SFTP/FTP.
I'm only using the SftpOutboundGateway (there could be better ways) to call the "mget" method and fetch the payload (file).
@Configuration
@ConfigurationProperties(prefix = "sftp")
@Setter
@Getter
@EnableIntegration
public class RemoteFileConfiguration {

    private String clients;
    private String hosts;
    private int ports;
    private String users;
    private String passwords;

    @Bean(name = "clientSessionFactory")
    public SessionFactory<LsEntry> clientSessionFactory() {
        DefaultSftpSessionFactory sf = new DefaultSftpSessionFactory();
        sf.setHost(hosts);
        sf.setPort(ports);
        sf.setUser(users);
        sf.setPassword(passwords);
        sf.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(sf);
    }

    @Bean
    @ServiceActivator(inputChannel = "sftpChannel")
    public MessageHandler clientMessageHandler() {
        SftpOutboundGateway sftpOutboundGateway = new SftpOutboundGateway(
                clientSessionFactory(), "mget", "payload");
        sftpOutboundGateway.setAutoCreateLocalDirectory(true);
        sftpOutboundGateway.setLocalDirectory(new File("/users/localPath/client/INPUT/"));
        sftpOutboundGateway.setFileExistsMode(FileExistsMode.REPLACE_IF_MODIFIED);
        sftpOutboundGateway.setFilter(new AcceptOnceFileListFilter<>());
        return sftpOutboundGateway;
    }
}
@MessagingGateway
public interface SFTPGateway {

    @Gateway(requestChannel = "sftpChannel")
    List<File> get(String dir);
}
To run this on a cron schedule, I used a Tasklet executed by Spring Batch whenever I need a cron expression.
@Slf4j
@Getter
@Setter
public class RemoteFileInboundTasklet implements Tasklet {

    private RemoteFileTemplate remoteFileTemplate;
    private String remoteClientDir;
    private String clientName;
    private SFTPGateway sftpGateway;

    @Override
    public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext)
            throws Exception {
        List<File> files = sftpGateway.get(remoteClientDir);
        if (CollectionUtils.isEmpty(files)) {
            log.warn("No file was downloaded for client {}.", clientName);
            return RepeatStatus.FINISHED;
        }
        log.info("Total file: {}", files.size());
        return RepeatStatus.FINISHED;
    }
}
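A sketch, with assumed bean and path names, of how the tasklet above can be wired into a job and launched on a cron expression (this wiring is not shown in the original answer):

@Configuration
@EnableScheduling
public class RemoteFileJobConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private SFTPGateway sftpGateway;

    @Bean
    public Job remoteFileJob() {
        RemoteFileInboundTasklet tasklet = new RemoteFileInboundTasklet();
        tasklet.setSftpGateway(sftpGateway);
        tasklet.setRemoteClientDir("/remote/dir"); // hypothetical remote directory
        tasklet.setClientName("client");           // hypothetical client name
        return jobBuilderFactory.get("remote-file-job")
                .start(stepBuilderFactory.get("remote-file-step").tasklet(tasklet).build())
                .build();
    }

    // Hypothetical cron: run every hour on the hour
    @Scheduled(cron = "0 0 * * * *")
    public void runRemoteFileJob() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis()).toJobParameters();
        jobLauncher.run(remoteFileJob(), params);
    }
}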
NOTE: If you don't want to use Batch's Tasklet, you can use your own @Component class and inject the gateway to call its get() method.
@Autowired
private SFTPGateway sftpGateway;
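For example, a minimal sketch of such a component (the names and cron expression are assumptions, and it presumes @EnableScheduling is active somewhere in the context):

@Component
public class SftpDownloadScheduler {

    @Autowired
    private SFTPGateway sftpGateway;

    // Hypothetical cron: run every day at 01:00
    @Scheduled(cron = "0 0 1 * * *")
    public void downloadFiles() {
        List<File> files = sftpGateway.get("/remote/dir"); // hypothetical remote directory
        System.out.println("Downloaded " + files.size() + " file(s)");
    }
}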

How to fix xml-less autowiring of service

When I call a service directly in my main() I can query the database and things work fine. But when a Jersey request comes in and maps the JSON to NewJobRequest, I can't use my service because the @Autowired injection failed.
My app:
public class Main {

    public static final URI BASE_URI = getBaseURI();

    private static URI getBaseURI() {
        return UriBuilder.fromUri("http://localhost/").port(9998).build();
    }

    protected static HttpServer startServer() throws IOException {
        ResourceConfig rc = new PackagesResourceConfig("com.production.api.resources");
        rc.getFeatures()
                .put(JSONConfiguration.FEATURE_POJO_MAPPING, true);
        return GrizzlyServerFactory.createHttpServer(BASE_URI, rc);
    }

    public static void main(String[] args) throws IOException {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Config.class);
        //if this is uncommented, it'll successfully query the database
        //VendorService vendorService = (VendorService)ctx.getBean("vendorService");
        //Vendor vendor = vendorService.findByUUID("asdf");
        HttpServer httpServer = startServer();
        System.out.println(String.format("Jersey app started with WADL available at " + "%sapplication.wadl\nTry out %shelloworld\nHit enter to stop it...", BASE_URI, BASE_URI));
        System.in.read();
        httpServer.stop();
    }
}
My Resource (controller):
@Component
@Path("/job")
public class JobResource extends GenericResource {

    @Path("/new")
    @POST
    public String New(NewJobRequest request) {
        return "done";
    }
}
Jersey is mapping the JSON post to:
@Component
public class NewJobRequest {

    @Autowired
    private VendorService vendorService;

    @JsonCreator
    public NewJobRequest(Map<String, Object> request) {
        //uh oh, can't do anything here because @Autowired failed and vendorService is null
    }
}
VendorService:
#Service
public class VendorService extends GenericService<VendorDao> {
public Vendor findByUUID(String uuid) {
Vendor entity = null;
try {
return (Vendor)em.createNamedQuery("Vendor.findByUUID")
.setParameter("UUID", uuid)
.getSingleResult();
} catch (Exception ex) {
return null;
}
}
}
@Service
public class GenericService<T extends GenericDao> {

    private static Logger logger = Logger.getLogger(Logger.class.getName());

    @PersistenceContext(unitName = "unit")
    public EntityManager em;

    protected T dao;

    @Transactional
    public void save(T entity) {
        dao.save(entity);
    }
}
My service config:
@Configuration
public class Config {

    @Bean
    public VendorService vendorService() {
        return new VendorService();
    }
}
My main config:
@Configuration
@ComponentScan(basePackages = {
        "com.production.api",
        "com.production.api.dao",
        "com.production.api.models",
        "com.production.api.requests",
        "com.production.api.requests.job",
        "com.production.api.resources",
        "com.production.api.services"
})
@Import({
        com.production.api.services.Config.class,
        com.production.api.dao.Config.class,
        com.production.api.requests.Config.class
})
@PropertySource(value = "classpath:/META-INF/application.properties")
@EnableTransactionManagement
public class Config {

    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USER = "db.user";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";

    @Resource
    Environment environment;

    @Bean
    public DataSource dataSource() {
        MysqlDataSource dataSource = new MysqlDataSource();
        dataSource.setUrl(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        dataSource.setUser(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_USER));
        dataSource.setPassword(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return dataSource;
    }

    @Bean
    public JpaTransactionManager transactionManager() throws ClassNotFoundException {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactoryBean().getObject());
        return transactionManager;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean() throws ClassNotFoundException {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setPersistenceUnitName("unit");
        entityManagerFactoryBean.setPackagesToScan(environment.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
        entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);
        Properties jpaProperties = new Properties();
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_DIALECT, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_FORMAT_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_FORMAT_SQL));
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
        entityManagerFactoryBean.setJpaProperties(jpaProperties);
        return entityManagerFactoryBean;
    }
}
The @Path and @POST annotations are JAX-RS, not Spring, so the container is instantiating your endpoints on its own, without any knowledge of Spring beans. You are most likely not getting any Spring logging because Spring is not being used at all.
I've figured out the issue and blogged about it here: http://blog.benkuhl.com/2013/02/how-to-access-a-service-layer-on-a-jersey-json-object/
In the meantime, I'm also going to post the solution here:
I need to tap into the bean that Spring has already created, so I used Spring's ApplicationContextAware:
// Note: this class must itself be registered as a Spring bean so that
// setApplicationContext() is invoked and the static reference gets populated.
public class ApplicationContextProvider implements ApplicationContextAware {

    private static ApplicationContext applicationContext;

    public static ApplicationContext getApplicationContext() {
        return applicationContext;
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        ApplicationContextProvider.applicationContext = applicationContext;
    }
}
And then I used that static context reference within the object being mapped, so I can look up the service:
public class NewJobRequest {

    private VendorService vendorService;

    public NewJobRequest() {
        vendorService = (VendorService) ApplicationContextProvider.getApplicationContext().getBean("vendorService");
    }

    @JsonCreator
    public NewJobRequest(Map<String, Object> request) {
        this(); // ensure vendorService is looked up even when Jackson uses this constructor
        setVendor(vendorService.findById(request.get("vendorId")));
    }

    ....
}
