I have multiple merchants, and each merchant needs a WSDL definition with a different configuration. Right now I'm copying and pasting the following method and changing the configs, but that creates difficulties (it requires a code change and a deployment). I want to initialize the definitions programmatically at runtime. I've tried a few approaches, but they didn't work.
Is it possible?
@Bean(name = "marchant-1")
public DefaultWsdl11Definition defaultWsdl11Definition(XsdSchema commonSchema) {
DefaultWsdl11Definition wsdl11Definition = new DefaultWsdl11Definition();
wsdl11Definition.setPortTypeName("Marchant1WSPort");
wsdl11Definition.setLocationUri("/Marchant1WebService");
wsdl11Definition.setTargetNamespace("http://.../");
wsdl11Definition.setSchema(commonSchema);
return wsdl11Definition;
}
I solved a similar kind of problem by specifying the scope of my bean as prototype. The example below explains my implementation:
Create your DefaultWsdl11Definition class as shown below:
import lombok.Getter;
import lombok.Setter;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;
import org.springframework.xml.xsd.XsdSchema;
@Component
@Scope("prototype")
@Getter
@Setter
public class DefaultWsdl11Definition {
private String portTypeName;
private String locationUri;
private String targetNamespace;
private XsdSchema schema;
}
Implement ApplicationContextAware to generate beans programmatically:
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Service;
@Service
public class ApplicationContextAwareImpl implements ApplicationContextAware {
private static ApplicationContext context;
@Override
public void setApplicationContext(ApplicationContext applicationContext) {
ApplicationContextAwareImpl.initApplicationContext(applicationContext);
}
private static void initApplicationContext(ApplicationContext applicationContext) {
ApplicationContextAwareImpl.context = applicationContext;
}
/**
* @param requiredType Bean class
*
* @return Bean of required type
*/
public static <T> T getBean(Class<T> requiredType) {
return context.getBean(requiredType);
}
}
Use the ApplicationContextAwareImpl.getBean() method to create prototype beans programmatically:
DefaultWsdl11Definition wsdl11Definition = ApplicationContextAwareImpl.getBean(DefaultWsdl11Definition.class);
wsdl11Definition.setPortTypeName("Marchant1WSPort");
wsdl11Definition.setLocationUri("/Marchant1WebService");
wsdl11Definition.setTargetNamespace("http://.../");
wsdl11Definition.setSchema(commonSchema);
You can also make use of the @Qualifier and @Bean annotations to define multiple beans of the same type:
@Configuration
public class BeanConfigurations {
#Qualifier("marchant-1")
#Bean(name = "marchant-1")
public DefaultWsdl11Definition defaultWsdl11Definition1(XsdSchema commonSchema) {
DefaultWsdl11Definition wsdl11Definition = new DefaultWsdl11Definition();
wsdl11Definition.setPortTypeName("Marchant1WSPort");
wsdl11Definition.setLocationUri("/Marchant1WebService");
wsdl11Definition.setTargetNamespace("http://.../");
wsdl11Definition.setSchema(commonSchema);
return wsdl11Definition;
}
#Qualifier("marchant-2")
#Bean(name = "marchant-2")
public DefaultWsdl11Definition defaultWsdl11Definition2(XsdSchema commonSchema) {
DefaultWsdl11Definition wsdl11Definition = new DefaultWsdl11Definition();
wsdl11Definition.setPortTypeName("Marchant2WSPort");
wsdl11Definition.setLocationUri("/Marchant2WebService");
wsdl11Definition.setTargetNamespace("http://.../");
wsdl11Definition.setSchema(commonSchema);
return wsdl11Definition;
}
}
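The named beans can then be injected wherever they are needed. A minimal usage sketch (the consuming class here is hypothetical):
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;
import org.springframework.ws.wsdl.wsdl11.DefaultWsdl11Definition;

@Service
public class MerchantWsdlConsumer {

    private final DefaultWsdl11Definition merchant1Wsdl;

    // Pick the specific definition by its bean name / qualifier
    public MerchantWsdlConsumer(@Qualifier("marchant-1") DefaultWsdl11Definition merchant1Wsdl) {
        this.merchant1Wsdl = merchant1Wsdl;
    }
}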
This solution worked for me:
@Configuration
public class WsBeanLoader implements BeanDefinitionRegistryPostProcessor {
@Override
public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
SimpleXsdSchema xsd = new SimpleXsdSchema(new ClassPathResource("xsd/sample.xsd"));
xsd.afterPropertiesSet();
String name = "merchant1"; // example value; derive the name per merchant in your setup
BeanDefinitionBuilder builder = builder(name, xsd);
registry.registerBeanDefinition(name, builder.getBeanDefinition());
}
private BeanDefinitionBuilder builder(String name, SimpleXsdSchema xsd) {
name = StringUtils.capitalize(name);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(DefaultWsdl11Definition.class);
builder.addPropertyValue("serviceName", name + "WebService");
builder.addPropertyValue("portTypeName", name + "WSPortBinding");
builder.addPropertyValue("locationUri", "/WS/" + name + "WS");
builder.addPropertyValue("targetNamespace", "url..");
builder.addPropertyValue("schema", xsd);
return builder;
}
@Override
public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
}
}
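If the merchant list is known at startup, the same callback can register one definition per merchant. A minimal sketch extending the approach above, with an illustrative hard-coded list (in practice the names would come from a file or the Environment, since regular beans are not available at this point):
import java.util.Arrays;
import java.util.List;
import org.springframework.beans.BeansException;
import org.springframework.beans.FatalBeanException;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.BeanDefinitionRegistryPostProcessor;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.util.StringUtils;
import org.springframework.ws.wsdl.wsdl11.DefaultWsdl11Definition;
import org.springframework.xml.xsd.SimpleXsdSchema;

@Configuration
public class MerchantWsdlRegistrar implements BeanDefinitionRegistryPostProcessor {

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        // Illustrative list; read the real merchant names from configuration available at this phase
        List<String> merchants = Arrays.asList("merchant1", "merchant2");
        for (String merchant : merchants) {
            String name = StringUtils.capitalize(merchant);
            SimpleXsdSchema xsd = new SimpleXsdSchema(new ClassPathResource("xsd/sample.xsd"));
            try {
                xsd.afterPropertiesSet(); // parse the schema before handing it over
            } catch (Exception e) {
                throw new FatalBeanException("Could not load XSD for " + merchant, e);
            }
            BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(DefaultWsdl11Definition.class);
            builder.addPropertyValue("portTypeName", name + "WSPort");
            builder.addPropertyValue("locationUri", "/" + name + "WebService");
            builder.addPropertyValue("targetNamespace", "http://.../");
            builder.addPropertyValue("schema", xsd);
            registry.registerBeanDefinition(merchant, builder.getBeanDefinition());
        }
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // nothing to do here
    }
}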
Related
I want to use properties to configure some Swagger Dockets in Spring, but I can't get the properties when I implement ImportBeanDefinitionRegistrar, and I get this error:
Caused by: java.lang.NoSuchMethodException:
com.github.sofior.swagger.SwaggerAutoConfiguration.<init>()
@Configuration
@EnableSwagger2
@EnableConfigurationProperties(SwaggerProperties.class)
public class SwaggerAutoConfiguration implements ImportBeanDefinitionRegistrar {
private final SwaggerProperties properties;
public SwaggerAutoConfiguration(SwaggerProperties properties) {
this.properties = properties;
}
@Override
public void registerBeanDefinitions(AnnotationMetadata importingClassMetadata, BeanDefinitionRegistry registry) {
System.out.println(properties);
properties.getDockets().forEach((docketName, docketProperties) -> {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(Docket.class);
builder.addConstructorArgValue(docketProperties.getType());
builder.addConstructorArgValue(docketProperties.getType());
registry.registerBeanDefinition(docketName, builder.getRawBeanDefinition());
});
}
}
I think it is impossible to do this, because Spring has two phases:
1. bean registration
2. bean initialization and instantiation
SwaggerProperties can only be used after phase 2, once it has been instantiated, but registerBeanDefinitions runs in phase 1.
Basically you need to inject the properties into your class constructor.
So the constructor should be annotated with @Autowired for the configuration to be wired in:
@Autowired
public SwaggerAutoConfiguration(SwaggerProperties properties) {
this.properties = properties;
}
This should fix your "properties is null" issue.
The workaround for this question is to read the properties yourself during registerBeanDefinitions:
EnableCustomSwagger
import org.springframework.context.annotation.Import;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
@Import(SwaggerAutoConfiguration.class)
public @interface EnableCustomSwagger {
String path() default "";
}
SwaggerAutoConfiguration
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.annotation.AnnotationAttributes;
import org.springframework.core.io.DefaultResourceLoader;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.core.type.AnnotationMetadata;
public class SwaggerAutoConfiguration implements ImportBeanDefinitionRegistrar {
@Override
public void registerBeanDefinitions(AnnotationMetadata importingClassMetadata, BeanDefinitionRegistry registry) {
String clsName = EnableCustomSwagger.class.getName();
AnnotationAttributes attrs = AnnotationAttributes.fromMap(importingClassMetadata.getAnnotationAttributes(clsName, false));
if (!attrs.getString("path").equals("")) {
String path = attrs.getString("path");
ResourceLoader loader = new DefaultResourceLoader();
Resource resource = loader.getResource(path);
// you can get the value from your property files
}
// how can I get the properties here? they are null at this point
// properties.getDockets().forEach((docketName, docketProperties) -> {
// BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(Docket.class);
// builder.addConstructorArgValue(docketProperties.getType());
// builder.addConstructorArgValue(docketProperties.getType());
// registry.registerBeanDefinition(docketName, builder.getRawBeanDefinition());
// });
}
}
Application
@SpringBootApplication
@EnableCustomSwagger(path="classpath:docklet.properties")
public class Application {
}
For Spring Boot 2.x:
import org.springframework.boot.context.properties.bind.Binder;
public class MultipleDataSourceComponentRegistrar implements ImportBeanDefinitionRegistrar, EnvironmentAware {
...
private Environment environment;
@Override
public void setEnvironment(Environment environment) {
this.environment = environment;
}
@Override
public void registerBeanDefinitions(AnnotationMetadata importingClassMetadata, BeanDefinitionRegistry registry) {
ConfigurationProperties annotationCp = MultipleDataSourceSetProperties.class.getAnnotation(ConfigurationProperties.class);
MultipleDataSourceSetProperties properties = Binder.get(environment).bind(annotationCp.prefix(), MultipleDataSourceSetProperties.class).get();
}
...
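For completeness, the properties class bound above would look roughly like this; the class name comes from the snippet, but the prefix and fields are assumptions:
import java.util.HashMap;
import java.util.Map;
import org.springframework.boot.context.properties.ConfigurationProperties;

@ConfigurationProperties(prefix = "app.datasources") // prefix is an assumption
public class MultipleDataSourceSetProperties {

    // One entry per data source; keys and values are illustrative
    private Map<String, String> dataSources = new HashMap<>();

    public Map<String, String> getDataSources() {
        return dataSources;
    }

    public void setDataSources(Map<String, String> dataSources) {
        this.dataSources = dataSources;
    }
}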
I have a monitoring app in which I am running a fixedRate task. It pulls in a config parameter configured in Consul. I want to pick up updated configuration, so I added @RefreshScope. But as soon as I update the config value in Consul, the fixedRate task stops running.
@Service
@RefreshScope
public class MonitorService {
@Autowired
private AppConfig appConfig;
@PostConstruct
public void postConstRun() {
System.out.println(appConfig.getMonitorConfig());
}
@Scheduled(fixedRate = 1000)
public void scheduledMonitorScan() {
System.out.println("MonitorConfig:" + appConfig.getMonitorConfig());
}
}
AppConfig class just has a single String parameter:
@Configuration
@Getter
@Setter
public class AppConfig {
@Value("${monitor-config:default value}")
private String monitorConfig;
}
As soon as I update the value in Consul, the scheduled task just stops running (the output from the scheduledMonitorScan method stops showing up).
I successfully get and override the values from the Consul config server using RefreshScopeRefreshedEvent:
import java.text.SimpleDateFormat;
import java.util.Date;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.cloud.context.scope.refresh.RefreshScopeRefreshedEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
@Component
@RefreshScope
public class AlertSchedulerCron implements ApplicationListener<RefreshScopeRefreshedEvent> {
private SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
#Value("${pollingtime}")
private String pollingtime;
/*
* @Value("${interval}") private String interval;
*/
@Scheduled(cron = "${pollingtime}")
//@Scheduled(fixedRateString = "${interval}" )
public void task() {
System.out.println(pollingtime);
System.out.println("Scheduler (cron expression) task with duration : " + sdf.format(new Date()));
}
@Override
public void onApplicationEvent(RefreshScopeRefreshedEvent event) {
// TODO Auto-generated method stub
}
}
Here's how we've solved this issue.
/**
* Listener for Spring's lifecycle events that revives Scheduler beans when Spring's
* scope is refreshed.
* <p>
* Spring is able to restart beans when we change their properties. Such beans are
* marked with the RefreshScope annotation. To make this work, Spring creates
* <b>lazy</b> proxies and injects them instead of the real objects. The issue with
* a scope refresh is that, right after the refresh, someone has to call a method on
* such a lazy proxy for it to actually be instantiated again.
* <p>
* That creates a tricky case with Schedulers, because there is no bean which
* directly calls anything on any Scheduler. A Scheduler's lifecycle is to start a
* few threads upon instantiation and schedule tasks. No other bean needs
* anything from them.
* <p>
* To overcome this, we had to create an artificial method on the Schedulers and call
* it when there is a scope refresh event. This actually instantiates them again.
*/
@RequiredArgsConstructor
public class RefreshScopeListener implements ApplicationListener<RefreshScopeRefreshedEvent> {
private final List<RefreshScheduler> refreshSchedulers;
@Override
public void onApplicationEvent(RefreshScopeRefreshedEvent event) {
refreshSchedulers.forEach(RefreshScheduler::materializeAfterRefresh);
}
}
So we've defined an interface that does nothing in particular, but allows us to call into a refreshed job.
public interface RefreshScheduler {
/**
* Called after a context refresh to force scheduler bean initialization
*/
default void materializeAfterRefresh() {
}
}
And here is the actual job, whose from.properties parameter can be refreshed.
public class AJob implements RefreshScheduler {
@Scheduled(cron = "${from.properties}")
public void aTask() {
// do something useful
}
}
UPDATED:
Of course, the AJob bean must be marked with @RefreshScope in a @Configuration class:
@Configuration
@EnableScheduling
public class SchedulingConfiguration {
@Bean
@RefreshScope
public AJob aJob() {
return new AJob();
}
}
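The RefreshScopeListener itself also has to be registered as a bean so it receives the event; the wiring below is my assumption, since it is not shown in the answer above:
import java.util.List;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RefreshSchedulerListenerConfiguration {

    // All RefreshScheduler beans (such as aJob above) are collected here, so a
    // scope refresh touches each of them and re-instantiates the lazy proxies.
    @Bean
    public RefreshScopeListener refreshScopeListener(List<RefreshScheduler> refreshSchedulers) {
        return new RefreshScopeListener(refreshSchedulers);
    }
}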
I have worked around this kind of scenario by implementing the SchedulingConfigurer interface.
Here I am dynamically updating the "scheduler.interval" property from an external property file, and the scheduler keeps working even after an actuator refresh, as I am not using @RefreshScope anymore.
Hope this might help in your case as well.
public class MySchedulerImpl implements SchedulingConfigurer {
@Autowired
private Environment env;
@Bean(destroyMethod = "shutdown")
public Executor taskExecutor() {
return Executors.newScheduledThreadPool(10);
}
@Override
public void configureTasks(final ScheduledTaskRegistrar taskRegistrar) {
taskRegistrar.setScheduler(this.taskExecutor());
taskRegistrar.addTriggerTask(() -> {
//put your code here that to be scheduled
}, triggerContext -> {
final Calendar nextExecutionTime = new GregorianCalendar();
final Date lastActualExecutionTime = triggerContext.lastActualExecutionTime();
if (lastActualExecutionTime == null) {
nextExecutionTime.setTime(new Date());
} else {
nextExecutionTime.setTime(lastActualExecutionTime);
nextExecutionTime.add(Calendar.MILLISECOND, env.getProperty("scheduler.interval", Integer.class));
}
return nextExecutionTime.getTime();
});
}
}
My solution consists of listening to EnvironmentChangeEvent:
@Configuration
public class SchedulingSpringConfig implements ApplicationListener<EnvironmentChangeEvent>, SchedulingConfigurer {
private static final Logger LOGGER = LoggerFactory.getLogger(SchedulingSpringConfig.class);
private final DemoProperties demoProperties;
public SchedulingSpringConfig(DemoProperties demoProperties) {
this.demoProperties = demoProperties;
}
@Override
public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
LOGGER.info("Configuring scheduled task with cron expression: {}", demoProperties.getCronExpression());
taskRegistrar.addTriggerTask(triggerTask());
taskRegistrar.setTaskScheduler(taskScheduler());
}
@Bean
public TriggerTask triggerTask() {
return new TriggerTask(this::work, cronTrigger());
}
private void work() {
LOGGER.info("Doing work!");
}
@Bean
@RefreshScope
public CronTrigger cronTrigger() {
return new CronTrigger(demoProperties.getCronExpression());
}
@Bean
public ThreadPoolTaskScheduler taskScheduler() {
return new ThreadPoolTaskScheduler();
}
@Override
public void onApplicationEvent(EnvironmentChangeEvent event) {
if (event.getKeys().contains("demo.config.cronExpression")) {
ScheduledTasksRefresher scheduledTasksRefresher = new ScheduledTasksRefresher(triggerTask());
scheduledTasksRefresher.afterPropertiesSet();
}
}
}
Then I use the ContextLifecycleScheduledTaskRegistrar to recreate the task.
public class ScheduledTasksRefresher extends ContextLifecycleScheduledTaskRegistrar {
private final TriggerTask triggerTask;
ScheduledTasksRefresher(TriggerTask triggerTask) {
this.triggerTask = triggerTask;
}
@Override
public void afterPropertiesSet() {
super.destroy();
super.addTriggerTask(triggerTask);
super.afterSingletonsInstantiated();
}
}
Properties definition:
@ConfigurationProperties(prefix = "demo.config", ignoreUnknownFields = false)
public class DemoProperties {
private String cronExpression;
public String getCronExpression() {
return cronExpression;
}
public void setCronExpression(String cronExpression) {
this.cronExpression = cronExpression;
}
}
Main definition:
@SpringBootApplication
@EnableConfigurationProperties(DemoProperties.class)
@EnableScheduling
public class DemoApplication {
public static void main(String[] args) {
SpringApplication.run(DemoApplication.class, args);
}
}
Based on the previous answers, I added the following interface and used it on @RefreshScope-annotated beans:
public interface RefreshScopeScheduled {
@EventListener(RefreshScopeRefreshedEvent.class)
default void onApplicationEvent() { /*do nothing*/ }
}
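For illustration, a refresh-scoped scheduled bean then only has to implement the interface; the class and property names below are made up:
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
@RefreshScope
public class ReportJob implements RefreshScopeScheduled {

    // Re-resolved after each refresh; the inherited @EventListener method keeps
    // this lazy proxy alive across RefreshScopeRefreshedEvent
    @Scheduled(cron = "${report.cron}")
    public void run() {
        // periodic work goes here
    }
}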
I have a Spring Boot (1.4.0) application, which, during initialization, starts a 2nd context (I need that because I have to publish a web service using a specific kind of authorization while the parent context publishes a different service).
I created a child context like so:
@Configuration
@ConditionalOnClass({Servlet.class, DispatcherServlet.class})
@ConditionalOnWebApplication
public class ChildContextConfiguration implements ApplicationContextAware, ApplicationListener<ContextRefreshedEvent> {
private final Logger logger = LoggerFactory.getLogger(ChildContextConfiguration.class);
private ApplicationContext applicationContext;
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.applicationContext = applicationContext;
}
private void createChildContext() {
final AnnotationConfigEmbeddedWebApplicationContext childContext = new AnnotationConfigEmbeddedWebApplicationContext(ChildConfiguration.class);
childContext.setParent(this.applicationContext);
childContext.setId(this.applicationContext.getId() + ":child");
}
@Override
public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
logger.info("creating child context");
createChildContext();
}
}
The child context's configuration class looks like this:
@Configuration
@ComponentScan(basePackages = {"com.example.child"})
@PropertySource("file:some-config.properties")
@ConfigurationProperties(prefix = "child")
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
public class ChildConfiguration {
private Integer port;
private String keyStore;
private String keyStorePass;
private String keyPass;
private String trustStore;
private String trustStorePass;
private String packageBase;
public void setPort(Integer port) {
this.port = port;
}
public void setKeyStore(String keyStore) {
this.keyStore = keyStore;
}
public void setKeyStorePass(String keyStorePass) {
this.keyStorePass = keyStorePass;
}
public void setKeyPass(String keyPass) {
this.keyPass = keyPass;
}
public void setTrustStore(String trustStore) {
this.trustStore = trustStore;
}
public void setTrustStorePass(String trustStorePass) {
this.trustStorePass = trustStorePass;
}
public void setPackageBase(String packageBase) {
this.packageBase = packageBase;
}
@Bean
public Jaxb2Marshaller swpMarshaller() {
Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
marshaller.setPackagesToScan(packageBase);
return marshaller;
}
@Bean
public Unmarshaller swpUnmarshaller() throws JAXBException {
JAXBContext jaxbContext = JAXBContext.newInstance(packageBase);
return jaxbContext.createUnmarshaller();
}
@Bean
public Filter encodingFilter() {
CharacterEncodingFilter encodingFilter = new CharacterEncodingFilter();
encodingFilter.setEncoding("UTF-8");
return encodingFilter;
}
@Bean
public ServerProperties serverProperties() {
ServerProperties props = new ServerProperties();
props.setPort(port);
props.setSsl(ssl());
return props;
}
private Ssl ssl() {
Ssl ssl = new Ssl();
ssl.setEnabled(true);
ssl.setKeyStore(keyStore);
ssl.setKeyStorePassword(keyStorePass);
ssl.setKeyStoreType("JKS");
ssl.setKeyPassword(keyPass);
ssl.setTrustStore(trustStore);
ssl.setTrustStorePassword(trustStorePass);
ssl.setClientAuth(Ssl.ClientAuth.NEED);
return ssl;
}
}
So far, this works. But when I try to autowire a bean from the parent context, I get an error stating that there is no candidate.
Another interesting thing: when I inject the (child) context into one of my child context's beans using the ApplicationContextAware interface, the getParent() property of that context is null at that point.
What I have done now is implementing getter functions like these:
private SomeBean getSomeBean() {
if (this.someBean == null) {
this.someBean = applicationContext.getParent().getBean(SomeBean.class);
}
return this.someBean;
}
To summarize this: During construction of the child context's beans, the parent context is not set, so I cannot use autowire.
Is there some way to make autowire work with my setup?
The constructor that takes the classes to register refreshes the context internally. Try registering the class and refreshing manually, after setting the parent context:
private void createChildContext() {
final AnnotationConfigEmbeddedWebApplicationContext childContext = new AnnotationConfigEmbeddedWebApplicationContext();
childContext.setParent(this.applicationContext);
childContext.setId(this.applicationContext.getId() + ":child");
childContext.register(ChildConfiguration.class);
childContext.refresh();
}
I am using Spring boot and Spring batch. I have defined more than one job.
I am trying to build a JUnit test for a specific step within a job.
Therefore I am using the JobLauncherTestUtils library.
When I run my test case I always get NoUniqueBeanDefinitionException.
This is my test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class})
public class ProcessFileJobTest {
@Configuration
@EnableBatchProcessing
static class TestConfig {
@Autowired
private JobBuilderFactory jobBuilder;
@Autowired
private StepBuilderFactory stepBuilder;
@Bean
public JobLauncherTestUtils jobLauncherTestUtils() {
JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
jobLauncherTestUtils.setJob(jobUnderTest());
return jobLauncherTestUtils;
}
@Bean
public Job jobUnderTest() {
return jobBuilder.get("job-under-test")
.start(processIdFileStep())
.build();
}
@Bean
public Step processIdFileStep() {
return stepBuilder.get("processIdFileStep")
.<PushItemDTO, PushItemDTO>chunk(1) //important to be one in this case to commit after every line read
.reader(reader(null))
.processor(processor(null, null, null, null))
.writer(writer())
// .faultTolerant()
// .skipLimit(10) //default is set to 0
// .skip(MySQLIntegrityConstraintViolationException.class)
.build();
}
@Bean
@Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
public ItemStreamReader<PushItemDTO> reader(@Value("#{jobExecutionContext[filePath]}") String filePath) {
...
return itemReader;
}
@Bean
@Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
public ItemProcessor<PushItemDTO, PushItemDTO> processor(@Value("#{jobParameters[pushMessage]}") String pushMessage,
@Value("#{jobParameters[jobId]}") String jobId,
@Value("#{jobParameters[taskId]}") String taskId,
@Value("#{jobParameters[refId]}") String refId)
{
return new PushItemProcessor(pushMessage,jobId,taskId,refId);
}
@Bean
public LineMapper<PushItemDTO> lineMapper() {
DefaultLineMapper<PushItemDTO> lineMapper = new DefaultLineMapper<PushItemDTO>();
...
return lineMapper;
}
@Bean
public ItemWriter writer() {
return new someWriter();
}
}
@Autowired
protected JobLauncher jobLauncher;
@Autowired
JobLauncherTestUtils jobLauncherTestUtils;
@Test
public void processIdFileStepTest1() throws Exception {
JobParameters jobParameters = new JobParametersBuilder().addString("filePath", "C:\\etc\\files\\2015_02_02").toJobParameters();
JobExecution jobExecution = jobLauncherTestUtils.launchStep("processIdFileStep",jobParameters);
}
and that's the exception:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.batch.core.Job] is defined: expected single matching bean but found 3: jobUnderTest,executeToolJob,processFileJob
Any idea?
Thanks.
Added the BatchConfiguration class:
package com.mycompany.notification_processor_service.batch.config;
import com.mycompany.notification_processor_service.common.config.CommonConfiguration;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.*;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import javax.sql.DataSource;
@ComponentScan("com.mycompany.notification_processor_service.batch")
@PropertySource("classpath:application.properties")
@Configuration
@Import({CommonConfiguration.class})
@ImportResource({"classpath:applicationContext-pushExecuterService.xml"/*,"classpath:si/integration-context.xml"*/})
public class BatchConfiguration {
#Value("${database.driver}")
private String databaseDriver;
#Value("${database.url}")
private String databaseUrl;
#Value("${database.username}")
private String databaseUsername;
#Value("${database.password}")
private String databasePassword;
#Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(databaseDriver);
dataSource.setUrl(databaseUrl);
dataSource.setUsername(databaseUsername);
dataSource.setPassword(databasePassword);
return dataSource;
}
@Bean
public JdbcTemplate jdbcTemplate(DataSource dataSource) {
return new JdbcTemplate(dataSource);
}
}
and this is the CommonConfiguration class:
@ComponentScan("com.mycompany.notification_processor_service")
@Configuration
@EnableJpaRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.jpa"})
@EnableCouchbaseRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.couchbase"})
@EntityScan({"com.mycompany.notification_processor_service"})
@EnableAutoConfiguration
@EnableTransactionManagement
@EnableAsync
public class CommonConfiguration {
}
I had the same issue, and the easier way is injecting the job in the setter of JobLauncherTestUtils, as Mariusz explained in the Spring JIRA:
@Bean
public JobLauncherTestUtils getJobLauncherTestUtils() {
return new JobLauncherTestUtils() {
@Override
@Autowired
public void setJob(@Qualifier("ncsvImportJob") Job job) {
super.setJob(job);
}
};
}
So I see the jobUnderTest bean. Somewhere in all those imports, you're importing the two other jobs as well. I see your BatchConfiguration class imports other stuff as well as you having component scanning turned on. Carefully trace through all your configurations. Something is picking up the definitions for those beans.
I also ran into this issue and couldn't get JobLauncherTestUtils to work properly. It might be caused by this issue.
I ended up autowiring the SimpleJobLauncher and my Job into the unit test, and simply ran:
launcher.run(importAccountingDetailJob, params);
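A minimal version of such a test might look like this; the job bean name and parameters are assumptions, and the launcher is injected through the JobLauncher interface, which works the same way for launching:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class})
public class ImportAccountingDetailJobTest {

    @Autowired
    private JobLauncher launcher;

    @Autowired
    @Qualifier("importAccountingDetailJob") // pick the one job you want to run
    private Job importAccountingDetailJob;

    @Test
    public void runsJob() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("filePath", "C:\\etc\\files\\2015_02_02")
                .toJobParameters();
        launcher.run(importAccountingDetailJob, params);
    }
}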
An old post, but I thought of providing my solution as well.
In this case I am automatically registering a JobLauncherTestUtils per job:
@Configuration
public class TestConfig {
private static final Logger logger = LoggerFactory.getLogger(TestConfig.class);
@Autowired
private AbstractAutowireCapableBeanFactory beanFactory;
@Autowired
private List<Job> jobs;
@PostConstruct
public void registerServices() {
jobs.forEach(j->{
JobLauncherTestUtils u = create(j);
final String name = j.getName() + "TestUtils";
beanFactory.registerSingleton(name,u);
beanFactory.autowireBean(u);
logger.info("Registered JobLauncherTestUtils {}",name);
});
}
private JobLauncherTestUtils create(final Job j) {
return new MyJobLauncherTestUtils(j);
}
private static class MyJobLauncherTestUtils extends JobLauncherTestUtils {
MyJobLauncherTestUtils(Job j) {
this.setJob(j);
}
@Override // to remove @Autowired from the base class
public void setJob(Job job) {
super.setJob(job);
}
}
}
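A test can then pull in the generated utils bean by name; a small sketch, assuming a job called processFileJob (the bean name follows the jobName + "TestUtils" convention from registerServices()):
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class, TestConfig.class})
public class ProcessFileJobStepTest {

    @Autowired
    @Qualifier("processFileJobTestUtils")
    private JobLauncherTestUtils processFileJobTestUtils;

    @Test
    public void runsStep() throws Exception {
        processFileJobTestUtils.launchStep("processIdFileStep");
    }
}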
When I call a service directly in my main() I can query the database and things work fine. But when a Jersey request comes in and maps the JSON to NewJobRequest, I can't use my service because the @Autowired injection failed.
My app:
public class Main {
public static final URI BASE_URI = getBaseURI();
private static URI getBaseURI() {
return UriBuilder.fromUri("http://localhost/").port(9998).build();
}
protected static HttpServer startServer() throws IOException {
ResourceConfig rc = new PackagesResourceConfig("com.production.api.resources");
rc.getFeatures()
.put(JSONConfiguration.FEATURE_POJO_MAPPING, true);
return GrizzlyServerFactory.createHttpServer(BASE_URI, rc);
}
public static void main(String[] args) throws IOException {
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Config.class);
//if this is uncommented, it'll successfully query the database
//VendorService vendorService = (VendorService)ctx.getBean("vendorService");
//Vendor vendor = vendorService.findByUUID("asdf");
HttpServer httpServer = startServer();
System.out.println(String.format("Jersey app started with WADL available at " + "%sapplication.wadl\nTry out %shelloworld\nHit enter to stop it...", BASE_URI, BASE_URI));
System.in.read();
httpServer.stop();
}
}
My Resource (controller):
@Component
@Path("/job")
public class JobResource extends GenericResource {
#Path("/new")
#POST
public String New(NewJobRequest request) {
return "done";
}
}
Jersey is mapping the JSON post to:
@Component
public class NewJobRequest {
@Autowired
private VendorService vendorService;
@JsonCreator
public NewJobRequest(Map<String, Object> request) {
//uh oh, can't do anything here because @Autowired failed and vendorService is null
}
}
VendorService:
@Service
public class VendorService extends GenericService<VendorDao> {
public Vendor findByUUID(String uuid) {
Vendor entity = null;
try {
return (Vendor)em.createNamedQuery("Vendor.findByUUID")
.setParameter("UUID", uuid)
.getSingleResult();
} catch (Exception ex) {
return null;
}
}
}
@Service
public class GenericService<T extends GenericDao> {
private static Logger logger = Logger.getLogger(Logger.class.getName());
@PersistenceContext(unitName = "unit")
public EntityManager em;
protected T dao;
@Transactional
public void save(T entity) {
dao.save(entity);
}
}
My service config:
@Configuration
public class Config {
@Bean
public VendorService vendorService() {
return new VendorService();
}
}
My main config:
@Configuration
@ComponentScan(basePackages = {
"com.production.api",
"com.production.api.dao",
"com.production.api.models",
"com.production.api.requests",
"com.production.api.requests.job",
"com.production.api.resources",
"com.production.api.services"
})
@Import({
com.production.api.services.Config.class,
com.production.api.dao.Config.class,
com.production.api.requests.Config.class
})
@PropertySource(value= "classpath:/META-INF/application.properties")
@EnableTransactionManagement
public class Config {
private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
private static final String PROPERTY_NAME_DATABASE_USER = "db.user";
private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
private static final String PROPERTY_NAME_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";
@Resource
Environment environment;
@Bean
public DataSource dataSource() {
MysqlDataSource dataSource = new MysqlDataSource();
dataSource.setUrl(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
dataSource.setUser(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_USER));
dataSource.setPassword(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
return dataSource;
}
@Bean
public JpaTransactionManager transactionManager() throws ClassNotFoundException {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactoryBean().getObject());
return transactionManager;
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean() throws ClassNotFoundException {
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(dataSource());
entityManagerFactoryBean.setPersistenceUnitName("unit");
entityManagerFactoryBean.setPackagesToScan(environment.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);
Properties jpaProperties = new Properties();
jpaProperties.put(PROPERTY_NAME_HIBERNATE_DIALECT, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
jpaProperties.put(PROPERTY_NAME_HIBERNATE_FORMAT_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_FORMAT_SQL));
jpaProperties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
entityManagerFactoryBean.setJpaProperties(jpaProperties);
return entityManagerFactoryBean;
}
}
The @Path and @POST annotations are JAX-RS, not Spring. So the container is instantiating your endpoints on its own, without any knowledge of Spring beans. You are most likely not getting any Spring logging because Spring is not being used at all.
I've figured out the issue and blogged about it here: http://blog.benkuhl.com/2013/02/how-to-access-a-service-layer-on-a-jersey-json-object/
In the meantime, I'm also going to post the solution here:
I need to tap into the bean that Spring already created, so I used Spring's ApplicationContextAware:
public class ApplicationContextProvider implements ApplicationContextAware {
private static ApplicationContext applicationContext;
public static ApplicationContext getApplicationContext() {
return applicationContext;
}
public void setApplicationContext (ApplicationContext applicationContext) {
this.applicationContext = applicationContext;
}
}
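Note that the provider only receives the context if Spring instantiates it, so it has to be registered as a bean itself; how it is registered is not shown in the post, but one way would be:
@Configuration
public class ContextProviderConfig {

    // Registering the provider makes Spring call setApplicationContext() on startup
    @Bean
    public ApplicationContextProvider applicationContextProvider() {
        return new ApplicationContextProvider();
    }
}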
And then I used that static context reference within the object being mapped, so I can perform lookups in the service:
public class NewJobRequest {
private VendorService vendorService;
public NewJobRequest() {
vendorService = (VendorService) ApplicationContextProvider.getApplicationContext().getBean("vendorService");
}
@JsonCreator
public NewJobRequest(Map<String, Object> request) {
setVendor(vendorService.findById(request.get("vendorId")));
}
....
}