Spring `@Autowired` field is `null` even though it works fine in other classes - spring

My Spring @Autowired field is null, even though autowiring works fine in other classes.
public class SendRunner implements Runnable {

    private String senderAddress;

    @Autowired
    private SubscriberService subscriberService;

    public SendRunner(String senderAddress) {
        this.senderAddress = senderAddress;
    }

    @Override
    public void run() {
        sendRequest();
    }

    private void sendRequest() {
        try {
            HashMap<String, String> dataMap = new HashMap<>();
            dataMap.put("subscriberId", senderAddress);
            HttpEntity<?> entity = new HttpEntity<Object>(dataMap, httpHeaders);
            Subscriber subscriber = subscriberService.getSubscriberByMsisdn(senderAddress);
        } catch (Exception e) {
            logger.error("Error occurred while trying to send api request", e);
        }
    }
}
This class is also declared as a bean in the dispatcher servlet:
<bean id="SendRunner" class="sms.dating.messenger.connector.SendRunner">
</bean>
Here I'm getting a NullPointerException for subscriberService. What could be the possible reason for this? Thanks in advance.

Can you please try the code snippet below?
@Configuration
public class Someclass {

    @Autowired
    private SubscriberService subscriberService;

    Thread subscriberThread = new Thread() {
        @Override
        public void run() {
            try {
                HashMap<String, String> dataMap = new HashMap<>();
                dataMap.put("subscriberId", senderAddress);
                HttpEntity<?> entity = new HttpEntity<Object>(dataMap, httpHeaders);
                Subscriber subscriber = subscriberService.getSubscriberByMsisdn(senderAddress);
            } catch (Exception e) {
                logger.error("Error occurred while trying to send api request", e);
            }
        }
    };
}

Can you please annotate your SendRunner class with @Component or @Service and include the SendRunner package in the component-scan packages?
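For illustration, a minimal sketch of that suggestion (an assumption on my part: senderAddress is supplied through a setter rather than the constructor, so Spring can instantiate the bean with a no-arg constructor):

@Component
public class SendRunner implements Runnable {

    @Autowired
    private SubscriberService subscriberService;

    private String senderAddress;

    // hypothetical setter: the caller sets the address before submitting the task
    public void setSenderAddress(String senderAddress) {
        this.senderAddress = senderAddress;
    }

    @Override
    public void run() {
        sendRequest(); // same sendRequest() as in the question
    }

    private void sendRequest() {
        // unchanged body from the question
    }
}

together with a component-scan entry such as <context:component-scan base-package="sms.dating.messenger.connector" /> in the dispatcher servlet XML.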

Your bean is not in the Spring-managed context; the possible reasons are below.
The package sms.dating.messenger.connector is not covered by component scanning.
You are moving out of the Spring context by creating the object with new (see below); this way you will not get the autowired fields.
SendRunner sendRunner = new SendRunner();
sendRunner.sendRequest();
Just check how I implemented it. Hope this will help.
@RestController
public class RestRequest {

    @Autowired
    SendRunner sendRunner;

    @RequestMapping("/api")
    public void Uri() {
        sendRunner.start();
    }
}
SendRunner class
@Service
public class SendRunner extends Thread {

    @Autowired
    private SubscriberService subscriberService;

    @Override
    public void run() {
        SendRequest();
    }

    private void SendRequest() {
        System.out.println("Object is " + subscriberService);
        String senderAddress = "address";
        subscriberService.getSubscriberByMsisdn(senderAddress);
    }
}
Below are the logs printed when I hit the REST API.
Object is com.example.demo.SubscriberService#40f33492

Related

Spring Mongo Populator one by one

I'm using MongoDB and Spring over Kotlin and I want my application to populate a MongoDB collection upon startup (and clean it every time it starts).
My question is: how can I populate the data one by one, in order to be fault tolerant in case some of the data I'm populating with is problematic?
My code:
@Configuration
class IndicatorPopulator {

    @Value("classpath:indicatorData.json")
    private lateinit var data: Resource

    @Autowired
    private lateinit var indicatorRepository: IndicatorRepository

    @Bean
    @Autowired
    fun repositoryPopulator(objectMapper: ObjectMapper): Jackson2RepositoryPopulatorFactoryBean {
        val factory = Jackson2RepositoryPopulatorFactoryBean()
        indicatorRepository.deleteAll()
        factory.setMapper(objectMapper)
        factory.setResources(arrayOf(data))
        return factory
    }
}
What I am looking for is something like:
@Bean
@Autowired
fun repositoryPopulator(objectMapper: ObjectMapper): Jackson2RepositoryPopulatorFactoryBean {
    val factory = Jackson2RepositoryPopulatorFactoryBean()
    indicatorRepository.deleteAll()
    factory.setMapper(objectMapper)
    val arrayOfResources: Array<Resource> = arrayOf(data)
    for (resource in arrayOfResources) {
        try {
            factory.setResources(resource)
        } catch (e: Exception) {
            logger.log(e.message)
        }
    }
    return factory
}
Any idea on how to do something like that would be helpful...
Thanks in advance.
There is no built-in support for this, but you can easily provide it by tweaking a few classes.
Add Custom Jackson 2 Reader
public class CustomJackson2ResourceReader implements ResourceReader {

    private static final Logger logger = LoggerFactory.getLogger(CustomJackson2ResourceReader.class);

    private final Jackson2ResourceReader resourceReader = new Jackson2ResourceReader();

    @Override
    public Object readFrom(Resource resource, ClassLoader classLoader) throws Exception {
        Object result;
        try {
            result = resourceReader.readFrom(resource, classLoader);
        } catch (Exception e) {
            logger.warn("Can't read from resource", e);
            return Collections.EMPTY_LIST;
        }
        return result;
    }
}
Add Custom Jackson 2 Populator
public class CustomJackson2RepositoryPopulatorFactoryBean extends Jackson2RepositoryPopulatorFactoryBean {

    @Override
    protected ResourceReader getResourceReader() {
        return new CustomJackson2ResourceReader();
    }
}
Configuration
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    public AbstractRepositoryPopulatorFactoryBean repositoryPopulator(ObjectMapper objectMapper, KeyValueRepository keyValueRepository) {
        Jackson2RepositoryPopulatorFactoryBean factory = new CustomJackson2RepositoryPopulatorFactoryBean();
        keyValueRepository.deleteAll();
        factory.setMapper(objectMapper);
        factory.setResources(new Resource[] { new ClassPathResource("badclassname.json"), new ClassPathResource("good.json"), new ClassPathResource("malformatted.json") });
        return factory;
    }
}
I've uploaded a working example here.
Using Sagar's reader and factory, I adjusted it to fit my needs (Kotlin, and reading all resources from the same JSON file), which got me this answer:
@Configuration
class IndicatorPopulator {

    @Value("classpath:indicatorData.json")
    private lateinit var data: Resource

    @Autowired
    private lateinit var indicatorRepository: IndicatorRepository

    @Autowired
    @Bean
    fun repositoryPopulator(objectMapper: ObjectMapper): Jackson2RepositoryPopulatorFactoryBean {
        val factory: Jackson2RepositoryPopulatorFactoryBean = CustomJackson2RepositoryPopulatorFactoryBean()
        factory.setMapper(objectMapper)
        // inject your Jackson ObjectMapper if you need to customize it
        indicatorRepository.deleteAll()
        val resources = mutableListOf<Resource>()
        val readTree: ArrayNode = objectMapper.readTree(data.inputStream) as ArrayNode
        for (node in readTree) {
            resources.add(InputStreamResource(node.toString().byteInputStream()))
        }
        factory.setResources(resources.toTypedArray())
        return factory
    }
}

Bean not getting overridden in Spring Boot

I am trying to write and test an application that uses spring-cloud with Azure Functions, following this tutorial:
https://github.com/markusgulden/aws-tutorials/tree/master/spring-cloud-function/spring-cloud-function-azure/src/main/java/de/margul/awstutorials/springcloudfunction/azure
I am trying to write a test case and override the bean.
Here is the application class with the function and the handler bean:
@SpringBootApplication
@ComponentScan(basePackages = { "com.package" })
public class DataFunctions extends AzureSpringBootRequestHandler<GenericMessage<Optional<String>>, Data> {

    @FunctionName("addData")
    public HttpResponseMessage addDataRun(
            @HttpTrigger(name = "add", methods = {
                    HttpMethod.POST }, authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<String>> request,
            final ExecutionContext context) throws JsonParseException, JsonMappingException, IOException {
        context.getLogger().info("Java HTTP trigger processed a POST request.");
        try {
            handleRequest(new GenericMessage<Optional<String>>(request.getBody()), context);
        } catch (ServiceException ex) {
            ErrorMessage em = new ErrorMessage();
            return request.createResponseBuilder(handleException(ex, em)).body(em).build();
        }
        return request.createResponseBuilder(HttpStatus.CREATED).build();
    }

    @Autowired
    MyService mService;

    @Bean
    public Consumer<GenericMessage<Optional<String>>> addData() {
        ObjectMapper mapper = new ObjectMapper();
        return req -> {
            SomeModel fp = null;
            try {
                fp = mapper.readValue(req.getPayload().get(), SomeModel.class);
            } catch (Exception e) {
                throw new ServiceException(e);
            }
            mService.addData(fp);
        };
    }
}
I want to test by overriding the above bean.
Cosmos DB Spring configuration:
@Configuration
@EnableDocumentDbRepositories
public class CosmosDBConfig extends AbstractDocumentDbConfiguration {

    @Value("${cosmosdb.collection.endpoint}")
    private String uri;

    @Value("${cosmosdb.collection.key}")
    private String key;

    @Value("${cosmosdb.collection.dbname}")
    private String dbName;

    @Value("${cosmosdb.connect.directly}")
    private Boolean connectDirectly;

    @Override
    public DocumentDBConfig getConfig() {
        ConnectionPolicy cp = ConnectionPolicy.GetDefault();
        if (connectDirectly) {
            cp.setConnectionMode(ConnectionMode.DirectHttps);
        } else {
            cp.setConnectionMode(ConnectionMode.Gateway);
        }
        return DocumentDBConfig.builder(uri, key, dbName).connectionPolicy(cp).build();
    }
}
Here is the test configuration:
@TestConfiguration
@PropertySource(value = "classpath:application.properties", encoding = "UTF-8")
@Profile("test")
@Import({ DataFunctions.class })
public class TestConfig {

    @Bean(name = "addData")
    @Primary
    public Consumer<GenericMessage<Optional<String>>> addData() {
        return req -> {
            System.out.println("data mock");
        };
    }

    @Bean
    @Primary
    public DocumentDBConfig getConfig() {
        return Mockito.mock(DocumentDBConfig.class);
    }
}
Finally, the test class:
@RunWith(SpringRunner.class)
//@SpringBootTest // Enabling this gives an initialization error.
@ActiveProfiles("test")
public class TempTest {

    @InjectMocks
    DataFunctions func;

    @Mock
    MyService mService;

    @Before
    public void setup() {
        MockitoAnnotations.initMocks(this);
    }

    private Optional<String> createRequestString(final String res) throws IOException {
        InputStream iStream = TempTest.class.getResourceAsStream(res);
        String charset = "UTF-8";
        try (BufferedReader br = new BufferedReader(new InputStreamReader(iStream, charset))) {
            return Optional.of(br.lines().collect(Collectors.joining(System.lineSeparator())));
        }
    }

    @Test
    public void testHttpPostTriggerJava() throws Exception {
        @SuppressWarnings("unchecked")
        final HttpRequestMessage<Optional<String>> req = mock(HttpRequestMessage.class);
        final Optional<String> queryBody = createRequestString("/test-data.json");
        doNothing().when(mService).addData(Mockito.any(SomeModel.class));
        doReturn(queryBody).when(req).getBody();
        doAnswer(new Answer<HttpResponseMessage.Builder>() {
            @Override
            public HttpResponseMessage.Builder answer(InvocationOnMock invocation) {
                HttpStatus status = (HttpStatus) invocation.getArguments()[0];
                return new HttpResponseMessageMock.HttpResponseMessageBuilderMock().status(status);
            }
        }).when(req).createResponseBuilder(any(HttpStatus.class));
        final ExecutionContext context = mock(ExecutionContext.class);
        doReturn(Logger.getGlobal()).when(context).getLogger();
        doReturn("addData").when(context).getFunctionName();
        // Invoke
        final HttpResponseMessage ret = func.addDataRun(req, context);
        // Verify
        assertEquals(ret.getStatus(), HttpStatus.CREATED);
    }
}
In this case, instead of the addData bean from the test configuration, the actual bean from the DataFunctions class is called. The database connection is also created, when it should use the mocked bean from my test configuration. Can somebody please point out what is wrong in my test configuration?
I was able to resolve the first part, the Cosmos DB config loading, by marking it with:
@Configuration
@EnableDocumentDbRepositories
@Profile("!test")
public class CosmosDBConfig extends AbstractDocumentDbConfiguration {
    ...
}
I also had to mark the repository bean as optional in the service.
public class MyService {

    @Autowired(required = false)
    private MyRepository myRepo;
}
I didn't use any Spring Boot configuration other than this:
@ActiveProfiles("test")
public class FunctionTest {
    ...
}
For the second part, providing mock versions of the handlers, I simply made the test config class a Spring application, as below.
@SpringBootApplication
@ComponentScan(basePackages = { "com.boeing.da.helix.utm.traffic" })
@Profile("test")
public class TestConfiguration {

    public static void main(final String[] args) {
        SpringApplication.run(TestConfiguration.class, args);
    }

    @Bean(name = "addData")
    @Primary
    public Consumer<GenericMessage<Optional<String>>> addData() {
        return req -> {
            System.out.println("data mock");
        };
    }
}
and made use of this constructor from the Azure Functions library in Spring Cloud in my handler class:
public class AppFunctions
        extends AzureSpringBootRequestHandler<GenericMessage<Optional<String>>, List<Data>> {

    public AppFunctions(Class<?> configurationClass) {
        super(configurationClass);
    }
}

public AzureSpringBootRequestHandler(Class<?> configurationClass) {
    super(configurationClass);
}
Hope it helps someone.

Spring Autowired Shared Queue NullPointerException

I'm using Spring for the first time and am trying to implement a shared queue: a Kafka listener puts messages on the shared queue, and a ThreadManager will eventually do something multithreaded with the items it takes off the shared queue. Here is my current implementation:
The Listener:
@Component
public class Listener {

    @Autowired
    private QueueConfig queueConfig;

    private ExecutorService executorService;
    private List<Future> futuresThread1 = new ArrayList<>();

    public Listener() {
        Properties appProps = new AppProperties().get();
        this.executorService = Executors.newFixedThreadPool(Integer.parseInt(appProps.getProperty("listenerThreads")));
    }

    //TODO: how can I pass an approp into this annotation?
    @KafkaListener(id = "id0", topics = "bose.cdp.ingest.marge.boseaccount.normalized")
    public void listener(ConsumerRecord<?, ?> record) throws InterruptedException, ExecutionException {
        futuresThread1.add(executorService.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    queueConfig.blockingQueue().put(record);
                    // System.out.println(queueConfig.blockingQueue().take());
                } catch (Exception e) {
                    System.out.print(e.toString());
                }
            }
        }));
    }
}
The Queue:
@Configuration
public class QueueConfig {

    private Properties appProps = new AppProperties().get();

    @Bean
    public BlockingQueue<ConsumerRecord> blockingQueue() {
        return new ArrayBlockingQueue<>(
                Integer.parseInt(appProps.getProperty("blockingQueueSize"))
        );
    }
}
The ThreadManager:
@Component
public class ThreadManager {

    @Autowired
    private QueueConfig queueConfig;

    private int threads;

    public ThreadManager() {
        Properties appProps = new AppProperties().get();
        this.threads = Integer.parseInt(appProps.getProperty("threadManagerThreads"));
    }

    public void run() throws InterruptedException {
        ExecutorService executorService = Executors.newFixedThreadPool(threads);
        try {
            while (true) {
                queueConfig.blockingQueue().take();
            }
        } catch (Exception e) {
            System.out.print(e.toString());
            executorService.shutdownNow();
            executorService.awaitTermination(1, TimeUnit.SECONDS);
        }
    }
}
Lastly, the main thread where everything is started from:
@SpringBootApplication
public class SourceAccountListenerApp {

    public static void main(String[] args) {
        SpringApplication.run(SourceAccountListenerApp.class, args);
        ThreadManager threadManager = new ThreadManager();
        try {
            threadManager.run();
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
The problem
I can tell when running this in the debugger that the Listener is adding things to the queue. When the ThreadManager takes from the shared queue, it tells me the queue is null and I get an NPE. It seems like autowiring isn't working to connect the queue the Listener is using to the ThreadManager. Any help appreciated.
This is the problem:
ThreadManager threadManager = new ThreadManager();
Since you are creating the instance manually, you cannot use the DI provided by Spring.
One simple solution is to implement a CommandLineRunner, which will be executed after SourceAccountListenerApp is fully initialized:
@SpringBootApplication
public class SourceAccountListenerApp {

    public static void main(String[] args) {
        SpringApplication.run(SourceAccountListenerApp.class, args);
    }

    // Create the CommandLineRunner bean and inject ThreadManager
    @Bean
    CommandLineRunner runner(ThreadManager manager) {
        return args -> {
            manager.run();
        };
    }
}
You use Spring's programmatic, so-called 'JavaConfig' way of setting up Spring beans (classes annotated with @Configuration with methods annotated with @Bean). Usually at application startup Spring will call those @Bean methods under the hood and register the returned objects in its application context (if the scope is singleton - the default - this will happen only once!). There is no need to call those @Bean methods anywhere in your code directly; you must not, otherwise you will get a separate, fresh instance that possibly is not fully configured!
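To illustrate that point with a hypothetical snippet (not taken from the question):

// Wrong: bypasses the Spring context and builds a brand-new, unmanaged queue
BlockingQueue<ConsumerRecord> queue = new QueueConfig().blockingQueue();

// Right: let Spring hand you the managed singleton, e.g. as a constructor
// parameter of ThreadManager, as shown below.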
Instead, you need to inject the BlockingQueue<ConsumerRecord> that you 'configured' in your QueueConfig.blockingQueue() method into your ThreadManager. Since the queue seems to be a mandatory dependency for the ThreadManager to work, I'd let Spring inject it via constructor:
@Component
public class ThreadManager {

    private int threads;

    // add instance var for queue...
    private BlockingQueue<ConsumerRecord> blockingQueue;

    // you could add the @Autowired annotation to the BlockingQueue param,
    // but I believe it's not mandatory...
    public ThreadManager(BlockingQueue<ConsumerRecord> blockingQueue) {
        Properties appProps = new AppProperties().get();
        this.threads = Integer.parseInt(appProps.getProperty("threadManagerThreads"));
        this.blockingQueue = blockingQueue;
    }

    public void run() throws InterruptedException {
        ExecutorService executorService = Executors.newFixedThreadPool(threads);
        try {
            while (true) {
                this.blockingQueue.take();
            }
        } catch (Exception e) {
            System.out.print(e.toString());
            executorService.shutdownNow();
            executorService.awaitTermination(1, TimeUnit.SECONDS);
        }
    }
}
Just to clarify one more thing: by default the method name of a @Bean method is used by Spring as that bean's unique id (method name == bean id). So your method is called blockingQueue, which means your BlockingQueue<ConsumerRecord> instance will also be registered with the id blockingQueue in the application context. The new constructor parameter is also named blockingQueue and its type matches BlockingQueue<ConsumerRecord>. Simplified, that's one way Spring looks up and injects/wires dependencies.
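As a small aside (a hypothetical snippet, not part of the answer above): if the parameter name did not match the bean id, or if several queues were defined, you could still select the bean explicitly by id:

// @Qualifier picks the bean whose id is "blockingQueue"
public ThreadManager(@Qualifier("blockingQueue") BlockingQueue<ConsumerRecord> blockingQueue) {
    this.blockingQueue = blockingQueue;
}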

Field created in Spring component is not initialized with new keyword

I have a Spring component class annotated with @Component, and in it a ConcurrentHashMap field which is initialized in the component's constructor and used in a Spring stream listener:
@Component
public class FooService {

    private ConcurrentHashMap<Long, String> fooMap;

    public FooService() {
        fooMap = new ConcurrentHashMap<>();
    }

    @StreamListener(value = Sink.INPUT)
    private void handler(Foo foo) {
        fooMap.put(foo.id, foo.body);
    }
}
The listener handles messages sent by a REST controller. Can you tell me why I always get a NullPointerException at fooMap.put(...), with fooMap being null and not initialized?
EDIT:
After @OlegZhurakousky's answer I found out the problem is with an async method. When I add @Async on some method and add @EnableAsync, I can no longer use the private modifier for my @StreamListener method. Do you have an idea why, and how to fix it?
https://github.com/schwantner92/spring-cloud-stream-issue
Thanks.
Could you try using @PostConstruct instead of the constructor?
@PostConstruct
public void init() {
    this.fooMap = new ConcurrentHashMap<>();
}
@Denis Stephanov
When I say bare minimum, here is what I mean. Try this as a start; you'll see that the map is not null, and evolve your app from there.
@SpringBootApplication
@EnableBinding(Processor.class)
public class DemoApplication {

    private final Map<String, String> map;

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    public DemoApplication() {
        this.map = new HashMap<>();
    }

    @StreamListener(Processor.INPUT)
    public void sink(String string) {
        System.out.println(string);
    }
}
With Spring, everything has to be injected.
You need to declare a @Bean for the ConcurrentHashMap, which will be injected into your component. So create a configuration class like:
@Configuration
public class FooMapConfiguration {

    @Bean("myFooMap")
    public ConcurrentHashMap<Long, String> myFooMap() {
        return new ConcurrentHashMap<>();
    }
}
Then modify your Component:
@Component
public class FooService {

    @Autowired
    @Qualifier("myFooMap")
    private ConcurrentHashMap<Long, String> fooMap;

    public FooService() {
    }

    @StreamListener(value = Sink.INPUT)
    private void handler(Foo foo) {
        fooMap.put(foo.id, foo.body); // <= No more NPE here
    }
}

Overridden onMessage of MessageListener not getting called in Spring Kafka Consumer Unit Test

I am writing a Kafka consumer unit test and need to mock the service used by my KafkaConsumer so I can test the consumer independently. But the mock object of the service is not getting invoked; instead, Spring creates the original service class object and calls it, so my mock is never used.
KafkaConsumer:
@Slf4j
@Component
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
public class KafkaEventConsumer {

    private final MyService requestService;

    @KafkaListener(topics = "${kafka.topic:topic-name}")
    public void receive(@Payload String message) throws Exception {
        try {
            LOGGER.debug("Received message:{} ", message);
            ObjectMapper mapper = new ObjectMapper();
            ForecastRequest forecastRequest = mapper.readValue(message, ForecastRequest.class);
            JobDetail jobDetail = requestService.refreshForecasts(forecastRequest);
            if (jobDetail.getJobStatus() != JobStatus.complete) {
                LOGGER.error("Failed to Refresh Forecast for ProgramId-{}, JobId-{}, JobStatus-{}",
                        forecastRequest.getProgramId(), jobDetail.getJobId(), jobDetail.getJobStatus());
                throw new Exception("Internal Server Error");
            }
        } catch (Exception e) {
            LOGGER.error("Failed to Refresh Forecast for Forecast Request {}", message, e);
            throw e;
        }
    }
}
Kafka Consumer Test:
@RunWith(SpringRunner.class)
@ActiveProfiles("kafkatest")
@SpringBootTest(classes = ForecastEventConsumerApplication.class)
@DirtiesContext
public class KafkaEventConsumerTest {

    private static String TOPIC = "topic-name";

    @Mock
    private MyServiceImpl myServiceMock;

    @InjectMocks
    private KafkaEventConsumer kafkaEventConsumer;

    private KafkaTemplate<String, String> template;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @ClassRule
    public static final KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, 3, TOPIC);

    @Before
    public void setUp() throws Exception {
        kafkaEventConsumer = new KafkaEventConsumer(myServiceMock);
        // set up the Kafka producer properties
        Map<String, Object> senderProperties = KafkaTestUtils.senderProps(embeddedKafka.getBrokersAsString());
        // create a Kafka producer factory
        ProducerFactory<String, String> producerFactory = new DefaultKafkaProducerFactory<String, String>(senderProperties);
        // create a Kafka template
        template = new KafkaTemplate<>(producerFactory);
        // set the default topic to send to
        template.setDefaultTopic(TOPIC);
        // wait until the partitions are assigned
        for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            messageListenerContainer.setupMessageListener(new MessageListener<String, String>() {
                @Override
                public void onMessage(ConsumerRecord<String, String> record) {
                    try {
                        kafkaEventConsumer.receive(record.value());
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
            ContainerTestUtils.waitForAssignment(messageListenerContainer, embeddedKafka.getPartitionsPerTopic());
        }
    }

    @AfterClass
    public static void tearDown() throws Exception {
        embeddedKafka.destroy();
    }

    @Test
    public void testReceive() throws Exception {
        String forecastRequestMessage = "{\"programId\":100011770}";
        ForecastRequest forecastRequest = ForecastRequest.builder().programId(100011770L).build();
        JobDetail jobDetail = JobDetail.builder().jobStatus(JobStatus.complete).build();
        Mockito.when(myServiceMock.refreshForecasts(Matchers.any())).thenReturn(jobDetail);
        template.sendDefault(forecastRequestMessage);
        Thread.sleep(2000L);
        // validate something
    }
}
The problem is that in the above @Test method, instead of calling the mocked version of MyService, the original MyService implementation is called. Also, while debugging my code I found that the overridden onMessage() is not getting called either. Please help me find what I am doing wrong here.
You have to stop() all the MessageListenerContainers before calling their setupMessageListener(), and then start() them again so they pick up the fresh listener:
protected void doStart() {
    ...
    Object messageListener = containerProperties.getMessageListener();
    Assert.state(messageListener != null, "A MessageListener is required");
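A hedged sketch of how the @Before loop from the test above could be adapted to that advice (reusing the registry, consumer, and embedded broker from the question; the container is stopped before the listener is swapped and started again afterwards):

for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
    messageListenerContainer.stop();   // stop so the listener can be replaced
    messageListenerContainer.setupMessageListener(new MessageListener<String, String>() {
        @Override
        public void onMessage(ConsumerRecord<String, String> record) {
            try {
                kafkaEventConsumer.receive(record.value());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    });
    messageListenerContainer.start();  // restart so the fresh listener is picked up
    ContainerTestUtils.waitForAssignment(messageListenerContainer, embeddedKafka.getPartitionsPerTopic());
}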
Anyway, it sounds like you really want to mock only your MyService, which is injected into the real KafkaEventConsumer. So how about using something like this:
@MockBean
private MyServiceImpl myServiceMock;
Then you won't need to do anything in your @Before, and there is no need for @InjectMocks.
The KafkaEmbedded instance can expose its host/port (or broker) properties through the conventional Spring Boot configuration properties like this:
@BeforeClass
public static void setup() {
    System.setProperty("spring.kafka.bootstrap-servers", kafkaEmbedded.getBrokersAsString());
}
https://docs.spring.io/spring-boot/docs/2.0.0.RELEASE/reference/htmlsingle/#boot-features-testing-spring-boot-applications-mocking-beans
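Putting those two suggestions together, a hedged sketch of what the trimmed-down test class could look like (names reused from the question; this is an illustration of the @MockBean approach, not a verified drop-in replacement):

@RunWith(SpringRunner.class)
@ActiveProfiles("kafkatest")
@SpringBootTest(classes = ForecastEventConsumerApplication.class)
@DirtiesContext
public class KafkaEventConsumerTest {

    private static final String TOPIC = "topic-name";

    @ClassRule
    public static final KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, 3, TOPIC);

    @MockBean
    private MyServiceImpl myServiceMock;           // replaces the real bean in the application context

    @Autowired
    private KafkaEventConsumer kafkaEventConsumer; // the real consumer, wired with the mock above

    @BeforeClass
    public static void setupKafka() {
        // point Spring Boot's auto-configuration at the embedded broker
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    }

    @Test
    public void testReceive() throws Exception {
        JobDetail jobDetail = JobDetail.builder().jobStatus(JobStatus.complete).build();
        Mockito.when(myServiceMock.refreshForecasts(Matchers.any())).thenReturn(jobDetail);

        // either publish to the embedded broker with a KafkaTemplate as in the question,
        // or call the listener directly:
        kafkaEventConsumer.receive("{\"programId\":100011770}");

        Mockito.verify(myServiceMock).refreshForecasts(Matchers.any());
    }
}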
