Dependency injection with dynamically instantiated classes in Spring Boot

I'm trying to develop a Spring Boot application which offers users the possibility to create and call some simple workflows.
The steps of the workflows are already written (they all extend the same class), and when users create a workflow they just pick which steps they want to include in it. The steps and the workflows are saved in a database.
My problem comes when the user calls the workflow: I want to instantiate each step dynamically using the class loader, but with the dependencies injected by Spring!
Here is an example of a plug-in:
public class HelloWorldStepPlugin extends StepPlugin {

    private static final Logger LOG = LogManager.getLogger();

    @Autowired
    private HelloWorldRepository repository;

    public HelloWorldStepPlugin() {
        super(HelloWorldStepPlugin.class.getSimpleName());
    }

    @Override
    public void process() {
        LOG.info("Hello world!");
        this.repository.findAll(); // <= throws a NullPointerException because this.repository is null
    }
}
Here is how I execute a Workflow (in another class):
ClassLoader cl = getClass().getClassLoader();
for (Step s : workflow.getSteps()) {
    StepPlugin sp = (StepPlugin) cl.loadClass(STEP_PLUGIN_PACKAGE + s.getPlugin()).newInstance();
    sp.process();
}
How can I do to have my HelloWorldRepository injected by Spring?
Is there a much better approach to do what I intend to?

I suggest you declare your steps as prototype-scoped beans. Instead of saving class names in the database, save bean names. Then get the steps and the plugins from the Spring context (i.e. using getBean()), which will inject their dependencies for you.
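A minimal sketch of that approach, assuming the plugin classes are registered as prototype beans and the Step entity stores the bean name in the field returned by s.getPlugin() (that naming is illustrative):

@Component
@Scope("prototype")
public class HelloWorldStepPlugin extends StepPlugin {

    @Autowired
    private HelloWorldRepository repository;

    public HelloWorldStepPlugin() {
        super(HelloWorldStepPlugin.class.getSimpleName());
    }

    @Override
    public void process() {
        repository.findAll(); // injected by Spring, no longer null
    }
}

// In the class that runs the workflow:
@Autowired
private ApplicationContext applicationContext;

public void execute(Workflow workflow) {
    for (Step s : workflow.getSteps()) {
        // s.getPlugin() is assumed to hold the bean name instead of the class name
        StepPlugin sp = applicationContext.getBean(s.getPlugin(), StepPlugin.class);
        sp.process();
    }
}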

Related

Quarkus extension using a repository based on PanacheMongoRepository

I'm currently working on a Quarkus extension which is basically a filter that uses a PanacheMongoRepository. Here is a code snippet (this is in the runtime part of the extension):
@Provider
@Priority(Priorities.AUTHORIZATION)
@AuthorizationSecured
public class AuthorizationFilter implements ContainerRequestFilter {

    // Some injection here

    @Inject
    UserRepository userRepository;

    @Override
    public void filter(ContainerRequestContext requestContext) throws IOException {
        // Some business logic here...
        UserEntity userEntity = userRepository.findByName(name);
        // Some business logic here...
    }
}
The repository:
@ApplicationScoped
public class UserRepository implements PanacheMongoRepository<UserEntity> {

    public UserEntity findByName(String name) {
        return find("some query...", name).firstResult();
    }
}
When the repository is called, I get the following exception:
org.jboss.resteasy.spi.UnhandledException: java.lang.IllegalStateException: This method is normally automatically overridden in subclasses...
java.lang.IllegalStateException: This method is normally automatically overridden in subclasses
    at io.quarkus.mongodb.panache.common.runtime.MongoOperations.implementationInjectionMissing(MongoOperations.java:765)
    at io.quarkus.mongodb.panache.PanacheMongoRepositoryBase.find(PanacheMongoRepositoryBase.java:119)
The processor:
class AuthorizeProcessor {

    private static final String FEATURE = "authorize";

    @BuildStep
    FeatureBuildItem feature() {
        return new FeatureBuildItem(FEATURE);
    }

    @BuildStep(onlyIf = IsAuthorizeEnabled.class)
    void registerAuthorizeFilter(
            BuildProducer<AdditionalBeanBuildItem> additionalBeanProducer,
            BuildProducer<ResteasyJaxrsProviderBuildItem> resteasyJaxrsProviderProducer
    ) {
        additionalBeanProducer.produce(new AdditionalBeanBuildItem(UserRepository.class));
        additionalBeanProducer.produce(new AdditionalBeanBuildItem(AuthorizationFilter.class));
        resteasyJaxrsProviderProducer.produce(new ResteasyJaxrsProviderBuildItem(AuthorizationFilter.class.getName()));
    }
}
Any idea?
Thanks for your help :)
MongoDB with Panache (and likewise Hibernate with Panache) uses bytecode enhancement at build time. When this enhancement doesn't occur, it leads to the exception you mentioned at runtime: java.lang.IllegalStateException: This method is normally automatically overridden in subclasses
This can only happen when the repository or entity is not in the Jandex index. Jandex is used to index all the code of your application, to avoid relying on reflection and classpath scanning to discover classes. If your entity/repository is not in the index, it is not part of your application (the classes of your application are indexed automatically), so it must be inside an external JAR.
Usually, this is solved by adding the Jandex plugin to index the code of the external JAR (in fact there are multiple ways to do this, see How to Generate a Jandex Index).
An extension suffers from the same issue, as extensions are not indexed by default. But from an extension you can index the needed classes via a build step, which is easier and avoids polluting the index with classes that are not needed.
This can be done by generating a new AdditionalIndexedClassesBuildItem(UserRepository.class.getName()) inside a build step.
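For example, a build step along these lines (a minimal sketch; the method name, and the choice to also index UserEntity, are assumptions rather than part of the original extension):

@BuildStep(onlyIf = IsAuthorizeEnabled.class)
AdditionalIndexedClassesBuildItem indexRepositoryClasses() {
    // Adds the repository (and its entity) to the Jandex index so that
    // Panache's build-time bytecode enhancement can be applied to them.
    return new AdditionalIndexedClassesBuildItem(
            UserRepository.class.getName(),
            UserEntity.class.getName());
}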

Using Spring repository in static methods for setting up test data

In order to set up test data for my Spring Boot integration tests, I'd like to create some helper classes and methods which populate the data using the repositories.
Here is an example:
@Component
public class TestUtils {

    private static TemplateRepository templateRepository;

    @Autowired
    public TestUtils(TemplateRepository templateRepository) {
        TestUtils.templateRepository = templateRepository;
    }

    public static void createTemplates() {
        Template template = Template.builder()
                .content("some content")
                .build();
        templateRepository.save(template);
    }
}
Due to a lack of experience, I cannot tell if this approach is fine. Is it "safe" to inject the repository into a static field? Or are there better approaches for setting up test data?
Don't use static. If you want to use Java to initialize the data in the repository, just do so in your test.
What you can do if you need to create a few things in different repositories is create a dedicated component:
@Component
public class DatabaseInitializer {

    private final TemplateRepository templateRepository;
    private final MyOtherRepository myOtherRepository;

    // Add constructor here

    public void createInitialData() {
        // Use repositories to persist some data
    }
}

@ExtendWith(SpringExtension.class)
@Import(DatabaseInitializer.class)
class MyTest {

    @Autowired
    private DatabaseInitializer initDb;

    @Test
    void myTest() {
        initDb.createInitialData(); // Or put this in a `@Before..` method
        // actual test here
    }
}
I use Testcontainers and Flyway.
You can write SQL scripts and annotate test methods with @Sql, providing a .sql file and/or statements to be run.
You can store these .sql files in the test/resources folder.
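A minimal sketch of that approach (the script name and test class are illustrative; @Sql comes from spring-test and runs the script before the annotated test method):

@SpringBootTest
class TemplateQueryTest {

    @Test
    @Sql(scripts = "/insert-templates.sql") // executed before this test method
    void findsSeededTemplates() {
        // assertions against the seeded data go here
    }
}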
Loading Initial Test Data
There is a very well explained process for initializing data in the docs. I would advise you to refer to the section below:
https://docs.spring.io/spring-boot/docs/current/reference/html/howto.html#howto.data-initialization
You just have to maintain insert statements in predefined SQL files.

How do I get JobRunr to detect my scheduled background job in a Spring controller/service?

I have been looking into using JobRunr for starting background jobs on my Spring MVC application, as I really like the simplicity of it, and the ease of integrating it into an IoC container.
I am trying to create a simple test scheduled job that writes a line of text to my configured logger every minute, but I'm struggling to figure out how to get the JobRunr background job server to detect it and queue it up. I am not using Spring Boot so I am just using the generic jobrunr Maven artifact rather than the "Spring Boot Starter". My setup is as follows:
pom.xml
<dependency>
    <groupId>org.jobrunr</groupId>
    <artifactId>jobrunr</artifactId>
    <version>2.0.0</version>
</dependency>
ApplicationConfig.java
@Bean
public JobMapper jobMapper() {
    return new JobMapper(new JacksonJsonMapper());
}

@Bean
@DependsOn("jobMapper")
public StorageProvider storageProvider(JobMapper jobMapper) {
    InMemoryStorageProvider storageProvider = new InMemoryStorageProvider();
    storageProvider.setJobMapper(jobMapper);
    return storageProvider;
}

@Bean
@DependsOn("storageProvider")
public JobScheduler jobScheduler(StorageProvider storageProvider, ApplicationContext applicationContext) {
    return JobRunr.configure().useStorageProvider(storageProvider)
            .useJobActivator(applicationContext::getBean)
            .useDefaultBackgroundJobServer()
            .useDashboard()
            .useJmxExtensions()
            .initialize();
}
BackgroundJobsController.java
@Controller
public class BackgroundJobsController {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    private @Autowired JobScheduler jobScheduler;

    @Job(name = "Test")
    public void executeJob() {
        BackgroundJob.scheduleRecurrently(Cron.minutely(), () -> logger.debug("It works!"));
        jobScheduler.scheduleRecurrently(Cron.minutely(), () -> logger.debug("It works too!"));
    }
}
As you can see, I have tried both methods of initiating the background job in the executeJob method. The issue is basically getting Jobrunr to detect the jobs - is it simply a case of somehow triggering the executeJob method upon startup of the application? If so, does anyone know the most simple way to do that? Previously I have used the Spring #Scheduled annotation to automatically run through methods in a Service/Controller class upon startup of the application, so I was hoping there was a straightforward way to get Jobrunr to pick up the scheduled tasks I am trying to create. Apologies if it is something stupid that I have overlooked. I've spent a good few hours trying different things and reading through the documentation!
Thanks in advance!
There are different ways of doing this:
This is one; annotating a method with @PostConstruct is indeed another.
@SpringBootApplication
@Import(JobRunrExampleConfiguration.class)
public class JobRunrApplication {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(JobRunrApplication.class, args);
        JobScheduler jobScheduler = applicationContext.getBean(JobScheduler.class);
        jobScheduler.<SampleJobService>scheduleRecurrently("recurring-sample-job", every5minutes(), x -> x.executeSampleJob("Hello from recurring job"));
    }
}
You can see an example here: https://github.com/jobrunr/example-java-mag/blob/main/src/main/java/org/jobrunr/examples/JobRunrApplication.java
Have you tried annotating your executeJob method with @PostConstruct? That way, upon initialisation of your application, the jobs would be registered with the JobServer.
I believe the @Job annotation is meant for the method of the job itself (in your case, the debug method).
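A minimal sketch of the @PostConstruct approach, based on the controller from the question (the recurring-job id "test-job" is illustrative):

@Controller
public class BackgroundJobsController {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    private @Autowired JobScheduler jobScheduler;

    @PostConstruct
    public void scheduleRecurringJobs() {
        // Runs once the bean is constructed, so the recurring job is
        // registered automatically on application startup.
        jobScheduler.scheduleRecurrently("test-job", Cron.minutely(), () -> logger.debug("It works!"));
    }
}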
There is now a new way to do so:
You can add @Recurring to any Spring Boot, Micronaut or Quarkus bean method. A Spring Boot example:
@Component
public class SomeService {

    @Recurring(id = "recurring-job-every-5-min", interval = "PT5M")
    @Job(name = "job name for the dashboard")
    public void runEvery5Minutes() {
        // business logic comes here
    }
}
For more info, see the JobRunr documentation.

Camunda replace behaviour for external tasks in tests

I created a simple Camunda Spring Boot project and also created a simple BPMN process with a switcher.
I used service tasks with external implementations as Spring beans. I want to write tests for the process, but I don't want to test how the beans work, because in general I use the external implementations to connect to a DB and save parameters to the context, or for REST calls to internal apps. For example, I want to skip executing service task "one" and instead set the variables for the switcher. I tried to use camunda-bpm-assert-scenario to test the process and wrote a simple test, WorkflowTest.
I noticed that if I use @MockBean for One.class then Camunda skips the delegate execution. If I use @Mock then Camunda executes the delegate execution.
P.S. Sorry for my bad English.
One
@Service
public class One implements JavaDelegate {

    private final Random random = new Random();

    @Override
    public void execute(DelegateExecution execution) throws Exception {
        System.out.println("Hello, One!");
        execution.setVariable("check", isValue());
    }

    public boolean isValue() {
        return random.nextBoolean();
    }
}
WorkflowTest
@SpringBootTest
@RunWith(SpringRunner.class)
@Deployment(resources = "process.bpmn")
public class WorkflowTest extends AbstractProcessEngineRuleTest {

    @Mock
    private ProcessScenario insuranceApplication;

    @MockBean
    private One one;

    @Before
    public void init() {
        MockitoAnnotations.initMocks(this);
        Mocks.register("one", one);
    }

    @Test
    public void shouldExecuteHappyPath() throws Exception {
        // given
        when(insuranceApplication.waitsAtServiceTask("Task_generator")).thenReturn(externalTaskDelegate -> {
            externalTaskDelegate.complete(withVariables("check", true));
        });

        String processDefinitionKey = "camunda-test-process";
        Scenario scenario = Scenario.run(insuranceApplication)
                .startByKey(processDefinitionKey) // either just start process by key ...
                .execute();

        verify(insuranceApplication).hasFinished("end_true");
        verify(insuranceApplication, never()).hasStarted("three");
        verify(insuranceApplication, atLeastOnce()).hasStarted("two");
        assertThat(scenario.instance(insuranceApplication)).variables().containsEntry("check", true);
    }
}
I found two solutions:
1. A bit of a hack: if you use @MockBean for the delegate in the test, the delegate will be skipped, but you run into trouble with the process engine variables.
2. Create two beans with one qualifier and use profiles for testing and production. I use the default profile for local starts and a test profile for testing. A sketch of this option follows.
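A minimal sketch of the second option, assuming the production One delegate is restricted to non-test profiles (e.g. annotated with @Profile("!test")); the configuration class name and the variable value are illustrative:

@Configuration
public class TestDelegateConfig {

    // Registered only when the "test" profile is active; it replaces the real
    // "one" delegate and just sets the variable the switcher evaluates.
    @Bean("one")
    @Profile("test")
    public JavaDelegate one() {
        return execution -> execution.setVariable("check", true);
    }
}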

Common shared data objects for entire application

I have some data objects that are common across a Spring Boot application: one is the logged-in employee object and the other is a category. I have created a @Component class which contains these as static variables. This way I do not even have to autowire them; they can be used directly in controllers, like CurrentContext.employee.
@Component
public final class CurrentContext {

    public static Category currentCategory;
    public static Employee employee;

    @Autowired
    private CategoryService categoryService;

    @Autowired
    private EmployeeService employeeService;

    @EventListener
    public void onApplicationEvent(ContextRefreshedEvent event) {
        currentCategory = categoryService.getCategory();
    }

    @EventListener
    public void onLoginSuccess(InteractiveAuthenticationSuccessEvent event) {
        employee = employeeService.getEmployeeByUserId(((MyUserDetails) event.getAuthentication().getPrincipal()).getUserId());
    }
}
Is this the right way? Please suggest if there is a better way to handle shared data.
Edit
Some background: I require the currently logged-in employee and a category which is common for all employees. So I autowired employeeService and categoryService in my controllers and used them to get the data. They are required in almost all my controller methods, so I wanted to create a bean of each so that I can use them directly in my controllers and also avoid frequent database calls.
Normally, we only put dependencies related to cross-cutting concerns (i.e. dependencies used across the whole application, such as security, logging, transaction handling, a time provider, etc.) in static fields.
By accessing this kind of dependency in a static way, we don't need to pass it through method parameters/constructors from object to object, which makes the API much cleaner without such noise (BTW, this is called the Ambient Context pattern in the .NET world).
Your Employee object most probably belongs to this type, so it is OK to access it in a static way. But as its scope is per session, you cannot simply put it in a static field of a class; if you did, you would always get the same employee for all sessions. Instead, you have to store it in an object that is session-scoped (e.g. HttpSession). Then, at the beginning of handling a web request, you get it from the session and put it in a ThreadLocal which is encapsulated inside a "ContextHolder" object. You then access that "ContextHolder" in a static way.
Sounds very complicated and scary? Don't worry, Spring Security has already implemented this for you. What you need to do is customize Authentication#getPrincipal() or extend the default Authentication to contain your Employee, then get it using SecurityContextHolder.getContext().getAuthentication().
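For example (a minimal sketch, assuming MyUserDetails from the question is extended with a hypothetical getEmployee() accessor that holds the logged-in Employee):

public final class CurrentEmployee {

    private CurrentEmployee() {
    }

    // Static access backed by Spring Security's thread-bound SecurityContext.
    public static Employee get() {
        Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
        return ((MyUserDetails) authentication.getPrincipal()).getEmployee();
    }
}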
For your currentCategory, if it is not a cross-cutting concern and is application-scoped, making a singleton bean to provide its value is a much better OO design.
@Component
public final class CurrentCategoryProvider {

    @Autowired
    private CategoryService categoryService;

    public Category getCurrentCategory() {
        // or cache the value in an internal field, depending on your requirements
        return categoryService.getCategory();
    }
}
You then inject CurrentCategoryProvider into the beans that need to access the current category.
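An illustrative usage (the controller name and endpoint are hypothetical):

@RestController
public class ExampleController {

    private final CurrentCategoryProvider categoryProvider;

    public ExampleController(CurrentCategoryProvider categoryProvider) {
        this.categoryProvider = categoryProvider;
    }

    @GetMapping("/example")
    public String example() {
        // Resolve the shared category through the provider instead of a static field.
        Category current = categoryProvider.getCurrentCategory();
        return current.toString();
    }
}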
