Wro4j: Accessing Spring @Service from custom post processor - spring

I've successfully implemented a custom post processor filter with the help of the wro4j documentation.
Its job is to generate and prepend SASS vars to a group of SASS files, which are then handed off to the rubySassCss filter for transpiling, and it's doing this job well.
The problem is that I wanted to hand the job of determining the SASS vars off to a custom ThemeManager @Service managed by Spring. I hadn't considered that the filter wouldn't be able to see the autowired @Service, but that seems to be the case.
When I @Autowire the @Service into a controller, it works fine, but when I try the same thing with the filter I get an NPE when attempting to use it.
Is there a way to make the @Service visible to the filters, or am I approaching this the wrong way?
Thanks for any help.
UPDATE:
It's taken some doing and attacking from a lot of angles, but I seem to be having success with autowiring my themeManagerService into the app configuration where I have my WRO filterRegistrationBean bean. I then pass the themeManagerService bean as a second argument to my custom ConfigurableWroManagerFactory.
Living in the custom WroManagerFactory is a reference to a custom UriLocator, which takes that themeManagerService as an argument. The custom UriLocator is invoked by a CSS resource containing an arbitrary keyword within a group.
The new UriLocator is able to generate a ByteArrayInputStream from what the themeManagerService provides it and pass it into the pipeline.
Simple.
I'll follow up when this approach pans/fizzles out.
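For reference, this is roughly what that keyword-triggered UriLocator looked like conceptually. It's a minimal sketch following wro4j's UriLocator contract of accept(uri)/locate(uri) as I recall it; the exact package and signatures may vary by wro4j version, and the keyword and class names here are illustrative:

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import ro.isdc.wro.model.resource.locator.UriLocator;

public class ThemeVarsUriLocator implements UriLocator {

    // arbitrary marker resource declared inside the wro group
    private static final String KEYWORD = "themeVars.scss";

    private final ThemeManagerService themeManagerService;

    public ThemeVarsUriLocator(ThemeManagerService themeManagerService) {
        this.themeManagerService = themeManagerService;
    }

    @Override
    public boolean accept(String uri) {
        // only handle the placeholder resource, leave everything else to the default locators
        return uri != null && uri.contains(KEYWORD);
    }

    @Override
    public InputStream locate(String uri) throws IOException {
        // feed the generated SASS vars into the processing pipeline
        String vars = themeManagerService.getFormattedProperties();
        return new ByteArrayInputStream(vars.getBytes(StandardCharsets.UTF_8));
    }
}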

In the end, I was able to provide the spring managed ThemeManagerService directly to the custom post processor, rather than relying on a custom UriLocator. I had tried that early on, but forgot to call super() in the new constructor, so the processor registration system was breaking.
I pass the @Autowired ThemeManagerService to my CustomConfigurableWroManagerFactory when registering the WRO bean:
@Autowired
ThemeManagerService themeManagerService;

@Bean
FilterRegistrationBean webResourceOptimizer(Environment env) {
    FilterRegistrationBean fr = new FilterRegistrationBean();
    ConfigurableWroFilter filter = new ConfigurableWroFilter();
    Properties props = buildWroProperties(env);
    filter.setProperties(props);
    // The overridden constructor passes the ThemeManagerService along
    filter.setWroManagerFactory(new CustomConfigurableWroManagerFactory(props, themeManagerService));
    fr.setFilter(filter);
    fr.addUrlPatterns("/wro/*");
    return fr;
}
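buildWroProperties(env) isn't shown in the post; a minimal sketch of what it might pull together, assuming the standard ConfigurableWroManagerFactory convention of enabling processors by alias through the preProcessors/postProcessors keys (property names and aliases should be double-checked against your wro4j version):

private Properties buildWroProperties(Environment env) {
    Properties props = new Properties();
    // enable the contributed post processor by its alias, ahead of the SASS transpiler
    props.setProperty("postProcessors", "repoPostProcessor,rubySassCss");
    // other wro4j options can be pulled from the Spring environment as needed
    props.setProperty("debug", env.getProperty("wro.debug", "false"));
    return props;
}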
The constructor injection of ThemeManagerService into CustomConfigurableWroManagerFactory means it can be passed along to the custom post processor as it's registered by contributePostProcessors:
public class CustomConfigurableWroManagerFactory extends Wro4jCustomXmlModelManagerFactory {

    private ThemeManagerService themeManagerService;

    public CustomConfigurableWroManagerFactory(Properties props, ThemeManagerService themeManagerService) {
        // forgetting to call super derailed me early on
        super(props);
        this.themeManagerService = themeManagerService;
    }

    @Override
    protected void contributePostProcessors(Map<String, ResourcePostProcessor> map) {
        // ThemeManagerService is provided as the custom processor is registered
        map.put("repoPostProcessor", new RepoPostProcessor(themeManagerService));
    }
}
Now, the post processor has access to ThemeManagerService:
@SupportedResourceType(ResourceType.CSS)
public class RepoPostProcessor implements ResourcePostProcessor {

    private ThemeManagerService themeManagerService;

    public RepoPostProcessor(ThemeManagerService themeManagerService) {
        super();
        this.themeManagerService = themeManagerService;
    }

    public void process(final Reader reader, final Writer writer) throws IOException {
        String resourceText = "/* The custom PostProcessor fetched the following SASS vars from the ThemeManagerService: */\n\n";
        resourceText += themeManagerService.getFormattedProperties();
        writer.append(resourceText);
        // read in the merged SCSS and add it after the custom content
        writer.append(IOUtils.toString(reader));
        reader.close();
        writer.close();
    }
}
This approach is working as expected/intended so far. Hope it comes in handy for someone else.
Wro4j is a great tool and much appreciated.

Related

Injection of bean inside ClientHeadersFactory doesn't work

I'm building a Quarkus app which handles HTTP requests with RESTEasy and calls another API with the REST client, and I need to propagate a header and add another one on the fly, so I added a class that implements ClientHeadersFactory.
Here's the code:
@ApplicationScoped
public abstract class MicroServicesHeaderHandler implements ClientHeadersFactory {

    @Inject
    MicroServicesConfig config;

    @Override
    public MultivaluedMap<String, String> update(MultivaluedMap<String, String> incomingHeaders,
            MultivaluedMap<String, String> clientOutgoingHeaders) {
        // Will be merged with outgoing headers
        return new MultivaluedHashMap<>() {{
            put("Authorization", Collections.singletonList("Bearer " + config.getServices().get(getServiceName()).getAccessToken()));
            put("passport", Collections.singletonList(incomingHeaders.getFirst("passport")));
        }};
    }

    protected abstract String getServiceName();
}
My issue is that the injection of the config doesn't work. I tried both with @Inject and @Context, as mentioned in the javadoc of ClientHeadersFactory. I also tried making the class non-abstract, but it doesn't change anything.
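For context, the factory is wired to the REST client interface in the usual MicroProfile way, roughly as sketched below (the interface, path and names are illustrative, not from my actual code, and the classes would live in separate files):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

import org.eclipse.microprofile.rest.client.annotation.RegisterClientHeaders;
import org.eclipse.microprofile.rest.client.inject.RegisterRestClient;

// concrete handler for one downstream service
class OtherApiHeaderHandler extends MicroServicesHeaderHandler {
    @Override
    protected String getServiceName() {
        return "otherApi";
    }
}

// the header factory is attached to the client interface here
@RegisterRestClient
@RegisterClientHeaders(OtherApiHeaderHandler.class)
@Path("/other-api")
interface OtherApiClient {
    @GET
    Response fetch();
}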
MicroServicesConfig is a @Startup bean because it needs to be initialized before Quarkus.run() is called, otherwise the hot reload doesn't work anymore, since it's required to handle requests.
Here's the code FYI:
@Getter
@Startup
@ApplicationScoped
public final class MicroServicesConfig {

    private final Map<String, MicroService> services;

    MicroServicesConfig(AKV akv, ABS abs) {
        // some code to retrieve an encrypted file from a secure storage, decrypt it and initialize the map out of it
    }
}
It appears to be an issue with ClientHeadersFactory, because if I inject my bean in my main class (@QuarkusMain), it works. I'm then able to assign the map to a public static field that I can access from my HeaderHandler via Application.myPublicStaticMap, but that's ugly, so I would really prefer to avoid that.
I've searched online and seen several people having the same issue, but according to this blog post, and this one, it should work as of Quarkus 1.3 and MicroProfile 3.3 (RestClient 1.4), and I'm using Quarkus 1.5.2.
Even the example in the second link doesn't work for me with the injection of UriInfo, so the issue doesn't come from the bean I'm trying to inject.
I've been struggling with this for weeks and I'd really like to get rid of my workaround now.
I'm probably just missing something but it's driving me crazy.
Thanks in advance for your help.
This issue has finally been solved in Quarkus 1.8.

Spring - Injection of beans using Builder pattern

Context
An application that utilizes Spring 4.1.7. All configuration is in XML files (not using annotations) and I'd rather keep it that way (but I can change the way things are done if I must).
Problem
I have created a new class that comes with a builder class.
Now I'd like to inject other beans into this new class. I can probably use lookup-methods and similar solutions to do that and then use the new class's builder in the caller beans to create an instance. However, I'd rather have an instance of this new class injected into its caller beans than have them create one through the builder. This is where I'm not sure how to do that. For example, this looks like an Abstract Factory to me, but I don't know how I can pass those parameters (which are passed to the builder) at runtime to the Abstract Factory and subsequently to the factories it builds.
Some code snippets to make the question clearer:
public final class Processor {

    private final StatusEnum newStatus;
    private final Long timeOut;

    // I'd like this to be injected by Spring through its setter (below)
    private DaoBean daoInstance;

    private Processor() {
        this.newStatus = null;
        this.timeOut = null;
    }

    private Processor(Builder builder) {
        this.newStatus = builder.getNewStatus();
        this.timeOut = builder.getTimeOut();
    }

    // To be called by Spring
    public void setDaoInstance(DaoBean instance) {
        this.daoInstance = instance;
    }

    public void updateDatabase() {
        daoInstance.update(newStatus, timeOut);
    }

    // Builder class
    public static final class Builder {

        private StatusEnum newStatus;
        private Long timeOut;
        // lots of other fields

        public Long getTimeOut() {
            return this.timeOut;
        }

        public StatusEnum getNewStatus() {
            return this.newStatus;
        }

        public Builder withTimeOut(Long timeOut) {
            this.timeOut = timeOut;
            return this;
        }

        public Builder withNewStatus(StatusEnum newStatus) {
            this.newStatus = newStatus;
            return this;
        }

        public Processor build() {
            return new Processor(this);
        }
    }
}
I'd like an instance of "DaoBean" to be injected into the "Processor" class. But to do that, Processor will have to be a bean, or else I have to utilize something like lookup-methods. On the other hand, wherever I want to use Processor, I have to do something like this:
new Processor.Builder()
.withTimeOut(1000L)
.withNewStatus(StatusEnum.UPDATED)
.build()
.updateDatabase();
Instead of this, I wonder if I can make the Processor a bean that Spring can inject into its callers whilst maintaining its immutability. An instance of DaoBean can then be injected into the Processor by Spring. That way I'd be able to segregate the wiring code and the business logic.
It's worth mentioning that the Builder has a lot more than 2 fields and not all of them have to be set. This is why I thought an abstract factory is the way to go (building instances of the Processor in different ways).
One solution, while keeping the builder, would probably be simply to make the Builder itself a Spring bean...
This allows something like this:
@Autowired
private Builder builder;

public void someMethod() {
    Result result = builder.withX(...).doSomething();
}
This way, your Result object is immutable, can be created via a nice builder and the builder can inject the Spring bean (dao, in your case) into it without anyone even noticing that it's there.
And the only thing that changes is that you don't create the builder yourself, but let Spring create it for you...
@Component
@Scope("prototype") // normally a good idea
public static class Builder {

    @Autowired
    private DaoBean dao;

    // your logic here
}
(Same works with JavaConfig or XML config, if you don't want to scan.)
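Filling that in a little: a minimal sketch of how the prototype-scoped Builder could hand the injected DaoBean to the otherwise immutable Processor on build(), reusing the getters the Processor(Builder) constructor already calls (names adapted from the question, adjust as needed):

@Component
@Scope("prototype")
public static class Builder {

    @Autowired
    private DaoBean dao;

    private StatusEnum newStatus;
    private Long timeOut;

    public StatusEnum getNewStatus() {
        return newStatus;
    }

    public Long getTimeOut() {
        return timeOut;
    }

    public Builder withNewStatus(StatusEnum newStatus) {
        this.newStatus = newStatus;
        return this;
    }

    public Builder withTimeOut(Long timeOut) {
        this.timeOut = timeOut;
        return this;
    }

    public Processor build() {
        Processor processor = new Processor(this);
        processor.setDaoInstance(dao); // the Spring-managed DAO slips in here, unnoticed by callers
        return processor;
    }
}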
Especially with many combinations, I prefer a builder pattern, since a factory would need complex method signatures. Of course, the builder has the disadvantage that you cannot check at compile time if a given combination of attribute types is at least theoretically acceptable. Ok, you could simulate that with various builders, but that would probably be overkill.

How to get all self injected Beans of a special type?

I would like to build a Spring application where new components can be added easily and without much configuration. For example: you have different kinds of documents. These documents should be able to be exported into different file formats.
To make this functionality easy to maintain, it should (basically) work the following way:
Someone programs the file format exporter
He/she writes a component which checks whether the file format exporter is licensed (based on Spring Conditions). If the exporter is licensed, a specialized bean is registered in the application context.
The "whole rest" works dynamically based on the injected beans. Nothing needs to be touched in order to display it on the GUI, etc.
I pictured it the following way:
@Component
public class ExcelExporter implements Condition {

    @PostConstruct
    public void init() {
        excelExporter();
    }

    @Bean
    public Exporter excelExporter() {
        Exporter exporter = new ExcelExporter();
        return exporter;
    }

    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return true;
    }
}
In order to work with those exporters (display them, etc.) I need to get all of them. I tried this:
Map<String, Exporter> exporter =BeanFactoryUtils.beansOfTypeIncludingAncestors(appContext, Exporter.class, true, true);
Unfortunately this does not work (0 beans returned). I am fairly new to this; would anyone mind telling me how this is properly done in Spring? Maybe there is a better solution for my problem than my approach?
You can get all instances of a given type of bean in a Map effortlessly, since it's a built-in Spring feature.
Simply autowire your map, and all those beans will be injected, using the ID of each bean as the key.
@Autowired
Map<String, Exporter> exportersMap;
If you need something more sophisticated, such as a specific Map implementation or a custom key, consider defining your own ExporterMap, as follows:
@Component
class ExporterMap implements Map {

    @Autowired
    private Set<Exporter> availableExporters;

    // your stuff here, including init if required with @PostConstruct
}
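As a usage sketch, the injected map can drive both the GUI listing and the actual export. This assumes an export(Document) method on the Exporter interface, which the question doesn't show, so treat the names as illustrative:

import java.util.Map;
import java.util.Set;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class ExportService {

    // Spring injects every registered Exporter bean, keyed by bean name
    @Autowired
    private Map<String, Exporter> exporters;

    public Set<String> availableFormats() {
        return exporters.keySet();
    }

    public void export(String exporterBeanName, Document document) {
        Exporter exporter = exporters.get(exporterBeanName);
        if (exporter == null) {
            throw new IllegalArgumentException("No licensed exporter registered as: " + exporterBeanName);
        }
        exporter.export(document); // hypothetical method; adapt to the real Exporter API
    }
}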

spring-security global-method-security protect-pointcut with @EnableGlobalMethodSecurity

How does one port from
<sec:global-method-security secured-annotations="disabled">
    <sec:protect-pointcut expression='execution(* x.y.z.end*(..))' access='...' />
</sec:global-method-security>
to Spring Java config
@EnableGlobalMethodSecurity
@Configuration
public class MyConfiguration extends WebSecurityConfigurerAdapter {
?
There is a similar question here http://forum.spring.io/forum/spring-projects/security/726615-protect-pointcut-in-java-configuration
There's a workaround for it. The security point information is kept in MethodSecurityMetadataSource implementations (which are then used by the MethodInterceptor), so we have to create an additional MethodSecurityMetadataSource. As mentioned in the Spring forum post, the XML pointcut configuration is kept in a MapBasedMethodSecurityMetadataSource and processed by a ProtectPointcutPostProcessor, so we also need an instance of ProtectPointcutPostProcessor. Unfortunately this class is final and package-private, so there are 2 options:
create your own class and copy/paste the whole content of the original one (that's what I did)
change the class modifiers with reflection and create an instance of the original one (haven't done that so no idea if it would work fine)
then create the following beans in your context:
@Bean
public Map<String, List<ConfigAttribute>> protectPointcutMap() {
    Map<String, List<ConfigAttribute>> map = new HashMap<>();
    // all the necessary rules go here
    map.put("execution(* your.package.service.*Service.*(..))", SecurityConfig.createList("ROLE_A", "ROLE_B"));
    return map;
}

@Bean
public MethodSecurityMetadataSource mappedMethodSecurityMetadataSource() {
    // the key is not to provide the above map here. this class will be populated later by ProtectPointcutPostProcessor
    return new MapBasedMethodSecurityMetadataSource();
}

// it's either the original spring bean created with reflection or your own copy of it
@Bean
public ProtectPointcutPostProcessor pointcutProcessor() {
    ProtectPointcutPostProcessor pointcutProcessor = new ProtectPointcutPostProcessor((MapBasedMethodSecurityMetadataSource) mappedMethodSecurityMetadataSource());
    pointcutProcessor.setPointcutMap(protectPointcutMap());
    return pointcutProcessor;
}
We've created the necessary beans; now we have to tell Spring to use them. I'm assuming you're extending GlobalMethodSecurityConfiguration. By default it creates a DelegatingMethodSecurityMetadataSource, which contains a list of other MethodSecurityMetadataSources. Depending on what you want to achieve, you have the following options:
If you want to keep all the other MethodSecurityMetadataSources (like the ones for parsing the @Secured annotations), you can extend the list in the delegating metadata source by overriding the following method:
@Override
protected MethodSecurityMetadataSource customMethodSecurityMetadataSource() {
    return mappedMethodSecurityMetadataSource();
}
It would insert it in first place in the list, though, which may cause some problems.
If you want to keep the other sources but want yours to be the last in the list, then override the following method:
@Override
public MethodSecurityMetadataSource methodSecurityMetadataSource() {
    DelegatingMethodSecurityMetadataSource metadataSource = (DelegatingMethodSecurityMetadataSource) super.methodSecurityMetadataSource();
    metadataSource.getMethodSecurityMetadataSources().add(mappedMethodSecurityMetadataSource());
    return metadataSource;
}
If you want your source to be the only one (you don't want to use @Secured or any other annotations), then you can override the same method, just with different content:
@Override
public MethodSecurityMetadataSource methodSecurityMetadataSource() {
    return mappedMethodSecurityMetadataSource();
}
That's it! I hope it helps.
I followed @marhewa's comments and have been able to use the Spring version of the ProtectPointcutPostProcessor class by defining the following bean:
/**
 * Needed to use reflection because I couldn't find a way to instantiate a
 * ProtectPointcutPostProcessor via a BeanFactory or ApplicationContext. This bean will process
 * the AspectJ pointcut defined in the map; check all beans created by Spring; store the matches
 * in the MapBasedMethodSecurityMetadataSource bean so Spring can use it during its checks
 *
 * @return
 * @throws Exception
 */
@Bean(name = "protectPointcutPostProcessor")
Object protectPointcutPostProcessor() throws Exception {
    Class<?> clazz =
        Class.forName("org.springframework.security.config.method.ProtectPointcutPostProcessor");
    Constructor<?> declaredConstructor =
        clazz.getDeclaredConstructor(MapBasedMethodSecurityMetadataSource.class);
    declaredConstructor.setAccessible(true);
    Object instance = declaredConstructor.newInstance(pointcutMethodMetadataSource());
    Method setPointcutMap = instance.getClass().getMethod("setPointcutMap", Map.class);
    setPointcutMap.setAccessible(true);
    setPointcutMap.invoke(instance, pointcuts());
    return instance;
}
This way I don't need to duplicate the code of this Spring class.
Cheers

Spring batch how to use ItemReadListener

I use Spring Batch for processing a file. The configuration of all components is done programmatically.
I have a job that contains several TaskletSteps:
@Bean
@Named(SEEC_JOB)
public Job seecJob() {
    return jobBuilderFactory.get(SEEC_JOB).start(seecMoveToWorkingStep()).next(seecLoadFileStep())
            .on(ExitStatus.COMPLETED.getExitCode()).to(seecFlowMoveToArchiveOk()).from(seecLoadFileStep())
            .on(ExitStatus.FAILED.getExitCode()).to(seecFlowMoveToArchiveKo()).end().build();
}
My question focuses on seecLoadFileStep(), detailed below:
@Bean
public TaskletStep seecLoadFileStep() {
    TaskletStep build = stepBuilderFactory.get(SEEC_LOAD_FILE_STEP)
            .<SeecMove, SeecMove>chunk(cormoranProperties.seec.batchSize.get()).reader(seecItemReader())
            .writer(seecItemWriter()).build();
    return build;
}
I would like to throw a specific exception if a reading error happens (by reading error I mean, for example: the file is corrupted, or it is wrong, with an absent XML tag...).
I have been reading the Spring Batch docs and I think ItemReadListener is my guy:
public interface ItemReadListener<T> extends StepListener {
void beforeRead();
void afterRead(T item);
void onReadError(Exception ex);
}
But I don't know how to use it! I have tried having my seecItemReader() implement this interface, but the onReadError method is never called.
I don't know how to declare/register the ItemReadListener in the TaskletStep.
Here is a bit of the Spring doc:
Any class that implements one of the extensions of StepListener (but not that interface itself since it is empty) can be applied to a step via the listeners element. The listeners element is valid inside a step, tasklet or chunk declaration. It is recommended that you declare the listeners at the level at which its function applies, or if it is multi-featured (e.g. StepExecutionListener and ItemReadListener) then declare it at the most granular level that it applies (chunk in the example given).

An ItemReader, ItemWriter or ItemProcessor that itself implements one of the StepListener interfaces will be registered automatically with the Step if using the namespace element, or one of the *StepFactoryBean factories. This only applies to components directly injected into the Step: if the listener is nested inside another component, it needs to be explicitly registered (as described above).
Could you please help me?
Thanks in advance!
As I guessed, it was easier than I thought: to register the ItemReadListener programmatically, use the listener method in the tasklet configuration:
@Bean
public TaskletStep seecLoadFileStep() {
    TaskletStep build = stepBuilderFactory.get(SEEC_LOAD_FILE_STEP)
            .<SeecMove, SeecMove>chunk(cormoranProperties.seec.batchSize.get()).reader(seecItemReader()).listener(seecItemReaderListener())
            .writer(seecItemWriter()).build();
    return build;
}
And now the onReadError method is called when an exception happens.
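For completeness, a minimal sketch of what the listener behind seecItemReaderListener() could look like: one that wraps read failures in a domain-specific exception, as the question asks. The exception type here is illustrative, not from the original post:

import org.springframework.batch.core.ItemReadListener;

public class SeecItemReadListener implements ItemReadListener<SeecMove> {

    @Override
    public void beforeRead() {
        // nothing to do before a read
    }

    @Override
    public void afterRead(SeecMove item) {
        // nothing to do after a successful read
    }

    @Override
    public void onReadError(Exception ex) {
        // translate low-level parsing/IO problems into a domain-specific exception,
        // which will fail the step with that exception
        throw new SeecFileCorruptedException("Unable to read the SEEC file", ex);
    }
}

// illustrative exception type, not from the original post
class SeecFileCorruptedException extends RuntimeException {
    SeecFileCorruptedException(String message, Throwable cause) {
        super(message, cause);
    }
}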
