Multiple references and dependencies with declarative services - OSGi

I am experiencing problems with the order in which components are loaded when using OSGi Declarative Services under Karaf.
I have this situation:
@Component
public class A implements IA
{
    public void doSomething() { ... }
}

@Component
public class B implements IB
{}

@Component
public class C implements IC
{
    @Reference
    IA a;

    @Reference(cardinality = ReferenceCardinality.MULTIPLE,
               policyOption = ReferencePolicyOption.GREEDY,
               unbind = "doUnregister")
    void doRegister(IB b)
    {
        a.doSomething();
    }

    void doUnregister(IB b)
    {
        ...
    }
}
A, B, and C are three distinct bundles.
When firing up Karaf, a B is registered and doRegister is called. However, service A is not ready yet (a is null).
I tried the following:
setting the start level of A to something lower than B's... did not work.
picking up the registrations of B in a work list and only actually using A later, when C was activated... did not work, and the code was cluttered.
searching for a way to express this requirement through the annotation on doRegister... not possible.
using a service locator and getting the context through an activate method on C... did not work, and it crashed Karaf.
I must clearly be missing something. Has anybody experienced similar problems and found a solution?
UPDATE:
Changed Reference A a into IA a. Added the forgotten information on the @Reference for B.

Based upon the example code you provide, C won't be activated until A and B are present, since the references to A and B are static, mandatory references. So start ordering is not relevant.
Also, references are set in the order they are written in the component description XML. When Bnd processes the annotations into the component description XML, it writes the references out ordered by reference name. The reference name can be set explicitly and defaults to the name of the annotated member. So in your example code, a comes before doRegister, and the field a will therefore be set before doRegister is called.
My guess is that, in your effort to reduce your actual code to this example, you have lost some information that is important for understanding your problem. This would include the static/dynamic and mandatory/optional nature of your references as well as the reference names.
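To make that ordering explicit rather than relying on member names, the reference name can be set directly on the annotation. A minimal sketch, assuming the standard OSGi R6+ DS annotations (the chosen names are illustrative):

import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.component.annotations.ReferenceCardinality;
import org.osgi.service.component.annotations.ReferencePolicyOption;

@Component
public class C implements IC
{
    // The reference name defaults to the field name, "a".
    @Reference
    IA a;

    // The explicit name "b" sorts after "a" in the component description,
    // so the field a is set before doRegister is ever called.
    @Reference(name = "b",
               cardinality = ReferenceCardinality.MULTIPLE,
               policyOption = ReferencePolicyOption.GREEDY,
               unbind = "doUnregister")
    void doRegister(IB b)
    {
        a.doSomething();
    }

    void doUnregister(IB b)
    {
    }
}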

Related

Spring 5 State Based Bean Injection - possible?

I am trying to find the 'controller' (not @Controller) within Spring 5.0 that is responsible for resolving which instance of an implementation to inject. I want to provide my own implementation of that controller (or to extend it), so that I can add my own logic for state-based bean resolution.
For example, given some interface Foo, with implementations FooImpl1 and FooImpl2, and some state Baz: when Baz = 1, I want to step into my own logic and decide to provide FooImpl1 instead of FooImpl2 for the required injection of a Foo implementation.
Spring does this today; the logic seems to be:
Given the need to inject class X, find its implementations.
If only one X is found, use that.
If more than one X is found, use the one marked @Primary.
If more than one X is found and there is no @Primary, look for a @Qualifier.
If more than one X is found and there is no @Primary and no @Qualifier, attempt to match X with a property or parameter by name (i.e. don't inject Y if the parameter or property is named x and not y).
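For concreteness, a small sketch of how rules 3 to 5 above play out with stock Spring annotations (all class and bean names here are made up):

import org.springframework.context.annotation.Primary;
import org.springframework.stereotype.Component;

interface Foo {}

@Component("fooImpl1")
@Primary                 // rule 3: wins when several Foo beans are candidates
class FooImpl1 implements Foo {}

@Component("fooImpl2")
class FooImpl2 implements Foo {}

@Component
class Consumer {
    private final Foo foo;

    // Rule 3: FooImpl1 is injected because it is marked @Primary.
    // Without @Primary, a @Qualifier("fooImpl2") on the parameter (rule 4)
    // or renaming the parameter to fooImpl2 (rule 5) would pick FooImpl2.
    Consumer(Foo foo) {
        this.foo = foo;
    }
}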
What I want is, at some point in the logic above, to invoke my own disambiguation/resolution of the implementation to be injected, based on my own logic and state.
So, before I go and dig into Spring to locate where that logic is implemented, I am hoping to find that it lives in some controller/service that I can extend, better still if it is backed by some configuration...
You can implement your own @Configuration that returns a Spring @Bean:
@Configuration
public class Config {

    private final Baz baz;

    @Autowired
    Config(Baz baz) {
        this.baz = baz;
    }

    @Bean
    public Foo getFoo() {
        // Assumes Baz exposes its state as an int, e.g. via getState().
        switch (baz.getState()) {
            case 1:
                return new FooImpl1();
            default:
                return new FooImpl2();
        }
    }
}
Please also read the paragraph Full @Configuration vs “lite” @Bean mode? in the Spring documentation. Its last paragraph states:
In common scenarios, @Bean methods are to be declared within @Configuration classes, ensuring that “full” mode is always used and that cross-method references therefore get redirected to the container’s lifecycle management. This prevents the same @Bean method from accidentally being invoked through a regular Java call, which helps to reduce subtle bugs that can be hard to track down when operating in “lite” mode.
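A minimal sketch of the behaviour that quote describes (Bar here is a hypothetical second bean that depends on Baz):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FullModeConfig {

    @Bean
    public Baz baz() {
        return new Baz();
    }

    @Bean
    public Bar bar() {
        // "Full" mode: this call is intercepted by the container and returns
        // the one managed Baz singleton. In "lite" mode it would be a plain
        // Java call creating a fresh, unmanaged Baz instance.
        return new Bar(baz());
    }
}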
It seems what I am looking for are the interfaces BeanFactory and, most likely, ConfigurableBeanFactory. There are still some things to work through, of course, but this is definitely the right direction.
For those immediately curious as to what I am talking about, see https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/beans/factory/config/ConfigurableBeanFactory.html
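One entry point in that direction, as a sketch rather than a full solution: a BeanFactoryPostProcessor receives a ConfigurableListableBeanFactory (a subtype of ConfigurableBeanFactory) at context startup, while bean definitions can still be adjusted. The resolver logic below is purely hypothetical:

import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.stereotype.Component;

@Component
public class StateBasedBeanResolver implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) {
        // Inspect all candidate Foo implementations and, based on custom
        // state, mark one of them primary so it wins the injection.
        for (String name : beanFactory.getBeanNamesForType(Foo.class)) {
            if (name.equals("fooImpl1")) { // stand-in for real state logic
                beanFactory.getBeanDefinition(name).setPrimary(true);
            }
        }
    }
}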

Inherited properties of related entities are not visible in spring-data-neo4j-rest

I have three NodeEntities A, B, and C. A is the parent of B and C. C has a property of type Set<B>. For all three entities I also have a PagingAndSortingRepository. The Spring Boot application is set up as in the example https://spring.io/guides/gs/accessing-neo4j-data-rest/.
Now there is a strange thing: if I browse the B repository directly using the URL localhost:8080/B, I see all the parent properties that B inherits from A. But if I browse the Bs via C, like localhost:8080/C/0/B, I see the Bs but all the inherited properties are empty. Is this a bug, or is there something missing?
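For reference, one of the three repositories in such a setup could look like this (a hypothetical sketch following the linked guide; the ID type Long is an assumption):

import org.springframework.data.repository.PagingAndSortingRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;

// Exposed at localhost:8080/B via Spring Data REST.
@RepositoryRestResource(collectionResourceRel = "B", path = "B")
public interface BRepository extends PagingAndSortingRepository<B, Long> {
}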
Do you have a sample project that reproduces this? Or could you at least share the code for the classes?
It could be that your B relationship needs a @Fetch annotation to be fully hydrated on load.
Update
As I presumed, the transitive child is not loaded automatically, so if you really need the data there, add the @Fetch annotation:
public class Composite extends Component {

    @Fetch
    private Set<Leaf> leaf;

    // ...
}

Is there any way to intercept all LINQ to SQL queries?

I've built some code that can rebuild expression trees so I can avoid triggering the "no supported translation to SQL" exception, and it works fine as long as I call my function to replace the IQueryable. The problem is that I'd like it to be applied automatically to all queries in my project, without having to worry about calling this function on each one separately. Is there any way I can intercept everything?
I've tried using Reflection.Emit to create a wrapping provider and using reflection to replace it on the data context, but it turns out that even with Reflection.Emit I can't implement the internal IProvider interface.
I've also tried replacing the provider with a RealProxy-based class, and that works for non-compiled queries, but the CompiledQuery.Execute method throws an exception because it won't cast to the SqlProvider class. I tried replacing the response to the Compile method on the provider with another proxy so I could intercept the Execute call, but that failed a check on the return type being correct.
I'm open to any other ideas, or to ways of fixing what I've already tried.
It's hard to tell whether this is an applicable solution without seeing your code, but if you have a DI-friendly app architecture, you can implement an interceptor and have your favorite IoC container emit the appropriate type for you at run-time.
Esoteric? A little. Consider an interface like this:
public interface ISomeService
{
    IEnumerable<SomeEntity> GetSomeEntities();
    // ...
}
This interface might be implemented like this:
public class SomeService : ISomeService
{
    private readonly DbContext _context;     // this is a dependency!
    private readonly IQueryTweaker _tweaker; // this is a dependency!

    public SomeService(DbContext context, IQueryTweaker tweaker) // this is constructor injection!
    {
        _context = context;
        _tweaker = tweaker;
    }

    public IEnumerable<SomeEntity> GetSomeEntities()
    {
        return _tweaker.TweakTheQuery(_context.SomeEntities).ToList();
    }
}
Every time you implement a method of the ISomeService interface, there's always a call to _tweaker.TweakTheQuery() wrapping the IQueryable. That not only gets boring, it also feels like something is missing a feature: the same feeling you'd get from wrapping every one of these calls inside a try/catch block, or, if you're familiar with MVVM in WPF, from raising the annoying PropertyChanged event in every single property setter of your ViewModel.
With DI interception, you factor this requirement out of your "normal" code and into an "interceptor": you basically tell the IoC container that instead of binding ISomeService directly to the SomeService implementation, you're going to decorate it with an interceptor, and emit another type, perhaps SomeInterceptedService (the name is irrelevant; the actual type only exists at run-time), which "injects" the desired behavior into the desired methods. Simple? Not exactly.
If you haven't designed your code with DI in mind (are your dependencies "injected" into your classes' constructors?), it could mean a major refactoring.
The first step breaks your code: remove the IQueryTweaker dependency and all the TweakTheQuery calls from all ISomeService implementations, so that they look like this (notice the virtualness of the method to be intercepted):
public class SomeService : ISomeService
{
    private readonly DbContext _context;

    public SomeService(DbContext context)
    {
        _context = context;
    }

    public virtual IEnumerable<SomeEntity> GetSomeEntities()
    {
        return _context.SomeEntities.ToList();
    }
}
The next step is to configure the IoC container so that it knows to inject the SomeService implementation whenever a type's constructor requires an ISomeService:
_kernel.Bind<ISomeService>().To<SomeService>();
At that point you're ready to configure the interception; if you're using Ninject, this could help.
But before jumping into that rabbit hole, you should read this article, which shows how the decorator and interceptor patterns are related.
The key point is that you're not intercepting anything internal to LINQ to SQL or the .NET Framework itself; you're intercepting your own method calls, wrapping them with your own code, and, with a little help from any decent IoC container, you'll be intercepting the calls to methods that call upon LINQ to SQL, rather than the direct calls to LINQ to SQL itself. Essentially, the IQueryTweaker dependency becomes a dependency of your interceptor class, and you only code its usage once.
An interesting thing about DI interception is that interceptors can be combined: you can have an ExecutionTimerServiceInterceptor on top of an AuditServiceInterceptor, on top of a CircuitBreakerServiceInterceptor... And the best part is that you can configure your IoC container so that you completely forget it exists. As you add more service classes to the application, all you need to do is follow a naming convention you've defined, and voilà: you've just written a service that not only accomplishes all the strictly data-related tasks you've coded, but also disables itself for 3 minutes if the database server is down and remains disabled until it's back up; that service also logs all inserts, updates and deletes, and stores its execution time in a database for performance analysis. The term automagical seems appropriate.
This technique, interception, can be used to address cross-cutting concerns; another way to address those is AOP, although some articles (and Mark Seemann's excellent Dependency Injection in .NET) clearly demonstrate how AOP frameworks are a less ideal solution than DI interception.

SD MongoDB polymorphism in subdocument

I just started developing an app in Java with spring-data-mongodb and came across an issue that I haven't been able to solve.
I have a couple of document beans like this:
@Document(collection = "myBeanBar")
public class BarImpl implements Bar {
    String id;
    Foo foo;
    // More fields and methods ...
}

@Document
public class FooImpl implements Foo {
    String id;
    String someField;
    // Some more fields and methods ...
}
And I have a repository class with a method that simply invokes a find, similar to this:
public List<? extends Bar> findByFooField(final String fieldValue) {
    Query query = Query.query(Criteria.where("foo.someField").is(fieldValue));
    return getMongoOperations().find(query, BarImpl.class);
}
Saving a Bar works just fine: it is saved in Mongo along with the "_class" attribute for both Foo and Bar. However, finding by some attribute of Foo throws an exception like this:
Exception in thread "main" java.lang.IllegalArgumentException: No property someField found on test.Foo!
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentPropertyPath(AbstractMappingContext.java:225)
at org.springframework.data.mongodb.core.convert.QueryMapper.getPath(QueryMapper.java:202)
at org.springframework.data.mongodb.core.convert.QueryMapper.getTargetProperty(QueryMapper.java:190)
at org.springframework.data.mongodb.core.convert.QueryMapper.getMappedObject(QueryMapper.java:86)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:486)
Which, after some digging, makes sense, since nowhere in the query is the concrete type of the sub-document specified, and the entity information of Bar says the type of foo is Foo (not FooImpl), which in turn cannot have properties because it is an interface.
My question is: is there a way to specify it, or to work around this issue, without declaring the sub-document type as a concrete type?
I've been googling it for a couple of days, looking at the documentation, the API and the source code, but I cannot find a clear way to do it. I'd really appreciate your help.
Thank you very much.
I had a similar problem: I have a class that implements an interface, and when I used findAll I got the error:
org.springframework.data.mapping.model.MappingInstantiationException: Could not instantiate bean class [test.MetaClasse]: Specified class is an interface.
After debugging the Spring Data code, I realized that the mapper uses @TypeAlias to discover the type it has to instantiate, so I just put @TypeAlias("FullClassName") on my implementations of test.MetaClasse and it worked!
I tested with your situation, and it will work!
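Applied to the classes from the question, that would look something like this (a sketch; the alias string just needs to match what gets stored in the "_class" field):

import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.mongodb.core.mapping.Document;

// The alias is what gets written to, and resolved from, "_class".
@Document
@TypeAlias("test.FooImpl")
public class FooImpl implements Foo {
    String id;
    String someField;
}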
As mentioned in this comment, the solution of putting the full class name in the type alias is imperfect, as it makes refactoring cumbersome.
Instead, you can configure the type mappings and make it work automagically. Here's how:
First, annotate BarImpl and FooImpl with @TypeAlias. It doesn't have to be a full class name; it can be anything else, for example @TypeAlias("bar_impl") and @TypeAlias("foo_impl") respectively.
Then we’re going to need the reflections library. Pick the latest version for the build tool of your choice here.
For example with Gradle:
implementation("org.reflections:reflections:0.10.2")
Now we’re going to need a small extension to DefaultMongoTypeMapper to make it easy to configure and instantiate. Here’s how it would look in Kotlin:
class ReflectiveMongoTypeMapper(
    private val reflections: Reflections = Reflections("com.example")
) : DefaultMongoTypeMapper(
    DEFAULT_TYPE_KEY,
    listOf(
        ConfigurableTypeInformationMapper(
            reflections.getTypesAnnotatedWith(TypeAlias::class.java).associateWith { clazz ->
                clazz.getAnnotation(TypeAlias::class.java)!!.value
            }
        ),
        SimpleTypeInformationMapper(),
    )
)
where com.example is either your base package or the package with MongoDB models.
This way we find all classes annotated with @TypeAlias and register the alias-to-type mappings.
Next, we'll need to adjust the app's Mongo configuration a bit. The configuration has to extend AbstractMongoClientConfiguration, and we need to override the method mappingMongoConverter to make use of the mapper created above. It should look like this:
override fun mappingMongoConverter(
    databaseFactory: MongoDatabaseFactory,
    customConversions: MongoCustomConversions,
    mappingContext: MongoMappingContext,
) = super.mappingMongoConverter(databaseFactory, customConversions, mappingContext).apply {
    setTypeMapper(ReflectiveMongoTypeMapper())
}
Done!
Now all alias-to-type mappings will be registered automatically on context startup, and all your polymorphic fields will work just fine.
You can check the full code example on GitHub.
Also, here's a blog post where you can read about the root cause of this issue as well as check other ways to solve it (in case you don't want to rely on reflection): https://blog.monosoul.dev/2022/09/16/spring-data-mongodb-polymorphic-fields/

Spring AOP superclass method execution without @target

Consider the following situation:
class A {
    void a() { ... }
}

@MyAnnotation
class B extends A {
    void b() { ... }
}
I want to advise all methods of all classes annotated with @MyAnnotation (i.e. B.a()).
That's a pretty easy task thanks to the @target pointcut expression.
BUT! In that case all beans in the container (even unsuitable ones) will be proxied, which is unacceptable.
Now the question: is it possible to build up a pointcut expression without @target but with the same effect?
You can use within, like this:
execution(* *(..)) && within(@MyAnnotation *)
Refer to https://stackoverflow.com/a/2522821/672586 and http://forum.springsource.org/showthread.php?28525-Difference-between-target-and-within-in-Spring-AOP for more details. The relevant section from the forum post explaining the difference between within and target:
One difference between the two is that @within() is matched statically, requiring the corresponding annotation type to have only the CLASS retention, whereas @target() is matched at runtime, requiring the same to have the RUNTIME retention. Other than that, within the context of Spring, there is no difference between the join points selected by the two.
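Put together, a minimal Spring aspect using that pointcut could look like this (a sketch; the aspect name and the com.example package are illustrative):

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class MyAnnotationAspect {

    // Advises every method of every class annotated with @MyAnnotation,
    // without forcing a proxy onto unrelated beans the way @target does.
    @Around("execution(* *(..)) && within(@com.example.MyAnnotation *)")
    public Object around(ProceedingJoinPoint pjp) throws Throwable {
        // Custom behavior goes here, before and/or after proceeding.
        return pjp.proceed();
    }
}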
