I have some individual web services written with Spring Boot that run on their own, and I want to create group projects that combine them as needed.
Right now, the controllers are annotated with @RestController and, obviously, they work fine when the apps run individually.
But I want to convert these projects into Maven dependencies, so that I can just declare them as dependencies and the controllers become available through that alone.
I have converted these projects into executable dependencies by adding classifier = exec:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <configuration>
    <classifier>exec</classifier>
  </configuration>
</plugin>
On other projects that contain only components/services, this approach works, since it is just autowiring the service classes.
In the group project I can still autowire the controllers and call their methods, which is okay. But those controller classes have @RestController and @RequestMapping annotations, and I was wondering whether it is possible to activate those controllers without doing anything more?
Like: just add the dependency, and the controllers of that dependency are there.
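For what it's worth, Spring only registers @RestController beans it finds during component scanning, so the usual way to "activate" controllers coming from a dependency jar is to make the group project's application class scan their packages too. A minimal sketch, with made-up package names standing in for the real ones:

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Hypothetical packages; adjust to where the dependency's controllers live.
@SpringBootApplication(scanBasePackages = {
        "com.example.groupapp",     // the group project itself
        "com.example.servicea.web"  // controllers from the dependency jar
})
public class GroupApplication {
    public static void main(String[] args) {
        SpringApplication.run(GroupApplication.class, args);
    }
}
```

If all projects share a common root package, the default scan from the application class's package already covers the dependency's controllers and no extra configuration is needed.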
Related
I have projects built with Spring Boot, Maven and Kotlin.
I want to expose some services and Feign clients from a Maven project, so others can use them.
For a class with an annotation like @Service that works well. But I also need to expose Feign clients, which are annotated with @FeignClient, and as it looks, other projects are not
able to inject those clients. Do I have to configure something in my pom.xml?
I'm using spring-cloud-starter-openfeign.
Here is some code.
My Feign client looks like:
...
@FeignClient(name = "MyAPIClient", url = "\${url}", configuration = [MyApiClientConfiguration::class])
interface MyAPIClient {
...
And I try to inject that client in another project like this:
...
@Service
class MyService(val myAPIClient: MyAPIClient) {
...
The error is pretty clear: it says there is no bean with the name MyAPIClient, so it's not
visible or available: "Consider defining a bean of type 'com.mycomp.MyAPIClient' in your configuration."
Do I have to configure something explicitly in my pom.xml to expose an OpenFeign client to other projects?
Thanks for your help
It would work automatically if you had the same package structure in both projects. See how the search for Feign clients is performed by default.
In other cases, you need to specify the basePackages or basePackageClasses attribute of the @EnableFeignClients annotation (in the app where you need to inject your Feign client). Note that if you do that, the default behavior (scanning the current package where this annotation is placed) stops working, so in this case you need to specify that package manually as well.
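A minimal sketch of that configuration, with hypothetical package names (the Feign client living in com.mycomp.clients, the consuming app in com.othercomp.app); note that the app's own package is listed explicitly because the default scan stops once basePackages is set:

```java
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.openfeign.EnableFeignClients;

@SpringBootApplication
@EnableFeignClients(basePackages = {
        "com.othercomp.app",   // this application's own Feign clients
        "com.mycomp.clients"   // MyAPIClient from the dependency jar
})
public class ConsumerApplication { }
```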
I have a Spring Boot REST service that includes an external project in its pom as a dependency. That external project is basically a jar containing Spring AOP code.
The base package of my main application, which includes this external jar with the Spring AOP code, is x.y.z.
The class in the external jar where the @Before advice lives is under the package a.b.c.
With the class under the a.b.c package, it doesn't get recognized by the main application where I want to apply the aspect. However, when I change its package from a.b.c to x.y.z (which I really can't do in real life), it works fine.
I know that a Spring Boot service scans everything under the root package given in the application class, x.y.z in this case, and that is why the aspect works fine when its class is under x.y.z.
However, the problem is that this jar will be used across multiple applications, so changing the package name like this is not an option.
Is there a way to accomplish this without changing the package of the class where the Spring AOP code is?
Probably component scan is only activated for your application class's package by default. You can extend it to multiple packages, including the aspect's package:
XML style configuration:
<context:component-scan base-package="x.y.z, a.b.c" />
Annotation style configuration:
@ComponentScan(basePackages = {"x.y.z", "a.b.c"})
Disclaimer: I am not a Spring user, only an AspectJ expert. I just knew that you can configure component scan, googled the syntax for you, and hope it is correct.
Please define the bean (of the jar project) inside the main application. Add @ComponentScan(basePackages = {"x.y.z", "a.b.c"}) as well as @EnableAspectJAutoProxy, and also include the piece of code below.
For example:
@Bean
public LoggingHandler loggingHandler()
{
    return new LoggingHandler();
}
Also annotate the external jar code with:
@Aspect
@Component
public class LoggingHandler {
What @kriegaex suggests is correct. In addition to that, please make sure you are using @Component along with @Aspect. Since @Aspect is not a Spring annotation, Spring won't recognize it on its own, and your aspect won't be registered. So using @Component is mandatory to get aspects to work in a Spring environment.
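Putting the two answers together, the aspect class in the external jar might look roughly like this; the class name, pointcut expression, and log message are illustrative, not taken from the question:

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component  // required: @Aspect alone does not register the class as a Spring bean
public class LoggingHandler {

    // Advise every method under the host application's root package.
    @Before("execution(* x.y.z..*(..))")
    public void logBefore(JoinPoint joinPoint) {
        System.out.println("Entering: " + joinPoint.getSignature());
    }
}
```

With @ComponentScan(basePackages = {"x.y.z", "a.b.c"}) and @EnableAspectJAutoProxy on the main application, this class is picked up from a.b.c and its advice is applied.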
I'm trying to extend an OSGi service. The service being extended includes some references and properties. I'm using the new org.osgi.service.component.annotations package. The meta XML generated by the annotation processor of the OSGi R6 implementation does not account for the reference and property declarations made in the OSGi service I'm extending.
The Apache Felix Maven SCR plugin handles this use case well: a class annotated with Felix annotations includes the references and properties of the base class too.
Is there a way to get this working with the official OSGi annotation implementation? I don't want to fall back to the Felix SCR plugin unless I have to, since its official website says to move on to the OSGi implementation, and this is a new project where the SCR plugin is not already in use.
"The meta XML generated by the annotations processor of the OSGi R6 implementation does not account for the reference and property declarations made in the OSGI service I'm extending."
The behaviour you are expecting is down to the build tool you are using to generate the XML, not the annotations themselves. In general it is not a good idea to generate the XML based on annotations found in the parent class. This is because the parent class located at build time may not be the same as the parent class located at runtime. In this situation it is possible that generated injection sites might not actually be present at runtime, causing lots of problems. In fact even if the type is the same, you are referencing private details of the parent class from the subclass.
That warning aside, you are probably using a bnd-based tool, such as the maven-bundle-plugin or bnd-maven-plugin, to generate the XML file. To avoid the issues I have mentioned, bnd does not search a component's parent class for annotations, but this behaviour can be overridden in configuration using the following instruction:
-dsannotations-options: inherit
If you add that configuration option then you should see the behaviour that you want, but it is strongly recommended that you do not do this when the parent class and child class are in different bundles.
Another option that doesn't require inheriting annotations from the base class is to redeclare the needed references in the component itself:
@Component(
    service = SomeService.class,
    reference = {
        @Reference(
            name = "baseClassField",
            field = "baseClassField",
            service = SomeOtherService.class
        ),
        @Reference(
            name = "otherBaseClassField",
            field = "otherBaseClassField",
            service = YetAnotherService.class,
            cardinality = ReferenceCardinality.MULTIPLE,
            policy = ReferencePolicy.DYNAMIC
        )
    }
)
public class MyServiceImplementation
        extends AbstractSomeServiceBaseImpl
        implements SomeService
{...}
The obvious disadvantage is that you explicitly hardcode implementation details of the superclass, which may be even more wrong than implicitly inheriting them at build time. This way, not only can the runtime class have different fields than the compile-time dependency, but even at compile time, whenever the base class changes, you have to update your class to reflect the added, removed, or renamed fields.
In Maven you can configure it this way:
<plugins>
  <plugin>
    <groupId>biz.aQute.bnd</groupId>
    <artifactId>bnd-maven-plugin</artifactId>
    <version>3.5.0</version>
    <executions>
      <execution>
        <id>run-bnd</id>
        <goals>
          <goal>bnd-process</goal>
        </goals>
      </execution>
    </executions>
    <configuration>
      <bnd><![CDATA[-dsannotations-options: inherit]]></bnd>
    </configuration>
  </plugin>
</plugins>
Noobish question here, but I'm struggling to make this work.
I have an old project with submodules which does not use Spring or anything; just final classes and static instances.
--- Main
------ Server
------ Business
------ Webservices
Server has a dependency on Business and Webservices.
Webservices has a dependency on Business.
Server is the submodule with the web.xml file.
I have to add a new service in Business sub-module and I want to start using Spring and dependency injection to do so, in order to start migrating the project to Spring.
(I'm not talking SpringBoot, just regular Spring).
In the Business submodule, I did the following:
added the spring-core, spring-beans, and spring-context dependencies as well as javax.inject,
using Spring version 4.3.2;
created an interface IMyService and its implementation MyServiceImpl, and added the @Service annotation on the implementation;
added a spring-context.xml file in src/main/resources declaring context:annotation-config and context:component-scan with a base-package.
Then I created in my submodule a "bridge" to try to use the Spring bean from one submodule in a non-Spring bean of another submodule, as described here: https://blog.jdriven.com/2015/03/using-spring-managed-bean-in-non-managed-object/
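For reference, the bridge in that post is essentially a Spring-managed bean that stashes the ApplicationContext in a static field so legacy, non-managed code can look beans up; a rough sketch (class and method names assumed, IMyService is the interface from above):

```java
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

@Component
public class SpringContextBridge implements ApplicationContextAware {

    private static ApplicationContext context;

    // Spring calls this once the context is up, because the bean
    // implements ApplicationContextAware.
    @Override
    public void setApplicationContext(ApplicationContext applicationContext)
            throws BeansException {
        context = applicationContext;
    }

    // Called from legacy, non-Spring code.
    public static IMyService getMyService() {
        return context.getBean(IMyService.class);
    }
}
```

This only works if some web application context actually gets created and scans the bridge's package, which is exactly the part that seems to be failing here.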
However, the context never gets injected.
I've tried adding the contextConfigLocation to the web.xml of Server, but no dice either.
What am I missing so that my Spring context gets initialized in the Business module?
I'm using the maven-jetty-plugin to run my Spring MVC webapp during the integration-test phase of a Maven build, and run various tests on it. At this point, I'd like to be able to switch out some of the Spring configuration, so that I can point at a different bean implementation during the integration tests. This is so that I can change which database to run against, rather than use the production connection settings.
What sort of approach should I consider? Should I attempt to use resource filtering on the servlet-context.xml file? Should I have two different configuration files? How do I get this to play nicely with the Jetty plugin?
EDIT: I'm considering using Spring's Java-based @Configuration annotations in preference to the XML servlet-context file, and switching which beans I construct based on environment variables or similar, but this feels wrong as well.
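For concreteness, the Java-config variant I'm considering would look roughly like this, keyed off Spring profiles rather than raw environment variables (bean names and JDBC URLs are placeholders):

```java
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceConfig {

    @Bean
    @Profile("production")
    public DataSource productionDataSource() {
        // real connection settings
        return new DriverManagerDataSource("jdbc:postgresql://prod-host/app");
    }

    @Bean
    @Profile("integration-test")
    public DataSource testDataSource() {
        // in-memory database for integration tests
        return new DriverManagerDataSource("jdbc:h2:mem:it;DB_CLOSE_DELAY=-1");
    }
}
```

The active profile would then be chosen per environment, e.g. via -Dspring.profiles.active=integration-test when the Jetty plugin starts the app.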
I suggest using Spring profiles plus Maven filtering:
Define a property in pom.xml which can be overridden via the command line: -Dspring.profile.active=development
<properties>
<spring.profile.active>test</spring.profile.active>
</properties>
Add resource filtering in pom.xml. Make sure your web.xml is in directory src/main/resources.
<resources>
  <resource>
    <directory>src/main/resources</directory>
    <filtering>true</filtering>
  </resource>
</resources>
Activate the specific Spring profile in web.xml; ${spring.profile.active} will be replaced during filtering.
<context-param>
  <param-name>spring.profiles.active</param-name>
  <param-value>${spring.profile.active}</param-value>
</context-param>
Define beans per Spring profile:
<beans profile="production">
  <jee:jndi-lookup id="dataSource" jndi-name="java:comp/env/jdbc/datasource"/>
</beans>
Previously, I always created a profile containing the jetty-maven-plugin configuration and the integration test configuration.
But when I learned about spring-test-mvc, I switched to it, because everything you want to achieve in integration tests with the jetty-maven-plugin can be achieved there too. Plus, you can mock the services needed (e.g. authentication against a different app).
So I suggest switching to spring-test-mvc. IMHO, the jetty-maven-plugin style is quite painful.
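To give a feel for the approach: spring-test-mvc was later folded into spring-test as MockMvc, and a standalone test needs no Jetty at all. A minimal sketch (the PingController here is a stand-in for one of your real controllers):

```java
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.Test;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

public class PingControllerIT {

    // Trivial controller standing in for a real one.
    @RestController
    static class PingController {
        @GetMapping("/ping")
        String ping() { return "pong"; }
    }

    // Standalone setup: no servlet container, no full application context.
    private final MockMvc mockMvc =
            MockMvcBuilders.standaloneSetup(new PingController()).build();

    @Test
    public void pingReturns200() throws Exception {
        mockMvc.perform(get("/ping")).andExpect(status().isOk());
    }
}
```

Because the test drives the DispatcherServlet machinery directly, swapping a test database or mocked service is just a matter of what you wire into the controller under test.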