Before I discovered that Spring Boot's Info actuator had almost everything I wanted to publish, I made a few meta endpoints of my own so that I could access build and Git information when validating things like:
"Is the right version deployed?"
"Who built this?"
"When was it built?"
"Which git commit is this based on?"
After doing that, I did get around to discovering the Info actuator and that it answers almost all of those questions for me, but there are a few things from the Git information that I'd like to add -- mostly the commit message and the dirty flag.
I looked at the output if I turn on full git metadata with:
management.info.git.mode=full
But ... that adds a lot more information, most of which I don't care about, so it's more than I really want.
What I'd like to do is take the GitInfoContributor and extend/replace it, but I'm not totally sure how to do that. It's easy to add my own contributor, but if I add my own contributor and call builder.withDetail("git", ...), like this:
package ca.cpp.api.submitapi.config

import org.springframework.boot.actuate.info.Info
import org.springframework.boot.actuate.info.InfoContributor
import org.springframework.boot.info.GitProperties
import org.springframework.stereotype.Component

@Component
class CustomGitInfoContributor(private val properties: GitProperties) : InfoContributor {
    override fun contribute(builder: Info.Builder?) {
        builder?.withDetail("git", mapOf("dirty" to properties.get("dirty")))
    }
}
This replaces the whole set of git properties, and meanwhile I think the core GitInfoContributor will still be there, providing information that I'm throwing away.
Is there a reasonable way to add only the elements I want, either with my own contributor that can merge its information with the info already under "git" or by somehow extending/replacing the existing GitInfoContributor?
The simplest way to add a new element under the "git" part is to extend GitInfoContributor.
Kotlin:

@Component
class CustomGitInfoContributor @Autowired constructor(properties: GitProperties) : GitInfoContributor(properties) {

    override fun contribute(builder: Info.Builder) {
        val map = generateContent()
        map["dirty"] = properties.get("dirty")
        builder.withDetail("git", map)
    }
}
Java:

@Component
public class CustomGitInfoContributor extends GitInfoContributor {

    @Autowired
    public CustomGitInfoContributor(GitProperties properties) {
        super(properties);
    }

    @Override
    public void contribute(Info.Builder builder) {
        Map<String, Object> map = generateContent();
        map.put("dirty", getProperties().get("dirty"));
        builder.withDetail("git", map);
    }
}
This code will add the dirty entry after the default git info, e.g. {"git":{"commit":{"time":"2018-11-03T15:22:51Z","id":"caa2ef0"},"branch":"master","dirty":"true"}}
If you do not want to generate the default git info at all, simply remove the generateContent() call and start from an empty map.
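For illustration, here is a minimal sketch of that variant. The class name, and the assumption that your build writes git.dirty and git.commit.message.short into git.properties, are mine rather than from the original answer; it publishes only the dirty flag and the short commit message under "git".

import java.util.LinkedHashMap;
import java.util.Map;

import org.springframework.boot.actuate.info.GitInfoContributor;
import org.springframework.boot.actuate.info.Info;
import org.springframework.boot.info.GitProperties;
import org.springframework.stereotype.Component;

@Component
public class MinimalGitInfoContributor extends GitInfoContributor {

    public MinimalGitInfoContributor(GitProperties properties) {
        super(properties);
    }

    @Override
    public void contribute(Info.Builder builder) {
        // No generateContent() call: start from an empty map so only these keys appear
        Map<String, Object> map = new LinkedHashMap<>();
        map.put("dirty", getProperties().get("dirty"));
        map.put("message", getProperties().get("commit.message.short"));
        builder.withDetail("git", map);
    }
}

Because the subclass is itself a GitInfoContributor bean, it should take the place of the auto-configured contributor rather than being added alongside it (the default one backs off when you define your own).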
I'm building a Quarkus app that handles HTTP requests with RESTEasy and calls another API with the REST client. I need to propagate a header and add another one on the fly, so I added a class that implements ClientHeadersFactory.
Here's the code:
@ApplicationScoped
public abstract class MicroServicesHeaderHandler implements ClientHeadersFactory {

    @Inject
    MicroServicesConfig config;

    @Override
    public MultivaluedMap<String, String> update(MultivaluedMap<String, String> incomingHeaders,
                                                 MultivaluedMap<String, String> clientOutgoingHeaders) {
        // Will be merged with outgoing headers
        return new MultivaluedHashMap<>() {{
            put("Authorization", Collections.singletonList("Bearer " + config.getServices().get(getServiceName()).getAccessToken()));
            put("passport", Collections.singletonList(incomingHeaders.getFirst("passport")));
        }};
    }

    protected abstract String getServiceName();
}
My issue is that the injection of the config doesn't work. I tried both @Inject and @Context, as mentioned in the javadoc of ClientHeadersFactory. I also tried making the class non-abstract, but it doesn't change anything.
MicroServicesConfig is a @Startup bean because it needs to be initialized before Quarkus.run() is called (it's required to handle requests); otherwise hot reload doesn't work anymore.
Here's the code FYI:
@Getter
@Startup
@ApplicationScoped
public final class MicroServicesConfig {

    private final Map<String, MicroService> services;

    MicroServicesConfig(AKV akv, ABS abs) {
        // some code to retrieve an encrypted file from a secure storage, decrypt it and initialize the map out of it
    }
}
It appears to be an issue with ClientHeadersFactory, because if I inject my bean in my main class (@QuarkusMain), it works. I'm then able to assign the map to a public static field that I can access from my HeaderHandler as Application.myPublicStaticMap, but that's ugly, so I would really prefer to avoid it.
I've searched online and saw several people having the same issue, but according to this blog post, and this other one, it should work as of Quarkus 1.3 and MicroProfile 3.3 (REST Client 1.4), and I'm using Quarkus 1.5.2.
Even the example in the second link doesn't work for me with the injection of UriInfo, so the issue doesn't come from the bean I'm trying to inject.
I've been struggling with this for weeks and I'd really like to get rid of my workaround now.
I'm probably just missing something but it's driving me crazy.
Thanks in advance for your help.
This issue has finally been solved in Quarkus 1.8.
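For reference, a minimal sketch of what the factory can look like once injection works (Quarkus 1.8+). The concrete class name, the hard-coded "other-service" key, and attaching it with @RegisterClientHeaders are illustrative assumptions, not part of the original question:

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.ws.rs.core.MultivaluedHashMap;
import javax.ws.rs.core.MultivaluedMap;

import org.eclipse.microprofile.rest.client.ext.ClientHeadersFactory;

@ApplicationScoped
public class OtherServiceHeaderHandler implements ClientHeadersFactory {

    @Inject
    MicroServicesConfig config;

    @Override
    public MultivaluedMap<String, String> update(MultivaluedMap<String, String> incomingHeaders,
                                                 MultivaluedMap<String, String> clientOutgoingHeaders) {
        MultivaluedMap<String, String> headers = new MultivaluedHashMap<>();
        // "other-service" is a placeholder for whatever service this REST client targets
        headers.putSingle("Authorization",
                "Bearer " + config.getServices().get("other-service").getAccessToken());
        headers.putSingle("passport", incomingHeaders.getFirst("passport"));
        return headers;
    }
}

The factory is then attached to the REST client interface with @RegisterClientHeaders(OtherServiceHeaderHandler.class).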
I'm trying to set up a project with two data sources, one is MongoDB and the other is Postgres. I have repositories for each data source in different packages and I annotated my main class as follows:
@Import({MongoDBConfiguration.class, PostgresDBConfiguration.class})
@SpringBootApplication(exclude = {
        MongoRepositoriesAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class
})
public class TemporaryRunner implements CommandLineRunner {
    ...
}
MongoDBConfiguration:
@Configuration
@EnableMongoRepositories(basePackages = {
        "com.example.datastore.mongo",
        "com.atlassian.connect.spring"})
public class MongoDBConfiguration {
    ...
}
PostgresDBConfiguration:
@Configuration
@EnableJpaRepositories(basePackages = {
        "com.example.datastore.postgres"
})
public class PostgresDBConfiguration {
    ...
}
And even though I specified the base packages as described in the documentation, I still get these messages in the console:
13:10:44.238 [main] [] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Multiple Spring Data modules found, entering strict repository configuration mode!
13:10:44.266 [main] [] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data MongoDB - Could not safely identify store assignment for repository candidate interface com.atlassian.connect.spring.AtlassianHostRepository.
I managed to solve this issue for all my own repositories by using MongoRepository and JpaRepository, but AtlassianHostRepository comes from an external lib and is a regular CrudRepository (which totally makes sense, because the consumer of the lib can decide what type of DB they would like to use). It looks as if the basePackages I specified are completely ignored: even though I listed the com.atlassian.connect.spring package only in @EnableMongoRepositories, Spring Data somehow can't figure out which data module should be used.
Am I doing something wrong? Is there any other way I could tell Spring Data to use Mongo for AtlassianHostRepository without changing the AtlassianHostRepository class itself?
The only working solution I found was to let Spring Data ignore AtlassianHostRepository (because it couldn't figure out which data source to use), then create a separate configuration for it and simply build the repository by hand:
@Configuration
@Import({MongoDBConfiguration.class})
public class AtlassianHostRepositoryConfiguration {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public AtlassianHostRepositoryConfiguration(final MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Bean
    public AtlassianHostRepository atlassianHostRepository() {
        RepositoryFactorySupport factory = new MongoRepositoryFactory(mongoTemplate);
        return factory.getRepository(AtlassianHostRepository.class);
    }
}
This solution works fine for a small or limited number of repositories used from a library. It would be rather cumbersome to create all the repositories by hand if there were more of them, but after reading the source code of spring-data I see no way to make it work with basePackages as stated in the documentation (I may be wrong, though).
I am attempting to implement a filter in a Micronaut microservice, using the example code documented in Section 6.18 of the documentation:
https://docs.micronaut.io/latest/guide/index.html#filters
I have a HelloWorld service that is essentially the same as the one provided in the documentation, with a controller mapped to "/hello" (as documented). I am also using the same TraceService and trace filter provided in Section 6.18. I am compiling and running the server without problems.
Unfortunately, the filter is not being engaged when I test the microservice.
I am pretty sure that something is missing in my code, but as I said I am using the same code that is in the example:
TraceService Class
import io.micronaut.http.HttpRequest;
import io.reactivex.Flowable;
import io.reactivex.schedulers.Schedulers;
import org.slf4j.*;

import javax.inject.Singleton;

@Singleton
public class TraceService {

    private static final Logger LOG = LoggerFactory.getLogger(TraceService.class);

    Flowable<Boolean> trace(HttpRequest<?> request) {
        System.out.println("TRACE ENGAGED!");
        return Flowable.fromCallable(() -> {
            if (LOG.isDebugEnabled()) {
                LOG.debug("Tracing request: " + request.getUri());
            }
            // trace logic here, potentially performing I/O
            return true;
        }).subscribeOn(Schedulers.io());
    }
}
Trace Filter
import io.micronaut.http.*;
import io.micronaut.http.annotation.Filter;
import io.micronaut.http.filter.*;
import org.reactivestreams.Publisher;

@Filter("/hello/**")
public class TraceFilter implements HttpServerFilter {

    private final TraceService traceService;

    public TraceFilter(TraceService traceService) {
        System.out.println("Filter created!");
        this.traceService = traceService;
    }

    @Override
    public Publisher<MutableHttpResponse<?>> doFilter(HttpRequest<?> request, ServerFilterChain chain) {
        System.out.println("Filter engaged!");
        return traceService.trace(request)
                .switchMap(aBoolean -> chain.proceed(request))
                .doOnNext(res -> res.getHeaders().add("X-Trace-Enabled", "true"));
    }
}
The Controller
import io.micronaut.http.annotation.*;

@Controller("/hello")
public class HelloController {

    @Get("/")
    public String index() {
        return "Hello World";
    }
}
Note that the controller uses code from Section 2.2 of the documentation:
https://docs.micronaut.io/latest/guide/index.html#creatingServer
I did a number of things to try to see what was happening with the filter, including putting little printouts in strategic parts of the service and the filter. These printouts never print, which tells me that the filter is not being created or used by Micronaut.
Clearly I am missing something. I suspect there is something I need to do to get the system to engage the filter. Unfortunately, the documentation just explains how to write the filter, not how to use it in the microservice. Furthermore, there don't appear to be any complete code examples that show how to make the request pipeline utilize the filter (maybe there is an annotation I need to add to the controller?).
Could someone tell me what I am missing? How do I get the filter to work? At the very least, could someone provide a complete example of how to create the filter and use it in an actual microservice?
Problem solved.
It actually helps a great deal if one puts the filter and service files in the right place. It was late when I made the files and I put them in the test area, not the development area. Once placed in the right place, the filter was properly injected into the microservice.
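For anyone hitting the same thing: in a standard Maven/Gradle layout (package name assumed here), that means the filter and service need to live under src/main/java, not src/test/java, e.g.:

src/main/java/example/TraceFilter.java      <-- scanned and injected at runtime
src/main/java/example/TraceService.java
src/test/java/...                           <-- test-only sources, not part of the running app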
Sorry for the waste of space here, folks. Is there any way a poster can delete an embarrassing post?
I want to expose data from a database as RESTful APIs in a Spring (Spring Boot) application. Spring Data REST appears to be an exact fit for purpose for this activity.
This database is read-only for my application's needs, but the default exposes all the HTTP methods. Is there a configuration I can use to restrict (in fact, prevent) the other methods from being exposed?
From the Spring docs on Hiding repository CRUD methods:
16.2.3. Hiding repository CRUD methods
If you don't want to expose a save or delete method on your CrudRepository, you can use the @RestResource(exported = false) setting by overriding the method you want to turn off and placing the annotation on the overridden version. For example, to prevent HTTP users from invoking the delete methods of CrudRepository, override all of them and add the annotation to the overridden methods.
@RepositoryRestResource(path = "people", rel = "people")
interface PersonRepository extends CrudRepository<Person, Long> {

    @Override
    @RestResource(exported = false)
    void delete(Long id);

    @Override
    @RestResource(exported = false)
    void delete(Person entity);
}
It is important that you override both delete methods, as the exporter currently uses a somewhat naive algorithm for determining which CRUD method to use in the interest of faster runtime performance. It's not currently possible to turn off the version of delete which takes an ID but leave exported the version that takes an entity instance. For the time being, you can either export the delete methods or not. If you want to turn them off, just keep in mind you have to annotate both versions with exported = false.
As of early 2018, there is now the ability to expose only repository methods explicitly declared for exposure (DATAREST-1176).
See RepositoryRestConfiguration
An "Export false at Type level does not allow overriding with export true at Method level" ticket (DATAREST-1034) was opened, but it was closed as a duplicate of DATAREST-1176. Oliver Gierke stated:
I'll resolve this as fixed against the version of DATAREST-1176 for now but feel free to reopen in case there's anything else you need.
They are not exact duplicates, and the functionality described in 1034 would have been more user-friendly, but there are at least some options now.
By default, Spring Data REST (as auto-configured by Spring Boot) exposes all repository methods. You can set that to false:
config.setExposeRepositoryMethodsByDefault(false);
For more information, you can refer to org.springframework.data.rest.core.config.RepositoryRestConfiguration.
Sample code snippet to do this:
@Configuration
public class ApplicationRepositoryConfig implements RepositoryRestConfigurer {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config, CorsRegistry cors) {
        ..........
        config.setExposeRepositoryMethodsByDefault(false);
    }
}
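With repository methods hidden by default, you can then re-export just the reads you want. A hedged sketch (the Person entity and repository names are assumptions, reusing the naming from the docs excerpt above):

import java.util.Optional;

import org.springframework.data.repository.CrudRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
import org.springframework.data.rest.core.annotation.RestResource;

@RepositoryRestResource(path = "people", rel = "people")
public interface PersonRepository extends CrudRepository<Person, Long> {

    // Only these two methods are exposed over REST; everything else stays hidden
    // because exposeRepositoryMethodsByDefault is false.
    @Override
    @RestResource(exported = true)
    Optional<Person> findById(Long id);

    @Override
    @RestResource(exported = true)
    Iterable<Person> findAll();
}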
Since Spring Data REST 3.1, we can configure exposure per HTTP method. I used the following snippet to disable exposure of PUT, PATCH, POST and DELETE methods for items and collections:
@Component
public class SpringDataRestCustomization implements RepositoryRestConfigurer {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config, CorsRegistry cors) {
        ExposureConfiguration exposureConfiguration = config.getExposureConfiguration();
        exposureConfiguration.withItemExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.PUT)
                        .disable(HttpMethod.PATCH).disable(HttpMethod.POST).disable(HttpMethod.DELETE))
                .withCollectionExposure((metadata, httpMethods) -> httpMethods.disable(HttpMethod.PUT)
                        .disable(HttpMethod.PATCH).disable(HttpMethod.POST).disable(HttpMethod.DELETE));
    }
}
I would like to build a Spring application where new components can be added easily and without much configuration. For example: you have different kinds of documents, and these documents should be exportable into different file formats.
To make this functionality easy to maintain, it should (basically) work the following way:
Someone programs the file format exporter
He/she writes a component that checks whether the file format exporter is licensed (based on Spring Conditions). If the exporter is licensed, a specialized bean is injected into the application context.
The "whole rest" works dynamically based on the injected beans; nothing needs to be touched in order to display it on the GUI, etc.
I pictured it the following way:
@Component
public class ExcelExporter implements Condition {

    @PostConstruct
    public void init() {
        excelExporter();
    }

    @Bean
    public Exporter excelExporter() {
        Exporter exporter = new ExcelExporter();
        return exporter;
    }

    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return true;
    }
}
In order to work with those exporters (display them, etc.) I need to get all of them. I tried this:
Map<String, Exporter> exporter =BeanFactoryUtils.beansOfTypeIncludingAncestors(appContext, Exporter.class, true, true);
Unfortunately this does not work (0 beans are returned). I am fairly new to this; would anyone mind telling me how this is properly done in Spring? Maybe there is a better solution to my problem than my approach?
You can get all instances of a given bean type in a Map effortlessly, since it's a built-in Spring feature.
Simply autowire your map, and all those beans will be injected, using the bean ID as the key.
@Autowired
Map<String, Exporter> exportersMap;
If you need something more sophisticated, such as a specific Map implementation or a custom key, consider defining your own ExporterMap, as follows:
@Component
class ExporterMap implements Map {

    @Autowired
    private Set<Exporter> availableExporters;

    //your stuff here, including init if required with @PostConstruct
}
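For completeness, a hedged usage sketch (the ExportMenu class and its method names are assumptions) showing how any other component can consume the autowired map of exporters:

import java.util.Map;
import java.util.Set;

import org.springframework.stereotype.Service;

@Service
public class ExportMenu {

    private final Map<String, Exporter> exporters;

    // Spring injects every Exporter bean, keyed by bean name
    public ExportMenu(Map<String, Exporter> exporters) {
        this.exporters = exporters;
    }

    public Set<String> availableFormats() {
        return exporters.keySet();
    }

    public Exporter exporterFor(String beanName) {
        return exporters.get(beanName);
    }
}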