Spring Boot REST with Hadoop HBase

I'm looking to build a simple RESTful API on top of HBase.
I looked at Python's HappyBase, but my cluster is Kerberized, so now I'm looking at Spring.
I have previously built a simple REST API with Solr Cloud and Spring Boot.
Is it possible to do the same with HBase?
I have no idea whether I should use the Spring Boot 'Yarn App' guide
=> https://spring.io/guides/gs/yarn-basic/
or Spring for Apache Hadoop.
=> https://projects.spring.io/spring-hadoop/
I just want a really simple API.
Thanks for the help.

I wrote a simple demo project that uses HBase in a Spring Boot RESTful application without any XML.
The demo mainly depends on spring-data-hadoop and hbase-client.
Gradle dependencies:
compile('org.springframework.boot:spring-boot-starter-data-rest')
compile('org.springframework.boot:spring-boot-starter-web')
compile('org.springframework.data:spring-data-hadoop:2.5.0.RELEASE')
compile('org.apache.hbase:hbase-client:1.3.1') {
    exclude group: 'log4j', module: 'log4j'
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
    exclude group: 'javax.servlet', module: 'servlet-api'
}
compile('org.springframework.boot:spring-boot-configuration-processor')
providedRuntime('org.springframework.boot:spring-boot-starter-tomcat')
Configure the HBase connection parameters in Spring Boot's application.properties (no XML!):
spring.data.hbase.zkQuorum=192.168.0.109:2181
spring.data.hbase.zkBasePath=/hbase
spring.data.hbase.rootDir=file:///home/hbase-1.2.2
HbaseProperties.java:
@ConfigurationProperties(prefix = "spring.data.hbase")
public class HbaseProperties {

    // Addresses of all registered ZK servers.
    private String zkQuorum;

    // Location of the HBase root directory.
    private String rootDir;

    // Root znode of this cluster in ZK.
    private String zkBasePath;

    // getters and setters...
}
HbaseConfig.java injects the configuration into an HbaseTemplate:
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.hadoop.hbase.HbaseTemplate;

@Configuration
@EnableConfigurationProperties(HbaseProperties.class)
public class HbaseConfig {

    @Autowired
    private HbaseProperties hbaseProperties;

    @Bean
    public HbaseTemplate hbaseTemplate() {
        org.apache.hadoop.conf.Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.quorum", this.hbaseProperties.getZkQuorum());
        configuration.set("hbase.rootdir", this.hbaseProperties.getRootDir());
        configuration.set("zookeeper.znode.parent", this.hbaseProperties.getZkBasePath());
        return new HbaseTemplate(configuration);
    }
}
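Since the question mentions a Kerberized cluster, which the answer above doesn't cover: as a hedged sketch, the hbaseTemplate() bean could additionally perform a keytab login before the template is created. The principal, the keytab path and the exact set of principal properties below are assumptions and depend on the cluster; UserGroupInformation comes from org.apache.hadoop.security.

// Additional lines inside hbaseTemplate(), before "return new HbaseTemplate(configuration);".
// Note: loginUserFromKeytab throws IOException, so the bean method needs a throws clause or a try/catch.
configuration.set("hadoop.security.authentication", "kerberos");
configuration.set("hbase.security.authentication", "kerberos");
// Cluster-specific principals (e.g. hbase.master.kerberos.principal) usually have to be set as well.
UserGroupInformation.setConfiguration(configuration);
UserGroupInformation.loginUserFromKeytab("client@EXAMPLE.COM",        // assumed principal
        "/etc/security/keytabs/client.keytab");                       // assumed keytab path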
Service class; we can now use the configured HbaseTemplate:
import javax.annotation.PostConstruct;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.hadoop.hbase.HbaseTemplate;
import org.springframework.stereotype.Service;
import com.zql.hbasedemo.vo.Quote;

@Service
public class FeedService {

    @Autowired
    private HbaseTemplate hbaseTemplate;

    @PostConstruct
    public void test() {
        Quote quote = new Quote();
        quote.setEventType("ft");
        quote.setHandicap("4");
        quote.setMarket("OU");
        quote.setMatchId("27350208");
        quote.setSelection("OVER");
        quote.setPrice("1.93");
        saveQuote(quote);
    }

    public Quote saveQuote(Quote quote) {
        hbaseTemplate.put("quotes", quote.getMatchId(), "data", quote.getMarket() + ":" + quote.getSelection(),
                quote.getPrice().getBytes());
        return quote;
    }
}
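HbaseTemplate can also read rows back via a RowMapper. Below is a hedged sketch of a matching read method for the service above; the column layout mirrors saveQuote(), and org.apache.hadoop.hbase.util.Bytes would need to be imported. It is an assumption about how you might want to query the data, not part of the original demo.

public String getPrice(String matchId, String market, String selection) {
    // Reads back the single cell written by saveQuote(); returns null when the cell is absent.
    return hbaseTemplate.get("quotes", matchId, "data", market + ":" + selection,
            (result, rowNum) -> {
                byte[] value = result.getValue(Bytes.toBytes("data"),
                        Bytes.toBytes(market + ":" + selection));
                return value == null ? null : Bytes.toString(value);
            });
}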
REST controller:
@RestController
public class FeedController {

    @Autowired
    private FeedService feedService;

    @PostMapping(value = "/feed/quote", consumes = "application/json", produces = "application/json")
    public ResponseEntity<Quote> saveQuote(@RequestBody Quote quote) {
        Quote result = feedService.saveQuote(quote);
        return new ResponseEntity<>(result, new HttpHeaders(), HttpStatus.OK);
    }
}
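A matching read endpoint in the controller could look like the following hedged sketch. It assumes the getPrice(...) service method sketched above and uses @GetMapping, @PathVariable and @RequestParam from spring-web; none of it is part of the original demo.

@GetMapping(value = "/feed/quote/{matchId}", produces = "application/json")
public ResponseEntity<String> getQuote(@PathVariable String matchId,
                                       @RequestParam String market,
                                       @RequestParam String selection) {
    // Delegates to the hypothetical getPrice(...) method sketched in the service above.
    String price = feedService.getPrice(matchId, market, selection);
    return price == null
            ? new ResponseEntity<>(HttpStatus.NOT_FOUND)
            : new ResponseEntity<>(price, HttpStatus.OK);
}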

Related

spring-boot-starter-jdbc DAO repository object not injected in working legacy webservice

I am new to the Spring/Boot world and have a working JAX-WS based web service declared in a Spring Boot project. It is started and configured via web.xml and sun-jaxws.xml, so no beans are declared there, only endpoint declarations, servlet definitions and mappings.
I now want to save the items I receive in the web service into the MySQL database using spring-boot-starter-jdbc, which is not working:
I can't achieve this because the repository is not injected into the web service implementation.
I followed all the steps in the other question, but still can't get it to work.
Normally, declaring the datasource parameters in application.properties and annotating the classes with @WebService and @Repository should suffice to get the repository injected into the web service class. What am I missing?
Here are the details of the steps I followed:
The web service implementation is in package X, and I created a @SpringBootApplication in order to use a MySQL datasource declared in application.properties.
So I annotated the web service as a @Component and the data access repository with @Repository and @Component.
Parent version: spring-boot-starter-parent 1.5.10.RELEASE
Web service implementation:
BServiceManager.java
package X;
....

@Component
@WebService(name = "***", ***)
@BindingType("http://schemas.xmlsoap.org/wsdl/soap/http")
@XmlSeeAlso({
    packagesxxx.class,
    ....
})
public class BServiceManager implements xxxxx {
    ....
    @Autowired
    private ItemRepository irepo;
    ....
    @WebMethod(**)
    @WebResult(***)
    public ResponseDataInfo sendInfo( ) {
        ....
        irepo.saveitem(item)
        ....
    }
}
ItemRepository.java
package Y.Z;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.dao.DataAccessException;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Repository;

@Component
@Repository
public class ItemRepository {

    @Autowired
    private JdbcTemplate jdbcTemplate;
    ....
    public boolean saveitem(Item item) {
        ....
    }
}
Item.java
package Y.Z;
public class Item {
....
}
GetItemsApplication.java
package Y;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

//@ComponentScan(basePackages={"Y", "Y.Z", "X"})
@SpringBootApplication
public class GetItemsApplication {
    ....
    public static void main(String[] args) {
        SpringApplication.run(GetItemsApplication.class, args);
        log.info("--Spring Boot inits done--");
    }
}
application.properties
spring.data.jpa.repositories.enabled=false
spring.data.jdbc.repositories.enabled=true
# MySQL properties
spring.datasource.url=****
spring.datasource.username=****
spring.datasource.password=****
....
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
logging.level.org.springframework.jdbc.core.JdbcTemplate=debug
NB: even having a DataSource bean does not help:
File: MDataSourceConfig.java
package Y.Z;

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

//@EnableJdbcRepositories for Spring
@Configuration
public class MDataSourceConfig {

    @Bean
    public DataSource dataSource() {
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        return dataSourceBuilder.build();
    }

    @Bean
    public JdbcTemplate getJdbcTemplate() {
        return new JdbcTemplate(dataSource());
    }
}
Your web service doesn't seem to be created by Spring.
Therefore Spring has no control over its dependencies, so you have to obtain the dependency programmatically.
There are many ways to do this.
Easy, but not very elegant, and it relies on global state, which can cause problems, especially with tests: https://stackoverflow.com/a/18486178/66686
More elegant, but requires weaving: Spring autowiring using @Configurable
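Another option, offered here only as a hedged sketch rather than part of the answer above: if the JAX-WS endpoint runs inside the same web application as the Spring context, extending org.springframework.web.context.support.SpringBeanAutowiringSupport lets Spring populate the @Autowired fields even though JAX-WS, not Spring, instantiates the endpoint. The @WebService name and the method shown are placeholders matching the question's code.

package X;

// imports of ItemRepository and Item from package Y.Z omitted
import javax.jws.WebMethod;
import javax.jws.WebService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.context.support.SpringBeanAutowiringSupport;

// The SpringBeanAutowiringSupport constructor asks the current web application
// context to process @Autowired fields on this instance.
@WebService(name = "BService")
public class BServiceManager extends SpringBeanAutowiringSupport {

    @Autowired
    private ItemRepository irepo;

    @WebMethod
    public boolean sendInfo(Item item) {
        // the repository is injected and usable here
        return irepo.saveitem(item);
    }
}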

How Do I Unit Test A Jersey REST API Without Running A Server?

I am working with a REST API that is using Jersey with Spring Boot (so no specific application context in XML or Java) and Spring Data JPA.
I want to write unit tests for the GET and POST endpoints; however, I don't want to start a web server because it takes too long.
If I use JerseyTest my Spring Beans don't get loaded into the context.
public class InMemoryContainerPackageTest extends JerseyTestNg.ContainerPerClassTest {

    @Override
    protected TestContainerFactory getTestContainerFactory() {
        return new InMemoryTestContainerFactory();
    }

    @Override
    public Application configure() {
        ResourceConfig config = new ResourceConfig()
                .register(SpringLifecycleListener.class)
                .register(RequestContextFilter.class)
                .register(this)
                .register(MyController.class)
                .packages("com.my.service");
        return config;
    }
}
If I use SpringBootTest, it starts up a web server, which takes about 30 seconds; ideally I want all my tests to complete in under 5 seconds, otherwise developers won't run them.
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class TestNGClass1 extends AbstractTestNGSpringContextTests {
I don't think MockMvc works with Jersey endpoints.
If I use JerseyTest my Spring Beans don't get loaded into the context.
What you can do is set the property "contextConfig" in your ResourceConfig. The value will be a Spring ApplicationContext instance. So if you are using Java configuration, you would just use an AnnotationConfigApplicationContext.
@Override
public ResourceConfig configure() {
    return new ResourceConfig()
            .register(TestResource.class)
            .property("contextConfig",
                    new AnnotationConfigApplicationContext(SpringConfig.class));
}
Here, SpringConfig is an arbitrary Spring @Configuration class. Below is a complete example.
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.test.JerseyTest;
import org.glassfish.jersey.test.inmemory.InMemoryTestContainerFactory;
import org.glassfish.jersey.test.spi.TestContainerFactory;
import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import static org.assertj.core.api.Assertions.assertThat;

public class SpringTest extends JerseyTest {

    public static class MessageService {
        public String getMessage() {
            return "Hello World";
        }
    }

    @Configuration
    public static class SpringConfig {

        @Bean
        public MessageService service() {
            return new MessageService();
        }
    }

    @Path("test")
    public static class TestResource {

        @Autowired
        private MessageService service;

        @GET
        public String get() {
            return service.getMessage();
        }
    }

    @Override
    public ResourceConfig configure() {
        return new ResourceConfig()
                .register(TestResource.class)
                .property("contextConfig",
                        new AnnotationConfigApplicationContext(SpringConfig.class));
    }

    @Override
    public TestContainerFactory getTestContainerFactory() {
        return new InMemoryTestContainerFactory();
    }

    @Test
    public void testIt() {
        Response res = target("test")
                .request()
                .get();
        String msg = res.readEntity(String.class);
        System.out.println(msg);
        assertThat(msg).isEqualTo("Hello World");
    }
}
As far as JPA goes, you are going to have to configure it yourself. When using Spring Boot, all of the JPA bootstrapping is taken care of; if you use the Jersey Test Framework, you are bypassing all of that Spring Boot configuration.
It's really not that hard to configure JPA yourself. It basically consists of configuring a DataSource, a TransactionManager, a JpaVendorAdapter, and a LocalContainerEntityManagerFactoryBean. And to enable the Spring Data repositories, you just need to add the @EnableJpaRepositories annotation.
Have a look at this complete example configuration.
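For orientation, here is a hedged sketch of what such a manual JPA configuration might look like; the package names, the in-memory H2 database and the Hibernate vendor adapter are assumptions for illustration, not taken from the project above.

import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = "com.my.service.repository") // assumed repository package
public class JpaTestConfig {

    @Bean
    public DataSource dataSource() {
        // In-memory database purely for tests.
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(true);

        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource);
        emf.setJpaVendorAdapter(vendorAdapter);
        emf.setPackagesToScan("com.my.service.domain"); // assumed entity package
        return emf;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}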
Another thing to be wary of is that when we use the Jersey Test Framework, we will not have the test scoped transactions (and rollbacks) that you will get when working with Spring Test. So when you write your tests, you need to take this into consideration.

Unable to use two Neo4j instances with Spring Boot / Spring Data Neo4j

Expected Behavior
Use two Neo4j instances with Spring Boot and Spring Data Neo4j.
Current Behavior
Only one Neo4j instance can be used; I am unable to use the two sets of repositories together.
Steps to Reproduce (for bugs)
1. Run two Neo4j instances.
2. Create data source configurations for both Neo4j instances using Spring Boot.
3. Use a repository to access the node entities.
4. It will throw an error.
Context
Consider that I am running a library and renting books to other users. If a user is renting a book from me, the same node details will be present in their repository, and I will allow them to edit the node entity through my application (like adding keywords, adding highlights about the books, etc.).
So the node details will be the same in both repositories.
Below are the application.properties entries for both Neo4j instances.
My Neo4j Repository Details
spring.data.neo4j.uri=bolt://localhost:7687
spring.data.neo4j.username=neo4j
spring.data.neo4j.password=neo4j
Rental User's Neo4j Repository Details
(accessing an instance running on another machine)
rental.data.neo4j.uri=bolt://...:7687
rental.data.neo4j.username=neo4j
rental.data.neo4j.password=neo4j
Below is the rental user's Neo4j configuration:
@Configuration
@EnableNeo4jRepositories(basePackages = "com.metadata.dao.rentallibrary", sessionFactoryRef = "rentalSessionFactory", transactionManagerRef = "rentalUsertransactionManager")
@EnableTransactionManagement
@EntityScan("com.metadata.dao")
public class rentalUserNeo4jConfiguration {

    @Value("${rental.data.neo4j.uri}")
    private String url;

    @Value("${rental.data.neo4j.username}")
    private String userName;

    @Value("${rental.data.neo4j.password}")
    private String password;

    @Bean(name = "rentalSessionFactory")
    public SessionFactory rentalUserSessionFactory() {
        return new SessionFactory(rentalNeo4jconfiguration(), "com.metadata.dao.rentallibrary.entity");
    }

    @Bean
    public org.neo4j.ogm.config.Configuration rentalNeo4jconfiguration() {
        org.neo4j.ogm.config.Configuration configuration = new org.neo4j.ogm.config.Configuration.Builder()
                .uri(url)
                .credentials(userName, password)
                .build();
        return configuration;
    }

    @Bean
    public Neo4jTransactionManager rentalUsertransactionManager() {
        return new Neo4jTransactionManager(rentalUserSessionFactory());
    }
}
And below are my library's Neo4j configuration details:
@Configuration
@EnableNeo4jRepositories(basePackages = "com.metadata.dao.mylibrary", sessionFactoryRef = "myUserSessionFactory", transactionManagerRef = "myUserTransactionManager")
@EnableTransactionManagement
public class MyUserNeo4jConfiguration {

    @Value("${spring.data.neo4j.uri}")
    private String url;

    @Value("${spring.data.neo4j.username}")
    private String userName;

    @Value("${spring.data.neo4j.password}")
    private String password;

    @Bean(name = "myUserSessionFactory")
    @Primary
    public SessionFactory myUserSessionFactory() {
        return new SessionFactory(myUserconfiguration(), "com.metadata.dao.mylibrary.entity");
    }

    @Bean
    public org.neo4j.ogm.config.Configuration myUserconfiguration() {
        org.neo4j.ogm.config.Configuration configuration = new org.neo4j.ogm.config.Configuration.Builder()
                .uri(url)
                .credentials(userName, password)
                .build();
        return configuration;
    }

    @Bean
    public Neo4jTransactionManager myUserTransactionManager() {
        return new Neo4jTransactionManager(myUserSessionFactory());
    }
}
I can access both instances through the session factories (via qualifiers) and that works fine. But when I try to access the data through the repositories, I am facing the issue below.
Accessing through SessionFactory:

@Autowired
@Qualifier(value = "myUserSessionFactory")
SessionFactory myUserSessionFactory;

@Autowired
@Qualifier(value = "rentalUserSessionFactory")
SessionFactory rentalUserSessionFactory;
Below are the error details I get when trying to access the data through the repositories:
java.lang.IllegalArgumentException: Class class com.metadata.dao.Book is not a valid entity class. Please check the entity mapping.
at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:88) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:40) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.session.Neo4jSession.save(Neo4jSession.java:469) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_141]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_141]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_141]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_141]
at org.springframework.data.neo4j.transaction.SharedSessionCreator$SharedSessionInvocationHandler.invoke(SharedSessionCreator.java:131) ~[spring-data-neo4j-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at com.sun.proxy.$Proxy81.save(Unknown Source) ~[na:na]
org.neo4j.ogm.exception.core.TransactionManagerException: Transaction is not current for this thread
at org.neo4j.ogm.session.transaction.DefaultTransactionManager.rollback(DefaultTransactionManager.java:86) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.transaction.AbstractTransaction.rollback(AbstractTransaction.java:65) ~[neo4j-ogm-api-3.1.0.jar:3.1.0]
at org.neo4j.ogm.drivers.bolt.transaction.BoltTransaction.rollback(BoltTransaction.java:61) ~[neo4j-ogm-bolt-driver-3.1.0.jar:3.1.0]
at org.neo4j.ogm.transaction.AbstractTransaction.close(AbstractTransaction.java:144) ~[neo4j-ogm-api-3.1.0.jar:3.1.0]
at org.springframework.data.neo4j.transaction.Neo4jTransactionManager.doCleanupAfterCompletion(Neo4jTransactionManager.java:379) ~[spring-data-neo4j-5.0.8.RELEASE.jar:5.0.8.RELEASE]
Node Entity Name in dao.mylibrary.entity : Book
Node Entity Name in dao.rentallibrary.entity : RentedBook
Please let me know why this issue occurs when using the Neo4j repositories. Can't we use two Neo4j repositories with Spring Data Neo4j and Spring Boot? Or am I doing something wrong?
My Environment
OGM Version used: 3.1.0
Java Version used: 1.8
Neo4J Version used:3.2.3
Bolt Driver Version used (if applicable): 3.1.0
Operating System and Version: Windows
Please let me know if you need any additional information.
Update: This has been addressed in Spring Data Neo4j Lovelace RC1, and we wrote a small how-to: https://michael-simons.github.io/neo4j-sdn-ogm-tips/using_multiple_session_factories
Thanks for submitting this as GitHub issue #498.
It seems that the current Spring Data Neo4j version has a bug in propagating the different session factories to the repositories. In short: there is no way to make this work right now (in the way you can with Spring Data JPA, for example).
If you need (and want) repositories, I cannot help you right now. What works, however, is injecting the different session factories:
@Autowired
@Qualifier("myUserSessionFactory")
private SessionFactory myUserSessionFactory;

@Autowired
@Qualifier("rentalUserSessionFactory")
private SessionFactory rentalUserSessionFactory;
and then do something like
Map<String, Object> params = new HashMap<>();
params.put("name", "test");
ThingEntity t = this.myUserSessionFactory.openSession().queryForObject(
        ThingEntity.class,
        "MATCH (n:`Thing`) WHERE n.name = $name WITH n RETURN n", params);
Regardless of the bug in our code, I recommend the following configuration. For the primary beans ("myUserconfiguration") use one config class:
package gh.neo4jogm.gh498;

import gh.neo4jogm.gh498.domain1.ThingEntity;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jProperties;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;

@Configuration
@EnableNeo4jRepositories(
        basePackages = Domain1Config.BASE_PACKAGE,
        sessionFactoryRef = "myUserSessionFactory",
        transactionManagerRef = "myUserTransactionManager"
)
@EntityScan(basePackageClasses = ThingEntity.class)
class Domain1Config {

    static final String BASE_PACKAGE = "gh.neo4jogm.gh498.domain1";

    @Primary
    @Bean
    @ConfigurationProperties("spring.data.neo4j")
    public Neo4jProperties myNeo4jProperties() {
        return new Neo4jProperties();
    }

    @Primary
    @Bean
    public org.neo4j.ogm.config.Configuration myUserconfiguration() {
        return myNeo4jProperties().createConfiguration();
    }

    @Primary
    @Bean
    public SessionFactory myUserSessionFactory() {
        return new SessionFactory(myUserconfiguration(), BASE_PACKAGE);
    }

    @Bean
    public Neo4jTransactionManager myUserTransactionManager() {
        return new Neo4jTransactionManager(myUserSessionFactory());
    }
}
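For completeness, a hedged sketch of what a domain entity such as ThingEntity might look like with OGM 3.1; the fields are assumptions for illustration, and what matters is that the class lives in the package scanned only by myUserSessionFactory.

package gh.neo4jogm.gh498.domain1;

import org.neo4j.ogm.annotation.GeneratedValue;
import org.neo4j.ogm.annotation.Id;
import org.neo4j.ogm.annotation.NodeEntity;

// Lives in the domain1 package, so only myUserSessionFactory maps it.
@NodeEntity(label = "Thing")
public class ThingEntity {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    public Long getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}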
The basic idea is to use @ConfigurationProperties to map the default properties onto an instance of Neo4jProperties (our properties class) and then use it to create the actual OGM configuration, just as the auto-configuration does.
Then do the same for the other session factory:
package gh.neo4jogm.gh498;

import gh.neo4jogm.gh498.domain2.OtherThingEntity;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jProperties;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;

import static gh.neo4jogm.gh498.Domain2Config.BASE_PACKAGE;

@Configuration
@EnableNeo4jRepositories(
        basePackages = BASE_PACKAGE,
        sessionFactoryRef = "rentalUserSessionFactory",
        transactionManagerRef = "rentalUsertransactionManager"
)
@EntityScan(basePackageClasses = OtherThingEntity.class)
class Domain2Config {

    static final String BASE_PACKAGE = "gh.neo4jogm.gh498.domain2";

    @Bean
    @ConfigurationProperties("rental.data.neo4j")
    public Neo4jProperties rentalNeo4jProperties() {
        return new Neo4jProperties();
    }

    @Bean
    public org.neo4j.ogm.config.Configuration rentalNeo4jconfiguration() {
        return rentalNeo4jProperties().createConfiguration();
    }

    @Bean
    public SessionFactory rentalUserSessionFactory() {
        return new SessionFactory(rentalNeo4jconfiguration(), BASE_PACKAGE);
    }

    @Bean
    public Neo4jTransactionManager rentalUsertransactionManager() {
        return new Neo4jTransactionManager(rentalUserSessionFactory());
    }
}
Here you map all properties prefixed with rental.data.neo4j to another properties instance.

Boilerplate project with spring boot 2.0.0 not exposing custom actuator endpoints

I am trying to upgrade a Spring Boot boilerplate project to Spring Boot 2.0.0.
I followed the official migration guide (this and this), but I am unable to expose custom actuator endpoints.
I tested with this dummy endpoint:
import org.springframework.boot.actuate.endpoint.annotation.Endpoint;
import org.springframework.boot.actuate.endpoint.annotation.ReadOperation;
import org.springframework.boot.actuate.endpoint.annotation.Selector;
import org.springframework.stereotype.Component;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Component
@Endpoint(id = "testing-user")
public class ActiveUsersEndpoint {

    private final Map<String, User> users = new HashMap<>();

    ActiveUsersEndpoint() {
        this.users.put("A", new User("Abcd"));
        this.users.put("E", new User("Fghi"));
        this.users.put("J", new User("Klmn"));
    }

    @ReadOperation
    public List<User> getAll() {
        return new ArrayList<>(this.users.values());
    }

    @ReadOperation
    public User getActiveUser(@Selector String user) {
        return this.users.get(user);
    }

    public static class User {

        private String name;

        User(String name) {
            this.name = name;
        }

        public String getName() {
            return this.name;
        }

        public void setName(String name) {
            this.name = name;
        }
    }
}
The endpoints work well when exposed directly from the child project, but they don't work when they are exposed from the parent boilerplate project, which is added as a dependency.
In my application.yml, I have added:
management:
  endpoints:
    web:
      base-path: /
      exposure:
        include: '*'
There aren't many resources available on this, and those that exist aren't helping.
Found the answer.
Instead of creating the bean with @Component, have a configuration class that creates the beans for all of your endpoints. For example, the configuration class might look like this:
@ManagementContextConfiguration
public class HealthConfiguration {

    @Bean
    public ActiveUsersEndpoint activeUsersEndpoint() {
        return new ActiveUsersEndpoint();
    }

    // Other endpoints if needed...
}
The important thing was to have a spring.factories file under META-INF in the resources directory (src/main/resources/META-INF/spring.factories).
The file points to the configuration class where you created the beans for all of your endpoints:
org.springframework.boot.actuate.autoconfigure.web.ManagementContextConfiguration=com.foo.bar.HealthConfiguration

Exporting Spring Boot Actuator Metrics (& Dropwizard Metrics) to Statsd

I'm trying to export all of the metrics that are visible at the /metrics endpoint to a StatsdMetricWriter.
I've got the following configuration class so far:
package com.tonyghita.metricsdriven.service.config;

import com.codahale.metrics.MetricRegistry;
import com.ryantenney.metrics.spring.config.annotation.EnableMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricReader;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.metrics.reader.MetricReader;
import org.springframework.boot.actuate.metrics.reader.MetricRegistryMetricReader;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(MetricsConfig.class);

    @Value("${statsd.host:localhost}")
    private String host = "localhost";

    @Value("${statsd.port:8125}")
    private int port;

    @Autowired
    private MetricRegistry metricRegistry;

    @Bean
    @ExportMetricReader
    public MetricReader metricReader() {
        return new MetricRegistryMetricReader(metricRegistry);
    }

    @Bean
    @ExportMetricWriter
    public MetricWriter metricWriter() {
        LOGGER.info("Configuring StatsdMetricWriter to export to {}:{}", host, port);
        return new StatsdMetricWriter(host, port);
    }
}
This writes all of the metrics that I've added myself to Statsd, but I'd also like to send the system/JVM metrics that are visible at the /metrics endpoint.
What am I missing?
I had the same problem and found a solution here: https://github.com/tzolov/export-metrics-example
Just add a MetricsEndpointMetricReader to your config and everything available at the /metrics endpoint will be published to the StatsdMetricWriter.
Here is a complete example config for Spring Boot 1.3.x and Dropwizard metrics-jvm 3.1.x:
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.jvm.GarbageCollectorMetricSet;
import com.codahale.metrics.jvm.MemoryUsageGaugeSet;
import com.codahale.metrics.jvm.ThreadStatesGaugeSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.endpoint.MetricsEndpoint;
import org.springframework.boot.actuate.endpoint.MetricsEndpointMetricReader;
import org.springframework.boot.actuate.metrics.Metric;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.Delta;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MetricsConfiguration {

    @Bean
    public MetricRegistry metricRegistry() {
        final MetricRegistry metricRegistry = new MetricRegistry();
        metricRegistry.register("jvm.memory", new MemoryUsageGaugeSet());
        metricRegistry.register("jvm.thread-states", new ThreadStatesGaugeSet());
        metricRegistry.register("jvm.garbage-collector", new GarbageCollectorMetricSet());
        return metricRegistry;
    }

    /*
     * Reading all metrics that appear on the /metrics endpoint to expose them to metrics writer beans.
     */
    @Bean
    public MetricsEndpointMetricReader metricsEndpointMetricReader(final MetricsEndpoint metricsEndpoint) {
        return new MetricsEndpointMetricReader(metricsEndpoint);
    }

    @Bean
    @ConditionalOnProperty(prefix = "statsd", name = {"prefix", "host", "port"})
    @ExportMetricWriter
    public MetricWriter statsdMetricWriter(@Value("${statsd.prefix}") String statsdPrefix,
                                           @Value("${statsd.host}") String statsdHost,
                                           @Value("${statsd.port}") int statsdPort) {
        return new StatsdMetricWriter(statsdPrefix, statsdHost, statsdPort);
    }
}
From what I've seen in the Spring Boot code, only calls to CounterService and GaugeService implementations are forwarded to Dropwizard's MetricRegistry.
Therefore, as you already observed, only counter.* and gauge.* metrics from the /metrics endpoint will end up in Statsd.
System and JVM metrics are exposed through the custom SystemPublicMetrics class, which doesn't use the counter or gauge service.
I'm not sure if there is a simpler solution (maybe someone from the Spring team will comment), but one way to do it (not Spring Boot specific) would be to use a scheduled task that periodically writes system stats to the MetricRegistry.
To register JVM metrics you can use the JVM-related MetricSets supplied by the com.codahale.metrics.jvm library. You can just add the whole set without specifying whether they are gauges or counters.
Here is my example code where I register the JVM-related metrics:
@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {

    @Autowired
    private StatsdProperties statsdProperties;

    @Autowired
    private MetricsEndpoint metricsEndpoint;

    @Autowired
    private DataSourcePublicMetrics dataSourcePublicMetrics;

    @Bean
    @ExportMetricReader
    public MetricReader metricReader() {
        return new MetricRegistryMetricReader(metricRegistry());
    }

    public MetricRegistry metricRegistry() {
        final MetricRegistry metricRegistry = new MetricRegistry();
        // jvm metrics
        metricRegistry.register("jvm.gc", new GarbageCollectorMetricSet());
        metricRegistry.register("jvm.mem", new MemoryUsageGaugeSet());
        metricRegistry.register("jvm.thread-states", new ThreadStatesGaugeSet());
        return metricRegistry;
    }

    @Bean
    @ConditionalOnProperty(prefix = "metrics.writer.statsd", name = {"host", "port"})
    @ExportMetricWriter
    public MetricWriter statsdMetricWriter() {
        return new StatsdMetricWriter(
                statsdProperties.getPrefix(),
                statsdProperties.getHost(),
                statsdProperties.getPort()
        );
    }
}
Note: I am using Spring Boot version 1.3.0.M4.
Enjoy! (See the public metrics logged to the console as Dropwizard metrics.)
@Configuration
@EnableMetrics
@EnableScheduling
public class MetricsReporter extends MetricsConfigurerAdapter {

    @Autowired
    private SystemPublicMetrics systemPublicMetrics;

    private MetricRegistry metricRegistry;

    @Scheduled(fixedDelay = 5000)
    void exportPublicMetrics() {
        for (Metric<?> metric : systemPublicMetrics.metrics()) {
            Counter counter = metricRegistry.counter(metric.getName());
            counter.dec(counter.getCount());
            counter.inc(Double.valueOf(metric.getValue().toString()).longValue());
        }
    }

    @Override
    public void configureReporters(MetricRegistry metricRegistry) {
        this.metricRegistry = metricRegistry;
        ConsoleReporter.forRegistry(metricRegistry).build().start(10, TimeUnit.SECONDS);
    }
}
