How to enable Cassandra CqlSession metrics in a Spring Boot application - spring-boot

I want to enable Cassandra CqlSession metrics. When trying to register the CqlSession metrics, it returns Optional.empty() in my Spring Boot application. I am using the DataStax Java driver 4.6 for Cassandra.
Here is my code:
@Autowired
private CqlSession cqlsession;
MetricRegistry metricRegistry = cqlsession.getMetrics()
.orElseThrow(() -> new IllegalArgumentException("not able to get metrics"))
.getRegistry();
This throws an IllegalArgumentException.
When following the official DataStax driver docs (https://docs.datastax.com/en/developer/java-driver/4.6/manual/core/metrics/#configuration) and adding the same set of configuration files to the project, the problem is not resolved.

I've solved this with the following approach:
The configuration class
import static com.datastax.oss.driver.api.core.config.DefaultDriverOption.METRICS_NODE_ENABLED;
import static com.datastax.oss.driver.api.core.config.DefaultDriverOption.METRICS_SESSION_ENABLED;
import org.springframework.boot.autoconfigure.cassandra.DriverConfigLoaderBuilderCustomizer;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableConfigurationProperties(CassandraProperties.class)
public class CassandraMetricsConfig {
@Bean
DriverConfigLoaderBuilderCustomizer configLoaderBuilderCustomizer(CassandraProperties cassandraProperties) {
return builder -> {
builder.withStringList(METRICS_SESSION_ENABLED, cassandraProperties.getSessionMetrics());
builder.withStringList(METRICS_NODE_ENABLED, cassandraProperties.getNodeMetrics());
};
}
}
The property class
import java.util.ArrayList;
import java.util.List;
import javax.validation.constraints.NotNull;
import org.springframework.boot.context.properties.ConfigurationProperties;
import lombok.Data;
@ConfigurationProperties(prefix = "cassandra")
@Data
public class CassandraProperties {
@NotNull
private List<String> sessionMetrics = new ArrayList<>();
@NotNull
private List<String> nodeMetrics = new ArrayList<>();
}
application.yml
cassandra:
  session-metrics:
    - bytes-sent
    - connected-nodes
    ...
  node-metrics:
    - pool.open-connections
    - pool.in-flight
    ...
Note that this approach works only for metrics that do not require additional configuration, such as cql-requests. If you want to monitor cql-requests, you have to extend the example to configure the required properties, as sketched below.
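For example, cql-requests also needs its histogram settings, which can be supplied through the same customizer. The following is only a sketch: the option names come from DefaultDriverOption in driver 4.x and the values are illustrative, so double-check both against your driver version, and remember to add cql-requests to the session-metrics list above.

import static com.datastax.oss.driver.api.core.config.DefaultDriverOption.METRICS_SESSION_CQL_REQUESTS_DIGITS;
import static com.datastax.oss.driver.api.core.config.DefaultDriverOption.METRICS_SESSION_CQL_REQUESTS_HIGHEST;
import static com.datastax.oss.driver.api.core.config.DefaultDriverOption.METRICS_SESSION_CQL_REQUESTS_INTERVAL;

import java.time.Duration;

import org.springframework.boot.autoconfigure.cassandra.DriverConfigLoaderBuilderCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class CassandraRequestMetricsConfig {

    @Bean
    DriverConfigLoaderBuilderCustomizer requestMetricsCustomizer() {
        return builder -> {
            // Upper bound of the cql-requests latency histogram (illustrative value).
            builder.withDuration(METRICS_SESSION_CQL_REQUESTS_HIGHEST, Duration.ofSeconds(3));
            // Number of significant digits kept by the histogram (illustrative value).
            builder.withInt(METRICS_SESSION_CQL_REQUESTS_DIGITS, 3);
            // How often the histogram snapshot is refreshed (illustrative value).
            builder.withDuration(METRICS_SESSION_CQL_REQUESTS_INTERVAL, Duration.ofMinutes(5));
        };
    }
}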

Related

Unable to use two Neo4j Instances with Spring boot/Spring data neo4j

Expected Behavior
Trying to use two Neo4j instances with Spring Boot and Spring Data Neo4j.
Current Behavior
Able to use only one Neo4j instance; unable to use two repositories.
Steps to Reproduce (for bugs)
1. Run two Neo4j instances
2. Create a data source configuration for each Neo4j instance using Spring Boot
3. Use a repository to access the node entity
4. It will throw an error
Context
Consider that I am running a library and renting books to other users. If a user is renting a book from me, the same node details will be present in their repository, and I will allow them to edit the node entity through my application (like adding keywords, adding highlights about the books, etc.).
So the node details will be the same in both repositories.
Below are the application.properties details for both Neo4j repositories.
My Neo4j Repository Details
spring.data.neo4j.uri=bolt://localhost:7687
spring.data.neo4j.username=neo4j
spring.data.neo4j.password=neo4j
Rental User's Neo4j Repository Details
(accessing an instance which is running on some other machine)
rental.data.neo4j.uri=bolt://...:7687
rental.data.neo4j.username=neo4j
rental.data.neo4j.password=neo4j
Below is the rental user's Neo4j configuration:
@Configuration
@EnableNeo4jRepositories(basePackages = "com.metadata.dao.rentallibrary", sessionFactoryRef = "rentalSessionFactory", transactionManagerRef = "rentalUsertransactionManager")
@EnableTransactionManagement
@EntityScan("com.metadata.dao")
public class rentalUserNeo4jConfiguration {
@Value("${rental.data.neo4j.uri}")
private String url;
@Value("${rental.data.neo4j.username}")
private String userName;
@Value("${rental.data.neo4j.password}")
private String password;
@Bean(name = "rentalSessionFactory")
public SessionFactory rentalUserSessionFactory() {
return new SessionFactory(rentalNeo4jconfiguration(), "com.metadata.dao.rentallibrary.entity");
}
@Bean
public org.neo4j.ogm.config.Configuration rentalNeo4jconfiguration() {
org.neo4j.ogm.config.Configuration configuration = new org.neo4j.ogm.config.Configuration.Builder().uri(url)
.credentials(userName, password)
.build();
return configuration;
}
@Bean
public Neo4jTransactionManager rentalUsertransactionManager() {
return new Neo4jTransactionManager(rentalUserSessionFactory());
}
}
And below are my library's Neo4j configuration details:
@Configuration
@EnableNeo4jRepositories(basePackages = "com.metadata.dao.mylibrary", sessionFactoryRef = "myUserSessionFactory", transactionManagerRef = "myUserTransactionManager")
@EnableTransactionManagement
public class MyUserNeo4jConfiguration {
@Value("${spring.data.neo4j.uri}")
private String url;
@Value("${spring.data.neo4j.username}")
private String userName;
@Value("${spring.data.neo4j.password}")
private String password;
@Bean(name = "myUserSessionFactory")
@Primary
public SessionFactory myUserSessionFactory() {
return new SessionFactory(myUserconfiguration(), "com.metadata.dao.mylibrary.entity");
}
@Bean
public org.neo4j.ogm.config.Configuration myUserconfiguration() {
org.neo4j.ogm.config.Configuration configuration = new org.neo4j.ogm.config.Configuration.Builder().uri(url)
.credentials(userName, password)
.build();
return configuration;
}
@Bean
public Neo4jTransactionManager myUserTransactionManager() {
return new Neo4jTransactionManager(myUserSessionFactory());
}
}
Accessing both session factories directly (via qualifiers) works fine, but when I try to access the data through the repositories I face the issue below.
Accessing through SessionFactory:
@Autowired
@Qualifier(value = "myUserSessionFactory")
SessionFactory myUserSessionFactory;
@Autowired
@Qualifier(value = "rentalUserSessionFactory")
SessionFactory rentalUserSessionFactory;
Below are the error details I am getting when trying to access the data through the repositories:
java.lang.IllegalArgumentException: Class class com.metadata.dao.Book is not a valid entity class. Please check the entity mapping.
at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:88) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.session.delegates.SaveDelegate.save(SaveDelegate.java:40) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.session.Neo4jSession.save(Neo4jSession.java:469) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_141]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_141]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_141]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_141]
at org.springframework.data.neo4j.transaction.SharedSessionCreator$SharedSessionInvocationHandler.invoke(SharedSessionCreator.java:131) ~[spring-data-neo4j-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at com.sun.proxy.$Proxy81.save(Unknown Source) ~[na:na]
org.neo4j.ogm.exception.core.TransactionManagerException: Transaction is not current for this thread
at org.neo4j.ogm.session.transaction.DefaultTransactionManager.rollback(DefaultTransactionManager.java:86) ~[neo4j-ogm-core-3.1.0.jar:3.1.0]
at org.neo4j.ogm.transaction.AbstractTransaction.rollback(AbstractTransaction.java:65) ~[neo4j-ogm-api-3.1.0.jar:3.1.0]
at org.neo4j.ogm.drivers.bolt.transaction.BoltTransaction.rollback(BoltTransaction.java:61) ~[neo4j-ogm-bolt-driver-3.1.0.jar:3.1.0]
at org.neo4j.ogm.transaction.AbstractTransaction.close(AbstractTransaction.java:144) ~[neo4j-ogm-api-3.1.0.jar:3.1.0]
at org.springframework.data.neo4j.transaction.Neo4jTransactionManager.doCleanupAfterCompletion(Neo4jTransactionManager.java:379) ~[spring-data-neo4j-5.0.8.RELEASE.jar:5.0.8.RELEASE]
Node Entity Name in dao.mylibrary.entity : Book
Node Entity Name in dao.rentallibrary.entity : RentedBook
Please let me know why this issue occurs when using Neo4j repositories. Can't we use two Neo4j repositories with Spring Data Neo4j and Spring Boot, or am I doing something wrong?
My Environment
OGM Version used: 3.1.0
Java Version used: 1.8
Neo4j Version used: 3.2.3
Bolt Driver Version used (if applicable): 3.1.0
Operating System and Version: Windows
Please let me know if you need any additional information.
Update: This has been addressed in Spring Data Neo4j Lovelace RC1 and we wrote a small how-to: https://michael-simons.github.io/neo4j-sdn-ogm-tips/using_multiple_session_factories
Thanks for submitting this as GitHub issue #498.
It seems that the current Spring Data Neo4j version has a bug in propagating the different session factories to the repositories. In short: there is no way to make this work right now (for example, the way you can with Spring Data JPA).
If you need (and want) repositories, I cannot help you right now. What works, however, is injecting the different session factories:
@Autowired
@Qualifier("myUserSessionFactory")
private SessionFactory myUserSessionFactory;
@Autowired
@Qualifier("rentalUserSessionFactory")
private SessionFactory rentalUserSessionFactory;
and then do something like:
Map<String, Object> params = new HashMap<>();
params.put("name", "test");
ThingEntity t = this.myUserSessionFactory.openSession().queryForObject(
ThingEntity.class,
"MATCH (n:`Thing`) WHERE n.name = $name WITH n RETURN n", params);
Regardless of the bug in our code, I recommend the following configuration. For the primary beans ("myUserconfiguration") use one config class:
package gh.neo4jogm.gh498;
import gh.neo4jogm.gh498.domain1.ThingEntity;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jProperties;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;
@Configuration
@EnableNeo4jRepositories(
basePackages = Domain1Config.BASE_PACKAGE,
sessionFactoryRef = "myUserSessionFactory",
transactionManagerRef = "myUserTransactionManager"
)
@EntityScan(basePackageClasses = ThingEntity.class)
class Domain1Config {
static final String BASE_PACKAGE = "gh.neo4jogm.gh498.domain1";
@Primary
@Bean
@ConfigurationProperties("spring.data.neo4j")
public Neo4jProperties myNeo4jProperties() {
return new Neo4jProperties();
}
@Primary
@Bean
public org.neo4j.ogm.config.Configuration myUserconfiguration() {
return myNeo4jProperties().createConfiguration();
}
@Primary
@Bean
public SessionFactory myUserSessionFactory() {
return new SessionFactory(myUserconfiguration(), BASE_PACKAGE);
}
@Bean
public Neo4jTransactionManager myUserTransactionManager() {
return new Neo4jTransactionManager(myUserSessionFactory());
}
}
The basic idea is to use @ConfigurationProperties to map the default properties to an instance of Neo4jProperties (our properties class) and use it to create the actual configuration.
Then the same for the other session factory:
package gh.neo4jogm.gh498;
import gh.neo4jogm.gh498.domain2.OtherThingEntity;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.boot.autoconfigure.data.neo4j.Neo4jProperties;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;
import static gh.neo4jogm.gh498.Domain2Config.BASE_PACKAGE;
@Configuration
@EnableNeo4jRepositories(
basePackages = BASE_PACKAGE,
sessionFactoryRef = "rentalUserSessionFactory",
transactionManagerRef = "rentalUsertransactionManager"
)
@EntityScan(basePackageClasses = OtherThingEntity.class)
class Domain2Config {
static final String BASE_PACKAGE = "gh.neo4jogm.gh498.domain2";
@Bean
@ConfigurationProperties("rental.data.neo4j")
public Neo4jProperties rentalNeo4jProperties() {
return new Neo4jProperties();
}
@Bean
public org.neo4j.ogm.config.Configuration rentalNeo4jconfiguration() {
return rentalNeo4jProperties().createConfiguration();
}
@Bean
public SessionFactory rentalUserSessionFactory() {
return new SessionFactory(rentalNeo4jconfiguration(), BASE_PACKAGE);
}
@Bean
public Neo4jTransactionManager rentalUsertransactionManager() {
return new Neo4jTransactionManager(rentalUserSessionFactory());
}
}
Here you map all properties prefixed with rental.data.neo4j to another properties instance.
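For reference, the two prefixes then live side by side in the same application.properties, matching the entries from the question (the rental URI is left elided as in the question):

# primary library instance, bound by myNeo4jProperties()
spring.data.neo4j.uri=bolt://localhost:7687
spring.data.neo4j.username=neo4j
spring.data.neo4j.password=neo4j
# rental instance, bound by rentalNeo4jProperties()
rental.data.neo4j.uri=bolt://...:7687
rental.data.neo4j.username=neo4j
rental.data.neo4j.password=neo4j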

Spring Boot REST with Hadoop HBASE

I'm looking to build a simple RESTful API to access HBase.
I looked at Python HappyBase, but my cluster is Kerberized. Now I'm looking into Spring.
I have previously built a simple REST API with Solr Cloud and Spring Boot.
Is it possible to do the same with HBase?
I have no idea whether I have to use the Spring Boot 'Yarn App' guide
=> https://spring.io/guides/gs/yarn-basic/
or Spring Hadoop.
=> https://projects.spring.io/spring-hadoop/
I just want a really simple API.
Thanks for the help.
I wrote a simple demo project for using HBase in a Spring Boot RESTful application without XML.
The demo mainly depends on spring-data-hadoop and hbase-client.
Gradle dependencies:
compile('org.springframework.boot:spring-boot-starter-data-rest')
compile('org.springframework.boot:spring-boot-starter-web')
compile 'org.springframework.data:spring-data-hadoop:2.5.0.RELEASE'
compile('org.apache.hbase:hbase-client:1.3.1') {
exclude group: 'log4j', module: 'log4j'
exclude group: 'org.slf4j', module: 'slf4j-log4j12'
exclude group: 'javax.servlet', module: 'servlet-api'
}
compile('org.springframework.boot:spring-boot-configuration-processor')
providedRuntime('org.springframework.boot:spring-boot-starter-tomcat')
Configure the HBase connection parameters in Spring Boot's application.properties (no XML!):
spring.data.hbase.zkQuorum=192.168.0.109:2181
spring.data.hbase.zkBasePath=/hbase
spring.data.hbase.rootDir=file:///home/hbase-1.2.2
The HbaseProperties.java class:
@ConfigurationProperties(prefix = "spring.data.hbase")
public class HbaseProperties {
// Addresses of all registered ZK servers.
private String zkQuorum;
// Location of HBase home directory
private String rootDir;
// Root node of this cluster in ZK.
private String zkBasePath;
// getters and setters...
}
HbaseConfig.java injects the configuration into an HbaseTemplate:
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.hadoop.hbase.HbaseTemplate;
@Configuration
@EnableConfigurationProperties(HbaseProperties.class)
public class HbaseConfig {
@Autowired
private HbaseProperties hbaseProperties;
@Bean
public HbaseTemplate hbaseTemplate() {
org.apache.hadoop.conf.Configuration configuration = HBaseConfiguration.create();
configuration.set("hbase.zookeeper.quorum", this.hbaseProperties.getZkQuorum());
configuration.set("hbase.rootdir", this.hbaseProperties.getRootDir());
configuration.set("zookeeper.znode.parent", this.hbaseProperties.getZkBasePath());
return new HbaseTemplate(configuration);
}
}
The service class; we can now use the configured HbaseTemplate:
import javax.annotation.PostConstruct;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.hadoop.hbase.HbaseTemplate;
import org.springframework.stereotype.Service;
import com.zql.hbasedemo.vo.Quote;
@Service
public class FeedService {
@Autowired
private HbaseTemplate hbaseTemplate;
@PostConstruct
public void test(){
Quote quote = new Quote();
quote.setEventType("ft");
quote.setHandicap("4");
quote.setMarket("OU");
quote.setMatchId("27350208");
quote.setSelection("OVER");
quote.setPrice("1.93");
saveQuote(quote);
}
public Quote saveQuote(Quote quote) {
hbaseTemplate.put("quotes", quote.getMatchId(), "data", quote.getMarket() + ":" + quote.getSelection(),
quote.getPrice().getBytes());
return quote;
}
}
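Note that hbaseTemplate.put(...) above assumes a quotes table with a data column family already exists. If your cluster does not have it yet, it can be created beforehand, for example from the HBase shell:

create 'quotes', 'data'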
The REST controller:
@RestController
public class FeedController {
@Autowired
private FeedService feedService;
@SuppressWarnings({ "unchecked", "rawtypes" })
@PostMapping(value = "/feed/quote", consumes = "application/json", produces = "application/json")
public ResponseEntity<Quote> saveQuote(@RequestBody Quote quote) {
Quote result = feedService.saveQuote(quote);
return new ResponseEntity(result, new HttpHeaders(), HttpStatus.OK);
}
}
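For completeness, here is a minimal sketch of what the Quote value object used above might look like; the field names are inferred from the setters called in FeedService, and the real class in the demo (com.zql.hbasedemo.vo.Quote) may differ:

package com.zql.hbasedemo.vo;

// Sketch of the Quote VO; fields inferred from the setter calls in FeedService.
public class Quote {

    private String eventType;
    private String handicap;
    private String market;
    private String matchId;
    private String selection;
    private String price;

    public String getEventType() { return eventType; }
    public void setEventType(String eventType) { this.eventType = eventType; }

    public String getHandicap() { return handicap; }
    public void setHandicap(String handicap) { this.handicap = handicap; }

    public String getMarket() { return market; }
    public void setMarket(String market) { this.market = market; }

    public String getMatchId() { return matchId; }
    public void setMatchId(String matchId) { this.matchId = matchId; }

    public String getSelection() { return selection; }
    public void setSelection(String selection) { this.selection = selection; }

    public String getPrice() { return price; }
    public void setPrice(String price) { this.price = price; }
}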

Enabling Redis cache in spring boot

I have the following configuration in my Spring Boot project.
@SpringBootApplication
@EnableTransactionManagement
@EnableCaching
@EnableScheduling
@EnableAsync
public class Application {
String redisHost = "localhost";
int redisPort = 6379;
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Bean
JedisConnectionFactory jedisConnectionFactory() {
JedisConnectionFactory factory = new JedisConnectionFactory();
factory.setHostName(redisHost);
factory.setPort(redisPort);
factory.setUsePool(true);
return factory;
}
@Bean
RedisTemplate<Object, Object> redisTemplate() {
RedisTemplate<Object, Object> redisTemplate = new RedisTemplate<Object, Object>();
redisTemplate.setConnectionFactory(jedisConnectionFactory());
return redisTemplate;
}
@Bean
public CacheManager cacheManager() {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate());
return cacheManager;
}
}
I also have the following Maven dependency in my POM:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
I have a separate Redis server running on my local machine on the defined port. In my service classes I also have annotations like @Cacheable and @CachePut to support caching.
I can start the Spring Boot application without errors and CRUD operations work, but it does not seem to be using the defined Redis cache. I browsed with the 'Redis Desktop Manager' tool and couldn't find any data in Redis. I also tried monitoring the Redis server via the redis-cli 'monitor' command, and I can't see any activity there.
So I assume Redis caching is still not working in my Spring Boot application. Can someone help me figure out the issue?
I am using Spring Boot version 1.4.2.RELEASE.
Thanks!
Given you are using Spring Boot, much of your Redis configuration is unnecessary, since Spring Boot provides "auto-configuration" support for Redis, both as a data source and as a caching provider.
You were also not specific about which version of Spring Boot you are using (e.g. 1.5.0.RC1) to run your application, or whether you have any application.properties on your application's classpath, which might make a difference if you explicitly specified spring.cache.type (set to something other than "redis", for instance).
In general, though, I cannot really see much wrong with your Redis or Spring Cache @Configuration class. The problem does seem to be that cacheManager.setUsePrefix(true) is not set explicitly. When I set this RedisCacheManager property (usePrefix), everything worked as expected.
I am not a (Spring Data) Redis expert, so I am not exactly sure why this is needed. However, I based my test configuration on Spring Boot's "auto-configuration" support for Redis caching as well as your @Configuration "Application" class, shown above.
And, because you can eliminate most of your explicit configuration and use Spring Boot's "auto-configuration" support for Redis as a data source as well, I added an "AutoRedisConfiguration" @Configuration class to my test class. I.e., you can use this to configure Redis instead of my other @Configuration class ("CustomRedisConfiguration"), which uses your configuration plus the fix.
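In isolation, the fix amounts to one extra call in your cacheManager() bean (a sketch based on your configuration above):

@Bean
public CacheManager cacheManager() {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate());
cacheManager.setUsePrefix(true); // the missing call
return cacheManager;
}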
Here is the complete test example...
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.spring.cache;
import static org.assertj.core.api.Assertions.assertThat;
import java.util.Arrays;
import java.util.Properties;
import java.util.concurrent.atomic.AtomicBoolean;
import javax.annotation.PostConstruct;
import org.junit.Before;
import org.junit.FixMethodOrder;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.MethodSorters;
import org.spring.cache.CachingWithRedisIntegrationTest.CachingWithRedisConfiguration;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringBootConfiguration;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.concurrent.ConcurrentMapCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Profile;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.util.Assert;
/**
* Integration tests testing Spring's Cache Abstraction using Spring Data Redis auto-configured with Spring Boot.
*
* To run this test, first start a Redis Server on localhost listening on the default port, 6379.
*
* @author John Blum
* @see org.junit.Test
* @since 1.0.0
*/
@RunWith(SpringRunner.class)
@ActiveProfiles("auto")
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
@ContextConfiguration(classes = CachingWithRedisConfiguration.class)
@SuppressWarnings("unused")
public class CachingWithRedisIntegrationTest {
protected static final int REDIS_PORT = 6379;
protected static final String REDIS_HOST = "localhost";
private AtomicBoolean setup = new AtomicBoolean(false);
@Autowired
private MathService mathService;
@Autowired(required = false)
private RedisTemplate<Object, Object> redisTemplate;
@Before
public void setup() {
if (redisTemplate != null && !setup.getAndSet(true)) {
redisTemplate.delete(Arrays.asList(0L, 1L, 2L, 4L, 8L));
}
}
@Test
public void firstCacheMisses() {
assertThat(mathService.factorial(0L)).isEqualTo(1L);
assertThat(mathService.wasCacheMiss()).isTrue();
assertThat(mathService.factorial(1L)).isEqualTo(1L);
assertThat(mathService.wasCacheMiss()).isTrue();
assertThat(mathService.factorial(2L)).isEqualTo(2L);
assertThat(mathService.wasCacheMiss()).isTrue();
assertThat(mathService.factorial(4L)).isEqualTo(24L);
assertThat(mathService.wasCacheMiss()).isTrue();
assertThat(mathService.factorial(8L)).isEqualTo(40320L);
assertThat(mathService.wasCacheMiss()).isTrue();
}
@Test
public void thenCacheHits() {
assertThat(mathService.factorial(0L)).isEqualTo(1L);
assertThat(mathService.wasCacheMiss()).isFalse();
assertThat(mathService.factorial(1L)).isEqualTo(1L);
assertThat(mathService.wasCacheMiss()).isFalse();
assertThat(mathService.factorial(2L)).isEqualTo(2L);
assertThat(mathService.wasCacheMiss()).isFalse();
assertThat(mathService.factorial(4L)).isEqualTo(24L);
assertThat(mathService.wasCacheMiss()).isFalse();
assertThat(mathService.factorial(8L)).isEqualTo(40320L);
assertThat(mathService.wasCacheMiss()).isFalse();
}
interface MathService {
boolean wasCacheMiss();
long factorial(long number);
}
@EnableCaching
@SpringBootConfiguration
@Import({ AutoRedisConfiguration.class, CustomRedisConfiguration.class })
static class CachingWithRedisConfiguration {
@Bean
MathService mathService() {
return new MathService() {
private final AtomicBoolean cacheMiss = new AtomicBoolean(false);
@Override
public boolean wasCacheMiss() {
return cacheMiss.getAndSet(false);
}
@Override
@Cacheable(cacheNames = "Factorials")
public long factorial(long number) {
cacheMiss.set(true);
Assert.isTrue(number >= 0L, String.format("Number [%d] must be greater than equal to 0", number));
if (number <= 2L) {
return (number < 2L ? 1L : 2L);
}
long result = number;
while (--number > 1) {
result *= number;
}
return result;
}
};
}
@Bean
@Profile("none")
CacheManager cacheManager() {
return new ConcurrentMapCacheManager();
}
}
@Profile("auto")
@EnableAutoConfiguration
@SpringBootConfiguration
static class AutoRedisConfiguration {
@PostConstruct
public void afterPropertiesSet() {
System.out.println("AUTO");
}
@Bean
static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer =
new PropertySourcesPlaceholderConfigurer();
propertySourcesPlaceholderConfigurer.setProperties(redisProperties());
return propertySourcesPlaceholderConfigurer;
}
static Properties redisProperties() {
Properties redisProperties = new Properties();
redisProperties.setProperty("spring.cache.type", "redis");
redisProperties.setProperty("spring.redis.host", REDIS_HOST);
redisProperties.setProperty("spring.redis.port", String.valueOf(REDIS_PORT));
return redisProperties;
}
}
@Profile("custom")
@SpringBootConfiguration
static class CustomRedisConfiguration {
@PostConstruct
public void afterPropertiesSet() {
System.out.println("CUSTOM");
}
@Bean
JedisConnectionFactory jedisConnectionFactory() {
JedisConnectionFactory factory = new JedisConnectionFactory();
factory.setHostName(REDIS_HOST);
factory.setPort(REDIS_PORT);
factory.setUsePool(true);
return factory;
}
@Bean
RedisTemplate<Object, Object> redisTemplate() {
RedisTemplate<Object, Object> redisTemplate = new RedisTemplate<>();
redisTemplate.setConnectionFactory(jedisConnectionFactory());
return redisTemplate;
}
@Bean
CacheManager cacheManager() {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate());
cacheManager.setUsePrefix(true); // THIS IS NEEDED!
return cacheManager;
}
}
}
Hope this helps!
Cheers,
John
I am using Spring Boot 2.0 and it is very simple to use Redis for simple caching purposes.
Annotate your Spring Boot application with @EnableCaching.
In your application.properties set these properties:
spring.cache.type=redis
redis.host.url=
redis.host.port=
Annotate your methods with @Cacheable.
That's it!
If you are using AWS ElastiCache and you have enabled in-transit encryption, then you need to add a Redis configuration class to set SSL to true.
Spring Boot 2.0 now uses LettuceConnectionFactory.
To do the above, simply add a class, mark it with the @Configuration annotation, and add the following bean:
@Bean
public LettuceConnectionFactory redisConnectionFactory() {
RedisStandaloneConfiguration configuration = new RedisStandaloneConfiguration();
configuration.setHostName(redisHost);
configuration.setPort(redisPort);
return new LettuceConnectionFactory(configuration, LettuceClientConfiguration.builder().useSsl().disablePeerVerification().build());
}
You can check the existence of your keys in Redis with the redis-cli command: keys *
If your key is there, everything is OK :)
The methods of your classes should be @Cacheable. Along with @CachePut you should use @CacheEvict, with the keys and their values, on your delete methods to flush out the resources (see the sketch after the properties below). Also, the host, port, type and password are needed in the application.properties file.
`spring.redis.password= password`
`spring.cache.type=redis`
`spring.redis.host=localhost`
`spring.redis.port=6379`
Also add some more properties like so, as per requirement.
`spring.cache.redis.time-to-live=600000`
`spring.cache.redis.cache-null-values=false`
`spring.cache.redis.use-key-prefix=true`
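A minimal sketch of how those annotations can sit together on a service; BookService, BookRepository and Book are placeholders for your own classes:

@Service
public class BookService {

    private final BookRepository bookRepository; // placeholder Spring Data repository

    public BookService(BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    // Cache lookups by id in the "books" cache.
    @Cacheable(value = "books", key = "#id")
    public Book findBook(Long id) {
        return bookRepository.findById(id).orElse(null);
    }

    // Write-through: update the cached entry whenever a book is saved.
    @CachePut(value = "books", key = "#book.id")
    public Book saveBook(Book book) {
        return bookRepository.save(book);
    }

    // Flush the cached entry when a book is deleted.
    @CacheEvict(value = "books", key = "#id")
    public void deleteBook(Long id) {
        bookRepository.deleteById(id);
    }
}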

Get Spring Boot management port at runtime when management.port=0

I'm looking for advice on how to get the port that was assigned to the embedded Tomcat serving the actuator endpoints when setting the management.port property to 0 in integration tests.
I'm using Spring Boot 1.3.2 with the following application.yml configuration:
server.port: 8080
server.contextPath: /my-app-context-path
management.port: 8081
management.context-path: /manage
...
My integration tests are then annotated with @WebIntegrationTest, setting the ports shown above to 0:
@WebIntegrationTest({ "server.port=0", "management.port=0" })
And the following utility class should be used to get access to the application configuration when doing full integration tests:
@Component
@Profile("testing")
class TestserverInfo {
@Value( '${server.contextPath:}' )
private String contextPath;
@Autowired
private EmbeddedWebApplicationContext server;
@Autowired
private ManagementServerProperties managementServerProperties
public String getBasePath() {
final int serverPort = server.embeddedServletContainer.port
return "http://localhost:${serverPort}${contextPath}"
}
public String getManagementPath() {
// The following wont work here:
// server.embeddedServletContainer.port -> regular server port
// management.port -> is zero just as server.port as i want random ports
final int managementPort = // how can i get this one ?
final String managementPath = managementServerProperties.getContextPath()
return "http://localhost:${managementPort}${managementPath}"
}
}
I already know the standard port can be obtained using local.server.port, and there seems to be an equivalent for the management endpoint named local.management.port, but that one seems to have a different meaning. The official documentation doesn't mention a way to do this.
Is there currently any undocumented way to get hold of that management port?
Solution Edit
As I am using the Spock Framework and spock-spring module for testing my Spring Boot application, I have to initialize the application like this:
@ContextConfiguration(loader = SpringApplicationContextLoader.class, classes = MyApplication.class)
Somehow spock-spring or the test initialization seems to affect the evaluation of the @Value annotation, so that @Value("${local.management.port}") resulted in
java.lang.IllegalArgumentException: Could not resolve placeholder
'local.management.port' in string value "${local.management.port}"
With your solution I knew the property existed, so I simply use the Spring Environment directly to retrieve the property-value at test runtime:
@Autowired
ManagementServerProperties managementServerProperties
@Autowired
Environment environment
public String getManagementPath() {
final int managementPort = environment.getProperty('local.management.port', Integer.class)
final String managementPath = managementServerProperties.getContextPath()
return "http://localhost:${managementPort}${managementPath}"
}
The management port needs to be set to 0 via @SpringBootTest's properties field, and then to get the ports in the tests use the @LocalServerPort and/or @LocalManagementPort annotations.
Examples:
From Spring Boot 2.0.0:
properties = {"management.server.port=0"}
@ExtendWith(SpringExtension.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, properties = {
"management.server.port=0" })
public class HealthCheckIT {
@Autowired
private WebTestClient webTestClient;
@LocalManagementPort
int managementPort;
@Test
public void testManagementPort() {
webTestClient
.get().uri("http://localhost:" + managementPort + "/actuator/health")
.header(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
.accept(MediaType.APPLICATION_JSON)
.exchange()
.expectStatus().isOk();
}
}
For older versions (1.4.0 - 1.5.x):
properties = {"management.port=0"}
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {
"management.port=0", "management.context-path=/admin" })
public class SampleTest {
@LocalServerPort
int port;
@LocalManagementPort
int managementPort;
// ... test methods can then use port and managementPort
}
This is how I've done it, copied straight from my test class (I use RestAssured for assertions):
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.test.SpringApplicationConfiguration;
import org.springframework.boot.test.WebIntegrationTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import static com.jayway.restassured.RestAssured.get;
import static org.hamcrest.CoreMatchers.equalTo;
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(Application.class)
@WebIntegrationTest(randomPort = true, value = {"management.port=0", "management.context-path=/admin"})
@DirtiesContext
public class ActuatorEndpointTest {
@Value("${local.management.port}")
private int localManagementPort;
@Test
public void actuatorHealthEndpointIsAvailable() throws Exception {
String healthUrl = "http://localhost:" + localManagementPort + "/admin/health";
get(healthUrl)
.then()
.assertThat().body("status", equalTo("UP"));
}
}
This is an update to @magiccrafter's answer; it requires the following dependency to work properly:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
<scope>test</scope>
</dependency>
and it requires this annotation on the test class as well:
@AutoConfigureWebTestClient

Exporting Spring Boot Actuator Metrics (& Dropwizard Metrics) to Statsd

I'm trying to export all of the metrics which are visible at the endpoint /metrics to a StatsdMetricWriter.
I've got the following configuration class so far:
package com.tonyghita.metricsdriven.service.config;
import com.codahale.metrics.MetricRegistry;
import com.ryantenney.metrics.spring.config.annotation.EnableMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricReader;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.metrics.reader.MetricReader;
import org.springframework.boot.actuate.metrics.reader.MetricRegistryMetricReader;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {
private static final Logger LOGGER = LoggerFactory.getLogger(MetricsConfig.class);
@Value("${statsd.host:localhost}")
private String host = "localhost";
@Value("${statsd.port:8125}")
private int port;
@Autowired
private MetricRegistry metricRegistry;
@Bean
@ExportMetricReader
public MetricReader metricReader() {
return new MetricRegistryMetricReader(metricRegistry);
}
@Bean
@ExportMetricWriter
public MetricWriter metricWriter() {
LOGGER.info("Configuring StatsdMetricWriter to export to {}:{}", host, port);
return new StatsdMetricWriter(host, port);
}
}
This writes all of the metrics I've added to Statsd, but I'd also like to send the system/JVM metrics that are visible on the /metrics endpoint.
What am I missing?
I had the same problem and found a solution here: https://github.com/tzolov/export-metrics-example
Just add a MetricsEndpointMetricReader to your config and everything available at the /metrics endpoint will be published to the StatsdMetricWriter.
Here is a complete example config for Spring Boot 1.3.x and Dropwizard metrics-jvm 3.1.x:
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.jvm.GarbageCollectorMetricSet;
import com.codahale.metrics.jvm.MemoryUsageGaugeSet;
import com.codahale.metrics.jvm.ThreadStatesGaugeSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.endpoint.MetricsEndpoint;
import org.springframework.boot.actuate.endpoint.MetricsEndpointMetricReader;
import org.springframework.boot.actuate.metrics.Metric;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.Delta;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class MetricsConfiguration {
@Bean
public MetricRegistry metricRegistry() {
final MetricRegistry metricRegistry = new MetricRegistry();
metricRegistry.register("jvm.memory",new MemoryUsageGaugeSet());
metricRegistry.register("jvm.thread-states",new ThreadStatesGaugeSet());
metricRegistry.register("jvm.garbage-collector",new GarbageCollectorMetricSet());
return metricRegistry;
}
/*
* Reading all metrics that appear on the /metrics endpoint to expose them to metrics writer beans.
*/
@Bean
public MetricsEndpointMetricReader metricsEndpointMetricReader(final MetricsEndpoint metricsEndpoint) {
return new MetricsEndpointMetricReader(metricsEndpoint);
}
@Bean
@ConditionalOnProperty(prefix = "statsd", name = {"prefix", "host", "port"})
@ExportMetricWriter
public MetricWriter statsdMetricWriter(@Value("${statsd.prefix}") String statsdPrefix,
@Value("${statsd.host}") String statsdHost,
@Value("${statsd.port}") int statsdPort) {
return new StatsdMetricWriter(statsdPrefix, statsdHost, statsdPort);
}
}
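The @ConditionalOnProperty guard above expects statsd.prefix, statsd.host and statsd.port to be present, e.g. in application.properties (the values here are just placeholders):

statsd.prefix=my-app
statsd.host=localhost
statsd.port=8125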
From what I've seen in spring-boot code, only calls to CounterService and GaugeService implementations are forwarded to dropwizard's MetricRegistry.
Therefore, as you already observed, only counter.* and gauge.* metrics from the /metrics endpoint will end up in Statsd.
System and JVM metrics are exposed through custom SystemPublicMetrics class, which doesn't use counter or gauge service.
I'm not sure if there is a simpler solution (maybe someone from Spring team will comment), but one way to do it (not spring-boot specific) would be to use a scheduled task that periodically writes system stats to the MetricRegistry.
To register JVM metrics you can use the JVM-related MetricSets supplied by the com.codahale.metrics metrics-jvm library. You can add the whole set without specifying whether they are gauges or counters.
Here is my example code where I register the JVM-related metrics:
@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {
@Autowired
private StatsdProperties statsdProperties;
@Autowired
private MetricsEndpoint metricsEndpoint;
@Autowired
private DataSourcePublicMetrics dataSourcePublicMetrics;
@Bean
@ExportMetricReader
public MetricReader metricReader() {
return new MetricRegistryMetricReader(metricRegistry());
}
public MetricRegistry metricRegistry() {
final MetricRegistry metricRegistry = new MetricRegistry();
//jvm metrics
metricRegistry.register("jvm.gc",new GarbageCollectorMetricSet());
metricRegistry.register("jvm.mem",new MemoryUsageGaugeSet());
metricRegistry.register("jvm.thread-states",new ThreadStatesGaugeSet());
return metricRegistry;
}
@Bean
@ConditionalOnProperty(prefix = "metrics.writer.statsd", name = {"host", "port"})
@ExportMetricWriter
public MetricWriter statsdMetricWriter() {
return new StatsdMetricWriter(
statsdProperties.getPrefix(),
statsdProperties.getHost(),
statsdProperties.getPort()
);
}
}
Note: I am using Spring Boot version 1.3.0.M4.
Enjoy! (See the public metrics logged to the console as Dropwizard metrics.)
@Configuration
@EnableMetrics
@EnableScheduling
public class MetricsReporter extends MetricsConfigurerAdapter {
@Autowired private SystemPublicMetrics systemPublicMetrics;
private MetricRegistry metricRegistry;
@Scheduled(fixedDelay = 5000)
void exportPublicMetrics() {
for (Metric<?> metric : systemPublicMetrics.metrics()) {
Counter counter = metricRegistry.counter(metric.getName());
counter.dec(counter.getCount());
counter.inc(Double.valueOf(metric.getValue().toString()).longValue());
}
}
@Override
public void configureReporters(MetricRegistry metricRegistry) {
this.metricRegistry = metricRegistry;
ConsoleReporter.forRegistry(metricRegistry).build().start(10, TimeUnit.SECONDS);
}
}

Resources