Consider defining a bean of type 'net.corda.core.messaging.CordaRPCOps' in your configuration - spring

Unable to use CordaRPCOps implementation methods in my CustomController:
@RequestMapping(value = "/peers", produces = MediaType.APPLICATION_JSON_VALUE)
public Map<String, List<String>> peers() throws Exception {
    CordaRPCOps proxy = rpc.getParameterValue("proxy");
    Party myIdentity = proxy.nodeInfo().getLegalIdentities().get(0);
    return ImmutableMap.of("peers", proxy.networkMapSnapshot()
            .stream()
            .filter(nodeInfo -> !nodeInfo.getLegalIdentities().get(0).equals(myIdentity))
            .map(it -> it.getLegalIdentities().get(0).getName().getOrganisation())
            .collect(toList()));
}
I am getting the following error when running runPartyAServer:
APPLICATION FAILED TO START
Description:
Field services in net.corda.server.controllers.CustomController required a bean of type 'net.corda.core.messaging.CordaRPCOps' that could not be found.
Action:
Consider defining a bean of type 'net.corda.core.messaging.CordaRPCOps' in your configuration.

As the error message says, you must define a bean of type CordaRPCConnection/CordaRPCOps.
Something similar to:
@Bean
fun connect(): CordaRPCConnection {
    // @Bean methods must not be private, or Spring cannot call them
    val hostAndPort = NetworkHostAndPort(configuration.host, configuration.port)
    val client = CordaRPCClient(hostAndPort)
    return client.start(configuration.user, configuration.password)
}
We do not provide any DI container integration by default.
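With that connection bean in place, the proxy itself can also be exposed as a bean so that CordaRPCOps can be injected into the controller. A minimal sketch in Java, assuming the connect() bean above:
@Bean
public CordaRPCOps proxy(CordaRPCConnection connection) {
    // The RPC proxy implements CordaRPCOps and can now be autowired anywhere.
    return connection.getProxy();
}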


How to create a conditional bean in Spring

I need to create a conditional bean in Spring. The use case is as follows:
Class 1
In this class we try to create the bean. It should be created only for clients that have the required permission; for all others it should return Optional.empty(), so that the application boots up for every client without a BeanCreationException.
@org.springframework.context.annotation.Configuration
public class SomeBeanConfiguration {

    @Bean
    public Optional<SomeBean> someBean() {
        // whoAmI() returns IAmClient_1 - for whom this bean should be created
        // whoAmI() returns IAmClient_2 - for whom this bean should not be created
        final String somePermission = whoAmI();
        try {
            return Optional.of(SomeBean.builder()
                    .withPermission(new SomeCredentialsProvider(somePermission))
                    .build());
        } catch (Exception ex) {
            LOG.error("SomeBean creation exception : ", ex);
        }
        return Optional.empty();
    }
}
Class 2
Here we use this bean via constructor injection:
@Bean
public SomeHelper someHelper(Optional<SomeBean> someBean) {
    return new SomeHelper(someBean);
}
But someHelper receives Optional.empty() in its constructor even for the client that has the permission.
What am I doing wrong here? Can anyone please help?
You need to change the method that creates the bean. It should not return a bean of type Optional; it should return a bean of type SomeBean. Also, consider rewriting your logic into something more understandable, such as dropping the catch block and creating the bean based on the output of whoAmI():
@Bean
public SomeBean someBean() {
    // whoAmI() returns IAmClient_1 - for whom this bean should be created
    // whoAmI() returns IAmClient_2 - for whom this bean should not be created
    String somePermission = whoAmI();
    if (somePermission.equals("IAmClient_1")) {
        return SomeBean.builder()
                .withPermission(new SomeCredentialsProvider(somePermission))
                .build();
    } else {
        return null;
    }
}
Now, when you autowire the Optional, it will contain the bean for IAmClient_1 and will be empty in all other cases.
In my opinion, it would be better to always construct SomeBean and just modify its behavior based on the value of the permission you're checking, but that's up to you.
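If the client identity is available as a configuration property, Spring Boot's @ConditionalOnProperty is a more declarative option. A sketch, assuming a hypothetical app.client property (not from the original question):
// Register the bean only when app.client=IAmClient_1; otherwise no bean exists
// and any Optional<SomeBean> injection point receives Optional.empty().
@Bean
@ConditionalOnProperty(name = "app.client", havingValue = "IAmClient_1")
public SomeBean someBean() {
    return SomeBean.builder()
            .withPermission(new SomeCredentialsProvider(whoAmI()))
            .build();
}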

ReadOnlyKeyValueStore from a KTable using spring-kafka

I am migrating a Kafka Streams implementation that uses the pure Kafka APIs over to spring-kafka, as it is being incorporated into a Spring Boot application.
Everything works fine: the stream, GlobalKTable, and branching I have all work perfectly, but I am having a hard time incorporating a ReadOnlyKeyValueStore. Based on the spring-kafka documentation here: https://docs.spring.io/spring-kafka/docs/2.6.10/reference/html/#streams-spring
It says:
If you need to perform some KafkaStreams operations directly, you can
access that internal KafkaStreams instance by using
StreamsBuilderFactoryBean.getKafkaStreams(). You can autowire
StreamsBuilderFactoryBean bean by type, but you should be sure to use
the full type in the bean definition.
Based on that, I tried to incorporate it into my example as in the following fragments:
@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
public KafkaStreamsConfiguration defaultKafkaStreamsConfig() {
    Map<String, Object> props = defaultStreamsConfigs();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "quote-stream");
    props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, SpecificAvroSerde.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "stock-quotes-stream-group");
    return new KafkaStreamsConfiguration(props);
}

@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_BUILDER_BEAN_NAME)
public StreamsBuilderFactoryBean defaultKafkaStreamsBuilder(KafkaStreamsConfiguration defaultKafkaStreamsConfig) {
    return new StreamsBuilderFactoryBean(defaultKafkaStreamsConfig);
}
...
final GlobalKTable<String, LeveragePrice> leverageBySymbolGKTable = streamsBuilder
        .globalTable(KafkaConfiguration.LEVERAGE_PRICE_TOPIC,
                Materialized.<String, LeveragePrice, KeyValueStore<Bytes, byte[]>>as("leverage-by-symbol-table")
                        .withKeySerde(Serdes.String())
                        .withValueSerde(leveragePriceSerde));
leveragePriceView = myKStreamsBuilder.getKafkaStreams().store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
But adding the StreamsBuilderFactoryBean definition (which seems to be needed to get a reference to KafkaStreams) causes an error:
The bean 'defaultKafkaStreamsBuilder', defined in class path resource [com/resona/springkafkastream/repository/KafkaConfiguration.class], could not be registered. A bean with that name has already been defined in class path resource [org/springframework/kafka/annotation/KafkaStreamsDefaultConfiguration.class] and overriding is disabled.
The issue is that I don't want to control the lifecycle of the stream, which is what I get with the plain Kafka APIs. I would like a reference to the default one, managed by Spring, but whenever I try to expose the bean I get the error above. Any ideas on the correct approach using spring-kafka?
P.S. - I am not interested in solutions using spring-cloud-stream; I am looking for implementations with spring-kafka.
You don't need to define any new beans; something like this should work...
spring.application.name=quote-stream
spring.kafka.streams.properties.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.kafka.streams.properties.default.value.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
@SpringBootApplication
@EnableKafkaStreams
public class So69669791Application {

    public static void main(String[] args) {
        SpringApplication.run(So69669791Application.class, args);
    }

    @Bean
    GlobalKTable<String, String> leverageBySymbolGKTable(StreamsBuilder sb) {
        return sb.globalTable("gkTopic",
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("leverage-by-symbol-table"));
    }

    private ReadOnlyKeyValueStore<String, String> leveragePriceView;

    @Bean
    StreamsBuilderFactoryBean.Listener afterStart(StreamsBuilderFactoryBean sbfb,
            GlobalKTable<String, String> leverageBySymbolGKTable) {
        StreamsBuilderFactoryBean.Listener listener = new StreamsBuilderFactoryBean.Listener() {
            @Override
            public void streamsAdded(String id, KafkaStreams streams) {
                leveragePriceView = streams.store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
            }
        };
        sbfb.addListener(listener);
        return listener;
    }

    @Bean
    KStream<String, String> stream(StreamsBuilder builder) {
        KStream<String, String> stream = builder.stream("someTopic");
        stream.to("otherTopic");
        return stream;
    }
}
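Once the listener has fired and the streams instance reaches the RUNNING state, the saved reference can be queried like any ReadOnlyKeyValueStore. An illustrative snippet that could live inside the same application class (the key is hypothetical, and a real application should handle the store not being ready yet):
@Bean
ApplicationRunner queryStore() {
    return args -> {
        // The listener above populates leveragePriceView when the streams start;
        // get() returns null for keys that are not in the store.
        if (leveragePriceView != null) {
            System.out.println("AAPL -> " + leveragePriceView.get("AAPL"));
        }
    };
}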

How can I unit test a KStream with a Kafka Binder?

I want to unit test a Kafka Streams aggregate and I am totally confused about which method to use.
I read about the TestSupportBinder, but I do not think it works in my case, so I am using the embedded Kafka approach instead. This is how I initialize the embedded Kafka:
@Before
public void setUp() throws Exception {
    Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("group-id", "false", embeddedKafka);
    consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    DefaultKafkaConsumerFactory<Object, LoggerMessage> cf = new DefaultKafkaConsumerFactory<>(consumerProps);
    consumer = cf.createConsumer();
    embeddedKafka.consumeFromAnEmbeddedTopic(consumer, OUTPUT_TOPIC);
}
What I want to test is the following:
public interface Channels {

    String LOGGER_IN_STREAM = "logger-topic-in-stream";
    String LOGGER_IN = "logger-topic-in";
    String LOGGERDATAVALIDATED_OUT = "loggerDataValidated-topic-out";

    @Input(Channels.LOGGER_IN)
    SubscribableChannel processMessage();

    @Input(Channels.LOGGER_IN_STREAM)
    KStream<Object, LoggerMessage> loggerKstreamIn();

    @Output(Channels.LOGGERDATAVALIDATED_OUT)
    MessageChannel validateLoggerData();
}
And I get the following error message:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'some.domain.Channels': Invocation of init method failed; nested exception is java.lang.IllegalStateException: No factory found for binding target type: org.apache.kafka.streams.kstream.KStream among registered factories: channelFactory,messageSourceFactory
Caused by: java.lang.IllegalStateException: No factory found for binding target type: org.apache.kafka.streams.kstream.KStream among registered factories: channelFactory,messageSourceFactory
What am I doing wrong?
I had missed injecting my Channels interface as a @MockBean. After I did that, everything worked as expected.
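For reference, a minimal sketch of that fix, assuming JUnit 4 with spring-kafka-test (class name is illustrative, not from the original post):
@RunWith(SpringRunner.class)
@SpringBootTest
public class LoggerStreamTest {

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, OUTPUT_TOPIC);

    // Replaces the real binding target so the context can start without a
    // KStream binding factory being registered.
    @MockBean
    private Channels channels;

    // ... the setUp() method from the question can now run against embeddedKafka
}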

Registering CustomPropertyEditors for use in a BeanWrapper

I am trying to register custom property editors using the following configuration in my Spring Boot application. I referred to the following documentation link, section 5.4.2.1:
@Bean
public static CustomEditorConfigurer customEditorConfigurer() {
    CustomEditorConfigurer configurer = new CustomEditorConfigurer();
    configurer.setPropertyEditorRegistrars(new PropertyEditorRegistrar[] {
            (registry) -> registry.registerCustomEditor(Instant.class, new CustomInstantEditor()) });
    return configurer;
}
When I create a BeanWrapper and use it, I get the following error.
Code:
BeanWrapper newAccountWrapper = new BeanWrapperImpl(newAccount);
newAccountWrapper.setPropertyValue("chardate", value);
Error is:
Exception is Failed to convert property value of type [java.lang.String] to required type [java.time.Instant] for property 'chardate'; nested exception is java.lang.IllegalStateException: Cannot convert value of type [java.lang.String] to required type [java.time.Instant] for property 'chardate': no matching editors or conversion strategy found
But the above code works if I register the custom editor on the BeanWrapper directly:
BeanWrapper newAccountWrapper = new BeanWrapperImpl(newAccount);
newAccountWrapper.registerCustomEditor(Instant.class, new CustomInstantEditor());
So can I not register custom PropertyEditors using the CustomEditorConfigurer BeanFactoryPostProcessor?
Additional Info:
BeanWrapper newAccountWrapper = new BeanWrapperImpl(newAccount);
newAccountWrapper.registerCustomEditor(Instant.class, new CustomInstantEditor());
newAccountWrapper.registerCustomEditor(Money.class, new CustomMoneyEditor());
newAccountWrapper.setAutoGrowNestedPaths(true);
accountDomainElements.forEach((accountElement, value) ->
        newAccountWrapper.setPropertyValue(accountElement, value));
Give this a try. Note that CustomEditorConfigurer registers editors with the application context's bean factory, so they apply when the container itself binds properties; a BeanWrapperImpl you create yourself is not managed by the container and will not pick them up automatically:
@Bean
public CustomEditorConfigurer customEditorConfigurer() {
    CustomEditorConfigurer configurer = new CustomEditorConfigurer();
    Map<Class<?>, Class<? extends PropertyEditor>> customEditors = new HashMap<>();
    customEditors.put(Instant.class, CustomInstantEditor.class);
    configurer.setCustomEditors(customEditors);
    return configurer;
}
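If the same editors are needed on a hand-created BeanWrapper, one option is to keep the registrar as a reusable class and apply it manually, since BeanWrapper implements PropertyEditorRegistry. A sketch (the registrar class is illustrative, not from the original post):
public class InstantEditorRegistrar implements PropertyEditorRegistrar {
    @Override
    public void registerCustomEditors(PropertyEditorRegistry registry) {
        registry.registerCustomEditor(Instant.class, new CustomInstantEditor());
    }
}

// Applying it to a wrapper created by hand:
BeanWrapper newAccountWrapper = new BeanWrapperImpl(newAccount);
new InstantEditorRegistrar().registerCustomEditors(newAccountWrapper);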

Caching Java 8 Optional with Spring Cache

I have a method:
@Cacheable(key = "#jobId")
public Optional<JobInfo> getJobById(String jobId) {
    log.info("Querying for job " + jobId);
    counterService.increment("queryJobById");
    Job job = jobsRepository.findOne(jobId);
    if (job != null) {
        return Optional.of(createDTOFromJob(job));
    }
    return Optional.empty();
}
When I try to retrieve the cached item, I get the following exception:
2016-01-18 00:01:10 ERROR [trace=,span=] http-nio-8021-exec-2 [dispatcherServlet]:182 - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.data.redis.serializer.SerializationException: Cannot serialize; nested exception is org.springframework.core.serializer.support.SerializationFailedException: Failed to serialize object using DefaultSerializer; nested exception is java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [java.util.Optional]] with root cause
java.lang.IllegalArgumentException: DefaultSerializer requires a Serializable payload but received an object of type [java.util.Optional]
Just implement the Serializable interface in your DTO:
@Document(collection = "document_name")
public class Document implements Serializable {

    private static final long serialVersionUID = 7156526077883281623L;
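Applied to the DTO from the question, that would look something like this sketch (the fields are illustrative, not from the original post):
public class JobInfo implements Serializable {

    // Required by the default JDK-serialization-based Redis serializer.
    private static final long serialVersionUID = 1L;

    private String jobId;   // illustrative field
    private String status;  // illustrative field

    // getters and setters omitted
}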
Spring supports caching Optional. The issue is your Redis serializer (probably JdkSerializationRedisSerializer). It uses Java-based serialization, which requires the classes to be Serializable. You can solve this by configuring the RedisCacheManager to use another serializer that doesn't have this limitation. For example, you can use Kryo (com.esotericsoftware:kryo:3.0.3):
@Bean
RedisCacheManager redisCacheManager(RedisTemplate<Object, Object> redisOperations) {
    // redisOperations will be injected if it is configured as a bean, or create it: new RedisTemplate()...
    redisOperations.setDefaultSerializer(new RedisSerializer<Object>() {

        // use a pool because Kryo instances are not thread safe
        KryoPool kryoPool = new KryoPool.Builder(Kryo::new).build();

        @Override
        public byte[] serialize(Object o) throws SerializationException {
            ByteBufferOutput output = new ByteBufferOutput();
            Kryo kryo = kryoPool.borrow();
            try {
                kryo.writeClassAndObject(output, o);
            } finally {
                kryoPool.release(kryo);
                output.close();
            }
            return output.toBytes();
        }

        @Override
        public Object deserialize(byte[] bytes) throws SerializationException {
            if (bytes.length == 0) return null;
            Kryo kryo = kryoPool.borrow();
            Object o;
            try {
                o = kryo.readClassAndObject(new ByteBufferInput(bytes));
            } finally {
                kryoPool.release(kryo);
            }
            return o;
        }
    });
    RedisCacheManager redisCacheManager = new RedisCacheManager(redisOperations);
    redisCacheManager.setCachePrefix(new DefaultRedisCachePrefix("app"));
    redisCacheManager.setTransactionAware(true);
    return redisCacheManager;
}
Note that this is just an example; I didn't test this implementation. But I use the Kryo serializer in production in the same manner for Redis caching with Spring.
This is because your serialized object does not implement Serializable. Alternatively, you can extend the class JdkSerializationRedisSerializer, which implements RedisSerializer.
Example code:
import org.springframework.data.redis.serializer.JdkSerializationRedisSerializer;
import org.springframework.data.redis.serializer.RedisSerializer;
import org.springframework.data.redis.serializer.SerializationException;

public class YourDTOObject extends JdkSerializationRedisSerializer implements Serializable {

    private static final long serialVersionUID = 1L;
    ....
}
For more details and the underlying principles, please visit my blog.
