Spring Neo4j deep relationship not working

I have 2 users in my neo4j DB:
here are my node objects:
@Node
public class User {
    @Id
    @GeneratedValue
    private Long id;

    @Property("user_id")
    private Long userId;

    @Property("name")
    private String name;

    @Relationship(type = "HAS_ACCESS", direction = Relationship.Direction.OUTGOING)
    public List<HasAccessRelation> relation;
}

@AllArgsConstructor
@NoArgsConstructor
@RelationshipProperties
public class HasAccessRelation {
    @RelationshipId
    private Long id;

    private String type;

    @TargetNode
    private Folder folder;
}

@Node
public class Folder {
    @Id
    @GeneratedValue
    private Long id;

    @Property("folder_id")
    private Long folderId;

    @Property("owner_user_id")
    private Long ownerUserId;

    @Property("name")
    private String name;

    @Relationship(type = "HAS_ACCESS", direction = Relationship.Direction.OUTGOING)
    public List<HasAccessRelation> relations = new ArrayList<>();
}
When I delete the list of HasAccessRelation from my Folder class, everything works as expected: my User repository's findAll method returns users with their HAS_ACCESS relationships, and inside these relationships I have my folders. But I want to return the whole folder structure a user has access to (a deeper graph), so I added the list of relationships to my Folder node object, and it stopped working. The exception I'm getting:
org.neo4j.driver.exceptions.DatabaseException: Expected
RegularSinglePlannerQuery(QueryGraph {Nodes: [' rootNodeIds#7'], Predicates: ['id(` rootNodeIds#7`) IN $rootNodeIds']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(),Map(n -> FunctionInvocation(Namespace(List()),FunctionName(collect),false,Vector(Variable( rootNodeIds#7)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['n'], Optional Matches: : ['QueryGraph {Rels: ['( UNNAMED105)--[relationshipIds]--( UNNAMED126)'], Predicates: ['id(relationshipIds) IN $relationshipIds']}']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(n -> Variable(n)),Map(__sr__ -> FunctionInvocation(Namespace(List()),FunctionName(collect),true,Vector(Variable(relationshipIds)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', 'n'], Optional Matches: : ['QueryGraph {Nodes: ['relatedNodeIds'], Predicates: ['id(relatedNodeIds) IN $relatedNodeIds']}']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(n -> Variable(n), __sr__ -> Variable(__sr__)),Map(__srn__ -> FunctionInvocation(Namespace(List()),FunctionName(collect),true,Vector(Variable(relatedNodeIds)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', '__srn__', 'n']},InterestingOrder(RequiredOrderCandidate(List()),List()),UnwindProjection( rootNodeIds#384,Variable(n)),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: [' rootNodeIds#384', '__sr__', '__srn__', 'n']},InterestingOrder(RequiredOrderCandidate(List()),List()),RegularQueryProjection(Map(user -> Variable( rootNodeIds#384), __sr__ -> Variable(__sr__), __srn__ -> Variable(__srn__)),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', '__srn__', 
'user']},InterestingOrder(RequiredOrderCandidate(List()),List()),RegularQueryProjection(Map(__sn__ -> Variable(user), __sr__ -> Variable(__sr__), __srn__ -> Variable(__srn__)),QueryPagination(None,None),Selections(Set())),None,None)),None)),None)),None)),None)),None)
Instead, got:
RegularSinglePlannerQuery(QueryGraph {Nodes: [' rootNodeIds#7'], Predicates: ['id(` rootNodeIds#7`) IN $rootNodeIds']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(),Map(n -> FunctionInvocation(Namespace(List()),FunctionName(collect),false,Vector(Variable( rootNodeIds#7)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['n'], Optional Matches: : ['QueryGraph {}']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(n -> Variable(n)),Map(__sr__ -> FunctionInvocation(Namespace(List()),FunctionName(collect),true,Vector(Variable(relationshipIds)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', 'n'], Optional Matches: : ['QueryGraph {Nodes: ['relatedNodeIds'], Predicates: ['id(relatedNodeIds) IN $relatedNodeIds']}']},InterestingOrder(RequiredOrderCandidate(List()),List()),AggregatingQueryProjection(Map(n -> Variable(n), __sr__ -> Variable(__sr__)),Map(__srn__ -> FunctionInvocation(Namespace(List()),FunctionName(collect),true,Vector(Variable(relatedNodeIds)))),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', '__srn__', 'n']},InterestingOrder(RequiredOrderCandidate(List()),List()),UnwindProjection( rootNodeIds#384,Variable(n)),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: [' rootNodeIds#384', '__sr__', '__srn__', 'n']},InterestingOrder(RequiredOrderCandidate(List()),List()),RegularQueryProjection(Map(user -> Variable( rootNodeIds#384), __sr__ -> Variable(__sr__), __srn__ -> Variable(__srn__)),QueryPagination(None,None),Selections(Set())),Some(RegularSinglePlannerQuery(QueryGraph {Arguments: ['__sr__', '__srn__', 'user']},InterestingOrder(RequiredOrderCandidate(List()),List()),RegularQueryProjection(Map(__sn__ -> Variable(user), __sr__ -> Variable(__sr__), __srn__ -> 
Variable(__srn__)),QueryPagination(None,None),Selections(Set())),None,None)),None)),None)),None)),None)),None)
Plan: ProduceResult(List(__sn__, __sr__, __srn__)) {
Could someone please help me to figure out what I'm doing wrong?
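Not from the original post, but a common escape hatch when the generated queries choke on a self-referencing relationship is a custom Cypher query that returns whole paths and lets the mapper rebuild the nested structure. A hedged sketch (the repository name, method name, and query are my assumptions, not the poster's code):

```java
import java.util.List;

import org.springframework.data.neo4j.repository.Neo4jRepository;
import org.springframework.data.neo4j.repository.query.Query;

// Illustrative repository: fetch each user together with every folder
// reachable over any number of HAS_ACCESS hops, returning the paths so
// Spring Data Neo4j can map the deep structure.
public interface UserRepository extends Neo4jRepository<User, Long> {

    @Query("MATCH p = (u:User)-[:HAS_ACCESS*]->(:Folder) "
         + "RETURN u, collect(nodes(p)), collect(relationships(p))")
    List<User> findAllWithFolderTree();
}
```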

Related

Speed up integration tests with withVirtualTime - stream of Flux.range with .delayElements

I have a controller that returns a Flux of Strings - basically a stream of data (Strings containing the current time), where each String is delayed by 1 second using .delayElements(Duration.ofSeconds(1)) on the Flux.
Controller:
@GetMapping(value = "/currentTimeReactiveFlux", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
@ResponseBody
public Flux<String> exampleOfFluxDeleyedResponse() {
    return Flux.range(1, 1000)
        .delayElements(Duration.ofSeconds(1))
        .map(integer -> "Current time (1) is " + OffsetDateTime.now())
        .doOnNext(s -> log.info("S: " + s + " -> " + LocalDateTime.now().getSecond()))
        .doOnCancel(() -> log.info("Cancelled"));
}
Is it possible to speed this up in integration tests with WebTestClient? Right now the test runs slowly (it waits out the real delays):
Controller integration test:
@Autowired
WebTestClient webTestClient;

@Test
void testSpeedUpStreamsOfFlux() {
    FluxExchangeResult<String> result = webTestClient.get().uri("/currentTimeReactiveFlux")
        .accept(MediaType.TEXT_EVENT_STREAM)
        .exchange()
        .expectStatus().isOk()
        .returnResult(String.class);
    Flux<String> eventFlux = result.getResponseBody();
    StepVerifier.withVirtualTime(() -> eventFlux)
        .expectNextCount(10)
        .thenAwait(Duration.ofSeconds(10))
        .thenCancel()
        .verify();
}
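One thing worth noting: StepVerifier.withVirtualTime only virtualizes time for operators assembled inside its supplier, so a Flux that has already been created and travelled over HTTP (as in the WebTestClient result above) still runs on the real clock. A hedged sketch that instead tests the time-based pipeline directly, so delayElements picks up the VirtualTimeScheduler (the class name is mine):

```java
import java.time.Duration;
import java.time.OffsetDateTime;

import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;

public class VirtualTimeSketch {
    public static void main(String[] args) {
        // The Flux is built *inside* the withVirtualTime supplier, so
        // delayElements is scheduled on the VirtualTimeScheduler rather
        // than wall-clock time, and the test finishes instantly.
        StepVerifier.withVirtualTime(() ->
                Flux.range(1, 1000)
                    .delayElements(Duration.ofSeconds(1))
                    .map(i -> "Current time (1) is " + OffsetDateTime.now()))
            .thenAwait(Duration.ofSeconds(10))  // advance virtual time by 10s
            .expectNextCount(10)
            .thenCancel()
            .verify();
    }
}
```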

Delay in processing with Kafka Streams

I have created a Kafka Streams topology with 1 source and 2 sinks.
I am using Spring Boot (2.1.9) with Kafka Streams, not Spring Cloud. Kafka version 2.3.0.
@Configuration
@EnableKafkaStreams
public class StreamStart {

    @Bean
    public KStream<String, String> process(StreamsBuilder builder) {
        KStream<String, String> inputStream = builder.stream("streamIn", Consumed.with(Serdes.String(), Serdes.String()));
        KStream<String, String> upperCaseStream = inputStream.mapValues(value -> value.toUpperCase());
        upperCaseStream.to("outTopic", Produced.with(Serdes.String(), Serdes.String()));

        KTable<String, Long> wordCounts = upperCaseStream
            .flatMapValues(v -> Arrays.asList(v.split(" ")))
            .selectKey((k, v) -> v)
            .groupByKey(Serialized.with(Serdes.String(), Serdes.String()))
            .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("counts-store"));
        wordCounts.toStream().to("wordCountTopic", Produced.with(Serdes.String(), Serdes.Long()));

        return upperCaseStream;
    }
}
The data flows into outTopic instantaneously, whereas each record takes 20-25 seconds to show up in wordCountTopic.
Any suggestions?
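For context on the delay: KTable updates sit in the Streams record cache and are only forwarded downstream on commit or cache eviction, and the default commit interval is 30 seconds, which lines up with the observed 20-25 second lag. The stateless outTopic branch has no such buffering, so it emits immediately. A hedged sketch of the relevant StreamsConfig settings (the props variable is assumed to be your streams configuration map):

```java
// Forward each count update (almost) immediately, at the cost of more
// downstream records. Setting the cache to 0 disables buffering entirely;
// lowering commit.interval.ms alone also shortens the flush cycle.
props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 1000);
props.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
```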

Spring Data JPA Criteria API - how to search by field equals within two entities?

I have 3 different entities: Partner is the main entity, Offer is a partner's offer (many-to-one), and ObjectLocation is any location belonging to a partner.
@Entity
class ObjectLocation(@ManyToOne var place: Place, var partnerId: String) {
    constructor() : this(Place(), "")

    @Id
    var id: String = IDGenerator.longId()
    // other fields omitted
    ...
}

@Entity
class Offer(var partnerId: String, ...) {
    constructor() : this(...)

    @Id
    var id: String = IDGenerator.longId()
    ...
}

@Entity
class Partner(...) {
    constructor() : this(...)

    @Id
    var id: String = IDGenerator.longId()
    ...
}
So, I need to find all the Offers by the Place criteria. I've tried this:
Specification {
    root: Root<Offer>, criteriaQuery: CriteriaQuery<*>, criteriaBuilder: CriteriaBuilder ->
    val objectLocationRoot = criteriaQuery.from(ObjectLocation::class.java)
    val objectCityId: Expression<String> = objectLocationRoot
        .get<Place>("place")
        .get<City>("parentCity")
        .get<String>("id")
    val objectPartnerId: Expression<String> = objectLocationRoot.get<String>("partnerId")
    val offerPartnerId: Expression<String> = root.get<String>("partnerId")
    val goodLocations: Predicate = criteriaBuilder.equal(objectCityId, cityId)
    val objQuery: Subquery<String> = criteriaQuery.subquery(String::class.java)
        .select(objectPartnerId)
        .where(goodLocations)
    return@Specification criteriaBuilder.equal(objQuery, offerPartnerId)
}
But this only gave me the following exception:
antlr.NoViableAltException: unexpected token: where
at org.hibernate.hql.internal.antlr.HqlBaseParser.fromRange(HqlBaseParser.java:1519) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.fromClause(HqlBaseParser.java:1343) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.selectFrom(HqlBaseParser.java:1063) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.queryRule(HqlBaseParser.java:748) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.subQuery(HqlBaseParser.java:3910) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.primaryExpression(HqlBaseParser.java:967) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.hql.internal.antlr.HqlBaseParser.atom(HqlBaseParser.java:3549) [hibernate-core-5.2.17.Final.jar:5.2.17.Final]
....
org.springframework.dao.InvalidDataAccessApiUsageException: org.hibernate.hql.internal.ast.QuerySyntaxException: unexpected token: where near line 1, column 175
[select generatedAlias0 from com.arkell.entity.Offer as generatedAlias0, com.arkell.entity.geo.ObjectLocation as generatedAlias1 where
(select generatedAlias1.partnerId from where generatedAlias1.place.parentCity.id=:param0)=generatedAlias0.partnerId]; nested exception is java.lang.IllegalArgumentException:
org.hibernate.hql.internal.ast.QuerySyntaxException: unexpected token: where near line 1, column 175 [select generatedAlias0
from com.arkell.entity.Offer as generatedAlias0, com.arkell.entity.geo.ObjectLocation as generatedAlias1 where
(select generatedAlias1.partnerId from where generatedAlias1.place.parentCity.id=:param0)=generatedAlias0.partnerId]
at org.springframework.orm.jpa.EntityManagerFactoryUtils.convertJpaAccessExceptionIfPossible(EntityManagerFactoryUtils.java:367)
at org.springframework.orm.jpa.vendor.HibernateJpaDialect.translateExceptionIfPossible(HibernateJpaDialect.java:227)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.translateExceptionIfPossible(AbstractEntityManagerFactoryBean.java:527)
at org.springframework.dao.support.ChainedPersistenceExceptionTranslator.translateExceptionIfPossible(ChainedPersistenceExceptionTranslator.java:61)
at org.springframework.dao.support.DataAccessUtils.translateIfNecessary(DataAccessUtils.java:242)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:153)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodInterceptor.invoke(CrudMethodMetadataPostProcessor.java:135)
How it is possible to find all offers by their partner's location cities?
Oh wow, that was very easy.
Specification {
    root: Root<Offer>, criteriaQuery: CriteriaQuery<*>, criteriaBuilder: CriteriaBuilder ->
    val objectLocationRoot: Root<ObjectLocation> = criteriaQuery.distinct(true).from(ObjectLocation::class.java)
    return@Specification criteriaBuilder.and(
        criteriaBuilder.equal(root.get<String>("partnerId"), objectLocationRoot.get<String>("partnerId")),
        criteriaBuilder.equal(objectLocationRoot.get<Place>("place")
            .get<City>("parentCity")
            .get<String>("id"), cityId)
    )
}
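For reference, the original subquery failed because it selected an expression rooted in the outer query and never called from(...) on the subquery itself, which is why Hibernate rendered the malformed "from where" fragment. A hedged sketch of the subquery done with its own root, in Java (variable names are mine; root, criteriaQuery, criteriaBuilder, and cityId come from the surrounding Specification):

```java
// A subquery needs its own root; correlating on partnerId then works.
Subquery<String> sub = criteriaQuery.subquery(String.class);
Root<ObjectLocation> subRoot = sub.from(ObjectLocation.class);
sub.select(subRoot.<String>get("partnerId"))
   .where(criteriaBuilder.equal(
       subRoot.get("place").get("parentCity").get("id"), cityId));

// Offers whose partnerId appears in the subquery's result set.
Predicate byCity = root.get("partnerId").in(sub);
```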

Batch processing an ArrayList with LongStream using Java 8

I am trying to process an ArrayList with content of Long type in batches, as in the example below, using Java 8's LongStream, but I get the error below.
import java.util.*;
import java.util.stream.*;

public class HelloWorld {
    public static void main(String[] args) {
        List<Long> data = new LinkedList<>();
        for (Long j = 0L; j < 300L; j++) {
            data.add(j);
        }
        int BATCH = 10;
        LongStream.range(0, (data.size() + BATCH - 1) / BATCH)
            .mapToLong(i -> data.subList(i * BATCH, Math.min(data.size(), (i + 1) * BATCH)))
            .forEach(batch -> process(batch));
    }

    static void process(List<Long> list) {
        System.out.println(list);
    }
}
But I get the exception below. I have tried mapToLong instead of map, but mapToLong is not accepted either.
$javac HelloWorld.java
HelloWorld.java:13: error: incompatible types: possible lossy
conversion from long to int
.map(i -> data.subList(i*BATCH, Math.min(data.size(),
(i+1)*BATCH)))
^
HelloWorld.java:14: error: incompatible types: long cannot be
converted to List<Long>
.forEach(batch -> process(batch));
^
2 errors
map in LongStream is supposed to map an element of the LongStream to a long, not to a List.
Use mapToObj:
LongStream.range(0, (data.size() + BATCH - 1) / BATCH)
    .mapToObj(i -> data.subList((int) i * BATCH, (int) Math.min(data.size(), (i + 1) * BATCH)))
    .forEach(batch -> process(batch));
Or:
IntStream.range(0, (data.size() + BATCH - 1) / BATCH)
    .mapToObj(i -> data.subList(i * BATCH, Math.min(data.size(), (i + 1) * BATCH)))
    .forEach(batch -> process(batch));
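As a self-contained illustration of the mapToObj fix (class and method names are mine), with the batching extracted into a helper so the boundary arithmetic is easy to check:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.LongStream;

public class BatchDemo {

    // Split the list into consecutive sublists of at most batchSize elements;
    // (size + batchSize - 1) / batchSize rounds up, so a partial tail batch
    // is included.
    static List<List<Long>> partition(List<Long> data, int batchSize) {
        return IntStream.range(0, (data.size() + batchSize - 1) / batchSize)
            .mapToObj(i -> data.subList(i * batchSize,
                                        Math.min(data.size(), (i + 1) * batchSize)))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Long> data = LongStream.range(0, 25).boxed().collect(Collectors.toList());
        List<List<Long>> batches = partition(data, 10);
        System.out.println(batches.size());  // 3
        System.out.println(batches.get(2));  // [20, 21, 22, 23, 24]
    }
}
```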

Sort Map<String, Long> by value reversed

I have a Map<String, Long> map which I want to sort by the Long value in reversed order, using the features of Java 8. Via Google I found this thread, which provides this solution:
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(comparing(Entry::getValue))
.collect(toMap(Entry::getKey, Entry::getValue,
(e1,e2) -> e1, LinkedHashMap::new));
If I want to have the order reversed, the comments there say to use comparing(Entry::getValue).reversed() instead of comparing(Entry::getValue).
However, that code doesn't compile. With this little adaptation it does:
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(Comparator.comparing(Entry::getValue))
.collect(Collectors.toMap(Entry::getKey, Entry::getValue,
(e1, e2) -> e1, LinkedHashMap::new));
Do I have to add some imports first to be able to run the original code?
What still remains is how to get the reversed order, since
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(Comparator.comparing(Entry::getValue).reversed())
.collect(Collectors.toMap(Entry::getKey, Entry::getValue,
(e1, e2) -> e1, LinkedHashMap::new));
gives me an error message:
The type Map.Entry does not define getValue(Object) that is applicable here
As explained in this answer, the type inference of Java 8 hit its limit when you chain method invocations like in Comparator.comparing(Entry::getValue).reversed().
In contrast, when using nested invocations like in Collections.reverseOrder(Comparator.comparing(Entry::getValue)) it will work.
Of course, you can use static imports:
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(reverseOrder(comparing(Entry::getValue)))
.collect(toMap(Entry::getKey, Entry::getValue,
(e1, e2) -> e1, LinkedHashMap::new));
but it should be noted that the compiler likes to provide misleading error messages when you forget an import static statement (i.e. the method can’t be found) and combine it with lambda expressions or method references.
As a final note, there are also the existing comparator implementations Map.Entry.comparingByValue() and Map.Entry.comparingByValue(Comparator) which allow you to use
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(reverseOrder(comparingByValue()))
.collect(toMap(Entry::getKey, Entry::getValue,
(e1, e2) -> e1, LinkedHashMap::new));
or
Map<String, Long> sortedMap = map.entrySet().stream()
.sorted(comparingByValue(reverseOrder()))
.collect(toMap(Entry::getKey, Entry::getValue,
(e1, e2) -> e1, LinkedHashMap::new));
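Putting the comparingByValue variant into a self-contained sketch (class and method names are mine):

```java
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SortDemo {

    // Sort a map by value, largest first, preserving that order in a LinkedHashMap.
    static Map<String, Long> sortByValueDesc(Map<String, Long> map) {
        return map.entrySet().stream()
            .sorted(Map.Entry.comparingByValue(Comparator.reverseOrder()))
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue,
                (e1, e2) -> e1, LinkedHashMap::new));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = Map.of("a", 3L, "b", 1L, "c", 2L);
        // Key order follows descending values: a(3), c(2), b(1).
        System.out.println(List.copyOf(sortByValueDesc(counts).keySet()));  // [a, c, b]
    }
}
```

Note that the nested form Map.Entry.comparingByValue(Comparator.reverseOrder()) sidesteps the type-inference problem that breaks the chained .reversed() call.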
