Trying to insert JSON into Neo4j - Spring

Everyone, I am new to Neo4j and I am trying to insert JSON into Neo4j, but the generated statement is a MATCH instead of a CREATE. Earlier, when I inserted a JSON message containing only strings, such as
{"name":"john","dept":"Science"}
it went without a glitch, but every time I try to add numeric data I get an error.
2020-03-10 13:21:59.793 INFO 94817 --- [ntainer#0-0-C-1] o.n.o.drivers.http.request.HttpRequest : Thread:
29, url: http://localhost:7474/db/data/transaction/92, request: {"statements":[{"statement":"UNWIND {rows}
as row **MATCH** (n) WHERE ID(n)=row.nodeId SET n:`UsersInfo` SET n += row.props RETURN row.nodeId as ref,
ID(n) as id, {type} as type","parameters":{"type":"node","rows":[{"nodeId":23,"props":{"name":"raj",
"dept":"science","age":11}}]},"resultDataContents":["row"],"includeStats":false}]}
These are my classes
KafkaConfiguration
@EnableKafka
@Configuration
public class KafkaConfiguration {
    @Bean
    public ConsumerFactory<String, Users> userConsumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
                new JsonDeserializer<>(Users.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Users> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Users> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(userConsumerFactory());
        return factory;
    }
}
KafkaConsumer class
@Service
public class KafkaConsumer {
    @Autowired
    public Neo4jservice neo4jService;

    @KafkaListener(topics = "UsersJson", groupId = "group_id", containerFactory = "kafkaListenerContainerFactory")
    public void consume(Users users) {
        System.out.println("Consumed message: " + users);
        UsersInfo usern = new UsersInfo();
        usern.setAge(users.getAge());
        usern.setDept(users.getDept());
        usern.setId(users.getId());
        usern.setName(users.getName());
        neo4jService.saveIntoStudentsTable(usern);
    }
}
Neo4jService
@Service
public class Neo4jservice {
    @Autowired
    private UsersRepo userRepo;

    public UsersInfo saveIntoStudentsTable(UsersInfo users) {
        UsersInfo usern = userRepo.save(users);
        return usern;
    }
}
UsersRepo
@Repository
public interface UsersRepo extends Neo4jRepository<UsersInfo, Long> {
}
Users class
public class Users {
    private Long id;
    private String name;
    private String dept;
    private Integer age;
    // getters, setters and toString method here
}
Likewise, the UsersInfo class:
@NodeEntity
public class UsersInfo {
    @Id
    private Long id;
    private String name;
    private String dept;
    private Integer age;
    // getters, setters and toString method here
}
Any help will be greatly appreciated. Thanks

You are also setting the id value on the UsersInfo entity.
This makes Spring Data Neo4j and the Neo4j Object Graph Mapper used for persistence think that the entity already exists.
In that case it will MATCH on the existing id(n) and update its properties, as you can see in the logs, instead of CREATE-ing a new node.
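A minimal sketch of one way around this, assuming you simply want a new node created for each message: don't copy the incoming id onto the node entity (the field and setter names mirror the classes above; depending on your OGM version you may also be able to mark the id with @GeneratedValue):

// In KafkaConsumer.consume(...): leave the internal id unset so SDN/OGM issues a CREATE.
UsersInfo usern = new UsersInfo();
usern.setAge(users.getAge());
usern.setDept(users.getDept());
usern.setName(users.getName());
// usern.setId(users.getId()); // removed: a non-null @Id makes the mapper MATCH an existing node
neo4jService.saveIntoStudentsTable(usern);

If you still need the business id from the message, keep it in a separate property rather than in the @Id field.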

Related

Spring R2DBC Multi Datasource with Spring Boot

I created a service that connects to two schemas (e.g. fo_pgdb and if_pgdb). My issue is that when the service queries a table in the if_pgdb schema, it looks as though it is querying the table in the fo_pgdb schema instead. I have checked, and even hard-coded, the database URLs in both class attributes (shown in the code examples below) and they look fine. What could be the issue?
Example:
A query on the table in the fo_pgdb schema, "select * from bid_lines where bidlinseqnumber in (123, 345)", returns a result set, because ids 123 and 345 have records in that table.
A query on the table in the if_pgdb schema, "select * from bid_lines where bidlinseqnumber in (567, 8910)", returns an empty result set, even though records with ids 567 and 8910 are in that table.
Test: when I use ids 123 and 345 in the query against the if_pgdb table, I get the same records that are in the fo_pgdb table. That should not happen.
@Configuration
@EnableR2dbcRepositories(entityOperationsRef = "foEntityTemplate", basePackages = "com.r2dbc.poc.repository")
public class FODatabaseConfig {

    //@Value("${spring.r2dbc.fo.connection.url}")
    private String url = "r2dbc:postgresql://username:password@database-dev-fo-css-rr-db.corp.com:1200/fo_pgdb";

    @Bean
    @Qualifier("foConnectionFactory")
    public ConnectionFactory foConnectionFactory() {
        return ConnectionFactories.get(url);
    }

    @Bean
    public R2dbcEntityOperations foEntityTemplate(@Qualifier("foConnectionFactory") ConnectionFactory connectionFactory) {
        DefaultReactiveDataAccessStrategy strategy = new DefaultReactiveDataAccessStrategy(PostgresDialect.INSTANCE);
        DatabaseClient databaseClient = DatabaseClient.builder()
                .connectionFactory(connectionFactory)
                .bindMarkers(PostgresDialect.INSTANCE.getBindMarkersFactory())
                .build();
        return new R2dbcEntityTemplate(databaseClient, strategy);
    }
}
@Configuration
@EnableR2dbcRepositories(entityOperationsRef = "ifEntityTemplate")
public class IFDatabaseConfig {

    //@Value("${spring.r2dbc.if.connection.url}")
    private String url = "r2dbc:postgresql://username:password@database-blue-if-CSS-db.corp.com:1200/if_pgdb";

    @Bean
    @Qualifier("ifConnectionFactory")
    public ConnectionFactory ifConnectionFactory() {
        return ConnectionFactories.get(url);
    }

    @Bean
    public R2dbcEntityOperations ifEntityTemplate(@Qualifier("ifConnectionFactory") ConnectionFactory connectionFactory) {
        DefaultReactiveDataAccessStrategy strategy = new DefaultReactiveDataAccessStrategy(PostgresDialect.INSTANCE);
        DatabaseClient databaseClient = DatabaseClient.builder()
                .connectionFactory(connectionFactory)
                .bindMarkers(PostgresDialect.INSTANCE.getBindMarkersFactory())
                .build();
        return new R2dbcEntityTemplate(databaseClient, strategy);
    }
}
@Service
@RequiredArgsConstructor
public class CrewMemberSchedulePeriodPaymentService {
    private final FOCrewMemberBidLineRepository foCrewMemberBidlineRepository;
    private final IFCrewMemberBidLineRepository ifCrewMemberBidLineRepository;

    public Flux<FOCrewMemberBidLine> getFOBidLines(List<Long> id) {
        return foCrewMemberBidlineRepository.findAllById(id);
    }

    public Flux<IFCrewMemberBidLine> getIFBidLines(List<Long> id) {
        return ifCrewMemberBidLineRepository.findAllById(id);
    }
}
@Repository
public interface FOCrewMemberBidLineRepository extends R2dbcRepository<FOCrewMemberBidLine, Long> {
    @Override
    Flux<FOCrewMemberBidLine> findAllById(Iterable<Long> longs);
}

@Repository
public interface IFCrewMemberBidLineRepository extends R2dbcRepository<IFCrewMemberBidLine, Long> {
    @Override
    Flux<IFCrewMemberBidLine> findAllById(Iterable<Long> longs);
}
@Table(value = "BID_LINES")
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Data
public class FOCrewMemberBidLine {
    @Id
    @Column(value = "bidlinseqnumber")
    private Long bidlinseqnumber;

    @Column(value = "bidlinschedperiod")
    private String bidlinschedperiod;
}

@Table(value = "BID_LINES")
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Data
public class IFCrewMemberBidLine {
    @Id
    @Column(value = "bidlinseqnumber")
    private Long bidlinseqnumber;

    @Column(value = "bidlinschedperiod")
    private String bidlinschedperiod;
}
Maybe you can add the connection factory by invoking the method, like this:
@Bean
@Qualifier("foConnectionFactory")
public ConnectionFactory foConnectionFactory() {
    return ConnectionFactories.get("r2dbc:postgresql://username:password@database-dev-fo-css-rr-db.corp.com:1200/fo_pgdb");
}

@Bean
@Qualifier("ifConnectionFactory")
public ConnectionFactory ifConnectionFactory() {
    return ConnectionFactories.get("r2dbc:postgresql://username:password@database-blue-if-CSS-db.corp.com:1200/if_pgdb");
}

@Bean
public R2dbcEntityOperations ifEntityTemplate() {
    DefaultReactiveDataAccessStrategy strategy = new DefaultReactiveDataAccessStrategy(PostgresDialect.INSTANCE);
    DatabaseClient databaseClient = DatabaseClient.builder()
            .connectionFactory(ifConnectionFactory()) // <-- change
            .bindMarkers(PostgresDialect.INSTANCE.getBindMarkersFactory())
            .build();
    return new R2dbcEntityTemplate(databaseClient, strategy);
}

@Bean
public R2dbcEntityOperations foEntityTemplate() {
    DefaultReactiveDataAccessStrategy strategy = new DefaultReactiveDataAccessStrategy(PostgresDialect.INSTANCE);
    DatabaseClient databaseClient = DatabaseClient.builder()
            .connectionFactory(foConnectionFactory()) // <-- change
            .bindMarkers(PostgresDialect.INSTANCE.getBindMarkersFactory())
            .build();
    return new R2dbcEntityTemplate(databaseClient, strategy);
}
You can have all of your beans in the same class and each bean will be created with the connection factory that you need.
Cheers.

Spring Batch - How to make two queries or pass two objects to the Processor or writer?

I am developing Spring Boot / Spring Batch code that reads data from an Oracle DB and loads all of it into MongoDB (a NoSQL DB). The MongoDB model is de-normalized, following the standard way of modelling relations in Mongo.
I have a TableA table, a TableB table, and a join table TableAB between them as a third table. When I read the TableA table via JdbcCursorItemReader<TableA>, for each PK id of TableA I need to query the SubDivision table to get all the SubDivisions for that PK and set the SubDivision data into the TableA model. The TableA model has a list of SubDivisions.
The only way I see is making the query from TableAProcessor and setting the data into the TableA model. It is easy to implement, but the issue is that it makes 100K calls to the DB from TableAProcessor if I have 100K TableA records.
How can I set the SubDivision data on the TableA model, either using a Tasklet or in some other way?
How do I avoid calling so many queries from the Processor?
I can't make a single query due to some limitations, hence I need one more query to the DB to get the SubDivision data.
@Slf4j
public class TableAProcessor implements ItemProcessor<TableA, TableA> {
    @Autowired
    private TableADao tableADao;

    @Override
    public TableA process(TableA tableA) throws Exception {
        log.debug("TableA DETAILS : " + tableA);
        List<SubDivision> subDivisions = tableADao.getSubDivision(tableA.getPKId());
        tableA.setSubDivisions(subDivisions);
        return tableA;
    }
}
Model
public class TableA {
    @Transient
    private Integer Id;
    @Field
    private String mongoId;
    ........
    .......
    @Field
    private List<SubDivision> subDivisions;
}
TableABatchConfig.java
@Configuration
public class TableABatchConfig {
    private static final String sql = "SELECT * FROM TABLEA";

    @Autowired
    @Qualifier(value = "oracleDataSource")
    private DataSource dataSource;

    @Bean(destroyMethod = "")
    @StepScope
    public JdbcCursorItemReader<TableA> TableAReader() throws Exception {
        JdbcCursorItemReader<TableA> reader = new JdbcCursorItemReader<TableA>();
        reader.setDataSource(this.dataSource);
        reader.setSql(sql);
        reader.setRowMapper(new TableARowMapper());
        reader.afterPropertiesSet();
        return reader;
    }

    @Bean
    public ItemProcessor<TableA, TableA> TableAProcessor() {
        return new TableAProcessor();
    }

    @Bean
    public TableAWriter TableAWriter() {
        return new TableAWriter();
    }
}
TableAJob.java
@Configuration
@PropertySource("classpath:application.properties")
public class TableAJob {
    @Value("${spring.chunk.size}")
    private String chunkSize;

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JdbcCursorItemReader<TableA> TableAReader;

    @Autowired
    private ItemProcessor<TableA, TableA> TableAProcessor;

    @Autowired
    private TableAWriter TableAWriter;

    @Bean
    public TableAStepExecuListner TableAStepExecuListner() {
        return new TableAStepExecuListner();
    }

    @Bean("readTableAJob")
    @Primary
    public Job readTableAJob() {
        return jobBuilderFactory.get("readTableAJob")
                .incrementer(new RunIdIncrementer())
                .start(TableAStepOne())
                .build();
    }

    @Bean
    public Step TableAStepOne() {
        return stepBuilderFactory.get("TableAStepOne")
                .<TableA, TableA>chunk(Integer.parseInt(chunkSize))
                .reader(TableAReader)
                .processor(TableAProcessor)
                .writer(TableAWriter)
                .listener(TableAStepExecuListner())
                .build();
    }
}
dao
@Service
public class TableADao {
    private static final String SQL = "COMPLEX JOIN QUERY";

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public List<SubDivision> getSubDivision(Integer pkId) {
        List<Map<String, Object>> results = jdbcTemplate.queryForList(SQL, new Object[] { pkId });
        List<SubDivision> divisions = new ArrayList<>();
        for (Map<String, Object> row : results) {
            divisions.add(SubDivision.builder().subDivisionCd((String) row.get("SUBDIVISION_CD"))
                    ......
                    .........
                    .........
                    ......
                    .build());
        }
        return divisions;
    }
}
TableAWriter.java
public class TableAWriter implements ItemWriter<TableA> {
    @Autowired
    private TableARepository TableARepository;

    @Override
    public void write(List<? extends TableA> items) throws Exception {
        TableARepository.saveAll(items);
    }
}

Best approach to load application.yml in a Spring Boot application

I have a Spring Boot application with an application.yml containing different properties, which I load as below.
@Configuration
@ConfigurationProperties(prefix = "applicationprops")
public class ApplicationPropHolder {
    private Map<String, String> mapProperty;
    private List<String> myListProperty;
    // Getters & Setters
}
My service or controller class, in which I read these properties, looks like below.
@Service
public class ApplicationServiceImpl {
    @Autowired
    private ApplicationPropHolder applicationPropHolder;

    public String getExtServiceInfo() {
        Map<String, String> mapProperty = applicationPropHolder.getMapProperty();
        String userName = mapProperty.get("user.name");
        List<String> listProp = applicationPropHolder.getMyListProperty();
    }
}
My application.yml
spring:
  profile: dev
applicationprops:
  mapProperty:
    user.name: devUser
  myListProperty:
    - DevTestData
---
spring:
  profile: stagging
applicationprops:
  mapProperty:
    user.name: stageUser
  myListProperty:
    - StageTestData
My questions are
In my service class I am defining a variable and assigning the property map on every method invocation. Is that the right approach?
Is there a better way I can get these maps without assigning a local variable?
There are three easy ways you can assign the values to instance variables in your bean class.
Use the @Value annotation as follows:
@Value("${applicationprops.mapProperty.user.name}")
private String userName;
Use the @PostConstruct annotation as follows:
@PostConstruct
public void fetchPropertiesAndAssignToInstanceVariables() {
    Map<String, String> mapProperties = applicationPropHolder.getMapProperty();
    this.userName = mapProperties.get("user.name");
}
Use @Autowired on a setter as follows:
@Autowired
public void setApplicationPropHolder(ApplicationPropHolder propHolder) {
    this.userName = propHolder.getMapProperty().get("user.name");
}
There may be others, but I'd say these are the most common ways.
Hopefully your code is fine.
Just use the below:
@Configuration
@ConfigurationProperties(prefix = "applicationprops")
public class ApplicationPropHolder {
    private Map<String, String> mapProperty;
    private List<String> myListProperty;

    public String getUserName() {
        return mapProperty.get("user.name");
    }

    public String getUserName(final String key) {
        return mapProperty.get(key);
    }
}
@Service
public class ApplicationServiceImpl {
    @Autowired
    private ApplicationPropHolder applicationPropHolder;

    public String getExtServiceInfo() {
        final String userName = applicationPropHolder.getUserName();
        final List<String> listProp = applicationPropHolder.getMyListProperty();
    }
}

Ignite : select query returns null

I am new to Ignite. I am trying to fetch data using an Ignite repository, but the query below returns null.
My repository:
@Component
@RepositoryConfig(cacheName = "UserCache")
@Repository
public interface UserRepository extends IgniteRepository<UserEntity, Long> {
    @Query("select a.* from UserEntity a where a.lastname=? ")
    UserEntity selectUserlastName(String plastName);
}
My cache configuration is as follows:
CacheConfiguration<Long, UserEntity> lUserCacheConfig =
        createCacheConfigurationStore("UserCache", UserCacheStore.class);

CacheJdbcPojoStoreFactory<Long, UserEntity> lUserJdbcStoreFactory = new CacheJdbcPojoStoreFactory<>();
UserJdbcPojoStoreFactory<? super Long, ? super UserEntity> lUserJdbcPojoStoreFactory = new UserJdbcPojoStoreFactory<>();

lUserJdbcStoreFactory.setDataSource(datasource);
lUserJdbcStoreFactory.setDialect(new OracleDialect());
lUserJdbcStoreFactory.setTypes(lUserJdbcPojoStoreFactory.configJdbcContactType());
lUserCacheConfig.setCacheStoreFactory(lUserJdbcStoreFactory);

// Configure cache.
cfg.setCacheConfiguration(lUserCacheConfig);
My PojoStore is as below:
public class UserJdbcPojoStoreFactory<K, V> extends AbstractJdbcPojoStoreFactory<Long, UserEntity> {

    private static final long serialVersionUID = 1L;

    @Autowired
    DataSource datasource;

    @Override
    public CacheJdbcPojoStore<Long, UserEntity> create() {
        setDataSource(datasource);
        return super.create();
    }

    @Override
    public JdbcType configJdbcContactType() {
        JdbcType jdbcContactType = new JdbcType();
        jdbcContactType.setCacheName("UserCache");
        jdbcContactType.setKeyType(Long.class);
        jdbcContactType.setValueType(UserEntity.class);
        jdbcContactType.setDatabaseTable("USER");
        jdbcContactType.setDatabaseSchema("ORGNITATION");
        jdbcContactType.setKeyFields(new JdbcTypeField(Types.INTEGER, "id", Long.class, "id"));
        jdbcContactType.setValueFields(
                new JdbcTypeField(Types.VARCHAR, "NAME", String.class, "NAME"),
                new JdbcTypeField(Types.VARCHAR, "LASTNAME", String.class, "lastname"));
        return jdbcContactType;
    }
}
Please suggest ..
Please check that the @Query annotation is imported from the ignite-spring-data library, and test your query directly using SqlFieldsQuery.
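As a rough sanity check (a sketch, not the repository code itself), you could run the same SQL straight against the cache with SqlFieldsQuery. The ignite instance variable and the sample last name below are assumptions, and the lastname field must be visible to SQL (e.g. annotated with @QuerySqlField) for this to return rows:

IgniteCache<Long, UserEntity> cache = ignite.cache("UserCache"); // 'ignite' is an assumed Ignite instance

SqlFieldsQuery qry = new SqlFieldsQuery(
        "select a.* from UserEntity a where a.lastname = ?").setArgs("smith"); // "smith" is a sample value

try (FieldsQueryCursor<List<?>> cursor = cache.query(qry)) {
    for (List<?> row : cursor) {
        System.out.println(row); // no rows here points at cache loading / SQL mapping, not at the repository
    }
}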

How to map configuration objects to a Java object

I have a Spring Boot application which is using Spring Cloud Config.
How can I map a configuration element to a Java object?
My config is something like this:
clients:
  - id: 1
    name: client 1
    groups: [a, b]
  - id: 2
    name: client 2
    groups: [a]
And my Java object is:
public class ClientInfo {
    private String clientId;
    private List<String> profiles;

    public ClientInfo(String clientId, List<String> pips) {
        this.clientId = clientId;
        this.profiles = pips;
    }

    public String getClientId() {
        return clientId;
    }

    public void setClientId(String clientId) {
        this.clientId = clientId;
    }

    public List<String> getProfiles() {
        return profiles;
    }

    public void setProfiles(List<String> profiles) {
        this.profiles = profiles;
    }
}
I want to map my configuration to a List<ClientInfo>.
Use the code below to map configuration properties into a Java object:
@Component
@EnableConfigurationProperties
@ConfigurationProperties(prefix = "clients")
public class ClientInfo {
    private String id;
    private String name;
    private List<String> groups;

    public String getId() { return id; }
    public String getName() { return name; }
    public List<String> getGroups() { return groups; }
}
Check the following for an example: http://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-external-config.html
Inject this class into another class:
@Autowired
private ClientInfo clientInfo;
The above autowiring will not work if the class is instantiated using the new operator.
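For illustration, a small sketch of that point (the consumer class name is made up; it assumes the annotated ClientInfo bean from the snippet above, which has a default constructor):

@Service
public class SomeClientAwareService { // hypothetical class
    @Autowired
    private ClientInfo clientInfo; // Spring-managed bean: properties from application.yml are bound

    public void demo() {
        clientInfo.getGroups();               // populated by Spring
        ClientInfo manual = new ClientInfo(); // created via "new": Spring never binds properties on it
        manual.getGroups();                   // stays null
    }
}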
Actually, I found the reason why it was not working.
All that was needed was another class which contains a list of ClientInfo and has the @EnableConfigurationProperties and @ConfigurationProperties annotations on it. This is because "clients" in my configuration is a list. After this change we can use the @Autowired annotation to inject the configuration.
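A minimal sketch of such a wrapper class, based on the description above (the class name ClientsProperties is made up; the empty prefix lets the top-level clients: key bind to the list, and ClientInfo needs setters or a matching constructor for the id, name and groups fields used in the YAML):

@Configuration
@EnableConfigurationProperties
@ConfigurationProperties // no prefix: binds the root-level "clients" key to the field below
public class ClientsProperties {
    private List<ClientInfo> clients = new ArrayList<>();

    public List<ClientInfo> getClients() { return clients; }
    public void setClients(List<ClientInfo> clients) { this.clients = clients; }
}

After that, ClientsProperties (or the List<ClientInfo> via its getter) can be injected with @Autowired wherever it is needed.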
