Spring Data MongoDB write operation from ApplicationRunner

I have a problem writing to a MongoDB instance. The problem is that I can't write anything from this class:
@Component
@AllArgsConstructor
public class DemoDataWriter implements ApplicationRunner {

    private WarehouseRepository warehouseRepository;
    private CustomerRepository customerRepository;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        Customer customer1 = new Customer("Gleb", new Coordinate(4, 3));
        Customer customer2 = new Customer("Sasha", new Coordinate(8, 9));
        Customer customer3 = new Customer("Misha", new Coordinate(15, 10));

        Map<String, Integer> merchandiseQuantity1 = new HashMap<>();
        merchandiseQuantity1.put("computer", 16);
        merchandiseQuantity1.put("bebra", 6);
        merchandiseQuantity1.put("vacine", 10);

        Map<String, Integer> merchandiseQuantity2 = new HashMap<>();
        merchandiseQuantity2.put("laptop", 100);
        merchandiseQuantity2.put("grivna", 20);
        merchandiseQuantity2.put("beer", 1);

        Map<String, Integer> merchandiseQuantity3 = new HashMap<>();
        merchandiseQuantity3.put("cup", 13);
        merchandiseQuantity3.put("chair", 90);
        merchandiseQuantity3.put("notebook", 18);

        Map<String, Integer> merchandiseQuantity4 = new HashMap<>();
        merchandiseQuantity4.put("gun", 54);
        merchandiseQuantity4.put("answer", 42);
        merchandiseQuantity4.put("computer", 4);

        Map<String, Integer> merchandiseQuantity5 = new HashMap<>();
        merchandiseQuantity5.put("gun", 16);
        merchandiseQuantity5.put("grinva", 6);
        merchandiseQuantity5.put("charger", 132);

        Map<String, Integer> merchandiseQuantity6 = new HashMap<>();
        merchandiseQuantity6.put("computer", 16);
        merchandiseQuantity6.put("bebra", 6);
        merchandiseQuantity6.put("vacine", 10);

        Warehouse warehouse1 = new Warehouse("Compluter Inc", merchandiseQuantity1, new Coordinate(43, 12));
        Warehouse warehouse2 = new Warehouse("Bebra", merchandiseQuantity2, new Coordinate(21, 89));
        Warehouse warehouse3 = new Warehouse("LG", merchandiseQuantity3, new Coordinate(15, 90));
        Warehouse warehouse4 = new Warehouse("Abchihba", merchandiseQuantity4, new Coordinate(567, 890));
        Warehouse warehouse5 = new Warehouse("Node", merchandiseQuantity5, new Coordinate(389, 54));
        Warehouse warehouse6 = new Warehouse("Meta", merchandiseQuantity6, new Coordinate(321, 590));

        customerRepository.save(customer1);
        customerRepository.save(customer2);
        customerRepository.save(customer3);
        warehouseRepository.save(warehouse1);
        warehouseRepository.save(warehouse2);
        warehouseRepository.save(warehouse3);
        warehouseRepository.save(warehouse4);
        warehouseRepository.save(warehouse5);
        warehouseRepository.save(warehouse6);
    }
}
But I can write to the database from a Controller:
@PostMapping(path = "/customer/create")
public Customer createNewCustomer(@RequestBody Customer customer) {
    System.out.println(customer.toString());
    return customerRepository.save(customer);
}
Maybe the problem is in the way I am using Spring for this logic, and I just need to find a different way of completing such an operation.

My suggestion is to check whether you are able to get inside the run method in debug mode.
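If attaching a debugger is inconvenient, the same check can be done with a couple of log statements. This is only a minimal sketch using the classes from the question; the printed lines (which are my addition, not part of the original post) simply confirm whether run() is invoked at all and whether the repository was actually injected.

import lombok.AllArgsConstructor;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.stereotype.Component;

@Component
@AllArgsConstructor
public class DemoDataWriter implements ApplicationRunner {

    private CustomerRepository customerRepository;

    @Override
    public void run(ApplicationArguments args) {
        // If this line never shows up at startup, the runner is not being executed at all
        // (for example because the class sits outside the component-scan packages).
        System.out.println("DemoDataWriter.run() invoked, customerRepository = " + customerRepository);

        // count() comes from CrudRepository and is a cheap way to confirm the MongoDB connection.
        System.out.println("Customers already stored: " + customerRepository.count());
    }
}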

Related

Writing to multiple unrelated tables in a Spring Batch writer

Can we have a writer that writes to 2 different, unrelated tables simultaneously in Spring Batch? Along with the main data I need to store some metadata in a different table. How can I go about it?
Please find an example below. Let's say you have 3 tables to write to:
@Bean
public CompositeItemWriter<YourDTO> compositeWriter() throws Exception {
    CompositeItemWriter<YourDTO> compositeItemWriter = new CompositeItemWriter<>();
    List<ItemWriter<? super YourDTO>> writers = new ArrayList<>();
    writers.add(firstTableWriter());
    writers.add(secondTableWriter());
    writers.add(thirdTableWriter());
    compositeItemWriter.setDelegates(writers);
    return compositeItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> firstTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO FIRSTTABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new FirstTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> secondTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO SECOND TABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new SecondTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

@Bean
public JdbcBatchItemWriter<YourDTO> thirdTableWriter() {
    JdbcBatchItemWriter<YourDTO> databaseItemWriter = new JdbcBatchItemWriter<>();
    databaseItemWriter.setDataSource(dataSource);
    databaseItemWriter.setSql("INSERT INTO THIRD TABLE");
    ItemPreparedStatementSetter<YourDTO> invoicePreparedStatementSetter = new ThirdTableSetter();
    databaseItemWriter.setItemPreparedStatementSetter(invoicePreparedStatementSetter);
    return databaseItemWriter;
}

// Setter class example
public class FirstTableSetter implements ItemPreparedStatementSetter<YourDTO> {

    @Override
    public void setValues(YourDTO yourDTO, PreparedStatement preparedStatement) throws SQLException {
        preparedStatement.setString(1, yourDTO.getMyValue());
    }
}
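For completeness, here is a hedged sketch of how the composite writer could be plugged into a step. The stepBuilderFactory field, the reader() bean and the chunk size are assumptions, not part of the original answer; the point is only that the CompositeItemWriter is used like any other ItemWriter.

// Sketch only: stepBuilderFactory and reader() are assumed to exist in the same
// @Configuration class as the writer beans above; chunk size 10 is arbitrary.
@Bean
public Step writeToThreeTablesStep() throws Exception {
    return stepBuilderFactory.get("writeToThreeTablesStep")
            .<YourDTO, YourDTO>chunk(10)
            .reader(reader())                 // hypothetical reader bean
            .writer(compositeWriter())        // the CompositeItemWriter defined above
            .build();
}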

Export entities to a database schema through Java code

A long time ago, I did that with code like this:
Configuration config = new Configuration();
Properties props = new Properties();
FileInputStream fos = new FileInputStream(file_name);
props.load(fos);
fos.close();
config.setProperties(props);
config.addAnnotatedClass(...);
Connection conn = DriverManager.getConnection(url, usuario, senha);
SchemaExport schema = new SchemaExport();
schema.create(true, true);
But now, if I try to use this code, I get a compilation error. Looking at the javadoc for SchemaExport, I notice a lot of changes in the methods used in this example.
How could I do that now?
Update: based on the suggested link, I implemented the method this way:
public void criarTabelas(String server, String user, String pass) throws Exception {
    StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
            .applySetting("hibernate.hbm2ddl.auto", "create")
            .applySetting("hibernate.dialect", dialect)
            .applySetting("hibernate.id.new_generator_mappings", "true")
            .build();
    MetadataSources sources = new MetadataSources(standardRegistry);
    for (Class<?> entity : lista_entidades())
        sources.addAnnotatedClass(entity);
    MetadataImplementor metadata = (MetadataImplementor) sources.getMetadataBuilder().build();
    SchemaExport export = new SchemaExport();
    export.create(EnumSet.of(TargetType.DATABASE), metadata);
}

private List<Class<?>> lista_entidades() throws Exception {
    List<Class<?>> lista = new ArrayList<Class<?>>();
    ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
    scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
    for (BeanDefinition bd : scanner.findCandidateComponents("org.loja.model"))
        lista.add(Class.forName(bd.getBeanClassName()));
    return lista;
}
Now I need a way to establish a JDBC connection and associate it with the SchemaExport.
I solved this issue with this code:
public void criarTabelas(String server, String user, String pass) throws Exception {
    Connection conn = DriverManager.getConnection(url_prefix + server + "/" + url_suffix, user, pass);
    StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
            .applySetting("hibernate.hbm2ddl.auto", "create")
            .applySetting("hibernate.dialect", dialect)
            .applySetting("hibernate.id.new_generator_mappings", "true")
            // hands the externally opened JDBC connection to Hibernate's schema tooling
            .applySetting("javax.persistence.schema-generation-connection", conn)
            .build();
    MetadataSources sources = new MetadataSources(standardRegistry);
    for (Class<?> entity : lista_entidades())
        sources.addAnnotatedClass(entity);
    MetadataImplementor metadata = (MetadataImplementor) sources.getMetadataBuilder().build();
    SchemaExport export = new SchemaExport();
    export.create(EnumSet.of(TargetType.DATABASE), metadata);
    conn.close();
}

private List<Class<?>> lista_entidades() throws Exception {
    List<Class<?>> lista = new ArrayList<Class<?>>();
    ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
    scanner.addIncludeFilter(new AnnotationTypeFilter(Entity.class));
    for (BeanDefinition bd : scanner.findCandidateComponents("org.loja.model"))
        lista.add(Class.forName(bd.getBeanClassName()));
    System.out.println("lista: " + lista);
    return lista;
}

Java 8: generate a Map containing another Map

How do I achieve this using Java 8?
I have a CSV in the format below, and from it I want to populate a Map<String, Map<String, String>> where the outer map has the keys scriptId and transactionType (the distinct Type values), and the inner map for each key contains the value at position 2 as the key and the value at position 3 as the value, for example:
scriptId        -> {TATA=TATA Moters, REL=Reliance Industries Ltd, LNT=L&T, SBI=State Bank of India}
transactionType -> {P=B, S=S}
Content of CSV File
Type,ArcesiumValue,GICValue
scriptId,TATA,TATA Moters
scriptId,REL,Reliance Industries Ltd
scriptId,LNT,L&T
scriptId,SBI,State Bank of India
transactionType,P,B
transactionType,S,S
How do I generate this using Java 8?
public void loadReferenceData() throws IOException {
    List<Map<String, Map<String, String>>> cache = Files.lines(Paths.get("data/referenceDataMapping.csv")).skip(1)
            .map(mapRefereneData).collect(Collectors.toList());
    System.out.println(cache);
}

public static Function<String, Map<String, Map<String, String>>> mapRefereneData = (line) -> {
    String[] sp = line.split(",");
    Map<String, Map<String, String>> cache = new HashMap<String, Map<String, String>>();
    try {
        if (cache.containsKey(sp[0])) {
            cache.get(sp[0]).put(sp[1], sp[2]);
        } else {
            Map<String, String> map = new HashMap<String, String>();
            map.put(sp[1], sp[2]);
            cache.put(sp[0], map);
        }
    } catch (NumberFormatException e) {
        e.printStackTrace();
    }
    return cache;
};
Well, it is much simpler to use two Collectors:
Map<String, Map<String, String>> groupCSV = Files.lines(Paths.get("..."))
        .skip(1L).map(l -> l.split(","))
        .collect(Collectors.groupingBy(a -> a[0], Collectors.toMap(a -> a[1], a -> a[2])));
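Put together as a complete method, this is a minimal sketch assuming the same file path and column layout as in the question; the class name ReferenceDataLoader is just for illustration.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ReferenceDataLoader {

    public Map<String, Map<String, String>> loadReferenceData() throws IOException {
        // try-with-resources closes the file handle opened by Files.lines
        try (Stream<String> lines = Files.lines(Paths.get("data/referenceDataMapping.csv"))) {
            return lines.skip(1)                                      // skip the header row
                    .map(line -> line.split(","))
                    .collect(Collectors.groupingBy(
                            a -> a[0],                                // outer key: Type column
                            Collectors.toMap(a -> a[1], a -> a[2]))); // inner map: ArcesiumValue -> GICValue
        }
    }
}

With the CSV shown above this produces {scriptId={TATA=TATA Moters, REL=Reliance Industries Ltd, LNT=L&T, SBI=State Bank of India}, transactionType={P=B, S=S}} (HashMap iteration order may vary).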

Spring Batch JdbcPagingItemReader endless loop

I am facing problems with JdbcPagingItemReader. I am stuck in an endless loop. I have read that the ItemReader contract requires it to return null at the end of the data, but I can't manage to implement it correctly. Can somebody show me an example?
public List<TransactionDTO> getTransactions(Integer chunk, LocalDateTime startDate, LocalDateTime endDate)
        throws Exception {
    final TransactionMapper transactionMapper = new TransactionMapper();
    final SqlPagingQueryProviderFactoryBean sqlPagingQueryProviderFactoryBean = new SqlPagingQueryProviderFactoryBean();
    sqlPagingQueryProviderFactoryBean.setDataSource(dataSource);
    sqlPagingQueryProviderFactoryBean.setSelectClause(env.getProperty("sql.fromdates.select"));
    sqlPagingQueryProviderFactoryBean.setFromClause(env.getProperty("sql.fromdates.from"));
    sqlPagingQueryProviderFactoryBean.setWhereClause(env.getProperty("sql.fromdates.where"));
    sqlPagingQueryProviderFactoryBean.setSortKey(env.getProperty("sql.fromdates.sort"));
    final Map<String, Object> parametros = new HashMap<>();
    parametros.put("startDate", startDate);
    parametros.put("endDate", endDate);
    final JdbcPagingItemReader<TransactionDTO> itemReader = new JdbcPagingItemReader<>();
    itemReader.setDataSource(dataSource);
    itemReader.setQueryProvider(sqlPagingQueryProviderFactoryBean.getObject());
    // TODO this should be the chunk size
    itemReader.setPageSize(1);
    itemReader.setFetchSize(1);
    itemReader.setRowMapper(transactionMapper);
    itemReader.afterPropertiesSet();
    itemReader.setParameterValues(parametros);
    ExecutionContext executionContext = new ExecutionContext();
    itemReader.open(executionContext);
    List<TransactionDTO> list = new ArrayList<>();
    TransactionDTO primerDto = itemReader.read();
    while (primerDto != null) {
        list.add(itemReader.read());
    }
    itemReader.close();
    return list;
}
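Independent of the reader configuration, one thing stands out in the loop above: primerDto is read once and never reassigned, so the while condition can never become false. A minimal sketch of a read loop that terminates when the reader returns null, assuming the same reader setup as above:

// Drain the reader until it signals end-of-data by returning null.
List<TransactionDTO> list = new ArrayList<>();
TransactionDTO dto;
while ((dto = itemReader.read()) != null) {
    list.add(dto);
}
itemReader.close();
return list;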

Fetch properties from SonarQube via the SonarQube WsClient

I'd like to fetch sonar.timemachine.period1 via the WsClient.
Seeing that it doesn't offer such a method out of the box, I decided to bake one myself:
private Map<String, String> retrievePeriodProperties(final WsClient wsClient, int requestedPeriod) {
    if (requestedPeriod > 0) {
        final WsRequest propertiesWsRequestPeriod =
                new GetRequest("api/properties/sonar.timemachine.period" + requestedPeriod);
        final WsResponse propertiesWsResponsePeriod =
                wsClient.wsConnector().call(propertiesWsRequestPeriod);
        if (propertiesWsResponsePeriod.isSuccessful()) {
            String resp = propertiesWsResponsePeriod.content();
            Map<String, String> map = new HashMap<>();
            map.put(Integer.toString(requestedPeriod), resp);
            return map;
        }
    }
    return new HashMap<>();
}
but it always returns an empty Map.
Any lead on where I can go from here?
You can use org.sonar.api.config.Settings to fetch properties defined in SonarQube.
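For instance, inside a plugin extension the platform can inject Settings for you. This is a minimal sketch only: the class name is hypothetical, and it assumes the property is actually defined on the server.

import org.sonar.api.config.Settings;

// Hypothetical component: in a SonarQube plugin, Settings can be constructor-injected
// by the platform into an extension class.
public class TimeMachinePeriodReader {

    private final Settings settings;

    public TimeMachinePeriodReader(Settings settings) {
        this.settings = settings;
    }

    public String period(int requestedPeriod) {
        // Returns the configured value, or null if the property is not set.
        return settings.getString("sonar.timemachine.period" + requestedPeriod);
    }
}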
