Problem with new objects being created on each RestController request - Spring Boot

My problem is that on each POST request my RestController creates new objects, so I can never work with the state of the first one:
@RestController
public class ControllerRoom {
    private final Room cinema;

    public ControllerRoom() {
        this.cinema = new Room(9, 9);
    }
My main goal is to avoid that, and to check in the POST method whether the same JSON object has already been requested; if so, an exception should be thrown. But since the objects are created over and over, the exception can never be thrown.
@PostMapping("/purchase")
public ResponseEntity<Seats> sellSeat(@RequestBody Seats seat) {
    System.out.println("Requested body =" + seat.getRow() + " " + seat.getColumn() + " " + seat.getPrice());
    for (Seats s : cinema.getSeats()) {
        if (s.getColumn() == seat.getColumn() && s.getRow() == seat.getRow()) {
            if (!s.isTaken()) {
                System.out.println(s.setTaken(true));
                return new ResponseEntity<>(s, HttpStatus.OK);
            }
            if (s.isTaken()) {
                throw new SeatRequestException("The ticket has been already purchased!");
            }
        }
    }
    throw new SeatRequestException("The number of a row or a column is out of bounds!");
}
In case it's relevant: the Room class adds the objects to the ArrayList in this method:
@JsonGetter(value = "available_seats")
public ArrayList<Seats> getSeats() {
    for (int i = 1; i <= total_columns; i++) {
        for (int j = 1; j <= total_rows; j++) {
            if (i <= 4) {
                available_seats.add(new Seats(i, j, false, 10));
            } else {
                available_seats.add(new Seats(i, j, false, 8));
            }
        }
    }
    System.out.println(available_seats.size());
    return available_seats;
}
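Note that this getter is itself a likely culprit: the controller bean is a singleton by default, but getSeats() appends a fresh batch of Seats objects on every call, so every request sees a rebuilt list and no seat ever stays taken. A minimal sketch of one way around it (the empty-list guard is my assumption, not the asker's code):

@JsonGetter(value = "available_seats")
public ArrayList<Seats> getSeats() {
    if (available_seats.isEmpty()) { // populate only once, on first access
        for (int i = 1; i <= total_columns; i++) {
            for (int j = 1; j <= total_rows; j++) {
                available_seats.add(new Seats(i, j, false, i <= 4 ? 10 : 8));
            }
        }
    }
    return available_seats;
}

Moving the population into the Room constructor would achieve the same thing.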

Related

How to delete a large amount of data one by one from a table with their relations using the transactional annotation

I have a large amount of data that I want to purge from the database. There are about 6 tables, of which 3 have a many-to-many relationship with cascadeType. All the others are log and history tables, independent of the other 3.
I want to purge this data one by one, and if any record fails while being deleted, I need to roll back only the current record, report it in the console, and keep deleting the others.
I am trying to use the transactional annotation with Spring Boot, but all purging stops if an error occurs.
How do I manage this kind of need?
Here is what I did:
@Transactional
private void purgeCards(List<CardEntity> cardsTobePurge) {
    List<Long> nextCardsNumberToUpdate = getNextCardsWhichWillNotBePurge(cardsTobePurge);
    TransactionTemplate lTransTemplate = new TransactionTemplate(transactionManager);
    lTransTemplate.setPropagationBehavior(TransactionTemplate.PROPAGATION_REQUIRED);
    lTransTemplate.execute(new TransactionCallback<Object>() {
        @Override
        public Object doInTransaction(TransactionStatus status) {
            cardsTobePurge.forEach(cardTobePurge -> {
                Long nextCardNumberOfCurrent = cardTobePurge.getNextCard();
                if (nextCardsNumberToUpdate.contains(nextCardNumberOfCurrent)) {
                    CardEntity cardToUnlink = cardRepository.findByCardNumber(nextCardNumberOfCurrent);
                    unLink(cardToUnlink);
                }
                log.info(BATCH_TITLE + " Removing card Number : " + cardTobePurge.getCardNumber() + " with Id : "
                        + cardTobePurge.getId());
                List<CardHistoryEntity> historyEntitiesOfThisCard = cardHistoryRepository.findByCard(cardTobePurge);
                List<LogCreationCardEntity> logCreationEntitiesForThisCard = logCreationCardRepository
                        .findByCardNumber(cardTobePurge.getCardNumber());
                List<LogCustomerMergeEntity> logCustomerMergeEntitiesForThisCard = logCustomerMergeRepository
                        .findByCard(cardTobePurge);
                cardHistoryRepository.deleteAll(historyEntitiesOfThisCard);
                logCreationCardRepository.deleteAll(logCreationEntitiesForThisCard);
                logCustomerMergeRepository.deleteAll(logCustomerMergeEntitiesForThisCard);
                cardRepository.delete(cardTobePurge);
            });
            return Boolean.TRUE;
        }
    });
}
As a solution to my question:
I worked with TransactionTemplate in order to manage transactions manually, so if an exception is raised, the rollback is applied only to the current iteration and processing continues with the other cards.
private void purgeCards(List<CardEntity> cardsTobePurge) {
    int[] counter = { 0 }; // to simulate the exception
    List<Long> nextCardsNumberToUpdate = findNextCardsWhichWillNotBePurge(cardsTobePurge);
    cardsTobePurge.forEach(cardTobePurge -> {
        Long nextCardNumberOfCurrent = cardTobePurge.getNextCard();
        CardEntity cardToUnlink = null;
        counter[0]++; // to simulate the exception
        if (nextCardsNumberToUpdate.contains(nextCardNumberOfCurrent)) {
            cardToUnlink = cardRepository.findByCardNumber(nextCardNumberOfCurrent);
        }
        purgeCard(cardTobePurge, nextCardsNumberToUpdate, cardToUnlink, counter);
    });
}
private void purgeCard(@NonNull CardEntity cardToPurge, List<Long> nextCardsNumberToUpdate, CardEntity cardToUnlink,
        int[] counter) {
    TransactionTemplate lTransTemplate = new TransactionTemplate(transactionManager);
    lTransTemplate.setPropagationBehavior(TransactionTemplate.PROPAGATION_REQUIRED);
    lTransTemplate.execute(new TransactionCallbackWithoutResult() {
        @Override
        public void doInTransactionWithoutResult(TransactionStatus status) {
            try {
                if (cardToUnlink != null)
                    unLink(cardToUnlink);
                log.info(BATCH_TITLE + " Removing card Number : " + cardToPurge.getCardNumber() + " with Id : "
                        + cardToPurge.getId());
                List<CardHistoryEntity> historyEntitiesOfThisCard = cardHistoryRepository.findByCard(cardToPurge);
                List<LogCreationCardEntity> logCreationEntitiesForThisCard = logCreationCardRepository
                        .findByCardNumber(cardToPurge.getCardNumber());
                List<LogCustomerMergeEntity> logCustomerMergeEntitiesForThisCard = logCustomerMergeRepository
                        .findByCard(cardToPurge);
                cardHistoryRepository.deleteAll(historyEntitiesOfThisCard);
                logCreationCardRepository.deleteAll(logCreationEntitiesForThisCard);
                logCustomerMergeRepository.deleteAll(logCustomerMergeEntitiesForThisCard);
                cardRepository.delete(cardToPurge);
                if (counter[0] == 2) // to simulate the exception
                    throw new Exception(); // to simulate the exception
            } catch (Exception e) {
                status.setRollbackOnly();
                if (cardToPurge != null)
                    log.error(BATCH_TITLE + " Problem with card Number : " + cardToPurge.getCardNumber()
                            + " with Id : " + cardToPurge.getId(), e);
                else
                    log.error(BATCH_TITLE + " Card entity is null", e);
            }
        }
    });
}
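One detail worth noting (my observation, not part of the original answer): PROPAGATION_REQUIRED only isolates each card here because purgeCard() is invoked outside of any surrounding transaction. If the caller might already be transactional, PROPAGATION_REQUIRES_NEW makes the per-card boundary explicit:

// Each execute() then always runs in its own, fresh transaction.
lTransTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);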

Calling my Apex method in an Apex trigger: getting an error

public static void insertInboundJive(Map<Id, String> mapCases) {
    try {
        System.debug('Aditya');
        Map<Id, String> mapCases1 = new Map<Id, String>();
        Map<Id, Integer> mapIncrements = new Map<Id, Integer>();
        //List<ICS_Case_Interaction__c> lstCaseInteraction;
        if (mapCases != null && mapCases.size() > 0) {
            // case__r.Status is selected because it is read in the loop below;
            // a collection bind in SOQL needs IN rather than =
            List<ICS_Case_Interaction__c> lstCaseInteraction = [SELECT Id, case__r.origin, case__r.Status
                                                                FROM ICS_Case_Interaction__c
                                                                WHERE case__r.Id IN :mapCases.keySet()];
            for (ICS_Case_Interaction__c caseInteracts : lstCaseInteraction) {
                if (caseInteracts.case__r.Id != null && caseInteracts.case__r.Status == 'New Customer Message') {
                    System.debug('**AdityaDebug**' + caseInteracts.case__r.Id);
                    System.debug('**AdityaDebug**' + caseInteracts.case__r.Status);
                    mapCases1.put(caseInteracts.case__r.Id, TYPE_JIVE_INBOUND);
                    Integer intIncrement = mapIncrements.get(caseInteracts.case__r.Id);
                    System.debug('Increment' + intIncrement);
                    if (intIncrement != null) {
                        intIncrement++;
                        System.debug('Increment++' + intIncrement);
                    } else {
                        intIncrement = 1;
                    }
                    mapIncrements.put(caseInteracts.case__r.Id, intIncrement);
                }
            }
            if (mapCases.size() > 0) {
                insertByCaseAsync(mapCases, mapIncrements);
            }
        }
    } catch (Exception ex) {
        Core_Log_Entry.logEntryWithException('Case Interaction Metrics', 'CaseInteraction', 'insertInboundEmail', 'Error', null, null, ex);
    }
}
This is my method in the class. I am trying to call the Apex method in the trigger, but it's throwing an error. Could you please help me out?
The error I am getting is:
line 188, col 106. Method does not exist or incorrect signature: void insertInboundJive(List) from the type ICS_Case_Interactions_Trigger_Handler
if (trigger.isUpdate) {
    if (Label.ICS_Case_Interaction_Metrics.equals('1')) {
        ICS_Case_Interactions_Trigger_Handler.insertInboundJive(trigger.new);
    }
}
You are trying to pass the wrong parameters. The method is defined to take a Map whose values are Strings, but you are passing Trigger.new, which is a list of sObjects. My approach is to handle the mapping in the trigger and then manipulate the data in the handler.
In this case you can do the below to pass the records and build the string data you want in the handler, or do it in the trigger so you don't have to change the handler.
// 'map' is a reserved word in Apex, and the declared and constructed types must match
Map<Id, ICS_Case_Interaction__c> interactionMap = new Map<Id, ICS_Case_Interaction__c>();
for (ICS_Case_Interaction__c con : trigger.new) {
    interactionMap.put(con.Id, con); // enter the records you need for the method
}
if (trigger.isUpdate) {
    if (Label.ICS_Case_Interaction_Metrics.equals('1')) {
        ICS_Case_Interactions_Trigger_Handler.insertInboundJive(interactionMap);
    }
}
and in the handler you should have
public static void insertInboundJive(Map<Id, ICS_Case_Interaction__c> mapCases) {
}
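As a side note (this uses the standard Trigger context variable, not something from the original answer): in an update trigger, Trigger.newMap already holds the records keyed by Id, so the manual loop can be dropped entirely with a cast:

// Trigger.newMap is a Map<Id, sObject>; cast it to the concrete sObject type.
if (trigger.isUpdate && Label.ICS_Case_Interaction_Metrics.equals('1')) {
    ICS_Case_Interactions_Trigger_Handler.insertInboundJive(
        (Map<Id, ICS_Case_Interaction__c>) Trigger.newMap);
}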

Duplication with Chunks in Spring Batch

I have a huge file that I need to read and dump into the DB. Any invalid records (invalid length, duplicate keys, etc.), if present, need to be written to an error report. Due to the huge size of the file, we tried using chunk sizes (commit-interval) of 1000/5000/10000. In the process I found that the data was being processed redundantly due to the usage of chunks, and thus my error report is incorrect: it contains not only the actual invalid records from the input file but also duplicates caused by the chunks.
Code snippet:
@Bean
public Step readAndWriteStudentInfo() {
    return stepBuilderFactory.get("readAndWriteStudentInfo")
            .<Student, Student>chunk(5000).reader(studentFileReader()).faultTolerant()
            .skipPolicy(skipper).listener(listener).processor(new ItemProcessor<Student, Student>() {
                @Override
                public Student process(Student student) throws Exception {
                    if (processedRecords.contains(student)) {
                        return null;
                    } else {
                        processedRecords.add(student);
                        return student;
                    }
                }
            }).writer(studentDBWriter()).build();
}
@Bean
public ItemReader<Student> studentFileReader() {
    FlatFileItemReader<Student> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource(studentInfoFileName));
    reader.setLineMapper(new DefaultLineMapper<Student>() {
        {
            setLineTokenizer(new FixedLengthTokenizer() {
                {
                    setNames(classProperties50);
                    setColumns(range50);
                }
            });
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Student>() {
                {
                    setTargetType(Student.class);
                }
            });
        }
    });
    reader.setSaveState(false);
    reader.setLinesToSkip(1);
    reader.setRecordSeparatorPolicy(new TrailerSkipper());
    return reader;
}
@Bean
public ItemWriter<Student> studentDBWriter() {
    JdbcBatchItemWriter<Student> writer = new JdbcBatchItemWriter<>();
    writer.setSql(insertQuery);
    writer.setDataSource(dataSource);
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Student>());
    return writer;
}
I've tried various chunk sizes: 10, 100, 1000, 5000. The accuracy of my error report deteriorates as the chunk size increases. Writing to the error report happens in my implementation of the skip policy; do let me know if that code is required as well.
How do I ensure that my writer picks up a unique set of records in each chunk?
Skipper implementation:
@Override
public boolean shouldSkip(Throwable t, int skipCount) throws SkipLimitExceededException {
    String exception = t.getClass().getSimpleName();
    if (t instanceof FileNotFoundException) {
        return false;
    }
    switch (exception) {
    case "FlatFileParseException":
        FlatFileParseException ffpe = (FlatFileParseException) t;
        String errorMessage = "Line no = " + ffpe.getLineNumber() + " " + ffpe.getMessage() + " Record is ["
                + ffpe.getInput() + "].\n";
        writeToRecon(errorMessage);
        return true;
    case "SQLException":
        SQLException sE = (SQLException) t;
        String sqlErrorMessage = sE.getErrorCode() + " Record is [" + sE.getCause() + "].\n";
        writeToRecon(sqlErrorMessage);
        return true;
    case "BatchUpdateException":
        BatchUpdateException batchUpdateException = (BatchUpdateException) t;
        String btchUpdtExceptionMsg = batchUpdateException.getMessage() + " " + batchUpdateException.getCause();
        writeToRecon(btchUpdtExceptionMsg);
        return true;
    default:
        return false;
    }
}
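For what it's worth (this is general Spring Batch fault-tolerance behaviour, not something stated in the question): when the writer throws a skippable exception, the chunk is rolled back and re-processed item by item to locate the offending record, so a stateful processor like the one above sees the same items again and the skip policy can fire repeatedly. Declaring the processor non-transactional tells Spring Batch to cache and reuse the processor's output instead of re-running it; a minimal sketch (dedupeProcessor() is a hypothetical name for the processor shown above):

@Bean
public Step readAndWriteStudentInfo() {
    return stepBuilderFactory.get("readAndWriteStudentInfo")
            .<Student, Student>chunk(5000)
            .reader(studentFileReader())
            .faultTolerant()
            .skipPolicy(skipper)
            .processorNonTransactional() // cache processor output; do not re-run it during scanning
            .listener(listener)
            .processor(dedupeProcessor())
            .writer(studentDBWriter())
            .build();
}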

Spring Boot send only changed data

I am building a game with Spring Boot on the server and classic JavaScript on the frontend.
Right now I have this:
...
@Autowired
private SimpMessagingTemplate template;
...
@Scheduled(fixedRate = 1000 / Constants.FPS)
public void renderClients() {
    for (Game g : games) {
        template.convertAndSend("/game/render/" + g.getId(), g);
    }
}
...
Basically I have multiple Games running, and I send each one, keyed by its id, to the client.
However, the data I am sending (or most of it) is static (not changing).
What if I want to send not the whole data but only the parts that have changed?
Btw the response JSON looks like this:
{"id":"862b1dd8-48d5-4562-802a-7d669a5a5ed5","players":[{"id":"da8dcbec-7028-4a39-9547-a4e2dc321c3c","name":"John Doe","position":{"x":100.0,"y":100.0},"rotation":0.0,"hero":{"maxHealth":1300.0,"movementSpeed":4.5,"attackDamage":32.75,"width":68,"height":71,"heroName":"drowRanger","radius":34.0},"stats":{"kills":0,"lastHits":0},"lastClick":null}],"duration":380107.12}
and the only thing that changes is duration, and sometimes the x and y when the player moves.
Is it even possible?
Could I write some middleware that does this at the time the objects are converted to JSON?
Maintain a data structure that stores your changed values, and attach it to your Game object.
When it's time to send, convert that map to JSON and clear it.
This way may use more memory than before, but it won't cost much time.
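A minimal sketch of that idea (the field and method names are made up for illustration):

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical: setters record their change in a dirty map; only that map is sent.
public class Game {
    private double duration;
    private final Map<String, Object> dirty = new ConcurrentHashMap<>();

    public void setDuration(double duration) {
        this.duration = duration;
        dirty.put("duration", duration); // record the change
    }

    public Map<String, Object> drainChanges() {
        Map<String, Object> snapshot = new HashMap<>(dirty);
        dirty.clear(); // start collecting the next tick's changes
        return snapshot; // serialize and send this instead of the whole Game
    }
}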
I DID IT!!
In my GameController I do:
@Scheduled(fixedRate = 1000 / Constants.FPS)
public void renderClients() throws Exception {
    for (Game g : games) {
        template.convertAndSend("/game/render/" + g.getId(), g.formatToSend());
    }
}
Notice the g.formatToSend() method
Here is what the Game class looks like:
public class Game {
    private BandWidthOptimizer optimizer = new BandWidthOptimizer();
    ...
    ...
    public String formatToSend() throws Exception {
        return optimizer.optimize(this);
    }
}
And Here Comes THE BandWidthOptimizer:
package com.iddqd.doto.optimization;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import java.util.Set;
public class BandWidthOptimizer {
    // these three fields were implied by the original snippet; declared here so it compiles
    private final ObjectMapper mapper = new ObjectMapper();
    private final JSONParser parser = new JSONParser();
    private String lastJSON = "{}";
    private String[] preserveKeys;
    public BandWidthOptimizer() {
        this.preserveKeys = new String[0];
    }
    public BandWidthOptimizer(String[] preserveKeys) {
        this.preserveKeys = preserveKeys;
    }
    public String optimize(Object obj) throws Exception {
        String json = mapper.writeValueAsString(obj);
        Object nobj = parser.parse(json);
        Object oobj = parser.parse(lastJSON);
        JSONObject newJsonObj = (JSONObject) nobj;
        JSONObject oldJsonObj = (JSONObject) oobj;
        JSONObject res = getJSONObjectDiff(newJsonObj, oldJsonObj);
        lastJSON = json;
        return res.toJSONString();
    }
    private JSONObject getJSONObjectDiff(JSONObject obj1, JSONObject obj2) {
        JSONObject res = new JSONObject();
        Set set = obj1.keySet();
        for (Object key : set) {
            // If the key doesn't exist in the old object, put it in the diff
            if (!obj2.containsKey(key)) {
                res.put(key, obj1.get(key));
            } else {
                // Get the values from both objects
                Object val1 = obj1.get(key);
                Object val2 = obj2.get(key);
                if (val1 == null) {
                    continue;
                }
                if (val2 == null) {
                    res.put(key, val1);
                    continue;
                }
                // If their instances are of the same type
                if (val1.getClass().equals(val2.getClass())) {
                    // If they are JSONObjects
                    if (val1 instanceof JSONObject) {
                        // Recursively diff the JSONObject with all of its properties
                        JSONObject nested = getJSONObjectDiff((JSONObject) obj1.get(key), (JSONObject) obj2.get(key));
                        // If it contains any keys
                        if (nested.keySet().size() > 0) {
                            // Store the diff in the final diff
                            res.put(key, nested);
                        }
                    // If they are JSONArrays
                    } else if (val1 instanceof JSONArray) {
                        // If val1 contains some values (is not empty)
                        if (((JSONArray) val1).size() > 0) {
                            // Get their diff
                            JSONArray arr = getJSONArrayDiff((JSONArray) val1, (JSONArray) val2);
                            // If the array is not empty
                            if (arr.size() > 0) {
                                // put it into the diff
                                res.put(key, arr);
                            }
                        }
                    // If they are just plain values
                    } else {
                        // Compare them - if they're not equal
                        if (!val1.equals(val2)) {
                            // put val1 into the diff
                            res.put(key, val1);
                        }
                    }
                } else {
                    res.put(key, val1);
                }
            }
        }
        return res;
    }
    private JSONArray getJSONArrayDiff(JSONArray arr1, JSONArray arr2) {
        JSONArray res = new JSONArray();
        // For every element
        for (int i = 0; i < arr1.size(); i++) {
            Object val1 = arr1.get(i);
            // If i is out of arr2's bounds, put the arr1 item into the diff
            // (>= with continue, otherwise arr2.get(i) below throws IndexOutOfBoundsException)
            if (i >= arr2.size()) {
                res.add(val1);
                continue;
            }
            Object val2 = arr2.get(i);
            if (val1 == null) {
                continue;
            }
            if (val2 == null) {
                res.add(val1);
                continue;
            }
            // If their types are equal
            if (val1.getClass().equals(val2.getClass())) {
                // If they are JSONObjects
                if (val1 instanceof JSONObject) {
                    // Get their diff
                    JSONObject obj = getJSONObjectDiff((JSONObject) val1, (JSONObject) val2);
                    // If it contains any keys
                    if (obj.keySet().size() > 0) {
                        // Store the diff in the final diff
                        res.add(obj);
                    }
                // If they are JSONArrays
                } else if (val1 instanceof JSONArray) {
                    // Get their diff
                    JSONArray arr = getJSONArrayDiff((JSONArray) val1, (JSONArray) val2);
                    // If the array is not empty
                    if (arr.size() > 0) {
                        // put it into the diff
                        res.add(arr);
                    }
                // If they are just plain values
                } else {
                    // Compare them by value (the original used !=, which compares references)
                    if (!val1.equals(val2)) {
                        // add val1 to the diff
                        res.add(val1);
                    }
                }
            } else {
                res.add(val1);
            }
        }
        return res;
    }
}
This is it. Now if nothing moves on the map, the resulting JSON looks like this:
{"duration":282964.56}
because only the duration changes.
But when my player moves on the map, see what happens:
{"duration":386676.06,"players":[{"position":{"x":556.5914801003707,"y":153.55964799554002}}]}
TODO
I still have to implement the preserveKeys functionality, because I always want to send some keys, like id and so on...
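A possible shape for that TODO (purely my own sketch, not implemented in the post): after computing the diff, copy the preserved keys from the fresh JSON into it unconditionally:

    // Hypothetical: always carry the configured keys (e.g. "id") into the diff.
    private void applyPreservedKeys(JSONObject fresh, JSONObject diff) {
        for (String key : preserveKeys) {
            if (fresh.containsKey(key)) {
                diff.put(key, fresh.get(key));
            }
        }
    }

Calling this at the end of optimize(), before toJSONString(), would keep id in every payload.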

BsonClassMapSerializer already registered for AbstractClassSerializer

I'm using the Mongo C# driver 2.0 and am running into BsonSerializer registration issues when registering AbstractClassSerializers for my Id value objects.
MongoDB.Bson.BsonSerializationException: There is already a serializer registered for type HistoricalEventId.
When I peek into the BsonSerializer I'm seeing that a BsonClassMapSerializer is already registered for my type.
I'm assuming that a BsonClassMapSerializer is being created for my entity types, and that it also creates a BsonClassMapSerializer for the Id field. Has anyone run into this before? The BSON serializer code is shown below, in case it helps.
Sorry if the formatting is wrong; the C# doesn't seem to be showing up well.
HistoricalEventIdBsonSerializer
public class HistoricalEventIdBsonSerializer : ToObjectIdBsonSerializer<HistoricalEventId>
{
    public override HistoricalEventId CreateObjectFromObjectId(ObjectId serializedObj)
    {
        HistoricalEventId parsedObj;
        HistoricalEventId.TryParse(serializedObj, out parsedObj);
        return parsedObj;
    }
}
ToObjectIdBsonSerializer
public abstract class ToObjectIdBsonSerializer<T> : AbstractClassSerializer<T> where T : class
{
    private static readonly Type _convertibleType = typeof(IConvertible<ObjectId>);

    public abstract T CreateObjectFromObjectId(ObjectId serializedObj);

    public override T Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
    {
        var bsonType = context.Reader.GetCurrentBsonType();
        ObjectId value;
        switch (bsonType)
        {
            case BsonType.Undefined:
                value = ObjectId.Empty;
                context.Reader.ReadUndefined();
                break;
            case BsonType.Null:
                value = ObjectId.Empty;
                context.Reader.ReadNull();
                break;
            case BsonType.ObjectId:
                value = context.Reader.ReadObjectId();
                break;
            case BsonType.String:
                value = new ObjectId(context.Reader.ReadString());
                break;
            default:
                throw new NotSupportedException("Unable to create the type " +
                    args.NominalType.Name + " from the bson type " + bsonType + ".");
        }
        return this.CreateObjectFromObjectId(value);
    }

    public override void Serialize(BsonSerializationContext context, BsonSerializationArgs args, T value)
    {
        if (value == null)
        {
            context.Writer.WriteObjectId(ObjectId.Empty);
        }
        else
        {
            if (!_convertibleType.IsAssignableFrom(args.NominalType))
            {
                throw new NotSupportedException("The type " + args.NominalType.Name +
                    " must implement the " + _convertibleType.Name + " interface.");
            }
            var typedObj = (IConvertible<ObjectId>)value;
            context.Writer.WriteObjectId(typedObj.ToValueType());
        }
    }
}
IConvertible
public interface IConvertible<out T>
{
    T ToValueType();
}
My assumption must have been correct, because I fixed this by doing the BsonSerializer registration before creating the MongoClient and getting the database. Hopefully this will help someone else.
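A minimal sketch of that ordering (the connection string and database name are placeholders):

// Register custom Id serializers before the client is built and any class maps are created.
BsonSerializer.RegisterSerializer(new HistoricalEventIdBsonSerializer());

var client = new MongoClient("mongodb://localhost:27017");
var database = client.GetDatabase("events");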
