Hibernate Delete object by id - performance

Which is the best method (performance-wise) to delete an object when only its id is available?
HQL. Will executing this HQL load the SessionContext object into the Hibernate persistence context?
for (int i = 0; i < listOfIds.size(); i++) {
    Query q = createQuery("delete from session_context where id = :id");
    q.setLong("id", listOfIds.get(i));
    q.executeUpdate();
}
Load by ID and delete.
for (int i = 0; i < listOfIds.size(); i++) {
    SessionContext session_context = (SessionContext) getHibernateTemplate()
            .load(SessionContext.class, listOfIds.get(i));
    getHibernateTemplate().delete(session_context);
}
Here SessionContext is the object mapped to session_context table.
Or, of course, is there an altogether different and better approach?

Out of the two, the first one is better, since it saves memory: a bulk HQL delete does not load anything into the persistence context. When you want to delete an entity and you already have its id, writing an HQL delete is preferred.
In your case there is a third and even better option.
Try the below:
// Construct the comma-separated list of ids (e.g. with a StringBuilder) by iterating the List.
String idList = "1,2,3,.....";
// The ids are concatenated straight into the HQL; binding the whole string with
// setString() would treat "1,2,3" as a single value rather than a list of ids.
Query q = createQuery("delete from session_context where id in (" + idList + ")");
q.executeUpdate();
Now if there are 4 items in the list, only one query is fired; previously there would have been 4.
Note: for the above to work, session_context should be an independent table.
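The elided id string above would be built by iterating the list, for example (a small sketch, assuming listOfIds is the list of ids from the question):
StringBuilder sb = new StringBuilder();
for (int i = 0; i < listOfIds.size(); i++) {
    if (i > 0) {
        sb.append(',');
    }
    sb.append(listOfIds.get(i));
}
String idList = sb.toString(); // e.g. "1,2,3"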

Btw, say no to that ugly string, there is .setParameterList(), so:
List<Long> idList = Arrays.asList(1L, 2L, 3L);
Query q = createQuery("delete from session_context where id in (:idList) ");
q.setParameterList("idList", idList);
q.executeUpdate();
Update
I must add an update on this: in our environment it turned out, in the end, that using setParameterList gives much worse performance than building the id string manually, as @ManuPK suggested.

You should also consider caching: the first-level (session) cache and the second-level cache.
The first option is probably the best if the delete is the only, or the first, operation in the transaction.
If you query for some SessionContext objects and then run the HQL delete, all objects in the query cache will be evicted, because Hibernate doesn't know which ones were deleted. This is not the case with the second approach.
If you use the second-level cache it is even more complicated, and it depends heavily on what you do with the SessionContext objects.
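To make the persistence-context point concrete, here is a small sketch (not from the answer above; session and idList are assumed to be in scope): a bulk HQL delete runs straight against the database and does not touch entities already loaded into the session, so stale instances should be cleared or evicted by hand.
// Bulk delete: executed directly in the database; HQL references the mapped entity.
session.createQuery("delete from SessionContext sc where sc.id in (:ids)")
       .setParameterList("ids", idList)
       .executeUpdate();

// Any SessionContext instances already sitting in the first-level cache are now stale;
// clear the session (or evict the individual instances) before reusing it.
session.clear();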

Insert a list of objects into my room database at once, and the order of objects is changed

I tried to insert a list of objects from my legacy LitePal database into my Room database, and when I retrieved them from the Room database I found that the order of my objects is no longer the same as that of my old list.
Below is my code:
// LitePal is my legacy database
val litepalNoteList = LitePal.findAll(Note::class.java)
if (litepalNoteList.isNotEmpty()) {
    litepalNoteList.forEach { note ->
        // Before insertion, I want to turn my legacy Note objects into TrainingLog objects.
        // Note and TrainingLog are of different types, but their content should be the same.
        val noteContent = note.note_content
        val htmlContent = note.html_note_content
        val createdDate = note.created_date
        val isMarked = note.isLevelUp
        val legacyLog = TrainingLog(
            noteContent = noteContent,
            htmlLogContent = htmlContent,
            createdDate = createdDate,
            isMarked = isMarked)
        logViewModel.viewModelScope.launch(Dispatchers.IO) {
            trainingLogDao.insertNewTrainingLog(legacyLog)
        }
    } // end of forEach
}
The problem is that in my Room database, the order of the TrainingLog objects differs randomly from that of my old list in the LitePal database.
Anyone know why this is happening?
If the order matters, then you should extract the data using an ORDER BY clause. Otherwise you are leaving the order up to the query optimiser.
So instead of @Query("SELECT * FROM trainingLog"), you could order the result by using @Query("SELECT * FROM trainingLog ORDER BY createdDate ASC").
The efficiency of extracting the above would be improved by having an index on the createdDate column/field (in Room, @ColumnInfo(index = true)). However, it should be noted that there are overheads to having an index: insertions, deletions and updates may incur additional processing to maintain the index, and an index uses more space.
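A minimal sketch of those two suggestions (the entity shape, the id primary key, the Long type for createdDate and the DAO method name are assumptions here, not from the question):
import androidx.room.ColumnInfo
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.Query

@Entity
data class TrainingLog(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,
    val noteContent: String,
    val htmlLogContent: String,
    // Indexed so that ORDER BY createdDate does not need a full scan + sort.
    @ColumnInfo(index = true) val createdDate: Long,
    val isMarked: Boolean
)

@Dao
interface TrainingLogDao {
    // Explicit ordering instead of leaving it to the query optimiser.
    @Query("SELECT * FROM trainingLog ORDER BY createdDate ASC")
    fun getAllOrderedByCreatedDate(): List<TrainingLog>
}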
You may wish to have an insert function that can take a list rather than run multiple threaded inserts. Room will then (I believe) do all the inserts in a single transaction (1 disk write instead of many).
e.g.
instead of or as well as
@Insert
fun insert(trainingLog: TrainingLog): Long
you could have
@Insert
fun insert(trainingLogList: List<TrainingLog>): LongArray
Then all you need to do is build the List in your loop and then after the loop invoke the single insert.
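For example, a sketch of the migration loop using the list-taking insert above (reusing litepalNoteList, logViewModel and trainingLogDao from the question):
val logsToInsert = mutableListOf<TrainingLog>()
litepalNoteList.forEach { note ->
    logsToInsert += TrainingLog(
        noteContent = note.note_content,
        htmlLogContent = note.html_note_content,
        createdDate = note.created_date,
        isMarked = note.isLevelUp
    )
}
// One call, one transaction, and the insertion order matches the list order.
logViewModel.viewModelScope.launch(Dispatchers.IO) {
    trainingLogDao.insert(logsToInsert)
}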

Getting max value on server (Entity Framework)

I'm using EF Core but I'm not really an expert with it, especially when it comes to details like querying tables in a performant manner...
So what I try to do is simply get the max-value of one column from a table with filtered data.
What I have so far is this:
protected override void ReadExistingDBEntry()
{
    using Model.ResultContext db = new();
    // Filter table data to the rows relevant to us. The whole table may contain 0 rows or millions of them.
    IQueryable<Measurement> dbMeasuringsExisting = db.Measurements
        .Where(meas => meas.MeasuringInstanceGuid == Globals.MeasProgInstance.Guid
                    && meas.MachineId == DBMatchingItem.Id);
    if (dbMeasuringsExisting.Any())
    {
        // The max value we're interested in. dbMeasuringsExisting could still contain millions of rows.
        iMaxMessID = dbMeasuringsExisting.Max(meas => meas.MessID);
    }
}
The equivalent SQL to what I want would be something like this.
select max(MessID)
from Measurement
where MeasuringInstanceGuid = Globals.MeasProgInstance.Guid
and MachineId = DBMatchingItem.Id;
While the above code works (it returns the correct value), I think it has a performance issue as the database table gets larger, because the max filtering is done client-side after all rows are transferred. Or am I wrong here?
How can I do it better? I want the database server to filter my data. Of course I don't want to write any raw SQL ;-)
This can be addressed by typing the return as nullable, so that you do not get an exception on an empty set, and then applying a default value for the int. Alternatively, you can just assign it to a nullable int. Note the assumption here that the ID has an integer type; the same principle would apply to a Guid as well.
int MaxMessID = dbMeasuringsExisting.Max(p => (int?)p.MessID) ?? 0;
There is no need for the Any() statement, as that causes an additional round trip to the database, which is not desirable in this case. Since dbMeasuringsExisting is still an IQueryable, the Where filter and the Max aggregation are both translated to SQL and run on the server; only the resulting scalar value is transferred back.
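Putting the pieces together, a sketch of the whole method without the extra Any() round trip (based on the code from the question; no new API is involved):
protected override void ReadExistingDBEntry()
{
    using Model.ResultContext db = new();

    // Filtering and MAX() both run on the database server;
    // only a single scalar value comes back to the client.
    iMaxMessID = db.Measurements
        .Where(meas => meas.MeasuringInstanceGuid == Globals.MeasProgInstance.Guid
                    && meas.MachineId == DBMatchingItem.Id)
        .Max(meas => (int?)meas.MessID) ?? 0;
}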

dart - Sort a list of futures

So I have a set of options that each contain an int value representing their ordinal.
These options are stored in a remote database with each option being a record.
As such, when I fetch them from the db I end up with a list of futures:
e.g. List<Future<Option>>
I need to be able to sort these Options.
The following dart pad shows a simplified view of what I'm trying to achieve:
https://dartpad.dartlang.org/a5175401516dbb9242a0edec4c89fef6
The Options MUST be futures.
My original solution was to copy the Options into a list, 'complete' them, then sort the list.
This however caused other problems, and as such I need to do an 'in situ' sort on the original list.
You cannot sort the futures before they have completed, and even then, you need to extract the values first.
If you need to have a list of futures afterwards, this is what I would do:
import 'dart:async';

List<Future<T>> sortFutures<T>(List<Future<T>> input, [int compare(T a, T b)]) {
  var completers = [for (var i = 0; i < input.length; i++) Completer<T>()];
  Future.wait(input).then((values) {
    values.sort(compare);
    for (int i = 0; i < values.length; i++) {
      completers[i].complete(values[i]);
    }
  });
  return [for (var c in completers) c.future];
}
This does not return the original futures because you don't know the ordering at the time you have to return them. It does return futures which complete with the same value.
If any of the futures completes with an error, then this blows up. You'll need more error handling if that is possible.
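A usage sketch (fetchOption and the ordinal field are placeholders standing in for your own types):
// Assume fetchOption(id) returns a Future<Option> from the database
// and Option has an int `ordinal` field, as described in the question.
List<Future<Option>> pending = [for (var id in ids) fetchOption(id)];

// A new list of futures that complete in ordinal order.
List<Future<Option>> sorted =
    sortFutures(pending, (a, b) => a.ordinal.compareTo(b.ordinal));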
Gentlefolk,
thanks for the help.
julemand101's suggestion of using Future.wait() led me to the answer.
It also helped me better understand the problem.
I've done a new gist that more accurately shows what I was attempting to do.
Essentially, when we do a db request over the network we get an entity back.
The problem is that the entity will often have references to other entities.
This can end in a whole tree of entities needing to be returned.
Often you don't need any of these entities.
So the solution we went for is to only return the database 'id' of each child entity (only the immediate children).
We then store those id's in a class RefId (see below).
The RefId is essentially a future that has the entities id and knows how to fetch the entity from the db.
When we actually need to access a child entity we force the RefId to complete (i.e. retrieve the entity across the network boundary).
We have a whole caching scheme to keep this performant as well as the ability to force the fetching of child elements, as part of the parent request, where we know up front they will be needed.
The options in my example are essentially menu items that need to be sorted.
But of course I can't sort them until they have been retrieved.
So a re-written example and answer:
https://dartpad.dartlang.org/369e71bb173ba3c19d28f6d6fec2072a
Here is the actual IdRef class we use:
https://dartpad.dartlang.org/ba892873a94d9f6f3924436e9fcd1b42
It now has a static resolveList method to help with this type of problem.
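A heavily simplified, hypothetical sketch of the idea (the real IdRef class linked above has caching and error handling that this leaves out):
class RefId<E> {
  final int id;
  final Future<E> Function(int id) _fetch; // knows how to load the entity by id

  RefId(this.id, this._fetch);

  // Force the reference to complete, i.e. fetch the entity across the network.
  Future<E> resolve() => _fetch(id);

  // Resolve a whole list at once so the caller can sort the concrete entities.
  static Future<List<T>> resolveList<T>(List<RefId<T>> refs) =>
      Future.wait(refs.map((r) => r.resolve()));
}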
Thanks for your assistance.

How to get a SOQL query out of a for loop

Code added.
I have searched endlessly for a solution here and cannot find one, please help!
I have three objects (A, B, and C). A has a lookup to B, and B is the master to C (detail). Both A and C have many records related to each B record.
I want to have a job run that gets a subset of records from object C (it will usually be around 5,000 records). Then go through each of those and get the records on Object A that lookup to the same Object B record, summarize an Object A number field, and put that on the C record.
I have successfully gotten this to work at small scale, <100 Object C records. But each Object C record requires a new SOQL query, since I am iterating through them in a for loop after I get all the Object C records. Plus I know it is not best practice to ever have a query in a loop.
How can I get this to work? Since the records share the relationship with Object B, is there another way to get the data from the Object A records that match? Or is there some way to pull two lists, one of Object C and one of Object A, then summarize the Object A records and line the lists up somehow?
Thanks in advance!
Code:
public class nightlyJob {
    public static void updateNumbers() {
        Integer I = 29;
        List<ObjectC__c> CUpdateList = new List<ObjectC__c>();
        List<ObjectC__c> CpullList =
            [SELECT Id, Index__c, ObjectB__r.Id
             FROM ObjectC__c
             WHERE Index__c = :I];

        for (ObjectC__c s : CpullList) {
            List<ObjectA__c> AList =
                [SELECT ObjectB__c, Number__c
                 FROM ObjectA__c
                 WHERE ObjectB__c = :s.ObjectB__r.Id];

            Decimal NumSum = 0;
            for (ObjectA__c a : AList) {
                NumSum = a.Number__c + NumSum;
            }
            s.Num__c = NumSum;
            CUpdateList.add(s);
        }
        update CUpdateList;
    }
}
It looks like you are really missing several fundamental concepts at the moment.
The biggest problem you are up against in SFDC development is that "database" operations are very expensive and are strictly limited. It's not just a matter of "best practice": if in a single transaction you exceed these limits -- number of SOQL calls, number of records returned, number of records updated, number of DML statements, etc. -- your transaction will fail. For details, search online for "Salesforce Execution Governors and Limits".
You can write code that works within these limitations, but there is a bit of a learning curve.
First, learn to use collections with SOQL queries to get your SOQL queries out of loops. This is known as "bulkification" and it is fundamental to SFDC development:
List<ObjectC__c> CpullList =
    [SELECT Id, Index__c, ObjectB__r.Id
     FROM ObjectC__c
     WHERE Index__c = :I];

// Create a map with the results of this query.
// key = ObjectC__c.Id, value = ObjectC__c record
Map<Id, ObjectC__c> objCmap = new Map<Id, ObjectC__c>(CpullList);

// Build a set of all the Object B ids from this result set
Set<Id> objBids = new Set<Id>();
for (ObjectC__c record : CpullList) {
    objBids.add(record.ObjectB__r.Id);
}

// Now you can use only one SOQL query instead of a query per loop iteration
List<ObjectA__c> AList = [SELECT ObjectB__c, Number__c
                          FROM ObjectA__c
                          WHERE ObjectB__c IN :objBids];
Next, use "SOQL aggregate functions" whenever you can. Example: in your code here, you could use "SUM()" and "group by" instead of performing these calculations with loops:
// Get the sum of ObjectA__c.Number__c for each Object B in objBids
AggregateResult[] groupedResults = [SELECT ObjectB__c,
                                           SUM(Number__c) sumA
                                    FROM ObjectA__c
                                    WHERE ObjectB__c IN :objBids
                                    GROUP BY ObjectB__c];
for (AggregateResult ar : groupedResults) {
    System.debug('Object B Id: ' + ar.get('ObjectB__c'));
    System.debug('Sum of ObjectA__c.Number__c: ' + ar.get('sumA'));
    // Here, you might want to build a Map<Id, Decimal> sumAmap:
    //   key = Object B Id, value = sumA
    // and then use it along with objCmap to build a collection of Object C's
    // for your update statement (see the sketch below)...
}
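Continuing that comment, here is a sketch (not part of the original answer) of joining the aggregate results back onto the Object C records, reusing objCmap and groupedResults from above:
// key = Object B Id, value = summed ObjectA__c.Number__c
Map<Id, Decimal> sumAmap = new Map<Id, Decimal>();
for (AggregateResult ar : groupedResults) {
    sumAmap.put((Id) ar.get('ObjectB__c'), (Decimal) ar.get('sumA'));
}

List<ObjectC__c> CUpdateList = new List<ObjectC__c>();
for (ObjectC__c c : objCmap.values()) {
    Decimal numSum = sumAmap.get(c.ObjectB__r.Id);
    c.Num__c = (numSum == null) ? 0 : numSum;
    CUpdateList.add(c);
}
update CUpdateList; // a single DML statement for the whole set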
You can continue this process and apply these ideas to make the code more efficient.
But even after you have your methods working as efficiently as possible, you may still run into limits due to the number of records you're dealing with. At that point, you will need to learn about the Batchable interface, the Queueable interface and @future calls (ways to process a larger number of records split across transactions). That's really too much to cover in a single SO answer, but a bare-bones Batchable skeleton is sketched below.
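For reference only, a Batchable skeleton might look like this (the class name and the hard-coded query are placeholders; the real summarisation logic belongs in execute):
global class NightlySummaryBatch implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can feed up to 50 million records into the batch job.
        return Database.getQueryLocator(
            'SELECT Id, Index__c, ObjectB__c FROM ObjectC__c WHERE Index__c = 29');
    }

    global void execute(Database.BatchableContext bc, List<ObjectC__c> scope) {
        // Each chunk (200 records by default) runs with its own governor limits;
        // apply the bulkified summarisation logic from above to "scope" here.
    }

    global void finish(Database.BatchableContext bc) {
        // Optional post-processing / notifications.
    }
}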

Create a linq subquery returns error "Local sequence cannot be used in LINQ to SQL implementations of query operators except the Contains operator"

I have created a LINQ query that returns my required data. I now have a new requirement and need to add an extra field to the returned results. My entity contains an ID field that I am trying to map against another table, without too much luck.
This is what I have so far.
Dictionary<int, string> itemDescriptions = new Dictionary<int, string>();
foreach (var item in ItemDetails)
{
    itemDescriptions.Add(item.ItemID, item.ItemDescription);
}

DB.TestDatabase db = new DB.TestDatabase(Common.GetOSConnectionString());

List<Transaction> transactionDetails = (from t in db.Transactions
                                        where t.CardID == CardID.ToString()
                                        select new Transaction
                                        {
                                            ItemTypeID = t.ItemTypeID,
                                            TransactionAmount = t.TransactionAmount,
                                            ItemDescription = itemDescriptions.Select(r => r.Key == itemTypeID).ToString()
                                        }).ToList();
What I am trying to do is get the value from the dictionary where the key equals ItemTypeID.
I am getting this error.
Local sequence cannot be used in LINQ to SQL implementations of query operators except the Contains operator.
What do I need to modify?
This is a duplicate of this question. The problem you're having is because you're trying to match an in-memory collection (itemDescriptions) with a DB table. Because of the way LINQ2SQL works it's trying to do this in the DB which is not possible.
There are essentially three options (unless I'm missing something):
1) Refactor your query so you pass a simple primitive value to the query that can be passed across to the DB (only good if itemDescriptions is a small set).
2) In your query use:
from t in db.Transactions.ToList()
...
3) Get back the objects you need as you're doing, then populate ItemDescription in a second step.
Bear in mind that the second option will force LINQ to evaluate the query and return all transactions to your code that will then be operated on in memory. If the transaction table is large this will not be quick!
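For example, option 3 could look roughly like this (a sketch reusing the names from the question; the exact projection depends on how Transaction is mapped):
// Step 1: run the query in the database without referencing the local dictionary.
List<Transaction> transactionDetails = (from t in db.Transactions
                                        where t.CardID == CardID.ToString()
                                        select new Transaction
                                        {
                                            ItemTypeID = t.ItemTypeID,
                                            TransactionAmount = t.TransactionAmount
                                        }).ToList();

// Step 2: fill in the description from the in-memory dictionary.
foreach (var transaction in transactionDetails)
{
    if (itemDescriptions.TryGetValue(transaction.ItemTypeID, out string description))
    {
        transaction.ItemDescription = description;
    }
}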
