I'm trying to get a query returned ordered on a field which is calculated in Play.
This is the query I'm using.
return all().order("points").fetch();
where points is defined as
public Integer points;
and is retrieved thanks to this getter:
public int getPoints() {
    List<EventVote> votesP = votes.filter("isPositive", true).fetch();
    List<EventVote> votesN = votes.filter("isPositive", false).fetch();
    this.points = votesP.size() - votesN.size();
    return this.points;
}
The getter is correctly called when I do
int votes = objectWithPoints.points;
I have the feeling I'm expecting a bit too much from Siena, but I would love this to work (or some similar code). Currently it just skips the order condition. Ordering on any other field works correctly.
I think you're right when you say you expect a bit too much :)
The Siena query all().order("points").fetch() performs a request to the DB.
So it will order by the values stored in the DB, not the values computed in your program.
From what you say, I see that you have a getter getPoints which computes a value.
But if you don't store this value in the database, Siena can't perform the ordering.
So either you compute the value, set it on your object, and save the object to the DB:
objectWithPoints.points = objectWithPoints.getPoints();
objectWithPoints.save();
Or you order the values yourself in your program after computing them.
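As a rough, untested sketch of both options, assuming a hypothetical model class Event that owns the points field and the getPoints() getter from the question, exposes a static all() query, and has the usual java.util imports available:

// Option 1: persist the computed value, then the datastore can order on it
for (Event e : Event.all().fetch()) {
    e.points = e.getPoints();   // getPoints() recomputes and assigns this.points
    e.save();
}
List<Event> ordered = Event.all().order("points").fetch();

// Option 2: fetch first, then sort in memory on the computed value
List<Event> events = Event.all().fetch();
Collections.sort(events, new Comparator<Event>() {
    public int compare(Event a, Event b) {
        return a.getPoints() - b.getPoints();
    }
});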
I'm using EF Core but I'm not really an expert with it, especially when it comes to details like querying tables in a performant manner...
So what I'm trying to do is simply get the max value of one column from a table with filtered data.
What I have so far is this:
protected override void ReadExistingDBEntry()
{
    using Model.ResultContext db = new();

    // Filter the table data to the rows relevant to us.
    // The whole table may contain 0 rows or millions of them.
    IQueryable<Measurement> dbMeasuringsExisting = db.Measurements
        .Where(meas => meas.MeasuringInstanceGuid == Globals.MeasProgInstance.Guid
                    && meas.MachineId == DBMatchingItem.Id);

    if (dbMeasuringsExisting.Any())
    {
        // The max value we're interested in. dbMeasuringsExisting could still contain millions of rows.
        iMaxMessID = dbMeasuringsExisting.Max(meas => meas.MessID);
    }
}
The SQL equivalent of what I want would be something like this:
select max(MessID)
from Measurement
where MeasuringInstanceGuid = Globals.MeasProgInstance.Guid
and MachineId = DBMatchingItem.Id;
While the above code works (it returns the correct value), I think it has a performance issue as the table grows, because the max is computed client-side after all the rows have been transferred, or am I wrong here?
How can I do this better? I want the database server to filter my data. Of course I don't want any raw SQL script ;-)
This can be addressed by casting the selected value to a nullable type, so that an empty sequence does not throw, and then applying a default value for the int. Alternatively, you can just assign the result to a nullable int. Note, the assumption here is an integer type for the ID; the same principle would apply to a Guid as well.
int MaxMessID = dbMeasuringsExisting.Max(p => (int?)p.MessID) ?? 0;
There is no need for the Any() check, as it causes an additional round trip to the database, which is not desirable in this case.
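Putting the pieces together, the method from the question could then look something like this untested sketch (assuming MessID is an int and iMaxMessID is a field on the class):

protected override void ReadExistingDBEntry()
{
    using Model.ResultContext db = new();

    // Where + Max compose into a single SQL statement
    // (SELECT MAX(MessID) ... WHERE ...), so the server does the
    // filtering and aggregation and only one scalar value is transferred.
    iMaxMessID = db.Measurements
        .Where(meas => meas.MeasuringInstanceGuid == Globals.MeasProgInstance.Guid
                    && meas.MachineId == DBMatchingItem.Id)
        .Max(meas => (int?)meas.MessID) ?? 0;
}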
Does anyone know if there's an easy way to negate a parse query? Something like this:
Parse.Query.not(query)
More specifically I want to do a relational query that gets everything except for the objects within the query. For example:
const relation = myParseObject.relation("myRelation");
const query = relation.query();
const negatedQuery = Parse.Query.not(query);
return await negatedQuery.find();
I know one solution would be to fetch the objects in the relation and then create a new query by looping through the objectIds using query.notEqualTo("objectId", fetchedObjectIds[i]), but this seems really circuitous...
Any help would be much appreciated!
doesNotMatchKeyInQuery is the solution, as Davi Macedo pointed out in the comments.
For example, if I wanted to get all of the Comments that are not in an Article's relation, I would do the following:
const relationQuery = article.relation("comments").query();
const notInRelationQuery = new Parse.Query("Comment");
notInRelationQuery.doesNotMatchKeyInQuery("objectId", "objectId", relationQuery);
const notRelatedComments = await notInRelationQuery.find();
How I understand it: the first argument specifies the key in the objects you are fetching, and the second argument specifies the key in the objects returned by the query you pass as the third argument. In effect, it finds the objects you don't want and then returns every object whose value for the first key does not match any of those objects' values for the second key.
I have searched endlessly for a solution here and cannot find one, please help!
I have three objects (A, B, and C). A has a lookup to B, and B is the master to C (detail). Both A and C have many records related to each B record.
I want to have a job run that gets a subset of records from Object C (usually around 5,000 records), then goes through each of those, gets the Object A records that look up to the same Object B record, summarizes an Object A number field, and puts that on the C record.
I have successfully gotten this to work at small scale, <100 Object C records. But each Object C record requires a new SOQL query, since I am iterating through them in a for loop after I get all the Object C records. Plus I know it is not best practice to ever have a query in a loop.
How can I get this to work? Since the records share the relationship with Object B, is there another way to get the data from the matching Object A records? Or is there some way to pull two lists, one of Object C and one of Object A, then summarize the Object A records and line the lists up somehow?
Thanks in advance!
Code:
public class nightlyJob {
    public static void updateNumbers() {
        integer I = 29;
        List<ObjectC__c> CUpdateList = new List<ObjectC__c>();
        List<ObjectC__c> CpullList =
            [SELECT Id, Index__c, ObjectB__r.Id
             FROM ObjectC__c
             WHERE Index__c = :I];

        for (ObjectC__c s : CpullList) {
            List<ObjectA__c> AList =
                [SELECT ObjectB__c, Number__c
                 FROM ObjectA__c
                 WHERE ObjectB__c = :s.ObjectB__r.Id];

            decimal NumSum = 0;
            for (ObjectA__c a : AList) {
                NumSum = a.Number__c + NumSum;
            }
            s.Num__c = NumSum;
            CUpdateList.add(s);
        }
        update CUpdateList;
    }
}
It looks like you are really missing several fundamental concepts at the moment.
The biggest problem you are up against in SFDC development is that "database" operations are very expensive and are strictly limited. It's not just a matter of "best practice": if in a single transaction you exceed these limits -- number of SOQL calls, number of records returned, number of records updated, number of DML statements, etc. -- your transaction will fail. For details, search online for "Salesforce Execution Governors and Limits".
You can write code that works within these limitations, but there is a bit of a learning curve.
First, learn to use collections with SOQL queries to get your SOQL queries out of loops. This is a.k.a. "bulkification" and it is fundamental to SFDC development:
List<ObjectC__c> CpullList =
    [SELECT Id, Index__c, ObjectB__r.Id
     FROM ObjectC__c
     WHERE Index__c = :I];

// Create a map with the results of this query.
// key = ObjectC__c.Id, value = ObjectC__c record
Map<Id, ObjectC__c> objCmap = new Map<Id, ObjectC__c>(CpullList);

// Build a set of all the Object B Ids from this result set
Set<Id> objBids = new Set<Id>();
for (ObjectC__c record : CpullList) {
    objBids.add(record.ObjectB__r.Id);
}

// Now you can use only one SOQL query instead of a query in a loop
List<ObjectA__c> AList = [SELECT ObjectB__c, Number__c
                          FROM ObjectA__c
                          WHERE ObjectB__c IN :objBids];
Next, use "SOQL aggregate functions" whenever you can. Example: in your code here, you could use "SUM()" and "group by" instead of performing these calculations with loops:
// Get the sum of ObjectA__c.Number__c for each Object B in objBids
AggregateResult[] groupedResults = [SELECT ObjectB__c,
                                           SUM(Number__c) sumA
                                    FROM ObjectA__c
                                    WHERE ObjectB__c IN :objBids
                                    GROUP BY ObjectB__c];

for (AggregateResult ar : groupedResults) {
    System.debug('Object B Id: ' + ar.get('ObjectB__c'));
    System.debug('Sum of ObjectA__c.Number__c: ' + ar.get('sumA'));
    // Here, you might want to build a Map<Id, Decimal> sumAmap:
    //   key = Object B Id, value = sumA
    // and then use it along with your Object C query results to build
    // a collection of Object C's for your update statement (see the sketch below).
}
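As a rough, untested continuation of the idea in those comments (defaulting Num__c to 0 when an Object C record has no matching Object A rows is my assumption):

// Build a map of Object B Id -> summed Number__c from the aggregate results
Map<Id, Decimal> sumAmap = new Map<Id, Decimal>();
for (AggregateResult ar : groupedResults) {
    sumAmap.put((Id) ar.get('ObjectB__c'), (Decimal) ar.get('sumA'));
}

// Combine with the Object C records queried earlier to build the update list
List<ObjectC__c> CUpdateList = new List<ObjectC__c>();
for (ObjectC__c c : CpullList) {
    c.Num__c = sumAmap.containsKey(c.ObjectB__r.Id)
        ? sumAmap.get(c.ObjectB__r.Id)
        : 0;
    CUpdateList.add(c);
}
update CUpdateList;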
You can continue this process and apply these ideas to make the code more efficient.
But even after you have your methods working as efficiently as possible, you may still run into limits due to the number of records you're dealing with. At that point, you will need to learn about the Batchable interface, the Queueable interface and @future calls (how to process a larger number of records split across transactions). That's really too much information to cover in a single SO answer.
There is a table; it is a POCO entity generated by Entity Framework.
class Log
{
    public int DoneByEmpId { get; set; }
    public string DoneByEmpName { get; set; }
}
I am retrieving a list from the database. I want distinct values based on DoneByEmpId, ordered by DoneByEmpName.
I have tried lots of ways to do it, but it is not working.
var lstLogUsers = context.Logs
    .GroupBy(logList => logList.DoneByEmpId)
    .Select(item => item.First())
    .ToList(); // it gives an error
This one gets all the users:
var lstLogUsers = context.Logs.ToList().OrderBy(logList => logList.DoneByEmpName).Distinct();
Can anyone suggest how to achieve this?
Can I just point out that you probably have a problem with your data model here? I would imagine you should just have DoneByEmpId here, and a separate table Employee which has EmpId and Name.
I think this is why you are needing to use Distinct/GroupBy (which doesn't really work for this scenario, as you are finding).
I'm not near a compiler, so I can't test it, but...
Use the other version of Distinct(), the one that takes an IEqualityComparer<TSource> argument, and then use OrderBy().
See here for example.
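As a rough, untested sketch of that approach (the comparer class name is mine, and comparing on DoneByEmpId is assumed):

class LogByEmpIdComparer : IEqualityComparer<Log>
{
    public bool Equals(Log x, Log y) => x.DoneByEmpId == y.DoneByEmpId;
    public int GetHashCode(Log log) => log.DoneByEmpId.GetHashCode();
}

// A custom IEqualityComparer can't be translated to SQL,
// so the rows are brought into memory first with AsEnumerable().
var lstLogUsers = context.Logs
    .AsEnumerable()
    .Distinct(new LogByEmpIdComparer())
    .OrderBy(log => log.DoneByEmpName)
    .ToList();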
I want to add some calculated properties to an EntityObject without losing the possibility of querying it against the database.
I created a partial class and added the fields I need to the object. Then I wrote a static function "AttachProperties" that should somehow add some calculated values. I cannot do this on the client side, since several other functions attach filter conditions to the query.
The function should look like this:
return query.Select(o =>
{
    o.HasCalculatedProperties = true;
    o.Value = 2;
    return o;
});
In my case the calculated value depends on several lookups and is not just a simple "2". This sample works with an IEnumerable but, of course, not with an IQueryable.
I first created a new class with the EntityObject as a property and added the other necessary fields, but now I need this extended class to be of the same base type.
First, in my opinion changing objects in a Select() is a bad idea, because it makes something else happen (state change) than the method name suggests (projection), which is always a recipe for trouble. Linq is rooted in a functional programming (stateless) paradigm, so this kind of usage is just not expected.
But you can extend your class with methods that return a calculation result, like:
partial class EntityObject
{
    public int GetValue()
    {
        return this.MappedProp1 * this.MappedProp2;
    }
}
It is a bit hard to tell from your question whether this will work for you. If generating a calculated value involves more than a simple calculation from an object's own properties, it may be better to leave your entities alone and create services that return calculation results from an object graph.
Try something like this:
return from o in collection
       select new O()
       {
           OtherProperty = o.OtherProperty,
           HasCalculatedProperties = true,
           Value = 2
       };
This will create a copy of the original object with the changes you require and avoid all the messiness that comes with modifying an entity in a select clause.