Doctrine result cache and DateTime params

How can I get the result cache working with DateTime parameters that are precise to the second? Since those parameters change every second, every execution produces a new cache entry and the result cache becomes useless. I'm not sure what the usual technique for dealing with this is; I was thinking it might be possible to identify the query without taking the parameters into account?

Okay, apparently resultCacheId is useless:
if ($data = $resultCache->fetch($cacheKey)) {
    // is the real key part of this row pointers map or is the cache only pointing to other cache keys?
    if (isset($data[$realKey])) {
        $stmt = new ArrayStatement($data[$realKey]);
    } elseif (array_key_exists($realKey, $data)) {
        $stmt = new ArrayStatement(array());
    }
}

if (!isset($stmt)) {
    $stmt = new ResultCacheStatement($this->executeQuery($query, $params, $types), $resultCache, $cacheKey, $realKey, $qcp->getLifetime());
}
After fetching the data by the resultCacheId, it still looks the result up by $realKey (which is calculated from the query including the params) and of course won't find it. So the cache gets populated, but it is never actually read.
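A common workaround is to normalize the DateTime parameter before binding it, so the computed cache key only changes once per chosen window instead of every second. A minimal sketch, assuming a DBAL connection in $conn and a QueryCacheProfile in $qcp (the SQL, table, and column names are made up for illustration):
// Round "now" down to the start of the current minute, so the bound value
// (and therefore the cache key derived from query + params) stays stable for 60s.
$now = new \DateTime();
$now->setTime((int) $now->format('H'), (int) $now->format('i'), 0);

$stmt = $conn->executeCacheQuery(
    'SELECT * FROM events WHERE created_at <= ?',
    array($now->format('Y-m-d H:i:s')),
    array(\PDO::PARAM_STR),
    $qcp
);
The trade-off is that results can be up to a minute stale, which is usually acceptable for data you are willing to cache in the first place.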

Related

NHibernate second level cache: Query cache doesn't work as expected

The packages I use:
NHibernate 5.2.1
NHibernate.Caches.SysCache 5.5.1
The NH cache config:
<configuration>
  <configSections>
    <section name="syscache" type="NHibernate.Caches.SysCache.SysCacheSectionHandler,NHibernate.Caches.SysCache" />
  </configSections>
  <syscache>
    <!-- 3600 s = 1 h; priority 3 == normal cost of expiration -->
    <cache region="GeoLocation" expiration="3600" sliding="true" priority="3" />
  </syscache>
</configuration>
I want to query a bunch of locations using their unique primary keys. In this unit test I simulate two requests using different sessions but the same session factory:
[TestMethod]
public void UnitTest()
{
    var sessionProvider = GetSessionProvider();

    using (var session = sessionProvider.GetSession())
    {
        var locations = session
            .QueryOver<GeoLocation>()
            .Where(x => x.LocationId.IsIn(new[] { 147643, 39020, 172262 }))
            .Cacheable()
            .CacheRegion("GeoLocation")
            .List();
        Assert.AreEqual(3, locations.Count);
    }

    Thread.Sleep(1000);

    using (var session = sessionProvider.GetSession())
    {
        var locations = session
            .QueryOver<GeoLocation>()
            .Where(x => x.LocationId.IsIn(new[] { 39020, 172262 }))
            .Cacheable()
            .CacheRegion("GeoLocation")
            .List();
        Assert.AreEqual(2, locations.Count);
    }
}
If the exact same IDs are queried in the exact same order, the second call fetches the objects from the cache. In this example, however, the query is called with only two of the previously submitted IDs. Although the locations have been cached, the second query fetches them from the DB.
I expected the cache to work like a table that is queried first: only the IDs that have not been cached yet should trigger a DB call. But obviously the whole query seems to be the hash key for the cached objects.
Is there any way to change that behavior?
There is no notion of a partial query cache; it's all or nothing: if the results for this exact query are found, they are used, otherwise the database is queried. This is because the query cache system has no specific knowledge about the meaning of the queries (e.g. it cannot infer that the result of a particular query is a subset of some cached result).
In other words, the query cache in NHibernate acts as a document store rather than a relational table store. The key for the document is a combination of the query's SQL (in the case of LINQ, some textual representation of the expression tree), all parameter types, and all parameter values.
To solve your particular case I would suggest doing some performance testing first. Depending on the tests and the dataset size there are a few possible solutions: filter cached results on the client (something like the following), don't use the query cache at all, or implement a caching mechanism for this particular query at the application level.
[TestMethod]
public void UnitTest()
{
    var sessionProvider = GetSessionProvider();

    using (var session = sessionProvider.GetSession())
    {
        var locations = session
            .QueryOver<GeoLocation>()
            .Cacheable()
            .CacheRegion("GeoLocation")
            .List()
            .Where(x => new[] { 147643, 39020, 172262 }.Contains(x.LocationId))
            .ToList();
        Assert.AreEqual(3, locations.Count);
    }

    Thread.Sleep(1000);

    using (var session = sessionProvider.GetSession())
    {
        var locations = session
            .QueryOver<GeoLocation>()
            .Cacheable()
            .CacheRegion("GeoLocation")
            .List()
            .Where(x => new[] { 39020, 172262 }.Contains(x.LocationId))
            .ToList();
        Assert.AreEqual(2, locations.Count);
    }
}
More information on how the (N)Hibernate query cache works can be found here.

Fetch history of records using LINQ

I am using Entity Framework with the repository pattern and unit of work objects.
I have an entity Request with the properties RequestId and OldRequestId, which can be accessed through a requestRepository object,
e.g. requestRepository.GetAll() or requestRepository.GetFiltered(r => r.RequestId == 10).
If I pass a RequestId, it should retrieve the specific record.
If OldRequestId is not null in the retrieved record, it should bring back the old request data as well.
This should go on until OldRequestId is null.
A simple way would be something like this:
public static IEnumerable<Request> GetRecursive(int id)
{
    while (true)
    {
        // GetFiltered returns a sequence; we expect at most one match per id
        var tmp = GetFiltered(x => x.RequestId == id).FirstOrDefault();
        if (tmp == null)
            yield break;

        yield return tmp;

        if (tmp.OldRequestId.HasValue)
            id = tmp.OldRequestId.Value;
        else
            yield break;
    }
}
Please note that this code makes multiple queries against the database, one per record in the chain. Performance won't be the best, but it might work for your scenario.
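Hypothetical usage, assuming a request with Id 10 that chains back through OldRequestId:
// Walk the chain starting at request 10 and materialize the full history.
var history = GetRecursive(10).ToList();
// history[0] is request 10, history[1] its predecessor, and so on,
// ending with the first request in the chain (OldRequestId == null).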

MVC3 Entity Framework Code First Updating Subset Related List of Items

I have a table of data with a list of key value pairs in it.
Key          Value
---------------------------------------
ElementName  PrimaryEmail
Email        someemail@gmail.ca
Value        Content/Images/logo-here.jpg
I am able to generate new items on my client webpage. When I create a new row on the client and save it to the server by executing the following code, the item saves to the database as expected.
public JsonResult Add(CardElement cardElement)
{
    db.Entry(cardElement).State = EntityState.Added;
    db.SaveChanges();
    return Json(cardElement);
}
Now, when I try to delete the objects by sending another ajax request, it fails.
public void Delete(CardElement[] cardElements)
{
    foreach (var cardElement in cardElements)
    {
        db.Entry(cardElement).State = EntityState.Deleted;
    }
    db.SaveChanges();
}
This results in the following error.
Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries.
I have tried other ways of deleting, including find-by-id-then-remove and attach-then-delete, but obviously I am not approaching this in the right fashion.
I am not sure what is causing your issue, but I tend to structure my deletes as follows:
public void Delete(CardElement[] cardElements)
{
    foreach (var cardElement in cardElements)
    {
        var element = db.Table.Where(x => x.ID == cardElement.ID).FirstOrDefault();
        if (element != null)
            db.DeleteObject(element);
    }
    db.SaveChanges();
}
although I tend to do database first development, which may change things slightly.
EDIT: the error you are receiving states that no rows were updated. When you pass an object to a view and then post it back to the controller, the link between the object and the data store tends to be broken. That is why I prefer to look the object up first by its ID, so that I have an object that is still linked to the data store.
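If the extra SELECT per element is a concern, an alternative is to re-attach the posted entities and mark them deleted. This is only a sketch: it assumes a DbContext with a CardElements set and that the posted objects still carry their correct key values:
public void Delete(CardElement[] cardElements)
{
    foreach (var cardElement in cardElements)
    {
        // Attach the detached entity so the context tracks it, then mark it
        // deleted; no lookup query is issued before SaveChanges().
        db.CardElements.Attach(cardElement);
        db.Entry(cardElement).State = EntityState.Deleted;
    }
    db.SaveChanges(); // still fails with the same error if a row was already removed
}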

How can I create temporary records of Linq-To-Sql types without causing duplicate key problems?

I have code that generates records based on my DataGridView. These records are temporary because some of them already exist in the database.
Crop_Variety v = new Crop_Variety();
v.Type_ID = currentCropType.Type_ID;
v.Variety_ID = r.Cells[0].Value.ToString();
v.Description = r.Cells[1].Value.ToString();
v.Crop = currentCrop;
v.Crop_ID = currentCrop.Crop_ID;
Unfortunately, because this little bit of code says v.Crop = currentCrop, currentCrop.Crop_Varieties now includes this temporary record. When I go to insert the records of this grid that are new, they all hold a reference to the same Crop record, so the temporary records that already exist in the database show up twice, causing duplicate key errors when I submit.
I have a whole system for detecting which records need to be added and which need to be deleted based on what the user has done, but it's getting gummed up by this relentless tracking of references.
Is there a way I can stop Linq-To-Sql from automatically adding these temporary records to its table collections?
I would suggest revisiting the code that populates the DataGridView (grid) with records.
Then revisit the code that operates on items from the grid, keeping in mind that you can grab the bound item from a grid row using the following code:
public object GridSelectedItem
{
    get
    {
        try
        {
            if (_grid == null || _grid.SelectedCells.Count < 1) return null;
            DataGridViewCell cell = _grid.SelectedCells[0];
            DataGridViewRow row = _grid.Rows[cell.RowIndex];
            if (row.DataBoundItem == null) return null;
            return row.DataBoundItem;
        }
        catch { }
        return null;
    }
}
It is also hard to understand the nature of the Crop_Variety code you have posted, as Crop_Variety seems to be a subclass of Crop. This leads to problems when the Crop is not yet bound to the database, and can potentially lead to problems when you're adding a Crop_Variety to the context.
For this type of Forms application I normally keep a List<T> _dataList inside the form class and bind the main grid to that list, through an ObjectBindingList or another way. That way _dataList holds all the data that needs to be persisted when needed (user clicked save).
When you assign an entity object reference you are creating a link between the two objects. Here you are doing that:
v.Crop = currentCrop;
There is only one way to avoid this: Modify the generated code or generate/write your own. I would never do this.
I think you will be better off by writing a custom DTO class instead of reusing the generated entities. I have done both approaches and I like the latter one far better.
Edit: Here is some sample generated code:
[global::System.Data.Linq.Mapping.AssociationAttribute(Name="RssFeed_RssFeedItem", Storage="_RssFeed", ThisKey="RssFeedID", OtherKey="ID", IsForeignKey=true, DeleteOnNull=true, DeleteRule="CASCADE")]
public RssFeed RssFeed
{
    get
    {
        return this._RssFeed.Entity;
    }
    set
    {
        RssFeed previousValue = this._RssFeed.Entity;
        if (((previousValue != value)
            || (this._RssFeed.HasLoadedOrAssignedValue == false)))
        {
            this.SendPropertyChanging();
            if ((previousValue != null))
            {
                this._RssFeed.Entity = null;
                previousValue.RssFeedItems.Remove(this);
            }
            this._RssFeed.Entity = value;
            if ((value != null))
            {
                value.RssFeedItems.Add(this);
                this._RssFeedID = value.ID;
            }
            else
            {
                this._RssFeedID = default(int);
            }
            this.SendPropertyChanged("RssFeed");
        }
    }
}
As you can see, the generated code establishes the link by calling value.RssFeedItems.Add(this).
In case you have many entities for which you would need many DTOs, you could code-generate the DTO classes using reflection.
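For illustration, a hand-written DTO for the Crop_Variety example could be as simple as this (a sketch; the property names and types are assumptions based on the snippet above):
// A plain data holder with no association properties, so filling it in
// never touches currentCrop.Crop_Varieties or any other tracked collection.
public class CropVarietyDto
{
    public int Type_ID { get; set; }
    public string Variety_ID { get; set; }
    public string Description { get; set; }
    public int Crop_ID { get; set; } // foreign key only; no Crop reference
}
When the user saves, map only the genuinely new DTOs onto freshly created Crop_Variety entities and submit those.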

LINQ to SQL related table not delay loading properly, not populating at all

I have a couple of tables whose relationship structure is similar to the standard Order and OrderLine tables.
When creating a data context, it gives the Order class an OrderLines property that should be populated with the OrderLine objects for that particular Order object.
Sure, by default it will delay-load the contents of the OrderLines property, but that should be fairly transparent, right?
OK, here is the problem I have: I get an empty list from MyOrder.OrderLines, but when I run myDataContext.OrderLines.Where(line => line.OrderId == 1) I get the right list.
public void B()
{
    var dbContext = new Adis.CA.Repository.Database.CaDataContext("<connectionString>");
    dbContext.Connection.Open();
    dbContext.Transaction = dbContext.Connection.BeginTransaction();
    try
    {
        //!!!Edit: Important to note that the order with OrderID=1 already exists
        //!!!in the database.
        //Just add some new order lines to make sure there are some.
        var newOrderLines = new List<OrderLine>()
        {
            new OrderLine() { OrderID=1, LineID=300 },
            new OrderLine() { OrderID=1, LineID=301 },
            new OrderLine() { OrderID=1, LineID=302 },
            new OrderLine() { OrderID=1, LineID=303 }
        };
        dbContext.OrderLines.InsertAllOnSubmit(newOrderLines);
        dbContext.SubmitChanges();

        //this will give me the 4 rows I just inserted
        var orderLinesDirect = dbContext.OrderLines
            .Where(orderLine => orderLine.OrderID == 1);
        var order = dbContext.Orders.Where(o => o.OrderID == 1);
        //this will be an empty list
        var orderLinesThroughOrder = order.OrderLines;
    }
    catch (System.Data.SqlClient.SqlException e)
    {
        dbContext.Transaction.Rollback();
        throw;
    }
    finally
    {
        dbContext.Transaction.Rollback();
        dbContext.Dispose();
        dbContext = null;
    }
}
So as far as I can see, I'm not doing anything particularly strange, but I would have thought that orderLinesDirect and orderLinesThroughOrder would give me the same result set.
Can anyone tell me why they don't?
You're just adding OrderLines, not any actual Orders, so the Where on dbContext.Orders returns an empty list.
How you can still find the property OrderLines on order I don't understand, so I may be goofing up here.
[Edit]
Could you update the example to show the actual types, especially of the order variable? IMO it should be an IQueryable<Order>, but then it's strange that you can call .OrderLines on it. Try adding First() or FirstOrDefault() after the Where.
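In other words, something like this (a sketch of the suggested fix; Where() yields an IQueryable<Order>, not a single Order):
// Materialize one Order entity first; its OrderLines association
// can then delay-load as expected.
var order = dbContext.Orders.FirstOrDefault(o => o.OrderID == 1);
if (order != null)
{
    var orderLinesThroughOrder = order.OrderLines; // populated via lazy loading
}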
