I have an MVC 3 site that uses cached in-memory objects.
When the site first gets hit, it takes around a minute to build the cache; once built, it's very fast for everyone from then on.
When I'm developing, I've had to reduce the number of cached objects, because every time I recompile my project it drops the cache and has to rebuild it.
Is there a way I can set up Visual Studio so it keeps the in-memory cache when I recompile?
Here is some of the code I use for caching:
/// <summary>
/// Gets all the prices, populating the cache on first access.
/// </summary>
public static List<DB2011_PriceRange> AllPrices
{
    get
    {
        lock (_PriceLock)
        {
            if (HttpRuntime.Cache["Prices"] == null)
            {
                PopulatePrices();
            }
            return (List<DB2011_PriceRange>)HttpRuntime.Cache["Prices"];
        }
    }
}
/// <summary>
/// Populate the cache containing the prices.
/// </summary>
private static void PopulatePrices()
{
    // clear the cache entry and the backing list
    ClearCacheAndList("Prices", ref ListAllPrices);

    using (var DB = DataContext.Get_DataContext)
    {
        ListAllPrices = (from p in DB.DB2011_PriceRange
                         select p).ToList();
    }

    // add the list to the cache for 24 hours
    HttpRuntime.Cache.Add("Prices", ListAllPrices, null, DateTime.Now.AddHours(24),
        Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
}
Any help is always appreciated.
Truegilly
Recompiling your application causes the AppDomain hosting it to be restarted, and that is what disposes of your cache. You could:
Try saving your cache to disk and reading it from there when the app starts; that might be faster than rebuilding it from the database (see the sketch below).
Use an out-of-process cache such as Velocity.
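A minimal sketch of the disk-backed option, assuming DB2011_PriceRange is XML-serializable; the file path and the SavePricesToDisk/LoadPricesFromDisk names are illustrative, not from the original code (requires System.IO and System.Xml.Serialization):
// Hypothetical snapshot location under the site root.
private static readonly string CacheFile =
    Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "App_Data", "prices.xml");

// Persist the price list so an AppDomain restart can reload it
// without waiting on the database.
private static void SavePricesToDisk(List<DB2011_PriceRange> prices)
{
    var serializer = new XmlSerializer(typeof(List<DB2011_PriceRange>));
    using (var stream = File.Create(CacheFile))
    {
        serializer.Serialize(stream, prices);
    }
}

// Returns null when no snapshot exists yet, so callers can fall
// back to PopulatePrices().
private static List<DB2011_PriceRange> LoadPricesFromDisk()
{
    if (!File.Exists(CacheFile))
        return null;
    var serializer = new XmlSerializer(typeof(List<DB2011_PriceRange>));
    using (var stream = File.OpenRead(CacheFile))
    {
        return (List<DB2011_PriceRange>)serializer.Deserialize(stream);
    }
}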
I don't believe so. When you publish new DLLs, the process running the application is recreated. Since you're using an in-memory cache, all objects are dropped.
You could "warm the caches" with a special method that pre-populates them all when you publish new code, for example:
I'm having issues with a static member of my app class losing its value, and I'm not quite sure I understand why. In my app constructor I check whether the user is logged in and, if not, redirect to a login page where I set the static app class member.
I understand that if the app is forced to close to free up resources, these values are not retained, so a new app instance would start and go back to the login screen. However, what I'm seeing is the static member losing its value during an application session. I can check whether it is null on resume and redirect to the login page, but I don't understand why this happens.
My understanding was that the only way you would lose values would be if the app was killed in the background, but this problem suggests it can happen when resuming too.
In a normal C# application static members will typically survive forever, but unfortunately your observations are entirely correct: in Xamarin.Forms static members are not guaranteed to persist for the length of the application's life.
In Android's case, if the underlying platform indicates a low-memory state (or increased demands on memory from multiple running applications), then static members are considered collectable by the GC, which is often triggered when you pause the application (i.e. switching to a different app). They will be reset to their default value, e.g. null, zero, etc.
I've wrestled with this curio for years, and the most performant workaround is to implement a re-population pattern on those static members, e.g.:
internal static List<MyCustomType> _AListOfStuff;

internal static List<MyCustomType> AListOfStuff
{
    get
    {
        if (_AListOfStuff == null)
        {
            PopulateAListOfStuff(); // if this occurs, the static member has been garbage collected: reload it
        }
        return _AListOfStuff;
    }
}
From what you've said, I appreciate that your particular usage of static members probably doesn't fit with this solution, however all I can offer is that you're not crazy; it is a documented quirk, and not considered a bug (don't even bother shaking that tree; I've been down that route with the devs and was told in no uncertain terms that the behaviour is here to stay, and is necessary to ensure overall device stability).
A static member should not lose its value. If we could see the code, we could assist further. Another approach would be to use the singleton pattern; it creates a new instance only if its instance is null. Sample below:
public sealed class SingletonSample
{
    private static SingletonSample instance = null;
    private static readonly object padlock = new object();

    // Private constructor prevents callers from creating extra instances.
    private SingletonSample() { }

    public static SingletonSample Instance
    {
        get
        {
            lock (padlock)
            {
                if (instance == null)
                {
                    instance = new SingletonSample();
                }
                return instance;
            }
        }
    }

    public string FirstName { get; set; }
}
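Usage is then just a property access; the first call creates the instance and later calls return the same object:
SingletonSample.Instance.FirstName = "Ada";       // first access creates the instance
string name = SingletonSample.Instance.FirstName; // reuses the same instance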
I am using MiniProfiler to profile my MVC app and WCF services. This works like a charm, with one caveat: when the profile information contains SQL.
Symptoms:
The "query time (ms)" heading is missing from the popup.
The "% in sql" is also missing from the bottom of the popup.
If I click on the "sql" links, it shows the grey overlay but no information, and throws a jQuery error (it can't find the element).
After a little digging I discovered that this all comes down to HasSqlTimings: in the JSON response there is an inconsistency between HasSqlTimings (false) at the root of the response and the information in the Root / Children hierarchy (true).
[OnDeserialized]
void OnDeserialized(StreamingContext ctx)
{
    HasSqlTimings = GetTimingHierarchy().Any(t => t.HasSqlTimings);
    HasDuplicateSqlTimings = GetTimingHierarchy().Any(t => t.HasDuplicateSqlTimings);
    if (_root != null)
    {
        _root.RebuildParentTimings();
    }
}
I took a look at the source and it looks like it should work just fine but no deal! Does anyone have any idea where I might be going wrong?
I had the same issue when implementing this. It occurs when the UI layer doesn't make any SQL calls but the WCF data comes back with SQL calls: the root timing doesn't know there are SQL timings below it in the hierarchy.
I added one line so that when adding "remote" timings we set the HasSqlTimings field, so the UI knows how to render the box properly. Here is the code I modified in MvcMiniProfiler\MiniProfiler.cs:
/// <summary>
/// Adds <paramref name="externalProfiler"/>'s <see cref="Timing"/> hierarchy to this profiler's current Timing step,
/// allowing other threads, remote calls, etc. to be profiled and joined into this profiling session.
/// </summary>
public static void AddProfilerResults(this MiniProfiler profiler, MiniProfiler externalProfiler)
{
    if (profiler == null || externalProfiler == null) return;

    profiler.Head.AddChild(externalProfiler.Root);
    // Recompute the root flag so the UI knows SQL timings exist somewhere in the hierarchy.
    profiler.HasSqlTimings = profiler.GetTimingHierarchy().Any(t => t.HasSqlTimings);
}
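With that change in place, the call site that joins the WCF profile stays the same. A hypothetical example, where wcfProfiler stands for the MiniProfiler deserialized from the service response:
// Merge the service-side timings into the current request's profiler;
// the patched AddProfilerResults now recomputes HasSqlTimings.
MiniProfiler.Current.AddProfilerResults(wcfProfiler);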
I am attempting to use a file as a trigger to refresh my cached items. When the file is changed, I need to fire an event (every time the file changes). I'm currently using the HostFileChangeMonitor class. Here's where a cached item gets set, using a policy to link it to a file:
private static void SetPolicy(CacheNames cacheName, CacheItem item)
{
    string strCacheName = cacheName.ToString();
    CacheItemPolicy policy;
    if (_policies.TryGetValue(strCacheName, out policy))
    {
        _cacheObject.Set(item, policy);
        return;
    }

    policy = new CacheItemPolicy();
    List<string> filePaths = new List<string> {
        string.Format(@"{0}\{1}.txt", Config.AppSettings.CachePath, cacheName.ToString())
    };
    var changeMonitor = new HostFileChangeMonitor(filePaths);
    _cacheObject.Set(item, policy);
    changeMonitor.NotifyOnChanged(new OnChangedCallback(RefreshCache));
    policy.ChangeMonitors.Add(changeMonitor);
}
The NotifyOnChanged fires only once, however. Because of that, I am currently removing and then re-adding the item to the cache in the RefreshCache method called by the NotifyOnChanged:
private static void RefreshCache(object state)
{
    // remove from cache
    WcfCache.ClearCache("Refreshed");
    // resubscribe to NotifyOnChanged
    WcfCache.SetCache("Refreshed", true, CacheNames.CacheFileName);
    // grab all cache data and refresh each in parallel
}
Is there a better way to do this? Is there an event I can tap into that will ALWAYS fire (instead of just the first time like this NotifyOnChanged)? This seems pretty fragile. If the HostFileChangeMonitor doesn't get added properly one time, the entire app's cache will never refresh.
Have you tried the FileSystemWatcher and handling its Changed event? Here's MSDN's documentation on the subject for more detailed info.
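A minimal sketch of that approach; the path, filter, and reuse of the RefreshCache callback are placeholders based on the question's code (requires System.IO):
// Watch the single trigger file and refresh the cache on every change.
var watcher = new FileSystemWatcher(Config.AppSettings.CachePath, "CacheFileName.txt");
watcher.NotifyFilter = NotifyFilters.LastWrite;
watcher.Changed += (sender, e) => RefreshCache(null); // reuse the existing callback
watcher.EnableRaisingEvents = true;
// Keep 'watcher' referenced (e.g. in a static field) or it may be garbage collected.
Note that FileSystemWatcher can raise Changed more than once for a single save, so some debouncing may be needed.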
I've just discovered this class, and from what I can tell, NotifyOnChanged is not meant to fire every time a watched file is updated (even though the name leads you to think otherwise).
Rather, the file is constantly being watched and cached, and you simply need to get it from the cache whenever you need it.
Update: I have dropped the cache system in favor of a database solution; a pity.
I have a backend MVC controller where I need data caching. I use MemoryCache.Default to store key/value pairs, nothing big. Never mind policies and expiry times; I've got that covered. The thing that mystifies me is why my cache gets cleaned out after I've accessed a key (retrieved the value) the first time. If I don't access the cached item, it eventually expires and my remove handler is called, which is all good. But when I retrieve the item the first time, my remove handler is called after a short while, with the CacheEntryRemovedReason set to:
CacheSpecificEviction // A cache entry was evicted for a reason that is defined by a particular cache implementation.
I can't find any explanation of what this means.
The mystifying thing is that when I inspect the cache object while debugging in the handler (and on succeeding controller calls), the cache enumeration is empty. If I "set" (add) a new CacheItem to the cache, I can again access the key once, and history repeats.
The behavior is like a one-off caching mechanism, which I totally don't need.
Any help or comments would be much appreciated!
Some simplified code just for the fun of it:
private static ObjectCache cache = MemoryCache.Default;

internal void insertInCache(string key, int value) {
    CacheItemPolicy policy = new CacheItemPolicy() {
        AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
        Priority = CacheItemPriority.NotRemovable,
        SlidingExpiration = TimeSpan.FromMinutes(ITEM_EXPIRE_TIME),
        RemovedCallback = new CacheEntryRemovedCallback(RemovedHandler)
    };
    cache.Set(key, value, policy);
}

static void RemovedHandler(CacheEntryRemovedArguments args) {
    if (args.RemovedReason == CacheEntryRemovedReason.Expired) {
        // do something - or I actually want it to disappear when expired
    } else {
        cache.Set(args.CacheItem, somepolicy); // reinsert to keep it in the cache
    }
}

// Apparently this read triggers the eviction
internal void getSome(string key) {
    int thisIsWhatIWanted = (int)cache.GetCacheItem(key).Value;
}
This is just example code, so please don't nag me about my skills.
My own best guess is that it may have to do with the cache not being set up properly, MVC witchery, or the fact that I'm running my application on the debug IIS from Visual Studio.
On Linq to SQL's DataContext I am able to call SubmitChanges() to submit all changes.
What I want is to somehow reject all changes in the DataContext and roll them back (preferably without going to the database).
Is this possible?
Why not discard the data context and simply replace it with a new instance?
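In its simplest form that looks like this; the context type and connection string are whatever your project already uses (MyDataContext is a hypothetical name):
// Discard everything the old context was tracking by replacing it outright.
context.Dispose();
context = new MyDataContext(connectionString); // all pending changes are gone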
public static class DataContextExtensions
{
    /// <summary>
    /// Discard all pending changes of the current DataContext.
    /// All un-submitted changes, including inserts, deletes, and modifications, will be lost.
    /// </summary>
    /// <param name="context"></param>
    public static void DiscardPendingChanges(this DataContext context)
    {
        context.RefreshPendingChanges(RefreshMode.OverwriteCurrentValues);
        ChangeSet changeSet = context.GetChangeSet();
        if (changeSet != null)
        {
            // Undo inserts
            foreach (object objToInsert in changeSet.Inserts)
            {
                context.GetTable(objToInsert.GetType()).DeleteOnSubmit(objToInsert);
            }
            // Undo deletes
            foreach (object objToDelete in changeSet.Deletes)
            {
                context.GetTable(objToDelete.GetType()).InsertOnSubmit(objToDelete);
            }
        }
    }

    /// <summary>
    /// Refreshes all pending Delete/Update entity objects of the current DataContext
    /// according to the specified mode. Pending Insert entity objects are not affected.
    /// </summary>
    /// <param name="context"></param>
    /// <param name="refreshMode">A value that specifies how optimistic concurrency conflicts are handled.</param>
    public static void RefreshPendingChanges(this DataContext context, RefreshMode refreshMode)
    {
        ChangeSet changeSet = context.GetChangeSet();
        if (changeSet != null)
        {
            context.Refresh(refreshMode, changeSet.Deletes);
            context.Refresh(refreshMode, changeSet.Updates);
        }
    }
}
Refer to Linq to SQL - Discard Pending Changes
In .NET 3.0, use db.GetChangeSet().Updates.Clear() for updated items, db.GetChangeSet().Inserts.Clear() for new items, or db.GetChangeSet().Deletes.Clear() for deleted items.
In .NET 3.5 and above, the result of GetChangeSet() is read-only; loop over the collections in a for or foreach and refresh every entity in the ChangeSet, as macias also wrote in his comment (the foreach example further down shows the pattern).
As Haacked said, just drop the data context.
You probably shouldn't keep the data context alive for a long time. They're designed to be used in a transactional manner (i.e. one data context per atomic work unit). If you keep a data context alive for a long time, you run a greater risk of generating a concurrency exception when you update a stale entity.
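A minimal sketch of that unit-of-work scoping; the context and table names here are illustrative:
// One context per atomic operation: changes either get submitted
// or vanish when the context is disposed.
using (var db = new MyDataContext(connectionString)) // hypothetical context type
{
    var customer = db.Customers.Single(c => c.Id == customerId);
    customer.Name = "Updated name";
    db.SubmitChanges();
} // nothing is left to roll back once the context goes out of scope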
Calling Clear() on the Updates, Deletes and Inserts collections does not work.
GetOriginalEntityState() can be useful, but it only gives the IDs for foreign-key relationships, not the actual entities, so you're left with a detached object.
Here's an article that explains how to discard changes from the data context: http://graemehill.ca/discard-changes-in-linq-to-sql-datacontext
EDIT: Calling Refresh() will undo updates, but not deletes and inserts.
Refresh will work; however, you have to pass in the entities you want to reset.
For example:
dataContext.Refresh(RefreshMode.OverwriteCurrentValues, someObject);
You can use GetOriginalEntityState(...) to get the original values for the objects, e.g. Customers, using the old cached values.
You can also iterate through the changes, e.g. the updates, and refresh only the specific objects rather than entire tables, because the performance penalty would otherwise be high:
foreach (Customer c in MyDBContext.GetChangeSet().Updates)
{
    MyDBContext.Refresh(System.Data.Linq.RefreshMode.OverwriteCurrentValues, c);
}
This will revert the changes using the data persisted in the database.
Another solution is to discard the DataContext you are using and dispose of it with Dispose().
In any case, it is good practice to override the Insert and Remove methods of the collection (e.g. of Customers) you use and add an InsertOnSubmit() / DeleteOnSubmit() call there. This will resolve your issue with pending insertions and deletions; a sketch follows.
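A sketch of that override idea, assuming a collection that wraps a LINQ to SQL table; all names here are illustrative:
// A collection that keeps the DataContext in sync with in-memory
// adds and removes, so pending insertions/deletions mirror the UI.
public class CustomerCollection : System.Collections.ObjectModel.Collection<Customer>
{
    private readonly MyDataContext _db; // hypothetical context type

    public CustomerCollection(MyDataContext db)
    {
        _db = db;
    }

    protected override void InsertItem(int index, Customer item)
    {
        base.InsertItem(index, item);
        _db.Customers.InsertOnSubmit(item); // queue the insert with the context
    }

    protected override void RemoveItem(int index)
    {
        _db.Customers.DeleteOnSubmit(this[index]); // queue the delete
        base.RemoveItem(index);
    }
}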
My application is Outlook-style, with an icon to select the active form (a ListBox). Before allowing the user to change context, they have to accept or discard their changes.
var changes = db.GetChangeSet();
if ((changes.Updates.Count > 0) || (changes.Inserts.Count > 0) || (changes.Deletes.Count > 0))
{
    if (MessageBox.Show("Would you like to save changes?", "Save Changes", MessageBoxButton.YesNo) == MessageBoxResult.Yes)
    {
        db.SubmitChanges();
    }
    else
    {
        // Roll back changes
        foreach (object objToInsert in changes.Inserts)
        {
            db.GetTable(objToInsert.GetType()).DeleteOnSubmit(objToInsert);
        }
        foreach (object objToDelete in changes.Deletes)
        {
            db.GetTable(objToDelete.GetType()).InsertOnSubmit(objToDelete);
        }
        foreach (object objToUpdate in changes.Updates)
        {
            db.Refresh(RefreshMode.OverwriteCurrentValues, objToUpdate);
        }
        CurrentForm.SetObject(null); // application code to clear the active form
        RefreshList();               // application code to refresh the active list
    }
}
Excellent write-up above, but here is a copy-and-paste of the code used.
Public Sub DiscardInsertsAndDeletes(ByVal data As DataContext)
    ' Get the changes
    Dim changes = data.GetChangeSet()

    ' Delete the insertions
    For Each insertion In changes.Inserts
        data.GetTable(insertion.GetType).DeleteOnSubmit(insertion)
    Next

    ' Insert the deletions
    For Each deletion In changes.Deletes
        data.GetTable(deletion.GetType).InsertOnSubmit(deletion)
    Next
End Sub

Public Sub DiscardUpdates(ByVal data As DataContext)
    ' Get the changes
    Dim changes = data.GetChangeSet()

    ' Refresh the tables with updates
    Dim updatedTables As New List(Of ITable)
    For Each update In changes.Updates
        Dim tbl = data.GetTable(update.GetType)
        ' Make sure not to refresh the same table twice
        If updatedTables.Contains(tbl) Then
            Continue For
        Else
            updatedTables.Add(tbl)
            data.Refresh(RefreshMode.OverwriteCurrentValues, tbl)
        End If
    Next
End Sub
Here is how I did it. I just followed Teddy's example above and simplified it. One question, though: why even bother with the refresh on the deletes?
public static bool UndoPendingChanges(this NtsSuiteDataContext dbContext)
{
    if (dbContext.ChangesPending())
    {
        ChangeSet dbChangeSet = dbContext.GetChangeSet();
        dbContext.Refresh(RefreshMode.OverwriteCurrentValues, dbChangeSet.Deletes);
        dbContext.Refresh(RefreshMode.OverwriteCurrentValues, dbChangeSet.Updates);

        // Undo inserts
        foreach (object objToInsert in dbChangeSet.Inserts)
        {
            dbContext.GetTable(objToInsert.GetType()).DeleteOnSubmit(objToInsert);
        }
        // Undo deletes
        foreach (object objToDelete in dbChangeSet.Deletes)
        {
            dbContext.GetTable(objToDelete.GetType()).InsertOnSubmit(objToDelete);
        }
    }
    return true;
}
This works for me, 13 years later, using .NET 6 (Entity Framework Core rather than LINQ to SQL):
dbContext.ChangeTracker.Clear();