SysCache2 and FluentNHibernate on an MVC application

I am having a problem with SysCache/SysCache2 on my MVC application. My configuration seems to be correct. I have set it up just like countless examples on the web.
On my class mapping I have put: Cache.Region("LongTerm").NonStrictReadWrite().IncludeAll();
Here is a test I made for the application cache.
[Test]
public void cache()
{
    using (var session = sessionFactory.OpenSession())
    using (var tx = session.BeginTransaction())
    {
        var acc = session.QueryOver<Log>().Cacheable().List();
        tx.Commit();
    }
    var test = sessionFactory.Statistics.SecondLevelCacheHitCount;

    using (var session = sessionFactory.OpenSession())
    {
        var acc = session.QueryOver<Log>().List();
    }
    var test1 = sessionFactory.Statistics.SecondLevelCacheHitCount;
}
The first statement is cached, as I can see in the session factory statistics (for example, 230 records).
If I understand it correctly, the second statement shouldn't hit the DB but the cache. The problem is that it goes to the DB anyway; I checked with the profiler to be 100% sure.
I don't know what I am doing wrong here. Does anyone have an idea?

I have managed to solve this problem. It had to do with my session creation: I wasn't using session-per-request, which prevented the cache from being used. I created a transaction at the beginning, and it lasted through the entire session. I could trigger a cache hit by reopening the session inside a using block, like using (var sess = session.SessionFactory.OpenSession()), but that was only a workaround that didn't suit me, so I changed how I create sessions in the first place, and it works fine now! :)
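For later readers, one detail worth noting from the test above: NHibernate's query cache is only consulted for queries that are themselves marked Cacheable. A minimal sketch of the test with both queries marked Cacheable, each in its own short-lived session (sessionFactory and Log are the names from the question; generate_statistics must be enabled on the factory for the hit count to be populated):

```csharp
// Sketch only: assumes SysCache2 is configured with a "LongTerm" region.

// First query: goes to the database and populates the caches.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var logs = session.QueryOver<Log>().Cacheable().List();
    tx.Commit();
}

// Second query: a fresh session, and the query is again marked Cacheable,
// so NHibernate consults the query cache instead of hitting the database.
using (var session = sessionFactory.OpenSession())
{
    var logs = session.QueryOver<Log>().Cacheable().List();
}

// The hit count should now be greater than zero.
var hits = sessionFactory.Statistics.SecondLevelCacheHitCount;
```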

How to implement caching in MS Teams Client SPFX Webpart

I am trying to implement caching for an SPFX web part using #pnp-Storage. The caching works fine in Teams in the browser, but in the Teams client it isn't working. It is very slow, as I have to make multiple Azure Function calls. Can someone please help me with caching in the Teams app? Please refer to the code below.
// Getting data from session storage.
this.isListsExists = this.storage.session.get(isListsExists);
// If it exists in session storage then don't make the HTTP call; otherwise,
// make the call and save the result in session storage.
if (!this.isListsExists) {
    this.isListsExists = await this.mapDataProvider.checkIfAllListsExist(); // cache
    // Setting the session storage value.
    this.storage.session.put(isListsExists, this.isListsExists, end);
}
I was using session storage and it wasn't working with Teams, but when I changed it to local storage it worked like a charm.
// Edit - cache code
this.isListsExists = this.storage.local.get(isListsExistsCache);
// console.log("isListsExists - " + this.isListsExists);
if (!this.isListsExists) {
    this.isListsExists = await this.mapDataProvider.checkIfAllListsExist(); // cache
    this.storage.local.put(isListsExistsCache, this.isListsExists, end);
}

CMS Open Payments Data Limitation

I've finally gotten around to writing the code needed to import a web API into my SQL environment. However, when I ran the SSIS Script Component package (script language: Visual Studio C# 2017), I was only able to retrieve 1,000 records out of millions. A consultant mentioned that I may have to incorporate the app token into my code in order to access additional records.
Would someone be able to confirm that this is true? And if so, how should it be coded?
Here is the code prior to my "ForEach" loop code:
public override void CreateNewOutputRows()
{
    // Set web service URL.
    ServicePointManager.Expect100Continue = true;
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    string wUrl = "https://openpaymentsdata.cms.gov/resource/bqf5-h6wd.json";
    string appt = "MyAppToken";
    try
    {
        // Call GetWebServiceResult to return our article attributes.
        List<Payment> outPutResponse = GetWebServiceResult(wUrl);
If there's an alternative method to using the app token (like in the HTTP connection, for example), please let me know.
Figured it out...
https://openpaymentsdata.cms.gov/resource/bqf5-h6wd.json?$limit=10000&$$app_token="MyAppToken"
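For anyone pulling the full dataset: the SODA API defaults to a small page size, and larger extracts are paged with the $limit and $offset parameters; the app token can also be sent as an X-App-Token HTTP header instead of the $$app_token query parameter. A minimal sketch of building the paged URLs (the base URL is the one from the question; MyAppToken is a placeholder, and BuildPageUrl is a helper name I made up):

```csharp
using System;

static class SodaPaging
{
    // Builds the request URL for one page of a SODA dataset.
    // $limit and $offset are the standard SODA paging parameters.
    public static string BuildPageUrl(string baseUrl, int pageSize, int offset)
    {
        return baseUrl + "?$limit=" + pageSize + "&$offset=" + offset;
    }

    static void Main()
    {
        const string baseUrl = "https://openpaymentsdata.cms.gov/resource/bqf5-h6wd.json";
        const int pageSize = 10000;

        // Page 0, 1, 2, ... in the real loop you would continue until a
        // request returns an empty JSON array.
        for (int page = 0; page < 3; page++)
        {
            string url = BuildPageUrl(baseUrl, pageSize, page * pageSize);
            Console.WriteLine(url);
            // For each url: create the HttpWebRequest, add
            //   request.Headers.Add("X-App-Token", "MyAppToken");
            // read the response, and stop when the returned array is empty.
        }
    }
}
```

When paging like this, it is worth adding a $order clause so that rows keep a stable order between requests.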

Can I switch from 'entities.SingleOrDefault' to 'entities.Find' without hazards?

In my WCF service's business logic, in most of the places where I need to locate an entity, I use this syntax:
public void UpdateUser(Guid userId, String notes)
{
    using (ProjEntities entities = new ProjEntities())
    {
        User currUser = entities.Users.SingleOrDefault(us => us.Id == userId);
        if (currUser == null)
            throw new Exception("User with ID " + userId + " was not found");
    }
}
I have recently discovered that the DbContext has the Find method, and I understand I can now do this:
public void UpdateUser(Guid userId, String notes)
{
    using (ProjEntities entities = new ProjEntities())
    {
        User currUser = entities.Users.Find(userId);
        if (currUser == null)
            throw new Exception("User with ID " + userId + " was not found");
    }
}
Note: the 'userId' property is the primary key of the table.
I read that when using Find method entity framework checks first to see if the entity is already in the local memory, and if so - brings it from there. Otherwise - a trip is made to the database (vs. SingleOrDefault which always makes a trip to the database).
I was wondering, if I now convert all my uses of SingleOrDefault to Find, is there any potential danger?
Is there a chance I could get some old data that has not been updated if I use Find and it fetches the data from memory instead of the database?
What happens if I have the user in memory, and someone changed the user in the database - won't it be a problem if I always use now this 'memory' replica instead of always fetching the latest updated one from the database?
Is there a chance I could get some old data that has not been updated if I use Find and it fetches the data from memory instead of the database?
I think you have sort of answered your own question here. Yes, there is a chance that using Find you could end up having an entity returned that is out of sync with your database because your context has a local copy.
There isn't much more anyone can tell you without knowing more about your specific application: do you keep a context alive for a long time, or do you open it, do your updates, and close it? Obviously, the longer you keep your context around, the more susceptible you are to retrieving an out-of-date entity.
I can think of two strategies for dealing with this. The first is outlined above; open your context, do what you need and then dispose of it:
using (var ctx = new MyContext())
{
    var entity = ctx.EntitySet.Find(123);
    // Do something with your entity here...
    ctx.SaveChanges();
}
Secondly, you could retrieve the DbEntityEntry for your entity and use the GetDatabaseValues method to update it with the values from the database. Something like this:
var entity = ctx.EntitySet.Find(123);
// This could be a cached version, so ensure it is up to date.
var entry = ctx.Entry(entity);
entry.OriginalValues.SetValues(entry.GetDatabaseValues());
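Note that SetValues(GetDatabaseValues()) refreshes what EF tracks as the entity's original values. If the goal is to overwrite the in-memory entity itself with the latest database state, DbEntityEntry also exposes a Reload method; a sketch using the same hypothetical EntitySet:

```csharp
var entity = ctx.EntitySet.Find(123);

// Reload re-queries the database and overwrites the tracked entity's
// current values, discarding any stale in-memory copy.
ctx.Entry(entity).Reload();
```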

Oracle sessions stay open after closing connection

While testing a new application, we came across an issue where a stored proc sometimes takes over a minute to execute and causes a timeout. It was not one stored proc in particular; it could be any of them.
Trying to reproduce the issue, I created a small (local) test app that calls the same stored proc from different threads (code below).
Now it seems that the Oracle sessions are still there, inactive, and the CPU of the Oracle server hits 100%.
I use System.Data.OracleClient.
I'm not sure if one is related to the other, but it slows down the time needed to get an answer from the database.
for (int index = 0; index < 1000; ++index)
{
    ThreadPool.QueueUserWorkItem(GetStreet, index);
    _runningThreads++;
    WriteThreadnumber(_runningThreads);
}

private void GetStreet(object nr)
{
    const string procName = "SPCK_ISU.GETPREMISESBYSTREET";
    DataTable dataTable = null;
    var connectionstring = ConfigurationManager.ConnectionStrings["CupolaDB"].ToString();
    try
    {
        using (var connection = new OracleConnection(connectionstring))
        {
            connection.Open();
            using (var command = new OracleCommand(procName, connection))
            {
                // Fill parameters
                using (var oracleDataAdapter = new OracleDataAdapter(command))
                {
                    // Fill datatable
                }
            }
        }
    }
    finally
    {
        if (dataTable != null)
            dataTable.Dispose();
    }
}
EDIT:
I just had the DBA count the open sessions, and there are 105 sessions that stay open and inactive. After closing my application, the sessions are removed.
The problem is solved.
We hired an Oracle expert to take a look at this, and the problem was caused by some underlying stored procedures that took a while to execute and consumed a lot of CPU.
After the necessary tuning, everything runs smoothly.
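As a side note on the inactive sessions: with ADO.NET providers, a closed connection is normally returned to a client-side connection pool rather than torn down, so the corresponding Oracle session stays alive (and shows as inactive) until the process exits or the pool is cleared. That matches the 105 sessions disappearing when the application closed. Pool behaviour is controlled from the connection string; a sketch with illustrative values (keyword support varies by provider, so treat these as assumptions to verify):

```csharp
using System;

class PoolingDemo
{
    static void Main()
    {
        // Pooling keywords control how many sessions the process keeps alive.
        // These values are illustrative, not recommendations.
        string connectionstring =
            "Data Source=CupolaDB;User Id=user;Password=pass;" +
            "Pooling=true;Min Pool Size=1;Max Pool Size=20;";

        Console.WriteLine(connectionstring);
        // Some Oracle providers also expose OracleConnection.ClearAllPools()
        // and ClearPool() to close idle pooled sessions explicitly.
    }
}
```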

MVC Mini Profiler with Entity Framework: How to Get Connection

I would like to use MVC Mini Profiler with an Entity Framework connection. The way I did it is like this:
public static XXXXX.DAL.BO.XXXXXEntities GetEntityConnection()
{
    var conn = ProfiledDbConnection.Get(new EntityConnection(ConfigurationManager.ConnectionStrings["XXXXXEntities"].ConnectionString));
    return ObjectContextUtils.CreateObjectContext<XXXXX.DAL.BO.XXXXXEntities>(conn);
}
So the following line is to get the Context for the rest of the code:
XXXXX.DAL.BO.XXXXXEntities ctx = GetEntityConnection();
When I attempted to view the site in a browser, however, WebDev.WebServer40.exe crashed.
Does anyone have any idea why?
Thanks heaps.
P.S.
Previously it was
XXXXX.DAL.BO.XXXXXEntities ctx = new XXXXX.DAL.BO.XXXXXEntities();
and it worked fine.
If you are able to use the v3.0.10 NuGet package for EF6, then all you need to do to hook up Entity Framework is:
protected void Application_Start()
{
    MiniProfilerEF6.Initialize();
}
Using EF 5 or earlier (with the corresponding NuGet package) would require you to create an EFProfiledDbConnection, as Anirudh wrote in his answer:
var conn = new EFProfiledDbConnection(GetConnection(), MiniProfiler.Current);
return ObjectContextUtils.CreateObjectContext<MyModel>(conn);
Try initialising your connection like this:
connection = new EFProfiledDbConnection(
    new EntityConnection(ConfigurationManager.ConnectionStrings["XXXXXEntities"].ConnectionString),
    MiniProfiler.Current);
It works for me.
