For a given application, we would need to store about 600 KB of data in the web session per registered user who connects to our website. With about 1,000 registered users in parallel, that comes to roughly 600 MB of session data.
The reason we need so much data in the session is to avoid frequently querying a table with about 1 billion rows in the database.
I understood Heroku stores session information in the database. This is fine, as it means the session data is available across dynos (no session affinity).
Is there a more efficient way of storing information across dynos? Reading the docs, I found MemCachier.
My questions are the following:
Do you think storing that amount of session data in the database would be performant enough?
Would you suggest caching systems other than MemCachier for storing session information across different dynos?
Thanks a lot for your help !
Olivier
Heroku does not store session information at all -- how session information is stored depends entirely on your application and your application's framework, and that works the same way whether you are deployed on Heroku or any other system.
As far as what kind of storage is sensible, however: it sounds like cookie storage is right out, due to the volume of data. Database storage was the de facto default for web applications for a long time, and there's nothing wrong with it. Memcached would be faster, and exactly how much faster depends on your configuration (are you using connection pooling? does each page view hit the database for something else anyway? what is your caching setup like?). But as long as you're sure this strategy of storing so much info in session data is sound, the difference between database and memcached storage will not be great.
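To make the tradeoff concrete, here is a minimal sketch of the cache-aside pattern being discussed, in Python. A plain dict stands in for memcached, and `expensive_db_query` and the TTL value are hypothetical placeholders, not anyone's actual schema:

```python
import time

# Stand-in for a memcached client; a real client (e.g. via MemCachier)
# has the same get/set shape but lives out-of-process, shared by all dynos.
cache = {}
TTL_SECONDS = 1800  # assumed session lifetime

def expensive_db_query(user_id):
    # Placeholder for the query against the billion-row table.
    return {"user_id": user_id, "rows": ["row1", "row2"]}

def get_session_data(user_id):
    entry = cache.get(user_id)
    if entry is not None and time.time() - entry["stored_at"] < TTL_SECONDS:
        return entry["data"]                 # cache hit: no database round trip
    data = expensive_db_query(user_id)       # cache miss: hit the database once
    cache[user_id] = {"data": data, "stored_at": time.time()}
    return data
```

The point is that only the first request per user (or per expiry window) touches the big table; every later request is a cache lookup.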
Related
Hello, I need help deciding which session driver I should use for my e-commerce website. Redis? Memcached? The file driver? Or something else?
It depends on your setup
If you choose the file session driver, your session data will be saved in the app/storage/sessions folder on the server.
If you choose the database session driver, you can use the DB to keep sessions.
Otherwise, you can store the data (encrypted) in the user cookies.
Why use file driver:-
The advantage of the file driver is that it avoids putting load on your MySQL/SQL server, and file access should be faster.
Why use database driver:-
It works well if your site isn't that big (a couple of hundred uniques a day). It also gives you easy access to all the users logged in over a time period, so you can track things.
Why use Redis / Memcached driver:-
The Redis and Memcached drivers both provide high read speed, so they are your best choice when you need to access data frequently, and also if your site is very big and the data read/write frequency is high.
So choose any of them according to your need.
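For a feel of what the file driver does under the hood, here is a rough Python sketch. The directory and JSON serialization are illustrative assumptions; Laravel's real driver uses its own storage path and serialization format:

```python
import json
import os
import tempfile

# Illustrative session directory; Laravel's file driver actually writes
# under the app's storage folder with its own format.
SESSION_DIR = os.path.join(tempfile.gettempdir(), "sessions_demo")
os.makedirs(SESSION_DIR, exist_ok=True)

def save_session(session_id, data):
    # One file per session: simple, but every read/write is disk I/O,
    # and each file consumes an inode on the host.
    with open(os.path.join(SESSION_DIR, session_id), "w") as f:
        json.dump(data, f)

def load_session(session_id):
    path = os.path.join(SESSION_DIR, session_id)
    if not os.path.exists(path):
        return {}  # missing or expired session starts empty
    with open(path) as f:
        return json.load(f)
```

The Redis/Memcached drivers replace the file reads and writes with network calls to an in-memory store, which is where the speed difference comes from.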
I store data in
HttpContext.Current.Application.Add(appKey, value);
And read data by this one:
HttpContext.Current.Application[appKey];
This has the advantage for me of storing a value under a key, but after a short time (about 20 minutes) it stops working and I can no longer find [appKey], because IIS recycles the application and the data in it is lost.
I want to know: is there another way to store my data by key and value?
I do not want SQL Server, files, etc., and I want to store the data on the server, not on the client.
I store some user data in it.
Thanks for your help.
Since IIS may recycle and throw away any cache/memory contents at any time, the only way to get data persisted is to store it outside IIS. Some examples (and yes, I included the ones you stated you didn't want, just to make the list a bit more complete; feel free to skip them):
A SQL database (there are quite a few free ones if the price is prohibitive)
A NoSQL database (same thing there, quite a few free ones and usually simpler to use for key/value)
File (which you also stated you didn't want)
Some kind of external memory cache, à la AppFabric cache or memcached.
Cookies (somewhat limited in size and not secure in any way by default)
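To make the external-storage option concrete, here is a minimal key/value store sketch in Python backed by SQLite. The table name and the `put`/`get` API are made up for illustration; in ASP.NET you would reach the same shape through ADO.NET or a NoSQL client, but the idea (persist outside the recycling worker process) is the same:

```python
import sqlite3

# ":memory:" is used here only so the sketch is self-contained; in
# production you would point this at a file (or a real database server)
# so the data survives worker process recycles.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)")

def put(key, value):
    # INSERT OR REPLACE gives upsert semantics on the primary key.
    conn.execute("INSERT OR REPLACE INTO kv (key, value) VALUES (?, ?)",
                 (key, value))
    conn.commit()

def get(key):
    row = conn.execute("SELECT value FROM kv WHERE key = ?", (key,)).fetchone()
    return row[0] if row else None
```

Unlike `HttpContext.Current.Application`, nothing here disappears when the application pool recycles, because the store lives outside the process.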
You could create a persistent cookie on the user's machine so that the session doesn't expire, or increase the session timeout to a value that works better for your situation/users:
How to create persistent cookies in asp.net?
Session timeout in ASP.NET
You're talking about persisting data beyond the scope of a session. So you're going to have to use some form of persistent storage (Database, File, Caching Server).
Have you considered using AppFabric? It's actually pretty easy to implement. You could either access it directly from your code using the NuGet packages, or just configure it as a session store. (I think) doing the latter would get rid of the session timeout issue.
Do you understand that whatever you decide to store in Application will be available to all users of your application?
Now, regarding your actual question: what kind of data do you plan on storing? If it's user-sensitive data, then it probably makes sense to store it in the session. If it's client-specific and doesn't contain any sensitive information, then cookies are probably a reasonable way forward.
If it is indeed application-wide data that must be the same for every user of your application, then you can make configuration changes to ensure it doesn't expire after 20 minutes.
It seems some web architects aim to have a stateless web application. Does that mean basically not storing user sessions? Or is there more to it?
If it is just the user session storing, what is the benefit of not doing that?
Reduces memory usage. Imagine if Google stored session information about every one of their users.
Easier to support server farms. If you need session data and you have more than 1 server, you need a way to sync that session data across servers. Normally this is done using a database.
Reduce session expiration problems. Sometimes expiring sessions cause issues that are hard to find and test for. Sessionless applications don't suffer from these.
Url linkability. Some sites store the ID of what the user is looking at in the sessions. This makes it impossible for users to simply copy and paste the URL or send it to friends.
NOTE: session data is really cached data. This is what it should be used for. If you have an expensive query which is going to be reused, then save it into session. Just remember that you cannot assume it will be there when you try and get it later. Always check if it exists before retrieving.
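The check-before-retrieving advice in the note can be sketched like this, with a dict standing in for the framework's session object and a placeholder query function:

```python
session = {}  # stand-in for the framework's per-user session object

def run_expensive_query():
    # Placeholder for a query worth caching in the session.
    return [10, 20, 30]

def get_report_rows():
    # Never assume the cached value is still there: the session may have
    # expired or been evicted between requests, so check before reading.
    if "report_rows" not in session:
        session["report_rows"] = run_expensive_query()
    return session["report_rows"]
```

The query runs only when the cached value is absent; if the session is wiped, the next call transparently recomputes it instead of failing.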
From a developer's perspective, statelessness can help make an application more maintainable and easier to work with. If I know a website I'm working on is stateless, I need not worry about things being correctly initialized in the session before loading a particular page.
From a user's perspective, statelessness allows resources to be linkable. If a page is stateless, then when I link a friend to that page, I know that they'll see what I'm seeing.
From the scaling and performance perspective, see tster's answer.
We're running our blog from a shared hosting account. Our host limits the allowed inodes/number of files on the hosting account to 150,000. We've implemented our own caching mechanism that caches all pages in full as soon as they are accessed, so that subsequent requests are served from the cache. Unfortunately, the inode limit won't allow us to store more pages for much longer.
Fortunately we have SQLite on our server. We have MySQL too, but our shared hosting account only allows a maximum of 25 concurrent connections from the Apache web server to the MySQL server. That's a major constraint! SQLite is said to be "serverless", so I believe it won't have that kind of limitation.
With that, should I (and can I) use an SQLite table to store the full cached pages of all dynamic pages of our blog? The average cached page size is around 125 KB, and I have around 125,000 cached pages and growing.
Will that introduce any kind of bottleneck that slows down cache page delivery out of the SQLite database file?
Will I be able to write more pages to the cache_table in the SQLite database while simultaneously delivering requested pages from the cache_table to site visitors?
It's not a good idea, because SQLite usage may impact your website's performance (at least on response time).
I recommend using Memcached or a NoSQL DB instead, and treating SQLite as a last resort (test for any rise in response time).
But if you have no other choice, SQLite will be better than MySQL here, since its SELECT operations are faster.
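On the concurrency question specifically, SQLite's WAL journal mode lets readers keep serving pages while a single writer commits. A minimal page-cache sketch in Python; the `cache_pages` schema is illustrative, and an in-memory database is used only to keep the sketch self-contained (use a file path in production, where the WAL pragma actually takes effect):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# With a file-backed database, WAL mode allows concurrent readers
# alongside one writer, which covers the write-while-serving case.
conn.execute("PRAGMA journal_mode=WAL")
conn.execute("CREATE TABLE cache_pages (url TEXT PRIMARY KEY, html TEXT)")

def store_page(url, html):
    # Upsert so regenerating a page replaces its cached copy.
    conn.execute("INSERT OR REPLACE INTO cache_pages (url, html) VALUES (?, ?)",
                 (url, html))
    conn.commit()

def fetch_page(url):
    row = conn.execute("SELECT html FROM cache_pages WHERE url = ?",
                       (url,)).fetchone()
    return row[0] if row else None
```

A single database file also holds any number of cached pages while consuming only a handful of inodes, which sidesteps the 150,000-inode limit.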
I haven't calculated that, because there has never been a need to measure maximum page generation time. I serve all pages statically in full, and that has always been a stable process without any trouble.
Our server load varies from 400 to 8000 page requests in an hour.
I have a web role with some data stored in Session. The data is a few dozen small variables (strings) and one or two big objects (a few megabytes). I need to run this web role in multiple instances. Since two requests from a single user can go to different instances, Session becomes useless. So I am looking for the most efficient and simplest method of storing volatile user data for this case. I know I can store it in cookies on the client side, but that will fail for the big objects. I also know I can store the data in Azure storage, but that seems more complicated than Session. Can anybody suggest a method that is both efficient and simple, like session state? Or perhaps a workaround to get session state working correctly when multiple instances are enabled?
This may help
http://social.msdn.microsoft.com/Forums/en-US/windowsazure/thread/7ddc0ca8-0cc5-4549-b44e-5b8c39570896
You need to use a session state store other than in-process memory. In Azure you can use the Cache service, table storage, or SQL Server to share session data between instances.
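Whichever backend you pick, the shape is the same: serialize the session data and key it by session ID so any instance can load it. A hedged sketch in Python, with a plain dict standing in for the shared out-of-process store (Cache, table storage, or SQL):

```python
import json

# Stand-in for a shared out-of-process store; in Azure this would be
# the Cache service, a storage table, or a SQL table keyed by session ID.
shared_store = {}

def save_session(session_id, data):
    # Serialize once so the big objects travel as a single blob,
    # regardless of which instance wrote them.
    shared_store[session_id] = json.dumps(data)

def load_session(session_id):
    blob = shared_store.get(session_id)
    return json.loads(blob) if blob is not None else {}
```

Because every instance reads and writes the same keyed store, it no longer matters which instance a given request lands on.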