I need to implement like/dislike functionality (for anonymous users, so there is no need to sign up). The problem is that content is served by Varnish and I need to display the actual number of likes.
I'm wondering how it's done on websites like Stack Overflow. Assuming pages are cached in Varnish (for anonymous users only), every time a user votes on an answer/question, the page needs to be purged from the cache. Am I right? The current number of votes needs to be visible to other users.
What is a good approach in this situation? Should I send a PURGE to Varnish every time a user hits the "like" button?
A common way of implementing this is to render the like button and its counter client side in JavaScript instead. This largely sidesteps the issue: the cached page never contains the count, so it never needs purging, and the count is fetched with a separate, uncached (or briefly cached) AJAX call.
Assuming that pressing Like leads to a POST request hitting a single Varnish server, you can have the object invalidated or replaced in several ways. Using a purge plus a VCL restart is most likely the best way to do this.
Of course there is a slight race here: other clients will be served the old page while this is ongoing.
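For illustration, here is a minimal sketch of the purge step, assuming a Node.js backend and a Varnish instance on localhost:6081 whose VCL allows PURGE from this host (the paths and the vote endpoint are made up):

const http = require('http');

// After storing the vote, ask Varnish to drop its cached copy of the page.
function purgePage(path) {
    const req = http.request({
        host: 'localhost',
        port: 6081,          // Varnish's listen port
        method: 'PURGE',     // the VCL must be set up to accept PURGE from us
        path: path,
    }, function(res) {
        console.log('PURGE ' + path + ' -> ' + res.statusCode);
    });
    req.on('error', function(err) {
        console.error('Purge failed: ' + err.message);
    });
    req.end();
}

// e.g. in the handler for POST /answers/42/like, once the vote is stored:
purgePage('/questions/123');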
Building a web application.
Users have access through their browser to shared resources hosted on a server. However, if UserA is already using Resource1, Resource1 should not be available to UserB until UserA releases Resource1, or until a given amount of time has passed.
For this part, I chose to use a MySQL table with a list of tuples (resource, currentuser) and run a cron task to delete expired tuples.
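Roughly like this, sketched in Node.js with the mysql package (table and column names are just illustrative):

const mysql = require('mysql');
const db = mysql.createConnection({ /* host, user, password, database */ });

// Try to take the lock: the INSERT succeeds only if no row exists for this
// resource, which a UNIQUE index on `resource` guarantees.
function acquireLock(resource, user, callback) {
    db.query(
        'INSERT INTO locks (resource, currentuser, expires_at) ' +
        'VALUES (?, ?, NOW() + INTERVAL 10 MINUTE)',
        [resource, user],
        function(err) {
            if (err && err.code === 'ER_DUP_ENTRY') return callback(null, false); // already locked
            if (err) return callback(err);
            callback(null, true); // lock acquired
        }
    );
}

// The cron task then simply runs: DELETE FROM locks WHERE expires_at < NOW()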
Now I want to be able to notify UserA that UserB wants to access Resource1, and if there is no answer from UserA, UserA loses his lock on Resource1 and the resource becomes available to UserB.
For this part, I guess I have to use AJAX. I have thought about the following solution:
The user's browser makes a periodic AJAX call (say, every minute) to prove he is still alive, and upon a call, if another user has requested the same resource, he has to answer a server challenge (for example a captcha) within a given amount of time. If the challenge fails, it means the user is not there anymore (maybe he left his browser open or the page unfocused).
The tricky part is: "he has to answer a server challenge (for example a captcha) within a given amount of time". How do I do that?
Am I following the best path?
Yes, what you've outlined is fine. Using AJAX is also completely fine, especially if you're simply polling every minute.
For example, let's say you have the following:
setInterval(function() {
    // Poll the server every minute; this doubles as the "I'm still alive" ping.
    $.get('/resource/status', function(response) {
        if (response.data.newRequest) {
            // This would signal a new request to the resource:
            // show the challenge (e.g. a captcha) here.
        }
    });
}, 60000);
When handling a new request to access the resource, you could use something like reCaptcha and display it however is appropriate (overlay or inline). When you do this, you could also start a timer to determine whether the allotted time has been exceeded. If it has, you can make another AJAX request and revoke this person's access to the resource, or handle it however you want.
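For instance, a rough sketch of that timer (the endpoints and the showCaptcha/hideCaptcha helpers are hypothetical):

function startChallenge() {
    showCaptcha(); // hypothetical helper that renders the captcha overlay

    // Give the user 60 seconds to complete the challenge.
    var deadline = setTimeout(function() {
        hideCaptcha();
        // Time is up: tell the server to revoke this user's lock.
        $.post('/resource/revoke');
    }, 60000);

    // Call this from the captcha's success callback.
    window.onChallengePassed = function() {
        clearTimeout(deadline);
        hideCaptcha();
        $.post('/resource/keepalive'); // confirm the user is still there
    };
}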
I would use WebSockets to keep track of all the users that need the resource.
This way you will know who is connected and using the resource, and when he finishes using it you can hand the resource to the next user, and so on.
(This way you can also tell each user an estimate of how long it will take to get the resource, and show a progress bar.)
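A bare-bones sketch of the server side, using the ws package for Node.js (the queueing logic and the message format are just my assumptions):

const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

let holder = null;   // socket currently holding the resource
const queue = [];    // sockets waiting for it

function grantNext() {
    holder = queue.shift() || null;
    if (holder) holder.send(JSON.stringify({ type: 'granted' }));
    // Tell everyone else their position, e.g. to drive a progress bar.
    queue.forEach(function(ws, i) {
        ws.send(JSON.stringify({ type: 'position', position: i + 1 }));
    });
}

wss.on('connection', function(ws) {
    queue.push(ws);
    if (!holder) grantNext();

    // A client sends "release" when it is done with the resource.
    ws.on('message', function(msg) {
        if (msg.toString() === 'release' && ws === holder) grantNext();
    });
    ws.on('close', function() {
        if (ws === holder) grantNext();
        else queue.splice(queue.indexOf(ws), 1);
    });
});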
I think there are two problems here.
How to notify users that the resource becomes available?
Periodic AJAX requests might be okay, but you can also consider long polling or WebSockets to notify waiting users in near real time.
How to find out whether the resource is still being used?
If you want to catch the moment when a human user is not doing anything on the page, you can track mouse movement, clicks, or key presses. If nothing has happened for the last n minutes, the page can be considered inactive.
If you want to make sure the page is not being exploited by automated software, you can ask the user to fill in a captcha every n minutes while the resource is in use.
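A minimal sketch of the inactivity tracking (n = 5 minutes here; the heartbeat endpoint and thresholds are arbitrary):

var lastActivity = Date.now();

// Any human input counts as activity.
['mousemove', 'mousedown', 'keydown', 'scroll'].forEach(function(evt) {
    document.addEventListener(evt, function() {
        lastActivity = Date.now();
    });
});

// Once a minute, tell the server whether the page is still active
// (5 minutes of silence counts as inactive here).
setInterval(function() {
    var idle = Date.now() - lastActivity > 5 * 60 * 1000;
    $.post('/resource/heartbeat', { active: !idle });
}, 60000);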
I'm developing an SPA and find myself needing to fire off several (5-10+) AJAX calls when loading some sections. With web2py, it seems that many of them wait until others are done or nearly done before returning any data.
Here's an example of some of Chrome's timeline output, where green signifies time spent waiting, gray time stalled, transparent time queued, and blue time actually receiving the content.
These are all requests that go through web2py controllers, and most just do a simple operation (usually a database query). Anything that accesses a static resource seems to have no trouble being processed quickly.
For the record, I'm using sessions in cookies, since I did read about how file-based sessions force web2py into similar behavior. I'm also calling session.forget() at the top of any controller that doesn't modify the session.
I know that I can, and intend to, optimize this by reducing the number of AJAX calls, but I find this behavior strange and undesirable regardless. Is there anything else that can be done to improve the situation?
If you are using cookie-based sessions, then requests are not serialized. However, note that browsers limit the number of concurrent connections to the same host (in Chrome, typically six). Looking at the timeline output, it does look like groups of requests are indeed made concurrently, but Chrome will not make all 21 requests concurrently.
If you can't reduce the number of requests but must make them all concurrently, you could look into domain sharding or configuring your web server to use HTTP/2.
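If reducing the number of requests is an option, one common pattern is to batch them: expose one endpoint that returns everything a section needs instead of 5-10 endpoints returning one piece each. A sketch, with made-up endpoint names and render helpers:

// Before: one round trip per widget, all competing for the connection pool.
$.get('/api/user', renderUser);
$.get('/api/notifications', renderNotifications);
$.get('/api/stats', renderStats);

// After: a single round trip returning all three payloads at once.
$.get('/api/section/dashboard', function(response) {
    renderUser(response.user);
    renderNotifications(response.notifications);
    renderStats(response.stats);
});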
As an aside, in web2py, if you are using file based sessions and want to unlock the session file within a given request in order to prevent serialization of requests, you must use session.forget(response) rather than just session.forget() (the latter prevents the session from being saved even if it has been changed, but it does not immediately unlock the file). In any case, there is no session file to unlock if you are using cookie based sessions.
I have a Django server. The server serves a webpage with almost all static content, but a few numbers must be loaded from the database.
I'm thinking about performance versus price. I can host my Django server on a fast machine and render the page using Django templates, or I can host the server on a slower machine, make a static page that loads the few numbers via AJAX, and host that page cheaply somewhere else, like github.io.
The latter choice would have most of the page loading quickly and cheaply.
I was wondering: what are the trade-offs?
Whichever server you decide to hire, you should always think about reducing server load - no matter how fast your server is. By reducing server load I mean: only make your server do what is really required at the moment.
Let's learn something from the big players - Facebook, for instance.
You log into your account and you see that you've got 5 notifications and 3 new messages plus a couple of photos and highly interesting statuses of your friends. Cool! You now click on the notifications icon to find out if that hot girl (forgive me if you're a girl :D) has added you to her friends list or not. As you click a big white <div> pops up AND you see nothing but a loading gif! The notifications do appear, but after a couple of seconds. Try doing it with a slow internet connection, and you get to adore the beauty of the loading gif for a lot more time.
So, what do you make of it?
Facebook only made its server count the number of notifications and new messages, and displayed those numbers to you, thus reducing server load. It only displayed the notifications to you when you wanted to see them. And to load the notifications, all it took was a minimal AJAX call in which only around 10 KB of data was transferred!
Facebook does it all the time and everywhere. Consider this: Robert Downey Jr. posts a photo of himself on his Facebook page. A little while later, you see that it has got 10k+ comments. You decide to read them and click the comments button. An attractive loading gif pops up again for a little while and is soon replaced by comments. But hey, only 10 comments were loaded. What the ... Oh wait! That's how Facebook reduces its server load - read those 10 comments first, if you want to read more, send a request again.
Twitter does it too - the infinite scroll.
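The pattern is simple to reproduce: fetch a small first page, then fetch the next slice only when the user asks for it. A sketch, with a made-up endpoint and render helper:

var offset = 0;
var PAGE_SIZE = 10;

// Fetch the next 10 comments; call this on page load and again on
// every "load more" click (or when the scroll nears the bottom).
function loadMoreComments(postId) {
    $.get('/posts/' + postId + '/comments',
        { offset: offset, limit: PAGE_SIZE },
        function(comments) {
            comments.forEach(appendComment); // hypothetical render helper
            offset += comments.length;
        });
}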
Icing on the cake
This approach benefits you in two ways:
It reduces server load - fewer chances of crashing a website.
It decreases your website's page-load time, since you'll only be passing the data required at that moment, thus making your website faster. (Yes, it can outrun Flash, too!)
Food for thought
If you've got some cool technologies around such as AJAX, why not use it? Your server is not a donkey, for God's sake!
P.S. By Facebook and Twitter, I mean the engineers behind them.
Well, it would depend on the following:
A. Do you want to display that number on page load itself, or only when the user clicks to see it?
If you want to show the numbers at page load itself, then it is preferable to include them in the template response itself.
Why would you want your site visitors to wait for those numbers to populate (if the intention is to display them anyway)?
If they are to be displayed only on the user's click, then AJAX should be preferred.
B. How much time is this query going to take, and can it be optimized?
If the query takes a lot of time, the first effort should be to optimize it to be as fast as possible.
If the query can return its result in minimal time, then it is futile to make another request to the server via AJAX.
But if you know the query will take a long time, then AJAX is fine - as sketched below.
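If you do go the AJAX route, the client side stays tiny - something like this, with made-up element ids and a hypothetical /api/numbers endpoint on the Django server:

// Fetch the few database-driven numbers after the static page has rendered.
document.getElementById('show-numbers').addEventListener('click', function() {
    fetch('https://django.example.com/api/numbers') // hypothetical endpoint
        .then(function(res) { return res.json(); })
        .then(function(numbers) {
            document.getElementById('numbers').textContent = numbers.join(', ');
        });
});

Note that if the static page lives on github.io while the API lives on your Django host, the endpoint will need to send CORS headers for this to work.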
I run a Symfony 1.4 project with a very large amount of data. The main page and category pages use pagers which need to know how many rows are available. I'm passing a query containing joins to the pager, which leads to a loading time of 1 minute on these pages.
I configured cache.yml for the respective actions. But I think the workaround is insufficient and here are my assumptions:
Symfony rebuilds the cache within a single request which is made by a user. Let's call this user "cache-victim" to simplify things.
In our case, the data needs to be up to date - a lifetime of 10 minutes would be sufficient. Obviously, the cache won't be rebuilt if no user is willing to be the "cache-victim" and instead just cancels the request. Are these assumptions correct?
So, I came up with this idea:
Symfony should fake the HTTP request itself to rebuild the cache. The new cache entries should be written to a temporary file/directory and swapped with the previous cache entries as soon as cache rebuilding has finished.
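To illustrate, here is roughly what I mean, sketched as an external warm-up job in Node.js (the host and paths are placeholders; the atomic swap of cache entries would still need support inside Symfony):

const http = require('http');

// Every 10 minutes, request the expensive pages ourselves, so that a real
// visitor never has to be the one who triggers the rebuild.
const pages = ['/', '/category/1', '/category/2']; // placeholder paths

setInterval(function() {
    pages.forEach(function(path) {
        http.get({ host: 'www.example.com', path: path }, function(res) {
            res.resume(); // drain the response; we only want the side effect
        });
    });
}, 10 * 60 * 1000);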
Is this possible?
In my opinion, this is similar to the concept of double buffering.
Wouldn't it be silly, if there was a single "gpu-victim" in a multiplayer game who sees the screen building up line by line? (This is a lop-sided comparison, I know ... ;) )
Edit
There is no single "cache-victim" - every 10 minutes, page loading takes 1 minute for every user.
I think your problem is due to some missing or wrong indexes. I have a sf1.4 project for a large soccer site (about 2M pages/day) and the pagers aren't that slow, even though our database has more than 1M rows these days. Take a look at your query with EXPLAIN and check where it is going bad...
Sorry for necromancing (is there a badge for that?).
By configuring cache.yml you are just caching the view layer of your app (that is, CSS, JS, and HTML) for REQUESTS WITHOUT PARAMETERS. Navigating the pager obviously puts a ?page=X on the GET request.
Taken from symfony 1.4 config.yml documentation:
An incoming request with GET parameters in the query string or submitted with the POST, PUT, or DELETE method will never be cached by symfony, regardless of the configuration.
http://www.symfony-project.org/reference/1_4/en/09-Cache
What might help you is to cache the database results, but it's a painful process on symfony/doctrine. Refer to:
http://www.symfony-project.org/more-with-symfony/1_4/en/08-Advanced-Doctrine-Usage#chapter_08_using_doctrine_result_caching
Edit:
This might help you as well:
http://www.zalas.eu/symfony-meets-apc-alternative-php-cache
To improve performance, I'd like to add a fairly long Cache-Control (up to 30 minutes) to each page, since they do not change often. However, each page also displays the name of the user logged in (like this website).
The problem is when the user logs in or logs out: the user name must change. How can I change the user name after each login/logout action while keeping a long Cache-Control?
Here are the solutions I can think of:
An AJAX request (not cached) to retrieve and display the user name (see the sketch below). If I have 2 requests (/user?registered and /user?new), they could be cached as well. But I am afraid this extra request would nullify my caching performance-wise.
Add a unique URL variable (?time=) to make the URL different and bypass the cache. However, I would have to add this variable to all links on my webpage, which is not very convenient code-wise.
This problem becomes greater if I actually have more content that differs between registered users and new users.
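For the first solution, I imagine something like this (the endpoint name is invented):

// The page itself is cached for everyone; only this small call is per-user.
fetch('/user/name', { credentials: 'same-origin' }) // uncached endpoint
    .then(function(res) { return res.json(); })
    .then(function(user) {
        document.getElementById('username').textContent = user.name || 'Sign in';
    });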
Cache-Control: private
is usually enough in practice. It's what SO uses.
In theory, if you needed to allow for the case of variable logins from the same client you should probably set Vary on Cookie (assuming that's the mechanism you're using for login). However, this value of Vary (along with most others) messes up IE's caching completely so it's generally avoided. Also, it's often desirable to allow the user to step through the back/forward list including logged-in/out pages without having to re-fetch.
For situations where enforcing proper logged-in-ness for every page is critical (such as banking), a full Cache-Control: no-cache is typically used instead.
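For example, with an Express-style Node server (my assumption; the same header works on any stack):

const express = require('express');
const app = express();

app.get('/questions/:id', function(req, res) {
    // "private" tells shared caches (a proxy, a CDN) not to store the page,
    // while the user's own browser may keep it for up to 30 minutes.
    res.set('Cache-Control', 'private, max-age=1800');
    res.send(renderPage(req.params.id)); // hypothetical page renderer
});

app.listen(3000);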