I have a page on my website that makes AJAX GET requests when a user clicks a button; for example, the URL to be fetched will look like:
/php/getData.php?field1=val1&field2=val2
The value returned by getData.php for these two values will not change (at least for a few months), so how can I implement cache control in my .htaccess file to tell the browser to cache the result for a certain amount of time?
For example, I tell the browser to cache JS and CSS files in the following way:
<FilesMatch "\.(css|js)$">
Header set Cache-Control "max-age=3024000, must-revalidate"
</FilesMatch>
^ this sets the Cache-Control max-age to 3024000 seconds (35 days).
Any help would be much appreciated.
Thanks
Unfortunately, there is no container in Apache (such as <Files> or <FilesMatch>) that matches against a query string; those match only filenames and directories.
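One possible workaround (a rough sketch, not tested against your setup): mod_rewrite can inspect the query string, so you can flag matching requests with an environment variable and let mod_headers key off that flag. The field1=val1&field2=val2 pattern below is just your example query string:

<IfModule mod_rewrite.c>
RewriteEngine On
# Flag requests for getData.php carrying this exact query string
RewriteCond %{QUERY_STRING} ^field1=val1&field2=val2$
RewriteRule ^php/getData\.php$ - [E=CACHEABLE:1]
</IfModule>
# Send the caching header only when the flag was set above
Header set Cache-Control "max-age=3024000, must-revalidate" env=CACHEABLE

Alternatively, since getData.php generates the response anyway, it could simply send the Cache-Control header itself.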
So I'm trying to resolve my site's bad "serve scaled images" grade - see here: http://cl.ly/image/1A430t0k1r0s
I'm using a responsive site powered by Wordpress, with one image in my homepage's full-width slider (so the image needs to be large). How can I fix this score?
url: http://cl.ly/1n162x1K3O15
You need to use GZip compression and browser caching to resolve this problem. The code below sets up simple browser caching with mod_expires, but you can check it out in detail here
<FilesMatch "\.(ico|jpg|jpeg|png|gif|js|css|swf|svg|pdf|flv|mp3)$">
<IfModule mod_expires.c>
ExpiresActive on
ExpiresDefault "access plus 1 month 2 days 3 hours" //example you can change it
Header set Cache-Control "public"
</IfModule>
</FilesMatch>
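For the GZip part itself, a minimal sketch using mod_deflate (assuming that module is enabled on your server):

<IfModule mod_deflate.c>
# Compress common text-based content types before sending them
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript application/json image/svg+xml
</IfModule>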
Go for adaptive images if this still doesn't work.
I am building an AJAX website and it needs a percentage loader for the new pages. New pages are AJAX-loaded, cached and slid in with an animation. In order to do that, I need to somehow fetch the size of the content I am going to receive during the current request, so I can calculate the percentage. Unfortunately, the server does not set the Content-Length header for pages, just for files. Can I force it to always add the header? Is there any other way to create this percentage loader? Any ideas will be helpful. Thanks!
I am working with a Wordpress theme that is driven on AJAX. A nice way to load content, but not so much for SEO purposes.
The URLs all end with the 'same' string, for example #menu-item-44 (the only difference being the number at the end).
Because the site is AJAX-driven, I cannot make use of Wordpress' permalink structure, so my question really is: can I fix this with a rewrite in my .htaccess file?
For example: www.somesite.com/#menu-item-44 becomes www.somesite.com/contact
Your help will be much appreciated!
Thanks
You can't rewrite the # part of a URL, as it never gets sent to the server.
Look into JavaScript's history.pushState() and Google's AJAX crawling scheme using hash bangs (#!).
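If you go the hash-bang route, crawlers that support the scheme re-request your pages with an _escaped_fragment_ query parameter instead of the #! part, and that part you can rewrite in .htaccess. A rough sketch (the /snapshots/ directory is a hypothetical location for pre-rendered HTML):

<IfModule mod_rewrite.c>
RewriteEngine On
# Map ?_escaped_fragment_=menu-item-44 to a pre-rendered snapshot
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
RewriteRule ^$ /snapshots/%1.html [L]
</IfModule>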
I track traffic to the domains that I own through a "home brew" cookie system. On one site, I have noticed that I get a lot of HTTP referer traffic from one particular domain (http://www.getbig.com/). I did some sleuthing and found out what this person has done. This person has attempted to use my site's logo as their avatar on a forum. However, instead of linking to the image in the "img" tag:
<img src="http://www.example.com/image.jpg" width="" height="" alt="" border ="" />
they have linked to my main domain name:
<img src="http://www.example.com/" width="" height="" alt="" border ="" />
Every single time a page is loaded where this person has posted in this forum, a new hit gets registered. This is artificially inflating my visitor statistics, and I would like to stop it. If they had simply linked to the image, I could just change the image name, but they have linked to the site itself and I am not sure what to do. Aside from sending them a "cease and desist", what technical options do I have?
The principle is called hotlinking – or at least it is when done correctly, as you pointed out. There are a few solutions to "stop" it from happening.
The most common one is to serve a different page or image instead of the expected one. Apache's mod_rewrite (or similar) allows you to rewrite URLs based on particular criteria, such as the referer header in this case. You will need to be at least allowed to create your own .htaccess file. There are tools to help generate the .htaccess content.
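A minimal sketch of that approach, assuming you want to answer hotlinked requests for your front page with an explanatory image (the /images/no-hotlinking.png path is hypothetical):

<IfModule mod_rewrite.c>
RewriteEngine On
# Requests for the site root that were referred by the offending forum
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?getbig\.com/ [NC]
RewriteRule ^$ /images/no-hotlinking.png [L]
</IfModule>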
A less informative way to do this is to deny access via an environment variable: first check the referer header with SetEnvIf, then deny access based on it. This only returns an HTTP 403 response code.
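A sketch of that variant, using Apache 2.2-style access control:

# Flag requests referred by the offending forum, then deny them with a 403
SetEnvIf Referer "^https?://(www\.)?getbig\.com" bad_referer
Order Allow,Deny
Allow from all
Deny from env=bad_referer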
If you don't have this sort of access, you could read the referer header at the application level and make the decision there. This might only be a partial solution, depending on how the content is delivered (i.e. whether the request is handled by the web server alone or by an additional application layer such as PHP).
Contact the user in question. This is less scalable and doesn't stop them if they don't agree with your kind request.
The first three are solutions to stop hotlinking in general. They can be adjusted to match only a particular referer.
In this particular case, I doubt any of these will have a significant effect unless you provide a picture in response. Since the URI doesn't contain an actual image name, only the protocol and domain name, browsers opening the page are unlikely to show anything relevant for the img tag at the moment. Providing different content won't change this situation unless it's an image. Serving an image explaining why you don't allow hotlinking (even if they request the main page) would probably have a bigger impact on the user.
It is difficult to assess how your statistics will be affected by these solutions. Assuming they are collected on the main page, they could bring the data back to normal as that page won't be served anymore. If they're based on the access logs, it might be a different story.
What I would recommend is checking the Referer and, if it is coming from http://www.getbig.com/, serving the absolute filthiest image you can find on the internet instead of your website.
It's much, much easier to just send them an email, though (this is my actual advice).
I'm still not sure how it works (but that's not the point :D). As far as I've noticed, almost the whole content is in the iframe and the chat window is outside the iframe. Requests are probably made via AJAX, and the URLs change like this: const_part_of_url#something - so only the URL anchors (or whatever they're called) change.
Two things are bothering me:
What about Googlebot: is it able to index those pages correctly (not Gmail, but say some web page using similar "technology")? First, because of the iframe; second, because only the anchors change in the URLs?
Is it possible to make some other part of the URL change, not only the anchors?
The thing is, I have an mp3 search engine where you can listen to these mp3s too, and this kind of floating, "not-reloading" player with a playlist would be kinda cool :D But I'm very concerned about proper page indexing and other SEO blah blah... so I don't really know if it's worth trying :D
Cheers
You can detect robots and not feed them the user-eyes-only content...
Edit: you can also load it on demand (JavaScript)... bots won't load it.
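If you'd rather do the detection server-side, a rough .htaccess sketch (the /static/player.html target and the user-agent list are hypothetical, and user-agent sniffing is fragile):

<IfModule mod_rewrite.c>
RewriteEngine On
# Send known crawlers to a plain, crawlable version of the player page
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
RewriteRule ^player$ /static/player.html [L]
</IfModule>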