Why is my Symfony site so slow? Can someone decipher this Pingdom report? - performance

My site, 4603prospect.com, is really slow. Sometimes it's OK, sometimes it's slow. I am caching thumbnails, but I don't understand what to make of this Pingdom report: http://tools.pingdom.com/?url=4603prospect.com&treeview=0&column=objectID&order=1&type=0&save=false
Thanks
Todd

Things that might help to check:
Are you using shared hosting? Your server may be responding slowly because of activity from other websites on the same server.
Are your images falling foul of an .htaccess rule that results in redirects?
How many images are you loading at once? Most browsers only make a limited number of concurrent HTTP requests per host (roughly six to eight). You could try combining some of your CSS and JS files to cut down the number of requests.
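As an illustration of that last point, here is a minimal PHP sketch of serving several stylesheets as one bundle. The file names are hypothetical, and in practice a build step or plugin that writes a single static file beats concatenating on every request:

```php
<?php
// combine.css.php -- concatenate several stylesheets into one response
// so the browser makes a single request instead of many.
// The file list is hypothetical; adjust it to your project.
$files = ['css/reset.css', 'css/layout.css', 'css/gallery.css'];

header('Content-Type: text/css');
header('Cache-Control: public, max-age=86400'); // let browsers keep the bundle for a day

foreach ($files as $file) {
    if (is_file($file)) {
        echo file_get_contents($file), "\n";
    }
}
```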

Related

Fixing slow response time for resources

I have a Magento website, and I have been noticing an increase in warnings from Catchpoint that various images, CSS files, and JavaScript files are taking longer than usual to load. We use Edgecast for our CDN and have all images, CSS, and JS files hosted there. I have been in contact with them, and they determined that the delays happen when the cache for a resource has expired and the CDN must contact the origin for an updated file.

The problem is that I can't figure out why it would take longer than a second to return a small image file. If I load the offending image off our server (not from the CDN) in my browser, it always returns quickly. I assume that if you call up an image file directly using its full URL (say a product image, for example), that bypasses any Magento logic or database access and simply returns the image to you. This should happen quickly, and it normally does, but sometimes it doesn't.
We have a number of things in play that may have an effect. There are API calls to the server for various integrations, though they are directed at a secondary server and not the web frontend. We may also have a large number of stale images since Magento doesn't delete any images even if you replace them or delete the product.
I realize this is a fairly open-ended question, and I'm sorry if it breaks SO protocol, but I'm grasping at straws here. If anyone has any ideas on where to look, or what could cause small resource files like images to take upwards of 8 seconds to load, I'm all ears. We're an eCommerce site, it's getting close to peak season, and I can feel the hot breath of management on my neck. Any help would be greatly appreciated.
Thanks!
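One way to narrow something like this down is to time the same image fetched directly from the origin and through the CDN, repeatedly, and see whether the slow responses line up with CDN cache expiry. A rough PHP/curl sketch; both URLs below are placeholders:

```php
<?php
// Time a full GET of the same image from the origin and from the CDN.
// Both URLs are placeholders -- substitute your real hosts and paths.
function timeFetch(string $url): float
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $seconds = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    curl_close($ch);
    return $seconds;
}

printf("origin: %.3fs\n", timeFetch('https://origin.example.com/media/catalog/product/image.jpg'));
printf("cdn:    %.3fs\n", timeFetch('https://cdn.example.com/media/catalog/product/image.jpg'));
```

Run from cron every few minutes, this would show whether the multi-second fetches only happen when the CDN has to revalidate against the origin.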
Turns out we had stumbled upon some problems with the CDN that they were somewhat aware of but not quick to admit. They made some changes to our account to work around the issues, and things are much better now.

Over 6 seconds to receive data from server. Screenshot included, can't find why. WordPress, W3TC, Cloudflare, etc.

I am using Cloudflare, W3 Total Cache, and Amazon S3, with database, page, object, and browser caching enabled via W3TC.
What causes the delay in receiving data from the server, as seen here? This is incredible; I feel like it might have been faster without W3TC installed.
Screenshot of headers
A couple of thoughts.
It looks to me like there are two 404s on the home page, so probably good to get them fixed:
http://cdn.thedigitalhippies.com/wp-content/themes/sahifa/fonts/BebasNeue-webfont.woff
http://cdn.thedigitalhippies.com/wp-content/themes/sahifa/fonts/BebasNeue-webfont.eot?
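To confirm those 404s (and catch any new ones), here is a quick status check with PHP's curl, just as a sketch:

```php
<?php
// Report the HTTP status of each asset URL; anything >= 400 needs fixing.
$urls = [
    'http://cdn.thedigitalhippies.com/wp-content/themes/sahifa/fonts/BebasNeue-webfont.woff',
    'http://cdn.thedigitalhippies.com/wp-content/themes/sahifa/fonts/BebasNeue-webfont.eot',
];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // a HEAD request is enough
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    printf("%d  %s\n", $code, $url);
}
```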
There are 20-odd CSS files on the page and plenty of JS files. I think you would see a good improvement if you combined and minified these into single CSS and JS files with something like http://wordpress.org/extend/plugins/wp-minify/
Hope this helps!

Can a site detect and block another for requesting a picture many times?

I'm depending on my users uploading image links, not image files. So my site actually requests these images from the external sites: the stored links come out of the database and go into an element's src attribute.
Could those sites detect that they are receiving lots of requests (no idea if that's the right term) and block you from making more?
If they can, how many requests would go unnoticed? Also, are there policies like having to add a link back to the source, or something similar?
Sites can detect and block your IP address, especially if you are sending a lot of requests that could slow down their site and cause performance issues on their end. Having users post links shouldn't be any issue though.
Short answer: yes, hotlinking to files such as images can be detected and blocked. How many times can you do it? That depends on the limits set by the website hosting the files you are hotlinking; there isn't a definitive answer that holds for every server.
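For context on how that blocking is usually done: many hosts compare the Referer header against their own domain and refuse requests coming from foreign pages. A minimal PHP sketch of the idea (the domain and file path are hypothetical):

```php
<?php
// Serve an image only if the request has no Referer (a direct load)
// or the Referer is our own site; otherwise refuse the hotlink.
$referer = $_SERVER['HTTP_REFERER'] ?? '';
$ownHost = 'example.com';            // hypothetical: the host that owns the images

if ($referer !== '' && parse_url($referer, PHP_URL_HOST) !== $ownHost) {
    http_response_code(403);
    exit('Hotlinking not allowed');
}

header('Content-Type: image/jpeg');
readfile('images/photo.jpg');        // hypothetical path
```

Note that in your setup the requests come from each visitor's browser, not from your server, so IP-based rate limiting would hit many different addresses; Referer checks like the one above are what would actually catch embedded external images.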

Caching an image gallery on a website

I have a website that shows hundreds or thousands of images to users. Users reload the gallery fairly frequently, but their browser caches are invariably not big enough to keep the images from visit to visit, so the images have to be slowly re-downloaded.
What are good ways to ensure that the images stay cached? I've already ruled out using localStorage. What advice can you give me?
Just to be clear, the HTTP headers are set correctly (visitors with big caches get instantaneous loads); it's just that the browsers' default caches are not big enough.
Have you considered server-side caching, such as Varnish in front of your web server? If you have enough memory on your server(s), this will help a lot with download time for your users (and with your server load).
Or are you only interested in the client-side cache? If so, it would help if you could post your HTTP cache headers.
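For reference, aggressive client-side caching usually comes down to headers like these; a PHP sketch of serving a gallery image with a one-year lifetime (the path is hypothetical):

```php
<?php
// Serve an image with long-lived client-side caching. Pair a long
// max-age with versioned URLs (e.g. photo.a1b2c3.jpg) so that updated
// images bypass the stale cached copy.
$path = 'images/photo.jpg'; // hypothetical path

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=31536000, immutable');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($path)) . ' GMT');
readfile($path);
```

Even with perfect headers, the size of the browser's cache is out of your hands, which is why a server-side cache such as Varnish at least makes the inevitable re-downloads fast.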

Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure website. Because there are a lot of CSS images that need to be loaded, the site can slow down, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is moving style-based images and JavaScript libraries to a non-secure subdomain so that the browser could cache those resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is it a feasible idea, or should I optimize my site in other ways, like using CSS sprite maps, to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed-content sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of other ways; if SSL is the only bottleneck left, buy a hardware accelerator. Hmm, if you load an image over HTTP, will it be served from cache when you load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (in IE). If your JavaScript is served insecurely, there isn't much you can do to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing the secure content to keep the look and feel. As a consumer I really don't like that, but as a web developer I've had to do it before, for lack of other pragmatic options. Frankly, though, there are just as many defects with that approach, if not more: you're hoping that nothing violates the integrity of the insecure content, so that it ends up hosting the secure content and not some alternate content.
It's just not a great idea from a security perspective.
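As an aside, you can audit a page for mixed content by scanning its HTML for resources still referenced over plain HTTP. A crude PHP sketch (the page URL is a placeholder, and the regex only catches simple src/href attributes):

```php
<?php
// Crude mixed-content audit: fetch an HTTPS page and list any
// src/href attributes that still point at plain http:// resources.
$page = 'https://example.com/';   // placeholder URL
$html = file_get_contents($page); // requires allow_url_fopen

preg_match_all('~(?:src|href)=["\'](http://[^"\']+)["\']~i', $html, $matches);

foreach (array_unique($matches[1]) as $url) {
    echo "insecure resource: $url\n";
}
```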
