I'm developing a website that displays a map using the OpenLayers 3 API.
I know that most requests will refer to a specific location, so I'm wondering whether it's possible to cache the tiles for that region server-side, in order to reduce calls to OpenStreetMap, which is sometimes slow.
Thanks in advance!
Yes, you can. There are several solutions:
- You can try dedicated software usually called a "map tile proxy", such as MapProxy or MapCache.
- You can reuse your existing web server and tweak it to cache map tiles (there is an Nginx recipe for this; I did not find an out-of-the-box one for Apache). A minimal Nginx sketch follows this list.
- You can use what we call a "reverse proxy", like Varnish. See a recipe for this.
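If you go the web-server route, here is a rough sketch of what the Nginx approach boils down to, assuming Nginx with its standard proxy module; the cache path, zone name, sizes, and expiry times are placeholders to adapt:

```nginx
# Disk cache for OSM tiles: up to 10 GB, entries dropped after 30 days unused
# (path, zone name, and sizes are examples - adapt them to your server).
proxy_cache_path /var/cache/nginx/tiles levels=1:2 keys_zone=osm_tiles:10m
                 max_size=10g inactive=30d;

server {
    listen 80;
    server_name tiles.example.com;   # hypothetical host name for your tile proxy

    location / {
        proxy_pass https://tile.openstreetmap.org/;
        proxy_set_header Host tile.openstreetmap.org;
        proxy_cache osm_tiles;
        proxy_cache_valid 200 7d;                      # keep successful tile responses a week
        proxy_cache_use_stale error timeout updating;  # serve stale tiles if OSM is slow
    }
}
```

You would then point your OpenLayers 3 tile source at something like http://tiles.example.com/{z}/{x}/{y}.png instead of the OpenStreetMap servers, so only cache misses ever reach OSM (and keep the OSM tile usage policy in mind either way).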
We have a number of EC2 instances running Solr, which we've used in the past through another application. We would now like to move towards allowing users (via web browser) to access Solr directly.
Without something "in front" of Solr this is a security risk, so we have opted to use ELB (specifically the Application Load Balancer) as a simple, maintenance-free way of preventing certain requests from reaching Solr (i.e. preventing the public from deleting or otherwise modifying the documents in Solr).
This worked great, but we realize that we need to deal with the CORS issue. In other words, we need to add the appropriate headers to requests that come in from a browser. I have not yet seen a way of doing this with the Application Load Balancer, but I'm wondering whether it is possible somehow. If it is not, I would also appreciate a recommendation for the easiest and least complicated way of adding these headers. We would really hate to put something like nginx in front of Solr, because then we'd have additional redundancy to deal with, more servers, and so on.
Thank you!
There is not much I can find on CORS for ALB either; I remember that when I used Elastic Beanstalk with ELB, I had to add CORS support in my Java application directly.
Having said that, I can find a lot of articles on how to set up CORS for Solr itself.
Could that be an option for you?
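If it is, a rough sketch of the kind of configuration those articles describe - assuming your Solr runs on the bundled Jetty and you can edit the web app's web.xml - looks like this (the allowed origin is a placeholder for your own site):

```xml
<!-- In the Solr webapp's WEB-INF/web.xml: enable Jetty's bundled CORS filter -->
<filter>
  <filter-name>cross-origin</filter-name>
  <filter-class>org.eclipse.jetty.servlets.CrossOriginFilter</filter-class>
  <init-param>
    <param-name>allowedOrigins</param-name>
    <param-value>https://your-frontend.example.com</param-value>
  </init-param>
  <init-param>
    <param-name>allowedMethods</param-name>
    <param-value>GET,POST,OPTIONS</param-value>
  </init-param>
  <init-param>
    <param-name>allowedHeaders</param-name>
    <param-value>origin,content-type,accept</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>cross-origin</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```

That way the ALB keeps filtering the dangerous requests and Solr itself answers the preflight and adds the CORS headers, with no extra servers in between.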
For my new web project I am considering abandoning server-side processing of web pages in favor of just using static HTML5 pages. All dynamic content on the page will be loaded via Ajax from a REST service. No need for PHP, JSP, or JSF.
I came across this post and it seems I am not the only one.
What are the advantages and disadvantages of this approach?
I can imagine there will be more client-server requests, since many REST calls have to be made to gather all the information needed to display the page.
I believe there are far more pros than cons. One good reason to serve HTML pages through a web/app server such as Apache, Nginx, or IIS is that you can apply more security and control to the content that is delivered.
However:
Serving static content (JS, CSS, and HTML5) that only consumes services is where software development is heading. When you split the API from the UX, you can test them separately, develop the service and the interface at the same time, and gain a lot of speed and quality in development.
When you look at the weight of a fully rendered page and what it costs the app server to generate and deliver all of that markup to the user, when often less than 10% of it is actual data that a service could return as JSON, you have to start rethinking the architecture of your web app.
I have been developing apps like this for a year and a half now, in all my projects, and believe me, we are not going back. It is very important to work in a service-oriented way and to know how to consume those services.
If you use, for example, Amazon S3 to host your HTML, JS, CSS, and image files, you don't need an app server at all; you only need the REST API that the page consumes to get its content. It costs very little and is very, very fast.
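To illustrate the pattern (the API host and endpoints below are made up), a static page hosted on S3 can gather everything it needs with a few parallel REST calls:

```typescript
// Hypothetical REST endpoints; adjust to your own API.
const API = "https://api.example.com";

async function loadPage(): Promise<void> {
  // Fire the requests in parallel so the extra round trips stay cheap.
  const [user, products] = await Promise.all([
    fetch(`${API}/user/me`).then(r => r.json()),
    fetch(`${API}/products?limit=10`).then(r => r.json()),
  ]);

  // Fill placeholders that already exist in the static HTML.
  document.querySelector("#greeting")!.textContent = `Hello, ${user.name}`;
  document.querySelector("#products")!.innerHTML = products
    .map((p: { name: string }) => `<li>${p.name}</li>`)
    .join("");
}

loadPage().catch(console.error);
```

The HTML itself never changes, so it can be served straight from S3 (or any web server) and cached aggressively; only the JSON responses are dynamic.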
We're planning to introduce DMS to our customer's Sitecore installation. It's a rather popular site in our country, and we have to use a caching proxy server (Nginx in this case) to make it high-traffic-proof.
However, as far as we know, it's not possible to use all the DMS features with a caching proxy enabled - for example, content personalization: once a page is cached, it won't be personalized.
Is there a way to make use of all the DMS features with the proxy cache turned on? If not, how do you handle this problem for high-traffic sites - by buying more Content Delivery servers to carry the load, or by extending the current server with better hardware (RAM, CPU, bandwidth)?
You might try moving away from proxy caching for some pages, or even all of them:
- There's no reason not to use a CDN for static assets and media library assets, so stick with that.
- Leverage Sitecore's built-in HTML cache for sublayouts/renderings - there are quite a few caching options.
- Use Sitecore's Debug feature to track down the slowest components on your site.
- Consider using indexes instead of "fast" or Sitecore queries.
- Don't do a descendants query ("//*"). I often see this when calculating the selected state for navigation - hint: go the other way and calculate the ancestors of the current page.
#jammykam wrote an excellent answer on this over here.
John West wrote a great blog post on this also, though a bit older.
Good luck!
I've been wondering about this myself.
I have been thinking of implementing an Ajax web service (see the rough sketch after this list) that:
- talks to the DMS and returns JSON
- allows you to render the personalised components client-side
- allows you to trigger analytics events
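On the client it could look something like this; the endpoints and the response shape are purely hypothetical, since nothing like this ships out of the box:

```typescript
// Hypothetical endpoint that asks server-side DMS logic which variation of a
// component the current visitor should get, returned as JSON with its markup.
async function renderPersonalisedComponent(placeholderId: string, componentId: string): Promise<void> {
  const response = await fetch(`/api/personalisation/${componentId}`, {
    credentials: "include", // keep the analytics/session cookie on the request
  });
  const data: { html: string; eventName?: string } = await response.json();

  // Inject the personalised markup client-side, so the page itself stays cacheable.
  const placeholder = document.getElementById(placeholderId);
  if (placeholder) placeholder.innerHTML = data.html;

  // Explicitly trigger an analytics event so DMS still records the interaction.
  if (data.eventName) {
    await fetch(`/api/analytics/event/${encodeURIComponent(data.eventName)}`, {
      method: "POST",
      credentials: "include",
    });
  }
}

renderPersonalisedComponent("hero-banner", "homepage-hero").catch(console.error);
```

Triggering the event explicitly is what would keep the analytics data from getting lost even though the markup is assembled client-side.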
I have been googling around and I haven't found anyone who has done it and published the details yet. The only place I have found something similar is actually in the Mobile SDK, but I haven't had a chance to delve into it yet.
I have also not been able to use proxy-server caching and DMS together successfully. For extremely high loads, I have recommended that clients follow the standard optimization and scaling guidelines, especially architecting for proper Sitecore sublayout and layout caching across as much of the site as possible. With that caching in place, distribute the site across multiple Content Delivery nodes behind a load balancer to support high volume and personalization at the same time.
I've heard that other CMSs with personalization use a JavaScript approach to load the personalized content client-side, but I would be worried about losing the analytics data that is gathered when personalized content is loaded and interacted with.
I'm with a web host - a web farm or cluster, I guess you could say. I have a 47-page company website, and every speed test suggests I use a CDN.
I've Googled and searched SE to no end, but I still don't understand how to implement a content delivery network. Are they suggesting I set up a subdomain and put all my .css, .js, and image files on that subdomain? Or are they suggesting that instead of hosting jQuery 1.7 myself, I just link to malsup's jQuery? But then what would I do for images and .css?
Just kinda confused here; any help in this regard would be truly appreciated!
Yes - you can implement a CDN with a clustered web host. In the vast majority of cases, if you can change your DNS settings you can implement a CDN: typically you point a subdomain (e.g. cdn.yourdomain.com) at the CDN provider with a CNAME record, the CDN pulls your existing files from your site the first time they are requested, and you then reference your .css, .js, and image files through that subdomain. Serving static assets from a cookie-less domain is another good suggestion. A content delivery network will optimize the delivery of all the file types you mentioned. While I'm not sure of all the particulars of your specific setup and situation, it sounds like you could use front-end optimization and overall faster site delivery. Take a look at the following datasheet, which describes EdgeCast's integration of Google's PageSpeed into our content delivery network and how the two together will help sites like yours: http://www.edgecast.com/docs/ec-edgeopt-datasheet.pdf
I was wondering whether there are any best practices, or whether you have any good tips, for improving the performance of a self-hosted OpenX instance.
Apart from the usual suspects (e.g. tuning Apache, PHP, and MySQL, using memcached for caching, etc.).
Is there any room for a CDN or other proxy cache in front of OpenX (my guess is no, due to the dynamic nature of the delivery scripts' results)?
I suppose you could cache the actual creative (image files, etc.), but you've guessed correctly that caching the scripts isn't going to be a good thing.
Based on some recent questions, it looks like there's not a whole lot to be done to improve OpenX's performance, unfortunately.
You can refer to the steps given on the OpenX website:
performance tuning
OpenX has some problems - for example, it fetches all the banners in the system that match the zone ID and then applies every delivery limitation to them in order to select the right banner to serve.
But if you try the steps at that URL, they will improve performance.
Yes, you can use a CDN.
I'm using a CDN (a pull zone with a CNAME), and I only had to set this parameter:
Configuration > Global Settings > Banner Delivery Settings > OpenX Server Access Paths
Image Store URL: http://cdn.yourdomain.com/www/images