Any fix for Firefox's DNS Cache? - firefox

Firefox has a powerful feature in which it caches DNS lookups to improve speed.
I have a situation where this is causing more problems than it is solving. I manage a website, and we recently migrated our site to a new, upgraded server. This server has a different public IP address. We updated our DNS records and everything is correct, but some customers using Firefox are reporting that they are still seeing the "site under construction" page we put up on the old server before the migration. I was running into this problem as well.
The only fix I found that worked was to flush the browser cache, selecting "Everything" from the dropdown, and then closing the browser and re-launching. That works, but it's not a "solution" when you consider that we're dealing with customers all across the country, many of whom are not computer literate, and most of them don't even take the time to let us know that they are experiencing a problem. We are losing business because customers who don't know any better see a "site under construction" message and move on, probably spending their money on some other site.
There needs to be some way to tell Firefox that its cache for this site is outdated and needs to be updated. But I have no idea how to do that (or whether it's even possible), and I can't find any information about it. Every "solution" to this problem involves flushing the browser cache. Again, that works, but it's a reactive fix applied when someone happens to call in to support. We need a proactive solution.
Any ideas?

There's probably no way for you to remotely control visitors' Firefox installations, since that wouldn't be safe for users.
The best approach is to alert visitors on the old site that they should refresh their DNS cache: in Firefox, find the "Clear Recent History" option, select "Everything", and check all the boxes under Details.
The 2nd way is using this add-on: https://addons.mozilla.org/en-US/firefox/addon/dns-cache/
The 3rd way:
1) Type about:config in Firefox's address bar
2) Acknowledge the warning that appears next
3) Find the entry called network.dnsCacheExpiration and set its value to 0
4) If there's no such entry, create a new integer item with that name and a value of 0
5) Then go back and change the value to 3600
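On the server side, you can at least confirm which answer a given machine's resolver is handing out before blaming the browser. Here is a minimal diagnostic sketch (hostname and IP below are placeholders, not the asker's real site) that checks whether the local DNS answer already includes the migrated server's address:

```python
import socket

def resolved_ips(hostname):
    """Return the set of IPv4 addresses the local resolver gives for hostname."""
    infos = socket.getaddrinfo(hostname, 80, socket.AF_INET, socket.SOCK_STREAM)
    return {info[4][0] for info in infos}

def points_at_new_server(hostname, new_ip):
    """True if the local DNS answer already includes the migrated server's IP."""
    return new_ip in resolved_ips(hostname)

# Example (hostname and IP are placeholders for your own site):
# points_at_new_server("www.example.com", "203.0.113.10")
```

If this returns True on a machine that still shows the old page, the stale copy is coming from the browser or an HTTP cache rather than from DNS.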

Related

demandware site updates not instant

My company has recently begun using Demandware; however, we find it slightly odd that instant site updates are not possible, whether through a cache clear or server updates. At the moment our servers update every 24 hours, so effectively any changes we make one day won't go live until the following day. In Demandware, is there anything we can do to see new updates to our site instantly, or perhaps have a server update occur every hour?
Thank you
What is the change that you want to appear instantly on production?
Why should it be instant, even if the site is under heavy load at the moment?
There are various ways you can achieve quicker updates (remote includes with different caching, integration feeds targeted at production, cache segmentation, etc.), but I need more details about what exactly you are trying to do to suggest an appropriate solution.
Also, your question is better suited to the Demandware XChange portal than Stack Overflow, as this is a site for programming, and your question is not a programming one.
Unfortunately, there is no way to make instant changes on Demandware. You can, however, make changes that will be reflected on the site within about 15 minutes. To do this, you will need to be logged in to production.
There are typically three instances on which you can make changes: development, staging and production. If you are seeing the changes come through after 24 hours, then you are almost certainly logged in to the staging instance, where changes are replicated over to production at a designated time.
What you'll need to do is login to production and make the required update. Then, clear your site cache to make the change visible on the live site. You will then need to wait on average 15 minutes for the data centres to dump their cache and serve a new copy of the site.
To clear the site cache:
1) Navigate to Administration > Manage Sites
2) Select the relevant brand/region
3) Click the Cache tab
4) Click the first "Invalidate" button
To clear the cache, go to Administration -> Manage Sites -> select the site -> click the Cache tab, and then click Invalidate Cache. This starts the cache invalidation process, which usually takes some time, but not an entire day. Also check whether any other caching mechanisms have been added for your sites. Usually, invalidating the cache works just fine.

My website is slow and I don't know how to fix it

It's been 2 days and I'm at my wits' end.
For some reason my website suddenly started taking way too long to load.
I have Cloudflare enabled on my domain to cache content so my site loads faster. I've tried turning it off, but my site still takes forever to load.
I've used Pingdom (http://tools.pingdom.com/fpt/#!/dFvagb/http://streamaton.com/), and according to the results the domain itself is taking too much time to load (what?).
I've tried visiting other sections of my site, like my admin panel, and those pages load pretty fast.
I have no idea how to pinpoint the root of this problem.
It depends a bit on the circumstances:
Could it be that you have much higher load than usual?
If not, did you make any code changes that might be responsible for the slowdown?
It could also be that the server your website is on is under unusually high load or in some weird half-dead state. Sending a mail to your ISP asking them to check your server might be a good idea in any case.
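To narrow down where the time goes, it helps to separate time-to-first-byte (DNS + connect + server think time) from body download time, since Pingdom's "the domain itself is slow" usually means the former. A minimal sketch (the example URL is a placeholder):

```python
import time
import urllib.request

def time_request(url, timeout=15):
    """Return (ttfb_seconds, total_seconds) for a single GET of url.

    TTFB here counts DNS lookup + TCP connect + server think time,
    up to the first byte of the response body arriving.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)                 # first byte of the body received
        ttfb = time.monotonic() - start
        resp.read()                  # drain the rest of the body
        total = time.monotonic() - start
    return ttfb, total

# Example (URL is a placeholder for your own site):
# ttfb, total = time_request("http://example.com/")
```

A high TTFB with a small gap to the total points at the server (or its DNS) rather than at page weight or a CDN, which matches the admin-panel-is-fast observation only if the slow page does heavier backend work.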

Magento Community 1.6.1; Users get empty response after products updated

Running Magento Community 1.6.1 on Apache, MySQL.
Hi, I have a huge problem with visitors getting locked out of our site by receiving an empty response from the server. Any user who has anything in the cart when any product is updated (actually, just pressing Save is enough) will get locked out, receiving empty_response from the web server.
Users gain access again if they remove their session cookie, and/or we clear the sessions folder in /var/, but until then all they get is empty_response from the web server.
Rolling back the database to one a few days older than the first symptoms kills the problem, without having to replace any files.
No logs or exceptions are produced.
It took a while to cause the problem, and even a bit longer to find the cause, so rolling back to a database backup from before the problem is not an option.
I have read a lot about cookie lifetimes, etc., but none of it seems related to this. Furthermore, having a functional database to verify against allowed me to copy the settings (specifically, the contents of core_config_data) from the functional database to a copy of the current live database for more tests. The results were the same…
So, ending up here… would be extremely thankful for any tips, hints or directions that put me on the right track!
Thanks for reading my post, and many thanks in advance for spending your time reflecting over this issue
You should try:
- a database repair tool
- clearing all logs in the database
- re-saving customer groups
- inspecting cookie settings with Fiddler or a similar tool
- verifying that your server clock is correct
- enabling error messages
- inspecting server error logs to see if there are any entries related to this
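To confirm that the session cookie is really what triggers the empty response, a small probe that fetches the storefront with and without a suspect cookie can make the lockout reproducible outside a browser. This is a generic sketch (the URL and cookie value are placeholders, not Magento specifics):

```python
import http.client
import urllib.request

def probe(url, cookie=None, timeout=10):
    """Fetch url, optionally sending a Cookie header.

    Returns (status, body_length), or (None, 0) if the server closed
    the connection without sending a response ("empty_response").
    """
    req = urllib.request.Request(url)
    if cookie:
        req.add_header("Cookie", cookie)
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status, len(resp.read())
    except http.client.RemoteDisconnected:
        return None, 0   # server hung up with no response at all

# Example (URL and cookie value are placeholders):
# print(probe("http://shop.example.com/", cookie="frontend=abc123"))
```

If the cookie-less request succeeds while the cookied one returns (None, 0), the problem is tied to the stored session data, which fits the observation that deleting the session cookie or clearing the sessions folder restores access.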

Images not loading on Facebook

I'm usually good at debugging when it comes to helping family members with their computer problems, and I wouldn't normally post this type of question, but I'm hoping this community can help me get to the bottom of this.
A family member is having problems with certain websites not loading all of their resources, primarily images from what I can tell. I have disabled her Symantec protection in case it was scanning or blocking things from loading, and have also uninstalled and disabled startup applications she doesn't need.
One example of a file that is not loading on her system is:
http://static.ak.fbcdn.net/rsrc.php/v1/yp/r/kk8dc2UJYJ4.png
I'm assuming this loads for everyone else here.
Any thoughts would be much appreciated. Also, she gets a similar issue in IE, Chrome, and Firefox.
The first place I'd look is whether there's a commercial ad-blocker installed, since it presumably can't be a browser add-in/extension, as the different browsers have their own separate settings.
And it may sound silly, but did you check the hosts file (system32/drivers/etc/hosts)? Is it possible static.ak.fbcdn.net is just being redirected? You might want to try opening the command prompt and just doing ping static.ak.fbcdn.net and confirming her computer's exact behavior.
In my case FB redirects me to a749.g.akamai.net (or 125.56.208.11) and everything works fine.
Minor edit: I'm a bit skeptical that's the cause, as FB serves other stuff from that domain (CSS, JS). Photos and profile pictures seem to come from a different domain. But I'd still be interested in whether the problem occurs when connecting to the resource or displaying it.
That's probably because your DNS resolves the Akamai CDN server, which Facebook uses to serve images, to an IP address that is not reachable from your network. You may want to note the IP addresses of the Facebook CDNs your computer is using when this happens and contact your network administrator to find the reason behind the blockage (it may be a firewall). Other than that, you can try changing the DNS server in your system settings, which might give you an IP address that works on your network.
PS: I ran into this issue a few weeks ago, and this is what I found to be the cause.
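Following up on the hosts-file suggestion above: rather than eyeballing the file, a small parser can report any override for the hostname in question. This sketch just parses hosts-file text handed to it, so it works the same whether the file came from /etc/hosts or from Windows' drivers/etc/hosts:

```python
def hosts_overrides(hosts_text, hostname):
    """Return the IPs that a hosts file maps hostname to (case-insensitive).

    Pass the raw contents of /etc/hosts (or, on Windows,
    %SystemRoot%\\System32\\drivers\\etc\\hosts).
    """
    target = hostname.lower()
    ips = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and blanks
        if not line:
            continue
        fields = line.split()
        ip, names = fields[0], [n.lower() for n in fields[1:]]
        if target in names:
            ips.append(ip)
    return ips

sample = "127.0.0.1 localhost\n0.0.0.0 static.ak.fbcdn.net  # suspicious!\n"
print(hosts_overrides(sample, "static.ak.fbcdn.net"))  # -> ['0.0.0.0']
```

An entry mapping static.ak.fbcdn.net to 0.0.0.0 or 127.0.0.1 (a common leftover from ad-blocking hosts lists) would explain images failing identically in IE, Chrome, and Firefox.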

Changing domain linked to a Selenium::Client::Driver instance

I'm using the Selenium Client (v 1.2.18) to do automated navigation of retail websites for which there exists no external API. My goal is to determine real-time, site-specific product availability using the "Check Availability" button that exists on a lot of these sites.
In case there's any concern, each of these checks will be initiated by a real live consumer who is actually interested in whether or not something's available at that store. There will be no superfluous requests or other internet badness.
I'm using Selenium's Grid framework so that I can run stuff in parallel and I'm keeping each of the controlled browsers open between requests. The issue I'm experiencing is that I need to perform these checks across a number of different domains, and I won't know in advance which one I will have to check next. I didn't think this would be too big an issue, but it turns out that when a Selenium browser instance gets made, it gets linked to a specific domain and I haven't been able to find any way to change what domain that is. This requires restarting a browser each time a request comes in for a domain we're not already linked to.
Oh, and the reason we're using Selenium instead of something more lightweight (e.g. Mechanize) is that we need something that can handle JavaScript.
Any help on this would be greatly appreciated. Thanks in advance.
I suppose you are restricted from changing domains because of the same-origin policy. Did you try launching the browser with elevated security privileges, such as *iehta for Internet Explorer or *chrome for Firefox? While using these browser modes, use the open method in your tests and pass the full URL you want to open. This might solve your problem.
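If the elevated-privilege launchers don't work out, the restart cost can at least be amortized by keeping one live session per domain and only launching a browser the first time a domain is seen, which fits the asker's grid setup. A minimal sketch of that pooling idea; the `launch` factory is a hypothetical stand-in for whatever code starts a Selenium session, not part of the Selenium API:

```python
class BrowserPool:
    """Keep one live browser session per domain, reusing it across requests.

    `launch` is whatever factory starts a Selenium session bound to a
    domain (a hypothetical stand-in here, not a Selenium API).
    """

    def __init__(self, launch):
        self._launch = launch
        self._sessions = {}   # domain -> session object

    def session_for(self, domain):
        # Launch at most one browser per domain; later requests reuse it.
        if domain not in self._sessions:
            self._sessions[domain] = self._launch(domain)
        return self._sessions[domain]

# Example with a dummy launcher standing in for real browser startup:
launches = []
pool = BrowserPool(lambda d: launches.append(d) or f"session:{d}")
pool.session_for("retailer-a.example")
pool.session_for("retailer-b.example")
pool.session_for("retailer-a.example")   # reused, no new launch
print(launches)  # -> ['retailer-a.example', 'retailer-b.example']
```

With real consumers driving the checks, popular retail domains stay warm and only first-ever domains pay the browser-restart penalty.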
