I'm building a Next.js application that uses SSR. I'll soon be hosting it online, but first I need to measure how much bandwidth first-time users consume when they load the application. I tried using Chrome DevTools for this, but even after clearing browsing data, cache, temporary files, and prefetch data, and restarting the PC, the cached files are still not removed.
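As a cross-check on DevTools, one approach is to load the page with "Disable cache" ticked in the Network panel, export the panel as a HAR file ("Save all as HAR"), and sum the transferred bytes yourself. A rough sketch; note that `_transferSize` is Chrome's non-standard extension to the HAR 1.2 format, so verify the field names against your own export:

```python
import json


def total_transfer_bytes(har_path):
    """Sum the bytes transferred for every request recorded in a HAR file.

    Chrome records the on-the-wire size in the non-standard
    `_transferSize` field; when it is absent or -1 ("unknown" in HAR),
    fall back to the response `bodySize`.
    """
    with open(har_path, encoding="utf-8") as f:
        har = json.load(f)
    total = 0
    for entry in har["log"]["entries"]:
        resp = entry["response"]
        size = resp.get("_transferSize", -1)
        if size < 0:  # -1 means "unknown" per the HAR spec
            size = max(resp.get("bodySize", 0), 0)
        total += size
    return total


# Usage (placeholder filename):
# print(total_transfer_bytes("first-load.har"))
```

This counts what actually crossed the wire for that session, so an empty-cache first load gives you the first-visit bandwidth figure directly.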
I'm working on a site and using Chrome Incognito to review my changes, but files are still being cached and I can't review what I've done. Incognito isn't supposed to cache files. I just had a session with my hosting provider to confirm nothing is being cached on their end, and it isn't. To be certain, I deleted the file I'm working on from the server, then logged into a remote computer, tried accessing the file, and got a 404 error, so I know it isn't cached on the server. I then completely shut down Chrome on this computer, restarted it, opened an Incognito window, went to the URL, and the file still appears in a state from before the most recent changes I'm trying to verify. I've repeatedly cleared the Chrome cache, but that doesn't affect Incognito. How do I clear whatever is caching these files on my system? Or maybe a better question: how do developers avoid cache issues while building?
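One common trick during development, besides ticking "Disable cache" in the DevTools Network panel while DevTools is open, is to make every request URL unique so that no cache layer (browser, proxy, or CDN) can serve a stale copy. A minimal sketch; the `v` parameter name is arbitrary:

```python
import time
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse


def cache_bust(url):
    """Append a timestamp query parameter so browsers, proxies, and
    CDNs all treat the request as a brand-new resource."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["v"] = str(int(time.time()))  # arbitrary "version" parameter
    return urlunparse(parts._replace(query=urlencode(query)))
```

The same idea is why build tools fingerprint asset filenames (`app.3f9a1c.js`): a changed URL can never be answered from a cache entry for the old one.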
I have TFS 2015 installed on one of the company's servers. When I access TFS through web access it is extremely slow; a page takes more than 5 minutes to load, sometimes even longer. If I restart the server, TFS becomes a little faster (a page then needs only a minute or so to load), but it soon slows down again.
The server itself is fine: CPU and memory are not even close to fully utilized (roughly 20-40%).
Other applications installed on the server work fine, so it's just TFS.
Any suggestions?
Log in to the application tier machine and open web access locally to see whether the same slowness occurs there.
If you set up TFS in a multi-server configuration, check the network connection between the application tier and data tier machines. You can also try turning off the firewall and anti-virus software on those machines to rule them out.
Clean the cache folder on the application tier; it is usually located at: C:\TfsData\ApplicationTier\_fileCache
Check the requirements and compatibility documentation to verify that TFS is set up in an appropriate environment.
If none of the above helps, you may need to consider moving TFS to different hardware.
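The cache-cleaning step can be scripted. A sketch in Python, assuming the default cache location; stop the TFS services (or at least recycle the application pool) first, and TFS will rebuild the cache on the next request:

```python
import shutil
from pathlib import Path

# Default location of the application-tier file cache; adjust the path
# if your TfsData directory lives somewhere else.
CACHE_DIR = Path(r"C:\TfsData\ApplicationTier\_fileCache")


def clear_file_cache(cache_dir=CACHE_DIR):
    """Delete everything inside the cache folder (keeping the folder
    itself) and return how many top-level items were removed."""
    removed = 0
    for child in cache_dir.iterdir():
        if child.is_dir():
            shutil.rmtree(child)
        else:
            child.unlink()
        removed += 1
    return removed
```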
I built a Joomla website that has been online for about two years now, and it has always worked fine.
Recently I added a Google maps plugin and changed some header images.
The strange thing is that those changes only display on my own laptop. It's not that I'm accidentally viewing a local copy; I'm definitely visiting the live site. When I open www.mywebsite.com in Chrome, it shows all the changes made in the backend, and when I log out and back in to the backend through /administrator, I can see the changes there too.
But when I open the website on another computer (I've tried three different ones), the changes don't display. Even when I log in to the backend through /administrator on the same network, I don't see the changes made on my laptop, and vice versa.
The frontend can sometimes have caching problems, but caching is disabled, so that can't be the issue. And even if it were, why would the backend display different data on two different computers?
Also, I've already tried clearing the browser cache on those other computers.
Has anyone experienced this problem? I'm guessing it might be something on my hosting company's side.
It turned out to be a rule in the hosts file on my laptop that pointed to an old location/IP address of the website, which had since been moved to another server.
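A quick way to catch this kind of stale entry is to scan the hosts file for your domain. A sketch with the default paths for Windows and Unix-like systems; pass an explicit path to override:

```python
import platform
from pathlib import Path


def hosts_entries(domain, hosts_path=None):
    """Return hosts-file lines that map `domain` to an address.

    A leftover entry here pins the domain to an old IP, so this machine
    keeps loading the site from the previous server while every other
    computer resolves the new address via DNS.
    """
    if hosts_path is None:
        if platform.system() == "Windows":
            hosts_path = Path(r"C:\Windows\System32\drivers\etc\hosts")
        else:
            hosts_path = Path("/etc/hosts")
    hits = []
    for line in hosts_path.read_text().splitlines():
        active = line.split("#", 1)[0]  # ignore comments
        if domain in active.split():
            hits.append(line.strip())
    return hits
```

Any line this returns means DNS is being bypassed on that machine for that domain.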
When I browse to a local site on IIS using Chrome I get intermittent slow performance.
It doesn't seem to matter whether the request is a full page request or an ajax request, it happens a significant percentage of the time, enough to slow down my development or make me use a different browser. Browsing to the same site in the live environment runs fine. Firefox and IE are running fine, just seems to be Chrome.
The network tab is showing the delay on the Blocking phase on my machine so I don't think it's a problem with DNS and disabling IPv6 didn't help me. Could it be something to do with the application or session cookies? I'm running Windows 8.1 with IIS 8.5 and the general performance of the machine is good.
Very frustrating because I prefer Chrome tools to the dev tools in other browsers and I've not had this issue on other dev machines where I've used Chrome.
Clear all your data (cookies, history, and cache). If that doesn't work, reboot your PC; if that still doesn't work, reinstall Chrome.
I hope that helps.
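Before reinstalling anything, it can also help to time the same request outside any browser: if these timings are consistently fast, the delay lives in Chrome (extensions, proxy auto-detection, and so on) rather than in IIS. A rough sketch, where the URL is a placeholder for your local site:

```python
import time
import urllib.request


def time_requests(url, count=5):
    """Fetch `url` `count` times and return the wall-clock seconds each
    request took, so intermittent stalls like the "Blocking" phase in
    DevTools stand out if they reproduce outside the browser."""
    timings = []
    for _ in range(count):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings.append(time.perf_counter() - start)
    return timings


# Example (placeholder URL):
# for t in time_requests("http://localhost/mysite/"):
#     print(f"{t * 1000:.1f} ms")
```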
I've been working on a VB.NET/VS2008/AJAX/SQL Server project for over two years without any real issues. However, in the last week of the project we've been doing some heavy stress testing, and the application starts failing once I reach about 150 simultaneous users. I even created a stripped-down version of the site that only logs a user in, pulls up their profile, and logs off; that still fails under stress. When I say "fails", I mean the CPUs spike and the app pool eventually crashes. This is running on a Windows Server 2008 R2 machine with dual quad-core CPUs and 16 GB of memory. Memory usage never spikes, but the CPU maxes out.
I ran YSlow on the site and it pointed out that I needed to compress the .axd files, etc. I did that by enabling Gzip compression on everything, and that's what got me to 150 users. When I run YSlow now, everything scores an "A".
I'm really not sure where to go from here. I'd be more than willing to share the stripped down version of the site for anyone to review. I'm not sure if it's the server, my code or the web.config.
I know it's a bit late, but have you considered increasing the number of worker processes in your site's application pool to form a web garden? You can do this in IIS Manager.
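For reference, the same change can be made from an elevated command prompt with `appcmd`; the pool name and worker count below are placeholders. A sketch that just builds the command:

```python
import subprocess

# appcmd.exe ships with IIS; adjust the path if IIS is installed elsewhere.
APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"


def web_garden_command(pool_name, worker_processes):
    """Build the appcmd invocation that raises an application pool's
    worker-process count (maxProcesses > 1 turns it into a web garden)."""
    return [
        APPCMD, "set", "apppool",
        f"/apppool.name:{pool_name}",
        f"/processModel.maxProcesses:{worker_processes}",
    ]


# On the server itself you would actually run it (placeholder pool name):
# subprocess.run(web_garden_command("MyAppPool", 4), check=True)
```

One caveat: with multiple worker processes, in-process (InProc) ASP.NET session state is no longer shared between workers, so an app relying on it needs StateServer or SQL Server session state before a web garden is safe.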