I have a strange issue with Google PageSpeed Insights. I am receiving this error even though I have moved all my scripts to the footer and the critical CSS is inlined in the header. The mobile results are very good, but desktop is receiving this error. Does anyone have an idea how to solve this?
This is the URL for the test: google page speed insights
I had a look at the source for the URL that you provided and it seems like there is a script tag inside the HEAD.
<script>
sessionStorage.setItem('BASEURL', 'http://test.bigcat.rs/mirokredit/wp-content/themes/mirobuild');
sessionStorage.setItem('LANG', 'de');
</script>
You should move that to just before the closing BODY tag as well. Note that the async attribute only applies to external scripts with a src, so it won't help an inline block like this; relocating the script is what removes the blocking.
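A minimal sketch of the relocated block, reusing the same two lines from the page source:

<body>
  <!-- ... page content ... -->

  <!-- moved here from the HEAD so it no longer blocks rendering -->
  <script>
    sessionStorage.setItem('BASEURL', 'http://test.bigcat.rs/mirokredit/wp-content/themes/mirobuild');
    sessionStorage.setItem('LANG', 'de');
  </script>
</body>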
Hope that helps!
I have found the problem and the solution! In my original CSS I have a background image for the top content on desktop. In the critical CSS I had included only the background color, intending to serve the background image later. Once I included the background image in the critical CSS (in the header), the desktop results improved and the render-blocking error message disappeared. Google PageSpeed Insights compares the final rendered page with the one generated from only the critical CSS, and it found that the two pictures were not similar enough; that's why it said I had blocking content.
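For anyone hitting the same thing, a minimal sketch of that change (the selector and image path are illustrative, not taken from the actual site):

<style>
  /* Critical CSS inlined in the HEAD: include the hero background image,
     not just the fallback color, so the first paint matches the final render */
  .top-content {
    background-color: #1a1a2e;
    background-image: url('/images/top-bg-desktop.jpg');
  }
</style>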
I have a site. I tested it in Google PageSpeed Insights and it gave me some suggestions. I fixed the issues and deleted some files, but it is still showing the same result and giving the same suggestions. GTmetrix is showing the changes, but Google PageSpeed Insights is not. Can anyone tell me what the reason is?
Site Link : https://www.galpal.co.uk/
You are getting this error because the file is still referenced in the HTML.
Because the browser doesn't know that you have deleted the file from your server, it still makes the request, as you left a reference to it in your HTML.
Even though the file returns 404 Not Found, a request is still made for it, and at that point it is something that will block rendering of the above-the-fold content.
Remove the reference in your HTML (remove the <script> element referencing the file) and PageSpeed Insights will reflect the change.
To be clear, GTmetrix is wrong here and PageSpeed Insights is correct.
P.S. You should add the defer attribute (or async, but that is harder to get right) to your scripts:
<script src="yourJavaScriptFile.js" defer></script>
My problem is illustrated in the picture above, where we can see a set of two font files being fetched and transferred twice.
These fonts are self-hosted for my Gatsby.js site, and this double-fetching leads me to see font flashes twice on page load. Some clues that might shed some light:
1) Lighthouse is complaining about this error being logged to the console:
"Failed to load resource: net::ERR_ACCESS_DENIED" with regards to one of the fonts (Open Sans).
2) The initiator for the second set of font requests is "other". A Google search tells me that "other" refers to requests made by the user (which seems unlikely here, since this happens on page load) or to preloaded requests (I didn't explicitly include a preload link in my HTML head, but maybe one of those webpack scripts did?).
3) This issue only appears in my production build. It works okay and only fetches once in development mode.
Does anyone have any idea what might be causing this, or could perhaps give me some advice as to how I might even begin to debug this problem?
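One avenue worth checking (an assumption, not a confirmed diagnosis): font preloads must carry crossorigin="anonymous" even for same-origin fonts, because @font-face requests are always made in CORS mode, so a preload without that attribute cannot be reused and the font ends up being fetched twice, with the preload showing "other" as its initiator. If one of the Gatsby/webpack steps injects a preload, a correct one would look like this (the path is illustrative):

<link rel="preload" href="/static/fonts/open-sans.woff2" as="font" type="font/woff2" crossorigin="anonymous">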
Edit: I don't know whether it actually loads from the cache, so I couldn't title this question "prevent loading from cache".
Problem: Browsers sometimes save my code and keep loading only the copy they saved (maybe it is saved in the cache). When this problem occurs, the browser keeps serving the old code and won't change anything; that is, it won't load any new code I have updated.
Information: This occurs with HTML, CSS and JavaScript in all browsers. I am using Apache in XAMPP as the app server.
Clearing the cache in all browsers doesn't fix this.
My first way to stop it is to delete the file, refresh the browser and put the file back.
The second way is to change the path name.
Even after such a fix, the problem will occur again at some point :(, so I would like to know how to prevent it.
Edit: If possible, please explain it in newbie terms, because I am a very young beginner.
Try adding a variable, like the current timestamp, to each URL's query string.
Just use a query string, e.g. http://www.domain.com/style.css?version=1 for the first version.
Now suppose you update the stylesheet and want the change to reach every browser that has cached the old version: just change the version value in the query string to 1.1,
e.g. http://www.domain.com/style.css?version=1.1
This works for JavaScript, CSS and all other files your HTML page pulls in.
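In the HTML that looks like this (the file names are placeholders):

<link rel="stylesheet" href="style.css?version=1.1">
<script src="app.js?version=1.1"></script>

A common automated variant appends a build timestamp or a hash of the file's contents instead of a hand-bumped number.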
Also, for all files like HTML, CSS and JS, you can use the ETag header. More information can be found here:
http://www.w3.org/2005/MWI/BPWG/techs/CachingWithETag.html
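The exchange looks roughly like this (header values are illustrative): the server tags its response, and on the next visit the browser asks whether its cached copy is still valid:

HTTP/1.1 200 OK
ETag: "abc123"

GET /style.css HTTP/1.1
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified

When the file changes, its ETag changes, the If-None-Match check fails, and the browser downloads the fresh version.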
OK, so I've got this add-on where I'm trying to load a bitmap from a file:/// URI and draw it to a canvas.
That all goes fine until I need to get the data off the canvas using getImageData, at which point I run into a security exception. I went to Moz Chat and was told that because I'm loading the image from a page-modded HTML file, it's a cross-domain policy issue and not allowed.
The solution, they say, is to go to the main module and load the image there, copy it to a canvas, then serialize the data with getImageData and send it back to the HTML doc.
One problem: Jetpack doesn't know what "Image" is, because it doesn't have an HTML DOM, so the operation seems to be more or less impossible.
Why is this a cross-domain policy issue in the first place? And beyond that, how do I load the image without access to the DOM?
The simplest and best example of communication between the main.js module and a content script is in the Add-on SDK docs: look for the section titled Communicating With Content Scripts.
Basically, this is how the main module tells the content script (the pagemod in your case) something:
worker.port.emit("getElements", tag);
and this is how it listens to whatever the content script tells it:
worker.port.on("gotElement", function(elementContent) {
console.log(elementContent);
});
On the other side, the content script listens to what the main module says to it this way:
self.port.on("getElements", ...
And finally, one case this example is missing is how the content script can emit an event to tell the main module something:
self.port.emit("myCustomEvent", someValue, otherValue /* , ... */);
But that's the idea. I also recommend you take a careful look at this more general explanation of how content scripts (pagemods, widgets, tabs, panels, etc.) work, because this is the most important concept for understanding how SDK add-ons work.
Lastly, you can read more here about the cross-domain issue in content scripts.
I'm totally stumped here, so any ideas would be appreciated.
I have a RichFaces application, which recently became non-functional when used from IE6. The problem began when I included the following line in my main template:
<a4j:loadScript src="resource://jquery.js"/>
This results in the following generated HTML:
<script src="/AgriShare/a4j/g/3_3_3.Finaljquery.js.jsf" type="text/javascript"></script>
By "non-functional" I mean that pages no longer load, b/c the first page appears to hang the browser for a long time, and then all references to jQuery say that the object was not defined. Eventually this appears to put IE6 in a state where further clicks do nothing.
After a lot of trial and error I have established the following:
The app still works in Chrome, Firefox and IE8
The app still works in IE6 if I switch to HTTP. So the problem appears to be related to HTTPS, which I can't do without.
I further narrowed down the problem by manually requesting 3_3_3.Finaljquery.js.jsf in the IE6 address bar. It asks me if I want to save the file (so it can see the file is there), but when I say 'Save', it hangs for about 5 seconds and then says:
Internet Explorer cannot download 3_3_3.Finaljquery.js.jsf from [host_name].
The connection with the server was reset.
Doing the same download over HTTP succeeds.
Gradually reducing the size of the file, I noticed that the download eventually succeeds over HTTPS if I get the file size below roughly 110 KB, though there is no specific size at which it starts working. I tried the same trick with prototype.js and it worked at a different size value.
I can't trace the SSL session, because I cannot get access to the certificate's private key, so now I have absolutely no clue what to try next.
Any ideas would be greatly appreciated.
Try using Fiddler for debugging. It can handle SSL.
You might also want to consider hosting the server yourself and taking a look at the server log.
The problem was solved by turning off compression of JavaScript files in Web Cache.
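For anyone on plain Apache rather than Web Cache, the usual equivalent workaround (an illustration, not the poster's actual configuration) is to exempt old IE from compression via mod_setenvif:

<IfModule mod_deflate.c>
  # IE6 mishandles gzip-compressed responses, especially over SSL
  BrowserMatch "MSIE 6" no-gzip
</IfModule>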
Sounds like the problem might be related to this: http://support.microsoft.com/default.aspx?scid=kb;en-us;327286