How to apply PageSpeed Insights results - performance

Well, I'm trying to build a web page with WordPress and GoDaddy hosting. I want to make a fast page, because people say fast pages appear at the top of Google results (especially on mobile, where page speed is said to matter most). So I want a very fast web page, but my knowledge is not very advanced; I learn as I go.
When I test my page with PageSpeed Insights, my mobile score is about 60-70, and the report lists many suggested improvements below the score. I want to learn how to fix them. If you help me with one example, I will do the others myself.
Let's start with the first problem, /css?family=… (fonts.googleapis.com), which appears under the "Eliminate render-blocking resources" audit. How do I fix it? What should I do?
Also, in the Coverage tab, some source files are shown that are largely unused. For example, I am not using the easy-share plugin (second row in the image) on the homepage.
How do I safely remove that code from the home page? If I can learn how one fix is done, I can correct the others myself.

The issue you are running into is something I have seen over and over again: GoDaddy and WordPress sites are generally bloated and perform poorly.
Here are some tips to improve your speed and get a better PageSpeed score.
Hosting: Do you need to be on GoDaddy? Most websites on GoDaddy are slow. GoDaddy is good for domain registration, not for hosting, and most non-technical folks don't know any better. Try Amazon Lightsail, AWS S3, Google Firebase, or Netlify. They all offer much faster page loads by reducing initial server response time, and they are surprisingly simple to learn and deploy.
CDN: You should use a content delivery network (CDN). Check out CloudFront; it offers a free tier that works quite well.
WordPress: This is your real issue. WordPress is neither easy to build with nor easy to maintain, and you need multiple plugins to make the site perform. If you can, build your own site; if you have to stay on WordPress, check out image optimizers, minifiers, and caching plugins. Gumlet, WP Rocket, and ShortPixel are quite popular for improving speed.
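To answer the specific render-blocking question: the usual fix for the Google Fonts /css?family=… request is to preconnect to the font hosts and load the stylesheet without blocking the first paint. A minimal sketch follows; "Roboto" is only a stand-in for whatever family your theme actually requests:

    <!-- Open connections to the font hosts early -->
    <link rel="preconnect" href="https://fonts.googleapis.com">
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

    <!-- media="print" keeps the stylesheet from blocking rendering;
         onload switches it to all media once it has downloaded -->
    <link rel="stylesheet"
          href="https://fonts.googleapis.com/css?family=Roboto&display=swap"
          media="print" onload="this.media='all'">

    <!-- Fallback for visitors with JavaScript disabled -->
    <noscript>
      <link rel="stylesheet"
            href="https://fonts.googleapis.com/css?family=Roboto&display=swap">
    </noscript>

The display=swap parameter shows text in a fallback font while the web font downloads, which also helps the score.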
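As for the Coverage tab: the WordPress-safe way to keep an unused plugin's files off a page is to dequeue them from your child theme's functions.php rather than editing the plugin. A sketch, assuming the plugin registers its assets under an 'easy-share' handle (check the id attributes on the <link> and <script> tags in your page source for the real handle):

    // Stop a plugin's CSS/JS from loading on the homepage only.
    // 'easy-share' is a guessed handle -- replace it with the real one.
    add_action( 'wp_enqueue_scripts', function () {
        if ( is_front_page() ) {
            wp_dequeue_style( 'easy-share' );   // the plugin's stylesheet
            wp_dequeue_script( 'easy-share' );  // the plugin's script
        }
    }, 100 ); // late priority, so the plugin has already enqueued its assets

Repeat the same pattern, with different conditionals, for the other rows in the Coverage report.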

Related

Website creation query

I need to create a website which stores the list of all games a player has played and shows it right on his profile. As the player completes a game, he adds it to his list.
So I would need a basic login configuration, and then, using AJAX, I will populate the list of games he wants to add, so that he can track the games he has played.
So now I need suggestions on how to go about it.
How do I start building?
Which language do I need to pick up?
I am well versed in Java and J2EE.
Is this enough?
Also, I am a freelancer, so I can't afford to pay for a website. Is there any free website hosting service which will help me build the website I have in mind?
Also, if I use a free hosting service, will they provide me with a database and AJAX capabilities?
Here's the basic setup:
You need a domain first. Try to pick something unique, as it will be cheaper. You can find one on Namecheap: https://www.namecheap.com
You need hosting. Again, you can go with Namecheap.
To start building, you need to learn some HTML and CSS. HTML is the markup language of the web, and CSS is its stylesheet language. They aren't hard languages to start off in. You can learn them for free at Khan Academy: https://www.khanacademy.org/computing/computer-programming/html-css
I believe Namecheap offers database support as well. AJAX isn't something a hosting service provides; it's a technique where JavaScript in the browser requests data from your server without reloading the page, so any host that runs your backend and serves JavaScript supports it.
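Since you already know Java/J2EE for the backend, the browser side is small. A rough sketch of the AJAX piece; the /api/games endpoint, the JSON shape, and the game-list element are hypothetical placeholders your backend would define:

    // Send a completed game to the server, then re-render the profile list.
    // Endpoint and field names are placeholders for your own API.
    async function addGame(title) {
      const response = await fetch('/api/games', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ title: title })
      });
      const games = await response.json();   // server returns the updated list
      const list = document.getElementById('game-list');
      list.innerHTML = '';
      for (const game of games) {
        const item = document.createElement('li');
        item.textContent = game.title;
        list.appendChild(item);
      }
    }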
This should get you going. I can't really give you more detailed information than this because your question is really broad. If you Google your questions, you'll get good answers and guides.
Best of luck.

What's the best SEO practice when you do an AJAX driven website?

I have encountered several websites built with AJAX, and their SEO seems pretty bad. Does Google really crawl websites like that?
Optimization guides for various search engines say that bots are unable to crawl such sites. Google's bots might use Chrome's engine for some purposes (I remember they took site screenshots at one time), but nevertheless, it's the static HTML that matters. The usual practice is therefore to generate valid HTML on the server, so the site works even for user agents like Lynx, and then enhance it with AJAX, the History API, and all the other imaginable bells and whistles.
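In other words, progressive enhancement: the server renders a complete page, and JavaScript upgrades navigation on top of it. A rough sketch, assuming a data-ajax attribute on enhanced links and a #content container present in every server-rendered page:

    // Links work as ordinary server-rendered pages; JavaScript upgrades
    // them to AJAX navigation with a crawlable, shareable URL.
    document.addEventListener('click', async (event) => {
      const link = event.target.closest('a[data-ajax]');
      if (!link) return;
      event.preventDefault();
      const response = await fetch(link.href);           // full HTML from the server
      const doc = new DOMParser()
          .parseFromString(await response.text(), 'text/html');
      document.querySelector('#content').innerHTML =
          doc.querySelector('#content').innerHTML;       // swap only the page body
      history.pushState(null, '', link.href);            // keep the real URL in the bar
    });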

Extremely clever URL system (how to make advanced SEO URLs)

I am building an intensive web application, and currently all my URLs are in page.php?action=string format. Don't worry, we have a fallback plan to change all pages quickly to SEO-friendly URLs via a config file.
I want to know two things. First, what script is running this site:
http://lookbook.nu/ (also http://stackoverflow.com)
If you just look at it (hover over areas, crazy AJAX calls, so many subdomain calls, so many clean URLs), what would be the best approach to do this? Is this a RoR thing? All the URLs are so clean and structured. It really impressed me.
I am not looking for an .htaccess solution, as I am using nginx.
Stack Overflow actually runs on ASP.NET MVC, but URL rewriting is built into Apache too, if that's your thing. No clue about nginx, though.
Edit: a simple Google search turned up http://wiki.nginx.org/HttpRewriteModule, so you're in luck!
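For what it's worth, here is a rough nginx sketch of the fallback plan described in the question: serve a clean URL and rewrite it internally to the existing page.php?action=… scheme. The server name and URL pattern are illustrative, and a standard PHP FastCGI block is assumed elsewhere in the config:

    # /products is rewritten internally to /page.php?action=products
    server {
        listen 80;
        server_name example.com;

        location / {
            # Serve a real file if one exists, otherwise try the rewrite.
            try_files $uri $uri/ @clean_urls;
        }

        location @clean_urls {
            rewrite ^/([a-z0-9-]+)/?$ /page.php?action=$1 last;
        }
    }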

When trying to integrate one website with another what is the way to go? Iframe or pulling content?

My company has multiple vendors that all have their own websites. I am creating a website that acts as a dashboard where customers can access all of the vendors' sites. I want to know: what is the best option for doing this?
Here's what I have so far:
Iframe:
- Can bring in the entire website
- Seems secure enough (not sure if I'm missing any information on security issues here)
- Users can interact with the vendor's website through our site
- Our website cannot fully interact with the vendor's website (may also be missing info here)
Pulling in the content:
- Can bring in the entire website
- Not very secure, from what I hear (some sites actually say that pulling another website in is a security violation and will alert the user of this, or something similar)
- Users can interact with the vendor's website through our site
- Our website can fully interact with the vendor's website
Does anyone have any other options?
What are some of the downsides to bringing in a site with an iframe, and is this really our only option for something like this?
Optimally, we would like to pull their site into ours without using an iframe. What options do we have on that level? Is there anything better than an iframe?
Please add as much information as you can about iframes, pulling content, security, and website interactions like this. Anything you can add is appreciated.
Thanks,
Matt
As far as "pulling content" is concerned I wouldn't advise it as it can break. All it takes is a simple HTML change on their end and your bot will break. Also, it's more work than you think to do this for one site, let alone the many that you speak of. However, there are 3rd party apps that can do this for you if you have the budget.
You could use an iframe/frames, however, many sites might try to bust out of them and it can ruin the user experience of the site within the frame.
My advice is to use the following HTML for each link in your dashboard (the URL is a placeholder for each vendor's address):

    <a href="https://vendor.example.com" target="_blank">Vendor Site Link</a>
If you can have the sites that you are embedding add some client-side script, then you could use easyXSS. It allows for easy transfer of data, and also for calling JavaScript methods across the domain boundary.
I would recommend iframes. Whilst not the most glamorous of elements, many payment service providers use iframes for their Verified by Visa / MasterCard SecureCode integration.
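For completeness, a minimal embed looks like this; the src and dimensions are placeholders to adapt per vendor:

    <!-- Minimal vendor embed; attribute values are illustrative -->
    <iframe src="https://vendor.example.com/"
            width="100%" height="600"
            title="Vendor dashboard view"></iframe>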

Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure website. Because there are a lot of CSS images that need to be loaded, the site can be slow, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is moving style-related images and JavaScript libraries to a non-secure subdomain, so that the browser could cache those resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people think about doing something like this. Is this a feasible idea, or should I optimize my site in other ways, like using CSS sprite maps to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always select "show mixed content". As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL overhead is the only thing left, buy a hardware accelerator. Hmm, if you load an image over HTTP, will it be served from cache when you load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all of the content of your pages properly. Not that I endorse IE 7, but I recently had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do it before due to a lack of pragmatic options. Frankly, there are just as many defects with that approach, if not more; after all, you're hoping that nothing violates the integrity of the insecure content, so that it hosts the secure content and not some alternate content.
It's just not a great idea from a security perspective.
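One footnote on the asset-splitting idea: if static files do end up on a separate host, a scheme-relative URL keeps them on whatever protocol the page itself uses, so secure pages never trigger the mixed-content warnings described above (the static subdomain here is hypothetical):

    <!-- Resolves to https:// on secure pages and http:// elsewhere -->
    <link rel="stylesheet" href="//static.example.com/styles.css">
    <img src="//static.example.com/gradient.png" alt="">

This means the speed gain has to come from fewer requests (sprites) and proper cache headers rather than from dropping SSL, which is exactly what the answers above recommend.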
