Do I need to resubmit my sitemap after updating it?

I made some updates to my sitemap. When I first submitted it, all the links looked like this:
http://www.www.bagreviewsguru.com/
and I changed them to
http://www.bagreviewsguru.com/
because there were warnings saying the URLs are not accessible due to the double www. So do I need to resubmit the sitemap to Google Search Console?

It may speed things up, according to this blog post: http://www.lauradhamilton.com/resubmitting-your-sitemap-to-google. However, it's not guaranteed.
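If you want to nudge Google rather than wait, you can resubmit in Search Console (Sitemaps, then re-enter the sitemap URL) or ping Google directly. A minimal PHP sketch of the ping, assuming the sitemap lives at /sitemap.xml (adjust to your real path):

<?php
// Ping Google with the corrected sitemap URL so it re-fetches it.
// The sitemap path below is an assumption; change it to your actual file.
$sitemap = 'http://www.bagreviewsguru.com/sitemap.xml';
$ping    = 'http://www.google.com/ping?sitemap=' . urlencode( $sitemap );
file_get_contents( $ping ); // an HTTP 200 response means the ping was received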

Related

Endless spinning wheel at WooCommerce checkout

I am in the process of building a shop on WordPress using WooCommerce. All is well except that at checkout there is an endless spinning wheel blocking the payment processing.
I have checked the error logs, which show no errors.
I have disabled all plugins, which makes no difference.
I have reverted to the default WooCommerce Storefront theme, which has the same result.
The console shows no errors.
I have also followed the instructions here https://docs.woocommerce.com/document/endless-loadingspinner-on-the-checkout-page/, which assume it's a memory limit issue. This did not work.
So, after a bit of digging, I found some references here https://mikejolley.com/2015/11/12/debugging-unexpected-token-in-woocommerce-2-4/ and here https://www.maxsangster.com/blog/woocommerce-endless-loading-spinner-on-checkout/
Referring to these two pages, I have been able to see that the JSON response from /?wc-ajax=checkout is just returning HTML, which I would imagine is where the issue is.
However, I am running an Apache server rather than Nginx, as has been mentioned in some threads and articles. Assuming there is a server misconfiguration or something that needs changing, what might that be? Bear in mind that I do not have direct access to the server, so I will need to ask someone else to sort it out for me.
And finally, if there is something else I can try, what might that be?
Thank you to @plushyObject for generating the spark for this one.
The issue turned out to be that I had a legacy static HTML holding page in place as the site's homepage, rather than one set up within WordPress. Simply removing the .html page and letting WordPress's homepage take over solved the problem.
The moral of the story: create your holding page in WordPress.
Go to Chrome's Developer Tools and open the Network tab. Click the checkout button to make the request, and let that bad boy spin out.
You mentioned the response is returning HTML. Click on the request that appears to be taking forever (/?wc-ajax=checkout), then click on the Preview tab to display that HTML. I bet it shows an error, or at least a clue.
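For what it's worth, a very common reason for HTML turning up inside that JSON response is PHP notices being printed into the output, which is what Mike Jolley's post linked above describes. If that's what the Preview tab shows, a minimal wp-config.php tweak (assuming you can edit it) sends errors to a log file instead of the response:

<?php
// wp-config.php excerpt: log notices instead of printing them, since any
// stray output corrupts the JSON that /?wc-ajax=checkout returns.
define( 'WP_DEBUG', true );
define( 'WP_DEBUG_LOG', true );    // writes to wp-content/debug.log
define( 'WP_DEBUG_DISPLAY', false );
@ini_set( 'display_errors', 0 );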
In my case the checkout was working fine on desktop but not on mobile. After a lot of searching on the internet, I read the solution from @UntitledGraphic.
I had set a redirect in .htaccess. The redirect was showing a different home page for mobile. When I removed it, the error was gone. I checked back and forth and was sure the .htaccess redirect was the problem.
If you have set any redirect in .htaccess, remove it. That solved the issue in this case.
I also tried the redirect code in functions.php instead of .htaccess, and the problem appeared again, so I finally removed the redirection entirely.
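If a mobile redirect is genuinely needed, one way to keep it without breaking checkout is to make it skip WooCommerce's AJAX endpoint. A rough functions.php sketch; the /mobile/ target is made up:

<?php
// functions.php sketch: redirect mobile visitors on the front page only,
// and never touch WooCommerce AJAX requests (?wc-ajax=...).
add_action( 'template_redirect', function () {
    if ( wp_is_mobile() && is_front_page() && ! isset( $_GET['wc-ajax'] ) ) {
        wp_redirect( home_url( '/mobile/' ) ); // hypothetical mobile page
        exit;
    }
} );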
In my own case the redirection was the issue, so I had to delete the redirect I had created via cPanel.

Google search does not show my website

I already added two versions, one with www and one without, both pointing to the same location, and I set the non-www version as the preferred one. I added the sitemap only to the non-www version, and it contains all the links, yet the sitemap index status is still Pending. I have an article that is current and relevant, but when I search for its keyword in Google I can go all the way to the last results page and my website still doesn't appear, while websites that are not even related to the keyword do. It's a bit frustrating, because I am confident my page is relevant and properly created, yet Google still doesn't show it in the search results :(
Is there anything I still need to do?
It takes a couple of days for Google to crawl through and update the results after each change you make. Wait a few days and see what happens.

URL Re-writes and Google Indexing

I was asked to perform some URL rewrites for a new site with numerous dynamic pages, and this has all worked fine.
However, when I look at the URLs that Google has indexed, it has indexed the non-rewritten URLs, so all the '?', '&', etc. are still being used.
What do you have to do to force Google to index your re-written URLs?
I just assumed it would do this automatically and never expected it to be an issue.
All help is gratefully appreciated.
Thanks.
Steps
1) Make sure that expired pages are no longer publicly accessible
2) Anything you do not wish bots to index should be flagged with an appropriate "noindex" robots meta tag
3) Submit a new sitemap through your Google Webmaster Tools account
4) Make sure your website returns a 404 status when a page isn't found. It is always a good idea to make a splash page for 404 errors that links back to your home page. (This is accomplished in different ways across different server-side languages.)
Google will automatically remove indexed pages if they no longer exist, so be patient.
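One step worth adding: since the old dynamic URLs and the rewritten ones serve the same content, a permanent 301 redirect from old to new is the standard way to get Google to swap its indexed entries rather than waiting for the old ones to drop out. A PHP sketch with made-up names (product.php and its id parameter), placed at the top of the old dynamic script:

<?php
// Hypothetical example: permanently redirect product.php?id=42 to its
// rewritten form /product/42 so search engines index the clean URL.
if ( isset( $_GET['id'] ) ) {
    header( 'Location: /product/' . (int) $_GET['id'], true, 301 );
    exit;
}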

How to clear Linkedin Share cache?

I have a new description on the page, but when I share the page LinkedIn still uses the old description, which no longer exists. I am after something similar to the Facebook Lint tool.
Any ideas?
You can append a dummy query string value to your URL to make it look like a new URL, and LinkedIn will fetch it again. I've tried it and it works.
For example:
https://www.codeproof.com/?refid=LinkedIn
where refid=LinkedIn is just a dummy value.
If your URL already contains a query string, just append "&refid=LinkedIn" to the end of the URL.
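A tiny helper capturing that rule, picking '?' or '&' depending on whether the URL already has a query string (the refid value is arbitrary):

<?php
// Append a dummy parameter so LinkedIn treats the URL as new.
function linkedin_cache_bust( $url ) {
    $separator = ( strpos( $url, '?' ) === false ) ? '?' : '&';
    return $url . $separator . 'refid=LinkedIn';
}
echo linkedin_cache_bust( 'https://www.codeproof.com/' );
// prints: https://www.codeproof.com/?refid=LinkedIn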
Unfortunately, appending a query string to the URL no longer works.
From the following StackOverflow post:
LinkedIn's content cache presently stores website information for approximately 7 days before the crawler will revisit the site.
It looks like there is no instant way to clear the cache other than to wait seven days, remove the media, and re-add it.
Appending a query string to the URL no longer works, so you'll have to wait 7 days.
But if you really need to share your URL with the media you want, you'll have to go with a custom API call.
From the LinkedIn developer docs:
The first time that LinkedIn's crawlers visit a webpage when asked to share content via a URL, the data it finds (Open Graph values or our own analysis) will be cached for a period of approximately 7 days.
This means that if you subsequently change the article's description, upload a new image, fix a typo in the title, etc., you will not see the change represented during any subsequent attempts to share the page until the cache has expired and the crawler is forced to revisit the page to retrieve fresh content.
If you make API calls that directly provide the content to be shared rather than by a URL that requires analysis, LinkedIn will always use the values you provide.
Step 1: Visit https://www.linkedin.com/post-inspector/
Step 2: Enter your URL and click Inspect. You will see the updated preview image.
Step 3: Now try sharing your URL on LinkedIn
I've just found a way to force LinkedIn to fetch a fresh version of the page: create a redirect to your destination page and share the redirect page instead.
For example:
If your page that you want to share is: http://stackoverflow.com
Create a redirect from a page like https://stackoverflow.com/share-li to http://stackoverflow.com
Then share https://stackoverflow.com/share-li on LinkedIn. LinkedIn will think it's a new page and fetch a fresh version.
It's easy to do if you're using WordPress: just install a redirection plugin, for example https://wordpress.org/plugins/redirection/
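If you'd rather not install a plugin, a minimal functions.php sketch does the same thing (reusing the /share-li slug from the example above):

<?php
// functions.php sketch: send the /share-li alias to the real page so
// LinkedIn sees what looks like a brand-new URL.
add_action( 'template_redirect', function () {
    $path = trim( strtok( $_SERVER['REQUEST_URI'], '?' ), '/' );
    if ( 'share-li' === $path ) {
        wp_redirect( 'http://stackoverflow.com', 302 );
        exit;
    }
} );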
For WordPress, these steps worked for me:
On the home page I removed the featured image and added it as a plain image in the header of the page.
I created a redirect page on my blog (e.g. mydomain.com/social) that redirects all requests to my blog (mydomain.com).
Share the blog again on the social networks and everything will be OK.
It's done =D
Unfortunately, there is none as of now. We are investigating what it would take to expose a similar feature. Please stay tuned. We'll announce it on the developer site at http://developer.linkedin.com.

How to force a page to be removed from the search engine index?

Situation: Google has indexed a page in a forum. The thread has since been deleted. How (and whether) can I make Google and other search engines delete the cached copy? I doubt they would have anything against that, since the linked page no longer exists and keeping the index updated and valid should be in their best interest.
Is this possible or do I have to wait months for an index update? Or will the page now stay there forever?
I am not the owner of the site in question, so I can't change its robots.txt, for example. I would like to force the update as a third party.
I also noticed that a new page I created on that site two days ago is already in the cache. Given that, can I estimate how long it will take for an invalid page on this domain to be dropped?
EDIT: So I did the test. It took Google just under two months to drop the page. Quite a long time...
It's damn near impossible to get it removed. However, replacing the page with entirely blank content will ensure that you nuke its ranking when it is re-spidered.
You can't really make Google delete anything, except perhaps in extreme circumstances. You can adjust your robots.txt file to promote a revisit interval that might update things sooner, but if it is a low traffic site, you might not get a revisit very soon.
EDIT:
Since you are not the site owner, the remaining option on the page itself is the "revisit-after" meta tag, as discussed here, if you can get it added.
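For reference, if you can get code onto the page (a WordPress sketch here, since that's the stack in the rest of this thread), the tag looks like this. Note that major engines treat revisit-after as a hint at best:

<?php
// Emit the "revisit-after" hint in the <head>. Most crawlers ignore it,
// so don't expect it to force a faster re-crawl on its own.
add_action( 'wp_head', function () {
    echo '<meta name="revisit-after" content="7 days">' . "\n";
} );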
You can't make search engines remove the link, but don't worry: it will be removed soon, since the page is no longer active. You shouldn't have to wait months for this to happen.
If your site is registered with Google Webmaster Tools, you can request to have pages removed from the index. It works; I tried and used it in the past.
EDIT: Since you are not the owner, I am afraid this solution will not work.
