Does Google give back its former PageRank to a page that returned a 301 for some time?

I made a terrible mistake.
Two months ago, I put a 301 permanent redirect on a PHP page that serves many old pages of my website.
But those pages were highly SEO-friendly and had a good PageRank.
Today I removed the 301 redirect.
Will these pages get back their old PageRank, or will it be reset?
Thanks.

If Google still finds those pages interesting, then yes, you will recover your rankings. However, since these pages were not ranked any more, others may have replaced them, and users may have found those more interesting in the meantime. There may also have been ranking algorithm changes affecting your pages.
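For reference, issuing (and later removing) a 301 from a PHP page generally looks like the sketch below; the actual code isn't shown in the question, so the target URL here is just a placeholder.

<?php
// Hypothetical sketch: what the temporary redirect on the PHP page probably looked like.
// While these lines were in place, the old URLs answered with a 301 instead of their content.
header('Location: https://example.com/some-new-page.php', true, 301);
exit;
// Removing the redirect simply means deleting the lines above, so the page
// responds with a normal 200 and its original content again.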

Related

Is it possible to auto-update a WordPress loop if the contents of the loop have changed?

What I want to be able to do is create a page that lists all the posts of a custom post type, so that if someone creates a new post, anyone viewing the listing page sees the new post appear in real time.
I've done some research, and it looks like what I want is a 'persistent client/server connection', something like Meteor, Ajax, or WebSockets? (I've read a bit about those technologies and I'm still a little fuzzy on whether they do what I'm looking for.)
My current solution is to blindly refresh the post listing page for visitors every 60 seconds, but it's not very real-time, and it causes increased bandwidth usage to reload the entire page every minute even if there are no new changes to the WordPress custom posts database.
Can anyone point me in the right direction? I'm happy to clarify anything that I may have been vague about.
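For what it's worth, a lighter-weight version of the 60-second polling described above can fetch only new posts through WordPress's admin-ajax hooks instead of reloading the whole page. A minimal sketch, assuming a custom post type called 'news' and a client that remembers the newest post ID it has already rendered:

<?php
// Sketch (assumed post type 'news'): an AJAX endpoint that returns only posts
// newer than the ID the client already has.
function myprefix_poll_new_posts() {
    $last_id = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;

    $posts = get_posts(array(
        'post_type'      => 'news',
        'posts_per_page' => 10,
    ));

    $fresh = array();
    foreach ($posts as $post) {
        if ($post->ID > $last_id) {
            $fresh[] = array(
                'id'    => $post->ID,
                'title' => get_the_title($post),
                'link'  => get_permalink($post),
            );
        }
    }
    wp_send_json($fresh); // sends JSON and exits
}
add_action('wp_ajax_poll_new_posts', 'myprefix_poll_new_posts');
add_action('wp_ajax_nopriv_poll_new_posts', 'myprefix_poll_new_posts');

The listing page can then request admin-ajax.php?action=poll_new_posts&last_id=... every minute and append only what comes back, which keeps the bandwidth small; for true push updates you would still need something like WebSockets or a hosted push service.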

How does Varnish deal with dynamic content?

I am studying up on caching and looking into Varnish. I am wondering, though: how does Varnish deal with dynamically generated content?
All over the place people say you shouldn't really cache content that might change a lot, but on the other hand, when I look at the response headers for Stack Overflow, I see pages being served up via Varnish.
Content there changes by the second, so how does this even work? Excuse me if it's a bit of a simple question; I will research some more while this question is up.
You need to define "dynamic":
If the content depends on the user (through cookies, for example), it should not be cached: you'll have lots of different contents and your HIT/MISS ratio will not be high, since every user gets different content.
If the content changes over time, you can always cache it a little, for example a few seconds (see the sketch after this answer).
If the content changes over time, a better option is to separate the static content from the dynamic content. You may cache the page template and do Ajax calls to refresh the content. You may also use ESI; it's an old technology, but it lets you specify different "zones" in your pages, each with its own cache duration.
You can benefit from IMS (If-Modified-Since) requests. Telling the backend to send the response body only if it has changed since the last request can save you lots of processing time; I think Varnish does this from version 4.
As for the Stack Overflow architecture, you can learn a lot from Nick Craver's blog post about it: http://nickcraver.com/blog/2016/02/17/stack-overflow-the-architecture-2016-edition/
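As a small illustration of the "cache it a little" point above: by default Varnish takes its TTL from the Cache-Control header sent by the backend, so a PHP page can opt into a few seconds of caching without any VCL changes. A minimal sketch (the timings are just examples):

<?php
// Illustrative micro-caching: let Varnish cache this response for 5 seconds.
// Varnish derives the TTL from Cache-Control: s-maxage (then max-age), so a
// 5-second TTL means PHP runs at most once every 5 seconds per URL,
// no matter how many clients request it.
header('Cache-Control: public, s-maxage=5, max-age=0');
// Sending Last-Modified as well lets Varnish 4+ revalidate with a conditional
// (If-Modified-Since) backend request instead of refetching the whole body.
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime(__FILE__)) . ' GMT');
echo 'Generated at ' . date('H:i:s');

Even a one-to-five-second TTL means a page hit hundreds of times per second only reaches the application once per interval, which is essentially how fast-changing pages can still sit behind a cache.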

What are the correct Google Webmaster URL Parameter settings for Pagination and Sorting?

We have about 4,000 products in a store, but Google is monitoring 31,000 URLs. To try to reduce the number of indexed pages, I have restricted the URL parameters in the Webmaster Tools admin to:
LIMIT = All
P = 1
DIR = Asc
I'm thinking this will reduce the indexed category pages to one instance of each category.
The site has canonical URL tags defined, a comprehensive robots.txt and a daily updated sitemap.
Is this acceptable, or is it better to "Let Google decide"?
In general, if you are comfortable with what we decide is better for your site, then it's better to leave it to us. That said, if Googlebot is creating problems with crawling irrelevant or duplicate URLs, then of course, go ahead and create the filters.
Other than that, it's often much better to just disallow crawling of those URLs if you can. That's less error-prone, and it will decrease the traffic to the roboted pages for good.
And finally, to actually answer your question, that setup looks good to me, although without knowing the exact URLs it's hard to say for sure.
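To illustrate the "disallow crawling" option above: blocking the sort and limit parameters in robots.txt could look something like this (Googlebot supports the * wildcard; match the casing and parameter names actually used in your URLs, which are assumed here):

User-agent: *
Disallow: /*?dir=
Disallow: /*&dir=
Disallow: /*?limit=
Disallow: /*&limit=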

Caching snippets (MODX)

I was simply cruising through the MODX options and I noticed the option to cache snippets. I was wondering what kind of effect this would have (downsides) on my site. I know that caching improves the loading time of the site by keeping pages 'cached' after the first time and then only reloading the updates, but this all seems too good to be true. My question is simple: are there any downsides to caching snippets? Cheers, Marco.
Great question!
The first rule of MODX is (almost) always cache. They've said so in their own blog.
As you said, the loading time will be lower. Let's cover the basics first. When you choose to cache a page, the page with all its output is stored as a file in your cache folder. If you have a small and simple site, you might not see much difference between caching and not caching, but if you have a complex one with lots of chunks-in-chunks, snippets parsing chunks, etc., the difference is enormous. Some of the websites I've made go down 15-30 levels to parse the content in some sections. Loading all of this fresh from the database can take up to a couple of seconds, while loading a flat file takes only a few microseconds. There is a HUGE difference (remember that).
Now, you can cache both snippets and chunks; that's important to remember. You can also cache one chunk while leaving the next level uncached. Using MODX's brilliant markup, you can choose what to cache and what to leave uncached, but in general you want as much cached as possible.
You ask about the downsides. There are none, but there are a few cases where you can't use cached snippets/chunks. As mentioned earlier, the cache is stored per page. That means that if you have a page (or URL, or whatever you want to call it) where you display different content based on, for example, GET parameters, you can't cache the output: you can't cache a search result (because the content changes) or a page with pagination (?page=1, ?page=2, etc. would produce different output for the same page). Another case is when a snippet's output is random or different every time. Say you put a random quote in your header; this needs to be uncached, or you will just see the first random result every time. In all other cases, use caching.
Also remember that every time you save a change in the manager, the cache is wiped. That means that if you, for example, display the latest news articles on your front page, this can still be cached, because it will not display different content until you add or edit a resource, and at that point the cache is cleared.
To sum it all up: caching is GREAT and you should use it as much as possible. I usually make all my snippets/chunks cached, and if I run into problems, that is the first thing I check.
Using caching makes your web server respond quicker (good for the user) and produces fewer queries to the database (good for you). All in all, caching is a gift. Use it.
There are no downsides to caching, and honestly I wonder what made you think there were.
You should always cache everything you can: there's no point in having something executed on every page load when the result is exactly the same as before. By caching the output and the source, you bypass the need for processing time and improve performance.
Assuming MODX Revolution (2.x), all template tags you use can be called both cached and uncached.
Cached:
[[*pagetitle]]
[[snippet]]
[[$chunk]]
[[+placeholder]]
[[%lexicon]]
Uncached:
[[!*pagetitle]] - this is pointless
[[!snippet]]
[[!$chunk]]
[[!+placeholder]]
[[!%lexicon]]
In MODX Evolution (1.x) the tags are different and you don't have as much control.
Some time ago I wrote about caching in MODX Revolution on my blog and I strongly encourage you to check it out as it provides more insight into why and how to use caching effectively: https://www.markhamstra.com/modx/2011/10/caching-guidelines-for-modx-revolution/
(PS: If you have MODX specific questions, I'd suggest posting them on forums.modx.com - there's a larger MODX audience there that can help)

mod_rewrite vs. PHP parsing

I'm working on a little project of mine and need your help deciding whether mod_rewrite or parsing each URL in PHP is more performance-friendly.
URLs will almost always have a fixed pattern; very few will have a different pattern.
For instance, most URLs would be like so:
dot.com/resource
Some others would be:
dot.com/other/resource
I expect around 1,000 visitors a day to the site. Will server load be an issue?
Intuitively, I think mod_rewrite would work better, but just for peace of mind, I'd like input from you guys. If anyone has carried out any tests or can point me towards some, I'd be obliged.
Thanks.
You may want to check out the following Stack Overflow post:
Any negative impacts when using Mod-Rewrite?
Quoting the accepted answer:
I've used mod_rewrite on sites that get millions of hits per month without any significant performance issues. You do have to know which rewrites get applied first, depending on your rules.
Using mod_rewrite is most likely faster than parsing the URL with your current language.
If you are really worried about performance, don't use htaccess files, those are slow. Put all your rewrite rules in your Apache config, which is only read once on startup. htaccess files get re-parsed on every request, along with every htaccess file in parent folders.
To add my own two cents: mod_rewrite is definitely capable of handling 1,000 visitors per day.
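For comparison, the "parse it in PHP" alternative usually means routing every request to one front controller and splitting the path there. A minimal sketch of that approach (the function names are placeholders, and the catch-all rule mentioned in the comment is just one way to route requests to the script):

<?php
// index.php - minimal front-controller sketch.
// Assumes the server sends all requests here, e.g. via a single catch-all rule
// such as "FallbackResource /index.php" in the Apache configuration.

function show_resource($name)       { echo "resource: $name"; }       // placeholder
function show_other_resource($name) { echo "other resource: $name"; } // placeholder

$path  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$parts = array_values(array_filter(explode('/', $path)));

if (count($parts) === 1) {
    show_resource($parts[0]);            // dot.com/resource
} elseif (count($parts) === 2 && $parts[0] === 'other') {
    show_other_resource($parts[1]);      // dot.com/other/resource
} else {
    http_response_code(404);
    echo 'Not found';
}

At around 1,000 visitors a day, either approach is far below any load that matters; the quoted advice about keeping rules in the main Apache config rather than .htaccess is what actually makes a measurable difference.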
