Googlebot Tired Message in Webmaster Tools
One of my clients got this message, but the pages are not duplicates, in my opinion. Thoughts? Is this hurtful for rankings, or is Google just being nice and warning us that the crawling might cause a spike in bandwidth fees?
Looks like some new blackhat link-page algo kicking in. What sort of page triggered this? Or was it the whole site?
What was the exact message displayed, by the way? Just flagging the pages as duplicate content? Some time ago I also experienced something strange, and later I found out it was caused by web server problems; the host wasn't as reliable as I thought.
(1) that was the exact message
(2) What is with all the signature links?
(3) Google responded at http://www.seroundtable.com/archives....html#comments
Oh I see, how did I miss that on the blog? Well, your point does make sense: if the pages were duplicates and insignificant, why would Google bother with all the communication when they could easily drop the pages, as they already do thousands of times every day?
Is the faster crawling option already turned on in Webmaster Tools? It probably is; setting it back to normal would be a good idea, I think, since the faster setting basically gives Google a flashing green light to swallow bandwidth.
By the way, sorry about the sig links; I got carried away while doing a thousand things online. Let me know how many are OK to have and I'll trim the rest immediately.
Yea, good points.
As for the signature, cleaning it up would be nice.
I have had a funny warning from Google Webmaster Tools myself recently, drawing my attention to the fact that my robots.txt is set to disallow affiliate landing pages. The irony is that those pages are now indexed, while the homepage now ranks between 200 and 800 for keywords where it was previously on the first page.
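If it helps anyone with the same puzzle: robots.txt blocks crawling, not indexing, so a disallowed URL can still end up in the index if other sites link to it. A minimal sketch (with hypothetical URLs, not the poster's actual setup) for checking what a rule really covers:

```python
# Minimal sketch: robots.txt controls crawling, not indexing.
# Hypothetical URLs; swap in your own.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

url = "https://example.com/affiliate/landing-page"
if not rp.can_fetch("Googlebot", url):
    # Googlebot never fetches this page, so it never sees any noindex
    # tag on it -- the bare URL can still be indexed from external links.
    print(url, "is blocked from crawling, but not from indexing")
```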
John Mu commented on this:
"I believe this is a fairly new message (but the problem is obviously old -- oh those endless calendar scripts...). The problem is that the Googlebot is wasting your server's resources by crawling URLs which most likely don't need to be crawled. There is a chance that we might be missing better content because of that, so we thought it would be good to let webmasters know about these issues and give them a chance to direct us to greener pastures."
Google clearly states in their Webmaster Technical Guidelines:
"Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead."
So I don't see anything new or strange here.
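If you want to check whether your own server honours that header, here is a minimal sketch (a toy handler with a hard-coded modification time, not production code) of the handshake the guideline describes: reply 304 with no body when nothing has changed since the date the crawler sends.

```python
# Minimal sketch of the If-Modified-Since handshake: a 304 Not Modified
# response carries no body, which is the bandwidth saving in question.
from email.utils import formatdate, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

LAST_MODIFIED = 1217203200  # epoch seconds; stand-in for the file's mtime

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        since = self.headers.get("If-Modified-Since")
        if since:
            try:
                if parsedate_to_datetime(since).timestamp() >= LAST_MODIFIED:
                    self.send_response(304)  # unchanged: send headers only
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # malformed date header; fall through to a full response
        self.send_response(200)
        self.send_header("Last-Modified", formatdate(LAST_MODIFIED, usegmt=True))
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>unchanged content</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```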
What about the number of links?
I received the same message on 28 July. So far there is no information about it on the French web.
But I would like your opinion on this sentence from the message: "Googlebot encountered extremely large number of links in your website". What is the relation between the number of links and duplicate content? I don't understand...
Should I reduce the number of links on my site?
And how will I know when the problem is resolved? I'm not sure Google will send another message to say that all is good.
And with millions of pages, how do you resolve all the duplicate content? Hard work ahead.
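On the millions-of-pages question: one common approach is to collapse every URL variant onto a single normalized URL and 301-redirect the variants to it, so the duplication is fixed in one place rather than page by page. A minimal sketch, with made-up parameter names:

```python
# Minimal sketch: normalize URL variants (strip session/tracking
# parameters, sort the rest) so each page has one target URL that
# all duplicates can 301-redirect to. Parameter names are made up.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

IGNORED_PARAMS = {"sessionid", "utm_source", "sort"}  # hypothetical

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                  if k not in IGNORED_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

# Both variants collapse to the same redirect target:
for u in ("http://example.com/p?id=7&sessionid=abc",
          "http://example.com/p?sort=price&id=7"):
    print(u, "->", canonical_url(u))
```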
Sorry, but I can't give my website name because of ...