Have you ever been in this potentially frustrating situation? (The video below should be cued to the right spot.)
Not Blocked in robots.txt, Server Logs Look Fine
But why do the Mobile-Friendly Test and Search Console say that they’re blocked?
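As a quick sanity check on the “not blocked in robots.txt” side, you can test a resource URL against your rules with Python’s standard-library parser. (The rules and URLs below are made-up examples, not the site from the question.)

```python
from urllib import robotparser

# A minimal robots.txt, inlined for illustration (no network fetch needed).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the wildcard group in this example.
print(parser.can_fetch("Googlebot", "https://example.com/assets/app.js"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x.js"))   # False
```

If a check like this says the resource is allowed but the testing tools still report it as blocked, the cause usually lies elsewhere, which is where the discussion below picks up.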
Compounding Things Further
She went on to say that her team has another site that’s absolutely identical but in a different country, and it’s doing fine.
And to compound this further, she said that the issue had resolved itself for a month (even though nothing was changed), during which they saw more pages get indexed.
But now, the issue has presented itself again.
Responses and Proposed Solutions
So, now it was John Mueller’s turn to provide a response.
One of the first things John brought up was that there can be a difference between testing tools and the real indexing process. The difference has to do with time limits, or ‘deadlines.’
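To make the “deadline” idea concrete, here’s a toy sketch. This is my own illustration, not Google’s actual rendering logic, and the time budget is an assumed number: a render only counts as complete if every resource comes back before the deadline.

```python
# Toy model (an assumption for illustration, not Google's real pipeline):
# rendering succeeds only if every resource arrives before a deadline.
RENDER_DEADLINE_SECONDS = 5.0  # made-up time budget

def render_completes(fetch_times, deadline=RENDER_DEADLINE_SECONDS):
    """Assuming resources are fetched in parallel, the slowest one
    determines whether the render finishes in time."""
    return max(fetch_times) <= deadline

fast_page = [0.2, 0.4, 1.1]  # everything comfortably under the deadline
edgy_page = [0.3, 4.9, 5.2]  # one resource just misses the budget

print(render_completes(fast_page))  # True
print(render_completes(edgy_page))  # False
```

Under a model like this, a testing tool with a generous deadline could succeed on the same page that the real-time indexing process, with a tighter budget, gives up on.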
One thing John mentioned was that “…usually, also when this happens, it can be a sign that things are kind of on the edge in the sense that sometimes it works and sometimes it doesn’t work…”
Earlier, I said that the problem seemed to have resolved itself, only to recur. Maybe that’s why John said that things are possibly ‘kind of on the edge.’
(Granted, from my search, it doesn’t seem that they occur as frequently as the regular office-hours sessions, but if you’re facing an issue like this and happen to be able to get onto one, it might be time well spent.)
He went on to elaborate that, “…in the cases I’ve looked at, it is more something where they’re almost like hundreds, or 200, 300, 400, 500 requests that are required to render a page.”
The more resources a page requires, the more hit-or-miss the indexing process becomes.
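A rough back-of-the-envelope sketch shows why that happens. The per-resource success rate below is an assumed figure, purely for illustration: if each resource independently makes it in before the deadline with probability p, a page needing n resources only fully renders with probability p**n, which drops off quickly as n grows.

```python
# Assumed per-resource success rate, chosen only to illustrate the trend
# (not a measured Google figure).
p = 0.995

for n in (10, 100, 200, 500):
    print(f"{n:>3} resources -> chance of a complete render: {p ** n:.1%}")
```

Even at a 99.5% per-resource success rate, a page pulling in hundreds of resources fails to render completely more often than not, which lines up with John’s “sometimes it works and sometimes it doesn’t” description.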