A client noticed that Google Search Console reported an error: a page was indexed even though it was blocked by robots.txt. In reality, the page wasn't blocked at all.
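When GSC and reality disagree like this, it's worth double-checking what robots.txt actually says. A minimal sketch with Python's standard-library `urllib.robotparser` (the domain, page paths and rules here are hypothetical, just for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt, inlined for the sketch; in practice you would
# fetch the live file with rp.set_url("https://example.com/robots.txt")
# followed by rp.read().
robots_txt = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# GSC claimed the page was blocked; the parser says otherwise.
print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

If the parser says the page is allowed while GSC insists it's blocked, that's a strong hint you're looking at a reporting glitch rather than a real crawl problem.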
Long time no see, GSC glitch!
Sadly, on 15th August FE International reported that a glitch had touched Google's search algorithms themselves, resulting in “unusual search results”. E-commerce sites saw their organic rankings fluctuate drastically, but happily the issue was fixed by 11th August.
I wonder what we SEO people would do if Google crashed?
Back to Google Search Console. For me it's the primary source of data on website health from Google's point of view (unlike SEhref, AMrush and other online services). From mobile friendliness, backlinks and coverage to the newly adopted Core Web Vitals, it gives a clear picture of what an SEO professional should do first to persuade Google to show a bit of mercy to a client's website.
As a rule, the version of a website in Google's cache doesn't match reality, often because of a missing sitemap.xml or an out-of-date robots.txt.
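As a sketch of what "getting the basics right" can look like, here is a minimal, hypothetical robots.txt that blocks nothing by accident and points crawlers at the sitemap (the domain is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

Keeping these two files accurate gives Google a current map of the site, so the cached version has a better chance of matching what's actually live.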