One user asked about having hundreds of pages showing as ‘Indexed, though blocked by robots.txt’ in GSC. This only really becomes a problem if the blocked pages are ranking in place of the content you want indexed. Much of the time, pages flagged with this status in GSC can only be surfaced via a site: search, and even then many of them are omitted from the initial results. It’s highly unlikely that users would ever come across these pages, so digging into how and why they’re being discovered and indexed becomes a much lower priority. If they are showing up in place of your actual content, though, you need to ask why Google isn’t prioritizing the desired version as you’d expect it to.
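For context on why this status appears at all: robots.txt controls crawling, not indexing, so Google can still index a blocked URL it discovers through links even though it can’t fetch the page itself. Below is a minimal, hypothetical sketch (the /internal-search/ path is just an illustration, not from the original question):

```
# Hypothetical robots.txt: crawling of /internal-search/ is blocked,
# but URLs under it can still be indexed if other pages link to them,
# which is what produces "Indexed, though blocked by robots.txt" in GSC.
User-agent: *
Disallow: /internal-search/

# To keep such URLs out of the index entirely, they generally need to be
# crawlable so Googlebot can see a noindex, e.g. via a meta robots tag or
# an HTTP response header on those URLs:
#   X-Robots-Tag: noindex
```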