whiterabit
Regular Member
Hi all,
I have come across a potential problem and I wanted to check my thinking against the opinions of others.
I currently have one of my websites hosted with a third-party developer, and they have set my robots.txt file to:
User-agent: *
Disallow: /
My understanding is that this prevents all robots from crawling my site, right? And as a result it would be a reasonable explanation for why my Google Webmaster Tools account is showing crawl errors.
Are there any benefits to this, or am I potentially hamstringing my website and SEO efforts?
Long story short, if I want to be indexed and ranked accordingly, should "User-agent: * / Disallow: /" stay or go?
I think go...
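If it should go, I'm assuming the fix is either to delete the file entirely or to change it to an empty Disallow, which (as I understand it) allows everything to be crawled:

User-agent: *
Disallow:

Happy to be corrected if that's not the right way to open things up.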