Q. Does robots.txt prevent a page from being indexed by search engines?

A
Yes, it encrypts the page
B
No, robots.txt automatically deletes the page
C
Not necessarily; it only blocks crawling, but if a page is linked elsewhere, it might still get indexed without content
D
No, robots.txt is only for images
Solution:

robots.txt can stop a crawler from visiting a page, but if other sites link to it, the URL can still be indexed (though without content). To block indexing, you need a noindex directive or password protection.
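To illustrate the difference, a minimal sketch: a robots.txt Disallow rule only discourages crawling of a path, while a noindex signal (a meta tag, or the equivalent X-Robots-Tag HTTP header) tells search engines not to index the page. The `/private/` path below is a hypothetical example.

```
# robots.txt — blocks crawling of the path, but NOT indexing of the URL
User-agent: *
Disallow: /private/
```

```html
<!-- In the page's <head> — blocks indexing; the crawler must be
     allowed to fetch the page for this tag to be seen -->
<meta name="robots" content="noindex">
```

Note that combining the two can backfire: if robots.txt blocks the page, the crawler never sees the noindex tag, so the URL may still appear in results without content.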
