Disallow access and bot indexing in .htaccess

An .htaccess-protected subdirectory of my website was somehow indexed by Google months ago. I had to add this directory to robots.txt, but I no longer want the protected URL to appear in robots.txt.
I renamed the directory, and it probably won't be indexed again since it isn't referenced anywhere, but just in case I would like to add a noindex header to it.

I added the following to the subdirectory's .htaccess:

Header set X-Robots-Tag "noindex"
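
As a side note, since I'm not sure mod_headers is guaranteed to be loaded on this host (just a precaution on my part, not something I've verified), the directive can also be wrapped in an IfModule guard:

<IfModule mod_headers.c>
    # Ask crawlers not to index anything served from this directory
    Header set X-Robots-Tag "noindex"
</IfModule>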

It works fine when I disable the .htaccess protection (I get the noindex response header). But as soon as I add the protection

AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpassword
Require user admin

and simulate a failure by hitting Cancel in the authentication prompt, I get a 401 error and no “noindex” header.
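
For reference, the subdirectory's .htaccess at that point contains both pieces together (realm name and .htpassword path as above):

# Basic authentication for the whole subdirectory
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpassword
Require user admin
# Header meant to keep the directory out of search indexes,
# but missing from the 401 response described above
Header set X-Robots-Tag "noindex"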

Should I find a way to add noindex to the 401 error page, or is there another way to handle this?
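
One option I'm considering, though I haven't confirmed it covers this 401 case on my server, is the always condition of mod_headers, which the Apache docs describe as also applying the header to locally generated error responses:

# "always" places the header in the table that is also used for
# error responses, such as the 401 produced by the basic auth above
Header always set X-Robots-Tag "noindex"

If that behaves as documented, it would avoid having to customise the 401 error page at all. Is that the right approach here?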


