Robots.txt file at FusionAuth root location?
-
Hello,
Is it possible to include a robots.txt file with my FusionAuth self-hosted community instance?
My live website has a robots.txt file at the root domain (https://rootdomain.com/robots.txt), and FusionAuth is running at the subdomain auth.rootdomain.com.
In Google Search Console, it is complaining that there is no robots.txt file for my auth subdomain at https://auth.rootdomain.com/robots.txt. It's my understanding that for Google search indexing, you need to have a separate robots.txt file for each subdomain.
As a result, Google is crawling and indexing the FusionAuth authorize and forgot-password pages, with their various URL parameters, which I want to block from indexing.
How can I add a robots.txt file to the root of my auth subdomain?
-
@ronn316 Couldn't you just configure it at a higher level, in the proxy that sits between your self-hosted FusionAuth instance and the outside world? I suggest this because I stumbled upon this Q&A while trying to figure out how to show a different favicon.
Hopefully this helps.
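A minimal sketch of that proxy-level approach, assuming nginx is the reverse proxy in front of FusionAuth (the server name and upstream address here are illustrative; 9011 is FusionAuth's default port):

```nginx
server {
    server_name auth.rootdomain.com;

    # Answer robots.txt at the proxy instead of forwarding
    # the request to FusionAuth, which has no such file.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Everything else goes to the FusionAuth instance.
    location / {
        proxy_pass http://localhost:9011;
    }
}
```

This blocks all crawlers from the entire auth subdomain; adjust the `Disallow` rules if you want some paths indexed.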
-
@ronn316 Hey - the easiest way to prevent indexing of any pages you don't want indexed is to add a meta tag to the templates, something like this: <meta name="robots" content="noindex">
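For context, FusionAuth pages are rendered from customizable theme templates, so the tag would go into the shared <head> markup of your theme. A hedged sketch of what that fragment might look like (the surrounding markup is illustrative, not the actual template):

```html
<!-- Sketch: add the noindex directive to the <head> section shared by
     your FusionAuth theme templates (exact template structure varies). -->
<head>
  <title>Sign In</title>
  <meta name="robots" content="noindex">
</head>
```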
Cheers,
Tony
-
@kasir-barati Actually, shortly after posting this thread I had the same idea and implemented the robots.txt through my reverse proxy.
Good idea about replacing the favicon through the proxy as well! I'll do that too.
-
@ronn316 Nice! Please mark the thread as resolved so that others know what the answer was for you.
Thanks
-