
I know I can create ONE robots.txt file for all domains on an Apache server*, but I want to append to each domain's robots.txt (if one pre-exists). I want some general rules in place for all domains, but I need to allow different domains to have their own unique rules.

Is there a way to accomplish this?

(*In my case Apache 2.2.x)

Gaia

2 Answers


From Apache's point of view, robots.txt is just an asset to be served. You can alter the content returned when robots.txt is requested by passing it through an output filter.

If you want to append some text, you could define an external filter (this requires mod_ext_filter to be enabled). Assuming that Apache is running on a Unix-like operating system, the filter configuration could be

ExtFilterDefine appendRobotstxt cmd="/bin/cat - /var/www/html/robots-tail.txt"
<Location /robots.txt>
    SetOutputFilter appendRobotstxt
</Location>

That would concatenate robots-tail.txt to the end of the response.
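You can check what the filter command does outside Apache: mod_ext_filter feeds the response body (the domain's own robots.txt) to the command's stdin, and `cat - tailfile` echoes stdin followed by the tail file. A minimal simulation (the robots.txt content and tail content here are just examples):

```python
import os
import subprocess
import tempfile

# Stand-in for an existing per-domain robots.txt (the response body
# Apache passes to the filter on stdin).
body = b"User-agent: *\nDisallow: /search\n"

# Stand-in for the shared /var/www/html/robots-tail.txt.
with tempfile.NamedTemporaryFile(delete=False) as tail:
    tail.write(b"Sitemap: https://example.com/sitemap.xml\n")

# Same command shape as in the ExtFilterDefine line: stdin first, then the tail file.
out = subprocess.run(
    ["/bin/cat", "-", tail.name],
    input=body, capture_output=True, check=True,
).stdout
os.unlink(tail.name)

print(out.decode(), end="")
# User-agent: *
# Disallow: /search
# Sitemap: https://example.com/sitemap.xml
```

A Sitemap line is a safe thing to append globally, since it stands alone; appending rule blocks needs the caveat in the answer below about how parsers pick a block.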

200_success

Note that you can't just blindly append a second block; you'd have to merge the new rules into the existing one. If a domain already has

User-agent: *
Disallow: /search

and you want to add for all domains

User-agent: *
Disallow: /admin/

you'd have to make it

User-agent: *
Disallow: /search
Disallow: /admin/

because robots.txt parsers stop as soon as they find a block that matches them.
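Python's standard-library robots.txt parser illustrates this "first matching block wins" behavior (the bot name and paths are arbitrary examples):

```python
from urllib.robotparser import RobotFileParser

# Two separate blocks for the same user-agent: a naive append,
# rather than a merge of the rules into one block.
robots = """\
User-agent: *
Disallow: /search

User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots.splitlines())

# Only the first block matching "*" is consulted, and it merely
# disallows /search, so the appended /admin/ rule is ignored.
print(rp.can_fetch("SomeBot", "/admin/secret"))  # True (still fetchable!)
print(rp.can_fetch("SomeBot", "/search"))        # False
```

Actual crawlers vary in how strictly they follow this, but you shouldn't count on any of them honoring a second block for the same user-agent.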

unor