I'm playing around with my homelab and I'm trying to serve a robots.txt file. I'm launching Traefik and Portainer with the following docker-compose file, running in Docker Swarm mode:
version: "3.3"
services:
traefik:
container_name: traefik
image: "traefik:latest"
restart: unless-stopped
command:
- --entrypoints.web.address=:80
- --entrypoints.websecure.address=:443
- --providers.docker.network=web
- --providers.docker=true
- --api.dashboard=true
- --api.insecure=true
- --log.level=DEBUG
- --certificatesresolvers.leresolver.acme.httpchallenge=true
- --certificatesresolvers.leresolver.acme.email=SOME_EMAIL@gmail.com
- --certificatesresolvers.leresolver.acme.storage=./acme.json
- --certificatesresolvers.leresolver.acme.httpchallenge.entrypoint=web
- --providers.docker.exposedbydefault=false
- --providers.file.filename=/dynamic.yaml
- --providers.docker.swarmMode=true
ports:
- 80:80
- 443:443
- 8080:8080
volumes:
- /var/run/docker.sock:/var/run/docker.sock:ro
- ./traefik-data/acme.json:/acme.json
- ./traefik-data/dynamic.yaml:/dynamic.yaml
networks:
- web
deploy:
labels:
- "traefik.enable=true"
- "traefik.http.routers.http-catchall.rule=hostregexp({host:.+})"
- "traefik.http.routers.http-catchall.entrypoints=web"
- "traefik.http.routers.http-catchall.middlewares=redirect-to-https"
- "traefik.http.middlewares.redirect-to-https.redirectscheme.scheme=https"
- "traefik.http.routers.api.rule=Host(monitor.SOME_DOMAIN.dev)"
- "traefik.http.routers.api.service=api@internal"
placement:
constraints:
- node.labels.entrypoint == true
  portainer:
    image: portainer/portainer-ce:latest
    command: -H unix:///var/run/docker.sock
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./portainer-data:/data
    networks:
      - web
    deploy:
      labels:
        # Portainer
        - "traefik.enable=true"
        - "traefik.http.routers.portainer.rule=Host(`portainer.SOME_DOMAIN.dev`)"
        - "traefik.http.routers.portainer.entrypoints=websecure"
        - "traefik.http.services.portainer.loadbalancer.server.port=9000"
        - "traefik.http.routers.portainer.service=portainer"
        - "traefik.http.routers.portainer.tls.certresolver=leresolver"
      placement:
        constraints:
          - node.labels.entrypoint == true
networks:
  web:
    external: true
volumes:
  portainer-data:
    driver: local
    driver_opts:
      o: bind
      type: none
      device: ./portainer-data
Then I'm launching nginx to serve the robots.txt file:
version: "3.9"
services:
antybots:
image: nginx:alpine
container_name: antybots
volumes:
- /mnt/config/robots/robots.txt:/usr/share/nginx/html/robots.txt:ro
deploy:
labels:
# Antybots
- "traefik.enable=true"
- "traefik.http.routers.antybots.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)"
- "traefik.http.routers.antybots.entrypoints=web"
- "traefik.http.services.antybots.loadbalancer.server.port=80"
- "traefik.http.routers.antybots.service=antybots"
- traefik.http.routers.antybots.priority=99
networks:
- web
networks:
web:
external: true
But when I open https://SOME_DOMAIN.dev/robots.txt or https://ANYTHING.SOME_DOMAIN.dev/robots.txt, I get either a 404 or a robots.txt with effectively empty content:
User-agent: *
Disallow:
which is not what I put in my robots.txt file.
I verified that the container properly sees my robots.txt file. I suspect this is caused by the redirection to HTTPS, which comes from these labels:
- "traefik.http.routers.http-catchall.middlewares=redirect-to-https"
- "traefik.http.middlewares.redirect-to-https.redirectscheme.scheme=https"
If that's the case, the request gets redirected to HTTPS, where I don't have any router matching /robots.txt, which would explain the 404. Is there any way to bypass the redirect for this one file and serve it over plain HTTP? Or could it be caused by something else?
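Or, since I'm testing over https:// anyway, would adding a second router for the same nginx service on the websecure entrypoint be a cleaner approach? Something along these lines, next to the existing antybots labels (untested guess; antybots-secure is just a name I made up):
# second router: same rule and backend, but on websecure with the existing leresolver
- "traefik.http.routers.antybots-secure.rule=HostRegexp(`{host:.*}`) && Path(`/robots.txt`)"
- "traefik.http.routers.antybots-secure.entrypoints=websecure"
- "traefik.http.routers.antybots-secure.tls.certresolver=leresolver"
- "traefik.http.routers.antybots-secure.service=antybots"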