routes.py and root_static


Annet

Oct 14, 2020, 11:30:16 AM
to web2py-users
I have the following lines of code in routes.py

routers = dict(

    # base router
    BASE = dict(
        domains = {
            'domain.com' : 'init',
            'www.domain.com' : 'init',
            'ldc.domain.com' : 'admin',
            'my.domain.com' : 'controlpanel',
        }
    ),
)

I want to add root_static = ['favicon.ico', 'robots.txt'] to routers, but I am not sure
about the syntax or where to put the robots.txt file.

routers = dict(
    BASE = dict(
        domains = { },
        root_static = ['favicon.ico', 'robots.txt'],
    ),
)

Would this be correct?

Should robots.txt go into /applications/init/static/?


Kind regards,

Annet

Jose C

Oct 16, 2020, 6:39:18 AM
to web2py-users

routers = dict(
    BASE = dict(
        domains = { },
        root_static = ['favicon.ico', 'robots.txt'],
    ),
)

Would this be correct?

Should robots.txt go into /applications/init/static/?

Yep, any files served at the root of your site (e.g. https://mydomain.com/robots.txt) go in the /static directory of your app, and then you list them in root_static as you've done. Reminder: you need to restart each time you change anything in routes.py for the new changes to take effect.
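Putting the two snippets from this thread together, a complete routes.py for this setup might look like the following (a sketch based on the domains mapping in the first message, not a tested drop-in file):

```python
# routes.py -- lives in web2py's root folder, next to the applications/ dir.
# Sketch combining the domains mapping and root_static from this thread.
routers = dict(
    # base router
    BASE = dict(
        # map incoming hostnames to applications
        domains = {
            'domain.com'     : 'init',
            'www.domain.com' : 'init',
            'ldc.domain.com' : 'admin',
            'my.domain.com'  : 'controlpanel',
        },
        # files served at the site root (e.g. https://domain.com/robots.txt)
        # from the selected application's static/ directory
        root_static = ['favicon.ico', 'robots.txt'],
    ),
)
```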

Jose


Annet

Oct 19, 2020, 5:33:50 AM
to web2py-users
Hi Jose,

Thanks for your reply.

One more thing: I've got routes.py in web2py's root folder and, as you can see, I've got
three applications. Do I place the robots.txt file in each of the three applications' static
folders?


Kind regards,

Annet

On Friday, 16 October 2020 at 12:39:18 UTC+2, Jose C wrote:

Jose C

Oct 19, 2020, 5:57:34 AM
to web2py-users


On Monday, 19 October 2020 10:33:50 UTC+1, Annet wrote:
Hi Jose,

Thanks for your reply.

One more thing: I've got routes.py in web2py's root folder and, as you can see, I've got
three applications. Do I place the robots.txt file in each of the three applications' static
folders?


Yes, you'd have one robots.txt for each application, and they can be different. For example, assuming your init app at domain.com is public and allows access to everything, you'd have a robots.txt with something like:

    User-agent: *
    Disallow:


Then, assuming that in your admin and controlpanel apps you want no indexing, you might have a robots.txt in each of those apps' static folders with something like:

    User-agent: *
    Disallow: /


In other words, when web2py receives a request for http://(www.)domain.com/robots.txt it serves the robots.txt file from the init application's static dir; if the request is for http://ldc.domain.com/robots.txt, it serves the robots.txt from the admin app's static directory, and so on.
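The lookup described above can be sketched as a small helper. This is illustrative logic written for this thread, not web2py's actual router code; the hostname mapping and root_static values are the ones from the routes.py earlier in the thread:

```python
import os

# hostname -> application mapping and root_static list, as in routes.py
DOMAINS = {
    'domain.com': 'init',
    'www.domain.com': 'init',
    'ldc.domain.com': 'admin',
    'my.domain.com': 'controlpanel',
}
ROOT_STATIC = ['favicon.ico', 'robots.txt']

def resolve_root_static(host, filename):
    """Return the on-disk path a root-static request would map to, or None."""
    if filename not in ROOT_STATIC:
        return None  # not a root-static file; normal routing applies
    app = DOMAINS.get(host)
    if app is None:
        return None  # unknown hostname
    # each app serves its own copy from applications/<app>/static/
    return os.path.join('applications', app, 'static', filename)
```

So a request for robots.txt on ldc.domain.com resolves to the copy under applications/admin/static/, while the same filename on domain.com resolves to the copy under applications/init/static/.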
