Drill-down menu with PostgreSQL


Louis Amon

Feb 11, 2015, 11:55:18 AM
to web...@googlegroups.com
I am trying to build a geo-based drill-down sitemap for my website, pretty much like this https://www.airbnb.com/sitemaps/western-europe?locale=en

What I have is a model with the following fields:
administrative_area_level_1
administrative_area_level_2
locality
sublocality_level_1
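
In web2py DAL terms, that model looks roughly like this (the table name place and the connection string below are just placeholders, not the actual app's code):

    # In a web2py model file (e.g. models/db.py), DAL and Field are already
    # available; the table name and connection string here are assumptions.
    db = DAL('postgres://user:password@localhost/mydb')
    db.define_table('place',
                    Field('administrative_area_level_1'),
                    Field('administrative_area_level_2'),
                    Field('locality'),
                    Field('sublocality_level_1'))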

I could loop over each field and run queries to list the corresponding entries, but that seems very inefficient on the database side.


Is there a clean way to build this kind of hierarchy with a single request?

Niphlod

Feb 11, 2015, 2:28:55 PM
to web...@googlegroups.com
Did you actually time the queries? Without knowing the cardinality of the sets, it's pretty much impossible to say that a single big query will be more efficient than 4 queries on small subsets.

Louis Amon

Feb 12, 2015, 3:28:34 AM
to web...@googlegroups.com
I haven't timed the queries, but there are way more than 4 queries to run.

For instance, if I build a drill-down to zoom in on my city (Bordeaux), I would need something like this:

  • Aquitaine (administrative_area_level_1)
    • Gironde (administrative_area_level_2)
      • Bordeaux (locality)
      • Talence
      • etc.

So to build the whole tree I'd need nested loops over the first two fields (administrative_area_level_1 & administrative_area_level_2).
Assuming there are N administrative_area_level_1 values and, on average, M administrative_area_level_2 values under each of them, I'd need about N*M queries just to list all the localities.

That seems a bit brutish.
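
To make the cost concrete, the brute-force version I'm describing would look roughly like this (place is a hypothetical table name, as above):

    # One query for the level-1 values, one per level-1 value for its level-2
    # values, then one per (level-1, level-2) pair: roughly N*M selects overall.
    tree = {}
    for r1 in db(db.place.id > 0).select(db.place.administrative_area_level_1,
                                         distinct=True):
        a1 = r1.administrative_area_level_1
        tree[a1] = {}
        for r2 in db(db.place.administrative_area_level_1 == a1).select(
                db.place.administrative_area_level_2, distinct=True):
            a2 = r2.administrative_area_level_2
            localities = db((db.place.administrative_area_level_1 == a1) &
                            (db.place.administrative_area_level_2 == a2)).select(
                db.place.locality, distinct=True)
            tree[a1][a2] = [r.locality for r in localities]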


I've read a bit about nested_queries in web2py but there isn't much to be found. Is that a deprecated mechanism?

Should I use rows.find() & rows.sort() then?
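
For what it's worth, rows.find() and rows.sort() operate in Python on an already-fetched Rows object (as described in the web2py book), so they would mean one select and then in-memory filtering, something like this (hypothetical place table again):

    # One select, then in-memory filtering and sorting; no extra round-trips.
    rows = db(db.place.id > 0).select()
    gironde = rows.find(lambda row: row.administrative_area_level_2 == 'Gironde')
    for row in gironde.sort(lambda row: row.locality):
        print(row.locality)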



Niphlod

Feb 12, 2015, 6:32:24 AM
to web...@googlegroups.com
If your model is a single table with 4 fields, what's the expected result set? A nested dict? If yes, no extra queries are needed, just Python.
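
Something like this, with the hypothetical place table from earlier: a single select, then plain Python builds the nesting:

    from collections import defaultdict

    # One query for the three hierarchy fields; the tree is assembled in memory.
    tree = defaultdict(lambda: defaultdict(set))
    rows = db(db.place.id > 0).select(db.place.administrative_area_level_1,
                                      db.place.administrative_area_level_2,
                                      db.place.locality)
    for row in rows:
        tree[row.administrative_area_level_1][row.administrative_area_level_2].add(row.locality)

tree['Aquitaine']['Gironde'] would then hold something like {'Bordeaux', 'Talence', ...}.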

BTW: a webpage holding 200 locations and requiring 800 queries is not a webpage a user would need. 
In other words, what's the maximum number of leaves you want to prefetch?


Louis Amon

Feb 12, 2015, 7:27:29 AM
to web...@googlegroups.com
> a webpage holding 200 locations and requiring 800 queries is not a webpage a user would need.

That's actually a very good point!

This page is actually designed for SEO, but even SEO doesn’t go well with hundreds of links.

I think I should build a navigation architecture that maps to the drill-down I want to build.
That way I solve both problems: the DAL queries will be much simpler & the SEO result will also improve!

Thanks Niphlod for helping me clear that up. Not all answers need to be technical :)

Niphlod

Feb 12, 2015, 11:26:01 AM
to web...@googlegroups.com
Mind that there's yet another "angle": given that a city usually stays in the same place (same goes for regions, countries, etc.), you can carefully cache the computationally expensive result set and update it once a week (or whenever modifications are made).
That's what we DBAs call a "materialization" process: something huge to compute, read often and modified rarely can be computed rarely and stored somewhere in the structure that is most convenient and "read-safe".
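
In web2py terms, one way to do that is to wrap the expensive computation in cache.ram (or cache.disk) with a long expiry; the key name and expiry below are just examples:

    from collections import defaultdict

    # Compute the full tree at most once a week and serve the cached copy in
    # between; cache.disk would survive restarts. "place" is hypothetical.
    def build_location_tree():
        tree = defaultdict(lambda: defaultdict(set))
        for row in db(db.place.id > 0).select():
            tree[row.administrative_area_level_1][row.administrative_area_level_2].add(row.locality)
        return tree

    location_tree = cache.ram('location_tree', build_location_tree,
                              time_expire=7 * 24 * 3600)

Since the backend is PostgreSQL, a materialized view refreshed on a schedule would be another way to get the same effect on the database side.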

Louis Amon

Apr 5, 2015, 1:41:18 PM
to web...@googlegroups.com
That's an interesting approach indeed. Thanks Niphlod :)

Well, in the end my company hired an SEO consultant, and our conclusion was that we need to build a drill-down navigation instead of one page containing the whole tree.

E.g.:

country
  • administrative_area_level_1
  • administrative_area_level_1
  • ...
Then, if you click on any of the leaves:

administrative_area_level_1
  • administrative_area_level_2
  • administrative_area_level_2
  • ...
And so forth.
As you said, "a webpage holding 200 locations and requiring 800 queries is not a webpage a user would need".

Search engines know that and tend to disregard pages holding more than 200 links, so clean routing and a drill-down navigation are the right answer to this SEO issue.
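
Concretely, each drill-down page then only needs one small query; a rough sketch of what such a controller can look like (the table, controller and function names are placeholders, not our actual code):

    # controllers/sitemap.py (hypothetical): each page lists only the children
    # of the node given in request.args, so one small distinct select per page.
    def drilldown():
        level1 = request.args(0)
        level2 = request.args(1)
        if level1 is None:
            rows = db(db.place.id > 0).select(db.place.administrative_area_level_1,
                                              distinct=True)
            links = [A(r.administrative_area_level_1,
                       _href=URL('sitemap', 'drilldown',
                                 args=[r.administrative_area_level_1]))
                     for r in rows]
        elif level2 is None:
            rows = db(db.place.administrative_area_level_1 == level1).select(
                db.place.administrative_area_level_2, distinct=True)
            links = [A(r.administrative_area_level_2,
                       _href=URL('sitemap', 'drilldown',
                                 args=[level1, r.administrative_area_level_2]))
                     for r in rows]
        else:
            rows = db((db.place.administrative_area_level_1 == level1) &
                      (db.place.administrative_area_level_2 == level2)).select(
                db.place.locality, distinct=True)
            links = [r.locality for r in rows]
        return dict(links=links)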

Hope this helps other people :)

Carlos Zenteno

Apr 5, 2015, 6:04:34 PM
to web...@googlegroups.com
Thanks for explaining the "materialization" process. It will come in handy in my app...