Thanks, Justin, for sharing!
I've downloaded the India.geojsonl and extracted it onto a webserver.
-> pretty useful! Since .geojsonl stores one GeoJSON feature per line, one can loop through a huge file without having to load it all into RAM.
The top lines look like:
{"type": "Feature", "properties": {},"geometry": {"type": "Polygon","coordinates": [[[83.06380515611697, 25.34167404697847], [83.06380909901775, 25.341635591519122], [83.06386494585949, 25.341640268589657], [83.06386100295869, 25.341678724047526], [83.06380515611697, 25.34167404697847]]]}}
{"type": "Feature", "properties": {},"geometry": {"type": "Polygon","coordinates": [[[87.87555977691633, 22.397660095199], [87.8754256865811, 22.397658576750985], [87.87542690191553, 22.397566835386712], [87.87556099225075, 22.39756835383578], [87.87555977691633, 22.397660095199]]]}}
...
So we have just bare polygons, one per building, with no properties or categorization.
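The line-per-feature layout makes streaming straightforward. A minimal Python sketch (the inline sample feature is just for illustration; in practice you'd pass an open file handle for India.geojsonl):

```python
import json

def stream_features(lines):
    """Yield one parsed GeoJSON feature per non-empty line,
    so the whole file never sits in memory at once."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# With the real file:
# with open("India.geojsonl", encoding="utf-8") as f:
#     for feature in stream_features(f):
#         ...

# Tiny inline demo:
sample = ('{"type": "Feature", "properties": {}, "geometry": '
          '{"type": "Polygon", "coordinates": [[[0, 0], [1, 0], [1, 1], [0, 0]]]}}')
features = list(stream_features([sample]))
print(features[0]["geometry"]["type"])  # Polygon
```

Because it's a generator, counting buildings or filtering by bounding box costs constant memory regardless of file size.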
Here are some ideas on what to do with this:
1. PostgreSQL DB:
- Load all of these into a PostgreSQL/PostGIS DB
- Set up an API that takes a lat/lon and returns all building polygons within a 1 km radius
- Next possible API: send a bounding polygon and get all buildings inside it
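The radius lookup could use PostGIS's ST_DWithin with a geography cast so the distance is in metres. A sketch that only builds the SQL string (the table name `buildings` and column `geom` are assumptions about the schema; a real API should pass values as query parameters, not format them into the string):

```python
def radius_query(lon, lat, radius_m=1000):
    """Build a PostGIS query for all buildings within radius_m metres of
    (lon, lat). Assumes buildings(id, geom geometry(Polygon, 4326)); the
    ::geography casts make ST_DWithin measure in metres, not degrees."""
    return (
        "SELECT id, ST_AsGeoJSON(geom) "
        "FROM buildings "
        f"WHERE ST_DWithin(geom::geography, "
        f"ST_SetSRID(ST_MakePoint({lon}, {lat}), 4326)::geography, {radius_m});"
    )

print(radius_query(77.2090, 28.6139))
```

Note ST_MakePoint takes (lon, lat) in that order; a GiST index on `geom` would keep these lookups fast.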
2. Split up by district or lower-level admin boundaries:
- Load the admin boundaries into a PostgreSQL/PostGIS DB
- Loop through each line (i.e. each building)
- Find the admin area it falls in with an ST_Within query
- Dump it into a separate .geojsonl file for that place
- We've now split it into multiple smaller files that are more usable and can be loaded by OSM mappers, etc.
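The ST_Within step could even be prototyped without a database. A rough dependency-free stand-in that buckets each building by the district containing its first vertex, using a ray-casting point-in-polygon test (the district names and rings here are hypothetical; each bucket would then be dumped to its own .geojsonl file):

```python
import json
from collections import defaultdict

def point_in_ring(x, y, ring):
    """Ray-casting point-in-polygon test on one closed GeoJSON ring
    (first coordinate repeated as the last, per the GeoJSON spec)."""
    inside = False
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def split_by_district(features, districts):
    """districts: list of (name, exterior_ring) pairs. Assigns each
    building to the first district containing its first vertex --
    a cheap approximation of ST_Within for whole small polygons."""
    buckets = defaultdict(list)
    for feat in features:
        x, y = feat["geometry"]["coordinates"][0][0]
        for name, ring in districts:
            if point_in_ring(x, y, ring):
                buckets[name].append(feat)
                break
    return buckets

# Each bucket could then be written out as its own file:
# with open(f"{name}.geojsonl", "w") as f:
#     f.writelines(json.dumps(feat) + "\n" for feat in feats)
```

For the real dataset the PostGIS route is the way to go; this is just to show the bucketing logic.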