Singapore postcodes available for download?


Simon Males

Jun 12, 2013, 2:25:21 AM
to hackerspacesg
Hi all,

I feel like my Google-fu is awfully weak.

Where can I download Singapore postcodes appropriately mapped to addresses?

As far as I can tell SingPost doesn't have a list readily available to download.

--
Simon Males

Stephan February

Jun 12, 2013, 4:03:16 AM
to hacker...@googlegroups.com
Hi Simon

I don't know that there is a download available. If you *must* have a download, you will need to approach the folks at the URA, or browse through data.gov.sg

In the past I have simply made use of the Google geocoding APIs: https://developers.google.com/maps/documentation/geocoding 
You can do reverse and forward address lookups based off the postal code. 
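A forward lookup for a single postal code can be sketched like this (YOUR_API_KEY is a placeholder, and the command is only printed here rather than sent, so nothing hits the network):

```shell
#!/usr/bin/env bash
# Sketch: build a forward-geocoding request for one Singapore postal
# code against the Google Geocoding web service linked above.
# YOUR_API_KEY is a placeholder, and 079903 is just an example code.
postcode="079903"
base="https://maps.googleapis.com/maps/api/geocode/json"
request="${base}?address=${postcode}&region=sg&key=YOUR_API_KEY"
echo "curl -s '${request}'"
```

The `region=sg` parameter biases results toward Singapore, which matters because bare six-digit codes are ambiguous across countries.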

Regards
Stephan

--
--
Chat: http://hackerspace.sg/chat
 
---
You received this message because you are subscribed to the Google Groups "HackerspaceSG" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hackerspaces...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
 
 

Mats Engstrom

Jun 12, 2013, 4:03:24 AM
to hacker...@googlegroups.com
I couldn't find a free db, but scraping it yourself from http://sgp.postcodebase.com/ or purchasing it at http://www.geopostcodes.com/singapore_zip_codes are two alternatives.

Don't know how complete the files at http://download.geonames.org/export/dump/ are, but there's an SG file there with about 1500 zipcodes.
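For what it's worth, the GeoNames postal-code dump files are tab-separated, with country code, postal code and place name in the first three columns (per the readme that ships with the dumps), so pulling the codes out is a one-liner. A made-up sample line stands in for the real SG file here:

```shell
#!/usr/bin/env bash
# Sketch: extract postal code and place name from a GeoNames-style
# tab-separated line. The sample line is invented; download the real
# SG file and feed it in instead.
line=$'SG\t079903\tSingapore'
fields=$(printf '%s\n' "$line" | cut -f2,3)
echo "$fields"
```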


--

Juanita

Jun 12, 2013, 6:30:02 AM
to HackerspaceSG
Silly question, but can you throw code at SingPost's postal code finder
on their website?

A company I worked with applied to SingPost to get access to their
postal codes for a map app. Not sure if it's worth your time and effort
though.

Vikram Verma

Jun 12, 2013, 8:51:29 AM
to hacker...@googlegroups.com, Vikram Verma
On 2013-06-12, Mats Engstrom wrote:
> I couldn't find a free db, buy scrape yourself from
> http://sgp.postcodebase.com/ [..]

Hey Matt,

Have you heard about <http://krake.io>? It is the best thing on the
internet.

You can specify a data model like:

{
  "origin_url": "http://sgp.postcodebase.com/all",
  "columns": [
    {
      "col_name": "source",
      "xpath": "//*[@id='block-system-main']/div/div/div[2]/table/tbody/tr/td[1]/a",
      "required_attribute": "href",
      "options": {
        "columns": [
          {
            "col_name": "title",
            "xpath": "//fieldset[1]/div/div[2]/ul/li[1]/span[2]/span"
          },
          {
            "col_name": "postcode",
            "xpath": "//fieldset[1]/div/div[2]/ul/li[9]/span[2]/span"
          }
        ]
      },
      "next_page": {
        "xpath": "//*[@id='block-system-main']/div/div/div[3]/ul/li[11]/a"
      }
    }
  ]
}

And be fed back tasty CSV!

OMG, right?!

xoxo,
vi <3

Vikram Verma

Jun 12, 2013, 8:54:52 AM
to hacker...@googlegroups.com, Vikram Verma
On 2013-06-12, Vikram Verma wrote:
> Hey Matt,

Your name is Simon. Simon. Simon.
http://www.binarymoon.co.uk/wp-content/uploads/2006/08/firefly-serenity-simon.jpg

Mats Engstrom

Jun 12, 2013, 12:13:41 PM
to hacker...@googlegroups.com, Vikram Verma
Oooohhh. Krake.io sure is funky!  It can definitely come in handy.  Usually I whip up a bash script with a lot of sed & regex -magic in it to do things like this, but krake will make it so much easier.

Thanks.

Mats Engstrom

Jun 12, 2013, 12:32:48 PM
to hacker...@googlegroups.com, Vikram Verma
.... and to top it off - the pronunciation of krake according to their site is \ˈkrä-kə\, which is extra funny for a Swede like me because "kräk" means "vomit" in Swedish. :-)

That's almost as bad a choice of name as the one Honda made a couple of years ago (NSFW):

Car maker Honda introduced their new car "Fitta" in the Nordic countries during 2001, only to find out that "fitta" is an old word, currently used in vulgar language ...

Vikram Verma

Jun 12, 2013, 4:06:58 PM
to HackerspaceSG, Vikram Verma
On 2013-06-13, Mats Engstrom wrote:
> Oooohhh. Krake.io sure is funky! It can definitely come in handy. Usually
> I whip up a bash script with a lot of sed & regex -magic in it to do things
> like this, but krake will make it so much easier.

Like this?

#!/usr/bin/env bash

db="$HOME/postcodes.asv"
for node in `seq 1 124289`; do
  XML=`curl -s "http://sgp.postcodebase.com/node/${node}"`
  address() { echo -e "$XML" | grep "itemprop='description'" | awk -F '<|>' '{ print $3 }'; }
  postcode() { echo -e "$XML" | grep "itemprop='postalCode'" | awk -F '<|>' '{ print $7 }'; }
  (address; postcode) | tr '\n' $'\031' >> "$db"
  echo -n $'\030' >> "$db"
done
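Reading the resulting file back is just as easy; the script uses \031 (0x19) as a field separator and \030 (0x18) as a record separator, so awk can split it apart again. A sketch with a hand-written sample record in place of a real scrape:

```shell
#!/usr/bin/env bash
# Sketch: read back a file in the .asv format written above (fields
# separated by \031, records by \030). The sample record is made up
# rather than scraped.
db=$(mktemp)
printf '10 Anson Road\031079903\031\030' > "$db"
out=$(awk 'BEGIN { RS = "\030"; FS = "\031" }
           NF { print $2 ": " $1 }' "$db")
echo "$out"
rm -f "$db"
```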

Yeah, I'd use Krake ;-)

xoxo,
vi

P.S. Could we move `grep`'s pattern to the `awk` expression? We'd have
to move the separator definition to the action.. Tamas?

Alex Smith

Jun 12, 2013, 4:42:18 PM
to hacker...@googlegroups.com
On Wednesday, 12 June 2013 21:06:58 UTC+1, Vikram Verma wrote:
On 2013-06-13, Mats Engstrom wrote:
> Oooohhh. Krake.io sure is funky!  It can definitely come in handy.  Usually
> I whip up a bash script with a lot of sed & regex -magic in it to do things
> like this, but krake will make it so much easier.

Like this?

  #!/usr/bin/env bash

  db="$HOME/postcodes.asv"
   for node in `seq 1 124289`; do [...some awk...]
  
If you do use this method and end up with some mildly useful data, it may be a Nice Thing To Do to shove it into something like a MapIt (http://code.mapit.mysociety.org/) instance, or at least a db. It's fairly simple based on http://code.mapit.mysociety.org/import/ and gives you a really nice API/data service to query (which I have used in other projects with UK datasets, so I'm somewhat biased). However, looking at the SingPost T&C, that may (almost definitely) violate their terms.

It's also generally worth looking at things like gadm.org, though in Singapore's case the data seems to be very limited.

A 

Martin Bähr

Jun 12, 2013, 9:46:40 PM
to Vikram Verma, HackerspaceSG
On Thu, Jun 13, 2013 at 04:06:58AM +0800, Vikram Verma wrote:
> On 2013-06-13, Mats Engstrom wrote:
> > Oooohhh. Krake.io sure is funky! It can definitely come in handy. Usually
> > I whip up a bash script with a lot of sed & regex -magic in it to do things
> > like this, but krake will make it so much easier.
>
> Like this?
>
> #!/usr/bin/env bash
>
> db="$HOME/postcodes.asv"
> for node in `seq 1 124289`; do
> XML=`curl -s "http://sgp.postcodebase.com/node/${node}"`
> address() { echo -e "$XML" | grep "itemprop='description'" | awk -F '<|>' '{ print $3 }'; }
> postcode() { echo -e "$XML" | grep "itemprop='postalCode'" | awk -F '<|>' '{ print $7 }'; }
> (address; postcode) | tr '\n' $'\031' >> "$db"
> echo -n $'\030' >> "$db"
> done

> P.S. Could we move `grep`'s pattern to the `awk` expression? We'd have
> to move the separator definition to the action.. Tamas?

(not tamas, but) all you need to do is move the grep search expression
into /.../ inside the awk code before the block. it is applied
independently of the separator:

address() { echo -e "$XML" | awk -F '<|>' '/itemprop=\'description\'/ { print $3 }'; }
postcode() { echo -e "$XML" | awk -F '<|>' '/itemprop=\'postalCode\'/ { print $7 }'; }

if this doesn't work then there is an issue with the \'; i haven't
used awk on lines containing ' myself.

oh, since you are parsing the same $XML here and then just run those one
after another, you could do it in one command:

echo -e "$XML" | awk -F '<|>' '/itemprop=\'description\'/ { print $3 }
/itemprop=\'postalCode\'/ { print $7 }';

this assumes that the description always comes before the postalCode.
if that's not the case, you could save the values found in variables:
{ description=$3 } and { postalCode=$7 }, and then use
END { print description postalCode }
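That variable-saving variant could look like this (a sketch; the sample HTML lines are invented so the interesting text lands in $3 and $7 as in the earlier script, and using `.` instead of an escaped quote in the regex sidesteps the shell-quoting problem):

```shell
#!/usr/bin/env bash
# Sketch: match inside awk, stash the fields in variables, print once
# at END. The sample markup is made up to mirror the field positions
# used earlier in the thread; the real pages may differ.
sample="<span itemprop='description'>10 Anson Road</span>
<span>Code</span><span itemprop='postalCode'>079903</span>"

out=$(echo "$sample" | awk -F '<|>' '
  /itemprop=.description/ { description = $3 }
  /itemprop=.postalCode/  { postcode    = $7 }
  END { print description " " postcode }')
echo "$out"
```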

and even more fancy: replace print with, i believe, printf (need to look
that one up) to allow you to output the values without a newline, so you
can drop the tr.

you can then also pipe the curl output directly through awk:
curl -s "http://sgp.postcodebase.com/node/${node}" | awk ... >> "$db"

greetings, martin.
--
eKita - the online platform for your entire academic life
hackerspace beijing - http://qike.info
--
chief engineer eKita.co
pike programmer pike.lysator.liu.se caudium.net
foresight developer realss.com foresightlinux.org
unix sysadmin trainer developer societyserver.org
Martin Bähr working in china http://societyserver.org/mbaehr/

Alex Smith

Jun 12, 2013, 4:35:37 PM
to hacker...@googlegroups.com, Vikram Verma
On Wed, Jun 12, 2013 at 9:06 PM, Vikram Verma <m...@vikramverma.com> wrote:
On 2013-06-13, Mats Engstrom wrote:
> Oooohhh. Krake.io sure is funky!  It can definitely come in handy.  Usually
> I whip up a bash script with a lot of sed & regex -magic in it to do things
> like this, but krake will make it so much easier.

Like this?

  #!/usr/bin/env bash

  db="$HOME/postcodes.asv"
  for node in `seq 1 124289`; do [...some awk...]

Tamas Herman

Jun 13, 2013, 12:44:34 AM
to hacker...@googlegroups.com
thanks vikram for the krake.io hint; i didn't know about it.
as i can see, they use florian's customer feedback on their site. interesting... :)

i couldn't add anything to martin's simplified version; i would have
done it the same way.
if we compare the original krake to the simpler shell script, i would
announce the shell script as the winner in almost any aspect... what do
you think?

actually in this specific case krake would be more expensive too, as it
seems we need to scrape more than 5000 pages. the few-liner shell
script probably took vikram 15 minutes and martin another 15
(including writing the email). even if you pay them well for their
effort, it's cheaper than krake.

--
tom

Paul Gallagher

Jun 13, 2013, 2:10:34 AM
to hacker...@googlegroups.com
if we compare the original krake to the simpler shell script, i would announce the shell script as the winner in almost any aspect...

..except reuse. 

Isn't the core idea of krake.io (or scraperwiki.com for that matter) that it just takes one person to do it, and then we have a machine-readable data set that can be shared by all, with refresh taken care of by the service, and a single point of maintenance if things change in future?

Cheers,
Paul


Vikram Verma

Jun 13, 2013, 4:45:12 AM
to HackerspaceSG, Vikram Verma
On 2013-06-13, Martin Bähr wrote:
> (not tamas, but) all you need to do is move the grep search expression
> into /.../ inside the awk code before the block. it is applied
> independently of the separator:
>
> address() { echo -e "$XML" | awk -F '<|>' '/itemprop=\'description\'/ { print $3 }'; }
> postcode() { echo -e "$XML" | awk -F '<|>' '/itemprop=\'postalCode\'/ { print $7 }'; }
>
> if this doesn't work then there is an issue with the \', haven't
> used awk on lines containing ' myself.
>
> oh, since you are parsing the same $XML here and then just run those one
> after another you could do it in one command:
>
> echo -e "$XML" | awk -F '<|>' '/itemprop=\'description\'/ { print $3 }
> /itemprop=\'postalCode\'/ { print $7 }';
>
> this assumes that the description always comes before the postalCode.
> if that's not the case you could save the values found in a variable: {
> description=$3 and postalCode=$7 and then use END { print description
> postalCode }
>
> and even more fancy: replace print with i believe printf (need to look
> that one up) to allow you to output the values without a newline so you
> can drop the tr.
>
> you can then also pipe the curl output directly through awk:
> curl -s "http://sgp.postcodebase.com/node/${node}" | awk ... >> "$db"


Awksome! ;-)

curl -s http://sgp.postcodebase.com/node/[1-124289] | awk -F '<|>' '
/itemprop=.description/ { printf "%s\031", $3 }
/itemprop=.postalCode/ { printf "%s\030", $7 }' >> "/tmp/postcodes.asv"

Simon Males

Jun 13, 2013, 1:35:55 AM
to hackerspacesg
Thanks for the help.

Was really hoping for an official source that is maintained. No one else has mentioned OpenStreetMap, but as far as I can tell they don't have postcodes. Same for GeoNames (a service/project I love), which doesn't seem to hold SG postcodes either.

I've imported postal code data before for Australia, and Australia Post offer it as a CSV download. Was hoping that it was just that easy!


On Thu, Jun 13, 2013 at 12:44 PM, Tamas Herman <herma...@gmail.com> wrote:

--
Simon Males

Meng Weng Wong

Jun 13, 2013, 12:47:39 PM
to hacker...@googlegroups.com
Have you contacted SingPost or SLA?

Teh Gary

Jun 13, 2013, 1:23:31 PM
to hacker...@googlegroups.com
Hmmm... was told by Vikram this afternoon that folks in hackerspace needed some postal code data set. I was just wondering if the one linked to below will be helpful?
It comes complete with not just address and postal code but latitude and longitude as well. You could export it as JSON or CSV. Have fun. :)

Simon Males

Jun 14, 2013, 3:06:26 AM
to hackerspacesg
Yes, and basically you can purchase a copy for $200.

Vikram Verma

Jun 14, 2013, 5:10:06 AM
to hacker...@googlegroups.com, Vikram Verma
On 2013-06-13, Vikram Verma wrote:
> curl -s http://sgp.postcodebase.com/node/[1-124289] | awk -F '<|>' '
> /itemprop=.description/ { printf "%s\031", $3 }
> /itemprop=.postalCode/ { printf "%s\030", $7 }' >> "/tmp/postcodes.asv"

lol n00b

max=3 # 124289
curl -s sgp.postcodebase.com/node/[1-$max] | awk -F "[<>']" '
/contentString/ { ORS="\031"; OFS="\030"; print $6, $12, $18, $24 }'

pwn3d!

On 2013-06-13, Tamas Herman wrote:
> if we compare the origianl krake to the simpler shell script, i would
> announce the shell script as a winner in almost any aspect... what do
> you think?

the architecture of krake is what won me over. most scrapers are
composed of parallelisable instructions; with krake, runtime is
automagically reduced by executing these concurrently over a buncha
servers. each of those instances has a unique address, making it
hard(er) to get permab&.

paul's point is valid too.

y'all are the best,
vi <3