Trouble writing queen neighbor enumeration of all census block groups

dod...@ncsu.edu

Apr 21, 2016, 8:50:45 AM
to pysal-dev
Let me start by saying: other than a Stata .do file and some MCMC in R, I haven't coded since an AOL warez group application in VB, so I apologize for being here messin' with y'all to begin with. This was about the only place I saw an active discussion of PySAL, and the thing just saved my damn life.

I am writing an environmental justice paper working with demographic and exposure data at the census block group level. Because block groups can be relatively small, a pollution source and the people living in one block group can easily impact AT LEAST first-order neighbors. I was going to be a dullard and just aggregate back up through the FIPS code, but that is just bad math.

I got the shapefile for the ACS year I want and tried ArcGIS first, but with the learning curve I was not getting anywhere. I then read about PySAL and installed it.

I imported the shapefile and ran the queen neighbor analysis on all ~216,000 block groups (it took about 12 hours).
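
For reference, the run was roughly this (the shapefile name here is a placeholder, not my actual file):

In [50]: import pysal as ps
In [51]: w = ps.queen_from_shapefile('us_blockgroups.shp')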

In [52]: w.histogram
Out[52]:

[(0, 87),
 (1, 709),
 (2, 3634),
 (3, 16627),
 (4, 48736),
 (5, 56952),
 (6, 42848),
 (7, 24878),
 (8, 12646),
 (9, 6294),
 (10, 3040),
 (11, 1515),
 (12, 759),
 (13, 432),
 (14, 233),
 (15, 128),
 (16, 85),
 (17, 44),
 (18, 34),
 (19, 20),
 (20, 21),
 (21, 13),
 (22, 8),
 (23, 7),
 (24, 6),
 (25, 1),
 (26, 3),
 (27, 1),
 (28, 2),
 (29, 1),
 (30, 2),
 (31, 1),
 (32, 0),
 (33, 2),
 (34, 0),
 (35, 1),
 (36, 1),
 (37, 1),
 (38, 0),
 (39, 0),
 (40, 0),
 (41, 0),
 (42, 0),
 (43, 0),
 (44, 0),
 (45, 0),
 (46, 1),
 (47, 0),
 (48, 0),
 (49, 0),
 (50, 0),
 (51, 0),
 (52, 0),
 (53, 0),
 (54, 0),
 (55, 0),
 (56, 0),
 (57, 0),
 (58, 0),
 (59, 0),
 (60, 0),
 (61, 1)]

What I need is a .csv (or honestly anything will do if I can copy/paste it somewhere) that enumerates each block group by FIPS code (which should be what the ACS shapefile uses for an ID) and its list of neighbors.

If I can get the list, I can move it over to an environment where I am more comfortable. I sat there and played with it for hours last night and could get a couple of cracks at numpy.savetxt to work, but it was only a single column or a smaller matrix, and the numbers were stored in scientific notation because FIPS codes are 12 digits. One time it told me a tuple index was out of range, and I think that was the closest I got.

And if I try to do anything with w.full() I get a memory error; I am assuming it comes from constructing the dense n x n matrix (216,000 x 216,000 doubles is on the order of 370 GB). I was thinking I could just do

import numpy
matrix, ids = w.full()  # w.full() returns a (matrix, ids) tuple, not a bare array
numpy.savetxt("queen.csv", matrix, delimiter=",")

but no such luck. In the meantime I will redo it with an individual state and see if it likes me a little better.

I searched rather extensively for the data itself beforehand; I promise I would not be here wasting your time otherwise.

Thank you, 
Dave

dfo...@gmail.com

Apr 21, 2016, 9:29:37 AM
to pysa...@googlegroups.com
Glad to hear PySAL has helped out. The two primary output formats for spatial weights matrices are GAL and GWT. Both are plain-text files that can be opened in any text editor.
- GWT format has one row per neighbor pair, organized in three columns: ID1 ID2 weight.
- GAL format has two rows per observation: row 1 is the origin ID and its neighbor count; row 2 is the list of neighbor IDs. This format is only valid for adjacency-based weights matrices, which is what I think you're using.
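
To make that concrete: for a toy map of three polygons with made-up IDs A, B, and C, all mutually adjacent, the two files would look roughly like this (the exact header line varies between writers, so treat these as sketches):

my_w.gwt:
0 3 my_shp column_name_for_IDs
A B 1.0
A C 1.0
B A 1.0
B C 1.0
C A 1.0
C B 1.0

my_w.gal:
3
A 2
B C
B 2
A C
C 2
A B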

Try this:
In [1]: import pysal as ps
In [2]: w = ps.queen_from_shapefile('my_shp.shp', idVariable='column_name_for_IDs')
In [3]: outfile = ps.open('my_w.gwt', 'w')   # open a GWT writer
In [4]: outfile.write(w)
In [5]: outfile.close()
In [6]: outfile = ps.open('my_w.gal', 'w')   # open a GAL writer
In [7]: outfile.write(w)
In [8]: outfile.close()
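
And if what you ultimately want is that CSV of FIPS codes and neighbor lists, the W object also has a neighbors attribute: a dict mapping each ID to the list of its neighbors' IDs. Here is a minimal sketch (the output filename is made up, and it assumes you built w with idVariable so the keys are your FIPS strings); because the csv module writes the IDs as text, it also sidesteps the scientific-notation problem you hit with numpy.savetxt:

import csv

with open('queen_neighbors.csv', 'w') as f:
    writer = csv.writer(f)
    for fips, nbrs in w.neighbors.items():
        # one row per block group: the FIPS code, then its neighbors' FIPS codes
        writer.writerow([fips] + list(nbrs))

Rows will have different lengths since neighbor counts differ, but most tools tolerate a ragged CSV.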





