"[P]overty is not an island; it is a borderland." Virginia Eubanks's searing declaration in Automating Inequality serves as our foundation this week, as we explore the universality of vulnerability and return to the false dichotomy of the deserving and undeserving poor. It's a brilliant metaphor, underscoring the construction of poverty as a collective fiction (with very real consequences) that we as a society have decided to accept.
In context, Eubanks is referring to the idea that the membrane between the poor and the not-poor is far more porous than we tend to believe. More than 60% of Americans will experience some form of poverty in their lives. This statistic alone should puncture the centuries of history devoted to viewing the poor as somehow separate from, and lesser than, the rest of the population. Unfortunately, models of othering the poor kept pace with other social advancements; once it was no longer acceptable to view poverty as hereditary, narratives of personal responsibility rose to the fore.
Eubanks explores how surveillance and data collection serve to reinforce these new models of "inherent" (but now individualized) predisposition to poverty. In contemplating the question (to be clear, not the one we should be asking) of whether people deserve to be poor, we scrutinize their life stories for choices they might have made differently:
Marginalized groups face higher levels of data collection when they access public benefits, walk through highly policed neighborhoods, enter the health-care system, or cross national borders. That data acts to reinforce their marginality when it is used to target them for suspicion and extra scrutiny. Those groups seen as undeserving are singled out for punitive public policy and more intense surveillance, and the cycle begins again. It is a kind of collective red-flagging, a feedback loop of injustice.
Alarmingly, she documents how these data points are further divorced from context and fed into automated decision systems. Data retention rules for these systems are inconsistently regulated, and some retain data indefinitely, reinforcing the idea that poverty is somehow innate rather than circumstantial.
These decision algorithms wrongly attempt to pinpoint, as a fixed trait, a condition that is ultimately ephemeral.
Another dimension of the borderland of poverty comes from Sendhil Mullainathan and Eldar Shafir's findings in Scarcity. Through numerous behavioral studies, they find that reduced capacity for executive function and decision-making is part and parcel of experiencing poverty itself, not some inherent quality of the individuals and communities in question. When poverty recedes (farmers after a bountiful harvest, in one study), these specific impairments disappear (unlike the long-term trauma that frequently accompanies poverty in the United States).
There is a link between these two cracks in the wall of poverty that appears superficial but, I think, runs far deeper: error rates. In Scarcity's research, human subjects perform worse on tests of fluid intelligence and executive function in the presence of scarcity (even artificial scarcity will do the trick). In Automating Inequality, automated decision systems massively increase the rate at which benefits are wrongly denied. After the adoption of one such system:
Between 2006 and 2008, the combined error rate more than tripled, from 5.9 percent to 19.4 percent. Most of that growth was in the negative error rate: 12.2 percent of those applying for food stamps were being incorrectly denied.
What does it say about our society that we are willing to accept errors of this magnitude from machines, but punish humans for theirs (while continuing to reproduce the conditions that lead to those errors)? If we're forced to grapple with the idea that many of the things we pathologize about poverty are actually transient, what does that say about our punitive approaches? More to the point, perhaps we should be taking a different approach to poverty altogether, one that reflects its reality: a massive reduction in human capacity (and increase in human tragedy) brought about by entirely avoidable circumstances.
Here are this week's invitations:
Personal: What are the worst decisions you have made? Have they followed you around, or have you been able to leave them in your past?
Communal: How can we dismantle the inequities of surveillance in public services, where those most in need must submit to additional scrutiny just to access benefits?
Solidarity: Support free legal assistance to address fundamental human needs such as housing, family safety, income security, health care, education, and more at Northwest Justice Project.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
FAQ
Can I share this newsletter with non-Googlers? Yes! Feel free to forward this note externally; it does not contain confidential information.
Is this an official Google newsletter? Nope. The views expressed in this newsletter are not the official position of Google, and we are not affiliated with any particular ERG.