The first time I used a hammer, I hit my thumb. I still do every so
often. Other people observing me hit my thumb can be classified into
3 buckets:
- schadenfreude: ha ha he hit his thumb, he's stupid / he won't make
that mistake again, it's a rite of passage
- pity: poor guy hit his thumb, can I teach him another way?
- other (programmers always include a default case)
In writing software, we tend to react with schadenfreude, declaring
that users don't know how to use the software we write. More broadly,
we reject their pain: we don't respond any differently, nor do we
often change the way the program interacts with the user. This
reaction becomes almost inevitable as a form of self-defense against
the pain of others.
The second path, "pity", is an empathic acceptance of the pain of the
end user. It clearly has more potential to reduce the total pain of
the system by helping others, but it starts too late: my thumb has
already been hit.
If empathy is applied proactively, can someone stop me from hitting my
thumb? How? Education takes time, and I'll admit I'm both impatient
and arrogant ("I'm not going to hit my thumb, I'm not stupid.")
Is it possible to design a better hammer? Maybe a hammer could be
built that stops itself before hitting my thumb. However, since I'm
unwilling to be trained, it also has to be easy to use.
Re-examining the problem shows that I hit my thumb because my thumb
is next to the nail: I have to hold the nail in place to pound it in.
Another solution would be a hammer that doesn't require me to hold
the nail.
After some revisions, I start using a nail gun. It's a better tool
for the task: it reduces the risk of injuring myself while still
accomplishing the same goal - driving a nail - in a way that's easy to
understand, although very different from my original action.
In software, being Rugged will require security to be included during
design. I propose that, for software to be user-ready, we have to
design it with foresight into how users can unwittingly injure
themselves and, if necessary, shift our perspective to keep them safe.
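To make that concrete, here's a minimal sketch of what "keeping users
safe by design" can look like in code. The helper names are mine,
purely for illustration, and only Python's standard library is used;
the point is that the caller never gets a knob for doing the dangerous
thing by accident.

import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200000) -> bytes:
    # Salted PBKDF2-HMAC-SHA256; the caller never picks the algorithm
    # or forgets the salt - the easy path is also the safe path.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return salt + iterations.to_bytes(4, "big") + digest

def verify_password(password: str, stored: bytes) -> bool:
    # Constant-time comparison, so callers can't accidentally leak timing.
    salt = stored[:16]
    iterations = int.from_bytes(stored[16:20], "big")
    digest = stored[20:]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return hmac.compare_digest(candidate, digest)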
One place to start looking for user-ready design is this paper from
the University of Cambridge, in cooperation with the BBC, which puts
forth the claim that designing with the end user in mind is necessary
for actual security. "Understanding scam victims: seven principles
for systems security", August 2009, http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-754.pdf
I really like the sentiments here. Designing things that are difficult to
misuse should be a key part of "rugged". And the shifting of perspective is
critical to getting this right. It's not easy for people entrenched in
building software to see how outsiders will misuse that software. We do
this to ourselves all the time too. It's quite difficult to write APIs for
other developers to use without making security mistakes.
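For instance (a hedged sketch, not from any particular library - the
table and function names are made up, and sqlite3 appears only because
it's in the standard library): an API whose only way to pass user
input is a bind parameter leaves the caller nothing to escape and
nothing to concatenate.

import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    # The "?" placeholder keeps data separate from the SQL text;
    # the driver handles quoting, so there is no string concatenation.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
print(find_user(conn, "alice"))         # (1, 'alice')
print(find_user(conn, "x' OR '1'='1"))  # None - hostile input is just data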
Let's start thinking about how to capture these characteristics of Rugged as
we go forward. Suggestions?
--Jeff