Hi George,
On 3/19/2015 11:38 PM, George Neuner wrote:
> On Thu, 19 Mar 2015 15:44:44 -0700, Don Y <
th...@is.not.me.com> wrote:
>
>> On 3/19/2015 2:52 PM, George Neuner wrote:
>>
>> When you want to program *behaviors*, you can't expect the way one person
>> (or even one *class*/group of people) behaves to bear any relationship
>> to the way some other group behaves.
>>
>> Unless you try to build a "be all" system with umpteen "configuration
>> options" that you grill the user about.
>
> That's why there are AI systems that learn by doing. You don't ask
> anything, just monitor and see what repetitious behavior falls out.
So, release a product that assumes *nothing* and let EVERYONE deal with
the learning issue? (i.e., why tailor a "default release" to one group
over another? Do you canvass the entire market and figure out what the
"best" group to target is *likely* to be? What if they don't end up
being the *actual* customers?)
>>>> ... Things are getting too complex for a
>>>> "one size fits all" approach to all users.
>>>
>>> I didn't say anything about "one size fits all". I said the vast
>>> majority of people should not be programming - an understatement to be
>>> sure.
>>
>> I don't think that's the case, anymore.
>>
>> How many appliances require some sort of "customization"? How does
>> a blind user configure his wireless router? Firewall? "Program"
>> his TV/cable box/DVR/XM radio? (do they even *have* provisions to
>> present data in a non-visual form?)
>
> This may sound insensitive, but I rather doubt many blind users
> configure these things much at all.
Because they *can't*. A "display reader" (i.e., something that can decode
N-digit 7-segment displays) has been high on the wish-list of many
blind/VI folks for many
years. What's that digital clock say? How much time is remaining on
the count-down timer on the microwave oven? (when will my biscotti
be ready to come out of the oven?)
> I have a very good friend who is almost completely blind ... he
> perceives only light and dark ... and I see where he adapts the
> surroundings and where he adapts *to* the surroundings.
I'm sure he doesn't let others arrange *his* surroundings and then
learn what they are! Anything he has control over he undoubtedly
opts to arrange himself.
> My brother-in-law teaches at a school for the disabled. Some of his
> students are blind, some are deaf and most have other disabilities as
> well. I have seen much the same behaviors there ... they learn to
> deal with things as they are introduced to them and don't mess around
> too much.
Again, because they are *forced* to do so! Look at how long it has taken
to get *money* that can be resolved without visual cues: all paper money
here is the same size, same basic color. And that's something that's
essential to "living"!
You will also see that people address disabilities differently depending on
when they encounter them in their life. E.g., blind from birth is different
from losing vision to diabetic retinopathy, macular degeneration, etc.
Folks who lose their vision later in life tend to have no real recourse:
it's too late to learn braille to any effective degree (and neuropathy
from, e.g., diabetes can make it too difficult to gain the tactile
sensitivity needed to resolve braille).
If you lost your vision *today*, would you suddenly be incompetent as
a programmer? Could you not "visualize" code fragments in enough detail
to compose a working script -- as if you could see what you were "writing"?
People losing their (central) vision later in life end up "dealing with"
the loss -- by simply accepting that it *is* a loss: the vision is gone,
along with the things that relied upon it (reading, driving, setting the
household thermostat, etc.).
> Despite that, they accomplish a great deal. Might they accomplish
> more if their surroundings were more individually friendly? Probably.
> OTOH, if they walk into something because it was "customized" for
> someone else, they might get hurt. And if they sit down to a computer
> that is customized for someone else, they might be unable to do
> anything at all. So it's a trade-off: standardize the environment and
> everyone can do something. Customize it and perhaps nobody can do
> anything.
So, all computers should be standardized? Anything that *could* be
customized *shouldn't* -- because it wouldn't be a uniform interface
for *all*? Perhaps the seats in cars should all have their adjustment
disabled? Or, at least forced to be located in a particular location
so folks can access them "universally"? Perhaps automatically returning
to some "default" position when they sense the occupant getting off?
That's just silly. What you want is to tailor customizations to individuals.
E.g., *my* house should be set up for *my* needs; not those of some "nominal
occupant". When we have house guests, we don't expect them to *know* where
the clean linens are located. Or, where the trash basket in the *kitchen*
is located (is it behind *this* cabinet? Or, *that* one? Or, perhaps
somewhere else entirely??) or the clean glassware/dinnerware, etc.
Yet, no one has ever died of thirst, here, for want of a drinking glass.
Or, come wandering out of the shower au naturel because they couldn't find
fresh towels.
My solution is to tie the customizations *to* the individual. So when *I*
interact with it, it behaves as *I* expect it to behave. Yet, when
another person standing beside me interacts, it reacts *differently* -- as
*they* expect.
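As a crude, purely hypothetical sketch (none of these names come from the
actual design), the binding might look something like this in C -- the same
request, answered according to *whoever* the system recognizes:

#include <stdio.h>
#include <string.h>

/* Purely illustrative: the same request produces different behavior
 * depending on *who* the system recognizes.  Every type and name here
 * is hypothetical. */
struct preferences {
    const char *name;
    int thermostat_setpoint;    /* degrees F                 */
    int speech_rate;            /* words per minute, for TTS */
};

static const struct preferences users[] = {
    { "Don",   78, 180 },
    { "guest", 75, 140 },       /* conservative defaults for anyone unknown */
};

/* Identify the person making the request (voice, proximity token,
 * whatever) and hand back *their* settings. */
static const struct preferences *profile_for(const char *who)
{
    for (size_t i = 0; i < sizeof users / sizeof users[0]; i++)
        if (strcmp(users[i].name, who) == 0)
            return &users[i];
    return &users[1];           /* unrecognized -> "guest" profile */
}

int main(void)
{
    const struct preferences *p = profile_for("Don");
    printf("%s: thermostat %dF, speech at %d wpm\n",
           p->name, p->thermostat_setpoint, p->speech_rate);
    return 0;
}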
>> We were looking at refrigerators the other night. How would I know
>> the current temperature setting -- *buy* a talking thermometer and
>> leave it in the frig for a while?? How would I know if the "extra
>> compartment" was currently configured as a freezer, refrigerator or
>> "soft freezer"?
>>
>> Everything is getting more "customizable". And those customizations are
>> becoming more *dynamic* in nature. It's not just a "set and forget"
>> sort of thing.
>
> And my question to that is "who the hell needs that?" For 250 years
> [the "refrigerator" was invented in 1755], the refrigerator was a dumb
> box and it worked just fine.
Then why aren't we all waiting for routine deliveries of *ice* (for our
"ice boxes")? Refrigerators have, for years, allowed the temperature of
the freezer and refrigeration compartments to be independently set (within
reason). I've owned refrigerators that allowed the butter compartment to
have yet another setting! Newer refrigerators add a third compartment
that can be used as additional freezer space, refrigerator space or
something between the two extremes. (i.e., an extra evaporator is
added along with another set of controls for it)
The *coming* refrigerators (as well as other appliances) go beyond this
and actually try to track *consumption*: "You're out of milk!" How
much of this is toy vs. utility? <shrug> How much is carrying a phone
in your pocket a "toy" vs. "utility"?
While I don't need an appliance to prepare my weekly shopping list, it
sure would be nice if, on an unexpected stop at the grocery store, I could
check to see if I had any broccoli on hand so I could exploit the unannounced
sale on flank steak (sure don't want to buy *more* broccoli if I don't need
it! OTOH, would be chagrined to return home and discover I had none and
now need to make another trip out if I want to prepare "beef with broccoli").
Should I, instead, discipline myself to ALWAYS carry a list of my "foodstuffs
inventory" on hand just in case the need arises? Or, shrug it off and just
resign myself to making another (avoidable) trip back to the store?
> Despite what some people think, every goddamn thing does not need a
> computer brain, and for most people the majority of things need very
> little in the way of customization.
Sure! We can settle on a standard layout for furnishings in our homes,
a standard placement of items in kitchen cupboards/drawers, a standard
set of TV channels on all sets, ditto for radio stations, etc. Imagine
how much easier it would be to know you just had to hit "next station" 7
times to get to the station you *preferred* -- instead of having a
"favorite" -- REGARDLESS OF WHOSE HOME YOU WERE IN!
Homes should all be heated to 70F and cooled to 80F. These temperatures
should be in place over the period M-F, 7A-10P and allowed to drift lower/higher
by no more than 5 degrees during the "sleep hours". Everyone should work 9A-5P
lest they find themselves working when the HVAC assumes they are asleep.
We should all listen to the same music and in the same sequence. Heck,
maybe at the same *times* (so you would know what time to tune in when
you want to hear a particular song).
All phones should be black. No need for wallpaper on computer screens.
All text should be displayed in Courier -- and the same size! Newspapers
(electronic or otherwise) should all be read in a fixed sequence. All
articles should be read in their entirety (no need for bookmarks).
>> When you discuss a piece of code with a colleague, do you pronounce
>> every punctuation mark? Chances are, you adapt your presentation
>> (discussion) style to your expectations of the other party's capabilities.
>
> I don't recite a piece of code to anyone - they look at it themselves.
> Likewise no one recites code to me - I look at it myself.
Then you're dealing with sighted people who are "connected" to your source
code in some way. You've never had to dictate a code fragment to someone
on a field service call across the country.
> I used to know a vision impaired programmer who used a braille reader
> and keyboard. I never worked directly with him, but he worked very
> much like I did (and do): when he collaborated with other people they
> would pass code back and forth with markup and comment.
Did you ever *look* at his "display"? Ever ask him what he would
change *if* he could change it? I.e., if he wasn't "forced" to
"adapt to it"?
A standard braille line is 40 cells. For *text*, this can be comparable to
a written line of text -- Grade 2 braille supports lots of contractions
that economize on space (along with reading and writing effort). Grade 1
braille would mean you have FEWER THAN 40 "(text) characters" available,
since each character maps to one (or two) cells.
A '[' requires 2 cells to represent. Amusingly, the *same* 2 cells are
used to represent ']'. The same applies for '(' and ')'! OTOH, open
and close single quotes have different representations. As is the
case for open/close *double* quotes. Curiously, single quotes take
2 cells while double quotes take *one*. And "ditto" marks differ from
each of the above (though they require 2 cells). Apostrophe and "accent"
differ from all of the above.
Decimal point is not the same as "period". Numbers are introduced with
a special "number follows" sign (which eats a cell). *Individual* uppercase
letters are preceded with an "uppercase" sign (another cell). Strings of
uppercase letters are introduced with a different sign (two cells).
So, mixed case identifiers get long, fast. Expressions with parenthesized
subexpressions and array references? <frown>
There's no '_' representation.
And, forget about "indent level"! Waste a cell on each layer of indent??
(actually, the rule is to indent *two* cells per level) :<
"Math braille", "Computer braille", "Grade 2 braille", etc.
Then, we can address 8-dot cells -- which use a different set of rules for
encodings (and, which might not be available on all braille displays -- unless
designed with that in mind!).
The following:
/* This is one comment */
struct alist *xp, *findit();
EACH fills an entire braille "line" (ignoring the indent). Most braille
displays are *one* line!
This:
www.cbsnews.com/VIDEO/TV/1503/10/newsitem/some_silly_video.mov
takes *6* lines to represent! (in "print", a braille "page" consists
of 25 40-cell lines on SINGLE-SIDED -- think about it -- 11x11.5 paper)
The point of all this "silly detail" is to show you the consequences
of "adapting to" an existing "system" that was conceived without its
impact on this form of representation. Creating a scripting language
with the same ARBITRARY ignorance of these issues leads to results
that are just as clumsy!
You may want to *candidly* inquire of your blind programmer friend
just what it's like dealing with the limitations of his output device.
And, ask him to think about what he *wished* he could impose on
the writers of any code that he's had to read/modify over the years.
If he thinks about it AS IF he really *had* that capability, you'll
be surprised at what issues come up!
[I did this with blind, deaf, physically handicapped, etc. users
many years ago. The biggest problem with those interviews was
getting folks to shed the "I have to adapt" mentality that has
been bred into them -- because they *have* no choice! Ask them
what things would be like if *they* could write the "rules" and
you end up with an entirely different set of criteria! E.g., one
that surprised me from a blind user was wanting products that
don't "look blind"... i.e., look like they were assembled from
off-the-shelf components -- a consequence of low production volumes
and avoidance of high tooling costs -- in someone's GARAGE!
"Make it look sexxy!" "Make it easy to clean (because it's going
to be in my hands a lot and get dirty/greasy from all that handling...
AND, I am unlikely to SEE just how 'disgusting' it LOOKS -- to
sighted folks around me!"]
> I doubt there are even a handful of people who could successfully
> write a non-trivial program via voice reader: it is too much context
> to hold onto.
You can't be serious? *You* couldn't write one of the scripts that
I've described, here? I imagine it would take you just a few minutes...
even dealing with the "audio navigation" issues!
> It is said that Stephen Hawking could hold and manipulate all the
> equations in his head. But he is a genius - most people are not.
Actually, what you learn to value most (with blindness) is MEMORY!
You can't "verify" something "on inspection" so have to REMEMBER how
you left it. And, discipline yourself to leave it the same way
each time (to reduce the number of things that you have to remember).
"Which is the blue shirt that goes best with those black slacks?
And, where the hell *are* those slacks??"
This makes blindness late in life doubly troublesome -- memory tends
to get worse with age, esp. short-term memory. (i.e., where you kept the scissors
when you were a CHILD doesn't help you remember where you left them
YESTERDAY!)
>> When you're dealing with a "casual user", you have to be more
>> explicit -- OR, use a language that doesn't introduce "unspoken"
>> items into the dialog (e.g., punctuation, whitespace, etc.).
>
> Show me a language with no (equivalent of) whitespace.
I meant the *significance* of whitespace. So, " " means something
notably different than "\t\t" or "\t ". If someone was reading
*this* to you, would you be aware of the locations of the line breaks?
The dual spaces (or not!) after each "full stop"? The paragraph
breaks?
> It's a fallacy to think that you somehow are reducing complexity by
> renaming what you consider to be "cryptic" symbols. It doesn't matter
> whether the symbol is represented by the character '=>*' or the word
> 'arrow-star' ... the person using it still has to understand what it
> means.
Of course you can reduce complexity! Information gets encoded in
data in a variety of ways.
For example, a language that implicitly terminates each statement at the
'\n' doesn't ALSO need an explicit statement terminator (e.g., ';').
A language that doesn't allow expressions like:
A = B = C = D = 4 + 3
means the "reader" need not be prepared for a second (third, fourth, etc.)
'=' after encountering the first.
A language that encodes data type in the (required) identifier itself
obviates the need for explicit type declarations (e.g., FORTRAN's i,j,k).
The statement:
A := 3;
"reads" with a lot more complexity than:
A gets 3
yet encodes the same information.
> And to repeat myself - why are you reciting it over the phone? Send a
> goddamn fax, or an email, or a tweet, or a text message and let them
> "read" it (for some definition of "read") for themselves.
I bump into you in the store. You ask how *you* could configure *your*
phone answering system to behave *like* mine. I can recite the pseudo-code
I described (upstream) and, chances are, it will "make sense" to you
(if you've written any code in that language). No need for me to
write it down. No risk that you'll forget that I said "colon equals"
instead of "equals" in some places. For an effective language, you
can "visualize" the algorithm:
"Yeah, that makes sense! Lookup the caller's phone number in a
database (I'll sort out how to build that database as a separate
project). Determine the 'contact category' for the caller.
Then, conditionally execute one of several different types of
actions based on that information. Maybe I'll use an if-tree...
or, perhaps a 'case' (switch) statement."
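To make that concrete, here's a minimal C sketch of the call-handling logic
as just "visualized" -- every identifier below is invented for illustration;
nothing here is the actual system's API, and the real script would, of
course, be written in the more "speakable" notation discussed above:

#include <stdio.h>
#include <string.h>

enum category { FAMILY, FRIEND, BUSINESS, UNKNOWN };

/* Stand-in for the caller-ID database lookup ("I'll sort out how to
 * build that database as a separate project"). */
static enum category lookup_category(const char *caller_id)
{
    if (strcmp(caller_id, "555-0001") == 0) return FAMILY;
    if (strcmp(caller_id, "555-0002") == 0) return BUSINESS;
    return UNKNOWN;
}

/* Dispatch on the contact category -- the 'case' (switch) approach. */
static void handle_call(const char *caller_id)
{
    switch (lookup_category(caller_id)) {
    case FAMILY:
    case FRIEND:
        printf("ring through to the house\n");
        break;
    case BUSINESS:
        printf("take a message\n");
        break;
    default:
        printf("play the 'not in service' intercept\n");   /* unknown/unwanted */
        break;
    }
}

int main(void)
{
    handle_call("555-0001");    /* family member: rings through */
    handle_call("867-5309");    /* stranger: gets the intercept */
    return 0;
}

The point isn't the C syntax -- it's that the *structure* (lookup,
categorize, dispatch) is simple enough to hold in your head and recite.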
If the language is cluttered with lots of "magic" that the user
has to remember, then there is far less chance of him getting it
right "quickly".
>> The whole move towards the "safer" programming environments (and those
>> are really only intended for "REAL" programmers) indicates that programs
>> can't continue to be intricate, error prone creations. The system
>> designer/implementer(s) have to consider the abstractions that they
>> present to the "user" (programmer?) to maximize the chances of the
>> user "getting it right".
>
> That's one way to look at it. The other way is there are a whole lot
> of people who are charitably called "programmers" and who would be
> doing the world a favor if they did something else.
No argument, there! But, that's not what's happening. You don't
see employers dismissing lots of candidates and raising payscales
to compete for The Good Ones. Rather, you see them trying to make
use of The NotGood Ones to keep their costs low, reliance on "key"
staff minimized, etc.
I started this project with a bottom-feeding hardware approach: make
the hardware *dirt* cheap. Compensate with cleverer software -- within
the capabilities of that cheap hardware.
But, the reality of the cost differential between "dirt cheap" and
"cheap" made that criterion silly. With "cheap" I can make the
software a lot more capable, extensible and robust. I.e., more
suited to letting others extend it without breaking everything in
the process. (design of OS, complexity of comms, features in
scripting language, etc.)
> This isn't going anywhere. We should just agree to disagree.
Agreed. My goal is to show that systems *can* be accessible in
more than just superficial ways. *And*, to document the costs
of that accessibility in the design process! (it's not "free"
as those adherents would like to believe)
Or, should that be "disagreed"? :>
Time for C's biscotti -- else The Big Frown come morning! :-/