Seven types of DIY-bio risks compared to machine shop risks


Paul D. Fernhout

Dec 27, 2008, 10:37:22 PM
to openmanu...@googlegroups.com
I'm trying to think about whether I overreacted to this picture,
http://www.huffingtonpost.com/2008/12/25/do-it-yourself-dna-amateu_n_153489.html
especially in light of Meridith's and Joseph's comments. As I reflect on
recent discussions, it seems to me there are at least seven risks to any
type of biotech (whether at home or in academia or industry). Here I compare
them to home machine shop risks.

1) There is a risk to the researcher and people in the immediate vicinity
from the materials and processes used incidentally in the work. This is
somewhat similar to the risk of driving a car or owning a firearm. With the
right safety precautions and equipment and training, the risks are probably
manageable. However, a big difference is that many of the risks of biotech
are invisible (carcinogens and teratogens) whereas the risks of cars and
firearms tend to be more easily seen. Also, a person will generally only own
a couple of cars or a few firearms, whereas biotech may involve many chemicals,
each with different risks alone and in combination. Still, as home machine
shops get more advanced and handle more chemicals, they too will face some
of these risks. Note that any sort of work with chemicals has these sorts of
risks, but biochemicals may be worse in many ways.

2) There is the risk some biotechnology could be used intentionally to make
biotoxins. These are fairly similar to the risks a home machine shop could
be used to make guns or bombs.

3) There is a risk some biotechnology may be used to intentionally make
bioweapons like plagues as communicable diseases. This is similar to, but
more dangerous than, the risk a home machine shop could be used to make many
guns or big bombs by setting up an assembly line (or even nuclear devices of
some type, like dirty bombs made from consumer or medical radioactive
materials). Biotech is still more dangerous because the technology of a
bioweapon could be self-replicating, as a plague is.
"The Coming Plague: Newly Emerging Diseases in a World Out of Balance"
http://www.amazon.com/Coming-Plague-Emerging-Diseases-Balance/dp/0140250913
This is probably the biggest concern of government. (What does national
government really care if a few people give themselves and their neighbors
cancer? Smoking is already legal and has been for a long time.) Still,
anyone developing a bioweapon would likely be pretty foolish to be an
obvious participant in online communities because that would invite
attention. Likely the greatest risk here is foreign-government-related
individuals doing this for one reason or another. There are two risks there
-- one is advancing the state of the art of bioweapons labs by using amateur
developed information; the other is a bioweapon developer inserting Trojan
DNA information into a public repository, so someone thinks they're
replicating a benign sequence when they are really making a local plague. (A
use for public key cryptography and circles of trust?) Still, in defense of
the DIY-biotech crowd, Bryan has pointed out elsewhere that widespread
knowledge of biotech might lead cooperating individuals to be able to invent
a cure for a natural or unnatural plague, presumably like if society was
otherwise breaking down. I think this is a stretch to expect DIY-biotech
amateurs to do this without containment facilities to use to work on a
pathogen safely, but I guess, in the worst case, a million already infected
and dying DIY-biotech amateurs each bravely experimenting differently on
themselves and their already infected and dying friends and family --
coordinated through what remained of the internet -- might find a cure in an
apocalyptic situation of a 100% fatal disease with minimal ethical issues in
some variations of a worst case scenario. Might make a thrilling sci-fi
novel plot, if it hasn't been done already. :-)
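
As an aside on the public key cryptography idea above, here is a toy sketch of how signed sequence records could work. This is purely a hypothetical illustration (not any existing repository's scheme), and the tiny textbook-RSA key is hopelessly insecure; a real system would use GPG/PGP or an established crypto library. The point is only that a curator signs a hash of the published sequence with a private key, so anyone can later verify with the public key that what they downloaded was not swapped for a Trojan sequence:

```python
# Toy sketch of signing a published DNA sequence so downstream users can
# detect Trojan substitutions. Hypothetical example only: textbook RSA with
# well-known small primes, hopelessly insecure -- real use means GPG/PGP.
import hashlib

p, q = 104729, 1299709          # small well-known primes (demo only)
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)
e = 65537                       # public exponent
d = pow(e, -1, phi)             # private exponent (Python 3.8+ modular inverse)

def digest(seq: str) -> int:
    # Hash the sequence, reduced into the modulus range.
    return int.from_bytes(hashlib.sha256(seq.encode()).digest(), "big") % n

def sign(seq: str) -> int:
    # The curator signs with the PRIVATE key before publishing.
    return pow(digest(seq), d, n)

def verify(seq: str, sig: int) -> bool:
    # Anyone can check with only the PUBLIC key (e, n).
    return pow(sig, e, n) == digest(seq)

published = "ATGGCCATTGTAATGGGCCGCTGA"
signature = sign(published)
print(verify(published, signature))          # True: sequence is as signed
print(verify("ATG" + published, signature))  # False: tampered sequence
```

The "circles of trust" part would then just be deciding whose public keys you believe, PGP-web-of-trust style.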

4) There is a risk to the ecology or global human health of an unintentional
escaped invisible self-replicating microorganism. This is similar to, but
more dangerous than, the risk a machine shop could make a clanking
self-replicator (because it is easier to know when a clanking replicator
leaves the building, at least for the larger ones (and current plans would
be for 3D printers like RepRap or larger things the size of buildings), and
also because practically no machine shop work these days is toward
self-replication of machine tools). Also, clanking replicators are still
hypothetical and are only one possible and rare machine shop project, but
bio-replicators exist now and are the target of *every* biotech project.
Still, one might argue the risks are minimal for biotech because it's a
tough world out there for any biological organisms, as between UV in
sunlight and other locally adapted bacteria for competition, chances are
that any accidentally released bacteria won't get far (at least, if it is
not specifically designed for hardiness in the outside world for some other
reason). Still, bacteria can spread any new gene sequence globally in about
two weeks (they are essentially a networked supercomputer), so whatever new
gene sequences do get created by biotech researchers which are not contained
100% and are in any way useful or at least not harmful to bacteria are
likely to last forever in the wild (at least as fragments).

5) There is a risk that sentient life forms capable of feeling pain will be
created who will be unhappy about the situation. This is a complex topic,
but it is not likely to apply in the DIY-bio situation anytime soon. This risk
currently applies more to the home hobbyist working on Artificial
Intelligence or simulation, and will become more applicable to all
such work as time goes by.

6) There is a risk that unauthorized personnel with different plans will
access the collected materials. For example, local children might think it
fun to break into a biotech lab and spread stuff around. Certainly there are
enough twelve year old kids making computer viruses to show what can happen.
While a break-in to a current-day machine shop can happen, the potential
downside risk is much more limited, mostly to, say, making small arms.
However, with the spread of 3D printing into the home as we talk about on
the open manufacturing list as desirable, the capability for mischief will
grow. For example, imagine a twelve year old kid spreading a computer virus
that gets into 3D printers and has them print out spider-like robots at
night with venom injectors or knives that murder people in their beds. One
child might kill hundreds of millions of people in one evening by doing that
via the global internet if everyone had home *mechanical* fab labs, and that
kid might be bragging all the time to his or her friends about how clever he
or she was. Maybe that person might even be a government operative who might
even target people by race, or campaign donations, or Facebook page, or some
other way. How many of us leave our 2D printers on at night?
"It’s Not Exciting, but Neglecting Printer Security is Dangerous"
http://www.itbusinessedge.com/blogs/top/?p=257
(A possible plotline for another thrilling sci-fi novel or movie. :-)
So, if DIY-biotech wants to start throwing rocks back, the house that open
mechanical engineering is living in is starting to look a little glassy. :-)
But you know what, I'd rather that glass got broken here and now, before
everyone has this technology in their homes, so throw away. :-)

7) There is a meta-risk that the value of anything produced won't be worth
the other risks, compared to non-biotech alternatives, and also that such
efforts distract from solving social problems with existing technology. For
example, we already have renewable energy solutions with wind and solar
power and energy efficiency, but we haven't widely deployed those ideas. In
the "Biological Technology in 2050" article by Robert Carlson from 2001,
Carlson starts talking about how in fifty years people will be reading a
magazine on a genetically engineered leaf, but we already have fairly good
digital paper using silicon type technology, and that will likely continue
to improve; while Carlson admits this, he says biotech will be cheaper, but
as I see it, everything in 50 years will likely be free because it will be
made by automated systems, so cost is not a big issue. :-) But I could see his
point on easy recyclability, so he may well be right in the end (although we
may have recyclable displays from a variety of technologies by then, like
through plasma-temperature incinerators and separators). Similarly, we have
heirloom plant varieties that are pest resistant and climate tolerant, but
we have currently deployed risky corporate monocultures instead for social
reasons (including monocultures already using GMOs with questionable
benefits and unknown risks and huge public resistance). Would a diversity of
DIY-biotech users really change that, any more than the gardeners who grow their
own food from heirloom seeds have, given that most gardeners do not? Machine shop
work faces the same issue, but being less risky in general, the likelihood
of getting more good than bad for any machine shop project is higher,
because you don't have these one-in-a-billion or whatever plague risks.

As an analogy, consider two of NASA's historic big quests. One is the search
for life on other planets. The other is the search for extraterrestrial
intelligence. But, after spending tens of billions of dollars on all that,
what is the result? Well, humans will still eat an octopus, one of the most
interesting extraterrestrial intelligences (they live in the ocean, so they
are "extraterrestrial" :-). Dolphins are still caught in drift nets. Whales
are still killed for meat and oil. Both dolphins and whales have brains
bigger than humans and are able to do things like beam pictures to each
other using sonar that humans are only lately learning to do. Whales have
likely communicated around the globe for millions of years using low
frequency sound waves as the longest running internet we know of (and we are
busy jamming it with our own noise like from ships). As for terrestrial
intelligences, elephants have bigger brains than humans and are poached for
their tusks. Dogs and livestock and many other animals which clearly feel
emotions are regularly mistreated or eaten. People of other races and
religions are often killed in genocides and wars (though rarely eaten).
Artificial intelligences are being worked on with the intent to make them
slaves and with little knowledge of whether they feel pain or joy. And
historically, close relatives of Homo sapiens like the Neanderthals (who had
bigger brains), along with various other relatives, were quite possibly killed
off intentionally. (Homo sapiens' alliance with the wolf 100,000 years ago may
have been the decisive factor in our survival -- but about three to four
million dogs are euthanized after abandonment every year in the USA.) And our
entire biosphere remains at risk in various ways from humanity having become
equivalent to a geological force, using nuclear weapons and creating vast
systematic pollution of the oceans and the atmosphere and so on. So, the
search for life, including intelligent life, in the cosmos, at a vast cost
in money, has in some ways directly contributed to willful ignorance and
inaction on current pressing matters related to life and intelligence and
companionship. The implicit assumption that there is no other intelligence on
Earth reflects a certain humanistic chauvinism: that only a creature similar
to humans is of any importance, and even then, that only a similar enough
culture and technology would be worth knowing. Similarly, the quest for life
on other planets while disrespecting life here is ethically problematic.
So, in that sense, like the search for life and intelligence in
the universe, biotech of any sort to create new life forms is a distraction
from obvious sustainable solutions on Earth involving the life forms we
already have but which we have undervalued (like heirloom seeds or Russian
phages, and so on).

Still, a similar thing might be said about open source machine shop work,
that it is a distraction from solving existing social problems -- except for
one key difference. Much of the open manufacturing program we have been
talking about is to take existing proprietary or tacit knowledge and make it
available in explicit form under free and open source licenses, specifically
to solve a social problem of access. The purpose of most biotech R&D seems
more to make new knowledge and new life forms, which seems riskier. I have
less of an issue with just cataloging the biotech information we do have
under free and open source licenses (like the Encyclopedia of Life).
http://www.eol.org/
Still, even in the open machine shop situation, new things are likely to get
built. But most of those will not be self-replicating.

On reflection, I think my biggest problem with DIY-biotech comes from my
personal experiences around biology and chemistry labs in academia, where
even in relatively affluent surroundings (they usually had hoods) I had it
pointed out to me by others who worked in the labs how dangerous they were
for researchers and bystanders. While only a student or worker doing mostly
computer stuff around such labs myself, I have been exposed (primarily through
the air) to, among other things, chemicals that bind into your DNA and
fluoresce, organometallic compounds in air ducting, the airborne results of a
hood fire in the lab next door, and radioactive materials. It is possible that
has shortened my life by a significant amount, but it is in the nature of
these exposures that there is no accountability, since where do you point
the finger? All that for someone who has never worked with that biotech
stuff directly myself. (I shared a residence with a post-doc in organic
chemistry for a time, and who knows what he tracked home; in his 20s then,
he used to joke about how most organic chemists died in their 50s and 60s --
I'm not sure about their housemates). If academic biotech-related labs (and
similar labs) are so poorly run as to create a variety of *avoidable* risks
(given the possibilities for hands-off automation even twenty years ago),
then I really question wanting to have that kind of work done in random
homes with less equipment and less training and less oversight, because at
least we know where the academic labs are and so people can avoid them and
the people who work in them. Still, I may be wrong about this. It is
possible that the focus on working with safer chemicals might mean
DIY-biotech is less risky than the acrid wafting odor of nasty DNA-altering
chemicals in academia that I remember breathing all too well (as a bystander
just working in the lab next door).

For balance, I should note that I've also been exposed to asbestos from both
being around academia and some rental properties, as well as radon in a
university office when I managed a robotics lab (equivalent, someone later
told me, to smoking several packs of cigarettes a day; it was discovered
after I left), so you can be exposed to bad stuff outside of biotech. I have
also been exposed to agricultural chemicals like pesticides
and fungicides during volunteer work on a farm and shop chemicals like
paints and glues in the line of hobby work. I was exposed to second-hand
cigarette smoke growing up. And day to day, plastics and other consumer
materials (including gasoline) also are things I've been exposed to. As well
as x-rays for dentistry and medicine. And mercury and other materials in
dental amalgam fillings. Dust from lead paint in old homes. And also mercury
and other materials from air pollution falling on our garden soil. And a
case of sun poisoning as a child from a bad sunburn. And that's just what I
remember off-hand from what I was aware of. (Good thing the body at least
has some mechanisms for dealing with the heavy metals.) And I doubt my
exposure to health risks in those ways is atypical for most US Americans
today. So, we all do face and accept health risks in various ways (or have
them imposed on us as others profit). I might well be the next person who
needs a biotech cure for cancer from all those environmental exposures
listed above. :-( On the other hand, with all those typical health risk
exposures for the average US American, why add any more?

Since everything in life is risky to some degree, we have to weigh risk
versus reward, in the context of alternatives. By way of apology for perhaps
overreacting, it may well be true that it is safer to have someone like
Meridith living in the apartment next-door than someone who owns a gun, or
someone who has a vicious dog, or someone who has kids who play with
matches, or someone who has bald tires on their car. My outlining these
risks in this note is more to think about them out loud, not to claim such
work should categorically not be done by amateurs if it is done by
professionals. Also, were anyone denied the chance to do DIY-biotech, maybe
that would-be hobbyist would take up something else in their spare time that
was legal and yet might also be dangerous, like, say, smoking, where
second-hand smoke is dangerous, or drinking, where drunk driving kills tens
of thousands of people per year. A positive purpose in life is one way to
resist other possible negative influences in our life. Or such a person
might start developing interesting television shows that help keep millions
of US Americans sitting in front of TV and getting obese, helping cause tens
or hundreds of thousands of early deaths per year. See, the average US
American already tolerates a box in their home from which virtual spiders
come out to murder them in their sleep (and at other times). :-( And
most will defend having that box in their home until their untimely deaths.
http://www.turnoffyourtv.com/healtheducation/junkfood.html
http://www.treadmill-desk.com/
http://wii.com/ (It's a start)
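
To make that risk-versus-reward weighing slightly more concrete, here is a back-of-the-envelope expected-harm comparison (probability times severity). Every number below is an invented placeholder just to show the arithmetic; estimating the real values is exactly the hard and contested part:

```python
# Back-of-the-envelope expected-harm comparison: probability * severity.
# Every number here is an invented placeholder for illustration only.

activities = {
    # name: (annual probability of a serious incident, relative severity)
    "DIY-bio lab next door":   (1e-4, 100.0),
    "neighbor owns a firearm": (1e-3,  10.0),
    "neighbor's vicious dog":  (5e-3,   5.0),
    "bald tires on their car": (1e-2,   2.0),
}

def expected_harm(prob: float, severity: float) -> float:
    return prob * severity

# Rank from highest to lowest expected harm.
ranked = sorted(activities.items(),
                key=lambda kv: expected_harm(*kv[1]),
                reverse=True)
for name, (p, s) in ranked:
    print(f"{name:26s} expected harm = {expected_harm(p, s):.4f}")
```

With these made-up numbers, the vicious dog and the bald tires come out ahead of the DIY-bio lab, which is the shape of the point being made, though real estimates could easily reorder the list.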

In the case of the article on Meridith's work, she claims she has the first
risk under control, perhaps at least to the degree of the risk of owning a
firearm (in part from working with a safer subset of what a typical
university biotech lab would have on hand as a matter of course). She's
obviously got good intent for risks two and three. The fourth risk hinges on
risks of mutation and release which are hard to quantify. The fifth risk
does not apply. The sixth risk is unlikely to allow harm for her specific
set up. The seventh risk (a meta-risk) is somewhat subjective for a hobby
activity. Is the activity worth any such risks for the reward of either
self-education or maybe a successful diagnostic test? On more reflection, I
can see how one could build a case for that. This case may even be stronger
in the face of society dealing with things like thousands of illegal
methamphetamine labs being set up every year, doing much more
dangerous stuff in terms of the first risk, in part because of crazy social
policies that promote illegal drug use and illegal drug production.
It's not the DIY-biotech researcher's fault that our society is not willing
to put in place social policies that allow him or her better access to
regulated research labs. (Though see a comment below.)

Still, I really question whether *any* scientists globally should be doing
biotech development work beyond basic research at this point. I reject the
argument that by itself biotech is going to make everything cheaper. Corn in
the USA still costs something significant per pound. As I suggest here:
http://www.pdfernhout.net/post-scarcity-princeton.html
the cost of computing going to near zero will already drive the cost of
nearly everything else to near zero. (Biotech itself heavily relies on
conventional computing for data analysis.) So, I'm not convinced we need
biotech to reduce costs right now. After we are a wealthier society, then
we will have the resources, time, and patience to do biotech in a safer way
(including in containment in space). Sure, I'm not that worried about what
Meridith specifically is doing after she explained herself more (though I'm
still hazy on the splicing approach used and that risk). But I still see
such work as essentially the camel's nose in the tent, as the biotech camel
will just push those limits more and more, whether with radioactive tracer
materials, a greater variety of DNA splicing chemicals, or a wider variety
of genetic materials and larger projects.
http://en.wikipedia.org/wiki/Camel%27s_nose
"If the camel once gets his nose in the tent, his body will soon follow."

Still, I've outlined a nightmare scenario for home fab labs above too.

One other comment from the Huffington Post article:
http://www.huffingtonpost.com/2008/12/25/do-it-yourself-dna-amateu_n_153489.html
"""
I'm a research biologist, in the process of launch my own biotech business.

I have been aware from some time that I COULD operate the startup phase of
my company out of my garage. But I'm appalled that someone actually WOULD do
such a thing.

Biotechnology can be powerful and dangerous stuff. Benign activities,
however, are nearly impossible for a casual observer to distinguish from
dangerous ones. And make no mistake about it -- if you had biotech labs in
garages around the country, the only types of inspections you could count on
getting for those labs would be the casual kind.

Several metropolitan areas have biotech "incubator" facilities. You can rent
laboratory space for a modest fee. Safe biological waste disposal is
included with the lab space. I urge anyone who wants to try anything more
sophisticated than what is allowed in a high school bio lab to seek out
like-minded partners, and rent a space at an incubator facility. That
includes ALL gene-splicing.

If laws do not already restrict homebrew biotechnology, they should be
revised immediately.
"""

Of course, I'd probably suggest we consider restricting that entrepreneur's
startup, too, while we were tinkering with the legal DNA of our society. :-)

But that seems politically infeasible in the USA, given how GMOs have been
impossible to keep out of the food supply, and even now, instead of being
able to sue such biotech companies for contaminating the genetic material of
organic farmers, the organic farmers may get sued for "stealing" the DNA of
the contaminating GMO which may be destroying their business. :-(
"U.S. organic food industry fears GMO contamination"
http://www.reuters.com/article/domesticNews/idUSN1216250820080312
"""
Widespread contamination of U.S. corn, soybeans and other crops by
genetically engineered varieties is threatening the purity of organic and
natural food products and driving purveyors of such specialty products to
new efforts to protect their markets, industry leaders said this week. A
range of players, from dairy farmers to natural food retailers, are behind
an effort to introduce testing requirements and standards for certification
aimed at keeping contamination at bay. That goal is rapidly becoming harder,
however, as planting of biotech corn, soybeans, and other crops expands
across the United States. "Now there is a real shortage of organic grain for
animal husbandry and dairy operations," said Organic Consumers Association
national director Ronnie Cummins. "People are having to be real careful."
... That has become more difficult as biotech corn acres have expanded in
the United States. In 2007, an estimated 73 percent of the 92.9 million
acres of U.S. corn planted were biotech, according to the U.S. Department of
Agriculture.
"""

So, the contamination process has already spread quite far. If we don't roll
it all back, then are DIY-biotech people really a big problem, even if their
numbers grow into the hundreds of thousands? I don't know. But I don't
have to be happy about all this.

--Paul Fernhout
(Sorry, I'm just getting too tired to revise this email further or shorten
it. I hope it is as balanced and fair as I can make it.)

Bryan Bishop

Dec 28, 2008, 12:20:54 AM
to openmanu...@googlegroups.com, kan...@gmail.com
On Sat, Dec 27, 2008 at 9:37 PM, Paul D. Fernhout
<pdfer...@kurtz-fernhout.com> wrote:

> I'm trying to think about whether I overreacted to this picture,
> http://www.huffingtonpost.com/2008/12/25/do-it-yourself-dna-amateu_n_153489.html
> especially in light of Meridith's and Joseph's comments. As I reflect on
> recent discussions, it seems to me there are at least seven risks to any
> type of biotech (whether at home or in academia or industry).
>
> Here I compare them to home machine shop risks.

As you know, I'm somewhat deep within the "futurism" communities, the
groups that like to make wild predictions about the future. I should
rephrase that -- I don't really care to make predictions and sit back
on the sidelines as if cheering some un-named, anonymous
technological force shifting tides one way and another; instead I
focus more on doing, action, etc. One aspect of the futurist mindset,
the cheerleader mindset if you will, is the brutal insistence on
'risk'. The same goes for the economists (whatever it is that
they do). There are even organizations like the Lifeboat Foundation and
conferences on Global Catastrophic Risks, where people get very
anxious about how the world is going to end because one guy wants to
make an ai dictator for the world, and then there are other fellows
who say asteroids are the largest risk, and so on. Frankly, I'll have
none of this crap. You only see 'risks' and dangers because your
system *sucks*, and you know it. In effect you're asking them to
"please don't do anything that might kill me | children | any of us,
because my technological dependencies suck so much that I have to end
up begging you instead of being sure that my dependencies will not be
affected by you being a moron or doing something inadvertently
stupid". Yes, it's true, we're all in the same boat (or ark?) here,
just as long as we're sharing atmospheres, common food sources, common
energy sources, and so on, so even I am susceptible to the same faults
as everyone else (as well as *from* anybody), but I'm not going to
fool myself thinking that I can ask all of the bacteria and disease
and natural, highly distributed progression of the world to "pretty
please stop being so nasty until I can get my work and systems
operating, kthnxbai". The systems that I'm referencing are the type
that have full control over their infrastructure, the "civilization
seeds", von Neumann probes, space habitats, even in the case where
there are vitamin supply networks since those can to some extent be
encapsulated and defined as a single unit.

Off on a tangent, I find it annoying how some of those future-tech
enthusiasts get all excited, but then fear the risks of various
technologies. If they are really so interested in technology as they
say they are, why aren't they championing redundancy tech, like the
possibilities of cloning, or preserving your own DNA for more than
your lifetime, when faced with the terrible possibilities of the
galaxy - such as asteroids, or grey goo scenarios? Cloning isn't what
everyone wants it to be, but it's the closest thing that you have at
the moment. Such as the "off-site seed facility" proposed for the moon
(it seems to come up every few decades). There's ethical questions re:
cloning (your DNA) that I am sure you can dig up, but the fact that
"technological redundancy, backup, etc." isn't a response to "Single
Point of Failure (SPOF) Risk" analyses, shows that something is going
wrong overall in the thought processes. And I readily admit that these
backup processes that presently exist hardly capture all of the
information (and maybe they never will)- still, it would be foolish to
think that this is a showstopper for that train of thought in response
to SPOF analyses.

Btw, my pre-emptive strike against any attempt at your calling for
regulation of garage biology is to point out that the mold on your
bread in your breadbox doesn't read U.N. international law
backjournals. It would be nice (amazing, in fact) if they did, but alas, the
world disappoints. This is not to say that just because some people
might not be interested in laws, we shouldn't bother; but I do mean to
point out that if there's a single slip-up, and it's a big enough
"oops", it effectively puts all the effort on regulation
beforehand to shame, and all for naught. This doesn't mean it's not worth
trying to invest in security- I'm sure many security professionals
make many millions making probabilistic, statistical, and mathematical
arguments about why risk management is still important (as much as
Taleb would like to hit them over the head with a frying pan and Black
Swan). IMHO we both agree that there is some turning of the head to
"deep societal issues" (re: infrastructure and such) - but regulating
the hell out of diybio (perceived risks) is, well, messing with the
core concept of a proactive attempt at solving those deep and
pervasive fundamental problems (re: infrastructure, post-scarcity). In
fact, using the concept of 'risk' is nearly free propaganda for
scarcity points of view. In reality there's not really that much of a
'risk' to the overall infrastructure and our way of life because it's
all unsustainable anyway, not "complete", even though many people
might be able to fool themselves into believing otherwise (in reality,
I can't cite or bring forth a full dependency chain analysis of my
livelihood, and I doubt any more than a handful of people presently
living could do it). It's as if the "risk" is chipping away at the
"security" of our industrial civilization that .. oh wait, isn't
actually there. ;-)

I do admit that I find it unfortunate to have to posit scenarios in
the future where habitats have atmospheres with metadata about what it
has been exposed to before, and then people making informed decisions
about whether or not they want to subject themselves to that
atmosphere, whether or not to go "down" in the hierarchy of
"cleanliness". I find this unfortunate not just because of how sad it
is, but also because of the interesting questions regarding immune
systems and how children not exposed to common things in early life
will grow up to be fairly unhealthy, sick adults. A more centralist
view of this was given in Greg Bear's "Earth 2" movie, where there was
a primary atmosphere on a space station, instead of many smaller
self-contained atmospheric capsules, though I'm sure I've probably
picked up this idea from some science fiction.

> 1) There is a risk to the researcher and people in the immediate vicinity
> from the materials and processes used incidentally in the work. This is

"World's smallest BSL5 lab on a chip"

> 2) There is the risk some biotechnology could be used intentionally to make
> biotoxins. These are fairly similar to the risks a home machine shop could
> be used to make guns or bombs.

"There's a risk that some people could think (and do) bad thoughts.
Ban biotech brains! Hitler! Godwin's law! " (Is that how Godwin's law
works? Oh wait. :-))

> 3) There is a risk some biotechnology may be used to intentionally make
> bioweapons like plagues as communicable diseases. This is similar to, but

Which is a problem with *your* (and my - that's why we're working
together anyway) system. That's like saying Microsoft doesn't have to
patch their operating system because viruses are morally wrong.

> individuals doing this for one reason or another. There are two risks there
> -- one is advancing the state of the art of bioweapons labs by using amateur
> developed information; the other is a bioweapon developer inserting Trojan
> DNA information into a public repository, so someone thinks they're
> replicating a benign sequence when they are really making a local plague. (A
> use for public key cryptography and circles of trust?) Still, in defense of

Poisoned data in science is supposed to be fixable, i.e. by repeating
the experiments and maybe publishing a new article to point out the
mistakes. But since the original data sets and papers don't backlink
to the updated versions, there's an ever-expanding outer sphere of
improvements while the inner core remains silent and possibly causes
damage to newly growing parts of the overall system. :-( And on top of
that, experiments rarely fully specify the exact bill of materials and
bill of technology, plus exact recipes, needed to repeat the actions
and recover the same information. I would be interested in elaborating
on a PGP proposal to NCBI though. I know a good number of
bioinformaticians who are more than willing to listen and push some
ideas upstream :-) -- esp. in Korea, believe it or not.

> the DIY-biotech crowd, Bryan has pointed out elsewhere that widespread
> knowledge of biotech might lead cooperating individuals to be able to invent
> a cure for a natural or unnatural plague, presumably like if society was
> otherwise breaking down. I think this is a stretch to expect DIY-biotech

Well, yes, generally, the more eyes you have on the bugs in a system, ...

> amateurs to do this without containment facilities to use to work on a
> pathogen safely, but I guess, in the worst case, a million already infected

But shouldn't you just be arguing for the release of freely available
biocontainment facility manufacture information? I mean, instead of
just magically hoping that everybody who does dangerous stuff is
automagically in a BSL5 facility.

> and dying DIY-biotech amateurs each bravely experimenting differently on
> themselves and their already infected and dying friends and family --
> coordinated through what remained of the internet -- might find a cure in an
> apocalyptic situation of a 100% fatal disease with minimal ethical issues in
> some variations of a worst case scenario.

Not all bugs are doomsday bugs. Some bugs submitted in bugtrackers
turn out to be feature requests. :-)

> 4) There is a risk to the ecology or global human health of an unintentional
> escaped invisible self-replicating microorganism. This is similar to, but

Yes, but the open manufacturing issues -- if focused on and solved --
would keep one guy's lack of knowledge about a superbug from turning
into a SPOF that dooms us all. (see above, etc. etc. -- I haven't said
anything new here)

> more dangerous than, the risk a machine shop could make a clanking
> self-replicator (because it is easier to know when a clanking replicator
> leaves the building, at least for the larger ones (and current plans would
> be for 3D printers like RepRap or larger things the size of buildings), and

"In other news, in a freakish industrial accident involving a giant
11-story tall Theo Jansen walking mechanism with eight thousand legs,
over 300 brave men and women lost their lives to the impending robot
invasion." Watch out- those giant buildings can sneak up on you.

> 5) There is a risk that sentient life forms capable of feeling pain will be
> created who will be unhappy about the situation. This is a complex topic,

See also: parent nightmares, worst of.

> 6) There is a risk that unauthorized personnel with different plans will
> access the collected materials. For example, local children might think it
> fun to break into a biotech lab and spread stuff around. Certainly there are
> enough twelve year old kids making computer viruses to show what can happen.
> While a break in to a current day machine shop can happen, the potential
> downside risk is much more limited mostly to, say, making small arms.
> However, with the spread of 3D printing into the home as we talk about on
> the open manufacturing list as desirable, the capability for mischief will
> grow. For example, imagine a twelve year old kid spreading a computer virus
> that gets into 3D printers and has them print out spider-like robots at
> night with venom injectors or knives that murder people in their beds. One
> child might kill hundreds of millions of people in one evening by doing that
> via the global internet if everyone had home *mechanical* fab labs, and that
> kid might be bragging all the time to his or her friends about how clever he
> or she was. Maybe that person might even be a government operative who might
> even target people by race, or campaign donations, or Facebook page, or some
> other way. How many of us leave our 2D printers on at night?
> "It's Not Exciting, but Neglecting Printer Security is Dangerous"
> http://www.itbusinessedge.com/blogs/top/?p=257
> (A possible plotline for another thrilling sci-fi novel or movie. :-)
> So, if DIY-biotech wants to start throwing rocks back, the house that open
> mechanical engineering is living in is starting to look a little glassy. :-)

Yep, that's why we want to make sure that at the core of all this
we're promoting the technologies that make it so that viruses don't
completely ruin everything. All the computer metaphors are directly
applicable here, except again for the issues re: cloning above and how
it's not the 'perfect' type that everybody wants (it's a bit fuzzy
compared to digitally backed-up information, which everyone should be
keeping because computers spontaneously fail without warning).
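On the point about digital backups and computers failing without warning: catching silent corruption is just a matter of recording a checksum with each copy and re-verifying it later. A minimal sketch (the data here is illustrative):

```python
import hashlib


def checksum(data: bytes) -> str:
    """SHA-256 digest to store alongside a backup copy."""
    return hashlib.sha256(data).hexdigest()


# At backup time: record the digest next to the data.
original = b"GATTACA" * 1000
recorded = checksum(original)

# Later: a single flipped byte (silent corruption) is caught on re-check.
corrupted = b"GATTACC" + b"GATTACA" * 999
print(checksum(original) == recorded)   # True
print(checksum(corrupted) == recorded)  # False
```

Which is also why the cloning comparison is fuzzy: biology gives you no such exact digest to verify a copy against.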

> 7) There is a meta-risk that the value of anything produced won't be worth
> the other risks, compared to non-biotech alternatives, and also that such
> efforts distract from solving social problems with existing technology. For

So far I see diybio, synthetic biology and so on as being highly
cooperative with shared designs and packaging thereof, recipes, kits,
toolchains, etc.

> example, we already have renewable energy solutions with wind and solar
> power and energy efficiency, but we haven't widely deployed those ideas. In

*or* widely distributed the exact engineering documents, specs, etc.

> Still, a similar thing might be said about open source machine shop work,
> that it is a distraction from solving existing social problems -- except for
> one key difference. Much of the open manufacturing program we have been
> talking about is to take existing proprietary or tacit knowledge and make it
> available in explicit form under free and open source licenses, specifically
> to solve a social problem of access. The purpose of most biotech R&D seems

Access is the one-word version.

> more to make new knowledge and new life forms, which seems riskier. I have

Huh? What do you define as a new life form? For instance, in the
original article that we've been talking about, Meredith was working
on her melaminometer organism, would that be a new life form? Is it a
new life form when you change one nucleotide? Two? Three? What about a
gigabyte of junk DNA added to an organism?

> less of an issue with just cataloging the biotech information we do have
> under free and open source licenses (like the Encyclopedia of Life).
> http://www.eol.org/
> Still, even in the open machine shop situation, new things are likely to get
> built. But most of those will not be self-replicating.

That last sentence sounds like you're glad they will not be
self-replicating. How odd. :-) I still haven't replied to your comment
re my comment about the curves and tails being all messed up for
growing keratin for carving into new shapes and such, based on the
amount of biomass needed to grow that much keratin in a short amount
of time (versus excavating or melting a big chunk of aluminum cans and
having a giant chunk of metal to play with). But basically that line
of thought could lead to a response showing that bio stuff could
replace mechanical stuff (even though it's not linearly programmable
tech), and then we'd have self-replicating alternatives to our
mechanically manufactured artifacts of daily life. Replacements for
everything wouldn't be immediate, and in fact the general look and
feel of "biological approaches" would most definitely require paradigm
shifts in common every-day activities. Anyway, just pointing that out.

> On reflection, I think my biggest problem with DIY-biotech comes from my
> personal experiences around biology and chemistry labs in academia, where
> even in relatively affluent surroundings (they usually had hoods) I had it
> pointed out to me by others who worked in the labs how dangerous they were
> for researchers and bystanders. While a student or worker around labs doing

Yeah, we've all heard the horror stories. It's certainly spooky. I've
also heard my share of terrible stories about negligent lab students.
There was one such negligent labrat keeping ethidium bromide in food
refrigerators, and 30 years later the labmates started dying of
cancer, save one who is now a contact of mine (pissed off as all
hell -- I'm surprised he doesn't go snap the offender's neck). Plus
general realizations about where the information in material safety
data sheets is *really* coming from. :-/

> Since everything in life is risky to some degree, we have to weigh risk
> versus reward, in the context of alternatives. By way of apology for perhaps

I vehemently disagree there, Paul. See my reasons at the beginning
regarding my headshaking at 'risk'. The mode of thought that generates
that sentence smells to me like a form of relativism ("since
everything .. is risky .. we have to weigh [risks] .. in the context
of alternatives").

> matches, or someone who has bald tires on their car. My outlining these
> risks in this note is more to think about them out loud, not to claim such
> work should categorically not be done by amateurs if it is done by
> professionals. Also, were anyone denied the chance to do DIY-biotech, maybe

Should my attempts to persuade you to give up the 'risk module' fail,
you'll probably enjoy:
http://www.global-catastrophic-risks.com/
http://www.nickbostrom.com/

> Still, I really question whether *any* scientists globally should be doing
> biotech development work beyond basic research at this point. I reject the
> argument that by itself biotech is going to make everything cheaper. Corn in
> the USA still costs something significant per pound. As I suggest here:
> http://www.pdfernhout.net/post-scarcity-princeton.html
> the cost of computing going to near zero will already drive the cost of
> nearly everything else to near zero. (Biotech itself heavily relies on
> conventional computing for data analysis.) So, I'm not convinced we need
> biotech to reduce costs right now. After we are a wealthier society, then
> we will have the resources, time, and patience to do biotech in a safer way
> (including in containment in space). Sure, I'm not that worried about what

Cost-efficiency is hardly a good reason. I suggest thinking about the
possibilities of personal growth, an understanding of biology, of what
it takes to keep your own systems running, in terms of management and
system administration, which is somewhat the function of certain forms
of biotech. Less immediately personal forms of biotech still have an
interesting role to play though, like the free biofuels that I was
talking about a few weeks ago.

> Of course, I'd probably suggest we consider restricting that entrepreneur's
> startup, too, while we were tinkering with the legal DNA of our society. :-)

The 'legal' is ambiguous here:

(1) Legal versus illegal DNA

(2) Commons infrastructure (re: CC and such, Kelty's "recursive publics", etc.)

- Bryan
http://heybryan.org/
1 512 203 0507

marc fawzi

unread,
Dec 28, 2008, 2:14:06 AM12/28/08
to openmanu...@googlegroups.com
I think the deep philosophical issue about it, in support of Meridith et al, is the statement "I exist and my existence matters", which makes people like her want to do something different rather than conform.

How much does a person want to prove that they exist and that their existence matters (disrupt and cause an unpredictable change in reality, aka havoc), vs. how much do they want to prove that they don't (conform and force others to conform so as to eliminate all risks, except that total conformity or lack of differences = zero potential energy)?

The following wants result in starkly different world views:

1. we want to sort of kind of exist and have our existence sort of kind of matter (some worry re: risk)

2. we want to exist very much and have our existence matter a great deal (no worry re: risk)

3. we want everyone to look and act like us so there is only us (total isolation from uncertainty, emotion, difference, risk, i.e. death)

:)

Paul D. Fernhout

unread,
Dec 28, 2008, 3:00:48 PM12/28/08
to openmanu...@googlegroups.com
Bryan Bishop wrote:
> You [meaning everyone] only see 'risks' and dangers because you're

> system *sucks*, and you know it. In effect you're asking them to
> "please don't do anything that might kill me | children | any of us,
> because my technological dependencies suck so much that I have to end
> up begging you instead of being sure that my dependencies will not be
> affected by you being a moron or doing something inadvertently
> stupid". Yes, it's true, we're all in the same boat (or ark?) here,
> just as long as we're sharing atmospheres, common food sources, common
> energy sources, and so on, so even I am susceptible to the same faults
> as everyone else (as well as *from* anybody), but I'm not going to
> fool myself thinking that I can ask all of the bacteria and disease
> and natural, highly distributed progression of the world to "pretty
> please stop being so nasty until I can get my work and systems
> operating, kthnxbai". The systems that I'm referencing are the type
> that have full control over their infrastructure, the "civilization
> seeds", von Neumann probes, space habitats, even in the case where
> there are vitamin supply networks since those can to some extent be
> encapsulated and defined as a single unit.
> [snip]

> IMHO we both agree that there is some turning of the head to
> "deep societal issues" (re: infrastructure and such) - but regulating
> the hell out of diybio (perceived risks) is, well, messing with the
> core concept of a proactive attempt at solving those deep and
> pervasive fundamental problems (re: infrastructure, post-scarcity). In
> fact, using the concept of 'risk' is nearly free propaganda for
> scarcity points of view. In reality there's not really that much of a
> 'risk' to the overall infrastructure and our way of life because it's
> all unsustainable anyway, not "complete", even though many people
> might be able to fool themselves into believing otherwise (in reality,
> I can't cite or bring forth a full dependency chain analysis of my
> livelihood, and I doubt any more than a handful of people presently
> living could do it). It's as if the "risk" is chipping away at the
> "security" of our industrial civilization that .. oh wait, isn't
> actually there. ;-)

Bryan-

That's all a very good point -- it's true that our society has no future as
it is because it is currently unsustainable (fossil fuel use and pollution
and social inequity) so it must change or die. Also, it faces a huge risk of
nuclear war and bio-warfare because of the current level of competitiveness
(especially in the USA). Some people think there just is no future at all no
matter what we do because of the warfare risk emerging out of
competitiveness and nationalism (but even if that were true, we should keep
up appearances anyway, and maybe we will get lucky, as in Howard Zinn's
"Optimism of Uncertainty" essay. :-) So our global society will either
change or it will destroy itself (intentionally, by accident, or just
slowly). But, our society *is* slowly changing towards sustainability and
towards better conflict resolution ideas (the internet is helping a lot
there). There is progress towards sustainability, even with backsliding. For
example, solar panels and windmills are poised to reshape the energy
landscape over the next decade or two. Still, there are different paths to
sustainability, some more obviously dangerous than others.

Perhaps, for me, it just comes down to that in weighing clanktech vs.
biotech in ushering in worldwide prosperity with the minimal dangers
(assuming the current level of biotech and agriculture to go with advanced
clanktech), it seems to me that clanktech is the clear winner in the most
reward for the least risk in the near term. (Clanktech, for reference, being
a term I just made up for our current level of industrial civilization and
its ability to self-replicate and self-repair at the human-run or somewhat
automated machine shop level of macroscopic (not microscopic) objects, also
using existing biotech like agricultural plants.)

But there is that word "risk" again. :-)

(By the way, I'm the only one I knew who once won the board game "Risk"
playing from Europe, but even then I have to admit I got "lucky" with two
other opponents fighting it out. Of course, nowadays I'm more into
"cooperative games").
http://www.familypastimes.com/
http://www.learningherbs.com/wildcraft.html
http://www.co-optimus.com/game.php
http://www.boingboing.net/2006/01/11/variant-rules-that-m.html
http://en.wikipedia.org/wiki/Cooperative_board_game
http://www.boardgamegeek.com/browser.php?itemtype=game&sortby=mechanic&mechid=23&mechname=Co-operative+Play
http://en.wikipedia.org/wiki/Finite_and_Infinite_Games
http://video.google.com/videoplay?docid=-962221125884493114

Maybe I've just been too long a fan of comp.risks. :-)
http://groups.google.com/group/comp.risks/topics

I used to be a World Future Society member too:
http://en.wikipedia.org/wiki/World_Future_Society
(At first the membership was a gift from the sister of the wife of the WFS
founder who I used to hang out with in relation to a Unitarian Church Social
Concerns Committee and some salons she ran; I also took a public policy
graduate class by chance with a son of the WFS founder; somewhere along the
line with moving and such I let the membership lapse.) The WFS brand of
futurism in part suggests you should outline the possible futures in order
to get a sense of what is possible and to promote thinking about that. From
wikipedia: "Through its magazine The Futurist, media, meetings, and dialogue
among its members, it raises awareness of change and encourages development
of creative solutions. The Society takes no official position on what the
future may or should be like. Instead it provides a neutral forum for
exploring possible, probable, and preferable futures."

Alan Kay style futurism is more how you might shape that by invention (as in
"The best way to predict the future is to invent it." :-) But I can't say I
don't like that idea, too. :-)

I used to think we could have space habitats that would be distant enough to
be immune from Earthly nonsense. I later came to see that if my beautiful
spinning O'Neill habitat has clanktech (machine shop size self-replicating
infrastructure that may even have human labor as an integral component), and
Kurzweil is busy building his *nanotech* weapons and other types of
"defensive" weapons systems out of paranoia and competitiveness (that may
well destroy all of Earth), then what are the odds some of that junk will be
launched into space as well to consume all the clanktech?

In "Voyage From Yesteryear", the post-scarcity culture was confronted by
competitive psychopaths from Earth trying to take over and change that new
society back into craziness, fortunately, unlike the Native Americans wiped
out mostly by invaders and their accompanying pathogens, the mythical
Chironians with similar post-scarcity views on the universe had the upper
hand in technology and social organization over the invaders for Earth. But
I pointed out to Hogan that had the Earthers just sprayed human biological
pathogens from orbit, there might have been no story, same as with the
destruction of most of the Native Americans.

So, I'm unconvinced anymore that isolation does any more than change some
time constants of interaction -- which may well be a good thing, but by
itself is not enough without a general mindshift change globally.
http://www.global-mindshift.org/memes/wombat.swf
That is one reason for my engagement with ideas on the internet in various
posts, rather than being 100% hunkered down coding my own survival. It's
more like, we all go forward together or no one is going anywhere (for long).

We live in a matrix of both meshwork and hierarchy (Manuel De Landa), of
both limited isolation and limited connectivity. While it is true maybe
somewhere deep underground in some moon somewhere some clanktech (or even
biotech) might survive a limited nanotech war, the odds seem low to me. The
fact is, the clanktech Berserker series was extremely optimistic in that
sense (that the humans could fight advanced machine berserkers).
http://www.berserker.com/
Metascale but still nanotech-powered threats seem much more possible as
threats if nanotech worked eventually:
http://en.wikipedia.org/wiki/Replicator_(Stargate)
And a heat signature on a moon will probably give even such a clanktech or
biotech facility away.

I am starting to feel maybe the only hope is for the entire social network
of humanity to resist these things as it would epidemics. It perhaps
requires an entire social network with some common values and procedures,
not some isolated outposts hoping to survive the worst. At least, that's how
I see it now. As Benjamin Franklin said, "We must all hang together, or
assuredly we shall all hang separately." (For separate hanging, replace that
with your favorite doom from robots.) Here's one with tarantula-scale
spidery robots with venom that I was probably half-remembering in my
previous post:
http://www.cyberpunkreview.com/movie/decade/1980-1989/runaway/
(By the way, I forgot to mention another common failure mode is the twelve
year old gets control of a botnet of millions of household robots...)

So, this is more of an ecological view, that there will be some threats and
they will mostly be dealt with on a recurring scale (or not) by a social
network, the same as invasive biological pests today (some prevention, some
destruction, some just living with, some introduction of parasites hoping it
doesn't make things worse). Still, going out of your way to make
potentially invasive pests when you don't need to seems foolish or reckless.
I maintain we just don't need much biotech right now (even though there have
been minor successes that have helped some people). We would probably be
better off at least developing a better understanding of human health and
disease fighting first, too. Anything that increases the spread of this
seems to be going in the wrong direction, IMHO.

Also, please note I am not just picking on the DIY-biotech people (except in
that they represent in some sense an extreme of least regulation of a messed
up system). I feel the entire biotech industry is questionable at this point
in our social-technical development (including the weapons aspect run by the
government, which may have created a more virulent weaponized Lyme disease
that may then have been accidentally released)
http://www.google.com/search?hl=en&q=lyme+plum
I think the fairest argument there to defend DIY-bio is to argue that
industry, government, and academia are already so irresponsible, what does
it matter if people do it at home towards more positive ends? Is the
genie out of the bottle? And so it does not matter? If so, one might argue
that the value in DIY-biotech at home will indeed outweigh the additional
risk, along the lines of creating widespread knowledge of the dangers and
providing some new valuable products. But note, that's a very different
argument than "rah, rah" cheerleading that biotech will make everything
cheap. This is more a, "Biotech researchers are already burning down the
house, so let's roast a few marshmallows while we're at it on that fire
because we're so hungry now that the kitchen is gone". :-(

Also, I still maintain we have not even begun to explore what is possible
with what we already have (biologically and technologically) to make our
civilization more sustainable, secure, and prosperous. As I reflect more on
that idea just now, it might be a lot wiser to consolidate what we know now
as a priority instead of focusing on making even more new things right now
(even, frankly, faster computers). This isn't intended as stopping change --
this is saying, maybe our priority should be in consolidating our gains for
the benefit of all humanity before going forward to more new stuff later.
And there remains a lot of innovation to do in even consolidating what we know.

From the OSCOMAK ideas of a decade ago, here:
http://www.kurtz-fernhout.com/oscomak/origins.htm
"""
Robert Muller, Assistant Secretary-General (retired), United Nations, quoted
in Surviving: The Best Game on Earth by Norrie Huddle, Schocken Books, New
York, 1984, pg. 251 - 252.

The present condition of humanity was best described by the philosopher
Gottfried Leibnitz a few hundred years ago when he said that humans would be
so occupied with making scientific discoveries in every sector for several
centuries that they would not look at the totality. But, he said, someday
the proliferation and complexity of our knowledge would become so
bewildering that it would be necessary to develop a global, universal, and
synthetic view. This is exactly the time and juncture at which we have
arrived. It shows in our new preoccupations with what is called
'interdisciplinary', 'global thinking', 'interdependence', and so on. It is
all the same phenomenon.

One of the most useful things humanity could do at this point is to
make an honest inventory of what we know. I have suggested to foundations
that they ought to bring together the chief editors of the world's main
encyclopedias to agree on a common table of contents of human knowledge. But
it can be a dangerous idea. Why? Well, when the Frenchman Diderot invented
the first encyclopedia, the archbishop of Paris ran to the king of France to
have the book burned because it would totally change the existing value
system of the Catholic church. If we developed a common index of human
knowledge today it would similarly cause a change in our value systems. We
would discover that in the whole framework of knowledge the contest between
Israel and the Muslims would barely be listed because it is such a small
problem in the totality of our preoccupation as a human species. The meeting
might have to last several days before the editors would even mention it!
This is exactly the point: some people don't want to develop such a
framework of knowledge because they want their problem to be the most
important problem on earth and go to great lengths to promote that notion.

So that is what I believe to be most necessary for global security: an
ordering of our knowledge at this point in our evolution, a good, honest
classification of all we know from the infinitely large to the infinitely
small - the cosmos, our planet, humanity, our dreams, our wishes, and so on.
We haven't done it yet, but we will have to do it one way or another.
"""

So, I think we have a lot of work to do before thinking we are *worthy* as a
society to go around making lots of new life forms, sorry. (Agreed, it's
fuzzy whether to call a modified bacterial strain a new life form,
especially as the genes will soon jump globally into the bacterial
supercomputer that surrounds the Earth.) Notice that any success for
developing clanking replicators by OSCOMAK would essentially flow out of
such a larger ordering of all manufacturing knowledge (including by adding
metadata) -- it would be closer to a last step to make a new replicator than
a first step. And, as with the Skills of Xanadu, organizing technical
knowledge is just part of a larger current worldwide effort to also organize
information about social issues and virtues and values including access to
the arts and literature, which is happening in parallel with OSCOMAK. But
making new replicators is a *first* step of DIY-biotech as it has been
presented (as in that picture with Meridith), before that biological
knowledge is organized, systematized, analyzed, and reflected upon (let
alone the social knowledge to use the technical knowledge wisely). Still, I
know that sentiment won't stop any biotech, because we already have a lot of
it literally in the field and on our plates (GMO corn, etc.). Again, as I
said, I feel the best argument for DIY-biotech is, "the rest of the biotech
system and our society is so messed up, so why bother us?" And, I feel there
is some legitimacy to that line of argument.

One of my earliest memories as a child is having my crib taken away and
replaced by a regular bed, probably in part because I had started just
climbing over the bars in the morning. I was very upset about this because I
didn't want the change -- I liked the crib, maybe the bars made me feel
safe, I don't know. :-) But, consider this:
"The Family Bed in Islam"
http://www.themodernreligion.com/family/bed.html
"The family bed is an aspect of traditional family life, which has largely
become a thing of the past [in the USA, but not elsewhere]. Even Muslims
have adopted the unnatural Western cultural practices of confining the baby
to a separate room away from its parents and replacing breast-feeding with
bottle-feeding. ... The standard American baby handbook, What to Expect the
First Year (Eisenberg) advises: "If you can tolerate an hour or more of
vigorous crying and screaming, don't go to the baby, soothe him, feed him,
or talk to him when he wakes up in the middle of the night. Just let him cry
until he's exhausted himself-and the possibility, in his mind, that he's
going to get anywhere, or anyone, by crying-and has fallen back to sleep.
The next night do the same; the crying will almost certainly last a shorter
time…You may find that earplugs, the whir of the fan, or the hum of voices
or music on the radio or TV can take the edge off the crying without
blocking it out entirely." ... Is it any wonder that American youth feel
alienated and depressed? Today's young people are characterized by a lack of
connection with the home and family and a deep insecurity about whether they
are loved. This feeling of distance from others is most likely something
which started at infancy. If we gave our child the message since he was a
baby that we are only available if and when it is convenient to us, who can
blame them when they have problems later on in his life. If feels afraid and
alone, it will not occur to him to ask his parents for advice, but he will
instead turn to love substitutes and develop bad habits. Could you respect
someone who sat by and knew you were crying and didn't try to help you solve
the problem?"

So, perhaps I developed a belief that walls and bars could be trusted (as
might many US Americans), but people could not be trusted. Now, I'm trying
really hard here to accept that Meridith can be trusted -- that social
networks can be trusted. But, you're also turning around in your comments
and then telling me, no that isn't really true, the only reason to trust
Meridith is so she can help build cribs in space or on Earth to make us safe
from the Kurzweils of the world (or the twelve-year old biohackers). Which
is it? :-)

Or is it only some aspects of the social network can be trusted? And we
still want backups? I'm not saying which it is, I'm just pointing out
inconsistencies and related issues.

> I know a
> good number of bioinformaticists that are more than willing to listen
> and push some ideas upstream :-)-- esp. in Korea, believe it or not.

From:
http://www.pdfernhout.net/reading-between-the-lines.html
"While not strictly about college, the movie Good Will Hunting is a good
example of the dynamics of this system and the absence of alternatives. Like
Rudolf the Red-Nosed Reindeer, Hunting is teased and attacked at an early
age for his unique abilities. Hunting is only able to obtain expensive care
for his related mental difficulties because an academic (Santa) sees value
in that ability to serve [the academic] (not [Hunting] as a person). The
trade-off is that, like Rudolf guiding Santa's sleigh to gain acceptance for
his difference, Hunting is then asked to apply his talents to the
military-industrial complex, either in a job for a defense contractor
implementing Mutually Assured Destruction or in a job for a financial
services firm implementing market-driven World Hunger. Fortunately, at the
end of the movie he picks a different path (although there remain overtones
of elitism in his choice -- an academically and professionally successful
girlfriend is all that is acceptable to the plot). In the course of the
movie, the alienation from his lifelong childhood friend and lifelong
neighborhood is completed. It is simply unacceptable to the community for
Will Hunting to continue in a life of manual labor which he enjoys, both for
the work and the camaraderie, and to continue to use his gift in a
self-pleasing hobbyist way -- his gift must be put to use for the capitalist
system for it or for Will Hunting to be deemed of value. "

A big problem with our society and many stories in it is that they focus on
people finding value in doing things for others, which, while very
important, is only part of life, and also in many stories the relationships
are asymmetrical in terms of power or content (like love flows one way and
money the other). Happiness in life can come from several things, like:
* pleasure (hedonism)
* a sense of flow in work/play
* helping others
* preserving some pattern that is important to us (a sense of duty?)
* healthy growth
and maybe other abstract categories. Rudolf and Good Will Hunting were
pushed by others into certain roles because others found them of commercial
or political value. And that seems not good.

As for the rest of your dismissal of risk assessment, again, as six months
ago, I urge you to read the book "A Wizard of Earthsea" by Ursula K. Le
Guin, if you have not already:
http://en.wikipedia.org/wiki/A_Wizard_of_Earthsea
"At the school, Sparrowhawk masters his craft with amazing ease, but his
pride and arrogance grow even faster than his skill and, in his hubris, he
attempts to conjure a dead spirit - a dangerous spell which goes awry. He
inadvertently summons a spirit of darkness which attacks and scars him. The
being is driven off by the Archmage, who exhausts himself in the process and
dies shortly thereafter."

On one of your points, there are a lot of people in this world who with one
phone call could call in an airstrike against "a giant 11-story tall Theo
Jansen walking mechanism with eight thousand legs" and make that accidental
release a non-problem (except for cleaning up the mess). There is no one I
know in this world who with one phone call can make an accidental release of
a plague like AIDS or Lyme disease go away. These are risks of totally
different qualities.

I'm not arguing against freedom in its totality (for DIY-biotechers or
clanktechers or others). But, I would argue that freedom is one of many good
things (family, prosperity, security, comfort, health, neighborliness,
charity, and so on). And it is in the nature of most things that the cost of
going toward 100% of anything grows exponentially (regardless of what
Benjamin Franklin says about keeping government on a short leash). So, it is
the last few percent of "freedom" that is the costliest, just like it is a
lot harder to get a grade of 100% in school than 95% (SATs are set up this
way too, with a few very hard problems). And that is what we are
arguing about here IMHO in discussing the value of biotech to our society
versus the freedom to do it (however it is done, by reckless career-driven
academic professionals using up their graduate students, or by careful
safety-conscious amateurs with nothing to prove, or both).

Here is a story about trying to get that last 1% of freedom and passing on
all the rest of the values in life, which has a rather unhappy ending:
http://en.wikipedia.org/wiki/Vagabond_(film)
"The film begins with the contorted body of the woman, covered in frost.
From this image, an unseen and unheard interviewer puts the camera on the
last men to see her and the ones who found her. The action then goes
backwards, to see the woman, Mona (Sandrine Bonnaire) walking along the
roadside, hiding from cops and trying to get a ride. Along her journey she
meets and takes up with other vagabonds such as herself as well as a
Tunisian vineyard worker, a family of goat farmers, a professor researching
trees, and a maid who envies what she perceives to be a beautiful and
passionate lifestyle. Mona explains to one of her temporary companions that
at one time she had an office job and did very well for herself, but she
became unsettled with the way she was living -- choosing instead to wander
the country free from any responsibility, picking up what she could to
survive as she goes. Throughout the film, Mona's condition seems to become
progressively worse until she finally falls where we first saw her, frozen
and entrenched in her misery in a ditch."

The problem is in part, as Gatto, Holt, Goodstein, Llewellyn, and many
others point out, that the basic institutions in our society like most K-12
schools or the PhD process (or many other big institutions) are so messed up
at this point that we don't know what responsible freedom looks like
anymore. We have departed only very recently (the last few thousand years)
from the hunter-gatherer context we evolved in for so many hundreds of
thousands of years. Without good stories to guide us about balance, stories
like Ursula K. Le Guin tries to write, individuals tend to jump from the
frying pan into the fire, without having a chance to lead a more balanced
life because they have never seen that modeled in life or in story or song.
And the advice of many people who are loving and caring and trying to be
helpful is genuinely worthy of suspicion for that reason, because of limits
of the advice giver's perspective and experience, especially if, like
Kurzweil, they have been a success in a competitive capitalist economy (say,
the academic who got a PhD in the 1970s pre-crunch and still recommends
academia as it worked for them).

For some reason I'm thinking about this issue right now: :-)
http://www.wildhorses.com/
"Return to Freedom is a great organization that operates The American Wild
Horse Sanctuary in Lompoc, CA, a wonderful refuge for horses and burros."
http://www.returntofreedom.org/
"Return To Freedom is dedicated to preserving the freedom, diversity and
habitat of America's wild horses through sanctuary, education and
conservation, while enriching the human spirit through direct experience
with the natural world."

But even those "wild" horses have complex herd and family behavior that
constrains who they can be -- but perhaps they are happier as wild horses in
a natural herd rather than as either pack animals or racehorses? :-)

So you're going to have to decide if with my advice I fall into that
category of well meaning people stuck in an old metaphor. Still, even people
running preserves for wild horses may worry about them running on
superhighways full of cars. :-)

But in the end, all metaphors break down as "the map is not the territory"
and humans are not horses, and as human children become adults, they make
their own environments for themselves and their friends (even just houses
and gardens and homesteads and cities and nations, but someday even ocean
habitats and space habitats), and people decide who they associate with for
whatever reasons, depending on the costs and the benefits to themselves or
those they care about.

Maybe DIY-biotech will save the planet for humanity instead of destroy it. I
remain skeptical.

Anyway, I probably should rewrite this more, but I'm sending it and trying
to move back to coding the next version of a social semantic desktop.

--Paul Fernhout
