> Dear DIY Biology people,
Weren't you the person who said that you don't trust us?
> Rather, this is about asking people who identify as members of a
> synthetic biology community to take a few next steps toward coming of
> age.
>
> The facts on the ground
>
> The second starting point is to imagine two circles in a Venn
> diagram. One circle is the set of people who know how to perform
> various manipulations and pieces of construction work, who could for
> example make the DNA, or troubleshoot what was wrong in a co-
> transfection setup as above. The second circle is the set of people
> who might be motivated to build and release a self-replicating
> organism that hurts people. The number of people in the first circle
> has been growing steadily (at a guess, around 10% per year) for many
> decades since 1973. At the moment, the number in the second circle is
> large, and is affected by international political attitudes (I am
> guessing that it has grown significantly in the past 5 years). If we
> are in luck, there might now be no people in the intersection of those
> two circles. But even if we are lucky now, there is no reason to think
> we will stay lucky in the future, because the number of people in the
> first circle will continue to grow.
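To make the quoted growth guess concrete, here is a quick back-of-envelope sketch in Python. Both the 10% rate and the 1973 starting point are the original poster's guesses, not data:

```python
# Back-of-envelope for the "first circle": people with the relevant
# skills, growing at a guessed ~10% per year since 1973.
# Both numbers are guesses from the quoted email, not measurements.

def circle_size(base, rate, years):
    """Compound growth: base * (1 + rate) ** years."""
    return base * (1 + rate) ** years

# Relative growth factor over 35 years (1973 -> 2008) at 10%/year:
factor = circle_size(1, 0.10, 2008 - 1973)
print(round(factor))  # -> 28, i.e. roughly 28x the 1973 pool
```

Whatever the 1973 baseline was, a steady 10% a year multiplies it nearly thirtyfold over 35 years, which is the whole point of the "no reason to think we will stay lucky" argument.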
How about you just go use a separated habitat/environment if you're
going to be a hypochondriac and hikikomori about viruses?
> To run the calculation for the first circle, let's ask, if there
> are 20,000 undergraduates at UC Berkeley, how many possess the
> technical skills and access to labs to make a gram-positive organism,
> anthrax, resistant to the first-line fluoroquinolone antibiotic,
> ciprofloxacin? Let's guess that one tenth of them do. 2,000
> UC Berkeley undergraduates. Now, let's try to guess how many have the
> DNA manipulative skills needed to construct the plasmids and perform
> the transfections needed to follow recipes to recover animal viruses?
> Surely, more than 20? Maybe 200? Now, given that techniques keep
> getting easier, and more people keep getting trained in their use, how
> many past and present UC Berkeley undergraduates will have those
> skills in 2016?
>
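The quoted head-count can be spelled out in a couple of lines; every fraction here is the poster's guess, not a measurement:

```python
# The UC Berkeley guesswork from the quoted email, made explicit.
undergrads = 20_000          # UC Berkeley undergraduates
frac_with_skills = 0.10      # guess: one tenth could make the resistant organism
skilled = int(undergrads * frac_with_skills)
print(skilled)               # -> 2000

# Guess at those with the plasmid/transfection skills needed to
# recover animal viruses: somewhere between these bounds, per the email.
virus_skilled_low, virus_skilled_high = 20, 200
```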
> So I would like to stipulate some things. I believe that most
> reasonable people can agree with Venter that the applications of
> synthetic biology within the current ghetto boundaries pose no
> significant risk. Hold a gun to my head, and I say: "zero risk".
> Zero, zip, nada, none. To say this again, there is no reason anyone
> should fear a minimal Mycoplasma genome, or a bug that makes plastic,
> or methane, or artemisinin. Period. Full stop.
>
> I further submit that developing an additional class of DNA
> hackers via an undergraduate engineering route (as opposed to the
> existing scientific or biomedical communities) also provides some
> increment of risk. I can't quantify that risk, either, although I
> suspect it is not high, but it will become very much higher if we
> permit an outlaw hacker culture to come into being and are foolish
> enough to glamorize it.
So, whenever I see people talking about risks, I always refer to the
concise Wikipedia article on SPOFs, or Single Points of Failure. In
any networked system, a SPOF is a node whose spontaneous failure can
take everything else down with it, sometimes as a cascading black
swan: completely unpredictable, and it turns everything upside down.
For instance, if you keep your very important document on only one
computer, that computer is a SPOF, because if that one point fails,
everything else you were going to do with that document becomes
undoable.
http://en.wikipedia.org/wiki/Single_point_of_failure
"""
The strategies to prevent total system failure are:
1) Reduced Complexity
Complex systems shall be designed according to principles decomposing
complexity to the required level.
2) Redundancy
Redundant systems include a double instance for any critical
component with an automatic and robust switch or handle to turn
control over to the other well functioning unit (failover)
3) Diversity
Diversity design is a special redundancy concept that cares for the
doubling of functionality in completely different design setups of
components to decrease the probability that redundant components might
fail both at the same time under identical conditions.
4) Transparency
Whatever systems design will deliver, long term reliability is based
on transparent and comprehensive documentation.
"""
So, I see DIYbio as implementing a few of these, especially #4 and #1.
But when it comes to risks of total system failure due to viruses, I
see we're completely failing at #3, because we have only one single
atmosphere, and thus a single shared medium for harmful biological
agents. I mean, it's just a bad design. I really do like everything
else, but I hardly think it's fair to play the blame game and act
like viruses alone are the reason that possibility is so grim; it's
also partly due to the way the world has historically worked.
> The reason the boundaries and self-policing can't work anymore is
> that the multiple and reasonable connotations of the term "synthetic
> biology" naturally mean that anybody not of the ghetto will
> immediately associate it with the entirety of recombinant DNA work in
> general. And this is a time when discussions about recombinant DNA
> powered work are breaking surface again. For all sorts of reasons,
> including the ones above, on November 18th, Kofi Annan of the UN
> called for a world discussion about the dangers arising from the
Global problems need global solutions- habitats are one way if you
want to do the environmental isolation from viruses gig. I mean, let's
be honest here. You're looking at the problem of viral infection, so
let's solve *that* problem. Viruses are easy to replicate. Habitats
aren't; that's where the actual effort needs to go, if you really
are worried about these viral infections.
<snip>
> possible and prepare the ground for the needed work. Denial and
> evasion are not our friends here. People who are refusing to
> acknowledge that there is a problem while other people are
> working to envision 21st century public health systems... such people
> are just not helping Uncle Ben.
Because Uncle Ben really isn't a person. Now, if the government wants
to contribute, stay active in these matters, increase infrastructure
for new research and so on, that's fantastic and I am sure many people
on this list would find that a worthwhile use of their time.
> Responsibility to articulate and help bring about positive
> consequences. I suspect that most people who read this will share the
I agree. There is a strong need for clear articulation of topics and
issues. I suspect that there is a problem of 'institutional
boundaries' however- some of us might be articulating thoughts from a
point of view that government bodies simply can't address, such as
the idea of moving labwork beyond university and corporate
boundaries, which for a while now have been the norm.
"Why should the next election not feature a proposal .. on open source
standard parts?" I'd be happy to support open source standard parts
for just about anything. An election might not be the best place for
that topic though- technical things should happen for technical
reasons- which you gave a good list of (food, energy, health, housing,
water, cleanup, ..).
Um, dude, what do you think speciation is?
There are thousands, perhaps millions of species-specific proteins and
metabolic pathways out there. Countless more existed in species that
are now extinct. Certainly there's a laundry-list of machinery that's
common across species -- heritability and parallel evolution at work
(thanks, selection pressure!) -- but the innumerable variation already
extant in nature is precisely the result of species-specific genes.
Even common functionality has subtle differences from one species to
another, DNA-wise.
Cheers,
--mlp
Yes, in humans it encodes 13 proteins, which have caused Aubrey
quite a headache. There have been some who have suggested attempting
to port those genes into the nuclear DNA instead of leaving them in
the mitochondrial DNA.
http://www.mfoundation.org/research/adgpubs#allo
http://www.mfoundation.org/research/adgpubs#mtmut
See also MitoSENS. I'm sure there's other more interesting
mitochondrial DNA research out there that I am failing to remember to
cite.
Why do mitochondria have DNA? Do they use it somehow?
-Tim
Thank you. :-) Strategies for preventing total system failure are
important, and another aspect of it is a constructive or
creation-based approach. Anyway, check out this too:
http://constructal.org/ "According to the Constructal law, every
system is destined to remain imperfect, i.e. with flow resistances.
The natural constructal tendency then is to distribute the
imperfections of the system, and this distribution of imperfection
generates the shape and structure of the system."
http://en.wikipedia.org/wiki/Systemantics
"""
* The Primal Scenario or Basic Datum of Experience: Systems in general
work poorly or not at all. (Complicated systems seldom exceed five
percent efficiency.)
* The Fundamental Theorem: New systems generate new problems.
* The Law of Conservation of Anergy [sic]: The total amount of anergy
in the universe is constant. ("Anergy" = 'human energy')
* Laws of Growth: Systems tend to grow, and as they grow, they encroach.
* The Generalized Uncertainty Principle: Systems display antics.
(Complicated systems produce unexpected outcomes. The total behavior
of large systems cannot be predicted.)
* Le Chatelier's Principle: Complex systems tend to oppose their own
proper function. As systems grow in complexity, they tend to oppose
their stated function.
* Functionary's Falsity: People in systems do not actually do what the
system says they are doing.
* The Operational Fallacy: The system itself does not actually do what
it says it is doing.
* The Fundamental Law of Administrative Workings (F.L.A.W.): Things
are what they are reported to be. The real world is what it is
reported to be. (That is, the system takes as given that things are as
reported, regardless of the true state of affairs.)
* Systems attract systems-people. (For every human system, there is a
type of person adapted to thrive on it or in it.)
* The bigger the system, the narrower and more specialized the
interface with individuals.
* A complex system cannot be "made" to work. It either works or it doesn't.
* A simple system, designed from scratch, sometimes works.
* Some complex systems actually work.
* A complex system that works is invariably found to have evolved from
a simple system that works.
* A complex system designed from scratch never works and cannot be
patched up to make it work. You have to start over, beginning with a
working simple system.
* The Functional Indeterminacy Theorem (F.I.T.): In complex systems,
malfunction and even total non-function may not be detectable for long
periods, if ever.
* The Newtonian Law of Systems Inertia: A system that performs a
certain way will continue to operate in that way regardless of the
need or of changed conditions.
* Systems develop goals of their own the instant they come into being.
* Intrasystem [sic] goals come first.
* The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems
usually operate in failure mode.
* A complex system can fail in an infinite number of ways. (If
anything can go wrong, it will.) (See Murphy's law.)
* The mode of failure of a complex system cannot ordinarily be
predicted from its structure.
* The crucial variables are discovered by accident.
* The larger the system, the greater the probability of unexpected failure.
* "Success" or "Function" in any system may be failure in the larger
or smaller systems to which the system is connected.
* The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by
failing to fail safe.
* Complex systems tend to produce complex responses (not solutions) to problems.
* Great advances are not produced by systems designed to produce great advances.
* The Vector Theory of Systems: Systems run better when designed to
run downhill.
* Loose systems last longer and work better. (Efficient systems are
dangerous to themselves and to others.)
* As systems grow in size, they tend to lose basic functions.
* The larger the system, the less the variety in the product.
* Control of a system is exercised by the element with the greatest
variety of behavioral responses.
* Colossal systems foster colossal errors.
* Choose your systems with care.
"""
> I think I'd add something to that list... something that Jason's
> microbial-birdwatching addresses I think - I haven't quite put my finger on
> it yet, but it's got something to do with an network-based, over-arching
> awareness of what's going on. Everywhere.
That's interesting, could you elaborate some more on this over-arching
awareness?
> I mean transparency is one thing, but you've still got to have people who
> can read - and having networks of on-the-ground trend/change-spotters would
> create a lot of resilience in the system overall - with regards early
> warnings etc.
Neil Gershenfeld has mentioned that there is a new type of literacy,
one that goes well beyond basic reading and writing and is more in
tune with microcontroller programming, or basic stuff-hacking.
"From this combination of passion and inventiveness I began to get a
sense that what these students are really doing is reinventing
literacy. Literacy in the modern sense emerged in the Renaissance as
mastery of the liberal arts. This is liberal in the sense of
liberation, not politically liberal. The trivium and the quadrivium
represented the available means of expression. Since then we've boiled
that down to just reading and writing, but the means have changed
quite a bit since the Renaissance. In a very real sense post-digital
literacy now includes 3D machining and microcontroller programming.
I've even been taking my twins, now 6, in to use MIT's workshops; they
talk about going to MIT to make things they think of rather than going
to a toy store to buy what someone else has designed. The World Bank
is trying to close the digital divide by bringing IT to the masses.
The message coming back for the fab labs is that rather than IT for
the masses the real story is IT development for the masses. Rather
than the digital divide, the real story is that there's a fabrication
and an instrumentation divide. Computing for the rest of the world
only secondarily means browsing the Web; it demands rich means of
input and output to interface computing to their worlds. There was an
amazing moment as I was talking to these Army generals about how the
most profound implication of emerging technology for them might not
lie in designing a better weapon to win a war, but rather in giving
more people something else to do. So we're now at a cusp where
personal fabrication is poised to reinvent literacy in the developed
world, and to engage the intellectual capacity of the rest of the
world."
re: http://en.wikipedia.org/wiki/Systemantics
My dad would agree with this one:
* Le Chatelier's Principle: Complex systems tend to oppose their own proper function. As systems grow in complexity, they tend to oppose their stated function.
He's a school teacher - and has long held that every new "measure" instituted within the educational system always has the exact opposite effect to the one intended.
I'd add one to that list I think :
* Complexity is the fractaline manifestation of flawed assumptions.
A bit like 80s haircuts etc.