Which Singularity Don't You Want?

Damien Sullivan

Nov 3, 2005, 1:19:28 PM

There's something of a Singularity backlash these days, at least memetically.
To pick a prominent example on rec.arts.sf.written, we have James Nicoll, who
asks for optimistic hard SF set in the near future, in space, no catastrophes
killing off most of humanity, and *no Singularity*.

What exactly does he mean by that? I could ask him, but it's more fun to go
through some possibilities first, since in my experience "the Singularity" has
become slippery, with many possible (if related) meanings.

To go from the most extreme to the most sensible root: first we have the
"Technorapture", where something -- intelligence, connections, technology --
increases exponentially over the course of a few hours, culminating in a mass
upload of everybody on Earth and possibly their subsequent disappearance from
the physical universe. The core Vingean example is presumed in _Marooned in
Realtime_ (where the Extinction/disappearance was a *plot device*, lest we
forget.) Preceding that as an image was _Childhood's End_, but psionics and
the Singularity don't mix well. Explosive transcensions also happened in _A
Fire Upon the Deep_, though at least the Powers stayed in our universe.
Extreme Drexlerian nanotech tends to be involved, to do the uploading and
provide computational substrate.

Saying you don't want this is perfectly reasonable to me; Drexlerian nanotech
is something to be skeptical of, as are the claims to extreme speed and
universality of the process, and we can forget about the mystical
disappearance.

(Though in fairness I'll note that the concept of making "basement universes"
to disappear into has some attention from physicists. But mass nanotech is
not obviously a tool for quickly making black holes, which would be needed.)

(I'll also note that while Ken MacLeod is sometimes quoted for his "The
Rapture for Nerds!" line in _The Cassini Division_, the nerds in the book
turned out to be *right*. They had their Rapture, and they were conscious,
not zombie programs.)

Dropping down a level, "post-Singularity" has been applied to various
settings, not always with the superhuman intelligence, but with cool
technology which some might think makes storytelling challenging, generally
nanotech and AI, with the key functions being the copyability of almost
everything, including human minds: "post-scarcity" and immortality
("post-mortality", perhaps?). Examples: MacLeod's _The Sky Road_, where a near
future with nanotech immortality pills and some modestly transhuman minds was
internally called "post-Singularity". Vinge's "Just Peace" had mental
copying; his "Original Sin" had immortality and weird probability tech;
MacLeod's _Newton's Wake_ had another Rapture but also various normal humans
running around with backups. Wil McCarthy's Collapsium books seem
post-Singularity in this sense. Also Cory Doctorow's _Down and Out in the
Magic Kingdom_. Some readers seem to be getting tired of all this, so "no
Singularity" seems interpretable as "no nanotech! No copies, backups, or
uploads! No biological immortality!" Again, there's grounds for being
skeptical of the nanotech, though my impression is that if other ways were
found of doing the same things the readers would still be unhappy.

Dropping further, we get closer to what I think of as Vinge's original ideas.
One form is that when we learn to increase our intelligence, those smarter
results will be able to increase their own even faster, in an exponential
growth for at least a while, with accompanying exponential growth in other
technologies. The Technorapture is just an extreme manifestation of the
process; the post-scarcity, post-mortality technologies are sometimes a
spinoff, sometimes an enabler (that level of nanotech being considered useful
for the mental technologies.) Skepticism: since the smarter beings are
probably even more complex, they may need all their enhanced brains to
accomplish a similar increase. Whether a continued linear increase in
intelligence is really less worldshaking (especially for the SF writer) than
an exponential one is a question I leave to the reader.

Which brings us to the root of it all in John Campbell's rejection of Vinge's
sequel to "Bookworm, Run!", with a human enhanced like the chimp of the
original story: "You can't tell this story, and neither can anyone else."
This has cousins in Niven's own writings on superintelligent beings; the basic
idea is that you can't plausibly write about someone much smarter than
yourself, let alone a society of them. Sure, they might still have human
motivations, but can you portray their thoughts, their actions, the
technologies or social arrangements they would produce? Can you understand
them? -- Which actually leads me to a common conflation: difficulty of
*prediction* is not the same as impossibility of *understanding*; we might
well be able to understand a posthuman world (even if not quickly enough to
keep up) without being able to create one. Loose analogies fly around at this
point; some say "they'll be to us as we are to dogs", I invoke
Turing-completeness and say dogs just aren't that good at understanding each
other, in the sense we mean it. It's not that we're too complex for dogs; it's
that dogs aren't complex enough to understand much of anything, while we are.

But anyway, at this point the cry of "no more Singularity" sounds like "don't
show enhanced intelligence!" which I think gets to be a problem, if we're
pretending to be talking about hard SF. As Vinge wrote in his classic essay,
there are multiple routes to enhancing intelligence. Developing human level
AI and making it better is just one of them, not even that frequent in Vinge's
writing. A second was IA, "intelligence augmentation", itself with multiple
forms; one was computer prosthetics to human brains, as shown in _The Peace
War_ and _Marooned in Realtime_ and "True Names". A third was connecting
human minds together, mentioned in _Marooned in Realtime_. Usually as direct
neural links, but I note that if we take "mind as society of agents"
seriously, that leads to "society of humans as a crude mental process", and
perhaps such thinking would lead to better organizations of corporations and
governments than the recursive primate hierarchies we tend to use.

The final approach was genetic enhancement, and I think skepticism must here
run aground. Even if you don't believe in AI, for philosophical or practical
reasons, even if plugging network jacks into the brain doesn't seem all that
transformative, even if simple chemical interventions don't help much and
nanotech rewiring is impossible, there are still the facts that the brain is a
biological machine constructed by our genes (with, yes, environmental
influence), that IQ varies, and seems to vary largely with genes (80%
heritable -- and if that's high, it would make IQ easier to control, not
harder!) Twiddle with the right genes and you can get results, even if you
don't know exactly how it works. (Vingean result: _Tatja Grimm's World_)
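
The "easier to control" point can be made concrete with the classic breeder's equation from quantitative genetics, R = h^2 * S: the expected shift in the next generation's mean is the heritability times the selection differential. (A minimal sketch; the 15-point differential below is purely an illustrative assumption, not a claim from the text.)

```python
def selection_response(heritability: float, selection_differential: float) -> float:
    """Breeder's equation R = h^2 * S: expected shift in the offspring
    mean, given how far the selected parents sit above the population
    mean (the selection differential S)."""
    return heritability * selection_differential

# With the 80% heritability figure mentioned above, and parents averaging
# 15 IQ-like points above the mean (an illustrative assumption):
print(selection_response(0.80, 15.0))  # → 12.0
```

Higher heritability means the same selection buys a bigger response, which is the sense in which high heritability would make the trait easier, not harder, to steer.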

We're not far from it now. Find correlations of genes with IQ and personality
traits, create multiple embryos and select the most preferred gene combination
among them, and you can create quite strong selection pressures in the
direction you want; this could well be next-decade tech. (Fiction: the movie
"Gattaca", though I think they selected among 8 eggs, while I could see a
whole ovary being used.) Learn how the homeobox and development genes work
and you can try tricks such as creating bigger brains, and bigger brain-body
ratios, and perhaps controlling sizes of areas within the brain. Mice with
bigger brains have already been created (though the article didn't mention
whether they seemed any smarter.) This is a bit further off (need to learn
more, and test it on animals) but possibly not that far; there are practical
and ethical problems to worry about (physiological side effects of a large
head; necessity for C-sections; possibility of this causing speciation and
lack of breeding partners), but there's also the temptation of making 300+
IQs.
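
The strength of that embryo-selection pressure can be sketched with a toy order-statistics model (all numbers here are illustrative assumptions, not real genetics): if embryos' trait scores vary around the parental average roughly normally, picking the best of n shifts the expected result by about the expected maximum of n draws.

```python
import random

def best_of_n(n, sibling_sd=7.5, trials=20000, rng=None):
    """Toy model: average gain (in IQ-like points) from picking the
    highest-scoring embryo out of n, when scores among siblings vary
    normally with the given standard deviation. The sd of 7.5 points
    is an assumed, illustrative number."""
    rng = rng or random.Random(42)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(trials):
        total += max(rng.gauss(0.0, sibling_sd) for _ in range(n))
    return total / trials

if __name__ == "__main__":
    for n in (1, 2, 8, 50):  # 8 eggs as in Gattaca; 50 for "a whole ovary"
        print(f"best of {n:3d}: ~{best_of_n(n):+5.1f} points vs. average")
```

The returns diminish as n grows (the expected maximum of n normals grows only slowly with n), but they are real; the qualitative point doesn't depend on the assumed numbers.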

At this level, what I see the Singularity really saying is "our tools don't
only work out there, or in our bodies for simple healing. Our bodies and
brains are machines we can take apart, understand, improve, and perhaps copy;
this will happen, this will happen soon, and this will have consequences."
This is what unifies the nanotech immortality pill with the brain-computer
interface or the gene-selected supermind or the upload, namely the treatment
of the human condition in all its aspects as a controllable material process.
The future doesn't just offer more food, water, energy, and neat clothes; it
offers people who are, as a result of their parents' will if not their own,
smarter and saner and stronger and healthier and more beautiful than we are.
At the very least.

So, please, specify which Singularity it is that you're tired of!

Damien Sullivan

Nov 3, 2005, 1:21:53 PM

While writing about the varieties of Singularity and people's reaction to it,
I had a thought which perhaps might shed a photon on the phenomenon of
"Scottish SF". The thought about the Singularity is that it is ultimately
about taking materialism seriously: our bodies (and brains) are machines,
which we can understand and then manipulate; this unifies the root concept of
intelligence-enhancement with associated ideas such as uploads, backups, and
immortality of the body.

"Scottish SF" is a disputable term, but here I have in mind the authors Iain
Banks, Ken MacLeod, Charles Stross (these three being a real social group),
Alastair Reynolds, and non-Scots Wil McCarthy and Greg Egan. Frequent themes
in the group's writings are AI, uploads, nano-immortality, extensive genetic
engineering, mental control (nanites, implants, Culture drug glands) along with
the usual "post-scarcity" economics. Their writings tend to be a materialist
revel in the possibilities (even if the world as a whole is sometimes bleak).
Very transhumanist, in the sense of using our understanding (acquired as
humans) to go past our human limits.

As opposed, say, to the writings of Lois Bujold and Terry Pratchett, which are
comparatively very humanist, centered on the human condition. Bujold has a
lot of advanced biotech in her SF, and some perfectly nice people have come
out of the labs (quaddies, hermaphrodites, Taura), but I note that extensive
human manipulation is not something any of the core, Good, characters do. Not
even intensive selection of embryos, beyond minimal needs. A quaddie and a
herm will select the attributes of their children, but their biologies make
that mandatory. [Research: did they actually select more, such as
appearance?] Miles and Ekaterin select the sex and timing of their children,
and being clean of nasty recessives is taken for granted, but they didn't make
many embryos and select the smartest or most social one. The attitude is
still "take your chances and do your best with what you've got", vs. pushing
for the top. That's left for the Cetagandan haut, who are kind of creepy.
And no one, not even the haut, lives all that long -- 120 years for the
Betans, most "like us", 150 years for the quaddies and old haut. Nice, but
MacLeod had people born before the moon landing living to be 350, both as
software and as continuous human bodies.

Pratchett is fantasy, so it may seem odd to mention him, but I think it works.
Magic is usually kept off to the side, not used extensively as a technology,
and the Igors are equivalent to advanced transhuman biotech, but on-page kept
as simple but good doctors and comic relief, not allowed to resurrect Watchmen
or even deliver Vimes's baby.[1]

At any rate, what's the putative Singularity-Scottish connection? Just that
at least some of the Scots are openly socialist (with some market memes thrown
in), and subculturally heir to a strong Marxist materialism, with roots
perhaps going back to the Scottish Enlightenment. So perhaps they're more set
up to embrace and exploit the possibilities of seeing humans as material
objects and processes.

[1] Which was unfair. Dr. Lawn said Igor could only approach if he was boiled
first, since Igors look so unhygienic, but while Lawn may know about boiling,
Igors actually know why -- about germs. And if there was a real hemorrhage,
I'd prefer having an Igor on the spot...

Aaron Denney

Nov 3, 2005, 1:55:03 PM

On 2005-11-03, Damien Sullivan <pho...@ofb.net> wrote:
> A third was connecting human minds together, mentioned in _Marooned
> in Realtime_. Usually as direct neural links, but I note that if we
> take "mind as society of agents" seriously, that leads to "society of
> humans as a crude mental process", and perhaps such thinking would
> lead to better organizations of corporations and governments than the
> recursive primate hierarchies we tend to use.

I just wanted to admire the phrase "recursive primate hierarchies".

--
Aaron Denney
-><-

Andrew Plotkin

Nov 3, 2005, 2:16:25 PM

Also known as: it's monkeys all the way down.

--Z

"And Aholibamah bare Jeush, and Jaalam, and Korah: these were the borogoves..."
*
I'm still thinking about what to put in this space.

Wayne Throop

Nov 3, 2005, 2:12:38 PM

: pho...@ofb.net (Damien Sullivan)
: psionics and the Singularity don't mix well

I'm not sure I fully agree. A lot of singularity-fu is
just prosthetic psionics. All the gear in Vinge's "fast times"
setting, or the "advanced" folk (and Tunc most advanced of all)
in the peacewar/realtime setting, for examples. People have
invisible communications that are functionally indistinguishable
from telepathy, and manipulate autons and such in ways functionally
indistinguishable from telekinesis. Pham Nuwen's interaction
with the Countermeasure, and its effects on the local star, sure
smacked of spooky mystic action at a distance, responding to
thought alone.

Or, consider Gordon Dickson's "Wolfling"; a very nice scene at
the end of that, demonstrating just how outclassed the earth military
would be against a single soldier with nigh-singularity-level support.

"Adok! Remove that wall. No flying debris, no side effects,
I just want it gone."
<adok turns to face the wall, and a peculiar noise and flash occur,
as if they were so loud / bright that they'd deafen/blind you, but cut
off before you could even perceive them... the wall is gone>
"Adok! Those clouds! Remove them!"
<adok turns his attention to the sky, but otherwise does not move;
a vast whistle, cut short before blasting everybody to deafness...
the clouds are gone>
"Adok!" <points to the Alpha Centauri governor,
who gibbers in terror...>

(Though of course, Wolfling was written before
there was a Singularity... but still, a nice scene.)

Wayne Throop thr...@sheol.org http://sheol.org/throopw

Steve Simmons

Nov 3, 2005, 2:48:45 PM

Andrew <erky...@eblong.com> wrote on 11/03/05 at 19:16:

> Here, Aaron Denney <wno...@ofb.net> wrote:

>> I just wanted to admire the phrase "recursive primate hierarchies".

> Also known as: it's monkeys all the way down.

Judging from the newspapers, it's monkeys all the way up, too.
--
Realize that life is a situation comedy that will never be canceled. A
laugh track has been provided, and the reason why we are put in the
material world is to get more material.
From "Swami Beyondananda's Guidelines for Enlightenment"

Peter D. Tillman

Nov 3, 2005, 3:02:16 PM

In article <dkdkbg$8u6$1...@naig.caltech.edu>,
pho...@ofb.net (Damien Sullivan) wrote:

>
> (I'll also note that while Ken MacLeod is sometimes quoted for his "The
> Rapture for Nerds!" line in _The Cassini Division_, the nerds in the book
> turned out to be *right*. They had their Rapture, and they were conscious,
> not zombie programs.)
>

Well, conscious and lively up to when Ellen May zapped them....

Which raises a point: how likely is it that a posthuman society,
explicitly hostile to plain ol' humans, would leave themselves open to a
relatively crude, genocidal attack, such as Ellen May's?

Cheers -- Pete Tillman

Michael S. Schiffer

Nov 3, 2005, 3:41:29 PM

pho...@ofb.net (Damien Sullivan) wrote in
news:dkdkg1$8u6$2...@naig.caltech.edu:
>...

> As opposed, say, to the writings of Lois Bujold and Terry
> Pratchett, which are comparatively very humanist, centered on
> the human condition. Bujold has a lot of advanced biotech in
> her SF, and some perfectly nice people have come out of the labs
> (quaddies, hermaphrodites, Taura), but I note that extensive
> human manipulation is not something any of the core, Good,
> characters do. Not even intensive selection of embryos, beyond
> minimal needs. A quaddie and a herm will select the attributes
> of their children, but their biologies make that mandatory.
> [Research: did they actually select more, such as appearance?]
> Miles and Ekaterin select the sex and timing of their children,
> and being clean of nasty recessives is taken for granted, but
> they didn't make many embryos and select the smartest or most
> social one.

In the case of the Vorkosigan books, embryo selection seems to be
ruled out for moral reasons, at least for the major good guys.
Cordelia's ethics on the boundaries of life are as close to the
Catholic Church's as her sexual morality is distant, and this
likewise applies at minimum to anyone whose upbringing she's
influenced. There's at least evidence that this extends further
than just her-- the elaborate means by which the Escobarans rid
themselves of embryos rather than disposing of them themselves, for
example-- though of course not to places like Jackson's Whole or,
for that matter, Barrayar.

On the other hand, embryo selection probably isn't necessary with
higher-end tech in that universe-- they can do genetic and somatic
search-and-replace on adults, after all, and even failing that they
could presumably choose the gametes they wanted to use. It's true
that they don't do as much optimizing as they probably could.

The attitude is still "take your chances and do
> your best with what you've got", vs. pushing for the top.
> That's left for the Cetagandan haut, who are kind of creepy.

And who would stand as a warning for those who are bothered by
things like their grandchildren being effectively a different
species from them, except that almost nobody knows what the haut are
doing.

And
> no one, not even the haut, lives all that long -- 120 years for
> the Betans, most "like us", 150 years for the quaddies and old
> haut.

Jacksonian nobles last longer, but their method presents certain
ethical issues for most other people. And given that they still
need the brain to be working, it probably doesn't get them that much
more time than the haut. (And in practice, they probably don't even
get that long given their propensity to die by violence or
intrigue.) Mark is also working on life extension, but those
effects will probably be left (along with the genetic time bomb set
up in _Ethan of Athos_) beyond the scope of the books.

Mike

--
Michael S. Schiffer, LHN, FCS
msch...@condor.depaul.edu

Damien Sullivan

Nov 3, 2005, 3:44:57 PM

"Peter D. Tillman" <Til...@toast.net_DIESPAMMERSDIE> wrote:

>Well, conscious and lively up to when Ellen May zapped them....
>
>Which raises a point, how likely is it that a posthuman society,
>explicitly hostile to plain ol' humans, would leave themselves open to a
>relatively crude, genocidal attack, such as Ellen May's?

Not very, especially if we ask the Orion's Arm folks.

But this was an odd situation; the posthumans weren't the originals who'd made
the wormhole and such. Those had gone totally mad (likelihood of that... ssh)
and fallen into Jupiter. The newly sane descendants may have had some
resources problems -- did that society have fusion? Did the Jovians have
access to more metals? It might have been a "we've fallen and can't get up"
situation.

I suppose we could ask whether they could have hardened themselves better, or
just how big is Jupiter anyway, even to a comet swarm? OTOH, if you were them
and had survived a comet attack, would you announce it?

-xx- Damien X-)

Mike Schilling

Nov 3, 2005, 3:50:42 PM

"Andrew Plotkin" <erky...@eblong.com> wrote in message
news:dkdnm9$2br$1...@reader2.panix.com...

> Here, Aaron Denney <wno...@ofb.net> wrote:
>> On 2005-11-03, Damien Sullivan <pho...@ofb.net> wrote:
>> > A third was connecting human minds together, mentioned in _Marooned
>> > in Realtime_. Usually as direct neural links, but I note that if we
>> > take "mind as society of agents" seriously, that leads to "society of
>> > humans as a crude mental process", and perhaps such thinking would
>> > lead to better organizations of corporations and governments than the
>> > recursive primate hierarchies we tend to use.
>>
>> I just wanted to admire the phrase "recursive primate hierarchies".
>
> Also known as: it's monkeys all the way down.

Or bishops.


Damien Sullivan

Nov 3, 2005, 3:55:29 PM

thr...@sheol.org (Wayne Throop) wrote:
>: pho...@ofb.net (Damien Sullivan)
>: psionics and the Singularity don't mix well
>
>I'm not sure I fully agree. A lot of singularity-fu is
>just prosthetic psionics. All the gear in Vinge's "fast times"

True, and we have had psychic transcensions since _Childhood's End_ -- maybe the
energy beings of Star Trek, definitely Jason Ironheart and the Vorlons etc. of
Babylon-5, or the Lylmik of Julian May. There's similarity of effect, and
convergence to a godpoint of manipulating much of reality at will.

OTOH, the difference in mechanism and philosophy feels important to me,
especially from the hard SF viewpoint. The implausibility of "the technology
won't stretch that far" isn't the same as that of "that effect doesn't even
exist". The Singularity could happen -- or maybe it can't, but it's largely
an engineering issue, vs. psychic powers not existing.

If people were writing Singularity fiction using the idea as a black box
excuse to have cool magical effects, then they'd be similar. And maybe
there's stuff like that. But since I can envision how the 'magic' is supposed
to happen, it's different. Maybe the authors using psychic powers believed in
those, in which case it's similar at the authorial level, but for a modern
reader it's not. Electrode-controlled robots aren't the same as pure
telekinesis.

>indistinguishable from telekinesis. Pham Nuwen's interaction
>with the Countermeasure, and its effects on the local star sure
>smacked of spooky mystic action at a distance, responding to
>thought alone.

Yeah, but it was supposed to be appealing to Zone mechanisms somewhere. And
since the Zones themselves are pretty much magical, *shrug*.

-xx- Damien X-)

Damien Sullivan

Nov 3, 2005, 4:06:51 PM

"Michael S. Schiffer" <msch...@condor.depaul.edu> wrote:

>In the case of the Vorkosigan books, embryo selection seems to be
>ruled out for moral reasons, at least for the major good guys.
>Cordelia's ethics on the boundaries of life are as close to the
>Catholic Church's as her sexual morality is distant, and this

Indeed.

>On the other hand, embryo selection probably isn't necessary with
>higher-end tech in that universe-- they can do genetic and somatic

Point.

>could presumably choose the gametes they wanted to use. It's true
>that they don't do as much optimizing as they probably could.

Barrayar has haut genetic material as of _DI_; think they'll use it?
Barrayar, passing up the chance to create supersoldiers? Okay, this is
Gregor's Barrayar, and he's Cordelia's, and they have that mutant phobia...

>Jacksonian nobles last longer, but their method presents certain
>ethical issues for most other people. And given that they still
>need the brain to be working, it probably doesn't get them that much
>more time than the haut. (And in practice, they probably don't even
>get that long given their propensity to die by violence or

And the procedure doesn't always work. Also it depends when they do it; if
someone with sub-Betan life expectancy does it at age 80, they might end up
living to 160 in the second body.

It's not just nobles; other people use the service. Which raises a question:
how old is the oldest person in the Nexus? Hey Lois, do your worldbuilding...

These people can also clone and replace any organ; one wonders why the
body-theft is even necessary. I suppose refreshing the whole vascular or
lymph system would be harder. But it also seems like they'd have
head-in-a-jar capability, with what we've seen even from Barrayar's medicine
for Aral; is there any life support function they couldn't provide
mechanically indefinitely? Of course, the brain may conk out. Though stuff
like Illyan's memory chip seemed not that far from uploading-quality tech.

Which gets back to the humanist model; I suspect Lois doesn't want to deal
with 400 year old people or AI. After 1000 years things are better than now,
but not too much better. Enough to offer hope and inspiration, not enough to
disturb, except in selected places.

-xx- Damien X-)

Justin Fang

Nov 3, 2005, 4:08:42 PM

In article <dkdkbg$8u6$1...@naig.caltech.edu>,
Damien Sullivan <pho...@ofb.net> wrote:
[reordered]

>So, please, specify which Singularity it is that you're tired of!

I'm tired of the Singularity depicted as inevitable and imminent (unless
disaster and/or deliberate suppression interferes). E.g. the presumably
intended as nonfiction _The Singularity is Near_ by Ray Kurzweil. I may be
unfair to him; I didn't read the book since the title alone made me laugh.

I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t
are uploaded to become as gods, while the slow-adopters, the doubters, and
the technophobes are disassembled and converted to computronium. _Left
Behind_ for geeks.

>Skepticism: since the smarter beings are probably even more complex, they
>may need all their enhanced brains to accomplish a similar increase.

For instance, the corollary to Moore's Law known as Rock's Law: the price of
a chip fab doubles every 4 years.
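
The compound-growth arithmetic behind that law is easy to sketch (units and baseline are arbitrary; this illustrates the stated doubling rule, not actual fab prices):

```python
def fab_cost(years: float, base_cost: float = 1.0, doubling_period: float = 4.0) -> float:
    """Rock's Law as stated above: the price of a chip fab doubles every
    doubling_period years. base_cost is in arbitrary units (an
    illustrative assumption)."""
    return base_cost * 2 ** (years / doubling_period)

# After 40 years, ten doublings have compounded:
print(fab_cost(40))  # → 1024.0
```

So even if each chip generation delivers exponential performance gains, the capital cost of building the next fab compounds exponentially too, which is a brake on "inevitable" acceleration.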

>Whether a continued linear increase in intelligence is really less
>worldshaking (especially for the SF writer) than an exponential one is a
>question I leave to the reader.

Assuming IQ actually measures intelligence, I remind you that the Flynn
effect suggests that such a linear increase has been ongoing for some time
now. No Singularity has yet occurred.

>The final approach was genetic enhancement, and I think skepticism must here
>run aground. Even if you don't believe in AI, for philosophical or practical
>reasons, even if plugging network jacks into the brain doesn't seem all that
>transformative, even if simple chemical interventions don't help much and
>nanotech rewiring is impossible, there are still the facts that the brain is a
>biological machine constructed by our genes (with, yes, environmental
>influence), that IQ varies, and seems to vary largely with genes (80%
>hereditable -- and if that's high, it would make IQ easier to control, not
>harder!) Twiddle with the right genes and you can get results, even if you
>don't know exactly how it works. (Vingean result: _Tatja Grimm's World_)

If you don't know exactly how it works, then the results you get are likely
to not be the ones you want. I really wouldn't want to be an early adopter--
or more precisely, the parent of an early adoptee--on this one.

IANA molecular biologist, but it seems to me that you'd at least want to have
solved the protein folding problem before tackling any of this stuff.

>We're not far from it now. Find correlations of genes with IQ and personality
>traits, create multiple embryos and select the most preferred gene combination
>among them, and you can create quite strong selection pressures in the
>direction you want; this could well be next-decade tech.

Unless you discover that while the various alleles you selected for all
increase intelligence taken individually or in most combinations, putting
them all together leads to an anti-synergistic effect.

>whole ovary being used.) Learn how the homeobox and development genes work
>and you can try tricks such as creating bigger brains, and bigger brain-body
>ratios, and perhaps controlling sizes of areas within the brain. Mice with
>bigger brains have already been created (though the article didn't mention
>whether they seemed any smarter.) This is a bit further off (need to learn
>more, and test it on animals) but possibly not that far; there are practical
>and ethical problems to worry about (physiological side effects of a large
>head; necessity for C-sections; possibility of this causing speciation and
>lack of breeding partners), but there's also the temptation of making 300+
>IQs.

Is a 300+ IQ score even defined? Anyway, if intelligence boosts become
common enough, the IQ tests will simply be renormalized.

>At this level, what I see the Singularity really saying is "our tools don't
>only work out there, or in our bodies for simple healing. Our bodies and
>brains are machines we can take apart, understand, improve, and perhaps copy;
>this will happen, this will happen soon, and this will have consequences."
>This is what unifies the nanotech immortality pill with the brain-computer
>interface or the gene-selected supermind or the upload, namely the treatment
>of the human condition in all its aspects as a controllable material process.
>The future doesn't just offer more food, water, energy, and neat clothes; it
>offers people who are, as a result of their parents' will if not their own,
>smarter and saner and stronger and healthier and more beautiful than we are.
>At the very least.

Prospective parents will disagree wildly on what is "smarter and saner and
stronger and healthier and more beautiful". Personal observation suggests
that what many parents want is kids which are just like them, only more so.
I suspect that the interaction of that with designer kids results in a lot
of kids who are messed up in various ways. Plus a lot who turn out just fine,
of course.

--
Justin Fang (jus...@panix.com)

Damien Sullivan

Nov 3, 2005, 4:42:15 PM

jus...@panix.com (Justin Fang) wrote:

>>So, please, specify which Singularity it is that you're tired of!
>
>I'm tired of the Singularity depicted as inevitable and imminent (unless

Fair enough, especially since those typically are about the extreme forms.

At the other end, having cognitive science accomplish nothing of interest
seems about as implausible.

>I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t

How many books has this actually happened in? _Marooned_, kind of; _Fire Upon
the Deep_, implied; _Cassini Division_ and _Stone Canal_, partly -- the nerds
had their fun, no one got disassembled that I remember; Accelerando, maybe?

>For instance, the corollary to Moore's Law known as Rock's Law: the price of
>a chip fab doubles every 4 years.

http://firstmonday.org/issues/issue7_11/tuomi/index.html

says Moore's Law doesn't hold up in any rigorous form, though that may not
matter for our purposes.

>>Whether a continued linear increase in intelligence is really less
>>worldshaking (especially for the SF writer) than an exponential one is a
>>question I leave to the reader.
>
>Assuming IQ actually measures intelligence, then I remind you that the Flynn
>effect suggests that such a linear increase has been ongoing for some time
>now. No Singularity has yet occurred.

Good point. OTOH, such a gradual rise is ripe for the "we're living through
it" argument, where we don't notice. How would the average person from N
years back, plucked forward, cope? For that matter the average person now
doesn't understand the tech and economy they live in... and no one understands
all of it, leading to the "Singularity came and went a long time ago"
argument.
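To put numbers on the "plucked forward" thought experiment -- a trivial sketch, taking the commonly cited Flynn rate of about 3 points per decade as an assumption:

```python
# Flynn-effect arithmetic: a linear gain of ~3 IQ points per decade
# (the commonly cited rate; treat it as an assumption, not data).
def score_against_modern_norms(years_back, rate_per_decade=3.0):
    """What an average person from `years_back` years ago would score
    on a test renormed to today's mean of 100."""
    return 100.0 - rate_per_decade * (years_back / 10.0)

# An average person from a century back tests around 70 by today's
# norms -- a big gap, yet no Singularity intervened along the way.
print(score_against_modern_norms(100))  # -> 70.0
```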

>>We're not far from it now. Find correlations of genes with IQ and
>>personality traits, create multiple embryos and select the most preferred
>>gene combination among them, and you can create quite strong selection
>>pressures in the direction you want; this could well be next-decade tech.
>
>Unless you discover that while the various alleles you selected for all
>increase intelligence taken individually or in most combinations, putting
>them all together leads to an anti-synergistic effect.

Maybe, but how likely is that? And even then, people can assure themselves of
passing on the good alleles or combinations. People will gain the knowledge
to bias the results of reproduction in a way the human race has never seen
outside of plagues. Is there any reason not to expect that?
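The selection-pressure claim is just order statistics. Here's a crude simulation of "keep the best-scoring of N embryos", with an IQ-like trait and the deliberately generous assumption that embryo scores are independent draws -- sibling embryos actually share roughly half their genetic variance, so real gains would be smaller:

```python
import random
import statistics

def mean_gain_from_selection(n_embryos, trait_sd=15.0,
                             trials=20000, seed=42):
    """Average gain, in IQ-like points (SD assumed 15), from keeping
    the highest of n_embryos independent draws from N(0, trait_sd).
    Crude: ignores shared parental variance and prediction error."""
    rng = random.Random(seed)
    gains = [
        max(rng.gauss(0.0, trait_sd) for _ in range(n_embryos))
        for _ in range(trials)
    ]
    return statistics.mean(gains)

# The gain grows only logarithmically with the number of embryos:
# best-of-2 buys roughly +8 points and best-of-10 roughly +23,
# under these generous assumptions.
```

Strong pressure per generation, in other words, but not an explosion by itself.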

>Is a 300+ IQ score even defined? Anyway, if intelligence boosts become
>common enough, the IQ tests will simply be renormalized.

Probably not, and quite possibly, but that's hardly the point; the point would
be the much smarter person.

>Prospective parents will disagree wildly on what is "smarter and saner and
>stronger and healthier and more beautiful". Personal observation suggests
>that what many parents want is kids who are just like them, only more so.

True, somewhat, and if it changes a lot from generation to generation that
would keep the gene pool from lurching far in one direction. OTOH, the traits
themselves aren't that arbitrary; what would vary more is the tradeoffs to
make among them, if necessary. The other thing parents want is probably "kids
who will be successful": who's likely to attract a mate and get a good income
and be happy? (Some of those might bias against going for really high IQ.)

-xx- Damien X-)

Dr. Dave

Nov 3, 2005, 5:06:26 PM

Justin Fang wrote:
>
> I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t
> are uploaded to become as gods, while the slow-adopters, the doubters, and
> the technophobes are disassembled and converted to computronium. _Left
> Behind_ for geeks.

Curse you, Justin Fang, for stealing my line. Except I was going to
suggest "_Left Behind_ for geeks" as a blurb for Walter Jon Williams's
_Metropolitan_ series. The un-Transcended aren't disassembled, though;
they're merely sequestered.

David Tate

Wayne Throop

Nov 3, 2005, 5:36:56 PM

: jus...@panix.com (Justin Fang)
: I remind you that the Flynn

: effect suggests that such a linear increase has been ongoing for some time
: now. No Singularity has yet occurred.

We've been using these SRBs with the same O-rings for some time now,
and no catastrophe has occurred.

Joseph Michael Bay

Nov 3, 2005, 5:39:03 PM

jus...@panix.com (Justin Fang) writes:

>Is a 300+ IQ score even defined?

Sure, a 30-year-old with a 300 IQ is as smart as a 90-year-old.

Now get off my lawn! Twenty-three skidoo!


>Prospective parents will disagree wildly on what is "smarter and saner and
>stronger and healthier and more beautiful".

I would wonder if smarter and saner aren't mutually exclusive to
some degree, y'know?

--
Joe Bay Leland Stanford Junior University
www.stanford.edu/~jmbay/ Program in Cancer Biology
The white zone is for loading and unloading only. If you have to load
or unload, go to the white zone. You'll love it. It's a way of life.

Wayne Throop

Nov 3, 2005, 5:41:24 PM

::: psionics and the Singularity don't mix well

:: A lot of singularity-fu is just prosthetic psionics.

: pho...@ofb.net (Damien Sullivan)
: True, and we have had psychic transcensions since Childhood's End --


: maybe the energy beings of Star Trek, definitely Jason Ironheart and
: the Vorlons etc. of Babylon-5, or the Lylmik of Julian May. There's
: similarity of effect, and convergence to a godpoint of manipulating
: much of reality at will. OTOH, the difference in mechanism and
: philosophy feels important to me, especially from the hard SF
: viewpoint. The implausibility of "the technology won't stretch that
: far" isn't the same as that of "that effect doesn't even exist". The
: Singularity could happen -- or maybe it can't, but it's largely an
: engineering issue, vs. psychic powers not existing.

I don't disagree here, but I note the boundary can get blurry.
For example, Flux and Anchor. Or Bowman in the 2001 sequels. And,
wrt the "psychic powers not existing", that can get very blurry, too,
with hyperspace and exotic particles and fifth forces and so on and on
being used in hard-seeming SF. Doubly blurry when you consider, oh, say,
McGill Feighan's access to the "energy dimension" to do his tricks... is
that all that much different than Asimov's use of energy and dimensions
in The Gods Themselves? Or, is the use of diagrams and "psychic machines"
in Schmitz's Hub setting, or Harrison's Deathworld (first book) setting
not treating psychic powers as "a matter of engineering" (once you grant
hyperspace or psychon particles, or whatever)?

So anyways. Yes, they *aren't* *often* *mixed*, in a practical,
catalogue-of-what-is sense. But they *could* mix well, and sometimes
there's a bit of diffusion at the boundary; enough to quote a handful
of instances anyways, and in some of those, it's done pretty well.

Perhaps that's a fiddly distinction to make,
but it seems to me to be worth keeping in mind.

Wayne Throop

Nov 3, 2005, 5:53:57 PM

:: I'm tired of the Singularity depicted as inevitable and imminent
:: (unless

: pho...@ofb.net (Damien Sullivan)
: Fair enough, especially since those typically are about the extreme forms.


: At the other end, having cognitive science accomplish nothing of interest
: seems about as implausible.

Hear, hear. Both ends, I mean.

Michael S. Schiffer

Nov 3, 2005, 5:56:42 PM

pho...@ofb.net (Damien Sullivan) wrote in
news:dkdu5b$c4c$3...@naig.caltech.edu:

> "Michael S. Schiffer" <msch...@condor.depaul.edu> wrote:

>...


>>could presumably choose the gametes they wanted to use. It's
>>true that they don't do as much optimizing as they probably
>>could.

> Barrayar has haut genetic material as of _DI_; think they'll use
> it? Barrayar, passing up the chance to create supersoldiers?
> Okay, this is Gregor's Barrayar, and he's Cordelia's, and they
> have that mutant phobia...

One thing that runs thoroughly through Bujold's work is the
importance of other inputs than genes to getting something
worthwhile at the end of development. Even super-soldiers need
parents, and the record in that universe of producing and retaining
loyalty in children one is using as tools isn't great. (One of
Mark's rescuees did go back of her own accord, but the opposite
result is a recurring theme of the books, from the Quaddies to Taura
to Mark.) The haut have a solution, but Miles hasn't IIRC seen much
of Star Creche childrearing, and wouldn't want to duplicate it if he
had.

Producing super-soldiers who are actually loyal to Barrayar *and*
more useful to it than the equivalent expenditure's worth of trained
Barrayaran recruits is no small trick, even if Gregor or ImpSec
doesn't care about the moral issues or the induced mutations. Add
the *source* of these particular mutie genes, and I think Barrayar
is safe from the temptation this generation.

>...

> It's not just nobles; other people use the service. Which
> raises a question, how old is the oldest person in the Nexus?
> Hey Lois, do your worldbuilding...

> These people can also clone and replace any organ; one wonders
> why the body-theft is even necessary. I suppose refreshing the
> whole vascular or lymph system would be harder. But it also
> seems like they'd have head-in-a-jar capability, with what we've
> seen even from Barrayar's medicine for Aral;

I'd guess that there are limits to how many emergency systems can be
stacked on top of one another. (Miles' cryorevival was clearly
pushing things.) So you can live without most of your body for a
little while, or without some of it for a long while (or
indefinitely, given a replacement) but at some point you're trying
to run an unwieldy and failure-prone set of substitutes 24/7 long
enough to get the cloned organs ready, and then there's the matter
of getting them attached, up, and running without killing the
patient.

To some extent, this is an extension of our own medical technology
experience, where you can replace a heart for a while, but not yet
indefinitely. Not because it's an engineering impossibility, but
because working the bugs out takes more years than we've yet been
able to devote to the problem. I don't know if that could delay
artificial bodies with better-than-human lifespans a thousand years,
but it could push them back a century or more without straining my
credulity much, and I don't know how far the enabling technologies
are. (I'm guessing cloning organs is a lot closer, but unless you
can keep a full set on hand you still need to be able to stay alive
somehow while your next liver is grown.)

> is there any life
> support function they couldn't provide mechanically
> indefinitely? Of course, the brain may conk out. Though stuff
> like Illyan's memory chip seemed not that far from
> uploading-quality tech.

I'm not sure about that-- it had a problem-prone interface, and it's
not at all clear that it has the necessary processing power. (He
clearly had a heck of a search engine to usefully sort through all
that data, but where that is versus emulating human intelligence
isn't clear to me.) Of course, it's also decades-old tech by the
current books, but the fact that it's not standard issue on Beta
Colony suggests that the interface bugs may still not be worked out.

> Which gets back to the humanist model; I suspect Lois doesn't
> want to deal with 400 year old people or AI.

Agreed.

> After 1000 years
> things are better than now, but not too much better. Enough to
> offer hope and inspiration, not enough to disturb, except in
> selected places.

That analysis sounds about right to me. I'm not sure that the
Scottish school is likely to be any more right about the future, but
they generally are doing a more comprehensive attempt at
extrapolation.

Though they have their own arbitrary limitations-- the Culture's
deliberate choice to have 500 year lifespans and humanoid body plans
and the like for their mainstream citizens, the Zones, the
Eschaton's rule about messing with causality, etc. (Or in the Fall
Revolution books, just killing a few quillion AIs whenever there's
any danger they might transcend.)

And while I admire the speculative bravura of the "Scots", I
generally *enjoy* the "humanist" books better. Vinge is the only
one of the (near-)Singularity school whose books I consistently like
as well as admire, and he generally keeps the camera focused on
fairly human-level affairs. (I've enjoyed some books or stories by
most of the others, but for me they're fairly rarefied air to
breathe for too long.)

Garrett Wollman

Nov 3, 2005, 6:29:31 PM

In article <11310...@sheol.org>, Wayne Throop <thr...@sheol.org> wrote:

You ever going to fix your newsreader, Wayne?

>: pho...@ofb.net (Damien Sullivan)


>: Singularity could happen -- or maybe it can't, but it's largely an
>: engineering issue, vs. psychic powers not existing.
>
>I don't disagree here, but I note the boundary can get blurry.

Indeed. Consider one of Mr. Sullivan's earlier examples: Julian May's
Galactic Milieu. If you sweep aside the theology for a moment,
everything else is "explained" (in the story-internal sense, not
necessarily to the readers) by the "Dynamic Field Theory" (some sort
of super-GUT attributed to a Chinese physicist). Not only the
traditional "skiffy" stuff -- force-fields, the Inertialess Drive, the
Subspace Translator, and the time-gate itself -- but all of the "psi"
stuff is explained by the same theory. (It's an interesting halfway
point between mechanism and vitalism: some species just evolved the
ability to manipulate these fields with their minds, but how that
could be, biologically speaking, is swept under the rug, since it's
immaterial to the story.) There's no reason machines can't be built
to do the same thing, and in fact they are, although for some reason
(unknown to me) they are either very uncommon or proscribed.

>So anyways. Yes, they *aren't* *often* *mixed*, in a practical,
>catalogue-of-what-is sense. But they *could* mix well, and sometimes
>there's a bit of diffusion at the boundary; enough to quote a handful
>of instances anyways, and in some of those, it's done pretty well.

Indeed.

-GAWollman

--
Garrett A. Wollman | As the Constitution endures, persons in every
wol...@csail.mit.edu | generation can invoke its principles in their own
Opinions not those | search for greater freedom.
of MIT or CSAIL. | - A. Kennedy, Lawrence v. Texas, 539 U.S. 558 (2003)

Dan Goodman

Nov 3, 2005, 6:36:45 PM

Damien Sullivan wrote:

> Which brings us to the root of it all in John Campbell's rejection of
> Vinge's sequel to "Bookworm, Run!", with a human enhanced like the
> chimp of the original story: "You can't tell this story, and neither
> can anyone else." This has cousins in Niven's own writings on
> superintelligent beings; the basic idea is that you can't plausibly
> write about someone much smarter than yourself, let alone a society
> of them. Sure, they might still have human motivations, but can you
> portray their thoughts, their actions, the technologies or social
> arrangements they would produce? Can you understand them?

Seems to me that Vinge came along well after Olaf Stapledon's _Odd
John_ and _Star Maker_ were published.

And I believe Vinge had read Poul Anderson's _Brain Wave_.


--
Dan Goodman
Journal http://www.livejournal.com/users/dsgood/
Clutterers Anonymous unofficial community
http://www.livejournal.com/community/clutterers_anon/
Decluttering http://decluttering.blogspot.com
Predictions and Politics http://dsgood.blogspot.com
All political parties die at last of swallowing their own lies.
John Arbuthnot (1667-1735), Scottish writer, physician.

Wayne Throop

Nov 3, 2005, 6:35:59 PM

: wol...@khavrinen.csail.mit.edu (Garrett Wollman)
: You ever going to fix your newsreader, Wayne?

< looks at headers, looks at upthread headers >

Um. Which bug?

James Nicoll

Nov 3, 2005, 6:47:24 PM

In article <dkdu8q$9a2$1...@panix3.panix.com>,

Justin Fang <jus...@panix.com> wrote:
>In article <dkdkbg$8u6$1...@naig.caltech.edu>,
>Damien Sullivan <pho...@ofb.net> wrote:
>[reordered]
>>So, please, specify which Singularity it is that you're tired of!
>
>I'm tired of the Singularity depicted as inevitable and imminent (unless
>disaster and/or deliberate suppression interferes). E.g. the presumably
>intended as nonfiction _The Singularity is Near_ by Ray Kurzweil. I may be
>unfair to him; I didn't read the book since the title alone made me laugh.
>
>I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t
>are uploaded to become as gods, while the slow-adopters, the doubters, and
>the technophobes are disassembled and converted to computronium. _Left
>Behind_ for geeks.

What he said.

Except I think we've passed through a couple of small-s singularities
and might be in the middle of one (urbanization).

And I hate Batman's Cave Singularity stories, where everyone
runs around looking for the singularity.
--
http://www.cic.gc.ca/english/immigrate/
http://www.livejournal.com/users/james_nicoll

Garrett Wollman

Nov 3, 2005, 6:54:21 PM

In article <11310...@sheol.org>, Wayne Throop <thr...@sheol.org> wrote:
>: wol...@khavrinen.csail.mit.edu (Garrett Wollman)
>: You ever going to fix your newsreader, Wayne?
>
>< looks at headers, looks at upthread headers >
>
>Um. Which bug?

The one where it keeps on putting the invalid distribution "world"
into every article you post.

Damien Sullivan

Nov 3, 2005, 6:54:36 PM

jdni...@panix.com (James Nicoll) wrote:

> Except I think we've passed through a couple of small-s singularities
>and might be in the middle of one (urbanization).
>
> And I hate Batman's Cave Singularity stories, where everyone
>runs around looking for the singularity.

Examples?

-xx- Damien X-)

James Nicoll

Nov 3, 2005, 7:03:33 PM

In article <dke7vs$g1i$1...@naig.caltech.edu>,
CUSP by Metzger. Everyone knows that the Singularity is coming,
the same way they know Jesus is coming, and everyone (except the cannon-
fodder who make up 99.9999999% of humanity) wants to be the ones who
control it.

And it's universal: part of the plot involves aliens who also
want to control singularities.

Wayne Throop

Nov 3, 2005, 7:08:29 PM

::: You ever going to fix your newsreader, Wayne?
:: Which bug?
: The one where it keeps on putting the invalid distribution "world"
: into every article you post.

Oh, that bug. That was fixed as of Fri Nov 4 00:07:40 UTC 2005.

Damien Sullivan

Nov 3, 2005, 7:15:14 PM

I'd forgotten about the field theories in Julian May. Yeah, attempt at
sounding scientific. Where's the energy for the big creative effects coming
from, though? And she's deliberately vague about whether some creative
shapechanging is illusion or real.

thr...@sheol.org (Wayne Throop) wrote:

>: far" isn't the same as that of "that effect doesn't even exist". The
>: Singularity could happen -- or maybe it can't, but it's largely an
>: engineering issue, vs. psychic powers not existing.
>
>I don't disagree here, but I note the boundary can get blurry.
>For example, Flux and Anchor. Or Bowman in the 2001 sequels. And,

Haven't read Flux. Bowman... off to the side, but I'd say closer to the
technology end, perhaps because I've read the book where Clarke talks about
the monolith's makers and how they'd transferred their minds into ships before
learning how to be stable vortices of energy or something. And in 2010 (book)
Bowman sets off a nuke to recharge his batteries. So it's more like really
advanced engineering, with a nod to energy conservation (and someone I just
realized was too lazy to go play in the Sun.)

>wrt the "psychic powers not existing", that can get very blurry, too,
>with hyperspace and exotic particles and fifth forces and so on and on
>being used in hard-seeming SF. Doubly blurry when you consider, oh, say,
>McGill Feighan's access to the "energy dimension" to do his tricks... is
>that all that much different than Asimov's use of energy and dimensions

Haven't read Feighan, don't remember much of Gods Themselves, do remember the
Foundation series 'psychic' powers. Explicitly described as manipulating
electromagnetic fields very very subtly. Somewhat less plausible than
Drexlerian nanotech but again feeling different to me than classic psychic
powers. I'll grant it's blurry though, especially when Gaia comes in. But
hey, Brain Eater.

Also, psychic transcendence tends to be some big power ramp-up, not the
intelligence->technology cycle of Vinge.

I think you've simply read more of this cross-boundary stuff. For me psychic
powers and technological imitations thereof are quite distinct. Though I'll
mention one fuzzy piece you haven't: Lord of Light. "Mutations" leading to
various electromagnetic manipulation powers, and the aliens who'd pulled the
same trick as Bowman's masters, but all seeming pretty foofy.

Hmm. Gradation:
- emergent psychic powers proper: X-Men, Liaden
- emergent psychic powers with vague explanation: Zelazny, Julian May, with
  some limited mechanical access to the same forces or powers
- psychic (direct brain effect) powers with a technological cause: Asimov's
  telepathy, where there's no hint that this is a justification of psychic
  claims today
- things that look psychic but are backed up by a whole toolchain, where the
  handwaving is engineering more than scientific: Singularity, implant
  telepathy, robotic "telekinesis".

I still see a much bigger jump to the last one (ignoring the Singularities
where everyone just vanishes, which I think tend to be plot devices anyway).
Enough that I never made a connection before, and am not particularly swayed
by it now. Similar effects up to a point, but as Banks said the point of
technology is wish fulfillment. Hard SF tries to show how the wishes get
fulfilled. "Dynamic field theory" ain't it.

-xx- Damien X-)

Damien Sullivan

Nov 3, 2005, 7:19:45 PM

jdni...@panix.com (James Nicoll) wrote:

>>> And I hate Batman's Cave Singularity stories, where everyone
>>>runs around looking for the singularity.
>>
>>Examples?
>>
> CUSP by Metzger. Everyone knows that the Singularity is coming,

Haven't even heard of it... oh, came out this year. Hey, my public library
has it. You don't make it sound that appealing even to a Singularity fan.
Oh, but now it reminds me of extropian debates where there was concern about
who would have the first AI or nanotech general assembler and thus Eat The
World. Something like that?

> And it's universal: part of the plot involves aliens who also
>want to control singularities.

Well, that seems to make sense given the premise. If all civilizations turn
into coherent entities, then they're powerful resources to control.

-xx- Damien X-)

Wayne Throop

Nov 3, 2005, 7:20:49 PM

: pho...@ofb.net (Damien Sullivan)
: emergent psychic powers proper: X-Men, Liaden

Hm. Liaden is an interesting example. Before I read Crystal Soldier,
I had no idea that the Liaden universe jnf cbfg-fvathynevgl.
V'z abg fher xabjvat guvf unf vzcebirq gur rkcrevrapr sbe zr.


"A soldier's <mumble> is more than might,
little Jela was a demon to fight."

--- half-remembered Yxtrang saying, from (I think) Plan B
where "mumble" is something like "strength" or "job"
or "task" or something, and I don't have the book
to hand and my google-fu has failed me
O the embarrassment

Wayne Throop

Nov 3, 2005, 7:31:22 PM

: pho...@ofb.net (Damien Sullivan)
: Oh, but now it reminds me of extropian debates where there was concern about

: who would have the first AI or nanotech general assembler and thus Eat The
: World. Something like that?

In turn, this reminds me to ask if anybody here has read
"The Last Stand of the DNA Cowboys" by Mick Farren,
and if so, what the *hell* was it about?

( In it, the world got et before the start of the story,
which was an interesting trick; the mental image I get from the
description of strange anomalies appearing and inexorably growing
at tremendous speed to consume the whole world has stuck with me. )

I suppose I really oughta have read the books before it in the series...
it's the fourth book in a trilogy. Though from what I can gather, the
world got et before the first entry in the series.

Damien Sullivan

Nov 3, 2005, 7:47:58 PM

"Michael S. Schiffer" <msch...@condor.depaul.edu> wrote:

>One thing that runs thoroughly through Bujold's work is the
>importance of other inputs than genes to getting something

True.

>worthwhile at the end of development. Even super-soldiers need
>parents, and the record in that universe of producing and retaining
>loyalty

"Azi." But yeah.

>Mark's rescuees did go back of her own accord, but the opposite
>result is a recurring theme of the books, from the Quaddies to Taura
>to Mark.) The haut have a solution, but Miles hasn't IIRC seen much

Even the ba and haut have shown bugs. The other examples though, seem to
exhibit more the perils of Blatant Stupidity. The quaddie project was pretty
effective until a combination of Leo blowing the whistle and the company
deciding to commit mass murder. Taura wasn't given anything like a real
upbringing, and Mark was raised by a sadistic madman.

Barrayar's culture, on the other hand, is very good at instilling fanatic loyalty
in its citizens. The Counts express some free will but most others are mental
thralls of the Emperor and his Auditors. Raise your super-soldiers like
Glorious Soldiers of the Empire -- like everyone else but more so -- rather
than meat-tools, and I think you'd have better results.

Admittedly doing that might be right where Barrayar's culture would run
aground. "Mutie!"

>Producing super-soldiers who are actually loyal to Barrayar *and*
>more useful to it than the equivalent expenditure's worth of trained
>Barrayaran recruits is no small trick, even if Gregor or ImpSec

Point; war is more about technology than strength (Taura's flaw), though the
brains behind the tech are important, and not necessarily the strong point of
the average underfed Barrayaran recruit.

>> whole vascular or lymph system would be harder. But it also
>> seems like they'd have head-in-a-jar capability, with what we've
>> seen even from Barrayar's medicine for Aral;
>
>I'd guess that there are limits to how many emergency systems can be
>stacked on top of one another. (Miles' cryorevival was clearly

Maybe... of course revival from a hasty freeze isn't the same as a prepared
takeover of bodily functions.

>to run an unwieldy and failure-prone set of substitutes 24/7 long
>enough to get the cloned organs ready, and then there's the matter
>of getting them attached, up, and running without killing the
>patient.

Once they had Aral stable in the capital and waiting for his new heart, was
there any sense of risk or time pressure?

>> indefinitely? Of course, the brain may conk out. Though stuff
>> like Illyan's memory chip seemed not that far from
>> uploading-quality tech.
>
>I'm not sure about that-- it had a problem-prone interface, and it's
>not at all clear that it has the necessary processing power. (He

Okay, not that close either. And I didn't mean that the chip itself was
upload capable, just that the quality was close to what you'd need. I guess
picking up new memories is different than reading existing synapses, though.

>And while I admire the speculative bravura of the "Scots", I
>generally *enjoy* the "humanist" books better. Vinge is the only

I do re-read Bujold and Pratchett much more often. They also re-use lots of
characters more, so I'm not sure what's at work: humanist focus, or having a
familiar world and characters to curl up with and re-read? Or just better
writing? Or more choices? Both have been rather prolific authors.

Something to watch out for.

Okay, let me count: There are 34 Discworld novels alone, not that I have them
all, though I also have Bromeliad and the Carpet People. 14 Vorkosiverse
novels, plus 2 Chalion novels I own. Ken has 8 novels I've read, I've heard
of a 9th. I have 8 Banks novels. 5 Vinge novels, two short-story
collections. I haven't actually bought any Stross, due to poverty. Don't own
Reynolds. 4 Egan novels, I think?

So Bujold has more novels than any two "Scots", at least on my shelves.
Pratchett potentially outweighs the whole set.

-xx- Damien X-)

Mark Atwood

Nov 3, 2005, 7:49:01 PM

jus...@panix.com (Justin Fang) writes:
>
> I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t
> are uploaded to become as gods, while the slow-adopters, the doubters, and
> the technophobes are disassembled and converted to computronium. _Left
> Behind_ for geeks.

Have there actually been any such written? I can't think of any off the top
of my head.

_Accelerando_ comes close, but in the threads of lives followed by the
novel, the l33t are the ones who bear the greatest risk of being made
obsolescent, while it's the IP lawyers, corporate executives, and
bankers who start converting everyone else into computronium.


--
Mark Atwood When you do things right, people won't be sure
m...@mark.atwood.name you've done anything at all.
http://mark.atwood.name/ http://www.livejournal.com/users/fallenpegasus

Damien Sullivan

Nov 3, 2005, 7:49:37 PM

thr...@sheol.org (Wayne Throop) wrote:

>Hm. Liaden is an interesting example. Before I read Crystal Soldier,

So my library has _Cusp_ and _Thud!_ but not _Crystal Soldier_. I need to
whine and see if that works. Not that I'm complaining about it having
_Thud!_.

-xx- Damien X-)

Damien Sullivan

Nov 3, 2005, 7:54:42 PM

thr...@sheol.org (Wayne Throop) wrote:
>: pho...@ofb.net (Damien Sullivan)
>: Oh, but now it reminds me of extropian debates where there was concern about
>: who would have the first AI or nanotech general assembler and thus Eat The
>: World. Something like that?
>
>In turn, this reminds me to ask if anybody here has read
>"The Last Stand of the DNA Cowboys" by Mick Farren,
>and if so, what the *hell* was it about?

Wow, what a segue. But yes! It was one of my childhood library books. I
grouped it with _Cowboy Feng's Space Bar and Grill_ since both had cowboys in
the title and involved the End of The World, or what felt like it. And were
weird.

As for what it was about, I don't know. I figured some aliens hit Earth with
something nasty and we got to see the death throes. It did inspire some
Hodgell fanfic I haven't actually written down, about Jame's missing two
years, spent visiting various bubble Kensholds in Perimal Darkling and
inadvertently popping them one by one. "Dark Kenshold", I'd call it.

>I suppose I really oughta have read the books before it in the series...
>it's the fourth book in a trilogy. Though from what I can gather, the
>world got et before the first entry in the series.

I've asked the same "has anyone read it?" question. I guess I got answers
about there being a series and other books. It was standalone weirdness to
me.

-xx- Damien X-)

Gene Ward Smith

Nov 3, 2005, 8:06:32 PM


Damien Sullivan wrote:

> Dropping further, we get closer to what I think of as Vinge's original ideas.
> One form is that when we learn to increase our intelligence, those smarter
> results will be able to increase their own even faster, in an exponential
> growth for at least a while, with accompanying exponential growth in other
> technologies.

This is the form of the Singularity I believed in as a teenager in the
Sixties, before Singularity fiction became so popular. But by the time
it did become popular, I'd given up thinking we could chart the course
of the future, and became something of a Singularity sceptic. But I
think the real reason for the cry of "no Singularity" is that people
want settings and characters they can relate to; hence, the increasing
popularity of fantasy vs science fiction, and the very non-post-human
worlds which populate much of even far future sf.

> Which brings us to the root of it all in John Campbell's rejection of Vinge's
> sequel to "Bookworm, Run!", with a human enhanced like the chimp of the
> original story: "You can't tell this story, and neither can anyone else."
> This has cousins in Niven's own writings on superintelligent beings; the basic
> idea is that you can't plausibly write about someone much smarter than
> yourself, let alone a society of them.

I've never seen it done.

> We're not far from it now. Find correlations of genes with IQ and personality
> traits, create multiple embryos and select the most preferred gene combination
> among them, and you can create quite strong selection pressures in the
> direction you want; this could well be next-decade tech.

The process of finding the high-IQ genes has already begun.
Manufacturing new high IQ genes will prove far more difficult, I
suspect.

Wayne Throop

Nov 3, 2005, 8:11:39 PM

: "Gene Ward Smith" <genewa...@gmail.com>
: The process of finding the high-IQ genes has already begun.

: Manufacturing new high IQ genes will prove far more difficult,
: I suspect.

There are two depressing aspects to this corner of the future.

First, while it's a possibility for a "soft-takeoff" singularity, which
one might think would be good and unthreatening and all, it also (given
the involvement with various neuropathologies in the early part of the
research) might point towards the use of Focus. Ew, ik.

Second, this path towards singularity doesn't do *me* much good.


People are going to stop thinking about themselves and start
thinking about me, Al Franken. That's right. I believe we're
entering the Al Franken decade. Oh, for me, Al Franken, the
eighties will be pretty much the same as the seventies. But for
you, when you see a news report you'll be thinking "I wonder what Al
Franken thinks about this?" "I wonder how this inflation thing is
hurting Al Franken?" And you women will be thinking "What can I wear
that will please Al Franken?" or "What can I not wear?" A lot of you
are probably thinking "Why Al Franken?" Well, because I thought of
it, and I'm on TV.

--- SNL skit

David Johnston

Nov 3, 2005, 8:53:59 PM
On Thu, 3 Nov 2005 18:19:28 +0000 (UTC), pho...@ofb.net (Damien
Sullivan) wrote:

>At this level, what I see the Singularity really saying is "our tools don't
>only work out there, or in our bodies for simple healing. Our bodies and
>brains are machines we can take apart, understand, improve, and perhaps copy;
>this will happen, this will happen soon, and this will have consequences."
>This is what unifies the nanotech immortality pill with the brain-computer
>interface or the gene-selected supermind or the upload, namely the treatment
>of the human condition in all its aspects as a controllable material process.
>The future doesn't just offer more food, water, energy, and neat clothes; it
>offers people who are, as a result of their parents' will if not their own,
>smarter and saner and stronger and healthier and more beautiful than we are.

Yeah. That's one of the kinds of Singularities that I balk at. I
don't really believe that you can engineer everyone to be dramatically
all round smarter, saner, stronger, healthier and more beautiful with
no downside. There's always some kind of downside. That's a basic
axiom of my world-view. (And I particularly don't believe you can
simultaneously engineer both superhuman intelligence and greater sanity
than a typical person. Intelligence is the root of neurosis, after
all.)

Wayne Throop

Nov 3, 2005, 9:01:46 PM
: rgo...@block.net (David Johnston)
: I don't really believe that you can engineer everyone to be
: dramatically all round smarter, saner, stronger, healthier and more
: beautiful with no downside. There's always some kind of downside.
: That's a basic axiom of my world-view. (And I particularly don't
: believe you can simultaneous engineer both superhuman intelligence and
: greater sanity than a typical person. Intelligence is the root of
: neurosis, after all.)

Are humans less sane than the other mammals?

Peter Meilinger

Nov 3, 2005, 9:14:09 PM
Wayne Throop <thr...@sheol.org> wrote:

>Are humans less sane than the other mammals?

Two words, my friend - "Cop Rock."

Pete

Wayne Throop

Nov 3, 2005, 9:25:55 PM
:: Are humans less sane than the other mammals?

: Two words, my friend - "Cop Rock."

I think that's the answer to the question
"are humans more inane than the other mammals?".


http://www.webster.edu/~woolflm/janegoodall.html
1975: Cannibalism - Passion killed and ate Gilka's infant, and shared
the meat with her daughter, Pom. Together they continued eating
infants for two years.

David Johnston

Nov 3, 2005, 9:34:48 PM
On Fri, 04 Nov 2005 02:01:46 GMT, thr...@sheol.org (Wayne Throop)
wrote:

>
>Are humans less sane than the other mammals?

Yes.

Wayne Throop

Nov 3, 2005, 9:37:10 PM
:: Are humans less sane than the other mammals?

: Yes.

It is a widespread sentiment. And so I am skeptical.


"Man is the only animal that blushes. Or needs to."

--- Mark Twain

Justin Fang

Nov 3, 2005, 10:25:14 PM
In article <dke07n$c4c$4...@naig.caltech.edu>,
Damien Sullivan <pho...@ofb.net> wrote:

>jus...@panix.com (Justin Fang) wrote:
>>>So, please, specify which Singularity it is that you're tired of!

>>I'm tired of the Singularity depicted as inevitable and imminent (unless

>Fair enough, especially since those typically are about the extreme forms.
>
>At the other end, having cognitive science accomplish nothing of interest
>seems about as implausible.

Sure, but I don't really see the dichotomy here. There are plenty of things
more advanced cognitive science could plausibly accomplish other than
exponential growth to brains the size of planets.

>>I'm tired of the aptly-named Rapture of the Nerds scenarios, where the l33t

>How many books has this actually happened in? _Marooned_, kind of; _Fire Upon
>the Deep_, implied; _Cassini Division_ and _Stone Canal_, partly -- the nerds
>had their fun, no one got disassembled that I remember; Accelerando, maybe?

_Accelerando_ had the solar system turned into computronium, but implied that
this was ultimately a trap. Zindell's _A Requiem for Homo Sapiens_ series
had an antagonist who was planning a galactic-scale version.

But you're right, the extreme form doesn't tend to show up much in novels, or
at least not the ones I read (which could be selection effect, of course).
I'm fairly sure it's turned up in short stories, though I can't think
of any off the top of my head.

Now that I think about it, ISTM to be something I encounter more commonly on
the net. You may remember a recent discussion we had about it here, in which
some posters appeared to be convinced that something like that was possible,
even likely, within the next few decades.

>>For instance, the corollary to Moore's Law known as Rock's Law: the price of
>>a chip fab doubles every 4 years.

>http://firstmonday.org/issues/issue7_11/tuomi/index.html
>
>says Moore's Law doesn't hold up in any rigorous form, though that may not
>matter for our purposes.

Oh, I think it does matter, by showing once again how silly it is to predict
that any exponential growth curve will continue indefinitely in the real
world. (E.g. James Nicoll's Textile Singularity.)
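The objection to indefinite extrapolation is easy to make quantitative: an exponential and a ceiling-limited (logistic) curve with the same early doubling time are nearly indistinguishable at first, then diverge without bound. The constants below are arbitrary, chosen purely for illustration:

```python
import math

def naive_extrapolation(t, x0=1.0, doubling=1.5):
    """Doubles every `doubling` time units, forever."""
    return x0 * 2.0 ** (t / doubling)

def ceiling_limited(t, x0=1.0, doubling=1.5, ceiling=1000.0):
    """Logistic curve: same early doubling time, but a hard ceiling."""
    r = math.log(2.0) / doubling
    return ceiling / (1.0 + (ceiling / x0 - 1.0) * math.exp(-r * t))

if __name__ == "__main__":
    # The two curves track each other early on, then the exponential
    # runs off to infinity while the logistic flattens at its ceiling.
    for t in (0, 10, 20, 40):
        print(t, naive_extrapolation(t), ceiling_limited(t))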

>>>Whether a continued linear increase in intelligence is really less
>>>worldshaking (especially for the SF writer) than an exponential one is a
>>>question I leave to the reader.

>>Assuming IQ actually measures intelligence, then I remind you that the Flynn
>>effect suggests that such a linear increase has been ongoing for some time
>>now. No Singularity has yet occurred.

>Good point. OTOH, such a gradual rise is ripe for the "we're living through
>it" argument, where we don't notice. How would the average person from N
>years back, plucked forward, cope?

Why not ask an average 80-year-old? Or even Jack Williamson: born 1908.
Migrated with his family to New Mexico in a covered wagon as a child. SF
writing career stretching from the 1930s to at least April 2005.

Consider the Hmong refugees who went from roughly stone-age villages to
urban America. Considerable difficulties adjusting, yes, but hardly total
incomprehension.

Less extreme examples happen every day, as third-world peasants stream into
cities by the millions. (As James Nicoll points out elsethread.)

>For that matter the average person now doesn't understand the tech and
>economy they live in... and no one understands all of it, leading to the
>"Singularity came and went a long time ago" argument.

That may be a singularity, or a bunch of them, but not The Singularity as
Vinge described it.

>>>We're not far from it now. Find correlations of genes with IQ and
>>>personality traits, create multiple embryos and select the most preferred
>>>gene combination among them, and you can create quite strong selection
>>>pressures in the direction you want; this could well be next-decade tech.

>>Unless you discover that while the various alleles you selected for all
>>increase intelligence taken individually or in most combinations, putting
>>them all together leads to an anti-synergistic effect.

>Maybe, but how likely is that?

I'm guessing quite likely unless you fully understand what you're doing. The
human genome is a horrendous mess of spaghetti code.

Then there's this speculation, about higher rates of autism resulting from
intermarriage of engineer-types who already have mild tendencies towards it:
http://www.wired.com/wired/archive/9.12/aspergers_pr.html

>And even then, people can assure themselves of
>passing on the good alleles or combinations. People will gain the knowledge
>to bias the results of reproduction in a way the human race has never seen
>outside of plagues. Is there any reason not to expect that?

Yes. Politics.

>>Is a 300+ IQ score even defined? Anyway, if intelligence boosts become
>>common enough, the IQ tests will simply be renormalized.

>Probably not, and quite possibly, but that's hardly the point; the point would
>be the much smarter person.

Okay. Although even if you do solve the genetics of the problem, you still
can't neglect the environmental component--see Ariane Emory.

>>Prospective parents will disagree wildly on what is "smarter and saner and
>>stronger and healthier and more beautiful". Personal observation suggests
>>that what many parents want is kids which are just like them, only more so.

>True, somewhat, and if it changes a lot from generation to generation that
>would keep the gene pool from lurching far in one direction. OTOH, the traits
>themselves aren't that arbitrary; what would vary more is the tradeoffs to
>make among them, if necessary. The other thing parents want is probably "kids
>who will be successful"; who's likely to attract a mate and get a good income
>and be happy?

The healthy, good-looking, reasonably smart person with very good social
skills? What people, culturally, are presented as the models of success?

>(Some of those might bias against going for really high IQ.)

Also note that many parents may be uncomfortable at the thought of a child
much smarter than they are.

--
Justin Fang (jus...@panix.com)

Howard Brazee

Nov 3, 2005, 10:36:03 PM
On Fri, 04 Nov 2005 01:53:59 GMT, rgo...@block.net (David Johnston)
wrote:

>
>Yeah. That's one of the kinds of Singularities that I balk at. I
>don't really believe that you can engineer everyone to be dramatically
>all round smarter, saner, stronger, healthier and more beautiful with
>no downside. There's always some kind of downside. That's a basic
>axiom of my world-view.

Heaven is boring.

>(And I particularly don't believe you can
>simultaneous engineer both superhuman intelligence and greater sanity
>than a typical person.

Why not? It's engineering two different things.

> Intelligence is the root of neurosis, after all.)

I don't see a correlation. Some dumb animals are sane; others are
insane.

Howard Brazee

Nov 3, 2005, 10:37:03 PM
On 4 Nov 2005 02:14:09 GMT, Peter Meilinger <mell...@bu.edu> wrote:

>>Are humans less sane than the other mammals?
>
>Two words, my friend - "Cop Rock."

I suppose that answer probably meant something to you.

Howard Brazee

Nov 3, 2005, 10:37:54 PM
On Fri, 04 Nov 2005 02:34:48 GMT, rgo...@block.net (David Johnston)
wrote:

>>Are humans less sane than the other mammals?
>
>Yes.

Have you observed other mammals closely?

Howard Brazee

Nov 3, 2005, 10:38:46 PM
On Fri, 04 Nov 2005 02:37:10 GMT, thr...@sheol.org (Wayne Throop)
wrote:

>It is a widespread sentiment. And so I am skeptical.
>
> "Man is the only animal that blushes. Or needs to."
>
> --- Mark Twain

I don't believe Mark Twain. I've also read claims that man is the
only animal that dreams, but that is obviously untrue.

Peter Meilinger

Nov 3, 2005, 10:52:38 PM

"Cop Rock" was a TV show back in... Good God, it was all the way back
in 1990.
It was a cop drama AND a musical. Possibly also a floor wax
and dessert topping, but it didn't last long enough to
find out. Find me a non-human mammal that thinks that's
even close to a good idea, and we can talk. For the record,
Steven Bochco does not count as a non-human mammal for
the sake of this argument.

Pete

Garrett Wollman

Nov 3, 2005, 11:18:25 PM
In article <llllm191fh0m3vntv...@4ax.com>,
Howard Brazee <how...@brazee.net> wrote:
>On Fri, 04 Nov 2005 01:53:59 GMT, rgo...@block.net (David Johnston)
>wrote:

>>(And I particularly don't believe you can
>>simultaneous engineer both superhuman intelligence and greater sanity
>>than a typical person.
>
>Why not? It's engineering two different things.

Biological Systems Don't Work Like That.

Mother Nature is quite the skinflint when it comes to mechanism. Even
if you stipulate (as I do not) that "intelligence" is a *thing* that
can be unambiguously measured and optimized by careful artificial
selection,[1] it's unlikely that those genes serve only one function.
(I think AI or IA is a much more likely alternative, and even there I
see grave dangers that could put the kibosh on their ultimate
development if not society as a whole.[2])

-GAWollman

[1] Add that one to the "Fallacy of the Reified Mean" file.

[2] #include <stddisclaimer.h>
(While I work for people who study this subject scientifically, I do
not, nor do I speak for them.)

--
Garrett A. Wollman | As the Constitution endures, persons in every
wol...@csail.mit.edu | generation can invoke its principles in their own
Opinions not those | search for greater freedom.
of MIT or CSAIL. | - A. Kennedy, Lawrence v. Texas, 539 U.S. 558 (2003)

David Johnston

Nov 3, 2005, 11:22:47 PM
On Fri, 04 Nov 2005 03:37:54 GMT, Howard Brazee <how...@brazee.net>
wrote:

>Have you observed other mammals closely?

Yes. I'm not saying they can't go crazy, because they can. But the
more intelligent a mammal is, the more likely it is to go crazy.

James Nicoll

Nov 3, 2005, 11:27:04 PM
In article <436a5a0b...@news.telusplanet.net>,
David Johnston <rgo...@block.net> wrote:

Intelligence sucks up a third of our metabolism and our
gigantic noggins already are a little smaller than the peak human
cranial size (Except right now I can't recall if it was Cro Mag
or Neandertal who held the record and HSN isn't a direct ancestor
of ours).

Of course, bird brains may have some tricks to show us.


--
http://www.cic.gc.ca/english/immigrate/
http://www.livejournal.com/users/james_nicoll

Wayne Throop

Nov 3, 2005, 11:32:18 PM
: rgo...@block.net (David Johnston)
: I'm not saying they can't go crazy, because they can. But the
: more intelligent a mammal is, the more likely it is to go crazy.

Are lemmings intelligent?
Are overcrowded rats intelligent?

Dr. Dave

Nov 3, 2005, 11:37:25 PM
Wayne Throop wrote:
> : pho...@ofb.net (Damien Sullivan)
> : emergent psychic powers proper: X-Men, Liaden
>
> Hm. Liaden is an interesting example. Before I read Crystal Soldier,
> I had no idea that the Liaden universe [spoiler].

Well, sort of.

I view it as an interesting retcon both of what's different about the
Liaden Universe, and what's missing that sorta oughta be there. It
works for me as backstory, but the actual telling of it wasn't as
entertaining as the other books (IMHO). Of course, since I think the
other books are *vastly* entertaining, that's not too much of a pan.

David Tate

Dr. Dave

Nov 3, 2005, 11:43:01 PM
Howard Brazee wrote:
>
> Heaven is boring.

That's a meme I have never understood. *My* idea of heaven certainly
isn't.

David Tate

James Nicoll

Nov 3, 2005, 11:44:37 PM
In article <436a5a0b...@news.telusplanet.net>,
David Johnston <rgo...@block.net> wrote:
>On Thu, 3 Nov 2005 18:19:28 +0000 (UTC), pho...@ofb.net (Damien
>Sullivan) wrote:
>
>>At this level, what I see the Singularity really saying is "our tools don't
>>only work out there, or in our bodies for simple healing. Our bodies and
>>brains are machines we can take apart, understand, improve, and perhaps copy;
>>this will happen, this will happen soon, and this will have consequences."
>>This is what unifies the nanotech immortality pill with the brain-computer
>>interface or the gene-selected supermind or the upload, namely the treatment
>>of the human condition in all its aspects as a controllable material process.
>>The future doesn't just offer more food, water, energy, and neat clothes; it
>>offers people who are, as a result of their parents' will if not their own,
>>smarter and saner and stronger and healthier and more beautiful than we are.
>
>Yeah. That's one of the kinds of Singularities that I balk at. I
>don't really believe that you can engineer everyone to be dramatically
>all round smarter, saner, stronger, healthier and more beautiful with
>no downside. There's always some kind of downside.

Now, what was that George Turner book about enhanced-intelligence
kids? There were three groups, as I recall, and all of them tweaked
in different ways. One group killed themselves. Of course, since they
were the smartest people on a planet careening towards a disaster
they could do nothing about, they could appreciate their situation
more finely than regular humans.

Wayne Throop

Nov 3, 2005, 11:40:59 PM
:: Hm. Liaden is an interesting example. Before I read Crystal
:: Soldier, I had no idea that the Liaden universe [spoiler].

: "Dr. Dave" <dt...@ida.org>
: Well, sort of.


: I view it as an interesting retcon both of what's different about the
: Liaden Universe, and what's missing that sorta oughta be there. It
: works for me as backstory, but the actual telling of it wasn't as
: entertaining as the other books (IMHO). Of course, since I think the
: other books are *vastly* entertaining, that's not too much of a pan.

My reaction was somewhat similar. The issue of [spoiler] is that
the whole dramliza thing went in a completely different direction
than I expected. I think it's stated upthread that
gur qenzvm ner na rknzcyr bs qrirybcvat cflpuvp cbjref,
ohg vg gheaf bhg gung gurl ner tratvarrerq.

The whole background seems... at odds with the notion of Liaden and
Terran communities as presented in earlier-written stories. That seems
to imply a much stronger continuity to *us* for the one faction,
and Crystal Soldier just doesn't feel like that sort of setting,
or an intermediate towards that sort of setting.