You can tell right away in this book that something's missing. I
haven't read the earlier books in this series (need to be ordered
from the UK) and it shows. Characters pop up that have the obvious
"I was a major character in an earlier book" sheen -- they act like
they should be developed but the development was somewhere else.
Overall, this is a good book by a writer who will some day be great.
The ideas are fascinating, and the universe is deep. It's one of
the earliest and most interesting treatments of the concept that the
infinite wealth of nanotechnology can bring about the pure "withering away
of the state" of anarcho-communism. Macleod's Earth is such an
anarcho-communist society, based on a school of philosophy called
"the true knowledge" which marries socialism and anarchism in a way
quite different from what Marx envisioned. Most people walk around
armed, and there are enclaves of people using other systems (called
non-cooperators by the socialists).
The book's main flaw is that it attempts way too much for 240 pages.
It tries to get into depth on several different societies:
a) The socialist Earth society
 b) The militarist/socialist "Cassini Division" society. The
     Division is a military unit of those who have volunteered
     to protect the solar system from the post-humans on Jupiter
c) The VR/download human/post-human/AI society, which has many
forms, including its successors on Jupiter
d) The non-cooperating capitalist anarchy of London
e) The anarcho-capitalist society of New Mars, a stellar colony.
Each of these societies could be, should be, a novel, and I am told
the New Mars society is described in one of the earlier books. However,
he tries to get all of these into one book and that is simply too much to
attempt. Vernor Vinge has often written about how it would be
impossible to write decent fiction about transcendent post-humans, and
he's right. Macleod does not give us a post-human character point of
view, but there is something unsatisfying about the role they do play.
Finally, his prose style is somewhat unpolished, in my opinion, but this
will improve with time. His female protagonist doesn't even seem
remotely female to me. And I also find her character deeply
disturbing in ways I am not sure Macleod intended, as will be described
in the spoiler section.
With those criticisms aside, this book does indeed break new ground
in lots of interesting ways and is well worth reading.
Now on to the spoilers
-------------------------------
-------------------------------
Either there is depth missing or I simply don't buy Macleod's anarcho-
communist utopia. Sure, nanomachines provide the material wealth,
including all the gold and silver members of the Union want when they
visit the capitalists. But who does the dirty work? Everybody seems
to be volunteering only to do cool, interesting jobs, but the rest of
the population seems missing. The most we see is that on a flight,
somebody volunteers to be the flight attendant. But nanotech, while
getting rid of material need (other than for real estate and raw
materials) doesn't eliminate the service jobs, and the restrictions
against AI in the Union leave many of those to humans.
He can make a case for this if he wants, but in a book so much about
politics, so many details are missing. Even the most die-hard
capitalist would become a socialist in exchange for eternal youth, yet
in this book, they don't.
But the largest problem is with Ellen, the protagonist. She is _evil_.
I mean beyond Hitler or Stalin level evil. She spends most of her life
thinking that AIs are just tools, to be destroyed at will, not worthy of
any rights. And she and her group fight to kill them. Later she is
forcefully convinced otherwise (through events that are still inadequately
explained) and what does she do?
She commits a massive genocide, against the will of the people of Earth
and the Division which gave her all her tools. Her society plans a huge
pre-emptive genocide before even talking to their victims. After this
is called off, she implements another method even before she gets real
evidence of duplicity by the Jovians.
At best she sees evidence that some Jovians, though not all, want to take
over New Mars ships. The Division's ships and the Union are pretty much
immune to the takeover. She isn't at risk. Yet she kills them all
without a thought -- untold millions of advanced beings. Yes, 2000
of their years ago, some of their predecessors took slaves and killed,
but those crimes are nothing compared to Ellen's atrocities.
Can one even blame the Jovians? They're much smarter than us, after all.
They knew what the Division had planned with the comet attack. They have
a demonstrable reason to want to get control of some ships, even to make
an advance strike on regular humans to protect themselves. And yet
not even all of them wanted to do the attack. Yet all are killed.
My other issue is that the Jovians don't seem to be nearly smart enough.
It's amazing that beings this advanced would be subject to the attack in
question, and not able to predict it and prepare defences. The Earth
delegate is correct. Their ancestors, 2000 years ago, were able to
crumble up moons and build wormholes. A few comets, spaced out over
years of their subjective time, should not be enough to topple them into
civil war and destruction.
This is the problem Vinge identifies. How do you truly write conflict
between post-human beings and ourselves? His answer was to create fake
zones that the transcendent beings could not enter, but humans could.
Short of that, it's hard to imagine how they can't win. It's like humanity
losing a war with mice.
--
Brad Templeton http://www.templetons.com/brad/
Really? Which ones? I have read his previous books,
but the connections seem pretty limited to me.
>The book's main flaw is that it attempts way too much for 240 pages.
This is one of my favourite things about Macleod. His books are concise.
>Finally, his prose style is somewhat unpolished, in my opinion, but this
>will improve with time. His female protagonist doesn't even seem
>remotely female to me.
How many military women that age do you know?
>And I also find her character deeply disturbing in ways I am not sure
>Macleod intended, as will be described in the spoiler section.
I'm sure that this was quite intentional.
Spoilers below:
.
.
.
.
.
.
.
.
.
.
.
>But the largest problem is with Ellen, the protagonist. She is _evil_.
>I mean beyond Hitler or Stalin level evil.
No. She has the True Knowledge. I did think it was a bit of a wimp
out for Macleod to make her right as well. I fully expected her to
nuke the AIs from orbit with no other justification than the Knowledge.
[for those of you who read past the spoiler warning and haven't read
the book, I am not talking about London taxicab driving qualifications!]
--
Niall [real address ends in se, not es]
>The book's main flaw is that it attempts way too much for 240 pages.
>It tries to get into depth on several different societies:
> a) The socialist Earth society
> b) The militarist/socialist "Cassini Division" society. The
> Division is a military unit, those who have volunteered
> to protect the solar system from
> c) The VR/download human/post-human/AI society, which has many
> forms, including its successors on Jupiter
> d) The non-cooperating capitalist anarchy of London
> e) The anarcho-capitalist society of New Mars, a stellar colony.
Point of note: New Mars is explored in depth in "The Stone Canal". So
are the origins of the post-human AIs and what happened to Jupiter. The
roots of the story actually go down to the 1970's, and are an out-growth
of events in "The Star Fraction". (I particularly like the way Earth in
the 27th century is basically run by descendants of Jon Wilde's parents'
splinter faction, but that's just my sense of whimsy.)
I would add that in my opinion you REALLY need to start on Ken's work
with The Star Fraction and The Stone Canal -- it's a series (although
each book is somewhat self-contained) and I disagree with the decision
to start US publication with volume #3 (although I can see why marketing
exigencies might have suggested this).
(I'm currently in the process of re-reading the whole series, including
"The Sky Road" -- the latest book -- so will probably have more to say
later.)
-- Charlie
Um...True Knowledge? Care to explain. Please?
Because as it stands, I can't see much of a justification for the
acts described in the original post.
Kristopher/EOS
snip review and most of spoilers
>
>-------------------------------
>
>-------------------------------
>
>But nanotech, while
>getting rid of material need (other than for real estate and raw
>materials) doesn't eliminate the service jobs, and the restrictions
>against AI in the Union leave many of those to humans.
>
I haven't read the book, so I won't comment on anything else - but, I would
argue that most service jobs relate to the satisfaction of material needs,
in ways that mean the nanotech economy could eliminate many of them:
e.g. 1) my car breaks down. Instead of taking it to the garage, I simply
have a nano diagnostic & repair kit sort it out - maybe grow a new one, if
necessary.
2) Moving house - estate agent's role is greatly reduced - just find &
buy a bit of land, then grow a house on a preset design.
3)The bank/building society/S&L's roles are also greatly reduced or
eliminated, as mortgage business is largely unnecessary. The bank's role in
company finance presumably largely disappears as well. Presumably, no one
has any savings either. (As I said, I've not yet read the book, or indeed
any McLeod - though The Star Fraction is waiting on top of my guitar amp -
so I don't know if the economy is monetised or not).
4) Insurance - who needs it? No one will need or want to steal (except
perhaps something truly unique like an original artwork). Damaged goods are
in most cases easily replaced
Just in the UK economy alone, those four areas employ several million
people in services (I'm including here companies that service them, as
well). Most of those jobs would be completely unnecessary in a nanotech
economy.
I'm sure we could find ways to eliminate most other service jobs under
nanotech, especially if we have widespread AI (though I notice you said they
have restrictions on that) as well. I agree there would still be some
service jobs, but really not many at all. Perhaps only a couple of per cent
of the population, at most.
Cheers,
Justin
One thing that this book really has going for it is quite possibly
the best cover copy/book quote I've ever seen in my life.
With that out of the way, off to the spoilers.
>But the largest problem is with Ellen, the protagonist. She is _evil_.
I have trouble disagreeing with this.
>I mean beyond Hitler or Stalin level evil. She spends most of her life
>thinking that AIs are just tools, to be destroyed at will, not worthy of
>any rights. And she and her group fight to kill them. Later she is
>forcefully convinced otherwise (through events that are still inadequately
>explained) and what does she do?
>
>She commits a massive genocide, against the will of the people of Earth
>and the Division which gave her all her tools. Her society plans a huge
>pre-emptive genocide before even talking to their victims. After this
>is called off, she implements another method even before she gets real
>evidence of duplicity by the Jovians.
>
>At best she sees evidence that some Jovians, though not all, want to take
>over New Mars ships.
This was my major problem with the book. I find it hard to believe
that we have these super-duper fast-human post-singularity
AI-type-folk and that when they upload themselves to a ship and try
to disguise the fact, they make the rather idiotic mistake of
having the simulated human motion be synchronized? Right. It
seemed like a plot device to minimize the overwhelming moral
ambiguity (if not, as above, just plain evilness) of Ellen's
decision. Giving Ellen such a blatantly obvious clue _and_ having
her proven correct made the end of the book seem like too much of a
cop out.
What I'm most interested in is why the fastfolk decided to get the
hell off Jupiter in such a blatantly bad fashion. It seemed too much
like plotting by authorial fiat. I hope that this is somehow
explained in a future volume and that maybe Ellen's unwavering
"vision" will cause her to do something that has actual moral
consequences besides a nice thank you note from the people of
Earth.
Aaron
--
Aaron Bergman
<http://www.princeton.edu/~abergman/>
I refer to personal service, not service related to goods. And you
still need software to do all the goods related services.
>
> 2) Moving house - estate agent's role is greatly reduced - just find &
>buy a bit of land, then grow a house on a preset design.
I must wonder if you've bought a house! What the agent does is only
partly to do with the material. A lot of it is the legal work and the interface
between people. Since in the nano-utopia, real estate, especially private
real estate, is one of the few things remaining with value, you can expect
the complications of getting a piece of real estate to call your own to
increase.
>
> 3)The bank/building society/S&L's roles are also greatly reduced or
>eliminated, as mortgage business is largely unnecessary. The bank's role in
No, because if land becomes all that is valuable, it becomes extremely
valuable, and choice land even more so, so you still need to find a way
to finance it. Unless the state rules that everybody has to live in
giant nano-built urban complexes (which can be close to unlimited) then
*somebody* gets to have the private scenic beach house, and that means
some way of deciding who it is -- probably money.
> 4) Insurance - who needs it? No one will need or want to steal (except
>perhaps something truly unique like an original artwork). Damaged goods are
>in most cases easily replaced
For goods, yes, but you are forgetting liability insurance for injury,
life assurance, health insurance and all sorts of other forms. And a fire
destroying all your personal items is a trauma you may still want to cover.
Right. They are not nearly smart enough. It makes no sense to win a
battle with post-human intelligences because you do something clever and
they do something stupid. Not when they are not just smart but _fast_.
Even if they start doing this synchronous movement, how many of their hours
does it take to figure out that it will get spotted? How many of their
years have they been planning this?
No, it would have worked better simply to have the Division's advanced
virus-detecting software notice that the transmission the ship
received was loaded with baddies.
>seemed like a plot device to minimize to overwhelming moral
>ambiguity (if not, as above, just plain evilness) of Ellen's
>decision. Giving Ellen such a blatantly obvious clue _and_ having
>her proven correct made the end of the book seem like too much of a
>cop out.
This is what bothered me about it. I don't mind Macleod writing an
evil protagonist. In fact, it's very interesting. A book with Stalin
as the protagonist would be interesting, if disturbing. But this was
like that book having us discover at the last second that some of the
millions he killed really were hatching a secret plot to destroy Russia,
and so it was now all OK.
>What I'm most interested in is why the fastfolk decided to get the
>hell of Jupiter in such a blatantly bad fashion. It seemed too much
>of plotting by authorial fiat. I hope that this is somehow
>explained in a future volume and that maybe Ellen's unwavering
>"vision" will cause her to do something that has actual moral
>consequences besides a nice thank you note from the people of
>Earth.
I expect the fastfolk aren't really dead, just resting. They'll be back.
> > No. She has the True Knowledge. I did think it was a bit of a
> > wimp out for Macleod to make her right as well. I fully expected
> > her to nuke the AIs from orbit with no other justification than
> > the Knowledge.
>
> Um...True Knowledge? Care to explain. Please?
It's the political philosophy accepted by one of the societies in the
book. You're probably better off reading the book than relying on
summaries here, but very roughly it's the synthesis of cooperation and
anarcho-communism with a pragmatic and non-optimistic (more or less
Heinleinian) view of human nature.
Not all of the societies in The Cassini Division accept the True
Knowledge, of course. I can't tell which of the sides (if any) the
author is on, which I imagine is just the way he wants it.
But some versions of anarchism have more government (or something that
_looks_ a whole lot like government) than some non-anarchist political
theories. Fourieran phalanxes, for example.
--
Dan Goodman
dsg...@visi.com
http://www.visi.com/~dsgood/index.html
Whatever you wish for me, may you have twice as much.
>Personally, I find all forms of anarchism utterly implausible, so the other
>questions become moot. It just isn't in the nature of the beast (which is to
>say, us.)
Just for the fun of arguing, I'd like to point out that only stable
anarchism is utterly implausible. No social system is utterly
implausible on a sufficiently short time-scale. To use a somewhat loose
analogy, it's like thermodynamics -- the further away you are from
equilibrium, the more baroque are the possibilities of non-equilibrium
behaviour (ourselves being the proof of the pudding :-). The micro-
anarchy described in _Star Fraction_ I can swallow, because it turns out
to be fairly "micro" in its duration too. But the homeostatic anarchy on
Earth in _Cassini Division_ is indeed a bit hard to swallow.
I must say, BTW, that I am fascinated by Ken's attitude to AI. From the
debates we have been having around here, he seems to be of the opinion
that self-hood is not reducible to mere information processing, but this
doesn't quite square with his books.
--
Mike Arnautov
m...@mipmip.demon-co-antispam-uk
Replace dashes with dots and remove the antispam component.
> I can't tell which of the sides (if any) the
> author is on, which I imagine is just the way he wants it.
That's one of the attractive characteristics of the book.
After reading the book, I sent the author an EMail; the following is
extracted from that.
1. I don't believe your Jupiter attack could work. If I correctly
understand the story, the comets in question are a small part of the
series of comets that have been impacting New Mars as part of the
terraforming project. Hence their mass and energy must be small compared
to that of New Mars--otherwise they would be burning it up, greatly
increasing its mass and gravity, et multae caetera. But Jupiter is
enormously bigger than New Mars. So how can their impact do enough damage
to Jupiter to destroy a post human civilization--especially one that has,
by its standards, years of advance warning?
2. You say that it is necessary to accelerate while going through the
Wormhole. But when the comets go through, neither they nor the wormhole is
accelerating.
3. I don't understand why the non-co's are so much poorer and more
backward than the rest of the planet. Some of them are immigrants from the
majority civilization--what stops them from bringing the necessary
knowledge and equipment to start nanotech production for themselves? The
obvious explanations are either that you think capitalism doesn't work,
which is inconsistent with the picture of New Mars, or that they are being
deliberately kept down by the dominant civilization, which doesn't seem to
fit the description in the book--although this may be something you are
deliberately suggesting without saying.
4. I don't understand why, in the final bits of the book, the New Mars
people turned out to be so far off in their estimate of their ability to
defend against the fast folk. If I understand the situation correctly, New
Mars people have more computing power available than Solar Union people,
both because they have electronics and because they are much richer per
capita. And they have much more experience with trying to hack into each
other's systems and defending against the same. Furthermore, they have
much more experience with self-conscious AI, since many of them are
self-conscious AIs. So why is it that their estimate of the situation is
badly off and the Solar Union gets it right?
It would make sense if a few of the New Mars people were optimists, and
the rest were simply unwilling to stop them--but it sounds from the book
as though the traders represent the consensus view.
5. I enjoyed the portrayal of the anarcho-communist society and the
contrast with the anarcho-capitalist society. But I don't think you give
an adequate or (to me) convincing account of how the former works--how it
solves the standard coordination problems. Your volunteer stewardess story
is plausible enough--indeed, on my flight back, there were two girls (at
least one French, not speaking English) who were flying as unescorted
minors and ended up spending part of the flight as volunteer amateur
stewardesses distributing candy bars and such. But running a complicated
society on volunteers without price signals is a much harder problem. You
hint at some sort of, presumably voluntary, central planning using big
Babbage Engines, and suggest that it becomes less necessary as the society
becomes more decentralized (and, I presume, less interdependent, which is
not at all the same thing--the price system, after all, is a mechanism for
combining decentralization with interdependence).
But I would like to see a clearer explanation. I am familiar with a number
of gift societies (SF fandom, SCA, the open source software movement), but
all of them function within a larger trade society, which enormously
simplifies their coordination problem--and the ones I know about have a
lot more politics (in the invidious sense, involving status competition,
competition for control of what happens, etc.) than the Solar Union seems
to.
The two explanations that occurred to me are:
A. Some fundamental change in how people think and act (but I don't see
one, and there ought to be historical examples)
B. The society really works very badly, from an economic standpoint, as
suggested by the contrast with New Mars. But its inherited technology is
so good that it can work at 10% of potential (due to misallocation from
lack of prices etc) and still be quite attractive.
6. I thought the idea of Stirnerite, near Randian, anarcho-communists
contrasted with moralistic anarcho-capitalists was great fun.
7. How much have you thought about the sexual institutions that you (and
others) postulate for your future society? My first reaction is that the
society of casual promiscuity you portray, while a pleasant (male)
fantasy, isn't very plausible--at least, it corresponds to no real
heterosexual human society I am aware of, and seems based on Mead's
(apparently fictional) Samoa mixed with sixties practices that seem to be
in decline at present. My suspicion (see _The Adapted Mind_ for some
evidence) is that the considerable similarity of sexual practices across
very different societies reflects innate human characteristics, presumably
ones that led to reproductive success in the hunter gatherer environment
in which we spent most of our species history.
Two possible responses occur to me. One is that the sexual practices are a
side effect of life extension. The practices of existing societies
developed in a context where most people spent a fair part of their adult
life producing children--perhaps in a world where a large part of the
population has finished that part of life and is doing other things,
practices will be different. The other is that the practices you describe
may be a side effect of wealth, itself due to technology.
8. I still don't believe that people would regard being replaced by a
backup as the equivalent of not dying--but that assumption is important
for the story you are telling, so I will suspend disbelief.
9. I haven't done any calculations, but I am sceptical of radio
transmissions over thousands of light years--unless your source antenna is
of astronomical size, I doubt you can keep a tight enough beam to do it
with any reasonable amount of power input.
--
David Friedman
www.best.com/~ddfr/
> 3. I don't understand why the non-co's are so much poorer and more
> backward than the rest of the planet. Some of them are immigrants from the
> majority civilization--what stops them from bringing the necessary
> knowledge and equipment to start nanotech production for themselves? The
> obvious explanations are either that you think capitalism doesn't work,
> which is inconsistent with the picture of New Mars, or that they are being
> deliberately kept down by the dominant civilization, which doesn't seem to
> fit the description in the book--although this may be something you are
> deliberately suggesting without saying.
This part makes sense to me: they're poor because there aren't enough
of them. Most people prefer to cooperate.
In our world, a small group that deliberately keeps itself isolated
from the rest of human civilization is going to be poor and backward.
Our technology isn't something you can just carry with you; keeping a
technological civilization going requires more knowledge, and more
work, than a single person is capable of.
You could imagine that this will be different in the future, but you
could also imagine (as Macleod did) that it won't. I think his
assumption is the more reasonable one.
Yeah, but the socialists, being socialists, will offer the fruits of
their technology to anybody, or so they say. It's not clear why the
non-cooperators can't leave the city, take some free gold, and return.
Would they be shot if they try?
And as I point out, even the most dyed-in-the-wool capitalist would
turn socialist in exchange for the nano-capsule that provides
eternal youth.
The dichotomy in the solar union doesn't make a great deal of
sense. A socialist society that truly provides "to each according
to their needs" and does it without coercion would of course
get all takers. There must be some form of restriction that's not
documented in this book, something you must agree to do or not do
in order to be in the Union.
Disclaimer: I've only just read The Star Fraction which I enjoyed
immensely (just the phrase "That's a load of serdar argic" had me on
the floor screaming for air and mercy). The only thing that detracted
from my admiration of that book was the totally cardboard female
characters. Especially the biologist who just strolls out of her life
to trail the protagonist with no second thought to friends, family,
pet rabbits, nothing.
It must have been bad for me to notice - usually my brain provides
enough characterisation that its weakness in a novel doesn't tend to
bother me, though its presence pleases me.
Not that it's going to stop me from reading the Stone Canal, mind.
Frossie
--
Joint Astronomy Centre, Hawaii http://www.jach.hawaii.edu/~frossie/
Language is the soul's ozone layer and we thin it at our peril --Sven Birkerts
It might be more correct to say that no political system is stable over
a long time scale, at least not yet.
Of course there are some old systems around -- Iceland, for example.
And the USA is now one of the older ones.
But how old are they really? The USA, politically, is quite divorced
from its roots, with the government taking a fat chunk of the GDP and
a variety of other changes.
An anarchist system on these time scales is not out of the question.
Anarchy, after all, is the absence of a _state_, not the absence of
order and systems of organization. A state is simply something with
a monopoly on the big guns.
>9. I haven't done any calculations, but I am sceptical of radio
>transmissions over thousands of light years--unless your source antenna is
>of astronomical size, I doubt you can keep a tight enough beam to do it
>with any reasonable amount of power input.
Well, according to various calculations I've seen, the Arecibo antenna
could probably detect large-scale US military radars from hundreds of light
years away (if they were pointed in the right direction). You could
certainly communicate over a few thousand light years, albeit perhaps at a
slow data rate, with antennae that are not of "astronomical" size.
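For a rough sanity check, here's the sort of back-of-envelope link budget
I mean. All of the numbers below are my own illustrative assumptions (two
Arecibo-class 300 m dishes, a megawatt transmitter at the 1.4 GHz hydrogen
line, a 20 K receiver, a 1 Hz bandwidth), not anything from the book or
from the calculations I mentioned:

# Back-of-envelope interstellar link budget (Friis equation).
# All inputs are assumed for illustration, not figures from the book.
import math

c  = 3.0e8        # speed of light, m/s
kB = 1.38e-23     # Boltzmann constant, J/K
ly = 9.46e15      # one light year, m

freq  = 1.4e9                              # hydrogen-line frequency, Hz
lam   = c / freq                           # wavelength, ~0.21 m
D     = 300.0                              # dish diameter, m (Arecibo-class)
eff   = 0.5                                # aperture efficiency
gain  = eff * (math.pi * D / lam) ** 2     # ~1e7, i.e. about 70 dBi

P_t   = 1.0e6                              # transmitter power, W
d     = 1000 * ly                          # range: 1000 light years
T_sys = 20.0                               # system noise temperature, K
B     = 1.0                                # receiver bandwidth, Hz

P_r = P_t * gain * gain * (lam / (4 * math.pi * d)) ** 2   # received power
P_n = kB * T_sys * B                                       # noise power

print(f"received power : {P_r:.2e} W")
print(f"noise power    : {P_n:.2e} W")
print(f"SNR (1 Hz band): {P_r / P_n:.1f}")

That comes out to a signal-to-noise ratio of order one at 1000 light years,
i.e. detectable, but only at a very slow data rate; scale up the dishes or
the transmitter power a bit and a few thousand light years looks reachable.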
> Just for the fun of arguing, I'd like to point out that only stable
> anarchism is utterly implausible. No social system is utterly
> implausible on a sufficiently short time-scale.
OBOtherSF: _The Syndic_, C.M. Kornbluth. A benign anarchist utopia
descended from the Mafia (!!!) is portrayed as being stable only over a
span of a few decades, and it is made quite clear that there is no point
trying to preserve it, as those very efforts would serve to undermine
it. Fun book.
> Mike Arnautov
Steve
--
------------------------------------------------------------------
Steve Taylor st...@afs.net.au
Applied Financial Services
Phone: +61 3 9670 0233
Fax: +61 3 9670 5018
A bit more subtle than that, actually. There's also a malign anarchist
dystopia descended from the Mafia.
> A bit more subtle than that, actually. There's also a malign anarchist
> dystopia descended from the Mafia.
Yeah. I didn't want to clutter up the posting though.
> Dan Goodman
S.
>There must be some form of restriction that's not
>documented in this book, something you must agree to do or not do
>in order to be in the Union.
Shifting from 'we' to 'you' apparently does it, in some contexts.
Fair point. There would probably still be work in restaurants, cafés etc,
for instance, perhaps more so than now; legal and medical services would
still be necessary; education would probably expand; etc. OK, I'm willing to
accept that quite a lot of that work would still be around - even so, it
could be considerably reduced by automation (example, I've just left a job
in customer service for a major telephone company - several years ago, they
had a considerable downsizing program, reducing the number of engineers
employed by roughly half. This was largely possible because of the
introduction of digital exchanges. So now the company is largely customer
service orientated <allegedly>. And they've already found ways to reduce the
number of customer service representatives necessary using automated
answering services. The next generation of such services are on their way,
and should make it possible to have another massive reduction in staff.).
>And you
>still need software to do all the goods related services.
In a mature nanotech economy, I suspect that much of that software would be
legacy code.
>> 2) Moving house - estate agent's role is greatly reduced - just find &
>>buy a bit of land, then grow a house on a preset design.
>
>I must wonder if you've bought a house! What the agent does is only
>partly to do with the material. A lot of it is the legal and the interface
>between people.
Ah, the legal aspects are handled differently in the UK. It's called
conveyancing, and is handled by a type of lawyer called a solicitor. The
estate agent's role reduces largely to getting the house sold & flogging
mortgages and life & pensions products to the buyer.
When I was thinking of buying a flat a few months ago, I was, IIRC, quoted
just under £700 for the legal work. Lawyers aren't cheap, so I'd be
surprised if this represents more than a couple of man-days of actual work -
for a transaction that I might make every ten or twenty years. Not, I think,
a significant work-generator, unless people start moving a lot more.
>Since in the nano-utopia, real estate, especially private
>real estate is one of the few things remaining with value, you can expect
>the complications of getting a piece of real estate to call your own to
>increase.
You may have a point. My first cut at this is that the cost of land ought to
drop, as we get rid of all that unnecessary office space, warehousing,
factory space, retail, not to mention clearing up contaminated land,
possibly even agricultural land, if we use nanotech to build our food. So
most land, would, IMO, be virtually free. However, as you point out below,
land in good locations would indeed be worth a lot. So ISTM that those who
live in such locations will indeed need mortgages to finance their purchase.
But most people would therefore live in less choice locations - just as they
do now.
>>
>> 3)The bank/building society/S&L's roles are also greatly reduced or
>>eliminated, as mortgage business is largely unnecessary. The bank's role in
>
>No, because if land becomes all that is valuable, it becomes extremely
>valuable, and choice land even more so, so you still need to find a way
>to finance it. Unless the state rules that everybody has to live in
>giant nano-built urban complexes (which can be close to unlimited) then
>*somebody* gets to have the private scenic beach house, and that means
>some way of deciding who it is -- probably money.
No, I don't see that. As referred to above, we would have more land for
living purposes than we had before. Only a problem if the population goes
way up. True enough that private scenic beach houses won't be cheap. But
suburban land will be.
>
>> 4) Insurance - who needs it? No one will need or want to steal (except
>>perhaps something truly unique like an original artwork). Damaged goods are
>>in most cases easily replaced
>
>For goods, yes, but you are forgetting liability insurance for injury,
>life assurance, healt insurance and all sorts of other forms. And a fire
>destroying all your personal items is a trauma you may still want to cover.
>>
I once had the misfortune to work in the insurance industry. Some of the
sorts of things you refer to would still be relevant in the nanotech utopia.
But, why do we have things like life assurance? To maintain a family's
circumstance if something happens to the breadwinner(s). Well, if most
products are free or nearly so and most people don't need to work, this kind
of thing is virtually unnecessary - and yet life & pensions products
represent one of the largest sectors in personal insurance (at least in the
UK). As I said before, I believe most insurance relating to damages and/or
theft would disappear. Motor insurance (assuming we're still driving cars!)
would remain because of the issues you referred to such as liability for
injury.
I still think the insurance industry would shrink a great deal, though it's a
fair point that some aspects would remain, particularly personal & public
liability insurance, and health insurance (a much less significant factor
here, as we mostly pay this through our taxes, which is why it didn't
initially occur to me).
Ultimately I feel sure that all these industries would shrink significantly
under nanotech, though not as much as I initially thought. To put some
figures on it, they might perhaps be a quarter of their current size, where
initially I was thinking 5 - 10%.
Cheers,
Justin
> Stoned koala bears drooled eucalyptus spittle in awe
> as <b...@templetons.com> declared:
>
> >The book's main flaw is that it attempts way too much for 240 pages.
> >It tries to get into depth on several different societies:
> > a) The socialist Earth society
> > b) The militarist/socialist "Cassini Division" society. The
> > Division is a military unit, those who have volunteered
> > to protect the solar system from
> > c) The VR/download human/post-human/AI society, which has many
> > forms, including its successors on Jupiter
> > d) The non-cooperating capitalist anarchy of London
> > e) The anarcho-capitalist society of New Mars, a stellar colony.
>
> Point of note: New Mars is explored in depth in "The Stone Canal". So
> are the origins of the post-human AIs and what happened to Jupiter. The
> roots of the story actually go down to the 1970's, and are an out-growth
> of events in "The Star Fraction". (I particularly like the way Earth in
> the 27th century is basically run by descendants of Jon Wilde's parents'
> splinter faction, but that's just my sense of whimsy.)
>
You mean there's a happy ending for the AIs? I just read The Cassini
Division, and I am not about to read any more by him.
--
Samuel Kleiner Everybody knows that dragons don't exist.
But while this simplistic formulation may satisfy the layman,
it does not suffice for the scientific mind. -Stanislav Lem
>An anarchist system on these time scales is not out of the question.
>Anarchy, after all, is the absence of a _state_, not the absence of
>order and systems of organization. A state is simply something with
>a monopoly on the big guns.
Hmmm... Let's not start one of *those* discussions again. Suffice it to
say that it is a view. Others have other views. :-)
>In article <19990709141443...@ng-fy1.aol.com>,
>JoatSimeon <joats...@aol.com> wrote:
>>Personally, I find all forms of anarchism utterly implausible, so the other
>>questions become moot. It just isn't in the nature of the beast (which is to
>>say, us.)
>But some versions of anarchism have more government (or something that
>_looks_ a whole lot like government) than some non-anarchist political
>theories. Fourieran phalanxes, for example.
Explain. It'll be the group's oddball political theory for the week.
:-)
--
Phil Fraering "What are we going to do tonight, Miles?"
p...@globalreach.net "Same thing we do every night, Ivan,
/Will work for tape/ try to take over the Imperium!"
MADAM IM ADAM
>You mean there's a happy ending for the AIs?
Not as far as I know, nor did I indicate so.
Note that the fast thinkers get through time at a couple of thousand
times human speed, or more -- by the time you know they're out there,
civilization has risen and fallen a couple of zillion times, and the
descendants of the AI equivalent of cockroaches have taken over. That's
basically why the humans in Ken's series have a big down on the fast
thinker civilizations -- they're fundamentally not something you want
to live next door to.
You'll also note some interesting questions raised in The Stone
Canal about the implications of reincarnating people in cloned bodies
from memory recordings ...
-- Charlie
The truth is land is not that expensive as a gross commodity right now.
You can get a house for $10,000 in the middle of nowhere.
What almost all people do is buy the best land they can afford. If land
becomes the one thing most valued, as it will in a nano-society, this
factor will increase, not weaken.
People want the best views, the best climate, the best neighbours and
nearby facilities and attractions and at the same time the best privacy.
Yes, people without money will always be able to get near-free living
space in some large complex, but anybody with any means will spend
about half those means on getting a good place to live.
>
>No, I don't see that. As referred to above, we would have more land for
>living purposes than we had before. Only a problem if the population goes
>way up. True enough that private scenic beach houses won't be cheap. But
>suburban land will be.
We already have, in places like the USA, more land available for living
than we can use. We like the land that's nice, and close to things
and people we want to visit.
>
>I once had the misfortune to work in the insurance industry. Some of the
>sorts of things you refer to would still be relevant in the nanotech utopia.
>But, why do we have things like life assurance? To maintain a family's
>circumstance if something happens to the breadwinner(s). Well, if most
>products are free or nearly so and most people don't need to work, this kind
>of thing is virtually unnecessary - and yet life & pensions products
>represent one of the largest sectors in personal insurance (at least in the
I don't agree. People will always want more than the basic level no
matter how high that basic level gets. Today our poor people live much better
than kings (in the material sense) did long ago.
If you're a hot software designer, making a good income designing
nano-software, and supporting your family in the high style, you might
well want life assurance just as you do today, to keep them in that
style if you kick it.
>Ultimately I feel sure that all these industries would shrink significantly
>under nanotech, though not as much as I initially thought. To put some
>figures on it, they might perhaps be a quarter of their current size, where
>initially I was thinking 5 - 10%.
It doesn't actually matter. People actually *want* occupations, both to
occupy themselves in interesting ways and to get ahead of the Joneses,
no matter what level the Joneses are at.
I don't expect a leisure utopia. I think what we do, and what we get
paid for, will simply move to other things. Creative work for those
who can do it. Personal service for those who can't.
Now if you postulate not just nano, but AI servants (if that's moral)
you do indeed get into the question of what people are going to do with
their time.
>An anarchist system on these time scales is not out of the question.
>Anarchy, after all, is the absence of a _state_, not the absence of
>order and systems of organization. A state is simply something with
>a monopoly on the big guns.
No, it's not. A state is a very complex thing, culturally and socially as
well as politically. It is missing the point at best to call it simply the
people with the big guns (in fact, in many historical circumstances, that
has been a false description).
>dsg...@visi.com (Dan Goodman) writes:
>
>>In article <19990709141443...@ng-fy1.aol.com>,
>>JoatSimeon <joats...@aol.com> wrote:
>>>Personally, I find all forms of anarchism utterly implausible, so the other
>>>questions become moot. It just isn't in the nature of the beast (which is to
>>>say, us.)
>
>>But some versions of anarchism have more government (or something that
>>_looks_ a whole lot like government) than some non-anarchist political
>>theories. Fourieran phalanxes, for example.
>
>Explain. It'll be the group's oddball political theory for the week.
Oooold communitarian socialist model. Look for it in the history books.
Basically a kind of commune, which tended to succeed only if led by a
sufficiently charismatic and competent leader, flopping otherwise.
Well, pure anarchists would say that a state in which participation is
voluntary is an anarchist system. That's what Macleod's book in effect
claims to have. There is a "state" of sorts in his solar union, with
decision making functions, delegates, etc. But participation is
voluntary, and if the "state" makes the wrong decisions and the people won't
back them up the state loses.
The "big guns" above is a metaphor for "the strongest force."
Which states from history do you believe existed even though those
supposedly under their rule had more military power than the state
itself?
Clearly, if this takes place, those subjects who had the stronger
power are participating somewhat voluntarily, to the extent that if they
are ordered to do something truly against their will they have the power
to refuse, though perhaps at great cost.
one can imagine such systems existing for some period of time. But
have any spanned decades?
We have a functioning, long-lasting anarchy. It's called world politics.
It's been around a long time -- quite a while since the last time a
nation ruled the civilized world, or even its entire trading sphere.
No nation has a monopoly on the big guns, though for a few years in the 40s
the USA did. It seems to work pretty well. Of course nations bully
other nations -- with the USA leading the way in that area, but it
continues and is stable.
>In article <378abb5f...@news.netcom.ca>,
>Ian <iadm...@undergrad.math.uwaterloo.ca> wrote:
>>b...@templetons.com (Brad Templeton) wrote:
>>
>>>An anarchist system on these time scales is not out of the question.
>>>Anarchy, after all, is the absence of a _state_, not the absence of
>>>order and systems of organization. A state is simply something with
>>>a monopoly on the big guns.
>>
>>No, it's not. A state is a very complex thing, culturally and socially as
>>well as politically. It is missing the point at best to call it simply the
>>people with the big guns (in fact, in many historical circumstances, that
>>has been a false description).
>
>Well, pure anarchists would say that a state in which participation is
>voluntary is an anarchist system. That's what Macleod's book in effect
>claims to have. There is a "state" of sorts in his solar union, with
>decision making functions, delegates, etc. But participation is
>voluntary, and if the "state" makes the wrong decisions and the people won't
>back them up the state loses.
>
>The "big guns" above is a metaphor for "the strongest force."
>
>Which states from history do you believe existed even though those
>supposedly under their rule had more military power than the state
>itself?
Since you haven't restricted the discussion to nation-states, I'll cite
much of feudal Europe. One might pay especial attention to
Poland-Lithuania.
The main point, though, is that having a "power monopoly" isn't what
defines a state. It's a _feature_ of many states, but there are many other
features which are at least as important. Being the biggest guns on the
block isn't a sufficient condition for being a state.
>While clearly if this takes place those subjects who had the stronger
>power are participating somewhat voluntarily, to the extent that if they
>are ordered to do something truly against their will they have the power
>to refuse, though perhaps at great cost. And because that cost is high,
>one can imagine such systems existing for some period of time. But
>have any spanned decades?
Poland lasted for quite a while, until its inefficiencies eventually
resulted in a failure to compete in a changing environment.
>We have a functioning, long-lasting, anarchy. It's called world politics.
"World politics" doesn't fit the definition of a society. Nations have
historically had far more ability to survive and act independently than
individuals in a society have ever realistically had, and they don't act
like individuals do. The less functionally independent nations become, the
less politically independent they become as well.
> Which states from history do you believe existed even though those
> supposedly under their rule had more military power than the state
> itself?
The Angevin empire would be a pretty clear example--unless you want to
count all of the feudal lords as part of the state, even though their
power bases were mostly independent of the King.
--
David Friedman
www.best.com/~ddfr/
> Brad Templeton wrote:
> > The dichotomy in the solar union doesn't make a great deal of
> > sense. A socialist society that truly provides "to each according
> > to their needs" and does it without coercion would of course
> > get all takers. There must be some form of restriction that's not
> > documented in this book, something you must agree to do or not do
> > in order to be in the Union.
>
> A thought experiment: Take the most rabid Libertarian (or other right
> winger) you know. If you don't know anyone rabid enough I'll introduce
> you to some. Someone who believes all communism is inherently evil, and
> must involve some nasty coercive trick, even if that trick is not
> visible (mind control is suggested by the non-cos in the book). Would
> *they* join such a socialist society ?
Note that the most visible non-co in the book is a very attractive
character, and presumably one of the smartest humans around, given that he
is roughly their equivalent of Einstein.
The society might be stifling without being coercive, and I think that is
part of what is being suggested. There is an orthodox ideology ("the true
knowledge") and the population seems to spend most of its time playing,
which some people might find pointless, or boring, or ignoble, or ... .
Imagine a society where most material goods are freely provided, but where
nobody will be friends with you unless you share their view of philosophy,
the good life, etc.--and you don't.
--
David Friedman
www.best.com/~ddfr/
I don't think so. I read it second (after "The Star Fraction"), so I
knew who Jonathan Wilde and Jordan Hubbard were, but nothing is said
about them in "The Cassini Division" that isn't explained there, and JW
only gets a minor role, and JH none at all (only his writings get
mentioned). As for the other three characters who appear in "The Stone
Canal", I had no feeling at all (until I read it) that they were anything
but minor characters.
> The book's main flaw is that it attempts way too much for 240 pages.
> It tries to get into depth on several different societies:
> a) The socialist Earth society
> b) The militarist/socialist "Cassini Division" society. The
> Division is a military unit, those who have volunteered
> to protect the solar system from
> c) The VR/download human/post-human/AI society, which has many
> forms, including its successors on Jupiter
> d) The non-cooperating capitalist anarchy of London
> e) The anarcho-capitalist society of New Mars, a stellar colony.
>
> Each of these societies could be, should be, a novel, and I am told
> the New Mars society is described in one of the earlier books. However,
> he tries to get all of these into one book and that is simply too much to
> attempt.
This is kind of tough for me to comment on - I guess I have enough
political background that I can fill in the blanks even if they are
there.
> And I also find her character deeply
> disturbing in ways I am not sure Macleod intended, as will be described
> in the spoiler section.
He did intend. Almost certainly.
> With those criticisms aside, this book does indeed break new ground
> in lots of interesting ways and is well worth reading.
>
> Now on to the spoilers
> -------------------------------
> -------------------------------
>
> Either there is depth missing or I simply don't buy Macleod's anarcho
> communist utopia. Sure, nanomachines provide the material wealth,
> including all the gold and silver members of the Union want when they
> visit the capitalists. But who does the dirty work? Everybody seems
> to be volunteering only to do cool, intersting jobs, but the rest of
> the population seems missing. The most we see is that on a flight,
> somebody volunteers to be the flight attendant. But nanotech, while
> getting rid of material need (other than for real estate and raw
> materials) doesn't eliminate the service jobs, and the restrictions
> against AI in the Union leave many of those to humans.
<OPINION>
Communist utopias are inherently hard to portray, because they have to
take for granted the exact opposite of most of the things we accept
about human nature in order to work. Ken does as good a job as anyone.
Have you ever read "The Dispossessed" by Ursula Le Guin? That book is
probably still the best effort in that direction.
</OPINION>
> And she and her group fight to kill them. Later she is
> forcefully convinced otherwise (through events that are still inadequately
> explained) and what does she do?
Where is she convinced before she (temporarily) becomes an AI herself?
That occurs after the genocide IIRC.
> She commits a massive genocide, against the will of the people of Earth
> and the Division which gave her all her tools. Her society plans a huge
> pre-emptive genocide before even talking to their victims. After this
> is called off, she implements another method even before she gets real
> evidence of duplicity by the Jovians.
Yep. She has the True Knowledge, so she does what she thinks is
necessary, as long as it is within her power, with no thought or
compassion for others. We think (naturally) that that is intolerably
evil, but Ken MacLeod does quite a good job of portraying it as a
reasonable basis for an anarchist utopia. That is, I think, what the
book is about.
Simon
A thought experiment: Take the most rabid Libertarian (or other right
winger) you know. If you don't know anyone rabid enough I'll introduce
you to some. Someone who believes all communism is inherently evil, and
must involve some nasty coercive trick, even if that trick is not
visible (mind control is suggested by the non-cos in the book). Would
*they* join such a socialist society ?
Simon
The short answer is yes. If the society doesn't *force* them to do
things, or pay taxes, and keeps them defended, it is a libertarian
society.
The Union follows a philosophy of doing what you can get away with.
There must be some reason that the non-cooperators don't avail themselves
of the benefits of the Union, i.e. there is a price for those benefits they
don't want to pay.
I don't know any libertarian so rabid they would turn down fountain
of youth medicine unless the price was some form of slavery. But it seems
in this world you can move freely from one group to the other. Malley
seems a bit strange, a bit mentally ill, to let himself get addicted and
old.
Sure, you can fill in blanks, but not all of them, because here he's
trying to break new political ground, and there are many theories about
the types of societies he is describing.
>
>> And I also find her character deeply
>> disturbing in ways I am not sure Macleod intended, as will be described
>> in the spoiler section.
>
>He did intend. Almost certainly.
Oh, he intended ambivalence about her motives, but the book ends with
everybody happy she did her genocide, and Earth sending messages to the
far future to say how happy everybody is about it.
They wouldn't be. They would be highly ambivalent about it as well.
Truth is, she planned all this and intended to carry it out without
waiting to talk to them, and without waiting for signs that they are
duplicitous. The last-minute takeover of the New Mars crew seems to
be a "see, she was right" ending rather than a good way to express the
enormity of her actions. Since she's the POV character, we don't hear much about
the Jovians. They are "things" to her for most of the book; the book
thus treats them like things, and they are squashed like things.
>
>Communist utopias are inherently hard to portray, because they have to
>take for granted the exact opposite of most of the things we accept
>about human nature in order to work. Ken does as good a job as anyone.
>Have you ever read "The Disposessed" by Ursula Le Guin ? That book is
>probably still the bets effort in that direction.
Of course. But Le Guin does a much better job of showing both sides
of both societies in The Dispossessed. Neither culture is any sort of
utopia. We're not really shown much of the downside of the Union. Why
do people live in the non-co areas?
>Yep. She has the True Knowledge, so she does what she thinks is
>necessary, as long as it is within her power, with no thought or
>compassion for others. We think (naturally) that that is intolerably
>evil, but Ken MacLeod does quite a good job of portraying it as a
>reasonable basis for an anarchist utopia. That is, I think, what the
>book is about.
The True Knowledge seems to say "destroy what you don't understand,
before it can hurt you." One wonders if a culture based on that can
really survive. Because the rest of the Union culture is based on
cooperation, or so it seems. The idea that it's more productive to
work out ways to cooperate with strangers than it is to kill them on
sight. She feels that really smart strangers are so dangerous that it's
not possible to cooperate.
I don't think he goes into nearly enough depth to understand whether it's
a reasonable basis for an anarchist utopia.
> Not that it's going to stop me from reading the Stone Canal, mind.
:The Stone Canal: has much more plausible women, well, considering
that one of them is a -- no, that would be a spoiler. :] :The Cassini
Division: has a female POV that didn't bother me at all as female.
I really really dislike her, but I didn't have any other problems with
her. :The Sky Road: has a really brilliantly realised female protagonist
in one thread. I even liked her - she's a minor character in :The Stone
Canal: and I didn't like her then, and I strongly disagree with what
she does all the way down the line, but I can't help sympathising with
her.
Ken MacLeod is really good at writing ambiguities, and things that
make you think and don't have clear-cut right/wrong left/right
good/bad walk/don't walk choices.
That's quite unusual for SF, not being sure at the end who the good
guys are, or whether the concept of "good guys" is at all relevant.
--
Jo - - I kissed a kif at Kefk - - J...@bluejo.demon.co.uk
http://www.bluejo.demon.co.uk - Interstichia; Poetry; RASFW FAQ; etc.
I'm not sure they would, actually. They're all anarchists, so they don't
care that she didn't do as she was told, and I guess they're delighted
that they're not all dead, posthumanised or slaves. I wouldn't expect
any of the characters to be ambivalent about what she did. I am, but
that is different.
> Truth is, she planned all this and intended to carry it out without
> waiting to talk to them, and without waiting for signs that they are
> duplicitous. The last-minute takeover of the New Mars crew seems to
> be a "see, she was right" ending rather than a good way to express the
> enormity of her actions.
I actually think the ending is the only one that makes any sense, and
not the cop-out other people seem to see it as. The point is, what she
did is still wrong, by my standards or yours, but if we'd been there
instead of her, we'd be dead. The alternative - she kills them all, but
it turns out they were OK after all - seems more like a cop-out to me.
It would make the morality of the situation clear, and put her firmly in
the wrong. As it is, we have to admit she was right, even though what
she did was unjustifiable.
> The True Knowledge seems to say "destroy what you don't understand,
> before it can hurt you."
As far as I can see (and the author actually doesn't make it terribly
clear) the True Knowledge dictates that if you perceive a threat (and
you cannot really deny that the posthumans were definitely a threat, and
there was prior knowledge they were likely to be untrustworthy) you
should act to eliminate it, as long as you can. It seems to owe a lot to
Max Stirner.
> One wonders if a culture based on that can
> really survive. Because the rest of the Union culture is based on
> cooperation, or so it seems. The idea that it's more productive to
> work out ways to cooperate with strangers than it is to kill them on
> sight. She feels that really smart strangers are so dangerous that it's
> not possible to cooperate.
It wasn't so much that the posthumans were smart as that they were
totally unpredictable and massively powerful. From a human viewpoint
what they say one day they may not keep to the next, as from their point
of view it was years ago. At the same time, they clearly had powers
hugely beyond those of the Union, even if they didn't use them terribly
competently.
The humans can deal with each other, in spite of their amoral egoist
philosophy, because they know they are more or less equal in strength
and predictably similar in nature. At the same time, for them, there is
massive material abundance. All of this breaks down with the posthumans
who are unpredictable in motivation, massively and mysteriously powerful
and probably lusting after control of the solar system.
Simon
A cop-out? They would never have found out "they were OK after all"
because she killed them. The question would have been left unanswered.
The ending that would answer the question would be, that she tries to
kill them, she fails, and the survivors turn out to be nice chaps.
That leaves her an unambiguous monster. (Of course, this is Ender's
situation, but Ender is a child obeying orders, who thinks he's in a
simulation.)
>It wasn't so much that the posthumans were smart as that they were
>totally unpredictable and massively powerful. From a human viewpoint
>what they say one day they may not keep to the next, as from their point
>of view it was years ago. At the same time, they clearly had powers
>hugely beyond those of the Union, even if they didn't use them terribly
>competently.
It's been played out in game theory: "attack first if they are a threat"
isn't a strategy that works.
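To put a concrete toy behind that claim -- this is just an illustration
of the kind of iterated prisoner's dilemma result I have in mind, not
anything from the book -- a "shoot first" player prospers against a lone
cooperator but loses badly once reciprocators can trade with each other:

    # Toy iterated prisoner's dilemma: 'C' = cooperate, 'D' = defect
    # (shoot first). Payoffs are (row player, column player).
    PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

    def always_defect(mine, theirs):
        return 'D'

    def tit_for_tat(mine, theirs):
        return theirs[-1] if theirs else 'C'

    def match(a, b, rounds=200):
        ha, hb, sa, sb = [], [], 0, 0
        for _ in range(rounds):
            ma, mb = a(ha, hb), b(hb, ha)
            pa, pb = PAYOFF[(ma, mb)]
            ha.append(ma); hb.append(mb)
            sa += pa; sb += pb
        return sa, sb

    # One shoot-first player in a population with two reciprocators.
    players = [('shoot-first', always_defect),
               ('tit-for-tat A', tit_for_tat),
               ('tit-for-tat B', tit_for_tat)]
    totals = {name: 0 for name, _ in players}
    for i in range(len(players)):
        for j in range(i + 1, len(players)):
            si, sj = match(players[i][1], players[j][1])
            totals[players[i][0]] += si
            totals[players[j][0]] += sj
    print(totals)
    # shoot-first scores 408; each tit-for-tat scores 799.
    # Killing the first soldier over the hill forfeits everything
    # that cooperation with the rest of the army would have earned.

Head-to-head the defector still edges out a single cooperator; the
tournament point is only that in a population with more than one
potential ally, shooting first is a losing long-run strategy.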
What she has done by her actions is destroy the human race. She's given
not just evidence, but proof, that the human race can't be trusted.
This destroys it because the arrival of the post-humans can't be stopped.
Even if you destroy them all, they can be created again, and they will
be created.
But this time those creating them will know how much they are feared, and
the post-humans created will know that humanity can't be trusted. They
will make sure -- and this time they'll be good at it and have a
warning -- that humanity can't destroy them again. Chances are they
will do this by totally disarming humanity, or simply wiping it out.
That's what the True Knowledge would tell them to do, but not just that
alone -- she's provided not just the fear of a threat, but proof.
To make doubly sure they will probably build a wormhole and send it out
after her, to kill her and her New Mars friends to boot.
This is the great flaw of the True Knowledge. Destroying a threat only
makes sense if you can destroy it permanently. But post-humanism can't
possibly be destroyed in such a way. Not even a totalitarian regime
clamping down on AI could stop it forever.
The True Knowledge has you shoot the first soldier coming over the hill,
because he's a threat. But all that means is that the army behind him
decides you can't be trusted, and kills you. If you risk cooperating
with that first soldier, you may get an ally out of the army. Or it
may kill you.
The key is, if you know the army can kill you, the only way you can survive
is to work with it. Killing the first round just seals your doom.
>Not all of the societies in The Cassini Division accept the True
>Knowledge, of course. I can't tell which of the sides (if any) the
>author is on, which I imagine is just the way he wants it.
Indeed. One of the several things I like about MacLeod is that, unlike
most authors of political SF, he doesn't reserve the good lines only for
the people he agrees with.
--
Patrick Nielsen Hayden : p...@panix.com : http://www.panix.com/~pnh
>Matt Austern <aus...@sgi.com> wrote in
><fxtwvw9...@isolde.engr.sgi.com>:
>
>>Not all of the societies in The Cassini Division accept the True
>>Knowledge, of course. I can't tell which of the sides (if any) the
>>author is on, which I imagine is just the way he wants it.
>
>Indeed. One of the several things I like about MacLeod is that, unlike
>most authors of political SF, he doesn't reserve the good lines only for
>the people he agrees with.
I'm re-reading The Cassini Division right now -- working my way through
the first three books before starting The Sky Road. In light of this
discussion it's striking how many people seem to misread Ellen May. She
has some rather dark motivations which aren't exactly concealed (notably,
where her parents went), and which make her attack on the fast thinkers
extremely reasonable; and she's not a happy-go-lucky character, either,
despite the bouncy tone of the narrative prose. She's a high-ranking
officer in a very paranoid military machine which just happens to be
embedded in a communist state which on the surface resembles a political
anarchy, and she just happens to have the physical age and energy of
a twenty-year-old because of advanced medical techniques. But her
real motivations aren't those of a kid in an adventure novel; they're
those of a general officer fighting a cold war. There's an inherent
contradiction between the surface appearances of the Union and the
political undercurrents within the Division, and she's an unreliable
narrator who isn't telling everything she knows (or thinks) because she
wants to put a good spin on her story.
At least, that's my current reading of it ....
Ow, my head hurts!
-- Charlie
Just so. Which is why I boggle at some of the people who breathlessly
announce their insight that golly, if you think about it, maybe Ellen May
isn't a completely nice person.
Not only that, but ya know, Severian probably isn't telling us everything
he knows, either. Do ya think?
>>At least, that's my current reading of it ....
>
>Just so. Which is why I boggle at some of the people who breathlessly
>announce their insight that golly, if you think about it, maybe Ellen May
>isn't a completely nice person.
Too much brain candy rots the critical faculties.
Or rather, too much contemporary SF seems to deal with transparently
motivated protagonists who don't have any concealed vices or prejudices.
(I'm thinking here of a rather popular space opera series where the
heroine, who is on anti-aging drugs, is conducting a war against a
vicious and frightening enemy, and ... um, has virtually _no_ human
weaknesses, much less confused motivations and self-serving urges. Maybe
this is something to do with SF-as-adolescent-power-wish-fulfillment ...)
-- Charlie
>Or rather, too much contemporary SF seems to deal with transparently
>motivated protagonists who don't have any concealed vices or prejudices.
>(I'm thinking here of a rather popular space opera series where the
>heroine, who is on anti-aging drugs, is conducting a war against a
>vicious and frightening enemy, and ... um, has virtually _no_ human
>weaknesses, much less confused motivations and self-serving urges. Maybe
>this is something to do with SF-as-adolescent-power-wish-fulfillment ...)
What gets me about the charge that SF is "adolescent-power-wish-
fulfillment" is that people say that like it's a _bad_ thing.
Yes, SF is often about things of concern to adolescents, and one of the
things of concern to adolescents is power -- how to get it, how to use
it, what it does to you, how it works in the world. SF provides an
excellent toolbox of ways to think about this.
I suspect that if we were more comfortable with this fact, and didn't
regard it as somehow disreputable, we'd handle it better.
As opposed to the current, apparently politically acceptable, war
on any sign of radicalism or self-expression in youth?
Yeah, point taken. However, I always get a little nervous about
fictions where the main protagonist is totally right, totally
focussed, and powerful. Somehow it reminds me of the kind of
propaganda that certain political movements tended to pump out,
earlier this century ...
-- Charlie
> But her
> real motivations aren't those of a kid in an adventure novel; they're
> those of a general officer fighting a cold war.
Worse than that. She's on the Central Committee, so her ideological
heritage, of which she is aware, is revolutionary politics rather than
military hierarchy: she's a 'dead man on leave', and she knows it. She
knows that ultimate victory, and thus her survival, depend on making
sure that the movement does not succumb to 'line wobble'. Which,
incidentally, is a top phrase.
I've just picked up my copy of _The Sky Road_ from the bookshop. So,
that's this evening planned perfectly.
Chris
_reasonable_? Genocide, *genocide* because some of the founders of their
society, thousands of their years ago, engaged in a raid over a disputed
piece of real estate and killed her parents and enslaved some others?
I'm sorry, but I don't see the reasonable here.
No, I'm not "upset that Ellen is not a completely nice person." She is
perhaps the most evil protagonist by our standards that I've ever seen in
a novel.
I'm not disputing the concept of the evil protagonist. (Indeed, it's
going to get quite a bit more attention with the story of Anakin
Skywalker getting some limited exposure. :-)
I am interested though, in Macleod's literary goal in making her partly right
at the last minute.
It's disturbing. Disturbing isn't implicitly bad, but it's worthy of
discussion.
In addition, I think she's just as evil by the standards of Macleod's
world. Her genocide has almost surely doomed the human race, though
she may have felt it was doomed anyway. The idea that the post-humans
can be stopped forever is ridiculous. They'll be created again.
"We're back. And this time we're *pissed*"
It occurred to me -- how do we know that the slow-humans won? The
signal could have been faked after all.
Aaron
--
Aaron Bergman
<http://www.princeton.edu/~abergman/>
> >@ nospam . antipope . org <charlie> wrote in
> ><slrn7om8kg.s6.SPAMB...@charlie.ed.datacash.com>:
>
> >>discussion it's striking how many people seem to misread Ellen May.
> >>She has some rather dark motivations which aren't exactly concealed
> >>(notably, where her parents went), and which make her attack on the
> >>fast thinkers extremely reasonable; and she's not a happy-go-lucky
> >>character, either,
>
> _reasonable_? Genocide, *genocide* because some of the founders of
> their society, thousands of their years ago, engaged in a raid over a
> disputed piece of real estate and killed her parents and enslaved some
> others?
Because the Jovians had continued, for most of that time, to broadcast and
update the computer viruses that had wrecked human civilization, which
certainly seemed like a hostile move. Because there was no way for the
humans to know whether they were being honest when they said that the
viruses weren't being created deliberately, and therefore no way of
knowing whether to trust the sole evidence they had of the Jovians'
non-hostility when they shut the virus broadcasts off. Especially when
the Jovians started to take over New Martian ships, which made it look
very much like they may have been hoaxing when they shut the broadcasts
off.
> I am interested though, in Macleod's literary goal in making her
> partly right at the last minute.
I seem to recall a discussion in this forum some months back about Greg
Egan's _Diaspora_, in which (if memory serves) Ken MacLeod participated,
and put forward the opinions about AI that Ellen held early in _The
Cassini Division_ -- that AIs couldn't be "truly" conscious.
--
Avram Grumer | Any sufficiently advanced
Home: av...@bigfoot.com | technology is indistinguishable
http://www.bigfoot.com/~avram/ | from an error message.
>What gets me about the charge that SF is "adolescent-power-wish-
>fulfillment" is that people say that like it's a _bad_ thing.
>
>Yes, SF is often about things of concern to adolescents, and one of the
>things of concern to adolescents is power -- how to get it, how to use
>it, what it does to you, how it works in the world. SF provides an
>excellent toolbox of ways to think about this.
You know perfectly well that is not "adolescent-power-wish-fulfillment".
A proper adolescent power fantasy presents the adolescent reader with a
scenario in which the protagonist (with whom he identifies) is not
powerless but powerful, and can kick butt on all his enemies and bad
people generally, but (of course) never abuses his power 'cause he's a
Slan. Doc Smith is APF, for instance.
If a novel deals intelligently with what *really* happens when you get
power and have to deal with it, then it isn't APF. Otherwise _How Like
A God_ would be APF, wouldn't it?
The proposition that sf is all adolescent power fantasies can easily be
rebutted nowadays. It certainly shouldn't be embraced because it isn't
true anymore. Grown-up power nightmares, maybe.
--
Del Cotter | "Top-of-the-range Peugeot 205 Diesel Turbot"
d...@branta.demon.co.uk | --For Sale notice at work
Del Cotter wrote:
> You know perfectly well that is not "adolescent-power-wish-fulfillment".
> A proper adolescent power fantasy presents the adolescent reader with a
> scenario in which the protagonist (with whom he identifies) is not
> powerless but powerful, and can kick butt on all his enemies and bad
> people generally, but (of course) never abuses his power 'cause he's a
> Slan. Doc Smith is APF, for instance.
>
> If a novel deals intelligently with what *really* happens when you get
> power and have to deal with it, then it isn't APF. Otherwise _How Like
> A God_ would be APF, wouldn't it?
Too right. In fact I've spent a good bit of time wedging references to all
the better-known APFs I could think of into the books. (Argh, but I just
realized I left out Modesty Blaise! Must see if I can get it into the
galleys.)
Brenda
--
---------
Brenda W. Clough, author of HOW LIKE A GOD, from Tor Books
http://www.sff.net/people/Brenda/
Interesting point. However, the fast-humans would have probably
created another wormhole, sent it off into the future and killed
Ellen and the New Martians, and possibly freed their fellow
fastfolk there. Or not even a wormhole. Just a berserker would
be sufficient.
He put them forward as his own, or just illustrating that character?
--
__________________________________________________
David Navarro http://www.alcaudon.com
__________________________________________________
That that is is that that that that is not is not.
--
Nancy Lebovitz na...@netaxs.com
Calligraphic button catalogue available by email!
>People want the best views, the best climate, the best neighbours and
>nearby facilities and attractions and at the same time the best privacy.
>
Nano-tech won't supply the best neighbors (see _The Diamond Age_), but
it might do quite a bit to supply the best views and climate. I wonder
how much a planet could be modified toward being generally delightful.
I can't think of much sf on the subject, though Lafferty's "Slow
Tuesday Night" and Zelazny's _Isle of the Dead_ have a little about it.
Probably lots, but politically the movement towards conservation will
probably be too strong. Future tech will probably let you have
viewscreen windows that make it seem like you're at a beach house too,
but people will still want, and pay for, a real one.
The problem with the argument of plenty is that people *want* to be
different from, and better than, their neighbours. If there is
something they can get that others can't afford, they will want it,
either for real or imagined reasons.
That said, while staying on the earth, there certainly isn't going
to be enough beautiful coastal land to give everybody a real waterfront
home.
We won't eliminate scarcity of everything, and if we could, we would
invent other things to be scarce, and create artificial scarcities to
compete over. It seems to be our nature and already happens today.
Why do people pay _extra_ to get DKNY stamped on their clothes?
> Avram Grumer wrote:
> >
> > I seem to recall a discussion in this forum some months back about Greg
> > Egan's _Diaspora_, in which (if memory serves) Ken MacLeod participated,
> > and put forward the opinions about AI that Ellen held early in _The
> > Cassini Division_ -- that AIs couldn't be "truly" conscious.
>
> He put them forward as his own, or just illustrating that character?
He was posting them here. But it was right when he was writing that
book; I don't know if he stands by it all the time. He should be
back from holiday today (though he may not go online and download all
of rasfw the instant he gets in the house, I hear some people don't)
and be able to answer that for himself if he wants to.
>Stoned koala bears drooled eucalyptus spittle in awe
>as <p...@panix.com> declared:
>>What gets me about the charge that SF is "adolescent-power-wish-
>>fulfillment" is that people say that like it's a _bad_ thing.
>
>As opposed to the current, apparently politically acceptable, war
>on any sign of radicalism or self-expression in youth?
>
>Yeah, point taken. However, I always get a little nervous about
>fictions where the main protagonist is totally right, totally
>focussed, and powerful. Somehow it reminds me of the kind of
>propaganda that certain political movements tended to pump out,
>earlier this century ...
I basically agree with both of your paragraphs above, but I'm a little
puzzled as to how they connect to what I said. I'm left thinking that some
kind of connective tissue has been left out -- or that, perhaps, you're
reading me as defending some rather more specific kind of "power wish-
fulfullment" than I think I am.
>In article <8E024A6...@news.panix.com>,
>P Nielsen Hayden <p...@panix.com> wrote:
>>@ nospam . antipope . org <charlie> wrote in
>><slrn7om8kg.s6.SPAMB...@charlie.ed.datacash.com>:
>>>discussion it's striking how many people seem to misread Ellen May. She
>>>has some rather dark motivations which aren't exactly concealed
>>>(notably, where her parents went), and which make her attack on the
>>>fast thinkers extremely reasonable; and she's not a happy-go-lucky
>>>character, either,
I know you're just going to say "count the carets, I did it right," but
nonetheless I feel compelled to note that I didn't write any of the above,
and that I wish that when people quote quotations, they would simply edit
the headers as well.
> On Tue, 13 Jul 1999, in rec.arts.sf.written
> P Nielsen Hayden <p...@panix.com> wrote:
>
> >What gets me about the charge that SF is "adolescent-power-wish-
> >fulfillment" is that people say that like it's a _bad_ thing.
> >
> >Yes, SF is often about things of concern to adolescents, and one of
> >the things of concern to adolescents is power -- how to get it, how
> >to use it, what it does to you, how it works in the world. SF
> >provides an excellent toolbox of ways to think about this.
>
> You know perfectly well that is not
> "adolescent-power-wish-fulfillment."
I would be tempted to say that you know perfectly well that starting your
post "you know perfectly well" amounts to accusing me of speaking in
deliberate bad faith, but perhaps you don't know that perfectly well. If
so, perhaps you should think it over.
That said, sure, maybe you and I agree on what works of SF are merely crap
wish-fulfillment fantasy and what works are in fact thoughtful examinations
of power's uses and effects. However, there's barely an intelligent work
in the field that I haven't heard dismissed by someone with this line, and
moreover it's one of the commonest charges levelled against the field _as a
whole_ by its detractors and apostates. And many of them quite
specifically mean the works you and I would defend.
>I basically agree with both of your paragraphs above, but I'm a little
>puzzled as to how they connect to what I said. I'm left thinking that some
>kind of connective tissue has been left out -- or that, perhaps, you're
>reading me as defending some rather more specific kind of "power wish-
>fulfullment" than I think I am.
Naah: I was being vague and following my internal thought associations
without adequately explaining them. (Yesterday was a vague kind of day
for me -- I really needed a clue transplant, or some more sleep.)
If I can recapture my train of thought without veering off the tracks and
into a wilderness of confusion ...
One complaint about Ken's work that seems to come up a bit is that it is
morally ambiguous. Moral ambiguity is a feature of the real world, but
some people seem to dislike it intensely in their diet of fiction.
I notice that there's a lot of fiction out there that seems to be
excessively unambiguous in its treatment of complex issues -- almost as
if there's some sort of market pressure selecting for this characteristic.
And it isn't simply adolescent wish-fulfillment (which is a red herring in
this context: there's always been a large chunk of it in SF, and I don't
have a problem with it). It's as if people really don't like ambiguity,
or want to read something that reassures them that actually the universe
_does_ contain bedrock black-and-white issues where the good guys wear
the white hats and always win.
I can understand people wanting this; what I'm having difficulty with is
the implicit assumption that a work that doesn't conform to this expectation
is bad. That attitude is not consistent with the voracious appetite for
new ideas that I always associated with SF, and its occurrence here puzzles
me.
-- Charlie
>One complaint about Ken's work that seems to come up a bit is that it is
>morally ambiguous. Moral ambiguity is a feature of the real world, but
>some people seem to dislike it intensely in their diet of fiction.
>
>I notice that there's a lot of fiction out there that seems to be
>excessively unambiguous in its treatment of complex issues -- almost as
>if there's some sort of market pressure selecting for this
>characteristic. And it isn't simply adolescent wish-fulfillment (which
>is a red herring in this context: there's always been a large chunk of
>it in SF, and I don't have a problem with it). It's as if people really
>don't like ambiguity, or want to read something that reassures them that
>actually the universe _does_ contain bedrock black-and-white issues
>where the good guys wear the white hats and always win.
>
>I can understand people wanting this; what I'm having difficulty with is
>the implicit assumption that a work that doesn't conform to this
>expectation is bad. That attitude is not consistent with the voracious
>appetite for new ideas that I always associated with SF, and its
>occurrence here puzzles me.
Gotcha, boss. I couldn't agree more.
>_reasonable_? Genocide, *genocide* because some of the founders of their
>society, thousands of their years ago, engaged in a raid over a disputed
>piece of real estate and killed her parents and enslaved some others?
Yes. Ellen May blames the Jovians for the Green Plague (which only
killed about five or ten billion people). She blames them for murdering
her parents. She blames them for repeatedly attacking the inner system
civilization with computer viruses.
When they pop up and resume contact and say "hi guys, sorry about what
happened 200 years ago, we won't do it again" her response to them is
pretty much what you'd expect of a general in the Israeli army responding
to Adolf Hitler popping out of hiding in Brazil and apologizing while
explaining that he's taking over South America for purely peaceful
reasons.
There are, in fact, at least FOUR instances of genocide in The Star
Fraction, The Stone Canal and The Cassini Division. (Dunno about The Sky
Road; I'll be able to tell you next week :)
>I am interested though, in Macleod's literary goal in making her partly right
>at the last minute.
There's no "at the last minute" about it. She explicitly points out,
halfway through the book, that the problem is that the fast-thinkers are
fast; one year to us, a thousand years to them. If they're allowed to
get away with repudiating the immoral actions of their ancestors, what's
to stop their descendants repudiating their peace overtures? Given the
time difference, this could happen painfully soon by human standards --
and leave the human species at the mercy of a very strong post-human
civilization.
This is not the morality of a comic-book heroine, this is the logic of
a general on the central committee staring military defeat -- and
probable species extinction -- in the face. Ellen is right, by the
standards of her experience, her civilization, her situation, and all
the precedents. This doesn't mean what she does is good; just that it
is necessary for the survival of the human species.
-- Charlie
> .... It's as if people really
>>don't like ambiguity, or want to read something that reassures them that
>>actually the universe _does_ contain bedrock black-and-white issues
>>where the good guys wear the white hats and always win.
>>
>>I can understand people wanting this; what I'm having difficulty with is
>>the implicit assumption that a work that doesn't conform to this
>>expectation is bad. That attitude is not consistent with the voracious
>>appetite for new ideas that I always associated with SF, and its
>>occurrence here puzzles me.
>
>Gotcha, boss. I couldn't agree more.
Some random hypotheses ...
1. Future shock. We live in interesting times, indeed. SF was traditionally
read by people who were neophiliac. However, the implications of rapid
and continual change are now unavoidable, and we are seeing an audience
who don't read SF because of the traditional appetite for new ideas; they
read it because they want the same reassurance that some fantasy readers
seem to want -- namely that their core values will always remain the same,
however strange the world may become. (I'm inclined to stick Star Wars in
this basket, especially because of the Campbellian overtones of Lucas'
work.)
2. Brain candy. Too much saccharine power-tripping rots the mind, especially
if you don't take enough moral fibre in your diet to keep the old critical
faculties moving.
3. Imperial #1-ism. I'm probably treading on dangerous ground here, but I've
noted that there's a strong undertone of "we're #1, so we _must_ be
the good guys!" in American popular culture. This is no surprise, and
there was an identical tone in British popular fiction a century ago; it
seems to go with being a superpower. The point is, nobody likes to see
themselves cast in the role of the bad guys. Powerful nations _need_ to
inculcate a mythology that justifies their power: otherwise their citizens
will feel like shits.
This cultivated sense of confidence in one's own side's rectitude spills
over into a strong dislike for fictional works that contradict it --
works that question the assumption that the character one is empathising
with is actually right. In particular it opens an emotional can of worms
if you extrapolate it to readers who live in a country that, to be fair,
has more in the way of self-doubt than most other pre-eminent imperial
powers had in their day. (And this may be part of Ken's problem: he writes
with a background of living in a post-imperial country.)
4. Constitutional myopia. "We hold these rights to be self-evident" may make
for a grand-sounding constitution, and it is indeed the bedrock of US
law (see above), but it is by no means a universally-understood (or
universally-accepted) principle. In fact, agreement with the idea
that there are absolute rights
seems to be a peculiarly American thing: most other democracies have
rights built into their legal system at the constitutional level, but
they're predicated on a social contract rather than an absolutist theory
of rights. A culture that accepts the idea of absolute rights is one
that is prone to assume that black-or-white distinctions can be made.
5. Cross-infection from fantasy. Most heroic fantasy seems to rely on a
cyclic theory of history; typical plot structures entail some Bad Guy
threatening the natural order (see [4] above), a Good Guy having to
go on a quest, and the Good Guy finally repairing the natural order by
doing something to the Bad Guy. Traditionally, SF relied on the idea
of continual progress -- insofar as our hero is a warrior-scientist
who goes places, makes discoveries, and changes the natural order of
things, he'd slot into the typical fantasy structure just about right
where the Bad Guy belongs(!). If, however, our SF hero is motivated to
defend the natural order of things (see [4] above), we can slot him into
the fantasy Good Guy's shoes (and, incidentally, produce SF that
satisfies the readers in [1] above, who want reassuring about their
core values.)
Anyone want to shoot any of these down, or add some of their own?
-- Charlie
I'm curious where you're getting this. Was this from previous
discussions? I don't see it in this thread.
>>One complaint about Ken's work that seems to come up a bit is that it is
>>morally ambiguous. Moral ambiguity is a feature of the real world, but
>>some people seem to dislike it intensely in their diet of fiction.
>
>I'm curious where you're getting this. Was this from previous
>discussions? I don't see it in this thread.
That's where this thread seems to have started; a review of The Cassini
Division that concluded that its central protagonist was quite simply
evil. Evil is not an appropriate description of her; her actions are
entirely consistent with one viewpoint, and while somewhat horrific,
make sense and can be seen as pure self-defense. The thread of moral
ambiguity runs through all Ken's work; there's nobody who you can point
to and say, "this person is obviously the bad guy".
Hence topic drift.
-- Charlie
But that's not the point. You're misreading a moral condemnation
for a dislike of moral ambiguity. When you say "evil is not an
appropriate description of her" you are making a moral judgment
yourself and others can and do disagree. As far as I can tell,
the moral ambiguity of the main character has not been a
criticism of the book; the criticism has been of the rather ad
hoc nature of Ellen being proved correct in the end.
This thread was getting entirely too supercilious for my tastes.
> Avram Grumer wrote:
> >
> > I seem to recall a discussion in this forum some months back about Greg
> > Egan's _Diaspora_, in which (if memory serves) Ken MacLeod participated,
> > and put forward the opinions about AI that Ellen held early in _The
> > Cassini Division_ -- that AIs couldn't be "truly" conscious.
>
> He put them forward as his own, or just illustrating that character?
As his own, I think.
Poking around on Deja.com, I found the "Egan & Dennett" thread. Here's
message <ywcwHJAs$tX1...@libertaria.demon.co.uk>, by Ken MacLeod
<k...@libertaria.demon.co.uk>, and it looks more like he was talking about
computer emulations of human consciousness than of possible non-human
consciousnesses that might be found in computers:
: In article <bob-ya02408000R...@news.pacifier.com>, Bob
: Hearn <b...@gobe.com> writes
: >In article <M21fWDAL...@libertaria.demon.co.uk>, Ken MacLeod
: ><k...@libertaria.demon.co.uk> wrote:
: >
: >> (I'd be among the hold-outs. I just flat out do not believe that
: >> computers can have subjectivity. If the Ndoli implant ever comes along,
: >> I'll regard those who switch as dead, replaced by meat puppets with
: >> computers in their skulls.)
: >
: >Is it the difference in hardware that bothers you, or the fact that
: >your consciousness would have to be transferred to it, in some sense
: >destroying the *real* you?
: >
:
: >If the former, what is it that makes carbon more conscious than silicon,
: >or any other material?
: >
:
: Its physical properties. I don't rule out in principle that whatever
: physical properties are necessary for an arrangement of matter to be
: conscious could be achieved with other materials.
:
: >If the latter, does it bother you that your component molecules are
: >continually being replaced?
: >
:
: No. The new molecules are just as capable of sustaining consciousness as
: the old ones. Trust me on this :-)
:
: There are two separate questions here. The difference in hardware
: bothers me because I think that only some specific arrangements of
: matter can actually be conscious. Emulation doesn't cut it.
:
: The second question assumes that the idea of consciousness being
: 'transferred' makes sense, and is not just is a mechanistic relic of the
: idea of a separable soul. I'm sure the widespread use of computers is
: making it more intuitively plausible, but it's still wrong. A mind is
: not a file.
:
: As for 'in some sense destroying the *real* you' - yes, for some funny
: reason I think having my skull cleaned out and filled with sponge would
: destroy the real me. When this arrangement of matter dies, I die.
:
: A thought experiment. You meet some godlike aliens who tell you that,
: millions of years after you die, an exact replica of you will be created
: with all your memories up until death. Its weal or woe depends on what
: you do in your life here. You believe them (for whatever reason).
:
: Is it rational to regard that future person as yourself?
:
: Should you modify your behaviour in consequence?
:
: Suppose the godlike aliens can create replicas *right now* of people who
: have just died, and reward or punish the replicas. Do the answers
: change?
:
: Well, mine are no, no, and no.
:
: >It frightens me that when something like the Ndoli device comes along, as I'm
: >sure it will, there will probably be a significant number of people who feel
: >as you do. If it happens in my lifetime, I'm not looking forward to having to
: >face a society that doesn't even believe I exist, refuses to grant me
: >rights, etc.
: >We could wind up with an entirely new kind of religious war.
:
: The Butlerian Jihad, as Frank Herbert called it. I'm fairly sure
: something like the Ndoli device *won't* come along, but if it does, I
: intend to be among the mujahedin.
: --
: Ken MacLeod 'Civilized man takes for granted that order is better than
: chaos and that, due to the natural order of the world,
: certain things are simply impossible. An assault by flesh
: eating ghouls, however, calls into question this assumption.'
:
: - John Marmysz, _The Nihilist's Notebook_
Ouch!
That sounds pretty much like Penrose's cryptotheism. I don't think
there is (or that there *can* be) such thing as "true" consciousness
as opposed to "untrue" consciousness. If something passes the Turing
Test, then it's conscious, period. God knows there are enough carbon
human beings that cannot manage that much.
I mean, the issue bifurcates here. If I get run over by a bus and have
to have my "consciousness" transferred to a machine, there are two
questions: One is whether I'm still *me*, and that's a thorny one, as
it depends exclusively on what you define as oneself. Maybe there
isn't a continuity between my brain and my mind file, and therefore my
new consciousness is not "me", but at least, like Iain Banks put it,
it will be someone who remembers me perfectly. As to the other
question, whether it's "truly" conscious, well, I can only recall 18th
century discussions as to whether Native Americans and blacks (hell,
and women for that matter) were truly conscious either, or even more
modern debates on animal consciousness (I can't get out of my head the
studies to determine whether deer chased down with dogs actually feel
stress... jesus...) The definition of conscious is entirely
subjective. You are conscious if you *feel* conscious.
On the issue of whether silicon does or doesn't have what it takes to
sustain consciousness, either Ken knows a whole lot more than I do
about the chemistry of carbon and silicon, and can tell us exactly why
carbon can and silicon can't, or he subscribes to a Penrose-like view
of the brain, in which it has some magical qualities that make it able
to bear consciousness where any other information-processing machine
cannot.
In the light of this, I don't know what to make of the (back to
SPOILERS) realization by Ellen that machine-assisted consciousness
didn't lead to a loss of her being. I don't know if this means Ken
changed his mind, or if Ellen has been deluded.
Anyway... The reason why I changed the title of the thread is because
I have this feeling that the future, if we get to have one, pretty
much *has* to look like the Culture, with AIs in charge. I see we're
having enough problems as it is with ultra-specialization in the
different fields of technology and engineering, with people making
mistakes that would be obvious if they had some more training in other
fields, or duplicating advances made in other labs, simply because
they were published in a different scientific journal. Year after year
it takes longer to catch up with the state of the art in any
particular field, and I feel that soon we'll have to resort to either
human consciousness downloaded onto hardware, full biological
immortality or hardcore AI (or any combination of all three, as in
Banks' Culture) in order to keep science advancing.
I'm probably disagreeing strongly with Ken on this, but I feel that
humans alone are not going to be good enough for much longer...
--
But where has this sort of complaint come up? I hope you don't think
it's from me -- I've said several times I find it quite interesting.
Debating the morals and portrayal of a character, and criticising them,
is hardly the same as saying they shouldn't be there! So who has made
these complaints?
Hmm. Actually what I didn't like was the seeming removal of moral
ambiguity at the end. The Jovians are shown to be bad guys. A thankful
Earth beams messages of great gratitude and praise over thousands of
light years. All's well that ends well.
This last-minute demonstration of duplicitous Jovians isn't necessary for
her to justify her actions within her True Knowledge. (Indeed, the fact
that the Jovians are themselves not all of one voice is known already.)
The last-minute fix struck me as a "happy ending" of sorts. Maybe this
is the author's intention, to display it happy but have the reader
wonder about things. It just doesn't read that way to me. The omniscient
author, if trying to present an objective world in which his
characters are good, bad and mostly in-between, should show a variety of
consequences. More details on the wiped-out civilizations who were her
victims. Much more ambiguous messages from Earth about how some protest
the horror of what was done, others like it and others are ambivalent.
This is a true morally ambiguous ending. Not, "We wiped them out in
a sneak attack, and now we're all happy and can get on with our lives
again."
But this isn't the same Hitler, or at least we're given no reason to think so.
They incorporate, in a group memory, the memories of the earliest of
their kind.
But in fact this is like a general in the Israeli army, sent into the
future, and finding the great-great-great^20th-grandchildren of Hitler,
noting that some of them are bad, and they do still have copies of
Mein Kampf in their libraries, and wiping them all out with a
pre-emptive strike, and everybody being happy about it.
A vast amount of the evil in the world is done because "your grandfather
hurt my grandfather and I must avenge it." Kosovo is the latest example,
and a highly extreme one.
The Serbs, however, are *not* all thanking Milosevic for fighting the good
fight for them. And their horror will grow as they learn more of the
truth (presuming we're learning the truth from our own reporters.)
Ellen has the classic mindset of the genocider (a new word!) for most of
the book. The enemy are less than human, even if they're smarter, so it's
OK to wipe them out.
Frankly I haven't figured out what to think of the scene where she realizes
that she has been spending some time running in a nanocomputer. It seems
almost dropped in. It appears for no reason, explains nothing, and alters
no behaviour. I may have to go back and look at it in more detail, but I
welcome other views on the meaning of this scene.
>There's no "at the last minute" about it. She explicitly points out,
>halfway through the book, that the problem is that the fast-thinkers are
>fast; one year to us, a thousand years to them. If they're allowed to
>get away with repudiating the immoral actions of their ancestors, what's
>to stop their descendants repudiating their peace overtures? Given the
>time difference, this could happen painfully soon by human standards --
>and leave the human species at the mercy of a very strong post-human
>civilization.
Nothing stops them from such repudiation. As nothing stopped Ellen from
repudiating the promise of the Cassini Division that they had diverted
their comets and wouldn't attack.
>
>This is not the morality of a comic-book heroine, this is the logic of
>a general on the central committee staring military defeat -- and
>probable species extinction -- in the face. Elen is right, by the
>standards of her experience, her civilization, her situation, and all
>the precedents. This doesn't mean what she does is good; just that it
>is necessary for the survival of the human species.
No, she is wrong, from her own viewpoint. She comes from a society -- helped
build the society -- based on cooperation and charity. Maybe you can
argue she really doesn't believe in those things.
Under the True Knowledge, you do use your own survival as the ultimate
good, and do what you can get away with (Sounds a lot like Rand, actually),
but the socialist viewpoint is that you will better serve this by sharing
and convincing others to cooperate with you to do more. If you wipe
out a threat, it's a very extreme measure.
The Union delegate has it right. The fast folk are too strong to trick or
defeat in battle. You can't wipe out the very *idea* of the Outwarder.
Especially when computers make their comeback. She acts against her own
interests, and those of her people, by leaving the legacy that machine
intelligence is best advised to just wipe out pesky bio-humans because their
own philosophy makes them capable of happy genocide.
snip
>
>But in fact this is like a general in the Israeli army, sent into the
>future, and finding the great-great-great^20th-grandchildren of Hitler,
>noting that some of them are bad, and they do still have copies of
>Mein Kampf in their libraries, and wiping them all out with a
>pre-emptive strike, and everybody being happy about it.
>
In the _Stone Canal_, same author, Israel nukes Berlin
in the opening moves of WWIII.
James Nicoll
--
"I don't laugh at you when you're hurt. I laugh at
you when you've been -maimed-."
>>If a novel deals intelligently with what *really* happens when you get
>>power and have to deal with it, then it isn't APF. Otherwise _How Like
>>A God_ would be APF, wouldn't it?
>
>No. _How Like a God_ is an exploration of what might happen when you
>get fantasy powers. It doesn't have much to do with the sort of power
>people sometimes get in the real world.
Perhaps I should have cited L M Bujold's _Memory_ instead.
Hmm. Maybe it's a boy thing, but my fantasies did involve fantasy
powers. If I had X-ray vision I could see in the Girls' changing room;
if I had telepathy, I could just read the teacher's mind for the answer;
if I had super-strength, I'd shove the bully's head *so* far down the
toilet he'd be drawing his next breath at the sewage outfall... :-)
So what do adolescent girl power fantasies look like?
ObSF: I thought C J Cherryh's _Rider At The Gate_ did a pretty good job
of skewering the fantasy of "If I had a big telepathic horse".
Spoilers for The Stone Canal:
> In the _Stone Canal_, same author, Israel nukes Berlin
>in the opening moves of WWIII.
Wasn't it Kiev?
That might depend on how much people think they know about ecosystems.
If there were more places for coral reefs, could that be a good thing?
>viewscreen windows that make it seem like you're at a beach house too,
>but people will still want, and pay for, a real one.
>
>The problem with the argument of plenty is that people *want* to be
>different from, and better than, their neighbours. If there is
>something they can get that others can't afford, they will want it,
>either for real or imagined reasons.
>
>That said, while staying on the earth, there certainly isn't going
>to be enough beautiful coastal land to give everybody a real waterfront
>home.
Build archipelagos? The hard part would be building archipelagos that
aren't extremely vulnerable to storms. On the other hand, if you have
magic nanotech, it'll be a lot easier to rebuild.
>
>We won't eliminate scarcity of everything, and if we could, we would
>invent other things to be scarce, and create artificial scarcities to
>compete over. It seems to be our nature and already happens today.
>Why do people pay _extra_ to get DKNY stamped on their clothes?
>
--
> 1. Future shock. We live in interesting times, indeed. SF was traditionally
>
> 2. Brain candy. Too much saccharine power-tripping rots the mind, especially
>
> 3. Imperial #1-ism. I'm probably treading on dangerous ground here, but I've
>
> 4. Constitutional myopia. "We hold these rights to be self-evident" may make
>
> 5. Cross-infection from fantasy. Most heroic fantasy seems to rely on a
Moral ambiguity almost precludes a well-resolved or happy ending. If
both the protagonist and their antagonist are morally ambiguous either
their conflict may not be resolved or someone who is at least partly
sympathetic loses out in a big way. This can make for an emotionally
unsatisfying, albeit intellectually attractive, reading experience.
Frossie
--
Joint Astronomy Centre, Hawaii http://www.jach.hawaii.edu/~frossie/
Language is the soul's ozone layer and we thin it at our peril --Sven Birkerts
> 4. Constitutional myopia. "We hold these rights to be self-evident"
> may make for a grand-sounding constitution, and it is indeed the
> bedrock of US law
Technical pedantry:
(1) The line is:
We hold these truths to be self-evident, that all men are
created equal, that they are endowed by their Creator with
certain unalienable Rights, that among these are Life,
Liberty, and the pursuit of Happiness.
Note "truths," not "rights." (2) It might very well make for a
grand-sounding constitution, but not the U.S. one. The line's from
the U.S. Declaration of Independence, a truly loverly political
manifesto (with something about secession tacked on) that isn't part
of the Constitution at all. I _think_ it's quoted in full at the
beginning of the United States Code, the federal lawbook, but as far
as I know it has no legal authority of its own.
-- William December Starr <wds...@crl.com>
I used to spend entire class periods in high school using my telekinetic
powers to explode the hundred-some windows of the school buildings I could
see out of the window. One by one, in great detail and slow motion.
> ObSF: I thought C J Cherryh's _Rider At The Gate_ did a pretty good job
> of skewering the fantasy of "If I had a big telepathic horse".
Now if only someone would skewer the fantasy of "If I had a darling fuzzy
telepathic cat." The closest I can think of, though the cats in question
aren't telepathic, is Neil Gaiman's "A Dream of a Thousand Cats" in the
_Dream Country_ collection, in which we see what would happen if cats ruled
the world. Those few panels are priceless.
Rachel
>Now if only someone would skewer the fantasy of "If I had a darling fuzzy
>telepathic cat."
I wouldn't mind having a darling fuzzy telepathic cat, so long as
it wasn't any bigger than the cats I have now. My youngest
cat, Sebastian, went on a rampage this morning, attacking the
other cats, ricocheting off the walls, clawing and biting, and
when I got him out of the kitchen garbage and took the chicken
bones away from him he decided my big toe was edible. At this
point I gave him a raw egg, and then another one, after which he
calmed down and deigned to sample some cat food. Maybe he was
just really, really hungry (he'd had three seizures in the last
two days)? If he could have told me so, things would've been
simpler.
But if he'd been any bigger than he is, I'd be missing body parts
now.
Dorothy J. Heydt
Albany, California
djh...@kithrup.com
http://www.kithrup.com/~djheydt
> SF was traditionally
> read by people who were neophiliac. However, the implications of
> rapid and continual change are now unavoidable, and we are seeing an
> audience who don't read SF because of the traditional appetite for
> new ideas; they read it because they want the same reassurance that
> some fantasy readers seem to want -- namely that their core values
> will always remain the same, however strange the world may become.
John Barnes has argued with some energy that this characterises _most_ SF
since the invention of genre SF; that SF, for most readers, is about the
containment of wonder.
I'm not sure I agree; I'm not even sure he agrees. But he's got hold of
something real there.
Hmmm. How about "The Lion Game", a Telzey Amberdon story by Schmitz.
The heroine's "darling fuzzy telepathic cat" turns out to be a junior
member of a species of rather bloody predators who think human
sports hunters are good sport. Maybe it isn't a proper skewer, though,
because it all comes out right in the end.
Ethan A Merritt
mer...@u.washington.edu
>So what do adolescent girl power fantasies look like?
The two I remember most vividly were:
(a) hatching out a gold dragon in the gymnasium at my high school,
(b) being the leader of a rebellion against sinister mind-control
powers to which I was personally immune.
>ObSF: I thought C J Cherryh's _Rider At The Gate_ did a pretty good job
>of skewering the fantasy of "If I had a big telepathic horse".
It does a number on _Dragonflight_ too. If that was one of your
adolescent fantasies, the scenes involving Brianna really hit
a nerve--at least they did for me.
Mary Kuhner mkku...@eskimo.com
>When they pop up and resume contact and say "hi guys, sorry about what
>happened 200 years ago, we won't do it again" her response to them is
>pretty much what you'd expect of a general in the Israeli army responding
>to Adolf Hitler popping out of hiding in Brazil and apologizing while
>explaining that he's taking over South America for purely peaceful
>reasons.
Um, the situation as described sounds more like a general in the Israeli
army nuking Germany in 2145 because the Chancellor said "sorry about the
Holocaust guys, we won't do it again".
>Stoned koala bears drooled eucalyptus spittle in awe
>as <aber...@princeton.edu> declared:
>
>>>One complaint about Ken's work that seems to come up a bit is that it is
>>>morally ambiguous. Moral ambiguity is a feature of the real world, but
>>>some people seem to dislike it intensely in their diet of fiction.
>>
>>I'm curious where you're getting this. Was this from previous
>>discussions? I don't see it in this thread.
>
>That's where this thread seems to have started; a review of The Cassini
>Division that concluded that its central protagonist was quite simply
>evil. Evil is not an appropriate description of her;
Of course it is. Remember, peoples' ideas of what exactly "evil" is vary a
lot.
>her actions are
>entirely consistent with one viewpoint, and while somewhat horrific,
>make sense and can be seen as pure self-defense.
Even if true, that does not preclude them from being evil. While I haven't
read the book in question, the actions in question seem to make about as
much sense as the US nuking Russia into a patch of glowing sand prior to
MAD. If you define her actions as non-evil, seems like you'd have to
define that sort of thing as non-evil as well.
>The thread of moral
>ambiguity runs through all Ken's work; there's nobody who you can point
>to and say, "this person is obviously the bad guy".
This depends greatly on what your criteria for being a "bad guy" are.
>Hmm. Actually what I didn't like was the seeming removal of moral
>ambiguity at the end. The Jovians are shown to be bad guys. A thankful
>Earth beams messages of great gratitude and praise over thousands of
>light years. All's well that ends well.
Um, that's not the book I read. Remember the point at which Ellen May
realises that _she_ is an upload? Puts a whole different spin on things,
that does. What's really coming out of it is that it's very dangerous
indeed to try and negotiate with a culture as if it's a monolithic
entity rather than a bunch of individuals -- and it's also dangerous
to assume that all members of a heterogeneous culture will behave
altruistically. The whole book is riddled with examples of non-co's,
and indeed that seems to be the whole point.
>The last-minute fix struck me as a "happy ending" of sorts. Maybe this
>is the author's intention, to display it happy but have the reader
>wonder about things. It just doesn't read that way to me.
That's a subjective reading: I read it the other way from you.
-- Charlie
>Even if true, that does not preclude them from being evil. While I haven't
>read the book in question, the actions in question seem to make about as
>much sense as the US nuking Russia into a patch of glowing sand prior to
>MAD. If you define her actions as non-evil, seems like you'd have to
>define that sort of thing as non-evil as well.
Then you probably want to read the book (in the context of it being the
third of a loosely-coupled series). Clue: the Jovians (who she nukes in
the end) are not exactly guilt-free. (And, parenthetically speaking, we're
not in Jordan Bassior's universe here.)
-- Charlie
>Um, the situation as described sounds more like a general in the Israeli
>army nuking Germany in 2145 because the Chancellor said "sorry about the
>Holocaust guys, we won't do it again".
Naah. More like, the Chancellor said "sorry about the Holocaust guys,
oh and by the way, we can't do anything about those brownshirts who
keep kicking in your doors but don't worry, we're sure they don't mean
anything. And, uh, we have no territorial demands this week."
-- Charlie
>Moral ambiguity almost precludes a well-resolved or happy ending.
Only if you (a) require a happy ending, or (b) require a moral
resolution. (Real life ain't like that, but fiction is stylised, so ...)
>If
>both the protagonist and their antagonist are morally ambiguous either
>their conflict may not be resolved or someone who is at least partly
>sympathetic loses out in a big way. This can make for an emotionally
>unsatisfying, albeit intellectually attractive, reading experience.
But if you contemplate most genre SF, the heroes/protagonists get up
to singularly nasty activities -- things that in the real world would
win 'em a long stretch in prison, except that they're the Good Guys(TM),
so that doesn't happen. (Or they're in the army, so it's allowed.) What
this amounts to is sweeping the moral issues under the rug, rather than
resolving them positively.
And I'm intrigued by the way you contrast the emotional and intellectual
satisfactions to be gained from a good novel, as if the two aspects are
opposed, rather than lying in orthogonal directions.
-- Charlie
>Now if only someone would skewer the fantasy of "If I had a darling fuzzy
>telepathic cat."
Give 'em opposable thumbs and the power of speech and that's it for
homo sap., except for those of us chained to the conveyor belts in the
tuna canneries.
Plus, can you imagine what the conversations would be like?
"Hi there! I'm a cat, me. You got any tuna? I'm a cat, pay attention
to me, I'm important! Where are you going? Oh, you're going in there.
I'm going in there too, what's in here? Oh, I'm in here. Me, I'm a
cat, me! Pay attention, now. What am I doing here? There's no fish
here. I'm going there. I'm a cat ..."
-- Charlie
>Plus, can you imagine what the conversations would be like?
>
>"Hi there! I'm a cat, me. You got any tuna? I'm a cat, pay attention
>to me, I'm important! Where are you going? Oh, you're going in there.
>I'm going in there too, what's in here? Oh, I'm in here. Me, I'm a
>cat, me! Pay attention, now. What am I doing here? There's no fish
>here. I'm going there. I'm a cat ..."
You stole this gag from Linda Krakwecke (nee Pickersgill) and I claim my
five pounds.
>You stole this gag from Linda Krakwecke (nee Pickersgill) and I claim my
>five pounds.
I stole it okay, but not from Linda. Guess I'd better take this joke back
to the shop and ask for my money back ...
-- Charlie
: The two I remember most vividly were:
: (a) hatching out a gold dragon in the gymnasium at my high school,
: (b) being the leader of a rebellion against sinister mind-control
: powers to which I was personally immune.
I was your second-in-command during that rebellion (my power fantasies
were only moderately megalomaniacal), figuring out the best way to use the
school's ductwork to get around without being seen. I was also frequently
summoned into alternate worlds to fulfill prophecies and save doomed
kingdoms by figuring out some obscure puzzle, or simply by being Not From
Around Here. I still have some of the maps and imaginary language
dictionaries and bits of dialogue from the ones I tried to turn into
stories tucked away in a box to amuse my executors.
I also engaged in some cross-gender fantasy play, plotting out adventure
stories in my head in which the protagonist was male. They tended to
follow the same pattern as above, though.
Peace,
Liz
--
Elizabeth Broadwell | "[P]ointing a finger at me she said,
(ebro...@dept.english.upenn.edu) | 'Thou still unravish'd bride of quiet-
Department of English | ness, hast thou read _The Cenci_ yet?'
at the University of Pennsylvania | I hadn't. And I still haven't."
-- A. Guinness, _Blessings in Disguise_
> Hmm. Maybe it's a boy thing, but my fantasies did involve fantasy
> powers. If I had X-ray vision I could see in the Girls' changing room;
> if I had telepathy, I could just read the teacher's mind for the answer;
> if I had super-strength, I'd shove the bully's head *so* far down the
> toilet he'd be drawing his next breath at the sewage outfall... :-)
>
> So what do adolescent girl power fantasies look like?
I had a lot based on :The Lathe of Heaven:, thus proving that _any_
book will do for adolescent power fantasies if you have a sufficiently
odd adolescent.
I don't think I had any that were comic book powers at all. The
sort of telepathy I wanted came from :The Chrysalids:/:Rebirth:.
And that was much more "If I had telepathy I'd be secret and special
and I'd have people who understood me." Others of that period were:
if I had a machine gun I could mow down the whole school while queued
up for lunch; if I had a dragon I could fly _away_... then there was
the end of :The Silver Chair:. And "If I had a big sword and armour I
could dress up as a boy and find someone just slightly better at
everything than me who would like me". Have I ever mentioned how
much I like being grown up?
> ObSF: I thought C J Cherryh's _Rider At The Gate_ did a pretty good job
> of skewering the fantasy of "If I had a big telepathic horse".
Not to mention :Hawkmistress:.
--
Jo - - I kissed a kif at Kefk - - J...@bluejo.demon.co.uk
http://www.bluejo.demon.co.uk - Interstichia; Poetry; RASFW FAQ; etc.
Watch _Red Dwarf_. The Cat, who is a creature evolved from the ship's cat,
is grooving down the corridor, spraying things with an aerosol of scent:
"This is mine; that's mine, that's mine, and that's mine; I'm claiming all
this as mine ... except that bit. I don't want that bit. But all the
rest of this is mine! Hey, this has been a good day! I've eaten five
times, I've slept six times, and I've made a lot of things mine! Tomorrow
I'm going to see if I can't have sex with something!"
--
+- David Given ---------------McQ-+ "Those who do not understand Unix are
| Work: d...@tao-group.com | forced to reinvent it, poorly." --- Henry
| Play: dgi...@iname.com | Spencer
+- http://wired.st-and.ac.uk/~dg -+
It might also be worth analysing what constitutes a happy ending in
most sf.
: >Now if only someone would skewer the fantasy of "If I had a darling fuzzy
: >telepathic cat."
: Give 'em opposable thumbs
A guy I used to work with claimed his parents' cat had an opposable
thumb. I'm not sure if he didn't understand what that really meant
or if he really had a weird cat.
: and the power of speech and that's it for
: homo sap., except for those of us chained to the conveyor belts in the
: tuna canneries.
Oh no, that's a guerrilla rebellion I'd pay money to join.
"Waitaminnit. You mean the rest of humanity now wants to shoot cats too?!"
Pete
>Charlie Stross (cha...@antipope.org) wrote:
>: Stoned koala bears drooled eucalyptus spittle in awe
>: as <r.ph...@worldnet.att.net> declared:
>
>: >Now if only someone would skewer the fantasy of "If I had a darling fuzzy
>: >telepathic cat."
>
>: Give 'em opposable thumbs
>
>A guy I used to work with claimed his parents' cat had an opposable
>thumb. I'm not sure if he didn't understand what that really meant
>or if he really had a weird cat.
When I was a kid we had a polydactyl cat with opposable thumbs. He
could pick up a pencil with one paw. It was seriously scary; we got
him neutered, of course. Letting him breed would've been dangerous.
--
The Misenchanted Page: http://www.sff.net/people/LWE/ Last update 4/24/99