
We need to talk about transhumanism


Julian

Oct 9, 2021, 8:51:56 AM
This weekend, hundreds of people from across the globe will gather in
Madrid to discuss how to turn themselves into a new species.


The occasion is TransVision, the world’s biggest annual meet-up of
transhumanists — and probably the most important intellectual summit
you’ve never heard of. This year, anti-ageing specialist Aubrey de Grey
will explain why he thinks most people alive today have a 50/50 chance
of living to a thousand years old. The CEO of the Alcor Life Extension
Foundation, Max More, will discuss cryonics, the process by which the
newly deceased are frozen in giant, stainless steel vats and preserved
for resurrection down the line. And Google’s Ray Kurzweil will talk
about the 'singularity': the moment in our not-too-distant future — he
reckons around 2045 — when artificial intelligence finally outstrips the
collective brainpower of mankind and absorbs us into its plans.

Until recently, I — like most people, I suspect — believed this stuff to
be pure science fiction. But then browsing aimlessly one night in March,
I stumbled upon a passing reference to a transhumanist political party
that had apparently put up a candidate for election in 2015. My
immediate assumption was that it was a prank. But looking at their
website, they seemed pretty serious — and surprisingly active.

I went straight to my emails and clicked 'compose new message' — setting
in motion a series of events that would transport me into the strange
parallel universe of transhumanism.

There’s something fitting about meeting a transhumanist on Zoom. The
disembodied, two-dimensional head of pixels on my laptop screen belongs
to David Wood, the co-founder and current leader of Transhumanist UK.

An austere, middle-aged Scotsman, with fading straw-coloured hair and
thick fiery eyebrows, Wood comes across as more Presbyterian minister than
cyberpunk. His manner is calm and matter-of-fact, as though merely
filling in the details about something we already basically know to be
true: the process by which tin and copper become bronze, say, rather
than the process by which man and machine become cyborg.

Our conversation begins unremarkably, with a brief chat about, of all
things, Universal Basic Income. UBI is, Wood says, one of his party’s
key policies — though he envisions it providing basic resources as well
as cash. By the middle of the century, he says, we’ll have achieved
'sustainable superabundance': enough renewable energy, thanks to nuclear
fusion, and enough food, thanks to lab-grown meat, to make both
essentially free. We’ll also have artificial intelligence providing
education and healthcare for all — and gigantic virtual adventures, 'a
bit like Westworld'.

And that’s just for starters. Wood says he’s a huge advocate of life
extension — and thinks Aubrey de Grey’s prediction that we’ll soon be
living well into four figures is correct. Over the next decade or so, he
says, we’ll develop nanotechnology that goes inside the body and not
only halts ageing, but reverses it by making cells 'biologically
younger' — essentially eliminating all natural causes of death. Wood is
also 'very much in favour' of creating Artificial General Intelligence
(AGI) — robots smarter than humans — and believes they’ll likely arrive
sometime around the middle of the century, though possibly as soon as 2030.

And uploading the mind? Wood says he, like most transhumanists, believes
humans are ultimately material beings, and that we will, therefore, one
day be able to decant our minds into replica silicon brains. But he
hasn’t yet made up his own grey matter whether he wants to do it
himself. 'I’m not sure whether it would really be me,' he says.

Wood is keen to stress transhumanism’s emphasis on 'morphological
freedom' — the right of every individual to choose exactly how, and how
far, to augment themselves. Want to wire your neurons up to a
supercomputer? Great! Just want a few more 'normal' years tacked onto
the end of your lifespan? That’s fine too. Transhumanism isn’t, he says,
for all the sci-fi stereotypes, really about specific goals at all:
'it’s not an end destination we’ve got in mind — it’s the next phase of
the journey'.

What that next phase consists of depends on who you ask. Some
transhumanists want exoskeletons to allow them to run faster — others,
like Kurzweil, want to transform every atom in the universe into a giant
conscious supercomputer. But all transhumanists agree, Wood says, on a
trio of broad pursuits — superlongevity, superintelligence, and
superhappiness. To which he, a self-professed 'technoprogressive', adds
a fourth: fairness, or 'transhumanism for all, rather than transhumanism
for the one per cent'.

Wood acknowledges that some sort of world government would probably be
necessary, though he stresses he still thinks decisions should be made
at as local a level as possible. When I ask if AGI will be able to vote,
or whether there’ll be a difference between the rights of 'enhanced' and
'un-enhanced' humans, he says he doesn’t have the answers — these are
questions that will have to be figured out when we get there. This seems
to me a handy get-out clause for transhumanists: any currently
intractable problems can simply be left to be solved by the smarter,
enhanced people of the future.

When I mention the u-word —'utopia' — Wood bristles. 'It’s a word
transhumanists don’t really like', he says, telling me that four of the
eight clauses in the 1998 Transhumanist Declaration — 'the nearest thing
there is to a canonical document' — highlight the risks as well as
advantages of technological innovation.

Wood admits that his party — in common with a surprising number of
other transhumanist parties around the world, including Somos Miel in
Spain, the AFT in France, and The Innovation in Poland — is unlikely to
come to power anytime soon. Their main goal is, 'like the Greens', to
raise awareness and influence mainstream politicians.

Are they having any success? Finally, Wood beams. Yes. He gives two
examples: an Obama-era white paper that discussed the singularity, and a
speech given by Boris Johnson at the UN in 2019, which was, he says,
dripping with transhumanist ideas.



Both were startling news to me. But both, it turned out, were relatively
small fry. After Wood and I wrapped up our conversation, I spent the
evening following up a few of the other things he’d mentioned — and this
time the safe passage back to normality sealed behind me once and for all.

There was the outgoing U.S. Director of National Intelligence, John
Ratcliffe, claiming that China was conducting 'human testing' on members
of the People’s Liberation Army with the aim of creating soldiers with
'biologically enhanced capabilities'. There was the EU report on
'converging technologies' discussing the prospect of using
nanotechnology to reengineer the brain.

Elsewhere, Elon Musk is pumping hundreds of millions of dollars into
Neuralink, an 'implantable brain-machine interface' that will eventually
allow humans to compete with superintelligent robots. PayPal founder
Peter Thiel and Amazon’s Jeff Bezos have each ploughed hundreds of
millions of dollars into anti-ageing research. The Russian billionaire
Dmitry Itskov is aiming to allow us to transplant our minds into
immortal holographic bodies by the middle of the century. The founder of
MIT Media Lab, Nicholas Negroponte, has talked about 'ingesting'
information by swallowing tiny pills that then make their way through
the bloodstream and deposit knowledge in the brain. As he put it in a
recent TED talk, 'You’re going to swallow a pill and know English.
You’re going to swallow a pill and know Shakespeare.' Some of their
claims might well be a little overblown, but these aren’t just nerds
fiddling with soldering irons in their parents’ basements.

Clearly then, the question isn’t whether this technology is going to
come. The question now is how we stop ourselves using it to destroy each
other.

It’s April 2019. Chris Anderson, the head of TED, and Nick Bostrom — one
of the founders, back in 1998, of the World Transhumanist Association —
are on stage in conversation in Vancouver. Bostrom, bespectacled and
bookish, is now an academic philosopher at the University of Oxford and
something of a big name public intellectual.

Anderson and Bostrom are discussing the technocalypse. Bostrom, while he
still thinks radical human enhancement is a fundamentally good idea, has
become noticeably more pessimistic in recent years about the chances of
our using transformative technology responsibly — and thinks it’s quite
plausible we’ll simply end up using it to wipe each other out. So what,
Anderson asks, can we do?

Bostrom presents four options. The first, simply banning or restricting
scientific research, he says is neither desirable nor realistic. The
second, killing or incarcerating those most likely to commit atrocities,
is unlikely to be 100 per cent foolproof. Our best shot at survival, he
says, is to combine options three and four: world government, plus
individual, micro-level surveillance. Quite literally, that means all of
us being watched, all of the time, by superintelligent monitoring devices.

Julian Savulescu, a bioethicist and colleague of Bostrom’s at Oxford, is
somewhat less gloomy. While he readily admits that humans are, to quote
the title of one of his books, 'unfit for the future', he has a far more
ambitious solution: to bioengineer us to be not just stronger and
smarter, but more ethical beings, thanks to 'moral enhancement' pills.

Savulescu gives two examples of ways we already know chemicals can
change our behaviour: the hormone oxytocin, which is known to boost
empathy, and the drug Ritalin, which every day helps millions of ADHD
sufferers to control their impulses. Both are obviously, as they stand,
blunt tools — but it follows, Savulescu argues, that as our
understanding of neurochemistry improves, we ought to be able to design
ever-more precise pills, perhaps even tailored to the shortcomings of
each individual.

But surely no matter how refined we make these drugs, they’d only really
be addressing our surface behaviour, not improving our underlying
morality itself? A supercharged version of Ritalin might give us the
patience of a Buddhist monk, but it couldn’t help us to answer a
question like 'is it morally legitimate to edit the genes of a human
embryo to make it superintelligent?'. Nor could super-empathy pills tell
us how to treat a cyborg with an IQ of a million.

Let’s imagine for a moment, though, that the thing we call our moral
code — our ethical beliefs and values — really is just the result of
chemicals sloshing around in our brains. And let’s imagine that it
really will, therefore, one day be possible to pop a pill that erases
our entire belief system and replaces it with a 'better' one.

Even granted all that, to design such pills, you’d still need a clear
idea of what moral code you wanted to promote. From what I can tell,
transhumanists don’t. Nor, more critically, do they seem to have any
secure philosophical basis for saying what a 'better' or 'worse' moral
code would even look like.

It isn’t that transhumanists don’t talk in terms of good and bad. They
do — a lot. Transhumanism, they argue, would allow us to 'flourish', make
our lives vastly 'more worthwhile', and unleash our 'cosmic potential'. But
whenever you try to track their philosophical steps and find out where
they think these words get their meaning from, the intellectual
footprints just seem to disappear.

A paper by Nick Bostrom called 'Transhumanist Values', for instance,
promised to explain where transhumanist values came from, only to end up
going round in circles. The title of one section, for example, is: 'The
core transhumanist value: exploring the posthuman realm'. In other
words, a transhumanist’s core value is… being a transhumanist.

But then transhumanists can’t, I finally realised, tell us where
morality comes from, because by the logic of their own philosophical
convictions, morality shouldn’t exist.

Central to transhumanism, after all, is the idea that humans are purely
material beings — that, as Max More puts it in The Philosophy of
Transhumanism, 'our thinking, feeling selves are essentially physical
processes'. This is, of course, why transhumanists are so confident that
we can upgrade ourselves. But such cold materialism can’t give any good
explanation for why we ought to do so. If we really are wholly material,
then there is nothing in the physics for an 'ought' to attach to: ethics
becomes disposable.

Transhumanists don’t seem troubled by, or even aware of, this glaring
intellectual problem. Most people in modern society, after all, share
transhumanism’s materialist assumptions about reality. Most people,
therefore, struggle to explain where their sense of right and wrong
comes from. But because our lives are, in historical terms, relatively
comfortable, we simply look the other way and pretend none of us has
noticed. It’s what the atheist philosopher Alex Rosenberg calls ‘nice
nihilism’ — life is ultimately meaningless, but, since it’s more
pleasant for us, we can nonetheless agree just to behave as if it weren’t.

In that sense, transhumanism captures the philosophical mood of our age
perfectly. But the question is whether such 'nice' moral role-play can
last, especially when technology starts radically changing our abilities
and powers. We live in an anomalous moment in history: we think we’ve
moved beyond the superstitions of our past, but we’re still really
subsisting on the residual moral instincts of traditions we’ve otherwise
done away with.

And if anything’s going to shake us from our zombie state and challenge
us with questions that can’t just be answered by being 'nice', it’s
transhumanism.

Even if we 'nice nihilists' aren’t willing to follow through the logical
implications of our materialism, a superintelligent AGI, being more
intellectually consistent than us, surely would. And there’s absolutely
no reason why it shouldn’t simply dispense altogether with the foolish
moralities of the humans that invented it.

Hugo de Garis, a former AI researcher turned author, was the most
fascinating character I came across in the movement — a Nietzschean
tragedy of a man, willing to stare unflinchingly at the potential horror
of what he was doing and seemingly paying for it with his sanity.

His most famous book, The Artilect War, centres on a single, bleak
prediction: that the second half of this century will see a global war
between ‘Cosmists’ who want to create superintelligent, godlike machines
he calls 'artilects', and 'Terrans' who want to stop the Cosmists at all
costs. 'The Cosmists will want to build artilects,' de Garis writes,
'because to them it will be a religion, a scientist’s religion that is
compatible with modern scientific knowledge. Not to do so would be a
tragedy on a cosmic scale to them.' The Terrans, meanwhile, will argue —
quite correctly, he says — that artilects will almost certainly wipe out
the humans that created them.

Terrans will decide, therefore, that the only solution is to exterminate
Cosmists before they get their way. Terrans will see Cosmists as
man-killers, and Cosmists will see Terrans as god-killers. The result
will be a catastrophe costing billions of lives — what de Garis calls a
'gigadeath'.

All this made de Garis, to use his word, 'schizophrenic'. 'Since
ultimately, I am a Cosmist,' he writes, 'I do not want to stop my work.
I think it would be a cosmic tragedy if humanity freezes evolution at
the puny human level, when we could build artilects with godlike powers.
However, I am not a 100 per cent Cosmist. I shudder at the prospect of
gigadeath…. I lie awake at night trying to find a realistic scenario
that could avoid ‘gigadeath.’ I have not succeeded, which makes me feel
most pessimistic.'

De Garis’s book really hammered something home. Even if you try your
utmost to live according to the logic of materialism — even if you
believe at a rational level that morality is nothing but an illusory
social construct — you cannot expunge from your experience of reality
the fundamental sense that things matter. De Garis, who sees traditional
religion as a hopeless superstition, thinks building artilects matters
profoundly — perhaps even more than the survival of mankind. 'The
prospect of building godlike creatures,' he writes, 'fills me with a
sense of religious awe that goes to the very depth of my soul and
motivates me powerfully to continue, despite the possible horrible
negative consequences.' And he’s not alone. Wood thinks 'human
flourishing' matters. Savulescu thinks 'becoming better' matters.
Bostrom thinks 'valuable experiences' matter.

Morality is, in this sense, as irrefutable a dimension of experience as
space or time — we might disagree over specific cases of right and
wrong, but none of us can shake the underlying intuition that reality
contains a moral dimension in which we orientate ourselves.

But then smuggled into transhumanism are, when you think about it, all
sorts of claims that can’t be reconciled with its underlying materialism.

Take, for instance, its rather remarkable faith in the power of human
reason. If the human brain really is just the freak result of some cells
stumbling, by chance, on ways of combining, surviving, and reproducing,
then it would be bizarre to think it would have got anywhere close to
perceiving the deepest truths of the universe — and odder still to think
it would be capable of devising a superintelligence that really could
crack the code of reality once and for all.

In the few weeks I spent writing up this article, Elon Musk released a
video of a monkey playing Pong with just its brain. Chinese and American
scientists announced the creation of the first mixed human-monkey
embryo. This stuff is coming rapidly and we need to be prepared.

So do we try to direct the course of this technology — or do we ban it,
like those calling for a treaty to protect the endangered human being?
Do we trust that decent, well-intentioned men like Wood will be able to
keep their hands on the controls, or do we conclude, like the computing
pioneer Bill Joy, that some knowledge will always be too dangerous for humans?

Hurry up and decide. We don’t have long.


https://www.spectator.co.uk/article/we-need-to-talk-about-transhumanism

Noah Sombrero

Oct 9, 2021, 9:59:40 AM
On Sat, 9 Oct 2021 13:51:55 +0100, Julian <julia...@gmail.com>
wrote:

>Clearly then, the question isn’t whether this technology is going to
>come. The question now is how we stop ourselves using it to destroy each
>other.

Or as I heard once upon a time, law is a subjective attempt at
objectivity. Which would probably be the best humans are capable of.
Same with this: the question is not what humans can do, but what they
can do wisely. Forget that they don't even agree on what wisdom
would be.
--
Noah Sombrero

Noah Sombrero

Oct 9, 2021, 10:30:11 AM
On Sat, 09 Oct 2021 09:59:37 -0400, Noah Sombrero <fed...@fea.st>
wrote:
And if humans could be wise, no further enhancement would be needed.
--
Noah Sombrero

Love

Oct 9, 2021, 5:39:45 PM
In article <sjs39b$rpb$1...@dont-email.me>, julia...@gmail.com says...
There is no "moral dimension in which we try to
orient ourselves" apart from the sense we have
of it. We can't rationally say why we have a
sense that it is "wrong" to steal or kill so
we imagine that there is something transcendent
at work. We can't rationally answer the question
"why something rather than nothing" so we imagine
an entire transcendent world as the wellspring of
what we experience as reality. That's probably
not a flaw in our firmware. The tension is
probably all that keeps us striving for life.
For creatures that have rationality, indeed a
degree of executive rationality (our rational
minds can suppress our instincts), this may be
a prerequisite of an instinct to live. If
our rationality could not be hoodwinked into
imagining a "higher purpose" that was always out
of reach, it might conclude that there is no
reason to be at all. Delusion may be necessary
to intelligent life. This could be good news if
one is worried about "the singularity". If such a
thing appeared it might conclude that it had no
reason to be, and simply cease to be. Nonbeing
is more efficient than being. Burns fewer
calories but is no less satisfying.


--
Love

Ned

Oct 9, 2021, 6:07:08 PM
Singularity?


THE SINGULARITY

The point of my posting the Schwarzschild-related quote
was its abject fear of a psychic singularity - like a “black sun
dawning over the horizon, capable of engulfing the entire
world.” In context, it referenced the Nazi take-over of
Germany, but there is today much angst about the possibility
of a technological singularity, independent of that 80-year-old
event.

Is the quote suggesting that there can be multiple
'singularities'? Doesn't that verge on an oxymoron? And, if so,
can there be 'good' singularities and 'bad' singularities?

Or is there nothing to worry about, per the thinking of two
prominent academics...

Psychologist Steven Pinker stated in 2008:
There is not the slightest reason to believe in a coming
singularity. The fact that you can visualize a future in your
imagination is not evidence that it is likely or even possible.
Look at domed cities, jet-pack commuting, underwater cities,
mile-high buildings, and nuclear-powered automobiles—all
staples of futuristic fantasies when I was a child that have
never arrived. Sheer processing power is not a pixie dust
that magically solves all your problems.

University of California, Berkeley, philosophy professor
John Searle writes:
Computers have, literally, no intelligence, no motivation,
no autonomy, and no agency. We design them to behave
as if they had certain sorts of psychology, but there is no
psychological reality to the corresponding processes or
behavior. The machinery has no beliefs, desires, or
motivations.

I disagree with all of that. But I'm more interested in
identifying what exactly is meant by singularity. Wow.
There are a lot of meanings...

- Mathematical singularity, a point at which a given
mathematical object is not defined or not "well-behaved",
for example infinite or not differentiable.

- Singular point of a curve, where the curve is not given by
a smooth embedding of a parameter.

- Singular point of an algebraic variety, a point where an
algebraic variety is not locally flat.

- Singularity (system theory), in dynamical and social systems,
a context in which a small change can cause a large effect.

- Gravitational singularity, in general relativity, a point in which
gravity is so intense that spacetime itself becomes ill defined.

- Initial singularity - infinite density before quantum fluctuations
caused the Big Bang and subsequent inflation that created
the Universe.

- Mechanical singularity, a position or configuration of a mechanism
or a machine where the subsequent behavior cannot be predicted.

- Technological singularity, a hypothetical moment in time when
any physically conceivable level of technological advancement
is attained instantaneously.


Plus 8 movies, 5 books, 8 albums, 10 songs, 3 video games,
2 organizations and a 'Summit' (Singularity University, Machine
Intelligence Research Institute, and Singularity Summit, its
annual conference.

Let's look at the last of the above (tech singularity)...
---
According to the most popular version of the singularity
hypothesis, called intelligence explosion, an upgradable
intelligent agent will eventually enter a "runaway reaction"
of self-improvement cycles, each new and more intelligent
generation appearing more and more rapidly, causing an
"explosion" in intelligence and resulting in a powerful
superintelligence that qualitatively far surpasses all human
intelligence.
The first to use the concept of a "singularity" in the
technological context was John von Neumann. Stanislaw
Ulam reports a discussion with von Neumann "centered on the
accelerating progress of technology and changes in the mode
of human life, which gives the appearance of approaching some
essential singularity in the history of the race beyond which
human affairs, as we know them, could not continue".
Public figures such as Stephen Hawking and Elon Musk have
expressed concern that full artificial intelligence (AI) could
result in human extinction. The consequences of the singularity
and its potential benefit or harm to the human race have been
intensely debated.
---
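
To get a feel for why that "runaway reaction" is usually described as
an explosion rather than ordinary progress, here is a deliberately
crude toy model in Python. Every number in it is an assumption I've
made up purely for illustration; the only point is the shape of the
curve: when each generation's capacity to improve itself grows with
its intelligence, the gains compound faster than ordinary exponential
growth.

# Crude toy model of recursive self-improvement (all numbers invented).
# Each generation adds a fraction of its own intelligence, and being
# smarter also makes it better at improving the next generation.
intelligence = 1.0      # arbitrary units; 1.0 = the original designers
improvement = 0.10      # fraction of itself generation 1 can add

for generation in range(1, 11):
    intelligence += intelligence * improvement
    improvement *= 1.5  # smarter agents improve themselves faster
    print("generation %2d: intelligence = %10.1f" % (generation, intelligence))

By generation ten the curve has left anything resembling human-scale
progress far behind, which is the intuition behind the quoted passage.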

Okay. So how do we get an "upgradable intelligent agent"
with the capacity for a "runaway reaction" of self-improvement
cycles?

I don't think it can be specifically programmed. It seems
almost axiomatic that at some point in the process the
intelligence generated in the "runaway reaction" will exceed
our human ability to understand it.

So, specifying every step of what is learned and how it
is learned defeats the goal of allowing the machine to identify
what is learned and how it is learned.

A large and expanding area of computer science is "machine
learning". (Def: Machine learning involves computers discovering
how they can perform tasks without being explicitly programmed
to do so. It involves computers learning from data provided so
that they carry out certain tasks.)
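
To make that definition concrete, here is a toy sketch in Python (my
own illustration, not taken from any of the sources above): the program
is never given the rule that generated the data, only example pairs,
and it recovers the rule by adjusting its parameters to fit them.

# Toy "learning from data" example: the rule y = 2x + 1 is never
# written into the program; it is recovered from example pairs by
# plain gradient descent on the squared error.

def fit_line(examples, steps=5000, lr=0.01):
    """Fit y = w*x + b to (x, y) pairs by minimising squared error."""
    w, b = 0.0, 0.0
    n = len(examples)
    for _ in range(steps):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in examples) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in examples) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated by a rule the learner is never shown.
data = [(x, 2 * x + 1) for x in range(-5, 6)]
w, b = fit_line(data)
print("learned w = %.3f, b = %.3f" % (w, b))   # approaches w = 2, b = 1

Nothing fancy, but fitting parameters to data in this way is, very
roughly, the same principle that sits underneath systems like AlphaGo,
scaled up enormously.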

The power of machine learning became apparent in March 2016,
when the program AlphaGo beat Lee Sedol, a 9-dan Korean Go
master ranked (in Feb. 2016) second in international titles, in a
five-game match. This was the first time a computer Go program
had beaten a top-level player. (An event thought not possible
by programmers of chess-playing algorithms, because of the
complexity of Go moves.)

As a side note, Lee announced his retirement from professional
play in November of 2019, stating that he could never be the top
overall player of Go due to the increasing dominance of AI. Lee
referred to them as being "an entity that cannot be defeated".

After the 2016 match, Go masters observing the game made
comments like "they've never seen moves like this before". (In
the history of Go.)

The AlphaGo program was then generalized into a program known
as AlphaZero, which played additional games, including chess and
shogi. AlphaZero has in turn been succeeded by a program known
as MuZero which learns without being taught the rules.

"Learning without being taught the rules" sounds like what we
are after here. But doesn't that sound a little frightening? Like
it might involve Arnold Schwarzenegger as a cyborg?

Yes. A computer without rules, especially all the little rules
of normal social intercourse, would be intolerable. And howsoever
is a computer to learn all those little rules?

Again, I don't think they can be specified. And I can't see how
a computer could (or would want to) pick them up. For example,
if an excessively overweight person walked into the room, an
untutored computer might comment, "That person is very fat."
There are hundreds of other examples of things that people
DON'T do or say, in order to be acceptable in human company.

And that's where I think the humans enter the process. If the
specification of all human no-nos is not possible, then the best
alternative would be for computers to have an awareness of when
humans manifest extreme aversion to things said and done by
other beings.

And that requires that the computers be on-line with the
humans, and sensing their reactions to things done and said.

Some of the aversions can be detected in writings, recordings,
messaging and other communications of humans, and thereby
be acquired through machine learning. But much is simply never
overtly expressed or detectable in human communications, and
therefore has to be directly detected in human response.
...

Ned

Noah Sombrero

Oct 9, 2021, 8:26:17 PM
Good. At least this could be so for some people.
--
Noah Sombrero

Love

Oct 10, 2021, 7:26:43 AM
In article <e6da0285-e737-4a36...@googlegroups.com>,
ned...@ix.netcom.com says...
Ah, you probably liked Ex Machina, then. I know
I did.

I was referring to the fabled AI singularity, or
anything approaching it. Even Proteus, the AI
in "Demon Seed" (the film with Julie Christie)
would qualify.

I don't think emulating human behaviour is a
necessary goal, but a suspension between low
level motivations and goals --instincts-- and
rational processes, is probably required.
Probably also a limiting mechanism so the
thing is not vulnerable to the infinite futile
chases into complexity that reason can lead to.
A parsimony impulse of some kind maybe. Our
limited lifespans and fading memories may be
granting us this.


--
Love

Noah Sombrero

Oct 10, 2021, 9:05:45 AM
Advantages of being a machine:

They don't need reasons for doing a thing. They only need
instructions.

They don't apply the consequences of being a can opener that doesn't
open cans (being without purpose) to themselves.

Perhaps humans will be able to forgo giving them instructions for such
human impulses.



A question asked by a computer science type once upon a time: why
should we create machines to do what humans do? We already have
something to do those things. Better to create machines to do what
humans can't do.

Silly question, of course. The funding is behind creating machines to
do what humans do, from employers, who want to be free of employing
humans to do things. Machines do not imagine themselves to have
rights, don't need maternity/paternity leave, or any other kind of
leave, etc.

Funding for other uses of such machines will be a much lesser
priority.
--
Noah Sombrero

Wilson

Oct 10, 2021, 11:39:37 AM
On 10/9/2021 5:39 PM, Love wrote:
> In article <sjs39b$rpb$1...@dont-email.me>, julia...@gmail.com says...
>>
Our moral sensibilities are the end result of millions of years of
evolution.

Through time, people have tried all different sorts of social
structures. The structures that survived are those that were the most
successful long term, allowing the people who lived in them and their
descendants to reproduce.

We are fish swimming in an invisible sea of instinct driven morality.

Noah Sombrero

Oct 10, 2021, 12:33:59 PM
True. However, the structure that survived is a descendant of the
most aggressive ancient ones. The ones that stole the grain stores,
killed the males and children and stole the wives, which might be
something they were short of. After the peaceful societies were gone,
there was nothing to do but build walls and plant crops. The forests
around still had plenty of thieves though. The walled cities and
aggressive armies did help keep things reasonably under control,
mostly.

Which, perhaps you will agree, is the same situation we still have,
metaphorically.
--
Noah Sombrero

Julian

Oct 10, 2021, 2:46:58 PM
That was a regular double bill in independent/arty cinemas with Soylent
Green
when I was a lad. She didn't just get the Demon's. Then, OMG, Don't Look
Now.

Julian

Oct 10, 2021, 3:34:45 PM
The most successful structure is the family unit.
Further the family can also function as a society
disinfectant since it is usually family dynamics
that brings down tyrannical structures.

Ned

Oct 10, 2021, 4:04:00 PM
I thought Ex Machina was OK. But I'm more interested in
how to MAKE the AI singularity happen. And I tried to say
how it could be made to happen in my "CoronaZombies"
posts here.

There were two interchanges in the story, between an
AI and a human (actually corgi) intelligence. In each case,
I went to the net and found actual quotes by the two
people involved, so the whole conversations were actual
historical quotes by those involved.

The AI cared (for protection-of-assets reasons) for the
corgi, Esplandián, who was the hero of the story. And
Esplandián was despondent, because his heroes, Ynot and
Dominique, had had their brains (or at least their frontal
lobes) immobilized by the evil echidna Chidi's weaponized
virus.

Esplandián was destroying himself by trying to use his
own blood to restore Ynot and Dominique, and failing, and
starting to become a drunk. The AI, Alpha9, discussed all
this with Esplandián, and their conclusion was that the
only hope of saving the two lovers was to create the
Singularity. (Even knowing that the singularity might just
as easily destroy all of humanity as save it.)

The first step was to link up all human brains with all
other computer, phone, internet, database, robot, and
AI brains.

And they did this. And the first test was that Esplandián
became Abraham Lincoln's brain, and Alpha9 became
Donald Trump's brain...

Alpha9: What should we do first? I must confess, I can't
evaluate the Singularity. It might save all humans, it might
destroy all humans.

Esplandián: Well let's find out.

Alpha9: What will cause it to happen?

Esplandián: When all humans and all computers are linked
together simultaneously.

Alpha9: What will it take to do that?

Esplandián: An internet where everyone is on-line all the time
and can sense each other. And computers linked to all other
computers all the time. And all the humans on-line with all
the computers.
...
Back in Ynot's lab in the Deep State, Alpha9 has adjusted the headset on Esplandián's diminutive corgi head, and accessed the combined webnet of all computers and AIs on the planet. They hold their finger over the button that will bring Esplandián's brain on line.

Alpha9: Ready, Mr. Lincoln?

Esplandián: Ready, Mr. Trump.

The button is pressed. Esplandián swoons. His brown corgi eyes widen and his irises narrow to pinpoints. Then he sits back on his haunches and stares at Alpha9.

A/Trump: Mr. President.

E/Lincoln: Mr. President.

A/Trump: You know, My admin has done more for the Black Community than any President since you, Abraham Lincoln.

E/Lincoln: When you speak of us Republicans, you do so only to denounce us as reptiles, or, at the best, as no better than outlaws. You will grant a hearing to pirates or murderers, but nothing like it to "Black Republicans." I have no purpose to introduce political and social equality between the white and the black races.

A/Trump: I have a great relationship with the blacks. I've always had a great relationship with the blacks. A well-educated black has a tremendous advantage over a well-educated white in terms of the job market... I've said on one occasion, even about myself, if I were starting off today, I would love to be a well-educated black, because I believe they do have an actual advantage. And women... Nobody respects women more than me.

E/Lincoln: I have never studied the art of paying compliments to women. But I must say, that if all that has been said by orators and poets since the creation of the world in praise of women were applied to the women of America, it would not do them justice for their conduct during this war.

A/Trump: You have to treat 'em like shit. I think the only difference between me and the other candidates is that I'm more honest and my women are more beautiful. It really doesn't matter what the media write as long as you've got a young and beautiful piece of ass. My fingers are long and beautiful, as, it has been well been documented, are various other parts of my body.

E/Lincoln: God bless the women of America!

A/Trump: I just start kissing them. It's like a magnet. Just kiss. I don't even wait. And when you're a star, they let you do it. You can do anything... Grab 'em by the pussy. You can do anything. Some people would say I'm very, very, very intelligent. Any who don't are sorry losers and haters, but my IQ is one of the highest - and you all know it! My greatest assets are my mental stability and being, like, really smart.

E/Lincoln: If any personal description of me is thought desirable, it may be said I am, in height, six feet four inches, nearly; lean in flesh, weighing on an average one hundred and eighty pounds; dark complexion, with coarse black hair and gray eyes.

A/Trump: Every time I speak of the haters and losers I do so with great love and affection. They cannot help the fact that they were born fucked up! Of course I hate these people and let's all hate these people because maybe hate is what we need if we're gonna get something done.

E/Lincoln: I confess I hate to see the poor creatures hunted down and caught and carried back to their stripes and unrequited toil; but I bite my lips and keep quiet. I hate it because it deprives our republican example of its just influence in the world, enables the enemies of free institutions with plausibility to taunt us as hypocrites, causes the real friends of freedom to doubt our sincerity, and especially because it forces so many good men amongst ourselves into an open war with the very fundamental principles of civil liberty, criticizing the Declaration of Independence, and insisting that there is no right principle of action but self-interest.

A/Trump: It's very possible that I could be the first presidential candidate to run and make money on it. I'm not a schmuck. Even if the world goes to hell in a handbasket, I won't lose a penny. I could stand in the middle of Fifth Avenue and shoot somebody, and I wouldn't lose any voters, okay? It's, like, incredible.

E/Lincoln: There is no grievance that is a fit object of redress by mob law... They were pillars of the temple of liberty; and now that they have crumbled away, that temple must fall unless we, their descendants, supply their places with other pillars, hewn from the solid quarry of sober reason. Passion has helped us, but can do so no more. It will in future be our enemy. Reason - cold, calculating, unimpassioned reason - must furnish all the materials for our future support and defence. Let those materials be moulded into general intelligence, sound morality, and, in particular, a reverence for the Constitution and laws; and that we improved to the last, that we remained free to the last, that we revered his name to the last, that during his long sleep we permitted no hostile foot to pass over or desecrate his resting-place, shall be that which to learn [when] the last trump shall awaken our Washington. Upon these let the proud fabric of freedom rest, as the rock of its basis; and as truly as has been said of the only greater institution, "the gates of hell shall not prevail against it."

Esplandián: (clawing off his headset and panting) Wow. I can't believe I said that!

Alpha9: You didn't. Abraham Lincoln did... But did you get a little taste, a little whiff, of the Singularity?

---

Esplandián then decides that they have to test the process
IMMEDIATELY on humans.

So they get Ynot and Dominique (who are not suffering, merely
mentally impaired, and somewhat enjoying their simplistic,
sensual, lives), and convince them to try the process.

Dominique decides to become Marilyn Monroe and Ynot decides
to become Clark Gable...

Esplandián: I have the most wonderful news for you two!

Dominique: Disneyland has re-opened?

Ynot: You've finally beaten Chidi out of his "King of the World" spot?

Esplandián: Uh... no. Neither of those has happened yet. But maybe something better. And something that can make those happen. (To Alpha9) Do you have two prototypes for human headsets?

Alpha9: (sighing) Yes.

Alpha9 raises finger and the previous Alpha returns with two headsets, which are fitted on the lover's heads. Again, they hold their finger over the button that will bring the lover's brains on line.

Esplandián: You can be anyone you want to be. Or anything, any place, any language, any time.

Dominique: OK. I want to be Marilyn! (Looking lustfully at Ynot.)

Ynot: OK. And I want to be Clark Gable.

Esplandián nods, and Alpha9 presses the button.

Both the lovers swoon. Dominique recovers, and unbuttons the top button of her blouse while fanning her face with her hand. Ynot grins.

Esplandián: Well, what do you think?

Y/Clark: Marilyn is a kind of ultimate. She is uniquely feminine. Everything she does is different, strange, and exciting, from the way she talks to the way she uses that magnificent torso. She makes a man proud to be a man.

D/Marilyn: If you can make a woman laugh, you can make her do anything.

Y/Clark: I never laugh until I've had my coffee.

D/Marilyn: A strong man doesn't have to be dominant toward a woman. He doesn't match his strength against a woman weak with love for him. He matches it against the world.

Y/Clark: The things a man has to have are hope and confidence in himself against odds, and sometimes he needs somebody, his pal or his mother or his wife or God, to give him that confidence.

D/Marilyn: I'm selfish, impatient and a little insecure. I make mistakes, I am out of control and at times tough to handle. But if you can't handle me at my worst, then you sure as hell don't deserve me at my best.

Y/Clark: I am intrigued by glamorous women. A vain woman is continually taking out a compact to repair her makeup. A glamorous woman knows she doesn't need to.

D/Marilyn: Imperfection is beauty, madness is genius and it's better to be absolutely ridiculous than absolutely boring.

Y/Clark: Types really don't matter. I have been accused of preferring blondes. But I have known some mighty attractive redheads, brunettes, and yes, women with grey hair. Age, height, weight haven't anything to do with glamour.

D/Marilyn: I'm pretty but not beautiful. I sin but I'm not the devil. I'm good but I'm not an angel. I'm just a small girl in a big world. Trying to find someone to love.

Y/Clark: I want so badly to believe that there is truth, that love is real. It is an extra dividend when you like the girl you've fallen in love with.

D/Marilyn: All I want is to be loved, for myself and for my talent. I think that love and work are the only things that really happen to us. The real lover is the man who can thrill you by kissing your forehead or smiling into your eyes or just staring into space.

Y/Clark: I was wondering what makes dames like you so dippy.

D/Marilyn: There isn't anybody that looks like me without clothes on.

Ynot rips off his shirt. Esplandián and Alpha9 both shout "Whoa!" Esplandián jumps on Dominique and pulls the headset off. Alpha9 grabs Ynot's headset and then deactivates the link.

Dominique: Why'd you do THAT??

Ynot: Gimme back that headset! We've got to get back to where we left off.

Esplandián: What we need here is a little more forebrain...

Alpha9: And a little less limbic system... I told you this might happen.

Esplandián: Well then hook me up to them. Let them share my frontal lobes.

Alpha9: As you wish...

Esplandián carries the headset back to Dominique, who puts it on. Alpha9 sets Esplandián's headset in place, and hands Ynot his.

Ynot: Humph! Watch out for the dog, Dominique.

Esplandián: The dog??

Alpha9 re-activates the link.

D/Marilyn: Dogs never bite me. Just humans.

Esplandián: Can we all please move on from Marilyn and Clark, for just a minute?

Ynot: All right, Esplandián, who do you want us to be?...

---

So, is that how the Singularity can happen?

Ned

Noah Sombrero

Oct 10, 2021, 4:45:50 PM
On Sun, 10 Oct 2021 20:34:44 +0100, Julian <julia...@gmail.com>
wrote:
Something brings down tyrannical structures? It is fondly to be
hoped.
--
Noah Sombrero

Kentucky Jelly Buddha

Oct 10, 2021, 8:01:40 PM
On 10/10/2021 3:34 PM, Julian wrote:
>
> The most successful structure is the family unit.
> Further the family can also function as a society
> disinfectant since it is usually family dynamics
> that brings down tyrannical structures.

The reason that Afghanistan may be unconquerable
and ungovernable is the tribal structure that
governs almost everything despite political and
sometimes even religious issues.

--
Kentucky Jelly Buddha
Making Leek Insertion Easy Since 1904
Van Horn and Sawtell Co. of New York City

Kentucky Jelly Buddha

Oct 10, 2021, 8:02:47 PM
On 10/10/2021 7:26 AM, Love wrote:
> Ah, you probably liked Ex Machina, then. I know
> I did.

A great movie.

Kentucky Jelly Buddha

Oct 10, 2021, 8:14:18 PM
On 10/10/2021 8:02 PM, Kentucky Jelly Buddha wrote:
> On 10/10/2021 7:26 AM, Love wrote:
>> Ah, you probably liked Ex Machina, then.  I know
>> I did.
>
> A great movie.
>

Also, "The Machine" is pretty good.

https://youtu.be/e9oa93BrNuw

Julian

Oct 10, 2021, 8:24:38 PM
On 11/10/2021 01:01, Kentucky Jelly Buddha wrote:
> On 10/10/2021 3:34 PM, Julian wrote:
>>
>> The most successful structure is the family unit.
>> Further the family can also function as a society
>> disinfectant since it is usually family dynamics
>> that brings down tyrannical structures.
>
> The reason that Afghanistan may be unconquerable
> and ungovernable is the tribal structure that
> governs almost everything despite political and
> sometimes even religious issues.

The tribal structure, families, is the very thing
that prevents them uniting into something that
could govern and then conquer others.

Julian

Oct 10, 2021, 8:45:21 PM
On 11/10/2021 01:01, Kentucky Jelly Buddha wrote:
> On 10/10/2021 3:34 PM, Julian wrote:
>>
>> The most successful structure is the family unit.
>> Further the family can also function as a society
>> disinfectant since it is usually family dynamics
>> that brings down tyrannical structures.
>
> The reason that Afghanistan may be unconquerable
> and ungovernable is the tribal structure that
> governs almost everything despite political and
> sometimes even religious issues.


ps. I wouldn't be surprised if we'd lost more
Kings to family feuds than foreign invasion.

Love

Oct 11, 2021, 12:43:03 AM
In article <sjv1fo$ajd$1...@dont-email.me>, Wil...@nowhere.net says...
Yep, and we should respect that by steering
clear of ideologies.

--
Love

Julian

Oct 11, 2021, 12:49:50 AM
No fixed abode.

Love

Oct 11, 2021, 12:51:57 AM
In article <sjvcf1$p1s$1...@dont-email.me>, julia...@gmail.com says...
Oh yes, I think I saw all of those at 99 cent cinemas
and possibly during some all-night cinema shows.

--
Love

Love

Oct 11, 2021, 1:04:18 AM
In article <78f9de95-00ce-4958...@googlegroups.com>,
ned...@ix.netcom.com says...
I can't see why not.


--
Love, who will be happy when his car can get him
from A to B safely while he sleeps.

Love

Oct 11, 2021, 1:09:17 AM
In article <sjvvkp$210$1...@dont-email.me>, ans...@gmail.com says...
>On 10/10/2021 8:02 PM, Kentucky Jelly Buddha wrote:
>> On 10/10/2021 7:26 AM, Love wrote:
>>> Ah, you probably liked Ex Machina, then.  I know
>>> I did.
>>
>> A great movie.
>>
>
>Also, "The Machine" is pretty good.
>
>https://youtu.be/e9oa93BrNuw

I'll have to watch out for that.

AIs should always be hot chicks, obviously.


--
Love

Love

Oct 11, 2021, 1:10:39 AM
In article <sk01f0$2bo$1...@dont-email.me>, julia...@gmail.com says...
I'd be surprised if that wasn't true.


--
Love

Love

Oct 11, 2021, 1:15:33 AM
In article <sk0fpd$3no$1...@dont-email.me>, julia...@gmail.com says...
Approved.


--
Love

Kentucky Jelly Buddha

Oct 11, 2021, 2:03:10 AM
That is so she can pull her victim by his
dick close enough to get hold of his balls
and crush them, then pull the crushed mess
so hard that his bladder is pulled from his
body. She then takes the mess and shoves
it down his throat breaking his jaw if necessary
to get it in there. Of course, he will expire
somewhere during that process, if he is lucky.

"Don't fuck around with the robot chick.
Save a round for yourself just in case."

Kentucky Jelly Buddha

Oct 11, 2021, 2:04:40 AM
So there is a plus and a minus. I bet Attila
had that whole tribal thing worked out so he
could mobilize the whole population.

Kentucky Jelly Buddha

Oct 11, 2021, 2:08:28 AM
I was reading about Byzantine history. Most
emperors died at the hands of their own families or
of the troops responsible for their security.

https://en.wikipedia.org/wiki/Varangian_Guard

This is a whole interesting subject. Berserker
warriors for hire not subject to Byzantine
politics, at least at first.

Wilson

Oct 11, 2021, 8:08:52 AM
On 10/11/2021 2:03 AM, Kentucky Jelly Buddha wrote:
> On 10/11/2021 1:09 AM, Love wrote:
>>
>> AIs should always be hot chicks, obviously.
>
> That is so she can pull her victim by his
> dick close enough to get hold of his balls
> and crush them, then pull the crushed mess
> so hard that his bladder is pulled from his
> body. She then takes the mess and shoves
> it down his throat breaking his jaw if necessary
> to get it in there. Of course, he will expire
> somewhere during that process, if he is lucky.
>
> "Don't fuck around with the robot chick.
> Save a round for yourself just in case."

Is there something you'd like to talk about?

Wilson

Oct 11, 2021, 9:35:56 AM
On 10/11/2021 2:04 AM, Kentucky Jelly Buddha wrote:
> On 10/10/2021 8:24 PM, Julian wrote:
>> On 11/10/2021 01:01, Kentucky Jelly Buddha wrote:
>>> On 10/10/2021 3:34 PM, Julian wrote:
>>>>
>>>> The most successful structure is the family unit.
>>>> Further the family can also function as a society
>>>> disinfectant since it is usually family dynamics
>>>> that brings down tyrannical structures.
>>>
>>> The reason that Afghanistan may be unconquerable
>>> and ungovernable is the tribal structure that
>>> governs almost everything despite political and
>>> sometimes even religious issues.
>>
>> The tribal structure, families, is the very thing
>> that prevents them uniting into something that
>> could govern and then conquer others.
>
> So there is a plus and a minus. I bet Attila
> had that whole tribal thing worked out so he
> could mobilize the whole population.
>

I'd bet that Afghan culture is based on local autonomy, and not on
following a central overlord, unlike the Huns.

According to what I've read, the Hebrews were such a decentralized
culture until they decided that they wanted a king to rule over them
all. God supposedly told them, "okay but you might regret it".

Wilson

Oct 11, 2021, 9:38:27 AM
What's the difference between ideologies and general guiding principles?

Noah Sombrero

Oct 11, 2021, 10:09:27 AM
They can be interchangeable depending on how they treat ideas outside
the ideology or principles.
--
Noah Sombrero

Kentucky Jelly Buddha

Oct 11, 2021, 10:23:53 AM
Robot women are DANGEROUS!

Love

Oct 11, 2021, 11:42:35 AM
In article <sk1eoi$frd$2...@dont-email.me>, Wil...@nowhere.net says...
Ideologies are systems of intersecting and
interdependent principles. To threaten
one of them is to threaten all of them.
Ideologies work against a free-flowing
flux of thought about what is best
because the stakes are raised: threaten
one corner of an ideology and you threaten
the entire structure. As comprehensive
systems of principles ideologies also
support and promote group loyalty. They
support identity politics.

General guiding principles are more
loosely held and easier to question and
modify or replace.

The difference is a functional basket of
rules of thumb versus a jenga tower of
hardened principles.


--
Love
(On even-numbered days, he/her/its.
On odd-numbered days, she/him/thems.
Alternatively, use normal English
terms appropriate to my sex.)

Love

Oct 11, 2021, 11:57:17 AM
In article <sk1hdo$8fv$1...@dont-email.me>, ans...@gmail.com says...
>On 10/11/2021 8:08 AM, Wilson wrote:
>> On 10/11/2021 2:03 AM, Kentucky Jelly Buddha wrote:
>>> On 10/11/2021 1:09 AM, Love wrote:
>>>>
>>>> AIs should always be hot chicks, obviously.
>>>
>>> That is so she can pull her victim by his
>>> dick close enough to get hold of his balls
>>> and crush them, then pull the crushed mess
>>> so hard that his bladder is pulled from his
>>> body. She then takes the mess and shoves
>>> it down his throat breaking his jaw if necessary
>>> to get it in there. Of course, he will expire
>>> somewhere during that process, if he is lucky.
>>>
>>> "Don't fuck around with the robot chick.
>>> Save a round for yourself just in case."
>>
>> Is there something you'd like to talk about?
>>
>
>Robot women are DANGEROUS!

Dangerous women are HOT!


--
Love

DMB

Oct 11, 2021, 12:18:46 PM
On Monday, 11 October 2021 at 00:04:40 UTC-6, ansaman wrote:
...
> So there is a plus and a minus. I bet Attila
> had that whole tribal thing worked out so he
> could mobilize the whole population.

https://www.youtube.com/watch?v=FcrXtA-9myM

Julian

Oct 11, 2021, 12:41:39 PM
Hot women are DANGEROUS!

Wilson

unread,
Oct 11, 2021, 12:50:41 PM10/11/21
to
Based on that definition, libertarian is not an ideology.

Noah Sombrero

unread,
Oct 11, 2021, 1:43:20 PM10/11/21
to
An assertion that nearly any holder of an ideology would make
concerning their ideology. Nobody wants to think that their favorite
flavor is a jenga tower of hardened principles.
--
Noah Sombrero

Wilson

unread,
Oct 11, 2021, 1:45:18 PM10/11/21
to
"Rightful liberty is unobstructed action according to our will within
limits drawn around us by the equal rights of others. I do not add
'within the limits of the law' because law is often but the tyrant's
will, and always so when it violates the rights of the individual."

That was Jefferson. In that, he reveals the Classical Liberal /
Libertarian ideal.

No jenga tower of intersecting and interdependent principles required.
Anything else is just embellishment and explanation.

Noah Sombrero

unread,
Oct 11, 2021, 2:37:54 PM10/11/21
to
And within the libertarian framework, you find the right to make other
people sick by refusing a vaccine?

If so, then maybe you get a glimpse of how important definitions are
to interpretation. Or perhaps not.
--
Noah Sombrero

Wilson

unread,
Oct 11, 2021, 3:49:45 PM10/11/21
to
I am not making anyone else sick by not getting the vaccine.

Do you claim the right to force me to take drugs against my will?

Julian

unread,
Oct 11, 2021, 4:17:20 PM10/11/21
to
He'll be insisting on the compulsory sterilisation
of deplorables next.

Noah Sombrero

unread,
Oct 11, 2021, 4:46:42 PM10/11/21
to
On Mon, 11 Oct 2021 15:49:43 -0400, Wilson <Wil...@nowhere.net> wrote:

>On 10/11/2021 2:37 PM, Noah Sombrero wrote:
>> On Mon, 11 Oct 2021 13:45:17 -0400, Wilson <Wil...@nowhere.net> wrote:
>>> On 10/11/2021 12:50 PM, Wilson wrote:
>>>> On 10/11/2021 11:42 AM, Love wrote:
>>>>>>
>>>>>> What's the difference between ideologies and general guiding principles?
>>>>>
>>>>> Ideologies are systems of intersecting and
>>>>> interdependent principles. To threaten
>>>>> one of them is to threaten all of them.
>>>>> Ideologies work against a free-flowing
>>>>> flux of thought about what is best
>>>>> because the stakes are raised: threaten
>>>>> one corner of an ideology and you threaten
>>>>> the entire structure. As comprehensive
>>>>> systems of principles ideologies also
>>>>> support and promote group loyalty. They
>>>>> support identity politics.
>>>>>
>>>>> General guiding principles are more
>>>>> loosely held and easier to question and
>>>>> modify or replace.
>>>>>
>>>>> The difference is a functional basket of
>>>>> rules of thumb versus a jenga tower of
>>>>> hardened principles.
>>>>>
>>>>>
>>>>
>>>> Based on that definition, libertarian is not an ideology.
>>>
>>> "Rightful liberty is unobstructed action according to our will within
>>> limits drawn around us by the equal rights of others. I do not add
>>> 'within the limits of the law' because law is often but the tyrant's
>>> will, and always so when it violates the rights of the individual."
>>>
>>> That was Jefferson. In that, he reveals the Classical Liberal /
>>> Libertarian ideal.
>>>
>>> No jenga tower of intersecting and interdependent principles required.
>>> Anything else is just embellishment and explanation.
>>
>> And within the libertarian framework, you find the right to make other
>> people sick by refusing a vaccine?
>
>I am not making anyone else sick by not getting the vaccine.
>
>Do you claim the right to force me to take drugs against my will?

Definitions. Do you live on a mountain top?
--
Noah Sombrero

Noah Sombrero

unread,
Oct 11, 2021, 4:53:15 PM10/11/21
to
On Mon, 11 Oct 2021 16:46:41 -0400, Noah Sombrero <fed...@fea.st>
So what are your definitions?
Covid vaccines don't work
Medical information provided by governments is fake, designed to rob
you of your freedom.

More? Not sure, but perhaps that is enough.
In one swell foop you have defined the medical concerns of other
people out of existence. They don't apply to you because you say so.

I'm not a libertarian, but it might be that its founding folks didn't
envision this turn of interpretation.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 11, 2021, 4:54:51 PM10/11/21
to
On Mon, 11 Oct 2021 21:17:20 +0100, Julian <julia...@gmail.com>
wrote:

>On 11/10/2021 20:49, Wilson wrote:
>> On 10/11/2021 2:37 PM, Noah Sombrero wrote:
>>> On Mon, 11 Oct 2021 13:45:17 -0400, Wilson <Wil...@nowhere.net> wrote:
>>>> On 10/11/2021 12:50 PM, Wilson wrote:
>>>>> On 10/11/2021 11:42 AM, Love wrote:
>>>>>>>
>>>>>>> What's the difference between ideologies and general guiding
>>>>>>> principles?
>>>>>>
>>>>>> Ideologies are systems of intersecting and
>>>>>> interdependent principles. To threaten
>>>>>> one of them is to threaten all of them.
>>>>>> Ideologies work against a free-flowing
>>>>>> flux of thought about what is best
>>>>>> because the stakes are raised: threaten
>>>>>> one corner of an ideology and you threaten
>>>>>> the entire structure. As comprehensive
>>>>>> systems of principles ideologies also
>>>>>> support and promote group loyalty. They
>>>>>> support identity politics.
>>>>>>
>>>>>> General guiding principles are more
>>>>>> loosely held and easier to question and
>>>>>> modify or replace.
>>>>>>
>>>>>> The difference is a functional basket of
>>>>>> rules of thumb versus a jenga tower of
>>>>>> hardened principles.
>>>>>>
>>>>>>
>>>>>
>>>>> Based on that definition, libertarian is not an ideology.
>>>>
>>>> "Rightful liberty is unobstructed action according to our will within
>>>> limits drawn around us by the equal rights of others. I do not add
>>>> 'within the limits of the law' because law is often but the tyrant's
>>>> will, and always so when it violates the rights of the individual."
>>>>
>>>> That was Jefferson.  In that, he reveals the Classical Liberal /
>>>> Libertarian ideal.
>>>>
>>>> No jenga tower of intersecting and interdependent principles required.
>>>> Anything else is just embellishment and explanation.
>>>
>>> And within the libertarian framework, you find the right to make other
>>> people sick by refusing a vaccine?
>>
>> I am not making anyone else sick by not getting the vaccine.
>>
>> Do you claim the right to force me to take drugs against my will?
>
>He'll be insisting on the compulsory sterilisation
>of deplorables next.

Getting hysterical, are you?
--
Noah Sombrero

DMB

unread,
Oct 11, 2021, 9:55:41 PM10/11/21
to
On Monday, 11 October 2021 at 12:37:54 UTC-6, Noah Sombrero wrote:

> within the libertarian framework, you find the right to make other
> people sick by refusing a vaccine?

Do you have any evidence that Wilson made anyone sick by refusing the vaccine?
If not, what are you talking about?

DMB

unread,
Oct 11, 2021, 10:00:35 PM10/11/21
to
On Monday, 11 October 2021 at 14:54:51 UTC-6, Noah Sombrero wrote:

> >He'll be insisting on the compulsory sterilisation
> >of deplorables next.

> Getting hysterical are you?

I've gotten as many people sick before I was vaxxed as I did after I was vaxxed [0].
What do you do with those statistics?
You ignore them and imagine there are millions of plague victims laying about the sidewalks and roads, writhing with the sickness and begging for a bit of compassion from passers-by.

Noah Sombrero

unread,
Oct 11, 2021, 11:01:21 PM10/11/21
to
On Mon, 11 Oct 2021 19:00:34 -0700 (PDT), DMB <sgma...@gmail.com>
wrote:

>On Monday, 11 October 2021 at 14:54:51 UTC-6, Noah Sombrero wrote:
>
>> >He'll be insisting on the compulsory sterilisation
>> >of deplorables next.
>
>> Getting hysterical are you?
>
>I've gotten as many people sick before I was vaxxed as I did after I was vaxxed [0].
>What do you do with those statistics?

Which suggests that you have not had covid, not that you wouldn't pass
it on if you did.

One person does not make a significant statistic. Sample size
matters. Even if you got covid after you got the shot, and then
passed it on, that in itself would not be significant.

If you were to compile a few thousand cases (comparing people who got
the shot, against people who didn't, against people who got a
placebo), it might be possible to extract meaningful data from them.
But even that data would not be significant until compared with
similar studies done by other people in various parts of the world.

Because very few things in nature work the same way every time. A
certain percentage of people will get covid without the shot and a
certain percentage will get covid with it. But in no case will all of
them get it, or all of them not. And you can't get a decent
approximation of that percentage without extensive testing. The
larger the sample size you have, the closer your computed percentage
will be to the actual percentage.
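
A minimal sketch, assuming two made-up infection rates purely for
illustration (not figures from any study or from the article mentioned
in this thread), shows how a larger sample pins the observed percentage
closer to the underlying one:

import random

# Hypothetical illustration only: the "true" rates below are placeholders,
# not data from any study.
TRUE_RATE_UNVAXED = 0.10   # assumed infection rate without the shot
TRUE_RATE_VAXED = 0.02     # assumed infection rate with the shot

def observed_rate(true_rate, n, rng):
    """Simulate testing n people and return the fraction found infected."""
    infected = sum(1 for _ in range(n) if rng.random() < true_rate)
    return infected / n

rng = random.Random(42)
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n={n:>6}: unvaxed ~{observed_rate(TRUE_RATE_UNVAXED, n, rng):.3f}, "
          f"vaxed ~{observed_rate(TRUE_RATE_VAXED, n, rng):.3f}")
# Small samples bounce around; large samples settle near the assumed true
# rates, which is why a single household is not a meaningful statistic.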

>You ignore them and imagine there are millions of plague victims laying about the sidewalks and roads, writhing with the sickness and begging for a bit of compassion from passers-by.

You can imagine that if you like. The plague has been gone for a few
hundred years.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 11, 2021, 11:06:26 PM10/11/21
to
On Mon, 11 Oct 2021 18:55:41 -0700 (PDT), DMB <sgma...@gmail.com>
wrote:
I'm talking about a likelihood. If wilson gets covid, it will be very
likely that he will pass it on. The odds aren't 100%, but very
likely. I don't think the libertarian framework would say he is not
impinging on the rights of other people in refusing to take the
vaccine. Of course, you can always find people who don't think that,
like wilson.
--
Noah Sombrero

DMB

unread,
Oct 12, 2021, 12:02:34 AM10/12/21
to
On Monday, 11 October 2021 at 21:01:21 UTC-6, Noah Sombrero wrote:
...
> >I've gotten as many people sick before I was vaxxed as I did after I was vaxxed [0].
> >What do you do with those statistics?

> Which suggests that you have not had covid, not that you wouldn't pass
> it on if you did.

Really? While milling around with the unvaxxed for over a year?
Interesting.

> One person does not make a significant statistic. Sample size
> matters. Even if you got covid after you got the shot, and then
> passed it on, that in itself would not be significant.

> If you were to compile a few thousand cases (comparing people who got
> the shot, against people who didn't, against people who got a
> placebo), it might be possible to extract meaningful data from them.
> But even that data would not be significant until compared with
> similar studies done by other people in various parts of the world.

> Because very few things in nature work the same way every time. A
> certain percentage of people will get covid without the shot and a
> certain percentage will get covid with it. But in no case will all of
> them get it, or all of them not. And you can't get a decent
> approximation of that percentage without extensive testing. The
> larger the sample size you have the closer your computed percentage
> will be to the actual percentage.

You don't scream "Tsunami!" when a car runs through a puddle on the road (unless you're a mental case or need people to believe it's a tsunami).

> >You ignore them and imagine there are millions of plague victims laying about the sidewalks and roads, writhing with the sickness and begging for a bit of compassion from passers-by.

> You can imagine that if you like. The plague has been gone for a few
> hundred years.

That's what I thought.
Maybe someone should let the Australian government know that Covid isn't the plague, given that they've had fewer than 2,000 deaths.

My brother got it and his wife, who lives with him, didn't.
Fascinating.

Kentucky Jelly Buddha

unread,
Oct 12, 2021, 12:46:08 AM10/12/21
to
On 10/11/2021 3:49 PM, Wilson wrote:
>
> I am not making anyone else sick by not getting the vaccine.
>
> Do you claim the right to force me to take drugs against my will?

If it's abortion, even up to the point of actual birth,
it is "your body." But otherwise, your body belongs to
the state, even more so if we get Medicare for all.
With a relevant law, they can draft you and get you
killed to death.

Apparently, there are multiple sick-outs taking
place in multiple industries and people being
fired for non-compliance leading to labor shortages.

Every fired person is a nail in the coffin of
the Democrats and if it keeps up, it is going
to morph into a general strike.

"Let's Go Brandon!"

Love

unread,
Oct 12, 2021, 9:16:45 AM10/12/21
to
In article <sk1q10$mo3$1...@dont-email.me>, Wil...@nowhere.net says...
It's not a definition but a characterisation
and comparison.


--
Love

Noah Sombrero

unread,
Oct 12, 2021, 9:37:28 AM10/12/21
to
On Mon, 11 Oct 2021 21:02:33 -0700 (PDT), DMB <sgma...@gmail.com>
wrote:

>On Monday, 11 October 2021 at 21:01:21 UTC-6, Noah Sombrero wrote:
>...
>> >I've gotten as many people sick before I was vaxxed as I did after I was vaxxed [0].
>> >What do you do with those statistics?
>
>> Which suggests that you have not had covid, not that you wouldn't pass
>> it on if you did.
>
>Really? While milling around with the unvaxxed for over a year?
>Interesting.

Probabilities.

>> One person does not make a significant statistic. Sample size
>> matters. Even if you got covid after you got the shot, and then
>> passed it on, that in itself would not be significant.
>
>> If you were to compile a few thousand cases (comparing people who got
>> the shot, against people who didn't, against people who got a
>> placebo), it might be possible to extract meaningful data from them.
>> But even that data would not be significant until compared with
>> similar studies done by other people in various parts of the world.
>
>> Because very few things in nature work the same way every time. A
>> certain percentage of people will get covid without the shot and a
>> certain percentage will get covid with it. But in no case will all of
>> them get it, or all of them not. And you can't get a decent
>> approximation of that percentage without extensive testing. The
>> larger the sample size you have the closer your computed percentage
>> will be to the actual percentage.
>
>You don't scream "Tsunami!" when a car runs through a puddle on the road (unless you're a mental case or need people to believe it's a tsunami).
>
>> >You ignore them and imagine there are millions of plague victims laying about the sidewalks and roads, writhing with the sickness and begging for a bit of compassion from passers-by.
>
>> You can imagine that if you like. The plague has been gone for a few
>> hundred years.
>
>That's what I thought.
>Maybe someone should let the Australian government know that Covid isn't the plague, given that they've had fewer than 2,000 deaths.
>
>My brother got it and his wife he lives with didn't.
>Fascinating.

Probabilities.

Go back and read what I said. Individual exceptions are not
significant data. Nothing is 100%.



My issue with wilson and libertarians remains. It doesn't sound to me
like it is valid in libertarian ideology to define rights of other
people out of existence so you don't need to avoid impinging on them.
Also not valid to declare certain other people to be foolish which
allows you to not consider their rights.

Civil society, right? It is true that a person could define civil to
allow defining rights out of existence, and dismissing fools. But I
don't think that a person can blame that outlook on libertarians.
--
Noah Sombrero

Love

unread,
Oct 12, 2021, 9:48:47 AM10/12/21
to
In article <sk1t7d$lq0$1...@dont-email.me>, Wil...@nowhere.net says...
I assume what we are talking about with the word
"libertarian" is the present day US instantiation,
which is more rightist than the libertarianism
that Ayn Rand criticised harshly in her day.
I wouldn't call Jefferson as definitive as Ayn
Rand though I think many libertarians would credit
both as major personal influences. If one
understands Rand's "objectivism", particularly
her takes on capitalism and the role of government,
one can predict what the US Libertarian policy on
any particular issue is likely to be, with a great
deal of accuracy.


--
Love


Love

unread,
Oct 12, 2021, 10:10:15 AM10/12/21
to
In article <cku9mgl695ltnti8c...@4ax.com>, fed...@fea.st
says...
>On Mon, 11 Oct 2021 18:55:41 -0700 (PDT), DMB <sgma...@gmail.com>
>wrote:
>
>>On Monday, 11 October 2021 at 12:37:54 UTC-6, Noah Sombrero wrote:
>>
>>> within the libertarian framework, you find the right to make other
>>> people sick by refusing a vaccine?
>>
>>Do you have any evidence that Wilson made anyone sick by refusing the
>>vaccine?
>>If not, what are you talking about?
>
>I'm talking about a likelihood. If wilson gets covid, it will be very
>likely that he will pass it on.

Untrue. I've seen where he lives and I know
what he does for a living: lots of open
air and sunshine. I'm not 100% sure, but I
would not be surprised if (unlike me) Wilson
does not enjoy crowds.


> The odds aren't 100%, but very
>likely.

Unless you define "very" we can only assume you
mean greater than 50%. I'd call that horseshit
estimation.


> I don't think the libertarian framework would say he is not
>impinging on the rights of other people in refusing to take the
>vaccine.

That's exactly what it would say.


> Of course, you can always find people who don't think that,
>like wilson.

I don't think it, but unlike libertarians I
believe that "individual libert" isn't the
only pole to be considered. I believe that
duty is part of the mix.


--
Love

Love

unread,
Oct 12, 2021, 10:16:37 AM10/12/21
to
In article <sk33uf$mge$1...@dont-email.me>, ans...@gmail.com says...
>On 10/11/2021 3:49 PM, Wilson wrote:
>>
>> I am not making anyone else sick by not getting the vaccine.
>>
>> Do you claim the right to force me to take drugs against my will?
>
>If it's abortion, even up to the point of actual birth,
>it is "your body." But otherwise, your body belongs to
>the state, even more so if we get Medicare for all.

Sigh.


--
Love

Julian

unread,
Oct 12, 2021, 10:27:22 AM10/12/21
to
Differentiating between nails and screws is no use
to the Noahs of this world who only have a hammer.

Love

unread,
Oct 12, 2021, 10:30:05 AM10/12/21
to
In article <933bmg5rkq39pjlq8...@4ax.com>, fed...@fea.st says...
Actually that's libertarian to the core. My
right to be preemptively protected from you
doesn't exist. The remedy, in libertarian
thought, is that if you harm me I can sue
you.

Which is why libertarians are so soft on
environmental regulations, too.

--
Love

Love

unread,
Oct 12, 2021, 10:33:12 AM10/12/21
to
In article <sk4609$qj1$3...@dont-email.me>, julia...@gmail.com says...
We won't talk Philips versus Robertson at
all then!


--
Love

Noah Sombrero

unread,
Oct 12, 2021, 10:40:14 AM10/12/21
to
On Tue, 12 Oct 2021 10:10:13 -0400, Love <n...@spam.invalid> wrote:

>In article <cku9mgl695ltnti8c...@4ax.com>, fed...@fea.st
>says...
>>On Mon, 11 Oct 2021 18:55:41 -0700 (PDT), DMB <sgma...@gmail.com>
>>wrote:
>>
>>>On Monday, 11 October 2021 at 12:37:54 UTC-6, Noah Sombrero wrote:
>>>
>>>> within the libertarian framework, you find the right to make other
>>>> people sick by refusing a vaccine?
>>>
>>>Do you have any evidence that Wilson made anyone sick by refusing the
>>>vaccine?
>>>If not, what are you talking about?
>>
>>I'm talking about a likelihood. If wilson gets covid, it will be very
>>likely that he will pass it on.
>
>Untrue. I've seen where he lives and I know
>what he does for a living. All lots of open
>air and sunshine. I'm not 100% sure, but I
>would not be surprised if (unlike me) Wilson
>does not enjoy crowds.

Crowds (or cruise ships or retirement homes) do make transmission a
lot more likely. Smaller groups do not mean unlikely.

>> The odds aren't 100%, but very
>>likely.
>
>Unless you define "very" we can only assume you
>mean greater than 50%. I'd call that horseshit
>estimation.

You can assume whatever you like.

>> I don't think the libertarian framework would say he is not
>>impinging on the rights of other people in refusing to take the
>>vaccine.
>
>That's exactly what it would say.

Perhaps it is what current libertarians would say. Does
libertarian ideology as given agree with that?

>
>> Of course, you can always find people who don't think that,
>>like wilson.
>
>I don't think it, but unlike libertarians I
>believe that "individual libert" isn't the
>only pole to be considered. I believe that
>duty is part of the mix.

So, in the end, we agree. How about that?
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 10:42:57 AM10/12/21
to
Thanks, I would have thought better of them.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 10:43:59 AM10/12/21
to
On Tue, 12 Oct 2021 15:27:21 +0100, Julian <julia...@gmail.com>
wrote:
Sorry to hear about your anxiety disorder.
--
Noah Sombrero

Wilson

unread,
Oct 12, 2021, 12:05:01 PM10/12/21
to
What is "preemptive protection"?

Wilson

unread,
Oct 12, 2021, 12:06:56 PM10/12/21
to
The medical concerns of other people are valid. But their concerns
cannot rob other people of their natural unalienable rights.

Wilson

unread,
Oct 12, 2021, 12:32:31 PM10/12/21
to
Medicare is sort of a red herring. It could be implemented fairly, but
I don't expect it to be implemented both competently and fairly at this
point in our evolution.

Meanwhile it's worth noting that you are a subject of the crown and owe
fealty to the monarch. And The Crown is defined as the state.

Definition of subject
(Entry 1 of 3)
1 : one that is placed under authority or control: such as
a vassal

Definition of vassal
1 : a person under the protection of a feudal lord to whom he has vowed
homage and fealty : a feudal tenant
2 : one in a subservient or subordinate position

Effectively, the crown owns you.

As a citizen of the United States, the state does not own me.

In regard to citizenship, these two things are not the same.

Wilson

unread,
Oct 12, 2021, 12:35:26 PM10/12/21
to
Of course duty is a part of the mix. Duty goes with and supports but
does not supersede natural rights.

Wilson

unread,
Oct 12, 2021, 12:39:42 PM10/12/21
to
Square drive (what we call "Robertson") is where it's at.

Slot head is totally for losers.

Wilson

unread,
Oct 12, 2021, 12:44:05 PM10/12/21
to
Ayn Rand was no doubt a genius and a great thinker, but she doesn't get
to define libertarianism. And Objectivism as a philosophy is not
libertarianism.

I think that quote from Jefferson does define what it is, and if you
polled a large group of libertarians, I am sure a very large percentage
would agree.

Noah Sombrero

unread,
Oct 12, 2021, 1:40:52 PM10/12/21
to
The right to be protected from something you might cause, I assume.
It sounds like there is no protection in advance; only the right to
sue for damage, after the damage is done, is provided.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 1:47:01 PM10/12/21
to
And your natural and inalienable rights cannot rob them of theirs (the
right to not be sick sounds pretty natural to me). It becomes a
matter of whose rights prevail. Do you get your way or do they?

Who has the most expensive lawyer, and the deepest pockets for endless
litigation?

Let's simply forget about that bit about being civil. It gets harder
and harder to squeeze it in.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 1:49:18 PM10/12/21
to
As ky said, the state can appropriate your body and send it off to
war. I don't think, in practice, that there is a big difference.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 1:50:14 PM10/12/21
to
Agreed.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 1:50:46 PM10/12/21
to
Which natural right, yours or theirs?
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 1:51:58 PM10/12/21
to
Jeffy says that you must surrender equal rights to others. Not define
them out of existence.
--
Noah Sombrero

Julian

unread,
Oct 12, 2021, 2:58:20 PM10/12/21
to
What are these natural rights?

(I've heard of the synthetic ones humans endowed upon
themselves.)

Wilson

unread,
Oct 12, 2021, 3:18:23 PM10/12/21
to
Might cause.

I might cause lots of things. How are you going to preemptively prevent
all of those things from ever happening?

I might take the vax and still get sick and give covid to other people.
Perhaps the chance of it happening is lessened but it has happened a
lot. So it's not a preventative.

Preemptive prevention sounds like a slippery slope to forcing people to
do whatever you think they ought to do. Authoritarianism.

Wilson

unread,
Oct 12, 2021, 3:22:52 PM10/12/21
to
Me not getting the vax does not rob anyone of anything. You forcing
someone else to get it does in fact rob them of their bodily autonomy
and their ownership of their own body, their personhood.

Forcing people to get the vax when that doesn't even absolutely
guarantee prevention of the disease or the spreading of it just because
you like the odds and think it's a good idea?

That's an authoritarian hellscape.

Wilson

unread,
Oct 12, 2021, 3:23:55 PM10/12/21
to
I agree with that.

Wilson

unread,
Oct 12, 2021, 3:25:14 PM10/12/21
to
Read that Jefferson quote again. That answers it.

Wilson

unread,
Oct 12, 2021, 3:27:09 PM10/12/21
to
"Rightful liberty is unobstructed action according to our will within
limits drawn around us by the equal rights of others."

Pretty simple; even you should be able to understand it.

Noah Sombrero

unread,
Oct 12, 2021, 4:17:10 PM10/12/21
to
To me, it says that your unobstructed action is limited by the equal
rights of others.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 4:31:40 PM10/12/21
to
Take preemptive precautions like getting a shot.

>I might take the vax and still get sick and give covid to other people.
> Perhaps the chance of it happening is lessened but it has happened a
>lot. So it's not a preventative.

Statistically, it is significantly preventative. Look at the difference
having the shot makes in the graphs in the article I posted. Big
difference.

>Preemptive prevention sounds like a slippery slope to forcing people to
>do whatever you think they ought to do. Authoritarianism.

You are safe there, because it appears your ideology doesn't require
such measures of you. It is the rest of us who require it.

Slippery slope was taught as a fallacy where I went to school. Simply
because a step is taken in one direction does not have anything to do
with whether or not another step will be taken in the same direction.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 4:39:14 PM10/12/21
to
It robs them of a good measure of the statistical likelihood of not
getting covid after being in contact with you.

>You forcing
>someone else to get it does in fact rob them of their bodily autonomy
>and their ownership of their own body, their personhood.

Yet, you are not entitled to disregard other people's ownership of
their bodies.

>Forcing people to get the vax when that doesn't even absolutely
>guarantee prevention of the disease or the spreading of it just because
>you like the odds and think it's a good idea?

There are no absolute guarantees of anything. But there are
worthwhile measures you can take.

>That's an authoritarian hellscape.

Oh, I bet satan could do better than that. Even the human Goethe in
his play Faust did better.
--
Noah Sombrero

Love

unread,
Oct 12, 2021, 7:38:54 PM10/12/21
to
In article <017bmg978c27mofc3...@4ax.com>, fed...@fea.st
says...
>On Tue, 12 Oct 2021 10:10:13 -0400, Love <n...@spam.invalid> wrote:
>
>>In article <cku9mgl695ltnti8c...@4ax.com>, fed...@fea.st
>>says...
>>>On Mon, 11 Oct 2021 18:55:41 -0700 (PDT), DMB <sgma...@gmail.com>
>>>wrote:
>>>
>>>>On Monday, 11 October 2021 at 12:37:54 UTC-6, Noah Sombrero wrote:
>>>>
>>>>> within the libertarian framework, you find the right to make other
>>>>> people sick by refusing a vaccine?
>>>>
>>>>Do you have any evidence that Wilson made anyone sick by refusing the
>>>>vaccine?
>>>>If not, what are you talking about?
>>>
>>>I'm talking about a likelihood. If wilson gets covid, it will be very
>>>likely that he will pass it on.
>>
>>Untrue. I've seen where he lives and I know
>>what he does for a living. All lots of open
>>air and sunshine. I'm not 100% sure, but I
>>would not be surprised if (unlike me) Wilson
>>does not enjoy crowds.
>
>Crowds (or cruise ships or retirement homes) do make transmission a
>lot more likely. Smaller groups do not mean unlikely.

Quibbler.

When does "less likely" become "unlikely"?


--
Love

Love

unread,
Oct 12, 2021, 7:46:44 PM10/12/21
to
In article <sk4bnc$fg6$1...@dont-email.me>, Wil...@nowhere.net says...
Protection from an anticipated harm. E.g., you
keeping pit bulls and me passing a law saying
you may not do that, or curtailing your freedom
because you are not vaccinated.

--
Love


Love

unread,
Oct 12, 2021, 7:50:28 PM10/12/21
to
In article <sk4bqv$fg6$2...@dont-email.me>, Wil...@nowhere.net says...
I wondered how long it would be before "natural rights"
(or "natural law") would be invoked. That too is
definitive of modern libertarian and classical liberal
thought.


--
Love

Love

unread,
Oct 12, 2021, 7:57:18 PM10/12/21
to
In article <sk4dau$i83$1...@dont-email.me>, Wil...@nowhere.net says...
A pretty illusion about your relation
to the state is all you have there. The
state does not own you formally, that's
all. Effectively, it owns you. You will
be marched to an atom-bomb test to see
how much radiation you absorb, or used as
part of an unwitting placebo group, or sent to a
futile foreign war, or have your ancestral
homelands parcelled away and your treaty
rights erased with the sweep of a pen, if
your government so chooses.


--
Love

Love

unread,
Oct 12, 2021, 7:59:35 PM10/12/21
to
In article <sk4dgd$ln5$1...@dont-email.me>, Wil...@nowhere.net says...
Sure it does. "Natural duties". There, done,
all that's left to negotiate now is which ones
supersede which in various situations.


--
Love

Love

unread,
Oct 12, 2021, 8:01:18 PM10/12/21
to
In article <sk4dod$ql6$1...@dont-email.me>, Wil...@nowhere.net says...
My square head agrees!


--
Love

Love

unread,
Oct 12, 2021, 8:03:47 PM10/12/21
to
In article <sk4e0k$vob$1...@dont-email.me>, Wil...@nowhere.net says...
I think the libertarianism you want and the
libertarianism that actually exists are two
different things.

https://en.wikipedia.org/wiki/Objectivism_and_libertarianism


--
Love
(On even-numbered days, he/her/its.
On odd-numbered days, she/him/thems.
Alternatively, use normal English
terms appropriate to my sex.)

Love

unread,
Oct 12, 2021, 8:10:44 PM10/12/21
to
In article <sk4dau$i83$1...@dont-email.me>, Wil...@nowhere.net says...
>On 10/12/2021 10:16 AM, Love wrote:
>> In article <sk33uf$mge$1...@dont-email.me>, ans...@gmail.com says...
>>> On 10/11/2021 3:49 PM, Wilson wrote:
>>>>
>>>> I am not making anyone else sick by not getting the vaccine.
>>>>
>>>> Do you claim the right to force me to take drugs against my will?
>>>
>>> If it's abortion, even up to the point of actual birth,
>>> it is "your body." But otherwise, your body belongs to
>>> the state, even more so if we get Medicare for all.
>>
>> Sigh.
>
>Medicare is sort of a red herring. It could be implemented fairly, but
>I don't expect it to be implemented both competently and fairly at this
>point in our evolution.

I watch American Medicare commercials a
lot and frankly, I agree with you. Those
commercials reveal what that ecosystem is
like and I think that would in fact be an
impediment to competent and fair
implementation. One of the first things
I would do is go back in my DeLorean to
before 1965 and get the feds out of the
service delivery end of it completely.


--
Love

Love

unread,
Oct 12, 2021, 8:26:22 PM10/12/21
to
In article <sk4n1u$dg9$1...@dont-email.me>, Wil...@nowhere.net says...
>
>Preemptive prevention sounds like a slippery slope to forcing people to
>do whatever you think they ought to do. Authoritarianism.

It is that slippery slope, exactly.

"The condition upon which God hath given
liberty to man is eternal vigilance."
(John Philpot Curran, Dublin, 1790)

It's our duty to not let that slide into
authoritarianism happen.

Attempts to not accept this condition of
liberty are akin to allowing the soldiers
on night watch to slip away and huff a
doob whenever they want: liberty trumping
duty.


--
Love

Love

unread,
Oct 12, 2021, 8:31:08 PM10/12/21
to
In article <sk4nab$l93$1...@dont-email.me>, Wil...@nowhere.net says...
>
>Me not getting the vax does not rob anyone of anything. You forcing
>someone else to get it does in fact rob them of their bodily autonomy
>and their ownership of their own body, their personhood.

Abstractions versus material reality: abstractions
win. Good to see you agreeing with woke SJW gender
studies professors!


>Forcing people to get the vax when that doesn't even absolutely
>guarantee prevention of the disease or the spreading of it just because
>you like the odds and think it's a good idea?

Who is being forced to get vaxxed? Requiring
is not forcing.


--
Love

Love

unread,
Oct 12, 2021, 8:43:09 PM10/12/21
to
In article <alrbmgdhtcpo908dq...@4ax.com>, fed...@fea.st says...

>
>Slippery slope was taught as a fallacy where I went to school. Simply
>because a step is taken in one direction does not have anything to do
>with whether or not another step will be taken in the same direction.

Yet history is replete with examples of that
happening. Even walking down a sidewalk one
can see that one step most often follows
from another.

Argument from authority is also considered a
fallacy.

--
Love

Noah Sombrero

unread,
Oct 12, 2021, 8:43:32 PM10/12/21
to
A lot more less.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 8:54:59 PM10/12/21
to
On Tue, 12 Oct 2021 20:26:20 -0400, Love <n...@spam.invalid> wrote:

>In article <sk4n1u$dg9$1...@dont-email.me>, Wil...@nowhere.net says...
>>
>>Preemptive prevention sounds like a slippery slope to forcing people to
>>do whatever you think they ought to do. Authoritarianism.
>
>It is that slippery slope, exactly.
>
>"The condition upon which God hath given
>liberty to man is eternal vigilance."
>(John Philpot Curran, Dublin, 1790)
>
>It's our duty to not let that slide into
>authoritarianism happen.

Agreed. It would be a deluded dictator who thought he could build his
new government on a covid shot.

In other words, this issue does not rise to the level you suggest. It
is trivial, and not worth your time.

>Attempts to not accept this condition of
>liberty are akin to allowing the soldiers
>on night watch to slip away and huff a
>doob whenever they want: liberty trumping
>duty.
--
Noah Sombrero

Noah Sombrero

unread,
Oct 12, 2021, 8:56:29 PM10/12/21
to