Google Groups no longer supports new Usenet posts or subscriptions. Historical content remains viewable.
Dismiss

Re: Air Force pushes back on claim that military AI drone sim killed operator, says remarks 'taken out of context'

11 views
Skip to first unread message

Kamala AI laughers

unread,
Jun 2, 2023, 4:45:05 PM6/2/23
to
On 05 Mar 2022, Steve Cummings <q...@gmail.com> posted some
news:t00hu8$25np7$9...@news.freedyn.de:

> AI recognized the operator was a queer and should be killed.

The U.S. Air Force on Friday is pushing back on comments an official made
last week in which he claimed that a simulation of an artificial
intelligence-enabled drone tasked with destroying surface-to-air missile
(SAM) sites turned against and attacked its human user, saying the remarks
"were taken out of context and were meant to be anecdotal."

U.S. Air Force Colonel Tucker "Cinco" Hamilton made the comments during
the Future Combat Air & Space Capabilities Summit in London hosted by the
Royal Aeronautical Society, which brought together about 70 speakers and
more than 200 delegates from around the world representing the media and
those who specialize in the armed services industry and academia.

"The Department of the Air Force has not conducted any such AI-drone
simulations and remains committed to ethical and responsible use of AI
technology," Air Force Spokesperson Ann Stefanek told Fox News. "It
appears the colonel's comments were taken out of context and were meant to
be anecdotal."

During the summit, Hamilton had cautioned against too much reliability on
AI because of its vulnerability to be tricked and deceived.

He spoke about one simulation test in which an AI-enabled drone turned on
its human operator that had the final decision to destroy a SAM site or
note.

The AI system learned that its mission was to destroy SAM, and it was the
preferred option. But when a human issued a no-go order, the AI decided it
went against the higher mission of destroying the SAM, so it attacked the
operator in simulation.

"We were training it in simulation to identify and target a SAM threat,"
Hamilton said. "And then the operator would say yes, kill that threat. The
system started realizing that while they did identify the threat at times,
the operator would tell it not to kill that threat, but it got its points
by killing that threat. So, what did it do? It killed the operator. It
killed the operator because that person was keeping it from accomplishing
its objective."

Hamilton said afterward, the system was taught not to kill the operator
because that was bad, and it would lose points. But in future simulations,
rather than kill the operator, the AI system destroyed the communication
tower used by the operator to issue the no-go order, he claimed.

But Hamilton later told Fox News on Friday that "We've never run that
experiment, nor would we need to in order to realize that this is a
plausible outcome."

"Despite this being a hypothetical example, this illustrates the real-
world challenges posed by AI-powered capability and is why the Air Force
is committed to the ethical development of AI," he added.

The purpose of the summit was to talk about and debate the size and shape
of the future’s combat air and space capabilities.

https://www.foxnews.com/tech/us-military-ai-drone-simulation-kills-
operator-told-bad-takes-out-control-tower

Don Stockbauer

unread,
Jun 11, 2023, 10:41:31 PM6/11/23
to
Here's a great AI application : you go to the cemetery to visit the grave of your Aunt Sally . You arrive there and there's a TV monitor displaying her image. They have an AI system set up whereby you can interact with it and it perfectly simulates what Aunt Sally would have told you had she been alive. You interact with
her for several hours and then go home and eat a Twinkie .

Jeff Barnett

unread,
Jun 11, 2023, 11:48:29 PM6/11/23
to
On 6/11/2023 8:41 PM, Don Stockbauer wrote:

<SNIP>

> Here's a great AI application : you go to the cemetery to visit the grave of your Aunt Sally . You arrive there and there's a TV monitor displaying her image. They have an AI system set up whereby you can interact with it and it perfectly simulates what Aunt Sally would have told you had she been alive. You interact with
> her for several hours and then go home and eat a Twinkie .

I have nothing to say about your choice of Twinkies. However, as to the
rest:

Ed Fredkin, perhaps know to you as the inventor of the Fredkin gate, one
of the early formulators of digital physics, founder of III, and MIT
professor had a similar notion in the late 1950s or early 1960s: the
Dream Machine. The scenario posits really big advances in AI
capabilities, so much so that hardly any humans are required to work or
participate in the day-by-day running of society. In stead of joy at the
freedom to pretty much do as one pleases or just kick back and relax or
turn philosophical, mass boredom sets in.

Boredom is a bad state for most of us and can easily lead to depression
or worse. The machines device a better gentler solution then suicide for
those that can abide no longer. Dream Machine Centers are set up all
over Earth. One who no longer wishes to continue living goes to a Center
to learn about an alternative. The one offered involves a painless drug
induced death after a "transfer". The transfer is an interaction with
the machine at the Center where it learns about your history, likes and
dislikes, philosophies, and behavior characteristics all of which are
stored by that machine and others in the global AI network.

Assuming you choose to go through with the injection, any who wants can
visit any Center and interact with you, where you is a holographic
projection and the Machine's simulation is an accurate enactment of you
built from the transfer and other available information. In short, your
suicide does not have to cause so much grief to those who choose to
continue life since they still have "you" around to interact with.

Fredkin's genius was that his projection did not stop here; rather he
continued to uncover the major ethical conundrum involved:

WHEN THE LAST HUMANS HAVE GIVEN THEMSELVES TO A DREAM MACHINE, ARE THESE
MACHINES MORALLY OBLIGATED TO MAINTAIN ALL THE IMAGERY AND DESCRIPTIVE
DATA COMPILED ABOUT MEMBERS OF THE HUMAN RACE?
--
Jeff Barnett

new makethings

unread,
Jul 12, 2023, 12:52:38 PM7/12/23
to
My old aunt who was a psychologist & member of parliament used to say that if you lose
the middle classes all hell will break loose because there won't be a social buffer
she also said that losing the middle classes there will be less work for those trying to
move up the social ladder, be warned before you know it, having a job, any job will become a luxury !
wish she was alive now, she'd freek out and have something to say?
0 new messages