
Chile plans to regulate all neurotech and ban the sale of brain data


FBInCIAnNSATerroristSlayer

Dec 23, 2021, 5:48:32 PM

Dumb fucking LOW IQ clown John Hall and his coterie of low iq rsc fans
will say Chile is schizophrenic paranoid BECAUSE some stranger Dhruv aka
Bobby Smith aka Bill Pollock aka Muriel Mckay aka Loughall Tomartyr aka
whatever shitty name it calls itself with the same email id
tsp...@gmail.com SAID SO.

John Hall didn't even think about WHY these STALKERS use different names
with the same email id, RESPOND ONLY to my posts, but NEVER discuss
cricket.

These EVIL Intelligence Agencies CIA NSA MI6 MI5 ASIS ASIO Psychopaths
"themselves BRAGGED" MANY TIMES that HALF the POSTS on rsc are from
their OPERATIVES ACROSS the GLOBE, and YET Genius John Hall RESPONDS to
the same Psychopaths WHO SECRETLY CHIPPED HIM more than 10 years ago
and have been RECORDING all his private thoughts, emotions and memories
in the GCHQ and NSA Quantum Computer AI HIVE Global Information Grid.




More than half the posts on rsc are from our operatives across the globe
https://imgur.com/4B1fzDg


Death Threats to me from CIA NSA Psychopaths
https://imgur.com/gallery/3uM2AxP





John Hall is really a GENIUS and ALL rsc'ers should WORSHIP and LISTEN
to his WISDOM.




=======================================================================

https://spectrum.ieee.org/neurotech-neurorights

Worldwide Campaign for Neurorights Notches Its First Win

Chile plans to regulate all neurotech and ban the sale of brain data

The government of Chile is taking a stand: Its citizens must be
protected from technologies that are capable of mind control, mind
reading, or any other nefarious interference with their brains. While
such concerns used to be relegated to conspiracy-theory chat rooms and
science fiction, now they’re subject to debate by senators. Thanks to a
constitutional amendment that was passed by the National Congress of
Chile and signed by the president, the people of Chile are the first in
the world to be granted a new kind of human rights—“neurorights”—which
advocates say are made necessary by rapid advances in neurotechnology.

Neurotech includes brain implants that can read information from the
brain, translating its electrical signals into, for example, movement
commands for a prosthetic arm. Other implants change the brain by
stimulating specific regions with electrical pulses. Such implanted
stimulators are currently approved for only a few medical conditions,
but Elon Musk has claimed that his neurotech company, Neuralink Corp., is
developing implants that may one day be used by everyday people to
enhance their cognitive abilities.
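
To make the decoding idea in the preceding paragraph concrete, here is a
minimal illustrative sketch, not taken from the article and not any
particular company's method, of the common approach of mapping recorded
firing rates to a movement command with a linear decoder; every number,
channel count, and weight below is invented for illustration.

import numpy as np

# Illustrative sketch only: a linear decoder mapping neural firing rates
# recorded from an implanted electrode array to a 2-D movement command
# (e.g., cursor or prosthetic-arm velocity). All values are made up.

rng = np.random.default_rng(seed=0)

n_channels = 96                                   # hypothetical electrode count
firing_rates = rng.poisson(lam=10, size=n_channels).astype(float)  # spikes/s in one bin

# In practice the weights W are fit from calibration data, e.g. by
# regressing observed movement velocities onto recorded firing rates.
W = rng.normal(scale=0.01, size=(2, n_channels))  # maps rates -> (vx, vy)

velocity = W @ (firing_rates - firing_rates.mean())   # decoded (vx, vy)
print("Decoded velocity command:", velocity)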

There are also a host of noninvasive technologies that can record from
or stimulate the brain, some of which are approved for medical use.
Other companies sell noninvasive neurotech directly to consumers for
applications such as meditation, focus, and sleep; these devices need
only meet the safety standards that govern consumer gadgets, not the far
stricter regulations for medical devices regarding both safety and proof
of clinical benefit.

Chile’s congress is currently considering a bill that goes beyond the
constitutional amendment’s broad declaration of principles. The
“neuro-protection” bill mandates that all neurotech devices be subjected
to the same regulations as medical devices, even if they’re intended for
consumer wellness or entertainment. It also states that neural data will
be considered equivalent to a human organ—which would prohibit the
buying or selling of such data.

“Neuroscience is not just another field of knowledge,” Senator Guido
Girardi, the lead sponsor of the bill, tells IEEE Spectrum in an email.
“It’s similar to what atomic energy was in the 1950s. It may be used to
develop a better society, but also to create weapons against humanity.”
Girardi says he hopes that Chile will be an example for the world and
that other nations and international agencies will adopt comparable
regulation.

Indeed, 2022 may be the year that neurorights becomes a hot topic,
bringing the young neurotech industry and the human rights community
into uncomfortable conversations. Spain’s new Digital Rights Charter
includes a section on neurorights, and while it’s a nonbinding
framework, it may inspire new legislation. The United Nations’
Secretary-General is also interested; his ambitious agenda, published
last September, stated that it’s time to “update our thinking on human
rights,” and included neurotechnology in a list of “frontier issues” to
be considered. The debate is even coming to the big screen: Werner
Herzog, the German film director, is expected to premiere a film about
neurorights, Theater of Thought, sometime in 2022.

[Image] A man wears a silver helmet-like device marked with a K: the
Flow headset from Kernel uses near-infrared light to measure blood flow
in the brain. Kernel is currently selling the device to researchers, but
the company is also developing a consumer model. Photo: Kernel

While some neuroscientists and bioethicists support the global campaign,
others say Chile is setting a problematic example for the world, and
that its rushed regulations haven’t been properly thought through.
Concepts such as “brain data” need to be clarified, critics say, because
a broad definition could include behavioral data that reflects what’s
going on in a person’s mind, which many companies already collect.

The debate can quickly get philosophical: Do people have fixed mental
identities throughout their lives? Does anyone have free will? And what
do the squiggly patterns of electrical activity that can be recorded
from a person’s brain reveal about them? Rafael Yuste, cofounder of the
NeuroRights Foundation, in New York City, believes that the technology
is forcing such questions upon us. “This is something that affects the
essence of what it means to be human,” he says.

The NeuroRights Foundation can claim much of the credit for the
developments in Chile, Spain, and the U.N. Yuste, a professor of biology
at Columbia University who studies neural circuitry, has been promoting
the idea of neurorights for nearly a decade now.

He first raised the issue through his involvement with the U.S. Brain
Research through Advancing Innovative Neurotechnologies (BRAIN)
Initiative, a US $100 million research effort announced by President
Barack Obama in 2013. Yuste next convened a group of neuroscientists,
clinicians, ethicists, and engineers to come up with ethical priorities
for neurotechnology, which they published in a 2017 Nature paper. With
his colleagues at the foundation, he has worked closely with the
policymakers who have made the first moves on neurorights. Yuste says
he’s been driven by the implications of his own scientific research:
“We’re decoding perceptions and memory in mice,” he says, “so it’s just
a matter of time until this happens in humans.”

The 5 Neurorights

1. The right to mental privacy

2. The right to personal identity

3. The right to free will

4. The right to equal access to mental augmentation

5. The right to protection from bias

The foundation has delineated five basic neurorights, starting with the
right to mental privacy. Medical and consumer neurotech devices collect
the most intimate kind of data about us, Yuste says; even if current
technologies can decode only a small fraction of it, the data may become
increasingly revealing as the technology improves. The next two rights
protect against the misuse of neurotech that stimulates the brain and
alters its activity: People should have the right to maintain their
personal identity and to exercise free will. The final two rights are
broader guidelines for society: People should have equal access to
mental-augmentation technologies, and those technologies should be free
from algorithmic bias that makes them work better for certain groups.

Legal scholars working with the NeuroRights Foundation say the right to
mental privacy is under the most imminent threat. Staff attorney
Stephanie Herrmann of Perseus Strategies, a law firm specializing in
international human rights, points to a few articles in recent years
that have raised alarms about new kinds of neuro-surveillance. One
report from the South China Morning Post highlighted a manufacturing
company that was supposedly using brain-scanning headsets to monitor its
workers’ emotional and cognitive states, while another article from that
publication showed schoolchildren wearing headbands that indicated
whether they were paying attention to their lesson.

“All of these technologies are so far ahead of where we are in our
thinking about them,” Herrmann tells IEEE Spectrum. In an article
published in the journal Horizons, Herrmann, Yuste, and Perseus
Strategies director Jared Genser argue that the U.N. should set global
standards for neurorights, paving the way for nations to pass their own
laws. “Regulations are very much part of the future,” Herrmann says,
“but establishing an international framework for thinking about how to
regulate is a good start.”

Herrmann also notes that human-rights laws often protect individuals
against harmful actions by the state, and says that it’s easy to
envision misuse of neurotech by governments. Beyond the potential for
surveillance, she notes that a 2020 U.N. report on psychological torture
contained a discussion of emerging technologies that could be used to
inflict new kinds of pain and suffering, naming neurotechnology as one
to watch. Torturers could alter a victim’s subjective experience of
pain, Herrmann suggests, or interfere with their sense of autonomy.

Yuste worries more about the private companies that are now pouring
money into neurotech R&D, particularly those that sell directly to
customers and are regulated only as consumer electronics. He notes that
many neurotech companies own the data that they extract from users’
brains. “The company is free to decode the data, to sell it, to do
whatever they want with it,” he says. Do you feel uncomfortable when you
consider how much Facebook knows about you based on your online
activity? Now imagine if the company had your brain data as well.


Now let’s talk about hype. Critics say that news reports like those in
the South China Morning Post vastly overstate the current technology’s
capabilities, potentially causing hysteria. “People are being swept up
in the hype around how scary these things are,” says Karen Rommelfanger,
founder of Emory University’s neuroethics program and the new nonprofit
Institute of Neuroethics.

External headsets, like those supposedly worn by workers and students in
China, provide fairly crude types of data decoding or stimulation. The
most powerful and high-fidelity neurotech devices are those implanted in
the brain, but even implants are far from being able to read someone’s
thoughts or force them to act against their will. For example,
researchers at the University of California, San Francisco, have done
pioneering work with implants that can decode words from the brains of
stroke patients who have lost the ability to speak, but their latest
study used a vocabulary set of only 50 words. Facebook had helped fund
that research as part of its effort to build a brain-computer interface
for consumers that would translate “intended speech” into text, but in
July the company announced that it was abandoning that effort.

Rommelfanger is strongly in favor of national and international
discussions of neuroethics, but she says the Chilean efforts on
neurorights were rushed and didn’t incorporate enough local input. “If
you dig into the local literature, you’ll see that philosophers,
clinicians, lawyers, and even digital-rights groups have all offered
critiques of the laws.” She says that some Chilean legal and medical
experts have raised concerns about turning broad principles into clear
rules. For example, she asks, “What does it mean to have psychic
continuity?” Some could argue that giving a depressed person
antidepressant medication changes who they are—hence, she says, the
concerns from medical groups that the neuro-protection law could hamper
their ability to treat patients.

Rommelfanger thinks that the approach taken by the Chilean bill for
neuro-protection is too heavy-handed; she worries that by regulating
all neurotech as medical devices, the country will stifle innovation and
prevent startups from bringing forth new devices that will help people.
And Chile’s actions are getting international attention: “I’m afraid
that other governments are going to move too fast, like what Chile has
done, which will foreclose their opportunity to develop neurotech,” she
says. It might be wiser, she says, to start with a review of existing
human rights and biometric privacy laws around the world and to consider
whether those rules apply to the novel technology.

The entrepreneur Bryan Johnson, who founded the Los Angeles-based
neurotech company Kernel in 2016, agrees that overzealous regulation is
a threat to the young industry. Rafael Yuste “has said that he wants all
brain devices to be considered medical devices,” Johnson tells IEEE
Spectrum. “I think that would be a crushing blow to the industry.”
Johnson says it’s already quite hard and expensive to start a brain-tech
company that builds devices for consumers or scientists. “I funded this
company with $50 million of my own money,” he says. If every neurotech
device had to clear the regulatory hurdles required of medical devices,
such as proving efficacy in large-scale clinical trials, he believes the
expense would be crippling.

Kernel is currently selling its first noninvasive brain scanner to
neuroscientists, but Johnson says the company will have a consumer
product ready in 2024. The company has given a great deal of thought to
its privacy policy, Johnson says, which is centered around two
principles: Individuals should always provide full consent for how their
neural data will be used, and they should always have control of their
data. “We all have a shared interest in being good actors here,” Johnson
says. “If we don’t, they’re going to come in and regulate us.”

Facebook

Dec 23, 2021, 9:05:08 PM
In article <017xJ.163855$3q9.1...@fx47.iad>
FBInCIAnNSATerroristSlayer <FBInCIAnNSATe...@yahoo.com> wrote:

Flush.

FBInCIAnNSATerroristSlayer

Dec 24, 2021, 12:05:10 AM

BeamMeUpScotty

Dec 24, 2021, 2:12:29 AM
On 12/24/21 12:05 AM, FBInCIAnNSATerroristSlayer wrote:
> On 12/23/2021 5:56 PM, Facebook wrote:
>> In article <017xJ.163855$3q9.1...@fx47.iad>
>> FBInCIAnNSATerroristSlayer <FBInCIAnNSATe...@yahoo.com> wrote:
>>
>> Flush.
>


I think, therefore I am.

or

I am what I am, uk uk uk uk uk...
--
That's karma,

"We hold these truths to be self-evident, that all men are created
equal, that they are endowed by their Creator with certain unalienable
Rights, that among these are Life, Liberty and the pursuit of Happiness.
— That to secure these rights, Governments are instituted among Men,
*deriving their just powers from the consent* of the governed, — That
whenever any Form of Government becomes destructive of these ends, it is
the Right of the People to alter or to abolish it, and to institute new
Government,"

It would seem that *MANDATES* are NOT derived from the consent of the
governed. The Constitution doesn't delegate unlimited power to mandate
the governed be part of a medical experiment.