Hi Avanti and Carole,
I am uncertain how this note from Avanti comes by way of GCPwork, but I will respond inline. Sorry for being so slow getting back to you, Avanti.
Roger
On 4/7/2024 5:12 PM, Carole Aubin wrote:
Collective Consciousness. I see this when there are Big News events, I see this in the Lottery. A random number generator.
I was thinking about the methodology of the GCP and in particular the XOR mask that is applied to the raw bitstream generated by the random number generator in order to guarantee that the expected value of the resulting bitstream is exactly 0.5. The Global Consciousness Project proved the existence of correlations between the final RNG outputs, not between the raw RNG outputs, which means that the quantum tunneling processes that provided the raw bitstream were conditioned on the XOR mask that would be applied to it. In other words, a constraint was applied to the physical processes behind the quantum tunneling such that RNGs at different locations would produce similar output only after XORing with a bitmask, and if you replace "output" with "measurement", you get the same type of constraint that applies in quantum entanglement: a coupling between measurements.
It may be a language issue, but it is not correct to say the RNGs produce similar output only after XORing. I would say that the RNGs produce correlated output after XORing (or even despite the XORing). In other words, the data from separated RNGs become correlated even with the constraints of the XOR. We have no interest in the final bitstream being correlated with the raw noise. Rather, the XORing is designed to remove first-order bias of the mean of the data. The actual experimental protocol is aimed at changing the final, post-XOR data sequences in a similar way across devices such that correlations between devices might be observed. And this is exactly what we observe.
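
To make the bias-removal step concrete, here is a minimal sketch in Python (my own toy illustration, not device firmware; the actual REGs apply their own particular mask patterns):

    import numpy as np

    rng = np.random.default_rng(seed=42)

    # Simulated "raw" physical bit source with a first-order bias:
    # each bit is 1 with probability 0.52 rather than 0.50.
    raw_bits = (rng.random(1_000_000) < 0.52).astype(np.uint8)

    # A deterministic alternating mask (0, 1, 0, 1, ...). XORing flips
    # every other bit, so a constant bias toward 1 cancels out of the
    # mean: the output mean is (p + (1 - p)) / 2 = 0.5 for any fixed p.
    mask = (np.arange(raw_bits.size) % 2).astype(np.uint8)
    masked_bits = raw_bits ^ mask

    print(f"raw mean:    {raw_bits.mean():.4f}")    # ~0.52, biased
    print(f"masked mean: {masked_bits.mean():.4f}") # ~0.50, bias removed

The point is only that the XOR pins down the expected mean; it says nothing about whether the post-XOR sequences from separated devices are correlated, which is what the experiment actually tests.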
This in turn suggests that the "hidden variables" of quantum mechanics don't specify individual measurements; they specify constraints on the outcomes, consistent with the retrocausal interpretation of quantum entanglement.
In other words, "fate" exists and it is authored by our free will/consciousness, which retrocausally influences events so that particular coincidences occur. By extension, could we argue that even our laws of physics are a retrocausal explanation for the universe designed by the "free will" of the collective consciousness? I wouldn't be surprised if that is the reason for the recently discovered antipodal duality (https://www.wired.com/story/particle-physicists-puzzle-over-a-new-duality/): the laws of physics must be set up to allow coincidences to occur, and this manifests as a duality between seemingly unrelated aspects of physics.
It's a pretty good story, but there is an underlying assumption that there must be a change in the electron flow or the tunneling parameters, or something in the diodes (or FETs). I think the anomalous effects may need to be recognized as occurring at a higher or more comprehensive level. The candidate that comes most easily to mind is the statistical ensemble level. I know most people say: but, but, there has to be a physical process, a change in the electron count, etc. But I think that is ultimately misleading -- much too strong a constraint imposed by our experiential learning and our physics conditioning.
What do you think? As an aside, I prepared a writeup on the stats used in the GCP that I think is clearer than the writeup on the website - in particular it clarifies the two very different contexts in which the "Stouffer's Z" is used. I would be happy to help update the writeup on the website if desired.
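
For reference, the Stouffer combination itself is the same simple formula in any context, Z = (z_1 + ... + z_N) / sqrt(N); the two contexts Avanti mentions presumably differ in what the individual z_i represent (e.g., per-device scores within an event versus per-event scores across the database). A minimal sketch with made-up numbers, assuming N independent standard-normal z-scores under the null:

    import numpy as np
    from scipy.stats import norm

    def stouffer_z(z_scores):
        """Combine independent z-scores into one composite Z.
        Under the null each z_i ~ N(0, 1), so their sum has variance N;
        dividing by sqrt(N) restores unit variance."""
        z = np.asarray(z_scores, dtype=float)
        return z.sum() / np.sqrt(z.size)

    # Hypothetical per-device z-scores for a single event window:
    z_event = [0.8, -0.3, 1.2, 0.5]
    Z = stouffer_z(z_event)
    print(f"composite Z = {Z:.3f}, one-tailed p = {norm.sf(Z):.3f}")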