Myth of quantum computing


extra...@comcast.net

Sep 17, 2005, 7:55:43 PM9/17/05
to freeviews.science
Here are some simple computations about the limits of quantum computer
speed.

Start from the energy-time uncertainty principle:

dE <eV> * dT <sec> = h <eV*sec>


dT = 1 year = 365*24*60*60 <sec> =~ 3e7 <sec>

h =~ 4e-15 <eV*sec>

then:

dE =~ 1.3e-22 <eV>
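This step can be checked numerically; a minimal Python sketch (variable
names are mine, constants rounded as in the post):

```python
# Energy resolution dE available after dT of evolution,
# from the uncertainty relation dE * dT =~ h.
h_eV_s = 4.14e-15        # Planck constant in eV*sec
dT = 365 * 24 * 60 * 60  # one year in seconds, ~3.15e7

dE = h_eV_s / dT         # smallest resolvable energy step, in eV
print(f"dT = {dT:.2e} sec, dE = {dE:.2e} eV")
```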



How many levels are needed to represent all possible prime divisors of
N = 2**1000? Candidate divisors run up to sqrt(N) = 2**500, so:

L = 2**500 = (2**10)**50 =~ (1e3)**50 = 1e150

Resolving L distinct levels consumes an energy band dE*L =~ 1.3e128 <eV>


Let's say we use an electric potential of 1e6 <V>, so each electron
carries 1e6 <eV>.

How many electrons do we need to make such an energy band available?

N =~ 1.3e128 / 1e6 = 1.3e122 <electrons>

Each electron needs one proton to keep the matter neutral:

N =~ 1.3e122 <protons>

What would be the mass M of those protons?

M =~ 1.3e122 * 1.7e-27<kg> =~ 2.2e95 <kg>
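The chain from level count to mass can be replayed in a few lines of
Python (a sketch using the post's rounded constants, including its
round-off 2**500 =~ 1e150; all names are mine):

```python
dE = 1.3e-22        # eV per distinguishable level, from dE = h/dT above
L = 1e150           # the post's round-off of 2**500 candidate divisors
band = dE * L       # energy band needed to resolve all L levels, eV

per_electron = 1e6  # eV carried by one electron at a 1e6 V potential
electrons = band / per_electron
m_proton = 1.7e-27  # kg; one neutralizing proton per electron
mass_kg = electrons * m_proton

print(f"band = {band:.1e} eV, electrons = {electrons:.1e}")
print(f"mass = {mass_kg:.1e} kg")
```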


Now consider that the mass of the whole Earth is ~ 6e24 <kg>.


So we have our answer:

The mass of a quantum computer able to factor a 1000-bit number in a
year must be at least 3.7e70 times the mass of the Earth.

The mass of the whole known Universe is ~ 3e52 <kg>.

The mass of a quantum computer able to factor a 1000-bit number in a
year must be at least 7e42 times the mass of the whole Universe.
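And the final ratios, checked numerically (the Universe mass figure is
the post's own estimate):

```python
mass_qc = 2.2e95       # kg, mass of the hypothetical quantum computer
mass_earth = 6e24      # kg
mass_universe = 3e52   # kg, the post's estimate for the known Universe

print(f"{mass_qc / mass_earth:.1e} Earth masses")
print(f"{mass_qc / mass_universe:.1e} Universe masses")
```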


Hello! Anyone listening?

ES

extra...@comcast.net

Sep 26, 2005, 3:38:18 PM9/26/05
to freeviews.science
Let us put it this way:

A quantum computer can factor at most a ~614-bit number.

It would have to use all the Earth's matter, and would have to have
been working since the Big Bang.

Here is the proof.

dE * dT = h

E = dE * sqrt(N) = (h/dT) * sqrt(N)

E = m*c**2

m*c**2 = (h/dT) * sqrt(N)

dT*m*c**2/h = sqrt(N)

Using dT = 10**10 years, 1 year = 3*10**7 sec, m = 6*10**24 kg (the
whole Earth), c = 3*10**8 m/sec, h = 6.6*10**-34 J*sec:

sqrt(N) =~ 2.5*10**92 =~ 2**307 = 2**(lg(N)/2)

lg(N) =~ 614


This is the limit for any analog factoring device.
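A sketch of the same bound in Python, in SI units with c = 3*10**8
m/sec (variable names are mine):

```python
import math

h = 6.6e-34       # Planck constant, J*sec
dT = 1e10 * 3e7   # 10**10 years since the Big Bang, in seconds
m = 6e24          # kg, the whole Earth, converted to energy via E = m*c**2
c = 3e8           # speed of light, m/sec

sqrt_N = dT * m * c**2 / h    # from dT*m*c**2/h = sqrt(N)
bits = 2 * math.log2(sqrt_N)  # lg(N): the largest factorable bit length

print(f"sqrt(N) =~ {sqrt_N:.1e}, lg(N) =~ {bits:.0f}")
```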

ES
