A quantum computer can factor a number of at most about 600 bits. It would
have to use all the matter of the Earth, and would have to have been working
since the Big Bang.
Here is the argument.
The energy-time uncertainty relation bounds the minimum energy of an
elementary operation carried out within time dT:

  dE * dT = h

Factoring N by trial division takes about sqrt(N) operations, so the total
energy required is

  E = dE * sqrt(N) = (h/dT) * sqrt(N)

The energy available to the device is at most its rest energy, E = m*c**2,
so

  m*c**2 = (h/dT) * sqrt(N)
  dT*m*c**2/h = sqrt(N)
Using dT = 10**10 years, 1 year = 3*10**7 sec (so dT = 3*10**17 sec),
m = 6*10**24 kg (the mass of the whole Earth), c = 3*10**8 m/sec,
h = 6.6*10**-34 J*sec:

  sqrt(N) = 3*10**17 * 6*10**24 * 9*10**16 / 6.6*10**-34
          ~ 2.5*10**92 ~ 2**307

so

  lg(N) ~ 614
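The arithmetic can be checked with a few lines of Python, plugging in the
constants quoted above (a quick sanity check, not part of the argument
itself):

```python
import math

# Constants from the argument above (SI units).
h = 6.6e-34        # Planck's constant, J*s
c = 3.0e8          # speed of light, m/s
m = 6.0e24         # mass of the Earth, kg
dT = 1e10 * 3e7    # 10**10 years, at 3*10**7 seconds per year

# dT * m * c**2 / h = sqrt(N)
sqrt_N = dT * m * c**2 / h

# lg(N) = 2 * lg(sqrt(N))
bits = 2 * math.log2(sqrt_N)

print(f"sqrt(N) ~ {sqrt_N:.2e}, lg(N) ~ {bits:.0f}")
# prints: sqrt(N) ~ 2.45e+92, lg(N) ~ 614
```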
This is the limit for any analog factoring device.
ES