Quantum Computer Factoring


John Clark

Dec 14, 2019, 7:13:48 AM
to everyth...@googlegroups.com
Just a few years ago the best a Quantum Computer could do was figure out that the factors of 15 are 3 and 5, but things have improved. Very recently a Quantum Computer figured out that the factors of 1,099,551,473,989 are 1,048,589 and 1,048,601.


John K Clark 

Philip Thrift

Dec 14, 2019, 7:52:17 AM
to Everything List
There are some engineers who think that although quantum processes do occur in the quantum computers of the various types being built, none will ever turn out to be useful (in doing really big tasks - whether factoring huge integers or finding minimum-distance traveling-salesman paths). On big problems they will just return random answers that are only "close" at best.


@philipthrift

John Clark

Dec 14, 2019, 11:56:05 AM
to everyth...@googlegroups.com
On Sat, Dec 14, 2019 at 7:52 AM Philip Thrift <cloud...@gmail.com> wrote:

> There are some engineers who think that although quantum processes do occur in the quantum computers of the various types being built, none will ever turn out to be useful (in doing really big tasks - whether factoring huge integers or finding minimum-distance traveling-salesman paths). On big problems they will just return random answers that are only "close" at best.

That's true for the sort of machines D-Wave makes, which use Quantum Annealing. That's good for some specialized problems, but Quantum Annealing is not Turing Complete and D-Wave never claimed it was; devices like that might not be able to find the perfect solution to the traveling salesman problem, but they could find a very good one, and that is nothing to sneeze at. However, to run Peter Shor's factoring algorithm you need a Quantum Computer that is Turing Complete, and those are the types of machines that IBM, Google, and others make. As far as I know D-Wave is the only company that makes Quantum Annealers.

John K Clark

smitra

Dec 14, 2019, 12:28:49 PM
to everyth...@googlegroups.com
> Quantum computer sets new record for finding prime number factors [1]
>
> John K Clark

Interesting, just a pity that they happened to choose a number this
large with two prime factors that are so close to each other. There
exists a very simple factoring method that works well for such numbers,
based on the series expansion of the square root. If a number n has a
factorization n = p q with p and q prime numbers close to each other,
then with a = (p+q)/2 and u = (p-q)/2 we can write

n = (a+u)(a-u) = a^2 - u^2

Taking the square root and expanding in powers of u gives:

sqrt(n) = a [1 - 1/2 u^2/a^2 - 1/8 u^4/a^4 + ...] = a - 1/2 u^2/a - ...

If u^2/(2 a) < 1, then you can find the value of a just by computing the
integer part of the square root and adding 1; the factorization of n
then follows immediately. The condition u^2/(2 a) < 1 means that this
method will work when the difference between the primes is of the order
of n^(1/4), so for numbers of the order of 10^12 the method will start
to fail when the difference between the prime numbers is of the order of
1000.

In this case, the integer part of sqrt(n) is 1048594, so a = 1048595.
This means (assuming that the method works) that

u = sqrt(1048595^2 - n) = 6

and from this we find p = 1048595 + 6 = 1048601 and q = 1048595 - 6 = 1048589
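
In Python this is only a few lines (a minimal sketch of the method; math.isqrt computes the integer square root):

import math

def fermat_factor(n):
    # Search for a such that a^2 - n is a perfect square; then n = (a-u)(a+u).
    a = math.isqrt(n)
    if a * a < n:
        a += 1                             # smallest a with a^2 >= n
    while True:
        u2 = a * a - n
        u = math.isqrt(u2)
        if u * u == u2:                    # found u^2 = a^2 - n
            return a - u, a + u
        a += 1                             # only a few steps when p and q are close

print(fermat_factor(1099551473989))        # (1048589, 1048601)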

Saibal

Philip Thrift

Dec 14, 2019, 3:26:37 PM
to Everything List
It is doubtful (to some) that ANY of them (IBM, Google, ...) will work for factoring big numbers. (You might have to run them, getting false result after false result, for a long time until you get a true answer, because of the noise.)

See my next post.

@philipthrift

John Clark

Dec 14, 2019, 4:40:21 PM
to everyth...@googlegroups.com
On Sat, Dec 14, 2019 at 3:26 PM Philip Thrift <cloud...@gmail.com> wrote:

> It is doubtful (to some) that ANY of them (IBM, Google, ...) will work for factoring big numbers. (You might have to run them, getting false result after false result, for a long time until you get a true answer, because of the noise.)

Quantum Computer expert Scott Aaronson says in his book "Quantum Computing Since Democritus" that it would be more interesting if that turned out to be true, because if large-scale Quantum Computers cannot be built for a fundamental reason, rather than just because of engineering or monetary difficulties, that would mean a new law of physics has been discovered. Nothing we know now precludes their existence, but there is a lot we don't know.

 John K Clark

Brent Meeker

Dec 14, 2019, 6:30:35 PM
to everyth...@googlegroups.com
Most quantum algorithms return a random value which is only near the actual answer in some sense. So as I understand it, the algorithm is repeated to build up a statistical distribution of answers, and then you have to decide, depending on your problem, what uncertainty you can tolerate. In a lot of problems "close" is good enough. In some cases, like factoring big numbers, it's easy to check the answer classically.
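
For factoring, the classical check is a single multiplication; a trivial Python sketch, using the numbers from this thread:

def is_factorization(n, p, q):
    # A claimed factorization is verified by one multiplication.
    return 1 < p < n and p * q == n

print(is_factorization(1099551473989, 1048589, 1048601))   # True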

Brent

John Clark

Dec 15, 2019, 5:11:54 AM
to everyth...@googlegroups.com
On Sat, Dec 14, 2019 at 6:30 PM 'Brent Meeker' <everyth...@googlegroups.com> wrote:
 
> In some cases, like factoring big numbers, it's easy to check the answer classically.

There are some problems that are hard to solve but easy to check. For example, if I claim to have a better solution to the Traveling Salesman Problem than you have, it's easy to check if my claim is correct: just add up the distances between cities and see if my sum is less than your sum. But if I claim to have found the very best solution, it would be very difficult for you to check if my claim is true. Most believe it is fundamentally more difficult to find the best solution than to find a better one, but this has never been proven, and it is probably the greatest open problem in computer science (the P versus NP question).
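
Here is a toy Python sketch of the easy half, checking a claimed tour (the city coordinates are made up for illustration):

import math

def tour_length(cities, order):
    # Sum the legs of the closed tour that visits the cities in this order.
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

cities = [(0, 0), (3, 0), (3, 4), (0, 4)]
print(tour_length(cities, [0, 1, 2, 3]))   # 14.0
print(tour_length(cities, [0, 2, 1, 3]))   # 18.0, so the first tour is better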

John K Clark 

Philip Thrift

Dec 15, 2019, 6:47:41 AM
to Everything List
The checking part is not the issue. The issue is how much time a QC will need to run (the noise wiping out the parallelism) to find a correct factorization vs. a standard computer (even one with quantum-chip QRNGs generating the random numbers for a probabilistic factorization program, like Dixon's factorization method).
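
Dixon's method takes a page of code, but a simpler cousin, Pollard's rho, shows in a few lines how randomness enters classical factoring (my own sketch, not Dixon's algorithm):

import math
import random

def pollard_rho(n):
    # Follow a pseudorandom sequence x -> x^2 + c (mod n); a collision
    # modulo an unknown prime factor shows up as a nontrivial gcd.
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n            # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n            # hare: two steps (Floyd's cycle trick)
            d = math.gcd(abs(x - y), n)
        if d < n:                          # d == n means the run failed; retry
            return d

n = 1099551473989
p = pollard_rho(n)
print(p, n // p)                           # 1048589 1048601 (in either order)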

@philipthrift


John Clark

Dec 15, 2019, 8:58:59 AM
to everyth...@googlegroups.com
On Sun, Dec 15, 2019 at 6:47 AM Philip Thrift <cloud...@gmail.com> wrote:

> The checking part is not the issue. The issue is how much time a QC will need to run (the noise wiping out the parallelism) to find a correct factorization vs. a standard computer (even one with quantum-chip QRNGs generating the random numbers for a probabilistic factorization program, like Dixon's factorization method).

Shor's Quantum Factoring Algorithm does not need a QRNG or any other sort of random number generator. And Shor does not make use of probabilities, although he does use something more general: a probability is always a positive Real Number, but Shor uses amplitudes, which can be negative or even Complex Numbers, for the internal calculations, although the output is always an integer.

Here is an explanation of how Shor's Algorithm works:
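
In very rough outline it looks like this (a classical Python sketch of my own; the quantum period-finding subroutine, the only part that needs a Quantum Computer, is replaced by hopelessly slow brute force, and the base a is scanned deterministically for simplicity):

import math

def find_period(a, n):
    # Smallest r > 0 with a^r = 1 (mod n). Fast on a Quantum Computer,
    # exponentially slow when done classically like this.
    r, y = 1, a % n
    while y != 1:
        y = (y * a) % n
        r += 1
    return r

def shor_skeleton(n):
    for a in range(2, n):
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g               # lucky: a already shares a factor
        r = find_period(a, n)
        if r % 2 == 0:
            x = pow(a, r // 2, n)
            if x != n - 1:                 # then gcd(x - 1, n) is a nontrivial factor
                return math.gcd(x - 1, n), n // math.gcd(x - 1, n)

print(shor_skeleton(15))                   # (3, 5)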


John K Clark

Philip Thrift

Dec 15, 2019, 10:49:41 AM
to Everything List
I was referring to a probabilistic integer factoring algorithm like Dixon's factoring method, not Shor's.


The problem with Shor's method is that it depends on quantum parallelism.

"Shor's algorithm utilizes quantum parallelism to perform the exponential number of operations in one step."

But this theoretical quantum parallelism speedup will be wiped out (according to critics) in an actual quantum computer due to "environmental" noise when the number of qubits is large enough to do something useful.

So Shor's algorithm may work in quantum theory, but will fail in practice with more than a few qubits.



Like all quantum computer algorithms, Shor's algorithm is probabilistic: it gives the correct answer with high probability, and the probability of failure can be decreased by repeating the algorithm. 


@philipthrift

John Clark

Dec 15, 2019, 11:18:10 AM
to everyth...@googlegroups.com
On Sun, Dec 15, 2019 at 10:49 AM Philip Thrift <cloud...@gmail.com> wrote:

> But this theoretical quantum parallelism speedup will be wiped out (according to critics) in an actual quantum computer due to "environmental" noise when the number of qubits is large enough to do something useful.

Yes, that's what the critics say, but quantum error correcting algorithms have already been developed, and non-abelian anyons would produce far fewer errors to start with than anything used today. The critics can not say why none of these things will work; they just say something unspecified will turn up to make them unworkable. Well, maybe so, but if that turns out to be true it would mean new physics has been discovered. Judging from the rate of investment in Quantum Computers in recent years, I conclude that the financial community is betting that new physics will not be discovered.

John K Clark  

Lawrence Crowell

Dec 15, 2019, 7:00:44 PM
to Everything List
The difficulty is quantum error correction coding (QECC). One needs to have a system for computing the Hamming distance between states and their updates. Any error can then be corrected with a type of quantum error correction code, such as the hexacode or E8. Now, to make things a bit loopy, with quantum computing you have quantum processors running the QECC, so things run into some funny issues. In general this could be done "well enough," so if the quantum computation can occur in a short enough time then the errors can be managed.
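
For the classical intuition, nearest-codeword decoding by Hamming distance looks like this in a toy Python sketch (the three-bit repetition code stands in for a real code like the hexacode):

def hamming(a, b):
    # Number of positions where two equal-length bit strings differ.
    return sum(x != y for x, y in zip(a, b))

codewords = ["000", "111"]                 # toy repetition code
received = "010"                           # one bit flipped in transit
print(min(codewords, key=lambda c: hamming(c, received)))   # 000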

LC


 

Philip Thrift

Dec 16, 2019, 5:49:40 AM
to Everything List
If the "superpositions" (needed to grow exponentially to get quantum speedup via parallelism) are physically wiped out (in a real QC with more than a few qubits), I don't see how any error correction can help.

@philipthrift
 

John Clark

Dec 16, 2019, 9:31:53 AM
to everyth...@googlegroups.com
On Mon, Dec 16, 2019 at 5:49 AM Philip Thrift <cloud...@gmail.com> wrote:

> If the "superpositions" (needed to grow exponentially to get quantum speedup via parallelism) are physically wiped out (in a real QC with more than a few qubits), I don't see how any error correction can help.

Conventional error correction works mainly through the clever use of redundancy. That works for bits, but due to the No Cloning Theorem it won't work for Qubits. However, Peter Shor found that you can split up the information in a Qubit and spread it across 9 physical Qubits, a number that has since been reduced to 5. By analyzing those Qubits it is possible to figure out if the original Qubit has been corrupted and if so how, and you can do all this without obtaining any information at all about the actual value of the Qubit, which is very important because reading the value would destroy the entanglement. You've obtained information about the error but not about the original Qubit.
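
A numerical sketch of the idea, using the 3-Qubit bit-flip code (the simplest relative of Shor's 9-Qubit code) simulated with numpy; the two parity checks locate the error while revealing nothing about the hidden amplitudes a and b:

import numpy as np

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
kron3 = lambda A, B, C: np.kron(A, np.kron(B, C))

a, b = 0.6, 0.8                            # hidden amplitudes of the logical Qubit
psi = np.zeros(8)
psi[0], psi[7] = a, b                      # encoded state a|000> + b|111>

corrupted = kron3(I2, X, I2) @ psi         # bit-flip error on the middle Qubit

# Measure the stabilizers Z1Z2 and Z2Z3: the healthy encoded state gives
# (+1, +1), and a single flip changes the sign of each check it touches.
s1 = round(corrupted @ kron3(Z, Z, I2) @ corrupted)
s2 = round(corrupted @ kron3(I2, Z, Z) @ corrupted)
fix = {(+1, +1): kron3(I2, I2, I2),        # no error
       (-1, +1): kron3(X, I2, I2),         # first Qubit flipped
       (-1, -1): kron3(I2, X, I2),         # middle Qubit flipped
       (+1, -1): kron3(I2, I2, X)}[(s1, s2)]

print(np.allclose(fix @ corrupted, psi))   # True: corrected without reading a or b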

In 1996 Michael Ben-Or and Dorit Aharonov proved the Quantum Fault-Tolerance Threshold Theorem. It says that if you can reduce the error rate of a physical Qubit below a certain threshold value, you can reduce the error rate of the entire computer to an arbitrarily low value; in other words, you'd be correcting errors faster than you'd be creating them. Unfortunately it takes a lot of overhead to do this: if you had a physical error rate of 1%, which seems achievable, you might need 10,000 error-correcting Qubits for every Qubit actually working on the problem you want to solve. That's why if you could compute with 100 perfect Qubits you could probably rule the world, but a real Quantum Computer that you could actually build might need a million Qubits to do the same thing.

However, if you could reduce that initial physical error rate you'd need a lot less overhead for error correction. A type of quasiparticle called a Majorana fermion is thought to be non-abelian; if so, the particle exchange operation between two of these quasiparticles would be non-commutative, the particles would in a sense remember where they have been, and you could tie their world lines into a knot. This could have big implications for a new type of quantum computer called a "Topological Quantum Computer", because such a machine would be MUCH less sensitive to quantum decoherence, so MUCH less error correction would be needed. Think of it this way: it's easy to jolt a string into a new position, but it's far less likely that a jolt would untie a knot in it, and a Topological Quantum Computer uses the knots in the world lines of those Majorana quasiparticles to do its quantum computing. The leading company in Topological Quantum Computing is Microsoft.

John K Clark

Bruno Marchal

Dec 16, 2019, 9:53:35 AM
to everyth...@googlegroups.com
On 14 Dec 2019, at 17:55, John Clark <johnk...@gmail.com> wrote:

On Sat, Dec 14, 2019 at 7:52 AM Philip Thrift <cloud...@gmail.com> wrote:

> There are some engineers who think that although quantum processes do occur in the quantum computers of the various types being built, none will ever turn out to be useful (in doing really big tasks - whether factoring huge integers or finding minimum-distance traveling-salesman paths). On big problems they will just return random answers that are only "close" at best.

That's true for the sort of machines D-Wave makes, which use Quantum Annealing. That's good for some specialized problems, but Quantum Annealing is not Turing Complete and D-Wave never claimed it was; devices like that might not be able to find the perfect solution to the traveling salesman problem, but they could find a very good one, and that is nothing to sneeze at. However, to run Peter Shor's factoring algorithm you need a Quantum Computer that is Turing Complete,

Strictly speaking this is not correct. Just factoring numbers is not enough to emulate a universal machine.

The difficulties are just in isolating large numbers of qubits. This is solved in principle by topological quantum computation, but there too the technical difficulties are very great. I have few doubts that quantum computers will appear, though.



and those are the types of machines that IBM, Google, and others make. As far as I know D-Wave is the only company that makes Quantum Annealers.

Which, like NMR quantum computation, is a sort of fake quantum computation. It is a step in the right direction, but other techniques are needed to get genuine quantum computation (like, indeed, the factorisation of 15, and now of a much bigger number).

I have seen that China invests a lot in quantum cryptography and computing. They believe in some sort of quantum supremacy indeed, and their goals do not seem rosy.

Bruno





John K Clark


Lawrence Crowell

Dec 16, 2019, 5:57:38 PM
to Everything List


On Monday, December 16, 2019 at 8:31:53 AM UTC-6, John Clark wrote:
On Mon, Dec 16, 2019 at 5:49 AM Philip Thrift <cloud...@gmail.com> wrote:

> If the the "superpositions" (needed to grow exponentially to get quantum speedup via parallelism) are physically wiped out (in a real QC with more than a few qubits), I don't see how any error correction can help.

Conventional error correction works mainly through the clever use of redundancy.

Hamming distance is used, so this is done with changes in bits, and in the case of quantum computing, qubits. This is more efficient, and for classical error correction it is almost perfect. For quantum information it is limited by things such as the Holevo theorem.

LC