{{46411, 1}, {232051, 1}, {417691, 1}}
Seems strange that it would give back a wrong answer in only 4
seconds?
But remember how complex it is for a calculator to find prime factors;
maybe this is a little exception in the probabilistic prime checker
algorithm in our HP.
In a few years maybe we will be using a quantum CPU in our beloved HP,
and 4 seconds will be the time to get the real prime factors for that
number ;-)
Or someone using HPGCC will write an exhaustive search program
and run at 120 MHz or even 203 MHz during the calculation.
> Or someone using HPGCC will write an exhaustive search program
> and run at 120 MHz or even 203 MHz during the calculation.
Exhaustive search is futile, and I wouldn't go beyond 75 MHz anyway
with HPGCC - it's not worth it.
Regards
Steen
It's always worth it!
I overclock my primary PC - why not calc? :-)
You are naturally entitled to your own opinion...
BUT
it's probably better to use a different base for that probability test,
e.g. whatever bases it uses => prepare for larger detected PRIMEs.
Others may have found good numbers for the algorithm for faster CPUs,
and now it's time to update ISPRIME.
> It's always worth it!
> I overclock my primary PC - why not calc? :-)
> You are naturally entitled to your own opinion...
> BUT
> it's probably better to use a different base for that probability test,
> e.g. whatever bases it uses => prepare for larger detected PRIMEs.
> Others may have found good numbers for the algorithm for faster CPUs,
> and now it's time to update ISPRIME.
With current computers it is near impossible to find an efficient
prime finder/prime factoring algorithm... it works for small numbers
but not for bigger ones (I say "bigger") :-D Otherwise our current web
security (based on how inefficiently today's computers can do prime
factorization) would be useless.
Maybe the ISPRIME function needs to be updated with more exceptions.
ISPRIME is not a prime test, it's a pseudo-prime test.
FACTOR does not factor integers which are pseudo-primes.
The Voyage 200 returns the correct factors in about 4 seconds.
> Takes about twice that on my TI-92+. That suggests that factor() is
> using brute force, at least for part of the search. As others have
> noted, that won't scale. 8)
Try factorizing 500!
My HP-50g takes exactly 41.3 seconds and provides 95 factors.
> Takes about twice that on my TI-92+. That suggests that factor() is
> using brute force, at least for part of the search. As others have
> noted, that won't scale. 8)
On the TI trial division is performed up to 1021. On the HP it's up to
1999.
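
For reference, that kind of pre-pass is just a short loop. A rough C
sketch of the idea follows (my illustration, not the actual ROM code):

#include <stdint.h>

/* Trial-division pre-pass: return a small factor of n, or 0 if none
 * is found up to `limit`. Per the numbers above, `limit` would be
 * 1021 on the TI and 1999 on the HP. */
static uint64_t trial_divide(uint64_t n, uint64_t limit)
{
    for (uint64_t d = 2; d <= limit && d * d <= n; d++)
        if (n % d == 0)
            return d;   /* definitely composite: d divides n */
    return 0;           /* no factor below the limit */
}

Anything that survives this is handed on to the probabilistic test.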
Regards
Steen
> Try factorizing 500!
>
> My HP-50g takes exactly 41.3 seconds and provides 95 factors.
Expect that performance to improve with about a factor 100 in the next
couple of months.
Regards
Steen
>
> Expect that performance to improve with about a factor 100 in the next
> couple of months.
>
> Regards
> Steen
Cool, let us know what program or library it is and where to get it,
etc. :)
Roy
> Try factorizing 500!
>
> My HP-50g takes exactly 41.3 seconds and provides 95 factors.
For a wild trip down memory lane...
A long time ago in a galaxy far away, I used a 512K Macintosh from
1986 with one of the early math programs, called "Powermath", and it
took 7 days for it to exactly compute and show 500!
How far we have come, when the HP 50G with Steen Schmidt's fine program
does 500! almost instantly, and Mathematica 4 or 5 does it in the blink
of an eye.
Ahaa!!!
How far will your pure ARM code algorithm go?
99991 perhaps?
Probably around 9973, but I can't say for sure until I get the
algorithm sorted out (initial trial division is close to the last thing
to implement in this case, as performance scaling really can't be
judged until the algorithm is very close to finished).
Cheers,
Steen
> A long time ago in a galaxy far away, I used a 512K Macintosh from
> 1986 with one of the early math programs, called "Powermath", and it
> took 7 days for it to exactly compute and show 500!
Don't blame the hardware when the cause is poor programming. One
could compute 500! by hand in less time than that.
My Apple II (48K, 1 MHz 8-bit CPU, probably less computing power
than your average dishwasher) can compute 500! in a couple
minutes with muMATH, a 1980-vintage CAS that begat today's Derive
(see http://www.derive-europe.com/theroots.asp?history ).
This can be turned into a few seconds with a special-purpose
factorial calculating program, such as the one I wrote in December
1985 to compute 1985! and print out the digits in the shape of
a Christmas tree.
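Such a special-purpose program really is tiny. Here's a minimal C
sketch of the idea (mine, not Paul's 1985 program): schoolbook
multiplication on an array of decimal digits. 500! has 1135 digits,
so the buffer below is generous.

#include <stdio.h>

/* Special-purpose factorial: schoolbook multiply on an array of
 * decimal digits, least-significant digit first. */
#define MAXDIGITS 2000

int main(void)
{
    int d[MAXDIGITS] = { 1 };    /* running product, starts at 1 */
    int len = 1;

    for (int k = 2; k <= 500; k++) {
        int carry = 0;
        for (int i = 0; i < len; i++) {
            int t = d[i] * k + carry;
            d[i] = t % 10;
            carry = t / 10;
        }
        while (carry) {          /* append carry digits */
            d[len++] = carry % 10;
            carry /= 10;
        }
    }
    for (int i = len - 1; i >= 0; i--)   /* print most significant first */
        putchar('0' + d[i]);
    putchar('\n');
    return 0;
}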
Paul Guertin
p...@sff.net
> by the way... what's happened with this bug? Has anyone fixed or
> reported it?
As Bernard Parisse pointed out in this thread, it's not a bug. The
stated integer is a pseudo-prime which FACTOR doesn't factor.
When dealing with primality tests it's important to understand that
these are probabilistic in nature.
Regards
Steen
Well, simply returning after 4 seconds, *as if* it had actually
completed the factoring, is not so good!
No error message, e.g. "num too big, aborted", or an error code so
that an IFERR exception handler would notice it?
DOCUMENTED that it'll do this on a "too-big" number?
David
David Combs wrote:
> No error message, e.g. "num too big, aborted", or an error code so
> that an IFERR exception handler would notice it?
That depends on whether the input is a fast-aborter or a time-outer, as
explained in the next two paragraphs respectively.
ISPRIME? can't give a special error code for fast-aborting inputs,
since ISPRIME? can't tell any difference between "pseudo-primes" and
real primes. ISPRIME? checks not for actual primality, but for
sufficiently strong pseudo-primality. If it passes *that* test, a
result of 1 is returned, whether the input was actually prime or was
merely a pseudo-prime. The code cannot alert the user about the latter
case, since it can't tell the difference between primes and
pseudo-primes. It just says to itself, "Hey, this number passes all
the little pseudo-prime tests! Even some of the not-so-little ones!
The probability of it being a prime is high enough For All Practical
Purposes!" And it returns a 1.
I *totally* agree with you about non-pseudo-primes, however. When a
non-pseudo-prime is being factored, and any partial factor takes longer
than 60 seconds to crack, the entire process is aborted, returning the
original input. This is unfortunate, since (a) that could be many
minutes after finding many partial factors, (b) it could at least
return the partial factor(s) found during that process, and (c) it
should at least (as you say) alert the user that the input is composite
but too big to factor. IMHO, the timeout should be user-settable, with
0 being interpreted as "never time out". Next model, maybe. ;-)
> DOCUMENTED that it'll do this on a "too-big" number?
It IS documented, as being a "probabilistic test". The problem is not
that the number is "too big" but rather that it is too strongly
pseudo-prime.
BOTTOM LINE: When ISPRIME? returns 0, the input was *definitely*
composite, but when it returns 1, the input was either prime or
strongly pseudo-prime.
[Rodger et al: Am I using "strong pseudo-prime" correctly here? It
sounds funny. -jkh-]
-Joe-
> ISPRIME? can't give a special error code for fast-aborting inputs,
> since ISPRIME? can't tell any difference between "pseudo-primes" and
> real primes. ISPRIME? checks not for actual primality, but for
> sufficiently strong pseudo-primality. If it passes *that* test, a
> result of 1 is returned, whether the input was actually prime or was
> merely a pseudo-prime. The code cannot alert the user about the latter
> case, since it can't tell the difference between primes and
> pseudo-primes. It just says to itself, "Hey, this number passes all
> the little pseudo-prime tests! Even some of the not-so-little ones!
> The probability of it being a prime is high enough For All Practical
> Purposes!" And it returns a 1.
What exactly is the algorithm used to test pseudo-primality? I'm not
even close to being a number theorist, but a quick search on Wikipedia
revealed that the probabilistic Miller-Rabin test is one of the most
reliable, easiest-to-program algorithms currently available. See:
http://en.wikipedia.org/wiki/Miller-Rabin_primality_test
for details and code. It has been proved that the test is deterministic
for n < 341,550,071,728,321
Given the availability of a fast FFT multiplication routine (and I
suppose the HP49/50 has one), the computation time is O(log(n)^2).
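The article also works a nice toy example of how a composite slips
through (its numbers, which I've checked by hand): take n = 221 =
13*17, so n - 1 = 220 = 2^2 * 55. For the base a = 174 we get
174^55 = 47 (mod 221), and one squaring gives 47^2 = 2209 = 220 = -1
(mod 221), so 221 passes the test for that base - a strong
pseudo-prime to base 174 even though it's composite. For the base
a = 137, though, 137^55 = 188 (mod 221) and 188^2 = 205 (mod 221),
which is neither 1 nor -1, so 137 is a witness and 221 is exposed.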
> What exactly is the algorithm used to test pseudo-primality?
Miller-Rabin with bases 2, 3, 5, 7 & 11 for long zints (longer than 64
bits).
> http://en.wikipedia.org/wiki/Miller-Rabin_primality_test
>
> for details and code. It has been proved that the test is
> deterministic for n < 341,550,071,728,321
Iff bases 2, 3, 5, 7, 11, 13 & 17 are tested, which is not the case on
the HP. The more bases you test, the slower the algorithm. The more
bases you test, the higher n the combined test is deterministic for. As
always, a tradeoff between performance and stability.
The number in question in this thread (4,498,414,682,539,051) is still
larger than 341,550,071,728,321 though, so testing bases 13 & 17
wouldn't have made the test deterministic for that input.
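For the record, the standard limits here (Jaeschke's, quoted from
memory - double-check them before relying on them) give the smallest
composite that fools Miller-Rabin with the first k prime bases:

  {2}                  2,047
  {2,3}                1,373,653
  {2,3,5}              25,326,001
  {2,3,5,7}            3,215,031,751
  {2,3,5,7,11}         2,152,302,898,747
  {2,3,5,7,11,13}      3,474,749,660,383
  {2,3,5,7,11,13,17}   341,550,071,728,321

4,498,414,682,539,051 sits above even the last of those, so no base
set this small would have settled it deterministically.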
Regards
Steen
Yes, it should at least return the partial factors found so far!!
> process, and (c) it should at least (as you say) alert the user that
> the input is composite but too big to factor. IMHO, the timeout
> should be user-settable, with 0 being interpreted as "never time
> out". Next model, maybe. ;-)
Next new ROM maybe?!
The original tests are for the HP 49G, which is slower than the
49g+/50G. I think it's time to slightly change the algorithms.
> The number in question in this thread (4,498,414,682,539,051) is still
> larger than 341,550,071,728,321 though, so testing bases 13 & 17
> wouldn't have made the test deterministic for that input.
pity...
I think that a computer simulation of the limits is in order,
e.g. document the 1st prime it will not find.
Or better yet...
put in ROM some of the primes it will not find, in order to raise the
limit. Would this be feasible at all?
> The original tests are for the HP 49G, which is slower than the
> 49g+/50G. I think it's time to slightly change the algorithms.
And then the ROM won't be usable on the HP49G anymore. But it's all
moot, since no enhancements are being made to the CAS, per HP decision.
> I think that a computer simulation of the limits is in order,
> e.g. document the 1st prime it will not find.
Computer simulation would take ages, and who should do it? Are you
volunteering? :-)
> Or better yet...
> put in ROM some of the primes it will not find, in order to raise the
> limit.
That would mean iterating on the above, and it would mean changing the
CAS code (no one will do that), and it'd be a complete waste of time:
most of the numbers that people fret about not being factored here are
much, much bigger than the puny limits we're talking about. Even if
you included the first ten thousand pseudo-primes the built-in
algorithm can't catch deterministically, it wouldn't matter at all when
you try to factor a 30-digit number. It would still be as probabilistic
as ever.
I think the only feasible thing to do is to remove the time limit, but
that won't change anything about pseudo-primes. Or one could write a
better factoring algorithm in C.
Cheers,
Steen
> Yes, it should at least return the partial factors found so far!!
It does so already.
> > process, and (c) it should at least (as you say) alert the user that
> > the input is composite but too big to factor. IMHO, the timeout
> > should be user-settable, with 0 being interpreted as "never time
> > out". Next model, maybe. ;-)
>
> Next new ROM maybe?!
Sure.
Regards
Steen
pity
>> I think that a computer simulation of the limits is in order,
>> e.g. document the 1st prime it will not find.
>
> Computer simulation would take ages, and who should do it? Are you
> volunteering? :-)
it would not take ages - any PC is much much faster than a calc
> I think the only feasible thing to do is to remove the time limit, but
I agree
The time limit should be removed.
> that won't change anything about pseudo-primes. Or one could write a
> better factoring algorithm in C.
Oh...better algorithm in C....hmmm....
> > Computer simulation would take ages, and who should do it? Are you
> > volunteering? :-)
>
> it would not take ages - any PC is much much faster than a calc
Sure, but not at running calc code. We need to determine the first
composite number that fails to be deemed so by the HP. This at least
entails implementing the exact same algorithm on a PC, if not running
the show on an emulator. The latter will take a very long time. While
the former is much faster, we're still talking about something like
testing 1E15 numbers on the algorithm. If we on average can test one
number per ms on a PC, it'll still take more than 30000 years to try
them all.
It'd be faster to assume the code is a full implementation of
Miller-Rabin with bases 2, 3, 5, 7 & 11, and then make a proof of when
the algorithm pivots from deterministic to probabilistic. It's no easy
task I assure you, and for the uninitiated it might even take 30000
years as well ;-)
Regards
Steen
> I think the only feasible thing to do is to remove the time limit
That might be very welcome, provided that the CANCEL key
is permitted to interrupt, so that the user can decide for himself
when to quit. A user-settable test sequence limit [not a time limit]
would also be handy (or perhaps another command in which that limit
is an additional argument).
The command is inherently mis-named, however;
evidently should have been called ISPROBPRIME?
which would have eliminated these misunderstandings
(it's never too late to give a command an *additional*
name, however, much the way "LIMIT" turns into "lim"
after being entered).
[r->] [OFF]
> That might be very welcome, provided that the CANCEL key
> is permitted to interrupt, so that the user can decide for himself
> when to quit. A user-settable test sequence limit [not a time limit]
> would also be handy (or perhaps another command in which that limit
> is an additional argument).
Why a user-settable limit at all? Such a limit doesn't exist on the
SOLVE or ADD function either. As long as CANCEL works, the user can
always abort - until then the calc shouldn't do anything else but
concentrate on what it's being asked to do. Some operations take a long
time.
> The command is inherently mis-named, however;
> evidently should have been called ISPROBPRIME?
> which would have eliminated these misunderstandings
Of course, but again, should the SOLVE function be renamed to
SOLVEIFALGORITHMALLOWS ? ;-)
All it takes is the smallest understanding of prime numbers, plus
reading the manual. But this is a problem with almost everything today
- users take for granted that they understand how stuff works from
birth, hence they needn't read manuals. Many also expect appliances to
read their mind (the black box user), or even to guess intentions not
yet grasped by the user himself (an analogy would be a user shouting
MAKE COFFEE! at a coffee maker, whereupon the coffee maker starts out
by leaping down to the grocery store to buy coffee beans, runs back,
attaches itself to the water and power lines, and serves the coffee
wherever the user might be at the time the coffee is finished).
I think it's good if a device demands some knowledge of what you ask of
it before anything reasonable comes out. Or else the human mind will
degenerate much faster than necessary. All it takes is a little
interest in the world around us - a little curiosity - and without
that, what is there to life?
Regards
Steen
Hitting CANCEL with no time limit seems the best solution.
Maybe if VERBOSE is set
one could see something in the header...
I meant a variable limit on how many different primes
to use in the basic test algorithm
(it was earlier pointed out that another implementation
tested with more primes, to increase the probability
of correctness of a "probable prime," and to raise
the value of the smallest composite that can
incorrectly slip through the test).
When you run the calc on different hardware
(not only 49G Saturn vs. ARM calcs, but also
including computer emulators like Emu48),
you can certainly afford to do more tests
before running to the end of the list of tests,
which terminates the command anyway, even if you have
all the time in the world to wait :)
If the "inner" function which does one single test
is accessible enough (or if anyone would care
to supply a completely independent program),
then it might be possible to offer an external program
to remove all current limitations (of course it's
more likely and practical to do it under HPGCC instead,
given how much greater computer power that offers,
except for the lack of equal power under current emulators
and for the 49G)
[r->] [OFF]
There is another way. By increasing the
number of bases, you are decreasing the
number of cases where a composite number
will pass the Rabin-Miller algorithm (which
is a souped-up Fermat test). No matter how
many bases you use, there will still be a
few composites that pass. The law of
diminishing returns applies here! :-)
The way I like to handle it is to use two
different probabilistic tests: alongside the
Rabin-Miller test, consider trying the
simple Lucas test, which says that the z'th
term of the Lucas sequence is congruent to
one mod z if z is prime. This test is also
probabilistic and therefore does have a
number of pseudo-primes.
The key is that the Lucas and Rabin-Miller
will have a different set of pseudo-primes and
so, a composite number that passes the
Rabin-Miller test is very unlikely to pass the
Lucas Test and vice-versa.
I'd propose to perform the Rabin-Miller
test with a very moderate number of bases and
then do a simple check with the Lucas test
given above (it can be done very quickly mod
z using the Q matrix, etc.).
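In C, that Q-matrix version is just a 2x2 matrix power. A sketch of
my own (assuming 64-bit inputs below 2^63 so the additions can't
overflow, and GCC's unsigned __int128 for the modular multiply):

#include <stdint.h>

/* Lucas test via the Q matrix: L(n) = trace(Q^n) with Q = [[1,1],[1,0]],
 * and L(z) = 1 (mod z) whenever z is prime. */

static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m)
{
    return (uint64_t)((unsigned __int128)a * b % m);
}

/* r = x * y for 2x2 matrices mod m, stored row-major in 4 elements */
static void mat2mul(uint64_t r[4], const uint64_t x[4],
                    const uint64_t y[4], uint64_t m)
{
    uint64_t t0 = (mulmod(x[0], y[0], m) + mulmod(x[1], y[2], m)) % m;
    uint64_t t1 = (mulmod(x[0], y[1], m) + mulmod(x[1], y[3], m)) % m;
    uint64_t t2 = (mulmod(x[2], y[0], m) + mulmod(x[3], y[2], m)) % m;
    uint64_t t3 = (mulmod(x[2], y[1], m) + mulmod(x[3], y[3], m)) % m;
    r[0] = t0; r[1] = t1; r[2] = t2; r[3] = t3;
}

/* 1 = z passes (prime or Lucas pseudo-prime), 0 = definitely composite */
int lucas_test(uint64_t z)
{
    if (z < 2) return 0;
    uint64_t q[4] = { 1, 1, 1, 0 };     /* the Q matrix */
    uint64_t p[4] = { 1, 0, 0, 1 };     /* identity */
    for (uint64_t e = z; e; e >>= 1) {  /* binary powering: p = Q^z mod z */
        if (e & 1) mat2mul(p, p, q, z);
        mat2mul(q, q, q, z);
    }
    return (p[0] + p[3]) % z == 1;      /* trace(Q^z) = L(z) = 1 mod z? */
}

If I remember right, the first composite that slips through this
particular congruence is 705, so its pseudo-primes really are a
different crowd than Rabin-Miller's.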
My guess would be a pretty good bang for
the buck and your pseudoprimes for this
combined test would be extremely rare. I
do not know the likelihood of a composite
number passing both the Lucas & Rabin-
Miller combined test and I'm not so sure there
is a known example.
JT
> I'd propose to perform the Rabin-Miller
> test with a very moderate number of bases and
> then do a simple check with the Lucas test
> given above (it can be done very quickly mod
> z using the Q matrix, etc.).
That's a good suggestion, and one I'll probably attempt to implement.
Thanks.
Cheers,
Steen
If you like, I have the algorithm posted
for TI-Basic at this link:
http://www.ticalc.org/archives/files/fileinfo/376/37669.html
There is also the Perrin test in the same
zip file. The Perrin test uses another, more
involved modular recurrent sequence as a
probabilistic prime test, though the Perrin
test is much more powerful than the Lucas
test - it just takes roughly twice as long to
calculate.
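It's the same machinery one size up: P(n) = trace(M^n) for the
companion matrix M of x^3 - x - 1, and n divides P(n) whenever n is
prime. Again a sketch of mine under the same assumptions (64-bit n
below 2^63, GCC's unsigned __int128), not the posted TI-Basic:

#include <stdint.h>

/* Perrin test: P(0)=3, P(1)=0, P(2)=2, P(n)=P(n-2)+P(n-3), and
 * n | P(n) whenever n is prime. */

static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m)
{
    return (uint64_t)((unsigned __int128)a * b % m);
}

/* r = x * y for 3x3 matrices mod m */
static void mat3mul(uint64_t r[3][3], uint64_t x[3][3],
                    uint64_t y[3][3], uint64_t m)
{
    uint64_t t[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            uint64_t s = 0;
            for (int k = 0; k < 3; k++)
                s = (s + mulmod(x[i][k], y[k][j], m)) % m;
            t[i][j] = s;
        }
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r[i][j] = t[i][j];
}

/* 1 = n passes (prime or Perrin pseudo-prime), 0 = definitely composite */
int perrin_test(uint64_t n)
{
    if (n < 2) return 0;
    uint64_t base[3][3] = { {0,1,0}, {0,0,1}, {1,1,0} }; /* companion matrix */
    uint64_t acc[3][3]  = { {1,0,0}, {0,1,0}, {0,0,1} }; /* identity */
    for (uint64_t e = n; e; e >>= 1) {                   /* acc = M^n mod n */
        if (e & 1) mat3mul(acc, acc, base, n);
        mat3mul(base, base, base, n);
    }
    uint64_t tr = (acc[0][0] + acc[1][1]) % n;           /* trace = P(n) mod n */
    return (tr + acc[2][2]) % n == 0;
}

(If I remember right, the smallest Perrin pseudo-prime is way out at
271441, which is what "much more powerful" buys you.)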
As an example, an integer was posted to
this thread: 4,498,414,682,539,051
The V-200 determined in less than a sec
that the number is NOT prime, probably by
trial division which it does very quickly. The
factors were returned in just under 10 sec.
46411 * 232051 * 417691
This integer passes the Lucas test and took
the V-200 about 4.5sec to determine that
the integer is probably prime.
The same integer fails the Perrin test and
it took the V-200 about 8.4 sec to determine
that the integer is definitely NOT prime.
These times are for TI-Basic versions. If
you write versions in HPGCC I'm sure the
execution times will be just a fraction of those.
The same integer fails the Rabin-Miller test
but not until the base prime of 19 which was
the eighth prime base tested.
JT
> If you like, I have the algorithm posted
> for TI-Basic at this link:
>
> http://www.ticalc.org/archives/files/fileinfo/376/37669.html
Thanks.
> As an example, an integer was posted to
> this thread: 4,498,414,682,539,051
> The V-200 determined in less than a sec
> that the number is NOT prime, probably by
> trial division which it does very quickly.
The V-200 does trial division only up to 1021.
> This integer passes the Lucas test and took
> the V-200 about 4.5sec to determine that
> the integer is probably prime.
> The same integer fails the Perrin test and
> it took the V-200 about 8.4 sec to determine
> that the integer is definitely NOT prime.
> These times are for TI-Basic versions. If
> you write versions in HPGCC I'm sure the
> execution times will be just a fraction of those.
> The same integer fails the Rabin-Miller test
> but not until the base prime of 19 which was
> the eighth prime base tested.
That's odd, since PrimeQ in Mathematica uses Rabin-Miller with bases 2
& 3 combined with the Lucas pseudoprime test - Wolfram states that this
test catches all composites below 10^16 with no known counterexamples.
Are you sure there isn't an error somewhere? If there isn't I would
tell Wolfram about 4,498,414,682,539,051.
Cheers,
Steen
There are several different versions of these
'Lucas' probabilistic tests floating around. I
believe the one referenced would be what's
called the Lucas-Lehmer test. I'd have to
check, but I think that's the same as the
strong Lucas test. A certain combination
of Rabin-Miller and Lucas-Lehmer has no
known counterexamples at all yet. The best
Rabin-Miller bases I've heard of are the
{ 2, 3, 7, 61, 24251 } prime bases given by
Greathouse, which take you up to 10^16. I
just found a reference to that:
http://primes.utm.edu/prove/prove2_3.html
Look also for Paul Pollack's 'Number
Theory Programs for the TI-92' - I do recall
his algorithms were pretty good.
About the TI only trial dividing up to 1021...
My mistake - I've always had it in my head,
though I don't know where I read it, that the
TI isprime does trial division up to 1E6 or
something like that - which does sound ridiculous
now that I think about it! :-) I don't think
TIs are quite *that* fast!
Good luck with your project, by the way -
I'll bet it's tough to implement some
of these tests in low-level code. I'll be very
interested in seeing your results! I haven't
tackled HPGCC myself yet - just downloaded
and did some reading but no coding yet....
so hurry up, I need example code to learn
from! ;-)
JT