Bruce,
Whereas standard Python has arbitrary-precision integers, so operations such as multiplication and powers are exact at any size, GlowScript gives me only about six digits of accuracy, converting to floating point even for fairly modest integers.
Having only six digits of accuracy is a real problem for number theory work (for example, an implementation of the Diffie-Hellman key exchange).
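To make the point concrete, here is a short sketch in standard Python (not GlowScript) showing why floating point breaks number-theoretic work; the base, exponent, and modulus below are arbitrary illustrative values, not real cryptographic parameters:

```python
# Python's built-in integers are exact at any size:
exact = 7 ** 50                      # a 43-digit integer, every digit correct

# Round-tripping the same value through a 64-bit float silently
# loses the low-order digits -- the number is simply too big to
# fit in a double's 53-bit significand:
approx = int(float(7) ** 50)
print(exact == approx)               # prints False

# Diffie-Hellman depends on exact modular exponentiation; Python's
# three-argument pow() keeps the intermediate results exact:
p = 2 ** 127 - 1                     # illustrative modulus (a Mersenne prime)
g, secret = 5, 123456789             # illustrative base and private exponent
public = pow(g, secret, p)           # exact, even though g ** secret is enormous
```

Any float-based arithmetic, even double precision, fails this kind of computation once the integers outgrow the significand, which is why exact integers matter here.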
There may be no full solution, but if GlowScript's Python at least used double-precision values by default, it would improve things.
Thanks for considering,
Harlan