Accelerating Determinant Computations for Large Symbolic Matrices with Multiple Variables



Aug 20, 2023, 6:54:00 AM
to sage-devel
Dear all,
I'm seeking advice on how to speed up the computation of the determinant of a large symbolic matrix. Are there alternative approaches, optimizations, or libraries that could help me get faster results? I'm particularly interested in any insights that would help me handle a 20x20 matrix in 5 variables efficiently.
Additionally, I suspect that memory allocation might be contributing to the problem. Could anyone give guidance on how much memory I should allocate for these computations? Running out of memory was one of the issues I ran into while trying to run the code on a supercomputer, so this information could be crucial.
I greatly appreciate your help!

Dima Pasechnik

Aug 20, 2023, 8:16:45 AM
to sage-devel
A standard way to compute the determinant in such cases is interpolation at random points.

You won't need much more memory than the space for the coefficients of the resulting polynomial.
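To make the idea concrete, here is a minimal single-variable sketch in plain Python with exact `Fraction` arithmetic (the representation and helper names are my own, not from the thread; for a 5-variable matrix one would interpolate variable by variable, or use Sage's built-in machinery). Each entry is a coefficient list, the determinant is evaluated numerically at enough points, and the polynomial is recovered by Lagrange interpolation — note that the only large object kept around is the list of determinant coefficients, which matches the memory remark above.

```python
from fractions import Fraction

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients low degree -> high) at x by Horner's rule."""
    r = Fraction(0)
    for c in reversed(coeffs):
        r = r * x + c
    return r

def det_exact(m):
    """Determinant of a matrix of Fractions via Gaussian elimination."""
    m = [row[:] for row in m]
    n = len(m)
    det = Fraction(1)
    for i in range(n):
        p = next((r for r in range(i, n) if m[r][i] != 0), None)
        if p is None:
            return Fraction(0)          # zero column -> singular at this point
        if p != i:
            m[i], m[p] = m[p], m[i]     # row swap flips the sign
            det = -det
        det *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return det

def det_poly_by_interpolation(pm):
    """Determinant of a matrix of univariate polynomials (coefficient lists).

    Evaluates the determinant at bound+1 points, where the degree bound is
    the sum over rows of the maximal entry degree, then recovers the
    coefficients by Lagrange interpolation.
    """
    n = len(pm)
    bound = sum(max(len(e) - 1 for e in row) for row in pm)
    xs = [Fraction(k) for k in range(bound + 1)]
    ys = [det_exact([[poly_eval(e, x) for e in row] for row in pm]) for x in xs]
    coeffs = [Fraction(0)] * (bound + 1)
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        # Build the Lagrange basis polynomial l_i(x) = prod_{j != i} (x - x_j)/(x_i - x_j)
        basis, denom = [Fraction(1)], Fraction(1)
        for j, xj in enumerate(xs):
            if j == i:
                continue
            denom *= (xi - xj)
            new = [Fraction(0)] * (len(basis) + 1)
            for k, b in enumerate(basis):   # multiply basis by (x - x_j)
                new[k] -= b * xj
                new[k + 1] += b
            basis = new
        for k, b in enumerate(basis):
            coeffs[k] += yi * b / denom
    return coeffs
```

For instance, `[[x+1, x], [x, x-1]]` (entries encoded as `[[1,1],[0,1]]` and `[[0,1],[-1,1]]`) interpolates to the constant `-1`, since `(x+1)(x-1) - x^2 = -1`.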



Aug 20, 2023, 3:05:06 PM
to sage-devel
There is no universal answer; it depends on the matrix. For some matrices Gauss-Bareiss will perform well, for others Lagrange interpolation will; you can guess which from total-degree and partial-degree bounds. For some matrices (sparse ones) minor expansion will perform better: compute first all minors with columns 0 and 1, then all minors with columns 0 to 2 using the previously computed minors, and so on.
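The column-by-column minor expansion described above can be sketched as follows (a hypothetical pure-Python illustration on an integer matrix; the same dynamic program applies to symbolic entries, and skipping zero entries is what makes it pay off on sparse matrices):

```python
from itertools import combinations

def det_minor_expansion(m):
    """Determinant via dynamic programming over column prefixes.

    minors maps a sorted tuple of row indices to the determinant of the
    submatrix formed by those rows and columns 0..len(rows)-1, so each
    stage reuses all minors computed at the previous stage.
    """
    n = len(m)
    minors = {(): 1}                      # the empty minor has determinant 1
    for k in range(n):                    # extend every minor by column k
        new = {}
        for rows in combinations(range(n), k + 1):
            s = 0
            for i, r in enumerate(rows):  # Laplace expansion along column k
                if m[r][k]:               # skip zero entries: the sparse payoff
                    sub = rows[:i] + rows[i + 1:]
                    s += (-1) ** (i + k) * m[r][k] * minors[sub]
            new[rows] = s
        minors = new
    return minors[tuple(range(n))]
```

Each stage stores one minor per (k+1)-subset of rows, so the peak table size is the largest binomial coefficient C(n, n/2) — for a 20x20 matrix that is 184,756 entries, which is where the memory goes if those minors are large symbolic expressions.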