Dear SLATE developers,
We are at the 2021 SDSC GPU Hackathon, trying to link SLATE into our FHI-aims code for distributed linear algebra, so that we can run GW calculations on GPUs. We linked SLATE successfully, but at run time we hit the following assertion failure:
aims.210427.scalapack.mpi.x: scalapack_api/scalapack_slate.hh:64: slate::Matrix<scalar_t> slate::scalapack_api::slate_scalapack_submatrix(int, int, slate::Matrix<scalar_t>&, int, int, int*) [with scalar_t = std::complex<double>]: Assertion `An % desc_NB(desca) == 0' failed.
We noticed two notes in your README_scalapack_api.txt file:
NOTE: A submatrix must start and end on a tile boundary.
Taking a submatrix of global matrix A of size Am,An starting at ia,ja.
and
NOTE: The ScaLAPACK BLACS grid needs to be Column-major.
CALL BLACS_GRIDINIT( ICTXT, 'Col-major', NPROW, NPCOL )
Our usage conflicts with both restrictions: we initialize the BLACS grid as row-major, and our submatrix dimensions may not be divisible by the block size.
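For reference, here is a minimal sketch of what the change would look like on our side, assuming ICTXT, NPROW, and NPCOL are set up as in our current code (the variable names AN and NB below are illustrative):

```fortran
! Current (row-major) grid initialization in our code:
! CALL BLACS_GRIDINIT( ICTXT, 'Row-major', NPROW, NPCOL )

! Column-major initialization, as required by the SLATE ScaLAPACK API:
CALL BLACS_GRIDINIT( ICTXT, 'Col-major', NPROW, NPCOL )

! We would presumably also need to choose submatrix dimensions that
! start and end on tile boundaries, e.g. ensuring
!   MOD( AN, NB ) == 0
! (AN = number of submatrix columns, NB = blocking factor from the
! descriptor) before calling a wrapped ScaLAPACK routine.
```

Switching the grid order is a one-line change, but aligning all submatrix calls to tile boundaries would require a larger restructuring on our side, hence the question below.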
When can we expect these two restrictions to be lifted? Or should we go ahead and restructure our code to comply with them?
Best wishes,
Yi