Python3: NumPy computation with GPU using Numba

Glacies Cerebrum

Sep 4, 2017, 9:14:37 PM9/4/17
to Python GCU Forum
Hi,
Can anyone help with this?

I'm new to Python for scientific computing (I've never used C or C++ before; my background is Pascal and Java). I'm having trouble using the GPU from Python, specifically for computing a large covariance matrix in order to get its eigenvalues.
So far, I've tried this formula:

Covariance Matrix = A^T * A

  where A has dimensions N x M, with N > M,
  so the covariance matrix has dimensions M x M.
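For reference, here is a minimal NumPy sketch of that product (the dimensions here are small placeholders, not the actual 10000 x 10000 case; note that `@` dispatches to an optimized BLAS routine, which is often much faster than a hand-written JIT loop on the CPU):

```python
import numpy as np

# Small placeholder dimensions; the real problem has N > M with M ~ 10000.
N, M = 500, 100
rng = np.random.default_rng(0)
A = rng.standard_normal((N, M))  # A is N x M

# C = A^T * A, giving an M x M matrix.
C = A.T @ A

print(C.shape)  # (100, 100)
```

Since C = A^T A is symmetric, `np.linalg.eigh` (rather than `eig`) is the appropriate routine for the eigenvalue step.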
The frustrating part is that M is too large for my CPU.

For example, computing a 10000 x 10000 covariance matrix took 30 minutes on my CPU using Numba's JIT.


Any suggestions would be appreciated. Thank you.
