Using GPU in custom python layer


Sumit Maharjan

Feb 3, 2017, 8:58:46 AM
to Caffe Users
I have been trying to implement some custom loss functions in Caffe by defining them in Python, following the examples here:
https://github.com/BVLC/caffe/blob/master/python/caffe/test/test_python_layer.py

How do you make Caffe perform the calculations on the GPU for this type of loss layer?
Do I need to implement the calculations using other libraries such as CuPy or Numba?
While there is an explanation of implementing forward_gpu and backward_gpu for the C++ layers, I couldn't find any information about the Python implementation.
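For reference, the loss-layer interface from the linked examples looks roughly like the sketch below: a minimal Euclidean loss using plain NumPy arrays as stand-ins for the Caffe blobs, so the math is visible and testable. This is an illustration, not Caffe's actual implementation — in a real layer you would subclass caffe.Layer, and instead of returning values you would write the loss into top[0].data and the gradients into bottom[i].diff. Note that the Python layer API hands you the blob contents as NumPy arrays in host memory, which is exactly why GPU math would require an array library such as CuPy on top of this interface.

```python
import numpy as np


class EuclideanLossLayer:
    """Sketch of a Caffe-style Python loss layer.

    In real Caffe this would subclass caffe.Layer and bottom/top
    would be blob objects; here plain NumPy arrays stand in.
    """

    def setup(self, bottom, top):
        # bottom[0]: predictions, bottom[1]: labels
        if len(bottom) != 2:
            raise Exception("Need two inputs to compute distance.")

    def reshape(self, bottom, top):
        # The difference buffer matches the prediction shape.
        self.diff = np.zeros_like(bottom[0], dtype=np.float32)

    def forward(self, bottom, top):
        # Loss = sum of squared differences / (2 * batch size).
        # A real layer would assign this to top[0].data[...] instead.
        self.diff[...] = bottom[0] - bottom[1]
        return np.sum(self.diff ** 2) / bottom[0].shape[0] / 2.0

    def backward(self, bottom, propagate_down):
        # Gradient w.r.t. each bottom blob; a real layer would write
        # these into bottom[i].diff[...] instead of returning them.
        grads = []
        for i, sign in enumerate((1, -1)):
            if propagate_down[i]:
                grads.append(sign * self.diff / bottom[i].shape[0])
            else:
                grads.append(None)
        return grads
```

With this structure, swapping `np` for a GPU array library in forward/backward would be the natural place to offload the arithmetic, at the cost of host-to-device copies on each call.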

Any solution or suggestion is much appreciated.

OCR

May 2, 2017, 1:19:29 AM
to Caffe Users
Same question here. Does the Python layer automatically use the GPU, or does something else need to be done?
