PReLU is slow


Mausoom Sarkar

May 1, 2015, 2:29:58 PM5/1/15
to caffe...@googlegroups.com
Hi,

I tried training AlexNet with PReLUs instead of ReLUs and the speed almost halved. Even setting "channel_shared: true" didn't improve performance.
Below is the layer that is repeated throughout the network:
layer {
  name: "prelu1"
  type: "PReLU"
  bottom: "conv1"
  top: "conv1"
  param {
    decay_mult: 0   
  }
  prelu_param {
    channel_shared: true
  }
}
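
For context, PReLU computes f(x) = max(0, x) + a * min(0, x), where a is a learned slope (one per channel, or a single scalar when channel_shared is true). A minimal NumPy sketch of the forward pass, just to illustrate what the layer does (the function name and shapes are mine, not Caffe's code):

import numpy as np

def prelu_forward(x, a):
    # x: activations of shape (N, C, H, W)
    # a: slopes of shape (C,), or a single scalar when channel_shared is true
    a = np.reshape(a, (1, -1, 1, 1)) if np.ndim(a) else a
    return np.maximum(x, 0) + a * np.minimum(x, 0)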

I am using a Titan X on Ubuntu 14.04.2 with driver version 346.47.
The time with ReLU is 10 s for 20 iterations;
with PReLU it is 17 s for 20 iterations.

Is this slowdown expected?

Mausoom Sarkar

May 2, 2015, 8:32:28 AM5/2/15
to caffe...@googlegroups.com
I just did some benchmarking and found that the PReLU layer is slower than the conv layer:


I0502 15:58:03.235862 19023 caffe.cpp:276]      conv1    forward: 23.0175 ms.
I0502 15:58:03.235872 19023 caffe.cpp:279]      conv1    backward: 26.5506 ms.
I0502 15:58:03.235882 19023 caffe.cpp:276]     prelu1    forward: 5.75406 ms.
I0502 15:58:03.235893 19023 caffe.cpp:279]     prelu1    backward: 111.537 ms.
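
So the forward pass is cheap, but the backward pass dominates. Most of that backward cost is presumably the gradient for the slope parameter, which needs a reduction over every negative activation in the blob. A rough NumPy sketch of the two gradients, for illustration only (not Caffe's CUDA kernel):

import numpy as np

def prelu_backward(x, a, grad_out):
    # Gradient w.r.t. the input: pass the gradient through where x > 0,
    # scale it by the slope where x <= 0.
    a_b = np.reshape(a, (1, -1, 1, 1)) if np.ndim(a) else a
    grad_x = grad_out * np.where(x > 0, 1.0, a_b)
    # Gradient w.r.t. the slope: a reduction of grad_out * min(x, 0)
    # over the whole blob -- this sum is the costly part.
    grad_a_full = grad_out * np.minimum(x, 0)
    if np.ndim(a):                        # per-channel slopes
        grad_a = grad_a_full.sum(axis=(0, 2, 3))
    else:                                 # channel_shared: a single scalar
        grad_a = grad_a_full.sum()
    return grad_x, grad_a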

CaffeStudent

May 6, 2015, 4:53:35 PM5/6/15
to caffe...@googlegroups.com
Did this recent pull request fix the issue you are experiencing?

Mausoom Sarkar

May 7, 2015, 5:12:37 AM5/7/15
to caffe...@googlegroups.com
Yes, it does show a speedup.