Schematic of params and forward FLOPs (for one 3x227x227 image):

FLOPS:
- 725,066,088 for all conv + fc layers, weights + biases
-     659,272 for ReLU
-      27,000 for pooling
-      20,000 for LRN

layer   weight ops     bias ops
conv1   105,415,200     290,400
conv2   223,948,800     186,624
conv3   149,520,384      64,896
conv4   112,140,288      64,896
conv5    74,760,192      43,264
fc6      37,748,736       4,096
fc7      16,777,216       4,096
fc8       4,096,000       1,000

Here a "weight op" is one multiply-accumulate. For example, conv2 (split into 2 groups, so each filter sees 96 / 2 input channels) has 256 * (96 / 2) * 5^2 = 307,200 params, and each weight is applied at every one of the 27 * 27 output positions, giving 27 * 27 * 307,200 = 223,948,800 weight ops.
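If it helps, here is a small Python sketch that reproduces the table above. The layer shapes (output spatial size, filter count, kernel size, input channels per group) are taken from the standard Caffe AlexNet deploy prototxt for a 227x227 input; conv2/4/5 use group=2, which halves the input channels each filter sees.

```python
# Recompute the per-layer weight-op (MAC) and bias-op counts for AlexNet.
# Shapes assume the stock Caffe AlexNet prototxt with a 3x227x227 input.

# conv layers: (output spatial size, num filters, kernel size, in channels per group)
convs = {
    "conv1": (55, 96, 11, 3),        # 11x11 stride-4 conv, no grouping
    "conv2": (27, 256, 5, 96 // 2),  # group=2
    "conv3": (13, 384, 3, 256),
    "conv4": (13, 384, 3, 384 // 2), # group=2
    "conv5": (13, 256, 3, 384 // 2), # group=2
}
# fc layers: (output features, input features)
fcs = {
    "fc6": (4096, 256 * 6 * 6),      # pool5 output is 256x6x6 = 9216
    "fc7": (4096, 4096),
    "fc8": (1000, 4096),
}

total = 0
for name, (out, n, k, c) in convs.items():
    weight_ops = out * out * n * k * k * c  # one MAC per weight per output position
    bias_ops = out * out * n                # one add per output activation
    total += weight_ops + bias_ops
    print(f"{name}  {weight_ops:>11,}  {bias_ops:>7,}")
for name, (out, inp) in fcs.items():
    weight_ops = out * inp                  # one MAC per weight
    bias_ops = out                          # one add per output neuron
    total += weight_ops + bias_ops
    print(f"{name}    {weight_ops:>11,}  {bias_ops:>7,}")
print(f"total conv + fc: {total:,}")        # prints 725,066,088
```

Note this counts a multiply-accumulate as one op; if you count multiplies and adds separately, roughly double the conv/fc numbers.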
Hope that helps,
Hi, I want to calculate the number of FLOPs for a single iteration of AlexNet. Can anyone tell me how I can get it?
--
You received this message because you are subscribed to the Google Groups "Caffe Users" group.