Nudity detection

Aleš Žurek

Nov 5, 2014, 5:54:03 PM
to caffe...@googlegroups.com
Dear Caffe Users,

I am trying to train a neural network to detect nudity in pictures.
I have two categories, "nudity" and "ok".
I am using the bvlc_reference_caffenet model, but with smaller input images.
I am loading source images at 128x128 resolution and cropping 96x96 px sub-images from them.
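
(The cropping itself is what the data layer's crop_size does at training time; in NumPy terms it amounts to roughly this sketch, with an illustrative helper name:)

import numpy as np

def random_crop(img, crop=96):
    """Take a random crop x crop patch from an HxWxC image,
    mimicking what Caffe's data-layer crop_size does at train time."""
    h, w = img.shape[:2]
    y = np.random.randint(0, h - crop + 1)
    x = np.random.randint(0, w - crop + 1)
    return img[y:y + crop, x:x + crop]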

I divided the source images into these sets:
  • learn
    • nudity: 71767 images
    • ok: 52600 images
  • validate
    • nudity: 20579 images
    • ok: 13808 images
  • test
    • nudity: 2946 images
    • ok: 2799 images

I have a problem with training. After 20,000 iterations the loss function starts to rise.

I tried shuffling the images and decreasing the learning rate to 0.002 and 0.001. I tried batch sizes of 512 and 256, and now I am trying the bvlc_alexnet model instead of bvlc_reference_caffenet, but so far it looks like the loss function begins to rise again.
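
(For reference, a minimal pycaffe sketch of writing out such solver settings; the file names and schedule values here are illustrative, not my exact configuration. The batch size itself lives in the net's data layer, not in the solver:)

from caffe.proto import caffe_pb2

solver = caffe_pb2.SolverParameter()
solver.net = 'train_val.prototxt'       # hypothetical net definition
solver.base_lr = 0.001                  # one of the learning rates tried
solver.lr_policy = 'step'
solver.gamma = 0.1
solver.stepsize = 20000                 # drop the lr before the loss starts rising
solver.max_iter = 100000
solver.snapshot = 4000
solver.snapshot_prefix = 'snapshots/nudity'
solver.solver_mode = caffe_pb2.SolverParameter.GPU

with open('solver.prototxt', 'w') as f:
    f.write(str(solver))                # protobuf text format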

Any idea how I can solve this issue? Is there a problem with the size of the source images? Or is the bvlc_reference_caffenet model not a good fit for this type of classification?



I am using images from the test set to test the network's classification, and I am comparing the results for each snapshot.

For example I get results like this:

64000 Iterations, lr_rate = 0.001, shuffled images, bvlc_reference_caffenet model
Wrong
Wrong False:    146  // false negative
Wrong True:     573  // false positive
Wrong < 0.6:    125  // prediction score 0.5 - 0.6
Wrong < 0.7:    88   // prediction score 0.6 - 0.7
Wrong < 0.8:    85
Wrong < 0.9:    102
Wrong < 1.0:    319  // prediction score 0.9 - 1.0
----------------------------------------
Average values
Correct:        0.958699976074
Wrong:          0.816941141419
Wrong True:     0.827005588571
Wrong False:    0.777441633074
----------------------------------------
Final results
Failed:         0
Correct:        5026  // classified correctly
Wrong:          719   // wrong classification
Total:          5745

This means that 146 nudity pictures were classified as "ok" and 573 "ok" images were classified as nudity. More than 12% of the images are misclassified. I would like to get the error below 5%, or at least below 10%. Is that possible?
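
(For completeness, a minimal pycaffe sketch of this kind of per-snapshot evaluation; the deploy file name, snapshot name, and the 'prob' output blob are assumptions based on the stock caffenet deploy, not necessarily my exact script:)

import numpy as np
import caffe

net = caffe.Net('deploy.prototxt', 'snapshot_iter_64000.caffemodel', caffe.TEST)

def classify(batch):
    """batch: N x C x H x W array, preprocessed the same way as in training."""
    net.blobs['data'].reshape(*batch.shape)
    net.blobs['data'].data[...] = batch
    return net.forward()['prob']        # N x 2 softmax scores

def bucket_errors(probs, labels):
    """Count wrong predictions, bucketed by the score of the wrongly
    predicted class (the "Wrong < 0.6 ... Wrong < 1.0" rows above)."""
    edges = [0.6, 0.7, 0.8, 0.9, 1.0]
    buckets = dict.fromkeys(edges, 0)
    for p, y in zip(probs, labels):
        pred = int(np.argmax(p))
        if pred != y:
            for edge in edges:
                if p[pred] < edge or edge == edges[-1]:
                    buckets[edge] += 1
                    break
    return buckets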

I would appreciate any information or recommendation about this problem.

Michael Holm

Sep 22, 2015, 5:24:48 PM
to Caffe Users
Hello Ales,

Did you find a solution to your problem?  I would be interested to hear how you made progress, if you did.

Thanks,

Michael

Tarik Arici

Sep 25, 2015, 1:25:56 PM
to Caffe Users
That means it is starting to overfit. But first make sure: the training error always goes down, while the test error on the validation set goes up. If that is the case, then you are overfitting.

There are many things you can do. The first is to increase your dataset: get more images, or transform your current ones by translating or rotating them. The second is to not update all layers of your pretrained model; fine-tune just the final layers.
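
A minimal sketch of such label-preserving transforms, assuming NumPy images in HxWxC layout (the helper name is illustrative):

import numpy as np

def augment(img):
    """Yield simple label-preserving variants of an HxWxC image:
    the original, a horizontal mirror, and 90-degree rotations."""
    yield img
    yield img[:, ::-1]            # horizontal flip
    for k in (1, 2, 3):
        yield np.rot90(img, k)    # 90/180/270 degree rotations

For the second point, setting param { lr_mult: 0 } on the early layers in the train prototxt freezes their weights, so only the final layers keep updating.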

Alex Orloff

Sep 3, 2016, 6:16:32 AM
to Caffe Users
Hi All,

Does anyone have a solution for this task?
I think nudity detection is quite a common problem; even Google has a special feature called "SafeSearch".

Thanks

Сергей Алямкин

Sep 3, 2016, 7:09:18 AM
to Caffe Users
I solved this task with very high accuracy.

On Saturday, September 3, 2016 at 16:16:32 UTC+6, Alex Orloff wrote:

Alex Orloff

Sep 3, 2016, 7:18:41 AM
to Caffe Users
Hi Sergey!
Thank you for your reply.
So, can you please share your solution?
BTW, does your model distinguish between girls in bikinis and nudes?

Thanks