False positive determination in Stimfit paper


Anand Kulkarni

Oct 5, 2018, 1:28:50 AM
to stimfit
Hi Guys,
              I could not find how false positive percentage was determined in the Stimfit Frontiers paper? What was the denominator? 

Best,
Anand

Christoph Schmidt-Hieber

Oct 5, 2018, 4:09:37 AM
to stimfit
I’ve just added the code that we used for the manuscript to our GitHub repository.

If you scan for the string “FP” you should be able to reproduce our false positive computation:

https://github.com/neurodroid/stimfit/blob/master/manuscript/events.py

Anand Kulkarni

Oct 5, 2018, 5:20:06 AM
to stimfit
Thanks Christoph!

Here is what I gather. Can you confirm it?

Definitions:

positive events = events identified by the algorithm

true events = events identified by an expert labeler

In your code, here is how the fractions are calculated:

% TP = # of TP / # of positive events

% FP = # of FP / # of positive events

% FN = # of FN / # of true events

Is that accurate?

Christoph Schmidt-Hieber

Oct 5, 2018, 5:22:32 AM
to stimfit
IIRC we ran simulations to obtain ground-truth data, i.e. the events are determined by the simulation and there is no expert labeler.
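With simulated ground truth, scoring amounts to matching detected event times against the known simulated event times. A hypothetical sketch of such a matching step (this is NOT the manuscript's events.py code; the tolerance value and the greedy nearest-neighbor matching are assumptions for illustration):

```python
def score_detections(detected, true_events, tol=2.0):
    """Greedily match detected event times to ground-truth times.

    A detection within `tol` of an unmatched true event counts as a
    true positive; each true event can be matched at most once.
    Returns (n_tp, n_fp, n_fn).
    """
    unmatched = sorted(true_events)
    n_tp = 0
    for t in sorted(detected):
        # nearest still-unmatched ground-truth event
        match = min(unmatched, key=lambda u: abs(u - t), default=None)
        if match is not None and abs(match - t) <= tol:
            n_tp += 1
            unmatched.remove(match)
    n_fp = len(detected) - n_tp   # detections with no ground-truth partner
    n_fn = len(unmatched)         # ground-truth events never detected
    return n_tp, n_fp, n_fn

# e.g. detections at 10, 20, 35 vs. true events at 10.5, 20.2, 30
# with tol=2.0 -> (2, 1, 1)
```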

Anand Kulkarni

Oct 5, 2018, 5:52:22 AM
to stimfit
I see. In that case, consider the true events to be those generated in the simulation.

With that substitution, is my description of the calculations accurate?

Christoph Schmidt-Hieber

Oct 5, 2018, 5:56:06 AM
to stimfit
Seems reasonable, though I’d have to re-run the code to be sure.