Fitting a model to FR1 Learning Curves


Courtney Klappenbach
Apr 8, 2022, 7:32:16 PM
to FEDforum
Hello all,

We are interested in how our different mouse groups learn the FR1 task from the FEDs, and I am curious if anyone else has looked at this before or has any ideas on how to do it.

I am looking at learning as cumulative accuracy (correct pokes / all pokes) over time. I’d like to fit a model to this data so that we can compare model parameters across groups.

So far, I've tried four different models. I've attached example plots showing each model fit to data from different mice. Generally, a sigmoid function fits our data well, except when mice start by poking on the correct side and make few pokes, so that cumulative accuracy starts out high. In addition, there is a lot of variation early on (when there are very few pokes), and the models fit these points poorly. I'd appreciate ideas about how to handle these cases.

Below are the details on the four models. All of this is done in R. For all plots, x = hours since the start of FR1 testing and y = correct pokes / total pokes. Sigma is a measure of fit, the residual standard deviation: https://stat.ethz.ch/R-manual/R-devel/library/stats/html/sigma.html.

Model #1: Sigmoid model.
Fit with: nls(y ~ b/(1 + exp(-(x - c)/d)), start = list(b = 0.9, c = 3, d = 6)).
b = asymptote
c = inflection point
d = scale parameter (controls the steepness of the transition)
Adapted from:
https://kyrcha.info/2012/07/08/tutorials-fitting-a-sigmoid-function-in-r
https://stackoverflow.com/questions/33033176/using-r-to-fit-a-sigmoidal-curve
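For anyone working outside R, the same three-parameter sigmoid can be fit with scipy's curve_fit. This is a minimal sketch on synthetic data; the function and starting values mirror the nls call above, but the data and variable names are made up for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, b, c, d):
    """Three-parameter logistic: b = asymptote, c = inflection, d = scale."""
    return b / (1 + np.exp(-(x - c) / d))

# Synthetic cumulative-accuracy data: hours since FR1 start vs. accuracy
rng = np.random.default_rng(0)
x = np.linspace(0, 48, 100)
y = sigmoid(x, 0.85, 10, 4) + rng.normal(0, 0.02, x.size)

# Starting values analogous to start = list(b = .9, c = 3, d = 6) in nls()
popt, pcov = curve_fit(sigmoid, x, y, p0=[0.9, 3, 6])
b_hat, c_hat, d_hat = popt

# Residual standard deviation (the "sigma" measure of fit above),
# with 3 estimated parameters
resid = y - sigmoid(x, *popt)
sigma = np.sqrt(np.sum(resid**2) / (x.size - 3))
```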

Model #2: Sigmoid model on data where points with y = 1 have been removed and the model must pass through the origin. I did this by adding a point at (0, 0) and then using the 'weights' argument of nls to set the weight of every point to 1 and the weight of the (0, 0) point to 1000.
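The same anchor-point trick translates to scipy's curve_fit via its sigma argument (per-point uncertainties; weights scale as 1/sigma^2, so a tiny sigma on the added point stands in for nls weight 1000). A sketch on synthetic data, with the same caveats as above:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, b, c, d):
    return b / (1 + np.exp(-(x - c) / d))

rng = np.random.default_rng(1)
x = np.linspace(1, 48, 80)
y = np.clip(sigmoid(x, 0.85, 10, 4) + rng.normal(0, 0.02, x.size), 0, 1)

# Drop points where cumulative accuracy is exactly 1, then prepend (0, 0)
keep = y < 1
x_fit = np.concatenate([[0.0], x[keep]])
y_fit = np.concatenate([[0.0], y[keep]])

# A tiny per-point sigma on the first point pins the curve to the origin,
# mimicking nls(weights = ...) with weight 1000 on the (0, 0) anchor
sig = np.ones_like(x_fit)
sig[0] = 1e-3
popt, _ = curve_fit(sigmoid, x_fit, y_fit, p0=[0.9, 3, 6], sigma=sig)
```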


Model #3: Same sigmoid model as above, but points before the 'steep incline' have been removed. By eye, I determined where the 'steep incline' was for each data set and then removed all the points before it. Ideally, I would calculate this 'steep incline' point rather than finding it by eye, but I'm not sure how to do this. The removed points are shown in blue on the graph.
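One candidate heuristic for locating the 'steep incline' programmatically (an idea to try, not something from this thread): smooth the accuracy curve with a rolling mean, then take the first point where the discrete slope reaches some fraction of its maximum. The window and fraction below are arbitrary choices.

```python
import numpy as np

def steep_incline_start(x, y, window=5, frac=0.5):
    """Heuristic: index where the smoothed slope first reaches
    `frac` of its maximum slope. Window/frac choices are arbitrary
    and would need tuning on real FED data."""
    kernel = np.ones(window) / window
    y_smooth = np.convolve(y, kernel, mode="same")
    slope = np.gradient(y_smooth, x)
    threshold = frac * slope.max()
    return int(np.argmax(slope >= threshold))

# Example: a sigmoid rising around x = 20 hours
x = np.linspace(0, 48, 100)
y = 1 / (1 + np.exp(-(x - 20) / 3))
i = steep_incline_start(x, y)  # index where the rise begins
```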


Model #4: 5-parameter logistic regression.
Fit with (nlsLM from the minpack.lm package):
nlsLM(y ~ A + (D - A)/(1 + exp(log(2^(1/S) - 1) + B*(xmid - x)))^S,
      start = c(A = 0, B = 1, xmid = cut, D = 1, S = 1),
      lower = c(-Inf, 0, -Inf, -Inf, 0),
      control = nls.lm.control(maxiter = 1024, maxfev = 10000))

A = lower asymptote
D = upper asymptote
B = 'how rapidly the curve makes its transition between the two asymptotes'
S = asymmetry parameter
xmid = inflection point
cut = the same value used to mark the 'steep incline' on the graph. Using it as the starting value for xmid leads to fewer errors in the fit.
Adapted from:
https://www.graphpad.com/guides/prism/latest/curve-fitting/reg_asymmetric_dose_response_ec.htm
https://www.r-bloggers.com/2019/11/five-parameters-logistic-regression/
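The same 5PL can be sketched with scipy for comparison. The formula is copied from the nlsLM call above (with this parameterization, the curve passes through the halfway point between A and D at x = xmid); the data, bounds, and the illustrative value for cut are my own assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic5(x, A, B, xmid, D, S):
    """5-parameter logistic as in the nlsLM formula: A/D are the lower/upper
    asymptotes, B the transition rate, S the asymmetry parameter."""
    return A + (D - A) / (1 + np.exp(np.log(2**(1/S) - 1) + B * (xmid - x)))**S

# Synthetic accuracy data rising around x = 15 hours
rng = np.random.default_rng(2)
x = np.linspace(0, 48, 120)
y = logistic5(x, 0.1, 0.5, 15, 0.9, 1.0) + rng.normal(0, 0.02, x.size)

# Bounds mirror lower = c(-Inf, 0, -Inf, -Inf, 0); `cut` (the eyeballed
# steep-incline x-value) seeds xmid, set to 10 here for illustration.
# nls.lm.control(maxiter = ...) would map to curve_fit's max_nfev if needed.
cut = 10.0
popt, _ = curve_fit(
    logistic5, x, y,
    p0=[0, 1, cut, 1, 1],
    bounds=([-np.inf, 0, -np.inf, -np.inf, 0],
            [np.inf, np.inf, np.inf, np.inf, np.inf]),
)
A_hat, B_hat, xmid_hat, D_hat, S_hat = popt
```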

Looking at the graphs, model 4 seems to fit the data best, but it often yields negative inflection points (xmid), which wouldn't help us determine when the mice reach their fastest rate of learning.

I would appreciate any input or advice anyone has on this. Thank you!!

Best,
Courtney Klappenbach
Delevich Lab
Washington State University


Attachments: example1.png, example2.png, example3.png, example4.png
Lex Kravitz
Apr 9, 2022, 11:38:30 AM
to FEDforum
Hi Courtney!  I don't have much experience with this type of modeling, but I love that you're doing this!  To get around the issue of high variability at the start throwing off your curves, I might suggest binning the data either by time (i.e., correct percentage per hour) or by trials (i.e., correct percentage in blocks of N trials) to see whether that improves your ability to analyze how accuracy changes over time. Please write back if you find a good way to do this!
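The trial-binning idea can be sketched in a few lines (Python here rather than R; the block size of 20 pokes is an arbitrary choice): instead of cumulative accuracy, compute the fraction correct within each consecutive block of N pokes.

```python
import numpy as np

def blocked_accuracy(correct, n=20):
    """Fraction correct in consecutive blocks of n pokes.
    `correct` is a 0/1 array in poke order; a trailing partial
    block is dropped."""
    correct = np.asarray(correct)
    nblocks = correct.size // n
    blocks = correct[:nblocks * n].reshape(nblocks, n)
    return blocks.mean(axis=1)

# Example: simulated accuracy improving from ~50% to ~90% over 200 pokes
rng = np.random.default_rng(3)
p = np.linspace(0.5, 0.9, 200)
correct = rng.random(200) < p
acc = blocked_accuracy(correct, n=20)  # one accuracy value per 20-poke block
```

Binning by time instead would be the same idea, grouping pokes by hour rather than by poke count.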