Hello all,
We are interested in how our different mouse groups learn the FR1 task from the FEDs, and I am curious if anyone else has looked at this before or has any ideas on how to do it.
I am looking at learning as cumulative accuracy (correct pokes / all pokes) over time. I’d like to fit a model to this data so that we can compare model parameters across groups.
So far, I've tried 4 different models. I've attached some example plots with the 4 models on data from different mice. Generally, a sigmoid function fits our data well, except for when mice start by poking to the correct side and make few pokes (so cumulative accuracy starts out high). In addition, there is a lot of variation initially (when there are very few pokes) and the model fits these poorly. I’d appreciate ideas about how to deal with these cases.
Below are the details on the 4 models. All of this is done in R. For all plots, x = hours since the start of FR1 testing and y = correct pokes / total pokes. Sigma is a measure of fit: the residual standard deviation, as returned by R's sigma() function (https://stat.ethz.ch/R-manual/R-devel/library/stats/html/sigma.html).
Model #1: Sigmoid model.
Fit with: nls(y ~ b/(1 + exp(-(x - c)/d)), start = list(b = 0.9, c = 3, d = 6)).
b = asymptote
c = inflection point
d = a scale parameter (steepness)?
Adapted from:
https://kyrcha.info/2012/07/08/tutorials-fitting-a-sigmoid-function-in-r
https://stackoverflow.com/questions/33033176/using-r-to-fit-a-sigmoidal-curve
Model #2: Same sigmoid model, fit to data where points with y = 1 have been removed and the curve is forced through the origin. I did this by adding a point at (0,0) and then using the 'weights' argument of nls() to set the weight of every point to 1 and the weight of the (0,0) point to 1000.
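For concreteness, here is a minimal sketch of what that weighted fit could look like; the data frame name `dat` and its columns `hours` and `acc` are placeholders, not your actual variable names:

```r
# Sketch of the Model #2 fit (assumed data frame `dat` with columns
# `hours` = time since FR1 start and `acc` = cumulative accuracy)
dat2 <- subset(dat, acc < 1)                         # drop points where y = 1
dat2 <- rbind(data.frame(hours = 0, acc = 0), dat2)  # add the origin point
w <- c(1000, rep(1, nrow(dat2) - 1))                 # heavy weight on (0,0)
fit2 <- nls(acc ~ b / (1 + exp(-(hours - c) / d)),
            data = dat2,
            start = list(b = 0.9, c = 3, d = 6),
            weights = w)
sigma(fit2)   # residual standard deviation, for comparing fits
```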
Model #3: Same sigmoid model as above, but with points before the 'steep incline' removed. I determined by eye where the steep incline began in each data set and removed all points before it. Ideally, I would calculate this 'steep incline' point rather than judging it by eye, but I'm not sure how to do this. The removed points are shown in blue on the graph.
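One possible way to pick the cut point automatically (just an idea, not tested on your data): fit the sigmoid to the full series first, then cut where the fitted curve has risen to a small fraction of its asymptote, e.g. at c - 2d, where a logistic sits at roughly 12% of its rise. Again, `dat`, `hours`, and `acc` are placeholder names:

```r
# Fit the full series, then trim points before the fitted incline
full <- nls(acc ~ b / (1 + exp(-(hours - c) / d)),
            data = dat, start = list(b = 0.9, c = 3, d = 6))
p <- coef(full)
cut_x <- p["c"] - 2 * p["d"]   # fitted curve is ~12% of b at this x
refit <- nls(acc ~ b / (1 + exp(-(hours - c) / d)),
             data = subset(dat, hours >= cut_x),
             start = as.list(p))
```

The 2d offset is an arbitrary threshold; you could tune it (e.g. 3d for an earlier cut) or check it against the points you removed by eye.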
Model #4: 5-parameter logistic (5PL) curve, fit with nlsLM() from the minpack.lm package:
nlsLM(y ~ A + (D - A)/(1 + exp(log(2^(1/S) - 1) + B*(xmid - x)))^S,
      start = c(A = 0, B = 1, xmid = cut, D = 1, S = 1),
      lower = c(-Inf, 0, -Inf, -Inf, 0),
      control = nls.lm.control(maxiter = 1024, maxfev = 10000))
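Since the end goal is comparing parameters across groups, once each mouse has a fit you could collect the coefficients into a data frame and test group differences. Here `fits` (a list of nls objects, one per mouse) and `group` (a factor of group labels) are assumed names:

```r
# Gather one row of fitted parameters (e.g. b, c, d) per mouse
params <- as.data.frame(t(sapply(fits, coef)))
params$group <- group
summary(aov(c ~ group, data = params))   # e.g. compare inflection points
```

A caveat worth keeping in mind: cumulative accuracy points are not independent, so parameter estimates (not the raw points) are the right unit for group comparisons.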
I would appreciate any input or advice anyone has on this. Thank you!!
Best,
Courtney Klappenbach
Delevich Lab
Washington State University