OSCR with external state space and covariates


Prashant Mahajan

May 15, 2025, 4:40:46 PM
to oSCR

Hello all,

I have recently started using the oSCR package to fit spatial capture-recapture (SCR) models. I'm working with genetic data collected across three different years, each containing two seasons. For now, I am focusing on a single season across three years and treating each year as a temporal session.

I created the state space externally, including covariates, with a structure like this:

       X       Y        HFI          FC
707314.6 5313439 -0.4344612 -0.07216065
708311.4 5313516  3.1997226 -0.86034913
709308.2 5313594 -0.3704614 -1.3235114
710305   5313672  1.1569519 -1.3235114
711301.9 5313749 -0.1441296 -1.29100879
712298.7 5313827  2.6610416 -1.20568942

I then used the following code to build the state space object:

cov <- read.csv("mask.csv")
ss <- list(cov, cov, cov)
class(ss) <- "ssDF"
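For reference, a quick check of the resulting object (a sketch; column names as in the table above):

```r
# Each session element should be a data frame whose first two columns
# are the X/Y coordinates, with the covariates (HFI, FC) after them.
str(ss[[1]])
stopifnot(all(c("X", "Y", "HFI", "FC") %in% names(ss[[1]])))
```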

So currently, I am trying to run three sessions of data for a single season (one season across three years), using the same state space for all three sessions. I have the following queries:

1. Inclusion of Covariates in the State Space and Trap File
  • Will covariates such as HFI and FC be automatically included in the model as part of the state space when using the code above? I can see them included in the ss object, but I’m unsure if they are utilized correctly in the model fitting.

  • I also have an Effort covariate in the trap file (tdf). If I want to model the effect of HFI on density (D) and Effort on detection probability (p0), should my model look like this?

m1 <- oSCR.fit(list(D~HFI, p0~Effort, sig~1), sf, ss)

or like this:

m1 <- oSCR.fit(list(D~HFI, p0~1, sig~1), sf, ss)

assuming that Effort is picked up automatically, as it is in the secr package via the usage matrix.

2. My state space is fairly large (around 8,500 pixels/coordinates), and I am trying to fit a model in which all three parameters vary by session:

m1 <- oSCR.fit(list(D~session, p0~session, sig~session), sf, ss)

The model is taking an exceptionally long time to run (over 14 hours and still going). I understand reducing the number of pixels in the state space might help, but are there any other ways to reduce run time without compromising much on resolution?

Also, if I want to model the effect of a covariate on density across sessions, would this be the appropriate formulation?  

m1 <- oSCR.fit(list(D~session+HFI, p0~session+Effort, sig~session), sf, ss)

Any guidance or suggestions would be greatly appreciated.  

Thank you 
Prashant

Prashant Mahajan

May 16, 2025, 4:40:37 PM
to oSCR
Also, I am currently working on integrating telemetry data with genetic capture data. The encounter data file (edf) is structured as follows:  
  Session      ID Occasion Detector
1       1 MVIND58        1     3606
2       1  MVIND7        1     3605
3       1 MVIND18        1     3605

  The telemetry fixes (tele) are formatted like this:  

      ID        X       Y
1 MVIND1 726954.9 5365853
2 MVIND1 727053.6 5365103
3 MVIND1 727872.5 5365670

I am using a state-space dataframe (ssDF) that looks like:

         X       Y        HFI FC
1 707533.9 5323484  0.5927083
2 708530.7 5323562 -0.5150675
3 709527.5 5323640 -0.5240237
Again, I am using the same ssDF and want to know whether "HFI" and "FC" above can be treated as the covariates.
Here is the code I am currently using:

edf <- read.csv("18S_capth.csv")
tdf1 <- read.csv("18_S_trap.csv")

fixes <- tele[, c("ID", "X", "Y")]
colnames(fixes) <- c("ID", "X", "Y")

# create the state space and RSF surfaces with covariates
ssDF <- rsfDF <- data.frame(cov)
head(ssDF)

# thin to ~10% of fixes; round so the row indices are whole numbers
kp <- round(seq(1, nrow(fixes), length.out = 0.10 * nrow(fixes)))
fixes.thin <- fixes[kp, ]

nfix <- telemetry.processor(list(rsfDF),list(fixes.thin))$nfreq

telemetry <- list(fixfreq=nfix,HFI=HFI)

data <- data2oscr(edf = edf, #the EDF
                  sess.col = 1, #session column
                  id.col = 2, #individual column
                  occ.col = 3, #occasion column
                  trap.col = 4, #trap column
                  tdf = list(tdf1), #list of TDFs,
                  K = c(1),
                  ntraps = c(331), #no. traps vector
                  rsfDF = list(rsfDF),
                  telemetry = telemetry)

sf <- data$scrFrame

# fit the model "SCR+RSF" from Royle et al. 2013
fit <- oSCR.fit(scrFrame=sf,ssDF=list(ssDF),DorN="D",encmod="CLOG",
                rsfDF=list(rsfDF),RSF=TRUE,telemetry="dep",
                trimS=sf$mdm,
                model=list(D~HFI,p0~1,sigma~1,path~1))

However, the model is not fitting correctly and is returning NA values. Is there anything that I am doing wrong here? I am using capture data for one session but have pooled the telemetry for three sessions (three summer seasons). Additionally, not all collared individuals were captured, and some captured individuals were not collared.

Any guidance or suggestions would be greatly appreciated.  

Thank you 
Prashant

Daniel Linden

May 19, 2025, 3:08:11 PM
to oscr_p...@googlegroups.com
Hi Prashant, one place that is good to explore is here: Spatial Capture-Recapture - 2. Getting started with oSCR

That site contains some lectures that describe the basics of data prep and model fitting using oSCR.  Most of what you've described sounds like you are on the right track.

Regarding your questions, the TDF needs to be formatted correctly for covariates to be incorporated.  Assuming you have a "sep" column with "/" entries on each row, any columns listed after it can be chosen as covariates in the p0 model, including Effort.  There is no automatic incorporation of effort unless you have 0/1 entries with a column for each occasion in the TDF (before the "sep" column).  If you've summed effort into a single value, you need to include it in the formula, as you illustrated.
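For illustration, a hypothetical TDF layout (not your actual file) with a single summed Effort covariate after the "sep" column might look like this:

```r
# Hypothetical TDF: trap ID and coordinates first, then the "sep"
# column of "/" entries, then any trap covariates (usable as p0 ~ Effort).
tdf1 <- data.frame(Det_id = c("T1", "T2", "T3"),
                   X = c(707314.6, 708311.4, 709308.2),
                   Y = c(5313439, 5313516, 5313594),
                   sep = "/",
                   Effort = c(12, 8, 10))
```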

We do not have good ways to speed up the model fitting that we can personally vouch for, aside from coarsening the state space resolution.  There has been discussion about modifying the oSCR R code to accommodate parallel processing during the likelihood optimization.

As for the telemetry integration, if you are following the instructions in our vignette (Vignette 1 Integrated RSF-SCR models in oSCR | oSCR vignettes) and still having trouble, unfortunately it could be a starting-value problem that can be tricky to diagnose.  Often, there will be telemetry patterns that do not conform well to the model assumptions (e.g., individuals making unique movements).  More often than not, when someone has difficulty fitting the telemetry integration, it is because of such errant locations.
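One quick way to screen for such errant locations is to plot the fixes for each individual (a base-R sketch, assuming a `fixes` data frame with ID/X/Y columns as in your earlier post):

```r
# Plot each telemetered individual's point pattern; outliers or odd
# excursions should be obvious by eye.
op <- par(mfrow = c(2, 3), mar = c(3, 3, 2, 1))
for (id in unique(fixes$ID)) {
  sub <- fixes[fixes$ID == id, ]
  plot(sub$X, sub$Y, asp = 1, pch = 16, cex = 0.6,
       main = id, xlab = "X", ylab = "Y")
}
par(op)
```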


Prashant Mahajan

May 20, 2025, 4:16:20 PM
to oSCR

Hi Dan,

Thank you for your response and for clarifying the TDF covariates. Regarding the external state space: do I need to explicitly supply each state space covariate as shown in Vignette 1 (e.g., ssDF <- rsfDF <- data.frame(ss, z = nybears$elevation)), or can I simply use an external data frame with the covariates, which will be incorporated automatically when I convert the data frame into an ssDF class?

Also, regarding the telemetry data, could you please provide a bit more guidance on how to specify the starting values? I would like to test the model using the null model's starting values, initially without including the telemetry data.


Best Regards,

Prashant

Daniel Linden

May 22, 2025, 10:12:48 AM
to oSCR
The state space is simply a dataframe with coordinates and optionally covariate values.  So yes, you can create that data frame however you want, so long as the 1st and 2nd columns are X and Y, respectively.

As for starting values, unfortunately it just takes some troubleshooting.  We have found that smaller values can sometimes be better (i.e., do not suggest large d0 values).  But for telemetry integration, it is critical that you examine the point patterns of your telemetered animals to make sure there are no egregious outliers.  If your attempts do not result in any solutions, send me your data and I can help.
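So a minimal externally built state space can be as simple as (a sketch; the covariate names are just examples):

```r
cov <- read.csv("mask.csv")                # columns: X, Y, HFI, FC
stopifnot(names(cov)[1:2] == c("X", "Y"))  # X and Y must come first
ss <- list(cov, cov, cov)                  # one data frame per session
class(ss) <- "ssDF"
```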