Hi Andrew et al,
I think I see what you're asking about, and sorry to jump in late here; I've been traveling.
Since an SMC algorithm doesn't have monitors in the same way an MCMC does, the derived-quantities features are not an immediate solution (unless you are using the particle MCMC approach).
One approach, which I suspect you have already considered, would be to do what you want by post-processing. That is, you could write a nimbleFunction -- or simply stay within R -- to iterate over the samples returned by the SMC, plug them into the model, and calculate the functionals of interest (sketched below). However, this requires extracting all the necessary states, which can be a pain or even prohibitive; hence your question about doing it during the SMC run.
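For concreteness, here is roughly what I mean by post-processing, staying in R. This is just a sketch under some assumptions: cModel and cFilter stand for your compiled model and filter, the latent states live in a variable 'x', 'f' is a deterministic node you added to compute the functional, and the filter was built with saveAll = TRUE so that mvEWSamples holds full trajectories rather than only the final time step.

library(nimble)
## Sketch only; the names ('x', 'f', cModel, cFilter) are assumptions.
xSamples <- as.matrix(cFilter$mvEWSamples, 'x')  # one row per equally weighted particle
functionals <- apply(xSamples, 1, function(xi) {
  values(cModel, 'x') <- xi   # plug the sampled states into the model
  cModel$calculate('f')       # recompute the dependent deterministic node(s)
  cModel$f                    # return the functional of interest
})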
This is a good idea for extending nimbleSMC (which gets used less, and receives less development attention, than MCMC by a long shot). You could try modifying the nimbleSMC source code, which strikes me as the most flexible approach, although it comes with some learning curve. For example, if you are using the auxiliary filter, see AuxiliaryFilter.R. In buildAuxiliaryFilter, the run function iterates through the time steps, and each time step is done by an auxFstep object. Between iterations you could obtain and store the functionals of interest (a schematic sketch follows). If you need them for each particle considered at each time step, you could do that in the run function of auxFstep.
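To give the flavor (and I emphasize this is schematic, not the actual nimbleSMC source; the real variable names differ), the modification amounts to adding persistent storage in setup code and filling it inside the time-step loop of the run function:

library(nimble)
## Schematic sketch only -- not nimbleSMC code. 'model', 'functionalNode',
## and 'T' stand in for objects the filter's setup code already has.
storeBetweenSteps <- nimbleFunction(
  setup = function(model, functionalNode, T) {
    saved <- numeric(T)   # storage that persists across time steps
  },
  run = function() {
    for(t in 1:T) {
      ## ... one filter time step (e.g., an auxFstep run) happens here ...
      model$calculate(functionalNode)                # refresh the deterministic node
      saved[t] <<- values(model, functionalNode)[1]  # record its current value
    }
  }
)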
If digging into the nimbleSMC source code is not how you want to spend your time, let me suggest some tricks or workarounds you could try right away. You can put in "fake" data for each time step that make only a constant contribution to the likelihood but force some calculations. For example, say x[t] is a latent state in the state-space model. You could make a deterministic node that calculates what you are interested in from x[t], and then a data node that depends on that deterministic node in a constant manner (example below). The setup code of any of the filters would decide that the fake data's likelihood contribution needs to be calculated, and that the deterministic node needs to be calculated first (for each particle). That would give you a way to calculate some kinds of functionals (though not, for example, an easy way to get the deviance), and you would still need to save the values somehow, so it is not a complete solution.
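Here is a minimal sketch of that trick (the model is made up for illustration): f[t] computes the functional, and fake[t] is a data node whose likelihood contribution is constant, because its mean is 0 * f[t] = 0 no matter what, but whose graph dependence on f[t] forces f[t] to be calculated for every particle.

library(nimble)
## Hypothetical model for illustration; y is the real data.
code <- nimbleCode({
  x[1] ~ dnorm(0, sd = 1)
  y[1] ~ dnorm(x[1], sd = 0.5)
  for(t in 2:T) {
    x[t] ~ dnorm(0.8 * x[t-1], sd = 1)  # latent state
    y[t] ~ dnorm(x[t], sd = 0.5)        # actual observations
  }
  for(t in 1:T) {
    f[t] <- exp(x[t])                  # deterministic functional of x[t]
    fake[t] ~ dnorm(0 * f[t], sd = 1)  # mean is always 0, so the likelihood
                                       # contribution is constant, but the
                                       # dependence forces f[t] to be computed
  }
})
model <- nimbleModel(code, constants = list(T = 10),
                     data = list(y = rnorm(10), fake = rep(0, 10)))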
Another trick is to make a nimbleRcall to an R function that uses the compiled model, from R, to get whatever you want. The nimbleRcall could be invoked from the deterministic node in the previous trick, or you could insert calls to it in the source code, either in the buildAuxiliaryFilter run function or the auxFstep run function (or similarly for the other filters). For example, you can have an R function that uses compiled_model$getLogProb(nodes) to sum the current log probabilities of some nodes in the model and save the result to a global R object. You can call that R function via a nimbleRcall from the compiled_model itself or from an algorithm using the compiled_model (sketch below).
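A sketch of this trick, with made-up names (logProbTrace, recordLogProb, nodesOfInterest) and assuming cModel is your compiled model:

library(nimble)
## Global storage and an R function that reads from the compiled model.
logProbTrace <- numeric(0)
nodesOfInterest <- c('y')  # whichever nodes you care about (assumption)
recordLogProb <- function(t) {
  lp <- cModel$getLogProb(nodesOfInterest)  # sum of current log probabilities
  logProbTrace[t] <<- lp                    # save into the global R object
  return(0)
}
## Wrap it so compiled code can call it:
RrecordLogProb <- nimbleRcall(
  prototype = function(t = double(0)) {},
  returnType = double(0),
  Rfun = 'recordLogProb'
)
## In model code you could then trigger it from a deterministic node,
## e.g.  dummy[t] <- RrecordLogProb(t)
## or insert RrecordLogProb(t) into a run function in the filter source.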
Let me know if you have further questions, or if either of these tricks seems useful but isn't clear. As Daniel said, there should be a way to do this based on nimble's programmability, but it's also pretty clear it should be easier than it is.
HTH
Perry