Hey Stan team and discussion participants,
one of my toy models for inferring Markov models of ion channels looks like this (see below).
It works fine, as long as I exclude probabilities of 0 or 1 for the binomial distribution; the derivative of the log probability seems to be infinite there.
But in the real data I am not going to see y_t, only some y_gaus ~ normal(y_t, sigma). Are finite mixtures the right way to go here? Are there better ways?
My data will be something like
p(t) = 1 - exp(-theta * t) + ...
n_open ~ Binomial(N_channel, p(t))
I ~ normal(Const * n_open, sigma)
I would like to infer p(t) via theta (or more complex sums of exponentials), as well as n_open and N_channel, even though the latter are discrete. Const will probably come from different data.
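One idea I had is to marginalize n_open out: for a fixed N_channel it can only take the values 0 to N_channel, so each noisy observation can be written as a finite mixture over those counts and summed up with log_sum_exp. Below is a minimal sketch of what I mean, assuming N_channel and Const are known data and sigma is an extra noise parameter; the names y_gaus, Const, and sigma are placeholders I made up for this post, so please correct me if this is not what is meant by a finite mixture here.

data {
  int<lower=1> N_data;                  // number of data points
  int<lower=1> N_channel;               // number of channels (assumed known here)
  real<lower=0> Const;                  // current per open channel (assumed known here)
  real y_gaus[N_data];                  // noisy current measurements
  real<lower=0, upper=10> time[N_data];
}
parameters {
  simplex[2] mu;
  real<lower=0, upper=5> theta[2];
  real<lower=0> sigma;                  // measurement noise
}
transformed parameters {
  real<lower=0, upper=1> probability[N_data];
  for (i in 1:N_data)
    probability[i] = 1 - mu[1] * exp(-theta[1] * time[i]) - mu[2] * exp(-theta[2] * time[i]);
}
model {
  theta[1] ~ uniform(0.0, 3);
  theta[2] ~ uniform(2.0, 5);
  sigma ~ normal(0, 1);
  for (t in 1:N_data) {
    vector[N_channel + 1] lp;           // one mixture component per possible n_open
    for (k in 0:N_channel)
      lp[k + 1] = binomial_lpmf(k | N_channel, probability[t])
                  + normal_lpdf(y_gaus[t] | Const * k, sigma);
    target += log_sum_exp(lp);          // marginalize n_open out
  }
}

The inner sum makes every evaluation O(N_data * N_channel), so I am not sure this scales to many channels, which is part of why I am asking whether there is a better way.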
Thanks a lot for any hints.
Jan Münch
data {
int<lower=1> N_data; // number of data points
int<lower=0, upper=1000> y_t[N_data]; // observed open-channel counts; each element is a binomial(N_channel, p) draw (between 0 and 20 in the toy data)
real<lower=0, upper = 10> time[N_data];
int<lower=1> N_channel; // number of channels
}
parameters { // the parameters we want to infer with Stan
simplex[2] mu;
real<lower = 0, upper = 5> theta[2];
}
transformed parameters {
real<lower=0, upper=1> probability[N_data]; // fitted open probabilities
for (i in 1:N_data)
probability[i] = 1 - mu[1] * exp(-theta[1] * time[i]) - mu[2] * exp(-theta[2] * time[i]);
}
model {
theta[1] ~ uniform(0.0,3);
theta[2] ~ uniform(2.0,5);
y_t ~ binomial(N_channel, probability);
//y_gaus ~ normal(y_t,sigma);
}
"""