On Thu, Oct 23, 2014 at 2:27 PM, Max Song <
maxso...@gmail.com> wrote:
> I want to use the statsmodels.regression.linear_model.OLS
> package to do a prediction, but with a specified constant.
>
> Currently, I can specify the presence of a constant with an argument:
>
> (from docs:
>
http://statsmodels.sourceforge.net/devel/generated/statsmodels.regression.linear_model.OLS.html)
>
> class statsmodels.regression.linear_model.OLS(endog, exog=None,
> missing='none', hasconst=None), where **hasconst** is a boolean.
>
> What I want to do is explicitly specify a constant C and then fit a
> linear regression model around it. From that OLS, I want to generate a
> <RegressionResults class instance> and then access all the attributes
> like resid, etc.
>
> A current, suboptimal workaround would be to specify the OLS without a
> constant, subtract the constant from the Y values, and create a custom
> object wrapping both the specified constant and the no-constant OLS, so
> that every fit or predict call first subtracts the constant from Y and
> then uses the underlying model for prediction.
That's essentially the best (and only) way to do it.
A small advantage of this approach: you don't need to add the constant
column to X when predicting.
Fixing the constant isn't currently supported directly.
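A minimal numpy-only sketch of that workaround (illustrative; `np.linalg.lstsq` stands in for `sm.OLS(y - C, X).fit()`, and the data here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
C = 2.0                                   # the constant we want to fix
x = np.linspace(0.0, 1.0, 50)
y = C + 3.0 * x + rng.normal(scale=0.1, size=50)

# Workaround: subtract the fixed constant from y, then fit without an
# intercept column in X.
X = x[:, None]
beta, *_ = np.linalg.lstsq(X, y - C, rcond=None)

resid = (y - C) - X @ beta                # residuals of the shifted model
pred = C + X @ beta                       # add the constant back to predict
```

The last line is the "small advantage" mentioned above: the constant is added back as a scalar, so X never needs a constant column.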
Something similar is available through `fit_constrained` for the models
that currently define an offset; however, in that case we handle the
general non-homogeneous restriction `R params = q`.
We don't have offset defined for all models, and I thought it wouldn't
be necessary for the linear models.
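To make the `R params = q` form concrete for this case: fixing the constant at C is the restriction R = [1, 0], q = [C]. A numpy sketch of the textbook restricted-least-squares update (an illustration of the math, not the statsmodels implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])      # columns: [const, x]
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=n)

# Fix the constant at C via the linear restriction R @ params = q.
C = 2.0
R = np.array([[1.0, 0.0]])
q = np.array([C])

XtX_inv = np.linalg.inv(X.T @ X)
b_ols = XtX_inv @ X.T @ y
# Restricted-LS estimator:
#   b_r = b_ols - (X'X)^-1 R' [R (X'X)^-1 R']^-1 (R b_ols - q)
b_r = b_ols - XtX_inv @ R.T @ np.linalg.solve(
    R @ XtX_inv @ R.T, R @ b_ols - q
)
```

For this special case, b_r reproduces the offset workaround exactly: b_r[0] equals C, and b_r[1] equals the slope from regressing y - C on x without an intercept.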
I don't know whether it would be better to support offset in the linear
models (several code changes for little benefit) or as part of a general
restricted linear model (more computational work, because we might not
take advantage of the special case where only the constant is fixed).
Josef