Is it possible to manipulate the Std.Err. without manipulating the Estimate?


gaia...@usal.es

Apr 9, 2019, 4:51:33 AM4/9/19
to lavaan
Good morning,
I tried to find an answer to this question in the forum and found no reply. In any case, I apologize if this has been answered before.
Can I manipulate the Std.Err. of an estimate without altering the estimate value?
I tried to do it by increasing the Std. Error term (the sd argument) in rnorm(N, mean, sd). However, by comparing two models, one with the default error term and one with it modified (see below), I realized that:


N <- 10000

x <- rnorm(N)

y1 <- 0.55*x + rnorm(N)

y2 <- 0.24*x + 0.47*y1 #+ rnorm(N) OR rnorm(N,,5) <- the models differ here, in whether to include one option or the other


1) The estimate is less accurate when modifying the Std. Error (0.242 vs 0.197). Why? It is possible to modify the error variance of a regression without modifying the estimates of the slopes, right?
2) While the variance of the variable increases according to the set value, the Std. Error of the estimate (what I want to manipulate) does not.
3) Is it possible to manipulate the Std. Error of the estimates for y2 without doing so for y1? Is it possible to do it like this?
y2 <- 0.24*x + rnorm(N)
y2 <- 0.47*y1 + rnorm(N,,5)

Thanks in advance
I attach the two models below, with the changed parameters/values marked in bold.

Model with no defined Std. Err.

rm(list=ls(all=TRUE))

library(lavaan)

N <- 10000

x <- rnorm(N)

y1 <- 0.55*x + rnorm(N)

y2 <- 0.24*x + 0.47*y1 + rnorm(N)

LinearMedData <- data.frame(x, y1,y2)

 

#Data analysis

LinearMedMod <- 'y1~a*x

y2~b*x+ c*y1

'

fitLinearMedMod <- sem(LinearMedMod, data = LinearMedData)

summary(fitLinearMedMod)

Regressions:

                   Estimate  Std.Err  z-value  P(>|z|)

  y1 ~                                               

    x          (a)    0.537    0.010   54.050    0.000

  y2 ~                                               

    x          (b)    0.242    0.011   21.165    0.000

    y1         (c)    0.463    0.010   45.746    0.000

 

Variances:

                   Estimate  Std.Err  z-value  P(>|z|)

   .y1                0.985    0.014   70.711    0.000

   .y2                1.011    0.014   70.711    0.000

 

Model with defined Std. Err.

rm(list=ls(all=TRUE))

library(lavaan)

N <- 10000

x <- rnorm(N)

y1 <- 0.55*x + rnorm(N)

y2 <- 0.24*x + 0.47*y1 + rnorm(N,,5)

LinearMedData <- data.frame(x, y1,y2)

 

#Data analysis

LinearMedMod <- 'y1~a*x

y2~b*x+ c*y1

'

fitLinearMedMod <- sem(LinearMedMod, data = LinearMedData)

summary(fitLinearMedMod)

Regressions:

                   Estimate  Std.Err  z-value  P(>|z|)

  y1 ~                                               

    x          (a)    0.557    0.010   55.853    0.000

  y2 ~                                               

    x          (b)    0.197    0.057    3.475    0.001

    y1         (c)    0.458    0.050    9.206    0.000

 

Variances:

                   Estimate  Std.Err  z-value  P(>|z|)

   .y1                1.005    0.014   70.711    0.000

   .y2               24.894    0.352   70.711    0.000

Rönkkö, Mikko

Apr 9, 2019, 5:10:21 AM4/9/19
to lav...@googlegroups.com
Hi,

I am answering below.

On 9 Apr 2019, at 10.51, gaia...@usal.es wrote:

Good morning,
I tried to find an answer to this question in the forum and found no reply. In any case, I apologize if this has been answered before.
Can I manipulate the Std.Err. of an estimate without altering the estimate value?

Yes. In regression, the standard error depends on three quantities: sample size, correlation between the predictors, and the estimated error variance.
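To make those three quantities concrete, here is a small base-R sketch (my own addition, not from the original reply) that reproduces the standard errors lm() reports from the textbook formula SE(b_j) = sqrt(s2 * [(X'X)^-1]_jj): the SE grows with the estimated error variance, shrinks with sample size, and grows with the correlation between the predictors.

```r
# Sketch: recompute lm()'s standard errors by hand.
set.seed(1)
N  <- 1000
x1 <- rnorm(N)
x2 <- 0.5*x1 + rnorm(N)              # correlated predictors
y  <- 0.3*x1 + 0.4*x2 + rnorm(N)

fit <- lm(y ~ x1 + x2)
X   <- model.matrix(fit)
s2  <- sum(residuals(fit)^2) / (N - ncol(X))  # estimated error variance
se  <- sqrt(s2 * diag(solve(t(X) %*% X)))     # textbook formula

# Matches the standard errors lm() reports:
cbind(by_hand = se, lm = summary(fit)$coefficients[, "Std. Error"])
```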

I tried to do it by increasing the Std. Error term (the sd argument) in rnorm(N, mean, sd). However, by comparing two models, one with the default error term and one with it modified (see below), I realized that:


N <- 10000

x <- rnorm(N)

y1 <- 0.55*x + rnorm(N)

y2 <- 0.24*x + 0.47*y1 #+ rnorm(N) OR rnorm(N,,5) <- the models differ here, in whether to include one option or the other


1) The estimate is less accurate when modifying the Std. Error (0.242 vs 0.197). Why? It is possible to modify the error variance of a regression without modifying the estimates of the slopes, right?

Because changing the variance of the error term changes the covariance between the explanatory variables and the dependent variable. This is easier to see in a smaller sample:

library(lavaan)
set.seed(1)
N <- 100
x <- rnorm(N)
y1 <- 0.55*x + rnorm(N)
e <- rnorm(N)
y2 <- 0.24*x + 0.47*y1 + e
y3 <- 0.24*x + 0.47*y1 + e*5

data <- data.frame(x, y1, y2, y3)
cov(data)

coef(sem("y2~ x + y1",data))
coef(sem("y3~ x + y1",data))

2) While the variance of the variable increases according to the set value, the Std. Error of the estimate (what I want to manipulate) does not.

The easiest way to manipulate SEs is to estimate the model using a covariance matrix. First generate a dataset, then calculate the covariance matrix and estimate using that. Then re-estimate the model using the same covariance matrix but with a smaller sample size, or manipulate the covariance matrix to increase the y2 variance without changing the covariances.
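A sketch of this approach (my own code, not from the reply): lavaan can estimate from a covariance matrix via the sample.cov and sample.nobs arguments, so the same matrix can be refit with a smaller nominal sample size, which inflates the standard errors while leaving the point estimates essentially unchanged.

```r
library(lavaan)
set.seed(1)
N  <- 10000
x  <- rnorm(N)
y1 <- 0.55*x + rnorm(N)
y2 <- 0.24*x + 0.47*y1 + rnorm(N)
S  <- cov(data.frame(x, y1, y2))   # estimate from this, not the raw data

model <- "y1 ~ a*x
          y2 ~ b*x + c*y1"

fit1 <- sem(model, sample.cov = S, sample.nobs = N)    # original N
fit2 <- sem(model, sample.cov = S, sample.nobs = 100)  # pretend N = 100

# Same point estimates (up to a tiny n/(n-1) rescaling), larger SEs in fit2:
parameterEstimates(fit1)[1:3, c("label", "est", "se")]
parameterEstimates(fit2)[1:3, c("label", "est", "se")]
```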

Another approach is to add orthogonalized noise to the y2 variable:

library(lavaan)
set.seed(1)
N <- 100
x <- rnorm(N)
y1 <- 0.55*x + rnorm(N)
e <- rnorm(N)
y2 <- 0.24*x + 0.47*y1 + e

data <- data.frame(x, y1, y2)

coef(sem("y2~ x + y1",data))

data[,"y2"] <- data[,"y2"] + residuals(lm(rnorm(N)*5 ~ x + y1))
coef(sem("y2~ x + y1",data))


3) Is it possible to manipulate the Std. Error of the estimates for y2 without doing so for y1? Is it possible to do it like this?
y2 <- 0.24*x + rnorm(N)
y2 <- 0.47*y1 + rnorm(N,,5)

In this case, no. With more than two predictors, however, it is possible: by manipulating the correlations between the predictors you can increase the collinearity among some variables while keeping it constant for others.
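A sketch of the collinearity point (my own illustration, using lm for brevity): with three predictors, raising the correlation between x1 and x2 inflates their standard errors while the SE of the uncorrelated x3 barely moves.

```r
set.seed(1)
N  <- 1000
x1 <- rnorm(N)
x3 <- rnorm(N)

se_of <- function(r) {
  x2 <- r*x1 + sqrt(1 - r^2)*rnorm(N)   # cor(x1, x2) is approximately r
  y  <- 0.3*x1 + 0.3*x2 + 0.3*x3 + rnorm(N)
  summary(lm(y ~ x1 + x2 + x3))$coefficients[-1, "Std. Error"]
}

se_of(0)     # low collinearity: all three SEs are similar
se_of(0.95)  # high collinearity: SEs of x1 and x2 inflate, x3 barely changes
```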

Why do you want to manipulate the SEs?

Mikko


MARIO GARRIDO ESCUDERO

Apr 10, 2019, 12:31:16 PM4/10/19
to lav...@googlegroups.com
Hi Mikko,
first of all, thanks a lot for the detailed response. Even so, I don't think I understand it properly yet; I need to study it in detail.
My intention is to create paths with different estimates and SEs to check their robustness under different analyses. For example, I want to generate a dataset of 10000 and run a model with the whole dataset and with smaller subsamples to check the consistency of p-values and other statistics.
That's why I need to create a dataset from a 'true' model where I can know in advance what the Estimate and Std.Err. should be, so that I can check afterwards whether I obtain those values when running models under different conditions.
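If helpful, here is a sketch of that workflow (my own code, written from the description above): fit the same model on random subsamples of the full dataset and collect the p-value for one path.

```r
library(lavaan)
set.seed(1)
N  <- 10000
x  <- rnorm(N)
y1 <- 0.55*x + rnorm(N)
y2 <- 0.24*x + 0.47*y1 + rnorm(N)
d  <- data.frame(x, y1, y2)
model <- "y1 ~ a*x
          y2 ~ b*x + c*y1"

# P-value of path b when refitting on a random subsample of size n
p_for_b <- function(n) {
  fit <- sem(model, data = d[sample(N, n), ])
  pe  <- parameterEstimates(fit)
  pe$pvalue[pe$label == "b"]
}

sapply(c(50, 200, 1000, 10000), p_for_b)  # p-values typically shrink as n grows
```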

MARIO GARRIDO ESCUDERO

Apr 11, 2019, 3:18:27 AM4/11/19
to lav...@googlegroups.com
Hi again,
after reading for a while, I don't know if I am totally confused.
What I want is to fix the estimate while manipulating its Std.Err. (both in bold below) so I can increase the uncertainty and reflect it in the p-value (right now all p-values of the set model are zero; I want them to be higher, non-zero values). I do not know if I have explained this well and whether your answers point in that direction.
Thanks a lot 

Regressions:

                   Estimate  Std.Err  z-value  P(>|z|)

  y1 ~                                               

    x          (a)    0.537    0.010   54.050    0.000

  y2 ~                                               

    x          (b)    0.242    0.011   21.165    0.000
