RAM memory leaking?


Kamil Krawczyk

May 17, 2019, 6:31:59 AM
to lavaan
Hi,

In my previous post (https://groups.google.com/forum/#!topic/lavaan/z_-YnFdhq2k) I was talking about making a time series model in lavaan.

I was working on this model today and something weird happened...

After fitting, RStudio told me the fitted model object is only about 200 MB, but I checked RAM usage before and after modeling:
rsession.exe grew from about 333 MB to 9.3 GB, and I don't know why (screenshots attached: start.PNG, end.PNG, statistics.PNG).


The data set I'm using is only 294 × 98, which I think is small.

Best regards,
Kamil

Yves Rosseel

May 17, 2019, 9:40:22 AM
to lav...@googlegroups.com
I don't think there is any leaking.

You have 384 free parameters and 95 constraints. Therefore, to compute
standard errors, we need to compute an augmented information matrix
with 479 rows and columns, i.e. 479 * 479 = 229441 elements. This
matrix needs to be inverted, which introduces at least one additional
copy of it. Assuming 32 bits per element, each matrix takes about
7342112 bits (roughly 0.9 MB)...
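For reference, the arithmetic above can be checked in a few lines of R. Note that R actually stores numeric matrices as 64-bit doubles, so the real footprint per copy is about twice the 32-bit estimate:

```r
n_free <- 384; n_con <- 95   # free parameters and equality constraints
k <- n_free + n_con          # 479: dimension of the augmented matrix
elements <- k * k            # 229441 elements
bytes <- elements * 8        # 64-bit doubles per element
round(bytes / 1024^2, 2)     # roughly 1.75 MB per copy
```

So a handful of copies of this matrix alone does not explain gigabytes of RAM; most of the memory goes to intermediate objects created during estimation.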

If you don't need standard errors, you can pass se = "none" to the fitting function.
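A minimal sketch of that option (the model string and `mydata` are placeholders for illustration, not the actual model from the thread):

```r
library(lavaan)

# Hypothetical toy model; substitute your own time-series model syntax.
model <- ' f =~ x1 + x2 + x3 '

# se = "none" skips the standard-error computation entirely, so the
# augmented information matrix is never built or inverted.
fit <- sem(model, data = mydata, se = "none")

summary(fit)  # no standard errors or p-values are reported
```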

Yves.


Kamil Krawczyk

May 17, 2019, 9:56:30 AM
to lavaan
Thank you Yves.

I need standard errors to see the p-values of the parameters I set, but I can always make some ACF/PACF plots before fitting the model. That also obviously speeds up the fitting.

Kamil 

Kamil Krawczyk

May 24, 2019, 3:03:41 AM
to lavaan
Yves, could you tell me one more thing? I have observed that the lavaan object itself also needs a big amount of RAM. I want to know if it's possible to reduce it in some way.

Yves Rosseel

Jun 8, 2019, 3:28:02 PM
to lav...@googlegroups.com

The only thing you can do for the moment is to add the NACOV = FALSE
argument. I agree that this is something that should be handled better
(reducing the memory requirements for large-scale models).
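A hedged sketch of that call (`model` and `mydata` are placeholders, not names from the thread):

```r
library(lavaan)

# NACOV = FALSE: don't compute/store the asymptotic covariance matrix
# of the sample statistics inside the fitted object, which can make
# the returned lavaan object considerably smaller.
fit <- sem(model, data = mydata, NACOV = FALSE)

# Compare the in-memory size of the fitted object with and without it:
print(object.size(fit), units = "MB")
```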

Please open up an issue about this on the lavaan github site, and if
possible add a reproducible script.

Yves.
