UncLib Matlab Memory Performance

Eric Smith

Aug 19, 2014, 4:11:19 AM8/19/14
to unc...@googlegroups.com
Hi Michael,

I am trying to perform large uncertainty calculations in Matlab using LinProp (e.g. complex arrays more than 5000 elements long), and I am running into serious limitations. Matlab takes up all of my RAM trying to perform these calculations, often causing my entire computer to freeze up completely.

Do you have any advice on how to better perform these large computations? Can LinProp in Matlab handle sparse covariance matrices? I have experimented with them, but they don't seem to work consistently. Once I perform a very large LinProp calculation, Matlab stays very unresponsive until the program is completely restarted, regardless of whether I clear the variables from the workspace.

Thanks,
Eric


Eric Smith

Aug 19, 2014, 4:23:07 AM8/19/14
to unc...@googlegroups.com
Also, is there a way to tell LinProp to ignore correlation between the different data points?

For example, suppose I have an array of 2,000 independent scalar measurement points. Inputting this into Matlab would look something like this:

variable = LinProp( measured_values(1:2000) , diag(meas_unc(1:2000)).^2 )

This creates the entire 2000 x 2000 covariance matrix, but I don't care about the covariance (or am assuming independence).

Technically, I could create a for-loop to cycle through each measurement point and measurement uncertainty combination, but this is extremely slow.
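
Something like the following sketch (I'm assuming here that the scalar constructor takes a standard uncertainty and that preallocation and indexed assignment into a LinProp array work like this; I haven't verified either):

n = numel(measured_values);
variable = LinProp(zeros(n, 1));   % preallocate as zero-uncertainty constants
for k = 1:n
    % value and standard uncertainty of point k, independent of all other points
    variable(k) = LinProp(measured_values(k), meas_unc(k));
end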

Is there a way to tell LinProp to ignore the correlation between all of the different points? Ignoring the correlation may not be best practice, but for quick calculations it would be easier (and, I assume, faster to compute).

Thanks,
Eric

Michael Wollensack METAS

Aug 19, 2014, 5:09:30 AM8/19/14
to unc...@googlegroups.com
Hi Eric,

One LinProp object needs 8 bytes for the value and 12 bytes for each dependency.

m = a*(8 + b*12)

where
  • m is the memory used, in bytes
  • a is the number of objects
  • b is the number of dependencies per object
For example: 10000 objects with 10000 dependencies each --> 1.2 GB
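
As a quick check, the estimate can be computed directly in Matlab (the variable names below are just for illustration):

a = 10000;              % number of LinProp objects
b = 10000;              % dependencies per object
m = a * (8 + b * 12);   % memory in bytes
fprintf('%.2f GB\n', m / 1e9)   % prints 1.20 GB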

I really recommend using a 64-bit operating system with at least 16 GB of RAM installed (I have 32 GB).

Sparse covariance matrices are not supported at the moment.

Regards
Michael

Michael Wollensack METAS

Aug 19, 2014, 7:27:51 AM8/19/14
to unc...@googlegroups.com
Hi Eric,

I did some tests:

>> v = rand(10000,1);
>> j = randn(10000,10000);
>> cv = j*j';
>> tic; o = LinProp(v, cv); toc

This takes a very long time. The problem is that Metas.UncLib V1.4.4 uses alglib for the eigenvalue decomposition of the covariance matrix. I think in the next version I will switch to the Intel MKL eigenvalue decomposition, which will be much faster.

If there is no correlation between the elements (all off-diagonal entries are zero), it runs much faster.

>> tic; o = LinProp(v, diag(diag(cv))); toc
Elapsed time is 4.238901 seconds.
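
For your 2000-point example from the second message, this corresponds to passing a purely diagonal covariance, e.g. (using your variable names, as a sketch):

variable = LinProp(measured_values(1:2000), diag(meas_unc(1:2000).^2));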

Regards
Michael

Eric Smith

Aug 22, 2014, 2:07:18 PM8/22/14
to unc...@googlegroups.com
Michael,

Thanks for the quick reply. I am using a 64-bit OS with 8 GB of RAM. I guess I just need to get some more RAM.

UncLib is a very powerful tool, and I use it regularly. I appreciate the strong support you give to both UncLib and VNA Tools.

Thanks,
Eric