dataset query error in fastLmPure(lmvars$X, y, method = 2) : NROW(y) == nrow(X) is not TRUE only for subset of data

Miriam Vignando

Aug 17, 2020, 5:07:16 AM
to brainGr...@googlegroups.com, Vignando, Miriam
Hello Chris, 

thank you for developing brainGraph. I found it quite easy to use, and the user guide was super helpful and very detailed. I am actually looking forward to using it for other datasets as well!

I managed to do everything I wanted on my structural data (T1-weighted only) from almost 500 patients. The dataset was created in collaboration with other research groups, so it underwent a data harmonisation process before I could use brainGraph or do any kind of analysis on the data. (I used the Desikan atlas and ran analyses on both cortical thickness and surface area.)

Overall everything worked really well. However, it took me a while to get there: with the whole dataset I couldn't get past one of the very first steps, namely

all.dat.resids <- get.resid(lhrh, covars=covars, exclude.cov='Group')

which returned: Error in fastLmPure(lmvars$X, y, method = 2) : NROW(y) == nrow(X) is not TRUE.
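
(For anyone else hitting this, a check along these lines can show where the two row counts diverge. This is just a sketch; it assumes, as in the user guide, that both lhrh and covars are data.tables with a Study.ID column.)

library(data.table)

# Row counts that feed NROW(y) and nrow(X) should agree
dim(lhrh); dim(covars)

# Subjects present in one table but missing from the other
setdiff(covars$Study.ID, lhrh$Study.ID)
setdiff(lhrh$Study.ID, covars$Study.ID)

# Duplicated IDs would also throw the two row counts out of sync
covars[duplicated(Study.ID)]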

I checked everything I could and eventually decided to re-run the code separately for each of the datasets that make up my full dataset. 
Eventually I noticed that some subjects from 2 of the datasets were causing this. I don't know how yet, but after removing those subjects everything worked smoothly. 

I have manually inspected the data and there is nothing obviously wrong with them, so I wanted to ask what, in your opinion, might have driven this, and whether it has happened to anyone else before. 


Many thanks 
Miriam 

Chris Watson

Aug 19, 2020, 8:44:49 AM
to brainGr...@googlegroups.com, Vignando, Miriam
Hi Miriam, thank you for your kind words.

I am not sure what the exact cause of the error is. It is possible some of the subjects have incomplete data (i.e., "NA" values). What did you see in the subjects you ended up removing?
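
Something along these lines (a sketch; it assumes the morphometry data live in a data.frame or data.table named lhrh, as in your snippet) would flag incomplete rows:

# Rows containing any NA value (base R; works for data.frames and data.tables)
lhrh[!complete.cases(lhrh), ]

# NA counts per column, to see which regions (if any) are affected
colSums(is.na(lhrh))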
If you want to wait for the release of v3.0, this issue might be fixed there, as I no longer rely on fastLmPure (from the RcppEigen package).

Chris


Miriam Vignando

Aug 19, 2020, 8:53:51 AM
to brainGr...@googlegroups.com, Vignando, Miriam
Hi Chris, 

thank you for your email! There were no NA values, and I also re-checked the segmented images with freeview just to make sure I hadn't missed anything weird, which was not the case. 
When I was examining the dataset, I started by removing the two smallest datasets contributing to my full dataset. After removing the first few subjects, the error changed to: Error in solve.default(crossprod(X)) : system is computationally singular: reciprocal condition number = 0. That made me decide to explore the pairwise covariance, and I thought I should remove the subjects showing the highest correlations, since I know that too many linearly dependent columns can make the matrix non-invertible and cause this. Does this make sense in your opinion? (I figured the underlying problem was always the same, and removing some cases gave me a bit more insight into it, but I might well be wrong about this.)
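
In case it is useful to others, this is roughly the kind of check I mean (a sketch: the design matrix X is internal to get.resid, so I rebuild something like it with model.matrix, and the covariate names below are just placeholders):

# Hypothetical reconstruction of the design matrix from the covariates table
X <- model.matrix(~ Age + Sex + Scanner, data = covars)

# Rank deficiency: TRUE means some columns are linear combinations of others
qr(X)$rank < ncol(X)

# Condition number of X'X; huge values (or Inf) correspond to the
# "computationally singular" error from solve()
kappa(crossprod(X))

# Pairwise correlations among the predictors, to spot near-duplicate columns
cor(X[, -1])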

When is v3.0 being released? This analysis was the last one for a study I have already written up. I am still refining everything, so depending on the release date I might try the new version, if you think it might address the problem. 

Thank you!
Miriam 

Chris Watson

Aug 20, 2020, 1:43:11 AM
to brainGr...@googlegroups.com, Vignando, Miriam
It's possible that is the cause of the error.
I hope to have the next version done in another week, or maybe 2.

Miriam Vignando

Aug 20, 2020, 4:34:14 AM
to brainGr...@googlegroups.com
Thank you so much for all the information! 
Kindest regards

Miriam 
