Downward bias in covariance matrix from genomic-control-corrected summary statistics


Austin Park

Sep 2, 2022, 4:02:42 AM
to Genomic SEM Users
Hello all,

I recently came across reports that heritability and genetic covariance estimates from LDSC are downwardly biased when the input summary statistics have been corrected with genomic control (GC).
Could this affect the factor loadings and model fit obtained with Genomic SEM?
Is it recommended to use summary statistics that have not been GC-corrected?

Best,

Austin

Elliot Tucker-Drob

Sep 2, 2022, 2:29:44 PM
to Austin Park, Genomic SEM Users
For estimating S and V with ldsc(), I would recommend following best practices for standard LDSC. In the case of genomic control, I agree that it would be preferable to use summary statistics that have not been corrected for lambda GC.
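For example, a minimal sketch of that workflow (the file names, prevalences, and LD score paths below are hypothetical placeholders, not from this thread):

library(GenomicSEM)

# Estimate the genetic covariance matrix S and its sampling covariance matrix V
# from munged summary statistics that were NOT corrected with genomic control
LDSCoutput <- ldsc(
  traits = c("trait1.sumstats.gz", "trait2.sumstats.gz"),
  sample.prev = c(NA, NA),          # NA for continuous traits
  population.prev = c(NA, NA),
  ld = "eur_w_ld_chr/",             # folder of LD scores
  wld = "eur_w_ld_chr/",            # folder of LD weights
  trait.names = c("T1", "T2")
)
# LDSCoutput$S holds the genetic covariances; LDSCoutput$V their sampling covariances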
Note that the sumstats function has an option to correct the SEs of SNP effects using the LDSC intercept, which indexes inflation due to confounding. See step 4.8 here: https://github.com/GenomicSEM/GenomicSEM/wiki/4.-Common-Factor-GWAS
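A sketch of that option in a sumstats() call (again, the file and trait names are placeholders):

# Prepare SNP-level summary statistics; GC = "standard" multiplies each SE by
# the square root of the univariate LDSC intercept ("conserv" and "none" are
# the other options)
SNPs <- sumstats(
  files = c("trait1.txt", "trait2.txt"),
  ref = "reference.1000G.maf.0.005.txt",
  trait.names = c("T1", "T2"),
  se.logit = c(FALSE, FALSE),       # TRUE for traits with logistic SEs
  GC = "standard"
)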


Austin Park

Sep 13, 2022, 3:50:33 AM
to Genomic SEM Users
Hi,

Thanks for the response!

I have a follow-up question regarding the SE correction using the LDSC intercept.

The documentation states:
GC: Level of Genomic Control (GC) you want the function to use. The default is 'standard' which adjusts the univariate GWAS standard errors by multiplying them by the square root of the univariate LDSC intercept. Additional options include 'conserv', which adjusts standard errors by multiplying with univariate LDSC intercept, and 'none' which does not correct the standard errors.

But is it correct that the SE is adjusted by multiplying it by the square root of the univariate LDSC intercept?
Or should the SE instead be divided by the square root of the univariate LDSC intercept?

I was confused because, as far as I know, genomic control correction of the SE using lambda GC is done by dividing the SE by the square root of lambda GC.

Best,

Austin

Elliot Tucker-Drob

Sep 13, 2022, 8:02:03 AM
to Austin Park, Genomic SEM Users
The intercept indexes inflation of the chisq(1) statistic beyond its expected null value of 1.0. Values above 1.0 represent inflation, i.e., more significant effects than expected. By multiplying the SE by the square root of the inflated intercept, the SE becomes larger, thereby making effects a bit less significant to counteract the inflation. The correction to the SE uses the square root of the intercept because the intercept indexes inflation of the chi-square statistic, whose square root is beta/SE (i.e., Z).
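A quick numeric check of that logic (the effect size, SE, and intercept here are made up for illustration):

beta <- 0.05; se <- 0.01
intercept <- 1.2                        # hypothetical univariate LDSC intercept
chisq_raw <- (beta / se)^2              # Z^2, the chisq(1) statistic
se_adj <- se * sqrt(intercept)          # the "standard" correction
chisq_adj <- (beta / se_adj)^2
all.equal(chisq_adj, chisq_raw / intercept)   # TRUE: chisq is deflated by exactly the intercept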
