Using normalizeUsingRPKM, should I set a scaleFactor parameter?


Yung-Chih Lai

May 5, 2016, 1:22:53 PM
to deepTools

Hi,

 

I downloaded several histone modification ChIP-Seq datasets without Input. They have varying sequencing depths (from 17 million to 242 million reads). I would like to use bamCoverage to normalize them (e.g. with the command line below). My question is: when I use RPKM to normalize them, should I also set the --scaleFactor parameter, e.g. to 17 or 242 for the total number of reads in millions? Or is it better to just set --scaleFactor=1, since --normalizeUsingRPKM already normalizes for the total number of mapped reads in each sample? Many thanks.

 

bamCoverage --bam HMLE_Parental_H3K4me3.bam --outFileName HMLE_Parental_H3K4me3.bw --outFileFormat=bigwig --scaleFactor=17 --binSize=50 --extendReads=200 --numberOfProcessors=6 --normalizeUsingRPKM --verbose

 

Best,

 

Gary

Fidel Ramirez

May 6, 2016, 3:23:51 AM
to Yung-Chih Lai, deepTools
Hi,

Don't use --scaleFactor; just setting --normalizeUsingRPKM is enough. It will take care of the normalization based on the total number of sequenced reads.
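
For reference, the RPKM value is computed per bin as the number of reads in the bin divided by (number of mapped reads in millions * bin length in kb), so the differing library sizes (17 million vs. 242 million reads) are already accounted for. A sketch of the adjusted command, simply dropping --scaleFactor from the one you posted (the remaining parameter values are yours, not a recommendation):

bamCoverage --bam HMLE_Parental_H3K4me3.bam --outFileName HMLE_Parental_H3K4me3.bw --outFileFormat=bigwig --binSize=50 --extendReads=200 --numberOfProcessors=6 --normalizeUsingRPKM --verbose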

Best,

fidel





Yung-Chih Lai

May 6, 2016, 3:53:19 AM
to Fidel Ramirez, deepTools
Hi Fidel,

I see. Thank you so much.

Best,

Gary