Stepwise R/qtl: unknown warning messages and odd QTL curve


Claire O'Quin

May 23, 2015, 8:58:55 PM5/23/15
to rqtl...@googlegroups.com
Hi There,

I am running a stepwise QTL for a backcross and got the following warning message at the end of my stepwise run:

Warning message:
In lastout[[i]] - (max(lastout[[i]]) - dropresult[rn == qn[i], 3]) :
  longer object length is not a multiple of shorter object length

Unfortunately, I have no clue what this means or how to correct the issue. When I plotted the results, the QTL curve on chromosome 3 looked very odd (attached), so I suspect the warning is connected to that curve.

Additionally, I tried running fitqtl just to see what would happen and got an error:

Error in solve.default(t(Z) %*% Z, t(Z) %*% X) :
  system is computationally singular: reciprocal condition number = 1.49755e-24

Any thoughts about what is going on?

I will note that I am including a covariate in my analysis. I created head.covar <- pull.pheno(sawfly.cross, pheno.col=19) and used addcovar=head.covar when running my permutations. However, when I tried to use that same covariate in my stepwise QTL, I had to change it to covar=sawfly.cross$pheno$Head.Area because I got an error saying the covariate had an incorrect number of dimensions. I don't know if this is related to my problem.
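For reference, here is a sketch of the two forms I used (object names from my code above; pheno.col=2 is just a placeholder; my understanding, which may be wrong, is that stepwiseqtl wants the covariate as a data frame, while addcovar accepts a plain vector):

```r
library(qtl)

# covariate as a plain vector -- this worked for the permutations (addcovar=)
head.covar <- pull.pheno(sawfly.cross, pheno.col=19)
operm <- scanone(sawfly.cross, pheno.col=2, method="hk",
                 addcovar=head.covar, n.perm=1000)

# for stepwiseqtl, wrapping the covariate in a data frame avoids the
# "incorrect number of dimensions" error
head.covar.df <- data.frame(Head.Area = sawfly.cross$pheno$Head.Area)
stepout <- stepwiseqtl(sawfly.cross, pheno.col=2, method="hk",
                       covar=head.covar.df)
```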

Thank you,
Claire
Attachment: Phenotype2_Oddplot.pdf

Karl Broman

May 23, 2015, 9:03:33 PM5/23/15
to rqtl...@googlegroups.com
Yes, it seems that something has gone horribly wrong. I've not seen this before and can't tell what the problem is without looking at your data and code. Could you send them to me privately?

karl

Claire O'Quin

May 27, 2015, 1:15:27 PM5/27/15
to rqtl...@googlegroups.com
Hi Karl,

Sorry for the delayed reply. Yes, I will try to get that sent to you by the end of the day; thank you. I've been playing with my data some more, and I seem to get this warning message when the model places two QTL practically on top of each other (at adjacent markers).

Claire

Karl Broman

May 27, 2015, 3:33:00 PM5/27/15
to rqtl...@googlegroups.com, cto...@gmail.com
The warning message:

Warning message:
In lastout[[i]] - (max(lastout[[i]]) - dropresult[rn == qn[i], 3]) :
  longer object length is not a multiple of shorter object length

is caused by a bug that is revealed when multiple QTL map to very nearly the same position. This seems to arise because you have very strong QTL effects, and Haley-Knott regression isn't working well in this case.

I'd recommend switching to the multiple imputation method (method="imp"): use sim.geno in place of calc.genoprob, and method="imp" in place of method="hk".
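In code, the switch looks something like this (the step and n.draws values are just examples):

```r
library(qtl)

# before: calc.genoprob() + method="hk"
# sawfly.cross <- calc.genoprob(sawfly.cross, step=1)
# out <- stepwiseqtl(sawfly.cross, method="hk")

# after: sim.geno() + method="imp"
sawfly.cross <- sim.geno(sawfly.cross, step=1, n.draws=128)
out.imp <- stepwiseqtl(sawfly.cross, method="imp")
```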

karl

Claire O'Quin

Jun 1, 2015, 3:20:43 PM6/1/15
to rqtl...@googlegroups.com, cto...@gmail.com
I'm in the process of trying to run the multiple imputation method for this data set. With 1000 permutations to run, it is obviously a very slow process. Can I run the permutations in sets and then use cbind() to combine the results? If so, what do you suggest as an appropriate permutation number and batch size? Using my old methods, for 1000 permutations, I was using a batch size of 100. Thank you.

Karl Broman

Jun 3, 2015, 6:42:48 AM6/3/15
to rqtl...@googlegroups.com
Yes, I would run things in sets and combine them together afterwards.

The appropriate batch size depends on how many computers you have and how much memory they have available.

I would use all of the computers I have available, pay attention to how much RAM is on each machine, and start only as many simultaneous jobs on a given computer as will fit into the available RAM.

I also generally aim for jobs that are ~4 hrs long, and then I'll set new jobs running when the old ones have completed. This way if something goes wrong, I'll know sooner rather than later.
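For scanone-type permutations, a batched run might look something like this (an untested sketch; note that for scanoneperm objects it's c(), not cbind(), that stacks permutation replicates from separate runs):

```r
library(qtl)

# run 1000 permutations as 10 batches of 100, each with its own seed;
# in practice each batch would be a separate job, saved and reloaded
operm <- NULL
for(i in 1:10) {
  set.seed(1000 + i)
  op <- scanone(sawfly.cross, method="imp", n.perm=100)
  operm <- if(is.null(operm)) op else c(operm, op)
}
summary(operm, alpha=0.05)   # 5% genome-wide threshold
```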

karl