The main question is captured in the title: does a peptide's intensity affect its elution time?
The details (in case they are of interest) are as follows: using only very intense peptides (~500 peptides spread across a ~2 hr gradient on a Thermo Fusion), there is a clear alignment pattern that is easily represented by a smoothing cubic spline, and almost all of the peptides (e.g. 99.99%) fall within a very narrow band (e.g. 20 seconds) of that spline. HOWEVER, if less intense peptides are included in this modelling exercise, then about 1% of the peptides are nowhere near that 20-second error bar -- they can be as much as 1 to 3 minutes away. So I see two possibilities. One (the likely one) is that those peptides are simply outliers in my modelling algorithm; I find this intuitively likely, since I can think of a number of reasons why my algorithm might incorrectly claim that these lower-intensity peptides are off by 1 to 3 minutes. The other possibility is that intensity really does affect elution (in perhaps 1% of cases) and that those peptides *truly* are 1 to 3 minutes misaligned because of their intensity.
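For what it's worth, here is a minimal sketch of the modelling step I'm describing: fit a smoothing cubic spline to observed-vs-reference retention times and flag peptides whose residual exceeds a tolerance. The data, smoothing factor, and 20 s tolerance here are all illustrative placeholders, not my actual pipeline:

```python
# Hypothetical illustration of spline-based RT alignment with outlier flagging.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Simulated reference and observed retention times (seconds) over a ~2 hr gradient,
# with a slow systematic drift plus small random jitter (all values made up).
ref_rt = np.sort(rng.uniform(0, 7200, 500))
obs_rt = ref_rt + 5 * np.sin(ref_rt / 1500) + rng.normal(0, 3, ref_rt.size)

# Smoothing cubic spline (k=3); `s` controls the smoothing strength and
# would need tuning on real data.
spline = UnivariateSpline(ref_rt, obs_rt, k=3, s=ref_rt.size * 9)

# Residuals from the fitted alignment curve; anything beyond the tolerance
# is flagged as a potential outlier (or a truly misaligned peptide).
residuals = obs_rt - spline(ref_rt)
tolerance_s = 20.0
outliers = np.abs(residuals) > tolerance_s
print(f"{outliers.sum()} of {ref_rt.size} peptides fall outside +/-{tolerance_s:.0f} s")
```

On simulated well-behaved data like this, essentially nothing falls outside the band; the puzzle in my real data is the ~1% of low-intensity peptides that do.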
So, is there a theoretical (or observational) reason to suggest that intensity may affect elution?
I tried to come at the answer sideways by Googling and skimming papers -- mainly ones that discuss *prediction* of LC elution time -- and none suggested that intensity matters, though that is not a conclusive argument. (It could be that, since no one knows the intensity of an analyte ahead of time, there is little point in considering it when predicting LC elution time.)