Hello Kai,
No worries, that's what the forum is for.
I imagine you did not specify the Hessian of the cost, and I suppose you judge that one is faster than the other based on time rather than on iteration count? (Iterations of trust regions are typically much more expensive than iterations of CG.)
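For concreteness, here is a minimal sketch of how one might supply the Hessian and compare the two solvers on time as well as iteration count. I am using the Rayleigh-quotient example on the sphere purely as a stand-in for your cost (the matrix A, the dimension n and the plot are placeholders; adapt the cost, gradient and Hessian to your problem):

```matlab
% Minimal sketch: Rayleigh-quotient minimization on the sphere,
% used here only as an illustrative stand-in for your own cost.
n = 1000;
A = randn(n); A = (A + A')/2;            % random symmetric matrix

problem.M = spherefactory(n);            % unit sphere in R^n
problem.cost  = @(x) x'*(A*x);           % cost f(x) = x' A x
problem.egrad = @(x) 2*A*x;              % Euclidean gradient
problem.ehess = @(x, xdot) 2*A*xdot;     % Euclidean Hessian along xdot
                                         % (omit this line to see the effect
                                         %  of not specifying the Hessian)

% Run both solvers with default options.
[xtr, ctr, infotr] = trustregions(problem);
[xcg, ccg, infocg] = conjugategradient(problem);

% The info struct arrays record .iter, .time and .gradnorm per iteration,
% so you can compare on wall-clock time as well as on iteration count.
figure;
semilogy([infotr.time], [infotr.gradnorm], '.-', ...
         [infocg.time], [infocg.gradnorm], '.-');
xlabel('Time [s]'); ylabel('Gradient norm');
legend('RTR', 'CG');
```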
Your observation is familiar: I also tend to find that trust regions (RTR) is faster than nonlinear conjugate gradients (CG), at least as they are implemented in Manopt with default parameters, which are certainly not adequate in all situations.
I do not have a satisfying answer for why that is. Of course, we have convergence results for RTR which state explicitly that we can expect superlinear convergence (under some assumptions, and provided the Hessian approximation is good enough), whereas superlinear convergence results for CG on manifolds are hard to come by (although there has been recent work on Riemannian CG and I am not up to date). So, theoretically, I suppose we could expect this outcome; but I don't think that explains it: CG also appears to converge superlinearly in many situations. Why RTR routinely beats CG is a mystery to me.
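For clarity, by superlinear convergence I mean the standard notion (nothing specific to the Riemannian setting):

```latex
% Superlinear convergence of the iterates x_k to a limit point x*:
% the error ratio goes to zero,
\lim_{k \to \infty} \frac{\| x_{k+1} - x^\star \|}{\| x_k - x^\star \|} = 0 .
% In practice, on a semilog plot of gradient norm versus iteration,
% linear convergence shows up as a straight line, while superlinear
% convergence bends downward in the last few iterations.
```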
Sorry that I don't have an answer.
Best,
Nicolas