... + pertsf[LI[1]] - Scalar[pertsf[LI[1]]] + ...
Why is this happening? Of course, I can add an extra rule identifying pertsf[LI[1]] with Scalar[pertsf[LI[1]]], which works fine. However, terms like

pert[LI[1], \[Alpha], -\[Alpha]] - Scalar[metric[\[Alpha], \[Beta]]*pert[LI[1], -\[Alpha], -\[Beta]]]

are still not cancelled, for the same reason.
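For completeness, the extra rule I mentioned looks roughly like this (a sketch only; I strip the Scalar head rather than add one, so both spellings of the perturbation cancel, and a subsequent ContractMetric takes care of the metric-contracted form; xAct's built-in NoScalar does the same thing globally):

```mathematica
(* Sketch of the workaround: remove Scalar from the scalar-field
   perturbation so that pertsf[LI[1]] - Scalar[pertsf[LI[1]]] cancels *)
dropScalar = Scalar[pertsf[LI[n_]]] :> pertsf[LI[n]];

expr /. dropScalar // ContractMetric // ToCanonical

(* Alternatively, the xAct built-in NoScalar removes every Scalar head: *)
expr // NoScalar // ContractMetric // ToCanonical
```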
What I do is the following: I have an expression L that depends on a metric and a scalar field, and I want its second-order perturbation:
varL2 = ToCanonical[ContractMetric[ExpandPerturbation[Perturbation[L, 2]]]]
vartf2 = 2 (-VarD[pert[LI[1], \[Alpha], \[Beta]], cd][varL2] / Sqrt[-Detmetric[]]
      /. {delta[-LI[1], LI[1]] -> 1,
          delta[-LI[2], LI[1]] -> 0,
          delta[-LI[1], LI[2]] -> 0}
      // SeparateMetric[metric] // RicciToEinstein) // Expand // ContractMetric // ToCanonical
eterm2 = IndexCollect[cd[\[Alpha]]@vartf2 //ToCanonical, CD[-\[Beta]][sf[]]] //Simplification
result = eterm2 /. myRule /. myRule2 /. myRule3 // Simplification
If you spot any error in this approach, please let me know. Thank you very much in advance! :)