Hello again,
I cannot think of a clean way to force K both to be a projection matrix and to have a prescribed diagonal exactly.
My suggestion then would be to proceed as follows: every orthogonal projector K of rank k can be written as K = VV', where V is an n x k matrix (k < n) with orthonormal columns, that is, V'V = Identity (here, V' is the conjugate-transpose of V). Indeed, such a K satisfies K^2 = V(V'V)V' = VV' = K.
We want to minimize
f(V) = -trace(X' ( (VV') .* conj(VV') ) X)
with V on grassmanncomplexfactory(n, k): the Grassmann manifold is appropriate here because the cost depends on V only through the product VV'.
(Matlab 2021b or more recent with the Deep Learning Toolbox can figure out the gradient for you automatically with problem = manoptAD(problem); otherwise, you can work out the gradient with some calculus on paper: see also Sec. 4.7 in my book on my webpage if need be. A sketch of the setup follows below.)
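Here is a minimal sketch of how this could look in Manopt, assuming the data matrix X (n x m, possibly complex) and the rank k are given; trustregions is just one possible choice of solver, and the trace is written with sums to keep automatic differentiation happy:

    n = size(X, 1);
    problem.M = grassmanncomplexfactory(n, k);
    problem.cost = @(V) projector_cost(V, X);
    problem = manoptAD(problem);  % automatic gradient (R2021b+, DL Toolbox)
    V = trustregions(problem);
    K = V*V';                     % the projector itself

    function f = projector_cost(V, X)
        K = V*V';
        A = K .* conj(K);                          % entrywise |K|^2
        f = -real(sum(conj(X) .* (A*X), 'all'));   % = -trace(X'*A*X)
    end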
But we would also like to force the squared norms of the rows of V to be equal to certain given values, as they correspond to the diagonal entries of VV', which is our projector: indeed, K(i, i) = V(i, :)*V(i, :)' = ||V(i, :)||^2.
Something to try here is to add a penalty term to the cost function to favor the correct diagonal entries. One possibility is to use a type of augmented Lagrangian approach on manifolds, as described for example in the literature on Riemannian augmented Lagrangian methods.
A simpler approach would be to add the following to the cost function: sum_i (V(i, :)*V(i, :)' - u(i))^2, where u(i) is the desired value for the diagonal entry K(i, i). That penalty (the whole sum) should be multiplied by a penalty weight that needs to be tuned; see the sketch below.
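Concretely, the penalized problem could look as follows. This is only a sketch: u is the n x 1 (real) vector of desired diagonal entries, and lambda is a hypothetical penalty weight you would have to tune. Note that the diagonal of a rank-k projector sums to k, so the targets should satisfy sum(u) = k.

    lambda = 10;   % hypothetical value for the penalty weight: tune this
    n = size(X, 1);
    problem.M = grassmanncomplexfactory(n, k);
    problem.cost = @(V) penalized_cost(V, X, u, lambda);
    problem = manoptAD(problem);
    V = trustregions(problem);
    disp(norm(real(diag(V*V')) - u));   % how close is the diagonal to u?

    function f = penalized_cost(V, X, u, lambda)
        K = V*V';
        A = K .* conj(K);
        f = -real(sum(conj(X) .* (A*X), 'all'));   % original cost
        d = real(sum(V .* conj(V), 2));            % diag(K): squared row norms
        f = f + lambda * sum((d - u).^2);          % penalty on the diagonal
    end

A common tactic with such penalties is to solve with a small lambda first, then increase lambda and warm-start the solver from the previous solution (trustregions accepts the previous V as a second argument).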
I hope this helps.
Best,
Nicolas