Hello Ryan,
As an aside: strict inequality constraints can be tricky, for the following reason. Imagine minimizing f(x) for x in the open interval (0, 1). Then, either the minimizer of f indeed lies strictly inside (0, 1), in which case the constraint is inactive and does not play much of a role, or f does not attain a minimizer in (0, 1), because its infimum on that interval is only approached at the boundary, at x = 0 or x = 1 (for example, f(x) = x has infimum 0 on (0, 1), attained at no interior point). In the latter scenario, any optimization algorithm should try to get to x = 0 or x = 1, but it wouldn't be allowed to.
If you are happy to try things out with non-strict inequalities, then here would be my first suggestion: let's work with a product manifold where d is your unit-norm vector in R^3 (that's just a sphere), and q in R^2 has two components, which we use as follows:
r = q(1)^2;                                        % guarantees r >= 0
v = (vmin+vmax)/2 + (vmax - vmin)/2 * sin(q(2));   % guarantees vmin <= v <= vmax
Since q(1)^2 is nonnegative and sin(q(2)) ranges over [-1, 1], every q in R^2 produces feasible values of r and v, so there are no constraints left to enforce.
In Manopt, you can create the manifold as follows:
elements.d = spherefactory(3);
elements.q = euclideanfactory(2);
manifold = productmanifold(elements);
problem.M = manifold;
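To sanity-check the construction, you can draw a random point and inspect its structure; the field names match the struct you passed to productmanifold:
x = manifold.rand();   % random point on the product manifold
% x is a struct with fields:
%   x.d : 3x1 unit-norm vector (a point on the sphere)
%   x.q : 2x1 real vector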
To make the code more transparent, I recommend introducing a change of variable function as follows:
function xx = change_of_variable(x)
    xx.d = x.d;                                           % unit-norm vector, passed through unchanged
    q = x.q;
    xx.r = q(1)^2;                                        % r >= 0
    xx.v = (vmin+vmax)/2 + (vmax - vmin)/2 * sin(q(2));   % v in [vmin, vmax]
end
(This assumes vmin and vmax are in scope, for example because change_of_variable is nested inside the function that defines them.)
problem.cost = @mycost;
function f = mycost(x)
xx = change_of_variable(x);
% ... compute f as a function of xx, which is a structure that contains d, r and v as you normally would
end
From here, you could try to call automatic differentiation with:
problem = manoptAD(problem);
This may fail: you need the Deep Learning Toolbox and a rather recent version of Matlab, and it can be a bit finicky.
If so, then you would ideally implement the gradient of your cost function. This must be expressed with respect to d and q (because that is the manifold), so you would compute the gradient of the composition of f with the change of variable (chain rule).
I hope this helps -- don't hesitate to follow up here.
Best,
Nicolas