For a 15-variable constrained NLS fitting problem, especially one calling statistical functions, 1 s sounds pretty fast to me. However, if you want to supply the gradient to OPTI yourself (rather than having it estimated by finite differences), it might help. Below is an example of how to do this using the Symbolic Math Toolbox:
% Core of lognpdf.m, written inline so it can be evaluated and
% differentiated symbolically:
lognpdf = @(x,mu,sigma) exp(-0.5 * ((log(x) - mu)./sigma).^2) ./ (x .* sqrt(2*pi) .* sigma);
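As an optional sanity check (assuming you have the Statistics Toolbox), the inlined version should match the toolbox `lognpdf` to machine precision. Run this *before* defining the anonymous `lognpdf` above, since that variable shadows the toolbox function:

```matlab
% Optional check against the Statistics Toolbox lognpdf (illustrative values)
myLognpdf = @(x,mu,sigma) exp(-0.5*((log(x)-mu)./sigma).^2) ./ (x.*sqrt(2*pi).*sigma);
tTest = linspace(0.1, 10, 50);
max(abs(myLognpdf(tTest,0.5,1.2) - lognpdf(tTest,0.5,1.2)))  % should be near eps
```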
% Your objective: sum of five scaled lognormal PDFs
fun = @(x,t) x(1)*lognpdf(t,log(x(2)),x(3))  + x(4)*lognpdf(t,log(x(5)),x(6))   + ...
             x(7)*lognpdf(t,log(x(8)),x(9))  + x(10)*lognpdf(t,log(x(11)),x(12)) + ...
             x(13)*lognpdf(t,log(x(14)),x(15));
% Create symbolic version
x = sym('x',[15,1]);
t = sym('t');
fSym = fun(x,t)
% Calculate analytic gradient
gradSym = jacobian(fSym,x)
% And convert to anonymous function
gradVec = matlabFunction(gradSym, 'vars', {x,t})
% Reshape into the length(xdata)-by-length(x0) Jacobian OPTI expects
% (x0 is your initial guess, used in the opti call below)
grad = @(x,xdata) reshape(gradVec(x,xdata), length(xdata), length(x0));
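Before handing the gradient to OPTI, it is worth verifying the analytic Jacobian against a central finite-difference estimate at an arbitrary point. This is just a sketch; `xTest` and `tTest` are illustrative values, not your actual data:

```matlab
% Optional: check the analytic Jacobian against central finite differences
xTest = rand(15,1) + 0.5;          % arbitrary test parameters (illustrative)
tTest = linspace(0.5, 5, 20)';     % arbitrary test abscissae (illustrative)
h = 1e-6;
Jfd = zeros(length(tTest), 15);
for k = 1:15
    e = zeros(15,1); e(k) = h;
    Jfd(:,k) = (fun(xTest+e, tTest) - fun(xTest-e, tTest)) / (2*h);
end
Jan = reshape(gradVec(xTest, tTest), length(tTest), 15);
max(abs(Jan(:) - Jfd(:)))          % should be small (roughly 1e-8 or less)
```

If the difference is large, the usual culprit is a shape mismatch in the reshape rather than the symbolic differentiation itself.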
% Solve it
opts = optiset('display','iter','solver','levmar');
Opt = opti('fun', fun, 'grad', grad, 'data', xdata, ydata, 'ineq', A, B, 'bounds', lb, ub, 'x0', x0, 'opts', opts)
[x,f,e,i] = solve(Opt)
% And plot solution
plot(Opt)