If you want to do least-squares, the code is
x := mat64.NewDense(len(xdata), 1, xdata)
y := mat64.NewVector(len(ydata), ydata)
var theta mat64.Vector
err := theta.SolveVec(x, y)
// handle err
We also have gradient descent in optimize if you're going to do batch learning (as in your code):
https://godoc.org/github.com/gonum/optimize#GradientDescent.
That said, if you are doing batch learning, gradient descent is one of the least efficient ways to solve least squares. We do need to provide a way to do stochastic gradient descent for fitting algorithms. It doesn't belong in the base optimize package, but it may fit in a subpackage, or just in the fitting repository.