I'm getting puzzling results from an SVD implementation, and I think there's an issue in numeric.js's SVD. Running the same dataset through a NumPy implementation gives different projections. Some columns of the U matrix from numeric.js have the opposite sign from the Python result, and it isn't just a global sign flip per column, since some rows differ too. The net effect is that the projections into 2D space disagree. More specifically, U should have orthonormal columns, so transpose(U) * U should be the identity matrix. Here's a case where that doesn't happen. Any ideas?
//tested with numeric.js version 1.2.6
var countsMatrix = [[0,0,1,1,0,0,0,0,0],[0,0,0,0,0,1,0,0,1],[0,1,0,0,0,0,0,1,0],[0,0,0,0,0,0,1,0,1],[1,0,0,0,0,1,0,0,0],[1,1,1,1,1,1,1,1,1],[1,0,1,0,0,0,0,0,0],[0,0,0,0,0,0,1,0,1],[0,0,0,0,0,2,0,0,1],[1,0,1,0,0,0,0,1,0],[0,0,0,1,1,0,0,0,0]];
var svd = numeric.svd(countsMatrix);
var U = svd.U;
var UTranspose = numeric.transpose(U);
//U should have orthonormal columns, so transpose(U) * U should be the identity matrix:
//http://web.mit.edu/be.400/www/SVD/Singular_Value_Decomposition.htm
var shouldBeIdentity = numeric.dot(UTranspose, U); //numeric.dot is the matrix product
//However, it isn't the identity matrix:
console.log(JSON.stringify(shouldBeIdentity));
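As a sanity check that doesn't depend on numeric.js at all, here's a small vanilla-JS helper (the function name is my own, not part of any library) that measures how far a matrix's columns are from orthonormal by computing the largest entry-wise deviation of UᵀU from the identity. For a correct SVD this should be on the order of machine epsilon; for the U above it won't be.

```javascript
// Hypothetical helper: returns max |(UᵀU)[i][j] - I[i][j]| over all entries,
// i.e. how far U's columns are from being orthonormal.
function orthonormalityError(U) {
  var rows = U.length, cols = U[0].length;
  var maxErr = 0;
  for (var i = 0; i < cols; i++) {
    for (var j = 0; j < cols; j++) {
      var dot = 0;
      // Dot product of column i with column j.
      for (var k = 0; k < rows; k++) dot += U[k][i] * U[k][j];
      var expected = (i === j) ? 1 : 0;
      maxErr = Math.max(maxErr, Math.abs(dot - expected));
    }
  }
  return maxErr;
}

// Sanity check with a matrix whose columns are orthonormal by construction
// (a 2D rotation by 30 degrees):
var theta = Math.PI / 6;
var Q = [[Math.cos(theta), -Math.sin(theta)],
         [Math.sin(theta),  Math.cos(theta)]];
console.log(orthonormalityError(Q)); // ~0, within floating-point rounding
```

Running `orthonormalityError(svd.U)` on the result above makes the failure quantitative rather than eyeballing a 9×9 matrix dump.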