Looks like a good list; I'm sure I'd get some value from some of these as
well. I suspect there's little reason not to have most or all of them, except
for the minor point that someone actually has to write the code!
Best,
--Tim
On Wednesday, May 29, 2013 03:10:36 PM Laurent S wrote:
> Hi all,
>
> I think Julia has great potential, and can, unlike Matlab, rid itself of
> its tunnel vision on matrices and embrace tensors from the ground up.
> Tensors, and tensor decompositions (such as those which generalize the
> matrix SVD/EVD/...), have been gaining traction in recent years, and
> supporting them properly would certainly increase Julia's adoption rate
> looking forward. Below is a list of some of my tensor-related
> suggestions/comments:
>
> - repmat currently allows replication along the first two dimensions;
> the second argument could instead specify the multiplier by which to
> replicate along each mode in a vector of modes.
> Possibly even consider changing the name to "rep" or "reparr", since the
> method also applies to vectors and tensors, not just matrices. On a
> related note, I'm very glad Julia has broadcasting operators, this makes my
> life so much easier.
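Just to pin down the proposed semantics: numpy's tile already works this way, taking one multiplier per dimension for arrays of any order. A sketch (numpy used purely for illustration):

```python
import numpy as np

# np.tile takes one replication multiplier per mode, for arrays of any order,
# which is roughly the behavior a generalized "rep" could have.
v = np.arange(3)                    # vector
A = np.arange(6).reshape(2, 3)      # matrix
T = np.arange(24).reshape(2, 3, 4)  # third-order tensor

print(np.tile(v, 2).shape)          # (6,)
print(np.tile(A, (2, 3)).shape)     # (4, 9)
print(np.tile(T, (1, 2, 2)).shape)  # (2, 6, 8)
```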
> - Although I can't find a reference in the documentation, it seems Julia
> uses the word "rank" internally to describe the number of dimensions an
> array (or tensor) has. This could be confusing, since rank already has a
> different definition for matrices, and splits into two distinct
> concepts for tensors (i.e., rank and multilinear rank). For tensors, the
> number of dimensions is referred to as the tensor's order in the
> literature <http://www.sandia.gov/~tgkolda/pubs/bibtgkfiles/TensorReview.pdf>.
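The distinction is easy to see in numpy, which keeps the two concepts under separate names (shown only to illustrate the terminology point):

```python
import numpy as np

T = np.zeros((2, 3, 4))
print(T.ndim)  # 3 -- the tensor's *order* (number of dimensions)

A = np.ones((5, 5))
print(np.linalg.matrix_rank(A))  # 1 -- the matrix *rank*, a different concept
```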
> - The implementation of rank should allow the user to set a
> tolerance, preferably a relative tolerance (unlike Matlab's absolute
> tolerance).
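A relative tolerance would presumably count singular values above some fraction of the largest one. A rough sketch of that behavior (the helper name `rank_rtol` is hypothetical; numpy is used only for illustration):

```python
import numpy as np

def rank_rtol(A, rtol=1e-12):
    # Hypothetical helper: numerical rank with a *relative* tolerance,
    # i.e. count singular values above rtol * (largest singular value).
    s = np.linalg.svd(A, compute_uv=False)
    if s.size == 0 or s[0] == 0:
        return 0
    return int(np.sum(s > rtol * s[0]))

A = np.diag([1.0, 1e-6, 1e-16])
print(rank_rtol(A))             # 2: 1e-16 falls below the relative cutoff
print(rank_rtol(A, rtol=1e-3))  # 1: a looser rtol drops 1e-6 as well
```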
> - For tensors, rank could return the tensor's multilinear rank, which is
> a vector of ranks (one rank for each mode). Or, it could be a separate
> function (e.g., mlrank). The matrix case is just a special case of mlrank
> where the column rank equals the row rank.
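If I follow, mlrank would compute the rank of each mode-n unfolding. A sketch of that idea (the name `mlrank` is the hypothetical one from above; numpy for illustration only):

```python
import numpy as np

def mlrank(T):
    # Hypothetical mlrank: the rank of each mode-n unfolding of T,
    # returned as a tuple with one entry per mode.
    return tuple(
        np.linalg.matrix_rank(np.moveaxis(T, n, 0).reshape(T.shape[n], -1))
        for n in range(T.ndim)
    )

# A rank-1 tensor (outer product of three vectors) has multilinear rank (1, 1, 1).
a, b, c = np.arange(1, 3), np.arange(1, 4), np.arange(1, 5)
T = np.einsum('i,j,k->ijk', a, b, c)
print(mlrank(T))  # (1, 1, 1)
```

For a matrix, the two entries are the row rank and the column rank, which coincide, matching the special case described above.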
> - If A and B are matrices of the proper dimensions, A*B and B*A are
> well-defined. Now let B be a tensor (say, third-order); then A*B
> and B*A are still defined in the multilinear algebra sense, but not yet in
> Julia. For the former, the effect is that each column vector B[:,j,k] of B
> is mapped to C[:,j,k] = A*B[:,j,k]. For the latter, each row vector
> B[i,:,k] of B is mapped to C[i,:,k] = B[i,:,k]*A. This is a
> straightforward generalization of matrix-matrix multiplication to
> matrix-tensor multiplication.
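To make sure I understand the intended semantics, here is how the two products read as contractions (numpy used purely to pin down the definitions):

```python
import numpy as np

A = np.random.rand(5, 2)
B = np.random.rand(2, 3, 4)

# A*B: each column B[:, j, k] is mapped to A @ B[:, j, k]
C = np.einsum('li,ijk->ljk', A, B)   # shape (5, 3, 4)
assert np.allclose(C[:, 1, 2], A @ B[:, 1, 2])

# B*A2: each row B[i, :, k] is mapped to B[i, :, k] @ A2
A2 = np.random.rand(3, 5)
D = np.einsum('ijk,jl->ilk', B, A2)  # shape (2, 5, 4)
assert np.allclose(D[1, :, 2], B[1, :, 2] @ A2)
```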
> - Related to the previous point, the * operator would only allow
> multiplying a tensor from the left and from the right. In general, B
> could also be multiplied by a matrix A along its third (or nth) dimension.
> I'm not sure what the best way to include such functionality is. A
> tmmul(Tensor,Matrix,dim) function perhaps? Or a new operator?
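The general mode-n product is easy enough to express with a contraction plus an axis permutation. A sketch of what a `tmmul(Tensor, Matrix, dim)` function could do (the name is the hypothetical one suggested above; numpy for illustration only):

```python
import numpy as np

def tmmul(T, A, dim):
    # Hypothetical tmmul: mode-`dim` product, contracting the columns
    # of A against dimension `dim` of T.
    out = np.tensordot(A, T, axes=(1, dim))  # contracted mode becomes axis 0
    return np.moveaxis(out, 0, dim)          # move it back into place

T = np.random.rand(2, 3, 4)
A = np.random.rand(5, 4)
C = tmmul(T, A, 2)  # multiply along the third dimension
print(C.shape)      # (2, 3, 5)
assert np.allclose(C[1, 2, :], A @ T[1, 2, :])
```

With dim = 0 and dim = T.ndim - 1 this reduces to the A*B and B*A cases from the previous point, so a single function could cover all of them.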
> - Julia already has the Kronecker product. There is a related product
> called the Khatri-Rao product, which would also be nice to have, since it
> is an essential operation in one of the most prominent tensor
> decompositions (CANDECOMP/PARAFAC).
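For reference, the Khatri-Rao product is the column-wise Kronecker product: both matrices must have the same number of columns, and column r of the result is kron of the two r-th columns. A sketch (numpy, illustration only):

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: column r of the result is
    # kron(A[:, r], B[:, r]); A and B must have equally many columns.
    assert A.shape[1] == B.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

A = np.random.rand(4, 3)
B = np.random.rand(5, 3)
C = khatri_rao(A, B)
print(C.shape)  # (20, 3)
assert np.allclose(C[:, 1], np.kron(A[:, 1], B[:, 1]))
```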
> - normfro(T) and norm(T,"normfro") where T is a third- or higher-order
> tensor should return norm(T[:]) (currently returns an error).
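That definition seems right to me: the Frobenius norm of a tensor of any order is just the 2-norm of the vector of its entries, i.e. norm(T[:]) in the notation above. In numpy terms (illustration only):

```python
import numpy as np

T = np.random.rand(2, 3, 4)
# Frobenius norm of a higher-order tensor: the 2-norm of its entries.
fro = np.linalg.norm(T.ravel())
assert np.isclose(fro, np.sqrt(np.sum(T**2)))
```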
> - Thank you for the many nice improvements to basic functions like sum