```julia
julia> x[2, :, :] = randn(3, 4)
ERROR: argument dimensions must match
in setindex! at array.jl:595
```
For some reason, however, this does work:
```julia
julia> x[2, :] = rand(Int64, 3, 4)
3x4 Array{Int64,2}:
9210414016196894986 5590911648019920056 -2549620222663833335 8213417336355676902
-6399277314024803139 224843245225049497 5480293538228905038 -327610178966698264
-4819572141754207167 1796228437444309558 -8885727529087829364 4250741295695756214
```
There is more to be said here, but I think this should illustrate the concern.
As for what x[2,:] means, I am not quite sure. I know that for some reason it takes the slice x[2,:,:], then flattens it into a 1d array, which can somehow be filled by a 2d array (presumably because the element counts match). Kinda weird.
If there is likely to be a significant amount of Matlab code orphaned by this decision, maybe you should raise it as an issue. If it's not changed then you are, at least, likely to get an understanding of why it shouldn't be changed.
https://github.com/JuliaLang/julia
```
>> a = randn(2,3,4);
>> size(a(2,:,:))
ans =
     1     3     4
>> size(a(:,2,:))
ans =
     2     1     4
>> size(a(:,:,2))
ans =
     2     3
```
Maybe John Lynch misread my comment. I have a lot of old python (numpy) code that doesn’t translate directly.
My concern with not dropping singleton dimensions stems from the fact that dropping them is exactly the behavior of numpy:
```
In [1]: import numpy as np
In [2]: x = np.reshape(range(24), (2, 3, 4))
In [3]: x[1, :, :].shape
Out[3]: (3, 4)
In [4]: x[:, 1, :].shape
Out[4]: (2, 4)
In [5]: x[:, :, 1].shape
Out[5]: (2, 3)
```
The fact that taking a horizontal and vertical slice of a matrix gives you the same shape of object strikes me as terribly inconvenient for linear algebra work. Even more inconvenient is that taking the transpose of such a vector gives you the same vector back. If we did drop all singleton dimensions, then our slicing behavior would be incompatible with both Python and Matlab – that's the worst of both worlds. There is unfortunately no way to make everyone happy.
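For the transpose point specifically, here is a small numpy sketch (my own illustration, not from the thread):
```
import numpy as np

x = np.random.randn(3, 3)
v = x[:, 0]                      # a vertical slice comes back as a 1-d array
print(v.shape)                   # (3,)
print(v.T.shape)                 # (3,) -- transposing a 1-d array is a no-op
print(np.array_equal(v, v.T))    # True: you get the same vector back
```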
I personally think the idea of silently dropping singleton dimensions is hot death. It seems to make sense, until you think about the case where your indexes are generated by code with conditionals, and you didn't plan ahead for the case where your code generated a singleton rather than a range; suddenly, all your array dimensions mean something different than you intended. The only way it might be tolerable (maybe) is if we don't drop dimensions in cases of indexing with something like 3:3, which might plausibly be produced by code that generally expects to return a range (and therefore preserves that dimension). Bottom line is that the array dimensionality, like the rest of well-designed Julia code, at a minimum needs to be predictable from the types of inputs.
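To make the "predictable from the types of inputs" criterion concrete in numpy terms (an illustrative sketch of my own, not from the original post): whether a dimension survives depends only on whether the index is an integer or a range, never on how many elements the range happens to select:
```
import numpy as np

x = np.zeros((2, 3, 4))
i = 1                        # integer index: the dimension is dropped
print(x[:, i, :].shape)      # (2, 4)
s = slice(1, 2)              # range index of length 1: the dimension is kept
print(x[:, s, :].shape)      # (2, 1, 4)
```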
I also agree with this. I think it makes sense for x[:, 1, :] to be 2-dimensional, but x[:, 1:1, :] should preserve all 3 dimensions.
FWIW, this is the behavior of numpy:
```
In [1]: import numpy as np; x = np.random.randn(2, 3, 4)
In [2]: x[0:1, :, :].shape
Out[2]: (1, 3, 4)
```
Since I've been coding in Matlab and Python, and lately in Julia and Python, I'd like to add some comments here:
1/ To me, one of the strengths of Numpy's implementation of arrays is the fact that it treats all dimensions symmetrically and that there is no special treatment for two-dimensional objects. Actually there has been some natural selection here, because it is possible to use a matrix object, but the documentation strongly discourages its use and favours the ndarray object instead (cf http://wiki.scipy.org/NumPy_for_Matlab_Users). The matrix object's main advantage is that it overrides the default elementwise multiplication and replaces it with matrix multiplication.
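To illustrate that difference (a sketch of my own, not part of the original post): with plain ndarrays `*` is elementwise, while the discouraged matrix class makes `*` mean matrix multiplication:
```
import numpy as np

A = np.arange(4).reshape(2, 2)
B = np.eye(2)
print(A * B)                             # ndarray: elementwise product
print(np.asmatrix(A) * np.asmatrix(B))   # np.matrix: true matrix product
```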
This approach is associated with the implementation of multiplication as tensor reduction operations. For instance, the following code is valid in numpy:
```
import numpy as np

M = np.zeros((4, 4))
X = np.zeros(4)   # a 1d array
np.dot(M, X)      # a 1d array equal to M.X
np.dot(X, M)      # another 1d array
X.T               # the transpose of a 1d array is itself
```
This is true because `dot`, i.e. matrix multiplication, reduces the last dimension of its first argument with the first dimension of its second argument. This multiplication is very generic and works for any combination of dimensions while keeping the output dimension predictable.
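Here is a small sketch of how the output shapes stay predictable (my own examples, not from the post; note that when the second argument has more than two dimensions, numpy's `dot` actually contracts against its second-to-last axis):
```
import numpy as np

M = np.zeros((4, 4))
X = np.zeros(4)
print(np.dot(M, X).shape)   # (4,)       (4,4) . (4,)  -> 1d array
print(np.dot(X, M).shape)   # (4,)       (4,)  . (4,4) -> 1d array
print(np.dot(X, X).shape)   # ()         (4,)  . (4,)  -> scalar
A = np.zeros((2, 3, 4))
B = np.zeros((4, 5))
print(np.dot(A, B).shape)   # (2, 3, 5)  last axis of A reduced with first axis of B
```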
In Julia or Matlab, the 1d array behaves as a hidden column 2d matrix; transposing it gives a 1x4 row matrix rather than the same array back:
```julia
X = zeros(4)   # a 1d array, treated as a column
X'             # a 1x4 (2d) row matrix, not a 1d array
```
That is exactly what Python does:
```
>>> x = np.random.randn(3,3)
>>> x[2,:].shape
(3,)
>>> x[:,2].shape
(3,)
```
A row slice is indistinguishable from a column slice.

Though numpy does let you keep the singleton dimension when you ask for it explicitly:
```
>>> x = np.random.randn(3,4)
>>> x[:,2].shape
(3,)
>>> x[:,[2]].shape
(3, 1)
>>> x[:,2:3].shape
(3, 1)
>>> x[:,2:2].shape
(3, 0)
```