Proposal to add tables (two-dimensional slices) to Go


Brendan Tracey

Mar 26, 2014, 1:21:55 PM
to golan...@googlegroups.com
I have written up a proposal [1] to extend Go with a definition of two-dimensional slices. Two-dimensional tables are important for numerical computation (such as in scientific computing), and their adoption will expand the list of use cases for which Go is an excellent tool. The proposal contains a detailed description of the problems tables solve, along with the spec and implementation changes needed for their inclusion. This proposal has been vetted on the gonum-dev mailing list (discussion here) and was generally met with acceptance. The only point of major contention is the decision to limit the proposal to two-dimensional slices (leaving higher-dimensional slices to a future proposal) instead of proposing full N-dimensional tables.

Please take a look and give it your consideration. This would be a big improvement for a significant subset of programmers.

[1] https://docs.google.com/document/d/1eHm7KqfKP9_s4vR1zToxq-FBazdUQ9ZYi-YhcEtdfR0/edit

Raul Mera

Mar 26, 2014, 7:07:33 PM
to golan...@googlegroups.com
I would suggest that we limit the comments in the text itself to formal problems (including missing benchmarks, etc.), and discuss the proposal itself here.

Mandolyte

Mar 26, 2014, 8:41:48 PM
to golan...@googlegroups.com
I took a quick glance at an earlier proposal and it hasn't been updated since October. Is the major difference the downscoping to 2-d (at least for now), or are there other major differences?

There are a lot of Fortran coders left who'd use another language if it could perform. Hoping Go gets there!

Dan Kortschak

Mar 26, 2014, 8:47:21 PM
to Mandolyte, golan...@googlegroups.com
On Wed, 2014-03-26 at 17:41 -0700, Mandolyte wrote:
> Is the major difference the downscoping to 2-d (at least for
> now). Or are there other major differences?

There are other differences, mainly described in the section on
non-goals.

Brendan Tracey

Mar 26, 2014, 8:52:52 PM
to Dan Kortschak, Mandolyte, golan...@googlegroups.com
Specifically, downslicing (going from an N-dimensional slice to an N-1 dimensional slice) is not allowed, and there is no definition of range. There are also minor differences: the len definition is different, there is no discussion of language built-ins, etc.
> --
> You received this message because you are subscribed to a topic in the Google Groups "golang-nuts" group.
> To unsubscribe from this topic, visit https://groups.google.com/d/topic/golang-nuts/osTLUEmB5Gk/unsubscribe.
> To unsubscribe from this group and all its topics, send an email to golang-nuts...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.

Kyle Lemons

Mar 26, 2014, 9:04:08 PM
to Brendan Tracey, Dan Kortschak, Mandolyte, golang-nuts
To mention my main complaint with your proposal here in the open:

I think any proposal needs an answer to bounds-check elision, as that's where the biggest performance penalties lie in the matrix work I've done, and I suspect in a lot of others' as well. It doesn't have to be range, but that seems like the most appropriate place for it.



Dan Kortschak

Mar 26, 2014, 9:24:36 PM
to Kyle Lemons, Brendan Tracey, Mandolyte, golang-nuts
On Wed, 2014-03-26 at 18:04 -0700, Kyle Lemons wrote:
> I think any proposal needs an answer to bounds check elision, as
> that's where the biggest performance penalties lie in the matrix stuff
> I've done, and I think a lot of others as well. It doesn't have to be
> range, but that seems like the most appropriate place for it.

There is a reasonably natural way for ranging to be expressed that
follows from the syntax (and underlying implementation) of the
special-case copy between slice and table types:

e.g. copy([]T, [n,:]T)

So you could conceivably say:

t := make([,]T, n, m)

for i, v := range t[r,:] {
    // stuff
}

or

for j, v := range t[:,c] {
    // stuff
}


However, Nico Riesco has filed an issue [1] on the relative importance
of bounds checking that you may like to look at.

Dan

[1] http://code.google.com/p/go/issues/detail?id=7545


Brendan Tracey

Mar 26, 2014, 9:44:06 PM
to Dan Kortschak, Kyle Lemons, Mandolyte, golang-nuts

On Mar 26, 2014, at 6:24 PM, Dan Kortschak <dan.ko...@adelaide.edu.au> wrote:

> On Wed, 2014-03-26 at 18:04 -0700, Kyle Lemons wrote:
>> I think any proposal needs an answer to bounds check elision, as
>> that's where the biggest performance penalties lie in the matrix stuff
>> I've done, and I think a lot of others as well. It doesn't have to be
>> range, but that seems like the most appropriate place for it.
>
> There is a reasonably natural way for ranging to be expressed that
> follows from the syntax (and underlying implementation) of how the
> special case of copy between slice and table types:
>
> e.g. copy([]T, [n,:]T)
>
> So you could conceivably say:
>
> t := make([,]T, n, m)
>
> for i, v := range t[r,:] {
> // stuff
> }
>
> or
>
> for j, v := range t[:,c] {
> // stuff
> }

That’s quite nice. Something to keep in mind for the future.

I think the other answer to your concern, Kyle, is that tables make it easier to prove bounds checking is redundant. For example, in

var sum float64
for i := 0; i < len(t, 0); i++ {
    for j := 0; j < len(t, 1); j++ {
        sum += t[i,j]
    }
}

it's easy to see that i and j don't change and have a maximum value, and the compiler controls the implementation and bounds checking of t. Thus, it's easy for the compiler to tell that only one bounds check needs to happen. Contrast this with the case of the struct representation, where the compiler first needs to look within two data sets, then needs to notice that certain if statements must evaluate to true and remove them, and then needs to prove that i*stride + j has a maximum value, that that maximum value doesn't overflow, and that it only has to be checked once. Tables make it much easier for the compiler to do code optimization.
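For concreteness, here is a minimal sketch in today's Go of the struct representation being discussed (field names are illustrative, not gonum's actual internals), showing the checks the compiler would have to prove away:

```go
package main

import "fmt"

// Dense is a sketch of the struct representation: a flat slice plus
// dimensions, with (i, j) stored at data[i*stride+j].
type Dense struct {
	data   []float64
	rows   int
	cols   int
	stride int
}

// At carries the checks described above: two explicit if statements,
// plus the slice's own bounds check on the index i*stride+j.
func (d *Dense) At(i, j int) float64 {
	if i < 0 || i >= d.rows {
		panic("row out of range")
	}
	if j < 0 || j >= d.cols {
		panic("column out of range")
	}
	return d.data[i*d.stride+j]
}

func sum(d *Dense) float64 {
	var s float64
	for i := 0; i < d.rows; i++ {
		for j := 0; j < d.cols; j++ {
			s += d.At(i, j) // every call re-runs all three checks
		}
	}
	return s
}

func main() {
	d := &Dense{data: []float64{1, 2, 3, 4}, rows: 2, cols: 2, stride: 2}
	fmt.Println(sum(d)) // 10
}
```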

Dan Kortschak

Mar 26, 2014, 9:51:32 PM
to Brendan Tracey, Kyle Lemons, Mandolyte, golang-nuts
On Wed, 2014-03-26 at 18:44 -0700, Brendan Tracey wrote:
> Something to keep in mind for the future.

I sent before I fully fleshed it out. To answer the concerns in the
proposal about ranging: ranging would only be allowable on a single
dimension (ranging in general in Go is linear along some, possibly
discontiguous, dimension), so were the proposal to be accepted and then
extended to dimensionality greater than 2, it would only be legal to
have a single ':' in a table range expression, and the index part of
the range would be an int within the bounds of that dimension of the
table.


Brendan Tracey

Mar 26, 2014, 10:21:41 PM
to Dan Kortschak, Kyle Lemons, Mandolyte, golang-nuts
This also makes the "value" part of range useful while not allocating lots of temporary memory (as some previous range proposals would have).

yy

Mar 27, 2014, 11:39:16 AM
to Brendan Tracey, golang-nuts
On 26 March 2014 18:21, Brendan Tracey <tracey....@gmail.com> wrote:
> Please take a look and give it your consideration. This would be a big
> improvement for a significant subset of programmers

Could you elaborate on this? I'm not sure any more what the problem
to solve is. Could you give some examples of code that would be
significantly improved by this feature?

It seems like you are trying to make the change as small as possible.
This is good, and it can be an advantage in getting the feature added
to the language (even though it is, still, a significant change), but
only if you are not limiting its usefulness too much. Personally, I
would not start using Go as my main scientific language just because
of the introduction of tables as defined in the proposal. But, since
scientific computing is such a broad field and my usage of it quite
specific, I am interested in other opinions.

I agree something along these lines may be a first step, but only if
it is part of a more ambitious plan. It is not clear to me what this
plan is (if it exists at all), or whether there is any interest in it
from the community in general and the main authors in particular.


--
- yiyus || JGL .

Robert Johnstone

Mar 27, 2014, 12:19:41 PM
to golan...@googlegroups.com, Brendan Tracey
I agree that getting the main authors interested (or at least accepting of future patches) is critical.  Other languages in this field either had scientific computing as a goal from the beginning (Matlab, R, Julia) or provide sufficient levels of abstraction (operator overloading, for one) that the tools can be provided by libraries (C++, Python).  Unless there is a commitment from the main authors, I don't see Go becoming a language for scientific computing.

Brendan Tracey

Mar 27, 2014, 12:20:30 PM
to yy, golang-nuts

On Mar 27, 2014, at 8:39 AM, yy <yiyu...@gmail.com> wrote:

> On 26 March 2014 18:21, Brendan Tracey <tracey....@gmail.com> wrote:
>> Please take a look and give it your consideration. This would be a big
>> improvement for a significant subset of programmers
>
> Could you elaborate on this? I'm not sure anymore which is the problem
> to solve.

Is there more you would like to see on top of the Rationale section?

> Could you give some examples of code that will be
> significantly improved with this feature?

A start would be most of the code for the Dense type of github.com/gonum/matrix/mat64/, and it would give us good reason to improve github.com/gonum/blas/goblas. This code, and all the code that builds on it, would gain from the convenience and speed improvements.

> It seems like you are trying to make the change as small as possible.
> This is good, it can be an advantage to get the feature added to the
> language (even although it is, still, a significant change), but only
> if you are not limiting too much its usefulness. Personally, I would
> not start using Go as my main scientific language just because of the
> introduction of tables as defined in the proposal. But, since
> scientific computing is such a broad field and my usage of it quite
> specific, I am interested in other opinions.

Are you a scientific programmer? Just curious, based on your statement of “would not start…”. I think it’s pretty hard to beat the Matlabs, Pythons, and Rs of the world when it comes to writing short throwaway scripts, but when it comes to writing performant software, Go has many advantages.

>
> I agree something along these lines may be a first step, but only if
> this is part of a more ambitious plan. It is not clear to me what this
> plan is (if it exists at all), or if there is any interest in it from
> the community, in general, and the main authors, in particular.

What do you mean by “more ambitious plan”? I would guess range will be added at some point, and we have developed a good notation for it in the comments of this thread.

Things like operator overloading will likely never be added to the language, so you’ll never be able to write

// a, b, c, d, and e are all *mat64.Dense
e = a*b + c\d

Shiny syntax is not the main draw for Go, and these operators are not necessary for making Go attractive for scientific computing in my opinion.
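For readers unfamiliar with the method-based style being contrasted here, the following is a minimal, hypothetical sketch of how a receiver-as-destination Mul works in place of an overloaded operator (this is my own toy type, not gonum's actual Dense):

```go
package main

import "fmt"

// Dense is a toy row-major matrix: element (i, j) is v[i*c+j].
type Dense struct {
	r, c int
	v    []float64
}

func NewDense(r, c int, v []float64) *Dense { return &Dense{r, c, v} }

func (m *Dense) at(i, j int) float64 { return m.v[i*m.c+j] }

// Mul stores the product a*b into the receiver m: the method-call
// spelling of what operator overloading would write as "m = a*b".
func (m *Dense) Mul(a, b *Dense) {
	m.r, m.c = a.r, b.c
	m.v = make([]float64, m.r*m.c)
	for i := 0; i < a.r; i++ {
		for j := 0; j < b.c; j++ {
			var s float64
			for k := 0; k < a.c; k++ {
				s += a.at(i, k) * b.at(k, j)
			}
			m.v[i*m.c+j] = s
		}
	}
}

func main() {
	a := NewDense(2, 2, []float64{1, 2, 3, 4})
	b := NewDense(2, 2, []float64{5, 6, 7, 8})
	var c Dense
	c.Mul(a, b)
	fmt.Println(c.v) // [19 22 43 50]
}
```

The receiver-as-destination style also lets the caller control allocation and reuse, which an operator expression hides.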

Raul Mera

Mar 27, 2014, 3:47:43 PM
to golan...@googlegroups.com, Brendan Tracey
I think accepting this proposal is all the commitment to science-specific features we need from the main authors, as it is the main (only?) addition to the language itself that we need. Having 2D (or N-D, if the proposal were expanded) slices would be enough for the gonum community to take over and provide the tools needed for scientific computing (they are actually already providing such tools, but with the problems detailed in the proposal's "Rationale" section).

Getting a place in science is of course difficult, and not something you can do in a short amount of time, but it is possible.

Currently, at least in my area of work, you have the domain-specific languages, which have shortcomings as soon as you leave strictly numerical work (Matlab, R); Julia, which is also domain-specific and seems more apt for small projects (although I admit I am not familiar with it at all); then Python, which is great but problematic/hard to read for large projects, where you sooner or later need to write functions in C or even (heaven forbid) C++; and of course, Fortran. Wouldn't it be nice to have a language that is general, readable, easy to debug, good for small to large projects, and performant? And that adds easy concurrency on top of that? And don't even get me started on compilation/distribution. Right now we seem to be lacking in the performance department (I mean, for numerical work). Something like the old netchan would also be great, and the devs said that they were considering a way of doing it.
I think if we could implement a goblas that is competitive in speed with the current C/Fortran ones, we would become the best, or at least one of the best, languages for the job. This is what the current proposal is for. Of course, one thing is being the best, and another is actually succeeding, but the latter should follow the former given some time, if the community is active, which it is, and if Go also succeeds in other areas, which appears to be happening.

yiyus writes about the need for an ambitious plan. We have gonum. On top of that, we have at least a few libraries for science that already work (I think, for instance, of biogo, Sonia Keys' work, Plotinum, and my own goChem, but there are more). I don't think we are doing badly in that department.

As a final note, I do not think operator overloading is needed. For instance, I think the gonum/matrix API looks good without it.

Raul

Jason E. Aten

Mar 27, 2014, 3:49:07 PM
to golan...@googlegroups.com

For best numeric performance, you really want highly tuned Fortran code underneath your matrix manipulations; e.g. GotoBLAS, the Intel math libs, ATLAS, etc., which are already multicore, already elide bounds checks, and have already had hundreds of person-years of tuning put into them.

So I would suggest just writing a general-purpose fgo (like cgo, but for Fortran 90/77/whatever) interface mechanism, and reaping the rewards of re-use.

- Jason

Dan Kortschak

Mar 27, 2014, 3:57:16 PM
to Jason E. Aten, golan...@googlegroups.com
On Thu, 2014-03-27 at 12:49 -0700, Jason E. Aten wrote:
> For best numeric performance, you really want highly tuned Fortran
> code underneath you matrix manipulations; e.g. the GotoBLAS, the Intel
> math libs, ATLAS, etc that are already multicore, already elide bounds
> checks, have already had hundreds of person-years of tuning put into
> them.

This proposal does not preclude that.

Kevin Malachowski

Mar 27, 2014, 8:31:43 PM
to golan...@googlegroups.com
I don't understand why you think that "c.Set(i,j, c.At(i,j) + a.At(i,j) * b.At(i,j))" looks much worse than "c[i,j] += a[i,j] * b[i,j]" when comparing a struct representation to your 2-d tables, but you consider "c.Mul(a,b)" compared to "c := a*b" not that bad when discussing the non-goal of supporting mathematical operators. Is it just that multiplying individual elements of a table is a much more common action than multiplying or doing other math on tables? (For the record, I agree that the operators shouldn't be overloaded if this proposal were supported for the reason you gave.)

To me, the limited nature of only supporting 2-d tables makes it seem that an external package optimized for this kind of math is more appropriate than a full-blown language feature. It could be written in C or assembly rather than Go if the Go equivalent is too slow, just like numpy is definitely not written in Python. I personally think that, rather than making a language change, it would be better for the language overall to investigate why the struct representation is so much slower. If some fancy optimizations could be done in gc to, for example, have better analysis to remove bounds checking in more places, it could benefit everyone who uses slices in their everyday programming and just make Go that much faster as a whole.

I like Go because it's so simple and so composable that *most of the time* you can solve your problems by just using the tools the language already offers. Adding more tools may help a specific group of people, but I think it's worth investigating to make sure the problem can't be solved with the existing tools and without delving into unsafe. Other than the slight syntax help, it doesn't seem to me that this proposal would allow you to do anything that you can't already do now with an external and specialized library.

For example, I'm sure when the proposal for changing the cap of a slice was written some people thought to themselves "I wonder how often I'll actually use that." But in my opinion it was a good addition because there was no other way to do that without going into unsafe (or possibly reflect, I don't know if it was possible that way).

You already mentioned that there would have to be a library to allow for the table arithmetic operations (since built-in types do not currently have methods attached to them and I don't think they ever will), so why not just include the type for the matrix in that library and optimize by hand where needed?

Dan Kortschak

Mar 27, 2014, 8:51:24 PM
to Kevin Malachowski, golan...@googlegroups.com
On Thu, 2014-03-27 at 17:31 -0700, Kevin Malachowski wrote:
> I don't understand why you think that "c.Set(i,j, c.At(i,j) + a.At(i,j) *
> b.At(i,j))" looks much worse than "c[i,j] += a[i,j] * b[i,j]" when
> comparing a struct representation to your 2-d tables, but you consider "
> c.Mul(a,b)" compared to "c := a*b" not that bad when discussing the
> non-goal of supporting mathematical operators. Is it just that multiplying
> individual elements of a table is a much more common action than
> multiplying or doing other math on tables? (For the record, I agree that
> the operators shouldn't be overloaded if this proposal were supported for
> the reason you gave.)

Part of the issue here is that c[i,j] += a[i,j] * b[i,j] requires only a
single lookup into c, while the method approach requires two, both with
library-provided bounds checks. That optimisation cannot easily be done
at the library level.
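To make the lookup count concrete, here is a toy, instrumented sketch (the type and its field names are illustrative, not any real library's) that counts the library-level bounds checks triggered by one such update:

```go
package main

import "fmt"

// mat is a toy 1-element-capable matrix whose accessors count how many
// times the library's own bounds check runs.
type mat struct {
	data   []float64
	r, c   int
	checks *int
}

func (m mat) At(i, j int) float64 {
	(*m.checks)++ // library-level bounds check runs here
	if i < 0 || i >= m.r || j < 0 || j >= m.c {
		panic("out of range")
	}
	return m.data[i*m.c+j]
}

func (m mat) Set(i, j int, v float64) {
	(*m.checks)++ // and again here
	if i < 0 || i >= m.r || j < 0 || j >= m.c {
		panic("out of range")
	}
	m.data[i*m.c+j] = v
}

func main() {
	n := 0
	a := mat{[]float64{2}, 1, 1, &n}
	b := mat{[]float64{3}, 1, 1, &n}
	c := mat{[]float64{1}, 1, 1, &n}

	// The method spelling of c[0,0] += a[0,0]*b[0,0]: four checked
	// lookups in total, two of them into c.
	c.Set(0, 0, c.At(0, 0)+a.At(0, 0)*b.At(0, 0))
	fmt.Println(c.data[0], n) // 7 4
}
```

With built-in indexing, the compiler controls the check for c and can do it once for the combined read-modify-write.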

> To me, the limited nature of only supporting 2-d tables makes it seem that
> an external package optimized for this kind of math is more appropriate
> than a full-blown language feature. It could be written in C or assembly
> rather than Go if the Go equivalent is too slow, just like numpy is
> definitely not written in Python. I personally think that rather than
> making a language change it would be better for the language overall to
> investigate why the struct representation is much slower. If some fancy
> optimizations could be done in gc to, for example, have better analysis to
> remove bounds checking from more locations it could benefit everyone who
> uses slices in their every day programming and just make Go that much
> faster as a whole.

That has been done. For me, at least, part of the justification for the
proposal is to improve the readability and maintainability of the matrix
libraries that we have. Also note that tables are not limited to
numerical values, and I can see a number of uses for non-numerical
tables in my code (uses that slices of slices fit only poorly today).

> I like Go because it's so simple and so composable that *most of the time*
> you can solve your problems by just using the tools the language already
> offers. Adding more tools may help a specific group of people, but I think
> it's worth investigating to make sure the problem can't be solved with the
> existing tools and without delving into unsafe. Other than the slight
> syntax help it doesn't seem to me that this proposal would allow you to do
> that you can't already do now with an external and specialized library.

Use of non-Go libraries requires either unsafe or Plan 9 C wrappers, and
the Go libraries are (for example in the case of matrix arithmetic,
where we have a number of C-backed and Go-backed libraries) very much
slower. Resorting to asm seems like the wrong path for code
maintainability.

> For example, I'm sure when the proposal for changing the cap of a slice was
> written some people thought to themselves "I wonder how often I'll actually
> use that." But in my opinion it was a good addition because there was no
> other way to do that without going into unsafe (or possibly reflect, I
> don't know if it was possible that way).
>
> You already mentioned that there would have to be a library to allow for
> the table arithmetic operations (since built-in types do not currently have
> methods attached to them and I don't think they ever will), so why not just
> include the type for the matrix in that library and optimize by hand where
> needed?

This is what we do. It works, but it is not ideal. I am really looking
forward to being able to rewrite significant portions of that to make
use of this kind of data structure - the library will shrink in size
(generally line length) and I am confident the
readability/maintainability will improve. I'm also confident that the
performance of the pure-Go implementation will improve (particularly if
later additions like 'range' are made).


Daniel Mal

Mar 27, 2014, 8:57:00 PM
to golan...@googlegroups.com
I think adding 2D slices is ridiculous; it also introduces complexity to the language.

On Thursday, March 27, 2014 at 1:21:55 AM UTC+8, Brendan Tracey wrote:

Raul Mera

Mar 27, 2014, 9:07:40 PM
to golan...@googlegroups.com
I would expect some kind of reasoning to follow such a statement.

Brendan Tracey

Mar 27, 2014, 10:33:04 PM
to golan...@googlegroups.com
On Thursday, March 27, 2014 5:31:43 PM UTC-7, Kevin Malachowski wrote:
I don't understand why you think that "c.Set(i,j, c.At(i,j) + a.At(i,j) * b.At(i,j))" looks much worse than "c[i,j] += a[i,j] * b[i,j]" when comparing a struct representation to your 2-d tables, but you consider "c.Mul(a,b)" compared to "c := a*b" not that bad when discussing the non-goal of supporting mathematical operators. Is it just that multiplying individual elements of a table is a much more common action than multiplying or doing other math on tables? (For the record, I agree that the operators shouldn't be overloaded if this proposal were supported for the reason you gave.)

It's definitely a matter of degree, but to my mind it's hard to see what is being set where in the first example, while the second is clear. I would probably write it
v := c.At(i,j) + a.At(i,j) * b.At(i,j)
c.Set(i,j,v)

This isn't that big a deal if it is the only operation, but it gets tedious when it has to be done for every matrix operation. Secondly, it's also a matter of impact on the language. c := a*b does, in fact, look better than c.Mul(a,b). However, operator overloading for the specific cases of [,]float64 and [,]complex128 would compound the list of special cases in the language significantly, not to mention that the overloaded notation has problems of definition (do bounds need to match? etc.). The point is that tables do create syntax improvements (which are only one piece of the proposal), and that overloaded arithmetic is not necessary to make tables useful. Operator overloading is purely a syntax thing (with a not-clear definition), and does not hold its weight.


To me, the limited nature of only supporting 2-d tables makes it seem that an external package optimized for this kind of math is more appropriate than a full-blown language feature. It could be written in C or assembly rather than Go if the Go equivalent is too slow, just like numpy is definitely not written in Python. I personally think that rather than making a language change it would be better for the language overall to investigate why the struct representation is much slower. If some fancy optimizations could be done in gc to, for example, have better analysis to remove bounds checking from more locations it could benefit everyone who uses slices in their every day programming and just make Go that much faster as a whole.

In addition to Dan's comments (about maintainability and uses of tables beyond numerics), the problem with calling an external library is that many matrix data manipulations aren't operations for which you can just call an external library, so the speed penalty will come up in many cases. In addition, one of the major drawing points for me is that I don't have to leave the Go ecosystem (and as the compilers get better and better, there will be less and less reason to leave). I have been a part of projects that had to be rewritten because they were slowed down by the Matlab or Python environments. Go provides a great platform for writing software that grows in complexity, and providing performant tables means that I can continue to build on Go tools. Lastly, while you're right that "some fancy optimizations could be done", they are tough with the gonum/matrix representation (the only one that actually has the behavior of a table). It's not just a matter of "removing some bounds checks" but (I believe; I am not a compiler expert):
1) Inlining a function that contains unexported data.
2) Taking that inlined function and proving that it contains if statements whose conditions can be proved, and then removing those if statements.
3) Once the if statements are removed, noting that the integer multiply-add is the same as an increment (within the context of the program), so the access can be optimized.
4) Then noting that these accesses are of provable size, so only the largest one needs bounds checking.
5) Once that has been done, noting that since these are accesses of sequential data, the operations can be vectorized.

It is possible for a compiler to optimize the struct access, but it is no small feat, and it seems unlikely to happen anytime soon. With a table, you can jump straight to step 4. The optimizations of steps 4 and 5 will help a lot of code. Tables also come with better maintainability, legibility, and simplicity.
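One way to see the multiply-add-to-increment rewrite of step 3 in today's Go is to compare the naive index arithmetic with a hand-optimized version that takes one strided step per row and then walks sequentially (a sketch over a flat slice, not gonum code):

```go
package main

import "fmt"

// sumMulAdd indexes with i*stride+j on every access: the form the
// compiler would have to analyze (max value, overflow, redundant checks).
func sumMulAdd(data []float64, rows, cols, stride int) float64 {
	var s float64
	for i := 0; i < rows; i++ {
		for j := 0; j < cols; j++ {
			s += data[i*stride+j]
		}
	}
	return s
}

// sumIncrement does the rewrite by hand: one strided subslice per row,
// then contiguous sequential access within the row.
func sumIncrement(data []float64, rows, cols, stride int) float64 {
	var s float64
	for i := 0; i < rows; i++ {
		row := data[i*stride : i*stride+cols]
		for _, v := range row {
			s += v
		}
	}
	return s
}

func main() {
	data := []float64{1, 2, 3, 4, 5, 6}
	fmt.Println(sumMulAdd(data, 2, 3, 3), sumIncrement(data, 2, 3, 3)) // 21 21
}
```

Both compute the same sum; the point is how much proving the compiler must do to turn the first form into the second on its own.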


I like Go because it's so simple and so composable that *most of the time* you can solve your problems by just using the tools the language already offers. Adding more tools may help a specific group of people, but I think it's worth investigating to make sure the problem can't be solved with the existing tools and without delving into unsafe. Other than the slight syntax help it doesn't seem to me that this proposal would allow you to do that you can't already do now with an external and specialized library.

(See Dan's comment about unsafe and libraries, and my comments above about specialized libraries often not being sufficient).

The "Rationale" section is an investigation of the available options with existing tools. In gonum/matrix, we have decided to provide the option of calling a c-based blas package (which must import unsafe), and we allow the user to get access to matrix internals (which as far as abstraction is concerned is also unsafe)


> You already mentioned that there would have to be a library to allow for
> the table arithmetic operations (since built-in types do not currently have
> methods attached to them and I don't think they ever will), so why not just
> include the type for the matrix in that library and optimize by hand where
> needed?

I echo Dan's comments about maintainability and performance. I spent a while porting some of the C tools to Go (avoiding external dependencies is great where possible, and while it's not as fast as the C implementations, it's not as big a performance hit as it would be in Python). Even despite that effort, I'm excited about the ability to rewrite that code to make it simpler, more legible, and more performant.

Raul Mera

Mar 27, 2014, 10:47:11 PM
to golan...@googlegroups.com
There are several advantages of Go over C, which have already been stated here and elsewhere. I will only mention the lack of runtime dependencies and the ease of installation/distribution.

We want to bring all those advantages to numerical computing; "just do it in C" does not accomplish that. Rather, it brings the problems of C into Go programs that happen to do some numerical things. The fact that we could have applications which use numerical processing coded fully in Go is one of the advantages that Go has over Python/NumPy/SciPy.

It is true that there will always (or at least for a long time) be cases where one will resort to C- or Fortran-backed solutions, as when you really want every bit of performance you can get. Still, with this addition, in many cases you would not need to, because the performance would be good enough.

yy

Mar 28, 2014, 7:22:14 AM
to golang-nuts
(Replying to several messages)

The Rationale section first talks about the matrix support in other
languages. Then it presents a treatment of matrices much more limited
than in those other languages, and it is implicitly assumed that this
is enough to make Go as convenient and fast as them. I think that
needs more proof.

There have been a couple of comments implying that the main goal is to
use Go for complex projects where performance is important. I'm
currently working on some projects like this, and the reasons I do not
use Go do not change with the introduction of tables. For some
context: I need to do lots of tensor algebra. Since the dimensions of
these tensors are usually fixed, I can just use arrays. These programs
run calculations for days on many processors, so every bit of
performance is important here. Currently, having to write loops for
matrix operations makes complex algorithms really unreadable (I've
fixed quite a few bugs just by translating FORTRAN 77 code with loops
to array operations in Fortran 90). Using functions and methods is
also an option, but tables do not allow me to define a type for a
column or a row of a matrix (the options are to use a slice and make a
copy, or to use a 2d table, both far from optimal). Also, although I
admit I've not run benchmarks, I doubt Go can achieve the performance
of Fortran compilers (I think it is naive to be optimistic here). Of
course, I would like to have concurrency, the Go standard library, and
the comfy development environment that Go provides in these projects,
but only if it doesn't imply a significant loss of readability and
performance.
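The fixed-dimension case described above already works with plain Go arrays, whose sizes are part of the type and known at compile time; a minimal sketch:

```go
package main

import "fmt"

// matVec multiplies a fixed-size 3x3 matrix by a 3-vector. With array
// types the dimensions are compile-time constants, so the layout is
// dense and the loop bounds are statically known.
func matVec(m [3][3]float64, v [3]float64) [3]float64 {
	var r [3]float64
	for i := 0; i < 3; i++ {
		for j := 0; j < 3; j++ {
			r[i] += m[i][j] * v[j]
		}
	}
	return r
}

func main() {
	id := [3][3]float64{{1, 0, 0}, {0, 1, 0}, {0, 0, 1}}
	fmt.Println(matVec(id, [3]float64{1, 2, 3})) // [1 2 3]
}
```

Tables target the case where dimensions are only known at run time, which arrays cannot express.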

All this said, scientific computing is a very general term, but the
particular problems each of us tries to solve with it are usually very
specific. The fact that tables do not help with my problems does not
mean they cannot help with many others. Obviously, a matrix type will
help in writing matrix algebra packages, but it doesn't look to me
like it is going to be a big improvement for the users of those
packages. Its usage in image processing is a valid and strong point,
and I'm sure there will be other interesting ones. But I don't think
this change is going to bring a significant boost in the adoption of
Go by the scientific community (though I'd like to be wrong), and the
cost is not low.

Brendan Tracey

Mar 28, 2014, 2:37:45 PM
to golan...@googlegroups.com

First of all, thanks for taking the time to expand your comment.


On Friday, March 28, 2014 4:22:14 AM UTC-7, yiyus wrote:
(Replying to several messages)

The Rationale section first talks about the matrix support in other
languages. Then, it presents a treatment of them much more limited
than in these other languages, and it is implicitly assumed that this
is enough to make Go as convenient and fast as them. I think that  
needs more proof.

Here's some more example code. I'm sorry I don't have a full implementation of the algorithm. I started a Go implementation in the past, but my needs changed and I no longer needed it.

http://play.golang.org/p/S00rP4Kdoq

The code shows one method of a Gaussian process implementation (a type of regression algorithm). The method adds a new data point to the training data and updates the kernel matrix. The kernel matrix is an NxN matrix whose (i, j)th element contains kernelFunc(input_i, input_j). The actual matrix math (and what makes it a Gaussian process) happens in different methods.

The two codes are roughly the same number of lines, but I would argue that the table version is much better. Aside from the type definition, the table version relies only on language features and built-ins -- concepts that will be familiar to any reader of Go. It stands reasonably well on its own with minor comments, and it is easy to implement with only Go knowledge. The implementation with the current matrix package is much trickier. Its use of the package is non-standard, and thus it must rely on the package's subtleties. The reader must know the inner workings of the matrix package in order to understand the code (possibly even with good comments), making it harder to read and even harder to write. In addition, the code author is incentivized to break the Matrix abstraction and work with the underlying data instead, not only because of speed, but also because some of the operations are difficult to fit within the workings of the current package.
 
There have been a couple of comments implying that the main goal is to
use Go for complex projects where performance is important. I'm
currently working on some projects like this, and the reasons I do not
use Go do not change with the introduction of tables. For some
context: I need to do lots of tensor algebra. Since the dimensions of
these tensors are usually fixed, I can just use arrays.

I also do a fair amount of matrix algebra. My sizes are almost never fixed. Both cases are common, and depend on the domain.
 
These programs
are doing calculations for days in many processors, so every bit of
performance is important here. Currently, having to write loops for
matrix operations makes complex algorithms really unreadable (I've
fixed quite a few bugs just translating FORTRAN 77 code with loops to
array operations in Fortran 90). Using functions and methods is also
an option, but tables do not allow me to define a type for a column or
a row of a matrix (the options are to use a slice and make a copy or
use a 2d table, which are far from optimal). Also, although I admit
I've not run benchmarks, I doubt Go can achieve the performance of
Fortran compilers (I think it is naive to be optimistic here).

Any reason in favor of skepticism? Go does not yet achieve the performance of Fortran, but that's not the same as "can't" or "won't". I don't know enough about compilers or Fortran to have an opinion, but I've heard it expressed on this list that certain properties of Go mean that Fortran-style optimizations are possible (ones which are not possible in C).
 
Of course, I would like to have concurrency, the Go standard library, and
the comfy development environment that Go provides in these projects,
but only if it doesn't imply a significant loss of readability and
performance.

Tables help improve readability and performance.

All this said, scientific computing is a very general term, but the
particular problems each of us try to solve with it are usually very
specific. The fact that tables do not help with my problems does not
mean they cannot help with many others. Obviously, a matrix type will
help to write matrix algebra packages, but it doesn't look to me like
it is going to suppose a big improvement for the users of those
packages.

If all you want to do is multiply and do a linear solve, that's probably true (assuming the chosen linear solve implementation is okay for your problem; there are lots of ways to do it, with different properties). If your needs can't be expressed by common functions (or the implementations don't suit your needs), then that's not true. Tables help speed, legibility, and simplicity. Additionally, tables will probably find use for much more than matrix math; a table represents any set of "rectangular" data.
 
Its usage in image processing is a valid and strong point,
and I'm sure there will be other interesting ones. But I don't think
this change is going to suppose a significant boost in the adoption of
Go by the scientific community (though I'd like to be wrong), and the
cost is not low.

I disagree that it won't be a significant boost (as is probably clear). 

One of the major selling points of Go is the standard library. "It's got everything you need, look how easy it is to write an http server! Not only that, but the standard library is very well written, well documented, and easy to follow". The problem is that for numeric computing, there are only the bare building blocks (math and math/rand mostly). It will be much easier to sell Go once there is a stable, functional, and fast NumPy equivalent. Those of us who see the upsides to Go as a great language for software have been working on building such a library. The lack of a table type has made the code trickier to write and more error-prone, which slows down the (volunteer-time) development. Additionally, it is harder to read the internals of the matrix package (which is frequently useful as a user), and it is often necessary to break the abstraction to get performance. Tables improve legibility, usability, speed, and the ease of writing correct code. It's hard for me to see how that isn't a boost, and again, tables will likely find uses beyond [,]float64 and [,]complex128.

Brendan Tracey

Mar 28, 2014, 2:42:26 PM3/28/14
to golan...@googlegroups.com
One quick comment: the code that I wrote isn't tested, and so there may be errors.

yy

Mar 28, 2014, 3:58:41 PM3/28/14
to Brendan Tracey, golang-nuts
On 28 March 2014 19:37, Brendan Tracey <tracey....@gmail.com> wrote:
> On Friday, March 28, 2014 4:22:14 AM UTC-7, yiyus wrote:
>> The Rationale section first talks about the matrix support in other
>> languages. Then, it presents a treatment of them much more limited
>> than in these other languages, and it is implicitly assumed that this
>> is enough to make Go as convenient and fast as them. I think that
>> needs more proof.
>
> Here's some more example code.

Well, of course Go code which works with matrices looks better with
tables. What I'm questioning is whether this single feature makes the
support of matrices in Go comparable to that in Matlab, Python/NumPy
or Fortran.

>> I doubt Go can achieve the performance of
>> Fortran compilers (I think it is naive to be optimistic here).
>
> Any reason in favor of skepticism?

I really respect the authors of the Go compilers (it is really amazing
how fast they are improving), but the fact is they are not even
trying. Compilation speed, simplicity and portability are more
important goals for the gc compiler than performance. On the other
hand, Fortran compilers have been developed (also by very competent
people) for decades, trying to make use of every possible trick (no
matter how ugly) just to save a few instructions.

I think Go can be very competitive in scientific computing, but I
don't think performance is going to be its main selling point against
the current competition.

>> Of course, I would like to have concurrency, the Go standard library, and
>> the comfy development environment that Go provides in these projects,
>> but only if it doesn't imply a significant loss of readability and
>> performance.
>
> Tables help improve readability and performance

Again, I'm not comparing Go+tables with Go. I'm just saying that it
will be less readable and performant than other languages currently
used for scientific programming, which are highly optimised and have
been specifically designed to solve this kind of problem.

>> Its usage in image processing is a valid and strong point,
>> and I'm sure there will be other interesting ones. But I don't think
>> this change is going to suppose a significant boost in the adoption of
>> Go by the scientific community (though I'd like to be wrong), and the
>> cost is not low.
>
> I disagree that it won't be a significant boost (as is probably clear).

I hope you are right, but take into account that we are talking about
a community of which a significant part is still using FORTRAN 77 as
main programming language.

Raul Mera

Mar 28, 2014, 6:34:31 PM3/28/14
to golan...@googlegroups.com, Brendan Tracey


>> I doubt Go can achieve the performance of
>> Fortran compilers (I think it is naive to be optimistic here).
>
> Any reason in favor of skepticism?

I really respect the authors of the Go compilers (it is really amazing
how fast they are improving), but the fact is they are not even
trying. Compilation speed, simplicity and portability are more
important goals for the gc compiler than performance. On the other
hand, Fortran compilers have been developed (also by very competent
people) for decades, trying to make use of every possible trick (no
matter how ugly) just to save a few instructions.

Exactly, so I do not understand how you can claim that Fortran is more readable than Go.
Maybe for the matrix operations, which will all be in one separate library anyway (see below).

Still, I am with Brendan on the performance. We are of course not going to match Fortran the week after this proposal is accepted,
but the compiler in general is getting better by the day. In the long term, I do expect that we match C performance (which seems to be enough for many
high-performance programs; see NAMD and ORCA, programs for computational chemistry). Even Fortran performance is not impossible.

Something else: I still don't quite get what you mean when you say that the current proposal is "much more limited" than what you have
in other languages. Is it the 2D limit? Because that is open to discussion. Is it the lack of things like operator overloading? Because that I don't see as
a limitation so much as a way to keep sanity.

 
I think Go can be very competitive in scientific computing, but I
don't think performance is going to be its main selling point with the
actual competition.

I agree, and this is why we do not need to be as performant as Fortran (right now).
We do need to perform better than we do now, so that Go is used for its multiple advantages when
you need decent performance but are willing to sacrifice some for a sane language.
When your program is large, you may be willing to make this tradeoff to be able to finish it
faster and with fewer bugs.
 

>> Of course, I would like to have concurrency, the Go standard library, and
>> the comfy development environment that Go provides in these projects,
>> but only if it doesn't imply a significant loss of readability and
>> performance.
>
> Tables help improve readability and performance

Again, I'm not comparing Go+tables with Go. I'm just saying that it
will be less readable and performant than in other languages currently
used for scientific programming, which are highly optimised and have
been specifically designed to solve this kind of problems.

 Despite the need for loops, people do use C and C++ (Eigen would be an example; you also have the programs I mention above) for number crunching, and at least in my field the tendency seems to be to migrate from Fortran to C++. I am more of a user of numerical things than an expert, so the gonum guys may correct me here, but as I see it most matrix operations are encapsulated in a library, so you will have the loops isolated in a few places which will be very well tested.
Go is already way more readable than Fortran and C++. Neither of the latter was planned with readability or maintainability in mind (I hope!).

 
>> Its usage in image processing is a valid and strong point,
>> and I'm sure there will be other interesting ones. But I don't think
>> this change is going to suppose a significant boost in the adoption of
>> Go by the scientific community (though I'd like to be wrong), and the
>> cost is not low.
>
> I disagree that it won't be a significant boost (as is probably clear).

I hope you are right, but take into account that we are talking about
a community of which a significant part is still using FORTRAN 77 as
main programming language.


I dont think either that there will be a boost in adoption, but I am willing to bet on a slow, sustained growth,
coupled to the general improvement of the compiler, and the success of Go in other areas. There are of course
other things that need to be done to make this happen, but those we can address as a community, without the need for language changes.
The idea was to request the minimum needed to enable to community to properly cover the numerical needs.

Finally, In the previous post you state that will "help to write matrix algebra packages, but it doesn't look to me like

it is going to suppose a big improvement for the users of those packages".

I am a user of these packages and I strongly disagree.
As a gonum/matrix user, the tables get you extra performance if you use fortran-backed routines,
and all the benefits of a pure Go library, if you can give up a bit of performance (well the latter would happen
only if/when a goblas engine is written, but that doesn't seem something unrealistic to expect).
 
 
-Raul






 

Andy Balholm

Mar 28, 2014, 10:15:52 PM3/28/14
to golan...@googlegroups.com, Brendan Tracey
On Friday, March 28, 2014 3:34:31 PM UTC-7, Raul Mera wrote:
Go is already way more readable than Fortran and C++.  Neither of the latter was planned with readability or maintainability in mind (I hope!).

Weren't readability and maintainability Fortran's main objectives? (Compared to writing everything in octal machine code, or maybe assembler if you were really up and coming)

I think Fortran achieved those goals too, in comparison with the competition at the time. But there has been some progress in language design since. :-)

Raul Mera

Mar 28, 2014, 11:14:18 PM3/28/14
to golan...@googlegroups.com, Brendan Tracey
Context is everything, I guess :-)

Jsor

Apr 7, 2014, 9:33:07 AM4/7/14
to golan...@googlegroups.com, Brendan Tracey

Still, I am with Brendan on the performance. We are of course not going to match fortran the week after this proposal is accepted,
but the compiler in general is getting better by the day. In the long term, I do expect that we match C performance (which seems enough for many
high-performance programs, see  NAMD and ORCA, programs for computational chemistry). Even Fortran performance is not impossible.


Eh... I'm not an expert on compilers or language theory or anything, but I think getting Go to Fortran levels is edging on impossible. I understand that a lot of Fortran's speed comes from the fact that pointer aliasing isn't allowed. I suppose if you had a bit of a cooperative endeavor between the optimizer and the programmer we could get Fortran-level performance in very narrow cases (that is -- if the programmer writes code that the compiler can prove doesn't alias anything the optimizations can be done); but I'm... er... dubious that that sort of analysis is even feasible. Especially given that Go is a language meant to have fast compilation times and has a mild allergy to compiler flags.

Kevin Gillette

Apr 7, 2014, 12:47:34 PM4/7/14
to golan...@googlegroups.com, Brendan Tracey
On Thursday, March 27, 2014 10:19:41 AM UTC-6, Robert Johnstone wrote:
Other languages in this field had scientific computing as a goal from the beginning (Matlab, R, Julia) or providing sufficient levels of abstraction (operating overloading, for one) that the tools could be provided by libraries (C++, Python).

I think it's quite telling that languages not at all designed for scientific computing are being used for scientific computing. I also suspect that for many scientists, operator overloading is not a major factor, even indirectly, in the use of Python for scientific computing. From what I've seen and heard, Python is often favored, in spite of its non-computational focus, for its generality and low total time to completion (developing for 1 week and computing for 2 hours is better than developing for 2 weeks and computing for 2 minutes).

I'm hardly surprised that Go is being considered for scientific computing in spite of the fact that scientific computing wasn't even on the horizon of language design motivations or goals. This isn't a remark on the incidental virtues of Go, but rather on the fickleness of incidental programmers. Python and Go are appropriate for a much wider range of applications than typical computing languages, and both also are likely to have had comparatively more attention paid to language design fundamentals; meticulously designed languages should and do have a great deal of internal consistency, and thus have a shallower learning curve. We don't need to do anything to make Go interesting for scientific computing; the fact that we're discussing it in that role at all proves interest.

Robert Johnstone

Apr 7, 2014, 2:55:03 PM4/7/14
to golan...@googlegroups.com, Brendan Tracey
The opinion that scientists don't care about overloading does not match my experience.  Frankly, I only hear from people who don't do any scientific computing.  Unless you have some other language to add to the list in my previous post, most of the current languages used for scientific computing support it, which is quite telling.  For people with a strong mathematical background, operator overloading improves readability and is a productivity boost.  Like most programmers, scientists want one language/environment to handle all of their programming needs, and so they want a language capable of general purpose tasks, but that is only one part of the requirements they are looking for.

Finally, your conclusion is very misleading.  You say that Go is already interesting for scientific computing because we are discussing it, but it is being discussed because members of that community are requesting language changes they feel are necessary for Go to be successful.

-- The above statements should not be taken as an argument for Go to support operator overloading.  I'm not interested in that debate.  My previous post was just a suggestion that it would be hard to find support for language changes to support scientific computing.  Something that Kevin's message emphasizes.

Jsor

Apr 7, 2014, 4:02:49 PM4/7/14
to golan...@googlegroups.com, Brendan Tracey
That said, even when using Numpy, operator overloading is pretty underused. Functions dominate 90% of your code. Most matrix libraries I've seen use the * symbol for elementwise multiplication, which has uses, but is much less common than what most people think of when they think "matrix multiplication". In Numpy that's still usually "dot", so even with operator overloading your code, at best, looks like x.T.dot(A.T + w.T).dot(x.inverse).

Operator overloading is useful and certainly CAN improve readability, but the way it's used is pretty thin. What arguably could be far more useful is operator DEFINITION. Most languages don't let you define a<=>b for a bi-implication, and I could see it being useful to define operators like that. The language would end up with a really schizophrenic codebase, though.

Raul Mera

Apr 7, 2014, 4:10:58 PM4/7/14
to golan...@googlegroups.com, Brendan Tracey


On Monday, April 7, 2014 8:55:03 PM UTC+2, Robert Johnstone wrote:
The opinion that scientists don't care about overloading does not match my experience.  Frankly, I only hear from people who don't do any scientific computing.

I personally don't care for or even like operator overloading (I am a biochemist), and other scientists in the Go community have said similar things. I do agree that for scientists who do a little coding in a "casual" way, the lack of operator overloading will likely be an issue. In my view displacing Python is not so likely anyway: Python is really ideal for short scripts and (with numpy/scipy/matplotlib) to replace matlab/gnuplot. I think that, even though we do have some overlap with Python, we are aiming more for the C/C++ and even Fortran market: people who develop somewhat larger applications or libraries. Actually, something else I personally think we need to succeed is good/easy integration with Python. Just communication through pipes in some JSON-like format is probably enough for most purposes, as the data Python/Go transfer is unlikely to be the bottleneck (I implemented something like that to allow my library to interact with a popular Python program; I recently read of a more standard way to do it).

At this point there are at the very least 4 languages widely used in Chemistry (Fortran, C, C++, Python, in no particular order). People who code somewhat-larger things probably don't mind using two languages (I don't), and for the others, the Python/Go communication thing can become useful. Of course one would like to use one language, but when you use Python for larger things you end up coding in C anyway.


Finally, your conclusion is very misleading.  You say that Go is already interesting for scientific computing because we are discussing it, but it is being discussed because members of that community are requesting languages changes they feel is necessary for Go to be successful.

Well, the members requesting a language change (singular :-)  ) do it because we want to be able to do more scientific programming in the language we are already using for that. I think those of us who are not part of gonum, are developing or have developed some other science library/program. Of course, there are few of us, but Go is very young and the set of numerical libraries (gonum) is still being developed. I see no reason not to be optimistic about the future.
 

Kevin: The thing is, there are scientists already using Go,  but this change would remove many scenarios where it is necessary to resort to C or Fortran. That would allow doing larger parts and sometimes the whole of our projects in Go, which I (and I think the scientists in this community in general) want for the reasons mentioned during this thread.
 

Volker Dobler

Apr 7, 2014, 5:15:12 PM4/7/14
to golan...@googlegroups.com

Am Montag, 7. April 2014 20:55:03 UTC+2 schrieb Robert Johnstone:
The opinion that scientists don't care about overloading does not match my experience.  Frankly, I only hear from people who don't do any scientific computing.  Unless you have some other language to add to the list in my previous post, most of the current languages used for scientific computing support it, which is quite telling.  For people with a strong mathematical background, operator overloading improves readability and is a productivity boost.
You are wrong here: one might believe at first sight that operator overloading
helps readability. It doesn't, not at any significant level. I speak from experience.
With some operator overloading you get a+b instead of MatrixAdd(a,b), which
seems much more readable. Unfortunately, operator overloading basically
stops at this level. You may overload +, -, *, /, maybe %, maybe &, and a few
others. These operations will look simple in code with operator overloading, but
your math often contains much more than that; you run out of overloadable
operators and end up mixing overloaded operators with plain function calls.
Result: your code does not look like the stuff you publish. Nothing really gained.

 
Like most programmers, scientists want one language/environment to handle all of their programming needs, and so they want a language capable of general purpose tasks, but that is only one part of the requirements they are looking for.
Again, some anecdotal evidence that this statement might be true but
misses the real (as opposed to imagined) needs in scientific computing:
I inherited some numerical code in C++. It was done in C++ because of
"speed". It was a mess. Just some basic cleanup and some precomputing
resulted in an improvement of a factor of 100. Maybe scientists claim that they
need raw power, but most would benefit from a clear and nice language
much more than from some 10% speed in their numerical kernels. (Note:
not every scientist does raw data screening at CERN.)
I kept C++ and used Blitz++ for the matrix stuff. Some code looked slightly
more like the math on the paper, but honestly: it was not worth it. The
computational part was pretty small in the end, with lots of other stuff,
parameterisations and graphics attached, and those other parts were
where I spent most of my time.

V.

Sebastien Binet

Apr 7, 2014, 5:25:28 PM4/7/14
to Volker Dobler, golang-nuts
sadly, c++ code at CERN is really far from harnessing the full power
of our cores.
and it is a pain to maintain.
and a nightmare to develop, compile, install and run.

give me Go any day. (preferably yesterday!)

-s

Dan Kortschak

Apr 7, 2014, 6:05:01 PM4/7/14
to Volker Dobler, golan...@googlegroups.com
Readability outweighs performance by a fairly reasonable margin in my view. Correctness is important, and in science peer acceptance of correctness is part of that; we can get incorrect answers in O(1) with small constants, so it's worth showing interested reviewers that the answer is likely to be correct by making the code more readable.

Brendan Tracey

Apr 7, 2014, 6:14:21 PM4/7/14
to Dan Kortschak, Volker Dobler, golan...@googlegroups.com
I agree, and that’s one of the reasons Go is attractive now even as it stands (among others). The real attraction of go is that this simplicity/legibility/concurrency comes without a speed penalty. From the FAQ:  "One of Go's design goals is to approach the performance of C for comparable programs”. Go is not there yet for floating point computation tasks, but the compilers will improve and Go will become competitive with C. Tables will allow the harnessing of these speed improvements without sacrificing legibility or simplicity for numeric codes.

--
You received this message because you are subscribed to a topic in the Google Groups "golang-nuts" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/golang-nuts/osTLUEmB5Gk/unsubscribe.
To unsubscribe from this group and all its topics, send an email to golang-nuts...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Kamil Kisiel

Apr 8, 2014, 6:45:37 AM4/8/14
to golan...@googlegroups.com, Brendan Tracey
For what it's worth, Python 3.5 is gaining an @ infix operator for matrix multiplication.

simon place

Jun 13, 2014, 12:51:07 AM6/13/14
to golan...@googlegroups.com
I don't see much point in 2D slices being in Go. Slices are the base type because they map to hardware directly (memory is 1D); all 2D constructs have to map to 1D somewhere. In a compiled language there is not really much performance to be lost doing it yourself, and you benefit from flexibility, and potentially a choice of library implementations, like memory-efficient sparse matrices or something invented for a particular usage.

Having 2D slices in the language would achieve standardisation, so that a lot of highly optimised libraries would all use the same constructs and be able to pass them around. But 2D 'tables' aren't complex enough structurally to prevent the use of ad-hoc converters or access routines, and those can be elided away by the compiler.

Dan Kortschak

Jun 13, 2014, 1:05:30 AM6/13/14
to simon place, golan...@googlegroups.com
The proposal addresses these issues, but to reiterate: the proposal's
aims are to gain performance without reduction in legibility, and to
expose greater options for optimisation that libraries obscure; and
there is no reduction in flexibility since all the type does is allow
indexing in greater than one dimension.

simon place

Jun 14, 2014, 11:14:26 PM6/14/14
to golan...@googlegroups.com
I have read the proposal, and saw it says the struct solution is much slower than the single array solution, making it unusable. This didn't make sense to me, since you could just 'unpack' the struct and effectively run the single array solution code.

So I checked the linked code, and the struct solution there is doing an extra 6 x O(size³) range checks (for a multiply).

So for the benchmarked example, (200x300).(300x400), that's over 10 million.

Basically, it's unnecessarily using the 'safe', range-checked, exported At() and Set(), rather than internal unchecked versions inside the package, where the coordinates have already been tested; this is duplication that the other solutions aren't performing. (An unchecked at() is present but unused!!!)

For me this change results in a large improvement:

Struct

2.30 -> 1.1

9.12 -> 1.8



Add this:

func (m *Dense) set(r, c int, v float64) {
        m.mat.Data[r*m.mat.Stride+c] = v
}

and update these:

for l := 0; l < k; l++ {
        tmp := a.at(i, l)
        for j := 0; j < n; j++ {
                c.set(i, j, c.at(i, j)+tmp*b.at(l, j))
        }
}

and

for j := 0; j < k; j++ {
        if a.at(i, j) > 0.4 {
                sum += a.at(i, j)
        }
}

Michael Jones

Jun 14, 2014, 11:29:24 PM6/14/14
to simon place, golang-nuts
Excellent!





--
Michael T. Jones | Chief Technology Advocate  | m...@google.com |  +1 650-335-5765

Dan Kortschak

Jun 14, 2014, 11:56:54 PM6/14/14
to simon place, golan...@googlegroups.com
That is fine for internal use, but does not provide adequate safety for an API. It is possible without bounds checks to misaddress positions within the data slice.

So yes, if we can assure that indexing is correct (i.e. there are appropriate tests of some kind or another) then we can elide the checks. This is not the case for client code; we do use the at() method internally.

Brendan Tracey

Jun 27, 2014, 5:05:26 AM6/27/14
to golan...@googlegroups.com, tracey....@gmail.com, dan.ko...@adelaide.edu.au, ceci...@gmail.com
On Wednesday, March 26, 2014 6:04:08 PM UTC-7, Kyle Lemons wrote:
To mention my main complaint with your proposal here in the open:

I think any proposal needs an answer to bounds check elision, as that's where the biggest performance penalties lie in the matrix stuff I've done, and I think a lot of others as well.  It doesn't have to be range, but that seems like the most appropriate place for it.

I have just finished adding a definition for the behavior of range on tables to the proposal. This will allow for bounds check elision and efficient indexing.