Go and multi core architecture


Lars Pensjö

unread,
Oct 3, 2011, 6:52:28 AM10/3/11
to golan...@googlegroups.com
There is a trend in hardware toward more and more cores. It would seem that programming technology will go through a paradigm shift quite soon. To keep abreast of this development (and the "new" direction of Moore's law), computer languages may have to be adapted. For example, the report The GPU Computing Revolution predicts that the number of cores will grow by an order of magnitude every 5 years.

To some extent, Go supports this trend, e.g. with goroutines and channels. But how good will Go be? Will you eventually be required to use specialized APIs, like CUDA or OpenCL?

Or is there a risk (chance?) that Go will be replaced by something better adapted to massively parallel computing?

One of the goals of Go is to use it for server applications, but servers are one of the areas where multi-core technology will probably dominate very soon (and to some extent already does).

The report referenced above discusses the GPU architecture, which is currently not general purpose. Even if it had 1000 cores, you couldn't use it to run 1000 web servers. However, ordinary CPUs are also moving toward more cores, rather than higher frequencies.

Paulo Pinto

unread,
Oct 3, 2011, 7:05:58 AM10/3/11
to golang-nuts
I am a strong advocate of functional programming, and I think that multicore programming is exactly one of the sweet spots for this programming paradigm.

Go, however, already provides some nice facilities for exploring multiprogramming at the system level. Whether that will be enough, time will tell.

Regarding GPGPU, I think Go might have a hard time there. Currently, the only languages that target GPGPU lack GC support, while the GC-enabled ones require specific APIs or a very constrained use of the language.

To properly use Go in a GPGPU scenario, without having to write OpenCL/CUDA/DirectCompute shaders, would certainly require language extensions.


--
Paulo



André Moraes

unread,
Oct 3, 2011, 8:15:29 AM10/3/11
to golang-nuts
> To properly use Go in a GPGPU scenario, without having to write OpenCL/
> CUDA/DirectCompute
> shaders, would certainly require language extensions.

Maybe translating Go code to C shader code. That would not be the best
solution, but it could be practical for the parts of the code that
really benefit from running on the GPU.

And even in that case, sending data back and forth could require some
hard programming.

--
André Moraes
http://andredevchannel.blogspot.com/

ron minnich

unread,
Oct 3, 2011, 11:36:26 AM10/3/11
to André Moraes, golang-nuts
2011/10/3 André Moraes <and...@gmail.com>:

>> To properly use Go in a GPGPU scenario, without having to write OpenCL/
>> CUDA/DirectCompute
>> shaders, would certainly require language extensions.
>
> Maybe translating Go code to C shader code. That would not be the best
> solution, but it could be practical for the parts of the code that
> really benefit from running on the GPU.
>
> And even in that case, sending data back and forth could require some
> hard programming.

For a pretty elegant idea that might apply to both Go and newer GPUs,
see CellFS:

http://dl.acm.org/citation.cfm?id=1587392

which I note has been ported to the 386!

ron

Ian Lance Taylor

unread,
Oct 3, 2011, 12:43:02 PM10/3/11
to golan...@googlegroups.com
Lars Pensjö <lars....@gmail.com> writes:

> There is a trend in hardware to go to more and more cores. It would seem
> that the programming technology will go through a paradigm shift quite soon.
> To be able to keep abreast with the development (and the "new" direction of
> Moore's law), the computer languages may have to be adapted. For example,

> the report The GPU Computing Revolution <https://ktn.innovateuk.org/web/mathsktn/articles/-/blogs/the-gpu-computing-revolution> predicts that the number of cores will grow by an order of magnitude every 5

> years.
>
> To some extent, Go supports this trend, e.g. with goroutines and channels.
> But how good will Go be? Will you eventually be required to use specialized
> APIs, like CUDA or OpenCL?

No. Perhaps somebody will write some sort of GPU library, but I can't
imagine that such a library would ever be required to write Go programs
which can run on multiple cores.

> Or is there a risk (chance?) that Go will be replaced by something better
> adapted to massive parallel computing?

Well, yes, of course there is a chance that that could happen.

Ian

Paul Lalonde

unread,
Oct 3, 2011, 4:52:59 PM10/3/11
to golang-nuts
I'm going to plug a non-Go tech that could be a source of
inspiration: ISPC - http://ispc.github.com/.
ISPC integrates a CUDA/GPU-like language directly with your C/C++ code
and executes it using your on-core SIMD units (SSE, AVX, etc). The
GPU-like restrictions of the programming model allow very efficient
mapping onto your SIMD units while avoiding the overhead of
transferring data to the GPU and synchronizing with it. ISPC can also
launch these tasks across multiple cores, giving quite impressive
speedups.

I haven't done the analysis to understand what restrictions would have
to be placed on Go code to make it SIMD-parallelizable, but I expect
them to be similar: disallow aliasing (easy without pointer
arithmetic, though I expect a bit of slice grief), remove
communication (channels), and restrict the "go" keyword. But this
would be an interesting project for taking advantage of a compute
resource that's taking up a lot of die area even when you are not
using it.

Paul
