HPC image processing in Go?

quin...@gmail.com

Nov 4, 2021, 5:43:11 AM
to golang-nuts
Hi,

Has anyone got any experience of high-performance image processing in Go?

By this I mean doing complex image processing in real time at 4K resolution on commodity hardware. This is really pushing it even with carefully written C++, but when we tried writing similar code using Go slices we got a significant slowdown (roughly 4x compared with the gcc-compiled C++).

We experimented with unsafe pointers, thinking it was Go's slice bounds checking that was causing the problem, but surprisingly saw no improvement.
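
To make that concrete, here's a toy version of the kind of per-pixel loop I mean, together with the unsafe variant we tried. The kernel and all the names are placeholders (our real code is more complex), but the access pattern is similar:

package main

import "unsafe"

const (
    width  = 3840
    height = 2160
    stride = width * 4 // 8-bit RGBA
)

// brighten is the slice version: add delta to every channel, clamping at 255.
// It assumes len(dst) >= len(src); the compiler still keeps a bounds check
// on dst[i] in the inner loop.
func brighten(dst, src []byte, delta int) {
    for i, s := range src {
        v := int(s) + delta
        if v > 255 {
            v = 255
        }
        dst[i] = byte(v)
    }
}

// brightenUnsafe is roughly what we tried instead: walk both buffers through
// raw pointers so no bounds checks are emitted (unsafe.Add needs Go 1.17+).
// For us it was no faster than the slice version.
func brightenUnsafe(dst, src []byte, delta int) {
    n := len(src)
    if len(dst) < n {
        n = len(dst)
    }
    if n == 0 {
        return
    }
    sp := unsafe.Pointer(&src[0])
    dp := unsafe.Pointer(&dst[0])
    for i := 0; i < n; i++ {
        v := int(*(*byte)(unsafe.Add(sp, i))) + delta
        if v > 255 {
            v = 255
        }
        *(*byte)(unsafe.Add(dp, i)) = byte(v)
    }
}

func main() {
    src := make([]byte, height*stride)
    dst := make([]byte, height*stride)
    brighten(dst, src, 16)
    brightenUnsafe(dst, src, 16)
}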

Has anyone had success saturating the memory bandwidth using Go? Is my result a surprise to people? Is it just that gcc's code generator is very mature and Go's is less so, or should I keep looking for slowdowns in my own code?
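
For what it's worth, the bandwidth figures I'm working from come from a measurement along these lines (a minimal sketch against the toy kernel above, in a _test.go file; b.SetBytes makes "go test -bench" report throughput):

package main

import "testing"

// Rough throughput measurement for the slice kernel sketched above.
// Run with: go test -bench=Brighten
func BenchmarkBrighten(b *testing.B) {
    src := make([]byte, height*stride)
    dst := make([]byte, height*stride)
    b.SetBytes(int64(len(src)))
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        brighten(dst, src, 16)
    }
}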

I haven't looked at the generated assembly yet, but that is my next step.
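
The plan is roughly:

    go build -gcflags=-S .            # compiler's assembly listing
    go tool objdump -s brighten ./app # disassemble the built binary

(brighten and ./app are just the placeholder names from the sketch above.)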

Any opinions?

-Steve

David Finkel

Nov 4, 2021, 10:28:15 AM
to quin...@gmail.com, golang-nuts
On Thu, Nov 4, 2021 at 5:43 AM quin...@gmail.com <quin...@gmail.com> wrote:
Hi,

Has anyone got any experience of high-performance image processing in Go?

By this I mean doing complex image processing in real time at 4K resolution on commodity hardware. This is really pushing it even with carefully written C++, but when we tried writing similar code using Go slices we got a significant slowdown (roughly 4x compared with the gcc-compiled C++).

We experimented with unsafe pointers, thinking it was Go's slice bounds checking that was causing the problem, but surprisingly saw no improvement.

Has anyone had success saturating the memory bandwidth using Go? Is my result a surprise to people? Is it just that gcc's code generator is very mature and Go's is less so, or should I keep looking for slowdowns in my own code?
This doesn't particularly surprise me. The standard Go compiler (gc) is optimised for fast compile times and does very limited vectorization.

I think the general advice in cases where one needs better optimization paths has been to use gccgo or gollvm (if possible).
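
If you want to try that without changing your build much, the go tool can drive gccgo directly, e.g. (assuming gccgo is installed; the optimization flags are only an example):

    go build -compiler=gccgo -gccgoflags='-O3 -march=native' ./...

As I understand it, gollvm installs a compatible driver that is selected the same way, but I haven't used it myself.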
 


jlfo...@berkeley.edu

Nov 4, 2021, 2:36:43 PM
to golang-nuts
I'm wondering if it would be worth the effort to improve Go's vectorization optimizations, as opposed to creating/improving its bindings to the various GPUs.

Jon