garbage collection

mikets...@gmail.com

May 23, 2014, 7:59:45 PM
to golan...@googlegroups.com
I was just wondering: Go being the best language for almost everything except real-time applications, how hard would it have been to add a switch turning garbage collection on or off? That would have eliminated the need for using C/C++, and would have made the language more popular. Just a thought.

Mike.

Jesse McNelis

May 23, 2014, 9:17:32 PM
to mikets...@gmail.com, golang-nuts
You can turn garbage collection off. But then you have no way of
reclaiming unused memory and your process will grow unbounded.

You can't reasonably do manual memory management in Go because too
many allocations are made by the runtime, and you can't track their
lifetimes.
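
A minimal sketch of what that looks like in practice, assuming nothing beyond the standard runtime and runtime/debug packages: with automatic collection disabled, the heap only ever grows, because buffers that become unreachable are never reclaimed.

package main

import (
    "fmt"
    "runtime"
    "runtime/debug"
)

// sink keeps a reference to the latest buffer so the allocation can't be
// optimized away; every previous buffer is pure garbage.
var sink []byte

func main() {
    debug.SetGCPercent(-1) // disable automatic collection; same effect as GOGC=off

    var m runtime.MemStats
    for i := 1; i <= 5; i++ {
        sink = make([]byte, 10<<20) // 10 MB per pass; old buffers are never freed
        runtime.ReadMemStats(&m)
        fmt.Printf("pass %d: heap in use = %d MB\n", i, m.HeapAlloc>>20)
    }
}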

Dmitri Shuralyov

May 24, 2014, 4:48:46 PM
to golan...@googlegroups.com, mikets...@gmail.com, jes...@jessta.id.au
Jonathan Blow, the creator of Braid and The Witness, was streaming some games on Twitch last night and someone asked him, "What do you think of Go (golang)?".

I'll have to paraphrase his answer, but basically he said, "I haven't really tried it. It looks interesting, but it has garbage collection. When you're in game development, you spend many years building up tools and investing in a code base to achieve your goals, and I think it would be very unwise to do that for a language with GC."

Personally, I've come from a background of working on small indie games that I never got enough time to take very far, so they're unfinished and small and nowhere near AAA or even indie AAA. Until 2 years ago, my language of choice was exclusively C++, for pretty much the same reasons as everyone else. However, I really grew sick of the kind of inefficiencies you have to put up with when writing it, and of being unable to refactor it with automated tools (unlike C#/Java, etc.).

When I tried Go and saw how much you could achieve with it despite its simplicity, I wanted to explore its potential and try to push it to its limits.

Currently, I'm working on GUI tools and a (very small) game port in Go using OpenGL, and I have no problem hitting a stable 60 fps or higher despite the GC being there. So it's not yet a problem, at least not at the level I'm at.

My hope is that the GC can be improved and that some solution can be found if/when the problem arises (but like I said, so far it hasn't been an issue for my use cases). I feel that if there's any chance Go is viable for this kind of development, I will be thrilled to be able to keep using it and enjoy all of its perks. Honestly, it would be really hard to go back to something else that doesn't have things like gofmt, godoc.org, and a simple spec that fits on one page.

DV

May 24, 2014, 5:23:57 PM
to golan...@googlegroups.com, mikets...@gmail.com
I can "reasonably" (for various definitions of reasonably) manage memory manually in C/C++, but only when dealing with single-threaded code. From my experience, that's also the majority of C/C++ programmers who claim that they have no issues with manual memory management and GC just "gets in the way" of "max performance". 

I don't even want to imagine the horrors Go programmers would have to endure with hundreds of goroutines trying to figure out when memory can be freed. No thanks.

C/C++ are still 10 years behind "the curve", IMO, when it comes to implementing concurrency and I believe that manual memory management is one of the main reasons for that. 

alexru...@gmail.com

May 24, 2014, 8:02:21 PM
to golan...@googlegroups.com, mikets...@gmail.com, jes...@jessta.id.au
Jonathan Blow is a "latency-sensitive" guy. Read comments here:
http://braid-game.com/news/2008/08/misc-linux-questions/

On Saturday, May 24, 2014 at 23:48:46 UTC+3, Dmitri Shuralyov wrote:

Devon H. O'Dell

May 24, 2014, 8:49:35 PM
to DV, golang-nuts, mikets...@gmail.com
2014-05-24 17:23 GMT-04:00 DV <dimiter....@gmail.com>:
> C/C++ are still 10 years behind "the curve", IMO, when it comes to
> implementing concurrency and I believe that manual memory management is one
> of the main reasons for that.

It is very much possible to design proper concurrent systems in C, a
la concurrencykit.org. I think the main reason is lack of
understanding of (wait|lock)-free data structures and algorithms,
modern processor implementation, cache coherency protocols, and safe
memory reclamation algorithms. Reference counting, proxy collectors,
quiescent state-based reclamation (including epoch reclamation),
hazard pointers, and RCU are all successful ways that C programmers
deal with memory reclamation in systems with hundreds or thousands of
concurrent processes. Any "behind the curve" status really lies with
the implementor.

Certainly various languages attempt to make this easier, but you still
have to prove the correctness of the implementation regardless of the
language. And when dealing with concurrency, the correctness of your
implementation can be very dependent on the underlying hardware
(memory ordering semantics, cache coherency protocols, out-of-order
execution, atomic instruction implementation, etc). Right now, Go
abstracts this away from you -- but a significant chunk of the
concurrency primitives in Go are implemented in C and / or assembler.
So in some respect, one could say that Go is "behind the curve" for
implementing concurrency in the sense that it provides neither
lock-free nor wait-free guarantees on its concurrent data structures.
When one is concerned about latency, this is problematic -- even
without considering a garbage collector. From this perspective, you
absolutely cannot implement a latency-sensitive system in Go. That's
fine, I don't think that's a current goal for the language.

I'm very much a fan of Go (although I haven't contributed to it
materially in some time), so I hope people don't take this as a
criticism of the language or the implementation. I guess this is just
a tangential reply because this statement oversimplifies memory
management in concurrent systems. Someone has to keep track of the
memory, it just depends on whether that's the programmer or the
runtime. If it's the runtime, it's good to have some guarantees about
correctness. It's not fair to state that the language used to implement
the concurrency primitives provided by the touted language is "behind
the times".

--dho

>> I was just wondering: Go being the best language for almost everything
>> except real-time applications, how hard would it have been to add a
>> switch turning garbage collection on or off? That would have eliminated
>> the need for using C/C++, and would have made the language more
>> popular. Just a thought.
>>
>> Mike.

Sugu Sougoumarane

May 24, 2014, 9:10:00 PM
to golan...@googlegroups.com, mikets...@gmail.com
"Real-time system" is too ambiguous a term. I have myself fallen into the trap of blindly repeating that C/C++ is the only option for real-time systems.
CPUs have become very fast over the years. Also, applications have become very complex these days. Not all operations take the same time.

I'd try to come up with more concrete requirements, like average and 99th-percentile latency.

And then it will become easy for you to know whether Go, or any other language, can deliver.

Bryan Mills

May 25, 2014, 10:10:33 PM
to golan...@googlegroups.com, mikets...@gmail.com
On Saturday, May 24, 2014 5:23:57 PM UTC-4, DV wrote:
I can "reasonably" (for various definitions of reasonably) manage memory manually in C/C++, but only when dealing with single-threaded code. From my experience, that's also the majority of C/C++ programmers who claim that they have no issues with manual memory management and GC just "gets in the way" of "max performance". 

I don't even want to imagine the horrors Go programmers would have to endure with hundreds of goroutines trying to figure out when memory can be freed. No thanks.

If you have hundreds of goroutines and don't know when they exit, you're already in for a world of hurt.  Garbage collection will protect you against use-after-free bugs, but not deadlocks, data races, or even goroutine leaks - it's not a panacea.

The way to avoid those issues turns out to be the same way you avoid memory errors in non-GC languages: write synchronous code with clear, block-scoped memory lifetimes, and set clear invariants on when operations must be finished.  "Share memory by communicating": transferring ownership of a memory object should be explicit, whether by channel operations, return values, or WaitGroups, not implicit in pointers accessed by many goroutines.
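
A hedged sketch of that ownership hand-off (the producer/consumer names are purely illustrative): the buffer belongs to exactly one goroutine at a time, and the channel send is the explicit transfer.

package main

import "fmt"

// The producer owns each buffer until the send; after that it must not touch
// the buffer again, so ownership is always unambiguous.
func producer(out chan<- []byte) {
    for i := 0; i < 3; i++ {
        buf := make([]byte, 4)
        buf[0] = byte(i)
        out <- buf // ownership transfers to the receiver here
    }
    close(out)
}

func main() {
    ch := make(chan []byte)
    go producer(ch)
    for buf := range ch { // the consumer owns buf from the receive onward
        fmt.Println(buf[0])
    }
}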

What garbage collection primarily helps with is simplifying the language itself - the type system in particular.  It removes the need for complicated (and difficult) compile-time lifetime analysis or verbose, hard-to-read type annotations.  With GC, you don't have to annotate which operations are "borrowing" references and which are transfers of ownership.  You don't have to annotate when you're returning an internal pointer, or when you're keeping a reference to a cache entry that should expire at the end of a function call, or when you're transferring ownership to or from a global data structure.  In function closures, you don't have to annotate whether to capture variables "by reference" or "by copy".

So there are plenty of reasons why garbage collection is useful, especially in a language like Go that strives for clarity and comprehensibility.  Unfortunately, it doesn't absolve you of the need to think about memory management yourself.

Jsor

May 26, 2014, 5:37:56 PM
to golan...@googlegroups.com, mikets...@gmail.com, jes...@jessta.id.au
I don't think Go is a valid choice if you're writing the new iteration of the Crytek engine, but there is an absolute world of game programming between Pong and Crysis.

I'm dubious that Go would really be that bad for a 2D platformer, or even a "reasonably good looking" (though probably sub-AAA) 3D game. But let's face it, if you're seriously confronted with the question of "what language should I be using?" you're probably not about to make a AAA game. You, of course, have to be careful not to generate ridiculous amounts of garbage, and such, but throwing out Go for all real-time applications is kind of overdoing it. Perfectly serviceable video games have been written in JavaScript (+HTML5), fer chrissake!

If you're working with consoles things may be different, but I'd like to point out that Naughty Dog used a custom variety of LISP to make games as good-looking and AAA-ish as Uncharted and The Last of Us, and that variety was garbage collected. Certainly they had to expend a lot of effort to do so, and they used naked assembly in addition to LISP in many places, but I think the "C++ or bust" crowd are overstating it a bit.

Besides, call me cynical, but I'm... heavily skeptical that most indie game programmers are truly good enough to meaningfully exploit the differences between C++ and Go with regard to memory management. I think, after a few rounds of profiling and optimization, most programmers will end up with code at around the same level of performance. The GC pauses may be a bit more unpredictable than manual memory management, but as long as you use the same tricks in Go that you'd use to reduce the number of allocations in C++, I'm skeptical that the difference is heavily noticeable.

Stefano Casillo

May 26, 2014, 8:51:32 PM
to golan...@googlegroups.com, mikets...@gmail.com, jes...@jessta.id.au


On Tuesday, May 27, 2014 5:37:56 AM UTC+8, Jsor wrote:
but I think the "C++ or bust" crowd are overstating it a bit.

I totally agree. The Blow example is very fitting: you read what he writes and expect to google his games and find some sort of Crysis reloaded with millions of polygons on screen... but you end up with a very refined and smart Mario clone - a Mario thing nevertheless. So he is clearly taking himself too seriously there. All the control and stuff he talks about is about fulfilling his own ego as a programmer, because he is a programmer and he enjoys being a programmer. But nothing he has done until now would be impossible in a GC language.

The problem I see with the adoption of alternative languages in games is that games are long-term projects, generally speaking >= 1-2 years. At the beginning of a project you are faced with choices, and going into uncharted territory also means exposing yourself to a huge number of unpredictable problems.
I remember spending nights trying to choose between C# and C++ for my current game in 2010, and you literally end up having nightmares of your game stuttering after 3 years of work because of the GC. I ended up with C++ because that was the choice that allowed me to sleep :D
C++ is somehow the "easy default": everybody is doing it, and if my game isn't performing I know it's my fault and not the language's fault. I guess that's the main reason why these projects end up in C++.
After all, gamedevs are a difficult bunch: while the world was writing in C they were still using ASM; when the world moved to C++ they were using C and ASM; and as the world moved away from C/C++ entirely, they eventually started to use C++. Sometimes decisions are forced by hardware (compilers and so on), but there is a huge amount of NIH syndrome: often the standard library is not used and some in-house thing is used instead, and pretty much always allocators are rewritten from scratch.
It is very unlikely we'll get them interested in something like Go; we'll have to target the younger generation. I'd love to see something like XNA made in Go. XNA generated a lot of C# game developers; a robust game library in Go would have the same effect.
I wish the Go team would go for that, I'd be willing to get hired :P


Kevin Gillette

May 26, 2014, 11:10:37 PM
to golan...@googlegroups.com, mikets...@gmail.com, jes...@jessta.id.au
I don't think this criticism entirely applies. If you were writing a AAA game in anything, presumably you'd be okay with modifying a runtime system to suit your needs, such as tweaking the garbage collector to disable periodic collection and instead doing collection only when specifically requested, or only when allocation thresholds have been surpassed. Well-written engines simply must have stable memory usage -- that's not at all hard to achieve in Go; along with the aforementioned GC tweaks, there wouldn't be any new garbage to collect (since you're keeping usage stable and reusing existing allocations), and there would be no potential for unwanted pauses during gameplay.

With that out of the way, Go comes reasonably close to C in terms of speed, and you might be writing some of your core algorithms in C or assembly anyway -- in which case Go makes for an excellent, safe, sane control language -- your engine would almost certainly have fewer bugs than an equivalent written entirely in C or C++.
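
As a rough illustration of the stable-memory point above (particle and pool are made-up names, not any real engine's API): with a simple free list, steady-state frames allocate nothing, so a collector tuned as described has nothing new to reclaim.

package main

import "fmt"

// particle and pool are illustrative names only.
type particle struct {
    x, y, dx, dy float64
    alive        bool
}

type pool struct{ free []*particle }

// get reuses a particle from the free list; it only allocates while the
// pool is warming up.
func (p *pool) get() *particle {
    if n := len(p.free); n > 0 {
        pt := p.free[n-1]
        p.free = p.free[:n-1]
        return pt
    }
    return &particle{}
}

// put resets a particle and returns it to the free list for reuse.
func (p *pool) put(pt *particle) {
    *pt = particle{}
    p.free = append(p.free, pt)
}

func main() {
    p := &pool{}
    for frame := 0; frame < 1000; frame++ {
        pt := p.get() // after the first frame this never allocates
        pt.alive = true
        // ... simulate and render the frame ...
        p.put(pt)
    }
    fmt.Println("particles held for reuse:", len(p.free))
}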

Andrew Gerrand

May 27, 2014, 1:04:57 AM
to Jsor, golang-nuts, mikets...@gmail.com, Jessta

On 27 May 2014 07:37, Jsor <jrago...@gmail.com> wrote:
I'm dubious that Go would really be that bad for a 2D platformer, or even a "reasonably good looking" (though probably sub-AAA) 3D game.

When this topic comes up I feel compelled to remind everyone that Minecraft, one of the most wildly successful games of the last few years, is written in Java. I think it would be possible to write a Go clone of Minecraft that would perform much better than the Java version, because Go gives you much better control over memory allocations. 

IMO games seem like one area where GC is not a significant disadvantage, because you have very predictable memory use patterns. Recycling existing allocations seems a lot easier in that context than in other environments, such as servers, where load (and thus total allocation) can fluctuate dramatically over time.

Andrew

Robert Tweed

May 27, 2014, 4:03:33 AM
to golan...@googlegroups.com
One of the things games tend to do is preallocate all the objects that will be needed for a particular period of time (typically per level) so that during that time the GC will (should) do nothing anyway - the only problem being if it decides to unpredictably stall to clean up something from much earlier. Even if you are creating and destroying objects of a similar type, it's pretty common to keep them in a buffer and recycle them, rather than continually allocate and deallocate RAM. You will only really run into "GC" issues if you are calling a lot of functions and transparently creating and destroying temporary objects on the stack, which is a potential pitfall of pretty much any language if you don't understand the argument-passing semantics.

In fact, the option to temporarily disable GC at runtime would be awesome for games, as that would eliminate the potential for stalling at times when you know it isn't needed. From looking at the docs, it seems you can do this by setting GOGC=off, then periodically calling runtime.GC() whenever a stall is acceptable. You would do that, e.g., between levels, and it would maybe stall for a short period. I don't know if anyone around here has played Skyrim lately, but I think adding half a second to the load screen for GC wouldn't be noticed by anyone.
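
A hedged sketch of that pattern (playLevel is just a stand-in for a game loop): automatic collection is disabled up front, and a collection is forced explicitly at the points where a short stall is acceptable.

package main

import (
    "runtime"
    "runtime/debug"
)

// playLevel stands in for a game loop; with automatic collection disabled,
// no GC pause can interrupt it.
func playLevel(n int) {
    _ = n
    // ... allocate level data, simulate, render ...
}

func main() {
    debug.SetGCPercent(-1) // programmatic equivalent of running with GOGC=off

    for level := 1; level <= 3; level++ {
        playLevel(level)
        runtime.GC() // collect on the load screen, where a stall just adds to load time
    }
}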

You don't even need to look as far as Java to see games written successfully in pretty high level languages. I mean it's perfectly possible to write games in Flash or HTML5. I am an occasional game developer, and in the past have developed several Shockwave games. GC was the last of my performance concerns! True, it's not AAA, but the point is that you can now use some pretty high level tools to develop games without caring about the kind of low-level stuff that was essential 20 years ago, yet in many cases the standard won't be far off what was cutting edge 5-10 years ago.

For various reasons there's a ton of stuff you just don't need to care about anymore. Unless you are doing something stupendously complex, memory management is probably one of those things. In fact, even if the GC can introduce some latency, if that latency is small or infrequent enough, nobody will notice anyway! If you are doing something like Minecraft, you probably care more about how to efficiently render a very large number of cubes than whatever the GC might be doing. Different sorts of games will all have different optimisation problems, much like any other software.

The reason I wouldn't write a game in Go today isn't because it's not possible, but because there are much better options, like UDK or Unity, for indie development. If you are doing AAA it's going to be in C++ anyway so there's no point thinking about it. If you are writing something so basic that the lack of libraries in Go isn't a problem, then stop-the-world GC isn't going to be a problem for that project either.

- Robert

Jesper Louis Andersen

May 27, 2014, 4:56:04 AM
to Robert Tweed, golang-nuts

On Tue, May 27, 2014 at 10:03 AM, Robert Tweed <fistful.o...@gmail.com> wrote:
One of the things games tend to do is preallocate all the objects that will be needed for a particular period of time (typically per level) so that during that time the GC will (should) do nothing anyway - the only problem being if it decides to unpredictably stall to clean up something from much earlier. Even if you are creating and destroying objects of a similar type, it's pretty common to keep them in a buffer and recycle them, rather than continually allocate and deallocate RAM. You will only really run into "GC" issues if you are calling a lot of functions and transparently creating and destroying temporary objects on the stack, which is a potential pitfall of pretty much any language if you don't understand the argument-passing semantics.

I know of at least one folklore trick where you utilize a manual variant of region inference. You pre-allocate a large slab of memory in which you allocate objects with a known, limited lifetime. Say for instance that you allocate there for a single rendering frame, or for a single computation of a flight schedule between two destinations. Once that piece of work is done, you reset your slab.

Of course you need to know exactly what pointers are now invalid, or you will need to have a way to determine if a pointer is one of those belonging to an old generation. This is what region inference makes automatic.

GC then isn't an issue in the course of action, as long as you keep yourself within the budget of the slab. And if you exceed the budget, you can just halt the system with an error. Historically, PHP used the same method for its heap when serving a request through Apache.

To pull this off, you need to have enough flexibility in your system to handle moving to a different allocator, however.
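
A rough Go sketch of that slab trick, with the arena type and its budget made up for illustration: allocation is just a bump of an offset, and resetting the offset invalidates everything handed out during the frame.

package main

// arena is a made-up name for the pre-allocated slab described above.
type arena struct {
    buf []byte
    off int
}

func newArena(size int) *arena { return &arena{buf: make([]byte, size)} }

// alloc hands out n bytes from the slab; callers must not keep the slice
// past the next reset.
func (a *arena) alloc(n int) []byte {
    if a.off+n > len(a.buf) {
        panic("frame budget exceeded") // halt with an error, as described above
    }
    b := a.buf[a.off : a.off+n]
    a.off += n
    return b
}

// reset makes the whole slab available again; everything handed out since
// the previous reset is now invalid.
func (a *arena) reset() { a.off = 0 }

func main() {
    a := newArena(1 << 20) // a 1 MB per-frame budget, chosen arbitrarily
    for frame := 0; frame < 60; frame++ {
        scratch := a.alloc(4096) // valid only within this frame
        scratch[0] = byte(frame)
        a.reset()
    }
}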


--
J.