On Thu, Sep 4, 2014 at 3:25 PM, Bodo Kaiser <i...@bodokaiser.io> wrote:
> On Thu, Sep 4, 2014 at 1:57 PM, <i...@bodokaiser.io> wrote:
>
> Hello,
>
> I am wondering what the use cases of a sync.Pool could be in comparison to a
> buffered channel?
>
> The docs say that sync.Pool is used to "cache" output buffers for stdio. I
> guess that a sync.Pool may be beneficial here as it does not have a size limit.
>
> I still see two downsides to using a sync.Pool over a buffered channel:
> 1. Limiting the size via buffered channels may prevent memory problems
>
>
> Pool can have an arbitrarily large size but at the same time does not cause
> any memory problems. It is integrated with the garbage collector.
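>
> For illustration, a tiny sketch of that interaction (exactly when cached
> objects are released is up to the runtime, so treat this as behavior, not
> a guarantee):
>
> package main
>
> import (
>     "bytes"
>     "fmt"
>     "runtime"
>     "sync"
> )
>
> func main() {
>     p := &sync.Pool{New: func() interface{} { return new(bytes.Buffer) }}
>     p.Put(bytes.NewBufferString("cached"))
>     runtime.GC() // cached objects may be released here
>     b := p.Get().(*bytes.Buffer)
>     fmt.Println(b.Len()) // likely 0: a fresh buffer from New
> }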
>
> 2. Using select or for over a channel uses fewer resources than always
> checking pool.Get() for non-nil values.
>
>
> I do not understand this.
>
>
> Compare:
>
> for value := range channel {
>     // process value
> }
>
> With:
>
> for value := pool.Get(); value != nil; value = pool.Get() {
>     // Do something
> }
>
> The first will block until something is sent; the latter always rechecks
> whether a resource is available.
>
>
> Don't do this.
>
> Do:
>
> p := &sync.Pool{New: func() interface{} { return new(Foo) }}
>
> return p.Get().(*Foo)
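>
> For illustration, a fuller runnable sketch of that pattern (Foo and the
> reset-on-put are just placeholders):
>
> package main
>
> import (
>     "fmt"
>     "sync"
> )
>
> type Foo struct{ n int }
>
> var fooPool = &sync.Pool{New: func() interface{} { return new(Foo) }}
>
> func getFoo() *Foo  { return fooPool.Get().(*Foo) }
> func putFoo(f *Foo) { *f = Foo{}; fooPool.Put(f) } // reset before reuse
>
> func main() {
>     f := getFoo()
>     f.n = 42
>     fmt.Println(f.n)
>     putFoo(f) // hand the object back so later calls can reuse the allocation
> }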
>
>
> But what if New() requires some heavy computation which needs to be
> executed in a goroutine?
>
> My current draft is:
> https://gist.github.com/bodokaiser/106a2a3019a78056b892
> Not sure if it works already, but you get the idea.
>
> I now don't see any way I could replace the in and out channels with
> pools without doing:
>
> for value := pool.Get(); value != nil; value = pool.Get() {
>
> }
>
> to check for new input.

+golang-nuts again

Then don't use sync.Pool; use channels. sync.Pool is mostly for caching
memory, e.g. when you need a 4K scratch buffer for every operation.
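
For illustration, a minimal sketch of that scratch-buffer use (the 4K size
and the copy operation are just assumptions):

package example

import (
    "io"
    "sync"
)

var bufPool = sync.Pool{
    New: func() interface{} { return make([]byte, 4096) },
}

// copyChunk copies src to dst, reusing a pooled 4K scratch buffer.
func copyChunk(dst io.Writer, src io.Reader) error {
    buf := bufPool.Get().([]byte)
    defer bufPool.Put(buf) // hand the buffer back for the next operation
    _, err := io.CopyBuffer(dst, src, buf)
    return err
}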
>
>
>
>
> On the other hand, I am sure the Go devs would not have added sync.Pool
> if there were no reason. So what am I missing?
>
>
>
> Pool is faster and more scalable.
> Pool does not retain memory unnecessarily; it's flushed during garbage
> collection.
> Pool will autotune to the required size. Consider a chan-based pool with
> capacity 10: if you have 20 goroutines, resources will be constantly
> dropped and recreated. That won't happen with Pool.
>
> The downside of Pool is that it can cache up to GOMAXPROCS more
> objects than strictly necessary. If the objects are huge, it can be a
> problem.
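>
> For comparison, a sketch of such a chan-based pool (the capacity and Foo
> type are illustrative), showing where resources get dropped:
>
> type Foo struct{}
>
> var free = make(chan *Foo, 10) // chan-based pool with capacity 10
>
> func acquire() *Foo {
>     select {
>     case f := <-free: // reuse a cached object if one is available
>         return f
>     default:
>         return new(Foo) // otherwise allocate a fresh one
>     }
> }
>
> func release(f *Foo) {
>     select {
>     case free <- f: // keep it if there is room
>     default:
>         // pool full: with 20 goroutines this branch is hit constantly,
>         // so objects are dropped and reallocated; sync.Pool avoids that
>     }
> }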
>
>