The CPU vs GPU debate

812 views

Envengerx

Jul 11, 2018, 11:36:43 AM
to LCZero
I am not able to understand the GPU vs CPU debate that has been going on here for some time now.

1. Power consumption and the cost of the hardware can easily be calculated and compared. It's not like a 1080 Ti is twice as expensive as a 16-core processor or something.

2. If you look at a graph of GPU performance vs CPU performance over the last few years, CPU performance has grown much more slowly than GPU performance, and the gap will likely widen over the next few generations.

3. When the first multi-core engines and multi-core processors came around, did you complain that all chess engines should run on a single core? That multi-core processors were unfair, and engines should be compared only by their single-core performance?

4. Theoretically, if engines running on CPUs cannot keep up with engines running on GPUs of the same cost and nearly the same power consumption, don't you think CPU-based engines will soon be obsolete? So everyone complaining that the engines cannot be compared is theoretically fighting for a lost cause?

Thomas Kaas

Jul 11, 2018, 12:23:41 PM
to LCZero
I have played many games in Arena between LC0 and LCZero CPU. The GPU version has a much higher winrate.

Salah Abbas

Jul 13, 2018, 10:34:36 PM
to LCZero
The real reason a GPU is better for Leela is the fact that it is a neural network (NN). Conventional 'brute-force' chess engines run a series of algorithms that a CPU can compute quickly, while Leela is very different. GPUs can do intensive operations at a much higher rate than CPUs; CPUs are simply not designed to handle that much.

Jesse Jordache

Jul 14, 2018, 9:39:49 AM
to LCZero
Not to nitpick over a vague description, but it's the other way around.  CPUs can do intensive operations that GPUs aren't so good at.  If you want an incredibly complicated algorithm, think of, say, a chess engine where a position is broken down into hundreds of parts.  Each of those parts is quantified based on the values of other parts, and then these are all fed into a conditional formula, yielding an evaluation.  Now do this a hundred thousand times a second.  You want a CPU for that.
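
The "hundreds of parts fed into a conditional formula" idea can be sketched roughly like this. This is a made-up toy evaluation, not any real engine's: the point is the branchy, dependent logic, which a CPU's branch prediction and deep pipelines handle well and a GPU does not.

```python
# Hypothetical sketch of the kind of hand-crafted evaluation a classical
# engine runs millions of times during search. Every feature and weight
# here is invented for illustration.

def evaluate(material, mobility, king_exposed, passed_pawns, endgame):
    score = material                        # centipawns from piece counts
    score += 4 * mobility                   # each legal move is worth a bit
    if king_exposed:                        # conditional terms depend on
        score -= 50 if not endgame else 20  # earlier computed features
    score += (30 if endgame else 15) * passed_pawns
    return score

print(evaluate(material=100, mobility=10, king_exposed=True,
               passed_pawns=1, endgame=False))  # -> 105
```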

GPUs, on the other hand, are good at massively parallel but relatively simple operations.  If you think of 3D graphics, it's simple to rotate an object in three-dimensional space represented by coordinates. Well, relatively simple - mathematically smart people can do it in their heads.  Now add in about a hundred different objects in the scene, light refraction - again, not difficult mathematically, but if you're simulating a fully lit scene you're talking about a LOT of light rays - light absorption, shadows, etc., and now a GPU shows its worth: lots and lots of simple, independent operations, preferably fast enough that the scene can run in real time.  A CPU, which can do far more complicated things if you're talking about a single algorithm, would melt.
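
To make the rotation example concrete, here is a toy 2D version (the 3D case just uses a 3x3 matrix): each point needs only a few multiplies and adds, and no point depends on any other, which is exactly the shape of work a GPU spreads across thousands of cores.

```python
import math

def rotate(points, angle):
    """Rotate a list of 2D points about the origin by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    # Every iteration is independent: on a GPU, each point would be a thread.
    return [(c * x - s * y, s * x + c * y) for x, y in points]

quarter_turn = rotate([(1.0, 0.0), (0.0, 1.0)], math.pi / 2)
print([(round(x, 6), round(y, 6)) for x, y in quarter_turn])
# -> [(0.0, 1.0), (-1.0, 0.0)]
```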

NNs are based on linear algebra, which is the math behind a lot of what happens in 3D graphics.  I just wrote a long, perhaps not completely apt, explanation of why neural nets do operations similar to computer graphics, and my computer ate it.  But as a partial explanation, I think this is okay.
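
For instance, a single neural-net layer is just a matrix-vector product plus a bias and a nonlinearity - the same linear algebra as a graphics transform. A tiny sketch with made-up weights (nothing to do with Leela's actual network):

```python
def layer(weights, bias, x):
    """One fully connected layer: y = relu(W @ x + b), in plain Python."""
    # One output per row of the weight matrix; every dot product is
    # independent, which is why GPUs chew through these so quickly.
    pre = [sum(w * xi for w, xi in zip(row, x)) + b
           for row, b in zip(weights, bias)]
    return [max(0.0, p) for p in pre]  # ReLU activation

W = [[0.5, -1.0],
     [1.0,  0.25]]
b = [0.0, -0.5]
print(layer(W, b, [2.0, 1.0]))  # -> [0.0, 1.75]
```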

Envengerx

Jul 14, 2018, 10:16:37 AM
to LCZero
Yes, they are good at doing, say, 32-bit or 16-bit integer operations in bulk, but they can't pass data back and forth as fast as a CPU, or switch between different algorithms.
It's like you hand them a truckload of calculations to do, and a while later they hand you back a truckload of results.

A CPU, on the other hand, gives you each result much more quickly, and you can easily ask it to do something in multiple dependent steps.

There is a good video on it
https://youtu.be/x-N6pjBbyY0
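
The truck analogy can be put in numbers with a hypothetical cost model (the figures below are invented; only the shape matters): each call to the GPU pays a fixed overhead for the kernel launch and data transfer, so many tiny requests are dominated by overhead, while one big batch amortizes it.

```python
# Invented illustrative costs, in microseconds.
OVERHEAD_US = 100  # fixed cost per call: launch + host<->device transfer
PER_ITEM_US = 1    # cost per evaluation once the data is on the device

def total_cost(n_items, batch_size):
    """Total time to evaluate n_items when sent in batches of batch_size."""
    n_calls = -(-n_items // batch_size)  # ceiling division
    return n_calls * OVERHEAD_US + n_items * PER_ITEM_US

print(total_cost(10_000, batch_size=1))      # 10,000 trips -> 1010000 us
print(total_cost(10_000, batch_size=1_000))  # 10 trips     -> 11000 us
```

Same work, roughly 90x cheaper when batched - which is why NN engines send the GPU big batches of positions rather than one at a time.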

Envengerx

Jul 14, 2018, 10:18:37 AM
to LCZero
My OP was aimed at members of the group complaining that Leela SHOULD run on the CPU to be comparable, or else it cannot be compared at all.

Kostas Oreopoulos

Jul 14, 2018, 11:44:48 AM
to LCZero
They complain because they do not understand.

Do they complain that FPS games require a GPU to play? No. Why not?

A GPU is great at linear algebra and geometry problems, and not much else. It's a different type of problem, and a GPU is great at it.

A GPU is awful at complicated algorithms. A "normal" chess algorithm will run extremely badly on a GPU.

It's apples and oranges.

The algorithm dictates the hardware.

I understand all this comes from "I do not want to buy an expensive GPU", but NNs work there.

You cannot compare different things. Only their effective results.

You cannot compare a train to a car to a plane to a ship. They all help transportation. They do it in different ways. You pick whichever is best for you. You do not demand that a car fly (yet) or that a plane use the roads.

Jesse Jordache

Jul 14, 2018, 2:47:24 PM
to LCZero
Exactly.  You just have to do your best to explain why they're on different hardware, and cave canem.

Envengerx

Jul 14, 2018, 3:23:22 PM
to LCZero
I would like to add that once Leela gets stronger, it should be easy to get similar results from hardware of equal price and power consumption.

And a huge advantage is that upgrading for Leela is easier than upgrading the CPU.
If you are using an old i3 and want stronger hardware, you might have to change your motherboard to get a decent processor, while a graphics card can simply be added to whatever you are using.

David Grosvenor

Jul 14, 2018, 3:34:32 PM
to LCZero
… a decent GPU most probably won't fit on that board either. Not to mention needing an appropriate power supply unit.