
Use motherboard video-out or GPU’s


Pankaj Jangid

Mar 9, 2021, 3:00:05 AM
I have a confusion - actually two. Please help me understand
them.

Suppose I have a good motherboard with two GPUs installed in the PCIe
slots. Now, should I connect the monitor to the motherboard video-out, or
should I use a GPU's output? Suppose the OS (Debian GNU/Linux in this
case) is fully configured to utilize the GPUs, i.e. drivers etc. are set
up. Will the graphics system be able to utilize the GPUs irrespective of
where I have connected the monitor?

--
Regards,
Pankaj Jangid

deloptes

Mar 9, 2021, 3:50:04 AM
Pankaj Jangid wrote:

> Suppose I have a good motherboard with two GPUs installed in the PCIe
> slots. Now, should I connect the monitor to the motherboard video-out or
> should I use GPU’s output? Suppose the OS (Debian GNU/Linux in this
> case) is fully configured to utilize the GPUs i.e. drivers etc. are set
> up. Will the graphic system be able to utilize the GPUs irrespective of
> where I have connected the monitor?

I am not an expert, but first of all: which display server are you using, X
or Wayland? I am not an expert in Wayland, but X is able to utilize any
number of GPUs.
If it does not do this automatically, you have to configure the server
manually. There are numerous guides on the internet.
It also depends on how your MoBo is designed to work. Some do not allow using
both the internal and an external GPU.
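
When the automatic configuration does not pick up a card, a manual xorg.conf
usually just needs one Device section per card. A minimal sketch - the driver
name and BusID below are hypothetical; take the BusID from lspci, with the
hex bus/slot numbers converted to decimal:

```
Section "Device"
    Identifier "Card0"
    Driver     "amdgpu"       # hypothetical; use the driver for your card
    BusID      "PCI:1:0:0"    # from lspci (device 01:00.0 -> PCI:1:0:0)
EndSection
```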

Last but not least - why many GPUs but one monitor? It makes no sense. It
should use the one where the monitor is attached, but usually monitors have
more inputs, which you can utilize.

IL Ka

Mar 9, 2021, 6:10:05 AM
> Last but not least - why many GPUs but one monitor? It makes no sense.

IL Ka

Mar 9, 2021, 6:20:04 AM

> Suppose I have a good motherboard with two GPUs installed in the PCIe
> slots. Now, should I connect the monitor to the motherboard video-out or
> should I use GPU’s output?
>
> Suppose the OS (Debian GNU/Linux in this
> case) is fully configured to utilize the GPUs i.e. drivers etc. are set
> up. Will the graphic system be able to utilize the GPUs irrespective of
> where I have connected the monitor?

Yes. The video card should detect the monitor and report its presence to the
video system (X11). In some cases you would need to configure it manually.

In modern X11 you use your desktop environment's UI to configure monitors, or you can use the ``xrandr`` tool directly.
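
A minimal sketch of scripting against ``xrandr``. The output names here
(HDMI-1, DP-1, eDP-1) are hypothetical - run ``xrandr --query`` to see yours -
and a sample of its output is inlined so the parsing is reproducible without
an X session:

```shell
# Sample `xrandr --query` output, inlined so this runs without an X session.
sample='Screen 0: minimum 320 x 200, current 4480 x 1440, maximum 16384 x 16384
HDMI-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 509mm x 286mm
DP-1 disconnected (normal left inverted right x axis y axis)
eDP-1 connected primary 2560x1440+1920+0 (normal left inverted right x axis y axis) 310mm x 174mm'

# List only the connected outputs (on a real system: xrandr --query | awk ...):
printf '%s\n' "$sample" | awk '$2 == "connected" { print $1 }'

# To place one monitor next to the other you would then run something like:
#   xrandr --output eDP-1 --primary --auto --output HDMI-1 --auto --left-of eDP-1
```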

Linux-Fan

Mar 9, 2021, 7:50:04 AM
Connect the monitors to the GPU/cards you want to use. Otherwise, it is most
likely that graphics output will be computed by a processor-integrated
graphics unit (if present, otherwise black screen).

Here are the two special cases that immediately come to mind:

* It is possible to utilize GPUs that are not connected to screens for
computation purposes (e.g. OpenCL/NVidia CUDA). Unless you are doing this
all the time there is no reason against using them for video output, too :)

* In the case of mobile devices, there are some which can render on a different GPU
than the one the monitor is connected to, but for usual desktops this is not the case.

HTH
Linux-Fan

öö

[...]

Pankaj Jangid

Mar 9, 2021, 7:50:04 AM
deloptes <delo...@gmail.com> writes:

>> Now, should I connect the monitor to the motherboard video-out or
>> should I use GPU’s output? Suppose the OS (Debian GNU/Linux in this
>> case) is fully configured to utilize the GPUs i.e. drivers etc. are set
>> up. Will the graphic system be able to utilize the GPUs irrespective of
>> where I have connected the monitor?
>
> I am not an expert in Wayland, but X is able to utilize any GPU in any
> number. If it does not this automatically you have to manually
> configure the server. There are numerous guides on the internet.

Okay. So, this is possible. But my doubt remains. See below.

> Last but not least - why many GPUs but one monitor? It makes no sense. It
> should use the one where the monitor is attached, but usually monitors have
> more inputs, which you can utilize.

I’ll be using multiple monitors, but my original doubt was this: if I am
not connecting a monitor to a GPU's output, will I not be utilizing its
power for (this) display? Assuming I have configured X to utilize all
the GPUs for number crunching, should it make any difference where I
connect the monitor? What will be the highest available resolution? Will
it depend on where I have connected it?

--
Regards,
Pankaj Jangid

Dan Ritter

Mar 9, 2021, 9:50:04 AM
Pankaj Jangid wrote:
> I’ll be using multiple monitors but my original doubt was this. If I am
> not connecting a monitor to GPU output, will I not be utilizing its
> power for (this) display? Assuming, I have configured X to utilize all
> the GPUs for number crunching. Should it make any difference where I am
> connect the monitor? What will be the highest available resolution? Will
> it depend on where I have connected?

Each monitor will only use the GPU that it is connected to.

If you are using a GPU-as-coprocessor system like CUDA or
MIOpen, that GPU is generally not available to be used by a
monitor at that time. It may be used before or after that.

If your motherboard has monitor outputs, it may be controlled
by:

- a motherboard-integrated GPU (not a card)
- a motherboard-integrated video controller (also not a card)
- a CPU-integrated GPU (Intel desktop CPUs, AMD APUs)

It will not be controlled by a plug-in GPU.
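
To see which of these controllers the kernel is actually driving, you can
check which driver is bound to each VGA/3D device. A sketch - the sample
``lspci -k`` lines are hypothetical devices, inlined so the parsing is
reproducible; on a real system pipe ``lspci -k`` in directly:

```shell
# Sample `lspci -k` lines (hypothetical devices) so the parsing is reproducible.
sample='00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 630
  Kernel driver in use: i915
01:00.0 VGA compatible controller: Advanced Micro Devices [AMD/ATI] Radeon RX 580
  Kernel driver in use: amdgpu'

# On a real system: lspci -k | grep -EA2 'VGA|3D controller'
# Extract the driver bound to each graphics controller:
printf '%s\n' "$sample" | awk -F': ' '/Kernel driver in use/ { print $2 }'
```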

Finally, there are game APIs which use multiple GPUs to do the
rendering of screen images and pipe it through one final GPU
output; that is called AMD Crossfire and NVidia SLI or NVLink --
none of these are compatible with each other, and they are all
likely to be de-supported in the near future.

Does that help explain things?

-dsr-

Stefan Monnier

Mar 9, 2021, 11:00:04 AM
> Each monitor will only use the GPU that it is connected to.

FWIW, I find the terminology used in the graphics card PC industry very
confusing. In my view, there are 4 different kinds of components to
a graphic system:

- Memory: this can be dedicated "video RAM" or just a chunk of your
normal RAM, shared with the rest of your system.
- Display engine (DE): This is the thing that reads a frame buffer
from some memory and sends the corresponding data out to your
monitor(s) via the output connector(s).
- Video processing unit (VPU): This is a specialized element with
support for decoding/encoding some specific video and audio formats.
It takes its input from some memory and sends its output to some other
part of the memory. E.g. when playing a video, it typically reads the
video from the system RAM and sends the decoded output to some (part
of a) frame buffer.
- GPU: This is a processor dedicated to doing the kind of number
crunching used for 3D rendering. It takes its inputs (e.g. textures
and scene descriptions) from some memory and outputs the rendered
scene to some (part of a) frame buffer. Nowadays these processors
have grown sufficient functionality that they can be used for other
kinds of number crunching.

Most "integrated graphics" are composed of the last 3 components above
and use the system's normal DRAM for their memory needs.

Most PC's "discrete" graphics cards include all 4 components and they
typically can't use the system's normal DRAM in the same way they use
their own video RAM. This means that you often can't use your discrete
GPU to render a 3D scene into the frame buffer used by the DE of your
integrated graphics card.


Stefan

Dan Ritter

Mar 9, 2021, 12:30:05 PM
All correct, except --
if you have one of the specific combinations of AMD APU, AMD
graphics card, and AMD-chipset motherboard that can do "Hybrid
Crossfire", in which, surprise, the CPU-integrated GPU and the
discrete card can contribute to rendering each other's frame
buffers. (I think it's always one specific direction, but am
not sure.)

-dsr-

deloptes

Mar 9, 2021, 1:50:05 PM
Pankaj Jangid wrote:

> Okay. So, this is possible. But my doubt remains. See below.
>
>> Last but not least - why many GPUs but one monitor? It makes no sense. It
>> should use the one where the monitor is attached, but usually monitors
>> have more inputs, which you can utilize.
>
> I’ll be using multiple monitors but my original doubt was this. If I am
> not connecting a monitor to GPU output, will I not be utilizing its
> power for (this) display?  Assuming, I have configured X to utilize all
> the GPUs for number crunching. Should it make any difference where I am
> connect the monitor? What will be the highest available resolution? Will
> it depend on where I have connected?

AFAIR a screen will be assigned to the controller/GPU. Then to each output a
Monitor and Display will be assigned.

As you know there are a few options for how it can be handled (Extend/Clone).
The resolution depends on the quality of the cards.

I have only one integrated Intel with DVI and HDMI. So I have one screen and
one Monitor, but both DVI and HDMI are connected.
Because I have configured Extend per default, I see the login screen only on
the DVI display and the extension on the other.

I guess you must try out and perhaps manually fine-tune this or that.

Maybe also copy/paste here the make/model of the hardware, so that someone
can give you more detailed information.

Felix Miata

Mar 9, 2021, 4:20:05 PM
deloptes composed on 2021-03-09 19:43 (UTC+0100):

> AFAIR A screen will be assigned to the controller/GPU. then to each output a
> Monitor and Display will be assigned.

> As you know there are few options how it can be handled (Extend/Clone).
> The resolution depends on the quality of the cards.

> I have only one integrated intel with DVI and HDMI. So I have one screen and
> one Monitor, but both DVI and HDMI are connected.
> Because I have configured Extend per default, I see the login screen only on
> the DVI display and the extention on the other.

Clear as mud.

Monitor = display. This is physical.

Screen for X purposes is comprised of from 1 to N displays aka 1 to N monitors,
and most often is. It's a logical construct in which displayed output can be
either mirrored (cloned) or discrete (unique).

IME, two cables simultaneously connected from one GPU to one display is a formula
for failure, unless the display specifically supports multiple simultaneous
discrete inputs, such as PBP, POP or PIP.
--
Evolution as taught in public schools, like religion,
is based on faith, not on science.

Team OS/2 ** Reg. Linux User #211409 ** a11y rocks!

Felix Miata *** http://fm.no-ip.com/

deloptes

Mar 9, 2021, 7:00:04 PM
Felix Miata wrote:

> Clear as mud.
>
> Monitor = display. This is physical.
>
> Screen for X purposes is comprised of from 1 to N displays aka 1 to N
> monitors, and most often is. It's a logical construct in which displayed
> output can be either mirrored (cloned) or discrete (unique).
>

Not exactly - it utilizes the device, and AFAIR if you have a multihead GPU,
you could have more than one Screen with more than one Monitor, and define
there whatever acceptable Displays you can define.

> IME, two cables simultaneously connected from one GPU to one display is a
> formula for failure, unless the display specifically supports multiple
> simultaneous discrete inputs, such as PBP, POP or PIP.

Why not? I can do HDMI or DVI as I like. The DVI delivers somewhat better
quality - might be because the HDMI has a switch box in between - I suspect
that, being generic hardware, it cannot deliver full 1920x1080@75, or maybe
it's some other reason.
Anyway, I kept the DVI, and it works with both plugged in just fine - for
maybe over two years already.

IL Ka

Mar 9, 2021, 8:00:04 PM
> Felix Miata wrote:
>
>> Clear as mud.
>>
>> Monitor = display. This is physical.
>>
>> Screen for X purposes is comprised of from 1 to N displays aka 1 to N
>> monitors, and most often is. It's a logical construct in which displayed
>> output can be either mirrored (cloned) or discrete (unique).
>
> Not exactly - it utilizes the device and AFAIR if you have multihead GPU,
> you could have more than one screens with more than one monitors and define
> there whatever acceptable displays you can define.


X11 terminology is complex and bloated with 30-year-old, poorly named abstractions.

Especially funny is that the term "display" has a different meaning in xorg.conf (screen width and depth) than in the X11 protocol (a collection of screens: DISPLAY=display.screen).

Nowadays you should have:
* One screen and one display (DISPLAY=:0.0), unless you do something rare like Zaphod mode.
* RandR (used either directly or via tools provided by your DE) to configure each monitor as a different output.

RandR solves most issues. The other approaches are outdated: Xinerama requires manual configuration and doesn't support different resolutions without "dead zones", and
"several screens" (the original X solution) does not allow you to move windows between them (except in a few carefully written apps).

Pankaj Jangid

Mar 9, 2021, 11:40:04 PM
Stefan Monnier <mon...@iro.umontreal.ca> writes:

>> Each monitor will only use the GPU that it is connected to.
>
> FWIW, I find the terminology used in the graphics card PC industry very
> confusing. In my view, there are 4 different kinds of components to
> a graphic system:
>
> ...
>
> Most PC's "discrete" graphics cards include all 4 components and they
> typically can't use the system's normal DRAM in the same way they use
> their own video RAM. This means that you often can't use your discrete
> GPU to render a 3D scene into the frame buffer used by the DE of your
> integrated graphics card.
>

Thanks for the detailed explanation, Stefan. Your advice will definitely
save me money, as I have not yet purchased a motherboard but have a
couple of GPUs lying around unused. I am building a new workstation,
taking cues from this list.

--
Regards,
Pankaj Jangid

Pankaj Jangid

Mar 9, 2021, 11:40:04 PM
Dan Ritter <d...@randomstring.org> writes:

> All correct, except --
> if you have one of the specific combinations of AMD APU, AMD
> graphics card, and AMD-chipset motherboards that can do "Hybrid
> Crossfire" in which, surprise, the GPU integrated with the CPU
> can contribute to rendering each other's frame buffers. (I think
> it's always one specific direction, but am not sure.)

I guess it pays them (in money terms) not to come up with a standard
solution.

But anyway, I have some spare GPUs from AMD (Radeon RX 580) and am
planning to go the Ryzen way. So I guess I will be able to utilize
things nicely now. Thanks.

--
Regards,
Pankaj Jangid

Pankaj Jangid

Mar 9, 2021, 11:50:03 PM
IL Ka <kazakev...@gmail.com> writes:

> Nowadays you should have:
> * One screen and one display (DISPLAY=:0.0) unless you do something rare
> like Zaphod mode.
> * Use randr (either directly or with tools provided by your DE) to
> configure each monitor as different output

Clear. Thanks a lot.

> "Several screens" (original X solution) does not allow you to move windows
> between them (except several carefully written apps)

I remember experimenting with $DISPLAY in my early days. It satisfies
the inner programmer, but yes, the applications must be carefully written
for :0.0, :0.1, :1.0, :1.1, ...

--
Regards,
Pankaj Jangid

deloptes

Mar 10, 2021, 2:10:04 AM
Pankaj Jangid wrote:

> But anyway, I have some spare GPUs from AMD (Radeon RX580) and am
> planning go via Ryzen way. So I guess, I will be able to utilize the
> things nicely now. Thanks.

Keep in mind that AMD dropped support for their older card models.

Pankaj Jangid

Mar 10, 2021, 2:50:04 AM
Hmm... that’s more research work for me. Thanks for the advice.

--
Regards,
Pankaj Jangid