
PC gamers: turn "Max pre-rendered frames" to ZERO

868 views

Zank Frappa

Apr 16, 2010, 8:42:39 AM
I got this from another forum and it has really helped with JC2.

Max pre-rendered frames is a setting that a lot of games are using these
days, the purpose of which is to smooth out the fps. It does this by
using your extra GPU power to render additional frames. Unfortunately
this has a tendency to introduce input lag.

If you are getting a good framerate but your mouselook feels floaty and
unresponsive, this is likely the reason. Bad Company 2 and Fallout 3 are
examples of games that suffer dramatically from this setting. It is also
known as RenderAheadLimit in the Bad Company 2 "settings.ini" file. You
can change it there if you like, or you can set it to 0 in your Nvidia
control panel.
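For reference, the tweak in the Bad Company 2 "settings.ini" would look
something like this (a sketch only -- the file may group it under a section
header that varies between game versions):

```ini
; Bad Company 2 settings.ini -- cap the driver's frame queue at zero
RenderAheadLimit=0
```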

For those of us using ATI cards, download the latest version of ATI Tray
Tools (a third party driver tweaking suite – there's no equivalent
setting in the Catalyst drivers) and adjust the 'Flip Queue'.

Changing this setting to 0 won't have any effect whatsoever on games that
don't use it. I don't have a list of games that do. Personally I have it
forced to 0 in my global settings and have never had any issues.

Juarez

Apr 17, 2010, 11:25:56 AM

I had to change that setting in Ghost Recon AW because, when using night
vision, the render-ahead setting of 3 frames would cause the game to
crash on the 7900GT, but 1 didn't. Apart from that I have never adjusted
it. I'll check it out though and see if you are correct, but I have ATI
and so will have to get ATI Tray Tools first. I hate it when games feel
laggy even when the frame rate is good.

yaugin

Apr 18, 2010, 11:56:54 AM
On Apr 16, 5:42 am, Zank Frappa <em...@mail.com> wrote:
> Max pre-rendered frames is a setting that a lot of games are using these
> days, the purpose of which is to smooth out the fps. It does this by
> using your extra GPU power to render additional frames. Unfortunately
> this has a tendency to introduce input lag.

More accurately it is output lag. You are seeing and reacting to
things that happened, accurately, 3 frames ago. When you get the lag
effect from this, it means the CPU is overpowered for the game and is
consistently delivering frames to the GPU ahead of time, so basically
by the time the GPU shows the frames on screen, it's always 3 frames
behind the CPU. This is supposed to be offset by an equal mix of slow
frames for an ideal net of zero, on the assumption that the worst case
is the CPU falling behind (choking). However, artificial performance
optimizations like this can be worse than having a few hiccups.
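The added delay from that frame queue is easy to estimate: each queued
frame delays display of your input by roughly one frame time. A minimal
sketch (the function name is mine, and this ignores driver and display
overhead):

```python
def input_latency_ms(prerendered_frames: int, fps: float) -> float:
    """Estimate the extra input-to-display latency added by the
    pre-rendered frame queue: each queued frame adds roughly one
    frame time (1000 / fps milliseconds) of delay."""
    frame_time_ms = 1000.0 / fps
    return prerendered_frames * frame_time_ms

# At 60 fps, the usual default queue of 3 frames adds ~50 ms of lag;
# setting the queue to 0 removes that delay entirely.
print(round(input_latency_ms(3, 60), 1))   # ~50 ms
print(round(input_latency_ms(0, 60), 1))   # 0 ms
```

At a lower frame rate the same queue hurts more: 3 frames at 30 fps is
about 100 ms, which is well into "floaty mouselook" territory.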

This "feature" is useful for benchmarking a GPU and not much else --
after all most benchmarks don't use live gameplay so who would notice
the lag? Reminds me of vsync and how drivers usually default it to off
these days. These graphics card makers will do anything to post a
higher number than the other guy.


Justisaur

Apr 19, 2010, 11:14:26 AM

To be fair, there's no reason to vsync on an LCD, which is what most
people are using today.

- Justisaur

Ayatollah of rock 'n' roller

Apr 19, 2010, 12:17:03 PM

"Justisaur" <just...@gmail.com> wrote in message
news:17a24dde-9a76-457f...@j37g2000yqe.googlegroups.com...


Why not? I have an LCD and still get image-tear if I don't use vsync.


Borg

Apr 20, 2010, 5:13:51 PM
On Mon, 19 Apr 2010 08:14:26 -0700 (PDT), Justisaur
<just...@gmail.com> wrote:


>To be fair there's no reason to vsync on an LCD which most people are
>using today.
>
>- Justisaur

Why do you say that? The cockpit frame in flight sims tears really
badly without Vsync, and it's much more noticeable on an LCD than on a
CRT. If the frame rate is good, one should always use Vsync for the
best image quality. In some games it is not very noticeable, but in
others it can look horrid without it. If I had a 120 Hz LCD I think I
would always have Vsync on unless it was causing lag, as it can in some
games.
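For what it's worth, the tear is visible because without vsync the
buffer swap lands partway through the display's top-to-bottom scanout,
so the rows below the scan position come from the new frame. A
back-of-envelope sketch (hypothetical helper; simplified model that
assumes scanout is linear across the whole refresh interval):

```python
def tear_line_px(swap_ms: float, refresh_hz: float, height_px: int) -> int:
    """Estimate which screen row a tear appears on when a buffer swap
    happens swap_ms into a refresh cycle: everything below the current
    scanout position shows the new frame, leaving a horizontal seam."""
    interval_ms = 1000.0 / refresh_hz
    scan_fraction = (swap_ms % interval_ms) / interval_ms
    return round(scan_fraction * height_px)

# A swap 5 ms into a 60 Hz refresh on a 1050-line panel puts the
# seam about 30% of the way down the screen.
print(tear_line_px(5.0, 60, 1050))  # 315
```

Since the swap time relative to the refresh drifts every frame, the seam
wanders up and down the screen, which is why it catches the eye so badly
on fast camera pans like TrackIR head movement.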

Borg

Apr 20, 2010, 5:15:59 PM
On Mon, 19 Apr 2010 17:17:03 +0100, "Ayatollah of rock 'n' roller"
<thi...@lse.co.ck> wrote:


>Why not? I have an LCD and still get image-tear if I don't use vsync.
>

Yep, and image tearing looks a lot worse on an LCD than a CRT for some
reason. Using TrackIR in a flight sim without Vsync enabled makes the
cockpit frame tear very badly on an LCD.

JLC

Apr 21, 2010, 11:10:57 PM

Battlefield 2 has a bug which causes Vsync to be turned off even if it
is forced using the ATI console. I think it's the same for Nvidia cards.
I have a 1GB 4870 running a 22" LCD at 1680x1050 on Win 7 64-bit, and
the game looks horrible. The tearing is very noticeable; I can't believe
anyone could play it and not notice right away. I'm using D3DOverrider,
which does fix the problem. It also enables triple buffering, which
helps with performance.
JLC
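The performance boost from triple buffering comes from avoiding the
classic double-buffered vsync stall: if a frame takes even slightly
longer than one refresh interval, the GPU must wait for the next vsync
tick, locking output to an integer divisor of the refresh rate. A
simplified model (my own sketch; real drivers also add queueing and
frame-pacing effects):

```python
import math

def vsynced_fps(render_ms: float, refresh_hz: float,
                triple_buffered: bool) -> float:
    """Effective frame rate under vsync. Double-buffered: the GPU
    stalls until the next refresh tick after each frame, so output
    locks to refresh_hz / ceil(render_time / refresh_interval).
    Triple-buffered: rendering continues into a third buffer, so fps
    is limited only by render time (capped at the refresh rate)."""
    interval_ms = 1000.0 / refresh_hz
    if triple_buffered:
        return min(refresh_hz, 1000.0 / render_ms)
    ticks = math.ceil(render_ms / interval_ms)
    return refresh_hz / ticks

# A 20 ms frame (50 fps raw) on a 60 Hz display:
print(round(vsynced_fps(20.0, 60, triple_buffered=False)))  # 30
print(round(vsynced_fps(20.0, 60, triple_buffered=True)))   # 50
```

So a card that can almost keep up with the refresh rate gets punished
hardest by double-buffered vsync, which is exactly where triple
buffering (via D3DOverrider or similar) pays off.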

Justisaur

Apr 24, 2010, 12:28:34 AM
On Apr 20, 2:15 pm, Borg <m...@somewwhere.invalid> wrote:
> On Mon, 19 Apr 2010 17:17:03 +0100, "Ayatollah of rock 'n' roller"
>
> <this...@lse.co.ck> wrote:
> >Why not? I have an LCD and still get image-tear if I don't use vsync.
>
> Yep, and image tearing looks a lot worse on an LCD than a CRT for some
> reason. Using TrackIR in a flight sim without Vsync enabled makes the
> cockpit frame tear very badly on an LCD.

Never noticed it. But then I don't play flight sims. I do play some
space sims and haven't noticed it there, but most of those are older
and I doubt they have that new default.

- Justisaur
