
XXXFPS & the Human Eye


Rudy Pollorena Jr.

Jun 5, 1998
to Kenny L.

That is very interesting information. I'm glad I read this because it
was enlightening.

Kenny L.

Jun 5, 1998
to

Hello All,

After discussions with my doctor and optometrist, and after consulting many
books on the subject, I have found what I believe is the threshold of
observable fps.

Under normal lighting (bedroom light or outside at midday, as examples of the
minimum and maximum norms), the image that one sees is projected onto the back
of the eyeball. That image will stay on the back of the eyeball for up to
about 10 seconds; however, it decays rapidly, so about 0.2 seconds after the
image hits the back of your eyeball it has decayed to about half its original
resolution.
Example: when in a dark room with your light off, stare at the light, flick
it on and off quickly, and you will see the light flash on and off (of
course). After the switch is turned off, notice that you still have a white
spot in your field of vision, even with your eyes closed, but it doesn't
really look like a light bulb. The longer you leave the light off, the longer
you can still see the spot.

The images on the back of your eye are constantly being replaced in real
time, and the "current image" is always more intense than the one stored on
the eye.

The optic nerve sends the image data to the back of the brain, which
processes the data and applies a definition to what you're seeing. This is
the bottleneck. In normal humans, the time it takes to get the image from
the back of the eyeball to the brain and process it is approximately 0.0125
seconds.

If images are presented to the eye faster than this, they are sent to the
brain anyway, but the brain tends to mix them up, and the observer MAY get
confused or experience disorientation, nausea, or epileptic fits (the
extreme effects are experienced by less than 1% of observers in reported
cases).

Anyway, do the math, and you will find that 1.0 / 0.0125 = 80.

Based on the normal healthy human optic system, 80 fps is the most that your
brain can process. The brain can accept faster data rates but cannot process
them in time, so 80 fps is the most efficient figure.
Example: if you send 30 images to the brain each second, the brain can
process those 30 fps, no problem.
If you send 80 images to the brain each second, the brain can still process
those 80 fps with no problem.
If you send 85 images to the brain each second, the brain can process 80 fps
and mixes the other 5 fps in with those 80.
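A minimal sketch of the arithmetic above (an editor's illustration in Python,
not part of the original post; the 0.0125-second figure is the poster's
claim, not an established value):

# Sketch of the post's arithmetic: frame rate vs. per-image processing time.
PROCESS_TIME = 0.0125  # claimed seconds to carry one image from retina to brain

max_fps = 1.0 / PROCESS_TIME  # 80.0, by the post's reasoning

for fps in (30, 80, 85):
    frame_time_ms = 1000.0 / fps
    excess = max(0, fps - max_fps)  # frames the brain allegedly "mixes in"
    print(f"{fps:3d} fps -> {frame_time_ms:6.2f} ms per frame, "
          f"{excess:.0f} fps beyond the claimed {max_fps:.0f} fps limit")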

These are the scientific baselines with respect to the maximum fps that the
observer can "see". Your mileage may vary.

When you go to the movies, the film is presented at 30 fps, which is just
above the threshold of fluidity. The movie appears to be in real time. If
the film were presented faster than that, it would look better and appear
more fluid, but 30 fps is acceptable.

Is more better? Yes.
At 30 fps, your game may appear fluid. Up that to 45 fps... can you see the
difference? OF COURSE YOU CAN.
Up that again to 60 fps, and again you can see that the game looks more
fluid. Up that yet again to 75 fps and most will see the increase. Up the
count to 80 fps, and only a few will notice the difference between 75 and
80 fps.
Above 80 fps, only the chosen few may notice the difference.

This is based on human physiology and can be verified in any college-level
med book.

So why do I need 200fps?

Anyone who has played Quake/Quake 2 (or similar games) will tell you that
more fps can make all the difference between life and death, not to mention
just plain having fun.

Real World...

When I play Quake 2 single-player, my timedemo scores are about 50 fps.
Looks good, very playable. If I get into a small space and run timerefresh,
I can get to about 75 fps! Looks good, but I don't spend all my time in
those little crawl spaces.
When I run the massive1.dm2 timedemo, I get about 25 fps. Barely playable.
Once, I went online, had a decent ping, and got into a deathmatch. My fps
was so low that I saw somebody in one frame, and I was dead in the next.
My hardware is not up to the task for online deathmatch, no way.

This is why we need as much fps as we can get: when things get really busy
on screen, the hardware must slow down to process it. We should start with
the fastest hardware so that when we get into a real fragfest and our
hardware "slows down" to 75 fps, we won't be caught in a choppy mess and get
blasted by some mug who walks right up to our face before he has even
appeared on our monitor!


Can't wait for my Voodoo 2s to come.

Kenny L.

John Fournier

Jun 5, 1998
to

Well, I hope this finally settles it for all of you cynics out there.

-John

[remove .nospam to reply]
----------------------


Troy

Jun 5, 1998
to


Kenny L. <tyna...@inreach.com> wrote in article
<6l9ggr$qmo$1...@news.3dfx.com>...


> Hello All,
>
> After discussions with my doctor and optometrist, and after consulting
> many books on the subject, I have found what I believe is the threshold
> of observable fps.
>

>.....and yada yada yada.....
OK, if you all want to be techies, well, let's rock!!! First, all this 30 fps
stuff... well??? A CRT based on the NTSC (North American) standard actually
displays 60 fields per second, NOT 30 fps as all these wannabes keep
saying... PLEASE get this stuff right... I work as a techie for the Sony
corp. on video... The lines are scanned the length of the display every
1/60th of a second. As the first field is displayed, the second one is then
INTERLACED between the lines of the first as the old field fades, causing
what you would call flicker (interlacing)... This process happens every
1/60th of a second, or 60 times a second, which in computer-techie Quake
terms would be 60 fps... BUT WAIT!!! It's not that simple... Since the first
field is only every even line and the second every odd, they are interlaced
to make a complete frame, and since, as our pal said, the eye "holds" an
image for a split second, we see it as a full frame, at which point it would
be 30 fps, because the two fields or "pictures" are scanned in between each
other at 1/60th of a second each (divide by 2) to make 1/30th of a second.
But they are different fields (holding different picture detail, including
updated color and motion info), which means that a smooth TV PICTURE really
gets updated 60 times a second (unless you live in Europe, in which case
it's 50 times a second: the PAL system, blaaa!!!). The new standard by Sony
for HDTV will update even faster... believe me, you WILL SEE A DIFFERENCE...
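For the record, here is the field/frame arithmetic Troy describes, as a small
Python sketch (an editor's addition; note the exact NTSC color field rate is
60000/1001, about 59.94 Hz, rather than a flat 60 Hz):

# Field vs. frame arithmetic for interlaced TV standards.
STANDARDS = {
    "NTSC": 60000 / 1001,  # fields per second (North America)
    "PAL": 50.0,           # fields per second (most of Europe)
}

for name, field_rate in STANDARDS.items():
    frame_rate = field_rate / 2          # two interlaced fields = one full frame
    field_ms = 1000.0 / field_rate       # time to scan one field
    print(f"{name}: {field_rate:.2f} fields/s = {frame_rate:.2f} full frames/s, "
          f"{field_ms:.2f} ms per field")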

.......So the moral of the story is that you need 60 fps to get smooth motion
on a computer screen, since there is no interlacing as on televisions... and
an increase will make for even smoother motion, up to a limit... at which
point it will only make you sick.

P.S. As for seeing higher than 80 fps... read all the material... it depends
on whether your field of view is > 30% or < 30%... less meaning, you guessed
it, seeing a difference for sure... i.e., not sitting with your nose in the
screen of your monitor... I hope I confused you all... but then again you
deserve it... I am tired of reading misinformation... any questions?????


elw00d

Jun 5, 1998
to

It sounds like very soon, we will have to start upgrading our brains to keep
up with the graphics cards. ;O)

--
ICQ 1082633
elw...@home.msen.com
http://home.msen.com/~elw00d

Zinger

Jun 5, 1998
to

>The lines are scanned the length of the display every 1/60th of a second.
>As the first field is displayed, the second one is then INTERLACED


>...any questions?????


Since most current monitors are billed as displaying "non-interlaced"
pictures and have selectable refresh rates of 60, 75, 85 Hz and above, how
do those statistics and terms fit with what you just said about NTSC and
interlacing?

Thanks.

Kenny L.

Jun 5, 1998
to

Troy wrote in message <01bd90ca$01c93900$4086b3cf@troys>...


>
>
>Kenny L. <tyna...@inreach.com> wrote in article
><6l9ggr$qmo$1...@news.3dfx.com>...
>> Hello All,
>>
>> After discussions with my doctor and optometrist, and after consulting
>> many books on the subject, I have found what I believe is the threshold
>> of observable fps.
>>
>.....and yada yada yada.....

>I am tired of reading misinformation... any questions?????
>

Troy,

Please re-read my post. I don't believe I was talking about computer
hardware specs. I was defining what the limits of human hardware are.
Thanks for your info on monitors, though.

Joshua Krane

Jun 5, 1998
to

Interesting...but a little off.

The eye is the flaw, not the brain. Images are perceived when a photon
knocks an 11-cis-retinal molecule out of "place" in the cones and rods that
make up the "receptor" portion of the eye (the retina). What happens is
that, in order to readapt to darkness, vitamin A must be converted back into
11-cis-retinal, which then recombines with opsin. This takes time.

This is the bottleneck, so to speak.

There is evidence that the brain can accurately process multiple synaptic
inputs with firing rates faster than 120 kHz (that's 120,000 times per
second).

Ask your optometrist: it is the eye at fault, not the brain.

Joshua Krane, PhD Candidate
Dept. of Neuroscience
Florida State University


F

Jun 5, 1998
to

Kenny L. wrote in message <6l9ggr$qmo$1...@news.3dfx.com>...


>Hello All,
>
>After discussions with my doctor and optometrist, and after consulting many
>books on the subject, I have found what I believe is the threshold of
>observable fps.
>

SNIP


>
>The optic nerve sends the image data to the back of the brain, which
>processes the data and applies a definition to what you're seeing. This is
>the bottleneck. In normal humans, the time it takes to get the image from
>the back of the eyeball to the brain and process it is approximately 0.0125
>seconds.
>

SNIP


>Anyway, do the math, and you will find that 1.0 / 0.0125 = 80
>

What was the book where you found this info? I looked in all my core med
school books and couldn't find anything on the human threshold of fps.
Intuitively, I'd tend to believe Joshua Krane's statement in another post
that it's the re-formation of the 11-cis-retinal/opsin complex that would
be the bottleneck. Not to mention, he's a PhD candidate in neuroscience :-)

>This is based on human physiology and can be verified in any college-level
>med book.

Um, not quite :-). This topic has virtually no relevance to human disease,
so at most it's mentioned in passing (and quickly forgotten by med
students). "If it's not on the boards, forget it!"
The only time I heard about this topic in med school was in a radiology
elective when they were discussing interlacing. The prof just mentioned in
passing that at 30 fps you could notice the individual frames of an image.
The reason television looks smooth is interlacing, which gives you 60 fields
per second (one field shown on the odd lines, a second field interlaced into
the even lines; two "images" shown every 1/30 of a second). Whether or not
the human eye can distinguish between 60 fps and higher wasn't mentioned,
just that fluid motion is perceived at about 60 fps.

Someone had a link to a program which showed a moving square at 10, 30, 40,
and 60 fps, and you could definitely see the difference between 60 fps and
the others.
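The program F mentions isn't linked here, but a minimal sketch of the same
idea (an editor's addition, assuming the pygame library, which is not part
of the thread) animates a square at a frame rate given on the command line,
so you can compare 10, 30, and 60 fps for yourself:

# Minimal moving-square demo in the spirit of the program mentioned above.
# Usage: python square.py 30   (the frame rate to display at)
import sys
import pygame

fps = int(sys.argv[1]) if len(sys.argv) > 1 else 30
pygame.init()
screen = pygame.display.set_mode((640, 200))
clock = pygame.time.Clock()

x = 0.0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    x = (x + 300.0 / fps) % 640  # move at 300 px/s regardless of frame rate
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), 80, 40, 40))
    pygame.display.flip()
    clock.tick(fps)  # cap the loop at the requested frame rate
pygame.quit()

Because the square covers the same 300 pixels per second at every setting,
any choppiness you see at 10 or 30 fps is purely the frame rate.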


>
>So why do I need 200fps?
>

SNIP


>This is why we need as much fps as we can get: when things get really busy
>on screen, the hardware must slow down to process it. We should start with
>the fastest hardware so that when we get into a real fragfest and our
>hardware "slows down" to 75 fps, we won't be caught in a choppy mess and get

SNIP

Yup, this is the real need for higher fps: it gets pretty ugly when my fps
drops below 20 in rocket battles. But do you really need to have your fps
up at the human threshold? I'd think that as long as it stays above 40 fps
(which is what I get right now in my timedemos), I'd play just fine; it's
when it gets really low that it matters. That's why I think timedemos with
demo1 or demo2 don't mean as much as Brett's massive demo. The massive demo
gives you a better gauge of what your fps will be like in big battles, when
it counts most.

Anyway, don't bother your family doc about this; probably 99 out of 100
won't know the answer. "If it ain't on the boards, then forget it!"

idealego

Jun 6, 1998
to

I couldn't agree more with most of what Kenny L. said, including that the
brain can take in up to about 80 fps. I used to run Quake 1 in software
with the monitor running at 180 Hz (achievable with SciTech Display Doctor
and a $2000 monitor), and the framerate on my PPro 233 at 320x200 was about
100 fps and never dropped under 50 in the biggest rocket fights, which made
all the difference in the world. Based on my own observations, I usually
claim that the brain can take in about 75 fps, which is pretty close to
what Kenny L. is claiming, although his figure comes from more substantial
research than mine, I must add.

Anyone who claims they can't notice the difference is just jealous
because their system won't run that fast.

Jason Keddie
idea...@hotmail.com

Todd

Jun 6, 1998
to

Joshua Krane wrote in message <6l9rtl$1cf$1...@news.3dfx.com>...

Wow, I must say I have never read so much interesting and technical stuff
about a stinkin' gaming card in my whole life! Now maybe I can convince my
girlfriend that while I'm obsessing over my clv2, I'm actually learning
important things about retinal, vitamin A, the brain, and my eyes. Ha :)

Todd, foofoo candidate
Dept. of foo
foofoo University


Jon

Jun 6, 1998
to

On Fri, 5 Jun 1998 12:22:04 -0700, "Kenny L." <tyna...@inreach.com>
wrote:

<snip explanation>

>Is more better? Yes.
>At 30 fps, your game may appear fluid. Up that to 45 fps... can you see the
>difference? OF COURSE YOU CAN.
>Up that again to 60 fps, and again you can see that the game looks more
>fluid. Up that yet again to 75 fps and most will see the increase. Up the
>count to 80 fps, and only a few will notice the difference between 75 and
>80 fps.
>Above 80 fps, only the chosen few may notice the difference.
>

>This is based on human physiology and can be verified in any college-level
>med book.

This is all basically fine, but there is more than just the human eye/brain
to deal with. Since no one has mentioned them all, I'll do it! You have:

1) rendering speed to the framebuffer
2) DAC -> monitor refresh read of the framebuffer, or digital send to
digital screens
3) monitor phosphor decay time / cathode voltage and current
4) cone/rod reception of light
5) optic nerve -> brain reception
6) image complexity/detail in the brain's encoding of the image

#1 is 3Dfx's and the application developers' problem. They should provide
us with a board and software which:

A) demonstrate consistent rendering performance per level of quality.
    WHY?: Because consistent performance at 45 fps is better than 200 fps
dropping to 30 fps.
    HOW?: Make software/hardware trade-offs between scene quality and speed
when necessary. An example of this being done is Messiah, with its
level-of-detail control of model tessellation. Talisman is an example of an
attempt at this, though it never fully made it into the marketplace. This
will be key for software and hardware developers.
B) demonstrate rendering speed to the framebuffer consistently above the
refresh rate, with vertical synchronization of rendering to refresh enabled
and multiple display buffers.
    WHY?: When you render slower than the monitor refreshes, frames sent to
the monitor are duplicates... The further below the monitor's refresh you
are, the more duplicates you get. At 30 fps with a 60 Hz refresh, your
retina is exposed to 2 images for every rendered scene as you track the
screen's action (see the sketch at the end of this post). This is horribly
annoying, and it is why computer graphics NEEDS to get to 60 fps to appear
as smooth as the N64 in the first place. No VSYNC is unacceptable... it
creates tearing in the image when off sync.
    HOW?: Faster boards / better control of the level of quality.

#2 is 3Dfx's, the industry standards bodies', and the monitor's problem.
They should provide us with:

A) A high-quality DAC, or digital output. The monitor needs to support at
least non-interlaced 60-85 Hz at the resolutions you want (up to 1600x1200
today for high-enders).
    WHY?: Poor DACs have poor separation of colors. Digital is best.

#3 is the monitor maker's problem. Best would be a brain interface, but
heck! :) They should provide us with:

A) Phosphor (or whatever material) that decays to about 1/e of its
intensity per refresh period. This is standard for high-end monitors.
Bigger is better: more immersion. Head-gear is definitely in the future,
shooting lasers into your eyes, and later brain interfaces at different
levels of connection.

#4 is our problem. :) A brain interface could bypass this, but that's for
our grandchildren to experience. Without physical/genetic enhancements, we
are on average limited by:

A) The rods' reception of intensity is key. Without the prevailing
technobabble, each rod refreshes in about 1/90th of a second. Depending on
many factors, this is the extreme our eyes will let us detect, and that's
for off-axis rather than center viewing. 85 Hz should be fine for any
application.

#5 and #6 are again our problem. The optic nerve is pretty efficient. The
brain itself is the key to much of the misunderstanding about fps:

A) The brain encodes the optic nerve's signal into many components,
represented by shapes (lines and such) and by spatial and temporal
frequencies and their shifts; colors can do odd things, etc.

B) The key idea with computer games is complexity. The more complex a
scene is, the more fluid it *can* appear, because you don't give the brain
simple information from which to isolate detailed phenomena. Compare a
white ball on a black background moving at 60 fps on your screen: move it
fast enough and you can TELL it's skipping. If the scene is complex enough,
you can't tell as readily, unless the scene has simple components in it.

So that's a wrap. It shouldn't be too technical, so it won't provide you
with *evidence* (go read a book); it is simply an overview of the factors
that come into play.

It all boils down to this: right now we should desire 60 Hz and 60 fps,
LOCKED. Which means you follow what I said above. Better is better, sure,
up to about 90 Hz, past which it is silly unless you are doing something
funny with that signal, like separating it into stereo signals.
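A small sketch of the duplicate-frame arithmetic behind point #1B above (an
editor's illustration in Python, not Jon's code): with vsync enabled, each
rendered frame is held on screen for a whole number of refresh intervals, so
rendering below the refresh rate means frames get shown two or more times.

# Duplicate frames under vsync: each rendered frame is displayed for a whole
# number of refresh ticks, so render_fps below refresh_hz means repeats.

def refreshes_per_frame(render_fps, refresh_hz):
    """Average number of refresh ticks each rendered frame stays on screen."""
    return refresh_hz / render_fps

for render_fps in (30, 45, 60):
    ticks = refreshes_per_frame(render_fps, 60.0)
    print(f"{render_fps} fps at 60 Hz -> each frame shown ~{ticks:.1f} times")
# 30 fps at 60 Hz -> each frame shown ~2.0 times (Jon's "2 images per scene")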

Derek

Jun 6, 1998
to

Kenny L. wrote:

> The optic nerve sends the image data to the back of the brain, which
> processes the data and applies a definition to what you're seeing. This is
> the bottleneck. In normal humans, the time it takes to get the image from
> the back of the eyeball to the brain and process it is approximately 0.0125
> seconds.

> Anyway, do the math, and you will find that 1.0 / 0.0125 = 80

Assuming you are right... so what if it takes 0.0125 seconds to reach the
brain? That just means everything is lagged by that much. You could still
pack the info into the brain 2 ns apart, keeping the brain very up to date.

--
http://www.wcvt.com/~restey/
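A tiny sketch (an editor's addition, not Derek's) of the latency-versus-
throughput distinction he is making: a fixed 12.5 ms transit delay shifts
every frame later in time, but it does not limit how many frames per second
arrive.

# Latency vs. throughput: a constant pipeline delay does not cap frame rate.
LATENCY = 0.0125  # the claimed eye-to-brain transit time, in seconds

def arrival_times(fps, n):
    """Times at which the first n frames 'arrive', each shifted by LATENCY."""
    return [i / fps + LATENCY for i in range(n)]

times = arrival_times(fps=200.0, n=4)
gaps = [round(b - a, 4) for a, b in zip(times, times[1:])]
print(gaps)  # [0.005, 0.005, 0.005] -> still 200 frames/s, just 12.5 ms late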

Peter O'Boyle

Jun 6, 1998
to

On 5 Jun 1998 21:37:43 GMT, "Troy" <tr...@nbnet.nb.ca> wrote:


>OK, if you all want to be techies, well, let's rock!!! First, all this 30 fps
>stuff... well??? A CRT based on the NTSC (North American) standard actually
>displays 60 fields per second, NOT 30 fps as all these wannabes keep saying...
>PLEASE get this stuff right... I work as a techie for the Sony corp. on

60 fields and 30 fps... and medium-persistence phosphors
doing the smoothing... :-)

Peter


Cem Aygün

Jun 7, 1998
to

The point IS the mixing you mention occurring above 80 fps... In real,
analog life, a lot of things are faster than human perception, creating an
illusion, a new level of mis-perception... like a seemingly stationary car
wheel that is actually turning, or the wings of a fly...

In a digitally simulated world, these mis-perceptions are substituted with
look-alikes (like changing the wheel texture when it starts to turn). But
above 80 fps, they would be regenerated completely, so you would be able to
see what you can see in the real world, the way you see it...

Of course, we first need some real-world-oriented games, like Spec Ops, so
that we can judge how far a machine and its software can get. Not completely
fictional junk like Quake 2. (If you make the rules, you can always cheat to
make your work seem perfect. The trick is simple: anything you cannot do,
you say doesn't exist in your world, according to the game concept.)

Cem Aygun


Derek

Jun 7, 1998
to

Cem Aygün wrote:
>
> The point IS the mixing you mention occurring above 80 fps... In real,
> analog life, a lot of things are faster than human perception, creating an
> illusion, a new level of mis-perception... like a seemingly stationary car
> wheel that is actually turning, or the wings of a fly...
>
> In a digitally simulated world, these mis-perceptions are substituted with
> look-alikes (like changing the wheel texture when it starts to turn). But
> above 80 fps, they would be regenerated completely, so you would be able
> to see what you can see in the real world, the way you see it...

Take 60 seconds multiplied by 80 = 4800. By that math we couldn't tell the
difference between 4800 rpm and 20,000 rpm (assuming one revolution is
roughly one frame). Yet wagon wheels spinning at much lower rpm (wild guess,
say 100) produce that effect.

Of course, this is just my wild reasoning and I don't really know what I'm
talking about ;)
--
Please remove the NOSPAM in my address before trying to email me
Thanks - Derek

http://www.wcvt.com/~restey/

Jon

Jun 9, 1998
to

On Sun, 07 Jun 1998 19:12:25 -0400, Derek <roo...@yahoo.com> wrote:

>Take 60 seconds multiplied by 80 = 4800. By that math we couldn't tell the
>difference between 4800 rpm and 20,000 rpm (assuming one revolution is
>roughly one frame). Yet wagon wheels spinning at much lower rpm (wild
>guess, say 100) produce that effect.
>
>Of course, this is just my wild reasoning and I don't really know what
>I'm talking about ;)

There's a difference between flicker detection and motion-blur detection.

Most CG has zero motion blur, or unrealistic motion blur, so the flicker
rate is what matters for fluid frame interpolation; but that doesn't mean
the image will look correct for quickly rotating objects, since those are
heavily motion-blur dependent.

All other types of motion should be OK.
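To close the loop on the wagon-wheel numbers Derek raised, here is a quick
sketch of the temporal aliasing involved (an editor's addition, not from the
thread; the 8-spoke wheel is an assumed example): sampling a spinning wheel
at a fixed frame rate folds its true rotation rate into an apparent rate
that can be slow, stationary, or even backwards.

# Wagon-wheel temporal aliasing: the apparent rotation per frame is the true
# per-frame rotation folded into the range (-0.5, 0.5] pattern periods.

def apparent_rev_per_frame(rpm, fps, spokes=1):
    """Apparent wheel revolutions per frame when sampled at fps.

    With `spokes` identical spokes, the wheel looks the same every 1/spokes
    of a revolution, which lowers the speed at which aliasing kicks in.
    """
    per_frame = (rpm / 60.0) / fps * spokes  # spoke-pattern periods per frame
    folded = per_frame % 1.0
    if folded > 0.5:
        folded -= 1.0  # aliases to apparent backwards rotation
    return folded / spokes

FPS = 60.0
for rpm in (100, 4800, 20000):
    a = apparent_rev_per_frame(rpm, FPS, spokes=8)
    direction = "backwards" if a < 0 else "forwards"
    print(f"{rpm:6d} rpm at {FPS:.0f} fps (8 spokes): looks {direction} "
          f"at ~{abs(a) * FPS * 60:.0f} apparent rpm")

At 60 fps, the 8-spoke wheel at 4800 rpm comes out looking like a slow
backwards roll, while 100 rpm still reads correctly, which is exactly the
mismatch Derek noticed.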
