
Retinal Jiggles: Why Your Eyes and Brain Strongly Prefer Games at 60 fps

TEHRAN (FNA)- One of the issues that comes up frequently in technical discussions of both gaming and video is what frame rate should be considered "good enough."

On the one hand, you have people who believe that 24 fps for film and 30 fps for gaming represent magic figures that we don't benefit from exceeding. There's actually a great deal of objective evidence suggesting this isn't true, and a new blog post by Simon Cooke, from Microsoft's Xbox Advanced Technology Group, delves into why humans tend to prefer higher frame rates, ExtremeTech reported.

Part of the problem with trying to discuss this topic is that the human eye is a fantastically complicated piece of equipment that performs its own image processing before the signal is ever relayed to the brain. We tend to think of what we see as a cohesive whole because our entire visual system has evolved to allow us to do so. In many ways, however, this is an illusion. The eye's sensitivity to color, motion, light, and acceleration/deceleration are all different. The situation is further muddied by the fact that, while we often think of the eye as a camera and discuss vision with the same terms we use to talk about computer-generated graphics, neither of these analogies actually captures how the eye receives or processes information. A video accompanying the post shows the difference between 60 fps and 30 fps at three different movement rates.

All of this said, people do tend to prefer higher frame rates for gaming when given the opportunity to try them. This preference holds up even above 60 fps (60Hz), for a number of potential reasons depending on the nature of the game, its graphics, and how fast-paced the action is.

Simon Cooke's theory is that this preference has to do with one of the interesting mechanical aspects of human vision: even when you fix your gaze on a single point, the retina is never actually still. The wobble - more properly known as ocular microtremor - occurs at an average frequency of 83.68Hz, with a jiggle range of around 150-250nm, or about the width of 1-3 photoreceptors in the retina.

So what's the point of this wiggling back and forth? Cooke thinks he knows. By wiggling the retina back and forth, the eye samples the same scene from two very slightly different points. Meanwhile, inside the eye, you've got two different types of retinal ganglion cells - on-center cells that respond when the centers of their receptive fields are exposed to light, and off-center cells that respond when the centers of their fields are not.

When the retina wiggles back and forth, incoming light strikes both on-center and off-center cells, stimulating both. Cooke thinks this may boost our ability to detect the edges of objects. He also argues that all of this ties back to the Uncanny Valley.

Tying back to video games

If this theory is accurate, it means that the human retina effectively upsamples the world around us, just as video cards and game consoles can render internally at much higher resolutions before outputting an image to the display. The image below shows an example of how taking multiple samples of an image with exactly the same equipment can lead to better results.
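The upsampling idea can be illustrated with a toy sketch (this is an illustration of super-resolution from offset samples in general, not Cooke's actual model; the `scene` function and sample spacings below are invented for the example):

```python
# Toy 1-D super-resolution sketch: sampling the same "scene" at
# slightly offset positions recovers detail that a single
# low-resolution pass misses entirely.

def scene(x: float) -> float:
    """A fine-grained pattern that alternates every 0.5 units."""
    return 1.0 if int(x * 2) % 2 == 0 else 0.0

# One low-res pass: samples every 1.0 units always land on the same
# phase of the pattern, so the alternation is aliased away.
single_pass = [scene(x) for x in range(8)]

# Two passes offset by half a sample (the "tremor"), interleaved
# into one higher-resolution signal: the pattern reappears.
jittered = [scene(i * 0.5) for i in range(16)]

print(single_pass)  # [1.0, 1.0, 1.0, ...] - detail lost
print(jittered)     # [1.0, 0.0, 1.0, 0.0, ...] - detail recovered
```

The point is that the extra information comes purely from the offset between samples, not from better "equipment" - the same trick a retina wobbling by 1-3 photoreceptor widths could exploit.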

Super-resolution close-up.

But this ability to extract additional information from what we see depends on feeding information to the system at a certain rate. If the sample rate (30Hz, i.e. 30 fps) is below half the retina's microtremor rate of 83.68Hz, then the image field isn't changing quickly enough for the eye to extract additional information.
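The argument is essentially a Nyquist-style sampling criterion. A minimal sketch of the arithmetic (the constant comes from the article; the function name is my own):

```python
# Per Cooke's argument, a display must refresh faster than half the
# ocular microtremor frequency for the eye's "super-resolution"
# sampling to get anything extra out of the image stream.
MICROTREMOR_HZ = 83.68  # average ocular microtremor frequency

def exceeds_tremor_nyquist(fps: float) -> bool:
    """True if this frame rate is above half the microtremor rate."""
    return fps > MICROTREMOR_HZ / 2

print(MICROTREMOR_HZ / 2)          # 41.84 - roughly the ~43 fps mark cited later
print(exceeds_tremor_nyquist(30))  # False: 30 fps falls short
print(exceeds_tremor_nyquist(60))  # True: 60 fps clears it comfortably
```

Halving 83.68Hz gives 41.84Hz, which is why 30 fps misses the threshold while 60 fps clears it with room to spare.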

If you've followed the discussions of microstutter and frame latency in gaming, you're already aware that one reason microstutter is a less intuitive objective performance measure than frame rate is that the benefit of lower frame times drops off as a constant frame rate approaches 60 fps. Cutting frame time from 33.3ms (30 fps) to 25ms (40 fps) feels like a bigger improvement than moving from 40 fps to 60 fps - despite the fact that the second jump is larger in both absolute and relative frame-rate terms.
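The frame-time arithmetic behind that comparison is simple enough to sketch (the helper name is my own):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given constant frame rate."""
    return 1000.0 / fps

# Diminishing returns: each 10 fps step saves fewer milliseconds.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms
# 40 fps -> 25.0 ms
# 60 fps -> 16.7 ms
```

Note that 30 to 40 fps shaves about 8.3ms per frame, while the twice-as-large jump from 40 to 60 fps shaves roughly the same 8.3ms - which is why the first step feels disproportionately significant.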

If Cooke is correct, the explanation for this phenomenon is that the eye's own super-resolution mode effectively kicks in around the 43 fps mark. Interestingly, he suggests that a higher frame rate at a lower resolution can yield better results than a 1080p @ 30 fps target. Whether many developers are willing to take that shot remains an open question; most Xbox titles have failed to hit the 1080p @ 30 fps target, but most have aimed for 900p rather than dropping all the way back to last-gen's 720p. There's an expectation of resolution improvement that may work against developer attempts to optimize other aspects of the experience.

If you're curious to see how 60 fps and 30 fps gaming line up side-by-side, hit up the 30 fps vs. 60 fps website. These are uploaded MP4s, not YouTube videos, and we've confirmed that the clips are encoded at 30 fps and 60 fps respectively.

Unfortunately, there's no sign that the game industry will adopt the changes that would take meaningful advantage of Cooke's research, even if the claims hold up under scrutiny. The game industry is fixated on resolution rather than frame rate, and if 720p @ 60 fps isn't politically possible today, there's little hope that 1080p @ 60 fps (as opposed to 4K @ 30 fps) will be any more tenable in future product cycles. PC gamers, of course, already have access to these advantages, though unlocking them can still require powerful video cards or reduced quality settings. With v-sync enabled, a 60Hz monitor will only display 60 fps if the game can hold that speed constantly - if the frame rate drops below that point, output automatically falls back to 30 fps or 20 fps. That means we need to see 120Hz panels become much more common before the 1/2 or 1/3 fallback point will stay above the minimum refresh rate necessary to take advantage of retinal upsampling.
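The v-sync fallback behavior described above divides the panel's refresh rate by small integers, which makes the case for 120Hz panels easy to see in numbers (a minimal sketch; the function name is my own, and real drivers may expose other sync intervals):

```python
# With v-sync on a fixed-refresh panel, a missed refresh deadline
# drops the effective frame rate to 1/2 or 1/3 of the panel rate.
def vsync_fallbacks(panel_hz: float) -> tuple:
    """Full rate plus the 1/2 and 1/3 fallback rates for a panel."""
    return (panel_hz, panel_hz / 2, panel_hz / 3)

print(vsync_fallbacks(60))   # (60, 30.0, 20.0) - both fallbacks miss ~42Hz
print(vsync_fallbacks(120))  # (120, 60.0, 40.0) - the 1/2 step clears it
```

On a 60Hz panel, both fallback rates land below the ~42Hz microtremor threshold; on a 120Hz panel, the first fallback is still a comfortable 60 fps.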

This kind of work and understanding will be vital to the various attempts to augment human vision. From next-generation smart contact lenses to night-vision add-ons to peripherals like the Oculus Rift, a number of high-profile research projects are focused on integrating technology with human vision in unprecedented ways. The projects that survive, I suspect, will be those that adopt strategies dovetailing most closely with what the human eye already does on its own, mimicking its functions in ways that don't disrupt normal day-to-day activity.

Copyright 2014 Fars News Agency. All rights reserved. Provided by SyndiGate Media Inc.

Article Details
Publication: FARS News Agency
Date: Dec 27, 2014
