Why The 1080p/60fps Dream May Never Be Realized


YouTube Video:
http://www.youtube.com/watch?v=drIsVogR5uo


If you scour the gaming media, magazines, Internet forums and blogs from the past year or so, you will see countless discussions about whether or not the next generation of gaming delivered by the PS4 and Xbox One will reach the 1080p/60fps "holy grail" for delivering what many feel is the optimal quality experience available today. It's a discussion that has really been going on since the advent of the current generation of consoles: the PS3 and Xbox 360, which both promised to deliver "full HD" (i.e. 1080p) graphics. Unfortunately, only a relative handful of the titles released over those 8 years delivered on that promise, and that trend may continue well into the next generation of consoles. Let me explain why.

Understanding 1080p

1080p is an HD resolution standard meaning 1920 x 1080 (roughly 2 million) pixels on the screen. In 2005 when the Xbox 360 was released, 1080p was seen as a luxury that not many people could take advantage of. The penetration of HDTVs that delivered full 1080p resolution was relatively small and still growing. In addition, most gamers already saw the jump from standard definition (480p) to 720p HD resolution that the Xbox 360 and PS3 delivered as a big enough improvement in its own right. Today, however, 1080p is the standard across both PC monitors and HDTVs, and they are plentiful and cheap. The expectation is that with nearly 8 years between console generations and the advances in technology, these modern systems, the PS4 and Xbox One, should be able to at least deliver what was promised last generation. After all, gamers have been enjoying 1080p on the PC for several years now on technology that is inferior and outdated compared to the PS4 and Xbox One. Thus, we should finally be able to play games that fully take advantage of that maximum resolution.
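
To put those resolutions in perspective, here's a quick back-of-the-envelope sketch in Python. It is nothing more than width x height arithmetic, and the 640 x 480 dimensions used for 480p are just a typical assumption.

# Rough pixel counts for common display resolutions (width x height)
resolutions = {
    "480p (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "4K (UHD)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:>16}: {w * h:>9,} pixels")

# 1080p is ~2.07 million pixels: about 2.25x the pixels of 720p
# and roughly 6.75x the pixels of 480p standard definition.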

Understanding 60fps

Fps stands for "frames per second" and describes the "frame rate" of the video. The short explanation is that the higher the frame rate, the smoother the video looks. As a point of reference, film (i.e. what you watch when you go to the movie theater) is 24fps, video (TV broadcast) is 30fps, and the best games typically max out at 60fps. In games, frame rate simply means "how fast can the hardware process the scene?" The more complex the scene (i.e. the better the graphics), the longer it will take to process and render each frame. For example, for a simple scene with just one person in an empty room, the GPU may be able to render 60 frames per second (60fps). But if the scene had an extra person in it, the extra time it would take the GPU to render the second person may mean that it can only render 30 frames in one second. This chart summarizes what I'm saying here: the higher the complexity, the lower the frame rate.
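
To put rough numbers on that relationship, here's a toy sketch. The per-object costs below are made-up illustrative milliseconds, not measurements from any real GPU.

# Toy model: frame time grows with scene complexity, so the frame rate drops.
# All costs are illustrative milliseconds, not real GPU measurements.
BASE_COST_MS = 10.0       # fixed cost of rendering an empty scene
COST_PER_OBJECT_MS = 3.0  # extra cost of each character/object drawn

def estimated_fps(num_objects):
    frame_time_ms = BASE_COST_MS + COST_PER_OBJECT_MS * num_objects
    return 1000.0 / frame_time_ms

for n in (0, 2, 5, 10, 20):
    print(f"{n:2d} objects -> ~{estimated_fps(n):5.1f} fps")

# More objects -> longer frame time -> lower frame rate.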




Games are particularly sensitive to frame rates because how fast the frames can be displayed on the screen directly affects the interactive experience of playing a game. For example, not only can a low frame rate cause the game to stutter and lag on screen, but it also increases the lag between, say, pressing the jump button on your controller and seeing the character jump on the screen. It is important to note that the 60fps limit exists because most displays are 60Hz, meaning they display a maximum of 60 frames in one second. So even if the graphics processor (GPU) could generate 120 game frames in one second, most monitors and TVs will only be able to display 60 of them anyway. In addition, studies have shown that most human beings cannot detect additional motion beyond 60fps, while a minimum of about 12fps is required for our brain to perceive individual stills as "motion" rather than separate images. But trust me, you do not want a game or video running at 12fps. In general, 30fps is the minimum target for a game to be a pleasant experience (most console games run between 20 and 30fps).
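
Here's a small sketch of those two points: the display's refresh rate caps how many rendered frames you actually see, and the time it takes to render one frame puts a floor on how quickly your button press can show up on screen. The refresh rate and frame rates below are just example numbers.

# The display refresh rate caps how many frames per second you actually see,
# and the time to render one frame is a floor on input-to-screen lag.
# Example numbers only.
DISPLAY_REFRESH_HZ = 60

def displayed_fps(gpu_fps):
    return min(gpu_fps, DISPLAY_REFRESH_HZ)

def min_input_lag_ms(gpu_fps):
    # At best, a button press is reflected in the very next rendered frame.
    return 1000.0 / displayed_fps(gpu_fps)

for fps in (120, 60, 30, 12):
    print(f"GPU renders {fps:3d} fps -> you see {displayed_fps(fps):3d} fps, "
          f"at least ~{min_input_lag_ms(fps):4.1f} ms of lag")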

Now with that little primer out of the way, we can understand what this whole 1080p/60 dream is about. Basically, 1080p (resolution) at 60fps is considered the optimal gaming experience available with today's technology. Again, HDTVs generally max out at 1080p, so that is the highest resolution you can get into your living room today (4K is just rolling out). Meanwhile, 60fps is the highest frame rate that most TVs can display and the highest that most people can perceive well enough to gain extra enjoyment from. The benefits of the 1080p/60 experience have already been proven on the PC platform, where the highest-end graphics cards available today can deliver 1080p (and even higher resolutions) at 60fps. That experience is already much better than the current generation consoles (PS3 and Xbox 360), so the idea is that the next generation consoles should be able to deliver that high-end PC experience to the living room. That's a fair goal, but unfortunately it may never be met.

Why Not 1080p/60fps

Let me start with a simple answer: 1080p/60fps is possible, but not without sacrifices to the quality of the game. In my experience, most people demanding 1080p/60 from the PS4, Xbox One, or PC do not fully understand the nature of the problem. The first thing to understand is that frame rate is NOT a fixed metric but one that varies as the game is running, depending on how complex the current scene is. A simple way to see this on a PC game is to run a frame rate counter such as FRAPS and play a game with an outdoor setting. If you focus the camera on a scene where there are homes, a forest, or a bunch of characters on the screen, the frame rate will be much lower than if you look straight up at the sky. The reason is that there is less work for the hardware to do when looking at the sky than when rendering all the geometry closer to the ground. The next thing to understand is that the resolution has a direct effect on the frame rate. The higher the resolution, the lower the frame rate will be for the SAME scene. The reason for this is that a higher resolution means more pixels, which translates to more work for the GPU. The more work, the longer it will take. Again, pictures can help illustrate this point as well.
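
Here's a rough sketch of that resolution effect. It assumes the scene is entirely GPU-bound and that the cost scales linearly with pixel count, which is a simplification, but it shows why the same scene runs slower as the resolution goes up. The 60fps-at-720p baseline is just an example number.

# Rough estimate: if a scene is GPU-bound and cost scales with pixel count,
# raising the resolution lowers the frame rate for the SAME scene.
# (Real games don't scale perfectly linearly; this is a simplification.)
def pixels(width, height):
    return width * height

def estimated_fps_at(new_res, baseline_res, baseline_fps):
    scale = pixels(*new_res) / pixels(*baseline_res)
    return baseline_fps / scale

baseline_res = (1280, 720)   # assume the scene runs at 60fps at 720p
for name, res in [("900p", (1600, 900)), ("1080p", (1920, 1080))]:
    print(f"{name}: ~{estimated_fps_at(res, baseline_res, 60):.0f} fps")

# 1080p has 2.25x the pixels of 720p, so a scene that hits 60fps at 720p
# drops to roughly 27fps at 1080p under these assumptions.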



So if you follow me so far, think about this: the Xbox 360 and PS3 can already do 1080p/60fps. They have the hardware and I/O capabilities to do it. The problem is that the level of graphics most of their games generate is simply too much for the hardware to handle at that speed and at such a high resolution. If all the games on the PS3 had the same level of graphics as PS1 or PS2 games, then the majority of games on the PS3 would likely be 1080p/60fps. The additional horsepower of the PS3 applied to the relatively low-complexity games of the PS1/PS2 would mean the system could draw the same scene in much greater detail (SD -> Full HD) and do it much faster (60fps). However, that is not how games have ever evolved. People demand bigger and more realistic worlds with each new generation. Thus you get games like Uncharted and Gears of War, which clearly look way better than Xbox or PS2 games from 2003. But in order to achieve that quality, you have to make sacrifices, since all hardware has limits.

We'll Never Be Able To Have Our Cake And Eat It Too 

Again, if the PS4 were used to simply run Infamous, Killzone 2, or Uncharted with the same level of graphics as the PS3 games, then I'm sure it could do so at 1080p/60. But that would be boring and would stagnate the growth of the industry. What we want is brand new experiences that push the graphics and technology of the new systems to deliver something that looks several times better than Uncharted on PS3. To do so, the game developer has to make a tough choice: do I add more complexity to create more realistic worlds, or do I simplify the world in order to allow the hardware to render it faster and in more detail? Unfortunately, there is no right answer to that question. Some people prefer higher resolution, some prefer higher frame rates, and some prefer better graphics. The point is you can't have your cake and eat it too. Expect to see some games run at 720p next gen. Expect to see many games run at 30fps. This dilemma will never end as long as computer graphics continue to advance. In the PS5 generation, the hardware may be 100x more powerful than the PS3. But that won't mean much, since it won't be asked to do the same work as the PS3. It will not be able to deliver games that look 100x better than PS3 games at 8K resolution (16x the pixels of 1080p) and at 60fps. Developers will ALWAYS have to make that tough decision. They will continue to choose different priorities, and thus while we will definitely see some 1080p/60 games next gen, we will never reach our golden standard across all games.

Some people get the technical reasoning I've already explained but simply feel that the number 1 priority for developers should be 1080p/60. To that, I say that while resolution will continue to evolve and 1080p may become standard at some point next gen, the 60fps baseline is not necessary for the majority of games. Since going from 30fps to 60fps cuts the per-frame rendering budget in half, forcing the developer to roughly halve the complexity of each scene, it is not a sacrifice that every developer will be able to make for every game. For slower-moving genres like RTS and 3rd person adventures, 30fps is more than sufficient. That additional rendering time can be used to render much more advanced effects while still maintaining a smooth, playable experience. Uncharted 2/3, the Gears of War series, Killzone 2/3, and God of War III/Ascension would not have been as impressive if they ran at 60fps. So the idea that all games need to run at 60fps is just nonsense and a bit elitist. I completely support developers pushing more games to 60fps next gen, but I also understand the benefits of 30fps and hope to continue to see games at both frame rates in the future.
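
That "halve the complexity" point is really just frame-budget arithmetic, sketched below with round numbers.

# Per-frame rendering budget at different target frame rates.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

for fps in (30, 60):
    print(f"{fps}fps target -> ~{frame_budget_ms(fps):.1f} ms to render each frame")

# 30fps leaves ~33.3 ms per frame; 60fps leaves only ~16.7 ms -- half the time
# to draw everything, which is why hitting 60fps usually means simplifying the scene.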
