Top 5 Reasons Next Gen Gaming is Taking So Long
Now that Nintendo has jumped the gun and announced its next generation game console, the Wii U, there is a ton of speculation as to when Sony and Microsoft will follow with their next gen systems. In past console generations, the average console cycle was about 4-5 years. This does not mean that after 4-5 years a system would stop being manufactured and disappear. Rather, it means that there was typically a gap of 4-5 years between new consoles from the same company, even if they would sell concurrently for some time (e.g., PS1 -> PS2, SNES -> N64). This fall, the Xbox 360 and PS3 will be 6 and 5 years old respectively, which is well past that established life cycle. Yet both the PS3 and Xbox 360 are still enjoying healthy worldwide sales, have plenty of games coming in the next 12-18 months, and still have not reached the magical price point of $199 (max price for all SKUs), which is traditionally when most systems see their largest sales. How is it that these systems are able to stay relevant for so much longer than their predecessors?
Here are my top 5 reasons the next generation of game consoles is taking so long:
5. TECHNOLOGY IS CHANGING TOO FAST
Tablets, 4G, wireless display, motion controls, 3D, networking, digital downloads, APUs (a GPU integrated with the CPU), and cloud computing: these are just some of the significant technologies that have become common in the 5-6 years since the Xbox 360 and PS3 were released. All of these technologies can have a profound impact on video gaming, and the next generation of hardware has to be designed to enable some or all of them. The difficult question for Microsoft and Sony is which ones to bet the future on. Should the next Xbox use physical media, or rely entirely on digital downloads for game distribution? Should an updated Kinect be the primary user interface to the system, saying goodbye to the physical controller for good? Should Microsoft or Sony follow Nintendo's lead and leverage a tablet-like controller to expand the ways users can interact with the console? Should the PS4 use Blu-ray again? Will full 1080p 3D graphics be the target for next gen? How can the next gen consoles expand on social networking and media sharing? Is OnLive really the future of video games? These are all very difficult questions that both Sony and Microsoft must answer before committing another 8-10 years and hundreds of millions of dollars to a new system. If they bet wrong, there will be a steep price to pay.
In previous generations, updating a system to the next generation was pretty straightforward. Simply update the processing power of both the CPU and GPU to enable new uses of technology and you were pretty much done. Today, however, the expectations for a game console have grown immensely, and simply increasing the processing power is not enough. With so many variables, what-ifs, and volatile technologies in the mix, it's reasonable that Sony and Microsoft need more time to weigh them all, observe their impact on the audience, and work to fine-tune some of the less polished ones.
4. HARDWARE ISN'T AS IMPORTANT AS IT USED TO BE
How many readers are old enough to remember the old "Genesis Does" or "U R NOT E" gaming commercials from the 90s? If you can think back to those days, all you heard or saw in ads was how many "bits" the hardware had, how many polygons it could push, or how advanced the system was (remember those "It's Thinking" ads for the Dreamcast?). Kids argued in school about how much better the Genesis was with its 16 bits compared to the NES with only 8. Or how awesome the PlayStation was because it was 32-bit and used shiny new CD-ROMs for games. For gamers then, the most exciting thing was seeing the leaps in technology with each new system, adding things like full-motion video, voice, and 3D graphics. Oh, the good ole days.
In 2011, consumers are much less interested in the nuts and bolts of a system and more concerned with what kind of experience it gives them. This is true not only in the video game space, but across the board in consumer electronics and technology. If nothing else, the Wii proved that people are less interested in technology and graphics than in fun and accessible experiences. Xbox Live is a phenomenal success and is the primary reason the Xbox 360 has such a large sales lead over the PS3 in the US. Xbox Live offers an "experience" that just cannot be found on the other consoles, and many gamers are still buying 360s to be a part of that community. The rise of the MP3 and digital downloads again proves that consumers care more about accessibility and convenience than quality alone. No doubt CDs, SACDs, DVD-Audio, and Blu-ray offer superior experiences to anything you can download today, but consumers are much more interested in having the content come to them and travel with them wherever they go. The iPhone and iPad set the world on fire not because they offered a cutting edge CPU or GPU under the hood, but because they provided an innovative, easy to use, and unique user interface that appealed to a large audience.
The changing times have certainly made Sony and Microsoft at least hesitate before approaching the next generation. Resources that used to go toward increasing processing power must now be redirected toward delivering a powerful and innovative user experience. To be successful next gen, they must invest not so much in hardware as in software. This is a fundamentally different paradigm for the hardware manufacturers to work in, and there is likely a time overhead in figuring out the new approach.
3. GRAPHICS APPROACHING DIMINISHING RETURNS
I know that of all my reasons on this list, this is likely to be the most controversial and flame-worthy. However, I think it is still a valid point if you hear me out. There are two main ideas embedded in this statement:
A) The graphics of current generation hardware are good enough to stay relevant longer.
B) The jump in graphical fidelity going to the next generation is not going to be as large as before.
With the arrival of the Xbox 360 and thus the advent of the HD era, the jump in visual fidelity from the previous generation was striking. Not only did geometry complexity and texture quality increase dramatically, but the change finally addressed one of the largest gaps between PC gaming and console gaming: resolution. The first time seeing Gears of War, Project Gotham Racing, The Elder Scrolls IV: Oblivion, or Uncharted on a large HDTV in full 720p was a defining moment for most gamers. Almost 6 years later, many gamers are still more than satisfied with the HD graphics on the current consoles. Games like Killzone 3, Uncharted 3, Forza 4, and Gears of War 3 (all 2011 titles) prove that the current HD consoles can still deliver amazing graphics on par with anything out there. When you combine this argument with the general de-emphasis on graphics and technology (as mentioned in item #4 above), it makes sense that current generation quality would be sufficient for most gamers for a longer period than in previous generations.
The other part of this argument looks to the future: how much further can we go? Even though Moore's Law is still mostly in effect and raw processing power has increased dramatically since the Xbox 360 and PS3 came out, the perceived difference from that increase is much smaller than in the past. We've already heard many Sony executives make statements in recent months essentially saying that the next generation is not coming anytime soon because the latest and greatest technology cannot deliver a large enough jump in quality (Jack Tretton says the PS4 is not coming for some time). While on paper the latest graphics tech, like an Nvidia GTX 500 or AMD 6900 series GPU, would seem to offer a massive leap over current generation systems, the practical use of all that power in the latest, most advanced titles does NOT provide quite as large a leap. I work with the latest and greatest PC tech every day for a living and have seen it all: the DX11 Ultra Crysis 2 upgrade, Metro 2033 maxed in DX11, The Witcher 2 maxed, Unigine, 3DMark 11, Stone Giant, Dirt 2/3, HAWX 2, StarCraft II, and much more. To my eyes, the best overall visual experience I have had with any title is still God of War III on the PS3. Even smaller titles like Super Stardust HD and Limbo impress me just as much as those DX11 titles. Now, I'm not suggesting that God of War III or Uncharted 2 are better on a technical level than Crysis 2 or The Witcher 2 on PC. That would be silly. My argument is that consumers do not care about the technical details under the hood: how many polygons a game is pushing, whether it is using tessellation, what shadow algorithm is in play, what type of AA is in use, or what the resolution of the on-screen textures is. They just care about what the end result looks like on the screen.
I personally have yet to be amazed by anything I've seen using DX11, and I think that tessellation in its current state offers a minor visual improvement at best for a significant performance hit that does not justify its use. So would I be willing to pay for a new system just to match the current state of the art in PC games? Absolutely not. Here is another way of thinking about it: is the leap from Killzone 2/3 on PS3 to Crysis 2 on PC as fundamentally large as the leap from Killzone on PS2 to Killzone 2/3 on PS3? Or from Metal Gear Solid on the PS1 to Metal Gear Solid 2 on the PS2?
|Crysis 2 DX9|
|Crysis 2 DX11 - Is this a generational leap?|
2. CURRENT SYSTEMS EXPAND/GROW VIA SOFTWARE
This console generation is the first where network connections are standard. This single feature brought about a ton of advancements to gaming: Xbox Live and PSN, PlayStation Home, game patching, digital downloads, and media sharing, just to name a few. However, perhaps the most significant advancement was firmware updates. These allowed Sony and Microsoft to update their devices over the Internet, changing and expanding the feature sets of their respective systems. The result: neither the Xbox 360 nor the PS3 is anywhere close to being the same system it was at launch. Through firmware updates the Xbox has seen Dashboard updates, the advent of avatars, Kinect integration, movie and music download services, a digital marketplace, and a host of apps such as Netflix, Facebook, Twitter, Last.fm, and ESPN. Similarly, the PS3 has seen dramatic changes since its launch four and a half years ago. Features such as PlayStation Home, a revamped PlayStation Store, apps such as Netflix, Vudu, and Folding@home, expanded Blu-ray playback features (i.e., Profile 2.0 support), support for 3D movies and games, remote play with the PSP, and media sharing all did not exist at launch.
|Xbox360 Dashboard in 2005|
|Xbox360 Dashboard in 2011|
When you really look at the list of improvements, it is not a stretch to say that both the Xbox 360 and PS3 have almost transformed into new systems over the course of their respective life cycles. In fact, one can argue that the 2011 Xbox 360 and PS3 are "next generation" systems compared to their 2006 counterparts, much in the same way that the Wii is the "next generation" system after the GameCube (the Wii uses essentially the same CPU and GPU as the GameCube). These new features keep the systems feeling fresh and new, and thus significantly lessen the desire to upgrade to a new system.
1. COST OF MAKING GAMES STILL TOO HIGH
Despite the previous 4 items on the list, there is no doubt that Microsoft or Sony could still move forward with a new system, and gaming enthusiasts (including myself) would be salivating at the idea of a new cutting edge game machine. However, I suggest that the number one thing holding back the next generation is simply one word: COST. This item is not about consumers, but rather the developers and the hardware manufacturers themselves. Even though the Xbox 360 and PS3 technology has aged, the cost to make games for them has NOT gone down significantly. In fact, over the past 10 years, the average cost of making a game went from around $2 million in 2000 to over $20 million in 2010, including marketing and licensing (see Video Game Costs). This is a serious problem and is causing a divide in games, with AAA blockbusters from large studios on one side and small indie games on PSN, the Xbox Live Marketplace, the Internet, and app stores on the other. The vast majority of studios cannot afford to invest $20 million in a game, and this generation has seen more studios shut their doors for good than ever before. Also, with high costs come high expectations for sales. In today's industry, if you don't sell millions and at least break even, you may not survive.
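To put that budget growth in perspective, here is a quick back-of-the-envelope calculation. It is only a sketch using the rounded $2 million and $20 million figures cited above, but a 10x jump over a decade works out to roughly 26% compound growth every single year:

```python
# Back-of-the-envelope sketch: compound annual growth rate (CAGR) of average
# game budgets, using the rounded figures cited above ($2M in 2000 -> $20M in 2010).
def cagr(start: float, end: float, years: int) -> float:
    """Return the compound annual growth rate as a fraction (0.26 == 26%/yr)."""
    return (end / start) ** (1 / years) - 1

rate = cagr(2_000_000, 20_000_000, 10)
print(f"Average budget growth: {rate:.1%} per year")  # prints roughly 25.9% per year
```

At that pace, budgets double about every three years, which goes a long way toward explaining why publishers are wary of a new console generation that would push development costs up yet another notch.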