This book is not about the future of video games. It is not an attempt to predict the moods of the market, the changing profile of gamers, the benevolence or malevolence of the medium. This book is about those predictions. It is about the ways in which the past, present, and future notions of games are narrated and negotiated by a small group of producers, journalists, and gamers, and about how invested these narrators are in telling the story of tomorrow.
This new title from Goldsmiths Press by Paolo Ruffino suggests the story could be told another way. Considering game culture, from the gamification of self-improvement to GamerGate's sexism and violence, Ruffino lays out an alternative, creative mode of thinking about the medium: a sophisticated critical take that blurs the distinctions among studying, playing, making, and living with video games. Offering a series of stories that provide alternative narratives of digital gaming, Ruffino aims to encourage all of us who study and play (with) games to raise ethical questions, both about our own role in shaping the objects of research, and about our involvement in the discourses we produce as gamers and scholars. For researchers and students seeking a fresh approach to game studies, and for anyone with an interest in breaking open the current locked-box discourse, Future Gaming offers a radical lens with which to view the future.
Video games have steadily risen in popularity for years, and with the social benefits of video games becoming more apparent, the trend has only accelerated. By most revenue estimates, gaming is now a bigger industry than movies and sports combined.
Although VR has hit a few bumps along the way, tech and gaming companies are busy trying to advance the industry, investing considerable resources to develop VR hardware and games. Companies like Meta, Valve, PlayStation and Samsung have all ventured into the VR industry over the last several years. Apple is even jumping in on the action with the release of its Vision Pro headset. This wave of investment is likely to continue, with the VR game industry projected to grow at an annual rate of 30.5 percent through 2028.
While VR headsets have developed a reputation for being pricey, bulky and uncomfortable for gaming, companies have been busy making VR more appealing to a wider audience, and hardware prices are dropping. But even when those hurdles are cleared, the fact that the typical VR experience is so socially isolating might limit its upside.
For several years now, designers have been using AI to help them generate game assets, freeing them from painstakingly drawing each individual tree in a forest or rock formation in a canyon. Instead, designers can offload that work to computers using a technique called procedural content generation, which has become standard practice in the industry.
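To make the idea concrete, here is a minimal sketch of procedural placement in Python (purely illustrative, not any studio's actual pipeline; the function and parameter names are made up): a deterministic, seeded generator scatters and varies tree instances, so a level can store a small generator plus a seed rather than thousands of hand-placed assets.

```python
import random

def generate_forest(width, height, density=0.15, seed=42):
    """Scatter trees across a grid using a seeded random process.

    Deterministic for a given seed, so the same 'forest' can be
    regenerated on demand instead of being stored as hand-placed assets.
    """
    rng = random.Random(seed)
    trees = []
    for x in range(width):
        for y in range(height):
            if rng.random() < density:
                # Vary each tree slightly so the forest doesn't look copy-pasted.
                trees.append({
                    "x": x,
                    "y": y,
                    "scale": round(rng.uniform(0.8, 1.3), 2),
                    "rotation": rng.randrange(0, 360),
                    "species": rng.choice(["oak", "pine", "birch"]),
                })
    return trees

if __name__ == "__main__":
    forest = generate_forest(64, 64)
    print(f"Placed {len(forest)} trees; first one: {forest[0]}")
```

Because the output is fully determined by the seed, regenerating the same forest is as cheap as re-running the function, which is part of what makes the technique attractive for large open worlds.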
While AI may not create entire games yet, AI-generated art may change the graphics industry in the future. One designer even used AI art to create a horizontal-scrolling shooter game in just three days.
Playing with AI art might be fun for creators, but academics and game designers alike are still trying to implement AI systems that will control the game in a way that is engaging for the player. Cardona-Rivera envisions a future in which AI acts as a game master that calls the shots for a human player.
Cloud gaming, sometimes called game streaming, is a form of online gaming that lets players stream games directly to their devices from remote servers, much as they stream Netflix movies on a smart TV without needing to pop in a DVD first.
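A toy sketch of that client/server split might look like the following (an illustration only, not any real streaming protocol or service API): the heavy rendering and encoding happen on the server, while the client merely sends inputs up and decodes the frames that come back down.

```python
import zlib, time

# Toy illustration of the cloud-gaming loop (not a real streaming protocol):
# the "server" renders and encodes each frame, the "client" only decodes and
# displays it, so the heavy GPU work never happens on the player's device.

def server_render_frame(tick, player_input):
    """Pretend to render: return a compressed 'frame' for this tick."""
    frame = f"frame {tick}, input={player_input}".encode() * 100
    return zlib.compress(frame)  # stand-in for a hardware video encoder

def client_display(encoded_frame):
    """Pretend to display: decode the frame the server sent back."""
    return zlib.decompress(encoded_frame)

for tick in range(3):
    player_input = "jump" if tick == 1 else "idle"    # input travels up to the server
    encoded = server_render_frame(tick, player_input)  # frames travel back down
    client_display(encoded)
    time.sleep(1 / 60)                                 # roughly 60 frames per second
```

In practice the "encoder" is a hardware video codec such as H.264 or AV1, and the whole round trip has to finish within a few dozen milliseconds, which is why latency, not just bandwidth, is the main engineering challenge.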
PC gaming companies like Nvidia and AMD have made great strides in creating graphics cards that allow for high-fidelity images in games and techniques like ray tracing. High-fidelity graphics refers to 3D imagery built from a multitude of complex vertices, the points in space where the line segments of a shape meet. High-fidelity games usually support ray tracing as well.
In the past, effects like shadows, reflections and lens flares were essentially painted onto objects within the game, giving the illusion that light was coming from the sun or moon and reacting as it would when it hit a surface. With ray tracing, an algorithm actually simulates the behavior of light as it hits objects within a game.
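The core of that simulation is surprisingly compact. Below is a minimal, illustrative Python sketch (nothing like a production renderer): a single ray is cast from the camera, intersected with a sphere, and the hit point is shaded according to the actual light direction instead of a painted-on highlight.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the first sphere hit, or None.

    Assumes `direction` is a unit vector.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None               # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(hit_point, center, light_dir):
    """Lambertian shading: brightness depends on the angle between
    the surface normal and the direction toward the light."""
    normal = [p - c for p, c in zip(hit_point, center)]
    length = math.sqrt(sum(n * n for n in normal))
    normal = [n / length for n in normal]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

origin, direction = (0, 0, 0), (0, 0, 1)   # camera ray looking down +z
center, radius = (0, 0, 5), 1.0            # a sphere in front of the camera
light_dir = (0, 0.707, -0.707)             # unit vector from the surface toward the light

t = ray_sphere_hit(origin, direction, center, radius)
if t is not None:
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    print(f"Hit at {hit}, brightness {shade(hit, center, light_dir):.2f}")
```

A real ray tracer fires millions of such rays per frame and lets them bounce between surfaces, which is why the technique is so demanding on GPUs.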
Not all games of the future will be designed for such realistic graphics, especially not indie games. The way Mack sees it, there are two distinct routes game developers can take when it comes to graphics.
One approach is to hire tons of visual artists and technicians to supply vast amounts of art for high-fidelity graphics. That means big budgets, big teams and increasingly realistic graphics, down to every last speck of dirt; it is the approach most often taken by triple-A games (high-budget games made by big game publishers). The other is to embrace stylized, lower-fidelity art, which smaller teams and indie studios can produce on a fraction of the budget while still giving a game a distinctive look.
Several gaming companies are seeing the benefits of offering free-to-play games with in-game purchases. Activision Blizzard, the company behind Overwatch, World of Warcraft and Call of Duty, reported that it made $2.46 billion from in-game purchases within a single quarter in 2023.
A concept popularized by author Neal Stephenson in his 1992 science-fiction book Snow Crash, the metaverse is best understood as an online cyberspace, a parallel virtual realm where everyone can log in and live out their (second) lives. Ideally, the metaverse will combine both virtual and augmented reality, have its own functioning economy and allow complete interoperability.
While we may be a long way off from that, hints of the metaverse are increasingly evident. You see it in gaming platforms like Roblox, where luxury fashion brands like Gucci host events, and in games like Fortnite, where users can dress up as their favorite Star Wars or Marvel characters and watch virtual music concerts.
Gaming could become richer as AI improves non-player characters, AR and VR technologies offer more engaging experiences, and cloud streaming makes it possible to play games across multiple platforms. In addition, gaming could shift toward more mobile formats as free-to-play games on mobile phones grow in popularity.
As we continue to see advancements in gaming technology, it is natural to question what future gaming requirements might look like. One topic that comes up frequently is the necessity of having 20GB VRAM or more for gaming purposes. While some high-end graphics cards currently offer this amount of VRAM, is it really necessary for future gaming?
I am curious to hear your thoughts on this matter. Do you think that 20GB VRAM will become a necessary requirement for gaming in the future, or do you believe that 20/24GB VRAM is overkill? Are there any particular factors or developments in the gaming industry that might make 20GB VRAM more necessary in the future?
I've been playing Dead Space Remake at max settings with RTAO at 3840x1080, and according to AMD's own Adrenalin monitoring software, I'm using about 10GB already. So 8GB of VRAM is fast becoming the bare necessity. As for Returnal and Forspoken, I'm not too concerned, as my rigs have 32GB of system RAM while my cards have 24GB and 16GB, and those games don't interest me anyway. But yes, I can see a time when 12GB becomes the bare minimum, even for 1080p gaming. With 16GB of VRAM or more, one can't be blamed for feeling better about the 'longevity' of their card compared with 8GB and 10GB cards.
Maybe 20GB is overkill at the moment, but in two years' time it will be the sweet spot. My advice, if you are buying a graphics card today: a minimum of 8GB for 1080p, 12GB for 1440p and 16GB for 4K; ideally go for the 12GB or 16GB cards.
VRAM's purpose is to ensure smooth, even rendering of the graphics display. It is most important in applications that display complex image textures or render polygon-based three-dimensional (3D) structures, such as video games or 3D graphic design programs.
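To put rough numbers on that, here's a quick back-of-the-envelope Python sketch (uncompressed sizes only, and the buffer counts are just assumptions; real engines use BCn/ASTC compression and texture streaming, so actual budgets are much lower):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of a few full-resolution render targets, uncompressed."""
    return width * height * bytes_per_pixel * buffers / 1024**2

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """One uncompressed texture; the mipmap chain adds roughly a third on top."""
    base = width * height * bytes_per_pixel
    return base * (4 / 3 if mipmaps else 1) / 1024**2

# Back-of-the-envelope: a 4K frame plus a few hundred 4K textures
print(f"4K framebuffers: ~{framebuffer_mb(3840, 2160):.0f} MB")      # ~95 MB
print(f"One 4096x4096 texture: ~{texture_mb(4096, 4096):.0f} MB")    # ~85 MB
print(f"200 such textures: ~{200 * texture_mb(4096, 4096) / 1024:.1f} GB")  # ~16.7 GB
```

Even this crude math shows why texture-heavy games at 4K chew through VRAM so quickly, and why compression and streaming only ease, rather than remove, the pressure for bigger VRAM pools.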
Overall, I believe the "minimums" will continue to grow as games become more and more demanding (and immersive). Some will say that it's the software that drives the hardware's growth; if you believe that, as I do, then tomorrow's games (and their developers) are naturally going to require more VRAM.
As a gamer, I expect tomorrow's games to be visually stunning and smooth (no lag, no tearing, high FPS, high refresh rates, photorealistic, etc.). The GPU, and its VRAM, will have to keep "growing" to deliver those features.
Nah mate, not only you. To be honest I'm dreaming of "socketed GPUs", where you get a GPU motherboard with a set amount and type of memory (e.g. 40GB of HBM2) but can upgrade the chip later down the line, for example by pulling the card out, taking off the cooler and plonking a new chip into an LGA socket or something, repasting and all that being a given.
I mean, we had naked CPU dies on sockets for a long time, and everyone who has ever repasted a GPU was dealing with a naked die as well, so "damaging the chip" isn't really an argument against it; maybe latency is, though.
Looking at recent titles like Company of Heroes 3, which hogs up to 11GB of VRAM on medium texture detail and high settings, leading to VRAM-related crashes and frame drops on my 6700 XT, while "ultra" textures are only offered with more than 16GB of VRAM, the direction is clear. With games like Hogwarts Legacy, TLOU Part 1 and many more in the next few years, I consider 8GB the minimum, 16GB mid-range and 20GB for the top dogs, depending on quality settings and resolution.
Hardware Unboxed did warn its viewers that while the RTX 3070/3070 Ti were excellent cards, their usefulness for future gaming at high resolution + max settings + RT would be kneecapped by their 8GB of VRAM.
Actually, I feel for the owners of these cards, even the RTX 3080 10GB, who might have gotten them back when they were priced at ridiculous levels. Recent games like Hogwarts Legacy, RE4R, Forspoken and TLOU Part 1 foreshadow what future games may require; 12GB could become the new 8GB standard for mid-range cards.