20GB VRAM


Gaby Zenz

Aug 3, 2024, 6:12:15 PM
to milmadedop

As we continue to see advancements in gaming technology, it is natural to question what future gaming requirements might look like. One topic that comes up frequently is the necessity of having 20GB VRAM or more for gaming purposes. While some high-end graphics cards currently offer this amount of VRAM, is it really necessary for future gaming?

I am curious to hear your thoughts on this matter. Do you think that 20GB VRAM will become a necessary requirement for gaming in the future, or do you believe that 20/24GB VRAM is overkill? Are there any particular factors or developments in the gaming industry that might make 20GB VRAM more necessary in the future?

I've been playing Dead Space Remake at max settings with RTAO at 3840x1080, and according to AMD's own Adrenalin monitoring software, I'm using about 10GB already. So 8GB VRAM is fast becoming the bare necessity. As for Returnal and Forspoken, I ain't too concerned, as my rigs have 32GB of system RAM and my cards are 24GB and 16GB. Those games don't interest me anyway, but yes, I can see a time when 12GB becomes the bare minimum, even for 1080p gaming. With 16GB of VRAM and higher, one cannot be blamed for feeling more confident about the 'longevity' of their card vis-à-vis 8GB and 10GB cards.
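For anyone who'd rather pull those numbers programmatically instead of eyeballing Adrenalin's overlay: on Linux, the amdgpu driver exposes VRAM counters through sysfs. A minimal sketch below; note Adrenalin itself is Windows-only, and the card0 index is an assumption (check /sys/class/drm/ on your box).

```python
# Rough sketch: read VRAM usage from the amdgpu sysfs counters on Linux.
# Assumes the GPU is card0 -- adjust the path for your system.
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")

def read_mib(name: str) -> float:
    # Each sysfs file holds a plain byte count; convert to MiB.
    return int((DEV / name).read_text()) / 2**20

used = read_mib("mem_info_vram_used")
total = read_mib("mem_info_vram_total")
print(f"VRAM: {used:.0f} MiB used of {total:.0f} MiB")
```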

Maybe 20GB is overkill at the moment, but in two years' time it will be the sweet spot. My advice, if you are buying a graphics card today: a minimum of 8GB for 1080p, 12GB for 1440p, and 16GB for 4K. Ideally, go for the 12GB or 16GB cards.
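If it helps, here's that rule of thumb written out as a lookup, just to make the tiers explicit. The thresholds are my opinion above, not any hard spec.

```python
# The suggested VRAM floors per resolution -- illustrative only.
VRAM_FLOOR_GB = {"1080p": 8, "1440p": 12, "2160p": 16}

def meets_floor(vram_gb: int, resolution: str) -> bool:
    """True if a card meets the suggested minimum for that resolution."""
    return vram_gb >= VRAM_FLOOR_GB[resolution]

print(meets_floor(12, "1440p"))  # True -- the 'ideal' middle tier
print(meets_floor(8, "2160p"))   # False -- 8GB is a 1080p card here
```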

VRAM's purpose is to ensure the smooth, even execution of graphics display. It matters most in applications that display complex image textures or render polygon-based three-dimensional (3D) structures, which is why video games and 3D graphic design programs lean on it so heavily.
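To put rough numbers on "complex image textures": a texture's footprint is just width × height × bytes per pixel, plus about a third extra for the mip chain. The sketch below assumes uncompressed RGBA8; real games use block compression (BC1/BC7 and friends), which cuts this by 4-8x, but the scaling works the same way.

```python
# Back-of-envelope texture footprint, assuming uncompressed RGBA8
# (4 bytes/pixel) and a full mip chain (~1/3 extra on top of the base).
def texture_mib(width: int, height: int, bytes_per_px: int = 4) -> float:
    base = width * height * bytes_per_px
    return base * 4 / 3 / 2**20

print(f"2K texture: {texture_mib(2048, 2048):.1f} MiB")  # ~21 MiB
print(f"4K texture: {texture_mib(4096, 4096):.1f} MiB")  # ~85 MiB
```

A few hundred unique high-res materials at those sizes and you can see how a texture pool alone starts crowding an 8GB card.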

Overall, I believe the "minimums" will continue to grow as games become more and more demanding (and immersive). Some will say that it's the software that drives the hardware's growth. If you believe that, as I do, then tomorrow's games, and their developers, are naturally going to require more VRAM.

As a gamer, I expect tomorrow's games to be visually stunning and smooth (no lag, no tearing, high FPS, high refresh rates, photorealistic, etc.). The GPU, and its VRAM, will have to continue to "grow" to deliver these features.

Nah mate, not only you. To be honest, I'm dreaming of "socketed GPUs": you'd buy a GPU motherboard with a set amount and type of memory (e.g. 40GB of HBM2) but could upgrade the chip later down the line, for example by pulling the card out, taking off the cooler, and plonking a new chip into an LGA socket or something, with repasting and the like being a given.

I mean, we had naked CPU dies on sockets for a long time, and everyone who has ever repasted a GPU was dealing with a naked die as well, so "damaging the chip" is not even an argument against it. Maybe latency is, though.

Look at recent titles like Company of Heroes 3: it hogs up to 11GB of VRAM on medium texture details and high settings, leading to filled-VRAM crashes and frame drops on my 6700 XT, and "ultra" is only even available with more than 16GB of VRAM.

With games like Hogwarts, TLOU P1, and many more coming in the next few years, I consider 8GB the minimum, 16GB midrange, and 20GB for the top dogs, depending on quality settings and resolution.
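That "ultra only above 16GB" gating is easy to picture as a preset fallback: the game walks down its quality tiers until one fits the card's VRAM. A hedged sketch, with budgets loosely echoing the CoH3 figures above; they are invented for illustration, not the game's actual tables.

```python
# Hypothetical VRAM-driven preset fallback. The tier budgets are
# made up for illustration, loosely modeled on the CoH3 numbers above.
PRESET_BUDGET_GB = [("ultra", 16), ("high", 11), ("medium", 8)]

def pick_preset(vram_gb: float) -> str:
    for name, budget in PRESET_BUDGET_GB:
        if vram_gb > budget:  # needs headroom above the tier's budget
            return name
    return "low"

print(pick_preset(12))  # 'high'  -- a 12GB card clears the 11GB tier
print(pick_preset(24))  # 'ultra'
```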

HardwareUnboxed did warn its viewers that while the RTX 3070/3070 Ti were excellent cards, their usefulness for future gaming at high res + max settings + RT would be kneecapped by their insufficient 8GB of VRAM.

Actually, I feel for the owners of these cards, even the RTX 3080 10GB, who might have gotten their cards back when they were priced at ridiculous levels. Recent games like Hogwarts Legacy, RE4R, Forspoken, and TLOU Pt 1 are foreshadowing what future games may require; 12GB could become the new 8GB standard for mid-range cards.

Mid-range cards like the RX 7800 XT should be armed with 16GB of VRAM. I honestly think nVidia made another crap move by arming the RTX 4070 Ti with just 12GB of VRAM, and it looks like the RTX 4070 will have the same. Yes, that's 50% more VRAM than their predecessors, and like them, these are very capable cards that could do with more VRAM. It looks like the same nVidia early-obsolescence masterplan for these cards, much like the RTX 3070/3070 Ti.

Games keep getting more demanding on resources. There are games that gladly use several cores at the same time, if they're available, and don't really work well with just 16GB of main RAM, even if you don't play them in 4K resolution. Star Citizen (Alpha) is such a game, and more are coming. So I believe we have already arrived in that future. The next graphics tech invention after RT may push us over 20GB of video RAM usage in 4K gaming.

20GB is still more than you'll need 99% of the time when gaming. However, if you're planning on playing brand-new AAA titles at 4K, you're going to start seeing more of a need for it. If you're just planning on 1080p, 12GB should get you by for a couple more years, but 16GB will ensure that your card won't be obsolete by the time you're tempted to buy a new one.
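The 4K pressure isn't just textures, either; every per-pixel render target scales with resolution. A quick sketch, assuming a generic deferred G-buffer of about 28 bytes per pixel (say, two RGBA8 targets, two RGBA16F targets, and a 32-bit depth buffer -- my assumption, not any specific engine's layout):

```python
# Per-pixel render-target cost at two resolutions. The 28 bytes/pixel
# G-buffer figure is an assumed, generic deferred-renderer layout.
def render_targets_mib(w: int, h: int, bytes_per_px: int = 28) -> float:
    return w * h * bytes_per_px / 2**20

print(f"1080p: {render_targets_mib(1920, 1080):.0f} MiB")  # ~55 MiB
print(f"4K:    {render_targets_mib(3840, 2160):.0f} MiB")  # ~221 MiB (4x)
```

That 4x multiplier applies to every screen-sized buffer the renderer keeps around, which is why the jump from 1080p to 4K bites harder than the raw pixel counts suggest.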

I was commenting about the VRAM buffer issue in other forums, and I'm taken aback that nVidia fanboys who own the RTX 3060 Ti/3070/3070 Ti aren't furious about their purchases, and especially at nVidia. nVidia knew that 8GB of VRAM isn't quite enough for newer games going forward, yet chose to put 8GB on these relatively powerful cards that are capable of more. They defend their purchases by saying it's okay to reduce texture and image quality, and even RT, to run within the 8GB VRAM buffer. What about nVidia's much-vaunted PQ and RT performance? Suddenly it's okay to reduce PQ and do without RT just to not saturate the 8GB VRAM buffer?

Basically, it's planned early obsolescence: no matter how fast these cards are, once the VRAM buffer is saturated, performance will tank, as exemplified by some recent AAA games. Heck, had I gotten one of these cards, I'd be pretty upset... but then, I'd be even more upset had I gotten an RTX 3080 10GB, as games are perhaps beginning to exceed that buffer size too.

Below is a screenshot of Dead Space Remake. There is traversal hitching and very mild stuttering here and there, but as a whole, the game runs pretty well. I have Freesync set at 120Hz, and the game is pretty darn smooth 99% of the time. I snipped the screenshot by removing part of the right side of the screen. Note the 'Mem': 11.2GB is allocated, 10.05GB is utilized. The game was running natively at 3840x1080, max graphics settings + RTAO.
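On that allocated-vs-utilized gap: an application can reserve more VRAM than it is actively touching, which is why the two counters disagree. A sketch of the same idea using PyTorch's caching allocator -- not what Adrenalin measures, and it assumes a CUDA (or ROCm) build of torch, but it makes the distinction concrete:

```python
# Illustrating allocated vs. reserved VRAM with PyTorch's caching
# allocator. Requires a CUDA (or ROCm) capable torch install.
import torch

x = torch.empty(1024, 1024, 256, device="cuda")  # ~1 GiB of float32
del x  # the tensor is freed, but the allocator keeps the block cached

print(f"allocated: {torch.cuda.memory_allocated() / 2**20:.0f} MiB")  # ~0
print(f"reserved:  {torch.cuda.memory_reserved() / 2**20:.0f} MiB")   # ~1024
```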

Well, I do have an RTX 3070, but I never play new-release games; for example, I just got started on GTA V. I wait years before getting games, and longer before getting newer GPUs. I play GTA V pretty much all maxed out at 2560x1440 at 165Hz, and GTA V is the most visually impressive game I have played recently; 8GB of VRAM is more than enough, and IMO the game has totally awesome visuals. No stutter or lag, no issues whatsoever. By the time I decide to buy the current generation of games and upgrade my GPU, there will be even newer cards with more VRAM and better specs than the latest and greatest now. I am a casual gamer, and it takes me time to work through games. By the time I get to a game, almost all the kinks have been worked out of it, and I'll have a new GPU that is more than capable of handling it fully maxed out, most likely with no issues. All this has worked out for me for years, and IMO, for a casual gamer like me, it's the best way to go.

Over the years I've gone through a number of GPUs: from the earliest cards like the 3dfx Voodoo, through the old ATI Rage/Xpert cards, up to the HD4350 that was my first foray into overclocking... Then my R7 250 with a whole GIGABYTE of GDDR5 (back when most cards were still using GDDR3), and on to my RX 570 with 4GB of GDDR5. That was considered mid-range a few years ago, when you could still get Nvidia GPUs with 2 or 3GB of VRAM. The performance floor has been rising for years and will continue to do so. Unlike consoles, which get a new iteration every 5 to 7 years, PC gaming has been a never-ending arms race since the beginning. What is bleeding edge now will be mid-range in 2 years, outdated in 3.5, and obsolete in 5.

My recently installed RX 6800 XT has been mind-boggling for visuals, since I'm still locked to 72fps at 1080p for want of a monitor upgrade. The card barely spools up. Pushing high framerates with the visuals at or near max at 1440p will probably not make this thing work hard either. I simply don't have the budget for 4K, unless someone surprises me with a 55" TV to hang on the wall; then I can see myself playing a few games from the La-Z-Boy. But will it cut the mustard in a couple of years? Probably not. But if I don't have an upgrade path set by then, I'll do what I've always done in the past: lower the settings until it's playable. If some of my games stop supporting my hardware, then I'll upgrade.

What I would love to see is a GPU "kit" that you can upgrade, much like what we do with the rest of our rigs. Sell me a PCB with the VRMs, I/O and power connector in place. Have it kitted out with a socket for the GPU, and slots for VRAM. Let me figure out my own cooling solution. Have those PCB designs supported for 5-6 years, and then you can bombard us with new modules every year or two. If you cheaped out on your PCB, you have a smaller window of options, just like you do with a PC MoBo.


