
In the past few weeks since the RTX 30-series announcement, there have been quite a few discussions about whether the 3080 has enough memory. Take a look at the previous generation with 11GB, or the RTX 3090 with 24GB, and 10GB seems like maybe it's too little.

There are ways to exceed 10GB of VRAM use, but it's mostly via mods and questionably coded games, or running a 5K or 8K display. The problem is that a lot of gamers use utilities that measure allocated memory rather than actively used memory (e.g., MSI Afterburner), see all of their VRAM being sucked up, and conclude they need more. Even some games do this (the Resident Evil 3 remake, for example), informing gamers that they 'need' 12GB or more to run the ultra settings properly.
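
If you want to see what those overlays are actually reporting, NVIDIA exposes the same counter through its management library. Below is a minimal sketch, assuming the third-party pynvml Python bindings and an NVIDIA GPU with a recent driver; note that the 'used' figure it prints is memory that has been allocated on the card, not memory a game is actively touching each frame, which is exactly why it overstates what you really need.

```python
# Minimal sketch: query VRAM the way monitoring overlays do.
# Assumes an NVIDIA GPU and the third-party 'pynvml' package (pip install pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total / free / used, in bytes

# 'used' here means allocated (reserved) memory, not memory being actively read.
print(f"Total VRAM : {mem.total / 1e9:5.1f} GB")
print(f"Allocated  : {mem.used / 1e9:5.1f} GB")
print(f"Free       : {mem.free / 1e9:5.1f} GB")

pynvml.nvmlShutdown()
```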

Using all of your GPU's VRAM to basically cache textures and data that might be needed isn't a bad idea. Call of Duty Modern Warfare does this, for example, and Windows does this with system RAM to a certain extent. If the memory is just sitting around doing nothing, why not put it to potential use? Data can sit in VRAM until either it's needed or the space is needed for something else, and it's not really going to hurt anything. So even if a utility shows a game using all of your VRAM, that doesn't mean data is actually being swapped out to system RAM and killing performance.

You'll notice when data really does start getting swapped to system memory, because it causes a substantial drop in performance. If a game truly exceeds the GPU's VRAM capacity, performance tanks hard: PCIe Gen4 x16 only has 31.5 GBps of bandwidth available, which is less than 5% of the RTX 3080's 760 GBps.
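
For a rough sense of scale, here's the back-of-the-envelope math behind those numbers, using a hypothetical 2GB batch of texture data and the peak bandwidth figures quoted above (real-world throughput will be lower):

```python
# Back-of-the-envelope comparison of VRAM vs. PCIe transfer times.
vram_bw = 760.0   # GBps, RTX 3080 memory bandwidth (peak)
pcie_bw = 31.5    # GBps, PCIe Gen4 x16 (peak, one direction)
data_gb = 2.0     # hypothetical chunk of textures that didn't fit in VRAM

print(f"PCIe as a share of VRAM bandwidth: {pcie_bw / vram_bw:.1%}")  # ~4.1%
print(f"Fetch from VRAM      : {data_gb / vram_bw * 1000:5.1f} ms")   # ~2.6 ms
print(f"Fetch over PCIe Gen4 : {data_gb / pcie_bw * 1000:5.1f} ms")   # ~63 ms
```

At 60 fps a frame is about 16.7 ms, so that single transfer over PCIe would eat roughly four frames' worth of time, which is why real swapping is impossible to miss.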

So if you're worried about 10GB of memory not being enough, my advice is to just stop.
