More aptly, if the fridge is VRAM, then the amount is the cubic-foot capacity of the fridge, while the bandwidth is the size of the doors: how much you can take in and out at once.
Once data is in VRAM, the operations happen entirely on the card, and while VRAM is always faster than an SSD or HDD, the question is how fast it is relative to what the GPU can consume.
Continuing the analogy, the GPU is the chef. If the chef is cooking so fast that they're regularly waiting for food to be brought out of the fridge, you could use bigger doors, i.e. more bandwidth. If instead the chef is up to their ears in work and only occasionally takes food out, larger doors won't make the chef cook any faster.
The chef's speed is partly dictated by pure clock speed and partly by coding efficiency, so one way to go faster is to present the chef the ingredients they need exactly as they need them: not before (wasting counter space) and not after (making them wait). Then there is the matter of making the recipes as efficient as possible, so the chef isn't finishing the side dishes before starting on the main course, but working on everything in sequence so it all finishes at once (when presenting the image to the screen). If there's a mismatch, you can get stuttering.
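The chef analogy maps onto what's usually called the roofline model: whether "bigger doors" help depends on how many operations a workload performs per byte it moves. A minimal sketch of that idea (the peak numbers below are made-up illustrative specs, not any particular card's):

```python
# Roofline-model sketch: a kernel is bandwidth-bound when its arithmetic
# intensity (FLOPs per byte moved from VRAM) is below the hardware's
# compute-to-bandwidth ratio, and compute-bound above it.
# Both specs are illustrative placeholders, not a real GPU's numbers.

PEAK_TFLOPS = 20.0   # hypothetical peak compute, TFLOP/s ("how fast the chef cooks")
PEAK_BW_GBS = 600.0  # hypothetical VRAM bandwidth, GB/s ("how big the doors are")

def attainable_tflops(flops_per_byte: float) -> float:
    """Attainable throughput for a kernel with the given arithmetic intensity."""
    # GB/s * FLOP/byte -> GFLOP/s; divide by 1000 to get TFLOP/s.
    bandwidth_limit = PEAK_BW_GBS * flops_per_byte / 1000.0
    return min(PEAK_TFLOPS, bandwidth_limit)

# A simple blend doing ~2 FLOPs per byte is starved by the doors:
print(attainable_tflops(2.0))    # 1.2 TFLOP/s -> bandwidth-bound
# A heavy shader doing ~100 FLOPs per byte saturates the chef instead:
print(attainable_tflops(100.0))  # 20.0 TFLOP/s -> compute-bound
```

In the first case, wider doors (more bandwidth) raise the ceiling; in the second, only a faster chef (more compute) would.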
I’m hungry.
So here is a question:
When we look at, say, the Windows Task Manager for system RAM, we can see the amount in use (active) and the amount cached. On my 32 GB system, that accounts for all but 1 MB (which actually shows as free in Resource Monitor). Of course, the cached memory can be replaced with more active data if needed.
However, for our GPUs' VRAM (dedicated video memory) we don't get this granularity of detail (or is there perhaps a program that can provide it?), so I wonder whether the reported amount in use shows active + cached, or only quantifies the active portion.
That could make a difference: if the tool isn't "smart" enough to split the two and simply totals them, then we are often not using as much video memory as we think. However, I suspect it's the latter case and it's all active, which means we need more VRAM from this generation than we got from previous ones.
Either way, I am still aiming for more than the 8GB I have on my 1070ti for its replacement.
AMD Radeon software has an overlay that shows it; otherwise you can use MSI's Afterburner software to check any stat to your heart's desire!
Video memory is becoming more and more important due to the sheer size of assets and the on-the-fly streaming that is now popular. Back in Ye Olde Days all the assets were loaded before starting a game. These days a lot of content is streamed into video memory as you play, especially textures, which can be massive!
I guess I should have posted this here … new rumor in the mill …
The 3090 could boast a performance improvement of 50% over the 2080 Ti.
Two more weeks! I checked my local computer store and they don’t even stock 2080 Ti’s any more.
That’s only $1850 CAD, which is what the upper 1080ti’s went for and the 2080ti’s get up to now.
Hmmmmmmmmmmmmmmmmmmmmmm…
Yup, if all this pans out even close to the rumors, I’m whipping my wallet out LOL!
$1400 for a video card??
I am due a tax refund in almost the exact right amount…
LOL! Yup, 4K and VR gaming has a pretty steep price tag. I paid around $1700 Canadian for my 1080 Ti. I still love it though and whoever I sell it to will love it too.
Good video cards are like a commodity. I’ve never had a problem selling them for a good price.
My first build cost less than this video card. It’s absurd.
That’s how us elitists roll.
FYI
HP Reverb G2 Performance Comparison (multiple existing headsets)
Nice to see my O+ holding its own. Fortunately the 'field mods' to make it wearable are cheap.
That 1/2 resolution mode is interesting too.
DERAIL ATTEMPT DETECTED … DERAIL ATTEMPT DETECTED.
$1400 is about where I expected it to fall. Considering it's literally twice the VRAM of a 2080ti, it's not that unreasonable (given the audience). Personally I've never been able to justify a video card that expensive. I picked up the 2080 Super on "sale" for $700.
I think I'll wait for the inevitable mid-life upgrade in the next 12-18 months and scoop up the mid-tier 3000 series. Hopefully it has 12 GB of VRAM.
and by justify, I definitely mean afford.
We'll see what happens. That card is going to last a lot longer than its mid-tier partner if it has 20 GB of VRAM.
Think I’ll sit with my 2080ti for a good while.
For those of you with a 2080 (Ti), for sure. I have a measly 1080.