Anyone have thoughts on the announced Nvidia RTX 20XX line?

Meh, I’m sitting this one out. My 970 is starting to ache in DCS, but it works for now. Maybe in one or two years, when GDDR6 availability is better and prices drop (and RT is actually supported by anything).
I usually go for cards with a TDP <= 150W. I don’t know if a 2050 will be enough faster than my aging 970 to justify the expense.

Maybe I’ll even wait to see what Intel has in store, although it’s doubtful they will compete with Nvidia’s flagships with their first generation of discrete GPUs.

I have a 1070, which means the 1070Ti, 1080, 1080Ti, and Titans are all faster than what I have.

I may get a 2070 early next year, assuming its performance is a good step up and it’s not more expensive than a still-unsold 1080Ti, but not for raytracing. With every major innovation in graphics cards there has been one truism: the first generation is never powerful enough to play the games at a meaningful speed.

It was the same, as I sort of remember, when I got my first DX8 and DX9 cards, and I definitely remember the first DX10 cards couldn’t handle it, just as the first DX12 cards couldn’t. They “enable” you to use those APIs, but the performance is always so bad that people turn the feature on, say “oooh, pretty”, then turn it back off to get good performance while the game looks like all the rest again.

So from what I’ve heard, only the 2080Ti is even worth attempting to use RT in games, and only at 1080p on a single screen. Bought a 2080? Forget it. There will be driver optimizations and game patches to improve performance, but a 33% boost only sounds impressive until you realize it means going from 30 to 40fps.

Meanwhile, turn off RT and get a still great (by current standards) looking game that runs 3x faster at high res or in VR.
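To put that 33% in perspective, here’s a quick sketch (plain Python, the framerates are just hypothetical examples) of what the same relative boost does at different starting points:

```python
# A "33% boost" is a multiplier, so at low base framerates it only
# adds a handful of frames per second in absolute terms.

def fps_after_boost(base_fps: float, boost: float) -> float:
    """Framerate after a relative performance boost (e.g. 1/3 for +33%)."""
    return base_fps * (1.0 + boost)

for base in (30, 60, 90):
    print(f"{base} fps + 33% -> {fps_after_boost(base, 1/3):.0f} fps")
# 30 fps + 33% -> 40 fps
# 60 fps + 33% -> 80 fps
# 90 fps + 33% -> 120 fps
```

The same percentage that takes a 90fps game to 120fps only takes a 30fps RT slideshow to 40fps.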

1 Like

If that is the case with my favourite flightsims, I will be getting a new vid card. :slight_smile:

1 Like

I was referring to the fact that in the same game, such as Tomb Raider or BF5, having RT off will likely give fps well over 100 while with it on you’ll be in the 40s if lucky.

How well the 20xx series will do compared to the 10xx is unknown right now, although there are unsubstantiated “2x faster” boasts. Without knowing the circumstances behind those numbers, it could well be much less than that.

1 Like

Are the radar calculations really that taxing, though? The mirrors absolutely are a huge workload.

In general, I’m a bit disappointed that all the focus is on a new rendering technique that will take years to propagate at the software level, when VR has really highlighted the need for simply more ‘old-fashioned horsepower’ in graphics cards. In the year 2018, we don’t need smarter software, we need faster hardware.

They can be, depending on what the radar is doing. It would normally be a CPU function rather than a GPU one, though.

1 Like

That’s really strongly dependent on the nature of the simulation process.

2 Likes

If you’re doing really advanced radar imaging à la Super Bug stuff, yeah, I’d imagine so. Kinda overkill, though, when one could get away with a simpler solution that’s less taxing on both the GPU and CPU.

I’m not talking about the processing that the radar has to do, I’m talking about the simulation of the radar wave propagation.
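To make that distinction concrete, here’s a toy sketch (purely illustrative — not how any actual sim does it, and all the names and numbers are made up) of what simulating wave propagation means: cast rays outward from the antenna and test each against the targets, which is exactly the kind of embarrassingly parallel work RT hardware is built to accelerate:

```python
# Toy 2D radar propagation: sweep rays around the antenna and record
# the echo delay for any ray that passes within a target's radius.
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def radar_returns(antenna, targets, n_rays=360, max_range=100_000.0):
    """Cast n_rays evenly spaced around the antenna; return a list of
    (ray angle in radians, round-trip echo delay in seconds) for hits.
    Each target is (x, y, radius) in metres."""
    hits = []
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        dx, dy = math.cos(angle), math.sin(angle)
        for (tx, ty, radius) in targets:
            # Vector from antenna to target centre.
            ox, oy = tx - antenna[0], ty - antenna[1]
            along = ox * dx + oy * dy  # distance along the ray
            if 0 < along <= max_range:
                # Squared perpendicular distance from ray to target centre.
                perp2 = (ox * ox + oy * oy) - along * along
                if perp2 <= radius * radius:
                    hits.append((angle, 2 * along / SPEED_OF_LIGHT))
    return hits
```

A single target 50 km due "east" of the antenna shows up as one hit on the angle-0 ray, with a round-trip delay of about a third of a millisecond; multiply that inner loop by thousands of rays, terrain occlusion, and multipath, and the GPU-shaped nature of the problem becomes obvious.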

Agreed. I imagine a lot of special coding would be needed, and even then only a few people would actually get anything from it.

It makes way more sense to use this special driver/hardware support on the RTX cards for visuals, because nicer visual features are nice-to-have, not need-to-have.

No, it’s definitely the future but as others have commented, it usually takes more than just one hardware generation to establish such a shift in technology.

PhysX “failed” on a different level, in that developers never really dared to code gameplay-essential features for it, because that would have locked out a large share of consumers (PhysX acceleration is exclusive to cards that support CUDA). So it was confined to rendering superficial effects like more realistic hair or a few flags in the wind (e.g. Mirror’s Edge) that had only a small visual impact.

Ray tracing is such a fundamental step forward for rendering accurate shadows and reflections that there is no way it won’t be mainstream some day; it’s just a matter of having the processing power available.

2 Likes

RT real time application in games is still in its infancy, caveat emptor.

In 6 months, we’ll know much more than we do now.

1 Like

PhysX was also nonessential. There were other physics middleware vendors whose products ran on CPUs, and as CPUs became more powerful, the need to offload that increasingly manageable work onto the video card disappeared.

I can’t see RT taking that route. It was always considered the “best” way to render a scene, because it’s the way light behaves in reality, but it needed powerful dedicated hardware. If Nvidia indeed spent 10 years developing this, I’ll admit they have the right to charge a lot for it. I just don’t know if it’s worth it right now.

If AMD picks it up (and since MS added it to DX12, I’m guessing they will), then devs will use it. It will still probably take years before they can adopt it widely and without a massive performance hit. I can see it being mainstream in, say, 2025, though.

1 Like

Hardware chase. No thanks.

In a free market economy, anyone has the right to charge whatever they want for anything. Now, whether the market will bear that price is a different story…

2 Likes

Seeing some benchmarks for the 2080 and 2080 Ti coming out. Looks to me like the 2080 (non-Ti) is about the same speed as the 1080 Ti… unless I am missing something.

Looking at flight sims as my primary hardware driver: with the 1080 Ti and 2080 roughly equal, it comes down to price. On newegg.ca, 1080 Tis are running about the same cost as a 2080, with a slight premium on the 2080 side.

So, it seems like I am going to keep my 2080 pre-order. Thoughts?

If price and current performance are similar, the newer card will probably pull ahead in a few years, once the drivers are better optimized for it. That’s true even before you count the new or updated games that will use its potential even better.

1 Like

All else being equal, the newer card will get driver support for longer (though you’ll probably not have it that long).

1 Like

Now that the reviews by people who didn’t make them under NDA are trickling in, I am a bit disappointed.

Of course I know that once raytracing really becomes a thing in games the performance will be a bit better, the new cards will receive driver support for longer, they will probably get a bit faster once some optimization is done, and the prices might also drop at some point.

Still… In many “real world” gaming cases the 1080ti is just as fast as the 2080, for less money.

Can’t wait for the 2070; I have to decide whether to get a 1080 (pretty much the minimum card in case I decide to do VR at some point) or a 2070.

2 Likes