Anyone have thoughts on the announced Nvidia RTX 20XX line?


Meh, I’m sitting this one out. My 970 is starting to ache in DCS, but it works for now. Maybe in a year or two, when GDDR6 availability is better and prices drop (and RT is actually supported by anything).
I usually go for cards with a TDP <= 150W. I don’t know if a 2050 will be enough faster than my aging 970 to justify the expense.

Maybe I’ll even wait to see what Intel has in store, although it’s doubtful they will compete with Nvidia’s flagships with their first generation of discrete GPUs.


I have a 1070, which means the 1070Ti, 1080, 1080Ti, and Titans are all faster than this.

I may get a 2070 early next year, assuming its performance is a good step up and it’s not more expensive than the still-unsold 1080Tis, but not for raytracing. With every major new innovation in graphics cards there has been one truism: the first generation is never powerful enough to play games at a meaningful speed.

I only sort of remember it with my first DX8 and DX9 cards, but I definitely remember that the first DX10 cards couldn’t handle it, and the first DX12 cards couldn’t either… They “enable” you to use those APIs, but the performance is always so bad that people turn the feature on, say “oooh, pretty”, then turn it back off to get good performance while the game looks like all the rest again.

So from what I’ve heard, only the 2080Ti is even worth attempting to use RT in games with, and that at 1080p on a single screen. Bought a 2080? Forget it. There will be driver optimizations and game patches to improve performance, but a 33% boost only sounds impressive until you realize it means going from 30 to 40fps.
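To put numbers on that, here is the percentage math spelled out (the fps figures are the illustrative ones from this post, not measured benchmarks):

```python
# Sanity check of the "33% boost" math above.
# The fps numbers are illustrative, not measured benchmarks.

def boosted_fps(base_fps: float, boost_percent: float) -> float:
    """Return the frame rate after a given percentage uplift."""
    return base_fps * (1 + boost_percent / 100)

# A "33% boost" on a 30 fps baseline:
print(round(boosted_fps(30, 33)))  # 40 -- still far from smooth
```

Big relative gains off a low baseline still leave you at a low absolute frame rate, which is the whole point.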

Meanwhile, turn off RT and get a still great (by current standards) looking game that runs 3x faster at high res or in VR.


If that is the case with my favourite flightsims, I will be getting a new vid card. :slight_smile:


I was referring to the fact that in the same game, such as Tomb Raider or BF5, having RT off will likely give fps well over 100, while with it on you’ll be in the 40s if you’re lucky.

How well the 20xx series will do compared to the 10xx is unknown right now, although there are unsubstantiated “2x faster” boasts. Without knowing under what circumstances those were measured, it could well be much less than that.


Are the radar calculations really that taxing, though? The mirrors absolutely are a huge workload.

In general, I’m a bit disappointed that all the focus is on a new rendering technique that will take years to propagate at the software level, when VR has really highlighted the need for simply more ‘old-fashioned horsepower’ in graphics cards. In the year 2018, we don’t need smarter software; we need smarter hardware.


They can be, depending on what the radar is doing. It would normally be a CPU function, not a GPU one, though.


That’s really strongly dependent on the nature of the simulation process.


If you’re doing really advanced radar imaging à la the Super Bug stuff, yeah, I’d imagine so. Kinda overkill, though, when one could get away with a simpler solution that’s less taxing on both the GPU and CPU.


I’m not talking about the processing that the radar has to do; I’m talking about the simulation of the radar wave propagation.
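As a toy illustration of one small piece of what “simulating the propagation” can mean (entirely my own sketch, far simpler than anything a real sim does, with no refraction, beam shape, or RCS modeling), here is a 2D terrain-masking check that marches a straight ray from the radar to the target, the kind of per-contact work that adds up fast:

```python
# Toy 2D terrain-masking check: does a straight ray from the radar to the
# target clear the terrain profile between them? Purely illustrative; a
# real radar model adds refraction, beam shape, RCS, clutter, and more.

def line_of_sight(radar_alt, target_alt, terrain):
    """terrain: ground heights sampled evenly between radar and target."""
    n = len(terrain)
    for i, ground in enumerate(terrain):
        # Height of the straight ray at this sample (linear interpolation).
        frac = (i + 1) / (n + 1)
        ray_height = radar_alt + (target_alt - radar_alt) * frac
        if ray_height <= ground:
            return False  # ray hits terrain: target is masked
    return True

# A 900 m ridge between a radar at 1000 m and a target at 200 m masks it:
print(line_of_sight(1000, 200, [0, 300, 900, 300, 0]))  # False
# Over flat ground the same geometry has a clear line of sight:
print(line_of_sight(1000, 200, [0, 0, 0, 0, 0]))  # True
```

Even this crude version is one loop per radar-target pair per update; doing it with dense terrain samples and many contacts is where the cost comes from.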


Agreed. I imagine a lot of special coding would be needed, and even then only a few people would actually get anything from it.

It makes way more sense to use the special drivers/hardware on the RTX cards for visuals, because nicer visual features are nice-to-have, not need-to-have.


I think the price is crazy!

I paid £1100 for a Titan X Pascal though, so I guess I’m crazy too :slight_smile:

Seriously, I think Nvidia have a backlog of 1080/1070/1060 series cards to shift before these new cards’ prices can come down. Perhaps because of that mining fad they had some overproduction of the 1080 series and need to shift them?

Ray tracing, huh. You know what, I don’t think it looks that great; gimmicky even, a bit like 3D. In the demos of ray tracing games I’ve seen so far it’s all a bit in-your-face with the flame reflections and nothing really subtle. Also, I don’t see any games I play that support it, or any flight sim supporting it in the near future either.

It’s a bit worrying too: no benchies have been seen of how a 2080 Ti compares against, say, a 1080 Ti, and I’ve seen many rumours that there’s not much performance gain at all in traditional games or sims. It’s all about the new ray tech with these cards, it seems… I love new hardware, but honestly I don’t see myself buying one of these. Perhaps I’ll sit on the fence until next year’s gen?

PhysX never really took off, and we had dedicated PhysX cards too; maybe ray tracing will be a flash in the pan as well?


No, it’s definitely the future but as others have commented, it usually takes more than just one hardware generation to establish such a shift in technology.

PhysX “failed” on a different level, in that developers never really dared to code gameplay-essential features for it, because that would have locked a large share of consumers out (PhysX is exclusive to cards that support CUDA). So it was confined to rendering superficial effects, like more realistic hair or a few flags in the wind (Mirror’s Edge, e.g.), that had only a small visual impact.

Ray tracing is such a fundamental step forward for rendering accurate shadows and reflections that there is no way it won’t be mainstream some day; it’s just a matter of having the processing power available.
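The reason ray traced shadows are “accurate” is simple to state even if it’s expensive to compute: for every shaded point, fire a ray toward the light and see if anything solid sits in between. A minimal sketch of that shadow-ray test (spheres only; all names are mine, taken from no real renderer):

```python
import math

# Minimal shadow-ray test against spheres: the core idea behind ray traced
# shadows. Illustrative only; a real renderer adds acceleration structures
# (BVHs), materials, soft shadows, denoising, and many bounces.

def ray_hits_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = r^2; return smallest t > 0."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    for t in ((-b - math.sqrt(disc)) / (2 * a),
              (-b + math.sqrt(disc)) / (2 * a)):
        if t > 1e-6:  # ignore hits behind or essentially at the origin
            return t
    return None

def in_shadow(point, light, spheres):
    """Fire one ray from the shaded point toward the light source."""
    direction = [l - p for l, p in zip(light, point)]
    for center, radius in spheres:
        t = ray_hits_sphere(point, direction, center, radius)
        if t is not None and t < 1.0:  # occluder before the light
            return True
    return False

# A unit sphere at (0, 0, 5) sits between the origin and a light at (0, 0, 10):
print(in_shadow((0, 0, 0), (0, 0, 10), [((0, 0, 5), 1.0)]))  # True
```

Rasterizers approximate this with shadow maps and screen-space tricks; the ray query answers the visibility question directly, which is why it needs so much dedicated hardware to do per pixel, per light, per frame.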


That’s good to know. Are there any flight simulations that will have ray tracing tech? New flight sims are few and far between these days, it seems.

Is it possible to introduce ray tracing into older sims like P3D, DCS, IL-2, etc.? These are still ongoing, developing projects; if not, then I would worry about RT use in newer flight sims.

I’m still concerned about the overall RT performance impact too, and whether this new gen of cards will perform much better in traditional non-RT titles.

My other lingering thought is on VR: what impact and performance result will RT have on VR and existing headsets like the Rift that I use?


RT real time application in games is still in its infancy, caveat emptor.

In 6 months, we’ll know much more than we do now.


PhysX was also nonessential. There were other physics middleware vendors whose engines ran on CPUs. As CPUs became more powerful, the need to offload what had become a lighter workload onto the video cards disappeared.

I can’t see RT taking that route. It was always considered the “best” way to render a scene, because it’s the way it happens in reality, but it needed powerful dedicated hardware to do it. If Nvidia indeed spent 10 years developing this, I’ll admit they have the right to charge a lot for it. I just don’t know if it’s worth it right now.

If AMD picks it up (and since MS added it to DX I’m guessing they will), then devs will use it. It will also probably take years for them to put it in widely and without a massive performance hit. I can see it being mainstream in, say, 2025 though.


Thanks for the comments and information, guys, but that’s all the more reason to sit it out and wait: wait until there are at least some sim titles using RT that I might like, and see how the new generation of ultra-expensive Nvidia 2080 cards deals with the early RT games and demos.

I have a GTX Titan X Pascal just now, the most expensive card I’ve ever bought, but some of these new Nvidia 2080 Ti cards are going for £1500 in the UK, with 1 GB less VRAM and no benchies on how they compare to my card in the sims and games I actually use and will continue to use :slight_smile:

It’s back on the fence for me until Nvidia’s next gen at the end of next year, and until there are some decent RT (simulation) titles at least. Chicken-and-egg situation, I know, but I ain’t gonna be the guinea pig any more; I’ve done my share of that :grimacing:


Hardware chase. No thanks.


In a free-market economy, anyone has the right to charge whatever they want for anything. Now, whether the market will bear that price or not is a different story…


Seeing some benchmarks for the 2080 and 2080 Ti coming out. Looks to me like the 2080 (non-Ti) is about the same speed as the 1080 Ti… unless I’m missing something.

Looking at flight sims as my primary hardware driver, with the 1080 Ti and 2080 being roughly equal, it then comes down to price. Looking around, 1080 Tis are running about the same cost as a 2080, with a slight premium on the 2080 side.

So, it seems like I am going to keep my 2080 pre-order. Thoughts?


If price and current performance are similar, the newer card will probably come out ahead in a few years, once drivers are better optimized for it, even if you don’t play the new or updated games that will exploit its potential even better.