AI could be bad for GPU availability

XT or XTX or both?

1 Like

@SkateZilla The XTX is still way too expensive for what it offers.
But the RX 7900 XT seems to offer around 25-30% better performance at almost the same power usage as the RTX 3080. That could boost me from ± 70 fps to ± 90 fps in VR. Looking at what TUF RTX 3080s go for on the used market, and assuming I would buy Starfield anyway, the net cost could be as low as €250.
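Quick sanity check on that math (my own rough figures, assuming the uplift applies fully in a GPU-limited scenario):

```python
# Does a 25-30% uplift really take ~70 fps to ~90 fps?
base_fps = 70
for uplift in (0.25, 0.30):
    print(f"+{uplift:.0%}: {base_fps * (1 + uplift):.0f} fps")
# Prints "+25%: 88 fps" and "+30%: 91 fps", so "± 90 fps" checks out.
```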

And as I said, the driver issues seem fixed.

I really wonder what you think about how GPU prices will develop. Cheaper because of chiplets, or more expensive because of AI?

1 Like

They are both the same design; one just has GCD/MCD units disabled.

The drivers are hit and miss every month, but the real problem is that the design was rushed: the Infinity Fabric is a bottleneck between the GCD and each MCD, which hinders GCD performance.

The 7900 XT's price has gone down to the point where it competes; the XTX, not so much.

The design should have been held back; there were plenty of 6900/6800s to cover a 6-month delay while they fixed the problems.

Then again, its premature release saved nVidia fans from that 4070 Ti rebadged as a "4080 12GB", so you're welcome.

Also, the 7900s have VR Stuttering issues.

Side Note:
AMD 6700+ and 7xxx GPUs come with Starfield or Starfield Premium right now.

1 Like

I don’t think I’d be too bothered if my 4080 has to last a few years - that would actually be ideal, considering what I paid for it. It certainly does the business.

PS: I actually got some free premium game code with it, but I can't remember what it was - I'll have to have another look. I didn't use it, so it might be up for grabs.

1 Like

I understand your technical perspective, and am of course hoping for a much improved but reasonably priced RX 8800XT. But that depends on market circumstances and the economic reality at least as much as on the technical progress. And as a consumer, the 7900XT right now with the current drivers seems to offer pretty good value.

I've been watching that, yes. The July driver update fixed it:

AMD response:

Reviewer test:

All that is left is a bug in the Oculus software for Oculus/Pico headsets, but I use WMR and would only replace it with a Varjo or Valve headset. So now that AMD has fixed their side, these cards are A-OK for me.

1 Like

From the news I'm seeing, there won't be an 8800/8900 GPU (barring a miracle breakthrough that fixes the IF/IC bottlenecking on larger GCDs).
rDNA 5 is the roadmap to multiple smaller GCD chiplets instead of one large GCD.
rDNA 4 was essentially rDNA 3.5 (GCD, MCDs, and PCIe chiplets) with an added AI accelerator chiplet.
rDNA 4 has the same issues as rDNA 3: Infinity Fabric bottlenecks on larger GCDs and Infinity Cache overruns, both of which bottleneck the GCD and the AI accelerator, not to mention a performance-per-watt decrease from the added AI chiplet. These problems won't be evident on smaller GCDs (8600 etc.).
AMD's GPU division kind of had a Bulldozer moment: they were so excited about introducing IF and GCD/MCD chiplets that they didn't realize the IF and IC bottlenecks hindered the raw performance of the GCD.

Had the GCDs not been bottlenecked, you'd easily see 4080/4090 performance from the XTX without having to double the TDP to 600 W.

rDNA 5 is still early but hasn't run into those problems (as they are using new IF and IC configurations) with smaller, separated GCD chiplets, and pre-production tests show "exciting" performance increases (according to an AMD tech).

So they will likely not push out a new top tier this time around: they'll continue selling off 7800/7900 GPUs while they develop rDNA 5 in the background, then relaunch with top-tier cards using fully chiplet-based GCDs.

Seeing as 3 nm has a 400 mm² reticle limit with likely a 20 mm² edge buffer zone (Navi 31 GCD = 304.35 mm² for reference), it makes sense to go full-on chiplet like I said in my wall-of-text post above; chiplets are the future.
You cannot keep die-shrinking these 400-600 mm² GPUs to sub-5 nm; the voltages required would simply melt the ICs.
Four smaller GCD chiplets with four separate, lower-voltage input lanes would be more stable. Power tuning could also let the GPU turn off entire chiplets in desktop use or lower-utilization games: instead of running all four chiplets plus all the MCDs and RAM modules, the user could set a per-game profile that only enables two chiplets, two MCDs, and the RAM chips those MCDs control.
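To make the idea concrete, here's a purely hypothetical sketch of such a per-game profile; none of these names exist in any real AMD driver or API, and the chiplet/MCD/RAM pairings are my own assumptions:

```python
from dataclasses import dataclass

@dataclass
class ChipletProfile:
    """Hypothetical per-game power profile for a 4-GCD chiplet GPU."""
    name: str         # game or scenario the profile applies to
    active_gcds: int  # GCD chiplets left powered on (out of 4)
    GCDS_TOTAL = 4
    MCDS_PER_GCD = 1       # assume each GCD pairs with one MCD
    VRAM_GB_PER_MCD = 4    # and each MCD controls 4 GB of RAM chips

    def summary(self) -> str:
        mcds = self.active_gcds * self.MCDS_PER_GCD
        vram = mcds * self.VRAM_GB_PER_MCD
        return (f"{self.name}: {self.active_gcds}/{self.GCDS_TOTAL} GCDs, "
                f"{mcds} MCDs, {vram} GB VRAM powered")

# Desktop / light games: half the GPU stays dark.
print(ChipletProfile("desktop", active_gcds=2).summary())
# Heavy VR title: everything on.
print(ChipletProfile("DCS VR", active_gcds=4).summary())
```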

As for giving nVidia full control of the top tier: I doubt you'll see an RTX 5090 on the 3 nm node. There's no way they can fit that GPU taped out under a 400 mm² reticle limit with a 20 mm² edge buffer zone; that's a 35-40% reduction in die area. It's not happening.
Even if you take AD102, which is 609 mm² at 5 nm, and die-shrink it to a premature 3 nm tape-out with a 400 mm² reticle limit and a 20 mm² buffer zone, you're looking at roughly 370 mm² for an Ax102 chip just from the shrink, which would put it right at the buffer zone for the reticle.
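Back-of-envelope version of that arithmetic, using the numbers claimed above (the 35-40% shrink and the 400 mm² limit are the post's figures, not official ones):

```python
AD102_MM2 = 609          # AD102 die area on the 5 nm-class node
AREA_REDUCTION = 0.39    # assumed 5 nm -> 3 nm shrink (35-40% range)
RETICLE_MM2 = 400        # claimed 3 nm reticle limit
EDGE_BUFFER_MM2 = 20     # claimed edge buffer zone

shrunk = AD102_MM2 * (1 - AREA_REDUCTION)   # ~371 mm^2
usable = RETICLE_MM2 - EDGE_BUFFER_MM2      # 380 mm^2 outside the buffer
print(f"shrunk AD102-class die: ~{shrunk:.0f} mm^2")
print(f"margin before the buffer zone: {usable - shrunk:.0f} mm^2")
# ~371 mm^2 against 380 mm^2 of safe area: single-digit margin at best.
```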

nVidia is also HEAVILY invested in the shift to AI chip production. They dominate the top 50 cards on Steam's survey, with the 3090 and 4090 barely cracking it (#35 and #39).

3 Likes

Let's also not forget:
rDNA 4 is also supposed to double the compute density per GCD, to double the TFLOPs.
Seeing as rDNA 3 already has large-GCD bottlenecks, doubling the compute core density… umm.
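For reference, peak FP32 throughput scales linearly with ALU count at a fixed clock, so doubling compute density doubles the paper TFLOPs even if the fabric can't actually feed the extra ALUs. A quick illustration with made-up, Navi 31-ish figures:

```python
def peak_tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs: an FMA counts as 2 FLOPs per ALU per cycle."""
    return 2 * alus * clock_ghz / 1000

base = peak_tflops(6144, 2.5)      # one large GCD: ~30.7 TFLOPs on paper
doubled = peak_tflops(12288, 2.5)  # doubled density: ~61.4 TFLOPs on paper
print(f"{base:.1f} -> {doubled:.1f} TFLOPs")
# The fabric and cache now have to feed twice the ALUs, so a design
# that is already IF/IC-bottlenecked gains far less than 2x in practice.
```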

2 Likes

“Like fine wine”, eh :slight_smile:

Navi 41/42/4C Design Explanation:

1 Like

Hey guys! I think I have just mastered the youtuber clickbait face! How’s this?

10 Likes

I keep on clicking, but nothing happens…!
I wanna know WHAT HAPPENS NEXT!!! I bet I will never be able to believe it. :astonished:

7 Likes

Flight sims are dead!

:slight_smile:

2 Likes

So dead! Again! :wink:

1 Like

I F-ing did it again.

I got an offer of €520 for my RTX 3080 and ordered the RX 7900 XT Taichi.
From the tests I’ve seen on the XTX, the Taichi should be one of the better coolers in terms of noise vs temperature.

Net cost of the upgrade is not astronomical at €340, including Starfield Ultimate Edition.
Expect some slightly sloppy DCS VR and 4K tests. (I'm not going to run DDU more than the three times required to 1) test the AMD card, 2) put the Nvidia back in to demo it for the buyer, and 3) put the AMD back in.)

If the results are mediocre, I can still send the 7900 XT back and cancel the sale.

2 Likes

I just did a huge rant on VR performance complaints in ED's Discord last night, lol.

A 7900 XTX shouldn't have any issues provided the settings are set correctly.

The expectation of "I have a $900 GPU, I should be able to max everything and get 90 FPS" is misplaced.

1 Like

I know. I'm just hoping to get close to a 30% performance increase over what I have now, at least in a GPU-limited scenario.

Seems AI farms have found a new trick:

Using Zen 2 & 3 APUs paired with 16 GB of system RAM, they are able to match most mid-tier GPUs in performance.

So now AMD APU prices will go up, lmao.

2 Likes

Shall I finally replace my 4-year-old AMD RX 5700 XT with an Nvidia RTX 4070 (single 8-pin)? Is that a good upgrade? Or shall I wait for the AMD RX 7700/7800 release?
Rumors say that Nvidia could widen memory access beyond the 128-bit bus (on the RTX 4060 Ti) and 192-bit bus (on the RTX 4070/Ti), plus maybe release a model with 16 GB VRAM, as a reaction to the new AMD mainstream models.
But everybody knows that Nvidia is greedy. Do you think this might happen, like with the RTX 2060/2070 and their Super versions as a reaction to the technically better and faster RX 5700/5700 XT?
If I bought an RTX 4070 and Nvidia released a Super version right after, I would be very angry.

The 7800 XT will likely compete with the 4070 and 4060 Ti for $299.

Frankly, I have had an AMD GPU for 4 years and my personal experiences with AMD GPU drivers are mixed (really long story).
Although, like everybody around me, I'm not a fan of Nvidia's prices, or of DLSS and RT (except in Cyberpunk 2077, The Witcher 3 next-gen update, and some old games like Quake 2), I don't remember facing any Nvidia driver issues in general during my time with a GTX 960 2GB and a GTX 1060 6GB. And although I'm generally satisfied with the RX 5700 XT, especially its performance in new games (around RTX 2070 Super/GTX 1080 Ti level), I had some issues with AMD GPU drivers in the past, and not all were easy to figure out.
I love AMD technology, but it's curious: they don't usually have problems with GPU drivers in-game (except for one period when the RX 5700 XT was new), yet they still have some problems with Windows 10/11, unfortunately. Therefore I want to buy Nvidia, although I'm not a fan of the mere 12 GB VRAM and 192-bit memory bus on their RTX 4070, not to mention the still-high price.