I have been an Intel man my entire life, with Nvidia covering the graphics. The only exception was my first desktop tower, which, after being relegated to secondary duty behind a Toshiba Qosmio X300, had to be brought back under “emergency measures” when the Qosmio’s Nvidia GPU called it quits (replacements were extortionate, and I could only find the Euro model, which wasn’t the same).
This was back in the days of AGP slots, which were already deprecated, with no new cards being made for them. The only card I could get that would let me play FSX was a Radeon, as it was dual height with a fan and had 1GB of VRAM. It was a royal PITA: as I later researched, the chips had a fault that buggered up 2D rendering. The 2D panels in FSX would all have a diagonal cut through them, with the two halves offset along the slice by half a dozen pixels.
I can stomach a chip reaching EOL and failing, but a design fault like that - no. To date I have had three Nvidias go out, laptop included, plus one other “premature” failure, but that was a secondary card I was experimenting with in SLI. Its warranty replacement is alive and well (and so is the original 560ti from my last build, now in my dad’s PC). The third was a 660ti, replaced last summer after earning its retirement.
No Intel CPU has ever let me, or my family, down (the outdated CPUs get handed down to them).
I would be interested to see whether anyone has a similar story from the “other side”. My systems are long-term builds; each one goes 5+ years before a full replacement, so Intel’s reliability is what keeps earning my dollars.