Why does your RTX 2080 Ti come back as 146.7% while mine came back as 174.5%? Do the mobo and memory have some effect on the GPU test results as well?
They may have different core clocks out of the box. An EVGA card may well have higher boost clocks than an FE.
Also, with the newer GPUs, the cooler the GPU runs, the higher it will boost, even without a manual overclock.
I'm running mine stock… My memory says I need to enable XMP. I'm playing around in the BIOS…
I can't even reach the BIOS. For some strange reason my TV only gets a signal from the GPU at the Windows login page.
Anyway, as I was playing around with things I updated my GPU drivers. I usually stay on a legacy GPU driver (because it's stable and I get OK performance with it) and only update from time to time, when there's something I want to solve or update on my machine (disclaimer: a GPU driver install has never actually solved anything in my case).
At this point I went from 452.06 to 457.09. What was interesting was that X-Plane just crashed because of Vulkan and then refused to switch to Vulkan at all, claiming my drivers were older than 440.
So I installed a previous driver, 456.55, and X-Plane was happy with Vulkan again, and I saw an increase in FPS. I need to test further, but there was definitely an increase.
Wondering if tinkering with older drivers might also benefit you guys with RTX 30-series cards playing older sims like X-Plane and DCS.
I have EXACTLY the same issue.
Resolution not supported
I installed CoD and left it for my son to play. I am too old for that stuff. I quite enjoyed the brief time I had on Cyberpunk, but mainly it's flying the Hercules in DCS with fluid graphics in VR. It's great.
Just because you can use a TV instead of a monitor doesn’t mean you should.
There’s a reason a TV is a lot cheaper than a monitor.
The BIOS doesn't support 4K resolution, so if your TV can't upscale the much lower, non-TV resolution coming out of the BIOS, you'll see nothing. You'll need to hook up a proper monitor to see the BIOS.
Try switching off Secure Boot in the UEFI. (It's not called BIOS anymore. Oh sorry, I'm being German again.)
UserBenchmarks: Game 113%, Desk 104%, Work 104%
CPU: Intel Core i7-8700K - 102.4%
GPU: Nvidia RTX 2080-Ti - 107.6%
SSD: Samsung 960 Evo NVMe PCIe M.2 250GB - 211.7%
SSD: Samsung 850 Pro 256GB - 113.8%
SSD: Samsung 960 Evo NVMe PCIe M.2 500GB - 202%
HDD: WD Black 500GB (2012) - 96%
RAM: G.SKILL F4 DDR4 3000 C16 4x8GB - 93.7%
MBD: Asus ROG MAXIMUS X HERO (WI-FI AC)
Oh look Mommy, I'm going backwards… WTH!
Increasing, your values are not.
IIRC, UserBenchmark percentages are based on some statistic of all (recent?) benchmark runs (the average or median or something like that).
So if many people with new RTX 3080, RTX 3090, RX 6800, and RX 6900 cards are running benchmarks now, a fixed level of performance may decline in %-score even more rapidly than normal.
Having said that, this is a very fast decline.
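To illustrate the idea (this is just a toy model, not UserBenchmark's actual formula): if the percentage is assumed to be your raw result relative to the average of recent runs, the same raw score drops as faster cards join the comparison pool. The numbers below are made up.

```python
def relative_score(my_fps, recent_results):
    """Hypothetical %-score: my result vs. the average of recent results."""
    reference = sum(recent_results) / len(recent_results)
    return 100.0 * my_fps / reference

older_pool = [90, 100, 110, 120]            # mostly 20-series era results
newer_pool = older_pool + [150, 160, 170]   # RTX 30 / RX 6000 results arrive

my_fps = 160                                 # my card's raw result never changes
print(relative_score(my_fps, older_pool))    # ~152.4%
print(relative_score(my_fps, newer_pool))    # ~124.4% -- "going backwards"
```

Same card, same raw performance, lower percentage, simply because the crowd it's being compared against got faster.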
Ok, getting a good OC on the card now. Thanks @BeachAV8R for getting me overclocking again.
UserBenchmarks: Game 167%, Desk 105%, Work 153%
CPU: Intel Core i7-8700K - 104.5%
GPU: Nvidia RTX 2080-Ti - 160.9%
SSD: Samsung 960 Evo NVMe PCIe M.2 250GB - 216.3%
SSD: Samsung 850 Pro 256GB - 123%
SSD: Samsung 960 Evo NVMe PCIe M.2 500GB - 213.5%
HDD: WD Black 500GB (2012) - 102.3%
RAM: G.SKILL F4 DDR4 3000 C16 4x8GB - 94.3%
MBD: Asus ROG MAXIMUS X HERO (WI-FI AC)
Mine might have come from MSI overclocked to start with… as I haven't fiddled with it.
While my CPU is overclocked, I’ve never done it with my Nvidia 2080ti. I was worried about heat. What’s the best way to do it with a decent margin of safety? TIA
I don’t know what any of this stuff does…MSI has a Dragon Center, but I’ve never actually used it…the old “if it ain’t broke” principle…
Wow, I so highly disagree with your opinions! You should absolutely use a TV instead of a monitor! Check out my 55”. My crap pictures don’t do it justice … crystal clear graphics and text …
What is the reason? I seriously don't know. I feel so fortunate to be able to go out and buy a 55" 4K monitor/TV for $400. They don't even make computer monitors larger than 50", and the larger monitors they do make come in that gimmicky "ultrawide" format. I like my vertical FOV too, so it's "ultrawide" and "ultratall" for me, thank you! If they did make monitors over 50", you'd probably easily have to spend $1500+ for one.
I don't know what TVs you guys are using, because I've been able to see my BIOS settings ever since I bought my first 4K TV a few years ago…
I’m as much devoted to my large computer monitors as the VR enthusiasts are devoted to their headsets! (I am also a VR enthusiast)
I got to overclock it last night. Did it right for the first time in my life. It takes patience. Do a little, then test. Rinse and repeat until you get artifacts or a crash, then go back to the last good setting. I used Precision X1 to OC and Superposition as the benchmark. The benchmark shows a live temp reading. I hit 62°C as a high, so nothing to fear.
I make changes, go do laundry, check, move the sliders, test. Etc., etc. I write down my results and settings too.
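For anyone who thinks better in code, here's a rough sketch of that "do a little, test, back off" loop. The tool and benchmark calls are simulated stubs, not real APIs; in practice you bump the slider in your OC tool (Precision X1, Afterburner, etc.) and run Superposition by hand, and the +130 MHz "limit" below is made up.

```python
TRUE_LIMIT = 130       # pretend the card artifacts beyond +130 MHz (made up)
STEP_MHZ = 15          # small bump per iteration

def apply_core_offset(mhz):
    """Stub standing in for the slider in your OC tool."""
    print(f"Setting core offset to +{mhz} MHz")

def run_stress_test(mhz):
    """Stub standing in for a benchmark run; 'fails' past the pretend limit."""
    return mhz <= TRUE_LIMIT

last_stable = 0
offset = 0
while True:
    offset += STEP_MHZ
    apply_core_offset(offset)
    if not run_stress_test(offset):
        # Artifacts or a crash: fall back to the last clean setting.
        apply_core_offset(last_stable)
        break
    last_stable = offset    # passed, write it down, go again

print(f"Sweet spot: +{last_stable} MHz on the core")
```

Small steps plus writing down the last stable value is the whole trick; the loop just makes that explicit.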
FWIW, I've been playing with a little overclocking on my 1080. It has gone as @Maico mentions. On my video card I can tell I'm reaching the limit when, in DCS, I start to get tiny little 'sparkles' on the screen. Red ones, it seems. Really small, like single-pixel-sized. Note the temps stay fine and well within limits. If I go even a little over this, the sparkly things get distracting.
An interesting bit: when the scene is heavy (low down, lots of trees, etc.) I don't see the artifacts much, but when the scene is barren and the FPS is highest (at high altitude with not much around me), they appear more frequently.
Writing things down is helping me a bunch. I’m getting close.
@BeachAV8R I say your stuff has probably been very much optimized from the factory. I would not touch a thing!!!
I built mine, so I got to learn how to do this. I got to the edge and crashed. Now I'm finding the sweet spot.
I'm now at a place where going up with the clock lowers the score. Voltage for me is +75. More is worse, less is worse. My core will end up between +115 and +120.