Carbonite: South Africa's Top Online Tech Classifieds!
PSA to all RTX 3080/90 owners

dainluke

Preface:
I recently made a rookie mistake that I had half-anticipated before I even received my card. As for core clocks, at this point I'm certain it's just the luck of the draw, as I've seen a huge range in stable frequencies. That's beside the point, though; it's generally up to you and how much power you want to shove through your GPU.

Actually important:
I had been running my GPU at +100MHz on the core and +1000MHz on the G6X with no issues whatsoever. I completed my suite of benchmarks, and once I had somewhat respectable numbers I called it a day and put this R18500 to work, as I haven't been gaming much in the last few months. Then, for the first time, I had memory artefacting and a GPU-scheduler BSOD. I learnt that G6X is actually error correcting, so an unstable memory OC may not reveal itself quickly at all if you have fairly good modules. I watched a video of Jay OCing his FE 3080 in which he mentioned this exact topic and demonstrated the FPS loss in Unigine Heaven. Lowering my OC by 500MHz, for an effective 20Gbps, produced a negligible drop in Port Royal even though my GPU was holding an average boost clock 100MHz lower (I wasn't benchmarking with my giant fan blowing at it). I'm not totally sure, but I remember hearing of Micron modules failing on 20-series cards. If your G6X is currently holding anything above +700MHz, I seriously recommend revisiting your OC.
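A quick sketch of the arithmetic behind these offsets, assuming the 3080's stock 19Gbps data rate and 320-bit bus, and that (as the post implies, since -500MHz landed on an even 20Gbps from +1000) a +500MHz Afterburner offset corresponds to +1Gbps effective. Function names here are just for illustration.

```python
BASE_GBPS = 19.0      # stock GDDR6X per-pin data rate on the RTX 3080
BUS_WIDTH_BITS = 320  # 3080 memory bus width

def effective_gbps(offset_mhz: float) -> float:
    """Effective per-pin data rate for a given Afterburner memory offset."""
    return BASE_GBPS + offset_mhz / 500.0

def bandwidth_gbs(offset_mhz: float) -> float:
    """Total memory bandwidth in GB/s (data rate x bus width / 8 bits)."""
    return effective_gbps(offset_mhz) * BUS_WIDTH_BITS / 8.0

for offset in (0, 500, 1000):
    print(f"+{offset} MHz -> {effective_gbps(offset):.0f} Gbps, "
          f"{bandwidth_gbs(offset):.0f} GB/s")
```

So +500 works out to 20Gbps (800GB/s) and +1000 to 21Gbps, which is why the offsets in this thread cluster around those round numbers.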
 
This is one of the things I tried when I overclocked. In Port Royal, my score still scaled positively with more memory speed.

With 100% fan speed to keep the card under 60°C (for consistent core clocks between runs), I found that +750 scored higher than +500, and +1000 higher than +750, though not by a huge amount.
Will try it with Heaven and make sure it also scales there.

In games I am currently 100% stable at +75MHz core (+105 was stable for benchmarks but failed after gaming for a while) and +1000MHz mem after testing in many scenarios: high-power scenarios of 350W with 1900-1940MHz clock speeds, low-power games of 275-300W with 2000-2040MHz clock speeds, games with RTX on, and games with RTX and DLSS on.
 
I've had very similar results on my end. I just think it's worth taking a look at, because both Jay and GN avoided going near +1000. It's been completely stable for me, and I've basically been gaming 8 hours a day since I received the card; it took a very, very long time before RDR2 just randomly crashed. I think my +1000MHz OC was barely stable under certain workloads, so I really think it's safer to back off on the offset for safety's sake.
 
It's a 3080/3090. Why overclock at all? :D
Just slap it in and enjoy it for what it is: an out-of-the-box gaming beast.

Or....

Put it back in its box, return it, get your hard earned bucks back and get a better GPU from AMD when they go on sale :)
 

It could be a few things though, not necessarily the memory clock.
If it took a very, very long time before it crashed, have you tested the new memory overclock for the same amount of time?

Did you not use Display Driver Uninstaller before you installed the new drivers?
Did you enable Hardware Accelerated GPU Scheduling in Windows?
 

Because free FPS

Some interesting final figures before I move on to actually playing: WD Legion at 1440p, Ultra settings, Ultra RT and DLSS Quality.

Undervolted 3080 (1821MHz @ 0.825V, +1000MHz mem, max PL):
73FPS avg, 40FPS min, 114FPS max, ~235W

Stock:
76FPS avg, 44FPS min, 116FPS max, ~330-340W

Overclocked (+75MHz core, +1000MHz mem, max PL):
81FPS avg, 50FPS min, 124FPS max, ~340-350W
 
No such thing as a free lunch :D

Risk vs reward on a very expensive, beautifully engineered product like a 3080/3090 is a gamble, in my humble opinion.
Overclocking comes at a hidden cost, and a subtle one at that :)
Risking a few FPS for a more permanent artifacting 'feature' is way too scary a prospect for me personally, on hardware that costs more than the price of delivering a new life into this world :D

Just my thoughts
 

So all those factory-overclocked cards have problems because there's no such thing as a free lunch? Nonsense. You don't get a permanent artifacting 'feature' from overclocking either.

Once you go too high, you get errors. Dial back to more stable speeds (still higher than stock) and you're good.

Worst case, you didn't win the silicon lottery and have to stay at stock. But you don't get permanent damage from overclocking; that's just an old myth with no proof.
 
Mkay.

I just find the whole exercise completely pointless and bordering on the silly (for daily gaming use), especially on these new-gen GPUs.
All in the name of a few FPS more and some arbitrary benchmark scores.
There definitely is no such thing as a free lunch. Trade-offs and compromises have to be made regardless of your beliefs. It's certainly not risk free.

Each to their own, though. If 7 more FPS is worth contending with the possibility, however marginally remote, of long-term or even immediate damage on a 40k GPU, then so be it; horror stories can be funny too, and we all look forward to some laughs :D
 
I must add :

In all likelihood you are more right about it all than I am; I'm just expressing my opinions on the topic as my own logic dictates. I guess aging rapidly is making me more risk averse.

:)
 
Did a lot of testing in Unigine Heaven (1440p, max settings) to find when memory regression hits and what it looks like.

To rule out GPU Boost or the power limit affecting scores, I undervolted, set the fan speed to 100% and let the GPU get up to temperature (44-45°C), then ran the benchmark, which held a constant 1680MHz core clock, so any difference was down to the memory alone.

These were my findings :
+0 : 127.3 avg, 3206 score
+250 : 127.7 avg, 3216 score
+400 : 128.0 avg, 3225 score
+500 : 128.3 avg, 3231 score
+600 : 128.4 avg, 3236 score
+700 : 128.6 avg, 3240 score
+800 : 128.8 avg, 3244 score
+900 : 128.9 avg, 3248 score
+1000 : 129.1 avg, 3253 score
+1250 : 129.6 avg, 3264 score
+1499 : 116.7 avg, 2940 score
+1500 : 117.3 avg, 2955 score


Pretty clear where the regression occurs due to memory error correction on GDDR6X. No artifacts at all, though, just the performance drop-off.
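The error-correction cliff in numbers like these can be spotted mechanically: scan consecutive runs and flag the first offset whose score drops below the previous one. The data is copied from the table above; `find_regression` is just an illustrative helper.

```python
# (offset_mhz, Heaven score) pairs from the runs above
runs = [
    (0, 3206), (250, 3216), (400, 3225), (500, 3231), (600, 3236),
    (700, 3240), (800, 3244), (900, 3248), (1000, 3253), (1250, 3264),
    (1499, 2940), (1500, 2955),
]

def find_regression(runs):
    """Return the first offset whose score drops below the previous run's,
    or None if the scores climb monotonically."""
    for (_, prev_score), (offset, score) in zip(runs, runs[1:]):
        if score < prev_score:
            return offset
    return None

print(find_regression(runs))  # first offset past the error-correction cliff
```

On this data it flags +1499, matching the eyeballed result: everything up to +1250 scales, then error correction erases the gains.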
 
I did use DDU, and I am currently using hardware-accelerated GPU scheduling. Thing is, I wouldn't have made this thread had other YouTubers not alluded to a hard limit at around +500-700. It's possible that your card is simply better equipped for memory overclocking, though. That's a really good run of testing you did, and I see what you mean.
 
I'm not sure if I made it seem that way; the artefacting is definitely not permanent. I merely went into MSI AB and dialled back the memory OC. I picked 20Gbps because OCD. I haven't had any crashes in the hours since (I only saw one crash at +1000MHz anyway, but I assume my card simply isn't capable of that sort of OC).
 
Did some more testing, this time in Shadow of the Tomb Raider (everything maxed, RT Ultra, 4x SMAA)

Avg FPS was 63 for every run; the only difference was in frames rendered:
+0 : 9891 frames rendered
+500 : 9954 frames rendered
+750 : 10001 frames rendered
+1000 : 10007 frames rendered
+1250 : 10034 frames rendered
+1500 : 10001 frames rendered

So again, we see performance regression at the same memory offset (+1500MHz) as before, but the performance doesn't drop nearly as much as in Heaven.

Seems like Unigine Heaven is the best for picking up memory issues, which is surprising as it only uses ~2-3GB of VRAM while Shadow of the Tomb Raider uses 8-9GB.
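With the average FPS pinned at 63, frames rendered is the cleaner scaling metric here. A few lines of Python (numbers taken from the Tomb Raider runs above) express each offset's gain over stock as a percentage:

```python
# frames rendered per memory offset, from the Shadow of the Tomb Raider runs
frames = {0: 9891, 500: 9954, 750: 10001, 1000: 10007, 1250: 10034, 1500: 10001}
baseline = frames[0]

for offset, rendered in frames.items():
    gain = (rendered - baseline) / baseline * 100
    print(f"+{offset}: {gain:+.2f}%")
```

Even the best case (+1250) is under a 1.5% gain, and +1500 falls back to exactly the +750 result, which is the regression showing up again.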
 
I've been doing 20Gbps for my 24/7 gaming mode from the start, because I read these cards don't scale much beyond 20500MHz effective. I always like to stay safe within a good margin.

My clocks are +130MHz on the core and +500MHz on the memory.

My benching settings are +200MHz core and +800MHz on the memory, but I'm not going to run them again since I'm bored with benching. Just playing lekker games now!
 
Isn't GDDR6X rated between 19Gbps and 21Gbps? Doesn't that mean you could get anything ranging from the lowest to the highest margin?

I'm getting performance drop-off well past the +1250 mark, but the gains from +750 to +1000 are minimal, so I keep it at +750; that looks like the sweet spot. I have never experienced any form of artifacting or crash from memory tuning. I have, however, had lots of crashes from too-aggressive core clocks, especially in gaming.
 
For me, +1000 was fine for days until I had a crash. I'll probably do +500 while gaming to keep it at 20Gbps, and more than likely +750-900 for benchmarking.
 
