Carbonite
NVIDIA RTX 40 series thread

I bought my previous XFX 6800XT in Nov 20 for R17,7k, so the 7900XTX/XT prices do seem more "grounded". It does appear that local retailers took advantage of the 4000 series release, as the price difference between the 7900XTX and the 4000 series can't be explained by the USD 200 MSRP difference alone (vs the 4080). Either way, all the facts are now on the table and you can pick the card you want. Whatever you get, enjoy it!
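For what it's worth, a quick back-of-the-envelope on that USD 200 gap. The exchange rate and the VAT multiplier are my own assumptions for illustration, not retailer figures:

```python
# Rough sketch: what a $200 MSRP gap "should" look like in rand.
# zar_per_usd is an assumed late-2022 rate; 1.15 covers 15% VAT only,
# so real landed costs (shipping, margin) will differ.
usd_gap = 200          # 7900 XTX ($999) vs 4080 ($1,199) MSRP difference
zar_per_usd = 17.5     # assumption
vat = 1.15             # South African VAT

expected_local_gap = usd_gap * zar_per_usd * vat
print(f"R{expected_local_gap:,.0f}")  # → R4,025
```

Anything much beyond that in local shelf prices isn't explained by the MSRP difference alone.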

@Fluke Keen to hear what you settle on.
 
Probably going to be waiting for the 7000 and 4000 series to fully be fleshed out before deciding. You have to keep in mind I stuck on a 2500k for nearly 8-9 years and my 1080 Ti I've had since 2017 too, so I tend to be overly picky with hardware lol
 
As long as you're not still on the 2500K
 
They saw what people were willing to pay during the covid/mining pandemic, so now they've gone off the fucking rails.

The 4090 is the only card really making sense. But even then, the actual pricing should've been like this:

4090 : $1199
4080 : $799
4070 Ti : $599
 
On Wootware now... for slightly better than 3090 Ti performance at 250 (?) watts, it doesn't seem bad.

 
GPU prices in this country make 0 sense. AMD cards at the same price as the 4080, and the 4070 Ti more than a 4080. Who buys these things at these prices? I get buying a 4090 (frankly it's cheaper than any previous top card has been for us, so it must even look like a steal to some) if you want the best and have the money, sure, why not, but everything below it is a joke.
 

Want to see another joke regarding local pricing?
All MBA (Made By AMD) reference-design cards come from the same production run: they are from AMD, and the AIB just puts its name on the box and sells it. Yet look at the price difference between them.
So the two cards below are from the same factory, designed by the same people, made using the exact same components, but somehow having a Gigabyte box on the outside deserves an extra R8k.

Local GPU pricing really is wacky sometimes.



Source: Statement from PowerColor regarding reference cards
 
You usually get an extra year of warranty with Gigabyte vs XFX, but I agree, still not worth the R8k brand-name tax.
 
They've really lost it:

Just go buy a fucken 4080, like why would one ever do this purely for the brand aesthetic? xD
Imagine the chaos if it was still called a 4080 12GB :eek:
 
Something I haven't seen much discussion of is multi-monitor support. The 30 series could do up to 8K at 60 FPS, but it didn't handle multiple monitors at higher refresh rates and lower resolutions very well (or there was some specific math I didn't quite understand). Has that gotten any better with this generation, or will we need to wait on that?
So I see people say this, but honestly I haven't had any multi-monitor issues since my 8600 GT. I've run some weird monitor combos too, my 1080 Ti has no issues with any combination of the following:

1x 1440p165hz g-sync monitor via DP
1x 4k60 monitor via HDMI
1x Index (2880p144hz) via DP

1x 4k60 TV via HDMI
1x 1280x1024 Apple monitor via DVI from like 2006
1x Rift CV1 (1600p90hz) via HDMI
1x 1080p60 crappy lenovo monitor

Currently only using the first three displays listed, with the TV thrown in sometimes via a long-ass HDMI cable. The only "issue" multi-monitor will have on (any) GPU setup is that the 2D clocks and idle memory clocks will be higher, and so will the power draw and temps. It's not really an issue: there's more stuff to render/draw, so the card raises its clocks and power draw to do so...
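If you want to see this for yourself, `nvidia-smi --query-gpu=clocks.gr,clocks.mem,power.draw --format=csv,noheader` prints those clocks and the power draw. A minimal sketch for parsing its CSV output; the sample readings below are illustrative, not real measurements:

```python
# Parse one CSV line from nvidia-smi's --query-gpu output so you can
# compare idle clocks/power before and after attaching extra displays.
def parse_gpu_stats(csv_line: str) -> dict:
    gr, mem, power = [field.strip() for field in csv_line.split(",")]
    return {
        "core_clock_mhz": int(gr.split()[0]),  # e.g. "210 MHz" -> 210
        "mem_clock_mhz": int(mem.split()[0]),
        "power_w": float(power.split()[0]),    # e.g. "28.5 W" -> 28.5
    }

single = parse_gpu_stats("210 MHz, 405 MHz, 28.5 W")   # illustrative: one display
multi = parse_gpu_stats("1340 MHz, 5001 MHz, 55.0 W")  # illustrative: three displays
print(multi["power_w"] - single["power_w"])            # extra idle draw
```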
 
Been the same for me between W10 and 11, honestly 11 is basically 10 but with a GUI update and nice-to-have features that people have been requesting for ages.
 
New DLSS 3 patch out for Cyberpunk … 152 FPS avg on the benchmark for me … feels just wrong to be seeing such high FPS in Cyberpunk with everything maxed out
 

Don't worry, the Overdrive Raytracing patch will come and bring it back down again lol.


My 9900K is bottlenecking me like crazy at 1440p. Very fast but in that marketplace in Cyberpunk it can have drops to 80fps with CPU usage in the 90s. Forspoken was the worst, like dips into the mid 60s in combat with like 60% GPU usage.

COD Warzone also regularly hovers around 110-130fps with 50-70% usage in the cities but no problem when the buildings are scarce.

This poor 9900K is trying its best, but I think I need to make the move to a new i7 and DDR5 soon (trying to hold out for the Intel Arrow Lake stuff late next year that's supposed to be 25-40% faster with even more cores).
 
AMD X3D … that's out soon … that's what I'm upgrading to … but it's basically an entire new PC I'll have, and I thought why not change everything … case included 🙈 (so it feels new)
 
Managed to get my hands on a Palit GameRock 4090. The power consumption never reaches 400 W, but it's extremely fast.
 
My 9900K is bottlenecking me like crazy at 1440p. Forspoken was the worst, like dips into the mid 60s in combat with like 60% GPU usage.

So, got that 13700K + DDR5-6400 to pair with the RTX 4090....

Forspoken combat went from mid 60s to over 105 FPS.
Dead Space went from 80fps to 140FPS.
Warzone runs comfortably at over 140fps.
Cyberpunk maxed out is finally smooth without Frame Generation enabled.

Massive upgrade, much higher GPU usage across the board. Only downside is the new parts from Intel/AMD run much hotter than I'm used to.
 
The 5800X3D can still keep me company for a while. I'm definitely in for 14th-gen Intel, and then a bump to 15th later. New mobos from 14th gen, so 15th should fit just fine.
 

14th gen sounds like a bust. Many leakers/insiders claiming the top desktop Meteor Lake parts have been dropped in favour of laptop versions. Raptor Lake refresh for Desktop instead.

Arrow Lake is when it gets good, I couldn't wait till the end of next year for it though.
 
This 5800X3D doesn't hold the 4090 back at all at 4K, so it just means I can chill for a while. That 9900K to 13700K is a lekker upgrade though, especially the DDR5 version.
 

Yep. 9900K +1440p is very different to 5800X3D +4K.

In Warzone with the render res set to 4K I was getting the same FPS as when it was set to 1440p, then I knew the 9900K was bottlenecking badly :LOL:
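That render-resolution trick is a handy bottleneck test in general. As a rule of thumb (the threshold and numbers here are my own, purely illustrative):

```python
# If FPS stays ~flat when you raise render resolution (more GPU work),
# the GPU wasn't the limiter -- the CPU was.
def likely_cpu_bound(fps_low_res: float, fps_high_res: float,
                     tolerance: float = 0.05) -> bool:
    """True when FPS changed by no more than `tolerance` despite the res bump."""
    return abs(fps_low_res - fps_high_res) / fps_low_res <= tolerance

print(likely_cpu_bound(120, 118))  # flat from 1440p to 4K: CPU-bound (True)
print(likely_cpu_bound(120, 75))   # FPS drops with resolution: GPU-bound (False)
```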
 
With RTX VSR, GeForce RTX 40 and 30 Series GPU users can tap AI to upscale lower-resolution content up to 4K, matching their display resolution. The AI removes blocky compression artifacts and improves the video’s sharpness and clarity.

Just like putting on a pair of prescription glasses can instantly snap the world into focus, RTX Video Super Resolution gives viewers on GeForce RTX 40 and 30 Series PCs a clear picture into the world of streaming video.

RTX VSR requires a GeForce RTX 40 or 30 Series GPU and works with nearly all content streamed in Google Chrome and Microsoft Edge.

The feature also requires updating to the latest GeForce Game Ready Driver, available today, or the next NVIDIA Studio Driver releasing in March. Both Chrome (version 110.0.5481.105 or higher) and Edge (version 110.0.1587.56) have updated recently to support RTX VSR.

 

Ok 4090 finally going to be challenged 😀

Just came out an hour ago; gave it a try all maxed out with path tracing. Looks amazing.

1440p + DLSS Quality + Frame Generation = 105-115fps avg on my 4090.
1440p + DLSS Quality = 60-65fps avg.
1440p Native + Frame Generation = 55-65fps avg.
1440p Native = 30-35fps avg.
But could probably free up 5-10fps if I dialed back SSR from Psycho to Ultra, and maybe clouds and volumetrics down one notch.

Eina. But to be expected, menu says it requires a minimum of a 4070Ti or 3090 and I can see why.
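Taking the midpoints of those ranges, the rough scaling each feature buys on a 4090 at 1440p works out like this (simple arithmetic on the numbers above, not a separate benchmark):

```python
native = (30 + 35) / 2          # 32.5 fps, path tracing at native 1440p
dlss_quality = (60 + 65) / 2    # 62.5 fps
frame_gen = (55 + 65) / 2       # 60.0 fps
dlss_plus_fg = (105 + 115) / 2  # 110.0 fps

print(f"DLSS Quality alone: {dlss_quality / native:.2f}x")  # → 1.92x
print(f"Frame Gen alone:    {frame_gen / native:.2f}x")     # → 1.85x
print(f"DLSS + Frame Gen:   {dlss_plus_fg / native:.2f}x")  # → 3.38x
```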
 
