Nvidia 4080 Super vs AMD 7900 XTX

fxizel

Good day,

I am going to be purchasing a gaming PC soon and need some advice. I am stuck between the 4080 Super and the 7900 XTX; I've heard a lot of good things about both but I can't seem to decide. I mainly play Call of Duty, but I'd also like this to be a bit of a work machine; I work in tech as a software engineer (I know, ironic that I can't choose).

Any tips would be great!
 
I am pairing this with an i9-14900KF.
 
At the same price, I'd go for the 4080 Super. You'll get way better RT if you're interested in experimenting with that, and it also seems a bit faster.

My 5700XT is starting to make me itch. :/
 
If money is no object, 4080 Super. If money is an object, 7900XTX.
 
I'm an AMD fanboy through and through, but for the superior ray tracing and, more importantly, the efficiency, I'd go 4080 Super.

I'm not comfortable with a GPU pulling close to 400 watts.
 
I'm not comfortable with a GPU pulling close to 400 watts.
Uh, why not?

Most midrange and above GPUs pull in and around 400 watts with an overclocked 4090 pulling way way more than that.

I'm not comfortable with a CPU pulling that kind of wattage with a tiny surface area for cooling but hey, here we are with Intel 14th Gen.

Edit: Also, Ray-Tracing is a gimmick at the moment and unless you have to have it, not worth the performance loss.

I would get Nvidia if I was streaming, though, as there are nice-to-have features with the latest-gen Nvidia cards, but if all I was doing was gaming, I'd stick with AMD.

[Attached chart: performance matchup vs 7900 XTX at 4K]
 
Uh, why not?

Most midrange and above GPUs pull in and around 400 watts with an overclocked 4090 pulling way way more than that.

I'm not comfortable with a CPU pulling that kind of wattage with a tiny surface area for cooling but hey, here we are with Intel 14th Gen.
I'm not sure what you mean?
Midrange cards are all pulling between 200 and 300 watts? (4060, 4070, 4070 Ti, 7700 XT, 7800 XT)

A few high-end cards are just over the 300 mark. The 7900 XT is around 320 W and the 4080 roughly 310 W?

I had the XFX 7900 XTX and it pulled 400 W.
Saw reviews of the Sapphire Nitro and it pulled the same (even over 400 W).

I'm also using an AMD CPU, so the power draw is a lot less than Intel.
 
The 7900 XTX will give you the most FPS for Call of Duty, but there are a lot of nice features with the Nvidia cards. Personally I would go 4080 Super if I had the choice, especially when people use the words "work" and "GPU" together. Mind if we ask what type of work you will do with it?
 
I'm not sure what you mean?
Midrange cards are all pulling between 200 and 300 watts? (4060, 4070, 4070 Ti, 7700 XT, 7800 XT)

A few high-end cards are just over the 300 mark. The 7900 XT is around 320 W and the 4080 roughly 310 W?

I had the XFX 7900 XTX and it pulled 400 W.
Saw reviews of the Sapphire Nitro and it pulled the same (even over 400 W).

I'm also using an AMD CPU, so the power draw is a lot less than Intel.
Yes, I know. I'm perplexed as to why you're not comfortable with a GPU pulling close to or above 400 watts?

Is it because it'll max out your PSU?

I see the wattage draw during gaming is less than, say, running FurMark, so most cards won't be power hogs in normal use, but I'm confused as to why drawing 400 watts or more is an issue.

I know that the midrange Nvidia cards are more power efficient, I just wonder why that's such a major consideration.
 
I'm confused as to why drawing 400 watts or more is an issue.
This is why we are back on stage 4 loadshedding :ROFLMAO:

Aside from consumption, that heat also needs to go somewhere. My 6900 easily chows ~300 W depending on title and settings, and my exhaust air is toasty. So his concern could be cost of running, inverter load, or case heat management?

Another plus for the 4080 is that if you ever want to start playing around with any machine learning things, I believe the support is much better on Nvidia than on AMD.
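
If you do end up dabbling in that, a quick sanity check that the card is being picked up looks something like this (a minimal sketch, assuming a PyTorch install with CUDA support; on an AMD card you'd need the ROCm build of PyTorch instead):

import torch

# Check whether PyTorch can see the GPU and report which one it found
if torch.cuda.is_available():
    print("GPU available:", torch.cuda.get_device_name(0))
else:
    print("No GPU visible to PyTorch, falling back to CPU")

Either card can be made to work, but CUDA-only libraries are where the Nvidia advantage really shows up.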
 
Maybe because electricity is not free unless you live in Soweto...
That’s fair.

Nvidia does hold that advantage over AMD.

I’d do a slight undervolt as well to improve efficiency.

Hell, with Ryzen 5000, a slight undervolt can improve performance as well.
 
Struggled with a similar choice earlier. Opted for Nvidia due to an awesome sale, otherwise I would probably still be on the fence.

Short perspective from both sides:
4080: great thermals, quiet, looks sick at 3 slots; for some reason Nvidia works better with FreeSync than the 6900 XT did (BenQ Mobiuz).
AMD GPU: cost/performance is great, and AMD's software is kilometers ahead of Nvidia's (Nvidia's looks stuck on Windows XP).

I guess you will be happy with either. Review the specific models you are considering inside and out, and hopefully along the way you will be convinced which one to buy. :)
 
Yes, I know. I'm perplexed as to why you're not comfortable with a GPU pulling close to or above 400 watts?

Is it because it'll max out your PSU?

I see the wattage draw during gaming is less than, say, running FurMark, so most cards won't be power hogs in normal use, but I'm confused as to why drawing 400 watts or more is an issue.

I know that the midrange Nvidia cards are more power efficient, I just wonder why that's such a major consideration.
So look back at 2016 or 2017 (can't remember exactly) when the 1080 Ti came out.
That card was a monster and still fares well today, and it drew only 250 W.
It's like the 4090 of today.
400 W is too much, electricity-wise.
Also, when I was playing a normal AAA title the XTX drew 400 W.


Like @CarboVan mentioned, heat is also an issue and makes gaming in summer not so enjoyable in the furnace room.
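
Rough numbers on the electricity side, purely as an illustration (the tariff and hours here are assumptions, adjust for your own setup):

# Back-of-the-envelope cost of an extra ~150 W of GPU draw vs an older ~250 W card
tariff_rand_per_kwh = 3.50      # assumed municipal rate in R/kWh
hours_per_day = 4               # assumed gaming time per day
extra_watts = 400 - 250         # XTX-class draw vs a 250 W card

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"Roughly R{extra_kwh_per_month * tariff_rand_per_kwh:.0f} extra per month")  # ~R63

Not a fortune by itself, but it adds up over years of ownership, and every one of those extra watts also ends up in the room as heat.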
 
That card was a monster and still fares well today, and it drew only 250 W
It's important to remember boost algorithms and power targets have changed a lot.

Ever since the 30 series, manufacturers have been going full send on power targets to try and edge each other out, because of how tight Nvidia is with controlling board design, price, etc. It's why EVGA decided to exit the GPU market even though it was by far their biggest revenue generator. This means most GPUs are running way beyond the sweet spot on the voltage/frequency curve.

That being said, it doesn't mean that if your card has a 400 W power limit you need to run it at that. I have found that I can run the GPU undervolted, which drops power usage from 400 W to 250-270 W and about 8-10°C cooler, while still keeping the stock boost clock.
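
If you want to see what your own card actually draws while you dial that in, something like this works on the Nvidia side (a rough sketch, assuming the nvidia-ml-py package is installed; AMD cards would need a different tool such as HWiNFO or the driver's own overlay):

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Log board power draw once a second while a game or benchmark runs
for _ in range(60):
    watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts
    print(f"{watts:.0f} W")
    time.sleep(1)

pynvml.nvmlShutdown()

Run it alongside your usual game and you can see straight away how much a given undervolt actually saves.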
 
It's important to remember boost algorithms and power targets have changed a lot.

Ever since the 30 series, manufacturers have been going full send on power targets to try and edge each other out, because of how tight Nvidia is with controlling board design, price, etc. It's why EVGA decided to exit the GPU market even though it was by far their biggest revenue generator. This means most GPUs are running way beyond the sweet spot on the voltage/frequency curve.

That being said, it doesn't mean that if your card has a 400 W power limit you need to run it at that. I have found that I can run the GPU undervolted, which drops power usage from 400 W to 250-270 W and about 8-10°C cooler, while still keeping the stock boost clock.
What card are you doing this with?
An XTX?
 
More curious than anything, is it the value proposition that's bad?
Power draw is nonsensical, the clock modulation is nonsensical, the amount of cards with strange issues is nonsensical, the drivers have practically regressed since 6000-series, and you lose out on an objectively superior architecture in Lovelace. Furthermore, you lose NVIDIA’s entire suite of apps and features. I don’t know what the value proposition would need to be for me to justify picking a 7900 XTX over a 4080 Super. If it’s only 20% cheaper, I’d still recommend the 4080 Super.
 
Power draw is nonsensical, the clock modulation is nonsensical, the amount of cards with strange issues is nonsensical, the drivers have practically regressed since 6000-series, and you lose out on an objectively superior architecture in Lovelace. Furthermore, you lose NVIDIA’s entire suite of apps and features. I don’t know what the value proposition would need to be for me to justify picking a 7900 XTX over a 4080 Super. If it’s only 20% cheaper, I’d still recommend the 4080 Super.
I recall you actually had the 7900 XTX at one point.
 
