Carbonite

South Africa's Top Online Tech Classifieds!
Home of C.U.D.

NVIDIA Shares Slip on RTX Family Sales

Toxxyc

Nvidia’s new RTX series made news all over and raised expectations for the brand long before it was released. Gamers were excited and miners were hopeful. Everyone had an opinion on launch dates, apparent benchmarks were “leaked” and people were just about to throw punches about misinformation and opinions being spread as fact. After launch though, it seems like Nvidia might have made a bit of a mistake.

The cards certainly didn’t live up to the hype. We were all excited to see the actual performance of the cards that finally broke the GTX naming cycle, and from the internet we pulled information, however wrong it may have been, that promised a “50% increase in performance”. Gamers were expecting almost unrealistic increases in framerates, which would have been extremely welcome, but Nvidia failed to deliver.

The launch card, the RTX 2080, arrived with a massive price tag, especially in South Africa, and when it was revealed that performance is only marginally higher than the last flagship (the GTX 1080Ti), hearts sank. Yes, it’s more powerful, but at $500 more than the 1080Ti, is it worth it? People have been asking this question since launch, and it’s starting to show. Nvidia’s share price has been slipping lately, and while it’s already recovering slightly after the bigger dip, the latest price is still 1.52% in the red.

We’re hoping that later drivers and updates will fix this, or that the prices of these cards will come down, or that the later RTX cards will turn things around for this new “family” of cards, but for now I’m not too excited. At the current exchange rate the RTX series looks like a pipe dream for the budget gamer, and will remain as such for a while still.

nvidia-rtx-share-price-drop-01.jpg
 
There are plenty of market dynamics that determine where a GPU ends up and whether it's a success or not.
The stock market is not where you want to be looking; I wouldn't judge the validity of a product or component from stock price movement.

On August 1st the stock closed at $246 per share, and on the last day of that month it closed at $280.
Nothing of significance happened in August related to any product they have (NVIDIA sees itself as a product company, which I believe is a mistake that may cost them in future).
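For what it's worth, that move works out to roughly a 13.8% gain over the month. A quick sanity check, using the approximate closing prices quoted above rather than official market data:

```python
# Rough percentage change in Nvidia's share price over August,
# using the approximate closing prices quoted in this post
# (not official market data).
open_price = 246.0   # close on August 1st, USD per share
close_price = 280.0  # close on the last trading day of August, USD per share

pct_change = (close_price - open_price) / open_price * 100
print(f"August move: {pct_change:+.1f}%")  # prints "August move: +13.8%"
```

So the stock rose double digits in a month with no product news at all, which is exactly why the price chart tells you little about the RTX cards themselves.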

A few things are happening right now in the market and with NVIDIA.
1. NVIDIA has been asking for more money for its GPU+memory from its 'partners' (more like underlings, as there's zero negotiating with NVIDIA :( ), and in turn they ask for more from us, the end users.

2. RTX cost a fortune to develop, and that cost is obviously going to be paid by the recipients of the GPU technology: you, and even the enterprise/HPC clients. The difference is that those markets can almost always justify the cost through what they gain in compute capability, while you and I and every other Joe Soap use GPU power and technology in a rather limited scope. That is to say, what we get from RTX versus its cost is not balanced at all compared with what their other clients get from RTX technology.

3. GDDR6 has a single vendor (Micron) right now, and only a single client for high-speed GDDR6 as well (NVIDIA). That has a cost implication; be it small or big, this situation will increase prices, lower sales volumes, etc. (until this isn't an issue at all).

4. There's absolutely zero relevant competition for NVIDIA, and that literally allows them to dictate the pricing for X amount of GPU computing power. (Per mm² and per watt, their processors are by far the most powerful available anywhere today.)

5. The product (another reason why being a product company perhaps isn't conducive to success for the GeForce cards) can and should be contextualized, not treated as a single all-purpose, go-to solution for everyone. Which is to say, we must separate the technology from the business and commercial aspects of it.

The long and the short of it is that yes, we are getting screwed on prices, and that sucks. However, whose fault it is and exactly why is a lot more detailed. Regardless of that answer, it won't make our GPUs cheaper. The most direct way to make them cheaper is for NVIDIA to have better competition.
Whether one is an AMD GPU user or an NVIDIA GPU user, having the two at each other's heels helps with pricing (for example, GDDR6 pricing would decrease as a result, which makes the GPUs cheaper). Hopefully Q1 2019 sees some price adjustments for NVIDIA's GPUs courtesy of AMD's updated or new GPU lineup. The situation above may not have changed by then, but at least we'll have an option, which has to make things better for end users.
 
I agree, yes. Specifically on number 4 - I think it might have hurt them in this regard. The RTX series was released not because they had to, but because they wanted to show the world "we can". I just think the general perception was that it was going to be "better", and it bit them in the ass a little bit. Business and product disconnect aside, they definitely have an impact on each other.
 
And yet people still buy Nvidia shares. Why? Because it is a blue-chip, Forbes 500 company ... no other competitor can touch this brand in a hurry.
 
The RTX cards are disappointing ... Firstly, the performance over the previous generation is marginal and nothing to brag about ... Especially when we look at the performance leaps between generations from previous card releases.

Secondly, ray tracing and DLSS are nowhere to be seen. For industry applications, yes, RTX would be welcomed; anything new that cuts down the time to deliver "stuff" to their clients equals money sooner.

The problem lies with (like was said) the Joe Soap gamers ... Like most on this forum.

There simply is no valid reason right now for anyone to sell their 1080s or 1080Tis for a 20-series card. For the most part there will be very little to no perceivable difference in your gaming experience ... The only shallow consolation is knowing you'd have the highest 3DMark scores. We buy GPUs to game, ultimately.

Regarding ray tracing

The reason it's not available is that the performance hit is massive ... It works, we've seen it in a real game, and the only plausible reason it isn't patched in is ... performance. There is no other reason. In light of no competition, Nvidia can get away with this ... Have an entire keynote focussing mainly on some new tech and sell cards based on that new tech ... But the cards can't do any of it. Apple did this when they launched the iPhone X and showcased amazing portrait modes where the entire background was blacked out, making for amazing LinkedIn photos [emoji4] ... Till today that feature is pretty crap and doesn't work that well. I would not be surprised if Tomb Raider gets patched next year for ray tracing ... And a month later they announce 2090 cards showcasing their impressive ray tracing performance over the 2080 cards ... Introduce the new cards at the same price points the 2080 cards were at and drop those cards to almost half their value ... It's Nvidia, and right now they can do whatever the hell they like.

What the gaming industry sorely needs is competition in the GPU market. To the extent that companies should form JVs (joint ventures) to be that competition to Nvidia's monopoly. AMD is not going to do this on their own ... But maybe a JV between AMD and Tesla could (a silly example) ...

Even if that new company or JV steals 20% / 30% market share, that is a hella lot.

When the 1080Ti launched at $699, the only reason it did was that Nvidia needed to ensure the dominance of their brand despite the imminent threat of those Vega cards from AMD ... Which wasn't a threat. But this is what competition creates ... Better prices, better products for us ...

If no company or companies challenge Nvidia .... It will continue to do what it wants ...

For me I'll wait on the sides and see how this plays out ...
 
A note on ray tracing - it'll probably be omitted from the upcoming 2060 cards and reserved exclusively for the 2070 and higher models.
 

Reading while snacking on LINDT
 
You WILL buy all our old Pascal stock before we drop RTX prices:sneaky:
 
Like a wise man once said, there really is no such thing as a bad graphics card, only a bad price.

Played on the RTX 2080 at Rage and my gosh, it is incredible. Even without the whole ray tracing and DLSS side of it, it is a serious powerhouse.
 
What the gaming industry sorely needs is competition in the GPU market. To the extent that companies should form JVs (joint ventures) to be that competition to Nvidia's monopoly. AMD is not going to do this on their own ... But maybe a JV between AMD and Tesla could (a silly example) ...

Are we forgetting that AMD is already a merged company? They swallowed another three-letter company starting with A to become what they are today.
And it didn't help one bit.

The only way for nVidia to fall is to accidentally be left behind on tech, which would require AMD to actually innovate on GPUs again for once. Like back in the DX10.1 days, when nVidia ended up on an outdated DX shader model and Red Team owners were smiling all the way.
 
I’m undecided about the 2080Ti (which is uncanny for me). My CUD wants the 2080Ti but I’m struggling to justify the cost to upgrade from my 1080Ti. I know what I wanna do, but I still don’t know what I’m gonna do...
 
I’m undecided about the 2080Ti (which is uncanny for me). My CUD wants the 2080Ti but I’m struggling to justify the cost to upgrade from my 1080Ti. I know what I wanna do, but I still don’t know what I’m gonna do...

I can help ease that tension: buy the fucking card and throw that 1080Ti to the sharks of this OCEAN
 
AMD for now caters to the masses. They are for the guy who can't pick up the bill for a top-class Nvidia card but still wants to game at good settings, or, like me, just doesn't like Nvidia.

AMD makes good sales because of this and will continue to do so even if Nvidia builds something better after RTX, or whatever they plan for the future.

Even as an AMD fan, I don't think they can bring anything to the table that can compare with 1080ti/RTX. They need to take a good hard look at how they do things if they want to get to that level soon.
 
Are we forgetting that AMD is already a merged company? They swallowed another three-letter company starting with A to become what they are today.
And it didn't help one bit.

The only way for nVidia to fall is to accidentally be left behind on tech, which would require AMD to actually innovate on GPUs again for once. Like back in the DX10.1 days, when nVidia ended up on an outdated DX shader model and Red Team owners were smiling all the way.

They bought a company ... Or gobbled one up rather ... Different to a JV. Truthfully it's a great business opportunity for a company or consortium with deep pockets to do so
 
I’m undecided about the 2080Ti (which is uncanny for me). My CUD wants the 2080Ti but I’m struggling to justify the cost to upgrade from my 1080Ti. I know what I wanna do, but I still don’t know what I’m gonna do...

You will cave ... You will buy it. The moment you fire up a game you will be disappointed ... Been there done that
 
They bought a company ... Or gobbled one up rather ... Different to a JV. Truthfully it's a great business opportunity for a company or consortium with deep pockets to do so

It was still the coming together of two powerhouses of technology...which somehow left them both weaker than when they started.

I mean yes, in a business sense it was something different, but for what it represented to the industry it was every bit the joining of two ventures. It certainly wasn't your usual EA or Actiblizzion studio seizure which is more an example of one company gobbling up another.
 
 
It was still the coming together of two powerhouses of technology...which somehow left them both weaker than when they started.

I mean yes, in a business sense it was something different, but for what it represented to the industry it was every bit the joining of two ventures. It certainly wasn't your usual EA or Actiblizzion studio seizure which is more an example of one company gobbling up another.

I've personally been involved in mergers and acquisitions, and trust me, one company buying another is not always about "combining forces" ... Business is a cruel thing, many times.
 
It's really hard to decide, especially when something new and better comes with a high price. In my opinion, I'll relax before purchasing it until the price becomes fair relative to the performance.
 
I've personally been involved in mergers and acquisitions and trust me one company buying another is not always to "combine forces" ... Business is a cruel thing many times

I know that. But this one was, or else one or the other division would have shut down, not limped on to hopefully receive new life again down the line.
 
I know that. But this one was, or else one or the other division would have shut down, not limped on to hopefully receive new life again down the line.

Yes, I reckon in that instance it was more a case of "I wouldn't mind that IP in ATI ... it could help our chip business". They approached ATI and went "You're in trouble, and I'll make you an offer, because it's the perfect time". Such a corporate action results in the ATI management being gotten rid of "eventually", and like someone above alluded to, the business model for both CPU and GPU is not to compete head-on with the likes of Nvidia and Intel (which might have been obscured by Ryzen, as it was a direct jab at Intel's best) ... Trust me, they only cared about the IP. ATI had the aggressive business model ... But you and your investors need deep pockets when you're playing at that bleeding edge of tech.

So fast-forward to today, and AMD's mainstream "cheaper" options are the order of the day. Over the past year this has been good for their share price, and this model is working for them so far.

Actually, maybe it will be a totally new player (like Huawei in the cellphone market) that takes Nvidia on in that bleeding-edge tier.

The strategies many businesses use are genius and downright "skelm" at times ... Like a company creating a flagship model only to sell a few, just so the brand is perceived as the market leader, which intrinsically increases the sales of every other "lesser" product they sell. And those lesser models are in some instances inferior to competitors' ... or exactly the same.

So my point is ... businesses generally make decisions for reasons other than the ones we think, and in every instance it's a case of "what's in it for me".

Sorry, I just feel like this has become an off-track philosophical discussion on business [emoji4]
 

Sorry, I just feel like this has become an off-track philosophical discussion on business [emoji4]

Yeah let's not Kotick this up too much.

Any new player has one big problem, though: they don't have the machine time. They are going to need an entirely new plant, because there's no capacity to spare among the companies we already have.

Huawei is far from a new player; for years they had a closed-off market of a billion people they could feed off and dominate with no competition. Now they're simply "spreading" the wealth, as it were. They've grown too big for their closed economy and realised they have the capital and means to crush the West (and its Korean allies). Red brother is watching. (But since someone is always watching these days, who cares; their phones are in fact kickass, so whatever.)

And make no mistake, AMD may be content dominating the budget sectors, but they are forever playing the long game at the same time. They are constantly making improvements that seem to yield no real-world benefits, but at some point something's gotta give. Either they'll collapse under R&D no one wants, or they'll crush the two current powerhouses. Every other day I jump between thinking it will be one or the other, but AMD has more to show us, even if that is just a colossal collapse.
 
Ray tracing on RTX isn't slow. Can we find any other piece of silicon that comes remotely close to doing any sort of real-time photon mapping at the rate RTX can? No.
It's important to understand the technology, the philosophy behind it, and what that translates to in a product we may have several objections to.
The scene complexity of the ILM/UE4-based Star Wars demo is hundreds of times higher than anything Tomb Raider or BF can produce. Computationally it's night and day.

The engines that power these games are not, by any stretch of the imagination, designed from the ground up to support real-time ray tracing in the primary or core rendering passes. They are literally patching support in. There's a reason benchmarks exist: no single title can tell you about the underlying hardware objectively. Benchmarks are far from perfect, but they are cognizant of that and therefore written so they scale in a predictable and meaningful way, unlike games, whose purpose is to deliver a playable experience first and foremost.

It would be like making a judgment on an entire API (DX/Vulkan, etc.) based on a single benchmark or title. Just because one title uses a particular iteration of an API doesn't mean what you get from it is a reflection of the API. This applies to the RTX hardware as well. As our hardware becomes more complex, it is important we understand this nuance.

We all know and have seen how much better the graphics the Xbox 360 was putting out near the end of its life were compared to launch. The hardware didn't change; what changed was the developers' ability to extract power from it. In the same way, Killzone 4 was not the definitive example of what the PS4 could do. Look at Horizon Zero Dawn, Detroit or GOW, even Death Stranding. Those are by far more visually impressive and more complex than Killzone 4, and potentially run better as well.

Also, progress in GPU power isn't linear, and it's absurd to expect it to be. The GF3 wasn't much faster than the GF2 Ultra, but it could do things the GF2 couldn't: mainly, it could use the first shader model brought into DirectX. That inflection point is necessary, and it can't be about pushing ever-faster frame rates forever in a straight line.

The ability to do the work at all has to come before measuring how fast that work is done. This is important to understand.

It's important to understand how technological progression works in the silicon and IC space and how that ties into the grander scheme: less focus on the actors, and more on the technology as an entity separate from the business practices.

If we are to critique the RTX 2080 Ti, for example, let's put anything else next to it and then compare. The RTX 2080 Ti is doing in real time what previously took four Titan Vs.
In the Top 10 of the world's most powerful supercomputers, six (including #1) use NVIDIA hardware that, per GPU, isn't computationally equivalent to the RTX 2080 Ti.
I.e. we have supercomputing power available to us; the next move is for developers, not the hardware vendors.

If we say the 1080Ti can play any game right now at the highest detail at FHD and maybe QHD, then of what value would it be to push frame rates even further? Nearly doubling the gate count via more texture samplers, render outputs, caches and CUDA cores would yield phenomenal performance, but even if NVIDIA did that, the Star Wars demo would still need more than two or even three such cards to run at even 24fps.

We can rightfully say real-time ray tracing isn't relevant to the games we play right now, but as a technology it is of the utmost importance, literally on par with the introduction of programmable shaders, if not more so.

It is the pioneering nature of certain semiconductor or technology firms that allows them to lead. How they lead is by constantly forging a new path, something others in the space couldn't do.

Whether we are willing to pay for this forging of a new path is separate from the validity of walking it. We should not conflate or confuse the two.
This could, and does, apply to any pioneering technology. When AMD introduced the x86-64 extensions, their CPUs were hideously expensive, but that was a necessary first step, despite the sometimes marginal gains in performance over the Athlon XP in some instances. AMD had to forge that path and drag us kicking and screaming into 64-bit computing, and here we are today, where virtually every title or piece of software uses a 64-bit executable. Someone had to take that step, and AMD did when Intel was not able or willing to in a backwards-compatible way.
 
