
NVIDIA Shares Slip on RTX Family Sales

adamr

VIP
VIP Supporter
Rating - 100%
100   0   0
Joined
Nov 17, 2010
Messages
3,197
Reaction score
623
Points
2,835
Location
Fourways, Johannesburg
Are we forgetting that AMD is already a merged company? They swallowed another 3-letter company starting in A to become what they are today.
And it helped not a bit at all.

The only way for nVidia to fall is to accidentally be left behind on tech, which would require AMD to actually innovate on GPUs again for once. Like back in the DX10.1 days when nVidia ended up on an outdated DX shader and Red Team owners were smiling all the way.
They bought a company ... Or gobbled one up rather ... Different to a JV. Truthfully it's a great business opportunity for a company or consortium with deep pockets to do so
 

adamr

VIP
VIP Supporter
Rating - 100%
100   0   0
Joined
Nov 17, 2010
Messages
3,197
Reaction score
623
Points
2,835
Location
Fourways, Johannesburg
I’m undecided about the 2080Ti (which is uncanny for me). My CUD wants the 2080Ti but I’m struggling to justify the cost to upgrade from my 1080Ti. I know what I wanna do, but I still don’t know what I’m gonna do...
You will cave ... You will buy it. The moment you fire up a game you will be disappointed ... Been there done that
 

SCHUMI4EVER

Peripheral Phanatic
VIP Supporter
Rating - 100%
18   0   0
Joined
Oct 27, 2016
Messages
6,390
Reaction score
1,860
Points
2,665
Location
Durban, KZN
They bought a company ... Or gobbled one up rather ... Different to a JV. Truthfully it's a great business opportunity for a company or consortium with deep pockets to do so
It was still the coming together of two powerhouses of technology ... which somehow left them both weaker than when they started.

I mean yes, in a business sense it was something different, but for what it represented to the industry it was every bit the joining of two ventures. It certainly wasn't your usual EA or Actiblizzion studio seizure, which is more an example of one company gobbling up another.
 

Nathans drake

Junior Member
Rating - 0%
0   0   0
Joined
Apr 28, 2018
Messages
46
Reaction score
1
Points
235
Age
32
Nvidia’s new RTX series made news all over and raised expectations for the brand long before it was released. Gamers were excited and miners were hopeful. Everyone had an opinion on launch dates, apparent benchmarks were “leaked” and people were just about to throw punches about misinformation and opinions being spread as fact. After launch though, it seems like Nvidia might have made a bit of a mistake.

The cards certainly didn't live up to the hype. We were all excited to see the actual performance of the cards that finally broke the GTX naming cycle, and from the internet we pulled information, however wrong it may have been, that showed a "50% increase in performance". Gamers were expecting almost unrealistic increases in framerates, which would have been extremely welcome, but Nvidia failed to deliver.

The launch card, the RTX 2080, arrived with a massive price tag, especially in South Africa, and when it was revealed that performance is only marginally higher than the last flagship (the GTX 1080 Ti), hearts sank. Yes, it's more powerful, but at $500 more than the 1080 Ti, is it worth it? People have been asking this question since launch, and it's starting to show. Nvidia's share price has been slipping lately, and while it has already recovered slightly after the bigger dip, the latest price is still 1.52% in the red.
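To make the "is it worth it?" question concrete, here is a back-of-the-envelope cost-per-frame sketch. The prices and framerates below are hypothetical placeholders (only the roughly $500 price gap is taken from the paragraph above), not benchmark results, so plug in local pricing and your own averages before drawing any conclusion.

```cpp
#include <cstdio>

int main() {
    // Hypothetical placeholder numbers (USD, average fps); only the ~$500
    // price gap comes from the article, everything else is illustrative.
    const double price_1080ti = 700.0,  fps_1080ti = 100.0;
    const double price_2080   = 1200.0, fps_2080   = 110.0;

    // Dollars paid per frame of average performance.
    std::printf("GTX 1080 Ti: %5.2f $/fps\n", price_1080ti / fps_1080ti);
    std::printf("RTX 2080:    %5.2f $/fps\n", price_2080 / fps_2080);

    // Marginal cost of upgrading: what each extra frame costs.
    std::printf("Upgrade:     %5.2f $ per additional fps\n",
                (price_2080 - price_1080ti) / (fps_2080 - fps_1080ti));
    return 0;
}
```

On placeholder numbers like these, each additional frame from the upgrade costs roughly seven times what the frames you already had did, which is exactly the sticker shock this launch produced.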

We're hoping that later drivers and updates will fix this, or that the prices of these cards will come down, or that the later RTX cards will turn it around for this new "family" of cards, but for now I'm not too excited. At the current exchange rate the RTX series looks like a pipe dream for the budget gamer, and it will remain that way for a while still.

 

adamr

VIP
VIP Supporter
Rating - 100%
100   0   0
Joined
Nov 17, 2010
Messages
3,197
Reaction score
623
Points
2,835
Location
Fourways, Johannesburg
It was still the coming together of two powerhouses of technology ... which somehow left them both weaker than when they started.

I mean yes, in a business sense it was something different, but for what it represented to the industry it was every bit the joining of two ventures. It certainly wasn't your usual EA or Actiblizzion studio seizure, which is more an example of one company gobbling up another.
I've personally been involved in mergers and acquisitions and trust me one company buying another is not always to "combine forces" ... Business is a cruel thing many times
 

Nathans drake

Junior Member
Rating - 0%
0   0   0
Joined
Apr 28, 2018
Messages
46
Reaction score
1
Points
235
Age
32
It's really hard to decide, especially when something new and better comes with a high price. In my opinion, I will hold off on buying until the price becomes fair relative to the performance.
 

SCHUMI4EVER

Peripheral Phanatic
VIP Supporter
Rating - 100%
18   0   0
Joined
Oct 27, 2016
Messages
6,390
Reaction score
1,860
Points
2,665
Location
Durban, KZN
I've personally been involved in mergers and acquisitions and trust me one company buying another is not always to "combine forces" ... Business is a cruel thing many times
I know that. But here it was either that, or one or the other division would have folded, not limped on to hopefully receive new life again down the line.
 

adamr

VIP
VIP Supporter
Rating - 100%
100   0   0
Joined
Nov 17, 2010
Messages
3,197
Reaction score
623
Points
2,835
Location
Fourways, Johannesburg
I know that. But here it was either that, or one or the other division would have folded, not limped on to hopefully receive new life again down the line.
Yes, I reckon in that instance it was more of a "I would not mind that IP in ATI ... it can help our chip business" ... they approached ATI and went "You're in crap and I'll make you an offer, as it's the perfect time". Such a corporate action will result in the ATI management being gotten rid of "eventually", and like someone above alluded, the business model for both CPU and GPU is not to compete head on with the likes of Nvidia and Intel (which might have been obscured with Ryzen, as it was a direct jab at Intel's best) ... Trust me, they only cared about the IP. ATI had the aggressive business model ... but you and your investors need deep pockets when you're playing at that bleeding edge of tech.

So fast forward to today and AMD's mainstream "cheaper" options are the order of the day. Over the past year this has been good for their share price and this model is working for them so far.

Actually, maybe it will be a totally new player (like Huawei in the cellphone market) that takes Nvidia on in that bleeding-edge tier.

The strategies many businesses use are genius and downright "skelm" at times ... like a company creating a flagship model only to sell a few, purely to make the brand be perceived as the market leader, which intrinsically increases the sales of every other "lesser" product they sell. It's strategies like these that businesses use. And those lesser models are in some instances inferior to competitors', or ... exactly the same.

So my point is ... businesses generally don't make decisions for the reasons we sometimes think they do. It is, though, in every instance a case of "what's in it for me".

Sorry, I just feel like this has become an off-track philosophical discussion on business.
 

SCHUMI4EVER

Peripheral Phanatic
VIP Supporter
Rating - 100%
18   0   0
Joined
Oct 27, 2016
Messages
6,390
Reaction score
1,860
Points
2,665
Location
Durban, KZN
Yeah let's not Kotick this up too much.

Any new player has one big problem though: they don't have the machine time. They are going to need an entirely new plant, because there's no spare capacity for them between the companies we already have.

Huawei is far from a new player; for years they had a closed-off market of a billion people they could feed off of and dominate with no competition. Now they're simply "spreading" the wealth, as it were. They've grown too big for their closed economy and realised they have the capital and means to crush the West (and its Korean allies). Red brother is watching. (But since someone is always watching these days, who cares; their phones are in fact kickass, so whatever.)

And make no mistake, AMD may be content dominating the budget sectors, but they are forever playing the long game at the same time. They are constantly making improvements that seem to yield no real-world benefits, but at some point something's gotta give. Either they'll collapse under R&D no one wants, or they'll crush the two current powerhouses. Every other day I jump between thinking it will be the one or the other, but AMD has more to show us, even if that is just a colossal collapse.
 

Gouhan

Forum Addict
VIP Supporter
Rating - 100%
87   0   0
Joined
Apr 22, 2010
Messages
2,608
Reaction score
296
Points
2,435
Ray tracing on RTX isn't slow. Can we find any other piece of silicon that is remotely close to doing any sort of real-time photon mapping at the rate RTX can, anywhere? No.
It's important to understand the technology, the philosophy behind it, and what that translates to in a product to which we may have several objections.
The scene complexity of the ILM/UE4-based Star Wars demo is hundreds of times greater than what Tomb Raider or BF can produce. Computationally it's night and day.

These engines that power these games are not by any stretch of the imagination designed from the ground up to support real-time ray tracing in the primary or core rendering passes. They are literally patching support in. There's a reason benchmarks exist, and it's because any one title can't tell you about the underlying hardware objectively. Benchmarks are far from perfect, but their authors are cognizant of that and therefore write them so they scale in a predictable and meaningful way, unlike games, whose purpose is to deliver a playable experience first and foremost.
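As a rough illustration of what "patching support in" looks like in practice, here is a minimal sketch of the kind of runtime capability check a Windows engine might gate its ray-traced passes behind. The D3D12 calls (D3D12CreateDevice, CheckFeatureSupport, the OPTIONS5 raytracing tier) are the stock API; the fallback structure around them is my own assumption, not any particular engine's code.

```cpp
// Minimal DXR capability probe; build on Windows and link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at feature level 12_0.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device: stay on the existing raster path.");
        return 0;
    }

    // Ask the driver whether DirectX Raytracing (DXR) is exposed at all.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool dxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    // A bolted-on feature: one extra pass toggled by a capability bit,
    // rather than an engine rebuilt around ray tracing.
    std::puts(dxr ? "DXR tier 1.0+: enable the ray-traced effects pass."
                  : "No DXR: fall back to rasterized approximations.");
    return 0;
}
```

The point of the sketch is structural: the ray-traced work sits behind a single capability bit bolted onto an existing renderer, which is roughly why one title's RTX effects tell you more about how that pass was patched in than about the hardware underneath.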

It would be like making a judgment on an entire API (DX/Vulkan, etc.) based on a single benchmark or title. Just because a title uses a particular iteration of an API, it doesn't mean what you get from it is a reflection of the API. This applies to the RTX hardware as well. As our hardware becomes more complex, it is important we understand this nuance.

We all know and have seen how much better the Xbox 360's graphics were near the end of its life compared to when it released. The hardware didn't change; what changed was the developers' ability to extract power from it. In the same way, Killzone 4 for the PS4 was not the definitive example of what the console could do. Look at Horizon Zero Dawn, Detroit or GOW, even Death Stranding. Those are by far more visually impressive and more complex than Killzone 4, and potentially run better as well.

Also, GPU progress in power isn't linear, and it's absurd to expect it to be. The GF3 wasn't much faster than the GF2 Ultra, but it could do things the GF2 couldn't; mainly, it could use the first shader model brought into DirectX. That inflection point is necessary, and it can't be about pushing ever faster frame rates forever in a straight line.

The ability to do work has to come before the measurement of how fast that work is being done. This is important to understand.

It's important to understand how technological progression works in the silicon and IC space and how that ties into the grander scheme: less focus on the actors, more on the technology as an entity separate from the business practices.

If we are to critique the RTX 2080 Ti, for example, let's put anything else next to it and then compare. The RTX 2080 Ti is doing in real time what previously took four Titan Vs.
In the top 10 of the world's most powerful supercomputers, six (including #1) use NVIDIA hardware that, per GPU, isn't computationally equivalent to the RTX 2080 Ti.
I.e. we have supercomputing power available to us; the next move is for developers, not the hardware vendors.

If we say the 1080 Ti can play any game right now at the highest detail at FHD and maybe QHD, then of what value would it be to increase the frame rates even further? Nearly doubling the gate count via the addition of more texture samplers, render outputs, caches and CUDA cores would yield phenomenal performance, but even if NVIDIA did that, this Star Wars demo would still need more than two or even three cards to run at even 24 fps.

We can rightfully say real-time ray tracing isn't relevant to the games we play right now, but as a technology it is of the utmost importance, literally on par with the introduction of programmable shaders, if not more so.

It is the pioneering nature of certain semiconductor or technology firms that allows them to lead. How they lead is by constantly forging a new path, something that others in the space couldn't do.

Whether we are willing to pay for this forging of a new path or not is separate from the validity of walking it. We should not conflate or confuse the two.
This could and does apply to any pioneering technology. When AMD introduced the x86-64 extensions, their CPUs were hideously expensive, but this was a necessary first step, despite the sometimes marginal gains in performance over the Athlon XP. AMD had to forge that path and drag us kicking and screaming into 64-bit computing, and here we are today, where virtually every title or piece of software ships a 64-bit executable. Someone had to take that step, and AMD did when Intel was not able or willing to in a backwards-compatible way.
 

R3aL1Ty

Well-Known member
Rating - 100%
23   0   0
Joined
Aug 18, 2010
Messages
924
Reaction score
99
Points
1,465
Age
25
Location
Brakpan
I love reading when you respond to threads like this!

Your writing, the flow, the works: it's magical. Thanks :)

I'm done :)
 

JollyJamma

Official Forum Dunce
VIP Supporter
Rating - 100%
40   0   0
Joined
Aug 22, 2011
Messages
5,020
Reaction score
1,357
Points
3,255
Location
Johannesburg, Parkhurst
Their shares will rebound when RTX becomes enough of a mainstay.

Drivers are key. Nvidia once stated that they are a software company because of how much effort goes into drivers and software in general.

RTX gen 2 will be much better bang for buck too.


 

Prosp3ctus

Junior Member
Rating - 100%
22   0   0
Joined
Dec 9, 2015
Messages
342
Reaction score
10
Points
685
Lol, I am a shareholder, and the dip in the share price has zero to do with the new cards and 100% to do with the US Treasury announcing higher yields on bonds. Basically, people are moving money out of the stock market and into bonds because they are super-safe "guaranteed money".
 
