Carbonite

NVIDIA Shares Slip on RTX Family Sales

Ray tracing on RTX isn't slow. Can we find any other piece of silicon that comes remotely close to doing any sort of real-time photon mapping at the rate RTX can? No.
It's important to understand the technology, the philosophy behind it, and what that translates to in a product we may have several objections to.
The scene complexity of the ILM/UE4-based Star Wars demo is hundreds of times greater than anything Tomb Raider or BF can produce. Computationally it's night and day.
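To make that computational gap concrete, here is a back-of-the-envelope ray-budget calculation. The resolution, frame rate, sample and bounce counts below are illustrative assumptions, not measured figures from any of these titles or demos:

```python
# Back-of-the-envelope ray budget for real-time ray tracing.
# All inputs are illustrative assumptions, not measured figures.

def rays_per_second(width, height, fps, samples_per_pixel, rays_per_sample):
    """Total rays that must be traced each second."""
    return width * height * fps * samples_per_pixel * rays_per_sample

# One primary ray per pixel at 1080p, 60 fps:
primary = rays_per_second(1920, 1080, 60, 1, 1)    # ~124 million rays/s

# A more film-like load: a few samples per pixel, each bouncing a few times:
film_like = rays_per_second(1920, 1080, 60, 4, 4)  # ~2 billion rays/s

print(f"{primary / 1e6:.0f} M rays/s vs {film_like / 1e9:.1f} G rays/s")
# prints "124 M rays/s vs 2.0 G rays/s"
```

Even this crude arithmetic shows why a cinematic-quality scene demands an order of magnitude more ray throughput than a game that casts a single ray per pixel for one effect.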

The engines that power these games were not, by any stretch of the imagination, designed from the ground up to support real-time ray tracing in their primary or core rendering passes. They are literally patching support in. There's a reason benchmarks exist: no single title can tell you about the underlying hardware objectively. Benchmarks are far from perfect, but they are written with that in mind and therefore scale in a predictable and meaningful way, unlike games, whose purpose is to deliver a playable experience first and foremost.

It would be like judging an entire API (DX, Vulkan, etc.) on a single benchmark or title. Just because one title uses a particular iteration of an API doesn't mean what you get from it is a reflection of the API. The same applies to the RTX hardware. As our hardware becomes more complex, it is important we understand this nuance.

We all saw how much better the graphics the Xbox 360 was putting out near the end of its life were compared to launch. The hardware didn't change; the developers' ability to extract power from it did. In the same way, Killzone Shadow Fall was not the definitive example of what the PS4 could do. Look at Horizon Zero Dawn, Detroit or God of War, even Death Stranding: those are far more visually impressive and more complex than Killzone Shadow Fall, and they potentially run better as well.

Also, GPU progress isn't linear, and it's absurd to expect it to be. The GeForce 3 wasn't much faster than the GeForce 2 Ultra, but it could do things the GF2 couldn't: chiefly, it supported the first programmable shader model brought into DirectX. That inflection point is necessary; progress can't be about pushing ever-faster frame rates forever in a straight line.

The ability to do work has to come before the measurement of how fast that work is being done. This is important to understand.

It's important to understand how technological progression works in the silicon and IC space and how that ties into the grander scheme: less focus on the actors, more on the technology as an entity separate from the business practices.

If we are to critique the RTX 2080 Ti, for example, let's put anything else next to it and compare. The RTX 2080 Ti is doing in real time what previously took four Titan Vs.
In the top 10 of the world's most powerful supercomputers, six (including #1) use NVIDIA hardware that, per GPU, isn't computationally equivalent to the RTX 2080 Ti.
In other words, we have supercomputing power available to us; the next move belongs to developers, not the hardware vendors.

If the 1080 Ti can already play any game at the highest detail at FHD, and maybe QHD, then of what value would it be to push frame rates even further? Nearly doubling the gate count by adding more texture samplers, render outputs, caches and CUDA cores would yield phenomenal performance, but even if NVIDIA did that, the Star Wars demo would still need more than two or even three such cards to run at even 24 fps.

We can rightfully say real-time ray tracing isn't relevant to the games we play right now, but as a technology it is of the utmost importance, literally on par with the introduction of programmable shaders, if not more so.

It is the pioneering nature of certain semiconductor and technology firms that allows them to lead. They lead by constantly forging a new path, something others in the space couldn't do.

Whether we are willing to pay for this forging of a new path is separate from the validity of walking it. We should not conflate or confuse the two.
This applies to any pioneering technology. When AMD introduced the x86-64 extensions, their CPUs were hideously expensive, but it was a necessary first step, despite the sometimes marginal performance gains over the Athlon XP. AMD had to forge that path and drag us, kicking and screaming, into 64-bit computing, and here we are today, where virtually every title or piece of software ships a 64-bit executable. Someone had to take that step, and AMD did, when Intel was not able or willing to do it in a backwards-compatible way.

I love reading when you respond to threads like this!

Your writing, the flow, the works, it's magical. Thanks :)

I'm done :)
 
Sorry, I had to share. :ROFLMAO:

 
Their shares will rebound when RTX becomes mainstay enough.

Drivers are key. Nvidia once stated that they are a software company because of how much effort goes into drivers and software in general.

RTX gen 2 will be much better bang for buck too.
 
Lol, I am a shareholder, and the dip in the share price has 0% to do with the new cards and 100% to do with the US Treasury announcing higher bond yields. Basically, people are pulling money out of the stock market and putting it into bonds because bonds are super-safe "guaranteed money".