Omega_Red_ZA
Epic Member
"R7.8 trillion. That's about $2.50 right?"
So it begins lol
They should have called them LLMs instead of AI, but I guess the hype filled a lot of pockets.
This sums it up nicely for me:
"Check this out..."
I still think "analogue" microchips as an additional or supplementary processor would be a good thing for optimization and efficiency, but this is still a while away. And to really make use of this chip, it would have to be adaptive/modular like an FPGA. The aim is multifunction: having a single chip do only language seems like a good start, but it'll have to do more than that to become a universal standard chip, like a soundcard used to be on older motherboards.
If a language model can be created and hard-coded into a microchip for, I don't know, the 20 most popular languages in the world, and that chip becomes standard on all motherboards, it could provide seamless translation of anything language-based: emails, webpages, heck, even phone calls. That could eliminate a lot of the additional language support an operating system currently needs in order to work.
The processing power (silicon- and electricity-wise) needed to develop these models is insane, but once the structure is laid out and hard-coded, it could be worth something as a "pre-computed library" of sorts, one that is more robust and less susceptible to error. I think this is what Nvidia is providing: a modular and expandable platform in which these libraries can be put together, and which lets us peek at all of these structures.
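Just to make that "pre-computed library" idea a bit more concrete, here's a rough software-side sketch of freezing a trained model into a fixed artifact. Purely illustrative: it assumes PyTorch and ONNX export support are installed, and the tiny model is a dummy stand-in, not a real LLM. A hard-wired chip would obviously take this much further than a file on disk.

```python
# Illustrative only: "freeze" a trained model into a fixed, pre-computed artifact.
# Assumes PyTorch (with ONNX export support) is installed; the model is a toy stand-in.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Embedding(1000, 64),   # toy vocabulary of 1,000 tokens
    nn.Flatten(),
    nn.Linear(64 * 8, 1000),  # expects sequences of exactly 8 tokens
)
model.eval()

example_input = torch.randint(0, 1000, (1, 8))  # one dummy 8-token sequence

# The exported file contains the structure and weights as static data, so any
# compatible runtime (or, in principle, dedicated hardware) can run it without
# ever touching the training pipeline again.
torch.onnx.export(model, example_input, "frozen_model.onnx", opset_version=17)
```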
There would no longer be a requirement for websites to be written in 4-5 different languages: you load one webpage and the chip does the translation on the fly. I know there are already tools out there that do this for existing websites, but I believe a chip like this could replace the need for any languages to be part of a software/OS package, for audio, written text and more.
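For what it's worth, that on-the-fly translation step already looks roughly like this in software with a small local model. A minimal sketch, assuming the Hugging Face transformers library (plus sentencepiece) and the Helsinki-NLP/opus-mt-en-de checkpoint; a dedicated chip would just be doing the same job in silicon instead of in Python.

```python
# Minimal sketch of on-the-fly English-to-German translation with a small local model.
# Assumes: pip install transformers sentencepiece torch
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")

text = "This page is only available in English."
print(translator(text)[0]["translation_text"])  # prints the German translation
```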
Similarly, audio processing and audio dubbing of movies, television programs or even educational videos can be done by AI without the Bruce Lee bad-lip-sync effect being a thing. If the model knows how a human's lips move to utter a word in any specific language, the audio can be substituted without dubbing. Even games could benefit (or already are benefitting) from this.
I digress somewhat into future capability, but none of this is a reality with 100% accuracy yet. Unfortunately, humans themselves make mistakes, and until humans can achieve 100% accuracy in software, the hardware will have to wait.
I know chip manufacturers like Asahi Kasei will be spending big money on the development of microchips like these.
AlphaFold has already pushed protein structure prediction further, and that was made possible by AI. Protein folding is the first step toward understanding how to develop bio-computers, or at least how to integrate and communicate with them on a "neural" or electrical-impulse level.
Humans have this odd talent for making the impossible possible, and I don't think that could ever be replaced. We are definitely not on the verge of cramming humanity into a synthetic brain, but every step toward it, even if it only ever approaches an asymptote, ultimately makes our lives easier and more consistent. I don't think humans as a species will continue living in a purely biological form; rather something hybrid.
A single-event upset caused by radiation is currently mitigated by running three computers in tandem, each running the same program and returning a result, and comparing each result against the others'. Similarly, peer reviews and scientific studies into the unknown are done independently, and the results are then compared. This is stepping into quantum computing in its essence, but cracking that code seems like an asymptote.
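That three-computers-in-tandem scheme is usually called triple modular redundancy. Here's a minimal sketch of the voting part, with the three "computers" simulated as three calls to the same function and one result deliberately corrupted to stand in for a radiation-induced bit flip:

```python
# Minimal sketch of triple modular redundancy (TMR) with majority voting.
from collections import Counter

def majority_vote(results):
    """Return the value that at least two of the three replicas agree on."""
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("No majority: all replicas disagree")
    return value

def compute(x):
    return x * x  # stand-in for the real workload

# Simulate one replica suffering a single-event upset (a flipped bit in its result).
replicas = [compute(7), compute(7), compute(7) ^ 0b100]  # 49, 49, 53
print(majority_vote(replicas))  # 49 -- the upset result gets voted out
```

The catch is that the voter itself becomes a single point of failure, which is why real fault-tolerant systems keep it as simple as possible or replicate it as well.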
But that's a conspiracy for a different smack talk thread.
"Check this out... 1st Light powered AI Chip"
I suppose this would work, but you'd have to have specialised lasers and oscilloscopes to actually interpret/read the data accurately.
Yo, from the perspective of these companies, making these AI chatbots and all the other AI-related bullshit that's recently popped up is just a result of being a big tech magnate in the industry. Anybody else share the same sentiment that Nvidia and Intel's whole monopoly over AI is actually just another way to control and document our lives?
Isn't it just insane that an entire industry has switched from providing the most efficient and powerful CPUs to shifting the entire focus to damn accommodating AI integration in the next major update of Windows and all the other operating systems, including iOS?
We saw EA lose 90% of its consumer base by not listening to what their users actually want and throwing out an incomplete Battlefield.
Yeah, I used to be a fanboy.
So where exactly are we heading? Just when you thought maybe AMD wouldn't get brainwashed by Nvidia's stock market success... oh shit, here they go too, as clearly seen by the upcoming release of the new 9000 series CPUs.
Shame. Well, thank God I chose to spend R75,000 on a 14th gen system, because it seems to be the last gen of freedom.
Rant finished...
Well, not exactly, but for now at least.