Carbonite
South Africa's Top Online Tech Classifieds!
AI my ass


So it begins lol
 
I still think "analogue" microchips as an additional or supplementary processor would be a good thing for optimization and efficiency, but that's still a while away. And to really make use of such a chip, it would have to be adaptive/modular like an FPGA. The aim is multifunction: a single chip that only does languages seems like a good start, but it'll have to do more than that to become a universal standard chip, the way a soundcard used to be on older motherboards.

If a language model can be created and hard-coded into a microchip for, I don't know, the 20 most popular languages in the world, and that chip becomes standard on all motherboards, it could provide seamless translation of anything language-based: emails, webpages, heck, even phone calls. That could eliminate a lot of the additional language prerequisites an operating system currently needs in order to work.

The processing power (silicon- and electricity-wise) needed to develop these models is insane, but once the structure is laid out and hard-coded, it could be worth something as a "pre-computed library" of sorts: more robust and less susceptible to error. I think this is what Nvidia is providing, a modular and expandable platform in which these libraries can be put together, letting us peek at all of these structures.
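The "pre-computed library" idea above can be sketched loosely in software: once built, the translation knowledge is frozen and treated as read-only storage, so runtime work is just lookups, with no path for the data to be corrupted by writes. A real chip would hold frozen model weights rather than a phrase table; the table and language codes below are invented purely for illustration.

```python
# Toy sketch of translation knowledge baked into "ROM": frozen after
# manufacture, read-only at runtime. The phrase table is hypothetical.
from types import MappingProxyType

# MappingProxyType gives a read-only view: attempts to write raise an error,
# mimicking hard-coded silicon.
PHRASE_ROM = MappingProxyType({
    ("en", "fr", "hello"): "bonjour",
    ("en", "de", "hello"): "hallo",
})

def translate(src: str, dst: str, phrase: str) -> str:
    """Look up a translation in the read-only table; pass through if unknown."""
    return PHRASE_ROM.get((src, dst, phrase.lower()), phrase)

print(translate("en", "fr", "Hello"))  # -> bonjour
```

The point of the read-only wrapper is the robustness claim above: a pre-computed structure that cannot be modified in place is one less thing that can silently go wrong.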

Websites would no longer need to be written in 4-5 different languages: you load one webpage, and the chip does the translation on the fly. I know there are already tools out there that do this for existing websites, but I believe a chip like this could remove the need for languages to be part of a software/OS package at all, for audio, written text and more.

Similarly, audio processing and dubbing of movies, television programmes or even educational videos could be done by AI without the Bruce Lee bad-lip-sync effect being a thing. If the model knows how a human's lips move to utter a word in any specific language, the audio can be substituted without dubbing. Even games could benefit (or already are benefitting) from this.

I digress somewhat into future capability, but none of this is a reality with 100% accuracy. Unfortunately humans themselves make mistakes, and until humans can achieve 100% accuracy in software, the hardware will have to wait.

I know chip manufacturers like Asahi Kasei are spending big money on the development of microchips like these.

AlphaFold has already advanced protein structure prediction, made possible by AI. Protein folding is a first step toward understanding how to develop bio-computers, or at least how to integrate and communicate with them on a "neural" or electrical-impulse level.

Humans have this odd talent for making the impossible possible, and I don't think that could ever be replaced. We are definitely not on the verge of cramming humanity into a synthetic brain, but every step toward it, even if it only approaches an asymptote, ultimately makes our lives easier and more consistent. I don't think humans as a species will keep living in a purely biological form. Rather something hybrid.

A single-event upset caused by radiation is currently mitigated by running three computers in tandem: each runs the same program, returns its result, and compares it against the other computers' results. Similarly, peer review and scientific studies into the unknown are conducted independently, and the results are then compared. This is stepping toward quantum computing in its essence, but cracking that code seems like an asymptote.
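The three-computers-in-tandem scheme above is known as triple modular redundancy (TMR): a single radiation-induced bit flip in one unit is outvoted by the other two. A minimal sketch of the majority vote (function name and values are illustrative):

```python
# Sketch of TMR majority voting for single-event upset mitigation:
# three redundant units compute the same result, and the value reported
# by at least two of them wins.
from collections import Counter

def tmr_vote(results):
    """Return the majority value among three redundant results."""
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        # All three disagree: a single-fault assumption no longer holds.
        raise RuntimeError("no majority among redundant units")
    return value

# Example: unit B suffers a bit flip and returns 43 instead of 42.
print(tmr_vote([42, 43, 42]))  # -> 42
```

Note the scheme only masks a single faulty unit; if two units fail the same way, the vote itself is wrong, which is why the fault assumption matters.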

But that's a conspiracy for a different smack talk thread. :LOL:
 
Check this out...
1st light-powered AI chip
 
I suppose this would work, but you'd need specialised lasers and oscilloscopes to actually interpret/read the data accurately.

It's a good start; however, analogue is supposed to draw less than or equal power compared to digital if it's to compete in any way. That's where the hardware improvements need to come in.
 
Some mothers have children :ROFLMAO:

In all seriousness, there's still a lot that needs to happen before "AI" gets out of hand, and if OP is worried about being tracked by "AI", wait till he finds out what Google and Microsoft are up to in terms of engagement tracking and general statistics logging.
 
Anybody else share the sentiment that Nvidia and Intel's whole monopoly over AI is actually just another way to control and document our lives?

Isn't it just insane that an entire industry has switched from providing the most efficient and powerful CPUs to focusing entirely on accommodating AI integration in the next major update of Windows and every other operating system, including iOS?

We saw EA lose 90% of its consumer base by not listening to what their users actually want and throwing out an incomplete Battlefield.
Yeah, I used to be a fanboy.

So where exactly are we heading? Just when you thought maybe AMD wouldn't get brainwashed by Nvidia's stock-market success... oh shit, here they go too, clearly seen in the upcoming release of the new 9000-series CPUs.

Shame. Well, thank God I chose to spend R75,000 on a 14th-gen system, because it seems to be the last gen of freedom.

Rant finished...
Well, not exactly, but for now at least.
Yo, from the perspective of these companies, making these AI chatbots and all the other AI-related bullshit that's recently popped up is just what you do as a big tech magnate in this industry.

The goal of every business in this field is to maximize profit by any means. Google, Apple, Nvidia and basically every big tech company have billions lying around in stocks and company assets that are just sitting there not bringing in profit, and that's a bad thing for a tech giant; the money always needs to keep moving in and out of the company. That's why you see everyone throwing literally half their company's assets at these AI products hoping to 100x their investments. From the viewpoint of the higher-ups, it's either A: we don't invest in AI at all and watch our competition jump literal years ahead of us by using AI commodities to boost their stocks, or B: we throw bands upon bands at AI products till we see major returns, and even if we don't see big returns, our competition will likely have spent just as much as we did on AI, so we aren't necessarily lagging behind them.

So these companies didn't have a choice: it was either jump on the AI hype train or drown in the red. This sht was inevitable ever since AI was introduced; late-stage capitalism got us in a chokehold we ain't ever gonna escape, beyond cooked atp.
Nooo
.
 
