I've largely dismissed the strain of AI alarmism based on the notion that a computer will become so smart that the main danger it poses to humanity is outsmarting us.

There are real dangers in AI. Most of them relate to people using these technologies in improper ways because they have a poor understanding of what these systems really are... and, most importantly, the exploitation and degradation of the human body of knowledge and creativity represented by publicly available digital information. 1/

But some people still talk about the risk of AI being too smart: tricking us, simply not delivering what we expect, or using any power it is given in ways that result in manipulation more sophisticated than anything we could anticipate.

Do all of the people who have these fears buy in to that silly bootstrapping theory of AI advancement? (This is the idea that once an AGI becomes "smart" enough to redesign itself, it will spiral off to become... well, a god, basically.)

2/

This idea is an excellent and very fun sci-fi plot. (And sci-fi should never be ignored.)

But is there any evidence that such a thing could even happen? I suppose the "evolved" algorithms used to program motors to help robots walk show there is some possibility. But those algorithm generators were not as simple as the press made them sound. There was a lot more scaffolding.

It wasn't just "we wrote a program to try random motor movements and then it evolved a way to move." 3/

A self-modifying intelligence would need metrics to tell whether it was improving or not.

This would require huge sets of data...and a way to compare them. We can do that, but it's not efficient at all.
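
To make that concrete, here's a minimal sketch in Python of the kind of evolutionary loop behind those robot-gait experiments. All of it is hypothetical and invented for illustration: the `simulate_walk` stand-in, the fitness metric, the mutation scheme. The point is that none of this scaffolding writes itself, and every "generation" of improvement costs another pile of evaluations.

```python
import random

# Toy "self-improvement" loop. The simulator, the fitness metric, and the
# mutation scheme are all scaffolding someone has to design and pay for
# (in data and compute) before any "evolution" can happen.

def simulate_walk(gait_params):
    """Stand-in for a physics simulator that scores how far a gait gets.
    A real simulator is an enormous piece of engineering in itself."""
    return sum(p * random.uniform(0.9, 1.1) for p in gait_params)

def fitness(gait_params):
    # The metric that tells the system whether it "improved".
    # Without something like this, "getting smarter" is undefined.
    return simulate_walk(gait_params)

def mutate(gait_params, rate=0.1):
    return [p + random.gauss(0, rate) for p in gait_params]

population = [[random.random() for _ in range(4)] for _ in range(20)]

for generation in range(50):
    # Every candidate gets re-evaluated each generation; this is where the
    # huge amounts of data and compute actually go.
    scored = sorted(population, key=fitness, reverse=True)
    survivors = scored[:5]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]

print("best score:", fitness(max(population, key=fitness)))
```

Even this toy needs a scoring function and a simulator before anything can "evolve"; scale that up and you see how much the press accounts left out.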

In the 80s I thought a computer doing face recognition was impossible: a photo is just too much data to process.

We've solved this problem in the least exciting way: a kind of brute force. Throw more servers at it. Yes, some of the algorithms are nice... but it's not exactly magic. 4/

The human brain has a stupendous number of cells. It's the most energy-intensive organ in your body, using about 20% of the calories you consume despite being only about 2% of your body mass. This is why so few organisms have complex brains. Intelligence is an excellent strategy, but it is also expensive. Even in nature.
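
As a rough back-of-envelope check (assuming a typical intake of about 2,000 kcal/day; the exact figures vary from person to person):

```python
# Back-of-envelope: what 20% of a ~2,000 kcal/day diet works out to in watts.
KCAL_PER_DAY = 2000          # assumed typical intake
BRAIN_SHARE = 0.20           # roughly 20% goes to the brain
JOULES_PER_KCAL = 4184
SECONDS_PER_DAY = 86_400

brain_watts = KCAL_PER_DAY * BRAIN_SHARE * JOULES_PER_KCAL / SECONDS_PER_DAY
print(f"Brain power budget: ~{brain_watts:.0f} W")   # ~19 W, about a dim light bulb
```

Roughly twenty watts to run everything you are thinking with right now.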

LLMs are much less energy-efficient than your brain.

Thinking and organizing information are real work that requires energy.

(We really are getting down to basics here.) 5/

@futurebird Some companies are already working on harnessing human brain organoids for compute. So there's defs folks trying to understand how to harness brain structure. medicalxpress.com/news/2024-03

@mauve @futurebird The article you linked says nothing about this? It talks about organoids of various tissues, and how they found a way to create something like a vascular system that works well for them. It talks about how this can be used for medical research, for developing new drugs and such. Nothing about the weird dystopian brain-organoid stuff you are talking about.


@enby_of_the_apocalypse @futurebird Sorry, bad article; it was just the first result I found that was vaguely related. I'll see about getting a better link on Monday after I'm back at my pooter.
