What's missing in modern AI?
Moderator: Alyrium Denryle
We have AI programs that can learn new words, recognize objects, find patterns, almost emote, and yet none of them seem to be anywhere near "conscious". Something seems lacking, something human. Kismet, an emoting object-recognition AI, gets angry when you ignore him, gets bored, all sorts of things, and yet he doesn't feel quite human; he just does what he's told to do, very robot-like. I suppose that's appropriate, since he is after all a robot, but something still feels missing, something is still wrong, because he's very lifelike but doesn't quite convince you.
So what's missing? Well, I think it's the lack of situational awareness. I don't think Kismet has the ability to act according to what's going on. Sure, he watches you move, names plush horsies and balls, and even mimics you sometimes, but he doesn't say "what's that? Have I seen that before?" or show anything related to unfamiliarity, such as curiosity or fear. I also don't think Kismet has any ability to find new things to learn about. While this is mostly an issue of immobility, Kismet doesn't say "teach me about something new" or anything like that.
There's probably also an issue of response to stimuli. Even if Kismet's happiness is digital, he doesn't act on it, seek out things he enjoys, or shun things that anger him. He also doesn't have the ability to imagine things; he can't think "well, this ball is rolling like this, so it will end up over there" or "if I say something to Rodney that he doesn't like talking about, he won't answer" or anything remotely like that.
I doubt it would be easy to add such features, but I honestly think these are some fundamentally human things that need to be thought about when making an AI.
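To make the "reacting to unfamiliarity" point a bit more concrete, here's a minimal sketch of the kind of thing I mean: an agent that compares each new percept against what it has already seen and flags anything far from its memory as unfamiliar, which could then trigger a curiosity (or fear) response. This isn't how Kismet actually works; the feature vectors and the threshold are made up purely for illustration.

```python
# Toy sketch: flag unfamiliar percepts so the robot can react with "curiosity".
# Features and threshold are invented for illustration, not taken from Kismet.
import math

class NoveltyDetector:
    def __init__(self, threshold=1.0):
        self.known = []              # feature vectors of things seen before
        self.threshold = threshold

    def observe(self, features):
        """Return 'familiar' or 'unfamiliar', then remember the percept."""
        if self.known:
            nearest = min(math.dist(features, k) for k in self.known)
        else:
            nearest = float("inf")   # nothing seen yet, so everything is new
        self.known.append(list(features))
        return "unfamiliar" if nearest > self.threshold else "familiar"

detector = NoveltyDetector()
print(detector.observe([0.9, 0.1]))    # unfamiliar -> "what's that?"
print(detector.observe([0.85, 0.15]))  # familiar   -> "seen that before"
```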
Yes! We have a soul! But it's made of lots of tiny robots.
Memory.
The human brain can hold a shitload of information, far more than you can put in your basic hard drive. Because of this, you and I are capable of establishing CONTEXT for what's around us. "What is this? It's not an apple, an orange, a banana, or a watermelon... etc. etc." You can't just put a dictionary or an encyclopedia into something's memory banks. It has to create its own context and opinion, experience, definition, etc. of an object, situation, occurrence, mood, taste, what-have-you.
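One rough way to picture the difference between a pre-loaded encyclopedia and self-built context (the names and structure below are invented for illustration, not any real system): instead of storing a fixed definition, the machine keeps a record of its own encounters with each object and derives the object's "meaning" from that history.

```python
# Toy sketch: context built from the machine's own encounters with an object,
# rather than a pre-loaded dictionary definition. All names are illustrative.
from collections import defaultdict

class ExperientialMemory:
    def __init__(self):
        self.encounters = defaultdict(list)   # object label -> experiences

    def record(self, label, situation, outcome):
        self.encounters[label].append((situation, outcome))

    def context_of(self, label):
        """An object's 'meaning' is just the history of what happened with it."""
        history = self.encounters.get(label)
        if not history:
            return f"{label}: no experience yet (not an apple, not an orange...)"
        return f"{label}: " + "; ".join(f"{s} -> {o}" for s, o in history)

mem = ExperientialMemory()
mem.record("ball", "rolled it", "went under the couch")
mem.record("ball", "squeezed it", "bounced back")
print(mem.context_of("ball"))
print(mem.context_of("durian"))
```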
The Great and Malignant
- TheDarkOne
- Youngling
- Posts: 135
- Joined: 2002-07-08 07:43pm
- Location: UBC
Sheer scale.
They just aren't complex/big enough yet. By at least a dozen or so orders of magnitude.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
- Admiral Valdemar
- Outside Context Problem
- Posts: 31572
- Joined: 2002-07-04 07:17pm
- Location: UK
Time is also a factor. If you have a machine with the basic neural net of a human brain and equivalent memory capacity (a SHITLOAD of HDDs), and then give it an emergent-behaviour program that learns via heuristics and so on, then you may, given enough time and experience on the part of the system, get somewhere quickly.
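The post doesn't pin down an architecture, but as a minimal sketch of the "learns from experience via heuristic updates" part, here's a single artificial neuron that nudges its weights after each example (a perceptron-style rule). A human-scale net would need billions of such units and vastly more memory; the underlying principle is the same.

```python
# Minimal sketch: one artificial neuron learning a rule purely from examples
# by nudging its weights after each mistake (perceptron-style heuristic).
def train_neuron(examples, epochs=20, lr=0.1):
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1 if activation > 0 else 0
            error = target - output
            # heuristic update: shift weights in the direction that reduces error
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Teach it logical OR from experience alone, rather than programming the rule in.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
print(train_neuron(data))
```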
It's simply the level of sophistication. We have AI that can react angrily if ignored, but humans are so much more complex than this, so the AI needs to be developed in a more complex manner. Eventually we'll get there; it's just a matter of time.
AI needs time to learn and grow on its own, just as a human does.
When a child is born, its brain is programmed with a very limited set of parameters: it gets upset when not fed, when not changed, when it is in discomfort, and so on. As the child grows, its brain learns on its own and its personality develops in its own unique way (as defined by its genetic coding and its unique experiences in life).
An AI needs to develop in a similar way, so that it gains experience and develops on its own into a unique consciousness, the same as a human would.
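As an illustration of that idea (not a model of infant development; the actions and reward values are invented), here's a toy agent "born" with nothing but an innate discomfort signal, which learns from its own trial and error which behaviour relieves it:

```python
# Toy sketch: an agent with only an innate drive, learning its behaviour
# from experience. Actions and reward values are purely illustrative.
import random

actions = ["cry", "sleep", "babble"]
value = {a: 0.0 for a in actions}        # learned preferences, blank at "birth"

def innate_reward(action):
    # The built-in wiring: crying brings the caregiver, food, and comfort.
    return 1.0 if action == "cry" else -0.1

random.seed(0)
for step in range(200):
    # explore occasionally, otherwise act on what experience has taught so far
    a = random.choice(actions) if random.random() < 0.2 else max(value, key=value.get)
    value[a] += 0.1 * (innate_reward(a) - value[a])   # learn from the outcome

print(value)   # "cry" ends up strongly preferred; learned rather than pre-programmed
```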
I'll swallow your soul!
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
Functional units and sophistication. The typical insect can outperform the most sophisticated AIs any day of the week; the brains of insects make modern AIs look like hydras or jellyfish in comparison. What you need are AIs with a lot more functional units and a lot better interconnectivity. That would allow for the creation of even more sophisticated and complex neural networks, which would permit much more sophisticated behaviours.
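To put rough numbers on that gap (the neuron counts are approximate published estimates, and the artificial-net size is just a representative research-scale figure, not any specific system): connection count in a densely interconnected network grows roughly with the square of the number of units, so better interconnectivity compounds the difference far faster than unit count alone.

```python
# Back-of-the-envelope sketch of the scale gap. Neuron counts are rough
# published estimates; the artificial net size is illustrative only.
systems = {
    "research neural net (illustrative)": 1_000,
    "fruit fly brain (approx.)": 100_000,
    "honeybee brain (approx.)": 1_000_000,
    "human brain (approx.)": 86_000_000_000,
}

for name, units in systems.items():
    # In a fully interconnected network, connections scale as units * (units - 1).
    connections = units * (units - 1)
    print(f"{name}: ~{units:.1e} units, up to ~{connections:.1e} connections")
```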
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
- Admiral Valdemar
- Outside Context Problem
- Posts: 31572
- Joined: 2002-07-04 07:17pm
- Location: UK