AFKBRBChocolate,

"The system must be self-aware." That's a long, long way off. What you're describing is usually called "artificial general intelligence" (AGI), or sometimes "strong AI." That's not how most people currently define AI.

Right now (to the best of my knowledge) there are no actual AI systems in that sense, and that's part of the reason most people in the field distinguish AGI from AI.

"Everything that is being called AI is just complex programs that are following EXACTLY what the programming has set in stone with no deviations."

Hmmm, that's problematic. All AI systems are programs, and all programs execute as they were written to. For there to be flexibility, it has to be written into the code. You could argue that LLMs have a lot of flexibility written into them. If I ask one to tell me a story about an adolescent bee who is embarrassed because she has no stinger, it's going to do that. It's not like the LLM authors wrote a bee story into the code in case someone asked for one. But that also doesn't make them self-aware. You might also ask yourself how we'll know if one is self-aware.
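To make the point concrete, here's a toy sketch (nothing like a real LLM, just an illustration): a fully deterministic program whose rules are fixed in the source, yet whose outputs never appear verbatim anywhere in that source. The templates and word lists below are made up for the example.

```python
import random

# The "flexibility" lives in the code's structure, not in canned responses:
# no complete story string exists anywhere in this file.
TEMPLATES = [
    "Once upon a time, a {adj} {noun} learned to {verb}.",
    "Nobody believed the {noun} could {verb}, but she was {adj}.",
]
WORDS = {
    "adj": ["young", "embarrassed", "stubborn"],
    "noun": ["bee", "ant", "moth"],
    "verb": ["fly", "dance", "sing"],
}

def tell_story(seed: int) -> str:
    """Deterministically build a story the author never wrote out in full."""
    rng = random.Random(seed)  # same seed -> same story, no deviations
    template = rng.choice(TEMPLATES)
    return template.format(**{k: rng.choice(v) for k, v in WORDS.items()})

print(tell_story(42))
```

The program follows its programming exactly, yet it can emit sentences its author never typed; an LLM is the same idea scaled up enormously, with the "templates" learned from data instead of hand-written.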
