AFKBRBChocolate,

It doesn’t actually KNOW what the prompt is and it doesn’t reason the answer.

Right, no AI “knows” anything. To know something implies understanding, and, like we said earlier, AIs are just computer programs; they don’t know anything, they just provide an output based on an input.

Two areas that are fun to play with LLMs in are math and cooking. Math has crisp rules, and answers can be shown to be right or wrong. LLMs get a lot of complex math problems wrong. They will give you something that looks right, because their model includes what an answer should look like, but all they’re doing is producing an answer that their model says looks like it satisfies the prompt.

Cooking is similar. There’s no crisp right or wrong, but the same process is at play. If you ask it for a recipe for something uncommon, it’s going to give you one, and it will likely have the right kinds of ingredients, but if you make it there’s a decent chance it will taste terrible.
