treefrog,

There’s a difference. When you program a machine, it follows rigid logic. It’s predictable.

When you train a machine, it does not. It can make its own inferences and operate outside strict parameters. It can also make bad inferences, which we call AI hallucinations.
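A minimal sketch of that distinction (illustrative only, not any real system): a programmed check is an explicit rule, while a “trained” one derives its behavior from examples, so it can generalize to inputs no rule anticipated, and can also get them wrong. The toy nearest-neighbor classifier here stands in for a learned model.

```python
def programmed_is_spam(msg: str) -> bool:
    # Programmed: an explicit, hand-written rule.
    # Same input always gives the same output.
    return "free money" in msg.lower()

# Trained: behavior comes from labeled examples, not rules.
TRAINING = [
    ("claim your free money now", True),
    ("win free money fast", True),
    ("lunch at noon tomorrow?", False),
    ("meeting notes attached", False),
]

def _overlap(a: str, b: str) -> int:
    # Crude similarity: count of shared words.
    return len(set(a.lower().split()) & set(b.lower().split()))

def trained_is_spam(msg: str) -> bool:
    # Infers from similarity to past examples; nobody wrote a
    # rule covering this exact input.
    best = max(TRAINING, key=lambda ex: _overlap(msg, ex[0]))
    return best[1]

# The trained version makes an inference on a message no rule covers,
# while the programmed rule simply doesn't fire:
print(trained_is_spam("free prize act fast"))     # True (inferred)
print(programmed_is_spam("free prize act fast"))  # False (no rule matched)
```

The same inference mechanism that lets the trained version generalize is what lets it be confidently wrong, which is the failure mode called hallucination in larger models.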

I don’t know that you’re wrong about avoiding responsibility, but “programmed” is not the right word for what’s basically a genie in a bottle. And we still hold accountable the member of the tribe who lets the genie out, or we should, anyway.
