nandeEbisu,

AI is a super broad topic. I’ve heard people refer to Principal Component Analysis, a technique from the 1930s, as “Machine Learning” or “AI”. In reality it’s just that we now have the infrastructure and data at scale to start applying these techniques in larger contexts.
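To underline the point that PCA is classical linear algebra rather than anything exotic, here is a minimal sketch of it using only NumPy (toy random data, eigendecomposition of the covariance matrix):

```python
import numpy as np

# Toy PCA via eigendecomposition of the covariance matrix --
# the same 1930s-era statistics now often rebranded as "AI".
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # 100 samples, 3 features
X = X - X.mean(axis=0)                  # center the data
cov = (X.T @ X) / (len(X) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
components = eigvecs[:, ::-1][:, :2]    # top-2 principal directions
X_reduced = X @ components              # project onto 2 components
print(X_reduced.shape)                  # (100, 2)
```

The whole algorithm is a centering step, a covariance matrix, and an eigendecomposition; the scale of modern data is what changed, not the math.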

I know pharmaceutical companies have been using AI in drug discovery for probably a decade now, but those models look very different from a large language model, and you still have a human sifting through the results and performing validation on a physical system to make sure the compounds do what is predicted, safely. Most of them do not.

When you ask something like ChatGPT a question like that, it’s doing something akin to looking up the most recent papers on the subject that it was trained on and outputting something that looks like a chemical compound such a paper might contain. It doesn’t have an understanding of what that formula means, only that when you arrange letters in that way, it looks superficially similar to what would have been in the paper. It’s like in movies, when they need to show someone doing math: they just fill a chalkboard with random equations that look like advanced math at first glance, but might be introductory-level material, or even just gibberish.
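The “arranging letters so it looks right” idea can be illustrated with a toy character-level Markov chain. This is a deliberately crude stand-in, not how an LLM actually works: it learns only which character tends to follow which in a handful of (real) formulas, then emits strings with the right surface shape and no chemical meaning.

```python
import random
from collections import defaultdict

# Tiny "training corpus" of real chemical formulas.
corpus = ["C6H12O6", "C2H5OH", "CH3COOH", "C6H6", "C10H8"]

# Record which character follows which across the corpus.
transitions = defaultdict(list)
for formula in corpus:
    for a, b in zip(formula, formula[1:]):
        transitions[a].append(b)

def babble(start="C", length=8, seed=42):
    """Emit a formula-shaped string by chaining likely next characters."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        followers = transitions.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return "".join(out)

print(babble())  # formula-shaped, chemically meaningless
```

The output passes a glance test because every character transition was seen in training, but the model has no concept of valence, bonding, or even what a molecule is; it is pure surface statistics.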
