andruid,

Training tends to be more compute-intensive, while inference is more likely to run on a smaller hardware footprint.

The neater idea would be a standard model or set of models, so that a ~30 GB program could cover ~80% of target cases; games and video seem like good candidates for this.
