Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

  • ours@lemmy.film
    1 year ago

    Having “AI functionality” doesn’t mean they can just get rid of their big/expensive models they use now.

    If they’re anything like OpenAI’s LLMs, they require very beefy machines with a ton of expensive RAM.
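    As a rough sketch of why (illustrative numbers, not anything Amazon has published): a model’s memory footprint is roughly its parameter count times the bytes needed per parameter, so a GPT-3-class model can’t come close to fitting on consumer hardware.

    ```python
    def model_memory_gb(num_params: float, bytes_per_param: float = 2.0) -> float:
        """Rough memory needed just to hold the weights (fp16 = 2 bytes/param).

        Ignores activations, KV cache, and other runtime overhead, so real
        requirements are higher.
        """
        return num_params * bytes_per_param / 1e9

    # A GPT-3-class model (~175B parameters) at fp16:
    print(model_memory_gb(175e9))  # ~350 GB of GPU memory for the weights alone
    ```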

    • Hazdaz@lemmy.world
      1 year ago

      Well, that’s exactly what I was thinking when these companies were making these claims: like, HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? It didn’t make sense.

      EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

      • ours@lemmy.film
        1 year ago

        “AI” doesn’t use databases per se; these systems are trained models, with the “knowledge” baked into weights learned from large amounts of training data.

        Some models run fine on small devices (like the models phones run to improve photos), but others are huge, like OpenAI’s LLMs.
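        For a sense of scale (the numbers below are illustrative, not tied to any specific product): a small model quantized to 4 bits per weight fits comfortably in a phone’s RAM, while a GPT-3-class model at full 16-bit precision needs server-grade hardware.

        ```python
        def weights_gb(num_params: float, bits_per_param: float) -> float:
            """Approximate size of the model weights alone, ignoring runtime overhead."""
            return num_params * bits_per_param / 8 / 1e9

        phone_model = weights_gb(1e9, 4)    # ~0.5 GB: a 1B-param model at 4-bit fits on a phone
        big_llm = weights_gb(175e9, 16)     # ~350 GB: far beyond any consumer device
        print(phone_model, big_llm)
        ```

        This is why quantization (fewer bits per weight) is the standard trick for getting models onto small devices, at some cost in quality.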