Apple wants AI to run directly on its hardware instead of in the cloud
iPhone maker wants to catch up to its rivals when it comes to AI.

  • OhmsLawn@lemmy.world · 11 months ago

    How’s that supposed to work?

    I’m picturing a backpack full of batteries and graphics cards. Maybe they’re talking about a more limited model?

    • abhibeckert@lemmy.world · 11 months ago (edited)

      This is a Financial Times article, regurgitated by Ars Technica. It wasn’t written by a tech journalist but by a business journalist, and their definition of “AI” is a lot looser than what you’re thinking of.

      I’m pretty sure they’re talking about things Apple is already doing, not just on current hardware but even on hardware from a few years ago. For example, the keyboard on iOS now uses pretty much the same technology as ChatGPT, but scaled way, way down - to the point where “Tiny Language Model” would probably be more accurate. I wouldn’t be surprised if the training data is as small as ten megabytes, compared to half a terabyte for ChatGPT.

      The model will learn that you say “Fuck Yeah!” to one person and “That is interesting, thanks for sharing it with me.” to someone else. Very cool technology - but it’s not AI. The keyboard really will suggest swear words now by the way - if you’ve used them previously in a similar context to the current one. The old algorithmic keyboard had hardcoded “do not swear, ever” logic.
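      Just to make that concrete - this is a toy sketch, nothing like whatever Apple actually ships, and every name in it is made up - per-recipient word statistics alone already get you that kind of behaviour:

          # Toy per-recipient "keyboard model": counts which word follows which
          # for each contact, then suggests the most frequent continuations.
          from collections import defaultdict

          class TinyKeyboardModel:
              def __init__(self):
                  # counts[recipient][previous_word][next_word] -> frequency
                  self.counts = defaultdict(
                      lambda: defaultdict(lambda: defaultdict(int)))

              def learn(self, recipient, sentence):
                  words = sentence.lower().split()
                  for prev, nxt in zip(words, words[1:]):
                      self.counts[recipient][prev][nxt] += 1

              def suggest(self, recipient, previous_word, k=3):
                  candidates = self.counts[recipient][previous_word.lower()]
                  return sorted(candidates, key=candidates.get, reverse=True)[:k]

          model = TinyKeyboardModel()
          model.learn("best_mate", "fuck yeah that was brilliant")
          model.learn("colleague", "that is interesting thanks for sharing")
          print(model.suggest("best_mate", "fuck"))   # ['yeah']
          print(model.suggest("colleague", "that"))   # ['is']

      A real on-device model generalises far better than a bigram counter, obviously, but the “learns your phrasing per contact, no cloud involved” part is the same idea.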

    • exu@feditown.com · 11 months ago

      I’ve been playing with llama.cpp a bit for the last week and it’s surprisingly workable on a recent laptop just using the CPU. It’s not really hard to imagine Apple and others adding (more) AI accelerators on mobile.
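      If anyone wants to try it, here’s roughly what that looks like through the llama-cpp-python bindings - the model path and prompt are just placeholders, and n_gpu_layers=0 keeps the whole thing on the CPU:

          # CPU-only inference via llama-cpp-python (pip install llama-cpp-python).
          # Assumes you've already downloaded a quantized GGUF model.
          from llama_cpp import Llama

          llm = Llama(
              model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
              n_ctx=2048,      # context window
              n_gpu_layers=0,  # 0 = no GPU offload, pure CPU
              n_threads=8,     # tune to your laptop's core count
          )

          out = llm("Q: Why run language models on-device? A:", max_tokens=128)
          print(out["choices"][0]["text"])

      On a recent laptop CPU a 4-bit quantized 7B model manages maybe a handful of tokens per second - slow but usable, and dedicated accelerators like Apple’s Neural Engine are exactly what you’d add to speed that up.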

      • Apollo2323@lemmy.dbzer0.com · 11 months ago

        Oh yes, and the CPUs on phones have been getting more powerful every year, with nothing that could take advantage of their full potential. Now, with a local AI, that power will finally get used - and it’ll be great for privacy and response times.

    • MiltownClowns@lemmy.world · 11 months ago

      They’re making their own silicone now. You can achieve a lot more efficiency when you’re streamlined the whole way through.

      • hips_and_nips@lemmy.world · 11 months ago

        silicone

        It’s silicon. Silicon is a naturally occurring chemical element, whereas silicone is a synthetic substance.

        Silicon is for computer chips, silicone is for boobies.

      • ImFresh3x@sh.itjust.works · 11 months ago

        By “making their own,” you mean telling Taiwan Semiconductor Manufacturing Company “hey, we’re going to buy enough of these units that you have to give us the specs we chose at a better price than the competitors, and since we chose those specs off your manufacturing capacity sheets, we’ll say ‘engineered in Cupertino™.’”

        Btw, I’m not shitting on Apple here. I love my M2 processor.