• Quetzlcoatl@sh.itjust.works · 10 months ago

    The AI has investigated the officer and found no wrongdoing. The AI has lost the requested bodycam footage. The AI has denied your cancer treatment claim. Scary world we’re headed into now that they’ve invented the perfect scapegoat.

    • Mamertine@lemmy.world · 10 months ago

      Is it really any different than when the formula did it?

      When Apple launched their credit card, it was super biased against applicants with female names. They blamed it on the formula.

      AI is just the newest fancy word for programming.

          • gramathy@lemmy.ml · 10 months ago

            Some algorithms can be written as formulas, but generally not. An algorithm is a sequence of steps, possibly repeated, that achieves a desired result; it has no single fixed representation, because it can make different decisions along the way depending on the situation.

            A sorting algorithm is not a formula, for example. A formula is a mathematical or logical expression that can be evaluated directly.
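            A toy sketch of the distinction (my own illustration, not from anyone's comment): the temperature conversion is a formula you evaluate in one step, while the sort repeats a step whose branching depends on the data.

```python
# A formula: a single expression that can be evaluated directly.
def fahrenheit(celsius):
    return celsius * 9 / 5 + 32

# An algorithm: repeated steps whose decisions depend on the data,
# so there is no single closed-form expression for the result.
def insertion_sort(items):
    result = list(items)
    for i in range(1, len(result)):
        j = i
        # How far each element shifts varies with the values already placed.
        while j > 0 and result[j - 1] > result[j]:
            result[j - 1], result[j] = result[j], result[j - 1]
            j -= 1
    return result
```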

    • Ech@lemm.ee · 10 months ago

      The AI has denied ~~your cancer treatment claim~~ your insurance based on your genetic history for cancer, thanks to buying your genome from [insert company that bought out 23andMe/Ancestry.com]

      ftfy

  • Evkob@lemmy.ca · 10 months ago

    Body camera video equivalent of 25 million copies of “Barbie”

    Is this a typical unit of measurement in journalism? Like what even is this? Crappy in-article advertising? Some weird SEO shit? An odd attempt to be cool and hip?

  • FiskFisk33@startrek.website · 10 months ago

    Body camera video equivalent to 25 million copies of “Barbie”

    Literally anything but the metric system

  • OmnislashIsACloudApp@lemmy.world · 10 months ago

    Oh great, I’m sure the training for this will not result in a bunch of things getting “reviewed” and no one being responsible for mistakes at all…

  • terminhell@lemmy.world · 10 months ago

    What if all the cam footage was just uploaded to something like YouTube? Publicly visible to, ya know, the very citizens who pay for it and who they work for…

    • Dizzy Devil Ducky@lemm.ee · 10 months ago

      That feels like it would be a logistical and just-in-general nightmare. Does every single individual have an account where they’re forced to stream their footage? If not, and it’s all being uploaded to a single channel for a department, who’s in charge of uploading the footage? Who’d even be willing to spend their days doing nothing but uploading footage when the department’s internal internet connection slows to a crawl because of the person(s) who have to upload it (because you just know they certainly ain’t paying for a private network for this in most areas)?

      In theory it sounds great, but in practice it just sounds like a nightmare. Not defending the police, but it just doesn’t seem like a task they’d be willing to take up, because of all the work they’d have to put in to make sure it works.

      That, and the money they spend doing something like this could obviously be used on something more pressing, like shooting a black man because he didn’t get down on the ground and worship the boots of the officer that killed him after being pulled over on suspicion of absolutely nothing (/s on this part)

  • DontMakeMoreBabies@kbin.social · 10 months ago

    Would you rather these things never be reviewed? Isn’t something better than nothing?

    You’ll literally never be able to afford (or hire) enough people to review the data they are taking in…

    I mean unless we start killing billionaires and taking their shit.

    • Darkassassin07@lemmy.ca · 10 months ago

      Make it publicly accessible. It’ll most certainly get watched and problems will be reported to be investigated further.

    • Otter@lemmy.ca · 10 months ago

      Yea I share the same concerns about the “AI”, but this sounds like a good thing. It’s going through footage that wasn’t going to be looked at (because there wasn’t a complaint / investigation), and it’s flagging things that should be reviewed. It’s a positive step

      What we should look into for this program is:

      • how the flags are being set, and what kind of interaction will warrant a flag
      • what changes are made to training as a result of this data
      • how the privacy is being handled, and where the data is going (ex. Don’t use this footage to train some model, especially because not every interaction is out in the public)
    • Rivalarrival@lemmy.today · 10 months ago

      File a complaint, and you get to view the video. If nobody files a complaint, there is no need to view the video.

      Indeed, nobody should be looking at the video unless a complaint is filed.

    • MaxPow3r11@lemmy.world · 10 months ago

      WE should be able to review it/see it ALL.

      We pay these fucks to torture and kill with our tax $.

      They should have nothing to hide from us.

  • Ech@lemm.ee · 10 months ago

    Ah, good. I had “racist profiling ~~AI~~ LLM” on my 2024 bingo card

  • Darkassassin07@lemmy.ca · 10 months ago

    Yes, because AI has a firm grasp on nuanced topics like law enforcement and civilian/human rights…

    You may as well play the video to an empty room.

  • TheMurphy@lemmy.world · 10 months ago

    ITT: People who are scared of things they don’t understand, which in this case is AI.

    In this case, the “AI” program is nothing more than pattern recognition software setting a timestamp where it believes there’s something to be looked at. Then an officer can take a look.

    It saves so much time, and it filters out anything irrelevant. But be careful, because it’s labelled “AI”. Scary.

    EDIT: The comments on this comment confirm that you don’t understand AI, because if you did, you’d know that this system that scans video is not an LLM (large language model). It’s not even the same kind of system at its core.
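    For what it’s worth, the flagging stage described above fits in a few lines. A minimal sketch, assuming some model produces a per-frame “of interest” score (the `scores` list is made-up example output, standing in for whatever pattern-recognition model is actually used):

```python
def flag_timestamps(frame_scores, fps=30.0, threshold=0.8):
    """Turn per-frame 'of interest' scores into timestamps (in seconds)
    for a human reviewer; the model itself decides nothing."""
    return [i / fps for i, score in enumerate(frame_scores) if score > threshold]

# Hypothetical model output, one score per frame at 1 frame per second:
scores = [0.1, 0.2, 0.95, 0.9, 0.1, 0.85]
print(flag_timestamps(scores, fps=1.0))  # [2.0, 3.0, 5.0]
```

    The catch the replies point out: everything hinges on where those scores come from, and footage scoring under the threshold is never surfaced at all.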

    • Killing_Spark@feddit.de · 10 months ago

      It’s also potentially skipping some of the parts that should be looked at. It depends on the training set.

    • Voroxpete@sh.itjust.works · 10 months ago

      This is an astonishingly bad take.

      Almost every AI system is a black box. Even if you open source the code and the training data, it’s almost impossible to know anything about the current state of a machine learning model.

      So the entire premise here is that a completely unaccountable system - whose decisions are basically impossible to understand or scrutinize - gets to decide what data is or isn’t relevant.

      When an AI says “No crime spotted here”, who gets to even know that it did that? If a human is reviewing all of the footage, then why have the AI? You’re doing the same amount of human work anyway. So as soon as you introduce this system, you remove a huge amount of human oversight, and replace it with decisions that dramatically affect human lives - that could potentially be life or death if it’s the difference between a bad cop being taken off the street or not - being made by a completely unaccountable system.

      Who’s to say whether the training data fed into this system results in it, say, becoming effectively blind to police violence against black people?

      And if that doesn’t scare you, it absolutely should.

      • Misconduct@lemmy.world · 10 months ago

        It’s not impossible to understand or scrutinize. They give it specific things to look for, and it does what it’s told. You can make the argument that ANY tool used by the police will be misused in their favor; AI isn’t special in that regard. It’s not like we bother to hold anyone accountable for anything else now anyway. Maybe the AI will be less biased.

        It’s definitely not doing the same work as a human if humans are spared sifting through hours upon hours of less useful footage. I’m sure they’re testing it, etc. Nobody goes all-in on this stuff. Really, you guys can be so very dramatic lol

  • MaxPow3r11@lemmy.world · 10 months ago

    “our Pig AI System searched all of the videos. No cop did anything wrong. Ever. The End” ~cop fucks

    (Fuck this shit. As usual. Abolish police)