Call Of Duty using AI to listen out for hate speech during online matches

The tool, which will monitor voice chat for any bullying and harassment, will be part of Modern Warfare III - the next game in the series - when it launches in November.

    • ramjambamalam@lemmy.ca · 1 year ago

      It’s not exclusive. A twelve year old yelling slurs into their microphone is easily detectable using modern technology. Why not?

    • mrpants@midwest.social · 1 year ago

      What a stupid take. These are completely different and valid problems with entirely separate solutions: one of which the gaming industry has spent decades fighting, and the other they literally just got the tools for.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    Publisher Activision said the moderation tool, which uses machine learning technology, would be able to identify discriminatory language and harassment in real time.

    Activision’s chief technology officer Michael Vance said it would help make the game “a fun, fair and welcoming experience for all players”.

    The issue is exacerbated in popular multiplayer games due to the sheer number of players, with around 90 million people playing Call Of Duty each month.

    Activision said its existing tools, including the ability for gamers to report others and the automatic monitoring of text chat and offensive usernames, had already seen one million accounts given communications restrictions.

    Call Of Duty’s code of conduct bans bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.

    Mr Vance said ToxMod allows the company’s moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity, before a human decides whether action should be taken.
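    Mr Vance's description amounts to a severity-ranked triage queue sitting in front of human moderators: the machine scores each flagged clip, and reviewers handle the worst first. A minimal sketch of that idea (all names and severity tiers here are hypothetical; ToxMod's actual model and categories are not public):

    ```python
    from dataclasses import dataclass, field
    from enum import IntEnum
    import heapq

    class Severity(IntEnum):
        LOW = 1
        MEDIUM = 2
        HIGH = 3

    @dataclass(order=True)
    class Flag:
        # heapq is a min-heap, so store negated severity to pop the worst first
        sort_key: int = field(init=False, repr=False)
        severity: Severity = field(compare=False)
        clip_id: str = field(compare=False)

        def __post_init__(self):
            self.sort_key = -int(self.severity)

    class ReviewQueue:
        """Machine-categorised flags, ordered by severity; a human
        moderator pops the most severe clip and makes the final call."""

        def __init__(self):
            self._heap = []

        def push(self, clip_id: str, severity: Severity) -> None:
            heapq.heappush(self._heap, Flag(severity, clip_id))

        def pop_for_review(self) -> str:
            return heapq.heappop(self._heap).clip_id

    q = ReviewQueue()
    q.push("clip-001", Severity.LOW)
    q.push("clip-002", Severity.HIGH)
    q.push("clip-003", Severity.MEDIUM)
    print(q.pop_for_review())  # → clip-002
    ```

    The key design point matches the article: the model only *ranks* behaviour, and enforcement stays with a human who reviews the queue.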


    The original article contains 357 words, the summary contains 160 words. Saved 55%. I’m a bot and I’m open source!

  • Gerula@lemmy.world · 1 year ago

    This is dumb. Even without AI, almost every generation has its own slang and slurs …

  • giantofthenorth@lemm.ee · 1 year ago

    Glad I haven’t played a cod game in years.

    There’s already a report button; if someone has an issue with it, let them report it. This is just going to lead to a ton of false positives or be completely useless.

    • CmdrShepard@lemmy.one · 1 year ago

      I’d argue the report button has the same flaws. Whoop someone badly in a match? Reported. Report someone making bigoted remarks? You’ll still see them in a lobby weeks later.