College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • TimewornTraveler@lemm.ee · 42↑ 14↓ · 1 year ago

    Can we just go back to calling this shit Algorithms and stop pretending it’s actually Artificial Intelligence?

    • WackyTabbacy42069@reddthat.com · 29↑ 12↓ · 1 year ago

      It actually is artificial intelligence. What are you even arguing against, man?

      Machine learning is a subset of AI and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn’t AI because you don’t like it is like saying rock and roll isn’t music.

      • TimewornTraveler@lemm.ee · 8↑ 16↓ · edited · 1 year ago

        I am arguing against this marketing campaign, that’s what. Who decides what “AI” is, and how did we come to decide what fits that title? The concept of AI has been around for a long time, since the Greeks even, and it has always meant a man-made man. In modern times it’s been represented as a sci-fi fantasy of sentient androids. “AI” is a term with heavy associations already cooked into it. That’s why calling it “AI” is just a way to make it sound like high-tech, futuristic dreams come true. But a predictive text algorithm is hardly “intelligence”. It’s only being called that to make it sound profitable.

        Let’s stop calling it “AI” and start calling out their bullshit. This is just another cryptocurrency scam: a concept that could theoretically work and be useful to society, but one that isn’t being implemented in a way that lives up to its name.

        • GenderNeutralBro@lemmy.sdf.org · 8↑ 1↓ · 1 year ago

          > Who decides what “AI” is and how did we come to decide what fits that title?

          Language is ever-evolving, but a good starting point would be McCarthy et al., who wrote a proposal back in the 50s. See http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html

          Techniques have come into and gone out of fashion, and obviously technology has improved, but the principles have not fundamentally changed.

        • BigNote@lemm.ee · 2↑ · 1 year ago

          The field of computer science decided what AI is. It has a very specific set of meanings and some rando on the Internet isn’t going to upend decades of usage just because it doesn’t fit their idea of what constitutes AI or because they think it’s a marketing gimmick.

          It’s not. It’s a very specific field in computer science that’s been worked on since the 1950s at least.

          • Strykker@programming.dev · 2↑ · 1 year ago

            The issue is that, to laypeople, the term AI suggests actual human-level intelligence, which computer science doesn’t require for something to qualify as AI.

            This leads to laypeople attributing more ability to LLMs than they actually possess.

            • BigNote@lemm.ee · 1↑ · 1 year ago

              Agreed. That said, I am uncomfortable with the idea that policing language is the correct or only solution to the problem.

    • chicken@lemmy.dbzer0.com · 5↑ · 1 year ago

      Maybe machine learning models technically fit the definition of “algorithm”, but the word suits them very poorly. An algorithm is traditionally a set of instructions written by a person, with connotations of being high level and fully understood conceptually, akin to a mathematical formula.

      A machine learning model is a soup of numbers that maybe does something approximately like what the people training it wanted it to do, using arbitrary logic nobody can expect to follow. “Algorithm” is not a great word to describe that.
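
      To make the contrast concrete, here is a toy sketch (everything in it is made up for illustration, not anyone’s actual model): the first function is a classic hand-written algorithm you can read and verify step by step; the second stands in for a learned model, whose behaviour is encoded in arbitrary-looking numbers rather than legible rules.

      ```python
      # A traditional algorithm: every step was written by a person and can be
      # read and reasoned about directly.
      def binary_search(sorted_items, target):
          lo, hi = 0, len(sorted_items) - 1
          while lo <= hi:
              mid = (lo + hi) // 2
              if sorted_items[mid] == target:
                  return mid
              if sorted_items[mid] < target:
                  lo = mid + 1
              else:
                  hi = mid - 1
          return -1

      # A toy "learned model": its behaviour lives in numeric weights produced by
      # some training process, not in rules anyone wrote down or can read off.
      weights = [0.23, -1.07, 0.58]   # made-up values standing in for trained weights
      bias = 0.11

      def toy_model(features):
          score = bias + sum(w * x for w, x in zip(weights, features))
          return score > 0   # the "logic" is implicit in the numbers above

      print(binary_search([1, 3, 5, 7, 9], 7))   # 3
      print(toy_model([1.0, 0.2, 0.5]))          # True or False, depending on the weights
      ```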

    • Venia Silente@lemm.ee · 3↑ 3↓ · 1 year ago

      Please let’s not defame Dijkstra and other Algorithms like this. Just call them “corporate crap”, which is what they are.