College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • kromem@lemmy.world · 11 months ago

    Is AI going to go away?

    In the real world, will those students be working from a textbook, or from a browser with some form of AI accessible in a few years?

    What exactly is being measured and evaluated? Or has the world changed, and existing infrastructure is struggling to cling to the status quo?

    Were those years of students being forced to learn cursive in the age of the computer a useful application of their time? Or math classes where a calculator wasn’t allowed?

    I can only imagine, then, how useful a programming class would be where you have to write your code with a pen on a blank sheet of paper, no linter allowed.

    Maybe the focus on where and how knowledge is applied needs to be revisited in light of a changing landscape.

    For example, how much more practically useful would test questions be that present a hallucinated wrong answer from ChatGPT and task students with identifying what’s wrong? Or cross-discipline questions that expect ChatGPT usage yet remain challenging because of their scope or nuance?

    I get that it’s difficult to adjust to something that’s changed everything in the field within months.

    But it’s quite likely that a fair bit of how education has been done for the past 20 years of the digital age (itself a gradual transition to the Internet existing) needs major reworking to adapt to these changes rather than simply oppose them. Otherwise academia ends up in a bubble, further and further detached from real-world feasibility.

    • SkiDude@lemmy.world · 11 months ago

      If you’re going to take a class to learn how to do X, but never actually learn how to do X because you’re letting a machine do all the work, why even take the class?

      In the real world, even if you’re using all the newest, cutting-edge stuff, you still need to understand the concepts behind what you’re doing. You still have to know what to put into the tool, and be able to judge whether what comes out actually works.

      If the tool, AI, whatever, is smart enough to accomplish the task without you actually knowing anything, what the hell are you useful for?