Sorry, Denton ISD students: New district rules say ChatGPT can't write that paper for you

FILE - The ChatGPT app is displayed on an iPhone in New York, May 18, 2023. (Richard Drew / AP)

If you’re gray at the temples, you know the joke about the dog eating an unfortunate middle schooler’s homework.

If you’re part of Gen Z, you don’t need to blame a real or invented dog, because there are now ways to get some of that homework filled out in minutes without cracking open a book. All you need is a keyboard and a ChatGPT account.

Denton ISD took a small but meaningful step to curb any inclination for students to let the internet do their problem solving — or paper writing — for them. The district added “generative artificial intelligence” to the Student Code of Conduct this month.

Going into the 2023-24 school year, students risk violating the district’s anti-plagiarism policy if they ask artificial intelligence to fill in the gaps on assignments, or if they try to turn in an assignment entirely done by artificial intelligence.

The added rule appears on page 55 of the code, under the academic integrity section.

Here’s what the district included in its longstanding policy forbidding plagiarism:

“(T)he use of generative artificial intelligence for the purpose of plagiarism is strictly prohibited as a violation of the district’s acceptable use policy,” the added passage says. “Generative AI refers to the use of computer algorithms to generate original content that mimics human writing styles. While this technology can be useful for various academic and creative purposes, it is important to note that using generative AI to plagiarize someone else’s work, including that of a computer program is unethical and can result in academic consequences, consistent with Denton ISD’s academic integrity policy.”

To break it down: If a teacher guides students to use generative AI as part of a lesson, students are allowed to partake. But they can’t turn to AI on their own to create essays or complete classroom projects. Take ChatGPT, for instance: A student could ask the chatbot to write a two-page paper on the themes of sin and guilt in the novel The Scarlet Letter and have that homework assignment completed in less than a minute. Didn’t read the book? No problem. The algorithm has them covered, at least as far as initial appearances go.

Superintendent Jamie Wilson told the school board this month that the addition to the code isn’t a top-down solution.

“You ask other districts what they’re doing about this issue and they’re like (shrugs),” Wilson said. “This came from the teachers.”

As access to generative AI widens, the market for AI detection technology is already trying to keep pace. OpenAI, the developer of ChatGPT, debuted a tool in February to help teachers detect AI-generated writing. But user beware: OpenAI warned that the tool isn’t reliable in all cases.

Educational technology company Turnitin is developing a tool to detect what it calls “AI-assisted writing.”

“So ChatGPT and its cousins are here. It’s going to be transformative. Perhaps it will be the graphing calculator of the language arts,” said David Adamson, an AI scientist at Turnitin, in a video about the company’s detection tool.

The tech company’s tool is being built both to detect writing generated entirely by AI and to flag portions of a document that AI produced.

Meanwhile, universities and school districts across the country are debating the best approach to generative AI. In March, the University of North Texas Center for Learning Experimentation, Application and Research posted about ChatGPT specifically in the center’s teacher resources on theory and practice. The post enumerated positive uses of the technology in the higher education classroom.

One of its uses? To broach the topics of ethics and plagiarism for college students.

“Instructors can use it to help students develop digital literacy and critical thinking skills,” the post said. “For example, instructors can design writing assignments in which students actively analyze the tool’s strengths and limitations. By exploring the tool together, instructors can introduce students to the importance of ethics and the dangers of plagiarism in the writing process.”

While consumers have been interacting with chatbots for a few years now — especially when using the chat feature on retailer websites — the issue is thornier for educators. It’s one thing to connect shoppers to customer service using an algorithm.

It’s another to embrace a technology that can appeal to students itching for a shortcut. ChatGPT can’t think critically, and it can’t solve complex problems creatively: That’s still the province of people. As for schools, educators have to walk the line between teaching students how to recognize and use this kind of AI and teaching them when using it breaks the rules.