As AI tools reshape education, schools struggle with how to draw the line on cheating


The book report now belongs to the past. Take-home tests and essays are becoming obsolete.

High school and college educators across the country say student use of artificial intelligence has become so widespread that assigning writing outside of class now practically invites cheating.

“Cheating is off the charts. It’s the worst I’ve seen in my entire career,” says Casey Cuny, who has taught English for 23 years. Educators no longer wonder whether students will outsource schoolwork to AI chatbots. “Anything you send home, you have to assume is being AI’ed.”

The question now is how schools can adapt, because many teaching and assessment tools that have been used for generations are no longer effective. As AI technology rapidly improves and becomes more embedded in daily life, it is transforming how students learn and study and how teachers teach, and it is creating new confusion about what constitutes academic dishonesty.

“We have to ask ourselves, what is cheating?” said Cuny, the 2024 California Teacher of the Year. “Because I think the lines are getting blurred.”

Cuny’s students at Valencia High School in Southern California now do most of their writing in class. He monitors students’ laptop screens from his desk, using software that lets him “lock” their screens or block access to certain sites. He also incorporates AI into his lessons and teaches students how to use AI as a study aid, “so kids learn with AI instead of cheating with AI.”

In rural Oregon, high school teacher Kelly Gibson has made a similar shift toward in-class writing. She is also incorporating more verbal assessments, so students discuss their understanding of assigned reading aloud.

“I used to give a writing prompt and say, ‘In two weeks, I want a five-paragraph essay,’” Gibson explains. “These days, I can’t do that. It almost begs teenagers to cheat.”

Take, for example, a once-typical high school English assignment: write an essay explaining the relevance of social class in “The Great Gatsby.” Many students say their first instinct is now to ask ChatGPT for “brainstorming” help. Within seconds, ChatGPT produces a list of essay ideas, along with examples and quotes to support them. The chatbot ends by asking if it can do more: “Would you like help writing part of the essay? I can help you write an introduction or outline a paragraph!”

Students say they often turn to AI with good intentions, for things like research, editing or help getting through difficult texts. But AI offers an unprecedented temptation, and it is sometimes hard to know where to draw the line.

College sophomore Lily Brown, a psychology major at a liberal arts school on the East Coast, relies on ChatGPT to help outline essays because she struggles to pull the pieces together herself. ChatGPT also helped her get through a first-year philosophy course, where the reading “felt like a different language” until she read summaries of the texts.

“Sometimes I feel bad using ChatGPT to summarize reading, because I wonder, is this cheating? Is getting help with outlines cheating? If I write an essay in my own words and ask how to improve it, or when it starts changing my essay, is that cheating?”

Her course syllabi say things like, “Don’t use AI to write essays and formulate thoughts,” she says, but that leaves a lot of gray area. Students say they often hesitate to ask teachers for clarity because admitting to any AI use could mark them as cheaters.

Schools tend to leave AI policies up to individual teachers, which often means the rules vary considerably within the same school. Some educators, for example, embrace the use of Grammarly.com, an AI-powered writing assistant, for checking grammar. Others prohibit it, noting that the tool also offers to rewrite sentences.

“Whether you can use AI or not depends on each classroom. It can be confusing,” explains Jolie Lahey, an 11th grader at Valencia, who credits Cuny with teaching her sophomore English class a variety of AI skills, like how to upload study guides to ChatGPT and have the chatbot quiz them, then explain the problems they got wrong.

But this year, her teachers have strict “no AI” policies. “It’s such a useful tool. And if we’re not allowed to use it, that just doesn’t make sense,” Lahey says. “It feels outdated.”

Many schools initially banned the use of AI after ChatGPT launched in late 2022. But opinions on artificial intelligence’s role in education have shifted considerably. The term “AI literacy” has become a back-to-school buzzword, emphasizing how to balance AI’s strengths with its risks and challenges.

Over the summer, several colleges and universities convened their AI task forces to write more detailed guidelines or give faculty new instructions.

The University of California, Berkeley emailed all faculty new AI guidance that urges them to “include a clear statement on their syllabus about course expectations” around AI use. The guidance offered sample language for three syllabus statements: for courses that require AI, that ban AI in and out of class, or that allow some AI use.

“In the absence of such a statement, students may be more likely to use these technologies inappropriately,” the email said, stressing that AI “creates new confusion about what might constitute legitimate methods for completing student work.”

At Carnegie Mellon University, there has been a huge uptick in academic responsibility violations involving AI, but often students don’t realize they have done anything wrong, explains Rebekah Fitzsimmons, chair of the AI faculty advising committee at the university’s Heinz College of Information Systems and Public Policy.

For example, an English language learner wrote an assignment in her native language and used DeepL, an AI-powered translation tool, to translate her work into English, but didn’t realize the platform also altered her wording, which was then flagged by an AI detector.

Enforcing academic integrity policies has been complicated by AI, which is hard to detect and even harder to prove, Fitzsimmons said. Professors have flexibility when they believe a student unintentionally crossed a line, but they are now more hesitant to flag violations because they don’t want to accuse students unfairly, and students fear that if they are falsely accused, there is no way to prove their innocence.

Over the summer, Fitzsimmons helped write new, detailed guidelines for students and faculty that strive to create more clarity. Faculty have been told that a blanket ban on AI “is not a viable policy” unless instructors change how they teach and assess students. Many faculty are eliminating take-home exams. Some have returned to pen-and-paper tests in class, she said, and others have moved to “flipped classrooms,” where homework is done in class.

Emily DeJeu, who teaches communication courses at Carnegie Mellon’s business school, has eliminated writing assignments as homework and replaced them with in-class quizzes taken on laptops in “a lockdown browser” that prevents students from leaving the quiz screen.

“To expect an 18-year-old to exercise great discipline is unreasonable,” she says, “which is why it’s up to instructors to put up guardrails.”

___

The Associated Press’ education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
