One Tech Tip: Do’s and don’ts of using AI to help with schoolwork


The rapid rise of ChatGPT and other generative AI systems has disrupted education, transforming the way students learn and study.

Students everywhere have turned to chatbots to help them with their homework, but the capabilities of artificial intelligence have blurred the boundaries for what it should – and should not – be used for.

The widespread adoption of this technology in many other areas of life also adds to the confusion about what constitutes academic dishonesty.

Here are some do’s and don’ts of using AI for schoolwork:

Chatbots are so good at answering questions with detailed written responses that it’s tempting to just take their work and pass it off as yours.

But in case it’s not already obvious: AI should not be used as a substitute for doing your own work, and it cannot replace your ability to think critically.

You would not copy and paste information from someone else’s textbook or essay and pass it off as your own. The same principle applies to chatbot responses.

“AI can help you understand concepts or generate ideas, but it should never replace your own thinking and effort,” the University of Chicago says in its guidance on using generative AI. “Always produce original work and use AI tools to guide and clarify your thinking, not to do the work for you.”

So don’t hesitate to put pen to paper – or fingers to keyboard – and do the writing yourself.

“If you use an AI chatbot to write for you – whether explanations, summaries, topic ideas, or even initial outlines – you will learn less and perform worse on subsequent exams and attempts to use that knowledge,” says the Poorvu Center for Teaching and Learning at Yale University.

Experts say AI shines when used as a tutor or study companion. So try using a chatbot to explain difficult concepts or brainstorm ideas, such as essay topics.

California high school English teacher Casey Cuny advises his students to use ChatGPT to ask questions before exams.

He asks them to upload their lecture notes, study guides and any other class materials, such as slideshows, to the chatbot, then tell it which textbook and chapter the test will cover.

Next, students should ask the chatbot to: “Ask me one question at a time based on all the material provided, then create a teaching plan for anything I misunderstood.”

Cuny displays his AI guidelines as a traffic light on a classroom screen. Green-light, permitted uses include brainstorming, asking for feedback on a presentation, and conducting research. Red-light, prohibited uses include asking an AI tool to write a thesis statement or to draft or revise an essay. Yellow is for when a student is unsure whether AI use is allowed; in that case, he tells them, come ask.

Or try using ChatGPT’s voice dictation feature, said Sohan Choudhury, CEO of Flint, an AI-powered education platform.

“I’ll just empty my brain of what I understand and what I don’t understand” about a topic, he said. “I can ramble on for five minutes about exactly what I do and don’t understand about a topic. I can throw random analogies at it, and I know it’s going to be able to tailor something to me based on that.”

As AI has disrupted academia, educators have been forced to define their technology policies.

In the United States, about two dozen states have AI guidelines for schools, but they are unevenly enforced.

It’s worth checking what your school, college or university says about AI. Some might have a general institution-wide policy.

The University of Toronto’s position is that “students are not allowed to use generative AI in a course unless the instructor explicitly allows it,” and students should check their course outlines to see what is and isn’t permitted.

Many others don’t have a general rule.

The State University of New York at Buffalo “does not have a one-size-fits-all policy,” according to its online guidance for instructors. “Instructors have academic freedom to determine what tools students can and cannot use to achieve the learning objectives of the course. This includes artificial intelligence tools such as ChatGPT.”

AI is no longer the educational bogeyman it once was.

There is growing recognition that AI is here to stay and that the next generation of workers will need to learn how to use this technology, which has the potential to disrupt many industries and professions.

Students should therefore not hesitate to discuss its use with teachers, as transparency avoids misunderstandings, Choudhury said.

“Two years ago, a lot of teachers were just against this. Like, don’t bring up AI in this class at all, period,” he said. But three years after ChatGPT debuted, “many teachers understand that kids are using it, so they’re much more open to having a conversation rather than setting a blanket policy.”

Teachers say they are aware that students are reluctant to ask whether the use of AI is allowed, for fear of being accused of cheating. But clarity is key because it’s so easy to cross a line without knowing it, says Rebekah Fitzsimmons, chair of the AI faculty advisory committee at Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.

“Students often don’t realize when they cross the line between a tool that helps them refine content they’ve created and one that generates content for them,” says Fitzsimmons, who helped write detailed new guidelines aimed at creating clarity for students and faculty.

The University of Chicago says students should cite AI if it was used to come up with ideas, summarize texts, or help write a paper.

“Acknowledge this in your work where appropriate,” the university says. “Just like you would cite a book or website, giving credit to AI where appropriate helps maintain transparency.”

Teachers want students to use AI in a way that is consistent with their school’s values and principles.

The University of Florida says students should familiarize themselves with the school’s honor code and academic integrity policies “to ensure that your use of AI complies with ethical standards.”

Oxford University says AI tools should be used “responsibly and ethically” and in accordance with its academic standards.

“You should always use AI tools with integrity, honesty and transparency, and maintain a critical approach in the use of any results generated by these tools,” it says.

____

Is there a technical topic that you think is worth explaining? Email us at onetechtip@ap.org with your suggestions for future editions of One Tech Tip.
