Artificial intelligence tools can produce an essay on the migratory patterns of waterfowl or President Barack Obama's K-12 education agenda in seconds, but the work might be riddled with factual errors.
Those inaccuracies are commonly known as "hallucinations" in computer science speak, but education technology experts are trying to steer away from that term.
"We know that industry tends to use the term 'hallucinations' to allude to errors that are made by [AI] systems and tools," said Pati Ruiz, a senior director of ed tech and emerging technologies for Digital Promise, a nonprofit that works on technology and equity issues, during an Education Week webinar earlier this month.
But researchers who think about how to talk about AI recommend using another name for those errors, such as "mistake," Ruiz said.
First off, the word "hallucinations," Ruiz said, "make[s] light of mental health issues."
And she added that using that word for AI's errors "might give students a false sense of this tool having humanlike qualities. And that's something that we advocate against, right? We advocate for folks to understand these tools as just that, tools that will support us as humans."
'AI systems and tools make lots of mistakes'
Ruiz noted that she and another expert who spoke during the webinar, Kip Glazer, the principal of Mountain View High School in California, both steer away from the term "hallucinations" when talking about AI's errors.
What's more, students need to understand that they shouldn't take any information that they get from ChatGPT and similar tools at face value, Ruiz said.
"Generative AI systems and tools make lots of mistakes," she said. "We need to have expertise across content areas so that we can review the outputs of generative AI. And we recommend always questioning the outputs of generative AI systems and tools."
Schools and districts need to make that need for scrutiny clear to teachers and students. "Guidance is really important so that we can all use [AI] effectively and appropriately and in a way that doesn't perpetuate the biases that already exist in these systems," Ruiz added.