Schools issue permission slips to get parent approval for students to take field trips, learn about sexual health, or play sports.
But some experts say school leaders should consider adding a technology-driven concern to that list: Using ChatGPT and similar tools powered by artificial intelligence.
School districts that had previously banned ChatGPT, including New York City, the nation’s largest, are now puzzling through how to use the tool to help students better understand the benefits and limitations of AI.
But when every question a ChatGPT user asks is incorporated into the software’s AI training model, privacy concerns come into play, experts said. And that goes for other generative AI products available to students.
Allowing ChatGPT to collect information from students that is then used to develop the tool itself would appear to run up against the Family 91制片厂视频al Rights and Privacy Act (better known as FERPA), which prohibits the collection or analysis of identifiable student data for purposes other than education, said David Sallay, director of youth and educational privacy for the Future of Privacy Forum, a nonprofit organization.
OpenAI, the company behind ChatGPT, also states that the tool isn’t intended to be used by anyone under the age of 13 and that those between the ages of 13 and 18 should get permission from a parent.
Still, he expects many districts haven’t taken the step of getting formal permission from parents. “I think a lot [of schools] are just using it and not telling anyone,” Sallay said. “That’s what happens with a lot of ed tech.”
Last school year, the Peninsula School District near Seattle collected permission slips to allow students to use AI tools like ChatGPT in the classroom, Kris Hagel, the district’s executive director of digital learning, said during a Nov. 1 91制片厂视频 Week webinar on AI.
But this school year, “we’ve kind of been a little bit more loose,” Hagel said.
Instead of requiring permission slips for each student, “we let parents know at the beginning of the year that our 8th grade and above students would most likely be using AI,” he said. “I think it’s a good idea to just let parents know what’s going on in the classroom, what tools you’re using.”
Getting parental approval for students to use AI tools is a smart move, said Tammi Sisk, an educational technology specialist for the Fairfax County Public Schools in Virginia, who also served as a panelist for the 91制片厂视频 Week webinar. Her school district is still developing its AI policy.
“I don’t see how we get around parent permission, especially if it’s a consumer product, like ChatGPT,” Sisk said. The tool is “also not super transparent as to what [it’s] ingesting.”
Students using an AI tool specifically designed for education (think Khan Academy’s Khanmigo chatbot, for instance) might experience more of a protected environment, but teachers and school leaders should check each tool’s privacy guidelines before deciding what to do, experts said.
Permission slips provide another benefit for schools: Helping parents better understand how AI is being used in the classroom, said Stacey Hawthorne, the chief academic officer for Learn21, a nonprofit organization that works with schools on their use of education technology.
“This is a really, really good opportunity to have conversations with parents about AI,” Hawthorne said during the 91制片厂视频 Week webinar.
Potential data privacy problems still exist with permission slips
But schools shouldn’t just get the permission slip and call it a day, said Amelia Vance, the president of the Public Interest Privacy Center, a nonprofit that works on child and student data privacy issues.
No matter students’ age, the best thing for educators to do “from an actual safety perspective and well-being perspective is to also teach kids how to limit or minimize the amount of personal information that they’re putting into the service,” Vance said.
Vance recommends that schools advise students to “turn off their history,” a setting that allows users to ask questions without the conversation later being used as training data for the tool.
Students should also be cautioned not to input essays about personal trauma, or even information as simple as the name of their school, their age, where they live, or their birthdate, Vance added.
She likened that type of advice to the warnings many adults, who are now in their 20s and 30s, heard back in middle and high school about not providing too many specifics to strangers they spoke to in chatrooms.
“It’s going to be important to make sure kids know what could be personally identifiable and what they probably shouldn’t put in even when [ChatGPT] says they’re not going to keep the information,” Vance said.