Schools and universities are adopting AI tools at extraordinary speed. AI tutoring systems, automated grading, plagiarism detection, learning management platforms, and chatbots are already in classrooms. The question isn't whether AI will be in education. It's whether it'll be governed.
Key Issues
In the U.S., the Family Educational Rights and Privacy Act (FERPA) protects student education records. The Children's Online Privacy Protection Act (COPPA) restricts data collection from children under 13. When a school adopts an AI tool that processes student data, it must ensure compliance with both.
The problem: many AI tools are adopted by individual teachers without institutional review. A teacher who pastes student names and grades into ChatGPT to generate report card comments has just shared FERPA-protected data with OpenAI. This isn't hypothetical. It's happening daily.
Establish a clear, written AI use policy. Define which tools are approved. Define what student data can never be entered into AI systems (names, grades, behavioral records, health information, IEP data). Require institutional review before any new AI tool is adopted. Train teachers on these policies, because prohibition without education leads to shadow AI use.
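A policy like this can also be backed by a simple technical guardrail. The sketch below is a minimal, hypothetical example of a scrubbing step that replaces student names and grades with placeholders before any text leaves the building; the roster, function name, and regex patterns are illustrative assumptions, not a complete FERPA compliance solution.

```python
import re

def scrub_student_data(text: str, roster: list[str]) -> str:
    """Replace student names and common grade patterns with placeholders
    before text is pasted into any external AI tool.

    Hypothetical sketch: a real deployment would pull the roster from the
    student information system and cover far more identifier types
    (IDs, birth dates, IEP references, health terms, etc.).
    """
    for name in roster:
        # Match each roster name case-insensitively as a whole phrase.
        text = re.sub(rf"\b{re.escape(name)}\b", "[STUDENT]",
                      text, flags=re.IGNORECASE)
    # Redact letter grades (A-F, optional +/-) followed by space/punctuation.
    text = re.sub(r"\b[A-F][+-]?(?=\s|[.,)]|$)", "[GRADE]", text)
    # Redact percentage scores like "88%".
    text = re.sub(r"\b\d{1,3}\s?%", "[SCORE]", text)
    return text

prompt = ("Write a report card comment for Jamie Lee, "
          "who earned a B+ (88%) this term.")
print(scrub_student_data(prompt, ["Jamie Lee"]))
```

Note the order: names are scrubbed first so that grade patterns never partially match inside a name. A scrubber like this reduces, but does not eliminate, the risk of sharing protected records; institutional review of the tool itself is still required.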
AI-generated writing is now indistinguishable from student writing in most cases. AI detection tools are unreliable and have been shown to disproportionately flag writing by non-native English speakers. The answer isn't better detection. It's better pedagogy. Redesign assessments to emphasize process (drafts, revisions, in-class work) over product. Teach students to use AI as a tool, with transparency and attribution, rather than pretending it doesn't exist.
Students need to understand how AI works, what it can and can't do, where its biases come from, and how their data is used. This isn't a computer science elective. It's a civic literacy requirement. Just as we teach media literacy and financial literacy, AI literacy should be part of every curriculum. Students who understand AI are less likely to be exploited by it.
Ask your child's school: What AI tools are being used in the classroom? Has the school reviewed those tools for FERPA and COPPA compliance? Is there a written AI use policy? What data is being collected about my child, and by whom? You have the right to ask these questions and the right to receive answers.
Why are children protected when adults aren't? Because children were recognized as vulnerable before the internet existed. FERPA was enacted in 1974. COPPA was enacted in 1998. These laws were written to protect people who couldn't protect themselves.
Adults were assumed to be capable of reading terms of service, understanding data policies, and making informed choices. That assumption was always questionable and is now laughable. The average person encounters thousands of data collection events per day through apps, websites, smart devices, and AI tools. No human being can meaningfully consent to all of it.
The honest answer: adults aren't protected because the political will to protect them hasn't yet overcome the lobbying power of the industries that profit from their data. This is fixable. It requires legislation. See the Government Action page.
Action Checklists
For K-12 Schools and Districts
- Establish a written AI use policy before the next academic year. High
- Audit every AI tool currently in use for FERPA and COPPA compliance. High
- Ban entry of student names, grades, behavioral records, IEP data, and health information into any unapproved AI tool. High
- Train all staff on approved tools and prohibited data sharing. Include substitutes and aides. High
- Require institutional review before any new AI tool is adopted at the school or district level. Medium
- Communicate AI policies to parents proactively. Don't wait for them to ask. Medium
- Incorporate age-appropriate AI literacy into curriculum across all grade levels. Medium
- Review vendor contracts for data retention, data sharing, and training use clauses. Recommended
For Colleges and Universities
- Establish an institution-wide AI use and academic integrity policy with input from faculty, students, and legal. High
- Audit all AI tools used in admissions, advising, financial aid, and student services for bias and compliance. High
- Redesign assessments to emphasize process (drafts, revisions, in-class work) over product. High
- Stop relying on AI detection tools. They're unreliable and disproportionately flag non-native English speakers. High
- Require transparency from any vendor using student data to train AI models. Medium
- Create AI literacy requirements that go beyond computer science departments. Medium
- Review research data governance policies to address AI-assisted research and AI-generated content in publications. Medium
- Offer faculty development on AI-informed pedagogy. Not just "how to catch cheaters." Recommended
For Parents
- Ask your child's school: What AI tools are being used in the classroom? High
- Ask: Has the school reviewed those tools for FERPA and COPPA compliance? High
- Ask: Is there a written AI use policy? Request a copy. High
- Ask: What data is being collected about my child, by whom, and where is it stored? Medium
- Review privacy settings on any school-issued devices or accounts your child uses. Medium
- Talk to your kids about AI. They're using it whether you know about it or not. Make it a conversation, not a lecture. Medium
- If your child has an IEP or 504 plan, verify that no AI tool has access to that data. High
- Attend school board meetings where technology adoption is discussed. Your voice matters in that room. Recommended
For Students
- Know your school's AI policy. If they don't have one, that tells you something. High
- Never paste personal information, other people's names, or private conversations into any AI tool. High
- If you use AI to help with schoolwork, be transparent about it. Attribution builds credibility. Hiding it destroys it. High
- Understand that AI-generated content can be wrong, biased, or fabricated. Verify everything. Medium
- Read the terms of service on the AI tools you use. Yes, actually read them. Your data is the price of admission. Medium
- Learn how AI works at a conceptual level. You don't need to code. You need to understand what these systems can and can't do. Medium
- If an AI tool asks for your location, camera, microphone, or contacts, ask yourself why it needs them. Recommended
- Talk to your parents and teachers about AI. You probably know more than they do. Use that to start a real conversation. Recommended
Next Steps
Students deserve protection and preparation. So do you.
