AI Guidance for Staff
Norton Public Schools · Office of Instructional Technology
AI and The Lancer Way
Norton's approach to AI is grounded in our Portrait of a Learner. AI should strengthen these habits — not replace them.
Our Shared Vision
Norton is committed to a human-centered approach to AI that supports growth and learning, preserves authentic thinking, and protects the privacy, security, and trust of students, staff, and families. We believe technology should serve people — not replace them.
Human-Centered
AI supports educators and students. It does not replace relationships, professional judgment, or human expertise at the core of teaching and learning.
Growth & Learning
AI should expand what is possible — not shortcut the cognitive work that makes learning meaningful.
Authentic Thinking
Every student deserves to develop their own voice, reasoning, and skills. AI supports that development — it does not substitute for it.
Privacy & Trust
Protecting personal information of students, families, and staff is a non-negotiable condition for any AI use in Norton.
Why this vision matters to Norton. When our AI Task Force and community stakeholders began this work, two things became clear quickly. First, we recognized that the value we most want AI to protect is growth — for students and for staff. Second, we recognized that the value we most worry AI could unintentionally erode is authenticity: the genuine development of each student's own thinking, voice, and skills. This tension sits at the heart of every decision in these guidelines. We are not anti-AI. We are pro-learning. Every boundary and expectation in this document exists to keep those two commitments in balance.
Guiding Principles
Drawn from DESE AI Guidance for K–12 Education (August 2025). Return to these when a situation is not explicitly addressed.
Data Privacy & Security
AI use must comply with FERPA, COPPA, 603 CMR 23.00, and Norton's Acceptable Use Policy. Only tools vetted through Norton's formal DPA process may be used with student or staff PII. This is our most immediate and non-negotiable obligation.
Transparency & Accountability
Educators, students, and families deserve to know when AI plays a role in learning, communications, or decisions. Educators are accountable for every output they use, regardless of whether AI helped generate it.
Bias Awareness & Mitigation
AI tools reflect biases in their training data. Staff must critically examine AI-generated content for bias, stereotyping, or inaccuracy before use — especially content used with students. Human review is not optional.
Human Oversight & Educator Judgment
AI supports educators; it does not replace them. No AI tool may serve as the sole basis for any high-stakes decision about a student — including grades, discipline, special education eligibility, or referral. Educators are always in the loop.
Academic Integrity
Students need to develop their own thinking, reasoning, and skills. Unless a teacher explicitly authorizes AI use for a specific purpose, students should not use AI to generate or substantially alter their academic work. Communicate AI expectations clearly on every assignment.
The Core Privacy Rule
Non-Negotiable Legal Requirement
Do not enter student or staff personally identifiable information (PII) into any AI tool that does not have a signed Data Privacy Agreement on file with Norton Public Schools. This is a legal obligation under FERPA, COPPA, and 603 CMR 23.00 — not a preference.
Tool Approval Status
Check the Norton SDPC Online Resources Database before using any AI tool with student or staff data.
AI Assessment Scale
Use this scale on every assignment to communicate AI expectations clearly — reducing ambiguity, anxiety, and academic dishonesty. Adapted from NHS PEAK team members; consistent with DESE AI guidance.
No AI Permitted
- Traditional quizzes & tests
- In-class monitored essays
- GoGuardian-monitored work

AI for Planning & Brainstorming
- Outlines & organizers
- Brainstorming
- Generating keyword lists
- Breaking down assignments
- Cite AI use

AI for Editing & Refinement
- Editing & refinement
- Feedback loops
- Graphics & layouts
- Optional reflection log
- Cite AI use

Full AI Exploration
- Grading AI-generated essays
- Creative AI exploration
- Ungraded practice
- Cite AI use
Guidance on Student Use of AI
Norton's Phased Approach to Student AI Use
Norton is building toward student-led AI use deliberately and in sequence. The grade-level framework below reflects where each band currently sits in this progression — not a permanent ceiling, but a purposeful starting point.
Grade-Level Framework
| Grade Band | Guidance for Educators |
|---|---|
| PreK–Grade 2 | AI chatbots and generative tools are not appropriate for direct student use. AI use at this level is for staff only: lesson planning, materials preparation, and curriculum design. Students learn about AI through teacher-led demonstrations. |
| Grades 3–5 | AI tools are not recommended for independent student use. Where AI is embedded in approved curriculum platforms, its role should be incidental and under teacher supervision. Curriculum-embedded AI literacy is appropriate and encouraged. Students begin to Navigate by identifying when a tool is "helpful" vs. "doing the work for them." |
| Grades 6–8 | Structured, educator-directed use of approved AI tools may be introduced. AI literacy instruction must precede any active student use. Personal accounts on non-approved tools are not permitted. Students Communicate by clearly citing where AI assisted their final product. |
| Grades 9–12 | Educators may permit AI use for specific, defined tasks with clear expectations about attribution and academic integrity. District-approved tools only. Students must Adapt (use AI feedback to improve drafts) and Communicate (clearly cite AI assistance). |
Before Students Use Any AI Tool: Educator Checklist
- Verify the tool is listed as Approved in the Norton SDPC database
- Provide age-appropriate AI literacy instruction before students actively use the tool
- Communicate explicit AI expectations on each assignment — students should never have to guess
- Review all AI-generated outputs before using or distributing them — you are professionally responsible for everything that reaches students or families under your name
Academic Integrity in the Age of AI
No Detection Tools as Sole Basis
AI detection tools may only be used as a preliminary data point to initiate a conversation — never as the sole basis for a disciplinary finding. High false positive rates disproportionately affect English language learner (ELL) students, creating significant equity concerns.
Make Thinking Visible
Design assessments so the process of reasoning is evident. When thinking is visible, the role of AI in any final product becomes less determinative. Version history and oral defense are effective verification tools.
Safe Disclosure & Citation
Teach students to acknowledge AI use openly. Any use of generative AI must be cited using current MLA/APA guidelines, including the prompt used and the date of access.
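As an illustration only, an MLA-style entry for a generative AI response generally places the prompt in quotation marks as the title, followed by the tool, version, provider, date of access, and URL (the prompt, tool, and date below are hypothetical; verify the exact format against current MLA guidance):

```
"Summarize the main causes of the American Revolution" prompt.
ChatGPT, 4 Mar. version, OpenAI, 15 Sept. 2025, chat.openai.com.
```

APA style follows a similar pattern but credits the provider as author (e.g., OpenAI) and describes the tool and version in the title field.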
Lancer AI Literacy: The Global Impact
Algorithmic Bias
AI models reflect the biases in their training data. Lancers must critically examine all AI-generated content for stereotypes, cultural insensitivities, or inaccuracies. Human review is never optional.
Environmental Cost
Generating a single AI response requires significantly more electricity and water than a standard search query. Use these tools mindfully — only when they add genuine value to learning.
Media Integrity
As AI-generated media becomes more prevalent, develop skills to verify the authenticity of what you see and hear. Seek primary sources to confirm information.
Role-Specific Responsibilities
All Staff
- Check the SDPC database before using any AI tool with PII
- Never enter PII into Not Approved or Declined tools
- Critically review all AI-generated content before use or distribution
- Maintain professional accountability for all outputs
- Contact Instructional Technology with questions
Classroom Teachers & Specialists
- Communicate explicit AI expectations on each assignment using the NPS AI Assessment Scale
- Provide AI literacy instruction before students use any approved AI tool
- Design assessments that make student thinking visible
- Do not use AI to independently grade student work or make high-stakes decisions
- Do not enter student names into AI tools to generate feedback without a signed DPA
Building Administrators
- Model appropriate AI use and reinforce guidelines with building staff
- Ensure no AI tool is deployed school-wide without prior review by Instructional Technology
- Do not use AI to make or substantially inform personnel decisions or evaluations
- Surface staff questions and concerns to Instructional Technology
Paraprofessionals & Support Staff
- Apply the same data privacy standards to AI tools as to any other digital tool or student record
- Do not enter student information into AI tools unless the tool is Approved and your supervisor has confirmed the specific use
- If uncertain, ask before proceeding — contact your administrator or Instructional Technology
Counselors & Student Support Staff
- Do not enter mental health info, counseling notes, or sensitive records into any AI tool without a signed DPA and explicit confirmation from the Director of Instructional Technology
- Do not use AI to make or substantially inform referrals, eligibility determinations, or student recommendations
Central Office & Administrative Staff
- Do not enter confidential district data into AI tools without a signed DPA
- Apply district procurement principles when evaluating new tools: privacy compliance, accessibility, and bias mitigation must be evaluated before adoption
- Ensure AI-assisted community communications are reviewed for accuracy, bias, and appropriateness
Transparency with Families
Public Tool Database
Norton's Online Resources Database lists all approved tools, DPA status, and restrictions. Families may access this at any time at the Norton SDPC site.
NPS Data Privacy Initiative
Plain-language information about how Norton protects student data is available at the NPS Data Privacy Initiative website.
Educator Disclosure
When educators use AI in ways that directly affect student learning or assessment, they should communicate that to students and, where appropriate, to families.
Resources & Review
- Google Resources
- Training & Professional Learning
- NPS Data Privacy
Review Cycle
- Annual review at the start of each school year
- Mid-year updates communicated to all staff as needed
- Staff feedback always welcome — contact Instructional Technology