
AI Guidance for Staff

The Lancer Way

AI and The Lancer Way

Norton's approach to AI is grounded in our Portrait of a Learner. AI should strengthen these habits — not replace them.

  • L (Lead): Own your learning. Don't let AI do the work for you.
  • A (Adapt): Use AI for feedback, revision, and improvement.
  • N (Navigate): Critically evaluate AI-generated content.
  • C (Communicate): Support your voice — don't replace it.
  • E (Empower): Use AI responsibly and ethically.
  • R (Reflect): Consider how AI contributed to your growth.
Section 1

Our Shared Vision

Norton is committed to a human-centered approach to AI that supports growth and learning, preserves authentic thinking, and protects the privacy, security, and trust of students, staff, and families. We believe technology should serve people — not replace them.

Human-Centered

AI supports educators and students. It does not replace relationships, professional judgment, or human expertise at the core of teaching and learning.

Growth & Learning

AI should expand what is possible — not shortcut the cognitive work that makes learning meaningful.

Authentic Thinking

Every student deserves to develop their own voice, reasoning, and skills. AI supports that development — it does not substitute for it.

Privacy & Trust

Protecting personal information of students, families, and staff is a non-negotiable condition for any AI use in Norton.

Why this vision matters to Norton. When our AI Task Force and community stakeholders began this work, two things became clear quickly. First, we recognized that the value we most want AI to protect is growth — for students and for staff. Second, we recognized that the value we most worry AI could unintentionally erode is authenticity: the genuine development of each student's own thinking, voice, and skills. This tension sits at the heart of every decision in these guidelines. We are not anti-AI. We are pro-learning. Every boundary and expectation in this document exists to keep those two commitments in balance.

Section 2

Guiding Principles

Drawn from DESE AI Guidance for K–12 Education (August 2025). Return to these when a situation is not explicitly addressed.

1. Data Privacy & Security

AI use must comply with FERPA, COPPA, 603 CMR 23.00, and Norton's Acceptable Use Policy. Only tools vetted through Norton's formal DPA process may be used with student or staff PII. This is our most immediate and non-negotiable obligation.

2. Transparency & Accountability

Educators, students, and families deserve to know when AI plays a role in learning, communications, or decisions. Educators are accountable for every output they use, regardless of whether AI helped generate it.

3. Bias Awareness & Mitigation

AI tools reflect biases in their training data. Staff must critically examine AI-generated content for bias, stereotyping, or inaccuracy before use — especially content used with students. Human review is not optional.

4. Human Oversight & Educator Judgment

AI supports educators; it does not replace them. No AI tool may serve as the sole basis for any high-stakes decision about a student — including grades, discipline, special education eligibility, or referral. Educators are always in the loop.

5. Academic Integrity

Students need to develop their own thinking, reasoning, and skills. Unless a teacher explicitly authorizes AI use for a specific purpose, students should not use AI to generate or substantially alter their academic work. Communicate AI expectations clearly on every assignment.

Section 3

The Core Privacy Rule

Non-Negotiable Legal Requirement

Do not enter student or staff personally identifiable information (PII) into any AI tool that does not have a signed Data Privacy Agreement on file with Norton Public Schools. This is a legal obligation under FERPA, COPPA, and 603 CMR 23.00 — not a preference.

PII includes: student names, ID numbers, dates of birth, grades, assessment scores, disciplinary records, IEP/504 information, health data, attendance, behavioral data, family contact info, and any combination of details that could identify a specific individual. Staff evaluation data, personnel records, and salary information carry the same protection. A tool's popularity is not a substitute for a signed DPA.

Tool Approval Status

Check the Norton SDPC Online Resources Database before using any AI tool with student or staff data.

Approved: A signed DPA exists. You may use the tool with student PII, consistent with any noted restrictions and an approved educational purpose.
Approved with Restrictions: A DPA exists with specific conditions. Read those conditions carefully and do not use the tool outside them.
Renewal Pending: The agreement is expiring and renewal is in progress. You may continue using the tool while the agreement is active; monitor the expiration date.
Not Approved: No DPA exists. Do not use with student or staff PII. Submit a request to Instructional Technology if you believe the tool has genuine educational value.
Declined: The vendor refused to sign a DPA. Do not use with student or staff PII. The reason is documented in the database.
Watch for Embedded AI: AI features are increasingly built into platforms you already use — Google Workspace, Canva, PowerSchool. Apply the same data privacy standards to embedded AI features. If unsure whether a specific feature is covered by an existing DPA, contact Instructional Technology before using it with student data.
Requesting a New Tool: Do not use the tool with student data until it is approved. Submit a request to the Office of Instructional Technology with: tool name, vendor, intended educational use, grade level(s), and what student data the tool would access. This process can take weeks to months — plan ahead.
Section 4

AI Assessment Scale

Use this scale on every assignment to communicate AI expectations clearly — reducing ambiguity, anxiety, and academic dishonesty. Adapted from NHS PEAK team members; consistent with DESE AI guidance.

Level 1: No AI
Students rely solely on their own understanding, skills, and knowledge. Usually completed in a monitored environment.
  • Traditional quizzes & tests
  • In-class monitored essays
  • GoGuardian-monitored work
Level 2: AI Pre-Task
AI for planning, idea development, and research only. Final product shows how ideas were developed and refined.
  • Outlines & organizers
  • Brainstorming
  • Generating keyword lists
  • Breaking down assignments
  • Cite AI use
Level 3: AI Collaboration
AI helps complete the task. Students critically evaluate, modify, and cite any AI-generated content used.
  • Editing & refinement
  • Feedback loops
  • Graphics & layouts
  • Optional reflection log
  • Cite AI use
Level 4: Full AI
No limitations. Students direct and engage with AI to achieve goals. Students act as "the director."
  • Grading AI-generated essays
  • Creative AI exploration
  • Ungraded practice
  • Cite AI use
Section 5

Guidance on Student Use of AI

Default expectation: AI tools are not permitted for student work unless a teacher explicitly authorizes a specific use for a specific purpose. Student AI use is an educator's professional responsibility to define, structure, and supervise.

Norton's Phased Approach to Student AI Use

Norton is building toward student-led AI use deliberately and in sequence. The grade-level framework below reflects where each band currently sits in this progression — not a permanent ceiling, but a purposeful starting point.

Phase 1: Exposure
Staff build fluency and confidence first. Students learn what AI is through teacher-led demonstrations — observing, not operating.
Primarily: PreK–Grade 5
Phase 2: Teacher-Led Use
Structured, educator-directed use of approved tools is introduced. AI literacy instruction precedes active student use. The teacher remains in control of the tool and the task.
Primarily: Grades 6–8
Phase 3: Student-Led Use
Students use approved tools for specific, defined tasks with clear expectations about attribution, disclosure, and academic integrity. Educators authorize and supervise; students direct.
Primarily: Grades 9–12

Grade-Level Framework

PreK–Grade 2: AI chatbots and generative tools are not appropriate for direct student use. AI use at this level is for staff only: lesson planning, materials preparation, and curriculum design. Students learn about AI through teacher-led demonstrations.

Grades 3–5: AI tools are not recommended for independent student use. Where AI is embedded in approved curriculum platforms, its role should be incidental and under teacher supervision. Curriculum-embedded AI literacy is appropriate and encouraged. Students begin to Navigate by identifying when a tool is "helpful" vs. "doing the work for them."

Grades 6–8: Structured, educator-directed use of approved AI tools may be introduced. AI literacy instruction must precede any active student use. Personal accounts on non-approved tools are not permitted. Students Communicate by clearly citing where AI assisted their final product.

Grades 9–12: Educators may permit AI use for specific, defined tasks with clear expectations about attribution and academic integrity. District-approved tools only. Students must Adapt (use AI feedback to improve drafts) and Communicate (clearly cite AI assistance).

Before Students Use Any AI Tool: Educator Checklist

  • Verify the tool is listed as Approved in the Norton SDPC database
  • Provide age-appropriate AI literacy instruction before students actively use the tool
  • Communicate explicit AI expectations on each assignment — students should never have to guess
  • Review all AI-generated outputs before using or distributing them — you are professionally responsible for everything that reaches students or families under your name

Academic Integrity in the Age of AI

No Detection Tools as Sole Basis

AI detection tools may only be used as a preliminary data point to initiate a conversation — never as the sole basis for a disciplinary finding. High false-positive rates disproportionately affect English language learners, creating significant equity concerns.

Make Thinking Visible

Design assessments so the process of reasoning is evident. When thinking is visible, the role of AI in any final product becomes less determinative. Version history and oral defense are effective verification tools.

Safe Disclosure & Citation

Teach students to acknowledge AI use openly. Any use of Generative AI must be cited using current MLA/APA guidelines, including the prompt used and date of access.

Lancer AI Literacy: The Global Impact

Algorithmic Bias

AI models reflect the biases in their training data. Lancers must critically examine all AI-generated content for stereotypes, cultural insensitivities, or inaccuracies. Human review is never optional.

Environmental Cost

Generating a single AI response requires significantly more electricity and water than a standard search query. Use these tools mindfully — only when they add genuine value to learning.

Media Integrity

As AI-generated media becomes more prevalent, develop skills to verify the authenticity of what you see and hear. Seek primary sources to confirm information.

Section 6

Role-Specific Responsibilities

Approved student AI tools: Google Gemini and Canva for Education (SSO-only) are the district-approved AI tools for student use. Responsible AI use is a shared responsibility across all roles.

 

All Staff
  • Check the SDPC database before using any AI tool with PII
  • Never enter PII into Not Approved or Declined tools
  • Critically review all AI-generated content before use or distribution
  • Maintain professional accountability for all outputs
  • Contact Instructional Technology with questions

 

Classroom Teachers & Specialists
  • Communicate explicit AI expectations on each assignment using the NPS AI Assessment Scale
  • Provide AI literacy instruction before students use any approved AI tool
  • Design assessments that make student thinking visible
  • Do not use AI to independently grade student work or make high-stakes decisions
  • Do not enter student names into AI tools to generate feedback without a signed DPA

 

Building Administrators
  • Model appropriate AI use and reinforce guidelines with building staff
  • Ensure no AI tool is deployed school-wide without prior review by Instructional Technology
  • Do not use AI to make or substantially inform personnel decisions or evaluations
  • Surface staff questions and concerns to Instructional Technology

 

Paraprofessionals & Support Staff
  • Apply the same data privacy standards to AI tools as to any other digital tool or student record
  • Do not enter student info into AI tools unless Approved and supervisor has confirmed the specific use
  • If uncertain, ask before proceeding — contact your administrator or Instructional Technology

 

Counselors & Student Support Staff
  • Do not enter mental health info, counseling notes, or sensitive records into any AI tool without a signed DPA and explicit confirmation from the Director of Instructional Technology
  • Do not use AI to make or substantially inform referrals, eligibility determinations, or student recommendations

 

Central Office & Administrative Staff
  • Do not enter confidential district data into AI tools without a signed DPA
  • Apply district procurement principles when evaluating new tools: privacy compliance, accessibility, and bias mitigation must be evaluated before adoption
  • Ensure AI-assisted community communications are reviewed for accuracy, bias, and appropriateness
Section 7

Transparency with Families

Public Tool Database

Norton's Online Resources Database lists all approved tools, DPA status, and restrictions. Families may access this at any time at the Norton SDPC site.

NPS Data Privacy Initiative

Plain-language information about how Norton protects student data is available at the NPS Data Privacy Initiative website.

Educator Disclosure

When educators use AI in ways that directly affect student learning or assessment, they should communicate that to students and, where appropriate, to families.

Section 8

Frequently Asked Questions

Q: Can I use AI for my own professional tasks, like lesson planning?
A: Yes, with one condition: you must not enter any student or staff PII. Drafting a generic lesson plan, brainstorming ideas, writing a rubric, or summarizing a professional article is generally fine. The moment you include a student's name, describe a specific student's situation, or reference IEP/504 information, you need a signed DPA. When in doubt, anonymize before you type. Always use school accounts with school-approved tools — not personal accounts.
Q: Does general information like grade level or subject area count as PII?
A: No. Grade level, subject area, school building, and general instructional context are not PII. PII is information that identifies or could reasonably identify a specific individual — names, ID numbers, dates of birth, grades on specific assignments, assessment scores, disciplinary records, IEP/504 content, health information, attendance records, behavioral data, and family contact information. A useful test: if someone could read what you typed and identify a specific student, it is PII.
Q: A vendor says its tool is "FERPA compliant." Is that enough?
A: No. "FERPA compliant" is a claim a vendor makes about their own practices — it is not a legally binding agreement with Norton. A signed DPA with Norton specifically establishes district ownership of student data, prohibits advertising use, requires breach notification within a defined timeframe, and gives Norton audit rights. If a vendor declines to sign a DPA, that is a significant red flag. Norton has documented exactly this situation with several well-known vendors.
Q: I requested approval for a new tool. Can I use it while I wait?
A: You may use it for your own professional tasks as long as you do not enter student or staff PII. You may not use it with students or any student data until it has been approved. Submit your request to the Office of Instructional Technology as soon as possible — the process can take weeks to months. Planning ahead for next school year is strongly encouraged.
Q: I accidentally entered student PII into an unapproved tool. What should I do?
A: Stop using the tool immediately and contact the Instructional Technology Office as soon as you realize what happened. Do not wait. This is a privacy and compliance issue that needs to be assessed quickly. Depending on what was entered, there may be steps needed to request data deletion from the vendor, document the incident, or notify the appropriate people. The earlier you report it, the more options the district has.
Q: Can I use AI to help draft IEP goals?
A: Using AI to generate sample IEP goal language based on a disability category or grade level — without entering any specific student information — is permissible and a useful starting point. What you may not do: enter a student's name, diagnosis, assessment scores, or any identifying information into an AI tool without a signed DPA. IEP content is among the most sensitive student data that exists and is protected under FERPA and IDEA.
Q: Can I use AI to help write progress report comments?
A: Only if the AI tool has a signed DPA with Norton. What many teachers do successfully: use AI to generate template language or sentence starters for common progress report situations, then personalize those templates themselves without entering student-specific data into the AI tool.
Q: Can I use AI to help write behavioral or disciplinary documentation?
A: No, not with student-specific information, unless the tool has an approved DPA covering behavioral and disciplinary data. Disciplinary records are explicitly protected under FERPA. You can use AI to help think through how to describe a type of behavior in general terms, then write the student-specific documentation yourself.
Q: Can I use AI to draft an email to a family about a specific student?
A: Only if the tool has a signed DPA. Family communication about a specific student is student record information under FERPA. A practical workaround: draft the email yourself first, then use an approved tool (or an unapproved tool with all student-specific information removed) to refine language and tone. Then reinsert the student-specific content manually.
Q: Can I use Gemini features inside Google Workspace with student data?
A: Yes. Google Workspace for Education is an approved platform, and Gemini features are covered under Norton's signed DPA with Google. You may use Gemini in Gmail, Docs, Classroom, and other Workspace applications with student data consistent with how you use those tools generally.
Q: Is Canva's AI covered by a DPA?
A: Norton has an approved DPA with Canva for Education, but with one important restriction: only SSO-enabled accounts are covered. Make sure you are logging in through Norton's Single Sign-On, not a personal Canva account. Before using a specific new AI feature in Canva with student data, check with Instructional Technology.
Q: I suspect a student used AI on an assignment without permission. What now?
A: Start with a conversation, not an accusation. Ask the student to walk you through their thinking and process — how they approached the assignment, what decisions they made, what they would change. Genuine understanding of one's own work is hard to fake. This conversation is usually more revealing than any detection tool. Use the NPS AI Assessment Scale going forward to set clear expectations.
Q: Can I use an AI detection tool to prove a student cheated?
A: No. Norton's guidelines explicitly prohibit using AI detection tools as the basis for academic integrity findings. These tools have unacceptably high error rates and disproportionately flag the writing of English language learners and students who write in non-standard registers — creating significant equity concerns. A detection result is not evidence; it is a probability estimate. Rely instead on assessments that make thinking visible and honest conversations about authorship.
Q: What can I safely use AI for without worrying about privacy?
A: Quite a lot. Any task that does not involve student or staff PII is fair game using an approved tool. Examples: generating first drafts of rubrics, brainstorming differentiated activity ideas, creating reading passages at different Lexile levels, drafting parent newsletter content, summarizing a professional article, generating sample test questions aligned to a standard, explaining a concept in simpler language, and drafting meeting agendas or professional development materials. The common thread: you are using AI as a thinking partner — not as a processor of student information.
Sections 9 & 10

Resources & Review

NPS Data Privacy

Review Cycle

  • Annual review at the start of each school year
  • Mid-year updates communicated to all staff as needed
  • Staff feedback always welcome — contact Instructional Technology

AI Task Force & PEAK Members

Suhaib Abdullahi, Student
Brian Ackerman, Asst. Superintendent
Suzanne Adams, Parent
Aisha Alchaar, Parent
Lori Andrade, LGN STEAM Teacher
Jennifer Attubato, Parent
Paul Barrette, JCS Principal
Kim Birkett, NHS English Teacher
Susie Cashton, LGN & HAY STEAM Teacher
Chris Cummings, NMS Grade 6 Teacher
Julie Durmis, Elementary Library Media Specialist
Kim Dwyer, NMS Grade 8 Teacher
TJ Flanagan, NHS Principal
Patrick Garber, Parent
Skylar Garber, Student
Vincent Hayward, NMS Principal
Jess Holicker, NMS Grade 6 Teacher
Kara Husselbee, LGN Kindergarten Teacher
Stephanie Lerner, NMS Special Education Teacher
Melanie Mathews, Parent
Kerri Murphy, NHS Computer Science Teacher
Liz Norcliffe, JCS Speech & Language
Rachel Pilotte, NHS Business & Technology
Bobby Portway, NHS Business & Technology Teacher
Jennifer Skowronek, NHS Business & Technology Teacher
Joe Spremulli, NMS Grade 8 Teacher
Robert Werner, Parent
Karen Winsper, Director of Instructional Technology
Jennifer Young, NHS Library Media Specialist
Kim Zajac, NMS Speech & Language
This guidance document was developed with the assistance of Generative AI tools and reviewed and approved by Norton Public Schools staff.

Norton Public Schools  ·  AI Guidance for Staff  ·  2025–2026 School Year

Issued by the Office of Instructional Technology  ·  Questions? Contact Karen Winsper, Director of Instructional Technology

This is a living document. Reviewed annually and updated mid-year as needed.