
Sign up to save your podcasts
Or


As AI tools become more common in classrooms, questions around safety, privacy, and ethics are more important than ever. In this episode, Courtney and Matt sit down with Jason Kelsall and Patrick Coniway from District Technology Services to unpack the “S” for Safety in the district’s ETHOS framework.
Together, they explore how AI tools are vetted, how student data is protected, and what safeguards are in place to ensure responsible, transparent, and ethical AI use across the district.
🔐 How Student Data Is Protected
What a data privacy agreement is—and why it matters
Why district-approved tools offer protections that personal accounts do not
How DTS works with vendors to safeguard student and educator data
🧠 How AI Tools Are Evaluated
The role of the Codex review process
How tools are reviewed for privacy, security, instructional alignment, and compliance
Why transparency in generative AI systems is non-negotiable
🧪 “Red Teaming” AI Tools
What red teaming is and how DTS stress-tests AI tools before approval
How teams attempt to “break” tools to identify risks like misuse or prompt injection
Why hands-on testing matters just as much as vendor promises
✅ District-Approved AI Tools
Why SchoolAI is approved for K–12 student use
How Gemini is approved for staff and high school students
How tools like Canva and Adobe Firefly are monitored as AI features are added
🔄 Ongoing Review & Vendor Feedback
How DTS monitors AI features added to existing tools
Why some tools are approved, some paused, and others declined
How the district provides feedback to vendors to improve safety and transparency
Matt highlights February’s Student-Powered AI Challenge winner:
A Longs Peak Middle School teacher used Gemini as a technical partner to build interactive Desmos activities
AI reduced technical barriers, allowing the teacher to focus on pedagogy
Student engagement increased, and learning clicked—proving AI works best when it supports instruction, not distracts from it
Keynote: Catlin Tucker
Date: Saturday, February 21
Perks: Free breakfast, lunch, hot chocolate, prizes
Doors open: 8:15 a.m.
AI innovation moves fast—but safety, transparency, and intentionality must lead the way. This episode highlights the thoughtful systems, partnerships, and safeguards working behind the scenes to ensure AI serves students and educators responsibly.
February PDF
By beyondalgroAs AI tools become more common in classrooms, questions around safety, privacy, and ethics are more important than ever. In this episode, Courtney and Matt sit down with Jason Kelsall and Patrick Coniway from District Technology Services to unpack the “S” for Safety in the district’s ETHOS framework.
Together, they explore how AI tools are vetted, how student data is protected, and what safeguards are in place to ensure responsible, transparent, and ethical AI use across the district.
🔐 How Student Data Is Protected
What a data privacy agreement is—and why it matters
Why district-approved tools offer protections that personal accounts do not
How DTS works with vendors to safeguard student and educator data
🧠 How AI Tools Are Evaluated
The role of the Codex review process
How tools are reviewed for privacy, security, instructional alignment, and compliance
Why transparency in generative AI systems is non-negotiable
🧪 “Red Teaming” AI Tools
What red teaming is and how DTS stress-tests AI tools before approval
How teams attempt to “break” tools to identify risks like misuse or prompt injection
Why hands-on testing matters just as much as vendor promises
✅ District-Approved AI Tools
Why SchoolAI is approved for K–12 student use
How Gemini is approved for staff and high school students
How tools like Canva and Adobe Firefly are monitored as AI features are added
🔄 Ongoing Review & Vendor Feedback
How DTS monitors AI features added to existing tools
Why some tools are approved, some paused, and others declined
How the district provides feedback to vendors to improve safety and transparency
Matt highlights February’s Student-Powered AI Challenge winner:
A Longs Peak Middle School teacher used Gemini as a technical partner to build interactive Desmos activities
AI reduced technical barriers, allowing the teacher to focus on pedagogy
Student engagement increased, and learning clicked—proving AI works best when it supports instruction, not distracts from it
Keynote: Catlin Tucker
Date: Saturday, February 21
Perks: Free breakfast, lunch, hot chocolate, prizes
Doors open: 8:15 a.m.
AI innovation moves fast—but safety, transparency, and intentionality must lead the way. This episode highlights the thoughtful systems, partnerships, and safeguards working behind the scenes to ensure AI serves students and educators responsibly.
February PDF