Just 30 minutes outside of Cleveland is Chagrin Falls Exempted Village Schools, a small, suburban district with approximately 1,600 students across four schools. There’s no sign of the bustling city here—the eponymous waterfalls cascade through the center of Chagrin Falls’ historic downtown, with buildings dating back to the late 19th century. You probably wouldn’t expect a small district in such a postcard-perfect town to be in the vanguard of K-12 schools getting serious about AI. But they created and implemented a strong vision for AI use in the classroom long before most other districts across the country did.
In this very issue, The AI-Driven Leader author Geoff Woods tells SchoolCEO why so many leaders have been reluctant to embrace AI: They just don’t know where to start. Here, we share how one district’s leadership and technology department collaborated to create interest in AI, train their community and craft policies around its use to successfully adopt this emerging technology at scale.
Early Adopters
About a week after ChatGPT, then powered by GPT-3.5, made a huge splash in late 2022, Director of Technology and Information Systems Mike Daugherty started playing with it. He thought it was a novelty at first, but an alarmist article in The Atlantic titled “ChatGPT Will End High School English” made him rethink its possibilities—good and bad. It became clear to Daugherty that AI would transform education no matter what, so the best strategy would be to get ahead of it. “I brought it to our administrative team, and we started having conversations around it,” says Daugherty. “This was the future. We could see the writing on the wall. So we talked about how we could embrace it.”
While many teachers feared The Atlantic’s predictions might come true—that ChatGPT would replace them and that students would no longer do their own work—Daugherty had a more optimistic view. The 18-year vet of Chagrin Falls Schools had helped teachers at this district weather other big technological shifts, and they had come out the other side unscathed. “When Google first came out, teachers were like, ‘What are we going to do? Kids are just going to search for the answer,’” he says. “We taught them how to use search engines. We figured out how to change our lesson plans. We didn't pretend it didn't exist. Everybody adapted.” The same would need to be true of AI.
The Rollout
Ultimately, the decision to go full speed ahead with AI was all about students. “When something is coming down the pike, we ask how it will grow our individual students,” says Superintendent Dr. Jennifer Penczarski. “How do we embrace it, so they can be prepared for the future they're going to face?” But before the district could start using AI in classrooms, they needed to get teachers on board. They figured that if AI became a positive influence in teachers’ lives, they'd be much more open to students using it.
Next, the tech team would need to expand the circle of learning to students, then to families and eventually to the wider community. “Our strategic direction is ‘Inspiring All to Grow and Thrive.’ The ‘all’ is what matters. ‘All’ is our students, our staff, our community, our partners,” says Penczarski. “When we’re doing something, we want to make sure that we're all growing and that we all understand it.” And that commitment to involving the entire community—from training community members on the technology to explaining how Chagrin Falls would use it in schools—is what made this rollout so successful.
Getting Teachers on Board
The initial phase of the process focused on teachers: showing them how AI could lighten their loads and giving them time to experiment with it. In April 2023—just a handful of months after ChatGPT caused a sensation—Daugherty, along with Tech Integration and Instructional Coach Molly Klodor, gave simple one-hour presentations at each of the district’s school sites. “We were just showing what different tools did and how they could be used,” Daugherty explains. “Once you had some base knowledge as a teacher, you could easily see, ‘How would I want to write policy? What am I putting in a syllabus?’”
They followed that up with a full day of professional development that summer. Nearly a third of their teaching staff attended—not just to hear about AI, but to have a discussion about it. “Part of that day was sitting with teachers saying, ‘What do we need to put in our handbook? How do we address the cheating aspect?’” Daugherty says. “So we built that with them.” Initially, the district’s five syllabus options essentially ranged from “Use AI as much as you want” to “Just let me know you're using it” to “No, you can never use it in my class.” Teachers could choose the option that made the most sense for them and their students. “That choice and voice were the two key factors that made us unique and able to run with this so early,” he says.
That August, when the Chagrin Falls team surveyed faculty and staff about their AI use, they found that the technology hadn’t caught on as much as they’d hoped. They had follow-up conversations with teachers to see why they weren’t using AI more. “A lot of it was, ‘I don’t know how I would use it,’ or ‘I don't have time to try it in my classroom,’” says Daugherty.
To incentivize participation, they decided to run weekly raffles for a prime parking spot at each building—a highly coveted prize in Ohio’s freezing winter temps. “All you had to do was enter how you were using AI into this form,” he says. They also asked staff members to estimate how much time AI was saving them. The response grew. As more people shared their use cases, Klodor in turn shared them with teachers and staff in a weekly email.
By paying attention to which areas of the district still weren’t participating, Daugherty and Klodor could also lean in to provide more training and assistance where necessary. “We were like, ‘Huh, there's not a lot of entries from the fifth grade. Let's go figure out why,’” says Daugherty. They visited teachers in classrooms to discuss what they needed. And that piece—differentiating AI adoption and support for different grades and individual teachers—falls right in line with the district’s focus on personalizing education.
Because of Chagrin Falls’ strong focus on individual students, Penczarski says, teachers are usually familiar with their students’ voices—and know when something they turn in doesn’t sound like them. Still, the district knew they’d have to address cheating in order to get teachers fully on board. They took several steps to do so by harnessing technology to police technology. For instance, Daugherty and his team showed teachers ways AI could help them determine if a student cheated. “When someone sends you a paper and you suspect that AI wrote it, put it back through AI and say, ‘Ask five questions about this paper,’” Penczarski offers as an example. “If the student actually wrote it, they’ll be able to answer those questions really quickly.”

Teaching Students How to Use AI Wisely
As teachers grew more comfortable with AI, “we began to go into classrooms and start educating students,” says Daugherty, “to have conversations about everything from ethical use to citing sources to some of the inherent biases that are built into large language models, or LLMs.” For instance, to demonstrate those biases to a class full of students, Daugherty and his team would ask ChatGPT to generate images of “an intelligent person.” “It would generate 12 images, all older white men wearing glasses. So that was a great lesson for them to understand,” he says. By teaching them to look out for inherent biases in LLMs, they were also teaching them to question their own.
Finally, they taught students when they could and couldn’t use AI, which varied by grade and by classroom. The team stressed to students that every time they did use it, they needed to fact-check their work. These classroom visits wouldn’t be the only time students would learn when and how to use AI, either. Daugherty and Klodor asked teachers to review what constituted acceptable AI use in their classrooms every time they began work on a major project.
The Chagrin Falls team knew that in order to teach students responsible AI use, the district’s adults would need to model it. “That was something that Mike intentionally taught: If you're using AI in something, even if it's in your email responses, include a note that AI may have been used to write that response,” Penczarski says. “We're practicing what we're teaching students.”

Safety and Security Measures
The initial training sessions for teachers and students centered largely on publicly available LLMs. With both groups, Klodor and Daugherty emphasized the importance of not uploading personal data, like names and photographs. For example, when teachers became interested in using AI to help write IEPs (individualized education programs)—a hugely important but very time-consuming task—Daugherty and Klodor showed them how to do so without using personally identifying information. “One of the strategies we use is to swap the student's name with ‘Student X,’” he says.
Formalizing AI use in classrooms would mean no longer relying solely on those publicly available LLMs. Instead, Chagrin Falls would need to carefully choose and adopt tools for districtwide use. For students, they eventually decided to streamline use to one primary platform—a tool called MagicSchool. “We picked an AI tool that was built for schools and wouldn’t share our data,” says Daugherty. “It’s not that we don’t talk about other ones, but we don’t encourage use of things that aren’t in our little ecosystem.”
Using a tool built for schools with a chatbot baked in also meant they would have more oversight of students’ conversations—a crucial step toward protecting students’ well-being. After all, as young people start to spend more time with AI, they can begin to view chatbots as real humans and even as friends. It was important to the team that students not have inappropriate conversations or become over-reliant on AI for companionship. “Sometimes students can feel like they’re talking to a real person, but this is a computer,” Daugherty says. “The tool that we picked has some safeguards built in, so if a student’s conversation is going off the rails, it will let the educator know.”
Building Buy-In with Families and the Community
Of course, the administration wanted to make sure families understood the district’s decision to allow AI use in schools, too. At a parent night, the district demonstrated how the technology works and how it benefits students, from enhancing classroom lessons to preparing them for an AI-driven workforce. They also showed families ways to use the technology themselves to improve their children’s education.
“If your child's struggling with a geometry problem and you’re not sure exactly how to help, you can enter the specific question and ask AI how to support the student,” offers Penczarski. “Upload notes and say, ‘Help me generate a practice quiz,’” adds Daugherty. “If your child's assignment has a rubric, upload it into Gemini or ChatGPT. Then upload the work that your child has done and ask if there are areas that could be improved.”
Once families understood how AI works and how they might use it themselves, they broke into smaller groups to hear how teachers of different grades and subjects were using it with their students. “We’ve had sixth graders talking to ecosystems. The AI chatbot took on the role of a rainforest or a desert. The students would ask it questions, and it would respond accordingly,” Daugherty says. “Another group was trying to reinvent an outdated lesson, and those kids ended up using AI to generate images and build a podcast.” Everywhere they could—in those weekly newsletters and PD sessions and presentations to families—the tech team highlighted examples like these to show how AI was actually helping teachers better engage students.
But implementing AI at Chagrin Falls Schools went even further—showing members of the community why and how the district was embracing it. The district already hosted a council made up of senior citizens in the community “to make sure that they're up to speed with what the school's doing and the impact that actions are going to have,” says Penczarski. “We introduced AI to them because it was what kids and families were talking about.” But as with teachers, students and families, they wanted to make sure the seniors understood how to use it first.
“Just like with our teachers, we worked on prompt writing with our senior citizens—setting a persona, giving it direction and an action, and telling it how you want the results,” Daugherty says. They even showed them practical tips for improving their results so seniors wouldn’t give up after a first failed attempt. “When you get what you are looking for, ask, ‘What should I have asked you to get to this answer the first time?’” he explains. Making sure all of their stakeholders had a baseline understanding of the technology wasn’t strictly necessary for a successful rollout, but by teaching them something new and including them in the process, the schools strengthened the seniors’ connection to the district.
Going Forward
In the future, the technology department and district administration are committed to listening to their community to make sure their AI strategy is still working. “Mike uses things like the parking space contests to get data, to help target what he's designing for the future. Those kinds of things make it unique and make it sustainable,” says Penczarski.
With that feedback, Daugherty can continue fine-tuning policy and guidelines, as well as offer better training. “The PD is tiered so you can enter at any point in your learning,” explains Penczarski. “That level of PD customization is huge for us when it comes to training and preparing our staff and even our students. When I'm ready to take that next step and learn something new, he and Molly can make it happen.”
If you’re considering how to adopt AI in your district, Penczarski says that you need to work closely with your technology department. Silos are the enemy of a successful AI rollout. If your teachers show a lot of resistance, show them how AI can benefit them first. Then work with them to craft policies and guidelines that address privacy, security and cheating concerns.
It will also be hugely beneficial to collect feedback as you go—both to share use cases and to identify those who are struggling with the new technology. Make sure you share your new guidelines with students and demonstrate AI’s potential pitfalls. Invite parents in to see all the ways AI is actually enriching learning, not replacing it. And as with any big move, be transparent with your community about it. Above all else, make sure you provide ample training to everyone—teachers, students, staff, families and even folks who don’t have kids in your district. After all, educating people is what you do best.
Chagrin Falls Schools consistently ranks as one of the top districts on the Ohio state report card. Penczarski attributes this success to their strategic direction, “inspiring all to grow and thrive,” as well as their three organizational mainsprings: “connect, create and challenge.” The district’s strong connection to their families and communities has meant looping them into the AI adoption process and even training them on it. By adopting AI early and at scale, they are creating opportunities for students to grow academically and professionally. And they have been able to do all of this by challenging themselves not to be complacent, but to do the uncomfortable job of rethinking how they work and how they educate.
Now, AI is certainly helping humans at Chagrin Falls Schools succeed, but more importantly, humans are helping humans succeed. That is the not-so-secret ingredient to a truly effective rollout.
