Wednesday, March 18th 2026 Event

Five myths about AI and education

A series of events at the Stanford Accelerator for Learning sheds light on the path forward for AI in teaching and learning.

by Isabel Sacks

Photo: Ryan Zhang


When you think about AI in classrooms, what emotions come up? Skepticism? Amazement? Curiosity? Fear?

As AI increasingly infuses every aspect of our lives, educators, students, parents, policymakers, and researchers are returning to fundamental questions about learning and school. 

In the winter of 2025-2026, the Stanford Accelerator for Learning hosted five events convening leaders to discuss how AI can support learning, how to develop effective AI tools, how to protect young people from AI’s risks, and what should remain fundamentally human. The conversations challenged five common myths about AI and education. 

  1. Myth: AI will replace teachers. Reality: In an increasingly tech-centered world, human connection is indispensable.

One of the most persistent fears about AI and education is that technology will eventually replace teachers. But speakers said the opposite: in an increasingly digital world, the classroom is more vital than ever as a space for human relationships and peer communities. 

“We know that learning is fundamentally cultural and social,” said Daniela Di Giacomo, associate professor at the University of Kentucky, at the fourth annual AI+Education Summit, co-hosted with the Stanford Institute for Human-Centered Artificial Intelligence (HAI). “[When] we think about how people learn in a digital age and hyperpartisan age, that should actually amplify the need for better pedagogical instruction.” 

Teacher Mike Taubman, who serves as AI innovation lead at Uncommon Schools, agreed. “The classroom is taking on an almost sacred dimension for me now, where people are gathering together to be young and human together, and grow up together and learn to argue in a very complicated country together.”

AI+Education Summit attendees heard from educators directly on the panel "How AI is Transforming How We Teach." (Photo: Ryan Zhang)

Teachers also play a key role in helping students get value from AI tools when they are used. At the Youth-Powered AI Day of Learning, a gathering of teachers and middle and high schoolers hosted by the Accelerator’s Equity in Learning initiative, Nathan Pierce, a teacher at Design Tech High School, suggested that teachers may eventually play more of a coaching role. 

“Getting an adult to work with [a learner] can get them to work on that platform better…the adult needs to be there to motivate and engage them,” said Susanna Loeb, faculty director of the Accelerator’s SCALE initiative.

  2. Myth: AI makes it too easy to cheat. Reality: We need to rethink assessment.

At the Accelerator’s conference on Responsible Assessment in the AI Era, in collaboration with ETS, speakers reframed cheating concerns as an opportunity to fundamentally rethink what and how we assess. 

“We have to move beyond just thinking about a test as something we do at the end,” said Amit Sevak, CEO of ETS, one of the world's largest educational testing organizations, which designs and administers the TOEFL, Praxis, and GRE. Emerging approaches include regular, data-driven formative assessments; scenario-based design that tests adaptability; and AI-powered personalization of tests. Best practices for any assessment that involves AI include co-design with educators and continuous human oversight.

Amit Sevak, CEO of ETS, speaks at Responsible Assessment in the AI Era. (Photo: Ryan Zhang)

“I think the days of us testing on sheer rote knowledge in a homework question is probably over. But these more delicate questions about synthesis, analysis, and integrating, that you can answer in a layered, multi-stage framework – these are the things that [AI] tools are effectively useless for,” said Paul Nuyujukian, assistant professor of neuroscience at Stanford, at the Accelerator’s third annual Accelerate Edtech Impact Summit. “The beautiful thing is that that’s the intuition you actually want to impart in your classroom.”

Dan Schwartz, dean of the Stanford Graduate School of Education (GSE) and faculty director of the Accelerator, agreed. “Our knowledge bases, our tools, our circumstances, our jobs, continue to change, so we need instruction that produces adaptive learners and assessments that can tell.”

  3. Myth: AI stifles creativity. Reality: With intentional design, AI can be a tool to spark creativity, but we need to be careful about potential drawbacks.

At the AI+Education Summit, Mehran Sahami, chair of Stanford’s computer science department, challenged the assumption that AI kills creativity. "A lot of critics say AI [is] just trained on certain information and [it] can only regurgitate that information…what do we do for most of education? Exactly the same thing. And somehow we expect students to be able to generate novel results." The question, he argued, is finding ways that AI can help humans produce novel ideas, and crafting learning experiences that focus on the process of creation rather than the product.

Hari Subramonyam, faculty affiliate of the Accelerator, showed how AI can expand creative possibilities for students. "AI can provide the infrastructure to lower the floor for creation to get students started," he said, showcasing examples like animation tools that allow hands-on exploration of physics concepts. It can also “raise the ceiling” of what is technically feasible, allowing learners to focus on higher-order thinking. "Creation helps learners organize and structure knowledge in more meaningful, usable ways, and AI should support this," he said. 

However, speakers cautioned that using AI for creative tasks can backfire. Accelerator Faculty Affiliate Guilherme Lichand shared research from middle schools in Brazil showing that while AI assistance helped students on immediate creative tasks, it led to significantly worse performance on subsequent tasks when the AI was removed. “You start thinking that the AI is more creative than you,” he explained, underscoring the importance of intentional design.

  4. Myth: The more new AI tools for learning we build, the better. Reality: It’s easy to make an AI tool, but harder to make one backed by science or research.

“We’ve democratized the ability to create products,” said Accelerator Faculty Affiliate Susan Athey, a professor at Stanford Graduate School of Business and an advisor to the World Bank. “People who have ideas, even non-technical people…now can make their ideas a reality. That’s very exciting. But what that’s starting to look like on the ground, is now that we have too many pilots, and still not enough implementations that are actually effective.” 

The solution, speakers emphasized, is grounding AI development in learning science and iterative design. Loeb urged developers to start with research. “The first step is to take what we know and apply it, and don't make the obvious mistakes that we know from everything we've done in edtech,” she said. Then, programs should collect data on implementation and engagement, through randomization and experimentation. “If you're rolling out a program to 10,000 schools with 100,000 students, do it in a way where you learn something,” she said. 

James Landay, co-founder and co-director of Stanford HAI and a faculty affiliate of the Accelerator, argued, “We need to go beyond user-centered design to human-centered AI design,” bringing students, teachers, families, and learning experts into the process and considering societal-level effects, particularly when it comes to tools that scale widely.

The Accelerator’s Create+AI Challenge brought 10 cross-sector teams to campus to pitch projects that put educators and learners at the heart of AI design. The projects, which aimed to augment teaching, learning, or career opportunities, were judged not on how impressive the technology was, but on their grounding in research and design principles and their potential impact on learning.

The winners and judges of the Create+AI Challenge on the pitch day. (Photo: Ryan Zhang)

Equally important is recognizing when an AI tool isn’t necessary at all. “If you can do it with a paper and pencil, or in-person – just give a kid a hug or have a chat – just do that! Use the technology for things that are really transformational,” said Rebecca Winthrop, senior fellow at the Brookings Institution.

  5. Myth: Banning AI from classrooms is the best way to protect students from its risks. Reality: AI is here to stay. Integrating it responsibly—with ethical guidelines and AI literacy—will prepare students to thrive.

Seventy-two percent of K-12 students routinely use generative AI, but only 28 percent can accurately describe how it works, noted Ronit Levavi Morad, senior director at Google Research, at the AI+Education Summit. How do we ensure the safety of young people with tools that they may not understand, are not necessarily designed for them, and are often used outside of school contexts? “What’s the balance between ‘protect our kids’ and ‘prepare our kids’? That is a real tension lived on a daily basis in classrooms and in homes,” said Winthrop.

Speakers across events and panels referenced lessons learned from prior technologies like social media, which schools initially ignored or banned. “This is just the most recent iteration of the thing that will be in their life, and we either teach them how to use it, or they will be used by it,” said Kirsten Baesler, U.S. assistant secretary for elementary and secondary education, at the Accelerate Edtech Impact Summit.

Attendees at the Accelerate Edtech Impact Summit were encouraged to fist-bump throughout the day. (Photo: Ryan Zhang)

The path forward requires both safety measures and AI literacy education. Erin Mote, CEO at InnovateEDU, argued that "safety is not the counterpolarity of innovation.” Rather, through a focus on safety, “we can actually unlock the types of positive use cases in AI, in schools and education, that move us beyond fear and towards knowledge, and the types of learning experiences we want for young people.”

Taubman’s "AI driver's license" program exemplifies this approach. “The idea is to map that quintessential adolescent experience of getting your driver’s license onto this AI moment,” he said. “The whole idea, as you can imagine, is to get students into the driver's seat and not the passenger seat when it comes to AI.” The impact of the program: “They start to realize that their voices matter right now and they can start to take part in shaping this world that they’re graduating into.”

Similarly, the AI Quests game, co-designed by Stanford scholars and a team from Google Research, aims to move students from passive users of AI to critical thinkers, said Victor Lee, faculty lead for AI+Education at the Accelerator and a co-creator of the game. “We’re positioning students and educators to have a sense of agency in how we use AI, how we create AI, and how we evaluate AI.”

Learn more about the Accelerate Edtech Impact Summit, Responsible Assessment in the AI Era, AI+Education Summit, Youth-Powered AI Day of Learning, and Create+AI Challenge.