The year I co-taught world history and English language arts with two colleagues, we were tasked with telling the story of the world in 180 days to about 120 ninth graders. We invited students to consider how texts and histories speak to one another: “The Analects” as imperial governance, “Sundiata” as Mali’s political memory, “Julius Caesar” as a window into the unraveling of a republic. 

By winter, our students had given us nicknames. Some days, we were a triumvirate. Some days, we were Cerberus, the three-headed hound of Hades. It was a joke, but it held a deeper meaning. Our students were learning to make connections by weaving us into the histories they studied. They were building a worldview, and they saw themselves in it. 

Designed to foster critical thinking, this teaching was deeply human. It involved combing through texts for missing voices, adapting lessons to reflect the interests of the students in front of us and trusting that learning, like understanding, unfolds slowly. That labor can’t be optimized for efficiency. 

Yet today, there’s a growing push to teach faster. Thousands of New York teachers are being trained to use AI tools for lesson planning, part of a $23 million initiative backed by OpenAI, Microsoft and Anthropic. The program promises to reduce teacher burnout and streamline planning. At the same time, a new private school in Manhattan is touting an AI-driven model that “speed-teaches” core subjects in just two hours of instruction each day while deliberately avoiding politically controversial issues. 

Marketed as innovation, this stripped-down vision of education treats learning as a technical output rather than as a human process in which students ask hard questions and teachers cultivate the critical thinking that fuels curiosity. A recent analysis of AI-generated civics lesson plans found that they consistently lacked multicultural content and prompts for critical thinking. These AI tools are fast, but shallow. They fail to capture the nuance, care and complexity that deep learning demands. 

When I was a teacher, I often reviewed lesson plans to help colleagues refine their teaching practices. Later, as a principal in Washington, D.C., and New York City, I came to understand that lesson plans, the documents connecting curriculum and achievement, were among the few consistent records of classroom practice. Yet despite their importance, they were rarely evaluated for effectiveness. 

When I wrote my dissertation, after 20 years of working in schools, lesson plan analysis was a core part of my research. Analyzing plans across multiple schools, I found that the activities and tasks included in lesson plans were reliable indicators of the depth of knowledge teachers required and, by extension, the limits of what students were asked to learn. 

Reviewing hundreds of plans made clear that most lessons rarely offered more than a single dominant voice — and thus confined both what counted as knowledge and what qualified as achievement. Shifting plans toward deeper, more inclusive student learning required deliberate effort to incorporate primary sources, weave together multiple narratives and design tasks that push students beyond mere recall. 

I also found that creating the conditions for such learning takes time. There is no substitute for that. Where this work took hold, students were making meaning, seeing patterns, asking why and finding themselves in the story. 

That’s the transformation AI can’t deliver. When curriculum tools are trained on the same data that has long omitted perspectives, they don’t correct bias; they reproduce it. The developers of ChatGPT acknowledge that the model is “skewed toward Western views and performs best in English” and warn educators to review its content carefully for stereotypes and bias. Those same distortions appear at the systems level — a 2025 study in the World Journal of Advanced Research and Reviews found that biased educational algorithms can shape students’ educational paths and create new structural barriers. 

Ask an AI tool for a lesson on westward expansion, and you’ll get a tidy narrative about pioneers and Manifest Destiny. Request a unit on the Civil Rights Movement and you may get a few lines on Martin Luther King Jr., but hardly a word about Ella Baker, Fannie Lou Hamer or the grassroots organizers who made the movement possible. Native nations, meanwhile, are reduced to footnotes or omitted altogether. 

Curriculum redlining — the systematic exclusion or downplaying of entire histories, perspectives and communities — has already been embedded in educational materials for generations. So what happens when “efficiency” becomes the goal? Whose histories are deemed too complex, too political or too inconvenient to make the cut? 

None of this is theoretical. It’s already happening in classrooms across the country. Educators are under pressure to teach more with less: less time, fewer resources, narrower guardrails. AI promises relief but overlooks profound ethical questions. 

Students don’t benefit from autogenerated worksheets. They benefit from lessons that challenge them, invite them to wrestle with complexity and help them connect learning to the world around them. That requires deliberate planning and professional judgment from a human who views education as a mechanism to spark inquiry. 

Recently, I asked my students at Brandeis University to use AI to generate a list of individuals who embody concepts such as beauty, knowledge and leadership. The results, overwhelmingly white, male and Western, mirrored the biases that pervade textbooks. 

My students responded with sharp analysis. One student created color palettes to demonstrate the narrow scope of skin tones generated by AI. Another student developed a “Missing Gender” summary to highlight omissions. It was a clear reminder that students are ready to think critically but require opportunities to do so.  

AI can only do what it’s programmed to do, which means it draws from existing, stratified information and lags behind new paradigms. That makes it both backward-looking and vulnerable to reproducing bias.  

Teaching with humanity, by contrast, requires judgment, care and cultural knowledge. These are qualities no algorithm can automate. When we surrender lesson planning to AI, we don’t just lose stories; we also lose the opportunity to engage with them. We lose the critical habits of inquiry and connection that teaching is meant to foster. 

Tanishia Lavette Williams is the inaugural education stratification postdoctoral fellow at the Institute on Race, Power and Political Economy, a Kay fellow at Brandeis University and a visiting scholar at Harvard University. 

Contact the opinion editor at [email protected].  

This story about AI and teaching was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. 
