The fierce battle over AI in schools

New York City, with the largest public school district in the country, was breaking ground on a novel, AI-themed high school when district leadership abruptly pulled the plug last month. They cited mounting parental concern and nationwide backlash to what has been labeled rapid, unsafe adoption of AI.
And adoption has been rapid among students across the country. Used properly, many argue, the tech could transform learning and fill gaps in an overburdened education system. But others worry it will be a generational misstep that worsens learning development.
Mashable spoke with a dozen stakeholders — parents, child safety advocates, AI literacy experts, tech leaders, and a state representative proposing stronger EdTech regulation — to lay out what is at stake when you add AI to the equation.
AI moratoriums: Safe choice or miscalculation?
Dylan Arena, chief data science and AI officer for education solutions giant McGraw Hill, told Mashable that the history of EdTech is cyclical. First came the wholesale introduction of computers and the internet. Then there was the push for 1:1 devices (personal laptops, Chromebooks, tablets). Now, it's AI.
He described similar hype cycles around personalized or "adaptive" learning (you'll hear this term surrounding AI, as well). Arena sees AI adoption as less an evolution and more a "pendulum swing or a wobbly spiral." AI, for what it's worth, is much older than our current LLM obsession would lead you to believe, and it has already been in classrooms. McGraw Hill's web-based AI assessment tool, ALEKS, was designed 25 years ago.
"Early on, the conversation was about access: devices, connectivity, and digital materials. Now the conversation has to be about impact," said Melissa Loble, chief academic officer at EdTech giant Instructure. Instructure, which offers popular learning management system Canvas, announced partnerships with OpenAI and Anthropic in 2025. "The benefits are real when technology is used with a clear purpose. We are not trying to add AI simply because it is new."
AI developers and tech proponents advocate for gated, human-administered AI experiences in the classroom, as well as administrative applications for teachers and staff, arguing these will reduce workloads, enhance learning, and ease the friction of modern classrooms. They also argue that future workforces will be defined by their ability to detect and leverage AI: whether or not a student or educator intends to use it, they should at least know how AI operates.
"On one hand, the demand for generative AI in schools has grown at an extraordinary pace. On the other, that pace has understandably raised important questions about safety and the long-term impact on learning," said Naria Santa Lucia, general manager of the Microsoft Elevate initiative. "Ideally, every school adopts AI with a clear plan that includes guidelines co-developed with educators, strong privacy protections, and dedicated time for teacher training to ensure students and teachers are best prepared for the future AI economy."
"Our priority in education is to ensure AI works to the benefit of learning and students," Leah Belsky, vice president of education at OpenAI, told Mashable. "To do so, we partner with teachers, institutions, and students to advance our tool and research outcomes. We launched ChatGPT for teachers to help teachers build deep fluency with AI so that they can play a key role in guiding students in how to use AI well."
Many agree that the tech's adoption shouldn't be rushed, and that popular generative AI tools don't yet have a place in K-12. OpenAI and Anthropic, for example, only offer their classroom products for higher education.
"Our learning tools on Chromebooks are built with educators, giving them the control to decide what’s best for their students," said Google spokesperson Maggie Shiels.
The company reiterated that Gemini for Education, NotebookLM, and other Google AI products are compliant with child privacy laws, a leading concern in the debate. Students' chats aren't used for AI training and Gemini in Workspace isn't available to students under 18.
Most EdTech leaders Mashable spoke to share concerns about an overabundance of screen time among youth. Several acknowledged a concerning lack of long-term research on AI's impact on cognition and learning outcomes.
"The answer is not hype, and it is not fear," said Loble. "It is evidence, governance, and learning."
Those tools could be a genuine solution to public education's dilemmas, proponents say. "There is a real difference between purpose-built systems, systems built for educational outcomes, and general purpose AI," Ashish Bansal, founder of AI math tutor StarSpark.AI, told Mashable.
Bansal says that generative AI tools can address inequities between students with access to support at home and those without. Multimodal technologies, like live translation, can make school easier for second language learners. He argues for classrooms built on collaboration, social interaction, and group problem solving, with generative AI offering support for individual learning.
Several EdTech makers Mashable interviewed believe that smaller AI solutions can address the societal issues posed by Big Tech's universal models, but those solutions require time and investment. Moratoriums or bans would render that nearly impossible.
AI moratoriums could also pose risks themselves, Santa Lucia and others warn.
"I understand the instinct, everyone wants to be sure we get this right, and we share that caution. But we believe the real opportunity is not to stop progress, but to shape it," she said. "The more constructive path in my view is to meet that moment with intentional design."
"In our judgement, there shouldn't be any AI-facing instruction for children in elementary schools," said Randi Weingarten, president of the American Federation of Teachers (AFT).
The AFT is vocally opposed to teacher replacement; its stance is that educators should have the opportunity to learn about and deploy generative AI should they see fit, empowering them to make the choice instead of Big Tech. The union partnered with Microsoft, OpenAI, and Anthropic last year to launch the country's first National Academy for AI Instruction, serving its 1.8 million members.
"AI is probably the most pronounced industrial revolution, certainly in my lifetime, but maybe in civilization," said Weingarten. "Every societal change shows up in teachers' classrooms."
AI education is not a green light for adoption, or even advocacy, argues Amanda Bickerstaff, CEO of AI for Education, an AI literacy organization that partners with educational institutions and advises districts on ethical AI deployment.
"We are living in an inflection point. When people think about generative AI, they often think of it like an app or device that can be turned off. But generative AI is more similar to the internet and electricity in that it's the power underneath the applications," she said. "[AI] is the fastest growing consumer technology. It cannot be contained."
The case for an AI pause
On April 16, a group of 250 organizations and experts convened by child safety nonprofit Fairplay penned a letter to schools across the U.S. and Canada calling for a five-year moratorium on classroom AI. It wasn’t the first.
A few months prior, a group of concerned parents, teachers, and climate activists in New York City issued their own call for a two-year moratorium. The group was formed in the wake of an August Daily News op-ed written by NYC parent and public school teacher Liat Olenick.
"It's really insidious," Olenick said of Big Tech's presence in schools. "Our kids are not the client, they're the product." In Olenick's experience, both parents and educators are being thrown into the world of AI with little transparency or communication from districts. Beyond her fears about AI's impact on the environment, she says the deployment of AI learning chatbots like Amira and Magic School AI in NYC elementary schools pushed her to act. Investing in the future of our children and planet, Olenick argues, does not mean investing in AI.
A moratorium, proponents say, is a common-sense option to get districts to slow down.
Those pushing AI moratoriums argue that schools are jumping into a technology without fully knowing its ramifications. They cite the potential misuse of student data, as well as institutional security risks. Cyberattacks on K-12 schools have greatly increased in recent years, including a recent Instructure breach.
But the biggest concern of people like Olenick is the effect of AI on young learners' brains. Recent, limited-scale studies on chatbots have indicated that overuse leads to poorer critical thinking and other developmental harms.
Every pro-moratorium source Mashable spoke to expressed worry that more technology will worsen screen addictions, increase cognitive fatigue, and devalue the importance of human teaching and social interactions. Josh Golin, executive director of Fairplay, told Mashable that AI is supercharging existing problems across all of EdTech.
Many sources called it a "Wild West" situation, and feared children were being used as guinea pigs in a nationwide AI experiment. They believe the argument that AI is ubiquitous, and that it will remain that way, is built on a faulty premise — that generative AI is good, effective, and in demand. The most concerned see a push for more AI as a thinly veiled attempt to solve understaffing with AI, not more funding.
Legislators, like Vermont House Representative Angela Arsenault, suggest pauses give time for regulation to catch up. "We fell so far behind with social media, and now we have fallen almost as far behind with EdTech in general. We are very quickly losing any opportunity we have to try to keep pace with AI." Arsenault and a growing number of bipartisan lawmakers have introduced a number of bills aimed at governing EdTech.
"It's time for everyone to pause and ask what kind of society we want to see," said Anya Meksin, Los Angeles Unified School District (LAUSD) parent and deputy director of Schools Beyond Screens, a signatory of Fairplay's moratorium letter and a co-author of LAUSD's screen time limits resolution. In the last year, Schools Beyond Screens has grown to 2,000 members and 100 national chapters, advocating for reduced screen time in schools and a return to pencil and paper learning.
The urgency to adopt AI is manufactured, its opponents argue. With mounting pressure from investors, companies must present a world where tech adoption is a need, not a want, one in which their billion-dollar valuations are justified. School districts are just falling in line, having been "wined and dined" to spend tens of billions of dollars on tech over the last 20 years, said Golin.
"They're not nonprofits," said Meksin. "These are for profit companies going after public dollars. They're going after our tax money, our district money, that is extremely precious and in short supply."
In this framing, turning to smaller EdTech companies isn't a solution, either. Many still build on top of Big Tech's core models, they note, including OpenAI's GPTs. Most still want some form of tech in classrooms.
"The notion that an AI is going to be able to differentiate instruction and personalize a lesson better than I can is Orwellian," said Joe Clement, a Virginia public school teacher and co-author of Screen Schooled, a 2017 book detailing the overuse of technology in U.S. classrooms. Clement describes an "enmeshment" of student technology and AI, making it challenging to avoid in education. He argues it's overburdening children and making it harder to build engaged, critical learners.
While some believe AI is an equity gap filler, others believe it will exacerbate existing problems rampant across under-resourced schools. Many, like Clement, pointed to well-funded private schools pivoting away from 1:1 devices and technology in favor of hands-on human tutoring, leaving AI to the underfunded.
A ship without a rudder
The lack of a unified voice, and little federal intervention, is further fragmenting the debate, sources explained. "The Federal Department of Education has really abdicated its responsibility of being a clearinghouse on best practices," said Weingarten. "In fact, they are doing the opposite. They're doing the bidding of Big Tech, as opposed to listening to the people."
The Department of Education issued AI guidelines in 2025 but, to Weingarten's point, has ceded AI's ethical implementation to schools themselves. AI policies across the country are still being drafted or are nonexistent. Rapid initial adoption has made it even more difficult to retroactively scale back and reset.
Confusion reigns, and parents, teachers, districts, and even students themselves are trying to regain some semblance of control.
As Bickerstaff, the AI for Education CEO, puts it: "This is one of the noisiest things that's ever happened in education."