Recently, 60 Minutes showcased Sal Khan, known for the popular Khan Academy, describing his Khanmigo, a new AI-powered program promising to revolutionize teaching by giving students an online personal tutor. Khan makes a similar case in his TED talk, “How AI Could Save (Not Destroy) Education.”
Khanmigo’s format, feedback, and human-like online conversations and interactions are clever and innovative. However, there is uneasiness about AI in schools. Here are some reasons why.
Mastery Learning and School Change
Khan promotes mastery learning, where students progress at their own pace online, as described in his 2012 book The One World Schoolhouse: Education Reimagined. He also portrays schools as stagnant (hence the claim that AI will save education), reinforcing the unjustified corporate reform notion that public schools fail.
He has said:
The old classroom model simply doesn’t fit our changing needs (p. 1).
That premise about public schools is the one on which the push for AI has been built, but it’s both unfair and untrue.
Audrey Watters, who wrote Teaching Machines: The History of Personalized Learning, expressed concern that Khan fails to tell the history behind mastery, competency-based, or personalized learning (pp. 6-15). Watters describes Sidney Pressey, who invented the first teaching machine in the 1920s, and behaviorist B.F. Skinner, whose teaching machines were peddled in the 1950s and 1960s.
While their contributions to learning are significant, parents often dislike the coldness and behavioral control of learning by machine. Remember online learning during the pandemic?
A larger question: how much do we want students tethered to machines?
And as far as schools not changing, Watters writes:
…those who like to repeat this tale of an “industrial model” often insist, as Khan does…that the school system has been “static to the present day.”
To call the US education system “static” from 1892 onward is woefully inaccurate—offensively so, in fact (p. 5-8).
Watters describes many progressive changes in public schools over the years: the Civil Rights movement and Brown v. Board of Education (1954), the 1975 Education for All Handicapped Children Act (now IDEA), policies like NCLB (2002), AP classes, phonics, STEM, open classrooms, the New Math, and much more (pp. 5-8). Each is worthy of a discussion on its own.
The point is that public schools have undergone tremendous changes, some of them pushed by corporate reformers and not for the better, and teachers have always helped students and families adjust.
Funding and Tech Accountability
Classroom technology has evolved with little accountability. Many online programs replace teacher preparation and classroom instruction, yet little evidence shows these programs work. Few consider the consequences.
That’s not always the case in other countries. Finland and Sweden recently returned to using books after realizing that technology failed to help students as expected.
Will AI be different? Given that the current classroom technology has been largely unregulated, how do we trust that AI will be objectively evaluated before investing more dollars?
Market.Us reports:
The Global K-12 Education Technology (EdTech) Market size is expected to be worth around USD 253.9 Billion By 2033, from USD 78.2 Billion in 2023, growing at a CAGR of 12.5% during the forecast period from 2024 to 2033.
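As a rough consistency check of those figures (my arithmetic, not Market.Us’s), the quoted 12.5% compound annual growth rate does connect the two endpoints over the ten-year span:

\[
78.2 \times (1.125)^{10} \approx 78.2 \times 3.25 \approx 254 \ \text{(USD billions)}
\]

In other words, roughly $78.2 billion in 2023 compounding at 12.5% per year lands near the projected $253.9 billion by 2033.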
Teacher Impact, Or Not
Teachers have incorporated technology into classrooms for years and will likely embrace AI.
In the 60 Minutes segment, students answer science questions and work independently with some teacher guidance. Teachers rave about tracking student progress, and AI can help them create lessons.
It seems perfect until one teacher notes that AI helps them manage their 100 students. There’s no discussion of lowering class sizes. It’s like assembly-line learning. Teachers may have more time to work with some children, but don’t all students deserve a teacher’s attention?
There’s also a sense of rush, of getting students to learn faster.
Khan states teachers will still be needed, saying:
I’m pretty confident that teaching, any job that has a very human-centric element of it – as long as it adapts reasonably well in this AI world – they’re going to be some of the safest jobs out there.
Still, the art of teaching and the professional role of teachers would likely change dramatically. This has already occurred with an increased focus on tutors rather than teachers.
A few days after the 60 Minutes program aired, an Arizona charter school announced it planned to replace its teachers with AI.
For $40,000, another school in Pennsylvania, which is working to gain charter status, promises that children need only two hours of AI-driven learning daily.
Corporate Connections
While Khan says he supports teachers, the deep-pocketed billionaires who fund him have not.
Corporate reformers have attacked public schools and teachers, making them appear incompetent. They have also promoted charter schools and vouchers.
They’ve invested in school boards strapped for funds, de-professionalized teachers and education leaders, promoted high-stakes standardized testing, and narrowed the school curriculum with the Common Core State Standards.
They influenced controversial policymaking, including No Child Left Behind, Race to the Top, and the Every Student Succeeds Act.
Tech-focused instruction has been the dream (see Christensen, Horn, and Johnson’s 2008 Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns).
Data and Privacy Concerns
Questions arise about students’ online privacy. Virginia hired its chief education officer, not an educator, from the Data Quality Campaign!
Khanmigo collects more than academic information. It also has a counseling component with features like detecting students who might hurt themselves. Teachers can access the students’ writings online. Some believe this is helpful, while others may see it as intrusive.
AI might act like an online counselor, but this doesn’t answer why public schools have failed to hire more counselors for student support.
Smaller classes would also help teachers get to know their students. However, there has been little support for reducing class sizes.
While Khan promises that data is secure, data breaches are common, and schools’ policies on data privacy and online instruction sometimes have loopholes.
Rich Schools v. Poor Schools
Lastly, watching Khan on 60 Minutes in what appears to be a well-funded school working with AI, I wondered about the families’ economic circumstances. He discusses this often, and his heart may be in the right place; however, given today’s privatization push, will AI be used well in poor schools?
Cyber charter schools already exist for the poor. Peter Greene describes Pennsylvania as the cyber charter capital, which is not something to celebrate. These schools have poor track records.
We live in an increasingly disconnected world without meaningful, critical human interactions. Democratic public schools are among the best places for students to unite and accept their differences.
While AI and technology have great learning potential, they must be checked. Students still need real teachers, counselors, and small classes where they get to know their classmates, to help them navigate learning experiences.
There are many more questions about AI and how students learn, such as how a computer’s regurgitation of information and ready-made creative suggestions will affect students’ critical thinking, the ethics of its use, and much more.
In summary, we should slow down and ensure that further technological investments also preserve the human quality that every student desperately needs in well-functioning schools and in life.
References
Khan, S. (2012). The one world schoolhouse: Education reimagined. New York, NY: Hachette Book Group.
Watters, A. (2021). Teaching machines: The history of personalized learning. Cambridge, MA: The MIT Press.
My current website is run by a company that only communicates through AI. It is maddening trying to get answers that I can apply. I once got so frustrated that the AI chatbot sensed my anger and got me to a live chat. What all of this AI learning salvation ignores is that learning is a two-way process driven by sentient instincts. AI, by what I understand to be its definition, is a protocol that depends on information that already exists. The amount of information and its breadth is amazing, but when AI engages with a learner, it is not in a position to anticipate what that learner is bringing to the table.

I continue to struggle with my website provider and acknowledge that part of this is due to my boomer brain. However, I find there to be significant gaps between what I am seeking and what AI wants me to know. One of the problems with the public education establishment’s approach to reading is that it seems to believe that reading is a foundational construct. It is our fine motor, gross motor, and social encounters that are in fact fundamental if we are to learn to read. AI, as a massive compilation of data, bypasses the sensory needs for intellectual development, as if our brains are merely data collectors. Human interaction is required to build motivation for inquiry and, for that matter, life.

I am not one who rejects the value of AI. I have a daughter who teaches middle school, and she says it is a great tool for developing lessons and compiling information. However, overriding the human sentient component required for deep learning would create intellectual deficits in creativity, problem solving, and critical thinking. AI remains a tool. Humankind is the operator.
Thanks, Paul. I’m sorry you’re having start-up problems with your blog, which reads well, by the way: https://paulabonnerwrites.com/. I had trouble with the hosting company when I started. I went to proudly show my work to a relative and the site had disappeared! I also agree about the gaps and about how reading is taught and learned.
I think your daughter is right, and that’s what I also gathered from the 60 Minutes episode. But I fear that teachers are having to build around AI instead of the other way around. That’s what I got from Khan.
“Will AI be different? Given that the current classroom technology has been largely unregulated, how do we trust that AI will be objectively evaluated before investing more dollars?”
The answers (as you already know): NO, and you can’t trust AI.
Why not trust it? Because it is a plagiarizing abomination, taking others’ thoughts and writings and pawning the writing off as a “product” of AI. And we all know that one must “develop the market” (see Bernays) in order to sell more units. Not to mention the external costs that are sloughed off onto society.
Right. And I’ve been reading about all the energy it will take!
Thanks, Duane. You’re always on point!
Thank you so much for this, Nancy. I’ve been thinking/writing about AI and education a lot lately, and your sentiments echo my own (reassuringly). AI in schools (currently) seems to be a solution in search of a problem. I can very much see that a teacher of 100 students might find AI helpful in lesson planning; but why not reduce class sizes? I just saw that the World Economic Forum listed “Mis- and Dis-information” as its #1 short-term global risk (2025). It seems more than ever that skills like critical thinking must be honed within the context of real-world relationships so we can best prepare our students and children for a future where “truth” may not be easily identifiable.
Thanks for your comment, Emily! Great point! Students face so much information. I read that teachers in Finland teach students how to tell accurate information from false.
Until someone figures out how to monetize small class sizes, that will never be considered as a means to improve public education.
On a more positive note, for those who may not know, Audrey Watters is back to blogging after a hiatus. Her new blog is titled “Second Breakfast,” and she has been tracking the rise of AI. Here’s a recent post.
https://2ndbreakfast.audreywatters.com/ai-literacy-and-the-pedagogy-of-the-oppressor-2/
Thanks, Christine. I’m glad to hear she’s back.