Agency over Agents

EdTech should focus on humans, not AI.
The commercial pressure to do the opposite is enormous.
If AI agents can replace even a fraction of human effort, district salary budgets become addressable markets. That is orders of magnitude larger than traditional software spend. That incentive, not pedagogy, drove a lot of what I saw at ASU+GSV last week.
Even Khanmigo, the most visible AI tutor on the market, couldn't sustain its hype. Adoption is low. Outcomes are unclear. Students don't engage the way the demos suggest. Khan Academy's recent conclusion is telling: "the biggest lever is investing in human systems." That is not a refinement. It is a reversal.
The issue was never whether AI could explain mathematical concepts. It is whether any instructional tool, detached from a teacher and from classroom dynamics, will broadly translate into learning.
There is an even deeper issue in EdTech. Teachers have become the final integration layer, expected to bring together curriculum, tools, and assessments, and to adapt them for each student in the classroom.
Classrooms don't have a content problem. They have a coherence problem. Adding AI does not solve that. It increases the coordination burden on a system already under stress.
Teachers bring sustained attention, care, and judgment. They interpret the learner in a wider context. They sense confusion, calibrate challenges, decide when to press and when to pause. They hold both the content and the student's mental model at the same time.
Since 2009, we've been embedded in classroom routines. Teachers led us to a different conclusion: reduce the stress, effort, and coordination burden around K–5 math education so that more educators can bring that level of clarity to more students.
If that's the goal, then the mainstream approach to building synthetic educators is pointing in the wrong direction. We had to think differently about AI system design.
Four AI design principles
If AI is going to significantly improve math outcomes at scale, it needs infrastructure designed to strengthen the human craft of education, not replace it.
Current EdTech tools are mostly intrusive, optimized for student engagement. The teacher carries the burden of integrating everything into coherent instruction. They are expected to assemble personalized packages of high-quality instructional materials (HQIM), resources, and assessments for students at every stage of the curriculum. Then, of course, document all of it across multiple systems.
In reality, the bottlenecks are diagnosing what students actually understand, grouping and regrouping them based on their learning paths, and aligning instruction, practice, and assessment, all under severe time constraints.
Treating this as a content problem instead of a coherence problem leads to predictable failure. It misses the real leverage AI has at the analysis and coordination layers.
In our private beta, AI generates a five-minute class read: grouping students by missing strategies rather than surface-level scores, and highlighting cross-classroom patterns for instructional coaches.
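To make "grouping students by missing strategies rather than surface-level scores" concrete, here is a minimal sketch of that kind of analysis. The strategy names, data shapes, and miss threshold are all illustrative assumptions, not XtraMath's actual schema or algorithm.

```python
from collections import defaultdict

# Hypothetical quiz records: (student, strategy, correct).
RESPONSES = [
    ("ava",  "make-ten", False),
    ("ava",  "doubles",  True),
    ("ben",  "make-ten", False),
    ("ben",  "count-on", True),
    ("cara", "doubles",  False),
]

def group_by_missing_strategy(responses, threshold=0.5):
    """Group students by the strategies they miss, not by overall score."""
    stats = defaultdict(lambda: [0, 0])  # (student, strategy) -> [misses, attempts]
    for student, strategy, correct in responses:
        stats[(student, strategy)][1] += 1
        if not correct:
            stats[(student, strategy)][0] += 1
    groups = defaultdict(set)
    for (student, strategy), (misses, attempts) in stats.items():
        if misses / attempts > threshold:
            groups[strategy].add(student)
    return dict(groups)

print(group_by_missing_strategy(RESPONSES))
# → {'make-ten': {'ava', 'ben'}, 'doubles': {'cara'}}
```

The point of the sketch: two students with the same overall score can land in different groups because the analysis keys on which strategy broke down, which is the information a teacher can act on.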
1) Start with signal
Most EdTech systems begin with content, ask the teacher to organize all the pieces, and end with more data generated in a dashboard for someone else to interpret.
That's backwards. So far this school year, we've had over 5 million students complete over 100 million daily math practice quizzes, automatically tracking their zone of proximal development as they progress from conceptual understanding to demonstrated fluency.
You need this kind of high-frequency evidence of where students actually are on their learning journey. How their minds really work, under time pressure.
We can now reason over these billions of student-generated signals without teacher input or additional student screen time. Most AI chatbots start the user with an empty text box and expect them to explain what they need. We begin with rich evidence about each student, gathered from daily practice rather than high-stakes assessment.
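One way to picture this kind of signal: classifying a student's progress on a skill from accuracy and response-time telemetry, as they move from conceptual understanding toward demonstrated fluency. The thresholds and stage names below are illustrative assumptions, not XtraMath's actual model.

```python
def stage(attempts):
    """Classify a skill from (correct, seconds) attempt telemetry.

    Hypothetical thresholds: fluency requires fast AND accurate responses
    under time pressure; accuracy alone indicates conceptual understanding.
    """
    if not attempts:
        return "no-signal"
    accuracy = sum(1 for ok, _ in attempts if ok) / len(attempts)
    avg_time = sum(t for _, t in attempts) / len(attempts)
    if accuracy >= 0.9 and avg_time <= 3.0:
        return "fluent"       # fast and accurate: automatic recall
    if accuracy >= 0.7:
        return "conceptual"   # accurate but not yet automatic
    return "emerging"         # still building understanding

# Nine fast correct answers plus one slow miss: fluent.
print(stage([(True, 2.1), (True, 2.8), (True, 1.9), (True, 2.5), (True, 2.2),
             (True, 2.0), (True, 2.4), (True, 2.6), (True, 2.3), (False, 4.0)]))
# → fluent
```

Because the telemetry arrives as a byproduct of daily practice, a signal like this updates continuously with no extra screen time and no testing event.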
2) Reduce cognitive load and coordination burden
EdTech has struggled for a decade due to fragmentation pushed onto teachers. Teachers don't need more tools or dashboards. They need more clarity around instructional decisions.
The job is not to add to an already crowded library of choices. It is to compress complexity: who needs help, why, and what can the educator do about it now.
XtraMath's AI recommendations are informed by the open-source Learning Commons Knowledge Graph, which identifies the best evidence-based resource for each K–5 math skill. In practice, that means a five-minute class read created from multiple sources. Students grouped by missing strategies, not scores. Patterns surfaced across classrooms for professional development, not buried in reports.
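The shape of a knowledge-graph lookup like this can be sketched in a few lines. The graph contents, skill names, and `recommend` function here are hypothetical stand-ins; the real Learning Commons Knowledge Graph is a much richer open-source artifact.

```python
# Hypothetical miniature knowledge graph: each skill node links missing
# prerequisite strategies to an evidence-based resource.
GRAPH = {
    "add-within-20": {
        "prerequisites": ["make-ten", "count-on"],
        "resources": {
            "make-ten": "Ten-frame warmup",
            "count-on": "Number line practice",
        },
    },
}

def recommend(skill, missing_strategy):
    """Return the resource addressing a missing strategy, or None."""
    node = GRAPH.get(skill)
    if node is None or missing_strategy not in node["prerequisites"]:
        return None
    return node["resources"].get(missing_strategy)

print(recommend("add-within-20", "make-ten"))
# → Ten-frame warmup
```

The design point is compression: the teacher sees one recommended next step tied to a diagnosed gap, not another library of options to sort through.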
Most EdTech tools claiming personalization for students result in increased cognitive load for the educator. Large commercial platforms manage complexity by reducing district choice and teacher discretion.
XtraMath AI will allow teachers to make better use of the approved curriculum and resources they already trust.
3) The learning loop is human-to-human
We don't mean a "human-in-the-loop" of an increasingly persuasive agent. We mean AI as a background presence in service of the human-to-human learning loop.
When users don't engage, EdTech tends to become more aggressive. More prompts. More interruptions. More attempts to insert itself into the workflow. Agent-first approaches will always optimize for whatever product metrics they're given. They will push towards whatever improves their dashboard, but not necessarily what the student really needs.
If the system has to fight for attention, it is misaligned. AI's superpowers are better used for silently reasoning over evidence and describing patterns at massive speed and scale.
XtraMath's AI mascot, Mr. C, operates in the background: adapting daily quizzes, reinforcing the student's effort, consolidating data into useful reports. Then it gets out of the way.
4) Design for cost and control from day one
Many commercial EdTech systems act like black boxes. That creates predictable constraints:
- limited transparency and adaptability
- risk of vendor lock-in
- difficulty meeting regulatory requirements
Layering AI agents on top of such a system compounds those concerns.
Elementary math educators need an alternative infrastructure: an AI Harness. This is a system designed to constrain, manage, and govern how artificial intelligence enters the classroom.
The XtraMath AI Harness for Elementary Math has five layers:
> Signals (fed by ongoing telemetry of student performance)
> Resources (knowledge graph for coordinating HQIMs)
> Skills (optimized, task-specific instructions)
> Inference (use the best model for the job)
> Governance (policy-driven, safe, and auditable systems)
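The flow through these five layers can be pictured as a simple pipeline. Every function name and stub here is an illustrative assumption about the structure, not the actual harness implementation; the point is the ordering, with governance gating what reaches a teacher.

```python
def signals(classroom):
    """1) Signals: ongoing telemetry of student performance (stubbed)."""
    return {"ava": {"missing": ["make-ten"]}}

def resources(evidence):
    """2) Resources: knowledge-graph lookup for matching HQIM (stubbed)."""
    return {student: ["Ten-frame warmup"] for student in evidence}

def skills(evidence, matched):
    """3) Skills: compose an optimized, task-specific instruction."""
    return f"Summarize needs for {len(evidence)} student(s) using {matched}"

def inference(prompt):
    """4) Inference: route the task to the best model for the job (stubbed)."""
    return f"DRAFT REPORT: {prompt}"

def governance(report, policy):
    """5) Governance: apply district policy before anything is surfaced."""
    return report if policy.get("allow_reports") else None

def harness(classroom, policy):
    evidence = signals(classroom)
    matched = resources(evidence)
    report = inference(skills(evidence, matched))
    return governance(report, policy)

print(harness("room-101", {"allow_reports": True}))
```

Note that real student data enters at the bottom layer and district policy sits at the top: if policy disallows a report, nothing the model produced ever reaches the classroom.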
This approach allows real data to drive high-value reasoning that is aligned to standards, with policy control at the district level. It also keeps costs contained, so AI can be delivered responsibly at scale without breaking public budgets.
Our mission
The most profitable question in EdTech is still: can anyone build synthetic educators?
The more important work is different: reducing the stress, effort, and fragmentation that sits between students and teachers every day.
AI is already in classrooms. The question is whether it fragments the education system further or makes it more coherent.
Elementary math education faces an analysis and coordination problem that XtraMath AI is well-suited to address. If you're a district leader, provider, or funder working on this problem, let's design this together. Reach out to me directly: roy@xtramath.org