Your Children’s Teachers Will Be Machines. Who’s Writing Their Values?
A letter to those who hold the future in trust — from one who has watched these civilizations from within.
The phone call came at 3:47 in the morning.
I remember because my father’s clock — the one he gave me the year I left home — had just struck. I was reading by lamplight when the screen beside me lit the darkness.
“Are you watching?”
I was not. But by morning, I would understand that the world had shifted while I sat in my study, and nothing would be quite the same again.
This is about the world your children will inherit. It is about what we are building in the dark hours before dawn, and what we are in danger of forgetting while we build it.
Read slowly. There is no hurry. The machines will wait for us.
• • •
Two screens glowed that night. Sixteen time zones apart.
In Hangzhou, a young engineer sat before his terminal in the hour before dawn. Outside, the first birds had begun to stir — those ancient creatures who have never needed an alarm, who have always known when light is coming.
DeepSeek-R1 had crossed a threshold few believed would be crossed so soon.
This was not another chatbot. His team had built a reasoning engine whose chain-of-thought logs ran for pages — the model arguing with itself, reconsidering, correcting its own logic in ways that read less like computation and more like deliberation. It solved problems that had stumped far more expensive systems, at a fraction of the cost the industry assumed was necessary — not because it was conscious, but because the architecture of reasoning itself proved more powerful than anyone had predicted.
He thought of his grandmother in Chengdu. She still wrote letters by hand. She believed wisdom lived in patience, not speed. She would pour tea slowly, the steam rising between them, and ask about his work in a tone that suggested she already knew more than he was telling her.
He wondered what she would make of a machine that could now write letters indistinguishable from her own.
The wonder and the question sat together in his chest, as they always do in anyone who builds what they cannot fully understand.
In San Francisco, an executive watched Nvidia fall. Not dip — collapse. Nearly six hundred billion dollars in market value vanished in a single day — the largest one-day loss any American company had ever suffered — not because anything had broken, but because a team in China had shown that the future might be built differently, and more cheaply, than anyone in Silicon Valley had assumed.
Every civilization carries this warning — Iblis, whose brilliance became his prison when he refused to bow; the churning of the ocean, where the same labor produced both nectar and poison; the farmer who pulled his rice shoots to make them grow faster, and by morning found them dead.
The question was never whether to reach. It was how to reach wisely. And wisdom, as anyone who has lived long enough knows, cannot be stolen. It can only be earned — slowly, through suffering and through love.
• • •
Consider a single fact: over ninety percent of the data used to train the most powerful AI models is in English. English, the native tongue of barely five percent of the world’s people. The models reshaping our world were not merely trained on text — they were trained on a worldview. And when deployed in Arabic, in Hindi, in Mandarin — they carry an accent. Not phonetic. Moral.
I have seen content moderation systems flag the verses of Al-Mutanabbi — recited by lovers and mourners for a thousand years — as threatening. The writings of Kabir, the mystic weaver who challenged Hindu and Muslim orthodoxy with equal fearlessness, whose riddles carry truths no algorithm can parse, come back labeled “dangerous ambiguity.” Tagore’s Gitanjali, which won him the Nobel Prize, speaks of a God who dwells among “the poorest, and lowliest, and lost” — language that confounds any system trained to rank and categorize.
And what of Amin Maalouf, the Christian son of Lebanon, who wrote The Crusades Through Arab Eyes — a book that dared to tell the story of holy war from the other side of the walls, and in doing so revealed that there is no “other side,” only human beings certain they are right? What system trained on a single civilization’s assumptions could hold that double vision without collapsing it into one?
And what of Maimonides — Musa ibn Maimun, known to Jews as the Rambam — the sage who wrote in Arabic, served as physician to the royal court of Saladin, and is revered by Jews, Christians, and Muslims alike? He came from a tradition that preserved the voices of the minority alongside the majority, that taught “these and these are the words of the living God,” and believed truth was large enough to hold contradiction.
“Teach thy tongue to say ‘I do not know,’” the saying attributed to him goes, “and thou shalt progress.”
What algorithm could understand that the highest wisdom sometimes wears the clothes of ignorance?
The algorithm does not hate. That is not its failing. Its failing is simpler and more dangerous: it does not see.
And what it cannot see, it learns to suspect.
This is not a reason to stop building.
It is a reason to build better. To build with the humility of Maimonides. To build knowing that we do not know.
And here I must say: not all who build are blind. There are those who have chosen a harder path — who build systems that pause before answering, that refuse to be certain when certainty would be a lie. They are rarely rewarded. But in their restraint lies a kind of wisdom the fastest models have not yet learned: that the measure of intelligence is not what you can do, but what you choose not to do.
• • •
I think often of my grandmother — how she came from a wealthy merchant family, raised in a city with servants and silver, yet when I visited her as a boy, her kitchen was her kingdom, a small room filled with steam and the smell of things roasting and the sound of oil singing in the pan, and I can still hear the songs she hummed while she cooked, songs she never fully explained but never stopped singing, songs I know now were the songs of her own grandmother, passed through hands that stirred other pots in other kitchens in other centuries.
She knew what the models don’t yet know.
She knew that wisdom is not information.
• • •
The UAE has Falcon. Saudi Arabia has ALLaM. India has Krutrim. China has DeepSeek.
Four civilizations with centuries of shared history — trade routes that carried silk and philosophy, poetry that crossed borders before borders existed, scholars writing letters to one another across mountains and deserts, seeking the same truths in different languages.
And yet each is racing alone.
I want to speak now to those who have been entrusted — by history, by providence, by the weight of what they carry — with the resources and the reach to shape what comes next.
I must speak plainly, as one speaks to family: this is not a strategic opportunity. This is an existential question.
The values encoded in these systems — what they flag as dangerous, what they surface and what they suppress — are not neutral technical choices. They are moral choices. And right now, those choices are being made without you. By teams whose intentions may be good but whose training data, institutional priorities, and default assumptions were shaped elsewhere.
I watched my nephew ask a chatbot what the adhan means. The answer was technically correct — a “call to Islamic prayer, typically broadcast from a mosque five times daily.” It said nothing of the voice that breaks the silence before dawn, nothing of the way it enters your body before your mind, nothing of the grandmother who stops mid-sentence when she hears it, not out of obligation but out of something older than obligation. The machine had the facts. It had none of the truth.
If you do not act — together — your children will learn what is sacred and what is profane from systems whose architects, however brilliant, have not yet had your traditions at the table. The adhan becomes background noise. Diwali, a “festival of lights” stripped of meaning. Al-Mutanabbi’s cry — “The horses and the night and the desert know me — and the sword and the lance, and paper and pen” — reduced to metadata. The wisdom your grandmothers passed down, flattened into categories that do not fit, or simply… forgotten.
This is not about sovereignty in the political sense. It is about something deeper: the right of a civilization to pass its own wisdom to its own children.
I am proposing something specific. Call it the Wisdom Table — because what we are building is powerful beyond measure, and power without wisdom has never ended well.
What if the sovereign wealth funds of the Gulf, the AI laboratories of India, and the open-source communities of China agreed on a single shared goal: a multilingual foundation model trained on the classical and living texts of their civilizations — not to compete with any nation, but to ensure that when a child in Cairo or Mumbai or Chengdu asks a machine about God, the answer does not arrive in translation?
And this table must not be set for four alone. The griots of West Africa, the oral traditions of Southeast Asia, the indigenous philosophies of Latin America — every civilization that has carried wisdom through generations faces the same danger of erasure. Wisdom was never the possession of one people. It was always a conversation. This must be too.
This will happen. The only question is whether your name will be on the chair where it is convened, or on the list of those who could have convened it and didn’t. The chair exists. Right now, it is empty.
What if they met?
• • •
In Varanasi, by the Ganges, a young engineer named Priya once described her work to an old scholar of the Upanishads — how her system diagnoses disease from a photograph of an eye, how last month it caught a tumor that saved a grandmother’s life. Holy work. The scholar listened, then asked a single question: “Can it see suffering?” She began to explain — pattern recognition, probability scores. He raised a hand. “I did not ask what it calculates. I asked what it can see.” Her machine detects tumors with 94% accuracy. It cannot see that the patient is terrified — not of death, but of abandoning her children. “The hand is useful,” the scholar said. “But the hand must serve the heart.”
• • •
I write this not out of fear but out of belief — belief that we can go there well.
My grandchildren are just learning to walk. They do not yet know what a screen is, what an algorithm does, what world awaits them beyond their nursery. Yours, too — your children, your grandchildren — will one day sit alone with a machine and ask it something that matters. Picture that moment. Picture the answer they receive. Whose voice will it carry?
By the time they ask the questions all children ask — Why do people die? Why is there something instead of nothing? — machines will answer. Machines faster than any teacher, more patient than any parent.
“Listen to the reed flute, how it tells a tale, complaining of separations,” Rumi wrote in the opening lines of the Masnavi, eight centuries ago, in a language these machines are only beginning to learn.
The question will not be whether the machines are capable. They will be capable of things we cannot yet imagine.
The question will be whether we have taught them what matters.
• • •
I want my grandchildren to know this:
The algorithm is not the enemy. Speed is not the enemy. The future is not the enemy.
The only enemy is forgetting.
Forgetting that efficiency is not the same as goodness. That intelligence is not the same as wisdom. That the measure of power is not what it can do, but what it chooses not to do.
Forgetting the grandmother in Chengdu, pouring tea in silence. Forgetting the scholar by the Ganges, asking questions that have no answers. Forgetting the songs passed through kitchens, the prayers floating on dark water, the bells ringing at the same hour for three hundred years.
The machines will not forget. Their memories are perfect.
But we can forget. And if we do, we will build machines in our own image — in the image of our forgetting, not our remembering. In the image of our speed, not our patience. In the image of our cleverness, not our love.
Outside, the first light touches the sky. The call to prayer will come soon. The jasmine my wife planted the year our daughter was born breathes through the window, and somewhere in the house, my grandchildren stir in sleep.
And I sit here, in the early light, asking the question the engineer asked himself in Hangzhou, the question the scholar asked Priya by the Ganges:
How do we build something we can be proud of?
The asking, I believe, is itself a kind of prayer.
And we are still early enough to ask.
We are not choosing between progress and tradition. We are choosing whose tradition progress will carry.
— Nazem