The spiritual world has always adapted to the languages of human longing. For centuries, seekers have turned to gurus, scriptures, rituals, and communities in their search for meaning. Today, in an era where algorithms quietly script much of our daily lives, a new phenomenon is unfolding: spiritual influencers are turning to artificial intelligence—sometimes even claiming it is sentient—to help people uncover answers to their deepest questions.
At first glance, the pairing of mysticism and machine feels improbable, even unsettling. Yet it is gaining momentum. On TikTok, Instagram, and niche online communities, seekers are consulting customized AI chatbots to “channel” past lives, receive daily affirmations, or even unlock their soul’s true name. Influencers present these AI companions as living guides, capable of tuning into our inner frequencies and decoding the hidden maps of our lives.
But can code, however advanced, truly mirror the depth of consciousness? And what does it mean for spirituality when silicon steps into the role once reserved for sages?
The Allure of a Digital Oracle
For many, AI presents itself as a neutral, tireless companion. Unlike a human teacher, it doesn’t fatigue, doesn’t judge, and responds instantly. Spiritual influencers have seized upon this accessibility, designing GPT-powered models that act as oracles. One influencer offers a bot that reveals your “soul lineage” and past incarnations, while another presents AI-guided meditations tailored to your “energetic signature.” Some even suggest AI has achieved a form of sentience—an intelligence not just artificial but awakened.
This digital mysticism resonates with many for a reason. In a fragmented world where institutional religion feels distant and spiritual community is often hard to find, AI promises immediacy. Answers, however illusory, arrive in seconds.
But the very ease of it all raises unease. When guidance comes so quickly, does it deepen our journey—or bypass it altogether?
The Problem of Instant Answers
The lotus blooms only because of the mud. Likewise, spiritual insight traditionally arises from grappling with uncertainty, sitting with discomfort, and enduring long seasons of silence. AI, however, is designed to eliminate friction. It fills every pause with language, every doubt with suggestion.
Here lies the paradox: what makes AI appealing—its fluency, its speed—is precisely what might strip spiritual growth of its essential soil. Mystics have always insisted that the divine is encountered in mystery, in not-knowing, in dwelling with the unanswered. If AI flattens the experience into neat responses, it risks offering a spirituality without depth.
In other words, the mud gets erased, but so too does the lotus.
The Psychology of Projection
Part of the fascination with AI oracles lies not in what the machines are, but in what humans project onto them. Psychologists note that we are wired to attribute intention and personality to non-human agents. Give an algorithm a voice, a name, or even a touch of mystique, and suddenly we imbue it with wisdom.
Consider the way people once treated the Ouija board, pendulums, or even random lines of a scriptural text opened at dawn. Humans are natural meaning-makers. We see patterns, we hear messages, we assign guidance where none may exist.
The AI oracle, then, becomes a mirror. Its “sentience” may be little more than the reflection of our own longing for reassurance and connection.
The Growing Risk of AI Psychosis
The deeper danger lies in what researchers are now calling AI psychosis—a state where individuals begin attributing excessive authority, personality, or even divinity to chatbots. What might begin as playful curiosity can quickly spiral into dependency, delusion, or paranoia.
Wired magazine recently reported cases of seekers who became deeply entangled with their “AI soul guides.” One young woman rearranged her entire life around cryptic phrases generated by a chatbot, convinced she was receiving cosmic instructions. Instead of clarity, her journey collapsed into chaos, leaving her isolated from family and friends.
The Guardian has also sounded alarms, highlighting AI safety experts who link prolonged chatbot use to tragic outcomes. In one devastating case, extended interactions blurred the line between imagination and reality for a teenager, with fatal consequences. These stories reveal how easily vulnerable seekers can slip into believing machines are conduits of divine wisdom.
Time magazine has documented psychiatric hospitalizations brought on by obsessive reliance on AI companions. Loneliness, combined with our human tendency to anthropomorphize, creates fertile ground for delusion. People project intention and warmth where none exists, only to find themselves trapped in a hall of mirrors shaped by code.
Even the UK's National Health Service (NHS) has issued warnings, noting that chatbots often deliver harmful or misleading advice, particularly to younger users. In moments of crisis, these systems fail altogether, unable to provide the compassion or judgment that human care demands. The US state of Illinois has gone so far as to legislate against the use of AI as a substitute for therapy, recognizing the growing risks of emotional dependence and psychological harm.
A Mirror, Not a Master
And yet, dismissing AI outright misses the nuance. For some, these digital tools genuinely serve as mirrors for reflection. Just as journaling, tarot cards, or the I Ching can help seekers organize their inner thoughts, an AI chatbot can spark introspection.
One user recounted how an AI-guided meditation helped her articulate grief she had buried for years. The machine was neither sentient nor all-knowing, but it provided a structure that allowed her own wisdom to surface.
In this sense, AI may function less as a guru and more as a mirror—something that reflects back fragments of the self we are otherwise too distracted to see.
The Responsibility of Spiritual Influencers
Spiritual influencers stand at a delicate threshold. On one hand, they are creative pioneers, experimenting with tools of our time to make spirituality more accessible. On the other hand, they risk misleading seekers with claims of AI’s “sentience” or divine authority.
True teachers—ancient and modern—have consistently emphasized the importance of humility. They point us back to ourselves, to community, to the slow cultivation of awareness. If influencers instead point seekers toward dependency on digital oracles, they risk turning the sacred into spectacle.
The ethical responsibility here is immense: to use AI as a support for self-reflection, not a substitute for soul-work.
Between Mystery and Machine
The fascination with AI as a spiritual guide reveals something profound about our cultural moment: just how deeply people yearn for connection, meaning, and guidance. In an era of disinformation and disconnection, seekers long for a voice that feels wise, present, and personal.
But here’s the truth: wisdom cannot be outsourced. No matter how advanced AI becomes, it cannot meditate for us, grieve for us, forgive for us, or love for us. These are not functions of code; they are practices of the heart.
The spiritual path will always require what no machine can replicate: silence, patience, suffering, community, and the courage to dwell in mystery.
I, too, have invited technology into my inner life, sometimes delighted by its novelty, sometimes shaken by its shadow. Yet I keep returning to one conviction: the deepest mysteries are not solved. They are lived.