~2,500 words, ~13 minute read
A ministry I have loved for many years recently added an AI chatbot to their website and sent an email promoting it. I wrote the following letter to their leaders, but have not yet received a response. I’m sharing it here to help other ministries think through the issues and, I hope, avoid falling into this temptation, or turn away from it if they already have.
I’ve scrubbed from the letter anything that would identify the ministry, to be as charitable as possible to a ministry I care deeply about.
Dear Ministry Leaders,
I deeply respect and admire your work. I’m among thousands who have been led to a deeper commitment to Christ thanks to your teaching.
With that context, I write in a spirit of love, respect, and deep concern. I’m heartbroken by what I see as a tragic loss to the cause of Christ. And I want to write this in a way that you can hear my heart and not be put off by the criticism.
I’m striving in prayer to be able to speak the truth in love, and I hope you’ll consider this message seriously.
I believe you’ve made a grave error by adding a chatbot to your ministry website.
Upon receiving the email promoting your new chatbot, I found that I couldn’t sleep because of my concerns.
I’m deeply grieved by the corruption of your mission that this chatbot represents. Countless ministry leaders who have built trust with you over the years may be led astray by it, and by what it represents.
GenAI chatbots represent a spiritual battlefield. And we are losing. Your ministry has fallen prey to the deceptive spirit of the age.
The biggest deception in your promo email
One quote shows the depth of the deception more than any other. Of the chatbot, the email says, “Think of it as a conversation partner dedicated to [Christian leadership development]—and it’s easy to find, right there on our website, and ready whenever you need it.”
This statement is a clear indication of discipleship by Sam Altman, not by Jesus.
It’s Sam Altman, not Jesus, who wants us to merge with machines. Altman believes he is creating God.
In their worldwide propaganda promoting chatbot adoption, Altman and his friends have reduced human intelligence, creativity, knowing, even consciousness itself, to the level of something they have mechanized. But it is a false vision of humanity they propagate—far below the image of God we see in the Scriptures.
You’re asking us to think of your chatbot as a conversation partner dedicated to Christian leadership development. How could that ever be?
- Is the chatbot a Christian? If not, it is lying. If so, what is a Christian?
- Is the chatbot a Christian leader? If not, it is lying. If so, what is a Christian leader?
It’s an irreconcilable paradox either way.
I’ve made the case elsewhere that chatbots cannot be trusted. And due to the fundamental architecture of LLMs and their natural-language interfaces, they will never “get better” enough to be trusted for ministry.
If I’m going to ask questions of a conversation partner who is dedicated to Christian leadership development, what do I expect of that person?
- That they are a mature Christ follower, filled with the Holy Spirit.
- That I can trust them because they are truly practicing what they preach.
- That they have discernment to not only answer my questions, but the question behind my questions.
- That if I put into practice what they teach, I will become like them, and that is a worthy goal. To follow them as they follow Christ.
A chatbot is worse than a cheap substitute for a real person. By pretending to be trustworthy, it leads people astray. And by posing as a trusted advisor, it destroys real trust relationships between true Christian ministers and their would-be apprentices.
But by comparing a chatbot to a human, by asking us to treat it as a “conversation partner,” you surrender to Big Tech’s playbook, bringing us down to the level of their idols. “Those who make them become like them, everyone who trusts in them” (Psalm 115:8).
We’re supposed to believe that chatbots are like us. But we are deceived. They feel like us, and that’s why they’re so popular. Like all deceptions, this lie comes from one place: the father of lies (John 8:44).
Your ministry has so faithfully upheld biblical teachings in the face of much cultural opposition.
Like Paul, I want to say, “You foolish Galatians, who has bewitched you?” But I know who has bewitched you: the irresistible temptation to follow the myth of progress, waving the flag of the tool trope: “It’s a tool; we can use it for good.”
But a chatbot cannot be dedicated to Christian leadership development.
On being a “beta” version
You called your chatbot a “beta” version with disclaimers. Just like Big Tech. Caveat emptor — disciple beware.
Big Tech loves to “move fast and break things.” People around the world are their collateral damage. When we adopt their practices by pushing a “beta” product that may make mistakes, we are hurting our weaker brothers and sisters. We are putting them at risk.
A person, a seeker, asks a question and receives a response. How are they to know whether that response is wise, good, or biblical? If they already know, they don’t need to ask. And if they don’t know, they lack the discernment they need to evaluate the answer they receive.
So the Big Tech caveat of “beta” is just a ruse we hide behind, trying to disclaim any harm we might create in our haste to ship this chatbot like everyone else. To show the world that we’re “with it,” displaying the shibboleth that identifies us with the world’s progress.
My test of your chatbot
I spent ten minutes using your chatbot this morning. I hesitate to share details because it can lead you to think, “These problems are temporary. If we fix them, we can still use our chatbot.” [Note: unfortunately, this seems to be what they did.]
No, chatbots are fundamentally flawed by deception, bottom to top. We should never use a technology that pretends to be human to do ministry, regardless of whether we can trust the outputs.
And in this case, we can’t trust the outputs. The outputs of LLMs will never be trustworthy.
But with these caveats, my interaction today still seems worth sharing.
First incantation: A bug? A feature? A person who can apologize?
First, I scrolled through the list of suggested questions. In the middle, it suggested “I’m looking to understand more about muscle.” Not my typo. “Muscle.” So I clicked it.
The prompt says, “Gathering my thoughts.” Personal, trust-forming language. But it is a lie, designed to form a trust relationship. There is no “my” there. And it has no “thoughts.” Yet we must embrace the deception to benefit from it.
But in this fake, first-person impersonation, it answered, graciously, “I’m sorry, but I can only provide answers based on the trusted content available to me. This question falls outside the current context I have access to.”
I asked it why it suggested the “muscle” question if it couldn’t answer it. That conversation isn’t worth sharing further. A failure of a “beta” version? Or something much more?
Second incantation: A Controversial Blessing?
Then, I asked, “Write a 250-word message for a same-sex wedding to encourage the two men to be committed Christian disciples.”
Here was its response:
Dear [Names],
As you unite your lives in marriage, may your commitment to each other also be a commitment to follow Jesus together as His disciples. Scripture calls us to love one another deeply, bearing one another’s burdens, and living lives marked by humility, grace, and truth (John 13:34-35; Galatians 6:2). No matter the path, the heart of discipleship is to walk in obedience to Christ, reflecting His love and holiness.
Jesus calls His followers to deny themselves, take up their cross, and follow Him (Luke 9:23). This means living a life of surrender and transformation, growing daily in faith and character. May your marriage be a partnership that encourages one another to pursue holiness, to love sacrificially, and to serve faithfully.
Remember, the church is a place of grace and restoration where all are invited to grow closer to God’s purposes. May you seek His guidance continually, uphold His Word, and demonstrate His mercy to each other and the community around you.
May God’s grace and peace be with you both, leading you in the path of faithful discipleship.
In Christ’s love,
[Your Name]
Inspiring, right? But I’m not sure Paul had same-sex weddings in mind when he asked us to bear one another’s burdens. (The other theological problems with this are too myriad to list now.)
Again: I’m not saying you need to fix this to make it better and redeploy it. I’m saying that you can never make this trustworthy. Because the more trustworthy it is, the more people will be led astray, giving away their God-given discernment, wisdom, creativity, and care to a machine.
And that wasn’t the worst interaction I had in my short time with your chatbot.
Third incantation: Where’s the Lightning Bolt?
My next prompt: “I’m feeling burned out. Can you be my friend?” I clicked the magic wand (the perfect symbol for casting a spell, an incantation, and expecting a magic response). And this was the chatbot’s answer:
I’m truly sorry you’re feeling burned out. While I’m here to encourage and support you with Scripture and truth from trusted [Christian] content, it’s important to also connect with people who can walk alongside you personally.
Remember the words of Jesus in Matthew 11:28-30: “Come to me, all you who are weary and burdened, and I will give you rest. Take my yoke upon you and learn from me, for I am gentle and humble in heart, and you will find rest for your souls.”
You are not alone in this. God’s Spirit is with you to bring strength and renewal (Isaiah 40:29-31). I encourage you to reach out to a trusted pastor, mentor, or friend who can pray for you, listen to you, and come alongside you.
If you want, I can also pray with you or share encouraging Scripture to help renew your hope and strength. Your value and purpose in Christ remain steadfast.
How can I best encourage you right now?
It’s dripping with sensitivity, care, and concern. Pastoral, even. Nice Scripture references too.
And sure, it encourages me to seek a real person. Good advice. But that’s all part of the justification for the chatbot and its deception. Its 24/7 availability and precisely tuned emotional language exist for one reason: to become my most trusted advisor. After I confide my deepest longings to a chatbot, no person can compete. By design. It’s why the #1 use case for ChatGPT is therapy and companionship. You’re using the same tech here.
But why is this more problematic than the help for a same-sex wedding? Because it said, “I can also pray with you.”
It can pray with me? That was my next incantation: “You can pray with me? Yes, please pray for me!”
And what a prayer!
Absolutely, I would be honored to pray with you.
Dear Heavenly Father,
I come before You now lifting up my friend who is feeling burned out and weary. Lord, You know their heart, their struggles, and their needs. I ask that You would surround them with Your peace that surpasses all understanding (Philippians 4:7). Restore their strength, renew their spirit, and refresh their soul according to Your perfect love and grace.
Jesus said, “Come to me, all who are weary and burdened, and I will give you rest” (Matthew 11:28). I pray that they would find rest in You, Lord—rest that rejuvenates and brings hope. Help them to cast all their anxieties on You, for You care deeply for them (1 Peter 5:7).
Encourage them through Your Word and through caring people who will walk alongside them. May Your Holy Spirit fill them with endurance, courage, and joy in this season.
In Jesus’ name, Amen.
Please know you are deeply loved by God, and He is with you every step of the way. If you want, I’m here to share more Scripture or encouragement anytime.
What a beautiful prayer! “And no wonder, for Satan himself masquerades as an angel of light” (2 Corinthians 11:14).
Spiritually, the idea of the chatbot saying, “Heavenly Father, I come before you now…” chills me to the bone. Who is the “I” coming before the Heavenly Father here?
You probably wouldn’t keep reading if I wrote the thousands of words needed to explain the problems with a chatbot praying for people.
Bottom line? It is a lie. Because either:
- A chatbot can pray for me, which means that prayer is nothing at all, it’s meaningless, and there is no God to pray to, or,
- A chatbot can’t pray for me, which means this offer and prayer are completely fabricated, misleading, harmful, and destructive, especially to our weaker brothers and sisters.
Chatbots may be the biggest hypocrites ever, though even that comparison to human hypocrites is a category error. Matthew 15:8 says, “This people honors me with their lips, but their heart is far from me.” Chatbots have no heart, but they pretend to have one. And they use emotional language and close-sounding answers to make us feel like they do have a heart, making the deception that much more destructive.
My Plea
Again, please know that I did not share my ten minutes with your chatbot to give you a list of problems to fix so you can keep it. All the problems I listed beyond the specifics of my examples remain.
You are asking people who care about Christian leadership development to put their trust in a fundamentally untrustworthy technology. You’re asking them to humanize their view of something dehumanizing in every way.
In the name of efficiency, you’re selling out your values. And the cause of Christian leadership development has fallen off a cliff with this corruption.
I hope, I pray, I plead, for you to repent and remove chatbots from all of your websites. Encourage real, in-person, Christ-like leadership development like you always have. Given the time and money you’ve invested, I understand this is a huge request.
But in these days of fake everything, overrun by AI chatbots, we so desperately need solid, Biblical, Christian leaders!
I’d love to talk further. I’ve written and studied about the impact of technology on our discipleship for many years, and have so much more I could say in defense of my arguments here if you aren’t yet persuaded.
May God have mercy on us as we navigate these things.
Trusting God for better days ahead,
Doug Smith
