It started as a routine part of sermon preparation.
The pastor needed background on a specific Greek term in Paul’s letters. The AI response came back quickly — detailed, structured, referencing multiple scholarly perspectives. It read like a reliable commentary entry.
It was wrong in a way that mattered. Not a minor nuance or a debatable interpretation. A substantive theological error, stated with complete confidence, that shaped the direction of the sermon.
Nobody caught it. Not before Sunday. Not after.
Why This Is More Common Than Pastors Realize
Pastors repeatedly identified misinformation and doctrinal accuracy as major risks of AI use. AI-generated content, while fluent and persuasive, does not inherently distinguish among orthodox theology, contested interpretations, and outright error (Christian Post).
The conversations about AI in ministry tend to focus on ethics and authenticity: Is it honest to use AI? Will sermons sound generic? What do congregations deserve to know? These are legitimate questions.
But they distract from a more immediate practical risk: AI sermon preparation tools produce theological errors regularly, confidently, and without any indication that an error has occurred.
Understanding Why AI Gets Theology Wrong
The mechanism behind this is called hallucination, and understanding it changes how you use these tools.
AI doesn’t retrieve stored facts. It generates text by predicting what words should follow other words, based on patterns in its training data. When it encounters a question about something outside its reliable training data — a specific Hebrew or Greek term, an obscure patristic source, a nuanced point of textual criticism — it doesn’t pause or flag uncertainty.
It generates the most plausible-sounding continuation of the text. And plausible-sounding is not the same as accurate.
The result looks identical to accurate output. Same tone, same confidence, same level of detail. A pastor without deep expertise in the relevant area has no reliable way to tell them apart.
“Buy truth, and do not sell it.” — Proverbs 23:23
Truth has always required effort to obtain and verify. AI removes that effort from the process — but it doesn’t replace what the effort was producing.
Why Biblical and Theological Content Is Especially Vulnerable
“AI can hallucinate facts — always verify quotes, stats, and stories before using them” (Josh Hunt).
This is sound general advice, but the stakes are higher in theology than in most domains.
Biblical languages, church history, patristic sources, and textual scholarship are exactly the areas where AI hallucination is hardest to detect. The training data in these fields is specialized, ancient, and often contested. The errors AI produces in these domains sound credible to anyone who doesn’t already know them to be wrong.
The pastor who turns to AI precisely because they lack deep expertise in a given area is the most exposed. They receive confidently stated theological error. They lack the background to identify it. It reaches the congregation.
A Working Protocol for Pastors
AI tools can become a helpful assistant in sermon work — if used wisely, for tasks like research summarization, outline generation, and illustration brainstorming (ChurchTechToday).
The following boundaries reduce risk significantly:
Primary sources, always. Word studies, historical claims, and citations need verification against actual lexicons, commentaries, and primary texts. AI-generated citations are frequently invented.
Skepticism proportional to specificity. The more detailed and specific an AI response, the more important it is to verify. Hallucinated content tends to be specific and confident.
AI for structure, scholars for substance. Use AI to organize and clarify. Use established scholarship for the exegetical and historical content that carries theological weight.
Protect the study. The work of sitting with the text, wrestling with the original languages, and engaging serious commentaries is not an inefficiency to be automated. It is where the preacher and the passage meet.
Reference: “The Pastor Didn’t Notice. AI Sermon Gave Hallucinated Theology,” Crossmap, April 15, 2026.
