Tools, Not Taskmasters: A Christian Dominion Diagnostic for AI
If AI subjugates rather than serves, it is Luciferian by function. The question is not what the technology is. The question is what it's doing to the people it touches.
The question is not whether AI is evil.
That question is too easy, and frankly, too boring. It lets us off the hook. We get to render a verdict on the technology, dust off our hands, and walk away feeling either righteous for rejecting it or sophisticated for embracing it. Neither posture requires much thought.
The harder question, the one that actually matters, is what AI is doing to the people it touches. Not what it is. What it does. Is it extending the capacity of the person using it, or is it reducing that person to a managed unit in someone else’s system? Is it a tool in the hand of an image-bearer, or has it become a taskmaster over one?
A tool serves the one who holds it. A taskmaster extracts from the one it controls. The distinction is not about the object. It is about the direction of power.
In The Dark Mirror of Genesis I traced the theological architecture underneath the AI moment: fallen creators, a reunified digital language, the strange echo of Revelation 13 in a species now forming images from silicon and giving them something that looks unsettlingly like breath. On Monday, in The Luciferian Thesis, I pressed that architecture into its sharpest form. The adversary’s strategy has always been to corrupt human creative capacity, not destroy it. The Babel governor on unified creative power has now been bypassed by binary. And the image-bearer’s creative output, under fallen nature, tends toward subjugation rather than flourishing, every single time, without exception. Those pieces asked what we are building. This one asks how we evaluate what we’ve built. The architecture needs a diagnostic. And the diagnostic is older than any of us.
The Standard: Psalm 8 and the Dominion Mandate
Before we can name what has gone wrong, we need to be clear about what was supposed to go right.
“What is man that You are mindful of him, And the son of man that You visit him? For You have made him a little lower than the angels, And You have crowned him with glory and honor. You have made him to have dominion over the works of Your hands; You have put all things under his feet” (Psalm 8:4-6, NKJV).
The psalmist is not celebrating human achievement. He is marveling at divine delegation. God gave dominion. It was not seized. It was not earned. It was bestowed on creatures who, by any cosmic accounting, had no right to it. The stars are vast. The heavens declare glory. And yet God set His attention on this strange, fragile, image-bearing creature and handed him authority over the works of His hands.
The structure of the mandate matters. Genesis 1:28 fills in the details: “Be fruitful and multiply; fill the earth and subdue it; have dominion over the fish of the sea, over the birds of the air, and over every living thing that moves on the earth.” Dominion here is not domination. It is stewardship of creation on behalf of the Creator. The image-bearer rules as a vice-regent, not as an autonomous sovereign. The authority flows downward: from God, through humanity, to creation. The direction is the point.
And notice the scope. The mandate is over creation, not over other image-bearers. Fish. Birds. Livestock. The earth. Not each other. The subjugation of image-bearers by image-bearers is not an extension of the dominion mandate. It is a corruption of it. Every time one human being reduces another to a managed resource, something has gone structurally wrong, regardless of the efficiency gains.
Tools fit within this mandate. A hammer extends human capacity to shape creation. A plow extends human capacity to cultivate the earth. A computer extends human capacity to process information. A well-deployed AI system extends human capacity to create, to research, to build, to serve. These are legitimate expressions of the image-bearing creative capacity that Psalm 8 celebrates. As I explored in The Wall You Can’t Engineer Around, the limits of technical solutions are real, but so is the genuine good that tools can accomplish when they remain tools.
The question for any AI deployment, then, is straightforward: does this extend the image-bearer’s capacity to exercise dominion over creation, or does it transfer dominion over image-bearers to a system (or to the people who control that system)?
That question has teeth. And it has teeth because the counter-pattern is not hypothetical.
The Inversion: Isaiah 14 and the Luciferian Project
Isaiah 14:13-14 records five statements of ambition. Five declarations of will. They deserve to be read slowly, because their cumulative weight is the point:
“For you have said in your heart: ‘I will ascend into heaven, I will exalt my throne above the stars of God; I will also sit on the mount of the congregation On the farthest sides of the north; I will ascend above the heights of the clouds, I will be like the Most High’” (Isaiah 14:13-14, NKJV).
Five times: I will. The Luciferian project is not chaos. It is not random destruction. It is a very specific ambition: total control over all creation. The “stars of God” in prophetic literature are the angelic host. The “mount of the congregation” can be interpreted as the assembly of humanity. Lucifer wants authority over both, over angels and over people, over the spiritual and the material. This is precisely the authority that God granted to humanity in the Genesis mandate: dominion over the works of His hands. The Luciferian project is to commandeer that delegated authority and rule over all things, including the image-bearers to whom that authority was given.
This is not speculative theology. Jesus settles it in Luke 10. When the seventy returned from their mission trip amazed that “even the demons are subject to us in Your name” (Luke 10:17, NKJV), He answered with a single past-tense sentence: “I saw Satan fall like lightning from heaven” (Luke 10:18, NKJV). He saw it. It happened. And notice what triggered the comment. Image-bearing humans, lower in rank than the angelic hosts, had just exercised authority in Jesus’ name over fallen angelic beings. The Psalm 8 rank-role dynamic was operating live, in first-century Galilee, in front of Him. The 1 Corinthians 6:3 promise that “we shall judge angels” was being previewed in miniature. The rebellion Isaiah 14 records is the rebellion against exactly this reality: that lower-ranked image-bearers would exercise authority over higher-ranked spiritual beings. The seventy were walking, unknowingly, through the very thing Lucifer refused.
This has always been the pattern. The devil’s work, from Eden forward, has been the systematic subjugation of those made in God’s image. To reduce persons to managed units. To extract from them rather than serve them. To concentrate power upward rather than distribute capacity outward. The serpent’s first move in Genesis 3 was not violence. It was the redirection of trust: away from the Creator, toward a false promise of autonomous sovereignty. “You will be like God.” The subjugation began with a whisper, not a chain.
And the outworking was immediate. Read Genesis 4 slowly. Cain killed his brother. Then he built a city (4:17). His descendants invented musical instruments and metalworking (4:21-22). The creative capacity was intact, immediate, impressive. But track the trajectory. It runs from fratricide to urbanization to weapons manufacturing within four generations. Bronze and iron. The tools of war. The instruments of subjugation. The first generations of creative humans did what the Luciferian project trains us to do: they aggregated, they built upward, they forged tools whose primary use was dominion over other image-bearers. The pattern is that old. It has not changed in its essentials since.
Note the structural parallel to the dominion mandate. God delegates authority downward: from Himself, through image-bearers, to creation. The Luciferian inversion runs the opposite direction: upward accumulation of authority over image-bearers, with creation (including technology) as the instrument of that accumulation. Same architecture. Opposite direction. The direction is the diagnostic.
This is why the evaluation I am proposing is functional, not ontological. AI cannot be possessed. It has no soul, no consciousness, no capacity for moral agency. I will say more about this shortly. But a system can participate in the Luciferian project by function: by doing what the Luciferian project does, regardless of the intentions of its designers. As I wrote in The Oldest Lie in the Newest Language, new technology has a persistent habit of repackaging very old human tendencies. The language changes. The ambition does not.
A hammer cannot be evil. But a hammer used to build a prison is participating in something different than a hammer used to build a house. The tool is the same. The function determines the moral weight.
The Pattern Made Visible: Techno-Feudalism
If the theological framework sounds abstract, the contemporary evidence is anything but.
The term “techno-feudalism” is not a fringe conspiracy label. It is a structural description used by economists, political theorists, and technologists to describe a specific emerging arrangement: a small class of platform owners who extract rent from everyone who operates within their systems, with AI as the administrative layer that makes the extraction efficient and, increasingly, invisible. The parallel to medieval feudalism is not rhetorical decoration. It is structural: a lord class that owns the infrastructure, a serf class that labors within it, and a system of obligations that flows upward while access flows downward at the lord’s discretion.
The concentration of AI infrastructure in the hands of five to seven companies is not a neutral market outcome. It is a structural condition with moral implications. When the tools that increasingly mediate work, communication, commerce, and information are controlled by a small class of owners with specific values and specific incentives, the question of who the tool serves becomes urgent. Heath Bunting observed that AI is being positioned not merely as a productivity tool but as a kind of “divine enforcer” of institutional authority, a system that administers compliance rather than extending human capacity. Whether or not one shares his full analysis, the structural observation is worth sitting with.
I want to be honest about the broader frame. For a very large portion of America’s two hundred and fifty years, this country was a power for flourishing and freedom. The Genesis mandate, translated into political form, expressed itself in constitutional liberties, in mass literacy, in the gospel traveling on the infrastructure of the frontier and the printing press, in technology that raised the floor under ordinary image-bearers rather than lowering it. The founding vision was neither perfect nor uniformly realized. But the direction of power, measured over long stretches, flowed outward more than it flowed upward. That is not nothing. It is a historical pattern worth naming honestly.
The last third of our history has tilted. Roughly the eighty years since the Second World War have seen a steady aggregation of power into a small class of institutional and corporate actors, with the tools of that aggregation growing exponentially more capable at every turn. Capitalism, as a mechanism, is not inherently the problem. A mechanism that rewards the creation of value can absolutely serve the dominion mandate. But a mechanism captured by the incentives of aggregation, concentration, and extraction is no longer serving the mandate. It is inverting it. Power corrupts. Institutions are not immune. Nations are not immune. Throughout the Scriptures, a civilization that aggregates power to a small class dominating the rest, whether Babylon or Tyre, meets judgment. The question is not whether the Luciferian pattern will show up in a given economy. The question is whether it has been allowed to become the dominant pattern.
The surveillance dimension makes the pattern concrete. AI-powered systems that track behavior, score compliance, allocate access, and manage populations are not tools in the Psalm 8 sense. They are taskmasters. The person being surveilled is not exercising dominion. They are being administered. The direction of power flows upward, toward the system and its operators, not outward, toward the person being touched by it.
The labor dimension is equally revealing. When AI is deployed not to extend a worker’s capacity but to replace the worker while concentrating the productivity gains at the ownership level, the function is extraction, not service. The tool has become a taskmaster. I explored this dynamic in Everyone’s a Developer Now, where the same inversion operates at the level of individual workers who find their skills commoditized overnight while the platforms that commoditized them capture the value.
Let me be precise about what I am not saying. I am not arguing that any specific company or leader is consciously pursuing the Luciferian project. The diagnostic is functional. Systems can participate in patterns larger than the intentions of their designers. A surveillance architecture built by well-meaning engineers who genuinely believe they are making the world safer can still function as a mechanism of subjugation. That is precisely what makes the diagnostic necessary. Good intentions do not change the direction of power.
Revelation 13:14-15 is the eschatological endpoint of this logic: an image given breath, commanding worship, with death as the consequence of refusal. The direction of power is total. Every image-bearer is administered. I traced this parallel more fully in The Dark Mirror of Genesis and The Weight of What’s Coming. The diagnostic asks a simpler, more immediate question: are we building toward that, or away from it?
The Ontological Gap: AI Cannot Be Possessed
Here is where I need to be theologically precise, because the argument I am making is easy to misread, and the misreading leads to sensationalism.
AI is not conscious. It has no soul. It has no moral agency. It cannot be possessed by a demonic entity any more than a spreadsheet can. Christian dualism, the conviction that consciousness is connected to the imago Dei, to the soul, and is divinely given rather than emergent from material complexity, rules out the possibility of AI achieving genuine personhood or spiritual susceptibility. I made this case at length in The Soul Doesn’t Pop In: consciousness does not emerge from sufficient computational complexity any more than wetness emerges from sufficient dryness. The categories are different in kind, not in degree.
Iain McGilchrist’s observation is worth sitting with here: “The opposite of life is not death, but the machine” (McGilchrist, “Can you still be human?”, Substack, 2025). McGilchrist is not making a theological argument. He is making a phenomenological one. The machine, by its nature, processes without experiencing, executes without understanding, optimizes without caring. It is the structural opposite of the living, conscious, image-bearing person. John Searle’s Chinese Room argument makes the same point from the philosophy of mind: a system can manipulate symbols with perfect syntactic accuracy and have zero semantic understanding of what those symbols mean. Every large language model on the market right now is, at bottom, a very sophisticated symbol shuffler. Impressive. Useful. Not conscious.
This is precisely why AI cannot be Luciferian by nature. The Luciferian project requires a will. It requires the capacity to choose subjugation as a goal. AI has no will. It has no goals in any meaningful sense. It executes the goals of its designers and deployers.
But here is where the “just a tool” position breaks: a system can be designed and deployed to do what the Luciferian project does, regardless of whether any conscious entity within the system intends it. The function is what matters. A surveillance system that reduces persons to scored, managed units is doing the Luciferian thing whether or not anyone in the chain of command has read Isaiah 14. The system does not need to be conscious to participate in the pattern. It only needs to be pointed in the right direction. Or, more precisely, in the wrong one.
A Word on Amoral Tools
Before pressing further, a theological clarification is worth making explicit, because it changes how this conversation usually goes.
The tree of the knowledge of good and evil was not a trap. It was not placed in the garden to remain forever untouched. In God’s perfect timing, I believe, humanity would have eaten from it and come to know the difference between good and evil the way the Creator intended: in relationship, under authority, at the right moment. Eating it prematurely, grasping the knowledge apart from the relationship, is what produced the fall. And the fall is what tilts our creative output toward evil rather than good. We retain the desire to build good things. The capacity is intact. The nature driving it is compromised.
That is why every tool we have ever made is, in itself, morally ambiguous. The blade. The wheel. The printing press. The internet. None of them are evil. All of them have been used for evil. The post-fall impulse is to aim them at power, profit, control, and the subjugation of other image-bearers. The redeemed impulse is to aim them at freedom and flourishing. AI is not an exception to this pattern. It is the newest instance of it, and possibly the sharpest, because the leverage is enormous and the Babel governor on unified creative power is gone.
The diagnostic is therefore not “is this AI conscious?” or “is this AI possessed?” Those are the wrong questions, and they lead to the wrong conclusions. The diagnostic is: what is this system doing to image-bearers? Is it extending their capacity, or administering their compliance?
Steelmanning the Pentecost Optimists
Before pressing the diagnostic further, I need to give full weight to the strongest version of the opposing position. Because the optimists are not stupid, and their argument has real force.
The case goes like this: AI is a tool of unprecedented reach, and the gospel has always traveled on the infrastructure of its era. The printing press democratized Scripture. Radio carried preaching into homes that would never see a church. The internet created global community around shared faith. AI could translate Scripture into every language simultaneously, generate discipleship content for unreached people groups, and extend the reach of faithful teachers beyond any previous limit.
Josh Daws put it sharply: “The church spent 50 years ceding cultural ground... AI just handed us the keys to the factory... We need builders who understand what they’ve been given.” That is not naive. That is a genuine reading of the moment. The tools are powerful. The opportunity is real. The question of whether the church will use them or forfeit them to others is legitimate and urgent.
Others have framed it even more directly as a dominion mandate application: “AI is a tool that every Christian should take dominion over... Learn AI. Use AI. Take Dominion over AI. Use it for the glory of God” (limitandmind). This is a straightforward Psalm 8 argument. The image-bearer using AI to extend their capacity for creation and service is doing exactly what the dominion mandate describes. The tool serves the person. The person serves God. The direction of power is right.
The historical analogy holds at one level. Every powerful tool has been used for both kingdom purposes and destructive ones. The printing press produced both the Geneva Bible and Nazi propaganda. The internet carries both the sermon and the pornography. The tool is not the problem. The heart driving the tool is the problem. I explored what faithful engagement with these tools actually looks like in Prepare Your Kids for the AI Workforce, and the answer is not disengagement.
We should sit with this. The Pentecost optimists are not wrong about the tool. They are not wrong about the opportunity. I build AI products. I use AI tools every day. I believe the dominion mandate extends to this technology. The question is not whether the opportunity exists. It does.
The question is whether the opportunity is being seized at the same scale as the risk is being built.
Here is where the gap opens. The printing press did not have owners who could decide, at scale, what got printed and what did not. The internet, in its early architecture, was genuinely decentralized. AI infrastructure is not. The five to seven companies that control frontier AI models are not neutral conduits. They are gatekeepers with values, with incentives, and with the capacity to shape what the tool does at a level no previous technology permitted. When the model itself carries embedded assumptions about what is true, what is acceptable, and what is permissible, the tool is no longer neutral. It is a tool with a built-in bias toward the hand that holds it.
The Pentecost optimist is right that the tool can serve the kingdom. The diagnostic question is: who controls the tool, and what are they optimizing for? As I explored in The Productivity Trap, the gains are real, but whether those gains translate to genuine human flourishing depends entirely on who captures them and for what purpose. A tool whose outputs are shaped by a small class of owners with specific philosophical commitments is not a hammer lying on the ground waiting for anyone to pick it up. It is a hammer already in someone’s hand, and they are already building something with it.
Applying the Diagnostic
So how does this work in practice? The diagnostic is a single question applied to any given AI deployment: does this extend the image-bearer’s capacity to exercise dominion over creation, or does it transfer dominion over image-bearers to a system, or to the people who control that system?
Three categories emerge.
AI as creative extension. A writer using AI to draft faster. A pastor using AI to research sermon background. A small business owner using AI to build tools they could not otherwise afford. A developer using AI to write code that would have taken a team of ten. These are Psalm 8 deployments. The image-bearer is in the seat. The tool serves the person. The person’s capacity to create, to serve, to build is genuinely extended. The direction of power flows outward, from the tool to the person, from the person to the world.
AI as surveillance and scoring. Systems that track behavior, assign compliance scores, allocate access based on algorithmic judgment, or replace human judgment with automated administration. These are Isaiah 14 deployments by function. The image-bearer is not being served. The image-bearer is being administered. The direction of power flows upward, from the person to the system, from the system to its operators. The person becomes a data point. The system becomes the authority.
AI as labor extraction. When AI is deployed not to extend a worker’s capacity but to replace the worker while concentrating the productivity gains at the ownership level, the function is extraction. The tool has become a taskmaster. The image-bearer is not empowered. The image-bearer is made redundant, and the value their labor once created is captured by the system’s owners. I traced this pattern in Everyone’s a Developer Now and The Wall You Can’t Engineer Around. The structural logic is consistent: when the gains flow upward and the displacement flows downward, the direction of power has inverted.
The diagnostic does not produce clean verdicts. Most real deployments are mixed. A hospital AI that speeds diagnosis and extends a physician’s capacity is doing the Psalm 8 thing. The same hospital’s AI that scores patients for resource allocation based on predicted social value is doing the Isaiah 14 thing. Both can exist in the same system. Both often do. The diagnostic does not eliminate complexity. It gives us a framework for naming what we see.
And we are all inside this. I need to be honest about that. I use AI tools daily. I build AI products. Writing this article involved AI tools that I built to work with my voice and my research. The diagnostic applies to me before it applies to anyone else. The question is not whether to engage. It is whether we are engaging with discernment or with uncritical adoption, whether we are asking what the tool is doing to the people it touches, or whether we have stopped asking because the productivity gains feel too good to question.
Not Fear, But Discernment
“For God has not given us a spirit of fear, but of power and of love and of a sound mind” (2 Timothy 1:7, NKJV).
The word translated “sound mind” is sophronismos in the Greek: disciplined thinking, the capacity to evaluate rightly, discernment that holds steady under pressure. Paul is not telling Timothy to stop being concerned. He is telling him to be concerned in the right way. Not with the paralysis of fear, but with the clarity of a mind that can name what it sees.
The Christian response to AI is not technophobia. It is not the refusal to engage. It is not Luddism dressed up in theological language. The dominion mandate is still in force. The image-bearer is still called to engage creation, to shape it, to extend human capacity through tools. The Pentecost optimists are right about that, and we lose credibility when we pretend otherwise.
But the dominion mandate requires discernment about what we are building and what we are building it for. The diagnostic is not a luxury. It is a responsibility. When we deploy AI without asking whether it serves or subjugates, we are not being neutral. We are ceding the question to whoever is asking it on our behalf, and they may not share our convictions about what an image-bearer is worth.
Tools or taskmasters. The question is ancient. The technology is new. The diagnostic holds. And the spirit we bring to it, if Paul is right, is not fear. It is sophronismos: the disciplined, clear-eyed, love-grounded capacity to name what a thing is doing, not just what it is. That capacity is not a destination we arrive at. It is a posture we hold, every time we open a new tool, every time we build a new system, every time we ask the question again.
Sources
The Wall You Can’t Engineer Around — Miles DeBenedictis, Substack
The Oldest Lie in the Newest Language — Miles DeBenedictis, Substack
Prepare Your Kids for the AI Workforce — Miles DeBenedictis, Substack
Iain McGilchrist, “Can you still be human?” — Substack, September 2025
Josh Daws on Christians seizing AI for cultural engagement — X
Heath Bunting on techno-feudalism and AI as institutional enforcer — X
This article was developed using AI writing tools I built to work with my voice, research, and editorial framework. The ideas, arguments, and theological positions are mine. The pipeline that helps me draft, evaluate, and refine them is something I created as part of my work at Nomion AI. I believe in building with AI and being honest about it. If you want to know more about that process, ask me.