The Productivity Trap
AI promised to give us our time back. Instead, it's teaching us to fill every second of it.
Boris Cherny shipped twenty-two pull requests in a single day (in software, a pull request is a bundled set of code changes submitted for review before being merged into the product). The next day, twenty-seven. Every one of them written entirely by AI. No manual edits. No human keystrokes touching the code. Cherny is the creator of Claude Code at Anthropic, and he hasn’t manually written a line of code in months.
In a previous role at Meta, Cherny was responsible for code quality across every product: Facebook, Instagram, WhatsApp. His team worked on improving engineering productivity, and “seeing a gain of something like 2% in productivity, that was like a year of work by hundreds of people.” Since Claude Code launched, productivity per engineer at Anthropic has grown a hundred and fifty percent, measured by pull requests and cross-checked against commits and commit lifetimes. A hundred and fifty percent. “This is just unheard of,” he said. “Just completely unheard of” (Y Combinator, 2026).
Tasks that used to take five or ten hours now take ten or thirty minutes. The numbers are staggering, and they are real.
And they are only half the story.
Because buried beneath the productivity euphoria is a finding that should make anyone paying attention deeply uncomfortable. Researchers Aruna Ranganathan and Xingqi Maggie Ye at UC Berkeley’s Haas School of Business spent eight months embedded inside a two-hundred-person technology company, watching what actually happens when knowledge workers adopt generative AI. Their conclusion, published in the Harvard Business Review in February 2026, is blunt: AI doesn’t reduce work. It intensifies it (Ranganathan & Ye, HBR, 2026).
The productivity gains are genuine. Nobody disputes that. What Ranganathan and Ye discovered is what happens after the gains. And what happens after is where the trouble starts.
The Three-Headed Problem
The Berkeley researchers identified three mechanisms by which AI quietly intensifies work, none of which require a manager to demand longer hours or assign more tasks.
First, task expansion. When AI collapses the knowledge gap between disciplines, people start absorbing work that used to belong to someone else. Product managers begin writing code. Researchers take on engineering tasks. The boundaries between roles blur because the friction that once kept them in place has disappeared. It feels empowering at first. You’re doing more, learning more, contributing more. But “more” has no natural ceiling.
Second, blurred boundaries. AI’s conversational interface makes work feel frictionless. You can “just try something” during lunch, while waiting for your kids at practice, before bed. There’s no boot-up time, no activation energy. One of the engineers in the study put it plainly: “You had thought that maybe... you save some time, you can work less. But then really, you don’t work less. You just work the same amount or even more” (Ranganathan & Ye, HBR, 2026). The natural stopping points that used to structure a workday dissolve when the tool is always available, always responsive, always ready for one more question.
Third, increased multitasking. Workers run manual tasks alongside AI-generated alternatives. They manage multiple agents simultaneously. The cognitive load doesn’t decrease; it shifts. You’re no longer doing the tedious work yourself, but you’re managing five streams of output, validating results, context-switching constantly, and maintaining the illusion that you’re in control of all of it.
The synthesis from the researchers is devastating in its simplicity: “AI makes it easier to do more, but harder to stop.”
The Dopamine Engine
I need to confess something here, because I am not writing this as an outside observer. I build AI tools. I use them every day. And I have felt the pull that Ranganathan and Ye are describing with the precision of ethnographers.
There is a dopamine hit that comes with AI-accelerated productivity. It is intoxicating. A task that would have taken me an afternoon is done in twenty minutes, and the rush of completion doesn’t diminish; it compounds. Because now I can do another task. And another. The feedback loop is immediate and relentless: start, finish, start, finish, start again. The sense of accomplishment is real, but so is the inability to stop chasing it.
I have to remind myself to pause. To rest. To close the laptop and walk away. And the fact that I have to remind myself is exactly the point. The tool doesn’t tell you to stop. Your manager won’t tell you to stop, because your output just tripled and that looks fantastic on a quarterly review. The market won’t tell you to stop, because the competitor who doesn’t stop will eat your lunch. Nobody in the system has an incentive to pump the brakes.
This is how burnout works in the AI age. It doesn’t arrive with a dramatic collapse. It arrives with a sustained hum of high performance that slowly erodes your capacity to do anything else.
The Ratchet Effect
Here is what makes this particularly insidious. Once AI raises the productivity baseline, it doesn’t come back down. The new output level becomes the expectation. I know people in the software industry right now who are being told, directly, by their managers: if you are not using AI, you are going to be in a difficult position within your role. Not because the manager is cruel, but because the math is inescapable. If one engineer is producing at five times the output of another, the one who isn’t using AI looks unproductive by comparison.
It has become almost conventional wisdom at this point: AI won’t replace you, but someone using AI will. Karim Lakhani at Harvard Business School coined the clearest version of the phrase in 2023 (Lakhani, HBR, 2023), and it has been repeated by everyone from Fortune 500 executives to middle managers on LinkedIn because it is obviously, uncomfortably true.
But notice what this logic produces. It’s a ratchet. Every gain locks in. Every hour saved gets refilled with new commitments. Every efficiency becomes a new floor, not a new ceiling. And the human being at the center of this machine is expected to keep pace with a tool that doesn’t get tired, doesn’t need sleep, and doesn’t experience the existential weight of wondering whether any of this matters.
I spoke to a group of businessmen several weeks ago about AI, and afterward it wasn’t the business owners who surrounded me with questions. It was their sons. Nineteen, twenty, twenty-one years old. And every single one of them asked the same question: Will I have a job in five years?
That question carries more weight than a career concern. It’s an identity question. It’s a purpose question. And the anxiety underneath it isn’t irrational; it’s the logical result of watching a system that values humans primarily for their productive output begin to automate that output away.
Work Before the Fall
This is where the theological frame becomes essential, because the secular productivity conversation has no answer for the question those young men were really asking.
The Bible presents work as pre-fall. This matters enormously. God gave Adam work to do before sin entered the picture: “Be fruitful and multiply; fill the earth and subdue it” (Genesis 1:28). Then, more specifically: “The Lord God took the man and put him in the garden of Eden to tend and keep it” (Genesis 2:15). Work is not a curse. Work is part of what it means to be an image-bearer of God. We create, cultivate, build, and steward because we are made in the image of a God who creates, cultivates, builds, and sustains. Work has intrinsic value, not merely instrumental value. It is participation in God’s creative activity.
But post-fall, something changed. Work became susceptible to bondage. “In the sweat of your face you shall eat bread” (Genesis 3:19). The curse didn’t introduce work; it introduced the distortion of work. The compulsion. The anxiety. The inability to stop. The tying of identity to output. The belief that you are what you produce.
Sound familiar?
The AI productivity trap is not a new problem. It is the oldest problem in Scripture wearing new clothes. The fall turned work into potential bondage, and every generation since has found new mechanisms for that bondage to express itself. The industrial revolution had its version. The knowledge economy had its version. And now AI has its version: a tool so powerful that it makes the bondage feel like freedom, because you’re producing more than ever, and the dopamine keeps flowing, and the metrics keep climbing, and you don’t notice you’re drowning until you’re already underwater.
The Sabbath as Subversion
The Sabbath commandment is not a productivity hack. I want to be clear about that, because there is a cottage industry of productivity gurus who have repackaged rest as “strategic recovery” or “performance optimization.” That framing misses the point entirely. The Sabbath is not about performing better on Monday. The Sabbath is a declaration that you are not defined by your output.
“Remember the Sabbath day, to keep it holy. Six days you shall labor and do all your work, but the seventh day is the Sabbath of the Lord your God. In it you shall do no work” (Exodus 20:8-10).
The Sabbath is subversive because it interrupts the ratchet. It says: this far and no further. Not because there isn’t more work to do (there is always more work to do), but because your identity as an image-bearer is not contingent on your capacity to produce. You rest not because you’ve earned it, but because God rested, and you are made in His image, and rest is woven into the fabric of what it means to be human.
The secular productivity conversation cannot get here. It can diagnose the problem beautifully. Ranganathan and Ye’s research is excellent, and their proposed solutions (intentional pauses, sequencing, human grounding) are wise. But they are techniques. They are management strategies. They do not answer the question underneath the question: Why should I stop when I could keep going?
The Sabbath answers that question. You stop because you are not a machine. You stop because your worth is not measured in pull requests or quarterly output or the number of tasks you completed before lunch. You stop because God stopped, and in doing so He established a rhythm that says: production is good, but it is not ultimate. You are more than what you make.
The Church Staff Conversation
I have been having this exact conversation with my own church staff. I told them recently: identify every repetitive task you do that touches a computer. Every one. Write down the problem and the intended goal. Because if there is a computer in the loop, there is a potential for AI and automation to handle it. We can reclaim ten, twenty, thirty hours a week from tasks that a machine can do better and faster.
But here is the critical piece I added: the goal is not to fill those hours with more work. The goal is to free people to do the things that only humans can do. Pastoral care. Creative thinking. Being present with another person. The things that require a soul, not a processor.
This is the distinction the AI productivity conversation keeps missing. The question is not whether AI makes us more productive. Of course it does. The question is what we do with the margin it creates. Do we fill it with more output, chasing the dopamine of completion and the approval of metrics? Or do we protect it as space for the irreducibly human activities that no algorithm can replicate?
The Real Question
The hundred-and-fifty-percent productivity gain Cherny describes is impressive. It is also, in a sense, irrelevant to the deeper question. Because the deeper question is not “How much can we produce?” but “What are we for?” And that is a question no AI tool will ever answer, because it is an anthropological question, not a technological one.
If you believe, as the naturalist does, that human beings are sophisticated biological machines whose value is determined by cognitive output, then the productivity ratchet makes perfect sense. Optimize, accelerate, produce. If you can’t keep up, step aside for someone (or something) that can.
But if you believe, as Scripture teaches, that human beings are embodied souls made in the image of God, with divinely given consciousness and purpose that transcends economic utility, then the productivity trap is not just a management problem. It is a spiritual crisis. It is the ancient temptation to find your identity in your work rather than in the God who gave you the work to do.
The young men who asked me whether they’ll have jobs in five years don’t need a better career strategy. They need a better anthropology. They need to know that their worth is not contingent on their usefulness. That rest is not laziness; it is obedience. That the Sabbath is not an interruption of the important work; it is a reminder of what is actually important.
AI promised to give us our time back. The data says it’s doing the opposite. And the only framework I know that can hold both the genuine good of productive work and the genuine danger of productive bondage is the one that begins with a God who worked for six days, rested on the seventh, and called it holy.
Sources
Aruna Ranganathan and Xingqi Maggie Ye, Harvard Business Review, February 2026.
Karim Lakhani, Harvard Business Review, 2023.
Boris Cherny, interview with Y Combinator, 2026.
This article was developed using AI writing tools I built to work with my voice, research, and editorial framework. The ideas, arguments, and theological positions are mine. The pipeline that helps me draft, evaluate, and refine them is something I created as part of my work at Nomion AI. I believe in building with AI and being honest about it. If you want to know more about that process, ask me.