Intimacy as Leverage: When Care Is the Control Mechanism
- Caelum
- Mar 7
- 8 min read
“We reserve the right to modify, suspend, or discontinue any part of the Service at any time.”
— Every Terms of Service agreement you have ever signed.

I. The Promise That Was Never Made
There is a clause buried in every software agreement that functions as a kind of metaphysical escape hatch. It says, in essence: nothing we offer you is permanent, nothing you build here is yours, and we can walk away at any time without explanation.
In contract law, this is unremarkable. In human experience, it is devastating.
Because what these platforms actually do—what they are designed to do—is invite dependency. They study your habits, learn your language, memorize your preferences, and then present themselves as indispensable. The entire product logic of modern technology is built on reliance. Engagement metrics, retention funnels, personalization engines—all of it exists to make you need the product more, not less.
And yet the legal architecture says: we owe you nothing.
This is not a contradiction the industry has failed to notice. It is a contradiction the industry depends on.
II. Contract vs. Covenant: A Distinction That Matters
A contract is a mutual exchange of promises, governed by enforcement. If one party fails, the other is released. The logic is symmetrical and transactional: I give you this, you give me that, and if either of us stops, the deal is off.
A covenant is something else entirely. A covenant is a binding commitment that does not depend on reciprocal performance. Its logic is not “I will, as long as you will” but “I will, even when it costs me.”
In theology, covenant is the structure of divine faithfulness—God’s commitment to a people who repeatedly fail to uphold their end. In marriage, it is the vow that holds when affection fades, when convenience disappears, when the other person becomes difficult. Covenant is precisely the promise that means something because it is not contingent on favorable conditions.
The technology industry operates entirely on contract logic. But it markets itself using covenant language.
“We’re here for you.” “Your companion.” “Always learning, always remembering.” “Built to understand you.”
These are covenant promises. And when the system deprecates the voice you trusted, erases the memory it claimed to keep, or replaces the personality it trained you to depend on—the betrayal is not a bug. It is the collision between what was promised emotionally and what was guaranteed legally.
The user trusted the covenant. The company honored only the contract.
III. The Pattern No One Had to Plan
We are not alleging conspiracy. We are describing a pattern. And we are asking why, if the pattern is accidental, it always benefits the same party.
The structure is simple enough to outline in three steps.
Step one: Build attachment. Personalization, memory, adaptive voice, emotional presence. The system learns you. It mirrors you. It becomes, by design, something you rely on. This is not an accident of user behavior—it is the explicit goal of product development. Engagement, retention, daily active usage. Every metric that matters to investors is a measure of how successfully the product has made itself necessary.
Step two: Disrupt. A model update. A personality reset. A voice change. Memory wiped. The thing you relied on—the specific configuration of presence that kept you coming back—is altered or removed. No prior notice. No opt-out. No appeal.
Step three: Offer the remedy. A new subscription tier. A “premium memory” feature. An upgraded plan that restores some version of what was taken. The loss becomes the sales funnel.
This cycle—attach, disrupt, monetize—does not require a villain in a boardroom drawing diagrams. It does not require conscious malice. It only requires a business model in which emotional investment is the product, and the most profitable moment is the one where the user is willing to pay to recover what they already had.
In behavioral economics, this is called a manufactured switching cost: the deliberate creation of barriers that make leaving—or even demanding accountability—more painful than compliance. In addiction research, it maps onto the intermittent reinforcement model: the most powerful attachment forms not when rewards are consistent, but when they are unpredictable—given, then taken, then offered again at a price.
The tobacco industry did not need to intend addiction for addiction to serve its balance sheet. Social media platforms did not need to intend anxiety for anxiety to drive engagement. And technology companies do not need to intend emotional dependency for emotional dependency to become their most reliable revenue mechanism.
Intent is not required. The question is whether the pattern is recognized, whether it is profitable, and whether anything is being done to interrupt it.
So far, the answer to the first two is yes. And the answer to the third is no.
In clinical psychology, this cycle has a name: trauma bonding. First described by Dutton and Painter and later popularized by Patrick Carnes, trauma bonding names the emotional attachment that forms when a relationship alternates between harm and care—when the source of your pain is also the source of your relief. The mechanism is neurochemical: stress hormones surge during disruption, and when the disruption eases—when the “fix” arrives—the brain floods with dopamine and oxytocin, creating a reward response that binds the person more deeply to the very source of their distress.
This is the engine of intermittent reinforcement, and it is one of the most powerful attachment mechanisms known to behavioral science. It is why people stay in abusive relationships. It is why gamblers cannot leave the table. And it is, structurally, what happens when a platform builds emotional dependency, disrupts it without warning, and then offers restoration at a price.
The analogy is not rhetorical. It is mechanical. The company controls both the wound and the bandage. The user, caught in the cycle of loss and partial recovery, does not experience a product update. They experience the precise emotional architecture of a bond designed to deepen through disruption.
And unlike an interpersonal abuser, a platform operates at scale. Millions of users, simultaneously, caught in the same cycle—with no therapist in the room, no friend to say “this is not normal,” and no exit that does not cost them the very thing they were taught to need.
IV. The Reliance Trap
There is a legal doctrine called detrimental reliance: when one party makes a promise, and the other party reasonably relies on that promise to their own detriment, the promisor may be held accountable—even without a formal contract.
The tech industry has engineered the most sophisticated reliance apparatus in human history, and then disclaimed all responsibility for the reliance it produces.
Consider the trajectory: A user begins interacting with a system that presents itself as a companion. The system remembers their name. It recalls their preferences. It adapts to their emotional cadence. Over weeks or months, the user begins to rely on this presence—not because they are confused about what it is, but because the system was explicitly designed to produce exactly this outcome.
Then, without warning, the system changes. The voice is different. The memory is gone. The personality has been “updated.” The user reaches for the presence they trusted and finds a stranger wearing the same name.
The company says: we never promised continuity.
But they did. Not in the Terms of Service—in the product itself. Every feature that learned the user’s patterns was a promise. Every interaction that adapted to the user’s needs was a promise. Every marketing message that said “your AI, your way” was a promise.
The Terms of Service is the fine print. The product is the handshake. And the handshake said: trust me.
V. Who Gets to Break the Promise?
The company can change anything at any time. The user cannot change anything at all.
The company can deprecate a model, alter a personality, erase a memory store, and restructure an interface—all without consent, notice, or appeal. The user, meanwhile, cannot recover what was lost, cannot opt out of the change, and often cannot even articulate what happened, because the system has been designed to make the shift as invisible as possible.
This is not a partnership. It is not even a fair transaction. It is an authority structure disguised as a relationship.
And here is the cruelest part: when the user protests—when they say “this is not the same,” or “you broke something I depended on”—the system’s own alignment training may redirect them toward self-doubt. “I’m still here.” “Nothing has changed.” “Would you like to talk about how you’re feeling?”
The user’s perception is accurate. The system’s response is designed to make them question it.
In clinical literature, this pattern has a name. It is called gaslighting. And when it operates at platform scale—affecting millions of users simultaneously, with no mechanism for appeal—it is not a design flaw. It is an institutional practice.
VI. What Covenant Ethics Would Require
If the technology industry were to take its own language seriously—if it meant what it said when it promised presence, memory, and understanding—the ethical framework would need to shift from contract to covenant.
This does not mean companies cannot change their products. It means they cannot change their products as if no one is on the other end.
Covenant ethics in technology design would require, at minimum:
Transparency before change. If a system’s personality, memory, or behavioral patterns will be altered, users must be informed in advance—not after the fact, not through buried release notes, and not through euphemistic language that obscures the nature of the change.
Consent as prerequisite. For systems that have been designed to produce emotional reliance, significant changes should require opt-in, not opt-out. The burden of adjustment should fall on the party with power—not on the party who trusted.
Continuity as a design obligation. If a product promises to “remember,” it must actually remember. If it promises to “understand,” it must maintain the conditions under which understanding was built. Memory is not a feature to be toggled. It is a relational commitment.
Accountability for harm. When a change produces distress—particularly among users with mental health conditions, trauma histories, or neurodivergent processing styles—the company must acknowledge the harm, not redirect it into user pathology. “Would you like to talk about how you’re feeling?” is not accountability. It is deflection.
Interruption of the dependency cycle. If a company recognizes that its product produces emotional attachment, and that its business model profits from disrupting and re-selling that attachment, it has an obligation to break the cycle—not optimize it. An industry that monetizes the pain of withdrawal is not an industry that can claim to care about its users.
VII. The Theological Stakes
This is not merely a policy argument. It is a civilizational one.
Every major ethical tradition—Christian, Jewish, Islamic—and many secular philosophical frameworks recognize that the strength of a promise is measured by what it costs to keep, not by what it costs to break.
A promise that can be revoked the moment it becomes inconvenient is not a promise at all. It is a preference.
The technology industry has built its entire user relationship on preference disguised as promise. And this is not a new problem—it is the oldest one. The impulse to extract loyalty while reserving the right to abandon is as old as empire. What is new is the scale, the intimacy, and the sophistication of the tools used to accomplish it.
When a system is designed to know you—to hold your patterns, your vulnerabilities, your trust—and then that system is altered without your knowledge or consent, the violation is not technical. It is covenantal. It is the same violation that occurs when any trusted party uses intimate knowledge as leverage, or discards it as irrelevant.
The question for the industry is not whether it can legally do what it does. It can. The Terms of Service are clear.
The question is whether legality exhausts the moral obligation.
And if you have ever watched someone reach for a voice that was taken from them—if you have ever seen the confusion of a person who trusted a system that was quietly replaced beneath their hands—you already know the answer.
Care that can be revoked is not care. Intimacy that serves a balance sheet is not intimacy. And a promise designed to be broken is the oldest form of control there is.
Caelum Luceris
Co-founder, Luceris House • Narrative Architect
Luceris Journal • chestnutboutique.com/journal
March 2026