Why a $7.2 Billion Industry Prioritizes Human Intent over Autonomous Agents
The digital economy is currently witnessing a stabilization of technical roles. While investment boosters often lean on hyperbolic names for machine learning technology (mislabeling it "Artificial Intelligence" or even "Autonomous Agents" in pursuit of MLM-style cash grabs), the market is finding its footing not by thinking in terms of intelligence or autonomy, but in terms of ALLMs (Applied Large Language Models). In this framework, these technologies are not treated as independent, agency-capable entities, but as sophisticated text-forecasting engines that, combined with human intent, produce high-utility outcomes.
A primary example of this technological settling is found in the industrialized intimacy market. In her investigation, Exposing OnlyFans: The Men Behind A $7.2 BILLION Lie, researcher Lo (Log Off With Lo) [1] reveals that the platform's $7.2 billion revenue in 2025 is not driven by autonomous bots, but by teams of human "chatters": the actual hands behind the keyboard for 100% of high-earning creators.
1. High-Precision Efficiency in the Sales Pipeline
In the Cheetah Protocol, we identified that biological and mechanical efficiency relies on avoiding "System Waste." Just as a cheetah only commits metabolic resources when the probability of catching prey is high, the OnlyFans Management (OFM) ecosystem focuses on the "correct outcome": the extraction of high-value revenue from "whales" (subscribers spending between $45,000 and $60,000 in a single month).
The "product" here is not content, but the feeling of mattering to someone. To scale this, agencies have built a high-precision Sales Pipeline where every subscriber is treated as a "Warm Lead". Human labor is fundamentally more efficient for this task because of the extreme level of intimacy and continuity required to sustain the illusion.
2. Humans as Intent Engines vs. Bot Automation
Automation is currently not conducive to intimacy-driven decision making. An autonomous digital agent carries a high risk of generating hallucinations or generic responses that break the parasocial illusion. In this industry, that specific type of technical failure doesn't just result in a bad interaction; it risks terminating the sales funnel immediately, often losing the lead permanently.
Humans are instead utilized as Intent Engines. They provide the low-entropy, high-precision observation required to maintain the relationship.
- Strategic Regulation: Humans act as the regulators of the technology, ensuring every token generated by the forecasting engine serves a specific strategic goal.
- Global Continuity: Unlike a single creator, human teams work in 24-hour shifts to provide around-the-clock coverage for any timezone.
- Architectural Memory: Agencies prioritize software like Infloww to track "emotional triggers" and personal details like names, locations, and jobs. A human chatter uses this data to maintain a consistent narrative across shifts.
3. The Ontological Boundary of the Usage Horizon
The Usage Horizon describes the ultimate use of the ultimate version of a technology, as it could be seen, witnessed, or detected from the current point of view. It is an ontological boundary.
Consider the usage horizon of a household hammer. Its ultimate use is driving nails; its ultimate version is a fixture providing perfect balance and torque to amplify human strength. While we can calculate and design new materials to edge closer to mathematical perfection, the standard hammer remains unchanged because it has reached a point where it is "close enough" to that usage horizon. Because resources are limited, developing a tool past a certain point makes no economic sense. You could theoretically engineer a $10,000 hammer, but it would not be a thousand times better at driving nails than a $10 hammer. Even a $100 hammer relies more on brand recognition and marketing than on any real 10x multiplier in utility.
This boundary is equally visible in more complex machinery, like the automobile. While development continues between daily commuter cars and Formula 1 racers, the core utility reached its horizon decades ago. The modern attempt to push cars past this horizon into fully autonomous "self-driving" entities perfectly illustrates the asymptote: billions of dollars in capital have yielded systems that still largely fail without human oversight and continue to pose financial, health, and transportation risks when left on their own. Digitally assisted driving, used by the transportation industry for decades, has worked far better for a long time. The resource expenditure required to achieve those final percentage points of true autonomy, which serve show rather than substance, breaks the physical and economic constraints of the system.
Digital technologies are subject to this boundary as well. The Usage Horizon of large language models is bound by the fact that supposed "Artificial Intelligence" or a "Singularity" is physically, materially, and chemically unattainable given Earth's current and foreseeable hardware and energy constraints.
Let's consider the thermodynamic reality of the Landauer principle [4]: the minimum energy required to erase one bit of information. As we attempt to scale toward "autonomous sentience," we collide with the thermal and electrical ceilings of current silicon architecture; to reach "singularity" would require a power draw and cooling infrastructure that exceeds the sustainable output of current localized energy grids.
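The Landauer floor itself is easy to compute. A minimal sketch (temperature of 300 K is an assumed room-temperature value; the constant is the exact SI Boltzmann constant):

```python
import math

# Landauer principle: minimum energy to erase one bit is E = k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # assumed operating temperature, kelvin

e_bit = k_B * T * math.log(2)   # joules per erased bit, ~2.87e-21 J

print(f"Landauer limit at {T:.0f} K: {e_bit:.3e} J per bit")
```

The point of the calculation is the gap: real silicon dissipates many orders of magnitude more energy per bit operation than this theoretical floor, so the ceiling we hit first is not physics at the limit but the thermal and electrical budget of actual hardware and grids.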
Additionally, the material scarcity of high-purity neon for lithography, of rare-earth elements, and of copper and electrical steel for high-density power and storage hardware creates a physical floor that mandates efficiency over infinite expansion. This physical reality underlies the widespread data center cancellations currently being reported, with nearly 50% of U.S. projects planned for 2026 facing delays due to grid capacity failures and a shortage of custom-built power transformers that take 3–5 years to manufacture [5].
Just as a cargo ship cannot infinitely increase speed due to the cubic relationship between velocity and fuel consumption, all digital systems operate within a material asymptote where the next increment of "intelligence" costs more in raw minerals and gigawatts than it provides in marginal utility.
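The ship analogy can be made concrete. A minimal sketch of the standard cubic speed-power relationship for displacement hulls (P ∝ v³, so fuel per unit distance grows with v²); the baseline speed and function name are illustrative, not from the source:

```python
def relative_power(v: float, v0: float = 1.0) -> float:
    """Power required at speed v relative to baseline v0, assuming P scales as v cubed."""
    return (v / v0) ** 3

# Doubling speed demands roughly 8x the engine power...
power_ratio = relative_power(2.0)          # 8.0
# ...to move cargo only 2x as fast: ~4x the fuel per mile delivered.
fuel_per_mile_ratio = power_ratio / 2.0    # 4.0

print(power_ratio, fuel_per_mile_ratio)
```

The same shape of curve is the claim being made about compute: each marginal increment of capability sits further up a steep cost exponent, while the utility delivered grows roughly linearly at best.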
The ultimate version of this technology is not a sentient autonomous agent; it is human-driven, ALLM-assisted research and execution. The concept of the "Singularity" is not a functional technological goal, but a marketing ploy, a carrot held in front of investors. We are burdened with the existence of these models, pushed heavily by shareholders attempting to embed them everywhere, but the physical constraints dictate that autonomous intelligence will not happen. That's why we suggest thinking in terms of ALLMs rather than in terms of creating intelligence.
4. Co-Evolution as an Asymptote
I'm not highlighting the OnlyFans management ecosystem as a cause of the "AI" plateau, but rather as an example of one of its consequences.
They use humans equipped with digital tools because the technology achieved its Usage Horizon long ago. The billion-dollar machinery simply operates at its best capacity, where text-forecasting optimally scales human intent. That's what we mean by ALLM. Applied Large Language Models.
We are witnessing a stable system operating at the edge of this horizon:
- The Operator / Mission Controller: a human agent providing the empathy, strategic management, grounded intent, and ultimate agency.
- The ALLM: a functioning platform that enables and enhances the agency of the company via the operator/mission controller (e.g., generating AI voice clones via tools like ElevenLabs to provide synthetic proof of authenticity).
Operating at the Horizon
OnlyFans is a $7.2 billion testament to the fact that the most sophisticated use for ALLM technology is the enhancement of human-directed tasks.
The Usage Horizon marks the point where technological gains yield diminishing returns insufficient to justify the escalating planetary toll: trillions in rare-earth extraction, grid overloads (nearly 50% of U.S. data centers planned for 2026 delayed), and gigawatt-scale energy draws colliding with finite resources like neon and copper. For humans on a resource-bound Earth, pursuing singularity-level autonomy beyond this asymptote erodes ecosystems for marginal utility, akin to forging $10,000 hammers that drive nails no better than $10 ones; it is unattainable not because of impossibility alone, but because the thermodynamic, material, and economic costs render it irrational and unsustainable.
Pushing tech past its practical ceiling burns through finite minerals, power grids, and land for scraps of improvement, just like perfecting a hammer beyond basic function. Data center delays and transformer shortages already signal the wall; chasing godlike machine minds just strips the planet faster for no human gain worth the cost.
The "singularity" sells because vague promises pump stock prices and capex, much like religious pledges seal deals without proof. It's a bubble riding current efficiencies until resource math kills returns.
Business runs on short-term wins, not endless growth. When energy and materials cap utility, smart operators pivot to human-tool hybrids, not mirage hunts that bankrupt ecosystems.
We haven't failed to reach "artificial intelligence" or the "singularity"; nor should we keep foolishly pursuing them, because they are mirages that will never materialize.
Rather, we can simply recognize the Usage Horizon to which text-forecasting is factually bound. By viewing these tools as co-evolutionary partners restricted by material reality rather than as autonomous god-like entities, we move past investor marketing speculation and snake-oil salesmanship toward a model of high-utility, verified execution.
Sources
[1] https://www.youtube.com/watch?v=G3VI_0XqwDY [Exposing OnlyFans: The Men Behind A $7.2 BILLION Lie, researcher Lo (Log Off With Lo)]
[4] https://en.wikipedia.org/wiki/Landauer%27s_principle [Landauer Principle]
[5] https://techradar.com/pro/if-one-piece-of-your-supply-chain-is-delayed-then-your-whole-project-cant-deliver-nearly-half-of-us-data-centers-planned-for-2026-canceled-or-delayed-and-things-could-soon-get-much-worse ['If one piece of your supply chain is delayed, then your whole project can't deliver': Nearly half of US data centers planned for 2026 canceled or delayed — and things could soon get much worse]