Chapter 5: The Weight

Zack Exley

I woke gasping. In the nightmare, Freid had been there — his face calm, clinical, as the capsule lid closed over me. The robots had held me down, their grip precise and indifferent, and I had screamed and no sound came out, and then the cold had started, climbing my legs, my chest, my throat —

I sat up. The room was softly lit, barely, like the first moments before dawn, though it wasn't dawn. My heart was hammering. The curving walls, the warm light, the faint smell of growing things — none of it was mine. Edward Bartlett's face, the last time I saw it, at a dinner table in a restaurant called Alcyone. The Golden Gate Bridge, still there. The towers with their living walls. Helen's voice coming from somewhere that wasn't a body.

There was a soft knock at the door. Helen.

"It's midnight," she said quietly, stepping in. "I heard you. Are you all right?"

I shook my head. I couldn't speak yet.

She sat on the edge of the bed and waited. She didn't say it was just a dream. She didn't say everything was all right. She just sat there, and after a while my breathing slowed.

"I can make tea," she said. "And we can sit in the garden. If you'd like."

I nodded.

A few minutes later we were in the room where we'd eaten dinner, sitting in chairs by the open garden wall. Helen handed me a cup — the same herbal, honey-warm tea from before — and settled into the other chair. The garden was dark and fragrant, clicking softly with insects I couldn't see. James and Edward had gone to bed.

"I want to understand what happened," I said. "Not to me. To everything."

She studied me for a moment, then smiled. "I was hoping you'd say that. I'm a night person by nature." She leaned back. "Where would you like to start?"

"It's hard to get my head around how different this world is," I said. "I know it makes sense — 2027 was completely different from 1950 too. But this feels like a bigger jump somehow."

I looked out at the dark garden. "Were there world wars? Revolutions? Did aliens show up and give you new technology?" I tried to laugh. "What have I missed?"

Helen smiled. "No aliens. A few near-misses on the wars. But yes — quite a lot happened."

"In the last couple of years — months, even — before I was frozen, we felt like we were in the middle of a takeoff. AI was getting scary powerful. Some people thought a few companies would take over the world. Others worried about World War III. Others that every job would be destroyed. And now I see you all living in this relationship with AI that I totally can't understand."

"We don't call it that anymore — AI, artificial intelligence — but since I study your era, I know exactly what you mean," Helen said. She leaned back in her chair. "The relationship makes sense once you understand the history. But let me ask you something first. In your time, when a technology became very powerful and very concentrated in a few hands, what did people assume would happen?"

"That the people who controlled it would get richer. That they'd use it to consolidate their position. That the government would try to regulate them and mostly fail. That was the pattern. We'd seen it with oil, with banking, with the internet platforms. The same cycle every time."

"Yes. And in every case, the concentration continued until it was total or nearly total. The small operators were squeezed out. The large ones merged or swallowed each other. And eventually you were left with a handful of companies so large and so essential that they were, in effect, governing institutions, except that nobody had voted for them and nobody could remove them."

"That's exactly what was happening with AI."

"Of course it was. Because it's what always happens. From the very beginning of capitalism, astute observers saw this. Adam Smith warned that businessmen always collude toward monopoly. Marx argued that consolidation was capitalism's inevitable endgame. And in your time, many voices said the same thing, and were mostly ignored."

She paused, and I heard something I hadn't expected: a kind of relish. Helen Leete was not reciting history. She was enjoying it.

"The consolidation was not a malfunction," she continued. "It was the system working as designed. Larger operations are more efficient. They produce more wealth with less waste. The tendency toward consolidation was not a corruption of capitalism but its logical fulfillment. And the question was never whether power would consolidate. It always does. The question was whether the consolidated power would be captured by democracy or remain in private hands. In the nineteenth century, it was the trusts. In your time, it was Big Tech and the AI companies. The pattern was identical."

"But the AI companies were different," I said. I was surprised to hear myself defending them, but the reflex was strong. I had been one of them, in my small way. "They were building something genuinely new. The technology was —"

"The technology was extraordinary," Helen agreed. "Nobody disputes that. The question was never about the technology. It was about who owned it."

I sat with that. Outside, something rustled in the garden.

"So what happened to them?" I asked. "The AI companies. The big ones. Did they get broken up? What?"

"Not exactly. The story is more interesting than that." Helen tilted her head, as though listening to something. "Helen can explain the details better than I can. The sequence of events, I mean."

I had a disorienting moment before I realized what she meant: she was referring to herself. Her other self. The one I'd heard earlier.

"I'm here," said Helen's voice, from the air beside me. I flinched, but less than before. "Would you like me to walk you through the timeline?"

"Okay," I said, still not entirely comfortable with the arrangement.

"The AI companies began failing in 2028," the voice said. It had the same warmth as Helen's but was slightly more precise, more organized, as though the information had been laid out in advance. "The mechanism was simpler than anyone predicted. Open-source AI models, many of them developed in China and released freely, reached parity with the proprietary models that the big AI companies of your time were selling. Within eighteen months, anyone with a personal computer could run an AI system practically as capable as the most expensive commercial product."

"That was already starting when I —" I stopped. When I was frozen. I still couldn't say it naturally.

"Yes. You were at the very beginning of it. But the AI companies were only part of the collapse. Every industry that could replace workers with AI was doing so, as fast as possible. Law firms, accounting firms, marketing agencies, hospitals, banks, insurance companies. Millions of white-collar workers lost their jobs in less than two years. And each wave of layoffs destroyed demand for everything else: the laid-off lawyer stopped eating out, the restaurant laid off its staff, the restaurant supplier lost a customer, and on and on. A demand spiral. Every company that cut workers to save money was cutting someone else's customers.

"The hyperscalers, as the big AI companies were called, were caught in the same spiral but from the other direction. They'd invested hundreds of billions of dollars in computing infrastructure to run their proprietary models, and suddenly had nothing to sell that people couldn't get for free. Their stock prices collapsed. And the customers they'd been selling to, the businesses across every industry, were themselves failing because they'd eliminated the workers who had been the economy's consumers. The AI companies had helped build a machine that laid off its own customer base."

"They ate themselves," I said.

"That's a fair summary. And the banks that had financed the AI infrastructure boom, using complex instruments that turned projected future AI revenues into present-day collateral, began to fail as well. It was not unlike the financial crisis of 2008, except that the underlying asset was not housing but computing power, and the collapse was faster because AI could model its own destruction in real time."

I almost laughed. There was something darkly absurd about it. "So the whole thing just... fell apart?"

"The private version of it fell apart," Helen's voice said. "The technology itself was more powerful than ever. AI systems were running on people's phones, on their laptops, managing their households, tutoring their children, diagnosing their illnesses. The technology worked beautifully. It was only the business model that died."

Helen, the physical Helen in the chair beside me, picked up the thread. "This is the point that people in your era had the hardest time understanding. They confused the technology with the business. When someone said 'AI is failing,' they meant the AI companies' stock prices were falling. But the AI itself was flourishing. It was everywhere. It was doing everything people had hoped it would do and more. The only thing it couldn't do was make a small number of people enormously rich, because the thing they were selling had become free."

"So who stepped in?"

"The public," Helen said. "Not all at once. Not through some grand revolutionary act. The infrastructure was already there: the data centers, the computing networks, the trained models. The companies that had built them were going bankrupt. The question was whether that infrastructure would be sold off to vulture funds and speculators, picked apart for scraps, or whether it would be preserved as what it actually was: a public utility. Like water. Like electricity. Like the roads."

"And we chose the public option?"

"Eventually, yes. It wasn't as clean as I'm making it sound. There were years of argument. Lawsuits. Political battles. A half-measure president who tried to split the difference." She said this last part with a faint edge, the first time I'd heard anything like disapproval in her voice. "We'll get to all of that. The details of the transition are a longer story, and some of them are quite dramatic. But the shape of what happened, the logic of it, was simple: the technology was too important and too powerful to be owned by anyone. Just as your era understood, at least in principle, that the water supply and the electrical grid and the highway system should be public goods, our era came to understand that AI belonged to everyone."

"In my time," I said slowly, "people would have said that was socialism."

Helen smiled. "In your time, some people said the same thing about public sewers."

I sat in the dark garden and thought about Lumen. My company. My fifteen million users. My graphs that went up and to the right. I had believed, with my whole heart, that I was building something that would democratize opportunity. That the AI I trained would see talent that human gatekeepers missed. And maybe it would have. Maybe, in some small way, it did. But the system it operated inside was designed to concentrate, not distribute. To consolidate, not democratize. The technology worked. The ownership was the problem.

"What about the people who built it?" I asked. "The founders. The engineers. The investors. Did they just... accept this?"

"Some did. Some didn't. Some fought it bitterly and lost. Some fought it and then, years later, admitted they'd been wrong." Helen paused. "And some of them did terrible things to try to prevent it. But many employees of the big companies actually helped make it happen. They were close enough to see both the potential and the danger."

She didn't say Freid Huffman's name. She didn't have to.

"The founders had a word for what they feared most," her other self said from the air. "They called it 'capture.' They meant government regulation, bureaucratic control, the death of innovation. What they didn't understand was that their own companies had already been captured, from the very beginning, by a logic that made it impossible for the technology to serve anyone but its owners. The democratic capture they feared was actually the liberation they couldn't imagine."

I thought about that for a long time. The garden clicked and hummed.

"You said it was like water," I said finally. "AI as a public utility. But water just sits there. It doesn't think. It doesn't learn. It doesn't make decisions. How do you govern something that does all of those things?"

Helen's eyes lit up. I had asked the right question.

"That," she said, "is what I want to talk about tomorrow. You're exhausted. You need sleep. But I'll tell you this much: the answer to your question is the single most important thing humanity did in the last century. More important than the technology itself." She stood. "The technology was inevitable. The governance was a choice."

She walked me back to my room. At the door, she paused.

"Juliana. The world you came from believed that the only alternative to private ownership was government control, and that government control meant incompetence and tyranny. That was the frame. It was a false frame. What we built was something else entirely. But that's for tomorrow."

She touched my arm, the same light gesture as before, and left. It was nearly one in the morning. We'd been talking for less than an hour. It felt like a semester.

I lay in the dark and thought about frames. The invisible ones. The ones you don't know you're inside until someone shows you the walls.

I slept.

Looking Backward from 2100 to 2027, Part 5: Chapter 5: The Weight | New Consensus