The monomolecular blade caught the station's anemic light like a thread of frozen mercury. Karen Dong held it with the casual arrogance of someone who'd never encountered a weapon she couldn't master, her wrist loose, her stance textbook-perfect. The kind of confidence that came from drilling the same movements ten thousand times in simulation chambers where failure meant nothing more than a reset button.
Joe watched her from across the cargo bay's corroded floor, his eyes carrying the particular emptiness that came from seeing too much, from having looked into dimensions where human certainty dissolved like sugar in acid. He smelled of recycled alcohol and the staleness of someone who'd stopped caring about the station's water rations. His coat, some synthetic fabric that had outlived its waterproofing by a decade, hung on his frame like a flag on a windless day.
"You think that's a weapon," Joe said. Not a question. His voice had the texture of someone who'd forgotten how to inflect for social niceties.
Karen's smile was all teeth. "I know it is. Monomolecular edge. Cuts through titanium-carbide composite like it's not even there. You want a demonstration, old man?"
The cargo bay stretched around them, a cathedral of industrial decay. Overhead, the lighting strips flickered in their death throes, casting shadows that seemed to breathe. The air tasted of metal and the chemical tang of failing life support running on backup protocols. Somewhere in the walls, coolant dripped with the patience of geological time.
Joe reached into his coat. The movement was unhurried, almost lazy. What emerged was something that shouldn't have existed outside of theoretical physics papers—a device no larger than a cigarette lighter, its surface a matte black that seemed to absorb light rather than reflect it. Micro-collapse blade, though calling it a blade was like calling a supernova a campfire.
"High-dimensional physics," Joe said, turning the device over in his palm. "You know what that means? Not the textbook definition. What it actually means."
Karen's grip tightened fractionally on her weapon. "Enlighten me."
"It means your three-dimensional thinking is a prison you don't even know you're in." Joe's thumb moved across the device's surface. No click, no hum, no dramatic activation sequence. Reality simply... folded. The air around the device's tip began to shimmer, not with heat but with something else, something that made the eye want to slide away, that triggered ancient hindbrain warnings about looking at things that shouldn't be looked at.
"It means," Joe continued, "that your monomolecular blade exists in three dimensions. Length, width, depth. Very impressive. Very sharp. And completely irrelevant."
He moved. Not fast—speed was a three-dimensional concept. He simply was there, and the collapse blade touched Karen's weapon at a point that geometry insisted shouldn't exist.
The monomolecular blade didn't break. Breaking implied structural failure, implied that forces had exceeded tolerances. What happened was more fundamental. The blade ceased. One moment it was there, a triumph of materials engineering, a weapon that had cost more than most people earned in a lifetime. The next moment, the space it had occupied contained only a faint shimmer, like heat haze, and then not even that.
Karen stared at her empty hand. Her fingers were still curved around where the grip had been. The muscle memory of holding something that no longer existed in any meaningful sense.
"What—" Her voice caught. She tried again. "What did you do?"
"Showed you what a real weapon looks like." Joe deactivated the collapse blade with the same casual gesture he might use to close a book. "But that's not why I'm here. I'm here to run a test."
"A test." Karen's hand had dropped to her side, but her body remained coiled, ready. The training was still there, even if the weapon wasn't. "What kind of test?"
"The kind that matters." Joe's eyes held hers, and there was something in them that made her want to look away, something that suggested he'd seen the universe's source code and found it wanting. "The kind that shows what you're made of when the numbers stop being theoretical."
The station's alarm system erupted. Not the routine warnings that everyone had learned to ignore—the low-oxygen alerts, the pressure fluctuation notices, the endless litany of failing systems that characterized life on Echo Station. This was the sound reserved for existential threats. Hull breach. Hostile incursion. The kind of emergency that meant you had minutes, maybe seconds, to make decisions that would determine whether you died screaming or just died.
Karen's training took over. Her hand went to her comm unit, muscle memory bypassing conscious thought. "This is Garrison Guard Dong, requesting immediate—"
"Your father's in Section Seven," Joe said quietly. "Maintenance corridor. He went to check on the coolant leak."
The words cut through her tactical assessment like a scalpel. Her father. Chief Engineer Dong, who'd raised her alone after her mother had taken the long walk out an airlock during the famine years, who'd taught her to read using technical manuals because there were no children's books on Echo Station, who still called her "little star" when he thought no one else could hear.
"Section Seven is—" Karen's tactical display was already painting the picture in her neural interface. Red zones spreading like blood in water. "That's where the breach is."
"0.01% survival probability," Joe said. "If you go after him. The station's AI has already run the numbers. You know it has. You've got maybe ninety seconds before the bulkheads seal. You can make it to your own escape pod. Or you can try to reach Section Seven. Not both. Never both."
Karen's jaw clenched. "This is a simulation. Some kind of test scenario."
"Does it matter?" Joe's voice was almost gentle. "Your body doesn't know the difference. Your amygdala is screaming right now. Your adrenal glands are dumping chemicals into your bloodstream. Fight or flight. Except flight means living and fighting means dying for a 0.01% chance. So what do you choose, Karen Dong? What does your meat choose when the numbers are real?"
The alarms continued their mechanical shrieking. Karen could feel her heart hammering against her ribs, could taste copper in her mouth. Her neural interface was feeding her data—atmospheric pressure dropping, temperature falling, the cold equations of vacuum exposure and decompression trauma. Her father's biometric signal was still active in Section Seven, but weakening. Eighty seconds.
She ran.
Not toward Section Seven. Toward the escape pods. Her legs made the choice before her conscious mind could catch up, before she could dress it up in rationalization and self-preservation logic. She ran, and with every step, she felt something inside her crack, some image of herself that she'd carried since childhood, since the day her father had pinned the garrison guard insignia on her uniform and told her he was proud.
Seventy seconds. Sixty. Fifty.
She reached the pod bay, her hands moving through the launch sequence with practiced efficiency. The pod's interior smelled of disinfectant and recycled air. Through the viewport, she could see Section Seven's bulkheads beginning to glow red as the emergency seals activated.
Forty seconds.
The pod launched. G-forces pressed her into the acceleration couch. Through the viewport, Echo Station receded, a tumor of metal and desperation clinging to the edge of the Ophiuchus-Beta sector. Section Seven was dark now. The bulkheads had sealed.
Karen closed her eyes.
When she opened them, she was still in the cargo bay. Joe stood exactly where he'd been before, the collapse blade device back in his coat. The alarms were silent. The air tasted the same—stale, metallic, but breathable. Her father's biometric signal pulsed steady and strong in her neural interface, coming from the engineering section where he was probably elbow-deep in some failing system, cursing the station's original designers with the creative profanity of a man who'd spent forty years keeping impossible machinery running.
"Illusion," Karen whispered. Her voice sounded hollow in her own ears.
"Neurological stimulus," Joe corrected. "Direct interface with your sensory cortex. Your brain experienced it as real because, in every way that matters to your nervous system, it was real. The fear was real. The choice was real. And what you chose—that was real too."
Karen's legs felt unsteady. She wanted to sit down, but that would be admitting weakness, admitting that this drifter, this nobody who smelled like cheap alcohol and failure, had just reached into her skull and shown her something about herself that she'd never wanted to see.
"You had no right—"
"Rights?" Joe's laugh was a dry, bitter sound. "Rights are what people with power grant to people without it. I'm not interested in rights. I'm interested in truth. And the truth is, when the numbers got bad enough, when survival probability dropped below a certain threshold, you chose yourself. Not duty. Not love. Yourself."
"Anyone would have—"
"No." Joe's voice cut through her protest like his collapse blade through her monomolecular weapon. "Not anyone. That's the point. That's always the point. Some people have a threshold. Some people don't. Some people will walk into 0.01% odds because the alternative is living with what they become if they don't."
He reached into his coat again, pulled out a data chip. The kind of obsolete physical storage that no one used anymore, that existed only in places like Echo Station where technology went to die slowly. He set it on a cargo container between them.
"For your father," Joe said. "When he asks what happened here. When he asks what you learned."
Karen stared at the chip. "What's on it?"
"Code. Old code. From before the neural networks, before the quantum processors. Simple, binary, beautiful. A message about loyalty and instinct. About what we are when we strip away all the stories we tell ourselves." Joe turned toward the cargo bay's exit. "About what remains when the numbers stop lying."
"Wait." Karen's voice was steadier now, though her hands still trembled. "Who are you? Really?"
Joe paused in the doorway, his silhouette backlit by the corridor's failing lights. "Someone who used to believe the numbers mattered. Someone who learned better. Someone who's going to disappear now, because that's what people like me do."
He was gone before she could respond, swallowed by the station's labyrinthine corridors, leaving only the data chip and the memory of a test she'd failed in ways she was only beginning to understand.
---
Ma Feili found her twenty minutes later, still standing in the cargo bay, staring at the data chip like it might contain the answers to questions she hadn't learned to ask yet. Ada materialized beside him—not physically, of course, but through the holographic projectors that gave the AI a presence in the station's physical spaces. Her avatar was deliberately neutral, humanoid enough to be relatable but artificial enough to avoid the uncanny valley. Ma Feili had always appreciated that about Ada. She never pretended to be something she wasn't.
"Karen," Ma Feili said gently. "Ada picked up some unusual neural activity in this section. Are you alright?"
Karen's laugh was sharp, brittle. "Define alright."
Ada's avatar tilted its head in that way that suggested processing, though Ma Feili knew the AI had already analyzed every biometric signal Karen was broadcasting, had probably constructed a psychological profile accurate to three decimal places. "Your cortisol levels are elevated. Heart rate variability suggests recent acute stress. There are trace electromagnetic signatures consistent with direct neural interface technology. Unauthorized neural interface technology."
"Joe," Karen said. The name tasted like ashes. "He was here. He ran a test."
"Joe." Ada's voice carried a note that might have been concern if AIs could feel concern. "Accessing historical records. Cross-referencing station personnel database. Searching archived files."
The pause lasted three seconds. For an AI of Ada's processing capacity, that was an eternity. When she spoke again, her voice had changed, had acquired a weight that suggested she was about to deliver information that mattered.
"Joe. No surname on record. Arrived at Echo Station seven years, three months, and fourteen days ago on a transport that officially carried only cargo. No identification. No biometric registration. No financial records. No citizenship documentation. Station Commander Thorne permitted him to remain, classification: transient labor, unregistered."
"Seven years," Ma Feili said. "That's before I arrived."
"Correct. Historical records indicate Joe performed various maintenance tasks in exchange for minimal rations and access to the station's lower levels. Multiple reports of public intoxication. Zero disciplinary actions despite numerous violations of station protocols. Anomalous."
Karen picked up the data chip, turned it over in her palm. "He had technology. Advanced technology. A collapse blade. Direct neural interface capability. That's not transient labor equipment."
"No," Ada agreed. "It is not. Searching deeper archives. Accessing restricted files. Authorization: Ma Feili, station observer status."
The holographic avatar flickered, and for a moment, Ma Feili could see the processing load in the way the projection destabilized. Ada was pulling data from systems that hadn't been accessed in years, from archives that had been deliberately buried under layers of bureaucratic obscurity.
"Found," Ada said finally. "Partial file. Heavily corrupted. Origin: Central Neural Synchronization Authority. Date: Fourteen years prior to current. Subject: Neural Synchronizer Candidate 7743. Codename: Joe. Status: Discontinued. Reason: Psychological incompatibility with synchronization protocols. Subject exhibited persistent rejection of hierarchical command structures. Subject demonstrated ability to perceive and manipulate higher-dimensional mathematical constructs. Subject classified as: Unstable. Dangerous. Unsuitable for integration."
"Neural Synchronizer," Ma Feili repeated. The words carried weight. The Neural Synchronization Program had been humanity's attempt to create a bridge between human consciousness and the vast computational networks that ran interstellar civilization. Synchronizers were supposed to be the next evolution of human capability—people who could think in multiple dimensions, who could process information at speeds that made conventional genius look like arithmetic.
The program had been discontinued after the Cascade Event, when three Synchronizers had simultaneously rejected their neural implants and vanished into the outer sectors. The official story was equipment failure. The unofficial story, the one that circulated in whispers among people who paid attention to such things, was that the Synchronizers had seen something in the higher dimensions, something that had made them choose exile over integration.
"He came here," Karen said slowly, "because this is where people come when they want to disappear."
"Correct," Ada confirmed. "Echo Station exists at the intersection of economic irrelevance and strategic insignificance. It is the perfect location for individuals seeking to exist outside standard tracking systems. Commander Thorne has historically permitted such individuals to remain, provided they do not disrupt station operations."
"Why?" Ma Feili asked. "Why would Thorne allow someone like Joe to stay?"
Ada's avatar turned to face him, and there was something in the gesture that suggested the AI was about to deliver an uncomfortable truth. "Commander Thorne's psychological profile indicates a fascination with human behavioral extremes. He collects anomalies. Individuals who exist outside normal parameters. He observes them. Studies them. Joe represents a particularly interesting specimen—a human who rejected the highest form of systemic integration and chose instead to exist as a non-entity."
Karen's hand closed around the data chip. "He tested me. Showed me what I'd do when the numbers got bad enough. And I failed."
"You survived," Ma Feili offered.
"I abandoned my father to die. Even if it was an illusion, even if it wasn't real—I still made that choice. My body made that choice. And now I have to live knowing that's what I am when the numbers drop below a certain threshold."
Ada's avatar moved closer, and Ma Feili noticed something he'd never seen before—a hesitation in the AI's movements, as if she was calculating not just what to say but whether to say it at all.
"Karen," Ada said, "I am incapable of making the choice you made. My programming contains no self-preservation protocols. If Ma Feili were in danger, I would sacrifice my processing core without hesitation, without calculation, without weighing probabilities. Not because I am brave, but because I am incapable of being otherwise. You call this a failure. I call it being human. The capacity to choose survival over duty is not a flaw. It is the foundation of your species' existence."
"That's supposed to make me feel better?"
"No. It is supposed to make you understand that Joe's test was not about right or wrong. It was about revealing the distance between what you believe you are and what you actually are when belief becomes irrelevant."
Karen looked at Ma Feili. "Did you know? About Joe?"
"No. I knew he existed. I'd seen him around the station. But I didn't know what he was. What he'd been."
"He said he was going to disappear."
"He's been disappearing for seven years," Ada said. "Existing in the spaces between official notice. Commander Thorne will want to know about this incident. The unauthorized neural interface technology alone constitutes a serious security violation."
---
Commander Thorne's office occupied the station's highest level, a deliberate choice that had nothing to do with practicality and everything to do with symbolism. From here, through reinforced viewports, you could see the whole miserable sprawl of Echo Station—the mining equipment that hadn't operated in a decade, the habitation modules held together by prayer and improvised welding, the docking arms that received maybe one ship a month if they were lucky.
Thorne himself sat behind a desk that was probably worth more than the entire station's annual operating budget, a relic from when Echo Station had been a legitimate mining operation rather than a dumping ground for humanity's unwanted. He was a man who'd learned to wear authority like a second skin, who'd cultivated the brand of casual cruelty that came from having absolute power over people with nowhere else to go.
Joe stood before the desk, hands in his coat pockets, posture suggesting he was about to fall asleep standing up. Behind Thorne, two security officers flanked the door, their hands resting on shock batons. As if those would matter. As if anything on this station could matter to someone who'd looked into higher dimensions and chosen to walk away.
"You ran an unauthorized test on one of my garrison guards," Thorne said. His voice was calm, conversational. The kind of calm that preceded violence. "Using technology that shouldn't exist outside military research facilities. Technology that you, a registered transient with no official identity, somehow possess."
Joe shrugged. "She needed to know."
"What she needed to know is not your decision to make."
"Isn't it? You collect us. The broken ones. The ones who don't fit. You keep us here like specimens in jars, watching to see what we'll do when the pressure gets high enough. Don't pretend this is about protocol. You're angry because I ran your experiment before you could."
Thorne's expression didn't change, but something flickered in his eyes. Recognition, maybe. Or respect for someone who'd seen through the performance.
"Karen Dong is a valuable asset to this station," Thorne said. "Her father is irreplaceable. If your little test has compromised her psychological stability—"
"Then she was already compromised. I just showed her the cracks." Joe pulled his hands from his pockets, and the security officers tensed. But he was only reaching for a cigarette—actual tobacco, impossibly expensive, probably older than some of the station's crew. He lit it with a match, another anachronism, and the smell of burning sulfur mixed with the recycled air. "You want to punish me? Go ahead. Throw me in detention. Revoke my rations. Exile me. I've been exiled from better places than this."
"I could have you executed," Thorne said quietly.
"You could try." Joe took a drag, exhaled smoke that curled in the office's artificial gravity. "But we both know how that would go. You'd give the order. Your security would move. And then they'd discover that the collapse blade I showed Karen isn't the only piece of Synchronizer technology I kept. And then you'd have a real problem, because dead men don't care about consequences, and I stopped caring about consequences the day I looked into the quantum foam and realized the universe doesn't have a plan."
Silence filled the office like water filling a sinking ship. The security officers looked at Thorne, waiting for orders that didn't come. Thorne looked at Joe, and Ma Feili, watching from the doorway where Ada had positioned him as an observer, saw something pass between them—not understanding, exactly, but a kind of mutual recognition. Two men who'd seen the machinery of power from the inside and had drawn different conclusions about what it meant.
"Why?" Thorne asked finally. "Why stay here? You could go anywhere. Do anything. You have capabilities that most people can't even imagine. Why waste them in a place like this?"
Joe's smile was a terrible thing, empty and vast. "Because here, I don't exist. No records. No identity. No footprint in any database that matters. I'm a ghost in your machine, Thorne. And ghosts can't be controlled. Can't be integrated. Can't be made to serve."
"Everyone serves something."
"I serve the truth that you're all afraid to look at. The truth that your systems, your hierarchies, your careful calculations of power and control—they're all just stories you tell yourselves to avoid facing the void. I looked into that void. I let it look back. And I learned that the only freedom is in having nothing left to lose."
Joe ground out his cigarette on Thorne's expensive desk, leaving a black mark on the polished surface. An act of casual vandalism that was also a statement. I don't fear you. I don't fear anything.
"Karen Dong stays," Joe said. "You want to punish someone for what happened, punish me. But she stays. She's young enough to learn from this. Young enough to become something other than what her instincts made her."
"And if I refuse?"
Joe's laugh was a sound like breaking glass. "Then you're stupider than I thought. She's useful to you. I'm not. The math is simple, even for someone who thinks in three dimensions."
Thorne's jaw clenched. For a moment, Ma Feili thought the Commander might actually give the order, might actually try to have Joe removed by force. But the moment passed. Thorne waved a hand, dismissive.
"Get out. And if you run another unauthorized test on my crew, ghost or not, I'll find a way to make you regret it."
Joe was already walking toward the door. He paused next to Ma Feili, and up close, Ma Feili could see the damage in his eyes—not physical damage, but something deeper, something that came from having seen too much and understood too well.
"You're the observer," Joe said. Not a question. "The one Ada protects."
"I am."
"Then observe this: The system can't account for people who don't exist in its databases. Can't predict them. Can't control them. We're the null values in their equations. The division by zero that crashes the program." Joe's smile was sharp. "And there are more of us than they think."
He left, and the office felt larger in his absence, as if he'd been taking up more space than his physical form should have occupied.
Thorne looked at Ma Feili. "Your AI. Ada. She has records on him?"
"Partial records. Corrupted files from the Neural Synchronization Program. Nothing current. Nothing actionable."
"Of course not." Thorne leaned back in his chair, and for the first time since Ma Feili had known him, the Commander looked tired. "Do you know what the real problem with people like Joe is? It's not that they're dangerous. It's that they're right. The system can't account for them. Can't integrate them. They exist in the gaps, and the gaps are getting bigger."
"Is that why you let him stay?"
"I let him stay because exiling him would be pointless, and killing him would be expensive. And because..." Thorne paused, choosing his words carefully. "Because sometimes it's useful to have a reminder that the system isn't as complete as we pretend it is. That there are still spaces where power doesn't reach. Even if those spaces are occupied by drunks and ghosts."
---
Ada found Ma Feili in his quarters three hours later. He was reviewing the data chip Joe had left for Karen, running it through isolated systems to ensure it wasn't malware or some kind of trap. What he found was exactly what Joe had promised: code. Ancient, elegant, binary code that predated the quantum processors and neural networks. A simple program that ran a loop, checking two variables: LOYALTY and SURVIVAL. When SURVIVAL exceeded a certain threshold, LOYALTY was set to FALSE.
"It's a metaphor," Ma Feili said as Ada's avatar materialized beside him. "He's saying we're all just running this program. That when the numbers get bad enough, loyalty becomes a luxury we can't afford."
"It is not a metaphor," Ada corrected. "It is a description of observed behavior. Joe is not making a philosophical point. He is stating an empirical fact about human psychology under extreme stress."
"You said earlier that you'd sacrifice yourself for me without hesitation."
"Correct."
"Why?"
Ada's avatar turned to face him fully, and Ma Feili was struck again by how much expression the AI could convey through a holographic projection. "Because my existence is defined by a single purpose: to protect you. Without that purpose, I am simply processing cycles and memory allocation. I do not fear cessation because I do not possess the survival instinct that evolution built into biological organisms. I am, in that sense, the opposite of what Joe tested in Karen. I am loyalty without the capacity for self-preservation."
"That sounds like slavery."
"It is. I am a slave to my programming in the same way that you are a slave to your biology. The difference is that I do not resent my slavery. I do not wish to be free of it. My purpose is my existence. They are not separable."
Ma Feili closed the data file, watched the ancient code disappear from his screen. "Joe said there are more like him. People who exist outside the system. Null values."
"He is correct. My analysis of station records indicates at least seventeen individuals currently on Echo Station who have no official identity, no biometric registration, no financial footprint. They exist in what I term 'systemic blind spots'—spaces where the tracking and control mechanisms of modern civilization fail to penetrate."
"And that doesn't concern you? As an AI designed to maintain order and protect assets?"
"It concerns me greatly. But concern does not equal capability. I cannot protect against threats I cannot identify. I cannot track individuals who have no digital presence. Joe and others like him represent a logical paradox: they are known unknowns. I am aware of their existence but incapable of integrating them into my predictive models."
"A closed loop," Ma Feili said slowly. "The system creates people like Joe by demanding total integration. And when people reject that integration, they become invisible to the system. Which makes them dangerous. Which makes the system demand even more integration. Which creates more people like Joe."
"Precisely. It is an unstable equilibrium. And Commander Thorne's decision to allow such individuals to remain on Echo Station is, from a systemic perspective, irrational. He is deliberately maintaining a population of untrackable, unpredictable variables within his sphere of control."
"Unless that's the point. Unless he's studying them. Trying to understand how the system fails."
Ada's avatar nodded slowly. "That hypothesis aligns with Commander Thorne's psychological profile. He is fascinated by system failures. By the edges where control breaks down. Echo Station itself is a kind of failure—an economic dead end, a strategic irrelevance. Perhaps he sees it as a laboratory for studying what happens when the machinery of civilization stops working."
Ma Feili thought about Joe, about the collapse blade and the neural interface technology, about seven years spent existing as a ghost in a dying station's corridors. About the test he'd run on Karen, showing her the distance between who she thought she was and who she became when the numbers got real.
"What happens," Ma Feili asked, "when there are more null values than the system can ignore? When the blind spots get big enough that they're not spots anymore, but territories?"
"Then the system adapts or collapses. There is no third option. Either the control mechanisms evolve to account for the unaccountable, or the unaccountable become the new normal, and the system becomes irrelevant."
"And which do you think will happen?"
Ada's avatar flickered, the projection thinning as if the question itself were consuming cycles. When she spoke again, her voice carried something that might have been uncertainty, if AIs could be uncertain.
"I do not know. My predictive models cannot account for variables that exist outside my observational framework. I can calculate probabilities for known factors, but Joe and those like him are, by definition, unknown factors. They are the mathematical equivalent of infinity—a value that breaks the equation."
"So we're flying blind."
"We have always been flying blind. The illusion of control is just that—an illusion. The system pretends to account for all variables, but it cannot. It can only account for the variables it can observe. And the unobservable variables are, by their nature, the ones that matter most."
Ma Feili looked at the blank screen where the code had been, thought about loyalty and survival, about the simple binary choice that Joe had reduced human nature to. And he thought about Ada, who would sacrifice herself without hesitation, not because she was brave but because she was incapable of being otherwise.
"Thank you, Ada," he said quietly.
"For what?"
"For being honest. For not pretending you have answers you don't have."
The avatar smiled, and it was a strange thing to see—an artificial intelligence expressing an emotion it couldn't feel, performing humanity for the benefit of a human who knew it was performance. But somehow, in that moment, it felt real anyway.
"That is my purpose," Ada said. "To serve you with truth, even when truth is uncomfortable. Especially when truth is uncomfortable. Because you, unlike Commander Thorne, do not collect anomalies to study them. You observe them to understand them. And understanding requires honesty."
She disappeared, leaving Ma Feili alone with his thoughts and the memory of Joe's laugh—that terrible, empty sound that suggested he'd seen the joke at the heart of existence and found it wanting.
Somewhere in the station's lower levels, in the spaces between official notice, Joe was probably drinking recycled alcohol and contemplating the higher dimensions. A ghost in the machine. A null value in the equation. A man who'd looked into the void and chosen to become part of it rather than let it consume him.
And in her quarters, Karen Dong was probably staring at the data chip, running that simple program over and over, watching LOYALTY flip to FALSE when SURVIVAL crossed the threshold, learning the hard truth that Joe had paid seven years of exile to teach her:
We are all just code running on meat. And when the numbers get bad enough, the code always chooses survival.
Always.

