B1.15 — THE HUMAN COST

  (2036 — Oxford, Evening)

  Isaac didn’t mean to stop working.

  The lab reports were still open on his laptop, cursor blinking in a half-finished note, but the news segment caught his eye when he walked past the living room.

  The anchor’s voice was too calm for what she was describing.

  “...federal investigators say the exposure may have continued for years. Multiple workers hospitalized last month. One has...”

  Julie reached for the remote and lowered the volume, but not enough to hide the images: men in thin disposable suits wading through an ochre pit, chemical steam rising around their legs. A congressional hearing room flashed next, stiff suits, raised voices, a stack of reports the size of a child.

  Isaac stopped behind the couch.

  Julie looked back at him.

  “You okay?” she asked.

  He nodded, though it wasn’t true.

  “Didn’t… know it was this bad.”

  “You did,” she said softly. “You just didn’t watch it.”

  He didn’t answer. The segment rolled on: budget shortfalls, EPA testimony, local parents angry about groundwater. A doctor listing the cancers nobody paid for.

  Isaac frowned.

  “I… this… this can’t continue,” he stammered, grief breaking through for a moment.

  “I need to do something.”

  Julie gestured toward the hallway. “Go. I’ll turn this off.”

  He stepped into the small office. The overhead light hummed. The laptop screen had gone dark, then brightened again as he tapped a key.

  For a moment, Isaac just stood there, hands braced on the edge of the desk, breathing through the tightness in his chest. The images from the broadcast wouldn’t leave him... the color of the water, the suits too thin, the way the doctor had spoken as if the numbers were already settled.

  He sat.

  Pulled the laptop closer.

  The half-finished lab note stared back at him, suddenly irrelevant.

  Isaac opened a new terminal window.

  Not an email.

  Not a document.

  A query interface.

  Julie appeared in the doorway, quiet, watching him without interrupting. She knew the look: not panic, not inspiration, but alignment. The moment when his internal constraints snapped into place.

  He didn’t look up.

  “I need to sanity check something,” he said, voice tight but steady.

  She nodded once. “Okay.”

  Isaac’s fingers hovered over the keys.

  Then he typed.

  Constraint Test:

  Assume: human exposure to known lethal or carcinogenic environments is unacceptable.

  Given: existing remediation timelines, regulatory delays, and available non-human platforms.

  Question: what intervention paths minimize total human harm?

  He paused.

  Reread it.

  Added one more line, not moral, just precise.

  Weight human casualty risk as infinite relative to cost, delay, or equipment loss.

  He hit execute.

  The system didn’t respond right away.

  It never did.

  Beta didn’t answer. It computed.

  Julie stepped closer, leaning against the doorframe.

  “You’re not asking it what should be done,” she said quietly.

  “No,” Isaac replied. “I’m asking what follows if I take our own rule seriously.”

  The cursor pulsed.

  Then output began to populate... structured, clinical, unemphatic.

  Not a recommendation.

  A result set.

  Hazard classes cross-referenced with exposure duration.

  Failure rates of protective gear over extended timelines.

  Regulatory lag distributions.

  Projected morbidity curves.

  Then a convergence summary.

  Result:

  Human-led remediation produces non-zero fatality expectation under all modeled timelines.

  Substitution with remote or robotic platforms reduces expected human harm by >97%.

  Isaac scrolled.

  Below that: implementation pathways.

  Existing industrial robotics.

  Teleoperated platforms.

  Prototype autonomous remediation units.

  No adjectives.

  No urgency.

  No framing.

  Just math.

  Isaac leaned back slowly, the chair creaking under his weight.

  Julie crossed the room and read over his shoulder, her expression calm but intent.

  “It didn’t decide anything,” she said.

  “I know.”

  “It didn’t suggest,” she continued. “It answered exactly what you asked.”

  “I know.”

  He stared at the screen.

  The logic was brutal in its simplicity:

  If human harm is unacceptable

  and harm is occurring

  while non-human alternatives exist

  then the optimal solution is removal of humans from the hazard.

  It wasn’t compassion.

  It was geometry.

  “I can’t pretend I didn’t see this,” Isaac said.

  Julie rested a hand lightly on his shoulder.

  “You shouldn’t,” she replied. “But seeing it doesn’t obligate anyone else. That part is still yours.”

  He nodded.

  Closed the terminal window.

  Opened a blank document.

  This time, his name was at the top.

  He wrote slowly, translating the output into human language. Explaining assumptions. Acknowledging limitations. Removing the parts that sounded like proofs and replacing them with context people could argue with.

  Julie read over his shoulder, stopping him when the tone drifted too far toward abstraction.

  “Say people here,” she said gently. “Not units.”

  He adjusted it.

  When they finished, the document was still uncomfortable.

  But it was legible.

  Isaac’s finger hovered over the send button.

  “This doesn’t force anything,” Julie said softly. “It just tells them what you found when you followed your own rules all the way down.”

  He nodded once.

  And sent it.
