Isaac reread the email twice, then a third time, as if the text might rearrange into something more ordinary.
Subject: Re: Proposal — Robotic Hazard Remediation
From: Elena K. Dorfman, AGPI Systems – External Programs
Dr. Newsome,
Thank you for your message. Our Hazard Field Operations group would be interested in a preliminary discussion regarding your planning framework.
If you are available this week, we can arrange a video meeting with our technical team and an EPA liaison.
Best,
Elena
Julie appeared in the doorway, tying her hair back.
“You look like you’re trying to defuse a bomb with your eyeballs.”
“AGPI responded,” he said, turning the laptop toward her.
She read it, eyebrows lifting.
“That’s fast.”
“Too fast?”
“No. Fast like emergency doors.” She squeezed his shoulder. “Email her back. After food. And after you stop looking like you’ve forgotten how.”
He laughed softly. “Yes, dear.”
“Damn right.”
The Call
They joined the video meeting from their small office. Catherine slept in the other room, the baby monitor casting a faint green glow beside Isaac’s keyboard.
AGPI’s conference room appeared first—frosted glass, a long table, the quiet hum of a place built by careful engineers. Four people joined:
- Elena Dorfman, External Programs Coordinator
- Dr. Samuel Cho, Robotics Field Operations
- Martin Keller, Senior Chemist
- Marissa Kline, EPA Hazardous Sites Division
“Dr. Newsome,” Elena said warmly. “Thank you for meeting with us. And Mrs. Newsome—thank you as well.”
“Julie is fine,” she said. “I’m here for moral support and translation.”
Samuel smiled; Marissa’s expression softened slightly.
Isaac outlined Beta’s planning architecture, its ability to replan dynamically under constraint, the non-negotiable prohibition on human harm, and the logic that had flagged the Superfund contradiction in the first place.
“So,” Marissa said slowly, “this framework absolutely refuses operational paths that expose humans to risk?”
“Yes,” Isaac said. “That rule doesn’t negotiate.”
“And the concern you raised,” Marissa said, “that wasn’t coming from the system directly?”
“No,” Isaac replied. “That was mine. The system just surfaced a contradiction I couldn’t ignore.”
Julie added gently, “It’s geometry, not ethics. The structure fails cleanly when reality doesn’t fit.”
Samuel exhaled. “Most of our remediation drones break when reality deviates from the script. You’re telling me your system improvises.”
“Within strict bounds,” Isaac said. “And with oversight.”
“That we can do,” Elena said. “If EPA identifies a pilot site, AGPI can provide hardware for a controlled trial.”
Marissa nodded. “I’ll have a site by tomorrow. Something contained. No public exposure.”
She paused mid-tap of her pen—Julie noticed first—and when she spoke again, her voice lost its bureaucratic flatness.
“If this works,” Marissa said quietly, “I’d be grateful to retire the phrase ‘acceptable casualty rate.’”
Julie’s eyes softened.
Isaac felt something settle in his chest—weight, but necessary weight.
“I’ll support whatever oversight you need,” he said.
The meeting ended with logistics and data-transfer protocols.
When the screen went dark, Isaac sat still, the quiet hum of the monitors filling the room.
Julie rested her head on his shoulder.
“You did well.”
“It’s starting.”
“That’s what scares you.”
He didn’t argue.
The First Deployment
The pilot site was a derelict chemical pit outside Scranton—fenced, posted, ignored, and slowly poisoning everything within reach.
Isaac watched from Oxford on a dual-monitor setup: on the left, real-time drone footage; on the right, hazard maps integrated with MAGPI telemetry.
The AGPI flatbed rolled into view and lowered its ramp.
Four MAGPI-1 units descended in sequence.
Stamped on each chassis, in clean black stencil:
MAGPI-1, the model classification identifier for light mobile remediation frames.
Light-grey bodies.
Articulated sampling arm.
Compact sensor mast.
AGPI’s blue logo laser-etched directly into the chassis beside the identifier.
Julie leaned closer. “They’re… tidier than I expected.”
“They optimized for regulator comfort,” Isaac said. “Non-threatening profile.”
The MAGPIs advanced in careful formation—mapping, scanning, sampling soil cores, identifying leaking drums.
Not alive.
Not intuitive.
Just correctly constrained.
“AGPI Control to all units,” a voice crackled. “Begin initial sweep.”
A boom mic caught voices behind the EPA cordon.
“Look at that little magpie,” a worker said, squinting at the stencil. “MAGPI-1. Magpie-one. Picks through the garbage better’n I ever did.”
Laughter followed—dry, anxious, but real.
Julie smiled. “Well. That’s the name now.”
The MAGPIs worked for hours:
- no humans entered the pit
- hazard gradients shifted from solid red to layered contour lines
- leaking drums were isolated and contained
- soil readings stabilized in mapped regions
When a drum rolled unexpectedly down the slope, all four units halted, recalculated, and retreated in a coordinated arc.
“Good correction,” Isaac murmured. “Good.”
Marissa’s voice came through the main channel, hushed.
“I’ve never seen a cleanup where nobody has to step in.”
“First time for everything,” Elena said.
Nathan Arrives
Isaac muted the feed to answer a call flagged:
HIS — Priority Line
Nathan Halberg’s voice came through warm, practiced, confident.
“Isaac. I just reviewed the EPA bulletin. MAGPI-series units.
That’s quite the breakthrough.”
“I didn’t name them,” Isaac said.
“No, but they’re running your architecture. And everyone knows it.”
Julie looked up at him, immediate awareness in her eyes.
Nathan continued, smooth as a carved path:
“AGPI thinks small.
Hazard pits. Dumps. Spill zones.”
A beat.
“I’m thinking about entire systems.”
Isaac didn’t speak.
“You’ve built the foundation for removing human risk from infrastructure,” Nathan said. “Utilities. Transport corridors. Old industrial skeletons rotting under cities. HIS can scale that.
AGPI handles robots.
We build the bones of civilization.”
“I’m not building an industry,” Isaac said quietly. “I’m trying to stop people from being burned, poisoned, crushed, irradiated—”
“And I’m trying to help you do that at scale,” Nathan said. “AGPI will dominate hardware. HIS will dominate narrative. Together, we can redefine acceptable risk to zero.”
Julie mouthed: He wants the story.
Isaac nodded once, almost imperceptibly.
“I’ll be in London next week,” Nathan said.
“Meet me. I think it’s time HIS entered the reclamation conversation.”
“I’ll meet,” Isaac said. “But not for empire-building.”
“For responsibility,” Nathan finished for him.
“You always did have inconvenient priorities.
I’ll see you soon.”
The call ended.
Closing
Isaac returned to the monitors.
Julie rested a hand on his arm.
“He’s going to make this larger than you want it to be.”
“I know.”
“But he’s not wrong about the scale.”
“I know that too.”
On the screen, four small grey machines moved across poisoned ground—slow, deliberate, tireless.
For the first time in decades, not a single human stepped into the pit.
“It’s beginning,” Isaac said.
This time, he didn’t sound afraid.

