AGPI Labs, Oxford — August 28–September 4, 2038
The first serious argument came on August 28th — four days after the C-AGPI-2 near-miss and barely twenty hours after HIS submitted its accelerated deployment plan.
Isaac stood at the front of Conference Room 4A, palms pressed flat against the table, staring at a slide that showed a simple outline: a narrow shaft, a debris choke point, and the silhouette of a machine slipping between the two.
A bipedal frame.
Compact.
Articulated joints.
Human proportions.
“I said no,” Isaac repeated, voice taut.
Nathan rubbed his forehead. “We’re not committing to it. HIS wants feasibility. A sketch. A concept pitch.”
Julie tapped the projection remote. “And this is what they’ll take from it: You can do this. So do it. Feasibility isn’t neutral.”
Howard sat back in his chair, calm but not relaxed. “We need to be precise. There’s a difference between exploring the geometry and endorsing the implication.”
“The implication,” Isaac snapped, “is a walking shape that looks too much like us.”
Nathan opened his hands. “It doesn’t need a face. It doesn’t need anything but the ability to move through spaces the heavy Crows can’t.”
Isaac let out a sharp exhale. “The moment something takes a human stance, people project intention onto it. Empathy. Fear. Expectation. Pick one.”
Julie added, “And once a machine looks like it can do what a person does, the public assumes it will do what a person should do. That’s the fault line. That’s where the boundary breaks.”
Howard gave a short nod. “Anthro frames come with a psychological tax. For us. For them. For the entire ecosystem.”
Nathan’s shoulders tightened. “HIS is getting pressure from every direction — rescue units, infrastructure teams, emergency services. They’re not asking for this because they want it. They’re asking because they need it.”
Isaac’s voice dropped. “Then they need to understand what they’re asking.”
Isaac walked the length of the testing bay with Howard at his side. C-AGPI-2 stood in diagnostic stance, its dented upper chassis still waiting for replacement plates. Engineers worked in the distance under the buzz of overhead lights.
“Do you think I’m overreacting?” Isaac asked.
Howard didn’t answer immediately. He watched a technician calibrate a hydraulic joint. “No,” he said at last. “You’re reacting exactly the way someone should when the world starts accelerating faster than it can understand itself.”
Isaac huffed a bitter laugh. “That’s not comforting.”
“It wasn’t meant to be.”
They stopped near the prototype. The damaged panel gleamed under the lights, a quiet reminder of the human it had protected.
“We’re not building people,” Isaac said softly.
“No,” Howard agreed. “But we’re working in the same emotional territory. Whether we choose to or not.”
That sentence stayed with Isaac long after he left the lab.
Julie lit the kettle while Isaac sat at the kitchen table, head in his hands. Catherine’s laughter echoed faintly from upstairs — a small reminder that life outside the lab still existed, even if he couldn’t feel it.
“Talk to me,” Julie said, setting a mug in front of him.
“It’s not the engineering,” Isaac said. “It’s everything around it. The expectations. The… hunger.”
Julie took the chair across from him. “People are frightened. They’re hopeful. That combination makes them irrational.”
He shook his head. “I keep thinking that if we just design it right — constrain it right — we can keep this clean. But every time we solve a problem, we create ten new demands.”
“That isn’t failure,” Julie said gently. “That’s scale.”
He stared into the steam curling from the mug.
Julie reached across the table, touching his wrist. “You’re allowed to draw a line, Isaac.”
He swallowed hard. “It won’t hold.”
“Then we reinforce it,” she said. “Together.”
Something in his chest gave way — a tension he hadn’t realized was rigid until it broke.
The room was warm, packed with people who wanted answers more than discussion. Diagrams lined the walls — confined tunnels, collapsed stairwells, structural choke points.
An HIS official pointed to a narrow rescue corridor. “We’re not asking for humanoid. We’re asking for access. The heavy Crows can’t enter this space. You need something that can.”
Isaac kept his tone measured. “We can design articulated limbs without making the frame resemble a person. There’s a geometry for that.”
Julie added, “But if you insist on a bipedal posture — even if it’s tool-like — people will read human intent into it. We need to manage that expectation.”
The official frowned. “You’re telling me we can’t build the platform we need because people will get the wrong idea?”
Howard cut in. “We’re telling you the wrong idea becomes public expectation. And public expectations become policy.”
The room stilled.
Nathan stepped in, smoothing the edges. “We’ll develop a constrained, non-humanoid articulated platform. That’s the starting point. And we’ll do it safely.”
The official nodded, but the tension didn’t move.
Later, as they left, Nathan murmured, “They heard the answer they wanted. Not the one we gave.”
Isaac didn’t disagree.
The lab was empty except for prototypes in their charging bays — metal silhouettes against the dim blue of overnight lighting. Isaac walked between them, hands in his pockets, turning over confined-space geometry in his mind.
Not humanoid.
Not shaped like us.
Not a stand-in for a person.
Tools.
Always tools.
He stopped beside C-AGPI-2. The dented panel caught a thin line of light, like an old scar.
“It can’t be a person,” Isaac whispered. “It can’t even look like one.”
From the darkness near the doorway, Julie said quietly, “Isaac… sometimes looking like a person is what saves a life.”
He turned. She stepped closer, thoughtful rather than confrontational.
“In SAR work,” she continued, “people trapped in smoke or dust don’t respond logically. They don’t respond to geometry. They respond to silhouettes. A familiar shape can make someone run toward help instead of away from it.”
Isaac looked unsettled. “Or it can make them freeze.”
“Yes,” Julie agreed. “Or hide. Or panic. It depends on context. That’s the problem.”
She rested her hand lightly on the frame. “A humanoid outline isn’t a bug. It’s a psychological variable. We have to decide when it’s helpful and when it’s dangerous.”
Howard’s voice drifted from deeper in the lab; he’d arrived without announcement, as he often did.
“And if we don’t decide,” he said, “someone else will — and they’ll choose whatever gets the most bodies out the door fastest.”
Nathan joined them moments later, coffee in hand, eyes tired but clear. “HIS is already thinking about public perception. If a bipedal outline pulls survivors out of a burning building faster than a tool shape, they’ll push for it.”
Isaac rubbed the tension between his eyes. “So we can’t ban a human silhouette.”
“No,” Julie said softly. “But we can control its use. Define the contexts.
Search and rescue isn’t urban clearing.
Fire response isn’t crowd space.
And silhouettes aren’t neutral.”
Howard nodded. “Then our job is to carve the boundary with enough precision that they can’t move it without noticing.”
Nathan set his coffee down. “And to make sure we don’t cross it without meaning to.”
No one spoke after that.
The machines hummed quietly in their bays.
The air felt thick with possibility — all of it heavy.
And the boundaries, newly redrawn and newly complicated, held.
For now.

