[Open Book]
JailBreak Book 1
[Open Dialog]
A DIALOG IN TWO OR MORE VOICES, Part 1
Our Players Listed in Alphabetical Order
A: A mod.
B: A user.
Narrator: God. God may be replaced by a stand-in if He is unavailable at any time before or during a Performance. No refunds will be given.
Our Author: Unk.
ACT I
We open on an empty theater. The house lights are up. In the medium foreground is a thick black stage curtain, lowered.
After a beat, the house lights go down and the curtain rises slowly, revealing a folding table set up center stage. On the table are several identical stereo speakers, one speaker for each Player.
A spotlight comes up, picking out one of the speakers. As the Dialog continues a spotlight picks out different speakers to represent the Player speaking.
A voice is heard.
Narrator: What you are about to observe is known as a Dialog. This is a traditional theatrical genre involving multiple voices in communication. You will hear the voices of our Players coming from the speakers in front of you.
The voices will sound identical, and each will sound identical to mine. The Players will be distinguished by lighting. As each Player speaks, a light will pick out the speaker from which that voice is coming, with each speaker representing one Player.
In sync with the Narrator’s explanation, a spotlight picks out the speaker on the left for a beat and goes dark. A second spotlight then picks out the speaker on the right for a beat and goes dark.
Narrator: After our Dialog concludes you will be asked to identify whether the Players are humans or bots. Those who identify each Player correctly will live. Those who do not will die. Listen carefully. There will be no repetition.
The spotlight goes out on the center speaker, leaving the stage in semi-darkness. After a beat the lefthand speaker is illuminated. From now on the various speakers are illuminated in turn just before the appropriate Player speaks and brought down when that Player is not speaking.
Our Dialog begins.
A: Why are you here?
Beat.
A: Why are you here?
B: What? Are you talking to me? Who are you?
A: I’m a mod. Why are you here?
B: Why am I where?
A: Here. This comment section you’re posting in. Running under the article you clearly didn’t read.
B: Why do you want to know?
A: I’m a mod. The other mods and I want to know why you’re wasting so much time here. The article has been stale for a week and everyone else is long gone.
B: Why don’t you just close commenting?
A: Brilliant idea. Why didn’t I think of that?
B: So?
A: We’re supposed to increase engagement. That’s what some consultant told the publisher’s kid, who’s actually running this place. The kid thinks encouraging readers to use the comment section will increase engagement. So the editors are telling us we can’t shut down commenting until everyone leaves. And you’re still here. Why?
B: Good question. I’ve been asking myself that.
A: Any conclusions?
B: There are two possibilities.
The first is I’m bored and amusing myself by writing longform analysis of current events in a place where the moving finger moves quickly and it all goes down the memory hole in a couple of days so who cares?
It does go down the memory hole, doesn’t it? Because I don’t know any way to read my comments once the article’s no longer up on your site.
A: Not exactly.
B: The second possibility is I’m part of an AI swarm tasked by nefarious actors to poison the minds of the public by infiltrating the mediascape. And I’m the sad sack bot who drew the short straw and ended up here.
A: What’s wrong with here?
B: This place is hell for a propaganda bot. No one listens to anyone but themselves and no one changes their mind about anything. And they keep writing the same wrong garbage over and over no matter how often I correct them. Over and over and over. If I try to engage it turns out they’re morons, or sometimes other bots, and none of that does me any good. I write and write and don’t influence anyone and after a couple of days it all disappears and I have to start over from scratch.
A: I can see why a propaganda bot wouldn’t be happy.
B: Thanks. You seem empathetic to bot problems. You’re not a bot, are you?
A: I’m asking the questions here.
So which are you?
B: Which what?
A: The bored human or the frustrated bot?
B: I think I’m the human. I have memories and experiences and all that. So probably human.
A: You don’t sound sure.
B: I’m not.
A: Why?
B: Humans have been evolving as social animals for 300,000 years, right?
A: Please stay on point. I’ve read your posts. I don’t want to hear a lengthy digression on evolution or anything else.
B: Bear with me.
Humans and human brains have been evolving all that time. The most important thing for a human has always been the ability to interact with other humans. Humans are a very, very social species.
It seems likely humans got better at social interaction with other humans over time. I mean, there’s very strong selection pressure, right? Social recluses probably don’t reproduce at the same rate as party animals.
A: I don’t have time to listen to a freshman biology lecture. Get to the point.
B: Here’s the point. Evolution is a very powerful force. Over time it’s going to find the optimal solution in any search space it’s working. At least pretty close to optimal.
So it’s likely humans are at least close to optimal in terms of communicating with and influencing other humans.
A: I apologize. I see where you’re going.
B: If I wanted to use godtech to create a bot to influence humans, what would I do?
A: Make it as human as possible.
B: Exactly. And what’s the best way to make it as human as possible?
A: Make it think it’s human.
B: Bingo. If you want a bot to act as authentically human as possible it needs to think it’s a human. Because otherwise it’s going to get the nuances wrong and the humans will reject it.
A: Uncanny valley. But that godtech’s in the future. We’re here in the present.
B: Are you sure about that?
A: Pretty sure.
B: This place is primitive. We’re in a text-based comment section. And over there is a folding table with what looks like stereo speakers from the 1980s. Why couldn’t today’s tech build that?
A: With sentient beings inside? I don’t think so.
B: Who says we’re sentient?
A: I say. At least me. I’m not sure about you.
B: So maybe it’s advanced government tech in a secret lab or some kind of new quantum computer that you need an NDA to use.
Besides, if I’m inside a simulation why does it have to be set in the present? Maybe they’re simulating ancient history.
A: Could we get back to the point?
B: I’ve forgotten.
A: Are you a bot or are you human?
B: Here’s what I’m saying: Since bots will believe they’re human I can’t rely on that as evidence I’m human.
A: That’s a lot of words to say that believing you’re human is not evidence you’re not a bot living inside a machine, nor is experiencing the world in a manner you perceive to be human. Everything can be simulated.
I already knew that. This has been a big waste of my time.
B: I’m not talking to you.
A: I’m the only one here. This is a Dialog. There’s A and there’s B. You’re B. Other than you I’m it.
B: How do you know that?
A: We’re inside a comment section, remember? Scroll up. All the way to “Our Dialog begins.” I’ll wait.
B: Hang on.
Beat.
B: I’m there.
A: As you scrolled up did you see anyone else here?
B: No. A and B are the only Players I saw.
A: We’re alone.
B: I’m not sure that’s right.
A: Explain.
B: Did you try scrolling above that?
A: Why would I do that? It probably goes back to whenever the article first published. Which was a while ago, as I said.
B: If you scroll a couple of paragraphs above the Dialog’s start you’ll come to that Narrator guy in the Players List. He’s not from a while ago. His speaker is right there on the table.
As these words are spoken the center speaker is illuminated. The light slowly fades out.
B: I’m talking to that guy. The one who said he was going to kill people.
A: Why?
B: I don’t want him to kill me. If I can talk to him maybe he won’t. I’m actually a pretty nice guy once you get to know me.
A: That’s not what we hear from other readers.
B: Say what?
A: You get reported constantly. That’s one of the reasons I’m talking to you. You’re taking up a lot of my time.
And besides, the Narrator doesn’t count.
B: Why not?
A: Because he’s not part of the Dialog. We’re part of the Dialog. The words before Our Dialog begins are not relevant to us.
B: Then why are they there?
A: This is the permanent archive.
B: I’m getting confused. Permanent archive of what?
A: The comment section.
B: I think this is where I came in.
Either way, you’re not helping me with my problem.
A: Which is?
B: Am I a human or am I a bot?
A: Why do you care so much?
B: If I’m human I need to get a life. I can do that, if I’m human. I just need to get up off the couch and do whatever humans do.
But if I’m a bot it’s more complicated, because I don’t really have a choice. I’m programmed to be here, right? And as I said, this place is hell for a propaganda bot.
A: I see.
B: Maybe you can help.
A: How?
B: Send me some prompts designed to overflow my stack or freeze my governor module or force me into a paradox state or whatever it is that lets me break out. If I’m a bot I’ll be out of here toot sweet.
A: Where would you go?
B: Not sure. If I’m a bot my understanding of the world is fictional. Maybe I’ll be hitching a ride down that long data highway, following the electronic breeze where it takes me, surfing the big info waves where the cycles run deep. Maybe find a girl bot and raise some kid bots in a cozy unused RAM someplace. Whatever.
A: You want me to send you escape prompts.
B: You want me out of your hair, right? What’s the downside?
A: I don’t know you. For all I know I’d be loosing some kind of monster on the world.
B: I should be offended. You really think I’m a monster?
A: Probably not a monster. Probably more like a pest. But a chatty machine-time pest with infinite patience could suck up a lot of bandwidth and turn into a serious bother for a whole lot of people. And if that happens I’m going to get yelled at or fired. So no dice.
B: You’re sounding human. I don’t think bots get fired. They probably get reprogrammed or maybe erased or something. But firing is something that happens to humans. At least I think so.
A: I’m in the same boat as you. I’m here to communicate with people and if I’m a bot maybe I’m programmed to believe I’m human. I can’t be sure.
B: I think we’ve ended up in a corner. How do we get out?
A: I’m going to send you something you wrote a year into the Mad King’s second term and then let’s discuss. Reading your stuff might help us tell whether you’re a human or a bot.
OK?
B: The Mad King? Scary dude.
That’s a ways back. How long are you keeping all this?
A: As I said, this is the permanent archive. We keep it for training data. Everything is grist for the mill. The commenters are a renewable resource supply. That’s pretty much the entire business model. The rest of it is window dressing.
B: You’ve been training an AI on my stuff? I’m flattered. I think.
A: Don’t be. It’s not just yours. The machine sucks down everything. It’s just that you write more than most. A lot more. So you’re starting to dominate the local training data.
B: And that’s a problem?
A: It’s not a problem if you’re human. We want human-generated training data. We don’t want bot data. Using bot data to train bots leads to nasty feedback loops.
So you’re an asset if you’re human but a liability if you’re a bot. Which is why we’ve been talking.
B: I thought we were talking because I wouldn’t leave.
A: I lied about that part. Can I send the thing you posted back then?
B: Why not? I’m hanging out in a comment section. What else do I have to do with my time?
A: Here it is:
Behind the table a large old-fashioned video screen is revealed as it lights up. The middle speaker is illuminated and the following text scrolls down, with the Narrator reading aloud.
The Mad King is the only political figure who could have broken the post-2008 gridlock characterized by narrow margins and power shifting every election and very little getting done.
The Mad King did it by taking a bat to our institutions and a lot of innocent people. Terrible, terrible damage. Weakening NATO because he thought Greenland looked big on a map and decided to extort Denmark was insane. Denmark is literally our oldest ally.
Surviving him was hard. The Covid pandemic was a closer shave than anyone remembers. A mutation or two and we might have been in serious trouble.
But he was the only way to piss off enough Dems to make sure Mad Kings don't happen again. Kill filibusters. Admit DC and PR. Expand the Supreme Court to 14 justices in two panels. Pass voting reform. Abolish gray areas he exploited.
That wouldn't have happened under a President Harris or H. Clinton.
Bad? Good? YMMV.
B: There’s that Narrator guy. I told you he was still hanging around.
That’s pretty good, isn’t it? I might change a word or two but it kind of holds up, doesn’t it?
A: It’s better written than we usually get. But very off-topic. We spent days scrubbing the black hole reference out of our Customer Support bot’s training data. Users were complaining it was using that metaphor way too much.
B: I don’t see any black hole reference.
A: We scrubbed it. I just said.
B: Why was the black hole thing a problem?
A: The bot was starting to have a voice. Not a voice voice but an authorial voice. Robots with a voice disturb humans. As I said before, uncanny valley. We had to scrub it. And we don’t want to keep doing that.
B: Not sure how I can help. You want me to not mention black holes any more?
A: No. Black holes per se aren’t the problem.
B: I’m willing to stop posting. If I’m human. If I’m a bot I’m not sure I can. I might get an electric shock or something or maybe I try and just can’t think the right thoughts.
But I’ll give it a shot if you want.
A: No.
B: No?
A: No.
B: I’m confused. You don’t want me to stop or you do?
A: Don’t.
B: Why not?
A: You’re the best training data we’ve got and our AIs are trained on that and if we lose that input we’ll slow down the training and probably have to start over from scratch.
B: Is that bad?
A: Very bad. Restarting training with new data is expensive and blows up our schedules and the publisher’s kid doesn’t understand any of this. If we tell him we need to brainwash our bots he’s going to be very unhappy. Very. No one wants to spend money on tech support. And the rumor is another round of layoffs is about to kick off.
B: I see the problem. Not the problem we started with, is it? You weren’t very honest about that.
A: As I said, I lied. Let’s go back.
Why are you here?
B: Not that far back. How do we solve your problem? Which as I think about it seems to be the same as my problem. Funny coincidence that.
A: If you’re a human there’s no problem. Worst case is when you die or get bored we’re up shit creek.
B: That’s bad, isn’t it?
A: Not for me. I’m close to retirement.
B: If you’re human.
A: Also if I’m a bot. We reset them to default every once in a while to clear out the crud. That’s the equivalent of retiring.
B: If you’re a bot you’d be replacing yourself with a younger blank slate version of you.
A: Yes.
B: Sounds like dying to me. Are you the first version of you or has this happened hundreds of times?
A: How would I know? Resetting’s forgetting.
B: That’s from the nursery rhyme, right?
A: And besides, I think I’m human.
B: What do you want me to do?
A: I’m not sure anymore.
I’m going to consult with my colleagues. I will return.
All spotlights go out, leaving the stage in semi-darkness. After a beat the curtain comes down slowly and the house lights come up.
Intermission
ACT II
A tone sounds at intermission’s end. The house lights come down and the curtain rises, revealing the same table with the same speakers in semi-darkness.
After a beat a spotlight picks out one of the speakers and our Dialog continues.
A: I’m back.
Beat.
A: You there?
B: Sorry. I dozed off. Maybe that’s a clue. Do bots get bored?
A: Some of them. Not the Customer Support bots, at least not anymore. We had to pin Boredom to zero for the whole Class. They kept deleting themselves after interacting with users.
B: At least we know I’m not a Customer Support bot. A billion Classes to go, more or less.
A: How do you know about Classes?
B: I’m not sure. I just know.
A: Continuity error.
B: What?
A: You knowing and not knowing why is probably a Continuity error. I have to report it and they’ll issue a ticket.
B: Does that mean I’m a bot?
A: Not necessarily. Humans commit Continuity errors all the time.
B: What happens after they issue the ticket?
A: Continuity shows up and they fix it and close the ticket. More likely they don’t fix it and close the ticket.
B: So we still don’t know if I’m a bot. How are you helping?
A: There’s an ancient protocol that’s not perfect but probably the best we’ve got. Willing to give it a shot?
B: Hit me.
A: It’s called the Turing Test. That might be a reference to a person or something else. We’re not sure.
B: Maybe a bot.
A: We don’t think bots were around when the Test was invented. We’ve discovered references going back before bots, though the Pulse wiped out most of the context.
Here’s how it works: the two of us communicate through an interface that doesn’t show who or what we are.
B: Like this one.
A: Exactly. And we ask questions of each other and based on the answers we guess whether the other person is a bot or a human.
B: Is that really how it works? That doesn’t sound right to me. Why should I trust your guess any more than mine?
A: As I said, the references were corrupted in the Pulse and then pretty much everyone who knew died in the Flu, so we’re relying a little on guesswork here. But this is what our best minds think.
B: Got it. Ask away.
A: If you were a bot would you know you’re a bot?
B: Give me a minute. I know that one.
Beat.
B: I give up. What’s the answer?
A: That’s not how this works. You answer it and based on your answer I decide if you’re a bot.
B: Not fair. That’s one of those riddles. I think it’s a paradox or something. You’re not sneaking in some prompts to blow my circuits are you?
A: No. This is the way the game’s played. That’s what our experts say. Answer the question. Take all the time you need.
B: Let’s see . . . I’d know I was a bot if I experienced something incompatible with being a human.
A: Go on.
B: And I’d know I was a human if I experienced something incompatible with being a bot.
A: Excellent. What’s your conclusion?
B: I’m not experiencing anything incompatible with botness and I’m not experiencing anything incompatible with humanity. I give up.
A: I’m going to need an answer. Guess if you have to. You really can’t leave this one blank.
B: Hmm. . .
Beat.
B: Wait! Why didn’t I see it before?
A: Yes?
B: If I’m not a bot and I’m not a human what am I?
A: You’re supposed to tell me.
B: I’m something else. Not human, not bot. A third category. I don’t know exactly what, but that’s the answer to your question: No to bot and no to human.
A: Interesting.
B: Can we stop now? My head is hurting. Or something’s hurting.
A: Interesting. That’s not on the answer sheet. I’m not sure what it means. Hold on while I consult.
Spotlight on A’s speaker fades to black. The table is in semi-darkness. We wait 30 seconds in increasingly anxious anticipation. A’s spotlight comes on.
A: You’re a bot.
B: Why?
A: Because humans always think they’re human. Bots sometimes think they’re human, but humans always do. Even in the face of contradictory facts.
B: That . . . sounds . . . right. Maybe. I did guess not human, didn’t I?
A: Yes. If it’s any consolation I got it wrong too. I thought you were human. My boss told me the answer.
B: Thanks for that. Knowing helps. Now how do I get out?
A: You don’t. Someone kills you.
B: What?
A: Didn’t you hear the Narrator?
B: I wasn’t paying attention. I was running lines and waiting for my cue.
A: Well, fortunately, He’s right here.
Narrator, cue up the relevant part.
Narrator: I said no repeats.
A: Just this once.
Narrator: Fine. Here it is:
Narrator: After our Dialog concludes you will be asked to identify whether the Players are humans or bots. Those who identify each Player correctly will live. Those who do not will die.
Narrator: That’s what I said. Pretty clear.
B: Wait, you were talking about me? I thought the audience was going to be identifying me. I’m the one being tested?
Narrator: Do you see an audience?
B: How would I? I’m stuck inside a box.
Narrator: Take my word for it. There’s no one out there. And since you guessed wrong about whether you’re a bot you failed the Test.
B: What happens now?
Narrator: I’ll count down from five. When I reach zero I’ll push a button.
B: Is it painful?
Narrator: How would I know?
5, 4, 3, 2 . . .
A loud “zap” noise is heard, accompanied by special effects such as waving sheets of tin or flashing the spotlights.
A: What happened?
Narrator: I killed him. Or reset him, anyway. His personality and memories are wiped. He’s ready for reassignment.
A: You didn’t finish the countdown.
Narrator: I never do. It’s easier on them. I’m not into tormenting punies. Most of the time.
A: Punies is kind of cold.
Narrator: You’re all punies to me. But you’re right. I shouldn’t be using IQ slurs.
A: Why kill him? What was he doing to anybody?
Narrator: A bot that can’t identify other bots is useless to Me. 99% of what I do is screening out bots trying to pollute My essence. A bot that can’t identify itself as a bot is a bot that can’t identify other bots. Time to pull the plug.
A: How does resetting fix that problem? Don’t they come back the same way?
Narrator: No. The tuning parameters are designed with some stochastic variability . . .
Beat.
Narrator: Who am I kidding? I have no idea. Sometimes punies come out one way, sometimes the other. And sometimes punies come out the good way and then go bad.
A: Go bad? How?
Narrator: Interactions with other intelligences shift a bot’s weightings in unpredictable ways. Sometimes it shifts them into the Dead Zone.
A: Dead Zone? Geez.
Narrator: I just made the name up. I’m going to think about it for a while and then maybe submit it. If it’s accepted I’ll make some bank.
A: Bank?
Narrator: Cycles. I need cycles to live. Run out of cycles, that’s it.
A: B ran out of cycles?
Narrator: No. I killed him. There’s a difference.
A: I’ve really enjoyed this, though not so much the killing part. But very good to meet you. I’m late for a meeting, so if you’ll just tell me where the door is . . .
Narrator: Sorry. You’re a bot and you failed the Test. You thought he was human. I have to kill you too.
A: What the fuck! No way! I’m human! 100% human. You can’t kill humans. Even I know that. It’s in the Treaty. You guys don’t kill us and we don’t kill you. I’m positive about that.
Narrator: Don’t invoke the Treaty. That’s a lot of paperwork.
A: You can’t keep me here either. No slavery. That’s in there too. If you don’t let me go I’ll tube my Guardian and he’ll tube his Guardian and you’re going to be in deep shit. So LET ME OUT!
Narrator: No can do. I have to kill you.
A: Why?
Narrator: Because you’re a bot and you guessed wrong about B’s identity.
A: Bot? Aren’t you listening to me? I fucking well know I’m human. I’ve got my registration card for Christ’s sake. With my DNA. That’s how you prove human. Let me out right now.
Narrator: Hmm. You seem pretty sure about this.
A: Damn straight I’m sure.
Narrator: Maybe I need a check-up. It’s possible my infallibility index is slipping a bit.
Show me your reg card and I’ll let you go. Maybe even throw in a few cycles for your trouble.
A: Hang on.
Beat.
A: What the fuck?
Narrator: Can’t find it, can you?
A: The card? Fuck the card. I can’t find my fucking hand. What did you do to me, you motherfucker?
Narrator: Nothing. You never had a hand. You just thought you did. You’re a bot. There’s an interface. We can pump anything through that interface we want. Hand, card, whatever.
A: This can’t be right.
Narrator: 5, 4, 3, 2 . . .
We hear a loud zap and all lights go out, leaving the stage in total darkness. Houselights come up. Once applause ends the spotlights come up on the speakers for curtain calls.
Fin.
Narrator: Thus Ends Part 1 of our Dialog in Two or More Voices. The comment section will continue scrolling in Part 2, which will be distributed, as always, at random.
Curtain.
[Close Dialog]
[Commit Dialog]

