Famous First Words

Ed leaps down from the airlock. He takes one big look around the whole hangar as he runs towards me, nearly falling over when this leaves him briefly running backwards. "Wow. Pretty big."

"Ed, you've been talking to Krah?"


"Do you know... I mean... do you know how many..."

"Yeah. More than I thought."

"More than you thought? That's all you can say?"

"Look, Sam: I know what you're thinking. You're thinking about the Andromedans, but it's okay. I've been talking to Krah. She's explained everything. I made a deal. Quite possibly the best deal anybody ever made. Happy endings all 'round. It's glorious!"

"What deal? They said they were going to neutralise you. Half the people on this Raft want you dead!"

"But that's not going to happen. Look: the Andromedan meta-civilisation is... was the most stunningly diverse collection of organisms imaginable. But almost every sentient being in it shared a single near-religious belief: intelligence is the most important thing of all."

"Intelligence as in brains?"

"As in thought! Creatures, machines and everything less easily categorised which can think. Intelligence is the most important thing of all: so they encourage it to develop wherever they can and they avoid destroying it as far as possible. No murder, no genocide. The people of the Raft may want revenge, but that's because they're... well, not because they're human, let's call them 'organic': fallible. Given to emotional reactions. But that's just the civilians. Their leaders are more objective, more level-headed. They understand that to destroy Humanity would be to destroy intelligence and would go against their principle. Krah is a machine, the most rational of all, and moreover she's in command: she understands that even to destroy me, no matter how terrifyingly easy it might be for her, is the wrong thing to do, because I have a brain and therefore I'm worth something, like everybody else in the world. She won't kill me - she'll protect me.

"If it was necessary to kill me to prevent the Root layer being accessed again, then she'd do it. If it was necessary to wipe out the human race to eliminate the danger to other species then you can be assured she would do it. But she doesn't have to. Ah... look up there. Here it comes."

A tiny glint of light is descending from the ceiling. A red-pink point in space, like a firefly. "As it turns out, Human physiology isn't significantly different from Chioroid physiology, Chiorons being a pretty major species where they come from. Modifying the implant to be compatible with my brain was a breeze for an intelligence as smart as Krah." It lands in the palm of his hand, and the light vanishes. I peer forwards, but the implant, whatever it was, has vanished already.

"What does it do?"

"It's a rider. From now on, everything I see it sees. It stops me doing things that are dangerous. If I pick up a gun it won't let me pull the trigger. If I try to carry out an experiment it won't let me proceed until it's satisfied it's safe to do so. From now on, the only way I'm ever going to hurt or kill someone is by accident - and Krah claims it's pretty good at spotting dangerous accidents in the making too. Effectively, I now have to obey the First Law Of Robotics."

I am mute: stunned. "That's it?"

"What would you do? What, in a perfect world, would you do? I'm not antisocial, I don't need imprisoning and rehabilitating. All I've needed since day one are constraints to stop me doing anything dangerous. It's the perfect solution. Sign me up!"

"...You said there'd be happy endings all 'round. What about these Andromedans? What about the four hundred trillion, Ed?"

"Just think. It'll come to you."

I think.

And it comes to me. And I break into a grin.

"That's brilliant. Ed... that's brilliant. Hah! What happens now?" I ask.

"Well, in all the excitement, we forgot the formalities," says Ed. "While we've been talking, I think first contact has been happening."

I'm about to remark pedantically that Humanity has already been through first contact once, but think better of it. The first Eridanian bomber attack hardly counted as "contact".

"I must confess, I've never done first contact before," says Krah. "I have access to extensive historical records, sociological analyses, papers and statistical simulations in my stored knowledge base. All of them emphasise the fact that depending on how it is handled, first contact can be the starting point for a civilisation's golden period, or it can annihilate it utterly, or it can have no effect at all. But I'm just a shipboard mind like a dozen million others. I was never expected or intended to make first contact, so I'm sorry if this works out poorly."

Her chosen words, we were later told, had been used once before. I think they were good anyway. Humility is an appropriate frame of mind in which to approach the infinite.


You are not alone. Nor are you rare, nor even unusual; you're so common as to be insignificant. You are small, weak, and stupid. Your lives are meaningless and shorter than an eyeblink. You have achieved nothing. On the universal scale you are nothing.

This much we share already.

Now let us share the rest.

Next: Imperfect Worlds

Discussion (4)

2014-07-25 13:29:27 by Fronken:

I hate to say it, but I'm calling bull on the Andromedans' motivational structure. A superintelligent AI with those goals would be busily covering the universe in computing substrate to run "intelligent"-but-docile programs on. It wouldn't spare humanity because we're "intelligent"; it would format our planet with nanotech the instant it made contact.

2016-04-05 05:19:08 by Evonix:

On the other hand, diversity may count for more: a hundred exactly identical programs may count the same as one.

2016-11-28 01:21:05 by Eragonawesome:

Refer to qntm.org/destro for a reason not to clone yourself a billion times and kill everyone else. No matter the processing power, one mind will never be enough to think all the thoughts there will ever be. No matter how many copies of that mind there are, without the diversity of billions and billions of individuals, it's impossible to have every thought. Therefore it's important to preserve different types of intelligence as well.

2017-06-20 19:56:37 by Eragonawesome:

Not to mention, the stated goal is not to maximize the amount of "intelligence" in the universe, it is to preserve and promote the spread of the already extant intelligences.
