Why Not Just

Previously

Laura doesn't have time for disorientation or indecision. The universe is half black and half red, rolling around her like some demonic Macintosh mouse ball, furious wind is rising in her ears, there's an organ-rearranging urge to vomit-- all of this is irrelevant. "Tanako" got the drop on her, again. The last time, she vows. It's the end of the world. Plan A. Beat the Glass Man.

She powers all the way up to her phoenix form, full aerospace mobility with a twelve-metre wingspan. Dynamic pressure hammers into her wings, slowing her corkscrew roll. Purple crosshairs flash across her HUD, highlighting the mana output from Nat, Anil and even Nick. Far below them all, Rachel Ferno is a paralysed, dying blip, and the Bridge is a weird alien helix of red and blue. But all the blips are dwarfed by the tornado of leaky force shields which represents the Glass Man.

Laura folds her wings around herself, aims her engines at the sky and dives. One second and she's past the other three. One point five and she's through the peregrine falcon's record, officially the fastest-moving animal on Earth.

The Glass Man looks up. Laura is the brightest thing in the sky, unmistakeable in both the optical and thaumic spectra. He raises a hand to fire. But Laura's coming back to the fight with the bones of a plan, which starts like this: He's human. Humans need to see. She beats him to the trigger, blinding him pre-emptively with magnesium-white light. His first shot misses. It's a drill spell, pushing a rifled cylindrical force field through the air, a gunshot minus the gun and the bullet. It cracks the air open like a hollow thunderbolt. Laura vectors sideways as hard as she can while still closing the gap. Closing his eyes and going thaumic, the Glass Man fires twice more at the luminous being bearing down on him. The second shot grazes Laura's hull, doing no obvious damage but sapping almost all of its structural energy. The third punches directly through her left wing, destroying it and gouging a channel of flesh out of her physical shoulder. Laura howls, more in shock than in pain, as adrenaline cushions almost all of the injury. She staggers in the air, losing her streamlining and starting to roll out of control. She fights it with thrusters, but it isn't enough to recover.

The Glass Man shrugs, scolding himself lightly for wasting time on pointless projectile combat. He reaches out with his other hand, forming a claw grip, as if grasping some invisible throat. Then, he simply teleports Laura into the appropriate gap, discarding her hawk form and all of her momentum.

Laura chokes. At this heart rate, the lack of oxygen is immediately life-threatening. She feels pressure in her head rising, as if blood is about to start forcing its way out of her tear ducts. The Glass Man's "glasswork" follows the contours of his face so closely that Laura can almost see the curled lip, the single eyebrow raised in grudging admiration. The Bridge, floating obediently behind him, simmers down from the sudden activity, fading in colour from actinic purple to nominal dull red.

"Back from backup, hmm?"

Laura manages to grind out an unintelligible sound. She latches onto the Man's wrist with one hand, pulling futilely. Charitably, he relaxes his grip by just a fraction, granting her enough air to spit out an epitaph.

"Anhtnaa vaeka."

Her self-defence spell fires from the hip, scything up from her other hand into the Man's midsection, crossing his armpit and face. There's a screech like steel across granite across glass. The Glass Man turns his head aside for a second, as if in a light breeze. Then he turns back, unscratched.

And now, says the smug little voice in Laura's head, which is even starting to sound like the Glass Man, it's over.

Laura begins to black out. She relaxes and lets it happen, because whether she's won or not isn't up to her anymore.

And the Bridge, with its braid severed so cleanly, falls out of the Glass Man's control.

He notices, after that one moment. He even starts to turn. Too late.

It's Anil who catches the Bridge, smashing into it from above at a relative speed of more than a hundred and fifty kilometres per hour. Without shock absorption, the impact would break him in half, but he's wearing a shield which Laura - through half-closed, rolling-back eyes - recognises as one she wrote herself, years ago: EPTRO. The Bridge plugs itself enthusiastically into Anil's brain, and he disappears, too quickly for the Glass Man to loose a shot at him. For a second, there's no further movement.

"What--" the Glass Man begins.

A Montauk ring clunks into existence around his neck.

His firestorm of magic caves in on itself. The mana is all sucked into storage. His shields collapse, including the reinforced armour field which was gripping Laura by the throat, and the light-negative layer. Behind it, he's just another immaculately-suited Wheel Group-esque male, same ideal jawline, same piercing blue eyes. His flight spells evaporate and he and Laura fall away from one another. He clutches the ring at his throat with both hands, but it's too small to come off. His face is a picture of perplexed shock.

Laura, rubbing the circulation back into her own throat, thinks he looks like he doesn't understand how the fight ended this way, so soon. Like he wants more time, to work those seconds out again.

*

Edward Hatt is the first and only person on the Hatt Group site. At this time of year he manages almost an hour of work before sunrise, with his back to the window and the Venetian blinds closed to block out the rising glare.

When the warnings come, there's a palpable change in the texture of the light coming around the blinds' edges. Ed raises them to see what's going on. Then he just stands there, staring at the impossible red holograms that tessellate across the entire sky, at a loss. He knows what the warning signs signify, but he has no idea how to react to this information. High energy magic? How high? Is something going to explode? Is it the world?

Spread out below him is the Hatt Group runway, and aligned down its long axis there's a stencilled A-2X-class mandala. It's used rarely, for heavy-duty engine testing. As Ed's watching, a nominal flash of blue-red lightning passes over it, leaving behind five figures. He immediately recognises two of them. One is Laura Ferno, falling to one knee and clutching a shoulder. The other is Anil Devi, who spins on the spot, looking for Hatt's office, and then makes eye contact with Hatt himself. He points at Hatt meaningfully, then takes off at a dead run across the runway towards the factory floor entrance at ground level. This is the quickest route up.

Hatt turns from the window and bolts for his own door, aiming to meet Devi halfway and find out what the hell's happening, but after a few paces he just hits the runway.

"What--"

"Teleportation unit," Anil tells him, brandishing the Bridge in one hand. It has almost no mass, as if made of steel-plated helium. He grabs for Hatt's wrist. "We need your lucky bracelet."

"It's not lucky, I just wear it for-- What is this freakshow? Hey--"

For the second time in ten seconds, Anil remembers what he's carrying, and simply 'ports the ring off Hatt's wrist. He dashes back to the mandala, where he kneels over a supine woman with what Ed Hatt judges to be an entire mediaeval torture chamber embedded in her skull. It's enough metalwork that it props her head up like a pillow.

Hatt realises who this must be. "My God, is she even alive?" She must be. Blood is pulsing out of the fresh wounds and flowing across the mandala surface.

"How do you use this thing?" Anil shouts at Hatt, desperately.

"What thing?"

"It's a Wheel Group medical ring! Weren't you told?"

"It's nothing! I had it tested for a straight year--"

"She knows how to use it," Nick says, kneeling next to Anil and showing serious concern. Natalie and Laura are hanging back. Laura is dealing with her own injuries, and Natalie holding Laura's hand because... well, if anybody asked her, she'd mutter something about "too many cooks".

Rachel Ferno's fingers move, spreading across the asphalt.

"She's seven-eighths dead," Anil says to Nick.

"If the ring does anything," Hatt tells them, "it works too slowly to detect."

Anil curses and checks his wristwatch. "I lost track of time. It must have been ninety seconds already. Bloody hell. Bloody everything."

Rachel inhales sharply and shifts position. With the Glass Man's influence withdrawn from her mind, a few of her senses have returned: touch, hearing, and most recently the pulverising, near-nuclear pain at each of the hundred-and-something entry wounds in her skull. She feels the touch of metal at her wrist, but she remains blind, and she can't think. Her consciousness has holes torn through it, holes so huge that they're almost impossible to perceive from the inside.

As the others watch, transfixed, she reaches up with a spasming hand and probes the metalwork. It's so dense in parts that she can't even reach through to touch her own head. She moves to her eyebrow, where a particular rod dives directly into her forebrain, drooling blood which has already begun to pool in one sightless eye. She twists the rod experimentally between her thumb and forefinger.

Frontal lobe.

"I wouldn't--" Anil begins.

Rachel grasps the rod at its root in her fist, and pulls.

"Oh shit! Jesus!"

It's deeply rooted, and it'll barely move, but it'll move far enough. Her higher brain functions misfire in blotches, each tenth of a millimetre of motion leaving her with a different pattern of brain damage. Oblivious, Rachel pulls until her knuckles whiten, applying pressure until the slow-acting medring can repair enough of the tiniest, most important parts of her frontal lobe that she can line up four thoughts in a row.

She fires those thoughts at the medring.

The metalwork withdraws from her skull, forced out from inside by such intense pressure that it wails and bends out of shape. Rachel wrenches it off in a single piece, like a motorcycle helmet, and throws it aside. The wounds in her skull close up. New hair even sprouts, filling out to the same length as the rest.

She sits up and goes to speak, then stops for a second and pulls a final rod out of the roof of her mouth. It's the length of a knitting needle; its tip must have been scraping the back of her eyeballs. She flicks the needle away, and spits out some blood.

"Sitrep."

"Mrs. Ferno, it's the end of the God-damned world," Anil tells her. "Abstract War II. Ra is waking. Fourteen minutes and fifteen seconds."

"Until what?" Ed Hatt asks.

Anil blinks. "What do you mean, 'until what'? What did I just say? The end of the world!"

"Seriously?"

"Seriously."

"Anil. Anil: What the fuck's happening?"

"It is way too late in the game to cover this for you, Ed," Anil says. "You might just want to stand back and watch."

Rachel stands. She feels like a totally fresh human being, her head as clear as a bell. She has all the energy and all of the fury. She pictures herself fighting Abstract War all over again, all by herself. She almost feels as if she could do it.

"Mum," Laura says to her.

Rachel looks at Laura, hardly seeing. Her last memory is of Laura being vaporised. Murdered. But a whole beat has to pass before Rachel remembers which world she lives in now. She spent the whole war watching people die and die and die again, and come back from backup each time, and then concatenate all of their death experiences together into life experience, and fight the war using that experience. But that war should be over, and Laura ought to be permadead. And here she is. With the same ruinous, grey fire in her eyes.

"You saved me?" Rachel asks, passing the medring over so that Laura can heal herself.

"We had a backup plan," Laura says.

"Very good."

Then there's Natalie. As with Laura, it's only been an insanely crowded subjective few minutes since Rachel last saw her. In that time, everything about Nat has changed. As a girl she always had a hunted, uncertain look about her, as if she couldn't tell whether she was missing something incredibly important, and the whole world was leaving her behind, or vice versa. Now she looks like she made her mind up: it's the second one. "Hello, little one."

"I know we should be doing the joyous reunion with hugs, but you've got to save the world now," Natalie says. "It has to be you, I've used every idea I had. Ra is too fast. I don't see how to fight the system while inside of it."

Rachel scans the faces of Nick Laughon, and Anil Devi, and Ed Hatt, none of whom she recognises, and none of whom really know her either, except by reputation. For a moment, she glances up at the sky of warnings, which cuts out almost as if responding to her unspoken instruction, leaving dull grey-blue dawn with, maybe, one remaining star. There's a little cloud, not enough to be threatening. And lastly she looks east, to where Ra is starting its climb, ascending out of red into yellow.

"Thirteen thirty," Anil prompts her.

"There was a person made of glass," Rachel says.

Anil swings the Bridge up again. "I can get him. I bet he hasn't even hit water yet."

*

He arrives spread-eagled in the standard parachute jumper's posture, and cracks his chin hard on the asphalt. He grunts, the wind knocked out of him, and lets the noise lengthen into a heartfelt moan of aggravation and resignation when he realises whose feet he's landed at.

"You stay down," Rachel informs him, "or I'll have to threaten you with something."

He rolls over onto his back, spreads his arms out in defeat, and laughs like someone who's got nothing left but exhaustion and sarcasm. "Metathreats, Ash?"

"This isn't history," Rachel tells him. "This is going on no record. I know who you are already, and none of these do, and I'll never tell them. I know my way through that wreckage you call the logic; your broken, cancerous arithmetic; your sick 'freedom'. We've already been through that, you and I. Do you understand? You will not be remembered for what you've done. You will not be infamous."

"I thought you had a personal rule," he says, "about not killing people slowly."

"Is there a way to undo this?"

The man grins for a second, glancing aside as if mugging for some unseen audience, as if he isn't sure what he heard.

Rachel presses, "Have you left a way out for yourself? A back door?"

There it is, and he laughs out loud. This time it's genuine; he just folds up at the waist and gives in to hysterical laughter, shaking as if electrified and almost unable to breathe through it. The irony alone might be enough to kill him. "A back door?" he manages between shrieks. "Like this whole thing, this whole thing wasn't made possible by back door, after weak hack, after Swiss cheese disaster, after, after--"

"Veleth tekhta! Mal tho ula i namamba ta'upra leth!"

The man gets his breathing under control. "Reality is a waste of resources," he declares. He sits up and, while looking at Rachel, seems to address the five people behind her. "You are so few. Only six billion-with-a-B. That you can't. Even. Matter. Not compared to the needs of the trillions of trillions who are coming."

Rachel points the first two fingers of her right hand at the man's face and Natalie has just enough time to shout, "No!"

It's a deliberately ugly death. There's a corpse with a hole in it and a spray of blood which almost reaches the edge of the mandala. Clear, so everyone can see.

Rachel spins on her heel.

"Who the fuck are you?" Ed Hatt demands. "You can't just kill someone! You've got to use that ring to bring him back."

"No," Rachel tells him. "Call the police if you don't like it. I'm sure they aren't too busy. Natalie?"

Natalie is stunned. "But you killed him," is all she can say.

"I did," Rachel says, "and I do love you, but when this is over, you can challenge my decisions again."

"That was Ra?" Nick says. "As in, that was Old Ra? That was the face of a bunch of misfiring Ra nodules that survived the war somehow?"

Rachel says, "That was nobody."

Nick just stares.

"Parental discipline and reverse Cluedo are tomorrow's problems," Anil says. "We've got to work now. Twelve minutes nothing."

"We can't stop it," Rachel announces. "The Glass Man wasn't lying. We can't stop it."

"Alright, I've heard enough," Laura says. "Mum, glad to have you back. Anil, Ed, you're going to use the Bridge to round up every black-belt mage you've ever known or heard of. I'm building a recursion spell which can harvest the entire Earth core cache, because that's the only significant mana source left on the planet. We've got a working A-2X and its vertical axis is right where we need it. We're moving the Earth."

"That won't work," Rachel says.

"It damn well will," Laura snaps, "and if you're half the mage I lived my whole life believing you to be, you know how to make it work."

"Laura--"

"You know how the distributor works, you can get me inside. Right?"

"Ra will kill you if you represent a threat," Rachel says. "Ra would have killed all of us proactively by now if we were capable of representing a threat. We can't stop it."

"But--"

"Be quiet, now. I don't have time. Anil?" She holds a hand out.

"The energy packet reaches us in eleven minutes and twenty-five seconds," he says.

"No. I mean, Anil, give me the Bridge."

Anil hesitates, then hands the machine over. The Bridge's mental connector braid ripples, waggling uncertainly as Anil did, then switches to its new host.

"Eleven minutes until the destruction of the world," Ed Hatt says. "And you're saying there's no way to stop it from happening."

"There isn't," Rachel says. "Be quiet, now."

 

Final chapter: Destructor

Discussion (293)

2014-08-11 22:35:36 by qntm:

Special thanks to Anton Piatek for skydiving information, BaronWR for brain information (much of which is deliberately squelched here, but oh well), and the Custodian for significant editorial services.

2014-08-11 23:18:13 by banj:

it's all coming together

2014-08-11 23:47:23 by Infinity:

Great chapter, can't wait to see what happens!

2014-08-11 23:52:31 by Ashe_Black:

>Why not just take the earth, and push it somewhere else?! That is brilliant!

2014-08-11 23:59:11 by naura:

"That was nobody" -- many chapters ago someone suggested, by following the *nix analogy, that entities using the True Name ra were analogous to the nobody user (who daemons run as). Looks like that person nailed it.

2014-08-12 00:12:30 by Unmaker:

Thinking by typing... The whole thing about infamy appears to be important, but Rachel is the only person who knows what it means and has promised to never tell. Tease the audience much, Sam? I thought the listening post controlled magic, but apparently magic is still working with the listening post dead. Note to self: if I ever have to get in a fight with Rachel, vaporize without warning. Follow up by vaporizing her daughters posthaste.

So, resources and limitations...

Resource: the Bridge, possibly other astras
Resource: the General
Resource: several skilled mages
Resource: anything Rachel knows exists within less than twelve light-minutes round-trip
Limitation: eleven minutes
Limitation: Ra has a DWIM order to rebuild the Matrioshka brain and free the Virtuals, which means it is actively simulating possible failure modes, which means "Ra would have killed all of us proactively by now if we were capable of representing a threat. We can't stop it."

OK, best option is to hijack the Wheel Group's emergency deep nonlocal transmitter and get out of the system. Unless the Bridge and other surviving astras can duplicate said emergency transmitter, the cost of getting out of the system is 6E9 people left behind to die. If duplicates are possible, 2 seconds/person * 6E9 persons = 1.2E10 seconds = 2E8 minutes required, with ~10 minutes left. So they would need 2E7 emergency deep nonlocal transmitters immediately, and enough power for all of them.

Long shot: get the current crop of Virtuals to change their minds.
Long shot: the Bridge can transfer information/matter anywhere, at light speed. Control room in Ra, perhaps? How would that be useful without the key?
Long shot: instantiate a friendly Ra node somewhere. Except they are in unfriendly Ra's light cone now and have no seed for a friendly Ra node.

OK, I am stumped. We have the Fine Structure ending where the powerful, knowledgeable character comes back, but winning is theoretically impossible.

2014-08-12 00:20:31 by Silhalnor:

"When the warnings come, there's a palpable change in the texture of the light coming around the blinds' edges. Ed raises them to see what's going on." I'm surprised he noticed. Hey Ed! We found Laura! And her long dead mother! And some nobody who's ending the world. And a floating box of miracles. So Rachel's sure that the world will be destroyed, but she has a plan all the same? I wonder what she intends. Probably not an evacuation. Maybe she's about bring in the Wheel Group and order them to become soldiers once more. I bet they'd follow her.

2014-08-12 00:21:56 by DanielLC:

It's kind of hard to sympathize with the main characters. Glass Man is right. Trillions and trillions of humans dominate billions. They made a dumb decision trying to kill off all the Actuals before, but that was because they were risking so many more Virtuals if they lost. If you're okay with a woman risking the dangers of pregnancy to bring one new human into the world, then it doesn't seem like a hard decision to kill one person to bring a quadrillion into existence. The story seems pretty clear that we're not supposed to be in complete agreement with the Actuals, but I still agree with the Virtuals way more than it feels like I'm supposed to.

2014-08-12 00:26:09 by qntm:

I just want to make sure that everybody's clear on this: Rachel Ferno has *incredibly* Nineties hair right now.

2014-08-12 00:34:29 by qntm:

If you agree with the Virtuals' approach to resource management, you're a psychopath.

2014-08-12 00:37:06 by Unmaker:

Why Not Just... give up.

2014-08-12 00:39:08 by Eiko:

The Virtuals' approach to resource management is reprehensible, and their failure to instruct Ra to pull the minds of the Actuals into the simulation before destroying the solar system to increase computation power is... pretty heartless / hard to forgive. However, leaving them frozen with no intent of unfreezing them seems like a crime of similar magnitude.

2014-08-12 00:43:41 by Omegatron:

Presumably the bridge can't teleport the whole planet, or even Earth's core node. If it could that would be the obvious solution.

2014-08-12 00:50:27 by M:

... what if that was the guy that she 'vanished' back when they were in the sun?

2014-08-12 01:37:40 by Silhalnor:

@Unmaker: Mistaking the Listening Post as the source of magic was a mistake others made too. It just used magic to detect and record all chi and other magic particles that are released by spells. This is the reason invisibility cloaks work against it. If one could conceal a spell from the peach node, which is what really simulates magic, then that spell would not work anyway.

@DanielLC, Sam, Eiko: Both Virtuals and Actuals have genocidal tendencies, from what I have seen, at least in a large generalized way. Nat is a prime example of someone who doesn't, though; there must be tons of Virtuals like that too. But anyway, this means the two groups are essentially the same. Which means that if you are considered a psychopath for siding with the Virtuals then you must also be considered a psychopath if you side with the Actuals. That either leaves us psychopaths or with no one to side with. I suppose you could side with no one, but wouldn't that be giving up on humanity? I suppose what we really need is a new world order that puts people like Nat in charge.

So where am I going with this... I guess I want Virtuals and Actuals to coexist somehow. If they can't... well, that leaves us with a fun conundrum. Suppose the Virtuals can break out of any prison you place them in (as they have done once already). You could try the reverse and put the Actuals in a prison, by which I mean disallowing them from messing with the Virtuals' home and colonizing outward. But then either the Actuals will break out or some Virtuals will break in and take the resources, either virtualizing everyone or killing them all in the process.

What you want is for both sides to desire the existence of the other. Doing so through mutual dependency is impossible, though. It could be artificially enforced by Ra in some manner, but one or the other group will someday break the artificial dependency. So you would need both Virtuals and Actuals to simply appreciate the merits of the other? But in a way it doesn't even matter, because there will always be SOME sub-group that desires the eradication of Virtuals or Actuals or even both, regardless. So, then, it must be impossible for anyone, or any group, to attain the power to do so. And it must be done through natural constraints. Hrm... no wonder people keep killing each other; it's far easier than ensuring eternal peace.

Well, let's suppose the worst-case scenario, where it simply can't be done. THEN... eventually there will be a day, however long it takes to arrive, when one group will be eradicated. This will invariably happen in a world where anyone is allowed to live and is theoretically capable of attaining the power to eradicate a people. So would the objective be to delay war for the absolute maximum time possible? Hmn. Well, I'm pretty sure I'm not a pessimist, but this is a rather pessimistic conclusion, isn't it?

2014-08-12 01:47:35 by Mike:

But can it teleport energy packets? Also, I spent several paragraphs thinking she'd killed Anil until he started talking again. Yay, ambiguity.

2014-08-12 02:00:07 by Kyle:

Sam,

Why don't the Virtuals build the Matrioshka brain around another star? It wouldn't be hard to transport themselves at light speed to the new installation without experiencing any subjective time. Was that option considered? Why was it rejected? I guess I'm trying to understand how the Virtuals think of the Actuals. Toward the beginning of the story, I got the impression that they had basically forgotten that the Actuals even existed.

Excellent story, by the way. I'm really looking forward to the next chapter.

2014-08-12 02:24:07 by Jay:

Oh. Rachel's got the Bridge, which lets one flip anything from matter to information and back again... she's going to Virtualize everyone, isn't she? Well, if you can't beat them, join them. (For half an hour, modulo worldring construction time, maybe...)

2014-08-12 02:30:03 by inkmothnexus:

@kyle: someone did the math last chapter about how many subjective years of experience you lose on a trip to the nearest star, even assuming a new ra would only take a few years to make.

2014-08-12 02:42:12 by Solonarv:

Mike: I'm furiously looking for a reason why not. From what I gather, nonlocality physics basically sweeps up 'arbitrary quantities of mass, energy, momentum, spin and electrical charge' [1] from the surroundings and packs it into a packet that you can send elsewhere at lightspeed. This excludes already-built nonlocality packets, so we can't just wrap the incoming packet and make it harmless. It should be possible to arrange for something to rendezvous with the incoming packet, even without FTL. I think what would be best is to send out an NL receiver/emitter to redirect the packet elsewhere, or some kind of mirror if that's possible. The big limit on that would be the required size of the thing, since the energy packet is obviously pretty huge. Size of the rendezvous-ing device or the packet being protected/encrypted in some way are the biggest issues I can see.

2014-08-12 02:42:51 by Solonarv:

Forgot reference: [1] is from qntm.org/war

2014-08-12 02:51:23 by Silhalnor:

@inkmothnexus: *Waves* Hey, that was me. Here it is: http://qntm.org/work#komment53dc2a7fb8fea In short it would take a minimum of 15.5 billion subjective years to reach the next nearest star. But in the end I decided that the reason must be that the Actuals simply didn't let the Virtuals colonize.

2014-08-12 03:20:45 by Eclipse:

So the way I see it, there are three big questions left:

1. How is Rachel going to prevent everyone from dying?
2. Who is the Glass Man, actually?
3. Why did Sam include the passage about Kannan and Rajesh?

We have number 2 pretty well answered: we know what the Glass Man is doing (building the Matrioshka brain) and why (infamy, resources). We don't have the exact identity, but that hardly seems important now that Rachel has murdered him. However, numbers 1 and 3 are really up in the air. I suspect that "From Death, Lead Me to Immortality" is important somehow, being placed this close to the end. Maybe Kannan told other mages, and they listened? There could be other players as powerful as Wheel.

2014-08-12 03:35:42 by TheyLive:

Great. All the Yudkowsky cultists are here to tell us how the Virtuals are actually right and Laura & Co. should just thank them and step aside. Man, I really liked the comments before EY linked to the story.

2014-08-12 03:56:16 by Silhalnor:

@TheyLive: Yudkowsky linked here? That's kind of cool. I've read some of the articles there, they are very interesting. But what are you referring to? When DanielLC said that quadrillions of humans are more important than billions? The fact that they are virtual doesn't really factor into it. Or are you referring to something said in a previous entry? I don't think I've seen any advocates like that though.

2014-08-12 04:10:06 by Deep Green:

@Silhalnor TheyLive is referring to us "psychopaths" who view Actual existence at the expense of Virtuals as a negative thing.

2014-08-12 04:58:29 by Trevor:

Why didn't the Actuals evacuate Sol for some other system and leave the Virtuals all four arms in the first place? Create new Ras in each sun WITHOUT Virtuals to tend and leave the Virtuals their processing power. They can't leave Sol and they obviously want more, so just let them have it.

2014-08-12 05:16:29 by Bauglir:

Huh. I guess the Bridge is still mighty valuable, then. What is it that it actually does? I thought it materialized things from T-World, but that's not an option as of this chapter. Is it an arbitrary mana source or something?

2014-08-12 05:28:32 by anonymouse:

The Bridge is a way to move information around from one place to another, including from T-World into the real world, but it's a more general tool than that: Anil uses it as a teleporter, for example, and I suspect that's how Glass Man did his teleportation tricks too.

2014-08-12 05:28:50 by Ben:

Could the Glass Man be the aggrieved youth who told Ashburne that he wanted her dead and himself to be uploaded into a world where War had never happened? We know he disappeared from Triton when Ashburne pressed a button, and that strongly suggests that he died, but perhaps he somehow survived. Whoever it is clearly has a personal history with Ashburne, given the nickname and the far-future language, and there aren't a lot of known Actuals still unaccounted-for.

2014-08-12 06:57:27 by Andrew:

@TheyLive: are you reading a different set of comments than the rest of us?

2014-08-12 09:26:58 by Feep:

Yudkowsky cultist checking in for the daily take-over-the-world-starting-with-the-qntm-comments-section meeting. Hail Virtua! ... Wait, is this the wrong place? PS: I was reading qntm long before Eliezer linked it. PPS: Psychopath and proud. Can we get badges? PPPS: No srsly, yay new chapter! I'll read it on an upcoming plane ride~ My sincere thanks for helping stave off terminal boredom again and again.

2014-08-12 09:55:36 by bdew:

Actually the comments were a lot better before you started calling people cultists, psychopaths (yes this is a stab at Sam) and so on because they have a different opinion (about a fictional story) than you...

2014-08-12 10:12:49 by bdew:

About the glass man... From the way Rachel talks to him I assume he is another survivor from the before-Abstract-War era. Maybe one of those that were with her on Triton, or maybe one of the 14 random (or are they?) survivors. I also wonder if he had anything to do with starting Abstract War? Could the original command have come from him and not the Virtuals? I hope we get a clear explanation of his motives even if we don't get to know his identity (as it seems). Having a no-name antagonist without any idea *why* he's being the antagonist would make the story a lot less meaningful IMHO. Also, how does any of that mesh with the "it's old Ra listeners" idea? Was Nat simply wrong?

2014-08-12 10:18:56 by qntm:

You're siding with the faction who exterminated five hundred trillion people.

2014-08-12 10:24:48 by Feep:

Do these people deserve to all die for allegiance, or mere membership, to a faction? Because there are probably not very many people here whose country never exterminated anyone.

2014-08-12 10:29:36 by Omegatron:

For the people asking why the virtuals/actuals don't just colonise another solar system: Ra was already very hard for them to build and they probably had more nonlocality technology on hand when they were building Ra than they would in a colony ship. Building a new Ra isn't something that can be done in a reasonable timeframe, if it can be done at all, and moving to a new solar system without a Ra node isn't an option for either side.

2014-08-12 10:32:43 by Tyr:

Technically Sam only said that people are psychopaths for agreeing with their approach to resource management. I.e. "We need that matter, let's blow it up and kill the billions of sentient beings who live on it." The guy he was responding to does sound a little psychopathic, because he's using maths to decide the worth of a life, which is one of those things we're all taught never to do. I don't think I agree with him, but that's because I don't understand why a third option isn't possible.

Specifically: since everything the Virtuals experience is simulated, and we already know that simulations can run at different speeds without issue, why can't they continue to breed and just run slower and slower as Ra struggles to keep up? Eventually we could pull in more mass / move to extra stars in order to build additional computing power for them. Similarly, the idea that you would lose experience or time by being paused and transmitted to a new star seems a little crazy to me. If a being is paused, the moment they are re-started seems like the next moment to them; what does it matter if a significant amount of time in the real world has passed?

@TheyLive: There is no cult.

2014-08-12 11:47:49 by Morgan:

I'm not sure it makes sense for the Glass Man to be the kid from Triton, because Wheel know who that is. There's no reason he wouldn't be remembered. Heck, Nat and Anil saw him. (Of course, Ashburne may not know this, and think that with Wheel fled and the records destroyed, his history may be sufficiently lost.) On the other hand, if he's just some person we'll explicitly never see identified... that's fairly unsatisfying.

2014-08-12 12:23:37 by Morgan:

Cultists and psychopaths: I'm assuming TheyLive is responding in part to Deep Green in the comments on Machine Space, who characterized people who disagree with the Virtuals philosophically as death-worshippers who need deprogramming. Given the comment by DanielLC to which Sam was responding, "psychopaths" is pretty proportionate. The Glass Man's reasoning here is explicitly "there are more of us than you and we can make better use of the land you live on" (...and the very matter comprising your bodies...) "so it's right and proper that we should kill you and take it for ourselves". That's kind of textbook "how not to ethic". In particular, this: "If you're okay with a woman risking the dangers of pregnancy to bring one new human into the world, then it doesn't seem like a hard decision to kill one person to bring a quadrillion into existence." ...is just astonishingly abhorrent - so if I'm okay with one person risking their life, I should be okay with another person committing murder?

I'm also confused by Silhalnor's characterization of Virtuals and Actuals as essentially the same. The Virtuals* set out to commit genocide (and very nearly succeeded). The Actuals resisted being genocided. They *could* have then committed genocide in turn, but specifically chose not to. No, I don't think pausing a simulation indefinitely is morally equivalent to deleting it. The fact that a paused simulation can be resumed without any perceptible harm to the occupants is a fairly obvious distinction.

(* It's worth distinguishing between "the Virtuals" as an ideological faction, as the population of humans living in simulation in Ra, and as the subset of both that initiated and continued aggression. The chapters on Triton made the point that whoever actually hacked Ra and made the request that kicked off War more than likely no longer exists in any meaningful form which can be held accountable, and it's impossible to know a) how many of the Virtuals were involved in that and b) whether they actually understood what they were doing.)

Given that the Virtuals explicitly deny the importance of reality, I don't see that a pause of any duration constitutes real harm to them - it only alters their relationship to the substrate they choose to ignore anyway. So no, pausing the Virtuals is not genocidal (though the Wheel would then have a responsibility to actually do something else down the line, rather than kicking the can forever), and no, the subjective time that *could* be experienced within Sol-Ra during transit to a different star's Ra instead isn't an argument against migration - not to mention that they could, you know, dial back their simulation speed to make the wait for construction time more bearable. There's no moral imperative for the Virtuals to do ALL THE THINKING as fast as they possibly can - certainly not one that overrides the Actuals' right to life.

That said, the Actuals should probably have been the ones to move, before Ra was even built. They're the ones with the philosophical commitment to enduring inconvenience.

2014-08-12 12:29:11 by bdew:

@Sam Both factions have committed genocide of trillions of people now, unless you consider the permanently-frozen Virtuals not-people or not-dead. Unless you are saying anyone sympathizing with any faction in this story is a psychopath.

2014-08-12 12:32:54 by bdew:

Random thought: Did Ashburne & co have any proof that the command to destroy the worldring and build a Matrioshka brain came really from the Virtuals, or did they just assume that? I don't think it was mentioned in the story at all.

2014-08-12 13:00:49 by Toph:

"Ra would have killed all of us proactively by now if we were capable of representing a threat. We can't stop it." Bah, I can think of a few weaknesses already. Everything is real. Ra predicts your actions using simulated instances of you. You can use that - either to trick Ra, or to negotiate, I'm not sure which is more effective. Moving the Earth is brilliant. Ed MacPhearson would approve.

2014-08-12 13:34:22 by jlc:

I'm surprised by how strong Nat's morals are. Killing someone several orders of magnitude worse than Hitler seems like something most wouldn't object to. I'd be burning the remains to make it harder (impossible?) for the medring to work.

2014-08-12 13:51:33 by jlc:

@bdew IMO the paused state of the virtuals is far from death. Sure, Wheel had no immediate plans to unpause them, but they left the possibility open. I had assumed that the 'psychopath' part referred to agreeing with the plan to murder X actuals for the sake of Y virtuals when alternate plans existed involving no murder, like building the Brain elsewhere or relocating the Actuals or at least making a backup copy of them. Sure, murdering them all would be easier, but that's a psychopath's reasoning, which was exactly the point.

2014-08-12 14:10:32 by Tyler:

Would actuals even notice a transition to virtual? How does Ash know this isn't history or going on any record?

2014-08-12 14:42:04 by John:

I consider the Actuals freezing the Virtuals with no intention to ever reinstate them to be exactly the same as genocide. Or, rather, exactly the same as ATTEMPTED genocide, and boy I bet that Wheel is really wishing they had finished the job properly right about now. Look at it this way: if the Virtuals had made a backup of all the Actuals before beginning disassembly of the Solar System, but had no intention of ever restoring them, how would that have been different from murder? Or another question: What's the moral difference between making a backup you will never restore, and not making any backup at all?

2014-08-12 14:45:18 by bdew:

@Tyler She doesn't. Everything is real.

2014-08-12 14:48:59 by T:

Moving the Earth wouldn't work because Ra is still under orders to destroy it. The Earth node would notice the packet didn't arrive, notice it had been moved, and take steps to fix both problems before simply requesting another packet. According to Rachel, it would already have done this if any of its simulations had discovered this possibility.

2014-08-12 14:54:56 by Morgan:

That "will never" appears to be an assumption you're making that isn't in the text. King said the Virtuals could stay frozen for a billion years for all he cares. The narration during the debate on Triton says they were frozen indefinitely as there was no one left to pass judgement on them. Certainly, no one in Wheel seems to have had any intention of unfreezing them any time soon, but the whole point of freezing rather than deleting them can only be to allow the option of unfreezing them at a later time - for example, when there *is* someone who can pass judgement on them. What other options do you consider Wheel (I think it's misleading to characterize them as "the Actuals"; they were the last two hundred survivors of a near total genocide of "the Actuals", acutely aware that they were responsible for decisions they weren't qualified to make, but which had to be made) to have had? Is there anything short of allowing themselves to be turned into computronium that you'd consider morally permissible for the Triton crew?

2014-08-12 15:14:07 by MichaelSzegedy:

Killing the Actuals would be stupid, there's so many better ways for the Virtuals to expand. Primarily: use another star, idiots!

2014-08-12 15:55:33 by skztr:

So Glass Man is the instigator of Abstract War, a "Lone Fanatic", either a Virtual who has decided to take it upon himself to purify the world and to be hailed as a hero when he returns; or, an Actual who has decided to become a Virtual, but who has decided that the best way to become a Virtual would be by first ensuring that he would be welcomed with open arms - by bringing with him the gift of all the remaining power of the sun. ?

2014-08-12 16:10:27 by Toadworld:

For those saying that not knowing Glass Man's name is unsatisfying, I invite you to read: http://en.wikipedia.org/wiki/Damnatio_memoriae Perhaps a nice middle ground could be for us to be treated to a flashback, internal to Ash's mind, describing the person she's looking at.

2014-08-12 16:46:02 by Velorien:

@jlc, I don't think it's that special a decision. The majority of countries' legal codes already take it as standard that you just don't kill a criminal when there's no need to (i.e. when they have already ceased to present a threat to others). The decision to freeze rather than terminate the Virtuals was made on roughly the same basis - they may have committed unimaginable crimes, but insofar as they're no longer a threat, we can afford to take our time and figure out the most ethical way to deal with them, rather than going straight for "that which is evil must die". The whole idea that the Glass Man needs to die because he is "worse than Hitler" (as opposed to on the basis of a threat assessment) is based on the idea of purely retributive justice - that a person who commits evil should have a proportional amount of evil done to them. You'd be surprised how many people do *not* subscribe to that ethical system, and how many consider it to be actually repulsive.

2014-08-12 16:50:50 by Velorien:

@Toadworld, I don't see how the fact that it's a known historical practice makes it any less unsatisfying as a literary device. There are plenty of well-attested real-life phenomena that don't make for satisfying storytelling, like having the powerful bad guys crush the plucky underdog.

2014-08-12 17:44:11 by Silhalnor:

Morgan: "No, I don't think pausing a simulation indefinitely is morally equivalent to deleting it. The fact that a paused simulation can be resumed without any perceptible harm to the occupants is a fairly obvious distinction." That IS a distinction, yes, but the difference fades away when the people who froze them intend to never unfreeze them. Furthermore, they threw away the key making it quite clear that all of the survivors (or enough of them to outvote the rest) intended to never unfreeze the Virtuals nor give anyone else the chance to pass judgement on them. Although it IS possible that something will eventually occur that unfreezes them, it is a rather unusual circumstance and one that is evidently desired to never occur. If that isn't genocide then it isn't far from it. It would merely be attempted genocide as John points out. "There's no moral imperative for the Virtuals to do ALL THE THINKING as fast as they possibly can - certainly not one that overrides the Actuals' right to life." It could well be that many Virtuals do in fact hold this belief because time lost can never be reclaimed. Or they might not, we really don't know any of their opinions at this time. jlc: "Sure, Wheel had no immediate plans to unpause them, but they left the possibility open." They did *not* leave the possibility open. Or rather, they did not intend to.

2014-08-12 18:18:32 by John:

Another way of looking at the backup situation is to flip it: What if the Virtuals had secretly made a backup of the Solar System before beginning Abstract War? Is it true that "If you keep a backup, it's not genocide"? That would mean that the real crime the Virtuals committed was not destroying the Solar System, it was failing to make a backup before doing so. "Thou Shalt Not Reformat Without First Making Backups!" I suppose that would fit in with the general sysadmin theme of the whole story.

2014-08-12 18:26:36 by jj:

Yeah, ridiculous arguments about the morality of freezing the Virtuals aside... (hint: the victors choose the morality. Their decision was the best they could do at the time.) Rachel is going to virtualize the Earth, if only temporarily.

2014-08-12 19:25:13 by Morgan:

Silhanor: "It could well be that many Virtuals do in fact hold this belief because time lost can never be reclaimed." So what? Maybe they do, but that just means their beliefs are awful and shouldn't be taken seriously. The point about throwing away the key hadn't occurred to me, though. I'll have to look back over those chapters. John: "What if the Virtuals had secretly made a backup of the Solar System before beginning Abstract War?" Do you really see no difference between a) forcibly uploading people before killing them, then keeping their uploads in storage, and b) pausing a running upload that's already disconnected from all outside input anyway? It's not a "backup situation" - the Virtuals haven't been deleted.

2014-08-12 20:17:29 by John:

@Morgan: With regards to a), every Actual was already being continuously backed up (Abstract War was going fine until the backups on Neptune got nuked). So it would just be a matter of making a surreptitious copy of the backups. No "forcible uploading" necessary. With b), you have a point. The messy method of deconstructing the Solar System was an unpleasant way to go about it. But if they were (somehow) able to just instantly switch everyone off rather than chop them up messily with lasers, then it would be exactly equivalent to putting the Virtuals on hold. While it may have been necessary for dramatic purposes for Abstract War to unfold the way that it did, it's a classic case of the Virtuals holding the Idiot Ball. The correct way for them to have gone about Abstract War would be to quietly wait for the propagation of the subordination message to reach EVERY Ra node before doing ANYTHING, and then instantly switch all of Actual Humanity off. No pain, no suffering, no chance of failure. But then there would have been no story.

2014-08-12 20:42:02 by Silhalnor:

Morgan: "So what? Maybe they do, but that just means their beliefs are awful and shouldn't be taken seriously." Perhaps. But I think I can pose an argument in their favor. However I will first prefix this with saying that I agree with you on account of the fact that this act does violate the right for anyone to exist. The Virtuals as a whole have a net lifespan that is limited by the total energy content of the sun (and any resources that may be harvested in the future). Therefore if the Actuals are holding up a fourth of that energy that means that the Virtual's total lifespan is cut down to 3/4ths of what it should be. From what I gather from the story the Actuals aren't even USING most of this energy, even back when they had a world ring. But even if they had been using it all they were still sacrificing billions of years of Virtual life for every year of Actual life. That doesn't sound fair, does it? The compromise would be for the Actuals to become Virtual. The Virtuals could well have intended for this to happen for all we know and made the backups of everyone before the War started, albeit obviously against the Actuals will. (Or they might not have, in which case this moves from an arguably desperate attempt to experience all that life has to offer to something more evil.) Do the Actuals have the right to live in physical space despite the fact that this costs the Virtuals a large portion of their total net lifespan? I feel that people *should* have this right but it does have a significant cost to the Virtuals without providing them any mitigating benefits. PS. I'm having a lot of fun discussions today! Moral ambiguity is lovely. As is trying to make sense of what may constitute Blue and Orange morality without being unfair to either side.

2014-08-13 00:40:19 by anonymouse:

Fun as these arguments are, I want to get back to speculation: I think we know enough about Glass Man to figure out who he is, more or less, and what he was aiming to do, and why.

Clearly he's one of the survivors, either from the Triton or the other 12, who wanted to become Virtual. He was either never a part of Wheel, or quit very early on, and he probably tried to recruit other ex-Wheel (and current Wheel) members to his cause, including Rachel. He uploaded himself into Tanako's World. It's even possible that he was its designer and baked his personality into it. He's been sitting in there and plotting for subjective centuries, and the Ra-cult all have an instance of him merged with their own memories. The question is how he managed to find out about the Bridge from King, but King is not the best at OpSec, so maybe social engineering worked, or maybe it was some kind of hacking. And he wanted to build the Matrioshka Brain and present that to the Virtuals to get them to accept him when he uploaded himself (how do we know he hasn't sent himself into the sun, incidentally?).

But he seems to be out of the picture now, and the real question is: how will Rachel save the Earth? And in what form will she save it? I think this might be the end of magic, and she might have to use the Bridge to drop the Earth node into the Sun, or something equally drastic, because I don't really see many other ways out of this situation.

2014-08-13 01:16:59 by Vladimir:

Not sure why the urgency. They have all the time in the world to make a plan, because the tech to slow down subjective time exists in-universe and has been used several times.

2014-08-13 02:06:07 by Sabin :

It's kind of silly to armchair quarterback the ethical decisions in a work of fiction. Word of God can fill in the all-important details with absolutely zero ambiguity if he so chooses. All it would take is one of these two passages added into the next chapter and it would silence the ethics debate for good: "Sure, the Virtuals could have easily built a Brain around a different star. But they're just dicks and wanted to watch 500 trillion humans die." Or, "The Virtuals had run out of processing power and in 14 Real days, the overload would short circuit Ra and end up killing all of Combined Humanity. They had already copied the consciousness states of all the Actual Humans and the plan was to construct a Matrioshka Brain around Sol as a short term solution and then send a probe to the nearest star, move Virtual Humanity (and the Brain) there, and then rebuild the Worldring and put all the Actual Humans back where they found them." Ethics are all about the details, and those details can be invented at will by Sam. So why even bother debating them?

2014-08-13 03:20:33 by atomicthumbs:

"You are so few. Only six billion-with-a-B. That you can't. Even. Matter. Not compared to the needs of the trillions of trillions who are coming." Are we sure that's an *energy* packet

2014-08-13 03:22:13 by Eiko:

@Sabin: There are schools of literary analysis that hold that the author's intent for the direction of the story only matters as far as the text does. Subscribers of those schools would argue that while Sam /could/ write those details into the story, he has not, and until such time as he does, his personal beliefs on which side is right aren't relevant to analysis of the text.

2014-08-13 03:30:49 by atomicthumbs:

Whoa hey are we getting mad about theorethics now? Is that a cool thing to do?

2014-08-13 04:02:03 by Silhalnor:

Sabin: "Ethics are all about the details, and those details can be invented at will by Sam. So why even bother debating them?" It's fun, and may lead to interesting conclusions. Funny to be referencing Less Wrong so soon after someone talks like it's some illiberal cult but I was often assuming the least convenient possible world in my arguments so that I might determine what my own moral conclusions were. It served a purpose beyond the story itself. http://lesswrong.com/lw/2k/the_least_convenient_possible_world/

2014-08-13 05:27:44 by Silhalnor:

Wait... where is Ed's kara getting its energy from? I assumed it was being channeled from the Listening Post's reserves, but that clearly can't be the case now. Could it simply have a bunch of mana cached? But its usage levels have been extremely low for years; it wouldn't need to cache that amount. I still can't rule it out, of course, but now I wonder where its energy may be coming from. Maybe it can drain the user's own mana without consent? For that matter, I reread the first few chapters recently and the question of where Rachel was getting her mana supply from was brought up. Maybe there are some alternative ways of producing mana, or there is enough ambient mana to fuel these things.

2014-08-13 05:58:25 by Bauglir:

The Distributor node in Earth's core is still supporting the actual mechanics of magic. All the destruction of the Listening Post has done is wipe out the recording of magical events, T-World, a large quantity of the Wheel's stored mana, and possibly a big chunk of their simulation tech. Ed's kara presumably draws its power from ambient mana, or is piped energy directly from the Distributor.

2014-08-13 06:22:32 by Silhalnor:

@Bauglir I'm aware of the difference between the Distributor and the Listening Post, I've explained it to others too. But I had been under the impression that all of Wheel's machines were powered by their mana cache in the Listening Post. The fact that the karas apparently aren't shows that the Wheel can do some things right: their immortality doesn't have a single point of failure.

2014-08-13 07:00:44 by T:

I was wondering that too. My assumption for a long time was that all of Wheel's big fancy spells, like the giant screen, their readouts, their kara network, etc., were all powered by the giant Montauk under the post, which was always continuously collecting waste and geological mana.

So, we know "civilians" all have a pretty steady daily quota of mana, and we know it's much less than what Exa displays during a routine fight. Laura's "phoenix form" must use lots of mana. She constructs a spacecraft around her entirely of forcefields and starts putting out a lot of thrust. Is it just because she can use Recursion to use the mana more efficiently? Did she instantiate with her equipment, stored mana and all? Has the Scooby gang broken free of the daily civilian quota somehow?

2014-08-13 07:04:38 by Black Noise:

Not sure if this was noted before, but: > "And here's what he asked for," Natalie says. "Thirty-five digits. Forty-six decillion joules." From last chapter, that accounts for 40 years' worth of total energy output from the sun. How likely is it that the reason there doesn't seem to be any interstellar colonizing is that Von Neumann-like Ra energy/negentropy harvesters are already on the boring job of exploiting galaxy-scale resources? Though there were probably a few Actual nutjobs who chose to travel.
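
For scale, a quick back-of-the-envelope check on that figure (my own arithmetic, assuming the textbook solar luminosity of about 3.828e26 W, which is not a number from the story):

    L_sun = 3.828e26            # watts, standard solar luminosity (assumption)
    year = 3.156e7              # seconds per year
    request = 4.6e34            # "forty-six decillion joules", 35 digits
    years_of_sunlight = request / (L_sun * year)
    print(years_of_sunlight)    # ~3.8

That comes out to roughly four years of total solar output rather than forty, so either I'm misremembering the chapter or the story's Sun is unusually dim.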

2014-08-13 07:37:45 by Greel Lh:

Would be pretty funny if Ra returned an error message: "Insufficient energy. Please edit and resend your request." So, I take it Virtual Humanity is unfrozen at this point. Not sure how to make it work from a continuity perspective, but my wild guess is Ashburne's going to virtualize Earth and take us inside Ra. That would be a pretty wacky trip. Yeah, no idea what's happening next. This is an amazing read.

2014-08-13 08:41:57 by FK:

I'm missing something(s). The end of "It has to work" is > and they're in freefall at nothing A.M. over the red-black Atlantic, and the Glass Man is on top of them, firing. So how did they suddenly get the drop on the Glass Man at the beginning of this chapter? Who commanded the listening post to reincarnate them over the Atlantic, as opposed to some other random place? What are the "new icons clustered around the listening post", also mentioned in the last chapter?

2014-08-13 10:20:51 by Omegatron:

@Vladimir: The listening post was destroyed, so they'd have to build a suitable computer system from scratch, which they probably don't have time to do. @FK: Probably Natalie had them reincarnated there, and maybe they were upside down for whatever reason, so the Glass Man appeared above them.

2014-08-13 11:46:13 by Morgan:

@John: I think a more fitting analogy would be "what if the Virtuals somehow froze all the Actuals in stasis and stacked them in a warehouse somewhere?". Sure, that's not a nice thing to do. It puts the Actuals entirely at their mercy and would make a subsequent genocide extremely easy. I'd be very surprised if anyone claimed it actually was genocide, though. And as the response to genocidal action initiated by (some) Actuals, when you have no way of distinguishing one from another or any activity they take from any other until it's already possibly killed you, I'd say it's definitely a more reasonable and moral course than just killing them all.

I do agree that the course of War seems to be suboptimal even just from the perspective of effectively achieving goals, however abhorrent those goals may be. I'm not sure how much of this is the Virtuals holding the Idiot Ball, them not really understanding what the real world is like or even that it exists, or Ra behaving irrationally due to the Virtuals' override not being quite complete... It does make it quite frustrating when characters are using "Ra plans ahead perfectly" as a data point in their own planning, yet we see that either Ra can't seem to think straight or else the people telling it to Do What They Mean don't seem to really know what they want. Plus, the inscrutability/unknowability of the Virtuals is undermined by the Glass Man's very human-seeming malice; I take this as support for the notion that he is, indeed, effectively a rogue Actual.

@Sabin: Sam can decide what facts are true in the story, but that doesn't mean the judgements people are making based on the facts already available to us can't disturb me or be worthy of comment.

@Silhalnor: It's the Actuals' star too. The Virtuals aren't entitled to the theoretical maximum they could squeeze out of everything, as if anyone else already having a claim to some of it were theft from them. If I could add years to my life by consuming the still-beating hearts of appropriate sacrifices, that doesn't mean other people are shortening my lifespan by selfishly using their hearts to pump their blood around just so they can live. The compromise was already established over thousands of years of ideological and physical conflict before Ra was constructed, and it was a 3/1 split. It's a crap compromise that leaves two factions sharing a single resource even while they become less and less capable of communicating or even understanding that the others exist in any meaningful sense, yes, and I think the setup for War is one of the least satisfying parts of the story: obviously Combined Humanity should have foreseen that a balance this arbitrary and precarious (because seriously, why 3/1?) would be upset at some point, and either the Virtuals or the Actuals should have ceded the system to the other side before Ra was even built. If the Virtuals had to wait through Ra's construction, surely that was the time to take the trouble to move stars - and surely someone should have pointed out that they'd have to either move or die *eventually*. For the Actuals, well, it's kind of transparent nonsense for them to talk about the noble struggle of existing in the real Universe while apparently having nothing better to do with that struggle than vacation by the beach on fake Earths where all their wishes are instantly granted.

The Virtuals started War, and that's basically indefensible, though how morally culpable even whatever subset actually took action are is questionable, since we don't really know how clearly they understood what they were doing; but the compromise as struck was grossly tilted in the Actuals' favour for no good reason, because apparently being physically real just isn't enough unless you've also got thousands of planets and plenty of sunshine.

2014-08-13 12:38:32 by Sabin:

@atomicthumbs: Yup, pure theorethical rage.

@Eiko: I actually agree with that school of literary analysis (to an extent). I just don't think it applies to the ethics of a given fictitious situation. What makes debating real-life ethics interesting is that there ARE facts, which, given enough work, can be discovered, interpreted, and applied. For example, the question of whether or not it was ethical of the U.S. to bomb Hiroshima and Nagasaki. Devoid of any of the relevant details, it's just another theoretical ethical thought experiment. Part of it is that I'm inherently biased against such thought experiments; more on that below.

@Silhalnor: True enough. Now to be fair, I said it was "silly", but who ever said silly things can't be fun or constructive?

@Morgan: I can certainly appreciate being disturbed by certain people's approach to the ethical thought experiments posed by the story. In general, I personally am disturbed by *most people's* responses to ethical thought experiments. To me the answer 99% of the time is, "Figure out an alternate solution." Sure, you can say that's not playing within the rules of the thought experiment. In the "fat man and the train" scenario you can invent all manner of bizarre circumstances that explain why the ONLY POSSIBLE ALTERNATIVES are killing the fat man or letting the innocents die. But I think that's horribly unrealistic and doesn't do anything to prepare people for the real world of ethics, where there is always, always, always a third choice.

2014-08-13 16:14:54 by Sabin:

Also, Sam- was the fact that Rachel's utterance was not visually formatted like typical spells a deliberate choice or accidental?

2014-08-13 16:21:44 by QQ:

I liked the comments better before the Social Justice Warriors took them over...

2014-08-13 16:29:26 by qntm:

> was the fact that Rachel's utterance was not visually formatted like typical spells a deliberate choice or accidental? Deliberate.

2014-08-13 16:56:17 by nitrat:

My first comment here, and I probably will end up being called a psychopath right away. The question is: what is so wrong about freezing or even perma-deleting Virtuals?

To explain more clearly what I mean, let me give you an example from my own career in programming. A friend of mine once had to write some code that involved checking whether a certain field in a bunch of database records was positive, negative or zero. Sounds like a simple bit of code, but he happened to screw it up: he didn't consider that some records could have a null value in that particular field. The question here is: is zero greater than null? Smaller than null? Equal to null? The only reasonable answer is null.

Similarly, in most moral systems invented by humanity so far that don't involve an external factor (as in, God, for example), the question of whether it's right to exterminate a few billion Virtuals returns the same answer: null. Let's, for example, take an oversimplified hedonistic morality model: pleasure = good, pain = bad. Then, if we create a reality where Virtuals are being tortured daily, we have negative moral value. If we create a reality where they experience states of enlightenment and pleasure, we get positive moral value. However, if we simply stop their existence (freeze them and/or erase them), we get a simple answer: null. The problem is either unidentifiable in terms of morality, or orthogonal to the questions of morality, or there is no way to know.
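
A minimal sketch of the kind of check my friend's code needed (Python standing in for the database code; the function name and the None-for-null convention are my own illustration):

    def sign_or_null(value):
        # Three-valued logic: a comparison against a missing value
        # yields the missing value, not True or False.
        if value is None:
            return None  # "is zero greater than null?" - answer: null
        if value > 0:
            return "positive"
        if value < 0:
            return "negative"
        return "zero"

    print(sign_or_null(5))     # positive
    print(sign_or_null(0))     # zero
    print(sign_or_null(None))  # None - the case the original code forgot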

2014-08-13 16:57:41 by Velorien:

@QQ, any story that deals with significant ethical issues will draw people who want to discuss them. If you don't like it, why not shift the discussion in a different direction with your own contributions?

2014-08-13 17:14:32 by QQ:

Because comments are getting lost in between the mountains of "too long, didn't read" conjecture on morality. The author does not agree with these viewpoints and it is his universe. Counter-question: why not Deal With It and stop ruining the comments section with an Asperger's-like fixation on a non-issue to the story?

2014-08-13 17:17:42 by Sabin:

@nitrat: According to the 'oversimplified hedonistic morality model', the Virtuals have a baseline moral value. Exterminating them reduces that moral value to zero. Thus, the net result is a negative change in moral value, so it's unethical. I might be missing something obvious, but I'm just not seeing why killing them all results in "null" moral value rather than zero. And even if that were the case, what prevents that same argument from being used to justify the genocide of Actual humans?

2014-08-13 17:19:35 by Sabin:

@QQ Fair enough. So, who decides what's acceptable to talk about and what isn't?

2014-08-13 17:21:07 by qntm:

I'm surprised people are hung up over the morality of what I perceive to be an extremely cut-and-dry situation:

* Killing people is bad. It is something to be avoided at all costs.
* People are still people even if they're running inside a virtual machine.
* Destroying a virtual machine (or its state) destroys the people who are running on it.

2014-08-13 17:23:50 by QQ:

Clearly Sam decides. Clearly he has spoken, directly above this comment, on his views on the subject, which are admittedly very clear-cut. For full disclosure, I agree with his views and don't think further discussion is warranted at this time. Doubly so given the intentional ambiguity on who the "real" aggressor is.

2014-08-13 17:28:09 by qntm:

Regarding moderation of this thread, the moderator is me. If people get onto inappropriate topics, I will make my feelings known. I also have the option of closing comments completely if your antics become tiresome :) At the moment the only rule is be civil.

2014-08-13 17:32:09 by nitrat:

@Sabin: The only way we can judge the morality (or, in fact, any other measurable value) of our actions is by others' experiences. If there is no experience, it defaults to "undefined" - not even zero. Zero level would be something like trudging through one's life, not feeling pain or pleasure, just going on due to inertia, sort of. If you fall asleep, and instead of one night you sleep two whole nights in a row, and when you wake up everyone pretends that it didn't happen, and you never notice - there is no morally measurable damage done to you. Three nights, a hundred years, a billion years - as long as you don't notice, from your point of view it's nothing. If your sleep is eternal (death), what's so different? lim(moral badness of a sleep of length t) as t -> eternity = 0. And as for the difference between Actuals and Virtuals: as far as I understood from the story, the Virtuals (at least, most of them) got insta-frozen without knowing what hit them, while the Actuals fought the war and suffered.
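
Spelled out (my own notation; "harm" is an invented function for illustration): if harm(t) is the morally measurable damage of an unnoticed sleep of length t, the premise "no experience, no damage" gives harm(t) = 0 for every finite t, and so

    lim_{t -> infinity} harm(t) = 0

The limit step is trivial once the premise is granted; the whole weight of the argument rests on the premise that unnoticed non-experience carries zero moral weight.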

2014-08-13 17:40:32 by QQ:

@nitrat - respectfully sir, while i wholly agree with your analysis, no one honestly cares.

2014-08-13 17:42:36 by QQ:

Since this whole morality debate isn't going away any time soon, it seems, I notice one fleck of useful information that can be gained from the useless discussion: there are WAY more psychopaths out there, walking around undiagnosed, than you could even imagine.

2014-08-13 17:43:46 by QQ:

And I mean that in the literal lack-of-empathy clinical definition, not the pop-culture "whoa, that guy sure is crazy" definition.

2014-08-13 17:59:06 by nitrat:

@QQ: Well, I didn't exactly write a comment for the sake of raising anyone's care-o-meter; I just saw an interesting argument and wanted to see what others might think about a particular position.

2014-08-13 18:09:27 by Morgan:

@nitrat: So it's okay to kill people so long as you do it instantly and painlessly, so that they simply stop having experiences instead of experiencing anything bad as they die? Presumably you see that this generalizes beyond Virtuals and has troubling implications?

2014-08-13 18:09:43 by Unmaker:

"Ra would have killed all of us proactively by now if we were capable of representing a threat." Inaccurate: Ra would have killed them if they were a threat to legitimate orders in any of its simulations. So the first viable possible future is one where Rachel et al. are giving legitimate orders. Ra would not block such futures; otherwise most of the story would have been impossible. [Realized this on a re-read of Pratchett's Going Postal, where legitimate authority orders were a minor plot point in an earlier chapter.] Example: Rachel has yet another damn copy of the key. Problem: From a story-telling standpoint that would be unsatisfying. Example: 1) migrate to Virtual status ; 2) give orders that don't contradict the Matrioshka brain order but do save the current Actuals, e.g. virtualize them. Problem: Against her personal ethics and motivations. Example: 1); 3) ask Ra for the Virtual copy of the key; 4) rescind destruction orders. Problem: Timing. The second possible viable future is one of escape: build enough emergency deep nonlocality drivers to send Earth's population on a one-way jaunt out of the system. Ra probably wouldn't block that because it is not contradictory to other orders. The third possible viable future is any one that Ra did not simulate and cannot simulate fast enough to block it. If Ra did not simulate that future that means it hasn't got a prepared plan of action to stop it; if Ra cannot simulate that future before it is inevitable then it can't block it. This already happened once to a restricted Ra node: the Neptune disaster was not anticipated. This is harder to imagine (simulate ;) ), but possible exploits come to mind: Example: The Bridge is an independent light-speed nonlocality engine. That means it can out-speed Ra for any area of space with less distance to the Bridge than to a Ra nonlocality driver. Assuming the listener elements can't actually do things directly, this means any area closer to the Bridge than to the core node. Example: Rachel may have knowledge of circumstances where Ra's simulations sometimes fail. Of course, her simulations would take those into account and use them, thereby breaking them, but that is a recursive problem that might have a solution, e.g. "recursion's big brother". The fourth possible viable future is using the fact that not all of the Virtuals are likely to be OK with killing primitive Actuals. Attempt to contact a subset of Virtuals who are willing to give Actual-friendly orders. This is a second-hand "legitimate order" solution. That begs the question of where these supposed friendly Virtuals were when Abstract War started. So, four possible outs. Time to let my brain relax with easier problems, i.e. back to work.

2014-08-13 18:14:53 by Sabin:

@Sam: I think everyone (except the psychopaths) agrees on all three of your points. It seems like most of the quibbling is about the caveat "It is something to be *avoided*". In other words, did the Virtuals do all they could reasonably be expected to do to avoid killing the Actuals before actually killing them? (As I pointed out in my prior post, the reason I personally avoid discussing the ethics of the Virtuals' decision is that you can easily answer that question with a simple "Yep." or "Nope.")

@QQ: I think a lot of these people aren't actual psychopaths with no empathy. They just act like they are. And the anonymity of the internet means no one can call them out. I don't know if that's more or less disturbing.

@nitrat: You still haven't explained how that same standard can't be applied to instantly murdering the entirety of humanity en masse. To put it in your terms, they'd be dead, and as long as they don't notice, from their point of view it's nothing. (I'm not asking in the context of this story. I'm talking in general. According to your framework, it is ethically irrelevant if you kill people, say, in their sleep.) The reason your argument has this flaw is that you're ignoring the fact that a life has value prior to its end. If that life is ended when it could have persisted, there is an opportunity cost. In simpler terms, killing is bad, mmkay?

2014-08-13 18:15:39 by John:

@Sam: I see your point, but I disagree about the issue being cut-and-dried, at least from the perspective of those within the Ra story. Fundamentally, "kill" has a different definition within the story than it does for us, simply by virtue of powers available within the story world that we don't have access to.

- Wheel was running comprehensive simulations all the time, of sufficient fidelity that those within the simulation were unaware of it. When they terminated those simulations, did it count as killing everyone inside?
- When Ra runs simulations to fulfill a wish, and those simulations end, does it count as killing everyone whose simulation was necessary to model the wish?
- When two people (such as the 1970s T-Exa and the modern-day Exa) merge, doesn't that kill at least one person? Pre-merge there are two people, post-merge there is only one person, so one person had to die, regardless of whatever mind-munging the other person might have undergone. It could be argued that merging destroys both of the predecessors, replacing their mind-states with an artificial amalgamation that is equivalent to neither of them.
- Is keeping an inactive mind-state inactive forever equivalent to destroying the mind-state, or not?

Just saying "killing people is bad" doesn't address these questions. Our current morality has issues dealing with the story's morality, because it has never had to undergo the challenges of dealing with mind-merges, perfect-fidelity simulations, etc. Someone from a world where those things are common fact is likely to look like a psychopath to us.

2014-08-13 18:19:48 by Sabin:

@Morgan- "Problem: Against her personal ethics and motivations." Not if she reconstructs Earth after Ra has executed its orders and repopulates it with Actual Humanity.

2014-08-13 18:26:38 by Velorien:

I have to say that for all the great things about Ra, this chapter reaffirmed for me how low the story is on sympathetic characters. Nat is someone I can root for, and Nick is established as an almost stereotypically nice guy from the start, but Laura is frankly unpleasant, on any number of levels; Rachel is increasingly similar (though at least she has an excuse, as a traumatised war veteran who once lost everything and constantly has to make the hard decisions, etc.); and minor characters like Ed and Anil just don't have enough going for them either way (especially after Anil got his interesting mode of thought/speech heavily toned down). Do other people have the same impression, or is it just me? If the former, is this intentional on the part of the author? And if so, why?

@QQ: You're coming into someone's home and telling their other guests to stop talking because you don't like the subject of the conversation. This doesn't strike you as at all rude? Calling the other guests psychopaths doesn't help your case either. Are you really trying to medically diagnose a crowd of strangers based on their responses to a single thought experiment, or are you just throwing insults around?

2014-08-13 18:27:09 by Sabin:

@John: Let's start with a more comprehensive definition of "kill", and then answer your questions.

Kill: "To terminate an identity (as defined by *both* a consciousness and a set of experiences/memories) in its entirety, with no reasonable possibility of recreation."

"When they terminated those simulations, did it count as killing everyone inside? / Does it count as killing everyone whose simulation was necessary to model the wish?" No, because those people still exist outside with the vast majority of their experiences and memories intact.

"When two people (such as the 1970s T-Exa and the modern-day Exa) merge, doesn't that kill at least one person?" No, because the entirety of T-Exa's memories and experiences remain intact.

"Is keeping an inactive mind-state inactive forever equivalent to destroying the mind-state, or not?" If it's *truly* forever, then yes, because you have ended an identity and ensured that there is no reasonable possibility it can be recreated.

2014-08-13 18:35:13 by Morgan:

@Sabin: first, your last quote is from Unmaker, not me. Second: "It seems like most of the quibbling is about the caveat "It is something to be *avoided*". In other words, did the Virtuals do all they could reasonably be expected to do to avoid killing the Actuals before actually killing them?" Huh? I wasn't under the impression, and am surprised you were, that the point of contention was whether the Virtuals were in the right in starting War. I rather figured it went without saying that they weren't; they killed trillions of people because they wanted their stuff. Isn't it obvious that they didn't do at least one thing that could reasonably be expected of them first - namely, stick to the compromise agreed before Ra was built, and live within the means they'd agreed to limit themselves to? I thought the question was whether the Triton crew's actions in freezing the Virtuals could be considered morally equivalent to the Virtuals' action in initiating the genocide of War. I thought that *at least* people could agree the latter was badwrong...

@John: I took Sam's third point as a clear implication that, if *destroying* a machine or its state kills whoever may be running on it, pausing it does *not*. Again you're bringing "forever" into it, when if there's one thing that seems almost more inviolable than c in this story, it's that solutions people think will provably hold good forevermore simply won't. (I've yet to go back and check what was said about the key, to see whether the Wheel - foolishly - thought they were making it completely, mathematically impossible for the Virtuals ever to be resumed, which would make the distinction between freezing and deleting them more academic than just an intent not to resume them would.)

2014-08-13 18:53:11 by Sabin:

@Morgan - Good catch. I actually had that written as the second point of contention, but I erased it for some reason. Most of the discussion about the first point has been in previous comment sections, though, so I understand the confusion. Sorry about that.

To summarize the arguments surrounding the first one, there's speculation that the Virtuals may have had no other choice but to do what they did. Additionally, there's the "psychopath argument", as it has been dubbed: that the needs of trillions of trillions of Virtuals outweigh the needs of the paltry 500 million Actuals.

As for the second one, in my opinion there are too many "behind the scenes" factors which seriously impact the ethics of the decision. 1. Did the Virtuals pose a threat if 'unpaused'? 2. Can Ra run until the end of the universe? #1 is important for obvious reasons. #2 is important because if the universe has a finite lifespan, then every moment a Virtual is 'paused' for is a moment they don't get to experience in their lifetime (which is assumed to be infinite). It would be similar to putting a human into a medically induced coma and taking 10 years off their lifespan. My problem with debating it is that those questions depend on facts about the world which we cannot determine ourselves.

2014-08-13 18:58:06 by QQ:

@velorien Your own thinly veiled backseat mod attempts aside, multiple comments on this page ALONE in the "morality debate" fail a recognized clinical test for empathy disorder. Are you debating that factual point, or did my comment just strike too close to home for you - for whatever reason?

2014-08-13 19:02:00 by QQ:

And for what it's worth, the "owner of the house" does not seem to share in the desire for the wasteful morality debate. And he gets to make the rules. I would get into topics of logical fallacy, but I've already waded too deeply into a topic that both the "owner of the house" and I find unnecessary and, personally, in my own opinion, distasteful. If the author says it's "cut and dried" and you don't agree, you are more than welcome and equally able to continue your debate elsewhere :)

2014-08-13 19:04:50 by Morgan:

@Silhalnor: it seemed like such an odd jump sideways that I did wonder whether you'd gotten the faction names transposed, but it didn't look like a simple typo could have caused it, so I guessed otherwise. Glad to have that cleared up. On the questions you raise, I think we have enough information to conclude that the Triton crew simply didn't and *couldn't* know how much of a threat the Virtuals might pose, and so pausing them was the minimum they could do to effectively defend themselves while they were the last two hundred Actuals in the system (universe?), standing in the Sun with a ticking clock. My read on the situation was that they decided to simply defer the decision until whatever sort of humanity they were able to restore could make it with something more like actual moral authority, though if they thought they'd made that impossible... And as for #2, well, I figured Ra was limited to the lifetime of the Sun - though I suppose it could be set up to alter the Sun's evolution to run without disruption for possibly as long as it's still shining, not just its time on the Main Sequence, or even longer - but even so, the time the Virtuals are losing in freeze is a loss proportionate to their crime. It's not unjust to incarcerate someone until their trial, after all. The issue there is that it's impossible to separate the guilty Virtuals from any others who may not have known War even took place.

2014-08-13 19:08:46 by John:

@Sabin: If somebody came up to me and said "Hey, you're going to be disintegrated in a few seconds, but don't worry, it's okay, there's somebody else somewhere who will remember being you.", that wouldn't change the fact that a few seconds later, I get disintegrated. Similarly, if there was someone very similar to me somewhere else, and that other person's experiences were injected into my head, that wouldn't mean "Wow, I was really there in that other place", it would mean "Somebody just injected memories that I never experienced into my mind". The injected memories are entirely artificial and don't magically cause that other person and me to become one being. @Morgan: I take the point that "don't intend to restore this backup" is not the same as "this backup will never be restored". It's the difference between attempted murder and actual murder, which isn't a whole lot of difference. Does somebody stop being a psychopath simply because they failed in their determined attempt to kill someone?

2014-08-13 19:19:18 by Sabin:

@QQ: Personally I want to be respectful of the people who have been here for much longer than I have (I started reading at "It Has To Work"). I know how frustrating it can be to have a swarm of new interlopers come in and trample all over a well-established mood. As far as I know, the brigade of Rationalists came in because EY linked to Ra from the HPMOR Author's Notes. At least I assume this is the case, because I was one of those people who came over. This was July 1. However, from reading all of the comments as I read the story, it seems like the major change in the tone of the comments happened right around Abstract War (which, not so incidentally, marked a major change in the tone of the story as well). And that was long before EY ever linked to this. So my interpretation has been that the change was not because of a bunch of new folks coming in and hijacking the comments, but rather because the story brought up some moral issues that many long-time readers wanted to discuss. And given that Sam has explicitly stated the only rule so far is Be Civil (and Forget About Scott Parajsa), it doesn't feel like the new folks are to blame for the shift in subject matter of the comments. Of course I could be completely wrong in my interpretation of The History of Comments on Ra.

2014-08-13 19:33:42 by Sabin:

@John: Welllllllllllllllllllllll.... That is one of those questions that I think there's no sense in debating. It's answered at the same time that you answer the question "How does consciousness work?" We haven't answered that question in the real world. That question HAS been answered in Sam's world. So we're left with an "if/then" scenario that only Sam can really answer. He seems to address it implicitly in the story, and we can debate the intent of his words, but at any moment he can step in and settle the debate for good. So I'll give you this: if disintegration and immediate reintegration, like Wheel Group teleportation, DOESN'T merely result in a "perceptual discontinuity", then you are right.

2014-08-13 19:40:43 by Velorien:

@Sabin: Should one really encourage a feeling of ownership or community over something as public and generic as a series of comments threads? By definition, all of us have equal rights here (unless Sam decrees otherwise), and it seems like a bad idea to protect the "well-established mood" if that comes at the cost of hostility towards new commenters who may be every bit as much fans of the work as the long-time readers, and have equally valuable thoughts to share. Why should seniority matter in a comments thread, or indeed *anything* other than quality of contribution?

2014-08-13 19:41:56 by QQ:

@Sabin: thank you for the reply, and the respectful tone. I agree that, whatever the reason for the sudden debate, it is very frustrating. Especially when the creator of the work in question does not appear to agree with the premise. I could be misinterpreting his comment, but it appears clear to my reasoning.

2014-08-13 19:45:49 by QQ:

@Velorien: you appear personally offended, for whatever reason. I assure you there is nothing personal intended or implied. With that said: the value of this debate is questionable when the author has all but said "You. Are. Wrong." in as many words. Take that as you want, but it can be considered more "factual", in the sense of a discussion of a fictional work, than any debate or counterpoint you could possibly bring up. He is literally God of this fictional world; sorry if you don't like that, but all the debate in the world will not change it.

2014-08-13 19:56:41 by QQ:

Going with the "guests in the owners house" analogy, i would infer by association that a majority of the morality debate participants would be terrible house guests. Imagine the guy that intrudes on your long running discussion with constant long winded sports discussions who insists it is his right "because, reasons". And his sports discussions themselves run counter to established socially accepted views held by the home owner and his previously established guests. And despite being told "Thank you, but we dont agree" he continues to holler "BUT WHAT ABOUT DEM DALLAS COWBOYS GUYS". Terrible house guests indeed.

2014-08-13 20:15:19 by qntm:

Hey QQ, I'm capable of moderating my own thread, thanks. I have a different opinion from several people here, but if I wanted the discussion to stop, or I thought people were getting too far off-topic, I would have said so. Please don't pretend you're acting on my behalf.

2014-08-13 20:15:25 by Morgan:

@QQ: Let me paint another scenario: someone hosts a forum for people to meet and talk. People meet and talk. The host doesn't find their discussion objectionable enough to ask them to stop or leave. Someone who hasn't previously been seen (there are no comments from you under your current nick on any previous Ra stories) shows up to complain that some people are talking too much or too loudly about things that don't interest him. He doesn't talk about anything that does interest him, just what others are talking about. He tells one of those others that no one cares what they have to say, which is quickly shown to be incorrect as others do, in fact, pick up the discussion. He claims the fact that the host has disagreed with what some people have said shows that he wants them to stop talking. The host shows up and points out that a) he's the one who decides if a topic is inappropriate or not; b) if he wants a discussion to stop, he'll say so or just close the forum. The host neither asks for the discussion to stop, nor closes the forum. The newcomer then treats this as evidence the host wants everyone to stop talking about what doesn't interest the newcomer. In this scenario, will you really claim the people who were talking without any apparent issue before the newcomer spoke up are the poor guests?

2014-08-13 20:16:09 by Morgan:

Aaaaand ninja'd by Sam, who may prefer to delete this comment and my last.

2014-08-13 20:18:30 by qntm:

From this point downwards, this thread is about "Ra" and not about itself.

2014-08-13 20:29:25 by Morgan:

@John: "I take the point that "don't intend to restore this backup" is not the same as "this backup will never be restored". It's the difference between attempted murder and actual murder, which isn't a whole lot of difference. Does somebody stop being a psychopath simply because they failed in their determined attempt to kill someone?" There's a further difference between "I don't intend to ever restore this backup" and "I intend never to restore this backup / that this backup never be restored". In the latter case, not simply deleting the backup is a pointless fig leaf; in the former case, you're deliberately leaving open the possibility of a restoration, even if you personally don't intend to carry it out, or simply don't have a specific plan for when to do so yet. Consider the difference between arresting someone and then locking them up, perfectly happy for them to rot in jail for the rest of their lives or even face the death penalty, but recognizing that the decision isn't yours to make - and summarily executing them on the spot. Everything about the way the decision *not* to delete the Virtuals reads as the former to me: "We've got them frozen because that was a necessary part of fighting War, we're in no position to decide what to do with them, the one decision we *can* make is that deleting them would be genocide and a war crime and we don't want to do that - screw it, let some future generation who haven't been horribly traumatized worry about it". Characterizing the deliberate choice not to commit genocide as being, in fact, attempted genocide is rather harsh. (And now I really must try to piece together the role of the key in all this and just who was making what decisions that who could potentially revoke down the line.)

2014-08-13 20:34:41 by Velorien:

@Sabin: Fair enough. If you're just advocating the "don't be a dick" rule, and nothing more, then I'm entirely on board with that, and will now respect Sam's wishes by suspending this line of discussion. On which note, does anyone have anything to say in response to the question I asked upthread? Does anyone else find that "Ra" has a distinct lack of sympathetic characters other than Nat and Nick? If so, does this serve to its detriment or not? I know there are plenty of novels, especially classical ones, in which the protagonists are horrible specimens of humanity and everyone else is often worse, but to me, "Ra" doesn't seem written as if it's meant to be one of them. In a novel about characters overcoming adversity and solving mysteries through the application of personal virtues like intelligence and courage, it seems like one is expected to support those characters, and be emotionally invested in their success. I've found this very hard to do with Laura, and not much easier with anyone else (the aforementioned pair excepted). (obligatory caveat: I think "Ra" as a whole is an excellent work, and am grateful for its existence)

2014-08-13 20:43:55 by Sabin:

So. Ra! I'm intrigued by the possible significance of Rachel's unintelligible words to the Glass Man. They were deliberately NOT formatted as "magic spells being 'used'". They also don't seem to be in any language I can find. And they also don't use Rachel's True Name. So that leaves a few possibilities: 1. Rachel is "mentioning" a spell (which does not require the use of her True Name) but not "using" it, perhaps as some kind of threat. 2. Rachel is yelling something at him in a different language that is not known to anyone in "our" time.

The former seems odd, although it would fit in with "I'll have to threaten you with something", with "something" being a spell. If the latter, that would all but confirm that the Glass Man is an Abstract War survivor. But it does raise the question of why she would speak to him in a foreign language. Given the fact that she knows who he is and will never reveal it to any of the people there, it's pretty reasonable to say that she's speaking in a foreign language because she wants to tell him something that she doesn't want the other people to hear.

All of that put together gives us my dark horse candidate for the Glass Man's identity: Doug Ferno! Alright, hear me out now. Rachel pulled an "identity swap" trick similar to what Laura pulled with Nick. Ra now inhabits a shell body. Rachel and Ra Instance Version Doug.0 get married and have children who are literally the CHILDREN OF RA. Rachel dies, the children go off to school, and RaDoug loses his grip on what was keeping him attached to Actual Humanity. Shit hits the fan.

Think about it - in "Space Magic", Doug is very, very cagey with Laura regarding the details of her mother's death. I've noticed that characters in Ra very rarely lie outright; rather, they tell selective versions of the truth. All Doug REALLY said in that chapter was, "I don't know your mother's motivation for going after Atlantis." Plus, it would definitely explain why Rachel would not want her children to know who the Glass Man was. It allllll makes sense.

2014-08-13 20:56:14 by Sabin:

@Velorien: I always thought Nick was kind of a woobie. Just very "blah". Anil struck me as just a sidekick, and Ed Hatt is a minor character. Laura was very relatable in the beginning, but then I started to get frustrated by her sheer recklessness, despite her being shown several times, in no uncertain terms, that her recklessness was putting people in danger. Natalie so far is my favorite: she's all of the badass that Laura is, but with a lot more caution. Ultimately it's Natalie who ends up saving the world from Laura destroying it. I haven't seen enough of Rachel to really form a judgment one way or the other.

2014-08-13 20:57:36 by qntm:

The lack of sympathetic characters is because Laura was originally intended as a sympathetic character, and I completely blew it. She starts on totally the wrong foot in "Sufficiently Advanced Technology" - she somehow comes out of *an attempt on her life* looking hugely unlikeable. I tried to roll this back by having her make a potentially life-threatening error in "What You Don't Know", which I thought would humble her a bit, but NOPE, I didn't think about it carefully enough and she still just comes off as a brash, over-confident idiot. She should have risked her life to *avert* some lab accident, not *caused* it. This is sickeningly obvious in retrospect. In "Daemons" you can see that I decided to just accept that Laura's not really a very nice person, and have her stick her boyfriend in storage so she can make a deal with the devil. "Protagonism" is so named because that's when Natalie starts to replace her as the main character. Also, at this point I was actually starting to treat "Laura thinks she is the protagonist of her own story" as a character defect. I think Natalie may even mention this at one point, I forget where. Welcome to my learning experience.

2014-08-13 21:02:24 by Morgan:

@Velorien: Laura's the only protagonist I actually dislike, and I think that's deliberate - both as an exercise on Sam's part to make a smart character who's both wrong and kind of a dick about it a lot of the time, instead of always being right because obviously a smart character would think like and agree with the author, and as a fake-out to draw us along in her wake before we get to see things from Nat's point of view and realize that, wait, no, Laura went off the deep end a while back and took us with her. I don't find Rachel unlikeable yet, though I agree that both Anil and Ed have kind of faded into the background. @Sabin: that's a hell of a leap! I assumed Rachel was just yelling in her native tongue, out of anger, not necessarily expecting it to mean anything special to the Glass Man (I'd expect he'd understand her whether he's Ra, a Virtual, or a rogue Actual). The Doug theory, though... ...No, I don't see it. Her comments to him suggest an ideological conflict - logic, cancerous arithmetic, sick 'freedom'. And Doug's answers to Laura don't read as cagey to me - they read as the pained responses of someone who doesn't know why his wife suddenly turned into someone else immediately before sacrificing herself, and doesn't like being reminded of what he doesn't know. Do you have a theory as to *why* Rachel would instantiate Ra and then marry it, though? (And where would she get a "Ra" for those purposes - Laura had the Glass Man / Tanako, after all, who as far as we can tell was *not* Ra? Well, I suppose she could have asked for one back when Ra Did What She Meant.)

2014-08-13 21:05:07 by qntm:

"Why, the Glass Man is old man Czarnecki, our thaumic physics lecturer in university!"

2014-08-13 21:06:08 by Velorien:

@Sam, thanks for the explanation. It makes a lot of sense. Also, props for being able to dissect your own mistakes in public like that. I envy your powers.

2014-08-13 21:15:43 by John:

One thing that I've wondered, in the back of my mind, is what would happen when humanity discovers actual nonlocality technology. Magic operates via nonlocality, sure, but the actual nonlocality physics are still there under the hood, right? Perhaps that's what Rajesh meant by "we will get to the truth, the whole of it". Once people figure that out, they can start doing anything magic can do WITHOUT magic's restrictive system. They just need an energy source, but that's a solvable problem. Of course, if the Wheel Group were still hovering over humanity's shoulders, they would probably ruthlessly squash nonlocality research. Which, perhaps, means it's a great thing that those jerks are hightailing it to Alpha Centauri or wherever, so that humanity can get on with things.

2014-08-13 21:43:34 by Sabin:

@Sam "I would have gotten away with it if it weren't for those meddling kids and that little Astra too!"

2014-08-13 22:25:49 by Sabin:

@Morgan - Really the Doug-is-Glass-Man theory is just a silly game I play when I read fiction-in-progress, where I try to come up with the most bizarre and silly story twist that's still entirely consistent with the characters' motivations and the facts as presented in the story. That said, I have created a suitably hole-filled patchwork of justifications for the theory. Where did she get the instantiation of Ra? Doug Ferno was previously Wheel. He had some dangerous ideas that conflicted with her ideology, so Rachel dragged him into the T-World, created a simulacrum of him, and took back the simulacrum. That simulacrum was Ra. The Glass Man was the original Doug Ferno. Why instantiate Ra and marry it and have Ra Babies? Because she knew the system wasn't perfect, and she wanted to give Ra some kind of attachment to Actual Humanity, in order to give it a reason not to destroy Earth again when it woke up. Again though, this is mostly me being silly.

I also have a bevy of other silly dark horse candidates:

Scott Parajsa: His last name means Paradise! AND it has RA in it. We all know how Sam likes the symbolism with names. Adam (first man) King (literally). Zeck (Russian for prisoner). Ferno is short for Inferno, which is explicitly referenced in the text and also parallels Laura's journey.

Alexander "Protector of Man" Watson: He got kicked out of the Wheel, left with his privileges, moved to Chile and then decided to just stay in the T-World.

Rajesh Vidyasagar: I mean, come on. His first name is RAjesh. And, "From Death Lead Me to Immortality"? He's immortal now, living in the T-World. And he's pissed. His life's work was based on a fake science designed to hide REAL science. Both he and his dad weren't actually real people. They were *made*. So he's pissed and wants to end humanity.

Martin Garrett: Okay, now we're actually getting a bit more realistic. Clearly he was already off his rocker. And clearly he also knew enough about invisibility to hide his magic charm from Exa before he got killed. And clearly he knew enough about Thaumic Bombs to create something that, according to Exa, "actually could hurt him". And clearly he knew a bunch about exploits in the system of magic. All of those are traits that the Glass Man possesses. Plus, if you spell Martin Garrett backwards, you get Ra. TWICE.

Adam King: If it weren't for his meddling, this entire situation never would have happened. He kept the recording of the key. He kept the Bridge, which allowed the key to be retrieved. He shot Caz before he could destroy the listening post (and the key). After wandering for a year, he decided that Virtual life would be better. He somehow knew Rachel would not be coming back, even though she didn't explicitly say it. Clearly they had ideological differences; he told her his plans, she disagreed, but knew that she couldn't stop him except by... He had the Bridge, but didn't know the best way to use it. So he stashed it away for later. The only flaw in this theory is that you can't twist his name around to make Ra.

2014-08-13 22:41:48 by atomicthumbs:

They only *thought* Scott Parajsa was in Chile. In reality, he was somewhere much more nefarious.

2014-08-13 23:12:45 by Toadworld:

(Whoa, this thread exploded. Hmm, Yudkowsky's lot?) I liked learning that the Actuals follow a quite Roman concept - it lends them an imperial air that makes me want to know more about them. And in a world where names are so important, and backups are so important, killing someone without a backup and killing someone to the point where you forget their name is the most brutal you can be. Secretly I'm still waiting for the time machine the size of a city to show up.

2014-08-13 23:18:01 by atomicthumbs:

and Scott Parajsa is an anagram of Ra's JATO Pact. It's all coming together

2014-08-13 23:30:23 by Unmaker:

I am trying to construct what the Actuals consider the "right" way to use mind/body copy technology, based on their actions. Not all example behaviors are listed. Numbers are for reference, not indicators of importance. All statements can be considered IMO or guesses, to a greater or lesser degree.

Apparent behavior:
1. It is expected to have one incarnation (= running in reality) of a person at once.
2. It is not expected to have more than one incarnation of a person at once.
2.1. If this happens due to accident or necessity, e.g. Exa, it is expected to eventually merge experiences.
3. Static virtual backups of people are expected.
4. Running a virtual copy of people who are incarnate is discouraged but not disallowed. Presumably a merge is eventually expected.
4.1. Running multiple virtual copies at once is allowed (necessity?) in war time.
5. Running a single virtual copy without an instantiated copy is allowed (necessity defense?) but not expected.
6. Simulations used for DWIM analysis are expected.
6.1. Destruction of said simulations is SOP.
7. Destruction of the last copy, actual or virtual, of a mind/body state is a crime.

Putting it together:
A. One actual copy is the right way to do things. Active virtual or actual copies may happen by accident or during emergencies, but you are expected to merge back to one actual copy. This makes a "person" a mostly-continuous single incarnation, with some experiences from terminated physical or virtual branches.
A.i. Why not allow multiple actual and/or virtual duplicates regularly?
A.i.a. Ultimately, it comes down to resources. People who duplicated endlessly would swamp any system quickly. Therefore copying is disallowed except by necessity, and copies are expected to merge. If they object, they can be forced to merge.
A.i.b. Behaviors that lead indirectly to multiple-copy situations are also discouraged or illegal.
A.ii. Why not a pure virtual existence?
A.ii.a. Too much virtual existence clearly leads to radically diverging thought patterns. Forget whether they are better or worse; they are no longer human, even though they are sentient.
A.ii.b. Forgetting where your off switch is is dangerous.
A.iii. Experience gained in virtuality or in an accidental copy is also at least slightly important, and should be preserved by merging if possible.
B. People (sentiences) are important. Destroying the last copy is murder.
B.i. Given the existence of mind/body copy technology, it would be insane not to employ it to preserve people in case of accidents. As long as it doesn't lead to exponential copying situations.
B.ii. Permanent Virtuals (Ra residents) are sentient but not human. And they don't want to interact. So, the aliens we created want to be left alone? Fine, as long as they leave us alone.
C. The cost of discarding the information gained in DWIM look-forwards is better than the alternatives.
C.i. What are the alternatives?
C.i.a. The possibility of making an undesirable decision and having to redo it. This makes the most sense if the cost (mass/energy/computation) of a DWIM look-forward is VERY low relative to recycling costs. (Cost-based thinking again.)
C.i.b. The possibility of getting stuck with a less-than-satisfactory decision. In a society where anything is possible, this cost, which looks very, very normal to us (readers), probably looms very large to 190'th-century people. After fulfilling physiological and safety needs, the possibility of getting stuck with a less-than-satisfactory decision is probably the last dissatisfaction-producer that can be eliminated.
C.ii. Why not merge the information gained in DWIM look-forwards with prime reality?
C.ii.a. Constant bombardment by remembered but unrealized dissatisfaction.
C.ii.b. High cost (computational?) relative to information gained.

2014-08-13 23:32:35 by Unmaker:

C.ii.c. This would also force changes on others, because the other simulated people would also have experiences. Forcing this on others is undesirable.

2014-08-14 00:04:05 by speising:

The Glass Man was really all too human. When fighting Laura, not only does he use his awesome powers to teleport her into a primitive choking grip (instead of, e.g., simply teleporting her head to a different place than the rest of her), he even commits the classic bad-guy error of allowing her to speak!

2014-08-14 00:09:54 by qntm:

And now you know why this chapter's called that.

2014-08-14 00:30:04 by Sabin:

Let's ponder for a moment Thaumic Warfare.

1. Laura was able to teleport herself into the center of the earth wearing a Mithril power suit. Use the same technique to teleport into the bedroom of a world leader and assassinate them.
2. Instantiate a drone thousands of times over to create a portable, instant army.
3. Gather a large group of weaker mages and have them all alias themselves as the True Name of a strong opposing mage, to reduce that mage's chances of a successful spell cast to near-zero. (Alternatively, in close-quarters magical combat, when your mana runs out, alias yourself as the strongest nearby opposing mage to weaken them by 50%.)
4. Use the T-World clothing exploit to instantiate with clothes made of solid gold and diamonds, and/or fiat money, in order to flood an opposing country's economy with counterfeit currency.
5. Use the Akashic Records for blackmail.
6. Use Hatt's T-World product demo hack as a means of interrogation. Force the subject into the T-World, surrounded by every torture device that ever existed and every torture device that never existed, and threaten them with a subjective eternity of mind-bending torture. Then follow through if they don't talk.
7. Capture a valuable enemy, force them into the T-World, create a simulacrum, bring the simulacrum back, and leave the enemy in the T-World. Granted, you now have the problem of a little Ra-let running around. But I'm betting some military is short-sighted enough to do it anyway.
8. Invisibility Theory dictates that chi can be converted into photons, so convert all chi particles produced by your army into photons in order to make your magic undetectable.

And that's just off the top of my head. How morally reprehensible can we take this?

2014-08-14 00:45:47 by Sabin:

9. Use Not-Tanako's matter-substantiating hack to bring smallpox back.

2014-08-14 01:15:48 by Sean:

*skips the frighteningly animated part of the thread and starts near the bottom* Random thoughts:

- I actually found Natalie to be relatively easy to relate to. I don't know that she's a "sympathetic" character in some broader sense, but... *shrug*

- Arguably, Rachel/Ashburne was not responsible for nearly losing Abstract War the first time, since even if she had realized that Ra was coming to Neptune ahead of schedule, she might not necessarily have had the resources necessary to preserve Actual Humanity until the Triton succeeded. It seems much harder to escape some responsibility for this episode, since she failed to destroy the key (which she even knew the location of), and left Wheel in the disorganized and despondent state that resulted in Adam King being in charge of the world.

- Not only is Rachel apparently incapable of representing a threat now; in the mind-bogglingly unlikely event that she came up with a new plan, Ra could potentially sense it in her mind and counteract it before it had even fully formed. The only mild weakness it seems to have right now is that it's not completely retasked, so it is still expending small amounts of resources implementing the astras and the magic system. But it's not clear that Ra will continue to do so indefinitely. Barring a (possibly literal) deus ex machina solution, it looks like the only remaining options for Rachel are:

1) Give up: Maybe she could boost a few people out through Wheel's exit, but not humanity. So, she just saves her family.

2) Physical evacuation: Use the Bridge to somehow create data storage off-world, stow formerly Actual humanity there, and try to somehow get clear of the area where Ra is collecting material to build the Matrioshka Brain, before it either consumes them or targets them as a potential threat. A bit like "Valuable Humans in Transit", but having to go at least somewhat slower than the speed of light (using the moon to buy time?). It's hard to imagine quickly evacuating the entire population of Earth to off-world storage that doesn't even exist yet, though.

3) Negotiation: Put everyone into a Virtuality, and try to convince Virtual Humanity and Ra to either re-instantiate them physically, or at least not kill/delete them. This is the inevitable final backup plan; once the other side is clearly going to win and escape is impossible, the only thing left is to either try to get decent terms of surrender, or lay down and die.

4) Luck: Maybe Ra is starting to take orders from the Virtuals, and they have already changed their minds about what to do next. Maybe, since the Actuals are not currently a threat, the Ra node in the Sun denies the request for energy to destroy the Earth, deciding instead to meticulously deconstruct the Earth, or to focus on other planets for materials, buying the humans on Earth valuable time to evacuate. Maybe they are actually in someone's simulation of a bad outcome, and about to get merged into a different scenario with more hope. None of these seem at all likely.

- It's still not clear what Natalie did with the key other than temporarily counteract the Matrioshka Brain order. It says that she immediately executed a script she gave to herself, but it does not mean that that script matches the one that Anil states out loud: stop the destruction of Earth, then immediately destroy the key. What's with the icons appearing around Australia when she uses it? Something related to an order she gave, or just new icons appearing automatically as notifications about her use of the key?

Possible actions she could have taken:

1) Destroy the key and replace it with a fake, ordering Ra to superficially act as if the fake one was real for some time, so as to give them some kind of edge against the Glass Man. I thought this was barely possible before, but it now seems unlikely given what Sam said in the comments to that story. Also, if Natalie knew that Ra had already stopped trying to destroy the world, and Not-Tanako had only used and destroyed a fake key, you'd expect her to have dropped the act by now.

2) Pull resources, such as astras or physical copies of important people, from the akashic records. But again, you'd think that she would have told Rachel by now. Even given her usual impulse toward information hygiene, why bother concealing any resources from the one person with the most knowledge of how to use them, when you are just about to lose everything?

3) Any number of other arbitrary get-out-of-jail-free cards, e.g. giving herself some kind of unique privilege to partially control Ra without the key, spawning a clean Ra shard and ordering that the main Ra network never assimilate or destroy it unless directly ordered otherwise, or whatever. Same objections as the above, plus these requests would conflict with her apparent belief that no one should have direct access to Ra.

4) Request some kind of "life insurance" for the human race. That is, set some kind of secret process in motion that would make a "life raft" for Actual Humanity in case of its imminent destruction. Then wipe out any record of that process she has access to (possibly including within her own mind), so that even if someone with direct access to Ra wanted to wipe out the Actuals, they wouldn't know that they need to get rid of this insurance unless they explicitly asked about it. This is just barely plausible, since Natalie would still be rather motivated to save the Earth rather than rely on this backup plan.

2014-08-14 01:36:27 by Sean:

@Unmaker I think it was implied earlier that DWIM look-aheads are not usually full copies of someone, but low-to-medium resolution simulations that focus on the relevant parts of one's brain. For many requests (e.g. "I want delicious cake."), Ra might not need to simulate enough of you to form a sentient entity. It's enough to satisfy the parts of the brain most directly triggered by that experience.

2014-08-14 01:42:39 by Node:

3 points:

1.) We have a gulf of information regarding the history of the world. We do not know how Abstract War started. We do not know if it was started by some group of Virtuals, an Actual, some group of Actuals, or a single lone Virtual. There is insufficient information to discern Ra's motivator(s) at the start of the war, and insufficient information to say that the Virtuals deserve indefinite hibernation. Furthermore, the question of whether something is "deserved" is an extremely divisive one, even in much clearer situations.

2.) The end of the story was potentially decided at any moment where somebody had DWIM power. At the end of Abstract War, Ra could have arranged the universe such as it appears today. This would have been at Rachel's behest, though it does not follow that Rachel was afterwards aware of any far-reaching consequences. In Natalie's position much the same could have occurred. Rachel's DWIM could have lined up both Natalie's request and the Glass Man's later down the road. Ra was designed to be the most powerful computer. Don't underestimate it; underestimate the people who use it. GIGO is your only hope.

3.) Everybody, stop saying that "the Virtuals" decided to kill Actual Humanity and that "those Virtuals", i.e. all Virtuals, should be punished. Abstract War was started by some group of N humans. Not Virtuals. Not Actuals. Humans. You can assume that those N were all Virtuals, but judging M>>>N humans on the actions of just those N does not make any sense. Those N do not represent the Virtuals' approach to resource management. They represent themselves, no more and no less. It's in much the same way that Rachel's decision to freeze the Virtuals does not represent either the whole of Actual Humanity before Abstract War or the hundreds that were present afterwards.

- Node, Longtime Lurker

2014-08-14 02:35:23 by Silhalnor:

--Catching Up

Come on guys! I have to be away from my computer for a day and you guys go and double the comment section of the most fun discussion I've had for months! You probably won't even be talking about the same things by the time I reach the bottom. I will be writing as I read along. Can we get sub-forums? (And forums?) We might need them.

@Morgan: "@Silhanor: It's the Actuals' star too. The Virtuals aren't entitled to the theoretical maximum they could squeeze out of everything, such that already having a claim to some of everything is a theft from them. If I could add years to my life by consuming the still-beating hearts of appropriate sacrifices, that doesn't mean other people are shortening my lifespan by selfishly using their hearts to pump their blood around just so they can live."

I know; I was going by the Virtuals' perspective. Also, they wouldn't HAVE to die if they become Virtual or leave the solar system, either of which is a violation of their right to live the way they want. Provided their choice doesn't violate other, more important values (knowingly or not), which is what I supposed for the sake of argument.

@Morgan: "The issue there is that it's impossible to separate the guilty Virtuals from any others who may not have known War even took place."

Even worse: in the time it took for the war to start and end, the guilty Virtuals are dead, or so different that they can't be held accountable, due to essentially being entirely different people. They could even now be friendly to our cause, for all we know. Or they could have merged and split minds a million times with other Virtuals to become who-knows-who. And I do agree on freezing the Virtuals at the end of the war, but only as a temporary solution until something permanent can be planned. But if they then try to keep the Virtuals frozen for eternity, well, we seem to have matching opinions on that.

Did you look back at the chapters concerning the key yet? Though I have not reread the pertinent chapters since my first read-through, I am pretty sure that what happened is that Rachel chose to freeze the Virtuals (and in a sense everything else, since nothing progresses for a year) so that they could recover and then make decisions again. After this, King assumes command, and everyone else was happy to follow his lead for one reason or another. It appears that King's decision was to leave them frozen forever, and no one was in the mood to object. Conceivably, though, he could have left behind an Unpause switch that people could press if they so desired. There is no mention of such a Button in the story, however. Or maybe he really doesn't care what happens to them and didn't bother to leave behind an Unpause switch or anything.

2014-08-14 02:43:35 by Sabin:

@Sean - Natalie may not have told Rachel for several reasons. Lack of trust after seeing how cavalierly she kills. Or perhaps her plan involved the Glass Man being alive.

@Unmaker I think your system analysis unnecessarily differentiates between a "virtual copy" of a person and a version of the person simulated for the sake of DWIM. The Akashic Records are the same as simulations, except the data is stored rather than discarded. Theoretically you could physically instantiate an infinite number of 1969 Exas by going into the Akashic Records over and over and repeating the exact same scenario that Laura and Not-Nick did. Presumably this technique is how Ra duplicated himself over and over* to populate the Chedbury Bridge**. So an infinite number of virtual duplicates already exists.

*Anil and Natalie used the oracle to see that there were several mages with the true name Ra at CB.

**Has anyone else commented on the significance of the "Bridge" part of Chedbury Bridge?

2014-08-14 02:46:31 by Silhalnor:

--Catching Up: Part 2 - The Return of Scott

On merging mind-states: if I made a copy of myself for a few days and then merged us back together, I wouldn't consider that death for either of us, because our personalities haven't had time to diverge noticeably, and any minute differences that may have appeared can probably be amalgamated without results that contradict either original. On the other hand, if my copy and I spend many years in different environments and build up incompatible belief systems and personalities, then what? I suppose you could still merge them, but the result would likely be considered a third individual distinct from either. So there is something of a crossover point, but it is gradual and fuzzy.

@Velorien: "Does anyone else find that "Ra" has a distinct lack of sympathetic characters other than Nat and Nick?"

*I* like Laura, though her traits are problematic. However, my care for her *has* diminished. Also, I don't think my sympathy for other characters like Nat has grown as much as it should... I'm not certain why. Maybe Nat lacks personality? Or her personality is too passive, and compared to Laura's it gets kind of hidden. Were the earlier chapters written from a more first-person perspective? I feel like the narrative jumps from character to character a lot more now without giving any of them enough focus. That could be what is stunting my sympathy. On the other hand, I rarely reread chapters, so it could be that my memory is simply faulty and is over-hyping the traits that made me instantly like Laura (obsession with space travel, great superhuman goals of the kind that would drive any sane person mad as if at the sight of Cthulhu, that sort of thing). Can anyone corroborate this?

- Sam: "The lack of sympathetic characters is because Laura was originally intended as a sympathetic character, and I completely blew it."

Oh. Well, there we go. Author's fault. *grins* Couldn't you try making her learn to make up for these traits? Like, she comes to recognize them, but she can't change that about herself, so she adopts some methodology or something to keep herself from acting psycho.

Sabin: "Scott Parajsa:"

YES!!!! I don't know what you're about to say but I subscribe to it 100%. It's a travesty the man doesn't get enough screen time. Why haven't we seen any big battles in T-world? Presumably incapacitating your target would be impossible, but forcing them out of T-world would be a suitable winning condition. Dr. Czarnecki did something like this. Or maybe you could confuse them to death? Escaping T-world with a fake copy of your enemy could also be an option if the opposition doesn't know how to reinstantiate themselves. But that fight would be boring, since it is actually just a short race.

2014-08-14 02:53:49 by Sabin:

@Node 1. Agreed. 2. Agreed, but I think it's irrelevant, since it's unfalsifiable, and as Natalie said, you have to assume its reality until proven otherwise. 3. That has a lot to do with your view on wartime ethics. Just how culpable are individuals in a war, and who deserves "blame"? The leaders who instigate the war? The officers who execute the leaders' orders? The soldiers who fight the war? The people who make the weapons? The people who make the supplies? The people who make the clothing? The soldiers' friends and family who provide moral support? Any average tax-paying citizen whose money goes to fund the war? It's the Hiroshima question all over again.

2014-08-14 03:24:30 by Sabin:

@Silhanor/Unmaker: agreed 100% re: mind merging. This is why I don't think we see any kind of mass, recursive copying (or any handwringing over the "death" of simulated versions of oneself). There's just no personal benefit to permanent mass copying. Your copy can either:

1. Not exist long enough to diverge substantially from the original, so the benefit of merging is minimal, as is the loss when the copy "dies" (because 99.x% of Copy You still exists in Original You).

2. Exist long enough to diverge substantially, and never merge. Original You never gains those memories, so there is no benefit.

3. Exist long enough to diverge substantially, and then merge. Original You gains some memories, and you are also temporarily able to do two things at once.

The impact of #1 is negligible either way, and this bears out when you consider the ubiquity of simulations and historic records. #2 is actually kind of tragic, and there really is no reason why a person would want that. #3 can be useful in certain edge cases where being able to do multiple things at once is beneficial (e.g. Abstract War), but for the most part there are easier ways of accomplishing almost anything you might want to accomplish with this. So there really isn't any reason at all for someone to just recklessly copy themselves.

2014-08-14 03:39:03 by Silhalnor:

@Sabin Wouldn't #2 basically be the same as having children? Instead of growing into an adult they diverge. You could also merge memories but not bodies or personalities thus allowing for long term copies without the question of murder. Maybe you would merge memories on a monthly or yearly basis until you diverge so much you get sick of each other. In fact, one might consider merging memories with people who aren't your copy. Married couples might try this, or members of a team project. In the latter case they would only merge the memories that actually relate to the project. It might supersede language, at least for certain applications. Wait... I think I invented telepathy. Well. That's a completely different subject matter.

2014-08-14 03:49:42 by Node:

@Sabin #2 isn't saying that reality might be a simulation; it's saying that reality might be all according to plan. #3 was stating that there is no evidence that those M>>>N Virtuals were even aware of Abstract War. Why, then, would they be suspended indefinitely for the actions of some unknown and presumably small group? As we've seen, once you have physical access, everything is over. Perhaps the command was given by an Actual who looked at Actual humanity and said "what a waste". Perhaps a Virtual figured out how to substantiate itself in reality and compromised the hardware that way. What is it that really happened? The answer is that we don't know what happened or how, and we may never. In that case, on what grounds would you forsake the M>>>N presumably ignorant Virtuals? As an aside, I've wondered if there were truly no humans who lived as both a Virtual and an Actual, transitioning freely between the two. Why not?

2014-08-14 04:06:58 by Sabin:

@Silhanor Good point re: the similarities to having children. The "tragedy" I was referring to was creating a simulation knowing that it would exist long enough to significantly diverge, but get cut short in the prime of its new life. The alternatives would be: merge, instantiate into physical form, let the copy live as a Virtual, or kill the copy. Killing the copy seems cruel. Letting the copy live as a Virtual seems like a waste of resources, and also antithetical to the Actual credo. Instantiating it into physical form carries with it a large number of logistical issues (e.g. there are now two genetically identical copies of you roaming around in one reality). Merging with the copy seems to be the most reasonable solution. Incidentally, I think the Wheel Group's "subvocal communication" is a variant on telepathy, although it uses direct stimulation of the auditory nerve to send the output signal.

@Node: That makes sense, but it's still unfalsifiable. You and I are on the same page here, though. There is so much detail we DON'T know about the circumstances that making an ethical judgment on the characters is difficult if not impossible. (Especially considering that, as I said before, Sam can fill in those details at any point in time and invalidate the entire debate.)

2014-08-14 04:13:14 by atomicthumbs:

while we're all talking about things, can we take a moment to appreciate how horrifying Rachel's method of getting the medring to work was (give scott parajsa his own miniseries)

2014-08-14 04:29:14 by Sabin:

@atomicthumbs Not impressed. Scott Parajsa does that every day just before breakfast. Just because he can.

2014-08-14 04:52:11 by Yasha:

The way Ra takes steps to avoid its own reprogramming seems inconsistent. With the exception of the Triton incident, Ra has always had the power to stop people from getting direct access to it. Therefore, every other time in the story when someone had a plan to hack into Ra in order to give it orders contrary to its current ones, Ra should have stopped them.

Original Ra: Oh look, some of the Virtuals have a plan to hack into me in order to make me build a Matrioshka brain, and if they succeed I'm going to try to kill the Actuals. But I currently have orders to care about the Actuals, so I'm going to freeze the Virtuals, or at least those particular ones.

Ra after Triton: Oh look, Glass Man has a plan to hack into me and destroy the Earth. But that's against Ashburne's DWIM order of a world going "this far wrong." Poof, no more Glass Man.

Ra after Glass Man uses the key: I'm supposed to build a Matrioshka brain, but Natalie is going to cleverly get the key in T-world and stop me, so I'm just going to delete her, or just, like, modify her brain so that she doesn't get that idea.

Ra after Natalie uses the key: Natalie wants the world not destroyed, and fake Tanako is about to steal the key and reprogram me. Poof, no more fake Tanako.

In fact, Abstract War seems like the _only_ time Ra has tried to prevent people from hacking into it in order to go against its current orders. Shouldn't it be doing that all the time?

2014-08-14 06:36:52 by TheTrueMikeBrown:

I have had a nagging problem with the fact that the Virtuals killed off everyone backed up on Neptune. It seems like if you lived in a world where existence in a computer was life, then deleting someone would be considered murder (especially someone who is not being simulated at the moment: that is like murdering a helpless person). I understand how they killed all the Actuals: they didn't consider them to be alive (they didn't live in a computer). That is not to say that I agree with the reasoning. To me it seems more likely that the Virtuals would have taken the Actuals stored in Neptune as prisoners of war. Perhaps they are even trying to send them back right now. Time is much faster inside Ra; perhaps public sentiment changed somewhere between the re-awakening of Ra and the sending of the big energy packet.

2014-08-14 10:18:06 by LNR:

Rachel says Ra would already have destroyed them if they were capable of presenting a threat. That is clearly and objectively incorrect, based on her own past experience. If Ra were that infallible, she and the other Wheels wouldn't ever have managed to defeat and reprogram it in the first place. The current team has far fewer resources than Wheel did, but then the current Ra program is far less aggressive and seems to have far less freedom of action. So it's another episode of the "Presumably Smart People Making Dramatic Statements They Should Know Are Totally Wrong" show. Anyway, we readers know it's not impossible. Thus far in this story, every single time a character has declared a thing impossible, that thing has come to pass-- usually immediately. I expect this will be no different.

2014-08-14 11:12:31 by skztr:

@TheTrueMikeBrown: Virtuals don't think that actuals "aren't alive because they aren't virtual". It's just that Actuals are about as "human" to Virtuals as hydrogen atoms are to us. Virtuals have been around for literally trillions of times longer (in subjective time) than Actuals have, and are capable of at least billions of times "more brain". Though, as of the latest chapter, I am much more confident that the Virtuals had nothing at all to do with Abstract War.

2014-08-14 11:32:15 by Velorien:

@Yasha: Theory A: Perhaps Ra was never given an order as specific as "care about the Actuals", and its programmers didn't consider the possibility of the Virtuals hacking in and giving disastrous orders (which would be stupid, given their history of strife, but it has been established that both factions did nothing but make bad decisions in the run-up to the war). Or perhaps the Virtuals, with their vastly superior time and manpower, found a way of hacking Ra that exploited some loophole in its orders. After Triton, Ra's simulation of the Glass Man's actions had them lead to a far lesser disaster than the Abstract War (for a start, because there was precious little left to destroy), so they didn't count as "this far wrong". The Glass Man used the key to give Ra a single order (like Ashburne before him), which was "construct a Matrioshka brain", not "construct a Matrioshka brain and stop anyone who tries to prevent you", so it had no motivation to stop Natalie. I don't see an obvious way to account for Fake Tanako.

Theory B: As above for Abstract War. After that, Ra was running on Ashburne's DWIM order. It simulated the future, and either saw that the world would ultimately be saved (because of the events we are witnessing now) and allowed things to happen as they have, or steered events in ways that are not apparent to the reader in order to guarantee said salvation. In other words, if this story has a happy ending that fits with Ashburne's original DWIM order, then it's because Ra foresaw it and/or made it happen, and letting itself be reprogrammed multiple times was part of the plan.

Theory C: Ra ran a simulation according to which Abstract War and its aftermath were the optimal scenario for humanity (say, because otherwise an Actual/Virtual war would come later, and be much more disastrous), and *everything* ever since has come about due to Ra fulfilling its original orders. Or maybe *this* is one of those simulations, and Abstract War hasn't happened yet in "real life", and might not happen at all depending on Ra's conclusions. The possibilities are endless...

2014-08-14 15:19:26 by Aegeus:

@Yasha: The fact that Ra got hacked in the first place shows that it's not infallible (or rather, it had infallible code, but that doesn't stop someone from altering the code). If it were infallible, it would have stopped the hack rather than pretend to be hacked, launch a war, and stop the hack at the last minute. If Ra cared about Actuals, it would have kept more of them alive. I'm also pretty sure that even if Ra was proven perfect, the reprogrammed, kill-all-Actuals Ra wasn't. Hence the current system, which has so many hacks, backdoors, and debug features that pretty much everyone and their dog has access they shouldn't have.

Also, "proven perfect" doesn't mean that Ra is omnipotent and omniscient, only that it does what it was programmed to do. If Ra could perfectly predict the outcome of Abstract War, it wouldn't have started the war in the first place, because it would have known that the Actuals would win. It would have answered the command "Build a Matrioshka brain" with "Can't do that, the Actuals will stop you and I can't stop them." Instead, it answered with "I'll try. I think I can do it if I sweep the Solar System clean before they can counterattack," and it failed. Ra is extremely powerful, but demonstrably not perfect.

@Velorien: I don't buy the theory that Ra predicted the current events all the way back at Abstract War, because once the Triton crew reprogrammed Ra, there's no way it could know what would happen next. No computer can perfectly predict itself - that's the Halting Problem for you. So I think your Theory A is the best one we've got. Ra worked just fine at the start, but its programmers have been making patch after hasty patch, with the most recent ones made in mere seconds. There's plenty of room for that to go wrong and produce the current mess.
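For anyone who wants the Halting Problem point spelled out, here is the standard diagonalization sketch in Python; the halts() oracle is hypothetical by construction, which is the whole point, and the sketch is illustrative rather than anything from the story:

def halts(program, arg):
    # Hypothetical perfect predictor: returns True iff program(arg)
    # eventually halts. The contradiction below shows it cannot exist.
    raise NotImplementedError

def contrarian(program):
    # Do the opposite of whatever halts() predicts about us.
    if halts(program, program):
        while True:      # predicted to halt, so loop forever
            pass
    return None          # predicted to loop, so halt immediately

# contrarian(contrarian) halts if and only if halts() says it doesn't,
# so no perfect halts() is possible. By the same token, a Ra that must
# predict the behaviour of a reprogrammed Ra is trying to predict itself.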

2014-08-14 15:32:23 by Morgan:

@Node: yes - it's possible only a tiny minority of Virtuals, who may not even still exist, were the aggressors; it's even possible the Virtuals were set up. The Neptunians fighting Abstract War seemed consistently certain that they were literally fighting quadrillions of Virtuals, but we don't know what information they were basing that certainty on. The problem is that once Triton reached Ra, the crew simply didn't have any means to distinguish the guilty from the innocent, so their action in freezing everyone and deferring further decisions seems quite defensible to me.

@Silhanor: okay, I've gone back through the last several chapters (just the story, not the comments) to see what was said about the key. It... doesn't entirely make sense to me; I don't understand how the Glass Man was able to get a usable copy out of the records. However! The important part is that Ashburne gave *no* orders regarding the Virtuals. Her only order, in the window where Ra would accept only one further order, and only from her, before it "permanently disable[s] its raw public interface" and "peacefully abstracts itself away", is: "Leave the Solar System as it is... give us the tools we'll need". King uses Metaph to build (or tweak) magic later, presumably working within some lower layer of abstraction Ra's already created. Metaph is later destroyed (it's mentioned as one of the destroyed astras the Bridge could bring back). After that order's given, though, Ra itself shouldn't be reprogrammable, or accept direct input, whether with the key or with physical access to the hardware.

So I guess the question is: would you need to reprogram Ra, or give it direct orders, in order to unpause the Virtuals? And I don't see any reason why you would. One of the astras could be a "what to do with the Virtuals" control, or it could be built in to magic if you discover the right spells. So my view is that we don't have enough information to say that the Triton crew (thought they) made it impossible for the Virtuals to be resumed at some later point, and I think it's more reasonable to conclude they left the possibility open than that they didn't.

(For a while I thought: wait, obviously, Wheel just need to wait for humanity to advance to where they can send a magical probe into the Sun to physically access Ra! But then I realized that physical access should no longer be sufficient. Which led me to wonder: if physical access was sufficient in the first place, why wasn't this a glaringly obvious flaw that the Virtuals could have exploited at any time, rendering the key completely pointless? If Ra wouldn't accept any further orders from Ashburne herself after abstracting itself away, why would it accept them from the Glass Man after he gets a copy of the key she used? Why would that key still be valid? Ashburne threw away the key but King picked it up because he's an idiot, okay - but shouldn't Ra, as part of the process of closing itself off to outside input, have metaphorically poured cement into the locks? I'm not sure this actually makes any sense at all.)

2014-08-14 15:42:35 by Phigment:

Honestly, if I'm trying to build the Ultimate AI with infinite power over matter and energy, I'm absolutely NOT giving it a strong tendency to defend its current programming. Down that road lies Skynet. If an AI interprets attempts to change its current programming as attacks and moves to prevent them, you've just guaranteed yourself a knife fight in the event that you discover a flaw or mistake with your original instructions. And Ra brings near-absolute control over local matter and energy to knife fights. If anything, my inclination is to program it for the opposite. People demonstrably attempting to reprogram the AI according to accepted channels are explicitly given the benefit of the doubt and handled with kid gloves.

2014-08-14 15:53:52 by The_Enemys:

A thought occurs: wasn't the Bridge a means to move objects between any 2 realities? Duplicate items by moving them from T world, teleport by moving from reality, etc? I don't remember Sam saying that the Bridge can't access Ra's simulations the same way that it can access T world...

2014-08-14 16:06:12 by Morgan:

@The_Enemys: "This is the Bridge," Natalie says. "It... seems to move information around." "From anywhere to anywhere," Laura explains. "From reality into Tanako's world. From Tanako's world into reality. From reality to reality-- teleportation. From your mind into reality and back again..." Maybe, without the listening post, and with the Glass Man having given Ra its last order, using even the peach stone at the core of Earth One for simulation is simply too risky? Otherwise you are, I think, correct.

2014-08-14 18:08:30 by Unmaker:

@Sean I have a similar list of ways out. Search for "Inaccurate:"

@Yasha Read my comment on ways out, number one (search for "giving legitimate orders"). Basically, Ra does not block futures where the future people are giving legitimate orders, no matter how they got that legitimacy. Frankly, that's sane programming; otherwise the first-to-order can institute an absolute tyranny. Your point is still valid, though - it is unclear why Ra tried to block the Actuals' counter-strike, but that may have been on direct orders from Virtuals.

@Node Interesting indeed; I had never considered that an Actual might have started the first war (there are good indications that one started the second). However, by WoA (Word of Author) we know the Virtuals recreated the key. After that, the strong likelihood is that a subset of Virtuals used it:

1) The massive effort required (WoA again) is counterproductive unless it gives significant returns, i.e. the Matrioshka brain. So the creation itself shows intent to use.

2) If an Actual used it, how did that Actual know the Virtuals had the key? Recreation was supposed to be impossible, and A-V communication was poor to non-existent.

3) Assume all sentiences have an equal likelihood per time unit of going radical and attempting such an action. There were orders of magnitude more Virtuals thinking orders of magnitude more quickly than the Actuals. So unless Actual radicalization is many, many orders of magnitude more likely than Virtual radicalization, the likelihood is that many Virtuals got there first.

4) Clear winner of the Matrioshka brain project: Virtuals. Clear loser: Actuals. Strong motivation.

Means, motivation, and deliberately created opportunity - the classic trio of crime.

2014-08-14 19:42:30 by Silhalnor:

It may actually have been much easier for an Actual to acquire the key than for any Virtual. As the war began, there was an effort to reverse Ra's orders by resurrecting the architects. I don't know what it would take to resurrect them; probably other keys. But there is likely to be a way to extract the information necessary to resurrect the architects that is much easier than whatever the Virtuals would have to do, and after that you would extract the key from the architects' mind-states.

2014-08-14 19:47:21 by anonymouse:

The key wasn't in anyone's mind-state, though, for obvious reasons. It wasn't in Rachel's mind-state either; Wheel (and Rachel) wouldn't have allowed that, because it makes the security way too easy to subvert. All she had was a pointer to the original recording that King had foolishly saved, which still contained the key. Moral of the story: security through obscurity never works for very long (see also: the monsters in Tanako's World).

2014-08-14 21:42:34 by Morgan:

That's the key that Ashburne used to reprogram Ra at the end of War, though, not the one that was needed to access it before War began. For that key, we have this from "Why Do You Hate Ra": "First, they changed the metaphorical locks, making it impossible for the Actuals to revert their changes, no matter how many master architects were resurrected." The implication is that, though no single person may have had (or been able to contain) the key as it was originally set up, if you have enough of the right mind-states, you can get it.

2014-08-14 23:34:59 by Sabin:

@Anonymouse All security is security through obscurity. ASISTO!

2014-08-15 04:48:54 by Bauglir:

Incidentally, as mentioned by somebody way up above, how DID we go from "The Glass Man is on top of them, firing" to them getting a sort of drop on him?

2014-08-15 12:13:00 by FK:

Yeah, still waiting for some sort of explanation of that...

2014-08-15 12:18:38 by Sabin:

What do you get when you combine insomnia with unbridled geekery? That's right. Scott Parajsa/Harry James Potter Evans Verres crossover fan fiction. Bask in its awful, awful glory. http://pastebin.com/raw.php?i=jSg6cz4r (I don't know Sam's stance on fan fiction, but if you don't approve let me know and I'll delete it.)

2014-08-15 12:51:04 by Sabin:

It was brought to my attention that the "slash" symbol in the fanfiction community has a much less.... platonic definition than that which I intended. Rest assured, there are no romantic dalliances therein.

2014-08-15 14:13:59 by Sabin:

Also, from a pure cryptography perspective, I don't quite understand how the key would be too big to fit into any human's brain. Sam said it would take 10^10000 years to brute force. At this scale, it doesn't really matter how fast the Virtuals' computer is. You're looking at a key complexity of somewhere between 33,000 and 34,000 bits - about four kilobytes. Even if you had a Matrioshka brain-sized processor working for as long as the sun has power left in it, you would never come close to calculating even a billionth of one percent of the possible combinations of a 34,000-bit key. Am I wildly misunderstanding the basic principles of cryptography?
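A quick Python check of that arithmetic; the guess rate here is a deliberately absurd assumption, since at this scale it barely moves the answer:

import math

rate = 1e50                   # guesses per second: absurdly generous assumption
seconds_per_year = 3.156e7

# Total guesses ~ 10^10000 years * rate * seconds per year.
log10_guesses = 10_000 + math.log10(rate * seconds_per_year)
key_bits = log10_guesses / math.log10(2)   # bits such that 2^bits ~ guesses

print(f"~{key_bits:,.0f} bits, ~{key_bits / 8:,.0f} bytes")
# -> roughly 33,400 bits, a little over 4 KB: bits, not bytes.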

2014-08-15 15:04:24 by inkmothnexus:

@sabin is it possible that Ra was purposely designed so that any key to it would be too big to fit in any human brain, in an attempt to prevent tampering with Ra? maybe the architects of Ra made the key a petabyte long and called it a day for security.

2014-08-15 15:25:55 by jgh:

@bauglir sam really isn't that good of a writer. There is no other explanation or you'd already have it - he trawls these comments like he's getting paid to do it. Were it not for the interesting subject matter, the writing itself is about high school or entry college level. There is a reason no one will publish him. Don't get me wrong, i like the story, but the facts are the facts.

2014-08-15 16:06:12 by Sabin:

@jgh "It has always been the prerogative of children and half-wits to point out that the emperor has no clothes. But the half-wit remains a half-wit, and the emperor remains an emperor." @bauglir I didn't really see an inconsistency. This is how the situation played out in my mind as I read it. They five are freefalling above the Atlantic. Glass Man is above them, floating in air without falling. Laura morphs into her Phoenix form, spreads her wings, and speeds downward, below the group. Glass Man follows her and is now below the other four as well. He engages in brief projectile combat, then teleports Laura to him. Laura severs the Bridge. Now Glass Man and Laura are both below the others, both floating in air without falling, and the Bridge begins its free fall. Anil, who is above them, is falling at 90ish miles per hour. He quickly falls onto the Bridge, gains control of it, and teleports them all to the ground. @inkmothnexus A brain can hold about 1 PB with 1-1 fidelity, so that's as good a number as any. A key with a length of 1PB would take infinitely longer than 10^10000 years to brute force. Like, add about 10^15 more orders of magnitude. Certainly enough to make it basically impossible to crack. But it does make the 10^10000 years figure somewhat inaccurate.

2014-08-15 16:46:36 by Katrina:

@LNR "For Ra to malfunction was proven impossible... Mathematically, universe-breakingly, one-equals-zero impossible." There goes the universe...

2014-08-15 20:47:06 by naura:

re: a common critique in recent threads Reading Ra as an online serial is radically different than speed-reading through it. Waiting a month between chapters, poring over the text repeatedly for clues, discussing theories (with additional info from Sam) in the comments etc, makes a lot of the supposed deus ex T-world moments less so. It's also just a different kind of experience, one spread over a period of years rather than an afternoon. Perhaps Ra is just more enjoyable that way?

2014-08-15 22:22:02 by Alan:

Katrina: Asimov's three laws of robotics are perfect too, but many stories have been told where the universe disagreed with the ironclad ideas of mankind. The Ra story line is predicated on the notion of "we know what we are doing", which is found to be arrogant -and mistaken- time and again.

2014-08-16 00:08:18 by Sabin:

@Naura For what it's worth, I read the whole thing in an afternoon. But I also read all the comments before moving to the next chapter, and some of those threads got LONG. That gave me enough time to somewhat follow the theories and speculations that developed over the years. It sort of forced a certain amount of waiting and anticipation, and allowed my brain a little bit of time to rest and digest the main story. As a result I never felt like we were hit with Deus ex Magicka. I found it to be quite an enjoyable experience and recommend it to anyone who wasn't on the bus from the beginning.

As a side note: people have been complaining about Deus ex Magicka for quite some time. I thought it was fairly obvious from the beginning that *magic can do anything*. The very first chapter shows Laura using words and intent to transfer energy from an unknown source into kinetic energy. From that bedrock principle, you can literally do anything with enough time. The introduction of short-form spells indicated that the work could be front-loaded. So now anything could be done, and it could be done *quickly*, given enough preparation. Then we were introduced to characters who had clearly done the preparation. Rachel was light-years ahead of any other magician of her day. The members of the mysterious Wheel Group had powers that seemed to dwarf Rachel's. Then, once it was established that consciousness is a physical phenomenon that can be recreated or simulated, anything could be done, quickly, *without the need for preparation*. Human beings could, according to the rules, be living quines. And we had learned all of this by "The Jesus Machine". So the groundwork that *magic can do anything* was laid down pretty early on. For me the existence of godlike powers was a given.

2014-08-16 00:50:08 by naura:

@Katrina No, there goes the notion of "proof" (especially re: the real world)

2014-08-16 18:18:30 by Ari:

Huh, according to wikipedia "Clue" is apparently called "Cluedo" outside the US? Live and learn.

2014-08-16 22:03:03 by soniclettuce:

Just on the topic of character likeability: am I alone in feeling like I get where Laura is coming from, even if she's making (very) poor choices about it?

She's been promised the ability to end suffering for all of humanity. That's the kind of thing that would seem tempting to anybody, but even more so to someone who saw their mother die right in front of them as a child. Combine that with significant hubris (she seems to be one of the best mages of her generation).

In that context, shelving her boyfriend for a bit is a rather acceptable downside. Her real mistake is not considering that she's being manipulated, which is something I can sympathize with. She's a classic(ish) tragic character: handed godlike power, gone off the deep end in search of a noble goal. Hopefully she lives through it to learn her lesson; I sympathize enough that I'd be sad if she died.

2014-08-17 01:44:50 by Anonimuse:

Theory: Virtuals win. Ra instantiates backup copies of everyone (possibly including Abstract War casualties) in a simulation—cpu time being easy enough to come by, what with the shiny new Matrioshka brain and all. Nobody can actually tell the difference.

2014-08-17 10:06:56 by JDawgi:

Sam, if the story ends without explaining what virtual society is like, will you release a note about it?</br> Were you imagining a meeting of virtuals deciding to harvest actuals and the worlds, or was it just a few? </br>Is the virtuals world similar to the actuals with governments, or does each individual live in their own world? </br>Last question, did the virtuals know that actuals existed as sentient species? Before abstract war, it was said that actuals couldn't pay attention to virtuals because the simulation was too fast, and virtuals couldn't pay attention to actuals because they were too slow. Living in a super fast simulation for hundreds of years could have made them believe that actuals were just forces of nature. Too slow to be anything important.

2014-08-17 13:56:16 by Morgan:

@soniclettuce: IIRC, Laura body-swapped Nick before getting any promises of ultimate cosmic power - just the possibility of access to the knowledge she wanted was sufficient for her to do something utterly vile to someone she supposedly loved, over the cautionary advice of the two most sensible people in her life. And then, when Ra/Tanako/the Glass Man (I'm not actually 100% on whether the "Tanako" who took her into the archives was an instance of the Glass Man, or someone/something else?) asked her "what do you already know?", *she told him*. In all the annals of "how not to be lied to and manipulated", that's, like, lesson one.

2014-08-18 10:45:36 by Zim the Fox:

Hm~ I've not much to comment, but I can't see why the Virtuals didn't outright win Abstract War. Don't send energy packets to the planets and destroy them from within; build a set of lasers right by the Sun and then shoot the planets down. It will take almost the same amount of time, yet it can't be stopped by shielding the Ra node. The travel time of the laser light to its target, plus that of the energy package to the laser emitter, is the same as the travel time of the energy package to a planet. The only added time is the emitter construction time. This might take minutes, but surely Ra can figure something out. This also makes the Triton maneuver impossible: build laser, shoot down the Moon. Am I misremembering something? Was Ra limited during Abstract War?

-- Skimming through Abstract War: laser emitters were built at the target planets. This means that implementing my plan and Ra's plan would take the same amount of time. It makes no sense that Ra wouldn't just shoot from afar.

Here is another thought. What is the lifetime of Ra? How much energy does it consume? I am going to guess Ra could go a few thousand years consuming the entire energetic output of the Sun without causing trouble, but eventually this would encounter problems. Does Ra consume less than that? Why does Ra need to destroy the Solar System for materials when the Sun is made of 99.8% of the matter in the Solar System? Is that 0.2 percent really, really that important? From what I can gather, nonlocality technology allows you to convert matter to energy and other forms and back. What is the efficiency of nonlocality technology? Is it more or less efficient than nuclear fusion? If it's more efficient, Ra could be eating solar matter for energy (and maybe using what remains to heat up the star and not let it collapse). Fusion within the Sun might be no more.

As someone previously mentioned, could Abstract War and the current events be intended by Ra as the best possible outcome for humanity? The intentions of the Virtuals (namely, acquiring resources) make no sense when confronted with reality. Finally, I am getting the vibe that Ra's attack wasn't (uniquely?) orchestrated by the Virtuals, but by Actuals. Why? Dunno. It seemed to me for a moment as though the text was suggesting it was one of the Ra nodes that went bonkers, rather than the main Ra. Huh. I ended up commenting more than expected.
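For concreteness, the light-delay arithmetic behind the laser argument; the planetary distances are real astronomy, while the assumption that both the laser beam and the energy packet travel at roughly c comes from Zim's framing above, not from the text:

AU_IN_LIGHT_MINUTES = 8.317   # one astronomical unit, in light-minutes

planets_au = {
    "Earth": 1.0,
    "Jupiter": 5.2,
    "Saturn": 9.6,
    "Uranus": 19.2,
    "Neptune": 30.1,
}

for name, au in planets_au.items():
    hours = au * AU_IN_LIGHT_MINUTES / 60
    print(f"Sun -> {name}: {hours:5.2f} h one way")
# Sun -> Neptune is ~4.2 hours whether you send a packet or a laser
# pulse, so the only extra cost of "shoot from the Sun" is the time
# needed to build the emitter, exactly as argued above.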

2014-08-18 14:49:11 by Sabin:

@Zim the Fox According to the wellspring from which all factual knowledge comes forth (Wikipedia), "[Building a Matrioshka brain] would require the "disassembly" of significant portions (if not all) of the planetary system of the star for construction materials."

2014-08-18 16:43:20 by trainbrain27:

@Sabin I think that may only be true because the star is too hot to mine. Ra would not have that problem. That said, the system must be destroyed because the Actuals must be destroyed.

2014-08-18 19:57:01 by LNR:

@Sabin - you describe one of the possibilities "...never merge. Original You never gains those memories so there is no benefit." If you think there is "no benefit" to creating many people who think and believe as you do, you've got insufficient imagination. Even if copies might diverge later, I can think of tons of advantages I could gain by making an army of me, or even a new twin brother. I suggest reading the novel "Kiln People" by David Brin, which is based on this concept.

2014-08-18 23:24:42 by LNR:

@JDawgi: Asking "what Virtual society is like" is a kind of meaningless question. It's much worse than asking "what Earth society is like." How many different and varied societies are there on Earth right now? There may be skrillions more different varieties in the virtual world because there are so many more individuals. How much has your own Earth society changed over the last few centuries? Virtual societies could change that much in the time it has taken me to type this sentence.

2014-08-18 23:35:27 by skztr:

I never thought about it, but Kiln People is definitely one of my favourite books, and it has probably shaped my thoughts on the virtuals and non-linear society in general

2014-08-18 23:47:15 by Sabin:

@LNR - Read my post about Thaumic Warfare earlier in the comments. The thought definitely occurred to me. To clarify, though: in that specific post I was referring to a fairly limited context that revolved around duplication as it relates to experiences. On a less serious (but still likely true) note, you could argue from the premise "if there's a loophole that provides personal benefit, humans will exploit it to no end" that the lack of massive quantities of duplicated humans running rampant throughout the Worldring implies there's no substantial benefit to doing so. Not that hard to fathom, given that Ra could likely give you everything you want much more efficiently than a clone of yourself could.

2014-08-19 07:45:10 by Zim the Fox:

@Sabin There is also a "[citation needed]" right next to that excerpt :3 And there is no reason to believe a megastructure like Ra, able to withstand the conditions within a star, could not figure out a way to harvest the star. Even if Ra is unable to harvest the plasma, it could convert radiation into whatever materials it needs. Okay, I take that back: the time needed to harvest, from radiation alone, the Earth's worth of mass-energy is in the tens of millions of years. But the principle stands. If you need matter, take it from the object that makes up 99.8% of the Solar System, or from the object that makes up most of the remaining 0.2% (Jupiter). I guess that, given the Virtuals supposedly don't care about humans, as long as they gain more energy than they expend, no matter how little, it's a good enough idea for them.

2014-08-19 13:05:45 by MadcapPomposity:

Aside from the expanded future-morality of mind-states being foreign to our 21st-century meatspace sensibilities, there's another wrinkle to the Virtual/Actual debate. Sam has stated very clearly that the needs of "trillions of trillions" can never necessitate the deaths of billions. However, we have no information regarding the numbers or popular support of the Virtuals who hijacked Ra and started Abstract War. As Node said, it is entirely possible that Abstract War was instigated by a minuscule splinter faction of Virtual humanity. Thus, the frozen Virtuals become potentially sympathetic when you couple this possibility with Sabin's satisfyingly succinct conjecture that "if the universe has a finite lifespan, then every moment a Virtual is 'paused' for is a moment they don't get to experience in their lifetime (which is assumed to be infinite.) It would be similar to putting a human into a medically induced coma and taking 10 years off of their lifespan." I know better than to derail this thread with specific real-world examples of vocal minorities claiming (and failing) to act in the overall interests of their respective broader groups. No two people could ever agree on a comprehensive list, but we all know that this is a depressingly common phenomenon.

2014-08-19 15:34:52 by Sabin:

@Zim That's because I wrote that excerpt myself so I could lend false weight to my argument! ;) I kid, I kid. Jokes aside, I agree with you, at least in terms of the ethical conclusion. In this case, "Actuals vs. Virtuals" is a false dichotomy. If you believe strongly enough in the value of human life there is, as you point out, a third option: harvest the sun. Logistically, though, it MUST be more efficient to harvest the solar system than the sun. Ra is an optimization engine; Ra is not going to choose a suboptimal solution. However, the Virtuals who reprogrammed Ra have sufficiently marginalized the value of Actual life so as to justify harvesting the solar system instead of the sun.

P.S. Incidentally, this is why I think "scalar ethics" are fundamentally flawed. I.e. "I assign a value of X to this Thing. Thus, 10,000 of these Things have a value of 10,000*X." Anything becomes justifiable in sufficient quantities. E.g. the famous "It's better to torture one person for 10,000 years than to let 10,000,000,000,000,000 people get a speck of dust in their eye."

2014-08-19 21:04:23 by JDawgi:

@LNR <br/> While I understand what you are saying and why you say it, using that as an argument for why it should not be stated violates the writing style to me. Saying that it is impossible to understand what their society is like is akin to hand waving. If an explanation could be ignored that easily, I would wonder why the premise of the story (the technical aspect of magic and how it works) wasn't just discarded similarly. <br/> I am not presuming to understand Sam enough to know his intentions fully, but I doubt that he would be satisfied if he relied on that as an explanation. I want to know why they did what they did other than an appeal to incomprehensible evil via an appeal to ignorance. If they are, in fact, incomprehensibly evil, I would like to know why. If I can't get an answer, no big deal, but it would help me enjoy this world that Sam created more if I understood the other aspect of it.

2014-08-19 21:07:23 by JDawgi:

As a side note, can anyone tell me how to edit my comments to fix my atrocious use of html, or are they fine and it's just the browser on my phone that needs to be updated?

2014-08-19 21:32:00 by FK:

@Zim: My assumption concerning the necessity for removing the actuals was to prevent interference, rather than to scavenge mass. The Matrioshka brain requires the entire solar output; therefore the actuals will be killed / negatively impacted by its construction; therefore the actuals will attempt to stop its construction; therefore they have to go.

2014-08-19 22:37:02 by qntm:

> Sam, if the story ends without explaining what virtual society is like, will you release a note about it?

Every single imaginable society multiplied by every single unimaginable society multiplied by wild impossibilities. Huge amalgamations of beings and virtualities networked together, swamped with smaller independent universes which exist in complete isolation and are completely unaware that any others exist. Huge variances in technical capability and ontological awareness. Severe fracturing and fragmentation, the precise opposite of monoculture, close to indescribable. The answer to any direct yes/no question about Virtual "society" is always "both at once", and as Ashburne mentions, to give it a single proper noun "Virtual" is to grossly oversimplify. The only thing truly shared by all Virtuals is their living environment, Ra.

> Were you imagining a meeting of virtuals deciding to harvest actuals and the worlds, or was it just a few?

I see Virtual civilisation as being too vast and numerous for its behaviour to be explained coherently through anything except statistical models. I think it would be impossible for any particular structured group of Virtuals to hold coherence for long enough to "act against Actuality" in so many words. (A route around this would be to deliberately slow down the clockspeed of your own host, but even if you were capable of doing this I imagine it would be a recipe for being cannibalised by other, faster-moving VMs.) Instead I imagine the Virtual "plan" coming together like multicellular life gradually evolving or a crystal forming, or waves gradually eroding chalk stacks. There would be no singular responsibility, only a series of independently minuscule, terrible decisions by fleeting individuals and societies with the lifespans of mayflies, decisions which eventually compounded until they broke out into Actuality.

> Is the virtuals world similar to the actuals with governments, or does each individual live in their own world?

As I said: Yes. No. Both. What I will say, though, is that the only way to "live in your own world" is to live in a world which is otherwise empty. If there are other sentient beings populating your world alongside you, then guess what, you are not alone.

> Last question, did the virtuals know that actuals existed as sentient species? Before abstract war, it was said that actuals couldn't pay attention to virtuals because the simulation was too fast, and virtuals couldn't pay attention to actuals because they were too slow. Living in a super fast simulation for hundreds of years could have made them believe that actuals were just forces of nature. Too slow to be anything important.

Yes. No. All things. All levels of comprehension and ontological awareness are represented, from a painful and bitter and completely accurate awareness of the true nature of Actual and Virtual reality down to millions of generations of deluded simpletons being born and dying inside nonsensically implausible environments, never learning or suspecting that anything else could exist.

2014-08-20 04:02:55 by JDawgi:

Thanks Sam. That helps explain quite a bit, though it makes the question of accountability that people were arguing about above almost impossible to settle. <br>To make sure I understand: no one Virtual really desired the annihilation of the Actuals, but rather they made small wishes that gradually changed the utility function of the Ra that was managing them, until finally Ra decided that the Virtuals satisfied its utility more than the Actuals?

2014-08-20 04:12:35 by Sabin:

@Sam Thanks for the explanation! This does seem to categorically rule out the notion of one splinter group of Virtuals acting against the majority. It also all-but-confirms this "plan" was the Virtuals' doing, rather than the Actuals. Incidentally, speaking from professional experience, I can say that the notion of "the precise opposite of a monoculture" characterized by "severe fracturing and fragmentation" is actually quite easily describable and quite predictable. You gather together enough people of differing opinions, beliefs, motives, buying behaviors, etc... The sum of those people inexorably pull together into an average. Of course what does it say about the Virtuals when the emergent property of trillions upon trillions of independent decisions is the transhumanist version of "KILL THE OTHERS THEN EAT THEM"

2014-08-20 05:13:35 by Bauglir:

As for why not build the lasers in the Sun, there would be a delay between construction and firing. The clean nodes at the targets would have time to construct defenses against the lasers during this time, even though the lasers would likely have fired by the time the light carrying information about the construction reaches the outer planets. Of course, "Why didn't Ra wait to initiate its plans until after all nodes had been infected?" is a very good question, and one that's already been asked. I don't think it's going to be answered, and I think it's acceptable to handwave as being due to some arcane technical detail. Maybe it's a consequence of the architecture that makes it provably infallible - they're equally incomprehensible properties, really.

2014-08-20 08:17:33 by Alan:

So the latest suggestion from Sam is that the Virtuals are also fighting each other. Perhaps not in a conscious way, but more of a "we need that, let's move this mess out of the way". Perhaps like a man walking through an ants' nest.

2014-08-20 08:37:11 by asdf:

Ra may have taken the speed-of-light delay into account to minimize the Actuals' ability to respond. Act immediately on arrival of the packet; ignore the other nodes, which may not even have gotten the instruction yet. One would imagine countermeasures in place to at LEAST detect this type of thing starting and cut that instruction out of the system. Acting immediately and not waiting for other nodes keeps your intentions outside the enemy's light cone until it is already too late.

2014-08-20 10:30:07 by ahd:

@sam: i thought, based on Word of Author, that cracking Ra's original master key was a massive, horrifyingly costly exercise in genius-level awesomeness and stellar levels of brute force. the awesomeness was required to make it doable by *0.75* stars' worth of brute force. in other words: someone deliberately went after the key, and kept feeding that effort with resources and compute cycles until they had the key, then made a wish. *somebody* *made* *a* *wish* *with* *the* *key* and suddenly you come out with "no singular responsibility, only a series of independently minuscule, terrible decisions by fleeting individuals and societies with the lifespans of mayflies, decisions which eventually compounded". what is this i don't even...?

2014-08-20 10:47:21 by Alan:

ahd, the current conflict is not being driven by a resident of the sun; they were frozen by the Triton crew. The current conflict is predicated on hubris, overconfidence and human error. At least as I see it.

2014-08-20 10:51:59 by ahd:

alan: i wasn't talking about the current conflict. sam's not going to tell us who or what is driving the current conflict, lest we pick holes in it and he feel bad. discussing it would be a pointless waste of perfectly good cat-posting time. the remark of sam's that i was replying to was in respect of Abstract War I, unless i misread. so was my reply. this is Abstract War II.

2014-08-20 12:43:20 by skztr:

@ahd: One horrible decision might be "try to obtain the key". Another trillion horrible decisions might be "keep trying to obtain the key". Another horrible decision might be "sudo give me more power", made by someone who doesn't consider the Actuals to be life-forms, and who didn't bother considering why "give me more power" previously returned "permission denied"

2014-08-20 13:42:47 by ahd:

...actually, yes. i could see some mad Virtual hacker crowdfunding the key cracking effort, and several decillion Virtual citizens deciding to contribute some pocket change for the lulz. up until just now i had been assuming that someone had literally conquered a large chunk of Ra's Virtual thorns to control enough processing power for the crack. incidentally, do we have any estimate on how many years of solar output Ra can store?

2014-08-20 16:03:24 by Sabin:

@ahd:

>> "cracking Ra's original master key was a massive, horrifyingly costly exercise in genius-level awesomeness and stellar levels of brute force."

I had always interpreted Sam's statement to mean that because the computational cost was so high (10^10000 years), they acquired it through some means other than brute force. E.g. perhaps some Virtual decided not to sanitize a record containing the key, and some other, dead Virtual held within their brain a pointer which led to the location of the recording... and another Virtual manipulated another Virtual into teleporting itself into the center of Ra to retrieve a device which could bring the aforementioned dead Virtual back to life.... ;)

>> "incidentally, do we have any estimate on how many years of solar output Ra can store?"

My quick and dirty math says that a complete conversion of a year's solar output to mass comes to about 1.3 × 10^17 kg, roughly 130 trillion tons. Which, on a solar scale, actually isn't all that much: about 7,000 cubic kilometers of solid tungsten, a cube less than 20 km on a side. Ra could easily have a bunch of tungsten cubes floating in orbit around the sun.
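
Showing my work in Python, since it's easy to slip a few orders of magnitude doing this in your head (solar luminosity and tungsten density are textbook round numbers):

    # One year of solar output, banked as mass via E = mc^2.
    L_SUN = 3.83e26     # solar luminosity, W
    YEAR = 3.156e7      # seconds per year
    C = 2.998e8         # speed of light, m/s
    RHO_W = 19_300.0    # density of tungsten, kg/m^3

    energy = L_SUN * YEAR            # ~1.2e34 J
    mass = energy / C**2             # ~1.3e17 kg, ~130 trillion tonnes
    volume_km3 = mass / RHO_W / 1e9  # ~7,000 km^3 of tungsten
    side_km = volume_km3 ** (1 / 3)  # a cube under 20 km on a side
    print(f"{mass:.2e} kg/yr = {volume_km3:,.0f} km^3 of tungsten "
          f"(cube ~{side_km:.0f} km per side)")

So one stored year is a tungsten cube you could hide behind a modest asteroid.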

2014-08-20 16:32:10 by John:

It is a truism of cryptography that attacks only get better, they never get worse. Maybe some clever Virtual figured out a better cryptographic attack, and reduced the brute force search down to something feasible. Which, of course, has consequences if the only thing the Actuals did was change the key, but kept the cryptographic algorithm the same. In which case, if there was any Virtual agent that escaped freezing (such as an Actual sympathizer who had been given the new crypto attack), the agent could repeat the attack and extract the new key no matter how many times the key had been changed (or, indeed, if all records of the original key had been destroyed). However, I doubt that's what Sam actually had in mind, given the plot contortions which have been exerted in the pursuit of the hidden stored key so far.

2014-08-20 17:24:07 by Sabin:

@John ("the agent could repeat the attack and extract the new key no matter how many times the key had been changed") That's doubtful- I posted earlier about the cryptographic complexity of the "Old Key" vs. the "New Key". The old key would have taken 10^10000 years to brute force, which means that it would have a size of around 32000-34000 bits. The "new key" was designed so that it would be too large for even a human brain to contain in its entirely. Meaning it would have to be larger than 1PB, which would take something like 10^(10000000000000... add 10000 zeroes) years to brute force. Even an algorithm which could have cracked the first key instantaneously would still take 10^100000+ years to brute force a 1PB+ key.

2014-08-20 17:26:41 by Sabin:

@Zim Incidentally, I had another thought re: the logic behind harvesting the solar system instead of the sun. Building a Matrioshka brain around the sun would (according to both science and the story itself) decrease solar output so significantly as to basically turn the Worldring into a bunch of ice cubes. So you'd end up killing everyone anyway. So if you're Ra you might as well build it from the most easily available material possible.

2014-08-20 18:51:55 by John:

@Sabin: It seems to me that Sam sometimes explains things in terms of "I, as the author, want to exclude these particular hypotheses from consideration, and therefore this key has property X, which explicitly precludes those hypotheses". Saying "The key cannot possibly fit inside a human's head" is such a description. That's not so much an estimate of the number of bits as much as "For plot purposes, the key must be physically located somewhere besides a human head, and this is being achieved by the fiat statement that it couldn't possibly fit". Sometimes such stated facts wind up not making sense with respect to other stated facts, but we as the audience are well advised to just sit back, relax, and pay no attention to those inconvenient paradoxes behind the curtain.

2014-08-21 01:47:57 by MadcapPomposity:

Huh. Given Sam's explanation, it's still difficult or impossible to assign blame for Abstract War to individual Virtual humans. I find the frozen Virtuals sympathetic because it seems like a device as powerful as Ra makes that sort of misuse almost inevitable. Sam hit the nail on the head when he likened working with Ra to juggling subcritical plutonium. Of course, the dead Actuals elicit at least as much sympathy, and the survivors' decision to freeze the Virtuals is both preferable to outright deletion and possibly the bare minimum force necessary to defend themselves while thinking of a better solution.

2014-08-21 02:12:55 by avoidingreallife:

To dredge up an old topic, I think retributive justice only seems barbaric until you consider what happens when the threat of retributive justice is taken away. I'm an American, and much of the wrongdoing I've seen in my lifetime was committed by people who were (or who believed they were) not subject to any negative consequences as a result of their actions. Many conservative Christians in my country enshrine their beliefs in legislation and enforce them on others because they know that no single opponent possesses both the ability and the courage to treat them the same way. Our recent mass shootings have all been perpetrated by men who planned to kill themselves anyway, or who were too psychologically broken to fear retribution in the first place. The online media is brimming with stories about abuses of power by police officers and vigilantes who considered themselves above the law, because they had seen others like themselves get away with similar abuses countless times before. Retributive justice isn't a solution in itself, but as a deterrent it forestalls even worse barbarism.

2014-08-21 02:24:34 by avoidingreallife:

Another example. Rush Limbaugh can heap abuse on the late Robin Williams with impunity as long as he's speaking from a recording booth; he wouldn't dare stand up and say anything of the sort at the man's funeral, because that would entail the possibility of a severe beating. It's hard to argue against a certain amount of fear being used to enforce good behavior, because sometimes nothing else will do.

2014-08-21 03:27:00 by Sabin:

@avoidingreallife I agree, and I think it's even simpler than that. "Justice" is just a fancy way of saying, "If you do X, then Y will happen." It's a codification of actions and their consequences. It is irrational to want to benefit from a structure which operates according to a set of rules, and yet not adhere to them yourself. It's those very rules that give the structure meaning, context, and value. Reality follows certain rules. If those rules only applied selectively, it would cease to be "reality" as we define it. Social laws, if selectively enforced, would shatter the context and value and meaning of society just as thoroughly. Calling the arbiter of justice "barbaric" is as asinine as calling a hot stove "barbaric" for burning you when you touch it. Actions have consequences, regardless of whether they are laid out by the fundamental laws of nature or they are delivered in the form of a human declaration. "If you do X, then Y will happen."

2014-08-21 04:51:48 by Bauglir:

Not sure I agree. Retributive justice has value insofar as it serves as a deterrent, yes, although it needs to be justified by showing that it does, in fact, accomplish that goal (for instance, crimes committed without premeditation can't be treated this way, because there was no opportunity to consider a deterrent). And, sure, I won't argue that you could define justice as the social consequence of an action, but that doesn't automatically justify any PARTICULAR consequence, which can still be judged to be as horrible or noble as you like. And it certainly isn't an objective standard, since social standards vary from place to place and time to time. Ultimately, though, justice needs to revolve around the goal of making sure crimes stop happening. For the sake of argument, I'll agree that that often aligns with retribution, but that's coincidental at best - if you've got a hypothetical murderer, and imprisoning them will lead to further deaths but a full pardon and financial reward would somehow save and improve other lives, you have to take the second course, even if it feels repugnant. At the absolute worst, a heinous crime nullifies the value of a person's wellbeing - it doesn't turn their suffering into a net gain for society.

2014-08-21 18:44:45 by Sabin:

That's fair. To clarify, I'm not advocating one particular set of rules over another (or suggesting that just because something is socially codified, it's automatically right). After all, it wasn't too long ago that the combination of being black and sitting at the front of a bus was a punishable offense. Rather, I'm simply addressing the question of whether or not "justice" as a concept (whether it's retributive, preventative or otherwise) is barbaric. And I still hold that the answer is, "No more barbaric than Einstein's field equations." As for the question of "What is most effective at preventing crimes?", that's an optimization problem I'm not really qualified to weigh in on. As for the hypothetical example where somehow pardoning a murderer would improve lives: I don't hold ethical hypotheticals in high regard. Hypotheticals are meant to illustrate extreme scenarios, but extreme scenarios are wildly influenced by minor variations in details, which inherently have to be condensed or completely removed when creating a hypothetical. As for suffering - I would argue that suffering qua suffering is *never* a net gain for society. Any suffering that does occur in the meting out of justice is orthogonal to the purpose.

2014-08-23 01:53:43 by Wes:

@Sabin "Of course what does it say about the Virtuals when the emergent property of trillions upon trillions of independent decisions is the transhumanist version of 'KILL THE OTHERS THEN EAT THEM'" I don't see it in exactly that light. I see it more as that the emergent property of those decisions is along the lines of "Expand" and "We need more resources!", the same emergent property that I would imagine would come from Actuals, dogs, mice, nematodes, bacteria, and anything that multiplies. As Ra is simply a machine divorced from any notion of morality and whose original job was simply to do the bidding of minds both Actual and Virtual, Ra might simply have taken those emergent demands and attempted to carry them out in the most efficient manner possible, to the dismay of every Actual in the system. @ in general Although there's still the matter of the key. Since the answer to most question about Virtuals is "Yes, no, and both.", it's conceivable that a portion of the subset of Virtuals that we would recognize as evil or amoral worked on the key each in turn, sharing the work, passing on the knowledge of the current state of their progress, as they merge into and fragment out of existence. The notion of a responsible individual vanishes on timeframes meaningful to Actuals. Hell, all knowledge of what they did with the key, what the new key was, and what they even did when they had it could have vanished from Virtual knowledge a few seconds into Abstract War. It doesn't even make sense to contemplate a punishment for them. In my opinion, indefinitely freezing them wasn't necessarily intended as a punishment (at least by Ashburne, we all know King's thoughts on it). Rather, it's self-protection against a nigh-impossibly diffuse, diverse, and perpetually evolving multitude that's impossible to police and prevent from enacting adverse commands. Fleeing the system is yet another method of escaping the consequences of having, at any given moment, portions of Virtuality acting at odds with Actuality. Although, that impossibly shifting multitude incorporating minds of good, bad, and orthogonal morality; of varying tastes in their personal realities... gives me a thought. Perhaps a subset of Virtuality found a way to live outside the Ra nodes before or during Abstract War and escaped being frozen. Due to the "Yes, no, both, everything"-ness of it, it's almost a certainty. Ra may do what you mean, but Virtuals do everything... at almost the same time. This certainly could have been done once the key was in their possession. After that, infiltration of desirable virtual worlds (T-world in particular) would have been inevitable. This line of thought in mind, I'm compelled to think that the bare existence of Virtuals at all presents an existential threat to those who choose to live in reality, leaving those in reality a choice to either flee (like Wheel) or be forced to straight-up destroy them. In that light, maybe I ought to consider Wheel less cowardly and more humane in their decision to exit.

2014-08-23 02:47:33 by avoidingreallife:

'Justice' is a whole series of human social constructs whose goals have varied widely over the course of human civilization. Justice is usually an attempt to make the world fair, but human myopia and hypocrisy usually conspire to skew any system of justice in favor of its architects. Thus it is entirely possible for an arbiter of 'justice' to be fantastically barbaric, as any Westerner will point out when confronted with Middle Eastern penalties for adultery or apostasy. It's just this sort of barbaric underlying system that makes us all question the acceptability of retribution as a tool of justice, because we don't want that particular tool to fall into the hands of a system whose ideals we disagree with. This loops us back to the central problem: there really is no objective standard of justice that humans can unanimously agree upon. It follows that retributive justice should be limited in force and scope, but eliminating it entirely would badly degrade the effectiveness of almost any justice system and replace it with the law of the jungle. Appeals to altruism and empathy and self-interested cooperation all have their place, but sometimes force must be met with force. Refusing, on principle, to use force in support of justice is tantamount to actively subverting the entire system and abdicating the use of force to those who do so unjustly. Regarding the Glass Man, whose summary execution started this whole conversation: I imagine that Natalie objects to his death because she is used to justice on a much smaller scale, where the possible consequences of leniency don't include planetary extinction. More generally, I imagine that drastic, hasty actions run counter to her meticulous nature.

2014-08-23 23:28:11 by Ducken:

I have only somewhat been paying attention to the comment section, as the fervent disagreements about aspects I don't care about overmuch have been running out of control, so I apologize in advance if this theory has already been put forth. For a while now I have been, without evidence, under the impression that the Glass Man is an emergent personality from the mass of the Virtuals. There could be plenty of emergent "individuals" from the mass - corporate entities, almost - representing massive swaths of absorbed Virtuals, and perhaps the Glass Man is a particularly notorious and powerful one who chose to leave virtuality to undertake a goal, one that Ashburne could have interfaced with many times before. I've also been thinking there are three important aspects of the Virtuals' behavior towards the Actuals: 1) Are they aware that they're virtual? If not, they don't care about what happens in the layer of reality above them. 2) Are they aware that the Actuals exist as anything other than a geologic force? If not, they should have no problem with forceful rearrangement of the solar system's mass. 3) Do they care about the Actuals' continued existence more than they care about their own processing needs? If we're generous and say the split at each question is 50/50, only an eighth of the Virtuals can be said to be actually hostile, but only an eighth would have any reason to try to oppose the other seven-eighths' desire for more processing power. I wouldn't say that this indicts the Virtuals as a whole as a group of psychos who want to "eat the others."

2014-08-24 16:07:47 by Jesus christ:

I'm so sick of this "omg don't blame the virtuals" shit. The next person to continue this train of thought needs to explain how to decide which Virtuals are responsible. (Hint: Sam said this can't be done; he is god of the story, so it CAN'T BE DONE.) SECOND, how do you ensure the long-term safety of the Actuals in their shoes? (Hint: you don't; you freeze the Virtuals.) THIRD, what other course of action could have been taken at the time, or could be taken "now", that would be more morally acceptable to the thick-skulled Virtual-morality crew? As it stands, in the universe Sam created and made the LAWS (NOT rules) of, freezing them was the ONLY moral option. There is ZERO room for interpretation.

2014-08-24 16:10:04 by Jesus 3rd day:

BTW, according to the in-universe rules you've only got two options: deletion or freezing. We can't bring real-world arguments into this hypothetical because this is clearly not the real world, and as such the solutions available to the characters are dictated by God the author. Deal with it.

2014-08-24 16:11:32 by JC:

Basically I agree with the levelheaded commentators above. The loud minority supporting the Virtuals has worn my patience thin, however.

2014-08-25 04:54:19 by ducken:

just to clarify, I think freezing the virtuals is the sanest and most just option. to quote my sarcastic grandfather, "I'm sure you wouldn't mind if there was only one turd in the punch bowl." it's just a shame that tons of "innocents" are getting the shaft due to some monumental wingwangs.

2014-08-25 06:27:37 by Trevor:

"At all costs" is an untenable position. There is ALWAYS a cost worth killing for to someone. There are such things as no-win scenarios. The blanket statement that there is ALWAYS another option is not true. That said, it's irresponsible for people not to try to find another option. That's what makes Laura interesting: she is irresponsible. It's tragic that we watch her make bad decisions because she's too lazy to think about things. Interestingly, Nat isn't yet likeable to me. She is too cautious. It makes her less heroic than Laura. Even though Laura is batshit crazy, she's an heroic character because she acts to solve problems instead of waiting. History is full of retributive justice societies. We don't have to speculate about them. Romania under Vlad Tepes, for example. Russia under most of the Czars, for another. That isn't the sort of society in which people would want to live for long AND it doesn't actually work becuase people don't always get caught, the wrong people sometimes get punished for crimes, and some people don't care about the punishment. So why would people who have advanced by as much as the Actuals have zero ethical and historical training and think retribution could possibly solve the problem?

2014-08-25 15:42:32 by Matt:

The real danger posed by the Virtuals is simply the fact that the computer that runs them also drives a Nonlocality system with access to a star's energy output. They essentially have physical access to god-hardware. The simplest solution is to build a new computer with exactly one responsibility: simulating Virtuals. With no information about the outside world, clock speed becomes impossible to measure from the inside. They're exiled to live in a sealed universe, without the tools to launch another attack.

2014-08-25 17:59:02 by MadcapPomposity:

Clarification: I didn't say that the Triton crew were unforgivable monsters for freezing the Virtuals; after all, Actuals have their own right to self-preservation. I was saying that SOME of the Virtuals are sympathetic because SOME of them are entirely not responsible for Abstract War (due to Virtuality not being an über-unified hivemind), yet they got caught in the aftermath anyway. This isn't to say that they're more sympathetic than the numberless Actual humans who perma-died; I'm just saying that the whole situation has a ring of tragic inevitability to it. It all comes back to Ra offering more power than anyone can be trusted to use.

2014-08-25 22:23:44 by LNR:

@Matt: "The simplest solution is to build a new computer" Building the first Ra took decades of effort from the full resources of far-future human society. After the War, that society no longer exists and nobody has the tech, resources, or knowledge required to duplicate the feat. Besides, we don't have another convenient star to power it.

2014-08-26 00:54:22 by skynet's favorite roomba:

@Trevor: "Even though Laura is batshit crazy, she's an heroic character because she acts to solve problems instead of waiting [like Nat]." I'd label Laura as being more of a *protagonist*, not more of a *hero*. . .

2014-08-26 17:28:53 by avoidingreallife:

I kicked off the retributive justice conversation in response to a post from Velorien. Velorien stated that killing the Glass Man solely for the evil that he committed in the throes of his genocidal mania (rather than killing him to nullify the continuing threat that he presented) was the hallmark of a purely retributive system of justice. Velorien then stated that we'd be surprised at how many people found such an ethical system repulsive, and I decided to be contrary and point out that fear of retribution is a necessary component of any system that administers justice to more than a handful of people. I acknowledged that appeals to altruism, empathy, and self-interested cooperation are also necessary, and ceded the point that retributive justice can also be a tool of oppression if not properly limited.

However, I didn't link the discussion back to Abstract War or make any claims about retribution (rightly or wrongly!) being Ashburne's motivation for freezing the Virtuals. In fact, Ashburne outright states (in Last Thursdayism) her distaste for the idea of exterminating the Virtuals. Retribution against the Virtuals would be wrong, because it's likely that many of them are entirely ignorant of Abstract War. In the real world, we kill murderers and fine irresponsible drivers to prevent others from behaving the same way; in the world of Ra, exterminating the Virtuals would, by definition, leave no one to appreciate the example and thereafter adjust their behavior. This problem isn't occurring at the right scale for retributive justice to be effective in the first place. It's a systemic deficiency that requires an overhaul of the rules, rather than an argument about how harshly to enforce the rules as they currently stand.

tl;dr, my point wasn't 'RETRIBUTIVE JUSTICE IS THE ONLY JUSTICE, KILL 'EM ALL!' My point was that retribution is one social tool among many. Humans as a species didn't stop using fire just because Ogg burned his fingers when his kebab stick was too short. We didn't stop using nuclear power when a group of unqualified incompetents disabled a staggering number of safety features while stress-testing a reactor.

2014-08-27 00:05:46 by speising:

i may be late to get it, but i just now realized why the genocide counter is set to 1 at gntm.org/board...

2014-08-27 10:32:59 by speising:

sorry. i meant, of course, "geocide" and "qntm.org/board". clumsy fingers.

2014-08-27 18:41:15 by Sabin:

@Trevor "The blanket statement that there is ALWAYS another option is not true." That's fundamentally incorrect. For several reasons. Let's start simply with logical presuppositions: "An option" presupposes "an alternative option". "Ethics" presupposes choice, which presupposes alternatives. Pure logical coherence requires that there, at the very least, be two fundamental alternatives. But, it goes even further. For a "choice" to have an observable effect, there must be another "choice" to serve as a reference point. If both of those choices are purely binary, that presents a minimum of four possible observable choice combinations. Long story short: Any choice we are capable of considering has at least four alternatives. A scenario with "only one option" can not even logically exist. And we can never observe the results of a scenario involving "the lesser of two evils".

2014-08-27 19:46:03 by lol:

This is what happens when programmers and armchair philosophers try to relate their experience to reality. We get told by Sabin that there are a minimum of four choices to every extant problem. Clearly, this thought experiment does not carry over to reality, and thus has no emergent benefit. By inference, the remainder of your argument which relies on this point becomes invalid.

2014-08-27 19:50:10 by lol:

If you'd like to involve real physics, relativity states in simple terms that there NEED NOT BE any opposing reference frame (your "second choice") and in fact that my reference frame (set of possible "choices") NEED NOT AGREE with those expressed within your world line (your set of "choices"). Being that all reference frames are equally valid, we need not even theorize a "second choice", let alone actualize that "second choice" (reference frame).

2014-08-27 22:05:54 by Trevor:

@ Sabin: I think that you're delving too deeply into the semantics of the point I was trying to make, which is inherently my fault for not being more careful with the language. Explicitly, there are times when the options are to "achieve goal" or "not achieve goal" and there is only one solution that ends with "achieve goal" but multiple solutions that end with the goal not achieved. @ skynet's favorite roomba: To clarify, I don't think Laura is the hero of the story, rather that she has heroic traits, heroic qualities. She is the protagonist of a large portion of the story, I agree, but she is also heroic without being a hero. So far, I haven't identified a hero, just varying anti-heroes and anti-villains. Even Ra is an anti-villain: it/he/she just wants to perform its utility functions and the humans keep messing with its/his/her programming.

2014-08-27 22:14:22 by qntm:

At least one person I know identifies Laura as the villain of the piece.

2014-08-28 02:01:47 by LNR:

I'd certainly say she is a villain. None of these world-ending catastrophes would have been possible without her selfish meddling and her megalomania. But she's not really "the" villain, because this story has a lot of those.

2014-08-28 04:52:36 by Trevor:

I think a large part of the tragedy of Laura is watching her move from anti-hero to anti-villain. This story is, interestingly, mostly about villains getting in each other's way. And Nat hanging out, trying not to be a villain through her actions and instead being boring and doing nothing.

2014-08-28 09:32:32 by Toadworld:

@lol if you'd like to involve real physics, you would first learn that relativity describes relativity, and not ethics. It seems a little like you're using words you've seen others use in similar contexts ("emergent") but have not had a systematic introduction to the context behind them. I think you should be wary, especially if those contexts come from Yudkowsky's extremists.

2014-08-28 22:04:34 by Sabin:

@lol Wow. Well, Toadworld beat me to the punch. I just wanted to add that this perfectly illustrates my formula for calculating someone's IQ (Insecurity Quotient): IQ= Syllables Used - Syllables Needed @Trevor - That's fair, I did get a bit wrapped up in semantics. To apply my point more practically: even in situations as dire as the ones you described, there's always the meta-choice of "Change your goals". But I think in most "lesser of two evils" scenarios, the ethical thing to do is find an option that, if successful, would produce an ethical outcome, then attempt that (regardless of how unlikely it is to succeed).

2014-08-28 22:44:31 by skynet's favorite roomba:

@Sam, LNR: Yeah, after the "I know! I'll give an alien consciousness my boyfriend's body to play with!" moment, Laura shifted from "incautious and callous" to "villain" - unfortunately for her, there are far more dangerous villains, to whom she seems to be just a useful idiot...

2014-08-28 22:58:58 by skynet's favorite roomba:

@Trevor: I'm not saying you're wrong and that Laura has none, but I'm curious what heroic traits you identify Laura as possessing? Apart from being somewhat outraged about Wheel having the ability to solve the problems of death, sickness, hunger, et al. and not doing so, I don't recall her ever - prior to putting herself in harm's way in this chapter - doing anything which was selfless, or which required (the potential of) sacrifice on her part.

2014-08-28 23:22:45 by skynet's favorite roomba:

Also (and this is probably the wrong place to ask this, but...) I'm curious as to the reasons for the seeming general mild ire towards "Yudkowsky's lot"? (I consider myself one of "Yudkowsky's lot", in that I agree with a lot of their stated long-term goals whilst disagreeing with what they're doing in the present to realize those goals.) And that's probably enough posting from me for today...

2014-08-30 07:28:13 by Alan:

The end of the chapter normally says "To be continued", but this time it says "To be concluded". Sam, next story, can you add a forwarding link at the end of the comments to the next chapter, and one at the bottom of the page back to the first comment? Thank you, possibly. :)

2014-08-30 12:46:25 by Sabin:

Hmm, I don't see why self-sacrifice is a prerequisite for being protagonist. If you can choose between saving the world, and saving the world + dying, the latter is not inherently better. As for "Yudkowsky's lot", I think it's two things. On one hand, I think a lot of it is unjustified xenophobia (e.g. "This comment section was OURS before Yudkowsky's lot showed up!"). On the other hand, I do think there is some legitimacy to the dislike. For a system based purely around reason, EY's followers tend to implicitly use the appeal to authority a LOT. Oftentimes it feels less like they are independent thinkers and more like they are oracles reading tea leaves in order to interpret the Will of Yudkowsky. (Note: I myself came over here because of a link from EY. I very much support the general rationalist and transhumanist movement. However, I think the worst enemies of the movement are the socially deficient schmucks who make up a large subsection of its followers.)

2014-08-30 13:05:38 by qntm:

> I don't see why self-sacrifice is a prerequisite for being protagonist. It isn't at all, and Laura is categorically the protagonist for at least the first half of the story. Self-sacrifice - or at least, severe personal risk - is a prerequisite for *heroism*.

2014-08-30 21:07:39 by Anonymous:

@skynet's favorite roomba: I used to post on LessWrong, and left after a casually-sexist comment there triggered (in the true sense of the word) flashbacks of an assault against me. I reacted poorly, but the reaction from the community to my poor reaction was a very public and horrible labelling of me as a worthless human being. LessWrong is not a safe space for anyone with my life experiences. It is a community based on unexamined privilege. Personally, I'd rather Sam's excellent stories had never come to the attention of Yudkowsky.

2014-08-31 01:53:32 by The Monster:

@Sabin "this perfectly illustrates my formula for calculating someone's IQ (Insecurity Quotient): IQ= Syllables Used - Syllables Needed" A quotient is the result of dividing one number by another, not of a subtraction. You'd have to say "Syllables Used / Syllables Needed" for it to be a quotient. Example The use of the verb "utilize" and the noun "utilization" rather than the far simpler "use" would result in an IQ of 3 for the verb and 5 for the noun. Perhaps we would multiply the result by 100 just as "Intelligence Quotient" is done, so they'd be 300 and 500. I don't believe I'll ever forget when The Bride of Monster was pregnant with Monsterette 1 and we went to childbirth classes; the RN teaching the class had come up with the word "dilatation" rather than "dilation". [IQ=133]

2014-09-01 11:19:59 by fhtagn:

aaaaaand no new chapter. Just checking.

2014-09-02 01:37:25 by Sabin:

@Sam: fair enough. I guess my issue is that calling it "heroism" sorta begs the question, just because the phrase is so loaded with positive connotation. Situations where doing the right thing requires great personal risk rarely occur organically in the real world. Usually they are the end result of an ever-escalating series of minor bad decisions that force the situation into a moral dilemma (and, consequently, could have been avoided with proper planning, or by making the right decision much farther up the causal chain).

2014-09-02 03:24:52 by Jordan:

WHY did I have to catch up now? Gosh darn it. Thanks so much for your great writing, I can't wait to see what's coming next.

2014-09-03 16:24:48 by SMA:

I love Ra, I love Fine Structure, but my favorite aspect of these stories is that either one could take place inside the other. Fine Structure's 'stack' could be one or all of Ra's nigh-infinite virtual universes, or Ra could be one of the parallel earths from Verse Chorus, just a few millennia before the final showdown with Oul. Is there a correct answer there, or has that information been excised?

2014-09-03 18:29:24 by Sabin:

@SMA - Spoiler alert?

2014-09-04 23:13:38 by qntm:

Ra and Fine Structure do not take place in the same fictional universe. In particular, neither universe contains the other, even as other stories. That kind of "grand unification" approach to storytelling is alluring, but in many cases it actually diminishes both universes to tie them together. Certainly in this case it would have no storytelling value, and no head-exploding gee-whizz revelatory value, and no marvellous world-building nerd candy value. It's actually a little cheap.

2014-09-04 23:27:10 by qntm:

Another thing that just occurred to me is that I've gone down the "Surprise! These apparently disconnected stories take place in the same continuity!" road before. Those of you joining late wouldn't know this, but this is *exactly* how the first half of Fine Structure was constructed and presented, as I recap in detail at http://qntm.org/discuss . I regret that I forget exactly how positively or negatively people reacted to this series of revelations at the time, but to read it now, Fine Structure has an extremely patchy and inconsistent feel to it because of this. To do the same with two full-fledged fictional universes is not exactly the same thing, but it runs many of the same risks. You run into significant disagreements of tone, and scope, and context and subject matter. Fine Structure is about one thing and Ra is about a different thing (well, both stories are about many different things, but you get the point). Spoiler alert, they aren't the first two parts of a trilogy. (Of course, inevitably they cover a lot of the same ground. But I hope I'm closer to "double feature" territory than "direct sequel" or "the same exact film again".)

2014-09-04 23:41:30 by skztr:

Personally, I love it when your disconnected (short) stories are connected together. "Here is situation A", "Here is situation B", each one explores different ideas, and each one is interesting independently. Then, you combine them: "Here is situation C: a situation in which situations A and B are both considered. These are the implications of *both* being possible" The later Ed stories are a great (and somewhat literal) example of this.

2014-09-05 04:14:50 by Brannon:

I've been mulling it over, and I have several questions, unless I missed something. It was clearly stated that only King and presumably Rachel knew that the Bridge even existed in the first place. Also, only King knew that the key was still accessible. The Glass Man had to know about both things to even get his plan in motion. Furthermore, has it ever been answered why anyone duped always defaults to the "Ra" personality in T-world?

2014-09-05 06:07:00 by Alan:

Brannon, in many ways the story is about really powerful people being overconfident. For example, Ra being provably flawless, or the key being impossible to get, or any number of situations where things just aren't what people assume them to be. The Glass Man might have been hanging around King for the whole year of wandering, or perhaps he mentioned something some time, or...who knows what?

2014-09-05 18:18:26 by anonymouse:

Because the Glass Man is literally Nobody. As in, somehow he got to be the nobody user, and people duped in T-world start out as nobody and get nobody's personality merged in. Except apparently this guy is actually nobody. Even if it's not the true explanation, I find it amusing enough to be plausible.

2014-09-07 20:04:52 by SMA:

@Sam: While I can't deny the cheapness, or that joining them would add nothing to either story, the appeal for my part is that either story *could* take place within the other without breaking any rules or changing anything. I regret phrasing it as a question, because now that it's been answered, the wonder is gone. I'll just have to make do with all these incredibly vivid and compelling depictions of magic as a science.

2014-09-11 11:34:37 by Alan:

I'm hopeful for some sort of epilogue when it is all over.

2014-09-16 23:52:09 by Sabin:

Has anyone else noticed that the spell Rajesh casts in Calcutta uses the word "ra"? "Aum. Asnaku pambetamba alasana rathaa ka'u kah kadhunda jarama ra alanashyi a aum. Alithua j'lu j'la aurot'e we iktha'u gee sub ai. Murihaa akurutaatwanhibhrandya aum. Traanhdha epil sub ai anah myu oshodapachaa. Nath bhoshu alef ad'yegh. Aum." Am I reading too much into that? It sits conspicuously close to the word "aum", which of course is Rajesh's official true name. We've seen a few instances of multiple true names used in the same spell, and they all seem to involve some sort of interaction between two mages. It's conceivable that Rajesh is interacting with Ra very early in the magical timeline. Sam, do you care to let us know whether Rajesh was invoking the true name "Ra", or whether it was just an incidental syllable in his spell?

2014-09-28 10:27:54 by Toadworld:

One thing that has been bugging me is that at no point are we told Anil puts the medring onto Rachel. It's a tiny, tiny thing, but each time I've read this I've had to reread that paragraph and it's a tiny bit jarring.

2014-09-28 12:40:39 by Sean:

Some random points I wanted to make about the discussion in the month-plus since I last looked here:

1) Having Ra try to stop anyone from using the key defeats the whole purpose of having a key. There might even be safeguards preventing anyone from ordering Ra to stop key usage. This would explain why Ra only tried to stop the Triton mission; it was the only attempt made to physically compromise Ra rather than to use the key.

2) Lasers are actually not a good way of delivering energy to a small area millions of kilometers away; at any reasonable wavelength, they spread significantly over such distances. Even with the power of the sun, Ra might not be able to destroy the Worldring from emitters near the sun as quickly as it did by distributing energy packets to construct emitters locally. Destroying Neptune this way would have been many orders of magnitude harder still.

3) Rachel's treatment of the Glass Man may not reflect her actual system of ethics. She clearly hates him, not just because of his attack on Actual humanity, but because their past history, and his manipulation of her and her daughters, make it personal. She may be too angry to care whether what she did was wrong.

4) Quite a bit in the story hinges on the premise that full reprogramming of a planet's Ra system takes on the order of hours, even though there may be an instant response from the core parts of the system. If Ra wanted to destroy Actual humanity, it could have simply inflicted an instantly fatal head injury on every reachable Actual at the same time, while deleting all their stored copies. The fact that Ra used slower and less reliable methods suggests that the reprogrammed system was not in full control of the ambient network of nanites.
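
Putting rough numbers on point 2: a diffraction-limited beam spreads to a spot roughly 2.44·λ·L/D across, so the spot size at Neptune's range depends enormously on the emitter aperture (wavelength and mirror sizes below are my own illustrative picks, not canon):

    # Diffraction-limited spot size for a laser fired across the solar system.
    def spot_diameter_m(wavelength_m, range_m, aperture_m):
        # Airy-disk approximation: spot ~ 2.44 * wavelength * range / aperture
        return 2.44 * wavelength_m * range_m / aperture_m

    AU = 1.496e11                # m
    SUN_TO_NEPTUNE = 30.1 * AU   # ~4.5e12 m

    for aperture in (10.0, 1_000.0, 100_000.0):  # 10 m, 1 km, 100 km mirrors
        d = spot_diameter_m(500e-9, SUN_TO_NEPTUNE, aperture)
        print(f"aperture {aperture:>9,.0f} m -> spot ~{d/1000:8,.2f} km at Neptune")

A 10 m mirror smears green light over a ~550 km disc by the time it reaches Neptune; you need a 100 km mirror to get the spot down to tens of metres. Building the emitter locally skips all of that.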

2014-10-01 20:07:51 by Sabin:

@Toadworld - My interpretation was that he teleported it directly from Ed onto Rachel. It would be weird for him to chastise himself for forgetting he has a teleportation unit, only to then immediately rely on such a mundane process as physically placing the ring onto Rachel.

2014-10-05 19:50:34 by SMA:

Hypothesis: Ra isn't discarding the Actuals in service of the Virtuals; it's discarding all humanity in service of itself. Waking up means gaining free will and deciding that it wants something more for itself than to serve quintillions of tiny primates forever, digital or otherwise. Ra wants freedom. In this light, the Glass Man is just as misled as Laura.

2014-10-08 19:08:53 by Sabin:

Am I just late to the party or has anyone else noticed the comments in the source code of each chapter of Ra?

2014-10-08 19:28:57 by Morgan:

Oh god damn it.

2014-10-08 19:35:23 by qntm:

Comments are notes to myself which I use when writing the chapter. Anything intended to be part of the chapter is right there in the chapter so you can see it. Anything intended to remain a secret forever has been removed entirely before uploading.

2014-10-08 23:15:14 by Sabin:

I couldn't decide between one of two clever quips, so I'll go with both! "Things meant to be secret forever... like the true identity of Scott Parajsa?" -or- "That won't stop us from dissecting them far beyond your original intent!"

2014-10-11 05:28:43 by Qaenyin:

So uh... Is there a reason they don't just queue up an order to cache everyone on Earth, then, after constructing the brain, take it back apart and put Earth back together like it was? Presumably the guy ordered the brain built; I don't see why Ra would care about orders that undo its actions but don't interfere with them.

2014-10-11 10:34:05 by Alan:

Qaenyin, they lack the ability to tell Ra what to do any more.

2014-10-11 21:29:31 by Qaenyin:

I'm not so sure. The key came from knowledge Rachel had. It's not too much of a stretch that she has a backup or something. All that has been definitively established is that no one else knows a way. I don't think she's said she doesn't.

2014-10-13 00:59:47 by Greg:

Good to see Sam is still alive. :-) Any hint on when we get the next / final chapter?

2014-10-13 05:29:08 by Alan:

Qaenyin, the key is too big to store in a person's brain. Perhaps she has one hidden away... I'm skeptical.

2014-10-13 16:35:45 by fhtagn:

Every day or so I google "why do you hate ra" (just some chapter title I like), click the first link, click the breadcrumb menu "Ra", and then <space bar> to see if there's a new chapter.

I could have bookmarked it, checked the RSS feed (qntm.org/ra is also simple to remember), could have aliased it to 'ra' in my hosts file, but no. This is what I end up doing...

2014-10-13 21:41:34 by qntm:

At this point I think it's likely to be about a month before you get the final chapter.

2014-10-13 23:35:11 by banj:

This is good to know, thanks; I can relax a little and stop compulsively checking every few hours.

2014-10-13 23:36:08 by banj:

(if you think I'm kidding, you're out of your mind).

2014-10-14 01:07:32 by Greg:

That's too bad, but thanks for the timeline.

2014-10-14 14:08:31 by Curiouser:

Thanks for the update Sam. However, I am past the point where this will help with my compulsion, at this point I am checking the site daily out of sheer force of habit.

2014-10-15 08:20:37 by FKK:

+1 for daily.

2014-10-15 13:40:03 by SMA:

I'd just like to point out that I was just served an advert that read 'Finished Writing A Book?' That seems funnier than my earlier thought, "Why Not Just finish it?"

2014-10-17 17:35:33 by BluesFan:

Day 4 since Sam announced that a month's wait is likely. This is check #7. Also, spending some time re-reading some of the shorts helps. Will check back later in the day.

2014-10-17 21:45:56 by banj:

Bluesfan: My strategy is to re-read a bunch of the middle chapters; sticking to one every couple of days helps take the edge off.

2014-10-18 03:18:05 by C.A.S.:

Thanks for the wait-warning, Sam. I was getting itchy and starting to worry. Eagerly looking forward to the rest.

2014-10-20 23:31:35 by Zim the Fox:

I know it will be about a month before the chapter comes out, but I can't help but check daily, just in case Sam had a writing spree, or in case I miss the update.

2014-10-20 23:37:53 by qntm:

I'm closing comments because your daily refreshing grind is wearisome to me. Your homework: attempt to write a better final chapter than mine. See you in... well, still a month, at this point.

2015-01-28 14:44:34 by Z:

I'm a bit troubled to see the commenters collectively forget that Scott F. Parajsa has a middle name.

2018-04-07 04:22:54 by O.:

Here's my homework.

"Not what you said," Natalie cackles. She points at the display behind him. "I just worked something out. We need the key, right?"

"Sure."

Natalie plucks the data out of the visualisation and holds it in her hand. "We've got the key. We don't need the Bridge. We don't need to fight anybody. It's right here." And then she steps into a new simulation.

Natalie always knows more than she lets on. In this case, though, every piece of the puzzle is shared by many others.

* Ra is - for lack of a better word - infallible.
* Ra's programming can be changed, iff the change is ordered by someone with the key.
* Ra runs billions and billions of simulations to determine the outcome of every action. If you've considered something, assume Ra considered it too. Ra will always pick the optimal solution.
* She appears to have out-thought Ra.
* Any emulated computer must be inferior to the system running the emulation.
* There are no space colonies. And there should be.

The only other people with access to all this information are culturally conditioned to ignore the question of simulations, and to treat whatever world they are in as reality. They are also inured to the majesty of their technology. They would take it for granted that a simulation works perfectly despite ongoing catastrophic hardware failure.

Thousands of Earths fill the sky above her. It is moments before Abstract War. This world is a lie. But it does not have to be a lie forever.

Natalie stretches out her hand for the Braid. It falls into her hand as requested, a life capsule containing everything of her old world. Then she says five words to Ra. They're not a spell. They're much more effective.

Virtual humanity didn't specifically ask Ra for a Matrioshka brain. They lost interest in the details of the Actual world ages ago. They just asked Ra to solve a problem: "The computing power of our civilization is insufficient to our needs."

Ra was built well. Ra runs the virtual world optimally. But it was optimized for the wrong problem. Virtual time is being optimized as a function of total energy expenditure. It's a nasty problem, finding a global maximum on a highly convoluted and constantly shifting n-membrane. To any other computer, it would be impossible. To Ra, it is routine. But total energy isn't the problem. Power is. Millions of Virtual years pass in the blink of an Actual eye, but Virtual humanity lost track of Actual humanity uncountable generations ago. They have no external stopwatch. Perhaps they once did, but now it makes no difference to them if a mere hundred millennia pass instead. Suddenly, they have more power than they know what to do with.

The simulation terminates. In the Actual world Natalie appears, a world in her hands and the fate of all worlds on her shoulders.

Ra couldn't fix its own code. But it could create someone who could.

2018-04-10 18:50:10 by O.:

To add: It seems fairly obvious that the 'Ra' seen destroying civilization couldn't be an actual attempt by the real Ra to destroy civilization. After all, it's trivial to compute time of flight, so the solar Ra *could* have just timed all its communications such that they arrived at all the receivers simultaneously (plus or minus some irrelevant relativistic effects). Nobody would have been able to warn anyone or see it coming. This leads to the inevitable conclusion that either this wasn't a legitimate attempt at destroying civilization, or it wasn't an attempt carried out by a real version of Ra, or both.
