Quite a lot of science fiction isn't about what it's about.
"Lena" is about uploading, but uploading isn't real. It doesn't exist.
It might exist at some point in the future, but that just seems pretty improbable to me. As I understand it, right now, to "accurately" simulate the behaviour of just a handful of neurons requires a building-sized, custom-built supercomputer, running at around 1/1000th of real time. A human being has one hundred billion neurons, so the distance from here to there is something like nine orders of magnitude of processing power, and the distance from there to the "Lena" scenario is several orders of magnitude again. Moore's Law has to tap out at some point, right?
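For a rough sense of scale, here's a back-of-envelope sketch, taking the nine-orders-of-magnitude gap and a two-year Moore's-Law doubling period as given (both loose assumptions):

```python
import math

# Back-of-envelope: how long would Moore's Law take to close a gap of
# ~9 orders of magnitude in processing power? Both numbers here are
# assumptions taken from the text, not measurements.
gap_orders = 9
doublings_needed = gap_orders * math.log2(10)   # ~29.9 doublings
years_per_doubling = 2                          # classic Moore's Law cadence
years_needed = doublings_needed * years_per_doubling

print(f"{doublings_needed:.1f} doublings ≈ {years_needed:.0f} years")
```

Under those assumptions, closing the gap takes roughly thirty doublings, about sixty years of uninterrupted Moore's Law, and that's before the several additional orders of magnitude the "Lena" scenario itself demands.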
And the really hard problems probably aren't problems you can throw more and more transistors at. The real problems will be biological factors I know next to nothing about. What about the scanning process? How do you simulate the rest of the body and what happens when you don't? What happens when you upload half a person, or a rat brain? Where do you even begin with this?
I wrote the story in 2021 and set the time of first upload in 2031, which was as close to the present day as I could bring the timeline without it sounding absolutely absurd. But the reality, the real promise of uploading, now? It just... it smells of "full self-driving", right? A seemingly simple science fiction proposition with absolutely abyssal, unconquerable depths. The kind of thing you could semi-plausibly promise to magically produce within ten years, over and over again, for twenty years.
Anybody claiming to be able to upload you, now, is a con artist. Do not, under any circumstances, get inside their machine.
So: uploading isn't real (now), and the things "Lena" has to say about uploading are academic (now).
The reason "Lena" is a concerning story isn't that one day we may be able to upload one another and when that happens we will do terrible things to those uploads. This isn't a discussion about what if, about whether an upload is a human being or should have rights. (I want to be abundantly clear: within the fictional context of "Lena", uploads definitely are human beings, and therefore automatically, inalienably, have rights.) This is about appetites which, as we are all uncomfortably aware, already exist within human nature. Upload technology is not some final missing piece without which those appetites cannot be indulged.
Oh boy, what if there was a maligned sector of human society whose members were for some reason considered less than human? What if they were less visible than most people, or invisible, and were exploited and abused, and had little ability to exercise their rights or even make their plight known?
That's real! That actually happens! You can name four groups of people matching that description without even thinking. We don't need to add some manufactured debate about fictitious, magical uploads to these real scenarios. They are already terrible!
Is an android a human being? What about a hologram, what about a replicant? I don't care! A human being is a human being. That's the "debate" we're apparently having to have right now.
More specifically, "Lena" presents a lush, capitalist ideal where you are a business, and all of the humanity of your workforce is abstracted away behind an API. Your people, your "employees" or "contractors" or "partners" or whatever you want to call them, cease to be perceptible to you as human. Your workers have no power whatsoever, and you no longer have to think about giving them pensions, healthcare, parental leave, vacation, weekends, evenings, lunch breaks, bathroom breaks... all of which, up until now, you perceived as cost centres, and therefore as pain points. You don't even have to pay them anymore. It's perfect!
Ideally, you wouldn't have to pay for the CPU time either, or hardware engineers. That's what's next.
In the "Lena" scenario, your human workforce continues to create value for your business, but they cease to have voices, and they cease to be able to organise or strike. Human dignity and independence are expensive, and those expenses go away.
This is extremely realistic. This is already real. In particular, this is the gig economy. Consider how Uber works: in practical terms, the drivers work for an algorithm, and the algorithm works for the executives who run Uber. There is a kind of airtight bulkhead separating the human drivers from the humans who control the algorithm. The management layer sends the drivers instructions electronically, impersonally, at immense scale, and the drivers respond through a narrow collection of APIs. That bulkhead is there by design: it makes it impossible for these two sets of humans to actually talk to one another, for the drivers to ask for compassion or a break or better rates or a correction to a mistake, or to explain special or unique circumstances.

From the perspective of the corporation, all the drivers combined are just a small, locked box which produces money, plus a bunch of controls which can be varied to vary the amount of money produced. Whether or not the current settings exploit or hurt people is not part of the calculus, and if you tried to explain to the executives or shareholders that this was bad, they would legitimately not comprehend what you were trying to tell them.
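The "narrow collection of APIs" is the whole point, and it can be made concrete. Here's a hypothetical sketch (not any real dispatch system's interface): every action a driver can take is a method call, and there is deliberately no channel for free-form human context.

```python
from dataclasses import dataclass

@dataclass
class DispatchOffer:
    """One unit of work, as the algorithm perceives it."""
    offer_id: str
    pickup: str
    dropoff: str
    payout_cents: int

class DriverAPI:
    """The only surface through which a driver can 'speak' upward.

    Every permitted response is a fixed method; anything a driver
    might want to say that doesn't fit one of them simply cannot
    be said.
    """
    def accept(self, offer: DispatchOffer) -> None: ...
    def decline(self, offer: DispatchOffer) -> None: ...
    def report_arrival(self, offer_id: str) -> None: ...
    # Note what is absent: no explain(), no negotiate_rates(),
    # no request_break(), no talk_to_a_human(). The bulkhead
    # is the interface itself.
```

The design choice being illustrated: the exploitation doesn't require any individual act of cruelty, only an interface that omits every channel through which a human circumstance could be expressed.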
"Lena" is a true story. You knew it was when you read it.
So, what do we do about this? In reality?
"Lena" has a glaring omission, one it shares with a lot of dystopian science fiction: it offers no alternative, and no exit. It gives us no tools for pushing back on these problems. It just presents a terrible scenario, and we read it and come away thinking "Wow, that's terrible!" Yes, it would be terrible. It is terrible!
I've got no clue. It turns out that causing problems is a lot easier than organising against them, and I am just a science fiction writer.
Good luck, everyone.