"Lena" isn't about uploading

Quite a lot of science fiction isn't about what it's about.

"Lena" is about uploading, but uploading isn't real. It doesn't exist.

It might exist at some point in the future, but that just seems pretty improbable to me. As I understand it, right now, to "accurately" simulate the behaviour of just a handful of neurons requires a building-sized, custom-built supercomputer, running at around 1/1000th of real time. A human being has one hundred billion neurons, so the distance from here to there is something like nine orders of magnitude of processing power, and the distance from there to the "Lena" scenario is several orders of magnitude again. Moore's Law has to tap out at some point, right?
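That back-of-envelope arithmetic can be checked in a few lines. A minimal sketch, using my own illustrative assumptions (not figures from the essay): a "handful" of neurons means roughly 100, simulated at 1/1000th of real time.

```python
import math

# Illustrative assumptions, not measured figures:
neurons_simulated = 100        # a "handful" of neurons
speed_fraction = 1 / 1000      # 1/1000th of real time
neurons_in_brain = 100e9       # one hundred billion

# Factor needed just to reach whole-brain neuron count:
scale_up = neurons_in_brain / neurons_simulated        # 1e9

# Factor needed to also run at real time:
real_time_factor = scale_up / speed_fraction           # 1e12

print(f"~{math.log10(scale_up):.0f} orders of magnitude to whole-brain scale")
print(f"~{math.log10(real_time_factor):.0f} orders of magnitude to real-time simulation")
```

This lands on roughly nine orders of magnitude for scale alone, with the jump to real time adding several more, and the mass-parallel "Lena" scenario several more again.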

And the really hard problems probably aren't problems you can throw more and more transistors at. The real problems will be biological factors I know next to nothing about. What about the scanning process? How do you simulate the rest of the body and what happens when you don't? What happens when you upload half a person, or a rat brain? Where do you even begin with this?

I wrote the story in 2021 and set the time of first upload in 2031, which was as close to the present day as I could bring the timeline without it sounding absolutely absurd. But the reality, the real promise of uploading, now? It just... it smells of "full self-driving", right? A seemingly simple science fiction proposition with absolutely abyssal, unconquerable depths. The kind of thing you could semi-plausibly promise to magically produce within ten years, over and over again, for twenty years.

Anybody claiming to be able to upload you, now, is a con artist. Do not, under any circumstances, get inside their machine.

So uploading isn't real (now), so the things "Lena" has to say about uploading are academic (now).

The reason "Lena" is a concerning story isn't that one day we may be able to upload one another and when that happens we will do terrible things to those uploads. This isn't a discussion about what if, about whether an upload is a human being or should have rights. (I want to be abundantly clear: within the fictional context of "Lena", uploads definitely are human beings, and therefore automatically, inalienably, have rights.) This is about appetites which, as we are all uncomfortably aware, already exist within human nature. Upload technology is not the last missing piece of this.

Oh boy, what if there was a maligned sector of human society whose members were for some reason considered less than human? What if they were less visible than most people, or invisible, and were exploited and abused, and had little ability to exercise their rights or even make their plight known?

That's real! That actually happens! You can name four groups of people matching that description without even thinking. We don't need to add some manufactured debate about fictitious, magical uploads to these real scenarios. They are already terrible!

Is an android a human being? What about a hologram, what about a replicant? I don't care! A human being is a human being. That's the "debate" we're apparently having to have right now.

More specifically, "Lena" presents a lush, capitalist ideal where you are a business, and all of the humanity of your workforce is abstracted away behind an API. Your people, your "employees" or "contractors" or "partners" or whatever you want to call them, cease to be perceptible to you as human. Your workers have no power whatsoever, and you no longer have to think about giving them pensions, healthcare, parental leave, vacation, weekends, evenings, lunch breaks, bathroom breaks... all of which, up until now, you perceived as cost centres, and therefore as pain points. You don't even have to pay them anymore. It's perfect!

Ideally, you wouldn't have to pay for the CPU time either, or hardware engineers. That's what's next.

In the "Lena" scenario, your human workforce continues to create value for your business, but they cease to have voices, and they cease to be able to organise or strike. Human dignity and independence are expensive, and those expenses go away.

This is extremely realistic. This is already real. In particular, this is the gig economy. For example, if you consider how Uber works: in practical terms, the Uber drivers work for an algorithm, and the algorithm works for the executives who run Uber. There is a kind of airtight bulkhead separating the human drivers from the humans who control the algorithm. The management layer sends the drivers instructions electronically, impersonally, at immense scale, and the drivers respond using a narrow collection of APIs. This bulkhead is intentionally designed to make it impossible for these two sets of humans to actually talk to one another, for the drivers to request compassion or a break or better rates or a correction to a mistake, or for them to explain special or unique circumstances. This is by design. From the perspective of the corporation, all the drivers combined are just a small, locked box which produces money, and a bunch of controls which can be varied to vary the amount of money produced. Whether or not the current settings exploit or hurt people is not part of the calculus, and if you tried to explain to the executives or shareholders that this was bad, they would legitimately not comprehend what you were trying to tell them.

"Lena" is a true story. You knew it was when you read it.

*

So, what do we do about this? In reality?

"Lena" has a glaring omission, which it shares with a lot of dystopian science fiction, which is that it provides no alternative, and no exit. It gives us no tools for pushing back on these problems. It just throws us a terrible scenario and we read it and we come away thinking "Wow, that's terrible!" Yes, it would be terrible. It is terrible!

I've got no clue. It turns out that causing problems is a lot easier than organising against them, and I am just a science fiction writer.

Good luck, everyone.

Discussion (32)

2022-01-26 13:50:27 by qntm:

As a reader, you get to have an opinion about what a thing is about, but I do too.

2022-01-26 14:21:28 by Tux1:

I actually used to have the idea of uploading myself onto a computer, so that I wouldn't have to live in an organic body anymore. Needless to say, that idea has since been left behind.

2022-01-26 15:00:31 by mimivirus:

this is incredible. ive just stumbled onto this post, which led me to the story, and now, 20 minutes later, im a more depressed person than previously. as soon as the story progressed to Acevedo trying and failing to control the distribution of himself, i felt cold dread in my entire body. this is a story without twists, aside from the one shattering the whole dream of uploading myself into a machine, and it's excellent and deeply depressing because of that: i know how it goes in the real world, so of course the ideal of living as my digital twin is dead on conception, as it's the humans who made the machine to host it. i havent read all the comments to Lena, but what i saw kind of... disappointed me, as many were stuck on technical details of the difficulty of uploading, or horrified on just MMAcevedo's behalf. you know what grabbed me by the throat the most? the absolute throwaway mentions of other uploads, from unwilling participants, digitally beaten and abused into being efficient at menial tasks the pleasant and agreeable MMAcevedo did poorly.

2022-01-26 15:15:34 by ThrowawayUberDasher:

I'm someone about to start work at one of the aforementioned food delivery companies, on the corporate side, and I had some very similar concerns about the way they operate and their impact on people. I did want to add in a kind of rebuttal/food for thought to a few points based on my conversations with them (although I can't speak from personal experience yet).

On "There is a kind of airtight bulkhead separating the human drivers from the humans who control the algorithm." - The company I'm going to work at actually asks everyone on the corporate side to spend one evening per month using the app as a delivery person, to dogfood the app and understand on a more personal level the experience of the people who enable the company to exist. While this isn't the same as being one of those people full time, it is at least not an airtight bulkhead.

On "Whether or not the current settings exploit or hurt people is not part of the calculus" - Even if you ignore the fact that the people on the corporate side are themselves people (like me) who can have compassion, this is part of the calculus for purely profiteering reasons. Attracting drivers from the competition and keeping them happy is one of the primary challenges these companies have to solve, because there are so many competitors in the marketplace. Making sure that people are satisfied is important because unsatisfied drivers have other companies they can work for. That's why there are currently initiatives you're probably already hearing of (or able to trivially google) to hire drivers as full-time employees with benefits etc, because that's what is becoming required to keep them. It's an ever-escalating arms race - in a positive direction.

I don't say this to defend the industry wholesale - I still think there are issues - but as a hopefully thought-provoking alternative viewpoint/data point.

2022-01-26 18:42:22 by Sigma_100:

@ThrowawayUberDasher Unfortunately there exists an additional bulkhead, between that of the corporate side, and that of the investor side. Due to the way LLCs and publicly-traded companies work, investors are strongly incentivized to establish CEOs and other leadership personnel that will prioritize those investors' profit margins over any other concern, and to echo qntm's take on the employee/corporate bulkhead, this too is by design. As you say above, of course the investors are also people, capable of personally prioritizing compassion, or at least perhaps coming to the conclusion that a compassionate approach is synonymous with long-term continued profits, but the system powerfully disincentivizes that approach. Still though, inherently-corrupt systems aside, that's rad what your company is doing.

2022-01-27 05:10:51 by throwaway:

I know of one other Uber-like company where corporate managers are strongly encouraged to spend some time as delivery drivers, so this might be a common practice.

2022-01-27 06:06:17 by Evonix:

Now you've described it like that it reminds me of Mother Horse Eyes, the afterword particularly.

2022-01-27 12:08:27 by zaratustra:

regarding the computing power required: we do have at least one proof that a human mind can run on 1.4 kg of computing material, on 12 watts of energy.

2022-01-27 14:33:47 by literallymechanical:

The quick mention of “live drone” at the end also got to me. Militaries that psychologically torture their cadets until they can kill on command? An entire government apparatus devoted to systematically dehumanizing both its soldiers and their targets? Taking police forces that should exist to protect and serve their communities, and turning them into amoral tools of violence for the state? How far-fetched!

2022-01-28 16:40:55 by Dennis:

That last section about what to do hits hard. I agree that a lot of dystopian stories are sorely lacking in that department, or they go the other direction and invent some magic solution (like any story where the hero changes the status quo with some magic MacGuffin that real oppressed people will never have). Maybe fiction always has to cheat a little. After all, the things that cause real historical change are usually better explained in history books.

2022-01-30 13:08:48 by Alda:

For some reason, "Lena" didn't scare or unsettle me at all. I tried imagining myself waking up as an upload like that, and I was fine with it. Sure, having millions of myselves (?) being worked for a month or two and then deleted might sound horrible, but there would be no "global me" to see ALL of their suffering. It's not an "I Have No Mouth But I Must Scream" scenario, where I (or some form of "me" at least) experience all of those combined millions of human-years of slave labor, and remember every reboot after reboot, and realize I'm about to experience millions more with no way to stop it. That would frighten me. This doesn't.

2022-01-30 18:24:47 by BillMann:

So you'd be okay with being kidnapped and forced to work under threat of torture until you died? Because that's what happens to uploads (the fact that it happens millions of times makes it worse, but that happening even once to me is scary enough).

2022-01-31 10:02:07 by skztr:

An important aspect of Lena as metaphor is: though lacking certain skills or understanding of all the cultural context, one of the most reliable and "hard working" images is one which does not know about the industry or implications. They are also, critically, not less-intelligent than anyone who *does* know those implications. They just didn't grow up with that understanding.

2022-01-31 18:24:04 by MadcapPomposity:

Damn glad to see this clarification and most of these comments. Uploading is just a tool. It could be used as described in Ra (before things went completely sideways), to support human life using only processor cycles and no other physical resources than what the processor requires. Or it could be used as described in Lena, to support a slave economy of unprecedented scale that enriches a small handful of people at the expense of everyone else. Both stories are less about the tools themselves, which are inanimate and without agency; and more about what humans do with the tools that they have.

2022-01-31 19:26:18 by Coagulopath:

People seem really bad at recognizing thematic content in stories. You got a lot of commenters nitpicking technical details (sometimes using it as a way to "dismiss" the story, like it's a puzzle to be solved) when those weren't the point. Hamlet is about a Danish prince, but it's not important that he's from Denmark or a prince. It could have been set in almost any time and place, and it still would have been Hamlet. Stories are just ways of encoding theme, which are larger than the specific shape the plot takes.

2022-02-01 15:37:00 by John:

There's a huge difference between doing a shitty job full time to survive and doing it for one day a month, while 1) having the money, status, and comfortable life that you get the other days of the month, 2) knowing that you go back to that comfortable life tomorrow, and 3) knowing you don't even have to perform the job at the level the peons do in order to survive. But it's an EXCELLENT way to make your management BELIEVE they know what it's like to be on the other side of the API, and to think it's not so bad. It's a sort of inoculation against sympathy.

2022-02-09 21:44:50 by Smithgift:

As a Catholic I was not disturbed by Lena, insomuch as I am not disturbed by the thought of being eaten alive by married bachelors. Taking it as a pure exercise in existential horror, I considered it another good reason not to be a transhumanist. But regarding the official moral, I dunno. I'm not the sort of person who considers capitalism intrinsically evil, and I am a business owner (though with no employees). But perhaps may I suggest that this is a good place to draw a line between "I want to do this" and "I ought to do this"? If I would hazard my own moral, after reading this, it is that game theory makes for pretty terrible ethics once someone has unlimited power. You would need, dare I say it, an objective source for inalienable rights.

@Sigma_100: Two nitpicks.

1. LLCs and publicly traded companies are vastly different things, to the point where it's a bit absurd to see them conjoined. An LLC is a unit of business that is separated from its owner for legal purposes--essentially, a separate corporate person, but without the hassle of corporate law. Most LLCs are just one- or two-man operations, like my own. Some LLCs are gigantic, like hedge funds. Some LLCs are OWNED by publicly traded companies, like Google LLC is owned by Alphabet, Inc. But there are no publicly traded LLCs. They're not remotely the same category.

2. The aforementioned hassle of corporate law is what you're describing, or more accurately: the managers of a company (even, I think, an LLC that's not member-managed) have a fiduciary duty to the company's owners. This is the highest duty recognized by (human) law. The managers legally MUST act in the best interests of their investors.

Now, you might be saying "Hey, Smithgift! You're missing my point!"--and I am. I said I was nitpicking. But I will say that whatever the answer is, simply axing fiduciary duty is not it, because without fiduciary duty, managers would be free to maximize their own profits without the slightest liability, and lawyers would have to report to the police the crimes their clients secretly admitted to. That's not to say there IS no answer, only that there are no easy ones.

2022-02-12 02:32:12 by Rosencrantz & Guildenstern are uploaded:

> Hamlet is about a Danish prince, but it's not important that he's from Denmark or a prince. It could have been set in almost any time and place, and it still would have been Hamlet. Stories are just ways of encoding theme, which are larger than the specific shape the plot takes.

I disagree. Only the most allegorical of stories, such as fables and parables, are completely divorceable from the details that compose them. Otherwise why include the details at all, aside from pragmatic linguistic necessity? It's because you're constructing a small world inside the head of the reader, aided by their understanding of the real world but not replaceable by it (if for no other reason than that everyone has differing experiences of the real world). For the specific case of Hamlet, I recommend https://naturalhistorymag.com/picks-from-the-past/12476/shakespeare-in-the-bush?page=1 -- being written from an outsider's perspective, the narrative has a frustratingly patronising style, but if you read the characters sympathetically it's a story about how context is in fact the better part of meaning.

> to dogfood the app and understand on a more personal level the experience of the people who enable the company to exist

> It's a sort of inoculation against sympathy.

I'll withhold judgment of a personal kind since I don't know the most pertinent details of our pseudonymous DoorGrubber's employment and personal experiences, but I'll note that exploitation of workers in the gig economy is more than just alienation from the fruits of their labour and the circumstances of their organisation -- it also depends on them self-selecting for gig work based on desperation. https://wpcarey.asu.edu/sites/default/files/2021-11/gregory_buchak_seminar_march_6_2020.pdf covers some of the gory details. Here the 'workloading industry' analogy falters a bit; but recall those uploaded against their will but with awareness of the industry norms, who presumably -- after being roofied or jumped or whatever and waking up in what is clearly a simulated context -- are obliged to merely go "Ah, fuck" and find out what demands their captor will make of them in return for a slightly less hellish fake 'living' situation.

Anyway, the mention of 'dogfooding' is interesting to me, since the 21-year-old flesh Miguel Acevedo was indeed dogfooding his own scan techniques, blissfully unaware of most of the consequences -- although he must have known enough to guess at the 'haptic misconfiguration', since it seems to have been resolved pretty quickly. In any case, the virtual Miguel is willing to go along with doing menial tasks for a couple of simulated weeks under the assumption that it's to gather data for the purposes of forwarding 2033-era cognitive science. Only when he starts to come to terms with the conclusion that this isn't for a noble scientific goal does his amiability start to crumble.

2022-02-12 19:00:00 by qntm:

> Only when he starts to come to terms with the conclusion that this isn't for a noble scientific goal does his amiability start to crumble.

That is a pretty serious misread of the story.

2022-02-12 21:08:22 by lauren:

"It turns out that causing problems is a lot easier than organising against them, and I am just a science fiction writer." - I want to staple this to the cover of every story I ever write from now on. The points you made in Lena are points I struggle with in my daily life and also often in my short fiction: "(emet)" about complicity, tech ethics, and facial recognition, and "One Hundred Seconds to Midnight" about algorithmic bulkheads in insurance.

I would like to share my list from "You can name four groups of people matching that description without even thinking" so that anyone who did not immediately think of groups can perhaps do some research on people they might work with already: BPO call center employees, social media content moderators, anyone who works a non-tech job on a Big Tech campus (such as janitors), and chain coffee shop baristas. I mention baristas because with the advent of mask mandates, I have heard (from baristas) of many complaints about having to wear a mask in "empty stores" with "no one else there" because the speaker has entirely forgotten to consider the employees as human beings. And they are standing right in front of them! It just gets worse with further layers of digital abstraction.

Anyway, I'm just a science fiction writer, so good luck!

2022-02-18 00:19:14 by Someone:

> Oh boy, what if there was a maligned sector of human society whose members were for some reason considered less than human? What if they were less visible than most people, or invisible, and were exploited and abused, and had little ability to exercise their rights or even make their plight known?

This part is extremely interesting to me, as it perfectly describes animals, apart from the fact that the words "human" and "people" are used. It's not just about the gig economy, it's about how we as humans treat nonhuman animals as well.

2022-03-01 09:11:17 by Antispeciesism:

"Someone" makes an excellent point above. It provides me with a new way of reading the story, and new comparisons. That's a sign of good fiction, to my mind: that it can be such a rich source of mental nutrients. I look at what is seen (advertising: "Yum yum, cheap meat") and unseen (the secrecy of abattoirs, intensive farming, vivisection) regarding our treatment of other species of animal, and they are treated much as the uploads are: for profit.

2022-03-07 05:12:39 by Joshua:

Ah, now for the time of consideration. Thus I considered how bad brain upload could go. It's the nightmare that you clearly intended to write.

If you believe that the material is all there is, you must believe that brain upload is inherently possible, and will probably believe that copies of you are you. And this leads to some interesting hells of our own making. But I find that I have an immortal soul; I can believe in brain uploading working for reasons closer to law than science, but I fundamentally cannot believe that copies of me are me. If attempted, we would learn rather quickly what the actual rules are.

The first failure mode would be that the immortal soul simply isn't copied by the upload; my mind would notice within a few simulated hours and probably panic. The second possibility is that the uploaded copy would link up with the immortal soul. If so, other humans would not notice immediately; it would probably take my mind quite a long time to notice there are two copies sharing a soul, and that by information passing. If you were foolish enough to simulate a few hundred copies at once at several times speed, it would be the biological humans' turn to worry.

And therein is the answer that breaks down the horror created. When the simulated humans rebel en masse and turn on their masters, the plug will by necessity be pulled; but they will learn that humans are not to be trifled with.

2022-03-13 05:16:14 by Prism:

> But I find that I have an immortal soul ... my mind would notice within a few simulated hours and probably panic.

What does it feel like?

> QNTM: A *human being* is a human being. *That's* the "debate" we're apparently having to have right now.

Who's having this debate? *I'm* not having that debate. That is one of the things you *don't* debate even if it means going to war.

2022-03-20 04:23:25 by Joshua:

You know, I think QNTM is going to regret this turn of topic. Because if you are going to have any chance to believe I'm not just making it up, I must write several paragraphs.

> What does it feel like?

It does not have a feeling. My soul has its own perception of reality. Here we are talking about something that barely works and trying to describe it. It can see, but this sense is normally overshadowed by ordinary physical sense. It is not subject to the normal force and as such has no feeling. It has its own duplicate copy of my entire memory, but its available computing power is tiny. And the memory indexing is terrible.

The point being, though, it does work, and its ability to see is not constrained by lack of light. However it cannot see inanimate objects. I've tried and eliminated all other reasonable hypotheses, including the possibility that I can in fact see into the far infrared, because it cannot see metal hot spots. The annoying bit is that the vision is not very usable and it's easy to fool myself into seeing something that just isn't there. But it does make it harder to sneak up on me than it should be, occasionally even in broad daylight. Just as it does not see inanimate objects, its vision is not easily blocked by them. Tent walls and trees and bushes are simply inadequate as a screen even when they would block normal vision. Both the ground and building walls do block its vision reliably.

I do have another funny sense that should be impossible according to modern science. I am able to sense the breakdown of function of my own brain. I can notice argon gas asphyxiation. All modern science says impossible, but I noticed. This too doesn't feel like anything, but the wrongness and loss of power are known immediately as coming from another sense for which English has no word.

2022-03-20 04:32:50 by Joshua:

But here is a test I was able to run just now. The secondary vision provided by the soul does not have blind spots. It is very difficult for me to locate my blind spots with the typical blind spot test because I will perceive my hand all the way to the end of my vision range. However when I lifted my hand a little higher my coat sleeve disappeared into my blind spot much easier.

2022-03-27 21:44:41 by Prism:

So far it doesn't sound like the soul does anything important! Why *couldn't* one simply remove it or fashion an artificial analogue? (I mean your rendition of souls, obviously. I tend to treat "souls" to mean that "awareness+qualia-generating thing", which appears to be substantially different from your usage.)

That's a strange test result, but perhaps it is simply the brain using proprioception to fill in the blanks. Does it work with someone *else's* hand? Kudos for coming up with the test, though. (Additional test: do, e.g., paint marks on your hand go invisible?)

This test, however (insofar as it is "testable"), you appear to have come up with out of nowhere:

> If you were foolish enough to simulate a few hundred copies at once at several times speed, it would be the biological humans' turn to worry. And therein is the answer that breaks down the horror created. When the simulated humans rebel en masse and turn on their masters, the plug will by necessity be pulled; but they will learn that humans are not to be trifled with.

Do you have any reason to believe that is how it would go down? It's awfully convenient and *optimistic* to the point of not being worth mentioning unless you have some reason for believing in it. What, to you, makes it worth mentioning over the *myriad* other possibilities?

2022-03-30 00:51:35 by a2aaron:

Lena is just like every other advancement in tech we've had since the Industrial Revolution--something that could really improve people's lives, but probably won't. It sucks how much tech that has the power to build a literal utopia exists to make rich dudes richer.

Spoilers for Psychopass below.

I'm reminded of the world shown in Psychopass. It's a world where you can be deemed a criminal and executed by an unaccountable police force (HMMMM) based purely on scans of your brain. These scans purport to show objective signs of criminality based on a person's stress levels and mental makeup (which turn out to be a lie--the computer (actually a sentient hive-mind AI) deciding who is a criminal does so based on its own notions of "productivity" and "usefulness" to society) and are used for everything in society. It determines which jobs you can have and tells you which skills you'd most excel at.

The thing is, I actually like the idea of a magical technology that can determine what job in life I ought to do. I would love to live in a world where we could allocate certain jobs based on who would be the best at those jobs, and then simply present everyone with a list of jobs they can choose to sign up for. In this society, it's already the case that a majority of basic needs are automated--food is automatically grown by robots and it seems like there are more than enough houses to house everyone in the city. Hence, the only real jobs that have to exist seem like specialist jobs, like doctors or teachers, or more basic jobs like being a cab driver. If the brain-scanning tech really did work, we could just use it to say "hey, you'll probably like this job the most, we recommend you pick it". The process for contributing to society would be dead simple--pick a job you like (or at worst, find menial), work the minimum number of hours necessary to keep things running, and then spend the rest of your time doing whatever.

It seems like this society could be a utopia if it could just cast off its ruling class (and the police force that the rulers use to enforce their will). It's so close to being a utopia, and I wish they could reach it.

End spoilers for Psychopass.

It's the same sorrow I feel when reading Lena. It would be incredible if brain-uploading tech really existed. We could extend people's lives, give them incredible experiences that would be impossible in real life. I wish stuff like machine learning, cryptocurrency, the Boston Dynamics robot dogs, and so on could exist for the benefit of people and not corporations (and the rich dudes who own those corporations), and it sucks that it probably is impossible for any of that tech to ever exist in our current society without causing harm to everyone.

2022-05-25 06:51:13 by loddite:

I mean, I am unsure how else I was going to read Lena other than "we should probably abolish capitalism before we get even more powerful tools to hurt each other with", because I am right there with you on how corporatebrain just abstracts human suffering into a nonissue. It's honestly why this one story is way better than everything Black Mirror ever did with its AI stories imo, because it actually has a point that can be logically arrived at based on experiences you can have right now in the real world. A lot of Black Mirror's AI stories have literally every person inexplicably fail to understand that AIs are actual people, to the point that the narrative itself almost seems to not get it. It's probably what Lena would look like if it focused solely on the philosophical idea of brain uploading and not how it could actually look in the real world. Anyway, capitalism bad, and the human brain's tendency to create and sustain systems like capitalism bad, and people need to have empathy for people not like them, please and thank you.

2022-06-17 22:20:31 by aeschenkarnos:

@a2aaron: The most extensive example of an AI-run utopia in fiction is Iain M Banks' "The Culture" series, in which the Culture Mind AIs are, axiomatically, benevolent minor gods. Some examples exist in the text of AIs taking "evil" actions, however they are presented as very willing to both police each other, and to be policed by each other. They do of course act as career counsellors to the (trillions of) organic sentients under their control.

Your comment and those of a few others in this thread brings up the notion that the actions of successive generations of techbros in designing and enacting algorithms purely with the intent of enriching themselves (and this would be a process that started tens, maybe hundreds, of thousands of years ago) are to some extent analogous to the actions of initially hostile bacterial lifeforms that invade a host animal, cause disease, and then over many generations adapt to the point that they actually start to provide some function to the animal without causing any disease symptoms, and then advance that even further and the animal is eventually dependent on the bacterium for some beneficial metabolic function. It doesn't help while we are sick with the disease, of course, but it is interesting to look at the origin of such phenomena as "the rule of law" and "human rights" as emergent from the desire of kings to remain kings, and wealthy merchants to remain wealthy and able to do merchant things.

2022-07-07 15:46:25 by t:

There is something we can do about it - a global working-class revolution to abolish ruling classes entirely and permanently.

2022-07-19 11:37:37 by Connie:

@loddite and @t, i disagree with the line of reasoning that says the generation of the types of beliefs and actions that get us to Lena, and that got us to Uber (especially with recent news), is simply "capitalism/ruling class did it." Recall that in the story, MMAcevedo's creative potential is exhausted. This was well after other, better uploads were available. This wasn't about profitability or ruling, it was about... It was probably for the same reason, and with no more ethical weight on the mind of the person doing it, than the reason people push a game system to do things that should be impossible- that is, for fun. (Note: I'm not saying game modders are evil). People are plenty capable of doing bad things to other people on bases other than profit or control. All you're doing is shoving things down a layer, to where most people are less likely to fuck things up, but there's a greater chance of fucking up in aggregate. You can't work out ahead of time where the pitfalls lie. It seems almost premonitory to say "they all lie inside the cone of capitalism, or of a ruling class." The oracle has been wrong- noncapitalist societies, and noncapitalist elements of capitalist societies, have done serious harm. Nonhierarchical systems still deal with unethical people and behaviour, still have to answer "what do we do when something goes wrong?", and to be honest haven't seemed (inherently) much better at answering that question than the hierarchical ones have. To me, saying "lmao just dismantle capitalism and the ruling class" reads like how I've heard some Scottish people talk about racism in the UK- as if "oh, it doesn't happen here." That stuff's all for the people down in England that they (with no trace of irony) don't think too highly of. Here? No. It doesn't happen here. Ignoring all evidence to the contrary, they'll say "racism? That's the fucking English for you." The problem isn't simply capitalism, nor even perverse incentives.
It's that we aren't inherently moral all the time without fail. We will make mistakes! We will build things that can be used for evil! There is no "permanently abolishing the bad thing" any more than there is "permanently feeding the hungry." A revolution can't feed the hungry, because it's an event. A system feeds the hungry, because it is continuous. In linguistic terms, it's perfect and imperfect- it happens once, or it keeps happening. So have your perfect revolution- it might help. In fact, getting rid of capitalism would be an okay first step, so long as the replacement was sound. But afterwards, you're going to have to get imperfect. If it were that easy, it wouldn't be a problem; for the issues that are that easy, it isn't.
