I agree. I wouldn’t put it in the sci-fi category either. Yes, technology of some sort plays a substantial role but that alone doesn’t make it sci-fi. I wouldn’t even consider the USS Callister episode sci-fi.
Take the robotic dog out of “Metalhead” and replace it with a human – there you go.
The robotic dog packs a more visceral punch, though.
I’ve seen the first 4 so far, and I think my issue with USS Callister is that I just don’t really care about AI getting ‘tortured’. They weren’t sentient enough, IMO, for me to really give a crap about them. Perhaps in the far future, when AI are considered as sentient as humans, this will be considered a reprehensible POV, I dunno. Either way, that’s why Callister just didn’t resonate. In the same vein, Hang the DJ, while charming, doesn’t have the same impact as episodes in other Black Mirror seasons. As soon as it’s revealed they are computer simulations in a dating app, my interest in them waned and I re-conceptualized my investment in the episode (if that makes sense).
As the person who is outraged by the DNA thing… your analysis doesn’t feel right, although of course it’s always hard to really honestly evaluate one’s own motives. But I think my strong reaction to the DNA thing relates to (a) when and how it’s introduced in the episode, and (b) how clearly nonsensical it is, compared to many “ok, let’s pretend…” type premises.
Often, a sci-fi story (Black Mirror or not) will start with one or more fictional premises. Often, these premises, being sci-fi, are ones that are more or less impossible according to current science. So if I sit down to read a book and 5 pages in I realize that these people have time travel and teleportation machines that work via (technobabble), I’m totally willing to buy into that and see where the story goes. I do that with Black Mirror all the time. “Yes, we can put an implant into your child’s brain that lets you see through their eyes, and if we detect that their adrenal gland is spiking, we blur out whatever it is that was upsetting them”. That doesn’t make a TON of sense (how do they know what in the child’s visual range is the upsetting image and what isn’t?) but that’s cool, I’m in, I’ll go with that premise and see where the storytelling leads.
So if Callister had started with the premise of “in a world not too far in the future… scientists have learned to create virtual clones with all your memory just from a strand of DNA…”, well, it would still be just as scientifically ridiculous, but it would be presented up front as the necessary premise that the episode was built around, and I have enough trust in Black Mirror to hopefully sit back and enjoy the ride. Instead, the episode is more like “ok, there are these virtual copies of people, can you guess why, what do you think the answer is”, and the ANSWER is DNA.
More importantly, though, there’s a difference between Star Trekky technobabbly “that doesn’t make sense” or “that wasn’t explained” or “it seems unlikely that would ever be possible” or “it’s weird that that one particular aspect of technology has advanced so far but everything else is the same”, all of which are fairly common in non-extremely-hard sci-fi, vs. “that’s absolutely nonsensically impossible”. Might technology advance to the point that a cell phone can run thousands of full-world simulations featuring very strong AIs in a few moments? Well, not any time soon, but I don’t know of any obvious physical law it’s violating, particularly one that is plain and straightforward high-school-level science. But your DNA just plain does not contain your memory, period.
And the other thing that frustrates me is that it didn’t have to be this way. If you need some way that Daly can copy people’s entire memories to make for good storytelling (which it did, even if I wasn’t crazy about the episode as a whole), fine, that’s easy enough… he uses nanoparticles, or he mounted a neural scanner in the guest chair in his office that people sat down on, or whatever. There are plenty of science-y terms you can use to get the sci-fi premise you want… why pick one that is immediately and obviously completely impossible? (In fact, Black Mirror is usually a smart enough show that, had I been paying attention, the fact that something so clearly unscientific was going on might have led me to suspect that the entire Daly universe was a simulation, and that there was a universe-within-a-universe thing going on.)
Are you saying (subtle distinction here) that you don’t think they were sentient, because of what you know about actual computer technology and AI issues and so forth, or that the episode presents them as less than sentient, i.e., there are clues in the script concerning their behavior which suggest that they are just simulacrums, not fully sentient beings?
Because the emotions and behaviors of the AIs were complex enough that I definitely felt like the episode was presenting them to us as fully sentient, even if you personally didn’t buy into that. (As someone else in this thread mentioned, many Black Mirror episodes have the premise of “fully intelligent and sentient AIs can be run in fully simulated virtual worlds on consumer-level computer hardware”. If you start assuming that every AI in every Black Mirror episode isn’t really sentient, and their pain and despair and feelings don’t matter, well, you’re going to enjoy the show as a whole a LOT less…)
I believe the episode was attempting to portray them as sentient, but I didn’t buy it (partially because of what I know, or rather think I know, about actual computer technology and AI issues).
In essence I felt like Hayley Atwell in “Be Right Back” (in Season 2): you are trying to make me feel that an artificial lifeform is the same as a human (in terms of empathy, in my context), but it doesn’t work.
I think people play up how many “AI” centered episodes there have been in Black Mirror. In the first 2 seasons only one episode (Be Right Back) was about AI. And generally speaking, other earlier episodes that dealt with AI (White Christmas), dealt with that AI’s effects on actual humans.
While I thought the first 4 were just ok, I’m really glad I stuck it out for the last two! Fantastic stuff, especially Black Museum! Both were basically the most “Black Mirror-y” of the season. I loved the whole idea that humans may have driven themselves to extinction in Metalhead, and that in a world where electronic, mega-powerful dogs had destroyed everything, something as simple as a teddy bear had such meaning. And Black Museum was just utterly fantastic! Two mini Black Mirror stories about the potential horrors of technology, all leading to the ultimate twist at the end; it caught me completely off guard. And I’ve always thought San Junipero had a potential dark side, which Black Museum explored very well (and it kind of sketched the prequel history leading to consciousness transfer). Great stuff!
My grades:
USS Callister: C
Arkangel: B
Crocodile: B-
Hang the DJ: B
Metalhead: A
Black Museum: A
Where is everyone getting this idea of “massive AI simulations being carried out on smart phones”? When I watched the episode, I assumed the simulations were run on some huge mainframe in North Dakota or Eastern Europe or somewhere, and the results were relayed to their phones.
I mean, these people had to have submitted to a whole lot of proceduring in order to have their simulacrums created in the first place. It’s not like the app ran 1000 simulations based on a photo taken across a crowded bar.
You know the scary thing about this? Apparently Match and other sites are trying to do this… it’s one reason for the current “matches you missed” feature, a tracking feature where you can find out if you go to the same places.
The Metalhead plot has been done in many a sci-fi or Twilight Zone series (even Star Wars had a novel based on it).
Usually it goes like this: a lost ancient civilization had something incredibly valuable stored away by its god-king and guarded by fierce immortal warriors… someone or something destroyed said civilization, and ten thousand years (more or less) later it’s built up by legend into a great treasure.
Adventurers go through Indiana Jones-type adventures only to find out that everyone forgot what was being guarded… and it turns out to be either a piece of tech or machinery that would have conquered the world 10,000 years ago but is considered worthless junk today, or the ruler’s much-loved but totally trivial piece of property…
Well, that’s what we actually see in the episode: two people in a bar with smartphones, checking an app that tells them they’re a match, right after we’ve just seen that the simulation was the app. And no hypothetical mainframes are in evidence, or even implied.
We did not see that the simulation was the app, we saw the app display the results of the simulation, the same result on each phone. I think the implication is clear that it’s not the individual phones doing 1,000 simulations.
When Match tells us they found a match, or when Waze tells us it found a new route to our destination, it’s not our phones doing the heavy lifting, it’s the provider’s servers. That’s what cloud computing is all about: making it easy to access the work done by a bank of servers and displaying it on a portable device.
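To make that division of labour concrete, here’s a purely illustrative Python sketch (the URL, endpoint, and response fields are all invented; this isn’t from the episode or from any real dating app). The point is just that the phone-side app can be a thin client that asks the provider’s servers for a result they have already computed, rather than running any simulations itself:

```python
import requests  # standard HTTP client library

# Hypothetical thin-client call: the heavy simulation work happens server-side;
# the phone only fetches and displays the finished result.
def fetch_match_result(user_id: str) -> dict:
    resp = requests.get(
        "https://api.example-dating-service.com/v1/match-result",  # invented URL
        params={"user": user_id},
        timeout=5,
    )
    resp.raise_for_status()
    # e.g. {"match": True, "confidence": 0.998} -- fields invented for illustration
    return resp.json()

if __name__ == "__main__":
    print(fetch_match_result("user-123"))
```

Whether the back end is one mainframe in North Dakota or a whole server farm, the phone’s job in this picture is the same: display what it’s told.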
Finally got round to seeing this and watched the first two episodes:
USS Callister: As many noted, the premise that he can create sentient adults just from their DNA, with the memories of their real-world counterparts, was too much to swallow. Also, as others have noted, it really would be a hopeless situation, since no doubt the DNA data would be stored digitally, so throwing away lollipops or whatever would be pointless.
I actually enjoyed the episode: the visuals are great and I like the sorta-twist that we first sympathize with Daly but then discover he’s a monster.
But the episode only works as a Toy Story-like fantasy of “What do NPCs do when you’re not logged in?” Trying to follow the science of it fails completely.
Arkangel: Good episode. I thought it dealt well with the theme of an overprotective parent. My only criticism is I didn’t buy the ending. I understand what they were going for: that because she couldn’t see or hear her mother, the emotional impact wasn’t there to stop her from continuing the assault. But it still seemed implausible.
I don’t understand this complaint. The implication of the episode is that the characters were sentient, period. What does “not sentient enough” mean?
Also not giving a crap about torture because they are not sentient is a strange thing to say: do you care about animals being horrifically tortured?
Do you get upset about prostitutes that get killed in Grand Theft Auto V? Now obviously that’s a crude AI, but as games progress in their programming, at some point you are going to have to ask at what level sentience comes in, if it does at all. In other words, whether some AI is “sentient enough” is going to have to be considered. And furthermore, if sentience is the ability to feel subjectively, does that apply to a completely human-created being? Is it really subjective perception or feeling if it’s been programmed by someone else?
Which also answers the second point. Animals have some level of sentience. Now most humans agree that they aren’t as sentient as humans and therefore penalties for harming animals are much less than those harming humans.
This is where the concept of a “soul” might come in, and as I’ve said before I’m surprised that it’s never been brought up by the show, not even in a skeptical context. Has the word “soul” ever been uttered in Black Mirror even once?
Agreed that a discussion of a ‘soul’ as it applies to AI (when should an AI be determined to have a soul, can it ever have a soul at all, etc) may be a great basis for a Black Mirror episode.
Firstly, no, the characters in this episode were nothing like NPCs in contemporary games. We see the newest character wake up in the game, and we see her shock and horror. There is no reason why, in the logic of the game, such an experience would be simulated; we the viewers are clearly supposed to conclude she is a sentient being forced to play the role of an NPC. Likewise much of the dialog between the characters when Daly is offline.
And your logic doesn’t follow anyway. If you’re saying we care about animals being harmed because they’re somewhat sentient are you really suggesting that those characters were significantly less self aware than a dog, such that you can completely ignore their suffering?
Yes, actually. I am making the point that I don’t believe a created being made of code really has ‘sentience’. Their ‘subjective’ feelings are created by someone else. You may of course disagree, but there is a question of when sentience actually develops, and of course there is a difference regarding transferred consciousness as well.
(Oh, and I don’t believe this is an example of transferred consciousness, as the actual individual is still around in the real world; at best it’s a copy, though I’m not sure how consciousness can be copied through DNA.)
Artificial (Frankenstein’s monster) and, later, digital, entities being sentient is a hoary trope. So is uploading someone’s consciousness to a computer tape. So is obtaining a digital copy of someone’s mind to torture or otherwise interrogate for information or simply to disassemble and sift through directly, or to replace the original with a modified clone, or whatever.
With well-trodden sci-fi like that, you had better have an interesting take on it. For instance, in Episode 1 all the sci-fi is a weak shim without much thought put into it, which is used to frame the real story.
We’re talking about a black mirror episode. In the episode, the clear implication is that the characters are sentient and conscious. Essentially it’s part of the premise that Strong AI is true.
So I’m not really sure of your objection: it’s a story, it doesn’t require you to agree strong AI is true in real life, any more than it requires you to believe there’s a famous developer called Richard Daly (or whatever it was), and a TV show called Space Fleet etc.
While we’re speaking philosophically, no, it doesn’t matter to me how their subjective feelings were created. Doesn’t matter that they were artificially created, the fact they have subjective feelings, for whatever reason, already means I care if they get tortured horribly.
Agreed, and on the DNA thing I made the same complaint upthread. No mechanism is given for how memories, or even the adult form of their bodies, could be inferred from their DNA, and it seems based on typical misconceptions about what DNA is.