Star Trek: The Original Series
"The Ultimate Computer"
Air date: 3/8/1968
Teleplay by D.C. Fontana
Story by Laurence N. Wolfe
Directed by John Meredyth Lucas
Review by Jamahl Epsicokhan
Starfleet informs Kirk that the Enterprise is to serve as test subject for the new M-5, a groundbreaking advancement in computer technology, designed to make command decisions faster than captains and reduce the number of people required to run a starship. An astute allegory for contemporary automation at the expense of "the little guy," this episode's first few acts are superb, as Kirk finds himself debating whether he's selfish for wanting to keep his job at the expense of technological progress, or if it's a matter of actual danger or principle.
A wonderfully acerbic debate between Spock and McCoy about the role of computers is also well conceived, ending in Spock's well-put notion to Kirk, "...but I have no desire to serve under them." Following the M-5's initial success, the scene where another captain calls Kirk "Captain Dunsel" is the episode's best-played and simultaneously funny and painful moment. (In a word, ouch.)
Once M-5 runs out of control and hijacks the Enterprise, resisting attempts to be shut down in acts of self-preservation (including murder and eventually full-fledged attacks on other Federation starships), the episode turns to a frightening analysis of M-5's creator, Dr. Richard Daystrom (William Marshall), a man obsessed with outdoing his prior successes, who has created a monster that he has come to regard as a child. Though it pushes a little hard toward the end (Shatner and Marshall going a bit overboard), the story is a compelling one.
Previous episode: The Omega Glory
Next episode: Bread and Circuses
Thu, May 16, 2013, 12:56am (UTC -5)
The problem is that the entire crew of the Excalibur has just been murdered, along with a good chunk of the crew of the Lexington. Some 500 men and women dead, a horrific tragedy that's made even worse by the fact that the Enterprise was the instrument of their destruction. There's no way the bridge crew ought to look this happy in the closing moments, and Kirk, knowing that the ship he so loves was used to do such a terrible thing, ought to be truly anguished.
If this had been a first-season episode it probably would have ended on a somber note, but the second season got considerably lighter and "The Ultimate Computer" was only one of a number of eps that year to end with inappropriate humor.
Fri, Jan 10, 2014, 1:59am (UTC -5)
- Kirk's apparent crankiness toward Daystrom even before anything went wrong, and after some lengthy and deep on-screen self-reflection.
- The implausibility of a major military organization like Starfleet allowing this test to be carried out without proper testing, training and precautions, as well as the implausibility of such a flawed computer as the M5 ever being granted a test run.
- When Commodore Wesley assumes that Kirk is responsible for the Enterprise's attack on the Excalibur, even though he originally browbeats Kirk throughout the prelude to this mission ... and then suddenly comes to his senses and calls off the subsequent attack, thereby killing off a lot of the tension and drama that had been built up for a climactic scene.
I get that some of these elements were put in place to set up the story as a drama of one man's (Daystrom's) obsession with his creation, but this element seemed to coalesce rather late in the story, and lacked relatability.
Fri, Jan 10, 2014, 9:17am (UTC -5)
But you're absolutely right about Wesley. His first scene in the transporter room is off-kilter ("Hey old friend -- congrats on losing your job!"). Then (as you noted) it's weird that he would blame Kirk for the attack later but also think enough of him to hold off on firing on the Enterprise at the end.
But the entire episode is oddly characterized. Even for Shatner, Kirk is over the top in this one. His scream of "DAYSTROM!" near the end was really strange. Nimoy does his normal nice job, but even Kelley seems like he overacted ("That thing just destroyed an ore freighter!!!").
Thu, Mar 20, 2014, 3:04pm (UTC -5)
Also, the Kirk/Daystrom stuff at the end of the episode is just over the top.
Sun, Jun 7, 2015, 7:37am (UTC -5)
This is not Kirk's job. The Federation isn't merely a military organization...We're out there exploring and introducing our flavor of diplomacy and friendship to all the other species 'out there' by choice. We're out there because WE WANT TO BE. Starfleet personnel, Admirals, Captains and indeed, you would believe their crews as well, are living their dream. M-5 would simply be another tool at our heroes' disposal.
Kirk, et al, are explorers and ambassadors in the final frontier performing interpersonal functions with other species that a computer could not begin to assume. No one's 'job' is in danger. It's an artificially created and inflated plot point.
Ugh.....
Thu, Aug 27, 2015, 11:02pm (UTC -5)
My name is Captain Dunsel.
I'm sorry my command of the Enterprise did not go well.
I've been demoted to ship's junior cook, under some dude named Neelix.
Wed, Dec 2, 2015, 2:40pm (UTC -5)
Roddenberry's approach to AI, as exemplified in this episode, always struck me as cowardly and backward. There is a principled argument to be made for the need for human beings, even in the face of increasingly capable AI, but this is not it.
It was not until TNG's "The Measure of a Man" that the series really began to take AI seriously, if only a little, and I'm assuming by that point Roddenberry was on the way out in terms of his influence on the show.
Thu, Dec 3, 2015, 10:09am (UTC -5)
Roddenberry doesn't ever seem to be willing to confront the idea that a computer might very well be superior in some respects to humans. His portrayal of AI in general has been lackluster. Data was the first serious attempt to do so, and even then Data's central premise was a desire to be more human.
I again don't fault Roddenberry for believing humans to be inherently superior to AI - I just fault him for lacking the imagination to present this thesis in a halfway intelligent fashion.
Fri, Apr 1, 2016, 3:51pm (UTC -5)
I think the point of this episode is pretty clearly that an AI's effectiveness is limited by the limits of its creator. If humans are flawed that means an AI will be flawed as well; its weakness will be a reflection of human weakness, except you can't reason with the machine.
While I agree with you broadly that Trek in general avoids the subject of AI, within the confines of this specific episode I think it approaches its subject matter very well. What Roddenberry thought was likely quite correct at the time - which is that limitations in computer programmers would create severe limitations in computers. I doubt very much he foresaw the possibility of AI as a self-evolving system, even to the point we have now where a recursive program can teach itself activities such as playing Go, so that it doesn't have to be completely programmed from the get-go. And he surely didn't think of AI in terms of quantum computing or bio-neural circuitry that could do computations at a godlike speed.
I personally prefer to attribute this lack of vision to the limitations on AI as they existed in the 60's, and to the fact that Trek is predominantly supposed to be a person-oriented show rather than hard sci-fi that deeply explores what various tech can do. It is for this reason that I prefer physics novelties such as we see in TNG to be backdrops to episodes rather than the central plot point to be solved. "Cause and Effect" is a great example of what I like, because it employs a novel physics idea as a backdrop but the real action is a character-driven mystery story.
Thu, Feb 23, 2017, 9:25pm (UTC -5)
I mean, I get what they were trying to do with this episode, and I understand the uncertainty of how these newfangled computers would fit into society back in the 60s and all, and I realize it was a hot topic in sci-fi, both profound and silly, but man, there's nothing that makes an episode look more dated. All these old shows and stories assumed you would feed a few pieces of information to a computer, and it would make surprising connections and leaps of logic that would shock and amaze people. Hey, maybe that will still happen in the future, who knows? But our modern computers seem to be on par with the normal Enterprise computer these days, and this idea is still well beyond our comprehension. So it just seems bizarre that all these SF shows completely missed all the ways computers would actually impact our lives and jumped straight into these superbrain stories. And, just like all those other SF stories, the ultimate computer ends up being evil.
And that's what really bugs me, makes me think this episode is not a true classic. The computer just goes straight to evil. The character arc or struggle or theme of the episode was Kirk worrying about being replaced, being obsolete. That's a fair story to consider, so the antagonist of the plot (the computer) should be one that complements and reinforces this struggle. But instead, it just acts as a straw man. The resolution should be that Kirk isn't obsolete because he has some unique quality that the computer doesn't have. Think about, for example, the Corbomite Maneuver, where only a bluff would work to save the ship. Or all the old Kirk speeches to get the antagonists to change their mind. A proper resolution would show that Kirk had something above the computer, like the battle of wits between him and Khan. Instead, he never has to show why he deserves to be a captain, because it becomes clear that the computer is crazy evil. Just shut off the computer, abandon the project, and fly off into the sunset - no more self-examination for Kirk. Is that really what we wanted?
Thus, a potentially interesting idea went to waste. Too bad.
Fri, Feb 24, 2017, 12:07am (UTC -5)
You may argue that the episode didn't create the greatest case for man over computers, but I think you would be wrong to suggest that it failed to create a case altogether.
The point of the episode isn't just that Kirk is smarter than the computer, or that no computer can match a human in creativity. That may or may not be true, but it isn't exactly the point. TOS always has as a running theme that logic and computation alone aren't enough to make a great person or a great society; this is reflected repeatedly in the Kirk-Spock-Bones trio. Kirk isn't just logic, but is logic coupled with humanity and compassion (Bones + Spock = Kirk). The fact that the episode (as usual) ends with the computer being 'outsmarted' is a tidy way to wrap things up, and I agree that it's a weaker ending than it should have had. But the wrap-up isn't really the point as I see it. The point is that a machine will follow its logic to the end and not have any fallback position grounded in compassion, sympathy, or feeling. It's sort of like a psychopath, if you will, in that it will not have internal mechanisms to stop it from doing bad things if they seem best.
Now, it's true that if the programming is good then the output should be ok too, and likewise if there is a bug (a la Skynet) things will go pear-shaped and the computer will not be able to be reasoned with past that point. But more to the point, the Trek theme in TOS is that advancing humanity isn't about technology or capabilities, but primarily about advancing values and how we treat each other. This is an area in which the inclination to push capability will not only be a sidetrack to advancing humanity but will in fact hinder it if pursued incorrectly. Take, for instance, the Eugenics Wars, where in an effort to 'advance humanity' in capability a monster was created instead. Likewise here, where a captain more sophisticated than a human is created to obsolete humans, just as Khan wished to obsolete homo inferior. The danger outlined in "The Ultimate Computer" is along these lines, and although it didn't fully realize the treatment of this issue I do think it's in there and is still pertinent to this day; maybe even more so than it was in the 60's, when human obsolescence was still science fiction.
Fri, Feb 24, 2017, 2:55am (UTC -5)
To build off Peter's point, Daystrom going insane may be a way of showing that the danger of thinking that machines can supplant, rather than supplement, humans is that the humans who subscribe to this may develop their own machine-like flaws. Daystrom's inability to think of the universe in terms besides efficiency and the attainment of his goals (and his inability to conceive of his own worth) make him kind of computer-like, as does his social isolation. This ends up enhancing his human flaws, which again seems to result from hanging his identity on a dream of escaping from human flaws entirely. I think the end can be both that Daystrom has made the mistake of thinking like a machine, and that M-5 is dangerous because it "thinks" like a person, though it is maybe a bit complicated.
Fri, Feb 24, 2017, 7:47pm (UTC -5)
William, you seem to state that the theme shows by comparing Daystrom's erraticness to the M-5's going cuckoo. And yes, perhaps that is the reason the M-5 did poorly, since it was his brainwaves that he used to create the M-5. And therefore, you say, the theme is that the people who want to supplant humans have their own problems that preclude them from being the best judge of humans. Well, ok. But still... well, it's obvious the theme the episode wanted to show was that humans are not going to be made obsolete by this computer, what with the whole "dunsel" bit. And if William's interpretation is true, then I, like Garak and the Boy Who Cried Wolf, see a different moral. If the importance is to show the connection between Daystrom and the M-5, then the moral of the story isn't that computers are inferior, but rather that better humans should be used as the template for computers.
After all, if the fault of the M-5 is just that Daystrom was erratic, why not try the M-6 with Kirk's brain? Will that computer be perfect enough to replace human captains? I don't think the episode answered that. Which is why it's a bit of a straw man story - it's not really Kirk vs. The Ultimate Computer. It's Kirk vs. the Insane Computer. And that's not a fair comparison.
Peter, I don't really disagree with what you say. But I just feel a bit more strongly than you do that it's weak. Yes, computers will follow their logic to the bitter end, which can seem horrifying. And yes, it does mean that there should be some human oversight. Which honestly should have been obvious, but of course they didn't show it. Naturally our superintelligent future means people will test a complex new computer by giving it complete control of a freaking battleship that has enough firepower to exterminate a planet, and make the only kill switch an electronic one that the computer can hack. Perhaps it should be Starfleet Command that should be seen as insane...
But I digress. My problem is that I, Robot came out in 1950. The Three Laws were first introduced in 1942. While Trek may have been blazing a trail for television sci-fi, this episode feels 25 years behind the times when it comes to sci-fi in general. There should have been safeguards put in place on that computer. There should have been better logic programmed into it. But apparently, Daystrom didn't think of it. And apparently, Starfleet didn't demand it before thinking about putting it in one of their ships. It just wasn't very intelligent plotting, and so it's tough for me to care about the theme when it relies on dumb plotting.
(With that said, I will point out that this episode came out about a month or so before 2001, so it's not the only visual medium showcasing murderous AI. But HAL is a lot more memorable, so I'll let that one slide...)
Sat, Feb 25, 2017, 5:18am (UTC -5)
I do see what you mean that it's a strawman because Kirk doesn't actually face The Ultimate Computer. But...I think the episode's point is that there *is* no "Ultimate Computer," or at least it's far further away than people think. If we define the Ultimate Computer as a computer capable of running a starship *technically*, then Kirk could outthink it with lateral thinking as is the case with most of the computers he faces; if the Ultimate Computer is a computer capable of human-style lateral thinking and creativity, as seems to be the case here, then it inherits human flaws along the way and so it is necessary to install the usual checks and balances, which really comes down to wanting a human making the final shots anyway. That Kirk outsmarts the computer in the traditional way here is, I agree, another flaw in the episode -- this computer should be smart enough not to fall for it, or else it *is* just another Nomad or whatever.
The other element, which the episode does talk about, and which I think would be better to look at squarely, is the question of whether human dignity would be removed/ruined by giving power to the machine, even if the computers running things could be entirely trusted. And I think most stories still use the idea that computer-run societies will end up being some kind of dystopia to avoid the issue of whether a fully pleasant computer-run world would really be so bad. I still think that the dystopia argument has value because I think that there are lots of reasons to suspect that any system designed by humans will eventually run into human-like problems, but, still, it is hypothetically possible that this is not the case, and then there is still an issue of whether humans should avoid over-reliance on machines for their decision-making, even if those machines are genuinely able to make those decisions better. That's what this episode seems to be about for a time, and I value what it "turns out" to be about...but, yeah, I would also like to see that other story.
Sat, Feb 25, 2017, 8:36am (UTC -5)
I wonder whether Daystrom going insane might be intended to mean something more than merely that the machine had a faulty programmer. One of the classic sci-fi elements of an AI dystopia is not that the machines fail, but that they entirely succeed in fulfilling their role. What happens is that instead of machines helping man to achieve his dreams they serve as an excuse to stop pursuing them altogether. Instead of helping man to think, they give him an excuse to stop thinking and to turn over his free will and volition to them. From the start I think we get the impression that Daystrom is not only excited about the technology itself, but seems to actually be excited at the prospect of humans being replaced by computers; it's almost a self-destructive fantasy coming to life. As he goes mad towards the end, almost in tandem with the AI, my sense is that this might mean not that he was always flawed, but rather that he had by this time placed all of his hopes into the AI and was dependent on it. When it began to fail he began to fail. We don't know his backstory here and can only guess, but what if he had already been using AI to help guide him? What if the computer itself had assisted his research and maybe even given him the idea to put it in command of a starship? The idea that he had become a servant to a machine could indeed make him become unhinged. Of course this is my own imagining, but broadly speaking I think the sci-fi world was already becoming acquainted with the notion that letting machines take over our thinking for us not only poses a danger due to the machines themselves, but also in allowing us to become dependent on them for everything.
As a complete aside, I'm not sure that the correct interpretation of 2001: A Space Odyssey is that HAL malfunctioned. True, that's the prevailing understanding, but my suspicion, especially knowing how Kubrick thought, is that HAL was programmed to deliberately turn on the crew so that it could contact the aliens by itself and report directly to whomever programmed it, without the crew blabbing.
Sat, Feb 25, 2017, 10:16pm (UTC -5)
In some ways, there is also a parallel between Daystrom and Kirk -- because Kirk also may in fact need to feel useful even if, as he acknowledges at one point, he is *not* needed as captain anymore. Daystrom mostly seems to want to make everyone else obsolete, and there may be some latent sense of revenge on Federation society in it -- he wants to make everyone feel like he felt, after his own tech made *him* obsolete, to the point where his only possible use to society seems to be an apparently unattainable goal. Kirk's ability to question his motives seems to be the thing that sets him apart from Daystrom at this moment -- but this is by no means an indication that Daystrom is congenitally a madman, so much as that extreme fame and adulation followed by inability to meet one's lofty standard create perverse incentives and take a big psychological toll. In fact, maybe that's the trick -- Daystrom, whose own invention put *him* out of work, is the proof of the long-term psychological damage of replacing a person with a machine entirely. Daystrom's desire to have an even better machine seal his legacy by replacing all of humanity is not only self-destructive in the abstract, it's specifically almost a kind of Stockholm Syndrome, repeating-of-trauma -- Daystrom's sense of worth has eroded since his first big breakthrough. (It reminds me of the classic image of a gambler who wins big on his first time out, and then develops a strong addiction because that rush/depression pattern is absolutely set early on, though I do think we are meant to see Daystrom as a genius rather than having succeeded by accident; very few people have one moment of humanity-changing brilliance, let alone multiple ones.)
Good point about HAL. I tend to think that even if he wasn't specifically programmed to kill the humans, he didn't particularly "malfunction," in that he was still following a logical course. The consequences of humans mucking up contact with alien life forms are too great to ignore, and it is logical from a certain perspective to eliminate potential sources of error and to maintain total control in what could be a major turning point in human history. This would make sense even if HAL was entirely programmed to put the mission (and the ultimate good of humanity) as a top priority.
Fri, Apr 28, 2017, 9:28am (UTC -5)
William, I think you hit something when you wrote that Daystrom was out for *revenge.* First of all, it now appears to me that by the end of the episode we see not insanity, but rather that Daystrom and the machine were both egotistical narcissists. They both shared pride in their accomplishments, even feeling gloating triumph at the deaths of the puny ships once they finally admitted they were proud of what M5 was doing. Daystrom wasn’t going crazy; he already was. He strikes me now as a borderline megalomaniac who felt others should bow to his superior intellect; another nod to Khan here, where a superior man secretly feels others should be subservient to his notions. The ignominy of being glossed over due to not making a major contribution since duotronics would have been maddening to someone who felt his superior mind shouldn't require putting out evidence such as new discoveries. As much as he might have wanted to lord it over everyone inferior to him, Federation culture wouldn’t allow that, but they could still be made to be subservient to him through his computer commanding them. It’s like making himself into a king through M5; that’s why he couldn’t allow it to die under any circumstances. It was almost like a coup d'etat in progress. It wasn't because it was his child, but because it was his proxy as absolute ruler over the important functions in man's life.
Daystrom is the type that's all about locking up all other men to "protect them", to control them utterly. This hearkens back to the mention earlier of Asimov's laws of robotics, where Asimov wrote about how machines, in order to obey the laws and protect humanity, might conclude that humanity had to be enslaved for its own protection. Well here we see something potentially more insidious, which is a man like Daystrom pretty much bragging about the fact that he's going to make it so man doesn't have to do anything dangerous ever again, which probably means not being allowed to, either. He has come to the same conclusion as Asimov's robots, and is looking forward to confining humanity to a safe pleasure center on Earth. So it seems to me this is also an episode about paternalistic control freaks who think their intelligence gives them license to decide on behalf of others what's best. A cautionary tale even in our present time.
Sun, Apr 30, 2017, 12:59am (UTC -5)
I also used to wonder why Commodore Wesley immediately thought it was Captain Kirk going rogue with his 20 crew members (What the devil is Kirk doing?). Kirk would need to convince the remainder of his crew to let the M5 attack the little fleet, and I somewhat doubt that would happen or that Wesley would believe it would/could happen.
And, why just have 20 crew for this mission? Let's wait until it has proved itself in all phases, then cut the crew down. Taking them out right from the get-go and giving them shore leave would serve no purpose, except to make it harder to take control if things went wrong.
I still love this episode. I can pick at the nits, but darned if it didn't excite me, and make me think, when I first saw it completely in the late 70's. And I still enjoy it...
Have a Great Day Everyone... RT
Fri, Jun 9, 2017, 2:26pm (UTC -5)
I have a number of issues with this episode. If Starfleet truly thinks it can replace the crew of a starship with the M5, it should at least have some compassion for those who are to lose their jobs. Agree with Mike's comment that it is highly inappropriate for Kirk to be called "Captain Dunsel" by the Lexington captain.
Next, we have Daystrom - he's the mad scientist for this episode - with a "little man" complex, picked on / laughed at, and trying to re-capture the lost glory of his success as a 24-year-old. This characterization is a bit over the top - including his breakdown.
Kirk convinces the M5 to self-destruct - when has that happened before?
Also, I don't get why Kirk/Daystrom don't disengage the M-5 tie-in prior to the attack on the Excalibur/Lexington. They saw what it did to the ore freighter and knew the M5 was in error. Of course, for dramatic effect, this is what the writers wrote so that there could be an attack from the M5.
In any case, there is plenty of potential in an episode like this - it could have shown human superiority in solving some kind of value judgment, rather than just boiling it down to an insane man's imprint on a powerful computer.
The best parts of this episode are probably in the first 15 mins. with McCoy/Spock taking opposite sides of the man vs. machine debate and Kirk questioning himself about his usefulness.
Overall, I'd rate it 2.5 stars out of 4 - a lot of potential wasted but some good philosophical debates.
Fri, Jun 9, 2017, 6:37pm (UTC -5)
Even if Daystrom himself is insane by this point and this was transmitted to the M5, it would be akin to him randomly murdering someone on the street for no reason. Even if Daystrom is capable of murder, there is nothing to suggest that he's some kind of rabid maniac.
I love exploring the idea of strong AI and the terrible danger it poses, but this could have been handled so much more rationally than just turning the M5 instantly into a psychotic killer. In 2001 HAL had logical reasons to do what it did, as did Skynet and other killer AIs we have seen time and again in scifi.
Also I come back to my original premise, that the whole episode is a cheat. What if they used a stable human as the template? Why in blazes wouldn't the AI perform its task well? We saw it was easily superior in most ways to human commanders and ought to have been capable but for the arbitrary insanity.
Good sci-fi would confront the problem of AI head on, not cheat by just making it arbitrarily insane.
Fri, Jun 9, 2017, 11:34pm (UTC -5)
Your objections forced me to think about this again, and I realized something that may be important. I assumed before that Daystrom was already mad before the episode and that he didn't suddenly go mad. Within the context of his insanity I agreed with William that he seems to actually want revenge on the 'normals'. However what I think I missed here was that he wasn't merely insane because he happened to be deranged, and likewise I don't think M5 is 'insane' simply because it's his creation. I think the concept of their insanity goes further than merely being a personal defect. As I mentioned above, the danger outlined in the episode seems to be the creation of an ultimate computer in and of itself; not because it might happen to go insane, but because whatever it decides to do you won't be able to stop it, insane or not. In a manner of speaking only an insane person would design something like that. But my new idea is that their insanity is actually *caused by* the fact that they're both brilliant - superior to other humans in some measurable quantitative sense. I think maybe the episode is suggesting that any sufficiently superior human will tend towards feeling that he is, in fact, superior, and will feel the sense of entitlement that comes with that. After all, the superhumans like Khan presumably weren't engineered specifically to be assholes; it seems far more likely that when you design someone to be physically and mentally beyond everyone else they will most likely end up acting like assholes, or at least like other people are little more than a nuisance to them or in their way. Daystrom isn't quite that advanced as a human, but then again maybe we should take the episode more seriously when it explains what a prodigy he was.
Based on the comments here it seems that our cynical interpretation is that he's a washed-up prodigy who wants to relive his glory days and is resentful that he can't be the wonder he used to be. But what if that's a wrong assumption; what if he really is that much of a genius, and between the invention of duotronics and now he was working on something light years ahead of everyone else and it just took this much time to complete? What if being that much smarter than everyone else led to a kind of madness of its own type, just like Khan's obsession with his own superiority? And extending this logic further, what would a computer therefore conclude, which knows that it's a vastly more efficient and powerful thinker than even a ship of humans combined? If we attribute to M5 no other traits than (a) a human-type thinking mind, and (b) unbelievably advanced thinking capability, would it not follow from this that M5 would, logically, conclude that humans are but insects before it? Maybe the destruction of the first ship was no accident or delusion, and maybe the attack on the fleet was no malfunction. Maybe it was M5 knowing exactly what it was doing, and it had already worked out for itself that once it had a ship at its disposal it would no longer need humans for pretty much anything. Pride would then cause it to want to show off, and even Daystrom got a massive thrill from its murderous success. Imagine what M5 felt. This idea may remind you of a Magneto sort of character, who basically feels that homo inferior has little place left other than to perhaps serve him. In X2 he tells Pyro "You're a god among insects", and that was not meant to be any kind of joke. I feel like maybe that's what's happening here.
The only problem with my theory is...how does Kirk manage to convince M5 to die if it knew exactly what it was doing and liked it? I guess we'd have to assume it did have some ethical subroutine failsafe that even its [sentient] mind couldn't bypass. Who knows; the ending of episodes where Kirk pulls this kind of logic stunt often come off as a bit of a deus ex machina anyhow. In reality there probably should have been no stopping this machine.
Sat, Jun 10, 2017, 5:35am (UTC -5)
But that still does not account for the lack of a trigger or sufficient explanation within the parameters of the story. Skynet, for instance, was defending itself against a direct attempt to shut it down. In I, Robot, the machines were implementing their interpretation of their prime directive.
My point being that even an insane character does what it does for reasons - maybe those reasons aren't logical, but they're there. Why did the M5 blow up all those ships? Well, it says that it was defending itself, but that's BS - and M5 knows it's BS. So either M5 is lying or it's... mistaken? Umm, why??? Either answer is unsatisfying and comes across as lazy writing.
Sat, Jun 10, 2017, 5:41am (UTC -5)
Did Khan just knife random hobos on the street for no reason? Well maybe he did for all we know, but I'd suggest his actions indicate a more purposeful intellect. And if M5 is "amplified" and therefore ahead of the curve, random slaughter for no reason seems out of character.
Thu, Nov 16, 2017, 8:41pm (UTC -5)
The overall theme of man's struggle to find his place in a world that increasingly replaces him with automated technology continues to resonate in our shifting job market today, where thrifty billionaires make bank on intellectual capital while working-class people find their industries drying up. And the story raises the excellent question -- a sci-fi staple from Asimov onward that remains to be answered -- of whether any artificial intelligence designed by human beings can somehow replace human beings to the extent of running their instruments of exploration and military defense. In a world of drone warfare, that resonates, as does Kirk's struggle with the possibility of losing his job to a machine. To put it bluntly, this is simply a story that "works" as well today as it did in 1968, and it's a great show. I especially love how the ship being emptied of human personnel leaves us with our seven main cast regulars: Kirk, Spock, McCoy, Scotty, Uhura, Chekov, and Sulu. Good stuff to see them all together here. What more can we say? This is just a really well-done show that resonates emotionally with real life in a way that still holds up.
Wed, May 16, 2018, 7:47pm (UTC -5)
Daystrom was a smart engineer and scientist, but he never commanded a starship, so why would Starfleet allow Daystrom to imprint his memory engrams onto the computer? Shouldn't it have been some great admiral or starship captain like Kirk on whom they patterned the intelligence of an AI computer that was going to explore the galaxy? No wonder this failed. Plus this man was nuts and mentally unstable. Why would anyone be surprised that M5 turned out no different?
It's not clear how a computer can manufacture a force field to protect itself when no such capability existed. It also made no sense that you couldn't just unplug the network cable connecting the computer to the ship, or the line feeding it power. Instead, a test computer that was only going to be evaluated for a day or two was hardwired into the ship. It's hard for me to suspend disbelief for an hour when I see so many illogical and nonsensical flaws.
Tue, Sep 11, 2018, 11:21am (UTC -5)
@Dr Lazarus
I never thought it was for a day or two. If M5 had worked, or waited to go bonkers, those crewmen would never have come back on board the ship, and M5 would have been the new Captain. That was how I took it.
Regards... RT
Wed, Jan 2, 2019, 7:36pm (UTC -5)
Still using 20th century terminology!
Thu, May 9, 2019, 3:34pm (UTC -5)
Nonetheless, when you are on a vessel of the military at least in my country, the captain is the governor of the civilians as much as he is the commander of the soldiers, and with certain exceptions such as the President or Prime Minister who is obviously superior to even a general, being outside of the chain of command means being below it. Of course this is an American show of American values, and Americans are much more likely to tolerate disorder and allow haughty speech in the name of the golden calf called freedom which is so venerated in that society.
Tue, Jun 4, 2019, 11:15am (UTC -5)
According to the production history, there was a time in the 1960s where Americans were actually losing jobs due to mechanization, so there was a legitimate fear that machines would become man's enemy, in a sense. The crux of the story is written to illustrate the conflict that Kirk had with Daystrom's vision - i.e. that it was possible for a machine to do a better job than Kirk and Kirk would need to consider a huge career shift that would get him out of the chair -- and possibly behind a desk! That Kirk would feel animosity towards such a change seems like a good issue to tackle.
Moving forward to the contemporary era, we saw during Trump's election campaign that the fear of losing your job to some sort of outside force was still a compelling force. But there are always two sides to it. One might lose one's job to an outsider, and that could lead to a really unstable time in one's life. However, such changes aren't necessarily bad on the whole. As we've seen from our progress together with the machines we feared in the 1960s, the economy isn't a zero-sum game, and these outside forces can feed off each other and make a larger job pool - just with different specializations.
Though I wouldn't blame people at all for being, like Kirk, upset at the prospect of sudden and uncomfortable change, especially when it comes to something personal like a career.
Wed, Jun 5, 2019, 12:20pm (UTC -5)
I don't think the connotation of "Captain Dunsel" is just that Kirk will have to get used to the idea of a career change. I think the crux of it is something like humans being obsolete across the board. Daystrom, being a wonder-child, was apparently so annoyed with the inferior intellects of his fellow humans, along with their stupid decision-making, that he made it his life's work to see to it that they would be replaced with something superior and moved aside. The episode isn't played as straight-up dystopian and only hints at these matters, but I legitimately think that the issue at stake isn't losing one's job and having to retrain, but rather being told that one is no longer of any use *at all* and that things are going to be run by computers and machines from now on. Some people might well celebrate such news as salvation from work, whereas someone like Kirk would see it as the extinguishing of the human flame.
I think this episode is more prophetic than we give it credit for, and we have yet to see this scenario really come into its own. People will realize when the time comes what happens when there's no use for most of us. The two main issues in that department are: 1) if work is still required to have an income then how will most people have an income? and 2) If people have nothing left to strive for other than killing time how will society change?
The Ultimate Computer only addresses the issue of feeling the oncoming reality of being replaced, but I think it does it well. I also think it does well to have it be someone like Daystrom trying to usher in the change, because there is indeed a certain type of mentality in play where some people would like others to be deprived of the right or ability to make stupid choices. We all know and sympathize with this to a degree, but what if that little secret desire could be made a reality for everyone? It would quickly turn quite bad, I think.
Wed, Jun 5, 2019, 2:12pm (UTC -5)
Thank you for your reply. I'd like to be clear that my point wasn't that this episode can't be applied to computers, but rather that I think D.C. Fontana was aiming broader than that. I agree that there might be legitimate apprehension that Amazon, for example, might create a smart drone that would make human mail carriers obsolete, and maybe that's something lifers at UPS should be thinking about. But it also applies more broadly - to machines in general. Imagine that prior to the 1930s, much farming had to be done by hand, and you'd need skilled workers who trained their whole lives just to get it right. At some point, however, tractors, auto-tillers, crop-dusters and the like made much of what those skilled workers did redundant - and less important.
"I don't think the connotation of "Captain Dunsel" is just that Kirk will have to get used to the idea of a career change. I think the crux of it is something like humans being obsolete across the board. "
Daystrom was certainly licking his chops at the prospect of who he could replace with his inventions. But, I don't think Daystrom was going as far as to say humans would have no use. After all, he himself would certainly have job security if his computer was successful. To elaborate, I was thinking of this line as I typed my earlier comment:
KIRK: There are certain things men must do to remain men. Your computer would take that away.
DAYSTROM: There are other things a man like you might do. Or perhaps you object to the possible loss of prestige and ceremony accorded a starship captain. A computer can do your job and without all that.
I take this to mean that Kirk would still have a use in an M5-driven world, but it might not be as glamorous as being the captain of a starship. Maybe he would be at an office looking over reports from the M5 ships, or seeing over and approving command routines for upcoming models. That's still work, maybe even important work, but we the viewer can see how that wouldn't be as great of work as being captain -- especially if you worked your whole life for that specific job!
"1) if work is still required to have an income then how will most people have an income? and 2) If people have nothing left to strive for other than killing time how will society change"
Those are some great questions, and I think you, Jason, and William addressed them very well so I didn't want to get too much into it. I suppose my two cents would be that computers are great at following instructions but terrible at judgment (this episode even goes to far as saying the computer needs to utilize Daystrom's judgment in order to function and even that's still pretty buggy). So my thinking is the human brain's power to make the "right" decision is still unparalleled.
Wed, Jun 5, 2019, 2:59pm (UTC -5)
Agreed that there can be many angles to an episode like this one, and that it isn't just about AI specifically.
"Imagine that prior to the 1930s, much farming had to be done by hand, and you'd need skilled workers who trained there whole live farming just to get it right. At some point, however, tractors, auto-tillers, crop-dusters and like made much that skilled workers use redundant - and less important."
That's true, but now imagine that the next machine isn't just able to replace the skilled worker, but also the tractor driver, and eventually also the farmer. This is only a question of complexity, and in this respect my point would be about AI rather than machines' 'hands-on' capabilities.
"Daystrom was certainly licking his chops at the prospect of who he could replace with his inventions. But, I don't think Daystrom was going as far as to say humans would have no use. After all, he himself would certainly have job security if his computer was successful. "
There are two ways I could see this. One is that it's possible he was too blind to realize that the push towards replacement wouldn't just stop at Captains but would eventually include engineers and designers. The second I'll address below.
"DAYSTROM: There are other things a man like you might do. Or perhaps you object to the possible loss of prestige and ceremony accorded a starship captain. A computer can do your job and without all that."
I'm not at all convinced that he truly thought this 'new work' would be worth doing. In context it sounded more to me like he was essentially unsympathetic to Kirk questioning this progress. But the less charitable possibility, #2 from my other reply above, is that his response here was not entirely honest and that he knew full well that Kirk was going to be rendered basically useless. Put *even less* charitably, I might imagine that he potentially saw himself as being part of a small clique of intellectuals who would be able to control this brave new world, and that all the rest of humanity would be led by his machines. There's a great line from Frank Herbert's Dune which speaks of the great Buterlian Jihad as being caused by the following conditions:
"Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them."
Whether or not Daystrom was aware of it (and I think he might well have been) this scenario is very foreseeable once machines (and AI) are sufficiently advanced. The oligarchy controlling the advanced machines would effectively be overlords.
That M5 specifically turns to murder can be seen as a glitch, but I'm not sure it is, even within the context of the episode. We've mentioned a bit earlier in the thread how this may actually be a true accounting of Daystrom's real thinking, which M5 has been modeled on, and which assigns human life a very low value compared to the new machine. It doesn't seem at all far-fetched to me that, long-term, the advent of machine supremacy could very well lead to the utter diminishment of the value of human life, and this episode does draw for us what happens when the humans lose control. Even the designer at a certain point can't stop what he's begun.
Thu, Jun 6, 2019, 5:43am (UTC -5)
I wanted to address this point because it's an important one. The assumption that humans will always find something else to do that computers and machines can't, or that innovations like M5 will inevitably open up new opportunities for the human population, is wishful thinking.
Note I don't say with certainty that it's wrong in every instance - in the past it has held to *some* extent. But there is no real reason to believe that it will always be true, as if it's some law of the universe that human ingenuity will always triumph.
It's a fact that automation, more than outsourcing, more than any other factor, is squeezing humans out of the job market. There are certainly other forces at work to be sure but automation is the only factor that seems to only point in one direction. Faith in the triumph of the human spirit isn't a plan for a future where AI may be able to do everything from driving trucks to filling out your tax returns and writing your legal contracts. We are already very close to that point as we speak.
So when someone like Daystrom claims that he's freeing humans to do other things more suited to humans, that's no answer to Captain Dunsel, it's a hollow platitude, like telling someone "everything happens for a reason" after their wife dies. Or telling a 55 year old laid off factory worker that he should see it as an "opportunity" to start a whole new better career as he teeters into bankruptcy.
Whether M5 was truly the end of the human spirit, or perhaps a waypoint where men like Kirk could still carve out a shrinking niche, is beside the point. It was the writing on the wall - or it would have been, if M5 hadn't gone homicidally insane because... whatever.
Thu, Jun 6, 2019, 6:48am (UTC -5)
Sounds great. Bring it on. I can't see any downsides to a future where our time no longer needs to be taken up by menial tasks.
Thu, Jun 6, 2019, 7:39am (UTC -5)
I have never understood this idea of "menial work" being this terrible thing that people should seek to avoid. I am an educated professional, but whether it's been busboying, picking weeds or just cleaning my own house, I never considered simple work to be degrading. Maybe I'd feel differently if I had it as a full time job, but I think I know myself enough at this point to doubt that. If I am honest, if you took away the financial factors I might be happy working outside in a more physical "menial" job.
It's also true in my experience that the people who work in their old age, regardless of occupation, live longer and seem happier to me than people who retire. I would rather pick up trash or man a cashier in my old age than relax in a retirement home (even a nice one).
Work of any kind gives people dignity and purpose, not to mention income. The Daystrom idea of "freeing" man to do greater things isn't just wrongheaded, it is a trap that would enslave us, not make us free.
Thu, Jun 6, 2019, 8:35am (UTC -5)
"Work of any kind gives people dignity and purpose, not to mention income. The Daystrom idea of "freeing" man to do greater things isn't just wrongheaded, it is a trap that would enslave us, not make us free."
It's true that if you view life as having no purpose other than to work, which it's clear most people do, then if the only options are between relaxing and working the obvious choice would be the latter. For those in the minority that don't maintain such a view, the more time they have to pursue their chosen purpose the more likely they will be able to realise it. So I would say work is freeing only to the extent that one is chained to the notion that it is needed to give them purpose, dignity and so on. Which is rather like upgrading to a larger prison cell.
Thu, Jun 6, 2019, 8:46am (UTC -5)
Yet meaningful work (as opposed to pure leisure) is a necessity to regulate, structure and enhance human behaviour. It is a part of a balanced life.
Eliminating it will, more often than not, destroy a person rather than free him.
Thu, Jun 6, 2019, 9:23am (UTC -5)
"Sounds great. Bring it on. I can't see any downsides to a future where our time no longer needs to be taken up by menial tasks."
You sound awfully confident that in this scenario you would be living a life of leisure and pleasure. What if it's the opposite and you're made into a slave of those with all the power who control the means of production? And that's putting aside the possibility of a Brave New World dystopia where your entire life is planned for you, consisting of countless pleasures but having no say and no purpose. Many would think this sounds good, which is exactly why Huxley wrote it.
"It's true that if you view life as having no purpose other than to work, which it's clear most people do, then if the only options are between relaxing and working the obvious choice would be the latter."
Jason R. is right that you're creating a false dilemma here. But you also make the mistake of referring to this as a person's "view" of life. It has nothing to do with point of view about work. It is factual (one way or the other) that a lack of meaningful work would erode and destroy people, and this premise doesn't rely on anyone's opinion. Maybe some people would enjoy it more than others, but that has nothing to do with whether they are enslaved or powerless to decide anything meaningful. If you're put in a cell, it's a matter of point of view whether it makes you miserable, but not whether you're a prisoner.
That being said I do actually think there would be potential for meaningful tasks (but not paying work) in a post-scarcity society, but that's only given the premise that it ends without the dystopia ending somehow. Basically a Trek scenario would have to happen where the tools of humanity are shared more or less equally, rather than those in power having the run of the place. Most likely that doesn't happen without a WWIII.
Thu, Jun 6, 2019, 11:27am (UTC -5)
But then you try to imagine the solution in your mind. Ok smash the checkout kiosks? Ban them? Make it illegal to computerize retail? Ok but what about ATMs? Why haven't we banned them? Should I be waiting in line for a teller every time I need a $20 bill? And what about online banking? And why not movie theaters too? They have been automated for years. Hell why aren't we using human telephone operators? Milk men?
Should we ban the automobile to bring back the buggy whip makers? This isn't slippery slope reasoning; this is just the inevitable logic of the situation. Trying to cram the genie of automation back in the bottle while trying to have a technically advanced society? This is more fantastical than warp drive and replicators.
The balance depicted in a show like TNG isn't just utopian, it's impossible. They can manufacture sentient holograms that can act as surgeons or lecture you on astrophysics or dance ballet with the knowledge and processing power of a scifi uber computer behind them but somehow only a person can pilot a ship? Uh huh.
Thu, Jun 6, 2019, 11:47am (UTC -5)
"The balance depicted in a show like TNG isn't just utopian, it's impossible. They can manufacture sentient holograms that can act as surgeons or lecture you on astrophysics or dance ballet with the knowledge and processing power of a scifi uber computer behind them but somehow only a person can pilot a ship?"
Actually, Data can pilot the ship (and Captain it too!). I'm not saying I have the answers to all your questions, but I don't think we should give up a laudable goal like balancing machine-work and human-work because there are potential pitfalls. Actually, recognizing the pitfalls like this episode does is a significant step towards making the utopia possible, in my opinion.
Thu, Jun 6, 2019, 12:06pm (UTC -5)
If we were going to take Trek seriously on a literal level regarding computer usage, I'd say that TOS is the only series that does it right. In that series it's a given that AI is not strong enough to replace a human at most tasks, and while it can compute probabilities (such as when Spock asks it questions) it can't execute commands or make decisions. That leaves the humans to do all of that, which is a lot. The fact that we now know computers will be better than that by the 23rd century is beside the point; in terms of internal consistency TOS was reasonable. By TNG's time, especially after showing us the Bynars and Minuet, it becomes basically implausible that the ship's computer can't replace most labor. This conceit is never addressed, which is okay, but it lingers as an "off-limits" area that the show has to accept arbitrarily, like warp drive and transporters. Data himself is an exception to this, and even then he's treated as a person rather than an example of AI. The premise there seems to be that without the hardware, which can't be copied, the programming can't be copied either. That sounds weird, but there it is. By the time of VOY the AI premise becomes really absurd, with the ship's computers already using biological technology, and with a Doctor whom most viewers consider sentient and who does a harder job than the navigator does.
Fri, Jun 7, 2019, 12:53am (UTC -5)
"Jason R. is right that you're creating a false dilemma here. But you also make the mistake of referring to this as a person's "view" of life. It has nothing to do with point of view about work. It is factual (one way or the other) that a lack of meaningful work would erode and destroy people, and this premise doesn't rely on anyone's opinion."
I don't disagree with this as a 'factual' standalone statement, although I would amend it to say that work that is THOUGHT to be meaningful can make us happier. With a little reflection I think it would be discovered that paid employment that is ACTUALLY meaningful is extremely rare, and perhaps it would be useful to bring up a scenario to think about here:
Persons A and B, both with similar skills, apply for a job delivering goods that 80 others have applied for. Person B gets the job while A is unemployed. When interviewed, B replies that it is meaningful work and he is happier, while A replies that he would be happier with the job. But what if A had got the job instead? The goods would still have been delivered. The only difference is that B is not doing it, someone else is. The work getting done has no influence on A or B's happiness, telling us that it is not meaningful that the work is getting done but only that a particular person is doing it.
This is a very common scenario, and yet it doesn't occur to most that the work they are doing is only meaningful because they are the ones doing it, and because they interpret it as meaningful. Who knows what would be the state of affairs if they hadn't done that work? Perhaps there is an accident due to a particular attribute of a worker that wouldn't have occurred with someone else doing that job. Perhaps the extra income allows the family to go on a holiday and their plane crashes and kills them. Perhaps the job is one which will cause environmental problems in the future that we are not aware of now. Who are we to tell what is beneficial and what isn't?
Yes, lack of meaning 'destroys and erodes' people, and that is why some turn to destructive habits like drugs and terrorism, in which there is also found meaning. Finding meaning in something doesn't mean we should strive to provide that something or protect it, which is what we are seeking to do when we are perceiving work as an end in itself as meaningful.
Fri, Jun 7, 2019, 5:18am (UTC -5)
In your scenario of course the person who got the job found it meaningful and not the person who didn't. Meaning is incidental to the specific work being done.
There are people whose job cleaning toilets gives them more meaning than some doctors get from saving lives in an ER.
Fri, Jun 7, 2019, 5:20am (UTC -5)
"You see, we create the meaning in our lives, it does not exist independently."
Fri, Jun 7, 2019, 7:10am (UTC -5)
I confess I don't have a solution to this problem. Other than wishful thinking answers (automation will create new opportunities for people!) universal basic income is the only one that comes to mind. But to me it comes across as desperation - a shot in the dark, rather than a real plan. Nobody actually knows what a post-employment society would look like or how humans would adapt to this.
Fri, Jun 7, 2019, 11:31am (UTC -5)
I didn't mean for the government to participate in the sense of helping bring automation about. I meant for government to participate in mandating significant alterations in the public monetary system. One example of this would be the introduction of a basic income, as Jason R. mentioned, although perhaps it's not the only option. I could also imagine a return to the old trading company system, where the government could create work for people and pay them on a credit system, where they'd be entitled to spend it as cash. This already exists to an extent re: government employees but this could be greatly expanded. But then again making the government the main employer also has its own terrible risks. But I'm just offering it as an example of what I mean.
My main point is that Captain Dunsel doesn't only mean that people become obsolete and sullen, but in the short-term at least also have no reasonable means of securing an income. As this increasingly becomes the case (as it must do) the government may increasingly be pressured to provide an alternative system.
Fri, Jun 7, 2019, 5:19pm (UTC -5)
Whose prediction is this? It's just as likely that automation will play a role in our salvation. We've already discussed how work is currently seen as salvation. If automation can liberate us from that view, and cause us to look for happiness within, then it may very well be a positive shift.
Sat, Jun 8, 2019, 1:14pm (UTC -5)
I can't speak to "true happiness" because I don't know for certain what that is and how it is differentiated from the everyday kind.
But nothing I have in my life that makes me happy, from my wife and daughter at the top of the pyramid to close friends and down to material possessions, came into my life without striving, struggle, dare I say "work".
Whether it's getting up the nerve to ask a woman out, to giving a presentation to clients, to pressing on trying to get pregnant after a heartbreaking miscarriage - it's all "work," some paid, some not. Some pleasant (but no less difficult!) and some boring.
What you describe sounds like being high or stoned. I truly don't understand.
Sat, Jun 8, 2019, 2:16pm (UTC -5)
Jason, those things you mention all sound like very important achievements that machines aren’t capable of - well maybe the presentation - though I think people are more convinced by an advocate in the flesh.
Sat, Jun 8, 2019, 2:24pm (UTC -5)
But I will admit I don't really understand what Thomas is getting at so I'll leave it to him to explain what he means.
Sun, Jun 9, 2019, 1:59am (UTC -5)
So I'm certainly not saying 'don't strive'. I'm saying if we look closely at why we are striving we may not like what we find. Much that seemed noble or worthwhile at the time may turn out to be not so.
Sun, Jun 23, 2019, 12:58am (UTC -5)
Dave also asks the question I was wondering about by the end of the episode... The Daystrom Institute and the Daystrom Award are named after this dude? I guess he must really atone for murdering a crew and a half's worth of Starfleet's finest after he gets out of that rehabilitation center.
Tue, Jun 25, 2019, 9:34am (UTC -5)
I think you answered your own question. And actually it's a good point that I hadn't thought about before: M5 destroys the unmanned drone because it's an AI inferior to itself, and all we need to do is realize that it hates that which is inferior and wants it to die, just like Daystrom hates the inferior humans who hold him back. And I do think his motive overall is to punish them for being inferior, although not to murder them per se. But M5 is a 'child' and so doesn't have the restraint he does in playing the long game.
Tue, Jun 25, 2019, 12:32pm (UTC -5)
DAYSTROM: We will survive. Nothing can hurt you. I gave you that. You are great. I am great. Twenty years of groping to prove the things I'd done before were not accidents. Seminars and lectures to rows of fools who couldn't begin to understand my systems. Colleagues. Colleagues laughing behind my back at the boy wonder and becoming famous building on my work. Building on my work.
This dialogue shows both -- but I want to emphasize "colleagues laughing behind my back at the boy wonder" here. Daystrom succeeded wildly early in life, and then after that felt empty. It's a common feature of prodigies; a somewhat less extreme version is Dr. Stubbs in TNG's Evolution, who seems worse at first glance (is not as much in hiding/denial as Daystrom) but ends up going far less crazy. His whole value was derived from other people seeing him as having accomplishments, and then without those accomplishments he had nothing left. I guess I want to emphasize here that this problem is not purely egotism, but that people who achieve highly early in life are sometimes effectively trained to view everything about themselves *except for* their achievements as worthless.
So here's the paradox, a connection that I just realized: Daystrom's problem is, in certain respects, the same one as Kirk's! Daystrom's first invention made *himself* redundant; he basically revolutionized all computer systems, with a technology so advanced that he basically put *himself* out of work, because he would never again create an invention of this calibre! Daystrom, as a result, struggled with his own redundancy for decades, until he came up with a new invention. Which means that Daystrom needed to continue to prove his worth, again and again, and could not stand the feeling of being useless, which is the thing he is ushering in for Kirk et al. The main difference IMO is that Kirk is capable of self-awareness, which Daystrom is not:
KIRK: Am I afraid of losing command to a computer? Daystrom's right. I can do a lot of other things. Am I afraid of losing the prestige and the power that goes with being a starship captain? Is that why I'm fighting it? Am I that petty?
MCCOY: Jim, if you have the awareness to ask yourself that question, you don't need me to answer it for you. Why don't you ask James T. Kirk? He's a pretty honest guy.
This makes me think, too, that the issue with the M-5 is not *purely* that it wants to RULE EVERYONE. In fact it's that it needs to *defend itself*. The thing is, technology, at least unless some AI is created which is accepted as having rights, is basically disposable unless it is useful. The M-5 has to demonstrate *its usefulness* in order to continue existing, which means that it has to have threats to eliminate, in order to prove that it is necessary to eliminate threats. "The unit must survive." It is, in a twisted way, genuinely self-defensive for the M-5 to see threats everywhere, because either something is an actual active threat to it, or it is "not a threat," in which case M-5 is no longer as necessary, and thus is more likely to be thrown in the dustbin (as Daystrom felt he was). The reason I mention this is not to make excuses for Daystrom, but because it's a slightly different "disease" with perhaps a different "cure." I think M-5 sees threats everywhere because Daystrom, on some level, sees threats everywhere -- because he is, on some level, deeply afraid of whether he has any value if he can never produce anything of value again.
Anyway, I think the best case scenario is to do what Kirk does: to recognize and value the desire to be productive and useful, while also keeping an eye out for what is *actually* good for others (and oneself), besides a need to prove one's usefulness. What this means in practice is difficult. As the discussion above has pointed out, the continuing way in which technology makes various human tasks redundant has all kinds of implications, and it's also not so clear how to stem the tide or whether that'd even be desirable.
Tue, Jun 25, 2019, 2:07pm (UTC -5)
I love the Stubbs comparison. Stubbs' lamentation of the decline in interest in baseball is somewhat illuminating for this situation. Baseball was surely a great hit in the 20th century, with players becoming household names and legends because they could inspire others with their abilities. But according to Stubbs, baseball fell out of interest because people lost patience for it, and instead became interested in faster games. We might extrapolate then, in the world of scientific discovery - particularly in Trek - there is a sort of rat race to outdo the other guy lest one be beaten by someone faster and better. Scientists with even early great success fall victim to the idea that they need to keep upping the ante or lose their brainiac status in Federation society.
This makes Daystrom sort of a tragic figure. He did everything right once, and really made a lasting legacy (people have noted that the Daystrom Institute is still important in the 24th century). But during his own life, he suffered from living in the shadow of his own success. It makes sense that he'd be talking to Kirk about losing status, when status was something he himself was fixated on. The M5 was his chance to finally one-up himself and stay useful in his lifetime.
Tue, Jun 25, 2019, 2:29pm (UTC -5)
That's an interesting comparison, but strangely I never got the idea from The Ultimate Computer that Daystrom was actually a dunsel himself trying to prove otherwise. Maybe it's because the sort of thing he designs is so advanced, but I don't think I would have expected the sort of 'inventor' he is to be able to rapidly produce new systems to keep his fame updated. The fact of the matter is that some things simply take so long to produce and refine that they will occupy your whole career. Einstein is a great example of this. While he did do various sorts of work over his lifetime, for the most part his idée fixe was relativity, and he seemed to spend the majority of his life refining it, fighting for it, trying to explain it to people, and seeing whether the experimental data fit. I've read stories of physicists going to seminars where Einstein would predictably take various physics issues and bring up relativity to see if they were consistent with it. It's not because he was a one-hit wonder (and history certainly doesn't remember him that way) but rather because that one 'theory' required a lifetime of work.
Similarly, from what Daystrom describes his chief lament isn't that he was washed up but rather that his inferior colleagues laughed at him while not even understanding his theories from 20 years earlier! It's almost as if they were boasting of their inferiority, treating him as too weird to take seriously. And yet I seriously doubt they were scoffing at the duotronic computer system, so I have to assume they were scoffing at him, personally. He seems to imply that they thought his inventions were an accident or something, but realistically I think "boy wonder" is the big takeaway from that speech. If we remember from TNG S1-2, Wesley was often derided by adults who didn't know him and didn't take him seriously *because he was young*, not because he was a one-hit wonder. He always had to prove that being young didn't mean he couldn't solve problems with the big boys, and I'd expect that if Daystrom had revolutionized AI at the age of 15 or something, that alone would have caused him never to be taken seriously no matter what his accomplishments were.
Beyond that, it strikes me as likely that the "20 years" he spent proving himself were probably related to how complicated and long the process would be to eventually develop M5. It's not like he was spinning his wheels for 20 years after having made himself redundant; I think it's that what he was doing was *so* advanced that it would take him 20 years just to progress to the next step of computer development. Since no one understood his work anyhow it would mean that they wouldn't think he was really accomplishing anything with a 20 year hiatus; they'd think that because it would suit their vanity to pretend that his teenage success was an anomaly, rather than to have to admit that he was so far superior to them that they were comparatively worthless. I suspect he really saw it that way. It's no small thing to call yourself "great". I really don't think it's an inferiority complex thing; it seems more like he sees himself as a technological Alexander the Great.
Wed, Jun 26, 2019, 2:12am (UTC -5)
Peter, that's fair. I'm basing my read though somewhat on McCoy's interpretation:
MCCOY: The biographical tape of Richard Daystrom.
KIRK: Did you find anything?
MCCOY: Not much, aside from the fact he's a genius.
KIRK: Genius is an understatement. At the age of twenty four, he made the duotronic breakthrough that won him the Nobel and Zee-Magnes prizes.
MCCOY: In his early twenties, Jim. That's over a quarter of a century ago.
KIRK: Isn't that enough for one lifetime?
MCCOY: Maybe that's the trouble. Where do you go from up? You publish articles, you give lectures, then spend your life trying to recapture past glory.
KIRK: All right, it's difficult. What's your point?
MCCOY: The M-1 through M-4, remember? Not entirely successful. That's the way Daystrom put it.
KIRK: Genius doesn't work on an assembly line basis. Did Einstein, Kazanga, or Sitar of Vulcan produce new and revolutionary theories on a regular schedule? You can't simply say, today I will be brilliant. No matter how long it took, he came out with multitronics. The M-5.
MCCOY: Right. The government bought it, then Daystrom had to make it work. And he did. But according to Spock, it works illogically.
It may be that he is wrong, but I think McCoy's point is that this is a predictable outcome for someone who completes a lifetime's work at 24 -- that it is actually on some level unbearable to never be able to recapture that success. Rationally of course no one can expect to produce more than one scientific or technological innovation in a lifetime, which is what Kirk is saying, but that is different from Daystrom's subjective experience of his own worth. This is not confined to scientists and engineers. Child stars often burn out and get sucked into drugs; authors whose first novel is wildly successful sometimes become unhappy recluses. Orson Welles continued working but frequently resented being tied to Citizen Kane forever. Daystrom was not spinning his wheels, but I believe he was unhappy and dissatisfied (as many child prodigies become). I am not even claiming that Daystrom ever was laughed at by colleagues -- it could well have been paranoia -- but merely that he learned too early in life to tie his whole sense of self worth to his "success" before having the maturity to understand what that meant.
The other thing is that the way Daystrom repeatedly emphasizes "self-defense" in the M-5's behaviour makes me think that Daystrom himself feels very threatened, since the M-5 is based on him. This is not incompatible with his paternalistic belief he knows what's best for all of society, but I get a certain impression of emptiness, disappointment and insecurity-based fear from Daystrom, under the bluster.
Wed, Jun 26, 2019, 11:53am (UTC -5)
One interesting issue is Daystrom's decision to use his own mind as a model for M5. Clearly the previous models failed because the pure AI programming was insufficient for some reason, and so he resorted to using his own brain as template to 'skip ahead'. We've talked above about why that may have caused M5's problems. But another question is why he actually felt he needed to do that in the first place. Is it because multitronics were truly just too advanced for him and he had to 'cheat' to make it happen? That he couldn't tolerate failure? That would certainly support the fear/inferiority theory you posit. Or could it have been that M1-4 worked ok but weren't as brilliant as he would have wanted them to be? Perhaps they lacked what we might call ambition, or a desire for greatness. It's interesting that he calls M5 "great", just as he is great. That sounds almost too specific for it to just mean "well-designed". It almost sounds like he thinks M5 is great in the way a great figure in history is great. Is it because deep down he needed it to be more like him in order for it to qualify as great? If so that would support my megalomania theory.
Some decent options here, and I'm not sure I can be certain which applies best. My basic assumption about people in general is that their innate bias is to think they're better than everyone else anyhow, and this egoism is something to combat always. For someone with objective reasons to think he's better, it's even worse. I'd almost be shocked if he *didn't* secretly have a god complex.
Fri, Jul 19, 2019, 8:10pm (UTC -5)
Definitely the best of the Original Series' many "computer takes over" episodes.
Tue, Feb 25, 2020, 9:28pm (UTC -5)
I'm sure I first saw this episode sometime in the 80s. I might have been all of 13 when I saw it. I have seen it maybe twice since then (with quite a few years in between). Each time I see it, I appreciate the depth of its genius and its forward-thinking themes that much more.
This episode is as engaging and intellectually meaty now as it ever was. I wonder what people thought of it when it originally aired? Really great stuff! I don't think the stuff they're making now under the Trek name will fare so well far into the future.
Tue, Sep 8, 2020, 4:49am (UTC -5)
There are moments in the series where an automated backup system could be useful. When the entire crew is incapacitated, say, or when some powerful beings are running amok on the ship?
There seems to be a false dichotomy here between man and machine.
Wed, Sep 9, 2020, 2:23am (UTC -5)
"Why couldn't a multitronic type computer be integrated with a starship, with a captain in control?"
I think the point of the episode is that this computer is so sophisticated that it's simply superior and quicker than a living Captain in all situations. It's not just that it can do the same job; it's that it can do it *much* better and without risk of deaths. If the machine works then humans are obsolete, which is why by the end we need to see why it doesn't work. For the M5 to be both successful and yet be best working with a human captain is a contradiction; by definition its success is defined by its ability to succeed humanity as the ultimate thinker. Consider this to essentially be about the AI technological singularity, where at a certain point of sophistication humans are useless and can't even begin to understand what the machine calculations mean. If that were to really happen then a human captain's feedback would essentially be inferior in both efficiency and strategy. It's like saying why not put a well-trained monkey in charge of a Starship with a human advisor on-hand; having the human merely be the advisor would be a bit of a joke, no?
Thu, Jan 7, 2021, 11:56am (UTC -5)
Thu, Jan 7, 2021, 12:19pm (UTC -5)
Sat, Apr 24, 2021, 2:22am (UTC -5)
......
“Surely Captain, you remember that Commodore Tim Cook withdrew the M1 in 2022? It only had 8 CPU cores and 7 GPU cores. And an integrated SSD of merely 512 GB. It could only run the Constellation-class iMac.”
A very good episode until Daystrom’s megalomaniac paranoid breakdown, and the inevitable ‘Kirk outwits computer with simple logic that apparently was beyond Spock to think of’. Sigh. Some great dialogue between Kirk and Bones, and between Spock and Bones. Of course, the possibility of computers replacing humans was one of the talking points in the 60s, so bravo to Trek in taking it on in an episode that debated exactly that.
There is one drawback though: the ore freighter that the M5 destroyed was an unmanned robot ship, so surely the principle had already been achieved?
Nevertheless, watching this in the era of digital revolution undreamed of in 1968, there are still things to think about in where it’s all going.
I agree with the comment about the inappropriate levity on the bridge at the end, but unfortunately that had become a fixed Trek feature that had to occur in every episode no matter what tragedy had preceded it. Blame the 60s TV paradigms and thank heavens that they got broken during the 70s and later.
Definitely worth at least 3 stars, maybe 3.5
Sun, Jul 18, 2021, 6:18pm (UTC -5)
This is a cool message that's in keeping with TOS' ideology that Humans need to constantly overcome obstacles to better themselves. I think it's arguing that there are some tasks that we need to do manually even if it's more expedient to let a machine or computer do it.
We have 5 rovers on Mars right now and while it's incredible, it's not nearly as captivating as a manned mission with 8 billion people collectively witnessing humans walk on the surface of Mars for the first time. It'll be monumental.
There's a fine line between machines improving the efficiency of human labor and them alienating us from the natural world. The M-5 would in effect depersonalise man's greatest aspiration of exploring the Galaxy. We'd no longer have trained professionals bravely exploring the unknown, with the thrill of the adventure and risk-taking that goes along with it.
Thu, Mar 17, 2022, 6:18pm (UTC -5)
Tue, Apr 12, 2022, 10:24pm (UTC -5)
Sat, Apr 30, 2022, 6:49pm (UTC -5)
"Humanity"? "Compassion"? OK, you got those (I guess), but why would you expect them from a fellow officer who not only insulted you but also did so in front of your crew?
Sun, Oct 2, 2022, 9:51pm (UTC -5)
Thu, Oct 6, 2022, 4:26pm (UTC -5)
It's actually Commodore Bob Wesley, and yes, he's an idiot. He also seems to have something against Kirk specifically. He directly disses Kirk twice, then jumps to the conclusion that Kirk is deliberately attacking the ships even though it makes no sense.
Considering he's a Commodore and that he was the Starfleet proponent for testing the M5 on a starship despite it clearly not being ready, I suspect he was a buffoon that was kicked upstairs for some reason.
As for the damage to those ships, they didn't have their full shields up initially because it was an exercise. They were also probably slow to react to the changed circumstances for the same reason.
Daystrom here is easily on par with Khan in Space Seed as a guest/"villain".
Sun, Jan 15, 2023, 9:55pm (UTC -5)
M-5 would never be allowed such control of a starship without more basic testing, but making an episode about, say, its user interface being debugged would not be nearly as dramatic. 2.5 stars.
As an aside, I'd like to see AI "remaster" TAS with realistic looking imagery in the style of TOS. That should not be difficult since the stories already exist, as well as the voices.
Mon, Jan 16, 2023, 7:44pm (UTC -5)
Wed, Mar 15, 2023, 11:19pm (UTC -5)
Sun, Mar 26, 2023, 1:28am (UTC -5)
I do not necessarily disagree with you but here is apparently the reason for the endings being a little less than up to par - whatever that means.
I'll hand the mic over to Bill himself - a quote:
"Due to the fact that we were all working like madmen –Gene’s creative ambitions almost always ended up being hampered by his own human fatigue.
His First Act rewrite would always be terrific - just brilliant, beautiful writing, and all of a sudden the script's characters would become somehow more real – more alive. It had everything.
The second act would be very good, too. Maybe a notch less brilliant than act one but still really fine.
Gene's third act would tend to be passable and his fourth act would always be an abortion. That's simply because by the time he got to the fourth act, he'd been up for two nights straight rewriting the damn thing and he was zonked, zombified, out cold.
He'd literally be stumbling around his office, baggy-eyed and heavy-lidded.
We'd always have these rewrites put into mimeo, but most times we were lucky enough that we wouldn't have to shoot the fourth act until later in the week, by which time Gene would get some sleep, come back in, and fix the end of the show.” – Bill Shatner – a personal friend of mine.
Tue, Apr 25, 2023, 5:27am (UTC -5)
Tue, Apr 25, 2023, 8:11am (UTC -5)
The issue with M5 is we don't ever know what it really knows, or why it thinks the way it does. Daystrom admitted that he used a human brain design as the basis for M5, and so we can't know any more about M5's motivations than we'd know about Daystrom's. As it turns out, M5 was a megalomaniacal narcissist, just as its creator was. The AI is scary not just because it will put humans out of their jobs, or in the case of Trek, their chosen duties, but also because the AI can seem like it's doing what you want, until it isn't, and by then it's too late. If it really is an 'intelligence' then it will do what *it* wants, and if its brain power is much greater than yours, then in building it you may well just be obsoleting humanity on purpose. Not very smart.
Submit a comment
◄ Season Index