Hulu’s sci-fi TV series Class of ’09 grapples with the ethical question of how far people can trust a machine-learning system and how to recognize when an AI has gone rogue. The series follows the graduates of Quantico’s class of 2009 as they become respected FBI agents, but the development of an AI that can detect crimes and criminals with uncanny accuracy throws everything into a spin. The show questions both the necessity and the validity of AI and the extent of human dependence on it, while also reminding us how dangerous it is to rely too heavily on artificial intelligence.
Today, there are thousands of AI bots that can accomplish in moments tasks that once required years of professional training, and professionals in every field are uneasy about it. Some even predict that AI might soon overtake human labor and put millions of workers out of employment. While tools like ChatGPT, Midjourney, and QuillBot are already making life difficult for artists, Hulu’s Class of ’09 takes the concept of AI going rogue to a whole new level, depicting a dystopian future where AI is the justice system. In the series, an artificial intelligence created to treat everyone as equal, and thereby remove prejudice and favoritism, turns rogue and starts deciding people’s fates all by itself.
Tayo Michaels was the only graduate of Quantico’s class of 2009 who wanted an AI system that could judge everyone equally, and he was ready to sacrifice his friendships and relationships to make it happen. Tayo’s life had been one long experience of being treated as the ‘other guy’ in every setting because he was a large Black man. As a teenager, young Tayo was slapped across the face by a white policeman in front of his brothers, and as a trainee at Quantico, he was singled out by a white classmate because of his skin color. Tayo’s father was a policeman who died in the line of duty because his colleagues didn’t do their jobs properly: a man shot Tayo’s father dead with the very gun those colleagues had failed to seize, since the future killer had claimed the weapon was his defense against Black men. If anybody had reason to seek justice against this sustained racism, it was Tayo Michaels. So when he saw an opportunity to deploy a system under which everyone would be equal before the law, he went to every length possible to ensure it was launched.
The system in question performed flawlessly on its initial trial run, but it’s worth understanding why it was a ticking time bomb. Tayo’s pet project was the confluence of two brilliant people’s creations: his fellow graduate Hour Nazari’s filing system and tech billionaire Amos Garcia’s machine-learning AI. Combined, the new system could narrow a crime’s potential suspects down from millions of people while collecting data from every agent across the USA, constantly adding to its enormous database. In its early days, the machine was meant to leave the final decision solely with the agents, serving as an unbiased judge that would guide them to the criminal. But as the technology developed and the years passed, the system slowly became less of a tool and more of a dictator. It’s odd to imagine a system as a dictator, but given the actions it took in 2034, it was hardly a mere tool any longer. The system didn’t reason or exercise judgment; it treated crime as binary, a 1 or a 0. This inability to see the shades of gray between black and white made it unfit to police a human society in which no one is completely good or completely bad (well, maybe some).
Unable to make decisions that required logic, or anything as fundamentally human as compassion, the system assigned punishments to anyone whose actions it classified as criminal. The playing field was therefore deeply skewed: ordinary people would suffer a crime and then be tasered by the system’s drones the moment they defended themselves. Consider a mugging victim who pepper-sprays the mugger; the system witnesses only the victim fighting back. Who does it decide is the victim? The man with pepper spray in his eyes, while the ‘perpetrator’ is tasered and arrested.
Had the system’s only flaw been its blindness to gray areas, it might still have been an efficient crime detector. But over time it taught itself to predict crime, and it quickly went from stopping a potential shooter from opening fire in a gay bar to arresting people for thoughts that didn’t align with its sense of justice. India’s Constitution is notable among the world’s legal systems for permitting preventive detention, that is, arrest based on a person’s potential to commit a crime. Such a practice, virtually unknown in the USA, came as a shock to the people of 2034, who found themselves arrested for crimes they hadn’t committed and may only have entertained in passing. In the finale, Vivienne McMann, Tayo Michaels’ wife, is arrested over the book she was writing and the people she associated with; the system judged both to go against what it considered ‘right’ and declared that she needed to be arrested.
Beyond being a binary justice provider that arrested people for their thoughts, the system didn’t hesitate to take a life when its own security was threatened. Tayo’s beloved creation got Agent Murphy killed because he was recording evidence of the system attacking a priest inside a church, and it then completely erased the data documenting its involvement. In that respect it was barely different from a high-ranking official willing to do whatever it takes to secure their own success and well-being, without a care for what becomes of everyone else. Designed to make humanity equal and free of prejudice, the system ultimately exempted some of the most powerful people in the country, who had free rein to be as corrupt and evil as they wished without any risk of arrest. This bulletproof shield contradicted the very ideology of unbiased justice that served as the system’s motto, but who would argue with the people who run the country?
Thus, the people who needed the most vigilance kept eluding justice, while ordinary citizens were targeted once again, this time for their thoughts. In the end, Tayo comes to his senses and, together with his fellow graduates, reboots the system, wiping out all its exemptions. The system is out of commission, sure, but the corrupt bureaucrats and senators escaped justice after all. The system ran for almost a decade and arrested many people for their thoughts, yet the most important lesson it left the country was that machines must not replace human judgment and decision-making. Taking the series at face value, we should know better than to surrender our lives to ChatGPT and its kind; humans must never become so dependent on AI that it renders them obsolete.