
Filterworld and Code Dependent: potentially devastating outcomes of algorithmic technology

Filterworld by Kyle Chayka and Code Dependent by Madhumita Murgia illuminate the dangers of AI and other technologies, and how they damage modern life

Filterworld: How Algorithms Flattened Culture
Author: Kyle Chayka
ISBN-13: 978-1788706971
Publisher: Heligo
Guideline Price: £22
Code Dependent: Living in the Shadow of AI
Author: Madhumita Murgia
ISBN-13: 978-1529097306
Publisher: Picador
Guideline Price: £20

At first, two new books by journalists looking at the effects of technology on modern life seem unrelated.

In Filterworld, New Yorker tech and culture writer Kyle Chayka uncovers the pervasive, often unexpected ways in which social media and big platform recommender algorithms – the formulas that drive what we see on Instagram, TikTok or Amazon – have begun to shape not just our consumption of culture, but its actual production. In the snappily named Code Dependent, Wired and Financial Times writer Madhumita Murgia tackles tech’s topic du jour, artificial intelligence, the deeply concerning processes by which its algorithms are trained, and its unevenly distributed, frequently exploitative impacts.

But culture and AI turn out to be illuminating, merging paths for understanding one of the most important issues of our anxious times: how we are all now slaves to the algorithm. So does the similar approach both writers take in pursuing their theme. Rather than opting for a standard, distanced, third-person journalistic voice, Chayka and Murgia each take a highly personal, deeply felt approach to their subject, weaving first-person ruminations and personal history into their narratives. Both also very effectively allow a handful of individuals telling their own stories to confront the reader with the human consequences of creating and deploying technologies that are secretive by design and poorly understood even by their creators.

Chayka starts with a helpful background on the rise of algorithmic recommendation systems, then dives into how promoting what is most “popular” has become a self-fulfilling designation. Culture by definition is – or maybe, sadly, now has been – spiky, unique, complex, provocative, challenging, embodying the discernible voices of diverse artists and creators. Do we actually like what we see, and see what we like, online and, by extension, in real-life cultural spaces? Chayka argues our desires and interests are now methodically flattened into a bland sameness – the Filterworld – that we are algorithmically engineered to like and want: “We now live in an era of algorithmic culture.”



He offers the deceptively inoffensive example of coffee shops, now hip across the globe in exactly the same way: the cloned muted colours, clean lines, minimalist furniture, groovy lo-fi music, deftly placed plants and Instagrammable flat white. Like so much of cultural life, the coffee shop, he says, has been architected into an unexciting format designed to encourage shareable tweets, jpegs and videos.

Or consider the recommender systems on Netflix (or any other platform, really), which push “the user to coast along algorithmic lines of consumption”. Watch a film full of evil aliens or a Jane Austen adaptation and the algorithm relentlessly pushes ever more of the same. “Variety is a difficult concept for recommendation algorithms,” Chayka wryly notes.

His discussion of Amazon’s role in reshaping publishing in the most disheartening ways is another striking and convincing example of Filterworld’s workings, in part because it’s bolstered with facts and figures, forming a personal, though well-informed, hot take. However, in one of the book’s weaker moments, Chayka uses Sally Rooney (“currently upheld as a peak of western millennial fiction”) as a confusing example of… well, it’s hard to tell quite what. He seems to be arguing that she’s written books to fit the demands of what people want to tweet and post about and to suit online streaming (even though they were BBC TV, not streaming studio, adaptations).

“[H]er novels are inextricable from their parallel existence online,” he postulates, though many of us, perhaps less entangled with social media than Chayka, likely never noticed that aspect. And ironically, his catty Rooney detour seems a somewhat forced construct for enabling one knowing, bitchily funny sentence, so tailor-made for retweeting: “The 2020 TV iteration of Normal People might be best interpreted as a series of soft-core-pornographic GIF sets that would have been extremely popular on Tumblr had the platform not banned adult material in 2018.” Might it? Oh, miaow.

Chayka’s concluding documentation of how he goes cold turkey off social media is by now a well-worn journalistic book and article construct for Discovering How Addictive Social Media Really Is, with the usual, totally predictable sequence of 1. withdrawal, followed by 2. growing, angsty self-knowledge culminating in 3. returning to use it (oh no) but with, at least for now, greater restraint and insight. That bit and the Rooney-bashing might productively have been left out of this excellent delve into the surprisingly overlooked specifics of how algorithms are eating away at exactly what makes us most human: arts and culture.

Like Filterworld, Code Dependent is richly considered, but it is also meticulously supported with footnotes and a good index (both would benefit Filterworld). It’s simultaneously a thoughtful introduction to AI, because potentially forbidding technical details are unobtrusive and serve the very human individual stories, and a deep dive for anyone who already knows quite a lot about AI, or thinks they do – for exactly the same reason.


Every chapter in Code Dependent shocks, even though Murgia, a self-described “innate optimist” about technology, also puts a gentle case for AI’s potential benefits. The difficulty is in reconciling the potential with the current human costs and negative impacts. Despite AI’s many possible positives, Murgia exposes how most AI software and uses are based on a significant degree of data colonialism and exploitation of the “precariat”, a portmanteau term increasingly used to describe people such as gig workers struggling with job insecurity and financial and social instability. These people and their communities are profoundly vulnerable, and thus the easiest to exploit for personal data-gathering, for the ugly, mindless, repetitive work required to train AI systems, and for sometimes ill-conceived and poorly deployed AI community trials – all to produce or fulfil AI products and services the precariat will be able neither to afford nor to benefit from. They are, among so many other examples, the delivery cyclists and ride-share drivers, the social media content moderators and self-driving car software trainers, the healthcare app triallers and the AI revenge porn victims.

Seeking to better understand AI’s impacts, Murgia travels to both nearby and far-flung places to find those affected and asks “the small, human questions”. She talks to the heartbreakingly real people whose wellbeing and dignity are being sacrificed on the altar of ChatGPT, facial recognition technologies and monetisable AI apps and programs designed for the world’s better-off individuals and societies. She uncovers the ethically ambiguous ways in which such jobs and AI trials can also bring these communities new earning opportunities and possibilities of advancement, and even save lives. Yet most of the time, benefits are counterweighted by job insecurity, inequalities across global workforces, and exploitation of those too powerless to protest. It’s utterly riveting reporting.

Murgia also notes a salient fact: the majority of key research studies exposing AI’s inequalities and exploitations come from “women of colour outside the English-speaking West”. We owe these women an enormous debt of gratitude for this valuable professional work, but also for enduring – as I’ve observed them do – a particularly obnoxious and distressing level of online and professional harassment from indignant Tech Bro types and the inevitable mansplainers. Among others, Murgia namechecks Irish academic and researcher Abeba Birhane, who is now, to our immense national good fortune, an appointee to the Irish Government’s AI Advisory Council.

What Murgia and these researchers show us is that when it comes to AI, we are all, ultimately, the precariat, at the questionable mercies of large tech companies and AI algorithms. “Each of these stories could be yours. AI systems will impact your health, your work, your finances, your kids, your parents, your public services and your human rights, if they haven’t already,” she warns.

Unexpectedly, the book closes with a 2023 visit to the Vatican for an interfaith discussion on AI between Muslim, Jewish and Christian leaders, an unprecedented occasion upon which religious figureheads of these three major faiths signed a joint statement calling for AI designers to develop technologies according to six ethical principles. Murgia chats to Rabbi David Rosen, Ireland’s former chief rabbi, who thinks today’s giant tech companies, operating across borders “with their billions of users, hold immense power… and have an outsize impact on the world, reminiscent of the ancient role of religions in society”. While recognising the ironic parallels, Rosen says the event enabled these religious leaders to “remind the tech companies of the follies of power”. But it may not be a message industry leaders will hear, or hear loudly enough, even though major figures in the AI and tech sector were at the discussions.

Both these important books dovetail well to allow a reader to see the potentially devastating outcomes ahead for humanity – some, already happening – if the development and use of these two algorithmic technologies are left unchallenged, to primarily benefit large corporations or the already-privileged markets able to pay for new services. One can only hope these authors will be widely read and propel change.

Further Reading

Atlas of AI: Power, Politics and the Planetary Costs of Artificial Intelligence by Kate Crawford (Yale University Press, 2021). Rather than prying open the technologies, Crawford authoritatively looks at the consequences of AI on areas such as the environment, culture, society, work and politics. Among other prominent roles (inaugural visiting chair of AI and justice at the École Normale Supérieure, and the Miegunyah distinguished visiting fellow at the University of Melbourne), she’s a senior principal researcher at Microsoft Research, but this is anything but a cheerleading promotion of AI.

Data Ethics: Practical Strategies for Implementing Ethical Information Management and Governance by Katherine O’Keefe and Darragh O’Brien (KoganPage, 2018, revised 2023). A well-researched, practical handbook for understanding key concepts in data ethics and applying them within any organisation, from two Irish experts well-known in global data security and governance circles. More broadly, this is an excellent introduction to real, on-the-ground issues in data ethics including new developments in areas such as AI. Suggested readings in each chapter provide many more opportunities for exploring the subject in detail.

AI 2041: Ten Visions for Our Future by Kai-Fu Lee and Chen Qiufan (WH Allen, 2024). So often, speculative fiction is what truly helps us visualise and understand new technologies and their possible long-term implications for humans. This unusual and inspired pairing brings together Lee, a Taiwanese-born American venture capitalist and former president of Google China, and award-winning Chinese science fiction author Chen. Chen offers 10 stories; Lee dives into the factual science behind each. Much illuminating fun is had by all.

Karlin Lillington

Karlin Lillington, a contributor to The Irish Times, writes about technology