Grief content haunts my digital life

I am a primarily visual thinker, and thoughts appear as scenes in the theater of my mind. When the many family members, friends, and colleagues who supported me asked how I was doing, I saw myself on a cliff, an impenetrable fog just past its edge. I am there, at the edge of the abyss, with my parents and my sisters, looking for a way down. In the scene there is neither noise nor urgency, and I wait for the fog to swallow me. I look for shapes and navigation cues, but it is so huge and gray and limitless.

I wanted to take this fog and put it under a microscope. I started googling the stages of grief, and books and academic research about loss, from the browser on my iPhone, scrolling through personal disasters while I waited for a coffee or watched Netflix. How will it feel? How will I handle it?

I began, intentionally and unintentionally, to consume other people’s experiences of grief and tragedy through Instagram videos, various news feeds, and Twitter testimonials. It was as if the internet were secretly conspiring with my compulsions, beginning to indulge my worst fantasies; the algorithms were a kind of priest, offering confession and communion.

Yet with every search and click, I inadvertently spun a sticky web of digital grief. In the end, it would prove nearly impossible to disentangle myself. My sad digital life was preserved in amber by pernicious personalized algorithms that had skillfully observed my mental preoccupations and served me ever more content about cancer and loss.

I got out, finally. But why is it so hard to unsubscribe from and opt out of content we don’t want, even when it’s harmful to us?

I’m well aware of the power of algorithms – I’ve written about the mental health impact of Instagram filters, the polarizing effect of Big Tech’s obsession with engagement, and the weird ways advertisers target specific audiences. But in my haze of panic and research, I at first felt that my algorithms were a force for good. (Yes, I call them “my” algorithms, because even though I realize the code is uniform, the output is so intensely personal that they feel like mine.) They seemed to be working with me, helping me find stories of people dealing with tragedy, making me feel less alone and more capable.


In fact, I was experiencing, intimately and intensely, the effects of an advertising-driven internet – what Ethan Zuckerman, the well-known internet ethicist and professor of public policy, information, and communication at the University of Massachusetts Amherst, dubbed “the internet’s original sin” in a 2014 Atlantic piece. In that essay, he explained the advertising model that delivers revenue to the content sites best equipped to target the right audience at the right time and at scale. This, of course, requires “an aggressive expansion of the world of surveillance,” he wrote. This incentive structure is now known as “surveillance capitalism.”

Understanding exactly how to maximize every user’s engagement on a platform is the formula for revenue, and it’s the foundation of the web’s current business model.

