The digital darkness: Doomscrolling and you


Content is said to be king, but sometimes the content that social media serves up can pull users into a dark, depressing spiral. If you’ve ever had a bad day and found yourself bingeing on negative internet content the way you might work through a box of chocolates, it’s not just you: the phenomenon is called doomscrolling. Other users, and sometimes algorithms themselves, can lure you into the dimly lit corners of the internet. Here’s more about the digital darkness, and how to stop doomscrolling from pulling you into the void.

What is doomscrolling?

According to Healthline, doomscrolling means users are “spending an increasing amount of time reading negative news online without pausing”. Doomscrolling was first described and studied during the COVID-19 lockdowns, when people naturally spent more time online and many began to notice themselves being pulled into a loop of negative content.

Scrolling for negative content isn’t limited to news websites. It’s possible to get trapped in a scrolling loop on most platforms, including YouTube and Netflix. Anywhere content is scrolled or suggested, a user might be tempted to click on one negative headline or video, and then another.

Instead of seeing content that brightens up their day, users might see increasingly negative memes or videos. The more they scroll, the more the feed fills with triggering material, and users can slip further into negative content without realising it. They might feel anxious and depressed as a result, yet keep clicking on links that suggest only more of the same, which leaves them feeling even worse.

Consuming certain content carries mental health risks, just as consuming certain foods or beverages carries physical ones. In part, the phenomenon is like putting on every sad love song you’ve ever listened to just to have a good cry, except that the doomscrolling loop keeps users clicking on ever more negative content suggested by algorithms.

How doomscrolling traps you

Even though it seems like a modern issue, doomscrolling isn’t new at all. What we now call doomscrolling has links to what psychologists describe as negative confirmation bias: a similar thought loop in which someone notices increasingly negative or anxiety-inducing connections and keeps looking for more of them, consciously or subconsciously.

Many people treat Friday the thirteenth as an unlucky day. If something unlucky happens on that day, it confirms their fears, and they often take it as a sign that something else will go wrong too. The thought is rooted more in paranoia than logic, yet it’s a remarkably common one.

The opposite is positive confirmation bias, where people look for non-existent positive patterns or silver linings to reinforce a pre-existing idea. Gamblers often fall into this trap, believing that a certain number of wins (or losses) must eventually turn their luck around. A “winning streak” is positive confirmation bias at work, while feeling lost or trapped in negativity describes doomscrolling and negative confirmation bias.

The harm

Doomscrolling can worsen depression or anxiety without the user realising the potential harm. Scrolling might seem like a harmless distraction when you’re feeling anxious or down, but doomscrolling does the exact opposite.

The harm isn’t just psychological, either. According to a 2025 Vice feature, doomscrolling can also cause financial harm: Vice points out that people might spend at least 3.5 hours doomscrolling at work, which has the potential to cost employers $5,600 per employee.

There’s one more important question to ask about doomscrolling: are users at fault for seeking out negative internet content, or should social media take more responsibility for its moderation, suggestions and tags?

User or algorithm

Users who notice that they’ve started doomscrolling are usually advised to deliberately seek out more positive content, and sometimes to take a break from social media or switch to a platform they feel more comfortable with. A social media break, like a tolerance break from smoking weed, can help you return to a better experience.

Doomscrolling, however, isn’t just a user issue. Timelines differ from person to person based on various things, including cookies and site settings. This is why your uncle might see endless videos about John Deere tractors while you’re getting ads for McDonald’s or KFC. Once you’ve listened to seven Bullet for My Valentine songs, site algorithms might begin to suggest similar content. Suggested content and timeline settings can swing a positive social media feed toward a negative slant, locking users into seeing specific ads or content. Once you’ve clicked on enough similar things, the algorithm takes over the job of finding content for you.

Sometimes, the company you keep can also influence what is suggested for you. Advertisers can target you based on things your friends like; your friendships on social media platforms are no secret to algorithms.

Users should seek out positive content to avoid doomscrolling, but they should also check their settings, suggestions and tags. Be proactive: uncheck anything you don’t want to see, or search for how to stop certain tags being suggested to you.

How to fight doomscrolling

Social media and similar recommendation algorithms show users more of whatever the system thinks will keep them engaged; the more a user doomscrolls, the more “similar” things they might see. As an illustration, Sky News created fake “teenage” social media accounts as an experiment, and its journalists were quickly locked into progressively more suggestions featuring Andrew Tate and similar content.

If your timeline feels like it’s heading down the wrong rabbit hole, change your settings first. Algorithms aren’t always accurate or perfect; changing your settings or removing some suggestions from your timeline can nudge a platform toward showing more of what you’d like to see instead.

How, then? Start under the relevant website’s settings, which usually allow users to omit specific tags or topics from their timeline. When you see content you don’t like or want less of, look for a way to report or hide it (usually in the corner of the post itself). On YouTube, Netflix and Showmax, users can also indicate which suggestions they don’t want to view. This works for specific tags, but also for advertising: someone avoiding alcohol or gambling might not want to see those ads at all, and can block certain words or tags from their social media profiles entirely. The same approach works when fighting doomscrolling on your personal profiles, or when trying to see less bad techno and more music you actually enjoy.

Users can train algorithms. Do so. Do not allow suggestions to lead you down into the digital darkness.
