Social Media Trial Should Lead to Platform Redesigns


In a landmark case, a jury this week concluded that Meta and YouTube negligently designed their platforms and harmed the plaintiff, a 20-year-old woman named Kaley GM. The jury agreed with the plaintiff that social media is addictive and harmful and was deliberately designed to be so. This finding aligns with my view as a clinical psychologist: Social media addiction is not a failure of users, but a characteristic of the platforms themselves. I believe accountability must extend beyond individuals to the systems and incentives that shape their behavior.
In my clinical practice, I regularly see patients struggling with compulsive social media use. Many describe a pattern of “doomscrolling,” often using social media to numb themselves after a long day. Afterward, they feel guilty and stressed about the time wasted, but have had limited success in changing this pattern on their own.
It’s easy to see why scrolling can be so addictive. Social media interfaces are built around a powerful behavioral mechanism called intermittent reinforcement, explains Judson Brewer, an addiction researcher at Brown University, and it is the most powerful and effective form of reinforcement learning. It’s the same mechanism that slot machines rely on: users never know when the next reward (a shower of quarters, or a flood of likes and comments) will appear. Not every video in our feeds captivates us, but if we scroll long enough, we’re bound to come to one that does. The constant search for rewards traps us and reinforces itself.
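The slot-machine analogy can be made concrete with a toy simulation of a variable-ratio reward schedule. Here each scrolled item independently turns out to be “rewarding” with some fixed probability; the probability value is purely illustrative, not a measured property of any platform.

```python
import random

def scrolls_until_reward(p_reward: float, rng: random.Random) -> int:
    """Count how many items a user scrolls past before hitting a 'rewarding' one.

    Each item independently rewards with probability p_reward, which is
    exactly the variable-ratio schedule slot machines use.
    """
    n = 1
    while rng.random() >= p_reward:
        n += 1
    return n

rng = random.Random(42)  # fixed seed so the simulation is reproducible
gaps = [scrolls_until_reward(0.2, rng) for _ in range(10_000)]

# The average gap hovers near 1 / p_reward, but any single gap is
# unpredictable -- which is what makes the schedule so compelling.
print(round(sum(gaps) / len(gaps), 2))
```

The unpredictability is the point: because the user cannot tell whether the next reward is one swipe away or twenty, there is never a natural stopping cue.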
Why social media is addictive
Individuals usually struggle alone to combat compulsive social media use. This should come as no surprise, because habits are usually not broken by simple discipline, but rather by changing the reinforcement loops that maintain them. Brewer asserts that “there is actually no neuroscientific evidence for the presence of will.” By placing the responsibility for self-regulation solely on users, we miss the deeper problem: these platforms are designed to override individual control.
A growing body of research identifies social media use and constant digital connectivity as important influences on the increasing incidence of mental health problems among adolescents. Brewer notes that adolescents are particularly vulnerable because they are in a “developmental phase” in which reinforcement learning processes are particularly strong. This vulnerability can be exploited by design features of large social media platforms.
How platforms are designed to maximize engagement
NPR has obtained records from a recent lawsuit filed by the Kentucky attorney general against TikTok. According to these documents, TikTok implemented interface mechanisms, such as autoplay, infinite scrolling, and a highly personalized recommendation algorithm, that were systematically optimized to maximize user engagement.
TikTok’s algorithmically personalized “For You” feed continuously tracks user behaviors, such as how long a video is watched and whether it is replayed or quickly dismissed. The feed then curates short videos for the user based on their past scrolling behavior and what is most likely to capture attention.
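The principle behind such a feed can be sketched in a few lines. This is a minimal illustration of engagement-weighted ranking, not TikTok’s actual system: the signal names, weights, and scoring rule are all assumptions chosen only to show how watch time, replays, and quick skips could be folded into a single ordering.

```python
from dataclasses import dataclass

@dataclass
class WatchSignal:
    """Illustrative per-video engagement signals (hypothetical names)."""
    watch_seconds: float   # how long the user watched the video
    video_seconds: float   # total length of the video
    replayed: bool         # did the user replay it?
    skipped_fast: bool     # was it dismissed almost immediately?

def engagement_score(s: WatchSignal) -> float:
    """Toy scoring rule: reward completion and replays, penalize quick skips.

    The weights (0.5 either way) are arbitrary and exist only to
    illustrate the optimization target, not any real platform's values.
    """
    completion = min(s.watch_seconds / s.video_seconds, 1.0)
    score = completion
    if s.replayed:
        score += 0.5
    if s.skipped_fast:
        score -= 0.5
    return score

def rank_feed(candidates: dict[str, WatchSignal]) -> list[str]:
    """Order candidate videos by predicted engagement, highest first."""
    return sorted(candidates,
                  key=lambda vid: engagement_score(candidates[vid]),
                  reverse=True)

feed = rank_feed({
    "cat_video": WatchSignal(28.0, 30.0, replayed=True,  skipped_fast=False),
    "news_clip": WatchSignal(5.0,  60.0, replayed=False, skipped_fast=False),
    "ad":        WatchSignal(1.0,  15.0, replayed=False, skipped_fast=True),
})
print(feed)  # → ['cat_video', 'news_clip', 'ad']
```

Note that nothing in the objective measures whether the content was good for the viewer; only whether it held their attention.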
These documents show an example of a technology company that knowingly designs products to maximize attention. I think social media companies also have the ability to reduce addiction through intentional design choices.
How governments regulate social media
The good news is that we are not powerless. There are multiple levers for change: how we collectively talk about social media, how our governments regulate its design and access, and how we hold companies accountable for the practices that shape user behavior.
Some countries are moving quickly to establish policy regarding the use of social media. Australia has imposed a minimum age of 16 for social media accounts, and similar bans are underway in Denmark, France and Malaysia.
These bans are generally based on age verification. Users without a verified account can still passively watch videos on platforms like YouTube, but this approach removes many of the most addictive features, including infinite scrolling, personalized feeds, notifications, and follower and like systems. At the same time, age verification introduces problems of its own, such as the privacy risks of requiring users to prove their identity online.
Other countries target social media use in specific contexts. South Korea, for example, has banned the use of smartphones in classrooms. And the UK is taking a different approach; its Age Appropriate Design Code requires platforms to prioritize child safety when designing products. The code includes strict privacy settings, limits on data collection, and constraints on features that push users toward greater engagement.
How social media platforms could be redesigned
A report from Mental Health America, entitled “Breaking the Algorithm,” argues that social media platforms should shift from maximizing engagement to supporting well-being. It calls for revamping recommendation systems to detect patterns of unhealthy usage and adjusting feeds accordingly, for example by limiting extreme or distressing content.
The report also argues that users should not have to intentionally opt out of harmful design features. Instead, the safest settings should be the default ones. The report supports regulatory measures aimed at limiting features like autoplay and infinite scrolling while strengthening privacy and security settings.
Platforms could also give users more control by adding natural speed bumps, such as breakpoints or pause reminders while scrolling. Research shows that interrupting infinite scrolling with prompts like “Do you want to continue?” significantly reduces mindless scrolling and improves memory for the content viewed.
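A speed bump of this kind is technically trivial to add. The sketch below assumes a client that counts consecutively viewed items and shows a prompt at a fixed interval; the threshold of 20 items is an illustrative choice, not a value from any study or platform.

```python
def should_prompt(items_viewed: int, threshold: int = 20) -> bool:
    """Show a pause prompt after every `threshold` consecutively viewed items.

    `threshold` is an illustrative default; a real client might tune it
    or let the user configure it.
    """
    return items_viewed > 0 and items_viewed % threshold == 0

# Simulate a 45-item scrolling session and record where prompts would appear.
prompts_at = [n for n in range(1, 46) if should_prompt(n)]
print(prompts_at)  # → [20, 40]
```

The design question is not feasibility but incentives: every prompt is a chance for the user to close the app, which is precisely why engagement-maximizing platforms omit them.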
Some social media platforms are already experimenting with more ethical engagement. Mastodon, a decentralized open source platform, displays posts in chronological order rather than ranking them by engagement, and does not offer algorithmically generated feeds like “For You.” Bluesky puts users in control by allowing them to customize their own algorithms and switch between different feed types, such as timeline or topic filters.
In light of the recent verdict, it is time for a national debate about the responsibility of social media companies. Individual responsibility will always matter, but so does scrutiny of the mechanisms big tech uses to shape user behavior. While social media platforms are currently designed to capture attention, they can also be designed to give some of it back.