TikTok as Your 'Digital Nicotine'

Imagine picking up your phone for a quick break, tapping on TikTok, and then—almost without realizing it—minutes turn into hours. Video after video appears in a seamless, endless scroll, each engaging enough to keep you watching. For many, especially teens, this experience is an everyday reality. TikTok has captivated over a billion active users worldwide, drawing them into a highly personalized, addictive stream of entertainment. But as TikTok reshapes our habits, concerns grow around its “digital nicotine” effect—its ability to hook users and impact mental health.

To understand how TikTok became so engaging, it helps to look at the blueprint guiding its development. Complex systems like social media platforms are often built on reference models, design frameworks that ensure all parts of a system work together harmoniously. For TikTok, these models serve as the backbone of its user experience, organizing features like video feeds, comments, and recommendations into one addictive whole. These models guide developers in crafting a cohesive platform experience, making the app not just intuitive but, crucially, hard to put down.
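To make that idea a little more concrete, here is a minimal, entirely hypothetical sketch of what a reference-model-style layout for a short-video app could look like. Every name in it is invented for illustration (nothing here reflects TikTok's actual code); the point is only how a shared blueprint lets independent features plug into one cohesive experience.

```python
# Hypothetical sketch of a reference-model-style layout for a short-video app.
# All names are invented for illustration and do not describe any real platform.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    topic: str


class RecommendationEngine(ABC):
    """Role defined by the blueprint: decide what to show next."""

    @abstractmethod
    def next_videos(self, user_id: str, count: int) -> list[Video]:
        ...


class CommentService(ABC):
    """Role defined by the blueprint: attach discussion to a video."""

    @abstractmethod
    def comments_for(self, video_id: str) -> list[str]:
        ...


class FeedService:
    """Composes the roles into one scrollable feed,
    without caring how each role is implemented."""

    def __init__(self, recommender: RecommendationEngine, comments: CommentService):
        self.recommender = recommender
        self.comments = comments

    def build_feed(self, user_id: str, count: int = 3) -> list[dict]:
        return [
            {"video": video, "comments": self.comments.comments_for(video.video_id)}
            for video in self.recommender.next_videos(user_id, count)
        ]
```

The specific classes do not matter; the contract does. As long as every feature plays its assigned role, the pieces snap together into a single, seamless scroll.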

Social media platforms are more than just apps—they’re intricate ecosystems of algorithms, data, and human psychology. TikTok’s core feature, its recommendation algorithm, lies at the heart of this system, powering the iconic “For You” page. Every swipe, pause, and rewatch a user makes is fed back into this algorithm, which then tweaks the content to be even more engaging the next time they open the app. But here’s where it gets complex: according to a Bloomberg report, this same algorithm can inadvertently steer vulnerable users toward darker, harmful content, such as videos about self-harm and depression. For teens, this can create a feedback loop that not only amplifies their struggles but also makes it hard to disengage.
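To see the shape of that loop, consider a toy model I am inventing here purely for illustration (it is not TikTok's actual system, and the signals and weights are assumptions): each engagement signal nudges a per-topic interest score, and the next video simply comes from whichever topic currently scores highest.

```python
# Toy feedback loop, invented for illustration; signal names and weights are assumptions.
from collections import defaultdict

SIGNAL_WEIGHTS = {
    "watched_to_end": 1.0,
    "rewatched": 2.0,
    "liked": 1.5,
    "skipped": -0.5,
}


class ToyRecommender:
    def __init__(self):
        self.topic_scores = defaultdict(float)  # interest score per topic

    def record(self, topic: str, signal: str) -> None:
        """Feed one engagement signal back into the model."""
        self.topic_scores[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)

    def next_topic(self) -> str:
        """Serve whatever has held this user's attention the most so far."""
        return max(self.topic_scores, key=self.topic_scores.get)


rec = ToyRecommender()
session = [
    ("comedy", "skipped"),
    ("sadness", "watched_to_end"),
    ("sadness", "rewatched"),
    ("comedy", "liked"),
]
for topic, signal in session:
    rec.record(topic, signal)

print(rec.next_topic())  # prints "sadness": the loop serves whatever held attention
```

Even in this crude version, the danger is visible: the model has no notion of whether lingering on a topic is good for the viewer, only that it held their attention.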

Reference models play a crucial role in how platforms like TikTok refine and add features. These models help developers evaluate features against set standards, ensuring consistency and integration. TikTok’s recommendation system, for instance, didn’t just emerge—it was carefully developed and tested to drive user engagement. But as the Bloomberg article highlights, the algorithm’s relentless focus on engagement can come at a cost, promoting distressing content to vulnerable users. While reference models help build cohesive features, they need oversight to prevent engagement from overtaking ethical considerations.

TikTok’s “For You” page is the ultimate expression of how reference models shape the user experience. As users dive deeper, the “For You” page continuously adapts, presenting increasingly tailored content to maximize engagement. On the surface, it’s a feature that brings joy and entertainment to millions. But as Bloomberg’s findings reveal, this seemingly innocent feature can lead some users—especially teens—into a spiral of distressing, even harmful, content. This case highlights a core dilemma in social media design: how to balance engagement with well-being, especially when the model guiding development prioritizes user retention.
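To show how quickly that tailoring can compound, here is a small simulation with made-up numbers. It is not data from Bloomberg and not a model of TikTok's real system; it is just an engagement-weighted feed that keeps boosting whatever a hypothetical viewer lingers on.

```python
# Toy simulation with invented numbers: an engagement-weighted feed narrows over time.
# Each round, every topic is served in proportion to its current weight, and its weight
# grows with how long the (hypothetical) viewer tends to linger on it.

weights = {"comedy": 1.0, "music": 1.0, "sadness": 1.0}   # starting topic mix
linger = {"comedy": 0.3, "music": 0.3, "sadness": 0.8}    # assumed average watch fractions

for round_number in range(1, 201):
    total = sum(weights.values())
    for topic in weights:
        exposure = weights[topic] / total                 # share of the feed this round
        weights[topic] *= 1.0 + linger[topic] * exposure  # engagement-maximizing boost
    if round_number in (1, 50, 200):
        share = weights["sadness"] / sum(weights.values())
        print(f"round {round_number:3d}: 'sadness' fills {share:.0%} of the feed")
        # the share climbs from about a third toward essentially 100%
```

The numbers are invented, but the shape of the curve is the point: a system that rewards whatever keeps someone watching will, left unchecked, narrow the feed around it.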

Reference models are valuable tools for platform developers. They provide structure, consistency, and scalability, enabling social media apps to grow and evolve with ease. Yet, as the Bloomberg article suggests, without ethical guardrails, these models can also become enablers of harmful content. TikTok’s rise exemplifies this paradox: while the app’s cohesive design offers a streamlined, immersive experience, it can also spread content that impacts mental health. This tension underscores the need for reference models that incorporate ethical principles from the start, not as an afterthought.

The success of TikTok’s algorithm in retaining users also brings to light significant ethical challenges. While reference models lay the groundwork for engagement-focused features, they often sideline considerations of user welfare. The Bloomberg article illustrates how TikTok’s algorithm, fine-tuned for endless engagement, can inadvertently amplify content that risks harming vulnerable users, particularly teens. This highlights a pressing need for platforms to balance growth with a commitment to mental health—a delicate act that requires new frameworks to guide design in a socially responsible direction.

The rise of TikTok serves as both a triumph and a cautionary tale. The same reference models that made it so engaging also reveal the ethical responsibility tied to social media design. As TikTok and other platforms continue to evolve, developers, designers, and policymakers have an opportunity—and a duty—to consider the full impact of their creations. As users, we also play a role in advocating for a healthier digital space. Engagement is important, but it should never come at the cost of well-being. We must strive to build platforms that entertain responsibly, respecting not only our attention but also our mental health.
