Last night, I found myself laughing so hard that I nearly fell out of bed. TikTok had decided that I needed to learn Tagalog, but not the polite phrases your aunt would approve of. No, it started feeding me a steady stream of sass: lessons in how to say things like “Bitch, who do you think you are?” in the sweetest singsong accent imaginable. One video turned into another, and suddenly, I had a new bad-girl vocabulary at my fingertips.
When I switched to Instagram, the vibe shifted. Out went the Tagalog lessons; in came an endless parade of workout gurus teaching me how to “grow my glutes in just 30 days.” Then, hopping over to Facebook for a breather, the algorithm smacked me with ad after ad for weight loss drugs.
It was like being trapped in three different funhouse mirrors, one reflecting my curiosity, one reflecting my insecurities, and one selling me a solution to the problem it just created.
If this sounds familiar, it’s because none of us are immune to the rabbit holes that social media algorithms dig for us. And while this affects everyone, for transgender people, the stakes and the impact often hit differently.
This article explores how these algorithm-driven rabbit holes shape our experiences, our identities, and even our self-worth.
The Algorithm’s Agenda: What You See Is Not an Accident
Social media isn’t random. Every video, post, or ad you see is the product of a carefully tuned machine built to keep you scrolling.
At its core, an algorithm is a set of instructions. On TikTok, Instagram, or Facebook, those instructions revolve around one main goal: maximize engagement.
- Engagement = profit. The longer you stay, the more ads you see. The more ads you see, the more money platforms make.
- Engagement = data. Every like, pause, share, and scroll tells the system who you are and what you care about, whether you admit it or not.
- Engagement = influence. Algorithms don’t just respond to your interests; they shape them, nudging your attention, preferences, and even your worldview.
That’s why your feed can shift in minutes. Laugh at one Tagalog insult, and suddenly you’re getting a semester’s worth of “bad girl” lessons. Stop for a few seconds on a booty-building reel, and the algorithm assumes you want more. It’s not asking, “What does she need right now?” It’s asking, “What will keep her here?”
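To make the incentive concrete, here’s a minimal sketch in Python of that ranking logic, with invented posts and a hypothetical `predicted_watch_seconds` score standing in for the enormous learned models real platforms use: score every candidate by how long it’s expected to hold your attention, then serve the winners.

```python
# Toy illustration of engagement-first ranking. The posts and scores are
# invented; real recommenders use large learned models, not a lookup table.

candidate_posts = [
    {"id": "tagalog_insult_lesson", "predicted_watch_seconds": 42.0},
    {"id": "glute_workout_reel",    "predicted_watch_seconds": 31.5},
    {"id": "friends_vacation_post", "predicted_watch_seconds": 9.8},
    {"id": "local_news_clip",       "predicted_watch_seconds": 6.2},
]

def rank_feed(posts):
    """Order posts by predicted engagement, not by what the viewer needs."""
    return sorted(posts, key=lambda p: p["predicted_watch_seconds"], reverse=True)

for post in rank_feed(candidate_posts):
    print(post["id"], post["predicted_watch_seconds"])
```

Nothing in that sort asks whether the content is good for you; the only question it answers is what will hold you longest.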
Rabbit Holes 101: From Curiosity to Obsession
The term “rabbit hole” feels whimsical, a nod to Alice tumbling into Wonderland. But in digital culture, it’s a darker metaphor. A rabbit hole is what happens when one click becomes twenty, and suddenly you’ve lost hours of your life to a subject you didn’t plan to explore.
Here’s how the cycle works (a toy simulation of the loop is sketched right after the list):
- Hook: The platform serves you something that sparks an emotional reaction—humor, anger, curiosity, or desire.
- Reinforcement: You engage, and the algorithm doubles down.
- Flood: Your feed becomes saturated with similar content.
- Entrenchment: Over time, your perspective narrows. You start seeing less of the world at large and more of the world the algorithm has chosen for you.
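As a thought experiment, here’s a toy simulation of that loop in Python. The topics, probabilities, and the 15% reinforcement bump are all invented; it’s a cartoon, not a model of any real platform, but it shows how quickly a feed can collapse around whatever got the first reaction.

```python
import random
from collections import Counter

# Toy rabbit-hole loop: engaging with a topic increases how often it is
# served, so the feed narrows over time. All numbers are invented.

random.seed(1)
topics = ["language_lessons", "fitness", "weight_loss_ads", "news", "cooking"]
weights = {t: 1.0 for t in topics}      # start with a balanced feed
user_likes = {"language_lessons"}       # the hook: one video gets a laugh

served = []
for scroll in range(200):
    pick = random.choices(topics, weights=[weights[t] for t in topics])[0]
    served.append(pick)
    if pick in user_likes:              # reinforcement: engagement is rewarded
        weights[pick] *= 1.15           # the algorithm doubles down

print("first 50 scrolls:", Counter(served[:50]))
print("last 50 scrolls: ", Counter(served[-50:]))
```

By the last stretch of the run, nearly every item comes from the one topic that got a reaction: the flood and entrenchment steps in miniature.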
For many users, this scenario just means extra dance tutorials or endless recipes. But for marginalized groups like transgender people, these rabbit holes can touch on deeply personal issues such as body image, gender identity, medical access, politics, and community.
The Transgender Experience: When Algorithms Hit Different
Algorithms don’t discriminate, but they do amplify. And when you’re transgender, what gets amplified can be complicated, even harmful.
Body Image Pressure
For trans women, feeds are often flooded with “feminization” tips, makeup hacks, breast enhancement exercises, or surgery advertisements. For trans men, it might be endless gym routines, binder recommendations, or facial hair growth supplements. While some of this is affirming, it can also feed dysphoria and insecurity.
What starts as a helpful tip can quickly turn into a flood of content suggesting you aren’t “enough” unless you buy, change, or fix something.
Medical Misinformation
Trans users often report being served ads for unregulated hormones, sketchy weight loss drugs, or “natural alternatives” to medical transition. Algorithms can’t distinguish between curiosity and desperation, so a single search can lead to weeks of potentially harmful content.
Community Connection vs. Exploitation
On the positive side, algorithms do help us find each other. From TikTok’s “TransTok” to Instagram’s gender-affirming fashion reels, many trans people discover community and validation online. But the flip side is that platforms monetize that connection: our identities become data points in someone else’s profit model.
Political Rabbit Holes
It’s not just personal. Algorithms also push political content. For trans users, this often means being bombarded with news of anti-trans legislation, hateful commentary, or fearmongering headlines. Staying informed matters, but doomscrolling through targeted outrage can wear down mental health fast.
Data Collection: The Invisible Price of Scrolling
It’s easy to joke about TikTok knowing you better than your therapist, but it’s not far from the truth. Social media algorithms rely on massive data collection (a rough sketch of how these signals combine follows the list):
- Behavioral data: How long you pause on a video, which posts you like, and what you share.
- Biometric data: On some platforms, your camera and microphone may capture subtle cues.
- Demographic data: Location, age, gender identity, device type.
- Cross-platform data: Your browsing outside of social apps, thanks to cookies and trackers.
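To see why that list adds up to a “map,” here’s a rough, entirely hypothetical sketch of how a few of those signals might be stitched into an advertiser-facing profile. Every field name and inference rule below is invented for illustration; real profiles are built by opaque models from far more data.

```python
# Hypothetical example of scattered signals becoming a profile.
# All fields, values, and rules are invented for illustration only.

raw_signals = {
    "behavioral": {"paused_on": ["binder_review", "hrt_vlog"],
                   "liked": ["transtok_comedy"]},
    "demographic": {"age_range": "25-34", "device": "Android"},
    "cross_platform": {"visited": ["pharmacy_site", "gym_membership_page"]},
}

def infer_interest_tags(signals):
    """Turn raw engagement signals into interest tags an advertiser can buy."""
    tags = set()
    watched = signals["behavioral"]["paused_on"] + signals["behavioral"]["liked"]
    if any("hrt" in item or "binder" in item for item in watched):
        tags.add("gender_transition")   # a deeply personal fact, inferred from pauses
    if "pharmacy_site" in signals["cross_platform"]["visited"]:
        tags.add("health_purchases")
    return tags

print(infer_interest_tags(raw_signals))  # e.g. {'gender_transition', 'health_purchases'}
```

The specific rules don’t matter; the direction of inference does. Mundane scrolling behavior becomes a label that can be packaged, sold, or shared.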
For transgender users, this raises specific concerns. Many of us already navigate privacy risks when it comes to medical care, legal identity, and safety. Add to that the possibility of platforms selling or sharing data, and suddenly your online habits become a map of your gender journey, accessible to advertisers, corporations, and sometimes hostile actors.
Humor as a Coping Mechanism
If there’s one thing trans folks are good at, it’s finding humor in the chaos. The Tagalog lesson rabbit hole? Absolutely hilarious. The workout reels? Eye-roll worthy but still oddly motivating. Even the constant bombardment of weight loss ads becomes meme fuel after a while.
Humor doesn’t erase the harm of manipulative algorithms, but it does give us a way to reclaim power. Laughing at the absurdity of being simultaneously told to “eat less, move more” and “grow a bigger butt” is a form of resistance.
The Mental Health Toll
Still, we can’t ignore the darker side. Algorithm-driven rabbit holes can intensify:
- Anxiety: Constant exposure to political threats or body image content raises stress.
- Dysphoria: Being shown “ideal” versions of gender presentation can deepen insecurity.
- Isolation: Seeing curated versions of other people’s lives can create unrealistic comparisons.
- Sleep disruption: The endless scroll is engineered to keep you watching, often at the cost of rest.
For a community already facing disproportionate rates of depression, anxiety, and suicidal ideation, these effects aren’t trivial.
Breaking the Cycle: Practical Steps
We may not be able to outsmart billion-dollar algorithms, but we can build healthier relationships with our feeds.
Audit Your Engagement
Notice what you pause on. That three-second hesitation can mean weeks of similar content. Be intentional about what you like, share, or watch to completion.
Curate Actively
Follow accounts that bring you joy, knowledge, or affirmation. Unfollow those that trigger dysphoria or stress. Algorithms learn from both presence and absence.
Set Boundaries
Use timers or app limits. Create phone-free zones before bed. Even small changes, like charging your phone outside the bedroom, can break the cycle.
Diversify Platforms
Don’t let one app monopolize your attention. If TikTok is giving you doomscroll vibes, switch to a podcast, a book, or a Discord community.
Remember the Goal
Platforms are built to serve themselves, not you. Reclaim your agency by asking: “Am I choosing this content, or is it choosing me?”
Advocacy and Responsibility
On a systemic level, the responsibility shouldn’t rest solely on users. Platforms must be held accountable for the ways algorithms manipulate and exploit marginalized communities.
- Transparency: Companies should disclose how recommendation engines work.
- Regulation: Lawmakers should demand stronger privacy protections for vulnerable users.
- Community Pressure: Users, especially marginalized groups, have power in numbers. Collective demands can push platforms toward more ethical practices.
The Bottom Line
Social media is both a lifeline and a labyrinth. It connects us, educates us, entertains us, and sometimes even saves us. But it also exploits us, manipulates us, and wears us down.
For transgender people, the stakes are heightened. Algorithms touch our identities, our bodies, our politics, and our mental health. They can make us laugh until we cry or cry until we can’t sleep.
The key is awareness. When you know the rabbit hole is there, you can choose how far you’re willing to fall. And sometimes, you can climb back out, phone in hand, laughing at the absurdity of it all.