For years, social media companies insisted they were not interested in who you are, only in what you do. What you watch, what you like, what you linger on. Identity, they claimed, was incidental. Behavior was king.
That claim no longer holds.
Recent updates to TikTok’s privacy disclosures explicitly list gender identity alongside highly sensitive categories such as immigration status, citizenship, and sexual orientation. Even when wrapped in careful legal phrasing like “may collect” or “may infer,” the message is unmistakable. Gender identity is no longer treated as something personal and protected. It is being normalized as a data asset.
On its own, that shift would already be concerning. But context matters. Who controls the platform matters. Who sets priorities, responds to government pressure, and decides what data is retained or shared matters.
For transgender users, the growing influence of Trump-aligned billionaire Larry Ellison over TikTok’s U.S. ownership structure fundamentally changes how this data collection should be understood.
From Self-Expression to Data Extraction
TikTok built much of its cultural power on self-expression. For many transgender and nonbinary users, it became a place to experiment with identity, find language, and connect with community in ways that felt safer than offline spaces. Visibility often felt like progress.
That sense of safety masked a deeper reality. Every video watched, every creator followed, and every hashtag engaged with feeds systems designed to classify, predict, and optimize behavior. At first, those systems only needed to understand interests. Over time, that stopped being enough. Advertising models became more sophisticated. Recommendation engines demanded finer segmentation. Platforms were pushed to understand not just what users like, but who they are.
Gender identity, once considered too sensitive to handle directly, is now framed as simply another useful attribute.
This evolution is not ideological. It is economic. Identity increases precision. Precision increases value.
Why Ownership Changes the Risk
Technology platforms do not operate in a vacuum. They respond to owners, investors, regulators, and political pressure. Control determines whose interests are prioritized when conflicts arise.
Larry Ellison is not a passive or neutral figure in U.S. politics. He is a vocal Trump supporter with longstanding ties to Republican power structures and, through Oracle, to federal agencies and law enforcement contracts. His worldview is not a secret, nor is his proximity to administrations that have openly targeted transgender rights, healthcare, and legal recognition.
When a platform that now explicitly acknowledges tracking or inferring gender identity falls under the influence of someone aligned with anti-trans political movements, the risk calculus changes. This is no longer an abstract privacy debate. It becomes a governance issue.
Data does not have politics. Owners do.
How Gender Identity Becomes Actionable Data
Platforms often emphasize that they do not require users to explicitly disclose gender identity. That distinction sounds reassuring, but it misses the point.
Modern platforms rely on inference. Machine learning systems analyze behavior patterns at scale. They do not need declarations. They infer based on viewing habits, creator networks, language usage, engagement timing, and social connections.
A user does not have to say they are transgender for the system to decide they probably are.
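To make that concrete, here is a minimal sketch of how this kind of inference works in general. Nothing in it reflects TikTok's actual systems; the features, data, and model are hypothetical stand-ins for the behavioral signals described above.

```python
# Illustrative sketch only (NOT TikTok's pipeline): a generic classifier
# that assigns a probability for a sensitive attribute using nothing but
# behavioral signals. All feature names and numbers are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical engagement features per user:
# [share of watch time on trans creators, hashtag engagement rate,
#  follows inside a community cluster, late-night usage share]
X_train = np.array([
    [0.62, 0.40, 0.55, 0.30],
    [0.05, 0.02, 0.01, 0.45],
    [0.71, 0.35, 0.60, 0.20],
    [0.10, 0.05, 0.03, 0.50],
])
# Labels a platform might derive from self-disclosure or other inferences
# (again, hypothetical).
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# A new user who never declared anything still gets a score.
new_user = np.array([[0.58, 0.33, 0.47, 0.25]])
probability = model.predict_proba(new_user)[0, 1]
print(f"Inferred probability of sensitive attribute: {probability:.2f}")
```

The specific model does not matter. Any engagement-driven system can produce a score like this, and once stored, that score functions exactly like a declared attribute.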
Once that inference exists, it becomes actionable. It can be stored, cross-referenced, modeled, and retained. It can be accessed internally. It can be requested externally. It can persist long after a user deletes content or leaves the platform.
Under leadership aligned with Trump-era governance norms, the question is no longer whether this data exists. It is how resistant the platform will be to pressure to use it.
Data Is Not Neutral When Power Is Involved
Technology companies often claim that data is neutral and that harm only arises from misuse. That framing ignores how power operates.
Choosing to collect or infer gender identity is a decision. Choosing to retain it is a decision. Choosing how long it exists and who can access it is a decision. Those decisions are shaped by ownership priorities and political alignment.
Ellison’s history suggests comfort with expansive data use, close cooperation with government agencies, and a worldview that prioritizes state and corporate power over individual vulnerability. That does not require speculation. It is consistent with Oracle’s business model and Ellison’s public political behavior.
For transgender users, neutrality is not the absence of intent. It is the presence of protection. And protection depends on who is in charge.
Monetization Is Only Part of the Story
Much of the public conversation focuses on advertising. Gender identity as a targeting category. Identity as a way to sell products.
That is only part of the risk.
The more serious concern is normalization. Once gender identity is treated as a legitimate internal data category, it becomes easier to justify its retention and reuse for purposes beyond marketing. Compliance requests. Legal demands. Government inquiries. Policy shifts.
Trump-era governance demonstrated how quickly administrative power can be weaponized against marginalized groups. Databases that once seemed harmless became tools of enforcement and intimidation.
Data collected today does not disappear when administrations change. It accumulates.
Misclassification and Collateral Harm
Even if one assumes good faith, algorithms routinely get identity wrong. Trans people do not behave uniformly. Neither do allies, educators, or critics. Inference systems flatten complexity into probability.
There is no appeal process for algorithmic identity. No mechanism to correct a system that decides who you are. Misclassification can affect what content users see, how they are profiled, and potentially how they are exposed to risk.
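A rough, back-of-the-envelope calculation shows why flattening identity into probability goes wrong at scale. Every number below is invented for illustration, but the arithmetic holds for any classifier applied to a small group within a large population.

```python
# Hypothetical base-rate illustration (all numbers invented): even a
# "good" classifier mislabels many people when the target group is small.
population = 1_000_000
base_rate = 0.01          # assume 1% of users belong to the group
sensitivity = 0.90        # share of actual members the system flags
specificity = 0.95        # share of non-members it correctly ignores

members = population * base_rate
non_members = population - members

true_positives = members * sensitivity
false_positives = non_members * (1 - specificity)
flagged = true_positives + false_positives

print(f"Flagged users:                 {flagged:,.0f}")
print(f"Correctly flagged:             {true_positives:,.0f}")
print(f"Incorrectly flagged:           {false_positives:,.0f}")
print(f"Share of flags that are wrong: {false_positives / flagged:.0%}")
```

Under these invented numbers, more than four out of five flagged users are misclassified, and no one in either group ever sees the label, let alone gets to contest it.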
Under leadership aligned with hostile political forces, the consequences of being incorrectly categorized are not theoretical.
Visibility Was Never Consent
Trans visibility on TikTok was not an invitation to be cataloged. It was an act of survival, creativity, and community building.
Treating that visibility as permission to extract identity data flips the moral equation. It places the burden of safety on users instead of on the platform. It assumes marginalized people must self-censor to remain safe.
That assumption becomes far more dangerous when ownership aligns with political movements that have already shown willingness to legislate trans people out of public life.
Why This Moment Demands Scrutiny
This is not about predicting malicious intent. It is about recognizing structural risk.
A platform that acknowledges tracking gender identity, combined with ownership aligned with Trump-era politics, creates a scenario where trust should not be assumed. Data governance cannot be separated from political reality.
History does not reward communities that ignore warning signs.
Transparency Without Safeguards Is Not Protection
Listing gender identity in a privacy policy is not protection. It is disclosure. Disclosure without limitation normalizes risk rather than mitigating it.
True protection would require minimizing identity inference, limiting retention, resisting government overreach, and clearly walling off sensitive data from political pressure.
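As a sketch of what one of those safeguards could look like in practice, consider a policy layer that expires inferred sensitive attributes after a short window and refuses to release them externally. Every name, field, and threshold below is hypothetical; this is one possible design, not anything TikTok or Oracle has committed to.

```python
# Hypothetical policy-enforcement sketch: inferred sensitive attributes
# expire quickly and are never released to external requesters.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=30)   # assumed limit, for illustration
SENSITIVE_CATEGORIES = {"gender_identity", "sexual_orientation", "immigration_status"}

@dataclass
class InferredAttribute:
    user_id: str
    category: str
    value: str
    inferred_at: datetime

def is_expired(attr: InferredAttribute, now: datetime) -> bool:
    """Sensitive inferences older than the retention window must be purged."""
    return attr.category in SENSITIVE_CATEGORIES and now - attr.inferred_at > RETENTION_WINDOW

def purge_expired(store: list[InferredAttribute], now: datetime) -> list[InferredAttribute]:
    """Drop expired sensitive inferences outright instead of archiving them."""
    return [a for a in store if not is_expired(a, now)]

def release_to_external_request(attr: InferredAttribute) -> None:
    """Hard wall: inferred sensitive categories are never disclosed externally."""
    if attr.category in SENSITIVE_CATEGORIES:
        raise PermissionError("Inferred sensitive attributes are not disclosable.")
    # ...otherwise route through the normal legal-review process (not shown).

# Example usage with invented data:
now = datetime.now(timezone.utc)
store = [InferredAttribute("u123", "gender_identity", "inferred", now - timedelta(days=45))]
store = purge_expired(store, now)
print(store)  # -> [] : the stale inference is gone, not archived
```

The specifics are debatable. The point is that safeguards like these are ordinary engineering decisions a platform can make today, and declining to make them is itself a decision.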
Those actions require leadership willing to say no to power. Ellison’s record suggests the opposite instinct.
The Bottom Line
The concern here is not paranoia. It is pattern recognition.
When gender identity becomes a data asset, and control of that asset sits with individuals aligned to anti-trans political movements, transgender users are right to question what safeguards truly exist.
Trans people did not fight for visibility so their identities could be inferred, stored, and governed by those hostile to their existence. Technology should reduce risk, not quietly increase it under new ownership.
This moment deserves scrutiny, not reassurance. And for transgender users in particular, it deserves caution.

