Bluesky issues its first transparency report, noting rise in user reports and legal demands
Bluesky’s Explosive Growth and First Transparency Report: A Deep Dive into Moderation, Security, and the Future of Decentralized Social Media
Bluesky, the decentralized social media platform that has positioned itself as a formidable alternative to X (formerly Twitter) and Meta’s Threads, has released its inaugural transparency report, offering unprecedented insight into its operations, growth trajectory, and commitment to user safety. The timing couldn’t be more significant, as the platform experiences explosive growth while navigating the complex landscape of content moderation in an increasingly polarized digital world.
Meteoric Rise: From 25.9 Million to 41.2 Million Users in a Single Year
The numbers alone are staggering. Bluesky’s user base expanded by nearly 60% in 2025, jumping from 25.9 million users to an impressive 41.2 million. This growth isn’t just about raw numbers—it represents a fundamental shift in how people are thinking about social media. The platform’s unique positioning as a decentralized alternative to mainstream social networks appears to be resonating with users who are increasingly concerned about privacy, algorithmic manipulation, and corporate control over their digital conversations.
What makes this growth even more remarkable is that these figures include not only users on Bluesky’s own infrastructure but also those running their own servers as part of the broader decentralized network built on Bluesky’s AT Protocol. This federated approach to social networking represents a significant departure from the centralized models that have dominated the social media landscape for the past decade.
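To make the federation concrete, here is a minimal sketch (in Python, using the requests library) of how any account on the network can be traced back to the server that hosts it: the handle is resolved to a DID via a public AppView endpoint, and the DID document in the PLC directory names the account's Personal Data Server. The endpoints shown are the publicly documented ones, but treat the snippet as illustrative rather than production code.

```python
# Minimal sketch: resolve an AT Protocol handle to its DID, then look up which
# Personal Data Server (PDS) hosts the account. This illustrates how accounts in
# the federated network can live on Bluesky's own infrastructure or on someone
# else's server. Assumes a did:plc identity and that the public AppView and PLC
# directory endpoints are reachable as documented.
import requests

APPVIEW = "https://public.api.bsky.app"

def find_pds(handle: str) -> str:
    # Step 1: resolve the human-readable handle to a stable DID.
    resp = requests.get(
        f"{APPVIEW}/xrpc/com.atproto.identity.resolveHandle",
        params={"handle": handle},
        timeout=10,
    )
    resp.raise_for_status()
    did = resp.json()["did"]

    # Step 2: fetch the DID document and read the PDS service endpoint, i.e. the
    # server that actually stores this account's repository of posts and media.
    doc = requests.get(f"https://plc.directory/{did}", timeout=10).json()
    for service in doc.get("service", []):
        if service.get("id") == "#atproto_pds":
            return service["serviceEndpoint"]
    raise ValueError(f"no PDS listed for {did}")

if __name__ == "__main__":
    # Example lookup; any handle on the network works the same way.
    print(find_pds("bsky.app"))
```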
Content Explosion: 1.41 Billion Posts and Counting
The platform’s content creation has been equally impressive. Users generated 1.41 billion posts during 2025 alone, representing 61% of all posts ever made on Bluesky since its inception. This content explosion includes 235 million posts containing media, accounting for 62% of all media posts shared on the platform to date.
This massive content creation speaks volumes about user engagement and the platform’s ability to foster meaningful conversations. Unlike some social networks where users primarily consume content, Bluesky appears to be cultivating a community of active participants who are not just passive consumers but creators and contributors to the digital dialogue.
The Transparency Revolution: Beyond Traditional Moderation Reports
While Bluesky had previously published moderation reports in 2023 and 2024, this year’s document represents a quantum leap forward in transparency. The comprehensive transparency report goes far beyond simple moderation statistics to encompass a wide range of operational metrics, regulatory compliance efforts, and security initiatives.
This move toward greater transparency comes at a crucial time when social media platforms are facing increasing scrutiny from regulators, users, and civil society organizations. By voluntarily releasing detailed information about its operations, Bluesky is positioning itself as a leader in the movement toward more accountable and transparent social media governance.
Moderation Reports Surge 54%: A Closer Look at the Numbers
The moderation landscape on Bluesky has evolved significantly over the past year. The platform saw a 54% increase in user reports, jumping from 6.48 million in 2024 to 9.97 million in 2025. While this might initially seem concerning, Bluesky emphasizes that the increase closely tracked the platform's nearly 60% user growth over the same period, suggesting that the rise in reports is proportional rather than indicative of a decline in platform quality.
Approximately 3% of the user base, or 1.24 million users, submitted reports in 2025. The top categories reveal fascinating insights into user concerns and platform challenges. Misleading content, which includes spam, accounted for 43.73% of all reports, while harassment represented 19.93%, and sexual content made up 13.54%. The remaining reports fell into various other categories including violence, child safety, rule violations, and self-harm concerns.
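As a quick sanity check, the category percentages can be turned back into approximate report counts against the 9.97 million total; the snippet below is straightforward arithmetic on the numbers quoted above.

```python
# Back-of-the-envelope check: turn the reported category shares into approximate
# report counts against the 9.97 million total (all figures from the report).
total_reports = 9.97e6
shares = {
    "misleading content (incl. spam)": 0.4373,
    "harassment": 0.1993,
    "sexual content": 0.1354,
}
for category, share in shares.items():
    print(f"{category}: ~{total_reports * share / 1e6:.2f} million reports")
# The misleading-content result (~4.36 million) matches the category total cited
# in the next section.
```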
The Spam Challenge: A Persistent Battle
Within the misleading content category, spam emerged as the dominant concern, accounting for 2.49 million reports out of 4.36 million total reports in this category. This highlights the ongoing challenge that social media platforms face in combating automated and malicious content that seeks to exploit user attention and platform resources.
Bluesky’s approach to spam appears to be multifaceted, combining user reporting with sophisticated automated detection systems. The platform’s ability to identify and address spam at scale will be crucial to maintaining user trust and ensuring that the platform remains a space for genuine human connection rather than becoming overwhelmed by automated content.
Harassment: A Complex Landscape
The harassment category presents a more nuanced picture. Hate speech was the largest named subcategory, at approximately 55,400 reports, but the majority of harassment reports fell into what Bluesky describes as a “gray area” of antisocial behavior. This includes rude remarks and other forms of negative interaction that don’t necessarily rise to the level of explicit hate speech but still create a hostile environment for users.
Other harassment subcategories showed significant activity, including targeted harassment (approximately 42,520 reports), trolling (29,500 reports), and doxxing (about 3,170 reports). The diversity of harassment types underscores the complexity of creating safe online spaces and the need for nuanced approaches to moderation that can distinguish between different levels of harmful behavior.
Sexual Content: The Labeling Challenge
Sexual content reports present a particularly interesting case study in Bluesky’s moderation philosophy. The vast majority of these reports, 1.52 million, concerned mislabeling rather than explicit violations: adult content that had not been marked with the metadata that lets users control their own moderation experience using Bluesky’s tools.
This finding suggests that Bluesky’s approach to adult content emphasizes user control and choice rather than blanket prohibition. By focusing on proper labeling and metadata rather than outright bans, the platform appears to be trying to balance freedom of expression with user safety and comfort.
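For readers unfamiliar with how that labeling works mechanically, the sketch below shows roughly what a self-labeled post record looks like under the AT Protocol: the label travels with the record itself, and clients decide how to render the content based on each viewer's settings. The label value is drawn from Bluesky's self-label set, but the snippet is a simplified illustration, not a complete client implementation.

```python
# Illustrative sketch of a self-labeled post record under the AT Protocol. The
# "labels" field travels with the record itself, so any client in the federated
# network can warn about or hide the content according to the viewer's settings.
# The label value is an example drawn from Bluesky's self-label set; this is a
# simplified illustration, not a complete client implementation.
from datetime import datetime, timezone

post_record = {
    "$type": "app.bsky.feed.post",
    "text": "New artwork, marked as adult content.",
    "createdAt": datetime.now(timezone.utc).isoformat(),
    "labels": {
        "$type": "com.atproto.label.defs#selfLabels",
        "values": [{"val": "porn"}],  # tells clients to treat this as adult content
    },
}
```

Seen through this lens, the 1.52 million mislabeling reports are essentially cases where this kind of metadata was missing or wrong on content that needed it.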
Violence and Extremism: Targeted Interventions
Violence-related reports totaled 24,670 and were broken down into several subcategories. Threats or incitement accounted for approximately 10,170 reports, glorification of violence represented 6,630 reports, and extremist content made up 3,230 reports. These numbers, while smaller than those in some other categories, represent areas where Bluesky has taken particularly aggressive action to protect user safety.
The platform’s approach to violence and extremism appears to be focused on rapid identification and removal, with a particular emphasis on preventing the spread of content that could lead to real-world harm. This aligns with growing concerns about the role of social media in facilitating radicalization and violence.
Automated Systems: The Silent Guardians
Bluesky’s automated systems played a crucial role in platform safety, flagging 2.54 million potential violations throughout the year. This represents a significant investment in technological solutions to complement human moderation efforts. The combination of automated detection and human review allows Bluesky to scale its moderation efforts while maintaining the nuanced judgment that complex content decisions often require.
One area where Bluesky reported particular success was in reducing antisocial behavior through automated systems that identify toxic replies and reduce their visibility. This intervention, which places problematic replies behind an extra click (similar to X’s approach), resulted in a remarkable 79% decline in daily reports of antisocial behavior. It suggests that subtle interventions that de-emphasize content rather than removing it outright can be highly effective at improving the platform experience.
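The report doesn't describe the underlying system, but the general pattern is easy to illustrate: a classifier scores each reply, and borderline content is collapsed behind a click rather than deleted. The sketch below is a hypothetical illustration of that pattern only; the thresholds and the toxicity score it consumes are placeholders, not Bluesky's actual pipeline.

```python
# Hypothetical illustration of the general pattern described above: a classifier
# scores each reply, and borderline content is collapsed behind an extra click
# rather than deleted. The thresholds and the score this function consumes are
# placeholders; this is not Bluesky's actual detection pipeline.
from dataclasses import dataclass

HIDE_BEHIND_CLICK = 0.7   # hypothetical threshold for collapsing a reply
REMOVE_AND_REVIEW = 0.95  # hypothetical threshold for removal plus human review

@dataclass
class ReplyDecision:
    visible: bool       # shown inline by default?
    collapsed: bool     # tucked behind a "show anyway" click?
    needs_review: bool  # escalated to a human moderator?

def triage_reply(toxicity_score: float) -> ReplyDecision:
    """Map a classifier score onto a graduated visibility decision."""
    if toxicity_score >= REMOVE_AND_REVIEW:
        return ReplyDecision(visible=False, collapsed=False, needs_review=True)
    if toxicity_score >= HIDE_BEHIND_CLICK:
        # The reply stays on the network but is de-emphasized in the thread view.
        return ReplyDecision(visible=True, collapsed=True, needs_review=False)
    return ReplyDecision(visible=True, collapsed=False, needs_review=False)
```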
Regulatory Compliance and Legal Requests: A Fivefold Increase
Perhaps one of the most significant findings in the transparency report is the dramatic increase in legal requests. Bluesky reported a fivefold increase in legal requests from law enforcement agencies, government regulators, and legal representatives in 2025, with 1,470 requests compared to just 238 in 2024. This massive increase reflects both the platform’s growth and the increasing scrutiny that social media companies face from regulatory authorities.
The nature of these requests likely spans a wide range of issues, from criminal investigations to intellectual property disputes to national security concerns. How Bluesky handles these requests while maintaining user privacy and platform integrity will be crucial to its long-term success and credibility.
Takedowns and Enforcement: Getting More Aggressive
The report confirms what Bluesky had hinted at in the fall of 2025: the company is indeed getting more aggressive about moderation and enforcement. The numbers are striking—Bluesky took down 2.44 million items in 2025, including accounts and content. This represents a significant escalation from the previous year, when the platform had taken down 66,308 accounts and its automated tooling had removed 35,842 accounts.
The breakdown of takedowns reveals interesting patterns. Moderators took down 6,334 records, while automated systems removed another 282. This suggests that while automation plays an important role, human judgment remains crucial for many content decisions. The platform also issued 3,192 temporary suspensions and 14,659 permanent removals for ban evasion, with most permanent suspensions focused on accounts engaging in inauthentic behavior, spam networks, and impersonation.
The Labeling Philosophy: A Preference for Transparency
Despite the aggressive enforcement actions, the report suggests that Bluesky prefers labeling content over outright removal. The platform applied 16.49 million labels to content in 2025, representing a 200% year-over-year increase. This approach aligns with Bluesky’s broader philosophy of giving users more control over their experience rather than imposing top-down restrictions.
Most of the labeling involved adult and suggestive content or nudity, reflecting the platform’s emphasis on proper categorization rather than prohibition. This labeling approach allows users to make informed decisions about what content they want to see while maintaining a diverse and open platform.
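A rough sketch of what that user-controlled experience looks like on the client side: the labels attached to a piece of content are checked against the viewer's own preferences, so the same post can be hidden for one person and merely flagged with a warning for another. The preference names and values here are illustrative and do not mirror Bluesky's exact preference schema.

```python
# Simplified sketch of preference-driven filtering on the client side: labels
# attached to a piece of content are checked against the viewer's own settings,
# so the same post can be hidden for one person and merely flagged for another.
# The preference names and values are illustrative, not Bluesky's exact schema.
from typing import Literal

Visibility = Literal["show", "warn", "hide"]

# Hypothetical per-user settings for a few adult-content labels.
user_prefs: dict[str, Visibility] = {
    "porn": "hide",
    "sexual": "warn",
    "nudity": "show",
}

def apply_label_prefs(labels: list[str], prefs: dict[str, Visibility]) -> Visibility:
    """Return the strictest visibility decision implied by the content's labels."""
    order = {"show": 0, "warn": 1, "hide": 2}
    decision: Visibility = "show"
    for label in labels:
        pref = prefs.get(label, "show")
        if order[pref] > order[decision]:
            decision = pref
    return decision

print(apply_label_prefs(["sexual"], user_prefs))          # -> warn
print(apply_label_prefs(["porn", "nudity"], user_prefs))  # -> hide
```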
Influence Operations: A Growing Concern
Bluesky’s report also highlighted its efforts to combat influence operations, with the company removing 3,619 accounts suspected of engaging in such activities. The report does not attribute these accounts to any particular country or actor.
The focus on influence operations reflects growing awareness of how social media can be weaponized for political purposes and the need for platforms to take proactive steps to protect the integrity of public discourse.
Age Assurance and User Safety
The transparency report also touched on Bluesky’s efforts regarding age assurance compliance, though specific details were limited. As platforms face increasing pressure to protect younger users, how Bluesky approaches age verification and child safety will be crucial to its long-term viability and regulatory compliance.
The Future of Decentralized Social Media
Bluesky’s first comprehensive transparency report represents more than just a collection of statistics—it’s a statement of intent about the future of social media. By embracing transparency, prioritizing user control, and taking a nuanced approach to moderation, Bluesky is positioning itself as a potential model for how social media platforms can operate in an era of increasing scrutiny and user demands for better experiences.
The platform’s growth, while impressive, is just the beginning. As more users become aware of alternatives to traditional social media and as concerns about privacy, algorithmic manipulation, and corporate control continue to grow, platforms like Bluesky that offer different approaches to these challenges may well represent the future of online social interaction.
The challenge for Bluesky will be maintaining this growth while scaling its moderation and safety efforts, navigating increasingly complex regulatory environments, and staying true to its decentralized principles. If the transparency report is any indication, the company appears to be taking these challenges seriously and investing in the infrastructure and processes needed to address them.
As we look toward the future of social media, Bluesky’s approach—combining transparency, user control, decentralized architecture, and thoughtful moderation—may well point the way forward for an industry that desperately needs new models and approaches. The success of this model could have profound implications not just for Bluesky but for the entire social media ecosystem.