Australia is facing a surge in online harassment targeting women, particularly on live video platforms. In 2025 alone, the eSafety Commissioner received over 23,400 complaints related to image-based abuse and online harassment against women—a 41% increase on the previous year. The trend is driven by abusive behavior shifting into live, unrecorded spaces, where evidence is difficult to collect and policies are hard to enforce.
The closure of platforms like Omegle in late 2023 did not eliminate the problem. Instead, users migrated to alternative platforms, many of which lack adequate safety measures. Services such as Bazoocam and Chatroulette saw traffic spikes, while numerous smaller, often unregulated platforms emerged. Women continue to face unsolicited exposure, aggressive behavior, and a hostile environment on these services. As digital safety researcher Dr. Kira Psychas explains, “The user base didn’t disappear. It scattered… and the platforms that absorbed those users were, in many cases, even less equipped to handle safety.”
The Scale of the Problem
Australia collects detailed data on online harassment, revealing a disturbing pattern. According to the eSafety Commissioner’s 2025 report, 47% of Australian women aged 18-35 experienced online harassment in the past year. This number jumps to 63% for women using video-based social platforms, with 29% reporting “severe” harassment, including threats and image abuse.
Generational Trends and Activism
Younger generations, particularly Gen Z, have grown up with video communication as a core part of their social lives. Rather than abandoning these platforms over safety concerns, they are demanding improvements. The #SafeOnScreen campaign, launched on Australian TikTok, generated over 180 million views and pressured major platforms to roll out real-time AI moderation in Australia. Campaign founder Lily Tran summed up the sentiment: “We’re tired of being told to just log off.”
Regulatory Response and Enforcement
Australia’s Online Safety Act 2021 has been amended to include real-time video services as “designated internet services,” subjecting them to the same safety expectations as other online platforms. Failure to comply can result in fines of up to $780,000 per day for corporations. eSafety Commissioner Julie Inman Grant has emphasized that “the argument that live content couldn’t be moderated because it happened in real time… is over.” AI-powered real-time moderation technology now exists and is becoming standard for regulated markets.
Regulation alone, however, is insufficient: some platforms still treat safety as a compliance cost rather than a core value.
Platform Categories and the Future of Safety
Video chat platforms fall into three categories: legacy platforms that rebranded, Wild West newcomers with minimal moderation, and services built with safety as a foundational principle. The latter, such as pinkvideochat.com, prioritize identity verification, AI-powered moderation, and gender-specific safety features from the outset.
The difference is philosophical: platforms either patch unsafe architectures or engineer safety into their core design.
The Human Cost
The psychological impact of online harassment is significant. A 2025 study in the Australian Journal of Psychology found that women who experienced harassment on video platforms reported elevated anxiety for up to 72 hours afterward, with repeat exposure leading to “digital hypervigilance” akin to PTSD symptoms.
Consumer Demand and Market Trends
Australian women now prioritize safety features when choosing social platforms. A January 2026 survey by Canstar Blue found that 78% demand real-time content moderation, 71% want identity verification, and 66% seek gender-based filtering. This represents a clear commercial opportunity for platforms that prioritize women’s safety.
Conclusion
Australia has a strong regulatory framework and an engaged activist base. The question now is whether tech companies will treat women’s safety as a fundamental engineering problem rather than a secondary concern. The platforms that prioritize safety will not only avoid fines but also capture a valuable, underserved market. Failure to do so will result in lost users and diminished relevance in the attention economy.