Digital Violence Is Real Violence: Why Prevention and Justice Must Catch Up

Edith Mecha

Evidence shows online abuse escalating into real-world harm, exposing urgent gaps in prevention, policing, and accountability.

When a phone notification triggers fear rather than connection, something has gone wrong. For millions of women and girls, digital spaces have become sites of surveillance, harassment, intimidation, and control. This is not a marginal issue or a future risk. Technology-facilitated violence against women and girls is already reshaping lives, silencing voices, and undermining safety, justice, and democratic participation.

The evidence is now unequivocal. Digital violence does not stay online. It escalates. It follows women into their homes, workplaces, schools, and public life. Treating it as a lesser or purely virtual harm is no longer defensible.

The evidence is clear: digital violence is escalating

Recent global research shows that technology-facilitated violence is both widespread and intensifying. Prevalence estimates range from 16 to 58 per cent worldwide, with women and girls disproportionately affected, particularly those who face intersecting discrimination due to age, disability, ethnicity, sexual orientation, or public visibility.

UN Women's 2025 report, Tipping Point: The Chilling Escalation of Online Violence Against Women in the Public Sphere, provides some of the clearest evidence to date. Seven in ten women human rights defenders, activists, journalists, and public communicators surveyed reported experiencing online violence in the course of their work. More than four in ten reported offline harm directly linked to that abuse, including stalking, physical attacks, and threats at home.

For women journalists and media workers, the trend is especially alarming. Between 2020 and 2025, the proportion linking online abuse to offline harm more than doubled. This is not an anomaly. It is a pattern, and it confirms what survivors have long said. What begins on a screen often ends at the front door.

Young women face particularly high risk. Those aged 18 to 24 are several times more likely to experience digital violence than older women and are also more likely to encounter the same perpetrators offline. Digital harm compounds existing vulnerabilities rather than replacing them.

AI has intensified both scale and severity of harm

At the same time, artificial intelligence has changed both the scale and severity of abuse. Generative AI tools now make it easier and cheaper to create convincing harassment, impersonation, and non-consensual sexual imagery at scale. Nearly a quarter of women surveyed in 2025 reported experiencing AI-assisted online violence; among writers and public communicators, exposure rose to 30 per cent.

Deepfake sexual content illustrates this shift starkly. The overwhelming majority of deepfake pornography targets women, often without their knowledge or consent. Once created, this material can be replicated endlessly, shared across platforms, and weaponised to shame, silence, and control.

AI does not merely amplify abuse. It also obscures accountability. Anonymity, bots, and coordinated harassment campaigns make perpetrators harder to identify, while existing legal frameworks struggle to keep pace with the speed, scale, and cross-border nature of harm.

Technology is not neutral. It reflects the values, priorities, and blind spots of those who design and govern it. When gender is absent from AI design and regulation, inequality is embedded by default.

Online abuse follows women offline

Digital violence also does not exist in isolation. It follows women everywhere.

At home, cyberstalking and digital surveillance by current or former partners turn private space into a site of constant vigilance. At school, cyberbullying undermines girls’ confidence, learning, and sense of belonging. At work, online sexual harassment shapes career decisions, mental health, and economic security. In public life, gendered disinformation and online hate are deployed deliberately to silence women journalists, politicians, activists, and human rights defenders.

The consequences extend beyond individual harm. When women withdraw from digital spaces to protect themselves, democratic life narrows. Freedom of expression is eroded. Public debate loses critical perspectives. Inequality deepens.

Why current responses are failing survivors

Despite growing awareness, responses remain dangerously inadequate. Responsibility is still too often placed on victims and survivors to manage the risk. Report the abuse. Block the account. Change your settings. Leave the platform.

This approach is neither fair nor effective.

Law enforcement is frequently the first point of contact for survivors, yet responses are inconsistent and uneven. A woman’s initial interaction with police can determine whether she continues through the justice system or disengages entirely. When complaints are minimised or misunderstood, harm is compounded and trust is lost.

Legal frameworks also lag behind reality. Many forms of technology-facilitated violence are addressed indirectly through stalking or hate speech laws rather than being clearly named and prosecuted for what they are. Where legislation exists, enforcement is uneven, and cross-border accountability remains weak.

Technology platforms, meanwhile, continue to profit from engagement-driven systems that amplify abuse faster than it can be addressed. Safety tools remain reactive rather than preventative, placing the burden on those already harmed.

What prevention and justice must look like now

If prevention and justice efforts are to remain credible, this must change.

First, digital violence must be recognised as real violence. Laws, policies, and services need to treat technology-facilitated abuse as part of the continuum of gender-based violence, not as a secondary concern or a communications issue.

Second, survivor-centred and trauma-informed justice responses must become the norm. Police and judicial systems need training, resources, and clear mandates to respond effectively to digital abuse. Secure online reporting portals, specialised investigative units, and strong referral pathways to health, legal, and social services can reduce secondary victimisation and improve access to justice.

Third, accountability must shift decisively toward technology companies and the wider digital ecosystem. Safety by design, proactive detection of abuse, transparent reporting mechanisms, and cooperation across platforms are essential. Responsibility cannot rest solely on victim reporting.

Fourth, AI must be governed with gender at the centre. AI systems should be treated like any other product that poses public safety risks. This means mandatory risk and impact assessments, transparency requirements, and clear mechanisms for redress when harm occurs.

Finally, prevention requires sustained investment and partnership. Civil society, women’s rights organisations, educators, technologists, and policymakers must work together to challenge misogynistic norms, strengthen digital literacy, and support women and girls without blaming them for the violence they face.

As part of the UNiTE campaign to end violence against women and girls, the call to action is clear. Digital spaces must be safe spaces. Victims and survivors should never have to carry the burden of harm created by technological tools.

Policymakers can strengthen and enforce laws that recognise digital violence as a serious human rights violation. Justice systems can respond swiftly and respectfully. Technology companies can prioritise prevention over profit. Communities can believe survivors and challenge harmful behaviour wherever it appears.

Digital violence is not inevitable. It is the result of choices about design, governance, enforcement, and whose voices matter.

It is time to choose differently.

Digital violence against women and girls is widespread, escalating, and deeply harmful. Evidence shows a clear trajectory from online abuse to offline harm, intensified by AI and weak accountability. Ending it requires systemic change, survivor-centred justice, and prevention that places responsibility where it belongs.
