Access to the internet has increased significantly worldwide in recent years. In Kenya, mobile data subscriptions have reached 57.18 million, and over 42 million smartphones have been purchased in the country, according to the Communications Authority of Kenya (2025). This growth reflects a strong demand among Kenyans to connect digitally, and the rapid adoption of technology has transformed how people work, learn, and interact, making the internet an integral part of daily life. The internet offers immense opportunities, but it has also become a new frontier for rogue elements such as cyberbullies: the anonymity online platforms offer increases the risk of exploitation and gender-based violence (GBV), particularly against women and girls.
Data from the Collaborative Center for Gender and Development (2024) indicates that one in three women globally is likely to experience physical and/or sexual violence in her lifetime, and 35% of women report having experienced such violence. As the world becomes increasingly connected and digital, technology-facilitated gender-based violence (TFGBV) is emerging as another form of GBV, perpetrated through harassment, abuse, and exploitation via social media posts, direct messages, and other digital interactions. According to the United Nations Population Fund (2024), 85% of women who use the internet have witnessed technology-facilitated GBV against other women, while 38% have personally experienced it.
As the world marks this year’s 16 Days of Activism, UN Women reaffirms its commitment to ending all forms of violence against women and girls, with a particular focus on TFGBV. This annual global campaign, which runs from November 25 to December 10, is a significant step in raising awareness and advocating for the elimination of various forms of violence against women and girls. As technology continues to shape our lives, it is imperative for researchers, policymakers, and, most crucially, technology providers to ensure that digital spaces and communities are safe, inclusive, and empowering for women and girls. In this article, I explain how behavioral science can be applied to understand and address TFGBV, fostering safer digital environments for women and girls everywhere.
Behavioral drivers of TFGBV
Online spaces are characterized by a high degree of anonymity and reduced accountability, which significantly increases the risk of TFGBV (Dunn, 2020). Most users create digital profiles under pseudonyms or fake names, making it difficult to trace their real identities. This perceived invisibility often gives perpetrators a false sense of freedom to act without consequence, leading them to post hateful, abusive, or degrading comments about others. Anonymity also emboldens perpetrators to share sexualized content, such as explicit images or videos of their victims, in the belief that they are shielded from social judgment or legal punishment. In the Global South, the non-consensual sharing of intimate images (commonly known as revenge pornography) has risen sharply, driven largely by the rapid spread of technology, yet most countries have ambiguous or little-known legal provisions to address it (Chisala-Tempelhoff & Kirya, 2016). The true extent of the problem is hard to determine because most survivors do not report these violations to the authorities, and this underreporting limits opportunities for justice and accountability (Gichana, 2025). Revenge pornography has become a harsh reality for thousands of women and girls across digital platforms. It often causes shame and guilt, leading to significant emotional and psychological distress, and has been cited as a contributing factor in some suicides (Chisala-Tempelhoff & Kirya, 2016).
In addition, interactions between men and women in online spaces are deeply shaped by gender norms, which are socially constructed expectations about the roles, power, and status of women in society (Bursztyn, Cappelen, Tungodden, Voena, & Yanagizawa-Drott, 2023). In patriarchal cultures, these norms influence how women are perceived and treated, both offline and online. Women who deviate from these social expectations, particularly public figures such as activists, journalists, and politicians, often face online trolling, harassment, and sexist attacks targeting their competence, families, or physical appearance. Such attacks are often rooted in discomfort with women who challenge traditional power structures and gender hierarchies (Dunn, 2020).
Perceived social approval also drives TFGBV in digital spaces. Perpetrators may engage in harmful behaviors such as name-calling, body shaming, or sharing private sexual images to gain acceptance or validation within their social circles. In the early 1950s, Solomon Asch conducted a series of psychological studies to understand the extent to which social pressure can push an individual to conform. The experiments demonstrated the powerful impact of social pressure on individual beliefs (Capuano & Chekroun, 2024): participants in a group setting often conformed to a clearly incorrect group consensus when faced with the social pressure of being an outlier, highlighting the desire to ‘fit in’ even at the expense of one’s own perception (Franzen & Mader, 2023). In digital spaces, this dynamic manifests through visible engagement metrics such as likes, comments, and shares, which create a false sense of endorsement. Hateful and sexually motivated content often goes viral, attracting thousands of comments, likes, and shares across platforms (Weale, 2024). High engagement with harmful posts can signal social approval to perpetrators, strengthening the perception that such behavior is acceptable or rewarded online (Dunn, 2020).
Using behaviorally informed interventions to address TFGBV
We can use behavioral frameworks such as the COM-B model to address some of these gaps. The model, developed by Susan Michie, Maartje van Stralen, and Robert West (2011), explains that Behavior (B) results from the interaction of Capability (C), Opportunity (O), and Motivation (M). It has proven successful in behavior change interventions and can be instrumental in promoting safety in digital spaces for women, because it helps identify the right drivers of behavior to target with interventions (Mayne, 2018). In this model, behavior includes the actions, reactions, and patterns of conduct that individuals exhibit in response to internal thoughts and feelings or external environmental factors. COM-B states that for any behavior to change, individuals must have:
- Capability – the knowledge, skills, and ability to perform the behavior (e.g., knowing and demonstrating how to use digital spaces safely or how to report online harassment and abuse)
- Opportunity – a supportive social or physical environment that makes the behavior possible (e.g., support and influence from other online users to use digital spaces respectfully)
- Motivation – the desire, intention, or belief that the behavior is worth doing (e.g., the importance of having a clean digital footprint)
What can be done to make digital spaces safer for women and girls?
- Building capability
Digital literacy and resilience among women and girls should be strengthened to empower them to recognize, respond to, and report instances of TFGBV. This can be achieved through targeted digital safety training by public and private sector actors that builds their understanding of online rights, platform safety tools, and the legal and policy frameworks that protect them. Training can equip them with practical knowledge on safe reporting mechanisms, both within social media platforms and through law enforcement channels (Mayne, 2018). In many cases, survivors of TFGBV remain silent, often due to shame, fear, or stigma, and may lack awareness of the laws and policies that guarantee their protection, such as the Computer Misuse and Cybercrimes Act (2018) and the Kenya Data Protection Act (2019).
Increasing awareness of these legal safeguards, alongside clear guidance on reporting processes, can help survivors seek justice and support without fear. Social media platforms can complement these efforts by embedding behavior change messages that affirm women’s right to safety online and encourage reporting. For example, platforms could display prompts such as:
- ‘Many survivors of online harassment and abuse have reported it and found support; you can too. You’re not alone in this.’ – Social norms
- ‘Any form of abuse or harassment (name-calling, body-shaming, or sharing nude pictures or videos) violates our community standards and may lead to account suspension.’ – Salience of consequences
Such continuous messaging, combined with strong reporting systems and survivor-centered support, can help foster a safer and more accountable digital environment for women and girls.
- Creating opportunity
Digital spaces should be intentionally designed to promote safety and accountability for all users by default. This requires that mechanisms for reporting abuse or harassment are clear, visible, and easily accessible, encouraging users to take action when violations occur (Mayne, 2018).
Platforms can achieve this by incorporating simple and intuitive safety features, such as a ‘one-tap reporting’ button that allows users to immediately flag offensive or harmful content. Online users are often afraid to report harassment or misinformation for fear of being identified (UNDP, 2021). However, effective design goes beyond ease of reporting: platforms must also act promptly on reported cases and provide survivors with regular feedback on the status of their claims and the actions taken against perpetrators. Additionally, behaviorally informed design features can enhance deterrence and accountability. For instance, platforms could implement a visible ‘strike system’ or temporary public flags on accounts associated with repeated reports of abuse, such as a red warning icon, to signal unacceptable behavior and discourage recurrence.
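To make this concrete, here is a minimal sketch of how such a strike system could escalate consequences as verified reports accumulate. It is purely illustrative: the thresholds, the Account structure, and the record_verified_report function are hypothetical assumptions for this sketch, not features of any existing platform.

```python
# Hypothetical sketch of an escalating "strike system" for verified abuse reports.
# All names and thresholds are illustrative assumptions, not a real platform API.
from dataclasses import dataclass

FLAG_THRESHOLD = 3      # verified reports before a visible warning flag appears
SUSPEND_THRESHOLD = 5   # verified reports before a temporary suspension

@dataclass
class Account:
    user_id: str
    verified_reports: int = 0
    publicly_flagged: bool = False
    suspended: bool = False

def record_verified_report(account: Account) -> str:
    """Apply escalating, visible consequences as verified abuse reports accumulate."""
    account.verified_reports += 1
    if account.verified_reports >= SUSPEND_THRESHOLD:
        account.suspended = True
        return "account temporarily suspended"
    if account.verified_reports >= FLAG_THRESHOLD:
        account.publicly_flagged = True  # e.g., a red warning icon on the profile
        return "public warning flag applied"
    return "report recorded; no visible action yet"

# Example: one account accumulating verified reports over time.
account = Account(user_id="example_user")
for _ in range(5):
    print(record_verified_report(account))
```

The important design choices here are not the code itself but where the thresholds sit, how decisions are communicated to survivors, and how flags are lifted once behavior improves.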
By combining user-friendly reporting tools with transparent follow-ups and visible accountability measures, digital platforms can create safer, more trustworthy online environments where all users can engage without fear of harassment or harm.
- Fueling motivation
Social media users should be encouraged to develop and sustain the motivation, intention, and belief that responsible online behavior is both valuable and rewarding (Mayne, 2018). This is essential in fostering long-term behavior change, where users remain consistently committed to using digital platforms respectfully and safely.
One effective approach is through public commitment devices, which allow users to visibly affirm their pledge not to engage in or support any form of TFGBV, including posting, sharing, liking, or commenting on abusive content. For instance, platforms can introduce digital badges or symbols of commitment displayed on user profiles. These serve as emotional rewards, creating feelings of recognition and pride, which motivate users to maintain positive online conduct to preserve their badge.
Such public commitments not only reinforce individual accountability but also help build a culture of collective responsibility, demonstrating visible support for keeping digital spaces safe, respectful, and empowering for women and girls.
Conclusion
As technology continues to shape how we connect, learn, and express ourselves, digital spaces must remain safe, inclusive, and empowering for everyone, especially women and girls. Addressing TFGBV requires more than technical solutions; it demands a holistic approach that integrates clear policy enforcement, behavioral insights, and design changes that facilitate platform accountability. By combining these elements, societies can shift online norms from tolerance of harm to a culture of respect and collective responsibility, ultimately ensuring safe online spaces for women and girls.
References and further reading
- Bursztyn, L., Cappelen, A. W., Tungodden, B., Voena, A., & Yanagizawa-Drott, D. H. (2023). How are gender norms perceived? (No. w31049). National Bureau of Economic Research.
- Capuano, C., & Chekroun, P. (2024). A systematic review of research on conformity. International Review of Social Psychology, 37(1).
- Chisala-Tempelhoff, S., & Kirya, M. T. (2016). Gender, law and revenge porn in Sub-Saharan Africa: a review of Malawi and Uganda. Palgrave Communications, 2(1), 1-9.
- Dunn, S. (2020). Technology-facilitated gender-based violence: An overview. Centre for International Governance Innovation.
- Communications Authority of Kenya. (2025). Mobile, data and digital services on the rise. https://www.ca.go.ke/mobile-data-and-digital-services-rise-ca-report-shows
- Collaborative Center for Gender and Development & University of Nairobi Women’s Economic Empowerment Hub. (2024). Rapid study on TFGBV in Kenya’s higher learning institutions. https://kenya.unfpa.org/en/publications/rapid-study-technology-facilitated-gender-based-violence-tfgbv-kenyas-higher-learning
- Franzen, A., & Mader, S. (2023). The power of social influence: A replication and extension of the Asch experiment. PLOS ONE, 18(11), e0294325.
- Gichana, A. (2025). Tech-fuelled violence: Why women are unsafe in digital spaces and struggle silently with scars. Nation. https://nation.africa/kenya/news/gender/beauty-queen-how-porn-revenge-dethroned-me–3578704
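- Michie, S., van Stralen, M. M., & West, R. (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6, 42.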
- Mayne, J. (2018). The COM-B theory of change model [Unpublished manuscript]. https://www.researchgate.net/publication/314086441_The_COM-B_Theory_of_Change_Model_V3
- Pilat, D., & Sekoul, K. (2021). The COM-B model for behavior change. The Decision Lab. Retrieved October 29, 2025, from https://thedecisionlab.com/reference-guide/organizational-behavior/the-com-b-model-for-behavior-change
- Weale, S. (2024, February 6). Social media algorithms amplify misogynistic content, says report. The Guardian. https://www.theguardian.com/media/2024/feb/06/social-media-algorithms-amplifying-misogynistic-content


