Schedule

Gender, Online Safety, and Sexuality Workshop (GOSS)
Sunday, August 11th, 2024, 2-6pm EDT (UTC-4)
Hybrid - In-person and Online

Event times (in EDT, UTC-4)
Doors open 2:00pm
Welcome 2:15 - 2:30pm
Keynote and Q&A 2:30 - 3:30pm
Nicola Henry, RMIT University

Abstract
Image-based sexual abuse (IBSA) is an umbrella term referring to the non-consensual taking, creating, or sharing of intimate images, including threats to share intimate images with others (“sextortion”), pressuring, threatening, or coercing someone into sharing their intimate images (“sexting coercion”), digitally altering images to falsely depict a person nude or engaged in a sexual act (“deepfake porn”), as well as the unsolicited and unwanted sharing of sexually explicit images (“cyberflashing”). IBSA is a form of technology-facilitated abuse, as well as a form of gender-based violence. Intimate images have particular potency due to the shame and stigma attached to a person's gender and sexuality, as well as other markers of identity, such as race, age, and ability. IBSA can have significant impacts on victim-survivors, including anxiety, depression, suicidal ideation, and other social, psychological, and physical harms, and it often co-occurs with other forms of abuse, such as sexual or domestic violence. This presentation has two key aims. First, I will provide an overview of current empirical research on the prevalence, harms, and drivers of IBSA, and discuss the challenges of conducting research in this field. Second, I will critically examine the efficacy of various legal and non-legal measures designed to detect, prevent, and respond to IBSA.

Bio
Nicola Henry (PhD) is Professor and Australian Research Council (ARC) Future Fellow in the Social Equity Research Centre at RMIT University (Melbourne, Australia). She is a socio-legal scholar with over two decades of research experience in the sexual violence field. Her research investigates the prevalence, nature, and impacts of sexual violence, including legal and prevention responses in Australian and international contexts; her current work focuses on technology-facilitated sexual violence and image-based sexual abuse. Nicola is undertaking an ARC Future Fellowship on digital tools, policies, and platforms and image-based sexual abuse, as well as two Google-funded projects: one involving a survey on image-based sexual abuse in ten countries, and the other investigating the prevalence, nature, and harms of AI-generated image-based sexual abuse (also known as “deepfake pornography”). She is also working on other projects, including one on alternative reporting for sexual assault survivors and another on preventing sexual violence. Nicola is a member of the Australian Office of the eSafety Commissioner's Expert Advisory Group and regularly advises technology companies to help shape their policies and practices around sexual violence and harmful online content.

Break 3:30 - 3:45pm
Lightning talks and Q&A 3:45 - 4:40pm
  • Culture, Gender and NCIDA: Exploring Non-Consensual Image Disclosure Abuse in Non-Western Societies, Amna Batool, University of Michigan
  • Image-Based Sexual Abuse: Trends and Prevalence, Rebecca Umbach, Google
  • Designing to support safer sexting in the face of relationship dissolution, Allison McDonald, Boston University
  • Understanding Privacy and Security Needs for Queer Health Technologies, Calvin Liang, Northwestern University
  • Towards Promoting Online Safety of Sexual and/or Gender Minority Youth, Mamtaj Akter, Vanderbilt University
Introduction to breakouts 4:40 - 4:50pm
Breakout sessions 4:50 - 5:30pm
Full group discussion 5:30 - 5:50pm
Closing 5:50 - 6:00pm