Stop asking why she took the nude, start asking why he shared it
May 5th, 2021 | Gender-based Violence, Letters and op-eds, News, TFSV
This op-ed was originally published in Channel NewsAsia on 5 May 2021.
SINGAPORE: It took unusual presence of mind from a National University of Singapore student for a new rash of Telegram chat groups circulating obscene images of women to come to public attention last month.
These groups were eerily similar to SG Nasi Lemak, a channel that made headlines in 2019 for the same criminal activities, its shockingly high membership count (around 44,000 in one group alone) and the subsequent arrests of some of its administrators.
Although most of us expressed disgust over the new groups, we ultimately moved on with our lives, as we have countless times before. Such incidents happen so often that we are becoming inured to them.
After all, in the year 2020 alone, AWARE’s Sexual Assault Care Centre saw 140 cases of technology-facilitated sexual violence, or TFSV.
Unfortunately, victims of TFSV do not always have the option to move on. In addition to experiencing levels of trauma comparable to those of physical assault survivors, as well as isolation from loved ones and potential professional repercussions, victims of TFSV face the constant, inescapable threat of being violated repeatedly – whenever their photos or videos are shared non-consensually with new recipients.
When their images end up online, women are often blamed for having participated in their creation. Instead of asking “why would someone do that to her?” we direct our ire at the person in the photograph, asking “why did you take that photo?”
We zero in on an individual woman and condemn her, without paying heed to the context in which the photo was taken or the content of the photo itself.
WHAT TYPE OF PICTURES GO ONLINE
Over the course of an eight-year relationship, Sara* exchanged intimate photos with her boyfriend.
But when she wanted to end things, he threatened to release those photos and videos if she went through with the break up. Scared and tired of the emotional abuse she was facing, she sought help from our centre.
Sara’s case falls into the first category of photos that are non-consensually circulated online: photos and videos co-created by couples in the context of a romantic, often sexual relationship.
The voluntary sharing of intimate photos can be an expression of love and intimacy. Such photos form part of the cultural phenomenon of “sexting”, an increasingly common element of adult romantic relationships.
But they transform into TFSV when they are shared non-consensually, either via hacking, or when the initial recipients or co-creators disseminate them without the victim’s knowledge and consent.
A second category comprises photos taken by someone known to the victim, where both their creation and dissemination are non-consensual.
We often see such visuals emerge in the context of domestic violence, with their production and distribution serving as means of control.
A client of ours, Ayesha*, shared that her ex-boyfriend forced her to take nude photos and videos, then sent these explicit materials to her family members after she filed a police report against him for harassment.
The third category comprises photos obtained through voyeurism – including upskirting, and photos taken while a woman is sleeping or in the shower. These might be non-consensually created and distributed, or kept for private “consumption”.
Finally, we have innocent images taken from women’s and girls’ own social media accounts and shared in chat groups without their consent. Whether the accounts are set to public or private, these photos are subjected to a torrent of sexualised comments and framing once shared in illicit groups.
Often these users are tweens who post everyday photos of themselves and their friends on Instagram: enjoying a meal, goofing around in the park and other non-sexual everyday activities. Sometimes the photos are digitally altered to exaggerate certain body parts.
In these Telegram chats, personal details of the victim, such as their name, address and links to social media profiles, are shared alongside visuals, turning non-consensual online circulation into offline abuse, including stalking, sexual harassment and assault.
The effects can be wide-reaching. Half of the 1,244 victims who reached out to the US Cyber Civil Rights Initiative had seen their full names and social media profiles published alongside their images. Over 20 per cent reported that their email addresses and phone numbers were published as well.
ASSIGN BLAME TO THOSE RESPONSIBLE
It should be plain to see that perpetrators are culpable for TFSV, not victims. Taking a photograph of yourself, or posing for a photograph, does not hurt anyone. Sharing those photographs without consent does.
So it does not help to focus on what women did that resulted in these Telegram chats, or chastise them for innocuous behaviour.
Focusing on a woman’s behaviour (for example, the fact that she took an intimate photo) is at the very heart of victim-blaming.
We have seen a similar pattern in online comments on sexual assault cases: fixation on women’s decisions (did they drink alcohol? what were they wearing?) and on their reactions in the aftermath of violence. This subtly diminishes the responsibility of the perpetrator and transfers it to victims.
Survivors who face victim-blaming are less likely to make police reports and seek further support in their recovery journeys. But women should not be held responsible for the behaviour of their boyfriends, husbands, exes, voyeurs, stalkers, underwear thieves or bosses who send unsolicited pictures of genitalia.
Until we clearly identify the point at which violence occurs, and assign blame where it belongs, we won’t see any real progress. Photos and videos of women will continue to circulate without their consent, and without accountability on the part of perpetrators.
Teaching consent from a young age should not be a matter of choice. It should be made non-negotiable. For older children, the Council of Europe recommends that parents use real-life examples to explain the risks, dangers, and legal issues surrounding sexting.
If children find themselves exposed, they should have the tools to discuss how to report the offensive materials with a trusted adult and find affirming emotional support.
WHAT MORE CAN BE DONE
Recent amendments updated Singapore’s Penal Code for the “smartphone age” by criminalising voyeurism, the non-consensual creation and distribution of intimate images, and cyber sexual exposure (i.e. sending vulgar images). Yet despite this progress, it’s clear the problem is far from solved.
Clients of AWARE’s Sexual Assault Care Centre share with us the countless hours they spend tracking down images, filing takedown requests with individual social media platforms, changing phone numbers and deleting social media accounts – all to protect themselves from further harm and achieve a modicum of relief after their privacy and autonomy are violated.
A huge amount of labour goes into those efforts. Yet we know that once a photo or video is out there on the internet, there’s no telling where it may end up. And the longer photos and videos stay online, the harder they become to remove.
We hope the new Protection from Harassment Act courts will be sensitive to these nuances, and allow cases to be filed and takedown orders to be issued within 48 to 72 hours.
We can model our system after other effective ones. Australia’s Office of eSafety Commissioner responds to complaints of image-based abuse within 48 hours of official reports being filed with them.
Social media platforms have a very important role to play here too. They must proactively implement steps to deter non-consensual photos or videos from being shared in the first place.
They should be required to share information across platforms so that a non-consensual image’s digital footprint can be used to prevent it from being uploaded on a platform after it has been taken down by another, instead of requiring victims to file complaints with each individual social media platform.
Social media platforms need to devote more human resources to tackling such cases and provide a clear time frame within which victims can expect assistance.
There’s much to be done before the horrible practice of TFSV can be put to bed. But attacking women for taking nudes is not part of that work.
Shailey Hingorani, Head of research and advocacy, AWARE