AWARE’s Submission to the Public Consultation on Enhancing Online Safety For Users in Singapore
August 11th, 2022 | Children and Young People, News, TFSV
On 10 August 2022, AWARE made a submission to the Ministry of Communications and Information’s Public Consultation on Enhancing Online Safety For Users in Singapore. Our submission covered a wide range of issues relating to online harms, including technology-facilitated sexual violence (TFSV).
Since 2016, AWARE’s Sexual Assault Care Centre (SACC) has supported 747 clients who experienced TFSV; such cases constitute an average of 17% of all SACC cases annually. Given increased internet usage and children’s exposure to online content from an earlier age, we welcome the government’s recent efforts to tackle online harms, such as last year’s launch of the Alliance for Action to tackle online harms.
However, our approach towards addressing online harms can be further strengthened. Our submission highlighted gaps in the proposed measures for user safety (e.g. the need for greater clarity on content categories) and reporting mechanisms (e.g. the need to make information about take-down processes more accessible). Additionally, more safeguards can be introduced to ensure that young users are adequately protected in online spaces.
The recommendations we made were based on our experience supporting victim-survivors of TFSV, as well as legislation and bills on online safety in countries such as Australia and the United Kingdom. Our recommendations include:
- Under “User Safety”, setting out clear definitions and parameters of content categories (such as sexual content and cyberbullying content) and content (such as posts, comments and direct messages).
- Providing more clarity on disabling users’ access to and/or removing reported content, including timelines for the take-down process and avenues of appeal.
- Introducing additional safeguards for young users, such as strict enforcement of the minimum age requirement for social media platforms and pornographic websites, mandatory onboarding and the provision of a resource centre.
- Under “User Reporting and Regulation”, temporarily suspending content once a report is filed, even if investigations have not yet commenced or are ongoing.
- Under the Content Code for Social Media Services, including sexist and misogynistic speech in the existing list of “extremely harmful content”.