SINGAPORE: According to the Ministry of Communications and Information (MCI), social media providers may soon be required to “limit access” to harmful content or to block accounts that spread such content.
To improve online safety, the government is consulting on two proposed codes of practice. The first would require designated social media services with a large user base or a high risk of abuse to implement system-wide measures to improve online safety for all users, with additional safeguards for children under the age of 18.
The second proposal would give the Infocomm Media Development Authority (IMDA) the power to require any social media provider to remove “egregious information.”
Minister for Communications and Information Josephine Teo announced in March that the government would introduce new codes to prevent harmful online content from appearing on platforms accessible in Singapore.
To “minimize users’ exposure” to harmful content, the first code would require social media platforms to put in place community standards and content moderation mechanisms.
They should also give users tools to help them limit their exposure to harmful content.
Social media platforms should detect and remove child sexual exploitation and abuse content, as well as terrorism content, on a proactive basis.
Users should be able to report harmful content and unwanted interactions, and this reporting mechanism should be easy to use and available at all times. When content is reported, social media platforms must review it and take appropriate action.
They must also submit an annual accountability report to IMDA for publication.
The second proposed code of practice covers content on social media platforms deemed to constitute “egregious online harms.”
These include content relating to sexual harm, self-harm, and public health, as well as content threatening public security or inciting racial or religious discord or intolerance.
Industry consultations began this month, and a public consultation will take place in July.