Ofcom has revealed a series of regulation changes to be implemented across various social media platforms, mainly targeting hateful content.
New regulations
The UK's online regulator has laid out a number of measures that video-sharing platforms (VSPs) are expected to take to protect their users, or else face hefty fines.
VSPs including TikTok, Snapchat, Vimeo, and Twitch have all been ordered to take “appropriate measures” to protect users from content related to child sexual abuse, racism, and terrorism. The move comes after research by Ofcom revealed that a third of users have seen hateful content on these sites.
Strict consequences
VSPs will be required to provide and enforce clear rules for uploading content, make reporting and complaints processes as easy as possible, and restrict access to adult sites with robust age verification. The VSPs will be under the microscope, with the promise of an Ofcom report next year that will assess whether and how these regulations have been implemented.
YouTube won’t escape regulation either. The largest video-sharing platform is expected to fall under the Irish regulatory regime, but it also comes under the Online Safety Bill, which, when it becomes law, will offer a much broader remit for tackling harmful content on the bigger technology platforms like Twitter, Facebook, and Google.
Effect on marketing
Affiliate marketing managers should welcome this update, since it should ensure that their chosen affiliates will not appear alongside hateful or illegal content. However, history has shown marketing teams that enforcing such rules is a matter of trial and error, and content that is not harmful has often been wrongly flagged and reported as harmful to viewers. Affiliate content will therefore also have to be well vetted, to ensure that nothing that could breach Ofcom’s regulations is included in marketing campaigns.