Listen: tackling online harms
About two-thirds (65%) of internet users aged 13 to 17 have experienced potential harms online, rising to 67% among 18-24 year olds, according to Ofcom research which shows 62% of all users experienced at least one potential harm online in the four weeks prior to the survey.
The most common online potential harms encountered by young users include generally offensive or ‘bad’ language (28%), misinformation (22%), unwelcome friend or follow requests (21%), trolling (17%), bullying, abusive behaviour and threats (14%), content depicting violence (14%) and hateful, offensive or discriminatory content that targets a group based on specific characteristics (14%).
Fewer than one in six young people (16%) report harmful behaviour when they see it online – the lowest proportion across all age categories. Some 77% of those who were bothered or offended enough took some form of action, most commonly unfollowing, unfriending or blocking the poster, or clicking the report or flag button or marking the content as junk. However, 51% said nothing had happened since they reported the content; a fifth (21%) said that the content had been removed.
The findings emerged as the Government’s Online Safety Bill continues to make its way through Parliament. Ofcom will enforce the new laws, and has already started regulating video-sharing platforms established in the UK – such as TikTok, Snapchat and Twitch.
Aston spoke to psychologist Jo Hemmings and Anna-Sophie Harling, online safety principal at Ofcom.
Photo by Glenn Carstens-Peters on Unsplash