24 Aug, 2021 11:00

Facebook apps responsible for nearly 50% of all online CHILD GROOMING cases during UK lockdown in 2020 – report

Facebook has been accused by a UK child protection watchdog of fuelling a spike in online grooming of children during the pandemic – with “risky design features” in Instagram and other popular apps being exploited by offenders.

In a new report, the National Society for the Prevention of Cruelty to Children (NSPCC) found that Facebook-owned platforms – including Instagram, WhatsApp and Messenger – accounted for nearly half of some 5,441 ‘sexual communication with a child’ offences recorded by police since April 2020.

However, the charity said the actual scale of online grooming was “likely to be higher,” blaming “tech failures” by the social media giant that led to a drop in the removal of abuse material during the 12-month period ending in March 2021.

According to data from 42 police forces across England and Wales, Instagram was the most common site used by groomers. It was flagged by police in 32% of the crimes where a platform was identified. Since 2017, the number of cases linked to the picture- and video-sharing site has almost doubled, police data showed.

Meanwhile, Snapchat was the second most flagged platform, linked to a quarter of the cases where a platform was identified. Together, the “big four” – Facebook’s apps and Snapchat – accounted for nearly 75% of all cases where the platform used for grooming was known to police.

Noting that “online child abuse is inherently preventable,” NSPCC child safety online policy head Andy Burrows told The Herald newspaper that the high figures were caused by the “inaction of social media firms” and their adoption of a “piecemeal approach ... instead of taking proactive steps to make sure that their sites are safe.”

As an example of the “far easier” ways for offenders to “contact and exploit children,” Burrows noted that groomers are able to simply “refresh the page” on some of the worst-performing platforms to “get a fresh list of children to contact as a result of the site algorithmically recommending them.”

In response, Facebook said it “works quickly to find, remove and report” this “abhorrent behaviour.” It claimed that changes were made earlier this year to “block adults from messaging under-18s they are not connected with” and said it had “introduced technology that makes it harder for potentially suspicious accounts to find young people.”

Although the tech firm said it scans images and videos on Instagram and Facebook to flag exploitative material so that it can be removed, the NSPCC said the company had removed less than half as much child abuse content in the last six months of 2020 as it had previously.

According to Burrows, this meant less “actionable intelligence” was passed to police during the “perfect storm” of a pandemic – at a time when children were online more than ever before. The NSPCC also called on Facebook to ensure its end-to-end encryption tech does not “compromise” child-protection tools.

Despite safety measures announced recently by Facebook, Apple and other firms, the charity said the platforms were “playing catch up” due to “historically poorly designed sites that fail to protect young users” – even though sending sexual messages to children has been a crime since 2015.

It said this showed the importance of the draft Online Safety bill – to be considered by a parliamentary committee next month – which would hold “named managers personally liable for design choices that put children at risk.”
