Facebook ‘fixing’ Instagram after Haugen leak: Nudging teens away from bad content & making Section 230 conditional on censorship
Facing accusations of failing to protect children, Facebook wants to introduce more restrictions and coercion on the platform. It also wants reform of Section 230 so its protections apply only to well-moderated platforms.
Facebook has been subjected to an onslaught of criticism after its former employee Frances Haugen leaked internal company documents to the press, and accused the tech giant of putting profits before the wellbeing of its users. One of the specific charges was that using Facebook-owned Instagram could be harmful to some vulnerable teenagers, affecting their self-esteem or even causing suicidal thoughts as a result of browsing too-good-to-be-real photos on the network.
“We can’t change human nature,” Facebook Vice President of Global Affairs Nick Clegg said, commenting on the allegations on CNN’s State of the Union program. “You always compare yourself to others, particularly those who are more fortunate than yourself. But what we can do is change our product, which is exactly what we’re doing.”
He said Facebook has paused its “Instagram Kids” project and, since conducting the internal research on Instagram’s effect on teens that Haugen leaked, has introduced several features limiting communications on the network. In the wake of the accusations, it will also move forward with plans for more parental control on the platform.
Additionally, Facebook will introduce two automatic oversight features that are meant to limit potentially harmful experiences, Clegg promised.
.@DanaBashCNN presses Facebook Vice President of Global Affairs Nick Clegg on tens of thousands of pages of internal research and documents, which were released by a whistleblower, indicating the company was aware of various problems caused by its platforms. #CNNSOTU pic.twitter.com/HrFAZw4cvy
— State of the Union (@CNNSotu) October 10, 2021
“Where our systems see that a teenager is looking at the same content over and over again, and it’s content which may not be conducive to their wellbeing, we will nudge them to look at other content,” he explained. The app will also be “prompting teens to simply just take a break from using Instagram.”
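Clegg offered no technical detail, but the mechanism he describes boils down to counting repeated views of flagged content and firing a prompt past some threshold. Purely as an illustrative sketch, assuming invented thresholds and a stand-in `is_sensitive` classifier (none of which reflect Facebook’s actual implementation):

```python
from collections import Counter

# Hypothetical thresholds -- Facebook has not disclosed real values.
REPEAT_VIEW_LIMIT = 5        # repeated views of similar sensitive content
SESSION_BREAK_MINUTES = 45   # continuous use before a "take a break" prompt

def is_sensitive(topic: str) -> bool:
    """Stand-in for a content classifier; an assumption, not Facebook's system."""
    return topic in {"extreme-dieting", "self-harm", "body-image"}

def check_nudges(view_topics: list[str], session_minutes: int) -> list[str]:
    """Return the nudges a session like the one Clegg describes might trigger."""
    nudges = []
    counts = Counter(t for t in view_topics if is_sensitive(t))
    if counts and max(counts.values()) >= REPEAT_VIEW_LIMIT:
        nudges.append("suggest-other-content")  # "nudge them to look at other content"
    if session_minutes >= SESSION_BREAK_MINUTES:
        nudges.append("take-a-break")           # "prompting teens to... take a break"
    return nudges

print(check_nudges(["pets"] + ["extreme-dieting"] * 6, session_minutes=50))
# ['suggest-other-content', 'take-a-break']
```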
Online products telling children to stop using them and spend their time in a more productive way is a relatively popular approach in Asia. South Korea, for example, banned kids from online gaming at night a decade ago, though in August 2021 the government announced it would scrap the law. China introduced strict limits on children’s online gaming in 2019.
The Facebook official wouldn’t say how much time on social media apps he would consider advisable, saying it “varies from person to person.” But apparently the algorithms telling teens to knock it off would know better.
Facebook strongly disagrees with Haugen’s characterization of its motives, saying the amount of resources it puts into making its products better for users is a testament to how it values user experience above profits. “We have invested over the last several years $13 billion in this kind of work,” Clegg said, adding that the sum was “more than the total revenue of Twitter over the last four years.”
Facebook's Nick Clegg says Section 230 of the Communications Decency Act should be changed: "My suggestion would be to make that protection, which is afforded to online companies like Facebook, contingent on them applying... their policies as they're supposed to." #CNNSOTU pic.twitter.com/CiJ2gB2UAn
— State of the Union (@CNNSotu) October 10, 2021
The executive argued that the social media industry would benefit from increased regulation, in particular by reforming Section 230. This legislation shields social media companies from liability for the content they host; it was intended to protect budding online platforms from the burden of excessive content moderation.
Clegg argued the protection should be “contingent on [platforms] applying the systems and their policies as they are supposed to.” This presumably would mean that only big companies that already have the resources to police content would be shielded from lawsuits.
Commenting on Haugen’s pending testimony to the House Select Committee investigating the January 6 riot, Clegg insisted Facebook bore no responsibility for those events. Haugen has accused the social network of dismantling, after the 2020 presidential campaign, the unit that tackled misinformation, a move that she says paved the way for the riot.
Facebook’s ranking algorithms have effectively reduced the amount of hate content and misinformation on the platform, Clegg insisted. “I wish we could eliminate [hate speech] to zero,” he said.
He declined to say whether Facebook “amplified pro-insurrection voices” prior to January 6, saying users were free to seek whatever content they preferred on the platform.