Facebook must prioritize the welfare of children, Zuckerberg told

Facebook has lost the confidence of parents by prioritizing commercial profit over the needs of children, and must take steps to restore faith in its platforms, a global alliance of child protection campaigners and experts has warned Mark Zuckerberg.

The founder and chief executive of Facebook is urged to publish the company's internal assessments of the risks its services pose to young people in a letter with 59 signatories, including the UK's National Society for the Prevention of Cruelty to Children (NSPCC) and the Child Rescue Coalition.

"The company needs to do significantly better to regain the trust of parents and child protection professionals and, above all, to ensure that its product decisions do not jeopardize the safety and wellbeing of children," the letter said.

Mr Zuckerberg is urged to take five steps to address concerns about his company's approach to protecting children on its social media platform, its photo-sharing app Instagram and its messaging service WhatsApp. Those steps are:

– Publish all internal research on the impact its platforms have on children's wellbeing.
– Disclose what research has been done on how the company's services contribute to child sexual abuse.
– Publish risk assessments of how its platforms affect children.
– Disclose details of an internal reputation review of its products.
– Review the child protection implications of encrypted messaging.

The letter to Mr Zuckerberg was sent following revelations by the Facebook whistleblower Frances Haugen, who accused the company of a lax approach to safety in testimony to US senators, and a series of leaked documents that formed the backbone of a string of damning articles in the Wall Street Journal.

One of the most damaging leaks was internal research on Instagram examining the app's impact on teenagers' wellbeing, including a slide showing that the app made body image issues worse for 30 percent of teenage girls.
Echoing Ms Haugen's assertion that the company puts profits before people, the letter's signatories wrote: "We cannot continue with a situation in which the needs of children are, or appear to be, secondary to commercial motives, and in which young people's rights to safeguarding, privacy and wellbeing are traded off to prioritize the interests of adults and other more influential drivers."

A Facebook spokesman said: "We are committed to keeping young people who use our platform safe. We have spent $13 billion [€11.19 billion] on safety in recent years, including developing tools to enhance the safety and wellbeing of young people on Facebook and Instagram. We have shared more information with researchers and academics than any other platform, and we will find ways to give external researchers more access to our data in a way that respects people's privacy."

The letter was sent as hearings on the UK's online safety bill, which places a duty of care on social media companies to protect children from harmful content and to prevent the spread of illegal content such as pornography, resumed.

One witness, Laura Edelson, a social media expert at New York University, said Facebook's algorithms pushed vulnerable users toward more harmful content because they found it so engaging. "Attention is what these platforms sell," said Ms Edelson. "They sell advertising, so it's their business model to engage users, and they have to build algorithms to do that, and certain types of harmful content are more engaging."

Another witness, Guillaume Chaslot of the campaign group AlgoTransparency, said social media and video-sharing companies should be fined according to how widely harmful video content is viewed. Under such a regime, platforms would have an incentive to respond quickly to dangerous content and to ensure that their algorithms do not recommend unacceptable posts.