The European Union has opened a fresh investigation into Meta’s Facebook and Instagram over suspicions that the company is failing to protect children on its platforms, a violation that could result in fines of up to 6% of Meta’s annual worldwide revenue.
The 27-nation bloc has said it is concerned that Facebook’s and Instagram’s recommendation systems could “exploit the weaknesses and inexperience” of children and stimulate “addictive behavior”. The bloc’s executive arm further said that these systems could reinforce the so-called “rabbit hole” effect, leading users to increasingly disturbing content.
What Is The Investigation’s Trajectory?
As a “matter of priority,” the EU regulator will now carry out an in-depth investigation, gathering evidence through further information requests, interviews, and inspections.
The Commission may also accept commitments from Meta to remedy the problems raised during the inquiry.
What Has Meta Done To Protect Children On Its Platforms?
Earlier this year, Meta announced it was testing an AI-driven “nudity protection” tool that would detect and blur images containing nudity sent to minors through Instagram’s direct messages.
Additionally, the company said it would roll out measures to protect users under 18 years of age by tightening content restrictions and boosting parental supervision tools.
Is This The Only Investigation Against Meta In The EU?
This is not the only EU investigation into Meta’s platforms. The regulator launched an inquiry in April over allegations that Meta had failed to counter deceptive advertising and disinformation ahead of the European Parliament elections.
The Commission acted over concerns that Russia, China, and Iran could have exploited the platform to spread misinformation and sway EU voters.
What About Outside The EU?
Prior to the EU’s implementation of the Digital Services Act, the Wall Street Journal reported in June 2023 that Instagram “helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content,” a finding that sparked backlash in the United States.
At the time, the firm said it was working on “improving internal controls” and had removed 490,000 accounts that violated its child safety policies in a single month, along with 27 pedophile networks.
What Are The General Practices Of Protecting Minors Online?
With children growing up in an ever more digital world, it has become increasingly difficult for parents and caregivers to ensure their safety online.
Parents are urged to understand the risks associated with internet use and to put safeguards in place for their child’s digital experience. These can include creating kid-friendly profiles, selecting age-appropriate games and applications, setting up kid-friendly websites and search engines, and ensuring that age-restricted content is blocked on the platforms and devices their children use. Parents are also encouraged to monitor their children’s internet use and spend time online with them, so that they do not engage in risky activities or fall victim to online predators.
Furthermore, minors who use social media platforms should know how to report and “block” accounts that post objectionable content. Families should also encourage open dialogue so that children feel comfortable turning to an adult when something doesn’t feel right.