EU probes Facebook and Instagram – What may be the reasons?

Shruti Govil May 20, 2024
Updated 2024/05/20 at 3:54 AM

Due to concerns that Meta is not adequately safeguarding children on its platforms, the European Union has launched a new investigation into the company’s Facebook and Instagram services. If found in breach, Meta could face fines of up to 6% of its global annual turnover. 

The recommendation engines on Facebook and Instagram have raised concerns from the 27-nation bloc that they may “exploit the weaknesses and inexperience” of young people and encourage “addictive behavior.” 

As part of its investigation, the commission will examine Meta’s use of age-verification tools meant to bar users under 13 from accessing Facebook and Instagram. It will also ascertain whether the company is upholding a high standard of child safety, security, and privacy in accordance with the bloc’s Digital Services Act (DSA). 

What prompted the inquiry? 

The bloc’s DSA came into force in February. It requires very large online platforms – those with more than 45 million users in the EU – to share their data with the Commission and national authorities so compliance with the law can be assessed, and to offer an option in their recommender systems that is not based on user profiling. 

Additionally, the platforms must take precautions against content that could harm minors’ moral, mental, or physical development. They must also implement specific measures to safeguard the rights of minors, such as age verification, parental controls, and tools that help minors report abuse or obtain support. 

Because Facebook and Instagram each exceed this user threshold, they are classified as very large online platforms and fall within the law’s purview. 

As a “matter of priority,” the EU regulator will now launch a thorough investigation and compile evidence through more information requests, interviews, and inspections. 

What steps has Meta taken to safeguard kids using its platforms? 

Earlier this year, Meta announced that it would test an AI-powered “nudity protection” feature in the app’s messaging system to detect and blur any photographs containing nudity sent to minors. 


Furthermore, the company said it would take steps to protect users under 18 by tightening content restrictions and enhancing parental-supervision tools. 

What standard practices are in place to safeguard adolescents online? 

As the world grows more digitally connected, parents and other adults are finding it harder and harder to protect their children online. 

Parents are urged to put safeguards in place for their child’s digital experience and to stay aware of the risks of internet use. They are also encouraged to monitor their children’s online activity and spend time with them online, so that the children neither engage in risky behaviour nor fall victim to online predators. 

Furthermore, minors who use social media platforms should know how to report and block accounts that post objectionable content. 
