Meta Platforms, the parent company of Facebook, intentionally designed its social platforms to engage children and allegedly concealed millions of complaints about underage users on Instagram, disabling only a fraction of those accounts, according to a recently unsealed legal complaint reported by The Wall Street Journal and The New York Times. The lawsuit, filed by the attorneys general of 33 states in late October, cites internal documents in which Meta officials acknowledged exploiting psychological vulnerabilities in young users, including impulsive behavior, susceptibility to peer pressure, and an underestimation of risks.
Despite company policies prohibiting users under 13, Facebook and Instagram remained popular among this age group. Meta countered the allegations, stating that the complaint misrepresented its decade-long efforts to enhance online safety for teenagers and emphasizing that it provides more than 30 tools to support teens and their parents. On the challenge of age verification, Meta suggested that the responsibility for preventing underage usage should be shared with app stores and parents, and the company expressed support for federal legislation requiring parental approval before youths under 16 can download apps.
One Facebook safety executive, according to a 2019 email, hinted at the potential impact on the company's business if restrictions were imposed on younger users. A year later, however, the same executive expressed frustration that the company prioritized studying underage usage for business reasons over identifying and removing younger children from its platforms. The legal complaint highlighted Meta's backlog of up to 2.5 million accounts of younger children awaiting action, underscoring the scale of the issue, as reported by various news outlets.