Meta exec Adam Mosseri wanted to ‘upsell’ Instagram to children under 13: lawsuit
Instagram’s boss once floated a plan to “upsell” the social-media app to children under 13 — a reckless move to enlarge its base of pre-teen users as Meta struggled to police an influx of underage accounts, according to newly revealed allegations in an explosive lawsuit.
Instagram head Adam Mosseri allegedly concocted the plan in response to a massive “backlog” of some 700,000 accounts potentially belonging to users under age 13 that required review in late 2020, according to newly unredacted portions of New Mexico Attorney General Raul Torrez’s lawsuit against Meta.
The amended complaint claims that Meta — angling to boost user growth and advertising dollars despite concerns about child safety — had established an “underage enforcement war room” to address the profit hurdle.
Instagram was “not only aware that users under 13 were lying about their age to gain access to the platform, but that Meta executives believed the answer to this solution [sic] was to ‘upsell’ the service as opposed to instituting stricter registration procedures,” according to the suit.
Filed in December, the suit alleges that top boss Mark Zuckerberg and other Meta executives prioritized profits even as kids were exposed to sex predators and disturbing content.
“Faced with this backlog, Instagram head Adam Mosseri proposed creating a new type of family-centered account in Instagram [that] would permit Meta to ‘upsell Instagram to children under 13, and broadly have a more compelling story on how we responsibly manage the fact that there are those under 13 who register for Instagram accounts,’” the lawsuit alleges.
Meta did not immediately return a request for comment.
The suit also cites a previously redacted 2018 internal email in which an unnamed Meta employee asserted that “solid progress” on teen-specific features would promote “teen growth,” and thereby “defeat Snapchat in their core markets.”
The lawsuit notes that Zuckerberg sent a “lengthy email” to his executive team titled “opportunities for teens and sharing” in 2016.
In another communication that same year, Mosseri purportedly said “[o]ur overall goal remains total teen time spent . . . with some specific efforts (Instagram) taking on tighter focused goals like US teen total time spent.”
In an October 2019 email, Mosseri wrote that he had spoken to Meta executive David Ginsberg and “others who’s [sic] researched Facebook and Instagram’s effect on well-being.” Mosseri said that “loneliness, excessive use, and social comparison” were three areas of concern that “consistently bubble up.”
During another discussion in 2021, Meta employees “discussed research on teen safety on Instagram that found that: ‘Children are regularly exposed to sexualizing content and comments – girls in particular have come to just expect that they’re [sic] see this type of content and have these experiences.’”
The lawsuit also cited an Instagram researcher who spoke with a “potential recruit” for a study on teen messaging. The teen purportedly “described a predatory grooming message she received from an older man.”
“The potential recruit was disturbed by this but also mentioned that she regularly gets these kinds of messages, usually at least once a day,” the lawsuit says.
Meta’s failure to address online child safety risks was a major theme of last week’s bombshell Senate hearing on Capitol Hill — where lawmakers told Zuckerberg, TikTok CEO Shou Chew and other executives that they had “blood on [their] hands.”
At one point, Zuckerberg stunned the audience when he stood up and apologized to the parents of victims of online abuse and exploitation.
During the hearing, Zuckerberg admitted Meta once had “discussions internally about whether we should build a kids’ version of Instagram.” He said Meta had never moved forward with the plans and “we currently have no plans to do so.”
In an interview with The Post last week, New Mexico’s Torrez blasted Zuckerberg’s apology as “too little, too late.”
Torrez said the state’s investigation uncovered many Meta documents “in which some of their own safety team are flagging these issues and raising concerns inside the company, elevating those concerns to senior management.”
“Time and again, they are ignored or disregarded,” Torrez added.
New Mexico’s lawsuit alleged that child predators use Facebook and other social media platforms to solicit children — even though Meta claims it doesn’t allow child exploitation.
State investigators set up test accounts that purportedly belonged to underage users, which became inundated with disturbing messages and outreach.
Previous updates in the lawsuit revealed how Meta employees were once left “scrambling” to respond to an unnamed Apple executive whose 12-year-old daughter had been solicited on Instagram.
An internal presentation in 2021 showed that “100,000 children per day received online sexual harassment, such as pictures of adult genitalia” on Meta-owned apps.
Meta has repeatedly denied wrongdoing and touted a years-long effort to roll out safety tools meant to safeguard children.
Last month, the company announced that teen users would no longer be able to receive direct messages from strangers. Zuckerberg said the company spent more than $5 billion on safety and security last year.