Meta will use AI to analyze height and bone structure to identify if users are underage


7:27 AM PDT · May 5, 2026

Meta will start using AI to scan photos and videos for visual clues that a user is under 13 and should be removed from Facebook and Instagram, the company announced on Tuesday. These visual clues include a person's height or bone structure, it said.

"We want to be clear: this is not facial recognition," Meta explained in its blog post. "Our AI looks at general themes and visual cues, for example height or bone structure, to estimate someone's general age; it does not identify the specific person in the image. By combining these visual insights with our analysis of text and interactions, we can significantly increase the number of underage accounts we identify and remove."

The visual analysis system is currently operating in select countries, but Meta says it's working toward a broader rollout.

Meta says this system is part of its efforts to keep kids under 13 off its platforms. These efforts include using AI to analyze full profiles for contextual clues, such as birthday celebrations or mentions of school grades. The company looks for these signals across different formats, such as posts, comments, bios, captions, and more. Meta plans to expand this technology to more parts of its apps, including Instagram Live and Facebook Groups, in the future.

If Meta determines that a user may be underage, it will deactivate their account, and the user will need to verify their age using the company's age verification process in order to prevent their account from being deleted.

The announcement comes weeks after a New Mexico jury ordered Meta to pay $375 million in civil penalties for misleading consumers about the safety of its platforms and putting children at risk. The company was also ordered to implement key changes to its platforms. Meta has since threatened to shut down its social media services in the state.

It's worth noting that this lawsuit is one of many that Meta and other Big Tech companies are facing over child safety.


Meta also announced on Tuesday that it's expanding its technology that automatically places teens into stricter "Teen Accounts" on Instagram to 27 countries in the EU and Brazil. These teen accounts put users into a more restricted experience with additional safeguards, such as receiving DMs only from people they follow or are already connected to, hiding harmful comments, and setting accounts to private by default.

Additionally, Meta said it's expanding the technology to Facebook in the U.S. for the first time, followed by the U.K. and EU in June.


Aisha is a consumer news reporter at TechCrunch. Prior to joining the publication in 2021, she was a telecom reporter at MobileSyrup. Aisha holds an honours bachelor's degree from the University of Toronto and a master's degree in journalism from Western University.

You can contact or verify outreach from Aisha by emailing aisha@techcrunch.com or via encrypted message at aisha_malik.01 on Signal.
