Meta Rolls Out AI Translations for Facebook and Instagram User-Generated Content

August 20, 2025

Meta’s AI team just rolled out new translation tools to Facebook and Instagram, allowing user-generated content to be converted into other languages on the fly. 

The tools, first announced at Meta Connect in 2024, are intended to bridge the gap between people who speak different languages so that videos and reels can reach wider international audiences.

Currently, the one-click translation works only between English and Spanish reels (in either direction). Other languages will be added at a later date, but creators aren't completely limited in the meantime: a Facebook-only feature lets them attach multiple audio tracks to a single video, making it accessible in as many languages as the uploader provides tracks for. It's unknown whether this feature will come to Instagram later.

A representative for Meta did not immediately respond to a request for comment.


Additionally, uploaders can opt into an AI lip-sync feature that works in tandem with the translations to produce what Meta describes as seamless mouth movements in the edited videos. The feature can be turned off at any time through creator controls, which is helpful for anyone wary of AI-generated visual elements in their content.

New tools on the back end will also give creators on both platforms more insight into how their content spreads. A new metric in the Insights tab will let creators break down their view counts by language, presumably to show how well the AI translations are performing.

The suite of AI translation tools follows a report suggesting that Google could soon add Duolingo-like AI tools to Google Translate. During the 2025 Google I/O presentation, an on-stage demonstration of Gemini integration for smart glasses showed what AI-assisted real-time conversation between people who speak different languages could look like in the near future.

 
