Meta’s “Mango” and “Avocado” models hint at where the AI race is heading next
Meta’s building “Mango” and “Avocado.” Why fruit names matter, and what this shift could mean for AI video, coding models, and ads.
For the last two years, the AI race looked like a chatbot race. Who answers better, faster, cheaper.
Now it’s shifting. The real fight is moving toward images, video, and “make something for me” workflows, because that’s where users spend time and where businesses spend money.
That is why Meta’s latest reported internal push is interesting: two new models, codenamed Mango and Avocado, built with a clear goal of catching up and then pulling ahead.
TL;DR:
- According to Alexandr Wang, Meta’s AI chief, Meta is reportedly developing Mango, a new AI model focused on image and video generation, and Avocado, a text model described as being stronger at coding.
- Both are expected to target a release window in the first half of 2026.
- The same internal discussion also referenced early work on “world models,” a clue that Meta is thinking beyond chat and into AI that understands environments.
What is Meta building in the AI department?
These names are not consumer brands (at least not yet). They’re internal codenames that leaked through reporting, mostly tied to an internal Q&A session.
Mango: the image and video engine Meta wants to own
Mango is reported as a model focused on image and video generation, with a tentative first half of 2026 launch target.
If you run a marketing team, the “why now” is obvious. Video is the most valuable format across Meta’s apps, and generative video is quickly becoming a default creative workflow, not a novelty.
Meta also has a practical reason to push here: if the next phase of social content gets flooded with AI visuals, Meta would rather power it than merely host it.
Avocado: text, with a not-so-subtle focus on coding
Avocado is described as a text-based model, with reporting highlighting improved coding ability.
That matters because coding ability is not just “developer stuff.” In 2026, coding strength is how you build:
- agents that can take multi-step actions,
- internal automation that replaces repetitive ops work,
- product features that ship faster than your competitors can copy.
Coding is leverage. Meta knows it.
Why do these “world-class models” matter for Meta?
In mid-2025, Reuters reported that Meta reorganized its AI work under a new division called Superintelligence Labs, led by Alexandr Wang, with Nat Friedman co-leading and overseeing AI products and applied research.
That reorganization matters because it changes incentives. When a company creates a dedicated “war room” org, it usually means two things:
- the old structure was too slow,
- leadership wants fewer excuses and faster shipping.
In other words, Mango and Avocado are not side bets. They look like outputs from a new operating system inside the company.
Meta already started training users for AI video: Vibes
On September 25, 2025, Meta announced Vibes, an AI video feed inside the Meta AI app and on meta.ai, where people can create, remix, and share short AI-generated videos.
That product is worth mentioning for one simple reason: it shows Meta is actively trying to make AI video feel normal.
It also quietly answers a question you might have had: if Meta is building Mango, where does it get used? Vibes is an obvious place, because Vibes is already built around the habit of “see something, remix it, post it.”
What this means for marketers, creators, and teams running Meta Ads
You do not need to predict model architecture to make this useful. You just need to watch the right outcomes.
Here are four practical things to track over the next 6 to 12 months:
- Whether Avocado becomes an API product. If Meta prices it, that tells you it sees real enterprise demand, not just consumer chat usage.
- Where Mango shows up first. If it lands inside creator tools, it’s a culture play. If it lands inside ads tooling, it’s a revenue play.
- How controllable the output is. The market is tired of “cool demos.” Winning tools let you keep characters consistent, make local edits, and iterate quickly. (This is why Google’s “nano banana” style tools got attention.)
- What Meta does about quality spam. AI video feeds can become “AI slop” fast, and distribution platforms eventually have to choose between growth and trust.
Bottom line
Mango and Avocado are still just reported codenames. The real test is shipping, and shipping at quality.
But Meta’s direction is already visible. It is moving from “AI that answers” to AI that creates, and possibly AI that sells. If Mango powers a serious visual engine in 2026, the creative workflow for Meta ads and content will shift again. If Avocado goes closed and paid, Meta’s open model narrative changes overnight.