- Meta’s AI image generator on Instagram briefly stopped producing images of Asian individuals, returning error messages instead.
- Despite initial inquiries and attempts to rectify the issue, Meta did not respond to the reporter’s questions or provide an explanation.
- The incident highlights ongoing concerns regarding racial representation and accuracy in AI systems, prompting further scrutiny of Meta’s image generation technology.
Meta’s AI image generator hit a temporary glitch, briefly refusing to generate any images of Asian individuals, following a previous issue in which it consistently portrayed everyone as Asian regardless of the race specified in the prompt.
Instagram AI image generator glitch
Yesterday, I highlighted that Meta’s AI image generator was consistently portraying individuals as Asian, regardless of the specified race in the text prompt. Today, I encountered the opposite issue briefly: I struggled to generate any images depicting Asian individuals using the same prompts as the previous day.
The experiments I conducted yesterday took place on Instagram, using the AI image generator accessible in direct messages. Despite numerous attempts with prompts such as “Asian man and Caucasian friend” or “Asian man and white wife,” the system produced a correct image only once, depicting an Asian woman and a white man. Otherwise, it persistently rendered everyone as Asian.
Upon my initial inquiry yesterday, a Meta spokesperson requested further details about my story, including my deadline. I responded promptly but did not receive any follow-up. Today, I was eager to ascertain whether the issue had been resolved or if the system still struggled to generate accurate images of an Asian person alongside their white friend. Instead of encountering a series of racially inaccurate images, I was met with an error message: “Looks like something went wrong. Please try again later or attempt a different prompt.”
Communication with Meta’s team
Had I reached some limit for generating AI images of Asian people? I enlisted the help of a colleague from The Verge, and she encountered the same outcome.
I experimented with broader prompts related to Asian individuals, such as “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of receiving an image, I consistently hit the same error message. Once again, I reached out to Meta’s communications team: what was the issue? Why couldn’t I generate images of Asian people at all? (During this period, I also failed to generate images with prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)
Racial representation concerns
Forty minutes later, following a meeting, I had yet to receive a response from Meta. However, by then, the Instagram feature was functioning properly for straightforward prompts like “Asian man.” It’s not uncommon for companies I cover to quietly modify something, rectify an error, or remove a feature once it’s brought to their attention by a reporter. Did my inquiry inadvertently cause a temporary shortage of AI-generated Asian individuals? Was it merely a coincidence in timing? Is Meta actively working to resolve the issue? I wish I had the answers, but Meta did not respond to my inquiries or provide an explanation.
Whatever is unfolding at Meta’s headquarters, there is still room for improvement—prompts like “Asian man and white woman” now yield an image, but the system continues to misrepresent the races by portraying both individuals as Asian, as it did yesterday. It appears we are back to square one. I will continue monitoring the situation.