Figma disables AI design tool over similarity to Apple’s Weather app

  • Figma suspended its AI design tool “Make Designs” after it generated designs that were too similar to Apple’s Weather app and could raise legal issues. 
  • The company is improving its quality assurance process and plans to re-enable the feature in the near future. 

OUR TAKE
Figma has decided to temporarily disable its AI design tool because its output is too similar to Apple’s Weather app, which could raise copyright issues. At the same time, Figma introduced other AI tools and gave users the option to choose whether or not their data is used for future model training.

-Rae Li, BTW reporter

What happened

Figma, a popular online design collaboration platform, recently decided to temporarily suspend its AI design tool “Make Designs”. The decision came after users discovered that designs generated by the tool closely resembled Apple’s Weather app. Figma’s VP Noah Levin noted in a company blog post that although the design system was carefully reviewed during development and private testing, new components and sample screens added in the week before the Config event were not adequately reviewed. 

To address the situation, Figma removed the asset that caused the similarity from its design system and promised to improve its quality assurance process before re-enabling the feature. In addition, the company introduced other AI tools and gave users the option of allowing their data to be used for future model training, signalling that it is actively responding to user feedback and working to ensure its AI tools are compliant and user-friendly.

Also read: Google, Microsoft offer Nvidia AI chips to China

Also read: Tiny Japanese startup brings AI dating to the elderly

Why it’s important 

With the rapid advancement of AI technology, it is becoming increasingly important to ensure that the use of AI tools does not lead to unethical or illegal results. Figma’s case highlights the need for AI developers to take responsibility for AI-generated content and to ensure that their tools do not inadvertently contribute to infringing behaviours.

At the same time, Figma’s incident highlights the importance of AI tool regulation and compliance. As AI becomes more widely used across fields, it is essential to ensure that AI tools are compliant and to prevent them from producing outputs that could cause legal problems.

Rae Li

Rae Li is an intern reporter at BTW Media covering IT infrastructure and Internet governance. She graduated from the University of Washington in Seattle. Send tips to rae.li@btw.media.
