Microsoft launches lightweight AI model Phi-3-mini  

  • Phi-3-mini marks the initial release among a trio of small language models (SLMs) by Microsoft.
  • Sébastien Bubeck, Microsoft’s vice president of GenAI research, highlighted its significant price advantage over comparable models in the market.
  • SLMs are designed to perform simpler tasks, making them easier for companies with limited resources to adopt, according to Microsoft.

Microsoft has launched the next version of its lightweight AI model, Phi-3-mini, the first of three small models the company plans to release.

The move marks the company’s push to reach a wider range of customers by offering more affordable options in the fast-growing AI market.

The AI model Phi-3-mini  

Phi-3-mini is the first of a trio of small language models (SLMs) that Microsoft plans to release.

The company is betting heavily on these models, arguing that they have the potential to reshape industries and redefine how people interact with technology in their professional lives.

Sébastien Bubeck, Microsoft’s vice president of GenAI research, said: “Phi-3 is not slightly cheaper, it’s dramatically cheaper. We’re talking about a 10x cost difference compared to the other models out there with similar capabilities.”


SLMs are designed to perform simpler tasks

Designed to handle simpler tasks, SLMs like Phi-3-mini offer practical solutions tailored for companies operating with limited resources.

According to the company, Phi-3-mini is immediately available on the AI model catalogue of Azure, Microsoft’s cloud services platform; on Hugging Face, a machine learning model platform; and on Ollama, a framework for running models on local computers.

The model is also optimised for Nvidia’s graphics processing units (GPUs) and integrated with Nvidia Inference Microservices (NIM), the chipmaker’s software tool, to further enhance accessibility and performance.

Jennifer Yu

Jennifer Yu is an intern reporter at BTW Media covering artificial intelligence and products. She graduated from The University of Hong Kong. Send tips to j.yu@btw.media.
