Trends

Microsoft launches a tiny AI model that’s perfect for phones

Headline

Microsoft has launched Phi-3 Mini, the first of three lightweight AI models, to attract a wider client base with cost-effective options, according to Reuters. The 3.8-billion-parameter model is currently available on Ollama, Hugging Face, and Azure.

Context

Microsoft has launched the first of three lightweight AI models, Phi-3 Mini, to attract a wider client base with cost-effective options, according to Reuters. The latest model features 3.8 billion parameters and is currently available on Ollama, Hugging Face, and Azure. Phi-3 Mini is the first of three small language models (SLMs) the company plans to release, as Microsoft bets its future on a technology expected to have a profound effect on society and the way people work.

Evidence

Pending intelligence enrichment.

Analysis

Eric Boyd, executive vice president of Microsoft Azure AI Platform, says Phi-3 Mini is as capable as larger models like GPT-3.5, “just in a smaller form factor.” As a compact AI model, it should perform better on devices like laptops and phones and cost less to run. Phi-3 was taught with a curriculum inspired by children’s books: developers used an LLM (large language model) to generate children’s-book-style lessons from a list of 3,000 words. The result improves on its predecessor, reasoning and coding more effectively, and could compete with Meta’s Llama 3 8B, which is starting to catch up to GPT-4 in certain areas.

The more powerful an AI model is, the more power and energy it consumes. Phi-3 Mini’s smaller size is therefore an advantage: it can run locally on low-power hardware instead of outsourcing its computing tasks to pricey cloud-based processing centres. It is small enough to be deployed on a smartphone and sits beneath the 7B- and 14B-parameter versions, Phi-3 Small and Phi-3 Medium. Sébastien Bubeck, Microsoft’s vice president of GenAI research, said: “Phi-3 is not slightly cheaper, it’s dramatically cheaper, we’re talking about a 10x cost difference compared to the other models out there with similar capabilities.”

Key Points

  • Microsoft’s new Phi-3 Mini offers capabilities similar to GPT-3.5 in a much smaller package, making it well suited to handheld devices.
  • With 3.8 billion parameters, Phi-3 Mini is the first in a series of lightweight AI models Microsoft is debuting for better portability and performance.
  • The model can run locally on low-power hardware, saving energy and money compared to pricey cloud-based processing.

Actions

Pending intelligence enrichment.

Author

Editorial author not yet assigned.