- Experts from Intel and Microsoft discussed how to optimise AI applications with the OpenVINO toolkit for better performance on personal and edge devices.
- New advances in AI technology showcased in Shanghai promise to accelerate application development, creating opportunities for developers as well as challenges for data security.
OUR TAKE
The Joint Developer Day, held in Shanghai by Intel and Microsoft Reactor, focused on making small language models (SLMs) more widely used and more effective through Intel’s OpenVINO toolkit. The event brought together top technical experts from both companies for in-depth sessions on applying AI in different ways. It signals a shift towards more accessible and more capable AI tooling that could put AI in the hands of far more developers, opening the door to creative applications and solutions across industries while also raising important questions about data security and the ethical use of AI.
–Heidi Luo, BTW reporter
What happened
Intel and Microsoft Reactor hosted a joint developer day on Saturday at the Zhangjiang Artificial Intelligence Island in Shanghai. The event focused on the efficient deployment and optimisation of small language models (SLMs) and was designed to help developers master building applications based on Phi-3 + OpenVINO.
The release of Microsoft’s Phi-3 family of models makes SLM deployment possible on edge computing platforms and personal laptops. With Intel AI PCs and the new OpenVINO toolkit, developers can quickly deploy Phi-3 to support the development of AI applications.
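As a rough, hedged sketch of what such a deployment can look like (not taken from the event materials), the snippet below loads Phi-3-mini with the OpenVINO backend through Hugging Face’s optimum-intel integration; the checkpoint name, prompt and generation settings are assumptions for illustration.

```python
# Minimal sketch: run Phi-3-mini locally with OpenVINO via optimum-intel.
# The checkpoint name, prompt and settings below are illustrative assumptions.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed public checkpoint

# export=True converts the weights to OpenVINO IR at load time
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = "Explain in one sentence what an AI PC is."
inputs = tokenizer(prompt, return_tensors="pt")

# Inference runs on the local device targets OpenVINO supports (CPU by default)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```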
Technical experts from Microsoft and Intel gave sessions covering the latest OpenVINO technology highlights and demo presentations, the combination of Phi-3 and OpenVINO, and an analysis of Intel’s AI PC architecture.
The conference also featured a hands-on demonstration of AI PCs built on the new Intel Core Ultra processor, giving developers a first-hand feel for the performance of Intel’s latest hardware.
Also read: Microsoft launches lightweight AI model Phi-3-mini
Also read: Intel bets on China’s electric car market with new AI chips
Why it’s important
“The Phi-3 series is so powerful that even the smallest version, the Phi-3-mini, outperforms models with larger parameter sizes in multiple benchmarks. The Phi-3 series models consume less memory and can run on mobile phones and other devices, lowering the threshold for AI in practice,” said Mr Lu Jianhui, a senior cloud evangelist at Microsoft.
The OpenVINO toolkit, meanwhile, plays an important role in accelerating SLM performance when running locally, and it can be combined with various frameworks to build solutions around Phi-3-mini.
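As one hedged example of such a combination: OpenVINO’s GenAI package exposes an LLMPipeline that can run a pre-converted Phi-3-mini entirely on a local machine. The model directory and device name below are assumptions, not details from the talks.

```python
# Hypothetical sketch: run a Phi-3-mini that has already been exported to
# OpenVINO IR (e.g. with `optimum-cli export openvino`) through OpenVINO GenAI.
import openvino_genai

# Directory holding the exported model and tokenizer files (assumed path)
pipe = openvino_genai.LLMPipeline("./phi-3-mini-openvino", "CPU")

# Generate a short completion entirely on the local device
print(pipe.generate("Summarise what OpenVINO does.", max_new_tokens=64))
```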
Attendees also expressed concerns about AI-related technology. “Large and small models are a good direction, but they can’t solve every problem; Phi-3 can only solve some specific problems,” said attendee Vu Yee Goh, a software development engineer.
“AI can help developers work more efficiently and write and test code more effectively. But we can’t fully trust what AI comes up with, and there will always be data security issues,” said Li Mingguo, a software development engineer at Huawei.
“Each company’s data is private, so it only makes sense for them to train their own large AI models on their own data. A model trained on your own company’s data is better suited to your business, and data security can then be guaranteed. But training a company’s own large model is expensive, which is unaffordable for small businesses,” he said.