• The rapid integration of generative AI tools into cloud business models means that more than 70% of cloud environments now use managed AI services.
  • Microsoft’s Azure has emerged as a leader in AI services: roughly 70% of analyzed Azure environments incorporate Azure AI Services, representing around 39% of all cloud environments.
  • Wiz, the cloud security platform provider, flagged the costs of training and inference as a key consideration for the coming year.

Cloud security platform provider Wiz has revealed that over 70% of cloud environments now use managed AI services, whether directly through OpenAI or via Azure SDKs, indicating a rapid integration of generative AI tools into cloud business models.

Azure Emerges as Leading Cloud Platform for AI Services

The study examined more than 150,000 public cloud accounts and highlighted Microsoft’s Azure as a frontrunner in this space. In fact, 70% of the Azure environments analyzed included instances of Azure AI Services, representing approximately 39% of all cloud environments. The report also noted a significant 228% growth in Azure OpenAI usage over a four-month period in 2023.

Azure’s leadership position is unsurprising, particularly since AWS’ fully managed offering, Amazon Bedrock, only recently became available. Still, the report acknowledged the strong presence of Amazon SageMaker, which trails Azure AI Services closely. Although Amazon Bedrock was not included in the analysis, Wiz suggested that at least 15% of organizations were beginning to adopt it.


User Adoption Trends and Challenges in Managed AI Services

Interestingly, the majority of users leveraging managed AI services are still in the experimentation phase, according to Wiz’s categorization. However, a notable share qualify as active users (28%) or power users (10%) who are pushing the boundaries of AI implementation.

Wiz’s analysis methodology focused on correlating the number of instances of a specific service across various cloud environments. Power users, defined as those running 50 or more instances, face challenges such as the high cost of training and fine-tuning, as well as the strict quotas some providers impose on the number of deployable AI service instances.


Integration of OpenAI Models and Software in Cloud Environments

More than half (53%) of the analyzed cloud environments use either the OpenAI or Azure OpenAI SDK, enabling seamless integration with a range of OpenAI models. Self-hosted AI and ML software is also widely prevalent in cloud setups, including Hugging Face Transformers (45%), LangChain (32%), and the TensorFlow Hub library (22%).

Wiz emphasized that the cost considerations related to training and inference will be paramount for customers in the coming year. This focus on cost optimization may present a turning point for organizations as they navigate the complexities of adopting AI technologies and determine the most effective paths for investment.

In conclusion, the report forecast that 2024 could be a pivotal year for companies as they evaluate the value of their AI experimentation efforts and strategically determine the AI-based products and features they intend to pursue. The evolving landscape of generative AI experimentation is poised to demonstrate its potential in enhancing operational efficiency and unlocking innovative functionalities across industries.