Microsoft moves to stop M365 Copilot from ‘oversharing’ data

  • Microsoft M365 Copilot has raised concerns about exposing confidential data, with oversharing traced to overly broad file permissions and lax privacy settings in organisations. To address this, Microsoft has rolled out tools such as Restricted Content Discovery and is bundling SharePoint Advanced Management into Copilot subscriptions.
  • These measures aim to improve data governance and privacy by providing administrators with tools to limit Copilot’s access to sensitive files and mitigate oversharing risks.

When does collaboration become intrusive? It’s a question Microsoft faced after concerns surfaced about M365 Copilot potentially oversharing sensitive information. As a generative AI assistant designed to streamline work processes, Copilot has demonstrated its strengths in retrieving data across company platforms like Word, Teams, and SharePoint. However, this capability has also highlighted vulnerabilities in data governance, particularly where access permissions are too broad.

What happened: M365 Copilot oversharing concerns

At its Ignite event last month, Microsoft announced new tools aimed at tightening Copilot’s access to sensitive information. These include enhanced features in SharePoint Advanced Management and Purview, alongside a comprehensive deployment guide to help businesses mitigate the risks of oversharing.

The AI-powered M365 Copilot integrates deeply with organisational data, pulling information from across these platforms to assist with everyday tasks. Yet that same functionality raised alarms when confidential files, such as payroll data or corporate strategy documents, inadvertently appeared in its results.

The problem often stems from inadequate privacy settings, with SharePoint files defaulting to “public” or lacking sensitivity labels. To combat this, Microsoft is expanding access to SharePoint Advanced Management, which will now come bundled with M365 Copilot at no extra cost from early 2025. New features include tools like Restricted Content Discovery, which prevents Copilot from accessing or processing data from selected sites.
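
Microsoft has not detailed how Restricted Content Discovery works under the hood, but the effect it describes is straightforward: files on excluded sites never reach the assistant's retrieval step. The Python sketch below is purely illustrative of that idea; the site URLs, the `Document` type and the `retrieve` helper are all hypothetical, and the real control is configured by administrators in SharePoint Advanced Management rather than in application code.

```python
# Illustrative sketch only: content from excluded SharePoint sites is never
# handed to the assistant's retrieval layer. The site list, Document type and
# retrieve() helper are hypothetical; real Restricted Content Discovery is an
# admin-configured setting, not application code.

from dataclasses import dataclass

@dataclass
class Document:
    site_url: str   # SharePoint site the file lives in
    path: str       # file path within the site
    content: str    # extracted text

# Admin-maintained exclusion list (hypothetical example sites).
RESTRICTED_SITES = {
    "https://contoso.sharepoint.com/sites/payroll",
    "https://contoso.sharepoint.com/sites/corp-strategy",
}

def discoverable(doc: Document) -> bool:
    """Return True only if the document's site is not on the restricted list."""
    return doc.site_url not in RESTRICTED_SITES

def retrieve(query: str, corpus: list[Document]) -> list[Document]:
    """Toy retrieval: keyword match over documents that pass the site filter."""
    visible = [d for d in corpus if discoverable(d)]
    return [d for d in visible if query.lower() in d.content.lower()]

if __name__ == "__main__":
    corpus = [
        Document("https://contoso.sharepoint.com/sites/payroll", "salaries.xlsx", "payroll figures"),
        Document("https://contoso.sharepoint.com/sites/handbook", "leave.docx", "leave policy"),
    ]
    # The payroll file never reaches the assistant, even for a matching query.
    print(retrieve("payroll", corpus))  # -> []
    print(retrieve("leave", corpus))    # -> [handbook document]
```

The design point is simply that the exclusion is applied before retrieval, so restricted content cannot surface in an answer even when it matches the query.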

Further safeguards were introduced in Microsoft Purview, enabling administrators to detect and manage overshared files. Features like Data Loss Prevention (DLP) policies allow organisations to exclude certain files based on sensitivity levels, ensuring Copilot operates within secure boundaries.
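
A DLP-style exclusion can be pictured the same way: a check on a file's sensitivity label that runs before any content is surfaced. The sketch below is a hypothetical illustration only; the label names and threshold are invented for the example, and real policies are defined and enforced in Microsoft Purview, not in application code.

```python
# Illustrative sketch of sensitivity-based exclusion, analogous in spirit to a
# DLP policy that keeps highly classified files out of assistant responses.
# The labels and threshold are hypothetical.

from enum import IntEnum

class Sensitivity(IntEnum):
    PUBLIC = 0
    GENERAL = 1
    CONFIDENTIAL = 2
    HIGHLY_CONFIDENTIAL = 3

# Hypothetical policy: anything at or above CONFIDENTIAL is excluded.
EXCLUDE_AT_OR_ABOVE = Sensitivity.CONFIDENTIAL

def allowed_for_assistant_use(label: Sensitivity) -> bool:
    """Return True if a file's sensitivity label falls below the exclusion threshold."""
    return label < EXCLUDE_AT_OR_ABOVE

if __name__ == "__main__":
    files = {
        "q3-roadmap.pptx": Sensitivity.HIGHLY_CONFIDENTIAL,
        "lunch-menu.docx": Sensitivity.PUBLIC,
    }
    for name, label in files.items():
        print(name, "allowed" if allowed_for_assistant_use(label) else "excluded")
```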

Why this is important

AI tools like M365 Copilot are becoming essential for businesses seeking efficiency. However, their ability to access expansive datasets makes robust governance non-negotiable. Without it, organisations risk exposing sensitive information, which could lead to reputational damage or regulatory repercussions.

Microsoft’s updates address these challenges by providing granular control over data access and offering administrators visibility into potential oversharing risks. While these tools help build confidence in adopting AI, they also emphasise the need for organisations to invest in complementary measures like training and robust governance frameworks.

As generative AI becomes more ingrained in the workplace, balancing its potential with rigorous security will remain a critical concern. Microsoft’s proactive measures might ease some of these worries, but they also underline the complexities of deploying AI responsibly.

Vionna Fiducia Theja

Vionna Fiducia Theja is a passionate journalist with a First Class Honours degree in Media and Communication from the University of Liverpool. A storyteller at heart, she delves into the vibrant worlds of technology, art, and entertainment, where creativity meets innovation. Vionna believes in the power of media to transform lives and spark conversations that matter. Connect with her at v.zheng@btw.media.
