Microsoft's "Connected Experience" setting in its Office software has recently sparked widespread discussion and concern among users. Many worry that this default-enabled feature may allow Microsoft to use the contents of their Word and Excel documents to train its AI models without consent. Microsoft has firmly denied these allegations, stating that it does not use customer data without explicit permission from the user.
A Microsoft spokesperson told the media: "In Microsoft 365 consumer and business applications, Microsoft does not use customer data to train large language models unless we have your explicit permission." When asked what "permission" means in practice, and whether it is granted on an opt-in or opt-out basis, the company did not provide a clear answer.
The "Connected Experience" feature has been part of Microsoft Office for many years, providing users with various online services such as translation, audio transcription, and grammar checking. This feature aims to offer smarter and more personalized services through internet connectivity. However, as more users become concerned about data privacy, some have begun to worry whether this data might be used to train Microsoft's internal AI systems.
Discussion of the issue has been building on social media. One user discovered that the Connected Experience feature was enabled by default on their Windows 11 device, raising a further question: could users' content be used to train AI models? The possibility appears low, but it cannot be completely ruled out.
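For readers who want to see how the feature is configured on their own Windows machine, below is a minimal Python sketch that reads the registry policy values Microsoft documents for controlling connected experiences in Office. The value names used here (disconnectedstate, usercontentdisabledstate, downloadcontentdisabledstate, controllerconnectedservicesenabled) and the interpretation of 2 as "disabled" are assumptions based on that admin documentation, not on anything in Microsoft's statements above; the consumer-facing toggle under File > Account > Account Privacy is stored separately, so an absent policy value only means no administrative policy has been applied.

```python
import winreg

# Registry path where Office privacy policy controls live, per Microsoft's
# documentation of policy settings for connected experiences (assumption).
POLICY_PATH = r"Software\Policies\Microsoft\office\16.0\common\privacy"

# Documented policy value names (assumption): a value of 2 disables the
# corresponding experiences; absence means no policy is configured.
POLICY_VALUES = [
    "disconnectedstate",                  # all connected experiences
    "usercontentdisabledstate",           # experiences that analyze your content
    "downloadcontentdisabledstate",       # experiences that download online content
    "controllerconnectedservicesenabled", # optional connected experiences
]


def read_policy(value_name: str):
    """Return the policy value if set, or None if the key/value is absent."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, POLICY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, value_name)
            return value
    except FileNotFoundError:
        return None


if __name__ == "__main__":
    for name in POLICY_VALUES:
        value = read_policy(name)
        state = "not configured (defaults apply)" if value is None else str(value)
        print(f"{name}: {state}")
```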
For education and enterprise users, Microsoft's security policies are stricter, further reducing the likelihood that content is collected through the Connected Experience. Yet although Microsoft has clearly stated it will not use customer data for training, its privacy statement permits a range of uses for collected data, including product improvement and AI model training.
In August of this year, Microsoft announced it would use consumer data from Copilot, Bing, and Microsoft Start to train Copilot's generative AI models, promising that users could opt out. It also said the related opt-out controls would roll out in October and assured customers that consumer data from the European Economic Area would not be used for training.
As users pay increasingly close attention to Microsoft's AI strategy, the company will need to remain transparent so that users fully understand how their data is handled. In its latest statement, Microsoft noted that in certain cases enterprise customers may agree to let it use their data to train foundation models, giving the public at least some additional detail.
Key Points:
🔒 Microsoft firmly denies using user data to train AI models without permission.
🧩 The "Connected Experience" feature can enhance services, but user data privacy still needs attention.
📜 Microsoft's privacy statement allows data usage for product improvement, necessitating transparency and clarity.