Transparency and Control in the Use of Consumer Data

Since launching our new generative AI experiences in products like Copilot, we’ve received a tremendous amount of product feedback, both on what you’ve enjoyed and on what you want more of in the future. We have consistently heard that you expect the next wave of AI experiences to feel seamlessly personalized, effortlessly integrated, and coherent across platforms.

We’re energized by this feedback and are committed to delivering incredible new AI experiences for everyone. Fundamental to this process is our full commitment to earning your trust through transparency, accountability, and user control. That’s why we’re sharing some upcoming changes in how we will use consumer data, and our approach to ensuring our users are always in control.  

  • We will soon start using consumer data from Copilot, Bing, and Microsoft Start (MSN), including interactions with advertisements, to help train the generative AI models in Copilot. We are making this change because real-world consumer interactions provide greater breadth and diversity in training data, which we believe will help us build more inclusive, relevant products and improve the experience for all users. For example, our AI models can learn from an aggregated set of thumbs up/thumbs down selections to provide better, more useful responses in the future (a conceptual sketch of this kind of aggregation follows this list); they can learn colloquial phrases or local references from Copilot conversations; and they can learn from Microsoft advertising segments what, broadly, is most interesting and compelling to consumers, helping them deliver more relevant content in the future. 
  • We will also make it simple for consumers to opt out of their data being used for training, with clear notices displayed in Copilot, Bing, and Microsoft Start. We will start providing these opt-out controls in October, and we won’t begin training our AI models on this data until at least 15 days after we notify consumers that the opt-out controls are available.
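
To make the idea of aggregated feedback concrete, here is a small, purely illustrative Python sketch of how thumbs up/thumbs down selections could be rolled up into per-response counts before any training takes place. The FeedbackEvent structure and aggregate_feedback function are hypothetical names used only for this example and do not describe our actual systems.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical illustration only; this does not describe an actual pipeline.
# It sketches how thumbs up/thumbs down selections could be aggregated into
# per-response counts before any model training.

@dataclass
class FeedbackEvent:
    response_id: str   # identifies the AI response being rated, not the user
    rating: int        # +1 for thumbs up, -1 for thumbs down

def aggregate_feedback(events):
    """Collapse individual ratings into aggregate up/down counts per response."""
    totals = defaultdict(lambda: {"up": 0, "down": 0})
    for event in events:
        key = "up" if event.rating > 0 else "down"
        totals[event.response_id][key] += 1
    return dict(totals)

# Example usage with made-up events.
events = [
    FeedbackEvent("resp-1", +1),
    FeedbackEvent("resp-1", +1),
    FeedbackEvent("resp-1", -1),
    FeedbackEvent("resp-2", -1),
]
print(aggregate_feedback(events))
# {'resp-1': {'up': 2, 'down': 1}, 'resp-2': {'up': 0, 'down': 1}}
```

The point of the sketch is that only aggregate counts per response survive the roll-up; no individual user is represented in the result.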

In short, we will always ask first, and we’ll always put you in control. 

These changes will only apply to consumers who are signed into their Microsoft Account. Consumers will retain existing controls over their data, and there are no changes to the existing options to manage how their data is used to personalize Microsoft services for them. In addition, this change only applies to how we will start training our generative AI models in Copilot, and does not change existing uses of consumer data as outlined in the Microsoft Privacy Statement. 

We will gradually roll this out in different markets to ensure we get this right for consumers and to comply with privacy laws around the world. For example, we will not offer this setting or conduct training on consumer data from the European Economic Area (EEA) until further notice. To learn more, please see our FAQ here.

When we begin training, we will continue to adhere to the following data protection commitments: 

  • Your data will not be used to identify you: Before training these AI models, we will remove information that may identify you, such as names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses (a conceptual sketch of this kind of de-identification follows this list). 
  • Your data will be kept private: Your data remains private when using our services and is not disclosed without your permission. Generative AI models do not store training data or return it to provide a response, and instead are designed to generate new content. We continuously evaluate our models for privacy and safety, including conducting testing and building filters that screen out previously published or used material. We will protect your personal data as explained in the Microsoft Privacy Statement and in compliance with privacy laws around the world. 
  • Data from minors will not be used for training: We will only train our generative AI models on data from consumers who tell us they are 18 years or older. 
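
As a purely conceptual illustration of the de-identification described in the first commitment above, the Python sketch below replaces a few obvious identifier patterns (email addresses, phone numbers, and simple street addresses) with placeholder tokens before text would ever be considered for training. The patterns and the redact helper are illustrative assumptions only and do not describe our actual, far more extensive process.

```python
import re

# Conceptual sketch only; real de-identification pipelines are far more
# sophisticated than a handful of regular expressions. The patterns and
# placeholder tokens below are illustrative assumptions.

REDACTION_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[EMAIL]"),    # email addresses
    (re.compile(r"\+?\d[\d\s().-]{7,}\d"), "[PHONE]"),       # phone-number-like digit runs
    (re.compile(r"\b\d{1,5}\s+\w+(\s\w+)*\s+(Street|St|Avenue|Ave|Road|Rd)\b",
                re.IGNORECASE), "[ADDRESS]"),                # simple street addresses
]

def redact(text: str) -> str:
    """Replace obvious identifier patterns with placeholder tokens."""
    for pattern, placeholder in REDACTION_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

# Example usage with a made-up string.
sample = "Contact me at jane.doe@example.com or +1 (425) 555-0100, 123 Main Street."
print(redact(sample))
# Contact me at [EMAIL] or [PHONE], [ADDRESS].
```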

There are no changes to how Microsoft manages commercial customers’ data. More information about how we manage commercial customer data can be found here.

We are fully committed to earning your trust. We will remain steadfast in protecting your data and will be open about what we do with it, and why. If we plan additional changes to how we use consumer data for training our generative AI models in Copilot, we will share that transparently and will ensure consumers remain in control and can choose whether to allow it.

Once again, we will always ask first, and we’ll always put you in control. 
