Below are the answers to some frequently asked questions regarding Microsoft Copilot in Viva Goals.

Copilot in Viva Goals uses generative AI to help you set business goals and align teams to your organization’s strategic priorities. It offers in-app guidance for creating, refining, and summarizing goals, as well as for sharing goal-related updates. Copilot in Viva Goals can:

  • Generate high-quality goals from strategy documents.

  • Generate potential goals from user-supplied prompts.

  • Summarize goals for a user or team.

  • Help acquaint users with methodologies like OKR and SMART so they can get their goal programs running more quickly.

  • Generate child goals, key results, or metrics from a given goal title.

Copilot in Viva Goals supports a number of languages. English currently offers the highest quality, and quality for other languages should improve over time; more languages and locales will be added in the future. To learn more, see Supported languages for Microsoft Copilot.

Copilot in Viva Goals was evaluated through extensive manual and automated testing on top of Microsoft internal usage and public data. It was further evaluated against custom datasets of offensive and malicious prompts (user questions) and responses, and it is continuously evaluated through online user feedback.

Copilot in Viva Goals has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented:

  • A Responsible AI handling pipeline that mitigates risks such as harmful or inappropriate content.

  • In-product feedback that lets users report offensive content back to Microsoft.

Copilot in Viva Goals includes filters that block offensive language in prompts and avoid synthesizing suggestions in sensitive contexts. We continue to improve the filter system to more intelligently detect and remove offensive outputs. If you see offensive outputs, including images, please submit feedback using the thumbs up/thumbs down controls in the Copilot UI so that we can improve our safeguards. Microsoft takes this challenge seriously and is committed to addressing it.

Copilot in Viva Goals is built on Microsoft's comprehensive approach to security, compliance, and privacy, and it does not use customer data to train its model. It follows the privacy and data compliance practices already in place for Viva Goals.

Copilot in Viva Goals helps users create better goals and summarize progress updates, but the content it generates can be inaccurate or inappropriate. It can’t understand meaning or evaluate accuracy, so be sure to read over what it creates and use your own judgment.   

While these features are designed to avoid sharing offensive content in results and take steps to prevent displaying potentially harmful topics, you may still see unexpected results. We’re constantly working to improve our technology and proactively address issues in line with our responsible AI principles.

As with any AI-generated content, Copilot is a great tool for getting started, but it's important that you review, edit, and verify anything it creates for you. 

Learn more

  • Introduction to Copilot in Viva Goals

  • Use Copilot in Viva Goals to create, summarize, and understand goals

  • Where can I get Microsoft Copilot?

  • Copilot Lab
