5 Things to Consider Before You Enable Copilot
February 27, 2024
Steve Corey
In this article, I will share 5 things to consider before you enable Microsoft Copilot in your environment, along with some practical tips on how to prepare your environment first.

Today, we’re exploring the most common pitfalls we’ve seen as organizations get ready to use Microsoft Copilot. While the issues outlined below may sound scary, you can rest easy knowing there are ways to enable Copilot safely so that your organization can begin leveraging this amazing technology.

So let’s go over these 5 common pitfalls and what you can do to make Copilot work for you, instead of against you.

Pitfall #1: Not Training Your Organization to Use Copilot

First, you must think about training. Copilot is not something you can just switch on and expect your users to know how to use. They need to learn how to use it properly and how to improve their productivity with it. This applies to any Copilot product, whether it’s Copilot for Microsoft 365 (the most popular among our clients), GitHub Copilot, or any of the others.

How do you properly train your users to use Copilot?

My recommendation is to create a pilot group, train them, and support them. You also need to have a feedback mechanism in place, so that they can tell you what works and what doesn’t as they move through the pilot.

Start by identifying different use cases for the different Copilot products. (Need help identifying use cases and developing your pilot plan? Try our Microsoft 365 Copilot Workshop.)

Pitfall #2: Oversharing Information Due to Access Issues in SharePoint

The second issue is oversharing. This is when you have data that is shared with more people than necessary, and you may not even be aware of it as an administrator or a site owner.

SharePoint is designed for sharing. It’s in the name… SHARE-Point. You will naturally share lists, files, and data with other people. But sometimes people share too much. They may share with everyone in the organization, or add Everyone except external users to their site members group, because they want everyone to access their site.

But over time, you may gather a store of sensitive data that you don’t want everyone to see. Ask yourself how often your organization does site access reviews, whether for the whole tenant or for individual sites. As a consultant, I can tell you that most organizations don’t do them at all. You need to do them if you want to use Copilot; otherwise, it could result in disaster.

How can you mitigate the risk of oversharing with Copilot?

This will take more time than just training your users, because you need to find the cases where content is shared too broadly. You can do this by performing an access audit, and there are several ways to approach it (see the scripted spot-check after this list):

1. If you have the budget, SharePoint Premium will give you additional reporting in the admin center.

You will be able to see what content has been shared with Everyone except external users. That’s a big indicator of overshared content. You may not have a good reason to use that group at all; more specific groups, like full-time employees or vendors, kept separate, make it easier for site owners to share content with the right people.

SharePoint Premium also lets you trigger site audit requests for your site admins, so that they have to review their site access and confirm that it’s correct. If it isn’t, they have to make changes before approving the request. This puts more responsibility on site owners, so you’ll need to train them on best practices for site security and on how to do these access reviews.

Another SharePoint Premium feature is restricted access control. This allows you or other admins to lock down certain sites, regardless of how they have been shared. For example, if you have a site with sensitive or external data and you need to secure it quickly, you can use this feature to override the site permissions and specify exactly who can access the site. It ignores anything the site owners or users have done and locks down site access immediately. That is very useful when a site holds a lot of content and you don’t have time to fix all the access issues.

2. If you don’t have SharePoint Premium, you can use tools like Microsoft Purview, which is part of the Microsoft 365 compliance portal.

There are a lot of options there for securing content. We will talk more about that in pitfall #4, sensitive data, but one thing you can do right now is block crawling of certain sites or libraries. This prevents them from appearing in the search index. It’s a bit extreme, because users then can’t search for content on those sites, but it may be the right move in an emergency.

Microsoft plans to give us more options in the future that allow searching without letting Copilot access or analyze the contents of a document. Copilot will know that the content exists, but it won’t be able to see what’s inside, because it doesn’t have permission.

Some of those controls already exist in Purview, but they are tied more to classification, labels, and sensitive content.
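If you’d like to spot-check sharing programmatically in the meantime, Microsoft Graph can help. Here’s a minimal sketch in Python, assuming you have already registered an app and acquired a Graph access token with Sites.Read.All; the token and drive ID are placeholders. It flags files in a single document library that carry an organization-wide sharing link, which is one of the clearest oversharing signals:

```python
# A minimal sketch, assuming you have already registered an Entra ID app
# and acquired a Microsoft Graph access token with Sites.Read.All.
# <GRAPH_TOKEN> and <drive-id> are placeholders you fill in yourself.

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <GRAPH_TOKEN>"}
DRIVE_ID = "<drive-id>"  # the library's drive ID, e.g. from GET /sites/{site-id}/drive

def org_wide_links(item_id: str) -> list[str]:
    """Return the URLs of organization-scoped sharing links on a drive item."""
    resp = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/permissions",
        headers=HEADERS,
    )
    resp.raise_for_status()
    return [
        p["link"]["webUrl"]
        for p in resp.json().get("value", [])
        if p.get("link", {}).get("scope") == "organization"
    ]

# Walk the top level of the library and report files anyone in the org can open.
items = requests.get(f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=HEADERS)
items.raise_for_status()
for item in items.json().get("value", []):
    links = org_wide_links(item["id"])
    if links:
        print(f"Overshared: {item['name']} -> {links}")
```

A script like this only covers one library, so it won’t replace tenant-wide reporting, but it’s a handy way to drill into a suspect site quickly.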

Pitfall #3: Trusting AI-generated Content a Little Too Much

How many people will take content from AI, put it in an email, send it to a client, and never think twice about it?

If that content is offensive, or inaccurate, or contains information that the client shouldn’t see, that’s going to be a big problem.

If you are a developer using GitHub Copilot, this is even more important: Copilot can save you a lot of time, but it can also ruin your day. You need to resolve the trust issue before you roll out Copilot.

How can your organization build and maintain trust while still leveraging AI to improve productivity?

Any generative AI will give you content to use, but you can’t trust it blindly. You need policies in place that govern how your users can use AI-generated content and what process they need to follow.

For example, for documents and especially code, specify that everything has to be reviewed by a human to make sure it is accurate, consistent, clear, and sensible, and that it actually provides the information you expected. Review all the content before you send it to a client, commit it to code, or do anything else with it.
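If your developers use GitHub Copilot, you can back that review policy with tooling instead of trust alone. Here’s a minimal sketch using the GitHub REST API’s branch protection endpoint; <OWNER>, <REPO>, and the token are placeholders, and the token needs repository administration rights. It requires at least one approving human review before anything merges to main:

```python
# A minimal sketch, assuming a GitHub repository and a personal access
# token with repository administration rights. <OWNER>, <REPO>, and
# <GITHUB_TOKEN> are placeholders.

import requests

url = "https://api.github.com/repos/<OWNER>/<REPO>/branches/main/protection"
headers = {
    "Authorization": "Bearer <GITHUB_TOKEN>",
    "Accept": "application/vnd.github+json",
}
payload = {
    # The branch protection endpoint expects all four keys; null disables
    # that particular control.
    "required_status_checks": None,
    "enforce_admins": True,
    "required_pull_request_reviews": {"required_approving_review_count": 1},
    "restrictions": None,
}
resp = requests.put(url, headers=headers, json=payload)
resp.raise_for_status()
print("Branch protection updated:", resp.status_code)
```

With this in place, Copilot-assisted changes can’t reach your main branch without a human signing off.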

Copilot will save you a lot of time with creating documents, analyzing data, or writing code. That’s why we are excited about it. But you need to use it carefully, otherwise you will have bugs in your code, security issues, data loss, or poor quality content.

It’s easy to get lazy and trust Copilot too much. You shouldn’t trust it at all. You should treat it like an assistant — you ask it to do something, and when it comes back with the work, you check it. You make sure it is good, and it sounds like you. Then you can incorporate it into your work. That’s the safest way to use Copilot, and having a policy in place to ensure that your users follow this expectation will make you successful with AI.

Pitfall #4: Not Protecting Sensitive Data Properly

We all have sensitive data in our environment. That’s how the cloud works these days: the data we store is searchable. You can use the Microsoft Search bar at the top of your applications or in SharePoint to find content. It gives you relevant results based on your activity in the cloud, but it mostly uses keyword matching. That’s how search has always worked.

Copilot is not a search bar. The difference is that Copilot does not use keyword matching; it uses semantic matching. This means Copilot can find data even if the keywords don’t match exactly, because it understands synonyms and the meanings of words.

If you ask it, “What is my boss’s salary?” a normal search may not find that, but Copilot can. It may find content somewhere you have access to but never knew existed, and tell you, “Your boss makes X thousand dollars per year.”
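To make that difference concrete, here’s a toy Python sketch. The “embeddings” are hand-made, hypothetical vectors rather than output from a real model, but they show why a keyword match on “salary” misses a document about “compensation” while a semantic comparison catches it:

```python
# A toy illustration of keyword vs. semantic matching. The "embeddings"
# below are hand-made, hypothetical vectors; real semantic search uses a
# trained embedding model, but the principle is the same.

import math

def keyword_match(query: str, document: str) -> bool:
    """Classic search: hit only if a query word literally appears in the document."""
    return bool(set(query.lower().split()) & set(document.lower().split()))

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical vectors where "salary" and "compensation" land close together
# because they mean similar things.
embeddings = {
    "salary":       [0.9, 0.1, 0.0],
    "compensation": [0.8, 0.2, 0.1],
    "picnic":       [0.0, 0.1, 0.9],
}

print(keyword_match("salary", "2024 compensation review"))  # False -- no shared word
print(cosine_similarity(embeddings["salary"], embeddings["compensation"]))  # ~0.98
print(cosine_similarity(embeddings["salary"], embeddings["picnic"]))        # ~0.01
```

In other words, if a file exists anywhere you technically have access to, Copilot can surface it even when a keyword search never would.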

This is a big problem, but it gets worse.

What if someone searches for an upcoming acquisition that they shouldn’t know about before the acquisition has been made official? What if they find out that their company is going to be bought by a bigger company, and they use that knowledge to buy stock in that company? That’s called insider trading in the US, and that’s illegal. It could be a very messy situation, all because someone had access to information they shouldn’t have.

These are both examples of sensitive content that needs to be protected, so that only the right people can see it and the wrong people cannot. There are a few ways to do that.

How can your organization ensure your sensitive data is protected?

As we discussed earlier, Purview will help you a lot with this.

Purview will let you create policies to restrict what Copilot can access. If a user searches for content, Copilot will know that the content exists, but it will tell the user that it can’t summarize it, or analyze it, or do anything else with it.

You need to have these policies in place for your content labels. What else can you do to protect your sensitive content from Copilot? First, make sure you know where it is: identify all the containers that hold your sensitive data, whether they are SharePoint sites, Teams, or folders. Apply labels to them, require labeling for those containers, and enforce it as needed so the protection is actually applied to the content.

This is not a one-time thing; it’s an ongoing process. The first pass takes longer because you have to set everything up. After that, you just maintain it, applying the correct labels as new containers or sites that hold sensitive data are created.
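If you have a lot of existing content to label, you can also apply labels programmatically. Here’s a minimal sketch using the assignSensitivityLabel call in Microsoft Graph, which is a metered (pay-as-you-go) API at the time of writing; the drive, item, and label IDs and the token are placeholders, and you should confirm licensing in the current documentation before relying on it:

```python
# A minimal sketch, assuming the assignSensitivityLabel Graph API, which is
# metered (pay-as-you-go) at the time of writing -- confirm licensing and the
# current docs before relying on it. <GRAPH_TOKEN>, <DRIVE_ID>, <ITEM_ID>,
# and <LABEL_ID> are placeholders.

import requests

url = ("https://graph.microsoft.com/v1.0/drives/<DRIVE_ID>"
       "/items/<ITEM_ID>/assignSensitivityLabel")
headers = {
    "Authorization": "Bearer <GRAPH_TOKEN>",
    "Content-Type": "application/json",
}
payload = {
    "sensitivityLabelId": "<LABEL_ID>",  # from your Purview label catalog
    "assignmentMethod": "standard",
    "justificationText": "Bulk labeling ahead of Copilot rollout",
}
resp = requests.post(url, headers=headers, json=payload)
resp.raise_for_status()
# The call is asynchronous: expect 202 Accepted while the label is applied.
print(resp.status_code)
```

Labels applied this way should behave just like labels applied by hand, so the Purview policies you define still govern what Copilot can do with the content.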

Then you also need to train your users on how to classify information. Make them comfortable with applying labels and with what those labels do, whether that’s encryption, blocking forwarding, or something else. Make sure they understand how to apply these labels to protect sensitive content, and what the impact will be.

Pitfall #5: Not Understanding What a Successful Copilot Implementation Looks Like for Your Organization

If you don’t have a way to measure your success with Copilot, your implementation is much more likely to fail. Why? Because you haven’t defined what success looks like, and therefore you don’t know how to measure it.

You have to know where you’re going to know when you’ve arrived! You’ll need to have a tool to measure your success with Copilot, to see if you have achieved your goals.

How can your organization outline and track the success of your Copilot enablement?

Finally, you need a way to measure the success of your Copilot implementation: a dashboard that shows how you are doing with Copilot and whether you have reached your goals.

Microsoft provides a Copilot dashboard in Viva Insights. It gives you information on how Copilot is being used and gives admins a big-picture view of what’s going on with Copilot across the organization. It’s currently in public preview, so you should be able to access it, and it may already be in general availability by the time you read this article.
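If you want the raw numbers behind that dashboard, Microsoft Graph also exposes a Copilot usage report. Here’s a minimal sketch; at the time of writing this report function lives on the beta endpoint and requires the Reports.Read.All permission, so verify the name and response shape against the current Graph documentation before building on it:

```python
# A minimal sketch, assuming the Microsoft 365 Copilot usage report in
# Microsoft Graph. At the time of writing this function lives on the beta
# endpoint and needs Reports.Read.All; verify the name and response shape
# against the current Graph docs before building on it. <GRAPH_TOKEN> is
# a placeholder.

import requests

url = ("https://graph.microsoft.com/beta/reports/"
       "getMicrosoft365CopilotUsageUserDetail(period='D30')")
resp = requests.get(url, headers={"Authorization": "Bearer <GRAPH_TOKEN>"})
resp.raise_for_status()

# Inspect the raw payload first, then feed per-user adoption and
# last-activity data into whatever success dashboard you've defined.
print(resp.text[:500])
```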

Ready to get started with Microsoft Copilot for Microsoft 365?

Quisitive can help! Explore our Microsoft 365 Copilot Workshop, where we help you unlock the full potential of Microsoft Copilot and develop a plan to get started with your AI journey.