How to Prepare Your Organization for a Microsoft Stream Migration | Quisitive

If you’ve come across this blog post, it’s likely that you and your team are in the process of migrating from Microsoft Stream (Classic) to Stream (on SharePoint) or preparing for it. Microsoft will be ending support for Stream (Classic) in 2024. Today, we’ll share the history of Microsoft Stream, updates to the platform, information about the Microsoft Stream Migration Tool, and tips to help fix common issues.

Read the full blog to make sure you’re ready for your Microsoft Stream migration.

History of Microsoft Stream

If you’re not yet aware of this unexpected migration, let’s take a moment to catch you up.


In 2017, Microsoft released Microsoft Stream (Classic), replacing Office 365 Video. It offered features like speech-to-text transcription, face detection, linked time codes, and M365 interoperability. Although it was part of the M365 suite, it wasn’t well understood or heavily adopted by the community. In the six years since its release, several enhancements improved the app’s experience, but it became clear over time that the architecture wasn’t as scalable as it needed to be. The community still hadn’t fully accepted it as a video streaming service beyond automatically recorded meetings and the occasional town hall. Accessibility features were lacking, and, more importantly, Teams development was rapidly accelerating with increased interoperability as a key focus.


Microsoft went back to the drawing board, and in September of 2020 they announced plans to transition from the current Stream architecture to a new one with SharePoint at its center, rather than containers unique to Stream.


In October of 2022, Microsoft Stream (on SharePoint) was released.

In that announcement, Microsoft shared details about new features to expect in addition to the efforts that would go into the migration. Every M365 team around the world realized they had an unscheduled migration project on their hands as a result of this evolution.

The good news is Microsoft did a lot of preparation to help the community make it through this process. That said, Stream’s development lifecycle now tracks closely with its latest releases, which means the Microsoft community is contributing to the evolution of Stream a bit more by reporting issues and sharing experiences. A similar process occurred when Teams was released in 2017: Microsoft started small and deployed enhancements, many of which were a direct result of community feedback.


After Microsoft released Stream (on SharePoint) last year, they began the enhancements phase for the migration tool, inventory report, and the Power BI dashboard that comes with it. Along the way, we’ve observed many “gotchas” and are ready to share them with you to save you headaches and time. No, you’re not crazy. That “one button” is actually missing! But first, let’s cover the basics.

Microsoft Stream Today: What you should know right now 


Now that the migration is in full swing, standard and government tenants (GCC) have access to the tools offered by Microsoft for the migration. For a brief time, GCC tenants didn’t yet have access. That’s usually the case with enhancements and updates due to diligence and security requirements for government tenants. 

Timeline Milestones 

Firstly, we strongly recommend you bookmark the retirement timeline, migration tool release notes, and web part transition timeline to understand where we’ve been and where we’re going. Those pages include details for standard and GCC tenants. Microsoft’s technical guide for Stream is now very extensive so we also recommend familiarizing yourself with the structure of their guide to make your life a bit easier. 

Standard Tenants Timeline

February 15, 2023 

On this day, the migration tool was made available for standard tenants. Since then, a few notable enhancements were made that are exclusive to standard tenants, and we’ll review those later in this blog post.  

May 15, 2023 

This was the first major milestone date, when no new videos could be uploaded to Stream (Classic) unless an admin took action to push that date back.  

October 15, 2023 

Now that May 15 has passed, the next milestone date to keep your eye on is this coming October 15, when users will no longer be able to access or use Stream (Classic) unless an admin takes action to push that date back. This change can be delayed until April 15, 2024. 

April 15, 2024 

At this time, Stream (Classic) will be fully retired and automatically disabled, users and admins will no longer be able to access or use Stream (Classic), and any remaining content in Stream (Classic) that wasn’t migrated will be deleted.

GCC Tenants Timeline

July 30, 2023 

On this day, the Microsoft Stream migration tool was made available to GCC tenants. 

October 30, 2023 

After this day, no new videos will be allowed to be uploaded to Stream (Classic) unless an admin takes action to push the date back. This can be delayed until January 30, 2024. 

January 30, 2024 

After this day, no new videos can be uploaded to Stream (Classic) for any customers and this date cannot be deferred.  

March 30, 2024 

Users will no longer be able to access or use Stream (Classic) unless an admin takes action to push this date back. This change can be delayed until July 30, 2024. 

July 30, 2024 

Microsoft Stream (Classic) will be fully retired and automatically disabled, users and admins will no longer be able to access or use Stream (Classic), and any remaining content in Stream (Classic) that wasn’t migrated will be deleted.

Now that we’ve covered the bigger milestone dates, let’s cover some of the finer details. 

What’s changing in Microsoft Stream? 

It can be a challenge to summarize every single upcoming change in Stream, so our goal is to highlight the points that will save you time and energy, especially if you’re struggling to understand the change or explain what’s happening to your team and leaders. There are three main points to understand.

1. Storage 

Stream’s storage wasn’t very tangible compared to files in SharePoint Online or OneDrive, for example. So, Microsoft decided to use SharePoint Online to store Stream videos. It makes sense, right? Why continue investing in a separate storage architecture that’s not easily accessible or understood when you’ve got one already?

The main thing to know about this is that storage capacity for Stream videos wasn’t as much of a concern until now. Now that SharePoint Online is the platform for video storage, you’ll need to keep an eye on your storage utilization in the SharePoint Admin center to make sure you don’t exceed your licensed capacity.

Not all videos will migrate into SharePoint. Some videos may go into OneDrive, as well. This will be a decision you may have to make multiple times during the Microsoft Stream migration process. The default destination for Teams meeting recordings is the video owner’s OneDrive storage. (Reminder: every user in OneDrive is allotted 1TB of personal storage space.) You have the option to change this destination during the migration.
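When you face that destination decision across hundreds of videos, it can help to write your triage rules down as code before you start clicking. Here’s a minimal Python sketch of that kind of planning logic; the field names (`content_type`, `owner`, `group`) are hypothetical labels for data you might pull from the Stream inventory report, not the migration tool’s actual schema or API:

```python
# Illustrative destination-triage rules for planning a Stream migration.
# Field names and rules are assumptions for this example, not the tool's behavior.

def plan_destination(video: dict) -> str:
    """Suggest a migration destination for one video record."""
    if video.get("content_type") == "Teams meeting recording" and video.get("owner"):
        # Default behavior: meeting recordings follow the owner's OneDrive
        # (each OneDrive user is allotted 1 TB of personal storage).
        return f"OneDrive:{video['owner']}"
    if video.get("group"):
        # Group- or team-owned content belongs in that team's SharePoint site.
        return f"SharePoint:{video['group']}"
    # Orphaned or ungrouped videos need a human decision; park them for review.
    return "SharePoint:Migration-Review"

videos = [
    {"title": "Standup", "content_type": "Teams meeting recording", "owner": "alice@contoso.com"},
    {"title": "Town hall", "content_type": "Live event", "group": "Comms"},
    {"title": "Old demo", "content_type": "Video on demand"},
]
for v in videos:
    print(v["title"], "->", plan_destination(v))
```

Encoding the rules this way makes the “may have to make multiple times” decision consistent and reviewable before anyone touches the real tool.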

2. Stream (Classic) Web Part

The classic Stream web part is being deprecated during this transition. If you’re using the classic Stream web part on any of your SharePoint pages, you may want to start gaining a deeper understanding of how frequently it’s being used so you can estimate your workload to replace it. What is replacing the classic web part? There are actually multiple web parts that can now interact with videos because they are stored in SharePoint Online. The main candidates include the File and Media, Highlighted Content, and Hero web parts.

3. Stream Playlists 

This is a relatively new concept because now you can store Stream playlists as standard SharePoint lists in any site. It takes some adjusting, but you may find yourself feeling a bit freer to display videos anywhere in SharePoint with this new concept. Microsoft intends to release their own version of this type of page in the future, but Quisitive wants to help you now. Don’t hesitate to reach out to us if you need help with customizations like this. 

Now that you have these high-level details, let’s dive into the mechanics of the migration. 

Microsoft Stream Migration Tool 

With the recent enhancements Microsoft added to the tool, there are options we didn’t have just a month ago, and your migration strategy options have changed for the better as a result.

Where is the Microsoft Stream migration tool? 

You can access the tool in your tenant by going to the following URL:

You can also navigate there by going to your M365 admin center > Settings > Migrations > Microsoft Stream.

How do I use the migration tool? 

Since Microsoft has done an outstanding job of documenting the process and features of the tool, I’m going to point you straight to their Microsoft Stream migration tool knowledge base. They cover every detail and have an FAQ page for it, as well. 

How do I know which videos to migrate? 

You have two options for collecting video data to determine which videos should be migrated and which should be left behind. A lot of your choices may be unique to your business based on how Stream has been adopted so far.  

You might find that you have only 5 video containers and a total of only 20 videos. I’ve personally seen some tenants with upwards of 300 containers and 3500 videos. Even then, the video count isn’t your main focus.  

You’ll likely find meeting recordings more than anything else. Most of those will automatically be mapped for you to the owner’s OneDrive storage account. But there will be others that aren’t as easy to determine.  

Option 1 

We recommend you first run the scan on your Stream containers to get a high-level estimate of the work ahead. You’ll know almost instantly if your Stream service was adopted heavily or not just by the number of containers in your Stream service. If you see very few containers, you’re one of the lucky ones! You may not need to do much digging to determine the most appropriate destination, and you could skip straight to the migration phase. But you can still go with the second, more detailed option if you prefer. 

Option 2 

If Stream is heavily adopted in your organization and you see thousands of videos inside hundreds of containers after running your high-level scan inside the tool, we recommend you run the Stream inventory report. That report comes with a dashboard that will help you understand your videos in greater detail. 

Which approach is best?

The answer to this question depends on what you find when you scan your videos. Microsoft has outlined three different approaches and a checklist that might best meet your needs.

Quisitive recommends focusing your attention on videos that aren’t automatically mapped to users’ OneDrive storage accounts. Those automatically mapped videos are the “low hanging fruit” you might find easiest to migrate without much discussion. But other videos that belong to teams or groups need some extra care to make sure they go to the right place.  

What do the recent enhancements do? 

The two most impactful enhancements made to the migration tool allow you to filter videos based on certain metadata and discover orphaned videos. An orphaned video has no owner, so the tool doesn’t know where to send it. Now it’s up to you to decide where those videos go.

Note: these enhancements may not yet be available to GCC tenants by the time this blog is published. 

1. Filter videos in the Microsoft Stream migration tool 

You can now filter videos by content type, publish date, last view date, and view count. There are three content types to choose from: video on demand, Teams meeting recording, and live event.
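If you export the inventory report, you can apply the same kind of filters offline to build your own shortlist of migration candidates. A quick sketch, assuming hypothetical column names (the report’s actual schema may differ):

```python
import csv
import io
from datetime import date

# Hypothetical slice of an exported Stream inventory report.
# Column names here are placeholders for this example, not the real schema.
report = io.StringIO("""\
Title,ContentType,PublishDate,LastViewDate,ViewCount
All hands,Live event,2019-03-01,2020-01-10,450
Standup,Teams meeting recording,2023-02-11,2023-04-30,6
Onboarding,Video on demand,2021-06-15,2023-05-01,120
""")

cutoff = date(2022, 1, 1)
stale = [
    row["Title"]
    for row in csv.DictReader(report)
    if date.fromisoformat(row["LastViewDate"]) < cutoff and int(row["ViewCount"]) < 500
]
print(stale)  # videos nobody has watched recently: ['All hands']
```

A filter like this is a quick way to separate “migrate now” from “candidates to leave behind” before anyone reviews videos one by one.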

2. Discover and migrate orphaned videos 

When the author of a video leaves an organization, their video doesn’t leave with them. That video remains where it is but now has no owner. This enhancement allows you to find orphaned videos and migrate them. Because they have no owner, the decisions for these might be more complex than the rest.

What are the available destinations? 

There are only two destinations available during the migration – SharePoint Online or OneDrive.  

What do I do when I’m done migrating my videos? 

Make sure to close the loop on your migration efforts by selecting the options in the Stream Admin center to switch over to Stream (on SharePoint). You must be a designated Stream admin to perform this task. Once you’re at the Stream Admin settings page, select the radio buttons to save videos on Stream and disable Stream (Classic) for users.   

Now that we’ve covered the basics with the tool, let’s go over the “gotchas” Quisitive has discovered along the way.  

Common Issues and How to Address Them

As with any new tool, there are bound to be bugs and issues encountered along the learning curve. Here are some “gotchas” to keep in mind when moving around inside the Microsoft Stream migration tool and working with your videos. 

1. Settings button is missing 

One of our clients reported that the new “Settings” button in the Microsoft Stream migration tool doesn’t appear. After thorough testing, Quisitive found that the migration “Settings” button doesn’t appear in Microsoft Edge, but it does appear in Google Chrome.

Settings button in the Microsoft Stream migration tool on Google Chrome

This is important because these settings are directly related to the new filtering enhancement. If you don’t see this button in Edge, try out Google Chrome and it should appear for you. We’ve reported this to Microsoft but there is likely not enough data to support any fixes for it yet. So, if you experience it, let Microsoft know! 

2. Stream doesn’t connect inside the migration tool 

When you visit the Stream migration tool for the first time, there will be a small graphic in the upper right indicating that you are connected to the Stream service and database. 

We’ve gotten reports from some clients that it takes a while for this page to finally load; for one of them, it took up to 20 minutes. Our recommendation is to leave this page open for an extended period of time to see if it connects successfully. If not, make sure you’ve configured your network firewall to allow the Stream service URLs that Microsoft lists in their documentation.

This Microsoft support page explains this symptom in more detail. If you’ve taken this step already, opening a case with Microsoft is the best next step. 

3. The tool showed failed migrations temporarily 

We’ve observed some margin of error in the reporting during a migration. On the migration page, you will see a status bar showing counts of migrated and failed videos and containers.

We’ve gotten reports that the count of failed videos or containers disappears as soon as the job is finished. We can only infer this is related to the error checking process in the background. It’s possible the output to the screen is delayed, as well. We consider those false positives. 

Focus on the count of failures once all videos in the migration job are finished moving. If there are no jobs running in the tool, and you still see failed videos, it’s accurate and you should address those failures as you observe them. 

4. Users are reporting that videos aren’t playing after the migration 

If you move a video to a new location after it’s been migrated, it will not play correctly. This is because each migrated video has a redirect URL that breaks if the file is moved. That URL is set as metadata in the video file during the migration, and when the file is moved, the redirect URL becomes invalid. Not all is lost if that happens, however: you can remigrate a video to the desired destination if needed. But for your own peace of mind, we recommend being certain that the destination you’ve chosen is the final one. The official note from Microsoft:


Without a comprehensive understanding of the choices ahead in your Stream migration, you might struggle more than you need to on your own. The rapid evolution of Stream and the migration tool presents a unique challenge, but it’s also a unique opportunity to be on the ground floor of Microsoft’s plans for Stream. After all, Teams didn’t start strong, but look at it now: it has become Microsoft’s flagship app for productivity and conferencing. It’s exciting to think about where Stream will be in a few years. Will it look like YouTube? Who knows, but Quisitive will be here for it.

Get help with your Microsoft Stream migration

If you find yourself or your team struggling to understand this migration process or you don’t have the time to invest, Quisitive can help so you can focus on the projects you planned for. Don’t hesitate to reach out to us. We understand that this migration effort was a surprise for a lot of IT teams, but we’ve got you covered. 

Google Analytics in SharePoint Online: Is it right for your organization?

Who should read this article?

If you’re a business owner, leader, or decision-maker looking to mature your business with Microsoft’s M365 cloud services and SharePoint Online, you might be thinking about analyzing how your employees use these tools to maximize their value and boost adoption and engagement. This article is for you. 

What are analytics?

Web analytics about user behavior can be interpreted to drive adoption and engagement in a front-facing system like a public website or a company intranet like SharePoint Online. The data collected often includes information about search results, page views, unique viewers, click-through rate, most popular pages, peak usage hours, and much more. 
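For intuition, most of these metrics reduce to simple aggregations over raw page-view events. A toy illustration in Python (the event shape here is invented for the example, not any product’s actual log format):

```python
from collections import Counter

# Toy page-view log; the fields are illustrative assumptions for this sketch.
events = [
    {"user": "amy", "page": "/home"},
    {"user": "amy", "page": "/news"},
    {"user": "ben", "page": "/home"},
    {"user": "cara", "page": "/home"},
]

page_views = Counter(e["page"] for e in events)    # total views per page
unique_viewers = len({e["user"] for e in events})  # distinct users seen
most_popular = page_views.most_common(1)[0][0]     # top page by views

print(page_views["/home"], unique_viewers, most_popular)  # 3 3 /home
```

Real analytics products differ mainly in how they collect these events and how richly they let you slice the aggregates, not in the underlying arithmetic.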

Analytics create potential for a deeper understanding of employee adoption. Without knowing how your employees use SharePoint Online and M365, you may be missing opportunities for engagement and increased adoption of new tools. The best analytics solution for your organization depends on your business needs. Some analytics tools are more generic in nature and are applicable to a wider audience. Oftentimes, free options like Google Analytics are attractive when your budget is limited or the procurement process takes too long. But there is more to consider that could reveal a hidden cost that isn’t always immediately observed.

So, let’s review all options before discussing one of the more popular free options, Google Analytics. 

What are my analytics options for SharePoint Online?

There are several options to collect analytics, but if you do a quick search you might find results like Google Analytics, tyGraph (AvePoint), CardioLog, Netwrix, Microsoft Adoption Content Pack, Syskit Point, and ShareGate Desktop.

The audience differs for each solution. For example, Syskit, Netwrix, and ShareGate are aimed at administrators and are usually used for auditing and governance, while others are geared toward employee-focused groups like Marketing, Human Resources, or Communications for the purpose of employee engagement. tyGraph, for example, has focused on a smoother, more user-friendly experience for interpreting the data. Quisitive is a tyGraph (AvePoint) partner because it integrates with more than just SharePoint Online, and we support all aspects of the M365 suite, including OneDrive, Teams, and the Viva Suite.

Of course, Microsoft offers their own analytics engine inside of the M365 admin center and a separate Power BI dashboard kit. Without question, both add tremendous value in the absence of a third-party tool. Microsoft has made great strides in the last few years with their analytics, but the scope of the data collected is limited. If you’re not in a position to fund the purchase of a third-party tool, the M365 Usage report is an excellent choice and requires little effort to understand and use.

Usage graph example

Factors to consider when choosing an analytics tool may include: 

Google Analytics in SharePoint Online

Google Analytics Logo

One of the most popular free alternatives is Google Analytics. Let’s review the pros and cons. 

Pros of Google Analytics in SharePoint Online

Cons of Google Analytics in SharePoint Online

Which tool is best for my organization? 

Now that you are aware of the pros and cons, let’s put it all into context.  

One of the most common reactions to a paid solution is aversion due to the perception of excessive cost. But just because a tool is free doesn’t mean it won’t cost you anything; it may simply be the wrong tool for your business. Security is also a concern that might be ignored in favor of a free tool. What good is a free tool if it doesn’t truly meet your needs?

Your organization may in fact already be using Google Analytics and wondering whether to stay with it or move to a new solution. The decision criteria remain the same: your business is going to change, and your solutions should be agile enough to change with it. Is Google Analytics agile enough to keep up with M365? Google is focused on a wider audience, mostly public websites and especially online retail. Metrics meant for retail and public websites simply can’t be used in SharePoint Online because the underlying data points aren’t available.

With that in mind, it’s safe to say that the cost of an analytics solution is relative: not just compared to other solutions, free or paid, but relative to the potential consequences of using the wrong solution, which costs you time and energy when you inevitably revisit this decision as your business matures beyond the free solution’s limitations.

You might think ROI isn’t a factor when implementing Google Analytics because the solution is free. The reality is that if you choose an analytics solution that isn’t right for your business, you could find yourself collecting irrelevant data or, more likely, end up with an information deficit: missing behavioral information that could help you motivate your employees to use the tools that organically boost efficiency and output and shift your company culture in a positive direction.

The financial cost of an incomplete understanding of how your employees work is difficult to quantify, but it’s easily observed in common day-to-day pain points and high turnover. Both can be addressed by leadership armed with the added context that behavioral data offers.

The ultimate goal of an analytics solution is to boost productivity but like any business goal, the underlying objective is financial. Increased productivity usually equates to greater cost savings and ideally leads to increased revenue, covering the cost of the solution that enabled your business to mature and succeed. In theory, a paid tool could pay for itself eventually. 


If your goal is to mature your business by using an analytics solution for M365 and SharePoint Online, the question you might ask yourself when choosing a product is: 

“Can we afford the cost associated with the risk of an information deficit?”

If you’ve exhausted your options for funding and can’t look beyond free tools and the native M365 analytics, Google Analytics will suffice, especially with a tight timeline.

But if you have any wiggle room in your budget and timeline to obtain a paid solution that integrates with M365, you may end up covering the cost with the boost in productivity from increased adoption down the road. It’s a win-win scenario, if done right. You should still take your time to do your diligence and make a confident decision. 

Whether your goal is to simply go paperless or to implement complex solutions for big ideas inside SharePoint Online and M365, Google Analytics is a fine starting point and may suffice long-term. However, if you’re expecting ongoing organizational change alongside adoption of new M365 tools, you may better serve your organization by choosing a solution that fully integrates with M365 applications, in anticipation of the future business needs. 

Remember, you likely won’t stop adopting M365 at SharePoint Online and might eventually introduce Teams, OneDrive, and the Viva Suite to your employees. You may need a more comprehensive report of usage across them all to make sure your business is headed in the right direction.

Looking for additional assistance with SharePoint Online?

Quisitive offers a team of experts that can help you create a strategy for implementation, manage adoption, build your SharePoint instance, or optimize your current instance.

Preparing Your Organization for AI Implementation


The adoption of Artificial Intelligence (AI) is rapidly increasing across industries, offering new opportunities for businesses to automate processes, improve decision-making, and enhance customer experiences. To leverage these benefits, organizations need to be AI ready. Let’s walk through the steps you need to take to prepare your organization for AI implementation, focusing on Azure Machine Learning, Azure Data Factory, and Azure DevOps as essential technologies.

Understanding AI Readiness

AI readiness is the degree to which an organization is prepared to integrate AI technologies into its business processes and operations. It involves a comprehensive assessment of the organization’s current capabilities, infrastructure, data, and workforce, followed by the development of a strategic roadmap to address identified gaps and opportunities.

Key Components of AI Readiness

  1. Strategy and Vision: Develop a clear understanding of how AI can support your business goals and define a strategic vision for AI adoption.
  2. Data Infrastructure: Ensure that your organization has access to the necessary data and infrastructure to support AI initiatives, including data storage, processing, and analytics capabilities.
  3. Talent and Skills: Assess your organization’s current AI talent and identify areas where additional guidance may be necessary to support AI initiatives.
  4. Governance and Ethics: Establish policies and guidelines to ensure the ethical use of AI and to address potential risks and challenges associated with AI implementation.

Leveraging Azure Technologies for AI Readiness

Building the Model

Azure Machine Learning is a cloud-based service that enables organizations to build, train, and deploy machine learning models at scale. By using Azure Machine Learning, organizations can:

Data is Key

Azure Data Factory is a cloud-based data integration service that allows organizations to ingest, prepare, and transform data from various sources into a format suitable for AI and machine learning applications. Key features of Azure Data Factory include:

Complete the Cycle with MLOps

Azure DevOps is a suite of tools and services designed to support the entire application development lifecycle, from planning and coding to deployment and monitoring. By integrating AI and machine learning projects with Azure DevOps, organizations can:

Building AI-Ready Organizations

To fully leverage the benefits of AI, organizations need to develop robust data and analytics capabilities and adopt a data-driven mindset. This involves:

  1. Data Strategy: Develop a comprehensive data strategy that outlines the organization’s goals and objectives related to data management, analytics, and AI.
  2. Data Governance: Implement data governance policies and procedures to ensure data quality, security, and compliance.
  3. Data Integration: Integrate data from various sources, both internal and external, to create a unified view of the organization’s data assets.
  4. Data Analytics: Leverage advanced analytics and AI technologies to extract valuable insights from the organization’s data, driving better decision-making and improved business outcomes.
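As a concrete illustration of the governance point above, a data governance policy might mandate automated quality checks before any dataset feeds an AI pipeline. A minimal sketch, with invented field names and rules chosen purely for this example:

```python
# Minimal data-quality checks of the kind a governance policy might mandate.
# The field names ("id", "email") and rules are illustrative assumptions.

def quality_report(rows, key="id", required=("id", "email")):
    """Summarize completeness and key uniqueness for a batch of records."""
    total = len(rows)
    complete = sum(all(r.get(f) for f in required) for r in rows)
    keys = [r[key] for r in rows if r.get(key)]
    return {
        "completeness": complete / total if total else 0.0,  # share of fully populated rows
        "duplicate_keys": len(keys) - len(set(keys)),        # repeated primary keys
    }

rows = [
    {"id": 1, "email": "a@contoso.com"},
    {"id": 2, "email": ""},               # incomplete record
    {"id": 1, "email": "b@contoso.com"},  # duplicate key
]
print(quality_report(rows))
```

Checks like these are cheap to run on every ingest and give governance policies something measurable to enforce, rather than relying on manual review.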

AI Journey

AI readiness is crucial for organizations to harness the full potential of artificial intelligence and gain a competitive edge in today’s rapidly evolving business landscape. By developing a clear AI strategy and vision, investing in the right data infrastructure, building the necessary talent and skills, and implementing strong governance and ethics policies, organizations can prepare themselves for a successful AI-driven future.

Leveraging Azure technologies like Azure Machine Learning, Azure Data Factory, and Azure DevOps can significantly streamline the AI readiness process, enabling organizations to build, deploy, and manage AI solutions more effectively. By focusing on data and analytics and adopting a data-driven mindset, organizations can further enhance their ability to capitalize on the opportunities presented by AI.

If your organization is embarking on its AI journey and seeking expert guidance to help you become AI-ready, feel free to contact us. Our Data and AI team will be more than happy to assist you in developing a customized roadmap for AI adoption tailored to your unique business needs and objectives.

HFMA 2023 Recap

The Healthcare Financial Management Association (HFMA) Conference is an annual event that brings together healthcare finance professionals, industry leaders, and innovators to discuss the latest trends and advancements in healthcare finance.

The Quisitive team was honored to sponsor this year’s event, held recently in Nashville at the beautiful Gaylord Opryland Resort. It was an extraordinary gathering of minds, offering insights into the future of healthcare finance and the transformative potential it holds for the industry. In this blog post, our very own Suresh Krishnan shares his highlights and takeaways from the HFMA 2023 conference, along with the key trends every finance leader in healthcare should keep in mind to improve financial results and thrive in 2023 and beyond.

All Eyes are On Digital Transformation and Advanced Technologies in Healthcare Finance

One of the overarching themes of HFMA 2023 was the significance of digital transformation in healthcare finance. The conference shed light on the increasing adoption of advanced technologies such as artificial intelligence (AI), machine learning (ML), robotic process automation (RPA), and blockchain in finance operations. Session speakers emphasized the potential these technologies have for streamlining processes, enhancing data analytics, improving revenue cycle management, and driving cost efficiencies.

Data-Driven Decision Making Will Set You Apart

Data analytics emerged as another critical focus area during the HFMA Conference. During the event sessions, many experts discussed leveraging analytics to drive insights into financial performance, risk management, population health, and patient outcomes. With the exponential growth of healthcare data, there is a massive need to harness its full potential strategically and responsibly for decision-making. Data and reporting will remain hot topics for finance professionals in healthcare as advanced analytics tools and techniques enable leaders to make data-driven decisions, optimize revenue cycles, identify trends, and improve overall financial performance. Finance leaders who invest in the tools and infrastructure needed to integrate and report on data will set their organizations apart.

Value-Based Care & Payment Reform is Here

The conference provided a platform to explore the ongoing transition from fee-for-service to value-based care models. Experts highlighted the importance of aligning financial incentives with quality and outcomes, emphasizing the need for collaboration between healthcare providers, payers, and patients. Sessions focused on strategies for managing financial risk in value-based contracts, improving care coordination, and leveraging technology to support payment reform initiatives.

Revenue Cycle Management Remains a Pain Point

I was surprised at the number of vendors showcasing revenue cycle management (RCM) solutions, highlighting the ongoing struggle in this area. The conference showcased innovative RCM strategies and technologies aimed at improving revenue integrity, reducing denials, and enhancing patient financial experience. We were there showcasing our MazikCare platform and Payer Matrix RCM Collections Workflow Module. Discussions covered topics such as automation of RCM processes, predictive analytics for claims management, patient engagement tools, and revenue cycle optimization best practices.

Social Determinants of Health Can Bridge Gaps

I listened to Dr. Thomas Fisher from the University of Chicago Medicine speak passionately about health equity in inner-city communities. His stories from the ER during the COVID crisis were very moving. During his session, Dr. Fisher emphasized the need for healthcare technology to address the underlying social determinants of health (SDOH) to achieve true health equity in underserved inner-city communities. Dr. Fisher’s accounts from the front lines of the COVID-19 crisis shed light on the profound impact of systemic disparities on individuals’ health outcomes. As he spoke, it became clear that digital solutions that integrate SDOH considerations are essential for bridging the gap and delivering comprehensive care to vulnerable populations. By incorporating data on factors such as housing, education, and access to resources, healthcare technologies can empower providers to offer personalized interventions and support, ultimately improving the health outcomes and quality of life for individuals in these communities.


The HFMA 2023 Conference was an enlightening gathering that showcased the transformative potential of healthcare finance in driving positive change in the industry. Sessions offered valuable insights into the digital transformation of finance operations, the power of data analytics, the shift towards value-based care, regulatory compliance, and innovative revenue cycle management strategies. The conference undoubtedly left participants inspired and equipped with new knowledge and perspectives to shape the future of healthcare finance. As we reflect on HFMA 2023, it is evident that healthcare finance professionals are at the forefront of driving change, innovation, and financial sustainability in the industry. By embracing technological advancements, leveraging data analytics, and adapting to evolving payment models, healthcare finance leaders can position their organizations for success in an increasingly complex and dynamic healthcare landscape.

About the Author


Suresh Krishnan, Senior Director, MazikCare – Quisitive

Suresh is a healthcare IT Leader with CHCIO certification and over 25 years of experience in application development, infrastructure management and cybersecurity. In 2016, Suresh was recognized as a Top 100 CIO in Hospitals and Health Systems. Suresh works with healthcare leaders to expand their use of modern technologies to improve end-to-end care delivery and operations.


As new and more powerful Artificial Intelligence (AI) technologies are being developed, AI products are becoming increasingly integrated into our lives. From the algorithms that power search engines to facial recognition software, AI is already having a major impact on society, and it has the potential to revolutionize many aspects of our lives. However, with great power comes great responsibility. As AI continues to evolve, it is important that we ensure that it is developed and used responsibly. 

Why do we need responsible AI? 

Historically, advances in technology, even when developed with the best intentions, can have unintended and even harmful consequences. Developing responsible AI can help anticipate and prevent the potential issues caused by AI. There are many reasons why developing responsible AI software matters; here are a few of the most important: 

Avoiding bias and discrimination 

Most AI software relies on machine learning (ML) models to learn how to respond to inputs. These models are trained using data (known as training data). If the training data is not selected carefully, it can contain biases that are then introduced into the AI software. Selecting the training data is an important but hard task. Because of the large amount of data needed to train them properly, ML models usually learn from existing real-world data. This means that even accurate models can learn, and sometimes amplify, pre-existing biases in the data based on race, gender, religion, and other attributes. 

One well-known example of how unintended harmful biases can be introduced into AI is the COMPAS system, which was used by criminal justice agencies to estimate the likelihood that individuals would reoffend. The system used an AI algorithm to determine an inmate’s risk of reoffending based on criminal history and demographic information. The intention of this program was to create a fair system that was not influenced by the unconscious biases a person may have. 

However, a study published in Science Advances in 2018 showed that the model was biased and discriminated against Black individuals. Moreover, the overall accuracy of the model was around 65%, comparable to the combined average results of untrained people. The study reports: 

“Black defendants who did not recidivate were incorrectly predicted to reoffend at a rate of 44.9%, nearly twice as high as their white counterparts at 23.5%; and white defendants who did recidivate were incorrectly predicted to not reoffend at a rate of 47.7%, nearly twice as high as their black counterparts at 28.0%.” 
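The figures quoted above are per-group error rates: the share of people who did not reoffend but were flagged as high risk (false positives), broken out by group. As a minimal sketch of how such a disparity is measured, here is a Python function that computes the false positive rate per group; the toy records are made up for illustration and are not the actual COMPAS data:

```python
from collections import defaultdict

def false_positive_rate_by_group(records):
    """For each group, compute the share of non-reoffenders
    who were incorrectly predicted to reoffend (false positive rate)."""
    counts = defaultdict(lambda: {"fp": 0, "negatives": 0})
    for group, predicted_reoffend, actually_reoffended in records:
        if not actually_reoffended:          # person did not reoffend
            counts[group]["negatives"] += 1
            if predicted_reoffend:           # but the model said they would
                counts[group]["fp"] += 1
    return {g: c["fp"] / c["negatives"] for g, c in counts.items()}

# Toy data: (group, model_predicted_reoffend, actually_reoffended)
toy = [
    ("A", True, False), ("A", False, False), ("A", True, False), ("A", False, False),
    ("B", True, False), ("B", False, False), ("B", False, False), ("B", False, False),
]
rates = false_positive_rate_by_group(toy)
print(rates)  # {'A': 0.5, 'B': 0.25}
```

A large gap between groups in this metric, as the study found, is exactly the kind of signal that should trigger a review of the training data.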

It was ultimately identified that the source of bias was in the data used to build the algorithm: in the training data set, Black people were more likely to be incarcerated than white people. Because this fact was overlooked, the AI was not designed responsibly, and the model learned a racial bias as a result. Since the COMPAS algorithm might be used by parole boards and other judicial bodies, the potential for harm is great, which highlights the importance of responsible AI design. 

Promoting fairness and equity 

Another troubling example lies in facial recognition software. Through various studies, researchers have demonstrated that some widely used AI-based facial recognition algorithms have trouble recognizing subjects who are female, Black, or between 18 and 30 years old. This can lead to degraded experiences for users in those categories, and even biases against them. The problem here seems to be that the training data is not representative of all the users of the facial recognition algorithms, reflecting possible oversight during data collection, inadequate sampling, poor design, or budgetary limitations. Regardless of the cause, the potential for harm is real.  

Responsible AI practices can help ensure that AI systems are designed to be fair and to promote equity among their users. The idea behind fairness is that AI systems should benefit everyone, not just a select few, and should treat everyone fairly and impartially. However, fairness is far from a solved problem, as the examples above show. 

For fair, responsible AI design, we should make sure the training data is sampled in a way that is representative of the users. For example, if the model will be used by people of all ages but you only have training data for young adults, the result will likely not be a fair model. To train a fair model, developers should make sure the training data set includes people of all ages. 
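One simple mitigation, offered here as a sketch rather than a complete fairness fix, is stratified sampling: cap how many examples each group contributes so that no single group dominates the training set. A minimal Python version, using hypothetical 20-year age bands as the strata:

```python
import random
from collections import defaultdict

def stratified_sample(people, stratum_of, per_stratum, seed=0):
    """Draw up to per_stratum examples from each stratum so that no
    single group dominates the resulting sample."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for person in people:
        strata[stratum_of(person)].append(person)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

# Hypothetical records: (id, age). Real training data would be far richer.
population = [(f"p{i}", age) for i, age in
              enumerate([19, 22, 25, 41, 45, 52, 67, 70, 73, 80])]
age_band = lambda person: person[1] // 20  # crude 20-year age bands
balanced = stratified_sample(population, age_band, per_stratum=2)
```

Here every age band contributes at most two records, so the over-represented middle bands no longer drown out the youngest and oldest groups.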

Protecting privacy and security 

AI systems collect and process a lot of personal data, like images of faces, age, demographics, etc. That data is used to train ML models to provide a service to users. However, the use of personal user data raises privacy and security concerns. If this data is not properly protected, it could be used for malicious purposes, like identity theft. 

Moreover, AI systems trained on personal data can be used to track people’s movements, monitor their online activity, and even predict their behavior. For example, facial recognition algorithms are currently being used in China to track millions of citizens’ daily lives, which is already raising privacy concerns. Building responsible AI can help protect users’ privacy and security. To develop responsible AI, we must balance security, convenience, and privacy. 

What can we do about it? 

Alongside the fast advancements in AI in the last few years, there has also been a great deal of research into how to solve and prevent AI’s problems. The huge potential benefits of AI are clear, but researchers are looking for ways to balance those benefits against the potential to cause harm. Many of the companies leading the development of AI technologies, like Microsoft and Google, are also invested in research to ensure the AI systems they develop are fair, are safe, and protect users’ privacy. In other words, they are invested in the development of responsible AI. 

In order to guide organizations in developing responsible AI, Microsoft developed six guiding principles: 

  1. Fairness: AI systems should treat all people fairly. 
  2. Inclusiveness: AI systems should empower everyone and engage people. 
  3. Reliability and Safety: AI systems should perform reliably and safely. 
  4. Transparency: AI systems should be understandable. 
  5. Privacy and Security: AI systems should be secure and respect privacy. 
  6. Accountability: People should be accountable for AI systems. 

You can find more information on Microsoft’s principles for responsible AI in Microsoft’s responsible AI documentation.

Ready to talk to an expert about how to establish responsible AI practices in your organization?

In today’s digital landscape, cyber threats loom larger than ever, and cyberattack prevention is paramount. Safeguarding your business from potential data breaches, cyberattacks, and security incidents is essential. As the guardians of your company’s technological infrastructure, you hold the responsibility of protecting sensitive information, ensuring operational continuity, and maintaining customer trust. That’s where Managed Detection and Response (MDR) services step in as your ultimate game-changer.  

In this blog post, we’ll explore the undeniable benefits that MDR services bring to your organization, empowering you to make informed decisions that can transform your cybersecurity posture, aid in cyberattack prevention, and ultimately enhance your business outcomes. 

1. Proactive Threat Detection and Rapid Response: 

Cybersecurity is constantly evolving. MDR utilizes a mixture of automation and analysts to implement an around-the-clock proactive approach to cyberattack prevention. This is accomplished through monitoring your network, endpoints, and cloud infrastructure for any signs of suspicious activity or potential threats. 

An example of this proactive approach to security is an immediate notification when a document containing sensitive information is shared, even if that file is stale. Once the automated system detects this risky behavior, it sends an alert to an analyst, who resolves the matter before any harm occurs. By employing these advanced technologies and leveraging threat intelligence, MDR teams can quickly identify and mitigate emerging threats, ensuring that any security incidents are swiftly contained and neutralized before they cause extensive damage. 

2. Around-the-Clock Security Operations: 

When clients would ask if we could monitor them around the clock, the answer was always “No.” Until now. We can now offer US-based security operations centers that are fully staffed 24/7, every day of the year, including holidays.

This approach allows us to always have an expert analyst in the chair promptly responding to alerts and threats around the clock. Your information assets are continually monitored for any sign of bad behavior. This means you can rest easy, knowing that there’s always a team of experts diligently watching over your systems and responding promptly to any security events. 

3. Access to Cutting-Edge Technologies and Expertise: 

Managing cybersecurity internally can be a daunting task, requiring significant investments in infrastructure, tools, and talent. This is what makes our partnership with Critical Start so powerful. Critical Start has been a leader in advanced security operations since 2015 and is an integral Microsoft partner, allowing these security protocols to work seamlessly with your existing software infrastructure.

MDR implements, optimizes, and helps customers get more out of Microsoft investments like Microsoft 365 E5, Azure Purview, Azure Sentinel, and Microsoft Security Center, essentially adding a module of capability to the existing programs. This synergy of leading software companies and cutting-edge platforms enables advanced threat detection capabilities, threat hunting techniques, incident response best practices, and cyberattack prevention, all without the burden of building and maintaining an in-house security operation. 

4. Improved Incident Response and Remediation: 

When a security incident occurs, time is of the essence. That’s why we have a service-level agreement of one-hour time-to-detection and one-hour resolution. This means that no matter when an alert comes in, it will get an expert’s attention within one hour, and your digital assets will always receive a rapid response to incidents.

On top of this, our clients have full visibility into every alert and activity, providing a comprehensive view of your company’s security threats. You will always stay in the loop about your company’s security, and we will always respond to any cyber threats without delay. These protocols minimize the impact on your business operations and reduce downtime. 

5. Enhanced Compliance and Regulatory Adherence: 

In an era of increasingly stringent data protection regulations, compliance is no longer a choice; it’s a necessity for cyberattack prevention. MDR services can play a pivotal role in helping your organization achieve and maintain compliance with industry-specific regulations such as GDPR, HIPAA, PCI DSS, and more. By aligning their processes with regulatory requirements and offering valuable insights and documentation, MDR providers can assist you in demonstrating your commitment to data security and regulatory adherence. 

6. Risk Reduction and Business Continuity

A successful cyberattack can lead to severe financial losses, reputational damage, and operational disruptions. MDR services offer a proactive defense strategy that significantly reduces your risk profile and aids in cyberattack prevention. By quickly identifying vulnerabilities, implementing preventive measures, and fortifying your security defenses, MDR providers enable you to safeguard your business continuity, protect your critical assets, and ensure uninterrupted service delivery to your customers. 

Protect Your Business with Spyglass MDR

With Spyglass MDR we implement, fix, improve, and offer 24/7 monitoring. This is a total solution. Additionally, we can have a client receiving full 24x7x365 monitoring within 7-14 days. Embracing MDR services empowers your business to stay one step ahead of malicious actors, ensuring that your digital infrastructure remains secure, your operations run smoothly, and your customers trust you with their sensitive information. So, make the strategic decision today and unlock the power of MDR services to elevate your cybersecurity posture and achieve greater business success.

Remember, cybersecurity and cyberattack prevention is not just an IT concern; it’s a fundamental business imperative. 

Stay secure and vigilant by contacting our security experts today about Spyglass-MDR.


You’ve built your IT team carefully, adding experienced and effective personnel over a long period. Your team knows your systems inside and out. They might even have built them from the ground up. They are experts who provide significant value to your organization.

But you’ve been hearing that the benefits of managed services could add value and you might be wondering: Why would you lose that huge competitive advantage that your local team represents, and hand the reins of your technology to someone else?

It can be difficult at first to see the value in a managed services approach to your technology infrastructure. But this model can provide your organization with considerable advantages without taking away from the great work your in-house team does.

The Benefits of Managed Services

1. Expertise

A managed services provider specializes in running infrastructure and staying abreast of technology trends. An in-house team can’t be expected to keep up with rapidly evolving technology, much less know when new components become available that can give your organization new advantages. Having a managed services partner with expertise in infrastructure is the best (and cheapest) way to make this expertise available for your organization.

2. Premier Support

Premier-level support for your Microsoft infrastructure is costly, and beyond the capacity of smaller organizations, but it’s essential to running your infrastructure effectively. With premier support, if your service provider can’t solve your problem, they can go straight to Microsoft to get in-depth root-cause analysis from their engineering QA teams. Bringing on a managed services partner that leverages premier-level support is the most cost-effective way to get this level of service.

3. Cost

Maybe you can do it all in-house, but what is it costing you in time and resources? Can you afford the staff you’ll need for an entire IT team, or would it be better to hire a service to do it for you? From a cost point of view, it’s far better to let your IT team focus on their core competencies and your strategic initiatives, like support for your enterprise systems and integrations, and let a managed services partner provide the best practices and expertise needed to keep your infrastructure running optimally and securely.

4. 24/7 Support

IT issues don’t follow a 9-to-5 schedule, and your support can’t either. What would it cost to build an in-house team to work on a 24/7 basis? A better solution is to engage a managed services provider that has a blended team, pulling talent from a worldwide pool. They can provide round-the-clock coverage, ensuring that you are taken care of at any time, on any day.

5. Gap Filling

No matter how great your IT team is, it’s impossible for them to cover everything. Perhaps they lack experience in a certain area. Maybe a key member of the team will need time off, so you’ll need temporary support until they’re back. A managed services partner can fulfill those short-term needs to make your team even more efficient and effective.

The managed services approach won’t solve all of your IT problems. But understanding the managed services model and its potential advantages will help you create an accurate picture of your current and future IT needs and evaluate how best to meet them in the context of your larger organizational strategy.

Quisitive offers a cloud managed services program where you gain access to our expertise in Managed IT Services that spans a wide range of technologies.

We can help sustain your applications and technologies with routine support and environment management as well as advise you on strategic improvement programs and ongoing coaching.

Unlock the benefits of managed services with Quisitive.

Explore our different Managed Services offers or, for ad-hoc needs, see our Flex Services.

How to Apply Multi-Select Filters in Power Apps

Power Apps with Combo Box

In this step-by-step tutorial video, we will learn how to build multi-select filters in Power Apps. We will apply multiple filters to a Power Apps gallery, including multiple item selections using a combo box, multi-select checkboxes, and multi-select buttons.

We will build these gallery filters with delegation in mind (no delegation warning) and work with multiple data sources like Dataverse and SharePoint.

I will showcase how to filter a multi-select choice column based on a multi-select combo box control.

The trick is to work around the non-delegable “in” operator and use Equals (=), which is a delegable function. I will also cover a new Power Apps function called “Index”.


Looking for additional assistance with Power Apps?

Microsoft Announces Copilot in SharePoint

On May 2, 2023, Microsoft 365 Copilot in SharePoint was added to the Microsoft 365 Roadmap for SharePoint.

Microsoft emphasizes that Copilot in SharePoint “combines the power of Large Language Models (LLMs), your data in the Microsoft Graph, and best practices to create engaging web content. Use a brief prompt to generate custom sites and pages with content hierarchy, design, and sample content that aligns with user needs. And all within our existing commitments to data security and privacy in the enterprise.”

So what does this mean for you?

Copilot in SharePoint is really going to empower users to create content more easily than before, with much richer automation. Using natural language, users will be able to ask Copilot to create a new site based on a PowerPoint presentation. Microsoft 365 Copilot will then take over and create the content, whether that’s a site, a page, or something else.

What is Microsoft 365 Copilot?

Microsoft 365 Copilot is a new technology that Microsoft has rolled out. It’s an AI tool based on GPT-4, which may sound familiar because ChatGPT is also based on GPT-4.

Microsoft 365 Copilot is going to be rolled out across the Microsoft 365 stack, in everything from Word and Excel to Teams, SharePoint, and more. It’s going to use the power of AI to do work for you, surfacing the information you need and shortening the effort it takes to do everyday tasks. Whether you’re creating content in Word, SharePoint, or Teams, it will also integrate with Outlook and the Viva suite. 

Here’s an example video Microsoft has provided, in which somebody asks Microsoft 365 Copilot to create an employee onboarding site for product managers using an onboarding PowerPoint as the starting point. With just a simple sentence and an uploaded PowerPoint file, Copilot is able to create a new site and start populating it with information.

From what I can tell, if you’re not good at page design, that won’t matter anymore, because Copilot will be able to come up with nice-looking designs. So there are going to be a lot of benefits to this technology, especially when we have Copilot in SharePoint later this year.

Copilot in SharePoint Roll Out

According to the Microsoft 365 Roadmap, the Copilot in SharePoint roll out will begin in November 2023.

My own personal tenant is running on the targeted release, so it should get features sooner than the majority of production environments, which are typically on the standard release. As soon as it’s available in my tenant, I’m definitely going to try this out and post more videos on what I can do with it.

With this being rolled out to desktop applications as well as the enterprise suite, including Teams, Viva, and SharePoint, users are going to get a lot of exposure to it. You should be familiar with it so you can answer users’ questions and help them, because this adds a large amount of functionality that users will definitely be trying out.

To support this new interface that Copilot will sit in, the edit layout for SharePoint pages is changing as well, moving all of your authoring tools off to the side so that you can quickly get to what you need. I think it’s a much-needed improvement over the old editing layout.

As more information comes out, I’ll keep putting out update videos on these topics to keep you informed. Are you excited to try out Microsoft 365 Copilot when it rolls out to your tenant?


Looking for additional assistance with SharePoint?


In the first blog in this series, I noted that there are five stages to the lifecycle of each M365 Group: Ideation, Request, Creation, Monitoring, and Archival.

Today, I’ll expand on the first stage, Ideation. This is the point at which someone has an idea for a place to accomplish some kind of work. We need to fully understand several issues that may come up during this phase, so let us review them.  

Does an M365 Group Already Exist?

One major issue that organizations face is that when they allow M365 Groups to be created, there is an explosion of groups that are duplicates of each other. We saw this in Yammer, where it was so easy to create a new community that users would often create one before determining if one already existed.

Ensuring that users check to see if an M365 Group already exists for their proposed use is step one in controlling your M365 Group environment. The question is, how do we accomplish this? 

The first step should be to train our users to always look for an existing group before they create one. Unfortunately, this is more challenging than you think. When a user has an idea for a Group, they want to get working on it right away, and any delay in our process is likely to drive them into using another tool that offers them immediate gratification.

For example, a user wants to chat and collaborate with their team on a project. This is a quintessential ad hoc collaboration example that Microsoft designed M365 Groups to address. However, if we force users to enter a request and it takes hours or days to create their workspace, they might just create a Group Chat in Teams with everyone on the project team and use the Files tab in the chat to collaborate in their various OneDrive accounts.

This is not the best practice for using Microsoft Teams. We should provide guidance for users on the proper behavior. It underscores how important it is to not place roadblocks to creating M365 Groups. 

If we want our users to see if an M365 Group already exists, then we are going to have to help them, as the out-of-the-box experience is lacking. Let us take as an example a user who wants to create a Microsoft Teams workspace for a demo. For this example, I have created three Teams workspaces: Demo Public Team, Public Demo Team, and Demo Private Team.  


For our purposes, either public team would be a duplicate of our example user’s proposed new Team. The user goes to the “Join or create a team” link and clicks it. They are presented with a list of public teams that they might want to join.  

You can see the first problem here. Where is the Demo Public Team? It is not suggested to the user for some reason that only Microsoft truly knows. We can search for it, so our example user will type in Demo and see if they find anything. 

In this case, the user sees the public team that starts with “Demo,” but not the private team or Public Demo Team, because the search only matches Teams whose names start with the search term. We can see that if we search for “public” we get these results. 

Private teams are even harder to locate. So, how do we solve this problem?  

Creating a Directory of M365 Groups 

We will need to create our own directory of M365 Groups. This is not necessarily difficult, but keeping it up to date is tedious and requires us to automate a process that runs on a schedule. We can use PowerShell to accomplish this. There is a PowerShell cmdlet called Get-AzureAdGroup that returns every Azure AD Group. It is part of the AzureAD module, and it will return the following: 

This could be used to create a directory, but it does not tell us much about the M365 Group. We could use Get-SPOSite instead, which gives us much more information, including Site Collections that are not M365 Groups, which might be valuable to us. We could also go to the SharePoint Admin center and export the list of Active Sites to a CSV file: 

Here you will get a list of all the sites with everything that you might need. 

If you want the list kept up to date, then you will either need to re-run the PowerShell or the export on a schedule and save the data someplace like a SharePoint list that you can use for your directory. Keeping that directory current requires either a daily job or forcing every new site to be created through a process that updates the directory as part of its workflow. We will talk about that in Part 3, where we discuss requesting a new M365 Group. 
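As a sketch of what that directory could enable, the exported CSV can be loaded and searched by substring, avoiding the prefix-only matching of the built-in Teams search described earlier. This is Python for illustration, and the column names are assumptions; adjust them to match the actual columns in your Active Sites export:

```python
import csv
import io

def load_directory(csv_text):
    """Parse an exported Active Sites CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def find_sites(directory, term, name_column="Site name"):
    """Substring match on the site name, unlike the prefix-only
    search in the Teams UI."""
    term = term.lower()
    return [row for row in directory if term in row[name_column].lower()]

# Hypothetical export contents mirroring the three demo workspaces.
export = """Site name,URL,Storage used (GB)
Demo Public Team,https://contoso.sharepoint.com/sites/DemoPublicTeam,0.1
Public Demo Team,https://contoso.sharepoint.com/sites/PublicDemoTeam,0.2
Demo Private Team,https://contoso.sharepoint.com/sites/DemoPrivateTeam,0.1
"""
directory = load_directory(export)
matches = find_sites(directory, "demo")  # finds all three, regardless of word position
```

Searching this list for “public” would surface both Demo Public Team and Public Demo Team, which is exactly the duplicate check the built-in search failed to provide.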

Is an M365 Group the Best Way to Solve this Problem? 

This is a tricky question to address technologically, but from a Governance perspective, it is one that we need to address. There are reasons that an organization might want users to work in specific ways. For example, there might be a process to create a project workspace when D365 reaches a specific point in the opportunity lifecycle. Thus, we do not want users to request or create a site for a project since it will happen as part of an ERP workflow.

This training and communications issue should be part of our Governance process even though there is not a technological solution for it. We might prevent users from seeing the project workspace template, but that will not stop them from requesting, say, a generic M365 Group and customizing it. We need to monitor that, as well as educate our users on how to create these groups. 

Select the Right Template for the M365 Group 

Like the directory that we will need to create, each M365 Group should be based on a template. These should be designed to guide users to which template they would request based on the problem they are trying to solve. For example, if a user is looking for a place to collaborate and communicate around a set of documents for a presentation, creating a Teams workspace makes sense. If they are working on a process like loan origination, then it might make more sense to create a channel in an existing Teams workspace or add a folder to a SharePoint site.  

Another advantage of templates is that they can be used to customize the content, the features, the look and feel, and more of an M365 Group when it is created. Take as an example a new project workspace. We might want to include the template for the project charter, as well as a risk register, and the templates for requirements and design documents. That way we can ensure that the correct documents are used without the users having to search for them. 

Putting It All Together 

Our goal here is to allow our end users to easily create sites when they need them with minimal disruption or delay. To accomplish this, we need to ensure that they can quickly and easily find M365 Groups that already exist, even if they do not have access to them, in order to prevent duplication. If we do not do this now, then during the approval or creation step someone else will need to validate that a group does not already exist, or we will end up with duplicate groups that require merging.

This is the time to prevent that, but to do so we must have a searchable list of all the groups. Yes, you can still hide some groups that are sensitive in nature; those hidden groups simply do not appear in the directory. 

Next Steps 

The Ideation phase of the M365 Group lifecycle is focused on aligning users with the best options in M365 Groups to address their needs. We also need to ensure that they don’t duplicate existing sites, which means we need a directory of sites that users can search before creating a new one. The out-of-the-box tools for this aren’t great, but with some PowerShell and time you can enable this searchable list for your users. This is also the point at which we should expose users to the list of available group templates so that they can select what works for them, which means we need to identify and create templates linked to the use cases we have identified. 

In the next article in this series, we will talk about the Request phase of M365 Group lifecycle. 

Looking for additional assistance with Microsoft 365?