Time is crucial in healthcare, where every delay can significantly impact a patient’s experience and recovery. Effective care coordination ensures that patients receive the right care at the right time. To achieve this, physicians, providers, patients, and other stakeholders need to communicate efficiently to convey critical healthcare information and take necessary actions.
There is growing interest in quality care coordination, with healthcare trends shifting away from conventional fee-for-service (FFS) models towards value-based care, which prioritizes patient outcomes and cost savings. Technological advancements like AI and machine learning provide healthcare providers with tools to fine-tune and automate information flows for more effective management of patient needs.
However, despite the benefits of technologically advanced coordination solutions, healthcare teams may still cling to outdated systems and processes due to the technical complexities involved.
Let’s explore the common pain points of digital transformation in healthcare and effective ways to overcome them with intuitive software solutions.
Care coordination involves multiple parties, including specialists, healthcare staff at various organizations, diagnostic technicians, patients, and payers. Uninterrupted communication between all parties is crucial for maintaining high standards of value-based healthcare. Unfortunately, healthcare teams may still rely on legacy communication channels, such as faxing, which can cause significant delays for patients seeking urgent care and treatment.
Test Result Tracking
While mainstream technology allows for real-time tracking of food deliveries and parcels, many healthcare settings lack real-time monitoring for patient test results, despite their impact on a patient’s quality of care. Patients and caregivers often experience lengthy wait times to receive test results, particularly when tests are performed by an outside organization, affecting their lifestyle and healthcare planning. Poor care coordination has also resulted in patient-reported adverse events, such as inaccurate test findings among diabetes patients.
Healthcare data security is complex and fraught with risks, from improper handling of electronic health records to widespread ransomware attacks, which have resulted in losses of over $30 billion.
Rapid changes in data security standards and practices make it challenging for healthcare providers to safeguard their data networks, especially when combined with the ongoing challenges of meeting the latest healthcare compliance rules. A compromised healthcare database poses a serious security issue, as leaked data can undermine the confidentiality of medical history and payment details, which cybercriminals can manipulate or steal for ransom.
The Rise of AI and Machine Learning Opportunities
AI and machine learning have emerged as powerful solutions for enhancing productivity across industries, including care coordination in healthcare.
The dynamic functions of these technologies can reduce human error and expedite repetitive procedures, such as appointment scheduling, patient follow-ups, prior authorization, and referral management, thereby boosting care coordination efforts. Market experts predict that the global AI in healthcare market will reach $188 billion by the end of the decade, with a compound annual growth rate (CAGR) of 37%.
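As a quick sanity check on those figures, CAGR is just compound growth. The sketch below works backwards from the cited numbers; the seven-year 2023-to-2030 horizon is our assumption, not from the market report:

```python
def cagr_projection(start_value: float, rate: float, years: int) -> float:
    """Project a value forward under compound annual growth."""
    return start_value * (1 + rate) ** years

# Working backwards from the cited figures ($188B by 2030 at 37% CAGR),
# and assuming a 7-year horizon, the implied current market size is:
implied_start = 188 / (1 + 0.37) ** 7
print(round(implied_start, 1))  # -> 20.8 (billion USD)
```

In other words, a market of roughly $21 billion today compounds to about $188 billion over seven years at 37% annual growth.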
Providing Accurate Data Analytics and Tracking
The robust mechanisms of AI enable teams to consistently monitor patient needs and emergencies throughout their care and treatment by using advanced algorithms to glean exceptions and abnormalities. Healthcare facilities can analyze large databases of electronic health records (EHRs) and quickly notify relevant specialists and providers about patient needs without manually going through all the details.
Optimizing Care Management
The practical implementation of care management remains a challenge for healthcare experts. AI’s organized collection and distribution of health records information can simplify care management processes in complex and stressful environments, facilitating value-based healthcare.
Securing Healthcare Data
Modern care coordination platforms, like MazikCare, offer added protection through a combination of techniques, such as encryption technology and ML methods like federated learning (FL), which train algorithms without exposing sensitive information.
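To make the federated learning idea concrete, here is a minimal sketch of federated averaging, the canonical FL aggregation step. This illustrates the general technique, not MazikCare's actual implementation; the hospital names and weight values are invented:

```python
def federated_average(client_weights):
    """Federated averaging: combine model weights trained locally on each
    client's private data. Only the weight vectors leave the clients; the
    underlying patient records never do, which is FL's core privacy property."""
    n_clients = len(client_weights)
    n_params = len(client_weights[0])
    return [sum(w[i] for w in client_weights) / n_clients for i in range(n_params)]

# Three hypothetical hospitals each train locally and share only weights:
hospital_updates = [
    [0.2, 0.5],  # hospital A's local model weights
    [0.4, 0.7],  # hospital B
    [0.6, 0.9],  # hospital C
]
print(federated_average(hospital_updates))  # averages to roughly [0.4, 0.7]
```

Real-world federated averaging typically weights each client by its sample count; this equal-weight version just keeps the idea visible.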
Care Coordination of the Future
Quisitive’s MazikCare Care Coordination platform equips healthcare teams with the tools needed to streamline communications and data management from a unified source. MazikCare establishes a digital bridge between patients and healthcare staff, enabling the highest level of care and service at every stage.
MazikCare drives optimized care coordination in a simple and accessible way through a two-pronged approach: integrating team collaborations and empowering patients.
The MazikCare platform provides a sleek user experience that connects physicians, patients, and payers through highly intuitive dashboards and portals. MazikCare allows teams to manage all types of care management and staff scheduling, keeping every involved party well-informed of the latest changes.
Unrivaled Patient Support
MazikCare empowers patients with access to a sleek portal for taking charge of their healthcare journey. The digital portal guides users through medication management, scheduling, and self-assessments, improving the efficiency of their healthcare experiences. These patient empowerment features provide individuals with improved control over their treatment plans and medical appointments, ultimately leading to better recovery outcomes and disease control.
The growing investment momentum behind value-based healthcare suggests that adopting a scalable care coordination platform, one that integrates primary and specialty care with security, interoperability, and user-friendliness, gives providers an advantage in this time-sensitive industry.
Speak with Quisitive today to learn how MazikCare can help elevate your care coordination standards with cutting-edge tech breakthroughs.
Dynamics 365 Finance is a powerful ERP solution from Microsoft suitable for medium to large enterprises across various industries. It can help organizations streamline operational efficiency, financial management, and decision-making processes, ultimately driving growth and profitability.
Quisitive is very excited about the release of Dynamics 365 Finance Wave 1 and the value it brings to our customers. All release notes can be found on the Microsoft site here, but below are our top five new features included in the latest 10.0.37 release:
- Invoice Capture: To help expedite the accounts payable process, invoice capture is now available. This new feature automates the reading and recognition of vendor invoices by providing optical character recognition (OCR) capabilities. Vendors can now email an invoice to the system, which is converted into an invoice directly in Dynamics 365. Many customers have been asking about OCR functionality, and we always had to turn to expensive ISV solutions or integrations. With this now out of the box in Dynamics 365, it provides a fully connected accounts payable automation solution.
- Financial Tags: The new financial tags available in Dynamics 365 Finance are an alternative to financial dimensions and eliminate the need for ledger reversal and reclass entries. Financial tags are a way for a business to identify/tag financial transactions to better analyze and associate transactions within the business. For the first release, up to 20 financial tags can be created and have been incorporated into the General journal and Global general journal. Financial tags will be incorporated into more transactions and processes with each subsequent release.
- Subscription Billing Enhancements: In the latest release, subscription billing has been enhanced for cost and revenue deferrals within the project accounting and sub ledger. Subscription billing enables organizations to manage subscription revenue opportunities and recurring billing through billing schedules and better recognize subscription revenues.
- Business Performance Planning: This new feature allows companies to create company-wide budgets and perform continuous planning to drive agile decision-making. Using Power BI and Excel, users can model Dynamics 365 Finance data to derive key insights.
- Added localizations and languages: As businesses expand their reach globally, ensuring all regulations are covered becomes more challenging. This new release supports 51 countries/regions out of the box, adding localizations for Chile, Colombia, Costa Rica, Nicaragua, Panama, Paraguay, and Uruguay. It includes standard Electronic Reporting for those areas, along with regulatory updates for upcoming legislation in Japan and France. Dynamics 365 now covers 57 languages in total. If a language is not covered out of the box, Translation Services through LCS can help generate translations.
Questions about this release? Get in touch.
Are you looking to make the move to Dynamics 365 Finance? Book a cloud ERP migration assessment.
Are you still using Dynamics AX? Explore the differences between AX and Dynamics 365 Finance.
If you’ve come across this blog post, it’s likely that you and your team are in the process of migrating from Microsoft Stream (Classic) to Stream (on SharePoint) or preparing for it. Microsoft will be ending support for Stream (Classic) in 2024. Today, we’ll share the history of Microsoft Stream, updates to the platform, information about the Microsoft Stream Migration Tool and tips to help fix common issues.
Read the full blog to make sure you’re ready for your Microsoft Stream migration.
History of Microsoft Stream
If you’re not yet aware of this unexpected migration, let’s take a moment to catch you up.
In 2017, Microsoft released Microsoft Stream (Classic), replacing Office 365 Video. It offered features like speech-to-text transcribed audio, face detection, linked time codes, and M365 interoperability. Although it was part of the M365 suite, it wasn’t well understood or heavily adopted by the community. During the 6 years since its release, several enhancements were made to improve the app’s experience, but it became clear over time that the architecture wasn’t as scalable as it needed to be. The community still hadn’t fully accepted it as a video streaming service, beyond automatically recorded meetings and the occasional town hall. Accessibility features were lacking and more importantly, Teams development was rapidly accelerating, and increased interoperability was a key focus.
Microsoft went back to the drawing board, and in September 2020 they announced plans to transition from the existing Stream architecture to a new one built around SharePoint rather than Stream’s own dedicated containers.
In October of 2022, Microsoft Stream (on SharePoint) was released.
In that announcement, Microsoft shared details about new features to expect in addition to the efforts that would go into the migration. Every M365 team around the world realized they had an unscheduled migration project on their hands as a result of this evolution.
The good news is Microsoft did a lot of preparation to help the community make it through this process. And because Stream’s development lifecycle now moves in step with the latest releases, the Microsoft community is contributing to the evolution of Stream a bit more by reporting issues and sharing experiences. A similar process occurred when Teams was released in 2017: Microsoft started small and deployed enhancements, many of which were a direct result of community feedback.
After Microsoft released Stream (on SharePoint) last year, they began the enhancements phase for the migration tool, inventory report, and the Power BI dashboard that comes with it. Along the way, there have been many “gotchas” we’ve observed and are ready to share with you to save you headaches and time. No, you’re not crazy. That “one button” is actually missing! But first, let’s cover the basics.
Microsoft Stream Today: What you should know right now
Now that the migration is in full swing, standard and government tenants (GCC) have access to the tools offered by Microsoft for the migration. For a brief time, GCC tenants didn’t yet have access. That’s usually the case with enhancements and updates due to diligence and security requirements for government tenants.
Firstly, we strongly recommend you bookmark the retirement timeline, migration tool release notes, and web part transition timeline to understand where we’ve been and where we’re going. Those pages include details for standard and GCC tenants. Microsoft’s technical guide for Stream is now very extensive so we also recommend familiarizing yourself with the structure of their guide to make your life a bit easier.
Standard Tenants Timeline
February 15, 2023
On this day, the migration tool was made available for standard tenants. Since then, a few notable enhancements were made that are exclusive to standard tenants, and we’ll review those later in this blog post.
May 15, 2023
This was the first major milestone date, when no new videos could be uploaded to Stream (Classic) unless an admin took action to push that date back.
October 15, 2023
Now that May 15 has passed, the next milestone date to keep your eye on is this coming October 15, when users will no longer be able to access or use Stream (Classic) unless an admin takes action to push that date back. This change can be delayed until April 15, 2024.
April 15, 2024
At this time, Stream (Classic) will be fully retired & automatically disabled, users & admins will no longer be able to access or use Stream (Classic), and any remaining content in Stream (Classic) that wasn’t migrated will be deleted.
GCC Tenants Timeline
July 30, 2023
On this day, the Microsoft Stream migration tool was made available to GCC tenants.
October 30, 2023
After this day, no new videos can be uploaded to Stream (Classic) unless an admin takes action to push the date back. This can be delayed until January 30, 2024.
January 30, 2024
After this day, no new videos can be uploaded to Stream (Classic) for any customers and this date cannot be deferred.
March 30, 2024
Users will no longer be able to access or use Stream (Classic) unless an admin takes action to push this date back. This change can be delayed until July 30, 2024.
July 30, 2024
Microsoft Stream (Classic) will be fully retired & automatically disabled, users & admins will no longer be able to access or use Stream (Classic), and any remaining content in Stream (Classic) that wasn’t migrated will be deleted.
Now that we’ve covered the bigger milestone dates, let’s cover some of the finer details.
What’s changing in Microsoft Stream?
It can be a challenge to summarize every single upcoming change in Stream so our goal is to bring your attention to the bullet points that will save you time and thought energy, especially if you’re struggling to understand it or explain what’s happening to your team and leaders. There are three main points to understand.
1. Stream Video Storage
Stream’s storage wasn’t very tangible compared to files in SharePoint Online or OneDrive, for example. So, Microsoft decided to use SharePoint Online to store Stream videos. It makes sense, right? Why continue investing in a separate storage architecture that’s not easily accessible or understood when you’ve got one already?
The main thing to know about this is that storage capacity for Stream videos wasn’t as much of a concern until now. Now that SharePoint Online is the platform for video storage, you’ll need to keep an eye on your storage utilization in the SharePoint Admin center to make sure you don’t exceed your licensed capacity.
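As a rough illustration of that kind of monitoring, the sketch below flags sites approaching a storage threshold. The site names, quota figures, and the 85% threshold are all assumptions for the example; in practice you would pull real numbers from the SharePoint admin center or a usage report:

```python
def flag_sites_near_quota(sites, threshold=0.85):
    """Return names of sites whose used/total storage ratio meets or
    exceeds the threshold. `sites` maps site name -> (used_gb, total_gb)."""
    return [name for name, (used, total) in sites.items() if used / total >= threshold]

# Hypothetical tenant numbers; replace with real figures from the admin center.
usage = {
    "TeamsRecordings": (900.0, 1024.0),  # ~88% used
    "HR-Videos": (120.0, 1024.0),        # ~12% used
    "TownHalls": (980.0, 1024.0),        # ~96% used
}
print(flag_sites_near_quota(usage))  # -> ['TeamsRecordings', 'TownHalls']
```

A scheduled check like this, fed by whatever reporting your tenant exposes, gives you early warning before migrated videos push a site past its licensed capacity.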
Not all videos will migrate into SharePoint. Some videos may go into OneDrive, as well. This will be a decision you may have to make multiple times during the Microsoft Stream migration process. The default destination for Teams meeting recordings is the video owner’s OneDrive storage. (Reminder: every user in OneDrive is allotted 1TB of personal storage space.) You have the option to change this destination during the migration.
- You can find out more about the future state of videos once they’re migrated HERE.
- And you’ll find a full breakdown of how Stream will affect your SharePoint storage HERE.
2. Stream (Classic) Web Part
The classic Stream web part is being deprecated during this transition. If you’re using the classic Stream web part on any of your SharePoint pages, you may want to start gaining a deeper understanding of how frequently it’s being used so you can estimate your workload to replace it. What is replacing the classic web part? There are actually multiple web parts that can now interact with videos because they are stored in SharePoint Online. The main candidates include the File and Media, Highlighted Content, and Hero web parts.
- You can read more about the web part transition HERE.
- You can review the FAQs Microsoft has prepared for questions you may already have.
3. Stream Playlists
This is a relatively new concept because now you can store Stream playlists as standard SharePoint lists in any site. It takes some adjusting, but you may find yourself feeling a bit freer to display videos anywhere in SharePoint with this new concept. Microsoft intends to release their own version of this type of page in the future, but Quisitive wants to help you now. Don’t hesitate to reach out to us if you need help with customizations like this.
- You can find all the details about creating Stream playlists HERE.
- One of our own consultants, Steve Corey, has published a video on YouTube demonstrating how to set up a hub of videos using the Highlighted Content web part and PnP search.
Now that you have these high-level details, let’s dive into the mechanics of the migration.
Microsoft Stream Migration Tool
With the recent enhancements Microsoft added to the tool, there are some options we didn’t have just a month ago. Your strategy options changed for the better, as a result.
Where is the Microsoft Stream migration tool?
You can access the tool in your tenant by going to the following URL: https://admin.microsoft.com/Adminportal/Home?source=applauncher#/featureexplorer/migration/StreamMigration
You can navigate there as well by going to your M365 admin center > Settings > Migrations > Microsoft Stream.
How do I use the migration tool?
Since Microsoft has done an outstanding job of documenting the process and features of the tool, I’m going to point you straight to their Microsoft Stream migration tool knowledge base. They cover every detail and have an FAQ page for it, as well.
How do I know which videos to migrate?
You have two options for collecting video data to determine which videos should be migrated and which should be left behind. A lot of your choices may be unique to your business based on how Stream has been adopted so far.
You might find that you have only 5 video containers and a total of only 20 videos. I’ve personally seen some tenants with upwards of 300 containers and 3500 videos. Even then, the video count isn’t your main focus.
You’ll likely find meeting recordings more than anything else. Most of those will automatically be mapped for you to the owner’s OneDrive storage account. But there will be others that aren’t as easy to determine.
We recommend you first run the scan on your Stream containers to get a high-level estimate of the work ahead. You’ll know almost instantly if your Stream service was adopted heavily or not just by the number of containers in your Stream service. If you see very few containers, you’re one of the lucky ones! You may not need to do much digging to determine the most appropriate destination, and you could skip straight to the migration phase. But you can still go with the second, more detailed option if you prefer.
If Stream is heavily adopted in your organization and you see thousands of videos inside hundreds of containers after running your high-level scan inside the tool, we recommend you run the Stream inventory report. That report comes with a dashboard that will help you understand your videos in greater detail.
Which approach is best?
The answer to this question depends on what you find when you scan your videos. Microsoft has outlined 3 different approaches and a checklist that might best meet your needs.
Quisitive recommends focusing your attention on videos that aren’t automatically mapped to users’ OneDrive storage accounts. Those automatically mapped videos are the “low hanging fruit” you might find easiest to migrate without much discussion. But other videos that belong to teams or groups need some extra care to make sure they go to the right place.
What do the recent enhancements do?
The two most impactful enhancements made to the migration tool let you filter videos based on certain metadata and discover orphaned videos. An orphaned video has no owner, so the tool doesn’t know where to send it. It’s up to you to decide where those videos go.
Note: these enhancements may not yet be available to GCC tenants by the time this blog is published.
1. Filter videos in the Microsoft Stream migration tool
You can now filter videos by content type, publish date, last view date, and view count. There are three content types to choose from: Video on demand, Teams meeting recording, and Live event.
- Read more HERE to understand when and where to set those filters.
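Conceptually, these filters are simple predicates over video metadata. The sketch below mimics that behavior; the field names and sample videos are our own illustration, not the migration tool’s actual schema:

```python
from datetime import date

def filter_videos(videos, content_type=None, published_after=None,
                  last_viewed_after=None, min_views=0):
    """Apply the same kinds of filters the migration tool now offers.
    Field names below are illustrative, not the tool's real schema."""
    result = []
    for v in videos:
        if content_type and v["content_type"] != content_type:
            continue
        if published_after and v["published"] < published_after:
            continue
        if last_viewed_after and v["last_viewed"] < last_viewed_after:
            continue
        if v["views"] < min_views:
            continue
        result.append(v)
    return result

videos = [
    {"name": "Q1 Town Hall", "content_type": "Live event",
     "published": date(2021, 3, 1), "last_viewed": date(2021, 4, 1), "views": 540},
    {"name": "Standup 06-12", "content_type": "Teams meeting recording",
     "published": date(2023, 6, 12), "last_viewed": date(2023, 6, 13), "views": 4},
]

# Keep only videos someone actually watched recently:
active = filter_videos(videos, last_viewed_after=date(2023, 1, 1), min_views=1)
print([v["name"] for v in active])  # -> ['Standup 06-12']
```

Thinking of the filters this way helps when planning: each filter you set in the tool simply shrinks the migration batch to the videos worth keeping.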
2. Discover and migrate orphaned videos
When the author of a video leaves an organization, the video doesn’t leave with them. That video remains where it is but now has no owner. This enhancement allows you to find the orphaned videos and migrate them. Because they have no owner, the decisions for these might be more complex than the rest.
- You can read more about this enhancement HERE.
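The logic behind this enhancement amounts to partitioning your inventory by owner. A minimal sketch, with invented video records for illustration:

```python
def partition_by_owner(videos):
    """Split videos into those with an owner (auto-mappable to the owner's
    OneDrive) and orphaned ones that need a manually chosen destination."""
    owned = [v for v in videos if v.get("owner")]
    orphaned = [v for v in videos if not v.get("owner")]
    return owned, orphaned

# Hypothetical inventory entries:
inventory = [
    {"name": "All-hands recording", "owner": "jdoe@contoso.com"},
    {"name": "Legacy training video", "owner": None},  # author left the org
]
owned, orphaned = partition_by_owner(inventory)
print([v["name"] for v in orphaned])  # -> ['Legacy training video']
```

The orphaned list is the set you will need to discuss with stakeholders, since no default destination exists for those videos.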
What are the available destinations?
There are only two destinations available during the migration – SharePoint Online or OneDrive.
What do I do when I’m done migrating my videos?
Make sure to close the loop on your migration efforts by selecting the options in the Stream Admin center to switch over to Stream (on SharePoint). You must be a designated Stream admin to perform this task. Once you’re at the Stream Admin settings page, select the radio buttons to save videos on Stream and disable Stream (Classic) for users.
Now that we’ve covered the basics with the tool, let’s go over the “gotchas” Quisitive has discovered along the way.
Common Issues and How to Address Them
As with any new tool, there are bound to be bugs and issues encountered along the learning curve. Here are some “gotchas” to keep in mind when moving around inside the Microsoft Stream migration tool and working with your videos.
1. Settings button is missing
One of our clients reported that the new “Settings” button in the Microsoft Stream migration tool doesn’t appear. After thoroughly testing, Quisitive found that the migration “Settings” button doesn’t appear in Microsoft Edge, but it does appear in Google Chrome.
This is important because these settings are directly related to the new filtering enhancement. If you don’t see this button in Edge, try out Google Chrome and it should appear for you. We’ve reported this to Microsoft but there is likely not enough data to support any fixes for it yet. So, if you experience it, let Microsoft know!
2. Stream doesn’t connect inside the migration tool
When you visit the Stream migration tool for the first time, there will be a small graphic in the upper right indicating that you are connected to the Stream service and database.
We’ve gotten reports from some clients that it takes a while for this page to finally load, up to 20 minutes in one case. Our recommendation is to leave this page open for an extended period of time to see if it connects successfully. If not, make sure you’ve configured your network firewall to allow the following URLs:
This Microsoft support page explains this symptom in more detail. If you’ve taken this step already, opening a case with Microsoft is the best next step.
3. The tool showed failed migrations temporarily
We’ve observed some margin of error when it comes to the reporting during a migration. On the migration page, you will see a status bar that looks like this:
We’ve gotten reports that the count of failed videos or containers disappears as soon as the job is finished. We can only infer this is related to the error checking process in the background. It’s possible the output to the screen is delayed, as well. We consider those false positives.
Focus on the count of failures once all videos in the migration job are finished moving. If there are no jobs running in the tool, and you still see failed videos, it’s accurate and you should address those failures as you observe them.
4. Users are reporting that videos aren’t playing after the migration
If you move a video to a new location after it’s been migrated, it will not play correctly. This is due to the fact that each migrated video has a redirect URL that will break if moved. That URL is set during the migration as metadata in the video file and when the file is moved, that redirect URL becomes invalid. Not all is lost if that happens, however. You can remigrate a video to the desired destination, if needed. But for your own peace of mind, we recommend being certain that the destination you’ve chosen is the final one. The official note from Microsoft:
Without a comprehensive understanding of the choices ahead about your Stream migration, you might struggle more than you need to on your own. And the rapid evolution of Stream and the migration tool presents a unique challenge. But it’s also a unique opportunity to be on the ground floor of Microsoft’s plans for Stream. After all, Teams didn’t start strong, but look at it now. Teams has become Microsoft’s flagship suite for productivity and conferencing. It’s exciting to think about where Stream will be in a few years. Will it look like YouTube? Who knows, but Quisitive will be here for it.
Get help with your Microsoft Stream migration
If you find yourself or your team struggling to understand this migration process or you don’t have the time to invest, Quisitive can help so you can focus on the projects you planned for. Don’t hesitate to reach out to us. We understand that this migration effort was a surprise for a lot of IT teams, but we’ve got you covered.
Who should read this article?
If you’re a business owner, leader, or decision-maker looking to mature your business with Microsoft’s M365 cloud services and SharePoint Online, you might be thinking about analyzing how your employees use these tools to maximize their value and boost adoption and engagement. This article is for you.
What are analytics?
Web analytics about user behavior can be interpreted to drive adoption and engagement in a front-facing system like a public website or a company intranet like SharePoint Online. The data collected often includes information about search results, page views, unique viewers, click-through rate, most popular pages, peak usage hours, and much more.
Analytics create potential for a deeper understanding of employee adoption. Without knowing how your employees use SharePoint Online and M365, you may be missing opportunities for engagement and increased adoption of new tools. The best analytics solution for your organization depends on your business needs. Some analytics tools are more generic in nature and are applicable to a wider audience. Oftentimes, free options like Google Analytics are attractive because your budget is limited or the procurement process takes too long. But there is more to consider, including hidden costs that aren’t always immediately observed.
So, let’s review all options before discussing one of the more popular free options, Google Analytics.
What are my analytics options for SharePoint Online?
There are several options to collect analytics, but if you do a quick search you might find results like Google Analytics, tyGraph (AvePoint), CardioLog, Netwrix, Microsoft Adoption Content Pack, Syskit Point, and ShareGate Desktop.
The audience differs for each solution. For example, Syskit, Netwrix, and ShareGate are aimed at administrators and are usually used for auditing and governance. Others are more geared toward employee-focused groups like Marketing, Human Resources, or Communications for the purpose of employee engagement. tyGraph, for example, has focused its attention on a smoother experience of interpreting the data, making it more user friendly. Quisitive is a tyGraph (AvePoint) partner because it integrates with more than just SharePoint Online, and we support all aspects of the M365 suite including OneDrive, Teams, and the Viva Suite.
Of course, Microsoft offers their own analytics engine inside of the M365 admin center and a separate Power BI dashboard kit. Without question, both add tremendous value in the absence of a third-party tool. Microsoft has made great strides in the last few years with their analytics, but the scope of the data collected is limited. If you’re not in a position to fund the purchase of a third-party tool, the M365 Usage report is an excellent choice and requires little effort to understand and use.
Factors to consider when choosing an analytics tool may include:
- Initial cost and cost of implementation, maintenance, upgrades, and retirement
- Interoperability with enterprise systems and devices
- Security and privacy
- Skill set required to implement and use it
- Adaptability to new systems in the future without losing functionality
- Data source compatibility – from which systems can I pull data, even if not interoperable
Google Analytics in SharePoint Online
One of the most popular free alternatives is Google Analytics. Let’s review the pros and cons.
Pros of Google Analytics in SharePoint Online
- Compatible with SharePoint Online
- Low initial cost if total number of hits remains below 10 million per month
- You get to skip the procurement process
- User friendly (post-installation)
- Learning curve is reasonable, albeit a bit technical
- More granular than limited native M365 analytics
- Mobile devices are supported
- Most organizations use mobile devices for productivity so that base is covered
Cons of Google Analytics in SharePoint Online
- Google just released Google Analytics v4 and a migration/upgrade may be in your near and long-term future
- Setup process isn’t as straightforward as competitors
- Tagging pages for analytics was never simple and remains that way, even in the latest Google Analytics 4. You must be familiar with SharePoint Online administration to install.
- Personally identifiable information is prohibited. “[Google Analytics] prohibits sending personally identifiable information (PII) to Analytics (such as names, social security numbers, email addresses, or any similar data), or data that permanently identifies a particular device…”
- Support options are limited without a partner relationship
- Google is a large company and typically hands support tasks to its qualified Partners.
- Lack of M365 native interoperability
- Only page data is collected; document usage isn’t. There is relevant data to collect outside of SharePoint Online when encouraging employees to adopt multiple M365 applications, and Google does not offer documentation explaining which data isn’t collected
- You will be charged if total hits exceed 10 million in a month
- Anyone who has access to the Google Analytics account has access to all SharePoint analytics data
- There isn’t a way to keep the analytics data private and segregate roles. Other tools offer role-based security (viewers vs admins).
- Privacy isn’t a guarantee
- You’ll want to read the fine print before choosing Google Analytics since they often use customer information for targeted ads and predictive search results.
Which tool is best for my organization?
Now that you are aware of the pros and cons, let’s put it all into context.
One of the most common reactions to a paid solution is aversion due to the perception of excessive cost. But just because a tool is free doesn’t mean it won’t cost you anything; it may simply be the wrong tool for your business. Security is also a concern that might be ignored in favor of a free tool. What good is a free tool if it doesn’t truly meet your needs?
Your organization may in fact already be using Google Analytics and wondering whether to stay with it or move to a new solution. But the decision remains the same, because your business is going to change and your solutions should be agile enough to change with it. Is Google Analytics agile enough to keep up with M365? Google is focused on a wider audience, mostly public websites, and especially online retail. Metrics meant for retail and public websites simply can’t be used in SharePoint Online because the data points aren’t available.
With that in mind, it’s safe to say that the cost of an analytics solution is relative: not just compared to other solutions, free or paid, but relative to the potential consequences of using the wrong solution, which can cost you time and energy when you revisit this decision in the future as your business matures beyond the free solution’s limitations.
You might think ROI isn’t a factor when implementing Google Analytics because the solution is free. The reality is that if you choose an analytics solution that isn’t right for your business, you could find yourself collecting irrelevant data or, more likely, end up in an information deficit: missing the behavioral information that could help you motivate your employees to use the tools that organically boost efficiency and output and shift your company culture in a positive direction.
The financial cost of an incomplete understanding of how your employees work is difficult to quantify, but it’s easily observed in common day-to-day pain points and high turnover. Both could be addressed by leadership armed with the added context that behavioral data offers.
The ultimate goal of an analytics solution is to boost productivity but like any business goal, the underlying objective is financial. Increased productivity usually equates to greater cost savings and ideally leads to increased revenue, covering the cost of the solution that enabled your business to mature and succeed. In theory, a paid tool could pay for itself eventually.
If your goal is to mature your business by using an analytics solution for M365 and SharePoint Online, the question you might ask yourself when choosing a product is:
“Can we afford the cost associated with the risk of an information deficit?”
If you’ve exhausted your options for funding and have no alternative outside of the native M365 analytics, Google Analytics will suffice, especially with a tight timeline.
But if you have any wiggle room in your budget and timeline to obtain a paid solution that integrates with M365, you may end up covering the cost with the boost in productivity from increased adoption down the road. It’s a win-win scenario, if done right. You should still take your time to do your diligence and make a confident decision.
Whether your goal is to simply go paperless or to implement complex solutions for big ideas inside SharePoint Online and M365, Google Analytics is a fine starting point and may suffice long-term. However, if you’re expecting ongoing organizational change alongside adoption of new M365 tools, you may better serve your organization by choosing a solution that fully integrates with M365 applications, in anticipation of the future business needs.
Remember, you likely won’t stop adopting M365 at SharePoint Online and might eventually introduce Teams, OneDrive, and the Viva Suite to your employees. You may need a more comprehensive report of usage across them all to make sure your business is headed in the right direction.
Looking for additional assistance with SharePoint Online?
Quisitive offers a team of experts that can help you create a strategy for implementation, manage adoption, build your SharePoint instance, or optimize your current instance.
The adoption of Artificial Intelligence (AI) is rapidly increasing across industries, offering new opportunities for businesses to automate processes, improve decision-making, and enhance customer experiences. To leverage these benefits, organizations need to be AI ready. Let’s walk through the steps you need to take to prepare your organization for AI implementation, focusing on Azure Machine Learning, Azure Data Factory, and Azure DevOps as essential technologies.
Understanding AI Readiness
AI readiness is the degree to which an organization is prepared to integrate AI technologies into its business processes and operations. It involves a comprehensive assessment of the organization’s current capabilities, infrastructure, data, and workforce, followed by the development of a strategic roadmap to address identified gaps and opportunities.
Key Components of AI Readiness
- Strategy and Vision: Develop a clear understanding of how AI can support your business goals and define a strategic vision for AI adoption.
- Data Infrastructure: Ensure that your organization has access to the necessary data and infrastructure to support AI initiatives, including data storage, processing, and analytics capabilities.
- Talent and Skills: Assess your organization’s current AI talent and identify areas where additional guidance may be necessary to support AI initiatives.
- Governance and Ethics: Establish policies and guidelines to ensure the ethical use of AI and to address potential risks and challenges associated with AI implementation.
Leveraging Azure Technologies for AI Readiness
Building the Model
Azure Machine Learning is a cloud-based service that enables organizations to build, train, and deploy machine learning models at scale. By using Azure Machine Learning, organizations can:
- Access a wide range of pre-built AI models and algorithms.
- Develop custom models using popular machine learning frameworks and libraries.
- Automate the entire machine learning lifecycle, from data preparation to model deployment.
- Monitor and manage AI models in production, ensuring optimal performance and ongoing improvement.
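To make the lifecycle concrete, here is a minimal sketch in plain Python (not the Azure Machine Learning SDK) of the prepare, train, and evaluate steps the service automates at scale. The data and the trivial mean-predictor model are illustrative assumptions, not a real workload.

```python
# Conceptual sketch of the ML lifecycle Azure Machine Learning automates:
# prepare data -> train -> evaluate. Plain Python, no Azure SDK required.

def prepare(raw):
    """Drop records with missing values before training."""
    return [(x, y) for x, y in raw if x is not None and y is not None]

def train(data):
    """Fit a trivial model: always predict the mean of the training labels."""
    mean = sum(y for _, y in data) / len(data)
    return lambda x: mean

def evaluate(model, data):
    """Mean absolute error on held-out data."""
    return sum(abs(model(x) - y) for x, y in data) / len(data)

raw = [(1, 2.0), (2, 2.5), (None, 3.0), (3, 3.5)]
data = prepare(raw)              # the bad record is dropped
model = train(data[:2])          # "train" on the first rows
mae = evaluate(model, data[2:])  # "evaluate" on the rest
print(round(mae, 2))             # prints 1.25
```

In Azure Machine Learning, each of these steps would typically be a tracked job in a pipeline, with the resulting model registered and deployed to an endpoint.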
Data is Key
Azure Data Factory is a cloud-based data integration service that allows organizations to ingest, prepare, and transform data from various sources into a format suitable for AI and machine learning applications. Key features of Azure Data Factory include:
- Support for a wide range of data sources, including on-premises, cloud, and hybrid environments.
- Robust data transformation capabilities, including data cleansing, aggregation, and enrichment.
- Seamless integration with other Azure services, such as Azure Machine Learning, Azure Data Lake, Azure SQL and more.
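As an illustration of the cleanse, aggregate, and enrich steps such a pipeline typically performs, here is a hedged sketch in plain Python. The row shape, the region names, and the lookup table are hypothetical.

```python
# Sketch of the cleanse -> aggregate -> enrich pattern a data integration
# pipeline implements, expressed in plain Python for illustration only.

rows = [
    {"region": "east", "sales": "100"},
    {"region": "east", "sales": "50"},
    {"region": "west", "sales": None},   # bad record to be dropped
    {"region": "west", "sales": "200"},
]

# Cleanse: drop rows with missing values and normalize types.
clean = [{"region": r["region"], "sales": int(r["sales"])}
         for r in rows if r["sales"] is not None]

# Aggregate: total sales per region.
totals = {}
for r in clean:
    totals[r["region"]] = totals.get(r["region"], 0) + r["sales"]

# Enrich: join in reference data (a hypothetical region-name lookup).
region_names = {"east": "East Coast", "west": "West Coast"}
enriched = {region_names[k]: v for k, v in totals.items()}
print(enriched)  # prints {'East Coast': 150, 'West Coast': 200}
```

In Azure Data Factory these stages would be expressed as pipeline activities and mapping data flows rather than hand-written code.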
Complete the Cycle with MLOps
Azure DevOps is a suite of tools and services designed to support the entire application development lifecycle, from planning and coding to deployment and monitoring. By integrating AI and machine learning projects with Azure DevOps, organizations can:
- Streamline the development and deployment of AI models and applications.
- Ensure consistent and reliable AI model performance by implementing continuous integration and continuous delivery (CI/CD) pipelines.
- Monitor and manage AI models and applications in production, addressing issues and opportunities as they arise.
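One common piece of a CI/CD pipeline for models is a promotion gate that only deploys a candidate when it beats the production baseline by a minimum margin. A minimal sketch, with illustrative metric names and thresholds:

```python
# Sketch of a CI/CD quality gate for model promotion. The accuracy metric
# and the 1-point minimum gain are illustrative assumptions; a real
# pipeline would pull these values from its model registry and test runs.

def promote(candidate_accuracy, baseline_accuracy, min_gain=0.01):
    """Return True if the candidate should replace the production model."""
    return candidate_accuracy >= baseline_accuracy + min_gain

print(promote(0.87, 0.85))   # clears the margin -> promote
print(promote(0.855, 0.85))  # within noise -> keep the baseline
```

In Azure DevOps, a check like this would run as a pipeline stage after training, gating the deployment step of the release.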
Building AI-Ready Organizations
To fully leverage the benefits of AI, organizations need to develop robust data and analytics capabilities and adopt a data-driven mindset. This involves:
- Data Strategy: Develop a comprehensive data strategy that outlines the organization’s goals and objectives related to data management, analytics, and AI.
- Data Governance: Implement data governance policies and procedures to ensure data quality, security, and compliance.
- Data Integration: Integrate data from various sources, both internal and external, to create a unified view of the organization’s data assets.
- Data Analytics: Leverage advanced analytics and AI technologies to extract valuable insights from the organization’s data, driving better decision-making and improved business outcomes.
AI readiness is crucial for organizations to harness the full potential of artificial intelligence and gain a competitive edge in today’s rapidly evolving business landscape. By developing a clear AI strategy and vision, investing in the right data infrastructure, building the necessary talent and skills, and implementing strong governance and ethics policies, organizations can prepare themselves for a successful AI-driven future.
Leveraging Azure technologies like Azure Machine Learning, Azure Data Factory, and Azure DevOps can significantly streamline the AI readiness process, enabling organizations to build, deploy, and manage AI solutions more effectively. By focusing on data and analytics and adopting a data-driven mindset, organizations can further enhance their ability to capitalize on the opportunities presented by AI.
If your organization is embarking on its AI journey and seeking expert guidance to help you become AI-ready, feel free to contact us. Our Data and AI team will be more than happy to assist you in developing a customized roadmap for AI adoption tailored to your unique business needs and objectives.
The Healthcare Financial Management Association (HFMA) Conference is an annual event that brings together healthcare finance professionals, industry leaders, and innovators to discuss the latest trends and advancements in healthcare finance.
The Quisitive team was honored to sponsor this year’s event, held recently in Nashville at the beautiful Gaylord Opryland Resort. It was an extraordinary gathering of minds, offering insights into the future of healthcare finance and the transformative potential it holds for the industry. In this blog post, our very own Suresh Krishnan shares his highlights and takeaways from the HFMA 2023 conference, along with the key trends that every finance leader in healthcare should keep in mind to improve financial results and thrive in 2023 and beyond.
All Eyes are On Digital Transformation and Advanced Technologies in Healthcare Finance
One of the overarching themes of HFMA 2023 was the significance of digital transformation in healthcare finance. The conference shed light on the increasing adoption of advanced technologies such as artificial intelligence (AI), machine learning (ML), robotic process automation (RPA), and blockchain in finance operations. Session speakers emphasized the potential these technologies have for streamlining processes, enhancing data analytics, improving revenue cycle management, and driving cost efficiencies.
Data-Driven Decision Making Will Set You Apart
Data analytics emerged as another critical focus area during the HFMA Conference. During the event sessions, many experts discussed leveraging analytics to drive insights into financial performance, risk management, population health, and patient outcomes. With the exponential growth of healthcare data, there is a massive need to harness its full potential responsibly for strategic decision-making. Data and reporting will remain hot topics for finance professionals in healthcare as advanced analytics tools and techniques enable leaders to make data-driven decisions, optimize revenue cycles, identify trends, and improve overall financial performance. Finance leaders who invest in the tools and infrastructure needed to integrate and report on data will set their organizations apart.
Value-Based Care & Payment Reform is Here
The conference provided a platform to explore the ongoing transition from fee-for-service to value-based care models. Experts highlighted the importance of aligning financial incentives with quality and outcomes, emphasizing the need for collaboration between healthcare providers, payers, and patients. Sessions focused on strategies for managing financial risk in value-based contracts, improving care coordination, and leveraging technology to support payment reform initiatives.
Revenue Cycle Management Remains a Pain Point
I was surprised at the number of vendors showcasing revenue cycle management (RCM) solutions, highlighting the ongoing struggle in this area. The conference showcased innovative RCM strategies and technologies aimed at improving revenue integrity, reducing denials, and enhancing patient financial experience. We were there showcasing our MazikCare platform and Payer Matrix RCM Collections Workflow Module. Discussions covered topics such as automation of RCM processes, predictive analytics for claims management, patient engagement tools, and revenue cycle optimization best practices.
Social Determinants of Health Can Bridge Gaps
I listened to Dr. Thomas Fisher from the University of Chicago Medicine talking passionately about health equity in inner-city communities. His stories from the ER during the COVID crisis were very moving. During his session, Dr. Fisher emphasized the need for healthcare technology to address the underlying social determinants of health (SDOH) to achieve true health equity in underserved inner-city communities. Dr. Fisher’s accounts from the front lines of the COVID-19 crisis shed light on the profound impact of systemic disparities on individuals’ health outcomes. As he spoke, it became clear that digital solutions that integrate SDOH considerations are essential for bridging the gap and delivering comprehensive care to vulnerable populations. By incorporating data on factors such as housing, education, and access to resources, healthcare technologies can empower providers to offer personalized interventions and support, ultimately improving the health outcomes and quality of life for individuals in these communities.
The HFMA 2023 Conference was an enlightening gathering that showcased the transformative potential of healthcare finance in driving positive change in the industry. Sessions offered valuable insights into the digital transformation of finance operations, the power of data analytics, the shift towards value-based care, regulatory compliance, and innovative revenue cycle management strategies. The conference undoubtedly left participants inspired and equipped with new knowledge and perspectives to shape the future of healthcare finance. As we reflect on HFMA 2023, it is evident that healthcare finance professionals are at the forefront of driving change, innovation, and financial sustainability in the industry. By embracing technological advancements, leveraging data analytics, and adapting to evolving payment models, healthcare finance leaders can position their organizations for success in an increasingly complex and dynamic healthcare landscape.
About the Author
Suresh Krishnan, Senior Director, MazikCare – Quisitive
Suresh is a healthcare IT Leader with CHCIO certification and over 25 years of experience in application development, infrastructure management and cybersecurity. In 2016, Suresh was recognized as a Top100 CIO in Hospitals and Health Systems. Suresh works with healthcare leaders to expand their use of modern technologies to improve end-to-end care delivery and operations.
As new and more powerful Artificial Intelligence (AI) technologies are being developed, AI products are becoming increasingly integrated into our lives. From the algorithms that power search engines to facial recognition software, AI is already having a major impact on society, and it has the potential to revolutionize many aspects of our lives. However, with great power comes great responsibility. As AI continues to evolve, it is important that we ensure it is developed and used responsibly.
Why do we need responsible AI?
Historically, the advancement of technologies, even those developed with the best intentions, can have unintended and even harmful consequences. Developing responsible AI can help anticipate and prevent the potential issues caused by AI. There are many reasons why developing responsible AI software is important; here are a few of the most important:
Avoiding bias and discrimination
Most AI software relies on machine learning (ML) models to learn how to respond to inputs. The ML models are trained using data (known as training data). If the training data is not selected carefully, it can contain biases that are then introduced into the AI software. Selecting the training data is an important but difficult task. ML models usually learn from existing real-world data, due to the large amount of data needed to properly train them. This means that even accurate models can learn, and sometimes amplify, pre-existing biases in the data based on race, gender, religion, and other attributes.
One well known example of how unintended harmful biases can be introduced in AI is the COMPAS system, which was used by criminal justice agencies to estimate future reoffending rates of individuals. This model used an AI algorithm that determined the risk of an inmate to reoffend, based on criminal history and demographic information. The intention of this program was to create a fair system that was not influenced by the unconscious biases a person may have.
However, a study published in Science Advances in 2018 showed that the model was biased and discriminated against Black individuals. Moreover, the overall accuracy of the model was around 65%, which is comparable to the combined average results of untrained people. The study reports:
“Black defendants who did not recidivate were incorrectly predicted to reoffend at a rate of 44.9%, nearly twice as high as their white counterparts at 23.5%; and white defendants who did recidivate were incorrectly predicted to not reoffend at a rate of 47.7%, nearly twice as high as their black counterparts at 28.0%.”
It was ultimately identified that the source of bias was in the data used to build the AI algorithm. In the training data set, Black people were more likely to be incarcerated than white people. By overlooking this fact, the AI was not designed responsibly, and the model learned a racial bias as a result. Since the COMPAS algorithm might be used by parole boards and other judicial bodies, there is great potential for harm, which highlights the importance of responsible AI design.
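The disparity described above can be measured directly. Here is a minimal sketch in plain Python that compares false positive rates (predicted to reoffend but did not) across groups; the records below are toy data, not the actual COMPAS data set.

```python
# Minimal fairness check: compute the false positive rate per group.
# A higher rate for one group means that group is more often wrongly
# flagged. The records here are toy data for illustration only.

def false_positive_rate(records):
    """Share of actual negatives that were incorrectly predicted positive."""
    negatives = [r for r in records if not r["actual"]]
    false_positives = [r for r in negatives if r["predicted"]]
    return len(false_positives) / len(negatives)

records = [
    {"group": "a", "predicted": True,  "actual": False},
    {"group": "a", "predicted": False, "actual": False},
    {"group": "b", "predicted": False, "actual": False},
    {"group": "b", "predicted": False, "actual": False},
]

for group in ("a", "b"):
    subset = [r for r in records if r["group"] == group]
    print(group, false_positive_rate(subset))  # a 0.5, then b 0.0
```

Running this kind of per-group comparison before deployment is one concrete way to catch a learned bias like the one the COMPAS studies reported.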
Promoting fairness and equity
Another troubling example lies in facial recognition software. Through various studies, researchers have demonstrated that some widely used AI-based facial recognition algorithms have trouble recognizing subjects who are female, Black, and between 18 and 30 years old. This can lead to degraded experiences for users in those categories and even biases against them. The problem seems to be that the training data is not representative of all users of the facial recognition algorithms, reflecting possible oversights during data collection, inadequate sampling, poor design, or budgetary limitations. Regardless of the cause, the potential for harm is real.
Responsible AI practices can help ensure that AI systems are designed to be fair and to promote equity among their users. The idea behind fairness is that AI systems should be beneficial to everyone, not just a select few, and they should treat everyone fairly and impartially. However, fairness is far from a solved problem, as shown by the previously mentioned examples.
For fair responsible AI design we should make sure the training data is sampled in a way that is representative of users. For example, if the model will be used for people of all ages, but you only have training data for young adults, that will likely not be a fair model. To train a fair model, developers should make sure to include data for people of all ages in the training data set.
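One practical way to do this is stratified sampling: drawing from each group in proportion to its share of the intended user population rather than its share of the raw data. A sketch, with illustrative group names and proportions:

```python
# Sketch of stratified sampling so training data mirrors the intended
# user population. The age groups and target shares are illustrative
# assumptions, not recommendations.
import random

population = (
    [{"age_group": "18-30"}] * 70 +  # over-represented in the raw data
    [{"age_group": "31-60"}] * 20 +
    [{"age_group": "60+"}]   * 10
)
target_share = {"18-30": 0.4, "31-60": 0.4, "60+": 0.2}
sample_size = 20

random.seed(0)
sample = []
for group, share in target_share.items():
    pool = [p for p in population if p["age_group"] == group]
    sample += random.sample(pool, int(sample_size * share))

counts = {g: sum(1 for p in sample if p["age_group"] == g)
          for g in target_share}
print(counts)  # prints {'18-30': 8, '31-60': 8, '60+': 4}
```

The raw data is 70% young adults, but the drawn sample matches the target shares, which is exactly the correction the fairness argument above calls for.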
Protecting privacy and security
AI systems collect and process a lot of personal data, like images of faces, age, demographics, etc. That data is used to train ML models to provide a service to users. However, the use of personal user data raises privacy and security concerns. If this data is not properly protected, it could be used for malicious purposes, like identity theft.
Moreover, AI systems trained on personal data can be used to track people’s movements, monitor their online activity, and even predict their behavior. For example, facial recognition algorithms are currently being used in China to track millions of citizens’ daily lives, which is already raising privacy concerns. Building responsible AI can help protect the privacy and security of its users. To develop responsible AI, we must balance security, convenience, and privacy.
What can we do about it?
Alongside the fast advancements in AI in the last few years, there has also been a lot of research conducted about how to solve and prevent the problems of AI. The huge potential benefits of AI are clear, but researchers are looking for a way to balance its benefits with the potential to cause harm. Many of the companies leading the development of AI technologies, like Microsoft and Google, are also invested in research to guarantee the AI systems they develop are fair, safe and protect the user’s privacy. In other words, they are invested in the development of responsible AI.
In order to guide organizations in developing responsible AI, Microsoft developed six guiding principles:
- Fairness: AI systems should treat all people fairly.
- Inclusiveness: AI systems should empower everyone and engage people.
- Reliability and Safety: AI systems should perform reliably and safely.
- Transparency: AI systems should be understandable.
- Privacy and Security: AI systems should be secure and respect privacy.
- Accountability: People should be accountable for AI systems.
You can find more information on Microsoft’s principles for responsible AI at https://www.microsoft.com/en-us/ai/responsible-ai.
Ready to talk to an expert about how to establish responsible AI practices in your organization?
In today’s digital landscape, where cyber threats loom larger than ever, cyberattack prevention is critical. Safeguarding your business from potential data breaches, cyberattacks, and security incidents is paramount. As the guardians of your company’s technological infrastructure, you hold the responsibility of protecting sensitive information, ensuring operational continuity, and maintaining customer trust. That’s where Managed Detection and Response (MDR) services step in as your ultimate game-changer.
In this blog post, we’ll explore the undeniable benefits that MDR services bring to your organization, empowering you to make informed decisions that can transform your cybersecurity posture, aid in cyberattack prevention, and ultimately enhance your business outcomes.
1. Proactive Threat Detection and Rapid Response:
Cybersecurity is constantly evolving. MDR utilizes a mixture of automation and analysts to implement an around-the-clock proactive approach to cyberattack prevention. This is accomplished through monitoring your network, endpoints, and cloud infrastructure for any signs of suspicious activity or potential threats.
An example of this proactive approach to security is an immediate notification when a document containing sensitive information is shared, even if that file is stale. Once the automated system detects this behavior, it sends an alert to an analyst, who resolves the matter before any harm occurs. By employing these advanced technologies and leveraging threat intelligence, MDR teams can quickly identify and mitigate emerging threats, ensuring that any security incidents are swiftly contained and neutralized before they cause extensive damage.
2. Around-the-Clock Security Operations:
When clients asked if we could monitor them around the clock, the answer was always “No”, until now. We now offer US-based security operations centers that are fully staffed 24/7, every day, including holidays.
This approach allows us to always have an expert analyst in the chair promptly responding to alerts and threats around the clock. Your information assets are continually monitored for any sign of bad behavior. This means you can rest easy, knowing that there’s always a team of experts diligently watching over your systems and responding promptly to any security events.
3. Access to Cutting-Edge Technologies and Expertise:
Managing cybersecurity internally can be a daunting task, requiring significant investments in infrastructure, tools, and talent. This is what makes our partnership with Critical Start so powerful. Critical Start has been a recognized leader in advanced security operations since 2015 and is an integral Microsoft partner, allowing these security protocols to work seamlessly with your existing software infrastructure.
MDR implements, optimizes, and helps customers get more out of Microsoft investments like Microsoft 365 E5, Azure Purview, Azure Sentinel, and Microsoft Security Center, essentially adding a module of capability to the existing programs. This synergy of leading software companies and cutting-edge platforms enables advanced threat detection capabilities, threat hunting techniques, incident response best practices, and cyberattack prevention, all without the burden of building and maintaining an in-house security operation.
4. Improved Incident Response and Remediation:
When a security incident occurs, time is of the essence. That’s why we have a service-level agreement of one-hour time-to-detection and one-hour resolution. This means that regardless of when it is received, every alert gets an expert’s attention within one hour, and your digital assets always receive a rapid response to incidents.
On top of this, our clients have full visibility into every alert and activity, providing a comprehensive view of your company’s security threats. You will always stay in the loop about your company’s security, and we will always respond to any cyber threats without delay. These protocols minimize the impact on your business operations and reduce downtime.
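Conceptually, checking alerts against a one-hour SLA is a simple timestamp comparison. A hedged sketch in plain Python, with hypothetical alert timestamps and field names:

```python
# Sketch of checking alert logs against a one-hour time-to-detection SLA.
# The alert records and field names here are illustrative assumptions.
from datetime import datetime, timedelta

SLA = timedelta(hours=1)

alerts = [
    {"raised": datetime(2023, 7, 1, 2, 0),
     "reviewed": datetime(2023, 7, 1, 2, 20)},   # 20 minutes, within SLA
    {"raised": datetime(2023, 7, 1, 9, 0),
     "reviewed": datetime(2023, 7, 1, 10, 30)},  # 90 minutes, a breach
]

breaches = [a for a in alerts if a["reviewed"] - a["raised"] > SLA]
print(len(breaches))  # prints 1
```

A real MDR dashboard would run this kind of check continuously across all alert sources, which is what makes the SLA auditable rather than aspirational.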
5. Enhanced Compliance and Regulatory Adherence:
In an era of increasingly stringent data protection regulations, compliance is no longer a choice; it’s a necessity. MDR services can play a pivotal role in helping your organization achieve and maintain compliance with industry-specific regulations such as GDPR, HIPAA, PCI DSS, and more. By aligning their processes with regulatory requirements and offering valuable insights and documentation, MDR providers can assist you in demonstrating your commitment to data security and regulatory adherence.
6. Risk Reduction and Business Continuity:
A successful cyberattack can lead to severe financial losses, reputational damage, and operational disruptions. MDR services offer a proactive defense strategy that significantly reduces your risk profile and aids in cyberattack prevention. By quickly identifying vulnerabilities, implementing preventive measures, and fortifying your security defenses, MDR providers enable you to safeguard your business continuity, protect your critical assets, and ensure uninterrupted service delivery to your customers.
Protect Your Business with Spyglass MDR
With Spyglass MDR, we implement, fix, improve, and offer 24/7 monitoring: a total solution. We can have a client receiving full 24x7x365 monitoring within 7-14 days. Embracing MDR services empowers your business to stay one step ahead of malicious actors, ensuring that your digital infrastructure remains secure, your operations run smoothly, and your customers trust you with their sensitive information. So, make the strategic decision today and unlock the power of MDR services to elevate your cybersecurity posture and achieve greater business success.
Remember, cybersecurity and cyberattack prevention is not just an IT concern; it’s a fundamental business imperative.
Stay secure and vigilant by contacting our security experts today about Spyglass-MDR.
You’ve built your IT team carefully, adding experienced and effective personnel over a long period. Your team knows your systems inside and out. They might even have built them from the ground up. They are experts who provide significant value to your organization.
But you’ve been hearing that the benefits of managed services could add value and you might be wondering: Why would you lose that huge competitive advantage that your local team represents, and hand the reins of your technology to someone else?
It can be difficult at first to see the value in a managed services approach to your technology infrastructure. But this model can provide your organization with considerable advantages without taking away from the great work your in-house team does.
The Benefits of Managed Services
1. Infrastructure Expertise
A managed services provider specializes in running infrastructure and staying abreast of technology trends. An in-house team can’t be expected to keep up with rapidly evolving technology, much less know when new components become available that can give your organization new advantages. Having a managed services partner with expertise in infrastructure is the best (and cheapest) way to make this expertise available for your organization.
2. Premier Support
Premier-level support for your Microsoft infrastructure is costly, and beyond the capacity of smaller organizations, but it’s essential to running your infrastructure effectively. With premier support, if your service provider can’t solve your problem, they can go straight to Microsoft to get in-depth root-cause analysis from their engineering QA teams. Bringing on a managed services partner that leverages premier-level support is the most cost-effective way to get this level of service.
3. Cost Savings
Maybe you can do it all in-house yourself, but what is it costing you in time and resources? Can you afford the staff that you’ll need for an entire IT team, or would it be better to hire a service to do it all for you? From a cost point of view, it’s far better to let your IT team focus on their core competencies and your strategic initiatives, like support for your enterprise systems and integrations, and let a managed services partner provide the best practices and expertise specifically for your infrastructure to keep it running optimally and securely.
4. 24/7 Support
IT issues don’t follow a 9-to-5 schedule, and your support can’t either. What would it cost to build an in-house team to work on a 24/7 basis? A better solution is to engage a managed services provider that has a blended team, pulling talent from a worldwide pool. They can provide round-the-clock coverage, ensuring that you are taken care of at any time, on any day.
5. Gap Filling
No matter how great your IT team is, it’s impossible for them to cover everything. Perhaps they lack experience in a certain area. Maybe a key member of the team will need time off, so you’ll need temporary support until they’re back. A managed services partner can fulfill those short-term needs to make your team even more efficient and effective.
The managed services approach won’t solve all of your IT problems. But understanding the benefits of managed services, the model and the potential advantages will help you create an accurate picture of your current and future IT needs and evaluate how to best meet them in the context of your larger organizational strategy.
Quisitive offers a cloud managed services program where you gain access to our expertise in Managed IT Services that spans a wide range of technologies.
We can help sustain your applications and technologies with routine support and environment management as well as advise you on strategic improvement programs and ongoing coaching.
Unlock the benefits of managed services with Quisitive.
Explore our different Managed Services offers or, for ad-hoc needs, see our Flex Services.
How to Apply Multi-Select Filters in Power Apps
Power Apps with Combo Box
In this step-by-step tutorial video, we will learn how to build multi-select filters in Power Apps. We will apply multiple filters to a Power Apps gallery, including multiple item selections using a combo box, multi-select checkboxes, and multi-select buttons.
We will build these gallery filters with delegation in mind (no delegation warnings) and work with multiple data sources, such as Dataverse and SharePoint.
I will show how to filter a multi-select choice column based on a multi-select combo box control.
The trick is to work around the non-delegable In operator and use equals (=), which is delegable. I will also cover a newer Power Apps function called Index.
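To illustrate the idea, here is a hedged Power Fx sketch of the workaround; the list name Projects, the choice column Status, and the combo box cmbStatus are hypothetical:

```
// Hypothetical SharePoint list 'Projects' with a choice column 'Status'
// and a combo box 'cmbStatus'. The In operator against SelectedItems is
// not delegable to SharePoint, so loop over the selections and filter
// with the delegable '=' instead, collecting the results:
ClearCollect(
    colFiltered,
    ForAll(
        cmbStatus.SelectedItems,
        Filter(Projects, Status.Value = ThisRecord.Value)
    )
);
// Bind the gallery's Items property to colFiltered (flatten with
// Ungroup if the result comes back as a table of tables).
```

Because each Filter call uses only the delegable = operator, the query runs server-side and avoids the delegation warning the In operator would trigger.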
This video covers the following:
- Multiple Item Selection based Filters
- Combo box control to filter Gallery for SharePoint List & Dataverse
- Multi select checkbox-based filtering
- Multi select button-based Gallery Filter
- Multi Select Filter on Multi Select Choice Column
- Reset filters
- Delegation Workaround
- Index function