
Cloud adoption is on the rise: businesses today face market and supply chain disruptions unlike any they have faced in the past, and they are turning to the cloud for the scale, flexibility, and security they need to keep up.
Quisitive will…
We meet you where you are on your cloud journey. Even on-premises workloads can benefit by extending capabilities using Azure services.
With the Azure Hybrid Benefit, you can apply existing Windows Server or SQL Server licenses to save on Azure virtual machines and Azure SQL.
Contact us to learn more about cost savings opportunities when you migrate Windows and SQL Servers to Azure.
Your ideal cloud infrastructure should earn your trust with resilience, scalability, and cost efficiency. Migrate SQL Server to Azure SQL or Windows Server to Azure virtual machines to realize the benefits of the cloud.
Achieve up to a 478% three-year ROI when you migrate Windows Server and SQL Server workloads to Azure.
Accelerate the modernization of your infrastructure and applications with the power of cloud computing on Azure’s scalable and resilient platform.
Use a combination of cloud and on-premises services where needed to maximize agility and value.
Take advantage of multi-layered security across physical data centers, infrastructure, and operations, as well as Disaster Recovery.
About Quisitive
Quisitive is a premier, global Microsoft Partner that harnesses the Microsoft cloud platform and complementary technologies, including custom solutions and first-party offerings, to generate transformational impact for enterprise customers. Quisitive has consistently been recognized as a leading Microsoft Partner with 16 Specializations and all 6 Solution Partner Designations. Quisitive’s Microsoft awards include the 2023 US Partner of the Year Winner for Health and Life Sciences, 2023 US Partner of the Year Winner for Solutions Assessment, and 2023 US Partner of the Year Finalist for the Industrial and Manufacturing vertical.


In this case study:
Client: Chesterfield County
Industry: Public Sector
Products and Services: Azure, Power BI, SQL Server
Country: USA

About Chesterfield
Chesterfield County, located in the state of Virginia, is recognized as the “17th Best Place to Live in America” and “One of the 100 Best Communities in America for Young People.” The county also has the lowest crime rate in the Richmond metropolitan region, making it an attractive place to live.
For Chesterfield County residents, businesses, and visitors, there is an evolving need for access to public schools, parks & recreation sites, and public libraries as the county continues to grow.
Challenge
Chesterfield County was growing fast, nine percent since the last census, and its demographics were shifting; stakeholders needed to ensure the “model of excellence in public education” stayed intact. Growth of this kind presents challenges for school planning, ranging from enrollment at particular facilities to where new facilities should be built. With median costs for elementary, middle, and high schools at $35M, $50M, and $75M respectively, finding the right locations and drawing the right attendance zones is important.
Chesterfield County’s primary goals were to be more transparent with its taxpayers, to demonstrate why decisions were made when building new schools, and to better plan where to build schools and how big they should be. After creating a one-year forecasting model, Chesterfield found that its accuracy was only 90%. Seeking to improve on that, Chesterfield County planned to develop a five-year prediction model that would forecast enrollment for schools in the county and prove that the school investments were a necessity. Additionally, the model needed to uncover insights for data-driven decisions in the development of not only schools but other capital facilities, ensuring fiscal responsibility to the taxpayers.
Solution
After a Data & Analytics pilot and the discovery that the model had to look beyond one year and include other sources of data to be accurate, Quisitive helped Chesterfield County create a more accurate five-year forecasting model to support data-driven decisions in school development, such as building new schools or shifting school boundaries. Quisitive validated and consolidated data from 60 schools over a 15-year timeframe, along with demographic data and 20 years of housing data from additional sources, into a data warehouse for reporting and analysis. Using machine learning and Power BI, data sets were mapped, and model results were written to a SQL Server database to visually report forecasted school enrollment in different areas of the county.
Quisitive also used the Azure ML suite to build and validate production-ready time series forecasts. The data-driven visual dashboards that were developed helped Chesterfield County find insights to plan for future school developments, and helped specific school districts plan for needs identified by the school enrollment forecast.
Results
The model predicts over a five-year forecast horizon and shows geographical student growth decoupled from school district boundaries, making it more accurate than the legacy one-year model. Quisitive worked with the County’s data engineering team to build a new dataset based on housing parcels – a “parcel level” approach. This enabled Quisitive to build a new model that could predict student growth at various levels of geographic aggregation, such as neighborhood and census tract. The approach was successful and was integrated into the County’s planning process.
Quisitive also partnered with the County’s IT analysts and transferred knowledge of data science skills so they could apply the same approach to forecasting the need for building new fire stations, or other infrastructure needs throughout the County.
The engagement increased Chesterfield County’s overall data maturity and its transparency to stakeholders. With a successful student demand forecasting model in place, the forward-thinking county has begun modernizing its analytics infrastructure in other ways.
The elasticity of the cloud allows state and local government agencies to consolidate their data from many sources into an environment where they can take full advantage of high-grade security features, data at scale, advanced analytics, and machine learning capabilities. These combined capabilities enable public sector analysts or data scientists to build better models, the GIS team to add high-quality layers of geospatial data, and the users to consume the results in easy-to-digest dashboards and reports.
Transformative Impact
Built a secure ML Platform
Leveraged Azure Machine Learning Studio and Azure Data Factory to securely connect to Azure SQL databases and Azure Data Lakes and enable streamlined model deployment.
Improved Accuracy
Improved the accuracy of long-range school enrollment forecasts by using the County’s own school, demographic, real estate, and planning data, relative to existing statistical models built on state-level population data.
Upskilled Analysts
By working alongside Quisitive experts, the County’s data analysts gained valuable data science skills to start building their own use cases and using the model to make data-driven decisions when evaluating public facilities.
Award Winner
Chesterfield County was recognized with the 2022 National Association of Counties Achievement Award for this engagement, a project titled Forecasts for Communities: Data-Driven Population Projections. “The cross-functional team included Deputy County Administrators; Budget, Parks, and Library Directors and subject matter experts; technology leaders and staff; and data engineers and data scientists from Catapult Systems [now a part of Quisitive], a Microsoft Gold Partner.”
It’s been many years since I read that SQL databases should use an NTFS volume formatted with a 64KB file allocation unit size (block size). So long that I didn’t remember why, or whether it is still considered good practice. It appears that it is, according to Microsoft and the foremost authority on SQL with ConfigMgr.
Microsoft recommends this for User Databases and the TempDB: https://docs.microsoft.com/en-gb/azure/virtual-machines/virtual-machines-windows-sql-performance?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json
Steve Thompson [MVP]: https://stevethompsonmvp.wordpress.com/2014/07/25/sql-server-ntfs-cluster-size
Steve explains, “The reason that SQL Server prefers 64KB NTFS cluster size, this happens to correspond to the way SQL Server allocates storage. One page = 8K, and SQL Server allocates an Extent, which is 8 pages in size. 8 pages x 8KB = 64KB/extent.”
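If a data volume needs to be formatted to match that recommendation, the allocation unit size can be set at format time. Here is a minimal sketch, assuming a new, empty volume at drive letter E: on Windows Server 2012 or later; the drive letter and label are assumptions, and formatting erases everything on the volume.

# Format an empty data volume with a 64KB (65536-byte) allocation unit size.
# WARNING: this destroys all data on the volume; adjust the drive letter and
# label for your environment before running.
Format-Volume -DriveLetter E -FileSystem NTFS -AllocationUnitSize 65536 -NewFileSystemLabel 'SQLData'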
To check the Block Size per drive/volume/partition…
From PowerShell, execute
Get-WmiObject -Namespace 'root\CIMv2' -Class Win32_Volume -Filter "DriveType = 3 and Label <> 'Recovery'" | Select Name, Label, BlockSize

OR from a Command Prompt
fsutil fsinfo ntfsinfo s: | findstr "Bytes per"

OR create a small file on the drive(s) and check the file properties. This can easily be done by…
- open Notepad, hold any key for about 30 seconds, then save the file
- open a Command Prompt and type FOR /L %I IN (1,1,200) DO @echo %I>> %temp%\test.file (or similar)
Using Windows Explorer, right-click the test file and notice the Size and Size on Disk info.
From testing, the file had to be more than 500 bytes on a 4K block size volume to register any size on the disk. On the 64K block size disk it took about 800 bytes to register. Your mileage may vary though.

In addition, if the SQL PowerShell module is installed, or there is some other method to query the existing SQL server, PowerShell can check the block size on all drives where SQL files exist, or will exist if the default file locations are used.
Import-Module SQLPS

# Drives currently hosting SQL data or log files
$SQLDrivesInUse = @(Invoke-Sqlcmd -Query 'SELECT DISTINCT left(physical_name,3) [Drive] FROM master.sys.master_files')
Write-Output 'SQL Drives In Use'
$SQLDrivesInUse | Format-Table -AutoSize

# Default data and log drives for the instance
$SQLDriveDefaults = @(Invoke-Sqlcmd -Query "SELECT Left(Convert(varchar(255), SERVERPROPERTY('instancedefaultdatapath')),3) [Drive] UNION SELECT Left(Convert(varchar(255), SERVERPROPERTY('instancedefaultlogpath')),3)")
Write-Output 'SQL Drive Defaults'
$SQLDriveDefaults | Format-Table -AutoSize

# Local fixed disks; Select-Object adds empty ExpectedBlockSize and IsOK
# properties that are filled in below
$DiskDrives = @(Get-WmiObject -Namespace 'root\CIMv2' -Class Win32_Volume -Filter "DriveType = 3 and Label <> 'Recovery'" |
    Select-Object Name, Label, BlockSize, ExpectedBlockSize, IsOK)

ForEach ($DiskDrive in $DiskDrives) {
    # SQL drives should use a 64KB block size (8 pages x 8KB = 64KB per extent)
    $SQLDrivesInUse.Drive | ForEach-Object {
        If ($_ -eq $DiskDrive.Name) { $DiskDrive.ExpectedBlockSize = 8*8*1024 }
    }
    $SQLDriveDefaults.Drive | ForEach-Object {
        If ($_ -eq $DiskDrive.Name) { $DiskDrive.ExpectedBlockSize = 8*8*1024 }
    }
    # All other drives keep the NTFS default of 4KB
    If ($null -eq $DiskDrive.ExpectedBlockSize) {
        $DiskDrive.ExpectedBlockSize = 4*1024
    }
    $DiskDrive.IsOK = ($DiskDrive.BlockSize -eq $DiskDrive.ExpectedBlockSize)
}

$DiskDrives | Select-Object Name, Label, BlockSize, ExpectedBlockSize, IsOK | Format-Table -AutoSize
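As an aside, on newer systems Get-WmiObject is deprecated in favor of Get-CimInstance, which accepts the same filter and returns the same data:

Get-CimInstance -ClassName Win32_Volume -Filter "DriveType = 3 and Label <> 'Recovery'" | Select-Object Name, Label, BlockSize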

Upgrading SQL Reporting Services (SSRS) to SSRS 2012 should be a fairly easy task, but there are a few “bumps in the road” to watch out for. For this discussion, let’s pretend we have a SharePoint 2010 farm with two WFE, two APP, and one DB server. SSRS 2008 R2 has been installed on both of the APP servers with a load-balanced URL of spreports.contoso.local, each service uses a unique service account identity, all SPNs have been created, and Kerberos constrained delegation is configured and working properly.
Getting started, I’m sure you have a few questions; hopefully this will answer a few of them.
- My web application is currently configured for classic mode authentication. Do I need to configure my web app for claims authentication?
No. Communication between the web app and SharePoint service applications is almost always converted to claims by the SharePoint STS for intra-farm communication, even if the web app is using classic mode for user authentication. However, you will need the Claims to Windows Token Service to convert the claims token back to a Windows identity when connecting to data sources.
- Do I need to uninstall SSRS 2008 R2 first?
No. The SSRS instance will automatically be upgraded when running the upgrade from the SQL installation media.
- Do I have to upgrade the SQL database server on which the databases reside?
No, you do not, but it would probably be a good idea.
- Is there any additional Kerberos configuration needed?
Maybe. Using the same identity for the new SharePoint SSRS service application will keep all of the previous configuration intact, but you may need to perform some additional Kerberos-related tasks for the Claims to Windows Token Service – more on this later.
- Do I need to run the SQL upgrade on all servers in the farm?
You can, but I typically run it only on the servers that were running an SSRS instance and install the SharePoint Add-In individually on all remaining farm servers.
- Do I need to do anything to my existing reports?
No. The upgrade process will take care of this for you the first time each report is opened.
- I have PowerPivot installed in my SharePoint farm; will that continue to work?
If it is installed on the same server as SSRS, then no. You will have to completely uninstall PowerPivot before installing SSRS. Afterwards, you can install PowerPivot again from the SQL 2012 installation media.
- Is there a difference between SSRS Enterprise, BI, and Standard editions?
Yes. The Standard edition does not include Power View or data alerts.
Starting the Upgrade
OK, now on to the upgrade. Typically, you would insert the SQL Server installation media on each SharePoint server running an instance of SSRS; in this case, the two APP servers. Just make sure to choose the option to Upgrade from SQL Server 2005, SQL Server 2008 or SQL Server 2008 R2.

Once the upgrade wizard starts, it will install the setup files, run the System Configuration Checker, and detect the existing SSRS instance for you to upgrade. You mostly just need to move the wizard along until you reach the wizard page called Reporting Services SharePoint mode Authentication. This is where I hit my biggest speed bump.
Error: The credentials you provided for the ” service is invalid.
Typically it is a best practice to use a different domain account for each service instead of using built-in accounts like Network Service. I think, though, there may be an issue with the SQL installer when upgrading SSRS in SharePoint integrated mode to 2012. Every time I entered the password for the SSRS service account, I got an error stating:
The credentials you provided for the ” service is invalid.

Every time I entered the credentials and clicked Next, I saw this error, and I later realized that the service account was getting locked out because of too many bad password attempts – even though I clicked Next only once.
The solution: change the service account for the SharePoint SSRS instance to run as Network Service. After the upgrade is complete, you can change the credentials of the new SharePoint service application pool back to the previously used account. To change the service account to Network Service, open the Reporting Services Configuration Manager and change the account there. You will probably be prompted to back up your encryption keys during the process.

Now you should be able to sail through the installation and perform the upgrade on each server running an instance of SSRS. Once the upgrade is complete, you should see the new service application listed in SharePoint Central Administration, along with a newly created service application pool. That new application pool for the SSRS service application should be set to execute using the original credentials SSRS was previously running under, not Network Service. To change this, open Central Administration, click the Security link in the left menu, and then click Configure service accounts.

In the list of service accounts, you should see a new service application pool created by the SSRS upgrade with a name similar to Service Application Pool – MSRSSHP11.0_MSSQLSERVER. Select this application pool and, in the bottom drop-down box, register the account that was previously used by SSRS, then select it to change the service identity of the new service application from Network Service to that domain account, which already has all of the required SPNs created.

You should then be able to use the newly created SQL Server Reporting Services 2012 service application.
But wait, there are some additional steps that you may want to accomplish before calling the upgrade complete.
Rename the Service Application
Personally, I don’t care for the new default name of the service application (MSRSSHP11.0_MSSQLSERVER_Service_Application) and want to give it a friendlier name, something like SQL Server Reporting Services. There is no way to rename the proxy using the GUI, but I don’t care so much about the proxy, so I just left it as is.

To rename the service application, select it and then click the Properties button in the ribbon above to change the name.
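If you prefer scripting, the rename can also be done from PowerShell. This is a rough sketch, assuming the upgrade created the service application with the default MSRSSHP11.0 name; adjust the wildcard for your environment.

# Run from the SharePoint 2010 Management Shell (or Add-PSSnapin first).
Add-PSSnapin Microsoft.SharePoint.PowerShell
# Find the SSRS service application created by the upgrade and rename it.
$sa = Get-SPServiceApplication | Where-Object { $_.Name -like 'MSRSSHP11.0*' }
$sa.Name = 'SQL Server Reporting Services'
$sa.Update()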

This is where I ran into the next issue.
Grant the SharePoint Farm Account db_owner on the 3 Reporting Services databases.
When I tried to rename the service application, I was presented with an error stating that it was unable to open the ReportServerTempDB and that the login had failed for the SharePoint farm account.

Now that Reporting Services is hosted as a SharePoint service application, the SharePoint farm account should be granted db_owner permissions on all 3 of the Reporting Services databases – yes, there are now 3 of them.
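A hedged sketch of the grants, run from PowerShell. The farm account, server name, and database names below are placeholders (SharePoint-mode SSRS databases typically carry a GUID suffix, so substitute the actual names from your database server), and it assumes the farm account already has a server-level login:

Import-Module SQLPS
# Placeholder names -- substitute your farm account, database server, and the
# actual Reporting Services database names (including their GUID suffixes).
$farmAccount = 'CONTOSO\spfarm'
$databases = 'ReportingService_GUID', 'ReportingService_GUIDTempDB', 'ReportingService_GUID_Alerting'
foreach ($db in $databases) {
    Invoke-Sqlcmd -ServerInstance 'SQLDB01' -Database $db -Query @"
IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = '$farmAccount')
    CREATE USER [$farmAccount] FOR LOGIN [$farmAccount];
EXEC sp_addrolemember 'db_owner', '$farmAccount';
"@
}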
Error: Login Failed for “NT Authority\Anonymous Logon”
When I finally went to test one of the existing reports, I was greeted with yet another error. It basically stated that the report could not connect to one of its data sources as an anonymous user, when I would have expected it to connect as my user account. This led me to believe there was a Kerberos problem.
In this environment, the Claims to Windows Token Service (C2WTS) had already been configured, and SPNs and delegation for C2WTS had already been set up for all of the same data sources in order to support Excel Services. If you have not yet configured C2WTS, you should create an SPN (it doesn’t matter what it is; I just called mine SP/C2WTS) so that the Delegation tab is exposed in Active Directory Users and Computers. You will then need to configure constrained delegation for the service accounts and SPNs of all the data sources that you plan to connect to. Really, though, all of the delegation settings for the SSRS service account should also be replicated for the account running C2WTS.
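For reference, the SPN registration and a quick review of what is already registered can be done with setspn; the SPN and account names below are examples only, so adjust them for your domain:

# Register a dummy SPN so the Delegation tab appears for the C2WTS account:
setspn -S SP/C2WTS CONTOSO\svc-c2wts
# List the SPNs currently registered to an account:
setspn -L CONTOSO\svc-c2wts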
Every time I have to set up Kerberos in a SharePoint environment, I refer back to a great Microsoft document, Configuring Kerberos Authentication for Microsoft SharePoint 2010 Products. I decided to review the section on setting up Excel Services and started double-checking my environment to make sure that everything was set up similarly for SSRS. One statement in the guide caught my attention:
“Select Use any authentication protocol. This enables protocol transition and is required for the service account to use the C2WTS.”

I double-checked the service account under which SSRS was running and, sure enough, it was set to Use Kerberos only. Once I changed this to “Use any authentication protocol” and restarted all of the appropriate services, my reports started working again!
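If you want to verify that setting without opening Active Directory Users and Computers, the ActiveDirectory PowerShell module exposes it; the account name below is a placeholder:

Import-Module ActiveDirectory
# Placeholder account name; TrustedToAuthForDelegation = True corresponds to
# "Use any authentication protocol" (protocol transition) on the Delegation tab.
Get-ADUser -Identity 'svc-ssrs' -Properties TrustedToAuthForDelegation, 'msDS-AllowedToDelegateTo' |
    Select-Object Name, TrustedToAuthForDelegation, 'msDS-AllowedToDelegateTo'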
Reclaim the old DNS Record
Now that SSRS runs in SharePoint as a service application, the friendly URL that had been used to serve reports and load-balance requests was no longer needed. I was able to reclaim the old URL of spreports.contoso.local.