Crcl Revitalizes Online Reviews with New Application | Quisitive
Crcl Revitalizes Online Reviews with New Application
Learn how CRCL’s recommendation app, developed for iOS and Android, revitalized online reviews by catering to individual interests.

Please note that Menlo Technologies is a Quisitive company. 


In this case study:

Client: CRCL

Industry: Software & Technology

Products and Services: Application Development on Microsoft Azure

Country: USA

“Attention from a VP for such a new start-up was really refreshing. We felt like a priority even though we were just starting out. [Quisitive] ensured that communication was consistent as they set up multiple project management platforms and implemented a suite of tools and that helped Crcl effectively run a remote project team.”
Chas Pulido
Founder, Crcl

The Idea

Crcl (pronounced “circle”) has been different from the start.

 

In March of 2015, app development was the last thing on Chas Pulido’s mind. The senior in high school discovered a lump in his upper arm that was later identified as a tumor. Luckily, after months of monitoring, the tumor was ruled benign. But this mortality check had an impact on Chas.

 

With the gift of time, Chas gained an understanding of the importance of enjoying life, people and positive experiences. He wanted to encourage others to embrace their passions. With that vision in mind, Chas began searching.

Getting Started

When Chas enrolled in the University of Notre Dame in the fall of 2015, he put together a team of fellow students and started conducting market research. The team recognized several flaws in the review market: a lack of trust between users, no incentive to review following an experience, review biases (rating inflation), and the absence of a social community.

 

Chas and his team envisioned a solution to these shortcomings—an app that would revitalize the world of online reviews with recommendations that cater to individual interests.

The Application Product

Crcl combines individual profiles, a proprietary rating system, as well as planning features to create an engaging social community. Whether a member of the community wants to go for a hike, go out to eat, or find a movie, Crcl helps users find the best option. Crcl provides dedicated forums for users to connect and post about interests and activities they enjoy. The app empowers people to go out into the world and experience the things they love.

 

CRCL engaged Menlo Technologies, a Quisitive company, when they needed to accelerate the development of their mobile app and could not find iOS developers with satisfactory experience. Menlo Technologies developed Crcl’s iOS and Android applications.

 

"Dave Hickman really sold me on Menlo,” said Chas Pulido, Founder of Crcl. “Attention from a VP for such a new start-up was really refreshing. We felt like a priority even though we were just starting out. Menlo ensured that communication was consistent as they set up multiple project management platforms and implemented a suite of tools and that helped Crcl effectively run a remote project team.”

App Features

  • Circles of trust: Crcl’s Groups feature organizes reviews from your friends and by your interests, allowing users to see reviews for what they are looking for, written by people they trust.
  • Personal reviews: Instead of making users read reviews by strangers, Crcl’s profile tags make it easy to see who is writing a review. Friends, and friends of friends, help users find just the right place or activity they are searching for.
  • Quick rating system: With the Crcl app, users don’t need to type paragraphs of reviews or read blocks of text. Crcl’s adjective approach condenses reviews into what matters most.

Welcome to the “Introducing” series (check here for the full list of blog posts in this series). In the previous three blog posts, we introduced Azure and the services it provides, then certifications for Azure and how to get started, and most recently the structure of Azure. In this blog post, we will introduce how costing for Azure resources works.

Costing in Azure:

In an earlier blog post in the series, we introduced how pricing works in the cloud. A quick summary of that blog post is below:

Visualizing Costing information:

If you are looking for a quick high-level view of your current Azure spend in the Azure portal you can open “Subscriptions” in Azure. In this view, you can see a list of what subscriptions you have access to as well as their current cost levels. This view is available by going to All services and typing “subscriptions” as shown below.

Graphic 1: Getting to subscriptions


Graphic 2: Subscriptions and costing


The primary method to view details on costs for resources in Azure is to use the “Cost Management + Billing view”. This view is available by going to All services and typing “cost” as shown below in the Azure portal.

Graphic 3: Getting to Cost Management & Billing


This view can be used to show the accumulated costs for Azure across one or more subscriptions, but it is currently restricted to viewing a single directory at a time.

Graphic 4: Cost Management + Billing


Please note, Azure Sponsorship cost information is not currently available in Cost Management but it’s listed as “coming soon” as of 4/13/2020.

Graphic 5: Extracting data and visualizing in Excel


Below is a simple treemap showing which subscriptions account for the most Azure spend (with the subscription names scrubbed).

Graphic 6: Extracting data and visualizing exported data in Power BI


The treemap approach is also available in Power BI. Below is a quick sample put together in Power BI using the CSV data exported from Azure.

Graphic 7: Extracting data and visualizing PowerShell gathered data in Power BI


If the existing solutions in Azure are not sufficient for your needs, you can also export this data (or gather it via PowerShell scripts) and import it into Power BI. In the example below, I gathered 30 days’ worth of detailed data, which can be broken down by resource type and daily cost.


Microsoft provides great options directly available in Azure to see Azure costing and to forecast where costing will go. Additionally, the data can be accessed via a CSV export (which can be scheduled) and can also be gathered via PowerShell scripting.
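As an illustration of what you can do with that exported data outside of Excel or Power BI, below is a minimal Python sketch that rolls a cost export up per subscription (the treemap view in a few lines of code). The column names and figures here are assumptions for illustration; the real export contains many more columns.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample resembling a scheduled Azure cost CSV export.
sample_export = """SubscriptionName,ResourceType,Cost
Sub-A,virtualMachines,12.50
Sub-A,storageAccounts,3.25
Sub-B,sqlDatabases,20.00
Sub-B,virtualMachines,5.00
Sub-A,virtualMachines,4.25
"""

def cost_by_subscription(csv_text):
    """Sum the Cost column per subscription, largest spender first."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["SubscriptionName"]] += float(row["Cost"])
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(cost_by_subscription(sample_export))
```

The sorted output is exactly what a treemap wants as input: the biggest consumers first, ready to be sized by cost.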

Consumption as a metric for good

In a household, there are two metrics that are heavily dependent upon consumption – electrical and water. Depending on how you are billed, electrical is the closest to a truly consumption-based metric. The more electricity you use, the more it costs every month. Similarly, for water – the more water your household uses the more it costs every month. Speaking from a homeowner’s perspective, I do not begrudge my electrical or water providers what they charge me for the usage of their services – mostly because I have a level of control over my usage of either of these resources. What I do begrudge them is the base charges which occur every month regardless of how much I use their services.

Using a consumption-based metric for pricing at first sounds like it would be most beneficial to the vendor: the company selling the services wants you to consume more, so they are incentivized to have you consume more (possibly even more than you need). This metric, however, works out to be most beneficial to the consumer, because if you are not gaining value from consuming, you will not continue to consume. Therefore, it is critical to the vendor providing the services that you, as the consumer, see benefit from consuming their services.

Thank you to Greg T, Chad S and Beth F for their help on this blog post!

Additional resources:

Series Navigation:

Intelligent Structures Monitors the Health of Highway Infrastructure with Microsoft Azure
Learn how Intelligent Structures monitors the health of highway infrastructure with Microsoft Azure by using the IOT Hub and Azure SQL stack.

Please note that Menlo Technologies is a Quisitive company. 


In this case study:

Client: Intelligent Structures

Industry: Software & Technology

Products and Services: Infrastructure, Data & Analytics

Country: USA

Menlo Technologies, a Quisitive company, implemented a Microsoft Azure-based architecture to capture the device data synchronously which provided the dashboards required to analyze the data.

The Challenge

Intelligent Structures provides an innovative end-to-end enterprise solution for bridge performance management and is committed to supporting infrastructure executives with the real-time information and analytics they need for fact-based decision-making and productive bridge asset management. Intelligent Structures Inc. wanted to leverage sensor device data to monitor various kinds of indicators such as cracks, temperature, icing, scouring, wind speed and load on bridges across the U.S. and Canada. The company was looking for a solution that could be extended to multiple locations and geographies to monitor the bridges.

 

Intelligent Structures’ sensor devices, placed on bridges, continuously monitor the environmental conditions and the bridge parameters and transmit the data to a centralized repository. An application provides dashboards that show the health of the bridges by checking live and recorded data.

The Solution

Menlo Technologies, a Quisitive company, implemented a Microsoft Azure-based architecture to capture the device data synchronously, which provided the dashboards required to analyze the data. The architecture needed to account for:

  • The Volume of Data: Multiple sensors send a tremendous amount of data for each bridge. Each bridge is expected to have at least 30 sensors, each sending data points every second or faster.
  • The Performance for data retrieval and archival.
  • Multiple sensors sending data in parallel.
  • Data loss.
  • Data Security.
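The volume requirement can be made concrete with some quick arithmetic. A short Python sketch using the figures above (the payload size per reading is my assumption, purely for illustration):

```python
SENSORS_PER_BRIDGE = 30      # "at least 30 sensors" per bridge (from the case study)
READINGS_PER_SECOND = 1      # one data point per second or faster
SECONDS_PER_DAY = 24 * 60 * 60

# Assumed payload size per reading -- illustrative only.
BYTES_PER_READING = 100

readings_per_day = SENSORS_PER_BRIDGE * READINGS_PER_SECOND * SECONDS_PER_DAY
mb_per_day = readings_per_day * BYTES_PER_READING / 1_000_000

print(f"{readings_per_day:,} readings/bridge/day (~{mb_per_day:.0f} MB/day at 100 B each)")
```

Even at the minimum rate, a single bridge produces millions of readings per day, which is why ingestion needs to scale horizontally and in parallel.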

The Results

The architecture addressed all of the client’s requirements with Microsoft Azure by using the IoT Hub and Azure SQL stack.

 

The Windows Service:

  • Read the encrypted data from the data streams, pushing the processed data to the IoT Hub.
  • Decrypted the data in a secure and safe manner.
  • The IoT Hub sent data to Azure SQL and Power BI. Microsoft Stream Analytics was used to pull data from the IoT Hub and push it to Azure SQL. The Windows service is hosted on a virtual machine and will eventually be deployed on Azure App Service.
  • The sensor devices continuously send data into Azure. The Windows service synchronously reads this data in parallel and pushes it to the IoT Hub, which guarantees no data loss and scales to accommodate any number of sensor devices in parallel.

Benefits of the Azure Solution

  • Can easily scale horizontally for any volume of data
  • Good performance for data retrieval and archival as IOT Hub provides a way to configure devices so that the applications can write to each device in IOT Hub in parallel
  • Easy for development as IOT Hub integration is through an easy interface
  • Provides advanced security at each data collection point
  • Can build add-on services such as Machine Learning and Artificial Intelligence for enhanced predictive analytics
  • Systematically allows Intelligent Structures and their end clients to determine whether a specific section of a bridge needs repairs rather than an entire bridge replacement. Moreover, the data may also show that a bridge has a longer life than originally estimated from traditional manual data collection and analysis.

Methods to provide internet backup connectivity at home – Part 2

How well do you know your neighbors? No, really – how well do you know your neighbors and how well do you trust them? If you know and trust your neighbors and you are looking for a creative way to provide backup internet connectivity this post is for you!

If your neighborhood is like mine, you probably have a dozen or so Wi-Fi networks that show up when you connect to your home Wi-Fi network. Behind the scenes, many of your neighbors are probably using the same Internet Service Provider (ISP) but many of them are not. I was surprised to find out that in my area there are more than 5 ISP options available (for a good look at what options are available for internet connectivity check this out). Examples of options available include cable, fibre, satellite, DSL and other solutions.

Connect to multiple ISPs

If you truly cannot function without an internet connection and a tether is not a viable option, then you should consider connecting your home to multiple ISPs. In a simplistic approach, you can connect to two different providers and share each out via a different Wi-Fi configuration. A more complex configuration would use a router that has a connection to both ISPs and can switch between connections in case of a failure (or even load balance between the two). There are several negatives to this approach:

Develop a backup internet alliance

If you are friends with your neighbors, another option is to set up a secured guest network at home and share it with them. In this configuration, they can connect to your network in case of an ISP failure, and hopefully they will reciprocate (assuming you and your neighbors are not on the same ISP). If you have 2 or 3 neighbors who are within Wi-Fi range and use different ISPs, you can provide a backup connection that would continue to function even if a single ISP were to go offline.
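To make the alliance idea concrete, here is a minimal Python sketch of the selection rule described above (all network names and ISPs are hypothetical): a neighbor’s guest network only helps if it is in Wi-Fi range and on a different ISP than your own.

```python
def pick_backup_network(own_isp, neighbor_networks):
    """Return a reachable neighbor Wi-Fi network on a different ISP,
    or None if the alliance can't help during this outage.

    neighbor_networks: list of (ssid, isp, in_wifi_range) tuples.
    """
    for ssid, isp, in_range in neighbor_networks:
        if in_range and isp != own_isp:
            return ssid
    return None

# Hypothetical neighborhood: two neighbors in range, one on our own ISP.
neighbors = [
    ("SmithHome-Guest", "CableCo", True),
    ("JonesHome-Guest", "FiberCo", True),
]
print(pick_backup_network("CableCo", neighbors))  # JonesHome-Guest survives a CableCo outage
```

The `None` case is exactly the scenario called out above: if everyone in range shares your ISP, the alliance provides no real redundancy.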

There are several challenges with this approach which would need to be considered or mitigated:

This concept of a backup internet alliance could be approached on an individual level as discussed above, on a neighborhood level, or potentially even on a city level. Imagine for a minute what it would be like if you could go anywhere in your town and stay connected wherever you went, spanning the variety of ISPs available.

Summary: There are additional ways to provide backup internet connectivity including getting connected to a second ISP and/or creating a backup network alliance to provide a functional connection even when your ISP goes offline.

In the next part of this blog post series, I will provide some options I have found out about recently, a step-by-step I used at home, and information from some of my colleagues in the industry!

The links for this series are below:

Part 1: How to use tethering to provide backup internet connectivity

Part 3: How router configuration can provide backup internet connectivity and maximize bandwidth

One of the challenges we have seen while working with Teams or other video conferencing platforms is a general slowdown of meeting audio when multiple videos are shared, such as in a classroom environment or a large company meeting. During most of the day this isn’t a problem, but at certain points of the day (most often the afternoon) a significant slowdown can occur. This blog post will focus on debugging conference call latency issues in Microsoft Teams, but these issues can occur on any video conferencing platform.

An important thing to realize is that video-sharing performance can be impacted by several underlying causes of conference call latency:

Additionally, any combination of the above can occur at the same time. Answering a simple question like “Why is my conference call latency poor, causing issues such as choppy audio or problems seeing the screen share?” isn’t really that simple.

The graphic below shows a simplified version of how each of the attendees connects to a video conference. In most cases, there is someone running the meeting whom we will refer to as the presenter (the teacher in a classroom setting). There are also several attendees (the students in the classroom setting). Each of these attendees connects to the internet in some manner (cable, fibre optic, ADSL, etc.), represented by the lines between the presenter/attendees and the internet. The connectivity is likely through different Internet Service Providers, but we will simplify this to show that they are all connecting to the internet somehow. From their internet connection, they each communicate with the video conference application (Teams in this blog post’s example).

Graphic 1: How people connect to a video conference when all is working well


There are a lot of parts that must work to make this whole process function. The presenter and attendees all need functional internet connectivity, and the video conference application must be available and performing effectively. If problems occur at any point in the diagram, there will be problems in the video conference. As an example, if one attendee has a slow internet connection, it will impact their ability to follow the video conference (including what is shared on the screen, audio from the presenter, etc.). The slow link is shown in graphic 2 below by the yellow color of the link between the attendee and the internet.

Graphic 2: How people connect to a video conference when one attendee has a slow internet connection


If the person who is presenting (or teaching) has a slow internet connection, it will impact all of the attendees’ (students’) ability to see what is being shared on the screen, as well as the audio and video from the presenter. This is represented in graphic 3 by the yellow line between the presenter and the internet.

Graphic 3: How people connect to a video conference when the presenter has a slow internet connection


If internet service providers are experiencing a slowdown (most likely due to additional network traffic occurring during this outbreak), this will impact all of the attendees of the video conference as shown in graphic 4.

Graphic 4: How people connect to a video conference when the internet service providers or connections are slow


Finally, if there is an issue with the underlying video conferencing application, this will also impact all attendees, causing conference call latency issues as shown in graphic 5.

Graphic 5: How people connect to a video conference when the video conference application is slow


How to debug problems during video conferences

The above graphics should show that there are many different things which can cause a problem during a video conference. So how can we debug this situation?
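As a rough illustration of the triage the graphics walk through, here is a minimal Python sketch (the function and inputs are illustrative, not part of Teams): a service problem affects everyone, a presenter problem affects everyone, and an attendee-side problem affects only that attendee.

```python
def diagnose(presenter_ok, service_ok, attendee_ok):
    """Rough triage of a video-conference slowdown, mirroring the
    graphics above. attendee_ok maps each attendee name to whether
    their own internet connection is healthy."""
    if not service_ok:
        return "video conference service issue -- impacts everyone"
    if not presenter_ok:
        return "presenter's connection -- impacts everyone"
    slow = [name for name, ok in attendee_ok.items() if not ok]
    if slow:
        return f"attendee-side connection issue: {', '.join(slow)}"
    return "no obvious bottleneck -- check loaded vs. unloaded latency"

print(diagnose(True, True, {"Ana": True, "Ben": False}))
```

The key observation the sketch encodes: who is affected tells you where to look. One person struggling points at their link; everyone struggling points at the presenter, the ISPs, or the service itself.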

Common issues & resolutions:

Tips & Tricks:

Feedback from a colleague on this blog post

I sent this blog post to David B, who had the following thoughts for consideration (this has been consolidated to specific bullet points):

Configuring an email notification for service incidents

In the Microsoft 365 admin center, under preferences, you can set up an email notification if there are health issues for the services you are interested in. To set this up, open the Service health view and click on Preferences (highlighted below).

Conference call latency: Service health preferences

If you enable the checkbox which says “Send me service health notifications in email” you can specify whether to include incidents (which we are looking for in this case).

Preferences - part 1

You can choose what specific services you want to be notified about (Microsoft Teams and SharePoint Online in this example).

Preferences - part 2

This notification should be sent to the technical contact or the most technical person in your organization so they can determine whether an incident will impact your organization.

Configuring a Teams site to test connectivity

You can create a Teams site with tabs that help with debugging connectivity issues. For this Teams site, you can add a webpage that points to one location to check your internet connection and a second webpage that checks your connectivity to Office 365. These provide a quick way to debug what could be causing conference call latency and communication issues.

To configure this, I created a new Team called “Teams Status”. On this team, I used the + sign to add a new tab.

Adding tab in Teams

I created two tabs, one called “Internet Connectivity Test” and one called “Teams Connectivity Test”. For each of these, I added them as a website from the options shown below.

Add a tab - options

For this new tab, you just need to type in the name of the website and add the URL you want it to go to.

Adding a website

Below are screenshots from my two websites that are available directly in Teams, making it easier to track down what may be causing issues.

Showing more information gives details that can help with debugging connectivity from your location. The URL I added was: https://fast.com/. In the example below, we can see that my internet speed is 32 Mbps, unloaded connections are at 14 ms and loaded connections are at 595 ms. Unloaded latency is how long it takes to connect when there is not much load on your link to the ISP. Loaded latency is how long it takes to connect when the link to your ISP is under load.

ISP connection speed

The Teams Connectivity Test checks the load time to bring up https://outlook.office365.com/. The URL I added was: https://tools.pingdom.com/#5c486c4d70400000. In the example below, we can see that the load time is 365 ms.

Conference call latency connectivity test to O365

Additional reference:

Summary: Understanding how video conferencing systems work at a high level can help you debug problems and work around them more quickly. Hopefully, this blog post has given you a quick crash course and some tips that will help your meetings (or classes) go on without a hitch and avoid conference call latency!

Welcome to the “Introducing” series (check here for the full list of blog posts in this series)! In the last part of this series, we introduced Azure. In this blog post, we will look at what certification exams are available around Azure, and how to get started if you want to learn more about Azure.

Certification exams can provide a way to demonstrate what you know and establish a level of credibility in a variety of technologies. For this blog post, we will focus on Microsoft Certification Exams with a focus on Azure.

For my old-school readers on this blog post who have been working in Microsoft technology for a while, Microsoft recently announced that they will be retiring MCSA, MCSD, and MCSE exams effective June 30th, 2020. I believe that this indicates a push towards their remaining certification exams that are focused more on their cloud technologies such as Azure.

There are a variety of exams available that focus on Azure (the list is available here). Currently these are: AZ-103, AZ-120, AZ-204, AZ-220, AZ-300, AZ-301, AZ-400, AZ-500, AZ-900, 70-487 and 70-537.

You may be asking yourself if you can get a job in technology without a college degree and the answer is Yes. Microsoft is among many companies in the industry that don’t necessarily require a college degree (at least for entry-level positions).

What certification should I start with and what resources should I use?

At first glance, that list looks daunting – how do you get started with Azure, and which exam should you take first? I reached out to several of my colleagues and got some excellent recommendations. The first step I would recommend is the exam referred to as “AZ-900: Azure Fundamentals”. The Microsoft Azure Fundamentals exam appears to be a great starting point to get to know more about Azure. Below are resources I would recommend for free (or very inexpensive) self-study for that exam:

What about getting a degree?

Getting a degree in Information Technology or Computer Science is a great way to get into a new career! I started my career with a Bachelor of Science in Computer Science and that has opened a lot of doors which would have been closed for me otherwise.

Your local community college (Collin College, as an example in my area) can provide you with courses that help you see what you are (or, more importantly, what you are not) interested in with regards to computers. Specifically, if math isn’t your strong area, I would recommend looking at “Information Systems”. If math is one of your strengths, look at “Computer Science”. Community colleges give you access to a wide variety of topics (many of which are computer-related) at minimal cost, giving you a chance to see what you are and are not interested in.

What about bootcamps to get through certification exams?

Another approach is to study at something called a “bootcamp”. Bootcamps are designed to give you the tools you need to study a topic quickly and with focus. The goal of a bootcamp is to provide the information required to pass multiple Azure certifications. The negative is that they tend to be expensive and require dedicated time, which may mean taking time off work. An example of one of these is available here. Before choosing any bootcamp, be sure to find out what its exam pass rate is.

Do you have any tips for taking certification exams?

The most important key to taking certification exams effectively is to spend the time to study and to try out the technology you are being tested on (for details on this see “Free Azure Resources” above).

Did you know that you can take certification exams online?

I ran across the following articles recently which provide some good tips and tricks for taking certification exams:

What are good resources to help with self-study for the other Azure certification exams?

Below are a set of links that I have run across on Twitter while I’ve been researching for this article.

AZ-103: Microsoft Azure Administrator

If you are looking for an exam to take after the AZ-900 this is most likely the one.

AZ-104: Microsoft Azure Administrator (Beta)

AZ-204: Developing Solutions for Microsoft Azure

AZ-303:

AZ-500: Microsoft Azure Security Technologies

Additional resources:

Thank you to Tony N, John S, Kris T and JC W for all of your insights on this blog post and thank you to Chad S and Beth F for their help on this blog post! And thank you to Thomas Maurer for this tip and many other links in this blog post!

Update: My friend James A sent me this list with a set of free e-books from Microsoft in this space!

Series Navigation:

I’ve had a lot of trouble recently with connecting my PC to my SharePoint Online tenants via PowerShell.  The process is fairly simple and this article is the best starting point:  http://technet.microsoft.com/en-us/library/fp161388(v=office.15).aspx.

Having done all the right things, I was running into the following error every time I’d run the Connect-SPOService cmdlet: “The Application ID (AppID) for which the service ticket is requested does not exist on the system.”

I don’t know what that means or where to begin with it, so I hit the web and found nothing useful.  Ugh.

Well, today I got lucky. I’m presenting at tonight’s meeting of the Phoenix Office 365 User Group and have been spending a few precious lunchtime minutes prepping. In particular, the facility we meet in does not provide wired network access, so I want to make sure that my virtual machines are going to play nicely with my wireless NIC today. I disabled my Ethernet NIC and am only on wireless now as I write this.

Somehow, that fixed the issue.  I’m guessing it has something to do with my host (Windows 8.1) running Hyper-V and having several different virtual switches that share my wired Ethernet NIC.

I’m now essentially not using any unusual network settings and I’ve been able to connect to multiple tenants now without a problem.  Yay!

Update 01/05/15

I just called and talked to tech support at my ISP and the “The Application ID (AppID) for which the service ticket is requested does not exist on the system” message is occurring because I’m on a residential network instead of a business network.  Looks like I’ll have to go to the office for these kinds of things.  Argh!

Update 02/02/16

I’ve found success in connecting to my company’s VPN for the last year. While that’s worked, it’s been a pain in the neck to connect to the VPN on and off all day. Today Max Melcher posted a similar article here: https://melcher.it/2016/02/for-security-reasons-dtd-is-prohibited-in-this-xml-document. Check it out. You might find success in some DNS updates and disabling IPv6.

Using the DATEDIFF function makes it easy to calculate weekdays in SQL, because it both removes the time from a date and converts the date into a number for easy mathematical calculations.

Calculating Most Recent Monday

DECLARE @MostRecentMonday DATETIME = DATEDIFF(day, 0, GETDATE() - DATEDIFF(day, 0, GETDATE()) % 7) -- day 0 (1900-01-01) was a Monday

PRINT @MostRecentMonday

Calculating Previous Sunday

DECLARE @CurrentWeekday INT = DATEPART(WEEKDAY, GETDATE())

DECLARE @LastSunday DATETIME = DATEADD(day, -1 * ((@CurrentWeekday + 6) % 7), GETDATE()) -- assumes the default DATEFIRST = 7 (Sunday = 1); returns today when run on a Sunday

PRINT @LastSunday

Calculating Previous Monday

DECLARE @CurrentWeekday INT = DATEPART(WEEKDAY, GETDATE())

DECLARE @LastMonday DATETIME = DATEADD(day, -1 * ((@CurrentWeekday + 5) % 7), GETDATE()) -- assumes the default DATEFIRST = 7 (Sunday = 1); returns today when run on a Monday

PRINT @LastMonday
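Because DATEPART(WEEKDAY) depends on the server’s DATEFIRST setting, it can be handy to cross-check the same weekday arithmetic in another language. A quick Python sketch (Python’s date.weekday() is always 0 for Monday, regardless of any server setting):

```python
from datetime import date, timedelta

def most_recent_monday(d):
    # date.weekday() is 0 for Monday, so subtracting it lands on the
    # Monday of the current week (d itself when d is a Monday).
    return d - timedelta(days=d.weekday())

def previous_sunday(d):
    # Most recent Sunday on or before d (d itself when d is a Sunday).
    return d - timedelta(days=(d.weekday() + 1) % 7)

print(most_recent_monday(date(2020, 4, 15)))  # a Wednesday -> 2020-04-13
print(previous_sunday(date(2020, 4, 15)))     # -> 2020-04-12
```

If the SQL and Python versions disagree for a given date, the first thing to check is the session’s DATEFIRST value.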

More helpful SQL content:

  1. Float vs decimal in SQL 
  2. Cannot resolve the collation conflict
  3. SQL changes not permitted

In Cumulative Update 3 for System Center Configuration Manager 2012 R2, Microsoft introduced Management Point Affinity and Justin Chalfant had a nice write-up on the new feature.  One thing that was left undocumented was an acceptable way to set MP Affinity; the blog only mentions the use of Group Policy, Compliance Scripts, etc.

I worked out the details of a ConfigMgr Configuration Item (for Compliance).  The challenge is that the registry key is a Multi-String Value (an array of strings) and that ConfigMgr’s Configuration Item cannot natively handle this registry data type.  I decided to use a VBScript since it is the least common denominator of our scripting choices.

Below is the Discovery/detection script and the Remediation script.  I’ve also included step-by-step screen shots and the final exported Configuration Item if you just want to import it and not create the object yourself.

The exported file can be downloaded from OneDrive @ https://onedrive.live.com/redir?resid=E3B0C73435A2F778%212827 \ ConfigMgr Client AllowedMPs.cab

Scripts

For testing or one-off situations, run this command line to set the list of allowed MPs.

reg.exe add HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\CCM /v AllowedMPs /t REG_MULTI_SZ /d "https://MP1.lab.local\0http://MP2.lab.local\0MP3.lab.local"

This is the Configuration Item Discovery script.  Be sure to update strDataDesired with your actual value for the group of computers which will be targeted.

On Error Resume Next
strDataDesired = "https://MP1.lab.local|http://MP2.lab.local|MP3.lab.local|"
Set StdOut = WScript.StdOut
Set objWMIreg = GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\default:StdRegProv")
objWMIreg.GetMultiStringValue &H80000002, "SOFTWARE\Microsoft\CCM", "AllowedMPs", arrData
If VarType(arrData) = 8204 Then ' 8204 = array of strings
    For Each strData In arrData
        strDataDetected = strDataDetected & strData & "|"
    Next
End If
If strDataDesired <> strDataDetected Then
    StdOut.WriteLine "reset needed"
Else
    StdOut.WriteLine "as expected"
End If

This is the Configuration Item Remediation script.  Be sure to update arrDataDesired with your actual value for the group of computers which will be targeted.

On Error Resume Next

arrDataDesired = Array("https://MP1.lab.local", "http://MP2.lab.local", "MP3.lab.local")

Set oReg = GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\default:StdRegProv")
oReg.SetMultiStringValue &H80000002, "SOFTWARE\Microsoft\CCM", "AllowedMPs", arrDataDesired

After adding these scripts to a Configuration Item, add the Configuration Item to a Configuration Baseline and deploy the baseline to a collection of computers.  You'll need a different Configuration Item, Baseline, Deployment, and Collection for each list of AllowedMPs you need.

Screen Shots

Create a new Configuration Item and give it an appropriate name accounting for the group of computers / list of MPs that will be allowed.

Select the operating systems this will be allowed to run on.

Configure the Setting type as Script and the Data type as String.

Type the script to detect the registry key value.  Ensure that the strDataDesired variable is updated to match the list of allowed MPs.  Notice the | (pipe) as the last character.

Type the script to remediate the registry key value.  Ensure that the arrDataDesired variable is updated to match the list of allowed MPs.

Select the Compliance Rules tab, give the rule a name, set the Rule type to Value, and configure "The value returned by the specified script" to Equals "as expected".

Enable “Run the specified remediation script..”

Optionally enable “Report noncompliance…” and set the severity to an appropriate value such as Warning

Next, Next, Next, …

That completes the creation of the Configuration Item.  Add it to a Configuration Baseline and deploy it to a collection of computers.  You'll need a different Configuration Item, Baseline, Deployment, and Collection for each list of AllowedMPs you need.

A colleague of mine was recently working on a Windows image that required a specific registry setting for Microsoft Office 2010.  However, the registry key simply would not "stick" and was being stripped out at some point before the user could log on and launch the application.

There are a few options for making changes to all existing and future/new user profiles:
1) CopyProfile in Unattend.xml – only works for new users but will meet the criteria during OS deployment
2) modify the default user profile (load the default user hive) – only works for new users but will meet the criteria during OS deployment
3) modify the default user profile AND each existing user profile (load the default and each user hive) – works but is a hassle
4) create a Scheduled Task to run at logon – executes code at each logon
5) utilize HKLM\Software\Microsoft\Windows\CurrentVersion\Run – executes code at each logon
6) utilize Active Setup
There may also be some method I'm not aware of.

Active Setup to the rescue!

A former co-worker who has a background in application development and software packaging introduced me to this feature some time ago.  Here’s a quick and simple example:

With Admin rights (SCCM/MDT/etc.) write the following registry values:

reg.exe ADD "HKLM\Software\Microsoft\Active Setup\Installed Components\<UniqueID>" /v StubPath /t REG_SZ /d "cmd.exe /c %ProgramData%\Scripts\myScript.vbs"
reg.exe ADD "HKLM\Software\Microsoft\Active Setup\Installed Components\<UniqueID>" /v Version /t REG_SZ /d 1,0    (yes commas, not periods)

Each time a user logs on, the script referenced in StubPath will be executed if HKCU\Software\Microsoft\Active Setup\Installed Components\<UniqueID>\Version does not exist or is a lower version.
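Active Setup treats Version as comma-separated numeric fields and compares them field by field.  A rough, language-neutral sketch in Python of when the stub fires (illustrative only — the real comparison is performed by Windows at logon, and the function name here is mine):

```python
def should_run_stub(hklm_version, hkcu_version):
    """Return True if the per-user stub should run: the HKCU Version
    value is missing, or its comma-separated version is lower than
    the machine-wide HKLM Version."""
    if hkcu_version is None:
        return True
    parse = lambda v: tuple(int(part) for part in v.split(","))
    return parse(hkcu_version) < parse(hklm_version)

print(should_run_stub("1,0", None))    # -> True  (first logon)
print(should_run_stub("2,0", "1,0"))   # -> True  (version was bumped)
print(should_run_stub("1,0", "1,0"))   # -> False (already ran)
```

Bumping the HKLM Version (e.g. from 1,0 to 2,0) is how you re-run the script for every user after changing it.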

Active Setup is explained in more detail over at AppDeploy (now ITNinja) and elsewhere.  See the following articles:
http://www.itninja.com/blog/view/an-active-setup-primer by Bob Kelly
http://www.itninja.com/blog/view/appdeploy-articles-activesetup
https://helgeklein.com/blog/2010/04/active-setup-explained