Posts Tagged ‘Powershell’

Provisioning students with Office 365 ProPlus licenses

NOTE:  This post was updated on 2014-12-23 to reflect refinements in the PowerShell provisioning script. The script now has support for provisioning O365 ProPlus for faculty/staff under the new Office 365 Faculty/Staff Benefit.

Interesting… I still seem to be working at UVM. There must be a story there, but you won’t read about it here.

Anyway, after getting back from my brief hiatus at Stanford University, I got back on the job of setting up Azure DirSync with Federated login to our in-house Web SSO platforms. I’ll need to post about the security changes required to make that work with UVM’s FERPA interpretation. To summarize, we got it working.

However, once students can log in to Office 365, we need to provision them with licenses. DirSync can’t do this, so I needed to script a task that grants an Office 365 ProPlus license to any un-provisioned active student. You will find the script, mostly unaltered, below. I just set it up as a scheduled task that runs sometime after the nightly in-house Active Directory update process.
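For reference, on Server 2012 and later a nightly run like this can be registered with the ScheduledTasks cmdlets; the task name, script path, and start time below are placeholders, not the values we actually use:

```powershell
# Illustrative only: the task name, path, and time are placeholders.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\local\scripts\Provision-MSOLUsers.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Provision-MSOLUsers' -Action $action -Trigger $trigger
```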

To be useful outside of UVM, the code would need to be customized to handle the logic used in your organization to determine who is a student. We have extended the AD schema to include the eduPerson schema, and have populated the “eduPersonPrimaryAffiliation” attribute with “Student” for currently active students. If you do something different, have a look at the “Get-ADUser” line, and use a different LDAP query to fetch your student objects.
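For example, if your AD populates “eduPersonPrimaryAffiliation” as ours does, a replacement fetch might look like this sketch (the SearchBase is a placeholder; substitute your own OU):

```powershell
# Hypothetical variant of the script's Get-ADUser call, keying on
# eduPersonPrimaryAffiliation instead of our extensionAttribute1 logic.
# The SearchBase below is a placeholder -- substitute your own OU.
Get-ADUser -LdapFilter '(&(objectClass=inetOrgPerson)(eduPersonPrimaryAffiliation=Student))' `
    -SearchBase 'ou=people,dc=mydomain,dc=edu' -SearchScope Subtree |
    ForEach-Object { $_.UserPrincipalName }
```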


# Provision-MSOLUsers.ps1 script, by J. Greg Mackinnon, 2014-07-30
# Updated 2014-11-20, new license SKU, corrections to error capture commands, and stronger typing of variables.
# Updated 2014-11-21, added "license options" package to the add license command, for granular service provisioning.
# Updated 2014-12-22, Now provisions student, faculty, and staff Office 365 Pro Plus with different SKUs.
# Provisions all active student accounts in Active Directory with an Office 365 ProPlus license.
# Requires:
# - PowerShell Module "MSOnline"
# - PowerShell Module "ActiveDirectory"
# - Azure AD account with rights to read account information and set license status
# - Credentials for this account, with password saved in a file, as detailed below.
# - Script runs as a user with rights to read the eduPersonAffiliation property of all accounts in Active Directory.
#    Create a credential file using the following procedure:
#    1. Log in as the user that will execute the script.
#    2. Execute the following line of code in PowerShell:
#    ConvertTo-SecureString -String 'password' -AsPlainText -Force | ConvertFrom-SecureString | out-file "c:\pathToCreds\msolCreds.txt" -Force

Set-PSDebug -Strict

#### Local Variables, modify for your environment: ####
[string] $to = ''
[string] $from = ''
[string] $smtp = ''
[string] $msolUser = ''
[string] $searchBase = 'ou=people,dc=mydomain,dc=edu'
[string] $ea1Filter = '(&(ObjectClass=inetOrgPerson)(|(extensionAttribute1=*Student*)(extensionAttribute1=*Staff*)(extensionAttribute1=*Faculty*)))'
### Example Filters:
# Filter for all fac/staff/students: 
#   (&(ObjectClass=inetOrgPerson)(|(extensionAttribute1=*Student*)(extensionAttribute1=*Staff*)(extensionAttribute1=*Faculty*)))
# Filter for just students:
#   (&(ObjectClass=inetOrgPerson)(extensionAttribute1=*Student*))
#### End Local Variables ##############################

#### Initialize logging and script variables: #########
#initialize log and counter:
[string[]] $log = @()
[long] $pCount = 0

#initialize logging:
[string] $logFQPath = "c:\local\temp\provision-MSOLUsers.log"
New-Item -Path $logFQPath -ItemType file -Force

[DateTime] $sTime = get-date
$log += "Provisioning report for Office 365/Azure AD for: " + ($sTime.ToString()) + "`r`n"

#### Define Functions:  ###############################
function errLogMail ($err,$msg) {
    # Write error to log and e-mail function.
    # Writes the error object in $err and the message in $msg to the global $log array,
    # flushes the contents of the $log array to file,
    # then e-mails the log contents to the mail address specified in $to.
    [string] $except = $err.exception;
    [string] $invoke = $err.invocationInfo.Line;
    [string] $posMsg = $err.InvocationInfo.PositionMessage;
    $script:log += $msg + "`r`n`r`nException: `r`n$except `r`n`r`nError Position: `r`n$posMsg";
    $script:log | Out-File -FilePath $logFQPath -Append;
    [string] $subj = 'Office 365 Provisioning Script:  ERROR'
    [string] $body = $script:log | % {$_ + "`r`n"}
    Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp
}

function licReport ($sku) {
    # Appends license availability counts for the given SKU to the global $log array.
    $script:log += 'License report for: ' + $sku.AccountSkuId
    $total = $sku.ActiveUnits
    $consumed = $sku.ConsumedUnits
    $script:log += 'Total licenses: ' + $total
    $script:log += 'Consumed licenses: ' + $consumed
    [int32] $alCount = $total - $consumed
    $script:log += 'Remaining licenses: ' + $alCount.toString() + "`r`n"
}
#### End Functions  ###################################

#Import PS Modules used by this script:
try {
    Import-Module MSOnline -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading Azure AD (MSOnline) PowerShell module."
    errLogMail $myError $myMsg
    exit 101
}

try {
    Import-Module ActiveDirectory -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading ActiveDirectory PowerShell module."
    errLogMail $myError $myMsg
    exit 102
}

#Get credentials for use with MS Online Services:
try {
    $msolPwd = get-content C:\pathToCreds\msolCreds.txt | convertto-securestring -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered getting creds from file."
    errLogMail $myError $myMsg
    exit 110
}

try {
    $msolCreds = New-Object System.Management.Automation.PSCredential ($msolUser, $msolPwd) -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in generating credential object."
    errLogMail $myError $myMsg
    exit 120
}
#Use the following credential command instead of the block above if running this script interactively:
#$msolCreds = get-credential

#Connect to MS Online Services:
try {
    # ErrorAction set to "Stop" to force any errors to be terminating errors.
    # The default behavior for connection errors is non-terminating, so the "catch" block otherwise would not be processed.
    Connect-MsolService -Credential $msolCreds -ErrorAction Stop
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in connecting to MSOL Services."
    errLogMail $myError $myMsg
    exit 130
}
$log += "Connected to MS Online Services.`r`n"

#Generate license report:
$studAdv = Get-MsolAccountSku | ? {$_.accountSkuId -match 'STANDARDWOFFPACK_IW_STUDENT'}
$facStaffBene = Get-MsolAccountSku | ? {$_.accountSkuId -match 'OFFICESUBSCRIPTION_FACULTY'}
licReport $studAdv
licReport $facStaffBene

#Set license options for student and faculty/staff SKUs: 
# NOTE: License options for a SKU can be enumerated by examining the "ServiceStatus" property of the MsolAccountSku objects fetched above using "Get-MsolAccountSku".
$stuLicOpts = New-MsolLicenseOptions -AccountSkuId $studAdv.AccountSkuId #add "-DisabledPlans" here to withhold individual student service plans
$facStaffLicOpts = New-MsolLicenseOptions -AccountSkuId $facStaffBene.AccountSkuId -DisabledPlans ONEDRIVESTANDARD

#Retrieve active fac/staff/student accounts into a hashtable:
[hashtable] $adAccounts = @{}
try {
    # NOTE:  The filter used for collecting students needed to be modified to fetch admitted students that are not yet active.
    #   This is a 'hack' implemented by FCS to address the tendency for the registrar not to change student status until the first day of class.
    #   (Actual student count should be lower, but we have no way to know what the final count will be until the first day of classes.)
    #get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(eduPersonAffiliation=Student))' -SearchBase 'ou=people,dc=campus,dc=ad,dc=uvm,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
    get-aduser -LdapFilter $ea1Filter -Properties extensionAttribute1 -SearchBase $searchBase -SearchScope Subtree -ErrorAction Stop | % {$adAccounts.Add($_.userPrincipalName,$_.extensionAttribute1)}
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Active Directory."
    errLogMail $myError $myMsg
    exit 200
}
$log += "Retrieved accounts from Active Directory."
$log += "Current account count: " + $adAccounts.count

#Retrieve unprovisioned accounts from Azure AD:
[array] $ulUsers = @()
try {
    #Note use of "-Synchronized" to suppress processing of cloud-only accounts.
    $ulUsers += Get-MsolUser -UnlicensedUsersOnly -Synchronized -All -ErrorAction Stop
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Azure AD. "
    errLogMail $myError $myMsg
    exit 300
}
$log += "Retrieved unlicensed MSOL users."
$log += "Unlicensed user count: " + $ulUsers.Count + "`r`n"

#Provision any account in $ulUsers that also is in the $adAccounts array:
foreach ($u in $ulUsers) {
    #Lookup current unlicensed user in the AD account hashtable:
    $acct = $adAccounts.item($u.UserPrincipalName)
    if ($acct -ne $null) {
        #Uncomment to enable verbose logging of user to be processed.
        #$log += $u.UserPrincipalName + " is an active student."
        try {
            if ($u.UsageLocation -notmatch 'US') {
                #Set the usage location to the US... this is a prerequisite to assigning licenses.
                $u | Set-MsolUser -UsageLocation 'US' -ErrorAction Stop ;
                #Uncomment to enable verbose logging of usage location assignments.
                #$log += 'Successfully set the usage location for the user. '
            }
        } catch {
            $myError = $_
            $myMsg = "Error encountered in setting Office 365 usage location for user. "
            errLogMail $myError $myMsg
            exit 410
        }
        try {
            # Note: Order of if/elseif logic determines which SKU "wins" for people with multiple affiliations (both student and faculty/staff).
            # At present the student affiliation "wins", which is preferable because we have 1 million student licenses and only 6 thousand fac/staff.
            # If we change licensing options in the future, we will want to revisit this logic.
            if ($acct -match 'Student') {
                #Uncomment to enable verbose logging of Student license assignments.
                #$log += 'Setting Student license options for user...'
                #Assign the Student Advantage license to the user, with desired license options:
                $u | Set-MsolUserLicense -AddLicenses $studAdv.AccountSkuId -LicenseOptions $stuLicOpts -ErrorAction Stop ;
            } elseif ($acct -match 'Faculty|Staff') {
                #Uncomment to enable verbose logging of Fac/Staff license assignments.
                #$log += 'Setting Fac/Staff license options for user...'
                #Assign the Faculty/Staff Benefit license to the user, with desired license options:
                $u | Set-MsolUserLicense -AddLicenses $facStaffBene.AccountSkuId -LicenseOptions $facStaffLicOpts -ErrorAction Stop ;
            }
            #Uncomment to enable verbose logging of license assignments.
            #$log += 'Successfully set the Office license for user: ' + $u.UserPrincipalName
            $pCount += 1
        } catch {
            $myError = $_
            $myMsg = "Error encountered in assigning Office 365 license to user. "
            errLogMail $myError $myMsg
            exit 420
        }
    } else {
        $log += $u.UserPrincipalName + " does not have an active AD account.  Skipped provisioning."
    }
    Remove-Variable acct
}

#Add reporting details to the log:
$eTime = Get-Date
$log += "`r`nProvisioning successfully completed at: " + ($eTime.ToString())
$log += "Provisioned $pCount accounts."
$tTime = new-timespan -Start $stime -End $etime
$log += 'Elapsed Time (hh:mm:ss): ' + $tTime.Hours + ':' + $tTime.Minutes + ':' + $tTime.Seconds

#Flush out the log and mail it:
$log | Out-File -FilePath $logFQPath -Append;
[string] $subj = 'Office 365 Provisioning Script:  SUCCESS'
[string] $body = $log | % {$_ + "`r`n"}
Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp

Migrating Windows-auth users to Claims users in SharePoint

A short time back I published an article on upgrading a Windows-authenticated based SharePoint environment to an ADFS/Shibboleth-based claims-based environment. At that time I said I would post the script that I plan to use for the production migration when it was done. Well… here it is.

This script is based heavily on one found at SharePoint-Voodoo. Unfortunately, that site appears to be down at the time of this writing, so I cannot make appropriate attribution to the original author. The script helped speed this process along for me… Thanks, anonymous SharePoint guru!

My version of the script adds the following:

  • Adds stronger typing to prevent script errors.
  • Adds path checking for the generated CSV file (so that the script does not exit abruptly after running for 30 minutes).
  • Introduces options to specify different provider prefixes for windows group and user objects.
  • Introduces an option to add a UPN suffix to the new user identity.
  • Collects all user input before doing any processing to speed along the process.
  • Adds several “-Limit All” parameters to the “Get-SP*” cmdlets to prevent omission of users from the migration process.

There are still some minor problems. When run in “convert” mode, the script generates an error for every migrated user, even when the user is migrated successfully. I expect this is owing to a bug in “Move-SPUser”, and there probably is not much to be done about it. Because I want to migrate some accounts from windows-auth to claims-with-windows-auth, there is some manual massaging of the output file that needs to be done before running the actual migration, but I think this is about as close as I can get to perfecting a generic migration script without making the end-product entirely site-specific.

I will need to run the script at least twice… once for my primary “CAMPUS” domain, and once to capture “GUEST” domain users. I may also want to do a pass to convert admin and service account entries to claims-with-windows-auth users.
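To make the provider prefixes concrete, the “document” pass writes rows shaped like the following to MigrateUsers.csv (all values here are hypothetical); in “convert” mode, each row drives one Move-SPUser call:

```
OldLogin,NewLogin,SiteCollection
CAMPUS\jdoe,i:0e.t|adfs|jdoe@campus.example.edu,https://sharepoint.example.edu/sites/demo
```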

Set-PSDebug -Strict
add-pssnapin microsoft.sharepoint.powershell -erroraction 0

# Select Options
Write-Host -ForegroundColor Yellow "'Document' will create a CSV dump of users to convert. 'Convert' will use the data in the CSV to perform the migrations."
Write-Host -ForegroundColor Cyan "1. Document"
Write-Host -ForegroundColor Cyan "2. Convert"
Write-Host -ForegroundColor Cyan " "
[int]$Choice = Read-Host "Select an option 1-2: "
switch ($Choice) {
    1 {[bool]$convert = $false}
    2 {[bool]$convert = $true}
    default {Write-Host "Invalid selection! Exiting... "; exit}
}
Write-Host ""

$objCSV = @()
[string]$csvPath = Read-Host "Please enter the path to save the .csv file to. (Ex. C:\migration)"
if ((Test-Path -LiteralPath $csvPath) -eq $false) {
	Write-Host "Invalid path specified! Exiting..."; exit
}

if ($convert -eq $true) {
	$objCSV = Import-CSV "$csvPath\MigrateUsers.csv"

    foreach ($object in $objCSV) {
        $user = Get-SPUser -Identity $object.OldLogin -Web $object.SiteCollection
        write-host "Moving user:" $user "to:" $object.NewLogin "in site:" $object.SiteCollection
        move-spuser -Identity $user -NewAlias $object.NewLogin -IgnoreSID -Confirm:$false
    }
} else {
	[string]$oldprovider = Read-Host "Enter the Old Provider Name (Example -> Domain\ or i:0#.f|MembershipProvider|) "
    [string]$newprovider = Read-Host "Enter the New User Provider Name (Example -> Domain\ or i:0e.t|MembershipProvider|) "
	[string]$newsuffix = Read-Host "Enter the UPN suffix for the new provider, if desired (Example -> "
	[string]$newGroupProvider = Read-Host "Enter the New Group Provider Name (Example -> Domain\ or c:0-.t|MembershipProvider|\) "

    # Select Options
    Write-Host -ForegroundColor Yellow "Choose the scope of the migration - Farm, Web App, or Site Collection"
    Write-Host -ForegroundColor Cyan "1. Entire Farm"
    Write-Host -ForegroundColor Cyan "2. Web Application"
    Write-Host -ForegroundColor Cyan "3. Site Collection"
    Write-Host -ForegroundColor Cyan " "
    [int]$scopeChoice = Read-Host "Select an option 1-3: "
    switch ($scopeChoice) {
        1 {[string]$scope = "Farm"}
        2 {[string]$scope = "WebApp"}
        3 {[string]$scope = "SiteColl"}
        default {Write-Host "Invalid selection! Exiting... "; exit}
    }
    Write-Host ""

    if($scope -eq "Farm") {
        $sites = @()
        $sites = get-spsite -Limit All
    } elseif($scope -eq "WebApp") {
        $url = Read-Host "Enter the Url of the Web Application: "
        $sites = @()
        $sites = get-spsite -WebApplication $url -Limit All
    } elseif($scope -eq "SiteColl") {
        $url = Read-Host "Enter the Url of the Site Collection: "
        $sites = @()
        $sites = get-spsite $url
    }

    foreach($site in $sites) {
		$webs = @() #needed to prevent the next foreach from attempting to loop a non-array variable
        $webs = $site.AllWebs

        foreach($web in $webs) {
            # Get all of the users in a site:
			$users = @()
            $users = get-spuser -web $web -Limit All #added "-Limit All" since some webs may have large user lists.

            # Loop through each of the users in the site:
            foreach($user in $users) {
                # Capture the user's display name and login:
                $displayname = $user.DisplayName
                $userlogin = $user.UserLogin

                if(($userlogin -like "$oldprovider*") -and ($objCSV.OldLogin -notcontains $userlogin)) {
                    # Separate the user name from the domain/membership provider:
                    if ($userlogin.Contains("|")) {
                        $a = $userlogin.split("|")
                        $username = $a[1]
                        if ($username.Contains("\")) {
                            $a = $username.split("\")
                            $username = $a[1]
                        }
                    } else {
                        $a = $userlogin.split("\")
                        $username = $a[1]
                    }

                    # Create the new username based on the given input:
					if ($user.IsDomainGroup) {
						[string]$newalias = $newGroupProvider + $username
					} else {
						[string]$newalias = $newprovider + $username + $newsuffix
					}

                    $objUser = "" | select OldLogin,NewLogin,SiteCollection
	                $objUser.OldLogin = $userLogin
                    $objUser.NewLogin = $newAlias
	                $objUser.SiteCollection = $site.Url

	                $objCSV += $objUser
                }
            }
        }
    }

    $objCSV | Export-Csv "$csvPath\MigrateUsers.csv" -NoTypeInformation -Force
}

Bulk-modification of deployment deadlines in SCCM

A full two years into testing, we finally are moving forward with production deployment of our Configuration Manager 2012 (SCCM) environment. Last month we (recklessly?) migrated 1000 workstations into the environment. While the deployment was a technological success, it was a bit of a black eye in the PR department.

Client computers almost uniformly did an unplanned reboot one hour after the SCCM agent got installed on them. In addition, many clients experienced multiple reboot requests over the following days. Many clients reported that they did not get the planned 90-minute impending-reboot warning, only the 15-minute countdown timer.

Lots of changes were required to address this situation:

See the “Suppress any required client restarts” setting documented here:

This one was causing clients to reboot following upgrade of their existing Forefront Endpoint Protection client to SCEP. That explained the unexpected 60-minute post-install reboot.

Next, we decided to change the reboot-after deadline grace period from 90 minutes to 9 hours, with the final warning now set to one hour, up from 15 minutes. This should allow people to complete work tasks without having updates interrupt their work day.

Finally, we are planning to reset the deployment deadline for all existing software update deployments to a time several days out from the initial client installation time. Since we have several dozen existing software update group deployments, we need a programmatic approach to completing this task. The key to this was found here:

Thanks to Nickolaj Andersen for posting this valuable script.

It did take me a bit of time to decode what Nickolaj was doing with his script (I was not already familiar with the Date/Time format generally used in WMI). I modified the code to set existing update group deployments to a fixed date and time provided by input parameters. I also added some in-line documentation to the script, and added a few more input validation checks:

# Set-CMDeploymentDeadlines script
#   J. Greg Mackinnon, 2014-02-07
#   Updates all existing software update deployments with a new enforcement deadline.
#   Requires specification of: 
#    -SiteServer (an SCCM Site Server name)
#    -SiteCode   (an SCCM Site Code)
#    -DeadlineDate
#    -DeadlineTime


[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [string] $SiteServer,

    [Parameter(Mandatory=$true)]
    [string] $SiteCode,

    [Parameter(Mandatory=$true)]
    [ValidateScript({
        if (($_.Length -eq 8) -and ($_ -notmatch '[a-zA-Z]+')) {
            $true
        } else {
            Throw '-DeadlineDate must be a date string in the format "YYYYMMDD"'
        }
    })]
    [string] $DeadlineDate,

    [Parameter(Mandatory=$true)]
    [ValidateScript({
        if (($_.Length -eq 4) -and ($_ -notmatch '[a-zA-Z]+')) {
            $true
        } else {
            Throw '-DeadlineTime must be a time string in the format "HHMM", using 24-hour syntax'
        }
    })]
    [string] $DeadlineTime
)

Set-PSDebug -Strict

# WMI date format is required here.  See:
# This is the "UTC Date-Time Format", sometimes called "dtm Format", and referenced in .NET as "dmtfDateTime".
# The grouping of six zeros represents milliseconds.  The last cluster of MMM is the minute offset from GMT.
# Wildcards can be used for parts of the date that are not specified.  In this case, we will not specify
# the GMT offset, thus using "local time".

# Build new deadline date in WMI Date format:
[string] $newDeadline = $DeadlineDate + $DeadlineTime + '00.000000+***'
Write-Verbose "Time to be sent to the Deployment Object: $newDeadline"

# Get all current Software Update Group Deployments.
# Note: We use the WMI class "SMS_UpdateGroupAssignment", documented here:
# Shares many properties with "SMS_CIAssignmentBaseClass", documented here:
$ADRClientDeployment = @()
$ADRClientDeployment = Get-WmiObject -Namespace "root\sms\site_$($SiteCode)" -Class SMS_UpdateGroupAssignment -ComputerName $SiteServer

# Loop through the assignments setting the new EnforcementDeadline, 
# and commit the change with the Put() method common to all WMI Classes:
foreach ($Deployment in $ADRClientDeployment) {

    $DeploymentName = $Deployment.AssignmentName

    Write-Output "Deployment to be modified: `n$($DeploymentName)"
    try {
        $Deployment.EnforcementDeadline = "$newDeadline"
        $Deployment.Put() | Out-Null
        if ($?) {
            Write-Output "`nSuccessfully modified deployment`n"
        }
    }
    catch {
        Write-Output "`nERROR: $($_.Exception.Message)"
    }
}
We could additionally push out the deployment time for application updates using the “SMS_ApplicationAssignment” WMI class:

In this case, we would want to change the “UpdateDeadline” property, since we do not set a “deployment deadline” for these updates, but instead are using application supersedence rules to control when the updates are deployed.
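A sketch of that variant follows; it reuses the $SiteServer, $SiteCode, and $newDeadline values built in the script above, and I have not run it in production:

```powershell
# Sketch: push out the UpdateDeadline on application assignments.
# Assumes $SiteServer, $SiteCode, and $newDeadline are defined as in the script above.
$appAssignments = Get-WmiObject -Namespace "root\sms\site_$($SiteCode)" `
    -Class SMS_ApplicationAssignment -ComputerName $SiteServer
foreach ($assignment in $appAssignments) {
    # Only touch assignments that actually carry an update deadline:
    if ($assignment.UpdateDeadline) {
        $assignment.UpdateDeadline = $newDeadline
        $assignment.Put() | Out-Null
    }
}
```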

Automated Driver Import in MDT 2013

As a follow-up to my previous post, I also have developed a script to automate the import of drivers into MDT 2013.  This PowerShell script takes a source folder structure and duplicates the top two levels of folders in the MDT Deployment Share “Out-of-Box Drivers” branch.  The script then imports all drivers found in the source directories to the matching folders in MDT.

All we have to do is extract all drivers for a given computer model into an appropriately named folder in the source directory, and then run the script.
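For illustration, a source tree might look like this (the folder names are hypothetical; each first-level folder becomes a top-level folder under “Out-Of-Box Drivers”, and the drivers under each second-level folder are imported into a matching subfolder):

```
E:\staging\drivers\import\
    Win8\
        XPS 12-9Q33\
        Latitude 10 - ST2\
    Win7\
        Latitude E5430\
```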

#  Create-MDTDriverStructure.ps1
#  J. Greg Mackinnon, University of Vermont, 2013-11-05
#  Creates a folder structure in the "Out of Box Drivers" branch of a MDT 2013
#    deployment share.  The structure matches the first two subdirectories of 
#    the source filesystem defined in $srcRoot.  All drivers contained within
#    $srcRoot are imported into the deployment share.
#  Requires: 
#    $srcDir - A driver source directory, 
#    $MDTRoot - a MDT 2013 deployment share
#    - MDT 2013 must be installed in the path noted in $modDir!!!

# Define source driver directories:
[string] $srcRoot = 'E:\staging\drivers\import'
[string[]] $sources = gci -Attributes D $srcRoot | `
    Select-Object -Property Name | % {$_.Name}
# Initialize MDT Working Environment:
[string] $MDTRoot = 'E:\DevRoot'
[string] $PSDriveName = 'DS100'
[string] $oobRoot = $PSDriveName + ":\Out-Of-Box Drivers"
[string] $modDir = `
	'C:\Program Files\Microsoft Deployment Toolkit\Bin\MicrosoftDeploymentToolkit.psd1'
Import-Module $modDir
New-PSDrive -Name "$PSDriveName" -PSProvider MDTProvider -Root $MDTRoot

foreach ($source in $sources){
    Write-Host "Working with source: " $source -ForegroundColor Magenta
    # Create the OOB Top-level folders:
    new-item -path $oobRoot -name $source -itemType "directory" -Verbose
    # Define a variable for the current working directory:
    $sub1 = $srcRoot + "\" + $source
    # Create an array containing the folders to be imported:
    $items = gci -Attributes D $sub1 | Select-Object -Property Name | % {$_.Name}
    $oobDir = $oobRoot + "\" + $source

    foreach ($item in $items) {
		# Define source and target directories for driver import:
	    [string] $dstDir = $oobDir + "\" + $item
	    [string] $srcDir = $sub1 + "\" + $item
	    # Clean up "cruft" files that lead to duplicate drivers in the share:
		Write-Host "Processing $item" -ForeGroundColor Green
	    Write-Host "Cleaning extraneous files..." -ForegroundColor Cyan
        $delItems = gci -recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "Found " $delItems.count " files to delete..." -ForegroundColor Yellow
	    $delItems | remove-Item -force -confirm:$false
        $delItems = gci -recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "New count for extraneous files: " $delItems.count -ForegroundColor Yellow

	    # Create the target directory:
		Write-Host "Creating $item folder" -ForegroundColor Cyan
	    new-item -path $oobDir -name $item -itemType "directory" -Verbose
	    # Import all drivers from the source to the new target:
		Write-Host "Importing Drivers for $item" -ForegroundColor Cyan
	    Import-MDTDriver -Path $dstDir -SourcePath $srcDir 
        Write-Host "Moving to next directory..." -ForegroundColor Green
    } # End ForEach Item
} # End ForEach Source

Remove-PSDrive -Name "$PSDriveName"


Rethinking Driver Management in MDT 2013

We have been using the Microsoft Deployment Toolkit (MDT) in LTI/Lite Touch mode here at the University for a long time now.  Why, we used it to deploy XP back when MDT was still called the Business Desktop Deployment Solution Accelerator (BDD).  In that time, we have gone through several different driver management methods.  Gone are the nightmare days of dealing with OEMSETUP files, $OEM$ directories, and elaborate “DriverPack” injection scripts for XP (thank goodness).

With the release of Vista, we moved to a PnP free-for-all model of driver detection.  After Windows 8.0, we found we really needed to separate our drivers by operating system, so we created Win7, Win8, and WinPE driver selection profiles.

But now we are finding that driver sprawl is becoming a major issue again.  On many new systems we run through a seemingly successful deployment, but end up with a non-responsive touch screen, a buggy track pad, and (sometimes) a very unstable system.

Starting this week, we are trying a new hybrid driver management approach.  We will create a driver folder for each computer model sold through our computer depot.  I have developed a custom bit of VBScript to check whether the hardware being deployed to is a known model.  Driver injection will be restricted to this model if a match is found.  The script contains additional logic to detect support for both Windows 7 and Windows 8 variants, and to select the most current drivers detected.  Unknown models will fall back on the PnP free-for-all detection method.

Here is how it works…

  1. Create a new group in your OS deployment task sequence named “Custom Driver Inject”, or something similar.  Grouping all actions together will allow easier transfer of these custom actions to other Task Sequences:
  2. Under this new group, add a new action of type “Set Task Sequence Variable”.  Name the variable “TargetOS”, and set the value to the OS that you are deploying from this task sequence.  You must follow the same naming convention that you use in your Out-of-Box driver folder.  I use Win(X), where (X) is the major OS version of the drivers in the folder.  In this example, I have chosen “Win8”:
  3. Add an action of type “Run Command Line”.  Name this action “Supported Model Check”.  Under the Command line field, enter “cscript "%SCRIPTROOT%\ZUVMCheckModel.wsf"”.  (We will import this script into the deployment share later on.)
  4. Add a sub-group named “Supported Model Actions”.  Under the “Options” tab, add a condition of type “Task Sequence Variable”.  Use the variable “SupportedModel”, the Condition “equals”, and the Value “YES”.  (The SupportedModel variable gets set by the CheckModel script run in the previous step.):
  5. Under this new group, add a new action of type “Set Task Sequence Variable”.  Name this task “Set Variable DriverGroup002”.  Under “Task Sequence Variable”, set “DriverGroup002”, and set the value to “Models\%TargetOS%\%Model%”.  (Note:  You could use “DriverGroup001”, but I already am using this variable to hold a default group of drivers that I want added to all systems.  The value “%TargetOS%\%Model%” defines the path to the driver group in the deployment share.  If you use a different folder structure, you will need to modify this path.):
  6. Create a new task of type “Inject Drivers”.  Name this task “Inject Model-Specific Drivers”.  For the selection profile, select “Nothing”.  Be sure to select “Install all drivers from the selection profile”.  (NOTE: The dialog implies that we will be injecting only divers from a selection profile.  In fact, this step will inject drivers from any paths defined in any present “DriverGroupXXX” Task Sequence variables.)
  7. Now, under our original Custom Driver Inject group, add a new task of type “Inject Drivers”.  Choose from the selection profile “All Drivers”, or use a different fallback selection profile that suits the needs of your task sequence.  This time, select “Install only matching drivers from the selection profile”:
    Under the “Options” tab, add a condition where the “Task Sequence Variable” named “SupportedModel” equals “NO”:
    This step will handle injection of matching drivers into hardware models for which we do not have a pre-defined driver group.
  8. Optionally, you now can open the “CustomSettings.ini” file and add the following to your “Default” section:
    (I have a “Peripherals” driver group configured which contains USB Ethernet drivers used in our environment.  These are a necessity when deploying to hardware that does not have an embedded Ethernet port, such as the Dell XPS 12 and XPS 13.  You also could add common peripherals with complicated drivers such as a DisplayLink docking station or a Dell touch screen monitor.)
  9. Add the “ZUVMCheckModel.wsf” script to the “Scripts” folder of your deployment share.  The code for this script is included below.  I think the script should be fairly easy to adapt for your environment.
  10. Finally, structure your “Out-of-Box Drivers” folder to contain a “Models” folder, and a single folder for each matching hardware model in your environment.  I get most of our driver collections from Dell:
    (NOTE:  Thanks Dell!)
    The real challenge of maintaining this tree is in getting the model names right.  Use “wmic computersystem get model” to discover the model string for any new systems in your environment.  A table of a few current models I have been working with is included below.
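The CustomSettings.ini lines for step 8 did not survive the paste into WordPress. A plausible reconstruction, assuming the “Peripherals” driver group described above (adjust the group name to match your own deployment share):

```ini
[Default]
; Always inject the default peripheral drivers (USB Ethernet adapters, docks, etc.):
DriverGroup001=Peripherals
```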

Dell Marketing Model Name to WMI Name Translator Table:

  • Dell XPS 12 (first generation) – “XPS 12 9Q23”
  • Dell XPS 12 (second generation) – “XPS 12-9Q33”
  • Dell XPS 13 (first generation) – “Dell System XPS L321X”
  • Dell XPS 13 (second generation) – “Dell System XPS L322X”
  • Dell XPS 14 – “XPS L421Q”
  • Dell Latitude 10 – “Latitude 10 – ST2”
  • VMware Virtual Machine – “VMware Virtual Platform”
  • Microsoft Hyper-V Virtual Machine – “Virtual Machine”

A fun nuance we encountered last week was a Latitude E5430 model that contained “no-vPro” after the model number. Dell does not provide separate driver CABs for vPro/non-vPro models, so I added a regular expression test for Latitudes, and strip any cruft after the model number. One more problem down…
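That cleanup can be sketched in PowerShell (a hypothetical port for illustration only; the production logic is the VBScript below):

```powershell
# For Latitude systems, keep only "Latitude" plus the model number, and drop any
# trailing cruft (such as the "no-vPro" suffix) so the name matches the driver folder:
$model = 'Latitude E5430 no-vPro'
if ($model -match '^(Latitude\s+\S+)') {
    $model = $Matches[1]
}
$model    # "Latitude E5430"
```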

The following site contains a list of older model name translations:
As you can see, most Latitudes and Optiplexes follow sane and predictable model name conventions. I wish the same were true for the XPS.

Finally, I am indebted to the following sources for their generously detailed posts on driver management. Without their help, I doubt I would have been able to make this solution fly:

Jeff Hughes of the Windows Enterprise Support Server Core Team:

Andrew Barnes (aka Scriptimus Prime), whose posts on MDT driver management give the basics DriverGroups and model selection:
AND, of automating driver import into MDT (written for MDT 2012… some changes required for 2013):

The incredible Michael Niehaus, who in this post discusses the use of DriverGroups and Selection Profiles:

And finally Eric Schloss of the “Admin Nexus”, who gave me the idea of developing a fallback for systems that do not match a known model. It was this key bit of smarts that gave me the confidence to move forward with a model-specific driver grouping strategy:

ZUVMCheckModel.wsf script:

(NOTE: WordPress stripped off the WSF headers and footers from my script. These are the first three and last two lines in the script. If you copy from this post, please remember to place greater than and less than tags around these lines before running, as indicated in the comments.)
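Assembled, with the stripped header and footer lines restored as tags, the file skeleton should look like this:

```xml
<job id="ZUVMCheckModel">
<script language="VBScript" src="ZTIUtility.vbs"/>
<script language="VBScript">
' ...script body (Option Explicit onward) goes here...
</script>
</job>
```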

' Uncomment and wrap each of the following three lines in less than/greater than characters to convert them to tags.
'job id="ZUVMCheckModel"
'script language="VBScript" src="ZTIUtility.vbs"/
'script language="VBScript"

Option Explicit

'// Main Class
Class ZUVMCheckModel
	'//  Constructor to initialize needed global objects
	Private Sub Class_Initialize
	End Sub
	'// Main routine

	Function Main()
	' //*******************************************************
	' //
	' // File: ZUVMCheckModel.wsf
	' //
	' // Purpose: Checks the model of this system against
	' //          a list of known machine models.  Returns
	' //          TRUE if a matching model is detected.
	' //
	' // Usage: cscript ZUVMCheckModel.wsf /Model: [/debug:true]
	' //
	' //*******************************************************
	'Use the following lines for debugging only.
	'oEnvironment.Item("TargetOS") = "Win7"
	'oEnvironment.item("DeployRoot") = "c:\local\mdt"
	'oEnvironment.Item("Model") = "Latitude E6500 some annoying variation"
	'End debug Params

	  Dim aModels()          'Array of models taken from DriverGroups.xml
	  Dim bOldDrivers        'Boolean indicating drivers present for an older OS version
	  Dim i                  'Generic integer for looping
	  Dim j                  'Generic integer for looping
	  Dim iRetVal            'Return code variable
	  Dim iMaxOS             'Integer representing the highest matching OS driver store
	  Dim oRegEx
	  Dim oMatch
	  Dim match
	  Dim oXMLDoc            'XML Document Object, for reading DriverGroups.xml
	  Dim Root,NodeList,Elem 'Objects in support of oXMLdoc
	  Dim sDGPath            'Path to DriverGroups.xml file
	  Dim sInitModel         'String representing the initial value of
	                         '   oEnvironment.Item("Model")
	  Dim sItem	             'Item in aModels array.
	  Dim sMaxOS             'OS Name of highest matching OS driver store
	  Dim sOSFound           'OS Name for a given matching driver set.
	  oLogging.CreateEntry "Begin ZUVMCheckModel...", LogTypeInfo
	  'Set the default values:
	  oEnvironment.Item("SupportedModel") = "NO"
	  iMaxOS = CInt(Right(oEnvironment.Item("TargetOS"),1))
	  'wscript.echo "Default value for iMaxOS = " & iMaxOS
	  bOldDrivers = false
	  sInitModel = oEnvironment.Item("Model")
	  'wscript.echo "sInitModel value = " & sInitModel
	  Set oRegEx = New RegExp
	  oRegEx.Global = True
	  oRegEx.IgnoreCase = True
	  'Modify the detected model name to handle known variations:
	  oRegEx.pattern = "Latitude"
	  if oRegEx.test(sInitModel) then
		oLogging.CreateEntry "Model is a Latitude.  Cleaning up the model name...", LogTypeInfo
		oRegEx.pattern = " "
		set oMatch = oRegEx.Execute(sInitModel)
		'wscript.echo "oMatch Count is: " & oMatch.count
		if oMatch.Count > 1 then
			i = oMatch.item(1).FirstIndex
			oEnvironment.Item("Model") = Left(sInitModel,i)
			'wscript.echo """"&oEnvironment.Item("Model")&""""
		end if
	  end if

	  'Check for DriverGroups.xml file, which will contain the supported model list:
	  iRetVal = Failure
	  iRetVal = oUtility.FindFile("DriverGroups.xml", sDGPath)
	  if iRetVal <> Success then
		oLogging.CreateEntry "DriverGroups file not found. ", LogTypeError
		exit function
	  end if 
	  oLogging.CreateEntry "Path to DriverGroups.xml: " & sDGPath, LogTypeInfo
	  'Parse the DriverGroups.xml file:
	  oLogging.CreateEntry "Parsing DriverGroups.xml...", LogTypeInfo
	  Set oXMLDoc = CreateObject("Msxml2.DOMDocument") 
	  oXMLDoc.setProperty "SelectionLanguage", "XPath"
	  Set Root = oXMLDoc.documentElement 
	  Set NodeList = Root.getElementsByTagName("Name")
	  oLogging.CreateEntry "NodeList Member Count is: " & NodeList.length, LogTypeInfo
	  'oLogging.CreateEntry "NodeList.Length variant type is: " & TypeName(NodeList.Length), LogTypeInfo
	  i = CInt(NodeList.length) - 1
	  ReDim aModels(i) 'Resize aModels to hold all matching DriverGroup items.
	  'oLogging.CreateEntry "List of Available Driver Groups:", LogTypeInfo
	  i = 0
	  For Each Elem In NodeList
		if InStr(Elem.Text,"Models\") then
			aModels(i) = Mid(Elem.Text,8)	'Add text after "Models\"
			'oLogging.CreateEntry aModels(i), LogTypeInfo
			i = i + 1
		end if
	  Next
	  oLogging.CreateEntry "End Parsing DriverGroups.xml.", LogTypeInfo

	  'Loop through the list of supported models to find a match:
	  oLogging.CreateEntry "Checking discovered driver groups for match to: " & oenvironment.Item("Model"), LogTypeInfo
	  For Each sItem in aModels
		oLogging.CreateEntry "Checking Driver Group: " & sItem, LogTypeInfo
		i = InStr(1, sItem, oEnvironment.Item("Model"), vbTextCompare)

		'wscript.echo TypeName(i) 'i is a "Long" number type.
		If i <> 0 Then
			oLogging.CreateEntry "Matching Model found.", LogTypeInfo
			j = InStr(sItem,"\")
			sOSFound = Left(sItem,j-1)
			'wscript.echo "sOSFound = " & sOSFound 
			if (InStr(1,sOSFound,oEnvironment.Item("TargetOS"),vbTextCompare) <> 0) then
				oLogging.CreateEntry "Drivers matching the requested OS are available.  Exiting with success.", LogTypeInfo
				oEnvironment.Item("SupportedModel") = "YES"
				iRetVal = Success
				Main = iRetVal
				Exit Function
			end if
			if iMaxOS > CInt(Right(sOSFound,1)) then
				iMaxOS = CInt(Right(sOSFound,1))
				'wscript.echo "iMaxOS = " & iMaxOS
				sMaxOS = sOSFound
				bOldDrivers = true
				'wscript.echo "sMaxOS = " & sMaxOS
			end if
		End If
	  Next
	  If bOldDrivers Then 'Run if sMaxOS is defined... set a boolean when this is defined and test against that?
		oLogging.CreateEntry "Model drivers were found for an OS older than the one selected...", LogTypeWarning
		oEnvironment.Item("SupportedModel") = "YES"
		oEnvironment.Item("TargetOS") = sMaxOS
	  Else
		oLogging.CreateEntry "No matching drivers were found for this model.", LogTypeInfo
	  End If
	  oLogging.CreateEntry "End ZUVMCheckModel.", LogTypeInfo

	  iRetVal = Success
	  Main = iRetVal

	End Function

End Class

' Uncomment and wrap each of the following two lines in less than/greater than characters to convert them to tags.
'/script
'/job

Improving Notifications in System Center Operations Manager 2012

Anyone who depends on System Center Operations Manager 2012 (or any earlier version of SCOM, back to MOM) likely has noticed that notifications are a bit of a weak spot in the product.

To address this, we have used the “command channel” to improve the quality of messages coming out of SCOM.  Building on the backs of giants, we implemented a script that takes an AlertID from SCOM, and generates nicely formatted email and alpha-numeric pager messages with relevant alert details.

More recently, we have identified the need to generate follow-up notifications when an initial alert does not get addressed.  I went back to our original script, and updated it to use a new, custom Alert ResolutionState (“Notified”), and I have added logic to update the Alert CustomField1 and CustomField2 with data that is useful in determining whether or not an alert should get a new notification, and how many times follow-up notifications have been sent.

Heart-felt appreciation goes out to Tao Yang for his awesome work on his “SCOMEnhancedEmailNotification.ps1” script, which served as the core for my work here.

Here is my version… I don’t have a lot of time to explain it, but hopefully the comments give you enough to go on. Apologies for the rather bad munging of quotation marks… wordpress hates me this month. If you want to use this code, search for ampersand-quot-semicolon, replace with actual quotation marks.

# AUTHOR:	J. Greg Mackinnon, Adapted from 1.1 release by Tao Yang 
# DATE:		2013-05-21
# Name:		SCOMEnhancedEmailNotification.PS1
# Version:	3.0
# COMMENT:	SCOM Enhanced Email notification which includes detailed alert information
# Update:	2.0 - 2012-06-30	- Major revision for compatibility with SCOM 2012
#								- Cmdlets updated to use 2012 names
#								- "Notified" Resolution Status logic removed
#								- Snapin Loading and PSDrive Mappings removed (replaced with Module load)
#								- HTML Email reformatted for readability
#								- Added '-format' parameter to allow for alphanumeric pager support
#								- Added '-diag' boolean parameter to create options AlertID-based diagnostic logs
# Update:   2.2 - 2013-05-16    - Added logic to update "CustomField1" alert data to reflect that notification has been sent for new alerts.
#								- Added logic to update "CustomField2" alert data to reflect the repeat count for new alert notification sends.
#								- Added support for specifying alerts with resolution state "acknowledged"
#                               - Did some minor adjustments to improve execution time and reduce memory overhead.
# Update:	3.0 - 2013-05-20	- Updated to reduce the volume of PowerShell instances spawned by SCOM.  Added "mailTo" and "pageTo" parameters to allow sending of both short
#                                         and long messages from a single script instance.
#								- Converted portions of script to subroutine-like functions to allow repetition (buildHeaders, buildPage, buildMail)
#								- Restored "Notified" resolution state logic.
#								- Renamed several variables for my own sanity.
#								- Added article lookup updates from Tao Yang 2.0 script.
# Usage:	.\SCOMEnhancedEmailNotification.ps1 -alertID xxxxx -mailTo @('John Doe;','Richard Roe;') -pageTo @('Team Pager;')
#In OpsMgr 2012, the AlertID parameter passed in is '$Data/Context/DataItem/AlertId$' (single quote)
#Quotation marks are required otherwise the AlertID parameter will not be treated as a string.
param(
	[string]$alertID = $(throw 'A valid, quote-delimited, SCOM AlertID must be provided for -AlertID.'),
	[string[]]$mailTo,
	[string[]]$pageTo,
	[bool]$diag = $false
)
Set-PSDebug -Strict

#### Setup Error Handling: ####
#$erroractionpreference = "SilentlyContinue"
$erroractionpreference = "Inquire"

#### Setup local option variables: ####
## Logging: 
#Remove '$alertID' from the following two log file names to prevent the drive from filling up with diag logs:
$errorLogFile = 'C:\local\logs\SCOMNotifyErr-' + $alertID + '.log'
$diagLogFile = 'C:\local\logs\SCOMNotifyDiag-' + $alertID + '.log'
#$errorLogFile = 'C:\local\logs\SCOMNotifyErr.log'
#$diagLogFile = 'C:\local\logs\SCOMNotifyDiag.log'
## Mail: 
$SMTPHost = ""
$SMTPPort = 25
$Sender = New-Object System.Net.Mail.MailAddress("", "Lifeboat OpsMgr Notification")
#If an error occurs while executing the script, this is the recipient for the error notification email.
$ErrRecipient = New-Object System.Net.Mail.MailAddress("", "SAA Windows Administration Team")
##Set Culture Info (for knowledgebase article language selection):
$cultureInfo = [System.Globalization.CultureInfo]'en-US'
##Get the FQDN of the local computer (where the script is run)...
$RMS = $env:computername

#### Initialize Global Variables and Objects: ####
## Mail Message Object:
[string] $threadID = ''
$SMTPClient = New-Object System.Net.Mail.smtpClient
$ = $SMTPHost
$SMTPClient.port = $SMTPPort
##Load SCOM PS Module
if ((get-module | ? {$ -eq 'OperationsManager'}) -eq $null) {
	Import-Module OperationsManager -ErrorAction SilentlyContinue -ErrorVariable Err | Out-Null
}
## Management Group Object:
$mg = get-SCOMManagementGroup
##Get Web Console URL
$WebConsoleBaseURL = (get-scomwebaddresssetting | Select-Object -Property WebConsoleUrl).webconsoleurl
#### End Initialize ####

#### Begin Parse Input Parameters: ####
##Get recipients names and email addresses from "-to" array parameter: ##
if ((!$mailTo) -and (!$pageTo)) {
	write-host "An array of name/email address pairs must be provided in either the -mailTo or -pageTo parameter, in the format `@(`'me;`',`'you;`')"
	exit
}
$mailRecips = @()
Foreach ($item in $mailTo) {
	$to = New-Object psobject
	$name = ($item.split(";"))[0]
	$email = ($item.split(";"))[1]
	Add-Member -InputObject $to -MemberType NoteProperty -Name Name -Value $name
	Add-Member -InputObject $to -MemberType NoteProperty -Name Email -Value $email
	$mailRecips += $to
	Remove-Variable to
	Remove-Variable name
	Remove-Variable email
}
$pageRecips = @()
Foreach ($item in $pageTo) {
	$to = New-Object psobject
	$name = ($item.split(";"))[0]
	$email = ($item.split(";"))[1]
	Add-Member -InputObject $to -MemberType NoteProperty -Name Name -Value $name
	Add-Member -InputObject $to -MemberType NoteProperty -Name Email -Value $email
	$pageRecips += $to
	Remove-Variable to
	Remove-Variable name
	Remove-Variable email
}
if ($diag -eq $true) {
	[string] $("mailRecipients:") | Out-File $diagLogFile -Append 
	$mailRecips | Out-File $diagLogFile -Append
	[string] $("pageRecipients:") | Out-File $diagLogFile -Append 
	$pageRecips | Out-File $diagLogFile -Append
}
## Parse "-AlertID" input parameter: ##
$alertID = $alertID.toString()
#remove "{" and "}" around the $alertID if exist
if ($alertID.substring(0,1) -match "{") {
	$alertID = $alertID.substring(1, ( $alertID.length -1 ))
}
if ($alertID.substring(($alertID.length -1), 1) -match "}") {
	$alertID = $alertID.substring(0, ( $alertID.length -1 ))
}
#### End Parse input parameters ####

#### Function Library: ####
function getResStateName($resStateNumber){
	[string] $resStateName = $(get-ScomAlertResolutionState -resolutionStateCode $resStateNumber).name
	return $resStateName
}
function setResStateColor($resStateNumber) {
	switch ($resStateNumber) {
		"0" { $sevColor = "FF0000" }	#Color is Red
		"1" { $sevColor = "FF0000" }	#Color is Red
		"255" { $sevColor = "3300CC" }	#Color is Blue
		default { $sevColor = "FFFF00" }	#Color is Yellow
	}
	return $sevColor
}
function stripCruft($cruft) {
	#Removes "cruft" data from messages. 
	#Intended to make subject lines and alphanumeric pages easier to read
	$cruft = $cruft.replace("®","")
	$cruft = $cruft.replace("(R)","")
	$cruft = $cruft.replace("Microsoftr ","")
	$cruft = $cruft.replace("Microsoft ","")
	$cruft = $cruft.replace("Microsoft.","")
	$cruft = $cruft.replace("Windows ","")
	$cruft = $cruft.replace(" without Hyper-V","")
	$cruft = $cruft.replace("Serverr","Server")
	$cruft = $cruft.replace(" Standard","")
	$cruft = $cruft.replace(" Enterprise","")
	$cruft = $cruft.replace(" Edition","")
	$cruft = $cruft.replace(".campus","")
	$cruft = $cruft.replace(".CAMPUS","")	
	$cruft = $cruft.replace("","")
	$cruft = $cruft.replace(".AD.UVM.EDU","")
	$cruft = $cruft.trim()
	return $cruft
}
function fnMamlToHTML($MAMLText){
	$HTMLText = "";
	$HTMLText = $MAMLText -replace ('xmlns:maml=""');
	$HTMLText = $HTMLText -replace ("maml:para", "p");
	$HTMLText = $HTMLText -replace ("maml:");
	$HTMLText = $HTMLText -replace ("</section>");
	$HTMLText = $HTMLText -replace ("<section>");
	$HTMLText = $HTMLText -replace ("<title>", "<h3>");
	$HTMLText = $HTMLText -replace ("</title>", "</h3>");
	$HTMLText = $HTMLText -replace ("", "<li>");
	$HTMLText = $HTMLText -replace ("", "</li>");
	return $HTMLText
}
function fnTrimHTML($HTMLText){
	$TrimedText = "";
	$TrimedText = $HTMLText -replace ("&lt;", "")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("")
	$TrimedText = $TrimedText -replace ("<h1>", "<h3>")
	$TrimedText = $TrimedText -replace ("</h1>", "</h3>")
	$TrimedText = $TrimedText -replace ("<h2>", "<h3>")
	$TrimedText = $TrimedText -replace ("</h2>", "</h3>")
	$TrimedText = $TrimedText -replace ("<H1>", "<h3>")
	$TrimedText = $TrimedText -replace ("</H1>", "</h3>")
	$TrimedText = $TrimedText -replace ("<H2>", "<h3>")
	$TrimedText = $TrimedText -replace ("</H2>", "</h3>")
	return $TrimedText
}
function buildEmail {
	## Format the message for full-HTML email
	[string] $escTxt = ""
	if ($resState -eq '1') {$escTxt = '- Repeat Count ' + $escLev.ToString()}
	[string] $script:mailSubj = "SCOM - $resStateName $escTxt - $alertSev | $moPath | $alertName"
	$mailSubj = stripCruft($mailSubj)
	[string] $script:mailErrSubj = "Error emailing SCOM Notification for Alert ID $alertID"
	[string] $webConsoleURL = $WebConsoleBaseURL+"?DisplayMode=Pivot&AlertID=%7b$alertID%7d"
	[string] $psCmd = "Get-SCOMAlert -Id `"$alertID`" | format-list *"
	# Format the Mail Message Body (do not indent this block!)
	$script:MailMessage.isBodyHtml = $true
	$script:mailBody = @"
<p><b>Alert Resolution State:<Font color='$sevColor'> $resStateName </Font></b><br />
<b>Alert Severity:<Font color='$sevColor'> $alertSev</Font></b><br />
<b>Object Source (Display Name):</b> $moSource <br />
<b>Object Path:</b> $moPath <br />
<p><b>Alert Name:</b> $alertName <br />
<b>Alert Description:</b> <br />
$alertDesc <br>
"@
	if (($resState -eq 0) -or ($resState -eq 1)) {
		if ($isMonitorAlert -eq $true) {
			$script:mailBody = $mailBody + @"
<b>Alert Monitor Name:</b> $MonitorName <br />
<b>Alert Monitor Description:</b> $MonitorDescription
"@
		} elseif ($isMonitorAlert -eq $false) {
			$script:mailBody = $mailBody + @"
<b>Alert Rule Name:</b> $RuleName <br />
<b>Alert Rule Description:</b> $RuleDescription <br />
"@
		}
	}
	$script:mailBody = $mailBody + @"
<b>Alert Context Properties:</b><br /> 
$alertCX <br />
<b>Time Raised:</b> $timeRaised <br />
<b>Alert ID:</b> $alertID <br />
<b>Notification Status:</b> $($alert.CustomField1) </br>
<b>Notification Repeat Count:</b> $($escLev.ToString()) </p>
<b>PowerShell Alert Retrieval:</b> $psCmd <br />
<b>Web Console Link:</b> <a href="$webConsoleURL">$webConsoleURL</a> </p>
"@
	if (($resState -eq 0) -or ($resState -eq 1)) {
		foreach ($article in $arrArticles) {
			$articleContent = $article.content
			$script:mailBody = $mailBody + @"
<b>Knowledge Article / Company Knowledge -$($article.Language):</b>
<p> $articleContent
"@
		}
	}
	$script:mailErrBody = @"
<p>Error occurred when executing the script located at $RMS for alert ID $alertID.
<p>Alert Resolution State: $resStateName
<p><b>**Use the below command to view the full details of this alert in the SCOM PowerShell console:</b>
<p> SCOM link:<a href="$webConsoleURL"> $webConsoleURL </a>
"@
}
function buildPage {
	## Format the message for primitive alpha-numeric pager
	$script:moPath = stripCruft($moPath)
	[string] $escTxt = ''
	if ($resState -eq '1') {$escTxt = '- Rep Count ' + $escLev.ToString()}
	[string] $script:mailSubj = "SCOM - $resStateName $escTxt | $moPath"
	[string] $script:mailErrSubj = "Error emailing SCOM Notification for Alert ID $alertID"
	#UTF8 makes the message body look like trash.  Use ASCII (the default) instead.
	#$mailMessage.BodyEncoding =  [System.Text.Encoding]::UTF8 
	$script:MailMessage.isBodyHtml = $false
	$script:moSource = stripCruft($moSource)
	$script:alertName = stripCruft($alertName)
	$script:mailBody = "| $moSource | $alertName | $timeRaised" 
	$script:mailBody = stripCruft($mailBody)
}
function buildHeaders {
	param($recips)
	## Complete the MailMessage object:
	$script:MailMessage.Sender = $Sender
	$script:MailMessage.From = $Sender
	# Regular (non-error) format
	if ($error.count -eq "0") { 				
		$script:MailMessage.Subject = $mailSubj
		Foreach ($item in $recips) {
			$to = New-Object System.Net.Mail.MailAddress($item.Email, $item.Name)
			$script:MailMessage.To.Add($to)
			Remove-Variable to
		}
		$script:MailMessage.Body = $mailBody
	}
	# Error format:
	else {									
		$script:MailMessage.Subject = $mailErrSubj
		$script:MailMessage.Body = $mailErrBody
	}
	## Log the message if in diag mode:
	if ($diag -eq $true) {
		[string] $('Mail Message Object Content:') | Out-File $diagLogFile -Append
		$mailMessage | fl * | Out-File $diagLogFile -Append
	}
}
#### End Function Library ####

#### Clean up existing logs: ####
if (Test-Path $errorLogFile) {Remove-Item $errorLogFile -Force}
if (Test-Path $diagLogFile) {Remove-Item $diagLogFile -Force}
if ($diag -eq $true) {
	[string] $("AlertID : `t" + $alertID) | Out-File $diagLogFile -Append
	[string] $("MailTo      : `t" + $mailto) | Out-File $diagLogFile -Append
	[string] $("PageTo      : `t" + $pageto) | Out-File $diagLogFile -Append
	#[string] $("Format  : `t" + $format) | Out-File $diagLogFile -Append
}

#### Begin Alert Handling: ####
## Locate the specific alert:
$alert = Get-SCOMAlert -Id $alertID
if ($diag -eq $true) {
	[string] $('SCOM Alert Object Content:') | Out-File $diagLogFile -Append
	$alert | fl | Out-File $diagLogFile -Append
}
## Read Alert Information:
[string] $alertName = $alert.Name
[string] $alertDesc = $alert.Description
#[string] $alertPN = $alert.principalName
[string] $moSource = $alert.monitoringObjectDisplayName 	# Display name is "Path" in OpsMgr Console.
[string] $moId = $alert.monitoringObjectID.tostring()
#[string] $moName = $alert.MonitoringObjectName 			# Formerly "strAgentName"
[string] $moPath = $alert.MonitoringObjectPath 				# Formerly "pathName"
#[string] $moFullName = $alert.MonitoringObjectFullName 	# Formerly "alertFullName"
[string] $ruleID = $alert.MonitoringRuleId.Tostring()
[string] $resState = ($alert.resolutionstate).ToString()
[string] $resStateName = getResStateName $resState
[string] $alertSev = $alert.Severity.ToString() 			# Formerly "severity"
if ($alertSev.ToLower() -match "error") {
	$alertSev = "Critical" 									# Rename Severity to "Critical"
}
[string] $sevColor = setResStateColor $resState				# Assign color to alert severity
#$problemID = $alert.ProblemId
$alertCx = $([xml]($alert.Context)).DataItem.Property `
	| Select-Object -Property Name,'#text' `
	| ConvertTo-Html -Fragment								# Alert Context property data, in HTML
$localTimeRaised = ($alert.timeraised).tolocaltime()
[string] $timeRaised = get-date $localTimeRaised -Format "MMM d, h:mm tt"
[bool] $isMonitorAlert = $alert.IsMonitorAlert
$escLev = 1
if ($alert.CustomField2) {
	[int] $escLev = $alert.CustomField2
}
## Lookup available Knowledge articles, if new alert:
if (($resState -eq 0) -or ($resState -eq 1)) {
	$articles = $mg.Knowledge.GetKnowledgeArticles($ruleId)
	if (!$error) {	#no point retrieving the monitoring rule when there's an error processing the alert
		#if we failed to get a knowledge article, remove the error from $error, because not every rule and monitor will have knowledge articles.
		if ($isMonitorAlert -eq $false) {
			$rule = Get-SCOMRule -Id $ruleID		
			$ruleName = $rule.DisplayName
			$ruleDescription = $rule.Description
			if ($RuleDescription.Length -lt 1) {$RuleDescription = "None"}
		} elseif ($isMonitorAlert) {
			$monitor = Get-SCOMMonitor -Id $ruleID
			$monitorName = $monitor.DisplayName
			$monitorDescription = $monitor.Description
			if ($monitorDescription.Length -lt 1) {$monitorDescription = "None"}
		}
		#Convert Knowledge articles
		$arrArticles = @()
		Foreach ($article in $articles) {
			If ($article.Visible) {
				$LanguageCode = $article.LanguageCode
				#Retrieve and format article content
				$MamlText = $null
				$HtmlText = $null
				if ($article.MamlContent -ne $null) {
					$MamlText = $article.MamlContent
					$articleContent = fnMamlToHtml($MamlText)
				}
				if ($article.HtmlContent -ne $null) {
					$HtmlText = $article.HtmlContent
					$articleContent = fnTrimHTML($HtmlText)
				}
				$objArticle = New-Object psobject
				Add-Member -InputObject $objArticle -MemberType NoteProperty -Name Content -Value $articleContent
				Add-Member -InputObject $objArticle -MemberType NoteProperty -Name Language -Value $LanguageCode
				$arrArticles += $objArticle
				Remove-Variable LanguageCode, articleContent
			}
		}
	}
	if ($Articles -eq $null) {
		$articleContent = "No resolutions were found for this alert."
	}
}
## End Knowledge Article Lookup
#### End Alert Handling ####

#### Begin Mail Processes:
if ($mailto) {
	# For all alerts, build and send the full HTML email:
	$MailMessage = New-Object System.Net.Mail.MailMessage
	buildEmail
	buildHeaders -recips $mailRecips
	invoke-command -ScriptBlock {$SMTPClient.Send($MailMessage)} -errorVariable smtpRet
}
if ($pageTo) {
	# For page-worthy alerts, format the short message and send:
	$MailMessage = New-Object System.Net.Mail.MailMessage
	buildPage
	buildHeaders -recips $pageRecips
	invoke-command -ScriptBlock {$SMTPClient.Send($MailMessage)} -errorVariable smtpRet
}
#### End Mail Message Formatting #### 

# Populate CustomField1 and 2 to indicate that a notification has been sent, with repeat count.
if (!$smtpRet) { 							# IF the message was sent (apparently)...
	[string] $updateReason = "Updated by Email notification script."
	[string] $custVal1 = "notified"
	if ($resState -eq "0") { 				# . AND IF this is a "new" alert...
		$alert.ResolutionState = 1			# ..Set the resolution state to "Notified"
		$alert.CustomField2 = $escLev		# ..Set CustomField2 to the current notification retry count (presumably 1)
		if (!$alert.CustomField1) {			# ..AND if CustomField1 is not already defined...
			$alert.CustomField1 = $custVal1	# ... Set CustomField1.
		}
	}
	elseif ($resState -eq "1") {			# .Or, if this is a "notified" alert...
		if ($alert.CustomField2) {			# ..and the notification retry count exists..
			$escLev += 1					# ...Increment it by one.
		}
		$alert.CustomField2 = $escLev
	}
	$alert.Update($updateReason)			# Commit the alert changes back to SCOM.
}

Write-Host $error
##Make sure the script is closed
if ($error.count -ne "0") {
	[string]$('AlertID string: ' + $alertID) | Out-File $errorLogFile
	[string]$('Alert Object Content: ') | Out-File $errorLogFile
	$alert | Format-List * | Out-File $errorLogFile
	[string]$('Error Object contents:') | Out-File $errorLogFile
	$Error | Out-File $errorLogFile
}
#Remove-Variable alert
#Remove-Module OperationsManager
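For reference, the script is invoked from a SCOM command notification channel configured along these lines (the paths and recipient strings here are examples, not our production values):

```
Full path to file:       C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Command line parameters: -Command "& 'C:\local\scripts\SCOMEnhancedEmailNotification.ps1' -alertID '$Data/Context/DataItem/AlertId$' -mailTo @('John Doe;')"
Startup folder:          C:\local\scripts
```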

Coping with Renamed User Accounts in SharePoint

Yesterday I received a strange error report from a person trying to create a new SharePoint site collection.  Our front line guy went to investigate, and found that she was getting a “User cannot be found” error out of SharePoint when attempting to complete the self-service site creation process.  This person reported that her last name had changed recently, along with her user ID, yet SharePoint was still showing her as logged in under her old name.

Linking the “Correlation ID” up to the diagnostic logs was of no great help.  The diagnostic logs simply reported “User cannot be found” when executing the method “Microsoft.SharePoint.SPSite.SelfServiceCreateSite”.  We are able to see that “ownerLogin”, “ownerEmail”, and “ownerName” strings were being passed to this function, but not what the values of those strings were.  I guessed that the web form was passing the person’s old account login name to the function, and that since this data was no longer valid, an error was getting displayed.  But how to fix this?

SharePoint 2010 (and WSS 3.0 before it) keeps a list of Site Users that can be accessed using the SharePoint Web “SiteUsers” property. This list is updated every time a new user logs in to the site.  The list entries contain username, login identity, email address, and security ID (SID) data.  It also appears that Site User data is not updated when user data changes in Active Directory (as long as the SID stays the same, that is).  Additional user account data is stored in XML data in the SharePoint databases, and can be accessed using the SharePoint Web “SiteUserInfoList” property.  All of this data needs to be purged from the root web site so that our hapless user can once again pass valid data to the SelfServiceCreateSite method.

Presumably the Site Management tools could be forced to get the job done, but the default views under SharePoint 2010 are hiding all site users from me, even when I log in as a site administrator.  Let’s try PowerShell instead:

add-pssnapin microsoft.sharepoint.powershell 
$root = get-spweb -identity "" 

# "Old ID" below should be all or part of the user's original login name: 
$oldAcc = $root.SiteUsers | ? {$_.userLogin -match "oldID"} 
#Let's see if we found something: 
$oldAcc

#Remove the user from the web's SiteUsers list: 
$root.SiteUsers.Remove($oldAcc.UserLogin)
#Let's see if it worked: 
$id = $oldAcc.ID 
$root = get-spweb -identity "" 
$root.SiteUsers.GetByID($id)
# (This should return a "User cannot be found" error.) 

#Now to see what is in SiteUserInfoList: 
$root.SiteUserInfoList.GetItemById($id)
# (This data can be cleaned up in the browser by visiting:
# " /_catalogs/users/simple.aspx" 
# from your site collection page.)

Moving User Profiles with PowerShell

Something that comes up with some frequency on Terminal Servers (or “Remote Desktop Servers”), but perhaps sometimes in VDI, is “How do I move a user profile from one drive to another?”. The traditional answers include the use of the user profile management GUI, or some expensive piece of software. But what if you need to automate the job? Or if you don’t have any money for the project?

Answer? PowerShell, of course… and robocopy.

Below is a code snippet that will re-point existing user profiles from “C:\Users” to “E:\Users”:

#Collect profile reg keys for regular users ("S-1-5-21" excludes local admin, network service, and system)
$profiles = gci -LiteralPath "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList" `
	| ? {$ -match "S-1-5-21-"} 

foreach ($profile in $profiles) {
	#Set the registry path in a format that can be used by the annoyingly demanding "get-itemproperty" cmdlet:
	$regPath = $(
		$($profile.pspath.tostring().split("::") | Select-Object -Last 1).Replace("HKEY_LOCAL_MACHINE","HKLM:")
	)
	#Get the current filesystem path for the user profile, using "get-itemproperty":
	$oldPath = $(
		Get-ItemProperty -LiteralPath $regPath -name ProfileImagePath
	).ProfileImagePath
	#Set a variable for the new profile filesystem path:
	$newPath = $oldPath.Replace("C:\","E:\")
	#Set the new profile path using "set-itemproperty":
	Set-ItemProperty -LiteralPath $regPath -Name ProfileImagePath -Value $newPath
}

#Now copy the profile filesystem directories using "robocopy".

But this code will not actually move the data. For that, we need robocopy. Make sure that your users are logged off before performing this operation, otherwise “NTUSER.DAT” will not get moved, and your users will get a new TEMP profile on next login:

robocopy /e /copyall /r:0 /mt:4 /b /nfl /xj /xjd /xjf C:\users e:\Users
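Before deleting the originals, a quick sanity check that the copy landed can save grief. A rough sketch follows; it only compares file counts, so it will not catch ACL or alternate-data-stream problems, and the robocopy log remains the authoritative record:

```powershell
# Rough sanity check: compare file counts under the old and new profile roots.
# This is a coarse test only; review the robocopy output for the real story.
$src = "C:\Users"
$dst = "E:\Users"
$srcCount = @(Get-ChildItem -Path $src -Recurse -Force -ErrorAction SilentlyContinue |
    Where-Object { -not $_.PSIsContainer }).Count
$dstCount = @(Get-ChildItem -Path $dst -Recurse -Force -ErrorAction SilentlyContinue |
    Where-Object { -not $_.PSIsContainer }).Count
"Source files: $srcCount; Destination files: $dstCount"
if ($srcCount -ne $dstCount) { Write-Warning "File counts differ; review the robocopy log." }
```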

Finally, be sure to set the default location for new profiles and the “Public” directory to your new drive as well. For that, run “Regedit”, then go to:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList
and set new paths for the registry strings “ProfilesDirectory” and “Public”. Moving the default user profile is optional.
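That last registry edit can be scripted as well. A minimal sketch, assuming the standard ProfileList key and that E:\Users already exists; run it from an elevated session:

```powershell
# Sketch: point new profiles and the Public directory at the new drive.
# Assumes E:\Users already exists; requires an elevated PowerShell session.
$plKey = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList"
Set-ItemProperty -LiteralPath $plKey -Name ProfilesDirectory -Value "E:\Users"
Set-ItemProperty -LiteralPath $plKey -Name Public -Value "E:\Users\Public"
```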

Oh yeah… you might want to purge the old Recycle Bin cruft for your moved users as well:

rmdir /s /q C:\$Recycle.Bin

SharePoint 2010 – Email Alerts to Site Administrators

We are in the final stages of preparation for the long-overdue upgrade to SharePoint 2010.  I have set up a preview site with a copy of the production SharePoint content database, and I want to notify all site owners that they should check out their sites for major problems.  How to do?  PowerShell?  Absolutely!

Set-PSDebug -Strict
Add-PSSnapin -Name microsoft.SharePoint.PowerShell

[string] $waUrl = ""
[string] $SmtpServer = ""
[string] $From = ""

$allAdmins = @()

[string] $subjTemplate = 'Pending Upgrade for your site -siteURL-'
[string] $bodyTemplate = @"
Message Body Goes Here.
Use the string -siteURL- in the body where you want the user's site address to appear.
"@

$wa = Get-SPWebApplication -Identity $waUrl

foreach ($site in $wa.sites) {
	#Write-Host "Working with site: " + $site.url
	$siteAdmins = @()
	$siteAdmins = $site.RootWeb.SiteAdministrators
	ForEach ($admin in $siteAdmins) {
		#Write-Host "Adding Admin: " + $admin.UserLogin
		[string]$a = $($admin.UserLogin).Replace("CAMPUS\","")
		[string]$a = $a.replace(".adm","")
		[string]$a = $a.replace("-admin","")
		[string]$a = $a.replace("admin-","")
		if ($a -notmatch "sa_|\\system") { $allAdmins += , @($a, [string]$site.Url) }
	}
}

$allAdmins = $allAdmins | Sort-Object -Unique
#$allAdmins = $allAdmins | ? {$_[0] -match "jgm"} | Select-Object -Last 4

foreach ($admin in $allAdmins) {
	[string] $to = $admin[0] + ""
	[string] $siteUrl = $admin[1]
	[string] $subj = $subjTemplate.Replace("-siteURL-",$siteUrl)
	[string] $body = $bodyTemplate.Replace("-siteURL-",$siteUrl)
	Send-MailMessage -To $to -From $From -SmtpServer $SmtpServer -Subject $subj -BodyAsHtml $body
}

vSphere 5.1 – Train Wreck in Slow Motion

vSphere 5.1 arrived this summer to no great fanfare. We waited a few weeks, heard no sounds of howling pain (we did not listen very hard, I guess), and decided to proceed with upgrading vCenter.  I have been digging out of the wreckage ever since.

How do you know if upgrading to vSphere 5.1 is right for you?  Here are a few bullet points to help you decide:

  • Do you have CA-signed (externally trusted, or in-house Enterprise CA server) certificates in use in your current vSphere environment?
  • Are you using an external MS SQL Server to host your vCenter database?  Are you using mirrored SQL databases?
  • Is your environment currently stable and reliable?

If you answered “yes” to any of these questions, do not upgrade to vSphere 5.1.  At least, not yet.  Do not deceive yourself that the vSphere 5.1.0a release will be of any help, either.

What is the big problem, you ask?  The major source of pain in this release is the new “Single Sign-On Service” that handles authentication and authorization for all of the other vSphere components.  This component of vSphere has twitchy SSL certificate requirements that are poorly documented by VMware.  The SSL requirements are so touchy that in our case, even the self-signed certs generated by the installer did not work.  Unlike all of the other current vSphere components, it does not support mirrored SQL databases.  It has new permissions requirements in AD that are not documented at all, and at the time of our installation, did not even have a KB entry.  The installer is very buggy, most notably in that it asks you to set an admin password for the SSO Service and demands password complexity, but it does not inform you when your password is unacceptably long (i.e. longer than 32 characters) or when it contains illegal characters (i.e. most regular expression special characters).

So, if you do upgrade, be prepared for an extended service outage.  Give yourself a long service window.  Have your VMware support contract numbers handy.  Familiarize yourself with the myriad of locations that are used to log vCenter data.  Learn to use PowerShell (get-childitem -recurse | select-string -pattern "configSettingThatThevCenterInstallerBorkedUp") and keep this page bookmarked:
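That grep-the-world one-liner can be made a little more targeted. Below is a sketch; the search roots are my assumptions about where vCenter components keep configuration, so adjust them for your installation:

```powershell
# Sketch: recursively grep likely vCenter configuration directories for a
# setting name. The search roots below are assumptions; adjust as needed.
$searchRoots = @(
    "$env:ProgramData\VMware",
    "${env:ProgramFiles(x86)}\VMware"
) | Where-Object { Test-Path $_ }
$pattern = "configSettingThatThevCenterInstallerBorkedUp"
$searchRoots | ForEach-Object {
    Get-ChildItem -Path $_ -Recurse -ErrorAction SilentlyContinue |
        Where-Object { -not $_.PSIsContainer } |
        Select-String -Pattern $pattern -ErrorAction SilentlyContinue
} | Select-Object Path, LineNumber, Line
```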

Here at UVM, we are indebted to Derek Seaman for his thorough documentation of the vSphere 5.1 installation process and detailed SSL certificate generation instructions.

Following are some installation quirks that we encountered, presented mainly for my own reference, but maybe you will find them useful as well:

  1. “Performance Charts Experienced an Internal Error” seen in the vSphere client after the upgrade:
    This happened because vCenter Web Services did not read the database mirroring configuration from our defined ODBC data sources… it grabbed the primary database only, and not the mirror data.  The fix?  Edit:
    “%ProgramData%\VMware\VMware VirtualCenter\”
    Find the “url=” line, and append:
    (Where [mirrorServer] is the actual DB mirror host name.  Don’t forget the “\” before the “=”.)
  2. Some users with permissions to vCenter 5.0 cannot log in after the upgrade.  In the vSphere web client, these users are marked as “disabled”:
    This occurred for us for two reasons:

    1. The SSO Service installer prompts us for a service account to use during install.  Following installation, the service is seen to be running as “SYSTEM”, and not the specified service account.  Change the Service to run with your planned service account using services.msc after the installation.  As an alternative, you could specify those credentials in the vSphere Web Client -> Administration -> Sign-On and Discovery -> Configuration -> Identity Sources.  Edit your identity source, and under “Authentication Source” select “password”, then enter your service account credentials.
    2. The SSO Service needs to read account attributes that cannot be read by a standard user account (at least, not in an AD forest at a Server 2008 R2 functional level).  When we asked VMware support to define the required permissions, they replied: “an account has to have at least read-only permissions over the user and group Organization Units furthermore read permissions also on the properties of the users, such as UserAccessControl.”  After some experimentation, I just gave the SSO Service account “read all properties” rights to the account OU, and login abilities were restored.
  3. Our SSO Service broke when the mirrored database servers that we currently use for vCenter services had a failover event.  During install, I used the standard “failoverPartner=” JDBC connection string property to specify our failover database server.  Unfortunately, the SSO service ignores this property.  I could not identify an acceptable workaround for this problem. Ultimately, I installed a SQL Express instance on our vCenter server to house just the SSO database.  I tried:
    1. Using SQL Aliases, but this failed because the JDBC driver is not aware of SQL Aliases.
    2. Using a script that edits the local “hosts” file on a database failover event.  I then used this host name alias for the database connections.  This almost worked.  I edited the following files to use the host alias, instead of the actual database server host name:
      Upon restart, the SSO Service was able to connect to the database, but it did not survive a failover.  Apparently the old database connection information was still in use somewhere, and VMware support was not helpful in identifying all of the database configuration locations for SSO.
    3. While VMware does have command line configuration tools that could have been used to script reconfiguration of the database connection strings, I have deemed that they are too fragile for production use.
  4. The option to authenticate using Windows session credentials in the vSphere Client (traditional version) stopped working after the 5.1 upgrade.  This is a bug that is fixed with the 5.1.0a release.  Unfortunately, the SSO installer for 5.1.0a does not work in upgrade mode.  Aargh!  I had to uninstall the SSO service to get the updated files into place.  Guess what the uninstaller does?  That’s right… it erases the SSO Service database (drops all tables!  Gah!), and deletes all configuration files for the service.  Before you upgrade, make sure that you have an SSO Service backup bundle.  I did, but it was outdated.  I had to re-register all of the vCenter components with SSO manually, which was a pain in the butt.
  5. vSphere Update Manager registered with vCenter using the wrong DNS name.  We could not scan ESXi hosts for updates, because vCenter was telling them to connect to an invalid URL.  To fix, I needed to search the registry for the incorrect host name, and replace with the correct one:
    “HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\VMware, Inc.\VMware Update Manager\VUMServer”
    For good measure I also edited:
    %ProgramFiles(x86)%\VMware\Infrastructure\Update Manager\extension.xml
    To contain the correct host name.  Then we restart the Update Manager services, and we are back in business.
  6. Other fun related to VMware Update Manager… the SQL Account used by Update Manager cannot have a password that exceeds 24 characters in length. Special characters in the SQL Account password also may cause problems.
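For the record, the hosts-file workaround described in item 3 above can be sketched roughly as below. Everything in it is hypothetical: the alias name, addresses, and the failover trigger are placeholders for your own monitoring, and as noted above, SSO did not survive a failover with this approach, so treat it as a record of the experiment rather than a fix.

```powershell
# Hypothetical sketch of the hosts-file failover hack from item 3 above.
# "vcdb-alias" and the addresses are made-up placeholders.
$hostsFile = "$env:windir\System32\drivers\etc\hosts"
$alias   = "vcdb-alias"
$primary = "10.0.0.10"
$mirror  = "10.0.0.11"

function Set-HostsAlias {
    param([string]$Path, [string]$Alias, [string]$Address)
    # Drop any existing line for the alias, then append the new mapping.
    $lines = @()
    if (Test-Path $Path) { $lines = @(Get-Content $Path) }
    $lines = @($lines | Where-Object { $_ -notmatch "\s$([regex]::Escape($Alias))\s*$" })
    $lines += "$Address`t$Alias"
    Set-Content -Path $Path -Value $lines
}

# On a failover event, your monitoring would repoint the alias at the mirror:
# Set-HostsAlias -Path $hostsFile -Alias $alias -Address $mirror
```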

So, VMware is not my favorite company this month.  On to solve more problems.  We still cannot add new permissions to vCenter, and Performance Charts are loading like a slug in winter.