Logon Performance in VDI Land

After spending the better part of three days attempting to shave time off of logons in our VDI environment (VMware View-based), I thought I should scribe down some notes on effective troubleshooting tools and techniques. There were a lot of self-inflicted wounds this time, and I could have saved myself a lot of time if 1) I had documented the build process for my new VDI pool, and 2) I had taken notes the last time I made logon optimizations.

WARNING: This post is largely unedited and probably a bit incoherent. Read at your own risk.

Group Policy:

Computer Configuration->Policies->Administrative Templates->System: Display highly detailed status messages
This setting causes the Windows login screen to provide more verbose feedback to the user about what winlogon.exe is doing at any given time. Rather than just seeing “Preparing Windows”, you will instead see things like “Processing Drive Map Preferences”. If the logon screen hangs on one of these steps for 30 seconds, you will know exactly which Group Policy setting is killing logon performance.
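Incidentally, if you just want verbose status messages on a single test machine without building a policy, the setting appears to map to a single registry value (use the GPO in production):

Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System' `
    -Name 'VerboseStatus' -Value 1 -Type DWord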

Event Viewer:

Windows 7 and 8 both include a Group Policy operational log under: Event Viewer->Applications and Services Logs->Microsoft->Windows->GroupPolicy->Operational.  This log contains a lot of useful information about the timing of various group policy components, and many times will contain all of the information you need to pinpoint troublesome Group Policy Settings.
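For a quick look without clicking through the MMC, a sketch like this will dump recent events from that log so you can eyeball the gaps between timestamps:

Get-WinEvent -LogName 'Microsoft-Windows-GroupPolicy/Operational' -MaxEvents 200 |
    Sort-Object TimeCreated |
    Format-Table TimeCreated, Id, Message -AutoSize -Wrap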

If the Event Viewer does not have all of the information you need, you can enable verbose policy logging:
http://social.technet.microsoft.com/wiki/contents/articles/4506.group-policy-debug-log-settings.aspx

I typically find that this is not necessary, and that the Event Viewer has the information that I need.

Problems with user profile loading often can be found under: Event Viewer->Applications and Services Logs->Microsoft->Windows->User Profile Service->Operational. This log is especially useful when using roaming or mandatory profiles. Unfortunately, it tracks only initial profile location and loading, and does not log anything related to Active Setup.

Windows Performance Toolkit:

Part of the Windows Assessment and Deployment Kit (ADK) for Windows 8.1. The Performance Toolkit includes the Windows Performance Recorder and Windows Performance Analyzer. Run the Recorder with the “Boot” performance scenario, with 1 iteration, then use the Analyzer to read the trace file that was created during reboot and logon. Make note of the relative time of each event in the boot/logon process (i.e. time of boot, time of login, time to desktop load). The Recorder only logs relative time from boot-up, so you might have some trouble correlating wall-clock time with recorded event times. Try to locate processes that line up with the delays you see during login.
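For what it is worth, the older xbootmgr utility from the Performance Toolkit can capture a comparable boot trace from the command line; a sketch (the result path is just an example, and the machine reboots immediately when you run this):

xbootmgr -trace boot -numRuns 1 -resultPath C:\boottraces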

As an alternative, you can enable boot logging using “ProcMon”. The Performance Analyzer arguably offers better visualizations of boot issues, but ProcMon has more comprehensive process information, and may be a more familiar tool for many administrators.

Active Setup:

Active Setup is a pain. This is a poorly documented mechanism by which applications (mostly Microsoft applications) can run per-user configuration tasks (generally first-run tasks) on logon. It is synchronous, meaning each task must be completed before the next runs. Also, Active Setup runs in Winlogon.exe and blocks loading of the desktop. Because of this, Active Setup has the potential to greatly delay first-time logon. As a result, it also becomes a scapegoat for logon delays, even when it is not the root cause. I have no really helpful advice for troubleshooting Active Setup other than to use the Performance Analyzer or ProcMon to locate Active Setup processes that take a long time to execute. See the following for a better explanation of the internals of Active Setup:
http://helgeklein.com/blog/2010/04/active-setup-explained/
And this for an explanation of situations in which you might want to disable Active Setup:
http://blog.ressoftware.com/index.php/2011/12/29/disable-active-setup-revealed/
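For quick triage, it helps to remember that Active Setup works by comparing component keys under HKLM against the per-user copies under HKCU; anything present in HKLM with a StubPath but missing from HKCU will run at the next logon. A rough sketch (it ignores the version-comparison case):

$hklm = 'HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components'
$hkcu = 'HKCU:\Software\Microsoft\Active Setup\Installed Components'
Get-ChildItem $hklm | ForEach-Object {
    $comp = Get-ItemProperty $_.PSPath
    if ($comp.StubPath -and -not (Test-Path (Join-Path $hkcu $_.PSChildName))) {
        [pscustomobject]@{ Component = $_.PSChildName; StubPath = $comp.StubPath }
    }
}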

SysInternals AutoRuns:

You can wade through every obscure registry key looking for processes that run at login, or you can just use AutoRuns and pull them all up in one place. Thanks to AutoRuns, I was able to locate the entry point for an irksome logon process that was running for no apparent reason. I had forgotten that under Windows Vista and later, Scheduled Tasks can use user logon events as a trigger for starting a process. This brings us to the process that killed two days of my life…
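If you would rather hunt for these from PowerShell, this sketch lists tasks carrying a logon trigger (requires the ScheduledTasks module on Windows 8 / Server 2012 or later):

Get-ScheduledTask | Where-Object {
    $_.Triggers | Where-Object { $_.CimClass.CimClassName -eq 'MSFT_TaskLogonTrigger' }
} | Select-Object TaskPath, TaskName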

@#$@%! Google Chrome:

Using the Performance Analyzer, I was able to determine that Google Chrome was adding over 30 seconds to logon on one of my VDI pools. Likely the delay originated from some of the Chrome site blacklist settings that we implemented. Regardless, our users really do not need Chrome to re-run setup and installation tasks on each and every logon, so how do we fix this?
Investigation of Active Setup showed that Active Setup for Chrome already had been completed in our mandatory roaming profile. So why was Chrome setup running on each and every user logon? Because it also was configured as a Scheduled Task that runs on each user logon event. Aargh! As noted above, SysInternals AutoRuns was used to locate this entry point.

Unfortunately, Google Update is a bit on the complicated side:
http://omaha.googlecode.com/svn/wiki/GoogleUpdateOnAScheduleOverview.html

There are two separate Google Update system services, two separate Scheduled Tasks related to Google Update, and three separate task triggers, including the one that runs at logon. For now, I have just disabled the scheduled tasks in my template machine. Unfortunately, this completely disables Google Update in the VDI pool. Also, the changes will be wiped out if we update Chrome manually or via SCCM in the future. Better would be a Group Policy-based solution.
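For reference, disabling the tasks amounts to something like the following (task names current as of this writing, and subject to change with Google Update versions):

schtasks /Change /TN "GoogleUpdateTaskMachineCore" /Disable
schtasks /Change /TN "GoogleUpdateTaskMachineUA" /Disable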

Some of you may know that there actually is official registry/Group Policy support for control of Google Update. See:
https://support.google.com/installer/answer/146164
and:
http://www.chromium.org/administrators/turning-off-auto-updates
However, these settings just disable Auto Update entirely. They do not allow you to control how and when updates will apply (i.e. disable user-mode updates, but leave machine-mode updates intact).

I expect the “real” fix here would be to run a separate scheduled task script or startup script that uses PowerShell to find and remove the scheduled task triggers. That’s more time than I want to spend on this project at present.
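If I ever do get around to it, I expect that fix would look roughly like this untested sketch, which strips only the logon triggers and leaves the time-based triggers (and thus updates) intact (again requires the ScheduledTasks module on Windows 8 / Server 2012 or later):

Get-ScheduledTask -TaskName 'GoogleUpdateTask*' | ForEach-Object {
    $keep = @($_.Triggers | Where-Object { $_.CimClass.CimClassName -ne 'MSFT_TaskLogonTrigger' })
    Set-ScheduledTask -TaskName $_.TaskName -Trigger $keep
}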

Provisioning students with Office 365 ProPlus licenses

Interesting… I still seem to be working at UVM. There must be a story there, but you won’t read about it here.

Anyway, after getting back from my brief hiatus at Stanford University, I got back on the job of setting up Azure DirSync with Federated login to our in-house Web SSO platforms. I’ll need to post about the security changes required to make that work with UVM’s FERPA interpretation. To summarize, we got it working.

However, once students can log in to Office 365, we need to provision them with licenses. DirSync can’t do this, so I needed to script a task that will grant an Office 365 ProPlus license to any un-provisioned active student. You will find the script, mostly unaltered, below. I just set it up as a scheduled task that runs sometime after the nightly in-house Active Directory update process.

To be useful outside of UVM, the code would need to be customized to handle the logic used in your organization to determine who is a student. We have extended the AD schema to include the eduPerson schema, and have populated the “eduPersonPrimaryAffiliation” attribute with “Student” for currently active students. If you do something different, have a look at the “Get-ADUser” line, and use a different LDAP query to fetch your student objects.

Enjoy.

# Provision-MSOLUsers.ps1 script, by J. Greg Mackinnon, 2014-07-30

#Provisions all active student accounts in Active Directory with an Office 365 
#  ProPlus license.

#Requires:
# - PowerShell Module "MSOnline"
# - PowerShell Module "ActiveDirectory"
# - Azure AD account with rights to read account information and set license status
# - Credentials for this account, with password saved in a file, as detailed below.
# - Script runs as a user with rights to read the eduPersonAffiliation property of 
#     all accounts in Active Directory.

#Create a credential file using the following procedure:
#    1. Log in as the user that will execute the script.
#    2. Execute the following line of code in PowerShell:
#    ConvertTo-SecureString -String 'password' -AsPlainText -Force `
#      | ConvertFrom-SecureString | out-file "c:\path\to\cred\file"


Set-PSDebug -Strict

#Setup local variables:
[string] $to = 'sysAdmin@domain.com'
[string] $from = 'ProvisioningScript@myhost.domain.com'
[string] $smtp = 'smtp.uvm.edu'
[string] $msolUser = 'azureAdServiceAccount@myAzureDomain.domain.com'

#initialize log and counter:
[string[]] $log = @()
[long] $pCount = 0

#initialize logging:
[string] $logFQPath = "yourLogLocation"
New-Item -Path $logFQPath -ItemType file -Force

$sTime = get-date
$log += "Provisioning report for Office 365/Azure AD for: " + ($sTime.ToString()) + "`r`n"

function errLogMail ($err,$msg) {
    # Write error to log and e-mail function
    # Writes out the error object in $err to the global $log object.
    # Flushes the contents of the $log array to file, 
    # E-mails the log contents to the mail address specified in $to.
    [string] $except = $err.exception;
    [string] $invoke = $err.invocationInfo.Line;
    [string] $posMsg = $err.InvocationInfo.PositionMessage;
    $log +=  $msg + "`r`n`r`nException: `r`n$except `r`n`r`nError Position: `r`n$posMsg";
    $log | Out-File -FilePath $logFQPath -Append;
    [string] $subj = 'Office 365 Provisioning Script:  ERROR'
    [string] $body = $log | % {$_ + "`r`n"}
    Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp
}

#Import PS Modules used by this script:
Import-Module MSOnline -ErrorAction SilentlyContinue
Import-Module ActiveDirectory -ErrorAction SilentlyContinue

#Get credentials for use with MS Online Services:
try {
    $msolPwd = get-content 'c:\path\to\cred\file' | convertto-securestring -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered getting creds from file."
    errLogMail $myError $myMsg
    exit 110
}
try {
    $msolCreds = New-Object System.Management.Automation.PSCredential ($msolUser, $msolPwd) -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in generating credential object."
    errLogMail $myError $myMsg
    exit 120
}


#Connect to MS Online Services:
try {
    #ErrorAction set to "Stop" to force any errors to be terminating errors.
    # The default behavior for connection errors is non-terminating, so the "catch" block would not otherwise be processed.
    Connect-MsolService -Credential $msolCreds -ErrorAction Stop
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in connecting to MSOL Services."
    errLogMail $myError $myMsg
    exit 130
}
$log += "Connected to MS Online Services.`r`n"

#Generate license report:
$lics = @()
$ppsSub = Get-MsolAccountSku | ? {$_.accountSkuId -match 'OFFICESUBSCRIPTION_STUDENT'}
$log += 'Office 365 ProPlus for Student - license report:'
$log += 'Total licenses: ' + $ppsSub.ActiveUnits
$log += 'Consumed licenses: ' + $ppsSub.ConsumedUnits
$log += 'Remaining licenses: ' + ($ppsSub.ActiveUnits - $ppsSub.ConsumedUnits) + "`r`n"


#Retrieve active student accounts into a hashtable:
[hashtable] $students = @{}
try {
    get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(eduPersonAffiliation=Student))' -SearchBase 'ou=people,dc=campus,dc=ad,dc=uvm,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in reading accounts from Active Directory."
    errLogMail $myError $myMsg
    exit 200
}
$log += "Retrieved active students from Active Directory."
$log += "Active student count: " + $students.count


#Retrieve unprovisioned accounts from Azure AD:
[array] $ulUsers = @()
try {
    #Note use of "Synchronized" to suppress processing of cloud-only accounts.
    $ulUsers += Get-MsolUser -UnlicensedUsersOnly -Synchronized -All -errorAction Stop
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in reading accounts from Azure AD."
    errLogMail $myError $myMsg
    exit 300
}
$log += "Retrieved unlicensed MSOL users."
$log += "Unlicensed user count: " + $ulUsers.Count + "`r`n"

#Provision any account in $ulUsers that also is in the $students array:
foreach ($u in $ulUsers) {
    if ($students.get_item($u.UserPrincipalName) -eq $true) {
        $log += $u.UserPrincipalName + " is an active student."
        try {
            $u | Set-MsolUser -UsageLocation 'US' -ErrorAction Stop ;
            $log += 'Successfully set the Office usage location for the user. '
        } catch {
            $myError = $_
            [string] $myMsg = "Error encountered in setting the Office 365 usage location for the user."
            errLogMail $myError $myMsg
            exit 410
        }
        try {
            $u | Set-MsolUserLicense -AddLicenses uvmoffice:OFFICESUBSCRIPTION_STUDENT -ErrorAction Stop ;
            $log += 'Successfully set the Office license for the user. '
            $pCount += 1
        } catch {
            $myError = $_
            [string] $myMsg = "Error encountered in assigning an Office 365 license to the user."
            errLogMail $myError $myMsg
            exit 420
        }
    } else {
        $log += $u.UserPrincipalName + " is not an active student.  Skipped Provisioning."
    }
}

#Add reporting details to the log:
$eTime = Get-Date
$log += "`r`nProvisioning successfully completed at: " + ($eTime.ToString())
$log += "Provisioned $pCount accounts."
$tTime = new-timespan -Start $stime -End $etime
$log += 'Elapsed Time (hh:mm:ss): ' + $tTime.Hours + ':' + $tTime.Minutes + ':' + $tTime.Seconds

#Flush out the log and mail it:
$log | Out-File -FilePath $logFQPath -Append;
[string] $subj = 'Office 365 Provisioning Script:  SUCCESS'
[string] $body = $log | % {$_ + "`r`n"}
Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp

Parting Scripts – Add a new network printer and set it as default

Some time back, I discovered that a Group Policy Preference that we had applied to a VMware View VDI pool was adding an additional 30 seconds of time staring at the blue spinning donut at each VDI desktop logon.  The policy in question was a printer policy.  Colleagues at other Higher Ed institutions confirmed that they had the same problem with GPP printer preferences.  It has been reported that using the “Mandatory Printers” policy is faster, but this policy does not allow you to assign a default printer.

Enter good old VBScript…

The following script will install a defined network printer and set it as default. If the print share does not exist, an error will be returned. 95% of the code in this script was lifted from my own “KillAndExec.vbs” script from last year. There really are only two lines of new code in here. It is good having a code library to draw on, because it would have taken me days to generate this stuff from scratch. VBScript is so obtuse… so why do I keep using it? Hmmmm….
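Invocation looks like this (server and share names are hypothetical):

cscript //nologo addDefaultPrinter.vbs /share:"\\printserver\officePrinter"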

'addDefaultPrinter script - J. Greg Mackinnon, 2014-06-11
'  Adds the network printer specified in the script argument "/share".
'  Sets this printer as the default printer for the current user.

option explicit

'Declare Variables
Dim bBadArg,bNoArgs
Dim cScrArgs
Dim iReturn
Dim sBadArg,sLog,sPrintShare,sScrArg,sScrArgs,sTemp,sTextsLog

Dim oFS,oLog,oShell
Dim WshNetwork

'Set initial values:
bBadArg = False
bNoArgs = False

'Instantiate Global Objects:
Set oShell = CreateObject("WScript.Shell")
Set oFS  = CreateObject("Scripting.FileSystemObject")

Set WshNetwork = CreateObject("WScript.Network")    


'''''''''''''''''''''''''''''''''''''''''''''''''''
' Define Functions
Sub subHelp
	echoAndLog "addDefaultPrinter.vbs Script"
	echoAndLog "by J. Greg Mackinnon, University of Vermont"
	echoAndLog ""
	echoAndLog "Installs a printer from a named network share, and sets this"
	echoAndLog "as the default printer for the current user."
	echoAndLog ""
	echoAndLog "Logs output to 'addDefaultPrinter.log' in the %temp% directory."
	echoAndLog ""
	echoAndLog "Required arguments and syntax:"
	echoAndLog "/share:""\\[server]\[share]"""
	echoAndLog "     Specify the UNC of the print share to be set as default."
End Sub

function echoAndLog(sText)
'EchoAndLog Function:
' Writes string data provided by "sText" to the console and to Log file
' Requires: 
'     sText - a string containing text to write
'     oLog - a pre-existing Scripting.FileSystemObject.OpenTextFile object
	'If we are in cscript, then echo output to the command line:
	If LCase( Right( WScript.FullName, 12 ) ) = "\cscript.exe" Then
		wscript.echo sText
	end if
	'Write output to log either way:
	oLog.writeLine sText
end function
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Initialize Logging
sTemp = oShell.ExpandEnvironmentStrings("%TEMP%")
sLog = "addDefaultPrinter.log"
Set oLog = oFS.OpenTextFile(sTemp & "\" & sLog, 2, True)
' End Initialize Logging
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Parse Arguments
If WScript.Arguments.Named.Count > 0 Then
	Set cScrArgs = WScript.Arguments.Named
	For Each sScrArg in cScrArgs
		Select Case LCase(sScrArg)
			Case "share"
				sPrintShare = cScrArgs.Item(sScrArg)
			Case Else
				bBadArg = True
				sBadArg = sScrArg
		End Select
	Next
ElseIf WScript.Arguments.Named.Count = 0 Then 'Detect if required args are not defined.
	bNoArgs = True
End If 
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Process Arguments
if bBadArg then
	echoAndLog vbCrLf & "Unknown switch or argument: " & sBadArg & "."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
elseif bNoArgs then
	echoAndLog vbCrLf & "Required arguments were not specified."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
end if
echoAndLog "Printer share to set to default: " 
echoAndLog sPrintShare & vbCrLf
' End Process Arguments
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
'Begin Main
'
on error resume next
'Add Printer
iReturn = 0
iReturn = WshNetwork.AddWindowsPrinterConnection(sPrintShare)
if err.number <> 0 then 'Gather error data if AddWindowsPrinterConnection failed.
	echoAndLog "Error: " & Err.Number
	echoAndLog "Error (Hex): " & Hex(Err.Number)
	echoAndLog "Source: " &  Err.Source
	echoAndLog "Description: " &  Err.Description
	iReturn = Err.Number
	Err.Clear
	wscript.quit(iReturn)
end if
if iReturn <> 0 then
	echoAndLog "Non-zero return code when attempting to add the printer connection."
	echoAndLog "Return Code was: " & iReturn
end if

'Set Default Printer
iReturn = 0
iReturn = WshNetwork.SetDefaultPrinter(sPrintShare)
if err.number <> 0 then 'Gather error data if SetDefaultPrinter failed.
	echoAndLog "Error: " & Err.Number
	echoAndLog "Error (Hex): " & Hex(Err.Number)
	echoAndLog "Source: " &  Err.Source
	echoAndLog "Description: " &  Err.Description
	iReturn = Err.Number
	Err.Clear
	wscript.quit(iReturn)
end if
on error goto 0
'echoAndLog "Return code from the command: " & iReturn
if iReturn <> 0 then
	echoAndLog "Non-zero return code when attempting to set default printer."
	echoAndLog "Return Code was: " & iReturn
end if

oLog.Close
wscript.quit(iReturn)

' End Main
'''''''''''''''''''''''''''''''''''''''''''''''''''

Rejecting read receipt requests with Procmail

In preparation for my exit, I have been re-routing various routine bulk mail messages using “Procmail” recipes. While I was working on this, I got an email from a colleague who always requests read receipts for every message that he sends. Despite being asked to stop, messages continue to come in with read receipt requests. I thought “wouldn’t it be great if I could get procmail to just reject these messages outright?”

Consulting common references on Procmail was not helpful, because they rely on procmail having access to a full shell environment. My colleague Jim Lawson gave me the framework for a different solution, which instead involves the use of the “:0fwc” construct to pipeline multiple procmail actions. Interesting is the use of the “appendmsg” command, for which I cannot find a reference anywhere. This works, though. Aggressive/aggressive handling of read receipt requests achieved!

#Use ":0c:" below if you want to receive a copy of the original message instead of just rejecting it.
:0:
# Check to see if the message contains any of the following common read-receipt request headers:
* ^Disposition-Notification-To:|\
  ^X-Confirm-Reading-To:|\
  ^Return-Receipt-To:
# Prevent mail loops... mail loops are bad.
* ! ^X-Loop: jgm@uvm\.edu
{
        :0fw
        | formail -pQUOTE: -k -r
        BODY=`formail -I ""`

        :0fw
        | formail -A"X-Loop: jgm@uvm.edu"\
        -I"Subject: Rejected mail: Read Receipts not accepted by this account."
        #-I"To: ${REJECT}" \
        
        # scrape off the body
        :0fwc
        | formail -X ""

        :0fwc
        | appendmsg "Message rejected because it contains a read receipt request." 

        # put back the quoted body
        :0fwc
        | appendmsg $BODY
        
        :0
        | sendmail -t
}

Final days… looking back.

Things have really slowed down here in the Brain Corral.  Projects and tasks have been rolling in faster than my ability to document them.  I am sure all of my “fans” have been most disappointed by the lack of updates.  Well fans, sorry to be the bearer of bad news, but this likely will be my last post in the ol’ corral.  June 13th, 2014 will be my last day of employment at UVM, which will put me at two weeks shy of 14 years of employment in Catamount country.

It has been a long road, with much territory covered.  When I started, I was the new “NetWare Guy”, responsible for Novell file and print services, and some antivirus management.  Since then, I have plowed through:

  • Two major directory service changes, and several upgrades (NetWare bindery to eDirectory, to Active Directory on Server 2003, and AD upgrades all the way through Server 2012 R2)
  • four major file server upgrades (NetWare 4 standalone to NetWare 6 failover clusters, to NetApp FAS270, to NetApp FAS3000 series, to native Windows 2008 R2 file services)
  • three print server redeployments (Traditional NetWare to NDP, to monolithic Windows print services on Server 2003, to distributed print servers on Server 2008)
  • four antivirus vendor changes, (McAfee, to Norton/Symantec, to ESET NOD32, to MS Forefront/SCEP)
  • two terminal server projects (Citrix MetaFrame 1.8 to MS Terminal Services on Server 2008)
  • a VDI deployment with several minor upgrades (VMware View 5.0, 5.1, 5.2, and 5.3… over its <2 year history, that's an upgrade every six months!)
  • A SharePoint deployment with three major upgrades (STS 2 to WSS 3, to a 64-bit scale-out farm upgrade, to SPF 2010, and an incomplete 2013 Standard Edition upgrade)
  • three Windows deployment technology changes with many version upgrades (Ghost to RIS, to BDD, which later became MDT/LiteTouch)
  • a major deployment of VMware vSphere and vCenter, with many version upgrades (vCenter 2.5, 4.0, 4.1, 5.0, 5.1, 5.5).  From its humble beginnings, vSphere now hosts over 90% of our non-research computing operating systems!
  • four major changes in Windows enterprise patch management methods (SUS, WSUS, WSUS with Secunia CSI, SCCM 2012 with ADR and third party “Application packages” with supercedence rules).  We went from full MS-only non-enforced patch management, to adding support for third party applications, to fully automated patch management.
  • Two major whole disk encryption projects (PGP Whole Disk, followed by BitLocker with the “Microsoft BitLocker Administration and Management” add-on)
  • Several Windows server management re-deployments (BigBrother to MOM 2, MOM 2005 (redeploy), SCOM 2007 (redeploy), SCOM 2007 R2 (upgrade), SCOM 2012 (redeploy), SCOM 2012 R2 (upgrade))
  • …and other projects too numerous to detail (security initiatives, network protocol transitions, hosted server upgrade assistance, application deployment and upgrade assistance).

It has been a rocky road, full of bumps and potholes.  Fortunately, with the assistance of my truly excellent teammates, we have made this journey with very few significant breakdowns.  Central IT has perhaps quintupled in size and importance over the past 14 years.  I am proud to have been part of this explosive period in IT at UVM, and honored to have worked with such highly intelligent and motivated people.

I wish you all the very best of luck with all future endeavors, and hope that you will keep in touch, even though I will be on the west coast of America (not just the west coast of New England) at Stanford University.

-J. Greg Mackinnon | ETS Systems Architecture and Administration

Migrating Windows-auth users to Claims users in SharePoint

A short time back I published an article on upgrading a Windows-authenticated SharePoint environment to an ADFS/Shibboleth claims-based environment. At that time I said I would post the script that I plan to use for the production migration when it was done. Well… here it is.

This script is based heavily on the one found here:
blog.sharepoint-voodoo.net/?p=68
Unfortunately, “SharePoint-Voodoo” appears to be down at the time of this writing, so I cannot make appropriate attribution to the original author. This script helped speed along this process for me… Thanks, anonymous SharePoint Guru!

My version of the script adds the following:

  • Adds stronger typing to prevent script errors.
  • Adds path checking for the generated CSV file (so that the script does not exit abruptly after running for 30 minutes).
  • Introduces options to specify different provider prefixes for windows group and user objects.
  • Introduces an option to add a UPN suffix to the new user identity.
  • Collects all user input before doing any processing to speed along the process.
  • Adds several “-Limit All” parameters to the “Get-SP*” cmdlets to prevent omission of users from the migration process.

There are still some minor problems. When run in “convert” mode, the script generates an error for every migrated user, even when the user is migrated successfully. I expect this is owing to a bug in “Move-SPUser”, and there probably is not much to be done about it. Because I want to migrate some accounts from windows-auth to claims-with-windows-auth, there is some manual massaging of the output file that needs to be done before running the actual migration, but I think this is about as close as I can get to perfecting a generic migration script without making the end-product entirely site-specific.

I will need to run the script at least twice… once for my primary “CAMPUS” domain, and once to capture “GUEST” domain users. I may also want to do a pass to convert admin and service account entries to claims-with-windows-auth users.
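For those one-off conversions, a manual Move-SPUser does the trick; a sketch with hypothetical account names (“i:0#.w|” being the claims encoding for Windows identities):

$user = Get-SPUser -Web 'https://sharepoint.uvm.edu' -Identity 'CAMPUS\svc-sharepoint'
Move-SPUser -Identity $user -NewAlias 'i:0#.w|campus\svc-sharepoint' -IgnoreSID -Confirm:$false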

Set-PSDebug -Strict
add-pssnapin microsoft.sharepoint.powershell -erroraction 0

# Select Options
Write-Host -ForegroundColor Yellow "'Document' will create a CSV dump of users to convert. 'Convert' will use the data in the CSV to perform the migrations."
Write-Host -ForegroundColor Cyan "1. Document"
Write-Host -ForegroundColor Cyan "2. Convert"
Write-Host -ForegroundColor Cyan " "
[int]$Choice = Read-Host "Select an option 1-2: "

switch($Choice)
{
    1 {[bool]$convert = $false}
    2 {[bool]$convert = $true}
    default {Write-Host "Invalid selection! Exiting... "; exit}
}
Write-Host ""

$objCSV = @()
[string]$csvPath = Read-Host "Please enter the path to save the .csv file to. (Ex. C:\migration)"
if ((Test-Path -LiteralPath $csvPath) -eq $false) {
	Write-Host "Invalid path specified! Exiting..."; exit
}

if($convert -eq $true)
{
	$objCSV = Import-CSV "$csvPath\MigrateUsers.csv"

    foreach ($object in $objCSV)
    {
        $user = Get-SPUser -identity $object.OldLogin -web $object.SiteCollection 
        write-host "Moving user:" $user "to:" $object.NewLogin "in site:" $object.SiteCollection 
        move-spuser -identity $user -newalias $object.NewLogin -ignoresid -Confirm:$false
    }
}
else
{
	[string]$oldprovider = Read-Host "Enter the Old Provider Name (Example -> Domain\ or i:0#.f|MembershipProvider|) "
    [string]$newprovider = Read-Host "Enter the New User Provider Name (Example -> Domain\ or i:0e.t|MembershipProvider|) "
	[string]$newsuffix = Read-Host "Enter the UPN suffix for the new provider, if desired (Example -> @domain.com) "
	[string]$newGroupProvider = Read-Host "Enter the New Group Provider Name (Example -> Domain\ or c:0-.t|MembershipProvider|domain.com\) "


    # Select Options
    Write-Host -ForegroundColor Yellow "Choose the scope of the migration - Farm, Web App, or Site Collection"
    Write-Host -ForegroundColor Cyan "1. Entire Farm"
    Write-Host -ForegroundColor Cyan "2. Web Application"
    Write-Host -ForegroundColor Cyan "3. Site Collection"
    Write-Host -ForegroundColor Cyan " "
    [int]$scopeChoice = Read-Host "Select an option 1-3: "

    switch($scopeChoice)
    {
        1 {[string]$scope = "Farm"}
        2 {[string]$scope = "WebApp"}
        3 {[string]$scope = "SiteColl"}
        default {Write-Host "Invalid selection! Exiting... "; exit}
    }
    Write-Host ""
    if($scope -eq "Farm")
    {
        $sites = @()
        $sites = get-spsite -Limit All
    }
    elseif($scope -eq "WebApp")
    {
        $url = Read-Host "Enter the Url of the Web Application: "
        $sites = @()
        $sites = get-spsite -WebApplication $url -Limit All
    }
    elseif($scope -eq "SiteColl")
    {
        $url = Read-Host "Enter the Url of the Site Collection: "
        $sites = @()
        $sites = get-spsite $url
    }

    foreach($site in $sites)
    {
		$webs = @() #needed to prevent the next foreach from attempting to loop a non-array variable
        $webs = $site.AllWebs

        foreach($web in $webs)
        {
            # Get all of the users in a site
			$users = @()
            $users = get-spuser -web $web -Limit All #added "-limit" since some webs may have large user lists.

            # Loop through each of the users in the site
            foreach($user in $users)
            {
                # Create an array that will be used to split the user name from the domain/membership provider
                $a=@()
                $displayname = $user.DisplayName
                $userlogin = $user.UserLogin

                if(($userlogin -like "$oldprovider*") -and ($objCSV.OldLogin -notcontains $userlogin))
                {
                    # Separate the user name from the domain/membership provider
                    if($userlogin.Contains('|'))
                    {
                        $a = $userlogin.split("|")
                        $username = $a[1]

                        if($username.Contains('\'))
                        {
                            $a = $username.split("\")
                            $username = $a[1]
                        }
                    }
                    elseif($userlogin.Contains('\'))
                    {
                        $a = $userlogin.split("\")
                        $username = $a[1]
                    }
    
                    # Create the new username based on the given input
					if ($user.IsDomainGroup) {
						[string]$newalias = $newGroupProvider + $username
					} else {
						[string]$newalias = $newprovider + $username + $newsuffix
					}
                    

                    $objUser = "" | select OldLogin,NewLogin,SiteCollection
	                $objUser.OldLogin = $userLogin
                    $objUser.NewLogin = $newAlias
	                $objUser.SiteCollection = $site.Url

	                $objCSV += $objUser
                }   
            }
        }
        $site.Dispose()
    }

    $objCSV | Export-Csv "$csvPath\MigrateUsers.csv" -NoTypeInformation -Force
}

Integrating SharePoint 2013 with ADFS and Shibboleth… The Motion Picture

Time again to attempt to implement that exciting technology, Federation Services (Web Single Sign-On, SAML, WS-Federation, or whatever you want to call it) with SharePoint. Last time we tried this, SharePoint 2010 was a baby product, MS was just testing the waters with SAML 2.0 support in ADFS 2.0, and Shibboleth 2 was pretty new stuff here at UVM. The whole experience was unsatisfying. SharePoint STS configuration was full of arcane PowerShell commands, ADFS setup was complicated by poor farm setup documentation, and interop of Shibboleth 2 with ADFS 2 was not documented at all.  After wading through all of that mess, we ended up with user names being displayed as “i:05.t|adfsServiceName|userPrincipalName” (bleagh!), and with many applications that could not deal with Web SSO authenticators.  My general conclusion was that SharePoint really was not ready for federated login.

Four years later, things have changed a bit.  STS configuration still requires dense PowerShell commands, but at least it is better documented.  ADFS and Shibboleth interoperability also are excellently documented at this point.  Microsoft has most Office apps working with passive SAML authenticators, and has pledged to get the rest working this year.  While I would not judge the use of ADFS (with or without Shibboleth) to be the easy route to take to SharePoint 2013 deployment, it at least looks functional at this point.  So let’s kick the tires and see how it works…

Part 1: ADFS Setup

First, we need to set up ADFS.  We chose to deploy “ADFS 3” using Windows Server 2012 R2 as the OS platform.  ADFS 3 is required to support the new “workplace join” feature of Server 2012 R2.  Since we want to test this, we would need ADFS 3 anyway.  Unfortunately, ADFS on Server 2012 R2 is pretty virgin territory, and does not have the same troubleshooting resources available for it as do earlier releases.

Most of the configuration steps followed the TechNet documentation without variation.  We did find that we needed to modify the ACL on the service account used to run ADFS… we added the “Authenticated Users” principal to the ACL, and assigned the “Read all properties” right.

For us, the most complicating factor in ADFS 3 deployment is the replacement of IIS with the Windows kernel-mode HTTP server “http.sys”.  When we started experiencing connectivity problems with various clients to ADFS, we had no experience with HTTP.sys to assist in troubleshooting.  Most articles on HTTP.sys relate to remote desktop services, and to Server 2003.  Our problems with HTTP.sys were rooted in an undocumented requirement for clients to submit SNI (Server Name Indication) information in their TLS “CLIENT HELLO” sequence.  I had to open a support case with Microsoft to resolve this problem, and only afterward was I able to find any Internet discussions that reflected MS advice:
http://social.msdn.microsoft.com/Forums/vstudio/en-US/d514b5a0-c01c-4ce4-b589-bca890e5921d/how-to-properly-setup-lb-probe-for-adfs-30-servers?forum=Geneva

It seems that if http.sys is bound using the hostname:port format, then TLS will require SNI.  If the binding is instead specified using ipAddr:port, SNI will not be required.  To fix our problem, we just needed to add a second HTTPS certificate binding using an IP address.  In this case, we just used “0.0.0.0:443”.  Here is the procedure:

  1. On each ADFS server and proxy, open an elevated command prompt
  2. run: netsh http show sslcert
  3. Record the certificate hash and application ID for the certificate used by ADFS
  4. run: netsh http add sslcert ipport=0.0.0.0:443 certhash=<hash from step 3> appid={<application ID from step 3>}

Part 2: Configuring SharePoint to use ADFS

I started with Microsoft’s guide to configuring SAML authentication for SharePoint 2013 using ADFS:
http://technet.microsoft.com/en-us/library/hh305235(v=office.15).aspx
This is a well written guide, and fairly easy to follow.  The only issue I take with the article is the recommendation to use the “emailAddress” claim as the “identifier claim” in SharePoint.  In many federated login scenarios, a foreign Idp may not want to release the email address attribute.  However, some variation of UPN likely will be released.  In the case of the InCommon federation, the “ePPN” value (eduPersonPrincipalName) generally is available to federation partners.  For this reason I chose to implement “UPN” as the “identifier claim” in the last command of phase 3 of the document:

$ap = New-SPTrustedIdentityTokenIssuer -Name <Name> -Description <Description> -realm $realm -ImportTrustCertificate $cert -ClaimsMappings $emailClaimMap,$upnClaimMap,$roleClaimMap,$sidClaimMap -SignInUrl $signInURL -IdentifierClaim $upnClaimMap.InputClaimType
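For context, here is a sketch of the supporting objects defined earlier in that guide (the realm, certificate path, and sign-in URL are examples, not our production values):

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2('C:\certs\adfs-token-signing.cer')
$realm = 'urn:sharepoint:uvm'
$signInURL = 'https://adfs.uvm.edu/adfs/ls'
$upnClaimMap = New-SPClaimTypeMapping `
    -IncomingClaimType 'http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn' `
    -IncomingClaimTypeDisplayName 'UPN' -SameAsIncoming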

Part 3: Migrating SharePoint Users From Windows to Claims

Since we are planning an upgrade, not a new deployment, we need to migrate existing Windows account references in SharePoint to federated account references. To make this happen, we need to establish the federated account identity format. I simply log in to an open-access SharePoint site as an ADFS user, and record the account information. In this case, our accounts look like this:

i:0e.t|adfs.uvm.edu|user@campus.ad.uvm.edu

and groups:

c:0-.t|adfs.uvm.edu|samAccountName

We then can use PowerShell to find all account entries in SharePoint, and use the “Move-SPUser” PowerShell cmdlet to convert them.  I am still working on a final migration script for the production cutover, and I will try to post it here when it is ready.

Of some concern is keeping AD group permissions functional.  It turns out that SharePoint will respect AD group permissions for ADFS principals, but there are a few requirements:

  1. The incoming login token needs to contain the claim type “http://schemas.microsoft.com/ws/2008/06/identity/claims/role”, and that claim type needs to contain the SamAccountID (or CN) of the AD Group that was granted access to a site.  (In ADFS, this means that you need to release the AD LDAP attribute “Token-Groups – Unqualified Names” as the outgoing claim type “Role”.)
  2. When adding AD Groups as permissions in SharePoint, we need to use the “samAccountName” LDAP attribute as the identifier claim.  The LDAPCP (see Part 4) utility makes this easy as it will do this for us automatically when configured to search AD.

Requirement 1 could be a problem when using Shibboleth as the authentication provider.  Our Shibboleth deployment does not authenticate against AD, so a Shibboleth ticket will not contain AD LDAP “tokenGroup” data in the “role” claim.  I am working with the Shibboleth guys to see if there is any way to augment Shibboleth tokens with data pulled from AD.

Part 4:  The SharePoint PeoplePicker and ADFS

Experienced SharePoint users all know (and mostly love) the “people picker” that searches Active Directory to validate user and group names that are to be added to the access list for a SharePoint site.  One of the core problems with federation services is that they are authentication systems only.  ADFS and Shibboleth do not implement a directory service.  You cannot do a lookup on an ADFS principal that a user adds to a SharePoint site.  This is particularly irksome, since all of our ADFS users actually have matching accounts in Active Directory.

Fortunately, there is a solution… you can add a “Custom Claims Provider” into SharePoint which will augment incoming ADFS claims with matching user data pulled from Active Directory.  This provider also integrates with the PeoplePicker to allow querying of AD to validate Claims users that are being added to a SharePoint site.  A good write-up can be found here:
http://blogs.msdn.com/b/kaevans/archive/2013/05/26/fixing-people-picker-for-saml-claims-users-using-ldap.aspx

“But I don’t want to compile a SharePoint solution using Visual Studio,” I hear you (and me) whine.  No problem… there is a very good pre-built solution available from CodePlex:
http://ldapcp.codeplex.com/discussions/361947

Normally I do not like using third-party add-ons in SharePoint.  I will make an exception for LDAPCP because:

  1. It works.
  2. It saves me hours of Visual Studio work.
  3. It is a very popular project and thus likely to survive on CodePlex until it is no longer needed.
  4. If the project dies, we can implement our own Claims Provider using templates provided elsewhere with (hopefully) minimal fuss.

My only outstanding problem with LDAPCP is that it will not query principals in our Guest AD forest.  However, there are some suggestions from the developer along these lines:
http://ldapcp.codeplex.com/discussions/478092

To summarize, the developer recommends compiling our own version of LDAPCP from the provided source code.   We would use the method “SetLDAPConnections” found in the “LDAPCP_Custom.cs” source file to add an additional LDAP query source to the solution.  I will try this as time permits.

Part 5: Transforming Shibboleth tokens to ADFS

So far we have not strayed too far from well-trodden paths on the Internet.  Now we get to the fun part… configuring ADFS as a relying party to our Shibboleth Idp, then transforming the incoming Shibboleth SAML token into an ADFS token that can be consumed by SharePoint.

Microsoft published a rather useful guide on ADFS/Shibboleth/InCommon integration:
http://technet.microsoft.com/en-us/library/gg317734(WS.10).aspx
Using this guide we were able to set up ADFS as a relying party to our existing Shibboleth Idp with minimal fuss.  Since we already have an Idp, we skipped most of Step 1, and then jumped to Step 4 as we did not need to configure Shibboleth as an SP to ADFS.

I had the local “Shibboleth Guy” add our ADFS server to the relying parties configuration file on the Shib server, and release “uvm-common” attributes for this provider. This allows SharePoint/ADFS users to get their “eduPersonPrincipalName” (ePPN) released to ADFS/SharePoint from Shibboleth. However, SharePoint (and ADFS) do not natively understand this attribute, so we configure a “Claim Rule” on the “Claims Provider Trust” with Shibboleth.  The rule is an “Acceptance Transformation Rule” that we title “Transform ePPN to UPN”, and it has the following syntax:

c:[Type == "urn:oid:1.3.6.1.4.1.5923.1.1.1.6"] 
=> issue(Type = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn", 
Issuer = c.Issuer, OriginalIssuer = c.OriginalIssuer, Value = c.Value, 
ValueType = c.ValueType);

The “urn:oid:1.3.6.1.4.1.5923.1.1.1.6” bit is the SAML 2 identifier for the ePPN type.

After configuring a basic Relying Party trust with SharePoint 2013, I need to configure Claims Rules that will release Shibboleth user attributes/claims to SharePoint.  You could use a simple “passthrough” rule for this.  However, I want incoming Shibboleth tokens that have a “@uvm.edu” UPN suffix to be treated as though they are Active Directory users.  To accomplish this, I need to do a claims transformation. In AD, the user UPN has the “@campus.ad.uvm.edu” suffix, so let’s transform the Shibboleth UPN using a Claim Rule on the SharePoint Relying Party Trust:

c:[Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn", 
Value =~ "@uvm\.edu$"]
=> issue(Type = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn", 
Issuer = c.Issuer, OriginalIssuer = c.OriginalIssuer, 
Value = regexreplace(c.Value, "^(?<user>[^@]+)@(.+)$", "${user}@campus.ad.uvm.edu"), 
ValueType = c.ValueType);

This appears to work… the first regex “@uvm\.edu$” matches an incoming UPN that ends with “@uvm.edu”. In the replacement expression, we create a capture group named “user” for the user portion of the UPN (everything from the start of the value up to, but not including, the “@” character), and then replace everything trailing the user portion with “@campus.ad.uvm.edu”.

However, as noted above, the incoming Shibboleth SAML token does not contain AD group data in the “Role” attribute, so users authenticating from Shibboleth cannot get access to sites where they have been granted access using AD groups.  Boo!

Part 6:  Where are you from?  Notes on Home Realm Discovery

When an ADFS server has multiple “Claims Provider Trusts” defined, the ADFS login page automatically will create a “WAYF”, or “Where are you from?” page to allow the user to select from multiple authentication providers.  In our case, the user would see “Active Directory” and “UVM Shibboleth”.  Since I would not want to confuse people with unnecessary choices, we can disable the display of one of these choices using PowerShell:

Set-AdfsRelyingPartyTrust -TargetName "SharePoint 2013" -ClaimsProviderName "UVM Shibboleth"

In this sample, “SharePoint 2013” is the name of the relying party defined in ADFS for which you want to set WAYF options.  “UVM Shibboleth” is the Claims Provider Trust that you want used.  This value can be provided as an array, but in this case we are going to provide only one value… the one authenticator that we want to use.  After configuring this change, ADFS logins coming from SharePoint get sent straight to Shibboleth for authentication.

Part 7: The Exciting Results

Only a sysadmin could call this exciting…

Given how heavily MS invested in implementing WS-Federation and WS-Trust into their products (MS Office support for federation services was, to the best of my knowledge, focused entirely on the WS-* protocols implemented in ADFS 1.0), I was not expecting any client beyond a web browser to work with Shibboleth.  Imagine my surprise…

Browsers:

IE 11 and Chrome both log in using Shib with no problems.  Firefox works, but not without a glitch… upon being redirected back to SharePoint from our webauth page, we get a page full of uninterpreted HTML code.  Pressing “F5” to refresh clears the problem.

Office 2013 Clients:

All core Office 2013 applications appear to support opening of SharePoint documents from links in the browser.  Interestingly, it appears that Office is able to share ADFS tokens obtained by Internet Explorer, and vice versa.  The ADFS token outlives the browser session, too, so you actually have to log off of ADFS to prevent token re-use.  I tested the “Export to Excel” and “Add to Outlook” options in the SharePoint ribbon, and both worked without a fuss.

Getting Office apps to open content in SharePoint directly also works, although it’s a bit buggy.  Sometimes our webauth login dialog does not clear cleanly after authentication.

SkyDrive Pro (the desktop version included with Office 2013, soon to be “OneDrive for Business”) also works with Shibboleth login, amazingly.  The app-store version does not work with on-premises solutions at all, so I could not test it.

Mobile Clients:

I was able to access a OneNote notebook that I stored in SharePoint using OneNote for Android.  However, it was not easy.  OneNote for Android does not have a dialog that allows for adding notebooks from arbitrary network locations.  I first had to add the notebook from a copy of OneNote 2013 for Windows that was linked to my Microsoft account.  The MS account then recorded the existence of the notebook.  When I logged in to OneNote on the Android, it picked up on the SharePoint-backed notebook and I was able to connect.

The OneNote “metro” app does not appear to have the same capability as the Android app.  I cannot get it to connect to anything other than Office 365 or CIFS-backed files.

I was unable to test Office for iOS or Android because I do not own a device on which those apps are supported.

I still need to look at the “Office Document Connection” that comes with Office for the Mac, and at WebDAV clients, and perhaps some other third-party SharePoint apps to see if they work.

Bulk-modification of deployment deadlines in SCCM

A full two years into testing, we finally are moving forward with production deployment of our Configuration Manager 2012 (SCCM) environment. Last month we (recklessly?) migrated 1000 workstations into the environment. While the deployment was a technological success, it was a bit of a black eye in the PR department.

Client computers almost uniformly did an unplanned reboot one hour after the SCCM agent got installed on their workstations. In addition to that, many clients experienced multiple reboot requests over the coming days. Many clients reported that they did not get the planned 90-minute impending-reboot warning, but only the 15-minute countdown timer.

Lots of changes were required to address this situation:

See the “Suppress any required computer restarts” setting documented here:

http://technet.microsoft.com/en-us/library/gg682067.aspx#BKMK_EndpointProtectionDeviceSettings

This one was causing clients to reboot following upgrade of their existing Forefront Endpoint Protection client to SCEP. That explained the unexpected 60-minute post-install reboot.

Next, we decided to change the reboot-after deadline grace period from 90 minutes to 9 hours, with the final warning now set to one hour, up from 15 minutes. This should allow people to complete work tasks without having updates interrupt their work day.

Finally, we are planning to reset the deployment deadline for all existing software update deployments to a time several days out from the initial client installation time. Since we have several dozen existing software update group deployments, we need a programmatic approach to completing this task. The key to this was found here:

http://www.scconfigmgr.com/2013/12/01/modify-the-deadline-time-of-an-adr-deployment-with-powershell/

Thanks to Nickolaj Andersen for posting this valuable script.

It did take me a bit of time to decode what Nickolaj was doing with his script (I was not already familiar with the Date/Time format generally used in WMI). I modified the code to set existing update group deployments to a fixed date and time provided by input parameters. I also added some in-line documentation to the script, and added a few more input validation checks:

# Set-CMDeploymentDeadlines script
#   J. Greg Mackinnon, 2014-02-07
#   Updates all existing software update deployments with a new enforcement deadline.
#   Requires specification of: 
#    -SiteServer (an SCCM Site Server name)
#    -SiteCode   (an SCCM Site Code)
#    -DeadlineDate
#    -DeadlineTime
#

[CmdletBinding()]

param(
    [parameter(Mandatory=$true)]
    [string] $SiteServer,

    [parameter(Mandatory=$true)]
    [string] $SiteCode,

    [parameter(Mandatory=$true)]
    [ValidateScript({
        if (($_.Length -eq 8) -and ($_ -notmatch '[a-zA-Z]+')) {
            $true
        } else {
            Throw '-Deadline must be a date string in the format "YYYYMMDD"'
        }
    })]
    [string] $DeadlineDate,

    [parameter(Mandatory=$true)]
    [ValidateScript({
        if (($_.Length -eq 4) -and ($_ -notmatch '[a-zA-Z]+')) {
            $true
        } else {
            Throw '-DeadlineTime must be a time string in the format "HHMM", using 24-hour syntax' 
        }
    })]
    [string] $DeadlineTime
)

Set-PSDebug -Strict

# WMI Date format is required here.  See:
# http://technet.microsoft.com/en-us/library/ee156576.aspx
# This is the "UTC Date-Time Format", sometimes called "dtm Format", and referenced in .NET as "dmtfDateTime"
#YYYYMMDDHHMMSS.000000+MMM
#The grouping of six zeros represents milliseconds.  The last cluster of MMM is minute offset from GMT.  
#Wildcards can be used to for parts of the date that are not specified.  In this case, we will not specify
#the GMT offset, thus using "local time".

# Build new deadline date in WMI Date format:
[string] $newDeadline = $DeadlineDate + $DeadlineTime + '00.000000+***'
Write-Verbose "Time to be sent to the Deployment Object: $newDeadline"
 

# Get all current Software Update Group Deployments.
# Note: We use the WMI class "SMS_UpdateGroupAssignment", documented here:
# http://msdn.microsoft.com/en-us/library/hh949604.aspx
# Shares many properties with "SMS_CIAssignmentBaseClass", documented here:
# http://msdn.microsoft.com/en-us/library/hh949014.aspx 
$ADRClientDeployment = @()
$ADRClientDeployment = Get-WmiObject -Namespace "root\sms\site_$($SiteCode)" -Class SMS_UpdateGroupAssignment -ComputerName $SiteServer

# Loop through the assignments setting the new EnforcementDeadline, 
# and commit the change with the Put() method common to all WMI Classes:
# http://msdn.microsoft.com/en-us/library/aa391455(v=vs.85).aspx
  
foreach ($Deployment in $ADRClientDeployment) {

    $DeploymentName = $Deployment.AssignmentName

    Write-Output "Deployment to be modified: `n$($DeploymentName)"
    try {
        $Deployment.EnforcementDeadline = "$newDeadline"
        $Deployment.Put() | Out-Null
        if ($?) {
            Write-Output "`nSuccessfully modified deployment`n"
        }
    }
    catch {
        Write-Output "`nERROR: $($_.Exception.Message)"
    }
}
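A sample invocation (the site server and site code here are hypothetical):

.\Set-CMDeploymentDeadlines.ps1 -SiteServer 'sccm1.campus.ad.uvm.edu' -SiteCode 'UVM' -DeadlineDate '20140215' -DeadlineTime '0300'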

We additionally could push out the deployment time for Application updates as well using the “SMS_ApplicationAssignment” WMI class:

http://msdn.microsoft.com/en-us/library/hh949469.aspx

In this case, we would want to change the “UpdateDeadline” property, since we do not set a “deployment deadline” for these updates, but instead are using application supersedence rules to control when the updates are deployed.
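That change would follow the same pattern as the script above; an untested sketch (reusing the $SiteCode, $SiteServer, and $newDeadline variables already defined there):

$appAssignments = Get-WmiObject -Namespace "root\sms\site_$($SiteCode)" `
    -Class SMS_ApplicationAssignment -ComputerName $SiteServer
foreach ($a in $appAssignments) {
    $a.UpdateDeadline = $newDeadline
    $a.Put() | Out-Null
}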

Automated Driver Import in MDT 2013

As a follow-up to my previous post, I also have developed a script to automate the import of drivers into MDT 2013.  This PowerShell script takes a source folder structure and duplicates the top two levels of folders in the MDT Deployment Share “Out-of-Box Drivers” branch.  The script then imports all drivers found in the source directories to the matching folders in MDT.

All we have to do is extract all drivers for a given computer model into an appropriately named folder in the source directory, and then run the script.
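For illustration, the source tree looks something like this (the model folder names are just examples; the top two levels are what get mirrored into the deployment share):

E:\staging\drivers\import\
    Win7\
        OptiPlex 9020\      (extracted driver files below here)
    Win8\
        Latitude E7440\     (extracted driver files below here)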

################################################################################
#
#  Create-MDTDriverStructure.ps1
#  J. Greg Mackinnon, University of Vermont, 2013-11-05
#  Creates a folder structure in the "Out of Box Drivers" branch of a MDT 2013
#    deployment share.  The structure matches the first two subdirectories of 
#    the source filesystem defined in $srcRoot.  All drivers contained within
#    $srcRoot are imported into the deployment share.
#
#  Requires: 
#    $srcRoot - A driver source directory, 
#    $MDTRoot - a MDT 2013 deployment share
#    - MDT 2013 must be installed in the path noted in $modDir!!!
#
################################################################################

# Define source driver directories:
[string] $srcRoot = 'E:\staging\drivers\import'
[string[]] $sources = gci -Attributes D $srcRoot | `
    Select-Object -Property name | % {$_.name.tostring()}
	
# Initialize MDT Working Environment:
[string] $MDTRoot = 'E:\DevRoot'
[string] $PSDriveName = 'DS100'
[string] $oobRoot = $PSDriveName + ":\Out-Of-Box Drivers"
[string] $modDir = `
	'C:\Program Files\Microsoft Deployment Toolkit\Bin\MicrosoftDeploymentToolkit.psd1'
Import-Module $modDir
New-PSDrive -Name "$PSDriveName" -PSProvider MDTProvider -Root $MDTRoot


foreach ($source in $sources){
    Write-Host "Working with source: " $source -ForegroundColor Magenta
    # Create the OOB Top-level folders:
    New-Item -Path $oobRoot -Name $source -ItemType "directory" -Verbose
    # Define a variable for the current working directory:
    $sub1 = $srcRoot + "\" + $source
    # Create an array containing the folders to be imported:
    $items = gci -Attributes D $sub1 | Select-Object -Property name | % {$_.name.tostring()}
    $oobDir = $oobRoot + "\" + $source

    foreach ($item in $items) {
        # Define source and target directories for driver import:
        [string] $dstDir = $oobDir + "\" + $item
        [string] $srcDir = $sub1 + "\" + $item

        # Clean up "cruft" files that lead to duplicate drivers in the share:
        Write-Host "Processing $item" -ForegroundColor Green
        Write-Host "Cleaning extraneous files..." -ForegroundColor Cyan
        $delItems = gci -Recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "Found " $delItems.count " files to delete..." -ForegroundColor Yellow
        $delItems | Remove-Item -Force -Confirm:$false
        $delItems = gci -Recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "New count for extraneous files: " $delItems.count -ForegroundColor Yellow

        # Create the target directory:
        Write-Host "Creating $item folder" -ForegroundColor Cyan
        New-Item -Path $oobDir -Name $item -ItemType "directory" -Verbose

        # Import all drivers from the source to the new target:
        Write-Host "Importing Drivers for $item" -ForegroundColor Cyan
        Import-MDTDriver -Path $dstDir -SourcePath $srcDir

        Write-Host "Moving to next directory..." -ForegroundColor Green

    } # End ForEach Item
} # End ForEach Source

Remove-PSDrive -Name "$PSDriveName"
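For reference, the script expects a two-level source tree: top-level folders that become the top-level Out-of-Box Drivers folders, each containing one folder per extracted driver collection.  Something like this layout would work (the folder names here are purely illustrative):

E:\staging\drivers\import\
    Win7\
        Dell System XPS L322X\
        Latitude E5430\
    Win8\
        Latitude 10 - ST2\
        XPS 12-9Q33\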


Rethinking Driver Management in MDT 2013

We have been using the Microsoft Deployment Toolkit (MDT) in LTI/Lite Touch mode here at the University for a long time now.  Why, we used it to deploy XP back when MDT was still called the Business Desktop Deployment Solution Accelerator (BDD).  In that time, we have gone through several different driver management methods.  Gone are the nightmare days of dealing with OEMSETUP files, $OEM$ directories, and elaborate “DriverPack” injection scripts for XP (thank goodness).

With the release of Vista, we moved to a PnP free-for-all model of driver detection.  After Windows 8.0 was released, we found we really needed to separate our drivers by operating system, so we created Win7, Win8, and WinPE driver selection profiles.

But now we are finding that driver sprawl is becoming a major issue again.  On many new systems we run through a seemingly successful deployment, but end up with a non-responsive touch screen, a buggy track pad, and (sometimes) a very unstable system.

Starting this week, we are trying a new hybrid driver management approach.  We will create a driver folder for each computer model sold through our computer depot.  I have developed a custom bit of VBScript that checks whether the hardware being deployed is a known model; driver injection is restricted to that model's folder if a match is found.  The script contains additional logic to detect support for both Windows 7 and Windows 8 variants, and to select the most current drivers available.  Unknown models fall back on the PnP free-for-all detection method.

Here is how it works…

  1. Create a new group in your OS deployment task sequence named “Custom Driver Inject”, or something similar.  Grouping all actions together will allow easier transfer of these custom actions to other Task Sequences:
    [screenshot: DM-DriverInjectGroup]
  2. Under this new group, add a new action of type “Set Task Sequence Variable”.  Name the variable “TargetOS”, and set the value to the OS that you are deploying from this task sequence.  You must follow the same naming convention that you use in your Out-of-Box Drivers folder.  I use Win(X), where (X) is the major OS version of the drivers in the folder.  In this example, I have chosen “Win8”:
    [screenshot: DM-SetTargetOS]
  3. Add an action of type “Run Command Line”.  Name this action “Supported Model Check”.  In the Command line field, enter: cscript "%SCRIPTROOT%\ZUVMCheckModel.wsf"  (We will import this script into the deployment share later on.)
    [screenshot: DM-RunModelCheckScript]
  4. Add a sub-group named “Supported Model Actions”.  Under the “Options” tab, add a condition of type “Task Sequence Variable”.  Use the variable “SupportedModel”, the condition “equals”, and the value “YES”.  (The SupportedModel variable gets set by the CheckModel script run in the previous step.):
    [screenshot: DM-ConditionalGroup]
  5. Under this new group, add a new action of type “Set Task Sequence Variable”.  Name this task “Set Variable DriverGroup002”.  Under “Task Sequence Variable”, set “DriverGroup002”, and set the value to “Models\%TargetOS%\%Model%”.  (Note: You could use “DriverGroup001”, but I already am using this variable to hold a default group of drivers that I want added to all systems.  The value “Models\%TargetOS%\%Model%” defines the path to the driver group in the deployment share.  If you use a different folder structure, you will need to modify this path.):
    [screenshot: DM-SetDriverGroup]
  6. Create a new task of type “Inject Drivers”.  Name this task “Inject Model-Specific Drivers”.  For the selection profile, select “Nothing”.  Be sure to select “Install all drivers from the selection profile”.  (NOTE: The dialog implies that we will be injecting only drivers from a selection profile.  In fact, this step will inject drivers from any paths defined in any present “DriverGroupXXX” Task Sequence variables.)
    [screenshot: DM-InjectModelDrivers]
  7. Now, under our original “Custom Driver Inject” group, add a new task of type “Inject Drivers”.  Choose the selection profile “All Drivers”, or use a different fallback selection profile that suits the needs of your task sequence.  This time, select “Install only matching drivers from the selection profile”:
    [screenshot: DM-InjectUnsupported1]
    Under the “Options” tab, add a condition where the Task Sequence Variable “SupportedModel” equals “NO”:
    [screenshot: DM-InjectUnsupported2]
    This step will handle injection of matching drivers into hardware models for which we do not have a pre-defined driver group.
  8. Optionally, you now can open the “CustomSettings.ini” file and add the following to your “Default” section:
         DriverGroup001=Peripherals
    (I have a “Peripherals” driver group configured which contains USB Ethernet drivers used in our environment.  These are a necessity when deploying to hardware that does not have an embedded Ethernet port, such as the Dell XPS 12 and XPS 13.  You also could add common peripherals with complicated drivers, such as a DisplayLink docking station or a Dell touch screen monitor.)
  9. Add the “ZUVMCheckModel.wsf” script to the “Scripts” folder of your deployment share.  The code for this script is included below.  I think the script should be fairly easy to adapt for your environment.
  10. Finally, structure your “Out-of-Box Drivers” folder to contain a “Models” folder, with a single folder for each matching hardware model in your environment.  I get most of our driver collections from Dell:
    http://en.community.dell.com/techcenter/enterprise-client/w/wiki/2065.dell-driver-cab-files-for-enterprise-client-os-deployment.aspx
    (NOTE: Thanks, Dell!)
    The real challenge of maintaining this tree is in getting the model names right.  Use “wmic computersystem get model” to discover the model string for any new systems in your environment (a PowerShell equivalent is sketched below).  A table of a few current models I have been working with is included below.
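If you prefer PowerShell over wmic, the same model string is available from WMI:

# Returns the model string that MDT exposes as the %Model% variable:
(Get-WmiObject -Class Win32_ComputerSystem).Model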

Dell Marketing Model Name to WMI Name Translator Table:

  • Dell XPS 12 (first generation) – “XPS 12 9Q23”
  • Dell XPS 12 (second generation) – “XPS 12-9Q33”
  • Dell XPS 13 (first generation) – “Dell System XPS L321X”
  • Dell XPS 13 (second generation) – “Dell System XPS L322X”
  • Dell XPS 14 – “XPS L421Q”
  • Dell Latitude 10 – “Latitude 10 - ST2”
  • VMware Virtual Machine – “VMware Virtual Platform”
  • Microsoft Hyper-V Virtual Machine – “Virtual Machine”

A fun nuance we encountered last week was a Latitude E5430 that contained “no-vPro” after the model number in its WMI model string.  Dell does not provide separate driver CABs for vPro/non-vPro models, so I added a regular expression test for Latitudes that strips any cruft after the model number.  That is one more problem down…
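In PowerShell terms, the cleanup in the script below amounts to keeping only the first two space-separated tokens of the model string.  A rough equivalent (the sample string is hypothetical):

# Trim trailing cruft from Latitude model strings, e.g. "Latitude E5430 no-vPro":
$model = 'Latitude E5430 no-vPro'          # hypothetical WMI model string
if ($model -match '^Latitude' -and ($model -split ' ').Count -gt 2) {
    $model = ($model -split ' ')[0..1] -join ' '   # -> "Latitude E5430"
}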

The following site contains a list of older model name translations:
http://www.faqshop.com/wp/misc/wmi/list-of-wmic-csproduct-get-name-results
As you can see, most Latitude and OptiPlex systems follow sane and predictable model name conventions.  I wish the same were true of the XPS line.

Finally, I am indebted to the following sources for their generously detailed posts on driver management. Without their help, I doubt I would have been able to make this solution fly:

Jeff Hughes of the Windows Enterprise Support Server Core Team:
http://blogs.technet.com/b/askcore/archive/2013/05/09/how-to-manage-out-of-box-drivers-with-the-use-of-model-specific-driver-groups-in-microsoft-deployment-toolkit-2012-update-1.aspx

Andrew Barnes (aka Scriptimus Prime), whose posts on MDT driver management cover the basics of DriverGroups and model selection:
http://scriptimus.wordpress.com/2013/02/25/ltizti-deployments-injecting-drivers-during-deployment/
and of automating driver import into MDT (written for MDT 2012; some changes are required for 2013):
http://scriptimus.wordpress.com/2012/06/08/mdt-2012-creating-a-driverstore-folder-structure/

The incredible Michael Niehaus, who in this post discusses the use of DriverGroups and Selection Profiles:
http://blogs.technet.com/b/mniehaus/archive/2009/09/09/mdt-2010-new-feature-19-improved-driver-management.aspx

And finally, Eric Schloss of the “Admin Nexus”, who gave me the idea of developing a fallback for systems that do not match a known model.  It was this key bit of smarts that gave me the confidence to move forward with a model-specific driver grouping strategy:
http://adminnexus.blogspot.com/2012/08/yet-another-approach-to-driver.html

ZUVMCheckModel.wsf script:
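The script determines the supported model list by parsing the deployment share's DriverGroups.xml file (found in the share's Control directory).  For orientation, the entries it matches against look roughly like this sketch (the actual file carries additional attributes and elements; the "Models\..." path is the part that matters):

<groups>
  <group enable="True">
    <Name>Models\Win8\XPS 12-9Q33</Name>
  </group>
</groups>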

<job id="ZUVMCheckModel">
<script language="VBScript" src="ZTIUtility.vbs"/>
<script language="VBScript">

Option Explicit
RunNewInstance

'//--------------------------------------------------------
'// Main Class
'//--------------------------------------------------------
Class ZUVMCheckModel
	
	'//--------------------------------------------------------
	'//  Constructor to initialize needed global objects
	'//--------------------------------------------------------
	Private Sub Class_Initialize
	End Sub
	
	'//--------------------------------------------------------
	'// Main routine
	'//--------------------------------------------------------

	Function Main()
	' //*******************************************************
	' //
	' // File: ZUVMCheckModel.wsf
	' //
	' // Purpose: Checks the model of this system against
	' //          a list of known machine models.  Returns
	' //          TRUE if a matching model is detected.
	' //
	' // Usage: cscript ZUVMCheckModel.wsf /Model:<model name> [/debug:true]
	' //
	' //*******************************************************
	
	'Use the following lines for debugging only.
	'oEnvironment.Item("TargetOS") = "Win7"
	'oEnvironment.item("DeployRoot") = "c:\local\mdt"
	'oEnvironment.Item("Model") = "Latitude E6500 some annoying variation"
	'End debug Params

	  Dim aModels()          'Array of models taken from DriverGroups.xml
	  Dim bOldDrivers        'Boolean indicating drivers present for an older OS version
	  Dim i                  'Generic integer for looping
	  Dim j                  'Generic integer for looping
	  Dim iRetVal            'Return code variable
	  Dim iMaxOS             'Integer representing the highest matching OS driver store
	  Dim oRegEx
	  Dim oMatch
	  Dim match
	  Dim oXMLDoc            'XML Document Object, for reading DriverGroups.xml
	  Dim Root,NodeList,Elem 'Objects in support of oXMLdoc
	  Dim sDGPath            'Path to DriverGroups.xml file
	  Dim sInitModel         'String representing the initial value of
	                         '   oEnvironment.Item("Model")
	  Dim sItem	             'Item in aModels array.
	  Dim sMaxOS             'OS Name of highest matching OS driver store
	  Dim sOSFound           'OS Name for a given matching driver set.
	  
	  oLogging.CreateEntry "Begin ZUVMCheckModel...", LogTypeInfo
	  
	  'Set the default values:
	  oEnvironment.Item("SupportedModel") = "NO"
	  iMaxOS = CInt(Right(oEnvironment.Item("TargetOS"),1))
	  'wscript.echo "Default value for iMaxOS = " & iMaxOS
	  bOldDrivers = false
	  sInitModel = oEnvironment.Item("Model")
	  'wscript.echo "sInitModel value = " & sInitModel
	  
	  Set oRegEx = New RegExp
	  oRegEx.Global = True
	  oRegEx.IgnoreCase = True
	  
	  'Modify the detected model name to handle known variations:
	  oRegEx.pattern = "Latitude"
	  if oRegEx.test(sInitModel) then
		oLogging.CreateEntry "Model is a Latitude.  Cleaning up the model name...", LogTypeInfo
		oRegEx.pattern = " "
		set oMatch = oRegEx.Execute(sInitModel)
		'wscript.echo "oMatch Count is: " & oMatch.count
		if oMatch.Count > 1 then
			i = oMatch.item(1).FirstIndex
			oEnvironment.Item("Model") = Left(sInitModel,i)
			'wscript.echo """"&oEnvironment.Item("Model")&""""
		end if
	  end if

	  'Check for DriverGroups.xml file, which will contain the supported model list:
	  iRetVal = Failure
	  iRetVal = oUtility.FindFile("DriverGroups.xml", sDGPath)
	  if iRetVal <> Success then
		oLogging.CreateEntry "DriverGroups file not found. ", LogTypeError
		exit function
	  end if 
	  oLogging.CreateEntry "Path to DriverGroups.xml: " & sDGPath, LogTypeInfo
	  
	  'Parse the DriverGroups.xml file:
	  oLogging.CreateEntry "Parsing DriverGroups.xml...", LogTypeInfo
	  Set oXMLDoc = CreateObject("Msxml2.DOMDocument") 
	  oXMLDoc.setProperty "SelectionLanguage", "XPath"
	  oXMLDoc.load(sDGPath)
	  Set Root = oXMLDoc.documentElement 
	  Set NodeList = Root.getElementsByTagName("Name")
	  oLogging.CreateEntry "NodeList Member Count is: " & NodeList.length, LogTypeInfo
	  'oLogging.CreateEntry "NodeList.Length variant type is: " & TypeName(NodeList.Length), LogTypeInfo
	  i = CInt(NodeList.length) - 1
	  ReDim aModels(i) 'Resize aModels to hold all matching DriverGroup items.
	  'oLogging.CreateEntry "List of Available Driver Groups:", LogTypeInfo
	  i = 0
	  For Each Elem In NodeList
		if InStr(Elem.Text,"Models\") then
			aModels(i) = Mid(Elem.Text,8)	'Add text after "Models\"
			'oLogging.CreateEntry aModels(i), LogTypeInfo
			i = i + 1
		end if
	  Next
	  oLogging.CreateEntry "End Parsing DriverGroups.xml.", LogTypeInfo

	  'Loop through the list of supported models to find a match:
	  oLogging.CreateEntry "Checking discovered driver groups for match to: " & oenvironment.Item("Model"), LogTypeInfo
	  For Each sItem in aModels
		oLogging.CreateEntry "Checking Driver Group: " & sItem, LogTypeInfo
		i = InStr(1, sItem, oEnvironment.Item("Model"), vbTextCompare)

		'wscript.echo TypeName(i) 'i is a "Long" number type.
		If i <> 0 Then
			oLogging.CreateEntry "Matching Model found.", LogTypeInfo
			
			j = InStr(sItem,"\")
			sOSFound = Left(sItem,j-1)
			'wscript.echo "sOSFound = " & sOSFound 
			if (InStr(1,sOSFound,oEnvironment.Item("TargetOS"),vbTextCompare) <> 0) then
				oLogging.CreateEntry "Drivers matching the requested OS are available.  Exiting with success.", LogTypeInfo
				oEnvironment.Item("SupportedModel") = "YES"
				iRetVal = Success
				Main = iRetVal
				Exit Function
			end if
			if iMaxOS > CInt(Right(sOSFound,1)) then
				iMaxOS = CInt(Right(sOSFound,1))
				'wscript.echo "iMaxOS = " & iMaxOS
				sMaxOS = sOSFound
				bOldDrivers = true
				'wscript.echo "sMaxOS = " & sMaxOS
			end if
		End If
	  Next
		
	  If bOldDrivers Then 'Run if sMaxOS is defined... set a boolean when this is defined and test against that?
		oLogging.CreateEntry "Model drivers were found for an OS older than the one selected...", LogTypeWarning
		oEnvironment.Item("SupportedModel") = "YES"
		oEnvironment.Item("TargetOS") = sMaxOS
	  Else
	    oLogging.CreateEntry "No matching drivers were found for this model.", LogTypeInfo
	  End If
	  
	  oLogging.CreateEntry "End ZUVMCheckModel.", LogTypeInfo

	  iRetVal = Success
	  Main = iRetVal

	End Function

End Class

</script>
</job>
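For quick testing outside of a running task sequence, the script can be run from a deployment share's Scripts directory, where ZTIUtility.vbs lives (uncomment the debug assignments near the top of Main to fake the task sequence environment):

cscript.exe ZUVMCheckModel.wsf /debug:true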