Automated Driver Import in MDT 2013

As a follow-up to my previous post, I also have developed a script to automate the import of drivers into MDT 2013.  This PowerShell script takes a source folder structure and duplicates the top two levels of folders in the MDT Deployment Share's "Out-of-Box Drivers" branch.  The script then imports all drivers found in the source directories into the matching folders in MDT.

All we have to do is extract all drivers for a given computer model into an appropriately named folder in the source directory, and then run the script.
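For illustration, here is a hypothetical source tree (the model folder names are examples only).  The script would recreate the "Win8" and "Win7" folders and their model subfolders under "Out-of-Box Drivers", then import the drivers found under each model folder:

E:\staging\drivers\import\
    Win8\
        XPS 12 9Q23\
        Latitude 10 - ST2\
    Win7\
        Latitude E6430\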

################################################################################
#
#  Create-MDTDriverStructure.ps1
#  J. Greg Mackinnon, University of Vermont, 2013-11-05
#  Creates a folder structure in the "Out of Box Drivers" branch of a MDT 2013
#    deployment share.  The structure matches the first two subdirectories of 
#    the source filesystem defined in $srcRoot.  All drivers contained within
#    $srcRoot are imported into the deployment share.
#
#  Requires: 
#    $srcRoot - a driver source directory, 
#    $MDTRoot - an MDT 2013 deployment share
#    - MDT 2013 must be installed in the path noted in $modDir!!!
#
################################################################################

# Define source driver directories:
[string] $srcRoot = 'E:\staging\drivers\import'
[string[]] $sources = gci -Attributes D $srcRoot | `
    Select-Object -Property name | % {$_.name.tostring()}
	
# Initialize MDT Working Environment:
[string] $MDTRoot = 'E:\DevRoot'
[string] $PSDriveName = 'DS100'
[string] $oobRoot = $PSDriveName + ":\Out-Of-Box Drivers"
[string] $modDir = `
	'C:\Program Files\Microsoft Deployment Toolkit\Bin\MicrosoftDeploymentToolkit.psd1'
Import-Module $modDir
New-PSDrive -Name "$PSDriveName" -PSProvider MDTProvider -Root $MDTRoot


foreach ($source in $sources){
    Write-Host "Working with source: " $source -ForegroundColor Magenta
    # Create the OOB Top-level folders:
    new-item -path $oobRoot -name $source -itemType "directory" -Verbose
    # Define a variable for the current working directory:
    $sub1 = $srcRoot + "\" + $source
    # Create an array containing the folders to be imported:
    $items = gci -Attributes D $sub1 | Select-Object -Property name | % {$_.name.tostring()}
    $oobDir = $oobRoot + "\" + $source

    foreach ($item in $items) {
		# Define source and target directories for driver import:
	    [string] $dstDir = $oobDir + "\" + $item
	    [string] $srcDir = $sub1 + "\" + $item
	
	    # Clean up "cruft" files that lead to duplicate drivers in the share:
		Write-Host "Processing $item" -ForeGroundColor Green
	    Write-Host "Cleaning extraneous files..." -ForegroundColor Cyan
        $delItems = gci -recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "Found " $delItems.count " files to delete..." -ForegroundColor Yellow
	    $delItems | remove-Item -force -confirm:$false
        $delItems = gci -recurse -Include version.txt,release.dat,cachescrubbed.txt $srcDir
        Write-Host "New count for extraneous files: " $delItems.count -ForegroundColor Yellow

	    # Create the target directory:
		Write-Host "Creating $item folder" -ForegroundColor Cyan
	    new-item -path $oobDir -name $item -itemType "directory" -Verbose
	
	    # Import all drivers from the source to the new target:
		Write-Host "Importing Drivers for $item" -ForegroundColor Cyan
	    Import-MDTDriver -Path $dstDir -SourcePath $srcDir 
		
        Write-Host "Moving to next directory..." -ForegroundColor Green
		
    } # End ForEach Item
} # End ForEach Source

Remove-PSDrive -Name "$PSDriveName"

 

Rethinking Driver Management in MDT 2013

We have been using the Microsoft Deployment Toolkit (MDT) in LTI/Lite Touch mode here at the University for a long time now.  Why, we used it to deploy XP back when MDT was still called the Business Desktop Deployment Solution Accelerator (BDD).  In that time, we have gone through several different driver management methods.  Gone are the nightmare days of dealing with OEMSETUP files, $OEM$ directories, and elaborate "DriverPack" injection scripts for XP (thank goodness).  

With the release of Vista, we moved to a PnP free-for-all model of driver detection.  After Windows 8.0, we found we really needed to separate our drivers by operating system, so we created Win7, Win8, and WinPE driver selection profiles.

But now we are finding that driver sprawl is becoming a major issue again.  On many new systems we run through a seemingly successful deployment, but end up with a non-responsive touch screen, a buggy trackpad, and (sometimes) a very unstable system.

Starting this week, we are trying a new hybrid driver management approach.  We will create a driver folder for each computer model sold through our computer depot.  I have developed a custom bit of VBScript to check whether the hardware being deployed to is a known model.  Driver injection will be restricted to this model if a match is found.  The script contains additional logic to detect support for both Windows 7 and Windows 8 variants, and to select the most current drivers detected.  Unknown models will fall back on the PnP free-for-all detection method.

Here is how it works…

  1. Create a new group in your OS deployment task sequence named “Custom Driver Inject”, or something similar.  Grouping all actions together will allow easier transfer of these custom actions to other Task Sequences:
    [screenshot: DM-DriverInjectGroup]
  2. Under this new group, add a new action of type "Set Task Sequence Variable".  Name the variable "TargetOS", and set the value to the OS that you are deploying from this task sequence.  You must follow the same naming convention that you use in your Out-of-Box Drivers folder.  I use Win(X), where (X) is the major OS version of the drivers in the folder.  In this example, I have chosen "Win8":
    [screenshot: DM-SetTargetOS]
  3. Add an action of type "Run Command Line".  Name this action "Supported Model Check".  In the Command line field, enter: cscript "%SCRIPTROOT%\ZUVMCheckModel.wsf".  (We will import this script into the deployment share later on.)
    [screenshot: DM-RunModelCheckScript]
  4. Add a sub-group named “Supported Model Actions”.  Under the “Options” tab, add a condition of type “Task Sequence Variable”.  Use the variable “SupportedModel”, the Condition “equals”, and the Value “YES”.  (The SupportedModel variable gets set by the CheckModel script run in the previous step.):
    [screenshot: DM-ConditionalGroup]
  5. Under this new group, add a new action of type "Set Task Sequence Variable".  Name this task "Set Variable DriverGroup002".  Under "Task Sequence Variable", set "DriverGroup002", and set the value to "Models\%TargetOS%\%Model%".  (Note:  You could use "DriverGroup001", but I already am using that variable to hold a default group of drivers that I want added to all systems.  The value "Models\%TargetOS%\%Model%" defines the path to the driver group in the deployment share.  If you use a different folder structure, you will need to modify this path.):
    [screenshot: DM-SetDriverGroup]
  6. Create a new task of type "Inject Drivers".  Name this task "Inject Model-Specific Drivers".  For the selection profile, select "Nothing".  Be sure to select "Install all drivers from the selection profile".  (NOTE: The dialog implies that we will be injecting only drivers from a selection profile.  In fact, this step will inject drivers from any paths defined in any present "DriverGroupXXX" Task Sequence variables.)
    [screenshot: DM-InjectModelDrivers]
  7. Now, under our original Custom Driver Inject group, add a new task of type “Inject Drivers”.  Choose from the selection profile “All Drivers”, or use a different fallback selection profile that suits the needs of your task sequence.  This time, select “Install only matching drivers from the selection profile”:
    [screenshot: DM-InjectUnsupported1]
    Under the "Options" tab, add a condition where the "Task Sequence Variable" named "SupportedModel" equals "NO":
    [screenshot: DM-InjectUnsupported2]
    This step will handle injection of matching drivers into hardware models for which we do not have a pre-defined driver group.
  8. Optionally, you now can open the “CustomSettings.ini” file and add the following to your “Default” section:
         DriverGroup001=Peripherals
    (I have a “Peripherals” driver group configured which contains USB Ethernet drivers used in our environment.  These are a necessity when deploying to hardware that does not have an embedded Ethernet port, such as the Dell XPS 12 and XPS 13.  You also could add common peripherals with complicated drivers such as a DisplayLink docking station or a Dell touch screen monitor.)
  9. Add the "ZUVMCheckModel.wsf" script to the "Scripts" folder of your deployment share.  The code for this script is included below.  I think the script should be fairly easy to adapt for your environment.
  10. Finally, structure your "Out-of-Box Drivers" folder to contain a "Models" folder, with one subfolder per OS (e.g. "Win7", "Win8") and, under each of those, a folder for each matching hardware model in your environment.  I get most of our driver collections from Dell:
    http://en.community.dell.com/techcenter/enterprise-client/w/wiki/2065.dell-driver-cab-files-for-enterprise-client-os-deployment.aspx
    (NOTE:  Thanks Dell!)
    The real challenge of maintaining this tree is in getting the model names right.  Use “wmic computersystem get model” to discover the model string for any new systems in your environment.  A table of a few current models I have been working with is included below.
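    If you prefer PowerShell, the same string can be pulled from WMI; this one-liner queries the same Win32_ComputerSystem class that the wmic command wraps:

    (Get-WmiObject -Class Win32_ComputerSystem).Model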

Dell Marketing Model Name to WMI Name Translator Table:

  • Dell XPS 12 (first generation) – "XPS 12 9Q23"
  • Dell XPS 12 (second generation) – "XPS 12-9Q33"
  • Dell XPS 13 (first generation) – "Dell System XPS L321X"
  • Dell XPS 13 (second generation) – "Dell System XPS L322X"
  • Dell XPS 14 – "XPS L421Q"
  • Dell Latitude 10 – "Latitude 10 - ST2"
  • VMware Virtual Machine – "VMware Virtual Platform"
  • Microsoft Hyper-V Virtual Machine – "Virtual Machine"

A fun nuance we encountered last week was a Latitude E5430 model that contained "no-vPro" after the model number. Dell does not provide separate driver CABs for vPro/non-vPro models, so I added a regular expression test for Latitudes that strips any cruft after the model number. That's one more problem down…

The following site contains a list of older model name translations:
http://www.faqshop.com/wp/misc/wmi/list-of-wmic-csproduct-get-name-results
As you can see, most Latitudes and Optiplexes follow sane and predictable model name conventions. I wish the same were true for the XPS.

Finally, I am indebted to the following sources for their generously detailed posts on driver management. Without their help, I doubt I would have been able to make this solution fly:

Jeff Hughes of the Windows Enterprise Support Server Core Team:
http://blogs.technet.com/b/askcore/archive/2013/05/09/how-to-manage-out-of-box-drivers-with-the-use-of-model-specific-driver-groups-in-microsoft-deployment-toolkit-2012-update-1.aspx

Andrew Barnes (aka Scriptimus Prime), whose posts on MDT driver management cover the basics of DriverGroups and model selection:
http://scriptimus.wordpress.com/2013/02/25/ltizti-deployments-injecting-drivers-during-deployment/
AND, on automating driver import into MDT (written for MDT 2012… some changes are required for 2013):
http://scriptimus.wordpress.com/2012/06/08/mdt-2012-creating-a-driverstore-folder-structure/

The incredible Michael Niehaus, who in this post discusses the use of DriverGroups and Selection Profiles:
http://blogs.technet.com/b/mniehaus/archive/2009/09/09/mdt-2010-new-feature-19-improved-driver-management.aspx

And finally Eric Schloss of the "Admin Nexus", who gave me the idea of developing a fallback for systems that do not match a known model. It was this key bit of smarts that gave me the confidence to move forward with a model-specific driver grouping strategy:
http://adminnexus.blogspot.com/2012/08/yet-another-approach-to-driver.html

ZUVMCheckModel.wsf script:

(NOTE: WordPress stripped the WSF header and footer tags from my script when it was first posted; they are restored below.)

<job id="ZUVMCheckModel">
<script language="VBScript" src="ZTIUtility.vbs"/>
<script language="VBScript">

Option Explicit
RunNewInstance

'//--------------------------------------------------------
'// Main Class
'//--------------------------------------------------------
Class ZUVMCheckModel
	
	'//--------------------------------------------------------
	'//  Constructor to initialize needed global objects
	'//--------------------------------------------------------
	Private Sub Class_Initialize
	End Sub
	
	'//--------------------------------------------------------
	'// Main routine
	'//--------------------------------------------------------

	Function Main()
	' //*******************************************************
	' //
	' // File: ZUVMCheckModel.wsf
	' //
	' // Purpose: Checks the model of this system against
	' //          a list of known machine models.  Returns
	' //          TRUE if a matching model is detected.
	' //
	' // Usage: cscript ZUVMCheckModel.wsf /Model: [/debug:true]
	' //
	' //*******************************************************
	
	'Use the following lines for debugging only.
	'oEnvironment.Item("TargetOS") = "Win7"
	'oEnvironment.item("DeployRoot") = "c:\local\mdt"
	'oEnvironment.Item("Model") = "Latitude E6500 some annoying variation"
	'End debug Params

	  Dim aModels()          'Array of models taken from DriverGroups.xml
	  Dim bOldDrivers        'Boolean indicating drivers present for an older OS version
	  Dim i                  'Generic integer for looping
	  Dim j                  'Generic integer for looping
	  Dim iRetVal            'Return code variable
	  Dim iMaxOS             'Integer representing the highest matching OS driver store
	  Dim oRegEx
	  Dim oMatch
	  Dim match
	  Dim oXMLDoc            'XML Document Object, for reading DriverGroups.xml
	  Dim Root,NodeList,Elem 'Objects in support of oXMLdoc
	  Dim sDGPath            'Path to DriverGroups.xml file
	  Dim sInitModel         'String representing the initial value of
	                         '   oEnvironment.Item("Model")
	  Dim sItem	             'Item in aModels array.
	  Dim sMaxOS             'OS Name of highest matching OS driver store
	  Dim sOSFound           'OS Name for a given matching driver set.
	  
	  oLogging.CreateEntry "Begin ZUVMCheckModel...", LogTypeInfo
	  
	  'Set the default values:
	  oEnvironment.Item("SupportedModel") = "NO"
	  iMaxOS = CInt(Right(oEnvironment.Item("TargetOS"),1))
	  'wscript.echo "Default value for iMaxOS = " & iMaxOS
	  bOldDrivers = false
	  sInitModel = oEnvironment.Item("Model")
	  'wscript.echo "sInitModel value = " & sInitModel
	  
	  Set oRegEx = New RegExp
	  oRegEx.Global = True
	  oRegEx.IgnoreCase = True
	  
	  'Modify the detected model name to handle known variations:
	  oRegEx.pattern = "Latitude"
	  if oRegEx.test(sInitModel) then
		oLogging.CreateEntry "Model is a Latitude.  Cleaning up the model name...", LogTypeInfo
		oRegEx.pattern = " "
		set oMatch = oRegEx.Execute(sInitModel)
		'wscript.echo "oMatch Count is: " & oMatch.count
		if oMatch.Count > 1 then
			i = oMatch.item(1).FirstIndex
			oEnvironment.Item("Model") = Left(sInitModel,i)
			'wscript.echo """"&oEnvironment.Item("Model")&""""
		end if
	  end if

	  'Check for DriverGroups.xml file, which will contain the supported model list:
	  iRetVal = Failure
	  iRetVal = oUtility.FindFile("DriverGroups.xml", sDGPath)
	  if iRetVal <> Success then
		oLogging.CreateEntry "DriverGroups file not found. ", LogTypeError
		exit function
	  end if 
	  oLogging.CreateEntry "Path to DriverGroups.xml: " & sDGPath, LogTypeInfo
	  
	  'Parse the DriverGroups.xml file:
	  oLogging.CreateEntry "Parsing DriverGroups.xml...", LogTypeInfo
	  Set oXMLDoc = CreateObject("Msxml2.DOMDocument") 
	  oXMLDoc.setProperty "SelectionLanguage", "XPath"
	  oXMLDoc.load(sDGPath)
	  Set Root = oXMLDoc.documentElement 
	  Set NodeList = Root.getElementsByTagName("Name")
	  oLogging.CreateEntry "NodeList Member Count is: " & NodeList.length, LogTypeInfo
	  'oLogging.CreateEntry "NodeList.Length variant type is: " & TypeName(NodeList.Length), LogTypeInfo
	  i = CInt(NodeList.length) - 1
	  ReDim aModels(i) 'Resize aModels to hold all matching DriverGroup items.
	  'oLogging.CreateEntry "List of Available Driver Groups:", LogTypeInfo
	  i = 0
	  For Each Elem In NodeList
		if InStr(Elem.Text,"Models\") then
			aModels(i) = Mid(Elem.Text,8)	'Add text after "Models\"
			'oLogging.CreateEntry aModels(i), LogTypeInfo
			i = i + 1
		end if
	  Next
	  oLogging.CreateEntry "End Parsing DriverGroups.xml.", LogTypeInfo

	  'Loop through the list of supported models to find a match:
	  oLogging.CreateEntry "Checking discovered driver groups for match to: " & oenvironment.Item("Model"), LogTypeInfo
	  For Each sItem in aModels
		oLogging.CreateEntry "Checking Driver Group: " & sItem, LogTypeInfo
		i = InStr(1, sItem, oEnvironment.Item("Model"), vbTextCompare)

		'wscript.echo TypeName(i) 'i is a "Long" number type.
		If i <> 0 Then
			oLogging.CreateEntry "Matching Model found.", LogTypeInfo
			
			j = InStr(sItem,"\")
			sOSFound = Left(sItem,j-1)
			'wscript.echo "sOSFound = " & sOSFound 
			if (InStr(1,sOSFound,oEnvironment.Item("TargetOS"),vbTextCompare) <> 0) then
				oLogging.CreateEntry "Drivers matching the requested OS are available.  Exiting with success.", LogTypeInfo
				oEnvironment.Item("SupportedModel") = "YES"
				iRetVal = Success
				Main = iRetVal
				Exit Function
			end if
			if iMaxOS > CInt(Right(sOSFound,1)) then
				iMaxOS = CInt(Right(sOSFound,1))
				'wscript.echo "iMaxOS = " & iMaxOS
				sMaxOS = sOSFound
				bOldDrivers = true
				'wscript.echo "sMaxOS = " & sMaxOS
			end if
		End If
	  Next
		
	  If bOldDrivers Then 'Run if sMaxOS is defined... set a boolean when this is defined and test against that?
		oLogging.CreateEntry "Model drivers were found for an OS older than the one selected...", LogTypeWarning
		oEnvironment.Item("SupportedModel") = "YES"
		oEnvironment.Item("TargetOS") = sMaxOS
	  Else
	    oLogging.CreateEntry "No matching drivers were found for this model.", LogTypeInfo
	  End If
	  
	  oLogging.CreateEntry "End ZUVMCheckModel.", LogTypeInfo

	  iRetVal = Success
	  Main = iRetVal

	End Function

End Class

</script>
</job>

Miracast – the technology with 100 names

Some of us in ETS have been experimenting with wireless display technologies in the hopes of finding solutions that will work for those of us who don't have an Apple client device and an Apple TV.  (To those of you with Macs, we are indeed jealous.  AirPlay truly dominates the wireless display market at this time.)

Much noise has been made of late concerning a technology called "Miracast", an open standard for wireless display.  It is built into the new Windows 8.1 OS, and into Android 4.2 and later.  When you can get a functional receiver (we tested the NetGear Push2TV with some success), it is a pretty slick technology.  However, issues getting the required drivers and firmware in place can be quite frustrating and can lead to failed deployments.  If you are working on a wireless display deployment, we would love to compare notes with you to see if we can reach some configuration recommendations for the rest of campus.

In the interest of information sharing, here are some factoids that we discovered:

Miracast = The “Wi-Fi Alliance” marketing name for the rather boringly named “Wi-Fi Display” standard.

Miracast = Sony "screen mirroring" = Panasonic "display mirroring" = Google "Wireless Display" = Samsung "AllShare Cast" = LG "SmartShare" = Intel "WiDi 3.5"
(i.e. implementers of the Wi-Fi Display protocol don't have to call it Miracast.  The question is, why would they want to call it something else?)

Miracast ≠ Intel WiDi versions prior to 3.5 ≠ AirPlay ≠ DIAL (ChromeCast) ≠ DLNA

(i.e. Intel WiDi has been replaced with a Miracast implementation, but keeps the WiDi name because it retains support for back-level WiDi receivers.  Also, don't confuse Miracast with ChromeCast (a DIAL implementation), AirPlay (an Apple technology), or DLNA (a media streaming solution).)

References:
http://en.wikipedia.org/wiki/Miracast
http://www.theverge.com/2013/9/25/4753264/is-the-airplay-killer-already-dead

App-V Server Configuration, Load Balanced Configuration

In my last post I discussed issues around Kerberos configuration for an App-V 5 server cluster in a load balanced configuration.  Today I will discuss subsequent configuration requirements for making App-V publishing function in a load-balanced environment.

After configuring standalone App-V servers with the Management and Publishing Server roles, I had good success with adding packages to the environment and publishing them.  However, when switching to a load-balanced configuration, the publishing server failed to pick up changes in the management configuration.  Helpful resources and troubleshooting notes follow:

  • A TechNet social page that I referenced in my previous post makes reference to this same problem:
    http://social.technet.microsoft.com/Forums/en-US/2b39e2b8-aba1-4e96-b18f-c5bcb9f12687/load-balancing-two-appv-50-servers-the-publishing-service-is-not-able-to-contact-the-management
    It does not, however, point me towards any solutions.  This seems like some sort of permissions problem, so I set Sysinternals Procmon to watch w3wp.exe for "Access Denied" events, but I get nothing.  However, I do see a fair amount of database traffic at IIS startup time.
  • The following TechNet blog provided a key tipoff in App-V server diagnostics:
    http://blogs.technet.com/b/appv/archive/2013/01/23/how-to-collect-app-v-5-0-debug-traces.aspx
      The trick was to select “Show Analytic and Debug Logs” under “Actions” in the event viewer.  With this option enabled, I now see App-V management and publishing debug logs instead of just the default App-V event logs.  The debug logs contain the real error.  We see that the SID recorded for the publishing server in the management server database does not match the SID of the account making the connection!  What we needed to do was delete the publishing server entries from the management configuration, and create one new “server” under the name of the publishing server service account, not the computer account.  I just updated the SQL database entry manually, but I likely could have just used the Silverlight UI instead.  This change cleared up the mismatched SID error, but now we get an “access denied” error to the publishing metadata directory.
  • The following blog gives an excellent technical overview of App-V server infrastructure and the general troubleshooting process for resolving configuration issues:
    http://kirxblog.wordpress.com/2012/05/11/app-v-5-beta-basic-troubleshooting/

    • Here it is suggested that I look at HKLM\Software\Microsoft\AppV\Server to review the management and publishing server configurations.  Sure enough, one problem seen here is that the publishing server is configured to connect to the management server on an http:// address, even though I updated the management servers to use https://.  I modified those registry values and restarted IIS.  Still no luck… 
    • This blog explains how published applications are read out of a metadata XML file that is exposed to the publishing server by the management server.  Both are stored in C:\ProgramData\Microsoft\AppV.  When running Procmon.exe against w3wp.exe, we see "Access Denied" on these directories for our service account.  After adding "Modify" rights for the service account to these directories, metadata updates start to happen again (a sketch of the permission grant follows this list).
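For reference, here is a minimal sketch of that permission grant, assuming a hypothetical service account "MYDOMAIN\svc-appv" and the default metadata location; the (OI)(CI) inheritance flags apply the Modify grant to existing and future files and subfolders:

icacls "C:\ProgramData\Microsoft\AppV" /grant "MYDOMAIN\svc-appv:(OI)(CI)M" /T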

It is unsurprising that switching from the use of a local server account to an AD service account caused access problems for the App-V server.  The difficulty of discovering where account info and rights needed to be updated was a bit of a surprise.  But thanks to the blog-o-sphere and the mighty "procmon.exe", we have our answers.

Now on to performance testing…

App-V 5 Server, F5 Load Balancers, and Kerberos

More fun today with Kerberos and load balancers.  Today’s challenge related to getting the Microsoft App-V publishing server to work with an F5 load balancer in a Layer 4/n-Path/DSR configuration.  Everything was working when I was accessing the individual server nodes, but when I switched to using the load balanced name and address, authentication started to fail.

After lots of log searching I eventually tried a wire trace, and found the following Kerberos error in the response from the App-V server to the App-V client:
KRB5KRB_AP_ERR_MODIFIED

Lots of different resources helped here:

  • This TechNet page explains various Kerberos errors and why they might occur:
    http://blogs.technet.com/b/askds/archive/2008/06/11/kerberos-authentication-problems-service-principal-name-spn-issues-part-3.aspx

    Of note is the scenario where the account handling the authentication request does not hold the SPN for which the request was made.  I had set the SPN on my IIS application pool identity, but further analysis of the error packet showed that the request was handled by my App-V server machine account, not the service account.  Augh!  Why?

  • This thread on TechNet Social was the biggest help:

    http://social.technet.microsoft.com/Forums/en-US/2b39e2b8-aba1-4e96-b18f-c5bcb9f12687/load-balancing-two-appv-50-servers-the-publishing-service-is-not-able-to-contact-the-management

    The user posted all of the steps they followed in configuring IIS and the service account SPN, including the tidbit:
    changed the authentication of the “Management Service” web site to useAppPoolCredentials=”true”
    I have never used this particular setting, so I dug into it…

  • The following MSDN article explains the IIS 7.0 feature of “kernel authentication”, how it affects the need for SPN entries, and its interplay with application pool identity accounts:
    http://blogs.msdn.com/b/webtopics/archive/2009/01/19/service-principal-name-spn-checklist-for-kerberos-authentication-with-iis-7-0.aspx

    Basically, with kernel-mode authentication, the SYSTEM account will handle all Kerberos authentication by default.  This explains why we were seeing Kerberos errors in the communications with the App-V client… the IIS pool identity account was not handling Kerberos delegation!

    Of special interest is this statement:
    Either:
    Disable Kernel mode authentication and follow the general steps for Kerberos as in the previous IIS 6.0 version.
    Or,
    [Recommended for Performance reasons]
    Let Kernel mode authentication be enabled and the Application pool’s identity be used for Kerberos ticket decryption. The only thing you need to do here is:
    1. Run the Application pool under a common custom domain account.
    2. Add this attribute “useAppPoolCredentials” in the ApplicationHost.config file.

  • This TechNet page documents how to configure Kerberos auth in IIS, and mentions the use of the IIS appcmd.exe to set the “useAppPoolCredentials” option:
    http://technet.microsoft.com/en-us/library/dd759186.aspx
    Included is the exact command line required to set the value to true:
     appcmd.exe set config -section:system.webServer/security/authentication/windowsAuthentication -useAppPoolCredentials:true
    (But the page does not really tell you what the setting is for, which is where the MSDN article comes in handy.)  A consolidated sketch of the fix follows this list.
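Putting the pieces together, here is a minimal sketch of the fix, assuming a hypothetical application pool identity "MYDOMAIN\svc-appv" and load-balanced name "appv.mydomain.com" (substitute your own names):

setspn -A HTTP/appv.mydomain.com MYDOMAIN\svc-appv
appcmd.exe set config -section:system.webServer/security/authentication/windowsAuthentication -useAppPoolCredentials:true
iisreset

The first command registers the load-balanced name against the pool identity rather than the machine account; the second tells IIS kernel-mode authentication to decrypt Kerberos tickets with the application pool's credentials; the reset makes the change take effect.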

So, Kerberos under IIS 7 and later has some nuances not present in IIS 6.  I wonder how I did not encounter this before?

Using SQL Logins with “AlwaysOn” Availability Groups

I had some fun today getting a SQL 2012 "AlwaysOn" Availability Group working with a database that was configured to use SQL Logins.  I am working with the vendor to see if we can use Integrated Authentication instead, but in the meantime I managed to get failover functional.

The problem I was experiencing was that on failover, the database users defined in the database lost their mappings to the SQL Login on the server.  I had created SQL Logins on both nodes of the Availability Group with identical usernames and passwords, but the mapping still failed.  Turns out this was happening because SQL Logins (like AD and NT accounts) have SIDs, and these SIDs are used to map database users to logins.  To correct the problem, I needed to create SQL Logins with identical SIDs on the two servers.

Procedure:

  1. Create SQL Logins on one node of the Availability Group and perform mappings.
  2. Lookup the SIDs of the users using the following query:
    SELECT SUSER_SID ('[SqlLogin]')
  3. Script out creation of the matching logins:
    USE [master]
    GO
    CREATE LOGIN [SqlLogin] WITH PASSWORD=N'[Password]', SID=[HexadecimalSID], DEFAULT_DATABASE=[myDatabase], DEFAULT_LANGUAGE=[us_english], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
    GO
  4. Fail over the Availability Group and verify that user mappings are working as planned.  Verify that passwords are working, too.  (A PowerShell sketch of steps 2 and 3 follows this list.)
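For repeatability, here is a minimal PowerShell sketch of steps 2 and 3, assuming the SQLPS module's Invoke-Sqlcmd cmdlet is available and hypothetical node names NODE1 and NODE2:

Import-Module SQLPS -DisableNameChecking

# Read the login's SID (a byte array) from the first node:
$row = Invoke-Sqlcmd -ServerInstance 'NODE1' -Database 'master' -Query "SELECT SUSER_SID('SqlLogin') AS Sid"
$sidHex = '0x' + (($row.Sid | ForEach-Object { $_.ToString('X2') }) -join '')

# Create a login with the same name, password, and SID on the second node:
Invoke-Sqlcmd -ServerInstance 'NODE2' -Database 'master' -Query @"
CREATE LOGIN [SqlLogin] WITH PASSWORD=N'[Password]', SID=$sidHex, DEFAULT_DATABASE=[myDatabase], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
"@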

Another useful tidbit here is a script to re-map SQL database users to local SQL logins:

USE myDatabase
GO
sp_change_users_login @Action='update_one', @UserNamePattern='databaseUserName', @LoginName='SqlLoginName';
GO

Really I am surprised that I did not run into this problem with our SQL 2008 mirrored databases.  The problem should have been present there as well.

The case of the undeletable directory

I had a “fun” time today with some directories that I could not delete.  These were mandatory roaming profiles that I previously had attempted to upload to a file server using “robocopy”.  Owing to switches that I used when performing the upload, directory junctions were treated as files, and robocopy got caught in a recursive loop because of a circular reference to “Application Data”.  By the time I caught the problem, I had created a set of nested “Application Data” directories that must have been over 400 characters long.

All of the tricks I had used in the past to delete "deep" directories were failing me.  "RD /s /q [dirName]" failed with "Access Denied", even though I was the directory owner.  Running the command as System encountered the same problem.  I mapped a drive to the directory, as deep down in the nested folders as I could get.  From there, I was able to "CD" to the deepest "Application Data" directory, but I still could not delete the directory.  (I got a "directory not empty" error.)

Eventually, Google unveiled a suggestion to use “robocopy” to mirror an empty directory to the problematic directory.  (Unfortunately, I have lost track of the link, so I cannot give credit where it is due.)  “Good idea,” I thought.  Why not use the utility that created the problem to solve the problem?  After all, if robocopy can create paths that are 400 characters long, perhaps it can delete them, too.

To test, I simply created an empty directory “E:\foo”, and ran the command:
robocopy.exe /mir /e E:\foo E:\mandatory\corruptProfile

Robocopy quickly chewed through the problematic profile, and a few seconds later I had an empty directory that I was able to delete.  Hurray!

LiteTouch failures on UEFI systems

Today I got a complaint that a user could not deploy Windows 8 on a UEFI-enabled computer (a Latitude 10 tablet, to be precise) using our MDT 2012 Update 1 deployment share (in Lite Touch mode, of course).  At the same time, I was experiencing problems deploying Windows 8 on a Dell XPS 14 with UEFI firmware enabled.  Interestingly, the deployment problem did not occur when running with Legacy BIOS enabled.  What gives?

The error reported by Lite Touch was: "MDT FAILURE (5616): Verify BCDbootex".  Much digging through the logs revealed the command line that triggered the problem (it was spelled out in "LTIApply.log").  The command was something like:
\\server\deployShare\Tools\x64\bcdboot c:\windows /l en-us /s v: /f UEFI

If I try to run the command above manually, I just get back the bcdboot.exe help text.  Further investigation revealed that the version of bcdboot.exe on the server share was dated 2009.  This is the Windows 7 RTM release version!

I created a new deployment share to see if the bad version of bcdboot.exe would get placed there again… it did not.  So, I removed bcdboot.exe from the deployment share (along with an old version of ImageX.exe and some other expired utilities).  Upon re-running LiteTouch, deployment succeeded.  Both Windows 7 and Windows 8 boot media contain bcdboot.exe in the search path… no server-side copy is required.  I wonder how it got there in the first place?  Maybe we were trying to make it easier for people to capture system images using the LiteTouch boot media.  Another entry for the department of unintended consequences…

VDI Profile Loading Delays

We are noticing that it takes rather a long time for users to log in to our VDI environment (~2 minutes, in some circumstances).  I did some analysis of login times using Sysinternals Procmon.  (Enable boot logging, use the “view process tree” feature to look at process times at logon.  See http://blogs.technet.com/b/markrussinovich/archive/2012/07/02/3506849.aspx for details).  What I found was that a child process of explorer.exe called “ie4uinit.exe” was running for most of this time.  This process appears to be part of Microsoft “Active Setup” (discussed in some detail here: http://blog.ressoftware.com/index.php/2011/12/29/disable-active-setup-revealed/).

So what if we disable Active Setup?  Noise on the Internet suggests that this is possible, simply by deleting the key:
HKEY_LOCAL_MACHINE\Software\Microsoft\Active Setup
as suggested here:
http://communities.vmware.com/thread/292229?start=0&tstart=0

However, there is some indication that this could have unintended consequences.  In my case, it immediately caused a logon script to fail to run.  Bummer!

What other solutions are possible?  Members of the Windows in Higher Education mailing list recently recommended using mandatory profiles.  There is a reasonably good rundown of the mandatory profile creation process here:
http://markswinkels.nl/2009/12/how-to-create-a-mandatory-profile-in-windows-server-2008-r2/
Missing details are:

  1. It is possible for the mandatory roaming profile to be stored locally (i.e. "C:\Users\VDI_Mandatory.V2") to avoid over-the-network profile copy delays.  However, in our View environment, using a network location appears to be faster!
  2. The mandatory roaming profile can be specified using the Group Policy settings in Computer Configuration -> Policies -> Administrative Templates -> System -> User Profiles.  (See "Set roaming profile path for all users logging onto this computer" and "Delete cached copies of roaming profiles".)

In testing, I found initial logon times were reduced from two minutes to approximately 20 seconds.  Good!  (But still not great.)  Additional benefits are that it is no longer necessary to run the logon script that I developed to customize the Start Menu and Task Bar.  I also can remove the Group Policy preferences that clean up local profiles on the computer.

MBAM Configuration Nuances

This week we are continuing testing of the new Microsoft Bitlocker Administration and Management 2.0 tool (MBAM).

MBAM is not overly complicated, but it does have several service tiers and dependencies that make initial setup a bit irksome. After plowing through configuration of a SQL database, SQL Reporting Services, and IIS, we still needed to configure MBAM Group Policy settings, and then do a fair number of tweaks to make the service actually work. Here are the most significant deviations from the official documentation:

  1. The Group Policy templates for MBAM are not uploaded to the AD Policy Store during product installation, nor does the documentation recommend that you complete this step. However, if you want to be able to edit MBAM policy from any workstation in the domain, you really do need to upload the ADMX templates. Making this happen is easy… just use the MBAM installer to install the MBAM policy templates locally, then open C:\Windows\PolicyDefinitions and copy BitLockerManagement.admx and BitLockerUserManagement.admx to \\[domain]\SYSVOL\[domain]\Policies\PolicyDefinitions (you will need domain admin rights to do this). Also copy the corresponding .adml files from the local language directory of your local PolicyDefinitions folder to the matching language directory in the central store (in my case, the "en-US" subdirectory). (A PowerShell sketch of this copy follows this list.)
  2. After installing the MBAM Client and policy settings, clients were failing to auto-initiate encryption, and were failing to report status to the management server.  The MBAM Admin Event Logs were showing the following error:
    Log Name: Microsoft-Windows-MBAM/Admin
    Source: Microsoft-Windows-MBAM
    Event ID: 4
    Task Category: None
    Level: Error
    User: SYSTEM
    Computer: machinename.domainname.com
    Description: An error occurred while sending encryption status data.
    Error code: 0x803d0013

    This is occurring for a few reasons.  One, the MBAM server is not trusted for delegation, so it cannot perform Kerberos authentication in IIS.  Two, the public URL for MBAM services (https://bitlocker.uvm.edu) does not match the internal name of the server (BAM1).  To fix this, we needed to perform a few additional configuration steps:

    1. Create the following key and value on the MBAM management server:
      HKEY_LOCAL_MACHINE\Software\Microsoft\MBAM
      DWORD(32-bit) - DisableMachineVerification
      Value = 1
    2. On the MBAM Administration Server AD object, enable the “Trust for delegation for any service (Kerberos Only) option”, under the Delegation tab.
    3. Use the “setspn” utility to add additional principal names for the public URL of the server to the AD server account:
      setspn -A HOST/bitlocker.mydomain.com MYDOMAIN\MyServer$
      setspn -A HTTP/bitlocker.mydomain.com MYDOMAIN\MyServer$
      setspn -A RestrictedKrbHost/bitlocker.mydomain.com MYDOMAIN\MyServer$
      (Note that if using a service account to run the MBAM Administration Service, you should use “setspn” to set the HOST/HTTP names for the service account instead of the domain computer account).
    4. It appears that it may also be necessary to add the "BackConnectionHostNames" REG_MULTI_SZ value under "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" to include any public DNS names used by the MBAM Administration Server (this likely is necessary only in a load-balanced configuration).

    We then needed to perform an IISRESET on the management server, and cycle the MBAM Clients.

  3. The MBAM Help Desk web application was failing to display reports.  This was happening because the installer grabbed the unencrypted Reporting Services URL from the Reporting Services instance.  I had to open:
    C:\inetpub\Microsoft BitLocker Management Solution\Help Desk Website\web.config
    then locate the tag containing the Reporting Services URL, and edit its value to the SSL version of the Reporting Services web site.
  4. The MBAM documentation claims that you will use MBAM policies in place of standard Windows BitLocker policies.  This is somewhat misleading… Many MBAM policy settings also will change the “classic” BitLocker policy settings, so it will appear that you have configured both classic and MBAM policies in the editor.  This would not really be a problem were it not for the fact that MBAM policies are not comprehensive.  You may need to return to the “classic” settings to configure appropriate behavior in your environment.  For example, we experienced difficulty in encrypting a Dell Latitude 10 tablet using MBAM.  On this machine, we saw the following error in the MBAM Admin Event Log:
    Event ID: 2
    An error occurred while applying MBAM policies.
    Volume ID:\\?\Volume{VolumeGUID}\
    Error Code: 0x803100B6
    Details:  No pre-boot keyboard or Windows recovery Environment detected.  The user may not be able to provide the required input to unlock the volume.

    This error is happening because our policy is set to “Allow PIN” (BitLocker PIN Authenticator is allowed, but not required).  Apparently, MBAM default-fails the attempted encryption, even though this is not a “fatal” error.  To allow encryption to continue, I needed to set the classic policy “Enable use of BitLocker authentication requiring preboot keyboard input on slates” as defined here:
    http://technet.microsoft.com/en-us/library/jj679890.aspx#BKMK_slates
    With this policy in place, encryption completes successfully on the tablet computer.
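As promised in step 1, here is a minimal PowerShell sketch of the ADMX central-store copy, plus the registry value from step 2.1.  It assumes a hypothetical domain "mydomain.com" and the "en-US" language directory, and the copy requires domain admin rights:

# Copy the MBAM policy templates to the AD central store:
$store = '\\mydomain.com\SYSVOL\mydomain.com\Policies\PolicyDefinitions'
Copy-Item 'C:\Windows\PolicyDefinitions\BitLockerManagement.admx' $store
Copy-Item 'C:\Windows\PolicyDefinitions\BitLockerUserManagement.admx' $store
Copy-Item 'C:\Windows\PolicyDefinitions\en-US\BitLockerManagement.adml' "$store\en-US"
Copy-Item 'C:\Windows\PolicyDefinitions\en-US\BitLockerUserManagement.adml' "$store\en-US"

# On the MBAM management server, add the DisableMachineVerification value (step 2.1):
New-Item -Path 'HKLM:\Software\Microsoft' -Name 'MBAM' -Force | Out-Null
New-ItemProperty -Path 'HKLM:\Software\Microsoft\MBAM' -Name 'DisableMachineVerification' -PropertyType DWord -Value 1 -Force | Out-Null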

Other than these caveats, the tool does appear to be working.  Setting up our PGP Universal Server was easier, but suffering through the pain of ongoing PGP disk encryption support was agonizing.  Hopefully a little time spent configuring a solid BitLocker support environment will bear lasting fruit for our constituents down the road.

Additional Resources:

Rick Delserone’s MBAM: Real World Information – A rundown on MBAM Certificate Configuration, Group Policy Templates, and undocumented registry settings:
http://www.css-security.com/blog/mbam-real-world-information/