
Replacing DirSync with Microsoft Azure Active Directory Sync Services

Back in October, MS announced that the new “MS Azure AD Sync Services” (AAD Sync) had gone GA, which means that DirSync is now officially deprecated.

Ironically, I had gotten wind that “Azure AD Connect” would be replacing Azure AD Sync Services only a few weeks before:

http://blogs.technet.com/b/ad/archive/2014/08/04/connecting-ad-and-azure-ad-only-4-clicks-with-azure-ad-connect.aspx

Deprecated on release?  Not quite… it appears that Azure AD Sync Services will be at the core of Azure AD Connect, so time spent migrating to AAD Sync will not be wasted.

I decided to make this migration happen now, because we will be ramping up to provision faculty and staff with access to Office 365 ProPlus software in the near future, and I would like to be working with a tool that does not require an unsupported configuration (we removed the “Replicating Directory Changes” right from our DirSync service account, which at least at one time was considered a Bad Thing To Do).

The Good News:

  • The out-of-box configuration does not require a service account with the “Replicating Directory Changes” right, making filtering of sensitive user attributes much easier.
  • The initial setup wizard allows you to de-select attributes that you don’t want to sync to Azure.  No more wading through dense miisclient.exe dialogs and walking off-support to do basic filtering!  The required attributes are listed in grey, so you don’t risk filtering an attribute that Azure absolutely must have.
  • AAD Sync actually has some documentation available on its installation and use over at MSDN:
    http://msdn.microsoft.com/en-us/library/azure/dn790204.aspx
    …And there actually are quite a few important details in this seemingly cursory documentation.
  • AAD Sync entirely replaces the UI-heavy, un-maintainable attribute filtering tools in DirSync with a new “Declarative Provisioning” engine that allows robust attribute filtering and transformation rules to be defined by the sysadmin.
  • Custom “Inbound” attribute filters will be preserved during product upgrades, according to the documentation.
  • The documentation has been updated with instructions on how to make domain-based sync filters actually work in a multi-domain forest.  This was a key detail that was missing in the DirSync documentation.

The Bad News:

  • The Declarative Provisioning language, while based on VB for Applications (the “best known language by systems administrators”, according to the docs.  Whaaaat?!?!), is not actually VBA.  Debugging methods for this new VB-like thing are not documented at all, and examples of its usage are few and far between.

So what did I learn during this deployment process that was Blog-worthy?  How to debug Declarative Provisioning!

Debugging Declarative Provisioning (Import Expression Filters):

Let’s take an example.  We want to sync to Azure only students and users who are to be provisioned for Lync in Office 365.  To accomplish this, we have populated extensionAttribute1 with the text value “Student” for students.  Lync users have had extensionAttribute2 populated with the text value “lync”.

When running the Synchronization Rule Editor, we create a new “inbound” rule, assign it to object type “inetOrgPerson”, set the link type to “join”, and give it a precedence under 100.  We skip the Scoping and Join Rules options, and go to “Transformations”.

The flowType gets set to “Expression”, the Target Attribute to “cloudFiltered”, and we define a VBA-like expression in the “Source” field:

[Screenshot: EditSyncRule]

My expression follows:

IIF(CBool(InStr([extensionAttribute1], "Student", 1)), NULL, IIF(CBool(InStr([extensionAttribute2], "lync", 1)), NULL, True))

So what is the intent here? “cloudFiltered” is a metaverse attribute that can be set to suppress synchronization of an account upon export to Azure AD.  If set to “True”, the account should not get synced to Azure.

Our expression uses the “IIF” function to check whether “extensionAttribute1” contains “Student”.  If so, “IIF” returns “NULL”, and NULL is fed to the “cloudFiltered” attribute (which, according to the docs, will cause the attribute to be cleared if it is already set).  However, if extensionAttribute1 does not contain “Student”, we perform a second test to see if “extensionAttribute2” contains “lync”.  If so, cloudFiltered again gets set to NULL.  If “extensionAttribute2” does not contain “lync”, then “cloudFiltered” gets set to the boolean value “True”.
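
To make the intent easier to see, here is a rough PowerShell rendering of the same logic (illustrative only… the sync engine does not run PowerShell, and the variables below are stand-ins for the metaverse attributes and the cloudFiltered target):

# Rough PowerShell equivalent of the cloudFiltered expression (illustrative only).
# Like InStr(..., 1), -match is case-insensitive by default.
if ($extensionAttribute1 -match 'Student')  { $cloudFiltered = $null }  # student: do not filter
elseif ($extensionAttribute2 -match 'lync') { $cloudFiltered = $null }  # Lync user: do not filter
else                                        { $cloudFiltered = $true }  # everyone else: suppress sync to Azure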

Syntax quirks:

  • Items enclosed in square brackets [] refer to metaverse attributes, and serve as a sort of automatic variable for provisioning.
  • Note that the InStr() function used here is not the same one documented in the VB for Applications language reference:
    http://msdn.microsoft.com/en-us/library/office/gg264811(v=office.15).aspx
    The “start position” parameter in the docs is not supported here, although it does appear that the “compare” parameter is supported (represented by the “1” in my example, which means perform a case-insensitive comparison).
  • Everything is case sensitive… even the function names!  “IIF()” is all caps.  CBool(), CStr(), InStr(), and Error() are all “Camel Cased”.
  • IIF() is a bit of a strange beast by itself.  You might need this syntax reference:
    http://msdn.microsoft.com/en-us/library/office/gg264412(v=office.15).aspx

Now that we have a filter, how do we test it?  The docs say to “Run a full import followed by a full sync” on the AD management agent.  While this works, it is also a bit time-consuming, and if there are syntax errors in your expression, you will get a lot of errors in your sync logs.  Is there something better?

As it turns out, yes there is.  Special thanks to Brian Desmond (co-author of the O’Reilly “Active Directory Cookbook”) for putting me on the right track:

  1. In miisclient.exe, right-click your AD management agent and select “Search Connector Space”:
    [Screenshot: SearchConnectorSpace]
  2. Specify the DN of a user that you want to test your expression on in the search box.  You can quickly get a DN using PowerShell with the ActiveDirectory module:
    Get-ADUser -Identity [samAccountName] | Select-Object DistinguishedName
    Click “Search”, click the returned object, and click “Preview”:
    [Screenshot: PreviewUser]
  3. In the “Preview” dialog, select either a Full or Delta sync, then click “Generate Preview”, and select the “Import Attribute Flow” option on the left.  Review the list of Sync Rules on the bottom pane to make sure your rule was processed.  Any rule that generates a change will be displayed in the top pane:
    [Screenshot: ImportFlow]
    In this case, the expression did not generate a change in the existing metaverse object.
  4. If an expression that you have authored generates an error, a full stack trace will appear in the Error Details section of the Preview pane.  Scroll down to the bottom of the stack, where it is likely that the actual problem with your expression will be identified:
    [Screenshot: PreviewError]
    In this example, I inserted an “Error()” function with a null argument.  The language engine did not like this.
  5. Back in the “Start Preview” tab, you could “Commit Preview” to apply the filter rules for this specific object, and then view the results in the Metaverse Search component of miisclient.exe.

Using the preview tool, I was able to get my filter expressions working correctly with minimal fuss and delay.  The final hurdle to getting filtered attributes set correctly was understanding what the various “run profiles” in AAD Sync actually do.

  • AD Agent – Full Import:  Appears to sync data from AD to the AAD Sync Metaverse without applying Declarative Provisioning Filters.  I think.  Not 100% on this one.
  • AD Agent – Full Synchronization:  Appears to apply Declarative Provisioning filters on all metaverse objects.
  • Azure AD Agent – Full Sync: Appears to apply export filtering logic to metaverse objects entering the Azure connector space.
  • Azure AD Agent – Export:  Appears to sync metaverse objects to Azure without applying filtering logic.  My impression is that you must do a “sync” before an export, if you want logic applied to Azure AD objects as you intended.

My understanding of the run profiles may be flawed, but I will note that if I perform syncs in the following sequence, I get the expected results:

AD Agent:Full Import -> AD Agent:Full Sync -> Azure Agent:Full Sync -> Azure Agent:Export
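
Incidentally, this sequence can be scripted.  Below is a minimal sketch that assumes AAD Sync inherits the FIM sync engine’s WMI provider (it appears to), and that your management agent names contain “Active Directory” and “Windows Azure”… adjust the match patterns and run profile names to your own installation:

# Hedged sketch: invoke the run profiles in sequence via the sync engine's WMI provider.
$ns = 'root\MicrosoftIdentityIntegrationServer'
$adMA    = Get-WmiObject -Namespace $ns -Class MIIS_ManagementAgent | ? {$_.Name -match 'Active Directory'}
$azureMA = Get-WmiObject -Namespace $ns -Class MIIS_ManagementAgent | ? {$_.Name -match 'Windows Azure'}
$adMA.Execute('Full Import')
$adMA.Execute('Full Synchronization')
$azureMA.Execute('Full Synchronization')
$azureMA.Execute('Export')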

Although it appears that the more authoritative way to get all of your records reconciled is:

               AD Agent:      MV:   Azure Agent:
               ~~~~~~~~~      ~~~   ~~~~~~~~~~~~
               Full Import => MV
               Full Sync   => MV
                              MV => Export
                              MV <= Delta Sync
               Export      <= MV                 (Only needed in Azure write-back scenarios)

However, with other sequences, I have seen strange results such as metaverse objects not getting updated with the intended filter results, or objects getting declared as “disconnectors” by the Azure AD Agent, and thus getting deleted from Azure.  Ugh!

That’s all for today… hopefully this info will help keep someone else out of trouble, too.

Provisioning students with Office 365 ProPlus licenses

NOTE:  This post was updated on 2014-11-21 to reflect refinements in the PowerShell provisioning script.

Interesting… I still seem to be working at UVM. There must be a story there, but you won’t read about it here.

Anyway, after returning from my brief hiatus at Stanford University, I got back on the job of setting up Azure DirSync with federated login to our in-house Web SSO platforms. I’ll need to post about the security changes required to make that work with UVM’s FERPA interpretation. To summarize: we got it working.

However, once students can log in to Office 365, we need to provision them with licenses. DirSync can’t do this, so I needed to script a task that will grant an Office 365 ProPlus license to any un-provisioned active student. You will find the script, mostly unaltered, below. I just set it up as a scheduled task that runs sometime after the nightly in-house Active Directory update process.

To be useful outside of UVM, the code would need to be customized to handle the logic used in your organization to determine who is a student. We have extended the AD schema to include the eduPerson schema, and have populated the “eduPersonPrimaryAffiliation” attribute with “Student” for currently active students. If you do something different, have a look at the “Get-ADUser” line, and use a different LDAP query to fetch your student objects.
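
For example, if your environment flags students with a (hypothetical) “Students” group instead of an eduPerson attribute, the lookup might look more like this:

# Hypothetical alternative: select students by group membership instead of eduPersonPrimaryAffiliation:
Get-ADUser -LdapFilter '(&(objectClass=user)(memberOf=CN=Students,OU=Groups,DC=myschool,DC=edu))' -SearchBase 'dc=myschool,dc=edu' -SearchScope Subtree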

Enjoy.

# Provision-MSOLUsers.ps1 script, by J. Greg Mackinnon, 2014-07-30
# Updated 2014-11-20, new license SKU, corrections to error capture commands, and stronger typing of variables.
# Updated 2014-11-21, added "license options" package to the add license command, for granular service provisioning.
#
# Provisions all active student accounts in Active Directory with an Office 365 ProPlus license.
#
# Requires:
# - PowerShell Module "MSOnline"
# - PowerShell Module "ActiveDirectory"
# - Azure AD account with rights to read account information and set license status
# - Credentials for this account, with password saved in a file, as detailed below.
# - Script runs as a user with rights to read the eduPersonAffiliation property of all accounts in Active Directory.
#
#    Create a credential file using the following procedure:
#    1. Log in as the user that will execute the script.
#    2. Execute the following line of code in PowerShell:
#    ConvertTo-SecureString -String 'password' -AsPlainText -Force | ConvertFrom-SecureString | out-file "c:\local\scripts\msolCreds.txt" -Force
#

Set-PSDebug -Strict

#Setup local variables:
[string] $to = 'admin@myschool.edu'
[string] $from = 'ProvisioningScript@myschool.edu'
[string] $smtp = 'smtp.myschool.edu'
[string] $msolUser = 'myschool.dirsync@myTenant.onmicrosoft.com'

#initialize log and counter:
[string[]] $log = @()
[long] $pCount = 0

#initialize logging:
[string] $logFQPath = "c:\local\temp\provision-MSOLUsers.log"
New-Item -Path $logFQPath -ItemType file -Force | Out-Null

[DateTime] $sTime = get-date
$log += "Provisioning report for Office 365/Azure AD for: " + ($sTime.ToString()) + "`r`n"

function errLogMail ($err,$msg) {
    # Write error to log and e-mail function
    # Writes out the error object in $err to the global $log object.
    # Flushes the contents of the $log array to file, 
    # E-mails the log contents to the mail address specified in $to.
    [string] $except = $err.exception;
    [string] $invoke = $err.invocationInfo.Line;
    [string] $posMsg = $err.InvocationInfo.PositionMessage;
    $log +=  $msg + "`r`n`r`nException: `r`n$except `r`n`r`nError Position: `r`n$posMsg";
    $log | Out-File -FilePath $logFQPath -Append;
    [string] $subj = 'Office 365 Provisioning Script:  ERROR'
    [string] $body = $log | % {$_ + "`r`n"}
    Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp
}

#Import PS Modules used by this script:
try {
    Import-Module MSOnline -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading Azure AD (MSOnline) PowerShell module."
    errLogMail $myError $myMsg
    exit 101
}
try {
    Import-Module ActiveDirectory -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading ActiveDirectory PowerShell module."
    errLogMail $myError $myMsg
    exit 102
}

#Get credentials for use with MS Online Services:
try {
    $msolPwd = get-content C:\local\scripts\msolCreds.txt | convertto-securestring -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered getting creds from file."
    errLogMail $myError $myMsg
    exit 110
}
try {
    $msolCreds = New-Object System.Management.Automation.PSCredential ($msolUser, $msolPwd) -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in generating credential object."
    errLogMail $myError $myMsg
    exit 120
}
#Use the following credential command instead of the block above if running this script interactively:
#$msolCreds = get-credential

#Connect to MS Online Services:
try {
    #ErrorAction set to "Stop" to force any errors to be terminating errors.
    # default behavior for connection errors is non-terminating, so the "catch" block will not be processed.
    Connect-MsolService -Credential $msolCreds -ErrorAction Stop
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in connecting to MSOL Services."
    errLogMail $myError $myMsg
    exit 130
}
$log += "Connected to MS Online Services.`r`n"

#Generate license report:
$studAdv = Get-MsolAccountSku | ? {$_.accountSkuId -match 'STANDARDWOFFPACK_IW_STUDENT'}
$log += 'Office 365 ProPlus for Student - license report:'
$log += 'Total licenses: ' + $studAdv.ActiveUnits
$log += 'Consumed licenses: ' + $studAdv.ConsumedUnits
[int32] $alCount = $studAdv.ActiveUnits - $studAdv.ConsumedUnits
$log += 'Remaining licenses: ' + $alCount.toString() + "`r`n" 

#Set license options for students:
$stuLicOpts = New-MsolLicenseOptions -AccountSkuId $studAdv.AccountSkuId -DisabledPlans YAMMER_EDU,SHAREPOINTWAC_EDU,SHAREPOINTSTANDARD_EDU,EXCHANGE_S_STANDARD,MCOSTANDARD

#Retrieve active student accounts into a hashtable:
[hashtable] $students = @{}
try {
    # NOTE:  The filter used for collecting students needed to be modified to fetch admitted students that are not yet active.
    #   This is a 'hack' implemented by FCS to address the tendency for the registrar not to change student status until the first day of class.
    #   (Actual student count should be lower, but we have no way to know what the final count will be until the first day of classes.)
    #
    #get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(eduPersonAffiliation=Student))' -SearchBase 'ou=people,dc=myschool,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
    get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(extensionAttribute1=*Student*))' -SearchBase 'ou=people,dc=myschool,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Active Directory."
    errLogMail $myError $myMsg
    exit 200
}
$log += "Retrieved active students from Active Directory."
$log += "Active student count: " + $students.count


#Retrieve unprovisioned accounts from Azure AD:
[array] $ulUsers = @()
try {
    #Note use of "Synchronized" to suppress processing of cloud-only accounts.
    $ulUsers += Get-MsolUser -UnlicensedUsersOnly -Synchronized -All -errorAction Stop
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Azure AD. "
    errLogMail $myError $myMsg
    exit 300
}
$log += "Retrieved unlicensed MSOL users."
$log += "Unlicensed user count: " + $ulUsers.Count + "`r`n"

#Provision any account in $ulUsers that also is in the $students array:
foreach ($u in $ulUsers) {
    if ($students.get_item($u.UserPrincipalName) -eq $true) {
        #Uncomment to enable verbose logging of user to be processed.
        #$log += $u.UserPrincipalName + " is an active student."
        try {
            if ($u.UsageLocation -notmatch 'US') {
                #Set the usage location to the US... this is a prerequisite to assigning licenses.
                $u | Set-MsolUser -UsageLocation 'US' -ErrorAction Stop ;
                #Uncomment to enable verbose logging of usage location assignments.
                #$log += 'Successfully set the usage location for the user. '
            }
        } catch {
            $myError = $_
            $myMsg = "Error encountered in setting Office 365 usage location to user. "
            errLogMail $myError $myMsg
            exit 410
        }
        try {
            #Assign the student advantage license to the user, with desired license options
            $u | Set-MsolUserLicense -AddLicenses $studAdv.AccountSkuId -LicenseOptions $stuLicOpts -ErrorAction Stop ;
            #Uncomment to enable verbose logging of license assignments.
            #$log += 'Successfully set the Office license for the user. '
            $pCount += 1
        } catch {
            $myError = $_
            $myMsg = "Error encountered in assigning Office 365 license to user. "
            errLogMail $myError $myMsg
            exit 420
        }
    } else {
        $log += $u.UserPrincipalName + " is not an active student.  Skipped Provisioning."
    }
}

#Add reporting details to the log:
$eTime = Get-Date
$log += "`r`nProvisioning successfully completed at: " + ($eTime.ToString())
$log += "Provisioned $pCount accounts."
$tTime = new-timespan -Start $stime -End $etime
$log += 'Elapsed Time (hh:mm:ss): ' + $tTime.Hours + ':' + $tTime.Minutes + ':' + $tTime.Seconds

#Flush out the log and mail it:
$log | Out-File -FilePath $logFQPath -Append;
[string] $subj = 'Office 365 Provisioning Script:  SUCCESS'
[string] $body = $log | % {$_ + "`r`n"}
Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp

NTLMv2 – Troubleshooting Notes – NTLM Negotiation is a lie!

We are now in the process, several years late, of trying to disallow use of LM and NTLM authentication on our campus. First attempt? Train Wreck!

For stage 1 of enforcement we chose to restrict our domain controllers to accept only NTLMv2 authentication, and reject LM/NTLM. Our thinking was that clients are supposed to negotiate for the best auth security available, so by forcing NTLMv2 at the DC, we would be able to quickly flush out non-compliant clients. Further, we assumed that there would not be a lot of non-compliant clients. Guess what? We were WRONG WRONG WRONG.

You know which clients are non-compliant? Windows 7, IE8, IE9, FireFox 4b10, Chrome 9. What? Really? It’s true… when accessing IIS web servers configured to use Windows authentication, with NTLMv2 enforced on the authenticating server, even Windows 7 will negotiate down to the weakest possible NTLM level.  So… access to SharePoint 2007 from IE 9 beta on Win 7 falls back to welcome-to-the-80s LM authentication.  This is very shocking, and annoying. When inspecting authentication negotiation traffic to the web server, we see that both client and server have indicated that NTLMv2 is supported, and that LM should not be used. However, when the client sends its authentication challenge response, it sends only NTLM and LM responses, not NTLMv2. Only by setting local security policy on the Windows client to “NTLMv2 only” are we able to prevent the sending of LM/NTLM. Negotiation is a lie!

So, we are going to take this up with MS Support. Some resources that may be helpful going forward:

http://technet.microsoft.com/en-us/magazine/2006.08.securitywatch.aspx
A good explanation of how the “LMCompatibilityLevel” security setting works.  Contains one major factual error… it claims that clients and servers will always negotiate the highest mutually-supported LM auth level.  This may be true for SMB/CIFS.  It certainly is not true for web auth.
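
For reference, the “NTLMv2 only” client setting discussed in this post corresponds to LMCompatibilityLevel = 5 (“Send NTLMv2 response only. Refuse LM & NTLM”).  A minimal sketch of the registry equivalent of that Group Policy setting, for anyone scripting the rollout:

# Set "Network security: LAN Manager authentication level" to level 5 (NTLMv2 only; refuse LM & NTLM).
# Run from an elevated PowerShell prompt.
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa' -Name 'LmCompatibilityLevel' -Value 5 -Type DWord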

http://kb.iu.edu/data/bamu.html
Over at Indiana University, LM/NTLM has been disabled for some time.  They have thorough documentation on configuring workstations to function properly in their more restrictive environment.  This page concerns the use of SharePoint at IU, and the required configuration settings for various browsers and operating systems for their network.  There is nothing specific to our problem here, but there is the implication that they experienced problems similar to our own.  All of their docs state that you must configure your client to perform NTLMv2 only, or authentication will fail.  Also included are notes on FireFox and Mac platforms, where client-side configuration apparently is required.
http://kb.iu.edu/data/atpq.html
An additional page suggesting that network clients will not be able to authenticate to IU resources unless NTLMv2 is enabled on the client.

Concerning FireFox on Windows:
http://forums.mozillazine.org/viewtopic.php?f=23&t=1648755
A developer notes that as of FF 3.6, FF uses Microsoft APIs for NTLM authentication.  So, if FF is not performing NTLM auth as you expect, you need to look to your OS for the source of the problem.  It’s true… we confirmed via packet capture that FF 4.0b10 and IE9RC both perform NTLM authentication in exactly the same way.

Concerning Macintosh client settings:
http://discussions.apple.com/thread.jspa?threadID=2369451
Recommended smb.conf file settings to enforce use of NTLMv2 on the Mac.  References are made here to an “nsmb.conf” file, which is not a typographic error.  More on this file here:
http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man5/nsmb.conf.5.html

Exploring Smart Cards and Windows Logon

I have been having some “fun” this week in exploring two-factor authentication possibilities for Macintosh and Windows 7 clients when connecting to Server 2008 and Server 2003 resources, especially via RDP.  Findings so far:

Not all Smart Cards are created equal:

  • Microsoft released a new Cryptographic API (CNG) with Vista/Server 2008.  This API allows Smart Cards to use the “Microsoft Smart Card Key Storage Provider” CSP (which, apparently, is part of the MS “Base CSP”), instead of a vendor-specific CSP, when generating certificates for Smart Card-based logon.  You must still install a driver for each Smart Card, but very often these drivers are available through Windows Update.
    • Aladdin/SafeNet eTokens do not support CNG, so you cannot use CNG-based Certificate Templates (i.e. Server 2008-compatible templates) when issuing Smart Card Logon certificates to Aladdin eTokens.  Thus, you must make sure that any Certificate Template that you use for Smart Cards is compatible with Server 2003 or earlier.  You also will need a software stack including the eToken CSP and drivers before you will be able to perform Smart Card login using an eToken.
    • Gemalto .NET Smart Cards do support CNG, so you can use CNG certificate templates with these Smart Cards.
    • RSA SecurID Hybrid Authenticators do not appear to support CNG, according to their (dated) product data sheet, so I expect you would need a custom CSP to use these for Windows Logon, too.
  • PGP Desktop can be configured to use keys stored on a Smart Card to unlock PGP-WDE encrypted drives.  However, only a few vendors’ Smart Cards are supported in this capacity, probably because PGP has hard-coded support for only a few CSPs:
    • Aladdin eToken devices are supported both for WDE sign-on and for Administrator Key storage.
    • RSA Smart Cards are supported for WDE sign-on only.
    • Gemalto Smart Cards are not supported at all.

Web Service Enrollment is dead:

  • Most how-to sites you visit on certificate enrollment and smart card logon (including one on TechNet) state that you should set up a Certificate Enrollment Agent Workstation to use the Web Services interface on your MS Certificate Server.  Guess what?  The procedures no longer work on Server 2008:
    http://technet.microsoft.com/en-us/library/cc753800(WS.10).aspx
  • Server 2008 R2 has a new Web Enrollment interface that supports Smart Card enrollment from Vista/Win7 workstations.  Ban the use of Server 2008 R1!
  • Use the “Certificates” MMC snap-in in place of web enrollment if you must run Server 2008 (R1).

RDGateway – Smart Card Authentication requires trust:

  • Smart Card auth to our Remote Desktop Gateway load-balancing cluster (based on F5) was failing.  Apparently something about the SSL-offload configuration was creating a trustworthiness problem, and this was preventing Kerberos/Smart Card authentication from working.  We switched to a TCP/Layer-4 forwarding config, and now Smart Card authorization works just fine.  Note that the config that works is not the one that was recommended by F5.  They are big into TCP offload over there.  The thing is, for Kerberos and Smart Cards to work in an SSL-offload config, you need F5’s expensive Advanced Client Authentication module.  I do not need more cost and complexity, so we will keep things simple this time.

ADFS 2 and SharePoint 2010 Integration

Here is a quick entry on ADFS2 and SharePoint 2010 integration. It is not an implementation guide or end-to-end walkthrough… that comes later, if we decide to implement this thing.

At present, I am most interested in the model of SharePoint->ADFS2->Shibboleth, where the SP-STS trusts tokens from ADFS2.  ADFS2 is part of a chained federation with our Shib service.  ADFS will consume Shib tokens, then transform them for the benefit of SharePoint.  However, I have no idea how to implement this solution at this time.

There are a few too many blog entries out there detailing how to configure ADFS2 and SharePoint 2010 for integration.  Trouble is, many of the step-by-step guides present contradictory configuration steps.  I guess there is no substitute for a deep, working knowledge of ADFS 2, SAML, and other federation topics.

Here are some of the claims setup guides I have been working with:

Here are additional configuration posts on the process of upgrading an existing SharePoint Web Application from “Windows” authentication to “Claims” authentication.  The common denominators?

  1. You must add a valid new user to your claims-aware web app before migrating existing users, or the web application will be inaccessible after migration (or indeed, even before migration!).  A sketch of this step appears after this list.
  2. To trigger migration of users, you must invoke the “Migrate Users” method on your web app, E.g.:
    $wa = get-SpWebApplication "https://webappurl"
    $wa.MigrateUsers($true)
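
For item 1, here is a minimal sketch of granting a new user access via web application policy (the claims-encoded login name and display name are hypothetical… adapt the encoding to your identity provider):

$wa = get-SpWebApplication "https://webappurl"
# "i:0#.w|" is the encoding prefix for a Windows claims identity; a trusted-provider identity uses a different prefix.
$policy = $wa.Policies.Add("i:0#.w|MYDOMAIN\newadmin", "Post-migration admin")
$policy.PolicyRoleBindings.Add($wa.PolicyRoles.GetSpecialRole([Microsoft.SharePoint.Administration.SPPolicyRoleType]::FullControl))
$wa.Update()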

The things here that seem very unclear to me are:  What exactly is being done when you invoke the “MigrateUsers” method on the Web Application object?  How does SharePoint map legacy “Windows” users to new “Claims” users?  Anyway, here are the links:

Pages containing information that I have found useful while contemplating how to pull off SharePoint 2010:

Many of these links, as it turns out, were already discovered by members at dirteam.com.  Doh…

http://blogs.dirteam.com/blogs/jorge/archive/2010/07/06/configuring-sharepoint-2010-to-use-adfs-v2-as-an-authentication-provider.aspx

ADFS 2 Setup and Configuration

The following is a work in progress… Everything done so far works, but this is not yet a complete end-to-end deployment guide…

ADFS 2: new… improved?

There were lots of setup and configuration guides for ADFS 1.0, even though the product was nauseatingly difficult to understand.  Along comes ADFS 2… new, more powerful, better, more standards compliant… documentation reads like scrawl on a napkin.

Some setup quirks:

  • Don’t forget that no current version of Windows Server includes ADFS 2.0… not even Server 2008 R2.  If you want to install ADFS 2.0 you must download it from MS and install it.  DO NOT add the out-of-box ADFS role on Server 2008 R2.  It will confuse you, because it does not disclose which version of ADFS it is.
  • Since we are planning a farm, generate a farm SSL certificate on one of the ADFS servers, then export the certificate in PFX format (that is, with the private key), and restore the cert to the second ADFS server.
  • It is considered desirable to put the ADFS database on a reliable external database server, but documentation on this option is limited to command line help.  Here is what I did:
    1. On one of the ADFS servers, run:
      Fsconfig.exe GenerateSQLScripts /ServiceAccount [account] /ScriptDestinationFolder [destination folder]
    2. Install the appropriate version of the SQL Native Client on the ADFS servers.
    3. On the SQL server, run the two scripts that were generated by the above command.  This will create two databases… AdfsConfiguration and AdfsArtifactStore, and set permissions for the service account specified.
    4. Make sure that any firewalls on the SQL server will allow the ADFS servers to connect to it.
    5. Since we use SQL mirroring, set up mirrors for these databases.  Add the service account to the list of accepted logins on the mirror server, since this will not be done automatically.
    6. Generate certificates that will be used for the ADFS web site, token signing, and token decryption.  Put the certificates in the Windows certificate store of the local computer.
    7. Preconfigure binding of the SSL certificate to the default IIS web site.
    8. Configure the first server in the farm using the following syntax:
      Fsconfig.exe CreateSQLFarm /ServiceAccount [Service Account]
      /ServiceAccountPassword [password] /SQLConnectionString "Initial Catalog=AdfsConfiguration; Data Source=spellbound; failover partner=rearwindow; integrated security=SSPI; Network Library=DBMSSOCN" /FederationServiceName login2.uvm.edu /CleanConfig /CertThumbprint "[thumbprint]" /SigningCertThumbprint "[thumbprint]" /DecryptCertThumbprint "[thumbprint]"
      Note that the thumbprint can be obtained by viewing the properties of the certificate in Windows explorer.
    9. Note that the ADFS “Artifact Store” database should get configured automatically, but you can check on this by doing the following:
      1. Launch PowerShell with admin privs
      2. Run the command “Add-PSSnapin microsoft.adfs.powershell”
      3. Run “get-adfsproperties | select artifactdbconnection”
      4. Use “set-adfsproperties -artifactdbconnection” to change the string if necessary (see the example after this list).
      5. See this resource for more details:
        http://social.technet.microsoft.com/wiki/contents/articles/ad-fs-2-0-migrate-your-ad-fs-configuration-database-to-sql-server.aspx
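
For step 4 above, a hedged example of pointing the artifact store at the same mirrored SQL pair used in the farm configuration (the connection string simply reuses the server names from my Fsconfig.exe command… adjust to your environment):

Add-PSSnapin microsoft.adfs.powershell
Set-AdfsProperties -ArtifactDbConnection "Initial Catalog=AdfsArtifactStore; Data Source=spellbound; Failover Partner=rearwindow; Integrated Security=SSPI; Network Library=DBMSSOCN"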

Of course, an ADFS server with no Federation partners does not do us much good.  Unfortunately, I don’t have any other ADFS servers that I want to integrate with, either.  What interests me more are Shibboleth services (such as our own “login.uvm.edu”), and federation with other “InCommon” partners.  I also am interested in “user centric” identity providers such as OpenID and “Windows Live ID”.  Here are some links to get us started down this path.  Unfortunately, I cannot find anything that details exactly what we want:

ECTS Login Errors – Troubleshooting

Users of our ECTS implementation “PartnerPoint” are not an overly happy set.  Most of the problems that we have experienced are centered around login errors.  This application is particularly prone to login errors for the following reasons:

  • Randomly generated initial password is too complex – data entry errors cause login denial
  • Password expiration errors are not transparent – We need to capture the error that is seen in a password expiration instance.
  • Passwords generated by ECTS (either during account creation or a ECTS admin reset) are temporary and must be changed on next login.  The account attribute “eatmuPwdGenerated” holds this information.
  • Password strength requirements are not displayed in the forms, and password strength errors are not detailed or helpful (i.e. they do not tell you why your password is unacceptable).
  • Login errors are not detailed or helpful – they do not tell you if the account is locked or disabled, if the password is expired, or if you simply entered an invalid username/password combination.
  • Even when login is successful, users often will get “access denied” messages because of permissions problems:
    • The ECTS “Add External User” dialog generally refuses to add permissions to the ACL for a site… it only works consistently when you add the user to an existing site group
    • Sign-in for ECTS users requires at least “Read” permissions to the top-level site in a site collection.  You cannot grant external users rights to a child site if they have no permissions in the parent.

Clearing account lockouts – accomplished by setting the “lockoutTime” attribute of the AD LDS account to “0”.  This causes the “badPwdCount” attribute to be reset to zero.  Note that you cannot set “badPwdCount”, nor “badPasswordTime” as these attributes are owned by “SYSTEM”, and thus cannot be edited manually.  Solution located in the “EggHead Cafe”:
http://www.eggheadcafe.com/forumarchives/windowsserveractive_directory/nov2005/post24794404.asp
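
A minimal sketch of that fix in PowerShell, assuming an AD LDS instance listening on localhost:389 and a hypothetical user DN (adjust both to your instance):

# Clear an AD LDS account lockout by zeroing lockoutTime; badPwdCount resets as a side effect.
$user = New-Object System.DirectoryServices.DirectoryEntry('LDAP://localhost:389/CN=jdoe,OU=Partners,O=PartnerPoint')
$user.Properties['lockoutTime'].Value = 0
$user.CommitChanges()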

Other AD LDS account attributes to watch – see MSDN Active Directory Schema documentation for details:
http://msdn.microsoft.com/en-us/library/ms675090(VS.85).aspx

  • eatmuPwdGenerated – attribute added by the ECTS installer.  Indicates whether the current password was generated by ECTS, or by the user.  Reset when the user successfully logs in and sets his own password.
  • msDS-UserPasswordExpired – populated, but apparently not accurate or used.
  • msDS-User-Account-Control-Computed – Most commonly used for reporting on account state.  This value is computed from other fields, and should not be modified directly.
  • pwdLastSet – in active use for ECTS accounts; uses a “Large Integer” value formatted as “NT Time”, the number of 100-nanosecond intervals since 1 Jan 1601.  This can be converted to a readable date using “w32tm.exe /ntte [string]”.  This value cannot be set to an arbitrary value, although you can use the special values “0” (meaning “must change at next login”) or “-1” (meaning “now”); however, ADSI Edit does not allow you to enter the value “-1”, so we would have to use a different tool to set it.  Setting the value to “0” will break login, as there is no LDAP method for requesting a password change.
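
As an aside, the NT Time conversion that “w32tm.exe /ntte” performs can also be done directly in PowerShell (the value below is a made-up example):

# Convert an NT Time value (100ns intervals since 1601-01-01 UTC) to a readable date:
[DateTime]::FromFileTimeUtc(130590000000000000)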