We are hiring!

We need an Exchange admin!  If you know Exchange, and you would like to work with us here on “Team Awesome” at UVM, read on…

———————————————————

The University of Vermont is seeking a Senior Windows and Exchange Systems Administrator/Engineer to help support UVM’s central server infrastructure, focusing primarily on Microsoft technologies, including Exchange, SharePoint, and Lync. UVM is deploying a new Exchange environment and migrating all students, faculty, and staff to this system.

In this position, you will get to design and build your ideal Exchange environment for 20,000+ people.

The successful applicant will help design, deploy, and support this exciting new service for the campus. We are looking for someone with the expertise and creativity to help improve IT at UVM; someone who can design and build reliable and secure systems to solve complex problems.

The University of Vermont is located in beautiful Burlington, Vermont, recognized as one of the best places to live in the US.

Expertise with both on-premises and Office 365 services is valuable. In addition, this position will help support Active Directory, and related services such as ADFS. Security management and monitoring are also key functions. The successful applicant will provide advanced technical expertise and expert level troubleshooting of Microsoft collaboration services. The applicant will need to configure, install, maintain, and monitor enterprise server equipment and software; initiate and manage projects in collaboration with other systems administrators; train other technical and operational staff; perform system tuning and troubleshooting; identify and implement security enhancements; perform capacity planning, business continuity and disaster recovery services; and provide systems documentation.

UVM’s central Systems Architecture & Administration department works on the latest in server and storage technology across multiple datacenters. Our systems support most aspects of server computing at UVM, including research, on-line learning, and administrative functions. Our highly technical and energetic team works collaboratively to improve IT at UVM, positively affecting thousands of students, faculty, and staff.

Scripting experience in a Windows environment is required (PowerShell at minimum). This position will help build interfaces for the new Exchange environment to other IT systems on campus, such as our website, Blackboard, Banner, and Luminis.

We offer competitive compensation for exceptional people. UVM has a strong benefits package, including tuition remission benefits.

For more information, and to apply, please visit: http://www.uvm.edu/it/sysadmin/exchange/

Replacing DirSync with Microsoft Azure Active Directory Sync Services

Back in October, MS announced that the new “MS Azure AD Sync Services” (AAD Sync) had gone GA, which means that DirSync is now officially deprecated.

Ironically, I had gotten wind that “Azure AD Connect” would be replacing Azure AD Sync Services just a few weeks before:

http://blogs.technet.com/b/ad/archive/2014/08/04/connecting-ad-and-azure-ad-only-4-clicks-with-azure-ad-connect.aspx

Deprecated on release?  Not quite… it appears that Azure AD Sync Services will be at the core of Azure AD Connect, so time spent migrating to it will not be wasted.

I decided to make this migration happen now because we will soon be ramping up to provision faculty and staff with access to Office 365 ProPlus software, and I would like to be working with a tool that does not require me to run it in an unsupported configuration.  (We removed the “Replicating Directory Changes” right from our DirSync service account, which at least at one time was considered a Bad Thing To Do.)

The Good News:

  • The out-of-box configuration does not require a service account with the “Replicating Directory Changes” right, making filtering of sensitive user attributes much easier.
  • The initial setup wizard allows you to de-select attributes that you don’t want to sync to Azure.  No more wading through dense miisclient.exe dialogs and walking off-support to do basic filtering!  The required attributes are listed in grey, so that you don’t risk filtering an attribute that Azure absolutely must have.
  • AAD Sync actually has some documentation available on its installation and use over at MSDN:
    http://msdn.microsoft.com/en-us/library/azure/dn790204.aspx
    …And there actually are quite a few important details in this seemingly cursory documentation.
  • AAD Sync entirely replaces the UI-heavy, un-maintainable attribute filtering tools in DirSync with a new “Declarative Provisioning” engine that allows robust attribute filtering and transformation rules to be defined by the sysadmin.
  • Custom “Inbound” attribute filters will be preserved during product upgrades, according to the documentation.
  • The documentation has been updated with instructions on how to make domain-based sync filters actually work in a multi-domain forest.  This was a key detail that was missing in the DirSync documentation.

The Bad News:

  • The Declarative Provisioning language, while based on VB for Applications (the “best known language by systems administrators”, according to the docs… Whaaaat?!?!), is not actually VBA.  Debugging methods for dealing with this new VB-like thing are not at all documented, and examples of its usage are few and far between.

So what did I learn during this deployment process that was Blog-worthy?  How to debug Declarative Provisioning!

Debugging Declarative Provisioning (Import Expression Filters):

Let’s take an example.  We want to sync into Azure only students, and users who are to be provisioned for Lync in Office 365.  To accomplish this, we have populated extensionAttribute1 with the text value “Student” for students.  Lync users have had extensionAttribute2 populated with the text value “lync”.

When running the Synchronization Rule Editor, we create a new “inbound” rule, assign it to object type “inetOrgPerson”, set the link type to “join”, and give it a precedence under 100.  We skip the Scoping and Join Rules options, and go to “Transformations”.

The flowType gets set to “Expression”, the Target Attribute to “cloudFiltered”, and we define a VBA-like expression in the “Source” field:

[Screenshot: Edit Sync Rule dialog]

My expression follows:

IIF(CBool(InStr([extensionAttribute1], "Student", 1)), NULL, IIF(CBool(InStr([extensionAttribute2], "lync", 1)), NULL, True))

So what is the intent here? “cloudFiltered” is a metaverse attribute that can be set to suppress synchronization of an account upon export to Azure AD.  If set to “True”, the account should not get synced to Azure.

Our expression uses the “IIF” function to check to see if “extensionAttribute1” contains “Student”.  If so, then “IIF” returns “NULL”, and NULL is fed to the “cloudFiltered” attribute (which, according to the docs, will cause the attribute to be cleared if it is already set).  However, if extensionAttribute1 does not contain “Student”, we perform a second test to see if “extensionAttribute2” contains “lync”.  If so, cloudFiltered again gets set to NULL.  If “extensionAttribute2” does not contain “lync”, then “cloudFiltered” gets set to the boolean value “True”.

Syntax quirks:

  • Items enclosed in square brackets [] refer to metaverse attributes, and serve as a sort of automatic variable for provisioning.
  • Note that the InStr() function used here is not the same one documented in the VB for Applications language reference:
    http://msdn.microsoft.com/en-us/library/office/gg264811(v=office.15).aspx
    The “start position” parameter in the docs is not supported here, although it does appear that the “compare” parameter is supported (represented by the “1” in my example, which means perform a case-insensitive comparison).
  • Everything is case sensitive… even the function names!  “IIF()” is all caps.  CBool(), CStr(), InStr(), and Error() are mixed case, exactly as written.
  • IIF() is a bit of a strange beast by itself.  You might need this syntax reference:
    http://msdn.microsoft.com/en-us/library/office/gg264412(v=office.15).aspx

Now that we have a filter, how do we test it?  The docs say “Run a full import followed by a full sync” on the AD management agent.  While this will work, it is also a bit time consuming.  If there are syntax errors in your expression, you will get a lot of errors in your sync logs.  Is there something better?

As it turns out, yes there is.  Special thanks to Brian Desmond (co-author of the O’Reilly book “Active Directory Cookbook”) for putting me on the right track:

  1. In the MIISClient.exe, right click your AD management agent, and select “Search Connector Space”:
    [Screenshot: Search Connector Space menu option]
  2. Specify the DN of a user that you want to test your expression on in the search box.  You can quickly get a DN using PowerShell with the ActiveDirectory module:
    (Get-ADUser -Identity [samAccountName]).DistinguishedName
    Click “Search”, click the returned object, and click “Preview”:
    [Screenshot: Preview option for the selected user]
  3. In the “Preview” dialog, select either a Full or Delta sync, then click “Generate Preview”, and select the “Import Attribute Flow” option on the left.  Review the list of Sync Rules on the bottom pane to make sure your rule was processed.  Any rule that generates a change will be displayed in the top pane:
    [Screenshot: Import Attribute Flow view in the Preview dialog]
    In this case, the expression did not generate a change in the existing metaverse object.
  4. If an expression that you have authored generates an error, a full stack trace will appear in the Error Details section of the Preview pane.  Scroll down to the bottom of the stack, where it is likely that the actual problem with your expression will be identified:
    [Screenshot: Error Details in the Preview pane]
    In this example, I inserted an “Error()” function with a null argument.  The language engine did not like this.
  5. Back in the “Start Preview” tab, you could “Commit Preview” to apply the filter rules for this specific object, and then view the results in the Metaverse Search component of miisclient.exe.

Using the preview tool, I was able to get my filter expressions working correctly with minimal fuss and delay.  The final hurdle to getting filtered attributes set correctly was understanding what the various “run profiles” in AAD Sync actually do.

  • AD Agent – Full Import:  Appears to sync data from AD to the AAD Sync Metaverse without applying Declarative Provisioning Filters.  I think.  Not 100% on this one.
  • AD Agent – Full Synchronization:  Appears to apply Declarative Provisioning filters on all metaverse objects.
  • Azure AD Agent – Full Sync: Appears to apply export filtering logic to metaverse objects entering the Azure connector space.
  • Azure AD Agent – Export:  Appears to sync metaverse objects to Azure without applying filtering logic.  My impression is that you must do a “sync” before an export, if you want logic applied to Azure AD objects as you intended.

My understanding of the run profiles may be flawed, but I will note that if I perform syncs in the following sequence, I get the expected results:

AD Agent:Full Import -> AD Agent:Full Sync -> Azure Agent:Full Sync -> Azure Agent:Export

Although it appears that the more authoritative way to get all of your records reconciled is:

               AD Agent:      MV:   Azure Agent:
               ~~~~~~~~~      ~~~   ~~~~~~~~~~~~
               Full Import => MV
               Full Sync   => MV
                              MV => Export
                              MV <= Delta Sync
               Export      <= MV                 (Only needed in Azure write-back scenarios)

However, with other sequences, I have seen strange results such as metaverse objects not getting updated with the intended filter results, or objects getting declared as “disconnectors” by the Azure AD Agent, and thus getting deleted from Azure.  Ugh!
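
As an aside, you can drive a complete run cycle without clicking through run profiles in miisclient.exe: the scheduler that AAD Sync installs does its work via DirectorySyncClientCmd.exe, which you can also run by hand. A minimal sketch, assuming the default install path (“initial” runs the full import/sync/export cycle; “delta” runs the incremental equivalent):

& 'C:\Program Files\Microsoft Azure AD Sync\Bin\DirectorySyncClientCmd.exe' initial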

That’s all for today… hopefully this info will help keep someone else out of trouble, too.

SharePoint Tip-of-the-day: Speed up Servicing

Hats off to Russ Maxwell over at MSDN blogs for this hot tip on SharePoint servicing:

http://blogs.msdn.com/b/russmax/archive/2013/04/01/why-sharepoint-2013-cumulative-update-takes-5-hours-to-install.aspx

I am getting back (finally) to working on our SharePoint 2013 migration, and being reminded of how much I hate servicing the SharePoint stack.  How can installing a few hundred megabytes of SharePoint bits take so much time?!?!?

It turns out you can speed up installation of service packs and cumulative updates simply by stopping (or pausing) Search services, the SharePoint Timer service, and the IIS Admin service.  I tried it, and it works!  Installation of the SharePoint 2013 September 2014 CU took not more than a minute with these services halted.  (At least, it did on three out of four servers… the fourth failed miserably owing to MSI errors.)
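
Here is a hedged sketch of the stop/patch/restart sequence in PowerShell. The service names are assumptions based on a default SharePoint 2013 build (verify yours with Get-Service):

#Stop the services that slow down patching (names assumed from a default SP2013 install):
$svcs = 'SPTimerV4','IISADMIN','OSearch15','SPSearchHostController'
Stop-Service -Name $svcs -Force
#...run the CU installer here, then restart the services:
Start-Service -Name $svcs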

More on that…

I found the following old, but still relevant, article on troubleshooting Office software installation problems:

http://support2.microsoft.com/kb/954713/en-us

To summarize, you go to your %temp% directory and look for “Opatchinstall(#).log” and “######_MSPLOG.log”.  (In my case, there was a file called something like “wss##_MSPLOG.log”.  Old-school SharePoint guys will recognize “WSS” as “Windows SharePoint Services”.)  Try to locate a line containing “MainEngineThread is returning”, and look up the Windows Installer error code that it returned.
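
A hedged PowerShell one-liner to find that line across the MSP logs in %temp% (the file name pattern is an assumption based on the log names above):

Select-String -Path "$env:TEMP\*MSPLOG*.log" -Pattern 'MainEngineThread is returning'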

Mine was error code 1646, or “ERROR_PATCH_REMOVAL_UNSUPPORTED: The patch package is not a removable patch package. Available beginning with Windows Installer version 3.0.”  Apparently the language pack for SP2013 Standard that I was using refused to uninstall.  That’s the legacy of excessive servicing from early release versions, I guess.  Since I was still running on Server 2012 (R1), I decided to nuke and repave rather than troubleshoot.  Boo.

VMware View – Provisioning/Composing hangs, Event log failures, and more!

VMware Horizon View… great product. View Composer? Thorn in my side.

Two weeks back I completed the upgrade of our View infrastructure from 5.3.2 to 6.0.1. It was a smooth upgrade, seemingly, and I was pretty pleased with how little time it took to complete the job. Victory for our team? Not so much.

Over the next week, I had dozens of complaints from IT staff that recompose operations were failing, searches for events related to these failures were returning no results (or just not completing at all), and there were multiple odd “I am getting this weird error on my desktops!” complaints.  The desktop errors all turned out to be unrelated to the upgrade (the template was out of disk space so the user profile could not load, the View Agent installation was broken, etc.), but sorting out the event log and composer problems was harder…

View 6 Event Log database bug:

Following the upgrade, I was looking into increasing the View Event Log query limit per the request of a client, who was not able to view more than the past few hours of events for his pool owing to the default event query limit of 2000 events.  I noticed that these queries, in addition to being short on useful information, also were taking several minutes to complete.  After bumping the query limit to 6000 events, we found that the queries were taking over 30 minutes to complete, and hogging up all the CPU on the Virtual Center server (where the events database is hosted)!  I verified that memory and disk were not bottlenecked on the SQL database (I could not add more CPU because I already was at the SQL Standard Edition max of four cores), and set SQL tracing to look for deadlock events.  After running into a bunch of dead ends, I finally opened a support case with VMware.

Unsurprisingly, the first response was “well, lower your query limit.”  I explained that no, I was not going to do that.  I also pointed out that selecting 6000 records from a 2.4 GB database really should not take 30 minutes, and that engineering just needed to buckle down and fix whatever index was causing the problem.  A few days later, I was given one line of T-SQL to run against the View Events database to add a missing index.  Query got executed, index created, and voila!  Event queries started running in seconds, not hours.  Here is the T-SQL:

CREATE INDEX IX_eventid ON dbo.VE_event_data (eventid)

Your table name might be slightly different, depending on the table prefix you selected when setting up the events database.

Composer Failures:

We have seen this before… someone recomposes a pool, the job half-finishes then stops, no error.  The task cannot be canceled, the pool cannot be deleted, and all other Composer operations in the infrastructure grind to a halt.  Why?  If you call VMware support, the first thing they will tell you is “cache corruption”.  The next is “stale desktops”.  Huh?

Deleting Stale Desktops:

http://kb.vmware.com/selfservice/search.do?cmd=displayKC&docType=kc&docTypeID=DT_KB_1_1&externalId=2015112

Clearing the Connection Server Cache:

No KB for this one that I am aware of.  Here is what they always tell me to do… ready?  You are going to like this…

  1. Shut down all of the connection servers in your farm.
  2. Turn the connection servers back on, one at a time.

Augh!

The worst part is that neither of these solutions worked.  However, what I did find was that after powering the connection servers back on, some composer operations would succeed, but it was only a matter of time before one job failed and brought operations to a halt.  Finally I noticed that when rebooting one of the connection servers (the newest one, used for testing security settings), jammed jobs would immediately resume.  After digging into the logs in C:\ProgramData\VMware\VDM\logs\, I found that the Connection Server was reporting literally thousands of “could not connect to vCenter server at URL…” errors per day.  Why?  Because like a noob I did not give this connection server an interface to the vCenter server.  Bad on me.  However, these critical failures do not show up in the Windows event logs, nor do they get reported up to the View Administrator console.  I had a bad connection server in my environment that was killing Composer operations, and View Administrator thought everything was peachy.  Boo!  I have complained to VMware support, for what it is worth.  I also fixed the connection server, and things are back to “normal”, whatever that means.

I also got my manager to approve using Splunk to collect all View log files, so that I at least will have an easier time of discovering errors when they arise in the future.
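
For what it is worth, here is a minimal sketch of a Splunk inputs.conf monitor stanza for the View logs (the index and sourcetype names are illustrative, not Splunk defaults):

[monitor://C:\ProgramData\VMware\VDM\logs]
index = vmware_view
sourcetype = vmware:view:log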

Logon Performance in VDI Land

After spending the better part of three days attempting to shave time off of login times in our VDI environment (VMware View-based), I thought I should scribe down some notes on effective troubleshooting tools and techniques. There were a lot of self-inflicted wounds this time, and I could have saved myself a lot of time if 1) I had documented the build process for my new VDI pool and 2) I had taken notes the last time I had made login optimizations.

WARNING: This post is largely unedited and probably a bit incoherent. Read at your own risk.

Group Policy:

Computer Configuration->Policies->Administrative Templates->System: Display highly detailed status messages
This setting causes the Windows login screen to provide more verbose feedback to the user about what winlogon.exe is doing at any given time. Rather than just seeing “Preparing Windows”, you will instead see things like “Processing Drive Map Preferences”. If the logon screen hangs on one of these steps for 30 seconds, you will know exactly which Group Policy setting is killing logon performance.
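
If you just want to flip this on for a single test VM without editing Group Policy, the policy maps to a registry value; a minimal sketch, assuming the standard winlogon policies key:

Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System' -Name 'VerboseStatus' -Value 1 -Type DWord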

Event Viewer:

Windows 7 and 8 both include a Group Policy operational log under: Event Viewer->Applications and Services Logs->Microsoft->Windows->GroupPolicy->Operational.  This log contains a lot of useful information about the timing of various group policy components, and many times will contain all of the information you need to pinpoint troublesome Group Policy Settings.
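
The same events can be pulled with PowerShell, which makes it easy to scan the timestamps for gaps; for example:

Get-WinEvent -LogName 'Microsoft-Windows-GroupPolicy/Operational' -MaxEvents 100 |
    Select-Object TimeCreated, Id, Message | Format-Table -AutoSize -Wrap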

If the Event Viewer does not have all of the information you need, you can enable verbose policy logging:
http://social.technet.microsoft.com/wiki/contents/articles/4506.group-policy-debug-log-settings.aspx

I typically find that this is not necessary, and that the Event Viewer has the information that I need.

Problems with User Profile loading often can be found under: Event Viewer->Applications and Services Logs->Microsoft->Windows->User Profile Service->Operational log. This log is especially useful when using roaming or mandatory profiles. Unfortunately, this entry just tracks initial profile location and loading, and does not log anything related to Active Setup.

Windows Performance Toolkit:

Part of the Assessment and Deployment Toolkit (ADK) for Windows 8.1. The Performance Toolkit includes the Windows Performance Recorder and Windows Performance Analyzer. Run the Recorder with the “Boot” performance scenario, with 1 iteration, then use the Analyzer to read the trace file that was created during reboot and logon. Make note of the relative time of each event in the boot/logon process (i.e. time of boot, time of login, time to desktop load). The Recorder only logs relative time from boot up, so you might have some trouble correlating wall-clock time with recorded event times. Try to locate processes that line up with the delays you see during login.
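
If you prefer the command line, the older xbootmgr utility from the Performance Toolkit can capture the same boot scenario; a minimal sketch (be warned: running this reboots the machine immediately):

xbootmgr -trace boot -resultPath C:\traces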

As an alternative, you can enable boot logging using “ProcMon”. The Performance Analyzer arguably offers better visualizations of boot issues, but ProcMon has more comprehensive process information, and may be a more familiar tool for many administrators.

Active Setup:

Active Setup is a pain. This is a poorly documented mechanism by which applications (mostly Microsoft applications) can run per-user configuration tasks (generally first-run tasks) on logon. It is synchronous, meaning each task must be completed before the next runs. Also, Active Setup runs in Winlogon.exe and blocks loading of the desktop. Because of this, Active Setup has the potential to greatly delay first time logon. As a result, it also becomes a scapegoat for logon delays, even when it is not the root cause. I have no really helpful advice for troubleshooting Active Setup other than to use the Performance Analyzer or ProcMon to locate Active Setup processes that take a long time to execute. See the following for a better explanation of the internals of Active Setup:
http://helgeklein.com/blog/2010/04/active-setup-explained/
And this for an explanation of situations in which you might want to disable Active Setup:
http://blog.ressoftware.com/index.php/2011/12/29/disable-active-setup-revealed/

SysInternals AutoRuns:

You can wade through every obscure registry key looking for processes that run at login, or you can just use AutoRuns and pull them all up in one place. Thanks to AutoRuns, I was able to locate the entry point for an irksome logon process that was running for no apparent reason. I had forgotten that under Windows Vista and later, Scheduled Tasks can use user logon events as a trigger for starting a process. This brings us to the process that killed two days of my life…

Minor Troubles with Google Chrome:

Using the Performance Analyzer, I initially concluded that Google Chrome was adding over 30 seconds to logon time on one of my VDI pools.  While Google is launching “GoogleUpdate.exe” at each user logon event (via a scheduled task trigger), these scheduled tasks really should not block loading of the desktop.  This task runs in other pools without significant delay.  In this pool, the task was running for a long time (over a minute) before exiting.  The likely cause of this excessive delay is the internet-bound HTTP/HTTPS filtering that is taking place in this pool… Google cannot update itself if outbound internet access is blocked.  Still, long running or not, Chrome Update was not blocking loading of the desktop.

That being said, our users really do not need Chrome to check for updates on each and every logon, so how do we fix this?
Investigation of Active Setup showed that Active Setup for Chrome already had been completed in our mandatory roaming profile. So why was Chrome setup running on each and every user logon? It also was configured as a Scheduled Task that runs on each user logon event. Aargh! As noted above, SysInternals AutoRuns was used to locate this entry point.

Unfortunately, Google Update is a bit on the complicated side:
http://omaha.googlecode.com/svn/wiki/GoogleUpdateOnAScheduleOverview.html

There are two separate Google Update system services, two separate Scheduled Tasks related to Google Update, and three separate task triggers, including the one that runs at logon. For now, I have just disabled the scheduled tasks in my template machine (a scripted version of this is sketched below). Unfortunately, this completely disables Google Update in the VDI pool. Also, the changes will be wiped out if we update Chrome manually or via SCCM in the future. Better would be a Group Policy-based solution.
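
The disable step can be scripted; a hedged example, with task names assumed from a default Google Update install (verify yours with Get-ScheduledTask; requires Windows 8/Server 2012 or later):

Disable-ScheduledTask -TaskName 'GoogleUpdateTaskMachineCore'
Disable-ScheduledTask -TaskName 'GoogleUpdateTaskMachineUA'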

Some of you may know that there actually is official registry/Group Policy support for control of Google Update. See:
https://support.google.com/installer/answer/146164
and:
http://www.chromium.org/administrators/turning-off-auto-updates
However, these settings just disable Auto Update entirely. They do not allow you to control how and when updates will apply (i.e. disable user-mode updates, but leave machine-mode updates intact).

I expect the “real” fix here would be to run a separate scheduled task script or startup script that uses PowerShell to find and remove the scheduled task triggers; a rough sketch follows. That’s more time than I want to spend on this project at present, though.
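
For the record, a hedged sketch of that approach, again assuming the default Google Update task names and the Windows 8+ ScheduledTasks module; treat it as untested:

#Strip only the logon triggers from the Google Update tasks, leaving the timed triggers intact:
foreach ($name in 'GoogleUpdateTaskMachineCore','GoogleUpdateTaskMachineUA') {
    $task = Get-ScheduledTask -TaskName $name -ErrorAction SilentlyContinue
    if ($task) {
        $keep = @($task.Triggers | Where-Object {$_.CimClass.CimClassName -ne 'MSFT_TaskLogonTrigger'})
        Set-ScheduledTask -TaskName $name -Trigger $keep | Out-Null
    }
}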

Provisioning students with Office 365 ProPlus licenses

NOTE:  This post was updated on 2014-11-21 to reflect refinements in the PowerShell provisioning script.

Interesting… I still seem to be working at UVM. There must be a story there, but you won’t read about it here.

Anyway, after getting back from my brief hiatus at Stanford University, I got back on the job of setting up Azure DirSync with Federated login to our in-house Web SSO platforms. I’ll need to post about the security changes required to make that work with UVM’s FERPA interpretation. To summarize, we got it working.

However, once students can log in to Office 365, we need to provision them with licenses. DirSync can’t do this, so I needed to script a task that will grant an Office 365 ProPlus license to any un-provisioned active student. You will find the script, mostly unaltered, below. I just set it up as a scheduled task that runs sometime after the nightly in-house Active Directory update process.

To be useful outside of UVM, the code would need to be customized to handle the logic used in your organization to determine who is a student. We have extended the AD schema to include the eduPerson schema, and have populated the “eduPersonPrimaryAffiliation” attribute with “Student” for currently active students. If you do something different, have a look at the “Get-ADUser” line, and use a different LDAP query to fetch your student objects.

Enjoy.

# Provision-MSOLUsers.ps1 script, by J. Greg Mackinnon, 2014-07-30
# Updated 2014-11-20, new license SKU, corrections to error capture commands, and stronger typing of variables.
# Updated 2014-11-21, added "license options" package to the add license command, for granular service provisioning.
#
# Provisions all active student accounts in Active Directory with an Office 365 ProPlus license.
#
# Requires:
# - PowerShell Module "MSOnline"
# - PowerShell Module "ActiveDirectory"
# - Azure AD account with rights to read account information and set license status
# - Credentials for this account, with password saved in a file, as detailed below.
# - Script runs as a user with rights to read the eduPersonAffiliation property of all accounts in Active Directory.
#
#    Create a credential file using the following procedure:
#    1. Log in as the user that will execute the script.
#    2. Execute the following line of code in PowerShell:
#    ConvertTo-SecureString -String 'password' -AsPlainText -Force | ConvertFrom-SecureString | out-file "c:\local\scripts\msolCreds.txt" -Force
#

Set-PSDebug -Strict

#Setup local variables:
[string] $to = 'admin@myschool.edu'
[string] $from = 'ProvisioningScript@myschool.edu'
[string] $smtp = 'smtp.myschool.edu'
[string] $msolUser = 'myschool.dirsync@myTennant.onmicrosoft.com'

#initialize log and counter:
[string[]] $log = @()
[long] $pCount = 0

#initialize logging:
[string] $logFQPath = "c:\local\temp\provision-MSOLUsers.log"
New-Item -Path $logFQPath -ItemType file -Force

[DateTime] $sTime = get-date
$log += "Provisioning report for Office 365/Azure AD for: " + ($sTime.ToString()) + "`r`n"

function errLogMail ($err,$msg) {
    # Write error to log and e-mail function
    # Writes out the error object in $err to the global $log object.
    # Flushes the contents of the $log array to file, 
    # E-mails the log contents to the mail address specified in $to.
    [string] $except = $err.exception;
    [string] $invoke = $err.invocationInfo.Line;
    [string] $posMsg = $err.InvocationInfo.PositionMessage;
    $log +=  $msg + "`r`n`r`nException: `r`n$except `r`n`r`nError Position: `r`n$posMsg";
    $log | Out-File -FilePath $logFQPath -Append;
    [string] $subj = 'Office 365 Provisioning Script:  ERROR'
    [string] $body = $log | % {$_ + "`r`n"}
    Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp
}

#Import PS Modules used by this script:
try {
    Import-Module MSOnline -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading Azure AD (MSOnline) PowerShell module."
    errLogMail $myError $myMsg
    exit 101
}
try {
    Import-Module ActiveDirectory -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered loading ActiveDirectory PowerShell module."
    errLogMail $myError $myMsg
    exit 102
}

#Get credentials for use with MS Online Services:
try {
    $msolPwd = get-content C:\local\scripts\msolCreds.txt | convertto-securestring -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered getting creds from file."
    errLogMail $myError $myMsg
    exit 110
}
try {
    $msolCreds = New-Object System.Management.Automation.PSCredential ($msolUser, $msolPwd) -ErrorAction Stop ;
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in generating credential object."
    errLogMail $myError $myMsg
    exit 120
}
#Use the following credential command instead of the block above if running this script interactively:
#$msolCreds = get-credential

#Connect to MS Online Services:
try {
    #ErrorAction set to "Stop" to force any errors to be terminating errors.
    # default behavior for connection errors is non-terminating, so the "catch" block will not be processed.
    Connect-MsolService -Credential $msolCreds -ErrorAction Stop
} catch {
    $myError = $_
    [string] $myMsg = "Error encountered in connecting to MSOL Services."
    errLogMail $myError $myMsg
    exit 130
}
$log += "Connected to MS Online Services.`r`n"

#Generate license report:
$lics = @()  
$studAdv = Get-MsolAccountSku | ? {$_.accountSkuId -match 'STANDARDWOFFPACK_IW_STUDENT'}  
$log += 'Office 365 ProPlus for Student - license report:' 
$log += 'Total licenses: ' + $studAdv.ActiveUnits
$log += 'Consumed licenses: ' + $studAdv.ConsumedUnits
[int32] $alCount = $studAdv.ActiveUnits - $studAdv.ConsumedUnits
$log += 'Remaining licenses: ' + $alCount.toString() + "`r`n" 

#Set license options for students:
$stuLicOpts = New-MsolLicenseOptions -AccountSkuId $studAdv.AccountSkuId -DisabledPlans YAMMER_EDU,SHAREPOINTWAC_EDU,SHAREPOINTSTANDARD_EDU,EXCHANGE_S_STANDARD,MCOSTANDARD

#Retrieve active student accounts into a hashtable:
[hashtable] $students = @{}
try {
    # NOTE:  The filter used for collecting students needed to be modified to fetch admitted students that are not yet active.
    #   This is a 'hack' implemented by FCS to address the tendency for the registrar not to change student status until the first day of class.
    #   (Actual student count should be lower, but we have no way to know what the final count will be until the first day of classes.)
    #
    #get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(eduPersonAffiliation=Student))' -SearchBase 'ou=people,dc=myschool,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
    get-aduser -LdapFilter '(&(ObjectClass=inetOrgPerson)(extensionAttribute1=*Student*))' -SearchBase 'ou=people,dc=myschool,dc=edu' -SearchScope Subtree -ErrorAction Stop | % {$students.Add($_.userPrincipalName,$_.Enabled)}
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Active Directory."
    errLogMail $myError $myMsg
    exit 200
}
$log += "Retrieved active students from Active Directory."
$log += "Active student count: " + $students.count


#Retrieve unprovisioned accounts from Azure AD:
[array] $ulUsers = @()
try {
    #Note use of "Synchronized" to suppress processing of cloud-only accounts.
    $ulUsers += Get-MsolUser -UnlicensedUsersOnly -Synchronized -All -errorAction Stop
} catch {
    $myError = $_
    $myMsg = "Error encountered in reading accounts from Azure AD. "
    errLogMail $myError $myMsg
    exit 300
}
$log += "Retrieved unlicensed MSOL users."
$log += "Unlicensed user count: " + $ulUsers.Count + "`r`n"

#Provision any account in $ulUsers that also is in the $students array:
foreach ($u in $ulUsers) {
    if ($students.get_item($u.UserPrincipalName) -eq $true) {
        #Uncomment to enable verbose logging of user to be processed.
        #$log += $u.UserPrincipalName + " is an active student."
        try {
            if ($u.UsageLocation -notmatch 'US') {
                #Set the usage location to the US... this is a prerequisite to assigning licenses.
                $u | Set-MsolUser -UsageLocation 'US' -ErrorAction Stop ;
                #Uncomment to enable verbose logging of usage location assignments.
                #$log += 'Successfully set the usage location for the user. '
            }
        } catch {
            $myError = $_
            $myMsg = "Error encountered in setting Office 365 usage location to user. "
            errLogMail $myError $myMsg
            exit 410
        }
        try {
            #Assign the student advantage license to the user, with desired license options
            $u | Set-MsolUserLicense -AddLicenses $studAdv.AccountSkuId -LicenseOptions $stuLicOpts -ErrorAction Stop ;
            #Uncomment to enable verbose logging of license assignments.
            #$log += 'Successfully set the Office license for the user. '
            $pCount += 1
        } catch {
            $myError = $_
            $myMsg = "Error encountered in assigning Office 365 license to user. "
            errLogMail $myError $myMsg
            exit 420
        }
    } else {
        $log += $u.UserPrincipalName + " is not an active student.  Skipped Provisioning."
    }
}

#Add reporting details to the log:
$eTime = Get-Date
$log += "`r`nProvisioning successfully completed at: " + ($eTime.ToString())
$log += "Provisioned $pCount accounts."
$tTime = new-timespan -Start $stime -End $etime
$log += 'Elapsed Time (hh:mm:ss): ' + $tTime.Hours + ':' + $tTime.Minutes + ':' + $tTime.Seconds

#Flush out the log and mail it:
$log | Out-File -FilePath $logFQPath -Append;
[string] $subj = 'Office 365 Provisioning Script:  SUCCESS'
[string] $body = $log | % {$_ + "`r`n"}
Send-MailMessage -To $to -From $from -Subject $subj -Body $body -SmtpServer $smtp

Parting Scripts – Add a new network printer and set it as default

Some time back, I discovered that a Group Policy Preference that we had applied to a VMware View VDI pool was adding an additional 30 seconds of time staring at the blue spinning donut at each VDI desktop logon.  The policy in question was a printer policy.  Colleagues at other Higher Ed institutions confirmed that they had the same problem with GPP printer preferences.  It has been reported that using the “Mandatory Printers” policy is faster, but this policy does not allow you to assign a default printer.

Enter good old VBScript…

The following script will install a defined network printer and set it as default. If the print share does not exist, an error will be returned. 95% of the code in this script was lifted from my own “KillAndExec.vbs” script from last year. There really are only two lines of new code in here. It is good having a code library to draw on, because it would have taken me days to generate this stuff from scratch. VBScript is so obtuse… so why do I keep using it? Hmmmm….

'addDefaultPrinter script - J. Greg Mackinnon, 2014-06-11
'  Adds the network printer specified in the script argument "/share".
'  Sets this printer as the default printer for the current user.

option explicit

'Declare Variables
Dim bBadArg,bNoArgs
Dim cScrArgs
Dim iReturn
Dim sBadArg,sLog,sPrintShare,sScrArg,sScrArgs,sTemp,sText

Dim oFS,oLog,oShell
Dim WshNetwork

'Set initial values:
bBadArg = False
bNoArgs = False

'Instantiate Global Objects:
Set oShell = CreateObject("WScript.Shell")
Set oFS  = CreateObject("Scripting.FileSystemObject")

Set WshNetwork = CreateObject("WScript.Network")    


'''''''''''''''''''''''''''''''''''''''''''''''''''
' Define Functions
Sub subHelp
	echoAndLog "addDefaultPrinter.vbs Script"
	echoAndLog "by J. Greg Mackinnon, University of Vermont"
	echoAndLog ""
	echoAndLog "Installs a printer from a named network share, and sets this"
	echoAndLog "as the default printer for the current user."
	echoAndLog ""
	echoAndLog "Logs output to 'addDefaultPrinter.log' in the %temp% directory."
	echoAndLog ""
	echoAndLog "Required arguments and syntax:"
	echoAndLog "/share:""\\[server]\[share]"""
	echoAndLog "     Specify the UNC of the print share to be set as default."
End Sub

function echoAndLog(sText)
'EchoAndLog Function:
' Writes string data provided by "sText" to the console and to Log file
' Requires: 
'     sText - a string containing text to write
'     oLog - a pre-existing Scripting.FileSystemObject.OpenTextFile object
	'If we are in cscript, then echo output to the command line:
	If LCase( Right( WScript.FullName, 12 ) ) = "\cscript.exe" Then
		wscript.echo sText
	end if
	'Write output to log either way:
	oLog.writeLine sText
end function
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Initialize Logging
sTemp = oShell.ExpandEnvironmentStrings("%TEMP%")
sLog = "addDefaultPrinter.log"
Set oLog = oFS.OpenTextFile(sTemp & "\" & sLog, 2, True)
' End Initialize Logging
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Parse Arguments
If WScript.Arguments.Named.Count > 0 Then
	Set cScrArgs = WScript.Arguments.Named
	For Each sScrArg in cScrArgs
		Select Case LCase(sScrArg)
			Case "share"
				sPrintShare = cScrArgs.Item(sScrArg)
			Case Else
				bBadArg = True
				sBadArg = sScrArg
		End Select
	Next
ElseIf WScript.Arguments.Named.Count = 0 Then 'Detect if required args are not defined.
	bNoArgs = True
End If 
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Process Arguments
if bBadArg then
	echoAndLog vbCrLf & "Unknown switch or argument: " & sBadArg & "."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
elseif bNoArgs then
	echoAndLog vbCrLf & "Required arguments were not specified."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
end if
echoAndLog "Printer share to set to default: " 
echoAndLog sPrintShare & vbCrLf
' End Process Arguments
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
'Begin Main
'
on error resume next
'Add Printer
iReturn = 0
iReturn = WshNetwork.AddWindowsPrinterConnection(sPrintShare)
if err.number <> 0 then 'Gather error data if AddWindowsPrinterConnection failed.
	echoAndLog "Error: " & Err.Number
	echoAndLog "Error (Hex): " & Hex(Err.Number)
	echoAndLog "Source: " &  Err.Source
	echoAndLog "Description: " &  Err.Description
	iReturn = Err.Number
	Err.Clear
	wscript.quit(iReturn)
end if
if iReturn <> 0 then
	echoAndLog "Non-zero return code when attempting to add the printer connection."
	echoAndLog "Return Code was: " & iReturn
end if

'Set Default Printer
iReturn = 0
iReturn = WshNetwork.SetDefaultPrinter(sPrintShare)
if err.number <> 0 then 'Gather error data if SetDefaultPrinter failed.
	echoAndLog "Error: " & Err.Number
	echoAndLog "Error (Hex): " & Hex(Err.Number)
	echoAndLog "Source: " &  Err.Source
	echoAndLog "Description: " &  Err.Description
	iReturn = Err.Number
	Err.Clear
	wscript.quit(iReturn)
end if
on error goto 0
'echoAndLog "Return code from the command: " & iReturn
if iReturn <> 0 then
	echoAndLog "Non-zero return code when attempting to set default printer."
	echoAndLog "Return Code was: " & iReturn
end if

oLog.Close
wscript.quit(iReturn)

' End Main
'''''''''''''''''''''''''''''''''''''''''''''''''''

Rejecting read receipt requests with Procmail

In preparation for my exit, I have been re-routing various routine bulk mail messages using “Procmail” recipes. While I was working on this, I got an email from a colleague who always requests read receipts for every message that he sends. Despite being asked to stop, he continues to send messages with read receipt requests. I thought “wouldn’t it be great if I could get procmail to just reject these messages outright?”

Consulting common references on Procmail was not helpful, because they rely on procmail having access to a full shell environment. My colleague Jim Lawson gave me the framework for a different solution, which instead involves the use of the “:0fwc” construct to pipeline multiple procmail actions. Of interest is the use of the “appendmsg” command, for which I cannot find a reference anywhere. This works, though. Aggressive/Aggressive handling of read receipt requests achieved!

#Use ":0c:" below if you want to receive a copy of the original message instead of just rejecting it.
:0:
# Check to see if the message contains any of the following common read-receipt request headers:
* ^Disposition-Notification-To:|\
  ^X-Confirm-Reading-To:|\
  ^Return-Receipt-To:
# Prevent mail loops... mail loops are bad.
* ! ^X-Loop: jgm@uvm\.edu
{
        :0fw
        | formail -pQUOTE: -k -r
        BODY=`formail -I ""`

        :0fw
        | formail -A"X-Loop: jgm@uvm.edu"\
        -I"Subject: Rejected mail: Read Receipts not accepted by this account."
        #-I"To: ${REJECT}" \
        
        # scrape off the body
        :0fwc
        | formail -X ""

        :0fwc
        | appendmsg "Message rejected because it contains a read receipt request." 

        # put back the quoted body
        :0fwc
        | appendmsg $BODY
        
        :0
        | sendmail -t
}

Final days… looking back.

Things have really slowed down here in the Brain Corral.  Projects and tasks have been rolling in faster than my ability to document them.  I am sure all of my “fans” have been most disappointed by the lack of updates.  Well fans, sorry to be the bearer of bad news, but this likely will be my last post in the ‘ol corral.  June 13th, 2014 will be my last day of employment at UVM, which will put me two weeks shy of 14 years of employment in Catamount country.

It has been a long road, with much territory covered.  When I started, I was the new “NetWare Guy”, responsible for Novell file and print services, and some antivirus management.  Since then, I have plowed through:

  • Two major directory service changes, and several upgrades (NetWare bindery to eDirectory, to Active Directory on Server 2003, and AD upgrades all the way through Server 2012 R2)
  • four major file server upgrades (NetWare 4 standalone to NetWare 6 failover clusters, to NetApp FAS270, to NetApp FAS3000 series, to native Windows 2008 R2 file services)
  • three print server redeployments (traditional NetWare to NDPS, to monolithic Windows print services on Server 2003, to distributed print servers on Server 2008)
  • four antivirus vendor changes, (McAfee, to Norton/Symantec, to ESET NOD32, to MS Forefront/SCEP)
  • two terminal server projects (Citrix MetaFrame 1.8 to MS Terminal Services on Server 2008)
  • a VDI deployment with several minor upgrades (VMware View 5.0, 5.1, 5.2, and 5.3… over its <2 year history, that's an upgrade every six months!)
  • A SharePoint deployment with three major upgrades (STS 2 to WSS 3, to a 64-bit scale-out farm upgrade, to SPF 2010, and an incomplete 2013 Standard Edition upgrade)
  • three Windows deployment technology changes with many version upgrades (Ghost to RIS, to BDD, which later became MDT/LiteTouch)
  • a major deployment of VMware vSphere and vCenter, with many version upgrades (vCenter 2.5, 4.0, 4.1, 5.0, 5.1, 5.5).  From its humble beginnings, vSphere now hosts over 90% of our non-research computing operating systems!
  • four major changes in Windows enterprise patch management methods (SUS, WSUS, WSUS with Secunia CSI, SCCM 2012 with ADR and third party “Application packages” with supersedence rules).  We went from full MS-only non-enforced patch management, to adding support for third party applications, to fully automated patch management.
  • Two major whole disk encryption projects (PGP Whole Disk, followed by BitLocker with the “Microsoft BitLocker Administration and Management” add-on)
  • Several Windows server management re-deployments (BigBrother to MOM 2, MOM 2005 (redeploy), SCOM 2007 (redeploy), SCOM 2007 R2 (upgrade), SCOM 2012 (redeploy), SCOM 2012 R2 (upgrade))
  • …and other projects too numerous to detail (security initiatives, network protocol transitions, hosted server upgrade assistance, application deployment and upgrade assistance).

It has been a rocky road, full of bumps and potholes.  Fortunately, with the assistance of my truly excellent teammates, we have made this journey with very few significant breakdowns.  Central IT has perhaps quintupled in size and importance over the past 14 years.  I am proud to have been part of this explosive period in IT at UVM, and honored to have worked with such highly intelligent and motivated people.

I wish you all the very best of luck with all future endeavors, and hope that you will keep in touch, even though I will be on the west coast of America (not just the west coast of New England) at Stanford University.

-J. Greg Mackinnon | ETS Systems Architecture and Administration

Migrating Windows-auth users to Claims users in SharePoint

A short time back I published an article on upgrading a Windows-authentication-based SharePoint environment to an ADFS/Shibboleth claims-based environment. At that time I said I would post the script that I plan to use for the production migration when it was done. Well… here it is.

This script is based heavily on the one found here:
blog.sharepoint-voodoo.net/?p=68
Unfortunately, “SharePoint-Voodoo” appears to be down at the time of this writing, so I cannot make appropriate attribution to the original author. This script helped speed along this process for me… Thanks, anonymous SharePoint Guru!

My version of the script adds the following:

  • Adds stronger typing to prevent script errors.
  • Adds path checking for the generated CSV file (so that the script does not exit abruptly after running for 30 minutes).
  • Introduces options to specify different provider prefixes for windows group and user objects.
  • Introduces an option to add a UPN suffix to the new user identity.
  • Collects all user input before doing any processing to speed along the process.
  • Adds several “-Limit All” parameters to the “Get-SP*” cmdlets to prevent omission of users from the migration process.

There are still some minor problems. When run in “convert” mode, the script generates an error for every migrated user, even when the user is migrated successfully. I expect this is owing to a bug in “Move-SPUser”, and there probably is not much to be done about it. Because I want to migrate some accounts from windows-auth to claims-with-windows-auth, there is some manual massaging of the output file that needs to be done before running the actual migration, but I think this is about as close as I can get to perfecting a generic migration script without making the end-product entirely site-specific.

I will need to run the script at least twice… once for my primary “CAMPUS” domain, and once to capture “GUEST” domain users. I may also want to do a pass to convert admin and service account entries to claims-with-windows-auth users.
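
Before the full listing, here is a hedged one-off version of the conversion that the script performs for each CSV row (the site URL, account name, and provider prefix are illustrative only):

$user = Get-SPUser -Identity 'CAMPUS\jdoe' -Web 'https://sharepoint.myschool.edu'
Move-SPUser -Identity $user -NewAlias 'i:0e.t|myAdfsProvider|jdoe@myschool.edu' -IgnoreSID -Confirm:$false

The full script: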

Set-PSDebug -Strict
add-pssnapin microsoft.sharepoint.powershell -erroraction 0

# Select Options
Write-Host -ForegroundColor Yellow "'Document' will create a CSV dump of users to convert. 'Convert' will use the data in the CSV to perform the migrations."
Write-Host -ForegroundColor Cyan "1. Document"
Write-Host -ForegroundColor Cyan "2. Convert"
Write-Host -ForegroundColor Cyan " "
[int]$Choice = Read-Host "Select an option 1-2: "

switch($Choice)
{
    1 {[bool]$convert = $false}
    2 {[bool]$convert = $true}
    default {Write-Host "Invalid selection! Exiting... "; exit}
}
Write-Host ""

$objCSV = @()
[string]$csvPath = Read-Host "Please enter the path to save the .csv file to. (Ex. C:\migration)"
if ((Test-Path -LiteralPath $csvPath) -eq $false) {
	Write-Host "Invalid path specified! Exiting..."; exit
}

if($convert -eq $true)
{
	$objCSV = Import-CSV "$csvPath\MigrateUsers.csv"

    foreach ($object in $objCSV)
    {
        $user = Get-SPUser -identity $object.OldLogin -web $object.SiteCollection 
        write-host "Moving user:" $user "to:" $object.NewLogin "in site:" $object.SiteCollection 
        move-spuser -identity $user -newalias $object.NewLogin -ignoresid -Confirm:$false
    }
}
else
{
	[string]$oldprovider = Read-Host "Enter the Old Provider Name (Example -> Domain\ or i:0#.f|MembershipProvider|) "
    [string]$newprovider = Read-Host "Enter the New User Provider Name (Example -> Domain\ or i:0e.t|MembershipProvider|) "
	[string]$newsuffix = Read-Host "Enter the UPN suffix for the new provider, if desired (Example -> @domain.com) "
	[string]$newGroupProvider = Read-Host "Enter the New Group Provider Name (Example -> Domain\ or c:0-.t|MembershipProvider|domain.com\) "


    # Select Options
    Write-Host -ForegroundColor Yellow "Choose the scope of the migration - Farm, Web App, or Site Collection"
    Write-Host -ForegroundColor Cyan "1. Entire Farm"
    Write-Host -ForegroundColor Cyan "2. Web Application"
    Write-Host -ForegroundColor Cyan "3. Site Collection"
    Write-Host -ForegroundColor Cyan " "
    [int]$scopeChoice = Read-Host "Select an option 1-3: "

    switch($scopeChoice)
    {
        1 {[string]$scope = "Farm"}
        2 {[string]$scope = "WebApp"}
        3 {[string]$scope = "SiteColl"}
        default {Write-Host "Invalid selection! Exiting... "; exit}
    }
    Write-Host ""
    if($scope -eq "Farm")
    {
        $sites = @()
        $sites = get-spsite -Limit All
    }
    elseif($scope -eq "WebApp")
    {
        $url = Read-Host "Enter the Url of the Web Application: "
        $sites = @()
        $sites = get-spsite -WebApplication $url -Limit All
    }
    elseif($scope -eq "SiteColl")
    {
        $url = Read-Host "Enter the Url of the Site Collection: "
        $sites = @()
        $sites = get-spsite $url
    }

    foreach($site in $sites)
    {
		$webs = @() #needed to prevent the next foreach from attempting to loop a non-array variable
        $webs = $site.AllWebs

        foreach($web in $webs)
        {
            # Get all of the users in a site
			$users = @()
            $users = get-spuser -web $web -Limit All #added "-limit" since some webs may have large user lists.

            # Loop through each of the users in the site
            foreach($user in $users)
            {
                # Create an array that will be used to split the user name from the domain/membership provider
                $a=@()
                $displayname = $user.DisplayName
                $userlogin = $user.UserLogin

                if(($userlogin -like "$oldprovider*") -and ($objCSV.OldLogin -notcontains $userlogin))
                {
                    # Separate the user name from the domain/membership provider
                    if($userlogin.Contains('|'))
                    {
                        $a = $userlogin.split("|")
                        $username = $a[1]

                        if($username.Contains('\'))
                        {
                            $a = $username.split("\")
                            $username = $a[1]
                        }
                    }
                    elseif($userlogin.Contains('\'))
                    {
                        $a = $userlogin.split("\")
                        $username = $a[1]
                    }
    
                    # Create the new username based on the given input
					if ($user.IsDomainGroup) {
						[string]$newalias = $newGroupProvider + $username
					} else {
						[string]$newalias = $newprovider + $username + $newsuffix
					}
                    

                    $objUser = "" | select OldLogin,NewLogin,SiteCollection
	                $objUser.OldLogin = $userLogin
                    $objUser.NewLogin = $newAlias
	                $objUser.SiteCollection = $site.Url

	                $objCSV += $objUser
                }   
            }
        }
        $site.Dispose()
    }

    $objCSV | Export-Csv "$csvPath\MigrateUsers.csv" -NoTypeInformation -Force
}