
App-V 5 Server, F5 Load Balancers, and Kerberos

More fun today with Kerberos and load balancers.  Today’s challenge related to getting the Microsoft App-V publishing server to work with an F5 load balancer in a Layer 4/n-Path/DSR configuration.  Everything was working when I was accessing the individual server nodes, but when I switched to using the load balanced name and address, authentication started to fail.

After lots of log searching I eventually tried a wire trace, and found the following Kerberos error in the response from the App-V server to the App-V client:
KRB5KRB_AP_ERR_MODIFIED

Lots of different resources helped here:

  • This TechNet page explains various Kerberos errors and why they might occur:
    http://blogs.technet.com/b/askds/archive/2008/06/11/kerberos-authentication-problems-service-principal-name-spn-issues-part-3.aspx

    Of note is the scenario where the account handling the authentication request does not hold the SPN for which the request was made.  I set the SPN for my IIS application pool identity, but further analysis of the error packet shows that it was handled by my App-V server machine account, not the service account.  Augh!  Why?

  • This thread on TechNet Social was the biggest help:
    http://social.technet.microsoft.com/Forums/en-US/2b39e2b8-aba1-4e96-b18f-c5bcb9f12687/load-balancing-two-appv-50-servers-the-publishing-service-is-not-able-to-contact-the-management

    The user posted all of the steps they followed in configuring IIS and the service account SPN, including the tidbit:
    changed the authentication of the “Management Service” web site to useAppPoolCredentials=”true”
    I have never used this particular setting, so I dug into it…
  • The following MSDN article explains the IIS 7.0 feature of “kernel authentication”, how it affects the need for SPN entries, and its interplay with application pool identity accounts:
    http://blogs.msdn.com/b/webtopics/archive/2009/01/19/service-principal-name-spn-checklist-for-kerberos-authentication-with-iis-7-0.aspx

    Basically, with kernel-mode authentication, the SYSTEM account will handle all Kerberos authentication by default.  This explains why we were seeing Kerberos errors in the communications with the App-V client… the IIS pool identity account was not handling Kerberos delegation!

    Of special interest is this statement:
    Either:
    Disable Kernel mode authentication and follow the general steps for Kerberos as in the previous IIS 6.0 version.
    Or,
    [Recommended for Performance reasons]
    Let Kernel mode authentication be enabled and the Application pool’s identity be used for Kerberos ticket decryption. The only thing you need to do here is:
    1. Run the Application pool under a common custom domain account.
    2. Add this attribute “useAppPoolCredentials” in the ApplicationHost.config file.

  • This TechNet page documents how to configure Kerberos auth in IIS, and mentions the use of the IIS appcmd.exe to set the “useAppPoolCredentials” option:
    http://technet.microsoft.com/en-us/library/dd759186.aspx
    Included is the exact command line required to set the value to true:
     appcmd.exe set config -section:system.webServer/security/authentication/windowsAuthentication -useAppPoolCredentials:true
    (But the page does not really tell you what it is for, which is where the MSDN article comes in handy.)
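Putting the pieces above together, the fix for a load-balanced publishing server looks roughly like the following sketch. The account name DOMAIN\svcAppV and the name appv.mydomain.com are placeholders for your app pool service account and load-balanced DNS name:

```shell
:: Register the load-balanced name against the app pool's service account.
:: -S (unlike -A) checks for duplicate SPNs before adding.
setspn -S http/appv.mydomain.com DOMAIN\svcAppV

:: Tell kernel-mode authentication to decrypt Kerberos tickets with the
:: application pool identity instead of the machine account:
appcmd.exe set config -section:system.webServer/security/authentication/windowsAuthentication -useAppPoolCredentials:true

:: Restart IIS so the change takes effect
iisreset
```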

So, Kerberos under IIS 7 and later has some nuances not present in IIS 6.  I wonder how I did not encounter this before?

The case of the undeletable directory

I had a “fun” time today with some directories that I could not delete.  These were mandatory roaming profiles that I previously had attempted to upload to a file server using “robocopy”.  Owing to switches that I used when performing the upload, directory junctions were treated as files, and robocopy got caught in a recursive loop because of a circular reference to “Application Data”.  By the time I caught the problem, I had created a set of nested “Application Data” directories that must have been over 400 characters long.

All of the tricks I had used in the past to delete “deep” directories were failing me.  “rd /s /q [dirName]” failed with “Access Denied”, even though I was the directory owner.  Running the command as System encountered the same problem.  I mapped a drive to the directory, as deep down in the nested folders as I could get.  From there, I was able to “CD” to the deepest “Application Data” directory, but I still could not delete the directory.  (I got a “directory not empty” error.)

Eventually, Google unveiled a suggestion to use “robocopy” to mirror an empty directory to the problematic directory.  (Unfortunately, I have lost track of the link, so I cannot give credit where it is due.)  “Good idea,” I thought.  Why not use the utility that created the problem to solve the problem?  After all, if robocopy can create paths that are 400 characters long, perhaps it can delete them, too.

To test, I simply created an empty directory “E:\foo”, and ran the command:
robocopy.exe /mir /e E:\foo E:\mandatory\corruptProfile

Robocopy quickly chewed through the problematic profile, and a few seconds later I had an empty directory that I was able to delete.  Hurray!
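For reference, the whole cleanup boils down to three steps (the paths are the ones from my case; substitute your own):

```shell
:: Create an empty directory to mirror from
mkdir E:\foo

:: Mirror the empty directory onto the corrupt one. /mir already implies /e,
:: and it purges everything under the target, long paths included.
robocopy.exe /mir E:\foo E:\mandatory\corruptProfile

:: Both directories are now empty and delete normally
rd E:\mandatory\corruptProfile
rd E:\foo
```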

VDI Profile Loading Delays

We are noticing that it takes rather a long time for users to log in to our VDI environment (~2 minutes, in some circumstances).  I did some analysis of login times using Sysinternals Procmon.  (Enable boot logging, use the “view process tree” feature to look at process times at logon.  See http://blogs.technet.com/b/markrussinovich/archive/2012/07/02/3506849.aspx for details).  What I found was that a child process of explorer.exe called “ie4uinit.exe” was running for most of this time.  This process appears to be part of Microsoft “Active Setup” (discussed in some detail here: http://blog.ressoftware.com/index.php/2011/12/29/disable-active-setup-revealed/).

So what if we disable Active Setup?  Noise on the Internet suggests that this is possible, simply by deleting the key:
HKEY_LOCAL_MACHINE\Software\Microsoft\Active Setup
as suggested here:
http://communities.vmware.com/thread/292229?start=0&tstart=0

However, there is some indication that this could have unintended consequences.  In my case, it immediately caused a logon script to fail to run.  Bummer!
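If you still want to experiment with removing Active Setup, at least export the key first so the change is reversible.  A sketch, from an elevated prompt (the backup path is just an example):

```shell
:: Back up the Active Setup key before touching it
reg export "HKLM\Software\Microsoft\Active Setup" C:\Temp\ActiveSetup-backup.reg

:: Remove Active Setup (reversible with: reg import C:\Temp\ActiveSetup-backup.reg)
reg delete "HKLM\Software\Microsoft\Active Setup" /f
```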

What other solutions are possible?  Members of the Windows in Higher Education mailing list recently recommended using mandatory profiles.  There is a reasonably good rundown of the mandatory profile creation process here:
http://markswinkels.nl/2009/12/how-to-create-a-mandatory-profile-in-windows-server-2008-r2/
Missing details are:

  1. It is possible for the mandatory roaming profile to be stored locally (i.e. “C:\Users\VDI_Mandatory.V2”) to avoid over-the-network profile copy delays.  However, in our View environment, using a network location appears to be faster!
  2. The mandatory roaming profile can be specified using the Group Policy settings in Computer -> Policies -> Administrative Templates -> System -> User Profiles.  (See “Set roaming profile path for all users logging onto this computer” and “Delete cached copies of roaming profile”.)
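If you want to test the profile path setting without waiting on Group Policy, it can be written directly to the registry.  The value name below (MachineProfilePath) is what I understand the GPO to set, and the UNC path is just an example; verify both against a machine where the policy is actually applied before relying on this:

```shell
:: Equivalent (to the best of my knowledge) of the policy
:: "Set roaming profile path for all users logging onto this computer".
:: Note: no ".V2" suffix here -- Windows appends it automatically.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\System" /v MachineProfilePath /t REG_SZ /d "\\fileserver\profiles\VDI_Mandatory"
```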

In testing, I found initial logon times were reduced from two minutes to approximately 20 seconds.  Good!  (But still not great.)  Additional benefits are that it is no longer necessary to run the logon script that I developed to customize the Start Menu and Task Bar.  I also can remove the Group Policy preferences that clean up local profiles on the computer.

MBAM Configuration Nuances

This week we are continuing testing of the new Microsoft Bitlocker Administration and Management 2.0 tool (MBAM).

MBAM is not overly complicated, but it does have several service tiers and dependencies which make initial setup a bit irksome. After plowing through configuration of a SQL database, SQL Reporting Services, and IIS, we still needed to configure MBAM Group Policy settings, and then we needed to make a fair number of tweaks to get the service actually working. Here are the most significant deviations from the official documentation:

  1. The Group Policy templates for MBAM are not uploaded to the AD Policy Store during product installation, nor does the documentation recommend that you complete this step. However, if you want to be able to edit MBAM policy from any workstation in the domain, you really do need to upload the ADMX templates. Making this happen is easy… just use the MBAM installer to install the MBAM policy templates locally, open C:\Windows\PolicyDefinitions, and copy BitLockerManagement.admx and BitLockerUserManagement.admx to \\[domain]\SYSVOL\[domain]\Policies\PolicyDefinitions (you will need domain admin rights to do this). Also copy the corresponding .adml files from the local language directory of your local PolicyDefinitions directory to the matching language directory on the domain controller (in my case, the “en-US” subdirectory).
  2. After installing the MBAM Client and policy settings, clients were failing to auto-initiate encryption, and were failing to report status to the management server.  The MBAM Admin Event Logs were showing the following error:
    Log Name: Microsoft-Windows-MBAM/Admin
    Source: Microsoft-Windows-MBAM
    Event ID: 4
    Task Category: None
    Level: Error
    User: SYSTEM
    Computer: machinename.domainname.com
    Description: An error occurred while sending encryption status data.
    Error code: 0x803d0013

    This is occurring for a few reasons.  One, the MBAM server is not trusted for delegation, so it cannot perform Kerberos authentication in IIS.  Two, the public URL for MBAM services (https://bitlocker.uvm.edu) does not match the internal name of the server (BAM1).  To fix this, we needed to perform a few additional configuration steps:

    1. Create the following key and value on the MBAM management server:
      HKEY_LOCAL_MACHINE\Software\Microsoft\MBAM
      DWORD(32-bit) - DisableMachineVerification
      Value = 1
    2. On the MBAM Administration Server AD object, enable the “Trust for delegation for any service (Kerberos Only) option”, under the Delegation tab.
    3. Use the “setspn” utility to add additional principal names for the public URL of the server to the AD server account:
      setspn -A HOST/bitlocker.mydomain.com MYDOMAIN\MyServer$
      setspn -A HTTP/bitlocker.mydomain.com MYDOMAIN\MyServer$
      setspn -A RestrictedKrbHost/bitlocker.mydomain.com MYDOMAIN\MyServer$
      (Note that if using a service account to run the MBAM Administration Service, you should use “setspn” to set the HOST/HTTP names for the service account instead of the domain computer account).
    4. It appears that it may also be necessary to add the “BackConnectionHostNames” Reg_multi_Sz value to “HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0” to include any public dns names used by the MBAM Administration Server (this likely only is necessary in a load balanced configuration).

    We then needed to perform an IISRESET on the management server, and cycle the MBAM Clients.
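    The registry changes in steps 1 and 4 above can be scripted; a sketch (the host name is an example — use your own public URL):

```shell
:: Step 1: skip machine verification on the MBAM management server
reg add "HKLM\Software\Microsoft\MBAM" /v DisableMachineVerification /t REG_DWORD /d 1

:: Step 4: allow loopback authentication against the public DNS name
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" /v BackConnectionHostNames /t REG_MULTI_SZ /d "bitlocker.mydomain.com"

:: Then restart IIS
iisreset
```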

  3. The MBAM Help Desk web application was failing to display reports.  This was happening because the installer grabbed the unencrypted reporting services URL from the reporting services instance.  I had to open:
    C:\inetpub\Microsoft BitLocker Management Solution\Help Desk Website\web.config
    Then locate the tag that holds the Reporting Services URL, and edit its value to the SSL version of the Reporting Services web site.
  4. The MBAM documentation claims that you will use MBAM policies in place of standard Windows BitLocker policies.  This is somewhat misleading… Many MBAM policy settings also will change the “classic” BitLocker policy settings, so it will appear that you have configured both classic and MBAM policies in the editor.  This would not really be a problem were it not for the fact that MBAM policies are not comprehensive.  You may need to return to the “classic” settings to configure appropriate behavior in your environment.  For example, we experienced difficulty in encrypting a Dell Latitude 10 tablet using MBAM.  On this machine, we saw the following error in the MBAM Admin Event Log:
    Event ID: 2
    An error occurred while applying MBAM policies.
    Volume ID:\\?\Volume{VolumeGUID}\
    Error Code: 0x803100B6
    Details:  No pre-boot keyboard or Windows recovery Environment detected.  The user may not be able to provide the required input to unlock the volume.

    This error is happening because our policy is set to “Allow PIN” (BitLocker PIN Authenticator is allowed, but not required).  Apparently, MBAM default-fails the attempted encryption, even though this is not a “fatal” error.  To allow encryption to continue, I needed to set the classic policy “Enable use of BitLocker authentication requiring preboot keyboard input on slates” as defined here:
    http://technet.microsoft.com/en-us/library/jj679890.aspx#BKMK_slates
    With this policy in place, encryption completes successfully on the tablet computer.

Other than these caveats, the tool does appear to be working.  Setting up our PGP Universal Server was easier, but suffering through the pain of ongoing PGP disk encryption support was agonizing.  Hopefully a little time spent on configuring a solid BitLocker support environment will bear lasting fruit for our constituents down the road.

Additional Resources:

Rick Delserone’s MBAM: Real World Information – A rundown on MBAM Certificate Configuration, Group Policy Templates, and undocumented registry settings:
http://www.css-security.com/blog/mbam-real-world-information/

SQL Server 2012, Transparent Data Encryption, and Availability Groups

We are looking into using Microsoft Bitlocker Administration and Monitoring (MBAM) 2.0 to manage BitLocker in our environment. One requirement for MBAM is a SQL Server database instance that supports Transparent Data Encryption (TDE).  (Update 2013-06-04:  Microsoft now claims that TDE is “optional” with MBAM 2.0, which is nice to know.  If only they had told me this before I went to the trouble of setting up SQL 2012 Enterprise just for this project!)  Currently we also are in the process of investigating the creation of a consolidated SQL 2012 Enterprise Edition “Always On” Availability Group. I wanted to see if I could create the MBAM Recovery Database in a SQL 2012 Availability Group. This proved slightly tricky… fortunately I was able to find a decent reference here:
https://www.simple-talk.com/sql/database-administration/encrypting-your-sql-server-2012-alwayson-availability-databases/

The trick is, you need to create SQL certificates on each member server of the Availability Group that have the same name and are generated from the same private key. The procedure follows…

On the first server in the group, create a SQL Master Key and Certificate by running the following code. The script will create a backup file in your SQL Server data directory. Move this file to an archival location. If you lose the file and password, you will not be able to recover encrypted databases in a disaster event:

USE MASTER
GO

-- Create a Master Key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Password1';

-- Backup the Master Key
BACKUP MASTER KEY
   TO FILE = 'Server_MasterKey'
   ENCRYPTION BY PASSWORD = 'Password2';

-- Create Certificate Protected by Master Key
CREATE CERTIFICATE SQLCertTDEMaster
   WITH SUBJECT = 'Certificate to protect TDE key';

-- Backup the Certificate
BACKUP CERTIFICATE SQLCertTDEMaster
   TO FILE = 'SQLCertTDEMaster_cer'
   WITH PRIVATE KEY (
       FILE = 'SQLCertTDEMaster_key',
       ENCRYPTION BY PASSWORD = 'Password3'
   );

Now create a master key on any secondary servers in the availability group, and create the same cert by using the backup file from the first step, above. You will need to copy the certificate backup files to the local server data directory, or use a network share that is accessible to the account running the script:

-- Create a Master Key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Password1';

-- Backup the Master Key
BACKUP MASTER KEY
   TO FILE = 'Server_MasterKey'
   ENCRYPTION BY PASSWORD = 'Password2';

-- Create the Certificate from the backup taken on the first server
CREATE CERTIFICATE SQLCertTDEMaster
   FROM FILE = 'SQLCertTDEMaster_cer'
   WITH PRIVATE KEY (
       FILE = 'SQLCertTDEMaster_key',
       DECRYPTION BY PASSWORD = 'Password3'
   );

To avoid needless trouble, create your new database and add it to your availability group before encrypting the database. Once the database is created, you can initiate encryption by opening SQL Server Management Studio, right-clicking your database, selecting Tasks, and then selecting “Manage Database Encryption”. Choose the option to generate the database encryption key using a server certificate, select the certificate created above, and select the option to “set database encryption on”.
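If you prefer to script that last step rather than click through the GUI, the equivalent T-SQL is roughly the following (a sketch; “MBAMRecovery” is a stand-in for your database name, and the certificate name matches the one created above):

```sql
USE MBAMRecovery;
GO

-- Create the database encryption key, protected by the server certificate
CREATE DATABASE ENCRYPTION KEY
   WITH ALGORITHM = AES_256
   ENCRYPTION BY SERVER CERTIFICATE SQLCertTDEMaster;
GO

-- Turn encryption on; progress can be watched in
-- sys.dm_database_encryption_keys (encryption_state, percent_complete)
ALTER DATABASE MBAMRecovery SET ENCRYPTION ON;
GO
```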

Once the database is encrypted, be sure to test availability group failover to make sure the secondary servers are able to work with the encrypted database.

Dell XPS 12 – The Windows 8 Flagship?

Regular readers of my blog (all two of you) may recall the “series” I started this fall on Windows 8 launch devices (concerning the HP Envy X2 and the Samsung SmartPC Pro 700t). These devices both had strengths, but failed in other ways that made them difficult or impossible to support in an enterprise environment. This month, I got my hands on a device that breaks though that barrier and satisfies in a big way. The new Dell XPS 12 finally arrived on our campus about two weeks ago. We immediately were taken with its light weight (3 lbs.), sleek styling, and novel materials (full carbon fiber base, carbon fiber and aluminum lid, and that unique flip-over touch screen). The 8-second boot time is another impressive feature. A longer battery life would have been appreciated, but I can live with it. Other helpful enhancements would be the inclusion of an active stylus. I also would appreciate slightly more resistance in the keyboard.

Others have weighed in on the appearance, performance, and usability of this fancy Ultrabook, though, so I will forgo further commentary on those aspects of the XPS 12. What most concerned us was the ability to support OS redeployment, BitLocker encryption, and hardware servicing on our Campus.

We unboxed and re-deployed the computer with Windows 8 Enterprise within one day. There were a few deployment hiccoughs, but in general re-deployment was what we have come to expect from Dell. All required drivers for the XPS 12 were made available in a single downloadable CAB file. We extracted this CAB to our MDT/LiteTouch Deployment Share, rebuilt our boot media, and initiated a LiteTouch deployment. There was a brief problem getting LiteTouch to start… we needed to disable the “Safe Boot” option in EFI/BIOS, and we needed to set the EFI boot mode to “Legacy” to allow our boot media to operate. Once those changes were made, the XPS 12 booted to our USB WinPE media without complaint. Upon completion of deployment, all devices in the device manager reported as functioning. There were no “poorly-behaved” drivers that required un-scripted installation. We did find that the track-pad was behaving strangely. Investigation revealed that the PnP process had grabbed a Windows 7 track-pad driver from our deployment share. We corrected this manually, then separated our Windows 8 drivers from our Windows 7 drivers in the Deployment Workbench… this should prevent the problem from recurring in future deployments.

BitLocker was easy to implement. The TPM chip readily was recognized by the OS, and TPM-with-PIN encryption was accomplished in minutes. I spent half a day trying to encrypt an older Dell Latitude E6500 a few months back. This was a breeze by comparison.

On the servicing front, we have good news. Dell now is allowing on-site servicing for all XPS models, with full reimbursement for parts and labor for qualified technicians. Physical serviceability is a big concern for newer Ultrabooks. A troubling trend in tablet and notebook design is the use of solder on drive mounts and glue to hold batteries in place (the latest “Retina” MacBooks and the MS Surface tablets suffer from these problems). Fortunately, it appears that all major components of the XPS 12 can be removed and replaced without the need to re-solder or remove glue. The most frequently swapped components such as the battery, mSATA drive, and memory chips look pretty easy to access. The keyboard is a bit of a pain to get to, but at least it can be serviced.

If only more Windows 8 launch products had been this good… I hope we see more products of this quality coming from Dell (and other vendors) in the near future.

Update:  2013-11-1

Five months into using the XPS 12, I started to have trouble with the trackpad.  It would not click anymore!  Since we are working with an evaluation unit, I do not have warranty coverage, so I figured I had no warranty to void by attempting to repair it on my own.

Some digging in the Dell support site revealed that the so-called XPS 12 “User Manual” is actually a service manual!  The readily available PDF document illustrates step-by step how to remove the carbon fiber base plate and the battery in order to get to the track pad.  (The only challenging part was locating a #5 Torx screwdriver to take off the base plate.)  Within 15 minutes I had removed the click pad, and cleaned the trapped grit out from under it.  (Within a half hour I had the unit re-assembled.  In another 15 minutes I had taken the base plate back off, reconnected the battery power connector, and re-attached the base plate, again.)  The unit powered back on as normal, with the track pad working like new.

At a time when consumer devices are moving towards non-serviceable designs (think MacBook Retina), it is nice to see a device that is thin and light while still maintaining serviceability.  Perhaps the track pad on the MacBook Retina is less prone to trapping grit, but imagine if it did?  With all the components glued together, you might be out $2000 because of a bit of sand.  I really have to hand it to Dell.  These XPS Ultrabooks are really nicely engineered.

 

VMware View – Implementing Idle User Auto-Logout

We are going live with our first public VMware View terminals this week (Wyse P25 “zero-clients”… nice units).  I had what I expected would be an easy list of “little jobs” to be completed before going live. Famous last words…

One item on the list was implementing an “idle user logout” process.  This process would detect when a View session had gone idle, and would disconnect the session automatically (preferably after prompting the user).  This disconnected session then would be logged out by View Manager after a fixed amount of time.

This proved rather more difficult than I had predicted.  I tried several solutions before arriving at one that worked.  Among the failed solutions:

  • Using Group Policy to configure Remote Desktop Session Manager idle session limits.  The View configuration documents imply that this should work, but it does not.  I expect that the policies would be effective if you were connecting to your View desktops using RDP, but PCoIP sessions just will not disconnect automatically (at least, they would not for me).
  • Using the Windows Task Scheduler to configure a disconnect script that will trigger on idle.  This did not work for two reasons.  First, the Task Scheduler only evaluates for idle conditions every 15 minutes.  Second, for the Task Scheduler, “idle” means not only that the user is not directing mouse and keyboard to the computer, but that the CPU also is not doing anything.  As a result, we could not get consistent auto-logout times.

The solution that we settled on involved the use of a custom screensaver, “ScreenSaver Operations”, developed by the “Grim Admin”:
http://www.grimadmin.com/staticpages/index.php/ss-operations

This is a great little utility that accomplishes what the “WinExit” screensaver used to accomplish on Windows XP.  (WinExit cannot easily be used on Win7, and is a bit hostile to 64-bit Windows.)  Screensaver Operations has a well-written README describing the use of registry entries to control the screensaver globally (i.e. for all users on the computer).  I set these registry operations as Group Policy Preferences, and we are in business.

Two slight complications… since the screensaver is 32-bit, you need to use the “sysnative” filesystem redirector if you want the screensaver to trigger 64-bit executables.  In our case, I wanted the screensaver to launch “tsdiscon.exe” (to disconnect the View session), so I had to use the path:
%windir%\sysnative\tsdiscon.exe
Additionally, you will need to specify the full path to the screensaver in the Group Policy dialogs (i.e. %SystemRoot%\SysWOW64\Screensaver Operations.scr).  If you fail to do so, the screensaver will appear to be configured in the Control Panel, and you will be able to preview it by clicking the “preview” button, but the screensaver WILL NEVER START.
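For the curious, the generic Windows screensaver settings that these Group Policy Preferences push are the standard per-user values sketched below.  (Screensaver Operations’ own behavior settings come from its README and are not reproduced here; the 10-minute timeout is just an example.)

```shell
:: Screensaver binary -- the full path is required, as noted above
reg add "HKCU\Control Panel\Desktop" /v SCRNSAVE.EXE /t REG_SZ /d "%SystemRoot%\SysWOW64\Screensaver Operations.scr"

:: Idle timeout in seconds (here, 10 minutes)
reg add "HKCU\Control Panel\Desktop" /v ScreenSaveTimeOut /t REG_SZ /d 600

:: Enable the screensaver
reg add "HKCU\Control Panel\Desktop" /v ScreenSaveActive /t REG_SZ /d 1
```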

Ashamedly I will admit that this little challenge took much longer to accomplish than it should have.  No wonder lab managers burn out so easily.

Evaluating Windows 8 Tablets – Samsung ATIV SmartPC Pro (700T)

The journey continues…

The boss approved purchase of a Samsung ATIV SmartPC Pro (the “700T” model).  I was sooo excited… this was the tablet PC I had been waiting for.  Thin, light, and fully convertible from Ultrabook to slate.  Stylus included, 1080p high-definition display, full Intel i5 processor.  So much to love…

First impressions were really positive.  The build quality seemed really high… solid magnesium case, good keyboard response, fast boot, very responsive Wacom digitizer stylus.  As a tablet, this thing is awesome.  And while it is expensive compared to an iPad, it is very cheap compared to the Tablet PCs of yesteryear.

However, I quickly ran into trouble.  When typing with the SmartPC on my lap, the keyboard would frequently disconnect from the display.  It would not fall off, but the tablet component would lose electrical connection to the keyboard, causing typing input to stop.  Sometimes this would happen as often as five times in a single line of text.  Awful!

There were other problems as well.  Like the HP Envy X2, the screen does not tilt back far enough to allow comfortable use of the keyboard on a countertop.  The 1080p display, which is very crisp and bright, is inconvenient to use for remote desktop connections to Server 2008 R2 and earlier hosts (the fonts do not scale for remote desktop sessions, leading to comically tiny print and ridiculously small buttons and window controls).  The system did not include a TPM chip (that is only available on the models that ship with Win8 Pro… something that was not clear when ordering the device).  And finally, Samsung does not bundle drivers for the SmartPC in any way that is convenient for business deployment.  Re-imaging the systems would be a pain.

It also is worth noting that Microsoft decided that in-place upgrades of retail versions of Win 8 to volume license editions would not be supported.  If you simply want to install Win 8 Enterprise over the factory-shipped consumer edition of Win8, you are out of luck.  I also experienced this problem with the HP Envy X2.  For corporate users, volume license installs are strictly a nuke-and-repave operation.  Booooooo!  This is not Samsung’s fault, but the lack of support for business deployment (i.e. driver bundles or driver repository building tools) is a killer for the SmartPC in the enterprise.

I really wanted to love this device, but I really just have to return it.  Consumers seeking a top-performance tablet may love it, but it does not work for this sysadmin.  I am hoping that the Lenovo ThinkPad Helix will work out better.

Evaluating Windows 8 Tablets in the Enterprise – HP Envy X2

In desperation over our inability to tell University employees what they should be looking for in a Windows 8 tablet, I asked the boss if we could get our hands on one of the new Intel “Atom” processor-based Windows 8 tablets (these are the “Clover Trail” Atom processors, designed to compete with ARM-based devices).  I had been wanting to eval a Samsung ATIV Smart PC 500T, but these have been hard to get locally.  Instead, I bought a hot-off-the-shelf HP Envy X2.  This device boasts a well-engineered all-metal shell, a full-size keyboard dock with a full-sized HDMI port and SD card reader, and an extra battery in the dock for a claimed 15-hour run life.  It also claims to support an optional digitizer stylus.

I only have just started putting the machine though its paces.  My first impression is that it performs surprisingly well as a standard notebook, but that there will be significant challenges in supporting these types of devices at the same level as our existing business-model Dell systems.  I am not going to bother “reviewing” this tablet… others in the trade can handle that.  Rather, this blog post is going to address the challenges of supporting a consumer tablet in a business environment.

  • Processor:  The new “Clover Trail” Atom processors are 32-bit only.  Surprise!  I thought the industry was leaving 32-bit behind, but it appears to be alive and well.  We had made the initial decision to support only 64-bit Windows 8, and have developed only 64-bit baseline images.  I see that this choice will need to be reconsidered.
  • EFI/UEFI:  These new systems boot using EFI, with emulated BIOS, with the “SafeBoot” option enabled.  Out of the box, you cannot boot to USB because SafeBoot prohibits this.  You need to load your OS to change EFI options.  EFI is not identical between systems, so navigating the process of booting to deployment/maintenance media will be a tough challenge for technicians to work through.  I actually was completely unable to boot the Envy X2 to a USB flash drive, running either WinPE (MDT boot media) or the FreeDOS-based(?) CloneZilla live CD.  Bummer.
  • Drivers:  Most new tablets are aimed at the consumer market.  As a result, the vendors make little effort to package drivers in a way that is convenient for local IT staff to integrate into on-premise Windows deployment tools.  The Envy X2 is no exception.  A small handful of one-off driver installers are available, including a big bundle of Intel Chipset drivers.  The chipset drivers were critical in getting a freshly installed Windows 8 Enterprise OS working with the hardware.
  • Windows Editions:  This tablet shipped with “Windows 8”.  Not “Home”, not “Professional”, not “Ultimate”.  I tried performing an in-place SKU upgrade to Windows 8 Enterprise, but setup.exe said that this was not supported, so I needed to do a full OS install.  This process worked, but it was seriously aggravating to have to boot to the OEM OS, start the Enterprise OS install, re-install all of the required drivers, then clean up the original OS install.  Our users will not want to have to deal with this, and it will make our IT support staff very tired.
  • Hardware:  No Ethernet.  Unfortunately, our MDT/LTI deployment tools are designed to run over Ethernet, not Wi-Fi.  The LTI scripts actually will terminate if a Wi-Fi connection is detected.  Of course, application-only LTI task sequences really should run just fine over wireless, but the scripts still will not run over wireless.  We either will have to comment out the Wi-Fi checks, or require that the person launching LTI have a USB Ethernet dongle handy.

So… a lot of challenges.  More details as time permits.

Oh, one small “review style” note.  I decided to evaluate this tablet because HP claims that it supports an optional stylus.  However, the stylus for this device is not actually available for sale at this time.  Further, the device does not use the common Wacom or N-Trig digitizers, so buying a spare “Bamboo” stylus will not help you here.  HP has chosen to use the new “Atmel” integrated touch/pen sensors, and as such an Atmel-compatible stylus is required.  I cannot find these on the market anywhere.  As a result I cannot make any recommendation for or against the purchase of this device for Tablet PC enthusiasts.  I don’t even know if the stylus will be available for sale before the return period for this device expires.

UPDATE:

I returned this tablet.  Why?  It was not the screen resolution, which I thought would be a problem but was not.  There were four primary reasons:

  1. It was not possible to determine the quality of the digitizer within the return period of the tablet.  I was unwilling to accept the risk of having a low-quality stylus for note taking.
  2. Keyboard dock quality was low.  The keyboard itself was reasonably good, but the trackpad was very annoying.  The texture was awful, and it was overly sensitive to the slightest palm brushes.  Given the small size of the keyboard deck, it was impossible to avoid brushing the trackpad, too.  Also, the screen did not tip back far enough for comfort when used on a countertop or other waist-height surface.
  3. Business deployment essentially was unsupportable.  HP support could not assist me with initialization of the TPM chip for BitLocker.  It appeared that a TPM was present, but there was no option in BIOS to reset the TPM, and the OS could not get ownership of the chip.  Also, the total lack of driver bundles would make deployment using MDT very difficult.
  4. The graphics card could not drive my external display at native resolution.  It maxed out at 1080p.

I did quite like this tablet, though.  Consumers seeking an additional computer for the road may really enjoy using it.  For fussy power users like me, it was close but not quite there.

Windows 8 Hardware – Waiting for Godot?

I have been running Windows 8 CP on my primary workstation for about two weeks now.  The experience is surprisingly good, although I am sure that workaday users of Windows are going to freak out at the sight of the Metro UI, especially when accessed with the traditional Windows keyboard and mouse.  To that end, I thought it might be useful to get my hands on some touch-enabled hardware.

This has turned out to be less than feasible.  According to the Windows 8 build blog, Win8-certified touch devices will have to be capable of handling five-point touch input:
http://blogs.msdn.com/b/b8/archive/2012/03/28/touch-hardware-and-windows-8.aspx

This is an interesting point of data, because Windows 7 “touch ready” devices only needed to support two-point multi-touch.  Thus, almost the entire mini-ecosystem of touch devices built for Windows 7 will never get Win8 certification.  Those touch monitors from Dell and HP?  Nope.  All-in-one touchscreen PCs from a multitude of manufacturers?  Nope.

It looks as though the Win8 touch interface has been designed with the capacitive multi-touch displays that are commonplace on tablets and smartphones in mind.  But even a number of current Tablet PCs and Windows slates with capacitive multi-touch will be out in the cold, as a lot of them only support four-point multi-touch.  As for multi-touch monitors, the only ones that I can find that support 5+ points of touch are the 3M displays referenced in the Win8 build blog (see above).  Since these displays retail for over $1000, I think most people would be better off buying a tablet like the ASUS EP121, ASUS B121, or the Samsung Slate 7.
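If you want to check whether a given machine would clear the new bar, Windows exposes the digitizer's maximum contact count through GetSystemMetrics with the SM_MAXIMUMTOUCHES index (95).  The sketch below queries it via ctypes on Windows (returning 0 elsewhere, where the metric does not exist); the helper function and constant names are my own.

```python
import ctypes
import sys

# Certification thresholds described above:
WIN7_TOUCH_READY = 2  # Windows 7 "touch ready" minimum simultaneous contacts
WIN8_CERTIFIED = 5    # Windows 8 touch certification minimum

def win8_touch_certifiable(max_contacts: int) -> bool:
    """True if the digitizer reports enough contacts for Win8 certification."""
    return max_contacts >= WIN8_CERTIFIED

def query_max_contacts() -> int:
    """On Windows, SM_MAXIMUMTOUCHES (95) reports the number of touch
    contacts the hardware supports (0 if no digitizer).  On other
    platforms there is nothing to query, so report 0."""
    if sys.platform == "win32":
        return ctypes.windll.user32.GetSystemMetrics(95)  # SM_MAXIMUMTOUCHES
    return 0

if __name__ == "__main__":
    contacts = query_max_contacts()
    print(f"Reported contacts: {contacts}")
    print(f"Win8 certifiable: {win8_touch_certifiable(contacts)}")
```

Run on a Win7-era slate that tops out at four contacts, this would report non-certifiable, which is precisely the "out in the cold" problem described above.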

I suppose you might be able to get some mileage out of multi-touch track pads.  Most newer laptops have pads that support multi-touch, but my venerable Dell E6500 does not.  To that end, I am going to try out a Logitech TouchPad Wireless to see if having a gesture-supporting track pad buys me anything in Win8-ville.  I’ll post back with results.

In any event, it seems that those wanting to see what the Windows 8 touch experience will really be like are going to have to wait on some hardware that does not yet exist.  Touch screen ultrabooks?  Hopefully this will be more fruitful than Waiting for Godot.