Tag Archives: Scripting

Driver installation with SCCM Software Distribution

Here we are, working with SCCM again.  Making difficult things possible, and simple things difficult.  Today we wish to distribute a SmartCard driver to all of our managed servers, so that we can require Smart Card for certain classes of logins.  The newer “CNG” Smart Card minidrivers are all simple “.inf” driver packages that you can right-click install.  This ought to be easy, thought the sys admin.  Wrong!

Installation of inf drivers is not a well documented command line procedure (unlike the rather more complicated “.msi” package, which at least is easy to script).

My thanks go out to the several bloggers and forum users whose posts assisted with this case.

The script that I cobbled together to install the Athena “ASECard” minidriver is displayed below.  Note that this should work for pretty much any minidriver, as long as it has a “DefaultInstall” section in the inf file.  I just unpack the amd64 and x86 driver cab files into their respective directories, put the batch script one directory above these, and make an SCCM software package of the whole thing.  The installation command line is simply the batch file name.

@echo off
REM Installs the drivers specified in the "DefaultInstall" section
REM of the aseMD.inf that is appropriate for the current (x86 or amd64) platform.
REM Install is silent (4 flag), with no reboot (N flag).
REM The INF is specified to be in the x86 or amd64 subdirectory
REM of the script directory (%~dp0).

echo Detecting platform...
IF EXIST "%programfiles(x86)%" (GOTO :amd64) ELSE (GOTO :i386)

:i386
echo Installing 32-bit driver...
cd x86
%windir%\system32\rundll32.exe advpack.dll,LaunchINFSectionEx "%~dp0x86\aseMD.inf",DefaultInstall,,4,N
goto :EOF

:amd64
REM The command will run in 64-bit mode (%windir%\sysnative\),
REM when called from a 32-bit CMD.exe (as will be the case with SCCM).
echo Installing 64-bit driver...
cd amd64
%windir%\sysnative\rundll32.exe advpack.dll,LaunchINFSectionEx "%~dp0amd64\aseMD.inf",DefaultInstall,,4,N
goto :EOF
REM End of file

Windows Backup Performance Testing with PowerShell

While developing our new Windows file services infrastructure, we wanted to test our pre-production platform to see if there are any file server-side bottlenecks that will cause unacceptable delays in backup processing. Here at UVM we still are using EMC Networker for enterprise backup (no comments on our satisfaction with EMC will be provided at this time). EMC provides a tool, “uasm.exe”, that is used at the core of the “save.exe” and “recover.exe” commands on the backup client. If we use “uasm.exe” to back up all of the file server data to “null”, it is possible that we will be able to detect disk, HBA, and other local I/O bottlenecks before they bite us in production.

Since Networker will break up our file server into multiple “save sets”, and run a user-definable number of save set backup processes in parallel, it also is important for us to determine the number of parallel backup processes required to complete backup in a timely fashion. Thus, we want to run several parallel “uasm.exe” processes in our tests.

PowerShell, with the assistance of “cmd.exe” and some finesse, can get this job done. Hurdles I ran into while scripting this test follow:

  1. During development, PowerShell consumed huge amounts of CPU while redirecting uasm.exe output to the PowerShell $null object. Interestingly, previous tests using uasm.exe with cmd.exe did not show this problem. To fix this, each uasm job is spawned from a one-line cmd.exe “bat” script, which is included below.
  2. Remember that PowerShell uses the null object “$null”, but that cmd.exe uses the handle “nul” (with one “L”). If you redirect to “null”, you will soon fill up your disk with a file named “null”.
  3. When I wanted to examine running jobs, it was difficult to determine which directory a job was working on. This was because I initially created a scriptblock object and passed parameters to it when starting a job. For example:
    [scriptblock] $sb = {
    	param ([string]$sPath)
    	[string[]] $argList = '/c','c:\local\scripts\uasm_cmd.bat',$sPath
    	& cmd.exe $argList
    }
    $jobs += start-job -Name $myJob -ScriptBlock $sb -ArgumentList $dir1
    

    However, when inspecting the job object’s “command” property, we see “$sPath” in the output. We want the variable expanded. How to do this? Create the scriptblock object in-line when starting the job:

    [string] $cmd = '& cmd.exe "/c","c:\local\scripts\uasm_cmd.bat",' + $dir
    $jobs += Start-Job -Name $jobName `
    	-ScriptBlock ([scriptblock]::create($cmd))
    

    This makes for more compact code, too.

  4. To check on jobs that have completed, I create an array named “$djs” (Done Jobs), populated by piping the $jobs array and filtering for “completed” jobs. I inspect $djs to see if jobs are present. In my first pass, I used the check:
    if ($djs.count -gt 0)

    Meaning, continue if there is anything in the array $djs. However, this check did not work well: even when no jobs had completed, the empty pipeline output still added a single null item to $djs, so $djs would have a count of one! I fixed this by changing the test:

    if ($djs[0] -ne $null)

    Meaning, if the first entry in $djs is not a null object, then proceed.

The full script follows:

#uasm_jobQueue.ps1, 2011-09-30, author: J. Greg Mackinnon
#Tests performance of disk when accessed by Networker backup commands.
#   Creates a queue of directories to test ($q), then uses external command 
#   "uasm.exe" to backup these directories to null.
#Change the "$wp" variable to set the number of uasm 'worker processes' to be 
#   used during the test.
#Note: PowerShell $null object causes very high CPU utilization when used for
#   this purpose.  Instead, we call "uasm_cmd.bat" which uses the CMD.exe 'nul'
#   re-director.  'nul' does not have the same problems as $null.

set-psdebug -strict

[int] $wp = 4

# Initialize the log file:
[string] $logfile = "s:\uasm_test.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

##Create work queue array:
# Add shared directories:
[String[]] $q = gci S:\shared | ? {$_.Attributes.tostring() -match "Directory"}`
	| sort-object -Property Name | % {$_.FullName}
# Add remaining targets to queue:
$q += 'H:\','I:\','J:\','K:\','L:\','M:\','S:\sis\','S:\software\','s:\r25\'
	
[int] $dc = 0			#Count of completed (done) jobs.
[int] $qc = $q.Count	#Initial count of jobs in the queue
[int] $qi = 0			#Queue Index - current location in queue
[int] $jc = 0			#Job count - number of running jobs
$jobs = @()				#Jobs array - intended to contain running PS jobs.
	
while ($dc -lt $qc) { # Completed jobs is less than total jobs in queue
	# Start new jobs while the running-job count is below the worker-process
	#  limit ($wp) and there are still unstarted entries in the queue.
	while (($jobs.count -lt $wp) -and ($qc -gt $qi)) { 
		[string] $jobName = 'qJob_' + $qi + '_';
		[string] $dir = '"' + $q[$qi] + '"'
		[string] $cmd = '& cmd.exe "/c","c:\local\scripts\uasm_cmd.bat",' + $dir
		#Start the job defined in $cmd string.  Use this rather than a pre-
		#  defined scriptblock object because this allows us to see the expanded
		#  job command string when debugging. (i.e. $jobs[0].command)
		$jobs += Start-Job -Name $jobName `
			-ScriptBlock ([scriptblock]::create($cmd))
		$qi++ #Increment the queue index.
	}
	$djs = @(); #Completed jobs array
	$djs += $jobs | ? {$_.State -eq "Completed"} ;
	# $djs array will always have a count of at least 1.  However, if the 
	#    first entry is not empty (null), then there must be completed jobs to
	#    be retrieved.
	if ($djs[0] -ne $null) { 
		$dc += $djs.count;
		$djs | Receive-Job | Out-File $logfile -Append; #Log completed jobs
		$djs | Remove-Job -Force;
		Remove-Variable djs;
		$jobs = @($jobs | ? {$_.State -eq "Running"}); #rebuild jobs array.
	}
	Start-Sleep -Seconds 3
}


# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $outstr =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$outstr | out-file -Append $logfile

The “uasm_cmd.bat” file called in the above code block contains the following one line:

"c:\program files\legato\nsr\bin\uasm.exe" -s %1 > nul

Migrating from NetApp to Windows File Servers with PowerShell – part 2

Previously we saw how PowerShell and RoboCopy can be used to sync multi-terabyte file shares from NetApp to Windows. What I did not tell you was that this script choked and died horribly on a single share in our infrastructure. You may have seen it commented out in the previous script? “#,'R25'”?

CollegeNet Resource25… my old enemy. These clowns worked around a bug in their product (an inability to read an open text column in an Oracle DB table) by copying every text row in the database to its own file on a file server, and to make matters worse they copy all of the files to the same directory. Why is this bad? Ever try to get a directory listing on a directory with 480,000 1k files? It’s bad news. Worse, it kills robocopy. Fortunately, we have a workaround.

The archive utility “7-zip” is able to wrap up the nasty directory into a single small file, which we then can unpack on the new file server. Not familiar with 7-Zip? For shame! Get it now, it’s free:
http://www.7-zip.org/

7-zip ignores most file attributes, which seems to speed up the copy process a bit. Using robocopy, our sync operation would either run for hours on this single directory, or just hang up forever. With 7-zip, we get the job done in 30 minutes. Still slow, but better than never.

Troublesome files are found in the R25 “text_comments” directory, a subdirectory of “text”. We have prod, pre-prod, and test environments, and so need to do a few separate 7-zip archives. Note that a little compression goes a long way here. When using “tar” archives, my archive was several GB in size. With the lowest level of compression, we squeeze down to only about 14 MB. How is this possible? Well, a lot of our text comment files were empty, but uncompressed each one still takes up one block of storage. Over 480,000 blocks (at a typical 4 KB block size, nearly 2 GB of allocated space), this really adds up.

Code snippet follows.

#Sync R25 problem dirs

Set-PSDebug -Strict

# Initialize the log file:
[string] $logfile = "s:\r25Sync.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

function zipit {
	param ([string]$source)
	[string] $cmd = "c:\local\bin\7za.exe"
	[string] $arg1 = "a" #add (to archive) mode
	[string] $arg2 = join-path -Path $Env:TEMP -ChildPath $($($source | `
		Split-Path -Leaf) + ".7z") # filespec for archive
	[string] $arg3 = $source #spec for source directory
	[string] $arg4 = "-mx=1" #compression level... minimal for performance
	#[string] $arg4 = "-mtm=on" #timestamp preservation - commented out for perf.
	#[string] $arg5 = "-mtc=on"
	#[string] $arg6 = "-mta=on"
	#invoke command, route output to null for performance.
	& $cmd $arg1,$arg2,$arg3,$arg4 > $null 
}

function unzipit {
	param ([string]$dest)
	[string] $cmd = "c:\local\bin\7za.exe"
	[string] $arg1 = "x" #extract archive mode
	[string] $arg2 = join-path -Path $Env:TEMP -ChildPath $($($dest | `
		Split-Path -Leaf) + ".7z")
	[string] $arg3 = "-aoa" #overwrite existing files
	#destination directory specification:
	[string] $arg4 = '-o"' + $(split-path -Parent $dest) + '"' 
	#invoke command, route to null for performance:
	& $cmd $arg1,$arg2,$arg3,$arg4 > $null 
	Remove-Item $arg2 -Force # delete archive
}

[String[]] $zips = "V3.3","V3.3.1","PROD\WinXp\Text"
[string] $sourceD = "\\files\r25"
[string] $destD = "s:\r25"

foreach ($zip in $zips) {
	Get-Date | Out-File $logfile -Append 
	[string] "Compressing directory: " + $zip | Out-File $logfile -Append 
	zipIt $(join-path -Path $sourceD -ChildPath $zip)
	Get-Date | Out-File $logfile -Append 
	[string] "Uncompressing to:" + $destD | Out-File $logfile -Append
	unzipit $(Join-Path -Path $destD -ChildPath $zip)
}

Get-Date | Out-File $logfile -Append 
[string] "Syncing remaining files using Robocopy..." | Out-File $logfile -Append
$xd1 = "\\files\r25\V3.3" 
$xd2 = "\\files\r25\V3.3.1" 
$xd3 = "\\files\r25\PROD\WinXP\text"
$xd4 = "\\files\r25\~snapshot"
$roboArgs = @("/e","/copy:datso","/purge","/nfl","/ndl","/np","/r:0","/mt:4",`
	"/b",$sourceD,$destD,"/xd",$xd1,$xd2,$xd3,$xd4)

& robocopy.exe $roboArgs

Get-Date | Out-File $logfile -Append 
[string] "Done with Robocopy..." | Out-File $logfile -Append

# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $outstr =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$outstr | out-file -Append $logfile

Migrating from NetApp to Windows File Servers with PowerShell – part 1

We are retiring our NetApp filer this year. It was nice knowing you, NetApp. Thank you for the no-hassle performance, agile volume management, and excellent customer support. We will not miss your insane pricing, and subtle incompatibilities with modern Windows clients.

In this multi-part series, I will be sharing PowerShell code developed to assist with our migration. In part one, we will look at bulk copy operations with RoboCopy. In part 2, we will look at a situation where RoboCopy fails to get the job done. In future parts, we will look at automated share and quota management and migration.

Migrating large amounts of data off a NetApp is not particularly straightforward. The only real option we have is to copy data off of the filer CIFS shares to their Windows counterparts. Fortunately, with the multi-threading utility “robocopy” we can move data between shares pretty quickly. Unfortunately, robocopy only multi-threads file copy operations, not directory search operations. So, while initial data transfers with robocopy take place really quickly, subsequent sync operations are slower than expected. MS also released a utility called “RichCopy” which supports multi-threaded directory searching, but this utility is not supported by MS, and has some significant bugs (i.e. it crashes all the time). What to do?

PowerShell to the rescue! Using PowerShell jobs, we can spawn off a separate robocopy job for each subdirectory of a source share, and run an arbitrary number of parallel directory copies. With some experimentation, I determined that I could run ten simultaneous robocopy operations without overwhelming CPU or disk channels on the filer. Under this arrangement, our file sync window has been reduced from almost 48 hours to a mere 2.5 hours.

Some tricky bits in the development of this script were:

  • PowerShell jobs and job queuing are critical to completing this script in a timely fashion. Syntax for “start-job” is tricky. See my post on backup performance testing for more comments on working with jobs.
  • Robocopy fails to copy a number of source files. This is mitigated through the use of the “/b” switch (backup mode).
  • The PowerShell cmdlet “receive-job” fails to capture output from a variety of job commands unless you assign the job to an object. To reliably capture the output of commands within our jobs, I needed to assign the jobs to our $jobs array.
  • I needed to do some post processing on the log file. In doing so, I needed to find UNC paths for our source filer “\\files”. It is important to remember that, when using regular expressions, “\” is the escape character. So, to match for “\”, we need to enter “\\”. To match for “\\” we need to enter “\\\\”, as in:
     get-content $logfile | select-string -Pattern "\\\\files" | ...
  • Initially I allowed the script to process only one top level directory at a time (i.e. start with \\files\software, and only proceed to \\files\shared when “software” completes). The problem with this was that I was preventing the script from running an optimal job count. Furthermore, a single hung job could bring the whole script to a halt. To combat this, I start the script by building a master queue array “$q”, which holds all of the directories for which I am going to start a job. The result of using a master queue is a considerable improvement in sustained throughput.
  • When building an array with a loop (i.e. while…) you may have trouble with the first item added to the array if you do not initialize the array before starting the loop. In my case, I needed to initialize “[array]$jobs = @()” before using the array to hold job objects in the “while” loop. Failing to do so caused “$jobs” to become a single job object when the number of jobs was equal to one. Bad news, if you are expecting to use array properties such as $jobs.count, or to index into the object (i.e. $jobs[0]).
  • ISE programs, like the native PowerShell ISE or Quest PowerGUI, make script development much easier. However, production environments are not the same as the debug environment, so keep these tips in mind:
    1. Log your script actions! Use lots of out-file calls. If you are feeling slick, you can enclose these in “if ($debug)” clauses, and set the $debug variable as a script parameter (which I did not do here).
    2. When running in production, watch the log file in real-time using “get-content -wait”. I know it is not as cool as the GNU command “tail”, but it is close.
  • Scoping… careful of the “global” scope. Initially I modified the $jobs and $dc variables in the global scope from within the “collectJobs” function. This worked fine in my ISE and at the PowerShell prompt. However, when running as a scheduled task, these calls failed miserably. I changed the calls to use the “script” scope, and the script now runs as a scheduled task successfully (see the sketch below).
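
To make those last two bullets concrete, here is a minimal sketch (distilled from, but not identical to, the production script below) of the array initialization and the explicit script-scope access:

$jobs = @()   # Initialize as an array BEFORE the loop; otherwise a single
              # job leaves $jobs holding a bare Job object, not an array.
[int] $dc = 0 # Done-job counter, also updated from inside the function.

function collectJobs {
	# Read and modify the script-scoped variables explicitly.  Implicit
	# scoping worked interactively, but failed under a scheduled task.
	$djs = @($script:jobs | ? {$_.State -eq "Completed"})
	if ($djs.count -gt 0) {
		$script:dc += $djs.count
		$script:jobs = @($script:jobs | ? {$_.State -eq "Running"})
	}
}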

Below is the script I developed for this job… it contains paths specific to our infrastructure, but easily could be modified. Change the “while ($jobs.count -lt 10)” loop to set the number of simultaneous robocopy processes to be used by the script…

# FilerSync_jobQueue.ps1
# JGM, 2011-09-29
# Copies all content of the paths specified in the $srcShares array to 
# corresponding paths on the local server.
# Keeps data on all copy jobs in an array "$q".
# We will use up to 10 simultaneous robocopy operations.

set-psdebug -strict

# Initialize the log file:
[string] $logfile = "s:\files_to_local.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

# Initialize the Source file server root directories:
[String[]] $srcShares1 = "adfs$","JMP$","tsFlexConfig","software","mca","sis","shared"`
	#,"R25"
	#R25 removed from this sync process as the "text_comments" directory kills
	#robocopy.  We will sync this structure separately.
[String[]] $srcShares2 = "uvol_t1_1$\q-home","uvol_t1_2$\q-home","uvol_t1_3$\q-home",`
	"uvol_t1_4$\q-home","uvol_t1_5$\q-home","uvol_t2_1$\q-home",`
	"vol1$\qtree-home"
	
[Array[]] $q = @() #queue array; each entry is a 4-element array: source, destination, copy options, level/recurse option

function collectJobs { 
#Detects jobs with status of Completed or Stopped.
#Collects jobs output to log file, increments the "done jobs" count, 
#Then rebuilds the $jobs array to contain only running jobs.
#Modifies variables in the script scope.
	$djs = @(); #Completed jobs array
	$djs += $script:jobs | ? {$_.State -match "Completed|Stopped"} ;
	[string]$('$djs.count = ' + $djs.count + ' ; Possible number of jobs completed in this collection cycle.') | Out-File $logfile -Append;
	if ($djs[0] -ne $null) { #First item in done jobs array should not be null.
		$script:dc += $djs.count; #increment job count
		[string]$('$script:dc = ' + $script:dc + ' ; Total number of completed jobs.') | Out-File $logfile -Append;
		$djs | Receive-Job | Out-File $logfile -Append; #log job output to file
		$djs | Remove-Job -Force;
		Remove-Variable djs;
		$script:jobs = @($script:jobs | ? {$_.State -eq "Running"}) ; #rebuild jobs arr
		[string]$('$script:jobs.count = ' + $script:jobs.Count + ' ; Exiting function...') | Out-File $logfile -Append
	} else {
		[string]$('$djs[0] is null.  No jobs completed in this cycle.') | Out-File $logfile -Append
	}
}
	
# Loop though the source directories:
foreach ($rootPath in $srcShares1) {
    [string] $srcPath = "\\files\" + $rootPath # Full Source Directory path.  
	#Switch maps the source directory to a destination volume stored in $target 
    switch ($rootPath) {
        shared {[string] $target = "S:\shared"}
        software {[string] $target = "S:\software"}
        mca {[string] $target = "S:\mca"}
        sis {[string] $target = "S:\sis"}
        adfs$ {[string] $target = "S:\adfs"}
        tsFlexConfig {[string] $target = "s:\tsFlexConfig"}
        JMP$ {[string] $target = "s:\JMP"}
        R25 {[string] $target = "S:\R25"}
    }
    #Enumerate directories to copy:
	$dirs1 = @()
	$dirs1 += gci $srcPath | sort-object -Property Name `
		| ? {$_.Attributes.tostring() -match "Directory"} `
		| ? {$_.Name -notmatch "~snapshot"}
	#Copy files in the root directory:
	[string] $sd = '"' + $srcPath + '"';
	[string] $dd = '"' + $target + '"';
	$q += ,@($sd,$dd,'"/COPY:DATSO"','"/LEV:1"' )
	# Add to queue:
	if ($dirs1[0] -ne $null) {
		foreach ($d in $dirs1) {
			[string] $sd = '"' + $d.FullName + '"';
	    	[string] $dd = '"' + $target + "\" + $d.Name + '"';
			$q += ,@($sd,$dd,'"/COPY:DATSO"','"/e"')
		}
	}
}
foreach ($rootPath in $srcShares2) {   
    [string] $srcPath = "\\files\" + $rootPath # Full Source Directory path.
	#Switch maps the source directory to a destination volume stored in $target 
    switch ($rootPath) {
        uvol_t1_1$\q-home {[string] $target = "H:\homes1"}
        uvol_t1_2$\q-home {[string] $target = "I:\homes1"}
        uvol_t1_3$\q-home {[string] $target = "J:\homes1"}
        uvol_t1_4$\q-home {[string] $target = "K:\homes1"}
        uvol_t1_5$\q-home {[string] $target = "L:\homes1"}
        uvol_t2_1$\q-home {[string] $target = "M:\homes1"}
        vol1$\qtree-home {[string] $target = "J:\homes2"}
    }
    #Enumerate directories to copy:
	[array]$dirs1 = gci -Force $srcPath | sort-object -Property Name `
		| ? {$_.Attributes.tostring() -match "Directory"}
	if ($dirs1[0] -ne $null) {
		foreach ($d in $dirs1) {
			[string] $sd = '"' + $d.FullName + '"'
			[string] $dd = '"' + $target + "\" + $d.Name + '"'
			$q += ,@($sd,$dd,'"/COPY:DAT"','"/e"')
		}
	}
}

[string] $queueFile = "s:\files_to_local_queue.csv"
Remove-Item -Force $queueFile
foreach ($i in $q) {[string]$($i[0]+", "+$i[1]+", "+$i[2]+", "+$i[3]) >> $queueFile }

New-Variable -Name dc -Option AllScope -Value 0
[int] $dc = 0			#Count of completed (done) jobs.
[int] $qc = $q.Count	#Initial count of jobs in the queue
[int] $qi = 0			#Queue Index - current location in queue
[int] $jc = 0			#Job count - number of running jobs
$jobs = @()

while ($qc -gt $qi) { # Problem here as some "done jobs" are not getting captured.
	while (($jobs.count -lt 10) -and ($qc -gt $qi)) {
		[string] $('In ($jobs.count -lt 10) loop...') | out-file -Append $logFile
		[string] $('$jobs.count is now: ' + $jobs.count) | out-file -Append $logFile
		[string] $jobName = 'qJob_' + $qi + '_';
		[string] $sd = $q[$qi][0]; [string]$dd = $q[$qi][1];
		[string] $cpo = $q[$qi][2]; [string] $lev = $q[$qi][3]; 
		[string]$cmd = "& robocopy.exe $lev,$cpo,`"/dcopy:t`",`"/purge`",`"/nfl`",`"/ndl`",`"/np`",`"/r:0`",`"/mt:4`",`"/b`",$sd,$dd";
		[string] $('Starting job with source: ' + $sd +' and destination: ' + $dd) | out-file -Append $logFile
		$jobs += Start-Job -Name $jobName -ScriptBlock ([scriptblock]::create($cmd))
		[string] $('Job started.  Incrementing $qi to: ' + [string]$($qi + 1)) | out-file -Append $logFile
		$qi++
	}
	[string] $("About to run collectJobs function...") | out-file -Append $logFile
	collectJobs
	[string] $('Function done.  $jobs.count is now: ' + $jobs.count)| out-file -Append $logFile
	[string] $('$jobs.count = '+$jobs.Count+' ; Sleeping for three seconds...') | out-file -Append $logFile
	Start-Sleep -Seconds 3
}
#Wait up to two hours for remaining jobs to complete:
[string] $('Started last job in queue. Waiting up to two hours for completion...') | out-file -Append $logFile
$jobs | Wait-Job -Timeout 7200 | Stop-Job
collectJobs

# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $out =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$out | out-file -Append $logfile

#Create an error log from the session log.  Convert error codes to descriptions:
[string] $errFile = 's:\files_to_local.err'
remove-item $errFile -force
[string] $out = "Failed jobs:"; $out | out-file -Append $logfile
$jobs | out-file -Append $errFile
$jobs | % {$_.command} | out-file -Append $errFile
[string] $out = "Failed files/directories:"; $out | out-file -Append $errFile
Get-Content $logfile | Select-String -Pattern "\\\\files"`
	| select-string -NotMatch -pattern "^   Source" `
	| % {
		$a = $_.toString(); 
		if ($a -match "ERROR 32 ")  {[string]$e = 'fileInUse:        '};
		if ($a -match "ERROR 267 ") {[string]$e = 'directoryInvalid: '};
		if ($a -match "ERROR 112 ") {[string]$e = 'notEnoughSpace:   '};
		if ($a -match "ERROR 5 ")   {[string]$e = 'accessDenied:     '};
		if ($a -match "ERROR 3 ")   {[string]$e = 'cannotFindPath:   '};
		$i = $a.IndexOf("\\f");
		$f = $a.substring($i);
		Write-Output "$e$f" | Out-File $errFile -Force -Append
	}

Mozilla Thunderbird – Implementing the ISP Hook in 5.0 and later

Our buddies at Mozilla Messaging are at it again… new with Thunderbird 5.0, all of the “jar” files previously present in the Thunderbird installation directory have been collapsed into a single “omni.jar” file, apparently for program load optimization.

This all would be fine with me if omni.jar were a “normal” zip file, as the previous jars were. Instead, this is an “optimized” jar that is not recognized by 7-zip as a valid zip archive. It is not clear to me how the jar is “optimized”, or how to re-apply optimizations when modifying the original, nor do I particularly care, as load optimization is of little concern to us… we are not operating with 10 year old equipment, for the most part, so who cares?

I have had to work around the problem by using “jar.exe” from the Java JDK. This program extracts and re-creates omni.jar in a way that Thunderbird does not mind. The resultant file size is about the same, too.

Another quirk of the new version is that the default “prefs.js” file, while present in omni.jar, is not copied into the resultant Program Files directory at install time. On a clean install, there is no default prefs.js! I had to populate a new prefs.js into the “core” installer directory, outside of the new “omni.jar”.

Finally, there are no more “localized/nonlocalized” directories in the installer, just that which is within “omni”, and that which is not. So, I put our prefs.js in .\core\defaults\profile (a new directory in the installer zip). Previously it was in .\nonlocalized\defaults\profile. Likewise, our mailnews.js and ISP Hook “.rdf” files also go in “core” instead of “localized”.
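
So the customized payload inside the rebuilt installer ends up looking something like this (only the pieces we touch are shown):

core\
  omni.jar                    (repacked with jar.exe)
  defaults\profile\prefs.js   (UVM default prefs)
  isp\en-US\UVMMail.rdf       (ISP Hook file)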

Other than that, the RDF ISP Hook file that we are using is unchanged from the one documented here:
https://sharepoint.uvm.edu/sites/ad/distribution/appconfig/Thunderbird.aspx
Prefs.js is the same as in the SharePoint site, with the modifications noted here:
http://blog.uvm.edu/jgm/2009/12/15/thunderbird-3-re-implementing-the-isp-hook-for-customized-deployments/

What a bunch of gibberish talk… sorry for the dry post. I am guessing if you are a Thunderbird customizer, you will know what I am talking about. If not, you won’t be reading anyway.

On to the build script, in Windows BAT/CMD language:

REM Thunderbird customized build script for UVM.
REM Updated September 2011 for Thunderbird 5.0 support.
REM REQUIRES:
REM - 7z.exe, 7zr.exe and sed.exe in parallel "..\bin" directory
REM - Unmodified Thunderbird installer in .\source directory
REM - all required config files in .\config directory
REM (including 7z control file, ISP Hook RDF file, and modified prefs.js)
REM - local JDK install with "jar.exe". Path to jar.exe will need to be updated in the jdk environment variable
REM OUTPUT: Fully modified Thunderbird installer in .\Installer directory.
REM @echo on

set jdk="c:\Program Files (x86)\Java\jdk1.6.0_27\bin"

Echo Cleaning up old builds...
del .\Installer\*.exe
rmdir /s /q .\build
set /P tbver=Enter Thunderbird version number to build (i.e. "6.0.2"):

Echo Extracting setup files from OEM Installer...
mkdir .\build\temp
..\bin\7zr x .\source\*.exe -o.\build

Echo Extracting omni.jar contents...
mkdir .\build\temp
cd .\build\temp
%jdk%\jar.exe xf ..\core\omni.jar

Echo modifying messenger functions...
..\..\..\bin\sed.exe --binary "s/NewMailAccount(msgWindow, okCallback);/MsgAccountWizard(okCallback);/" .\chrome\messenger\content\messenger\msgMail3PaneWindow.js > .\chrome\messenger\content\messenger\msgMail3PaneWindow_new.js
MOVE /Y .\chrome\messenger\content\messenger\msgMail3PaneWindow_new.js .\chrome\messenger\content\messenger\msgMail3PaneWindow.js

Echo modifying default mailnews preferences...
..\..\..\bin\sed.exe --binary "s/try_ssl\", 0)/try_ssl\", 2)/" .\defaults\pref\mailnews.js > .\defaults\pref\mailnews_new.js
MOVE /Y .\defaults\pref\mailnews_new.js .\defaults\pref\mailnews.js

Echo moving UVM modified prefs.js into place (note that this file is not actually used by Thunderbird!)
copy /Y ..\..\config\prefs.js .\defaults\profile\prefs.js

Echo Repacking omni.jar...
del /f /q ..\core\omni.jar
%jdk%\jar.exe cf ..\core\omni.jar *

Echo Copying UVM Custom ISP file to source...
cd ..\..\
mkdir .\build\core\isp\en-US
copy /Y .\config\UVMMail.rdf .\build\core\isp\en-US\UVMMail.rdf
Echo Copying UVM default prefs.js to core directory (tbird no longer has a prefs.js by default, but it will be used if present)...
mkdir .\build\core\defaults\profile
copy /Y .\config\prefs.js .\build\core\defaults\profile\prefs.js

Echo Deleting temporary files that should not be present in the installer...
rmdir /s /q .\build\temp

Echo Repackaging Thunderbird installer...
..\bin\7zr a .\Installer\UVM_Thunderbird_setup_%tbver%.7z .\build\*
copy /b ..\bin\7zS.sfx + .\config\config.txt + .\Installer\UVM_Thunderbird_setup_%tbver%.7z .\Installer\UVM_Thunderbird_setup_%tbver%.exe

Echo Cleaning up installation source...
del /s /f /q .\build\*.*
rmdir /s /q .\build\core
rmdir /s /q .\build
del /f /q .\Installer\UVM_Thunderbird_setup_%tbver%.7z

Discovering orphaned vmdk files in vSphere

On occasion we have found abandoned vmdk files in our vSphere infrastructure. I often have thought we needed to take some time to hunt down and exterminate these orphans. As is often the case, someone else already did the initial research required to make automation of this task possible, but I found I needed to do some updating of the source scripts for improved accuracy, improved formatting, and compatibility with vSphere 4.1:

# getOrphanVMDK.ps1
# Purpose : List all orphaned vmdk on all datastores in all VC's
# Version : v2.0
# Author  : J. Greg Mackinnon, from original by HJA van Bokhoven
# Change  : v1.1  2009.02.14  DE  adapted to ESX 3.5, send email and report file size
# Change  : v1.2  2011.07.12 EN  Updated for ESX 4, collapsed if loops into single conditional
# Change  : v2.0  2011.07.22 EN: 
	# Changed vmdk search to use the VMware.Vim.VmDiskFileQuery object to improve search accuracy
	# Change vmdk matching logic as a result of VmDiskFileQuery usage
	# Pushed discovered orphans into an array of custom PS objects
	# Simplified logging and email output
			
Set-PSDebug -Strict

#Initialize the VIToolkit:
add-pssnapin VMware.VimAutomation.Core
[Reflection.Assembly]::LoadWithPartialName("VMware.Vim")

#Main

[string]$strVC = "myViServer.mydomain.org"								# Virtual Center Server name
[string]$logfile = "c:\local\temp\getOrphanVMDK.log"
[string]$SMTPServer = "mysmtp.mydomain.org"							# Change to a SMTP server in your environment
[string]$mailfrom = "GetOrphanVMDK@myViServer.mydomain.org"	# Change to email address you want emails to be coming from
[string]$mailto = "vmware@mydomain.org"							# Change to email address you would like to receive emails
[string]$mailreplyto = "vmware@mydomain.org"						# Change to email address you would like to reply emails

[int]$countOrphaned = 0
[int64]$orphanSize = 0

# vmWare Datastore Browser query parameters
# See http://pubs.vmware.com/vi3/sdk/ReferenceGuide/vim.host.DatastoreBrowser.SearchSpec.html
$fileQueryFlags = New-Object VMware.Vim.FileQueryFlags
$fileQueryFlags.FileSize = $true
$fileQueryFlags.FileType = $true
$fileQueryFlags.Modification = $true
$searchSpec = New-Object VMware.Vim.HostDatastoreBrowserSearchSpec
$searchSpec.details = $fileQueryFlags
#The .query property is used to scope the query to only active vmdk files (excluding snaps and change block tracking).
$searchSpec.Query = (New-Object VMware.Vim.VmDiskFileQuery)
#$searchSpec.matchPattern = "*.vmdk" # Alternative VMDK match method.
$searchSpec.sortFoldersFirst = $true

if ([System.IO.File]::Exists($logfile)) {
    Remove-Item $logfile
}

#Time stamp the log file
(Get-Date -f "yyyy-MM-dd HH:mm:ss") + "  Searching Orphaned VMDKs..." | Tee-Object -Variable logdata
$logdata | Out-File -FilePath $logfile -Append
#Connect to vCenter Server
Connect-VIServer $strVC

#Collect array of all VMDK hard disk files in use:
[array]$UsedDisks = Get-View -ViewType VirtualMachine | % {$_.Layout} | % {$_.Disk} | % {$_.DiskFile}
#The following three lines were used before adding the $searchSpec.query property.  We now want to exclude template and snapshot disks from the in-use-disks array.
# [array]$UsedDisks = Get-VM | Get-HardDisk | %{$_.filename}
# $UsedDisks += Get-VM | Get-Snapshot | Get-HardDisk | %{$_.filename}
# $UsedDisks += Get-Template | Get-HardDisk | %{$_.filename}

#Collect array of all Datastores:
#$arrDS is a list of datastores, filtered to exclude ESX local datastores (all of which end with "-local1" in our environment), and our ISO storage datastore.
[array]$allDS = Get-Datastore | select -property name,Id | ? {$_.name -notmatch "-local1"} | ? {$_.name -notmatch "-iso$"} | Sort-Object -Property Name

[array]$orphans = @()
Foreach ($ds in $allDS) {
	"Searching datastore: " + [string]$ds.Name | Tee-Object -Variable logdata
	$logdata | Out-File -FilePath $logfile -Append
	$dsView = Get-View $ds.Id
	$dsBrowser = Get-View $dsView.browser
	$rootPath = "["+$dsView.summary.Name+"]"
	$searchResult = $dsBrowser.SearchDatastoreSubFolders($rootPath, $searchSpec)
	foreach ($folder in $searchResult) {
	    foreach ($fileResult in $folder.File) {
			if ($UsedDisks -notcontains ($folder.FolderPath + $fileResult.Path) -and ($fileResult.Path.length -gt 0)) {
				$countOrphaned++
				IF ($countOrphaned -eq 1) {
					("Orphaned VMDKs Found: ") | Tee-Object -Variable logdata
					$logdata | Out-File -FilePath $logfile -Append
				}
				$orphan = New-Object System.Object
				$orphan | Add-Member -type NoteProperty -name Name -value ($folder.FolderPath + $fileResult.Path)
				$orphan | Add-Member -type NoteProperty -name SizeInGB -value ([Math]::Round($fileResult.FileSize/1gb,2))
				$orphan | Add-Member -type NoteProperty -name LastModified -value ([string]$fileResult.Modification.year + "-" + [string]$fileResult.Modification.month + "-" + [string]$fileResult.Modification.day)
				$orphans += $orphan
				$orphanSize += $fileResult.FileSize
				$orphan | ft -autosize | out-string | Tee-Object -Variable logdata
				$logdata | Out-File -FilePath $logfile -Append
				[string]("Total Size or orphaned files: " + ([Math]::Round($orphanSize/1gb,2)) + " GB") | Tee-Object -Variable logdata
				$logdata | Out-File -FilePath $logfile -Append
				Remove-Variable orphan
			}
		}
	}
}
(Get-Date -f "yyyy-MM-dd HH:mm:ss") + "  Finished (" + $countOrphaned + " Orphaned VMDKs Found.)" | Tee-Object -Variable logdata
$logdata | Out-File -FilePath $logfile -Append

if ($countOrphaned -gt 0) {
	[string]$body = "Orphaned VMDKs Found: `n"
	$body += $orphans | Sort-Object -Property LastModified| ft -AutoSize | out-string
	$body += [string]("Total Size or orphaned files: " + ([Math]::Round($orphanSize/1gb,2)) + "GB")
    $SmtpClient = New-Object system.net.mail.smtpClient
    $SmtpClient.host = $SMTPServer
    $MailMessage = New-Object system.net.mail.mailmessage
    $MailMessage.from = $mailfrom
    $MailMessage.To.add($mailto)
    $MailMessage.replyto = $mailreplyto
    $MailMessage.IsBodyHtml = 0
    $MailMessage.Subject = "Info: VMware orphaned VMDKs"
    $MailMessage.Body = $body
	"Mailing report... " | Tee-Object -Variable logdata
	$logdata | Out-File -FilePath $logfile -Append
    $SmtpClient.Send($MailMessage)
}
Disconnect-VIServer -Confirm:$False

WSUS Reporting with PowerShell

I have been trying to determine if our SCCM service has most of our domain clients registered, and have decided that the WSUS client database may be the best source of information on currently active domain members. As previously mentioned, WSUS is not pre-configured with a lot of useful infrastructure reports, but pulling data out with PowerShell is not overly difficult. Have a gander… this script generates a count of all current clients, counts by OS type, a count of Virtual Machine clients, and a few counts based on various source IP addresses.

#Get WSUS Computers script
# Finds and counts all registered computers matching various criteria specified in the script
# Optionally, writes the found computer names to the file defined in $outFile, forced to uppercase, trimmed of whitespace, and sorted.
# Generates an object $out, that is sent to the console at the end of the script.

set-psdebug -strict

#Initialize Variables
	#$outFile = [string] "\\files\shared\ets\SAA\jgm\WSUSXps.txt"

	$hwModel = "Virtual|vm"
	$ipMatch = "^132.198|^10.245" # specify your internal network ip ranges here, in RegEx format.
	$wsusParentGroup = [string] "All Computers"
	$wsusgroup = ""
	$WindowsUpdateServer= [string] "wsus.mydomain.com" #specify your WSUS server here
	$useSecureConnection = [bool] $true
	$portNumber = [int] "443" #required if you have added SSL protection to your WSUS (which you should do).

#Instantiate Objects:
	#Required WSUS Assembly - auto installed with WSUS Administration Tools
	[void][reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration")
	$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer($WindowsUpdateServer,$useSecureConnection,$portNumber)
	$computerScope = new-object Microsoft.UpdateServices.Administration.ComputerTargetScope
	$computerScope.IncludedInstallationStates = [Microsoft.UpdateServices.Administration.UpdateInstallationStates]::All
	$computers = $wsus.GetComputerTargets($computerScope)
	$wsusData = new-object System.Object
	$out = @()

$wsusData | add-member -type NoteProperty -name Criteria -value ("Total comptuers")
$wsusData | add-member -type NoteProperty -name Count -value ($computers.count)
$out += $wsusData
remove-variable wsusData

$osType = "Windows 7"
$filtComps = $computers | ? {$_.OSDescription -match $osType}
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("Windows 7")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData

$osType = "Windows Vista"
$filtComps = $computers | ? {$_.OSDescription -match $osType} 
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("Windows Vista")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData

$osType = "Windows XP"
# final "select" in the pipeline if you want to generate a list of computer names matching the criteria.
$filtComps = $computers | ? {$_.OSDescription -match $osType} | select-object -Property FullDomainName
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("XP Professional")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData

#Filter for virtual machine models
$filtComps = $computers | ? {$_.Model -match $hwModel}
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("Virtual Machines")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData 

$filtComps = $computers | ? {$_.IPAddress -notmatch $ipMatch} | select-object -Property IPAddress
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("Non-UVM Addresses")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData

## Following section does not produce useful data... WSUS does not see NAT-based addresses, on the public IP in front of the NAT.
## However, it is a good regex... it matches any non-routable (private) IPv4 address.  Take note for future use.
#$ipMatch = "^10.|^192.168.|^72.[1-2][0-9].|^72.3[0-1]."
#$filtComps = $computers | ? {$_.IPAddress -match $ipMatch}
#$wsusData = new-object System.Object
#$wsusData | add-member -type NoteProperty -name Criteria -value ("NAT Addresses")
#$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
#$out += $wsusData
#remove-variable wsusData

$ipMatch = "^10.245." # Our Wi-Fi and VPN clients fall in this IP range.  Substitute your internal (non-routed) IPs here.
$filtComps = $computers | ? {$_.IPAddress -match $ipMatch} 
$wsusData = new-object System.Object
$wsusData | add-member -type NoteProperty -name Criteria -value ("UVM Wireless/VPN Addresses")
$wsusData | add-member -type NoteProperty -name Count -value ($filtComps.count)
$out += $wsusData
remove-variable wsusData

#Generate file output by: removing all but the RDN of the computer name, trimming any whitespace, forcing to uppercase, 
# sorting, suppressing headers, then writing to file.
#$filtComps | foreach {$_.FullDomainName.split('.')[0]} | foreach {$_.Trim()} | foreach {$_.ToUpper()} | `
	#sort-object | Format-Table -HideTableHeaders | Out-File -FilePath $outFile -Force
	
$out | Format-Table -AutoSize
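
Incidentally, the repeating New-Object/Add-Member block could be collapsed with a small helper, something like this (a sketch; "New-CountRow" is my own name, not part of the script above):

function New-CountRow {
	param([string]$Criteria, $Computers)
	# Builds the same two-property row the script above assembles by hand.
	# The @() wrapper guards the .count property when only one item matches.
	New-Object PSObject -Property @{Criteria = $Criteria; Count = @($Computers).count}
}
$out += New-CountRow "Windows 7" ($computers | ? {$_.OSDescription -match "Windows 7"})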

WSUS – Missing Servers

All of the Server 2003 virtual machines are missing from the WSUS inventory. Poor servers… they are missing the party.

Well not really, they are still getting updates, according to the logs. However, they are not reporting in. They are party lurkers.

This is an old problem, and it only took a little digging. All of our 2003 VMs came from a common VMware template. Since the template had once connected to our WSUS server, it already had a SusID in the registry, so all of the clones share the same ID.

To fix, I used the cmd script here:
http://msmvps.com/blogs/athif/archive/2005/09/04/65174.aspx
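
The essence of that script, rendered in PowerShell for reference (a sketch from memory of what the linked AU_Clean_SID.cmd does; consult the link for the canonical version):

# Stop the AU client, delete the cloned client-identity values, restart,
# and force re-registration with the WSUS server:
Stop-Service wuauserv
Remove-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate' `
	-Name AccountDomainSid,PingID,SusClientId -ErrorAction SilentlyContinue
Start-Service wuauserv
wuauclt /resetauthorization /detectnow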

I had to remove the “pause” commands, then I used “dsquery” to find all of the relevant servers in our infrastructure:

dsquery * ou=Servers,ou=ets,ou=resources,dc=campus,dc=ad,dc=uvm,dc=edu -attr cn operatingSystem -limit 2000 > servers.txt

I isolated the Server 2003 systems from this list:

find "2003" servers.txt > 2003servers.txt

I then did some quick text processing to remove everything but the host names from the output file.  For the final step, we used psexec.exe from SysInternals to run the SusID reset script:

psexec.exe @2003servers.txt -s -c AU_Clean_SID.cmd

I ended up running “psexec.exe” a second time to force the “wuauclt.exe /resetauthorization /detectnow” bit to run again.  Psexec.exe requires the “-d” switch when running this command remotely.  I think the WUAU service needed time to get fully operational before running the authorization token reset.  Perhaps a pause command would be of assistance in the cmd script linked above?  Anyway, all of the Server 2003 hosts have come back to the party and are socializing nicely.

For my next trick, I am going to try to match up the WUAU XP client list with our AD XP client list to see if we have a lot of silent XP systems.  If we rely on standard tools, we could use a query similar to the following to extract XP computer objects with their last logon time:

dsquery * domainRoot -Filter "(&(objectclass=computer)(operatingSystem=Windows XP Professional))" -attr Name LastLogonTimeStamp -limit 20000 > xp.txt

There is an excellent cmd script here that will convert the “lastLogonTimestamp” into human-readable format.

However, it probably would be easier to use “dsquery computer domainRoot -inactive 4”, since processing of “lastLogonTimestamp” against current local time can be challenging in CMD (easier with PowerShell).
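
Sketched in PowerShell, that conversion might look like the following (using the [adsisearcher] accelerator; lastLogonTimestamp comes back as a 64-bit FILETIME value, which [DateTime]::FromFileTime converts):

# Query XP computer objects and print each name with its last logon time:
$searcher = [adsisearcher]'(&(objectClass=computer)(operatingSystem=Windows XP Professional))'
$searcher.PageSize = 1000   # page the results so we get more than 1000 objects
$searcher.FindAll() | % {
	if ($_.Properties['lastlogontimestamp'].Count -gt 0) {
		$ts = $_.Properties['lastlogontimestamp'][0]
		'{0}  {1}' -f $_.Properties['name'][0], [DateTime]::FromFileTime($ts)
	}
}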

Time Sync in MDT 2010/WinPE, take 3

Third time is the charm? I found after a few passes that the script I put together for syncing system clocks in Windows was not actually working the way that I thought it was. While the system date was getting set correctly, time consistently was off by three hours. WTF?!?!? The problem, as it turns out, is not in coding errors, but rather in a misunderstanding of WinPE system time.

WinPE, as you may know, has no control panels. It has no “Windows Time” service. It has no time service management utilities such as “w32tm.exe”. So since it has no way to display its time zone settings, does that mean that it is not Time Zone aware? The answer is no.

WinPE has a default timezone of PST (GMT-8). This can be changed using the DISM utility, but sadly setting the default timezone for your LiteTouch/MDT boot images is not an option that is exposed in the Deployment Workbench. You would have to update the source winpe.wim image in the Windows AIK “tools\petools” directory to change the default for all future workbench-generated boot images. Rather than do this, and risk forgetting to do it again for all future AIK updates, I decided to set the timezone in my LiteTouch task sequences. It is pretty easy, requiring one additional custom command line in the task sequence:

reg.exe import "%SCRIPTROOT%\EST_tz.reg"

The “reg” file referenced here is simply an export of HKLM\System\CurrentControlSet\Control\TimeZoneInformation, from a system in the correct local timezone.
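
For the record, the offline-image route mentioned above would look something like this (a sketch: /Set-TimeZone is among DISM's international-servicing options, and the paths and image index here are hypothetical):

# Mount the boot image, set its default time zone, and commit the change:
dism /mount-wim /wimfile:C:\winpe.wim /index:1 /mountdir:C:\mount\winpe
dism /image:C:\mount\winpe /Set-TimeZone:"Eastern Standard Time"
dism /unmount-wim /mountdir:C:\mount\winpe /commit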

I then updated the WinPE timesync script to sync directly against our deployment server using “net time /set /y”, since this seems like the most reliable tool in WinPE:

Option Explicit
RunNewInstance

'//----------------------------------------------------------------------------
'//  Global Constants
'//----------------------------------------------------------------------------

'const DEPLOY_SERVER = "\\sysimg3.campus.ad.uvm.edu"

'//----------------------------------------------------------------------------
'//  Main Class
'//----------------------------------------------------------------------------

Class ZUVMsetTimePE

    '//----------------------------------------------------------------------------
    '//  Global constant and variable declarations
    '//----------------------------------------------------------------------------

    Dim iRetVal

    '//----------------------------------------------------------------------------
    '//  Constructor to initialize needed global objects
    '//----------------------------------------------------------------------------

    Private Sub Class_Initialize

    End Sub
    '//----------------------------------------------------------------------------
    '//  Main routine
    '//----------------------------------------------------------------------------

    Function Main
	
		' setTimePE.vbs
		' J. Greg Mackinnon, 2010-10-12
		' Sets time from within a WinPE 3.0 environment.
		' ASSUMES:  Local timezone is set correctly in the WinPE system registry.
		' ASSUMES:  Time on the deployment server is set correctly.

		' Declare objects and variants
		Dim oExec
		Dim sDPServ
		
		' Initiate variants
		sDPServ = oEnvironment.Item("SMSDP")

		' Procedure: Display current time:
		oLogging.CreateEntry "setTime> " & "Current Time is: " & Date & " " & Time, LogTypeInfo
		
		' Procedure: Set local time against time on deployment server:
		oLogging.CreateEntry "setTime> About to run command: " & "net.exe time \\" & sDPServ & " /set /y", LogTypeInfo
		set oExec = oShell.Exec("net.exe time \\" & sDPServ & " /set /y")
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		do while not oExec.StdOut.AtEndOfStream
			oLogging.CreateEntry "setTime> " & oExec.StdOut.ReadLine, LogTypeInfo
		loop
		oLogging.CreateEntry "setTime> " & "Current Time is now: " & Date & " " & Time, LogTypeInfo

	End Function

End Class


Time Sync Scripts… updated for MDT 2010/LiteTouch

Update: Although the scripts in this post work, a correction is required for time to sync properly in the winPE environment. An updated WinPE script can be found here:
http://blog.uvm.edu/jgm/2010/10/14/time-sync-in-mdt-2010winpe-take-3/

With some help from the indispensable MDT Debugger, I have managed to get my quick Time Sync VBScript into MDT 2010: http://blogs.technet.com/b/deploymentguys/archive/2010/03/22/mdt-debugger.aspx

The script I posted previously needed a few quick adjustments to enable proper logging to the “MININT” deployment directories. Here is the updated version. If you want to use it, give it a name starting with “Z” and drop it in the “Scripts” directory of your deployment share:



Option Explicit
RunNewInstance

'//----------------------------------------------------------------------------
'//  Global Constants
'//----------------------------------------------------------------------------

'//----------------------------------------------------------------------------
'//  Main Class
'//----------------------------------------------------------------------------

Class ZUVMsetTime

    '//----------------------------------------------------------------------------
    '//  Global constant and variable declarations
    '//----------------------------------------------------------------------------

    Dim iRetVal

    '//----------------------------------------------------------------------------
    '//  Constructor to initialize needed global objects
    '//----------------------------------------------------------------------------

    Private Sub Class_Initialize

    End Sub
    '//----------------------------------------------------------------------------
    '//  Main routine
    '//----------------------------------------------------------------------------

    Function Main

	' SetTime.vbs script
	' J. Greg Mackinnon, 2010-10-04
	' Actions: Syncs the local system clock to "pool.ntp.org" time using the NTP protocol.
	'          Will change registry values for maximum time skew correction if necessary, then revert to original values.
	'          Restarts the w32time service during execution, but NOT at the end of the script.  A manual restart is required to 
	'            revert domain-joined systems to defaults.
	' Requires: w32tm.exe, net.exe.  Both should be present on all Vista/Win7 systems.
	' Tested on: Windows 7.  Should work on Vista as well... NOT intended for XP systems.

		Dim oExec
		Dim iPosRegVal, iNegRegVal
		Dim strKeyPath, strPosValueName, strNegValueName

		strKeyPath = "HKLM\SYSTEM\CurrentControlSet\services\W32Time\Config"
		strPosValueName = "MaxPosPhaseCorrection"
		strNegValueName = "MaxNegPhaseCorrection"

		oLogging.CreateEntry "setTime> " &  "Current Time is: " & Date & " " & Time, LogTypeInfo

		'This works, if you can understand the screwball integer value that gets returned.  
		'Everything over hex 0x0fffffff is listed as a negative integer.
		'0xffffffff returns -1.
		iPosRegVal = oShell.RegRead(strKeyPath & "\" & strPosValueName)
		iNegRegVal = oShell.RegRead(strKeyPath & "\" & strNegValueName)
		oLogging.CreateEntry "setTime> " &  "strNegValueName value is: " & iNegRegVal, LogTypeInfo
		oLogging.CreateEntry "setTime> " &  "StrPosValueName value is: " & iPosRegVal, LogTypeInfo

		If iPosRegVal <> -1 Then
			oLogging.CreateEntry "setTime> " &  "Maximum allowed clock skew correction is NOT large enough... Setting to maximum value.", LogTypeInfo
			'Setting the Max Phase Correction registry values to "-1" (or 0xffffffff in hex), 
			'which will allow correction of local time by any amount.
			oShell.RegWrite strKeyPath & "\" & strPosValueName, -1, "REG_DWORD"
			oShell.RegWrite strKeyPath & "\" & strNegValueName, -1, "REG_DWORD"
			oLogging.CreateEntry "setTime> " &  strPosValueName & " is now Set to: " & oShell.RegRead(strKeyPath & "\" & strPosValueName), LogTypeInfo
			oLogging.CreateEntry "setTime> " &  strNegValueName & " is now Set to: " & oShell.RegRead(strKeyPath & "\" & strNegValueName), LogTypeInfo
		Else
			oLogging.CreateEntry "setTime> " &  "This system already already is configured to allow large clock skew corrections.", LogTypeInfo
		End If 

		oLogging.CreateEntry "setTime> " &  "Setting WinDows Time service Manual-sync NTP Server to ""pool.ntp.org""", LogTypeInfo
		' Pool.ntp.org is a collection of Internet NTP time servers.  
		' It is the default time source for stand-alone RedHat installs,
		' and apparently it is a but more reliable than "time.winDows.com"
		Set oExec = oShell.Exec("w32tm.exe /config /manualpeerlist:pool.ntp.org /update")
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		Do While NOT oExec.StdOut.AtEndOfStream
			oLogging.CreateEntry "setTime> " &  oExec.StdOut.ReadLine, LogTypeInfo
		Loop

		'Stopping the w32time service.  
		'Necessary because changes to the w32time service will NOT take effect until service restart.
		Set oExec = oShell.Exec("net.exe stop w32time")
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		Do While NOT oExec.StdOut.AtEndOfStream
			oLogging.CreateEntry "setTime> " &  oExec.StdOut.ReadLine, LogTypeInfo
		Loop

		'Starting the w32time service
		Set oExec = oShell.Exec("net start w32time")
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		Do While NOT oExec.StdOut.AtEndOfStream
			oLogging.CreateEntry "setTime> " &  oExec.StdOut.ReadLine, LogTypeInfo
		Loop

		'Forcing a time service resync
		'Time would resync on its own soon enough, but we are impatient and want to see results immediately.
		Set oExec = oShell.Exec("w32tm.exe /resync")
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		Do While NOT oExec.StdOut.AtEndOfStream
			oLogging.CreateEntry "setTime> " &  oExec.StdOut.ReadLine, LogTypeInfo
		Loop

		oLogging.CreateEntry "setTime> " &  "Current Time is: " & Date & " " & Time, LogTypeInfo

		If iPosRegVal <> -1 Then
			oLogging.CreateEntry "setTime> " &  "Resetting registry maximum allowed clock skew correction settings to their original values...", LogTypeInfo
			oShell.RegWrite strKeyPath & "\" & strPosValueName, iPosRegVal, "REG_DWORD"
			oShell.RegWrite strKeyPath & "\" & strNegValueName, iNegRegVal, "REG_DWORD"
			oLogging.CreateEntry "setTime> " &  strPosValueName & " is now Set to: " & oShell.RegRead(strKeyPath & "\" & strPosValueName), LogTypeInfo
			oLogging.CreateEntry "setTime> " &  strNegValueName & " is now Set to: " & oShell.RegRead(strKeyPath & "\" & strNegValueName), LogTypeInfo
		End If
		
	End Function

End Class


But wait! This does not really work very well… time only gets fixed when the computer logs in after mini-setup completes. By this time, the initial Windows Activation attempt will have failed. Let’s take care of time synchronization in the WinPE environment, before we even lay down the Windows OS image. The following script is added as a custom action during the “pre-install” phase of LiteTouch deployment:



Option Explicit
RunNewInstance

'//----------------------------------------------------------------------------
'//  Global Constants
'//----------------------------------------------------------------------------

'const DEPLOY_SERVER = "\\sysimg3.campus.ad.uvm.edu"

'//----------------------------------------------------------------------------
'//  Main Class
'//----------------------------------------------------------------------------

Class ZUVMsetTimePE

    '//----------------------------------------------------------------------------
    '//  Global constant and variable declarations
    '//----------------------------------------------------------------------------
	
    Dim iRetVal

    '//----------------------------------------------------------------------------
    '//  Constructor to initialize needed global objects
    '//----------------------------------------------------------------------------

    Private Sub Class_Initialize
    End Sub
	
	Function RegExpFind(patrn, strng)
	  Dim regEx, oMatch, oMatches, iPos

	  ' Create the regular expression.
	  Set regEx = New RegExp
	  regEx.Pattern = patrn
	  regEx.IgnoreCase = False
	  regEx.Global = False

	  ' Do the search.
	  Set oMatches = regEx.Execute(strng)
	  
	  iPos = "0"
	  
	  For Each oMatch in oMatches
		iPos = oMatch.FirstIndex
	  Next
	  
	  RegExpFind = iPos
	End Function

    '//----------------------------------------------------------------------------
    '//  Main routine
    '//----------------------------------------------------------------------------

    Function Main
	
		' setTimePE.vbs
		' J. Greg Mackinnon, 2010-10-07
		' Sets time from within a WinPE 3.0 environment.
		
		Dim sDATESEARCH
		Dim sTIMESEARCH
		
		Dim oExec
		Dim sDSTime, sExecOut, sDate, sTime, sDateTime, sCmd, sDPServ
		Dim iPos1, iPos2, iLength
		
		sDATESEARCH = "[0-9]*/"
		sTIMESEARCH = "[0-9]*:"
		
		sDPServ = oEnvironment.Item("SMSDP")
		'sDPServ = "sysimg3.campus.ad.uvm.edu"

		oLogging.CreateEntry "Current Time on localhost is: " & Date & " " & Time, LogTypeInfo

		set oExec = oShell.Exec("net.exe time \\" & sDPServ)
		Do While oExec.Status = 0
			 WScript.Sleep 100
		Loop
		Do Until oExec.StdOut.AtEndOfStream
			sExecOut = oExec.StdOut.ReadLine
			oLogging.CreateEntry "setTime> Output from net time: " & sExecOut, LogTypeInfo
			iPos1 = RegExpFind(sDATESEARCH, sExecOut)
			If iPos1 <> 0 then
				sDateTime = Mid(sExecOut,iPos1)
				iPos2 = RegExpFind(sTIMESEARCH, sDateTime)
				If iPos2 <> 0 then
					sTime = Mid(sDateTime, iPos2)
					sDate = Left(sDateTime, iPos2)
					Exit Do
				End If
			End If
		Loop

		oLogging.CreateEntry "Current Time on " & sDPServ & " is: " & sDateTime, LogTypeInfo


		REM set oExec = oShell.Exec("%comspec% /c time " & sTime)
		sCMD = """time " & sTime & """"
		oShell.Run "%comspec% /c " & sCMD

		sCMD = """date " & sDate & """"
		oShell.Run "%comspec% /c " & sCMD

		oLogging.CreateEntry "Current Time on localhost now is: " & Date & " " & Time, LogTypeInfo

	End Function

End Class