The various webcams around campus have been used primarily to show us what is happening now, not what happened before. Recently installed cameras, however, were deployed specifically to document progress, particularly building progress such as the new Davis Center construction.
The basic operation of these cameras is to snap a JPEG image and upload it to a web server. A variety of techniques are then employed to stream these captured images, one after another as they are uploaded, back to users to form a set of moving pictures. This is generally complicated by the inability of some browsers to display multipart MIME image streams (see here).
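To make the streaming technique concrete, here is a minimal sketch of how a server-push multipart MIME stream (`multipart/x-mixed-replace`) packages one JPEG after another on a single HTTP response. The boundary name and helper functions are illustrative assumptions, not the actual campus setup.

```python
# Sketch of the multipart MIME image stream described above.
# Assumed boundary name "frame"; real servers may use anything.

def mjpeg_headers(boundary="frame"):
    """Response headers for a server-push multipart image stream."""
    return {"Content-Type": f"multipart/x-mixed-replace; boundary={boundary}"}

def frame_part(jpeg_bytes, boundary="frame"):
    """Wrap one JPEG as a single part of the multipart stream."""
    header = (
        f"--{boundary}\r\n"
        f"Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg_bytes)}\r\n\r\n"
    ).encode("ascii")
    return header + jpeg_bytes + b"\r\n"
```

A server would emit `mjpeg_headers()` once, then write `frame_part(...)` for each new snapshot; a browser that understands the format replaces the displayed image with each part, producing the moving-picture effect.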
In order to generate a time-lapse movie, we need to preserve a subset of the snapshots, tag them with a date/time, and find a better way to stream the composite back to the users.
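The preserve-and-tag step might look something like the following sketch, which saves each fetched snapshot under a sortable date/time-stamped filename. The camera URL and archive directory are hypothetical placeholders, not the actual campus configuration.

```python
# Illustrative sketch: archive one snapshot per call with a
# timestamped name so frames sort into chronological order.
import datetime
import pathlib
import urllib.request

def snapshot_path(archive_dir, when=None):
    """Build a sortable, timestamped filename for one captured frame."""
    when = when or datetime.datetime.now()
    return pathlib.Path(archive_dir) / when.strftime("%Y%m%d-%H%M%S.jpg")

def archive_snapshot(cam_url, archive_dir):
    """Fetch the camera's current JPEG and store it in the archive."""
    dest = snapshot_path(archive_dir)
    dest.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(cam_url, dest)  # one JPEG per call
    return dest
```

Run from cron at a fixed interval, this yields the subset of snapshots the time-lapse movie is built from.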
The solution seems to be ffmpeg, a flexible Unix command-line tool that can produce MP4 movie files from a single-image sequence as input.
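As a rough illustration, an image-sequence-to-MP4 encode with ffmpeg can be assembled like this. The frame rate, glob pattern, and codec flags here are assumptions for a modern ffmpeg build, not necessarily the exact invocation used in the original workflow.

```python
# Hedged sketch: build and run an ffmpeg command that turns a
# directory of JPEG frames into an MP4 time-lapse.
import subprocess

def ffmpeg_command(frame_glob, out_mp4, fps=15):
    """Assemble the argv for an image-sequence-to-MP4 encode."""
    return [
        "ffmpeg",
        "-framerate", str(fps),   # how many archived frames per second
        "-pattern_type", "glob",  # match frames by shell-style glob
        "-i", frame_glob,         # e.g. "archive/*.jpg"
        "-c:v", "libx264",        # widely playable H.264 video
        "-pix_fmt", "yuv420p",    # pixel format most players accept
        out_mp4,
    ]

def make_timelapse(frame_glob, out_mp4, fps=15):
    """Invoke ffmpeg; raises CalledProcessError on encode failure."""
    subprocess.run(ffmpeg_command(frame_glob, out_mp4, fps), check=True)
```

Because the archived frames sort chronologically by filename, the glob input feeds them to ffmpeg in capture order.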
It was a hard-won solution, requiring a lot of research, dead ends, compiling, porting, recompiling, script writing, and workflow development, but I finally got it all together. Here are the results.
Along the way, I expanded the set of archived cameras beyond the Davis Center to include ResLife and the UVM Green.