
For those who might not be familiar with it, EDUCAUSE is a nonprofit association whose mission is to advance higher education through the use of information technology. The EDUCAUSE Annual Conference claims to unite “the best thinking in higher education IT.” EDUCAUSE 2013 was held in mid-October, and I was there to investigate learning analytics.

Analytics is the use of data, statistical and quantitative methods, and explanatory and predictive models to help organizations and individuals gain insight into, and act on, complex issues. Digital tools, especially Learning Management Systems (LMS) such as Blackboard for academic data and Student Information Systems (SIS) for demographic data, generate mounds of digital data that can be mined to discover trends or predict outcomes. Examples:

  • Marist College is developing a predictive model that combines Banner (SIS) and Sakai (LMS) data to identify students unlikely to pass a course and deliver intervention notices. The model was built from gradebook data across a broad set of courses, using a system assembled mostly from open-source tools. They found the most powerful predictor to be a student’s GPA. (A minimal sketch of this kind of model follows the list.) Presentation.pptx [3 MB, PowerPoint slides].
  • California State University, Chico built a system from server log files using Excel, Tableau, Stata, and SPSS and looked at one large course (373 students). They found LMS usage, not GPA, to be the best predictor of success, using these LMS usage variables: total course website hits; total course “dwell time”; administrative tool hits; assessment tool hits; content tool hits; and engagement tool hits. Summary of results [455 KB, PDF]; presentation slides from EDUCAUSE 2013 [3 MB, PowerPoint slides]
  • University of Kentucky uses a hardware “appliance” from SAP (HANA) to look at data in near real time, pushing administrative reports to administrators and “how am I doing” reports to students via a custom mobile application. Academic advisers get an iPad application that compiles advisees’ data, giving both adviser and student a better idea of where they are and where they are going. Using Groundbreaking Analytics and Fast Data [7 MB, PowerPoint slides]
  • South Orange County Community College District built the mobile app “Sherpa,” a recommendation engine similar to Netflix or Amazon that helps students choose courses and services and get information based on previous enrollments, major/minor declarations, and grades. It pushes warnings and reminders to students via email or text message. PowerPoint slides.
  • Coppin State University implemented Blackboard Analytics for Learn, providing a slew of dashboards for deans, chairs, faculty, and students using data from the Blackboard Learn LMS alone. Mesa Community College has taken it one step further, using Blackboard Analytics to also ingest SIS data. University of Maryland, Baltimore County is using Blackboard Analytics for Learn to explore the LMS in much finer detail and assess the impact of faculty course redesign training.
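
None of these projects shipped with public code, but the general shape of a GPA-plus-LMS-usage predictive model is easy to sketch. Below is a minimal, hypothetical illustration in Python with scikit-learn: the feature names echo the variables Marist and Chico reported, the data is synthetic, and nothing here reflects either school’s actual implementation.

```python
# A toy version of the predictive models described above: logistic
# regression over GPA and LMS-usage features to flag students unlikely
# to pass a course. All feature names and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 500  # hypothetical enrollment across several course sections

# Synthetic stand-ins for an SIS variable (GPA) and LMS-usage variables
# like those Chico reported: site hits, "dwell time", assessment hits.
df = pd.DataFrame({
    "gpa": rng.uniform(1.0, 4.0, n),
    "site_hits": rng.poisson(120, n),
    "dwell_minutes": rng.gamma(2.0, 90, n),
    "assessment_hits": rng.poisson(15, n),
})

# Fabricate a pass/fail label loosely driven by GPA and engagement,
# just so the example runs end to end.
score = 1.2 * df["gpa"] + 0.01 * df["dwell_minutes"] + rng.normal(0, 1, n)
df["passed"] = (score > score.median()).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="passed"), df["passed"], test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Students whose predicted pass probability falls below a threshold
# would be the ones flagged for an intervention notice.
at_risk = X_test[model.predict_proba(X_test)[:, 1] < 0.5]
print(f"{len(at_risk)} students flagged for intervention")
```

The hard parts, of course, are the ones this toy skips: assembling clean, consistent features from real SIS and LMS records and validating the model, which is exactly where the outside mathematicians and hired PhDs come in.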

Barriers? Sure. Analytics are hard: the people who developed Sherpa called in three outside mathematicians to help design their statistical model, and Kentucky hired three PhDs. Analytics require buy-in, and many of the presenters were CIOs, provosts, presidents, or vice-this-or-thats. There is a lot of missing data (e.g., classes that don’t use an LMS) and a lot of inconsistent data (e.g., variance in how faculty use LMS gradebooks). Statistical models are still in an early stage of development, and proprietary software like Blackboard Analytics is expensive.

For more on learning analytics, visit The Society for Learning Analytics Research (SoLAR), an interdisciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training, and development.

“Physical activity has been identified as an important behavior to help prevent the development of overweight/obesity and associated conditions including diabetes, cardiovascular disease, and metabolic syndrome. Regular physical activity has also been found to improve dietary intake and patterns. Therefore, interventions targeting physical activity may lead to weight change not only by increasing calories expended each day, but also by influencing the food choices people make.” [1]

Or so thought Nutritional Sciences graduate student Lizzie Pope, who designed a study to provide new information on the efficacy of using monetary incentives to help college freshmen meet physical activity guidelines, and thereby establish an important health-promoting behavior. It worked like this:

One hundred seventeen students were randomized to one of three groups: continued-incentive, discontinued-incentive, or control. For 12 weeks during the fall semester, both incentive groups received weekly incentives for using the university fitness center. At the conclusion of the fall semester, weekly incentive payouts ended for both groups. For 12 weeks during the spring semester, the discontinued-incentive group received no incentives to use the fitness center, while the continued-incentive group received incentives on a variable-interval schedule averaging one payment per month; the exact schedule was not known to the participants in this group. (A sketch of such a schedule appears below.)
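
The study doesn’t describe how those unpredictable payout dates were chosen; here is one hedged way to simulate a variable-interval schedule in Python. The exponential gaps and the `mean_interval_weeks` parameter are assumptions for illustration, not the study’s actual method.

```python
# A sketch of a variable-interval incentive schedule like the one the
# continued-incentive group was on: over a 12-week semester, payouts land
# at unpredictable times averaging roughly one per month.
import random

def variable_interval_schedule(weeks=12, mean_interval_weeks=4, seed=None):
    """Return the weeks in which payouts occur, with exponentially
    distributed gaps averaging mean_interval_weeks between payouts."""
    rng = random.Random(seed)
    payouts, t = [], 0.0
    while True:
        t += rng.expovariate(1 / mean_interval_weeks)  # next random gap
        if t > weeks:
            break
        payouts.append(int(t) + 1)  # the (1-based) week the payout lands in
    return payouts

# Only the experimenter would see this list; participants never knew
# when the next payout was coming.
print(variable_interval_schedule(seed=7))
```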

Slide: Burn and Earn: Incentivizing Exercise in First-Year College Students

Great. But how was she going to track the students? Well, perhaps she could set up some sort of card swipe system that would record the student ID, date, and entry/exit times for each visit to the fitness center. Great, how do you do that?

Enter the Center for Teaching and Learning. Lizzie enlisted the aid of CTL staffer Wesley Wright, and together they assembled a Mac Mini computer and a magnetic card reader. The Mac recorded each card swipe into a central database, which fed a web site where students tracked their individual progress with tables and graphs. The web site also gave Lizzie individual and group statistics along with the incentive payments owed to each student. (A minimal sketch of such a swipe logger appears below.)
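
The project’s internals weren’t published beyond this description, but a swipe logger of this kind is small. Here is a minimal sketch in Python, assuming a keyboard-wedge magstripe reader (most USB card readers simply type the card’s track data followed by Enter); the table layout and the ID-extraction pattern are hypothetical, not the actual UVM system.

```python
# Minimal swipe logger: read card swipes from a keyboard-wedge magstripe
# reader (which appears to the OS as typed lines of text) and record each
# swipe in a SQLite database. Entry and exit are just two swipes; pairing
# them into visits happens at query time.
import re
import sqlite3
import sys
from datetime import datetime

db = sqlite3.connect("swipes.db")
db.execute("""CREATE TABLE IF NOT EXISTS swipes (
                  student_id TEXT NOT NULL,
                  swiped_at  TEXT NOT NULL)""")

# Hypothetical Track 2 layout: ';<student_id>=<other data>?'
TRACK2 = re.compile(r";(\d+)=")

print("Ready to swipe (Ctrl-C to quit)...")
for line in sys.stdin:
    match = TRACK2.search(line)
    if not match:
        print("Unreadable swipe, try again.")
        continue
    student_id = match.group(1)
    db.execute("INSERT INTO swipes VALUES (?, ?)",
               (student_id, datetime.now().isoformat()))
    db.commit()
    print(f"Logged {student_id}")
```

A web front end then only needs to query this one table to produce per-student visit counts, group statistics, and payout totals.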

Results?  “Basically, we were able to increase fitness-center use over the fall semester by paying weekly incentives,” says Lizzie, “however this increased exercise did not translate into weight maintenance for the incentive groups over the fall semester.  In the spring semester without incentive payments our discontinued-incentive group no longer met fitness-center use goals.  However, with a variable-interval payment schedule our continued-incentive group continued to meet fitness-center use goals. Unfortunately, this increased exercise again did not translate into weight maintenance over the spring semester.  It would be interesting to measure body composition and metabolic markers to see if the increased exercise had beneficial effects other than weight control.” 

[1] “Burn and Earn: Incentivizing Physical Activity in College Freshman – University of Vermont,” n.d. http://www.reeis.usda.gov/web/crisprojectpages/0223043-burn-and-earn-incentivizing-physical-activity-in-college-freshman.html