Blackboard Reports

Show Them The Data: How PUBLIC Blackboard Usage Reports Can Improve Teaching & Learning

By John Fritz

Problem

UMBC has used the Blackboard course management system (CMS) since 1999, but it has not always been easy to answer two important questions:

1. How (and why) is Blackboard being used by faculty and students?
2. Is it making a difference in teaching and learning?

Blackboard can track the number of students, instructors and courses on a server, and even provide a rudimentary list or graph of "hits" within specified date ranges that correspond to semesters. But it is difficult to "drill down" from a system to a specific course or community level (or vice versa). Unless faculty enroll in and check every other course, they cannot see how their own site usage patterns compare with or complement other sites. The only people who could possibly see both the 30,000 AND 5,000 foot views are OIT system administrators, who can view the entire system and each course or community--if they had time to do so.

This raises the question: When it comes to understanding how Blackboard (or any instructional technology) improves teaching and learning, why should OIT system administrators have the only "bird's eye view" of how the system is being used? Or bear the sole responsibility of interpreting for the campus what is (or isn't) effective use?

Solution

In December 2006, OIT began experimenting with an "average hits per user" approach to determine UMBC's Most Active Blackboard courses "turned on" and used by instructors during a given semester. For our purposes, a "hit" is recorded whenever a user clicks anything inside a Blackboard course or community (e.g., announcements, documents, discussion board, assignments, etc.). This way, "average hits per user" rankings don't favor large enrollment sites over smaller ones. We now have reports for all of 2007 and spring 2008, and will run these reports every semester on the day after the last day of classes.
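In code, the "average hits per user" ranking amounts to dividing each course's total hits by its enrollment before sorting. Here is a minimal sketch with hypothetical course IDs and counts (the real data would come from Blackboard's activity tables):

```python
from collections import defaultdict

# Hypothetical activity records: one (course_id, user_id) pair per "hit".
hits = [
    ("ENGL100", "s1"), ("ENGL100", "s1"), ("ENGL100", "s2"),
    ("BIOL301", "s3"), ("BIOL301", "s3"), ("BIOL301", "s3"), ("BIOL301", "s4"),
]
enrollment = {"ENGL100": 2, "BIOL301": 2}  # enrolled users per course

hit_counts = defaultdict(int)
for course, _user in hits:
    hit_counts[course] += 1

# Divide by enrollment so large courses don't dominate the ranking.
avg_hits = {c: hit_counts[c] / enrollment[c] for c in enrollment}
ranking = sorted(avg_hits.items(), key=lambda kv: kv[1], reverse=True)
```

Here BIOL301 ranks first (2.0 average hits per user) even though a raw hit count alone would tell a similar story; with unequal enrollments the normalization is what keeps the comparison fair.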

The key to understanding and using these reports is that activity alone is not an indicator of course quality. After all, high activity could mean students are simply "lost" and desperately clicking anything to find what they're looking for. Similarly, low activity could be due to an instructor using an external, "homegrown" website that he or she simply links to inside a Bb course shell (and thus the student activity "hits" don't get tracked in the Bb reports). But as simply ONE indicator of student engagement, published Blackboard activity reports can provide useful insight to instructors. Faculty learn best from other faculty, so if they can see what is happening in each other's courses, they can seek each other out about what works (or doesn't) in teaching online.

Early on, an important concern was that the Most Active Blackboard Courses reports not embarrass any faculty member who was not on the original "top 50 most active courses" list. So before publishing the reports, OIT staff emailed several faculty (on and off the top 50 list) to ask whether they perceived the reports as an endorsement or indictment of course quality. None did.

We also now publish the activity ranking of ALL courses (see most active courses by discipline, which shows the overall rank for a semester as well as the rank within the discipline). One interesting trend warranting further study: disciplines with more than 10 active Bb courses appear to have higher per course activity levels than disciplines with fewer active Bb courses. If true, one reason may be that having more colleagues using the tool provides more sounding boards to bounce ideas off of, which further underscores the importance of faculty learning from each other.

In addition, with permission from a few faculty, OIT has recently published pilot reports on their Bb courses showing student activity levels by final grade distribution. As one might expect with attendance in a face-to-face class, students who earn higher grades tend to use Blackboard more than those earning lower grades.
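The grade distribution reports described above boil down to grouping each student's activity by final letter grade and averaging within each group. A minimal sketch, using hypothetical (grade, hit count) records for one course:

```python
from collections import defaultdict

# Hypothetical (final_grade, hit_count) pairs for one course's students.
records = [("A", 420), ("A", 380), ("B", 300), ("B", 260), ("C", 150), ("D", 90)]

totals = defaultdict(lambda: [0, 0])  # grade -> [sum of hits, number of students]
for grade, hit_count in records:
    totals[grade][0] += hit_count
    totals[grade][1] += 1

# Average Blackboard activity per student, by final grade earned.
avg_by_grade = {g: s / n for g, (s, n) in totals.items()}
```

With these made-up numbers the A students average 400 hits and the D student only 90, illustrating the pattern the pilot reports found: students earning higher grades tend to use Blackboard more.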

What's New

Our next step was to run a similar summary report at the 25th, 50th and 75th percentiles of activity for all Bb courses, to see if the grade distribution trend can be broadly generalized. We're working on this, but we've already heard from a few faculty who want this report for themselves. Now all faculty can run their grade distribution reports by student activity in Blackboard using myUMBC. Faculty can also analyze their course tool usage by students (click here for a video demo).

In addition, students can use myUMBC's "Check My Activity" link to view their own activity against an anonymous summary of all other user activity in their own courses (click here for a video demo).
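The privacy-preserving idea behind "Check My Activity" can be sketched briefly: the student sees their own hit count alongside class-level aggregates only, never another individual's numbers. The names and fields below are hypothetical, not the actual myUMBC implementation:

```python
import statistics

# Hypothetical hit counts for every user in one course; "me" is the viewer.
class_hits = {"me": 180, "u2": 240, "u3": 90, "u4": 300, "u5": 150}
my_hits = class_hits["me"]

# The student sees only their own count plus anonymous class aggregates.
summary = {
    "my_hits": my_hits,
    "class_median": statistics.median(class_hits.values()),
    "my_percentile_rank": sum(h <= my_hits for h in class_hits.values())
                          / len(class_hits),
}
```

In this made-up course the viewer sits exactly at the class median (180 hits, 60th percentile), which is the kind of contextualized feedback the report is meant to give without identifying classmates.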

Future Plans

Tempting as it may be, students should not assume a cause-and-effect relationship between Blackboard use and higher grades. Unlike hunting for a video game's secret passageway or unpublished "short cut" by hacking through multiple, meaningless clicks, students should not expect to earn a higher grade through brute force or repetition. Instead, if they can view their own activity in the context of an entire class roster, they may be able to monitor and improve their own performance.

For example, here's how a "check my activity" function might be further refined to work in Blackboard:

  • Like the current Bb grade book, the instructor could decide if, when and how students can view a "check my activity" report.
  • There might need to be a minimum number of students enrolled (10?) or period of time that passes (10 days?) so they can't deduce who is active and who is not.
  • If enabled, students would be able to see where they rank relative to other students in the same Bb course with regard to activity level and/or type of tool usage.
  • If past activity by grade distribution reports are available for a current course, the instructor could elect to make these available to students either inside the course, or perhaps on an external site so prospective students can gauge what it might take to succeed.
  • Students might be able to "opt in" to an email, rss or txt message alert if they fall below a specified or desired level of activity relative to their classmates--and their course goals.
  • Academically "at risk" students might have this alert function enabled by default.
  • Rather than wait until the end of the semester, the instructor could elect to run activity reports by grade distribution on specific exams, assignments or activities, so students can see where they currently stand.
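The opt-in alert and minimum-enrollment safeguards in the list above could be combined in one check. A minimal sketch, assuming a hypothetical `needs_alert` helper and made-up thresholds (an alert fires when a student falls below half the class median, and never in classes too small to preserve anonymity):

```python
import statistics

def needs_alert(my_hits, class_hits, threshold=0.5, min_class_size=10):
    """Return True if a student's activity warrants an opt-in alert."""
    # Don't alert in small classes, where the summary could reveal
    # which individuals are (in)active.
    if len(class_hits) < min_class_size:
        return False
    return my_hits < threshold * statistics.median(class_hits)

# Hypothetical class of ten students with hit counts 10, 20, ..., 100.
course = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
```

With this made-up data the class median is 55, so a student with 20 hits would be flagged and one with 60 would not; for academically "at risk" students the same check could simply default to enabled.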

The key is that a CMS collects all kinds of data that might be used as "cheap feedback" instructors don't have to work hard to provide. The CMS could do it for them, and become a self-monitoring tool for students--before or after an instructor assigns an actual grade to their work. In this way, the CMS could be even more of a "self-paced" learning aid for students, serving almost as an "academic biorhythm" that provides a more realistic view of their own activity--and perhaps improves their performance.

Conclusion

Unlike surveys or focus groups, where users are asked what they think about a CMS or say how they would use it, mining actual human-computer interaction data from an enterprise CMS--and displaying feedback to those users through contextualized, individual reports--could improve student retention, online learning design and faculty development.

Credits

This site was developed and is maintained by Jeffrey Berman, a graduate student in Information Systems (2008), who is now a senior applications developer at Drexel University. For a video show & tell about the technical details of our reporting project, click here.

For more information or to request custom queries for information not displayed in these reports, please send email to blackboard@umbc.edu.

Note: To protect the privacy of students who may have elected not to disclose their personal information, including which courses they are taking, the "Most Active Students" report is only visible by UMBC employees (faculty and staff).

FYI

Other colleagues and schools working on Blackboard reporting include Santa Nucifora (Seneca College) and Eric Kunnen (Grand Rapids Community College), winners of Blackboard's 2007 "Greenhouse Grant" competition; Charles Leonhardt (Georgetown University), who piloted a detailed analysis of spring 2006 by analyzing raw data with Excel; and Ronald Chalmers (Hofstra University), who uses MS Access to query nightly downloads from a server hosted by Bb (see the UMBC & Hofstra joint 2008 MARC presentation). Special thanks to Blackboard's Deborah Everhart for facilitating discussions among all of these schools and UMBC.