In Blackboard’s data science research practice, we’ve been investigating how faculty and students use Blackboard Learn, and looking into the relationship between that use and student achievement. We’ve posted prior results a few times (here and here) in which we found a large variation in this relationship between courses, and uncovered interesting differences based on tool usage. However, we know that tool use doesn’t occur in a vacuum. These tools are part of a broader course context, shaped by the instructional goals teachers have set for their courses.

To pursue this research thread, we analyzed patterns across courses based on the relative time students spent using different tools, taking use of features and functionality as a proxy for course design. While this question has been addressed previously by scholars in conceptual studies (Janossy, 2008; Dawson and McWilliam, 2008) and has been analyzed empirically in one study (Fritz, 2016), it has never before been investigated in a multi-campus study.

This post summarizes our findings; you can download the complete research report for a deeper understanding.

Data & Results

The data sample for this study included 70,000 courses from 927 institutions, with 3,374,462 unique learners using Blackboard Learn during Spring 2016 in North America. All of this data was anonymized at the individual and institutional levels; only aggregate data was used for analysis. We conducted a cluster analysis and five course patterns emerged.
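The report does not publish the exact clustering procedure, but the general approach can be sketched. Assuming each course is represented as a vector of per-tool time shares and archetypes are found with k-means (an assumption on our part; the study may have used a different algorithm), the analysis looks roughly like this, here run on synthetic data:

```python
# Hypothetical sketch of the clustering step: the study's exact method is
# not published, so this assumes k-means on per-course tool-time shares.
# All data below is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each row is a course; columns are the share of student time
# spent in Content, Assessment, Discussion, Other (each row sums to 1).
courses = rng.dirichlet(alpha=[5, 1, 1, 1], size=200)

def kmeans(X, k=5, iters=50):
    # Initialize archetype centers from randomly chosen courses.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each course to its nearest center (squared Euclidean distance).
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        # Recompute each center as the mean of its assigned courses.
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(courses, k=5)
print(labels.shape, centers.shape)  # one archetype label per course
```

Each resulting center is itself a time-share profile, which is what makes the archetypes interpretable as "content-heavy," "assessment-heavy," and so on.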

The five course design archetypes include:

  1. Supplemental – high in content but with very little student interaction
  2. Complementary – used primarily for one-way teacher-student communication
  3. Social – high peer-to-peer interaction through discussion boards
  4. Evaluative – heavy use of assessments to facilitate content mastery
  5. Holistic – high LMS activity with a balanced use of assessments, content, and discussion

The majority of courses fell into the “Supplemental” category (9,909 courses, or 53%), while the second-largest group (4,588 courses, or 24%) fell into the “Complementary” category. The remaining courses fell into the “Social” category (2,130 courses, or 11%), the “Evaluative” category (1,832 courses, or 10%), and the “Holistic” category (351 courses, or 2%). See the image below for this overview.

Chart: Course Archetypes

The first two course archetypes account for over three-quarters of the courses analyzed, and primarily use the LMS to provide students with access to course materials. The Evaluative and Social categories make intensive use of assessments and discussion forums, adding this functionality on top of similar levels of course content use. Courses in these latter categories have a much larger amount of student engagement with Blackboard Learn, whether calculated as time spent or as click-through interactions.

See the images below (or the text alternative version) to learn more about the characteristics of each of the archetypes.

Chart: Supplemental Course Archetype Characteristics

 

Chart: Complementary Course Archetype Characteristics

 

Chart: Social Course Archetype Characteristics

 

Chart: Evaluative Course Archetype Characteristics

 

Chart: Holistic Course Archetype Characteristics

 

Discussion

While there are no “correct” or “incorrect” uses of Learn independent of a course’s educational goals, this research has several implications for practice for faculty and campus leaders seeking to increase student participation.

  • The majority of Learn courses are used primarily to access course content and show relatively little student activity. Even these courses, however, make substantial use of the gradebook and announcement tools.
  • To increase student engagement in Learn, instructors should consider adding assessments or discussion forums. In the next two categories of courses, student time is focused on one of these two areas and is associated with much higher levels of student activity.
  • Use of these tools could free in-person class time for interactions with students and ad-hoc discussions. Assignments may have a similar effect, but due to limitations of Learn logging we cannot determine how much time students spend on assignments submitted through Learn.
  • Courses making extensive use of discussion forums have substantially smaller enrollments than courses in other categories. This finding speaks to the need for effective strategies for forum facilitation in large-enrollment courses.
  • Courses in the more active categories continue to show heavy course content activity in raw numbers, but proportionally more of their activity occurs in other tools. It appears that courses often “top out” at a certain amount of online resources and then expand into other types of activities.
  • Courses with the largest amount of student activity take advantage of a diverse set of tools; campuses should identify and investigate these leading courses as sources of best practices and examples that other faculty can adapt in their courses.

Next Steps

Our next area for research is to examine the relationship between these uses of Learn and student achievement (e.g., final course grade), comparing the results across categories. One might assume that as students use Learn more, there is a greater opportunity for learning, or perhaps a greater amount of student effort put into learning, and therefore a stronger relationship between Learn use and grade in the course categories with deeper use of Learn.

However, in initial exploration we have found a similar distribution of final grades in courses across all categories, and uneven results across tool use by course category. This suggests, counter-intuitively, that grade may be independent of course category – or perhaps there is a more systematic relationship that depends on the specific tools in each course. We’ll find out the true relationship in the data. Stay tuned!

If you’re conducting research into how people are using Learn or another Blackboard Learning and Teaching application, and would like help interpreting results or feedback about how your findings might scale, I’d love to hear from you. You can reach me via email using john.whitmer@blackboard.com, on Skype by my screen name john.whitmer, or on Twitter with my handle @johncwhitmer.

Sources

  • Dawson, Shane, McWilliam, Erica, & Tan, Jen Pei-Ling. (2008). Teaching smarter: How mining ICT data can inform and improve learning and teaching practice. Paper presented at ascilite 2008, Melbourne.
  • Fritz, J. L. (2016). Using analytics to encourage student responsibility for learning and identify course designs that help. (Doctoral Dissertation) University of Maryland, Baltimore County. doi:101189
  • Janossy, James. (2008). Proposed Model for Evaluating C/LMS Faculty Usage in Higher Education Institutions. Paper presented at the MBAA, Chicago, IL.
  • Russell, Thomas.  No Significant Difference Phenomenon.  North Carolina State University, Raleigh, NC, USA.
  • Whitmer, John, Nuñez, Nicolas & Forteza, Diego (2016a, March 18). Research in progress: Learning analytics at scale for Blackboard Learn [Blog post].  Retrieved from http://blog.blackboard.com/research-in-progress-learning-analytics-at-scale/
  • Whitmer, John, Nuñez, Nicolas & Forteza, Diego (2016b, September 7). How successful students use LMS tools – confirming our hunches [Blog post]. Retrieved from http://blog.blackboard.com/how-successful-students-use-lms-tools/

Text alternative

The Supplemental Archetype Characteristics

Average Class Size: 35

Average Student Class Time: 15 hours

Average Student Interactions: 222

Proportion of course time by tool*

  • Course Content: 58%
  • Gradebook: 22%
  • Announcements: 17%
  • Assignments: 6%
  • Other: 6%
  • Assessment: 5%
  • Discussion Board: 3%

The Complementary Archetype Characteristics

Average Class Size: 31

Average Student Class Time: 25 hours

Average Student Interactions: 560

Proportion of course time by tool*

  • Course Content: 66%
  • Announcements: 20%
  • Gradebook: 15%
  • Assessment: 11%
  • Discussion Board: 11%
  • Assignments: 8%
  • Other: 8%

 

The Social Archetype Characteristics

Average Class Size: 25

Average Student Class Time: 39 hours

Average Student Interactions: 1,348

Proportion of course time by tool*

  • Course Content: 51%
  • Discussion Board: 44%
  • Announcements: 14%
  • Gradebook: 10%
  • Assessment: 9%
  • Other: 8%
  • Assignments: 5%

 

The Evaluative Archetype Characteristics

Average Class Size: 33

Average Student Class Time: 38 hours

Average Student Interactions: 808

Proportion of course time by tool*

  • Assessment: 49%
  • Course Content: 42%
  • Gradebook: 11%
  • Announcements: 10%
  • Discussion Board: 6%
  • Other: 5%
  • Assignments: 3%

 

The Holistic Archetype Characteristics

Average Class Size: 39

Average Student Class Time: 70 hours

Average Student Interactions: 1,596

Proportion of course time by tool*

  • Assessment: 67%
  • Course Content: 33%
  • Announcements: 8%
  • Gradebook: 7%
  • Discussion Board: 6%
  • Other: 4%
  • Assignments: 2%

*(Note: Activity proportions represent the average of individual student time distributions and do not total to 100% at the course level)


  • Carl Kuzmich

    Our faculty use blogs and journals as much, if not more than discussion boards. You might want to add those tools to the “social” part of the study. Also, our students submit approximately 45,000+ papers a month to Turnitin, so hopefully tools like Turnitin are included in the study.

    • http://www.johnwhitmer.net John Whitmer

      Thanks @Mandy and @Carl for your comments — we are investigating LTI-enabled tools as much as possible, but at this macro-level we don’t see any specific tool as rising to the level of activity as the built-in tools. It is entirely likely that a significant category of courses in the “supplemental” category are using BB in a “supplemental” way, but using other tools to round out their activities. This is a perfect case where open standards like IMS Caliper will provide help to wrangle in the diverse tools and activities that faculty are using in their courses for a more comprehensive view. Your feedback also makes me think that it may be a good idea to revise the category descriptions to mention the broader ed tech ecosystem that these courses are a part of …

  • Mandy Lupton

    I would be very cautious about assuming that BB = the course. My BB courses (100% online) would register as having almost non-existent activity apart from announcements, Gradebook and use of Collaborate. This is because I use a website for content management and Google Communities/Facebook for discussion. Thus the distribution of grades would be meaningless.

    • Kevin Wang

      I hope John will keep your suggested caution in mind and rethink the direction of the next step of his research. BB is not equal to the course. In your case of totally online teaching, many activities exist outside of BB. In most other traditional instructors’ teaching, BB is only an LMS and major activities exist still in the classrooms. Therefore, the final grades are not just based on student’s use or performance in BB.