Student Engagement in Online Classrooms: How to Measure It and Act on It

Streamed Session

Brief Abstract

When a student speaks up in an online class, they are engaged—at least for that moment. We present new methods and software that measure this engagement in Zoom classes in detail. Our “discussion dashboards” help teachers engage students more equitably and effectively, and give students insight into their performance.

Presenters

Ben Gomes-Casseres is the Peter A. Petri Professor of Business and Society at Brandeis International Business School. He has been teaching by the case method for over three decades, starting as a professor at Harvard Business School. He has taught online classes since 2020, which led him and his co-inventors to develop the Brandeis Online Class Analytics (BOCA) toolset. Ben has won two teaching awards at Brandeis and previously led the MBA program for many years. He teaches courses on business strategy, innovation, partnerships, and climate change. He is an expert in business combinations, with five books and many articles on M&A, alliances and international business. As BOCA’s co-principal investigator, Ben works on product design, data collection and analysis, and business development.

Extended Abstract

A NEW WAY TO MEASURE AND MANAGE STUDENT ENGAGEMENT

Teachers are always looking for ways to improve student engagement. Measuring engagement is the first step to managing it.

When a student speaks up in an online class, they are engaged—at least for that moment. That’s why any specific information we can glean about when and for how long each of our online students participates can only help improve engagement and deepen learning.

In our lab at Brandeis University, we have developed new systems and methods and a unique software package to generate reliable, objective, and detailed information about student participation in online synchronous courses. From standard Zoom recordings, we measure the speech time of individuals in a meeting and plot it in revealing ways, showing who spoke when, how interactive the discussion was, and how inclusive participation was.
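To make the idea concrete: Zoom cloud recordings can produce a WebVTT transcript whose cues carry speaker labels. The sketch below (our illustration, not BOCA's actual implementation) shows how per-speaker speech time could be totaled from such a transcript, assuming cues of the form "Name: text" under each timestamp line.

```python
import re
from collections import defaultdict

# Matches a WebVTT timestamp line followed by a "Speaker Name: " label,
# as found in Zoom-style audio transcripts (an assumed cue format).
CUE = re.compile(
    r"(\d{2}):(\d{2}):(\d{2})\.(\d{3}) --> "
    r"(\d{2}):(\d{2}):(\d{2})\.(\d{3})\s*\n"
    r"([^:\n]+): "
)

def to_seconds(h, m, s, ms):
    return int(h) * 3600 + int(m) * 60 + int(s) + int(ms) / 1000

def speech_time(vtt_text):
    """Total seconds of speech per labeled speaker in a VTT transcript."""
    totals = defaultdict(float)
    for m in CUE.finditer(vtt_text):
        start = to_seconds(*m.group(1, 2, 3, 4))
        end = to_seconds(*m.group(5, 6, 7, 8))
        totals[m.group(9)] += end - start
    return dict(totals)
```

The resulting per-speaker totals are the raw material for "who spoke when, and for how long" visualizations.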

Our “discussion dashboards” help teachers engage students more equitably and effectively and give students actionable feedback on their performance. A free version of our software is available. The software addresses the security and privacy concerns of a typical university setting. We have been deploying and testing prototypes at our university for three years.

The data and methods presented in this session are new to the market and to this conference, having been developed during 2020–2023. We used the forced shift to online learning during COVID to build and test prototypes of the software and to collect data on multiple courses. This unique database is now being analyzed for lessons about how to manage class participation in Zoom classes. A review of the methodology has been published in "Inspiring Minds," the newsletter of Harvard Business Publishing Education (April 13, 2023).

 

GOALS AND USES OF THE NEW TOOLS

Our methods are relevant to any course taught online with synchronous discussion methods, such as the case discussions common in all business schools, at all levels. Many business schools continue to offer select online classes, and many more have expanded their online offerings and degrees. Our methods will be of interest to course designers, instructors, administrators, and students in these online courses. The work may also have implications for remote work meetings and for teaching in other contexts, including traditional in-person classes.

The educational outcomes that our tools seek to promote are the degree, breadth, and inclusiveness of student engagement in class discussions. Using our measures of student engagement, we explore the pedagogical conditions and actions that affect these outcomes. For example, teaching plans and protocols vary by instructor and by class and have different impacts on engagement.

We will focus in this session on three types of output from our dashboards. The first dashboard helps us evaluate the dynamics of the discussion—who talked when and for how long. This shows how active and broad the discussion was, as well as which segments of the class worked and which didn’t. The second dashboard helps track and manage the inclusiveness of the discussion, measuring the diversity of the participants who engaged. The third dashboard helps in evaluating and giving feedback to students. We will discuss how the dashboard data are generated, how we interpret them, and how we use them to improve our teaching.
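One way to summarize the inclusiveness that the second dashboard tracks (our illustrative choice, not necessarily the dashboard's actual metric) is a Gini coefficient over per-student speaking time: 0 means everyone spoke equally, and values near 1 mean one voice dominated.

```python
def gini(times):
    """Gini coefficient of speaking-time shares.

    0.0 = perfectly even participation; values near 1.0 = one speaker dominates.
    """
    xs = sorted(times)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula from the rank-weighted sum of the sorted values.
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n
```

Tracked over successive class sessions, a single number like this makes shifts in the breadth of participation easy to spot at a glance.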

 

WHY ATTEND THIS SESSION?

If you have ever taught or sat in a Zoom class, this session will open your eyes!

We will show you dashboard visualizations of a kind that you haven’t seen before. And we’ll invite you to consider how you’d use such data if it were routinely available, as it will be soon enough.

We will bring to the session examples of each of these dashboard outputs, from real classes that used discussion pedagogy. These graphics are striking and can be viewed from various angles. We will ask you to interpret the data visualizations and consider how they can be used to improve online learning.  

In our own experience, the data yield insights into the dynamics of class discussion, the balance and inclusiveness of the discussion, and the class performance of individual students. The insights can be used to improve instructional design, to create an inclusive learning environment, and to help students improve their in-class learning.  

 

LEARNING GOALS IN BRIEF

(1) Learn new methods to measure student engagement during online synchronous classes

(2) Practice deriving insights from multivariate data on student in-class engagement

(3) Develop new ways to use in-class engagement data to improve teaching and learning

 

FURTHER DETAILS OF THE TOOLS TO BE PRESENTED

The dashboards we will present are part of a larger toolset. A free demo is available now; it is easy to run and keeps all data on the local computer, but it is limited to measuring one meeting at a time. Our enterprise version, currently in beta testing at Brandeis University, requires central implementation but offers more results, controls access to student data, and compares results across courses.

Our full suite of tools has been developed for deployment in a university IT environment. We use local servers to store and process all data and integrate with data from the Registrar. Our Zoom access, security, and human-subjects protocols are approved by our IT department and the relevant IRB bodies. Student names are encrypted for aggregate analysis and in course comparisons, but each participating faculty member sees the names of their own students, according to their access privileges. These tiered privacy safeguards have served us well at our R1 university, and we expect them to be appropriate in other similar settings. Other forms of deployment can be developed in the future.

Our lab is a collaboration between faculty at the business school and the department of computer science at our university, with support from university IT professionals and our center for teaching and learning. The co-inventors are senior professors at the university, experienced in case-method teaching and in software development.