The Great Learning Analytics Debate Continues: Promises, Pitfalls, and Progress from Multi-Stakeholder Teams and Institutions

Workshop Session 2
Leadership

Brief Abstract

This workshop aims to develop multi-stakeholder case studies that tackle the great learning analytics debate on the promises vs. the pitfalls of learning analytics in higher education environments. All designers, faculty, researchers, developers, administrators, and learning professionals are encouraged to join and contribute best practices for implementing learning analytics.

Presenters

J. Garvey Pyke, Ed.D., leads the Center for Teaching and Learning in fueling enrollment growth at the university through online course development, creating high-impact student success programs using personalized and adaptive learning, promoting faculty success and scholarly teaching through innovative faculty development programs, and overseeing the provision and support of enterprise academic technologies, with a focus on high-quality instruction in all teaching modes and models. The success of the CTL has made him a sought-after contributor for professional organizations and a consultant for other institutions. He has served as President of the University of North Carolina Faculty Developers Consortium and as a member of steering committees for both the Online Learning Consortium (OLC) Innovate and Accelerate conferences, and he has presented multiple times at these and other conferences. Garvey has also been a keynote speaker at the Georgia State University Conference on Scholarly Teaching in 2016 and the University of Central Florida's TOPkit Workshop in 2022. An alumnus of the 2010 OLC Institute for Emerging Leadership in Online Learning (IELOL), Garvey has stayed active in the IELOL network, assisting Larry Ragan with the design and facilitation of the IELOL Master Class from 2014 to 2017. Garvey was co-director of IELOL in 2018 and 2023 and has served on the IELOL faculty since 2019. As an educator for over 25 years who has been with UNC Charlotte since 2003, Garvey enjoys collaborating with faculty members and staff to design and develop programs that improve faculty satisfaction and lead to student success. His work involves the practical application of research methods and instructional systems design methods to various instructional projects at UNC Charlotte; he is an affiliate member of the Graduate School and has served on several dissertation committees.
He holds a doctorate from Indiana University’s School of Education in Instructional Systems Technology and has taught at the university and K12 levels. He also holds a master’s degree in Educational Leadership from Pepperdine University and a bachelor’s degree in English from Tulane University.

Extended Abstract

Topic Introduction

Learning analytics is "the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environment in which it occurs" (Siemens and Long, 2011, p. 32). In simple terms, learning analytics focuses on capturing and using data to improve student learning and learning environments. As learning analytics has become more prominent in mainstream learning management systems and adaptive learning platforms, debate continues over its promises vs. its pitfalls amidst ongoing efforts to advance learning analytics platforms/tools, policies, and processes in higher education environments.

Promises: Stakeholder interest in the field is growing, driven by the prospect of optimizing design processes and providing effective tools to support learners. Capturing data about students, their behaviors, and their engagement yields patterns that can deepen our understanding of how students learn and improve learning processes. These data also help guide the design of instructional materials. In recent years, personalization of learning has been supported by capturing learner profiles and characteristics.

Pitfalls: There are issues to unpack in learning analytics. Data are often complex and heterogeneous, making them difficult for students, faculty, and other stakeholders to interpret and translate into action. For example, at the course level, the practice of translating data into actionable, just-in-time interventions is still uncommon (van Leeuwen, 2019). Although instructional designers perceive learning analytics as valuable for informing design practices, challenges remain in comprehending complex data and applying it to the redesign process. There is also a risk that learning analytics tools may not be sustainable and that assumptions about their effectiveness may not meet the needs of students, faculty, and designers (Dawson, Gasevic, & Mirriahi, 2018). One of the most prominent issues identified is whether learning analytics is racially, politically, or otherwise neutral in the way it labels and discriminates among students based on their performance (Johanes & Lagerstrom, 2017). Despite the promises learning analytics affords, pitfalls include the 'gaming' of learning systems and the limiting of students' domains of knowledge and mastery states to system-driven concept maps/trajectories.

Progress: Amidst the debates over promises and pitfalls, strides in learning analytics are visible in common learning management systems and adaptive/personalized learning platforms. Dynamic, real-time assessment and feedback are available for students and instructors to view through dashboards. Recommender systems help students find answers to questions quickly. Learning analytics is also perceived as helping instructors identify at-risk students and provide timely interventions. Intelligent systems offer views of individual learning progress as well as a holistic view of classroom performance.

Outcomes for the Workshop: 

There is a critical need to understand the bigger picture, identify best practices, and critically examine the ideals of learning analytics for optimizing learning and student success. Common themes serving as outcomes for the workshop include:

  1. Availability of Useful Data

  2. Data Literacy

  3. Process and Strategy

  4. Time and Effort 

  5. Philosophical Resistance or Skepticism 

  6. Privacy, Security, and Misuse 

Interaction/Collaboration Goals:

This topic was first offered as a two-part, debate-style workshop at the Instructional Design Summit at OLC Innovate 2023, and participants strongly requested that the conversation continue at OLC Accelerate 2023.

This multi-stakeholder session brings faculty, researchers, developers, administrators, and designers together to ideate on the great learning analytics debate on the promises vs. the pitfalls of learning analytics amidst ongoing efforts to advance learning analytics platforms/tools, policies, and processes in higher education environments.

Participation:

We invite participants into a guided, interactive, debate-style conversation with visual facilitation through a digital whiteboard canvas to capture mini case studies from the session.

Workshop Structure (90 min):

  • 10 min - Introduction and participants pick a theme about learning analytics (i.e., Availability of Useful Data; Data Literacy; Process and Strategy; Time and Effort; Philosophical Resistance or Skepticism; Privacy, Security, and Misuse)

  • 15 min - Interactive workshop: User-stories related to promises and pitfalls of learning analytics

  • 10 min - Share out and debate among groups

  • 15 min - Interactive workshop: User-stories on successful and challenging implementation of learning analytics tools and strategies

  • 10 min - Share out and debate among groups

  • 15 min - Synthesis of user stories into mini case studies

  • 15 min - Strategies for dissemination of mini case studies to individual institutions