“Does it work?”: Formalizing a process for instructional designers and technologists to test teaching and learning tools

Concurrent Session 7

Session Materials

Brief Abstract

Do faculty often bring new technology to your attention? Learn how we harnessed the curiosity and experience of instructional designers and technologists to develop a system for consistent testing of educational technologies to identify best practices for use, develop materials for instructional support, and identify student data or privacy issues.

Presenters

Allyson has more than 8 years of experience in instructional design and delivery across K-12 and higher ed settings. After completing her M.E. in Electrical Engineering, she decided to make Gainesville her home and has worked at UF for most of her career. In addition to facilitating course design and development, she teaches workshops for faculty focused on course design and teaching with technology, and has presented at OLC Accelerate, Educause ELI, and AALHE conferences.

Extended Abstract

When we evaluate tools for teaching and learning, we use a rubric of standards to stay consistent across evaluators. However, as instructional designers and technologists, we often wonder, “Does this tool work as promised?” Most of our requests for tools come directly from instructors who have either seen a demo or heard about the tool from a colleague. While these tools may sound great on paper, we have found that they do not always function as advertised. We were also motivated to develop a testing process for these LMS- or web-based instructional tools because, as many tools are updated to the LTI 1.3 standard, some of their functionality is changing.

When it came time to decide who would be involved in the testing process, we turned to our instructional design and educational technology staff to lead the charge. Their experience learning how to use tools, educating others on how and why to use them, and implementing them in courses with varying needs, modalities, and assessments made them the best group to achieve our goals. As highly skilled users of our LMS, they were able to work quickly and thoroughly through the role-testing prompts laid out on our testing worksheet and also provide critical feedback on each tool's UX/UI, including instructional considerations.

The testing process informs how we provide support to instructors and gives us the opportunity to interact with each tool in different user roles within the development instance of our LMS. Assigning each technologist and instructional designer a different role allows us to proactively identify issues with student data privacy and escalate those concerns to leadership. By using the tool ourselves, we are able to identify potential areas of confusion and create institutional resources that clarify or support a tool’s use for students and/or instructors.

In this short session, we will share the testing worksheet we created, describe how we collaborate with our instructional designers and technologists to complete this testing, and explain how it benefits instructors, university leadership, and students, paving the way for innovation and change. To engage attendees and provide some interactivity during our session, we plan to begin each 15-minute portion by asking the question: "What challenges have you encountered when implementing tools for teaching and learning at your institution?" Attendees will submit answers by entering a Mentimeter code, and responses will be displayed live as a word cloud on a laptop. We hope this exercise encourages attendees to connect with others facing similar challenges and sparks discussion about possible solutions. Finally, we will provide a handout showing our process for attendees to review during the session and take home with them.

Attendees will also leave this session with resources they can use to compare and evaluate how their institution or instructional design group reviews and supports the use of educational technology. We hope they walk away with strategies for consistent evaluation of tools that will help instructional designers and technologists stay aware of product limitations and feel confident in their technology recommendations to instructors.