Student Engagement Roster
UX Research for Enterprise Systems
Background
In an ongoing effort to provide more accurate educational assistance and services, Indiana University needed to better understand students' needs, behaviors, and performance in class.
The Student Engagement Roster (SER) was created as a means of collecting observations and recommendations from instructors to provide assistance tailored to the specific needs of individual students.
SER was first launched in a pilot phase in the fall of 2017 to a subset of around 40 instructors, then fully released to all instructors at Indiana University in the spring of 2018. During the pilot phase, my team of two designers conducted feedback interviews with users of the system to evaluate SER as a tool for recording instructors' feedback on students before the full release to all Indiana University faculty.
My role
UX Researcher
Methods
Informational interviews
Moderated usability study
Full-team observation and debriefs following sessions
Objectives
These were our main learning goals for this project:
How do faculty see SER as a tool? What is its primary use? How does it fit into their day? How is this different from how we view SER as the product team?
Does SER communicate itself well? Is the interface understandable, and can tasks be accomplished with little to no deliberation about what a certain button or feature might do? Is SER realistic, efficient, and ideal in the context of the user's work?
Do the options for feedback in SER match how faculty might describe student performance in their own words?
Insights
This research project identified several pain points:
The application did not allow instructors to enter feedback quickly enough for it to be a manageable task within their day. Even for large class sizes, this task should have been a relatively small time sink.
Many features in the system were going completely unnoticed, even when they would have been useful for the participant's workflow. Instructors needed a quick, easy way to find guidance while using SER, which the existing help articles were not providing.
Faculty rely on Canvas, a grade and assignment management software, to inform observations they make about students.
Organizing the study
Because our faculty members were located all across Indiana, we needed to run these sessions via remote video conference. Chris and I organized research materials for the product team and scheduled hour-long sessions broken into three main parts:
General feedback and Q & A
Usability testing and observation. For this portion, we asked instructors to reserve some of their actual work to be performed during the session.
Usability and concept testing of new features.
To ensure the whole team received the feedback directly as well, we arranged the video call so the rest of the team could observe silently and take notes while Chris and I led the sessions. Using a combination of Google Forms and Sheets, we consolidated notes taken during the sessions and met immediately following the interviews to debrief and synthesize our findings with the entire product team.
Turning findings into strategy
After discussing our initial findings with the product team and stakeholders, we focused on three main actionable objectives to provide more benefit to users:
Make it easier for instructors to identify patterns in student performance using feedback in SER.
Ensure SER makes users aware of the features and abilities they have in the product.
Make information outside of SER, in this case Canvas gradebook information, available to users within the SER application.
Prototyping for more testing
Chris and I whiteboarded options to explore the design space, sketched iterations for quick feedback from the team and higher-ups, created low- and high-fidelity mockups for critique sessions with designers on other teams, and built high-fidelity interactive wireframes for presentations to senior stakeholders.
We also used these prototypes in later feedback sessions with faculty to assess whether these new features and tools were providing value to our users.
One thing I really liked about our process was that we did not hesitate to show early iterations of work. Even for higher-ups, sketches and low-fidelity mockups were fine as long as they communicated the message well enough. This allowed us to apply our efforts and resources more intelligently and expedite our cycle of discovery and feedback.
New benefit for users
After a few rounds of iterative testing, feedback, design critique, sketching, and prototyping, the following features were suggested for the Student Engagement Roster:
Canvas grade filter and grade import: Integrate Canvas grade information into SER, and allow instructors to filter their students by grade and apply grade values as feedback.
Preset feedback values: Give instructors the ability to create a group of feedback values that can then be applied as feedback in one action.
Instructional content: Provide instructors with easy access to demo videos and help articles within the SER application.
Reflection
As this was one of my first projects in a professional setting, I relied heavily on excellent mentorship from the lead designer, Chris, who helped me learn to prototype with code, navigate the challenges of meeting many different (oftentimes conflicting) expectations, and get unstuck when I was unsure how to move forward.
Sharing concepts and rough sketches early and often prevented me from wasting time chasing a bad idea all the way to a high-fidelity prototype. My product team was patient with my learning process and was not afraid to tell me when my designs and ideas overstepped specifications, or created problems that could come back to bite us down the road.
There is no perfect design; there is only the first design, then the second, the third, and so on. The important thing is not to hesitate to seek feedback, especially when you have just started and your design has major problems. Just start, try something out, and work from there.