LearnChemE Screencast Project
The LearnChemE project team lacked an understanding of how students and faculty were using the screencasts. What do users like, and how can the screencasts be improved?
Usability as a lens for understanding screencast use
Better understand interface barriers to learning
Improve the design and quality of new screencasts
Role: UX Researcher | Team: co-Researcher, Data Analyst, Screencast Designer, and Project Manager
Help us understand our users!
LearnChemE has developed over 1,700 screencasts and interactive simulations to help students learn chemical engineering concepts. LearnChemE videos have accumulated more than 31 million views, growing by over 5 million per year.
As the project grew, the team realized they needed to know more about how users interact with their content. LearnChemE asked our team to gather insights to better understand how videos are used and how their quality could be improved.
We took a mixed-methods approach that combined analytics reports, surveys, interviews, and usability tests. Together, these methods yielded a range of insights: how users interact with the content, what is working and what is not, how users decide when to use the videos, and feedback for improving quality. Along with my research colleague, I developed the specific research questions and the research project plan, including the calendar, participant recruitment, data sourcing, survey design, interview protocols, and usability testing scripts.
Product: Screencasts | YouTube, iTunes U
Audience: Students & Teachers worldwide
Guiding Questions: How do faculty & students use screencasts? How can the design of screencasts be improved?
Video Platform Analytics
We analyzed analytics reports from YouTube and Apple's iTunes U (now retired) for macro-level information, including video lengths, popularity, where videos were accessed from, how long they were viewed, and likes/dislikes. We also reviewed comments on the videos for feedback and insights about quality and usability issues.
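As a rough illustration of this kind of analysis, platform analytics exports can be summarized with a short script that flags videos with low watch-through. The column names and data below are hypothetical examples, not LearnChemE's actual export schema or figures.

```python
import csv
import io

# Hypothetical analytics export (illustrative columns and numbers,
# not the actual YouTube/iTunes U report format).
SAMPLE_CSV = """\
video,length_sec,views,avg_view_sec,likes,dislikes
Bernoulli Equation,540,12000,210,300,12
Raoult's Law,300,8000,190,150,5
McCabe-Thiele,780,5000,240,90,8
"""

def summarize(csv_text):
    """Compute per-video watch percentage and like ratio,
    sorted so the least-watched videos surface first for review."""
    rows = csv.DictReader(io.StringIO(csv_text))
    summary = []
    for r in rows:
        watched = int(r["avg_view_sec"]) / int(r["length_sec"])
        likes, dislikes = int(r["likes"]), int(r["dislikes"])
        like_ratio = likes / (likes + dislikes)
        summary.append((r["video"], round(watched, 2), round(like_ratio, 2)))
    summary.sort(key=lambda t: t[1])
    return summary

for name, watched, liked in summarize(SAMPLE_CSV):
    print(f"{name}: {watched:.0%} watched, {liked:.0%} liked")
```

Sorting by watch percentage makes drop-off patterns easy to spot; in practice we paired this kind of macro data with comments and usability sessions to explain why viewers stopped watching.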
We quickly discovered that iTunes U had lower usage and more complex technical issues than YouTube, and we supported phasing out that platform.
The YouTube comments section contained substantial feedback about errors and usability issues in the videos; this area had not previously been monitored for that type of information.
As the lead for a video captioning initiative, I also recommended improving the accessibility of the videos by taking advantage of YouTube's captioning function and provided resources to help improve captioning accuracy.
Instructor Interviews
We conducted individual interviews with instructors and faculty. Participants were asked how they used the screencasts in their teaching, what barriers to use they encountered, and where they saw areas for improvement.
Conducted nine interviews
Faculty and instructors were recruited from multiple participating institutions (to add diversity in rank and institution type and size).
Questions focused on the types of courses in which videos were used, how the videos were incorporated into teaching practices, how students received them, and whether faculty would use screencasts again or recommend them to others.
My primary goal was to understand how they viewed screencasts as pedagogical tools and the ways they used them in their teaching practices. I also wanted to learn about the limitations they experienced with the videos, both technical and process-related, along with any student concerns.
Instructors also used screencasts as tools for their own preparation.
They wanted more guidance, documentation, and examples for using screencasts.
Most faculty recommended screencasts to their students for supplementary use (reviewing concepts, exam preparation).
Surveys of Students and Faculty
Both students and faculty were surveyed about their use of screencasts for teaching and learning.
One survey targeted faculty who had expressed interest in the screencasts; 41 faculty responded, representing 74 classes taught.
Aspects that are useful according to faculty:
Availability | Interactivity | Another Authority's Perspective | Brevity | Problem-Solving Orientation | Indexing and Organization
Additional surveys targeted students, both generally (n=36) and in specific courses (n=233), at two institutions.
Students reported that they: Found screencasts useful | Used them to study for exams | Turned to them when struggling with a topic | Felt screencasts helped them perform better in their courses
Students and faculty both expressed a desire for more videos that cover more topics and to have additional ways to interact around the content (discussion forums, activities, etc).
Usability Testing
We ran 11 moderated usability tests with chemical engineering students, using videos relevant to their studies. Tests recorded mouse movements, keystrokes, user video, and audio commentary from both the moderator and the tester.
After each usability test, we prompted students with think-aloud/cued-recall questions to help understand their struggles while using the screencasts.
Usability tests were primarily focused on identifying usage patterns and barriers or limitations of videos.
"A clearer or bigger picture would help. The smaller picture/image made it harder to understand. Being able to see the difference in sizes."
"Some of them on the site are very long...Keeping them short is better...10 Minutes tops"
"The problem statement was really long. - not suggested to read through it, this made the video less clear."
"...thought he moved a little bit fast. When they were taking the differential. Throwing equations on the screen. The whole line showed up rather than seeing it written out."
The research team met with the LearnChemE project team regularly throughout the research. We provided ongoing status reports and engaged with the team for input and adjustments.
Results of the study informed both how the screencasts are used and ways the screencasts (existing and new) could be improved.
Example recommendations include:
Limit video length to between three and five minutes, but no more than ten
The pace of videos should be context-specific.
Prioritize high quality audio
Presentation matters: fix mistakes and take care with production
Expand to cover additional topics
Our user research enabled the project team to meet a National Science Foundation requirement for the continuation of their grant funding.
Many of our recommendations were implemented by the project team.
New screencasts better address length and pacing.
Audio quality was prioritized in subsequent videos where possible.
I assisted the team in addressing video accessibility. A group of students were hired to help caption videos.
Additional support resources were developed for both faculty and student users.
Unfortunately, our engagement with the project was time-limited. While many of the recommendations have been implemented, there has been no ongoing user research to validate our findings and continue developing new insights.