Relaunching the EdSurge Product Index
Published by: Lindsey Rae Downs | 6/5/2017
Hello! This week we welcome Sunny Lee, Senior Product Manager for Higher Ed at EdSurge. The EdSurge HigherEd team just completed a total relaunch of its courseware product index, a system that helps higher education administrators search for and find courseware. It’s been great to learn more about this relaunch, as well as the process the team took to complete the refresh of the system.
Thank you Sunny for sharing this blog post with us!
Enjoy the read and enjoy your Monday!
The EdSurge HigherEd team recently relaunched our courseware product index to help college leaders search for courseware to meet their teaching and learning needs. Through the index you can filter your search for courseware products, compare product features, and review case studies.
The index relaunch was the result of more than a year of consulting with college leaders to deeply understand their current processes as they search for courseware solutions and the problems they encounter. The interactive filters are inspired by the Courseware in Context (CWiC) Framework, which was created by Tyton Partners and a collaboratory of higher-ed institutional partners. The goal is to help decision-makers more effectively navigate the market of courseware products.
In the first wave of our user research, which began in early January 2016, we started with a relatively blank slate. We embarked on a series of in-depth user interviews to define key higher ed-leader personas and the journey they take during their edtech search, discovery, evaluation and selection process.
We interviewed 49 institutional leaders and established four key representative personas:
Through those interviews, we learned about the needs and drivers of higher education (HE) leaders and were able to start articulating them.
HE leaders want:
We also were able to pull out emergent themes from our conversations:
As we learned more about our key personas, the jobs they needed to get done, their motivations and drivers—as well as pain points—we started user testing mockups to gauge features of the index with HE leaders.
You can see an overview of our process along the timeline below:
These were the initial set of work-in-progress mockups we started out with for user testing purposes:
The questions we focused on during our user-testing sessions, in addition to the user experience of the mockups, included:
After thorough synthesis of all the user interviews and user-testing sessions, we launched our initial courseware product index in July 2016, which you can see below:
Following this launch, we continued to be a part of the effort to simplify and improve the CWiC framework, shepherded by Tyton and collaborating institutions. As the framework matured and relaunched in October 2016, it became obvious that our index was not taking full advantage of the lessons learned by being a part of this endeavor.
For example, the three filters we launched with were courseware features, LMS integrations and discipline. In our earlier interviews, many HE leaders cited these as important baseline considerations during their initial search and discovery process. However, it was clear these filters didn’t capture the critical teaching-and-learning considerations outlined by the framework, which are important to a successful courseware rollout and implementation.
Through another round of user interviews, we found that institutional leaders often rely too heavily on operational requirements like LMS integration, discipline and content sources to select courseware products, without considering which features are necessary to meet their teaching and learning goals. The CWiC Framework was precisely the protocol developed to encourage college leaders to think about their pedagogical goals as they evaluate courseware products. By not interweaving elements of the CWiC framework into the index, we were leaving an important user-experience need unmet.
Our challenge, then, was to figure out how to simplify a framework with nine different functional capabilities, each with an average of five sub-capabilities, for a total of roughly 45 filtering possibilities. That wasn’t even counting the table-stakes capabilities (i.e., operational requirements critical for systems integration, like LMS, accessibility standards and browser support) or other important factors in the search process, including discipline, content source, modality and institutional use cases. There was a clear tradeoff we needed to reckon with: do we aim for thoroughness by surfacing all the filtering capabilities represented by the CWiC framework at the expense of usability, or vice versa? Or was there a middle ground we could strive for?
We took a look at the CWiC framework data submitted to us by the approximately 30 companies in the courseware product index and measured the variance in their responses. We then ranked the capabilities by level of variance. For instance, there was high variability in the companies’ responses to the adaptivity capability, meaning that if a user selected any of the sub-filters under adaptivity, the list of products would be noticeably pared down, narrowing the selection possibilities.
Meanwhile, there was rather low variability in the responses to the usability capability: most companies self-reported on the CWiC framework survey that their product had a high level of usability. While usability might be an important consideration in the courseware selection process, a filter whose selection eliminates almost nothing from the long list is not very effective for the user, and thereby makes for a poor user experience.
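The ranking step described above can be sketched in a few lines. The capability names, rating scale and values here are purely illustrative, not the real CWiC survey data:

```python
from statistics import pvariance

# Hypothetical self-reported scores (1-5 scale) from courseware companies
# for two CWiC capabilities; the data is illustrative only.
responses = {
    "adaptivity": [1, 5, 2, 4, 1, 5, 3, 2],   # high variance: a discriminating filter
    "usability":  [5, 5, 4, 5, 5, 4, 5, 5],   # low variance: filters out little
}

# Rank capabilities by the variance of company responses, highest first,
# so the most discriminating capabilities surface as filters.
ranked = sorted(responses, key=lambda cap: pvariance(responses[cap]), reverse=True)
print(ranked)  # ['adaptivity', 'usability']
```

Sorting by variance like this is one simple way to formalize the intuition that a filter is only useful if selecting it meaningfully narrows the product list.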
With such variance in the data at hand, we pared down the functional capabilities from nine to six. We determined a good combination of the CWiC-framework-derived functional capabilities filters, as well as operational requirements filters, would guide HE leaders to effectively shortlist courseware products that both met their technical needs as well as their pedagogical goals.
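The combined-filter idea can be illustrated with a small sketch. The product records, field names and filter function below are hypothetical stand-ins, not the real index schema:

```python
# Hypothetical product records combining operational requirements
# (LMS integrations) with pedagogical capabilities (CWiC-style).
products = [
    {"name": "A", "lms": {"Canvas", "Moodle"}, "capabilities": {"adaptivity"}},
    {"name": "B", "lms": {"Blackboard"}, "capabilities": {"adaptivity", "interactivity"}},
    {"name": "C", "lms": {"Canvas"}, "capabilities": {"interactivity"}},
]

def shortlist(products, lms=None, capabilities=frozenset()):
    """Keep products that match the required LMS and all selected capabilities."""
    return [
        p["name"] for p in products
        if (lms is None or lms in p["lms"]) and capabilities <= p["capabilities"]
    ]

print(shortlist(products, lms="Canvas", capabilities={"adaptivity"}))  # ['A']
```

The point of combining the two filter types is that a product must satisfy both the technical constraints and the pedagogical ones to make the shortlist.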
With that in mind, our next step was to design a user experience around these new filters that would be engaging and intuitive to use. We came up with two design directions: a guided diagnostic and enhanced filters, which we tested with various HE leaders at this year’s SxSW Edu in Austin:
The goal of the diagnostic was to guide the user through important considerations in the courseware selection process, encouraging the HE leader to think beyond operational requirements and more about teaching and learning needs. Many users told us that they appreciated the guided aspect of the diagnostic, as well as the educational moment to learn about key courseware features that affect teaching and learning in the classroom and ought to be given more weight in the selection process.
Those same users also conceded that the diagnostic felt like a wonderful first-time user experience that would start feeling redundant as the user grew more familiar with the capabilities. Some also expressed concern about the “black-box” nature of the diagnostic: one does not know which products are being left behind by selecting a certain adaptivity sub-filter, for instance. Meanwhile, the enhanced filters encouraged active exploration and immediate feedback through dynamic filtering based on the selections made by the user.
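The “immediate feedback” behavior of the enhanced filters can be sketched as a facet count: after each selection, show how many products remain and how many match each remaining filter option, so nothing is silently left behind. The data and function names below are illustrative assumptions:

```python
from collections import Counter

# Illustrative product records, not the real index data.
products = [
    {"name": "A", "capabilities": {"adaptivity", "interactivity"}},
    {"name": "B", "capabilities": {"adaptivity"}},
    {"name": "C", "capabilities": {"interactivity"}},
]

def facet_counts(products, selected=frozenset()):
    """Return (remaining product count, per-capability counts among the remainder)."""
    remaining = [p for p in products if selected <= p["capabilities"]]
    counts = Counter(cap for p in remaining for cap in p["capabilities"])
    return len(remaining), counts

n, counts = facet_counts(products, selected={"adaptivity"})
print(n, counts["interactivity"])  # 2 products remain; 1 of them also offers interactivity
```

This is the core design difference from the diagnostic: each selection immediately recomputes what remains, making the narrowing effect of every filter visible.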
From the user-testing sessions, we concluded that the diagnostic, while effective as a first-time user experience, would wear thin once the user grew familiar with the filters. It became clear from the feedback that being able to dive in and explore the filters immediately was the better long-term experience. So we decided to build out the enhanced-filter version and table the diagnostic.
The relaunch of our courseware product index in April 2017 was a result of user research and testing that is very much built into our product development process. Immediately after the relaunch, we lined up additional conversations with members in our community to get feedback in order to prioritize ways in which we would further improve the experience.
All of this work is driven by our goal at EdSurge to understand, empathize with, and help higher education leaders get the information they need to make defensible edtech decisions. Through the courseware index we hope to accomplish the following:
Through the dozens and dozens of user-research conversations we’ve had with HE leaders, we have come to more deeply understand the jobs they need to get done, the pain points they experience, and the desire they have to help students accomplish their educational goals and succeed in their higher ed careers. We have deep appreciation and admiration for these HE leaders, and we hope the courseware product index, in its latest iteration, provides value for them and better informs their courseware searches.
If you would like to share any feedback, please reach out to firstname.lastname@example.org!
Sr. Product Manager, Higher Ed at EdSurge