Last week higher education leaders from around the U.S. gathered in Salt Lake City to answer not only the question, “What is ‘big data’?”, but also the “now what?” of its implications for the future of higher education.
The two-day summit kicked off with an orientation to the learning analytics ecosystem by two well-known experts – Linda Baer, Minnesota State University – Mankato, and Don Norris, Strategic Initiatives, Inc. Key takeaways from the morning session include:
- We have more data than we know what to do with, but not enough analysts to figure out what to do with it.
- Most attendees were looking to accelerate their analytics development, but many were just getting started.
- Before launching an analytics initiative, institutions need to complete a readiness assessment. A three-stage model was shared:
- Stage I – Getting Started
- Raise the analytics IQ of leaders, faculty, and staff.
- Stage II – Accelerating Development
- Create an action plan for analytics.
- Be sure to identify “quick wins” and “low hanging fruit” – it will help demonstrate the behavior you want people to display.
- Stage III – Creating Transformative Strategies for Leveraging Analytics
- Create a ‘single point of truth’ – having the team argue over whether the data is correct or not is counterproductive.
- Differentiate the units of analysis, and distinguish real-time data from leadership-based data.
- Who needs the data?
- What kind of report do they need?
- How often do they need it?
- At what level of granularity do they need it?
- You can’t wait for a plan before taking action. You need to learn from doing. MOOCs show that we may be in a “Shoot, Ready, Aim” culture now – we have to continue moving forward: audit, measure, plan and act all at the same time.
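For teams working through those reporting questions, the four dimensions can be captured in a simple structure before any dashboards are built. A minimal sketch, with hypothetical example values rather than any prescribed taxonomy:

```python
from dataclasses import dataclass

@dataclass
class ReportSpec:
    """One reporting requirement, capturing the four questions above."""
    audience: str      # Who needs the data?
    report_type: str   # What kind of report do they need?
    frequency: str     # How often do they need it?
    granularity: str   # At what level of granularity do they need it?

# Hypothetical examples: a real-time advisor alert vs. a leadership summary.
advisor_alert = ReportSpec("academic advisors", "at-risk student alert",
                           "daily", "individual student")
board_summary = ReportSpec("board of trustees", "retention dashboard",
                           "quarterly", "institution")
```

Writing requirements down this explicitly makes it easier to spot when two audiences need the same data at different granularities.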
For more, be sure to check out the slides and handouts from Linda & Don’s orientation to analytics.
Data Change Everything
Ellen Wagner, WCET Executive Director, welcomed the group with the story of the Sword of Damocles and how it parallels the way analytics function across all market sectors today. She illustrated the point with Marissa Mayer's decision to end telecommuting at Yahoo, which was informed by analyzing VPN logs to gauge productivity. Mayer took much heat in the popular press for her decision, but it was rarely acknowledged that the decision was based on data showing that workers were under-performing from home.
Featured speaker, Chris Bustamante, President of Rio Salado College, shared how the nation’s largest public online community college, with more than 41,000 online students, uses data to track success, assess the situation, and make changes to improve retention to help more students reach achievement goals. Key takeaways:
- In the systems you build, only include variables that are actionable student behavior – we can’t change a student’s gender or race.
- Providing a student with the right intervention at the time when it is needed can be the difference in student success.
- With the Rio Salado system, they can show that students logging into the course by the second day have a 20% chance of succeeding. By the eighth day of the course, their system can predict a student’s outcome with 70% accuracy.
- Lack of engagement is both a student AND a course design issue.
- Now is the time to learn from others – leverage their experience and expertise.
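To make the early-warning idea concrete, here is a toy sketch of how login timing and activity completion might feed a success estimate. The thresholds and weights are invented for illustration; this is not Rio Salado's actual model:

```python
def success_probability(first_login_day, activities_completed):
    """Toy estimate of success probability from early-course behavior."""
    p = 0.5  # neutral prior before any signal
    if first_login_day <= 2:
        p += 0.2   # logging in early is a strong positive signal
    elif first_login_day > 8:
        p -= 0.3   # a very late first login is a strong negative signal
    # Each completed activity nudges the estimate upward.
    p += 0.05 * activities_completed
    return max(0.05, min(p, 0.95))  # keep the estimate away from 0 and 1
```

The point of even a crude model like this is that it surfaces a number early enough for an intervention to matter.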
Next up, Jared Mann of Cengage shared their new product, MindTap. Key takeaways:
- MindTap is interactive, customizable content that combines readings, multimedia, activities and assessments to guide students through a Learning Path to complete their course.
- Allows instructors to personalize the content and integrates with learning management systems.
- Can give different point levels for engagement based on activities completed.
- Ellen Wagner noted that being able to look inside an activity to be able to determine its worth/value is important.
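The activity-based point levels described above can be sketched as a simple weighted score. The activity types and point values below are invented for illustration, not MindTap's actual scheme:

```python
# Hypothetical point values per activity type: richer activities score higher.
ACTIVITY_POINTS = {"reading": 1, "discussion": 2, "quiz": 3}

def engagement_score(completed_activities):
    """Sum point values for completed activities; unknown types score 0."""
    return sum(ACTIVITY_POINTS.get(a, 0) for a in completed_activities)
```

Weighting by activity type is one way to "look inside an activity to determine its worth," as Ellen Wagner put it.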
Shannon Meadows of CourseSmart then discussed the disruptiveness of data in higher education. Key takeaways:
- New Vision to Transform Education
- Old paradigm = observable behavior
- New Paradigm = Observable Behavior + Data (from content management, online interaction, etc.)
- Analytics are the bridge between causation and correlation
- CourseSmart’s analytics dashboards, currently in beta, are clean and easy to interpret.
- Analytics will transform teaching, learning, and accountability.
- Analytics will help the U.S. regain its competitive edge in higher education.
The final panelist of the opening session was Darcy Hardy, WCET Executive Board member. Key takeaways:
- Big Data is going to be the biggest thing to hit higher education.
- Analytics can help drive conversations about whether students (teenagers, coming straight out of high school) are even ready for college. The Data Driven Gap year.
- What we need are edupreneers = education entrepreneur engineers.
- An edupreneer is the student who will engineer their own credential from life and educational experiences.
Big Data, Big Changes
Next up, Catherine Kelley, Fairleigh Dickinson University, led a discussion with Linda Baer, Minnesota State University – Mankato; Brett Dennis, Blackboard, Inc.; and Peter Smith, Kaplan Higher Education, on how big data plays into the realms of accountability, capacity building, and performance-based funding.
- Minnesota created a dashboard related to performance-based funding, as required by its board. The dashboard displays performance indicators for the Minnesota State Colleges and Universities system and its member colleges on selected key measures.
- Peter Smith quoted Mark Twain – “If they are going to run you out of town anyway – get in front and make it look like a parade,” as an analogy of where higher education is today. Data will be the tool that helps us make the necessary changes.
- Big data is driving a new “mother of all mashups” in which the ecosystems of students are considered. The tool being developed by Kaplan Higher Education will compare a student’s acquired skills against their desired job and its related skills. Then they’ll be provided with a gap analysis and options on how to gain the skills they are missing.
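At its core, the gap-analysis step can be sketched as a set difference between acquired and required skills. The skill names below are hypothetical, and this is not Kaplan's implementation:

```python
def skill_gap(student_skills, job_skills):
    """Return, sorted, the skills the student still needs for the target job."""
    return sorted(set(job_skills) - set(student_skills))

# Hypothetical student profile and target-job requirements.
student = {"writing", "statistics", "public speaking"}
data_analyst_job = {"statistics", "sql", "data visualization"}
missing = skill_gap(student, data_analyst_job)
# missing -> ['data visualization', 'sql']
```

The hard part in practice is not the set difference but normalizing skill names so that a transcript entry and a job posting describe the same competency the same way.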
Cathy Kelley closed this session with wise words, “Be afraid. Be excited.”
Analytical Capacity Building for Institutions
This discussion was led by Don Norris, Strategic Initiatives, Inc.
- For institutions that want to move forward, where do we find the talent, the analytics expertise? How do we get this done? Some initial suggestions:
- Grow your own talent. However, be cognizant that even entry level people with big data analytics capability are prime for recruitment by the private sector.
- Consider “in-sourcing.” Bring in industry people to work with your team to develop a cadre of personnel with analytics expertise. Then consider offering their talent as a shared service. Think about creating an “analytics utility.”
- WCET was encouraged to consider doing a boot camp to train institutional teams.
- Faculty will be resistant to any additional workload requirements to enter student course activity or other data. How do we address the workload issue? The data and measurements have to be embedded in existing systems so no extra work is required. It has to be automatic.
- In response to a concern about faculty academic freedom, these data analytics strategies also work for traditional/hybrid classes. Rather than capturing student data within the class, consider all the student activity outside of class: student logins to chat, data from e-text usage, etc. Faculty will not be intimidated by the collection of data on student activity outside the classroom.
- What’s the pitch that will resonate with legislators? Big data will enhance institutional ability to improve student performance and control student costs.
Creating a Culture of Retention and Persistence – Now that we know, what do we do?
This breakout was led by Matt Pistilli, Purdue University and focused on a shift in culture – from a culture of persistence and retention to a student success culture – noting that by getting to student success, you will beget retention and persistence.
- Within the first ten days of class, combining assignments turned in and attendance, we can predict a student’s outcome while accounting for 70% of the variance.
- Student success is not one person’s job – it’s everyone’s job.
- Institutions need to focus on risk behaviors for all students, not just ‘at-risk’ students.
- What data is important is relative to the situation.
- There are many questions that are raised about the “ethics” of big data. Matt referenced an EDUCAUSE article that he co-wrote.
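The "70% of the variance" figure above refers to the R-squared statistic. A minimal computation of R-squared on made-up numbers (not Purdue's data) looks like this:

```python
def r_squared(actual, predicted):
    """Fraction of variance in `actual` explained by `predicted`."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)          # total variance
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))  # residual
    return 1 - ss_res / ss_tot

# Hypothetical end-of-term scores vs. a model's day-ten predictions.
grades = [60, 70, 80]
predicted = [62, 69, 81]
explained = r_squared(grades, predicted)  # close to 1.0 means a good fit
```

An R-squared of 0.70 from just assignments and attendance in the first ten days is a striking result, which is exactly why the "now what do we do" question dominated the session.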
Developing a Data Strategy
This session was led by Alan Drimmer, Apollo Group, who shared the strategies Apollo and its institutions use to harness data for student success.
- Data is everywhere. Strategy may be THE most important thing a school does to innovate.
- There is no magic formula for using data, because the issues are complex and every school is different. That does not mean you can’t learn from others’ experience, but data strategy is not a one-size-fits-all situation.
- Learning games are a use of big data – the game reacts with challenges based on your and other players’ data to customize the experience.
- In regard to gaining faculty buy-in for data strategy, don’t go to faculty with a blank sheet – bring them some initial trend data, highlight the outliers and see what questions they ask.
- Focus on something that is important AND something for which you can get the data.
- Assembling the right team is key. Otherwise you will experience GIGO (garbage in, garbage out).
- There is no silver bullet for influencing human behavior.
Summary of Day 1
Ellen Wagner brought the group together and asked all to share their key takeaways from the day:
- There is no such thing as ‘sort of transparent.’ Once our data are out there, we’ll be accountable.
- We should think about not only how to tell the story of the successes, but alongside that outline the critical organizational changes that moved the power levers to obtain those successes.
- Some were concerned about the hype. The processes have to be in control and carefully outlined so we don’t blow up one process for another.
Predictive Analytics Reporting (PAR) Framework: Academic Risk Identification
Beth Davis, PAR Framework Director, introduced the PAR Framework and its goals; PAR data scientist Jeff Grant shared what the tools being developed can do; and Luzelma Canales, Lone Star Community College System, shared the experience of an institution participating in PAR.
- PAR is a “big data” analysis effort to identify drivers related to loss and momentum. It informs student loss prevention.
- WCET member institutions voluntarily contribute de-identified student records to a single federated database.
- Common Data Definitions are at the foundation of reusable predictive models and meaningful comparisons and are shared openly via the DataCookbook by IData, Inc.
- In a Campus Technology article interviewing Russ Little, Sinclair Community College, these data definitions were dubbed the “Rosetta Stone of Student Success Data.”
- The PAR presentation contained several references to the film “The Graduate.”
- Lone Star Community College System has been able to use the data in their PAR dashboard to work with K-12 and legislators to pinpoint which districts need to better align their math curriculum. Using these data, they can talk to policy makers about how to close the gaps on STEM.
- The PAR dashboard allows institutions to see pass rates in a class by factors such as credit ratio, GPA, number of withdrawals, gender, receipt of a Pell Grant, or ethnicity. It also allows institutions to look at, for instance, permanent residence location by major, race and ethnicity, home campus, or gender. By understanding the issues behind each campus through the data, we can better understand how to move policy and practice.
- It was noted that the PAR team does not know who the students are, but institutions can know who their own students are.
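A pass-rate breakdown of the kind the PAR dashboard provides can be sketched as a simple group-by over de-identified records. The field names and records below are hypothetical, not PAR's actual schema:

```python
from collections import defaultdict

def pass_rates(records, factor):
    """Group records by `factor` and compute the pass rate per group."""
    totals = defaultdict(lambda: [0, 0])  # factor value -> [passed, total]
    for r in records:
        bucket = totals[r[factor]]
        bucket[0] += r["passed"]
        bucket[1] += 1
    return {k: passed / total for k, (passed, total) in totals.items()}

# Hypothetical de-identified course records (passed: 1 = pass, 0 = fail).
records = [
    {"gender": "F", "pell": True, "passed": 1},
    {"gender": "F", "pell": False, "passed": 0},
    {"gender": "M", "pell": True, "passed": 1},
]
by_gender = pass_rates(records, "gender")  # {'F': 0.5, 'M': 1.0}
by_pell = pass_rates(records, "pell")
```

Because the function takes the grouping factor as a parameter, the same records can be sliced by any of the dimensions the bullet above lists.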
PAR Framework: Student Success Matrix
In this session, Peter Shea, SUNY; Karen Swan, University of Illinois – Springfield; Mindy Sloan, Ashford University; and Sandy Daston, PAR, outlined the basis for development of a student success matrix.
- We need new models that are explicit to the type of student we are trying to assist.
- Retention is an institutional goal to keep students enrolled to completion. Persistence is an individual’s goal of working to achieve personal educational desire.
- Persistence is a recipe for success, but with too many institutional barriers, even a persistent student will go elsewhere.
- The PAR Student Success Matrix predictor categories are based on three inputs – literature review, partner experiences, and predictive models.
- PAR Student Success Matrix is defined by four periods – connection, entry, progress, and completion – based off the Completion by Design work and can be applied across a student lifecycle or across a course lifecycle.
- 22% of the interventions reported to date are related to learner characteristics, which are harder for institutions to change.
- Mindy Sloan noted that when Ashford University completed the Student Success Matrix, they found they were putting a lot of resources into the first few weeks, which showed their resource allocation was in-line with their mission.
- Karen Swan noted that the data tells you what, but it does not tell you why. You have to implement the interventions and then figure out why.
- Collaboration is the key to making this work and keeping it viable. There will be opportunities in the future for others to support and join in the work.
- Knowledge isn’t a strategic resource like a pile of money. Sharing makes you and the knowledge stronger.
- Don’t forget, the data are descriptions of living, breathing students, human beings who we are working to help succeed.
Student Success by the Numbers
In this session we learned about several of the commercial products which help institutions explore student success. The panel was moderated by David Leasure, Western Governors University and included Mac Adkins, Smarter Services; Deb Everhart, Blackboard, Inc.; and David Yaskin, Starfish.
- The impacts of college are transgenerational. Lack of success begets the same. Success transforms families for generations.
- SmarterMeasure and SmarterSurveys help institutions model student success by using noncognitive variables including internal (attributes, learning styles), external (life factors), and technical/computer skills.
- Data visualization helps turn numbers into action. We need to move away from post-mortem grades to real-time indicators to move the bar on student success.
- Blackboard Learn’s Retention Center puts tools in students’ hands to see how their activity and grades compare with those of other students in the same course. It also provides instructors with a breakdown of their own activity in the online portion of the course.
- Employing a student success platform does not mean students are succeeding, it means you want to engage more.
- Closing the loop is hard. If you recommend a student do something to improve, determine a way to track if the recommendation was followed.
- Starfish Retention Solutions enlists the whole institutional community to participate in student success by connecting across the institution, not just within one segment.
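The "closing the loop" advice above can be sketched as a small log that records each recommendation and whether the student acted on it. The class and field names below are hypothetical, not any vendor's API:

```python
from datetime import date

class RecommendationLog:
    """Track interventions recommended to students and their follow-through."""

    def __init__(self):
        self.items = []

    def recommend(self, student_id, action):
        # `followed` stays None until we learn the outcome.
        self.items.append({"student": student_id, "action": action,
                           "made": date.today(), "followed": None})

    def mark_followed(self, student_id, action, followed):
        for item in self.items:
            if item["student"] == student_id and item["action"] == action:
                item["followed"] = followed

    def follow_rate(self):
        """Share of resolved recommendations that were actually followed."""
        resolved = [i for i in self.items if i["followed"] is not None]
        if not resolved:
            return 0.0
        return sum(i["followed"] for i in resolved) / len(resolved)
```

Even a log this simple makes the hard part visible: every recommendation needs a later check-in before the loop is actually closed.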
Phil Ice, APUS, and Megan Stewart, Clickstream Learning, opened a look into the future of data analytics as we move beyond what is known today.
- Explosive growth:
- In 2012 there were 2.4 billion internet users, with 8% year-over-year growth.
- Between December 2008 and May 2013, mobile-device access of the web grew from 0.9% to 15%.
- In one minute, 204 million email messages are sent – every minute of every hour of every day.
- Higher Education accounts for 10-20% of all internet traffic and the amount of data being created is staggering.
- Extreme Data = Extreme Integration + Extreme Stress. Terabytes to petabytes of data are being created on our campuses almost daily.
- There is a fundamental shift from structured data (about 5% of data) to unstructured data (about 95% of data) in our work.
- Data collection and analysis are tied to learner behaviors. Behaviors are actions. The goal is to determine the data that can help us inform behaviors and change them for the better.
- Ellen Wagner shared an analogy for very inexpensive software licenses – they are much like the almost free puppy – the costs come in for keeping everything healthy and running.
It’s About the Students!
Vernon Smith, Portmont College at Mount St. Mary’s, brought the Summit back around full circle, reminding us that all of the work we do is about one thing – the students.
- The sword of data is double-edged. It supports the quantified self and adaptive learning, but data can also border on creepy, depending on our (and our students’) comfort level with sharing data.
- The student model for Portmont College at Mount St. Mary’s is students with grit – students who may have only a 10th-grade reading/math level but have the motivation to learn.
- Instead of placement exams, Portmont diagnoses students to find the best fit, then combines an in-person boot camp with cohorts to prepare students for fully online courses.
- Focus on building noncognitive skills along with academics to increase success.
- Portmont College is focusing on four degrees and providing with them measures of employment-ready competencies, which it makes visible to employers in a ‘double-click’ transcript to give a 3-D view of the student.
Perhaps it was best said by attendee Darcy Hardy on Twitter – “Head now officially exploding – in a good way. #wcetsummit13.” A lot of learning was packed into two short days. We welcome you to continue the conversation here and through our social media. In the coming days, you will find resources available on the Summit page at WCET – http://wcet.wiche.edu/connect/2013-data-summit.
Thank you to all who attended and to our sponsors who helped make the conversations possible.
WCET, Manager, Communications
Support our work. Join WCET.
Sword Image by Stu Mayhew on Flickr.