A Response to New NCES Report on Distance Education
Published by: WCET | 6/11/2014
Tags: Distance Education, IPEDS, State Authorization, Survey, U.S. Department Of Education, WCET
By Phil Hill and Russ Poulin, cross-posted to e-Literate blog.
Last week the National Center for Education Statistics (NCES) released a new report analyzing the new IPEDS data on distance education. The report, titled Enrollment in Distance Education Courses, by State: Fall 2012, is a welcome addition for those interested in analyzing and understanding the state of distance education (mostly in an online format) in US higher education.
The 2012 Fall Enrollment component of the Integrated Postsecondary Education Data System (IPEDS) survey collected data for the first time on enrollment in courses in which instructional content was delivered exclusively through distance education, defined in IPEDS as “education that uses one or more technologies to deliver instruction to students who are separated from the instructor and to support regular and substantive interaction between the students and the instructor synchronously or asynchronously.” These Web Tables provide a current profile of enrollment in distance education courses across states and in various types of institutions. They are intended to serve as a useful baseline for tracking future trends, particularly as certain states and institutions focus on MOOCs and other distance education initiatives from a policy perspective.
We have previously done our own analysis of the new IPEDS data at both the e-Literate and WCET blogs. While the new report is commendable for improving access to this important dataset, we feel that the missing analysis and the potentially misleading introductory narrative take away from its value.
The real value of this report, in our opinion, is the breakdown of IPEDS data by different variables such as state jurisdiction, control of institution, sector, and student level. Most people are not going to go to the trouble of generating custom tables, so including such data in a simple PDF report will go a long way toward improving access to this important data. As an example of the data provided, consider this excerpt of Table 3:
The value of the data tables and the improved access to this information are precisely why we are concerned about the introductory text of the report. These reports matter.
We were hoping to see some highlights or observations in the report, but the authors decided to present the results as “Web Tables” without any interpretation. From one standpoint, this is commendable because NCES is playing an important role in providing the raw data for pundits like us to examine. It is also understandable that since this was the first IPEDS survey regarding distance education in many years, there truly was no baseline data for comparison. Even so, a few highlights of significant data points would have been helpful.
There also is a lack of caveats. The biggest one has to do with the state-by-state analyses. Enrollments are counted in the state where the institution is located, not where the student is located while taking the distance courses. Consider Arizona: the state has several institutions (Arizona State University, Grand Canyon University, Rio Salado College, and the University of Phoenix) with large numbers of enrollments from students in other states. Those enrollments are all counted in Arizona, so the state-by-state comparisons have specific meanings that might not be apparent without some context.
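The attribution rule behind this caveat can be illustrated with a toy sketch. The institutions' names, states, and counts below are hypothetical, not IPEDS data; the point is only that tallying by the institution's home state and tallying by the student's location yield very different state profiles:

```python
# Each record: (institution, institution_state, student_state).
# Hypothetical example data, not drawn from IPEDS.
enrollments = [
    ("Inst A", "AZ", "AZ"),
    ("Inst A", "AZ", "CA"),
    ("Inst A", "AZ", "TX"),
    ("Inst B", "CA", "CA"),
]

def count_by(records, key_index):
    """Tally enrollments by the state in position key_index."""
    counts = {}
    for rec in records:
        counts[rec[key_index]] = counts.get(rec[key_index], 0) + 1
    return counts

# How IPEDS reports it: every enrollment goes to the institution's state.
by_institution_state = count_by(enrollments, 1)
# Where the students actually are while taking the courses.
by_student_state = count_by(enrollments, 2)

print(by_institution_state)  # {'AZ': 3, 'CA': 1}
print(by_student_state)      # {'AZ': 1, 'CA': 2, 'TX': 1}
```

Under the first tally, the "AZ" figure absorbs out-of-state distance students, which is exactly why a large distance provider can inflate its home state's totals in the report's tables.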
Even though there are no highlights, the first two paragraphs contain a (sometimes odd) collection of references to prior research. These citations raise the question of what the tables in this report have to say on the same points of analysis.
Postsecondary enrollment in distance education courses, particularly those offered online, has rapidly increased in recent years (Allen and Seaman 2013).
This description cites the long-running Babson Survey Research Group report by Allen and Seaman. Since the current IPEDS survey provides only baseline data, there is no prior work against which to judge growth; therefore, this reference makes sense to include. It would have made sense, however, to provide some explanation of the key differences between the IPEDS and Babson data. For example, Phil described in e-Literate a major discrepancy in the number of students taking at least one online course: 7.1 million for Babson versus 5.5 million for IPEDS. Jeff Seaman, one of the two Babson authors, is also quoted in e-Literate on his interpretation of the differences. The NCES report would have done well to at least refer to these significant differences.
Traditionally, distance education offerings and enrollment levels have varied across different types of institutions. For example, researchers have found that undergraduate enrollment in at least one distance education course is most common at public 2-year institutions, while undergraduate enrollment in online degree programs was most common among students attending for-profit institutions.
This passage indirectly cites a previous NCES survey of students in 2007-08 that used a different methodology.
A 2003 study found that historically black colleges and universities (HBCUs) and tribal colleges and universities (TCUs) offered fewer distance education courses compared with other institutions, possibly due to their smaller average size (Government Accountability Office 2003).
What a difference a decade makes. Both types of institutions show few of their students enrolled completely at a distance, but they are now above the national average in the percentage of students enrolled in some distance courses in Fall 2012.
Rapidly changing developments, including recent institutional and policy focus on massive open online courses (MOOCs) and other distance education innovations, have changed distance education offerings.
Only a small number of MOOCs offer instruction that would be included in this survey. We're just hoping that the uninformed will not think that the hyperbolic MOOC numbers have been counted in this report. They have not.
We are doing some additional research, but it is worth noting that we have found some significant cases of undercounting in the IPEDS data. In short, there has been confusion over which students get counted in IPEDS reporting and which do not. We suspect that the undercounting, which is independent of distance education status, is in the hundreds of thousands. We will describe these findings in an upcoming article.
In summary, the new NCES report is most welcome, but we hope readers do not make incorrect assumptions based on the introductory text of the report.
Phil Hill
Mindwires.com
e-Literate blog
Russ Poulin
WCET – WICHE Cooperative for Educational Technologies
If you’re not already a member, come join us!
3 replies on “A Response to New NCES Report on Distance Education”
Thanks to Phil and Russ for their ongoing work on this important topic. I agree with your opinion that the real value of the NCES report is the breakdown of IPEDS data by different variables. It is interesting to look at these variables for useful and even startling data (e.g., why so few DL learners percentage-wise in CA or DE? Do DL learners really comprise almost half of all higher ed. students in AZ, and if so, why?).
It’s also important to find plausible explanations, if not exactly convergence, among different data sources. It seems as though the difference between the NCES and BSRG figures (7.1M vs. 5.5M) is explainable by undercounting (as you mention) + inclusion/exclusion of non-degree/certificate-seeking students + differences in DL course definitions (80% vs. 100%; I think I disagree with Jeff Seaman on this latter point). In fact, as noted in my 2012 book The Seven Futures of American Education (pp. ), the research firm Ambient Insight cites even larger numbers (12M DL learners in 2010) because their report included students from all postsecondary institutions that participated in Title IV federal student aid programs, including non-degree-granting institutions. The numbers from MOOCs and blended/hybrid courses will continue to blur the distinction between “distance” and “not distance” students.
In the end, however, we are better off having multiple numbers, because they will help compel us to look more deeply at the assumptions behind those numbers and to select different figures judiciously based on the assumptions needed at the moment. So long as this is done with reasonable transparency, this would be a good thing IMO. And just as the BSRG report data yielded useful patterns which informed subsequent practice, the IPEDS report can do the same in new and different ways as you both note…
I like it.