
State Authorization Reciprocity Agreement – Just the Facts (and a Few Opinions)

“You can please some of the people all of the time, you can please all of the people some of the time, but you can’t please all of the people all of the time.” – John Lydgate

Abraham Lincoln referred to this quote after a difficult experience with another candidate. We are human. We don’t always agree with everything we are told or see.  However, we do need to get the facts straight before we are vocal in our disagreement.

Recently, there has been some limited but vocal opposition to the State Authorization Reciprocity Agreement (SARA). The opposition has raised concerns specifically about abuse by for-profit institutions, which they link directly to online education. One must be aware that abuse is not limited to for-profit institutions and that many such problems occur at on-ground campuses not affected by SARA.

We condemn the bad actions of all institutions that employ predatory practices, misrepresent accreditation or authorization status, misrepresent job prospects and salary outcomes, and use high pressure sales tactics, whether for-profit or not.

Constructive discussion may be a cornerstone of the American process, but allegations have been raised against SARA that require fact-checking. To that end, we would like to share the issues raised by SARA’s critics and provide some objectivity and facts about each one. Additionally, there are two documents that one should carefully review before making a determination of the feasibility or reliability of SARA: the Unified SARA Agreement and the State Authorization Reciprocity Agreement Policies and Standards. Both will be included in the new SARA Manual due out in June 2016.

https://www.youtube.com/watch?v=SWrG6l-5CAg

Myth #1:  For-profit institutions offer almost all distance education. For-profit and online education are conflated.

Facts:  According to the U.S. Department of Education’s IPEDS Fall Enrollment data for Fall 2014:

  • For-profit institutions enrolled only 30% of students who took all of their classes at a distance.
  • For-profit institutions enrolled only 17% of students who enrolled in at least one distance education course.

The closure of Corinthian Colleges last year was one of the most publicized and largest shutterings of a for-profit college in recent years. Operating under several names, Corinthian Colleges was a “career college” with the bulk of its students learning in a face-to-face setting.

Only about six percent of the institutions participating in SARA are for-profit. Most participating institutions are public.

Opinion:  In a letter to the New York State Education Commissioner, there is repeated reference to “predatory online companies,” which is an apparent attempt to demonize all distance education providers. While there have been predatory online institutions, this reference is being applied very broadly.

There is often the suggestion that online education is synonymous with fraud. As the Corinthian Colleges case shows, misrepresentation and fraud can happen anywhere.

In trying to attack the for-profit colleges by assailing SARA, the critics show little worry about the collateral damage to students attending public and non-profit institutions.

Myth #2:  States already provide superior oversight of out-of-state institutions offering online courses to students in their state.

Facts:  Most states do not regulate institutions that offer only 100% online courses to students in their state. SARA makes no such distinction: it prompts review of an institution for initial SARA admission and again for annual SARA renewal. Additionally, SARA provides the student the ability to file a complaint in the institution’s home state, which has the most knowledge and understanding of the institution. Therefore, students who were previously not protected (because their state did not regulate an institution that served students only by online means) are now protected by SARA provisions.

Opinion:  Many of the states (MA, CA, NY, and WI) in which there has been opposition to SARA do not regulate 100% distance education activities offered by out-of-state institutions. If the complaint is that SARA provides insufficient oversight, why aren’t these states regulating distance education of all out-of-state institutions? SARA represents improved student protection in those states.

Myth #3:  SARA was developed solely by the colleges without any consumer protection advocates.

Facts:  SARA was developed openly in three phases:

  1. In 2010, Lumina Foundation funded the Presidents’ Forum and the Council of State Governments to develop a model reciprocity agreement allowing states to acknowledge other states’ decisions regarding institutional authorization. The Drafting Team included three state regulators, a former state regulator, a State Higher Education Executive Officers (SHEEO) officer with regulator duties, two regional higher education compact representatives, two institutional representatives, and a state legislator. Consumer protection was at the center of these discussions. Listening sessions were held with regulators, accreditors, and the higher education presidential groups. Drafts of the model agreement were widely circulated for public comment.
  2. In 2012, building upon the work of the Presidents’ Forum and Council of State Governments, the Western Interstate Commission for Higher Education (WICHE) advanced the next version of an agreement in collaboration with the other regional higher education compacts (Midwestern Higher Education Compact, New England Board of Higher Education, and Southern Regional Education Board). Drafts of proposed agreements were openly shared for public comment.
  3. In 2013, the Commission on the Regulation of Postsecondary Distance Education (chaired by former Secretary of Education Richard W. Riley) was created by SHEEO and the Association of Public and Land-Grant Universities. Many SHEEO offices include state regulators. After publishing a draft for open comment, in April 2013 the Commission issued its final report recommending SARA’s structure.

Opinion:  State regulators were involved throughout the processes. These regulators are the members of the consumer protection community most familiar with laws, regulations, and infractions surrounding state authorization oversight. In each phase, draft documents were openly shared and public comment was sought. The SARA development process started in 2010. While a few state regulators have opposed the idea of SARA all along the way, organized opposition is recent. There are now 37 SARA states and soon to be at least 40.  (See www.nc-sara.org) Why are objections arising so late in the process?  And why has SARA been so successful if it is so dismissive of students’ rights?

SARA requirements apply equally to both the “good” and “bad” guys. Photo Credit: Ryan C

Myth #4:  SARA should regulate the bad guys more than the good guys.

Fact:  SARA’s requirements apply to all institutions equally.

Opinion:  The good guys have nothing to worry about.

In a letter denouncing SARA, former Senator Tom Harkin (IA) opines: “For reasons that fly in the face of the philanthropic mission of the public and nonprofit institutions, the model act they helped draft actually forbids states from regulating differently based on sector. That’s right. Under SARA, Massachusetts is forced to regulate Harvard the same as it regulates ITT. If it doesn’t, then Massachusetts is kicked out of SARA and Harvard has to get approval from each state to offer online education. This is simply not the way it should work.”

That sounds logical until you realize that it is not just the for-profits that break the laws. Yes, there are for-profits that have committed heinous acts of misrepresentation and abuse. They should be punished.

In our work in the State Authorization Network, we have heard about large institutions with familiar names that are openly flouting state authorization laws. We have followed major state universities that have deceived and failed students by enrolling them in academic programs in licensure fields when the students could not practice in their states of residence. Small non-profit institutions can close just as for-profits can; Burlington College did so last week.

Are transgressions by public and non-profit institutions less common? Probably. But how do you know who is going to become a bank robber until they rob a bank? The regulations should apply equally to all.

Myth #5:  The institution’s home state is the only state involved in the resolution of a student’s complaint.

Fact:  The SARA Policies and Standards, Section 4, Subsection 2, provides the complaint process for SARA institutions. The process is more thorough than the myth indicates. A student may appeal the decision arising from an institution’s own complaint resolution procedure to the SARA portal agency (the agency handling SARA matters) in the home state of the institution. That agency will then notify the portal agency for the state in which the student is located. The complaint will be resolved through the complaint resolution process of the institution’s home state, but the states will work together to identify bad actors. Additionally, complaints are reported to and reviewed by the regional compact, which provides oversight to ensure the state is abiding by SARA standards. This streamlines complaint resolution and keeps the resolution within the laws of the state under which the institution applied to participate in SARA.

Working together allows SARA to connect the dots & recognize patterns that would likely not be visible in a single state. Photo Credit: Camilla Nilsson

Additionally, this process allows NC-SARA to track any emerging patterns in student complaints and take swift investigative action if necessary. This is important because state attorneys general may be slower to act when a single state sees too few bad actors and too little evidence of bad actions. SARA publishes on its website a quarterly list of complaints against colleges that have been appealed to the SARA portal agencies; most states do not publish such lists.

Opinion: There is strength in states working together to identify and resolve student complaints and correct offending institutions.

Myth #6:  Student complaints can be resolved only by the laws and agencies of the institution’s home state.

Fact:  The SARA Policies and Standards, Section 4, Subsection 2(g), provides that nothing in the SARA Policies and Standards precludes a state attorney general from pursuing misbehaving institutions that break state consumer protection laws. Additionally, a violation of a federal regulation (such as one that rises to the level of a federal misrepresentation action) remains under the jurisdiction of federal authorities, who may pursue actions against the institution.

Myth #7:  SARA requires student complaints be resolved by the institution. This is the same method that for-profit colleges use to hide complaints.

Fact: SARA follows a practice commonly used by states throughout the country that encourage students to exhaust local options before appealing to the next level. There is no requirement that the complaint remain at the institution. If an institution is stalling or not dealing with the student’s complaint and the institutional complaint process is not yet complete, that student still has the option to appeal to the appropriate SARA portal agency.

According to Section 4 (Consumer Protection) subsection 1 of the SARA Policies and Standards document: “Initial responsibility for the investigation and resolution of complaints resides with the institution against which the complaint is made. Further consideration and resolution, if necessary, is the responsibility of the SARA portal agency, and other responsible agencies of the institution’s home state (see the following section: Complaint Resolution Processes).” The student is expected to begin with the institution, but that is not the end of the student’s options.

Opinion: Again, the critics are confusing this provision with the actions of several for-profit institutions, which require students to sign mandatory arbitration agreements that foreclose their external routes to seek redress. Under pressure from the Department of Education and others, two for-profit universities recently decided to remove arbitration requirements from their enrollment agreements. Even if the for-profit institution requires mandatory arbitration, to remain a SARA member, the institution has to allow the student to use the SARA complaint process.

Myth #8: The purpose of SARA is to make it easier for the institution.

Fact: According to the Operational Principles of SARA, found in Section 2 of the Unified Agreement: “…the purposes of this Agreement are to:

  • Address key issues associated with appropriate government oversight, consumer protection, and educational quality of distance education offered by U.S. institutions.
  • Address the costs and inefficiencies faced by postsecondary institutions in complying with multiple (and often inconsistent) state laws and regulations as they seek to provide high-quality educational opportunities to students in multiple state jurisdictions.”

Opinion: Many institutional personnel also think that SARA exists merely to make life easier for them. The primary purposes are listed in the first bullet. Without meeting the regulatory requirements outlined there, any institutional benefits are not worth it.

Myth #9:  SARA won’t provide enough oversight to protect students from the bad practices of an institution like Trump University.

Fact: Trump University was a non-accredited and non-degree conferring institution that would never have been eligible to become a SARA institution. Per Section 3.2 of the Unified Agreement, an institution must have the following characteristics to be eligible to participate in SARA:

  1. Location: The institution is located in the United States, its territories, districts or Indian reservations.
  2. Identity: The institution is a college, university or other postsecondary institution (or collection thereof) that operates as a single entity and which has an institutional identification (OPEID) from the U.S. Department of Education.  This includes public, non-profit private and for-profit institutions.
  3. Degree-granting: The institution is authorized to offer postsecondary degrees at the associate level or above.
  4. Accredited: The institution is accredited as a single entity by an accreditation agency that is federally recognized and which has a formal recognition to accredit distance-education programs.
As of the publication of this blog post (May 25, 2016), 37 states have been accepted into SARA; not yet joined are California, Connecticut, Delaware, the District of Columbia, Florida, Kentucky, Massachusetts, New York, North Carolina, Pennsylvania, South Carolina, Utah, and Wisconsin. More to come soon.

Myth #10:  Institutions will shop for the lowest-regulated state or use back door acquisitions.

Facts: According to the Roles and Responsibilities of Participating States, found in Section 5 of the Unified Agreement, each state joining SARA must agree that it has the capacity to perform several tasks, including:

  • “It has adequate processes and capacity to act on formal complaints…”
  • Demonstrate that consumers have adequate access to complaint processes.
  • Ability to document complaints received, actions taken, and resolution outcomes.
  • Notify institutions and (if appropriate) accrediting agencies of complaints filed.
  • “It has processes for conveying to designated SARA entities in other states any information regarding complaints against institutions operating within the state under the terms of this agreement, but which are domiciled in another SARA state.”
  • “It has clear and well-documented policies for addressing catastrophic events.”

Opinion: The purpose of SARA is to establish a common baseline for regulation of interstate activity. It does an institution no good to shop for the lowest-regulated state, as the bar is set at the same height in all states. If a state is somehow shirking its duties, SARA gives other states leverage to pressure it to meet its responsibilities.

Myth #11:  SARA will require institutions to accept transfer credits from other colleges.

Facts:  Transfer is not a part of this reciprocity agreement. As stated in the opening paragraph of the SARA Policies and Standards, SARA is an agreement “that establishes comparable national standards for interstate offering of postsecondary distance-education courses and programs.” The focus is on the activities an institution offers in other states to its students.

Myth #12:  There is a better reciprocity option through the “Interstate Distance Education Reciprocity Agreement” between Connecticut and Massachusetts.

Fact:  Although a reciprocity model was offered by a critic of SARA in a recent letter to the New York State Commissioner of Education, this model does not appear to exist. Research of legislation in each of these states, review of each state’s higher education websites, review of the State Higher Education Executive Officers Association (SHEEO) surveys maintained by each state’s higher education agency, consultation with institutions in each state, and a direct request to the Connecticut Office of Higher Education all led us to the same conclusion: there is no Interstate Distance Education Reciprocity Agreement between Connecticut and Massachusetts.

There have been papers published by SARA critics suggesting what a “good” reciprocity agreement might include. They always end with allowing each state to essentially take any actions it wishes to take regarding an out-of-state institution. That’s not reciprocity. That’s the current state of affairs that so poorly serves students and institutions alike.

Meanwhile, Connecticut’s legislature has just passed legislation allowing it to join SARA.

In Conclusion…

The sharing of critical analysis of the State Authorization Reciprocity Agreement is healthy; it gives states, institutions, lawmakers, and citizens the ability to assess the pros and cons of this new process. However, publicly reporting an analysis that fails to show completed research or an understanding of the language of the Agreement is a disservice to those same states, institutions, lawmakers, and citizens. Please review the Unified SARA Agreement and the State Authorization Reciprocity Agreement Policies and Standards before making any judgments about the viability of SARA. We hope that the presentation of the publicly reported “myths” and corresponding facts will aid your ability to make an appropriate judgment.

In Disclosure…

In this era of ad hominem attacks we have focused on the statements that we feel are erroneous or misleading and not the personalities involved. In case you wonder why we care about these issues, here is a brief background about the two of us:

  • Russ Poulin served on the original drafting committee and the WICHE committee that developed the language that became SARA. In every discussion in those meetings, the participants took their student protection responsibilities very seriously. He developed the WCET State Authorization Network to help institutional personnel navigate and comply with each state’s regulations.
  • Cheryl Dowd is a former institutional compliance officer who now directs the WCET State Authorization Network, which serves more than 75 members encompassing more than 700 colleges and universities.

While others in higher education circles have merely railed against any type of regulation, we have been consistent in trying to find a balance that meets the needs of parties, regulators, institutions, and consumers.


Russell Poulin
Director, Policy & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
wcet.wiche.edu
Twitter:  RussPoulin


Cheryl Dowd
Director, WCET State Authorization Network
WCET – WICHE Cooperative for Educational Technologies
cdowd@wiche.edu


Supporting Students with Technology: Academic Advising in Higher Education

With the desire to support our learners, a number of colleges and universities are implementing technological methods and approaches for academic advising. Whether driven by campus change or technological necessity, we need ways to encourage advising programs to consider technology for both content and service delivery in advisee-centered approaches. By researching technological trends and challenges, conducting campus-wide assessments, and establishing strategic plans, higher education stakeholders can effectively integrate technology into student support practices to align with individual advising objectives and to further the goals of the institution.

Surveying Institutional Perceptions and Practices on Advising
It is a critical time to assess how these campus stakeholders are employing digital resources to scaffold learners beyond the course curriculum. To understand the impact technology has on student support and practice, NACADA: The Global Community for Academic Advising, specifically its Technology in Advising Commission, sponsors semi-regular surveys of the NACADA membership (e.g., 2002, 2007, and 2011). In 2013, a new survey instrument was designed to capture data, specifically to identify how higher education advising staff and senior administration employ technology to support their practices. A total of 990 respondents completed the survey; 65% identified as academic advisors/counselors. Other respondents’ roles on campus included advising administrator (22%) and faculty (4%).

Many Technologies Used in Advising, but the Most-Used Are Familiar Tools
Here are the key findings we thought were important to highlight from this study:

  • The top three advising technologies: desktop computers, campus storage networks, and Wi-Fi.
  • Differences in technologies for advising: Advisors utilize scanners (24%) and 23% said they used social networks (e.g. Twitter and Facebook) as advising tools. In contrast, most respondents thought their institution emphasized using learning management systems (46%) and laptops (40%) for advising.
  • The campus stakeholders respondents most often communicated with daily were students (89.88%) and other academic advisors/counselors (86.35%).
  • Respondents used technology less frequently to communicate with academic administrators (58.08%), faculty (47.22%), and student affairs administrators (37%).
  • Technology identified for daily advising use included e-mail (99%); face-to-face interactions (91%); locally installed word processors, spreadsheets, etc. (80%); phone (73%); and Facebook (30%).
  • Less frequently used technology for advising (< 2%) included: licensed video-conferencing (e.g. Adobe Connect, Wimba), retention software, photo-sharing websites (e.g. Flickr), podcasts, and social studying sites (e.g. OpenStudy).

Overall, we found the advising community uses technology to communicate with campus stakeholders across their institutions and to stay connected to professional peers outside the institution:

  • 70-90% think advising technology supports distributing information on campus, sharing knowledge, and maintaining connections within higher education.
  • 24% indicated that advising technology tools do not help with communication and student scheduling.
  • 80-92% believe advising technology helps them work faster and more efficiently, produce higher-quality work, store advising information, simplify academic advising administrative processes, and contribute positively to their academic advising role.

Technology Needs to be Location-Free, Build Rapport, and Use Current Systems
The open-ended responses highlighted interesting perspectives, challenges, and current practices as respondents described the “ideal technology in advising practice” for meeting student needs and supporting their advising functions. Here are a few central themes about their sentiments for advising technology on campus:

  • Student support needs to be intentional and integrated into current systems and technologies being used on campus.
  • Create opportunity and access for student support and academic advising regardless of physical location.
  • These technological tools and resources help to build an advising rapport, make connections, and support communication.
  • Technology in advising needs to support transparent knowledge sharing and degree completion information.
  • Advising approaches need to be implemented that support effective online and blended models for advising while retaining the human impact and influence.
  • Our institutions are not addressing the needs and challenges in advisor and learner preferences and/or practices for student support.
  • Digital resources and technologies now have the ability to capture the holistic view of the student learning experience, which is essential to enhance academic advising practices and institutional outcomes.

It is imperative that campus decisions about technology and learning also include design and delivery methods that are inclusive of academic advising needs. This research shows there is both a need and a desire to improve front-line advising and student support practices in higher education. Beyond soliciting input during technology purchasing and implementation, it will also be imperative for our institutions to consider how student support is organized and to assess current advising practices.

To integrate or update technology for advising, our institutions will also need to consider how they will provide additional support, training, and job aid resources to scaffold technology use for students, staff, and faculty. In an effort to expand this research and distribute this knowledge about higher education technology for advising, the researchers have shared the survey instrument, data, and white paper (also available on Academia.edu) from this study under a Creative Commons license.

Laura Pasquini, Ph.D.
University of North Texas
@laurapasquini


George Steele, Ph.D.
The Ohio State University
@gsteele1220

Reference:
Pasquini, L. A., & Steele, G. (2016). Technology in academic advising: Perceptions and practices in higher education. figshare. Retrieved from https://dx.doi.org/10.6084/m9.figshare.3053569.v7


Photo Credit: Pexels: https://www.pexels.com/photo/businesswomen-businesswoman-interview-meeting-70292/


#CountAllStudents and the Move from Graduation Rate to Outcomes Measures

As we make our way through the final few weeks of the traditional college graduation season, it makes me reflect on the flawed first-time, full-time federal graduation rate used by the Department of Education. There’s been news on this front in the last few weeks. Let me update you on how these stories might affect you.

The IPEDS Workshop
A few weeks ago I was fortunate enough to attend the 2016 IPEDS Coordinator Workshop and State Data Conference. Yes, it takes a certain type of data nerd to enjoy a room full of equally nerdy people. Wait! Wait! Don’t stop reading! Rest assured that I will focus on the highlights.

IPEDS is the Integrated Postsecondary Education Data System, which is a set of surveys conducted by the U.S. Department of Education and that every institution receiving federal funds is expected to complete. We use IPEDS data for our reports on distance education enrollments.

Thank you very much to the U.S. Department of Education for allowing me to participate. The meeting is focused on the IPEDS Coordinators (they help the institutions with questions they have) and a few others doing interesting things with data. I was pleased that the Department responded to a request from Demi Michelau, WICHE policy director, to include me.

New Outcomes Measures to Accompany the Much-maligned Graduation Rate
The IPEDS Graduation Rate has come under fire in recent years because the measure is based on only a small subset of students. The process begins with an institution identifying all of the students in the freshman class who are “full-time” (no part-timers need apply) and “first-time” (the first time that they have attended any college). [IPEDS definitions; choose F to find both first-time and full-time]

IPEDS data is used in the College Navigator tool aimed at prospective students.

You can see how this is troublesome for community colleges and other adult-serving institutions as only a small percentage of their enrollments fit this category. The increase in high school dual and concurrent enrollments also calls the usefulness of the measure into question. Likewise, students transferring into an institution have no home in this counting method.

In response, IPEDS recently started collecting data for a new Outcomes Measure (OM) to accompany, but not replace, the Graduation Rate. The first set of results should be released later this year. Instead of the one “full time, first time” cohort, the Outcomes Measure includes four cohorts:

  • Full time, first time,
  • Part time, first time,
  • Full time, not first time (not first-time means that they attended another college before coming to yours),
  • Part time, not first time.

Outcomes will be reported for both six and eight years after each cohort has entered a college. The six-year report will focus on those students who are awarded a certificate or degree in that time. The eight-year report will report completions and will additionally report: students still enrolled at the original reporting institution, students who are now enrolled at another institution, and students whose current status is unknown.
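For the data nerds among us, here is a minimal sketch (ours, not the Department’s actual reporting logic) of how the four cohorts and the two reporting windows fit together:

    # A minimal sketch (ours, not the Department's reporting logic) showing
    # that the four OM cohorts are simply the cross of two enrollment flags.
    from itertools import product

    def om_cohort(full_time: bool, first_time: bool) -> str:
        attendance = "Full time" if full_time else "Part time"
        history = "first time" if first_time else "not first time"
        return f"{attendance}, {history}"

    # Print the four cohort labels.
    for ft, first in product([True, False], repeat=2):
        print(om_cohort(ft, first))

    # Each cohort is reported at two points after entry: the 6-year report
    # counts completions only; the 8-year report adds still-enrolled,
    # enrolled-elsewhere, and status-unknown counts.
    REPORTING_WINDOWS_YEARS = (6, 8)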

The data will be available in the downloadable datasets provided by IPEDS. It will also be included in the College Navigator tool, which is aimed at people engaged in the college search.

In the reporting, the Outcomes Measures will report the “total number of students who did not receive an award from your institution” from each cohort. An attendee at the IPEDS Workshop asked if there is any way to report students who completed elsewhere. The answer is no because they did not want to double-count graduates at both the original institution and the institution that granted the degree. They also did not want to mandate use of the National Student Clearinghouse, which tracks such activity.

In my opinion, adding the Outcomes Measure is a fantastic first step. It is finally a recognition that the traditional full time, first time group of students is an ever-smaller share of overall higher education enrollments. I do worry that they will collect this information and still highlight the “students who did not receive an award from your institution” number. A suggested remedy was to combine all students who obtain a certificate or degree with those who transfer to another institution. That’s a nice suggestion, but there is no indication that they will implement it.

Graduation rate results from the College Navigator

The Student Achievement Measure Urges Us to “#CountAllStudents”
In the last few weeks, proponents of the Student Achievement Measure (SAM) have launched a media campaign in conjunction with spring graduations. They urge the Department to “Count All Students” using an alternative graduation measure jointly developed by several colleges.

It’s an effective campaign highlighted by stories of real students, who are not counted as graduates by any institution in the current Graduation Rate. Among those not being counted is President Obama. He transferred from Occidental College to earn a bachelor’s degree at Columbia College, Columbia University. I also don’t count as I attended three institutions before earning my undergraduate degree.

The Department of Education highlights the differences between the Outcomes Measure (OM) and SAM in an FAQ:

“OM is similar to SAM in that both have the same goal of measuring postsecondary success and progression of undergraduate students. However, the methodologies used to measure the outcomes are different. First, OM is part of the mandatory IPEDS collection compared to SAM’s voluntary participation. Second, OM has 4 cohorts and SAM has up to 7 cohorts. Third, SAM captures student progress and success at the award level (bachelor’s and associate’s/certificates awards), whereas OM does not make a distinction between award levels. Lastly, the time points for SAM varies depending on the cohorts compared to OM’s standard use of 6-year and 8-year time points across all cohorts.”

SAM’s #CountAllStudents promotion asks, “Why Aren’t All Students Counted in the Federal Graduation Rate?” and uses real stories of graduates who are not included in the graduation rate.

Not Everyone Thinks Change is Necessary

Meanwhile, one former Department of Education official discounted the call for improved graduation rates by referring to an article he wrote a few years ago, in which he said, “Officials at colleges with low graduation rates have for years defended their rates by falsely asserting, perhaps mistakenly believing, that their part-time students who eventually graduate are ‘counted as failures.’” There might be a few participating institutions for which that is true, but according to an Inside Higher Ed article, roughly three-quarters of the nearly 600 institutions supporting SAM are four-year, public institutions. Even though the rest of his narrative explains how the rate is calculated, it shows little understanding of what actually happens in community colleges, urban universities, or other adult-serving colleges.

The Proof is in the Data…and the Display
Based on what I heard at the IPEDS Workshop, the Department did not seem ready to entertain many changes to the new Outcomes Measure. It will be good to see the real data and how it is displayed. Those factors will help us determine the usefulness of the data.

For our members, I think we need to watch this closely. Since the Outcomes Measure merely supplements the Graduation Rate, will this cause more confusion for your potential students? While we will have finer data, will students want to delve into the different cohorts and figure out what it all means? There is a heavy burden in how this is displayed by the Department. Additionally, an increasing number of third-party college rating sites (ugh) use the Graduation Rate with no explanation of its limitations. Will they use, ignore, or botch the Outcomes Measures?

Finally, I must acknowledge that the IPEDS staff have a thankless task in trying to create understandable and comparable data when colleges can be so different. And colleges OF ALL TYPES are constantly looking for ways to game the numbers.

Understandably, IPEDS staff want to be careful about changes to survey procedures. Since this data release will be the first round of Outcomes Measures, I do hope that they will be open to reasonable improvements.

Although in the current political climate, I’ll admit that I am hard-pressed to say what the definition of “reasonable” is any more.

Russell Poulin
Director, Policy & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
wcet.wiche.edu
Twitter:  RussPoulin


One Graduate’s Elusive Achievement: Thanks to Competency-Based Education

Learn how competency-based education (CBE) helped one Texan (an adult, a veteran, fully employed, and a grandparent) achieve another important title: college graduate. Thank you to Judith Sebesta, Institute for Competency-Based Education, Texas A&M University-Commerce, for contributing this inspiring story.

Russ Poulin

In his harangue against the complexities of “modern,” early-twentieth century American society, The Simple Life, Charles Wagner wrote, “Education, like the mass of our age’s inventions, is after all, only a tool; everything depends upon the workman who uses it” (p. 10). Scott Noffsinger is an exemplary student/“workman,” showing just how effective a tool education – and, more specifically, competency-based education – can be.

Scott Noffsinger had a sackful of credits and a lifetime’s worth of knowledge. CBE helped him transform those assets into a bachelor’s degree.

An Unfinished College Career
Several years ago Noffsinger, a 52-year-old veteran and technical team leader for AT&T, was one of over 3 million Texans ages 24-65 with some college and no degree. The story of his circuitous path through higher education is not unusual. He graduated from high school in 1981 and entered the United States Marine Corps six months later, spending eight years in the USMC Airwing as an Aviation Maintenance Data Analyst. After separating in December 1989, Noffsinger took his first college classes the next spring at Forsyth County Community College in North Carolina, completing nine courses there.

He then moved out of state due to work requirements, not starting classes again until 2001. However, after completing another six classes, he had to stop for financial reasons, as is not uncommon.

Family Gave a Push Toward Completing a Degree
But when three of Noffsinger’s four children became traditional-aged college students, they asked him from where he had graduated. “I couldn’t answer that. So they encouraged me to get my degree.”

With his children’s and wife’s support, the father, grandfather of 15, veteran, and long-time employee of AT&T began to explore pathways to completion. Noffsinger saw an article about the Texas Affordable Baccalaureate (TAB) Program at Texas A&M University-Commerce (TAMU-C), a competency-based BAAS in Organizational Leadership.

Designed in direct response to then-Governor Rick Perry’s 2011 challenge to all Texas institutions of higher education to create a $10,000 bachelor’s degree, the TAB program was created collaboratively by TAMU-C, South Texas College (STC), the Texas Higher Education Coordinating Board, and the College for All Texans Foundation, with funding from an EDUCAUSE Next Generation Learning Challenges grant. Both TAMU-C and STC admitted their first class of students in January 2014.

The CBE BAAS in Organizational Leadership at TAMU-C is a fully online degree program consisting of 99 competencies (the equivalent of 120 semester credit hours) defined by both faculty and industry. It is delivered via a subscription model: students can attempt as many competencies as they are able in each seven-week term, with each term costing $750 for tuition and fees.

Judith Sebesta sings the praises of competency-based education.

Making the Leap with CBE
In the spring of 2014 while recovering from back surgery, Noffsinger entered the program. He completed 27 courses over 14 months, sometimes finishing as many as five courses in a term (students initially enroll in two courses per seven-week term, considered a full load, and only enroll in additional courses upon successful completion of the two).

Advising from an academic coach plays a key role:  “I was very nervous about [returning to school],” said Noffsinger. “But I would email the adviser and say ‘I don’t think I can do this.’ She would talk me through and encourage me to continue on, and if I really had a problem, she was there for me.”

Noffsinger’s experience is a perfect example of the opportunity that CBE presents both to personalize learning and to accelerate time-to-degree, thus increasing the odds of success and lowering costs. It makes little sense for someone who spent eight years in the USMC analyzing aviation data to spend 14 weeks in a statistics course. According to Noffsinger, “I took an advanced statistics course in 36 hours. I started on the first day of the term. The next day I was taking the final exam.” Conversely, CBE afforded him the opportunity to slow down in order to ensure mastery of less familiar material. Art appreciation, a course he took to fulfill general education requirements, required significantly more time for the veteran to complete.

Success and Completion – Affordably
“When it comes to this program, I can’t say anything but good,” said Noffsinger, who completed his degree — for less than $7,000 — in August 2015. “Fifteen days after that, I started my master’s degree.”
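The arithmetic checks out, assuming roughly back-to-back enrollment: the stretch from spring 2014 to August 2015 covers about eight or nine seven-week terms, and at $750 per term that comes to roughly $6,000 to $6,750 in tuition and fees, consistent with the sub-$7,000 total.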

His success is not atypical for students in CBE programs. His accomplishment, coupled with the increasing numbers of students like him, provides exemplary evidence to support Mark Leuba’s claim, in “Competency-Based Education: Technology Challenges and Opportunities,” that CBE is “perhaps the model of education for the 21st century.”

Judith A. Sebesta
Executive Director
Institute for Competency-Based Education
Texas A&M University-Commerce


References

Leuba, M. (2015, October 12). Competency-based education: Technology challenges and opportunities. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/10/competency-based-education-technology-challenges-and-opportunities

Wagner, C. (1901). The Simple Life. New York: McClure, Phillips & Co. Retrieved from https://archive.org/details/simplelife00wagngoog.



A Conversation About Today’s Classroom (Three Students, Three Modalities)

We called for help in getting the voice of students into our blog posts. I want to thank Holly Jean Greene (University of Tennessee, Knoxville) for preparing this video for us with real, live students sharing their experiences. Thank you!
Russ Poulin

I was curious: How would students describe the classroom of today? As an educator who teaches traditional, hybrid, and online courses, I wondered: Is one format more conducive to learning than another? What are the pros and cons of each? Most importantly, what do students need from us to succeed?

My curiosity about online, hybrid, and flipped instruction was ignited last summer when I attended a program offered by the Teaching Learning Center (TLC) on the campus of UT, Knoxville. According to TLC’s website, a flipped course is “a pedagogical model in which the typical lecture and homework elements of a course are reversed,” and a hybrid course is one in which “33–79% of the instruction is delivered through electronic means and in-class seat time is reduced.”

With the help of many, my less-than-six-minute video seeks to satisfy my curiosity in an amusing yet instructive form, and one of its greatest takeaways is as “old school” as it comes.

Holly Jean Greene, MBA
Lecturer/Online Course Development Manager
hgreene@utk.edu
University of Tennessee, Knoxville


Call to Action: EVERYONE Should Respond to Teacher Prep Distance Ed Regs

If you have not paid attention to the proposed “Teacher Prep” regulations, it’s time to do so. Once again “distance education” is being treated differently by the U.S. Department of Education.

Certainly, institutions with distance education programs that prepare students to become certified K-12 teachers should respond. In talks with Deborah Koolbeck, Director of Government Relations for the American Association of Colleges for Teacher Education (AACTE), I learned that she is encouraging all AACTE member colleges to reply, even if they do not have distance education programs. I make a similar recommendation to all colleges offering distance education, whether you have Teacher Prep or not. You should reply.

I’ll tell you why you should comment and I’ll give you some suggestions on how to reply. First some background…

An Extremely Short History of the Teacher Prep Regulations
The Department of Education was not happy with how states were grading colleges and alternative programs that prepared people to be K-12 teachers. One of the criteria for institutional eligibility to award TEACH Grants to students is receiving a passing grade from the state. Using the existing measures, States rarely failed an institution, and few States were paying any attention to students learning via distance education.

The “Original NPRM” on Teacher Prep
In December of 2014, the Department of Education released for comment a Notice of Proposed Rule Making (NPRM) seeking to add more teeth to teacher education oversight. For purposes of this discussion, I’ll call that document the “Original NPRM.” Some highlights of the regulations proposed at that time were:

  • The new unit of measure for institutions would be by “program” and not the institution. For example, a college might have bachelor’s degrees in elementary education, secondary education, and special education that all lead to teacher licensure. Each of those would be considered a “program.”
  • The state would be expected to review distance education programs from other states serving students within the state.
  • The state would be expected to report on each program on four indicators:
    1. Student Learning Outcomes. Measure growth for students in classes taught by new teachers.
    2. Employment Outcomes. Measures of teacher placement rate (with a breakout for those in high-need schools) and teacher retention (with a breakout for those in high-need schools).
    3. Survey Outcomes. Survey new teachers and their employers to see if the program prepared the new teacher to succeed.
    4. Accreditation or Alternative State Approval. Is the program accredited, or does it meet other criteria for alternative programs?
  • Based on those criteria, the state will give each program one of the following four ratings: “low-performing,” “at-risk,” “effective,” or “exceptional.”
  • The methods for measuring each indicator and how those measures map to the ratings are left to each state to devise. Sound familiar? This could be as confusing as state authorization.

Shortly after the Original NPRM was released, I provided a summary of what was being proposed. In partnership with the Online Learning Consortium and the University Professional and Continuing Education Association, WCET submitted written comments focused on the distance education shortcomings of what was proposed and provided some alternative recommendations.

The “Supplemental NPRM” on Teacher Prep and Distance Education
In response to the comments received on the “Original NPRM,” the Department did something it has never done before by creating a “Supplemental NPRM” with new regulations and questions. Issued on April 1 (yes, I know), the new NPRM focuses on just one issue: distance education. The deadline for comments is May 2.

Our previous comment had some impact. Be careful what you ask for, you might just get it.

Why Should You Comment?
In my opinion, the Supplemental NPRM falls short in some significant ways:

  • It discriminates against distance education. Even if you don’t have a Teacher Prep program, you should object to another case of distance education being treated differently. I am NOT suggesting that we dodge accountability; it should just be conducted in the proper context. Even if you comment only on this one point, volume counts! Let’s stop this precedent.
  • It imposes an unfair penalty. States may each use their own measures and different cohort methods to assess programs. If a program rates as “low-performing” or “at-risk” in as few as ONE state, it loses the right to offer TEACH grants in ANY state.
  • The estimates of burden on states and institutions are ridiculously low. To comply, it is estimated that states would incur less than $5,000 in additional annual costs and institutions would incur NO (yes, zero) additional costs.

Who Should Reply?
It would be great to have institutions and/or colleges of education reply. You would need to navigate the proper government relations channels at your institution to do so. This may be difficult given the May 2 deadline. This is why I gave you a heads-up in my blog post that came out the same day the Supplemental NPRM was issued.

You may reply as an individual. You can’t use your institution or organization letterhead, but you can give your name, title, and employer. It might be good to reiterate that you are not responding in your official capacity for the institution.

How Do I Reply?
Directions on how to reply appear in the “Addresses” section of the Supplemental NPRM. You may: “Submit your comments through the Federal eRulemaking Portal or via postal mail, commercial delivery, or hand delivery.” If you plan to use the Portal, give yourself some time to figure it out or get help from whoever usually does this on your campus.

What Should I Say?
Deborah Koolbeck from AACTE created a great template for a letter addressed to the Secretary of the Department of Education. You should:

  • Personalize it as form letters get less attention. As Deborah suggests, briefly tell your story.
  • Add your own comments. I’ll give you some observations of my own below. Deborah included AACTE’s observations in her letter. Focus on what would have the greatest impact on you and your students. Say why what is proposed would help or hurt you. Discard the rest.
  • Be respectful. We can be better than the presidential candidates.
  • Make positive or helpful suggestions. Personally, I hate the responses that object to everything without supplying at least some helpful alternatives. This helps to address the sense that we are merely objecting to any type of oversight or anything that inconveniences us. I’m for regulations that serve a purpose and for which the cure is not worse than the disease.

Another example is the letter that we submitted for the Original NPRM. As you look at this blog post and the letter, you’ll see I’m all about bullets, highlighting, and bolding. They help to drive home the main points, as some people merely scan the letter. You want to make sure that your main points are perceived as your main points.

What Points Should I Make?
Here are some of my observations about the Supplemental NPRM. There are more ideas than you should put in a letter. Pick those that you like. Put them in your own words. Add your own observations.

Do Not Discriminate By Mode of Instruction
Do not discriminate against distance education or any mode of instruction by creating separate criteria or measures.

There should not be a distinction between distance education, face-to-face instruction, blended learning, or any other mode of learning. The college of education or other entity (e.g., Troops to Teachers, Teach for America, Boettcher Teacher Residency Program or other alternative certification path) is presenting to the State that their graduates are teacher candidates with the requisite skills to be an effective teacher. The State’s interest is not in how that candidate acquired those skills, but if the candidate possesses those skills and is able to apply them effectively in the classroom.

The “Distance Education” Definition Creates More Problems than It Solves
From the Supplemental NPRM, the Department plans to use the “distance education” definition found in 34 CFR 600.2. That definition focuses on “instruction to students who are separated from the instructor.” The Department has applied the definition as meaning that the course of instruction is taught entirely (or nearly entirely) at a distance. This creates a large loophole for students enrolled in programs that are neither fully “distance” nor fully “brick and mortar”; they would not be covered by the requirements of the Supplemental NPRM as written (a short sketch after the list makes the gap concrete). Examples include:

  • “Blended” teacher prep programs with part of the instruction at a “distance” and part of the instruction at the home campus.
  • “Blended” teacher prep programs with part of the instruction at a “distance” and part of the instruction is at alternative sites, such as local K-12 schools or rented locations.
  • Competency-based education teacher prep programs that use a variety of online and face-to-face activities and courses for students to obtain their skills.
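To make the loophole concrete, here is a minimal sketch in code (our illustration only; the 95% cutoff is a hypothetical stand-in for “entirely or nearly entirely,” not a number from the regulation):

    # Illustrative sketch only; not regulatory language.
    def covered_by_supplemental_nprm(fraction_at_distance: float) -> bool:
        """Under a binary definition, a program counts as 'distance
        education' only when essentially all instruction is at a distance."""
        return fraction_at_distance >= 0.95  # hypothetical cutoff

    print(covered_by_supplemental_nprm(1.00))  # True: fully online program
    print(covered_by_supplemental_nprm(0.50))  # False: blended program escapes review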

If the Department requests that States begin to develop different measures for how a teacher candidate obtains teaching skills, then those alternative measures should not be limited to “distance education.”  It would logically follow that alternative measures be developed for other alternative certification pathways, such as Troops for Teachers or Teach for America.

The need for multiple measures can be simplified by treating all teacher education candidates the same regardless of how they obtained their skills. This would avoid the confusion of creating additional measures for distance education or other modes that are difficult to distinguish from each other.

Support for removing the distinction among modes of instruction can be found in the actions of some accrediting agencies. The Higher Learning Commission and AACSB (accrediting agency for business schools) have removed distinctions for distance education. Those agencies expect institutions to provide the same level of academic and student support quality regardless of mode of instruction. Likewise, States should hold all teacher candidates to the same standard.

The Proposed Change to Certification Should be Expanded
The question vexing the Department is one of geography, not mode of instruction. The Original NPRM was clearly written with the traditional model of instruction in mind. Students who moved out of state were lost to the goals of “reporting and determining the teacher preparation program’s level of overall performance.” The Supplemental NPRM solved the problem by asking the State to review all distance education teaching candidates in the state in which they are certified. The focus on place of certification is a much-needed improvement provided by the Supplemental NPRM. States reporting on newly-certified teachers within their borders are reinforced in their right to review these teachers regardless of the mode of instruction used to prepare them.

Allowing a Single State “Veto” for TEACH Grants is Unfair.
The current proposal in the Supplemental NPRM is that if a teacher preparation program is found to be “low-performing” or “at-risk” in a single State for two years or by two different states over a two-year period, then “no student in any State enrolled in that distance education program would be able to receive a Teach Grant” in the subsequent year.

As envisioned, penalizing students in all states for a failure in one state is grossly unfair. Here are the reasons why:

  • Under the Supplemental NPRM, every State is allowed to create its own measures to determine if a program is “low-performing” or “at-risk.” The Department acknowledges the right of every State to set its own standards, yet the proposal would allow a review outcome in one state to overrule TEACH eligibility in all other states. Given the vast differences across the country in populations and geography, criteria that are appropriate in one State might not make sense in other States. Therefore, a criterion devised for conditions in one State may inadvertently determine eligibility in every other State.
  • Given WCET’s extensive experience with State authorization, there are (unfortunately) a few States that value protectionism over quality. This Supplemental NPRM would give those states power beyond their own borders and would embolden them to remain protectionist. If the goal is to assess the quality of teacher prep programs, then all programs should use the same measures.
  • Distance education programs might have few students in a State and might become victims of an unusually unrepresentative sample in a particular State.
  • The Original NPRM allows for options in aggregation methods if there are fewer than 25 students in a program in a State. This could lead to variations in how the measures are applied. For example, using a multi-year sample in a state could result in one unusually low-performing class of teachers negatively affecting a program. Beyond these suggested aggregation methods, there is also a hint that states could use as few students as they wish in a cohort as long as they are not individually identifiable. Decisions could be made on a very small sample, and distance programs would probably have very low participation in most states.

Differing Measures will Confuse, Not Inform, the Consumer
The following statement appeared in our letter responding to the Original NPRM regarding consumers. In this context, we define “consumers” as students shopping for Teacher Prep programs or policymakers assessing such programs:

Confusion for the consumer. It appears that the Department plans to post the results on its website using the specified grading categories for each program. It is likely that the consumer will assume that the measures used in each State will be comparable when they will likely vary greatly.

Presumably, a program's scores would be published for each state in which it had students certified. Wildly varying scores will confuse consumers, and an aggregate score is not possible since each State would use its own scoring methods.

Since that letter was submitted, the Department released the "College Scorecard" to inform students about colleges. The Scorecard's use of "graduation rate" measures penalizes colleges that serve few first-time, full-time students. It also publishes "average annual cost" data that is based on net cost after financial aid is applied and counts only students receiving aid. These measures confuse students and underline how even standardized data can be deceptive if used improperly. If one of the purposes of these regulations is to better inform consumers and leaders about the quality of these programs, then comparable measures are needed across states, without differentiation by mode of instruction.
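
To see how the "average annual cost" figure can mislead, consider a toy calculation. All numbers are hypothetical, and this is a simplified sketch of the reported methodology, not the Scorecard's exact formula.

    # Hypothetical college: 1,000 students, $30,000 sticker price,
    # 400 students receive an average of $12,000 in financial aid.
    students = 1_000
    sticker_price = 30_000
    aided_students = 400
    average_aid = 12_000

    # Scorecard-style figure: net cost averaged over aided students only.
    cost_aided_only = sticker_price - average_aid            # $18,000

    # Average cost across ALL students, aided or not.
    total_paid = students * sticker_price - aided_students * average_aid
    cost_all_students = total_paid / students                # $25,200

    print(f"published 'average annual cost': ${cost_aided_only:,}")
    print(f"average cost across all students: ${cost_all_students:,.0f}")

The published figure lands $7,200 below the campus-wide average cost, and $12,000 below what a full-price student actually pays, which is exactly the kind of misreading described above.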

Addressing Inconsistencies Among States
As for inconsistencies in measures for modes of instruction, States should use the same standards for all modes of instruction.

As for inconsistencies in measures among States, we repeat a recommendation from our letter responding to the Original NPRM. The Department has powers of persuasion beyond regulations. A joint solution would have greater buy-in and power:

Encourage and Incentivize States to Work Cooperatively on Measures. Given the burden of implementing this regulation, the States would very much benefit from working in collaboration to develop the measures. The processes would not be mandated on the States, but those choosing to participate could develop a more robust and defensible system in a fraction of the time that it would take to do it alone. If federal funds are not available, there may be grant support to assure that quality measures are developed. The State Authorization Reciprocity Agreement (SARA) is an excellent example of states working collaboratively to resolve issues of quality assurance, consumer protection, and oversight of colleges.

The Cost Estimate for Institutions is Not Supportable
The first paragraph of the "Number of Distance Education Programs" section of the Supplemental NPRM reads: "it is clear that at least some States have been reporting on distance education programs…" Focusing on the "at least some" phrase, that means not all States were reporting on distance education programs. In fact, in our letter responding to the Original NPRM, we noted that we had great difficulty finding ANY state reporting on a distance education program.

The Supplemental NPRM's cost estimate later claims that there is no increased burden on institutions because they are already reporting. This is logically inconsistent: the same document acknowledges that only "some" States were reporting on these programs, yet assumes that all institutions already report. The inconsistency is especially glaring given that the entire purpose of the Supplemental NPRM is to propose a whole new structure for reviewing programs with additional indicators.

The Department might argue that it is the States' responsibility to gather data for these measures. That is an unfunded mandate on the states. The Supplemental NPRM estimates that it will cost states, on average, less than $5,000 each per year to implement the new distance education requirements. It is easy to imagine that States will require institutions to collect and report the data, thus transferring even more costs to the institutions. This is especially likely for out-of-state institutions, as there is no real benefit to the State in assisting those programs.

Is this Regulation Meeting Its Stated Purpose?
Is there a better way that you can recommend to meet the Department’s goal of identifying low-performing producers of K-12 teachers?

“Raise the Barn”
WCET is a cooperative. In the most traditional, historical sense, cooperative organizations band together to make what needs to happen, happen.  When a community member needs a barn, a barn is raised.  Right now, the distance education community needs you to help raise the barn to have our collective voice heard by the Department.  All of our voices are stronger than one of our voices.

Russ

Russell Poulin
Director, Policy & Analysis
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
wcet.wiche.edu
Twitter:  RussPoulin

Categories
Uncategorized

Investigating IPEDS Distance Education Data Reporting: Progress Has Been Made

WCET has analyzed the Department of Education's IPEDS (Integrated Postsecondary Education Data System) data since the initial release of distance education data for Fall 2012. Most recently, we produced a comprehensive report, WCET Distance Education Enrollment Report 2016, that analyzes trends in the distance education data reported between 2012 and 2014.

In the past, we've alerted you to problems that some institutions had in reporting distance education enrollments to IPEDS. We're glad to report that many have addressed those problems. A few institutions still maintain their own practices or remain a complete mystery. Here's what we found in revisiting the same problem children from the original IPEDS distance education enrollment reports.

Institutional personnel seem less confused about IPEDS distance education enrollment definitions and procedures.

Examining Possible Distance Ed Undercounts Since 2012 – Many Students Missing

In September 2014, we wrote a blog post that explored discrepancies in distance education enrollments reported in the Fall 2012 IPEDS data. After being tipped off by some WCET members about problems they had in submitting data, we reviewed enrollment reports and identified 21 institutions with enrollment responses that seemed unusually low. We contacted representatives of those institutions to ask whether the colleges had reported all of their for-credit distance education enrollments for Fall 2012. If they had not, we asked about the size of the undercount and the reasons why enrollments were not reported.

The reasons for the undercounts fell into two main categories.

  1. Distance Education Definition. Some institutions were confused by the "Distance Education" definition provided by IPEDS, and others shared that they use their own definition of distance education rather than IPEDS' definition. The definitional issues were explored in detail in the September 2014 blog post. Institutions are expected to use different definitions for accrediting, state, IPEDS, and other purposes; some said that they simply report using the state definition for all purposes.
  2. Not Counting Students in Self-Support Programs. There were challenges with the systems that count enrollment: some self-support programs operate with a separate registration system. This issue is characterized in the words of an administrator who asked not to be named: "The enrollments that do not pass through our ERP system are blind to us." As a result, we found two large public university systems that had never reported a single student enrolled in a for-credit distance education program on any IPEDS survey, ever. The problem was larger than just the distance education enrollments, as these students were not counted in any other IPEDS report either. This issue encompassed many thousands of students each year.

As a result, we learned that the IPEDS numbers undercounted both distance education and overall higher education enrollments by tens, if not hundreds, of thousands of students.

Has Reporting Improved Over the Last Three Years?

With three years of IPEDS data now available for distance education, it seemed like a good time to revisit the challenges of early distance education data reporting. We wanted to see if the challenges of accurately reporting distance education enrollments for the colleges we investigated in 2014 persisted.

Improved Their Reporting
There has been improvement for some institutions. A large public institution in the southwest that offers multiple start dates, and that had told us it could not accurately report fall enrollment to IPEDS with its data management systems, reported over a 70% increase in "Exclusively" Distance Education (DE) enrollments between Fall 2012 and Fall 2014. Over the same period, overall enrollment on the campus grew by 20%. This suggests that the staff responsible for IPEDS reporting have found a way to collect and report data for the multiple start dates each fall.

A state university in the south that reported no "Exclusively" DE enrollments in 2012 reported 758 such enrollments in Fall 2014, at a time when the institution's total enrollments increased 2%. When asked about the lack of reporting in 2012, a representative indicated that they did not then have the systems in place to accurately report DE enrollments, but that they were reporting accurate DE data beginning in 2013. The recent data indicates that they have put such a system in place.

A public system in the west reported a 25% increase in “Exclusively” DE enrollments while reporting a 3% decline in overall enrollments between Fall 2012 and Fall 2014.

While we refrained from naming most institutions in the 2014 blog post, we did name the California State University system, since it was their admitted issues with the IPEDS definition of "distance education" that triggered our interest in digging deeper into the data to understand the anomalies in reporting. In 2014, Cal State system representatives freely admitted that they were only counting "state support enrollments," not the 50,000 students taking for-credit courses offered by their self-support, continuing education units.

Analysis of the changes in the Cal State system’s IPEDS reporting suggests that they have probably aligned their reporting with the Department of Education requirements. Total enrollments increased 2% between Fall 2012 and Fall 2014, while the “Exclusively” DE enrollments reported increased 60% in the same period.

A similar trend is evident in another multi-campus state university system in the west that is known to have invested heavily in national advertising in this timeframe. This institution reported a 13% increase in total enrollments and an 85% increase in "Exclusively" DE enrollments between Fall 2012 and Fall 2014. It is not clear how much of the growth in online enrollment reflects true increases and how much is attributable to changes in reporting.

Reporting Has Not Changed
Another large institution in the southwest told us in 2014 that it used its own definition of distance education, not the IPEDS definition. A follow-up with the contact revealed that they still use their own definition of distance education to report DE data to IPEDS. This institution reported a 22% increase in overall enrollments and a 35% increase in "Exclusively" DE enrollments between Fall 2012 and Fall 2014, so this growth reflects actual enrollment increases rather than a change in reporting practice. The contact also warned that DE enrollments are a small proportion of their total enrollments, so the percentage change can be misleading.
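
A toy calculation shows why a large percentage change on a small base can mislead. The percentages below match the ones just cited; the absolute enrollment numbers are invented for illustration.

    # Invented base numbers that produce the growth rates cited above.
    de_2012, de_2014 = 400, 540                  # "Exclusively" DE enrollments
    total_2012, total_2014 = 20_000, 24_400      # overall enrollments

    de_growth = (de_2014 - de_2012) / de_2012                # 0.35 -> +35%
    total_growth = (total_2014 - total_2012) / total_2012    # 0.22 -> +22%

    print(f"DE growth:    {de_growth:.0%} ({de_2014 - de_2012:,} students)")
    print(f"total growth: {total_growth:.0%} ({total_2014 - total_2012:,} students)")

In this example, the "dramatic" 35% DE increase amounts to just 140 students, while the 22% overall increase represents 4,400 students; the small base inflates the percentage.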

It’s a Mystery
A large private institution in the west strongly declined to talk to us in 2014. That institution reported a loss in total enrollment at their main campus of 11% between Fall 2012 and Fall 2014. Even though they have extensive distance education offerings, they continued to report almost no distance education enrollments. Meanwhile, a sister campus of that institution reported a 57% increase in total enrollment and a 227% increase in DE enrollment in the same period. Could they be reporting all of the main campus distance education enrollments through the sister campus? It’s possible, but that would be odd. The sister campus advertises extensive offerings of its own in academic programs that differ from the main campus. The reporting at this institution remains a mystery.

Reporting is Improving, but the 2012 Data is Still a Shaky Base

The IPEDS distance education data allows us to compare institutions using the consistent set of expectations provided by the IPEDS survey. Observing institutions' IPEDS data reporting since 2012 suggests that they are gaining experience and improving their systems and reporting processes to ensure that the data accurately reflects distance education at their institutions. This data continues to inform the industry and the students it serves.

Exclusively Distance Education, percent change in enrollments from 2012 to 2014: Public +12%, Non-profit +33%, For-profit -9%, Total +9%.
We reported these changes on an admittedly shaky base of 2012 enrollment data. But, it’s the best we have.

In 2014, Phil Hill and Russ Poulin wrote an opinion piece stating that, with these uncertainties, the 2012 data served as a shaky baseline. We stand by that statement, but are encouraged by the progress in improved reporting.

This also means that some of the increases for distance education enrollments that we reported earlier this year may be due to addressing the procedural undercounts and not due to additional enrollments. Without having numbers for the undercounts, it is impossible to gauge the exact impact of students going unreported.

Keep the IPEDS Distance Education Questions in the Surveys

The U.S. Department of Education is considering massive cuts to its IPEDS reporting requirements. We can understand the interest in stopping the collection of data that is not used. We encourage the Department to keep the distance education questions in future versions of the IPEDS Fall Enrollment reports.

Even with the problems cited, this is still the best data available.


Terri Taylor Straut
Ascension Consulting


With help from…

Russ Poulin
Director, Policy and Analysis
WCET


Photo credit: “Confused” key from Morgue File.

Categories
Practice

Stepping-up Now: Researching Social Media

In an era when undergraduate students emerge digitally engaged, the progressive graduate educator is one who is open to adapting and adjusting the delivery of their teaching, assignments, and interactions to incorporate innovative technology. Faculty face both implicit and explicit expectations to engage and enhance the learning process for today's students. Embracing this posture and pursuit needs to be a collective effort rather than one left to individual faculty alone.


Current Challenges
With over 100 social media tools now available, it can seem overwhelming to determine what to choose, why, and how to use it well. Some faculty have already begun to use social media tools in their classes on campus and in online, blended, or flipped classes. However, significant challenges remain. Beyond technical issues and inadequate support to resolve them, institutional policies and guidelines are either unclear or non-existent, as is training in etiquette for using social media in an educational context.

While some faculty may wish to explore educational benefits in using social media in their class, others don’t consider themselves “tech savvy” and feel they have inadequate training about how to integrate social media into their courses, and some remain unconvinced that this is a worthwhile educational endeavor. Valid questions arise!

  • Will using social media create unnecessary “busyness” and be an additional distraction?
  • How can we use these tools without violating privacy and FERPA regulations?
  • What empirical evidence do we have to indicate social media may be an asset to educators and student learning processes?

Various Affordances of Social Media
There is a diversity of digital resources and rich multimedia components that can be incorporated into student learning tasks to produce creative, alternative formats and accelerated learning modalities. Social media tools provide stimulating opportunities for students to access diverse views and perspectives from a broad audience, including subject-area specialists (collective intelligence).

They facilitate the processing of ideas and concepts through collaboratively analyzing, ranking, rating, discussing, and annotating resources. Together, students can evaluate and critique materials and concepts to produce metacognition.[1] This is a highly desirable educational outcome.

Additionally, student self-directed study and higher-order knowledge retention can be fostered and developed through the utilization of active learning principles with social media tools. A well-designed and facilitated learning community can assist students to learn through negotiated meaning as they study and process together.

Research Conducted Thus Far
Qualitative research has been undertaken assessing student engagement as a result of using social media. Findings suggest there is positive engagement resulting in “presence, social presence, social interaction, and sense of community” (Walker, 2007, Preamble).

However, a study of 29 of the most recent dissertations on the use of social media in education reveals a scarcity of empirical research assessing educational outcomes for student learning, knowledge retention, and skills development (Piotrowski, 2015). Most papers conclude that using social media increased student engagement, and therefore that outcomes were achieved with regard to student interaction. These findings provide a good platform from which to move forward.

The Need for Empirical Research
Further research on the educational effectiveness of social media tools is required to fully evaluate what, if any, specific student learning outcomes are achieved when using the various tools. Ideally this research will assess specific social media tools that provide differing educational affordances such as:

  • Content curation and aggregation, using tools such as Learnist and Storify, and bookmarking tools like Diigo.
  • Collaboration, using tools such as Trello and Padlet.
  • Creation and remix, using tools like Smore, Thinglink, and Animoto.
  • Social networking, using tools such as Instagram, LinkedIn, and Google+.

From the categories mentioned above, I would propose that data be collected from two sample populations: one group using social media tools and the other not using them. An evaluation instrument assessing students' knowledge retention, skills, and content mastery would be administered to both groups. This research, conducted across each educational affordance area, would begin to provide a solid basis for moving forward with the integration and appropriate use of various social media tools within classes. We would have answers to the questions of what to use, and why. Then we could more effectively address the "how"!
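
As a rough sketch of that design, the comparison could be run with an independent-samples t-test. The scores below are simulated stand-ins for real assessment data, and the instrument and tooling are my assumptions rather than a prescription.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Simulated assessment scores (0-100) for two otherwise-matched sections:
    # one taught with a social media tool, one without.
    with_tool = rng.normal(loc=78, scale=8, size=60)
    without_tool = rng.normal(loc=74, scale=8, size=60)

    # Welch's t-test: is the difference in mean scores larger than chance?
    t_stat, p_value = stats.ttest_ind(with_tool, without_tool, equal_var=False)

    print(f"mean with tool:    {with_tool.mean():.1f}")
    print(f"mean without tool: {without_tool.mean():.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

A real study would, of course, require random assignment, a validated instrument, and a separate comparison for each affordance category listed above.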

Now is the Time
Our experience of the somewhat "haphazard" evolution of online education demands that we approach the integration of social media with collective wisdom, foresight, and planning. If research provides data that empirically validates the educational benefits of various social media tools, then institutions, educators, and administrators will need to collaborate and strategize to produce appropriate guidelines and policies for their use. Technological advances and student digital engagement now require us to step up to the plate collectively, in order to move forward in a manner that enhances the learning process for our students.

Ronald G. Hannaford

Director of Digital Learning and Program Development
Biola University
ron.hannaford@biola.edu


References

  • Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.
  • Chick, N. "Metacognition." Center for Teaching, Vanderbilt University, Nashville, TN. https://cft.vanderbilt.edu/guides-sub-pages/metacognition/ (accessed March 10, 2016).
  • Piotrowski, C. (2015). "Emerging Research on Social Media Use in Education: A Study of Dissertations." Research in Higher Education Journal, 27.
  • Walker, B. K. (2007). "Bridging the Distance: How Social Interaction, Presence, Social Presence, and Sense of Community Influence Student Learning Experiences in an Online Virtual Environment." Dissertation, University of North Carolina, Greensboro, NC.

[1] “Metacognition is, put simply, thinking about one’s thinking” (Chick).  Metacognitive practices help students become aware of their strengths and weaknesses as learners, writers, readers, test-takers, group members, etc.  A key element is recognizing the limit of one’s knowledge or ability and then figuring out how to expand that knowledge or extend the ability (Bransford, Brown, & Cocking, p. 67).

Categories
Uncategorized

Breaking News on “Teacher Prep” Regulations for Distance Education

Today, the U.S. Department of Education released a new set of proposed regulations for those educating our future teachers at a distance. The long-delayed “Teacher Prep” regulations could cause more complications for colleges of education using distance education to serve students who are learning to become teachers in other states.

Even though today’s date is April 1, this is no joke.

If you are teaching an Education program at a distance, you should plan to comment. And you will need to hurry – this comment period is only open for 30 days.

A Very, Very Brief History
You may recall that the Department of Education was not happy with how states were grading colleges and alternative programs that prepared people to be K-12 teachers. One of the criteria for institutional eligibility to award TEACH grants to students is receiving a passing grade from the state. States rarely failed an institution, and few states were paying any attention to students learning via distance education.

In December 2014, the Department of Education released for comment a set of proposed regulations to remedy this situation. A couple of major tenets of the proposed "Teacher Prep" regulations were: 1) the new unit of measure would be the "program," not the institution; and 2) each state would be expected to review distance education programs from other states serving students within the state.

Shortly after those proposed regulations were released, I provided a summary of what was being proposed. In partnership with the Online Learning Consortium and the University Professional and Continuing Education Association, WCET submitted written comments focused on the distance education shortcomings of what was proposed and raised questions about what was intended.

The Current Proposed “Teacher Prep” Regulations on Distance Education
As a result of the December 2014 call for comments on the proposed regulations, the Department received about 4,800 responses. Out of all those comments, the Department is adding additional regulations and opening a new comment period on just one issue: distance education. Our joint comment had some impact.

The document for comment is 60 pages long, but most of the issues that will have an impact on colleges are in the first 18 pages. In a quick review by Cali Morrison and me, the most troubling passages were on page 15, in "Section 686.2 – High-Quality Teacher Preparation Program Provided Through Distance Education," where it is proposed that a college's program could fall short in ONE state and lose the right to offer TEACH grants to distance students in ANY state. That might work if every state used the same criteria, but that's not required. The "Discussion of Costs, Benefits, and Transfers" is a baffling attempt at measuring activity for which there is no good data. [NOTE: This paragraph was updated on April 4, 2016, as I linked to the final version of the rules, which does not have page numbers.]

What to Do Next?
There are only 30 days to comment.

If you are offering distance education “teacher prep” programs, you should review the language with the leadership of that program and your institution’s government relations personnel. You should seriously consider submitting a comment.

In the next few weeks, we will be developing our own comments. I would LOVE to have your insights and suggestions. Please send them to me. I hope to develop a set of comments that I can share with you, as you may wish to use what we write as the basis for your response.

If I read this right, the regulations (as proposed) could have a chilling effect on serving budding teachers in other states. I’d like to hear your opinions. I’d like you to comment.

Thank you,

Russ

Russell Poulin
Director, Policy & Analysis
WCET – the WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu


Photo Credit: U.S. Department of Education

Categories
Practice

Bringing Joy to Technology Design

We welcome Alexis Hope, MIT Media Lab, as today’s guest blogger, as she gives us a peek at a new publishing platform that incorporates many types of multimedia into your text copy. Alexis and a panel of MIT Media Lab students will be speaking at the WCET Annual Meeting, Oct. 12-14 in Minneapolis. Join us to hear more about innovation, cultivating entrepreneurial spirit, and design. — Megan Raymond and Russ Poulin, WCET

For the past year, I’ve been developing an open-source multimedia publishing platform called FOLD. FOLD grew out of my thesis work at the MIT Media Lab, where I worked in the Center for Civic Media led by Ethan Zuckerman. The platform was originally created to help journalists supplement news with context to support novice news readers, but we opened it up to the public when we launched. Now, alongside use by journalists and independent writers, FOLD has begun to see wide use in the classroom by teachers looking to help students build media literacy and learn how to write for public audiences.

FOLD homepage (fold.cm)

Being “Playful” with Design

In our research, we’re investigating how being playful with design can give students the space and freedom to find their unique voice and writing style. We’ve found that students of all ages are motivated by being able to incorporate the kinds of media they interact with on a daily basis into an assignment, and older students in particular are proud of a polished and professional final product that helps elevate their work.

On FOLD, stories are composed of text and media cards. Text cards form the backbone of the story, and media cards branch out to the side of each text card. Writers can annotate their text with these media cards to create an interactive story. Media cards can be created by searching through user-generated content sites like YouTube, Flickr, Soundcloud, and more from inside the text editor. Blending research tools with a writing environment, FOLD allows writers to easily find source material and references to support their words.
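
To make that structure concrete, here is an illustrative Python model of the card layout. This is my own sketch for explanation only; it is not FOLD's actual schema or API.

    from dataclasses import dataclass, field

    @dataclass
    class MediaCard:
        """A card branching off a text card: video, photo, map, tweet, etc."""
        kind: str        # e.g., "youtube", "flickr", "soundcloud"
        source_url: str
        caption: str = ""

    @dataclass
    class TextCard:
        """One block of the story's narrative backbone."""
        text: str
        media: list = field(default_factory=list)   # MediaCards annotating this text

    @dataclass
    class Story:
        title: str
        backbone: list = field(default_factory=list)  # ordered TextCards

    # A reader moves down the text backbone; each text card can link
    # sideways to the media cards that annotate it.
    story = Story(
        title="What is a zine?",
        backbone=[
            TextCard(
                text="Zines are small, self-published booklets...",
                media=[MediaCard("youtube",
                                 "https://www.youtube.com/watch?v=...",
                                 "A zine-making tutorial")],
            )
        ],
    )

The "minimap" described in the screenshot captions below is essentially a rendering of this backbone-plus-branches structure.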

It’s been a joy to see the creativity and expressiveness of student writers; FOLD has been used to create how-to’s, science explainers, photo essays, project portfolios, and history reports, but outside of formal assignments, students are creating fiction and poetry, game tutorials, movie reviews, and more experimental pieces. Giving students opportunities to connect learning goals with outside interests can be incredibly rewarding and gives them the chance to create a portfolio piece of which they can be proud.

A FOLD story has a text backbone with branching media cards that can be linked to the text. A "minimap" of the story is generated in the bottom right corner. The minimap was inspired by our team's love of video games—minimaps help players orient themselves within a game world.
Editing mode allows writers to build their story by combining their original writing with photos, videos, animated gifs, maps, tweets, links to web articles, and more.

Prior to joining the MIT Media Lab, I worked to re-design extremely complicated medical device interfaces. I’m motivated by design challenges that center on making complex systems understandable and accessible to wide audiences. I’m also inspired by playfulness, and believe that when possible, our interfaces should add some fun to our day. In an educational context, infusing technology with joy is especially important when so many other technologies are competing for students’ attention.

I love creating moments of joy for the people who interact with what could otherwise be just another boring tool. FOLD’s moments of joy are created by bright colors, playful language, bold iconography, and support for a wide variety of multimedia. We’ve also incorporated the ability to find and “follow” other authors on the platform, an aspect of social networking platforms with which many students are familiar.

FOLD Helps Connect the Fragments of Information Across Media

Visual design is important, but thoughtful design goes beyond the surface. I’m fascinated by the way that the Web has transformed how people think, write, and learn. There are many writing tools available to students, but few that speak to the changing nature of how we learn and how we interact with information. Increasingly, we experience content in discrete fragments—a YouTube tutorial, a photo of a protest, a humorous Tweet, or a link to the viral article everyone in our Facebook circle is talking about. But sometimes it feels like all of this information doesn’t really amount to much. On FOLD, students are able to bring together the fragments of the Web into a cohesive whole, so they can turn that one YouTube video into something much more substantial.

As technology users we have come to expect beautiful and thoughtful design in many of the products we use every day, and educational technology should be no different. I’m inspired by many other technologies I see being developed with joy and play in mind, such as:

  • LittleBits, whose electronics kits are so engaging you can’t help but invent something;
  • the Amino, which has crafted a beautiful experience around teaching complex bioengineering concepts; and
  • Codecademy, a website that helps people learn programming with a fun, interactive editor.

When design and engineering sit side-by-side and are attentive to the needs of the people at the center of complex systems, our technologies can be beautiful, fun, and useful in equal measures.

If you'd like to try FOLD with your students, or just want to chat about design and technology, feel free to e-mail me at alexis@fold.cm. You can also find me on Twitter—I'm @alexishope.

Alexis Hope
Creative Director at FOLD
alexis@fold.cm