
Putting a Student Face on Not Complying with State Authorization

In talking to institutions about state authorization, we are often asked about the consequences for the institution of not complying with the regulations.  Rarely are we asked about the impact on students.  We became aware of some instances where students were directly affected by state authorization requirements and wanted to share their stories.

Meet Claudia

Claudia Wiseman, R.N., M.P.H., N.P., Ph.D. and online student.

With a bachelor’s in health education, an R.N., two master’s degrees, and a Ph.D. in nursing, as well as a successful career as the primary women’s health nurse practitioner for two communities, why would Claudia Wiseman ever want to return to school? Like many other rural health providers, Claudia wanted to expand her capabilities to serve a new population in her community – men.

When she researched graduate certificates that would allow her to sharpen her skills and knowledge in treating men and sit for the certification to be licensed, her options were slim.  Within her home state of Oregon, she could move to Portland, abandoning her practice, to attend the one state institution which offers this certificate.  Since she was pursuing the certificate to better serve her community, she did not want to abandon that community in the process.

So, Claudia turned to online learning.  She found a fully online program many states away that offered the specialized courses she needed to become licensed.  Problem solved, right? Wrong.

This is where things get sticky.  After completing a few course hours, Claudia discovered that though her school had all of the right accreditations, the one thing it did not possess was state authorization, specifically from the Oregon Board of Education.  Without that authorization, even if she completed all of the coursework successfully, she would be ineligible to become a certified Adult Nurse Practitioner in Oregon.

Meet Ricky

Claudia is not the only student affected by state authorization and licensure. Meet Ricky (whose name has been changed and identity shielded, at his request), who is also a veteran and a working health care provider in a rural community.  Ricky is working and going to school full-time while raising two children.  He wished to move up a level at work, but wanted to continue providing for his family without uprooting them and moving across the state or country.  Ricky sought out a blended program that advertised short on-campus components, with most coursework done online and clinicals completed in the student’s home state.  It sounded perfect.

Perfect, that is, until he went to find a clinical position and learned that his school did not have authorization to operate in his state, so he could not participate in an official clinical locally.  No clinical, no degree, and his VA funding would be in jeopardy.  Even with a clinical, he would not be able to sit for the licensing exam unless his school received state authorization and approval from the state board of nursing.  He was faced with having to move his family across the country again, this time even considering letting his house go into foreclosure in order to complete his degree.

A Happy Ending for Claudia and Ricky, but Not for Everyone

Luckily for both Claudia and Ricky, their schools stepped up to the plate, completed the paperwork, and paid the fee to be authorized in their states. Both Claudia and Ricky will tell you that compliance was not their institution’s first reaction.  It took informing, cajoling, and perhaps even a threat of legal action to reach their goal.  Ultimately the institutions, state regulators, and licensing boards collaborated to ensure a positive outcome for the students.

Their stories ended well, although it was a frustrating journey for these students.  However, we have heard from state regulators and licensing boards that it has not ended so well for all students.

Jennifer Diallo, director of the Office of Degree Authorization in Oregon, says there are many other cases where students didn’t find out until they were sitting for the licensure exam, after having graduated, that their programs did not qualify them to become licensed.  She called clinical placements the “canary in the mineshaft,” because oftentimes it is not until a student starts searching for a placement that they find out they are attending an unapproved institution.

All of this was brought to the forefront by the federal state authorization rules announced in October 2010, which require institutions to demonstrate, by July 2014, that they have approval in the states in which they operate.  States have had these laws on their books for years, decades even, but many public and non-profit institutions had not worried about seeking authorization.

The federal regulation is what tripped the security light for state regulating offices such as Ms. Diallo’s, which subsequently found 380 institutions operating within the state of Oregon without authorization.  According to Ms. Diallo, the worst part is that some institutions are “asking the passengers to fly the plane” by involving students in the state authorization process.  Some institutions have asked students to research all the regulations for their state and report back on what needs to be done.  Anyone who has tried to navigate these regulations knows it is a task that should not be left to novices, let alone students.

According to Russ Poulin of WCET, some state institutions have decided that it will be cheaper to keep operating without state authorization and pay the legal fees if they get sued than it would be to seek authorization in each state.  This sentiment was echoed by a state institution we spoke with as well.  To be clear, we do not recommend or endorse ignoring state laws.

What should be done?

From these stories, we see the need not for further regulation, but for states to work together to create reciprocity.  There is no magic bullet – this will not be an easy process.  It is one that the online education community must tackle on behalf of our students, and WCET continues to support this work.

Institutions need to recognize their responsibilities and the possible negative impacts that non-compliance can have on students.

Do you know a student who has been affected by their institution not having authorization in your state?  Or conversely, as an institution have you had to deny admission or graduation to a student because the state in which they reside had not given you authorization?

Blog Post co-authored by:

Megan Raymond, Manager, Programs and Events, mraymond@wiche.edu

Cali Morrison, Manager, Major Grants, cmorrison@wiche.edu

WCET:  wcet.wiche.edu

Support our work.  Join WCET.


Intersecting Parallel Universes – Online Learning and Military Voluntary Education

Last week the Council for College and Military Educators (CCME) held its annual symposium.  As an attendee and concurrent session presenter, I had the opportunity to observe the many conversations in the military voluntary education community that parallel and intersect with those we’re having in the online education community.  One surprise for me: by the time a military student completes a credential, they have attended, on average, five institutions to compile the necessary credits, and they often graduate with far more credits than the degree requires because not all of those credits fit the degree requirements.  To me, that speaks to a need to reconsider our general education requirements and make them broader, so that a greater diversity of credits can be accepted, reducing the burden on the student and on the tuition assistance/financial aid system.

My observations:

  • Data Metrics are ubiquitous.  In every higher education community I have contact with, and in every conference program I’ve seen this year, data metrics are omnipresent.  CCME was no different.  On Monday, I participated in the Servicemembers Opportunity Colleges (SOC) Burning Issues Summit.  The number one item on the agenda: reviewing a suggested methodology for tracking the success metrics of military students.  As the Transparency by Design project director, I participated in this summit as a subject expert; the SOC subcommittee utilized our Learner Progress Methodology in developing its recommendations.  Common metrics were also part of the conversation regarding the new Department of Defense (DoD) Third Party Education Assessment process (formerly Military Voluntary Education Review – MVER) as well as the DoD Memorandum of Understanding for those offering tuition assistance.
  • Defining the students we serve. Just as we in the online education community struggle to define WHO is an online student, the military-serving community struggles to define who is a military student.  There are many factors to consider, such as enrollment level, enlistment type (active, reserve, veteran), branch, and deployment status.  73% of students receiving military tuition assistance take their coursework online, which places them in the online student category as well.  With both populations being highly mobile, this leads to the next issue being discussed…I’ll bet you can guess what it is…
  • State Authorization of Higher Education. The state authorization regulations are also weighing heavily on military-serving institutions.  In figuring out which states they need to apply to for authorization, these institutions face the added layer of service members living on bases, at sea, and in war zones, all while maintaining their primary residences in other states.  WCET’s Russ Poulin, North Dakota’s Bob Larson, and Dow Lohnes’ Jeannie Yockey-Fine participated in a panel sharing their vast experience with state authorization.  In many cases, as you well know, the common refrain for this issue is – It Depends. Major concerns raised by the audience focused on the transient nature of military students and how institutions can track that for proper authorization, as well as the possibility that military learners could face adding even more institutions to their already burgeoning transcripts if forced to transfer to an institution that is authorized where they are.
U.S. Soldiers from 2nd Platoon, Bravo Troop, 1st Battalion, 150th Armored Reconnaissance Squadron, 30th Heavy Brigade Combat Team, 1st Cavalry Division, from Bluefield, W. Va., take time to surf the internet at Forward Operating Base Yusifiyah, in central Iraq, Aug. 16, 2009. Photo by Petty Officer 2nd Class Edwin L. Wriston.

  • Military students choose online education for convenience.  During Tuesday’s Military Student Panel, all five students (representing the Army, Navy, Marines, Air Force and Coast Guard) echoed that the primary reason they take online classes, even those who preferred face-to-face interaction, is convenience.  This becomes even more imperative when they are studying while deployed.  An interesting takeaway for online educators: when deployed, connectivity can be spotty and slow.  While the students see the value in the bells and whistles (embedded graphics, videos, etc.), they often don’t have the bandwidth to take advantage of them.  The same goes for completely online textbooks – while deployed, students may not have the computer time or bandwidth to use these resources.

  • Reclaiming our quality indicators. During the SOC Burning Issues Summit, the second item on the agenda was reclaiming or replacing the term “Military-Friendly Institution.”  It has been appropriated by certain publications which rank, rate, and promote military friendliness as a marketing tactic.  For me, this struck a chord similar to the conversations we’ve been having about reclaiming online education’s quality indicators from marketing entities such as U.S. News and World Report. As I said in the session, if we (as online or military educators) want to reclaim these indicators, the first thing we must do is stop using their rankings in our marketing to students and stop paying them for advertising in their publications, be they in print or online.  As Russ and I discussed in an earlier post, if we were to create a culture of transparency throughout higher education, with institutions sharing data openly and publicly and giving students tools to make informed decisions, would the rankings live on?  If we were accountable to ourselves and our students, would we need to use arbitrary rankings in our marketing?

Considering the overlap in our student populations, how are your online programs serving military students, if at all? I would love to hear about your best practices for serving online military students in the comments.

Cali Morrison
Manager, Major Grants, WCET – WICHE Cooperative for Educational Technologies
Project Director, Transparency by Design
cmorrison@wiche.edu

Photo by Petty Officer 2nd Class Edwin L. Wriston, Creative Commons Licensed by The U.S. Army on Flickr.


U.S. News on Their Rankings of Online Colleges: In Their Own Words

The developers of the U.S. News & World Report Top Online Programs rankings invited WCET members to ask questions regarding the implementation, methodology, survey questions, and future plans.  We thank Bob Morse and Eric Brooks of U.S. News for extending the invitation and for their quick turnaround in providing responses.

Their invitation followed on the heels of our analysis of their ranking methodology and their subsequent rebuttal of our arguments.  While we stand by our original analysis, we welcomed this opportunity for our members to ask questions that would help bring greater clarity to how U.S. News arrived at these rankings.

A big thank you goes to our members* who took the time to craft insightful questions or provide colorful commentary so that the entire online education community may benefit from the answers provided.  You will note that the questions were submitted to U.S. News without citation of who asked the question.  We wanted the questions to stand on their own without regard to who asked them.

We are intrigued by one response in particular, which contains Bob and Eric’s invitation to WCET members, and we will explore it in more detail:

“US News would like to do that in systematic way like an advisory group of WCET members to meet with us on a regular basis in order to improve what we have done and advise us. WCET members are the experts and US News would very much like to work with WCET in an organized way.”

For the most part, Eric and Bob do a thorough job of answering your questions or of admitting the limitations of this type of survey, which is highly dependent on standardized data.  We gave our opinion in our earlier blog piece.  We want to leave the final analysis to you.  To get the full scoop, read the full set of questions and responses – Q&A with US News & World Report.  We’d love to hear your reactions in the comments.  Join in!

Blog Post co-authored by:

Russ Poulin, Deputy Director, Research & Analysis, rpoulin@wiche.edu
Cali Morrison, Manager, Major Grants, cmorrison@wiche.edu

WCET:  wcet.wiche.edu

Support our work.  Join WCET.

In case you missed it…

*WCET Members Who Provided Questions & Comments

  • Deb Adair – Quality Matters
  • Shirley Adams – Charter Oak State College
  • Patricia Book – University of Northern Colorado
  • Jeff Borden – Pearson
  • Gary Brown – Portland State University
  • Mike Buttry – Capella University
  • Patricia Fenn – Ocean County College
  • Jerry Foster – Boise State University
  • Jennifer Stephens Helm – American Public University System
  • Geri Malandra – Kaplan University
  • Denise Nadasen – University of Maryland University College
  • Lisa Nordick – North Dakota State University
  • Linda Norman – Vanderbilt University School of Nursing
  • Peg O’Brien – Dakota State University
  • Karen Pedersen – Northern Arizona University
  • Vicky Phillips – GetEducated.com
  • Ann Randall – Boise State University
  • Cyndi Rowland – Utah State University
  • Ray Schroeder – University of Illinois – Springfield
  • John Sener – Sener Knowledge LLC
  • Anne Zalenski – University of Iowa

Black bars around the Internet

As you may have noticed, many sites around the web have gone black today to protest the Stop Online Piracy Act (SOPA – H.R. 3261) in the House and the Protect IP Act (PIPA – S.968) in the Senate.

We encourage all WCET members, and the online higher education community as a whole, to educate yourselves on these issues and what they mean for the operations of your online programs, your websites, and your own web usage.  Once you have come to your conclusions, make sure your voice is heard.  Share your opinions with your representatives.

WordPress.com homepage January 18, 2012

Here are some resources to help you learn what SOPA and PIPA mean for you.


Learner Progress: Capturing the Adult Learner

Cali Morrison, Transparency by Design Project Director, brings to Frontiers the process by which the Transparency by Design initiative created the recently launched learner progress metrics available on its website College Choices for Adults.

The Problem

Transparency by Design (TbD) member institutions recognized that the national metrics for measuring learner retention and completion – the Integrated Postsecondary Education Data System (IPEDS) Graduation Rate Survey (GRS) – were not capturing the majority of their students: learners returning to college and attending part-time.  The IPEDS GRS takes students who enter an institution as first-time, full-time students, puts them in a cohort, and looks at where they are at 100% and 150% of “normal time” (defined as 4 and 6 years, respectively, for a bachelor’s degree).  The problem is that this cohort comprises only a small percentage of students at TbD member institutions.  At one institution, none of the students fit this category.  So when journalists, legislators, and students talk about and judge the performance of an institution based on the IPEDS GRS, they are not considering the whole picture for many institutions.

The Process

Coming to a solution for capturing a greater proportion of students was no simple task.  TbD institutions span a wide range of institutional models – some are more traditional online or hybrid programs, some are competency-based, some are degree-completion, and some are primarily graduate. Accommodating all of these institutions and the different ways they count students as enrolled, while still producing a metric understandable to a person without an advanced degree in statistics, was a challenge.  The learner progress committee* considered models that looked more deeply at the moment a student becomes a student, and models that would account for more of the ‘swirl’ – that is, students who transfer in and out and back in – but each of those models ran into barriers to comparability and validity.  In pilots, the metrics were not applied consistently across institutions, and therefore the resulting reports were not truly comparable.

The Solution

After several unsuccessful pilot metrics, the learner progress committee decided to build from a widely recognized and accepted methodology with a construct that can be reliably replicated at many different institutions.   The IPEDS GRS provided the base needed to build Learner Progress.  However, in contrast to the GRS, the Learner Progress metrics include transfer-in and part-time students in the cohort in addition to first-time, full-time students.  All other parameters for cohort development remain the same.  This helps the learner progress metric maintain the face validity of the graduation rate survey while capturing a more accurate picture of the adult learner.
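To make the cohort difference concrete, here is a minimal sketch of how a GRS-style cohort and a Learner Progress-style cohort might be assembled.  The field names (entry_term, first_time, full_time, transfer_in) and the simplified logic are illustrative assumptions for this post, not the official IPEDS or TbD data definitions.

    # Illustrative sketch only -- the field names and logic are simplified
    # assumptions, not the official IPEDS GRS or TbD Learner Progress rules.

    def grs_cohort(students, entry_term):
        """IPEDS GRS-style cohort: first-time, full-time entrants in a given term."""
        return [s for s in students
                if s["entry_term"] == entry_term
                and s["first_time"] and s["full_time"]]

    def learner_progress_cohort(students, entry_term):
        """Learner Progress-style cohort: also counts transfer-in and part-time entrants."""
        return [s for s in students
                if s["entry_term"] == entry_term
                and (s["first_time"] or s["transfer_in"])]  # full- or part-time

    students = [
        {"id": 1, "entry_term": "Fall 2005", "first_time": True,  "transfer_in": False, "full_time": True},
        {"id": 2, "entry_term": "Fall 2005", "first_time": True,  "transfer_in": False, "full_time": False},
        {"id": 3, "entry_term": "Fall 2005", "first_time": False, "transfer_in": True,  "full_time": False},
    ]

    print(len(grs_cohort(students, "Fall 2005")))               # 1: only the first-time, full-time student
    print(len(learner_progress_cohort(students, "Fall 2005")))  # 3: part-time and transfer-in students also count

In this toy example, the GRS cohort captures one of three entering students, while the Learner Progress cohort captures all three – the same gap TbD institutions see with their adult learners.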

Regis University Learner Retention Report

Learner Progress is divided into two distinct categories – Learner Retention and Learner Completion.  Learner retention reports the percentage of students who remain enrolled or complete a degree/certificate after one year.  Institutions may report learner retention at the degree level, which is preferred; those unable to report at the degree level may report at the institution-wide level. Learner completion reports the percentage of the cohort that completed a degree within 150% and 200% of ‘normal time,’ as defined by IPEDS (for example, ‘normal time’ for a bachelor’s degree is 4 years, so 150% is 6 years and 200% is 8 years).  Another interesting piece of learner completion is the opportunity for institutions that have transfer as a core part of their mission to report the percentage of the cohort that has successfully transferred to another institution at 150% of normal time.  No institutions currently track and report this figure, but its purpose is to be more inclusive of community colleges. Currently, 67% of the institutions involved in the Transparency by Design initiative report this data on the College Choices for Adults website.
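As a rough illustration of the arithmetic behind these two reports, the sketch below computes one-year retention and 150%/200% completion for a bachelor’s cohort, where ‘normal time’ is 4 years (so the cutoffs are 6 and 8 years).  Again, the field names and rules are simplified assumptions for illustration, not the actual reporting specification institutions follow.

    # Rough arithmetic sketch; field names and rules are simplified assumptions.
    NORMAL_TIME_YEARS = 4  # bachelor's degree "normal time," per IPEDS

    def retention_rate(cohort):
        """Percent of the cohort still enrolled, or already completed, one year after entry."""
        retained = [s for s in cohort if s["enrolled_year_2"] or s["completed"]]
        return 100.0 * len(retained) / len(cohort)

    def completion_rate(cohort, multiple):
        """Percent of the cohort completing within `multiple` x normal time (e.g., 1.5 or 2.0)."""
        cutoff = NORMAL_TIME_YEARS * multiple
        done = [s for s in cohort
                if s["completed"] and s["years_to_degree"] <= cutoff]
        return 100.0 * len(done) / len(cohort)

    cohort = [
        {"id": 1, "enrolled_year_2": True,  "completed": True,  "years_to_degree": 5},
        {"id": 2, "enrolled_year_2": True,  "completed": True,  "years_to_degree": 7},
        {"id": 3, "enrolled_year_2": False, "completed": False, "years_to_degree": None},
        {"id": 4, "enrolled_year_2": True,  "completed": False, "years_to_degree": None},
    ]

    print(retention_rate(cohort))        # 75.0 -- three of four retained or completed after year one
    print(completion_rate(cohort, 1.5))  # 25.0 -- one of four finished within 6 years
    print(completion_rate(cohort, 2.0))  # 50.0 -- two of four finished within 8 years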

What’s next

TbD challenges accrediting bodies, federal regulators, and reciprocity compacts to adopt this metric into their quality standards, and challenges adult learners to stand up, demand to be counted, and insist on data that reflects their situation and helps inform their educational pathways.  As always, TbD challenges institutions serving adult learners at a distance to set themselves apart from the crowd by reporting this and other data VOLUNTARILY on the College Choices for Adults website.

*This blog would not be complete without a special thank you to the Learner Progress Committee who put many hours of work into this effort: David Hemenway, Charter Oak State College; Kim Pearce, Capella University; Lisa Daniels, Excelsior College; Jennifer Mauldin, Regis University; Linda Van Volkenburgh, Union Institute & University; Dave Becher, American Public University System; Karen Paulson, NCHEMS; and Russ Poulin and Cali Morrison, WCET.


To Enter the “Ranks” or Not…

At the end of July, I moderated a webcast for WCET with Bob Morse and Eric Brooks from US News and World Report regarding their forthcoming rankings of online education.  The purpose of the webcast was to answer the questions our members had raised and to help clarify how US News plans to use the data it is collecting to determine the rankings.  Unfortunately, the webcast left me staring at the same muddied water.

US News (USN) reiterated that they still have not developed the methodology for determining the rankings, as they reported in Inside Higher Ed in June.   As they answered in the open questions after the webcast, “At this point there is not a methodology, as was explained on the webcast. In an ideal world, the prior year’s ranking methodology and ranking variables for online education could be included with the surveys, as is practiced with U.S. News Best Graduate and Best Colleges surveys. Unfortunately, due to the inaugural nature of these online surveys, no methodological plan can be established until U.S. News assesses the robustness of the submitted data. When there eventually is a plan it will be disclosed at the appropriate time. In future years, U.S. News hopes to be able to be more specific at an earlier stage of the survey process about its online degree ranking methodologies (at least what was used the previous year) and which data points were used in the rankings.”  Frankly, this ‘groping in the dark’ methodology surprised me.

Who suffers here?  The students. USN claims to empower students to make informed decisions.  In reality, USN is taking the power of choice away from students.  USN encourages students to defer to its opinion of what a quality program is rather than providing them with data to investigate the options that matter to each student’s individual professional and personal goals.  Based solely on the popularity of its publication, USN will be funneling students to institutions and programs which may not be a proper match.

Choice is affected by context by Will Lion (on Flickr http://www.flickr.com/photos/will-lion/2681240098/)*

In a recent email conversation, Vicky Phillips of GetEducated.com noted that, “US News accepts advertising on a pay per lead model.  This means the company gets paid based on how successful those ads are as direct recruitment vehicles for the very schools they rank.  The US Department of Education disallows paying college recruiters based on recruitment success because of the corruption in this model.  If you’re a paid recruiter for a particular college, can you simultaneously operate a neutral rating agency for the same college? I’d argue not.” (It’s important to note here that GetEducated.com does its own rankings of online degrees.)

I have also had several schools, which wish to remain anonymous, express the struggle they face in deciding whether or not to respond to this survey.  Without knowing the ranking criteria, many are hesitant to answer questions.  They have no way to provide context for their responses, and some questions are not wholly appropriate for their population of students. Yet at the same time, they are concerned about how they’ll be perceived or portrayed if they don’t answer the survey.  The majority of online students are what has typically been called “non-traditional” – they’re older; have jobs, families, and community commitments; and are motivated to go to college for personal or career improvement.  As such, many may be 5, 10, 20, or more years removed from having received a high school diploma.  Yet institutions are required to report the high school rank, GPA, and SAT scores of their students in this survey, though in the responses to open questions USN does note, “…questions about the high school ranks and SAT scores of online bachelor’s degree students are highly unlikely to play any role in ranking of online bachelor degree programs.”  So why put institutions through the effort of reporting on indicators that will not be used?

As these rankings are touted as indicators of quality in online education I asked Ron Legon, Executive Director of the Quality Matters Program, about his opinion. He said, “Although the survey acknowledges that institutions may not track or be able to provide all the information requested (e.g., in distinguishing between online and on ground students in their programs), I’m not sure U.S. News fully appreciates how difficult it will be for most institutions to complete the instrument with meaningful information.  For example, the huge disparity between the small numbers of first time freshmen in online programs and the predominantly older students, who typically have some prior higher ed background and, in most cases, are already employed, will distort many of the retention, graduation and employment statistics – to the point where they may be meaningless.”

To this point, one institution I have been in communication with, which responded to US News that it would not be participating in the survey, received this message:

“In order for an institution to opt out of receiving further communication regarding this year’s survey, U.S. News requires that a scanned letter or email be sent directly from the President, Provost, or academic dean referencing the decision to not participate and stating whether or not the institution offers the type(s) of online degree program(s) indicated in the survey. If your institution does offer this type of online degree program, please be advised that your institution may still be included in this year’s rankings and will still appear on the usnews.com website. When possible, U.S. News may gather other publicly-available data about your online programs.”

WHAT?  So even if an institution chooses not to provide the data, US News will still publish incomplete and potentially inaccurate data about that institution?  That sounds like blackmail to me.  “Answer our survey, or else!”  And I can only imagine that by ‘publicly-available data’ they are referencing the IPEDS data reported on College Navigator, which (back to Ron’s point) would be pointless for most online bachelor’s degrees.  The IPEDS graduation and retention rates only account for first-time, full-time freshmen.

Because he said it so well, I’m going to leave the conclusion in the words of Ron, “We can hope that the data collected will profile the major structural varieties of online programs that are out there, and, to this extent, the U.S. News coverage may be helpful to those of us who are trying to track the phenomenal growth of distance learning.  But I can sympathize with others who have expressed concerns about any premature attempts by U.S. News to rank programs, especially without any pre-announced weighting of criteria.  U.S. News would be providing a service to students and the general public by simply publishing an inventory and an analysis of the characteristics of different types of online programs uncovered by their survey.  This, in itself, would help students choose among programs.  But, as a news organization trying to sell magazines, I expect that U.S. News will need some headlines and give in to the temptation to rank programs based on insufficient, unreliable data and ad hoc criteria.”

We’d like to hear from you.  What are your thoughts?  Is your institution participating? Fill out our poll and/or leave us a comment to join this conversation.

*Choice is affected by context by Will Lion (on Flickr). Text:  Choice and preference for that choice are affected by the other items on choice at the time. For example, “a pen selected from a set in which it asymmetrically dominated another pen produced a more positive writing experience and a greater willingness to pay for the pen than if the same pen was selected from a set in which it did not dominate another option” Yoon et al. (2008). Choice Set Configuration as a Determinant of Preference Attribution and Strength. Journal of Consumer Research www.journals.uchicago.edu/doi/abs/10.1086/587630 Background image courtesy of: www.flickr.com/photos/orinrobertjohn/114430223. This citation appears in the bottom left of the image.