How do you define academic rigor? I know when I was completing my undergraduate and graduate coursework, I could tell the difference between a rigorous course and one that would be a little less time-consuming. I also understood, especially in graduate school, that the more rigorous a course was, the more I “got out of it.” Is there a way to capture that difference so instructors can ensure the best educational experiences for their students?

To help us do just that, today we’re excited to welcome Andria Schwegler from Texas A&M University. Andria is here to discuss her search for a definition of academic rigor, informed by discussions with her colleagues and presentations at Quality Matters conferences.

Enjoy the read and enjoy your day,

Lindsey Downs, WCET


Rigor is touted by many educators as a desirable quality in learning experiences, and many institutional mission statements promise rigorous academic experiences for students. Unfortunately, a definition of rigor has been elusive. How do institutions leverage evidence to support their claim of rigorous experiences?

The Task to Operationally Define Academic Rigor

In search of a definition of academic rigor, I sought out the types of evidence that faculty members at my institution use to document rigor in their academic programs and courses. The search was prompted by an invitation to serve as a panel discussant representing a faculty member’s perspective in one of two special sessions “Quality Online Education: What’s Rigor Got to Do with It?” at the Quality Matters Connect Conference in 2017. The purpose of the sessions was to open dialogue across stakeholders in higher education regarding the role of rigor in traditional and alternative learning experiences. As a faculty member, I could articulate rigor in my own courses and program, but because defining rigor had not been a topic of discussion across departments, I sought to learn what my colleagues considered evidence of academic rigor.

Photo: a group of instructors (from #WOCinTech Chat)

Operational Definitions of Rigor Offered by Faculty

Several of my colleagues accepted my invitation to discuss the issue, and they provided a variety of measurable examples to demonstrate rigor in their courses and programs. Rigorous learning experiences they described included:

  • audio and video recorded counseling sessions with clients that were subsequently critiqued in class,
  • problems identified by students in current or former employment contexts that were brought to class and addressed by applying course content to the cases, and
  • quantitative research projects requiring application of completed coursework to collecting and interpreting original datasets.

Distilling a broader summary from course-specific assignments and program-specific assessments revealed that the most frequently cited evidence to support claims of rigor was student-created artifacts. These artifacts resulted from articulated program- and course-learning outcomes that specified higher-level cognitive processing. Learning outcomes as static statements were not considered evidence of rigor in themselves; they were prerequisites for learning experiences that could be considered rigorous (or not) depending on how they were implemented.

Implementing these activities involved a high degree of faculty support and interaction as students created artifacts that integrated program- or course-specific content across time to demonstrate learning. The activities in which students engaged were leveled to align with the program or course and were consistent with those they would perform in their future careers (i.e., authentic assessment; see Mueller, 2016). Existing definitions of rigor include students’ perceptions of challenge (“Rigor,” 2014), and the evidence of rigor faculty members provided highlighted the intersection of students’ active engagement with curriculum relevant to their goals and their interaction with the instructor. These conditions are consistent with flow, which is characterized by concentration, interest, and enjoyment that facilitate peak performance when engaged in challenging activities (Shernoff, Csikszentmihalyi, Schneider, & Shernoff, 2003).

Translating Rigor into Student Assessments

Creating meaningful assessments of learning outcomes that integrate academic content across time highlights the importance of planning learning experiences, not only in a course but also in a program. Faculty colleagues explicitly warned against considering evidence of rigor in a course outside of the context of the program the course supports. Single courses do not prepare students for professional careers; programs do. It was argued that faculty members must plan collaboratively beyond the courses they teach to design program-level strategies to demonstrate rigor.

Planning at the program level allows faculty members to make decisions regarding articulating curriculum in courses, sequencing coursework, transferring coursework, creating program goals, and assessing and implementing program revisions. Given these responsibilities, faculty members indicated that collaborative planning and curriculum design, rather than infringing on academic freedom (see Cain, 2014), were essential in setting the conditions for creating assessments that demonstrate rigor.

Though specific operational definitions of rigor provided by colleagues were as diverse as the content they taught, the underlying elements of their examples were similar and evoked notions of active student engagement in meaningful tasks consistent with a “vigorous educative curriculum” (Wraga, 2010, p. 6).

Student Activities During Lecture

Stepping back from the examples of student work that faculty members offered as evidence of rigor, I reflected on student activities that were missing from our conversations. None of my colleagues indicated that attending class, listening to lectures, and taking notes were evidence of rigor. Though lecture is “the method most widely used in universities throughout the world” (Svinicki & McKeachie, 2011, p. 55), the student activities associated with it never entered our conversations.

In fact, one colleague’s example of rigorous classroom discussion directly contradicted this lecture-based approach. She explained that during discussions, she tells her students not to believe a word she says, though she was quick to add that she does not mislead students. Her approach puts the burden of obtaining support for discussion claims on students, who cannot passively rely on the teacher as an authority. Instead, students are held accountable for substantiating the claims they make. This technique offers stronger evidence of rigor than simply receiving content via lecture.

Student Grades

None of my colleagues suggested that students’ grades were evidence of rigor.

One noted that some students may not meet high standards, leading them to fail a course or program. But these failures to demonstrate performance were viewed as unfortunate consequences of rigor, not evidence to document its existence. This sentiment was complemented by another colleague’s comment that providing support (e.g., remedial instruction, additional resources) to students was not a threat to the rigor of a course or program. Helping students meet high standards and improve performance was evidence of rigor, whereas failing grades assigned because students found the content difficult were not.

Teaching Evaluations and Mode of Delivery

None of my colleagues suggested that students’ evaluations of a course or an assignment were evidence of rigor. When documenting rigor, faculty members offered students’ performance on critical, discipline-specific tasks, not their opinions of the activities. Supporting this observation, Duncan, Range, and Hvidston (2013) found no correlation between students’ perceptions of rigor and self-rated learning in online courses. Further, definitions of rigor provided by graduate students in their study were strikingly similar to the definitions provided by my colleagues (e.g., “challenge and build upon existing knowledge…practical application and the interaction of theory, concept, and practice…must be ‘value-added,’” p. 22). Finally, none of my colleagues indicated that mode of delivery (e.g., face-to-face, blended, online) was related to rigor, an observation also supported by Duncan et al. (2013).

Defining Academic Rigor: A Research-Based Perspective

Thanks to my colleagues, I arrived at the Quality Matters Connect Conference with 17 single-spaced pages of notes documenting an understanding of rigor. Though the presentation barely scratched the surface of the content, I was optimistic that we were assembling information to facilitate multiple operational definitions of rigor that could be used flexibly to meet assessment needs. This optimism contributed to my surprise when, during small group discussion among session attendees, the claim was made that academic rigor has too many interpretations and cannot be defined.

I cannot support this claim because most variables addressing human behavior have multiple ways they can be operationally defined, and converging evidence across diverse assessments increases our understanding of a given variable. From this perspective, a single, narrow definition of rigor is neither required nor desirable. A research-based perspective allows for multiple operational definitions and makes salient the value of assessment data that may be underutilized when it informs only a single program’s continuous improvement plans. As Hutchings, Huber, and Ciccone (2011) argue, faculty members engage in valuable work when they apply research methods to examine student learning and share results with others. When assessment and improvement plans are elevated to the level of research, the information can be shared to inform others’ plans and peer reviewed to further improve and expand the process.

Articulating and sharing ways to observe and measure rigor can provide educators and administrators a selection of techniques that can be shaped to meet their needs. Engaging in an explicit examination of this issue across institutions, colleges, programs, and courses facilitates the identification of effective techniques to provide evidence of rigor to support the promises made to our stakeholders.

Andria Schwegler
Associate Professor, Counseling and Psychology
Texas A&M University – Central Texas

References

Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not conflict. (Occasional Paper No. 22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. Retrieved from http://www.learningoutcomesassessment.org/documents/OP2211-17-14.pdf

Duncan, H. E., Range, B., & Hvidston, D. (2013). Exploring student perceptions of rigor online: Toward a definition of rigorous learning. Journal on Excellence in College Teaching, 24(4), 5-28.

Hutchings, P., Huber, M. T., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. San Francisco, CA: Jossey-Bass.

Mueller, J. (2016). Authentic assessment toolbox. Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm

Rigor. (2014, December 29). In The Glossary of Education Reform. Retrieved from https://www.edglossary.org/rigor/

Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. S. (2003). Student engagement in high school classrooms from the perspective of flow theory. School Psychology Quarterly, 18(2), 158-176.

Svinicki, M., & McKeachie, W. J. (2011). How to make lectures more effective. In McKeachie’s teaching tips: Strategies, research, and theory for college and university teachers (13th ed., pp. 55-71). Belmont, CA: Wadsworth.

Wraga, W. G. (2010, May). What’s the problem with a “rigorous academic curriculum”? Paper presented at the meeting of the Society of Professors of Education/American Educational Research Association, Denver, Colorado. Retrieved from https://eric.ed.gov/?id=ED509394

 


2 replies on “Rigor, Meet Reality”

Fabulously well thought out and thought provoking! Great approach to parsing a very Gordian knot! As we see technology taking on more and more traditional course-level assessment roles (scoring tests, providing feedback, checking for plagiarism…), the kinds of interstitial, integrative, capstone work to culminate programs of study described here underscore the irreplaceable and compelling role faculty currently play, and will continue to play, in understanding, designing, and validating learning.
