Last week we welcomed Lynn Wahl, Instructional Designer for UNC Charlotte, to discuss how they used design thinking to create online faculty development workshops. In a continuation of the topic, Lynn joins us today to talk about how to assess a learning opportunity such as the online workshops to ensure they continue to engage attendees and meet their needs.

Thank you very much to Lynn for showcasing the creative endeavors and assessment practices going on at UNC Charlotte!

Enjoy the read and enjoy your day,

Lindsey Downs, WCET


A major focus in planning and developing training is ensuring that it fulfills the needs of its intended audience. This is particularly important for online workshops, since they often require a higher facilitation effort that you don’t want to see wasted. This places a heavy emphasis on iterative revision of workshops to make sure everything is working well.

One method commonly used to determine where to make changes to training is the end-of-workshop survey; but in online workshops, surveys alone are not enough to show the complete picture of how participants engage with content, assessments, and the facilitator.

Learning analytics tracked within a learning management system can provide the rest of the picture developers need to make meaningful revisions to training.

Workshop Types

The Center for Teaching & Learning at UNC Charlotte began delivering online pedagogy workshops in Spring 2017. The first online workshop, on writing learning objectives, was created using a design thinking methodology. We discussed the development of our online faculty workshops in a post last week on Frontiers. Following strong participant feedback, two more workshops were transformed into online offerings.

All three of our online workshops utilize different facilitation models matched to the topics and the types of interactions most helpful for participants to learn the required skills:

Workshop | Facilitation model | Interaction types and skills
Introduction to Learning Objectives and Backward Design | Facilitated | Auto- and manually graded self-checks, discussion, and an application-based assignment. Helps faculty practice a difficult skill and get feedback.
Syllabus 101 | Resource based | Optional auto-graded self-checks and a culminating discussion forum. Provides valuable “just-in-time” information.
Using Feedback to Improve Teaching and Learning | Discussion based | Asks faculty to share their experiences in overcoming feedback challenges in their courses.

Gathering Data

Gathering and analyzing data from online workshops can be a little overwhelming, but even simple learner analytics like page views and discussion forum mapping can give you the extra information you need to improve participant experience. This type of information can be pulled from your learning management system (LMS).

For example, we use Canvas to house our faculty workshops. A free Canvas workshop created by Dartmouth College gave us a starting point for pulling data out of Canvas. We were able to pull information out of our institution’s instance of Canvas and use pre-built analytics tools to explore and map the data. This data should be available to your Canvas administrators who may be able to assist you with this process.
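If you also use Canvas, a small script against its REST API can be one starting point for pulling activity data. Below is a minimal sketch, not the exact process from the Dartmouth workshop: the base URL, token, and course ID are placeholders, and it assumes your account has access to the course-level analytics endpoint.

```python
# A minimal sketch of pulling daily activity data from the Canvas REST API.
# BASE_URL, API_TOKEN, and COURSE_ID are hypothetical placeholders.
import requests

BASE_URL = "https://your-institution.instructure.com"  # hypothetical
API_TOKEN = "your-api-token"                           # hypothetical
COURSE_ID = 12345                                      # hypothetical

def get_course_activity(course_id):
    """Fetch course-level activity analytics (one record per day)."""
    resp = requests.get(
        f"{BASE_URL}/api/v1/courses/{course_id}/analytics/activity",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Each record includes the date plus page view and participation counts.
    for day in get_course_activity(COURSE_ID):
        print(day["date"], day["views"], day["participations"])
```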

Analyzing Data

Once you’ve retrieved data, often the more difficult task is making sense out of it and answering the “so what?” question.
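For instance, a first pass at page-view data might simply aggregate views by hour of day, which is how a finding like the peak-working-time observation below can surface. This is a hypothetical sketch: it assumes the data has been exported to a CSV with one row per page view and a timestamp column, which will vary by LMS and export method.

```python
# A hypothetical sketch: find peak working hours from exported page views.
# Assumes "page_views.csv" has one row per view with a "timestamp" column.
import pandas as pd

views = pd.read_csv("page_views.csv", parse_dates=["timestamp"])

# Count views by hour of day to see when participants actually work.
by_hour = views["timestamp"].dt.hour.value_counts().sort_index()

print(by_hour)
print("Peak hour:", by_hour.idxmax())
```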

In looking through the large amount of page-view information across multiple sections of the three workshops, we noticed a few things:

  • Participant working time is between 8am and 11pm, with peak working time between 9am and 11am.
  • Participants spend significantly less time than allotted in the facilitated learning objectives workshop (the first workshop in the table above).
  • There are clear divisions between the top page views in the workshop and the less visited resources.
  • Facilitation and engagement with instructors in the discussion-based workshop have a big impact on how participants interact in the forums.
Example of a network analysis graph for the discussion-based workshop. Each circle represents a participant, and the shaded groupings represent different sections of the workshop. The lightly facilitated section is more dispersed, with lines between participants showing how they interact, while the facilitated discussion forum forms one tightly connected circle with more interaction. Participant names have been blurred to protect their privacy. Author created image. (A minimal sketch of how such a graph can be built from exported reply data follows this list.)
  • Survey results and participant feedback don’t always match up with how participants engage with the workshops.
  • Participants frequently return to the content (both resources and discussion posts) in the resource-based workshop, return infrequently in the facilitated workshop, and do not return at all in the discussion-based workshop.

Chart showing return access at +15 days for each group: the facilitated workshop had 47 page views from 9 users, who viewed several different pages of the class; the discussion-based group had 2 page views from 2 users, who viewed only the homepage; the resource-based group had 484 page views from 51 users, who viewed a variety of pages.
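A network graph like the one described above can be rebuilt with general-purpose tools. The sketch below uses the networkx library and is illustrative only: it assumes a hypothetical CSV export of forum replies with “author” and “replied_to” columns, which is not the exact format our tools produced.

```python
# A minimal sketch: map forum interactions as a graph, where each
# participant is a node and each reply draws an edge to the post's author.
# "forum_replies.csv" and its columns are hypothetical.
import matplotlib.pyplot as plt
import networkx as nx
import pandas as pd

replies = pd.read_csv("forum_replies.csv")

G = nx.Graph()
for _, row in replies.iterrows():
    G.add_edge(row["author"], row["replied_to"])

# Tightly facilitated sections show up as densely connected clusters;
# lightly facilitated sections appear more dispersed.
nx.draw_spring(G, with_labels=False, node_size=80)
plt.show()
```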

Revisions

Based on what we saw in the data, the best strategy was to design for the typical participant experience and accommodate the outliers when necessary.

Under this guiding thought, we are now implementing the following revisions in the online workshops:

  • Change the amount of time we state it takes to complete the workshops to more accurately reflect the workload.
  • Reply to questions, and schedule announcements and the release of new content, before 9am every day (ahead of peak working time).
  • Create additional workshops and resources around the top page views in all of the workshops.
  • Remove the bottom viewed pages in all of the workshops.
  • Add more media and examples.
  • Remove some of the self-check multiple choice quizzes.

Bell curve chart illustrating the guiding thought of course development: design for the norm, but accommodate outliers.
Once the changes are in place for a few semesters, we can rerun the analytics to see how our changes affect participant experiences.
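The before-and-after comparison itself can stay simple. A hypothetical sketch, assuming page-view exports from the pre- and post-revision semesters with a participant ID column:

```python
# A hypothetical sketch: compare engagement before and after revisions.
# File names and the "user_id" column are illustrative placeholders.
import pandas as pd

before = pd.read_csv("page_views_before.csv")
after = pd.read_csv("page_views_after.csv")

def views_per_participant(df):
    """Mean number of page views per participant in one cohort."""
    return df.groupby("user_id").size().mean()

print("Before revisions:", views_per_participant(before))
print("After revisions: ", views_per_participant(after))
```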

How can you use analytics to help make your events or courses better? Once you have the data, you are empowered to make meaningful updates that improve these learning experiences for your attendees.


Lynn Wahl
Instructional Designer
UNC Charlotte Center for Teaching and Learning