Since our last blog in January on generative artificial intelligence (AI), the field has changed by leaps and bounds.

The higher education press continues to increase its coverage, with multiple articles, blogs, and op-eds in The Chronicle of Higher Education and Inside Higher Ed, among other outlets. Much of that coverage, though, continues to focus on the pedagogical implications of generative AI, including academic integrity concerns. Equally important, but discussed far less, are the equity considerations around how AI can and should be leveraged in higher education. Today we’ll look at both the positive and the potentially negative aspects of generative AI in higher education as they relate to educational equity.

AI as a Tool for Equity

According to the U.S. Department of Education’s Office of Educational Technology, “Technology can be a powerful tool for transforming learning. It can help affirm and advance relationships between educators and students, reinvent our approaches to learning and collaboration, shrink longstanding equity and accessibility gaps, and adapt learning experiences to meet the needs of all learners.” Artificial intelligence, when used deliberately and carefully, can advance equity by improving educational accessibility and assisting second-language learners, among others.

AI can be especially powerful when addressing learner accessibility. For example, students with dyslexia can benefit from AI, as a December 10, 2022, Washington Post article demonstrated when it described how a British landscaper with dyslexia uses ChatGPT to rewrite his emails so they are more professional and more easily understood. Additionally, students and faculty with AD/HD are finding generative AI useful in approaching research and writing. As Maggie Melo wrote in her February 28, 2023, op-ed in Inside Higher Ed, “My thinking and writing processes are not linear. ChatGPT affords me a controlled chaos.” Melo goes on to describe how the need to create an abstract can feel overwhelming despite having done so numerous times. However, after asking ChatGPT “How to write an abstract,” she received an outline that, as she put it, “provides my mind with an anchor to focus on and return to.” And much like our British landscaper, non-native English speakers may also benefit from ChatGPT’s ability to revise and rephrase text for grammatical correctness and clarity.
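To make that last use concrete, here is a minimal sketch of how a script or assistive tool might ask a text-generation model to revise a draft for clarity. It assumes a recent version of the OpenAI Python client and an OPENAI_API_KEY environment variable; the model name, prompt wording, and sample draft are illustrative choices rather than a recommendation of any particular product.

```python
# Minimal, illustrative sketch: asking a text-generation model to revise a draft
# for grammatical correctness and clarity. Assumes a recent OpenAI Python client
# (pip install openai) and an OPENAI_API_KEY environment variable; the model name
# and prompt are example choices, and any comparable service could be substituted.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

draft = (
    "i need a extension on the assignment becuase my week got really messed up "
    "and i couldnt get to the libary"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name
    messages=[
        {
            "role": "system",
            "content": (
                "Rewrite the user's text so it is grammatically correct, clear, "
                "and professional. Preserve the original meaning."
            ),
        },
        {"role": "user", "content": draft},
    ],
)

print(response.choices[0].message.content)  # the revised, clarified text
```

The same instruction pasted into the ChatGPT web interface produces a comparable revision; the point is simply that this kind of clarity-and-tone rewrite can be scripted or built into assistive tools.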

Challenges to AI as a Tool for Equity

Algorithmic bias and the “New Jim Code”

In her 2019 work, Race After Technology, Princeton University sociologist Ruha Benjamin wrote about what she calls the “New Jim Code” and the problem of data and algorithmic bias.

Benjamin defined the “New Jim Code” as “the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era” (5). Generative AI is trained on large existing data sets, mostly scraped from the internet. This means that models ingest biased information as well as information that is likely to over-represent certain groups, such as white, economically well-off individuals. As the old adage goes, “garbage in, garbage out.”

The result is algorithmic bias. Algorithmic bias “describes systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others”; it occurs when an algorithm produces results that are systemically prejudiced due to erroneous assumptions in the machine learning process. In addition to challenges with training data, algorithmic bias is also shaped by the implicit bias of generative AI developers, a field in which white men are over-represented and women and racial groups such as Black and Latino individuals are under-represented. As Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher wrote in The Age of AI: And Our Human Future, “The algorithms, training data, and objectives for machine learning are determined by the people developing and training the AI, thus they reflect those people’s values, motivations, goals, and judgment” (77).
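To see how skewed training data translates into skewed outcomes, consider the toy sketch below. It uses synthetic data and scikit-learn’s logistic regression and is purely illustrative, not a depiction of any real generative AI system: when one group supplies only a small share of the training data and its underlying pattern differs, the fitted model performs well on average while systematically failing that group.

```python
# Toy illustration (not any real system): a model fit to data that
# under-represents one group, whose feature-label relationship also differs,
# makes systematically more errors for that group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, weights):
    """Generate n toy samples whose label depends on a weighted sum of two features."""
    X = rng.normal(size=(n, 2))
    y = (X @ np.array(weights) > 0).astype(int)
    return X, y

# 95% of the training data comes from group A; only 5% from group B,
# whose relationship between features and label differs.
X_a, y_a = make_group(1900, weights=[1.0, 1.0])   # majority group
X_b, y_b = make_group(100, weights=[-1.0, 1.0])   # under-represented group

model = LogisticRegression().fit(
    np.vstack([X_a, X_b]), np.concatenate([y_a, y_b])
)

# Evaluate on fresh samples drawn from each group separately.
for name, weights in [("group A (majority)", [1.0, 1.0]),
                      ("group B (minority)", [-1.0, 1.0])]:
    X_test, y_test = make_group(2000, weights)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.2f}")

# On a typical run, group A scores around 0.97 while group B hovers near 0.5:
# the model "works" on average yet systematically fails the under-represented group.
```

Overall accuracy looks respectable because the majority group dominates the average, which is precisely how a biased system can present itself as objective.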

Student Access and the Expansion of the Digital Divide

In addition to challenges around algorithmic bias, AI also presents challenges of access. According to the 2021 American Community Survey One-Year Estimates, “Types of Computers and Internet Subscriptions,” of the 127,544,730 households in the United States, 6,320,698 (4.9%) had no computing device, including a smartphone, and 12,374,652 (9.7%) had no internet subscription, including a cellular data plan. This digital divide is especially acute for low-income Americans: 15.3 percent of Americans with incomes of less than $75,000 lack internet access. And a 2021 Pew Research Center study found that 15 percent of all American adults were smartphone-only internet users, a figure that rose sharply to 28 percent among 18- to 29-year-olds. Even then, that figure is not evenly distributed across racial groups: 25 percent of Hispanic young adults and 17 percent of Black young adults were smartphone-only internet users, compared to 10 percent of White young adults.

Why does the digital divide matter when we explore equity and AI? Simply put, most generative AI is difficult to use without an internet connection. Although text-based generative AI like ChatGPT can be accessed from mobile devices, response times over a cellular connection may be slower than one would experience on a high-speed internet connection. Making more sophisticated queries with long outputs would be difficult, at best.

In addition to challenges resulting from the digital divide, there are challenges associated with the cost of using the generative AI tools themselves. ChatGPT, which started out free, is now partially behind a paywall, raising the question of how much longer these tools will remain freely available. In fact, the economic realities of the astronomical costs of running generative AI almost guarantee that such paywalls will become more common. CNBC reports that training a large language model alone can cost more than $4 million, and some estimates put the cost of running ChatGPT at $100,000 per day, or roughly $3,000,000 per month.

What will be the result of fewer students having access to generative AI? We run the risk of the digital divide turning into an AI divide. Students who lack sufficient access will not gain the generative AI skills that will become increasingly necessary as we enter an age of hybrid human/AI work.

What Does this Mean for You?

Generative AI has the potential to revolutionize society, including higher education, in ways we are still determining. But as higher education professionals, we need to be cognizant of how we leverage it. How can you build upon the promise of generative AI while mitigating some of its challenges?

  • Explore the ways that generative AI can be leveraged to improve learner accessibility. This may mean working with various offices on campus including the office that handles student accommodations.
  • Be especially cognizant of the limitations students may face in accessing generative AI and plan accordingly. This might mean ensuring that there are adequate campus resources and, for face-to-face and hybrid courses, considering a focus on in-class AI use via campus computer labs or in-classroom device loan programs.
  • Help students think critically about the results of generative AI, especially large language models trained on biased data sets. A key piece of data literacy in the age of artificial intelligence needs to be a discussion of algorithmic bias and the New Jim Code.

As we continue to explore the ways in which generative artificial intelligence can impact higher education, it is critical for us to remember that no technology is neutral.

As Kate Crawford puts it in Atlas of AI, “Artificial intelligence is not an objective, universal, or neutral computational technique that makes determinations without human direction. Its systems are embedded in social, political, cultural, and economic worlds, shaped by humans, institutions, and imperatives that determine what they do and how they do it” (211). Does this negate the potential advantages of generative AI and the ways it can improve educational equity? No, but it does mean we should be cognizant of the potential for further educational inequity and work to counter that potential.


Van Davis

Chief Strategy Officer, WCET


vdavis@wiche.edu

LinkedIn Profile
