Keynote speaker Rachel (Rae) Walker, Ph.D., RN, FAAN, opened the 2024 Zeigler Forum by describing one of their early experiences with Artificial Intelligence (A.I.): a study that used eye-tracking technology and machine learning to help cancer patients better manage chronic fatigue syndrome, a complicated disorder with no known cause and no single test to confirm diagnosis. The study eased physical suffering for participants, garnered positive attention for the computer engineering team that built the technology, and thrust Walker into unfamiliar territory as a consultant on health issues involving technologies like A.I. and machine learning.
But, said Walker, “The study didn’t move us any closer to justice for patients experiencing chronic fatigue.”
Walker (they/them), whose scholarship focuses on equity-centered, community-directed health innovation and digital defense against technologies that cause harm, is the only nurse Invention Ambassador for the American Association for the Advancement of Science. In 2023, they co-founded Health Tech for the People, a multidisciplinary research thrust focused on tech ethics and accountable design.
Explaining why mitigating chronic fatigue was a challenge that A.I. was unequipped to meet, Walker pointed to medical gaslighting, a behavior in which a physician or other medical professional dismisses or downplays a patient’s physical symptoms or attributes them to something else, such as a psychological condition.
“Fatigue is one of those invisible symptoms that can be incredibly disabling,” said Walker. “It's something we're seeing now with syndromes like long COVID. And depending on who you are in the world and where you sit, this can determine whether anyone takes your symptoms seriously.”
Walker’s point was that for A.I. to help patients with chronic fatigue, it would also have to convince providers that patients were genuinely experiencing symptoms of the condition.
“We would have had to have designed a means by which to address the underlying structures that lead to medical gaslighting and chronic undertreatment of certain symptoms: things like racism, sexism, ableism and the structural poverty required by capitalism,” said Walker.
Walker focused the rest of the talk on what they’ve learned from their experiences with A.I. and underscored the importance of questioning its potential for both good and harm, particularly in the context of the ways that care work is, and historically has been, valued and undervalued.
“These are questions about A.I., but they’re not just about A.I. They’re about the forces that shape the society and systems in which we live, the ways in which we recognize and give value to resources and labor in our society and health systems, and who decides,” said Walker. “The ‘who’ is critical here: Who is defining, who is assigning value, who makes the decisions about what is health, and what is care, and what should that look like going forward?”
Walker emphasized that A.I. technologies like machine learning and generative A.I.’s predictive text draw upon past data to create statistically representative patterns, effectively replicating the past in the present.
“At a time when A.I.-animated technologies are in the headlines and on everybody’s lips and quite literally in our bedrooms and our pockets in the form of smart speakers and smartphones and the internet of things, we should make sure aspirations for health justice and our commitments to each other and all those we accompany in care stay front and center,” said Walker.
How technology impacts clinical care
“Technology isn’t flawless,” said Clinical Assistant Nursing Professor Brandon Brown, M.S.N., RN, reflecting on Walker’s keynote and its value for students in the health professions. “It has bias, just like people do, and I think that's important to think about for nursing, especially as people are sorted into algorithms for care. Thinking about who gets sorted, where, and why is important for the nurse to keep in mind because that can cause harm.”
Brown also noted the limitations of some widely used healthcare technologies: “There’s an interesting case of the automatic soap dispensers in the bathrooms, how the technology was designed for white skin and wasn’t working for people of color. And a study found that the pulse oximeter can be inaccurate for people who are not white.”
“It’s important for our students to think about how technology might impact the way in which they care for folks,” said Brown.
For Kristen Koeller, who earned her Doctor of Nursing Practice degree from the University of Vermont this spring, Walker’s perspective prompted an additional question.
“The key that’s missing with A.I. is that there’s no person-to-person connection,” said Koeller. “I question if A.I. will ever replace that. I don’t think it will.”
The 2024 Zeigler Research Forum featured 70 student poster presentations, a talk by Dr. Melissa Scheiber, recipient of the 2023 CNHS Research Incentive Award, and a series of data blitz presentations by students.
Walker is a co-author, and Brown is co-editor, of ‘Nursing a Radical Imagination: Moving from Theory and History to Action and Alternate Futures.’