

Opinion: Artificial intelligence can discriminate on the basis of race and gender, and also age

Algorithms have been shown to discriminate on the basis of race and gender. Studying age-related discrimination is essential to develop more equitable AI systems and technologies
People must understand the risks and harms associated with age-related biases as more older adults turn to technology.

We have accepted the use of artificial intelligence (AI) in complex processes — from health care to our daily use of social media — often without critical investigation, until it is too late. The use of AI is inescapable in our modern society, and it may perpetuate existing biases and discrimination. When health-care providers rely on biased technology, there are real and harmful impacts.

This became clear recently when a study showed that pulse oximeters — which measure the amount of oxygen in the blood and have been an essential tool for the clinical management of COVID-19 — are less accurate for people with darker skin. The findings prompted efforts to create international standards for testing medical devices.

There are examples in health care, business, government and everyday life where biased algorithms have led to serious problems.

AI is often assumed to be more objective than humans. In reality, however, AI algorithms make decisions based on human-annotated data, which can be biased and exclusionary. Current research on bias in AI focuses largely on gender and race. But what about age-related bias — can AI be ageist?

Ageist technologies?

In 2021, the World Health Organization released its Global Report on Ageism, which called for urgent action to combat ageism because of its widespread impacts on health and well-being.

Ageism is defined as the stereotypes, prejudice and discrimination directed toward people on the basis of their age. It can be explicit or implicit, and can take institutional, interpersonal or self-directed forms.

The pervasiveness of ageism has been brought to the forefront throughout the COVID-19 pandemic. Older adults have been labelled as expendable, and in some jurisdictions, age has been used as a criterion for rationing scarce medical care.

Digital ageism exists when age-based bias and discrimination are created or reinforced by technology. A recent report points to a rapidly expanding digital world in which more and more of daily life takes place. Yet even though older adults are using technology in greater numbers — and benefiting from that use — they continue to be the age cohort least likely to have access to a computer and the internet.

Digital ageism can arise when ageist attitudes influence technology design, or when ageism makes it more difficult for older adults to access and enjoy the full benefits of digital technologies.

Cycles of injustice

There are several intertwined cycles of injustice where technological, individual and social biases interact to produce, reinforce and contribute to digital ageism.

Older adults are frequently excluded from the research, design and development process of digital technologies. Their absence in technology design and development may also be rationalized by the ageist belief that older adults are incapable of using technology. As a result, older adults and their perspectives are rarely involved in the development of AI and related policies, funding and support services.

The unique experiences and needs of older adults are overlooked, despite age being a more powerful predictor of technology use than other demographic characteristics including race and gender.

AI is trained on data, and the absence of older adults from that data can reproduce or even amplify these ageist assumptions in its output. Many AI technologies are focused on a stereotypical image of an older adult in poor health — a narrow segment of the population that ignores healthy aging. This creates a negative feedback loop that not only discourages older adults from using AI, but also results in even less data being collected about them, deepening their underrepresentation.

Even when older adults are included in large datasets, they are often grouped into broad, coarse age categories. For example, older adults may be defined as everyone aged 50 and older, despite younger age cohorts being divided into narrower age ranges. As a result, the data cannot capture the diversity of a group that spans several decades, and meaningful differences among older adults are lost.
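To make the effect of such coarse binning concrete, here is a minimal, hypothetical sketch; the ages and the binning scheme below are invented for illustration and are not drawn from any real dataset.

```python
# Hypothetical illustration: one open-ended "50+" bucket collapses very
# different life stages into a single category, while younger cohorts
# keep finer-grained ten-year bins.

def bin_age(age: int) -> str:
    """Mimic a common survey scheme: ten-year bins up to 49, then one open-ended bin."""
    if age < 50:
        lower = (age // 10) * 10
        return f"{lower}-{lower + 9}"
    return "50+"

for age in [23, 37, 52, 68, 84, 91]:
    print(age, "->", bin_age(age))

# 23 -> 20-29 and 37 -> 30-39 stay distinguishable, but 52, 68, 84 and 91
# all become "50+", so any model trained on these labels cannot tell a
# 52-year-old from a 91-year-old.
```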

In this way, AI systems reinforce inequality and magnify societal exclusion for sections of the population, creating a "digital underclass" primarily made up of older, poor, racialized and marginalized groups.

Addressing digital ageism

We must understand the risks and harms associated with age-related biases as more older adults turn to technology.

The first step is for researchers and developers to acknowledge the existence of digital ageism alongside other forms of algorithmic biases, such as racism and sexism. They need to direct efforts towards identifying and measuring it. The next step is to develop safeguards for AI systems to mitigate ageist outcomes.

There is currently very little training, auditing or oversight of AI-driven activities from a regulatory or legal perspective.

Digital ageism thus stands alongside other forms of bias and discrimination in need of excision. To combat it, older adults must be included in a meaningful and collaborative way in designing new technologies.

With bias in AI now recognized as a critical problem in need of urgent action, it is time to consider the experience of digital ageism for older adults, and understand how growing old in an increasingly digital world may affect them.

The Conversation

Charlene Chu receives research funding from the Canadian Institutes of Health Research, the New Frontiers in Research Fund, the Social Sciences and Humanities Research Council, and the Alzheimer Society of Canada. She is an Affiliate Scientist at KITE-Toronto Rehabilitation Institute, University Health Network.

Kathleen Leslie receives funding from the Canadian Institutes of Health Research, Social Sciences and Humanities Research Council, and the National Council of State Boards of Nursing. She is the Governance and Regulation theme lead at the Canadian Health Workforce Network.

Rune Nyrup receives funding from the Wellcome Trust and the Leverhulme Trust. He is a senior research fellow at the Leverhulme Centre for the Future of Intelligence and a research fellow at the Department of History and Philosophy of Science, University of Cambridge.

Shehroz Khan receives funding from the Natural Sciences and Engineering Research Council, the Canadian Institutes of Health Research, and the Social Sciences and Humanities Research Council. He is affiliated with the University of Toronto as an Assistant Professor.
