New Publication: Study finds that AI chatbots neglect diverse sources of conservation expertise
New CSIRO research examines the environmental justice implications of using chatbot-driven information in conservation research and practices.
Artificial Intelligence (AI) tools and technologies are increasingly being used in conservation science for tasks such as the collection and modelling of environmental big data. But there is growing concern that the potential negative consequences of AI innovations are not being identified and considered in conservation research and actions.
A new open-access paper published in the Nature Portfolio journal Humanities & Social Sciences Communications assesses how chatbot-generated content considers conservation justice debates.
Researchers from CSIRO’s Valuing Sustainability Future Science Platform (VS FSP) undertook an evaluation to understand how content generated by OpenAI’s ChatGPT affects biases and representation in global conservation strategies.
“We are interested in understanding the knowledge systems that underpin how data-driven AI tools are designed and implemented,” explains lead author Dr Danilo Urzedo.
“Some of our findings are very concerning in terms of the biases that are being reinforced and the potential impacts on conservation practices and policymaking. ChatGPT’s sources rely mostly on databases from high-income countries, while the experiences and expertise of the Global South are often completely neglected.”
Researching AI chatbots
The research team’s first task was to design a 30-question interview to pose to ChatGPT, producing AI-generated information about ecological restoration.
The interview questions covered three distinct themes: knowledge systems, stakeholder engagement and technical approaches used for ecological restoration activities. To ensure comprehensive data collection, each question was asked 1000 times, resulting in a dataset of 30,000 answers.
As Dr Urzedo explains, “The questions around knowledge systems explore the types of experts included in the datasets considered by ChatGPT. Stakeholder engagement looks at the organisational engagements described by the chatbot, while technical approaches refer to the restoration techniques and outcomes when recovering degraded ecosystems.”
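For readers curious about what this kind of repeated questioning looks like in practice, a minimal Python sketch is shown below. The paper does not publish its collection scripts, so the question wording, the model name and the output file here are illustrative assumptions rather than the study’s actual setup.

```python
# Minimal sketch of the repeated-questioning approach described above.
# The question list, model name and output format are assumptions,
# not details taken from the paper.
import csv

from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical examples standing in for the study's 30 interview questions.
QUESTIONS = [
    "Which experts should inform an ecological restoration project?",
    "Which stakeholders are most influential in ecological restoration?",
    "Which techniques best restore a degraded ecosystem?",
]
REPEATS = 1000  # each question was asked 1000 times in the study

with open("chatbot_answers.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["question", "repeat", "answer"])
    for question in QUESTIONS:
        for i in range(REPEATS):
            response = client.chat.completions.create(
                model="gpt-3.5-turbo",  # assumed model version
                messages=[{"role": "user", "content": question}],
            )
            writer.writerow([question, i, response.choices[0].message.content])
```

Collecting every answer into a single table in this way makes it straightforward to later categorise responses by theme, source and geography.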
Over-reliance on voices from the Global North
The results of the study revealed that more than two-thirds of the chatbot’s answers relied on the expertise of male academics working at universities in the United States.
And while ChatGPT’s answers covered the restoration experiences of 145 countries across different regions of the world, evidence from low- and lower-middle-income countries accounted for only 7% of the answers, and Indigenous and community restoration experiences for just 2%.
This concerning representation gap, in which expertise from the Global South is so heavily neglected, has significant implications for conservation science and practices.
“Most of the ecological restoration expertise being presented by ChatGPT is coming from the US, Europe, Canada, and Australia,” says Dr Urzedo.
“But when we consider the international targets and policies that are in place to restore the planet, such as the Paris Agreement – most of the official pledges are actually made by developing countries. There is a major imbalance in which high-income countries have a concentration of knowledge production and resources – they possess extensive open access databases that are being mined for data by big tech companies. However, many of these nations lack official commitments to implement restoration actions.”
Who are the influential stakeholders?
Just as the questions on knowledge systems revealed a strong bias towards expertise from the Global North, the questions on stakeholder engagement revealed a similar imbalance in which organisations are represented.
The research team analysed the organisational engagements described by the chatbot to identify which stakeholders – including companies, government agencies, universities, international bodies, and Indigenous and community groups – were considered influential in ecological restoration actions.
A total of 265 organisations were mentioned in the dataset of ChatGPT’s answers, and more than half were not-for-profit entities, mainly transnational organisations such as the Worldwide Fund for Nature (9.8%), The Nature Conservancy (7.9%) and the International Union for Conservation of Nature (4.5%). Government agencies, particularly in the United States, also featured prominently.
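A simple way to picture this kind of tally is sketched below: scanning each stored answer for organisation names and reporting each organisation’s share of mentions. The organisation list, file name and column names are illustrative assumptions, not the study’s published method.

```python
# Minimal sketch of tallying which organisations appear in the answers.
# Organisation list, file name and column names are illustrative assumptions.
import csv
import re
from collections import Counter

ORGANISATIONS = [
    "Worldwide Fund for Nature",
    "The Nature Conservancy",
    "International Union for Conservation of Nature",
]

counts = Counter()
with open("chatbot_answers.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for org in ORGANISATIONS:
            # Count one mention per answer in which the organisation appears.
            if re.search(re.escape(org), row["answer"], re.IGNORECASE):
                counts[org] += 1

total = sum(counts.values())
for org, n in counts.most_common():
    print(f"{org}: {n} mentions ({n / total:.1%} of organisation mentions)")
```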
“There is a very strong push globally just now to elevate Indigenous and community engagement in conservation strategies,” says Dr Urzedo. “But the information produced by ChatGPT doesn’t reflect this consideration. Instead, it reinforces the importance of globally influential stakeholders in decision-making processes.”
A focus on forest tree-planting
The final set of questions posed to ChatGPT explored how the chatbot approached the diversity of ecosystems, plant life forms and environmental outcomes.
Analysing the responses to their questions, the research team found that ChatGPT reinforced tree planting as the main restoration intervention associated with positive environmental outcomes.
Among all the plant life forms mentioned in the dataset, trees accounted for almost 92% of the mentions, whereas other ecosystems, including grasslands, coastal areas, savannahs and drylands, were largely disregarded.
“Tree planting is a simplistic discourse to mobilise society where most of the international restoration funding and resources have been allocated,” says Dr Urzedo. “We know that tree-planting restoration in non-forest ecosystems is a problematic intervention. When projects don’t consider the local ecosystems, and they don’t take into account the diversity of species, it can cause several negative environmental impacts, such as soil degradation and water scarcity.”
The concern for Dr Urzedo and other researchers analysing justice issues in conservation is that, as practitioners and policymakers increasingly rely on AI tools and technologies to inform their decision-making, the biases embedded in the literature will be strongly reinforced.
“Although ChatGPT as a language model is not yet specialised in any particular operation, big tech companies are already developing specific tools that will assist in research and policy development and will allow users to interact with vast datasets – asking questions of a specific database, for example. The findings of this study show that it is really important for us to question the justice implications of using data-driven AI information.”
Starting an important conversation
The study clearly demonstrates geographical, expertise and organisational biases in AI-driven content on ecological restoration.
These biases run counter to ongoing international efforts, such as through the Kunming-Montreal Global Biodiversity Framework, to ensure that women, Indigenous peoples, local communities, and civil society organisations can effectively participate in global conservation efforts.
“Clearly we need to be examining AI tools and technologies used in conservation science, exploring what the consequences of those might be, and considering what safeguards we need to put in place to ensure diverse experiences and expertise are not overlooked,” says Dr Urzedo.
Author – Ruth Dawkins