
Risks of AI for Indigenous Peoples (bias, surveillance, cultural appropriation, data exploitation).

11/2/2025

By: Arbay Abdulahi

I am a contributor to the Earth Daughters campaign, and I believe that artificial intelligence left without oversight is not a futuristic tool for progress. It is a threat of digital colonization, one that could erase the identities and knowledge of Indigenous peoples.

Right now, AI is everywhere, and it carries many negative effects: it can spread misinformation even as people rely on it for data entry and for managing systems. For Indigenous peoples, AI brings a particular threat, because it is acting like a new colonial power, taking what is not its own and turning it into something different. This is not only a technical issue; it is also a fight for human rights and environmental justice.

I would say that one of the biggest problems with AI is that it gets all of its information from collected data. Because that data leaves out and misrepresents Indigenous communities, AI is biased, and when it is used by governments or agencies it creates real harm. In the article "Artificial Intelligence and Indigenous Peoples' Realities," Nina Sangma writes, "For indigenous peoples the dawning age of Artificial intelligence threatens to exacerbate existing inequalities and violence." This shows that AI is a threat: it does not just create new problems, it makes the problems that already exist even worse because it works quickly and at a large scale.

The systems built on artificial intelligence are also risky for Indigenous people because of high-risk surveillance. Facial recognition, for example, is trained on data that poorly represents Indigenous people, so they are wrongly identified, falsely accused, and unfairly targeted. These AI tools make existing discrimination much worse by giving authorities new ways to watch and control marginalized communities. In the article "Biased Technology: The Automated Discrimination of Facial Recognition," Rachel Fergus states that "Law enforcement and the criminal justice system already disproportionately target and incarcerate people of color. Using technology that has documented problems with correctly identifying people of color is dangerous." This shows that Indigenous people face a double burden of risk, and that adding an AI technology like facial recognition, which is not always correct, makes that danger worse.

In conclusion, biased AI and the digital theft of culture are major new dangers to Indigenous peoples. AI is already harming Indigenous rights and damaging cultural integrity by using Indigenous knowledge and information without permission. For the Earth Daughters campaign, fighting for the land also means fighting for fairness in technology. We need to stand up for Indigenous sovereignty and make sure AI does not become a tool of digital takeover, and to do that I believe we must all work together to decide how this technology is controlled and managed.

References
  • Nina Sangma (March 19, 2024). "Artificial Intelligence and Indigenous Peoples' Realities." Cultural Survival Quarterly. https://www.culturalsurvival.org/publications/cultural-survival-quarterly/artificial-intelligence-and-indigenous-peoples-realities
  • Rachel Fergus (February 29, 2024). "Biased Technology: The Automated Discrimination of Facial Recognition." ACLU of Minnesota. https://www.aclu-mn.org/en/news/biased-technology-automated-discrimination-facial-recognition