By: Ismahan Salat
I’m Ismahan Salat from Burien, WA. I care about environmental justice and supporting Indigenous communities in protecting their culture, land, and rights. AI is everywhere. It’s supposed to make life easier, but for Indigenous communities it’s tricky: it can either help protect culture and land or make things worse. The difference comes down to who’s in control and whose voices actually matter.

AI can do serious harm when it’s biased or used without permission. Most AI is trained on data that ignores Indigenous languages, stories, and histories, which means it can erase perspectives, spread stereotypes, or just get things straight-up wrong. Facial recognition technology, for example, misidentifies people of color far more often than white people, which could put Indigenous activists at risk of surveillance (Buolamwini & Gebru, 2018). Cultural appropriation is another problem. AI scrapes songs, art, and stories online without asking. That’s knowledge taken out of context, and it ignores Indigenous rights. The Indigenous AI position paper calls this “digital colonialism” because it repeats old patterns of taking what isn’t yours (Lewis et al., 2020).

But AI isn’t all bad. Done right, it can actually help. One of the coolest possibilities is saving Indigenous languages. Many are disappearing, but AI tools can help preserve them. Canada’s FirstVoices platform uses AI to build dictionaries, learning tools, and archives, all controlled by Indigenous language experts (First Peoples’ Cultural Council, 2023).

AI can also help protect the environment. Indigenous nations are often fighting to defend land from climate change and resource extraction. AI can track deforestation, water pollution, and animal migration, giving communities evidence to protect their territories and traditions.

The real key is who controls the data. Indigenous data sovereignty means communities decide how their information is used, so AI works for them, not against them.
The Global Indigenous Data Alliance created the CARE Principles (Collective benefit, Authority to control, Responsibility, and Ethics) to guide AI in a way that actually respects communities (GIDA, 2019).

AI isn’t neutral; it reflects the people who make it. For Indigenous communities it can either repeat harm or create new opportunities to protect culture, land, and language. The future depends on centering Indigenous voices, giving communities control, and building AI that actually respects their worldviews. If we do it right, AI can help Indigenous Peoples survive and thrive instead of being another way to erase them.