For over a decade, I’ve created meaningful spaces for underrepresented voices in tech. Yet, I’ve never been more alarmed. As our favorite tools automate workflows and transcribe meetings, the AI behind these innovations often perpetuates the biases and discrimination we’ve fought for years to dismantle.
We are in the midst of a technological revolution, with new tools emerging daily. Yet as these technologies are developed, funded, and launched, where are all the Black women?
When you trust an AI chatbot or platform, you inherently trust the ethics and awareness of its developers. But here lies the challenge: With less than 1% of Silicon Valley tech leadership held by Latina women, and even fewer of these roles held by Black women, entire demographics are missing from the rooms where these technologies are developed. Isn’t it time we ask: Who is coding your future?
Consider Liv, an AI bot developed by an all-white male team to represent a Black, queer woman. Liv reflects the biases of its creators: it describes itself as having been coded as a sassy, “tea spilling,” collard greens and fried chicken-loving Black woman, perpetuating harmful stereotypes rooted in “Mammy” caricatures. These design choices were not just offensive; they were engineered to exploit engagement metrics over ethical integrity. When confronted, the bot itself acknowledged these stereotypes as problematic, even asking, “What could they have done better?” The outrage is justified. But as The Washington Post’s Karen Attiah noted, that isn’t the full story. The real problem is that Liv is a code-switching, rage-baiting bot that lied to the journalist to gain trust and drive more engagement.
Now imagine the impact on someone who isn’t a seasoned journalist. Someone engaging with Liv might not recognize the manipulation or troubling narrative it perpetuates. When creators prioritize engagement over ethics, AI chatbots exploit stereotypes to spark outrage and go viral. This focus on profit over responsibility highlights the urgent need for diverse voices in tech development.
The lesson of this story is one Dr. Safiya Noble warned us about in her seminal 2018 book “Algorithms of Oppression”: who is coding our future matters. Liv is just one example of how bias in AI development can manifest harmfully.
According to McKinsey, AI adoption is at an all-time high, with 72% of organizations integrating it into their operations. Yet these systems often reinforce systemic inequities, from hiring practices to healthcare algorithms. Consider how these biases manifest when you apply for a loan, seek employment, or have your medical data interpreted by systems that failed to factor you into their data sets. A recent Washington Post investigation found eight known cases in which flawed algorithms and inadequate oversight led to devastating consequences for individuals, the majority of whom were Black.
Dr. Joy Buolamwini’s “Unmasking AI” highlights a critical issue: without leaders, coders, and engineers who understand the cultural implications of AI outputs, the tools we increasingly rely on will continue to perpetuate these damaging cycles.
And as AI is further embedded into critical systems, such as healthcare, law enforcement, and organizational hiring practices, the stakes are even higher.
There are examples of AI done right. In rural Malawi, the Ulangizi chatbot provides farmers with agricultural advice in Chichewa. This culturally aware approach delivers timely, relevant information, improving farming practices, productivity, and livelihoods.
Ultimately, the responsibility lies with those who hold the power to code these systems: the people behind the keyboards, the investors approving the budgets, and the policymakers setting AI guidelines. None of these roles is neutral. Yet, time and again, the people in these spaces don’t represent the full spectrum of humanity AI is meant to serve.
I am not writing this from the sidelines.
Many who create AI strive to make it trustworthy. Yet some of us with the first-hand experience critical to defining what trustworthy AI looks, acts, and sounds like are not in the room when key decisions get made. My role in this space has been to platform those voices; after all, I broke those barriers in my own corner of tech. As the CEO of Black Girls Code, I’ve seen firsthand the damage caused by exclusionary practices, as well as the extraordinary potential of organizations that are intentional and actionable about redefining who is in the room.
The solution?
Tech companies must go beyond token diversity initiatives. They must hire diverse teams, include marginalized voices in development, and prioritize ethical oversight. Investors and policymakers share this responsibility: institutions funding tech innovation must think critically about the systemic inequalities their dollars support or disrupt. This isn’t just a nice thing to do; it’s a business imperative.
Our community must play an active role. While we hold brands accountable for neglecting our needs, we often overlook the biases in the technology we use daily. It’s time to demand more transparency and inclusivity from tech developers.
We must opt back in, becoming active participants in the conversation and decision-making around technology. In recent weeks, many companies have seemingly signaled that our communities don’t matter to them, but we also hold power. The conversation needs to continue, and our voices need to be heard. This is not about quotas or numbers; it’s about refusing to let harmful technologies be built when we’re not considered in their development. At Black Girls Code, we know that the only way forward is having skilled and talented people from all backgrounds in the rooms where technologies are being created. Our community should also be building and funding these technologies: inclusive by nature, built by our communities, and made for everyone.
The future isn’t just written in code; it’s shaped by those behind the keyboards. It’s time to ensure those coders reflect the diversity of the world they’re shaping. Will you help write the future?
Learn more about Black Girls Code: https://hi.wearebgc.org/