ERP News


AI is hurting people of color and the poor. Experts want to fix that


New technology brings great promise, and just as many problems. Smartphones put access to infinite knowledge in our pockets, but led to the rise of tech addiction. The social media platforms that connected billions of people were turned against democracy.

And so it is with artificial intelligence, which could fundamentally change the world while contributing to greater racial bias and exclusion.

Much of the focus on any downsides of artificial intelligence has been on things like crashing self-driving cars and the rise of machines that kill. Or, as CNN commentator Van Jones put it at a discussion on the topic last week, “What about Terminator?”

But many of the researchers behind this technology say it could pose a greater threat to society by adversely impacting the poor, the disenfranchised, and people of color.

“Every time humanity goes through a new wave of innovation and technological transformation, there are people who are hurt and there are issues as large as geopolitical conflict,” said Fei-Fei Li, the director of the Stanford Artificial Intelligence Lab. “AI is no exception.”

These are not issues for the future, but the present. AI powers the speech recognition that makes Siri and Alexa work. It underpins useful services like Google Photos and Google Translate. It helps Netflix recommend movies, Pandora suggest songs, and Amazon push products. And it’s the reason self-driving cars can drive themselves.

One part of AI is machine learning, in which a system analyzes massive amounts of data to make decisions and recognize patterns on its own. And that data must be carefully considered so that it doesn’t reflect or contribute to existing biases.

“In AI development, we say garbage in, garbage out,” Li said. “If our data we’re starting with is biased, our decision coming out of it is biased.”
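The “garbage in, garbage out” point can be made concrete with a toy sketch. In the hypothetical example below (the groups, labels, and counts are all invented for illustration, not taken from any real system), a naive model simply learns the most frequent outcome seen for each group in its training data. Because one group is scarce and skewed in that data, the model reproduces the skew in its predictions:

```python
from collections import Counter, defaultdict

# Hypothetical training data: (group, outcome) pairs.
# Group "A" is heavily overrepresented; group "B" has only 10
# examples, most of them negative -- a biased sample.
train = ([("A", "approve")] * 90 + [("A", "deny")] * 10
         + [("B", "approve")] * 2 + [("B", "deny")] * 8)

def fit(data):
    """Learn the most common outcome per group, plus an overall fallback."""
    by_group = defaultdict(Counter)
    overall = Counter()
    for group, outcome in data:
        by_group[group][outcome] += 1
        overall[outcome] += 1
    table = {g: c.most_common(1)[0][0] for g, c in by_group.items()}
    fallback = overall.most_common(1)[0][0]
    return table, fallback

def predict(model, group):
    table, fallback = model
    return table.get(group, fallback)

model = fit(train)
print(predict(model, "A"))  # "approve" -- learned from 100 examples
print(predict(model, "B"))  # "deny" -- learned from 10 skewed examples
```

The model is not malicious; it faithfully summarizes what it was shown. The biased decision for group “B” comes entirely from the biased sample it was trained on, which is the point Li is making.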

We’ve already seen examples of this. A recent study by the M.I.T. Media Lab found facial recognition software has trouble identifying women of color. Tests by The Washington Post found that accents often trip up smart speakers like Alexa. And an investigation by ProPublica revealed that software used to sentence criminals is biased against black Americans.


Article Credit: CNN
