OvaNewsBlast.com

AI inevitability – can we separate bias from AI innovation?

October 15, 2020
in Technology

We’ve been led to believe that A.I. is going to solve all of our problems – economically, socially, environmentally. It stretches credulity that it can do that when all it does is find patterns in numbers. But what it is capable of – in that limited role – is dangerous.

Nevertheless, A.I.’s inevitability, predicted by industry, academics, and analysts, goes unquestioned. That bias, exclusion, and disinformation are social problems, ones that cannot be addressed with pattern-matching and curve fitting and cannot satisfactorily be dealt with by technology, is treated as a strength of the inevitability narrative, not a weakness.

The false promise of “solving” bias computationally obscures the larger issue: bias is pervasive. But there is an industry-wide strategic switcheroo lurking here, a beguiling diversion. Instead of being a problem, bias becomes part of the broad picture of A.I. innovation. It’s like Hunter S. Thompson’s definition of Gonzo Journalism: start a fire and report on it.

Beyond Defense and Intelligence operations, whose scale is unknown, most A.I. today is used to pierce your privacy veil and assemble a “360 degree” view of you to sell you things. And when A.I. deals with data about human beings, everyone knows it has a big problem: bias. A.I. Ethics and notions of “fairness” have become a full-employment act for swarms of thinkers, writers, academics, consultants, and self-declared “ethicists,” many of whom have no credentials, no experience in A.I., and no clue how to solve the bias problem.

The A.I. industry purports to attack the bias problem computationally. It can’t be done. A.I. engineers’ imagination for ethics, fairness, and the law is hopelessly limited because bias isn’t a technology problem; it’s cultural, subject to interpretation, and insidious. Bias creeps into algorithms, into the models built from them, into the data, semantics, and labels used by A.I., and even into how the models are deployed. But step back a little and you see that the technology companies’ hand-wringing over bias obscures a much more significant problem with A.I. With all the hype, what mostly goes unnoticed is the two-headed monster: A.I. inevitability and A.I. bias. All sorts of undesirable societal issues, like bias, job losses, environmental damage, housing, and economic disparity, become part of A.I. innovation’s promise.

I’ve seen up close narrow increases in “fairness” celebrated as progress, such as loosening a credit-approval algorithm for borrowers who don’t meet 100% of the criteria. But buried in that rationalization is the fact that the requirements were biased and discriminatory in the first place. This isn’t fairness. These tepid efforts are baby steps toward addressing long-standing discrimination.

Everyone has heard of the iconic cases of egregious A.I. bias: Amazon’s hiring algorithm, Facebook’s ad server, and the COMPAS recidivism-scoring system. So why hasn’t there been more publicity about a system sold to healthcare organizations by Optum, the $100 billion unit of UnitedHealth Group? It predicts which patients will benefit from extra medical care, and it dramatically underestimates the health needs of the sickest black patients, amplifying long-standing racial disparities in medicine. The model’s flaw was that it used the cost of care as the significant predictor of risk. Black patients historically were less able to pay, so the system created a feedback loop that denied black patients a higher quality of care. Having made this discovery, a team from UC Berkeley worked with Optum to find variables other than cost for assigning the expected risk scores, reducing the bias by 84%. The problem was solved in this one instance, but it remains endemic in similar tools employed by the public and private institutions that provide healthcare to 200 million Americans.
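The cost-as-proxy flaw described above is easy to reproduce on toy data. The sketch below (all numbers invented for illustration) ranks patients for extra care by historical spend; both groups have identical underlying need, but because one group historically incurred lower costs, it receives fewer of the extra-care slots:

```python
# Each patient: (true_need, historical_cost, group). Group "B" incurs
# lower cost for the same underlying need (hypothetical numbers).
patients = [(need, need * 1000, "A") for need in range(1, 6)] + \
           [(need, need * 600, "B") for need in range(1, 6)]

# A cost-based "risk score": flag the top half of patients by spend
# for extra care, exactly as a cost-trained model effectively does.
threshold = sorted(p[1] for p in patients)[len(patients) // 2]
flagged = [p for p in patients if p[1] >= threshold]

# Need is distributed identically in both groups, yet group A
# captures a majority of the flagged slots.
share_a = sum(1 for p in flagged if p[2] == "A") / len(flagged)
print(f"extra-care slots going to group A: {share_a:.0%}")  # 60%
```

The “fix” the Berkeley team pursued amounts to replacing the second element of each tuple, cost, with variables that track need directly.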

When insurers price coverage on drivers’ economic status, they invariably do so in a manner that disproportionately hits African Americans with higher prices. Companies have not asked for a customer’s race in decades, and they may or may not be aware that it can be inferred from the data they do collect (the so-called proxies, or latent values), but their AI-driven pricing tools are well aware of it. There is no reason to believe insurers are deliberately supporting systemic racism, but they’d have to be blind actuaries not to notice it. Still, the results are undeniable: government-required auto insurance is consistently more expensive for black Americans and the working poor in general. A state insurance regulator told me that, a few years ago, auto underwriters used only a handful of data points to price a policy, yet enough to determine an applicant’s race. Today they have access to hundreds of data points on each individual, and AI-driven pricing tools to leverage them. As she explained, auto insurance is mandatory and expensive, essentially a regressive tax on the working poor. When she receives a rate filing from an insurance company, her first question is, “Is it fair?”

The FICO score is a real boat anchor for the working poor. They don’t have low FICO scores because they’re deadbeats, dishonest, or reckless drivers. They have low FICO scores because they struggle financially for a plethora of reasons: low wages, employment instability, housing instability, high prices for substandard food. But the score follows them around, bringing high auto insurance costs, high interest rates, and limited career mobility. No, it’s not fair.

Why do we care about ethics in A.I.? For reasons like the ones above: A.I. can harm people at scale, and by the time anyone notices, the damage is already done. The first to notice are usually those with the least agency to do anything about it.

Part of the problem is a lack of understanding of how machine learning works. It’s not a magic eight-ball. The model has to be “trained” by processing data that has been labeled (“this record is a picture of a horse”). Data is biased. Labeling introduces more bias. Once trained, the model has found specific patterns, which it applies to unlabeled data to make predictions. In a typical scenario, the algorithm uses gradient descent (or ascent, depending on the model) to converge on a solution that minimizes a cost function. But what happens when it doesn’t? As in Jurassic Park, the algorithm finds a way. It will deviate from the selected features and exploit “latent values” to reach a convergent solution. Sometimes these solutions look perfectly reasonable, but they are utterly wrong. Amateurish development, an aching desire to push something out, and insidious problems with the data can cause a significant mess.
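The train-then-predict loop just described can be sketched in a few lines. This is a minimal, hypothetical example on synthetic data: gradient descent repeatedly steps the parameters against the gradient of a mean-squared-error cost until they converge, and the fitted model is then applied to new, unlabeled inputs.

```python
import random

random.seed(0)

# "Labeled" training data: inputs x with known answers y ~ 2x + 1.
train = [(x, 2 * x + 1 + random.uniform(-0.1, 0.1)) for x in range(20)]

w, b = 0.0, 0.0   # model parameters, initially untrained
lr = 0.001        # learning rate (step size)

# Gradient descent on the mean-squared-error cost function:
# step each parameter against its gradient until convergence.
for _ in range(5000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in train) / len(train)
    grad_b = sum(2 * (w * x + b - y) for x, y in train) / len(train)
    w -= lr * grad_w
    b -= lr * grad_b

# Once trained, the model is applied to unlabeled data.
def predict(x):
    return w * x + b

print(f"w = {w:.2f}, b = {b:.2f}")
```

The math converges either way: if the training data or its labels are skewed, exactly this loop faithfully bakes the skew into the learned parameters.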

My take

So, if bias can’t be solved computationally, what’s the solution? Managers at companies I’ve trained in A.I. Ethics have noticed that participants grasp the concepts solidly yet continue to operate in ethically compromised ways. The reasons are apparent: there is a massive asymmetry between the adverse effects in the social context and the economic benefits of deploying inferencing systems. These forces militate against asking the question: we can build these systems, but should we?

Frank Lloyd Wright once quipped that “Route 66 is a giant chute through which everything in the middle of the country is falling to southern California.” Today, the chute is the one-way transfer of our data to the elephant tech companies, which apply automated, predictive solutions to everything we do, personally, collectively, and politically, and which sustain a malignant monopoly. Any A.I. system that affects people’s lives must be subject to protest, account, and redress.

I’m afraid we need legislation to disincentivize data hoarding, including carefully defined bans, levies, mandated data sharing, and community benefits policies, all backed up by enforcement. Smarter data policies would reenergize competition and innovation, both of which have unquestionably slowed with the tech giants’ concentrated market power. The most significant opportunities will flow to those who act most boldly.

The second great opportunity is to wrestle with fundamental existential questions and build robust processes to resolve them. Which systems deserve to be built? Which problems most need to be tackled? Who is best placed to build them? And who decides? We need genuine accountability mechanisms, external to companies and accessible to the public.

© 2020 ovanewsblast.com - All rights reserved!
