Editor’s note: Business content from The New York Times will now be included with your subscription to Finance & Commerce.
In 2015, Melany Anderson’s 6-year-old daughter came home from a play date and asked her mother a heartbreaking question: Why did all her friends have their own bedrooms?
Anderson, 41, a pharmaceutical benefits consultant, was recently divorced, living with her parents in West Orange, New Jersey, and sharing a room with her daughter. She longed to buy a home, but the divorce had emptied her bank account and wrecked her credit. She was working hard to improve her financial profile, but she couldn’t imagine submitting herself to the scrutiny of a mortgage broker.
“I found the idea of going to a bank completely intimidating and impossible,” she said. “I was a divorced woman and a Black woman. And also being a contractor — I know it’s frowned upon, because it’s looked at as unstable. There were so many negatives against me.”
Then, last year, Anderson was checking her credit score online when a pop-up ad announced that she was eligible for a mortgage, listing several options. She ended up at Better.com, a digital lending platform, which promised to help Anderson secure a mortgage without ever setting foot in a bank or, if she so desired, even talking to another human.
In the end, she estimated, she conducted about 70% of the mortgage application and approval process online. Her fees totaled $4,000, about half the national average. In November 2019, she and her daughter moved into a two-bedroom home not far from her parents with a modern kitchen, a deck and a backyard. “We adapted to the whole COVID thing in a much easier way than if we were still living with my parents,” Anderson said this summer. “We had a sense of calm, made our own rules.”
Getting a mortgage can be a harrowing experience for anyone, but for those who don’t fit the middle-of-last-century stereotype of homeownership — white, married, heterosexual — the stress is amplified by the heightened probability of getting an unfair deal. In 2019, African Americans were denied mortgages at a rate of 16% and Hispanics were denied at 11.6%, compared with just 7% for white Americans, according to data from the Consumer Financial Protection Bureau. An Iowa State University study published the same year found that LGBTQ couples were 73% more likely to be denied a mortgage than heterosexual couples with comparable financial credentials.
Digital mortgage websites and apps represent a potential improvement. Without showing their faces, prospective borrowers can upload their financial information, get a letter of preapproval, customize loan criteria (like the size of the down payment) and search for interest rates. Software processes the data and, if the numbers check out, approves a loan. Most of the companies offer customer service via phone or chat, and some require that applicants speak with a loan officer at least once. But often the process is fully automated.
Last year, 98% of mortgages originated by Quicken Loans, the country’s largest lender, used the company’s digital platform, Rocket Mortgage. Bank of America recently adopted its own digital platform. And so-called fintech startups like Roostify and Blend have licensed their software to some of the nation’s other large banks.
Reducing — or even removing — human brokers from the mortgage underwriting process could democratize the industry. From 2018 to 2019, Quicken reported a rise in first-time and millennial homebuyers. Last year, Better.com said, it saw significant increases in traditionally underrepresented homebuyers, including people of color, single women, LGBTQ couples and customers with student loan debt.
“Discrimination is definitely falling, and it corresponds to the rise in competition between fintech lenders and regular lenders,” said Nancy Wallace, chair in real estate capital markets at Berkeley’s Haas School of Business. A study that Wallace co-authored in 2019 found that fintech algorithms discriminated 40% less on average than face-to-face lenders in loan pricing and did not discriminate at all in accepting and rejecting loans.
If algorithmic lending does reduce discrimination in home lending in the long term, it would cut against a troubling trend of automated systems — such as AI-based hiring platforms and facial recognition software — that turn out to perpetuate bias. Faulty data sources, software engineers’ unfamiliarity with lending law, profit motives and industry conventions can all influence whether an algorithm ends up discriminating where humans left off. Digital mortgage software is far from perfect; the Berkeley study found that fintech lenders still charged Black and Hispanic borrowers higher interest rates than white borrowers. (Lending law requires mortgage brokers to collect borrowers’ race as a way to identify possible discrimination.)
“The differential is smaller,” Wallace said. “But it should be zero.”
The persistence of gatekeepers
Better.com started in 2016 and is licensed to underwrite mortgages in 44 states. This year, the company has underwritten about 40,000 mortgages and funds roughly $2.5 billion in loans each month. After a COVID-19 slump in the spring, its loan volume for June was five times what it was a year earlier.
With $270 million in venture funding, the company generates revenue by selling mortgages to about 30 investors in the secondary loan market, like Fannie Mae and Wells Fargo. The company attracts customers as it did Anderson: buying leads from sites like Credit Karma and NerdWallet and then marketing to those customers through ads and targeted emails.
In 2019, Better.com saw a 532% increase in Hispanic clients between the ages of 30 and 40 and a 411% increase in African Americans in the same age bracket. Its married LGBTQ client base increased tenfold. “With a traditional mortgage, customers feel really powerless,” said Sarah Pierce, Better.com’s head of operations. “You’ve found a home you love, and you’ve found a rate that’s good, and somebody else is making the judgment. They’re the gatekeeper or roadblock to accessing financing.” Of course, Better.com is making a judgment too, but it’s a numerical one. There’s no gut reaction, based on a borrower’s skin color or whether they live with a same-sex partner.
Digital lenders say that they assess risk using the same financial criteria as traditional banks: borrower income, assets, credit score, debt, liabilities, cash reserves and the like. These guidelines were laid out by the Consumer Financial Protection Bureau after the last recession to protect consumers against predatory lending or risky products.
These lenders could theoretically use additional variables to assess whether borrowers can repay a loan, such as rental or utility payment history, or even assets held by extended family. But generally, they don’t. To fund their loans, they rely on the secondary mortgage market, which includes the government-sponsored enterprises Freddie Mac and Fannie Mae, and which became more conservative after the 2008 crash. With some exceptions, if you don’t meet the standard CFPB criteria, you are likely to be considered a risk.
Fair housing advocates say that’s a problem, because the standard financial information puts minorities at a disadvantage. Take credit scores — a number between 300 and 850 that assesses how likely a person is to repay a loan on time. Credit scores are calculated from a person’s borrowing and payment history. But landlords often don’t report rental payments to credit bureaus, even though rent is the largest regular payment for millions of people, including more than half of Black Americans.
For mortgage lending, most banks rely on the credit scoring model invented by the Fair Isaac Corp., or FICO. Newer FICO models can include rental payment history, but the secondary mortgage market doesn’t require them. Neither does the Federal Housing Administration, which specializes in loans for low- and moderate-income borrowers. What’s more, systemic inequality has created significant salary disparities between Black and white Americans.
“We know the wealth gap is incredibly large between white households and households of color,” said Alanna McCargo, the vice president of housing finance policy at the Urban Institute. “If you are looking at income, assets and credit — your three drivers — you are excluding millions of potential Black, Latino and, in some cases, Asian minorities and immigrants from getting access to credit through your system. You are perpetuating the wealth gap.”
For now, many fintech lenders have largely affluent customers. Better.com’s average client earns over $160,000 a year and has a FICO score of 773. As of 2017, the median household income among Black Americans was just over $38,000, and only 20.6% of Black households had a credit score above 700, according to the Urban Institute. This discrepancy makes it harder for fintech companies to boast about improving access for the most underrepresented borrowers.
Ghost in the machine
Software has the potential to reduce lending disparities by processing enormous amounts of personal information — far more than the CFPB guidelines require. Looking more holistically at a person’s financials as well as their spending habits and preferences, banks can make a more nuanced decision about who is likely to repay their loan. On the other hand, broadening the data set could introduce more bias. How to navigate this quandary, said McCargo, is “the big AI machine learning issue of our time.”
Under the Fair Housing Act of 1968, lenders cannot consider race, religion, sex, or marital status in mortgage underwriting. But many factors that appear neutral can double as proxies for race. “How quickly you pay your bills, or where you took vacations, or where you shop or your social media profile — some large number of those variables are proxying for things that are protected,” Wallace said.
Lisa Rice, the president and chief executive of the National Fair Housing Alliance, said she was skeptical when mortgage lenders said their algorithms considered only federally sanctioned variables like credit score, income and assets. “Data scientists will say, if you’ve got 1,000 bits of information going into an algorithm, you’re not possibly only looking at three things,” she said. “If the objective is to predict how well this person will perform on a loan and to maximize profit, the algorithm is looking at every single piece of data to achieve those objectives.”
Fintech startups and the banks that use their software dispute this. “The use of creepy data is not something we consider as a business,” said Mike de Vere, the chief executive of Zest AI, a startup that helps lenders create credit models.