Around one in five American adults do not have a credit score, and among those who do, only a third checked their credit score in the past year. A credit score plays a critical role in your financial health. But what is it in the first place?
A credit score is a three-digit number calculated by a credit scoring system that shows how safe or risky you are as a borrower. Lenders use your credit score to analyze your creditworthiness. It helps them evaluate if you can repay a loan, determine if it is worth lending you money, and set an interest rate corresponding to your level of credit risk.
There are two popular credit scores: FICO and VantageScore. These scores, which range between 300 and 850, are based on the information compiled by credit-reporting companies. In the U.S., the three largest credit-reporting companies are Equifax, Experian, and TransUnion.
While most Americans have credit scores today, the modern credit score dates back only about three decades; credit reporting itself goes back much further. So, how did credit scores start? And what brought about the need for credit reporting?
1800s: Start of Credit Reports
Lending to consumers was virtually nonexistent in the early 19th century; commercial lenders extended credit only to businesses. To help those lenders manage their risk, local merchant associations formed credit reporting agencies, which collected credit and personal information about borrowers and sold it to lenders.
The largest commercial credit reporting agencies in the mid-1800s were R.G. Dun & Co and the Bradstreet Company, which is now Dun & Bradstreet. They developed alphanumeric scores to assess the credit risk of business borrowers.
Two of the largest consumer credit bureaus today – Experian and Equifax – trace their roots to this era. Experian's long history began in 1897, when Jim Chilton founded the Merchants’ Credit Association. Meanwhile, two Tennessee-based grocers – brothers Cator and Guy Woolford – started the Retail Credit Company (RCC) in 1899, which later became Equifax.
1900s-1920s: Rise of Consumer Credit Reporting
By the turn of the century, while banks and other lenders still focused solely on business credit, retailers such as grocers and department stores began extending credit lines to individual consumers. They had credit managers assess their customers’ credit risk using methods similar to those developed by commercial lenders. In 1912, these credit managers formed a national association, the Retail Grocer’s Association, to standardize credit information.
By 1920, RCC had offices throughout the U.S. and Canada, and by 1941 was providing 7.5 million consumer credit reports per year.
Around this time, more than 20% of department store purchases were made on credit, and the number of credit agencies had grown to more than 1,000 by the 1920s. However, the credit information these agencies produced and sold was highly fragmented and qualitative.
1930s-1960s: Shift to Quantitative Scores
In the 1930s, a more quantitative credit scoring system took root. Department stores were early adopters, assigning points to customers to assess their creditworthiness. However, even though they used more standardized point-based scores, the basis was still subjective and often unfair. The emphasis was on character traits and demographic data like race, income, and employment status.
Fair, Isaac and Company, or FICO, changed all that in 1956. Founded by engineer Bill Fair and mathematician Earl Isaac, FICO developed an objective credit scoring system using a statistical approach, designed to eliminate the inherent biases of the earlier qualitative character assessments.
The two sent letters to the 50 largest lenders, but only one – American Investments – adopted the new credit scoring system. National department store chains also adopted FICO’s system early on.
In 1968, Union Tank Car Company, a railway equipment leasing company, started TransUnion to branch out into credit reporting. The following year, it acquired the Credit Bureau of Cook County, laying the foundation for TransUnion to become one of the three largest credit bureaus in the U.S., along with Equifax and Experian.
1970s-1980s: Strengthening of Consumer Protection
Until the 1970s, credit bureaus were largely unregulated, leading to complaints about inaccurate, unfair, or biased credit information. This prompted Congress to pass the Fair Credit Reporting Act (FCRA) in 1970 and the Equal Credit Opportunity Act (ECOA) in 1974 to protect consumers from lender discrimination and invasion of privacy.
In 1975, FICO developed the first behavior scoring system for Wells Fargo to predict the credit risk of its customers. It also started creating credit scores based on data from the credit bureaus. In 1989, FICO and Equifax launched the first modern credit score called BEACON. In the same year, FICO debuted the first general-purpose FICO score. This FICO score system has since become the industry standard.
1990s-2000s: Growth of Credit Scores
In 1995, Fannie Mae and Freddie Mac made FICO scores mandatory for lenders to use on residential mortgage applications. In 2003, Congress enacted the Fair and Accurate Credit Transactions Act, giving every American the right to free credit reports every year.
Experian, TransUnion, and Equifax decided to compete with FICO, launching VantageScore with its own proprietary credit score in 2006. This gave consumers a second major credit score that lenders and landlords use to assess their creditworthiness.
Both FICO and VantageScore continue to tweak their systems to improve accuracy and release new and improved versions. Even so, FICO Score 8, rolled out in 2009, remains the most widely used FICO score version across the three major credit bureaus, with newer releases such as FICO Score 9 (2014) adopted more gradually.
Credit reporting and credit scoring have a long, rich history. They continue to be integral to our economy and consumers’ financial health. Still, they have a long way to go to reach the many Americans who lack access to formal credit and are considered unbanked or credit invisible. But with the rise of Big Data, data science, and artificial intelligence (AI), credit scores will keep getting more accurate, impartial, ubiquitous, and inclusive.