The story is a familiar one: Irresponsible consumers and rapacious lenders collide in a frenzy of consumption that takes down the American economy and banking system. Only heavy-handed government regulation can rein in the excesses, providing comprehensive oversight of the consumer-credit economy and protecting vulnerable groups of Americans from exploitation by lenders who goad consumers into taking on more debt than they can handle to sustain a consumerist lifestyle they can't afford.
Only the year isn’t 2008, or 2010 when the Dodd-Frank reform legislation was passed. And the target isn’t credit cards or subprime mortgages. It is the mid-1950s and the target is installment sales offered by furniture stores, appliance shops, and car dealers, who seduced consumers with the promise of “easy monthly payments” to lure them into debt. And the supposedly vulnerable subgroups that needed protection from rapacious lenders included not only the poor and immigrants, but “math-impaired females” who were thought to be easy targets for fast-talking salesmen.
In short, as long as there has been consumer credit, there have been people concerned about how other people use it. While each of us believes we are responsible in how we use credit, we "tsk, tsk" at the family down the street with the new car in the driveway and the big-screen television in the family room. And we pine for "the good old days" when people lived within their means and not on credit cards.
Former professor and now-Senator Elizabeth Warren (D-MA) summed up this mindset in a PBS interview a decade ago, saying that the credit-card industry "has no evidence that people were being turned down for loans in the early 1980s. What they have is evidence that people more often in the early 1980s preferred to pay cash than to pay on credit."
But is that true?
In fact, as Thomas A. Durkin, Gregory Elliehausen, Michael Staten, and I demonstrate in our new book, Consumer Credit and the American Economy, the relationship between American consumers and consumer credit is much more complicated. Consumer credit was the fuel of the great growth of the American middle class in the postwar era. Widespread use of consumer credit, and even more widespread complaints about it, are much more than a phenomenon of the 1980s and the ubiquity of credit cards.
Economic historians have actually found that in the 1930s, the majority of sales of household furniture, appliances, radios, cameras, and jewelry were credit sales as were a substantial percentage of rugs, hardware, sporting goods, and books (such as encyclopedias and other book sets), not to mention items like pianos and sewing machines. While the use of consumer credit retrenched during the Great Depression and World War II, it exploded after 1945, funding the great postwar migration to the suburbs.
A detailed examination of the data debunks Senator Warren’s mythologizing about cash usage and abhorrence of debt before Ronald Reagan was elected president. Consider just one measure of consumer-credit usage, the ratio of consumer credit to household assets:
Indeed, looking at the matter beginning in the 1980s and the beginning of widespread access to credit cards, the same trends emerge, with credit cards merely changing the mix of consumer credit while leaving the overall level of indebtedness mostly unaffected.
Examining the Federal Reserve’s data on the debt-service ratio (which measures the amount of household income dedicated to paying consumer debt each month) makes this relative stasis even more apparent:
From this chart we can observe that, although the share of household income devoted to payments on revolving (credit-card) debt has risen, the share devoted to non-revolving (installment) debt has fallen by an equal amount, leaving the overall household debt-service burden approximately where it was in 1980. Indeed, since the early 1990s it has been the non-revolving portion of the burden that has grown, not the revolving portion, largely as a result of the massive layering on of student-loan debt that began in the early 1990s and that today amounts to the second-largest non-mortgage liability on the typical household balance sheet (behind only auto loans, which surpassed credit-card debt a few years ago).
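The mix-shifts-but-burden-stays-flat point can be made concrete with a little arithmetic. The sketch below computes a debt-service ratio in the spirit of the Federal Reserve's measure (required monthly debt payments as a share of monthly disposable income); all dollar figures are hypothetical, chosen only to illustrate how the revolving/non-revolving mix can change while the total burden does not.

```python
# Illustrative debt-service-ratio (DSR) arithmetic: required monthly debt
# payments divided by monthly disposable income. All figures below are
# hypothetical, for illustration only.

def debt_service_ratio(monthly_payments, monthly_disposable_income):
    """Share of disposable income absorbed by required debt payments."""
    return sum(monthly_payments.values()) / monthly_disposable_income

# Hypothetical household, same income, two eras with different credit mixes
payments_1980 = {"revolving": 30.0, "nonrevolving": 220.0}   # mostly installment
payments_2015 = {"revolving": 120.0, "nonrevolving": 130.0}  # mix has shifted
income = 5_000.0

dsr_1980 = debt_service_ratio(payments_1980, income)
dsr_2015 = debt_service_ratio(payments_2015, income)

# The mix changes, but the overall burden is identical: 5% in both cases.
print(f"1980-style mix: {dsr_1980:.1%}, 2015-style mix: {dsr_2015:.1%}")
# → 1980-style mix: 5.0%, 2015-style mix: 5.0%
```

The point of the toy numbers is simply that a chart of the components can look dramatic while the total line stays flat.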
Why Consumers Use Credit
Why did consumer credit come to be so closely associated with the development of the American economy in the postwar era? Was the reason, as Senator Warren and her fellow populists have argued, that creditors goad consumers into taking on excessive debt? Or is something different, and less nefarious, at work?
It is difficult for many to think about why Americans use consumer credit without falling into the Warren-esque trap of mistaking their subjective impressions of the past for data and economic analysis. Instead, it might be useful to focus on a different but related question: Why do businesses use credit?
The answer to this is well understood. It’s so they can invest in capital goods and smooth income and expenses. For example, a homebuilder could pay 10 guys to dig a basement with shovels or invest in a new backhoe and borrow to purchase it, paying for it out of the savings from not having to pay the laborers and, presumably, the increased productivity of being able to dig the hole faster. Similarly, the business may borrow during slow seasons (say, during the winter) and repay the debt in the spring and summer when business picks up, rather than constantly laying off and rehiring workers from quarter to quarter.
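The homebuilder's decision reduces to a simple break-even comparison: borrow if the labor savings from the backhoe exceed the cost of financing it. A minimal sketch, with every number hypothetical:

```python
# A rough payback sketch of the homebuilder's choice described in the text:
# borrow to buy a backhoe versus keep paying a digging crew.
# Every figure here is hypothetical, purely to illustrate the logic.

backhoe_price = 60_000.0
annual_rate = 0.08                     # interest on the loan
crew_cost_per_basement = 4_000.0       # 10 laborers with shovels
backhoe_cost_per_basement = 500.0      # operator plus fuel
basements_per_year = 20

annual_savings = (crew_cost_per_basement - backhoe_cost_per_basement) * basements_per_year
annual_interest = backhoe_price * annual_rate

# Borrowing pays off if the labor savings cover the financing cost.
net_gain = annual_savings - annual_interest
print(f"Savings: ${annual_savings:,.0f}, interest: ${annual_interest:,.0f}, net: ${net_gain:,.0f}")
# → Savings: $70,000, interest: $4,800, net: $65,200
```

The same comparison, with a household's numbers plugged in, is exactly the calculation behind financing a car or a washing machine.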
Returning now to consumers, why do they use credit? For the same reasons businesses do: to invest in capital goods and, potentially, to smooth income and expenses. An individual will often use a mortgage to buy a house, and often finance human capital development through student loans. These are investments that are obviously comparable to the homebuilder’s backhoe. Less obvious, however, is that cars, refrigerators, or washing machines are also capital goods in that they provide a stream of productive services over time and increase productivity by reducing the need for frequent trips to the grocery store or Laundromat. Similarly, consumer credit can be used by consumers to smooth income and expenses, such as to pay for necessary car repairs or unexpected medical bills when the household lacks sufficient cash on hand.
Thus, the postwar growth in consumer credit is a predictable complement to the great migration to the suburbs that began after 1945, as consumers moved from a rented apartment in the city with hand-me-down furniture to a three-bedroom house in Levittown with furniture, modern appliances, and an Oldsmobile in the driveway—all financed with credit.
The Advent of the Card
It can now be appreciated that the impact of credit cards was not to increase overall indebtedness but to simply modernize the delivery of credit. Whereas credit traditionally was offered as an ancillary product or service by retailers such as department or furniture stores (often at a loss to those companies), credit cards enabled consumers and businesses to unhook the credit transaction from the provider of the goods and services.
Consider Senator Warren’s hypothetical consumer circa 1980. If she bought a new bedroom set, she’d get a line of credit from the furniture company and pay it off over time. If she needed a new refrigerator, she’d open a line of credit from the appliance store and pay it off over time. Or if she needed to borrow $500 to fix her transmission to get to work on Monday, she’d go to a local personal finance company and pay it off over time. Today, of course, it is likely in each of these scenarios that she’d simply put each of those transactions on a credit card and pay that centralized debt off over time.
Not only did the growth of credit cards enable greater efficiencies by putting credit underwriting and collection in the hands of specialized financial institutions (rather than department stores and tailors), it promoted competition in both spheres. That is, it enabled small businesses to compete with Sears and Target, and it empowered consumers to shop among different stores without being tied to those that were large enough to offer credit. Thus, not only have credit cards not increased indebtedness, they’ve reduced the overall cost of credit to consumers and enabled more consumers access to better credit at lower prices than ever before.
But it isn’t just Senator Warren’s claim that before the soul-damaging 1980s, Americans “preferred to pay cash” rather than “to pay on credit” that is off the mark. She is also wrong in suggesting that prior to that time few consumers got turned down for credit cards. Why did credit-card use, rare before the early 1980s, rise so suddenly at that juncture? Given the obvious superiority of credit cards to retail credit, why did retail credit hold on for so long, and why was it overtaken so rapidly?
The main reason lies in antiquated and economically dubious consumer-credit regulations (as my coauthors and I discuss in the first chapter of our book). For much of U.S. history, many states imposed “usury” regulations on consumer credit, limiting the interest rates and other substantive terms of consumer credit. Thus, for example, interest rates on unsecured credit might be capped at rates as low as 10 percent in some states, such as Arkansas. While those regulations did have the intended effect of capping interest rates, they had several unintended consequences as well. In particular, they were universally seen to hurt precisely those consumers they were intended to help. Moreover, all of these factors combined in the high-interest rate environment of the late 1970s and early 1980s to bring down the traditional regulatory regime focused on price controls on interest rates and the like.
Regulation and Its Unintended Consequences
The unintended consequences of consumer-credit regulation typically have three forms: term repricing, product substitution, and rationing. Consider each in turn.
Regulating some terms of a consumer-credit contract while leaving others unregulated results in term repricing: creditors adjust the unregulated terms to offset the limits on the regulated ones. The effect is often analogized to squeezing a tied balloon: you can't make the air leave the balloon, but if you squeeze one part the excess air pops up somewhere else.
Similarly, a state legislature can try to control the price of a loan by capping interest rates, but it cannot alter the fundamental supply and demand characteristics of consumer lending. For example, when confronted with an inability to charge a market rate of interest, credit-card issuers adjusted by imposing an annual fee on credit cards to make up the loss. Not only was this regulatory-induced pricing mix of artificially low interest rates and artificially high annual fees less efficient (and less preferred by consumers, who dislike annual fees) it also had other negative consequences.
One is that annual fees were highly regressive. Consumers who charged $3,000 in purchases in a year typically paid the same annual fee as those who charged $30,000 in purchases per year. Indeed, economic research on the impact of usury ceilings usually finds that the overall effects are regressive and have resulted in wealth redistribution from lower-income to higher-income consumers.
Another is that annual fees dampened competition in the credit-card market by making it more expensive for consumers to carry more than one card and hampering them from easily switching among different card issuers.
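The regressivity of a flat annual fee is a matter of simple division: the same dollar fee is a far larger share of a small cardholder's spending than of a big spender's. A sketch with a hypothetical fee, mirroring the $3,000 versus $30,000 comparison above:

```python
# Why a flat annual fee is regressive: the same dollar fee is a much larger
# share of a light user's charges than of a heavy user's.
# The $25 fee is hypothetical; the charge volumes mirror the text's example.

annual_fee = 25.0

for yearly_charges in (3_000.0, 30_000.0):
    effective_rate = annual_fee / yearly_charges
    print(f"${yearly_charges:>9,.0f} charged -> fee = {effective_rate:.2%} of spending")
# → $    3,000 charged -> fee = 0.83% of spending
# → $   30,000 charged -> fee = 0.08% of spending
```

The light user pays an effective rate ten times higher, which is the wealth-redistribution effect the usury-ceiling research identifies.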
Now for the second unintended consequence. Usury ceilings generally result in product substitution, driven not by consumer preferences but by the relative ability of different types of creditors to evade the binding effects of price controls. Thus, while banks could evade interest-rate price controls by imposing annual fees on cards, retailers had an even more effective way to do so—they could simply bury their credit losses in the price of the goods.
For example, one notable study found that in Arkansas, which had the nation’s most restrictive interest rate ceiling of 10 percent in the 1970s, the price of a refrigerator was 8 percent higher than in Texas, where credit terms were not as tightly regulated. Moreover, while very few Arkansas consumers could qualify for credit cards due to the low interest rates (and those who did generally had very low credit lines), Arkansas was also recognized as the pawn shop capital of America, as pawn shops were three times more prevalent than in states with less strict regulation and thus greater access to consumer credit.
Finally, strict controls on consumer credit could produce rationing: controls on risk-based pricing simply price some consumers out of the market for credit entirely. Eliminating the supply of credit, however, does not eliminate the demand. A 1968 report of the U.S. Senate Subcommittee on Legal and Monetary Affairs found that illegal loan sharking was the mafia's second-largest source of revenue. Similarly, in 1969 the famed economist (and soon-to-be Nobel laureate) Paul Samuelson testified before the Massachusetts state legislature in favor of repealing the state's usury regulations, on the ground that the limits benefited not consumers but organized crime, which prospered from the loan-sharking market they created.
So it’s important to recognize that up to the late 1970s, consumer-credit markets were dominated by such players as retail store credit, pawn shops, and the like. Credit cards were able to make little headway in such a severely regulated environment.
In 1978, however, the Supreme Court decided the case of Marquette National Bank of Minneapolis v. First of Omaha Service Corp., which effectively deregulated the market for credit cards by holding that for nationally chartered banks, the terms of credit-card contracts (such as interest rates) would be governed by the regulations of the state in which the issuing bank was located, rather than that of the consumer. Soon thereafter, Citibank relocated its credit-card operations to South Dakota, which effectively had no usury limit on credit-card interest rates. Other issuers soon followed suit, migrating operations to other states such as Delaware, which had no usury ceilings. Since that time, of course, credit cards have largely supplanted many of those earlier types of credit.
Repeating Past Mistakes
Yet seemingly it remains for new generations to relearn these painful lessons of history over and over again. What we have had since the 2008 financial crisis is a virtually unprecedented orgy of economically illiterate legislation and regulation in Washington, culminating in the Wall Street Reform and Consumer Protection Act of 2010, otherwise known as Dodd-Frank for its main congressional sponsors. Having ignored the lessons of history, these statutes and regulations have driven millions of Americans out of the mainstream financial system and into alternative consumer-credit products.
Take, for example, the Credit Card Accountability, Responsibility, and Disclosure Act of 2009, which imposed new limits on the ability of credit-card issuers to adjust the terms of credit-card agreements when a consumer’s risk changes. The results of the law (and the Federal Reserve regulations that preceded it) have been predictable. Not only have those rules led to an increase in interest rates and a slashing of credit lines for consumers, but most regrettably, they have led to a substantial reduction in the number of low-income households with credit cards (as I’ve shown in a recent article co-authored with Gregory Elliehausen and Thomas Durkin). Having lost access to credit cards, those consumers have now turned to alternative products such as payday loans and the like.
Washington’s response, of course, is typically misguided. The Consumer Financial Protection Bureau, a creation of Dodd-Frank, is now considering new limits on access to payday loans and other alternative credit products.
Dodd-Frank also imposed price controls on debit-card interchange fees for banks with more than $10 billion in assets via the so-called Durbin Amendment, named after its primary sponsor Richard Durbin (D-IL). Here, too, the effects have been entirely predictable but no less tragic. As billions of dollars of revenue losses were imposed on large banks, they responded by passing those costs through to consumers in the form of higher monthly fees and reduced access to free checking. Moreover, these higher costs have been imposed only at the large banks to which the Durbin Amendment applies, as smaller banks exempted from its price controls have continued to offer free checking at the same rate as before. Unfortunately for low-income consumers, however, the smaller banks that still offer free checking are disproportionately located in higher-income neighborhoods, far from the households that most need a low-cost account.
If consumers don't benefit from the Durbin Amendment, who does? So far it has largely been big-box retailers such as Home Depot and Wal-Mart, massive online superstores such as Amazon.com, and gas stations. Yet there is no evidence that they have passed on any of their savings to consumers in the form of lower prices. In a recent study of the effects of the Durbin Amendment, my coauthors Geoff Manne and Julian Morris and I estimate that, as a result of the higher bank fees and other offsetting responses to the Durbin Amendment, low-income consumers are transferring between $1 billion and $3 billion to large retailers every year and getting nothing in return. Moreover, according to the Federal Deposit Insurance Corporation (FDIC), the number of unbanked consumers increased by one million from 2009 to 2011, and this group has likely grown still further since the Durbin Amendment took full effect.
It is now a cliché to observe that those who fail to learn from history are doomed to repeat it. Yet it is hard to escape that conclusion looking back over the economic and political history that we survey in Consumer Credit and the American Economy. In response to the financial crisis, the federal government has responded the same way it always has—and with the same unfortunate fallout that hits the most vulnerable and lowest-income consumers the hardest.