MIDDLEMEN

Throw the Book at Visa

Transaction fees on debit-card purchases are out of control. The Biden administration's antitrust cops are on the case.

A close-up of four Visa credit cards, stacked in a stylish pile (Justin Sullivan/Getty Images)

The Justice Department’s antitrust case against Visa, announced Tuesday, is best understood as an important (and, in my view, overdue) new stage in the history of money, which is basically a sad tale of tangible currency becoming ever less tangible and therefore ever easier for a private third party (typically banks; in this case, Visa) to siphon away a portion of each transaction without attracting notice from nice people like you and me. Other, less-nice people may notice, but when they do, the third party can buy their silence. Eventually the government catches on and puts a stop to this anticompetitive behavior. But as we’ll see in the case of Visa, the arc of the moral universe is long.

I’m getting ahead of our story.

According to the American Numismatic Society, money was first used in Mesopotamia and Egypt about 4,500 years ago. Gold and silver, then (as now) precious metals, were traded “in the form of metal bars or bits of wire.” Later, starting around 650 B.C. in Lydia (situated in modern-day Turkey), money came in “little round lumps of electrum (a naturally occurring alloy of gold and silver).” These were “the first coins.” About a century later, King Croesus accumulated enough of these lumps to be remembered as the richest person in history, though in fact Croesus doesn’t even crack Forbes’s top 25.

At this early stage, money was free. Every transaction was a straight-up exchange of coins for whatever goods or services were desired. Minting the coins cost something, of course, but that cost was absorbed by emperors and kings. When you bought something, you paid no additional transaction fee.

Eventually it became a nuisance to cart around a lot of gold and silver coins, prompting the Chinese, in the seventh century A.D., to invent paper money. Paper money wasn’t real money, because pretty obviously a piece of paper is not in itself worth anything. It was an IOU promising to fork over a certain quantity of silver or gold, which was worth something, in exchange for some item or service. For a long time, such promises aroused suspicion, often justified, that the IOU would never be honored. Consequently, Western nations looked askance at paper money until the 17th century, when the Swedes started experimenting with it. By the 18th century, paper money was in common use.

With paper money, currency shifted from being an item of intrinsic value (the British “pound sterling” being quite literally a quantity of silver that weighed one pound) to an item of no intrinsic value except as a promise to furnish something of value on request (the British pound note). The recipient extended credit; the giver incurred debt; and the government balanced the scales with precious metals—silver at first, then later gold, which, being more scarce, made the paper flow less freely (thereby bequeathing America Populism, The Wizard of Oz, and William Jennings Bryan).

As with coins, purchases made with paper money incurred no transaction fee. The emperors and kings who printed it had by this time devised an infinite variety of superior means to extract wealth from their populations—methods that did not impede commerce.

Eventually governments figured out that they didn’t really need gold to control the circulation of money; indeed, going off the gold standard, as the United States did temporarily during the Great Depression and, later, permanently in the 1970s, gave governments more control. (It took the Federal Reserve about a decade to master that control, greatly worsening in the meantime what’s now remembered as the Great Inflation.) With the United States off the gold standard, a dollar bill became an item of no intrinsic worth that was valuable not because it could be redeemed in gold but because we all chose to believe it was valuable.

But paper money was almost as big a nuisance to cart around in large quantities as silver and gold, necessitating the creation of checking accounts, which first gained widespread use in the United States around the middle of the 19th century. A check was a paper IOU redeemable for another paper IOU (cash). Its value, rather than being fixed in advance as paper money’s was, varied with the transaction. The guarantor of this do-it-yourself currency was not a government, but a private bank. The snake had entered the garden.

Paying by check incurred a transaction fee (direct or indirect) because the bank needed to cover the cost of printing the check, or of moving your money from Point A to Point B, or of providing some other banking service such as mailing you a monthly statement. For the first time, it cost (slightly) more to buy something than the price of the thing itself.

The next stage in money’s march toward intangibility arrived in 1958 with the advent of BankAmericard, which we know today as Visa. Credit cards came into being because carting around a checkbook—like carrying pieces of silver and paper money before it—became a nuisance. With BankAmericard, you could carry a small piece of plastic in your wallet, use it to pay for stuff, then decide later when to hand over the cash.

There had been charge cards before (Diners Club, Carte Blanche, and American Express among them), but these required that you pay off your entire tab every month. BankAmericard did not. In fact, it preferred that you didn’t, because then it could charge you interest on the unpaid portion. Everyday consumption became everyday debt. This feature allowed BankAmericard (then an arm of the Bank of America) to greatly increase what your bank charged you to buy stuff, spawning a new industry. Let’s call it the Buying Stuff Industry.

Then, in 1969, something terrible happened. Terrible for the Buying Stuff Industry, anyway. Chemical Bank installed, in Rockville Centre, New York, something called a Docuteller, which turned out to be the first automated teller machine, or ATM. The ATM, which took about a decade to catch on—I saw my first one in 1976—allowed you to extract cash from your account at any time of day or night; all you had to do was insert a card that looked exactly like a credit card but was called a debit card. The debit card was not an instrument to borrow money and pay interest to credit card companies and banks. It was an instrument to withdraw money already in your possession and pay no interest, or even a check fee, to anyone. For many years debit cards incurred no transaction fees at all. You didn’t pay a transaction fee during bank hours when a human teller handed you cash from your account. Why should you pay a fee when a machine (my first was called, I kid you not, “the Cool-O-Mat”) did the same thing?

To the Buying Stuff Industry, the ready availability of cash at all hours was bad. But much worse was that, within a few years of the debit card’s debut, it could be used to buy stuff in stores, displacing purchases by credit card. Suddenly people were buying stuff not by incurring debt but by spending money in their bank accounts. This was intolerable!

Then, in the early 1990s, the point-of-sale terminal came to the rescue.

Initially, credit cards were processed with a manual imprinter (“knuckle buster”) that pressed a carbon-copy imprint of your card onto a paper slip. The cashier also had to call the issuing bank to make sure the card number wasn’t stolen. Later, these two functions were combined digitally, but the process still took about five minutes. Only in the 1990s did point-of-sale technology become available that could process credit card payments as quickly as cash or a paper check.

In theory, this same technology also enabled a retailer to process a debit card payment as quickly as cash or paper check. But the credit card companies owned the computer networks that processed point-of-sale transactions, charging a swipe fee for each transaction, and they didn’t feel like sharing them with debit cards. For a while this slowed the spread of debit cards.

Eventually, though, debit cards became too popular to exclude. The credit card companies then created a new kind of “offline” debit card. You couldn’t use this card to withdraw cash from an ATM, but you could use it to make a retail purchase that took money out of your bank account. Unlike an ATM card, the offline debit card incurred swipe fees comparable to those for credit cards. Later, ATM cards started charging swipe fees, too, but these were only half those for offline debit cards. Still later, ATM cards and offline debit cards were combined into a single “general purpose” debit card—and these new hybrids incurred retail swipe fees as high as 44 cents per transaction. Buying stuff with your own money was now an invisible transaction conducted silently with a stripe or a chip on a piece of plastic. It had lost all sense of being an interaction between two humans. And it was getting very expensive.

The 2010 Dodd-Frank financial reform law directed the Fed to put a ceiling on debit-card swipe fees. But after considering a ceiling of about 12 cents per transaction, the Fed settled for a too-generous 21 cents, plus an “ad valorem” charge equal to 0.05 percent of the purchase price and a 1-cent charge to cover fraud protection. Last November, the Fed issued a proposed rule that would lower the ceiling to 14.4 cents and the ad valorem charge to 0.04 percent. (Credit card networks can still charge whatever swipe fees they want.)
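For concreteness, here is a minimal sketch in Python of what those ceilings work out to on a $50 debit purchase. The cap figures come from the paragraph above; the $50 purchase amount, and the assumption that the 1-cent fraud-protection charge carries over unchanged to the proposed rule, are mine.

```python
def debit_swipe_fee_ceiling(purchase_cents, base_cents, ad_valorem_pct, fraud_cents=1.0):
    """Maximum lawful debit swipe fee under a Durbin-style cap:
    a fixed base, plus an ad valorem share of the purchase,
    plus a fraud-prevention charge."""
    return base_cents + purchase_cents * ad_valorem_pct / 100 + fraud_cents

purchase = 5000  # a $50 purchase, expressed in cents

# Current Fed ceiling: 21 cents + 0.05 percent of the purchase + 1 cent for fraud.
current = debit_swipe_fee_ceiling(purchase, base_cents=21.0, ad_valorem_pct=0.05)

# Proposed ceiling: 14.4 cents + 0.04 percent; whether the 1-cent fraud
# charge carries over unchanged is an assumption here.
proposed = debit_swipe_fee_ceiling(purchase, base_cents=14.4, ad_valorem_pct=0.04)

print(f"current ceiling:  {current:.1f} cents")   # 24.5 cents
print(f"proposed ceiling: {proposed:.1f} cents")  # 17.4 cents
```

On a $50 purchase, in other words, the proposed rule would shave the maximum fee from about 24.5 cents to about 17.4 cents.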

All this happened well outside the consciousness of the ordinary shopper. When you buy something with a debit or credit card, the receipt doesn’t include an entry for a swipe fee, because that’s charged directly to the merchant (who has already raised prices to cover it). On rare occasions the merchant may tell you that an item will be more expensive if you purchase it with plastic, but most of the time he just charges everybody a little bit more.

Debit card usage has now grown to the point where it matches or outpaces credit card usage. In 2020, debit card spending for the first time exceeded, in dollars, credit card spending. Back in 1969, BankAmericard would have considered this a catastrophe. But, in fact, it was very good news for the successor company, Visa, because Visa now dominates the debit card business, and has for many years. Sixty percent of all debit transactions run on Visa’s network; only about 25 percent run on Mastercard’s network, which ranks a distant second. There have been years (for instance, 2022) when Visa earned more revenue from its debit business than from its credit card business. Think about that. Credit cards charge interest, whereas debit cards can at best incur overdraft fees. That’s why the Buying Stuff Industry used to hate them! Visa was handed a lemon and turned it into lemonade. Last year its swipe fees on debit cards brought in $7 billion.

How did this happen? According to the Justice Department, Visa got there “through exclusionary and anticompetitive means.” In large part, that consisted of buying off potential competitors—companies like Apple, PayPal, and Square. The Justice Department complaint quotes Visa’s CFO actually saying, “Everyone is a friend and a partner. Nobody is a competitor.” More from the Justice Department complaint:

Visa offers lucrative incentives, sometimes worth hundreds of millions of dollars annually, to these potential competitors under the express condition that they do not develop a competing product or compete in ways that could threaten Visa’s dominance. In addition to the carrot of these incentives, Visa has also threatened to use the stick of additional fees to dissuade their potential competitors’ innovation—if they develop competing products.

The Dodd-Frank law increased the number of available networks by requiring banks to put at least two unaffiliated networks on every debit card. When this provision took effect in 2012, Visa lost market share to other networks. To fight back, the Justice Department says, Visa got really aggressive with discounting.

For any given debit-card purchase to work, the bank that issued Customer A’s card has to be on the same network as Merchant B’s bank. Sometimes there are multiple overlapping networks from which the merchant may choose, but sometimes there’s only one. When there’s only one, it’s usually Visa’s. The many “non-contestable” transactions that require Visa’s network, the Justice Department says, give the company leverage to pressure merchants to also use Visa’s network for contestable transactions that could be processed by a different network charging lower fees. If the merchant says “no thanks” to Visa and goes with the competitor instead, Visa can punish him by charging a high sticker price (“rack rate”) on Visa’s non-contestable transactions. If the merchant says yes, he avoids the rack and gets a discount. According to the Justice Department, “this effectively insulates at least 75 percent of Visa’s debit volume from competition.”
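To see why that threat works, here is a toy model in Python. Every rate and volume below is invented for illustration (the complaint does not disclose Visa’s actual pricing), but the structure is the one the Justice Department describes: a rival network can undercut Visa on the contestable volume and still lose.

```python
# Toy model of the leverage DOJ describes. Every number here is invented
# for illustration; the complaint does not disclose Visa's actual rates.

non_contestable = 750_000  # monthly debit volume ($) that can only run on Visa's network
contestable = 250_000      # volume the merchant could route to a cheaper rival network

RACK_RATE = 0.0130      # hypothetical punitive Visa rate if the merchant splits volume
DISCOUNT_RATE = 0.0080  # hypothetical discounted Visa rate if ALL volume goes to Visa
RIVAL_RATE = 0.0050     # hypothetical rival network's cheaper per-dollar rate

# Option A: route the contestable volume to the cheaper rival. Visa responds
# by charging the rack rate on the volume only it can carry.
split_cost = non_contestable * RACK_RATE + contestable * RIVAL_RATE

# Option B: send everything to Visa and take the bundled discount.
all_visa_cost = (non_contestable + contestable) * DISCOUNT_RATE

print(f"split routing: ${split_cost:,.0f} per month")    # $11,000
print(f"all-Visa:      ${all_visa_cost:,.0f} per month")  # $8,000
```

Even though the hypothetical rival charges less per dollar than Visa ever does, the merchant in this sketch saves $3,000 a month by saying yes to Visa, because the rival can’t reach the non-contestable volume Visa uses as leverage.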

And the consumer? You’re pretty much a bystander to all this, because you didn’t choose your debit card, did you? Your bank chose it for you. You could close your bank account and move your money to some other bank that doesn’t use Visa’s debit network. But Visa is so ubiquitous that such banks can be hard to find. Alternatively, you could get rid of your debit card and go back to writing checks. But increasingly debit cards, like credit cards, do things that paper checks can’t, like get you into the New York City subway or the London Underground.

Really, you’re at the mercy of the Justice Department’s antitrust division. One thing you can do, though, is vote for Kamala Harris, because it’s anybody’s guess whether Trump, who’s aggressively business-friendly, would allow this important monopoly-busting litigation to continue. We may have found one Visa Trump will turn out to like.