AI Isn’t Just Spying on You. It’s Tricking You Into Spending More. | The New Republic
Scam Artists


Recent reports reveal how companies are using artificial intelligence to harvest customers’ data and even secretly target them with higher prices.

An Instacart personal shopper at a store in San Francisco
Lea Suzuki/The San Francisco Chronicle via Getty Images

As if it’s not bad enough that artificial intelligence is coming for your job—or at least making it worse—and that the data centers fueling it are raising your utility bills and destroying your environment, it turns out AI is also spying on your shopping habits and using that information not only to sell you more things, but to sell them at a higher price.

This has all been illustrated by a string of reports on how companies are using AI to target consumers. The reports make clear that, while Big Tech has pushed AI on Americans as a tool to make their lives easier, the technology may in fact be making their lives even harder. This is what happens when any large, concentrated industry is allowed to grow unregulated, and all signs suggest it will get much worse.

In October, the Vanderbilt Policy Accelerator published research showing that companies use loyalty programs to hook consumers, creating a captive audience they can track. Because consumers opt into these programs in the hopes of discounts and rewards, the programs are exempt from many state and federal laws about surveillance. And some companies eagerly abuse that legal opening.

Of course, loyalty programs predate AI, not to mention the internet itself. But gone are the punch cards of yore. The Vanderbilt report’s lead example is McDonald’s Monopoly game promotion, which has been around for decades but has now gone digital. Let’s say you win a food item on a peel-off token; to redeem it, you have to use the restaurant’s app, and that requires forfeiting your privacy:

[T]o be eligible for these prizes, customers must agree to be tracked on far more than their Big Mac purchases. McDonald’s nearly 10,000-word privacy policy notes how the company can monitor customers’ precise geolocation, browsing history, app interactions, and social media activity. The company then uses this data to train its artificial intelligence models and build profiles on its customers—predicting their “preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, or aptitudes.” McDonald’s leverages these psychological profiles to drive repeat customer engagement over time.

If McDonald’s succeeds in its goal of reaching 250 million active loyalty users by 2027, the report adds, the company “will hold psychological and behavioral profiles on a quarter of a billion consumers—a scale rivaling that of a national intelligence agency.”

The central goal here, at least as far as we know, is to drive more McDonald’s sales—both on an individual level and across its entire customer base—by harvesting data. That’s bad enough. But sometimes companies are deploying AI to actually trick you into paying even more for things.

Groundwork Collaborative, Consumer Reports, and More Perfect Union collaborated on an investigation released this month showing that Instacart, the grocery shopping and delivery app, used AI-enabled data collection to manipulate and change prices. The investigators had 437 shoppers in four cities place identical items in their Instacart shopping carts at the same time. The results showed that almost three-quarters of the items had prices that varied, and prices for the same items could vary by as much as 23 percent. These variations are often invisible to consumers.

Dynamic pricing has long been used by airlines to shift prices as demand changes, and it has since spread to other online sellers, like Ticketmaster. But AI has turbocharged the practice, making it easier to target price changes to specific customers based on geolocation and IP data, in an attempt to maximize the amount a customer will pay. Customers are largely ignorant of this fact—because companies are loath to disclose it prominently—and in some cases may not have much choice. Many families use apps like Instacart for the convenience; they may not have the time or ability to physically shop in a store themselves. But now we know what they’re paying for this convenience: The Instacart report estimates that the hidden pricing differences could amount to as much as $1,200 a year for some customers.

It’s hard to set a shopping budget if you can’t be sure what your grocery staples will cost in any given week. In other words—trigger alert, President Trump!—it’s an affordability issue. Representative Greg Casar, a Texas Democrat, has introduced legislation to regulate how companies can use AI to set prices and wages, but it’s unlikely to go anywhere in this Republican-led government. In fact, Trump issued an executive order this month promising to investigate and withhold federal funds from states that pass “cumbersome” AI regulations.

Some states have promised to push for new regulations regardless. That’s what Americans want: Half of Americans are more concerned than excited about AI, and 61 percent want more control over how it is used in their lives. They’re also wary of the second-order effects of building up AI, like the toll of data centers on their communities.

But reform-minded politicians are playing catch-up while the AI bubble continues to balloon, enriching a handful of companies that are propping up an otherwise ho-hum stock market. Meanwhile, Americans who want to be treated fairly as consumers find themselves getting a bad deal. They can’t shop for Cheerios or a plane ticket for the holidays without wondering just how severely they’re being screwed by AI-trained algorithms. A reckoning is coming. The bubble will probably burst, and Americans’ retirement savings will plummet. But whom will they blame for the damage? The wise policies of the future depend on naming the right villains today.