Seeking examples of mundane low-stakes data modeling fails
(Image credit: Flickr / Matt Mets)


tl;dr: Can you give me examples of where data modeling in your everyday life is deeply flawed? I’m looking for interesting examples to think with.

When we worry about our algorithmic world, there are good reasons to focus on “high stakes” modeling failures. Situations in which life/death decisions are being made. Deeply entrenched biases stemming from prejudice and inequality. Moments where people’s livelihoods, opportunities, and freedoms are being impinged upon. Indeed, when the stakes are higher, we should care more.

However, I also think we need to pay attention to the low-stakes moments, not because they matter more but because they help us see how entrenched data logics are becoming — and how we’re all being configured in minor ways by shitty algorithms. Low-stakes data modeling fails don’t destroy our lives, but they make our lives a bit more frustrating. And, collectively, they make us irritable.

We saw a lot of modeling fails early in the pandemic as we collectively came to realize that toilet paper supply chains depended on models that didn’t hold up to a society-wide change in bathroom behavior. We were annoyed, but everything was weird. Then, as the pandemic rippled across society — followed by mass death and upheaval in the labor market — we saw lots of consumer supply chain fails, from the politically complex to the utterly mundane. Most media attention focused on things that required computer chips or complexly manufactured machines like cars, but there were also supply chain fails for everyday items due to shifts in consumption. (Subsequently, non-pandemic supply chain messes have added to the collective perception that something is just wrong with our supply chain resilience. Think: baby formula and eggs.)

Supply chains aren’t the only site of model failure though. Many service-based businesses depend on tightly controlled supply-demand models where “efficiency” is about getting this equation just right. This is hard in the best of time, but the promise of more sophisticated data modeling is that this is manageable. However, we’ve seen this completely out of whack for years now. The rhetorical blame is “lack of labor.” This is reinforced by unemployment numbers that don’t even account for all of the people who can’t work because they haven’t solved child/elder care, let alone the ridiculous number of people who are underemployed. We are watching the food industry add “service charges” even as service becomes crappier and crappier by the day. The narrative here is “inflation” (but it’s really about how far customers can be deceived by the distance between sticker price and the bill). We are watching wait times for phone-based customer service eek into the hours, revealing how this is not simply a labor issue but a willingness-to-spend issue.

At the local level, bespoke supply-demand models aren’t working. I suspect that in more sophisticated settings, the models are tuned to see how much pain people can tolerate. They are normalizing us into accepting crappy service, into replacing previous services ourselves (like cleaning our own tables), and into only seeking service when we’re desperate. Meanwhile, managers dream of AI replacing these jobs without warning us about how this will collectively put more burden on consumers to twist themselves into machine-readable formats.

Supply chain and supply/demand tensions are pervasive. But the examples that really confound me these days are the ones where I’m convinced that the data modeling is just dead wrong in ways that don’t benefit anyone — not the customer, not the business, not the worker, not the bottom line. And I keep scratching my head wondering what’s going on.

Let me offer an example… I’ve been waiting for almost a year for an electric vehicle that I ordered (whose delivery estimates were dead wrong… “supply chain”). During the wait, I started regularly renting cars from Avis. I’ve rented 10 times since March, twice from SEA-TAC, once from LAX, and the rest from the Denver airport or my local branch. I’m a preferred member with a long history of renting (and am tagged as connected to my employer). Prior to this year, the app let me choose from a pre-selected list of quasi-reasonable cars. Three times this year, I’ve not even been offered a car to select from and had to stand in line at the preferred counter (including over an hour at LAX). The other seven times, I’ve been offered completely unreasonable cars as “upgrades.” This included a 15-passenger van and a Jeep monstrosity where the doors come off. (I foolishly accepted one of those once as the last car available one night and thought that I was going to die on the freeway; I promptly returned the damn thing.) Every time, I go to the counter and beg for a sedan. Every time, I’m told that they only have exceptionally ridiculous things in stock. I’ve come to expect that I’ll wait an hour before they have a normal car available. Three additional times, I’ve canceled my car rental rather than bothering with this insanity. And I’m not alone. There are typically long lines of people in the preferred section begging for something sensible. Twice, I rented a car late enough at night that I could talk with the manager or customer service person about what was going on. Both described the problem as “supply chains” and talked about how miserable their job is now because they’re stuck between not having cars and frustrated customers. Both told me how lucky I was that I at least got a car — over in the regular section, there weren’t any cars available. I could only imagine the tears of frustration that were flowing.

Avis is not alone here. There’s a downward spiral happening across the car rental ecosystem. I am sure that there are “supply chain” issues — as in, cars that were ordered but have not been delivered. However, there is also a range of other data modeling issues. Customers are renting cars and not getting what they rented (or not getting anything at all). The companies aren’t saying “sorry — unavailable”; they’re choosing to allow wishful thinking. There’s no way that Avis’ system (with its “choose your own car” feature) doesn’t know that the cars reserved don’t match the supply. Are they just ignoring the modeled information because they can get away with it?
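
For what it’s worth, the check I’m describing is trivial. Here’s a toy sketch (hypothetical numbers and categories, not Avis’ actual system) of comparing what’s been reserved against what’s actually on the lot:

    # Toy reservation-vs-inventory check; hypothetical numbers, not any rental company's system.
    from collections import Counter

    on_the_lot = Counter({"sedan": 3, "15-passenger van": 12, "doors-off jeep": 9})
    reserved = Counter({"sedan": 41, "suv": 8})

    for car_class, count in reserved.items():
        shortfall = count - on_the_lot.get(car_class, 0)
        if shortfall > 0:
            # The system knows this the moment the reservation is taken; the choice is
            # whether to tell the customer or let them find out at the counter.
            print(f"{car_class}: {count} reserved, {on_the_lot.get(car_class, 0)} available "
                  f"({shortfall} customers headed for the 'upgrade' conversation)")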

Everyday life is filled with interdependencies. It makes sense that when the postal system buckles, the models for when a purchased item will be delivered are completely wrong. But where I start to scratch my head is when they’re always wrong.

Consider another example. Almost all food delivery in my town is managed by the handful of 2010s-era national delivery companies that are in a race to the bottom (think: Grubhub, Doordash, Ubereats…). Through most of the pandemic, I assumed that a delivery would take about 10 minutes longer than the estimate given. This year, though, I’ve noticed that I’m lucky if my food arrives within 20 minutes of the estimated time (and the estimated time is typically over an hour these days). The prices of delivery have gone sky-high even as the service has disintegrated. For the most part, I’ve given up on delivery; I just call in the order and drive over to the restaurant to pick it up. In talking with my local pizza joint, I was told that these delivery companies used to be helpful for them but now it’s just an extra headache cuz something is always going wrong. They pinpoint it as drivers not being assigned appropriately. In other words, the model that accounts for drivers’ locations and deliveries is wrong (and presumes that they’ll speed like hell).
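
Here’s a toy sketch of that failure mode (the speeds, distances, and prep times are made up, and this is not any delivery platform’s actual model): if the ETA quote bakes in an optimistic driving speed, every quote is systematically low, no matter how clever the dispatching is.

    # Toy ETA quote built on an optimistic speed assumption; all values are hypothetical.
    ASSUMED_SPEED_MPH = 30   # what the model presumes drivers average in city traffic
    ACTUAL_SPEED_MPH = 18    # what drivers actually average
    PREP_MINUTES = 20        # restaurant prep time baked into the quote

    def quoted_eta(distance_miles):
        return PREP_MINUTES + 60 * distance_miles / ASSUMED_SPEED_MPH

    def realistic_eta(distance_miles):
        return PREP_MINUTES + 60 * distance_miles / ACTUAL_SPEED_MPH

    for miles in (2, 5, 8):
        quoted, real = quoted_eta(miles), realistic_eta(miles)
        print(f"{miles} mi: quoted {quoted:.0f} min, realistic {real:.0f} min, "
              f"late by {real - quoted:.0f} min every single time")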

Are the models for delivery time and worker allocation completely wrong? Or are the national delivery companies just outright lying and seeing how far they can get away with it? It’s hard to tell. But either way, they’re normalizing inaccurate wait times and shitty service while selling a narrative of data modeling.

These are just two examples of how data modeling has eased its way into everyday consumer business interactions only to disintegrate in ways that frustrate both customers and workers and undermine the reputation of the brand. At the same time, because these dynamics are so pervasive, we’re collectively learning to live with a dreadful world in which models are being used to abuse our sanity. Diane Vaughan warned us that “normalization of deviance” in organizations helps make systems more brittle in ways that can make accidents inevitable. I can’t help but think about what this means at the societal level.

As I sit and think about these examples, I can’t help but wonder what other people are seeing. Where are you seeing data modeling in the mundane parts of everyday life that appears to be really flawed (or perhaps deceptive, abusive, malfeasant)? Our imagined Jetsons future came with many promises about how technology would make life better, but I keep reflecting on the subtle and mundane ways in which technology-in-the-loop is adding layers of frustration under the guise of “efficiency” or “scale.” What other everyday examples should we be thinking with?

PS: Never forget that in Neal Stephenson’s dystopic metaverse, pizza delivery suuuuuuucks.

Kimberlee Weatherall

Law Professor; Research leader at ARC Centre of Excellence for Automated Decision-Making and Society; Member of Commonwealth Government AI Expert Group


Easiest source of these is to look at how you've been categorised into advertising segments on social media. Full of #failure.

Ian Soboroff

Group Leader, Retrieval Group at NIST


My local large grocery chain charges different unit prices on berries and other fruit depending on the container… for example, a pint of blueberries is more expensive per ounce than a container three times as large. They never post unit pricing. Some data science evil is at work someplace.

