
Why we should be wary of highly targeted information and ads

In his post Why we fear Facebook and why we shouldn’t, Paul Jacobson makes an interesting counterpoint to the common refrain that it’s bad to share our personal data with companies:

Conventional wisdom is that if you are not paying for a product, you are the product. That may be true, as a generalisation. I prefer to think it isn’t so much we who are the products on Facebook but rather our preferences and attention. What does that buy us? For starters, it buys us Facebook, Twitter, Google services and more. It also buys us slightly less annoying ads that can be remarkably relevant. It buys advertisers a better chance that we may want to buy their products and services because those products and services may just be what we are looking for at that point in time.

It’s a good question. Is it really that bad to get highly targeted ads in our news feeds? The more targeted the ads are, the more useful they are to us, right? So why is there such pushback against companies like Google and Facebook trying to find out everything they can about us?

I think there are three main reasons why we need to be wary of letting ad-driven companies know too much about our preferences, even if they only use that knowledge to serve us more targeted information and ads.

1. It makes the web smaller

If we only see stuff we’re already interested in, we run the risk of being sucked into the Internet’s “filter bubble”, where it’s much harder to discover new information beyond our current knowledge. Maria Popova puts it like this in Are We Becoming Cyborgs?:

The Web by and large is really well designed to help people find more of what they already know they’re looking for, and really poorly designed to help us discover that which we don’t yet know will interest us and hopefully even change the way we understand the world.

When an algorithmic constraint is placed on the information we see, and that constraint is based solely on our current preferences, we remain safely locked into the world we already know. We become less likely to broaden our horizons with new discoveries.

2. It results in heightened confirmation bias

When we’re steeped in information that confirms our existing beliefs (regardless of whether those beliefs are true), we not only seek out more of the same information everywhere we go, but we also become incapable of changing our minds even when we are eventually presented with the truth (the denial of global warming is a good example). This is called confirmation bias, and Clay Johnson writes about it in the context of media and the Internet in his book The Information Diet:

It’s too high of a cognitive and ego burden to surround ourselves with people that we disagree with. If you’re a Facebook user, try counting up the number of friends you have who share your political beliefs. Unless you’re working hard to do otherwise, it’s likely that you’ve surrounded yourself with people who skew towards your beliefs. Now look beyond political beliefs—how many of your friends share the same economic class as you? […]

Those algorithms are everywhere: our web searches, our online purchases, our advertisements. This network of predictions is what Pariser calls the Filter Bubble in his book by the same name—the network of personalization technology that figures out what you want and keeps feeding you that at the expense of what you don’t want.

So, for example, through its EdgeRank algorithm Facebook figures out what we like and what we believe in, and then shows us stories and ads that confirm those beliefs. It doesn’t care about truth; it cares about engagement, even when that engagement comes at the expense of what is right.
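To make that concrete, here is a deliberately simplified sketch in Python of what ranking purely for engagement can look like. Every post, topic and scoring rule below is invented for illustration; it is nothing like Facebook’s actual EdgeRank, which is far more complex and not public. The point is only that when the score rewards agreement with what we already like, the most belief-confirming post rises to the top whether or not it is accurate.

```python
# Toy feed ranker (hypothetical, for illustration only).
# It scores each post purely by how well it matches the topics the user
# has engaged with before, so accuracy never enters the calculation.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    topics: set          # topics the post touches on
    is_accurate: bool    # known accuracy; deliberately ignored by the ranker

def engagement_score(post: Post, liked_topics: set) -> float:
    """Score = fraction of the post's topics the user has liked before."""
    if not post.topics:
        return 0.0
    return len(post.topics & liked_topics) / len(post.topics)

def rank_feed(posts: list, liked_topics: set) -> list:
    """Order the feed by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p, liked_topics), reverse=True)

if __name__ == "__main__":
    liked_topics = {"climate scepticism", "politics"}   # inferred from past likes
    posts = [
        Post("Scientists confirm warming trend", {"climate science"}, True),
        Post("Warming is a hoax, insiders say", {"climate scepticism", "politics"}, False),
        Post("Local bake sale this weekend", {"community"}, True),
    ]
    # The inaccurate but belief-confirming post ends up at the top of the feed.
    for post in rank_feed(posts, liked_topics):
        print(f"{engagement_score(post, liked_topics):.2f}  {post.title}")
```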

3. It designs our lives for us

This is true for all advertising, but even more so for hyper-targeted advertising: it tries to sell us stuff we don’t necessarily need. Yes, I know we’re tired of hearing how we should all live with less stuff blah blah blah. That’s not necessarily what I’m saying. What I’m saying is that we need to be careful that we don’t become a society built around the needs of corporations. David Cain talks about this in his chilling essay called Your Lifestyle Has Already Been Designed:

We’ve been led into a culture that has been engineered to leave us tired, hungry for indulgence, willing to pay a lot for convenience and entertainment, and most importantly, vaguely dissatisfied with our lives so that we continue wanting things we don’t have. We buy so much because it always seems like something is still missing. […]

The perfect customer is dissatisfied but hopeful, uninterested in serious personal development, highly habituated to the television, working full-time, earning a fair amount, indulging during their free time, and somehow just getting by.

There’s nothing wrong with stuff, of course. But there is something scarily wrong about the way we let our desires be dictated by advertising — especially targeted advertising by companies that know us so well.

What it means…

I don’t think our biggest fears about the data that companies collect about us should revolve around identity theft or the government coming to get us (although, in some regions, those are certainly legitimate concerns). Our biggest fear should be what Huxley points to in the future he paints in Brave New World: that we will be ruled by what he calls “man’s almost infinite appetite for distractions”. Huxley believed we should fear those who aim to control us by inflicting pleasure on us, and I think he might have been on to something.

I know that sounds really alarmist. But still, I can’t look at my Facebook news feed and not think about this possible future. That’s why I think we should hold our personal data and preferences just a little bit closer to our hearts.