You know the feeling. Whether it is for access to an app, a subscription, or your favorite pair of sneakers.
You press “add to cart”, finalize your online purchase, and fill out your payment details at the checkout. Just before you hit the ‘Buy’ button, there’s that all-important box to check: the terms & conditions.
It’s no coincidence that volumes of text in 3-point font show up right at that last stage of the transaction process.
A familiar feeling of annoyance aside, you click “Agree”.
By agreeing with what the tiny words say, you acknowledge giving up your personal data and the consequences that come with it.
And in today’s increasingly data-driven world, that can have profound implications.
Learning to Agree on Things You Don’t Know
The economics associated with deciding whether to conceal or disclose our personal details has become especially important now that we are firmly in data’s gold rush era.
With firms having access to increasingly granular data about their customers, it seems that both economic benefits and costs of using personal information are being magnified for all parties involved – the consumer, the company and society at large.
Obtaining personal consumer data (and forming aggregates of those) enables companies to identify and profile customer characteristics in unprecedented ways – such as the price customers are likely to pay, how they pay, the mix of products they tend to buy together, and many more creative combinations.
Not just that.
Beyond the ones and zeros they collect, companies are also building increasingly large and deep datasets from which to learn. Even they themselves don’t yet know what they will be able to produce from that gold mine.
This is a special form of information asymmetry.
Although neither side is fully certain of the additional knowledge that can be learned from that data in the future, one side will certainly know much more than the other side as time progresses.
What the customer receives, however, is quite certain – the fulfillment of instant gratification, or the feeding of a compulsive habit. The point is that customers agree to give up information of unknown but growing value, in exchange for a utility that is experienced today.
Deep Problem with Deep Learning
Citing his own research, Alessandro Acquisti, Professor of Information Technology and Public Policy at Carnegie Mellon University, observed that consumers are decidedly worse off where there is no privacy protection, with consumer surplus being extracted entirely by sellers: “The point, here, is that there is a very clear and obvious economic rationale for privacy.”
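The mechanism behind that claim can be sketched with a toy model (the numbers here are made up for illustration, not drawn from Acquisti’s research): under a single posted price, every buyer whose willingness to pay exceeds the price keeps the difference as consumer surplus; with enough personal data, a seller can instead charge each buyer exactly their willingness to pay, transferring all of that surplus to itself.

```python
# Toy model: surplus extraction via personalized pricing.
# Willingness-to-pay (WTP) values are hypothetical.
wtp = [10, 8, 6, 4]   # each buyer's willingness to pay
cost = 3              # seller's unit cost

# Uniform pricing: the seller picks the one posted price that maximizes profit.
best_price = max(set(wtp), key=lambda p: (p - cost) * sum(w >= p for w in wtp))
buyers = [w for w in wtp if w >= best_price]
uniform_profit = (best_price - cost) * len(buyers)
consumer_surplus = sum(w - best_price for w in buyers)

# Perfect price discrimination: each buyer is charged exactly their WTP,
# so consumer surplus drops to zero and profit absorbs it all.
discrim_profit = sum(w - cost for w in wtp if w >= cost)

print(best_price, uniform_profit, consumer_surplus, discrim_profit)
# → 8 10 2 16
```

With a uniform price the seller earns 10 and buyers keep 2 in surplus; with perfectly personalized prices the seller earns 16 and buyers keep nothing – the surplus has been extracted entirely.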
This raises the question of whether we as individuals can even remotely gauge how much our privacy is, or will be, worth.
Given the dynamic information asymmetry discussed above, it is only rational for companies to invest in technologies and capabilities that extract more value from all the data they have just collected from you – after you agreed (with a compulsively clicking finger).
Companies can’t stop it. With increased profits, their shareholders are likely to assume the companies are doing a fantastic job. Professionals (the lawyers, the “compliance” people) work their usual rounds to make sure things are legal. Lawmakers, slow to learn and respond, trail behind with laws meant to protect us.
Beyond the known problems, data scientists and coders are eager to sharpen their superhuman skills to extract even more value out of the increasingly rich datasets!
It’s quite a riskless bet for the corporate world, I would say.
As if clicking “Agree” with your index finger weren’t enough – soon you might as well give them a thumbs up for how much they get to know about you.
When the world is so thoroughly optimized with ever more data, one has to wonder where all that knowledge is put to use.
It can be used to make a better toothbrush. It can make us more productive at work, travel faster and achieve more goals in a shorter time.
It can also command our attention on demand and make us respond predictably, optimally, to stimuli. Individual idiosyncrasies become so well studied that the insights are even offered for sale on the marketplace.
New insights about us will be increasingly commoditized in exchange for a better user experience in everything. But a better user experience comes with more predictable mass behavior, yielding fewer surprises and lower-value data over time.
To step up the game, this “competitive extraction” of insights can only lead to one of two outcomes, or both. Either more granular customer data is extracted and offered to the marketplace, so there are more idiosyncrasies and patterns to learn from (i.e. higher-value data); or the cost of obtaining private data becomes negligible, thanks to an orchestrated effort to engineer your experienced convenience, utility, or happiness (e.g. addiction).
The availability of 5G networks and IoT technology will most likely achieve the former.
That’s not to mention the other, non-pecuniary pitfalls of disclosing our personal, often sensitive information, especially if that data ends up in the wrong hands. Issues such as identity theft and data breaches have been well publicized over the last year or so. And when information falls into the hands of unintended parties, there can even be a societal price for individuals to pay.
Given this slew of unknowns, then, I’d suggest we don’t yet have an accurate notion of what we are giving up when we relinquish our privacy.
If we did, we could more realistically determine just how much of ourselves we ought to sacrifice in exchange for the utility we gain from the product we are buying.
Of course, there are scenarios where giving up our personal data may yield some benefits. If you contribute your favorite movie genres to an online content vendor, you will receive more tailored recommendations of which movies to watch next. Or in the case of online ads, you may see more targeted ads for products aligned with your interests, which benefits both you and the advertiser.
But again, giving up such information comes at a price that consumers are incapable of evaluating with any real accuracy. The vendor may start targeting you with more expensive movies, or the advertiser may begin using your data in ways you never intended, perhaps by selling it to a third party.
Looking to the future – a future that will rely more and more on insights revealed from data mining and analytics – one wonders just how granular things will become. It’s conceivable that we might end up having to ‘agree’ to stipulated terms just to perform the most routine, mundane acts such as using a public toilet or taking a bus.
Given that customer demand can be engineered and optimized, we will be ‘studied’ with increasing precision and predictability.
For some economists, that may not be a problem, especially if they believe that a loss of privacy is just the price to pay for the great things that big data will bring. But what are the great things? It is a philosophical perspective that has much to do with society’s value system. We will reserve these topics for a future article.