
What Shoshana Zuboff and Mark Zuckerberg both get wrong about privacy, and how you can fix it.

The debate over privacy rights has devolved into a polarized and unproductive shouting match between two opposing points of view. On one side is Silicon Valley, which believes that the benefits of innovation trump any quaint notions of “private” lives. Privacy rights get in our way! On the other is the New York elite – the New York Times Editorial Board and Shoshana Zuboff – who believe our very lives are being threatened in an act of corporate violation. We are here to save you!

Both perspectives are wrong. They’re not wrong because they don’t have valid arguments; they are wrong because they both forgot to ask consumers, citizens, patients, and employees what they think about their own private lives. To put it simply: Zuckerberg and Zuboff think they know better than you do about how you should think about your own privacy. That’s why the so-called “debate” over privacy will never result in meaningful progress.

It’s long past time for a new approach. To get there, we’ll turn the question of privacy on its head and explore the roots of the complex set of tradeoffs everyday people make when they decide whether or not to share private information. When we’re done, we will have a framework that will not only help you make better decisions, but also help you predict how others might make similar choices.

.   .   .

How did privacy become not about people?

Privacy concerns? Uber completes over 40 million rides per month in the United States.

Who would have predicted 10 years ago that we would, routinely, get into a stranger’s car and trust that we would arrive safely?

Privacy concerns? Amazon has sold 100 million Alexa-enabled devices.

Would anyone have known 10 years ago that we would, routinely, allow a listening device into our homes so that we could order pizzas and play music?

Privacy concerns? The Mayo Clinic’s Biobank has over 50,000 participants.

Was it reasonable to guess 10 years ago that we would, routinely, share our health information with a massive private database?

Ten years ago, each of these statements would have seemed ridiculous. Sure, people may be willing to share what they had for dinner on Facebook, but they would never submit to such blatant intrusions of their privacy.

And yet, here we are.

That’s not to say that privacy hasn’t become the subject of intense debate. The New York Times has been publishing articles throughout 2019 in an ongoing series titled “The Privacy Project”. Surveillance Capitalism was one of the top non-fiction books of the past year. Security firm Norton reported that in 2019 alone, 4 billion records were breached.

What does that all mean? We can no longer argue that we aren’t aware of these intrusions, nor can we argue that we don’t know the risks.

And yet, here we are.

The best-selling devices worldwide included Echo Dot, Fire TV Stick with Alexa Voice Remote and Echo Show 5. (Amazon Press Release, Holiday Shopping Season, 2019)

In fact, over two years of intense and overwhelmingly negative media coverage hasn’t made a dent in the growth of the personal information economy – the so-called “internet of all of us”. If that’s true, then business owners, organizational leaders, healthcare experts, and politicians who see consumer, employee, or patient data as critical to their business models must ask themselves a difficult question:

Why is it that people who are aware of data gathering, and who understand its risks, keep clicking “accept”? Will they stop? What happens if they do?

Are people privacy lions or are they privacy sheep?

If we ask the New York Times, they might argue that consumers are only just understanding the true costs, and that once they do, they’ll push back hard. Perhaps they’re right.

If we ask Silicon Valley, they might argue that consumers say they’ll do all sorts of things, but they know better. If consumers want it, they’ll take the risk. Perhaps they’re right.

When I began my research, I saw privacy the same way Shoshana Zuboff and Mark Zuckerberg see it – as a simple paradox born of the personal data economy. The problem was that this simplistic approach didn’t work. It failed again and again to explain the decisions people made when it came to their private lives.

Simple narratives may make good stories, and they may sell books, but they clearly weren’t the truth.

The privacy debate couldn’t lead me to the insights that would help me answer critical business questions. I needed a more practical approach.

  • Will employees consent to listening devices if those devices can prevent harassment and abuse?
  • Are customers likely to adopt a freemium software offering if it aggregates and remarkets their data?
  • Should patients accept a lower cost health plan option if it requires ongoing monitoring?

To get there, I spent the past three years working to understand how individual privacy and the data economy interact. And today, I intend to share with you what I’ve learned. As I do, I’ll ask you to confront the same questions I did during my exploration. And finally, I’ll show you how that broader perspective will pay off for both your personal life, as well as your business.

But before we begin, a warning: You may not like everything you hear, and the questions won’t be easy. And when we’re finished, you won’t have better answers, but you will be able to ask better questions.

Let’s get started.

.  .  .

Perspective #1: Deep History

Private Lives in The World Until Yesterday - What Can We Learn from Traditional Societies?

When you met your colleague or friend this morning, was the first thing you talked about the color, odor, and volume of your last urination? (I hope not.) But tribespeople in New Guinea do. And it makes sense. If you relied on the person next to you for your life and safety, you would be very interested in their elimination habits. It’s probably the best indication we have of overall health absent modern measurement techniques.

It is only since the industrial revolution that we have substituted processes, infrastructure, and technology for other people in securing our survival. Privacy is a very recent invention.

But no system can address all of our needs. We continue to rely on our family, friends, and neighbors to help us navigate the uncertainty and scariness of daily life.

Nextdoor user interface

Nextdoor is a good modern example of this tradeoff. Yes, you could call it a “system”, but it is more accurate to think of it as a way for busy neighbors to help each other figure out what’s happening near them. Sharing in this way means giving up some level of personal anonymity in exchange for the feedback other people can provide – much like a modern version of a tribe.

But let’s not be theoretical, let’s get personal. My first question for you relates to the deep historical perspective I’ve just described:

Would you rather rely on other people for your personal well-being or would you rather rely on systems, infrastructure, and technology?


Make a mark on the continuum between those two extremes where you personally fall. There is no right or wrong answer. And don’t overthink it. Your first impression is what your unconscious brain is telling you is right for you.

Now, let’s keep moving. We have nine more questions to go!

.  .  .

Perspective #2: Global Privacy Law

Our second perspective is one that many of us are familiar with – the legal and regulatory approach. Critics may argue that privacy law simply hasn’t caught up with technology and marketing, and that government is always bumbling and slow. But I’m not so sure that’s fair.

Consider the new California Consumer Privacy Act: it went from idea to implementation in 18 months. That’s pretty fast. If we hop a plane to the European Union, we see the ongoing implementation of GDPR – the first large-scale attempt to address data privacy and the rights of individuals. Leave Paris and land in Beijing, and you have privacy rights – unless the entity that wants to know is the government, in which case the public need outweighs your personal right. More on that later.

Global data privacy laws

My intent isn’t to debate the specifics of any legal or regulatory framework, but rather to show that privacy “rights” depend entirely on where you are.

More than that, all laws have consequences. Here are just a few examples:

  • Can your credit card company still offer effective fraud protection when criminals can hide behind data anonymity?
  • Will hackers in another country care that you live in California?
  • Should abusers on social media be able to hide behind aliases?

There aren’t easy answers. This complexity leads to the second question I will ask you to consider:

Is it better for the law to define privacy for you, or is it better to guarantee transparency and allow people to decide for themselves what they share or do not share?


Mark your spot on the continuum.

.  .  .

Perspective #3: Free

New laws aren’t emerging for no reason. The fastest-growing category of new products, services, and business models relies on information about you as the primary “product”.

To understand how we got here, we need to put ourselves back in the early days of the internet. Business leaders aren’t dumb. They learned from the experience of my hometown university – the University of Minnesota – which, in 1993, decided to charge a small fee for the implementation of its Gopher protocol. Remember Gopher? Most people don’t. We all use HTTP now for a simple reason: It was free.

It’s easy to see why: Who would pay for something that hadn’t really proved much tangible value beyond the military and academic communities? It had to be free to get people to adopt it.

The Gopher server interface

But once people get a taste of “free”, it’s hard to get them to pay for anything else. Marketing people (like me) understand that better than most. Someone had to pay the bill, and advertisers were the only ones who would.

In fact, “Free” is Google’s entire business model. Facebook’s too. Even Netflix – the poster child for getting people to pay for content – is considering an ad-supported model to compete with new streaming services on price (no matter what its executives say). For many business, not-for-profit, and healthcare organizations, “freemium” is (or will be) part of the business model.

But just because it has been that way doesn’t mean it needs to stay that way. You may feel that Pandora’s box has already been opened, but I’ll remind you that no one thought free Google would take off in 1998. We all survived “free, ad-supported” television. There is no reason to think paid Google won’t work in 2020.

The age of ad-supported television

Here’s your question:

Is it better to pay for services so that you can restrict the use of data, or is it better to use free services and accept the release of your data for advertising and other data mining purposes?


Go ahead and put yourself on the continuum.

.  .  .

Perspective #4: Safety & Security

Let’s stick with Google for a moment. Last month in Milwaukee, Wisconsin, Google complied with a request from federal law enforcement for data on any device using Google services within a 30,000 square meter geofence. Investigators were looking to solve a spate of arsons in 2018 and 2019.

Knowing that most people (including, presumably, the criminals) carry smartphones, and that most of those smartphones use some Google service, and that those devices track time stamps and geo-locations, police could then ask for more detail on those accounts that match specific areas and times of interest.
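In principle, the query is simple: find every device whose recorded location falls inside the fence during a window of time, then come back for details on the interesting ones. Below is a minimal sketch of that filtering logic in Python – the data, field names, and radius are hypothetical, and this is emphatically not Google’s actual system, just the idea it implies.

```python
# A minimal sketch of a geofence query on location pings.
# All names and data here are hypothetical -- not Google's actual system.
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

@dataclass
class Ping:
    device_id: str      # anonymized until a follow-up request narrows the set
    lat: float
    lon: float
    timestamp: datetime

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def devices_in_geofence(pings, center, radius_m, start, end):
    """Return anonymized device IDs seen inside the fence during the time window."""
    return {
        p.device_id
        for p in pings
        if start <= p.timestamp <= end
        and distance_m(p.lat, p.lon, center[0], center[1]) <= radius_m
    }
```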

Anonymized Google Data

But here’s a funny thing: if you search for “Milwaukee geofence” on Google, this is what you get: a new feature built into costly Milwaukee-brand power tools to help owners track them down when they’re stolen. Power tools, as anyone in construction knows, are both expensive and portable (for obvious reasons), which also makes them easy and profitable targets for thieves.

Milwaukee OneKey

Neither instance is as simple as it appears.

In the first case, most people get a little uncomfortable about the idea of a “dragnet”, but police are only asking for additional data on specific devices at the scene of a crime … and with a judge’s consent. Even if police clear that hurdle, defense attorneys can still challenge the admissibility of that evidence in a US court.

The second case seems more straightforward. The power tools are simply a matter of tracking property. However, let’s say this drill is in the worker’s toolbox and he puts that toolbox in his trunk. Could his employer use that information to track that employee on his way to a marijuana dispensary? Wouldn’t that tracking be warranted because of the safety risk of using marijuana on the job?

What’s more important: Protecting privacy or preventing harm?

I’ll bet we could spend the better part of a day arguing the details, but I won’t give you that kind of time. Here’s your question:

Is it better to allow law enforcement complete access into our lives in order to protect us, or should law enforcement only respond after a crime has been committed?

Struggling with that one? This isn’t going to get easier.


Mark your spot.

.  .  .

Perspective #5: Privacy Technology

Speaking of not getting easier, now let’s have a discussion of large semi-prime number factorization. I’m kidding, we won’t. Suffice it to say that the biggest security issue today isn’t quantum computers breaking RSA encryption, it’s us. We are the problem.

According to PC Magazine, 35% of people never change their passwords. That’s closer to 80% for so-called IoT devices. We respond to phishing emails. We write passwords on Post-It notes. We toss our health records in the recycler. Security is a cat and mouse game, and unlike in Tom and Jerry cartoons, the cat usually wins.

PC Magazine 35 percent of people never change their passwords

We can try two-factor authentication. Our phones can incorporate biometrics. Online banking can insist on redundancy. We lock ourselves in Faraday cages, for goodness’ sake. But convenience is a more powerful motivator than any of them. We want one-click purchasing and instant answers. That’s the real reason no security system is foolproof. We won’t stand for it.

One click versus two clicks

From a technical perspective, privacy requires constant vigilance. And that is the essence of my next question for you:

Is privacy more important than convenience?


Go ahead. What answer comes to mind?

.  .  .

Perspective #6: Media & Information Flow

Now is a good time to circle back to the New York Times. I’d recommend reading the entire opinion series from the Privacy Project, but be warned. This series falls victim to the oldest truth in media: If it bleeds, it leads.

I pulled out some keywords and phrases from a long piece published on December 19, right before the holidays last year. You can see them below. To be fair, the authors are describing what they believe is a serious problem, but I want you to notice something about the words: They are universally negative – there is no discussion of the positive side of data sharing, no balanced perspective.

New York Times Privacy Project Dec 19 2019 word cloud

This is not to say that the matter isn’t serious, but this perspective hints at a powerful dilemma in modern society – one exemplified by the founders of the internet, Wikileaks, and paradoxically, the New York Times itself.

The dilemma is this: the free flow of information is critical to the functioning of society. Without it, we cannot hold people and systems accountable. The paradox is obvious: In a world where the most important information is about you, it’s hard to have it both ways.

information wants to be free

Where do you stand?

Should information be free, or should information be restricted?


Wow. This is starting to get uncomfortable, isn’t it? You ain’t seen nothing yet. Let’s talk about religion.

.  .  .

Perspective #7: Faith

All major religious traditions address privacy in one way or another. And while I am not – nor do I claim to be – a scholar on comparative religions, it’s not difficult to find an opinion from all the world’s major faiths:

  • Judaism addresses the twin issues of consent and modesty, while stepping back from the perspective that privacy is a right.
  • Islam takes a stronger view regarding the inviolability of the private life.
  • Hindu scholars hold a more nuanced view – privacy exists, but they recognize that the concept is difficult to pin down.
  • Confucian and Taoist traditions seem to favor a different view – that family and society play a stronger role than individual self-determination.

Map of World Religions

But perhaps no religious tradition addresses the polarity of privacy more directly than Christianity. To see that, we need only look at two passages from the Gospel of Matthew. I could have picked many others, but in these two we can clearly see the tension between private faith and public faith (below).

Matthew 6:6
But you, when you pray, enter into your inner chamber, and having shut your door, pray to your Father who is in secret, and your Father who sees in secret will reward you openly.

Matthew 28:19
Go, and make disciples of all nations, baptizing them in the name of the Father and of the Son and of the Holy Spirit.

We may hear plenty about how people are turning away from religion, but that does not mean they are turning away from faith, and it certainly does not mean that our concepts of right and wrong aren’t strongly influenced by religious traditions – even if we don’t consider ourselves “faithful.”

How do you see it?

Do you believe your private life is solely a matter of your inner relationship with your creator (or with yourself)? Or do you see yourself as having a duty to live openly, to serve as an ambassador and example for others?


Mark the spot on the continuum that feels right for you.

.  .  .

Perspective #8: The Greater Good

Let’s expand on that last question from a different perspective. As we’ve already discussed, since the industrial revolution, we’ve relied on systems and technology (rather than other people) to achieve things no small group of individuals could do on their own.

But we’re running up against a wall. In order to break through and understand the toughest and most intractable problems – cancer, climate change, and racism, to name just three – we may need to rethink the importance of other people in our collective lives.

Speaking of personal, let’s get personal.

I am part of the Mayo Clinic’s biobank. It is a massive data collection program with the mission to build an ever-expanding database of people’s personal lives and habits – and how those variables affect our health and the course of disease. The goal is profound: To create the next generation of medicine.

For me, the decision to participate was easy: Cancer took my dad at 61. I would give very much for someone else to be spared that. I might even be inclined to support a mandatory program of data sharing.

Donald Voiovich

In that, the Chinese government might agree with me.

In China, the government is engaging in perhaps the largest data collection exercise in history – they call it social credit – and their goal is to use data in pursuit of the most ancient of Chinese objectives: to create a more harmonious society. Don’t visit your aging parents enough? You might not get a small business loan. That’s oversimplifying, but it’s the basic idea.

China social credit system

To do that, China tracks all manner of information – from everything you buy to everywhere you go to everything about your health, and so much more. Where some in the West view this as Orwellian – a government doing this only to maintain “control” – China’s experience reminds us that “the greater good” is in the eye of the beholder.

With those two examples in mind, here is your next question:

What is more important to you, protecting your individual privacy or contributing to the greater good?


Make your mark.

.  .  .

Perspective #9: The Psychology of Privacy

It’s funny, isn’t it, that in all this time we haven’t talked about the study of the human mind itself – and what psychologists can tell us about the balance between a private and public life.

When most people think of “privacy”, they tend to think of it in very simple terms – as in being away from other people or having your thoughts, actions, or identity shielded from others.

But psychologists say it’s more productive to think of privacy as a “boundary control process” through which we control whom we interact with, how we interact with them, and when and where these interactions occur.

But even psychologists admit that our need for privacy is a combination of nature and nurture. In other words, it is shaped by our introversion or extroversion, the situation we’re in, the culture to which we belong and identify, and our biological heritage. On that last point, we only need to look at our closest living relatives to see that we seem to be more social than solitary.

But even individual chimpanzees have different preferences. How about you?

What’s more important to your emotional well-being in general: Privacy or socializing?


Make your selection.

.  .  .

Perspective #10: Economics

We’ve ended at what I feel is the most honest, and useful, aspect of privacy: That privacy is, fundamentally, an economic question. There are benefits to be gained by sharing information, whether those are social, psychological, legal, religious, or monetary. There are also costs. From an economic point of view, it’s a simple question: Do the benefits outweigh the costs?

To explore this in just a bit more depth, let’s compare the idea of privacy as a right to the idea of privacy as an asset.

As a right, privacy is something to which you are entitled. The downside is that rights are given to you by where you live. You don’t control them. Inalienable rights in one country don’t translate to inalienable rights in another.

Privacy as a right

As an asset, privacy is yours to use as you see fit – to reap the benefits and incur the costs. However, that control comes with responsibility. Ignorance is not bliss, and the powerful will tend to take advantage of the powerless.

Privacy as an asset

With those definitions in mind, here is your final question:

Do you see privacy more as an inalienable right, or more as an asset to be utilized?


Make your final selection.

.  .  .

Predicting the Future of Privacy?

Do you feel a little mentally exhausted? Confused? Frustrated? The first time we truly broaden our perspective, it will seem challenging. Privacy is complicated. It’s high time we respected privacy for the tough series of tradeoffs that it is.

Let me help you make the past 15 minutes actionable. What you see below is my privacy profile, based on the same questions I just had you answer. Now you’ll see the power of this perspective to help me make an everyday decision about my privacy.

Here’s the question:

Should I purchase an Alexa-enabled virtual assistant for my home?

I could use any (or all) of my answers to the 10 questions to help me decide, but my research has taught me that middle-of-the-road opinions don’t drive much action. Strong opinions do. In my case, my three strongest opinions refer to security, the greater good, and economics.

From a security perspective, I know that the more people who are involved in the process, the less secure information is. At this point, we know that Amazon is using humans to “train” its artificial intelligence. The likelihood of a breach affecting me personally might be remote, but the likelihood of it eventually happening to someone is basically 100%. It’s not irrational to be wary of fat tail risks like this one.

Second, I am not sure that by buying one of these voice assistants I am contributing to the greater good. If the voice assistant were tracking the tone of my voice to help build a database on anxiety or conflict disorders, perhaps. But for ordering toilet paper online? Not so much.

Jason Voiovich - filled in privacy profile with Alexa - choice 2

Finally, the economic tradeoff. The risk I could incur, and the lack of any greater benefit, mean that the Alexa-enabled device isn’t worth the “costs” I would pay – even if it were free of charge.

Jason Voiovich - filled in privacy profile with Alexa - choice 3

So, no. An Amazon Alexa-enabled device isn’t the right decision for me, at this time.

But that doesn’t mean you, using your own chart, won’t come to a different conclusion. Your chart is your decision-making machine.

That’s great for you in a personal context, but how can this exercise help you answer critical business questions?

We can use these questions to build a survey of our patients, employees, or customers, and learn how they view privacy in aggregate. Using regression analysis, we can discover which of the 10 answers is strongest in a given context (see the sketch after the list below). Finally, we can use that analysis to better predict the answers to critical questions, like the ones I showed you at the beginning of our discussion.

  • Will employees consent to listening devices if those devices can prevent harassment and abuse?
  • Are customers likely to adopt a freemium software offering if it aggregates and remarkets their data?
  • Should patients accept a lower cost health plan option if it requires ongoing monitoring?
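To make that concrete, here is a minimal sketch of the regression step, assuming you have fielded the ten questions as 0-to-100 continuum scores and recorded a yes/no outcome such as “accepted the freemium offer.” The column names and data below are hypothetical placeholders, not results.

```python
# A minimal sketch of the survey-regression idea, not a production model.
# Assumes each respondent marked the ten continua as 0-100 scores, plus a
# yes/no outcome. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

QUESTIONS = [f"q{i}" for i in range(1, 11)]  # the ten privacy continua

rng = np.random.default_rng(0)
X = rng.integers(0, 101, size=(500, 10))                          # survey answers
y = (X[:, 2] + X[:, 9] + rng.normal(0, 20, 500) > 110).astype(int)  # toy outcome

model = LogisticRegression(max_iter=1000).fit(X, y)

# The coefficients hint at which of the ten perspectives is "strongest" in
# this context -- i.e., which answers actually move the decision.
for name, coef in sorted(zip(QUESTIONS, model.coef_[0]), key=lambda t: -abs(t[1])):
    print(f"{name}: {coef:+.3f}")
```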

But the real question for you is:

What different decisions might you make if you knew this information about how your critical stakeholders view privacy? I can’t answer these questions for you, but I know who can.

Because not knowing how your customers feel about privacy – in this next “personal” information age – is as irresponsible as failing to test your product before you launch it.

More than ever, understanding privacy is a business imperative.

Now you have a tool to start asking better questions.

###

This isn’t the first time I’ve tackled the privacy issue, although I think the piece you just read is the furthest along in my thinking on the topic. Interested in past work? You can see how much I struggled with the issues, and also get a sense for how I ended up where I did. Here goes:

“Alexa, play some music” isn’t the only time Amazon is listening to you.
Using Google Maps costs more than you think.
Data Exchange Networks, AI interrogators, and corporate espionage (Chapter 2 of the Dr. Thomas story)
Your “smart” TV is a dumb idea
Messing with data: The 10-step subversive instruction manual to hit the tech companies where it (really) hurts.
What if someone offered $6,495 for your private data? Would you sell?
You don’t have a right to privacy. You have something better.
In America, your digital freedoms are what the tech companies say they are.

###

About Jason Voiovich

Jason’s arrival in marketing was doomed from birth. He was born into a family of artists, immigrants, and entrepreneurs. Frankly, it’s lucky he didn’t end up as a circus performer. He’s sure he would have fallen off the tightrope by now. His father was an advertising creative director. One of his grandfathers manufactured the first disposable coffee filters in pre-Castro Cuba. Another grandfather invented the bazooka. Yet another invented Neapolitan ice cream (true!). He was destined to advertise the first disposable ice cream grenade launcher, but the ice cream just kept melting.

Jason Voiovich and grandpa

I think this photo explains a lot about why things didn’t turn out the way I’d hoped.

He took bizarre ideas like those into the University of Wisconsin, the University of Minnesota, and MIT’s Sloan School of Management. It should surprise no one that they are all embarrassed to have let him in.

These days, instead of trying to invent novelty snack dispensers, he has dedicated his career to discovering why people do what they do – because that’s the only way we’ll tackle our biggest challenges and accomplish the next great thing.


Big Data promised less (and better) marketing. It hasn’t worked out that way.

Consumers: So, remind me again why I need to give up oodles of my private data?

Marketing: Well, not only do you get to use our awesome products for free (or for less than their true cost), but we also will use that data to stop bombarding you with irrelevant advertising.  It’s better for us because we can be more efficient, redirecting that money into developing better products and services instead of wasteful advertising spending. And it’s better for you because you see advertising that’s much more useful to you.

You’ve heard some version of this argument from marketing for the past 20 years. If consumers allow marketing to collect ever increasing amounts of data, marketers will use it to produce more targeted advertising. More targeted advertising is more efficient, meaning that (ideally) marketers should be producing less advertising, not more. As a consumer, you should be seeing fewer promotional messages, and the ones you do see should be much better.

Who among you thinks that is true?

I certainly don’t.

Let me walk you through just one example.

My wife and I enjoy cooking at home. We patronize several grocery stores, delis, and kitchen supply outlets to find just the right ingredients and tools to try new recipes. (A Thai coconut sweet potato soup was our latest win.) As you might guess, one of the stops on our shopping trips is Williams-Sonoma. We’ve purchased all manner of utensils and tools from them over the years, and we were one of the first members of their “email list” – allowing them to collect data on our purchases at the point of sale, whether that’s online or in store.

You would think that Williams-Sonoma would know us well enough through our extensive data trail to target advertising and offers precisely to our buying habits.

You would think that, and you would be wrong.

How do I know?

I ran an experiment.

From February 1 to March 31, 2019, I collected every email Williams-Sonoma sent to us. During that time, we made two purchases, and in both cases, provided our email address. The test is simple: Do the promotional email messages reflect our buying patterns? In other words, does Williams-Sonoma use the data we provide them to deliver better advertising?

Here is the data summary:

n=175 (number of emails)

d=59 (number of calendar days)

n/d=2.97 (emails per day)*

*This measure of central tendency isn’t hiding anything. Williams-Sonoma sent three emails per day, every day, for two months, save for a couple of exceptions.

What did the emails say? I created a word cloud to help visualize the subject lines. You can see that word cloud below.

The most immediate and obvious conclusion is the dominance of the word “Percent”, which relates to some sort of “percent off” offer, anywhere from 20 to 75 percent. This is a typical example:

LE CREUSET **Special Savings** & Recipes + up to 50% Off Spring Cookware Event
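If you want to reproduce the tally behind a word cloud like this one, a few lines of Python will do it. This sketch assumes the 175 subject lines were saved one per line in a file (here called subjects.txt, a hypothetical name), and the stopword list is illustrative rather than exhaustive.

```python
# A quick sketch of the keyword tally behind a subject-line word cloud.
# "subjects.txt" is a hypothetical file with one subject line per line.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "to", "of", "and", "for", "your", "you",
             "up", "on", "off", "in", "with"}

counts = Counter()
with open("subjects.txt", encoding="utf-8") as f:
    for line in f:
        text = line.lower().replace("%", " percent ")  # match the cloud's "percent"
        counts.update(w for w in re.findall(r"[a-z']+", text) if w not in STOPWORDS)

# The most frequent tokens dominate the cloud -- in the data set described
# above, that was "percent", brand names, and urgency words like "today".
for word, count in counts.most_common(15):
    print(f"{word}: {count}")
```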

The rest of the data set is barely worth an analysis at all: Williams-Sonoma has an inventory of brands to sell us. They’re experimenting with different percentage offers, different levels of urgency (today only!), and different deadlines (Easter is coming!) to get us to bite.

We reviewed all the percentage offers, urgencies, and deadlines: We often buy at full price, because when you’re interested in a specific recipe, you don’t want to wait for a sale. (Wouldn’t you think they’d notice that we downloaded a specific recipe?) We reviewed all the brands featured. We have never bought any of them. (Wouldn’t you think they’d notice what we just bought?)

Here’s the rub: Williams-Sonoma does know all that. They have all of our purchase data, yet they have chosen not to use it.

#

It may seem like I’m picking on Williams-Sonoma, but I could just as easily have picked any number of brands. I suspect you could hunt through your inbox and find a dozen examples of bizarre, irrelevant marketing from brands you patronize as well.

But this was just one example. Other brands do better, don’t they? Perhaps the macro-trend is heading in the right direction, and brands such as Williams-Sonoma eventually will be out-competed by brands who are more efficient and can redirect that excess capital. Perhaps this is just a symptom of struggling retailers. If that were true, what might we expect the macro trends to look like?

First, we might expect that marketing spend would be growing at a rate at least equal to, but ideally lower than, population growth. In other words, the ratio of marketing dollars per person on the planet should be shrinking over time. Is that the case?

The chart below shows global marketing spend growing at 3.9% per year:

Source data

The next chart shows global population growth slowing over time, about 1.0% per year during the same period.

In other words, marketing is spending more per person each year, not less.

But wait, you say. Population growth is not necessarily an indicator of economic growth. It would be fairer to look at global GDP growth over the same period.

Great. Let’s do that.

Source data.

Over the same period, we see global GDP growth averaging 3.6% per year. In other words, at an average of 3.9% per year, marketing spending is outrunning GDP growth by roughly 8% each year. And because North America and Western Europe are the largest marketing “markets,” and those regions are growing more slowly than Asian markets, the overshoot is even higher.
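For those who want the arithmetic spelled out, here is the comparison using the article’s rough averages – 3.9 percent annual growth in marketing spend versus 3.6 percent for global GDP.

```python
# The arithmetic behind the comparison above, using the article's rough averages.
marketing_growth = 0.039   # ~3.9% annual growth in global marketing spend
gdp_growth = 0.036         # ~3.6% annual growth in global GDP

# Per-year gap: marketing spend grows about 8% faster than GDP.
per_year_overshoot = marketing_growth / gdp_growth - 1
print(f"Per-year overshoot: {per_year_overshoot:.1%}")   # ~8.3%

# Compounded over a decade, spend per unit of GDP drifts upward by ~3%.
ten_year_drift = ((1 + marketing_growth) / (1 + gdp_growth)) ** 10 - 1
print(f"Ten-year drift in spend per unit of GDP: {ten_year_drift:.1%}")   # ~2.9%
```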

In other words, for all its data, marketing is becoming less efficient over time. Put simply: Big data is making marketing worse, not better.

#

How on earth can that be?

Let’s refute a number of possible alternative explanations.

Explanation #1: It takes a certain amount of time to realign marketing based on what it’s learning from Big Data. What’s more, that knowledge has yet to completely diffuse into the professional community.

Really? It’s been 10 years, and there is no evidence that the growth rate in marketing spend is bending downward. In fact, it’s accelerating. No, marketing knows what it should be doing, but it is not doing it for a much more obvious reason: There is no downside.

Email protection laws are barely enforced. GDPR is just finding its footing in Europe, but enforcement has been spotty. A state-by-state patchwork of privacy laws in the United States isn’t likely to do much better. Enforcement takes resources. In other words, marketing has no incentive to be efficient.

Explanation #2: We’re looking at the wrong channels. Email (in the Williams-Sonoma example above) is an “owned” channel, meaning the company does not need to follow guidelines as it would on Google or Facebook. Email might be inefficient because it’s “free,” but when marketers are paying for advertising, they do better.

Really? A shift from tough-to-measure analog media to digital, data-driven media over this 10-year period should have resulted in more efficient performance. But look at the growth pattern in marketing spending over the past 10 years and compare it to GDP. You would expect better data to lead to more efficient use of resources as it does everywhere else in organizational operations, but that is not the case.

Explanation #3: You’re looking at average data, and averages can distort the picture. We should be examining the distribution (variance) in the data to truly determine marketing efficiency.

Really? Marketing success doesn’t follow a normal distribution (aka a “bell curve”); it follows a power law distribution. In other words, a small number of marketing operations and tactics deliver a disproportionate amount of the success. The bottom line is that the vast majority of marketing operations and spending does not generate a positive return on invested capital (ROIC).

Explanation #4: Of course, we know that most marketing doesn’t meet an ROIC threshold. That’s because marketing is an investment in the future of the organization. We’re building a brand, not quarterly returns. Failure is necessary to the learning process.

Really? So, when precisely will “investment” turn into “returns” on that investment? The data over 10 years shows no appreciable return on marketing investment that outstrips economic growth. You may be able to cherry pick organizations or campaigns that deliver good results, but the overall impact is a negative ROIC over the long term.

Explanation #5: Your aggregate analysis hides material differences in the performance of marketing by industry. Put simply, B2B is not B2C and doesn’t need to spend as much. Consumer marketing might be more wasteful, but business-to-business marketing is much more efficient.

Really? My B2B friends, what happens when you count all selling expenses? That includes “marketing”, but it also includes “tradeshows” and “salespeople” and “executive time selling” and a whole host of other goodies you’re probably not counting in the marketing line of the income statement. When you do that, B2B is just as out of whack as B2C.

#

Sorry, marketing. I hate to poop in your sandbox, but none of these explanations hold up. As an organizational function, marketing is not delivering a positive return on investment.

Yes, there is plenty of industry scuttlebutt about how consumers are getting pissed off and opting out. Marketing frets over Netflix and Apple end-running traditional advertising channels by switching to ad-free subscription models. But marketing, I wouldn’t be as worried about consumer anger as I would be worried about the next conversation with your CFO.

The party ends the instant the global economy goes into recession. Marketing bemoans the “short-sightedness” of financial professionals when they look at ROIC instead of “brand health” in their calculations, but what are they supposed to think? The rates of growth don’t match, meaning marketing is delivering a lower return on investment, in aggregate, with each passing year.

A shotgun approach to email – per my example above – is simply the canary in the coal mine.

Ask yourself this question: If you needed to get better results with 80% of your current budget, could you do it? If the answer is “no,” you had better start working on a plan. It might be time to actually use all that “big data” you’ve been so excited about.

Because the day of reckoning is coming.

Good luck.

#

About Jason Voiovich

Jason’s arrival in marketing was doomed from birth. He was born into a family of artists, immigrants, and entrepreneurs. Frankly, it’s lucky he didn’t end up as a circus performer. He’s sure he would have fallen off the tightrope by now. His father was an advertising creative director. One grandfather manufactured the first disposable coffee filters in pre-Castro Cuba. Another grandfather invented the bazooka. Yet another invented Neapolitan ice cream (really!). He was destined to advertise the first disposable ice cream grenade launcher, but the ice cream just kept melting!

He took bizarre ideas like these into the University of Wisconsin, the University of Minnesota, and MIT’s Sloan School of Management. It should surprise no one that they are all embarrassed to have let him in.

These days, instead of trying to invent novelty snack dispensers, Jason has dedicated his career to finding marketing’s north star, refocusing it on building healthy relationships between consumers and businesses, between patients and clinicians, and between citizens and organizations. That’s a tall order in a data-driven world. But it’s crucial, and here’s why: As technology advances, it becomes ordinary and expected. As relationships and trust expand, they become stronger and more resilient. Our next great leaps forward are just as likely to come from advances in humanity as they are advances in technology.

Thank you! Gracias! 谢谢!

Your fellow human.


Data Exchange Networks, AI interrogators, and corporate espionage (Chapter 2 of the Dr. Thomas story)

What follows is the next chapter of the story in a possible future filled with Data Exchange Networks (DENs) that help us sell our private data. Our protagonist learns that not all security is created equal, and that breaches have consequences.

Haven’t read the first chapter? Read here first.

. . .

March 25, 2029

How long had it been? Lynn thought.

Twenty minutes? Two hours? Two days? It was hard to know. Her smartphone and watch were confiscated during her arrest this morning. She had no way to know how long she had been in this room – a small space by even her apartment’s standards. The walls on three sides were painted cinderblocks. What was the correct name for a color that peeled in places with the two prior colors peeking through in random blotches? The flaking concrete walls stood in contrast to the sleek mirror that faced her.

She looked terrible. She felt worse.

A streak of mascara scarred her face – the failed result of attempting to scratch her cheek. Her hands were cuffed to the top of a plain metal desk, giving Lynn about six inches of movement.  Both ankles were chained through a metal loop in the concrete floor. Her metal folding chair was manufactured in an era before luxuries such as “padding.” Her ass hurt.

Just this morning, everything seemed to be going so well.

Her experiment using nine photovoltaic paint samples was running in her lab – a simulation that would take about three hours – giving her plenty of time to teach her introductory physics class. But that was before all of this. Would her software shut down properly? Was the experiment ruined? Would she need to rerun it? Maybe one of her students shut it down for her?

No, Lynn thought. They were freshmen. This group had trouble making it to class on time.

And what must they think of her now?

#

Lynn replayed the scene in her head.

As she walked from her lab to the classroom, Lynn lamented her “junior professor” status at the University. Frankly, she was lucky to be a “professor” at all. Nine in ten “faculty” positions were now “adjunct” instructors – basically, gig researchers. And because none of them would teach this group, the “junior” professor was stuck with them. There were no “A students” in this class. Lynn was surprised they made it out of high school, and even more surprised they were in a good college. She thought darkly that the era of rich people getting their pretty children into good colleges clearly wasn’t over.

Just like the over-promoted high school students they were, they were nearly impossible to manage. But Lynn wasn’t so easily defeated. She decided to run the classic “pendulum” experiment to snap them back in line. This classroom was equipped with a 20-foot chain attached to an anchor on the arched ceiling. To the bottom of the chain, she attached a 15-pound weight.

Lynn remembered the look on Frat Boy’s face as she called him to the front of the class, carefully told him where to stand, and stretched the pendulum’s weight to the tip of his nose. He looked nervous. She knew he was in no real danger – and he would know that too, had he paid attention in class. But she didn’t let on. Lynn made a dramatic production of telling the young woman nearest the emergency phone to be ready to dial 911 in case “anything went wrong.”

Lynn warned him – in a deathly serious tone – not to move.

Frat Boy didn’t breathe as the pendulum released from the tip of his nose, swung in a wide arc toward the back of the room, hung for a moment in the air motionless, and then accelerated back toward his face at alarming speed. Frat Boy flinched, but he didn’t move. About an inch from his face, the pendulum came to a slow stop and reversed direction.

“Good work,” Lynn remembered saying to him. “You may return to your seat.”

A slightly sweatier version of Frat Boy returned to his seat, quickly and quietly.

She finally had their attention.

Good, Lynn remembered. At least we can get through one class without disruption. What demonstration would she run next week to keep them in line?

It was at that moment of contemplation and success that the classroom door flew open.

“Lynn Thomas?” announced a police officer in a crisp black uniform. Her voice was firm. Unfriendly.

“Yes.”

“You are under arrest. Officers, please take the suspect into custody.”

#

The next few moments were a blur. Jonathan Freeman (a mountain of a man, her eyes came up only to the nameplate above his badge) slipped handcuffs over both her wrists. She heard the first officer’s voice as she read what she assumed to be Miranda rights. She had heard the standard lines on Law & Order reruns, but they were a blur now.

She stammered a confused Yes and felt Officer Freeman nudge her toward the door.

As Lynn was being led out of the room, she caught a glimpse of Frat Boy’s face in open shock. He looked scared. She was too.

Students and faculty on the front lawn followed her with their eyes in silence as officers helped her into a waiting police cruiser. Ten minutes later, she was at the police station. After electronic fingerprints and retinal scans, another officer led her here.

The cinderblock room.

Murder? Is that what the officer said?

#

“Doctor Lynn Thomas?”

What Lynn thought was a mirror in front of her flashed on in an instant. Instead of her own face, she was now looking into the eyes of – she could swear – her college roommate.

“Uh, yes?”

“I am here to ask you a few questions. My name is Rachel. May I call you Lynn?”

Okay, this was weird. Rachel was her college roommate’s name too.

She hesitated.

“I’m sorry. You just remind me of someone I knew. Yes, Lynn is fine.”

Rachel smiled.

“I get that a lot. This must be very uncomfortable for you, Lynn. Is there anything I can do for you before we get started?”

“I’d love something to drink. And if it isn’t too much trouble, the wrist restraints are very uncomfortable.”

Rachel winced.

“Yes, I can see them cutting into your wrists. That looks like it hurts.” She looked away and typed. “I just put in a message to the detective in charge of this unit. He should see it shortly.”

Lynn relaxed a little. Rachel (her roommate Rachel) had always looked out for her.

“While we wait for him to respond, I was hoping you could help me clear up a few questions I have. Hopefully, this will all be over soon. Can you tell me where you were last night between 6 and 8 o’clock?”

Lynn started to feel, at least a little, at ease.

“That’s easy,” Lynn replied. It felt good to be using the logical part of her brain.

“I finish teaching a class at 5:30. That night, two students stayed after to talk about the upcoming test. I must have left around 6 and started walking back to my apartment. I stopped by a restaurant for some pho ga.”

“Yes, I have a purchase record here from the Orchid Restaurant for a bowl of soup and a mixed drink.”

Lynn felt a little embarrassed.

“It was a long day.”

“I understand, Lynn. I’m certainly not one to judge.”

Rachel smirked. Lynn couldn’t help it. She smirked too.

“I cross referenced that purchase with your smartwatch’s GPS locator. The two records matched. So, we can establish where you were from 6:12 to 7:38 pm.”

Okay, Lynn thought. This was good. She felt a surge of relief that she had selected that restaurant as one of her “places to try” from her personal data sale in February. Thank goodness! Had she not done that, maybe she would never have gone there. Maybe she would have gone straight home, where she turns off GPS to protect her privacy.

“Do you remember taking a napkin out of the restaurant with you?”

Lynn thought for a moment.

“Yes, I did,” Lynn remembered. “I added too much sriracha sauce to the pho and my nose was running. I wiped my nose with it as I was leaving, but I must have thrown that away.”

“You did,” Rachel replied, her tone noticeably cooler.

Lynn’s heart started to race.

“Officers recovered the napkin in a garbage can about five feet from where a young man was killed that night. The DNA on the napkin matches the DNA medical examiners recovered under his fingernails.”

Lynn couldn’t breathe.

“Are you familiar with an organization named Central Biopharma Specialties?”

“Uh,” Lynn stammered.

“Let me help. They conduct research on BRCA gene variants.”

“Uh, yes, I guess. I am involved in one of their research studies. What does that have to do with this?”

“We obtained a warrant for your DNA records they had on file as part of the study. We used that data to match your DNA to the napkin and to the DNA on the body of the deceased. Our GPS records place you within 10 feet of the scene within five minutes of the murder.”

Lynn could feel her heart beat. It was loud.

“We’re pulling the security footage. We believe it will show you confronting and strangling the victim.”

Silence.

“Lynn,” Rachel paused and regained a measure of compassion in her tone. “It’s time for you to admit what you did.”

#

The door flew open.

A tall woman entered. She had severe features, a ramrod posture, and a brilliant crimson suit. She was terrifying, but oddly familiar.

“This interview is over,” the woman scowled to Rachel. “My client invokes her right to legal counsel.”

The screen immediately switched off.

This new woman reached into her breast pocket, took out a small stack of adhesive notes, and peered into the mirror. After a moment, she carefully placed three notes in separate locations on the mirror.

“Okay, now that we’re not being watched, let me introduce myself. My name is Jessica Fulbright, and I am an attorney. Your attorney, to be precise.”

“But,” Lynn said. “I didn’t hire anyone. I haven’t seen or talked with anyone since I got here. Well, except for Rachel, the detective.”

Jessica smirked.

“Rachel isn’t a detective, she’s an AI meant to put you at ease and conduct initial interrogations. Let me guess, the name ‘Rachel’ means something to you?”

“Um, yeah, Rachel was my college roommate. She even looked like her.”

“Hmm. Figures. I’ll bet the detectives pulled your old Facebook posts, ran a bit of a scrambling algorithm to obscure some of the details, and generated a persona tailored precisely to you. I’ve seen it before. Some law enforcement departments have found AI is more convincing than a human detective at building rapport and encouraging quick confessions.”

Lynn couldn’t believe what she was hearing.

“But … what? What’s going on here?”

Jessica slid a thin silver tablet out of her flawless Coach bag, touched the screen, and scanned it.

“Authorization Jessica Fulbright, practicing legal license XV1998, representing Dr. Lynn Thomas. Request file transfer.”

“Does the client accept representation?” came a firm voice from the tablet.

Jessica turned to Lynn and waited.

Stunned, Lynn didn’t move.

“Lynn, you need to authorize representation, otherwise I can’t see the charges and evidence.”

“But I didn’t hire you.”

“My son did, on your behalf. He is a student of yours. Steven Fulbright. He called me just after you were arrested. Apparently, he’s quite fond of you.”

Frat Boy.

Lynn never would have guessed. She had misjudged him.

“Okay, yes, I give authorization.”

Jessica nodded sharply and started to scan the screen quickly.

“Ah, I see what’s happening here.” Jessica began. “A young man was killed near the Orchid Restaurant last night. Surveillance cameras see you leaving the restaurant and crossing into the proximity of the crime scene just a few minutes before he was killed. See here?”

Jessica turned the screen to Lynn. It was her, walking down the street, stopping to wipe her nose, and tossing the napkin in the trash.

“Police recovered the napkin after they cross referenced nearby restaurant purchase records. Once they had that, they were able to match your DNA records to a trove of genetic information they purchased through a Dark Web broker. They didn’t need a judge to compel the biotech company to release your records because the data had already leaked. It’s solid police work. I would have picked you up too.”

“But,” Lynn stammered.

“Hold on, Lynn. I’m not finished. Ah, I see. The police weren’t able to recover any physical evidence from the deceased. Lynn, they can’t match you to the victim.”

“Wait! That’s not what Rachel … should I even call her Rachel … is she even a her … what the hell is going on!?”

Jessica looked sympathetic.

“Detectives can lie to you during interrogation. It’s a common technique to get a confession. They present just a little more evidence than they have hoping you’ll fill in the details. There’s a reason they tell you that you have a right to remain silent.”

A red light flashed on the tablet.

“What’s this?” Jessica said.

After a quick scan, a satisfied smile came across Jessica’s face.

“I knew it!” Jessica said. “New security footage just in from the deli across the street,” Jessica said. Her tone quickened. “Here’s you … blowing your nose … throwing away the napkin … and … getting on the train.”

A small red light on the restraints on her wrists turned green. The latch popped open. She couldn’t see them, but she felt the restraints on her ankles release in the same moment.

“I still don’t understand what’s going on,” Lynn said, exasperated.

#

The door opened and a giant man walked in. Lynn recognized him. Jonathan Freeman. The same one who arrested her what seemed like days ago.

“It means you’re free to go, although I’m hoping you won’t,” he said. “My name is Lieutenant Freeman. It’s nice to meet you, Lynn, and to see you again, Jessica.”

“You have some balls to walk in here and ask that after how you treated my client.” Jessica stood up and glowered straight at him. He might outweigh her by 100 pounds, but at 6 feet 2 inches, they stood eye to eye.

“We couldn’t be too careful,” Jonathan said, nodding respectfully at Jessica and taking one of the (padded) chairs in the room. “In about 30 minutes, the NYT/WaPo will publish the details of a major genetic data breach. Ordinarily, it would take days or weeks for leaked information to turn up on the black market, but this case was different. We tracked a criminal organization that purchased 62 people’s records within just a few minutes. Along with the Google Maps data breach yesterday afternoon, organized crime members have been able to kill 13 people in the past 24 hours – each time planting evidence pointing to a different hacked victim. We needed to verify Dr. Thomas’ identity and alibis before we could release her.”

“What? Data breach? Google Maps? What is happening here?” Lynn couldn’t believe what she was hearing.

“Dr. Thomas, are you familiar with the MENSA Data Exchange Network?”

“Lynn, stop. You are free to go. You don’t have to say anything more.” Jessica glared at the detective.

“She’s right, Lynn, you don’t. But before you go, let me explain the situation. You don’t need to say anything.”

Jessica looked at Lynn, questioning. Lynn nodded. She wanted to know.

“The MENSA DEN was the place the genetic hack originated. Once hackers got into their database, your own security measures were compromised as well. Our records indicate you accepted coupons from the Orchid Restaurant, and that you used Google Maps to allow advertising notifications for that restaurant.”

Lynn remembered the push notification on her phone on her way back from class. Had she not seen that … oh my God.

“It seems like you’ve been set up to take the fall for this. The victim’s name is Muhammed Farooqi, in the United States from Qatar.  Does that name mean anything to you?”

Lynn looked to Jessica. Jessica nodded.

“Um, yeah. He is the coordinator for a Qatar-based science group. I have been paid to try to recruit female science students using a private social network.”

“Muhammed Farooqi is not his real name, and unfortunately, the group isn’t real either. Do you know of anyone who might have a grudge against you or have a reason to hurt you?”

“This is unbelievable. What could I have that anyone wants? I’m barely making ends meet while I work on my startup. I mean, I just had the first successful experiment on photovoltaic paints. I’ve cracked the 80 percent efficiency barrier. I had a meeting with a venture capital team tomorrow. If I can replicate the results in my lab, they said they’d fund large-scale production.”

Lynn stopped herself. The paints she had left in the lab.

“I don’t understand half the words you said,” Jonathan sighed, smiling a bit nervously. “But I know one thing. Someone wants you gone, and we need to find out why. You’re not safe.”

#

Obviously, this is a work of fiction, and probably not a very good one. The more I experiment with fiction, the harder it gets. I was under the foolish impression that fiction would be easier than non-fiction. It’s not. Thanks for playing along.

That said, I think this dramatization raises some important questions:

  • Does a future of private data monetization make the impact of data breaches riskier? Less risky? Does it introduce new risks we haven’t considered?
  • Many people struggle to understand the underlying technology behind data collection and privacy today – is making data brokering more common better or worse for consumers?
  • Is it ethical to create data exchange networks where people do not have the expertise to use them and protect themselves?
  • Do data exchange networks introduce more vectors for hacking?
  • Will criminals use this data for more elaborate crimes, rather than simple phishing schemes and credit card fraud?
  • What about the next generation of corporate espionage?
  • Is it acceptable for the police to subpoena these records in the course of an investigation? How about when the data have been hacked and posted on so-called Dark Web sites?
  • Should the police be able to use AI interrogation techniques?

Fiction is just that, fiction, but I think it can help us understand real life in a way an explanation of the facts alone cannot. Simply understanding how genetic information is shared is one thing; seeing how it could positively (and negatively) impact real people is another.

I don’t have good answers, but I’m getting better at asking good questions. That’s the place we need to start.

#

About Jason Voiovich

Jason’s arrival in marketing was doomed from birth. He was born into a family of artists, immigrants, and entrepreneurs. Frankly, it’s lucky he didn’t end up as a circus performer. He’s sure he would have fallen off the tightrope by now. His father was an advertising creative director. One grandfather manufactured the first disposable coffee filters in pre-Castro Cuba. Another grandfather invented the bazooka. Yet another invented Neapolitan ice cream (really!). He was destined to advertise the first disposable ice cream grenade launcher, but the ice cream just kept melting!

He took bizarre ideas like these into the University of Wisconsin, the University of Minnesota, and MIT’s Sloan School of Management. It should surprise no one that they are all embarrassed to have let him in.

These days, instead of trying to invent novelty snack dispensers, Jason has dedicated his career to finding marketing’s north star, refocusing it on building healthy relationships between consumers and businesses, between patients and clinicians, and between citizens and organizations. That’s a tall order in a data-driven world. But it’s crucial, and here’s why: As technology advances, it becomes ordinary and expected. As relationships and trust expand, they become stronger and more resilient. Our next great leaps forward are just as likely to come from advances in humanity as they are advances in technology.

Thank you! Gracias! 谢谢!

Your fellow human.

Categories
Long Form Articles Rehumanizing Consumerism

Your “smart” TV is a dumb idea

That Hisense 55-inch 4K LED flat screen Smart TV with built-in Roku for $349 sounds like a great deal, doesn’t it?

This isn’t some “Black Friday” special or a “scratch and dent” fire sale, this is the regular price. At some retailers, you might even get a special offer – I’ve seen this model sell for as low as $299. Want to go even lower? Best Buy’s Insignia-brand model retails for a bit cheaper. Prefer a big-name brand? Samsung, Sony, VIZIO, and others all offer similar models in the same price range.

But if you buy one of these today, you might be disappointed. A new wave of Smart TVs from Xiaomi and Huawei, reported to cut that price by more than half, is on its way later this year. That’s right, these new 55-inch 4K LED Smart TVs might start to approach the $150 mark. At some point in the future, we could see a scenario in which the Smart TV comes free as part of a package of “cable” or “streaming” services. Smartphones use that pricing strategy today. Free Smart TVs might arrive as early as 2020.

Pretty cool, huh? Wouldn’t you like to pick up a new 55-inch flat screen for the price of a nice dinner and bottle of wine? Low-cost manufacturers are already seeing success in the Indian market; if the reports are true, the rest of the world doesn’t have long to wait.

I can almost hear my dad…

If it seems too good to be true, it probably is. 

He’s right. I intend to show you just how much you’re paying for a Smart TV.

#

Let’s start with a basic rundown of the critical features most people look for in a new television:

  1. Screen size: We can almost stop the list right here. When surveyed, buyers talk about additional features, but the truth is that most people make their buying decision based on screen size (measured diagonally from one corner to the opposite corner). The bigger the better – up to a point. In practical terms, the television needs to fit in your car or truck (or you need to be comfortable paying a delivery fee) and it needs to fit on your wall. Buyers will say they bought that monster screen so that they can stay home instead of going to a theater, but that’s usually not the case. Heavy entertainment users (most Americans) do both. The actual reason is quite simple: People buy “big” to impress their friends and neighbors.
  2. Screen resolution: This is the second most sought-after feature. Resolution is essentially a measure of picture quality. The most common measurement is the number of “pixels” on the screen, and here’s where it gets a little confusing:
  • 480p SD: Standard Definition, or 640 pixels wide by 480 pixels tall. You’ll have trouble finding one of these models today even if you wanted one.
  • 720p HD: This is the first so-called “High Definition” standard, with dimensions of 1280 pixels wide by 720 pixels tall. These are the “cheap” HD screens.
  • 1080p HD: Often (confusingly) called “Full HD” with dimensions of 1920 pixels wide by 1080 pixels tall. The average consumer can be forgiven for looking at the “1280 pixels” in the other HD standard and believing that was “more pixels than” the 1080p HD model. Marketers aren’t always clear on which dimension they’re referring to, as we’ll see.

Okay, watch what happens now. It’s a little marketing trick. Instead of using the vertical pixel dimension, marketing switched to using the horizontal pixel dimension. That’s not necessarily inaccurate, but it’s not very clear either.

  • 4K Ultra HD: If we were using the same standard, 4K would be called 2160p HD…or 1080p HD should really be called 2K. Confused? Most people are. But in practical terms, with dimensions of 3840 pixels wide by 2160 pixels tall, 4K is often clear enough to see nose hairs on your favorite actors.
  • 8K (Superlative TBD) HD: Still rare, these screens have dimensions of 7680 pixels wide by 4320 pixels tall. Get ready for a journey past the nose hairs and into the nasal cavity. How about “Nasal HD”? No?

The final confusing bit is the relationship between screen size and resolution. A smaller 4K screen will appear clearer to your eyes than a very large 4K screen. Same number of pixels, in a smaller surface area, equals sharper appearance. That’s why you can get away with 1080p HD on the smaller screens and they look just fine…but the larger screens appear to benefit more from the higher resolution (this is called “pixel density”). And yes, television wonks will wax philosophical about signal bandwidth, image contrast, and color quality, but most people can’t tell the difference. (Marketing loves the wonks. You should be suspicious.)
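If you want to see the pixel-density math for yourself, here’s a quick back-of-the-envelope sketch. The formula is simply the diagonal pixel count divided by the diagonal screen size; the specific screen sizes below are illustrative examples, not particular models:

```python
# Back-of-the-envelope pixel density (PPI) calculator.
# Screen sizes below are illustrative examples, not specific models.
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """PPI = diagonal pixel count / diagonal screen size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

examples = [
    ("43-inch 4K", 3840, 2160, 43),
    ("65-inch 4K", 3840, 2160, 65),
    ("55-inch 1080p", 1920, 1080, 55),
]

for label, w, h, d in examples:
    print(f"{label}: {pixels_per_inch(w, h, d):.0f} PPI")

# 43-inch 4K:    ~102 PPI  (same pixels, smaller area -> sharper)
# 65-inch 4K:    ~68 PPI
# 55-inch 1080p: ~40 PPI
```

Same resolution, smaller screen, higher pixel density – which is why the small 4K panel looks noticeably crisper up close.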

Everything else falls down the list quickly. Almost 80% of the purchase decision is made based on screen size and resolution. Other factors matter, but much less so. Different brands use minor differences in port counts, sound system choices, and mounting options in an attempt to separate themselves in your mind. But once your TV is mounted to your wall, size and resolution drive your enjoyment. Everything else is trivial.

I’ve spent time explaining the basics of television marketing to highlight an important problem: Both of the key driving factors in television purchase selection (screen size and resolution) have become commodities, but we’re still vulnerable as consumers to Smart TV marketing that tugs at our egos and confuses rational decision-making.

It gets worse.

This commoditization puts tremendous pressure on less-critical factors in the buying decision, encouraging manufacturers to resort to gimmicks (curved screens) and confusing marketing (blacker blacks) to drive sales.

What’s more, as retail prices continue to drop, the price you pay as a consumer for that new Smart TV barely covers the cost of the large screen, plastics, electronics, packaging, shipping, distribution, retailing, and marketing – if it covers them at all. At $150, it almost certainly does not.

But as the end consumer, why should you care? If a manufacturer wants to give me a Smart TV in exchange for a year of streaming service (that I would have bought anyway), why would I say no? The reality is that cheap Smart TVs are such a win for consumers, that we often don’t think much beyond the price.

We should start. Manufacturers are not in business to lose money. Profit has to come from somewhere. Let’s find out where.

#

I strategically failed to mention one more important feature of modern televisions: Software. Specifically, Smart TV software.

Only 20 years ago, televisions didn’t use software in any meaningful sense. Yes, televisions have long since abandoned mechanical actuators to change channels (and therefore needed basic microprocessors), but consumers saw little evidence of that software beyond crude on-screen displays. Software of that era simply needed to recognize whether the television was on or off, what channel you were on, your volume, and whether an external device was plugged in. In fact, most of the “software” came secondhand from your video game console, DVD player, cable box, or home audio system.

But televisions have come a long way, driven by competition from mobile devices. Manufacturers saw their share of home entertainment under threat from tablets and smartphones, as well as plug-in devices such as Roku, Apple TV, and Amazon Fire TV. Their flexibility (and competitive advantage) came from software, not necessarily better hardware.

Smart TV systems, in various forms, are the television makers’ answer to the iPad. You may not get all the iPad’s flexibility, but you get access to popular streaming services, a smattering of apps, and management of external devices (DVD players, Cable TV providers, on-demand content, and other gaming systems) – all on a huge, beautiful screen.

It’s not hard to understand why manufacturers went this route. They incur massive hardware development expenses, then go through the trouble of getting the big screen into your living room, only to hand the after-purchase revenue over to someone else.

And that’s what this is all about: After-purchase revenue.

#

The television is the least of what you pay for in home entertainment.

With only a brief look at the average monthly bill for content coming through the television, we can see why television manufacturers might want to get in on that. Let’s start with the obvious costs and benefits – the ones you see on a monthly (or on-demand) bill:

  • Cable or Satellite Service: $107/month on average – People might complain about cable television, but they still buy it, often because it’s bundled with other services (phone or internet) or because that is the only way to get access to popular programming (live sports is a common example).
  • Streaming Service(s): $10-$15/month per subscription (many homes have two or more): Netflix, Amazon (part of a Prime membership), and Hulu are the big ones, but they’re not the only providers. YouTube also offers subscription services to avoid its advertising, and AT&T offers a plan as well.
  • Movies and Pay-Per-View Entertainment: $20-30/month: Want to watch the latest movie? Don’t want to buy the DVD? You can buy it through iTunes for $10-$15. Most homes order 2-3 additional offerings each month.

Yes, some people have “cut the cord” and use streaming services in place of cable and satellite services, but many households use both. If we do the quick math, that’s more than $150 per month in entertainment services for an average home. (And we’re not counting internet connectivity and mobile phone plans.) That’s almost $2,000 each year. Now compare that to the falling retail price of the average Smart TV and you’ll understand the appeal of after-purchase revenue.
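For the skeptics, here’s that quick math spelled out. The line items are the figures from the list above, with the ranges collapsed to rough midpoints, so treat the totals as estimates rather than a precise household budget:

```python
# Rough annual entertainment spend, using the (approximate) figures above.
monthly_costs = {
    "cable_or_satellite": 107.00,   # average monthly bill
    "streaming_services": 2 * 12.50,  # two subscriptions at ~$10-$15 each
    "movies_and_ppv": 25.00,        # 2-3 on-demand purchases per month
}

monthly_total = sum(monthly_costs.values())
annual_total = monthly_total * 12

print(f"Monthly: ${monthly_total:.0f}")   # ~$157
print(f"Annual:  ${annual_total:,.0f}")   # ~$1,884 -- almost $2,000
```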

The real money isn’t in selling you a Smart TV, it’s in selling you entertainment.

But again, as the consumer, why should you care how much the television manufacturer makes?

When you use your Smart TV to access Netflix, you’re not paying your bill through Samsung. The Smart TV is simply a portal to organize these services, and you know Samsung needs to make money somewhere. As the consumer, you get the benefit of less clutter, fewer external devices, and an easier user interface. You might even get a discount on some of those streaming services.

What’s not to like?

#

In the modern television ecosystem, you’re not consuming entertainment, you are the entertainment.

Here is the point in the story where we need to introduce Samba TV. It’s not the only provider of television viewer data, but it’s the big one you may have heard of, mainly from this article last July.

In short, if you enable the Samba Interactive TV function on your Smart TV (and about 90% of people do), the company can track your viewing habits, aggregate that data, and sell that data to advertisers. Content providers and advertisers can then use that data – not only in its aggregated form – but also to deliver individualized programming recommendations and targeted advertising. With Samba, television manufacturers (finally) get a cut of the aftermarket.

You can almost hear marketing directors squealing with joy.

Not almost.

Let’s allow one to tell you herself, as described in the New York Times.

Citi and JetBlue, which appear in some Samba TV marketing materials, said they stopped working with the company in 2016 but not before publicly endorsing its effectiveness. JetBlue hailed in a news release the increase in site visits driven by syncing its online ads with TV ads, while Christine DiLandro, a marketing director at Citi, joined Mr. Navin at an industry event at the end of 2015. In a video of the event, Ms. DiLandro described the ability to target people with digital ads after the company’s TV commercials aired as “a little magical.”

That’s why the Smart TV is such a big deal. By centralizing all of your entertainment consumption activity, you also centralize all of your behavioral data. And there is a bigger market for your television viewing data than you might think:

  1. Content Optimization and Ratings Data: The days of the Nielsen set-top monitoring boxes are now painfully quaint. Why settle for a sampling of television viewers when you can gather all of the data from every Smart TV-enabled system? Content providers know not only how many people watched, but at what points they stopped watching, and even at what points they were unengaged with the content. That last one is the most important. Lack of “engagement” isn’t simply taking a bathroom break; actual engagement is more subtle than that. If you’re playing with your kids, you’re not paying attention to the programming.
  2. Product Advertising: Advertisers want to know if you’ve viewed their ads as well as how engaged you were – just like content programmers. But advertisers want much more than that. Instead of delivering advertising and hoping you make a purchase at some undetermined point in the future, advertisers want you to make the purchase immediately. Ideally, right on the screen. That’s the “magical” part DiLandro referred to.
  3. Improving Facial Recognition and Voice Algorithms: You may have wondered how your Smart TV knows you’re watching it and how much you’re actually paying attention. Here’s a hint: Many (most) of these new Smart TVs have both cameras and microphones built in. When you’re watching a modern Smart TV, the Smart TV is also watching you. Older versions could only tell if “someone was in the room,” but newer models can also track where you’re looking on the screen. With newer voice recognition systems, they can also tell if you’re discussing the program or advertising … or talking about something else. They use this data to improve how content (both entertainment and advertising) should be optimized for maximum consumption and conversion.

This data is worth billions. And we just gave it away for a cheap flat screen TV.

#

At this point, it’s fair to think all that monitoring might seem a bit creepy, but it’s not as though they didn’t tell you they were doing it. You can adjust the privacy settings to disable those Smart TV functions if you don’t want them. And who cares if television manufacturers are making money off your data? They’re delivering better programming and targeted advertising. That sounds like a win-win. What’s more, monthly fees from cable and streaming services are expensive enough. At least the Smart TV is getting cheaper.

It’s hard to argue with that logic.

In fact, you could argue (and many have) that a bit of monitoring is a small price to pay for better entertainment and better advertising. The average person in the United States spends about 8 hours in front of the television each day. That surprises you, doesn’t it? You may have thought that computers, tablets, and smartphones have eaten away at that number – and for some segments of the population, they have – but on the whole, people of all generations enjoy consuming content on a big, immersive screen.

And now, finally, the technology inside the television is catching up with the technology of the screen itself. What’s wrong with that?

#

Let’s set aside the issue of providing consent, and how difficult it is to read and fully understand privacy policies. That’s a separate issue, but it’s under your control.

I am going to ask you more difficult questions:

Do you consent to a Smart TV monitoring your children?

What about your kids using your Smart TV when you’re not at home (or when you’re out of the room)? Is it okay for advertisers to ask your children to buy a product they see on the screen? Are you aware of (and do you use) the parental control settings? Do they work as you would expect them to work?

Well, you say, that’s the parent’s job. I don’t want (or need) some intrusive regulation telling me how to raise my kids.

Okay. Let’s ask another question.

Do you consent to a Smart TV monitoring you in your hotel room?

Yes, that same technology exists in nearly every hotel room, and because that Smart TV is not your property, you have little control over its privacy settings. Wiretapping is illegal. Using the Smart TV is not.

Well, you say, the Smart TV is the hotel’s property. They can do what they want. I don’t have to stay at that hotel, and I don’t need to use the Smart TV.

Okay. Let’s ask another question.

What about when companies violate their own policies about sharing and protecting your data?

We’ve seen this before: Last year, the Federal Trade Commission fined VIZIO $2.2 million for selling data on 11 million viewers without their consent starting in 2014. Samba TV skirts this situation by paying television manufacturers to pre-install its software, but it doesn’t sell the data, it sells targeted ads. That seems like an awfully fine line to walk. If internal controls fail at the company, or its servers are hacked, your data is at risk.

Well, you say. Now you’re being silly. That doesn’t happen that often, and those companies get caught. You can’t prevent all the bad stuff from happening. And besides, I have nothing to hide, so I don’t care if people know what TV shows I watch.

I don’t have to agree with you to respect your point.

But I’m not done asking questions just yet.

#

Are you willing to risk espionage from foreign governments?

To help explain why using Smart TVs for espionage isn’t far-fetched, we need to revisit the biggest computing story of 2018. No, it wasn’t the launch of the iPhone XS, or some new AI technology debuting at CES, it was a story about a tiny microchip in an obscure supply chain for ultra-fast server hardware. If you’re an IT professional, this was big news. Most people missed it.

Here’s the short version: California-based manufacturer Supermicro was an important part of the supply chain for several companies, manufacturing circuit boards for high-end, ultra-fast servers used by companies such as Amazon, Apple, and other major corporations (as well as US government agencies) to process huge volumes of data. Allegedly, buried deep in the circuit board was a tiny microchip – a chip that wasn’t supposed to be there, and that the Chinese People’s Liberation Army forced Chinese-based subcontractors to install – that opened a “backdoor” into the server from a remote location.

If it’s true, that’s data espionage. Plain and simple.

The entire story is fascinating. You should read it. Fortunately, impacted companies and agencies discovered the problem and eliminated it (allegedly, they won’t admit it). Predictably, Supermicro vigorously denied the reports. The story is ongoing. In the end, however, it doesn’t matter if it happened precisely as Bloomberg reported it or not. The idea is exposed.

So, let me ask my question a different way. Consumers in the United States alone own about 150 million Smart TVs. What if only one percent of those devices had a “spy chip” installed? That’s 1.5 million potential surveillance devices.

That doesn’t account for the possibility of hacking the Smart TV software – much easier, and far more likely. Research from the team at Consumer Reports (published in 2018) shows Smart TV software was vulnerable to hacking.

They allowed researchers to pump the volume from a whisper to blaring levels, rapidly cycle through channels, open disturbing YouTube content, or kick the TV off the WiFi network.

Researchers (white hat hackers, in this case) couldn’t extract information using these methods solely through the Smart TV interface. But many people use the same WiFi network for their phones and tablets as their Smart TVs. That increases vulnerability to software intrusions that come from elsewhere – say, clicking on a phishing email.

Fortunately, while Smart TV software may be vulnerable, there’s no evidence that hardware tampering has happened or that anyone has found a “spy chip” in a consumer television.

Yet.

But absence of evidence is not evidence of absence.

What if your Smart TV was, unwittingly, a listening device for a foreign government? It makes Russian tampering with Facebook advertising seem quaint by comparison.

This is a big fucking deal.

#

Holy shit, huh?

You didn’t think you’d need to consider geopolitics while browsing for Smart TVs on the sales floor of your local Best Buy.

Sadly, in today’s ultra-connected world, we need to broaden our perspective. But luckily, there are a few easy things you can do today to help mitigate invasions of your privacy, while still accessing the entertainment you want.

  1. Learn the privacy settings on your Smart TV. This isn’t as easy as managing the settings on an Apple iOS or Google Android device. There are many Smart TV versions out there, and even more manufacturer-specific settings. You’ll need to find yours and understand them. Fortunately, the good folks at Consumer Reports have provided a starting point.
  2. Remove any unwanted/unused apps from your Smart TV. Just like your smartphone, any app on your Smart TV might be collecting data, even if you’re not using it.
  3. Be careful of gaming platforms, especially with kids. Microsoft, Sony, and Nintendo have solid protections in place, but many Smart TV-accessible games may not. Know what your kids are doing.
  4. Speaking of kids, learn the parental controls on the Smart TV too. Your kids shouldn’t know the technology better than you do. Sorry, you’ll need to learn.
  5. Find the camera and microphone on your Smart TV. They’re usually described in the instruction manual so that you do not cover them. Cover them.
  6. Unplug your Smart TV when you’re not using it.
  7. Or, if you’re not going to do that, put your Smart TV on a different WiFi network from your other devices – especially “listening” devices such as Google Home and Amazon Alexa, or home security systems.
  8. Is it about time to contact your representatives about GDPR-style legislation? What’s it going to take?
  9. Consider purchasing a Smart TV brand based in a country with a lot to lose from pissing off your home country. South Korea and Japan fall into that category for the United States. China, not quite so much. Although supply chains are global, and many of these manufacturers use Chinese sub-contractors, another (friendly) government provides an extra layer of vigilance.

#

Managing your own privacy is part of modern life. The tech companies won’t do it. They barely think of humans as anything more than moist computers with a checking account. The Smart TV manufacturers won’t do it. They’ve finally entered the data race, and they’re hardly going to stop now. The advertisers won’t do it. They’re addicted to data – however they can get it. The media can’t do it for you. They only report what’s already happened (and by then, it’s too late). Your government can’t do it either. Even GDPR has holes, and even tight regulation can’t protect you from bad actors who simply break the rules and hide.

No, protecting your privacy is up to you.

That’s the price of entertainment.

#

About Jason Voiovich

Jason’s arrival in marketing was doomed from birth. He was born into a family of artists, immigrants, and entrepreneurs. Frankly, it’s lucky he didn’t end up as a circus performer. He’s sure he would have fallen off the tightrope by now. His father was an advertising creative director. One grandfather manufactured the first disposable coffee filters in pre-Castro Cuba. Another grandfather invented the bazooka. Yet another invented Neapolitan ice cream (really!). He was destined to advertise the first disposable ice cream grenade launcher, but the ice cream just kept melting!

He took bizarre ideas like these into the University of Wisconsin, the University of Minnesota, and MIT’s Sloan School of Management. It should surprise no one that they are all embarrassed to have let him in.

These days, instead of trying to invent novelty snack dispensers, Jason has dedicated his career to finding marketing’s north star, refocusing it on building healthy relationships between consumers and businesses, between patients and clinicians, and between citizens and organizations. That’s a tall order in a data-driven world. But it’s crucial, and here’s why: As technology advances, it becomes ordinary and expected. As relationships and trust expand, they become stronger and more resilient. Our next great leaps forward are just as likely to come from advances in humanity as they are advances in technology.

Thank you! Gracias! 谢谢!

Your fellow human.

Categories
Audience Empowerment Information Management Long Form Articles Rehumanizing Consumerism

You don’t have a right to privacy. You have something better.

What if there was no right to privacy?

That question triggers a surge of righteous rage in many people, especially in the Western world. We rank “privacy” right up there with “free speech” and “freedom of worship.” But as we’ve seen (especially in the past 20 years of the information revolution), the notion of privacy has morphed into something more complicated.

For those of us who lived through the transition from the pre-information to the post-information era, this new reality catches us off guard. In the 1980s, privacy was easy. You knew if you were anonymous. You chose to go public. But in 2019, privacy is challenging. Much of the time, you can’t tell if your actions are public or private – surveillance cameras, GPS trackers, and web tracking are so common that the average person could spend their entire day reading privacy policies and never understand half of it.

At the root of the anger is a contradiction: We want the benefits of modern technology without the intrusion to privacy they require. We don’t want our cars to know where we are … but we want GPS navigation. We want low health insurance rates … but we don’t want to share our dietary and exercise habits. We don’t want advertisers listening in to our conversations … but we want the best deals on products and services tailored precisely to us (without having to endure all of the other advertising).

To put it more simply, privacy is like celebrity – we want the kind of each that we can turn on when we want something, and turn off when we don’t. We want enough “celebrity” to get a good table at a busy restaurant … but not enough to get followed by paparazzi. We want enough “privacy” to keep our political beliefs to ourselves … but still get access to Facebook and Google.

Ask any true celebrity. You can’t have both.

It’s the same with privacy. There is no free lunch.

The issue is how we’ve defined privacy. Merriam Webster sums it up quite well:

privacy | ˈprīvəsē |

noun

  • the state or condition of being free from being observed or disturbed by other people: she returned to the privacy of her own home.
  • the state of being free from public attention: a law to restrict newspapers’ freedom to invade people’s privacy.

That definition didn’t burst forth from the earth fully formed. It has a basis in law in the United States. Although the word “privacy” appears nowhere in the US Constitution, federal and state privacy laws cover plenty of ground. We can categorize privacy into four main groups:

  • Intrusion of solitude: physical or electronic intrusion into one’s private quarters (usually, that means your home, but it can mean other private spaces as well, such as bathrooms and your car).
  • Public disclosure of private facts: the dissemination of truthful private information which a reasonable person would find objectionable (the modern practice of doxxing falls into this category, and it is illegal in some places).
  • False light: the publication of facts which place a person in a false light, even though the facts themselves may not be defamatory (libel and slander laws fall into this general area as well, and it gets complicated).
  • Appropriation: the unauthorized use of a person’s name or likeness to obtain some benefit (aka impersonating someone else).

Many states build on federal statutes with their own, more restrictive, laws. Many of those state laws cover technological intrusions explicitly.

With GDPR, the European Union went even further, creating an entire legal framework specifically addressing a modern concept of privacy in a technologically-powered world. It’s a new set of rights and rules that apply to everyone in the EU (as well as a limited set of rights for everyone else).

In many other countries, almost the opposite situation exists: the concept of privacy is subsumed by the interests of the state. China comes to mind immediately, but it is hardly the only one. Those countries made the choice that the benefits of total surveillance outweigh the desires of their populations to keep to themselves.

But beyond the legal frameworks and philosophies, the concept of privacy varies by generation. People who lived before the information revolution see privacy differently than those born after it started. Younger people tend to accept the tradeoffs more readily, or at least they don’t think about the downsides quite so much until something very negative occurs (online bullying as an obvious example).

If privacy can vary so much by law, by country, by culture, and by generation, then logic holds that privacy cannot be a “natural” right.

If that’s true, whatever gave us the idea that we have a “right” to privacy?

 

Remember taxation without representation? Today, privacy is like exposure without consent.

Before the privacy equivalent of the Boston Tea Party breaks out, a (very) brief (and oversimplified) history lesson is in order.

The concept of “privacy” is a new idea in historical context. Pre-agriculture hunter-gatherer bands never had privacy. They traded it for the security of the group. The first cities weren’t much better. Rulers of those small enclaves knew who lived there and much of what went on, for their own survival. It was only when cities became giants (in the latter half of the 19th century) that anonymity became possible – and therefore that a modern concept of privacy could develop … and then only for the privileged.

But it wouldn’t last. The beginning of the 20th century saw the emergence of the “social contract” – older workers living off the resources of younger ones, universal health care (in some countries), and shared defense and sacrifice. Even then, while you may have needed some sort of government identification, you could (for the most part) live “off the grid,” even deep in the city. In fact, that was part of the appeal of “the big city” for many people. The more people who lived in a given area, the less likely you were to be noticed (if you chose not to be).

That all changed with the advent of the internet and has been accelerating ever since. In some cases, we gave up our privacy willingly for greater social connection (Facebook comes to mind). In other cases, we gave up our privacy unwittingly for the implicit promise of better products and services (Google comes to mind). We can cite hundreds of other examples. But while there are definite downsides for this new era of interconnectedness, in most cases, we gave up our privacy for the better quality of life these technologies offered.

Here’s the catch: To function, the technologies require ever-increasing transparency. You can’t remain completely private and still retain all the benefits.

For only a brief window in recent history has there been any true concept of privacy based on the choice to remain anonymous. During that short time, we tasted privacy, we liked privacy, and now we feel that privacy is slipping away.

In other words, privacy as we have defined it and as we understand it is a myth. It’s our poor definition of privacy that sits at the root of our frustration.

It’s time we redefined it.

 

Privacy as a right versus privacy as an asset.

Let’s consider a new definition of privacy as an asset:

asset | ˈaset |

noun

  • a useful or valuable thing, person, or quality: quick reflexes were his chief asset | the school is an asset to the community.
  • (usually assets) property owned by a person or company, regarded as having value and available to meet debts, commitments, or legacies: growth in net assets | [as modifier] : debiting the asset account.

What happens when we do that? Let’s highlight the key differences:

  1. Privacy as a “right” – the state or condition of being free from being observed or disturbed
  2. Privacy as an “asset” – a useful or valuable thing, person, or quality

Do you notice something about the first definition? As a “right,” privacy is something others grant us as individuals. It is the “condition of being free from intrusion.” Do you notice something about the second definition? Privacy is a thing of value that you own. It is a “useful or valuable thing.”

That simple shift makes all the difference.

Our new definition transforms privacy from something others control (they choose not to intrude on us) to something you control (you choose to protect your asset). This may seem like a trivial distinction, but it’s not.

Privacy is still all about choice. It’s simply a matter of whose choice. Shouldn’t it be yours?

It may seem odd to think of privacy that way at first: You can’t own a bushel of privacies. There is no stock market for privacy securities. You can’t pay your mortgage from your privacy account. But that’s because we’re confining the definition of an asset to something tangible. Assets are not simply physical objects. The real value of privacy in the information age is information itself. That’s all privacy is – an information asset.

When we begin to think about privacy as an information asset, we see immediately a number of benefits:

  1. Instead of an abstract right, privacy as an information asset has measurable value. In other words, we can convert privacy into information that could be sold, traded, or invested.
  2. The act of quantifying our privacy and organizing it into categories illuminates its value. In other words, privacy is a set of assets available for your personal exploitation and benefit.
  3. Because privacy is a quantified asset, it’s also divisible. That means there’s more to privacy than “all or nothing.” You can choose some information to remain private, some to share, and some to sell or invest.

What does that mean in a real situation? You can decide to give away your private information to use Google Maps or Alexa. You can weigh the pros and cons. The choice not to use one of these services may be difficult or costly, but it is your choice.

 

Your privacy information asset portfolio.

At this point, many people are confused. That’s natural. Yes, we follow the argument: (1) privacy is a modern creation; (2) privacy (as we know it) is eroding quickly in the face of technological innovation; and (3) it is more useful to think of privacy as an information asset rather than some sort of inalienable right.

The confusion isn’t with the rational argument – it’s with the implication of redefining privacy. In other words, how do we manage privacy in our day-to-day lives?

Privacy is unlike other assets. Sometimes, it is quantifiable like money (e.g. your credit score information), but often it is not (e.g. the value of your religious affiliation). Sometimes privacy exists on a spectrum (you can share a little personal information on Facebook, but not everything), but often it is a binary choice (you have shared your location data, or you haven’t).

The confusion is natural.

Information is such a new type of asset that we can be forgiven for wondering how to think about it. Each type of data becomes part of your privacy information asset portfolio. You get to choose how to invest your assets to achieve your objectives. But to invest with confidence, we need clarity on the assets in our portfolio. Let’s explore those assets and how you might decide your allocation strategy:

Social Data

Social data is an easy start. If you use Facebook (and most people at least have a profile), you’ve shared at least some social data. In return, those services provide a way for you to stay connected with family and friends. If they’re free services (and most are), your privacy assets are the product you’re selling in return for those services. If you’ve ever felt like you don’t get much in return for social networking, what you should start saying is, “I am paying too much for this.” Remember, just because you’re not exchanging money doesn’t mean you’re not exchanging value. Consider switching to a paid social network such as Premo Social. I know people who’ve done it. The modest cost of those services allows you to retain additional privacy, and in effect, “pay” less.

Location Data

This is another easy one…especially in the past ten years. Most (if not all) modern cars have GPS trackers. That technology allows automakers to offer emergency services and car rental agencies the ability to track their car after you rent it. Many also feature built-in navigation systems. All modern smartphones have the same GPS location functions, allowing Apple, Google and others to offer driving, transit, and walking directions to wherever you want to go (not to mention to share that data with other apps). These functions are so common, that you can be found by someone almost anywhere you go. Consider learning how to turn off location services when you don’t want to be tracked. Practicing this habit will force providers to ask you to turn them on and make you aware of just how often your location is being shared. If they want your information, they should make a compelling offer of value. If not, just say no.

Purchase Data

If you’re like most people, you make a lot of purchases from a lot of different providers. Who has that information? Banks, sure. Credit cards, them too. Amazon, yes, but less than you think. How about your corner market, Uber, or Amtrak? You may use a combination of credit cards, checks, online bill-pay, cash, and gift cards. Today’s reality is that no one provider knows your entire purchase history – only you do. Services such as Mint are trying to give you greater visibility into your spending by aggregating as many of these different sources as possible. Even if you don’t sign up for one of these services, it’s worth understanding how they work and the value they bring. When one of them offers to pay you for your data (instead of offering the service for “free”), you’ll be ready to decide.

Financial/Credit Data

Here’s the basic idea behind the credit rating agencies: You’re trading this aggregation of data for the ability to maintain a “credit score.” You can opt out in many cases (or pay in full, in cash, immediately, for absolutely everything), but a credit score is the inevitable consequence of living in a modern economy. (It’s also useful for borrowing money when you need it.) Do you think about your private credit history as an asset to be managed? You should. Frankly, it’s more constructive than feeling powerless when they make a mistake. You wouldn’t let your bank misplace half your paycheck without making a phone call, would you? Well, have you checked your credit report (for free)? You probably should.

Health Data and Biometrics

This is a bigger category than you may realize. Yes, health data includes your medical records (test results, family history, doctor visits, etc.), but it also includes the biometric data captured by your Fitbit, Apple Watch, or smartphone (number of steps, diet choices, blood pressure, heart rate, etc.). In the future, and in some cases today, you will be able to take advantage of your good habits to negotiate lower insurance rates or sell this information to medical innovators. That’s especially valuable if you have an unusual genetic trait or family history. But until there are better protections in place, be careful about sending away for a “low cost” or “free” genetic screening. In the meantime, you can consider signing up for paid pharmaceutical and medical device trials.

Image, Video, and Voice

Pictures of you (or pictures you take), videos of you (or videos you take), and even the sound of your voice have much more value than you realize. Instead of handing photos and videos to a free social network, why not post them to a photo/video sharing network where you could earn some money? Voice is the next generation of human-computer interface, and Silicon Valley is racing to get better at it. Companies are being coy about telling you just how much they’re collecting and analyzing because they’re hoping you’ll give it to them for free or for the “use of their product.” Make them give you more for it.

Employment

LinkedIn gets your detailed career history and job-hunting desires for free (are you seeing a pattern here yet?). But with more people becoming “remote,” “virtual,” or “gig” workers, the traditional linear career path will cease to exist. Your job history is more than a series of employers. Your career successes are simply another set of information assets – the entirety of which only you know. Gig job markets may give you a better idea of your true value than a salary benchmark website such as PayScale.com.

Political and Religious Affiliations

Of all the types of private information people have, political and religious information is the type we’re most likely to give away for free. It may seem counterintuitive, or downright wrong, to think of these pieces of information as “assets,” but bear with me. Don’t think about them in terms of money; think in terms of value exchange. Is it worth it to you to support a political cause? And worth the risk of someone not being your friend because they know that? Then by all means, share that information. The same goes for your faith, although in a more complex context depending on the creed.
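If it helps to make the “portfolio” idea concrete, here is a purely illustrative sketch of how you might inventory these assets and record an intentional choice about each one. Every category name, holder, and decision below is hypothetical – the structure, not the specifics, is the point:

```python
# A purely illustrative "privacy information asset portfolio."
# Categories mirror the ones above; the holders and decisions are hypothetical.
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    KEEP_PRIVATE = "keep private"
    SHARE_FREE = "share for free"
    TRADE = "trade for a service"
    SELL = "sell or invest"

@dataclass
class PrivacyAsset:
    category: str        # e.g. "Location Data"
    holder: str          # who has it today
    decision: Decision   # your intentional choice
    value_received: str  # what you get in return

portfolio = [
    PrivacyAsset("Social Data", "Facebook", Decision.TRADE, "staying connected"),
    PrivacyAsset("Location Data", "Google Maps", Decision.TRADE, "turn-by-turn directions"),
    PrivacyAsset("Political Affiliation", "only me", Decision.KEEP_PRIVATE, "peace of mind"),
]

for asset in portfolio:
    print(f"{asset.category}: {asset.decision.value} -> {asset.value_received}")
```

The exercise of writing it down – whatever form it takes – is what turns an abstract “right” into an asset you actively manage.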

 

Defining privacy as an asset demands being intentional with your choices.

That word intentional is critical. When we think of “rights” we think of something we were born with – that’s where the word birthright comes from. We value rights, but mostly in an abstract sense, and often not unless we’re threatened with losing one.

By contrast, when we think of “assets” we think of something we acquire, earn, and use for our own benefit. If we don’t, we’re being wasteful. That waste can translate into actual money, yes, but we can also waste our relationships, our time, or our happiness.

In the modern world, no matter what Google or Facebook may tell you, there is no free technology. There is always an exchange of value. Most of the time, your privacy is the most valuable asset in the equation.

But now, you should realize that you are in complete control. You simply need to take it.

 

About Jason Voiovich

Jason’s arrival in marketing was doomed from birth. He was born into a family of artists, immigrants, and entrepreneurs. Frankly, it’s lucky he didn’t end up as a circus performer. He’s sure he would have fallen off the tightrope by now. His father was an advertising creative director. One grandfather manufactured the first disposable coffee filters in pre-Castro Cuba. Another grandfather invented the bazooka. Yet another invented Neapolitan ice cream (really!). He was destined to advertise the first disposable ice cream grenade launcher. But the ice cream just kept melting!

He took bizarre ideas like these into the University of Wisconsin, the University of Minnesota, and MIT’s Sloan School of Management. It should surprise no one that they are all embarrassed to have let him in.

These days, instead of trying to invent novelty snack dispensers, Jason has dedicated his career to finding marketing’s north star, refocusing it on building healthy relationships between consumers and businesses, between patients and clinicians, and between citizens and organizations. That’s a tall order in a data-driven world. But it’s crucial, and here’s why: As technology advances, it becomes ordinary and expected. As relationships and trust expand, they become stronger and more resilient. Our next great leaps forward are just as likely to come from advances in humanity as they are advances in technology.

If you care about that mission as well, he invites you to connect with him on LinkedIn. If you’re interested in sharing your research, please take the extra step and reach out to him personally at jasonvoiovich (at) gmail (dot) com. For even more, please visit his blog at https://jasontvoiovich.com/ and sign up for his mailing list for original research, book news, & fresh insights.

Thank you! Gracias! 谢谢!

Your fellow human.