What the Facebook / Cambridge Analytica Scandal Teaches Us About the Future of Our Personal Data


I’m about to reveal a big secret about myself. I love a good Facebook quiz. Whether I’m finding out what I will look like in 20 years or what my leprechaun name is, it’s fun to play these mindless games on Facebook and compare results with friends. If you’ve ever done one, you know it’s easy: you click a single button to agree to share information about yourself, information in your Facebook profile, and information on your Facebook friends. What could be the harm? We figure, “Of course this information is needed to find the accurate answer to ‘What will my Hollywood movie poster look like?’” It seems harmless, so we trust it.

The Facebook platform collects massive amounts of data on us, and it does so in a brilliant way. Imagine a stranger knocking on your door and asking for a list of all your family and friends, along with photos and everything you know about them. No one would ever fall for this. But because Facebook is such a familiar and popular way to connect with people, it doesn’t feel like a stranger to us. We “trust” Facebook, and we use it to store massive amounts of information about ourselves and the people we know. In fact, we trust it so much that when it comes to its “privacy agreement,” we agree without even reading the terms.

The Facebook/Cambridge Analytica debacle has people angry because they assumed there was no risk in how their Facebook data would be used. But in this case, to the shock of the world, Facebook exposed data on 50 million Facebook users to a researcher who worked with Cambridge Analytica. And, as another piece of the puzzle, Cambridge Analytica worked for the Trump campaign. So as the public wields pitchforks at Facebook’s door, the first lesson for us all is this:

#1: Any data that we’re publicly sharing will be used.

And once our data is out there, absent restrictions, we have little control over how it is used. Data is valuable to companies, both in utility and in dollars. So when it comes to any platform that collects and stores data on you, you can assume this data will be used in some way or sold to a third party.

#2: So much more of our personal data exists than what we realize.

It’s scary, I know. Data on you and me is everywhere. And if you have watched my TEDxProvidence talk, you know how the amount of data we’re able to capture has increased exponentially in just the last 15 years. According to Google’s former CEO Eric Schmidt, we now generate as much data every two days as was created from the beginning of time up to 2003.


Our data is used by marketers, by election strategists, by grocery stores, and by prescription drug companies. It’s used by every social media platform, and our data is used by their affiliated companies as well. Simply put, most companies are using our personal data in some way.

#3: Not only are most companies using our data, but the most successful companies are built on data. 

There are 13 companies in the S&P 500 that have outperformed the index five years in a row. The majority of these companies are “algorithmically driven,” meaning they gather data from their users and update the consumer experience almost automatically. These are companies like Facebook, Amazon, and Google. Global business investment in data and analytics will surpass $200 billion a year by 2020. In the future, we will see more and more businesses moving data to the core of their competitive strategy.

What does this mean to us? The time is right for the public to champion a universal code of ethics surrounding our data use.

#4: Our data should be protected by a common code of ethics.

Now that we have a glimpse of what can happen when data sits unrestricted in the hands of others, we need a common set of rules to govern data use. DJ Patil, the first Chief Data Scientist of the White House, reminded us that “with great power comes great responsibility” in his February 2018 call to action, “A Code of Ethics for Data Science.” Coincidentally, this post was published over a month before the Facebook/Cambridge Analytica scandal hit the press. The responsibility of using data appropriately weighs on the minds of many in the data science community.

When my partners and I formed our company, BetaXAnalytics, our founding principle was to use the power of data “for good” to improve the cost and quality of healthcare in the United States. Because we had deep experience in clinical and pharmacy data science, we knew there was a resounding need for ethical transparency for those who pay for health services. We wanted to provide the actionable insight our clients need to make decisions about healthcare services and care coordination.

Because BetaXAnalytics works with healthcare data, the way we protect data is governed by HIPAA, the legislation that mandates the privacy and security of people’s health information. A large share of our time and resources goes toward maintaining data security and privacy. The data we use is governed by strict contracts with our clients, and we never sell data to third parties.

As a company whose business is built on interpreting health data, we live by the mantra “with great power comes great responsibility.” We hope to see this movement grow both within and outside the data science community to work towards using the powers of data “for good.”

- Shannon Shallcross is Co-Founder and CEO of BetaXAnalytics