On the latest episode of The Stacks Podcast, host Patrick Stanley talked to Brittany Kaiser, who in 2018 testified before British Parliament about the infamous Facebook data scandal orchestrated by Cambridge Analytica, the consulting firm where she had been Business Development Director. She is the primary subject of The Great Hack, a Netflix documentary that premiered at the Sundance Film Festival and examines the data scandal and how data can be weaponized for political gain.
Kaiser is also the author of the book Targeted, and co-founder of the Own Your Data Foundation, which trains people to become more digitally intelligent through digital literacy and education so that everyone can enjoy a safe, protected, and informed digital life.
Read on for a synopsis of their discussions about the power of data, how mind viruses are engineered, and the more equitable digital landscape they’re hoping will evolve in the future.
Data is the new oil
To Kaiser, the main learning from her time at Cambridge Analytica that still resonates today is: data is power. “Data is the new oil, and we’re in the middle of an arms race where companies and governments around the world are becoming more powerful depending on how much data they have been able to collect. Because the main pitch for a data-driven strategy is that the more you know about someone, the easier it is to influence them,” she said.
Those who hold that data or technology can use it for good, or for evil. “I’ve always thought that my role in life is to help people use technology to achieve lofty goals that will make the world a better place. But if those tech tools and that data gets into the wrong hands, it is so easy to abuse…and unfortunately that’s what I ended up seeing time and time again during my time at Cambridge Analytica.” Kaiser says the company used a mixture of behavioral, experimental and clinical psychology called psychographic modeling to understand what makes people tick and influence their behaviors.
Kaiser also talked about companies and governments using “soft power” (winning the hearts and minds of people) versus “hard power” (military and weapons) to use communication to stop violence or prevent warfare. “Communications are used in a very scientific way where you collect as much data as possible to understand the population actions, behaviors, thoughts, and opinions you’re aiming to affect. That can be a really beautiful thing if it is used for positive purposes…but it’s also rife for abuse,” she said. “When these strategies are put into the civilian sphere — meaning they can be used for marketing by companies to target individuals by governments — that’s where this really starts to be an ethical and moral question. Should you be able to have enough data about people where you can play on their emotions?”
Kaiser went on to say that what we saw in the Brexit and Trump campaigns were massive abuses of understanding what makes people emotional — and that the current business model of Facebook and many other social media and search platforms is to push the content they know will make individuals emotional to the top of their news feeds. “If it incites fear or anger, then the algorithms push it to the top. This is the content that goes viral — the content that gets more engagement, click-through, and sharing. And that’s the unfortunate place where we’re at today where we’re trying to work on algorithms that have been built, perhaps over decades, that now have a preference for inflammatory content.”
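The feed dynamic Kaiser describes — engagement-weighted ranking that rewards whatever provokes the strongest reactions — can be sketched as a toy function. The field names and weights below are invented for illustration; they are not Facebook's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and comments count for more than
    # clicks, since they signal the stronger reactions Kaiser describes.
    return 1.0 * post.clicks + 3.0 * post.comments + 5.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first: content that drives sharing rises to
    # the top regardless of its accuracy or tone.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a scheme like this, a post that provokes three shares outranks one that merely earned ten passive clicks, which is the "preference for inflammatory content" the passage refers to.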
Engineering Mind Viruses
Stanley asked Kaiser for her thoughts on mind viruses — ideas or belief systems that are intentionally implanted into people’s minds with the aim of being shared with other people.
“Unfortunately this type of viral content is created to be embedded into people’s minds, and it’s very difficult to overcome — which is exactly what they planned for. The strategy is to be able to create ideas, campaigns, and communication strategies that will push thoughts or theories onto the population that are difficult for them to get away from. No one’s better at that than Donald Trump when he uses terms like ‘Crooked Hillary,’ ‘Lyin’ Ted,’ or ‘Little Marco,’ ” she said.
Kaiser explained that in 2016, terms like that were part of a strategy to target a specific segment of voters. The “Crooked Hillary” name may have been used to persuade liberal voters who were considering voting for Hillary Clinton to not vote at all, while another “mind virus” campaign used old Clinton speech footage out of context to make African-American voters believe she saw them as a danger to society. “There’s a very complex set of parallel campaigns and strategies that were pushed out in order to achieve that particular goal. And that’s where the problems with lack of transparency of knowing what data is used to target people with what ads, and for what purpose, becomes a huge problem,” explained Kaiser.
“That lack of transparency on platforms like Facebook — where we cannot track and trace what’s going to whom and where, and why, and how — is why everyone’s so confused, and is why we have the most polarized society we’ve ever had. No one can understand each other because we all have different realities depending on which device we happen to be influenced by,” she continued.
Stanley agreed: “The ability to seed an emotion and a sense of identity to individuals to cement them into your desired outcome…I feel like is just far too easy at this point.”
Kaiser went on to warn that in the 2020 election, Trump will likely be using the same campaign infrastructure that was built by Cambridge Analytica in 2016. She said it’s still the same team and technology, but now it’s bigger and better funded.
“It’s not just what topics are most important to people, like traditional politics,” she said. “Now it’s what do they need to hear? What images, colors, and music do they need to see and hear? What platform do they need to be targeted on, with which type of ad, and which format on what platform? What time of day? It’s incredibly specific. He couldn’t have more information at his fingertips than he does now…he’s using the most sophisticated tools that are available in the world in order to make sure that important populations — populations in swing states, populations that are his core base, his core supporters — are hearing exactly what they want to hear, when, how, why, and where.”
Stanley then asked Kaiser for her thoughts on Mark Zuckerberg’s censorship stance, a question she called one of her favorites.
“Last year, when Mark Zuckerberg made a speech at Georgetown University announcing that he would not be moderating political content or holding politicians to the same community standards that they hold you and I to, it was one of the most baffling, shocking and in the end, criminal acts that I’ve ever seen from a technology executive,” replied Kaiser.
“Some of the content that is coming from politicians — and yes, specifically Donald Trump — is criminal. We have laws against incitement of violence. We have laws against incitement of racial hatred. We have laws that prevent voter suppression. We have laws against slander and libel. Yet the content that comes from the president’s mouth himself, as well as campaign content and political ads from his campaigns and supporting organizations, still crosses all of those boundaries on nearly a daily basis. So our problem right now is that we’re not able to enforce the laws that we already have in society on one of the most influential and largest platforms in the world. And this is a massive travesty for a society, and for the respect and rule of law in and of itself,” she continued.
Dismantling Unchecked Power
Kaiser believes that unchecked power within companies like Facebook that allows them to weaponize data is a major threat to digital equality. Stanley asked how we can take action, and Kaiser laid out her three-pronged approach.
First, the creation of laws that are better aligned with our societal goals. Second: education. “People are not digitally literate. The reason why so many people share inflammatory content is because they don’t even realize that it’s disinformation or fake news, or that someone has put a lot of money behind it in order to make them angry. And so once we have a more digitally literate citizenry, then we will see the effects of the algorithms massively dropping,” said Kaiser. Third, she explained, is the technology that will enforce the laws and regulations, and that will make sure that educated people have access to ethical technology.
“So we have three problems to solve,” posed Kaiser. “What is our legislative and regulatory infrastructure going to look like to make sure that we’re making a better society, not making ourselves worse off? How do we educate a global population to live in a fully digital world without being targeted and manipulated? And then what technologies do we want to invest in and build to make sure that the kleptocratic and predatory tendencies of the tech that we’re using every day today do not become what we see in the future?”
Patrick Stanley: Welcome back to another episode of The Stacks Podcast. This is Patrick Stanley from Blockstack PBC. Today we have a special guest, Brittany Kaiser. Brittany worked for Cambridge Analytica for four years, and following the Cambridge Analytica data scandal, Kaiser fled to Thailand and later testified before British Parliament about Cambridge Analytica and privacy threats posed by Facebook. She started a Facebook campaign in 2018, appealing for transparency, called Own Your Data. She’s the subject of the Netflix documentary The Great Hack. And in 2019 she co-founded Own Your Data Foundation, an organization that trains people to become more digitally intelligent through digital literacy and education, and it aims to equip every individual with the tools they need to live a safe, protected and informed digital life. Brittany, welcome to the podcast.
Brittany Kaiser: Thank you guys so much for having me today.
Patrick Stanley: Yeah. I was thinking maybe we could start with a little bit of your background and then get into a bit of the Cambridge Analytica stuff next.
Brittany Kaiser: Absolutely. Well I first found myself becoming a technologist when I joined the Obama campaign in 2007. I was part of the original team that invented social media strategy. So for the first time we were using social media tools to gather data, understand more about people’s interests and behaviors, and start to craft communications that were meant specifically for individuals and what they believed in and how you could best get them to engage in the political process and with the content that was coming out of the campaign. I learned quite a lot through that, and over the many years afterwards I spent most of my time training as a human rights lawyer and working for nonprofits, charities, human rights organizations around the world, as well as United Nations consulting departments on exactly how they could use marketing and communications to achieve their campaign goals.
Brittany Kaiser: So, no matter what they were trying to advocate for, really I was helping them understand what data-driven strategy means. What technology platforms they could use to further their cause, and in the end how they could achieve measurable results. So when I was eventually introduced to the CEO of Cambridge Analytica by a mutual friend and she described to me a strategy and a series of different tech tools where you could not only achieve maximum impact from your campaigns, but where you could measure engagement rates, where you could show data-driven results of what an impact you were having. I thought, “Wow. This is something that is so exciting where I’ll actually be able to help the politicians and the causes that I believe in achieve their goals.” I ended up hopping on board as a consultant, and the rest is history I suppose.
Patrick Stanley: Awesome. What was the original pitch, or the most powerful lever that was initially pitched to you early on? How was that described?
Brittany Kaiser: So the main learning that still resonates with me today is that data is power. Data is the new oil and we’re in the middle of an arms race where companies and governments around the world are becoming more powerful depending on how much data they have been able to collect, because the main pitch for a data-driven strategy is that the more you know about someone the easier it is to influence them. And depending on who is the holder of that data, or of that technology, you can use that for good, which I’ve always thought that my role in life is to help people use technology to achieve lofty goals that will make the world a better place. But if those tech tools and that data gets into the wrong hands, it is so easy to abuse, and unfortunately that’s what I ended up seeing time and time again during my time at Cambridge Analytica.
Patrick Stanley: Absolutely. I mean 2016, for me, was the year that I realized people are generally programmable via their emotions, and tapping into the right phrase that gets their emotional brain working can get them to a behavior where they’re spreading that information. They’re telling their friends. They’re actually taking action based on online ammo. How much of the emotional component would you say played a role in some of the tactics that were offered to political campaigns?
Brittany Kaiser: So, that is one of the main components, especially of what Cambridge Analytica was doing, but of what politicians and political campaigns and even governments and militaries have done for all of time. Cambridge Analytica’s parent company, the SCL Group, which stands for Strategic Communication Laboratories, was an organization that was really started in the late ’80s and they were a military and defense consultancy that worked in the area of psyops, psychological operations, where you are there to win the hearts and minds. It’s also called soft power, which instead of hard power, using military and weapons, you are using communications in order to stop violence, in order to prevent warfare.
Brittany Kaiser: So communications are used in a very scientific way where you collect as much data as possible to understand the population that you’re meaning to affect their actions, their behavior, their thoughts, their opinions. And that can be a really beautiful thing if it is used for positive purposes, but as I said before, it’s also rife for abuse. So, when these strategies are put into the civilians’ sphere, meaning they can be used for marketing by companies, they can be used for targeting individuals by governments, that’s where this really starts to be an ethical and moral question. Should you be able to have enough data about people where you can play on their emotions if you want to? And what we saw in the Brexit and Trump campaigns especially was a massive abuse of understanding what makes people emotional.
Brittany Kaiser: It’s now the main business model of Facebook and many other social media and search platforms, where what gets pushed to the top of news feeds and search feeds is what makes people most emotional. If it incites fear or anger, then the algorithms push it to the top. This is the content that goes viral, the content that gets more engagement and click through and sharing. And that’s the unfortunate place where we are at today where we’re trying to work on algorithms that have been built, perhaps even over decades, that now have a preference for inflammatory content. So, yes. I mean playing on people’s emotions is the main component. What Cambridge Analytica did is called psychographic modeling, which is a mixture of behavioral, experimental and clinical psychology so that you can understand what makes people tick. What are their levers of persuasion?
Brittany Kaiser: And therefore you can find content that has the correct words and images and sound, if it happens to be film content, that will provoke people’s emotional reaction to the political content. And that needs to start to be tightly regulated, because it’s so easy for bad actors to get ahold of a population and to sway people one way or the other.
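The psychographic idea Kaiser describes — scoring people on psychological traits and matching message framing to the profile — can be sketched as a toy function. The trait names follow the common Big Five ("OCEAN") model she alludes to, but the thresholds and framings below are invented for illustration; they are not Cambridge Analytica's actual system.

```python
def pick_framing(traits: dict[str, float]) -> str:
    """Map a hypothetical trait profile (0-1 scores per Big Five
    dimension) to an ad framing. Thresholds are illustrative only."""
    if traits.get("neuroticism", 0.0) > 0.7:
        return "fear-based"        # anxious profiles get threat framing
    if traits.get("agreeableness", 0.0) > 0.7:
        return "community-based"   # agreeable profiles get family/community framing
    if traits.get("openness", 0.0) > 0.7:
        return "novelty-based"     # open profiles get change/innovation framing
    return "generic"               # no strong lever identified
```

The point of the sketch is the branching itself: two people who see the "same" campaign can receive entirely different emotional appeals depending on what the data says their levers of persuasion are.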
Patrick Stanley: I’m right there with you. I think this conversation has two forks in it. I want to get to both of them, the first one being the engineering of a mind virus, what that looks like, and some examples that you’ve seen. I was reading this thing yesterday from Richard Dawkins about religion being a meme that is essentially replicated over and over again. I think there are other mind viruses or belief systems that can be pretty effectively implanted into people’s minds and then shared to other people. What are your thoughts on that and whether there’s a conscious understanding of creating of a mind virus or meme?
Brittany Kaiser: Yeah, I mean unfortunately the creation of this type of viral content, concepts that get embedded into people’s minds and are very difficult to overcome, is exactly what they planned for. The strategy is to be able to create ideas and therefore the campaigns and communications strategy behind that, that will push thoughts or theories onto the population that are so difficult for them to get away from. No one’s better at that than Donald Trump when he uses terms like Crooked Hillary, Lyin’ Ted, or Little Marco.
Patrick Stanley: Very first digital persuasive terms that stick in your head, for sure.
Brittany Kaiser: Yeah. And the terms that he created, especially in the 2016 election, had full campaign strategies behind them. So, when he starts to call Hillary Clinton Crooked Hillary, an entire super PAC pops up that is running the Defeat Crooked Hillary campaign. And unfortunately, that was run by some of my former colleagues at Cambridge Analytica, and to elaborate on the strategy, they then are able to segment the entire population of people that were perhaps interested in voting for Hillary Clinton but could be convinced to not vote at all, and run negative, targeted ad campaigns discrediting her. Everything from her personal voting record as a Senator, to her personal life in the White House as First Lady, to what she did with the Clinton Foundation, and speeches that she had made publicly in the past.
Brittany Kaiser: And there is a different campaign to target, for instance, middle ground or conservative women that think that family values are the most important thing in society, talking about how she was a woman that her husband cheated on her and she still decided to stay with him to retain power and how immoral that is, for instance. And then there’s another campaign towards young African American voters with old speeches taken out of context that she made to make them think that Hillary Clinton called them all super predators, and she thinks that young African Americans are dangerous to the American society. There’s a very complex set of parallel campaigns and strategies that were pushed out in order to achieve that particular goal. And that’s where the problems with lack of transparency of knowing what data is used to target people with what ads and for what purpose becomes a huge problem.
Brittany Kaiser: That lack of transparency on platforms like Facebook, where we cannot track and trace what’s going to whom and where and why and how, is why everyone’s so confused at why we have the most polarized society we’ve ever had. No one can understand each other because we all have different realities depending on which device we happen to be influenced by.
Patrick Stanley: Absolutely. I think what I also learned in 2016 that relates to everything you just mentioned is, people generally can be presented with the same set of facts, as well, and they can see two different motives. And that was never more clear than in 2016, and even until today. The ability to seed an emotion and a sense of identity to individuals to cement them into your desired outcome, I feel like is just far too easy at this point. It almost feels like we’re in a somewhat helpless situation because we’ve never been this connected, ever. I think since 2012, the amount of people on the internet has doubled. That’s a doubling of the amount of people who are influenceable, effectively programmable and can interpret symbols and things like that. I have a question, this is more of a personal curiosity question, but the phrases “Build the wall” and “Lock her up.” If I recall, that was something that was derived through Cambridge Analytica as well. Is that right?
Brittany Kaiser: You’re exactly right. I mean not everything came from Cambridge Analytica to the campaign. Sometimes it came from the candidate himself, or other members of his campaign infrastructure, and then Cambridge Analytica would build out all of the supportive messaging and content behind that. But sometimes, that messaging came from the data. The data told the campaign and the people working for Cambridge Analytica in the campaign, that these were some of the words or phrases or concepts that were going to resonate with people and that were going to get people to turn out for rallies, get people to register Republican, and eventually obviously get out and vote for Donald Trump himself. So, it’s a two-way street in campaigns at all times. You always need to let the data guide you, but Donald Trump is someone who flies off the handle and has his own inflammatory rhetoric. And the campaign infrastructure was behind him in order to make the maximum impact out of that.
Patrick Stanley: Right. And he’s running the A/B test all day, trying different language on Twitter, seeing what gets retweeted, liked, et cetera. And so I think he’s got his own algorithm that may not be obvious to unseasoned bystanders, or even folks who just don’t have a large enough Twitter following to understand that is a thing you can really get skilled at. And half the population thinks he’s not persuasive, and the other half is being completely persuaded. I think to discount him as a total fool who doesn’t know what he’s doing is incorrect at this point. I think he clearly has data, if not intuitive sense of how many people are behind him on a given topic, whether there’s a place for him to create a divide so that he further entrenches his supporters. I’m wondering, given his response to the Black Lives Matter movement and the recent news with George Floyd, what information do you think he’s operating with and what tools do you think he’s using today to effectively influence how he operates, and also influence his unorthodoxy and tone deafness during this time?
Brittany Kaiser: Well, he has some of the most sophisticated communications tools that have ever been built. I mean the people who are running communications for the White House, the people who are running Trump 2020 are still using the campaign infrastructure that was built by Cambridge Analytica in 2016. In fact, Trump 2020 is still being led and run by a lot of the people that I hired and trained for Cambridge Analytica. It’s still the exact same team, it’s just now bigger and better funded, and I’m sure using even more exponential and cutting edge technologies that Cambridge didn’t even have access to in 2016. You and most of the people listening to this know better than anyone that technology changes and advances every single day. So the access that they have to data is unprecedented. In the United States we still do not have any federal data protection legislation, so therefore it’s easier than ever to collect as much data from people as possible and understand exactly who wants to hear what and when and how in order to best affect them.
Brittany Kaiser: It’s not just what topics are most important to people, like traditional politics. It’s now, what do they need to hear? What images and colors and music do they need to see and hear? What platform do they need to be targeted on with which type of ad and which format on what platform? What time of day? And it’s incredibly specific. So he couldn’t have more information at his fingertips than he does. And no, even though I think he’s a massive criminal and have plenty of evidence for that, he’s not stupid. He is using the most sophisticated tools that are available in the world in order to make sure that important populations, populations in swing states, populations that are his core base, his core supporters, are hearing exactly what they want to hear, when, how, why and where.
Patrick Stanley: Gosh. Everything you’re saying is leading to… I have six potent questions that stem off from that. I’m excited to hear your responses to these. So you were mentioning earlier the business model of Facebook is sort of geared towards, for a lack of a better term, more mind control. I’m wondering what are your thoughts on Mark Zuckerberg’s stance on censorship with, we’ll use Trump for the time being, but censorship of politicians and Trump, how they manage their algorithmic feed.
Brittany Kaiser: Well, one of my favorite questions. Last year, when Mark Zuckerberg made a speech at Georgetown University announcing that he would not be moderating political content or holding politicians to the same community standards that they hold you and I to, it was one of the most baffling, shocking and, in the end, criminal acts that I’ve ever seen from a technology executive. Because although Mark Zuckerberg likes to say that some of what politicians say should not be censored so that we can all make judgements for ourselves about what they’ve said and then make our own choice in the ballot box, unfortunately this is not censorship.
Brittany Kaiser: Some of the content that is coming from politicians, and yes specifically Donald Trump, is criminal. We have laws against incitement of violence. We have laws against incitement of racial hatred. We have laws that prevent voter suppression. We have laws against slander and libel. Yet the content that comes from the president’s mouth himself as well as campaign content and political ads from his campaigns and supporting organizations still crosses all of those boundaries on nearly a daily basis. So our problem right now is that we’re not able to enforce the laws that we already have in society on one of the most influential and largest platforms in the world. And this is a massive travesty for a society and for the respect and rule of law in and of itself. The freedom of speech is not an unchecked right. I do not have the right to incite violence upon you or suppress your vote, et cetera and so forth. The law will come after me for that if I go and say it on the news, or if I write it in my book, for instance.
Brittany Kaiser: When I was writing my book, Targeted, trust me I worked with Harper Collins very closely to make sure that I had legal evidence for everything that I accused people of. Yet Donald Trump is not being held to the same standards. Being able to exercise your right to free speech means that you have to follow the law in what you say out loud. It doesn’t mean you are censored if content that you have said that breaks a law that we have is taken down. That’s just law enforcement. It really, really shocks me that the whole freedom of speech community and people who are anti-censorship have resonance with what Mark Zuckerberg says, because it honestly is shocking and in the end, criminal, the decisions that he’s made.
Patrick Stanley: That’s definitely a valid lens. I want to pull a thread on that a little bit more. How do you reconcile Facebook being a US-based company and serving a global audience? You mentioned abiding by the law. Whose law? Today it’s effectively US law. But if Facebook wants to operate in China or Indonesia or India, and these different political campaigns are running psyops experiments on their own population and other populations in different countries, Facebook in its current centralized format is incompatible with a nation state. It must either serve one master and abide by the law, or be a somewhat neutral party that can’t be used to leverage one country’s soft power over another. What’s your take on that?
Brittany Kaiser: Well, let’s look at the way that Facebook is actually structured, because they are under requirement by every country’s law to obey those country’s laws. Facebook, as far as I understand, has offices in every country that they operate in, and they have different legal teams and different censorship teams as well in most of those countries. So, the way that Facebook operates in the US is supposed to abide by US law. The way that Facebook operates in the Philippines has to abide by Filipino law. Same in Mexico, et cetera and so forth. Every country you can imagine, they have their own offices, their own teams, their own translators, et cetera and so forth. So although Facebook is a global platform, they still do have to abide by the law in each of the countries that they go to.
Brittany Kaiser: Although unfortunately, usually when Facebook contravenes the national laws of countries, they’re usually able to quickly pay the invoice and move on. Again, this is why I spend so much time trying to implement criminal liability for the types of negligence that they display, because civil fines where companies, not just Facebook but plenty of companies around the world, just work in the costs of lawyers and fines that they will get from the government, they work that in as operating expenses so that no matter how many laws they break, it’s not going to be a problem for their business model. So although I would like to say that Facebook does have a globally minded business, that’s not the way the law works and in order for them to be a functioning business in all of these different countries, they have to still be conscious of and are supposed to abide by the local legislative infrastructure.
Patrick Stanley: Sure. What I’m hearing is there’s just a large amount of unchecked power within Facebook as essentially a vehicle for soft power within nations, and almost extra-nationally. And what’s concerning is, while nation states have power on a military level, Facebook has a power through all their platforms to effectively wield influence on a global level. So if we zoom out from a national level and look at a global level, the influence they have is tremendous. One of our investors Naval Ravikant had a quote. He said something along the lines of, “The world’s most powerful people are the ones that control your algorithmic newsfeed.” I’m wondering where does this all stop? How do we pull the plug or make this more equitable? What’s the path forward if this seems to be essentially a weapon at this point?
Brittany Kaiser: Well, yeah. I mean, again, as I said, technology is in the hands of who’s using it. So it can be used for great things or it can be used as a weapon, and has been used as a weapon and we can see exactly who is weaponizing it most effectively, unfortunately. So the way that we start to dismantle this type of unchecked power, again I’m trained as a human rights lawyer so I like to use law and regulation in order to protect people’s rights, protect people from abuse of power because that is how you can legally enforce your rights. Now, do I realize that laws and regulation don’t solve everything? Absolutely. Do I recognize that it takes sometimes a while to get laws and regulation passed, and by then technology has surpassed what the legislation allows? Yes, I get that. But laws and regulations are first and foremost incredibly important to get those aligned with what our societal goals are. The social contracts that we have between ourselves, human rights, sustainable development goals need to be incentivized within our laws and regulations.
Brittany Kaiser: Secondly, we obviously need better education. People are not digitally literate. The reason why so many people share inflammatory content is because they don’t even realize that it’s disinformation or fake news, or that someone has put a lot of money behind it in order to make them angry. And so once we have a more digitally literate citizenry, then we will see the effects of the algorithms massively dropping. And third, it’s obviously the technology that will enforce the laws and regulations, and that will make sure that educated people have access to ethical technology. I mean there’s a reason why I spend so much time assisting blockchain entrepreneurs and people who are experimenting with new forms of peer-to-peer networks and stronger types of encryption methods. It’s because until we have the technology where we can actually own our data and protect ourselves from what is shown to us and from malicious content getting in front of our eyes, then you can have laws and regulations all day.
Brittany Kaiser: But we need the technology to back that up. When you’re talking specifically about algorithmic amplification, we currently have one law in the US Congress that I think is pretty exciting, and it goes, on the back end, to a very technical level to explain how algorithmic amplification works, banning the negative and discriminatory use cases of algorithms in our news feeds and our search feeds, so that we can stop inflammatory content and fake content from rising to the top. And as I said before, the only way we can enforce those laws is with technology. So we have three problems to solve. What is our legislative and regulatory infrastructure going to look like to make sure that we’re making a better society, not making ourselves worse off? How do we educate a global population to live in a fully digital world without being targeted and manipulated? And then what technologies do we want to invest in and build to make sure that the kleptocratic and predatory tendencies of the tech that we’re using every day does not become what we see in the future?
Patrick Stanley: Absolutely. I couldn’t agree with you more on that three-pronged approach. I think from a practical standpoint, legislation is most effective on a near-term basis, especially since people may not choose where they’re born, but most countries have essentially a societal pact with the folks who are voting to enforce law for their own protection. So right now that’s happening through land governance, and the time when land governance is potentially dominated by cloud governance has not yet arrived, but I think it will happen over time. So I think that’s a super practical first thing. The second point you made, how I was interpreting it, was that you’re almost fortifying people’s minds to defend against attacks, and that, I think, is essential but nontrivial. How massive a task that is in a vacuum, not supplemented with those other two things you mentioned, is in some cases akin to trying to dissuade a religious person from their religion. But nonetheless, I think it’s absolutely essential and probably should be taught at a really young age.
Patrick Stanley: A quick side note question before I get into the third category is, how do we prevent that from being a Red Queen’s race where the education doesn’t become outdated, the algorithms and the symbols that make their way into people’s brains don’t route around whatever training they’ve had and education they’ve had?
Brittany Kaiser: Well, one of the ways that I am seeking to solve this, and again it’s a very complex problem to solve and there are a lot of different pieces of the puzzle that are going to need to come together to make it sustainable, but besides the legislative and regulatory work that I do, I also started the Own Your Data Foundation last year in order to teach digital literacy education. It’s a new curriculum called DQ, so like IQ or EQ. It’s a digital intelligence quotient, and that quotient, the score that you get, is made up of an indicator set that’s been developed over the past decade by a lot of the world’s top ministries and departments of technology and innovation, think tanks, and universities that are experts in this.
Brittany Kaiser: And it includes everything from, what are your data rights? To basic cybersecurity protocols so you can keep your data private or share it in a protected way. How do you become media literate, so you can spot fake news and disinformation, and spot hacking and phishing attempts? How do you prevent cyberbullying online and make sure you don’t become a part of it? How do you use emotional intelligence on social media and in your interactions with people over the internet? How do you not become addicted to devices, have a healthy relationship with technology, and manage your screen time? Et cetera and so forth. So these curricula are a new global standard through the IEEE, supported by the OECD and the World Economic Forum and UNICEF, and it only came out last year. I mean, officially it came out in October of last year, and we are working very hard to reach a billion children over the next couple of years.
Brittany Kaiser: The first full curriculum is for eight- to 12-year-olds. So it’s supposed to be before or when you receive your first device that you would already be equipped with all of this knowledge. So unlike all of us, who have never read any of the terms and conditions of the apps that we use on a daily basis, we can instead teach kids that the way you use a phone or a tablet or any device, or even your computer, is that you read the terms and conditions of something and decide whether or not you’re comfortable with it before you install it onto your device. There’s a completely different way of looking at the world that can be taught at a young age to make sure that we are not creating a vulnerable, manipulatable population.
Patrick Stanley: That’s awesome. When you’re describing that, I’m sort of thinking how great that would be to be almost a naturalization or onboarding process for Facebook, Instagram, Twitter, for people of all ages. Because I think there’s something about civility that’s also linked to the literacy part. Not only avoiding becoming addicted or manipulatable, but also how to be a constructive citizen on any given platform. I feel like that would be just a great… I view Facebook and Twitter as effectively their own little digital nation states that haven’t fully stepped into that role.
Brittany Kaiser: For sure.
Patrick Stanley: In a way that they would be comfortable saying out loud. But they do need a naturalization process, and by not equipping people with, effectively, training, they’re essentially leaving them prey to their business model, which is exploitation, unfortunately.
Brittany Kaiser: Yeah, absolutely. It leaves everybody vulnerable when you make a 46-page legalese set of terms and conditions that you know no one will ever read. They’re probably not going to click on them in the first place, because you have tiny, tiny type where you can hardly tell that it’s a link to the terms and conditions. But even if you opened them, you wouldn’t understand what they say.
Patrick Stanley: Or read the whole damn thing. Who’s got the time?
Brittany Kaiser: Who’s got the time, number one. But who has a law degree in order to understand what they’re asking and taking from you? That’s why I call these platforms kleptocratic: they’re using lawyers in order to create something that is so complex that you don’t understand what you’re giving away. And therefore, they’re stealing from you, because it is not true informed consent. A lot of the testimony that I’ve done over the past few years is to organizations and government agencies specifically that are seeking to change this. Saying that this is not real informed consent. And, for instance, the Federal Trade Commission of the United States is now ruling that this is in contravention of consumer protection standards. You can’t protect consumers when they don’t understand what they are giving away.
Brittany Kaiser: So there is a future that is coming very soon where terms and conditions will be easier to understand, third grade reading level or below, and it’s bullet points so that you know exactly what it is that they want from you and you can toggle it on or off. And you can continue to toggle it off all the way down the list, and if you say no to everything, you can still use the platform, because in the end most people are not going to go that far. But the choice needs to be there. We’re seeing that ever since the implementation of GDPR and CCPA, that now when you go on a lot of websites and they ask you whether or not you’ll accept cookies, you can click and there’s options, and the options are very simple to understand. And you can decide what you are comfortable giving away or not.
Brittany Kaiser: Obviously I don’t think that that’s enough. I think that if you decide to give certain data away that you should have ownership of that and know exactly where it’s going and to whom and for what purpose and if they’re going to monetize it that you should get a dividend off of that. But let’s start by getting to first base here, and then we can go forward.
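[Editor’s note] The toggle-based, purpose-by-purpose consent Kaiser describes can be sketched as a small data structure. This is a minimal illustration, not any real platform’s API; the purpose names are hypothetical, loosely drawn from the examples discussed here.

```python
from dataclasses import dataclass, field

# Hypothetical purposes a platform might ask consent for.
PURPOSES = [
    "marketing",
    "medical_research",
    "traffic_safety",
    "political_advertising",
    "law_enforcement",
]

@dataclass
class ConsentRecord:
    """Purpose-based consent: every purpose starts toggled off (opt-in)."""
    toggles: dict = field(default_factory=lambda: {p: False for p in PURPOSES})

    def opt_in(self, purpose: str) -> None:
        self.toggles[purpose] = True

    def opt_out(self, purpose: str) -> None:
        self.toggles[purpose] = False

    def allows(self, purpose: str) -> bool:
        # Unknown or untoggled purposes are denied by default.
        return self.toggles.get(purpose, False)

consent = ConsentRecord()
consent.opt_in("medical_research")
print(consent.allows("medical_research"))       # True
print(consent.allows("political_advertising"))  # False
```

The key design choice, matching Kaiser’s point, is that every purpose defaults to off, and nothing in the record gates access to the platform itself, so declining everything still leaves the service usable.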
Patrick Stanley: Yeah. And it’s a great start. It serves a dual purpose, I think, in terms of making things more clear and understandable, and therefore able to educate people and empower them for data ownership and understanding what they’re opting into. But it also gives less wiggle room, I feel, to these large kleptocratic, essentially monopoly, platforms to feed off what are effectively innocent civilians. And obviously they’ll try to find ways around it, because their business model somewhat depends on it, but that is a really potent and impactful thing to start. Now, of the three points you mentioned earlier, the first was regulation, the second was education, and the third was essentially the technology.
Patrick Stanley: Some ideas that have come to mind, and I’d love to get your feedback on them and a sense of where you think this is all heading, are around making algorithms more transparent. Treating these platforms as a public utility: identifying them as weapons that are being used on a civilian population, and requiring the steps needed to effectively either democratize the weapon down to the citizen level, so people can see publicly what the actual algorithms are doing, or enact some level of transparency at the algorithmic level, in addition to giving true data ownership to users. And when I say true, I don’t mean Facebook has a button you can click and then you get an email 36 hours later, or maybe 72 hours later, with more instructions that you’re unlikely to follow, and therefore you stay in their system.
Patrick Stanley: And what I’m not talking about is moving your data from platform to platform to platform. I think people legitimately having some ownership over their data, in terms of control over who can access it, where it’s stored, and its compatibility with the original platform itself, the backwards compatibility, is, I think, really important. That’s going to be something I feel is going to be akin to prying a golden goose out of their cold, dead hands.
Brittany Kaiser: Yeah. Yeah, absolutely. There are so many different ways to go about this. So, what I’m going to say at the 10,000-foot level are the types of laws, regulations, technology, and even concepts in education that we have to address. The first is obviously the importance of transparency: what’s being collected about us, by whom, where it’s being stored, what it could possibly be used for, or not, and what monetization might come from that. Now, throughout that transparency process there need to be explicit, consensual opt-ins where people are truly informed of what they would be agreeing to, and they have the opportunity to opt out of some of those things. So a complex opt-in infrastructure, fantastic. And then if you have control over your data, if and when, I would say, you own your data, the monetization process could be multi-layered.
Brittany Kaiser: For instance, you could agree for your data to be monetized for marketing to you, or you could agree for your data to be monetized for medical research, to cure cancer or to stop coronavirus. Or you could donate your location data to your local city so that you can help stop traffic accidents, or your online activity to help prevent terrorism. Whatever it happens to be, there are good reasons to share your data, but you could say, “I don’t want my data to be used for political advertising or by law enforcement,” or by whatever it happens to be. And that whole process needs to be transparent, informed, and consensual. Now, a lot of people question the data monetization piece, and they say, “Well, Facebook, for instance, has spent so much money to build these platforms. We’re giving our data for free access to the platform.”
Brittany Kaiser: Well, I beg to differ. I do think that access to their platform creates so much value for them that they’re one of the most valuable companies on planet Earth, and it’s because they own all of our digital assets for the past however many years you happen to have been on that platform. Now, I prefer something called fractionalized ownership, which is really easily enabled by Blockchain technology, for instance, where a percentage of the value of the data that I create in the platform would be mine, because it’s my data, and a percentage of that value would go to the owner of the platform. So Mark could continue to make money off the data I create in Facebook, but I would also get a dividend off of that. The governor of California, Governor Gavin Newsom, is endorsing a law called the Data Dividend Law, which means that you transparently know what data is going to be collected about you and how much they’re making off of it, and you get a percentage of that back.
Brittany Kaiser: And that’s really what I see as the future to solving a lot of these problems. Obviously we need an informed populace so that you can make informed decisions, but there’s a lot of new platforms that are popping up that will help you manage your own data, help you manage your digital identity or your digital self. And if you want to become a data input to certain algorithms then you can, but you would know what the intention of those algorithms is and then make an informed choice whether or not you wanted to be a part of that. And if you are on the receiving end of data-driven communications you would understand what data went into the decision-making to put that in your news feed or to put that on your device and why you’re seeing it and who paid for it and where it’s coming from and what data was used in order to decide that you were the person that needed to see this. It’s just so important to have access to all of that information. I suppose the way to land the plane for people that still don’t understand what data ownership and being able to monetize your digital assets for yourself really looks like.
Brittany Kaiser: I think most people know what the Airbnb model is: you own your house. I see your data the same way. Since data is the world’s most valuable asset, and you’re creating so much data every day, you should consider it your most valuable asset, above your house and your car and whatever you have in your bank account. You need to think about the personal data that you create as an asset class that you’re the producer of, and you need to have transparency and control over that. So on Airbnb, when someone wants access to your asset, to your property, your house, they tell you who they are, what they want to use your property for, and how long they’re going to be there, and you agree on a price and get paid before you hand over the keys. And that’s really what data ownership and digital identity platforms that help you control and monetize your data should be seeking to do. It’s a business model that works, it’s very straightforward, and it’s a mutual, consensual agreement between two parties to share that asset with each other.
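[Editor’s note] The fractionalized-ownership and data-dividend ideas come down to simple revenue-split arithmetic. Here is a minimal sketch with made-up numbers: the roughly $17 of value per user per quarter comes up later in the conversation, but the 50% user share and the user count are pure assumptions for illustration.

```python
def data_dividend(platform_revenue: float, user_share: float, n_users: int) -> float:
    """Return one user's dividend when a fraction of platform
    revenue is split evenly across all users."""
    if not 0.0 <= user_share <= 1.0:
        raise ValueError("user_share must be a fraction between 0 and 1")
    return platform_revenue * user_share / n_users

# Hypothetical: one million users each generating ~$17 of value per
# quarter, with half of that value returned to users as a dividend.
quarterly_revenue = 17.00 * 1_000_000
print(data_dividend(quarterly_revenue, 0.5, 1_000_000))  # 8.5
```

Whatever the split, the point is that the percentages and the revenue figure would be disclosed up front, so each user can see exactly what their share is worth.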
Patrick Stanley: In terms of being a business model that works, my guess is it will probably take some time for that to graft on. From a strategic lens, Facebook has a duty to essentially increase profitability, and they’re in service of, effectively, shareholders. And fractional ownership allows them to bend the knee a bit to people having at least some rights over their data. The place where it ends up being a little weak early on is the actual unit economics of people’s data. If people are earning what comes down to cents and low dollar amounts, it’s definitely an improvement from nothing, and I think a starting point for something potentially much bigger in terms of folks doing meaningful work, in addition to folks having access to their data and creating a personal API. I think a very artful approach is definitely necessary there.
Patrick Stanley: Where I feel Facebook is starting to go, in anticipation of something like that, is toward a digital no man’s land. This is complete prediction that may be 100% wrong, so take it with a grain of salt, and I need to get updated on the Libra program, whether they’re still working with governments. But, in effect, if they can create a digital no man’s land where the lock-in is through whether people accept their currency or not, where the denominator is their currency, well, they can actually expand the value of that currency geometrically. They can create lock-in from a currency standpoint, potentially make it easy to pay for and use the data with that currency, allow that currency to appreciate and new business models to be generated in that currency, and just make sure they have the stranglehold on that currency being used, and that currency alone.
Patrick Stanley: I think, given Facebook seems somewhat incompatible with the modern-day nation state as we know it, it almost feels like they have to decentralize in a very smart way. It’s actually probably less a prediction and more the direction I would go in if I were them: try to avoid kowtowing to any authoritarian governments, get out of your own way, create lock-in with the currency, and find new ways to get value, because they’re effectively going to have to slightly disrupt their old business model. Yes, they’ll use advertising, obviously, but they’re going to have to become a ghost almost, and not be a control point but rather almost a promoter of their currency. I think that may end up being a huge strategic point of control for them. I wanted to share that, because I think what you were saying was inspiring some further thinking on where they are today, what they’re likely to do, and what’s best for them and their shareholders.
Brittany Kaiser: Yeah absolutely. So let me address the first part of your comments first, which is that you talked about how the monetization process, especially for the data that we produce in Facebook, is not worth enough to incentivize people to go forward with that model. So, Mark Zuckerberg is lying to us when he says that our data’s only worth… I think he usually says $17 per quarter. Our data’s worth an exponential amount depending on what you are using it for. And secondly, on that point, $17 per quarter would completely change the lives of billions of people around the world that live on less than $2 a day. So, there’s already a massive incentive for our data to act as the great equalizer in society where we can raise billions of people out of poverty around the world with just the data we produced in Facebook alone.
Brittany Kaiser: So that’s something I will argue against with people who make that comment all day long. We already have an incentive, just from Mark’s lies, to go forward, when he tries to discourage people from being interested in the value of their data on Facebook. I advise multiple companies that work on data monetization, both in Blockchain and outside, and a lot of them are doing different experiments to see how valuable your data can be. If you make your data available to militaries for counterterrorism, that’s worth quite a lot. One example: if you decide to share your medical data with diabetes researchers, six to eight weeks of your medical data is worth $28,000, said one research organization that we spoke with. So, there are unlimited possibilities for using your data as a currency in and of itself, and using it to empower yourself with your own personal information and the value that you create as a human every single day. So that’s my stance on that.
Brittany Kaiser: Secondly, you’re talking about Libra, which I believe was just a massive play for Facebook to take all of our financial data as well and use that to sell to other organizations, to potentially be used against us. So I have been very critical of Libra all along the way. Now that Facebook is taking a backseat and other organizations are going forward with it, I just see it as another Blockchain solution that is not that innovative or interesting, and their currency might be valuable one day if they’re able to keep some of their important partners on board. But I don’t see it as that interesting, or as a threat, at this time.
Patrick Stanley: Sure. And I think my sense with Libra is they’ll start by combining it with a business model that they’re familiar with at first, like advertising, and then move toward payments and finding a way to extract financial information, is my guess. I largely agree with your point that value can accrue to users. Maybe I didn’t articulate it clearly enough, but my point is more that I think people will monetize their own data. On my timeline they will, and that will be a meaningful jumping-off point for more value creation for the end user. Totally on board with that. What I think is more that, in the early days, there’s a bit of a chasm between what is optimal and what is fully developed when it comes to people benefiting from owning their own data. I mean, almost by definition, things take time to grow and develop and for more value to accrue to the user. I’m just antsy to see that end state, and I feel like the early stages of it will feel slow-moving until they’re fast-moving. But I agree directionally with your point.
Patrick Stanley: I actually have to jump off but this has been one of the best podcasts I’ve had. We’ve done a couple dozen now. You’ve been a great guest and super, super educational. Super inspiring. Just learned so much and I think our listeners will get a ton out of this so thank you so much. And could you let folks know where to find you, what to look into to see how they can be of help to you?
Brittany Kaiser: Yeah, absolutely. Well, firstly, thank you so much for having me. Anyone listening to this who wants to get further involved or have more resources to educate yourself, you can go to my website, ownyourdata.foundation. If you want to support us, you can either donate on that website or go to ownyourdatafoundation.com, where we have a lot of really awesome data ownership merchandise and tools that are pretty cool that you can check out. If you want to follow me on social media, my Twitter is @ownyourdatanow. On Facebook, I am Facebook.com/ownyourdata. On LinkedIn, LinkedIn.com/in/ownyourdata. And on Instagram, @own.your.data as well. So I’m really excited to help anybody that is interested in getting involved in some of these initiatives. You can write to us at [email protected]
Patrick Stanley: Awesome. Well no doubt the Stacks community is going to be looking into what you’re working on. And thanks again for coming on. And this has been another edition of the Stacks Podcast. See you again next time.