“When I started, it was hard to get C-suite attention on the issue unless there was a data breach. And most companies didn’t think they could get hit,” says Kalinda Raina, LinkedIn’s head of global privacy. “Today, it’s a matter of when, not if.”
As a result, something once seen primarily as a public relations concern has become a fundamental component of corporate philosophy.
But what should that philosophy entail at a global company? It’s a complex question even further complicated by the fact that data transcends borders.
“Global companies have to think about how they’re going to handle data from a global mind-set,” Raina says. “You can’t say, ‘We’ll follow this set of laws for this data and this set of laws for this data.’ You have to [ask], ‘What is the highest legal bar we need to meet?’ [Then] work from there, along with the company’s values, and decide what the right thing is to do.”
For many companies, including LinkedIn, the highest bar has been set by Europe.
“Europe views privacy as a fundamental human right, which is not the same way we view it here in the US,” Raina says. “In the US, privacy is developed as a patchwork of different laws. It doesn’t have the same grounding in people’s mind-sets as it does in Europe. If you look to the history [and] the way in which data was used in Europe, people started feeling the need to recognize privacy as a fundamental human right and to put restrictions around how data should be used, and give people control over their information.”
This eventually led to the Data Protection Directive. Passed in 1995, the law prohibits the processing of personal data unless certain conditions are met regarding transparency, legitimate purpose, and proportionality.
“It began to try and address some of what was beginning to happen on the Internet,” Raina says. “But it was written twenty-two years ago when we didn’t have social networks like LinkedIn and Facebook, and we didn’t have companies like Amazon where we were purchasing the majority of items we need in our home from a website. We didn’t have iPhones and [weren’t] doing most of our daily activities online. So this is part of the reason why Europe is feeling like there’s more and more data out there, and there’s a need to pass laws that recognize that and are updated for the modern Internet we’re living in.”
Raina is referring to the General Data Protection Regulation, a new law that goes into effect on May 25, 2018. Essentially a more comprehensive extension of the Data Protection Directive, it allows for greater harmonization of data protection regulations throughout the EU. Although this makes it easier for non-European companies to comply, there are severe penalties for any company that does not: fines of up to 4 percent of worldwide annual turnover. All of this signals that privacy laws must be as fluid and ever-changing as the data they’re designed to protect.
LinkedIn is in a unique position regarding privacy. On one hand, the company has more than 467 million members (including executives from every Fortune 500 company), all of whom trust the company to protect highly sensitive data. On the other hand, people join the largest professional network on the Internet precisely because they want to be found.
“It’s definitely a site that needs to walk a fine line between recognizing that people come to be noticed, to be shown new opportunities for jobs,” Raina says. “But at the same time, there is a deep, underlying trust in the way our company will handle the data we have about them: that only the right people see it, that they have control over who it’s exposed to, and that they have the ability to change and correct it. That puts us in a unique position from a lot of other companies that may be more transactional in nature with their customers, enough to have this ongoing and almost daily relationship with how people decide to present themselves to the world for economic opportunity.”
But perhaps the most important component of LinkedIn’s privacy strategy is establishing and nurturing a corporate culture that sees privacy as a moral issue and not just a technical question of compliance. Raina believes this is achievable through a three-tiered strategy for the business’s daily operations.
“The first thing in that would be the value system of the company,” Raina says. “That should be the guiding principle. The second piece is actually plotting out your program and its goals. The last piece is cross-organizational reach outs—working across different divisions of your organization to make sure that vision’s actually getting executed into the products and the day-to-day of the company.”
To that last point, “there’s a lot of collaboration that occurs between the engineering teams, the legal teams, the privacy teams, and the security teams.” This is “to make sure that the products we’re developing have been secured from a security and a privacy standpoint, in addition to legal; to make sure they meet the needs of our members, and that we’re really focused on doing the right thing. One of our core values that drives our decision-making around privacy is ‘Members First.’ Does it help to maintain their trust? Does this give them confidence in sharing even more information about themselves on LinkedIn so that they can create greater economic opportunity? All of those factors come into play when we’re thinking about products.”
LinkedIn also empowers employees to make decisions about what data will be collected, how a piece of advertising will be handled, and how easy it is to “opt out,” based on what it believes to be in the best interest of its members.
“You can’t expect everyone to know the law, but you can expect people to make decisions based on what makes [them] feel uncomfortable,” Raina says. “It helps your employees understand that as digital citizens, what would [they] feel is the right thing to do? That [gives] people a kind of compass when the legal team isn’t around.”
It’s crucial to communicate to employees that, even if they never touch data in their day-to-day work, everyone today is impacted by privacy practices. “We educate our employees so that they can protect themselves,” Raina says, “and also so that, when an issue comes across their desks, they wonder, ‘How would I like my data handled?’”
At one point in our conversation, Kalinda Raina points out that humankind’s complicated relationship with technology, and with how it affects our privacy, long predates the Internet.
“It’s funny,” she says, “because if you look back in history, people have always been worried about privacy when new technologies arise. Back in the 1800s as photographic technology improved, people became nervous that their picture could be taken in public without their permission. Technology has always presented some level of discomfort for us as human beings because we begin to have to think about ourselves in a different way—our relationship with that technology and what it means to us. We don’t need to be afraid of technology, but we do need to find ways to adapt without losing our core values.”