
Facebook’s Former Employees Open Up About the Data Scandal

Mar 30, 2018 · 18 min

Episode description

As Facebook reels from a public backlash over its handling of user data, former employees are starting to air their hesitations and criticisms of the company they helped build. This week, Bloomberg Technology's Sarah Frier and Aki Ito hear from these former insiders to examine the mistakes that led to the company’s crisis today.


Transcript

Speaker 1

The past two weeks have been some of the most devastating for Facebook. The Federal Trade Commission has confirmed it is investigating Facebook's privacy practices. Facebook stock dropping even lower in the fallout, and this morning, Facebook executives are heading to Capitol Hill. Can Mark Zuckerberg keep both advertisers and lawmakers happy? Many of you will know the highlights by now.

There's that political consulting firm called Cambridge Analytica, which got information on fifty million Facebook users, mostly without their permission, and it later went on to advise Donald Trump on his presidential campaign. Cambridge Analytica told Facebook that it deleted the data. Then two weeks ago, stories in The Guardian

and The New York Times declared that it didn't. People were so furious they started deleting their Facebook profiles, and as of this taping, Facebook stock lost tens of billions of dollars in value. Inside Facebook's headquarters in Menlo Park, the atmosphere right now is pretty tense. Employees are distracted, reading the news, mingling in small groups to whisper about the latest on the crisis. Many of them joined Facebook because they felt like it was doing something really good

for the world. It had a mission everyone believed in, but now some of them aren't so sure. For both current and former Facebook employees, this is not just about Cambridge Analytica. It's become this moment of reckoning, of realizing this immense power that Facebook now holds and the responsibilities that come with it. Hi, I'm Aki Ito. And I'm Sarah Frier. And this week on Decrypted, we're going to hear from a number

of former insiders at the social network. We the public are still discovering things that Facebook really does with our data, and for a lot of people, the realizations have been a little shocking. Silicon Valley, with its tight knit community, can be a hard place to speak out. That's especially the case for people who've worked for Facebook. But now, after a series of controversies, some former employees are starting to air their hesitations and criticisms in public.

And with that, we're getting our first peek into the culture that led to Facebook's crisis today. Stay with us. So, Sarah, let's start with some context, back a couple of years ago.

It was incredibly unusual for a former employee to criticize Facebook publicly. Yeah, Facebook has this culture where they really want it to feel like a family, and you're given access to all this non public information about the company roadmap, so employees feel an immense responsibility to keep it safe. You're Facebook friends with all your coworkers. If you speak to the press, you're hurting the family, right. And the company is known

for searching for leakers. Zuckerberg actually announces at all hands meetings when someone who leaked gets fired, and sometimes people applaud. Employees are told that once they leave, they're still part of the Facebook family, which has extensive reach and influence across Silicon Valley, so don't screw it up. But we did start to see more people speak up right after Trump won the election in the fall of 2016, right.

So Facebook is headquartered right here in the Bay Area, which is a very progressive place, so a lot of people who worked at Facebook lean towards the liberal side too. Yeah, and they wondered if they were, in tiny or big ways, partly responsible for Trump's victory. This is something we covered on the show in an episode back then, and people

have been talking about it since. I've spoken to a lot of former and current employees over the last year and a half who are definitely grappling with this question in pretty personal ways. They're facing questions from their own friends about whether Facebook has been doing the right thing. And we've seen that with fake news and Russia's manipulation of the election. And you've had a very busy job over the last year and a half. It's been a

little crazy. At the end of last year, Chamath Palihapitiya, he's now a high profile investor but used to be a key executive in Facebook's early days of growth, talked to a group of Stanford University students about some soul searching he was doing. I feel tremendous guilt. Um, I think we all knew, in the back of our minds, even though we feigned this whole line of, like, there probably aren't any really bad, unintended consequences.

I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen. Chamath was talking to a pretty small audience, but these comments just blew up in the media. He probably wasn't expecting it, since he's a good friend of the management team at Facebook. You can imagine he got a lot of calls and texts after that, because he then made this strange appearance on TV where he said he wasn't talking about Facebook, he was talking about social media in general. It's not

like he was trying to walk it all back. Chamath isn't the only one who made his name off Facebook and is talking about them now. Sean Parker joined Mark Zuckerberg in Facebook's earliest days and was the president of the company for a while. A huge chunk of his wealth comes from his time at Facebook, and he said last year that these tools were built to psychologically manipulate people. Justin Rosenstein, he's the guy who's credited

with inventing Facebook's like button, now warns about technology addiction. And Chris Hughes, one of Facebook's co founders, talked to Bloomberg in an interview that was taped a couple of months ago. I think Mark in particular has been really clear about Facebook's responsibility in the election and has taken stock of his own responsibility to think about how Facebook influences our politics and our culture, and the recent changes with the news feed and some of the other things

that they're doing are a reflection of that. But the first step is understanding that that responsibility exists, and I think, at least in Facebook's case, there's an embrace of that. Then two weeks ago, the criticism around Facebook notched into higher gear after the news broke that Cambridge Analytica still had the data it improperly obtained on millions of people. More people came forward to voice their criticism of Facebook. One

vocal critic has been Sandy Parakilas. His job was to address data privacy issues with regard to all the external apps that get access to our data on Facebook when we log in using Facebook. Right after the Cambridge Analytica scandal broke, Sandy wrote an op ed in The Washington Post. Apparently, in two thousand twelve, way before Cambridge Analytica harvested the data on the millions of users, he tried to warn Facebook's executives about the lack of protections that

were in place. Sandy taped this interview with PBS's Frontline before the scandal broke. Because I had been the main person who was working on, uh, privacy issues with respect to Facebook platform, which had many, many, many privacy issues. It was, it was a real hornet's nest of problems, because they were giving access to all this Facebook data to developers with very few controls. Sandy was actually a musician before he decided to go back to grad school and get into tech, and this was his first job

in the tech industry. Due to, I'd say, organizational chaos and, you know, a lack of prioritization of privacy, it ended up being me. What did you think about that at the time? I was horrified. I didn't think I was at all qualified. Did you say that to anybody? I did. Yeah, I did. And the response I got back was basically, you know, don't you think this is important? It was essentially, people above me didn't want to be on the hook for this. I spoke to another former Facebook employee, someone who joined

long after Sandy left. His name is Kevin Lee, and when he was at Facebook, he worked on global spam operations. This is his first time speaking out about it. My team was more focused on the mass scale of these things, so it's really around bot attacks, and really the mass creation of fake profiles, or the mass takeover of legitimate accounts to spread either misinformation or scams. Kevin started at Facebook in November and left the company

two years later, right after the presidential election. The focus of the current scandal, how app developers were able to access data on millions of people, well, that fell outside of Kevin's purview. But still, he was able to see that Facebook wasn't in the habit of enforcing its terms and conditions with developers. In this case, in hindsight, I think the biggest area that was not adhered to, or not given enough attention, was specifically around the enforcement

of developer policies. So the developer policies were changed to be more private and kind of keep user information more secure and more kind of private. However, there wasn't enough enforcement, in my opinion, in terms of, okay, now that we've changed our policies, are we double checking to make sure that the people that had access to this data prior have deleted it or have taken

the necessary steps to remove it. I don't think, from an enforcement standpoint, there was enough attention paid there. So when you were at Facebook, why do you think they didn't devote more resources into enforcement? I mean, I can hypothesize that it's really more of a

developer um community question. I think there were resources devoted to it, but there's always a trade off in terms of Okay, we want to open up our developer APIs, we want to grow the company, and oftentimes there's this trade off between security and privacy versus opening up the platform and making it as viral or really addictive as possible.

And some of those decisions, um, may have been made more on the growth side of things, and, uh, as a result, opened them up to more potential exposure to a case just like this. This is actually something that I've heard a lot from other former and current employees, that Facebook prioritized growing its user base over other needs like securing user data. There was an internal motto at Facebook, and it's been published. The motto was, like, move fast and break things, and that was very much a

mantra that the company followed for years. I think, I mean, it's shifted a bit, but essentially, unfortunately, in this case, the company did move fast, and what it broke was really user trust, and now there's a lot of repair work, and a lot of goodwill that needs to be built again. For some people, the public criticism of Facebook put them in a pretty awkward position. This is what one former employee, Bret Taylor, told our colleague Emily Chang

on Bloomberg TV. You have people like Brian Acton, the co founder of WhatsApp, which Facebook bought for twenty billion dollars, saying to delete Facebook, and other former Facebook insiders, you know, really talking about the potential threats of the social network. Are you in that camp? You know, I have a great deal of, you know, personal connection and loyalty to that company, so it is obviously sort of uncomfortable

for me to talk about. The thing I would say is that I do think that it's the right way to think about it, is that these technologies have changed sort of the landscape of our society, um, you know, everything from the smartphone. We can call that the artful dodge. Well, even with more people speaking up, there's still this expectation to keep your grievances private. There are two things that happened just this week that illustrate this. So I got a tip on a project, and I asked a

current employee about it. Everyone at Facebook has access to all of this information internally, but he was afraid to even search for the code name I gave him, because Facebook would be able to see his query, know he talked to me, and fire him. I told him, don't worry about it, I don't want you to get fired. In another case, a former employee posted online that they'd be able to help people understand the Cambridge Analytica scandal.

The post didn't stay up very long. They removed it when Facebook sent them a note reminding them of their contractual confidentiality agreement. So Facebook had someone following all this internet chatter? Seems like it. But some people in Facebook's community do see a silver lining to the crises that the company has gone through, like Mike Hoefflinger. He worked on ads at Facebook and went on to write a

book about the company. Michael told me that he hopes this could be a lesson that will make Facebook a better company in the future, you know, and I hope that in this discussion, in the critique that's coming, especially from folks that have been inside of Facebook, that the objective really is to get to a future where we become smarter at how we build, more empathetic about how we build, not about not building. And Facebook has already

announced some changes. There was a big one just this week about how they're limiting the third party data that advertisers can use to target people on the network. This is something that could actually make it harder for advertisers to use Facebook. That's not usually the direction they move in. Yeah, it means less money for Facebook, right? Right, it could

potentially mean that. And then there is, according to my sources, just this in depth review happening at Facebook, with every team going through all the different data uses of every part of Facebook and figuring out if those are actually necessary, like, do we need this data in order to create a better user experience? So I'm sure we're going to keep learning more over the next few weeks, and publicly, we're going to hear from CEO Mark Zuckerberg testifying in

front of Congress pretty soon. We don't know exactly when, but it'll happen. What's the number one thing you'll be looking for in that testimony? I just think it's going to be so difficult for him, because a lot of these questions are about things Facebook did wrong so many years ago. And the way somebody explained it to me, one of the users I talked to, actually explained it

the best. He said, it's like toothpaste out of the tube. The data that Facebook shared with developers way back when, they are not able to tell Congress today where that data is now. There was no, like, auditing that happened back then to try to chart this out. So Zuckerberg is going to go to Congress, and he's going to be explaining how far Facebook has come in protecting its users, and he has to hope that that

will be enough. You know, ahead of this episode, we asked our listeners to send in their own questions for Mark Zuckerberg. We'll close our episode today with a couple of our favorites. Rebecca Guzman from Burleson, Texas. Why should we stay on your platform after your company violated the trust and privacy of millions of users? Hi, my name's Marcus. I'm from Germany, and my question is,

Mr. Zuckerberg, how can you justify deliberately deceiving all of your users to get access to their private information, even though the disclaimers involved in your apps are delusive and non exhaustive, referring to the excessive amounts of data that you actually collect? Greg Wall. At the end of the day, Mark, will you and your product be a tool for greater good or a tool for mass surveillance? This is Paul Johnson from Pennsylvania. Seriously, at night, when

your head hits the pillow, do you ever think that anything that you've done is wrong? And that's it for this week's episode of Decrypted. Thanks for listening. We'd love to hear what you thought of today's show. You can email us at decrypted at bloomberg dot net, or you can find me on Twitter at Sarah Frier, and I'm at Aki Ito Seven. If you haven't already, we hope you'll subscribe to Decrypted wherever you get your podcasts, and while you're there, please leave us a rating and a review.

I read all of them as they come in, and they really help us reach more listeners. This episode was produced by Pia Gadkari, Liz Smith, and Magnus Henriksson. Francesca Levy is the head of Bloomberg Podcasts. We'll see you next week.
