
The ONLY reason we're examining Facebook's sleazy behavior is because Trump won. -Michael Krieger

artemis6 · Mar 22, 2018, 8:09:59 PM

Michael Krieger | Posted Thursday Mar 22, 2018 at 12:00 pm

Trust me, there’s nobody more thrilled to see Facebook’s unethical and abusive practices finally getting the attention they deserve from mass media and members of the public who simply didn’t want to hear about it previously. I’ve written multiple articles over the years warning people about the platform (links at the end), but these mostly fell on deaf ears.

That’s just the way things go. All sorts of horrible behaviors can continue for a very long time before the corporate media and general public come around to caring. You typically need some sort of external event to change mass psychology. In this case, that event was Trump winning the election.

The more I read about the recent Facebook scandal, the clearer it becomes that this sort of thing has been going on for a very long time. The major difference is that this time the data mining was used by campaign consultants of the person who wasn’t supposed to win: Donald Trump.

To get a sense of what I mean, let’s take a look at some excerpts from a deeply troubling article recently published by the Guardian, ‘Utterly Horrifying’: Ex-Facebook Insider Says Covert Data Harvesting Was Routine:

Hundreds of millions of Facebook users are likely to have had their private information harvested by companies that exploited the same terms as the firm that collected data and passed it on to Cambridge Analytica, according to a new whistleblower.

Sandy Parakilas, the platform operations manager at Facebook responsible for policing data breaches by third-party software developers between 2011 and 2012, told the Guardian he warned senior executives at the company that its lax approach to data protection risked a major breach…

Parakilas, whose job was to investigate data breaches by developers similar to the one later suspected of Global Science Research, which harvested tens of millions of Facebook profiles and provided the data to Cambridge Analytica, said the slew of recent disclosures had left him disappointed with his superiors for not heeding his warnings.

“It has been painful watching,” he said, “because I know that they could have prevented it”…

Asked what kind of control Facebook had over the data given to outside developers, he replied: “Zero. Absolutely none. Once the data left Facebook servers there was not any control, and there was no insight into what was going on.”

Parakilas said he “always assumed there was something of a black market” for Facebook data that had been passed to external developers. However, he said that when he told other executives the company should proactively “audit developers directly and see what’s going on with the data” he was discouraged from the approach.

He said one Facebook executive advised him against looking too deeply at how the data was being used, warning him: “Do you really want to see what you’ll find?” Parakilas said he interpreted the comment to mean that “Facebook was in a stronger legal position if it didn’t know about the abuse that was happening”.

He added: “They felt that it was better not to know. I found that utterly shocking and horrifying”…

Parakilas, 38, who now works as a product manager for Uber, is particularly critical of Facebook’s previous policy of allowing developers to access the personal data of friends of people who used apps on the platform, without the knowledge or express consent of those friends.

This is deeply unethical and sleazy behavior, but it was also routine for years before the Cambridge Analytica news broke.

That feature, called friends permission, was a boon to outside software developers who, from 2007 onwards, were given permission by Facebook to build quizzes and games – like the widely popular FarmVille – that were hosted on the platform.

Parakilas does not know how many companies sought friends permission data before such access was terminated around mid-2014. However, he said he believes tens or maybe even hundreds of thousands of developers may have done so.

Parakilas estimates that “a majority of Facebook users” could have had their data harvested by app developers without their knowledge. The company now has stricter protocols around the degree of access third parties have to data.

Parakilas said that when he worked at Facebook it failed to take full advantage of its enforcement mechanisms, such as a clause that enables the social media giant to audit external developers who misuse its data.

Legal action against rogue developers or moves to ban them from Facebook were “extremely rare”, he said, adding: “In the time I was there, I didn’t see them conduct a single audit of a developer’s systems”…

During the time he was at Facebook, Parakilas said the company was keen to encourage more developers to build apps for its platform and “one of the main ways to get developers interested in building apps was through offering them access to this data”. Shortly after arriving at the company’s Silicon Valley headquarters he was told that any decision to ban an app required the personal approval of the chief executive, Mark Zuckerberg, although the policy was later relaxed to make it easier to deal with rogue developers.

While the previous policy of giving developers access to Facebook users’ friends’ data was sanctioned in the small print in Facebook’s terms and conditions, and users could block such data sharing by changing their settings, Parakilas said he believed the policy was problematic.

“It was well understood in the company that that presented a risk,” he said. “Facebook was giving data of people who had not authorised the app themselves, and was relying on terms of service and settings that people didn’t read or understand.”

It was this feature that was exploited by Global Science Research, and the data provided to Cambridge Analytica in 2014. GSR was run by the Cambridge University psychologist Aleksandr Kogan, who built an app that was a personality test for Facebook users.

The test automatically downloaded the data of friends of people who took the quiz, ostensibly for academic purposes. Cambridge Analytica has denied knowing the data was obtained improperly, and Kogan maintains he did nothing illegal and had a “close working relationship” with Facebook.

“Kogan’s app was one of the very last to have access to friend permissions,” Parakilas said, adding that many other similar apps had been harvesting similar quantities of data for years for commercial purposes. Academic research from 2010, based on an analysis of 1,800 Facebook apps, concluded that around 11% of third-party developers requested data belonging to friends of users.

If those figures were extrapolated, tens of thousands of apps, if not more, were likely to have systematically culled “private and personally identifiable” data belonging to hundreds of millions of users, Parakilas said.

The ease with which it was possible for anyone with relatively basic coding skills to create apps and start trawling for data was a particular concern, he added.

Parakilas said he was unsure why Facebook stopped allowing developers to access friends data around mid-2014, roughly two years after he left the company. However, he said he believed one reason may have been that Facebook executives were becoming aware that some of the largest apps were acquiring enormous troves of valuable data.

He recalled conversations with executives who were nervous about the commercial value of data being passed to other companies.

“They were worried that the large app developers were building their own social graphs, meaning they could see all the connections between these people,” he said. “They were worried that they were going to build their own social networks.”

There’s a lot to process, so let’s recap the key points. Until around 2014, Facebook allowed app developers to harvest the data not only of the people who actually installed an app and gave permission, but also of those users’ friends, without their knowledge. It’s hard to overstate how sleazy and unethical this is.
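To make that mechanism concrete, here is a minimal, hypothetical sketch of what the harvest looked like from a developer’s side. It assumes the long-retired, pre-2015 unversioned Graph API and its friends_* permissions; the endpoint, field names, and token are illustrative assumptions, not a reconstruction of any particular app.

```python
# Illustrative sketch only: roughly how a third-party app could pull
# friends' profile data under Facebook's pre-2015 Graph API
# "friends_*" permissions described above. The endpoint, fields, and
# token are assumptions; USER_ACCESS_TOKEN stands in for a hypothetical
# token granted by a single user who installed the app.
import requests

GRAPH = "https://graph.facebook.com"
USER_ACCESS_TOKEN = "hypothetical-user-token"

def fetch_friends_data(token):
    """Return basic profile fields for every friend of the one consenting user."""
    friends = []
    url = f"{GRAPH}/me/friends"
    params = {
        "access_token": token,
        # These fields were readable only because the app had asked the
        # installing user for friends_* permissions; the friends
        # themselves were never asked.
        "fields": "id,name,birthday,location",
    }
    while url:
        resp = requests.get(url, params=params).json()
        friends.extend(resp.get("data", []))
        # Follow pagination until the friend list is exhausted.
        url = resp.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries the query string
    return friends

if __name__ == "__main__":
    harvested = fetch_friends_data(USER_ACCESS_TOKEN)
    print(f"One consenting user yielded {len(harvested)} friend profiles")
```

The point the sketch illustrates is the asymmetry Parakilas describes: a single consenting user’s token was enough to pull profile fields for every one of that user’s friends, none of whom were ever asked.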

When internal employees such as Sandy Parakilas noticed the glaring issues with these practices and raised them with management, they were repeatedly ignored and advised to look the other way. It was completely obvious to Mr. Parakilas that Facebook’s shady policy was unleashing a black market for the data of millions into the wild.

Rampant data harvesting by third parties was routine well before Cambridge Analytica came into the picture, and academic research highlighted the problem as far back as 2010. Finally, Mr. Parakilas suspects the reason Facebook backed away from this sleazy practice wasn’t ethical or privacy concerns, but that executives feared other companies would collect enough user information to eventually compete with Facebook in the data mining business.

Unfortunately, there’s much more.

Take the following, reported a couple of days ago in the National Review:

Where were these worries four years ago for the much larger and arguably more manipulative effort by the Obama campaign?

Instead of using a personality quiz, the Obama campaign merely got a portion of its core supporters to use their Facebook profiles to log into a campaign site. Then they used well-tested techniques of gaining consent from that user to harvest all their friends’ data. Sasha Issenberg gushed about how the Obama campaign used the same permissions structure of Facebook to extract the data of scores of millions of Facebook users who were unaware of what was happening to them. Combining Facebook data with other sources such as voter-registration rolls, Issenberg wrote, generated “a new political currency that predicted the behavior of individual humans. The campaign didn’t just know who you were; it knew exactly how it could turn you into the type of person it wanted you to be.”

The level of data sophistication was so intense that Issenberg could describe it this way:

Obama’s campaign began the election year confident it knew the name of every one of the 69,456,897 Americans whose votes had put him in the White House. They may have cast those votes by secret ballot, but Obama’s analysts could look at the Democrats’ vote totals in each precinct and identify the people most likely to have backed him. Pundits talked in the abstract about reassembling Obama’s 2008 coalition. But within the campaign, the goal was literal. They would reassemble the coalition, one by one, through personal contacts.

Today’s Cambridge Analytica scandal causes our tech chin-strokers to worry about “information” you did not consent to share, but the Obama team created social interactions you wouldn’t have had. They didn’t just build a psychological profile of persuadable voters, and algorithmically determine ways of persuading them, but actually encouraged particular friends — ones the campaign had profiled as influencers — to reach out to them personally. In a post-election interview, the campaign’s digital director Teddy Goff explained the strategy: “People don’t trust campaigns. They don’t even trust media organizations,” he told Time’s Michael Scherer, “Who do they trust? Their friends?” This level of manipulation was celebrated in the press.

How did Facebook react to the much larger data harvesting of the Obama campaign? The New York Times reported it out, in a feature hailing Obama’s digital masterminds:

The campaign’s exhaustive use of Facebook triggered the site’s internal safeguards. “It was more like we blew through an alarm that their engineers hadn’t planned for or knew about,” said [Will] St. Clair, who had been working at a small firm in Chicago and joined the campaign at the suggestion of a friend. “They’d sigh and say, ‘You can do this as long as you stop doing it on Nov. 7.’ ”

In other words, Silicon Valley is just making up the rules as they go along. Some large-scale data harvesting and social manipulation is okay until the election. Some of it becomes not okay in retrospect. They sigh and say okay so long as Obama wins. When Clinton loses, they effectively call a code red.

What’s genius when it helps Obama is suddenly a threat to democracy when Trump wins. The truth is, it was dangerous and unethical before, and it’s dangerous and unethical now.
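To illustrate the kind of friend-graph targeting the reporting above describes, here is a purely hypothetical sketch of how harvested friend lists could be assembled into a social graph and mined for “influencer” friends. The data, names, and scoring are invented for illustration; nothing here is drawn from the campaign’s actual tooling.

```python
# Purely illustrative: the kind of friend-graph analysis the reporting
# above describes a campaign doing with harvested friend lists. All
# names, data, and scoring are hypothetical.
from collections import defaultdict

def build_graph(harvested):
    """harvested: {supporter_id: [friend_ids]} collected via app logins."""
    graph = defaultdict(set)
    for supporter, friends in harvested.items():
        for friend in friends:
            graph[supporter].add(friend)
            graph[friend].add(supporter)
    return graph

def pick_influencers(graph, persuadable, top_n=3):
    """Rank a persuadable voter's known friends by how connected they
    are inside the harvested graph (simple degree centrality)."""
    friends = graph.get(persuadable, set())
    return sorted(friends, key=lambda f: len(graph[f]), reverse=True)[:top_n]

if __name__ == "__main__":
    # Toy data standing in for millions of harvested friend lists.
    harvested = {
        "supporter_a": ["voter_1", "voter_2", "voter_3"],
        "supporter_b": ["voter_2", "voter_4"],
    }
    g = build_graph(harvested)
    print(pick_influencers(g, "voter_2"))
```

Simple degree centrality stands in here for whatever proprietary scoring was actually used; the design point is that once friend lists leave the platform, this kind of analysis takes little more than basic coding skills, which is exactly the concern Parakilas raised.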

Yet, there’s still more.

Take this, from IJR:

A former Obama campaign official lit up the internet on Monday after claiming that Facebook allowed them to mine massive amounts of Facebook data because “they were on our side.”

Now, comments she made in 2015 are shining even more light on exactly how extensive the data mining effort was — and how it may have given Democrats an “unfair” data advantage going forward.

Carol Davidsen, former director of integration and media analytics for Obama for America, said the Obama campaign was able to “ingest the entire social network” in the United States.

“Where this gets complicated is, that freaked Facebook out, right? So they shut off the feature,” she added. “Well, the Republicans never built an app to do that. So the data is out there, you can’t take it back, right? So Democrats have this information”…

“I’m a Democrat, so maybe I could argue that’s a great thing, but really it’s not in the overall process,” Davidsen said. “That wasn’t thought all the way through and now there’s a disadvantage of information that, to me, seems unfair. But I’m not Facebook.”


What’s going on here is pretty obvious. Mainstream press and “resistance” pundits have finally decided to care about the pernicious practices of a platform monopoly only because the wrong guy won. I’m fine with this, since I don’t really care why we decide to address serious societal issues, as long as we do address them.

That said, the danger is that scores of dishonest actors are trying to hijack the narrative and make this only about Trump and Cambridge Analytica, rather than about a long-standing systemic issue of predatory practices that have always been central to Facebook’s business model: monetizing your information.

It’s a topic I’ve written about many times over the years. Here are a few of those pieces:

A Very Disturbing and Powerful Post – “Get Your Loved Ones Off Facebook”

Facebook Faces High Profile Lawsuit Regarding Facial Recognition Technology ‘DeepFace’

Facebook Just Got a Whole Lot Creepier

Facebook Caught Secretly Lobbying for Privacy Destroying “Cyber Security” Bill

Video of the Day – Three Former U.S. Treasury Secretaries and a Facebook Executive Laugh About Income Inequality

Facebook Reveals its Master Plan – Control All News Flow

If you liked this article and enjoy my work, consider becoming a monthly Patron, or visit our Support Page to show your appreciation for independent content creators.

In Liberty,

Michael Krieger