Facebook and Engineering the Public — Crosspost

[I posted this first in the Message collection at Medium, where I do most of my regular writing these days. I’m cross-posting here both for archival purposes and because many people use RSS (yay RSS!) to read blogs, including this one.]

There’s been a lot of brouhaha about a recent Facebook study in which Facebook altered the news feeds of 689,000 of its users to see if moods were “contagious.” There has been a huge discussion of its ethics, and another about its publication. There’s also the argument that the effect sizes were actually not that large (though it seems the researchers kept them small on purpose), and a debate over whether the research was well done.

These are all good discussions, but I’m more struck by this defense, and this question:

Fourth, it’s worth keeping in mind that there’s nothing intrinsically evil about the idea that large corporations might be trying to manipulate your experience and behavior. Everybody you interact with–including every one of your friends, family, and colleagues–is constantly trying to manipulate your behavior in various ways. … So the meaningful question is not whether people are trying to manipulate your experience and behavior, but whether they’re trying to manipulate you in a way that aligns with or contradicts your own best interests. <read the rest here>

I’m struck by how this kind of power can be seen as no big deal. Large corporations exist to sell us things and to impose their interests, and I don’t understand why we, as the research/academic community, should just think that’s totally fine, or resign ourselves to it as “the world we live in.” That is the key strength of independent academia: we can speak up in spite of corporate or government interests.

To me, this resignation to online corporate power is a troubling attitude because these large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personalities and vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams. These tools and this power are new and evolving. It’s exactly the time to speak up!

That is one of the biggest shifts in power between people and big institutions, perhaps the biggest one yet of the 21st century. This shift, in my view, is just as important as the fact that we, the people, can now speak to one another directly and horizontally.

I’m not focusing on this one study, or its publication, because even if Facebook never publishes another such study, the only real impact will be on the disappointed researchers Facebook employs, who have access to proprietary and valuable databases and would like to publish in Nature, Science and PNAS while still working for Facebook. Facebook itself will continue to conduct such experiments daily and hourly; in fact, that is why the associated Institutional Review Board (IRB), which oversees the ethical considerations of research, approved the study: Facebook does this every day.

I’ve been writing and thinking about this a lot. I identify this model of control as a Gramscian model of social control: one in which we are effectively micro-nudged into “desired behavior” as a means of societal control. Seduction, rather than fear and coercion, is the currency, and as such it is a lot more effective. (Yes, short of deep totalitarianism, legitimacy, consent and acquiescence are stronger models of control than fear and torture; there are things you cannot do well in a society defined by fear, and running a well-oiled capitalist market economy is one of them.)

The secret truth of the power of broadcast media is that while it was very effective in restricting the limits of acceptable public speech, it was never that good at motivating people individually. Political and ad campaigns suffered from having to live with “broad profiles” that never really fit anyone. What’s a soccer mom but a general category that hides great variation?

With the new mix of big data and powerful, oligopolistic platforms (like Facebook), all that is solved, to some degree.

Today, more and more, corporations can not only target you directly, they can model you directly and stealthily. They can figure out answers to questions they have never posed to you, answers that you have no idea they hold. Modeling means having answers without making it known you are asking, and without the target knowing that you know. This is a great information asymmetry. Combine it with the applied behavioral science increasingly used by industry, political campaigns and corporations, and with the ability to easily conduct randomized experiments (the A/B test of the Facebook paper in question), and it is clear that the powerful have ever more ways to engineer the public. This is true for Facebook, for presidential campaigns, and for other large actors: big corporations and governments. (In fact, a study published in Nature, dissected in my peer-reviewed paper linked below, shows that Facebook can alter voting patterns, and another study shows that Facebook likes can be used to model your personality fairly accurately according to established psychology measures.)
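To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of randomized feed experiment described above. This is not Facebook’s actual code; serve_feed, collect_posts and score_sentiment are stand-ins I’ve invented for the platform’s proprietary machinery. But the shape of the experiment, random assignment, an altered feed, and measured downstream behavior, is the one at issue.

    import random
    import statistics

    # Hypothetical stand-ins (assumed for illustration, not real APIs):
    #   serve_feed(user, condition)  -- shows the user a feed for that condition
    #   collect_posts(user)          -- returns posts the user writes afterwards
    #   score_sentiment(post)        -- returns a sentiment score for one post

    def run_feed_experiment(users, serve_feed, collect_posts, score_sentiment):
        """Randomly assign users to conditions, alter their feeds, and compare
        the mean sentiment of what each group subsequently writes."""
        scores = {"control": [], "fewer_positive_stories": []}
        for user in users:
            condition = random.choice(list(scores))  # random assignment
            serve_feed(user, condition)              # the manipulated variable
            scores[condition].extend(
                score_sentiment(post) for post in collect_posts(user)
            )
        # A gap between the two means is the "emotional contagion" effect.
        return {c: statistics.mean(s) for c, s in scores.items() if s}

The point is not these few lines of code; it is that any platform in control of the feed can run this loop continuously, invisibly, and at population scale.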

That, to me, is a scarier and more important question than whether or not such research gets published. As I said, if this scares Facebook away from future publishing, the biggest loss falls on the Facebook research team’s aspirations to academic publication. The work itself will continue, stealthily, as it has been.

Hence, I object to this framing:

How is publishing the results of one A/B test worse than knowing nothing of the thousands of invisible tests?

The alternative is not that they don’t publish and we forget about it. The alternative is that they publish (or not) but we, as a research community, continue to talk about the substantive issue.

So, yes, I say we should care whether Facebook can manipulate emotions or voting behavior, whether it can model our personalities or map our social networks, regardless of the merits or strength of the findings of any one study. We should care that this data is proprietary, with little access to it by the user, and little knowledge of who gets to purchase it, use it, and manipulate us with it. And of course it’s not just Facebook: every major Internet platform, along with governments, is in this game, and they are spending a lot of money and effort because this is so important. As academics and the research community, we should be the last people to greet these developments with a shrug, because we are among the few remaining communities that have both the expertise to understand the models and the research, and an independent standing informed by history and knowledge.


If you want to read more, I have a much longer version of this argument, focusing on the case of political campaigns but applicable more broadly. Click here or on the title below for my paper, which has been accepted for publication and is forthcoming in the July 2014 issue of First Monday (Volume 19, Number 7).

Engineering the Public: Big Data, Surveillance and Computational Politics

Abstract: Digital technologies have given rise to a new combination of big data and computational practices which allow for massive, latent data collection and sophisticated computational modeling, increasing the capacity of those with resources and access to use these tools to carry out highly effective, opaque and unaccountable campaigns of persuasion and social engineering in political, civic and commercial spheres.

PS. Jonathan Zittrain has written up one of the examples in my paper, Facebook throwing an election, in the latest issue of TNR. I had used the same example in public talks, writings and elsewhere since at least May of 2013, including in an earlier version of this paper that I had published, so this is a case of many minds coming up with the same examples. 🙂