What Sandberg and Modi didn't talk about

BY Geeta Seshu| IN Media Freedom | 04/07/2014
Facebook's violation of privacy and ethics in its 'emotional contagion' research is causing a furore yet does not seem to have figured in Narendra Modi's talks with Facebook COO Sheryl Sandberg.
The PM should have questioned her closely on behalf of India’s millions of Facebook users, says GEETA SESHU

“The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion”

- from Experimental Evidence of Massive-scale Emotional Contagion through Social Networks, PNAS, June 2014

One is reasonably sure, without having been a fly on the wall, that this research exercise involving Facebook users, published in the Proceedings of the National Academy of Sciences (PNAS), didn’t figure in the meeting this week between Prime Minister Narendra Modi and Facebook’s Chief Operating Officer (COO) Sheryl Sandberg.

This, despite the fact that 689,003 Facebook users were made unwitting subjects of the shockingly intrusive and unethical research project. The experiment was conducted for a week in January 2012, and its findings were released last month.

It could be asked, of course, why this matter should have figured in the meeting at all. Should it concern India’s Prime Minister?

Well, we should take our cue from the fact that Sandberg, in interactions with the media a day earlier, spoke excitedly of India having the second largest number of Facebook users in the world.

Of the 1.2 billion monthly active Facebook users in the world, over 100 million are from India, making it second only to the US, which will probably be reaching saturation point soon. Not so with India, as Sandberg correctly surmised; better connectivity and more mobile accessibility are likely to increase use of the Internet significantly.

Obviously, with such a large pool of users, India should worry about Facebook’s research and whether the social networking company has been violating the privacy of its users.

However, reports of the interaction between Sandberg and Modi seem to suggest the latter was instead concerned about using Facebook to promote tourism!

Here’s an excerpt from his post after the meeting:

"Had a very fruitful meeting with Sheryl Sandberg. She pointed out that India is a very important country for Facebook, considering the high number of active Facebook users in India.” 

"Being an avid user of social media myself, I talked about ways through which a platform such as Facebook can be used for governance and better interaction between the people and governments. I also talked about how Facebook can be used to bring more tourists to India." 

It does not seem likely that we can expect help in tackling ethical issues concerning Facebook from these quarters.

Facebook and privacy 

However, the Indian media did question Sandberg on this controversy, and it is instructive to look at her response. In an interview to NDTV, she termed it a case of bad communication: 

"This was one week and it was a small experiment," she said. "It has been communicated as an experience to shift emotions, it's not exactly what it was. It was an experiment in showing people different things to see -- to see how it worked. Again, what really matters here is that we take people's privacy incredibly seriously and we will continue to do that." 

But as more information comes in on the “emotional contagion” study, it is clear that a huge privacy violation occurred and, more unethically, that users’ News Feeds were manipulated by filtering the posts shown to them.

Here’s what the PNAS paper laid out: 

Two parallel experiments were conducted for positive and negative emotion: One in which exposure to friends’ positive emotional content in their News Feed was reduced, and one in which exposure to negative emotional content in their News Feed was reduced. In these conditions, when a person loaded their News Feed, posts that contained emotional content of the relevant emotional valence, each emotional post had between a 10% and 90% chance (based on their User ID) of being omitted from their News Feed for that specific viewing. It is important to note that this content was always available by viewing a friend’s content directly by going to that friend’s “wall” or “timeline,” rather than via the News Feed. Further, the omitted content may have appeared on prior or subsequent views of the News Feed. Finally, the experiment did not affect any direct messages sent from one user to another. 

As Violet Blue, writing in ZDNet (‘Facebook: Unethical, untrustworthy, and now downright harmful’), has pointed out, the Linguistic Inquiry and Word Count (LIWC) 2007 tool used for the study is faulty and cannot even correctly classify a clearly negative statement like ‘I am not having a great day’. As she points out, ‘there is no way of knowing exactly what Facebook did to the emotional temperature of over half a million people’. 

After the uproar, the researchers have tried to defend themselves in a number of inventive ways: that they didn’t communicate the experiment properly, that they were ‘concerned that exposure to friends' negativity might lead people to avoid visiting Facebook’ and that their goal was never to upset anyone! 

Besides, as Facebook said in its defence, users do give the company permission to do research – something we all miss when we click on that ‘agree’ button. How many of us bother to scroll down to the Data Use Policy and notice the line (‘for internal operations, including troubleshooting, data analysis, testing, research and service improvement’) in the section on ‘How we use the information we receive’? And in any case, Facebook’s privacy settings change so often that users hardly ever bother to keep up with them.  

But to come back to India and its huge population of Facebook users: in the absence of a privacy law, there is a dire need for much more debate and understanding of how much of our data is up for grabs and how much control we have over its use or misuse.  

Sadly, our Prime Minister, who should have been more alert to issues of privacy and surveillance, given last week’s disclosure that the US government’s NSA had been authorised to conduct surveillance of his own political party, seems to have looked the other way. 
