How are you feeling? Facebook wants to know. Wait, scratch that -- Facebook may already know. A just-published report about a 2012 study suddenly has the Internet buzzing with concern, once again, over Facebook policies.

This week's debate relates to the secretive social experiment Facebook conducted on a random selection of 689,003 of its one billion-plus users. According to an article published June 17, 2014, in the Proceedings of the National Academy of Sciences, researchers from Facebook and Cornell University were testing whether certain emotions could be manipulated and would then spread among people without face-to-face contact.

As part of the experiment, the number of positive and negative comments that Facebook users saw in their feeds of articles and photos was artificially altered without their knowledge in January 2012. In the end, the researchers found that users who were shown fewer positive words wrote more negative posts, while those exposed to fewer negative terms shared more positive posts.

You Said We Could

The authors of the study were able to conduct the research because, they said, automated testing “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

But many are saying that the problem with gaining user consent this way is that users give Web site privacy policies only a cursory look, if they read them at all. Even users who take the time to read Facebook’s user agreement might not understand what they’re signing up for, according to Susan Etlinger, an industry analyst with the Silicon Valley-based Altimeter Group.

“Facebook has been making changes to its timeline algorithm since it began, and will continue to do so,” says Etlinger. “What makes this different from other areas that are covered in Facebook’s terms and conditions is that it’s not made clear that users will be part of a behavior experiment.”

A Bit of Backtracking

Facebook has shown at least hints of contrition since the results of the study spurred such a strong reaction this week. Adam Kramer, a Facebook data scientist who was among the study’s authors, wrote on his Facebook page earlier this week that the team was “very sorry for the way the paper described the research and any anxiety it caused.”

The main problem with the Facebook experiment, say observers, is that it exposes the notoriously weak form of consent that underlies many online transactions. The nearly unanimous uproar this week over the secretive experiment highlights the importance of obtaining truly informed consent for such research.

The experiment looked even more suspicious when it was reported earlier this week that, four months after the study, Facebook changed its data use policy in a way that could be seen as an attempt to cover its tracks in case of the kind of negative reaction the study has received. The new clause permitted the use of information “(f)or internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Whether the uproar leads to stronger regulation of online user agreements -- or just prompts users to read such agreements more carefully -- the industry will be paying attention.

“This is big because it’s a matter of social data ethics,” says Etlinger. “I think it’s good that there’s an outcry over this. It sets a dangerous precedent.”