This past summer, Facebook sparked controversy by altering users’ News Feeds for an emotion study, and on Thursday the company announced changes to its research process. Early coverage of the announcement is split: some articles point out what’s missing, some say it doesn’t go far enough, and others call it a step forward.
In a blog post Thursday, Mike Schroepfer, Facebook’s chief technology officer, wrote:
Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently.
Among the changes, according to the blog post, are an “enhanced review process” prior to research on certain groups of people and on content that could be “considered deeply personal,” along with the creation of an internal review panel for these projects. The post also states that the “bootcamp” for new engineers will include education on the company’s research practices and that its published academic research will be available on a single site.
But the New York Times’ Vindu Goel points out that the plan includes no outside review and leaves some questions unanswered:
But no outside body will review Facebook’s research projects, and the company declined to disclose what guidelines it would use to decide whether research was appropriate. Nor did it indicate whether it will get consent from users for projects like its emotion manipulation study, which set off a global furor when it was disclosed this summer.
In essence, Facebook’s message is the same as it has always been: Trust us, we promise to do better.
Gigaom’s Carmel DeAmicis writes that it’s unclear whether “external policing” will be conducted and points to potential problems with an internal review panel:
Facebook says its internal study review team will be comprised of lawyers, engineers, and privacy experts. That sounds hunky-dory, but outside accountability matters, since anyone who works for Facebook may be too close to the company to accurately determine ethical practices. For academic researchers, there are clear ways of doing things determined by industry organizations, external review boards, and even federal law.
It’s a step in the right direction for the company, but without additional systems to hold Facebook accountable, it’s not quite enough.
Ellis Hamburger at The Verge, on the other hand, has a more positive take:
Facebook can’t take back what it did, but today’s measures go a long way towards rectifying the underlying structures that enabled such an aggressive study to happen without Facebook’s higher-ups having any idea it was taking place. It should also help new engineers understand that Facebook users aren’t just numbers on a chart. 1.3 billion users is a whole lot of people, but that doesn’t mean you can experiment on them — even a tiny percentage of them — without being more transparent about exactly what you’re doing.
Josh Constine at TechCrunch writes: “If we’ve gained anything from the emotional manipulation study backlash, it’s that more of Facebook’s research will now be out in the open.”
“Before, it was buried in academic journals and often lacked comprehensible explanations of what Facebook was doing and why,” he writes. “That both made it feel like Facebook was shadily being secretive, and left research open to sensationalist interpretation.”