Researchers say Facebook still has a way to go toward bringing more transparency and ethics to its research practices.
On Thursday, the company announced some new guidelines for the research it conducts on its site, with a focus on better training of employees and a review panel of different subject-matter experts. Education about research practices also will be incorporated into Facebook's six-week training for new engineers.
But outside researchers were underwhelmed by Facebook's announcement. They cited a lack of information on how Facebook decides whether to do studies on users in the first place, as well as on when and how they would tell users they were being studied.
"Facebook needs to be more transparent about its research methods and ethics, and we didn't get that today," said Elizabeth Buchanan, director of the center for applied ethics at the University of Wisconsin-Stout.
Facebook said proposed research would undergo an enhanced review process by a panel of employees across the areas of engineering, research, legal, privacy and policy before any study could begin.
But the company did not say how it would decide whether any particular type of study should be conducted on users. Nor did Facebook use the word "ethics" in its post on the new guidelines.
The company did not respond to requests for more information.
The changes were laid out after controversy erupted earlier this year over a 2012 study by Facebook in which it changed which posts users saw in their feeds based on emotional content. The study was designed to measure the effect on users' moods, but many people characterized it as Facebook manipulating their emotions.
Mike Schroepfer, Facebook's chief technology officer, said it's clear now there are things the company should have done differently.
In interviews with the IDG News Service, some academic researchers lamented the fact that Facebook did not say it would be using outside independent experts in its review process going forward.
"Outside involvement is important," said Irina Raicu, Internet ethics program director at the Markkula Center for Applied Ethics at Santa Clara University.
"More internal review is not the answer," she said.
Bringing in outside experts experienced in research, from fields such as anthropology, sociology and psychology, could lend Facebook's studies an objective point of view independent of the company's own business concerns.
But Facebook is probably caught between a rock and a hard place. Incorporating more outside review could help the company apply the level of ethical scrutiny typically exercised by university researchers. But it might also jeopardize Facebook's business by exposing its algorithms and underlying technology, Buchanan said.
Google already has an outside council of experts, though with a slightly different focus: it is weighing issues around people's requests to have information about them removed from search results in Europe.
Regardless, the amount of data posted on the Internet today surely has researchers of all types salivating. Twitter has invested US$10 million in a lab at the Massachusetts Institute of Technology to let researchers study online social movements by looking at its firehose of public tweets.
Some experts called Facebook's new guidelines a decent first step toward improving its research practices, though not a huge leap.
"This is a step forward," said Julia Horwitz, an attorney at the Washington, D.C.-based Electronic Privacy Information Center, which previously lodged a complaint with the U.S. Federal Trade Commission over Facebook's "mood study."
"But it's not significant enough of a step where if we saw a repeat of that study, that I would be surprised," she said.