The researchers behind Facebook's controversial news feed tests have apologised to users of the social networking site for any "anxiety" their endeavours caused.
The site announced over the weekend the results of a study into its users' behaviour, which showed it can change their emotions by tweaking timelines so that more positive or negative posts dominate.
The social network came under fire for not getting permission from the 700,000 users affected by the tests, which also revealed that users would post more negative or positive messages on their timelines depending on which type of content they were shown.
Facebook published the findings in a report earlier this month, including an analysis of the posts people involved unknowingly made during the experimental period.
Anyone using Facebook with English set as their language was eligible for the study, but they were not advised whether they were being monitored or not.
The research found that the emotions expressed by one person spread quickly across the network, though friends tended to respond to negative posts more than to happier ones.
Those who received a lot of high-emotion posts on their timeline - whether positive or negative - tended to step back and stop posting.
Facebook worked on the experiment in partnership with Cornell University and the University of California over the course of a week in 2012.
Critics of the tests said the research could be used by Facebook to encourage people to post more content and, more worryingly, by highly influential bodies such as governments to manipulate people - changing timelines to make whole nations feel a certain emotion.
Others have questioned whether the site should have acquired the informed consent of users before carrying out the experiment.
However, changes to the firm's user policy, which came into force several months after the experiment took place, state that users of the site must accept their data will be used for "internal operations" and "research".
Given that this clause was reportedly only added to the firm's terms and conditions of use after the experiment, it could raise questions about whether affected Facebook users are entitled to compensation over the trials.
Susan Fiske, professor of psychology at Princeton University, who edited the study for publication, told the Atlantic: "It's ethically okay from the regulations perspective, but ethics are kind of social decisions. There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done... I'm still thinking about it and I'm a little creeped out too."
Facebook said it didn't collect any unnecessary data from the experiment, and the social network's guidelines do permit the use of data for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."
Now, one of the researchers - a Facebook data scientist - has used the social networking site to apologise to users caught up in the experiment and to explain the rationale behind it.
"We care about the emotional impact of Facebook and the people that use our product. We felt it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out," wrote Adam D.I. Kramer.
"At the same time we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper."
Kramer went on to say that the effect on people involved in the study was "minimal" and that the team never set out to upset anyone.
"My co-authors and I are very sorry for the way the paper described our research and any anxiety it caused," he continued.
"In hindsight, the research benefits of the paper may not have justified all of this anxiety."