Facebook’s explosive emotion manipulation experiment on almost 700,000 users was one of hundreds of psychological experiments conducted by the company, the majority of which operated with very few boundaries.
Facebook researchers ran the emotion manipulation study on users over the course of a week in January 2012 to test whether emotions were contagious on the social media platform. The recently published findings show they were able to alter users’ moods positively or negatively by curating the content in their News Feeds to highlight uplifting or depressing posts.
The social media giant’s data science team — made up of about 30 academics, scientists and doctors — was established in 2007 and has run hundreds of tests since on Facebook’s 1.3 billion users with scarce limitations or oversight, according to the Wall Street Journal.
In one test, for example, thousands of users received a message from Facebook informing them their accounts were on the verge of being blocked because the company suspected they were either robots or using false names. The social network actually had no such suspicions — the messages were sent to real users as anti-fraud propaganda.
Other experiments and subsequent studies examined the way political messages influenced congressional elections, the causes of loneliness, and the way families communicate. According to former Facebook data scientist Andrew Ledvina, such tests did not undergo a formal review process, and anyone could run one without the approval procedures similar academic studies require.
“They’re always trying to alter people’s behavior,” Ledvina said, alleging the research experiments were purposefully designed to manipulate users.
Ledvina said there were so many experiments being conducted simultaneously that researchers were concerned about the potential for inaccurate results stemming from users’ inadvertent involvement in multiple tests.
The week-long firestorm of media criticism has prompted Facebook to adopt more stringent guidelines, including an internal review panel of 50 specialists and an update to its terms of service informing users that their data may be used in research. Facebook has also said it is considering further measures to address users’ concerns in the wake of the study.