Facebook has been experimenting on its users.
A new paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to study "emotional contagion through social networks."
The researchers tested whether reducing the number of positive messages people saw made those people less likely to post positive content themselves.
They ran the same test with negative posts, and they found that the emotional content of Facebook posts can influence people's feelings.
The researchers, who are affiliated with Facebook, Cornell, and the University of California–San Francisco, tweaked the algorithm by which Facebook sweeps posts into members’ news feeds.
Some users were fed primarily neutral-to-happy information from their friends; others, primarily neutral-to-sad. Then everyone's subsequent posts were evaluated for emotional content.
The methodology of the study has come under fire, but the researchers say the experiment was permitted under Facebook's data use policy.