jlaw172364 wrote: Regarding the "manipulating emotions" story, well Facebook is media, and media is used to manipulate emotions. Television, film, writing, etc. It can all be used for the same ends.
Not quite. The public assumes that the contract to be manipulated by the media is for the advertised ends, is beneficial, and is as universally applied as possible. You rent a rom-com to feel sappily good, a horror movie to be scared, a comedy to laugh, etc. People use Facebook to connect with friends, show off, see what people are up to, make connections, etc. I think even that default mode makes people unhappy, but no one assumes they are being secretly targeted and manipulated to feel unhappier than another subset of users. To me this is like Pepsi putting a depressant in a limited run of soda for a certain region to see how depressed the drinkers get compared to those getting the regular formula.
Read the study again. 689,003 randomly selected FB users were targeted without their consent to be made to feel less happy. This is more than the population of Seattle. How many of those users may have already struggled with depression? How many were possibly suicidal? This "study" was intended to cause harm, and for no other reason than profit. I summon all the class action lawyers of the world to descend upon FB and to tear it to pieces and carry those pieces off to drop them into the ocean.
Facebook conducted a psychological experiment on its users by manipulating their emotions without their knowledge, a new study reveals.

Researchers toyed with the feelings of 689,003 randomly selected English-speaking Facebook users by changing the contents of their news feeds, according to a paper published in the June edition of the journal 'Proceedings of the National Academy of Sciences' (PNAS). During a week-long period in January 2012, researchers staged two parallel experiments, reducing the number of positive or negative updates in each user's news feed.
"When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks," said the authors of the paper, who include researchers from Facebook, Cornell University, and the University of California.
“We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days.”
The researchers indicated that the study is the first to find that moods expressed via social networks influence the emotions of others.
“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
The Facebook users were not notified of the experiment. However, according to Facebook's terms of service (to which every person agrees when they register on the social network), users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
The researchers argue that their experiment was consistent with Facebook’s Data Use Policy.