Facebook Admits to Manipulating Users' Emotions

    Facebook already knows a lot about its users, such as where you work and who you are currently in a relationship with. Now Facebook has revealed that it knows even more than we originally thought. It has disclosed one of its secrets: it knows how to make its users feel happier or sadder simply by adjusting what appears on their screens. Facebook has released details of an experiment in which it manipulated the information shown on 689,000 users' home pages, and found that it could make people feel more positive or negative through a process of "emotional contagion."

    The study was performed by researchers from Cornell and the University of California, working with Facebook to filter its users' news feeds. One test in the study reduced users' exposure to their friends' "positive emotional content," resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content," and, as expected, the opposite happened. The study concluded that "emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for a massive-scale emotional contagion via social networks."

    Unsurprisingly, lawyers and internet activists alike described this experiment in emotional manipulation as "scandalous", "spooky", and "disturbing." British politicians called for an investigation into how Facebook and other social networks like it manipulate the emotional and psychological responses of users by editing the information supplied to them. Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive to users. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people", Sheridan stated. He added: "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

    A spokeswoman for Facebook stated that the research was carried out "to improve our services and to make the content people see on Facebook as relevant and engaging as possible". Other commentators voiced fears that the process could be exploited for political purposes, such as in the run-up to elections.

    I personally think this is wrong. Facebook should not be doing things like this. It is very intrusive into a person's personal life and can be very controlling. Hopefully, laws will be put in place to protect users like myself from Facebook.

    Source
