What others say: Facebook needs to show more respect for its users

Facebook is an extraordinary tool, but its pitfalls have become increasingly apparent. Users’ personal information, interests and habits are all fair game for the company, which has little compunction about analyzing the data and selling them to advertisers. Now Facebook has gone beyond capitalism and into creepy. For a week in 2012, it seems, the company manipulated users’ news feeds as part of a psychology experiment to see whether happier or sadder content led users to write happier or sadder posts. The result? Facebook appears to have altered people’s emotional states without their awareness.

This was wrong on multiple levels. It was unethical for Facebook to conduct a psychological experiment without users’ informed consent. And it was especially wrong to do so in a way that played with the emotions of its users. That’s dangerous territory.

Facebook, which employs a secret algorithm to determine what users see on their news feeds, conducted its research by altering the feeds of some 700,000 users, increasing or decreasing the number of “positive” and “negative” messages they saw to study the “emotional contagion” of social networking. The company, together with two academic researchers, published the results this month in the Proceedings of the National Academy of Sciences. In the study, Facebook asserted that users had given informed consent — the standard protocol in psychological research — when they agreed to the company’s terms of service, which caution that users’ data can be mined for analysis and research. But that’s disingenuous. It’s hard to believe that users who took the time to read Facebook’s 13,000-word service agreement would have understood they were signing on to be lab rats.

In response to the outrage, the Facebook researcher who designed the study apologized for “any anxiety it caused.” He added that the company will seek to improve its internal review practices for future research. Certainly Facebook needs to revisit its policies to ensure that its users are not unwilling participants in psychological research. If this research is so valuable, the company should seek true informed consent.

But Facebook also needs to address its cavalier attitude toward its users. This latest controversy sends a troubling message to users that their personal information, their online activities and now even their feelings are all data points to be analyzed and manipulated according to the whims of a giant corporate machine.

— Los Angeles Times, June 30
