Facebook running involuntary psych experiments

RichardGarfinkle

Nurture Phoenixes
Staff member
Moderator
Kind Benefactor
Super Member
Registered
Joined
Jan 2, 2012
Messages
11,176
Reaction score
3,198
Location
Walking the Underworld
Website
www.richardgarfinkle.com
Facebook ran a psych test on users to see if its software could manipulate their emotional states.

The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in the Proceedings of the National Academy of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can!
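To make the mechanism concrete, here's a toy sketch (Python) of what sentiment-skewed feed filtering could look like. Everything in it is made up for illustration - the word lists, the function names, the omit probability - and the actual study scored posts with LIWC-style word counting, not anything this crude.

Code:
import random

# Toy positive/negative word lists, purely illustrative.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def sentiment(post):
    """Crude score: +1 per positive word, -1 per negative word."""
    words = post.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

def skewed_feed(posts, suppress="negative", omit_prob=0.5):
    """Randomly withhold posts whose sentiment matches the suppressed
    polarity; everything else passes through untouched."""
    feed = []
    for post in posts:
        score = sentiment(post)
        is_target = (score < 0) if suppress == "negative" else (score > 0)
        if is_target and random.random() < omit_prob:
            continue  # this post never reaches the user's feed
        feed.append(post)
    return feed

# A user randomly assigned to the "see fewer negative posts" condition:
posts = ["I love this wonderful day",
         "what an awful terrible week",
         "meeting moved to noon"]
print(skewed_feed(posts, suppress="negative"))

Run something like that over a week of feeds, and the word counts of the user's own subsequent posts become the measured outcome.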

This is legal because of the Facebook user agreement. But I doubt most members realize that it entitles Facebook to use them, willy-nilly, as lab rats.

Psych question:
How much higher would the outrage level be if a government did this and why?
 

shadowwalker

empty-nester!
Super Member
Registered
Joined
Mar 8, 2010
Messages
5,601
Reaction score
598
Location
SE Minnesota
Not a fan of FB (I have an account mainly to see if any of my old classmates die), so I don't pay a lot of attention to anything on there. But isn't this sort of like companies running a new ad in certain geographic areas to test its effectiveness before going nationwide/global (assuming the buy/don't-buy decision is an emotional response)? So I'm not upset about some social media site conducting such "experiments" - I would, though, like to know what they intend to do now that they have the results.
 

Don

All Living is Local
Super Member
Registered
Joined
May 28, 2008
Messages
24,567
Reaction score
4,007
Location
Agorism FTW!
Researching "the question of whether emotional states can be transmitted across a social network?" Was Captain Obvious in charge of this study?

As to Richard's psych question, given that FB is a government created, regulated and subsidized fictional entity that willingly colludes with government, I'd say there's very little difference between the two. Left hand of the political class, meet the right hand of the political class. Perhaps the task fell to FB instead of FedGov precisely because the outrage would be so much less.

I've gotta say though, given the obviousness of the outcome of the question, I'm shocked that FB didn't find some government honcho willing to give them a couple billion taxpayer dollars to perform the research.
 

Cella

Cella
Kind Benefactor
Super Member
Registered
Joined
Aug 22, 2009
Messages
26,851
Reaction score
13,880
Well, I'm no rocket scientist, but everyone I know gets quite upset when FB makes any changes at all.
 

RichardGarfinkle

Nurture Phoenixes
Staff member
Moderator
Kind Benefactor
Super Member
Registered
Joined
Jan 2, 2012
Messages
11,176
Reaction score
3,198
Location
Walking the Underworld
Website
www.richardgarfinkle.com
The title of the paper is disingenuous. The article says this:

It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users.

Facebook was testing different algorithms for what posted items each user would see. What their results mean (assuming they are accurate and statistically significant) is that Facebook could choose for each user an algorithm that will make them happier or sadder.

They don't have to use the same algorithm for each user.

They also don't have to use the same algorithm on each day. So they could use a happy algorithm on days when things Facebook likes happen, and an unhappy algorithm on other days. They could push happiness followed by an ad blitz, or sadness followed by polls about policies Facebook disapproves of.

The algorithm could be merely the first part of various one-two punches.
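
A minimal sketch of that point, reusing the toy skewed_feed filter from my earlier post; the campaign_calendar lookup is entirely hypothetical:

Code:
# Nothing forces one global algorithm: the mood pushed at a user can be
# chosen per user and per day from whatever schedule suits Facebook.
def pick_mood(user_id, date, campaign_calendar):
    """campaign_calendar is a made-up dict keyed by (user_id, date),
    e.g. {("user42", "2014-06-02"): "happy"}."""
    return campaign_calendar.get((user_id, date), "neutral")

def build_feed(user_id, date, posts, campaign_calendar):
    mood = pick_mood(user_id, date, campaign_calendar)
    if mood == "happy":
        return skewed_feed(posts, suppress="negative")
    if mood == "sad":
        return skewed_feed(posts, suppress="positive")
    return posts  # no skew on neutral days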
 

Witch_turtle

hanging around for a spell
Super Member
Registered
Joined
Feb 14, 2009
Messages
910
Reaction score
113
Location
North
I really wish fb would just show me everything my friends post. It's incredibly terrible at guessing what I want to see and who I want to see things from.

Besides that, it does seem to be getting increasingly skeevy.
 

kaitie

With great power comes
Super Member
Registered
Joined
Sep 10, 2009
Messages
11,050
Reaction score
2,637
The biggest problem with this is that, regardless of that little "you approve Facebook to use your profile for research" clause that they have in their user agreement, it still looks like this violates research requirements.

If I were to run a psychological experiment, I would have to inform you not only that you are being researched, but also what type of research I'd be doing. I would also need approval that my consent form was up to par.

This is something psychology takes very seriously. When you are talking about doing things that are intended to make people sad and give them negative emotions, that needs to be spelled out somewhere in the consent form. The person being researched must be aware of consequences before they agree. That's the way these things work. That way, say, a person suffering from severe depression who really doesn't want to be exposed to more negative emotion has the ability to opt out.

There was no explanation here. Facebook randomly assigned people, who had no way of even knowing they were actively involved in this kind of research. I'd also argue that what most people would associate with "research" in their user agreement is nothing near this. I imagine most people would think of something like analyzing their posts for advertisement purposes or things of that nature--in other words, passive research: taking what's already there and aggregating it to see what they can learn about a person. That's very different from actively changing things to see how a person behaves. That's acting on the subject, and I imagine a lot of people on Facebook would not approve of being actively experimented on.

Facebook should have sent each user a consent form asking permission and describing what they were doing, and this needed to be separate from the user agreement (how many people actually read those? We should, but I'd guess the vast majority don't). Personally, I'm a little annoyed that this made it as far as it did. If I were a peer reviewer, I'd have shot them down for the lack of consent forms.
 

Celia Cyanide

Joker Groupie
Super Member
Registered
Joined
Oct 1, 2005
Messages
15,479
Reaction score
2,295
Location
probably watching DARK KNIGHT
I don't really see how the experiment is able to prove anything, anyway. Facebook statuses are not a reflection of what the user is really feeling, but what they are choosing to put out. All the results show is that seeing negativity makes users more likely to post negativity. It doesn't mean they're actually feeling more negative. It could just mean that they feel more comfortable sharing their negative feelings, since they see others doing it. I know several people who are always on Facebook talking about how great their life is, when I know they're really miserable in person.
 

Wilde_at_heart

υπείκωphobe
Super Member
Registered
Joined
Sep 12, 2012
Messages
3,243
Reaction score
514
Location
Southern Ontario
Exactly, Celia.

There's a reason so many call it 'fakebook'.

I get irked at the 'top stories' rubbish, so I always switch it to 'all' on the rare occasions I actually log in. Not sure if that was part of their psy-op or what, but a lot of what they do on there makes me want to go on there less and less all the time.
 

Cassiopeia

Otherwise Occupied
Super Member
Registered
Joined
Aug 1, 2006
Messages
10,878
Reaction score
5,343
Location
Star to the right and straight on till morning.
Facebook doesn't corner the market on this kind of research. Google and its child email program, Gmail, are farming information from our emails and selling it to research companies every single day.

Facebook and social media sites have been using our information for research from the get-go. I'm not sure why this is so upsetting. It's merely the modern-day way of doing research that's been done since heaven knows when.
 

Perks

delicate #!&@*#! flower
Kind Benefactor
Super Member
Registered
Joined
Apr 12, 2005
Messages
18,984
Reaction score
6,937
Location
At some altitude
Website
www.jamie-mason.com
Perhaps the conclusion of the study is actually just some bold and underlined reverse psychology - Don't Let What's On Facebook Spoil Your Mood.
 

Pyekett

I need no hot / Words.
Kind Benefactor
Super Member
Registered
Joined
Jan 30, 2011
Messages
1,290
Reaction score
202
Location
Translated.
Pure data analysis without testing an intervention is different from research on the effects of manipulating the environment. One of the Forbes articles on this puts it bluntly:

That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish, and it’s why people are flipping out about this.

http://www.forbes.com/sites/kashmir...he-fuss-about-its-emotion-manipulation-study/

Not all people are flipping out. Most of the people flipping out are professional researchers or other people in academia. In part this is because the study was published in an academic journal, and there are supposed to be safeguards on human subjects research. Those are safeguards that bedevil the lives of researchers, but they are there for a reason.

I think taking up the mantle of "respectable research" by publishing this way is going to bring a level of scrutiny and dissection to the study that Facebook never anticipated. That's appropriate.

Added: If you don't pay the piper, you can't call the tune.
 

RichardGarfinkle

Nurture Phoenixes
Staff member
Moderator
Kind Benefactor
Super Member
Registered
Joined
Jan 2, 2012
Messages
11,176
Reaction score
3,198
Location
Walking the Underworld
Website
www.richardgarfinkle.com
Facebook doesn't corner the market on this kind of research. Google and its child email program, Gmail, are farming information from our emails and selling it to research companies every single day.

Facebook and social media sites have been using our information for research from the get-go. I'm not sure why this is so upsetting. It's merely the modern-day way of doing research that's been done since heaven knows when.

There's a distinction between examining data arising from observation and manipulating people to see what happens.
 

Pyekett

I need no hot / Words.
Kind Benefactor
Super Member
Registered
Joined
Jan 30, 2011
Messages
1,290
Reaction score
202
Location
Translated.
I think people knew they were signing up for swimming in a fishbowl. They probably hadn't expected it to be a petri dish as well, though some additional thought shows the logical implications.

I'm not sure that broader society can or should break apart the petri dish. People should know that's what they signed up for, regardless. And the research community has an ethical responsibility to distance itself from marketing departments if those departments are not going to agree to research rules.

Anyone who's been paying attention knows marketing departments are shady in this way. Researchers, in contrast, have to waltz over the sun and back to prevent such shadiness.

Added: All subsequent research will pay a very high price unless we come down on this like a hammer. Rightly so. If that demarcation is not kept very clean, then the broader public cannot trust that academic research holds itself to a higher standard.

Also Added: This was funded in part by the Army Research Office, which means Department of Defense, which means taxpayer dollars. This isn't just internal quality control for a private company. This is at least in part publicly funded research. Generally that means strict standards and many levels of failsafe controls against unintended harms.

The study was funded in part by the James S. McDonnell Foundation and the Army Research Office. Other investigators included Jamie Guillory, a Cornell postdoctoral associate when the project began who now works at the UCSF Center for Tobacco Control Research and Education, and Adam D.I. Kramer of Facebook.
http://www.news.cornell.edu/stories/2014/06/news-feed-emotional-contagion-sweeps-facebook

And again: I'm disturbed that the editor of this journal article seemed not to know that there was some federal funding. So little oversight in this context is, well, unsettling.

A lot of the regulation of research ethics hinges on government supported research, and of course Facebook's research is not government supported [but see above], so they're not obligated by any laws or regulations to abide by the standards.
-Susan Fiske, The Atlantic interview

http://www.businessinsider.com/facebook-mood-study-2014-6
 

veinglory

volitare nequeo
Self-Ban
Registered
Joined
Feb 12, 2005
Messages
28,750
Reaction score
2,933
Location
right here
Website
www.veinglory.com
As someone who has run psychological tests, I would not do it this way, and my ethics board would not permit it. Informed consent must be consciously given by participants in any study that is not purely observational.
 

Pyekett

I need no hot / Words.
Kind Benefactor
Super Member
Registered
Joined
Jan 30, 2011
Messages
1,290
Reaction score
202
Location
Translated.
Here is what the editor for the journal article had to say, again:

"I was concerned," she told me in a phone interview, "until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time..."

-http://www.businessinsider.com/facebook-mood-study-2014-6

And here is part of the statement released by Facebook:

“... We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

-http://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/

I bet the "institutional review board" that the editor accepted at face value was really just Facebook's "internal review process." That isn't the same thing. IRBs are defined and constrained by Title 45 Code of Federal Regulations Part 46 and regulated by the Office for Human Research Protections within HHS.

"Institutional Review Board" isn't a throwaway title.

From a section of the appropriate Code of Federal Regulations:

§46.101 To what does this policy apply?
(a) Except as provided in paragraph (b) of this section, this policy applies to all research involving human subjects conducted, supported or otherwise subject to regulation by any federal department or agency which takes appropriate administrative action to make the policy applicable to such research. This includes research conducted by federal civilian employees or military personnel [emphasis added], except that each department or agency head may adopt such procedural modifications as may be appropriate from an administrative standpoint. It also includes research conducted, supported, or otherwise subject to regulation by the federal government outside the United States.

(1) Research that is conducted or supported by a federal department or agency, whether or not it is regulated as defined in §46.102, must comply with all sections of this policy.

(2) Research that is neither conducted nor supported by a federal department or agency but is subject to regulation as defined in §46.102(e) must be reviewed and approved, in compliance with §46.101, §46.102, and §46.107 through §46.117 of this policy, by an institutional review board (IRB) that operates in accordance with the pertinent requirements of this policy.

-http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html

According to the Cornell Chronicle:

The study was funded in part by the James S. McDonnell Foundation and the Army Research Office. Other investigators included Jamie Guillory, a Cornell postdoctoral associate when the project began who now works at the UCSF Center for Tobacco Control Research and Education, and Adam D.I. Kramer of Facebook.

-http://www.news.cornell.edu/stories/2014/06/news-feed-emotional-contagion-sweeps-facebook

The McDonnell Foundation is private, and Facebook is its own player. It's debatable whether the research would be subject to the federal codes of ethical conduct if that were the extent of it, but it isn't. This research received federal funds through the Army Research Office, so it inarguably should have been truly IRB-vetted. Whether it was or not is still in question.
 

Plot Device

A woman said to write like a man.
Super Member
Registered
Joined
Apr 14, 2007
Messages
11,973
Reaction score
1,867
Location
Next to the dirigible docking station
Website
sandwichboardroom.blogspot.com
Facebook was founded as a psych experiment of sorts. Why would it be a surprise that they'd ratchet it up? :Shrug:

That's not the question.

The question isn't: "Gee, are you surprised too?"

The question is: "Would you agree that this is morally reprehensible, ethically unacceptable, and bordering on illegal?"
 

veinglory

volitare nequeo
Self-Ban
Registered
Joined
Feb 12, 2005
Messages
28,750
Reaction score
2,933
Location
right here
Website
www.veinglory.com
The journal should not rest on the review board; its ethics are in its own hands. I have seen journals reject papers on ethical grounds dozens of times, and they were right to do so. Institutional ethical review is hit and miss.

This is PNAS, FFS. Unless I am missing something, their choice to run this is bizarre.
 

Pyekett

I need no hot / Words.
Kind Benefactor
Super Member
Registered
Joined
Jan 30, 2011
Messages
1,290
Reaction score
202
Location
Translated.
Yeah, it's ugly. I agree PNAS did more than drop the ball. They stabbed it with a knife, set it on fire, and rolled in the melted debris.

However, I am interested in what will happen when someone figures out that there was research done using federal military funds for experimentation on human subjects without IRB vetting. That is a breach of federal law, and it isn't much of a matter of interpretation.
 

kaitie

With great power comes
Super Member
Registered
Joined
Sep 10, 2009
Messages
11,050
Reaction score
2,637
Didn't PNAS also run that hurricane-name article that pissed me the hell off recently? Honestly, I'm not that impressed with them at the moment. I feel like these guys should have higher standards.
 

cornflake

practical experience, FTW
Super Member
Registered
Joined
Jul 11, 2012
Messages
16,171
Reaction score
3,734
I...they did what? Someone published it too? What?

What the hell IRB approved this shit? An IRB I'm familiar with will send people back to the drawing board two, three, four times for non-invasive, non-deceptive studies on pools known to be willing subjects, because the parameters are a little unclear or there's not enough control over what might maybe could possibly be a response.

As noted, you need research subjects to consent. You can deceive them as part of the experiment, but that's what the IRB is for (besides making everyone's life hell ;) ). They determine what's OK and what's over the line.

This is just nuts. It also seems like crappy research, but that's kind of entirely beside the point.
 

Pyekett

I need no hot / Words.
Kind Benefactor
Super Member
Registered
Joined
Jan 30, 2011
Messages
1,290
Reaction score
202
Location
Translated.
There was no IRB. The editor has clarified that she thought the authors meant an IRB when she was told that the local review board had signed off on it. The review in question was a process internal to Facebook.
 

kaitie

With great power comes
Super Member
Registered
Joined
Sep 10, 2009
Messages
11,050
Reaction score
2,637
Okay, this whole mess makes me respect this editor less and less. Every person I've talked to about this who has even remote knowledge of research has said, "How the heck did this ever get passed?"

The editor should have called them on it, and should have checked when she was told that the study had been approved. This just sounds very irresponsible to me. It should have raised a ton of red flags, and a ton of red flags should make it harder, not easier, to get a pass on something like this.