I. Facebook doesn’t care about ethics, so why should you?
The internet nearly broke when researchers published a new study, “Experimental Evidence of Massive-scale Emotional Contagion through Social Networks.” In a psychological experiment involving approximately 700,000 Facebook users, scientists manipulated news feeds to examine the effects of positive and negative posts. Researchers found that Facebook posts influence users’ moods; the general public and research community learned that Facebook does not care about ethics.
To conduct research with human subjects, scholars must typically obtain approval from an Institutional Review Board (IRB). The Facebook researchers neither obtained IRB approval nor received informed consent from participants. Instead, they pointed to Facebook’s Data Use Policy to justify their methods. Data use policies and end-user agreements, in case you don’t know, are those wordy, complicated notices that pop up when users join a service and that most people dismiss within a few scrolls and a click. Anticipating backlash, the journal’s editorial board justified its decision to publish by highlighting the importance of the study and the fact that Facebook is a private company. Those explanations raise more questions than they answer.
As evidenced by the dizzying array of advertisements running along the side of your feed, Facebook is a money-making enterprise. Their motivations differ from those of research universities. Timothy Ryan thoughtfully notes that businesses frequently conduct market research and commercial experiments; issues arise only when individuals translate private findings into public scholarship. And that is the major issue: Why is any group, regardless of industry, allowed to conduct unregulated experiments that may harm human subjects?
II. Move over, Facebook: OKCupid loves unethical experiments too
OKCupid, a popular matchmaking website, didn’t take long to provide a more egregious example of unethical practices. Less than a month after the Facebook controversy, they released findings from three studies. The most deceptive experiment involved the company manipulating compatibility scores (in some cases raising them from 30 to 90 percent) to monitor the effects on interactions between matches. After the experiment, they emailed unwitting users their correct scores. Like Facebook, OKCupid cited their user agreement as justification.
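To make concrete what “manipulating compatibility scores” could look like in code, here is a minimal, hypothetical Python sketch: a toy score computed from shared quiz answers, plus an experimental override that shows a pair of users a different number than the one the algorithm produced. The function names, data, and scoring rule are invented for illustration and bear no relation to OKCupid’s actual system.

    # Hypothetical illustration only; not OKCupid's actual code or algorithm.
    def compatibility(answers_a: dict, answers_b: dict) -> int:
        """Percentage of shared quiz questions answered identically."""
        shared = set(answers_a) & set(answers_b)
        if not shared:
            return 0
        matches = sum(answers_a[q] == answers_b[q] for q in shared)
        return round(100 * matches / len(shared))

    def displayed_score(answers_a, answers_b, experiment_override=None) -> int:
        """The number the pair actually sees; an experiment can silently replace it."""
        true_score = compatibility(answers_a, answers_b)
        return true_score if experiment_override is None else experiment_override

    alice = {"politics": "left", "football": "no", "education": "grad"}
    bob = {"politics": "right", "football": "yes", "education": "grad"}

    print(displayed_score(alice, bob))                          # honest score: 33
    print(displayed_score(alice, bob, experiment_override=90))  # what the experiment shows

The point of the sketch is simply that the number displayed and the number computed need not be the same, and users have no way to tell the difference.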
Unlike Facebook’s experiment, OKCupid’s studies offer dubious scientific value and rest on perverse underlying assumptions. Facebook published their findings in a peer-reviewed journal. While the experiment undoubtedly helps the social media company, it also contributes to a body of scientific knowledge. The researchers did not obtain informed consent; however, they could plausibly argue that the study’s benefits outweighed the risks to participants. There was a hint of beneficence. Capitalizing on the recent debate about the other experiment, OKCupid posted their findings on OKTrends, their own quirky and entertaining marketing blog, and used the attention to advertise an upcoming book.
Christian Rudder, OKCupid’s president and co-founder, published an unapologetic blog post about the experiments. He justified the research, stating, “We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.” Rudder does not address ethics or informed consent. He invokes a coder’s ethos in which curiosity, experimentation, and trial and error are key ingredients. He ignores a key fact: data are people too, and tinkering with code can cause harm. He even belittles internet users, implying they are foolish to expect trustworthiness and transparency from companies.
Let’s think about Rudder’s point and extend it to a more accessible example. We all know that companies conduct market research. When I shop at a popular big-box store, I understand that they gather data on my shopping habits. They track my movements through the store and measure the impact of a new Transformers endcap on my purchases. I also assume that what I see is what I get, that the store is not willfully deceiving me, and that any data they collect will remain anonymous unless I provide informed consent. When I take my Cheez-Its to the checkout counter, I expect to pay the advertised price for a box of delicious cheesy crackers. According to Rudder’s reasoning, the store has the right to change the contents of the box, and the consumer is naive for expecting otherwise.
Now, let’s think about the social media equivalents. Most of us know that websites monitor our browsing habits. During my wife’s pregnancy, we spent nine months searching for baby gear, so I’m not surprised when I see Babies R Us ads splattered across my Facebook page; I get that. There are trade-offs on social media. But if I sign up for a dating website like Match.com (which, incidentally, is how I met my wife), I expect the premise and underlying framework to be trustworthy. I take a quiz. Someone else takes a quiz. Then the website uses a well-tested algorithm to match us. By the logic of the OKCupid scenario, I’m foolish for expecting that. At any moment, I could find out that my wife doesn’t share my political views, doesn’t like watching football, and doesn’t care about my level of education.
Rudder’s comments are shockingly tone-deaf given the current climate around social media ethics and best practices. His lack of remorse or awareness, coupled with his book promotion, adds a level of creepiness. OKCupid’s experiments seem equal parts loose ethics and slimy publicity. Unfortunately for Rudder, even though he dismisses critics, the FTC may be getting involved with OKCupid too.
III. Analog ethics in a digital world
The 20th century provides numerous examples of misguided research that placed individuals at risk, from the Tuskegee syphilis experiment to the Stanford Prison Experiment. In the United States, the National Research Act of 1974, which created the first national committee to establish ethics policy for research on human subjects, was an important step toward protecting participants and limiting the potential for malfeasance. But there’s more work to be done.
As we enter the 21st century, emerging technologies and digital spaces provide new opportunities for abuses of power. I have read a few commentators argue that people have a choice: stay off social media or get informed, they say. That’s a naïve and unrealistic perspective; Facebook and YouTube each attract over a billion users per month, and social media has become a significant part of our lives. Others argue that all social media companies experiment, that social scientists don’t understand the medium, and that they shouldn’t stifle a new golden age of knowledge creation. These arguments echo earlier eras in which a few power-drunk individuals made unsound decisions that affected many.
As social media companies accumulate more power and influence, we should demand increased ethical responsibility and accountability. They cannot reasonably expect users to waive all personal rights through some end-user agreement sleight of hand.
The Facebook and OKCupid studies are important not just because of what occurred but also because of what they portend. Before we have truly lamentable examples of user and data abuse, policymakers need to commission a new set of ethical principles that meet the needs of the 21st century.