Thoughts on the present and future of legal information, legal research, and legal education.
Monday, July 28, 2014
Purge? WTF?! Yet another form of cyberbullying
Wow! Social media has spawned a new form of cyberbullying, and surprise, surprise, it tends to focus on females. The Purge happened worldwide on Facebook between July 17 and 19, with supposedly anonymous posts saying whatever folks liked about anybody they wanted, tagging people and posting images. I think it started as a promo for a new movie that was just being released, "The Purge: Anarchy." For some, judging from a few posts I saw, the Purge may have been about anarchy and fighting power. But apparently it quickly turned into a misogynistic woman-bash, with people posting and trading nude images and videos. Instagram was involved in the Purge as well. The Guardian is on the story; its coverage focuses on Twitter and on the use of images as "revenge porn," where exes post nude images of their former lovers on social media.
The Purge is new enough as a term that it's hard to search for online, but you can find Instagram links, and there is a Twitter hashtag, #stopthepurge. I stumbled on this sad new phenomenon because a 14-year-old from Taunton, Massachusetts (a girl, of course -- did you have to ask?) committed suicide this summer, apparently after something like a purge attack. See the story here from the Patriot Ledger. There seem to have been several local purge attacks that boiled along after the big Facebook one. There was a Brockton Purge, for instance (Brockton is a small town south of Boston), and I found a reference to a Kansas City Purge as well.
Apparently, ex-boyfriends (ex-girlfriends too, I suppose, though I haven't seen an example) who received nude images from women while still in a relationship take revenge after the relationship breaks up by posting those images widely, with ugly commentary. Classy move.
Too late to learn this important lesson: you are going to go through a number of relationships in your life before you (hopefully) end up in a long-term, happy marriage. Don't hand out nude images to everybody you link up with along the way! You might think he's the ONE, but there is just no hurry to supply him with nude images (no matter what he says). If he is Mr. Right, he won't be badgering you for nude pix, honey!
There are a number of posts claiming different numbers of suicides, arrests, and homicides connected with "The Purge." It is not clear how many, if any, really happened. It is true, however, that many teen suicides have been connected to sexting, which is basically what much of the Purge harassment turned into. It's easy to say that a suicide in response to such public shaming is an over-reaction. But on the Internet, you cannot get the image back. Once it's out there, it's out of your control. Even if the first poster regrets his action and removes the post, and even if Facebook removes all the posts that can be found, these images proliferate and scatter beyond recall. That image really is out there, forever.
What a hateful, misogynistic thing.
Thursday, July 03, 2014
Facebook, Secret Experiments and the Belmont Report
Facebook is eating some crow. Again.
Most OOTJ readers will have read about Facebook's data scientist, Adam D.I. Kramer, and two academic partners running an experiment on Facebook users. The results were published in the Proceedings of the National Academy of Sciences (PNAS) as "Experimental evidence of massive-scale emotional contagion through social networks." But what got people riled was the inflammatory language used in the abstract and press releases:
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.

People reacted with outrage, feeling that Facebook had (once again!) abused their membership in that social media giant. They did NOT like being manipulated without their knowledge. The experiment itself was fairly benign: a tweak to the algorithm showed one set of users more positive newsfeed content and another set less positive newsfeed content. The experimenters then monitored the types of posts the various users made and judged whether they became more positive or more negative.
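For readers who want to see the shape of such an experiment, here is a minimal sketch in Python. It is emphatically not Facebook's code: the toy sentiment scorer, the made-up feed items, and the 50% suppression rate are all assumptions for illustration. It only shows the general design of filtering a feed by sentiment for randomly assigned groups and then comparing the tone of what those users go on to post.

```python
# Toy illustration of an emotion-contagion A/B test -- hypothetical, not Facebook's code.
import random
from statistics import mean

POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def sentiment(text: str) -> int:
    """Crude word-count sentiment: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filter_feed(feed: list[str], suppress: str, rate: float = 0.5) -> list[str]:
    """Randomly withhold a fraction of positive (or negative) items from a feed."""
    kept = []
    for item in feed:
        score = sentiment(item)
        if (suppress == "positive" and score > 0) or (suppress == "negative" and score < 0):
            if random.random() < rate:
                continue  # this item is not shown to the user
        kept.append(item)
    return kept

def run_experiment(users: dict[str, list[str]]) -> dict[str, float]:
    """Assign users to conditions and compare the mean sentiment of their later posts."""
    results = {"reduced_positive": [], "reduced_negative": [], "control": []}
    for user, feed in users.items():
        condition = random.choice(list(results))
        shown = feed if condition == "control" else filter_feed(
            feed, suppress="positive" if condition == "reduced_positive" else "negative")
        # Stand-in for observing the user's subsequent posts: here we simply
        # pretend each user echoes the average sentiment of what they were shown.
        results[condition].append(mean(sentiment(i) for i in shown) if shown else 0.0)
    return {cond: (mean(vals) if vals else 0.0) for cond, vals in results.items()}

if __name__ == "__main__":
    fake_users = {f"user{i}": random.choices(
        ["happy great day", "awful sad news", "love this", "hate mondays", "meh"], k=20)
        for i in range(300)}
    print(run_experiment(fake_users))
```

In the real study, of course, the observation step was the users' actual subsequent posts, scored by automated text analysis, not the echo stand-in used here.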
The editor who handled the paper for PNAS, Susan Fiske, has been quoted as finding the experiment creepy and troubling. She did interview the researchers and found that they had cleared the experiment with an Institutional Review Board. Institutional Review Boards (IRBs) are mandated by several federal agencies for any organization carrying out research on human subjects. The Food and Drug Administration (FDA) and the Department of Health and Human Services' Office for Human Research Protections (HHS' OHRP) are the two main agencies, and the most important and useful regulations are 21 CFR Part 56 (the FDA regulations on Institutional Review Boards) and 45 CFR Part 46 (the Common Rule, from OHRP).
The impetus for the development of IRBs and the protection of human research subjects was a series of high-profile, cruel medical research projects through the 20th century that shocked the conscience of the nation. The Belmont Report was the crystallization of a series of meetings by a group of physicians, scientists, ethicists, lawyers, and lay leaders on the problem of how to protect human subjects of all types of research in the future. It is the basis for all subsequent regulations and for the decision-making of IRBs, which are supposed to keep the interests of the research subjects at the center of their deliberations while balancing the interests of researchers, with some thought for the interests of the organization they represent as well. But three principles are supposed to be the primary concern of the IRB:
1. Respect for Persons (requires the researcher to both acknowledge the individual as an autonomous person AND to protect individuals who may be diminished in their autonomous capacity)
2. Beneficence (will the research benefit the research subject?)
3. Justice (who bears the burdens of the research and receives the benefits?)
The IRB then looks at three main issues in the proposed research:
1. Informed Consent (This may be waived in very narrow circumstances: The principle of Respect for Persons requires in most cases that research subjects know what is being proposed to be done to them and have a chance to voluntarily choose to participate or withdraw with no consequences. 45 CFR Part 46.116 lays out the basic requirements for Informed Consent. More on waiver below.)
2. Assessment of Risk and Benefits (The principle of Beneficence requires that the research balance the risks to subjects against the potential benefits, either to the subjects or generally.)
3. Selection of Subjects (The principle of Justice requires that the selection of subjects for the research be done equitably, so that, for instance, not all research ends up being done on poor subjects unless there is a reason related to the topic of research.)
Waiver of Informed Consent
45 CFR Part 46.116(d) allows IRBs to approve research with consent procedures that alter or waive some or all of the general requirements if they find:
(1) The research involves no more than minimal risk to the subjects;
(2) The waiver or alteration will not adversely affect the rights and welfare of the subjects;
(3) The research could not practicably be carried out without the waiver or alteration; and
(4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation.

This was probably the provision under which the IRB approved the waiver, although Facebook's response to user outrage is that users had consented to the research by clicking "agree" when they signed up for their accounts. I do not think such a click amounts to consent for IRB informed-consent purposes, and it certainly has not mollified any outraged users. Kramer has said that the research was undertaken because they wanted to test "the common worry that seeing friends post positive comments causes people to feel left out or negative, or that seeing too many negative posts might stop them from using the site." Yet people felt manipulated and felt that their trust was violated. The research probably does meet IRB/Belmont standards, but the reporting of the research was done in a ham-handed and inflammatory style that left Facebook users feeling used and disrespected. Ideally, after a secret or deceptive research project, subjects are supposed to be informed about the research in a way that helps them, not one that makes them feel used or deceived. This is the Respect for Persons principle.
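Since an IRB must document all four findings before it can waive or alter consent, the test is conjunctive: a single "no" blocks the waiver. Here is a toy sketch of that structure in Python, purely illustrative and not an actual IRB tool (the function name and the data structure are my own invention):

```python
# A toy checklist encoding the four findings required under 45 CFR 46.116(d)
# before an IRB may waive or alter informed consent. Illustrative only.

WAIVER_CRITERIA = [
    "The research involves no more than minimal risk to the subjects",
    "The waiver or alteration will not adversely affect the rights and welfare of the subjects",
    "The research could not practicably be carried out without the waiver or alteration",
    "Whenever appropriate, subjects will be provided with additional pertinent information after participation",
]

def waiver_permissible(findings: dict[str, bool]) -> bool:
    """All four findings must be documented as true; any unmet criterion blocks the waiver."""
    unmet = [c for c in WAIVER_CRITERIA if not findings.get(c, False)]
    if unmet:
        print("Waiver cannot be approved; unmet criteria:")
        for criterion in unmet:
            print(" -", criterion)
        return False
    return True

if __name__ == "__main__":
    # Hypothetical record of what an IRB might have documented for the Facebook study.
    facebook_study_findings = {c: True for c in WAIVER_CRITERIA}
    print("Waiver permissible:", waiver_permissible(facebook_study_findings))
```

The point of writing it out this way is just to emphasize that the waiver turns on the IRB's documented findings, not on anything the subjects clicked when they signed up.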
This is not the first time that Facebook has manipulated and experimented with its users. In September 2012, Facebook reported on an experiment that boosted voter turnout in a midterm election. They divided users 18 and older into three groups.
About 611,000 users (1%) received an 'informational message' at the top of their news feeds, which encouraged them to vote, provided a link to information on local polling places and included a clickable 'I voted' button and a counter of Facebook users who had clicked it. About 60 million users (98%) received a 'social message', which included the same elements but also showed the profile pictures of up to six randomly selected Facebook friends who had clicked the 'I voted' button. The remaining 1% of users were assigned to a control group that received no message.

The researchers then compared the groups' online behaviours, and matched 6.3 million users with publicly available voting records to see which group was actually most likely to vote in real life.

The results showed that those who got the informational message voted at the same rate as those who saw no message at all. But those who saw the social message were 2% more likely to click the 'I voted' button and 0.3% more likely to seek information about a polling place than those who received the informational message, and 0.4% more likely to head to the polls than either other group.

The social message, the researchers estimate, directly increased turnout by about 60,000 votes. But a further 280,000 people were indirectly nudged to the polls by seeing messages in their news feeds, for example, telling them that their friends had clicked the 'I voted' button. “The online social network helps to quadruple the effect of the message,” says [James] Fowler, [political scientist, University of California, San Diego].

(from the online journal Nature, doi:10.1038/nature.2012.11401, link above)

The report notes that only messages involving close real-world friends had the effect of increasing voting activity. The researchers also used real-world voting data to check for those who simply clicked the "I voted" button but did not actually vote. This research did not cause the backlash that the recent experiment did; it did not seem as manipulative or as deceptive to people. There are a few comments in the media considering what would happen if a social media giant were to decide to use such tactics to nudge an election to one side or the other, as opposed to simply increasing voting generally, or how it could affect elections just by increasing voter turnout. (See the New York Times's passing mention from September 2012, and the comment from Hiawatha Bray in the Boston Globe, July 3, 2014, bringing the old research up in the context of the new experiment.)
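To make the three-arm design concrete, here is a small, hypothetical simulation in Python. The group proportions roughly follow the 1% / 98% / 1% split described above, but the baseline turnout rate and the size of the "social message" boost are invented placeholders, not the paper's estimates.

```python
# Toy simulation of a three-arm voter-turnout experiment -- hypothetical numbers,
# not the Nature paper's data.
import random

GROUP_WEIGHTS = {"informational": 0.01, "social": 0.98, "control": 0.01}
BASE_TURNOUT = 0.30    # assumed baseline probability of voting
SOCIAL_BOOST = 0.004   # assumed extra turnout (~0.4 percentage points) for the social arm only

def assign_group() -> str:
    """Randomly assign a user to one of the three conditions."""
    r = random.random()
    if r < GROUP_WEIGHTS["informational"]:
        return "informational"
    if r < GROUP_WEIGHTS["informational"] + GROUP_WEIGHTS["social"]:
        return "social"
    return "control"

def simulate(n_users: int = 100_000) -> dict[str, tuple[int, int]]:
    """Return (voters, group size) per condition for one simulated election."""
    tallies = {group: [0, 0] for group in GROUP_WEIGHTS}
    for _ in range(n_users):
        group = assign_group()
        turnout_probability = BASE_TURNOUT + (SOCIAL_BOOST if group == "social" else 0.0)
        tallies[group][1] += 1
        tallies[group][0] += random.random() < turnout_probability
    return {group: (voters, size) for group, (voters, size) in tallies.items()}

if __name__ == "__main__":
    for group, (voters, size) in simulate().items():
        print(f"{group:13s}: {voters}/{size} voted ({voters / size:.1%})")
```

The striking thing, and the reason commentators worry about platforms and elections, is the scale: applied across tens of millions of users, even a boost of a few tenths of a percentage point adds up to a number of votes large enough to matter in a close election.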