February 18, 2022 | Category: HT Learning, Resources
What is Section 230?
Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). Much of the debate around Section 230 has focused on the difference between a platform (an “interactive computer service”) and a publisher. A publisher retains editorial control over what is offered on its website, whereas a platform is simply the place where the speech happens. However, digital free speech experts have argued that this distinction is a distraction and of limited use given how today’s internet actually works. (EFF)
Platforms, Places, Publishers, and Moral Responsibility
To think through some of the issues around free speech and responsibility, let’s take the example of a coffee shop. Think about what responsibility the coffee shop holds in each situation.
Social connection, organizing, and education
- Alix is catching up with four of their friends in a coffee shop. Alix shares that their partner’s workplace has a rampant culture of sexual harassment, racial discrimination, and labor violations. This is a private conversation.
- Alix’s partner Keyosha starts meeting regularly at the coffee shop with Alix’s organizer friends to discuss how they might push for change. At each meeting, a few more of Keyosha’s coworkers join. They start making plans to unionize. The coffee shop staff notice them meeting regularly. This is still a private conversation.
- Keyosha and her coworkers do interviews for an article in the local independent news weekly (the “publisher”). One of Keyosha’s coworkers is quoted as calling their department manager a “predator.” The company’s legal team threatens a lawsuit against the news weekly. The news weekly maintains a stand in the coffee shop.
- Keyosha and the other organizers schedule an event to raise awareness about their concerns. The coffee shop owners agree to let them use the coffee shop as a meeting location. The organizers have a small rally in front of the coffee shop, and then meet inside afterward to have coffee, where they have a table with flyers and handouts about unionized labor.
- The coffee shop owners decide they support the efforts to unionize. Keyosha and the other organizers host a “Labor Rights Now!” event at the coffee shop, in which there is an open mic for folks to share stories, spoken word poetry, and songs about labor rights.
What moral or legal responsibility should the coffee shop have in each of these five situations?
Now imagine the same situations, but with people organizing around asylum-seekers who have been denied status. Or imagine that Alix, Keyosha, and their friends are sex workers organizing for their ability to distribute harm reduction information, for the right to work in teams, or to share “bad date” information.
Sometimes the activity they’re organizing around might not break any laws, but might simply be controversial. Imagine that, in those five situations, Alix, Keyosha, and their friends start hosting a meetup at the coffee shop for 2SLGBTQIA+ individuals in a community with a lot of homophobia and transphobia. What if some 2SLGBTQIA+ adolescents find out and show up to ask questions about how the adults there navigated coming out to their families, or how to let someone know you like them? The youth who sometimes show up say that having access to queer community means they feel less isolated, less suicidal, and more hopeful even in the face of abusive, homophobic families.
Or imagine that Alix and Keyosha are sexual health educators who sometimes rent out the back room of the coffee shop to hold classes, and their classes sometimes include reproductive health information about accessing abortion, or about how to self-manage a medication abortion in a state hostile to abortion rights. What moral or legal responsibility should the coffee shop have in each situation? What if it is a chain coffee shop operating both in states where abortion is legal and in states where it is not?
Child Sexual Exploitation and Child Sexual Abuse Materials
While all of the above scenarios are common uses of the coffee shop that are essential to community and safety, the coffee shop might also be a place where people do concerning things. Teenagers who are dating may exchange sexually-themed selfies or videos of themselves there, not realizing that their consensual selfies might legally constitute “child sexual abuse materials” (CSAM). Sexual imagery might be shared nonconsensually (even if it was created consensually). Adults might be meeting there with minors they are grooming for exploitation, or with other adults to exchange CSAM.

In all of these cases, there would be a range of appearances. An adult grooming a child might look exactly like a parent taking their kid out for a parent-child date, or might look like an adult behaving inappropriately with a child. Discreet sharing of CSAM or nonconsensual images in the coffee shop might be impossible to distinguish from the sharing of legal, consensual materials without violating the privacy of every customer. We’d think it was wildly intrusive if a coffee shop installed listening and/or video recording devices at every table to monitor and record all conversations. And yet we’d also think it was wildly inappropriate if the sharing of CSAM or nonconsensual images was obvious: if someone was viewing them on a laptop in the middle of the coffee shop, projecting them on the wall, or selling them out of a small display at their table. If that were the case, we’d absolutely expect the coffee shop to address the issue immediately.
Legal and Moral Responsibility and Scale
When determining how to hold platforms reasonably responsible for actions that take place in their spaces, we have to balance:
- An expectation for reasonable staffing to monitor gross violations of human rights or law that occur in the space. For example, a manager who staffs a coffee shop that serves 1000 customers per hour with only one employee would be considered neglectful with regard to public safety. That employee couldn’t reasonably be expected to notice even the more public displays of exploitative behavior, much less prevent them or immediately remove people for violations. (As a side note, Facebook has 2.85 billion monthly active users and about 40,000 safety/security staff. A coffee shop with 1000 customers each day and the same ratio of safety staff to users would have about 0.014 staff devoted to safety; see the back-of-the-envelope calculation after this list.)
- An expectation for reasonable privacy of the individuals who use the space. We would never consent to a coffee shop being able to monitor, record, and/or track all our conversations while we were in it as a condition of using it, nor would we ever hold a coffee shop accountable for every illegal or controversial action that took place in it.
- An expectation that responsibility for publishing remains with the publisher. If the news weekly published facts or content that were contested, the publisher and/or author would be held responsible, not the venues that had a news rack in them. Similarly, we would expect any person who shared CSAM or nonconsensual sexual content to be held accountable for their own behavior.
- An expectation that obvious violations would be promptly reported and removed. In the example of our coffee shop, this would mean adequate staffing to notice problems and support safety, an easy way for customers to know whom to report to and how, and a swift, highly responsive protocol for removing reported content and the people sharing it from the shop without further harm to the individual exploited.
- The reality that changes made to address one issue (such as child sexual exploitation) will lead the coffee shop to make decisions that affect all issues and activities that take place there.
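To make the staffing comparison in the first item above concrete, here is a minimal back-of-the-envelope sketch in Python. The figures (2.85 billion monthly active users, 40,000 safety/security staff, a hypothetical coffee shop with 1000 customers) are as cited above, not independently verified.

```python
# Back-of-the-envelope staffing ratio, using the figures cited above.
facebook_monthly_users = 2_850_000_000  # monthly active users, as cited
facebook_safety_staff = 40_000          # safety/security staff, as cited

staff_per_user = facebook_safety_staff / facebook_monthly_users
# ~0.000014 safety staff per user

coffee_shop_customers = 1_000  # hypothetical coffee shop
equivalent_safety_staff = staff_per_user * coffee_shop_customers
print(f"{equivalent_safety_staff:.3f} safety staff")  # prints: 0.014 safety staff
```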
Legal and Moral Responsibility and “Impacted Populations”
We recognize that online platforms are used for a variety of purposes, including marginalized groups organizing for their safety and sharing life-saving sexual health resources. In some rural communities, online platforms may be the only connection people have to this information. During a global pandemic, when marginalized communities are experiencing increased isolation, access to online communications is essential. When we consider who is impacted by legal challenges or changes to Section 230, we must consider the impact of the violence we are trying to reduce (child sexual exploitation) as well as the impact of the changes themselves on marginalized communities. Our legislation, policies, and practices are not neutral: for every change that helps some, others may be put at increased risk. How do we create a safer coffee shop overall, in which children are not exploited, in which privacy is not unnecessarily violated, and in which the other individuals who rely on the coffee shop as a place to share information, strategy, and connection still have access and privacy? And how do we create cultures of increased safety and reduced stigma, so that people do not have to rely on organizing in secret to take care of each other when society has failed them?
Legal and Moral Responsibility and “End-to-End Encryption”
“End-to-end encryption” means that a message is encrypted on the sender’s device and remains encrypted until it is opened on the receiver’s device. It is the equivalent of having a private conversation in the coffee shop that other people cannot overhear or lip-read. With end-to-end encryption, even if the message is intercepted at a server it passes through on its way from one user to another, it cannot be read or decoded. Some people prefer end-to-end encrypted apps, such as Signal, for their private conversations because they are uncomfortable with the level of technological surveillance on other platforms. People choosing to have a private conversation in a coffee shop without other customers listening in is not evidence of criminal intent. Similarly, use of end-to-end encryption does not indicate criminal intent or activity, and should never be treated as evidence of such.
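As a rough illustration of that encrypt-on-the-sender’s-device, decrypt-on-the-receiver’s-device flow, here is a minimal sketch using the PyNaCl library’s public-key Box. The names Alix and Keyosha are borrowed from the coffee shop scenarios; this is a conceptual sketch only, not a description of how Signal itself is implemented (Signal layers key ratcheting and forward secrecy on top of ideas like these).

```python
# Conceptual sketch of end-to-end encryption (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each person generates a keypair on their own device.
# Private keys never leave the device; only public keys are exchanged.
alix_key = PrivateKey.generate()
keyosha_key = PrivateKey.generate()

# Alix encrypts a message for Keyosha on Alix's own device.
sending_box = Box(alix_key, keyosha_key.public_key)
ciphertext = sending_box.encrypt(b"Union meeting at the coffee shop, 6pm")

# Any server relaying `ciphertext` sees only random-looking bytes:
# it cannot read or decode the message in transit.

# Keyosha decrypts on her own device using her private key.
receiving_box = Box(keyosha_key, alix_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Union meeting at the coffee shop, 6pm"
```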
Legal and Moral Responsibility and Evaluation
In 2018, Congress passed significant changes to Section 230 in the form of FOSTA/SESTA. Shortly before FOSTA was signed into law, Backpage.com was seized by federal authorities, and the Government Accountability Office’s (GAO) 2021 report indicates that “gathering evidence to bring cases against users of online platforms has also become more difficult,” as many trafficking posts have moved to foreign-hosted platforms that are not required to comply with US subpoenas. The FBI even reports that its “ability to identify and locate sex trafficking victims and perpetrators was significantly decreased following the takedown of backpage.com.” (GAO) The GAO report also indicated that “as of March 2021, DOJ had brought one case under the criminal provision established by section 3 of FOSTA.” And while FOSTA has produced one charge in three years, individuals who are in the sex trades consensually have reported significant harms as a result, including decreased income, a lack of harm reduction options, a loss of community and the associated mental health impacts of isolation, and exclusion from financial technologies. (Blunt and Wolf, Anti-Trafficking Review) While it is easy to assume a rift between sex workers and survivors of trafficking or CSE, some sex workers may themselves be trafficking survivors using consensual sex work income as a means of independence or survival. Before making additional changes to Section 230, we have a responsibility to evaluate the effectiveness and impacts of FOSTA/SESTA.
Responsibility, Privacy, and Legislation
With this overview of the complex principles at play in Section 230 in mind, the NSN will be offering legislative overviews for clarity on what is and is not included in proposed legislation.
EARN-IT OVERVIEW 2022