Feature

Accessing LGBTQ+ Content in One US State

The Role of CIPA and Internet Filters

Passed in 2000, the Children’s Internet Protection Act (CIPA) required public schools and public libraries to use a technology protection measure to limit minors’ access to various types of content, though the specific implementation of the law was left up to individual institutions. In the subsequent 20+ years, internet filters have been used to block access to a wide range of content, including some that was never intended to be covered by CIPA. In this research project, we tested internet filters in public libraries across one Southern US state by examining whether we could access LGBTQ+ content; this data was then supplemented with interviews of library staff. We discovered that LGBTQ+ content was not inappropriately blocked but was in fact overwhelmingly accessible. Though previous research indicated LGBTQ+ content was blocked in some public libraries, this study did not corroborate those findings. It appears that the implementation of internet filters to comply with CIPA has become less controversial and more routine than has been depicted.

Courts have recognized a government interest in protecting children from inappropriate or indecent speech that would otherwise be protected by the First Amendment (e.g., Ginsberg v. New York 1968). However, doing so in an online environment has proven difficult. One way that some nations, including the US, have dealt with the explosion of online pornography and explicit content is with laws mandating internet filtering. In this context, internet filtering refers to software that blocks particular content. It typically functions by classifying websites into various categories and then blocking whichever categories are selected (see below).

Congress attempted various approaches to restricting minors’ access to explicit online content, some of which were overturned by the Supreme Court. However, the Children’s Internet Protection Act (CIPA) was upheld by the court system and went into effect in 2003. The law focused on public libraries and public schools, often the primary sources of internet access for youth at the time. There have been numerous reports of overzealous internet filtering in these institutions since 2003, but most data pertaining to internet filtering is outdated and incomplete. The project described here offers new data, focusing on access to lesbian, gay, bisexual, transgender, and queer/questioning (LGBTQ+) information, as well as a fresh perspective on the implementation of internet filtering in public libraries.

The paper proceeds as follows: the next section describes the policy background of internet filtering, CIPA itself, and the overall efficacy of internet filtering. It also includes a brief overview of access to information for the LGBTQ+ community. The subsequent section outlines the methods used to collect data. The following section details the findings of the project, followed by a discussion and conclusion.

Literature Review

Policy Background

In 1996, Congress enacted the Communications Decency Act (CDA)1 as part of the Telecommunications Act; this was Congress’ first attempt to regulate pornography and obscenity on the internet. The CDA prohibited the transmission of obscene or indecent messages or images and the sending or displaying of “patently offensive” sexual messages to minors. However, the Supreme Court overturned the CDA, in part because “many terms within the CDA created uncertainty among internet users” (Wardak 2004, p. 683; Reno v. ACLU, 1997). Furthermore, the application of “contemporary community standards” is difficult at best in a global medium such as the internet. In summary, “the Court found that the terms of the CDA were overbroad and not narrowly tailored, thereby rendering the statute an unconstitutional limitation on free speech” (Wardak 2004, p. 684). As Peltz-Steele (2002) explained, “The Court observed that ‘the “community standards” criterion as applied to the Internet means that any communication available to a nation-wide audience will be judged by the standards of the community most likely to be offended by the message,’ an impermissible ‘least common denominator’ approach” (p. 421).

After this judicial defeat, Congress tried again to regulate minors’ access to content on the internet with the Child Online Protection Act (COPA). Wardak (2004) wrote, “For COPA to apply, the materials must (1) depict or represent, in a ‘patently offensive’ manner as pertains to minors, sexual acts or body parts of minors, (2) have been intended to appeal to a prurient interest of minors, and (3) ‘lack serious literary, artistic, political, or scientific value for minors’” (pp. 685–86). One of the primary differences with COPA was that it focused on material “harmful to minors,” a more narrowly defined category of information. In addition, it defined minors as those under the age of 17 (not 18, as the CDA had done). Nonetheless, in Ashcroft v. ACLU (2002), plaintiffs challenged the constitutionality of the law. After a district court enjoined the law and an appeals court upheld the injunction, the case reached the Supreme Court, which remanded it to the appeals court, where the law was again struck down for overbreadth. Because COPA was a content-based restriction of speech, it was subject to strict scrutiny by the courts (Peltz-Steele 2002). The Third Circuit, in addition, determined that “contemporary community standards” has no functional meaning online because web publishers cannot limit access to their content based on geographical location (Peltz-Steele 2002). In 2004, the Supreme Court affirmed this ruling.

Children’s Internet Protection Act

Peltz-Steele (2002) noted, “Faced with courts troubled by efforts to silence speakers on the internet, and by restrictions that treated adults and children alike, Congress needed a bill that could (1) target recipients of communication rather than speakers; (2) treat adults differently from minors; and (3) offer a minimally restrictive means to identify unprotected content as to adults and minors respectively” (pp. 425–26). With these needs in mind, CIPA was developed and passed in 2000. According to this law, all public schools and public libraries that receive certain federal funds must install a “technology protection measure” to prevent minors from accessing images that are child pornography, obscenity, or “harmful to minors.” While child pornography and obscenity have a long (though sometimes contested) history of falling outside First Amendment protection, the category of “harmful to minors” referred to a visual depiction that:

(A) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (B) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and (C) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.

This definition evolved from case law based on Ginsberg v. New York (1968).

CIPA defines a technology protection measure as an internet filter; to comply with the law, all computing devices in an affected institution must be filtered (not only those used by minors). This requirement is tied to federal E-rate funding, which helps public schools and public libraries afford internet access and other telecommunication products and services (USAC 2023). In addition to CIPA, 26 states have enacted further laws requiring internet filtering in public schools and/or public libraries (National Conference of State Legislatures 2016). CIPA does not offer guidance on how to evaluate whether a visual depiction is obscene or harmful to minors. The law “delegates these decisions to local authorities (e.g., school administrators and library directors) who were (and are) free to select, configure, and implement a filter to meet their needs” (Peterson, Oltmann, and Knox 2017, p. 4587; see also Minow 2004).

The American Library Association (ALA) brought a lawsuit to challenge CIPA in the early 2000s.2 They argued that internet filtering went against core values of librarianship, such as access and intellectual freedom (https://www.ala.org/advocacy/intfreedom/corevalues). The ALA further argued that internet filtering was akin to censorship, as information protected by the First Amendment would inevitably be blocked. In the District Court of the Eastern District of Pennsylvania, the judges ruled that CIPA was unconstitutional because it restricted speech in a public forum; they issued an injunction to block the statute (ALA v. United States 2002).

Due to a provision in CIPA, the federal government appealed directly to the Supreme Court, which in 2003 upheld the constitutionality of CIPA by a plurality. The three dissenters all noted the likely unconstitutionality of permanent filters. In separate concurring opinions, both Justices Kennedy and Breyer “reasoned that the statute should be upheld primarily because of the disabling function” (Desai 2023, para. 8). Similarly, Klinefelter (2010) explained, “eight of the Justices found the ability of adult patrons to gain access to protected internet speech to be important to the constitutionality of the library’s use of internet filters” (p. 362). One persuasive argument in support of CIPA relied on congressional authority to regulate how federal funds are spent. Under this view, Congress was attaching conditions to federal funding (the E-rate program), and libraries and schools could choose whether or not to accept those conditions.

The Efficacy of Internet Filters

Most internet filtering software is produced by for-profit companies, such as CYBERsitter and Net Nanny. As a result, the exact methods used to filter access are proprietary and not public knowledge. There are a variety of ways that internet filtering can be implemented, but perhaps the most common approach is to install filtering software at the system level (i.e., across all machines at a public library). Filters typically work by preventing users from accessing sites that have been blacklisted while allowing access to all other sites. Users receive an error message when they try to access a blocked site.

Generally, filters group blocked sites into categories such as adult themes, alcohol, gambling, and so on (see Peterson, Oltmann, and Knox 2017 for examples of actual categories from filtering companies). This sampling of categories clearly does not align neatly with the content prohibited by CIPA; in fact, all of the categories listed above consist of speech protected by the First Amendment. Because these categories do not map neatly onto the law, filtering becomes “inherently subject to the normative and technological choices made during the software design process” (Deibert et al. 2008, p. 372; see also Brown and McMenemy 2013). Internet filters are well known to have two shortcomings: they both under-block and over-block content (e.g., Cooke, Spacey, Creaser et al. 2014; Cooke, Spacey, Muir et al. 2014; Deibert et al. 2008). Some content that should not be allowed gets through, while content that should be allowed is blocked; past research suggested that filters over- or under-block 15–20 percent of the time (Batch 2014).
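To make the category mechanism concrete, the following is a minimal sketch of how such blocking works, using invented domains and category assignments (real products rely on large proprietary databases); note how misclassified or uncataloged sites produce exactly the over- and under-blocking described above.

```python
# Minimal sketch of category-based filtering. The domains and category
# assignments below are invented for illustration; commercial filters
# maintain proprietary databases with millions of classified sites.
SITE_CATEGORIES = {
    "example-casino.com": "gambling",
    "example-brewery.com": "alcohol",
    "example-clinic.org": "health",
}

# Categories the local institution has chosen to block.
BLOCKED_CATEGORIES = {"gambling", "alcohol"}

def is_blocked(domain):
    """Return True if the domain's assigned category is on the blocked list.
    A misclassified site gets over-blocked; an uncataloged one slips through
    (under-blocked), since its category lookup returns None."""
    return SITE_CATEGORIES.get(domain) in BLOCKED_CATEGORIES

print(is_blocked("example-casino.com"))  # True: the patron sees an error page
print(is_blocked("example-clinic.org"))  # False: the page loads normally
```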

Research testing the efficacy of internet filters is both somewhat limited and dated (see, e.g., Heins et al. 2006). For example, Chou et al. (2010) tested the efficacy of three top-ranked internet filters and found that all were outperformed by text mining3 approaches. Some researchers have examined whether internet filtering is effective in protecting minors, but the data “fails to provide support for governmental and industry advice regarding the assumed benefits of filtering for protecting minors online” (Przybylski and Nash 2017). The American Library Association (2006) states, “Content filters are unreliable because computer code and algorithms are still unable to adequately interpret, assess, and categorize the complexities of human communication whether expressed in text or image” (para. 3).

It is unclear exactly how widespread internet filtering is in US public libraries (though the picture may be clearer in public schools).4 Estimates vary widely and tend to be dated. Jaeger and Yan (2009) estimated that at least 51.3 percent of public libraries used internet filters and that 100 percent of schools did. In contrast, Kolderup (2013) reported that 65 percent of public libraries were filtering by 2005. The Institute of Museum and Library Services (IMLS) estimated that 73 percent of public libraries received E-rate discounts in 2014 and that over 90 percent of libraries had used E-rate at least once in the previous eleven years (Institute of Museum and Library Services 2014); according to CIPA, all of those libraries would have to certify they were using filters. It is troubling that CIPA mandates internet filtering yet there seems to be no hard data on compliance in libraries or schools. It is also important to note that millions of Americans lack reliable personal computing devices and/or reliable, ongoing access to the internet. Because of this, “the constraints and consequences of Internet filtering (a) affect many people and (b) especially impact the poor, elderly, and less-educated individuals who are less likely to have home broadband” (Peterson, Oltmann, and Knox 2017, p. 4588).

In February 2011, the American Civil Liberties Union (ACLU) launched its “Don’t Filter Me” campaign, designed to uncover, and then rectify, cases where school libraries were filtering LGBTQ+ content. The ACLU’s final report (2012) explained that the campaign was launched “after hearing reports from students across the country that their schools’ web filtering software was programmed to block these LGBT-supportive resources while at the same time allowing free access to websites [that] condemned homosexuality or opposed legal protections for LGBT people” (p. 3). Most schools, when contacted about this discrepancy, changed their filtering settings, but the ACLU and supportive organizations had to go to court to get a preliminary injunction (PFLAG v. Camdenton R-III School District, 2012) against a school district in Missouri. This set a precedent, at least in public schools, that sites should not be filtered merely because they are supportive of LGBTQ+ individuals.

One study (Peterson, Oltmann, and Knox 2017) examined filtering implementation in detail in one particular state, Alabama. In their research, “no two implementations of the same system had the same selection of common categories, and no two filtering systems had the same category set” (p. 4596); each library and school had a different filtering configuration. Notably, several libraries blocked access to content about the LGBTQ+ community; some blocked a category titled “alternative lifestyle,” a phrase commonly used to denote LGBTQ+ individuals and communities.

Anecdotal evidence suggests that some libraries have blocked, or continue to block, access to LGBTQ+ content online, though these reports usually focus on K-12 school libraries. Quillen (2011) reported that school districts in Georgia and Missouri blocked access to educational LGBTQ+ sites and faced potential lawsuits over their filtering (see the ACLU’s “Don’t Filter Me” campaign). In 2021, Utah students requested access to blocked LGBTQ+ sites, which local school administrators quickly granted (Deininger 2021). Similarly, in 2022, the Katy Independent School District (ISD) in Texas faced a complaint from the ACLU because its internet filters blocked access to LGBTQ+ content such as the Trevor Project (which supports LGBTQ+ teens and adults facing bullying and ostracism). Katy ISD changed the filter settings in high schools and some middle schools following the complaint (Williams 2022).

Despite losing the court case in 2003, the ALA still opposes internet filtering in public libraries. Its position statement explains:

CIPA-mandated content filtering has had three significant impacts in our schools and libraries. First, it has widened the divide between those who can afford to pay for personal access and those who must depend on publicly funded (and filtered) access. Second, when content filtering is deployed to limit access to what some may consider objectionable or offensive, often minority viewpoints, religions, or controversial topics are included in the categories of what is considered objectionable or offensive. Filters thus become the tool of bias and discrimination and marginalize users by denying or abridging their access to these materials. Finally, when over-blocking occurs in public libraries and schools, library users, educators, and students who lack other means of access to the Internet are limited to the content allowed by unpredictable and unreliable filters (para. 8).

The LGBTQ+ Community’s Access to Information

For decades, before the advent of the internet, information about and for the LGBTQ+ community was difficult to come by, especially outside of major metropolitan areas. Individuals in the LGBTQ+ community often relied on personal conversations and references. During the post-WWII period, early affinity groups began forming, such as the gay male-focused Mattachine Society and the lesbian-focused Daughters of Bilitis; many of these affinity groups published newsletters and magazines, available via subscription to local or national audiences and often passed from individual to individual (see, for example, Johnson 2019). ONE magazine, founded by members of the Mattachine Society, was initially seized as obscene material but was eventually protected by the Supreme Court (following Roth v. United States, 1957; One, Inc. v. Olesen, 1958). In 1962, another Supreme Court case (MANual Enterprises v. Day) further protected the legality of LGBTQ+ publications, which cemented the practice of “newsletters and publications circulated from reader to reader” (Brooks 2019, para. 1; see also Meeker 2006). The Advocate, the oldest continually publishing LGBTQ+ publication, began in 1967 (Angelo 2015). In 1969, The Washington Blade (originally called The Gay Blade) began publishing; it has been called the “gay publication of record” because of its comprehensive coverage (Angelo 2015).

LGBTQ+ bookstores began opening and flourishing in the 1960s and often functioned as de facto community centers; the first were located in Philadelphia, New York City, Washington, DC, and San Francisco (Brooks 2019; Hogan 2016). Most content came from small gay and lesbian publishers, such as Alyson Books (Hogan 2016). By the early 1970s, most concern about obscenity was focused on hard-core, unsimulated sex portrayals (especially of gay male sex); LGBTQ+ publications were legal, but their distribution still involved considerable gatekeeping.

As societal changes occurred (such as the Stonewall Riots of 1969, the American Psychiatric Association’s changing stance on homosexuality [removing it from a list of mental disorders in 1973], and the 1977 election of Harvey Milk, one of the first openly gay elected officials in the US), information about the LGBTQ+ community continued to be difficult to obtain; these same changes also sparked backlash across the US (Rosen 2014). Throughout the 1980s and 1990s, LGBTQ+ publications persisted, as did the community, though information about LGBTQ+ individuals or groups remained difficult for many to find. This necessarily abbreviated history demonstrates, in part, how scarce information was for many people in the pre-internet period.

In many ways, the internet has enabled broader distribution of more information to more people. As Last (2019) wrote, the internet

allows LGBT+ [individuals] to connect beyond geographic and physical boundaries, and to reduce the feeling of isolation that can so commonly be part of the LGBT+ experience. . . . Social media has also helped to amplify the voices of those who have previously been marginalised and sidelined – and this new prominence has undoubtedly contributed to increasing acceptance (paras. 5–6).

Information about sexual and gender identities, coming out, getting married and starting a family, and other issues is now present online and presumably accessible to many. Nonetheless, there has been pushback about the availability of LGBTQ+ related information, and it is unclear just how accessible it is, particularly to minors in the US.

As this literature review demonstrates, there are significant gaps in our knowledge about internet filtering in public libraries. We do not know the full extent of filtering in public libraries, how it is implemented, or the effects of its implementation (for example, what sort of information is restricted). Furthermore, most research into internet filtering is several years old; we lack current information on these questions in particular.

Methods

To address these gaps in the research, a multi-pronged methodology was developed: internet filtering was tested in 26 different library systems, and 11 library staff were interviewed about their perspectives on internet filtering. This took place in three phases: first, libraries that agreed to participate were visited; second, some library staff (who consented to participate) were interviewed; third, libraries that had not volunteered to participate were visited. These steps are further explained below.

Testing the implementation of internet filters required several steps. The researcher obtained a list of all public library systems utilizing internet filtering in one southeastern, politically conservative state.5 The researcher then contacted the director of each system (systems are primarily organized by county in this state) to ask if she could visit the library, use its computers as a guest, and subsequently interview staff members who volunteered and consented; each library director then had to send a letter agreeing to participate in the study (these steps were mandated by the researcher’s Institutional Review Board [IRB]). The researcher drove to 13 randomly selected6 libraries that had agreed to participate in order to use their computers, each with its own particular implementation of internet filtering. Since Peterson, Oltmann, and Knox (2017) found that each library implemented filtering in a different way, it was necessary to test each library’s configuration. The 13 randomly selected libraries were located in both rural and urban areas, varied in socioeconomic status, and were all in politically conservative counties (in this particular state, nearly every county is considered politically conservative, as most went for Trump in 2020).
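Note 6 describes the sampling procedure in full; purely as an illustration, a minimal sketch of that kind of systematic sample (with hypothetical library names, and assuming a simple wrap-around count) might look like the following.

```python
import random

def systematic_sample(libraries, fraction=0.25, k=None):
    """Alphabetize the list, then repeatedly count off every k-th entry,
    wrapping around, until about `fraction` of the libraries are chosen.
    Note 6 reports that the randomly drawn interval happened to be k = 7.
    (Sketch assumes k shares no factor with the list length.)"""
    ordered = sorted(libraries)
    k = k or random.randint(2, 10)
    target = round(len(ordered) * fraction)
    chosen, idx = [], -1
    while len(chosen) < target:
        idx = (idx + k) % len(ordered)
        if ordered[idx] not in chosen:  # skip libraries already counted off
            chosen.append(ordered[idx])
    return chosen

# e.g., a quarter of 120 hypothetical county libraries, counting by sevens
print(systematic_sample([f"County {i:03d} Library" for i in range(120)], k=7))
```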

To test the local implementation of internet filtering, a list of websites with LGBTQ+ content was developed in partnership with the researcher’s university office of LGBTQ+ resources. The researcher wanted to investigate whether the state’s public libraries blocked or provided access to LGBTQ+ content. The list was based in part on Nowak and Mitchell (2016), who devised a cataloging system for a physical LGBTQ+ library. Their library subject headings were used as a guide to develop subject headings for this project’s list. This list had ten categories:

  • Famous person
  • Cultural studies
  • Psychology
  • Issues
  • Relationships
  • Religion
  • Sex
  • Anti-bullying
  • Intersectionality
  • Pro-family

To compile the list, a volunteer from the university LGBTQ+ office searched each heading with “LGBTQ”. For example, the first search was “LGBTQ famous person” (without the quotation marks). The volunteer then examined the search results and copied the first ten URLs from distinct domains (for example, if there were two search results from cnn.com, only the first one was included). This was repeated for every category except “pro-family”, which was searched without the LGBTQ prefix; “pro-family” is often used as a euphemism for conservative, anti-LGBTQ+ information. This category was included to see whether libraries blocked pro-LGBTQ+ sites but allowed sites opposed to LGBTQ+ communities. Overall, this process resulted in 100 unique URLs, 90 of which specifically had LGBTQ+ content and ten of which were “pro-family” sites.
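A sketch of this compilation procedure, assuming a generic search function that returns result URLs in rank order (the actual searches were performed manually by the volunteer), might look like this.

```python
from urllib.parse import urlparse

# The nine headings searched with the "LGBTQ" prefix; "pro-family" was
# searched on its own, as described above.
CATEGORIES = ["famous person", "cultural studies", "psychology", "issues",
              "relationships", "religion", "sex", "anti-bullying",
              "intersectionality"]

def first_ten_unique(results):
    """Keep the first URL from each domain, stopping at ten, mirroring the
    de-duplication rule above (only the first cnn.com hit would be kept)."""
    seen, picks = set(), []
    for url in results:
        domain = urlparse(url).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            picks.append(url)
        if len(picks) == 10:
            break
    return picks

def build_url_list(search):
    """`search` is a hypothetical stand-in for the manual web searches."""
    urls = []
    for category in CATEGORIES:
        urls += first_ten_unique(search(f"LGBTQ {category}"))
    urls += first_ten_unique(search("pro-family"))  # no "LGBTQ" prefix here
    return urls  # ten categories x ten URLs = 100 URLs in total
```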

Once the list was complete, and the participating libraries were identified and approved by the IRB, the researcher drove to each library. At each library, the researcher asked to use the library’s computers as a guest, received a guest pass, logged in, and began trying to access the 100 URLs on the list using the Google Chrome browser. Success or failure in reaching each URL was tallied.
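The visits themselves were done by hand, so the only computational piece is bookkeeping. A minimal sketch of how each visit’s tallies could be recorded and summarized follows; the file name and column layout are illustrative, not taken from the study.

```python
import csv

def record_visit(library_id, results, path="access_log.csv"):
    """Append one row per tested URL; `results` maps each URL to True
    (the page loaded) or False (the filter blocked it)."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for url, reachable in sorted(results.items()):
            writer.writerow([library_id, url, int(reachable)])

def access_rate(results):
    """Share of the tested URLs that loaded despite the filter."""
    return sum(results.values()) / len(results)

# e.g., two tallied URLs from one hypothetical visit
visit = {"https://www.glaad.org": True, "https://example.org/essay": False}
record_visit("library-01", visit)
print(f"{access_rate(visit):.0%}")  # 50%
```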

After this process was complete, the researcher cleared the computer cache and logged off, then asked to speak to the director. In that conversation, the researcher identified herself, reminded the director about the research project, and asked the director to circulate an email inviting interested library staff to an interview (again, as directed by the IRB). (This research focused on library implementation of and perspectives on internet filtering, not patron knowledge or perspectives.) Staff who agreed to be interviewed emailed the researcher to set up a mutually agreeable time for a telephone interview. Eleven staff in total, from nine different libraries, were interviewed. These interviews lasted from 5:58 to 23:23 minutes (see table 1). The brevity of some interviews reflects that some respondents found it difficult to talk about an everyday, taken-for-granted piece of software and its implications.

Interviews were audio-recorded with permission, transcribed, and analyzed iteratively using Dedoose software. All interviewees were given randomly generated pseudonyms, and to protect their identities, job titles and library names/locations are not provided in this article. Staff roles varied from front desk worker to technologist to director; many libraries were small enough not to have a designated technologist/technology specialist. Further, we wanted to hear perspectives from a wide variety of workers, not just technologists (and some technologists may not have wanted to be interviewed). From these interviews, 20 codes were developed, as reflected in table 2; note that some excerpts were coded multiple times.

Toward the end of the research process, a third step was added. Because the libraries being investigated were knowingly and willingly engaging in the research, perhaps they were not representative of all public libraries in this state with internet filters. It was possible that only those libraries with particularly light, unrestrictive filtering had agreed to participate, while those that maintained stricter, more restrictive filters had declined. Thus, in this phase of the project, the researcher selected 13 libraries to visit without prior communication about the visit. The researcher found the political leanings (as measured by Trump votes in the 2020 election) of the 13 previously visited communities and identified 13 additional communities with matching political leanings, as sketched below. For example, if 63 percent of a first-round county voted for Trump, a second-round county with a similar voting record was found and matched. (Libraries in this stage similarly varied between rural, suburban, and urban areas and were of similar socioeconomic status.) Because these visits only involved computer use, and interviews were not sought with staff, IRB approval was not needed for this step.
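A minimal sketch of that matching step, pairing each first-round county with the closest unused county by 2020 Trump vote share (the county names and percentages below are hypothetical):

```python
def match_counties(visited, candidates):
    """Greedily pair each visited county with the unused candidate county
    whose 2020 Trump vote share is closest. Both arguments map
    county name -> percent of the vote for Trump."""
    pool = dict(candidates)  # copy so matched counties can be removed
    pairs = {}
    for county, share in visited.items():
        best = min(pool, key=lambda name: abs(pool[name] - share))
        pairs[county] = best
        del pool[best]  # each candidate county is matched at most once
    return pairs

# e.g., a 63 percent first-round county pairs with the nearest candidate
print(match_counties({"County A": 63.0, "County B": 71.5},
                     {"County X": 58.2, "County Y": 62.7, "County Z": 72.0}))
# -> {'County A': 'County Y', 'County B': 'County Z'}
```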

In summary, 13 public libraries that agreed to participate in the research were randomly selected and visited; 13 public libraries that had not agreed to participate in the research were purposively selected and visited; and 11 library staff members were interviewed. The following section describes the results of this process.

Findings

Website Access

Did these public libraries block or enable access to LGBTQ+ content, as represented on the list? Overall, these libraries provided remarkably strong access to this information. Figure 1 shows the rate of successful access to the listed sites at the first set of libraries visited, while figure 2 shows the rate of successful access at the second set of libraries visited. Across all libraries, on average, 95 percent of the sites could be accessed despite the internet filter.

The list of LGBTQ+ websites included news organizations, like CNN and the BBC; commonly known LGBTQ+ organizations, such as Human Rights Watch, GLAAD, and the Trevor Project; academic websites; Wikipedia; journal articles; and lesser-known LGBTQ+ and human rights organizations. No type of site was more or less likely than another to be blocked by the internet filter. Toward the end of the study, one pro-family site was consistently inaccessible, but this was because it was hosted in Singapore and that country blocked access.

Libraries in rural, urban, and suburban areas were visited. Some were in liberal areas, while many were in conservative areas (as demonstrated by the percentage of votes Trump garnered in the 2020 election). There were no differences in rates of access based on the size of the community or the political leaning of the library’s community. Libraries in figure 1 (the first round) had consented in advance to be visited and studied, while libraries in figure 2 (the second round) were visited without prior arrangement. There were no significant differences in rates of access between these two groups of libraries. In other words, the LGBTQ+ content tested here was widely accessible across a wide range of public libraries in this state.

Library Staff Responses

All library staff interviewed for this project were aware of the internet filter in their library, though few had any thorough understanding of how it worked. For example, Winona said, “I know very little. I do know that we have a system that filters based on, I believe it’s based on, the information that is put in, what is pulled from it from the website.” Athena explained, “We have a filter that has certain categories that we have selected, and that it will track, then it will filter them,” and Peg said, “The basics of filtering software [is] we keep the bad stuff out. And occasionally the good stuff gets blocked, and we have to go in and get it changed.” Dorothy tried to explain:

No. I mean, it’s because it’s not on, like, it’s not on content—that doesn’t make sense. It’s not on subject, knowing. I’m not sure how I’m trying to explain this, but it’s the way it’s presented. So, if you’re looking at something like breast cancer, and you go through and you’re not looking for specific pictures, if you go through breast cancer, and you go to find WebMD and then WebMD will have some pictures and those aren’t blocked. But if you type in, I want images of breasts, then it’s going to block that. So as long as the websites that you were looking at were just informational. And they weren’t, you know, “hey, look at all this.”

Similarly, library workers knew that internet filters try “to make sure that patrons aren’t getting on websites that have, like, I guess harmful material, especially for like minors, anything with, like, pornography and stuff like that” (Mary). When pressed, though, library staff could not elaborate on what was really blocked by filters. Dorothy, for example, said, “I’m sure it’s like pornography or anything like that,” while Katherine said, “I don’t know a list. I mean, you know, I could make assumptions about a lot of things. I definitely do not know what the categories are.” Peg suggested that “the biggie is pornography, and anything that’s going to, like, have malware or you know, prone to viruses and that sort of thing.” Respondents indicated that most people trying to access blocked content were adult males; juveniles “just want to play games and stuff” (Beatrix). Samuel, Beatrix, and Kirsten acknowledged they did not know what was blocked by their internet filter. On the other hand, Winona, Athena, and Brennan said only their tech support person would really know about what the internet filter blocked.

Since this study specifically examined access to LGBTQ+ websites, the respondents were asked about their views on this. Athena said, “I don’t see any reason why that specifically needs to be filtered. I don’t think that would make any sense . . . I would not like to see a library filtering that content as a category.” Katherine, likewise, explained, “I don’t think it should be blocked. It’s certainly important information for people in our community, for folks who use our library. If people are searching for sexual, legitimate information, they should be able to get it.” Winona added, “Being the parent of two LGBTQ children, I’m glad that that information is available out there if someone needs it. They shouldn’t have to get permission to go through a filtering service.” Some respondents were surprised that so many LGBTQ+ sites were accessible at their library through the internet filter. For example, Mary said, “I know my director is all about diversity, but this community is not, necessarily . . . I am just really surprised that it wasn’t blocked because a lot of internet filtering just picks and chooses things.” Kirsten elaborated:

I hate to say it, but yeah [I am surprised.] I do know that has been a problem in different internet things I use personally. I know there have been issues where certain tags like the LGBT community in the [school] district have been blocked because apparently, even having material about that topic is just inherently, you know, not safe for work.

Even though these libraries all had internet filters, patrons sometimes still found material that should have been blocked. In those cases, library workers generally first told the patrons to stop viewing “inappropriate” material and, if needed, escalated to temporary bans from the public library. Tasha explained that she would say, “‘I’m sorry, but the content that you’re viewing isn’t appropriate. You know, I’m going to have to ask you to stop viewing the content.’ You know, if they don’t, we kick them off.” Likewise, Kirsten said, “We go speak to the patron and ask them to, you know, we let them know that that’s not appropriate for being in the library and shut them down.” However, most respondents said that accessing inappropriate material happened relatively rarely in their libraries.

Mary indicated that sometimes the internet filter worked in problematic ways:

Well, the systems aren’t really set up to, I guess, work with the way human language and different things are set up. So sometimes it blocks more information or sometimes less information than it’s supposed to. And so, you know, the patrons that are trying to access some material that it’s blocking—if we don’t have an easy way to override it, then they’re not able to get access to information that they should easily have access to.

Tasha also said, “I think it can unintentionally block sites sometimes that are being accessed for a legitimate reason. And that person isn’t always going to ask staff for help.” Samuel said that there are problems when the filter “will probably not allow for very wide access to information that will be used in a practical everyday situation. For example, one of filters can be very sensitive to bananas and interpreted it as something very pornographic.” Beatrix added, “If you’re doing research on something, and you know, it’s not necessarily considered pornography, but it may have nudity, that’s probably a part of the filter. So, you know, from a research standpoint, it could be a disadvantage.” Kirsten said:

I think sometimes people make the filters too restrictive, so that perfectly legitimate material that—because one person or one group of people has deemed something inappropriate, that they can decide that it’s not appropriate for the rest of the community, like LGBT materials. There’s nothing inherently wrong with LGBT materials. Now, there are certain LGBT materials that should not be viewed in public spaces, like pornographic materials. But, you know, there’s nothing inherently wrong with somebody looking up information about the queer community. But it’s the people who are sitting at the filters who decide, ‘oh, that’s inappropriate.’ Because it’s about LGBT materials . . . that goes from being, like, suffering for the public good to censoring really quick.

However, library workers were still overall positive about internet filtering in their libraries—in part because filters allowed these libraries to qualify for E-rate funding. Tasha explained, “We have it in place because we’re required to in order to get the E-rate funding. We have to be CIPA compliant, which means we have to have the filtering to block pornography and stuff like that.” Athena, similarly, said, “We take funding from the federal government and part of the agreement means that we have to comply with laws regarding filtering . . . I think it’s reasonably substantial funding as well, that we receive, to help out with technology [and] connectivity.”

In addition, having reliable, consistent internet filtering protects the library staff. As Katherine said, “From the staff side, one of the arguments [in favor of filtering] was that having to deal with, you know, really vulgar and obscene pornography was a form of harassment or staff harassment.” She added that installing the internet filtering “was a gift with a sense of relief” for the staff. Brennan added that filters are “making sure that [patrons] are complying with the rules, but not making an awkward situation for anybody.”

Finally, library workers discussed whether they saw tension between intellectual freedom (one of the core values of librarianship) and internet filtering. Mary said, “If we don’t have an easy way to override [the filter], then patrons are not able to get access to information that they should easily have access to. And then that borders the line of censorship.” Katherine noted, “In the most broadest [sic] sense, yes. . . . Intellectual freedom means everything that’s available, and people are free to use, read, access whatever they wish. And filtering by definition reduces that.” However, Brennan thought that, on a day-to-day basis, internet filters had little effect on intellectual freedom. He said:

I know when I took classes in library school, I’ve been to conferences, and yeah, filtering has come up . . . they always use something like, you know, maybe breast cancer or some research. That’s an example of something, you know, that could potentially, you know, be filtered out. But it’s, you know, obviously, not something that you want the filter to catch. But I can’t really recall a real-world scenario where we’ve ever had somebody that, you know, came up and said, you know, hey, I’m trying to do legitimate research about this topic or look something up and can’t access information.

Discussion and Conclusion

From one perspective, these findings may not be surprising or worthy of much discussion. In 2022 (when the study was conducted), LGBTQ+ online content was widely available in this state’s public libraries. It may seem self-evident that LGBTQ+ content should be, and is, accessible to communities across the state; American perspectives on LGBTQ+ individuals have generally grown more tolerant in the past 20 years (though there is a sizable minority of Americans who are vitriolic about the LGBTQ+ community) (e.g., Borelli 2022).

Yet access to LGBTQ+ information has a complicated history. For decades, it was notoriously difficult to locate and peruse. The internet did significantly change that, but many people believe that access to LGBTQ+ information should still be restricted in some way. As of January 2023, the ACLU noted that politicians had introduced over 120 bills to restrict the rights of LGBTQ+ people (ACLU 2023). These bills target “their freedom of expression,” among other issues (para. 1).

People have found numerous ways to limit access to LGBTQ+ content. For example, in Michigan, a town voted to defund its public library rather than accept certain LGBTQ+ books in the library (Cantor 2022). In Louisiana, threats from citizens angry about LGBTQ+ content left public librarians afraid to go to work (Chavez 2023). Public library patrons repeatedly challenge the inclusion of LGBTQ+ books in their libraries (see, e.g., Lavietes 2023). Access to LGBTQ+ information, particularly in libraries, is under siege. From this vantage point, access to LGBTQ+ content online is particularly valuable—and perhaps unexpected. In the 2020s, access to LGBTQ+ content in any format cannot be taken for granted.

This study did not explicitly address the efficacy or success of CIPA with respect to keeping content that is “harmful to minors” out of the hands of minors. However, we found that content that is not harmful (that is, non-pornographic LGBTQ+ content) is in fact accessible. This may be a partial indication that CIPA is functioning as intended (or as written).

Nonetheless, several questions remain. The overall efficacy of internet filters remains elusive: do they function as the law intends, as libraries intend, as parents/guardians intend, and/or as the companies that market the filters intend? Data addressing this question is outdated (e.g., Chou et al. 2010) and incomplete. Even as filters have improved and become more nuanced, the quantity of information online has grown exponentially, so it is unclear whether internet filters have kept pace with this explosion of content.

Filter efficacy must also be considered from another perspective: how difficult are the filters to confuse, trick, or overcome? In this project, many of the interviewees had stories about persistent patrons getting past the filter to access content that should be blocked. It is unclear how often this happens or what skill level is needed to outsmart the filter. Furthermore, as both visual content and social media have proliferated, it is unclear whether internet filters can evaluate and restrict these types of content. It is also unclear whether internet filters can successfully block virtual private networks (VPNs), which allow users to route around the filter easily. Recent research (Thurman and Obster 2021) indicates that teens in the UK frequently view pornography via social media and pornographic websites, and nearly half have used VPNs to do so; these UK researchers also note that every legislative approach to regulating pornography access has flaws. CIPA was written prior to the rise of visual content and social media, as well as so-called deepfake or AI-based pornography, so it is unclear how effective internet filters can be as online content continues to evolve and expand. In addition, large language model programming could potentially be used to censor “controversial” content in public libraries, expanding upon the book censorship that is currently escalating across the US.

Of course, the concept of content that is “harmful to minors” is—or ought to be—contested and debated. Some individuals might argue that sites that condone or support firearms, tobacco or drugs, gambling, violence, or hate speech are harmful to minors and ought to be regulated, while other individuals might well be tolerant of some or all of those categories. It is unsurprising that, in America, “harmful to minors” has been defined exclusively as sexual content, with no consideration for violence. In addition, we ought to consider whether (trying to) block minors’ access to harmful content is the best approach; would frank, thoughtful discussion of difficult topics be more beneficial to minors? For example, rather than attempting to ban minors’ access to pornography, perhaps conversations and policies about safer sex practices, erotica, masturbation, sexuality, and sexual/gender stereotypes would be more useful in both the short and longer term. This may be analogous to findings that comprehensive sex education (as opposed to abstinence-only sex education) for minors results in reduced rates of teen pregnancy (e.g., Mark and Wu 2022).

The policy implications of this study are somewhat murky. Because the research did not seek to evaluate the efficacy of internet filters or of CIPA more generally, we cannot make specific recommendations on whether to revise the law as it currently stands. Perhaps more particular guidance about how to interpret CIPA, specifically how to implement the filtering of certain categories of content, would be beneficial for public libraries and schools. Although the ALA still maintains its opposition to internet filtering, it could craft useful guidance for these institutions; for example, the ALA could recommend that categories like “malware” be blocked but categories like “alternative lifestyles” be allowed (these example categories come from Peterson, Oltmann, and Knox 2017). In addition, we still lack information about how widely internet filtering is deployed in public libraries and schools.

References

American Civil Liberties Union. 2012. “Don’t Filter Me Final Report.” https://www.aclu.org/documents/dont-filter-me-final-report.

American Civil Liberties Union. 2023. “Over 120 bills Restricting LGBTQ Rights Introduced Nationwide in 2023 So Far,” January 19. https://www.aclu.org/press-releases/over-120-bills-restricting-lgbtq-rights-introduced-nationwide-2023-so-far.

American Library Association v. United States. 2002. 201 F Supp. 2d 401.

American Library Association. 2001. “ALA Files Lawsuit Challenging CIPA.” http://www.ala.org/advocacy/advleg/federallegislation/cipa/alafileslawsuit.

American Library Association. 2006. “Filters and Filtering.” https://www.ala.org/advocacy/intfreedom/filtering.

Angelo, Pier. 2015. “A Brief History of Gay Newspapers.” South Florida Gay News. https://southfloridagaynews.com/Community/a-brief-history-of-gay-newspapers.html.

Ashcroft v. American Civil Liberties Union. 2002. 535 U.S. 564.

Batch, Kristen R. 2014. “Fencing Out Knowledge: Impacts of the Children’s Internet Protection Act 10 Years Later.” American Library Association Policy Brief No. 5. http://www.ala.org/advocacy/sites/ala.org.advocacy/files/content/FINALCIPA_Report_V4_8%205x11PAGES%20%282%29.pdf.

Borelli, Gabriel. 2022. “About Six-in-Ten Americans Say Legalization of Same-Sex Marriage is Good for Society.” Pew Research Center. https://www.pewresearch.org/short-reads/2022/11/15/about-six-in-ten-americans-say-legalization-of-same-sex-marriage-is-good-for-society/.

Brooks, Laken. 2019. “Giovanni’s Room and the Fate of LGBT Bookstores in a Dying Industry. National Trust for Historic Preservation.” https://savingplaces.org/stories/giovannis-room-and-the-fate-of-lgbt-bookstores-in-a-dying-industry.

Brown, Graeme, and David McMenemy. 2013. “The Implementation of Internet Filtering in Scottish Public Libraries.” Aslib Proceedings 65 (2): 182–202.

Cantor, Matthew. 2022. “US Library Defunded After Refusing to Censor LGBTQ Authors: ‘We Will Not Ban the Books.’” The Guardian, August 5. https://www.theguardian.com/books/2022/aug/05/michigan-library-book-bans-lgbtq-authors.

Chavez, Roby. 2023. “As LGBTQ Book Challenges Rise, Some Louisiana Librarians Are Scared to Work.” PBS News Hour. https://www.pbs.org/newshour/nation/as-lgbtq-book-challenges-rise-some-louisiana-librarians-are-scared-to-go-to-work.

Children’s Internet Protection Act. 2000. 47 U.S.C. 254.

Chou, Chen-Huei, Atish P. Sinha, and Huimin Zhao. 2010. “Commercial Internet Filters: Perils and Opportunities.” Decision Support Systems 48: 521–30.

Cooke, Louise, Rachel Spacey, Claire Creaser, and Adrienne Muir. 2014. “‘You Don’t Come to the Library to Look at Porn and Stuff Like That’: Filtering Software in Public Libraries.” Library & Information Research 38 (117): 5–19.

Cooke, Louise, Rachel Spacey, Adrienne Muir, and Claire Creaser. 2014. “Filtering Access to the Internet in Public Libraries: An Ethical Dilemma?” In Ethical Dilemmas in the Information Society: How Codes of Ethics Help to Find Ethical Solutions, edited by Amélie Vallotton Preisig, Hermann Rösch, and Christoph Stückelberger, 81–92. Geneva: Globethics.net.

Deibert, Ronald, John Palfrey, Rafal Rohozinski, and Jonathan L. Zittrain. 2008. Access Denied: The Practice and Policy of Global Internet Filtering. Cambridge, MA: MIT Press.

Deininger, Michelle. 2021. “Park City Students Fight Online Filters for LGBTQ Searches.” Associated Press, March 21. https://apnews.com/article/technology-race-and-ethnicity-gay-rights-park-city-utah-324a3dae9def9ce5d24f871612c2d7e4.

Desai, Anuj C. 2023. “United States v. American Library Association (2003).” Free Speech Center. https://firstamendment.mtsu.edu/article/united-states-v-american-library-association-2003/.

Ginsberg v. New York. 1968. 390 U.S. 629.

Heins, Marjorie, Christina Cho, and Ariel Feldman. 2006. “Internet Filters: A Public Policy Report.” The Brennan Center. https://www.brennancenter.org/our-work/research-reports/internet-filters-public-policy-report.

Hogan, Kristen. 2016. The Feminist Bookstore Movement: Lesbian Antiracism and Feminist Accountability. Durham, NC: Duke University Press.

Institute of Museum and Library Services. 2014. “New Data: More Than 90% of U.S. Public Libraries Have Used E-Rate.” https://www.imls.gov/blog/2014/04/new-data-more-90-us-public-libraries-have-used-e-rate.

Jaeger, Paul T., and Zheng Yan. 2009. “One Law with Two Outcomes: Comparing the Implementation of CIPA in Public Libraries and Schools.” Information Technology & Libraries 28 (1): 6–14.

Johnson, David K. 2019. Buying Gay: How Physique Entrepreneurs Sparked a Movement. New York: Columbia University Press.

Klinefelter, Anne. 2010. “First Amendment Limits on Libraries’ Discretion to Manage Their Collections.” Law Library Journal 102 (3): 343–74.

Kolderup, Gretchen. 2013. “The First Amendment and Internet Filtering in Public Libraries.” Indiana Libraries 32 (1): 26–29.

Last, Meera. 2019. “How Technology Has Changed the LGBT+ Experience.” Tech Nation. https://technation.io/news/how_technology_has_changed_lgbt/.

Lavietes, Matt. 2023. “Over Half of 2022’s Most Challenged Books Have LGBTQ Themes.” NBC News, April 25. https://www.nbcnews.com/nbc-out/out-politics-and-policy/half-2022s-challenged-books-lgbtq-themes-rcna81324.

Mark, Nicholas D. E., and Lawrence L. Wu. 2022. “More Comprehensive Sex Education Reduced Teen Births: Quasi-Experimental Evidence.” PNAS 119 (8).

Meeker, Martin. 2006. Contacts Desired: Gay and Lesbian Communications and Community, 1940s–1970s. Chicago: University of Chicago Press.

Minow, Mary. 2004. “Lawfully Surfing the Net: Disabling Public Library Internet Filters to Avoid More Lawsuits in the United States.” First Monday 9 (4). http://firstmonday.org/ojs/index.php/fm/article/view/1132/1052.

National Conference of State Legislatures. 2016. “Laws Relating to Internet Filtering, Blocking and Usage Policies in Schools and Libraries.” http://www.ncsl.org/research/telecommunications-and-information-technology/state-internet-filtering-laws.aspx.

Nowak, Kristine, and Amy Jo Mitchell. 2016. “Classifying Identity: Organizing an LGBT Library.” Library Philosophy & Practice April: 1–16.

One, Inc. v. Olesen. 1958. 355 U.S. 371.

Peltz-Steele, Richard J. 2002. “Use ‘the Filter You Were Born With’: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Libraries.” Washington Law Review 77: 397–479.

Peterson, Chris, Shannon M. Oltmann, and Emily J. M. Knox. 2017. “The Inconsistent Work of Web Filters: Mapping Information Access in Alabama Public Schools and Libraries.” International Journal of Communication 11: 4583–4609.

PFLAG v. Camdenton R-III School District. 2012. 853 F. Supp. 2d 888.

Przybylski, Andrew K., and Victoria Nash. 2017. “Internet Filtering Technology and Aversive Online Experiences in Adolescents.” Journal of Pediatrics 184: 215–19.

Quillen, Ian. 2011. “ACLU Puts Pressure on Districts to Ease Internet Filtering.” Education Week, October 17. https://www.edweek.org/policy-politics/aclu-puts-pressure-on-districts-to-ease-internet-filtering/2011/10.

Reno v. ACLU. 1997. 521 U.S. 844.

Rosen, Rebecca J. 2014. “A Glimpse into 1970s Gay Activism.” The Atlantic. https://www.theatlantic.com/politics/archive/2014/02/a-glimpse-into-1970s-gay-activism/284.

Roth v. United States. 1957. 354 U.S. 476.

Savage, David G. 2015. “Supreme Court Faced Gay Rights Decision in 1958 Over ‘Obscene’ Magazine.” Los Angeles Times, January 11. https://www.latimes.com/nation/la-na-court-gay-magazine-20150111-story.html.

Thurman, Neil, and Fabian Obster. 2021. “The Regulation of Internet Pornography: What a Survey of Under-18s Tells Us About the Necessity for and Potential Efficacy of Emerging Legislative Approaches.” Policy Internet 13: 415–32.

Universal Service Administrative Co. (USAC). 2023. “E-rate.” https://www.usac.org/e-rate/.

Wardak, Leah. 2004. “Internet Filters and the First Amendment: Public Libraries after United States v. American Library Association.” Loyola University Chicago Law Journal 35 (2): 657–735.

Williams, Jack. 2022. “Katy ISD Removes LGBTQ Filters from Internet after ACLU Complaint.” Houston Public Media. https://www.houstonpublicmedia.org/articles/education/schools/2022/09/16/433193/katy-isd-has-removed-lgbtq-filters-from-internet-after-aclu-complaint/.


1. The CDA has been back in the mainstream media recently due to Section 230, but this section is not relevant to the analysis of this project.

2. The ALA was the primary named plaintiff in the suit, though “Plaintiffs in the suit include libraries, library users, state library associations and the Freedom to Read Foundation” (ALA 2001, para. 11).

3. The approach used by Chou et al. focused on the contents of webpages, rather than creating URL lists, but they focused on “work-related” and “non-work-related” webpages within the context of a business.

4. The lack of available national data is in sharp contrast to other nations, particularly the UK. Though there is no equivalent to CIPA there, researchers have investigated the rate of internet filtering in public libraries. Brown and McMenemy (2013) reported that all of their respondents (Scottish public libraries) had implemented filtering; blocked content included actually illegal content/activity, potentially illegal content/activity, and content blocked on value-judgment grounds (such as the category “tasteless”) (p. 192). Across the UK, Cooke, Spacey, Creaser et al. (2014) studied the implementation of internet filtering and, again, 100 percent of their respondents reported using filtering. They note that “currently, there appears to be little standardisation, guidance or transparency about measures being taken to prevent misuse” (p. 6). In the US, state library agencies may have comprehensive data for their particular states, but to the best of the author’s knowledge, this information is not aggregated anywhere, nor made publicly available.

5. This list came from the state’s department of libraries. More details cannot be given without revealing the state studied, which could implicate libraries or library workers. Also, because this is a politically conservative state, revealing its identity might prompt state legislators to mandate stricter filtering than currently exists.

6. Counties were listed alphabetically, then a random number (7) was chosen using an online random number generator. Every seventh library was then selected until one-fourth of all filtering libraries had been selected. From this pool, thirteen agreed to participate and completed the documentation required by the IRB.
