Repositories at Master’s Institutions
A Census and Analysis
Deborah B. Henry (henry@mail.usf.edu) and Tina M. Neville (neville@mail.usf.edu) are Librarians at the Nelson Poynter Memorial Library, University of South Florida St. Petersburg.
Manuscript submitted June 13, 2016; returned to authors for revision September 2, 2016; revised manuscript submitted October 11, 2016; manuscript returned to authors for minor revision January 3, 2017; revised manuscript submitted January 19, 2017; accepted for publication March 10, 2017.
Using a population of Carnegie-designated master’s institutions, this study quantified the presence of digital repositories at those institutions. A content analysis of repositories containing some type of faculty content was conducted. Pathways for discovering these collections—including open web searching, inclusion in repository directories, and access through an institution’s website—were also noted. Approximately 20 percent of master’s colleges and universities maintain repositories containing faculty scholarship along with many other types of student work and university documents.
Since Lynch and Lippincott published a comprehensive census of institutional repositories (IR) in 2005, numerous studies have examined topics relating to the growth, development, and content of academic repositories.1 Subsequent investigations often focused on repositories at major research institutions, particularly members of the Association of Research Libraries (ARL) since these institutions were early adopters of IRs.2 Much of the IR literature is survey- or interview-based, soliciting information and experience from librarians, repository administrators, faculty, and students about the maintenance of the repository or user awareness of it.3 Other researchers conducted content analyses of repositories, but many of those projects are dated or considered as a subset of operating repositories in the United States.4 Investigators indicated a need for more research on IRs at smaller academic institutions, analyses comparing faculty and student content, and assessments of scholarly and non-scholarly content.5
Master’s-level colleges and universities occupy a unique position between institutions that focus primarily on teaching undergraduates and those with a dominant research agenda. The majority of repository content at smaller and teaching-oriented institutions may consist of student research.6 Faculty at master’s institutions often carry heavier teaching loads yet still have a strong interest in, and an obligation to conduct, research. As at research-focused universities, faculty at master’s-level institutions may be very interested in promoting their research accomplishments through an IR.
The main purpose of this study was to conduct a thorough census of institutional repositories supported by Carnegie-classified master’s colleges and universities (small, medium, and large programs), thus providing a comprehensive and updated inventory of master’s repositories.7 In addition to documenting the existence of these repositories, this project sought to investigate the type of content that they contained. Given the research expectations at master’s institutions, the study focused primarily on determining the percentage of repositories that contained some type of faculty content, but it also recorded other types of content to allow comparison with previously published studies of academic repositories. A third goal was to analyze discoverability through three possible pathways: an entry for the IR in an established directory (the Registry of Open Access Repositories (ROAR) or the Directory of Open Access Repositories (OpenDOAR)), discoverability through the open web, and links from the home institution’s webpages.8
Literature Review
Censuses
Several authors have attempted to quantify the number and growth of institutional repositories throughout the United States. Lynch and Lippincott conducted the first major study in 2005. Their analysis focused on members of the Coalition for Networked Information (CNI), a joint project of ARL and Educause. Survey respondents came from ninety-seven doctoral-granting institutions and thirty-five liberal arts colleges. At the time of the survey, 40 percent of the CNI members had an IR in place and 88 percent of the remainder planned to implement one. Only two of the liberal arts institutions, however, had a working repository at that time.9
As a follow-up to the 2005 census, McDowell broadened the potential study pool by using ROAR and membership lists from DSpace and bepress’ Digital Commons repository software. She also conducted Google searches of all doctoral-granting institutions and the top-ranked liberal arts colleges to locate as many repositories as possible, regardless of institution size or focus. This study revealed that the IR movement was not limited to ARL or large doctoral-granting institutions. By late 2006, more than half of the repositories in the United States were at institutions with enrollments below 15,000 students, and 53 percent of the seventy-three repositories were at non-ARL institutions.10 A 2006 survey of academic library directors at four-year institutions found that 10.8 percent of the respondents (n = 446) had an established IR and an additional 15.7 percent were actively planning to launch a repository.11 In 2014, the Bishoff Group re-examined non-ARL institutions, noting that 81 percent of respondents were collecting digital content, including some faculty and student research.12
Navigational Studies
Although many institutions register their repositories with directories such as ROAR or OpenDOAR, not all repositories are included in these directories and, even when they are, searchers may not be aware of them. As Crow commented in his early SPARC position paper, “For the repository to provide access to the broader research community, users outside the university must be able to find and retrieve information from the repository.”13 Coates used Google Analytics data to investigate how users were finding electronic dissertations at Auburn University. She compared navigational paths for local and out-of-state researchers. Local users found the dissertations using a variety of methods: links on the university’s website, open search engines, or direct access to the dissertation. External users, however, discovered the dissertations mostly by using open search engines. This finding emphasizes the need for repositories to make their content as accessible as possible to web crawlers.14
Jantz and Wilson found that forty of sixty-three institutions that they examined provided a link to the repository from the library’s website. Of those that included a link, the most common path was via the “scholarly communication” page, with the “for faculty” page as the next most common navigational path.15 Mercer reported that although some libraries linked directly to the repository from their home page, many navigational paths required two to four links to reach the repository.16 St. Jean’s 2011 study used semi-structured interviews to understand how repository users located the site. Although respondents mentioned several discovery methods, the most common method reported was a direct link to the repository from the library’s homepage, with a Google search being the second most common method. When asked why researchers might not use the repository, nearly two-thirds of the respondents noted the resource’s lack of visibility. In fact, one respondent considered the IR to be a “well-kept secret.”17
Repository Size and Platform
Lynch and Lippincott noted the difficulties in comparing repository sizes (number of items) since “it is clear that no two institutions are counting the same things.”18 This is especially true when comparing IRs using different software platforms; however, it has not prevented researchers from attempting size comparisons. In 2005, McDowell discovered a correlation between Carnegie classification and content size of the repository in an analysis of seventy-three repositories. Only the institutions with the highest research classification held more than 500 items in their entire collection.19 By 2009, however, Nykanen located fourteen baccalaureate or master’s institutions with repository counts greater than 500 items.20
There is general consensus that DSpace and Digital Commons are the two most frequently used platforms at American institutions. In studies where researchers reported software platform usage, DSpace installations ranged from 43 to 58 percent with Digital Commons implementations ranging from 21 to 27.8 percent of all platforms identified.21
Repository Content
Detailed analyses of IR content are sometimes hampered by platform interface differences and by an institution’s desire to organize and present its content in ways that reflect its organizational needs. Investigators have analyzed the types of faculty content, the percentage of faculty content relative to the repository as a whole, and faculty participation rates.22 In addition to scholarly publications, non-research content, such as teaching materials, university governance documents, and campus history, has also been considered.23
Studies have examined the size and variety of student content, particularly at institutions where teaching and student research are a priority. Some authors have conjectured that student scholarship provides visibility for undergraduate research and helps with repository growth.24 Student contributions may include electronic theses, capstone projects, student research journals, undergraduate research presentations and posters, and specific course papers and projects.
Hertenstein discussed the effect that repository submissions may have on students’ later attempts to get their scholarship accepted by traditional publishers.25 Presenters at an Association of College and Research Libraries (ACRL) Conference shared comments from faculty mentors about whether student postings of preliminary research preempt faculty from publishing final results in peer-reviewed journals. Faculty also questioned whether repositories clearly differentiate between student and faculty authors.26
Master’s-level Institutions
As previously noted, several studies have attempted large-scale investigations of repositories at non-ARL institutions. Many of these analyses include master’s-level institutions but do not provide detailed breakdowns of size or content by institution type.27 Case studies examining implementation at a single institution are also available.28 Individual case studies are useful exemplars for others considering building or expanding a repository, and the larger census studies give a general sense of the status of repositories at non-research-intensive universities, but none provides the detail or context needed to consider the unique conflicts between teaching and research found at many master’s-level institutions.
Method
The authors obtained a list of small, medium, and large master’s-level institutions from the Carnegie Classification of Institutions of Higher Education and downloaded it into an Excel spreadsheet.29 The list was created on March 6, 2015, and work then began to ascertain how many of those institutions maintained an IR. Various definitions of repositories appear in the literature. The most frequently cited definition comes from Lynch’s 2003 article introducing the concept of institutional repositories:
A university-based institutional repository is a set of services that a university offers to the members of its community for the management and dissemination of digital materials created by the institution and its community members. It is most essentially an organizational commitment to the stewardship of these digital materials, including long-term preservation where appropriate, as well as organization and access or distribution . . . a mature and fully realized institutional repository will contain the intellectual works of faculty and students—both research and teaching materials—and also documentation of the activities of the institution itself in the form of records of events and performance and of the ongoing intellectual life of the institution.30
To conduct analyses of comparable collections and using Lynch’s definition as a guide, the authors created the following definition to direct the focus of this study:
An online, institution-wide or consortial, multidisciplinary repository that includes scholarly works of faculty and students and may also include institutional history and documentation, institution-sponsored publications or partnerships, and other local digitized collections. Only those institutions showing a clear intent to include the scholarly pursuits of faculty and students are included.
Size was not necessarily a factor in the review if the repository met the criteria listed above. Many library websites describe their physical or print collections or provide digitized finding aids to those collections; these were not included in the analysis since the collections themselves were not digitized. A large number of institutions have digitized special collections of images or text that are very narrow in scope and often related to local history or prominent local dignitaries. Although of value to the larger research community, these collections were not included in this analysis because they do not relate to the institution’s scholarly output or administration.
The first review of all institutions was completed on April 29, 2015. Each institution on the list was examined to determine whether it had a repository that fit the authors’ definition. A navigational analysis was performed based on the methods described by Jantz and Wilson.31 The same procedure was followed for each institution, and the results of each step were recorded on the master spreadsheet. First, a Google advanced search was performed using the search strategy: “exact word” (institutional name) AND “any of these words” (repository archive). A well-cited study by van Deursen and van Dijk reported that 91 percent of Google searchers do not go past the first page of results.32 Based on that finding and the need to keep the navigation portion of this study manageable, only the first page of results was examined. Next, the OpenDOAR and ROAR directories were searched. Finally, the authors examined each institution’s main site and the institution’s library homepage to see if there were links to the repository. In addition to searching for a link on the main institutional webpage, the authors examined other institutional pages aimed at faculty, research, or general academics, plus an A to Z list or site index. If no repository was found using any of these steps, the final action was a keyword search of the entire institution’s website for the terms “repository” or “archive.” Again, only the first screen of results was examined.
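The discovery workflow described above lends itself to simple scripting. The following is a minimal sketch, not the authors’ actual procedure, showing how the Google advanced-search strategy could be expressed as a plain query string for every institution on the spreadsheet; the CSV file name and its "institution_name" column are hypothetical.

```python
# Minimal sketch (not the authors' actual workflow): generate the equivalent
# plain-text Google query for each institution so the "exact word" /
# "any of these words" advanced-search strategy can be rerun consistently.
import csv

def build_query(institution_name: str) -> str:
    # Exact phrase for the institution, plus ANY of the terms "repository" or "archive".
    return f'"{institution_name}" (repository OR archive)'

# Hypothetical spreadsheet export with one column named "institution_name".
with open("masters_institutions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(build_query(row["institution_name"]))
```

Each generated query can then be pasted into a browser, and the discovery outcome for that institution recorded on the master spreadsheet.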
Repositories were considered discoverable via Google if they could be reached using no more than one link from the first results page; broken Google links were not counted. The repository name (if applicable) and its URL were recorded. Some independent institutions participated in what appeared to be a consortial or shared repository in which each institution presented collections unique to its organization. Similarly, some master’s institutions that are part of a multi-campus system shared the same platform, each with its own discrete collection of materials. These collections were included in the final analysis as long as the institution’s collection could be accessed independently from the larger group.
If the steps described above failed to identify any semblance of a repository, the institution was recorded as lacking an IR. If an institution had a website or collection that required further investigation, this was recorded and a second review was conducted to carefully determine if the established criteria for this study were met. URLs that failed to open or resulted in the display of an error message after several attempts were not counted in the final analysis. Institutions located outside of the fifty United States, entities that had gone out of business, or those that appeared to have changed from a master’s institution to another Carnegie classification were excluded from final consideration. The initial review of the 137 repositories that met the study’s definition gathered basic descriptive information such as the software platform and a count of the total number of items in the collection, if it could be determined.
A navigational analysis of each library’s website was conducted to locate links to the IR. When available, the following pages were examined: “about the library,” “for faculty,” scholarly communications, collections or resource lists, an A to Z list, digital collections, special collections, news and events, “finding information,” and any discovery tools. Direct links including those from a pull-down menu, a persistent toolbar, or on the main page of a LibGuide were counted.
A more detailed qualitative content analysis of each repository was also conducted, employing content types defined by earlier studies.33 Because software platform features vary considerably, it can be difficult to determine whether a particular type of content is included in a repository, much less to quantify how many items of that type it holds. For this reason, a qualitative approach seemed more practical: if the authors found one faculty-authored journal article, or if one student presentation or thesis was identified, the IR was marked as including that type of content. In addition to peer-reviewed papers, faculty content consisted of books, book chapters, conference presentations, reports, working papers, and data sets. Syllabi or other course-related teaching materials, such as learning objects or assignments, were also found in a few IRs. Student-generated content included theses (both honors and master’s), capstone or class projects, poster sessions, and student journals.
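As a concrete illustration of this presence/absence coding, the sketch below shows one way the qualitative judgments could be recorded per repository. The field names and example values are illustrative assumptions, not the authors’ actual codebook.

```python
# Minimal sketch of the presence/absence coding: a repository is flagged True
# for a content type as soon as one example of that type is observed.
from dataclasses import dataclass, field
from typing import Dict, Optional

# Illustrative content-type labels (not the authors' codebook).
CONTENT_TYPES = [
    "faculty_article", "faculty_presentation", "faculty_book_or_chapter",
    "faculty_report", "faculty_data",
    "student_thesis", "student_project", "student_journal", "student_presentation",
    "syllabi", "course_materials", "governance", "university_publications",
    "library_documents", "media", "hosted_external_journal",
]

@dataclass
class RepositoryRecord:
    institution: str
    platform: str = ""
    item_count: Optional[int] = None  # None when the platform reports no total
    content: Dict[str, bool] = field(
        default_factory=lambda: {t: False for t in CONTENT_TYPES}
    )

    def mark(self, content_type: str) -> None:
        """Flag a content type as present once a single example is found."""
        self.content[content_type] = True

# Example: one faculty-authored article and one thesis are enough to flag both types.
record = RepositoryRecord("Example University", platform="Digital Commons", item_count=1822)
record.mark("faculty_article")
record.mark("student_thesis")
print(sum(record.content.values()), "content types observed")
```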
Results
The total number of master’s institutions downloaded from Carnegie was 724. Of these, institutions that were located in US territories or foreign countries (n = 15) were excluded, as were institutions that appeared to be out of business or whose Carnegie classification could not be verified (n = 7), resulting in twenty-two organizations eliminated from the initial download. Additionally, four institutions appeared to have an IR but the URL could not be opened after repeated attempts, making the final population equal to 698.
The search for IRs was conducted for the remaining 698 universities and colleges. Of these, 190 (27 percent) had a working repository that met at least some of the study’s criteria (190/698). Of those 190 IRs, however, 28 percent (53/190) lacked any type of faculty scholarship, the focus of this study. The final total of qualifying repositories that met the authors’ definition of an IR and included faculty content was 137 (20 percent of 698). Table 1 illustrates the distribution of master’s institutions by type for the final set of repositories. Table 2 provides a comparative breakdown by student enrollment.34
This study also investigated how discoverable these IRs were through four possible avenues: Google, OpenDOAR, ROAR, and the institution’s main website (see table 3). Overall, Google and ROAR provided the most access. In a cross-comparison of the IRs with faculty content, OpenDOAR registered only one unique IR, i.e., one that was neither discoverable by a Google search nor listed in ROAR. ROAR listed five unique IRs, whereas the Google search discovered thirty-one unique IRs. One IR was found only by searching its institutional website. Overall, 72 percent (99/137) of the IRs could be found by more than one method. As illustrated in table 4, navigation to the IR from the library website is often more direct, with most libraries providing a link to the repository not only on the library homepage but also on other library pages.
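The cross-comparison of pathways reported above can be reproduced from a simple pathway-by-IR matrix. The sketch below, which uses invented data rather than the study’s results, shows one way unique and overlapping discoveries could be tallied.

```python
# Minimal sketch with invented data: which discovery pathways surfaced each IR.
found_by = {
    "University A": {"google", "roar"},
    "University B": {"google"},
    "University C": {"google", "roar", "opendoar"},
    "University D": {"campus_site"},
}
paths = ["google", "roar", "opendoar", "campus_site"]

# An IR is "unique" to a pathway when that pathway is the only one that located it.
unique_counts = {p: sum(1 for s in found_by.values() if s == {p}) for p in paths}
multi_path = sum(1 for s in found_by.values() if len(s) > 1)

print("unique to one pathway:", unique_counts)
print(f"found by more than one pathway: {multi_path}/{len(found_by)}")
```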
Ten repository software platforms are represented in the set of 137 IRs containing faculty content, with Digital Commons being the most popular (80, or 58.4 percent). DSpace was the second most heavily used repository software (36, or 26.3 percent), with 15.3 percent using other solutions. Nineteen colleges and universities (13.9 percent) share a platform, while 118 (86.1 percent) maintain their own. Of those that share, eleven are master’s large (57.9 percent), seven are medium (36.8 percent), and one is small (5.3 percent). IRs with faculty content are more likely to use Digital Commons than those lacking faculty content. Digital Commons offers a sophisticated software module expressly designed to store and display faculty profiles and content, which may account for this preference. The platforms used to support these repositories are summarized in table 5.
The total number of items in the repositories containing faculty scholarship ranged from 7 to 57,649. For consistency, the item total reported by the software platform was recorded rather than a manual count. Five repositories’ platforms did not generate item totals and are not included in these data. The most common type of faculty scholarship was the journal article, followed by presentations, books or book chapters, and reports. Although raw research data were more difficult to locate, thirteen IRs contained obvious data files (see table 6). Among types of student scholarship, theses and dissertations were the most common. Capstone or class projects, distinct from theses and dissertations, were the second most common, followed by student journals and presentations (see table 7).
Other types of content, including syllabi, other course-related materials, and library working documents were also noted. University materials, such as minutes, policies, and guidelines, were defined as governance related. Newsletters, catalogs, yearbooks, reports, and other types of university publications were classified separately. Any type of media collection (e.g. images, photos, maps, or audio files) was recorded. Each repository was examined to see if it hosted one or more external journals (see table 8).
Discussion
Census
In one of the earliest censuses, only two of the liberal arts consortial members of the CNI group had an established IR in 2005.35 A broader study in 2006, however, discovered that 19 percent of the master’s-level institutions sampled had already implemented an IR and 32 percent were in the process of implementing one.36 In the current study of all master’s institutions, 27 percent (190/698) had a working IR of any type and 20 percent (137/698) had an IR with faculty content.
McDowell used ROAR and open web searches, along with directories from the major IR software vendors, to compile a list of active IRs. Her 2006 search located seventy-three active IRs with 47 percent of those coming from ARL institutions. McDowell also noted that more than half of the IRs were located at academic institutions with student enrollments below 15,000.37 This project discovered that 79 percent of the IRs with faculty content were supported by institutions with student populations below 15,000.
In this study, collection sizes ranged from a low of seven items to a high of 57,649, with a mean of 4,538 and a median of 1,822. These figures appear consistent with those reported in the literature. For example, Nykanen’s 2009 study showed an average of 2,968 items.38 Xia and Opperman, examining master’s and baccalaureate institutions in 2009, saw a range from four to 7,573 items.39 Mercer’s review of faculty content at ARL institutions found wide variation in size, with a range of eleven to 46,823 items.40
Location and Navigation
A perceived lack of discoverability was noted by Davis and Connolly in interviews with several Cornell faculty who saw the IR as “a single island completely isolated from other institutional repositories.”41 Good metadata and navigational links allow users from any location to find IR content. The current study indicates that IRs are more visible when links are provided on a variety of library webpages, including the homepage. Scholarly communications, faculty, and collections pages continue to be popular gateways to the IR, but more libraries are now adding links on general library pages such as those devoted to services, news, or “about the library.” See table 4 for more information.
Platform
In Hertenstein’s 2013 survey (n = 36) of institutions with established IRs, 43 percent were using DSpace.42 Jantz and Wilson’s 2009 study reported DSpace as the most common platform, with bepress as the second choice.43 Xia and Opperman’s 2009 study of fifty IRs at master’s and baccalaureate institutions also found that DSpace was used most often, followed by Digital Commons.44 In contrast, this study found Digital Commons (a bepress product) to be used much more heavily than DSpace, although the two platforms together continue to dominate IR software implementations. Additional studies by Mercer, Nykanen, Rieh, and Lynch and Lippincott allow direct comparisons of platform use with the current study (see table 9).
Content: Faculty
This study provides a qualitative review of the types of faculty content in 137 master’s IRs (see table 6); that content is similar in nature to the faculty collections described in other studies. Because of the size of the population, quantitative data on the number of items of each faculty content type were not collected here; the results are therefore not directly comparable to the quantitative data included in some smaller studies.45 A future study of a small, randomly selected subset of master’s IRs would enable counts of faculty items, thus providing comparable data.
Content: Student
Rozum’s 2014 survey of librarians working with IRs that contain student content concluded that “libraries are somewhat passive collectors of student research,” willing to take student content but not seeking it in the same way that they push for faculty content.46 While this may be true, other studies have reported that student contributions at master’s and baccalaureate repositories account for a large percentage of the overall content.47 In 2013, Hertenstein’s survey discovered that 92 percent of the institutions with IRs included student content.48
Although the content analysis in this study was limited to IRs that contained faculty scholarship, some type of student content was present in 91 percent (125/137) of the IRs, a result similar to Hertenstein’s. The largest category of student content was theses (93 percent). Fifty-one percent of the IRs hosted some type of student research journal. These results are similar to those of a 2013 study of student content, in which 85 percent of the IRs contained theses or dissertations and 45 percent provided access to student presentations or posters. There appears to be a slight increase in the inclusion of student class papers and projects, with 63 percent (79/125) of the current IRs containing these materials compared to 39 percent of the IRs examined in 2013.49
Content: Other
McDowell found that 4.5 percent of the IRs in her study consisted of non-scholarly content, including marketing materials and university governance documents.50 These materials were a larger part of IRs at institutions with fewer than 10,000 students, comprising 16.9 percent of the content.51 In this study, over 48 percent (66/137) of the IRs contained library materials and 49 percent (67/137) had some sort of university-related governance materials. Syllabi were included in 12 percent (17/137) of the current IRs and course-related materials were present in 26 percent (35/137).
Conclusion
This study benchmarks IR development at Carnegie-designated master’s institutions. Since no other published research has examined this exact population, speculating on the growth of IRs in this segment of the academic community is difficult. Rieh’s early study of 446 four-year institutions found that 118 respondents either had or were actively planning IRs.52 In 2014, Bishoff and Smith reported that 117 (81 percent) of the two-year and four-year master’s and doctoral institutions in their study maintained IRs.53 Rather than examining a sample, this project investigated all Carnegie-designated master’s institutions. Within this population of 698 institutions, the 137 IRs with faculty content and 190 total IRs suggest at least some growth over the past ten years.
The nature of the content appears very similar to that found in other study populations, whether at teaching or research institutions. In general, it appears that faculty scholarship, primarily journal articles and presentations, continues to represent an important part of most repositories. Student content is still primarily theses; other types of student productivity, however, such as student projects and presentations, are also included. This study indicates there may be an increasing interest in content beyond faculty peer-reviewed books and articles. In the current review, 66 percent of the IRs contained faculty working papers and technical reports.
A 2009 study of fifty master’s and baccalaureate institutions was unable to locate much in the way of teaching materials and found just one IR that contained syllabi.54 In this analysis, course syllabi were included in 12 percent of the IRs and nearly 26 percent had other kinds of course-related materials. Nykanen’s examination of the content in ten repositories found that 16.9 percent of the overall content was devoted to university documentation and marketing materials, much of which was produced by the library.55 Some degree of university governance documents and library materials appeared in nearly half of the IRs in this study.
In examining the discoverability of IRs with faculty content, Google searching appears to be the most successful approach and produced the largest number of unique IRs, i.e., those not found elsewhere. The ROAR directory consistently included more repositories than OpenDOAR and had a larger number of unique entries. IR visibility also appears to be increasing on library webpages, with 62 percent (85/137) of the libraries in this study including a link to the IR on their homepage, compared to only four of the forty libraries analyzed in 2006.56
The current study represents a snapshot in time; the creation and development of IRs are continually changing. Differences in platforms and even in IR organizational structure make direct comparisons of size and content difficult. That said, additional analyses of content, such as full-text versus bibliographic-only records or comparisons by discipline, would be useful.
In one of the earliest papers describing the potential of IRs, Lynch commented, “Not every higher education institution will need or want to run an institutional repository, though I think ultimately almost every such institution will want to offer some institutional repository services to its community.”57 This report offers quantitative and qualitative evidence that less than 20 percent of master’s institutions in the United States have established repositories with faculty content, but that those that have contain content similar to that of the other types of institutions previously examined.
References
- Clifford A. Lynch and Joan K. Lippincott, “Institutional Repository Deployment in the United States as of Early 2005,” D-Lib Magazine 11, no. 9 (2005): 1–10, accessed February 21, 2015, http://webdoc.sub.gwdg.de/edoc/aw/dlib/.
- Philip M. Davis and Matthew J.L. Connolly, “Institutional Repositories: Evaluating the Reasons for Non-use of Cornell University’s Installation of DSpace,” D-Lib Magazine 13, no. 3–4 (2007): 1–17, accessed September 1, 2014, http://www.dlib.org/dlib/march07/davis/03davis.html; Ronald C. Jantz and Myoung C. Wilson, “Institutional Repositories: Faculty Deposits, Marketing, and the Reform of Scholarly Communication,” Journal of Academic Librarianship 34, no. 3 (2008): 186–95, https://doi.org/10.1016/j.acalib.2008.03.014; Holly Mercer et al., “Structure, Features, and Faculty Content in ARL Member Repositories,” Journal of Academic Librarianship 37, no. 4 (2011): 333–42, https://doi.org/10.1016/j.acalib.2011.04.008.
- Liz Bishoff and Carissa Smith, “Managing Digital Collections Survey Results,” D-Lib Magazine 21, no. 3–4 (2015): 1–7, accessed June 5, 2015, http://www.dlib.org/dlib/march15/bishoff/03bishoff.print.html; Davis and Connolly, “Institutional Repositories”; Elizabeth Hertenstein, “Student Scholarship in Institutional Repositories,” Journal of Librarianship & Scholarly Communication 2, no. 3 (2014): eP1135, accessed October 6, 2016, http://jlsc-pub.org/articles/abstract/10.7710/2162-3309.1135/; Karen Markey et al., “Institutional Repositories: The Experience of Master’s and Baccalaureate Institutions,” portal: Libraries & the Academy 8, no. 2 (2008): 157–73, https://doi.org/10.1353/pla.2008.0022; Soo Young Rieh et al., “Census of Institutional Repositories in the U.S.: A Comparison Across Institutions at Different Stages of IR Development,” D-Lib Magazine 13, no. 11–12 (2007): 1–13, accessed September 2, 2014, http://dlib.org/dlib/november07/rieh/11rieh.html.
- Ellen Dubinsky, “A Current Snapshot of Institutional Repositories: Growth Rate, Disciplinary Content and Faculty Contributions,” Journal of Librarianship & Scholarly Communication 2, no. 3 (2014): eP1167, accessed January 16, 2015, http://jlsc-pub.org/articles/abstract/10.7710/2162-3309.1167/; Jantz and Wilson, “Institutional Repositories: Faculty Deposits,” 186–95; Cat S. McDowell, “Evaluating Institutional Repository Deployment in American Academe since Early 2005: Repositories by the Numbers, Part 2,” D-Lib Magazine 13, no. 9–10 (2007): 1–14, https://doi.org/10.1045/september2007-mcdowell; Mercer et al., “Structure, Features, and Faculty Content”; Melissa Nykanen, “Institutional Repositories at Small Institutions in America: Some Current Trends,” Journal of Electronic Resources Librarianship 23, no. 1 (2011): 1–19, https://doi.org/10.1080/1941126X.2011.551089; Betty Rozum et al., “We Have Only Scratched the Surface: The Role of Student Research in Institutional Repositories,” ACRL 2015 Conference (Portland, OR: Association of College and Research Libraries, 2015): 804–12, accessed May 4, 2016, http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2015/Rozum_Thoms_Bates_Barandiaran.pdf; Jingfeng Xia and David B. Opperman, “Current Trends in Institutional Repositories for Institutions Offering Master’s and Baccalaureate Degrees,” Serials Review 36, no. 1 (2010): 10–18, https://doi.org/10.1080/00987913.2010.10765272; Hong Xu, “The Current Situation of Faculty Participation in Institutional Repositories—A Study of 40 DSpace Implementations Supporting IRs,” Proceedings of the American Society for Information Science and Technology 44, no. 1 (2007): 1–3, https://doi.org/10.1002/meet.1450440332.
- Markey et al., “Institutional Repositories: The Experience,” 159; Dubinsky, “A Current Snapshot,” 17–18.
- Markey et al., “Institutional Repositories,” 167; McDowell, “Evaluating Institutional Repository Deployment,” 10–11; Nykanen, “Institutional Repositories at Small Institutions,” 13; Yuji Tosaka, Cathy Weng, and Eugenia Beh, “Exercising Creativity to Implement an Institutional Repository with Limited Resources,” Serials Librarian 64, no. 1 (2013): 254–62, https://doi.org/10.1080/0361526X.2013.761066.
- Indiana University Center for Postsecondary Research, The Carnegie Classification of Institutions of Higher Education Interim Site, accessed March 6, 2015, http://carnegieclassifications.iu.edu/.
- University of Southampton School of Electronics and Computer Science, Registry of Open Access Repositories (ROAR), accessed May 21, 2016, http://roar.eprints.org/; University of Nottingham Centre for Research Communications, The Directory of Open Access Repositories—OpenDOAR, accessed May 21, 2016, www.opendoar.org/.
- Lynch and Lippincott, “Institutional Repository Deployment,” 2–4.
- McDowell, “Evaluating Institutional Repository Deployment,” 4–5.
- Rieh et al., “Census of Institutional Repositories,” 3.
- Bishoff and Smith, “Managing Digital Collections,” 2.
- Raym Crow, “The Case for Institutional Repositories: A SPARC Position Paper,” ARL Bimonthly Report, no. 223 (August 2002): 5, accessed June 9, 2015, http://sparcopen.org/wp-content/uploads/2016/01/instrepo.pdf.
- Mildred Coates, “Electronic Theses and Dissertations. Differences in Behavior for Local and Non-Local Users,” Library Hi Tech 32, no. 2 (2014): 285–99, https://doi.org/10.1108/LHT-08-2013-0102.
- Jantz and Wilson, “Institutional Repositories: Faculty Deposits,” 192–93.
- Mercer et al., “Structure, Features, and Faculty Content,” 335.
- Beth St. Jean et al., “Unheard Voices: Institutional Repository End-Users,” College & Research Libraries 72, no. 1 (2011): 29–30, 35, https://doi.org/10.5860/crl-71r1.
- Lynch and Lippincott, “Institutional Repository Deployment,” 4.
- McDowell, “Evaluating Institutional Repository Deployment,” 6.
- Nykanen, “Institutional Repositories at Small Institutions,” 9.
- Hertenstein, “Student Scholarship,” 4; Lynch and Lippincott, “Institutional Repository Deployment,” 6; Mercer et al., “Structure, Features, and Faculty Content,” 335; Rieh et al., “Census of Institutional Repositories,” 9–10.
- Nykanen, “Institutional Repositories at Small Institutions”; McDowell, “Evaluating Institutional Repository Deployment”; Xu, “The Current Situation.”
- Anne M. Casey, “Does Tenure Matter? Factors Influencing Faculty Contributions to Institutional Repositories,” Journal of Librarianship & Scholarly Communication 1, no. 1 (2012): eP1032, accessed May 13, 2016, https://doi.org/10.7710/2162-3309.1032; Lynch and Lippincott, “Institutional Repository Deployment,” 5–6; Xia and Opperman, “Current Trends in Institutional Repositories,” 14.
- Hertenstein, “Student Scholarship,” 11; Nykanen, “Institutional Repositories at Small Institutions” 14, 17; Rozum et al., “We Have Only Scratched the Surface,” 811.
- Hertenstein, “Student Scholarship,” 7.
- Rozum et al., “We Have Only Scratched the Surface,” 810.
- Bishoff and Smith, “Managing Digital Collections”; Dubinsky, “A Current Snapshot.”
- Jonathan Bull and Bradford Lee Eden, “Successful Scholarly Communication at a Small University: Integration of Education, Services, and an Institutional Repository at Valparaiso University,” College & Undergraduate Libraries 21, no. 3–4 (2014): 263–78, https://doi.org/10.1080/10691316.2014.93226; Gregory J. Kocken and Stephanie H. Wical, “‘I’ve Never Heard of it Before’: Awareness of Open Access at a Small Liberal Arts University,” Behavioral & Social Sciences Librarian 32, no. 3 (2013): 140–54, https://doi.org/10.1080/01639269.2013.817876.
- Carnegie Commission, “Interim Site.”
- Clifford A. Lynch, “Institutional Repositories: Essential Infrastructure for Scholarship in the Digital Age,” portal: Libraries and the Academy 3, no. 2 (2003): 328, https://doi.org/10.1353/pla.2003.0039.
- Jantz and Wilson, “Institutional Repositories: Faculty Deposits,” 190.
- Alexander J. A. M. van Deursen and Jan A. G. M. van Dijk, “Using the Internet: Skill Related Problems in Users’ Online Behavior,” Interacting with Computers 21, no. 5–6 (2009): 398, https://doi.org/10.1016/j.intcom.2009.06.005.
- Lynch and Lippincott, “Institutional Repository Deployment,” 5–6; McDowell, “Evaluating Institutional Repository Deployment,” 9–10.
- Carnegie Commission of Institutions of Higher Education, The Carnegie Classification of Institutions of Higher Education Basic Classification Methodology, accessed May 21, 2016, http://carnegieclassifications.iu.edu/methodology/basic.php; Institute of Education Sciences, National Center for Education Statistics. IPEDS, 2013-2014 Final Data, accessed March 30, 2016, https://nces.ed.gov/ipeds/datacenter/Default.aspx.
- Lynch and Lippincott, “Institutional Repository Deployment,” 4.
- Markey et al., “Institutional Repositories: The Experience,” 162.
- McDowell, “Evaluating Institutional Repository Deployment,” 4–5.
- Nykanen, “Institutional Repositories at Small Institutions,” 9.
- Xia and Opperman, “Current Trends in Institutional Repositories,” 14.
- Mercer et al., “Structure, Features, and Faculty Content,” 334.
- Davis and Connolly, “Institutional Repositories,” 13.
- Hertenstein, “Student Scholarship,” 4.
- Jantz and Wilson, “Institutional Repositories: Faculty Deposits,” 192.
- Xia and Opperman, “Current Trends in Institutional Repositories,” 12.
- McDowell, “Evaluating Institutional Repository Deployment,” 10; Nykanen, “Institutional Repositories at Small Institutions,” 13; Xia and Opperman, “Current Trends in Institutional Repositories,” 14.
- Rozum et al., “We Have Only Scratched the Surface,” 811.
- McDowell, “Evaluating Institutional Repository Deployment,” 10; Nykanen, “Institutional Repositories at Small Institutions,” 13; Xia and Opperman, “Current Trends in Institutional Repositories,” 12.
- Hertenstein, “Student Scholarship,” 4.
- Ibid., 5.
- McDowell, “Evaluating Institutional Repository Deployment,” 10.
- Nykanen, “Institutional Repositories at Small Institutions,” 13.
- Rieh et al., “Census of Institutional Repositories,” 3.
- Bishoff and Smith, “Managing Digital Collections,” 2.
- Xia and Opperman, “Current Trends in Institutional Repositories,” 16.
- Nykanen, “Institutional Repositories at Small Institutions,” 13.
- Jantz and Wilson, “Institutional Repositories: Faculty Deposits,” 193.
- Lynch, “Institutional Repositories: Essential Infrastructure,” 335.
Table 1. Number of institutions with repositories
| Carnegie type | Total number of master’s institutions (N = 698) | Institutions with IRs having faculty content (n = 137) |
|---|---|---|
| Master’s Large | 405 (58%) | 96 (70%) |
| Master’s Medium | 176 (25%) | 30 (22%) |
| Master’s Small | 117 (17%) | 11 (8%) |
| Total | 698 | 137 |
Source: The Carnegie Classification. Basic Classification Methodology.
Table 2. IRs by institutional enrollment
| Enrollment (i) | IRs with faculty content (n = 135) |
|---|---|
| 0–5,000 | 38 (28%) |
| 5,001–10,000 | 46 (34%) |
| 10,001–15,000 | 23 (17%) |
| 15,001–20,000 | 13 (10%) |
| Over 20,000 | 15 (11%) |
| Total (ii) | 135 |

(i) Source: Enrollment figures taken from the National Center for Education Statistics.
(ii) Two institutions did not provide student enrollment figures.
Table 3. Discoverability of IRs
| Discovery pathway | IRs with faculty content (n = 137) |
|---|---|
| Google | 112 (82%) |
| ROAR | 86 (63%) |
| OpenDOAR | 50 (36%) |
| Campus website | 18 (13%) |
Table 4. Library website analysis of IRs with faculty content
| Type of library webpage | Number of libraries with page type | IR link found on page |
|---|---|---|
| Library homepage | 137 | 62% (85/137) |
| Digital Projects or Digital Collections page | 70 | 60% (42/70) |
| Scholarly Communications page | 38 | 58% (22/38) |
| Collections & Resources page or Database list | 136 | 51% (70/136) |
| Special Collections page | 115 | 48% (55/115) |
| “For Faculty” page | 108 | 45% (49/108) |
| Services page | 119 | 29% (35/119) |
| “About the library” page | 132 | 25% (33/132) |
| News & Events page | 124 | 23% (29/124) |
| “Finding Information” page or Discovery tool | 134 | 20% (27/134) |
Table 5. Software platform comparison
| Software platform | All IRs (n = 190) | IRs with faculty content (n = 137) (i) |
|---|---|---|
| Digital Commons | 86 (45%) | 80 (58.4%) |
| DSpace | 59 (31%) | 36 (26.3%) |
| Web-based program | 8 (4%) | 8 (5.8%) |
| ContentDM | 26 (14%) | 7 (5%) |
| Islandora | 3 (2%) | 1 (0.73%) |
| Ebrary | 1 (0.5%) | 1 (0.73%) |
| Omeka | 1 (0.5%) | 1 (0.73%) |
| SelectedWorks (bepress) | 1 (0.5%) | 1 (0.73%) |
| Open Repository | 1 (0.5%) | 1 (0.73%) |
| ArchivalWare | 1 (0.5%) | 1 (0.73%) |
| Eprint3 | 1 (0.5%) | --- |
| ContentPro | 1 (0.5%) | --- |
| Irplus | 1 (0.5%) | --- |

(i) Totals may not equal 100 percent because of rounding.
Table 6. Faculty content by type (n = 137)
| Type of faculty scholarship | At least one record in IR |
|---|---|
| Journal article | 126 (92%) |
| Presentations, etc. | 108 (79%) |
| Book or book chapters | 95 (69%) |
| Reports | 90 (66%) |
| Data | 13 (9%) |
Table 7. Student content by type (n = 125)
| Type of student content | At least one record in IR |
|---|---|
| Theses | 116 (93%) |
| Projects | 79 (63%) |
| Student journals | 64 (51%) |
| Presentations | 60 (48%) |
Table 8. Other Content (n = 137)
| Content type | At least one record in IR |
|---|---|
| Course syllabi | 17 (12%) |
| Other course materials | 35 (25.5%) |
| Library-related documents | 66 (48%) |
| University governance | 67 (49%) |
| University publications | 87 (63.5%) |
| Media collections | 89 (65%) |
| Hosted external journals | 51 (37%) |
Table 9. Software platform comparison
| Software platform | Current study, IRs with faculty content (n = 137, 2015 data) | Current study, all IRs (n = 190, 2015 data) | Mercer (n = 72, 2009 data) | Nykanen (n = 14, 2007 data) | Rieh (n = 446, 2006 data) | Lynch & Lippincott (n = 38, 2005 data) |
|---|---|---|---|---|---|---|
| Digital Commons | 58% | 45% | 27.8% | 50% | 26.8% | 21% |
| DSpace | 26% | 31% | 56.9% | 43% | 46.4% | 58% |
| Web-based | 6% | 4% | --- | --- | --- | --- |
| ContentDM | 5% | 14% | 4.2% | --- | 4.9% | --- |
| Islandora | 1% | 2% | --- | --- | --- | --- |
| Other | 4% | 4% | 11.1% | 7% | 21.9% | --- |
Sources: Mercer et al., “Structure, Features, and Faculty Content,” 335; Nykanen, “Institutional Repositories at Small Institutions,” 11; Rieh et al., “Census of Institutional Repositories,” 9–10 (implemented IRs); Lynch and Lippincott, “Institutional Repository Deployment,” 6.