As new technologies and information delivery systems emerge, the way in which individuals search for information to support research, teaching, and creative activities is changing. To understand different aspects of researchers' information-seeking behavior, this article surveyed 2,063 academic researchers in natural science, engineering, and medical science from five research universities in the United States. A Web-based, in-depth questionnaire was designed to quantify researchers' information searching, information use, and information storage behaviors. Descriptive statistics are reported. Additionally, results are broken out by institution to compare differences among universities. Significant findings are reported; the largest changes stem from increased use of electronic methods for searching, sharing, and storing scholarly content, and for accessing library services. Generally speaking, researchers at the five universities had similar information-seeking behavior, with small differences attributable to varying academic unit structures and the range of library services provided at the individual institutions.
The advent of personal computers and the Internet, followed by the introduction of online electronic journals and databases at the beginning of the 1990s, led to the development of online academic resources and the transformation of scholarly communication. The ease of access and ease of use provided by electronic resources has made it easier for researchers to access and share scientific knowledge. Today, the use of online electronic resources has become widespread in almost all fields of scientific research. However, the impact of these new technologies varies considerably across both academic domains and institutions. In an effort to understand how scientists are responding to the transition to electronic communication, the NeoRef research group at the University of North Carolina at Chapel Hill has conducted several surveys of academic researchers. This article reports results from the conclusion of a national study surveying academic researchers' information-seeking behavior at five universities. The national study follows the methodology established in the initial survey of academic scientists at the University of North Carolina at Chapel Hill (Hemminger, Lu, Vaughan, & Adams, 2007), extending that work to four additional universities in order to study differences in information-seeking behavior among the five institutions. The main research aims are to provide a baseline description of the current information-seeking behavior of academic scientists at a national level as institutions shift to primarily electronic communications; to understand where changes in behavior are occurring and why; and to identify the theoretical and practical implications for information-seeking behavior models and for library services.
This article is the first of three that detail the results of the national survey and the analysis of those results. This first article describes the survey, documents the basic descriptive results, and discusses some notable intersite differences as well as differences from previously published results. The second and third articles contain detailed analyses that examine the effect of factors such as position, department, and age on information-seeking behaviors. Thus, most of the research questions are addressed in detail in the companion articles. Additionally, the companion articles compare the results to previously proposed information-seeking behavior models from the information science literature, and they propose a refinement of Wilson's (1997) revised general model of information behavior and Buente and Robbin's (2008) model.
As part of a national study of the information-seeking behavior of academic researchers, a single survey instrument was used to survey researchers at five universities in the United States from 2005–2007. The initial survey was conducted at the University of North Carolina at Chapel Hill (UNC) in 2005 (Hemminger et al., 2007). This survey was developed over the course of 1 year, with feedback from the university's science libraries and the Health Science Library at UNC, as well as from other universities planning to participate in the national study. Advertising for the national study was conducted at the ASIST 2005 conference, in conjunction with the presentation by one of the authors (BMH) of the preliminary UNC study survey results, as well as through contacts with science librarians at institutions in the United States. Nineteen sites initially expressed interest and received information regarding the study requirements. Five qualifying sites were selected for the first phase of the national study. Site selection was based on the following criteria: significant research activities; a local site coordinator who could oversee the study at their site (including handling their Institutional Review Board (IRB) submission and recruiting subjects); readiness to start within the first year; and diversity (size and type of library infrastructure). Although the sites are only somewhat geographically diverse, they are reasonably diverse in size and type of institution as well as library infrastructure. The participating sites were as follows: University of North Carolina at Chapel Hill (UNC), University of Florida (FL), University of Oklahoma (OU), Colorado State University (CSU), and University of South Florida (USF). Sites received a small stipend to cover the cost of recruiting subjects. 
After a preliminary data analysis based on the first three sites (UNC, FL, OU), minor modifications and refinements were made to the survey questionnaire to improve the validity of the responses. The new version was applied to the remaining two universities: CSU and USF.
According to the Carnegie Classification (http://classifications.carnegiefoundation.org/), the five universities in our sample are all large research universities with very high research activity. The primary differences among them are in their focus and level. UNC, FL, and USF offer comprehensive doctoral programs with medical/veterinary majors, whereas OU has no medical/veterinary departments. CSU also belongs to the category of doctoral research universities, but its program focuses primarily on natural science, technology, and engineering, and includes a veterinary program. According to the American Best Colleges Rankings (US News, 2008), UNC and FL are in the top tier (ranked 28 and 49, respectively), OU and CSU (ranked 108 and 124, respectively) belong to the second tier, and USF is a third-tier institution. As far as library service is concerned, the library system at UNC is a campus-wide network with one main library and a dozen departmental libraries. FL has approximately 10 branch libraries that form a system serving the whole campus. Similarly, the Norman campus of the OU libraries includes the main library, six branch libraries, and three special collections. The USF and CSU library systems each comprise a principal research library and a few specialty libraries: USF has a health sciences library, and CSU has a veterinary library.
The literature concerning information-seeking behavior is quite large, and some of it focuses on occupations, roles, and demographic groups. The principal demographic groups that have been described (along with relevant citations) are as follows: general public, children, and students (Hirsh, Jacobson, & Ignacio, 1997; Neuman, 1995); researchers and scholars (a series of studies by Tenopir & King; Brown, 1999; Hemminger et al., 2007; Nicholas, Huntington, & Jamali, 2007); professionals such as lawyers and nurses (Gorman, 1995; Leckie, Pettigrew, & Sylvain, 1996; Nicholas & Martin, 1997; Urquhart & Crane, 1994); and women, minorities, and the poor (Chatman & Pendleton, 1995; Liu, 1995; Meho & Haas, 2001; Shade, 1998). The focus of this study is the demographic group of researchers and scholars in the fields of science, medicine, and engineering. These three fields were chosen because this work is part of a larger effort studying the changing scholarly communications of scientists.
As both Case (2002) and Wilson (1994) point out, the study of information-seeking behavior was, from the 1940s to the 1970s, dominated by investigations of scientists and, to some extent, engineers. This has changed since the 1980s, with more work covering the information-seeking behavior of previously less well-studied groups and disciplines, for example, social scientists and humanists. In the 1990s, there was an increase in the coverage of health-related information seeking, rivaled mainly by the ever-constant attention to students of all types and ages. Generally, previous research on academics' information-searching behavior tended to focus on the following fields: health science (Vibert, Rouet, Ros, Ramond, & Deshoullieres, 2007; Tenopir & King, 2004); social science and humanities (Cronin, 1982; de Tiratel, 2000; Folster, 1989; Francis, 2005); and natural science and engineering (Brown, 2007; Davis, 2004; Hallmark, 1994; Henderson, 1995; Kraut, Egido, & Galegher, 1988; Stewart, 1996). Interdisciplinarity is a theme that was addressed in the late 1990s and early 2000s in works by Bates (1996), Searing (1996), Westbrook (1997, 2003), and Gerhard (2001). Brown (1999) studied the information-seeking behavior of astronomers, chemists, mathematicians, and physicists in the electronic information age and reported that all of the scientists surveyed relied greatly on the journal literature to support their research and creative activities. Brown additionally found that mathematicians also relied on monographs, preprints, conferences, and personal communication to support their research. With increasing interdisciplinary research, it is becoming harder to generalize about the habits and preferences of researchers based on a narrow subject area or specific discipline. Moreover, disciplinary differences may be compounded by cultural differences.
For example, Majid, Anwar, and Eisenschitz (2000) studied agricultural scientists in Malaysia, de Tiratel (2000) studied social scientists in Argentina, and Francis (2005) studied social scientists in India.
Another group of researchers conducted interinstitutional or intercultural studies of information-seeking behavior. Wang (2006) interviewed 65 researchers from China and the United States and compared their information-seeking and information-searching behavior. She found that Chinese scholars used slightly fewer digital resources than their U.S. counterparts, mainly because of differences in the availability of digital resources. She also asserted that the “digital divide” between the hard sciences and the humanities was more obvious than that between the two selected cultures within the same discipline. Still other researchers studied information-seeking differences among institutions. King, Tenopir, Montgomery, and Aerni (2003) studied journal-use patterns of faculty at three universities with different levels of electronic journal implementation and summarized the similarities and differences among the three institutions. Friedlander (2002) used structured telephone interviews to survey faculty members and students from more than 200 colleges and universities on how the Internet affects their scholarly work and the consequences it might have for campus libraries. Nicholas et al. (2007) conducted a deep log analysis of four universities using the OhioLINK journal system and found large differences between the research and teaching universities.
Prior research has been abundant enough to provide insight into the overall field of information-seeking behavior. Limitations of previous studies include small sample sizes, narrow topics of study, and coverage of only a few departments or disciplines, which potentially limits the ability to make comparisons among fields or institutions. Additionally, research questions were often narrowly focused, e.g., e-journal usage (Nicholas et al., 2007) or electronic resources versus the traditional library (Liu, 2006), and were not extensive enough to capture the whole picture of subjects' information-seeking behavior.
This study surveyed a large sample of 2,063 academic researchers from approximately 50 different departments at five universities in the United States. The design of the survey allows for comparisons of differences among these institutional types as well as across academic disciplines. The extensive question set, covering current practices and technology use, provides an in-depth examination of current information-seeking behavior by researchers and scholars.
The questionnaire comprised six parts: background information, types of resources used, keeping current, searching for information, personal article collection, and searching and using information. These questions attempted to quantify academic scientists' transition to electronic communications and how this affects different aspects of information seeking. Many questions were chosen intentionally to parallel questions in prior studies of the information-seeking behavior of scientists (Brown, 1999; Friedlander, 2002; King et al., 2003). To reach large numbers of participants, improve reliability and validity of answers through automatic logic checking at time of entry, and be easily replicable at multiple institutions, a Web-based survey design was used. A complete copy of the original questionnaire and a more detailed description are available in Hemminger et al. (2007). A copy of the final revised multi-institution survey (not including introductory and closing pages) is available at http://ils.unc.edu/bmh/isb/National-ISB-Survey.pdf and is included in Appendix A. Surveys differed by site only in the introductory and closing informational pages and the second demographic question, which was tailored by site to uniquely identify the participants' department.
Survey Population and Demographics
All 2,063 participants were scientific researchers from UNC, FL, OU, CSU, and USF. These five universities are representative of large research institutions at different rankings in the United States. Included in the survey were faculty, research staff, and students (graduate and postdoctoral) from departments in natural science, engineering, and medical science. Participants were recruited within departments at the universities and were notified by electronic means (e-mail) and physical means (letters, flyers). Other inducements, which differed depending on the local setting, included pizza parties, prize giveaways, and other forms of recognition. Table 1 shows the sample sizes and response rates at each institution. There were differences in response rates among the institutions. A possible explanation for the higher response rates at UNC, OU, and FL is the substantial recruiting effort undertaken, which included flyers, e-mails, letters, and support from departmental chairs, in addition to prizes (pizza parties, iPods, cash) for top-performing departments. The inducements and recruitment seemed to have a positive effect, as most institutions reported higher response rates than for similar surveys (for instance, FL conducted a very similar survey with the same audience without inducements and had a 2.5% response rate; Tennant, Cataldo, Sherwill-Navarro, & Jesano, 2006).
Table 1. Sample size and response rate.
Note. UNC=University of North Carolina at Chapel Hill; FL=University of Florida; OU=University of Oklahoma; CSU=Colorado State University; USF=University of South Florida. Participants refers to the number of actual participants from each site. Recruited refers to the number of people who were recruited to participate in the study at each site. Response rate is the percentage responding (participants/recruited).
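The response-rate definition in the note above is simply participants divided by recruited, expressed as a percentage. A minimal sketch of this calculation follows; the site names are real, but the counts are hypothetical placeholders, not the actual values from Table 1.

```python
# Response rate = participants / recruited, expressed as a percentage.
# The counts below are hypothetical, not the actual values from Table 1.
sites = {
    "UNC": {"participants": 900, "recruited": 3000},
    "CSU": {"participants": 300, "recruited": 2400},
}

def response_rate(participants, recruited):
    """Return the response rate as a percentage of those recruited."""
    return 100.0 * participants / recruited

for name, counts in sites.items():
    rate = response_rate(counts["participants"], counts["recruited"])
    print(f"{name}: {rate:.1f}%")
```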
About 40% of respondents are between 20 and 30 years of age (mostly graduate students), and the rest are evenly distributed by decade. Gender distribution is balanced in the UNC, CSU, and USF samples, whereas approximately 60% of respondents at FL and OU are male. As to academic position, Figure 1 shows the percentages of participants by academic position. UNC, FL, and OU have a large percentage of doctoral student participants and a small fraction of master's students; CSU and USF have higher percentages of master's students; and CSU has substantially more research/adjunct staff than the other schools. Appendix B shows the breakdown of participants by department (summarized across all five institutions).
Excluding the three open-ended questions, all questions in the questionnaire could be grouped as having either categorical or numerical answers. All data records were cleaned and processed (checked by the experimenters for validity, properly coded, and verified) and then imported into SAS 9 for analysis. Descriptive statistics, correlation analysis, regression analysis, and cluster analysis were performed on these data, yielding both descriptive and inferential results. Because of length considerations, only the descriptive statistics are reported in this article. Results from the correlation, regression, and cluster analyses will appear in the two companion articles.
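The descriptive statistics reported in this article (means, standard deviations, counts, and percentages) were produced in SAS 9. Purely as an illustrative equivalent, and with made-up response values rather than actual survey data, the core computation looks like this:

```python
import statistics

# Hypothetical numeric responses (e.g., weekly reading hours).
# The real analysis was performed on the cleaned survey records in SAS 9.
responses = [8, 12, 10, 15, 9, 11, 14, 7]

mean = statistics.mean(responses)
sd = statistics.stdev(responses)  # sample standard deviation

print(f"n = {len(responses)}, mean = {mean:.2f}, SD = {sd:.2f}")
```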
Results and Discussion
The results summarized in this article include the major descriptive results from the national study and emphasize areas of interuniversity difference. Areas not discussed in this article generally had consistent results across universities and agreed with the results from the prior study of UNC researchers (Hemminger et al., 2007). A complete summary of the results for all questions in the national survey is available online as supplemental material (http://ils.unc.edu/bmh/isb/NationalStudyCompleteSummaryStatistics). Included in these materials are the counts and percentages for all the figures presented in this article. All other analyses are included in the companion articles.
Researchers were asked how far their office (the one they used most often) is from the campus library. As shown in Figure 2, geographic situations differ among the five institutions. OU provides the most geographically convenient library service to its patrons, as more than 40% of researchers could visit a library in the same building in which they work. This is likely because of OU's large library network, which includes six departmental branches. Additionally, all of the OU study participants are in natural science departments, which house the majority of the branch libraries. At the other four universities, most researchers have to walk a very short distance (one-quarter mile) to visit the library. Despite the physical proximity of the library for the majority of subjects, participants indicated a preference for searching for and acquiring information electronically. It is interesting to note that fewer than 10% of researchers at CSU and USF are able to visit a library in the same building in which they work. As mentioned before, this probably results from the fact that these two institutions have fewer branch libraries, making it less likely that a branch library is in or near the building where a researcher works.
Table 2 summarizes answers to a survey question that asked how many hours researchers spent reading information relevant to their work in a typical week. Average reading times per week were similar among the institutions, with slightly higher averages for FL and USF and slightly lower ones for CSU and OU. Faculty and graduate students reported spending approximately 11 hours per week reading information from all sources to support their work. The relatively high standard deviation in reading hours suggests there is large variance among individuals. The reading times reported in this study are larger (an average of 495 hours per year, assuming 45 weeks of work per year) than those reported in previous studies, which ranged from 80–400 hours per year (Tenopir & King, 2002; Quinn, 1994; Brown, 1999; Majid, 2000; Friedlander, 2002; Tenopir, King, & Bush, 2004; Francis, 2005; Schwarz & Hondras, 2007; Tenopir, King, & Wu, 2008). Although the larger number of reading hours in this study may suggest an increase in reading in recent years, there are some confounding factors. In this study, participants indicated how much time they spent reading for research, which would include activities beyond just reading journal articles, the measure several earlier studies recorded (Tenopir & King, 2002; Tenopir, King, & Bush, 2004; Tenopir, King, & Wu, 2008; Brown, 1999). Also, other studies examined different groups of scientists who likely have different reading patterns (Tenopir, King, & Bush, 2004). Comments from participants suggest that more articles are being read, but with less time spent per article, i.e., supporting the “strategic reading” observed in other studies (Tenopir, 2009; Renear, 2009).
Table 2. Average reading hours of researchers in a typical week.
Note. UNC=University of North Carolina at Chapel Hill; FL=University of Florida; OU=University of Oklahoma; CSU= Colorado State University; USF=University of South Florida; SD=standard deviation.
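The annualized reading figure discussed above follows directly from the stated assumptions (approximately 11 hours of reading per week over 45 working weeks per year). As a quick arithmetic check:

```python
# Annualizing weekly reading time, per the assumptions stated in the text.
HOURS_PER_WEEK = 11   # approximate average reported reading time per week
WEEKS_PER_YEAR = 45   # assumed working weeks per year (from the text)

annual_hours = HOURS_PER_WEEK * WEEKS_PER_YEAR
print(annual_hours)  # 495, the per-year figure cited in the text
```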
Regarding background information, researchers were asked how many articles they had published in the last 5 years. As illustrated in Figure 3, more than 60% of participants at UNC, FL, and OU had very few publications (only 0–4). This survey question was modified for the last two sites to break out the number of publications into more appropriate groupings. The results for CSU and USF, using the new breakdown, are shown on the right side of Figure 3. More than 35% of researchers reported no publications, and about 25% reported 1–3 publications. As expected, graduate students, who are just beginning their research careers, comprise the majority of the participants with zero or few publications. Removing the graduate students and including only faculty results in a more even distribution, as seen in Figure 4.
When the survey was distributed at CSU and USF, a new question was added to address how much interdisciplinary collaboration was occurring at these universities. Responses from the two institutions were similar: half the respondents indicated that their percentage of collaborations was in the range of 0%–20%, one quarter were in the range of 21%–40%, and each of the remaining quintiles accounted for about 10%. This suggests that interdisciplinary collaborations are becoming more common. Potentially, this survey question may provide a baseline for future measurements of interdisciplinary collaboration.
Resources Used for Research
To identify researchers' sources of information and how frequently those sources were used, participants were asked how often they used books, journals, preprints, Web pages, online databases, personal communication, and conferences and conference proceedings. There is similarity across the five universities. The primary resources are journals, Web pages, and personal communications, which are used on a daily basis to support research activity (see Table 5 in Hemminger et al., 2007). In addition, researchers read books monthly or weekly, attend conferences annually, and rarely use preprints. This finding appears consistent with prior work, which generally finds that journals and personal communications are the most important tools used by researchers (Brown, 1999; Majid et al., 2000). Most studies before 2000 did not include Web pages as a potential response (Bichteler & Ward, 1989; Brown, 1999; Grefsheim, Franklin, & Cunningham, 1991; Majid et al., 2000). With researchers now having high-speed Internet access in their offices, accessing Web pages has become an important tool, used more frequently than personal communications. The five universities demonstrate slight differences in the use of conference proceedings and personal communication. As shown in Figure 5, more scholars at UNC and OU tend not to use conference proceedings, while at FL, CSU, and USF around 35% prefer to use them annually, which matches their frequency of conference attendance. As to personal communication, more than 25% of scientists at UNC, FL, and USF regard conversation with colleagues as a daily activity, while 30% of those at OU and CSU talk with their colleagues weekly. Contributing factors to personal interaction might include disciplinary differences and the academic atmosphere at a specific university. It is also interesting to note that more than 10% of researchers at UNC, FL, OU, and USF do not list personal communication as one of their information sources.
Survey respondents were asked whether they used alerts to keep current with new information. Results were consistent across the universities, with all sites very close to the average rate of 36% of respondents utilizing alerts. When a researcher did use alerts, the average number was fairly consistent across sites: 2 alerts on average at UNC and OU, 3 at FL and CSU, and 3.6 at USF. The most popular alerting services were also consistent across universities, with PubMed the clear favorite; the others were Nature, ScienceDirect, ISIS, eTOC, and Faculty of 1000.
One of the survey questions asked scientists to estimate the number of articles they retrieved from specific sources. As shown in Figure 6, the overall trend in source preference is similar across institutions. Researchers showed a strong preference for electronic rather than print versions of resources, as indicated by the top four resources. Electronic journals accessed through the library and open access electronic journals are the two primary methods of accessing electronic resources. There are minor differences among universities. On average, FL researchers retrieve 25 articles per month from library-subscribed e-journals, 11 more on average than researchers at CSU. This may be because the FL library subscribes to a large number of journals in electronic format, while CSU has relatively fewer e-journal subscriptions.
Also notable is that the institutional differences in the use of library-subscribed e-journals parallel those for free online content, with FL researchers retrieving the most and CSU the least. More scholars at CSU use interlibrary loan, probably because the interlibrary loan program at CSU is very fast and cost effective (https://rapid2.library.colostate.edu/PublicContent/AboutRapid.aspx#t1), and because CSU has comparatively fewer journal subscriptions than institutions with larger budgets.
Today, people ubiquitously use search engines to begin their information searches, and almost all researchers have experience with search engines, Google in particular. The use of search engines appears to have affected researchers' expectations for library searching: first, they often express a preference for metasearching (a single search over all resources instead of identifying and searching resources individually; Hemminger et al., 2007), and second, they want the ability to instantaneously see results and bring up the content item (Hemminger et al., 2007). The vast majority of academic searching for research purposes now appears to be conducted in this fashion, either from the library Web site or via a search engine.
To identify which type of interface is preferred by researchers in this academic setting, the survey asked participants to indicate their preference between the Google search interface and their library homepage interface. Responses from participants in the five universities were consistent and split nearly half and half, with only small differences (Figure 7). These results disagree with findings suggesting that academic searching is predominantly done via search engines like Google, for instance, Haglund and Olsson's (2008) conclusion that there is now an “almost complete dominance of Google as a starting point for searching scientific information.” This study's results suggest that many scholars still prefer library homepages as a pathway to the many academic resources they provide, perhaps for reasons such as that suggested by Vibert et al. (2007): “Google is too generic and cannot guarantee the relevance of the results it gives…” It may also be because library Web sites increasingly support “Google-like” text search boxes that allow patrons to interactively search across all library catalog content. An interesting question is how users prefer to combine the text searching popularized by search engines with the faceted searching made possible by metadata. Empirical research at North Carolina State and UNC libraries using an Endeca interface for their library catalogs (Antelman, 2006; Cory, 2008) suggests that although users predominantly prefer to begin with text searches, they make use of metadata a significant fraction of the time (Cory, 2008). Overall, there is an increase in the use of text-based metasearch interfaces for library catalogs, including, in some cases, the outright adoption of Google search boxes on library homepages. All five institutions in this study supported a text-based search box on the library Web site.
The UNC, USF, and FL Web sites show resulting matches, with the ability to refine the search via faceted metadata. CSU supports text searching but not refinement via faceted metadata. The text search at OU does not lead directly to results, but directs the user to resource categories to search within (locations, Web pages, knowledge bases, LORA, catalog).
In the open-ended questions in the survey, researchers across the five institutions indicated frustration at being required to identify and search many different content sources. They indicated a preference for metasearch tools that let them enter a single search string to be run against all content in all resources. However, many researchers still felt that something more than a simple Google search interface was needed. Shortcomings mentioned included the need for bibliographic searches; a better ability to find references to specific articles; too many matches being returned, making it difficult to identify the most relevant content; and the lack of assurance about content quality.
To identify which search tools scientists used, respondents were asked to list their five most important individual search tools, ranked in order of importance. Responses to this question included general categories (e.g., Web search engine) as well as specific tools (e.g., Google, Yahoo). Specific answers were coded into general categories, and the summary results for the general categories are reported in Figure 8. The primary search tool reported was a citation/bibliographic database, followed by a general Web search engine. Scientists from OU indicated a stronger preference (11.89%) than those at other institutions for full-text digital library searching. New forms of scholarly communication are appearing, as approximately 2%–4% of scientists across the five universities mentioned listservs, blogs, and wikis as tools for searching for information. It seems that in academic settings, traditional tools (e.g., citation/bibliographic databases) still dominate, while these newer forms are in the early adoption phase.
Personal Article Collection
Figure 9 shows researchers' responses regarding whether they maintain a personal bibliographic database. Although more than 85% of researchers maintain print article collections, only approximately half of them maintain a bibliographic database. Bibliographic databases were more commonly utilized at UNC and USF, perhaps because of marketing or support for the products at those institutions (both of which provided free software and training). However, OU provided the same services and FL provided free software, and yet they had substantially lower usage rates. This may be because the information-seeking and information-handling habits of researchers are very personal, as has been suggested by Davies (1998) and several others. Davies' longitudinal case study showed that information is often badly managed because of low awareness of the increased need for information-handling skills once technology is involved. Related to this problem is the difficulty many scientists have in admitting insufficient knowledge of sources and searching mechanisms (Miller, 2002).
To understand the use of bibliographic databases, researchers who maintained one were also asked what percentage of the articles from their article collection was in their bibliographic database. Responses were similar across institutions: researchers who maintain a bibliographic database import a little more than half of the papers from their collection into it (Table 3).
Table 3. Percentage of articles from article collection in bibliographic database.
Note. UNC=University of North Carolina at Chapel Hill; FL=University of Florida; OU=University of Oklahoma; CSU=Colorado State University; USF=University of South Florida.
Using Information and Using the Library
There is unanimous agreement across institutions in the preference for searching electronically rather than in print (96.3% vs. 3.7% overall), because of the convenience, speed, and interactivity of searching on the Internet. As recently as 2002, electronic resources were playing an increasing role, but usage by established scholars was still dominated by traditional media (Odlyzko, 2002). In just a few years, however, electronic materials have become the predominant resources for academic researchers, especially for information searching (Liu, 2006; Hemminger et al., 2007).
Regarding reading articles, the majority of researchers preferred to use both electronic and print formats, with fewer individuals preferring one or the other exclusively (Figure 10). This finding is significant in that no single delivery method for reading is indicated; both print and electronic versions have their purposes, depending on the person and the situation. This is likely the reason for the popularity of PDFs, which allow high-quality print and electronic renditions, giving the user the freedom to choose the appropriate medium. Importantly, for all five universities, reading in an electronic-only format was the least preferred option. This suggests that scientists still value the traditional way of reading information in print, and that researchers are not ready for electronic formats to completely replace print, at least for some reading purposes. These results agree with most other studies (e.g., Liu, 2006; Tenopir & King, 2002), which generally find the printed format preferable for reading. When studying this question, it is important to distinguish between how researchers prefer to search and how they prefer to read, which are sometimes confounded in studies (for instance, Liu, 2006).
Table 4 summarizes the answers to a question asking researchers how many times they visited the library in person during the last 12 months. As shown in Table 4, the average number is similar among the five institutions, with the highest at OU (39.23) and the lowest at FL (15.54). As might be expected, the number of library visits is correlated with distance to the library (Figure 2). OU has the highest percentage (43.88%) of researchers with a library in their building and the most frequent library visits, whereas FL has the highest percentage (35.48%) of academics needing to walk one-half mile or more to their libraries and, correspondingly, the lowest number of library visits. Griffiths and King (1993) showed that as the physical distance to a library increases, usage decreases dramatically. This correlation suggests proximity is an important way for libraries to attract patrons. To further examine the distribution of library-visit frequency, results were combined into several groups, as shown in Figure 11. Perhaps most striking is that, except at OU, 37%–48% of academic scientists visit their library fewer than five times a year. Comments given in the survey make clear that many researchers now directly access their library's online journal collection, which previously would have required a physical trip to the library. The small number of visits found in this study supports the already documented trend of declining library visits per year (Odlyzko, 2002).
Table 4. Average number of times researchers visited the library in last 12 months.
Note. UNC=University of North Carolina at Chapel Hill; FL=University of Florida; OU=University of Oklahoma; CSU=Colorado State University; USF=University of South Florida; SD=standard deviation.
Scientists were asked to indicate, from a preselected list of nine reasons, why they visited the library. The relative percentages are summarized in Figure 12. Generally speaking, “pick up/drop off materials” and “photocopy materials” were chosen most frequently. There are clear institutional differences, which might reflect differences in each university library's focus or quality of services. At FL, for instance, all physical uses of the library except photocopying materials are utilized less frequently than at other institutions, consistent with FL's lowest average number of library visits (15.54; Table 4). At CSU, fewer researchers go to the library to photocopy materials. This is believed to be because older bound journals may be checked out at CSU, and it is common practice to copy articles at reduced cost at nearby commercial copy centers. Checking out items to photocopy may also explain why CSU respondents used the library more often for picking up/dropping off materials. In addition to traditional library functions, other factors are important to researchers, such as “quiet reading space” (especially for graduate students), access to computers, and classrooms/meeting places. Though the uses of the physical space in libraries are changing, that space clearly remains important. These findings may serve as a guide for libraries as they evolve into service-oriented facilities rather than simply traditional brick-and-mortar repositories for physical materials. Although the number of visits to the physical library is decreasing, utilization of library resources has generally been increasing, especially for electronically delivered content. Additionally, many libraries are placing stronger emphasis on programs that provide service directly to the researcher.
For example, the University of Florida Health Science Center Libraries, which had the lowest average number of visits, have a strong liaison librarian program in which each department or college has its own “personal” librarian (Cataldo et al., 2006). As a result, many reference and consultation interactions occur over the phone, via e-mail, or by chat, negating the need for researchers to visit the physical library.
The last three questions in the questionnaire were open-ended and asked for scientists' opinions about their libraries. Only the answers from UNC have been coded and analyzed at this time (Hemminger, Lu, Vaughan, & Adams, 2007). In those results, scientists were generally happy with library services, particularly with the personal support provided by librarians. Most negative comments involved users not being aware of resources or services. A complete description is given in Vaughan, Hemminger, and Pulley (2008), and the results are freely available for others to analyze on the Web via a specially built interactive tool (http://bioivlab.ils.unc.edu/icis/).
This article surveyed 2,063 academic researchers in natural science, engineering, and medical science from five research universities in the United States to understand different aspects of researchers' information-seeking behavior. Descriptive statistics are reported by institution to compare differences among universities. The most significant findings reflect the dominant utilization of electronic methods for searching and accessing scholarly content. Generally speaking, differences in information-seeking behavior among universities are not as pronounced as those among disciplines and demographic groups; researchers at the five universities are quite similar in their information-seeking behavior. Our findings also have implications for academic libraries, which must adapt to continue to support the needs of scientists. Another notable trend is that novel forms of scholarly communication, such as collaborative information-sharing technology, are evolving gradually. This may be the beginning of a more significant transformative change, particularly in sharing information within laboratories or groups or among multisite collaborations. Many professors have begun using blogs, wikis, and multimedia (e.g., YouTube) to communicate with their colleagues or students. Collaborative search systems (I-SPY), academic social bookmarking systems (CiteULike), open shared rankings and reviews (Faculty of 1000, Adobe Acrobat 8.0), open access journals (PubMed Central, BioMed Central, PLoS), and online shared bibliographic databases and annotations (Connotea) are all examples of new scholarly communication information technologies. Their adoption is consistent among respondents across the five universities.
Ongoing work at UNC includes correlation analyses breaking out results by department and by demographic variables. Additional work in conjunction with other researchers examines longitudinal comparisons to study trends. Surveys can provide only a superficial understanding of complex information-seeking behavior, so a complementary study at UNC is conducting in-depth interviews, combined with information-seeking behavior captured through screen logging and diaries, to better understand information-seeking behaviors and to build working models of scientists' information seeking, use, management, and sharing.
This work was supported by a grant from the Ochiltree Foundation to study information-seeking behavior in the context of scholarly communications. Statistical support was provided by the Odum Institute for Research in Social Science.
Appendix B: Table of Departments Surveyed and Counts of Respondents