This paper reports results of a preliminary study on why first–year college students select certain online research resources as their favorite. Results, based on a survey of over 500 U.S. college students in first–year writing classes, offer a more complex picture of student motivation than popular accounts of these students as disinterested, lazy, and ignorant. Students reported most frequently that they favored resources for reasons of ease, quality, and connectivity. They reported least frequently that they favored resources for reasons of relevance, variety, and speed. These results suggest that students value finding scholarly sources above relevant sources.
It has become a commonplace understanding in post–secondary education that students rely on the Internet to conduct research for both academic and non–academic tasks. Today’s traditional college students have never lived in a time without the Internet. Moreover, those in developed nations with consistent Internet access generally rely heavily on the Internet for communicating, socializing, and finding information, including the information they use for research–writing tasks for school (see Jones and Madden, 2002; Smith, et al., 2011). While some teacher–scholars (e.g., Bauerlein, 2008; Gardner, et al., 1999; Gillette and Videon, 1998; Graham and Metaxas, 2003; Tensen, 2010; Wang and Artero, 2005), librarians (e.g., Davis, 2003; Grimes and Boening, 2001; Herring, 2001; Thompson, 2003), and journalists (e.g., Carr, 2011) have despaired at or lamented this reliance, that students conduct research in a “Googlepedia” world is clear (McClure, 2011). But why they do is not.
This paper shares results from a study that begins to explore why. The study seeks to answer the following research questions: (1) Which online research resources do first–year college students say they use? (2) Which do they select as their favorite? (3) Why do they select these research resources as their favorite? To answer these questions, this article reports results of a questionnaire that asked 523 first–year writing students which research resources they used and why certain online research resources were their favorite. The study found that students most frequently reported favoring resources for reasons of ease, quality, and connectivity. These results present a more complex picture of student motivation than popular accounts of NextGen college students as disinterested, lazy, and ignorant.
This paper contributes to existing literature by offering insight into students’ rationales for selecting online research resources and, in turn, their conceptions of research. Knowledge of these rationales can lead to more effective research and research–writing pedagogy and research tool designs. It can also inform future studies of students’ research and research–writing practices.
Much research on students’ digital research practices productively enhances our understanding of the nature and effects of students’ use of the Internet for research and research–writing. Less research, however, considers motivations for this use. Research published on students’ motivations, moreover, tends to address students’ reasons for selecting particular source texts rather than the resources to find those texts.
Project Information Literacy
For Project Information Literacy, Alison J. Head and Michael B. Eisenberg have studied the ways and conditions in which U.S. college students use the Internet for school and non–school research. They provide insight into how students
- struggle to find desired texts and establish appropriate contexts for their research–writing (2009a);
- return to the same familiar resources (e.g., course readings, Google, Wikipedia) despite a plethora of resource options (2009b; see also Head and Eisenberg, 2010a);
- employ research strategies and resources that they see as predictable and efficient (2010b; see also Head, 2007);
- turn to Wikipedia as a source for course–related research for background information, current information, and easily comprehensible information (2010a);
- use Internet resources (e.g., public search engines, Wikipedia) together with off–line resources (e.g., family, friends, classmates) for nonacademic “everyday” research (2011a); and,
- use the academic library as a “safe harbor” to manage technology access and networks rather than as a space to consult library resources (2011b).
Perhaps the most in-depth look at students’ research behaviors to date, this work enriches our understanding of what college students say they do online (and off). Given the influence and importance of this research, I contextualize some of my study results with Project Information Literacy findings.
Pew Internet and American Life Project
The Pew Internet and American Life Project similarly provides a window into how college students use the Internet. Among its many and varied studies, several provide insight into U.S. college students’ research and research–writing behaviors online. Steve Jones and Mary Madden’s 2002 study, “The Internet goes to college,” established that “[c]ollege students are heavy users of the Internet.” Of particular interest for this study, almost 75 percent of surveyed students reported using the Internet more than the library for research. Jones and Madden, moreover, observed that students using a library or campus computer lab for academic work “made use of commercial search engines rather than university and library Web sites.”  These results coincide with results of Lee Rainie, Leigh Estabrook, and Evans Witt’s 2007 study of U.S. adults seeking information to solve problems. They found that adults consulted the Internet more than any other resource and visited libraries primarily to use networked computers to access the Internet.
Kathryn Zickuhr’s 2010 Pew study reported that, among different generations, Millennials (adults born between 1977 and 1992) were more likely than adults of any other generation to access the Internet wirelessly via laptop or cell phone. It also reported, however, that “certain key [I]nternet activities are becoming more uniformly popular across all age groups,” including using search engines and conducting online research for health, news, and religious information. So according to Zickuhr, while college students in particular turned to the Internet for research, adults of all ages were increasingly doing so. Aaron Smith, Rainie, and Zickuhr’s 2011 study added to these findings that college students (both undergraduates and graduates not attending community colleges) were more likely than college–aged (18–24 year–old) non–students, community college students, and the overall adult population to use the Internet, to access the Internet via home broadband, and to access the Internet via laptop wireless. They were also more likely to own a cell phone, laptop, and iPod or other mp3 player with which they could get online.
Taken together, these Pew studies emphatically support that the Internet is college students’ most frequent and often first stop for research. They substantiate that Internet use for research, common among many adults, is especially prevalent among college attendees. Such research warrants continued study of how young adults, particularly college students, use and explain their motivations for choosing the Internet for research.
Writing studies research
Scholars and teachers of writing have viewed firsthand, often in required first–year and advanced composition classes, how students rely on the Internet for research–writing assignments. In response to their observations (and to an awareness of college students’ Internet use as described in the Pew studies), researchers in writing studies have studied how (and which) students use networked digital technologies. Much of this scholarship has focused on the ways in which students use the Internet to complete academic research–writing tasks and the productive potential of Internet (re)sources for this work (e.g., Day, et al., 2009; Lundin, 2008; Purdy, 2010, 2009; Purdy, et al., 2007; Sidler, 2002; Sorapure, et al., 1998). Other writing studies research has advocated collaborations between compositionists and librarians to better instruct students in research–writing (e.g., Elmborg and Hook, 2005; McClure and Baures, 2007; Peele and Phipps, 2007; Peele, et al., forthcoming). On the whole, however, writing studies research has not addressed students’ digital research–writing practices outside the context of plagiarism (a topic for which there is a wealth of writing studies scholarship that has productively analyzed and challenged notions of plagiarism in a digital world).
There is a noteworthy exception. In a study more directly attuned to students’ source use inclinations and instruction, Vicki Tolar Burton and Scott A. Chadwick (2000) surveyed 533 college students regarding their source use, their source evaluation criteria (for both Internet and library sources), and the instruction they received in evaluating sources for academic research–writing assignments. They found that few students received explicit training in college about how to evaluate Internet sources. They also found that, as a result, most applied the same criteria to evaluating both Internet and library (i.e., print) sources, with ease of access being the most important criterion. Most students used both Internet and library sources for research rather than exclusively one or the other. Research–writing pedagogy for many instructors has changed since Burton and Chadwick’s study, and many new digital resources are now available. Thus, up–to–date studies on students’ responses to these changes are all the more important.
Library and information science research
Library and information science scholars likewise have viewed firsthand the research behaviors of college students. Their research, in addition to studying source retrieval and search tool design, has explored explicitly why students choose particular online research sources. Randall McClure and Kellian Clink (2009), for instance, analyzed the bibliographic citations in 100 research papers written by students in first–year writing classes. They examined students’ source use considering three criteria commonly used for evaluating Internet sources: timeliness, authority, and bias. Their study provides a useful in–depth look at students’ selection of Internet source materials but does so through the lens of only these three source characteristics.
Other studies of students’ source selection have focused on a broader range of criteria. Barbara Fister’s 1992 study of fourteen undergraduate students and Michelle Twait’s 2005 study of thirteen undergraduate students found content/topic to be a (or, for Twait, the) primary criterion students used to select sources. Kyung–Sun Kim and Sei–Ching Joanna Sin (2007) found other factors to influence students’ decisions. In their survey of 225 undergraduate students, they learned that students valued accuracy/trustworthiness most, followed by accessibility and ease of use, when making source choices. More recently, Robert Detmering and Anna Marie Johnson (2012) analyzed information literacy narratives college students wrote. They paid particular attention to students’ reported experiences finding, evaluating, and using information for academic research–writing assignments.
This work characterizes the larger body of library and information science scholarship on students’ research practices. It productively focuses on students’ attitudes toward research and why students select particular sources (i.e., which individual texts they cite in their writing). But it has yet to explore fully why they select particular research resources to find those texts.
That is where this article seeks to intervene. Few studies have explored why first–year college students say they use and favor the research resources they do. This article begins that work. It discusses a preliminary study that explores the reasons why students select particular research resources, especially digital resources, as their favorite.
To learn which online research resources students use, select as their favorite, and why, Joyce R. Walker and I developed a questionnaire on students’ thoughts about their research practices and predilections. We developed questions in response to a prior study (2007) in which we observed the online research practices of five college students and interviewed those students about their practices. With Institutional Review Board approval, in spring 2009 I distributed the questionnaire to all 1,169 students taking the second of two required first–year writing courses at a mid–sized Midwestern university; 523 students completed and returned the questionnaire. Unless students expressed interest in being interviewed for the study, they filled out questionnaires anonymously.
The questionnaire included ten questions, two of which are relevant for this article: 
On the grid below, create a bar graph to indicate the relative importance of each resource in your own research. The more blocks you fill in, the greater importance you give that resource.
- Books
- Public search engines (e.g., Yahoo, Google)
- Scholarly search engines (e.g., Google Scholar)
- Newspapers or magazines
- Face–to–face discussions/interviews with people
- Online message boards or discussion groups
- Public records (e.g., government documents)
- Library catalog
- Library online journal databases (e.g., JSTOR)
- Online book stores (e.g., Amazon)
What is your favorite online research tool or resource (e.g., Google, Google Scholar, Furl, Wikipedia, Amazon, library Web site)? Why?
Results for the first question were entered into the online survey program SurveyMonkey (http://www.surveymonkey.com) to facilitate data analysis. Results for the latter question were coded by hand.
Students completing the questionnaire comprised 487 freshmen (93 percent of the total), 20 sophomores (4 percent), 12 juniors (2 percent), one senior (less than 1 percent), and three abstentions (less than 1 percent). Students had majors from across campus (business, health sciences, social sciences, humanities, etc.). I surveyed students in first–year writing classes because these classes frequently have responsibility for teaching research and research–writing processes in post–secondary education. Therefore, they take on particular importance in how students learn to be and see themselves as researchers.
The sample size is too small and not sufficiently representative to draw definitive conclusions about why students prefer particular digital research resources. Conclusions, therefore, are necessarily preliminary. The data reported here come from students at one research university, so they may not apply to students at other universities or other types of institutions. Moreover, as Amy E. Robillard (2008) pointed out, questionnaires in academic contexts can carry bias. Students sometimes report what they think faculty want to hear and seek to represent themselves in the best light. The anonymity of the questionnaires and their lack of connection to any course grade protected against this bias to some extent, and on the whole students’ responses (e.g., their clear preference for Google, their admitted use of Wikipedia) show at least some honesty. But the possible bias behind student self–reports must still be considered.
Despite its limitations, however, this study plays an important part in ongoing research into students’ research and research–writing practices. It offers a valuable preliminary understanding of what rationales fuel students’ decision–making. Thereby, it serves as an important foundation for research and research–writing pedagogy that can exploit these motivations and for future, larger–scale studies that can test for these motivations.
Results and discussion
Before turning to which online resources students designated as their favorite and why, this article first addresses the relative importance of research resource options for students. Students’ responses to this question explain why I focused on students’ preferences among, and rationales for, using online research resources.
The most important research resources for students
The three resources students designated as having the most importance for their research are all online resources. As shown in Figure 1, students gave public search engines such as Google and Yahoo, scholarly search engines such as Google Scholar, and library online databases such as JSTOR the highest average ratings, 7.98, 7.95, and 5.79 (out of 10), respectively. Clearly, students found online research resources most important.
Figure 1: Students’ average ratings of the relative importance of certain research resources.
Note: Ratings are out of a possible 10. This chart was generated using SurveyMonkey (http://www.surveymonkey.com).
Figure 2 shows that these averages translate into 67.3 percent of students (352) rating public search engines as 8 or higher and 50.3 percent of students (262) rating scholarly search engines as 8 or higher. Thus, over half of the surveyed students placed significant importance on online search engines for their research.
Figure 2: Percentage and numerical breakdown of students’ ratings of the relative importance of particular research resources.
Note: The maximum rating is 10. Highlighted squares show the most frequent response for each resource. This figure was generated using SurveyMonkey (http://www.surveymonkey.com).
These findings coincide to some extent with those of Head and Eisenberg. In their survey (2011a) of 8,353 students from 25 U.S. colleges, Head and Eisenberg found 95 percent of students indicated using search engines such as Google within the last six months to find “everyday life information.” In their 2009b study, they likewise found that students frequently used Google for course–related research (though students first consulted course readings rather than Google).
Students’ favorite research resources
Consistent with their answers regarding the research resources they value, students overwhelmingly identified their favorite online research resources as Google and Google Scholar. Four hundred and six students identified Google and/or Google Scholar as their favorite, 73.6 percent of the total responses (552). Students selected Google as their favorite 312 times, over three times as often as the next favorite resource, Google Scholar, itself a Google product. Taken together, all non–Google research resources accounted for only 26.4 percent (146 of 552) of students’ favorite research resources, reinforcing Google’s dominance. This dominance reflects Eszter Hargittai, et al.’s (2010) finding that the Google brand (more than any other) elicited positive emotional responses from the students they observed and interviewed.
Library databases and the library Web site both fell within the top five answers. However, students designated them as their favorite for only 13.6 percent of the total responses (75 of 552). This percentage is likely lower than most academic librarians and writing teachers would want.
Figure 3 illustrates the number of responses for each resource students named as their favorite.
Figure 3: Research resources students selected as their favorite.
Note: This chart was generated using Microsoft Word.
Students’ most frequent rationales
Students offered 10 primary reasons why they selected these research resources as their favorite. Figure 4 shows the number of responses for each reason. The total number of responses, 630, exceeds 523, the number of students who completed the survey, because some students offered more than one reason for why a resource was their favorite.
Figure 4: Reasons students chose research resources as their favorite.
Note: This chart was generated using Microsoft Word.
Ease of use was the main reason students indicated they selected a research resource as their favorite. Students offered this reason 193 times, over twice as many times as any other reason, illustrating its dominance in guiding students’ decision–making. In addition to “easy,” responses in this category of rationale included “simple to use,” “straightforward,” “easy to operate,” “self–explanatory,” “don’t have to be taught [how to use] it,” and “very clear format.” Overall, students equated simpler with better.
This predilection for ease is consistent with studies of students’ motivations for online source selections. Burton and Chadwick (2000), for instance, found that the two top reasons students selected particular Internet sources were that they were “easy to understand” and “easy to find.” Students’ preference for ease in this study is also consistent with Kim and Sin’s (2007) finding that ease of use was one of the top three reasons students selected particular sources to include in their writing. Head and Eisenberg’s (2010a) findings likewise coincide. They found that 69 percent of the sophomore, junior, and senior students they surveyed consulted Wikipedia for academic research tasks because its interface is easy to use and 64 percent of students consulted Wikipedia because its articles are easy to understand. These results suggest that students select both research resources and sources returned by these resources based on their perception of ease.
While this favoring of ease may be seen as a consequence of student laziness, the second most popular reason students identified a research resource as their favorite was quality. Students indicated that they favored a resource that returns trustworthy, credible, and scholarly sources. Representative responses in this category include “more likely to be credible,” “great at weeding out the crap,” “because you know it’s scholarly,” “good for scholarly articles,” “it’s scholarly,” “quality sites and info,” and “[I] know I can use it [for] appropriate research for my paper.” That students simultaneously valued scholarliness and favored Google and Google Scholar may signal a misperception of the scholarliness of Google’s results and an overreliance on Google Scholar’s judgment of scholarliness. This finding is consistent with Hargittai, et al.’s (2010) discovery that students placed too much trust in search engines to vet credible results. This finding may also indicate that ease of use trumps scholarliness in students’ hierarchy of rationales.
The third most frequent reason students cited as guiding their choice of favorite research resource was connectivity. In other words, they valued a resource’s digital affordance to connect directly to a source text. Responses illustrating this rationale are “[it] takes you to online databases,” “[it links directly to] full text and scholarly journals,” “[it] links information together,” and “lots of connections to new info.” In particular, students valued hyperlinking functionality, the ability of digital research resources to take them directly to sought texts. Similarly, students indicated favoring resources that provided access to .pdfs of texts they desired. These results suggest students expect and value “all–in–one” resources — those that provide not only a source’s bibliographic information, but also the source itself.
Students’ valuation of connectivity in this study to some degree coincides with Head and Eisenberg’s (2010a) study on students’ Wikipedia use. They found that 54 percent of students they surveyed used Wikipedia for schoolwork because of its hyperlinked citations. In their survey, however, connectivity was not among the top five reasons students used Wikipedia. The lower ranking in their study may reflect that students view Wikipedia more as a source of background information than as a gateway to other sources and hold connectivity as more important for research resources than source texts.
Students’ least frequent rationales
As notable as the reasons students offered most frequently for why they favored particular research resources are the reasons they offered least frequently. The three reasons students provided least were relevance, variety, and speed. As Figure 4 shows, however, students offered 14 reasons that did not fall into any of the 10 categories. I grouped these responses into a miscellaneous category. Miscellaneous responses include reasons such as, “[it] changes the pictures for holidays, which I like”; “You can look at things in different languages”; and “One cent goes to charity.” Thus, strictly speaking students did provide other reasons less frequently than relevance, variety, and speed. For the purposes of this study, though, it seemed more useful to focus on categories of reasons, that is, recognizable patterns among student responses, rather than outliers or anomalies.
Only six students explicitly identified relevance as a reason a research resource was their favorite. In other words, only six students favored a resource for its ability to return sources relevant for their topic and/or assignment. Responses in this category include “[r]elevance,” “so much relevant info,” “always something on your topic,” and similar rationales.
This finding diverges from other studies. Students in Head and Eisenberg’s (2009a) Project Information Literacy study frequently reported frustration with an “inability” to find “desired materials.” Michelle Twait (2005) also found that students most frequently selected sources based on content/topic. Yet students in my study did not frequently identify a resource’s ability to find relevant sources as a reason to favor it. Thirty–six, however, did identify a resource’s ability to provide background information for an assignment as a reason. Fifty also identified a resource’s ability to return many sources. These results suggest that students favor resources that return any results, particularly scholarly results, above resources that return relevant results.
The second least frequently offered reason was variety. Twenty–eight students identified a resource’s ability to return many different kinds of sources or to allow for conducting multiple different kinds of searches as a reason it was their favorite. Responses that evidence this rationale include “variety of results,” “has subject variety,” and “you can find such a wide range of sources.” Taken together with the other reasons students offered, this data suggests that students hold finding many scholarly sources as more important than finding a range of topically relevant sources. That is, it seems that as long as a research resource returns scholarly results, students are satisfied.
The reason students provided third least often was speed, a resource’s ability to return results quickly. With responses such as “fast,” “quick,” and “super fast,” students cited efficiency only 32 times as a reason a particular research resource was their favorite. This result complicates Head and Eisenberg’s (2010b) finding that efficiency was a driving factor of students’ research strategies. For the students in this study, efficiency did not significantly motivate their designation of their favorite research resource.
This study reveals that students preferred research resources they find straightforward and uncomplicated. They likewise preferred resources that return scholarly results. They also favored resources that link them directly to sought texts rather than provide only bibliographic information. Students indicated they are motivated less by how quickly a resource returns results or if it offers a variety of results. They are also motivated less by whether a resource provides topically relevant sources.
Kim and Sin’s (2007) study suggests that students’ research resource selection motivations largely match their source selection motivations. The top three reasons Kim and Sin found students to select particular sources were quality, accessibility, and ease of use. Two of these, ease of use and quality, also rank among the top three reasons I found students to select particular online research resources: ease, quality, and connectivity, though in a different order of frequency. Fister’s (1992) and Twait’s (2005) studies, however, suggest that students had different motivations for selecting source texts and research resources. Topical relevance was one of the least influential factors in this study, yet it was a primary motivating factor for students in both Fister’s (1992) and Twait’s (2005) studies. These conflicting results substantiate the need for additional research.
Given the overwhelming volume of Internet information, easy–to–use resources that connect to quality results help students meet their research needs. In this regard, students’ motivations show them to be savvy. They also show that students are not primarily interested in finishing research tasks quickly. Speed was the third least frequently offered reason, while quality was the second most popular rationale. This frequency reinforces that students care more about finding credible and accurate materials than finishing fast. These results suggest students may willingly devote time to finding scholarly materials — as long as searches are not too difficult. That is, students’ answers indicate they may be amenable to persistent searching with simple resources that access full text sources.
Viewed through a hopeful lens, this stance means students are willing to engage in sustained research. More pessimistically, this stance means students are unlikely to use more rigorous search methods that take effort to learn. These more rigorous search methods may be what they need to find sources appropriate for academic research tasks.
This study provides insight into not only students’ research resource choice motivations, but also students’ definition of research. That students valued quality over relevance indicates students may define research as meeting particular task criteria, rather than generating knowledge. For example, they may see good research as referencing five scholarly sources rather than conversing with topically relevant sources. This prioritization suggests students approach research as finding a certain kind of source rather than engaging relevant content. That they identified both quality and relevance as reasons particular online research resources were their favorite is encouraging. However, students’ responses intimate that they hold fulfilling a task’s requirements as more important than engaging deeply with a topic. Though students’ responses do not suggest their rationales are necessarily mutually exclusive, they do evidence a hierarchy.
Given their answers, students implicitly profess that finding any scholarly source supersedes finding a scholarly source relevant to their topic. This stance may explain their sometimes inappropriate or perplexing source choices for academic research–writing tasks.
Implications and future directions
These findings should give instructors pause. Students indeed are listening to mandates about using scholarly sources. They are listening so closely, in fact, that simply finding scholarly sources sometimes seems sufficient: students reference such sources even when doing so does not make good sense for a given research or research–writing task.
Those of us who teach and research digital research practices, then, might consider adjustments to our pedagogy. One approach is talking explicitly with students about the need for and value of relevant, not just scholarly, sources. Students seem to know well that scholarly sources are important. Indeed, they apparently hold the scholarly label as more important than a source’s content. To help, we can ask students to explain the relevance of each source they reference for an assignment. The goal is to compel students to justify including sources beyond their classification as “scholarly.” Thus, we might ask students to find reasons beyond the fact that a scholarly database or search engine returned sources. That sources come from scholarly resources can be a good start, of course. But that alone is not sufficient. This study indicates some students may hold the scholarly resource label to be sufficient. They would benefit from instruction otherwise.
Another part of this instruction can entail emphasizing context. We need to teach students that research resources — and the sources they return — have different use value in different rhetorical situations. We should model for students that a scholarly (re)source, like the five–paragraph essay, works well in some, but not all, contexts. “Scholarliness” is a label that should be used with precision, not immediately accepted at face value. We need to clarify for students that a scholarly (re)source appropriate for one task may not be appropriate for another. That a (re)source is “scholarly” does not make it universally usable. For instance, a source returned by JSTOR may be suitable for a research project in the humanities but inappropriate for a research project in the sciences, though it remains scholarly. Similarly, a source returned by Google Scholar may work well for non–academic research into the side effects of a medical procedure yet be inappropriate for an academic essay on that procedure, despite being scholarly.
Part of this instruction can include helping students learn to evaluate the scholarliness of (re)sources on their own. Their responses in this study show they may rely only (or too heavily) on how teachers, librarians, database providers, or others designate scholarliness. To help, class time may be devoted to analyzing how Google Scholar and library databases work. Students may then be asked to evaluate what kinds of sources these resources classify as scholarly.
Perhaps most important is fostering in students — and the public — a view of scholarly research as foundational to knowledge creation. One key to achieving this goal is asking students to conduct their own primary research for research–writing tasks. That way, they can better understand practices involved in scholarly research and the studies discussed in secondary sources they read. Another key is more clearly explaining to public audiences the work scholarly research entails and how it influences their lives. Also key is offering civic resources that educate citizens on how to read and assess expert information and scholarly sources. Such information and sources are invoked in political advertisements, governmental policies, and legislative decisions.
This study is only preliminary. Future studies might:
Ask students what specific features and functions of Google (and other favored resources) they find easy. Instructors, librarians, and research tool designers could apply this knowledge to design scholarly resources they want students to consult. The result could be more user–friendly library interfaces and journal article databases. This approach may mean balancing robust functionality with intuitive interfaces. Certainly robust functionality is attractive to professional researchers. But if novice researchers do not avail themselves of this robust functionality, it does little good.
Ask students about research resource selection in context. That is, ask why they used certain resources for particular academic and non–academic research tasks. Doing so could help determine how time constraints, task objectives, and available options shape student motivations and behaviors.
Observe students selecting and using online sources for a variety of research and research–writing tasks. These observations can help to supplement survey data. They can provide insight into how students’ actions compare and contrast with their stated rationales.
Investigate how students define research and see themselves as researchers. Such studies might consider to what extent these definitions differ for academic and non–academic tasks. They might also consider whether these definitions change over time. Finally, they might also consider which research resources students use for each kind of task and why.
Test redesigned digital research resources to determine which students favor and why. This usability testing may lead to research resources students are more likely to favor and, thereby, use.
Such work is essential to understanding more fully how students approach research and to learning the motivations that fuel that work. With this knowledge we can help students approach research as more than simply fulfilling a checklist of generic features. For instance, they can view research as more than quoting three scholarly sources and providing correct APA citations. The result can be a notion of research as answering pertinent questions in ongoing scholarly conversations. This perspective can help NextGen researchers draw on relevant foundations to produce new discoveries and new understandings of prior conclusions.
About the author
James P. Purdy, Ph.D., is assistant professor of English/writing studies and director of the University Writing Center at Duquesne University in Pittsburgh, Pa. His research and teaching interests include digital research and writing practices, Web 2.0 technologies, and composition and writing center theory. He has published articles in scholarly journals, including College Composition and Communication, Computers and Composition, Journal of Literacy and Technology, Kairos, Pedagogy, and Profession, and chapters in edited collections, including The New Work of Composing, RAW (Reading and Writing) New Media, and Writing Spaces. With co–author Joyce R. Walker, he won the 2011 Ellen Nold Award for the Best Article in Computers and Composition Studies and the 2008 Kairos Best Webtext Award.
E–mail: purdyj [at] duq [dot] edu
Thanks to First Monday’s editor and reviewers for their thoughtful and insightful suggestions for revising this paper. I am also grateful to Erin Rentschler for her assistance in coding questionnaire results. This research was supported in part by a Wimmer Grant from Duquesne University.
1. McClure, 2011, p. 221ff.
2. Head and Eisenberg, 2011a, “Results.”
3. Head and Eisenberg, 2011b, p. 2ff.
4. Jones and Madden, 2002, pp. 2–3, 12.
5. Zickuhr, 2010, pp. 1–2.
6. Scholarship in writing studies has fruitfully addressed students’ non–academic writing. However, its studies of research have focused primarily on academic work. A notable exception is the Writing in Digital Environments (WIDE, 2010) research collective’s survey of 1,366 first–year writing students in colleges and universities. This research focused on the kinds of academic and non–academic writing and research students do and with what digital technologies.
7. The binary between Internet and library sources around which Burton and Chadwick (2000) structured their analysis no longer holds. Many library resources are now available via the Internet. Thus, it is misleading and inaccurate to present library sources as opposed to Internet sources.
8. Burton and Chadwick, 2000, pp. 320–322.
9. Burton and Chadwick, 2000, pp. 318–320. See also Purdy and Walker, 2007.
10. McClure is a writing studies rather than library and information science (LIS) scholar. He, however, works at the intersection of writing studies and LIS. McClure has co–authored with library science researchers and published in LIS journals, so I reference him here.
11. Communication researchers Eszter Hargittai, Lindsay Fullerton, Ericka Menchen–Trevino, and Kristin Yates Thomas (2010) studied how first–year college students assessed the accuracy/trustworthiness of the online sources they used. As part of that work, they also investigated why students were drawn to certain search engines. Though not situated in library and information science, their work draws on LIS research.
12. Burton and Chadwick (2000) and Head and Eisenberg (2010a) likewise asked students why they made particular research and research–writing decisions. Thus, in some sense I follow their lead. Yet while they productively asked college students why they used Internet sources and Wikipedia, respectively, this study differs from their work in two important ways. First, Burton and Chadwick (2000) and Head and Eisenberg (2010a) did not focus on first–year college students. Burton and Chadwick (2000) surveyed college students at all levels, from undergraduates to graduate students. Only about 16 percent of the students in their survey (87 of 543) were first–year students (p. 314). Head and Eisenberg (2010a) did not include first–year students in their study. They surveyed sophomore, junior, and senior undergraduates (“Methods”). Second, Burton and Chadwick (2000) and Head and Eisenberg (2010a) focused their studies on sources students used rather than the resources students used to find those sources. Head and Eisenberg (2010a) found that students they surveyed used Wikipedia as a resource for finding other sources (“Why Wikipedia”). However, they primarily framed their discussion of Wikipedia as a source.
13. To read students’ responses to other items from the questionnaire and a discussion of the larger study in which the questionnaire operates, see Purdy, forthcoming.
14. Due to rounding, the percentages for students completing the questionnaire total 99 rather than 100 percent.
15. Robillard, 2008, p. 30.
16. Head and Eisenberg, 2011a, “Results.”
17. Head and Eisenberg, 2009b, p. 3.
18. The total number of responses, 552, exceeds 523, the number of students completing the survey. Some students identified more than one resource as their favorite.
19. Hargittai, et al., 2010, p. 480.
20. Burton and Chadwick, 2000, pp. 319, 321.
21. Kim and Sin, 2007, p. 659.
22. Head and Eisenberg, 2010a, “Why Wikipedia?”
23. Hargittai, et al., 2010, pp. 469–470, 479–480.
24. Head and Eisenberg, 2010a, “Why Wikipedia?”
25. Head and Eisenberg, 2009a, p. 3.
26. Twait, 2005, p. 569.
27. Head and Eisenberg, 2010b, p. 3ff.
28. Students may ultimately hold some of these less cited reasons as important. For instance, implicit in why students valued digital resource connectivity was that such resources saved them time. Tracking down print copies or taking citations to the library are potentially more time–intensive tasks. A useful follow–up study would provide students with the list of ten reasons from this study and have them rank which they value most.
Mark Bauerlein, 2008. The dumbest generation: How the digital age stupefies young Americans and jeopardizes our future (or, don’t trust anyone under 30). New York: Jeremy P. Tarcher/Penguin.
Nicholas Carr, 2011. The shallows: What the Internet is doing to our brains. New York: Norton.
Philip M. Davis, 2003. “Effect of the Web on undergraduate citation behavior,” portal: Libraries and the Academy, volume 3, number 1, pp. 41–51. http://dx.doi.org/10.1353/pla.2003.0005
Michael Day, Randall McClure, and Mike Palmquist (editors), 2009. Composition in the freeware age: Assessing the impact and value of the Web 2.0 movement in the teaching of writing, special issue of Computers and Composition Online (Fall), at http://www.bgsu.edu/departments/english/cconline/Ed_Welcome_Fall_09/compinfreewareintroduction.htm, accessed 30 May 2012.
Robert Detmering and Anna Marie Johnson, 2012. “‘Research papers have always seemed very daunting’: Information literacy narratives and the student research experience,” portal: Libraries and the Academy, volume 12, number 1, pp. 5–22. http://dx.doi.org/10.1353/pla.2012.0004
James K. Elmborg and Sheril Hook (editors), 2005. Centers for learning: Writing centers and libraries in collaboration. Chicago: Association of College and Research Libraries.
Barbara Fister, 1992. “The research processes of undergraduate students,” Journal of Academic Librarianship, volume 18, number 3, pp. 163–169.
Susan A. Gardner, Hiltraut H. Benham, and Bridget M. Newell, 1999. “Oh, what a tangled web we’ve woven! Helping students evaluate sources,” English Journal, volume 89, number 1, pp. 39–44. http://dx.doi.org/10.2307/821354
Mary Ann Gillette and Carol Videon, 1998. “Seeking quality on the Internet: A case study of composition students’ works cited,” Teaching English in the Two–Year College, volume 26, number 2, pp. 189–194.
Leah Graham and Panagiotis Takis Metaxas, 2003. “‘Of course it’s true; I saw it on the Internet!’ Critical thinking in the Internet era,” Communications of the ACM, volume 46, number 5, pp. 70–75. http://dx.doi.org/10.1145/769800.769804
Deborah J. Grimes and Carl H. Boening, 2001. “Worries with the Web: A look at student use of Web resources,” College & Research Libraries, volume 62, number 1, pp. 11–22.
Eszter Hargittai, Lindsay Fullerton, Ericka Menchen–Trevino, and Kristin Yates Thomas, 2010. “Trust online: Young adults’ evaluation of Web content,” International Journal of Communication, volume 4, pp. 468–494, and at http://ijoc.org/ojs/index.php/ijoc/article/view/636/, accessed 15 August 2012.
Alison J. Head, 2007. “Beyond Google: How do students conduct academic research?” First Monday, volume 12, number 8, at http://firstmonday.org/article/view/1998/1873, accessed 16 February 2012.
Alison J. Head and Michael B. Eisenberg, 2011a. “How college students use the Web to conduct everyday life research,” First Monday, volume 16, number 4, at http://firstmonday.org/article/viewArticle/3484/2857, accessed 16 February 2012.
Alison J. Head and Michael B. Eisenberg, 2011b. “Balancing act: How college students manage technology while in the library during crunch time,” Project Information Literacy Progress Report (12 October), at http://projectinfolit.org/pdfs/PIL_Fall2011_TechStudy_FullReport1.2.pdf, accessed 16 February 2012.
Alison J. Head and Michael B. Eisenberg, 2010a. “How today’s college students use Wikipedia for course–related research,” First Monday, volume 15, number 3, at http://firstmonday.org/article/view/2830/2476, accessed 24 June 2012.
Alison J. Head and Michael B. Eisenberg, 2010b. “Truth be told: How college students find and use information in the digital age,” Project Information Literacy Progress Report (1 November), at http://projectinfolit.org/pdfs/PIL_Fall2010_Survey_FullReport1.pdf, accessed 16 February 2012.
Alison J. Head and Michael B. Eisenberg, 2009a. “Finding context: What today’s college students say about conducting research in the digital age,” Project Information Literacy Progress Report (4 February), at http://projectinfolit.org/pdfs/PIL_ProgressReport_2_2009.pdf, accessed 14 February 2012.
Alison J. Head and Michael B. Eisenberg, 2009b. “Lessons learned: How college students seek information in the digital age,” Project Information Literacy Progress Report (1 December), at http://projectinfolit.org/pdfs/PIL_Fall2009_Year1Report_12_2009.pdf, accessed 14 February 2012.
Susan Davis Herring, 2001. “Faculty acceptance of the World Wide Web for student research,” College & Research Libraries, volume 62, number 3, pp. 251–258.
Steve Jones and Mary Madden, 2002. “The Internet goes to college: How students are living in the future with today’s technology,” Pew Internet & American Life Project (15 September), at http://www.pewinternet.org/~/media//Files/Reports/2002/PIP_College_Report.pdf, accessed 24 February 2012.
Kyung–Sun Kim and Sei–Ching Joanna Sin, 2007. “Perception and selection of information sources by undergraduate students: Effects of avoidant style, confidence, and personal control in problem–solving,” Journal of Academic Librarianship, volume 33, number 6, pp. 655–665. http://dx.doi.org/10.1016/j.acalib.2007.09.012
Rebecca Wilson Lundin, 2008. “Teaching with wikis: Toward a networked pedagogy,” Computers and Composition, volume 25, number 4, pp. 432–448. http://dx.doi.org/10.1016/j.compcom.2008.06.001
Randall McClure, 2011. “Googlepedia: Turning information behavior into research skills.” In: Charles Lowe and Pavel Zemliansky (editors). Writing spaces: Readings on writing. Volume 2. Fort Collins, Colo. and West Lafayette, Ind.: WAC Clearinghouse and Parlor Press, pp. 221–241.
Randall McClure and Kellian Clink, 2009. “How do you know that? An investigation of student research practices in the digital age,” portal: Libraries and the Academy, volume 9, number 1, pp. 115–132. http://dx.doi.org/10.1353/pla.0.0033
Randall McClure and Lisa Baures, 2007. “Looking in by looking out: The DNA of composition in the information age,” Computers and Composition Online (Fall), at http://www.bgsu.edu/cconline/McClure_and_Baures, accessed 19 May 2012.
Thomas Peele, Melissa Keith, and Sara Seely, forthcoming. “Teaching and assessing research strategies in the digital age: Collaboration is the key,” In: Randall McClure and James P. Purdy (editors). The new digital scholar: Exploring and enriching the research and writing practices of nextgen students. Medford, N.J.: Information Today.
Thomas Peele and Glenda Phipps, 2007. “Research instruction at the point of need: Information literacy and online tutorials,” Computers and Composition Online (Fall), at http://www.bgsu.edu/cconline/PeeleandPhipps/, accessed 19 May 2012.
James P. Purdy, forthcoming. “Scholarliness as other: How students explain their research–writing behaviors,” In: Randall McClure and James P. Purdy (editors). The new digital scholar: Exploring and enriching the research and writing practices of nextgen students. Medford, N.J.: Information Today.
James P. Purdy, 2010. “Wikipedia is good for you!?” In: Charles Lowe and Pavel Zemliansky (editors). Writing spaces: Readings on writing. Volume 1. Fort Collins, Colo. and West Lafayette, Ind.: WAC Clearinghouse and Parlor Press, pp. 205–224.
James P. Purdy, 2009. “When the tenets of composition go public: A study of writing in Wikipedia,” College Composition and Communication, volume 61, number 2, pp. W351–W373, at http://www.ncte.org/library/NCTEFiles/Resources/Journals/CCC/0612-dec09/CCC0612When.pdf, accessed 19 December 2010.
James P. Purdy and Joyce R. Walker, 2007. “Digital breadcrumbs: Case studies of online research.” Kairos: A Journal of Rhetoric, Technology, and Pedagogy, volume 11, number 2, at http://kairos.technorhetoric.net/11.2/binder.html?topoi/purdy-walker/, accessed 19 May 2012.
James P. Purdy, Douglas Eyman, Joyce R. Walker, and Colleen Reilly (editors), 2007. “Online research, writing, and citation practices,” special issue of Computers and Composition Online (Fall), at http://www.bgsu.edu/cconline/edwelcome_special07.html, accessed 19 May 2012.
Lee Rainie, Leigh Estabrook, and Evans Witt, 2007. “Information searches that solve problems,” Pew Internet & American Life Project (30 December), at http://libraries.pewinternet.org/2007/12/30/information-searches-that-solve-problems/, accessed 19 June 2008.
Michelle Sidler, 2002. “Web research and genres in online databases: When the glossy page disappears,” Computers and Composition, volume 19, number 1, pp. 57–70. http://dx.doi.org/10.1016/S8755-4615(02)00080-4
Aaron Smith, Lee Rainie, and Kathryn Zickuhr, 2011. “College students and technology,” Pew Internet & American Life Project (19 July), at http://www.pewinternet.org/Reports/2011/College-students-and-technology/Report.aspx, accessed 22 February 2012.
Madeline Sorapure, Pamela Inglesby, and George Yatchisin, 1998. “Web literacy: Challenges and opportunities for research in a new medium,” Computers and Composition, volume 15, number 4, pp. 409–424. http://dx.doi.org/10.1016/S8755-4615(98)90009-3
Bonnie L. Tensen, 2010. Research strategies for a digital age. Third edition. Boston: Wadsworth.
Christen Thompson, 2003. “Information illiterate or lazy: How college students use the Web for research,” portal: Libraries and the Academy, volume 3, number 2, pp. 259–268. http://dx.doi.org/10.1353/pla.2003.0047
Michelle Twait, 2005. “Undergraduate students’ source selection criteria: A qualitative study,” Journal of Academic Librarianship, volume 31, number 6, pp. 567–573. http://dx.doi.org/10.1016/j.acalib.2005.08.008
Yu–Mei Wang and Marge Artero, 2005. “Caught in the Web: University student use of Web resources,” Educational Media International, volume 42, number 1, pp. 71–82. http://dx.doi.org/10.1080/09523980500116670
Writing in Digital Environments (WIDE) Research Center, Michigan State University, 2010. “The writing lives of college students,” A WIDE survey and whitepaper (September), at http://wide.msu.edu/special/writinglives/, accessed 20 September 2010.
Kathryn Zickuhr, 2010. “Generations 2010,” Pew Internet & American Life Project (16 December), at http://pewinternet.org/Reports/2010/Generations-2010.aspx, accessed 24 February 2012.
Received 3 June 2012; revised 9 July 2012; accepted 23 July 2012.
This paper is licensed under a Creative Commons Attribution–NonCommercial–NoDerivs 3.0 Unported License.
Why first-year college students select online research resources as their favorite
by James P. Purdy
First Monday, Volume 17, Number 9 - 3 September 2012