First Monday

How college students use the Web to conduct everyday life research

by Alison J. Head and Michael B. Eisenberg



Abstract
This paper reports on college students’ everyday life information–seeking behavior and is based on findings from 8,353 survey respondents on 25 U.S. college campuses. A large majority of respondents had looked for news and, to a slightly lesser extent, decision–making information about purchases and health and wellness within the previous six months. Almost all the respondents used search engines, though students planning to purchase something were more likely to use search engines, and those looking for spiritual information were least likely to use search engines. Despite the widespread use of search engines, the process of filtering relevant from non–relevant search results was reportedly the most difficult part of everyday life research. As a whole, these students used a hybrid information–seeking strategy for meeting their everyday life information needs, turning to search engines almost as much as they did to friends and family. A preliminary theory is introduced that describes the relationship between students’ evaluation practices and their risk–associated searches.

Contents

Introduction
Literature review
Research questions
Methods
Methodological issues
Results
Discussion
Conclusion
Opportunities

 


 

Introduction

Besides “Googling it,” how do today’s college students look for information to solve problems in their daily lives?

As part of an ongoing research study, we investigated how college students conduct everyday life research — what types of information needs they have, and what information sources and practices they use to satisfy these needs.

Developmental psychologists have long identified the early 20s as a crucial time for learning and applying problem solving skills (Arlin, 1975; Commons, et al., 1989) [1]. Ideally, the college experience rapidly advances students’ cognitive development. Students are often asked to consider differences in viewpoint, what aspects of a topic may remain unexplored, and how a piece of knowledge or an issue may serve as a call for individual action later in life.

At the same time, students must perform information–seeking tasks for school, work, and their personal, daily lives, often for the first time. As a result, information–seeking activities may be equally or more complex for students than those undertaken by full–fledged adults who have already adjusted to life at large (Rieh and Hilligoss, 2008).

These factors make college students a unique cohort to study, especially today when an unprecedented number of students were born digital [2]. A parade of new digital technologies has been a constant feature in most of their lives. For this generation, information–seeking strategies are being formed, practiced, and learned. These methods are put to the test in the vast information landscape of their college years.

Overall, little is known about the everyday information worlds of today’s college students. What kinds of information do students frequently need in their daily lives? Which online and off–line sources do they use for solving information problems? What makes everyday life research difficult for them?

This paper presents findings from a survey of 8,353 students on 25 U.S. campuses in the spring semester of 2010. We collected data about how students conceptualized and operationalized research for personal use in their daily lives.

The primary contribution of this research is an inside view of the early adult’s everyday life research process. Specifically, we focus on students’ blended usage of computer– and human–mediated communication channels for solving information problems and evaluating sources in everyday life.

 

++++++++++

Literature review

Scholars in library and information science have long been concerned about college students and their information problem solving strategies. The concept of information literacy has been formalized as an essential element of a library’s mission, especially in college settings (Maughan, 2001).

In 1989, the Association of College and Research Libraries (ACRL) defined information literacy as a “set of abilities requiring individuals to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information” [3]. ACRL updated its standards in 2000 in response to three characteristics of the digital age: (1) a plethora of new information technologies and online information sources, (2) a professional concern about the “escalating complexity” of the information retrieval environment, and, (3) the critical need to teach undergraduates skills for lifelong learning [4].

Numerous books and dozens of studies have been devoted to information literacy instruction and assessment (Eisenberg and Berkowitz, 1990; Gavin, 2008; Gross and Latham, 2009; Oakleaf, 2011, 2008; Radcliff, et al., 2007; Warner, 2008). Qualitative and quantitative models for assessing the information problem–solving process have also been developed (Head and Eisenberg, 2009, 2008; Kuhlthau, 2004).

Despite these efforts, at last count, only 13 percent of a sample of test–takers made up of high school seniors and college students could be considered information literate [5].

Library and information science researchers have contended that many college students have little or no knowledge of the ongoing scholarly research process (Leckie, 1996). Most students are frustrated by the ambiguity of intellectual discovery (Kuhlthau, 2004).

Moreover, undergraduates struggle with finding different kinds of contexts (i.e., big picture, language, situational, and information gathering) when conducting course–related research, and to a lesser extent, everyday life research (Head and Eisenberg, 2009; 2008).

Despite the abundant online and off–line sources available to them, most students rely on a small collection of “tried and true” sources for course–related research: course readings, search engines, and Wikipedia (Head and Eisenberg, 2010; 2009).

Everyday life research

One critical gap in library and information science research has been its predominant confinement to information literacy in formal learning environments; only a thin strand of research examines how college students conduct research for personal use in daily life.

In the mid–1990s, Reijo Savolainen, a Finnish scholar, first defined the research field of everyday life information seeking. Applying Pierre Bourdieu’s concept of habitus, Savolainen developed a framework for understanding information–seeking behavior at work and at home [6]. Notably, he claimed that individuals engage in hobbies and seek practical information in ways shaped and driven solely by their personal values and attitudes (Savolainen, 1995).

Further studies in everyday life research introduced the concept of information grounds — purposeful and temporary places where serendipitous and informal information sharing occurs. These exchanges are a by–product of some intended activity, such as receiving treatment at health clinics (Pettigrew, 1999).

One study has investigated the information grounds of college students (Fisher, et al., 2007). Based on interviews (n=729), researchers found students frequently exchanged everyday life information in bars, coffee shops, and/or in hallways outside of classrooms.

Most college students in the study (70 percent) visited some sort of information ground daily. Nearly half the sample found the everyday life information they gathered useful, whether it was about a class or a new idea about life that had not occurred to them before. Overall, the research suggests college students frequently engage in informal, serendipitous information exchanges with other like minds.

Online activity studies

As a whole, there is a relatively small group of studies about everyday life research (Chatman, 2000; Dervin, 1992; Fisher, et al., 2007; Meyers, et al., 2009). A majority of the literature focuses on information sharing in conventional, physically situated places. The usage of alternative networked information grounds, such as Facebook, has yet to be widely studied (Savolainen, 2009).

The ongoing research from the Pew Internet & American Life Project is a bountiful “fact tank” about Internet usage. Based on telephone surveys of large U.S. samples, the Pew studies have focused on individual online activities.

Over the years, several Pew studies have focused on college students and their Internet usage. A 2002 study (n=2,501) found that 42 percent of the college students sampled used the Internet to communicate socially with friends and only 10 percent used the Internet primarily for entertainment (Madden and Jones, 2002).

A follow–up longitudinal study compared the 2002 Pew findings with a 2005 replication study (n=7,421) (Jones, et al., 2009). Results showed that students’ Internet usage for entertainment almost tripled between 2002 (10 percent) and 2005 (28 percent). Researchers suggested that e–mailing, searching, and browsing habits might have been replaced, within three years, by the use of Web 2.0 sites like Facebook and YouTube.

More recently, a 2010 Pew study reports how different generations use the Internet, including millennials — those born between 1977 and 1992 (Zickuhr, 2010) [7]. All in all, the study found millennials (n=676) frequently engage in a variety of information–seeking activities using the Internet. They rely on search engines to do so; a majority of them search for health, news, purchasing, and trip–planning information.

Taken together, studies such as these provide trend data about students’ online activities. In particular, the data have measured college students and their increased use of the Internet for social communication. A large body of scholarly studies has also delved into college students and their use of social network sites, specifically to acquire online social capital (boyd and Ellison, 2007; Ellison, et al., 2007; Valenzuela, et al., 2009).

The purpose of our research is to provide data about the range of students’ everyday life information needs, the online and off–line sources they consult, their evaluation practices, and the barriers and challenges they have with their processes. Findings such as these are significant for understanding what kind of lifelong learners college students, who were born digital, may eventually become.

 

++++++++++

Research questions

We investigated how college students apply their everyday life information literacy competencies — independently of course work, professors’ expectations, and grades.

The goals of this study were twofold: (1) to understand what information needs students have in their daily lives; and, (2) to explore how students solve and satisfy their needs for personal information by using online and off–line sources.

We studied how college students are conducting everyday life research in five related areas:

  1. What personal information needs occur in the daily lives of students?

  2. What sources do students consult for finding everyday life information?

  3. What predictors reveal which type of students are more or less likely to use search engines such as Google for solving information problems?

  4. What evaluation criteria do students use to judge the quality of sources they have found and whom do students ask for help with evaluating everyday life information sources?

  5. What is difficult about conducting everyday life research?

 

++++++++++

Methods

Our research was conducted as part of Project Information Literacy (PIL). PIL is a national study based in the University of Washington’s Information School [8]. The ongoing research is a large–scale study of college students and their research habits. In this study, we have used the college student experience to study everyday life research behavior.

We collected data for this paper in two phases: (1) student focus groups in 2008, and (2) a large–scale survey with follow–up interviews in 2010.

Phase one: Student focus groups

The PIL team conducted 11 student focus groups on seven campuses in the U.S. between October and December 2008 [9]. On average, each session was 90 minutes long. A total of 86 students participated in the sessions.

We used the focus groups to gauge consensus about participants’ research habits, approaches, and experiences. The qualitative data helped define response categories in our 2010 survey. A segment of the sessions focused on everyday life research. We discussed information needs, behaviors, and sources that college students used.

Participants ranged from 20 to 30 years of age. They were full–time sophomores, juniors, and seniors from four–year public and private colleges and universities, and full–time community college students, who had completed at least one semester at the institution [10]. Seventy percent of the students who participated in the focus groups were female [11].

The focus group sample consisted primarily of students in the humanities or social sciences [12]. This group of students, we assumed, was likely to be acquainted with “desk research” (i.e., secondary data that has been collected by someone else). The mean GPA for the total student sample across all seven schools was 3.44, or just above a B+.

Phase two: Large–scale student survey

We also collected data through a large–scale survey we administered to 112,844 students on 25 U.S. campuses from 6 April 2010 through 18 May 2010 [13].

Our sample size was 8,353 responses. The overall response rate was 7.4 percent.

The 22–item survey was administered online and ran for two weeks on each campus. One e–mail reminder was sent to non–respondents after the first week of the survey launch.

During our ongoing research, we have studied both course–related and everyday life research processes. We define everyday life information research as the information seeking conducted for personal reasons and not directly related to fulfillment of a course assignment. This includes information for solving problems arising in the course of daily life and to satisfy general inquisitiveness about a topic.

The data appearing in this paper is based on three survey topics about students’ everyday life information–seeking behavior. We asked respondents about their needs, approaches, evaluation methods, and difficulties.

Survey sample

We collected data from a large voluntary sample of sophomores, juniors, and seniors during the spring of 2010. Table 1 presents an overview of the demographic make–up of the sample.

 

Table 1: Description of the survey sample.
Demographics                                | N     | Frequency
Total                                       | 8,353 | 100%
Female                                      | 5,415 | 65%
Male                                        | 2,823 | 34%
Declined to state                           | 57    | 1%
No response                                 | 58    |

Sophomore                                   | 2,255 | 27%
Junior                                      | 2,724 | 33%
Senior                                      | 3,374 | 40%

18 to 20 years old                          | 3,046 | 37%
21 to 22 years old                          | 3,684 | 44%
23 to 25 years old                          | 675   | 8%
Over 25 years old                           | 881   | 11%
Declined to state                           | 28    |
No response                                 | 39    |

Arts and humanities                         | 1,747 | 21%
Business administration                     | 913   | 11%
Engineering                                 | 883   | 11%
Sciences                                    | 2,316 | 28%
Social sciences                             | 2,366 | 28%
Double majors                               | 73    | 1%
Undecided                                   | 48    |
No response                                 | 7     |

Private college or university (four–year)   | 1,236 | 15%
Public college or university (four–year)    | 7,040 | 84%
Community college (two–year)                | 77    | 1%

 

More students who were 21 or 22 years old (44 percent) took the survey than students of any other age group. In other words, the largest percentage of students in our sample were born in 1989 — the same year Timothy Berners–Lee, a researcher at CERN, wrote his initial proposal describing the World Wide Web.

More of the students in the sample were studying the social sciences (28 percent) and the sciences (28 percent) than any other fields. Other respondents were studying arts and humanities (21 percent), business administration (11 percent), and engineering (11 percent).

The most frequently reported GPA was in the category of 3.4 to 3.7. As a point of reference, we calculated this grade point average as being between a B+ and an A- [14].

Follow–up interviews

Lastly, we conducted follow–up interviews with students in our sample who had volunteered their time (n=25).

The sample was segmented along four lines: (1) respondents with high (4.0) vs. low GPA (2.4), (2) disciplinary area of study, (3) frequent vs. infrequent use of librarians, and (4) specific difficulties with research.

Each interview was conducted by telephone and lasted from 15 to 30 minutes. With interviewees’ permission, the interviews were recorded, yielding a total of eight hours and 10 minutes of audio.

We used a script with seven open–ended questions as a guideline for the conversational interviews with participants. To ensure consistency, the same person conducted all of the interviews.

 

++++++++++

Methodological issues

There are several challenges associated with using a survey methodology in any social science research study.

It is one thing to limit conclusions to the sample of respondents who actually participate in the study and quite another when an attempt is made to generalize from those responses to some larger population.

The sampling strategy for descriptive studies relies upon the degree to which the sample is representative of a larger population. The most common approach to this problem is a sampling design in which there is a known probability of inclusion for every respondent sampled from the larger population.

In our research, the samples for our 2008 focus groups and 2010 survey were both composed of self–selected volunteers from a larger population. Such samples may be biased in unknown ways. There is no basis for making inferences to the population from the sample responses.

Another frequent issue is response rate. A limitation of our study is the 7.4 percent response rate to the student survey. Clearly, a response rate this low means results cannot be generalized to the entire college student population.

Instead, analytical studies such as ours test the robustness of relationships appearing in the data. Thus, while it might be difficult to argue about the absolute level of utilization of a specific information–seeking technique, for example, focusing on relationships allows us to test the robustness of what has been found. It can be argued that these relationships do exist in the larger population and could be seen in any sample used to describe them.

While fully acknowledging that further research is required to confirm our findings, especially in terms of generalizing to the full college population, we assert that the relationships among variables are consistent across samples and reflect relationships that do exist in the larger population.

Clearly, response rate matters, but it matters more in descriptive than in analytical studies. In the last five or six years, this issue has been raised and the importance of a high response rate has been questioned. The American Association for Public Opinion Research (AAPOR), among others, has published provocative studies claiming the relationship between response rates and survey quality has become less clear [15].

 

++++++++++

Results

Unsurprisingly, we found different information needs arose in the daily lives of college students.

Could a recent tick bite cause Lyme disease? What news is being reported in the hometown newspaper? What does a diagnosis of breast cancer mean for the patient? What is the starting salary for civil engineers? What are the values of a certain religious group? [16]

In our focus groups, participants identified three kinds of information needs. Participants discussed searching for information to (1) satisfy curiosity, (2) find a fact quickly, and (3) solve a specific information problem (Head and Eisenberg, 2008) [17].

Almost all of the participants described searches to satisfy their curiosity (e.g., the year in which the Boer War ended) and information for fact–finding (e.g., movie times at a local theater) as being quick one–offs.

Students also discussed searches for solving information problems they considered somewhat riskier and more complicated. These searches sometimes lasted for days, especially since there was no instructor–assigned deadline or grade, as there is with course–related research (e.g., finding out what a relative’s cancer diagnosis meant).

As one participant explained, “Everyday research can be circuitous and time–consuming and it does not involve the same type of research skills as course–related research does. Often a lot is at stake with everyday research — the only way to find out if you are right is to go out into the world and see if your answer works for you.”

We heard similar comments about the connection between the importance of the problem and search time and effort from survey respondents in the follow–up interviews. One respondent said, “Money is a big basis for how I research things in my life. The more expensive something is, the more time I’m going to research and determine whether I really want it.”

Another survey respondent described a search for information requiring both computer–mediated and human–mediated sources.

In the case of curing food, if you do it improperly you can get sick and die. I went online and looked through a couple of blogs but the comments sounded really corny, so I blew those off and found a cookbook with basic information online.

You need to be careful about what your sources are. I looked online but I also went to the County Extension Office and asked for credible sources, too. If you’re just writing a paper for class, it reflects on your knowledge, skills, abilities and ethics. If you’re curing a ham, the knowledge, skills, ability and discernment you use actually affect your health and your life. Big difference.

Taken together, nearly all of the participants agreed they were “more caught up” and “more engaged” in everyday life research than with course–related research. This was especially true when searches were meant to solve information problems with higher–stakes or real–life consequences.

Information for making decisions

Students we studied had a strong need to know what is going on in the world beyond the campus. More students in our survey sample (79 percent) had searched for news in the previous six months than for anything else (see Figure 1).

Yet, as a whole, the majority of students’ information needs were directly related to personal decision–making in their daily lives. Nearly three–quarters of the sample reported looking for information about a product and/or a service (74 percent) and/or health and wellness topics (74 percent). Another two–thirds of the sample had searched for information about jobs or a career (67 percent) and about travel and trip planning (61 percent).

Looking for information for making decisions trumped finding someone with similar interests (i.e., social communication). Slightly more than half of the respondents (51 percent) reported searching for information for making social contacts. These findings suggest respondents drew a distinction between needing information for solving everyday life problems vs. communicating with others.

Further, less than half the sample reported that their recent search for information was related to their domestic life (46 percent). About a third of the respondents (36 percent) had searched for an answer to a work–related question and/or information about advocacy or causes (32 percent). Still, fewer students in the sample (24 percent) searched for spiritual information about a group and/or beliefs or to find an expert, such as a physician, therapist, or attorney (20 percent).

Overall, the results reveal students in our sample had an underlying hierarchy to their information needs. While most respondents sought information for staying current, another two–thirds of the sample looked for information about making decisions directly related to their individual lives (e.g., purchasing something, health/wellness, finding a job, and trip planning).

At the same time, few respondents appear to have searched for information that might lead to community involvement or civic engagement (i.e., advocacy or spiritual/religious information). These findings suggest students’ more frequent information needs may be more motivated by personal needs than community engagement.

 

Figure 1: Students’ everyday life information needs.
This figure shows information needs arising for respondents within the previous six months. Respondents were asked to “click all that apply.”

 

Finding information

Almost all the respondents relied on the same few information sources for finding everyday life information. A large majority of respondents used the Web for everyday life information needs. Nearly all of the respondents (95 percent) used Web search engines for gathering everyday life information (see Figure 2).

Similarly, focus group participants also mentioned using search engines. Unsurprisingly, most participants mentioned Google by name.

Familiarity with Google, combined with its accessibility, drove its use. As one focus group participant put it: “Google is always my first step, even though I know it may not be the best first step, but it is my most accessible one.”

In our student follow–up interviews, we also found that search engines serve up consensus, which some value. As one interviewee said, “typing something into Google and finding the same information from different sites verifies information for me — most people agree; they are thinking the same thing about a given subject — it works.”

Another frequently used source was Wikipedia. Almost nine out of 10 in the survey sample (87 percent) reported using it for everyday life research.

When talking with students, we found an inevitable relationship between Google and Wikipedia. In other words, students recognized that a Google search often returned a Wikipedia entry on the first page of results.

As one participant in the focus groups explained: “I don’t really start with Wikipedia; I Google something and then a Wikipedia entry usually comes up early on, so I guess I use both in kind of a two–step process.”

A survey respondent we interviewed reported going to Wikipedia because “for the most part I trust Wikipedia because it is something that is double–checked by its users pretty frequently.”

Yet, we also found students surveyed did not solely rely on the Web when asked how often they consulted a list of 13 computer–mediated and human–mediated sources. A large majority of respondents also reported turning to friends/family, and classmates (see Figure 2).

Over four–fifths of the respondents (87 percent) turned to friends/family and classmates (81 percent) for everyday life information. To a far lesser extent, the sample turned to instructors (53 percent), and librarians (14 percent).

Convenience was a trigger for prioritizing the use of certain sources — both computer– and human–mediated. As one focus group participant explained, “I know it sounds kind of bad, but I’ll only ask a person for everyday life information if they are closer than my computer.”

In a follow–up interview, a survey respondent said, “my parents are generally the first people I ask because they are overall pretty intelligent and I can always get a hold of them.”

A large percentage of respondents also relied on their own documentary collections (75 percent) to meet information needs in daily life. These were materials they already had in hand (e.g., notes, books, magazines, printouts of online materials).

A majority of the sample used other Web sites to find information. Almost two–thirds of the sample (63 percent) reported turning to government Web sites. Half the sample (50 percent) used blogs for everyday life information.

At the same time, seven out of 10 respondents (70 percent) used social network sites, such as Facebook, for everyday life information. The finding suggests respondents used social networks for solving information problems as well as for social communication.

We were struck by respondents’ reported use of online research databases (e.g., JSTOR, EBSCO, or ProQuest) for everyday life research. These sources are usually considered the domain of course–related research and are available through the campus library.

Yet, well over a third of the respondents also reported using research databases (40 percent) for finding everyday life information. Other campus materials used for personal searching by students in the sample included online and print encyclopedias, such as Britannica (37 percent) and the campus library’s shelves (28 percent).

Overall, the findings confirm the conventional wisdom — the Web, and especially search engines, are the go–to sources for finding information in everyday life. At the same time, respondents relied on friends, family, and classmates almost as much as they relied on the Web for everyday life information.

These findings suggest respondents are driven by familiarity and habit; the use of convenient, nearby sources drives usage. Yet, to a lesser extent, respondents consulted materials in the campus library, including scholarly research databases. This finding suggests students may also have a need for authoritative fact–finding sources found through the library when conducting everyday life research.

 

Figure 2: Sources students use for everyday life information.
Results are ranked from the most to the least frequent sources students used for everyday life research within the previous six months. Responses of “almost always,” “often,” and “sometimes” have been conflated into a new category of “use.”
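
To make the conflation concrete, the following sketch shows one way such a recoding could be done. It is a minimal illustration, not the authors’ analysis code; the file, column, and category names are hypothetical.

    # Hypothetical sketch: conflating frequency responses into a binary "use" flag.
    import pandas as pd

    df = pd.read_csv("survey_responses.csv")  # hypothetical file of raw responses

    # "Almost always," "often," and "sometimes" count as "use";
    # anything else ("rarely," "never," no answer) does not.
    use_levels = {"almost always", "often", "sometimes"}
    df["uses_wikipedia"] = (
        df["wikipedia_frequency"].str.lower().isin(use_levels).astype(int)
    )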

 

Ubiquitous search engine usage?

That nearly all of the respondents used search engines to find everyday life information is unsurprising. What needs to be examined, however, are the circumstances in which search engines were more likely to be used — and not used.

We used logistic regression analysis to investigate which members in our college student sample were likely to use search engines to meet which kinds of information needs.

We examined the relationship of specific student characteristics (i.e., age, major, information resource usage, and information needs) to the likelihood respondents would use search engines for everyday life research (see Table 2).

The model contained 27 independent variables in three groups:

  1. Information needs (i.e., health/wellness, news, purchasing, job–related questions, domestic life, work/career planning, spiritual, travel, advocacy, social contacts, and experts).

  2. Information resource usage (i.e., Wikipedia, friends/family, classmates, personal collection, government sites, scholarly research databases, social networks, instructors, encyclopedias, blogs, library shelves, and librarians).

  3. Major area of study (i.e., arts and humanities, business administration, engineering, sciences, and social sciences).

The model’s dependent variable was “the use of search engines.” We determined use by students’ responses to a survey question about the use of search engines during the everyday life research process.

The full model containing all predictors of search engine usage correctly classified 98.6 percent of the cases and had a (Nagelkerke) R–squared value of 28 percent. In other words, 28 percent of all the variance in the use of search engines can be accounted for by these variables, using this model.
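
For readers who want to see how a model like this is typically estimated, the sketch below shows one way the analysis could be reproduced with statsmodels. It is a minimal sketch under stated assumptions, not the authors’ actual code: the data file and column names are hypothetical, and each predictor is assumed to be a 0/1 indicator coded from the survey responses.

    # Hypothetical sketch of the logistic regression summarized in Table 2.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("survey_responses.csv")  # hypothetical coded survey data

    needs = ["health_wellness", "news", "purchasing", "at_work_question",
             "domestic_life", "work_career", "spiritual", "travel",
             "advocacy", "social_contacts", "search_for_expert"]
    sources = ["wikipedia", "friends_family", "classmates", "personal_collection",
               "government_sites", "scholarly_databases", "social_networks",
               "instructors", "encyclopedias", "blogs", "library_shelves",
               "librarians"]
    majors = ["business", "engineering", "social_sciences", "sciences"]
    # Arts and humanities majors form the omitted (default) category.

    X = sm.add_constant(df[needs + sources + majors])
    y = df["uses_search_engines"]  # dependent variable: 1 = used search engines

    model = sm.Logit(y, X).fit()

    # Odds ratios and 95% confidence intervals, as reported in Table 2.
    odds_ratios = np.exp(model.params)
    conf_intervals = np.exp(model.conf_int())

    # Nagelkerke R-squared: the Cox-Snell R-squared rescaled to a 0-1 range.
    n = model.nobs
    cox_snell = 1 - np.exp((2 / n) * (model.llnull - model.llf))
    nagelkerke = cox_snell / (1 - np.exp((2 / n) * model.llnull))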

As shown in Table 2 (below), five independent variables were significantly associated with search engine usage at the .05 level, although their effects were small. These variables appear asterisked in the first column of the table.

 

Table 2: Predicting the probability of using search engines during everyday life research.
Predictor                              | B      | S.E. | P    | Odds ratio | 95% C.I. lower | 95% C.I. upper
Health/wellness                        | .260   | .227 | .253 | 1.297      | .831           | 2.206
News                                   | .253   | .226 | .264 | 1.288      | .827           | 2.006
*Purchasing                            | .812   | .239 | .001 | 2.253      | 1.411          | 3.596
At–work question                       | -.162  | .254 | .524 | .851       | .517           | 1.399
Domestic life                          | .194   | .257 | .451 | 1.214      | .734           | 2.007
Work/career                            | .278   | .227 | .219 | 1.321      | .847           | 2.060
*Spiritual                             | -.641  | .262 | .014 | .527       | .315           | .880
Travel                                 | .355   | .248 | .152 | 1.426      | .877           | 2.318
Advocacy                               | .017   | .304 | .954 | 1.018      | .561           | 1.847
Social contacts                        | .322   | .271 | .234 | 1.380      | .812           | 2.347
Search for expert                      | .733   | .441 | .097 | 2.081      | .876           | 4.942
*Blogs                                 | .730   | .164 | .000 | 2.075      | 1.504          | 2.863
*Wikipedia                             | .492   | .086 | .000 | 1.635      | 1.381          | 1.936
*Government sites                      | .513   | .115 | .000 | 1.670      | 1.332          | 2.093
Scholarly databases                    | .199   | .122 | .102 | 1.220      | .961           | 1.548
Librarians                             | -.305  | .182 | .094 | .737       | .516           | 1.053
Library shelves                        | .017   | .156 | .915 | 1.017      | .748           | 1.381
Instructors                            | .089   | .114 | .434 | 1.094      | .874           | 1.368
Encyclopedias                          | .087   | .130 | .504 | 1.091      | .845           | 1.408
Classmates                             | .111   | .126 | .377 | 1.118      | .873           | 1.430
Friends/family                         | .080   | .115 | .486 | 1.084      | .865           | 1.358
Social network sites                   | .129   | .099 | .192 | 1.138      | .937           | 1.381
Arts and humanities majors (default)   |        |      | .416 |            |                |
Business majors                        | .539   | .365 | .140 | 1.714      | .838           | 3.506
Engineering majors                     | -.117  | .266 | .660 | .889       | .528           | 1.498
Social science majors                  | .199   | .377 | .598 | 1.220      | .582           | 2.557
Science majors                         | -.111  | .373 | .765 | .895       | .430           | 1.860
Constant                               | -2.465 | .504 | .000 | .085       |                |
Note: * = significant at the .05 level.

 

Overall, the strongest predictor of using search engines was looking for information about purchasing something, with an estimated odds ratio of 2.25 (controlling for all other factors in the model). That is, the odds that a respondent seeking purchasing information used a search engine were 2.25 times the odds for a respondent who was not.
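
Each odds ratio in Table 2 is simply the exponential of its logistic regression coefficient B, so the reported values can be checked directly:

    odds ratio = exp(B)
    exp(0.812) ≈ 2.25   (purchasing)
    exp(−0.641) ≈ 0.53  (spiritual)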

The three other positive predictors of search engine usage were: (1) someone who also used blogs for everyday life research, with an estimated odds ratio of 2.08; (2) someone who used government sites for everyday life research, with an estimated odds ratio of 1.67; and, (3) someone who used Wikipedia for everyday life research, with an estimated odds ratio of 1.64 (each controlling for all other factors in the model).

Findings suggest respondents were likely to use search engines in combination with a small set of other information sources — blogs, government sites, and Wikipedia. Given the interactive nature of blogs, this finding suggests blogs may be a frequented networked information ground for search engine users.

Respondents who were looking for spiritual information about a group or beliefs were less likely to use search engines for everyday life research, with an estimated odds ratio of .53 (controlling for all other factors in the model) [18].

In other words, about half as many respondents used search engines when searching for spiritual information as when searching for other types of information.

Overall, the predictors from our model about the use of search engines are as follows:

  1. Respondents planning to purchase something were twice as likely to use search engines as those who were not (controlling for all other factors in our model).

  2. Blog readers were twice as likely to use search engines as respondents who did not use blogs (controlling for all other factors in our model).

  3. Respondents who used government sites and/or Wikipedia were one and a half times more likely to use search engines than respondents who did not (controlling for all other factors in our model).

  4. Those looking for spiritual information about a group and/or beliefs were less likely to use search engines than those who were not looking for spiritual information.

Critical to a fault

Most searches for information involve sizing up the information quality of a source once it is found. Is the source credible? Is the source up–to–date? Is the information accurate? Is the source useful for solving the information problem at hand?

We collected data about how frequently respondents judged sources using three criteria: (1) self–taught criteria, (2) traditional standards from the print world, and (3) domain–specific standards (see Figure 3).

Overall, we found most respondents were frequent evaluators of information for personal use. More than any other criteria, respondents relied on self–taught criteria for assessing the quality of everyday information they culled from the Web. More often than not, a site’s design received the most scrutiny (56 percent) [19].

As one participant in our focus groups explained, “the design of a site does a lot for me, if the color is bright pink, or lots of ads, or looks like it was made by a 15–year–old, then I think it probably isn’t worth my time.”

Similarly, in a follow–up interview, a survey respondent said: “When I’m searching the Web, one of the biggest things that I’m going to look at is the ease of use and if there is a bunch of broken links or ads for weird products then it’s a site I generally won’t trust.”

Another deciding factor for respondents was a site’s familiarity. More than half of the students surveyed (54 percent) reported that whether they had used a site before was a frequent criterion for assessing the quality of Web content.

Yet, familiarity was clearly different from referrals, according to students sampled. Fewer students (44 percent) relied on whether they had heard about a site before, and even fewer (11 percent) considered whether a librarian had referred them to a site.

At the same time, students relied on traditional and formal standards — timeliness and authority — from the scholarly print world and librarianship. More than half of the respondents (54 percent) considered the currency of Web content (e.g., checking the date in footer details). They also judged the authority of posted content, relying on the origin of a site’s URL (49 percent) and/or an author’s credentials (49 percent).

The least applied standards were domain–specific: criteria specific to the Internet and often used for judging the reliability, authority, and credibility of Web content (e.g., linkage, origins of a URL, footer details). Specifically, we found less than half of the respondents checked a site’s external links (43 percent), whether an author had credited the sources used (32 percent), and/or whether there was a bibliography of some kind (23 percent).

 

Figure 3: Criteria for evaluating Web content.
Results are ranked from most frequent to least frequent evaluation techniques. Responses of “almost always” and “often” have been conflated into a new category of “frequent use.”

 

Ask a friend

Students in our sample not only turned to people as information sources — they also trusted them when evaluating the quality of the sources they had found (see Figure 4). More than four–fifths of the sample (83 percent) turned to friends and/or family when they needed help evaluating sources for personal use — more than any other people in their lives [20].

Respondents also asked classmates (74 percent) and instructors (45 percent) for help. Yet, far fewer students asked licensed professionals (35 percent) or librarians (14 percent) for assistance when evaluating information in their everyday lives.

Students in the follow–up interviews explained that friends, family, and, in some cases, professors were trusted and convenient sources, both for recommending sources and for discussing the quality of information they found.

One student from the survey said, “I will ask my friends or my parents or even some professors about a Web site they would suggest, especially if I’m making purchases. For sure, I ask them for their knowledge and experiences so I don’t have to learn the hard way by having a bad experience.”

A few students we interviewed also said they often searched and evaluated online content on their own. However, if a search was important enough to them (e.g., making a purchase) they turned to another person in their lives for assistance.

One student from the survey explained, “sometimes I ask someone else, but it really depends on what I’m buying or how important something is to me but I usually wouldn’t ask someone about the reliability of a source because I feel I am pretty good at judging for myself what’s reliable and what I probably should stay away from.”

Overall, we found evaluation rarely occurs in a vacuum for the majority of students. Students tend to take little at face value when it comes to conducting everyday life research. Moreover, the findings suggest evaluation of sources frequently occurs and it is far from being a solitary task. Most students rely on friends and family when they need assistance — people in their lives close at hand, available, and trusted.

 

Figure 4: Asking for help with evaluating everyday life sources.
Results are ranked from the people students most frequently to least frequently turned to for evaluation guidance and help within the previous six months. Responses of “almost always,” “often,” and “sometimes” have been conflated into a new category of “use.”

 

Difficulties: Sorting and sizing up

Lastly, we investigated the difficulties with the everyday life information–seeking process. We collected data about 15 categories of research challenges.

We found respondents experienced the most problems during the later stages of the search processes for personal use (see Figure 5).

As a student in the focus group sessions explained: “What’s hard is finding the ‘right’ source that is exactly what you are looking for — it’s all there, but then how do I find that one source that helps later on when I need it again?”

Moreover, students surveyed struggled most with sorting through all they had found. Filtering relevant from non–relevant results (41 percent) was more difficult than anything else, the respondents reported.

To a lesser extent, students also reported being hobbled by being unable to locate information that they knew existed (33 percent). Nearly a quarter of the sample (23 percent) had trouble deciding when their search for an answer/information was actually finished.

Evaluating sources for personal use (24 percent) and, particularly, determining credibility (26 percent) also hampered about a quarter of the sample.

The task of finding an information source — an early step in the search process for personal use — was not as problematic for respondents (18 percent).

Likewise, relatively few respondents reported having problems finding Web content (11 percent), creating search terms (17 percent), reading materials online (19 percent), finding current sources (19 percent), or finding articles in databases (20 percent).

The findings suggest that students have the most difficulty with using information — selecting from results they have searched and then netted — rather than the initial decision of which information source to use for a search.

The widespread use of search engines may further explain why sorting through results was difficult. Even the most poorly constructed search queries are likely to return results when search engines are used. But, making sense of the results — deciding and prioritizing relevance — is more complex and challenging.

Typing a few search terms into the input box may be fairly easy, but deciding what to use may be far more difficult. If an inferior information source is selected and applied, it may have dire personal and financial consequences, depending on the information need.

 

Figure 5: Difficulties with everyday life research.
Results are ranked from the statements about difficulties with everyday life research with which students most agreed to those with which they least agreed. Responses of “strongly agreed” and “somewhat agreed” have been conflated into a new category of “agreed.”

 

 

++++++++++

Discussion

Overall, our data present surprising findings that belie conventional wisdom about the information–seeking habits of college students outside of the academic realm.

Far from all of the searches college students conduct in their daily lives are one–offs to satisfy a passing curiosity, settle a bar bet, or find something to do that night.

Instead, our discussions with students revealed that many searches involve decision–making to resolve a specific problem with real–life consequences. These searches were more time–consuming, sometimes going on for days. Almost all of the participants in our focus groups agreed that everyday life research was far more engaging than course–related research.

For many students surveyed, search engines such as Google were the go–to source for everyday life information. At the same time, it is significant that we found some exceptions to the ubiquitous search engine rule.

Notably, when students in our sample were seeking spiritual information they were least likely to use search engines — about half as many respondents used search engines when searching for spiritual information as when searching for other types of information.

The data we present, though, does not explain why students use search engines less when looking for spiritual information. One explanation for this finding — among the 24 percent of the sample who looked for spiritual information — is that students found religious information (e.g., printed brochures) without using search engines [21].

In this sense, the data we collected from our questionnaire is limited. Our survey did show that spiritual information was the topic least queried with search engines, but this finding raises some interesting questions that are beyond the scope of our study and worthy of future research. For instance, what other topics may not be “search engine–first” topics? Why, according to users?

At the same time, we found respondents were more than twice as likely to use search engines when looking for purchase information [22]. This data suggests students do not use search engines under the same information–seeking conditions. In short, the findings tell us not all Google searches are created equal when it comes to information seeking in everyday life.

Moreover, we found a majority of our sample frequently used Wikipedia, social networks, and government Web sites for finding everyday life information. This finding suggests students use an information–seeking strategy that is not single–source driven. In other words, students’ searches for personal use do not automatically start and end by typing a few keywords into Google — many go to site addresses they already know.

To that end, it is significant that respondents reported using friends and family in their everyday life information–seeking process. The students we studied turned to friends and family more than they did Wikipedia. More than four–fifths of the respondents asked friends and/or family when they needed help evaluating sources for personal use. This finding suggests students use a hybrid information–seeking strategy that blends online sources (e.g., Wikipedia) with off–line sources, such as people that they know.

In a larger sense, these results are striking if they are compared with data we collected from the same sample about course–related research. We found fewer students in the sample turned to someone else for help when evaluating materials for assignments (Head and Eisenberg, 2010).

These findings lead us to conclude that evaluating information for personal use is a critical and highly collaborative process, perhaps more so than most may think. All in all, few students appear to let Web content stand on its own. Many students appear to apply a multi–faceted set of self–taught criteria for judging Web content, sizing up the design of a site, its familiarity to them, and its timeliness. In many cases, students discuss the quality of the information they have found online with a trusted friend or family member.

A preliminary theory

Our data lays the groundwork for a preliminary theory of the Web evaluation process used during everyday life research by young adults. Our theory proposes college students use a fairly involved process when evaluating — not just finding — certain kinds of Web content.

While researchers have found people initially use interpersonal sources (i.e., friends and family) for finding information sources about recreational activities (Kayahara and Wellman, 2007) and music (Tepper, et al., 2008) and then go online for supplementary information, our preliminary theory adds another piece to this puzzle. Our preliminary theory describes the relationship between students’ evaluation practices and their risk–associated searches.

Our student interviews, in particular, suggest students may be more likely to use a blended evaluation process, employing both online and off–line sources, when more is at risk (e.g., spending money). In other words, when students perceive the consequences to be greater, they are more apt to go off–line to double–check the quality of information they have found with a human–mediated source, or sources.

At the same time, we fully acknowledge these suggested outcomes are based on a small set of data in our study, derived from student interviews (n=25) and focus group comments (n=86). We have no data from the survey sample (n=8,353) establishing a relationship between associated risk and the likelihood of using a combined computer–mediated and human–mediated evaluation process.

We therefore recommend further research to explore the relationship between evaluation practices and risk–associated searches in order to substantiate our preliminary theory. Results from a large survey sample along with statistical testing may help to reveal useful results. In–depth interviews may present other methodological options, adding qualitative depth and richness to the data collected.

In either case, if it holds true that students “amp up” their evaluation efforts during risk–associated decision–making, the findings would add an important piece to a blended Web evaluation theory. Future research may be able to answer additional questions about how online channels may be interwoven with human–mediated ones, to what extent, in what order of use, and under what information–seeking circumstances. What is the basis of a risk–associated search for students, besides making purchases? How far do students go in their evaluation process to offset their anticipated risks?

Depending on the findings, of course, the data may show today’s students spend time, dig deep, and double–check certain kinds of information well beyond what they find before them on the screen — even when answers may not be as nearby and convenient as what they may be able to find online.

In a larger sense, these findings provide further data that debunks the myth of “digital natives.” In other words, the findings would lend support that not all students who were born digital go online for everything. Our findings suggest many of today’s students may not think, learn, and find information in profoundly different ways from prior generations [23].

Ironic twist

Lastly, we address an ironic twist in our data, which suggests a different research opportunity. Despite their widespread use of search engines, our sample struggled with processing all that the sites served up to them. Specifically, more respondents found it difficult to sort relevant from irrelevant results than anything else when trying to find information for use in their daily lives.

This finding leads us to conclude that making use of everyday life information — getting to the most useful information — may be the information literacy skill students lack the most when it comes to their everyday life information research process.

Future research in information literacy about the challenges students face beyond their academic information–seeking activities is much needed. While our data tells us students suffer from information overload, future research needs to investigate what solutions and workarounds students may employ and to what end, as far as making them better informed in daily life.

In the often–neglected area of everyday life research, such studies could help inform librarians, educators, and administrators about what happens to students the day after graduation — once they enter the workplace and their communities and become full–fledged adults and lifelong learners.

 

++++++++++

Conclusion

This study investigated how college students conduct their everyday life research. We studied the information needs, sources, evaluation practices, and challenges arising when students looked for information to use in their daily lives.

Overall, we found:

  1. Beyond the academic realm, college students frequently searched for information about news, purchases, and health/wellness in their daily lives. Respondents used search engines most often. Yet, they also turned to friends and family nearly as much, and also asked them for help with evaluating the quality of what they had found. These findings suggest students use a hybrid information–seeking process for finding information for personal use, combining computer–mediated sources with human–mediated ones in a fairly complex evaluation process.

  2. College students’ reliance on search engines to meet any and all information needs did not hold under all circumstances in our study. Respondents were least likely to use search engines when looking for spiritual information about a group and/or beliefs. In addition, search engines were twice as likely to be used for finding information about a possible purchase — a decision that had some level of financial risk (e.g., spending money). These findings suggest search engines, despite their frequent use, are not used for any and all kinds of searches in students’ daily lives.

  3. Ironically, students struggled with processing results and finding the good, relevant material they needed. These findings suggest that when students are left on their own — apart from course work, grades, and professors’ expectations — they may lack the skills for selecting the most relevant results for solving information problems in their daily lives.

 

++++++++++

Opportunities

Findings from this paper may present opportunities for librarians, educators, and information resource vendors who want to be proactive in training students and transferring information literacy competencies to them. Moreover, there may be opportunities for students who want to become more adept at finding information in their daily lives.

  1. Throughout our ongoing research, we have found that, as a whole, effective information–seeking strategies for everyday life are taught more implicitly than explicitly on many college campuses. Curriculum that teaches students how to craft more effective searches may benefit them the most, giving them lifelong learning skills they can take into the workplace and their lives after graduation.

  2. In particular, students searching for everyday life decision–making information may benefit from more hands–on training and coaching from librarians and instructors in developing effective methods for getting at the results they value most. Also, students may benefit from learning hands–on critical thinking strategies for asking the most useful questions when turning to friends and family as information sources and co–evaluators.

  3. Based on students’ use of online database resources for everyday life research, there may also be some entrepreneurial opportunities for information publishers. There may be a market in developing everyday life online information sources for college students, in addition to the sources already developed for course–related research and campus libraries.

  4. Lastly, this study lays the preliminary groundwork for further research in four areas: (1) how to teach and coach college students in finding everyday life information for use in their lives, future workplaces, and for lifelong learning; (2) what role blogs may play as networked information grounds in college students’ daily lives; (3) what the relationship may be among search engine usage, decision–making, and associated risk; and, (4) the usefulness of a Web content evaluation theory for describing how students size up Web content during everyday life research. End of article

 

About the authors

Alison J. Head, Ph.D. and Michael B. Eisenberg, Ph.D. are the Co–Principal Investigators and Co–Directors of Project Information Literacy (PIL), which is based in the Information School at the University of Washington. In the Information School, Head is a Research Scientist and Lead Researcher for PIL and Eisenberg is Dean Emeritus and Professor in the Information School.
Web: The PIL Web site is located at http://projectinfolit [dot] org
E–mail: ajhead1 [at] u [dot] washington [dot] edu and mbe [at] u [dot] washington [dot] edu.

 

Acknowledgements

We are grateful to Susan Gilroy (Harvard College), David Nasatir (U.C. Berkeley), and Karen Schneider (Holy Names University) who made useful suggestions for this paper. This research was sponsored with contributing funds from the John D. and Catherine T. MacArthur Foundation. A full report of the 2010 study is available at http://projectinfolit.org/pdfs/PIL_Fall2010_Survey_FullReport1.pdf.

 

Notes

1. In an effort to expand Piagetian theories of formal thought about cognitive stages of development, scholars have developed their own theories, based on the assumption that the distinctive characteristic of adult thought, which often first appears during the “late formal stage” (ages 17–25), is the acceptance and integration of various, and at times incompatible, truths that are highly dependent upon context and upon the way in which the individual perceives them. Unlike the adolescent, the adult does not need to look for and to find a single truth.

2. John Palfrey and Urs Gasser (Palfrey and Gasser, 2008) first used the phrase “born digital” to describe a growing segment of the population born in 1980 or beyond, who have grown up “immersed in digital technologies, for whom a life fully integrated with digital devices is the norm.” Quoted and retrieved from Berkman Center’s “Youth and Media Project” site, at http://cyber.law.harvard.edu/research/youthandmedia/digitalnatives, accessed 1 December 2010.

3. The original definition of information literacy issued by ACRL in 1989 is cited in “Information Literacy Competency Standards for Higher Education,” ACRL Standards Committee and approved by the Board of Directors of the Association of College and Research Libraries (ACRL) for American Library Association (2000), at http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm, accessed 1 December 2010.

4. Ibid., p. 2.

5. The sample for this study was drawn from 800 high school students and 3,000 college students in the U.S. For preliminary results from the study, see Educational Testing Services, ICT Literacy Assessment Preliminary Findings, at http://www.ets.org/Media/Products/ICT_Literacy/pdf/2006_Preliminary_Findings.pdf, accessed 25 February 2011.

6. Pierre Bourdieu’s concept of habitus is the set of dispositions (i.e., long–lasting habits, beliefs, values, tastes, bodily postures, feelings, and thoughts) affecting an individual’s perception and actions in the world. Habitus is derived from the individual’s personal history, interaction with others, and surroundings of his/her everyday life (Bourdieu, 1984).

7. Though there is no set definition for describing the age range of millennials, we have used the Pew Internet & American Life Project’s definition, which describes millennials as those born between 1977 and 1992 (Zickuhr, 2010), accessed 26 January 2011.

8. For more background about our ongoing research project, see the Project Information Literacy Web site at http://projectinfolit.org, accessed on 22 December 2010.

9. The student discussion groups were held during October, November, and December 2008 on seven U.S. campuses: with full–time sophomores, juniors, and seniors at Harvard College, University of Illinois at Urbana–Champaign, Mills College, and University of Washington, and with students who had completed at least one semester at one of three community colleges: Diablo Valley College (Calif.), West Valley College (Calif.), and Shoreline Community College (Wash.).

10. We intentionally excluded freshmen from our four–year institution sample, and students who had taken fewer than 12 units from our community college sample. These students were excluded because they were more likely to discuss the research strategies they had used in high school, rather than those they had acquired (or were learning) and had begun using in college.

11. For the discussion groups, we did not attempt to balance our sample for gender (one of the institutions in the campus sample was a women’s college). Excluding this campus, more than half of the sample from co–ed campuses was female (63 percent).

12. In the discussion group sample, there was representation from students studying anthropology, art history, communication, economics, education, English, gender studies, global studies, health, history, international relations, languages, linguistics, music, political science, psychology, social studies, and sociology. To a much lesser degree (nine percent of the sample), some student “walk ins” were studying computer science, nursing, engineering, and business administration.

13. The survey was administered to full–time sophomores, juniors, and seniors at the following 25 U.S. campuses: Boise State University, Cal Maritime (CSU), Colgate University, College of William and Mary, Colorado State University, Corban College, Eastern Michigan University, Felician College, Gettysburg College, Holy Names University, Linfield College, New Mexico State University, Northern Kentucky University, Northern Michigan University, Ohio State University, Purdue University, St. Mary’s College of Maryland, Southern Nazarene University, State College of Florida, Manatee–Sarasota, Temple University, University of Arizona, University of Michigan, University of Minnesota, West Virginia University, and Winston–Salem University. A Google map of the institutions participating in the sample is also available at: http://tinyurl.com/y4smquw.

14. For purposes of our analysis, we employ University of Washington’s scale for translating GPA to letter grades, courtesy of the Office of the Registrar, at http://www.washington.edu/students/gencat/front/Grading_Sys.html, accessed on 1 December 2010.

15. See “Response Rates — An Overview,” American Association for Public Opinion Research, at http://www.aapor.org/Response_Rates_An_Overview.htm, accessed 14 February 2011.

16. These are everyday life research questions that participants in our 2008 focus groups reported having had within the previous six months.

17. The explanation of everyday life information research we provide here is based on students’ perceptions of the process, through the lens of their experience. We fully acknowledge that research in the library and information science field provides detailed frameworks and models for understanding everyday information–seeking behavior (Chatman, 2000; Savolainen, 1995; Dervin, 1992).

18. The survey question (#13) defined spiritual information as a topic and included a parenthetical example for clarification, as follows: (e.g., finding out about different religious beliefs).

19. It is interesting to note that while interface design (e.g., fonts, colors, and layout) was reportedly used by over half of the sample (56 percent) as a cue for detecting the credibility of a Web site, fewer respondents (39 percent) reported judging the design of charts, specifically, as a criterion (assuming charts existed on sites).

20. The percentages are based on responses of “almost always,” “often” and “sometimes” in this paper. In our 2010 report, “Truth Be Told: How College Students Evaluate and Use Information in the Digital Age,” we conflated “almost always” and “often” into a new category of “frequently used” and the percentages, therefore, differ, p. 13.

21. Interestingly, a 2001 Pew survey about cyberfaith indicated many of those searching for religious information on the Web tend to find sites by “word of mouth,” not search engine searches. Nearly half of Pew’s survey sample (46 percent) reported they learned of religious Web content through family, friends, or a church brochure (or other print materials) with a Web address printed on it (Larsen, 2001). This earlier trend from Pew may still hold true a decade later: few users rely on search engines for finding religious information (Jansen, et al., 2010). The researchers conducted a large–scale analysis of religious–related searching in over a million search engine interactions occurring between 1997 and 2005. The study used five data sets from Excite, Alta Vista, and Dogpile; Google search engine results were not included in the data analysis. They found only 1 to 1.5 percent of the sessions were searches for religious information.

22. Although more respondents reported looking for news (79 percent) rather than purchasing information (74 percent), the use of search engines for finding news, an independent variable in our logistic regression model, was not statistically significant. This finding suggests respondents used a specified news site, such as the site for their hometown newspaper and/or nytimes.com, rather than using a search engine to find news.

23. For discussions about the limitations of the phrase “digital natives,” see “The Net Generation Unplugged,” Economist (4 March 2010), at http://www.economist.com/node/15582279, accessed 26 January 2011 and Howard Rheingold, 2011. “Crap Detection 101: Required Coursework,” Project Information Literacy Smart Talk, number 5 (3 January), at http://projectinfolit.org/st/rheingold.asp, accessed 4 January 2011.

 

References

American Association for Public Opinion Research, 2011. “Response Rates — An Overview,” at http://www.aapor.org/Response_Rates_An_Overview.htm, accessed 18 February 2011.

Association of College and Research Libraries (ACRL), 2000. “Information Literacy Competency Standards for Higher Education,” ACRL Standards Committee, approved by the Board of Directors of the Association of College and Research Libraries for the American Library Association (includes 1989 standards), at http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm, accessed 1 December 2010.

Patricia K. Arlin, 1975. “Cognitive Development in Adulthood: A Fifth Stage?” Developmental Psychology, volume 11, number 5, pp. 602–606.
http://dx.doi.org/10.1037/0012-1649.11.5.602

Pierre Bourdieu, 1984. Distinction: A Social Critique of the Judgment of Taste. Cambridge, Mass.: Harvard University Press.

danah m. boyd and Nicole B. Ellison, 2007. “Social Network Sites: Definition, History, and Scholarship,” Journal of Computer–Mediated Communication, volume 13, number 1, pp. 210–230, and at http://jcmc.indiana.edu/vol13/issue1/boyd.ellison.html, accessed 20 December 2010.

Elfreda Chatman, 2000. “Framing Social Life in Theory and Research,” New Review of Information Behaviour Research, volume 1, pp. 3–17.

Michael L. Commons, Jan D. Sinnott, Francis A. Richards and Cheryl Armon, 1989. Adult Development. Volume 1: Comparisons and Applications of Adolescent and Adult Developmental Models. New York: Praeger.

Brenda Dervin, 1992. “From the Mind’s Eye of the User: The Sense–Making Qualitative–Quantitative Methodology,” In: Jack D. Glazier and Ronald R. Powell (editors). Qualitative Research in Information Management. Englewood, Colo.: Libraries Unlimited, pp. 61–84.

Economist, 2010. “The Net Generation Unplugged,” Economist (4 March), and at http://www.economist.com/node/15582279, accessed 26 January 2011.

Educational Testing Services, 2006. “2006 ICT Literacy Assessment Preliminary Findings,” at http://tinyurl.com/4fcbl6v, accessed 25 February 2011.

Michael B. Eisenberg and Robert E. Berkowitz, 1990. Information Problem–Solving: The Big6™ Skills Approach to Library & Information Skills Instruction. Norwood, N.J.: Ablex.

Nicole B. Ellison, Charles Steinfield, and Cliff Lampe, 2007. “The Benefits of Facebook ‘Friends:’ Social Capital and College Students’ Use of Online Social Network Sites,” Journal of Computer–Mediated Communication, volume 12, number 4, pp. 1,143–1,168, and at http://jcmc.indiana.edu/vol12/issue4/ellison.html, accessed 20 December 2010.

Karen E. Fisher, Carol Landry, and Charles Naumer, 2007. “Social Spaces, Casual Interactions, Meaningful Exchanges: An Information Ground Typology Based on the College Student Experience,” Information Research, volume 12, number 2, at http://informationr.net/ir/12-2/paper291.html, accessed 25 February 2011.

Christy Gavin, 2008. Teaching Information Literacy: A Conceptual Approach. Lanham, Md.: Scarecrow Press.

Melissa Gross and Don Latham, 2009. “Undergraduate Perceptions of Information Literacy: Defining, Attaining, and Self–Assessing Skills,” College and Research Libraries, volume 70, number 4, pp. 336–350, and at http://crl.acrl.org/content/70/4/336.abstract, accessed 25 February 2011.

Alison J. Head and Michael B. Eisenberg, 2010. “Truth Be Told: How College Students Evaluate and Use Information in the Digital Age,” Project Information Literacy Progress Report (November), at http://projectinfolit.org/pdfs/PIL_Fall2010_Survey_FullReport1.pdf, accessed 1 December 2010.

Alison J. Head and Michael B. Eisenberg, 2009. “Lessons Learned: How College Students Seek Information in the Digital Age,” Project Information Literacy Progress Report (December), at http://projectinfolit.org/pdfs/PIL_Fall2009_finalv_YR1_12_2009v2.pdf, accessed 25 February 2011.

Alison J. Head and Michael B. Eisenberg, 2008. “Finding Context: What Today’s College Students are Saying about Conducting Research in the Digital Age,” Project Information Literacy Progress Report (February), at http://projectinfolit.org/pdfs/PIL_ProgressReport_2_2009.pdf, accessed 1 December 2010.

Bernard Jansen, Andrea Tapia, and Amanda Spink, 2010. “Searching for Salvation: An Analysis of U.S. Religious Searching on the World Wide Web,” Religion, volume 40, pp. 39–52, and at http://faculty.ist.psu.edu/jjansen/academic/jansen_searching_for_salvation.pdf, accessed 2 February 2011.

Steve Jones, Camille Johnson–Yale, Sarah Millermaier, and Francisco Seoane Pérez, 2009. “Everyday Life, Online: U.S. College Students’ Use of the Internet,” First Monday, volume 14, number 10, at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2649/2301, accessed 14 February 2011.

Jennifer Kayahara and Barry Wellman, 2007. “Searching for Culture — High and Low,” Journal of Computer–Mediated Communication, volume 12, number 3, at http://jcmc.indiana.edu/vol12/issue3/kayahara.html, accessed 11 March 2011.

Carolyn Kuhlthau, 2004. Seeking Meaning: A Process Approach to Library and Information Services. Second edition. Westport, Conn.: Libraries Unlimited.

Elena Larsen, 2001. “CyberFaith: How Americans Pursue Religion Online,” Pew Internet & American Life Project (23 December), at http://www.pewinternet.org/Reports/2001/CyberFaith-How-Americans-Pursue-Religion-Online.aspx, accessed 2 February 2011.

Gloria J. Leckie, 1996. “Desperately Seeking Citations: Uncovering Faculty Assumptions about the Undergraduate Research Process,” Journal of Academic Librarianship, volume 22, number 3, pp. 201–208.
http://dx.doi.org/10.1016/S0099-1333(96)90059-2

Mary Madden and Steve Jones, 2002. “The Internet Goes to College,” Pew Internet & American Life Project (15 September), at http://www.pewinternet.org/Reports/2002/The-Internet-Goes-to-College.aspx, accessed 14 February 2011.

Patricia D. Maughan, 2001. “Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California–Berkeley Experience,” College and Research Libraries, volume 62, number 1, pp. 71–85, and at http://crl.acrl.org/content/62/1/71.abstract, accessed 25 February 2011.

Eric Meyers, Karen E. Fisher, and Elizabeth Marcoux, 2009. “Making Sense of an Information World: The Everyday Life Information Behavior of Teens,” Library Quarterly, volume 79, number 3, pp. 301–341.
http://dx.doi.org/10.1086/599125

Megan J. Oakleaf, 2011. “Are They Learning? Are We? Learning in the Academic Library,” Library Quarterly, volume 81, number 1, pp. 61–82, and at http://meganoakleaf.info/aretheylearningoakleaf.pdf, accessed 25 February 2011.

Megan J. Oakleaf, 2008. “Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches,” Libraries and the Academy, volume 8, number 3, pp. 233–253.
http://dx.doi.org/10.1353/pla.0.0011

John Palfrey and Urs Gasser, 2008. Born Digital: Understanding the First Generation of Digital Natives. New York: Basic Books.

Karen W. Pettigrew, 1999. “Waiting for Chiropody: Contextual Results from an Ethnographic Study of the Information Behavior among Attendees at Community Clinics,” Information Processing and Management, volume 35, number 6, pp. 801–817.
http://dx.doi.org/10.1016/S0306-4573(99)00027-8

Carolyn J. Radcliff, Mary Lee Jensen, Joseph A. Salem, Jr., Kenneth J. Burhanna, and Julie A. Gedeon, 2007. A Practical Guide to Information Literacy Assessment for Academic Librarians. Westport, Conn.: Libraries Unlimited.

Howard Rheingold, 2011. “Crap Detection 101: Required Coursework,” Project Information Literacy Smart Talk, number 5, at http://projectinfolit.org/st/rheingold.asp, accessed 4 January 2011.

Soo Young Rieh and Brian Hilligoss, 2008. “College Students’ Credibility Judgments in the Information–Seeking Process,” In: Miriam J. Metzger and Andrew J. Flanagin (editors). Digital Media, Youth, and Credibility. Cambridge, Mass.: MIT Press, pp. 49–72.

Reijo Savolainen, 2009. “Small World and Information Grounds as Contexts of Information Seeking and Sharing,” Library & Information Science Research, volume 31, number 1, pp. 38–45.
http://dx.doi.org/10.1016/j.lisr.2008.10.007

Reijo Savolainen, 1995. “Everyday Life Information–Seeking: Approaching Information–Seeking in the Context of ‘Way of Life’,” Library & Information Science Research, volume 17, number 3, pp. 259–294.
http://dx.doi.org/10.1016/0740-8188(95)90048-9

Steven J. Tepper, Eszter Hargittai, and David Touve, 2008. “Music, Mavens, and Technology,” In: Steven J. Tepper and Bill Ivey (editors). Engaging Art: The Next Great Transformation of America’s Cultural Life. New York: Routledge, pp. 199–220, and at http://www.webuse.org/pdf/TepperHargittaiTouve-MusicMavens2007.pdf, accessed 14 March 2011.

Sebastián Valenzuela, Namsu Park, and Kerk F. Kee, 2009. “Is There Social Capital in a Social Network Site? Facebook Use and College Students’ Life Satisfaction, Trust, and Participation,” Journal of Computer–Mediated Communication, volume 14, number 4, pp. 875–901, and at http://online.journalism.utexas.edu/2008/papers/Valenzuela.pdf, accessed 22 December 2010.

Dorothy Warner, 2008. A Disciplinary Blueprint for the Assessment of Information Literacy. Westport, Conn.: Libraries Unlimited.

Kathryn Zickuhr, 2010. “Generations 2010,” Pew Internet & American Life Project (16 December), p. 11, at http://www.pewinternet.org/~/media//Files/Reports/2010/PIP_Generations_and_Tech10.pdf, accessed 26 January 2011.

 


Editorial history

Received 8 February 2011; revised 8 March 2011; revised 14 March 2011; accepted 15 March 2011; revised 29 March 2011.


Creative Commons License
“How college students use the Web to conduct everyday life research” by Alison J. Head and Michael B. Eisenberg is licensed under a Creative Commons Attribution–NoDerivs 3.0 Unported License.

How college students use the Web to conduct everyday life research
by Alison J. Head and Michael B. Eisenberg.
First Monday, Volume 16, Number 4 - 4 April 2011
https://firstmonday.org/ojs/index.php/fm/article/download/3484/2857