
Project Methodology Details

The research instruments we used (all of the survey questions, and the interview and survey protocols) can be downloaded here.

There were two research sites: the University of California, Berkeley, and Ohlone College in Fremont, California. Most of the research was done at Berkeley; Ohlone was used to test some of our assumptions about the Berkeley students. The data presented in most of the papers on this site comes from the UC Berkeley research site and concerns 18- and 19-year-old freshmen, collected during their freshman year from February 2005 to April 2006.

Summary of Data Collected:

Two primary sources of data were collected: quantitative data through surveys, and qualitative data through interviews and focus groups. Together, the quantitative and qualitative data provide a rich picture of the “technological careers” (DiMaggio and Celeste, 2004) of research subjects at UC Berkeley. The qualitative data give in-depth stories about individual young people, while the quantitative data made it possible to see trends across the Berkeley campus population.

In all focus groups and interviews we tried to solicit narratives about students' technology adoption and use. The interviews focused heavily on the technological careers of individuals and on the specifics of technology use in the context of their social networks. The focus groups were better for getting at students' feelings about technologies and their current experiences with them, since the students were able to share opinions and stories with each other. The difference in the information gathered in the focus groups versus the interviews was largely due to the strengths of the two methods.

Different papers on this site rely on different "slices" of the data. The papers on mentorship in the technology adoption process rely more heavily on data from the interviews, while the section on negotiating the role of technology in the lives of students relies more heavily on the focus group data. The only survey data reported on in this document is from the UC Berkeley OSR survey. That data set is far more robust than the other survey data collected, and should prove extremely valuable.

| Date | Method | Subjects | Location | Notes |
|------|--------|----------|----------|-------|
| February/March 2005 | Web survey | 80 of a random sample of 500 freshmen responded; 160 additional respondents from a “snowball” group (all grades) | UC Berkeley | Raffled off an iPod and four $20 gift certificates for a pizza dinner as a reward |
| March 2005 | Interviews (interviewers: Megan Finn, David Shlossberg, David Hong, Judd Antin) | 22 interviewees, all solicited from the 80 random-sample survey respondents | UC Berkeley | Subjects paid $30; interviews lasted approximately 60–90 minutes |
| August 2005 | Web survey | 2,921 of 4,105 incoming freshmen responded | UC Berkeley | Survey conducted by Berkeley’s Office of Student Research (OSR) |
| March 2006 | Focus groups (all conducted by Megan Finn with the assistance of Paul Poling) | 32 subjects in 8 focus groups, recruited based on OSR survey responses; 4 of the 8 groups solicited from students with a reported household income of less than $35,000; the other groups were loosely composed of heavy or light technology users | UC Berkeley | Subjects given $40 Amazon.com gift certificates; focus groups lasted approximately 90–100 minutes |
| March 2006 | Web survey | 90 respondents, 50 of whom were 18 or 19 | Ohlone College (Fremont, CA) | Raffled off an iPod shuffle |
| April 2006 | Interviews (conducted by Megan Finn and Paul Poling) | 8 interviews, with interviewees solicited from the 50 survey respondents | Ohlone College (Fremont, CA) | Subjects given $40 gift certificates to Barnes & Noble |

Details about the data collected:

February/March 2005 survey – The first part of this study took place in February and March 2005, and its data strongly informed the studies that followed. The survey was pre-tested with 6 Berkeley freshmen. We then distributed it to a random sample of about 500 Berkeley freshmen, of whom 80 responded (a response rate of about 16%). The survey aimed to get an overview of how Berkeley freshmen used technology. The list of technologies and the questions for the survey and interviews were generated from our pre-test subjects.

March 2005 interviews – In March 2005, 22 open-ended interviews lasting from 60 to 90 minutes were conducted with Berkeley freshmen. The interview questions were pre-tested with four 22- to 24-year-olds. The interviewees were recruited from the students who had responded to the February/March 2005 survey. The interviewers had the interviewees' survey responses prior to the interview, so they went in with significant background information. The interviews were largely open-ended, guided by the technologies that the students said were important to them. While most questions asked for more detail about a student's use and adoption of specific technologies, some were aimed at better understanding the larger context in which students use these technologies, such as "If you needed to get in touch with your parents, how would you do that?" Because we had survey results for all of the students we interviewed, we were able to connect information from the survey about students' backgrounds to the richer information from the interviews. This also allowed us to validate some of our survey data and to make observations we might not otherwise have been able to make. (We also had the survey responses for the Ohlone interviewees in April 2006; however, we did not have all of the survey responses for the March 2006 focus group subjects, because that survey was conducted by another research group.) This multi-method approach allowed us to see trends across students from the surveys, and to understand students in more depth from the interviews. The March 2005 interviews provided a base understanding of what the Berkeley experience was like for freshmen, especially in terms of their technology use. In March 2005, the research focused particularly on social aspects of technology adoption and on how students used information and communication technologies to support their social networks. The data from the March 2005 study was used to write the survey and focus group questions for the studies that followed.
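As a rough illustration of that linking step, one might join each interview record to the interviewee's earlier survey responses on a shared respondent identifier. The sketch below is purely hypothetical: the file names, the column names, and the identifier are stand-ins for illustration, not the project's actual data files.

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
survey = pd.read_csv("march2005_survey.csv")          # one row per survey respondent
interviews = pd.read_csv("march2005_interviews.csv")  # one row per interviewee

# Attach each interviewee's earlier survey responses to their interview
# record, so transcripts can be read against reported background data
# and survey answers can be spot-checked against what was said in person.
linked = interviews.merge(survey, on="respondent_id", how="left")
```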

August 2005 survey – In the summer of 2005, the Office of Student Research (OSR) at UC Berkeley conducted a web-based survey of all incoming UC Berkeley freshmen. The FreshQuest team was given the opportunity to help the OSR researchers brainstorm a few questions to ask the incoming freshmen. The OSR survey received 2,921 responses from the 4,105 new freshmen (a response rate of about 71%). The final version of the survey asked when students began using various technologies, how often they used them, who taught them how to use those technologies, and how they thought their technology use compared to that of their high school peers.

March 2006 focus groups – In March 2006, 8 focus groups were conducted, varying in size from 2 to 5 students; a total of 32 Berkeley freshmen participated. Participants were recruited based on their responses to the August 2005 OSR survey of all incoming freshmen. Four of the focus groups (16 of the students) were composed of students who reported on the OSR survey that they came from households making less than $35,000 per year and who said they were “very confident” about this response (roughly 266 respondents). This pool of the “lowest income” students was divided into four groups, and we recruited each of the four focus groups from one of them, as sketched below. One group was recruited from those who responded that they updated their blog at least once a day. Another was formed from students who either started playing video games at an early age (before the age of 13) or played games very frequently (at least once a week). The last two groups were formed by randomly dividing the remaining students into two segments. Students were not told how they were selected for the focus groups, other than that it was “based on their responses to the OSR survey.” The students who were not part of the lowest-income focus groups (the other 16 students) were recruited based loosely on their answers to the OSR survey: I tried to sort them into groups that seemed to use technology a lot and groups that used it less frequently, but this was quite inexact.
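The segmentation of the lowest-income pool can be summarized in a short sketch. The code below is illustrative only, assuming a tabular export of the OSR responses; the file name, column names, and response labels are hypothetical stand-ins for survey fields that are not reproduced here.

```python
import pandas as pd

# Hypothetical file, column names, and response labels; the actual OSR
# survey fields are not reproduced here.
osr = pd.read_csv("osr_survey_aug2005.csv")

# "Lowest income" pool: household income under $35,000, with the student
# "very confident" in that answer (roughly 266 respondents in practice).
low_income = osr[(osr["household_income"] < 35000) &
                 (osr["income_confidence"] == "very confident")]

# Group 1: students who update their blog at least once a day.
bloggers = low_income[low_income["blog_update_freq"] == "at least once a day"]

# Group 2: started gaming before age 13, or play at least once a week.
pool = low_income.drop(bloggers.index)
gamers = pool[(pool["age_started_gaming"] < 13) |
              (pool["gaming_freq"] == "at least once a week")]

# Groups 3 and 4: a random split of the remaining low-income students.
remaining = pool.drop(gamers.index).sample(frac=1, random_state=0)
half = len(remaining) // 2
group3, group4 = remaining.iloc[:half], remaining.iloc[half:]
```

Recruitment emails would then go out to students sampled from each of the four groups.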

Data Limitations:

While traditional ethnographic observation yields a more accurate picture of what people are actually doing, observation has limitations that made it an impractical option for this study. First, the focus of this study is students' stories about how they use information and communication technologies, and, importantly, how they came to use these technologies (their “technological careers”). A student's technological career can span their entire lifetime and cannot be observed directly.

Another reason focus groups and interviews seemed like an appropriate research method was that most of the information and communication technology we were interested in was either personal communication software on a computer or a mobile device. It would have been hard to observe students using these technologies without sitting in their dorm rooms, following them around campus looking over their shoulders, or installing tracking software on their devices that could invade their privacy. Simply asking students about their practices was more practical.

However, these data collection methods have some notable problems; in particular, it is hard to assess the accuracy of the data we collected. All of the data presented in this study comes exclusively from the perspective of young people. Thus, statements are made below about the families and parents of young people that may not completely reflect reality, but they do reflect the perspective of the young people in this study, which has its own significance. Asking students to tell stories about technology use and adoption in their own words introduces a recall problem: students may not accurately remember the childhood experiences they are asked about in the survey or focus groups. There is also a self-reporting problem: people often have trouble accurately describing their own actions. One way to mitigate these problems is to use the survey results to confirm trends found in the focus groups and interviews.

All of the subjects for all of the studies were recruited via email, which introduces a self-selection bias into our data. Students who responded were those willing to be interviewed or surveyed about their technology practices, and also those who read and responded to email promptly and thoroughly. This may bias our data toward particularly heavy email users, and heavy internet users may likewise have been more likely to respond to an internet-based survey.