Courtney Kennedy

by Susan Rosegrant
March 2009

Cell phones have unnerved the world of survey research. Let’s say you’re planning to conduct a telephone survey. You can restrict your calls to landlines, or to cell phones, or include both—makes no difference, right? Or does it? It’s not just that cell phone users are demographically different from landline users. They use the phones in different places, often while multi-tasking. And they pay for them differently—being charged for incoming calls as well as outgoing ones, for example. Still, current research suggests that none of this has much of an impact on survey responses. But Courtney Kennedy, a third-year graduate student at the Institute for Social Research’s Program in Survey Methodology, isn’t so sure. By the time she finishes her dissertation, “Nonresponse and Measurement Error in Mobile Phone Surveys,” the question of impact should be a whole lot clearer.

At 27, Kennedy is one of the fresher faces at ISR, but her youth belies her experience: she’s been working at the institute for eight years, ever since she found her way there as a freshman political science major in 2000. How did an accomplished golfer from the tiny city of Brighton, 15 miles north of Ann Arbor, become an up-and-coming survey methodologist at ISR? To hear Kennedy tell it, it was all about luck. She was already a political junkie when she arrived at the University of Michigan—the kind of teenager who had taken every political science class in high school, devoured political biographies, and dissected the evening news with her mother every night. She also had watched her father, a regional sales manager, move from software company to software company—a stress that made her vow to avoid the corporate world.

But the pivotal moment came when Kennedy decided to join UROP—the Undergraduate Research Opportunity Program, which pairs first- and second-year Michigan students with faculty so the students can serve as research assistants and learn how research is done. As she searched through the thick book of UROP listings, a particular opportunity caught her eye: a chance to do research on the accuracy of election polls with Michael Traugott, research professor at ISR’s Center for Political Studies and then chair of Michigan’s Department of Communication Studies. The two met, and Traugott, spotting “a great deal of potential,” quickly offered her the position. Things took off from there. “It was November 2000—what a year to be involved in election polls!” Kennedy says, with unabashed enthusiasm. “This was Gore/Bush, and it just hit the fan.”

According to Kennedy, the work they undertook was immediately riveting. Despite the polling-error horror stories then prevalent in the media, Traugott concluded that the 2000 pre-election polls generally were quite accurate—as were such polls going back several decades. When Traugott was invited to present the findings at the annual conference of the American Association for Public Opinion Research (AAPOR), he helped Kennedy arrange UROP funding so she could accompany him. “He taught me how to do the software, the statistics, how to write an article—I mean, he taught me everything,” says Kennedy. “I was one-on-one as a freshman with the chair of the communications department.”

Kennedy worked for Traugott all four years as an undergraduate, soon switching to a dual major in political science and statistics. She and Traugott also discovered something else they had in common: a love of golf. “I think there’s some overlap in the skill sets,” Kennedy muses. “When you’re working with data, when you’re designing surveys and analyzing the data, no detail can go unnoticed without you compromising the quality of the work that’s produced. With golf there tend to be a lot of details, too. The golf swing is not a natural movement, and there are a lot of different planes and angles to play with. You have to pay attention.” Kennedy, who as a high school senior was on the state’s Division I All-State team, adds: “I was lucky. As a kid, I seemed to have a bit of a knack for it.”

As her skills and interest in survey methodology grew, Kennedy took on increasingly sophisticated tasks. In the fall of 2001, her sophomore year, she sat in as the top leadership of ISR collaborated on a study designed to assess the psychological toll taken by the attacks of September 11. Then she helped analyze the data. In her junior year, Kennedy helped Traugott and Elizabeth Martin, a Michigan graduate and the senior survey methodologist for the U.S. Bureau of the Census, develop a new way to measure accuracy in the polls, revising statistical methods that hadn’t been updated since the end of the 1940s. Their article on the research, Traugott’s first ever with an undergraduate, appeared in Public Opinion Quarterly.

By the time Kennedy graduated in 2004, she knew she wanted to continue in survey methodology. That summer, she moved to Maryland in preparation for entering the Joint Program in Survey Methodology, the graduate training program jointly offered by the University of Maryland and the University of Michigan. Kennedy also applied for a summer internship at the Washington, D.C.-based Pew Research Center for the People and the Press. While still in college, Kennedy had been struck by a Pew study on whether low survey response rates—a problem known as nonresponse—were causing errors. The organization had run two identical telephone surveys, in one case spending extra time and money to boost the response rate to 50 percent, and in the other stopping at a more typical 25 percent. What Pew discovered, Kennedy says, was that the findings of the two surveys were nearly identical. “I was blown away that they did this great experiment,” she says. “For me it was the Holy Grail.”

Scott Keeter, director of survey research at Pew, says the organization was not keen on bringing in an intern, given that it was a small shop and that interns can require a lot of mentoring. But when they met Kennedy at the urging of Joint Program faculty, they were “really wowed,” he says. That impression was cemented after he contacted the head of a company where Kennedy had previously interned. Recalls Keeter: “This person, who is not given to effusive praise for anybody, sent me back a one-line message that said, ‘Hire her over anybody that you have applying for any job at any level.’”

The Pew internship soon turned into a part-time position, and then near full-time work as a project director. To accommodate the job, Kennedy shifted her coursework to part-time. At Pew, she focused on monthly national surveys studying political attitudes and media consumption. “I got to do all the pieces of the process: questionnaire design, analysis, pre-testing,” Kennedy says. She also was a leader in Pew’s cell phone research, and was instrumental in turning a replication of the nonresponse study she had earlier admired into an article for Public Opinion Quarterly.

By 2006, Kennedy was ready to commit to a Ph.D. “It gives you the capacity to lead methodological studies, to design them, to get them funded, and to be able to publish the findings,” she says. Two main factors convinced her to come back to ISR. Her primary research interests in the growing impact of cell phones and in the effect of different survey modes—whether a survey is conducted by landline, cell phone, web, or other means—were a better fit with the Michigan faculty. Also, she was genuinely afraid that if she stayed at Maryland, she would be unable to pry herself away from Pew, thus delaying her Ph.D.

Traugott became the chair of Kennedy’s dissertation committee, and Kennedy began working on connections between nonresponse and measurement error for Robert Groves, director of ISR’s Survey Research Center, and Roger Tourangeau, a research professor at ISR and director of the Joint Program at Maryland. She was also thinking about her own research. For her dissertation topic, Kennedy wanted to do a rigorous experiment that would make a real contribution to cell phone research. Telephone surveys had relied on landline users for decades, but with the explosion in mobile phones, this approach was neglecting the roughly 15 percent of the U.S. population overall, and about 30 percent of young people, who used only cell phones. The proportion of cell-only users was even higher overseas, making this a major issue for multinational survey research projects. To rectify that, survey firms had begun to call cell phone users as well. But little was known about either the participation rates or the response quality of those participants. “What factors contribute to their decisions about whether they’re going to take the survey or not?” Kennedy asks. “And are they paying attention? They’re on their cell phone; they could be at the mall, they could be out walking their dog, are they really going to answer a survey carefully? My dissertation is designed to address both those issues.”

According to Kennedy, earlier research in this area hadn’t revealed big differences between cell phone and landline users in participation and data quality. But there was a potentially serious flaw in the research: Cell phone users were disproportionately young, while landline users were skewing older. “There are two different devices, but there are also two different groups demographically,” Kennedy explains. “You don’t know if the results are this way because it’s a landline and a cell phone, or is it because these groups are different.” For example, cell phone users concerned about the cost of answering questions might decline to participate in surveys, thus omitting a cost-conscious segment of the population and distorting the results. And if people on cell phones turned out to be more distracted and less capable of thoughtful answers than landline users, their response quality would suffer.

Kennedy designed her research to separate any impact of the two user groups from the impact of the devices. First she would conduct a telephone survey—including both cell phone and landline users—that would identify individuals who use both kinds of phones. For a second survey, she would randomly assign half of those dual users to be called on their landlines, and the other half to be contacted on their cell phones. “For the nonresponse component, all I’m doing is seeing who participates and who doesn’t, and see what the predictors are,” Kennedy explains. To judge how thoughtful respondents were, she would compare data quality for the two groups.
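To make the logic of that design concrete, here is a minimal simulation sketch in Python. Everything in it is invented for illustration: the pool of dual users, the response propensities, and the quality scores are assumptions, not Kennedy’s actual design parameters or results. The point it demonstrates is simply how randomly assigning dual users to a contact device isolates the device effect from the demographic differences between user groups.

```python
import random

random.seed(42)

# Hypothetical pool of dual users (people reachable on both a landline and a
# cell phone), as identified by a first-phase screening survey. All numbers
# below are made up for illustration; none are Kennedy's actual data.
dual_users = [{"id": i} for i in range(1000)]

# Randomly assign each dual user to be recontacted on one device or the other.
# Because assignment is random, the two groups are demographically comparable,
# so any difference in outcomes can be attributed to the device itself.
for person in dual_users:
    person["device"] = random.choice(["landline", "cell"])

def simulate_response(device):
    """Simulate the second-phase interview with assumed (made-up) propensities."""
    participate_prob = {"landline": 0.30, "cell": 0.25}[device]  # assumption
    participated = random.random() < participate_prob
    # Crude data-quality score, e.g., share of items answered thoughtfully
    # (again, purely an assumption for the sketch).
    quality = (random.gauss({"landline": 0.80, "cell": 0.75}[device], 0.10)
               if participated else None)
    return participated, quality

for person in dual_users:
    person["participated"], person["quality"] = simulate_response(person["device"])

# Compare the two randomized groups on participation and data quality.
for device in ("landline", "cell"):
    group = [p for p in dual_users if p["device"] == device]
    responders = [p for p in group if p["participated"]]
    rate = len(responders) / len(group)
    avg_quality = sum(p["quality"] for p in responders) / len(responders)
    print(f"{device}: response rate {rate:.2f}, mean quality {avg_quality:.2f}")
```

Run as a script, the sketch prints a response rate and a mean quality score for each randomized group; in the actual study, those same comparisons would come from real survey data rather than simulated draws.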

Kennedy thought her research design was good, but she didn’t actually think the experiment would happen: Conducting the two surveys and running the analysis would cost about $150,000. But as she described her idea to participants at an AAPOR conference in 2008, a researcher and former professor who knew Kennedy and who ran a survey research firm in Maryland offered to take it on. The catch: Kennedy would not only run the experiment and analyze the data, she would also do other work for the company—in all, working 20 hours a week, the same hours she had been working at ISR. According to Kennedy, James Lepkowski and Tourangeau—the directors of the Michigan and Maryland programs—were “incredibly generous,” giving their blessing to the arrangement despite the loss of one of their graduate research assistants. In October 2008, she began working remotely for the Maryland-based Everett Group. Says Traugott: “She has a chance to do some very innovative work based upon an original data collection activity that no graduate student would ordinarily be able to afford.”

Kennedy plans to collect her data during the winter of 2008-09, and to defend her dissertation at the end of 2009. What she does after that is an open question. Clearly, Pew would like to have her back. “She’s incredibly organized, and very positive in her outlook, and extremely creative in thinking about difficult methodological challenges, while retaining a very collegial style,” Keeter says, then quickly adds: “I don’t want to say too many positive things about her because I don’t want other potential employers to see it.”

Wherever she goes, Kennedy will bring a rock-solid belief in the importance of her chosen profession. “I see a lot of value in providing objective, accurate data,” she explains. “But also I was really touched early on by the things I learned about George Gallup, who was all about the democratic notion of giving voice to people. Aside from election day, this is the main mechanism to measure people’s attitudes and how they feel about how the government is doing.”

And Kennedy continues to talk about luck—in golf and in her career. “I’ve taken enough statistics and probability classes to know that over the long run, things even out,” she says. “At a given moment, luck can play a factor. But over the course of a career, people make their own luck.” She pauses. “I guess I want to contradict myself, though, because I think I’ve been lucky with my career, so maybe there is an element of luck in it all.” According to others, though, Kennedy is likely the only person who views it that way. “She might have been lucky at the very start when she opened up this book in UROP and saw an ad for a position,” Traugott says. “But very soon beyond that point it was her own qualities and skills and the tools that she developed that propelled her forward.”