Friday, October 12, 2012

Gallup.Com - Polling Matters by Frank Newport: Survey Methods ...

The methods we and other pollsters use to conduct surveys are very complex, but also very important -- and very interesting to people. Lots of correspondents, with varying degrees of understanding and different motivations, ask about how we conduct our surveys. As a former college professor and former president of our country's largest professional polling association (AAPOR), I think it's great to educate and explain what it is that we do.

Gallup conducts surveys asking questions about social, cultural, health, wellbeing, political, economic, and many other topics both in the U.S. and around the world, using a number of different methodologies and survey "vehicles." All are focused on the goal of producing a scientific random sample of the population under study.

Here in the U.S., Gallup used in-person interviewing in randomly selected households in the first decades of our history. We switched to telephone interviewing in the late 1980s, and added in cell phones in January 2008. We have also used various telephone survey procedures over the years, including what we can call "stand-alone polls," which are self-contained, cross-sectional surveys usually conducted over a three- or four-day period, and various iterations of tracking polls, in which interviewing is conducted continuously day in and day out.

Gallup instituted a wonderful Gallup Daily tracking program in 2008, based on separate random surveys of 1,000 national adults conducted each day. Many of Gallup's basic economic and political measures that are reported on a continuous basis have been contained on a random half sample of this tracking survey since 2008. Our Gallup Poll Social Series surveys reported each month, along with many of the surveys conducted in conjunction with USA Today, have been based on stand-alone surveys.

In all cases we are continually tweaking, modifying, and improving our methodology -- based on decisions made by an outstanding team of survey professionals and methodologists at Gallup. Domestically, we always have the objective of being able to accurately represent the target -- the adult population of the U.S.

One focus point over the past decade (for all of us in the survey profession) has been the need to keep pace with changes in the communication behavior and habits of those we are interviewing. As noted, Gallup switched primarily to telephone interviewing a few decades ago based on the increased penetration of phones in American households and the increased costs of going into Americans' homes for in-person interviewing. Now we know, based on government statistics (and what we observe around us), that Americans are shifting rapidly from reliance on landline phones to mobile devices. We first began to add cell phones to our samples in January 2008 and have been increasing the proportionate representation of cell phones in our samples on a periodic basis from that point forward. That's based on the knowledge that there are more households with cell phones than landlines today in the U.S., or conversely, more households without landlines than without cell phones. We get updated estimates of telephone use from the U.S. government.

For our final month of political surveys before the Nov. 6 election, we are now conducting a separate daily tracking program consisting of interviews with a random sample of 500 U.S. adults each night. This provides us a survey vehicle focused just on the election and other political measures, particularly important in the current situation, in which we need to include a list of likely voter questions along with other political and election questions.

As we began this election tracking program on Oct. 1, our methodologists also recommended modifying and updating several procedures. We increased the proportion of cell phones in our tracking to 50%, meaning that we now complete interviews with 50% cell phones and 50% landlines each night. This marks a shift from our Gallup Daily tracking, which previously used 40% cell phones. This means that our weights to various phone targets in the sample can be smaller, given that the actual percentage of cell phones and cell-phone-only respondents in the sample is higher. We have instituted some slight changes in our weighting procedures, including a weight for the density of the population area in which the respondent lives. Although all Gallup surveys are weighted consistently to census targets on demographic parameters, we believe that these improvements provide a more consistent match with weight targets. The complete statement of survey methods is included at the end of each article we publish at Gallup.com.
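The weighting idea described above -- adjusting each respondent's influence so that the sample matches known population targets -- can be illustrated with a minimal post-stratification sketch. The cell labels and the 40%/60% targets here are hypothetical toy values for illustration; Gallup's actual procedure weights across many demographic and phone-use parameters simultaneously.

```python
# Minimal sketch of post-stratification weighting: each respondent receives a
# weight equal to the population share of their cell divided by that cell's
# share of the sample. Cell labels and target shares are hypothetical.
from collections import Counter

def poststratify(respondents, targets):
    """respondents: list of cell labels; targets: cell label -> population share."""
    n = len(respondents)
    sample_share = {cell: count / n for cell, count in Counter(respondents).items()}
    return [targets[cell] / sample_share[cell] for cell in respondents]

# Toy sample of 4 respondents: 3 landline, 1 cell-phone household, weighted
# to a (hypothetical) population split of 40% landline / 60% cell.
weights = poststratify(["landline", "landline", "landline", "cell"],
                       {"landline": 0.40, "cell": 0.60})
# The over-represented landline respondents are weighted down (0.40/0.75),
# and the under-represented cell respondent is weighted up (0.60/0.25).
```

Note that the weights sum to the sample size, so weighted percentages still behave like percentages of the full sample.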

The fact that the election and political questions are included on a shorter, politically-focused survey each night instead of being included within a longer tracking survey has resulted in increased response rates -- a good thing. One hypothesis is that certain types of respondents may be more likely to stay on the line with a shorter political survey than with longer, more general surveys -- which, in turn, could affect not only response rates but the percentage of "don't know" responses and refusals. (Gallup also typically sees an increase in response rates for political surveys conducted close to presidential elections in general.) All in all, it is possible that these changes in methods, which we believe increase the representation of our overall samples, may have some impact on political or other measures included in the surveys. Although it's too early to tell definitively, we're looking, for example, to see if these improvements in the survey methods have had an impact on the average values of our presidential job approval rating.

We will most likely make other changes to our survey procedures as our survey methods and procedures evolve.

As noted, one reason for our investment in separate daily tracking during this political environment is the necessity to carry likely voter questions on the survey each night. Our traditional procedure has been to ask seven likely voter questions, assign each respondent a score based on his or her answers to those questions, and then isolate the pool of those who score higher on the scale and are thus -- based on historical trends -- more likely to vote. We have modified some of these questions slightly based on the increasing prevalence of early voting. At this point we are not attempting to precisely estimate real-world turnout or to correlate that to our internal sample percentages, but rather making the assumption that those at the top end of the scale are the most likely voters.
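The score-and-cutoff procedure described above can be sketched in a few lines. The item names and the cutoff value below are illustrative assumptions, not Gallup's actual question wording or scoring rules; the sketch only shows the shape of the technique: sum yes/no screening answers into a score, then keep the top of the scale.

```python
# Sketch of a likely-voter screen: score each respondent on several yes/no
# screening items (1 point each) and keep those at the top of the scale.
# Item names and the cutoff are hypothetical, for illustration only.
LIKELY_VOTER_ITEMS = ["thought_given", "past_vote", "knows_polling_place",
                      "follows_politics", "voted_before", "plans_to_vote",
                      "enthusiasm"]  # seven items, one point each

def likely_voter_pool(respondents, cutoff=6):
    """Keep respondents whose score (count of 'yes' answers) meets the cutoff."""
    return [r for r in respondents
            if sum(r[item] for item in LIKELY_VOTER_ITEMS) >= cutoff]

sample = [
    {item: 1 for item in LIKELY_VOTER_ITEMS},                       # score 7
    {**{item: 1 for item in LIKELY_VOTER_ITEMS}, "enthusiasm": 0},  # score 6
    {item: 0 for item in LIKELY_VOTER_ITEMS},                       # score 0
]
pool = likely_voter_pool(sample)  # the first two respondents qualify
```

Ballot results are then tabulated over `pool` rather than the full sample, which is why no separate demographic weighting of the likely voter group is needed: the pool simply inherits the weights of the underlying sample of national adults.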

As has always been the case, we do not attempt to weight the composition of the likely voter sample in any way -- such as by political party or race or age -- to approximate some guess of what we or others think it should look like demographically on Election Day. That approach is precarious given that the electorate can look quite different (especially looking at political parties) from one election to the next. Also, party identification estimates are often based on exit poll results, which themselves are surveys using totally different methodologies, and we generally do not rely on judgment calls to predict what the ultimate electorate will look like. Our basic underlying sample of national adults is weighted to known population parameters on demographic and phone use variables, as noted, and the likely voter pool derives from that, based on how our randomly selected respondents answer the likely voter questions.

People ask about the order of the questions on our survey. We are following our historical procedures in this regard at this point in time -- asking the likely voter questions, then the presidential ballot, and then presidential job approval. This marks a slight change from prior to the addition of the likely voter questions, when the presidential ballot and the presidential job approval question did not have any questions before them on the survey, other than a registered voter screening item.

Source: http://pollingmatters.gallup.com/2012/10/survey-methods-complex-and-ever-evolving.html

